
WO2021184971A1 - Touch screen gesture operation control method and apparatus, terminal device, and storage medium - Google Patents

Touch screen gesture operation control method and apparatus, terminal device, and storage medium

Info

Publication number
WO2021184971A1
WO2021184971A1 PCT/CN2021/074170 CN2021074170W WO2021184971A1 WO 2021184971 A1 WO2021184971 A1 WO 2021184971A1 CN 2021074170 W CN2021074170 W CN 2021074170W WO 2021184971 A1 WO2021184971 A1 WO 2021184971A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
finger
touch screen
edge
gesture
Prior art date
Application number
PCT/CN2021/074170
Other languages
English (en)
French (fr)
Inventor
吴恒刚
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2021184971A1 publication Critical patent/WO2021184971A1/zh

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • This application relates to the technical field of touch screen control, and in particular to a touch screen gesture operation control method, device, terminal device, and storage medium.
  • With the popularization of smart phones with touch screens, gesture operation has become one of the mainstream operation and interaction methods of mobile phones.
  • Touch gestures based on the edge of the screen can meet interaction requirements while minimizing occlusion of the screen by the hand, making it easier for the user to view the displayed content while operating and improving the user experience.
  • However, current gesture operations are mainly designed for tablets or other large-screen devices that are convenient to operate with two hands. For mobile phones, which have smaller screens and are mostly operated with one hand while the other hand holds the device, some gestures are not convenient to use; in particular, touch gestures based on the edge of the screen are not recognized accurately enough.
  • The main purpose of this application is to provide a touch screen gesture operation control method, device, terminal device, and storage medium, aiming to improve the recognition accuracy of touch gestures at the edge of the touch screen.
  • To achieve the above objective, the present application provides a touch screen gesture operation control method, which includes:
  • tracking and detecting gesture operations on a preset area at the edge of the touch screen, and acquiring gesture state parameters on the preset area at the edge of the touch screen;
  • when the gesture state parameters satisfy a preset condition, determining that the gesture operation is an operation of sliding out of the edge of the touch screen, and triggering a corresponding response operation.
  • An embodiment of the present application also proposes a touch screen gesture operation control device, which includes:
  • a tracking module, configured to track and detect gesture operations on a preset area at the edge of the touch screen and acquire gesture state parameters on that preset area; and
  • an operation module, configured to determine, when the gesture state parameters satisfy a preset condition, that the gesture operation is an operation of sliding out of the edge of the touch screen, and to trigger a corresponding response operation.
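For illustration only, the two modules could take roughly the following shape in Kotlin; every name and signature below is an assumption made for this sketch, not something defined by the application.

```kotlin
// Hypothetical shapes for the two modules described above; names and
// signatures are assumptions for illustration, not definitions from the text.
data class EdgeGestureParams(
    val enterTimeMs: Long?,            // entry time into the edge preset area, if known
    val liftTimeMs: Long?,             // lift time, if the finger has been lifted
    val liftedInsidePresetArea: Boolean
)

interface TrackingModule {
    // Track gesture operations on the edge preset area and report the
    // collected gesture state parameters, one entry per tracked finger.
    fun collect(): List<EdgeGestureParams>
}

interface OperationModule {
    // If the parameters satisfy the preset condition, treat the gesture as
    // "slide out of the screen edge" and trigger the corresponding response.
    fun respond(params: List<EdgeGestureParams>)
}
```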
  • An embodiment of the present application also proposes a terminal device.
  • The terminal device includes a memory, a processor, and a touch screen gesture operation control program that is stored in the memory and executable on the processor; when the touch screen gesture operation control program is executed by the processor, the steps of the above touch screen gesture operation control method are implemented.
  • An embodiment of the present application also proposes a computer-readable storage medium storing a touch screen gesture operation control program; when the program is executed by a processor, the steps of the above touch screen gesture operation control method are implemented.
  • The touch screen gesture operation control method, device, terminal device, and storage medium proposed in the embodiments of the present application acquire gesture state parameters on a preset area at the edge of the touch screen by tracking and detecting gesture operations on that area.
  • When the gesture state parameters satisfy a preset condition, the gesture operation is determined to be an operation of sliding out of the edge of the touch screen, and a corresponding response operation is triggered.
  • This solution distinguishes the user's subconscious behavior when a finger slides out of the screen edge from other edge sliding operations, improving the recognition accuracy of touch gestures at the edge of the touch screen so that screen border gestures can be better used to carry out the corresponding screen operations.
  • Moreover, it provides more gesture interaction methods on top of common screen touch gestures and offers more gesture options for touch screen interaction design; used reasonably in interaction design, it can give users a richer and more convenient interface interaction experience.
  • Figure 1 is a schematic diagram of the functional modules of a terminal device to which a touch screen gesture operation control device of this application belongs;
  • FIG. 2 is a schematic flowchart of an exemplary embodiment of a touch screen gesture operation control method according to this application;
  • FIG. 3 is a schematic diagram of a gesture recognition area of a touch screen in an embodiment of the application.
  • FIG. 4 is a schematic diagram of a single-finger tracking process involved in an embodiment of this application.
  • Fig. 5 is a schematic flowchart of another exemplary embodiment of a touch screen gesture operation control method according to the present application.
  • The main solution of the embodiments of the present application is: tracking and detecting gesture operations on a preset area at the edge of the touch screen and acquiring gesture state parameters on that area; when the gesture state parameters satisfy a preset condition, determining that the gesture operation is an operation of sliding out of the edge of the touch screen, and triggering a corresponding response operation.
  • This solution distinguishes the user's subconscious behavior when a finger slides out of the screen edge from other edge sliding operations and improves the recognition accuracy of touch gestures at the edge of the touch screen.
  • Moreover, it provides more gesture interaction methods on top of common screen touch gestures and offers more gesture options for touch screen interaction design; used reasonably in interaction design, it can give users a richer and more convenient interface interaction experience.
  • Screen touch system: the software and hardware system built into a touch screen device for recognizing finger touch events, such as the MotionEvent mechanism of Android phones.
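As a rough sketch of how such a screen touch system exposes per-finger events on Android (a custom View overriding onTouchEvent; everything beyond the MotionEvent API itself is an assumption of this sketch):

```kotlin
import android.content.Context
import android.view.MotionEvent
import android.view.View

// Sketch: per-finger ("single-finger") tracking on top of Android's
// MotionEvent mechanism. Pointer IDs serve as the "identification of the
// finger in the screen touch system" referred to above.
class EdgeGestureView(context: Context) : View(context) {

    private val downTimeByPointer = mutableMapOf<Int, Long>()   // pointer id -> touch-down time (ms)
    private val lastPositionByPointer = mutableMapOf<Int, Pair<Float, Float>>()

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN, MotionEvent.ACTION_POINTER_DOWN -> {
                // A finger touched the screen: start tracking it by its id.
                val id = event.getPointerId(event.actionIndex)
                downTimeByPointer[id] = event.eventTime
            }
            MotionEvent.ACTION_MOVE -> {
                // Fingers are sliding: record the current position of each one.
                for (i in 0 until event.pointerCount) {
                    val id = event.getPointerId(i)
                    lastPositionByPointer[id] = event.getX(i) to event.getY(i)
                }
            }
            MotionEvent.ACTION_UP, MotionEvent.ACTION_POINTER_UP -> {
                // A finger was lifted: this is where its lift position and lift
                // time would be handed to the gesture judgment logic.
                val id = event.getPointerId(event.actionIndex)
                downTimeByPointer.remove(id)
                lastPositionByPointer.remove(id)
            }
        }
        return true
    }
}
```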
  • The embodiments of the application consider that related technical solutions based on touch gestures at the edge of the screen do not distinguish whether the user's intention, when a finger slides toward the screen edge, is to slide to the edge or to slide straight off the screen; as a result, edge-based touch gestures are difficult to recognize and prone to misoperation.
  • On this basis, the embodiments of the present application propose a solution that distinguishes whether the user intends to slide to the screen edge or to slide straight off the screen, so as to improve the recognition accuracy of touch gestures at the edge of the touch screen and make better use of screen border gestures to carry out the corresponding screen operations.
  • FIG. 1 is a schematic diagram of the functional modules of the terminal device to which the touch screen gesture operation control apparatus of the present application belongs.
  • the touch screen gesture operation control device may be a device that is independent of the terminal device and can perform data processing, and it may be carried on the terminal device in the form of hardware or software.
  • the terminal device may be a smart mobile terminal with a touch screen, such as a mobile phone or a tablet computer, or a fixed terminal with a touch screen. In this embodiment, a mobile phone is used as an example.
  • the terminal device to which the touch screen gesture operation control device belongs at least includes an output module 110, a processor 120, a memory 130, and a communication module 140.
  • the memory 130 stores an operating system and a touch screen gesture operation control program.
  • The touch screen gesture operation control device can store in the memory 130 information such as the detected gesture operations of a finger on the preset area at the edge of the touch screen and the acquired gesture state parameters of the finger on that area; the output module 110 may be a display screen, a speaker, and the like.
  • The communication module 140 may include a WIFI module, a mobile communication module, a Bluetooth module, etc., and communicates with an external device or a server through the communication module 140.
  • When the touch screen gesture operation control program in the memory 130 is executed by the processor, the following steps are implemented: tracking and detecting gesture operations on a preset area at the edge of the touch screen and acquiring gesture state parameters on that area; when the gesture state parameters satisfy a preset condition, determining that the gesture operation is an operation of sliding out of the edge of the touch screen and triggering a corresponding response operation.
  • When the gesture state parameters do not satisfy the preset condition, the gesture operation is determined to be an operation of sliding to the edge of the touch screen, and a corresponding response operation is triggered.
  • Further, the program also implements the following steps when executed: when a touch instruction on the screen is detected, recording the identification of the touching finger in the screen touch system for tracking; when the finger is detected sliding on the screen, acquiring the finger's current sliding state from the screen touch system according to its identification; tracking the sliding state and acquiring, from the screen touch system according to the finger's identification, the time point at which the finger enters the preset area at the edge of the screen; when the finger is detected lifting off the screen, acquiring the screen position at which it was lifted; and, when that position lies within the preset area, acquiring the lift time point.
  • If the lift positions of all fingers lie within the preset area, and the time difference between the entry time point of the first finger to enter the preset area and the lift time point of the last finger lifted from it is smaller than a preset threshold, the gesture operation is determined to be an operation of sliding out of the screen edge, and a corresponding response operation is triggered.
  • Alternatively, if the number of fingers lifted from the preset area reaches a preset value and that time difference is smaller than the preset threshold, the gesture operation is likewise determined to be an operation of sliding out of the screen edge, and a corresponding response operation is triggered.
  • For a single finger, if the time difference between the finger's entry time point into the preset area and its lift time point is smaller than the preset threshold, the gesture operation is determined to be an operation of sliding out of the screen edge, and a corresponding response operation is triggered.
  • In another implementation, the lift time point of the first finger lifted from the preset area is used instead of its entry time point: if all lift positions lie within the preset area (or the number of lifted fingers reaches the preset value) and the time difference between the first and last lift time points is smaller than the preset threshold, the gesture operation is determined to be an operation of sliding out of the screen edge, and a corresponding response operation is triggered.
  • Through the above solution, this embodiment tracks and detects gesture operations on the preset area at the edge of the touch screen and obtains the gesture state parameters on that area; when the gesture state parameters satisfy the preset condition, the gesture operation is determined to be an operation of sliding out of the edge of the touch screen, and a corresponding response operation is triggered.
  • This solution distinguishes the user's subconscious behavior when a finger slides out of the screen edge from other edge sliding operations, improving the recognition accuracy of touch gestures at the edge of the touch screen so that screen border gestures can be better used to carry out the corresponding screen operations.
  • Moreover, it provides more gesture interaction methods on top of common screen touch gestures and offers more gesture options for touch screen interaction design; used reasonably in interaction design, it can give users a richer and more convenient interface interaction experience.
  • FIG. 2 is a schematic flowchart of an exemplary embodiment of a touch screen gesture operation control method according to the present application.
  • the touch screen gesture operation control method includes:
  • Step S101: track and detect the gesture operation on the preset area at the edge of the touch screen, and acquire the gesture state parameters on that preset area;
  • the execution subject of the method in this embodiment may be a touch screen gesture operation control device, or may be a touch screen terminal.
  • a mobile phone is used as an example.
  • In order to improve the recognition accuracy of touch gestures at the edge of the touch screen and better use screen border gestures for the corresponding screen operations, this embodiment provides a preset area at the edge of the screen of the touch screen terminal; gesture touch operations on this preset area are tracked and recognized in order to distinguish the two gestures of a finger sliding to the screen edge and a finger sliding off the screen.
  • The preset area can be a long border strip near the edge of the screen; as shown in Figure 3, a screen edge gesture recognition area is provided along the lower edge of the screen, and its size or width can be set according to how the touch system senses a fingertip, for example based on the pixel area covered when a fingertip contacts the screen.
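A minimal sketch of such an edge strip as a rectangle along the bottom of the screen follows; the strip height and the dp-based sizing are placeholders chosen for this sketch, not values taken from the application.

```kotlin
import android.graphics.RectF
import android.util.DisplayMetrics
import android.util.TypedValue

// Sketch: the "preset area" as a strip along the bottom screen edge. The
// default height of 16 dp is an arbitrary placeholder standing in for a
// fingertip-sized contact region.
class EdgePresetArea(
    screenWidthPx: Int,
    screenHeightPx: Int,
    metrics: DisplayMetrics,
    stripDp: Float = 16f
) {
    private val stripHeightPx =
        TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, stripDp, metrics)

    private val region = RectF(
        0f, screenHeightPx - stripHeightPx,                // left, top
        screenWidthPx.toFloat(), screenHeightPx.toFloat()  // right, bottom
    )

    // True if a touch point (e.g. a finger's lift position) lies inside the
    // edge gesture recognition area.
    fun contains(x: Float, y: Float): Boolean = region.contains(x, y)
}
```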
  • In this embodiment, by tracking and detecting the finger's gesture operation on the preset area at the edge of the touch screen, the gesture state parameters of the finger on that area are acquired.
  • The gesture state parameters of the finger on the preset area may include: the entry time point at which the finger enters the preset area at the edge of the touch screen, the lift time point at which the finger is lifted from the preset area, and whether the screen position at which the finger is lifted lies within the preset area at the edge of the touch screen.
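The parameters listed above could be carried in a small value type like the following; the names are assumptions of this sketch.

```kotlin
// Sketch of a container for the gesture state parameters enumerated above.
data class FingerEdgeState(
    val pointerId: Int,                          // finger identification in the screen touch system
    val enterTimeMs: Long? = null,               // entry time point into the edge preset area
    val liftTimeMs: Long? = null,                // lift time point, once the finger is lifted
    val liftedInsidePresetArea: Boolean = false  // whether the lift position was inside the preset area
)
```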
  • the solution of this embodiment can be applied to a gesture operation of touching the edge of the screen with a single finger, or a gesture operation of touching the edge of the screen with multiple fingers.
  • The principle of tracking and detecting finger gesture operations on the preset area at the edge of the touch screen is as follows:
  • As shown in Figure 4, taking single-finger tracking as an example, when a finger is detected touching the screen, its identification in the screen touch system is recorded for tracking; when the finger is detected sliding on the screen, its current sliding state is acquired from the screen touch system according to that identification.
  • The finger's sliding state is then tracked, and the gesture state parameters of the finger on the preset area at the edge of the screen are acquired; in particular, when the finger is detected lifting off the screen, the screen position and the time point of the lift are acquired from the screen touch system according to the finger's identification and recorded.
  • In one implementation, the entry time point at which the finger enters the preset area is also acquired during tracking; if all lift positions lie within the preset area and the time difference between the entry time point of the first finger to enter the preset area and the lift time point of the last finger lifted from it is smaller than a preset threshold, the gesture state parameters are deemed to satisfy the preset condition, the gesture operation is determined to be an operation of sliding out of the edge of the touch screen, and a corresponding response operation is triggered.
  • In another implementation, the lift time point of the first finger lifted from the preset area is used instead of the entry time point: if all lift positions lie within the preset area and the time difference between the first and last lift time points is smaller than the preset threshold, the gesture state parameters are likewise deemed to satisfy the preset condition, and the corresponding response operation is triggered.
  • Step S102: when the gesture state parameters satisfy a preset condition, determine that the gesture operation is an operation of sliding out of the edge of the touch screen, and trigger a corresponding response operation.
  • When the gesture state parameters of the finger on the preset area at the edge of the touch screen satisfy the preset condition, the gesture operation is determined to be an operation of sliding out of the edge of the touch screen, and the response operation bound to sliding out of the edge is triggered; if the gesture state parameters do not satisfy the preset condition, the gesture may be ignored or handled as another corresponding touch operation.
  • Through the above solution, this embodiment tracks and detects the finger's gesture operation on the preset area at the edge of the touch screen and acquires the finger's gesture state parameters on that area; when the gesture state parameters satisfy the preset condition, the gesture operation is determined to be an operation of sliding out of the edge of the touch screen, and a corresponding response operation is triggered.
  • This solution distinguishes the user's subconscious behavior when a finger slides out of the screen edge from other edge sliding operations, improving the recognition accuracy of touch gestures at the edge of the touch screen so that screen border gestures can be better used to carry out the corresponding screen operations.
  • Moreover, it provides more gesture interaction methods on top of common screen touch gestures and offers more gesture options for touch screen interaction design; used reasonably in interaction design, it can give users a richer and more convenient interface interaction experience.
  • The single-finger and multi-finger touch gesture operation control processes are elaborated below. As one implementation, the step of tracking and detecting the finger's gesture operation on the preset area at the edge of the touch screen and acquiring the finger's gesture state parameters on that area may include: when a finger is detected touching the screen, recording its identification in the screen touch system for tracking; when the finger is detected sliding on the screen, acquiring its current sliding state from the screen touch system according to that identification;
  • tracking the sliding state of the finger and acquiring, from the screen touch system according to the finger's identification, the entry time point at which the finger enters the preset area at the edge of the touch screen; when the finger is detected lifting off the screen, acquiring the screen position at which it was lifted; and, when that position lies within the preset area, acquiring the lift time point.
  • Based on this tracking scheme, for multi-finger operation, one way of judging whether the gesture operation is an operation of sliding out of the screen edge is as follows: if the lift positions of all fingers lie within the preset area, and the time difference between the entry time point of the first finger to enter the preset area and the lift time point of the last finger lifted from it is smaller than a preset threshold, the gesture operation is determined to be an operation of sliding out of the screen edge, and a corresponding response operation is triggered.
  • Another way of judging, for multi-finger operation, is as follows: if the number of fingers lifted from the preset area reaches a preset value, and the time difference between the entry time point of the first finger to enter the preset area and the lift time point of the last finger lifted from it is smaller than the preset threshold, the gesture operation is determined to be an operation of sliding out of the screen edge, and a corresponding response operation is triggered.
  • As another implementation, the entry time point need not be acquired; the finger's sliding state is simply tracked, and the lift position and lift time point are recorded when the finger is lifted within the preset area. In that case, for multi-finger operation, the condition compares the lift time point of the first finger lifted from the preset area with that of the last finger lifted, either requiring all fingers to be lifted within the area or requiring the number of lifted fingers to reach a preset value.
  • For single-finger operation, if the time difference between the finger's entry time point into the preset area and its lift time point from the area is smaller than the preset threshold, the gesture operation is determined to be an operation of sliding out of the screen edge, and a corresponding response operation is triggered. A code sketch of this single-finger check follows.
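A sketch of the single-finger check under these conditions; the 500 ms default is only a placeholder echoing the threshold discussed below.

```kotlin
// Single-finger case: the gesture counts as sliding out of the screen edge
// when the finger was lifted inside the preset area and the time from
// entering the area to lifting is below the preset threshold.
fun isSingleFingerSlideOut(
    enterTimeMs: Long,
    liftTimeMs: Long,
    liftedInsidePresetArea: Boolean,
    thresholdMs: Long = 500          // placeholder threshold
): Boolean = liftedInsidePresetArea && (liftTimeMs - enterTimeMs) < thresholdMs
```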
  • As the above shows, the method of this embodiment for judging whether the user's gesture slides to the screen edge or slides off the screen from the corresponding edge mainly considers two conditions, the time condition and the position condition of the finger touch operation; if both conditions are satisfied, the finger is judged to have slid out of the screen edge.
  • This embodiment takes into account that when a user's finger slides toward the edge on the screen and the user only wants to reach the edge area, the finger subconsciously slows down near the edge and may even dwell for a while before lifting; but when the user wants to slide off the screen from that edge, the finger does not slow down and passes the screen edge quickly. The following scheme for recognizing user intention from the sliding gesture is designed around this behavioral characteristic:
  • First, while fingers slide on the screen, the screen touch system tracks each finger's touch, slide, and lift actions separately, i.e. single-finger tracking; the related process is shown in Figure 4.
  • Then, on the basis of this single-finger tracking flow, every finger touching the screen is tracked. When a finger is lifted, it is judged whether it was lifted from the preset area corresponding to the screen edge (shown in Figure 3), and if so the lift time point is recorded.
  • When the recorded number of qualifying lifted fingers reaches the preset threshold for the multi-finger gesture (3 for a three-finger gesture, and so on for other multi-finger gestures), the time difference between the lift time point of the first lifted finger and that of the last lifted finger is calculated; alternatively, the time difference between the entry time point of the first finger into the preset area and the lift time point of the last lifted finger is used.
  • When this time difference is smaller than the preset threshold, the user's intention is judged to be sliding off the corresponding screen edge (taking a three-finger slide out of the bottom edge as an example, the threshold can be set to 0.5 seconds; other cases are determined from analysis of actual usage experience); otherwise the user's intention is judged to be merely sliding to the screen edge. Different response methods can then be adopted in the interaction design according to the judged intention, thereby enriching the user experience.
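A sketch of that multi-finger decision (three fingers, 0.5 s threshold as in the example above; whether the reference time is the first finger's entry or its lift depends on the embodiment):

```kotlin
// Multi-finger case: once the number of fingers lifted inside the preset area
// reaches the gesture's finger count, compare the reference time (entry or
// lift of the first qualifying finger) with the lift time of the last finger.
fun isMultiFingerSlideOut(
    firstReferenceTimeMs: Long,      // entry (or lift) time of the first qualifying finger
    liftTimesMs: List<Long>,         // lift times of fingers lifted inside the preset area
    requiredFingers: Int = 3,        // 3 for a three-finger gesture
    thresholdMs: Long = 500          // 0.5 s, as suggested for a three-finger bottom-edge slide-out
): Boolean {
    if (liftTimesMs.size < requiredFingers) return false
    val lastLiftMs = liftTimesMs.maxOrNull() ?: return false
    return lastLiftMs - firstReferenceTimeMs < thresholdMs
}
```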
  • FIG. 5 is a schematic flowchart of another exemplary embodiment of a touch screen gesture operation control method according to the present application.
  • Based on the embodiment shown in Figure 2, in this embodiment, after step S101 of tracking and detecting the finger's gesture operation on the preset area at the edge of the touch screen and acquiring the finger's gesture state parameters on that area, the method further includes:
  • Step S103: when the gesture state parameters do not satisfy the preset condition, determine that the gesture operation is an operation of sliding to the edge of the touch screen, and trigger a corresponding response operation.
  • Compared with the embodiment shown in Figure 2, this embodiment also includes a processing scheme for the case in which the gesture state parameters do not satisfy the preset condition.
  • Specifically, for multi-finger operation, if the lift positions of the fingers are not all within the preset area, or the number of fingers lifted from the preset area does not reach the preset value, or the time difference between the entry time point of the first finger to enter the preset area (or, in the other implementation, the lift time point of the first lifted finger) and the lift time point of the last lifted finger is not smaller than the preset threshold, the gesture operation is determined to be an operation of sliding to the edge of the touch screen, and a corresponding response operation is triggered.
  • For single-finger operation, if the time difference between the finger's entry time point into the preset area and its lift time point is not smaller than the preset threshold, the gesture operation is determined to be an operation of sliding to the edge of the touch screen, and a corresponding response operation is triggered.
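For instance, the two recognized intents might simply be routed to different callbacks chosen by the interaction design; the callback names and example responses below are placeholders, not behaviors specified by the application.

```kotlin
// Sketch: routing the two recognized intents to different responses.
class EdgeGestureDispatcher(
    private val onSlideOutOfEdge: () -> Unit,   // e.g. dismiss the current view
    private val onSlideToEdge: () -> Unit       // e.g. reveal an edge toolbar
) {
    fun dispatch(meetsPresetCondition: Boolean) {
        if (meetsPresetCondition) onSlideOutOfEdge() else onSlideToEdge()
    }
}

// Example use: bind each outcome to whatever response the design calls for.
val dispatcher = EdgeGestureDispatcher(
    onSlideOutOfEdge = { /* dismiss current view */ },
    onSlideToEdge = { /* reveal edge toolbar */ }
)
```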
  • Through the above solution, this embodiment tracks and detects the finger's gesture operation on the preset area at the edge of the touch screen and acquires the finger's gesture state parameters on that area; when the gesture state parameters satisfy the preset condition, the gesture operation is determined to be an operation of sliding out of the edge of the touch screen and a corresponding response operation is triggered; when they do not, the gesture operation is determined to be an operation of sliding to the edge of the touch screen and a corresponding response operation is triggered.
  • This solution distinguishes the user's subconscious behavior when a finger slides out of the screen edge from other edge sliding operations, improving the recognition accuracy of touch gestures at the edge of the touch screen so that screen border gestures can be better used to carry out the corresponding screen operations.
  • Moreover, it provides more gesture interaction methods on top of common screen touch gestures and offers more gesture options for touch screen interaction design; used reasonably in interaction design, it can give users a richer and more convenient interface interaction experience.
  • In addition, an embodiment of the present application also provides a touch screen gesture operation control device, which includes:
  • a tracking module, configured to track and detect the finger's gesture operation on the preset area at the edge of the touch screen and acquire the finger's gesture state parameters on that area; and
  • an operation module, configured to determine, when the gesture state parameters satisfy a preset condition, that the gesture operation is an operation of sliding out of the edge of the touch screen, and to trigger a corresponding response operation.
  • In addition, an embodiment of the present application also proposes a terminal device.
  • The terminal device includes a memory, a processor, and a touch screen gesture operation control program that is stored in the memory and executable on the processor.
  • When the touch screen gesture operation control program is executed by the processor, the steps of the above touch screen gesture operation control method are implemented.
  • Since the touch screen gesture operation control program, when executed by the processor, adopts all the technical solutions of the foregoing embodiments, it has at least all the beneficial effects brought by those technical solutions, which are not repeated here.
  • An embodiment of the present application also proposes a computer-readable storage medium storing a touch screen gesture operation control program; when the program is executed by a processor, the steps of the above touch screen gesture operation control method are implemented.
  • Since this program, when executed by the processor, likewise adopts all the technical solutions of the foregoing embodiments, it has at least all their beneficial effects, which are not repeated here.
  • The touch screen gesture operation control method, device, terminal device, and storage medium proposed in the embodiments of the present application track and detect the finger's gesture operation on the preset area at the edge of the touch screen to obtain the finger's gesture state parameters on that area; when the gesture state parameters satisfy a preset condition, the gesture operation is determined to be an operation of sliding out of the edge of the touch screen, and a corresponding response operation is triggered.
  • This solution distinguishes the user's subconscious behavior when a finger slides out of the screen edge from other edge sliding operations, improving the recognition accuracy of touch gestures at the edge of the touch screen so that screen border gestures can be better used to carry out the corresponding screen operations.
  • Moreover, it provides more gesture interaction methods on top of common screen touch gestures and offers more gesture options for touch screen interaction design; used reasonably in interaction design, it can give users a richer and more convenient interface interaction experience.
  • The technical solution of this application, in essence or in the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes a number of instructions that enable a terminal device (which can be a mobile phone, a computer, a server, a controlled terminal, a network device, etc.) to execute the method of each embodiment of the present application.
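Pulling the sketches above together, one possible end-to-end shape of the described flow is outlined below for the single-finger case; all class names, the bottom-edge strip, and the 500 ms threshold are assumptions of this sketch rather than specifics of the application. Multi-finger handling would extend it by counting qualifying lifts as described above.

```kotlin
import android.graphics.RectF
import android.view.MotionEvent

// Sketch: per-finger edge tracking plus the time and position judgment for a
// single-finger edge gesture, fed from a View's onTouchEvent.
class EdgeSwipeOutDetector(
    private val edgeRegion: RectF,              // the preset area at the screen edge
    private val thresholdMs: Long = 500,        // placeholder threshold
    private val onSlideOutOfEdge: () -> Unit,
    private val onSlideToEdge: () -> Unit
) {
    private val enterTimeByPointer = mutableMapOf<Int, Long>()   // finger id -> entry time into the region

    fun onTouchEvent(event: MotionEvent) {
        when (event.actionMasked) {
            MotionEvent.ACTION_MOVE -> {
                // Record the first moment each finger enters the preset area.
                for (i in 0 until event.pointerCount) {
                    val id = event.getPointerId(i)
                    if (edgeRegion.contains(event.getX(i), event.getY(i))) {
                        enterTimeByPointer.getOrPut(id) { event.eventTime }
                    }
                }
            }
            MotionEvent.ACTION_UP, MotionEvent.ACTION_POINTER_UP -> {
                val i = event.actionIndex
                val id = event.getPointerId(i)
                val enteredAt = enterTimeByPointer.remove(id) ?: return
                val liftedInside = edgeRegion.contains(event.getX(i), event.getY(i))
                // Position condition and time condition, as described above.
                if (liftedInside && event.eventTime - enteredAt < thresholdMs) {
                    onSlideOutOfEdge()
                } else if (liftedInside) {
                    onSlideToEdge()
                }
            }
        }
    }
}
```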

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application discloses a touch screen gesture operation control method and apparatus, a terminal device, and a storage medium. The method includes: tracking and detecting gesture operations on a preset area at the edge of the touch screen, and acquiring gesture state parameters on the preset area at the edge of the touch screen; when the gesture state parameters satisfy a preset condition, determining that the gesture operation is an operation of sliding out of the edge of the touch screen, and triggering a corresponding response operation. This application improves the recognition accuracy of touch gestures at the edge of the touch screen, so that screen border gestures can be better used to carry out the corresponding screen operations; it also provides more gesture interaction methods on top of common screen touch gestures and offers more gesture options for touch screen interaction design, and reasonable use in interaction design can give users a richer and more convenient interface interaction experience.

Description

Touch screen gesture operation control method and apparatus, terminal device, and storage medium
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on March 20, 2020, with application number 202010205297.0 and entitled "Touch screen gesture operation control method and apparatus, terminal device, and storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of touch screen control, and in particular to a touch screen gesture operation control method and apparatus, a terminal device, and a storage medium.
Background
With the popularization of smart phones with touch screens, gesture operation has become one of the mainstream operation and interaction methods of mobile phones. Among these, touch gestures based on the edge of the screen can meet interaction requirements while minimizing occlusion of the screen by the hand, making it easier for the user to view the displayed content while operating and improving the user experience.
However, current gesture operations are mainly designed for tablets or other large-screen devices that are convenient to operate with two hands. For mobile phones, which have smaller screens and are mostly operated with one hand (the other hand mainly holding the device), some gestures are not convenient to use; in particular, touch gestures based on the edge of the screen are not recognized accurately enough.
Technical Problem
The main purpose of this application is to provide a touch screen gesture operation control method and apparatus, a terminal device, and a storage medium, aiming to improve the recognition accuracy of touch gestures at the edge of the touch screen.
Technical Solution
To achieve the above objective, this application provides a touch screen gesture operation control method, which includes:
tracking and detecting gesture operations on a preset area at the edge of the touch screen, and acquiring gesture state parameters on the preset area at the edge of the touch screen;
when the gesture state parameters satisfy a preset condition, determining that the gesture operation is an operation of sliding out of the edge of the touch screen, and triggering a corresponding response operation.
An embodiment of this application also proposes a touch screen gesture operation control device, which includes:
a tracking module, configured to track and detect gesture operations on a preset area at the edge of the touch screen and acquire gesture state parameters on the preset area at the edge of the touch screen;
an operation module, configured to determine, when the gesture state parameters satisfy a preset condition, that the gesture operation is an operation of sliding out of the edge of the touch screen, and to trigger a corresponding response operation.
An embodiment of this application also proposes a terminal device. The terminal device includes a memory, a processor, and a touch screen gesture operation control program stored in the memory and executable on the processor; when the touch screen gesture operation control program is executed by the processor, the steps of the above touch screen gesture operation control method are implemented.
An embodiment of this application also proposes a computer-readable storage medium storing a touch screen gesture operation control program; when the touch screen gesture operation control program is executed by a processor, the steps of the above touch screen gesture operation control method are implemented.
Beneficial Effects
The touch screen gesture operation control method and apparatus, terminal device, and storage medium proposed in the embodiments of this application acquire gesture state parameters on a preset area at the edge of the touch screen by tracking and detecting gesture operations on that area; when the gesture state parameters satisfy a preset condition, the gesture operation is determined to be an operation of sliding out of the edge of the touch screen, and a corresponding response operation is triggered. This solution distinguishes the user's subconscious behavior when a finger slides out of the screen edge from other edge sliding operations, improving the recognition accuracy of touch gestures at the edge of the touch screen so that screen border gestures can be better used to carry out the corresponding screen operations. Moreover, it provides more gesture interaction methods on top of common screen touch gestures and offers more gesture options for touch screen interaction design; used reasonably in interaction design, it can give users a richer and more convenient interface interaction experience.
Brief Description of the Drawings
Figure 1 is a schematic diagram of the functional modules of the terminal device to which the touch screen gesture operation control device of this application belongs;
Figure 2 is a schematic flowchart of an exemplary embodiment of the touch screen gesture operation control method of this application;
Figure 3 is a schematic diagram of a gesture recognition area of a touch screen in an embodiment of this application;
Figure 4 is a schematic diagram of the single-finger tracking process involved in an embodiment of this application;
Figure 5 is a schematic flowchart of another exemplary embodiment of the touch screen gesture operation control method of this application.
本发明的实施方式
应当理解,此处所描述的具体实施例仅仅用以解释本申请,并不用于限定本申请。
本申请实施例的主要解决方案是:通过追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数;在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。该方案可以将用户手指滑出屏幕边缘时下意识的行为与其他屏幕边缘滑动操作进行区分,提升了触摸屏屏幕边缘的触摸手势的识别准确性。而且在常见的屏幕触摸手势的基础上提供更多的手势交互方式,为触摸屏交互设计提供了更多的手势选择,在交互设计中合理运用即可为用户提供更为丰富和便捷的界面交互体验。
本申请涉及的技术术语:
屏幕触控系统:具有触摸屏设备自带的识别手指触摸屏幕事件的软硬件系统,如Andro标识手机的MotionEvent机制。
本申请实施例考虑到,相关技术方案基于屏幕边缘的触摸手势,没有区分手指向着屏幕边缘滑动时的用户意图是滑动到屏幕边缘还是直接滑出屏幕,因此基于屏幕边缘的触摸手势识别较为困难,容易造成误操作。
基于此,本申请实施例提出一种解决方案,可以区分手指向着屏幕边缘滑动时的用户意图是滑动到屏幕边缘还是直接滑出屏幕,提升触摸屏屏幕边缘的触摸手势的识别准确性,从而更好的利用屏幕边框手势实现相应的屏幕操作。
具体地,参照图1,图1为本申请触摸屏手势操作控制装置所属终端设备的功能模块示意图。该触摸屏手势操作控制装置可以为独立于终端设备的、能够进行数据处理的装置,其可以通过硬件或软件的形式承载于终端设备上。该终端设备可以为手机、平板电脑等具有触摸屏的智能移动终端,还可以为具有触摸屏的固定终端,本实施例以手机进行举例。
在本实施例中,该触摸屏手势操作控制装置所属终端设备至少包括输出模块110、处理器120、存储器130以及通信模块140。
存储器130中存储有操作系统以及触摸屏手势操作控制程序,触摸屏手势操作控制装置可以将检测到的手指在触摸屏屏幕边缘预设区域上的手势操作,以及获取到的手指在触摸屏屏幕边缘预设区域上的手势状态参数等信息存储于该存储器130中;输出模块110可为显示屏、扬声器等。通信模块140可以包括WIFI模块、移动通信模块以及蓝牙模块等,通过通信模块140与外部设备或服务器进行通信。
其中,存储器130中的触摸屏手势操作控制程序被处理器执行时实现以下步骤:
追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数;
在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
进一步地,存储器130中的触摸屏手势操作控制程序被处理器执行时还实现以下步骤:
在所述手势状态参数不满足预设条件时,确定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
进一步地,存储器130中的触摸屏手势操作控制程序被处理器执行时还实现以下步骤:
当检测到触摸屏的屏幕上的触摸指令时,记录触摸手指在屏幕触控系统的标识以进行追踪;
当检测到手指在所述触摸屏的屏幕上滑动时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
追踪所述手指的滑动状态,根据所述手指的标识从所述屏幕触控系统获取所述手指进入触摸屏屏幕边缘预设区域的进入时间点;
在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
进一步地,存储器130中的触摸屏手势操作控制程序被处理器执行时还实现以下步骤:
若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
进一步地,存储器130中的触摸屏手势操作控制程序被处理器执行时还实现以下步骤:
若从所述预设区域抬起的手指数量达到预设值,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
进一步地,存储器130中的触摸屏手势操作控制程序被处理器执行时还实现以下步骤:
若所述手指进入触摸屏屏幕边缘预设区域的进入时间点与从所述预设区域抬起时的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
进一步地,存储器130中的触摸屏手势操作控制程序被处理器执行时还实现以下步骤:
当检测到手指触摸到触摸屏的屏幕时,记录所述手指在屏幕触控系统的标识以进行追踪;
当检测到手指在所述触摸屏的屏幕上滑动时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
追踪所述手指的滑动状态,
在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
进一步地,存储器130中的触摸屏手势操作控制程序被处理器执行时还实现以下步骤:
若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个从所述预设区域抬起的第一手指的抬起时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
进一步地,存储器130中的触摸屏手势操作控制程序被处理器执行时还实现以下步骤:
若从所述预设区域抬起的手指数量达到预设值,且第一个从所述预设区域抬起的第一手指的抬起时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
本实施例通过上述方案,具体通过追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数;在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。该方案可以将用户手指滑出屏幕边缘时下意识的行为与其他屏幕边缘滑动操作进行区分,提升了触摸屏屏幕边缘的触摸手势的识别准确性,从而更好的利用屏幕边框手势实现相应的屏幕操作。而且在常见的屏幕触摸手势的基础上提供更多的手势交互方式,为触摸屏交互设计提供了更多的手势选择,在交互设计中合理运用即可为用户提供更为丰富和便捷的界面交互体验。
基于上述终端设备架构但不限于上述架构,提出本申请方法实施例。
参照图2,图2为本申请触摸屏手势操作控制方法一示例性实施例的流程示意图。所述触摸屏手势操作控制方法包括:
步骤S101,追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数;
本实施例方法的执行主体可以是一种触摸屏手势操作控制装置,也可以是触摸屏终端,本实施例以手机进行举例。
为了提升触摸屏屏幕边缘的触摸手势的识别准确性,从而更好的利用屏幕边框手势实现相应的屏幕操作,本实施例在触摸屏终端的屏幕边缘设有预设区域,通过该预设区域上的手势触控操作跟踪识别,来实现对触摸屏上手指滑动到屏幕边缘和手指滑出屏幕这两种手势进行区分。该预设区域,可以是一个靠近屏幕边缘的长条边框区域,如图3所示,在屏幕的下部边缘设有屏幕边缘手势识别区,该区域的大小或宽度可以根据触摸屏屏幕触控系统感应手指触摸屏幕的特点进行设定,可以依据手指指头接触触摸屏时覆盖的像素区域范围来设定手势识别区域大小。
在本实施例中,通过追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数。
其中,手指在所述触摸屏屏幕边缘预设区域上的手势状态参数可以包括:手指进入触摸屏屏幕边缘预设区域的进入时间点、在手指从触摸屏屏幕边缘预设区域抬起时的抬起时间点,以及手指抬起时所处的屏幕位置是否在触摸屏屏幕边缘预设区域等。
本实施例方案可以适用单手指触控屏幕边缘的手势操作,也可以适用多手指触控屏幕边缘的手势操作。
其中,追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作的原理如下:
如图4所示,以单指追踪为例,当检测到手指触摸到触摸屏的屏幕时,记录所述手指在屏幕触控系统的标识以进行追踪;
当检测到手指在所述触摸屏的屏幕上滑动时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
然后,追踪所述手指的滑动状态,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数。其中,在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置以及抬起时的时间点并进行记录。
在本实施例中,通过追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数。具体实现如下:
作为一种实施方式,当检测到手指在所述触摸屏的屏幕上滑动时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
然后,追踪所述手指的滑动状态,根据所述手指的标识从所述屏幕触控系统获取所述手指进入触摸屏屏幕边缘预设区域的进入时间点;
在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势状态参数满足预设条件,从而确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
作为另一种实施方式,当检测到手指触摸到触摸屏的屏幕时,记录所述手指在屏幕触控系统的标识以进行追踪;
当检测到手指在所述触摸屏的屏幕上滑动时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
追踪所述手指的滑动状态,在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个从所述预设区域抬起的第一手指的抬起时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势状态参数满足预设条件,从而确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
步骤S102,在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
当手指在触摸屏屏幕边缘预设区域上的手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应滑出触摸屏屏幕边缘的响应操作。如果手势状态参数不满足预设条件,则可以不做响应,或者响应其他对应的触控操作。
本实施例通过上述方案,具体通过追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数;在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。该方案可以将用户手指滑出屏幕边缘时下意识的行为与其他屏幕边缘滑动操作进行区分,提升了触摸屏屏幕边缘的触摸手势的识别准确性,从而更好的利用屏幕边框手势实现相应的屏幕操作。而且在常见的屏幕触摸手势的基础上提供更多的手势交互方式,为触摸屏交互设计提供了更多的手势选择,在交互设计中合理运用即可为用户提供更为丰富和便捷的界面交互体验。
以下对单手指和多手指触摸手势操作控制过程进行详细阐述:
作为一种实施方式,追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数的步骤可以包括:
当检测到手指触摸到触摸屏的屏幕时,记录所述手指在屏幕触控系统的标识以进行追踪;
当检测到手指在所述触摸屏的屏幕上滑动时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
追踪所述手指的滑动状态,根据所述手指的标识从所述屏幕触控系统获取所述手指进入触摸屏屏幕边缘预设区域的进入时间点;
在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
基于上述手势操作追踪检测方案,对于多手指操作而言,其中一种判断手势操作是否为滑出屏幕边缘的操作的解决方案如下:
若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
基于上述手势操作追踪检测方案,对于多手指操作而言,其中另一种判断手势操作是否为滑出屏幕边缘的操作的解决方案如下:
若从所述预设区域抬起的手指数量达到预设值,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
基于上述手势操作追踪检测方案,对于单手指操作而言,其中一种判断手势操作是否为滑出屏幕边缘的操作的解决方案如下:
若所述手指进入触摸屏屏幕边缘预设区域的进入时间点与从所述预设区域抬起时的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
作为另一种实施方式,追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数的步骤可以包括:
当检测到手指触摸到触摸屏的屏幕时,记录所述手指在屏幕触控系统的标识以进行追踪;
当检测到手指在所述触摸屏的屏幕上滑动时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
追踪所述手指的滑动状态,
在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
基于上述手势操作追踪检测方案,对于多手指操作而言,其中一种判断手势操作是否为滑出屏幕边缘的操作的解决方案如下:
若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个从所述预设区域抬起的第一手指的抬起时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
基于上述手势操作追踪检测方案,对于多手指操作而言,其中另一种判断手势操作是否为滑出屏幕边缘的操作的解决方案如下:
若从所述预设区域抬起的手指数量达到预设值,且第一个从所述预设区域抬起的第一手指的抬起时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
通过上述方案可知,本实施例判断用户手势是滑动到屏幕边缘位置还是直接从对应边缘滑出屏幕的方法主要考虑两个条件:手指触控操作的时间条件和位置条件,若两个条件均满足,则判定手指滑出屏幕边缘。
本实施例考虑到:当用户的手指在屏幕上滑动到边缘时,如果仅仅是想滑动到屏幕边缘区域,会下意识的在靠近屏幕边缘时减速,甚至是停留一段时间后再抬起手指;但如果是想从对应的边缘滑出屏幕,手指滑动时并不会减速,而是快速通过屏幕边缘。故基于这种用户行为特点,设计出以上的根据滑动手势识别用户意图方案:
首先,当手指在屏幕上滑动时,基于屏幕触控系统对每个手指的触碰屏幕、滑动、抬起等行为分别进行追踪,即单指追踪,相关流程如图4所示。
然后,以上述单指追踪流程为基础,对每个触摸屏幕的手指进行追踪。当某个手指抬起时,判断其是否从对应屏幕边缘的预设区域(如图3所示)内抬起,如果是就记录其抬起的时间点。当记录的符合条件的已抬起手指数量达到多指手势的预设阀值(三指手势对应阀值就为3,其他多指手势类推)后,计算记录中第一个抬起的手指的抬起时间点和最后一个抬起的手指的抬起时间点的时间差,或者,记录第一个进入屏幕边缘预设区域的手指的进入时间点和最后一个抬起的手指的抬起时间点的时间差。
当这个手指抬起时间差小于时间差的预设阀值时,即判定用户意图为从对应屏幕边缘滑出(以三指向屏幕下边缘滑出为例,则时间差的预设阀值可设置为0.5秒,其他情况根据实际的使用体验分析来定),否则判定用户意图为仅仅滑动到屏幕边缘。之后,根据判定的用户意图的区别,可以在交互设计中采取不同的响应方式,从而丰富用户体验。
参照图5,图5为本申请触摸屏手势操作控制方法另一示例性实施例的流程示意图。基于上述图2所示的实施例,在本实施例中,在上述步骤S101,追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数之后还包括:
步骤S103,在所述手势状态参数不满足预设条件时,确定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
相比上述图2所示的实施例,本实施例还包括在所述手势状态参数不满足预设条件时的处理方案。
具体地,在本实施例中,作为一种实施方式,对多手指而言,若所有手指抬起时所处的屏幕位置并不是均位于所述触摸屏屏幕边缘预设区域,或者,第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差不小于预设阈值,则判定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
或者,若从所述预设区域抬起的手指数量未达到预设值,或者第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差不小于预设阈值,则判定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
对于单手指操作而言,若所述手指进入触摸屏屏幕边缘预设区域的进入时间点与从所述预设区域抬起时的抬起时间点之间的时间差不小于预设阈值,则判定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
作为另一种实施方式,对于多手指操作而言,若从所述预设区域抬起的手指数量未达到预设值,或者第一个从所述预设区域抬起的第一手指的抬起时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差不小于预设阈值,则判定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
本实施例通过上述方案,具体通过追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数;在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作;在所述手势状态参数不满足预设条件时,确定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。该方案可以将用户手指滑出屏幕边缘时下意识的行为与其他屏幕边缘滑动操作进行区分,提升了触摸屏屏幕边缘的触摸手势的识别准确性,从而更好的利用屏幕边框手势实现相应的屏幕操作。而且在常见的屏幕触摸手势的基础上提供更多的手势交互方式,为触摸屏交互设计提供了更多的手势选择,在交互设计中合理运用即可为用户提供更为丰富和便捷的界面交互体验。
此外,本申请实施例还提出一种触摸屏手势操作控制装置,所述触摸屏手势操作控制装置包括:
追踪模块,设置为追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数;
操作模块,设置为在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
本实施例实现触摸屏手势操作控制的原理及实施过程,请参照上述各实施例,在此不再赘述。
此外,本申请实施例还提出一种终端设备,所述终端设备包括存储器、处理器及存储在所述存储器上并可在所述处理器上运行的触摸屏手势操作控制程序,所述触摸屏手势操作控制程序被所述处理器执行时实现如上所述的触摸屏手势操作控制方法的步骤。
由于本触摸屏手势操作控制程序被处理器执行时,采用了前述所有实施例的全部技术方案,因此至少具有前述所有实施例的全部技术方案所带来的所有有益效果,在此不再一一赘述。
此外,本申请实施例还提出一种计算机可读存储介质,所述计算机可读存储介质上存储有触摸屏手势操作控制程序,所述触摸屏手势操作控制程序被处理器执行时实现如上所述的触摸屏手势操作控制方法的步骤。
由于本触摸屏手势操作控制程序被处理器执行时,采用了前述所有实施例的全部技术方案,因此至少具有前述所有实施例的全部技术方案所带来的所有有益效果,在此不再一一赘述。
相比现有技术,本申请实施例提出的触摸屏手势操作控制方法、装置、终端设备以及存储介质,通过追踪检测手指在触摸屏屏幕边缘预设区域上的手势操作,获取所述手指在所述触摸屏屏幕边缘预设区域上的手势状态参数;在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。该方案可以将用户手指滑出屏幕边缘时下意识的行为与其他屏幕边缘滑动操作进行区分,提升了触摸屏屏幕边缘的触摸手势的识别准确性,从而更好的利用屏幕边框手势实现相应的屏幕操作。而且在常见的屏幕触摸手势的基础上提供更多的手势交互方式,为触摸屏交互设计提供了更多的手势选择,在交互设计中合理运用即可为用户提供更为丰富和便捷的界面交互体验。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者系统不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者系统所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者系统中还存在另外的相同要素。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在如上的一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,被控终端,或者网络设备等)执行本申请每个实施例的方法。
以上仅为本申请的优选实施例,并非因此限制本申请的专利范围,凡是利用本申请说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本申请的专利保护范围内。

Claims (20)

  1. 一种触摸屏手势操作控制方法,其中,所述触摸屏手势操作控制方法包括:
    追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数;
    在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
  2. 根据权利要求1所述的触摸屏手势操作控制方法,其中,所述追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数的步骤之后还包括:
    在所述手势状态参数不满足预设条件时,确定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
  3. 根据权利要求1所述的触摸屏手势操作控制方法,其中,所述追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数的步骤包括:
    当检测到触摸屏的屏幕上的触摸指令时,记录触摸手指在屏幕触控系统的标识以进行追踪;
    当检测到在所述触摸屏的屏幕上基于所述手指的滑动指令时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
    追踪所述手指的滑动状态,根据所述手指的标识从所述屏幕触控系统获取所述手指进入触摸屏屏幕边缘预设区域的进入时间点;
    在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
    在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
  4. 根据权利要求3所述的触摸屏手势操作控制方法,其中,所述手势操作包括多手指操作,所述在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作的步骤包括:
    若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
  5. 根据权利要求3所述的触摸屏手势操作控制方法,其中,所述手势操作包括多手指操作,所述在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作的步骤包括:
    若从所述预设区域抬起的手指数量达到预设值,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
  6. 根据权利要求3所述的触摸屏手势操作控制方法,其中,所述手势操作包括单手指操作,所述在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作的步骤包括:
    若所述手指进入触摸屏屏幕边缘预设区域的进入时间点与从所述预设区域抬起时的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
  7. 根据权利要求1所述的触摸屏手势操作控制方法,其中,所述追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数的步骤包括:
    当检测到触摸屏的屏幕上的触摸指令时,记录触摸手指在屏幕触控系统的标识以进行追踪;
    当检测到在所述触摸屏的屏幕上基于所述手指的滑动指令时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
    追踪所述手指的滑动状态,在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
    在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
  8. 根据权利要求7所述的触摸屏手势操作控制方法,其中,所述手势操作包括多手指操作,所述在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作的步骤包括:
    若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个从所述预设区域抬起的第一手指的抬起时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
  9. 根据权利要求7所述的触摸屏手势操作控制方法,其中,所述手势操作包括多手指操作,所述在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作的步骤包括:
    若从所述预设区域抬起的手指数量达到预设值,且第一个从所述预设区域抬起的第一手指的抬起时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
  10. 一种触摸屏手势操作控制装置,其中,所述触摸屏手势操作控制装置包括:
    追踪模块,设置为追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数;
    操作模块,设置为在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
  11. 根据权利要求10所述的触摸屏手势操作控制装置,其中,
    所述操作模块,还设置为在所述手势状态参数不满足预设条件时,确定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
  12. 一种终端设备,其中,所述终端设备包括存储器、处理器及存储在所述存储器上并可在所述处理器上运行的触摸屏手势操作控制程序,所述触摸屏手势操作控制程序被所述处理器执行时实现如下操作:
    追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数;
    在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
  13. 根据权利要求12所述的终端设备,其中,所述触摸屏手势操作控制程序被所述处理器执行时还实现如下操作:
    在所述手势状态参数不满足预设条件时,确定所述手势操作为滑动到触摸屏屏幕边缘的操作,并触发对应的响应操作。
  14. 根据权利要求12所述的终端设备,其中,所述触摸屏手势操作控制程序被所述处理器执行时还实现如下操作:
    当检测到触摸屏的屏幕上的触摸指令时,记录触摸手指在屏幕触控系统的标识以进行追踪;
    当检测到在所述触摸屏的屏幕上基于所述手指的滑动指令时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
    追踪所述手指的滑动状态,根据所述手指的标识从所述屏幕触控系统获取所述手指进入触摸屏屏幕边缘预设区域的进入时间点;
    在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
    在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
  15. 根据权利要求14所述的终端设备,其中,所述触摸屏手势操作控制程序被所述处理器执行时还实现如下操作:
    若所有手指抬起时所处的屏幕位置均位于所述触摸屏屏幕边缘预设区域,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
  16. 根据权利要求14所述的终端设备,其中,所述触摸屏手势操作控制程序被所述处理器执行时还实现如下操作:
    若从所述预设区域抬起的手指数量达到预设值,且第一个进入触摸屏屏幕边缘预设区域的第一手指的进入时间点与最后一个从所述预设区域抬起的最后手指的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
  17. 根据权利要求14所述的终端设备,其中,所述触摸屏手势操作控制程序被所述处理器执行时还实现如下操作:
    若所述手指进入触摸屏屏幕边缘预设区域的进入时间点与从所述预设区域抬起时的抬起时间点之间的时间差小于预设阈值,则判定所述手势操作为滑出屏幕边缘的操作,并触发对应的响应操作。
  18. 根据权利要求12所述的终端设备,其中,所述触摸屏手势操作控制程序被所述处理器执行时还实现如下操作:
    当检测到触摸屏的屏幕上的触摸指令时,记录触摸手指在屏幕触控系统的标识以进行追踪;
    当检测到在所述触摸屏的屏幕上基于所述手指的滑动指令时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
    追踪所述手指的滑动状态,在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
    在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
  19. 一种计算机可读存储介质,其中,所述计算机可读存储介质上存储有触摸屏手势操作控制程序,所述触摸屏手势操作控制程序被处理器执行时实现如下操作:
    追踪检测在触摸屏屏幕边缘预设区域上的手势操作,获取在所述触摸屏屏幕边缘预设区域上的手势状态参数;
    在所述手势状态参数满足预设条件时,确定所述手势操作为滑出触摸屏屏幕边缘的操作,并触发对应的响应操作。
  20. 根据权利要求19所述的计算机可读存储介质,其中,所述触摸屏手势操作控制程序被所述处理器执行时还实现如下操作:
    当检测到触摸屏的屏幕上的触摸指令时,记录触摸手指在屏幕触控系统的标识以进行追踪;
    当检测到在所述触摸屏的屏幕上基于所述手指的滑动指令时,根据所述手指的标识从所述屏幕触控系统获取所述手指当前在屏幕上的滑动状态;
    追踪所述手指的滑动状态,根据所述手指的标识从所述屏幕触控系统获取所述手指进入触摸屏屏幕边缘预设区域的进入时间点;
    在检测到所述手指从屏幕上抬起时,根据所述手指的标识从所述屏幕触控系统获取所述手指抬起时所处的屏幕位置;
    在所述手指抬起时所处的屏幕位置位于所述触摸屏屏幕边缘预设区域时,获取抬起时间点。
PCT/CN2021/074170 2020-03-20 2021-01-28 触摸屏手势操作控制方法、装置、终端设备以及存储介质 WO2021184971A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010205297.0A CN111427500A (zh) 2020-03-20 2020-03-20 触摸屏手势操作控制方法、装置、终端设备以及存储介质
CN202010205297.0 2020-03-20

Publications (1)

Publication Number Publication Date
WO2021184971A1 true WO2021184971A1 (zh) 2021-09-23

Family

ID=71548473

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/074170 WO2021184971A1 (zh) 2020-03-20 2021-01-28 触摸屏手势操作控制方法、装置、终端设备以及存储介质

Country Status (2)

Country Link
CN (1) CN111427500A (zh)
WO (1) WO2021184971A1 (zh)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111427500A (zh) * 2020-03-20 2020-07-17 Oppo广东移动通信有限公司 触摸屏手势操作控制方法、装置、终端设备以及存储介质
CN112256126A (zh) * 2020-10-19 2021-01-22 上海肇观电子科技有限公司 用于识别手势的方法、电子电路、电子设备和介质
CN112506376B (zh) * 2020-12-09 2023-01-20 歌尔科技有限公司 圆形屏幕的触摸控制方法、终端设备及存储介质
CN113050868A (zh) * 2021-04-07 2021-06-29 中国科学院软件研究所 一种可屏幕内触发边缘手势的控制方法
CN113891160B (zh) * 2021-09-22 2024-02-13 百果园技术(新加坡)有限公司 内容卡片的滑动切换方法、装置、终端及存储介质
WO2023077292A1 (zh) * 2021-11-03 2023-05-11 北京奇点跳跃科技有限公司 触控板控制终端屏幕的方法、装置、控制设备及存储介质
CN114415930A (zh) * 2021-12-31 2022-04-29 联想(北京)有限公司 一种信息处理方法、信息处理装置及电子设备
CN114579033B (zh) * 2022-05-05 2023-04-14 深圳市闪剪智能科技有限公司 安卓平台的手势切换方法、装置、设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160239172A1 (en) * 2015-02-13 2016-08-18 Here Global B.V. Method, apparatus and computer program product for calculating a virtual touch position
CN106155419A (zh) * 2008-01-04 2016-11-23 苹果公司 选择性地拒绝触摸表面的边缘区域中的触摸接触
CN107506092A (zh) * 2017-09-30 2017-12-22 联想(北京)有限公司 一种输入控制方法和终端
CN108845752A (zh) * 2018-06-27 2018-11-20 Oppo广东移动通信有限公司 触控操作方法、装置、存储介质及电子设备
CN111427500A (zh) * 2020-03-20 2020-07-17 Oppo广东移动通信有限公司 触摸屏手势操作控制方法、装置、终端设备以及存储介质

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106527933B (zh) * 2016-10-31 2020-09-01 努比亚技术有限公司 移动终端边缘手势的控制方法及装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106155419A (zh) * 2008-01-04 2016-11-23 苹果公司 选择性地拒绝触摸表面的边缘区域中的触摸接触
US20160239172A1 (en) * 2015-02-13 2016-08-18 Here Global B.V. Method, apparatus and computer program product for calculating a virtual touch position
CN107506092A (zh) * 2017-09-30 2017-12-22 联想(北京)有限公司 一种输入控制方法和终端
CN108845752A (zh) * 2018-06-27 2018-11-20 Oppo广东移动通信有限公司 触控操作方法、装置、存储介质及电子设备
CN111427500A (zh) * 2020-03-20 2020-07-17 Oppo广东移动通信有限公司 触摸屏手势操作控制方法、装置、终端设备以及存储介质

Also Published As

Publication number Publication date
CN111427500A (zh) 2020-07-17

Similar Documents

Publication Publication Date Title
WO2021184971A1 (zh) 触摸屏手势操作控制方法、装置、终端设备以及存储介质
US11460918B2 (en) Managing and mapping multi-sided touch
JP7575435B2 (ja) 電子デバイス上の手書き入力
US11604560B2 (en) Application association processing method and apparatus
US9261990B2 (en) Hybrid touch screen device and method for operating the same
KR101995278B1 (ko) 터치 디바이스의 ui 표시방법 및 장치
WO2015176484A1 (zh) 触摸输入控制方法及装置
CN106598455B (zh) 用于手持触摸设备的触摸行为响应方法和装置及相应设备
US11740754B2 (en) Method for interface operation and terminal, storage medium thereof
CN113448479B (zh) 单手操作模式开启方法、终端及计算机存储介质
WO2015131590A1 (zh) 一种控制黑屏手势处理的方法及终端
US20150370473A1 (en) Using a symbol recognition engine
EP2899623A2 (en) Information processing apparatus, information processing method, and program
US10599326B2 (en) Eye motion and touchscreen gestures
WO2021232956A1 (zh) 设备控制方法、装置、存储介质及电子设备
CN103809794B (zh) 一种信息处理方法以及电子设备
CN109213349A (zh) 基于触摸屏的交互方法及装置、计算机可读存储介质
WO2023169499A1 (zh) 触摸屏的单手控制方法、控制装置、电子设备和存储介质
WO2023098628A1 (zh) 触控操作方法、装置和电子设备
CN114020199B (zh) 一种单手控制方法、装置及移动终端
CN113110786B (zh) 一种页面滑动控制方法及移动终端
CN108877742A (zh) 亮度调整方法及装置
KR101433147B1 (ko) 빠른 일정 보기를 위한 모바일 디바이스의 사용자 인터페이스 방법
WO2024152675A1 (zh) 界面控制方法、装置、终端及存储介质
EP3101522A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21770660

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21770660

Country of ref document: EP

Kind code of ref document: A1