
WO2016107283A1 - Method and apparatus for controlling an application - Google Patents

Method and apparatus for controlling an application

Info

Publication number
WO2016107283A1
WO2016107283A1 (PCT/CN2015/093862)
Authority
WO
WIPO (PCT)
Prior art keywords
application
current
button
physical button
interface
Prior art date
Application number
PCT/CN2015/093862
Other languages
English (en)
French (fr)
Inventor
高斯太
沈文兴
Original Assignee
小米科技有限责任公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 小米科技有限责任公司
Priority to EP15874967.1A (EP3208704B1)
Priority to KR1020167000039A (KR101819985B1)
Priority to JP2016567125A (JP6186091B2)
Priority to BR112016002304A (BR112016002304A2)
Priority to MX2016000463A (MX355827B)
Priority to RU2016101409A (RU2640733C2)
Priority to US15/052,816 (US20160187997A1)
Publication of WO2016107283A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • the present disclosure relates to the field of communications and computer processing, and more particularly to methods and apparatus for controlling applications.
  • the present disclosure provides a method and apparatus for controlling an application.
  • a method of controlling an application, including:
  • receiving a trigger operation of a physical button;
  • determining an application operation corresponding to the triggering operation of the physical button in the current application; and
  • performing the application operation for the current application.
  • this embodiment pre-configures a correspondence between the physical button and the application operation, and then controls the application by executing the application operation, without needing to be familiar with or invoke the application's internal interfaces. To extend to and stay compatible with new applications, it is sufficient to update the correspondence. Because a physical button gives the user tactile feedback, the user can clearly tell whether an operation succeeded. This embodiment is especially useful when the user cannot conveniently look at the screen or operate the touch screen.
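The pre-configured correspondence can be pictured as a small lookup table. The sketch below is illustrative only; the application names, trigger names, and operation names are hypothetical, not part of the disclosure:

```python
# Hypothetical sketch of the pre-configured correspondence between
# physical-button trigger operations and application operations.
BUTTON_MAP = {
    ("reader", "click"): "page_forward",
    ("reader", "double_click"): "page_backward",
    ("stopwatch", "click"): "tap_start_stop",
}

def resolve_operation(current_app, trigger):
    """Look up the application operation for a trigger under the current app."""
    return BUTTON_MAP.get((current_app, trigger))

# Extending to a new application only requires updating the table,
# not adapting to the application's internal interfaces.
BUTTON_MAP[("recorder", "click")] = "tap_start_pause"
```

The point of the sketch is the extension step at the end: supporting a new application is a data update, not a code integration.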
  • the application operation includes a gesture operation on a virtual button
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • the performing the application operation for the current application includes performing the gesture operation on the coordinates for the current application.
  • this embodiment recognizes the button in the interface through interface recognition, without needing to be familiar with the application's internal interfaces; it can understand the application's interface content, the implementation is simple and convenient, and it is easy to extend to and stay compatible with new applications.
  • the identifying the virtual button in the current interface comprises:
  • the technical solution provided by the embodiment of the present disclosure may include the following beneficial effects: the present embodiment recognizes a button in the interface by means of interface recognition or image recognition, and then performs a gesture operation on the button more accurately.
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • in this embodiment, the physical button may correspond to different application operations in different interfaces of the same application, so that one trigger operation of the physical button can perform multiple application operations on the application, controlling it more conveniently and flexibly.
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • the application operation corresponding to the triggering operation of the physical button in the current application is determined according to the application operation with the highest frequency of use in the history of the application operation under the current application.
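A minimal sketch of this frequency-based selection, assuming the history is a simple list of past operation identifiers:

```python
from collections import Counter

def most_frequent_operation(history):
    """Return the application operation with the highest frequency of use
    in the history of application operations under the current application."""
    if not history:
        return None  # no history yet: a default configuration would apply
    return Counter(history).most_common(1)[0][0]
```

With a history such as `["btn1", "btn2", "btn1", "btn3", "btn1"]`, the trigger would be bound to the click on `btn1`, the most-used operation.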
  • the embodiment may determine an application operation corresponding to a triggering operation of a physical button by a historical behavior of the user.
  • the application operation determined by the method is more in line with the needs of the user, more accurate and more convenient for the user to operate.
  • one triggering operation of the physical button corresponds to multiple application operations
  • the triggering operation of multiple physical buttons corresponds to an application operation.
  • the technical solution provided by the embodiment of the present disclosure may include the following beneficial effects: the corresponding relationship between the triggering operation of the physical button and the application operation in the embodiment may be flexibly configured, and is applicable to multiple applications.
  • an apparatus for controlling an application including:
  • a receiving module configured to receive a trigger operation of a physical button
  • a determining module configured to determine an application operation corresponding to a triggering operation of the physical button in the current application
  • an execution module configured to execute the application operation for the current application.
  • the application operation includes a gesture operation on a virtual button
  • the determining module includes:
  • a corresponding sub-module configured to determine a virtual button and a gesture operation corresponding to the triggering operation of the physical button in the current application
  • an interface sub-module configured to identify the virtual button in a current interface, and determine coordinates of the virtual button in the current interface
  • the execution module includes:
  • the interface sub-module obtains a current interface of the current application; performs character or pattern recognition of the virtual button on the current interface, and identifies the virtual button.
  • the determining module comprises:
  • the first determining submodule is configured to determine an application operation corresponding to the triggering operation of the physical button in the current interface of the current application.
  • the determining module comprises:
  • the second determining sub-module is configured to determine an application operation corresponding to the triggering operation of the physical button in the current application according to the application operation with the highest frequency of use in the history of the application operation in the current application.
  • one triggering operation of the physical button corresponds to multiple application operations
  • the triggering operation of multiple physical buttons corresponds to an application operation.
  • an apparatus for controlling an application comprising:
  • a memory for storing processor executable instructions
  • the processor is configured to: receive a trigger operation of a physical button; determine an application operation corresponding to the triggering operation of the physical button in the current application; and perform the application operation for the current application.
  • FIG. 1 is a flow chart showing a method of controlling an application according to an exemplary embodiment.
  • FIG. 2 is a schematic diagram of an application interface according to an exemplary embodiment.
  • FIG. 3 is a schematic diagram of an application interface according to an exemplary embodiment.
  • FIG. 4 is a schematic diagram of an application interface according to an exemplary embodiment.
  • FIG. 5 is a schematic diagram of an application interface according to an exemplary embodiment.
  • FIG. 6 is a schematic diagram of an application interface according to an exemplary embodiment.
  • FIG. 7 is a schematic diagram of a configuration interface according to an exemplary embodiment.
  • FIG. 8 is a flowchart of a method of controlling an application, according to an exemplary embodiment.
  • FIG. 9 is a flowchart of a method of controlling an application, according to an exemplary embodiment.
  • FIG. 10 is a block diagram of an apparatus for controlling an application, according to an exemplary embodiment.
  • FIG. 11 is a block diagram of a determination module, according to an exemplary embodiment.
  • FIG. 12 is a block diagram of an execution module, according to an exemplary embodiment.
  • FIG. 13A is a block diagram of a determination module, according to an exemplary embodiment.
  • FIG. 13B is a block diagram of a determination module, according to an exemplary embodiment.
  • FIG. 14 is a block diagram of an apparatus, according to an exemplary embodiment.
  • a mobile terminal with a full touch screen generally has only a few physical buttons (or hardware buttons), typically a power button and a volume button.
  • buttons can give people tactile feedback. Without looking at the screen, the user can know the success or failure of the operation by pressing the touch of the physical button.
  • the physical button can be more convenient for the user to operate when it is inconvenient to look at the screen or inconvenient to operate on the screen. Therefore, it is desirable that the physical button carries more functions, not only the power on/off and the volume adjustment.
  • One possible solution is to negotiate with the operator of the application in advance, requiring the operator to open up the application's internal interfaces. Developers must then become familiar with the internal interfaces of each application and adapt them to the physical buttons. In actual operation, when the user presses the physical button, the mobile terminal invokes the interface adapted to that button, realizing control of the application by the physical button.
  • This embodiment provides a solution that does not require familiarization and invocation of the internal interface of the application.
  • the operation of the application interface is performed, thereby realizing the control of the application.
  • the touch-sensitive advantage of the physical button is applied to the control of the application under the full touch screen. It is convenient for users to know the operation result more accurately and provides a way to control the application.
  • the physical buttons in this embodiment include: a "home" button, a power button, a volume button, and an external control button.
  • FIG. 1 is a flowchart of a method for controlling an application according to an exemplary embodiment. As shown in FIG. 1 , the method may be implemented by a mobile terminal, including the following steps:
  • step 101 a triggering operation of a physical button is received.
  • step 102 an application operation corresponding to the triggering operation of the physical button under the current application is determined.
  • step 103 the application operation is performed for the current application.
  • the user opens an application and then presses the physical button under the application.
  • the mobile terminal receives the triggering operation of the physical button under the application; the triggering operation may be a click, a double click, a long press, and so on. Unlike pressing the physical button on the desktop, after receiving the trigger operation within an application, the mobile terminal controls that application according to the application operation pre-configured for the physical button's trigger operation in that application. Different controls can thus be implemented through the physical buttons in different applications, whereas if the triggering operation is received on the desktop, the mobile terminal can only control one specific application.
  • control of the application in this embodiment is implemented by performing an application operation; it requires neither the operator to open internal interfaces nor a professional technician familiar with the application's internal interfaces, so compatibility and scalability are better. To support a new application, it is sufficient to update the correspondence between the physical button's triggering operation and the application operation.
  • the application operation may take various forms, such as a gesture operation on the interface or a gesture operation on a virtual button.
  • For a gesture operation on the interface, the object of the gesture operation is the interface; for a gesture operation on a virtual button, the object of the gesture operation is the virtual button.
  • the application is a reader
  • the triggering operations of the physical button include a click and a double click. A click corresponds to a gesture operation of swiping left or clicking the left area, controlling the application to page forward; a double click corresponds to a gesture operation of swiping right or clicking the right area, controlling the application to page backward.
  • in the reader application, when the user presses the physical button once (a click), the mobile terminal receives the click trigger and determines that the click corresponds to a click on the left area in the reader, as shown in FIG. 2. It then performs a click gesture operation in the left area, which is equivalent to generating a gesture instruction indicating a click in the left area and transmitting the instruction to the reader.
  • after receiving the gesture instruction, the reader turns forward one page.
  • if the user presses the physical button twice (a double click), the mobile terminal receives the double-click trigger and determines that it corresponds to a click on the right area in the reader, as shown in FIG. 2. It then performs a click gesture operation in the right area, which is equivalent to generating a gesture instruction indicating a click in the right area and transmitting the instruction to the reader.
  • after receiving the gesture instruction, the reader turns back one page.
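The reader example can be sketched as a mapping from trigger to tap coordinates; splitting the screen into left and right halves, with taps at the centre of each half, is an assumption for illustration:

```python
# Hypothetical sketch of the reader example: a click on the physical
# button becomes a tap in the left area (page forward), a double click
# becomes a tap in the right area (page backward).
def trigger_to_tap(trigger, screen_width, screen_height):
    """Translate a physical-button trigger into tap coordinates."""
    y = screen_height // 2
    if trigger == "click":
        return (screen_width // 4, y)       # centre of the left area
    if trigger == "double_click":
        return (3 * screen_width // 4, y)   # centre of the right area
    return None  # unmapped trigger
```

The returned coordinates would then be packaged as a gesture instruction and delivered to the reader.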
  • the triggering operation of the physical button can correspond to different gesture operations, which is convenient for flexible control of the application.
  • step 102 can be implemented by step A1 and step A2.
  • step 103 can be implemented by step A3.
  • step A1 a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application are determined.
  • step A2 the virtual button is identified in the current interface, and the coordinates of the virtual button in the current interface are determined.
  • step A3 the gesture operation on the coordinates is performed for the current application.
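Steps A1-A3 can be sketched as a small pipeline; the button-locating and gesture-dispatch functions are stubbed here, since the disclosure leaves their implementation open:

```python
# Illustrative pipeline for steps A1-A3. `config` maps a trigger to a
# (virtual button, gesture) pair; `locate_button` and `perform_gesture`
# are injected stand-ins for interface recognition and gesture dispatch.
def handle_trigger(trigger, config, locate_button, perform_gesture):
    button, gesture = config[trigger]   # step A1: resolve button + gesture
    coords = locate_button(button)      # step A2: find button coordinates
    perform_gesture(gesture, coords)    # step A3: perform gesture at coords
    return button, gesture, coords
```

A caller would supply, for example, `{"click": ("Start", "tap")}` as the configuration and the platform's recognition and injection primitives as the two callables.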
  • the triggering operation of the physical button in this embodiment can correspond to different application operations in different interfaces of one application, that is, different virtual buttons can be controlled.
  • the physical buttons allow for a variety of control over the application, making it more flexible and convenient.
  • the click operation of the physical button corresponds to clicking the "Start" button.
  • the user opens the stopwatch application and presses the physical button.
  • the mobile terminal receives the triggering operation of the physical button and determines the current application and its current interface. When the current application is the stopwatch application and the current interface is its home page, it queries the correspondence between the triggering operation of the physical button and the application operation, and determines that the application operation is clicking the "Start" button.
  • the mobile terminal performs a click operation on the "Start" button.
  • the stopwatch application starts timing. In the timing page of the stopwatch application, the user presses the physical button.
  • the mobile terminal receives the triggering operation of the physical button and determines the current application and its current interface. When the current application is the stopwatch application and the current interface is the timing page, it queries the correspondence between the triggering operation of the physical button and the application operation, and determines that the application operation is clicking the "Stop" button.
  • the mobile terminal performs a click operation on the "Stop" button.
  • the stopwatch application stops timing.
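The stopwatch walkthrough amounts to a lookup keyed by (application, interface, trigger), so the same click does different things on different pages; a minimal sketch with hypothetical names:

```python
# Hypothetical per-interface mapping: the same click maps to "Start"
# on the stopwatch's home page and to "Stop" on its timing page.
INTERFACE_MAP = {
    ("stopwatch", "home", "click"): "Start",
    ("stopwatch", "timing", "click"): "Stop",
}

def operation_for(app, interface, trigger):
    """Resolve the virtual button to click for a trigger in this interface."""
    return INTERFACE_MAP.get((app, interface, trigger))
```

Keying on the interface as well as the application is what lets one physical button drive a multi-step workflow.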
  • in a recording application, the click operation of the physical button corresponds to clicking the "Start" button, so after the user presses the physical button, the recording application starts recording. In the interface being recorded, the click operation of the physical button corresponds to the application operation of pausing the recording, equivalent to clicking the "pause" button, and the double-click operation corresponds to the application operation of stopping the recording, equivalent to clicking the "stop" button.
  • As another example, take a photo application, as shown in FIG. 5.
  • the click operation of the physical button corresponds to clicking the “photograph” button.
  • the photo application then takes a photo; each click of the physical button takes a photo.
  • the long-press operation of the physical button corresponds to long-pressing the "photograph" button.
  • the photo application then takes photos continuously, achieving continuous shooting.
  • in the chat interface of the instant messaging application, as shown in FIG. 6, the long-press operation of the physical button corresponds to long-pressing the "press and hold" button. The user can speak while holding the physical button, during which the mobile terminal records. After the user releases the physical button, the mobile terminal stops recording and sends the recorded audio.
  • the user can pre-configure the application corresponding to the trigger operation of the physical button and the corresponding application operation.
  • a control key, such as a Mi key
  • in FIG. 7, an external control key is taken as an example of the physical button.
  • step A2 can be implemented by step A21 and step A22.
  • step A21 the current interface of the current application is obtained.
  • step A22 the character or pattern recognition of the virtual button is performed on the current interface, and the virtual button is recognized.
  • the characters or patterns of the virtual buttons of the respective interfaces of the respective applications are pre-stored, in particular, the characters or patterns of the virtual buttons controllable by the physical buttons.
  • after the physical button is used within an application, it is identified whether a preset virtual button exists in the application's interface.
  • the virtual button can be identified through interface recognition, such as identifying a "button" control from the interface program.
  • alternatively, the interface is treated as an image (which can be obtained through a screen capture), and image recognition is performed on it to recognize the character or pattern of the virtual button.
  • image recognition eliminates the need to be familiar with the application's program structure or to know its interface style, improving compatibility and scalability.
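A sketch of the character-recognition path, assuming a screen capture has already been run through an OCR step that yields (text, x, y) tuples; the OCR engine itself is out of scope and stubbed here, and the label set is hypothetical:

```python
# Pre-stored characters of the virtual buttons controllable by the
# physical button (hypothetical labels for illustration).
PRESET_LABELS = {"Start", "Stop", "photograph"}

def find_button(ocr_results, label):
    """Search OCR output for a pre-stored button label.

    ocr_results: iterable of (text, x, y) tuples from a screen capture.
    Returns the button's coordinates, or None if it is not a preset
    label or does not appear in the current interface.
    """
    if label not in PRESET_LABELS:
        return None
    for text, x, y in ocr_results:
        if text == label:
            return (x, y)
    return None
```

The returned coordinates feed directly into the gesture-execution step (performing the gesture at those coordinates).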
  • step 102 can be implemented by step B.
  • step B an application operation corresponding to the triggering operation of the physical button under the current interface of the current application is determined.
  • the physical buttons can correspond to different application operations for different interfaces of the same application.
  • the click operation can correspond to both the "start timing" button and the "stop timing" button.
  • the click operation can correspond to both the "Start Recording" button and the "Stop Recording" button.
  • a single trigger operation of the physical button can thus perform multiple application operations on the application, controlling it more conveniently and flexibly.
  • step 102 can be implemented by step B1.
  • step B1 the application operation corresponding to the triggering operation of the physical button in the current application is determined according to the application operation with the highest frequency of use in the history record of the application operation under the current application.
  • the application operation may be determined according to a pre-configuration, as shown in FIG. 7, which may be a system configuration or a user configuration.
  • the embodiment can also make the determination by recognizing and analyzing the user's behavior.
  • an application operation of the user under the current application is recorded in advance as a history of the application operation.
  • the user may have multiple application operations under the current application, such as a click operation on buttons 1-3 in the current application.
  • which application operation the triggering operation of the physical button corresponds to can be determined in multiple ways.
  • when the triggering operation of the physical button corresponds to the most frequently used application operation, user behavior is analyzed intelligently, making the physical button more convenient to use and better matched to the user's own habits.
  • Mode C1 One trigger operation of the physical button corresponds to multiple application operations.
  • the physical button is configured in advance to correspond to an application operation of a 10-second countdown.
  • the stopwatch application then performs a 10-second countdown, which is equivalent to performing two application operations: setting 10 seconds and clicking the countdown button.
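Mode C1 can be sketched as a macro table in which one trigger expands to an ordered list of application operations; all names are hypothetical:

```python
# Mode C1 sketch: one trigger operation of the physical button fans
# out to a sequence of application operations, e.g. "set 10 seconds"
# followed by "start countdown" in the stopwatch application.
MACROS = {
    ("stopwatch", "click"): ["set_10_seconds", "start_countdown"],
}

def expand_trigger(app, trigger):
    """Return the ordered application operations for this trigger."""
    return MACROS.get((app, trigger), [])
```

Each operation in the returned list would then be executed in order against the current application.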
  • Mode C2 The triggering operation of multiple physical buttons corresponds to an application operation.
  • a trigger operation of simultaneously clicking the external control key and the "home" key corresponds to one application operation, such as clicking the "record" button in the photo application.
  • control of the application operation is realized by a combination of trigger operations of a plurality of physical buttons. Controls for more application operations can be achieved, making control more flexible and convenient.
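Mode C2 is the converse: a set of simultaneously pressed buttons maps to one application operation. Using a frozenset key makes the combination order-independent; the names are hypothetical:

```python
# Mode C2 sketch: a combination of simultaneously pressed physical
# buttons corresponds to a single application operation.
COMBOS = {
    frozenset({"external_key", "home"}): "tap_record",
}

def combo_operation(pressed_buttons):
    """Resolve a set of simultaneously pressed buttons to one operation."""
    return COMBOS.get(frozenset(pressed_buttons))
```

Combinations multiply the number of distinct operations a handful of physical buttons can trigger.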
  • FIG. 8 is a flowchart of a method for controlling an application according to an exemplary embodiment. As shown in FIG. 8, the method may be implemented by a mobile terminal, including the following steps:
  • step 801 a triggering operation of the physical button is received.
  • step 802 an application operation corresponding to the triggering operation of the physical button under the current interface of the current application is determined.
  • step 803 the virtual button is identified in the current interface and the coordinates of the virtual button in the current interface are determined.
  • step 804 the gesture operation on the coordinates is performed for the current application.
  • FIG. 9 is a flowchart of a method for controlling an application according to an exemplary embodiment. As shown in FIG. 9, the method may be implemented by a mobile terminal, including the following steps:
  • step 901 a triggering operation of the physical button is received.
  • step 902 a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application are determined.
  • step 903 a current interface of the current application is obtained.
  • step 904 the character or pattern recognition of the virtual button is performed on the current interface, and the virtual button is recognized.
  • step 905 coordinates of the virtual button in the current interface are determined.
  • step 906 the gesture operation on the coordinates is performed for the current application.
  • FIG. 10 is a block diagram of an apparatus for controlling an application according to an exemplary embodiment.
  • the apparatus includes: a receiving module 1001, a determining module 1002, and an executing module 1003.
  • the receiving module 1001 is configured to receive a triggering operation of the physical button.
  • the determining module 1002 is configured to determine an application operation corresponding to the triggering operation of the physical button under the current application.
  • the executing module 1003 is configured to perform the application operation for the current application.
  • the application operation includes a gesture operation on a virtual button
  • the determining module 1002 includes: a corresponding submodule 10021 and an interface submodule 10022.
  • the corresponding sub-module 10021 is configured to determine a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application.
  • the interface sub-module 10022 is configured to identify the virtual button in the current interface, and determine coordinates of the virtual button in the current interface.
  • the execution module 1003 includes an execution submodule 10031.
  • the execution sub-module 10031 is configured to perform the gesture operation on the coordinates for the current application.
  • the interface sub-module 10022 obtains a current interface of the current application; performs character or pattern recognition of the virtual button on the current interface, and identifies the virtual button.
  • the determining module 1002 includes: a first determining submodule 10023.
  • the first determining sub-module 10023 is configured to determine an application operation corresponding to the triggering operation of the physical button in the current interface of the current application.
  • the determining module 1002 includes a second determining sub-module 10024.
  • the second determining sub-module 10024 is configured to determine an application operation corresponding to the triggering operation of the physical button in the current application according to the application operation with the highest frequency of use in the historical record of the application operation in the current application.
  • one triggering operation of the physical button corresponds to multiple application operations
  • the triggering operation of multiple physical buttons corresponds to an application operation.
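The receiving/determining/executing module structure of FIG. 10 can be sketched as follows, with the determining and executing behaviour injected as callables; this is an illustration of the structure, not the actual implementation:

```python
# Illustrative sketch of the FIG. 10 module structure: a receiving
# entry point delegates to a determining module and an execution module.
class ControlApparatus:
    def __init__(self, determine, execute):
        self._determine = determine   # determining module 1002
        self._execute = execute       # execution module 1003

    def on_trigger(self, trigger):    # receiving module 1001
        """Receive a trigger, determine its operation, and execute it."""
        operation = self._determine(trigger)
        if operation is not None:
            self._execute(operation)
        return operation
```

Splitting determination from execution is what lets the sub-modules (per-interface lookup, history-based selection, coordinate resolution) be swapped without touching the rest of the apparatus.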
  • FIG. 14 is a block diagram of an apparatus 1400 for controlling an application, according to an exemplary embodiment.
  • device 1400 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a gaming console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
  • apparatus 1400 can include one or more of the following components: processing component 1402, memory 1404, power component 1406, multimedia component 1408, audio component 1410, input/output (I/O) interface 1412, sensor component 1414, And a communication component 1416.
  • Processing component 1402 typically controls the overall operation of device 1400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • Processing component 1402 can include one or more processors 1420 to execute instructions to perform all or part of the steps of the above described methods.
  • processing component 1402 can include one or more modules to facilitate interaction between processing component 1402 and other components.
  • processing component 1402 can include a multimedia module to facilitate interaction between multimedia component 1408 and processing component 1402.
  • Memory 1404 is configured to store various types of data to support operation at device 1400. Examples of such data include instructions for any application or method operating on device 1400, contact data, phone book data, messages, pictures, videos, and the like.
  • the memory 1404 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read only memory (EEPROM), erasable programmable read only memory (EPROM), programmable read only memory (PROM), read only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • Power component 1406 provides power to various components of device 1400.
  • Power component 1406 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 1400.
  • the multimedia component 1408 includes a screen between the device 1400 and the user that provides an output interface.
  • the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the multimedia component 1408 includes a front camera and/or a rear camera. When the device 1400 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
  • the audio component 1410 is configured to output and/or input an audio signal.
  • the audio component 1410 includes a microphone (MIC) that is configured to receive an external audio signal when the device 1400 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 1404 or transmitted via communication component 1416.
  • the audio component 1410 also includes a speaker for outputting an audio signal.
  • the I/O interface 1412 provides an interface between the processing component 1402 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • Sensor assembly 1414 includes one or more sensors for providing a status assessment of various aspects to device 1400.
  • sensor component 1414 can detect the open/closed state of device 1400 and the relative positioning of components (such as the display and keypad of device 1400), and can also detect a change in position of device 1400 or a component of device 1400, the presence or absence of user contact with device 1400, the orientation or acceleration/deceleration of device 1400, and temperature changes of device 1400.
  • Sensor assembly 1414 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 1414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor assembly 1414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 1416 is configured to facilitate wired or wireless communication between device 1400 and other devices.
  • the device 1400 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
  • communication component 1416 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
  • the communication component 1416 also includes a near field communication (NFC) module to facilitate short-range communication.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • device 1400 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
  • non-transitory computer readable storage medium comprising instructions, such as a memory 1404 comprising instructions executable by processor 1420 of apparatus 1400 to perform the above method.
  • the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • a device for controlling an application comprising:
  • a memory for storing processor executable instructions
  • processor is configured to:
  • the application operation is performed for the current application.
  • the processor can also be configured to:
  • the application operation includes a gesture operation on a virtual button
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • the performing the application operation for the current application includes performing the gesture operation on the coordinates for the current application.
  • the processor can also be configured to:
  • the identifying the virtual button in the current interface includes:
  • the processor can also be configured to:
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • the processor can also be configured to:
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • the application operation corresponding to the triggering operation of the physical button in the current application is determined according to the application operation with the highest frequency of use in the history of the application operation under the current application.
  • the processor can also be configured to:
  • One trigger operation of the physical button corresponds to multiple application operations
  • the triggering operation of multiple physical buttons corresponds to an application operation.
  • a non-transitory computer readable storage medium when instructions in the storage medium are executed by a processor of a mobile terminal, to enable the mobile terminal to perform a method of controlling an application, the method comprising:
  • the application operation is performed for the current application.
  • the instructions in the storage medium may further include:
  • the application operation includes a gesture operation on a virtual button
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • the performing the application operation for the current application includes performing the gesture operation on the coordinates for the current application.
  • the instructions in the storage medium may further include:
  • the identifying the virtual button in the current interface includes:
  • the instructions in the storage medium may further include:
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • the instructions in the storage medium may further include:
  • the determining an application operation corresponding to the triggering operation of the physical button in the current application includes:
  • the application operation corresponding to the triggering operation of the physical button in the current application is determined according to the application operation with the highest frequency of use in the history of the application operation under the current application.
  • the instructions in the storage medium may further include:
  • One trigger operation of the physical button corresponds to multiple application operations
  • the triggering operation of multiple physical buttons corresponds to an application operation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Telephone Function (AREA)
  • Stored Programmes (AREA)
  • Containers And Packaging Bodies Having A Special Means To Remove Contents (AREA)

Abstract

The present disclosure relates to a method and device for controlling an application, for controlling an application more conveniently and accurately. The method includes: receiving a triggering operation of a physical button; determining an application operation corresponding to the triggering operation of the physical button under the current application; and performing the application operation for the current application.

Description

Method and device for controlling an application
This application is based on and claims priority to Chinese patent application No. 201410856869.6, filed on December 31, 2014, the entire contents of which are incorporated herein by reference.
Technical field
The present disclosure relates to the field of communications and computer processing, and in particular to a method and device for controlling an application.
Background
With the development of electronic technology, mobile terminals have come into widespread use and are updated very quickly. Mobile terminals have moved from the keyboard input of early devices to touch-screen input, and full-touch-screen mobile terminals have become the mainstream trend.
Summary
The present disclosure provides a method and device for controlling an application.
According to a first aspect of embodiments of the present disclosure, a method for controlling an application is provided, including:
receiving a triggering operation of a physical button;
determining an application operation corresponding to the triggering operation of the physical button under the current application;
performing the application operation for the current application.
The technical solution provided by embodiments of the present disclosure may have the following beneficial effects: this embodiment pre-configures the correspondence between physical buttons and application operations and then controls the application by performing the application operation, without needing to be familiar with or invoke the application's interfaces; new applications can be supported and kept compatible simply by updating the correspondence. Controlling an application through a physical button gives the user tactile feedback, from which the user can clearly tell whether the operation succeeded. This embodiment works even better when it is inconvenient for the user to look at the screen or operate on the touch screen.
In one embodiment, the application operation includes a gesture operation on a virtual button;
the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application;
identifying the virtual button in the current interface, and determining the coordinates of the virtual button in the current interface;
the performing the application operation for the current application includes: performing the gesture operation on the coordinates for the current application.
The technical solution provided by embodiments of the present disclosure may have the following beneficial effects: this embodiment identifies the buttons in the interface from the interface itself, which requires only knowledge of the application's interface content rather than familiarity with its internal interfaces; the implementation is simple and convenient and makes it easy to extend to and stay compatible with new applications.
In one embodiment, the identifying the virtual button in the current interface includes:
obtaining the current interface of the current application;
performing character or pattern recognition of the virtual button on the current interface to identify the virtual button.
The technical solution provided by embodiments of the present disclosure may have the following beneficial effects: this embodiment identifies the buttons in the interface through interface recognition, image recognition, or similar means, and can then perform the gesture operation on the buttons fairly accurately.
In one embodiment, the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining an application operation corresponding to the triggering operation of the physical button under the current interface of the current application.
The technical solution provided by embodiments of the present disclosure may have the following beneficial effects: in this embodiment, a physical button can correspond to different application operations for different interfaces of the same application, so one triggering operation of a physical button can perform multiple application operations on the application, allowing more convenient and flexible control.
In one embodiment, the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining the application operation corresponding to the triggering operation of the physical button under the current application according to the most frequently used application operation in the history of application operations under the current application.
The technical solution provided by embodiments of the present disclosure may have the following beneficial effects: this embodiment can determine the application operation corresponding to the triggering operation of the physical button from the user's historical behavior. The application operation determined this way better matches the user's needs, is more accurate, and is more convenient for the user.
In one embodiment, one triggering operation of the physical button corresponds to multiple application operations; or
triggering operations of multiple physical buttons correspond to one application operation.
The technical solution provided by embodiments of the present disclosure may have the following beneficial effects: in this embodiment, the correspondence between triggering operations of physical buttons and application operations can be configured flexibly and is applicable to many kinds of applications.
According to a second aspect of embodiments of the present disclosure, a device for controlling an application is provided, including:
a receiving module configured to receive a triggering operation of a physical button;
a determining module configured to determine an application operation corresponding to the triggering operation of the physical button under the current application;
an executing module configured to perform the application operation for the current application.
In one embodiment, the application operation includes a gesture operation on a virtual button;
the determining module includes:
a correspondence submodule configured to determine a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application;
an interface submodule configured to identify the virtual button in the current interface and determine the coordinates of the virtual button in the current interface;
the executing module includes:
an executing submodule configured to perform the gesture operation on the coordinates for the current application.
In one embodiment, the interface submodule obtains the current interface of the current application, and performs character or pattern recognition of the virtual button on the current interface to identify the virtual button.
In one embodiment, the determining module includes:
a first determining submodule configured to determine an application operation corresponding to the triggering operation of the physical button under the current interface of the current application.
In one embodiment, the determining module includes:
a second determining submodule configured to determine the application operation corresponding to the triggering operation of the physical button under the current application according to the most frequently used application operation in the history of application operations under the current application.
In one embodiment, one triggering operation of the physical button corresponds to multiple application operations; or
triggering operations of multiple physical buttons correspond to one application operation.
According to a third aspect of embodiments of the present disclosure, a device for controlling an application is provided, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receive a triggering operation of a physical button;
determine an application operation corresponding to the triggering operation of the physical button under the current application;
perform the application operation for the current application.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
FIG. 1 is a flowchart of a method for controlling an application according to an exemplary embodiment.
FIG. 2 is a schematic diagram of an application interface according to an exemplary embodiment.
FIG. 3 is a schematic diagram of an application interface according to an exemplary embodiment.
FIG. 4 is a schematic diagram of an application interface according to an exemplary embodiment.
FIG. 5 is a schematic diagram of an application interface according to an exemplary embodiment.
FIG. 6 is a schematic diagram of an application interface according to an exemplary embodiment.
FIG. 7 is a schematic diagram of a configuration interface according to an exemplary embodiment.
FIG. 8 is a flowchart of a method for controlling an application according to an exemplary embodiment.
FIG. 9 is a flowchart of a method for controlling an application according to an exemplary embodiment.
FIG. 10 is a block diagram of a device for controlling an application according to an exemplary embodiment.
FIG. 11 is a block diagram of a determining module according to an exemplary embodiment.
FIG. 12 is a block diagram of an executing module according to an exemplary embodiment.
FIG. 13A is a block diagram of a determining module according to an exemplary embodiment.
FIG. 13B is a block diagram of a determining module according to an exemplary embodiment.
FIG. 14 is a block diagram of a device according to an exemplary embodiment.
Detailed description
Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. When the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of devices and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
In the related art, most mobile terminals no longer have a hardware keyboard and instead use full-touch-screen input. A full-touch-screen mobile terminal generally has only a few physical buttons (also called hardware buttons), such as the power button and the volume buttons.
The inventors of the present disclosure found that physical buttons give tactile feedback: without looking at the screen, the user can tell from the feel of pressing a physical button whether an operation succeeded or failed. When it is inconvenient to look at the screen or operate on it, physical buttons make operation easier for the user. It is therefore desirable for physical buttons to carry more functions, not limited to powering on/off and adjusting the volume.
One possible solution is to negotiate with the application's operator in advance, requiring the operator to open up the application's internal interfaces; one must then become familiar with each application's internal interfaces and adapt those interfaces to the physical buttons. In actual operation, when the user presses a physical button, the mobile terminal invokes the interface adapted to that button, so that the physical button controls the application.
This embodiment provides a solution that does not require familiarity with or invocation of an application's internal interfaces. Triggered by a physical button, an operation on the application's interface is performed, thereby controlling the application. The tactile advantage of physical buttons is brought to controlling applications on a full touch screen, making it easy for the user to know the result of an operation accurately and providing a way to control applications.
The physical buttons in this embodiment include the "home" button, the power button, the volume buttons, an external control button, and the like.
FIG. 1 is a flowchart of a method for controlling an application according to an exemplary embodiment. As shown in FIG. 1, the method may be implemented by a mobile terminal and includes the following steps.
In step 101, a triggering operation of a physical button is received.
In step 102, an application operation corresponding to the triggering operation of the physical button under the current application is determined.
In step 103, the application operation is performed for the current application.
In this embodiment, the user opens an application and then presses a physical button within that application. The mobile terminal receives the triggering operation of the physical button under the application; triggering operations include single click, double click, long press, and so on. Unlike pressing a physical button on the home screen, when the triggering operation is received after entering an application, the mobile terminal can control that application according to the pre-configured application operation corresponding to the triggering operation of the physical button under that application. Under different applications, the physical button can thus effect different controls, whereas if the triggering operation were received on the home screen, the mobile terminal could only control a single specific application. Moreover, in this embodiment the application is controlled by performing an application operation: there is no need for the operator to open internal interfaces, nor for specialist technical staff to be familiar with the application's internal interfaces, so compatibility and extensibility are better; it suffices to update the correspondence between triggering operations of physical buttons and application operations.
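As an illustrative sketch (not part of the original disclosure), the pre-configured correspondence described above can be expressed as a small lookup table keyed by the current application and the triggering operation; the application names, trigger names, and operation names below are assumptions chosen for illustration only:

```python
# Hypothetical correspondence table: (current application, trigger) -> application operation.
# All names here are illustrative, not taken from the disclosure.
BUTTON_MAP = {
    ("reader", "single_click"): "tap_left_region",   # turn one page forward
    ("reader", "double_click"): "tap_right_region",  # turn one page backward
    ("camera", "single_click"): "tap_shutter",
}

def resolve_operation(current_app, trigger):
    """Return the application operation configured for this trigger
    under the current application, or None if none is configured."""
    return BUTTON_MAP.get((current_app, trigger))
```

Because the table is keyed by the current application, the same physical trigger can effect different controls under different applications, and supporting a new application only requires adding entries to the table.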
In one embodiment, an application operation includes a gesture operation and the object of the gesture operation.
The application operation can take many forms, such as a gesture operation on an interface, or a gesture operation on a virtual button. For a gesture operation on an interface, the object of the gesture operation is the interface; for a gesture operation on a virtual button, the object of the gesture operation is the virtual button.
For example, the application is a reader, and the triggering operations of the physical button include single click and double click. A single click corresponds to the gesture of swiping left or tapping the left side, which controls the application to turn one page forward; a double click corresponds to the gesture of swiping right or tapping the right side, which controls the application to turn one page backward. In the reader application, when the user presses the physical button once (single click), the mobile terminal is triggered by the single click and determines that in the reader the single-click trigger corresponds to a tap on the left region, as shown in FIG. 2; it then performs the tap gesture on the left region, which amounts to generating a gesture instruction representing a tap on the left region and passing that instruction to the reader. On receiving the instruction, the reader turns one page forward. Alternatively, when the user presses the physical button twice in succession (double click), the mobile terminal is triggered by the double click and determines that in the reader the double-click trigger corresponds to a tap on the right region, as shown in FIG. 2; it then performs the tap gesture on the right region, which amounts to generating a gesture instruction representing a tap on the right region and passing it to the reader. On receiving the instruction, the reader turns one page backward.
In different interfaces of an application, the triggering operation of the physical button can correspond to different gesture operations, which makes flexible control of the application convenient.
When the application operation includes a gesture operation on a virtual button, step 102 can be implemented by steps A1 and A2, and step 103 can be implemented by step A3.
In step A1, a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application are determined.
In step A2, the virtual button is identified in the current interface, and the coordinates of the virtual button in the current interface are determined.
In step A3, the gesture operation is performed on the coordinates for the current application.
In this embodiment, the triggering operation of the physical button can correspond to different application operations in different interfaces of one application, that is, different virtual buttons can be controlled. The physical button can thus exert multiple kinds of control over the application, which is more flexible and convenient.
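A minimal sketch of steps A1-A3 (not part of the original disclosure): map the trigger to a (virtual button, gesture) pair, locate the button's coordinates in the current interface, then perform the gesture at those coordinates. The locating and gesture-performing functions are passed in as stand-ins for the device's actual mechanisms, and every name here is an illustrative assumption:

```python
# Hypothetical (app, screen, trigger) -> (virtual button, gesture) table.
TRIGGER_TO_BUTTON = {
    ("stopwatch", "home", "single_click"): ("Start", "tap"),
    ("stopwatch", "timing", "single_click"): ("Stop", "tap"),
}

def handle_trigger(app, screen, trigger, locate_button, perform_gesture):
    """Carry out steps A1-A3 for one physical-button trigger.

    locate_button(label) -> (x, y) coordinates or None  (stands in for step A2)
    perform_gesture(gesture, coords)                    (stands in for step A3)
    Returns the (gesture, coords) performed, or None if nothing applied.
    """
    entry = TRIGGER_TO_BUTTON.get((app, screen, trigger))
    if entry is None:
        return None
    button, gesture = entry            # step A1: button and gesture for this trigger
    coords = locate_button(button)     # step A2: coordinates in the current interface
    if coords is None:
        return None
    perform_gesture(gesture, coords)   # step A3: perform the gesture at the coordinates
    return (gesture, coords)
```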
For example, on the home page of a stopwatch application, as shown in FIG. 3, a single click of the physical button corresponds to tapping the "Start" button. The user opens the stopwatch application and presses the physical button. The mobile terminal receives the triggering operation and determines the current application and the current interface of that application; having determined that the current application is the stopwatch and the current interface is the home page, it looks up the correspondence between triggering operations of the physical button and application operations, and determines that the application operation is a tap on the "Start" button. The mobile terminal performs the tap on the "Start" button, and the stopwatch starts timing. On the timing page of the stopwatch, the user presses the physical button. The mobile terminal receives the triggering operation and determines the current application and interface; having determined that the current application is the stopwatch and the current interface is the timing page, it looks up the correspondence and determines that the application operation is a tap on the "Stop" button. The mobile terminal performs the tap on the "Stop" button, and the stopwatch stops timing.
As another example, take a recording application. On its home page, as shown in FIG. 4, a single click of the physical button corresponds to tapping the "Start" button; after the user presses the physical button, the recording application starts recording. On the recording-in-progress interface, a single click of the physical button corresponds to the application operation of pausing the recording, equivalent to tapping the "Pause" button, and a double click corresponds to stopping the recording, equivalent to tapping the "Stop" button.
As another example, take a camera application. On its home page, as shown in FIG. 5, a single click of the physical button corresponds to tapping the "Shutter" button; after the user presses the physical button, the camera application takes a photo, one photo per click. A long press of the physical button corresponds to long-pressing the "Shutter" button; after the user long-presses the physical button, the camera application shoots continuously, achieving burst mode.
As another example, take an instant-messaging application. In its chat interface, as shown in FIG. 6, a long press of the physical button corresponds to long-pressing the "Hold to talk" button. After pressing the physical button, the user can speak while the mobile terminal records; after the user releases the physical button, the mobile terminal stops recording and sends the recorded audio.
The user can configure in advance the application and the application operation corresponding to a triggering operation of a physical button. As shown in FIG. 7, take an external control button (such as the Mi Key) as an example of a physical button.
In the "Mi Key" configuration interface, select "Assistant" and then "Mi Key in apps". In the "Mi Key in apps" configuration interface, the user can choose whether the physical button is applied to the solution of this embodiment, and can also choose which applications should adopt the solution of this embodiment.
In one embodiment, step A2 can be implemented by steps A21 and A22.
In step A21, the current interface of the current application is obtained.
In step A22, character or pattern recognition of the virtual button is performed on the current interface to identify the virtual button.
In this embodiment, the characters or patterns of the virtual buttons of each interface of each application are stored in advance, in particular those of the virtual buttons that the physical button can control. After entering an application that uses the physical button, it is identified whether the application's interface contains a preset virtual button. The virtual button can be identified through a recognition plug-in, for example by finding "button" in the interface program; or through image recognition, treating the interface as an image (which can be obtained by taking a screenshot) and then recognizing the characters or pattern of the virtual button in that image. Image recognition does not require familiarity with the application's program structure; knowing the appearance of the interface is enough, so compatibility and extensibility are good.
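As a toy illustration (not part of the original disclosure) of the character-recognition idea in step A22: treat the current interface as lines of text, such as an OCR pass might produce from a screenshot, and locate a stored button label, returning approximate center coordinates. A real implementation would use OCR or image template matching; this merely stands in for that idea, and the screen contents are assumptions:

```python
def find_button(screen_lines, label):
    """Locate `label` in a text rendering of the current interface.

    screen_lines: list of strings, one per screen row (stand-in for OCR output).
    Returns (row, col) of the label's approximate center, or None if absent.
    """
    for row, line in enumerate(screen_lines):
        col = line.find(label)
        if col != -1:
            # Center the coordinate horizontally on the matched label.
            return (row, col + len(label) // 2)
    return None
```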
In one embodiment, step 102 can be implemented by step B.
In step B, an application operation corresponding to the triggering operation of the physical button under the current interface of the current application is determined.
In this embodiment, the physical button can correspond to different application operations for different interfaces of the same application. As shown in FIGS. 3-4, in the stopwatch application the single-click application operation can correspond to the "Start" button and the "Stop" button, and in the recording application it can correspond to the "Start recording" button and the "Stop recording" button. This embodiment enables one triggering operation of a physical button to perform multiple application operations on an application, allowing more convenient and flexible control.
In one embodiment, step 102 can be implemented by step B1.
In step B1, the application operation corresponding to the triggering operation of the physical button under the current application is determined according to the most frequently used application operation in the history of application operations under the current application.
In this embodiment, the application operation corresponding to the triggering operation of the physical button can be determined, on the one hand, from a pre-existing configuration, as shown in FIG. 7, which may be a system configuration or a user configuration; on the other hand, it can be determined by recognizing and analyzing the user's behavior. For example, the user's application operations under the current application are recorded in advance as a history of application operations. The user may perform several kinds of application operations under the current application, such as taps on buttons 1-3, and which of them the triggering operation corresponds to can be decided in several ways. This embodiment maps the triggering operation of the physical button to the most frequently used application operation, intelligently analyzing user behavior so that using the physical button is more convenient and better matches the user's own habits.
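The history-based variant in step B1 reduces to picking the most frequent entry in the recorded operation history; a minimal sketch (not part of the original disclosure, operation names assumed for illustration):

```python
from collections import Counter

def most_frequent_operation(history):
    """Return the most frequently used application operation in `history`
    (a list of operation names recorded under the current application),
    or None if no history has been recorded yet."""
    if not history:
        return None
    # most_common(1) yields [(operation, count)] for the top entry.
    return Counter(history).most_common(1)[0][0]
```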
In one embodiment, the correspondence between triggering operations of physical buttons and application operations can take several forms, such as modes C1 and C2.
Mode C1: one triggering operation of the physical button corresponds to multiple application operations.
For example, taking the stopwatch application, the physical button is pre-configured to correspond to the application operation of a 10-second countdown. On the home page of the stopwatch application, when the user presses the physical button, the stopwatch performs a 10-second countdown, which amounts to two application operations: setting 10 seconds and tapping countdown.
This embodiment can realize multiple application operations through one physical button, making operation more convenient and flexible.
Mode C2: triggering operations of multiple physical buttons correspond to one application operation.
For example, the triggering operation of clicking an external control button and the "home" button at the same time corresponds to one application operation, such as the "Record video" button in the camera application.
This embodiment controls an application operation through a combination of triggering operations of multiple physical buttons. It enables control over more application operations, making control more flexible and convenient.
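The two mapping modes above can be sketched together (not part of the original disclosure): mode C1 fans one trigger out to a list of operations, and mode C2 maps a chord of several physical buttons to one operation. The keys and operation names are illustrative assumptions:

```python
# Mode C1: one trigger -> several application operations (hypothetical entries).
FAN_OUT = {
    ("stopwatch", "single_click"): ["set_10_seconds", "start_countdown"],
}
# Mode C2: a chord of physical buttons -> one application operation.
CHORDS = {
    ("camera", frozenset({"mi_key", "home"})): "tap_record_video",
}

def operations_for(app, pressed):
    """Return the list of application operations for the pressed button(s).

    pressed: a set with one trigger name (mode C1) or several key names (mode C2).
    """
    if len(pressed) == 1:
        return FAN_OUT.get((app, next(iter(pressed))), [])
    op = CHORDS.get((app, frozenset(pressed)))
    return [op] if op else []
```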
The implementation of controlling an application is described in detail below through several embodiments.
FIG. 8 is a flowchart of a method for controlling an application according to an exemplary embodiment. As shown in FIG. 8, the method may be implemented by a mobile terminal and includes the following steps.
In step 801, a triggering operation of a physical button is received.
In step 802, an application operation corresponding to the triggering operation of the physical button under the current interface of the current application is determined.
In step 803, the virtual button is identified in the current interface, and the coordinates of the virtual button in the current interface are determined.
In step 804, the gesture operation is performed on the coordinates for the current application.
FIG. 9 is a flowchart of a method for controlling an application according to an exemplary embodiment. As shown in FIG. 9, the method may be implemented by a mobile terminal and includes the following steps.
In step 901, a triggering operation of a physical button is received.
In step 902, a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application are determined.
In step 903, the current interface of the current application is obtained.
In step 904, character or pattern recognition of the virtual button is performed on the current interface to identify the virtual button.
In step 905, the coordinates of the virtual button in the current interface are determined.
In step 906, the gesture operation is performed on the coordinates for the current application.
The above describes the implementation of controlling an application, a process implemented by a mobile terminal and a computer; the internal structure and functions of the two devices are introduced below.
FIG. 10 is a schematic diagram of a device for controlling an application according to an exemplary embodiment. Referring to FIG. 10, the device includes a receiving module 1001, a determining module 1002, and an executing module 1003.
The receiving module 1001 is configured to receive a triggering operation of a physical button.
The determining module 1002 is configured to determine an application operation corresponding to the triggering operation of the physical button under the current application.
The executing module 1003 is configured to perform the application operation for the current application.
In one embodiment, the application operation includes a gesture operation on a virtual button;
as shown in FIG. 11, the determining module 1002 includes a correspondence submodule 10021 and an interface submodule 10022.
The correspondence submodule 10021 is configured to determine a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application.
The interface submodule 10022 is configured to identify the virtual button in the current interface and determine the coordinates of the virtual button in the current interface.
As shown in FIG. 12, the executing module 1003 includes an executing submodule 10031.
The executing submodule 10031 is configured to perform the gesture operation on the coordinates for the current application.
In one embodiment, the interface submodule 10022 obtains the current interface of the current application, and performs character or pattern recognition of the virtual button on the current interface to identify the virtual button.
In one embodiment, as shown in FIG. 13A, the determining module 1002 includes a first determining submodule 10023.
The first determining submodule 10023 is configured to determine an application operation corresponding to the triggering operation of the physical button under the current interface of the current application.
In one embodiment, as shown in FIG. 13B, the determining module 1002 includes a second determining submodule 10024.
The second determining submodule 10024 is configured to determine the application operation corresponding to the triggering operation of the physical button under the current application according to the most frequently used application operation in the history of application operations under the current application.
In one embodiment, one triggering operation of the physical button corresponds to multiple application operations; or
triggering operations of multiple physical buttons correspond to one application operation.
With regard to the devices in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and will not be elaborated here.
FIG. 14 is a block diagram of a device 1400 for controlling an application according to an exemplary embodiment. For example, the device 1400 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to FIG. 14, the device 1400 may include one or more of the following components: a processing component 1402, a memory 1404, a power component 1406, a multimedia component 1408, an audio component 1410, an input/output (I/O) interface 1412, a sensor component 1414, and a communication component 1416.
The processing component 1402 generally controls the overall operation of the device 1400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1402 may include one or more processors 1420 to execute instructions so as to complete all or part of the steps of the above methods. In addition, the processing component 1402 may include one or more modules to facilitate interaction between the processing component 1402 and other components. For example, the processing component 1402 may include a multimedia module to facilitate interaction between the multimedia component 1408 and the processing component 1402.
The memory 1404 is configured to store various types of data to support operation at the device 1400. Examples of such data include instructions for any application or method operated on the device 1400, contact data, phone-book data, messages, pictures, videos, and so on. The memory 1404 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power component 1406 provides power to the various components of the device 1400. The power component 1406 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1400.
The multimedia component 1408 includes a screen that provides an output interface between the device 1400 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 1408 includes a front camera and/or a rear camera. When the device 1400 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 1410 is configured to output and/or input audio signals. For example, the audio component 1410 includes a microphone (MIC), which is configured to receive external audio signals when the device 1400 is in an operation mode such as a call mode, a recording mode, or a speech recognition mode. The received audio signals may be further stored in the memory 1404 or sent via the communication component 1416. In some embodiments, the audio component 1410 also includes a speaker for outputting audio signals.
The I/O interface 1412 provides an interface between the processing component 1402 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 1414 includes one or more sensors for providing status assessments of various aspects of the device 1400. For example, the sensor component 1414 can detect the open/closed state of the device 1400 and the relative positioning of components (for example, the display and keypad of the device 1400), and the sensor component 1414 can also detect a change in position of the device 1400 or a component of the device 1400, the presence or absence of user contact with the device 1400, the orientation or acceleration/deceleration of the device 1400, and temperature changes of the device 1400. The sensor component 1414 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1416 is configured to facilitate wired or wireless communication between the device 1400 and other devices. The device 1400 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1416 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1416 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 1400 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as the memory 1404 including instructions, which are executable by the processor 1420 of the device 1400 to complete the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A device for controlling an application, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receive a triggering operation of a physical button;
determine an application operation corresponding to the triggering operation of the physical button under the current application;
perform the application operation for the current application.
The processor may also be configured so that:
the application operation includes a gesture operation on a virtual button;
the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application;
identifying the virtual button in the current interface, and determining the coordinates of the virtual button in the current interface;
the performing the application operation for the current application includes: performing the gesture operation on the coordinates for the current application.
The processor may also be configured so that:
the identifying the virtual button in the current interface includes:
obtaining the current interface of the current application;
performing character or pattern recognition of the virtual button on the current interface to identify the virtual button.
The processor may also be configured so that:
the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining an application operation corresponding to the triggering operation of the physical button under the current interface of the current application.
The processor may also be configured so that:
the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining the application operation corresponding to the triggering operation of the physical button under the current application according to the most frequently used application operation in the history of application operations under the current application.
The processor may also be configured so that:
one triggering operation of the physical button corresponds to multiple application operations; or
triggering operations of multiple physical buttons correspond to one application operation.
A non-transitory computer-readable storage medium, wherein when the instructions in the storage medium are executed by a processor of a mobile terminal, the mobile terminal is enabled to perform a method for controlling an application, the method including:
receiving a triggering operation of a physical button;
determining an application operation corresponding to the triggering operation of the physical button under the current application;
performing the application operation for the current application.
The instructions in the storage medium may further include:
the application operation includes a gesture operation on a virtual button;
the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application;
identifying the virtual button in the current interface, and determining the coordinates of the virtual button in the current interface;
the performing the application operation for the current application includes: performing the gesture operation on the coordinates for the current application.
The instructions in the storage medium may further include:
the identifying the virtual button in the current interface includes:
obtaining the current interface of the current application;
performing character or pattern recognition of the virtual button on the current interface to identify the virtual button.
The instructions in the storage medium may further include:
the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining an application operation corresponding to the triggering operation of the physical button under the current interface of the current application.
The instructions in the storage medium may further include:
the determining an application operation corresponding to the triggering operation of the physical button under the current application includes:
determining the application operation corresponding to the triggering operation of the physical button under the current application according to the most frequently used application operation in the history of application operations under the current application.
The instructions in the storage medium may further include:
one triggering operation of the physical button corresponds to multiple application operations; or
triggering operations of multiple physical buttons correspond to one application operation.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include such common knowledge or customary technical means in the art as are not disclosed in the present disclosure. The specification and embodiments are to be considered exemplary only, with the true scope and spirit of the present disclosure indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes can be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

  1. A method for controlling an application, characterized by comprising:
    receiving a triggering operation of a physical button;
    determining an application operation corresponding to the triggering operation of the physical button under the current application;
    performing the application operation for the current application.
  2. The method for controlling an application according to claim 1, characterized in that, when the application operation includes a gesture operation on a virtual button, the determining an application operation corresponding to the triggering operation of the physical button under the current application comprises:
    determining a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application;
    identifying the virtual button in the current application interface, and determining the coordinates of the virtual button in the current interface;
    the performing the application operation for the current application comprises: performing the gesture operation on the coordinates for the current application.
  3. The method for controlling an application according to claim 2, characterized in that the identifying the virtual button in the current application interface comprises:
    obtaining the current interface of the current application;
    identifying the virtual button by performing character or pattern recognition of the virtual button on the current interface.
  4. The method for controlling an application according to claim 1, characterized in that the determining an application operation corresponding to the triggering operation of the physical button under the current application comprises:
    determining an application operation corresponding to the triggering operation of the physical button under the current interface of the current application.
  5. The method for controlling an application according to claim 1, characterized in that the determining an application operation corresponding to the triggering operation of the physical button under the current application comprises:
    determining the application operation corresponding to the triggering operation of the physical button under the current application according to the most frequently used application operation in the history of application operations under the current application.
  6. The method for controlling an application according to claim 1, characterized in that one triggering operation of the physical button corresponds to multiple application operations; or
    triggering operations of multiple physical buttons correspond to one application operation.
  7. A device for controlling an application, characterized by comprising:
    a receiving module configured to receive a triggering operation of a physical button;
    a determining module configured to determine an application operation corresponding to the triggering operation of the physical button under the current application;
    an executing module configured to perform the application operation for the current application.
  8. The device for controlling an application according to claim 7, characterized in that the application operation includes a gesture operation on a virtual button;
    the determining module comprises:
    a correspondence submodule configured to determine a virtual button and a gesture operation corresponding to the triggering operation of the physical button under the current application;
    an interface submodule configured to identify the virtual button in the current interface and determine the coordinates of the virtual button in the current interface;
    the executing module comprises:
    an executing submodule configured to perform the gesture operation on the coordinates for the current application.
  9. The device for controlling an application according to claim 8, characterized in that the interface submodule obtains the current interface of the current application, and performs character or pattern recognition of the virtual button on the current interface to identify the virtual button.
  10. The device for controlling an application according to claim 7, characterized in that the determining module comprises:
    a first determining submodule configured to determine an application operation corresponding to the triggering operation of the physical button under the current interface of the current application.
  11. The device for controlling an application according to claim 7, characterized in that the determining module comprises:
    a second determining submodule configured to determine the application operation corresponding to the triggering operation of the physical button under the current application according to the most frequently used application operation in the history of application operations under the current application.
  12. The device for controlling an application according to claim 7, characterized in that one triggering operation of the physical button corresponds to multiple application operations; or
    triggering operations of multiple physical buttons correspond to one application operation.
  13. A device for controlling an application, characterized by comprising:
    a processor;
    a memory for storing processor-executable instructions;
    wherein the processor is configured to:
    receive a triggering operation of a physical button;
    determine an application operation corresponding to the triggering operation of the physical button under the current application;
    perform the application operation for the current application.
PCT/CN2015/093862 2014-12-31 2015-11-05 控制应用的方法及装置 WO2016107283A1 (zh)

Priority Applications (7)

Application Number Priority Date Filing Date Title
EP15874967.1A EP3208704B1 (en) 2014-12-31 2015-11-05 Application control method and device
KR1020167000039A KR101819985B1 (ko) 2014-12-31 2015-11-05 애플리케이션 제어 방법, 장치, 프로그램 및 컴퓨터가 판독 가능한 기록매체
JP2016567125A JP6186091B2 (ja) 2014-12-31 2015-11-05 アプリケーション制御方法、装置、プログラム及び記録媒体
BR112016002304A BR112016002304A2 (pt) 2014-12-31 2015-11-05 método e dispositivo para controlar aplicativo
MX2016000463A MX355827B (es) 2014-12-31 2015-11-05 Metodo y dispositivo para control de aplicacion.
RU2016101409A RU2640733C2 (ru) 2014-12-31 2015-11-05 Способ и устройство для управления приложением
US15/052,816 US20160187997A1 (en) 2014-12-31 2016-02-24 Method and device for controlling application

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410856869.6A CN104461304A (zh) 2014-12-31 2014-12-31 控制应用的方法及装置
CN201410856869.6 2014-12-31

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/052,816 Continuation US20160187997A1 (en) 2014-12-31 2016-02-24 Method and device for controlling application

Publications (1)

Publication Number Publication Date
WO2016107283A1 true WO2016107283A1 (zh) 2016-07-07

Family

ID=52907447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/093862 WO2016107283A1 (zh) 2014-12-31 2015-11-05 控制应用的方法及装置

Country Status (8)

Country Link
EP (1) EP3208704B1 (zh)
JP (1) JP6186091B2 (zh)
KR (1) KR101819985B1 (zh)
CN (1) CN104461304A (zh)
BR (1) BR112016002304A2 (zh)
MX (1) MX355827B (zh)
RU (1) RU2640733C2 (zh)
WO (1) WO2016107283A1 (zh)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104461304A (zh) * 2014-12-31 2015-03-25 小米科技有限责任公司 控制应用的方法及装置
CN104883444B (zh) * 2015-05-15 2017-11-14 广东欧珀移动通信有限公司 一种通话快捷应用方法、系统及移动终端
CN104935738B (zh) * 2015-05-28 2018-04-27 努比亚技术有限公司 移动终端应用控制方法及装置
CN107544742B (zh) * 2016-06-28 2020-10-20 富泰华工业(深圳)有限公司 一种控制方法及其电子装置
CN106775064B (zh) * 2016-11-24 2020-04-28 努比亚技术有限公司 一种终端控制装置及其方法
CN106775070B (zh) * 2016-11-29 2020-05-22 努比亚技术有限公司 一种输入控制装置及其方法
CN106791037B (zh) * 2016-11-30 2021-01-15 腾讯科技(深圳)有限公司 操作触发方法、系统、移动终端及电磁场发生设备
CN106873822A (zh) * 2016-12-27 2017-06-20 努比亚技术有限公司 一种终端悬浮操作学习装置及其方法
US10812423B2 (en) 2017-03-15 2020-10-20 Naver Corporation Method, apparatus, system, and non-transitory computer readable medium for chatting on mobile device using an external device
CN106951258A (zh) * 2017-03-27 2017-07-14 维沃移动通信有限公司 一种移动终端的操作方法及移动终端
CN106940653B (zh) * 2017-04-14 2021-06-04 北京小米移动软件有限公司 控制应用程序的方法、装置及计算机可读存储介质
CN107402692B (zh) * 2017-07-24 2020-11-17 维沃移动通信有限公司 一种移动终端的操作方法及移动终端
CN108958863A (zh) * 2018-06-21 2018-12-07 西安蜂语信息科技有限公司 应用程序的功能触发方法及装置
CN108984099B (zh) * 2018-07-16 2020-09-18 维沃移动通信有限公司 一种人机交互方法及终端
CN112083839A (zh) * 2019-06-14 2020-12-15 海能达通信股份有限公司 终端操作的方法、终端及具有存储功能的装置
CN112754644A (zh) * 2020-12-31 2021-05-07 杭州堃博生物科技有限公司 一种射频消融仪的控制方法及其系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101482796A (zh) * 2009-02-11 2009-07-15 中兴通讯股份有限公司 一种通过触摸屏启动移动终端应用功能的系统及方法
CN102270080A (zh) * 2010-06-02 2011-12-07 康佳集团股份有限公司 电子设备输入装置及移动终端
CN102929424A (zh) * 2012-09-14 2013-02-13 华为终端有限公司 移动终端的控制方法、装置及移动终端
EP2637128A1 (en) * 2012-03-06 2013-09-11 beyo GmbH Multimodal text input by a keyboard/camera text input module replacing a conventional keyboard text input module on a mobile device
CN103558990A (zh) * 2013-11-05 2014-02-05 广东欧珀移动通信有限公司 一种移动终端快速打开或操作应用程序的方法及系统
CN104461304A (zh) * 2014-12-31 2015-03-25 小米科技有限责任公司 控制应用的方法及装置

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100649149B1 (ko) * 2005-11-24 2006-11-27 주식회사 팬택 디스플레이형 단축 기능 버튼을 가지는 이동 통신 단말기및 그 방법
JP2009088894A (ja) * 2007-09-28 2009-04-23 Kyocera Corp 携帯端末
KR20100027686A (ko) * 2008-09-03 2010-03-11 엘지전자 주식회사 이동단말기 및 그 제어 방법
US8638300B2 (en) * 2008-12-23 2014-01-28 Motorola Mobility Llc Method and touch-sensitive apparatus for searching entries in a stored list
JP5621382B2 (ja) * 2010-07-28 2014-11-12 カシオ計算機株式会社 サーバ装置及びプログラム
JP2012226694A (ja) * 2011-04-22 2012-11-15 Nec Casio Mobile Communications Ltd 入力装置、入力方法およびプログラム
KR101820933B1 (ko) * 2011-08-10 2018-01-23 주식회사 케이티 단말 및 그의 이벤트 기록 방법
TW201312387A (zh) * 2011-09-06 2013-03-16 Askey Technology Jiangsu Ltd 按鍵裝置及具有該按鍵裝置之螢幕顯示器
US20130139113A1 (en) * 2011-11-30 2013-05-30 Microsoft Corporation Quick action for performing frequent tasks on a mobile device
CN103218211B (zh) * 2012-01-19 2015-10-07 腾讯科技(深圳)有限公司 一种应用实例之间共享热键的方法和系统
JP5678948B2 (ja) * 2012-12-12 2015-03-04 株式会社デンソー 車両用表示装置およびプログラム
CN103729065B (zh) * 2013-12-17 2018-01-30 深圳市慧为智能科技股份有限公司 触控操作映射到实体按键的系统及方法
JP2015153317A (ja) * 2014-02-18 2015-08-24 キヤノン株式会社 情報処理装置、情報処理方法、及びコンピュータプログラム
CN104238759B (zh) * 2014-09-04 2017-11-10 广东欧珀移动通信有限公司 实体按键控制终端的方法及装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101482796A (zh) * 2009-02-11 2009-07-15 中兴通讯股份有限公司 一种通过触摸屏启动移动终端应用功能的系统及方法
CN102270080A (zh) * 2010-06-02 2011-12-07 康佳集团股份有限公司 电子设备输入装置及移动终端
EP2637128A1 (en) * 2012-03-06 2013-09-11 beyo GmbH Multimodal text input by a keyboard/camera text input module replacing a conventional keyboard text input module on a mobile device
CN102929424A (zh) * 2012-09-14 2013-02-13 华为终端有限公司 移动终端的控制方法、装置及移动终端
CN103558990A (zh) * 2013-11-05 2014-02-05 广东欧珀移动通信有限公司 一种移动终端快速打开或操作应用程序的方法及系统
CN104461304A (zh) * 2014-12-31 2015-03-25 小米科技有限责任公司 控制应用的方法及装置

Also Published As

Publication number Publication date
JP6186091B2 (ja) 2017-08-23
EP3208704A4 (en) 2018-05-30
CN104461304A (zh) 2015-03-25
RU2640733C2 (ru) 2018-01-11
KR20160092518A (ko) 2016-08-04
RU2016101409A (ru) 2017-07-24
JP2017505969A (ja) 2017-02-23
MX2016000463A (es) 2016-08-26
BR112016002304A2 (pt) 2017-08-01
EP3208704A1 (en) 2017-08-23
KR101819985B1 (ko) 2018-01-18
MX355827B (es) 2018-05-02
EP3208704B1 (en) 2021-01-06

Similar Documents

Publication Publication Date Title
WO2016107283A1 (zh) 控制应用的方法及装置
WO2017036019A1 (zh) 移动终端控制的方法及移动终端
CN105975166B (zh) 应用控制方法及装置
WO2016023340A1 (zh) 一种切换摄像头的方法和装置
WO2017020486A1 (zh) 调节拍摄功能的控制方法、装置和设备
WO2017113842A1 (zh) 控制智能设备的方法及装置
WO2017071068A1 (zh) 应用程序切换方法、装置及设备
WO2016112699A1 (zh) 切换显示模式的方法及装置
WO2017059638A1 (zh) 指纹识别方法及装置
WO2016192324A1 (zh) 单手模式下的按键操作处理方法及装置、电子设备
WO2018098865A1 (zh) 消息阅读方法及装置
WO2017201887A1 (zh) 终端的工作模式确定方法及装置
WO2016206292A1 (zh) 文本输入方法和装置
WO2017084183A1 (zh) 信息显示方法与装置
CN104317402B (zh) 描述信息的显示方法及装置、电子设备
EP2924552B1 (en) Method and mobile terminal for executing user instructions
CN107798309B (zh) 指纹录入方法、装置及计算机可读存储介质
JP6300389B2 (ja) ターゲットオブジェクトの表示方法及び装置
CN104049864B (zh) 对象控制方法及装置
CN107102772A (zh) 触控方法及装置
WO2016110146A1 (zh) 移动终端及虚拟按键的处理方法
WO2017092500A1 (zh) 功能键的触控方法及装置
RU2628484C2 (ru) Способ и устройство для активации рабочего состояния мобильного терминала
CN108766427B (zh) 语音控制方法及装置
WO2017211021A1 (zh) 应用触发方法及装置

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2016567125

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20167000039

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: MX/A/2016/000463

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2016101409

Country of ref document: RU

Kind code of ref document: A

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112016002304

Country of ref document: BR

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15874967

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2015874967

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 112016002304

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20160202