CN107870674B - Program starting method and mobile terminal - Google Patents
- Publication number
- CN107870674B (application number CN201711242523.7A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- gesture operation
- program
- input
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/445—Program loading or initiating
- G06F9/44505—Configuring for program initiating, e.g. using registry, configuration files
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
The invention discloses a program starting method and a mobile terminal. The method comprises the following steps: acquiring gesture information and a sound signal of a gesture operation performed in an image acquisition area of a terminal; searching a gesture operation library for a program matched with the gesture operation according to the gesture information and the sound signal, the gesture operation library storing associations between gesture operations and programs, each program implementing a specific function; and starting the program. In the embodiment of the invention, a program is started based on two dimensions, the gesture information and the sound signal of the gesture operation; compared with prior-art schemes that rely on gestures alone, accidental starts can be effectively avoided and the variety of available gesture operations is increased. Moreover, the gesture operation requires no contact with the terminal screen, which effectively improves convenience for the user.
Description
Technical Field
The present invention relates to the field of terminals, and in particular, to a program starting method and a mobile terminal.
Background
Gesture operation is a distinctive feature of smartphones and lends them a futuristic feel; a typical example is the screen-sliding operation. Through gestures configured in the phone, the user can directly wake a particular program.
For example: the user performs a two-finger sliding operation on the touch screen; the terminal matches the detected two-finger sliding gesture against the gestures corresponding to preset functions, and determines whether to start the corresponding function according to the matching result.
However, with this scheme it is easy for the user to start the corresponding function through an unintentional action, causing a misoperation.
Disclosure of Invention
The embodiments of the invention provide a program starting method and a mobile terminal, which are intended to solve the problems that existing program starting schemes are prone to misoperation, support few operation types, and apply only to narrow scenarios.
In a first aspect, an embodiment of the present invention provides a program starting method, including:
acquiring, through a camera of the mobile terminal, gesture information and a sound signal of a gesture operation performed by a user;
searching a gesture operation library for a program associated with the gesture operation according to the gesture information and the sound signal, the gesture operation library storing associations between gesture operations and programs;
and starting the program.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including:
an acquisition unit, used for acquiring, through a camera of the mobile terminal, gesture information and a sound signal of a gesture operation performed by a user;
a matching unit, used for searching a gesture operation library for a program matched with the gesture operation according to the gesture information and the sound signal, the gesture operation library storing associations between gesture operations and programs, each program implementing a specific function;
and a starting unit, used for starting the program.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including: a memory, a processor, and a program starting program stored on the memory and executable on the processor, the program starting program, when executed by the processor, implementing the steps of the program starting method described above.
In the embodiments of the invention, a program is started based on two dimensions, the gesture information and the sound signal of the gesture operation; compared with prior-art schemes that start a program by gesture alone, accidental starts can be effectively avoided and the variety of available gesture operations is increased. Moreover, the gesture operation can be performed without touching the terminal screen, which effectively improves the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic flowchart of a program starting method according to embodiment 1 of the present invention;
fig. 2 is a schematic flowchart of a program starting method according to embodiment 2 of the present invention;
fig. 3 is a schematic structural diagram of a mobile terminal according to embodiment 3 of the present invention;
fig. 4 is a schematic structural diagram of a mobile terminal according to embodiment 4 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the specific embodiments of the present invention and the accompanying drawings. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the present invention is applicable to a mobile terminal, where a mobile terminal refers to a computer device that can be used while mobile, broadly including mobile phones, notebooks, tablet computers, POS machines, and even vehicle-mounted computers, but most often referring to mobile phones or smartphones and tablets with multiple application functions.
The technical solutions provided by the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Example 1
Fig. 1 is a schematic flowchart of a program starting method provided in embodiment 1 of the present invention. Referring to fig. 1, the method may be executed by a processor of a terminal and may specifically include the following steps:
Step 12, acquiring gesture information and a sound signal of a gesture operation performed in an image acquisition area of the terminal;
It should be noted that one implementation of step 12 may be:
when the terminal is in a screen locking/lighting state, a gesture operation user (hereinafter, simply referred to as a user) performs gesture operations in an acquisition area of an image acquisition device such as a camera of the terminal, for example: the method comprises the steps of clapping hands, making a sound finger and the like, collecting gesture information of the gesture operation by a camera, and simultaneously collecting sound signals of the gesture operation by a recording device of a terminal. Wherein the gesture and the sound are simultaneous events.
If the majority of the user does not exist in the image acquisition area when the user performs gesture operation, a prompt of acquisition failure can be selectively sent out.
Step 14, searching a gesture operation library for a program associated with the gesture operation according to the gesture information and the sound signal, the gesture operation library storing associations between gesture operations and programs;
it should be noted that, one implementation manner of step 14 may be:
First, the terminal recognizes, based on gesture recognition technology, the content of the gesture operation collected in step 12, i.e., gesture information such as the shape of the gesture and its motion trajectory; this information serves as a language for interacting with the terminal and lets the terminal understand the meaning of the gesture operation. Then, based on the gesture information and the sound signal corresponding to the gesture operation, the terminal finds in the gesture operation library the matching entered gesture operation and the program associated with it. The program here may be an application program, for example a certain app, or a function program, for example the play/pause function within a video player application.
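The lookup described here can be sketched as follows: a minimal Python sketch that assumes the library is a simple list of entries and that the gesture and sound have already been reduced to discrete labels (all names and labels are illustrative, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class GestureEntry:
    """One entry in the gesture operation library (all labels are illustrative)."""
    gesture_info: str     # recognized gesture content, e.g. "finger_snap"
    sound_signature: str  # coarse sound label, e.g. "snap"
    program: str          # associated program: an app or an in-app function

def find_program(library, gesture_info, sound_signature):
    """Return the program associated with the captured gesture and sound, or None."""
    for entry in library:
        if (entry.gesture_info == gesture_info
                and entry.sound_signature == sound_signature):
            return entry.program
    return None

library = [
    GestureEntry("finger_snap", "snap", "camera.open"),
    GestureEntry("clap", "clap_once", "video_player.pause"),
]
print(find_program(library, "clap", "clap_once"))  # prints "video_player.pause"
```

A real implementation would match on similarity scores rather than exact label equality; the substeps below describe that refinement.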
The method specifically comprises the following substeps:
a construction substep: determining gesture operation to be input; recognizing gesture information and sound signals of the gesture operation to be input, and establishing an incidence relation between the gesture operation to be input and a specified program; and inputting the gesture information and the sound signals of the gesture operation to be input and the incidence relation between the gesture information and the appointed program into a gesture operation library of the terminal.
In the construction sub-step, the association relationship between the gesture operation and the designated program can be established, for example, as follows: the user enters a quick start setting interface, selects a program, then makes gesture operation for awakening the selected program in an image acquisition area of the terminal based on the terminal instruction, creates the association relationship between the gesture information of the gesture operation and the sound signal and the selected program by the terminal, stores the association relationship into a gesture operation library, and then selectively sends prompt information for completing the setting.
In addition, in the sub-step of the construction, in order to avoid the situation that the entered gesture operations are overlapped, after the content (gesture information) of the gesture operation to be entered is identified, whether the gesture operation to be entered is the entered gesture operation can be judged. The method specifically comprises the following steps: judging whether the similarity between the content and/or the sound signal corresponding to the gesture operation to be recorded and the content and/or the sound signal corresponding to the recorded gesture operation stored in the gesture operation library is smaller than a preset similarity threshold value or not; if not, prompting that the gesture operation to be input is the input gesture operation, and refusing to input the gesture operation to be input; if yes, the gesture operation is recorded into a gesture operation library.
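The duplicate check might look like the following sketch, assuming similarity scores in [0, 1] have already been computed against each stored entry. The threshold value is an assumption, and the sketch rejects an entry only when both gesture and sound are too similar, consistent with the later note that identical gestures with different loudness may wake different functions:

```python
def can_enroll(similarities, threshold=0.8):
    """similarities: (gesture_similarity, sound_similarity) pairs, one per
    entry already stored in the gesture operation library, each in [0, 1].

    Enrollment is refused only when some stored entry is too similar in BOTH
    dimensions; the same gesture with a clearly different sound remains
    enrollable. The threshold value of 0.8 is an assumed example.
    """
    return not any(g >= threshold and s >= threshold for g, s in similarities)

print(can_enroll([(0.2, 0.3), (0.5, 0.1)]))  # prints "True": clearly new
print(can_enroll([(0.95, 0.9)]))             # prints "False": duplicate entry
print(can_enroll([(0.95, 0.1)]))             # prints "True": same gesture, new sound
```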
A matching substep: matching the collected gesture information and sound signal of the gesture operation against the gesture information and sound signals of the entered gesture operations stored in the gesture operation library, respectively; and determining, according to the matching result, the program among the terminal's programs that is associated with the gesture information and sound signal of the collected gesture operation.
In the matching substep, one implementation of the matching may be: comparing the gesture information of the collected gesture operation with the gesture information of each entered gesture operation in the gesture operation library to determine a first similarity; and determining, according to the attribute information of the sound signals, a second similarity between the sound signal of the collected gesture operation and the sound signal of each entered gesture operation. The attribute information of a sound signal includes at least one of loudness, timbre, and pitch. For example, if the gestures of two gesture operations are the same but the loudness of the sound differs, different functions may be woken.
In the matching substep, one implementation of determining the program may be: first, determining the matching degree between each entered gesture operation in the gesture operation library and the collected gesture operation according to the first similarity and the second similarity; then, determining the gesture operation whose matching degree meets a predetermined condition, which may be the gesture operation with the highest matching degree; and finally, selecting, from the application programs of the terminal, the program associated with the determined gesture operation as the program corresponding to the collected gesture operation.
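One way to combine the first and second similarities into a single matching degree is a weighted sum. The patent does not fix a formula, so the weights and minimum score below are assumptions for illustration:

```python
def best_match(candidates, w_gesture=0.5, w_sound=0.5, min_score=0.7):
    """Pick the entered gesture operation whose combined matching degree is highest.

    candidates: list of (program, first_similarity, second_similarity) tuples.
    The weighted-sum formula and both thresholds are assumptions; the
    embodiment only requires that both similarities contribute to one
    matching degree and that the best degree meets a predetermined condition.
    """
    if not candidates:
        return None
    scored = [(program, w_gesture * g + w_sound * s) for program, g, s in candidates]
    program, score = max(scored, key=lambda pair: pair[1])
    return program if score >= min_score else None

print(best_match([("camera.open", 0.9, 0.8), ("video_player.pause", 0.4, 0.5)]))
# prints "camera.open"
```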
In the matching substep, if at least two entered gesture operations are found to match the collected gesture operation almost equally well, for example, if the difference between their matching degrees with the collected gesture operation is smaller than a predetermined threshold, both are determined to be gesture operations whose matching degrees meet the predetermined condition, and the at least two programs associated with them are displayed, for example as icons or operation interfaces on the screen, so that the user can select the program to be run.
Step 16, starting the program.
It should be noted that, one implementation manner of step 16 may be:
If, based on steps 12 and 14, exactly one program associated with the collected gesture operation is matched, execute that program.
Another implementation of step 16 may be:
If, based on steps 12 and 14, at least two programs associated with the collected gesture operation are matched, wait for the user's selection and then execute the selected program.
Yet another implementation of step 16 may be:
After the program associated with the collected gesture operation is matched based on steps 12 and 14, the terminal further determines the programs associated with the matched program and their running states; and, according to those running states, determines whether the terminal's current operating environment meets the predetermined condition for starting the matched program. If yes, start the matched program; if not, make no response, or issue a prompt that the instruction is invalid. For example: the matched program is a function program, video pause/play; if the application program it belongs to, the video player, is not open, or the function program implementing the start service is not open, the terminal determines that the current operating environment does not meet the predetermined condition, and the mobile terminal does not start the video pause/play function program.
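This environment check can be sketched as follows, assuming a simple dependency map from function programs to their parent applications (all names are hypothetical):

```python
def may_start(program, running_programs, dependencies):
    """Decide whether the terminal's current environment allows starting `program`.

    dependencies maps a function program to the application it belongs to,
    e.g. "video.pause" -> "video_player" (hypothetical names); an application
    program with no entry has no precondition.
    """
    parent = dependencies.get(program)
    if parent is None:
        return True                    # application program: start directly
    return parent in running_programs  # function program: its app must be running

deps = {"video.pause": "video_player"}
print(may_start("video.pause", {"video_player"}, deps))  # prints "True"
print(may_start("video.pause", set(), deps))             # prints "False"
```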
In this way, the program is started based on two dimensions, the gesture information and the sound signal of the gesture operation; compared with prior-art schemes that rely on gestures alone, accidental starts can be effectively avoided and the variety of available gesture operations is increased. Moreover, the gesture operation requires no contact with the terminal screen, which effectively improves convenience for the user.
Example 2
Fig. 2 is a schematic flowchart of a program starting method provided in embodiment 2 of the present invention. Referring to fig. 2, the invention is described in detail by way of example, and the method may specifically include the following steps:
step 202: enter set function interface
It is understood that, by operating the terminal, the user enters an interface for setting, for a program, the gesture information and sound signal corresponding to a gesture operation. This process is similar to enrolling a fingerprint and is therefore not described in detail here.
Step 204: enter the gesture and sound
The user enters a gesture and a sound in the setting function; the mobile terminal recognizes and records them through the camera and the recording device. The gesture and the sound must occur simultaneously.
For example: after the user selects the program to be set (such as video pause), the user snaps a finger in front of the lens; the mobile terminal records the snapping action through the camera and records sound attributes such as loudness, pitch, and timbre through the recording device.
Step 206: judge whether the gesture and sound are similar to existing gestures and sounds
The sounds may be distinguished, for example, by extracting acoustic features such as Mel-frequency cepstral coefficients (MFCCs) and judging whether the main attributes of the finger-snap sound are clearly distinguishable from the sounds of already entered gestures.
The gestures may be distinguished, for example, by comparing the motion trajectory of the finger-snap gesture against the motion patterns of already entered gestures.
The same applies to other actions such as clapping: a clap with the five fingers together and a clap with the five fingers apart are regarded as not clearly distinguishable, whereas a light clap versus a heavy clap, a fast clap versus a slow clap, the left hand versus the right hand, or one clap versus two claps can be regarded as clearly distinguishable.
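As a rough illustration of the sound-distinction idea, the sketch below uses RMS loudness and spectral centroid as crude stand-ins for the MFCC features mentioned above; the ratio thresholds and synthetic test signals are assumptions:

```python
import numpy as np

def sound_features(signal, sr=16000):
    """Crude stand-ins for MFCC features: RMS loudness and the spectral
    centroid (a rough pitch proxy)."""
    rms = float(np.sqrt(np.mean(signal ** 2)))
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return rms, centroid

def clearly_distinct(a, b, rms_ratio=1.5, centroid_ratio=1.5):
    """Treat two sounds as 'obviously distinguished' when loudness or pitch
    differs by an assumed ratio; a real system would compare MFCC vectors."""
    (rms_a, c_a), (rms_b, c_b) = a, b
    loudness_differs = max(rms_a, rms_b) / (min(rms_a, rms_b) + 1e-12) >= rms_ratio
    pitch_differs = max(c_a, c_b) / (min(c_a, c_b) + 1e-12) >= centroid_ratio
    return loudness_differs or pitch_differs

sr = 16000
t = np.linspace(0, 0.5, sr // 2, endpoint=False)
snap = 0.8 * np.sin(2 * np.pi * 2000 * t)  # loud, high-pitched stand-in
clap = 0.2 * np.sin(2 * np.pi * 300 * t)   # soft, low-pitched stand-in
print(clearly_distinct(sound_features(snap), sound_features(clap)))  # prints "True"
```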
Step 208: prompt the user that the entered gesture must be distinguishable from existing gestures
If the finger-snap gesture and its sound are not clearly distinguishable from existing gestures and sounds, prompt the user that the entered gesture must be distinguishable from existing ones.
Step 210: after entry, the user sets the program to be called by the gesture and sound
If the finger-snap gesture and its sound are clearly distinguishable from existing gestures and sounds, guide the user to set the program to be woken, which may be a function program or an application program. The wake functions bound to one gesture need not be unique, but they must be mutually exclusive, i.e., functions that cannot occur simultaneously. For example, one gesture may be set to start or pause a video, or to take a photo or record a video; it may not be set to wake both WeChat and the camera.
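The mutual-exclusion requirement can be modeled by gating each bound function on a context, so that at most one bound function applies at any moment; a sketch with hypothetical function and context names:

```python
def resolve(bindings, active_context):
    """bindings: (function, context) pairs bound to one gesture operation.
    Mutual exclusion means at most one bound function applies in any context;
    function and context names here are hypothetical."""
    applicable = [func for func, ctx in bindings if ctx == active_context]
    if len(applicable) > 1:
        raise ValueError("bound functions are not mutually exclusive")
    return applicable[0] if applicable else None

bindings = [
    ("video.pause", "video_playing"),
    ("video.play", "video_paused"),
    ("camera.shoot", "camera_open"),
]
print(resolve(bindings, "video_playing"))  # prints "video.pause"
print(resolve(bindings, "home_screen"))    # prints "None"
```

Under this model, binding one gesture to both WeChat and the camera would fail, because both could apply in the same context.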
Step 212: setup complete
Step 214: the user makes the corresponding gesture in front of the screen, accompanied by the sound
The user makes the corresponding gesture, accompanied by its sound, in the image acquisition area above the screen.
Step 216: recognize, through the camera and the recording microphone, whether the gesture and sound made have been entered
The mobile terminal performs gesture recognition matching and sound recognition matching. Gesture recognition matching can be performed based on feature-image recognition; sound recognition matching is based on matching attributes of the sound, such as loudness, timbre, and pitch, against the sound corresponding to the entered gesture. If matching fails, no response is made; if it succeeds, step 218 is performed.
Step 218: judge whether the current environment allows waking the corresponding program
For example, if recognition succeeds but no video playing interface is open, a clap cannot wake the pause function or play the video.
If recognition succeeds and the current context allows the corresponding function to be woken, step 220 is performed to wake the corresponding function.
In this way, the program is started based on two dimensions, the gesture information and the sound signal of the gesture operation; compared with prior-art schemes that rely on gestures alone, accidental starts can be effectively avoided and the variety of available gesture operations is increased. Moreover, the gesture operation requires no contact with the terminal screen, which effectively improves convenience for the user.
It should be noted that the execution subjects of the steps of the methods provided in embodiments 1 or 2 may be the same apparatus, or different apparatuses may be used as the execution subjects of the methods. For example, the execution subject of steps 12 and 14 may be device 1, and the execution subject of step 16 may be device 2; for another example, the execution subject of step 12 may be device 1, and the execution subjects of steps 14 and 16 may be device 2; and so on.
In addition, for simplicity of explanation, the above-described method embodiments are described as a series of acts or combinations, but it should be understood by those skilled in the art that the present invention is not limited by the order of acts or steps described, as some steps may be performed in other orders or simultaneously according to the present invention. Furthermore, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Example 3
Fig. 3 is a schematic structural diagram of a mobile terminal provided in embodiment 3 of the present invention, which may specifically include: an acquisition unit 31, a matching unit 32, and a starting unit 33, wherein:
the acquisition unit 31 is used for acquiring, through a camera of the mobile terminal, gesture information and a sound signal of a gesture operation performed by a user;
the matching unit 32 is used for searching a gesture operation library for a program matched with the gesture operation according to the gesture information and the sound signal; the gesture operation library stores associations between gesture operations and programs, each program implementing a specific function;
a starting unit 33 for starting the program.
The matching unit 32 may specifically include:
a matching subunit, used for matching the collected gesture information and sound signal of the gesture operation against the gesture information and sound signals of the entered gesture operations in the gesture operation library, respectively;
a first determining subunit, used for determining, according to the matching result, the gesture operation in the gesture operation library whose matching degree with the collected gesture operation meets a predetermined condition;
and a selecting subunit, used for selecting the program associated with the determined gesture operation.
The matching unit 32 may further include:
a display subunit, used for displaying the programs associated with at least two gesture operations if at least two gesture operations whose matching degrees with the collected gesture operation meet the predetermined condition are determined.
The starting unit 33 may include:
a second determining subunit, used for determining the programs associated with the matched program and the running states of those associated programs;
and a third determining subunit, used for determining, according to the running states of the associated programs, whether the terminal's current operating environment meets the predetermined condition for starting the matched program.
In another possible implementation, the mobile terminal further includes:
a determining unit, used for determining a gesture operation to be entered;
a recognition unit, used for recognizing the gesture information and sound signal of the gesture operation to be entered and establishing an association between the gesture operation to be entered and a specified program;
and an entry unit, used for entering the gesture information and sound signal of the gesture operation to be entered, together with its association with the specified program, into the gesture operation library of the terminal.
In another possible implementation, the mobile terminal further includes:
a judging unit, used for judging whether the similarity between the gesture information and/or sound signal of the gesture operation to be entered and the gesture information and/or sound signal of each entered gesture operation stored in the gesture operation library is smaller than a preset similarity threshold;
and a prompting unit, used for prompting, if the similarity is not smaller than the threshold, that the gesture operation to be entered duplicates an entered gesture operation.
In this way, the program is started based on two dimensions, the gesture information and the sound signal of the gesture operation; compared with prior-art schemes that rely on gestures alone, accidental starts can be effectively avoided and the variety of available gesture operations is increased. Moreover, the gesture operation requires no contact with the terminal screen, which effectively improves convenience for the user.
As the virtual-device implementation of the mobile terminal is basically similar to the method implementation, its description is relatively brief; for relevant points, refer to the corresponding parts of the method description.
It should be noted that the components of the apparatus of the present invention are divided logically according to the functions they implement; the present invention is not limited to this division, and the components may be re-divided or combined as necessary.
Example 4
Fig. 4 is a schematic structural diagram of a mobile terminal according to embodiment 4 of the present invention. Referring to fig. 4, the mobile terminal 400 includes, but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, a processor 410, and a power supply 411. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 4 does not limit the mobile terminal; a mobile terminal may include more or fewer components than shown, combine certain components, or arrange the components differently. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The radio frequency unit 401 is configured to acquire, through a camera of the mobile terminal, gesture information and a sound signal of a gesture operation performed by a user;
the processor 410 is configured to search a gesture operation library for a program associated with the gesture operation according to the gesture information and the sound signal, the gesture operation library storing associations between gesture operations and programs, and to start the program.
In this way, the program is started based on two dimensions, the gesture information and the sound signal of the gesture operation; compared with prior-art schemes that rely on gesture starting alone or voice starting alone, accidental starts can be effectively avoided and the variety of available gesture operations is increased. Moreover, the gesture operation requires no contact with the terminal screen, which effectively improves convenience for the user.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 401 may be used for receiving and sending signals during message transmission and reception or during a call. Specifically, it receives downlink data from a base station and forwards it to the processor 410 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 401 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 402, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 403 may convert audio data received by the radio frequency unit 401 or the network module 402 or stored in the memory 409 into an audio signal and output as sound. Also, the audio output unit 403 may also provide audio output related to a specific function performed by the mobile terminal 400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 404 is used to receive audio or video signals. The input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042. The graphics processor 4041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in video capturing mode or image capturing mode. The processed image frames may be displayed on the display unit 406, stored in the memory 409 (or another storage medium), or transmitted via the radio frequency unit 401 or the network module 402. The microphone 4042 receives sound and processes it into audio data. In telephone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 401.
The mobile terminal 400 also includes at least one sensor 405, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 4061 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 4061 and/or the backlight when the mobile terminal 400 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used to identify the posture of the mobile terminal (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The sensor 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
The display unit 406 is used to display information input by the user or information provided to the user. The display unit 406 may include a display panel 4061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 407 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. The touch panel 4071, also referred to as a touch screen, may collect touch operations performed by a user on or near it (for example, operations performed on or near the touch panel 4071 with a finger, a stylus, or any suitable object or attachment). The touch panel 4071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 410, and receives and executes commands from the processor 410. The touch panel 4071 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave panel. Besides the touch panel 4071, the user input unit 407 may include other input devices 4072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and an on/off key), a trackball, a mouse, and a joystick; these are not described in detail here.
Further, the touch panel 4071 can be overlaid on the display panel 4061. When the touch panel 4071 detects a touch operation on or near it, it transmits the operation to the processor 410 to determine the type of the touch event, and the processor 410 then provides a corresponding visual output on the display panel 4061 according to that type. Although in FIG. 4 the touch panel 4071 and the display panel 4061 are two separate components implementing the input and output functions of the mobile terminal, in some embodiments they may be integrated to implement those functions; this is not limited here.
The interface unit 408 is an interface through which an external device is connected to the mobile terminal 400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 408 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 400 or may be used to transmit data between the mobile terminal 400 and external devices.
The memory 409 may be used to store software programs as well as various data. The memory 409 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook). Further, the memory 409 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 410 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 409 and calling data stored in the memory 409, thereby monitoring the mobile terminal as a whole. The processor 410 may include one or more processing units; preferably, it may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 410.
The mobile terminal 400 may further include a power supply 411 (e.g., a battery) for supplying power to various components, and preferably, the power supply 411 may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the mobile terminal 400 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 110, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When executed by the processor 110, the computer program implements each process of the method in embodiment 1 or 2 and achieves the same technical effect; to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the method in embodiment 1 or 2 and achieves the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, it is not limited to those embodiments, which are illustrative rather than restrictive. It will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (11)
1. A program starting method, applied to a mobile terminal, characterized by comprising the following steps:
acquiring, through a camera of the mobile terminal, gesture information and a sound signal of a gesture operation performed by a user;
searching, according to the gesture information and the sound signal, a gesture operation library for a program associated with the gesture operation, wherein the gesture operation library stores the association relation between the gesture operation and the program; and
starting the program;
wherein, before starting the program, the method comprises:
determining a program having an association relation with the matched program, and an operation state of the program having the association relation; and
determining, according to the operation state of the program having the association relation, whether the current operating environment of the terminal meets a predetermined condition for starting the matched program.
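The pre-launch environment check in claim 1 can be sketched as follows. The claim does not specify data structures, so the dictionaries, the program names, and the `required_state` value are all hypothetical.

```python
def environment_ready(matched_program, associations, running_states,
                      required_state="running"):
    """Before starting the matched program, look up the programs that
    have an association relation with it and verify that each one's
    operation state meets the predetermined condition."""
    related = associations.get(matched_program, [])
    return all(running_states.get(p) == required_state for p in related)

# Hypothetical example: a video-call app that depends on two services.
associations = {"video_call": ["network_service", "camera_service"]}
states = {"network_service": "running", "camera_service": "stopped"}
print(environment_ready("video_call", associations, states))  # prints False
```

A program with no associated programs passes the check trivially, since there is no operation state to violate.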
2. The method according to claim 1, wherein, before acquiring the gesture information and the sound signal of the gesture operation performed by the user through the camera of the mobile terminal, the method further comprises:
determining a gesture operation to be entered;
recognizing gesture information and a sound signal of the gesture operation to be entered, and establishing an association relation between the gesture operation to be entered and a designated program; and
entering the gesture information and the sound signal of the gesture operation to be entered, and the association relation with the designated program, into a gesture operation library of the terminal.
3. The method according to claim 2, wherein, before establishing the association relation between the gesture operation to be entered and the designated program, the method further comprises:
judging whether the similarity between the gesture information and/or sound signal of the gesture operation to be entered and the gesture information and/or sound signal of an entered gesture operation stored in the gesture operation library is smaller than a predetermined similarity threshold; and
if not, prompting that the gesture operation to be entered is an already-entered gesture operation.
4. The method according to claim 1, wherein searching the gesture operation library for a program matching the gesture operation comprises:
matching the acquired gesture information and sound signal of the gesture operation respectively against the gesture information and sound signals of entered gesture operations in the gesture operation library;
determining, according to the matching result, a gesture operation in the gesture operation library whose matching degree with the acquired gesture operation meets a predetermined condition; and
selecting a program associated with the determined gesture operation.
5. The method according to claim 4, wherein, if at least two gesture operations whose matching degrees with the acquired gesture operation meet the predetermined condition are determined, programs associated with the determined at least two gesture operations are displayed.
6. A mobile terminal, comprising:
an acquisition unit, configured to acquire, through a camera of the mobile terminal, gesture information and a sound signal of a gesture operation performed by a user;
a matching unit, configured to search a gesture operation library, according to the gesture information and the sound signal, for a program matching the gesture operation, where the gesture operation library stores the association relation between gesture operations and programs, and the program is used for realizing a specific function; and
a starting unit for starting the program;
wherein the starting unit includes:
a second determining subunit, configured to determine a program having an association relation with the matched program, and an operation state of the program having the association relation; and
a third determining subunit, configured to determine, according to the operation state of the program having the association relation, whether the current operating environment of the terminal meets a predetermined condition for starting the matched program.
7. The mobile terminal of claim 6, further comprising:
a determining unit, configured to determine a gesture operation to be entered;
a recognition unit, configured to recognize gesture information and a sound signal of the gesture operation to be entered, and to establish an association relation between the gesture operation to be entered and a designated program; and
an entry unit, configured to enter the gesture information and the sound signal of the gesture operation to be entered, and the association relation with the designated program, into a gesture operation library of the terminal.
8. The mobile terminal of claim 7, further comprising:
a judging unit, configured to judge whether the similarity between the gesture information and/or sound signal of the gesture operation to be entered and the gesture information and/or sound signal of an entered gesture operation stored in the gesture operation library is smaller than a predetermined similarity threshold; and
a prompting unit, configured to prompt, if the similarity is not smaller than the threshold, that the gesture operation to be entered is an already-entered gesture operation.
9. The mobile terminal according to claim 6, wherein the matching unit comprises:
a matching subunit, configured to match the acquired gesture information and sound signal of the gesture operation respectively against the gesture information and sound signals of entered gesture operations in the gesture operation library;
a first determining subunit, configured to determine, according to the matching result, a gesture operation in the gesture operation library whose matching degree with the acquired gesture operation meets a predetermined condition; and
a selecting subunit, configured to select a program associated with the determined gesture operation.
10. The mobile terminal of claim 9, wherein the matching unit further comprises:
and a display subunit, configured to display, if at least two gesture operations whose matching degrees with the acquired gesture operation meet the predetermined condition are determined, programs associated with the determined at least two gesture operations.
11. A mobile terminal, comprising: a memory, a processor, and a program starting program stored in the memory and executable on the processor, where the program starting program, when executed by the processor, implements the steps of the program starting method according to any one of claims 1 to 5.
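The enrollment flow of claims 2 and 3 (entering a new gesture operation only when it is sufficiently distinct from every already-entered one) can be sketched as follows. The tuple features, the `similarity` measure, and the 0.7 threshold are illustrative assumptions, not values from the patent.

```python
def similarity(a, b):
    """Toy feature similarity: fraction of matching components."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

def enroll(library, gesture, sound, program, sim_threshold=0.7):
    """Store (gesture, sound) -> program only if the gesture operation to
    be entered is distinct enough from every entered one; otherwise the
    terminal would prompt that the gesture operation is already entered."""
    for entered_gesture, entered_sound, _ in library:
        if (similarity(entered_gesture, gesture) >= sim_threshold or
                similarity(entered_sound, sound) >= sim_threshold):
            return False  # too similar: reject and prompt the user
    library.append((gesture, sound, program))
    return True

lib = []
print(enroll(lib, ("swipe", "left"), ("snap",), "camera"))   # prints True (stored)
print(enroll(lib, ("swipe", "left"), ("clap",), "gallery"))  # prints False (gesture already entered)
```

Rejecting near-duplicates at entry time keeps later lookups unambiguous, which is why claim 3 places the similarity check before the association relation is established.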
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711242523.7A CN107870674B (en) | 2017-11-30 | 2017-11-30 | Program starting method and mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107870674A CN107870674A (en) | 2018-04-03 |
CN107870674B (en) | 2021-04-13
Family
ID=61754879
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711242523.7A Active CN107870674B (en) | 2017-11-30 | 2017-11-30 | Program starting method and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107870674B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110098985A (en) * | 2018-01-29 | 2019-08-06 | 阿里巴巴集团控股有限公司 | The method and apparatus of vocal behavior detection |
CN109358747B (en) * | 2018-09-30 | 2021-11-30 | 平潭诚信智创科技有限公司 | Companion robot control method, system, mobile terminal and storage medium |
CN109887580A (en) * | 2019-02-25 | 2019-06-14 | 宁波江丰生物信息技术有限公司 | A kind of reading method of the digital slices based on gesture remote sensing |
CN111580660B (en) * | 2020-05-09 | 2022-03-18 | 清华大学 | Operation triggering method, device, equipment and readable storage medium |
CN112261765A (en) * | 2020-09-27 | 2021-01-22 | 深圳市广和通无线股份有限公司 | Light control method and device, control module, wearable device and home system |
CN113093905B (en) * | 2021-03-31 | 2024-05-07 | 惠州华阳通用电子有限公司 | Application program starting control method and device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103890695A (en) * | 2011-08-11 | 2014-06-25 | 视力移动技术有限公司 | Gesture based interface system and method |
CN103957635A (en) * | 2014-04-28 | 2014-07-30 | 梁涛 | On-off device and control realization method thereof |
CN106325481A (en) * | 2015-06-30 | 2017-01-11 | 展讯通信(天津)有限公司 | A non-contact type control system and method and a mobile terminal |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100302138A1 (en) * | 2009-05-29 | 2010-12-02 | Microsoft Corporation | Methods and systems for defining or modifying a visual representation |
- 2017-11-30: CN application CN201711242523.7A filed; granted as CN107870674B (status: Active)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107870674B (en) | Program starting method and mobile terminal | |
CN108845853B (en) | Application program starting method and mobile terminal | |
CN109078319B (en) | Game interface display method and terminal | |
CN107742072B (en) | Face recognition method and mobile terminal | |
CN108279948B (en) | Application program starting method and mobile terminal | |
CN108763316B (en) | Audio list management method and mobile terminal | |
CN108334272B (en) | Control method and mobile terminal | |
CN107728923B (en) | Operation processing method and mobile terminal | |
CN109343788B (en) | Operation control method of mobile terminal and mobile terminal | |
CN110308834B (en) | Setting method of application icon display mode and terminal | |
CN109446775A (en) | A kind of acoustic-controlled method and electronic equipment | |
CN108958593B (en) | Method for determining communication object and mobile terminal | |
CN110096203B (en) | Screenshot method and mobile terminal | |
CN109683768A (en) | A kind of operating method and mobile terminal of application | |
CN109495638B (en) | Information display method and terminal | |
WO2017215615A1 (en) | Sound effect processing method and mobile terminal | |
CN108270928B (en) | Voice recognition method and mobile terminal | |
CN109164908B (en) | Interface control method and mobile terminal | |
CN110780751A (en) | Information processing method and electronic equipment | |
WO2019101127A1 (en) | Biological identification module processing method and apparatus, and mobile terminal | |
CN107895108B (en) | Operation management method and mobile terminal | |
CN109491572B (en) | Screen capturing method of mobile terminal and mobile terminal | |
CN109164951B (en) | Mobile terminal operation method and mobile terminal | |
CN108897467B (en) | Display control method and terminal equipment | |
CN107835310B (en) | Mobile terminal setting method and mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||