WO2018113187A1 - Display control method and display device - Google Patents
Display control method and display device
- Publication number
- WO2018113187A1 (application PCT/CN2017/086116)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- frequency
- terminal
- state
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/167—Detection; Localisation; Normalisation using comparisons between temporally consecutive images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/176—Dynamic expression
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/08—Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2330/00—Aspects of power supply; Aspects of display protection and defect management
- G09G2330/02—Details of power systems and of start or stop of display operation
- G09G2330/021—Power management, e.g. power saving
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2330/00—Aspects of power supply; Aspects of display protection and defect management
- G09G2330/02—Details of power systems and of start or stop of display operation
- G09G2330/021—Power management, e.g. power saving
- G09G2330/022—Power management, e.g. power saving in absence of operation, e.g. no data being entered during a predetermined time
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
Description
- the present disclosure relates to the field of display control technologies, and in particular, to a display control method and a display device.
- the embodiments of the present application provide a display control method and a display device that can detect whether a user is present in front of the terminal by means of an added image sensor, and perform intelligent display control of the terminal and the display device according to the blink frequency calculated from the acquired images.
- an embodiment of the present application provides a display control method, where the method includes:
- acquiring, in real time, an image in an imaging area in front of the terminal;
- calculating a frequency of the user's eye blinking according to the acquired image;
- selecting a corresponding operation according to the blink frequency and a preset rule, where the preset rule is a correspondence between blink frequencies and operations; and
- controlling the terminal to perform a corresponding function according to the operation.
- an embodiment of the present application provides a display device, where the display device includes a display panel and:
- a storage module for storing program instructions; and
- a processing module electrically connected to the display panel and the storage module, and configured to invoke and execute the program instructions to perform the following steps:
- calculating a frequency of the user's eye blinking according to the acquired image;
- selecting a corresponding operation according to the blink frequency and a preset rule, where the preset rule is a correspondence between blink frequencies and operations; and
- controlling the terminal to perform a corresponding function according to the operation.
- an embodiment of the present application provides a display device, where the display device includes:
- an image acquisition module configured to acquire, in real time, an image in an imaging area in front of the terminal;
- a frequency calculation module configured to calculate a frequency of the user's eye blinking according to the acquired image;
- an operation selection module configured to select a corresponding operation according to the blink frequency and a preset rule, where the preset rule is a correspondence between blink frequencies and operations; and
- a control module configured to control the terminal to perform a corresponding function according to the operation.
- in the embodiments of the present application, the image sensor is used to detect whether a user is present, the blink frequency is calculated from the acquired images, and intelligent display control of the terminal and the display device is performed according to a preset rule. For example, if the blink frequency is in a first frequency range, a first operation is performed; if the blink frequency is in a second frequency range, a second operation is performed. This improves the intelligent control of electronic products having a display function and improves the user experience.
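The scheme summarized above can be sketched end to end as follows. This is a minimal illustration only; the state labels, range bounds, and action names are assumptions for the example, not values fixed by the publication.

```python
def control_from_eye_states(eye_states, period_minutes):
    """Sketch: count blink transitions, derive the frequency, pick an action.

    Per the embodiments, a change of eye-image state between adjacent
    frames counts as one blink, and a zero frequency suggests the user
    has fallen asleep. Range bounds and action names are illustrative.
    """
    blinks = sum(a != b for a, b in zip(eye_states, eye_states[1:]))
    freq = blinks / period_minutes          # blinks per minute
    if freq == 0:
        return "enter_standby"              # e.g. user fell asleep
    if freq < 10:                           # second range: blinking too slowly
        return "lower_volume"
    if freq <= 20:                          # first range: 10-20 per minute
        return "switch_channel"
    return "no_op"
```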
- FIG. 1 is a schematic flow chart of a display control method according to an embodiment of the present application.
- FIG. 2 is a schematic sub-flow diagram of a display control method according to a first embodiment of the present application.
- FIG. 3 is another schematic sub-flow diagram of a display control method according to a second embodiment of the present application.
- FIG. 4 is a schematic block diagram of a terminal provided by an embodiment of the present application.
- FIG. 5 is a schematic block diagram of a frequency calculation module of a terminal according to a first embodiment of the present application.
- FIG. 6 is a schematic block diagram of an operation selection module of a terminal according to the first embodiment of the present application.
- FIG. 7 is a schematic block diagram of a control module of a terminal according to a second embodiment of the present application.
- FIG. 8 is a schematic block diagram of a user of a terminal according to an embodiment of the present application.
- FIG. 9 is a schematic block diagram of a display device according to an embodiment of the present application.
- the terminal can be implemented in various forms.
- the terminals described in the embodiments of the present application include, but are not limited to, fixed terminals having a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad), such as desktop computers, LCD TVs, and digital TVs, as well as portable communication devices having a touch-sensitive surface, such as mobile phones, laptops, or tablet computers.
- the display device can be implemented in various forms.
- the display devices described in the embodiments of the present application include, but are not limited to, organic light-emitting diode displays, liquid crystal displays, plasma displays, cathode-ray tube displays, and the like.
- in the following, fixed terminals including displays and touch-sensitive surfaces are described, although the embodiments also apply to portable mobile terminals such as notebook computers.
- the terminal supports a variety of applications, such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disc burning applications, spreadsheet applications, gaming applications, telephone applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
- various applications executable on the terminal may use at least one common physical user interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface and the corresponding information displayed on the terminal may be adjusted and/or changed from one application to the next and/or within a given application. In this way, the common physical architecture of the terminal (e.g., the touch-sensitive surface) can support the various applications with user interfaces that are intuitive to the user.
- FIG. 1 is a schematic flowchart of a display control method according to an embodiment of the present application.
- the display control method is applied to a terminal having an image sensor. The image sensor can acquire, in real time, an image in the imaging area in front of the terminal, convert the optical image into an electronic signal, and transmit it to the processing module of the terminal, which then computes the corresponding data. As shown in FIG. 1, the method may include steps S11 to S14.
- S11: Acquire, in real time, an image in the imaging area in front of the terminal. Specifically, the optical image acquired in real time is converted into an electronic signal and transmitted to the processing module of the terminal for corresponding processing. For example, if the presence of a user is detected in the acquired image of the imaging area in front of the terminal and the user's eyes are blinking, this indicates that a user within the imaging area is watching the terminal.
- the usage state of the current terminal may be determined according to whether a user is present and whether the user's eyes are blinking.
- the usage status of the terminal includes a normal working state and a standby state.
- S12: Calculate the frequency of the user's eye blinking based on the acquired image. Specifically, eye images within a preset time period are extracted from the acquired images, the number of times the user blinks is counted from these eye images, and the blink frequency is calculated from the counted number of blinks and the preset time period.
- S13: Select a corresponding operation according to the blink frequency and a preset rule. Specifically, blink-frequency ranges are set for the user. A user's normal blink rate is about 15 times per minute, so the program can set the corresponding frequency ranges according to the normal blink range of the human eye: for example, a blink frequency of 10-20 times per minute is set as the first frequency range and a blink frequency of 1-10 times per minute as the second frequency range, i.e., each frequency range is set in steps of 10.
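A preset rule of this kind can be held as a table mapping blink-frequency ranges (in blinks per minute) to operations; the bounds below mirror the example ranges in the text, and the operation names are placeholders that a user could customize.

```python
# Illustrative preset rule: each blink-frequency range maps to one operation.
PRESET_RULE = [
    ((10, 20), "first_operation"),    # e.g. switch scenario or channel
    ((1, 10), "second_operation"),    # e.g. adjust volume or contrast
]

def select_operation(freq, rule=PRESET_RULE):
    """Return the operation whose frequency range contains freq, if any."""
    for (low, high), operation in rule:
        if low <= freq <= high:
            return operation
    return None
```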
- when the calculated blink frequency is in the first frequency range, the terminal performs the first operation; when the calculated blink frequency is in the second frequency range, the terminal performs the second operation.
- in some feasible embodiments, the first operation may be set according to the functions of the terminal: the first operation may be automatically switching the current application scenario or the playing channel of the terminal, or automatically adjusting the brightness; if the blink frequency is in the second frequency range, the terminal performs the second operation, which may be automatically lowering or raising the volume or adjusting the contrast.
- the preset rule may also be set by the user, who may configure the operation corresponding to each blink-frequency range according to personal preference.
- S14: Control the terminal to perform a corresponding function according to the operation. Specifically, the operation to be performed is selected according to the blink frequency and the preset rule, and the terminal is then controlled to perform the corresponding function. For example, when the user accidentally falls asleep while watching a video on the terminal, the calculated blink frequency is zero, and the terminal is controlled to enter standby mode; when the user wakes up, the blink frequency is calculated again. When the blink frequency is too slow, the terminal is controlled to lower the volume; when the blink frequency is too fast, the terminal is controlled to switch channels. This realizes intelligent control of the terminal's working state, ensures more intelligent control of the terminal display, facilitates use, improves the user experience, and saves power.
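Step S14 can be sketched as a small dispatcher from the selected operation to a terminal function. The class, attribute names, and adjustment step sizes below are illustrative assumptions, not part of the publication.

```python
class Terminal:
    """Minimal stand-in for a terminal with controllable functions."""

    def __init__(self):
        self.mode = "working"
        self.volume = 50
        self.channel = 1

    def perform(self, operation):
        # Dispatch the operation chosen from the blink frequency, as in
        # the example: zero frequency -> standby, too slow -> lower the
        # volume, too fast -> switch the channel.
        if operation == "enter_standby":
            self.mode = "standby"
        elif operation == "lower_volume":
            self.volume = max(0, self.volume - 10)
        elif operation == "switch_channel":
            self.channel += 1
```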
- the above embodiment can also set corresponding function operations according to user preferences.
- the operations are performed according to the blink frequency calculated from the acquired images and the preset rule, thereby ensuring intelligent control of the terminal display and improving the user experience. That is, the foregoing embodiment performs corresponding operations on the terminal display according to the blink-frequency range within the preset time period and the preset rule: when the calculated blink frequency is in the first frequency range, the terminal performs the first operation; when it is in the second frequency range, the terminal performs the second operation. The operation performed by the terminal is thus determined dynamically from the blink-frequency range, improving the user experience while the user watches video.
- in the above embodiment, the correspondence between blink frequency and operation may also be customized by the user.
- FIG. 2 is a schematic sub-flowchart of step S12 provided by the first embodiment of the present application.
- in step S12, the frequency of the user's eye blinking is calculated based on the acquired image.
- specifically, step S12 includes steps S21 to S25.
- in some feasible embodiments, the eye-image state in an image may be identified according to the feature information of the eye.
- for example, the pupil shape and the heterochromatic edges (iris/sclera boundary) of the user's eyes may be identified by the program: when the user's eye image is in the open-eye state, eye feature information such as the pupil shape and the heterochromatic edges can be recognized, while when the eye is closed, the corresponding eye feature information cannot be recognized.
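The open/closed decision can be made from any recognizable eye features. As one concrete stand-in (this is the widely used eye-aspect-ratio heuristic over six eye landmarks, not the publication's pupil/iris-edge test), an eye whose vertical extent collapses relative to its width is classified as closed; the landmark layout and threshold are assumptions.

```python
import math

def eye_aspect_ratio(landmarks):
    """Eye aspect ratio (EAR) over six eye landmarks p1..p6, where p1/p4
    are the horizontal corners and p2,p3 / p5,p6 the upper/lower lid."""
    p1, p2, p3, p4, p5, p6 = landmarks

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def eye_state(landmarks, threshold=0.2):
    """Open eyes keep a high EAR; closed lids drive it toward zero."""
    return "open" if eye_aspect_ratio(landmarks) > threshold else "closed"
```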
- in this way, the number of blinks within the preset time period is counted, and the blink frequency over the preset time period is calculated.
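Steps S21 to S25 reduce to counting state changes between adjacent frames and dividing by the preset period. A minimal sketch, with the state labels assumed:

```python
def count_blinks(eye_states):
    """Count one blink whenever two adjacent frames have inconsistent
    eye-image states, as the embodiment specifies."""
    return sum(1 for a, b in zip(eye_states, eye_states[1:]) if a != b)

def blink_frequency(eye_states, period_minutes):
    """Blinks per minute over the preset time period."""
    return count_blinks(eye_states) / period_minutes
```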
- FIG. 3 is another schematic sub-flow diagram of a display control method according to a second embodiment of the present application. Specifically, steps S31 to S36 are included.
- step S31: Determine whether the current usage state of the terminal is the standby state or the normal working state. If the terminal is in the standby state, proceed to step S32; if the terminal is in the normal working state, proceed to step S36. Specifically, the current usage state of the terminal includes the standby state and the normal working state.
- step S32: If the terminal is in the standby state, detect whether a user is present within a preset range of the terminal. If a user is present, proceed to step S33; if not, proceed to step S35. Specifically, whether a user is present is determined by whether the user's body image can be located in the image generated by the image sensor within the preset range of the terminal.
- in other feasible embodiments, an infrared sensor may also be disposed at the terminal, and the presence of a user entering the preset range of the terminal may be detected by the infrared sensor.
- step S33: Calculate whether the time for which the user has been present within the preset range of the terminal reaches a preset duration. If it reaches the preset duration, proceed to step S34; if not, proceed to step S35. Specifically, when the presence of a user is detected within the preset range of the terminal, the timing module starts to work. In some feasible embodiments, the preset duration may be set by the user, which prevents the terminal from reacting when a user enters the preset range and then suddenly leaves.
- step S34: Control the terminal to work normally. Specifically, when the terminal is in the normal working state, the step of acquiring, in real time, the image in the imaging area in front of the terminal is executed.
- step S35: Control the terminal to remain in standby mode.
- step S36: The terminal is currently in the normal working state. Specifically, when the terminal is in the normal working state, the step of acquiring, in real time, the image in the imaging area in front of the terminal is executed.
- in this way, the usage state of the terminal is controlled intelligently according to whether a user is present and whether the user's presence time reaches the preset duration, thereby improving the intelligent control performance of the terminal and preventing the user experience from being affected.
- when the terminal is in the standby state, the image sensor or the infrared sensor keeps working. Once the presence of a user in the imaging area in front of the terminal is detected and the preset duration is reached, the terminal is switched from the standby state to the normal working state. In other embodiments, the user may also press a button manually to enter the normal working state.
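Steps S31 to S36 amount to a small state machine: the terminal leaves standby only after continuous user presence reaches the preset duration, so a passer-by does not wake it. A sketch follows, with the class name, parameter names, and timestamp convention (`now` in seconds) assumed:

```python
class WakeController:
    """Standby-to-working transition per steps S31-S36 (sketch)."""

    def __init__(self, preset_duration=3.0):
        self.preset_duration = preset_duration  # user-settable, in seconds
        self.state = "standby"
        self._present_since = None              # when presence was first seen

    def update(self, user_present, now):
        """Feed one presence reading (from image or infrared sensor)."""
        if self.state == "working":
            return self.state
        if not user_present:
            self._present_since = None          # presence interrupted: reset timer
            return self.state
        if self._present_since is None:
            self._present_since = now           # timing module starts
        elif now - self._present_since >= self.preset_duration:
            self.state = "working"              # duration reached: wake up
        return self.state
```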
- the above embodiment ensures automatic switching of the terminal's usage state, which includes the standby state and the normal working state.
- in addition, step S12 of the first embodiment and the specific process of the second embodiment may be combined: the blink frequency may be calculated according to the method provided in the first embodiment, the frequency range in which it falls may be determined, and the corresponding operation to be performed by the terminal may thus be determined.
- FIG. 4 is a schematic block diagram of a terminal 100 according to an embodiment of the present application.
- the terminal 100 includes an image acquisition module 10, a frequency calculation module 20, an operation selection module 30, and a control module 40.
- the image acquisition module 10 is configured to acquire, in real time, the image in the imaging area in front of the terminal generated by the image sensor. Specifically, the optical image acquired in real time is converted into an electronic signal and transmitted to the processing module of the terminal for corresponding processing. For example, if the presence of a user is detected in the acquired image of the imaging area in front of the terminal and the user's eyes are blinking, this indicates that a user within the imaging area is watching the terminal.
- the usage state of the current terminal may be determined according to whether a user is present and whether the user's eyes are blinking.
- the usage status of the terminal includes a normal working state and a standby state.
- the frequency calculation module 20 is configured to calculate the frequency of the user's eye blinking based on the acquired image. Specifically, eye images within a preset time period are extracted from the acquired images, the number of times the user blinks is counted from these eye images, and the blink frequency is calculated from the counted number of blinks and the preset time period.
- the operation selection module 30 is configured to select a corresponding operation according to the blink frequency and a preset rule, where the preset rule is a correspondence between blink frequencies and operations. The operation selection module 30 includes a first operation module 32 and a second operation module 34 (as shown in FIG. 6).
- specifically, blink-frequency ranges are set for the user. A user's normal blink rate is about 15 times per minute, so the program can set the corresponding frequency ranges according to the normal blink range of the human eye: for example, a blink frequency of 10-20 times per minute is set as the first frequency range and a blink frequency of 1-10 times per minute as the second frequency range, i.e., each frequency range is set in steps of 10.
- in other feasible embodiments, the user can customize the blink-frequency ranges. The terminal then performs the operation selected according to the blink-frequency range and the preset rule: when the calculated blink frequency is in the first frequency range, the terminal performs the first operation; when it is in the second frequency range, the terminal performs the second operation.
- in some feasible embodiments, the first operation may be set according to the functions of the terminal: the first operation may be automatically switching the current application scenario or the playing channel of the terminal, or automatically adjusting the brightness; if the blink frequency is in the second frequency range, the terminal performs the second operation, which may be automatically lowering or raising the volume or adjusting the contrast.
- the preset rule may also be set by the user, who may configure the operation corresponding to each blink-frequency range according to personal preference.
- the control module 40 is configured to control the terminal to perform a corresponding function according to the operation. Specifically, the operation to be performed is selected according to the blink frequency and the preset rule, and the terminal is then controlled to perform the corresponding function. For example, when the user accidentally falls asleep while watching a video on the terminal, the calculated blink frequency is zero, and the terminal is controlled to enter standby mode; when the user wakes up, the blink frequency is calculated again. When the blink frequency is too slow, the terminal is controlled to lower the volume; when the blink frequency is too fast, the terminal is controlled to switch channels. This realizes intelligent control of the terminal's working state, ensures more intelligent control of the terminal display, facilitates use, improves the user experience, and saves power. In addition, the above embodiment can also set corresponding function operations according to user preferences.
- the operations are performed according to the blink frequency calculated from the acquired images and the preset rule, thereby ensuring intelligent control of the terminal display and improving the user experience. That is, the foregoing embodiment performs corresponding operations on the terminal display according to the blink-frequency range within the preset time period and the preset rule: when the calculated blink frequency is in the first frequency range, the terminal performs the first operation; when it is in the second frequency range, the terminal performs the second operation. The operation performed by the terminal is thus determined dynamically from the blink-frequency range, improving the user experience while the user watches video.
- in the above embodiment, the correspondence between blink frequency and operation may also be customized by the user.
- FIG. 5 is a schematic block diagram of the frequency calculation module 20 provided by the embodiment of the present application.
- the frequency calculation module 20 calculates the frequency of the user's eye blinking based on the acquired image.
- the frequency calculation module 20 includes an extraction module 22, an identification module 24, and a calculation module 26.
- the extraction module 22 is configured to extract images within a preset time period. Specifically, the preset time period may be set by the user.
- the identification module 24 is configured to identify the eye-image state in the image, where the eye-image state includes a closed-eye state and an open-eye state.
- in some feasible embodiments, the eye-image state in an image may be identified according to the feature information of the eye.
- for example, the pupil shape and the heterochromatic edges (iris/sclera boundary) of the user's eyes may be identified by the program: when the user's eye image is in the open-eye state, eye feature information such as the pupil shape and the heterochromatic edges can be recognized, while when the eye is closed, the corresponding eye feature information cannot be recognized.
- the calculation module 26 is configured to count one blink when the eye-image states of two adjacent images are inconsistent. Specifically, the eye-image state in each image is identified, and when the eye-image states of two images at adjacent time points do not match, the eye is considered to have blinked once.
- the calculation module 26 is configured to count the number of blinks within a preset time period. Specifically, the number of blinks can be counted by comparing the eye-image states in the images within the preset time period.
- the calculation module 26 is further configured to calculate the blink frequency according to the counted number of blinks and the preset time period.
- the number of eye movements in the preset time period is calculated, thereby calculating the eye movement of the preset time period. frequency.
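The counting rule above — each change of eye image state between two adjacent images counts as one blink, and the frequency follows from the count and the preset time period — can be sketched as follows (function and state names are illustrative, not from the specification):

```python
def blink_frequency(states: list[str], window_seconds: float) -> float:
    """Count state changes between adjacent frames and convert to a
    per-minute blink frequency, following the rule above: every
    open<->closed transition within the window counts as one blink.
    """
    blinks = sum(1 for prev, cur in zip(states, states[1:]) if prev != cur)
    return blinks / window_seconds * 60.0  # blinks per minute
```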
- FIG. 7 is a schematic block diagram of a control module 40 provided by an embodiment of the present application.
- the control module 40 intelligently controls the usage state of the terminal according to whether a user is present and whether the user has been present for a preset duration, thereby improving the intelligent control performance of the terminal and preventing the user experience from being degraded.
- the control module 40 includes a first determining module 42 , a detecting module 44 , and a timing module 46 .
- the first determining module 42 is configured to determine whether the terminal is currently in a standby state or a normal working state. Specifically, the usage states of the terminal include a standby state and a normal working state.
- the detecting module 44 is configured to detect, if the terminal is in the standby state, whether a user is present within a preset range of the terminal. Specifically, whether a user is present can be determined by checking whether the user's body image appears in the image generated by the image sensor within the preset range of the terminal.
- an infrared sensor may also be disposed at the terminal, and the presence of a user entering the preset range of the terminal may then be detected by the infrared sensor.
- the timing module 46 is configured to calculate whether the time for which a user has been present within the preset range of the terminal reaches a preset duration. Specifically, the timing module starts to work when a user is detected within the preset range of the terminal.
- the preset duration may be set by the user, which prevents the terminal from waking when a user enters the preset range and then leaves immediately.
- the control module 40 is configured to control the terminal to operate normally. Specifically, when the terminal is in the normal working state, the step of acquiring images in the imaging area in front of the terminal in real time is executed.
- the control module 40 is configured to control the terminal to maintain a standby mode.
- the first determining module 42 is configured to determine that the terminal is currently in the normal working state. Specifically, when the terminal is in the normal working state, the step of acquiring images in the imaging area in front of the terminal in real time is executed.
- thus, the usage state of the terminal is intelligently controlled according to whether a user is present and whether the user's presence reaches a preset duration, improving the intelligent control performance of the terminal and preventing the user experience from being degraded.
- in the standby state, the image sensor or the infrared sensor keeps working. Once the presence of a user in the imaging area in front of the terminal is detected and the preset duration is reached, the terminal is switched from the standby state to the normal working state. In other embodiments, the user can also press a start button manually to enter the normal working state.
- the above embodiment ensures automatic switching of the terminal's usage state between the standby state and the normal working state.
- the step S12 of the first embodiment and the specific process of the second embodiment may also be combined: the blink frequency may be calculated according to the method of the first embodiment, the frequency range it falls in is determined, and the operation to be performed by the terminal is thereby determined.
- FIG. 8 is a schematic block diagram of a terminal according to an embodiment of the present application, which may also be regarded as another embodiment of a display device.
- the terminal 100 described in this embodiment includes: at least one input device 200, at least one output device 300, and at least one processing module (CPU) 400, an image sensor 500, and a storage module 600.
- the input device 200, the output device 300, the processing module 400, the image sensor 500, and the storage module 600 are connected by a bus 700.
- the input device 200 can specifically be a touch panel (touch screen), a physical button, a fingerprint recognition module, or a mouse.
- the output device 300 can be specifically a display screen.
- the storage module 600 can be a high-speed RAM storage module or a non-volatile storage module (non-volatile memory), such as a disk storage module.
- the storage module 600 is configured to store a set of program codes, and the input device 200, the output device 300, and the processing module 400 are used to invoke the program code stored in the storage module 600, and perform the following operations:
- the processing module 400 is configured to:
- the working state of the terminal can be determined according to whether eyes are present and whether the eyes are blinking.
- the user's blink frequency is calculated based on the acquired images. Specifically, the eye images within a preset time period are extracted from the acquired images, the number of the user's blinks is counted from those eye images, and the blink frequency is calculated from the counted number of blinks and the preset time period.
- a corresponding operation is selected according to the blink frequency and a preset rule, the preset rule being a correspondence between blink frequencies and operations.
- ranges for the user's blink frequency are set in advance.
- for example, a typical blink frequency is around 15 times per minute,
- so the program can set the frequency ranges around the normal blink range of the human eye:
- a blink frequency of 10-20 times per minute is set as the first frequency range,
- a blink frequency of 1-10 times per minute is set as the second frequency range,
- i.e., the frequency ranges are set in steps of about 10 blinks per minute.
- if the calculated blink frequency of the user is in the first frequency range, the terminal performs the first operation;
- if the calculated blink frequency of the user is in the second frequency range, the terminal performs the second operation.
- the first operation of the terminal may be set according to the functional operations of the terminal; for example, the first operation may be automatically switching the current application scenario or the playing channel of the terminal, or automatically adjusting the brightness. If the blink frequency is in the second frequency range, the terminal performs a second operation, which may be automatically lowering or raising the volume or adjusting the contrast.
- the preset rule may also be set by the user, who may assign operations to the blink frequency ranges according to personal preference.
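Under the example ranges given above (first range 10-20 blinks per minute, second range 1-10, zero meaning the user has fallen asleep), the preset rule can be sketched as a simple mapping. The operation names below are placeholders, since the specification leaves the concrete operations configurable:

```python
def select_operation(freq_per_minute: float) -> str:
    """Map a blink frequency to an operation using the example ranges in
    the text: 0 -> standby, 1-10/min -> second operation (e.g. volume),
    10-20/min -> first operation (e.g. channel switching)."""
    if freq_per_minute == 0:
        return "enter_standby"        # no blinks detected: user asleep
    if 1 <= freq_per_minute <= 10:
        return "adjust_volume"        # placeholder for the second operation
    if 10 < freq_per_minute <= 20:
        return "switch_channel"       # placeholder for the first operation
    return "no_op"                    # outside all configured ranges
```

A user-customized preset rule would simply replace this fixed mapping with a user-editable table of ranges and operations.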
- the terminal is controlled to perform the corresponding function according to the operation. Specifically, the operation to be performed by the terminal is selected according to the blink frequency and the preset rule, and the terminal is then controlled to perform the corresponding function. For example, when the user accidentally falls asleep while watching a video on the terminal, the calculated blink frequency is zero, and the terminal is controlled to enter the standby mode; when the user wakes up, the blink frequency is calculated again. When the blink frequency is too slow, the terminal is controlled to lower the volume, and when the blink frequency is too fast, the terminal performs the channel-switching operation. This realizes intelligent control of the terminal's working state, makes the terminal display more intelligent, facilitates use, improves the user experience, and saves power.
- the above embodiment can also set corresponding operations according to user preferences.
- the blink frequency calculated from the acquired images and the preset rule together determine the operation to be performed, thereby ensuring intelligent control of the terminal display and improving the user experience. That is, the foregoing embodiment performs operations on the terminal display according to the blink frequency range within the preset time period and the preset rule: when the calculated blink frequency of the user is in the first frequency range, the terminal performs the first operation; when it is in the second frequency range, the terminal performs the second operation. The operation performed by the terminal is thus determined dynamically from the blink frequency range, improving the user experience while the user watches a video.
- the above embodiment may also allow the user to customize the correspondence between blink frequencies and operations.
- the processing module 400 calculates the user's blink frequency according to the acquired images. Specifically, the processing module 400 is configured to:
- the image within a preset time period is extracted.
- the preset time period can be set by a user.
- an eye image state in the image is identified, the eye image state including an eye-closed state and an eye-open state.
- the eye image state in the image may be identified according to feature information of the eye. For example, the pupil shape, the heterochromatic boundary (between iris and sclera), and similar features of the user's eyes may be identified programmatically; when the eye is open, such eye feature information can be recognized, and when the eye is closed, it cannot.
- if the eye image states of two adjacent images are inconsistent, one blink is counted. Specifically, the eye image state in each image is recognized, and when the eye image states of two images at adjacent time points do not match, the eyes are counted as having blinked once.
- the number of blinks can be obtained by comparing the eye image states of the images within the preset time period, so that the number of blinks within the preset time period is counted.
- the blink frequency is calculated according to the counted number of blinks and the preset time period; that is, the number of blinks within the preset time period is counted, and the blink frequency over that period is derived from it.
- the processing module 400 intelligently controls the usage state of the terminal according to whether a user is present and whether the user's presence reaches a preset duration, thereby improving the intelligent control performance of the terminal and preventing the user experience from being degraded.
- the processing module 400 is used to determine whether the terminal is currently in a standby state or a normal working state.
- the currently used state of the terminal includes a standby state and a normal working state.
- an infrared sensor may also be disposed at the terminal, and the presence of a user entering the preset range of the terminal may then be detected by the infrared sensor.
- it is determined whether the time for which a user has been present within the preset range of the terminal reaches a preset duration. Specifically, the timing starts when a user is detected within the preset range of the terminal.
- the preset duration may be set by the user, which prevents the terminal from waking when a user enters the preset range and then leaves immediately.
- the terminal is controlled to operate normally. Specifically, when the terminal is in the normal working state, the step of acquiring images in the imaging area in front of the terminal in real time is executed.
- the terminal is controlled to maintain the standby mode.
- when the terminal is currently in the normal working state, the step of acquiring images in the imaging area in front of the terminal in real time is executed.
- thus, the usage state of the terminal is intelligently controlled according to whether a user is present and whether the user's presence reaches the preset duration, improving the intelligent control performance of the terminal and preventing the user experience from being degraded.
- in the standby state, the image sensor or the infrared sensor keeps working. Once the presence of a user in the imaging area in front of the terminal is detected and the preset duration is reached, the terminal is switched from the standby state to the normal working state. In other embodiments, the user can also press a start button manually to enter the normal working state.
- the above embodiment ensures automatic switching of the terminal's usage state between the standby state and the normal working state.
- the step S12 of the first embodiment and the specific process of the second embodiment may also be combined: the blink frequency may be calculated according to the method of the first embodiment, the frequency range it falls in is determined, and the operation to be performed by the terminal is thereby determined.
- FIG. 9 is a schematic block diagram of a display device 10 according to an embodiment of the present application.
- the display device 10 described in this embodiment includes a display screen 30 for displaying images and an image sensor; the display device 10 further includes
- An image acquisition module configured to acquire an image in an imaging area in front of the terminal in real time
- a frequency calculation module configured to calculate a frequency of a user's eye movement according to the acquired image
- An operation selection module configured to select a corresponding operation according to the frequency of the eye movement and a preset rule, where the preset rule is a correspondence between a frequency of the eye movement and an operation;
- a control module configured to control the terminal to perform a corresponding function according to the operation.
- the display device 10 described in the embodiments of the present application includes, but is not limited to, a display device 10 whose display screen 30 is an organic light emitting diode display, a liquid crystal display, a plasma display, a cathode ray tube display, or the like.
- the display device 10 has an integrally formed periphery.
- the non-display area of the display device 10 has a mounting hole 20 for mounting an image sensor.
- the image sensor is disposed in the mounting hole 20.
- the image sensor includes, but is not limited to, a camera, a CCD, or a CMOS sensor; the image sensor can be used to acquire images in the imaging area in front of the display screen 30 in real time.
- the display screen 30 includes, but is not limited to, an organic light emitting diode display screen, a liquid crystal display screen, a plasma display screen, a cathode ray tube display screen, and the like.
- the display screen 30 has a TFT substrate and a color filter substrate with a liquid crystal layer disposed between them, and the display screen 30 is used for displaying images.
- the mounting hole 20 is located on the longitudinal center line of the non-display area of the display device 10.
- the image sensor may be fixed in the mounting hole 20 by silicone adhesive.
- the mounting hole 20 can also accommodate a plurality of image sensors.
- the presence or absence of the user may be detected by the image sensor, the blink frequency is calculated from the acquired images, and intelligent display control of the display device is performed according to a preset rule. For example, if the blink frequency is in the first frequency range, the display device performs the first operation, and if the blink frequency is in the second frequency range, the display device performs the second operation, thereby improving the intelligent control level of the display device and improving the user experience.
- the disclosed terminal and method may be implemented in other manners.
- the device embodiments described above are merely illustrative.
- the division of the modules is only a logical function division.
- there may be another division manner; for example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or module, or an electrical, mechanical or other form of connection.
Abstract
Description
Claims (20)
- 1. A display control method, comprising: acquiring images in an imaging area in front of a terminal in real time; calculating a user's blink frequency according to the acquired images; selecting a corresponding operation according to the blink frequency and a preset rule, the preset rule being a correspondence between blink frequencies and operations; and controlling the terminal to perform a corresponding function according to the operation.
- 2. The method according to claim 1, wherein the step of calculating the user's blink frequency according to the acquired images comprises: extracting the images within a preset time period; identifying an eye image state in the images, the eye image state including an eye-closed state and an eye-open state; counting one blink if the eye image states of two adjacent images are inconsistent; counting the number of blinks within the preset time period; and calculating the blink frequency according to the counted number and the preset time period.
- 3. The method according to claim 1, wherein the preset rule being a correspondence between blink frequencies and operations comprises: if the blink frequency is in a first frequency range, the terminal performs a first operation; and if the blink frequency is in a second frequency range, the terminal performs a second operation.
- 4. The method according to claim 1, further comprising: determining whether the terminal is currently in a standby state or a normal working state; if in the standby state, detecting whether a user is present within a preset range of the terminal; calculating whether the time for which a user has been present within the preset range of the terminal reaches a preset duration; and if the preset duration is reached, controlling the terminal to enter the normal working state and to perform the step of acquiring images in the imaging area in front of the terminal in real time.
- 5. The method according to claim 4, wherein the step of calculating the user's blink frequency according to the acquired images comprises: extracting the images within a preset time period; identifying an eye image state in the images, the eye image state including an eye-closed state and an eye-open state; counting one blink if the eye image states of two adjacent images are inconsistent; counting the number of blinks within the preset time period; and calculating the blink frequency according to the counted number and the preset time period.
- 6. The method according to claim 4, wherein the preset rule being a correspondence between blink frequencies and operations comprises: if the blink frequency is in a first frequency range, the terminal performs a first operation; and if the blink frequency is in a second frequency range, the terminal performs a second operation.
- 7. A display device, comprising: a display panel; a storage module configured to store program instructions; and a processing module connected to the display panel and the storage module and configured to invoke and execute the program instructions to perform the following steps: acquiring images in an imaging area in front of a terminal in real time; calculating a user's blink frequency according to the acquired images; selecting a corresponding operation according to the blink frequency and a preset rule, the preset rule being a correspondence between blink frequencies and operations; and controlling the terminal and the display device to perform a corresponding function according to the operation.
- 8. The display device according to claim 7, wherein, when performing the step of calculating the user's blink frequency according to the acquired images, the processing module specifically performs the following steps: extracting the images within a preset time period; identifying an eye image state in the images, the eye image state including an eye-closed state and an eye-open state; counting one blink if the eye image states of two adjacent images are inconsistent; counting the number of blinks within the preset time period; and calculating the blink frequency according to the counted number and the preset time period.
- 9. The display device according to claim 7, wherein, for the preset rule being a correspondence between blink frequencies and operations, the processing module specifically performs the following steps: if the blink frequency is in a first frequency range, the terminal performs a first operation; and if the blink frequency is in a second frequency range, the terminal performs a second operation.
- 10. The display device according to claim 7, wherein the processing module invokes and executes the program instructions to further perform the following steps: determining whether the terminal is currently in a standby state or a normal working state; if in the standby state, detecting whether a user is present within a preset range of the terminal; calculating whether the time for which a user has been present within the preset range of the terminal reaches a preset duration; and if the preset duration is reached, controlling the terminal to enter the normal working state and to perform the step of acquiring images in the imaging area in front of the terminal in real time.
- 11. The display device according to claim 10, wherein, when performing the step of calculating the user's blink frequency according to the acquired images, the processing module specifically performs the following steps: extracting the images within a preset time period; identifying an eye image state in the images, the eye image state including an eye-closed state and an eye-open state; counting one blink if the eye image states of two adjacent images are inconsistent; counting the number of blinks within the preset time period; and calculating the blink frequency according to the counted number and the preset time period.
- 12. The display device according to claim 10, wherein, for the preset rule being a correspondence between blink frequencies and operations, the processing module specifically performs the following steps: if the blink frequency is in a first frequency range, the terminal performs a first operation; and if the blink frequency is in a second frequency range, the terminal performs a second operation.
- 13. A display device, comprising: a display panel; an image acquisition module configured to acquire images in an imaging area in front of a terminal in real time; a frequency calculation module configured to calculate a user's blink frequency according to the acquired images; an operation selection module configured to select a corresponding operation according to the blink frequency and a preset rule, the preset rule being a correspondence between blink frequencies and operations; and a control module configured to control the terminal to perform a corresponding function according to the operation.
- 14. The display device according to claim 13, wherein the frequency calculation module comprises: an extraction module configured to extract the images within a preset time period; an identification module configured to identify an eye image state in the images, the eye image state including an eye-closed state and an eye-open state; and a calculation module configured to count one blink if the eye image states of two adjacent images are inconsistent, the calculation module being further configured to count the number of blinks within the preset time period and to calculate the blink frequency according to the counted number and the preset time period.
- 15. The display device according to claim 13, wherein the operation selection module comprises: a first operation module configured such that, if the blink frequency is in a first frequency range, the terminal performs a first operation; and a second operation module configured such that, if the blink frequency is in a second frequency range, the terminal performs a second operation.
- 16. The display device according to claim 13, wherein the control module comprises: a first determining module configured to determine whether the terminal is currently in a standby state or a normal working state; a detecting module configured to detect, if in the standby state, whether a user is present within a preset range of the terminal; and a timing module configured to calculate whether the time for which a user has been present within the preset range of the terminal reaches a preset duration; the control module being configured to, if the preset duration is reached, control the terminal to enter the normal working state and to perform the step of acquiring images in the imaging area in front of the terminal in real time.
- 17. The display device according to claim 13, further comprising: an identification module configured to identify an eye image state in the images, the eye image state including an eye-closed state and an eye-open state; a first determining module configured to determine whether the display device is currently in a standby state or a normal working state; a detecting module configured to detect, if in the standby state, whether a user is present within a preset range of the display device; and a timing module configured to calculate whether the time for which a user has been present within the preset range of the display device reaches a preset duration.
- 18. The display device according to claim 17, wherein the frequency calculation module comprises: an extraction module configured to extract the images within a preset time period; an identification module configured to identify an eye image state in the images, the eye image state including an eye-closed state and an eye-open state; and a calculation module configured to count one blink if the eye image states of two adjacent images are inconsistent, the calculation module being further configured to count the number of blinks within the preset time period and to calculate the blink frequency according to the counted number and the preset time period.
- 19. The display device according to claim 17, wherein the operation selection module comprises: a first operation module configured such that, if the blink frequency is in a first frequency range, the terminal performs a first operation; and a second operation module configured such that, if the blink frequency is in a second frequency range, the terminal performs a second operation.
- 20. The display device according to claim 17, wherein the control module is configured to, if the preset duration is reached, control the terminal to enter the normal working state and to perform the step of acquiring images in the imaging area in front of the terminal in real time.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/557,808 US10255874B2 (en) | 2016-12-19 | 2017-05-26 | Display controlling method and display device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611178597.4A CN106681503A (zh) | 2016-12-19 | 2016-12-19 | 一种显示控制方法、终端及显示装置 |
CN201611178597.4 | 2016-12-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018113187A1 true WO2018113187A1 (zh) | 2018-06-28 |
Family
ID=58870884
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/086116 WO2018113187A1 (zh) | 2016-12-19 | 2017-05-26 | 一种显示控制方法及显示装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10255874B2 (zh) |
CN (1) | CN106681503A (zh) |
WO (1) | WO2018113187A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112738407A (zh) * | 2021-01-06 | 2021-04-30 | 富盛科技股份有限公司 | 一种操控多摄像机的方法和装置 |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106681503A (zh) * | 2016-12-19 | 2017-05-17 | 惠科股份有限公司 | 一种显示控制方法、终端及显示装置 |
CN106681060A (zh) * | 2017-03-10 | 2017-05-17 | 惠科股份有限公司 | 一种封胶方法、封胶结构及显示装置 |
CN108932058B (zh) | 2018-06-29 | 2021-05-18 | 联想(北京)有限公司 | 显示方法、装置及电子设备 |
US10802585B2 (en) | 2018-07-12 | 2020-10-13 | Apple Inc. | Electronic devices with display operation based on eye activity |
CN110958422B (zh) * | 2018-09-25 | 2021-08-27 | 杭州萤石软件有限公司 | 一种在可见光图像中展示红外检测信息的方法和设备 |
CN110582014A (zh) * | 2019-10-17 | 2019-12-17 | 深圳创维-Rgb电子有限公司 | 电视机及其电视控制方法、控制装置和可读存储介质 |
CN110784763B (zh) * | 2019-11-07 | 2021-11-02 | 深圳创维-Rgb电子有限公司 | 显示终端控制方法、显示终端及可读存储介质 |
CN113760097A (zh) * | 2021-09-16 | 2021-12-07 | Oppo广东移动通信有限公司 | 控制音量的方法及装置、终端及计算机可读存储介质 |
CN115170075B (zh) * | 2022-07-06 | 2023-06-16 | 深圳警通人才科技有限公司 | 一种基于数字化平台技术的智慧办公系统 |
US20240235867A9 (en) * | 2022-10-21 | 2024-07-11 | Zoom Video Communications, Inc. | Automated Privacy Controls For A Schedule View Of A Shared Conference Space Digital Calendar |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101287086A (zh) * | 2007-04-10 | 2008-10-15 | 深圳Tcl新技术有限公司 | 一种实现电视机自动开关机的方法及系统 |
CN104267814A (zh) * | 2014-09-25 | 2015-01-07 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
CN106155317A (zh) * | 2016-06-29 | 2016-11-23 | 深圳市金立通信设备有限公司 | 一种终端屏幕控制方法和终端 |
CN106681503A (zh) * | 2016-12-19 | 2017-05-17 | 惠科股份有限公司 | 一种显示控制方法、终端及显示装置 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9772689B2 (en) * | 2008-03-04 | 2017-09-26 | Qualcomm Incorporated | Enhanced gesture-based image manipulation |
US8106783B2 (en) * | 2008-03-12 | 2012-01-31 | Denso Corporation | Input apparatus, remote controller and operating device for vehicle |
JP4561914B2 (ja) * | 2008-09-22 | 2010-10-13 | ソニー株式会社 | 操作入力装置、操作入力方法、プログラム |
KR101078057B1 (ko) * | 2009-09-08 | 2011-10-31 | 주식회사 팬택 | 영상인식기법을 이용한 촬영 제어 기능을 구비한 이동단말 및 영상인식기법을 이용한 촬영 제어 시스템 |
JP6106921B2 (ja) * | 2011-04-26 | 2017-04-05 | 株式会社リコー | 撮像装置、撮像方法および撮像プログラム |
KR101850034B1 (ko) * | 2012-01-06 | 2018-04-20 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
KR101850035B1 (ko) * | 2012-05-02 | 2018-04-20 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
CN102915193B (zh) * | 2012-10-24 | 2015-04-01 | 广东欧珀移动通信有限公司 | 一种网页浏览方法、装置及智能终端 |
CN103472918A (zh) * | 2013-09-12 | 2013-12-25 | 京东方科技集团股份有限公司 | 一种护眼显示装置及其操作方法 |
US10048748B2 (en) * | 2013-11-12 | 2018-08-14 | Excalibur Ip, Llc | Audio-visual interaction with user devices |
KR20150089283A (ko) * | 2014-01-27 | 2015-08-05 | 엘지전자 주식회사 | 웨어러블 단말기 및 이를 포함하는 시스템 |
KR102240632B1 (ko) * | 2014-06-10 | 2021-04-16 | 삼성디스플레이 주식회사 | 생체 효과 영상을 제공하는 전자 기기의 구동 방법 |
KR102240639B1 (ko) * | 2014-06-12 | 2021-04-15 | 엘지전자 주식회사 | 글래스 타입 단말기 및 그것의 제어 방법 |
KR102184272B1 (ko) * | 2014-06-25 | 2020-11-30 | 엘지전자 주식회사 | 글래스 타입 단말기 및 이의 제어방법 |
CN106156806A (zh) * | 2015-04-01 | 2016-11-23 | 冠捷投资有限公司 | 显示器的防止疲劳的方法 |
US20160343229A1 (en) * | 2015-05-18 | 2016-11-24 | Frank Colony | Vigilance detection method and apparatus |
KR20160138806A (ko) * | 2015-05-26 | 2016-12-06 | 엘지전자 주식회사 | 글래스타입 단말기 및 그 제어방법 |
KR20170037466A (ko) * | 2015-09-25 | 2017-04-04 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어방법 |
CN105809139A (zh) * | 2016-03-15 | 2016-07-27 | 广东欧珀移动通信有限公司 | 眼球信息的采集方法及装置 |
US20180025050A1 (en) * | 2016-07-21 | 2018-01-25 | Yen4Ken,Inc. | Methods and systems to detect disengagement of user from an ongoing |
CN106446831B (zh) * | 2016-09-24 | 2021-06-25 | 江西欧迈斯微电子有限公司 | 一种人脸识别方法及装置 |
- 2016-12-19: CN application CN201611178597.4A filed (published as CN106681503A, status: pending)
- 2017-05-26: PCT application PCT/CN2017/086116 filed (published as WO2018113187A1, application filing)
- 2017-05-26: US application US15/557,808 filed (granted as US10255874B2, status: active)
Also Published As
Publication number | Publication date |
---|---|
US20180293954A1 (en) | 2018-10-11 |
US10255874B2 (en) | 2019-04-09 |
CN106681503A (zh) | 2017-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018113187A1 (zh) | 一种显示控制方法及显示装置 | |
WO2018161578A1 (zh) | 动态调节屏幕刷新率的方法、装置、存储介质及电子设备 | |
WO2016072749A1 (en) | Electronic device, and method for analyzing face information in electronic device | |
WO2017074078A1 (en) | Method for operating electronic device and electronic device for supporting the same | |
WO2015068911A1 (en) | Mobile terminal and method of controlling the same | |
WO2018161604A1 (zh) | 移动终端的播放控制方法、装置、存储介质及电子设备 | |
WO2015109865A1 (zh) | 空调运行模式自定义控制方法及系统 | |
WO2018155893A1 (en) | Interface providing method for multitasking and electronic device implementing the same | |
WO2016167620A1 (en) | Apparatus and method for providing information via portion of display | |
WO2017105018A1 (en) | Electronic apparatus and notification displaying method for electronic apparatus | |
WO2017095033A1 (ko) | 마찰음을 이용하는 장치 및 방법 | |
WO2017039125A1 (en) | Electronic device and operating method of the same | |
WO2019143189A1 (en) | Electronic device and method of operating electronic device in virtual reality | |
WO2018161572A1 (zh) | 移动终端帧率的控制方法、装置、存储介质及电子设备 | |
WO2019076087A1 (zh) | 电视机及其显示图效控制方法、计算机可读存储介质 | |
WO2017086559A1 (en) | Image display device and operating method of the same | |
WO2016058258A1 (zh) | 终端远程控制方法和系统 | |
WO2019051902A1 (zh) | 终端控制方法、空调器及计算机可读存储介质 | |
WO2017088444A1 (zh) | 终端电量信息提示方法和装置 | |
WO2017121066A1 (zh) | 应用程序显示方法和系统 | |
WO2017201943A1 (zh) | 显示屏驱动方法及装置 | |
WO2018223602A1 (zh) | 显示终端、画面对比度提高方法及计算机可读存储介质 | |
WO2018126888A1 (zh) | 电视功能的快捷启动设置方法及装置 | |
WO2017206865A1 (zh) | 一种应用程序的关闭方法、装置、存储介质及电子设备 | |
WO2019061530A1 (zh) | 显示器自动调节亮度的方法、装置及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 15557808 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17885350 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.10.2019) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17885350 Country of ref document: EP Kind code of ref document: A1 |