CN114762588A - Sleep monitoring method and related device - Google Patents
- Publication number
- CN114762588A (application CN202110057952.7A)
- Authority
- CN
- China
- Prior art keywords
- electronic device
- user
- electronic equipment
- sleep state
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02438—Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1118—Determining activity level
- A61B5/4806—Sleep evaluation
- A61B5/4809—Sleep detection, i.e. determining whether a subject is asleep or not
- A61B5/4812—Detecting sleep stages or cycles
- A61B5/4815—Sleep quality
- A61B5/681—Wristwatch-type devices
- G06N3/04—Neural network architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones, interfacing with external accessories using two-way short-range wireless interfaces
- H04M1/72448—User interfaces with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—Adapting the functionality of the device according to context-related or environment-related conditions
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
Abstract
The application discloses a sleep monitoring method and a related device in the field of artificial intelligence. In the method, a second electronic device worn by a first user to monitor sleep quality can rely on a first electronic device to help judge whether the first user has entered a sleep state. After the second electronic device preliminarily judges that the first user has entered the sleep state, it sends the first electronic device a request to confirm whether the first user has actually done so. If the first electronic device judges that the first user is using it, it sends the second electronic device a message indicating that the first user has not entered the sleep state. Compared with relying on the second electronic device alone, the method reduces misjudgment of sleep onset and improves the accuracy of detecting when the user enters the sleep state.
Description
Technical Field
The present application relates to the field of artificial intelligence technology, and in particular to a sleep monitoring method and a related apparatus.
Background
Sleep quality is closely related to a person's physical health. More and more electronic devices (such as wristbands, watches, etc.) provide a sleep monitoring function to monitor a person's sleep quality.
Sleep monitoring needs to determine when a user enters a sleep state and when the user wakes up. Current electronic devices such as wristbands usually judge whether a user is in a sleep state by monitoring the user's motion state and heart rate variation. However, when the user maintains a fixed posture for a long time without being asleep, this method is prone to misjudging the sleep state. For example, when a user keeps a fixed posture while using a mobile phone before sleeping, the user has not entered a sleep state, yet the wristband often judges that the user has, making the sleep quality monitoring inaccurate.
Disclosure of Invention
The application provides a sleep monitoring method and a related device that judge whether a user has entered a sleep state through the cooperation of multiple electronic devices, improving the accuracy of detecting when the user enters the sleep state.
In a first aspect, the present application provides a sleep monitoring method. In the method, a first electronic device may receive a first request from a second electronic device; the two devices have a binding relationship. The first request may be sent when the second electronic device is in a wearing state and monitors first data that matches data characteristic of a user entering a sleep state. The first electronic device may then judge whether a first user wearing the second electronic device is using the first electronic device, and send back either a first judgment result (the first user is using the first electronic device) or a second judgment result (the first user is not using the first electronic device).
The first electronic device may be a mobile phone, tablet computer, laptop computer, handheld computer, ultra-mobile personal computer (UMPC), netbook, personal digital assistant (PDA), or smart glasses. The second electronic device may be a device for monitoring the sleep quality of a user, such as a wristband or watch, and can monitor the user's sleep quality when in the wearing state.
The first data is data monitored while the second electronic device is worn by the first user. It may include physiological characteristic data of the first user (for example, heart rate data) and motion data of the second electronic device (for example, acceleration data or angular velocity data).
That the first data matches data of a user entering a sleep state indicates that the second electronic device preliminarily judges that the first user has entered the sleep state. The reference data may be obtained through big-data collection and reflect the physiological characteristic data and device motion data of general users entering a sleep state while wearing the second electronic device. Alternatively, the reference data may include physiological characteristic data and device motion data recorded when the first user actually entered a sleep state while wearing the second electronic device; with such personalized data, the second electronic device can pre-judge sleep onset more accurately.
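As an illustration only (the patent specifies no concrete thresholds or data formats), the wearable's matching of monitored first data against sleep-onset reference data might be sketched as follows; the threshold values and function names here are hypothetical.

```python
from statistics import mean

# Hypothetical thresholds; the patent does not give concrete values.
RESTING_HR_MAX = 60      # bpm: heart rate typical of sleep onset
MOTION_VAR_MAX = 0.05    # g: near-stillness of the worn device

def matches_sleep_entry(heart_rates, accel_magnitudes):
    """Pre-judge sleep onset from a window of first data:
    physiological data plus motion data of the second device."""
    still = max(accel_magnitudes) - min(accel_magnitudes) < MOTION_VAR_MAX
    low_hr = mean(heart_rates) < RESTING_HR_MAX
    return still and low_hr

# A calm, motionless window pre-judges as sleep onset:
print(matches_sleep_entry([55, 54, 56], [1.00, 1.01, 1.00]))  # True
```

Note that this pre-judgment alone is exactly what misfires when the user lies still using a phone, which is why the confirmation request to the first electronic device follows.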
With reference to the first aspect, in some embodiments, if the first electronic device determines under a first condition that it is being used by a user and that the user is the first user, it obtains the first judgment result. The first condition may include one or more of: the first electronic device is in a non-static state; the first electronic device monitors a user operation within a first time period; the first electronic device monitors that human eyes are gazing at its screen; or the first electronic device is running a screen-casting application. If the first electronic device determines under the first condition that it is being used by a user who is not the first user, or that it is not being used by any user, it obtains the second judgment result.
If the first electronic device is in a non-static state, or monitors a user operation within the first time period, or detects that a user's eyes are gazing at its screen, or is running a screen-casting application, it may judge that a user is using it.
In a possible implementation, upon receiving the first request, the first electronic device may first judge whether it is in a static state. If it is in a non-static state and the user is determined to be the first user, it obtains the first judgment result. If it is in a static state, it may further judge whether a user operation was monitored within the first time period. If so, and the user is determined to be the first user, it obtains the first judgment result. If not, it may further monitor whether human eyes are gazing at its screen. If a user is gazing at the screen and is determined to be the first user, it obtains the first judgment result; if no eyes are gazing at the screen, or the gazing user is determined not to be the first user, it obtains the second judgment result.
Optionally, if no eyes are gazing at the screen, the first electronic device may further check whether it is running a screen-casting application. If such an application is running and the user is determined to be the first user, it obtains the first judgment result. If no eyes are gazing at the screen and no screen-casting application is running, it obtains the second judgment result.
In another possible implementation, upon receiving the first request, the first electronic device may first judge whether it is in a static state. If it is in a non-static state and the user is determined to be the first user, it obtains the first judgment result. If it is in a static state, it may further monitor whether human eyes are gazing at its screen. If a user is gazing at the screen and is determined to be the first user, it obtains the first judgment result; if no eyes are gazing at the screen, or the gazing user is determined not to be the first user, it obtains the second judgment result.
In another possible implementation, upon receiving the first request, the first electronic device may first judge whether a user operation was monitored within the first time period. If so, and the user is determined to be the first user, it obtains the first judgment result. If not, it may further monitor whether human eyes are gazing at its screen. If a user is gazing at the screen and is determined to be the first user, it obtains the first judgment result; if no eyes are gazing at the screen, or the gazing user is determined not to be the first user, it obtains the second judgment result.
In another possible implementation, upon receiving the first request, the first electronic device may directly monitor whether human eyes are gazing at its screen. If a user is gazing at the screen and is determined to be the first user, it obtains the first judgment result; if no eyes are gazing at the screen, or the gazing user is determined not to be the first user, it obtains the second judgment result.
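The cascaded implementations above can be condensed into one decision function. This is a hedged sketch (all parameter names are invented here): each positive usage signal yields the first judgment result only when the user is also identified as the first user, and the checks are ordered so cheaper signals are consulted first.

```python
def first_device_judgment(is_static, recent_user_op, eyes_on_screen,
                          casting_app_running, user_is_first_user):
    """Sketch of one cascade from the description: static state, then
    recent user operation within the first time period, then eye gaze,
    then a running screen-casting application. Returns True for the
    first judgment result (the first user is using the device) and
    False for the second judgment result."""
    if not is_static:
        return user_is_first_user
    if recent_user_op:
        return user_is_first_user
    if eyes_on_screen:
        return user_is_first_user
    if casting_app_running:
        return user_is_first_user
    return False  # no sign of use: second judgment result
```

For example, a phone lying perfectly still with no operations, no gaze, and no casting returns the second judgment result, so the wearable's pre-judgment of sleep onset stands.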
In some embodiments, the first electronic device may judge whether it is in a static state using an acceleration sensor, a gyroscope sensor, or the like.
In some embodiments, the user operation may be a touch operation on the screen of the first electronic device, an operation on a key of the first electronic device, input of a voice instruction, input of a mid-air gesture, or the like.
In some embodiments, the first electronic device may judge whether human eyes are gazing at its screen through an eye-gaze recognition model, which may be a neural network model. The training data may include image data of eyes gazing at a screen and image data of eyes not gazing at a screen. The trained model can recognize the features of images in which eyes gaze at a screen, and thereby judge whether human eyes are gazing at the screen of the first electronic device.
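As a stand-in for the neural-network gaze model (the patent does not disclose its architecture), the following toy perceptron over two hypothetical image features illustrates only the train-then-classify shape of the approach; a real system would use a convolutional network over camera frames.

```python
def train_gaze_model(samples, labels, epochs=20, lr=0.1):
    """Train a toy linear classifier on labeled feature pairs
    (1 = gazing at screen, 0 = not gazing). The features and the
    perceptron itself are illustrative stand-ins for the patent's
    neural-network gaze model."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred  # perceptron update on misclassification
            w = [w[0] + lr * err * x[0], w[1] + lr * err * x[1]]
            b += lr * err
    return w, b

def gazing(model, features):
    w, b = model
    return w[0] * features[0] + w[1] * features[1] + b > 0

# Hypothetical training data: [eye openness, iris-screen alignment].
model = train_gaze_model([[0.9, 0.8], [0.1, 0.2]], [1, 0])
```

Once trained, the model answers the single question the cascade needs: is someone looking at the screen right now.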
In some embodiments, the first electronic device may determine that its user is the first user by judging that a first image, collected by a camera, contains the face image of the first user. The camera may be a camera of the first electronic device or of the screen-casting device. The first electronic device may also determine whether its user is the first user by collecting other biometric information (e.g., voiceprint or fingerprint information).
With reference to the first aspect, in some embodiments, the first electronic device may establish the binding relationship with the second electronic device through Bluetooth pairing. Alternatively, the first electronic device may establish the binding relationship in response to a first user operation indicating that the owner of the first electronic device is the first user, or the two devices may establish the binding relationship by logging in to the same account.
Because the first electronic device and the second electronic device have a binding relationship, the first electronic device may assume that its owner and the user wearing the second electronic device are the same person. This reduces misjudgment of whether the wearer of the second electronic device has entered a sleep state.
According to the sleep monitoring method, when the second electronic device preliminarily determines that the first user has entered the sleep state, it may request the first electronic device to further confirm this. The first electronic device confirms by judging whether the first user is using it. Compared with monitoring with the second electronic device alone, this reduces misjudgment in scenarios where the user keeps a fixed posture for a long time without being asleep, and improves the accuracy of detecting when the user enters the sleep state.
With reference to the first aspect, in some embodiments, if the first electronic device monitors, within a second time period, a user operation for unlocking the first electronic device, it may send a first message to the second electronic device indicating that the first user is using the first electronic device. Likewise, if the first electronic device monitors, within the second time period, a user operation turning off an alarm clock, and the user turning off the alarm clock is the first user, it may send the first message to the second electronic device.
The unlocking method may use biometric information, such as face, voiceprint, or fingerprint information. When the monitored biometric information belongs to the first user, the first electronic device can determine that the unlocking user is the first user.
The second time period may start when the first electronic device sends the second judgment result to the second electronic device, or it may be a preset time period.
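A minimal sketch of the wake-side check described above, with hypothetical event names: an unlock operation or an alarm-off operation performed by the first user within the second time period triggers the first message.

```python
def should_send_first_message(event, by_first_user, within_second_period):
    """Return True if the first electronic device should send the
    first message (the first user is using the device, i.e. has
    exited sleep). Event names are illustrative assumptions."""
    if not within_second_period:
        return False
    if event == "unlock":
        # e.g. biometric unlock identifies whether it is the first user
        return by_first_user
    if event == "alarm_off":
        return by_first_user
    return False
```

On receiving the first message, the second electronic device can mark the end of the sleep session rather than waiting for motion alone, which matters when the user wakes but stays in bed.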
With this method, the second electronic device can rely on the first electronic device to judge whether the first user has exited the sleep state. This reduces misjudgment in scenarios where the first user has woken up but not yet gotten up, and improves the accuracy of sleep quality monitoring.
In a second aspect, the present application further provides a sleep monitoring method. In the method, the second electronic device monitors first data while in a wearing state; the first data matches data characteristic of a user entering a sleep state. The second electronic device sends a first request to the first electronic device, with which it has a binding relationship. If the second electronic device receives the first judgment result from the first electronic device, it determines that the first user wearing it has not entered the sleep state. The first judgment result is the result of the first electronic device, after receiving the first request, judging that the first user is using the first electronic device.
The first data is data monitored when the second electronic device is worn by the first user. The first data may include physiological characteristic data of the first user and motion data of the second electronic device. The physiological characteristic data may be, for example, heart rate data. The motion data of the second electronic device may be acceleration data or angular velocity data, for example.
That the first data is consistent with data of a user entering the sleep state indicates that the second electronic device has preliminarily judged that the first user entered the sleep state. The data of a user entering the sleep state may be obtained through big data collection, and may reflect the physiological characteristic data of general users who enter the sleep state while wearing the second electronic device, together with the motion data of the second electronic device. Alternatively, this data may include physiological characteristic data collected when the first user actually entered the sleep state while wearing the second electronic device, and the corresponding motion data of the second electronic device. By using data collected when the first user actually entered the sleep state while wearing the second electronic device, the second electronic device can pre-judge more accurately whether the first user has entered the sleep state.
In conjunction with the second aspect, in some embodiments, after determining that the first user has not entered the sleep state, the second electronic device may monitor second data while in the worn state and determine whether the second data matches the data of a user entering the sleep state. That is, the second electronic device may again pre-judge whether the first user has entered the sleep state. After receiving the first judgment result from the first electronic device, the second electronic device may perform a pre-judgment once every preset time period (e.g., 5 minutes), and, when a pre-judgment result indicates that the first user has entered the sleep state, request the first electronic device to determine whether the first user has entered the sleep state.
The second data is data of the first user, and the second data may include physiological characteristic data of the first user and motion data of the second electronic device.
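The interplay between the wearable's local pre-judgment and the phone's confirmation described above can be sketched as a small decision function on the second electronic device's side (a hypothetical helper; the action names are illustrative, and `phone_in_use` stands for the judgment result returned by the first electronic device, or `None` if no request has been answered yet):

```python
def next_action(pre_judgment_says_asleep, phone_in_use):
    # No local pre-judgment of sleep entry yet: keep monitoring sensor data.
    if not pre_judgment_says_asleep:
        return "keep_monitoring"
    # First judgment result: the first user is using the first electronic
    # device, so the pre-judgment was wrong; pre-judge again after the
    # preset time period (e.g., 5 minutes).
    if phone_in_use:
        return "retry_after_interval"
    # Second judgment result: the device is not in use; the first user is
    # determined to have entered the sleep state.
    return "mark_asleep"
```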
In conjunction with the second aspect, in some embodiments, after determining that the first user has not entered the sleep state, the second electronic device may record the monitored physiological characteristic data of the user as data that is not in the sleep state.
In combination with the second aspect, in some embodiments, the second electronic device determines that the first user enters the sleep state upon receiving the second determination result from the first electronic device. The second judgment result is a judgment result that the first electronic device judges that the first user does not use the first electronic device after receiving the first request.
Further, after determining that the first user enters the sleep state, the second electronic device may record the monitored physiological characteristic data of the user as data in the sleep state.
According to the sleep monitoring method, when it is pre-judged that the first user has entered the sleep state, the second electronic device may request the first electronic device to further confirm whether the first user has entered the sleep state. The first electronic device may confirm this by determining whether the first user is using the first electronic device. Compared with the case where the second electronic device alone monitors whether the user has entered the sleep state, this can reduce misjudgments caused by the user keeping a fixed posture for a long time while not actually being asleep, and improve the accuracy of monitoring the time at which the user enters the sleep state.
In combination with the second aspect, in some embodiments, if the second electronic device receives a first message from the first electronic device while detecting that the state of the first user is the sleep state, the second electronic device may determine that the detected state of the first user is a non-sleep state. The first message may be used to indicate that the first user is using the first electronic device.
According to the method, the second electronic device can judge whether the first user exits the sleep state or not by means of the first electronic device. This can reduce the misjudgment of whether the first user exits the sleep state after the first user wakes up but does not get up, and improve the accuracy of sleep quality monitoring.
In a third aspect, the present application further provides a sleep monitoring method. In the method, the second electronic device monitors the first data while in a worn state. The first data corresponds to data of the user entering a sleep state. The second electronic device may send the first request to the first electronic device. The first electronic device and the second electronic device have a binding relationship. The first electronic device receives a first request of a second electronic device. The first electronic device may determine whether a first user wearing the second electronic device is using the first electronic device, and send the first determination result or the second determination result to the second electronic device. The first judgment result is that the first user uses the first electronic device. The second judgment result is that the first user does not use the first electronic device. In a case where the first determination result is received, the second electronic device may determine that the first user does not enter the sleep state.
According to the sleep monitoring method, when it is pre-judged that the first user has entered the sleep state, the second electronic device may request the first electronic device to further confirm whether the first user has entered the sleep state. The first electronic device may confirm this by determining whether the first user is using the first electronic device. Compared with the case where the second electronic device alone monitors whether the user has entered the sleep state, this can reduce misjudgments caused by the user keeping a fixed posture for a long time while not actually being asleep, and improve the accuracy of monitoring the time at which the user enters the sleep state.
With reference to the third aspect, in some embodiments, if the first condition is met and the user is the first user, the first electronic device determines that it is being used by the first user, and the first electronic device obtains the first judgment result. The first condition includes one or more of the following: the first electronic device determines that it is in a non-static state; the first electronic device monitors a user operation within a first time period; the first electronic device monitors that human eyes are gazing at its screen; and the first electronic device monitors that an application program is running while its screen is on. If the first electronic device determines, under the first condition, that it is being used by a user who is not the first user, or determines that it is not being used by any user, the first electronic device obtains the second judgment result.
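The first-condition logic above can be sketched as follows (hypothetical helpers; the four boolean signals mirror the listed condition items, and the return values stand for the first and second judgment results):

```python
def first_condition_met(non_static, user_op_in_first_period,
                        eyes_on_screen, screen_on_app_running):
    # The first condition holds when one or more of the signals is true.
    return any([non_static, user_op_in_first_period,
                eyes_on_screen, screen_on_app_running])

def judge(signals, user_is_first_user):
    # First judgment result: the device is in use AND the user is the first user.
    if first_condition_met(*signals) and user_is_first_user:
        return "first_judgment_result"
    # Otherwise: in use by someone else, or not in use at all.
    return "second_judgment_result"
```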
With reference to the third aspect, in some embodiments, the method for the first electronic device to determine that the user is the first user may be: the first electronic equipment acquires a first image through the camera and judges that the first image contains a face image of a first user. The camera for acquiring the first image may be a camera of the first electronic device, or a camera of the screen projection device. The first electronic device may further determine whether the user of the first electronic device is the first user by collecting other biometric information (e.g., voiceprint information, fingerprint information, etc.).
In combination with the third aspect, in some embodiments, the second electronic device may determine that the first user enters the sleep state in a case where the second determination result is received.
With reference to the third aspect, in some embodiments, the first electronic device may establish a binding relationship with the second electronic device through bluetooth pairing. Alternatively, the first electronic device may establish a binding relationship with the second electronic device in response to the first user operation. The first user operation may be used to indicate that the owner of the first electronic device is the first user. Or the first electronic device and the second electronic device may establish a binding relationship by logging in the same account.
Since the first electronic device and the second electronic device have the above binding relationship, the first electronic device can determine that its owner and the user wearing the second electronic device are the same person. This can reduce misjudgments of whether the user wearing the second electronic device has entered the sleep state that would otherwise arise when the user of the first electronic device is not the same person as the wearer of the second electronic device.
With reference to the third aspect, in some embodiments, when the first electronic device monitors, within the second time period, a user operation of unlocking the first electronic device, the first electronic device may send a first message to the second electronic device. The first message may be used to indicate that the first user is using the first electronic device. Alternatively, when the first electronic device monitors, within the second time period, a user operation of turning off an alarm clock, and the user who turns off the alarm clock is the first user, the first electronic device may send the first message to the second electronic device.
The unlocking may be performed using biometric information, for example, face information, voiceprint information, or fingerprint information. When the monitored biometric information belongs to the first user, the first electronic device may determine that the user who performs the unlocking is the first user.
The second time period may be a time period of a first duration starting from when the first electronic device sends the second judgment result to the second electronic device. Alternatively, the second time period may be a preset time period.
According to the method, the second electronic device can judge whether the first user exits the sleep state or not by means of the first electronic device. This can reduce the misjudgment of whether the first user exits the sleep state after the first user wakes up but does not get up, and improve the accuracy of sleep quality monitoring.
In a fourth aspect, the present application further provides a sleep monitoring method. In the method, if the second electronic device receives a first message from the first electronic device while detecting that the state of the first user wearing the second electronic device is a sleep state, the second electronic device may determine that the detected state of the first user is a non-sleep state. The first electronic device and the second electronic device have a binding relationship. The first message may be sent by the first electronic device after the first electronic device monitors, within a first monitoring time period, a user operation of unlocking the first electronic device.
That the second electronic device determines, upon receiving the first message, that the detected state of the first user is the non-sleep state may specifically mean that the second electronic device changes the mark of the detected state of the first user from the sleep state to the non-sleep state. That is, upon receiving the first message, the second electronic device may determine that the first user has woken up. In addition, the second electronic device may take the time when the first message is received, or the time when the first electronic device detects the user operation of unlocking, as the time when the first user wakes up.
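The re-marking step above can be sketched as follows (a hypothetical data structure; either the receipt time of the first message or the detected unlock time may be recorded as the wake-up time):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SleepRecord:
    state: str                        # "sleep" or "non_sleep"
    wake_time: Optional[float] = None

def on_first_message(record, received_at, unlock_detected_at=None):
    # Only re-mark when the tracked state is currently "sleep".
    if record.state == "sleep":
        record.state = "non_sleep"
        # Either timestamp may serve as the wake-up time; prefer the
        # unlock-detection time when it is available.
        record.wake_time = (unlock_detected_at
                            if unlock_detected_at is not None else received_at)
    return record
```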
According to the method, the second electronic device can judge whether the first user exits the sleep state or not by means of the first electronic device. This can reduce the misjudgment of whether the first user exits the sleep state when the first user wakes up but does not get up, and improve the accuracy of sleep quality monitoring.
With reference to the fourth aspect, in some embodiments, the unlocking may be performed using biometric information, for example, face information, voiceprint information, or fingerprint information. When the monitored biometric information belongs to the first user, the first electronic device may determine that the user who performs the unlocking is the first user.
In combination with the fourth aspect, in some embodiments, the second electronic device monitors the first data while in the worn state. The first data corresponds to data of the user entering a sleep state. The second electronic device may send a first request to the first electronic device. When the second electronic device receives, from the first electronic device, a judgment result indicating that the first user is not using the first electronic device, the second electronic device may determine that the detected state of the first user is the sleep state.
With reference to the fourth aspect, in some embodiments, the first monitoring time period may be a time period when the second electronic device estimates that the first user exits from the sleep state. Alternatively, the first monitoring time period may be a fixed time period, for example, a time period from 5 am to 10 am.
With reference to the fourth aspect, in some embodiments, the first monitoring time period may be a time period of a first duration from the first electronic device sending a determination result indicating that the first user does not use the first electronic device to the second electronic device.
The first monitoring period may be the same period as the second period in the foregoing embodiment.
In combination with the fourth aspect, in some embodiments, the first electronic device may establish a binding relationship with the second electronic device through bluetooth pairing. Alternatively, the first electronic device may establish a binding relationship with the second electronic device in response to the first user operation. The first user operation may be used to indicate that the owner of the first electronic device is the first user. Or the first electronic device and the second electronic device may establish a binding relationship by logging in the same account.
Since the first electronic device and the second electronic device have the binding relationship, the first electronic device may determine that its owner and the user wearing the second electronic device are the same person. This can reduce misjudgments of whether the user wearing the second electronic device has entered the sleep state that would otherwise arise when the user of the first electronic device is not the same person as the wearer of the second electronic device.
In combination with the fourth aspect, in some embodiments, after determining that the detected state of the first user is a non-sleep state, the second electronic device may further record the monitored physiological characteristic data of the user as data that is not in a sleep state.
In a fifth aspect, the present application further provides a sleep monitoring method. In the method, the first electronic device may monitor a user operation for unlocking the first electronic device within a first monitoring period, and send a first message to the second electronic device. The first message is used to indicate that a first user wearing a second electronic device is using the first electronic device. The first electronic device and the second electronic device have a binding relationship. The second electronic device receives the first message when detecting that the state of the first user is the sleep state, and may determine that the detected state of the first user is the non-sleep state.
That the second electronic device determines, upon receiving the first message, that the detected state of the first user is the non-sleep state may specifically mean that the second electronic device changes the mark of the detected state of the first user from the sleep state to the non-sleep state. That is, upon receiving the first message, the second electronic device may determine that the first user is awake. In addition, the second electronic device may take the time when the first message is received, or the time when the first electronic device detects the user operation of unlocking, as the time when the first user wakes up.
According to the method, the second electronic device can judge whether the first user exits the sleep state or not by means of the first electronic device. This can reduce the misjudgment of whether the first user exits the sleep state after the first user wakes up but does not get up, and improve the accuracy of sleep quality monitoring.
With reference to the fifth aspect, in some embodiments, the second electronic device monitors the first data while in the worn state, the first data corresponding to data of the user entering the sleep state. The second electronic device sends a first request to the first electronic device. The first electronic device receives the first request, judges whether the first user uses the first electronic device or not, and obtains a judgment result indicating that the first user does not use the first electronic device. The first electronic device may transmit a determination result indicating that the first user does not use the first electronic device to the second electronic device. When receiving the judgment result indicating that the first user does not use the first electronic device, the second electronic device may determine that the first user enters the sleep state.
With reference to the fifth aspect, in some embodiments, the first monitoring time period may be a time period during which the second electronic device estimates that the first user exits the sleep state.
With reference to the fifth aspect, in some embodiments, the first monitoring period may be a period of a first duration from the first electronic device sending a determination result to the second electronic device indicating that the first user does not use the first electronic device.
With reference to the fifth aspect, in some embodiments, before sending the first message to the second electronic device, the first electronic device may further determine, under the first condition, that it is being used by a user and that the user is the first user. The first condition includes one or more of the following: the first electronic device determines that it is in a non-static state; the first electronic device monitors a user operation within a first time period; the first electronic device monitors that human eyes are gazing at its screen; and the first electronic device monitors that an application program is running while its screen is on.
With reference to the fifth aspect, in some embodiments, upon determining that the first user exits the sleep state, the second electronic device may record the monitored physiological characteristic data of the user as data not in the sleep state.
With reference to the fifth aspect, in some embodiments, the first electronic device may establish a binding relationship with the second electronic device through bluetooth pairing. Alternatively, the first electronic device may establish a binding relationship with the second electronic device in response to the first user operation. The first user operation may be used to indicate that the owner of the first electronic device is the first user. Or the first electronic device and the second electronic device may establish a binding relationship by logging in the same account.
Since the first electronic device and the second electronic device have the binding relationship, the first electronic device may determine that its owner and the user wearing the second electronic device are the same person. This can reduce misjudgments of whether the user wearing the second electronic device has entered the sleep state that would otherwise arise when the user of the first electronic device is not the same person as the wearer of the second electronic device.
In a sixth aspect, the present application provides an electronic device. The electronic device is a first electronic device. The first electronic device may include a camera, a communication module, a memory, and a processor. The camera may be used to capture images. The communication module may be used to establish a communication connection with a second electronic device. The memory may be used to store a computer program. The processor may be configured to invoke the computer program to cause the first electronic device to perform any of the possible implementation methods of the first aspect described above.
In a seventh aspect, the present application further provides an electronic device. The electronic device is a second electronic device. The second electronic device may include a communication module, a memory, and a processor. The communication module may be used to establish a communication connection with a first electronic device. The memory may be used to store a computer program. The processor may be configured to invoke the computer program to cause the second electronic device to perform any of the possible implementation methods of the second aspect described above or to perform any of the possible implementation methods of the fourth aspect described above.
In an eighth aspect, the present application provides a sleep monitoring system, which may include the electronic device provided in the sixth aspect and the electronic device provided in the seventh aspect.
In a ninth aspect, an embodiment of the present application provides a chip applied to an electronic device provided in the sixth aspect or an electronic device provided in the seventh aspect, where the chip includes one or more processors, and the processor is configured to invoke a computer instruction to cause the electronic device provided in the sixth aspect to perform any one of the implementation methods in the first aspect, or cause the electronic device provided in the seventh aspect to perform any one of the implementation methods in the second aspect or perform any one of the implementation methods in the fourth aspect.
In a tenth aspect, an embodiment of the present application provides a computer program product containing instructions, which, when run on an electronic device, causes the electronic device provided in the above sixth aspect to perform any one of the possible implementation methods in the first aspect, or causes the electronic device provided in the above seventh aspect to perform any one of the possible implementation methods in the second aspect or perform any one of the possible implementation methods in the fourth aspect.
In an eleventh aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device provided in the sixth aspect performs any one of the possible implementation methods in the first aspect, or the electronic device provided in the seventh aspect performs any one of the possible implementation methods in the second aspect or performs any one of the possible implementation methods in the fourth aspect.
It is to be understood that the electronic device provided by the sixth aspect, the electronic device provided by the seventh aspect, the sleep monitoring system provided by the eighth aspect, the chip provided by the ninth aspect, the computer program product provided by the tenth aspect, and the computer-readable storage medium provided by the eleventh aspect are all configured to perform the method provided by the embodiments of the present application. Therefore, the beneficial effects achieved by the method can refer to the beneficial effects in the corresponding method, and are not described herein again.
Drawings
Fig. 1 is a schematic structural diagram of a first electronic device 100 according to an embodiment of the present application;
fig. 2 is a schematic view of a sleep monitoring scenario provided in an embodiment of the present application;
fig. 3 is a flowchart of a sleep monitoring method according to an embodiment of the present application;
fig. 4 is a flowchart of another sleep monitoring method provided by an embodiment of the present application;
FIG. 5 is a flow chart of another sleep monitoring method provided by an embodiment of the present application;
fig. 6 is a flowchart of another sleep monitoring method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and in detail with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise specified, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may represent three cases: A alone, both A and B, and B alone. In addition, in the description of the embodiments of the present application, "a plurality of" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise stated, "a plurality of" means two or more.
The present application provides a sleep monitoring method in which a first electronic device and a second electronic device cooperate to monitor whether a user enters a sleep state. The sleep state may refer to the state a person's body exhibits while sleeping, and may include a falling-asleep stage, a light sleep stage, and a deep sleep stage. A user in the sleep state maintains a fixed posture for a long time, or the posture of the limbs changes only slightly. In addition, the heart rate of a user in the sleep state fluctuates around the resting heart rate. The first electronic device may be a mobile phone, a tablet computer, a laptop computer, a handheld computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or smart glasses. The second electronic device is an electronic device for monitoring the sleep quality of the user. Sleep quality monitoring may include determining the total length of time the user is in the sleep state, as well as the lengths of time spent in the falling-asleep stage, the light sleep stage, the deep sleep stage, and so on. The second electronic device may be, for example, a bracelet, a watch, or another wearable electronic device. That is, the user can monitor his or her own sleep quality by wearing the second electronic device. The embodiments of the present application do not limit the specific types of the first electronic device and the second electronic device.
In the sleep monitoring method, the second electronic device is worn by a first user. The second electronic device stores a sleep model. Based on data collected by sensors such as an acceleration sensor and a heart rate sensor, the second electronic device can use the sleep model to pre-judge whether the first user has entered the sleep state. When the pre-judgment result indicates that the first user has entered the sleep state, the second electronic device may send a request to the first electronic device to determine whether the first user has entered the sleep state. Then, the first electronic device may detect whether the first user is using the first electronic device. If it determines that the first user is using the first electronic device, the first electronic device may notify the second electronic device of this, and the second electronic device may determine from the notification that the first user has not entered the sleep state. Otherwise, the first electronic device may notify the second electronic device that the first user is not using the first electronic device, and the second electronic device may determine from the notification that the first user has entered the sleep state.
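A minimal sketch of the kind of pre-judgment the sleep model might perform, assuming the user is likely asleep when the heart rate stays near the resting heart rate and the wearable barely moves (the function, thresholds, and inputs are illustrative, not the actual model):

```python
import statistics

def pre_judge_sleep(heart_rates, accel_magnitudes, resting_hr,
                    hr_band=10.0, motion_threshold=0.05):
    # Heart rate of a sleeping user fluctuates around the resting heart rate.
    hr_near_resting = all(abs(hr - resting_hr) <= hr_band for hr in heart_rates)
    # A sleeping user keeps a fixed posture, so acceleration varies little.
    still = statistics.pstdev(accel_magnitudes) <= motion_threshold
    return hr_near_resting and still
```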
According to the method, when the second electronic device pre-judges that the first user has entered the sleep state, it can request the first electronic device to further confirm whether the first user has entered the sleep state. Compared with the case where the second electronic device alone monitors whether the user has entered the sleep state, this can reduce misjudgments caused by the user keeping a fixed posture for a long time while not actually being asleep, and improve the accuracy of monitoring the time at which the user enters the sleep state.
Fig. 1 schematically illustrates a structural diagram of a first electronic device 100 provided in an embodiment of the present application.
As shown in fig. 1, the first electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or have a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The controller may be the nerve center and command center of the first electronic device 100. The controller may generate an operation control signal based on an instruction operation code and a timing signal, to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and thereby improves system efficiency.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the first electronic device 100, and may also be used to transmit data between the first electronic device 100 and a peripheral device. It can also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
The charging management module 140 is configured to receive a charging input from a charger.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the first electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the first electronic device 100 may be used to cover a single or multiple communication bands.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the first electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the first electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the first electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the first electronic device 100 can communicate with networks and other devices through wireless communication technology.
The first electronic device 100 implements a display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. In some embodiments, the first electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The first electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the first electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the first electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The first electronic device 100 may support one or more video codecs. In this way, the first electronic device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the first electronic device 100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the first electronic device 100.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications and data processing of the first electronic device 100 by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The storage data area may store data (such as audio data, a phone book, etc.) created during the use of the first electronic device 100, and the like.
The first electronic device 100 can implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal.
The microphone 170C, also referred to as a "mic" or "mike," is used to convert sound signals into electrical signals.
The headphone interface 170D is used to connect a wired headphone.
The pressure sensor 180A is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. A capacitive pressure sensor may include at least two parallel plates made of a conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the first electronic device 100 determines the intensity of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the first electronic device 100 detects the intensity of the touch operation by using the pressure sensor 180A. The first electronic device 100 may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but have different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction for viewing an SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction for creating a new SMS message is executed.
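The threshold-based dispatch described above can be sketched as follows. This is an illustrative sketch only: the threshold value, the normalized pressure scale, and the action names are assumptions, not values from the original.

```python
# Illustrative only: same touch position, different instructions depending
# on touch intensity, as described for the pressure sensor 180A.
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized pressure scale 0..1

def dispatch_touch_on_sms_icon(pressure):
    """Return the instruction to execute for a touch on the SMS app icon."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"   # lighter press: view the SMS message
    return "new_sms"        # firmer press: create a new SMS message

print(dispatch_touch_on_sms_icon(0.2))  # view_sms
print(dispatch_touch_on_sms_icon(0.8))  # new_sms
```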
The gyro sensor 180B may be used to determine the motion attitude of the first electronic device 100. In some embodiments, the angular velocity of the first electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the first electronic device 100, calculates a distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the first electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure.
The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E may detect the magnitude of acceleration of the first electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the first electronic device 100 is at rest. The acceleration sensor 180E can also be used to recognize the posture of the first electronic device 100, and is applied in scenarios such as landscape/portrait switching and pedometers.
A distance sensor 180F for measuring a distance.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The first electronic device 100 detects infrared reflected light from a nearby object using a photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the first electronic device 100. When insufficient reflected light is detected, the first electronic device 100 may determine that there are no objects near the first electronic device 100.
The ambient light sensor 180L is used to sense the ambient light level. The first electronic device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the first electronic device 100 is in a pocket, so as to prevent accidental touch.
The fingerprint sensor 180H is used to collect a fingerprint. The first electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and the like.
The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the first electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of a bone that vibrates when a person speaks. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 180M, of the bone that vibrates when the person speaks, to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The first electronic device 100 may receive a key input, and generate a key signal input related to user setting and function control of the first electronic device 100.
The motor 191 may generate a vibration cue.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into contact with or separated from the first electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The first electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. In some embodiments, the first electronic device 100 uses an eSIM, namely, an embedded SIM card. The eSIM card may be embedded in the first electronic device 100 and cannot be separated from the first electronic device 100.
The structure of the second electronic device 200 may refer to the schematic structure diagram of the first electronic device 100 shown in fig. 1, which is not described herein again in this embodiment.
In the following embodiments of the present application, the sleep monitoring method provided by the present application is specifically described by taking the first electronic device 100 as a mobile phone and the second electronic device 200 as a bracelet.
Fig. 2 illustrates a sleep monitoring scenario according to the present application.
As shown in fig. 2, a first user wears a bracelet 200. The first user lies in the bed and uses the mobile phone 100 in a fixed position. Bracelet 200 can utilize the sleep model to prejudge that first user has got into the sleep state according to the data that sensors such as acceleration sensor, heart rate sensor gathered. The bracelet 200 may request the handset 100 to further confirm whether the first user enters a sleep state. The handset 100 may confirm whether the first user enters the sleep state by determining whether the first user is using the handset 100. If it is determined that the first user is using the mobile phone 100, the mobile phone 100 may send a message to the bracelet 200 indicating that the first user is using the mobile phone 100. Upon receiving the message, wristband 200 may determine that the first user has not entered a sleep state. If it is determined that the first user is not using the cell phone 100, the cell phone 100 may send a message to the bracelet indicating that the first user is not using the cell phone 100. Upon receiving the message, wristband 200 may determine that the first user has entered a sleep state.
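The confirmation exchange in this scenario can be summarized as a minimal message flow: the bracelet pre-judges sleep, asks the phone, and the phone replies with whether the user is using it. The function names and reply strings below are hypothetical, not part of the patent.

```python
# Illustrative sketch of the bracelet/phone confirmation exchange.
def phone_confirms(user_is_using_phone):
    # The phone's reply to the bracelet's confirmation request.
    return "in_use" if user_is_using_phone else "not_in_use"

def bracelet_decides(prejudged_asleep, phone_reply):
    # The bracelet only concludes "asleep" when its own pre-judgment
    # holds AND the phone reports that the user is not using it.
    return prejudged_asleep and phone_reply == "not_in_use"

print(bracelet_decides(True, phone_confirms(False)))  # True: asleep
print(bracelet_decides(True, phone_confirms(True)))   # False: still awake
```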
The mobile phone 100 may monitor whether the mobile phone 100 is in a static state, whether user operations acting on the mobile phone 100 are monitored within a preset time, and whether eyes of a first user watch a screen of the mobile phone 100, so as to determine whether the first user uses the mobile phone 100.
For example, the mobile phone 100 may determine whether it is in a stationary state through data collected by an acceleration sensor and a gyroscope sensor. If it is determined that the mobile phone 100 is in a non-stationary state (i.e., the posture of the mobile phone 100 changes), the mobile phone 100 may determine that a user is using the mobile phone 100. Further, the cellular phone 100 may determine whether the user using the cellular phone 100 is the first user.
If it is determined that the mobile phone 100 is in the stationary state, the mobile phone 100 may monitor whether there is a user operation on the mobile phone 100 within a predetermined time. This may reduce false positives of whether the user has entered a sleep state for scenarios in which the handset 100 is in a stationary state, but is still being used by the user. If the user operation is monitored within the preset time, the mobile phone 100 may determine that there is a user using the mobile phone 100. Further, the cellular phone 100 may determine whether the user using the cellular phone 100 is the first user. The embodiment of the present application does not limit the length of the preset time.
If the user operation is not monitored within the preset time, the mobile phone 100 may monitor whether a person looks at the screen of the mobile phone 100. This can reduce the misjudgment of whether the user enters the sleep state in a scene (for example, a scene in which the mobile phone is in the static state and plays a video) in which the mobile phone is in the static state and no user operation is performed within a preset time. If a person looks at the screen of the mobile phone 100, the mobile phone 100 can determine whether the user looking at the screen is the first user. If no human eye is watching the screen of the mobile phone 100 or the user watching the screen of the mobile phone 100 is not the first user, the mobile phone 100 may determine that the first user is not using the mobile phone 100.
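The three-step check just described can be condensed into a small decision function. The predicates are stubs for the sensor, input, and gaze checks; for brevity this sketch omits the further identity verification mentioned for the first two steps and treats any detected activity as use by the first user.

```python
# Illustrative decision cascade: stationarity -> recent input -> gaze.
def first_user_is_using_phone(phone_is_stationary, op_within_preset_time,
                              gazing_user, first_user="first_user"):
    # Step 1: posture changed => someone is operating the phone.
    if not phone_is_stationary:
        return True
    # Step 2: a touch/key/voice/gesture input arrived within the preset time.
    if op_within_preset_time:
        return True
    # Step 3: stationary and idle -- fall back to checking whose eyes
    # are gazing at the screen (None means nobody is looking).
    return gazing_user == first_user

# Phone on a stand playing video, first user watching the screen:
print(first_user_is_using_phone(True, False, "first_user"))  # True
# Phone still, no input, nobody looking:
print(first_user_is_using_phone(True, False, None))          # False
```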
The implementation method for the bracelet 200 to preliminarily determine, by using the sleep model, whether the first user enters the sleep state, and the implementation method for the mobile phone 100 to determine whether the user using the mobile phone 100 is the first user, will be specifically described in the following embodiments and are not described here.
In the embodiment of the present application, the mobile phone 100 and the bracelet 200 have a binding relationship.
In a possible implementation manner, the mobile phone 100 and the bracelet 200 may establish a binding relationship through Bluetooth pairing. The mobile phone 100 may mark the bracelet 200 as the bracelet worn by the owner of the mobile phone 100. For example, the mobile phone 100 may add an owner tag to the bracelet 200 when storing the Bluetooth address of the bracelet 200. Upon receiving a request from the bracelet 200 to determine whether the first user enters a sleep state, the mobile phone 100 may detect whether the first user is using the mobile phone 100. When receiving a request, sent by a bracelet without the owner tag, for determining whether a user enters a sleep state, the mobile phone 100 may not process the request.
In another possible implementation, in response to a first user operation, the mobile phone 100 may establish a binding relationship with the bracelet 200. This first user operation may be used to indicate that the owner of the cell phone 100 is the same user as the user wearing the bracelet 200. Specifically, the mobile phone 100 may manage a bracelet with which it establishes a communication connection. A sleep assistance function may be included in the setup options for managing the bracelet in the mobile phone 100. In response to the first user operation described above, the cellular phone 100 may turn on a sleep assist function in a setting option for managing the bracelet 200. When the sleep assisting function is turned on, the bracelet 200 may request the mobile phone 100 to further confirm whether the owner of the mobile phone 100 enters the sleep state after it is judged in advance that the user wearing the bracelet 200 enters the sleep state.
In another possible implementation, the mobile phone 100 and the bracelet 200 may establish a binding relationship by being associated with the same account (e.g., a Huawei account). That is, the account logged in on the mobile phone 100 and the account logged in on the bracelet 200 are the same account. Since the mobile phone 100 and the bracelet 200 have a binding relationship, the mobile phone 100 can assist the bracelet 200 in monitoring whether the user wearing the bracelet 200 (i.e., the first user) enters a sleep state by determining whether the owner (i.e., the first user) is using the mobile phone. This can reduce misjudgment of the sleep state caused when the user of the mobile phone 100 and the user wearing the bracelet 200 are not the same person.
The method for establishing the binding relationship between the mobile phone 100 and the bracelet 200 is not limited in the embodiment of the application.
Since the present application relates to the application of neural networks, for ease of understanding, the following description will be made with respect to terms of neural networks to which embodiments of the present application may relate.
1. Neural network
The neural network may be composed of neural units. A neural unit may be an arithmetic unit that takes x_s and an intercept of 1 as inputs, and the output of the neural unit may refer to the following formula (1):

h_{W,b}(x) = f(W^T x) = f(∑_{s=1}^{n} W_s · x_s + b)    (1)

where s = 1, 2, ..., n, n is a natural number greater than 1, W_s is the weight of x_s, and b is the bias of the neural unit. f is an activation function of the neural unit, which is used to introduce a nonlinear characteristic into the neural network to convert an input signal in the neural unit into an output signal. The output signal of the activation function may be used as the input of the next convolutional layer. The activation function may be a sigmoid function. A neural network is a network formed by connecting many such single neural units together, that is, the output of one neural unit may be the input of another neural unit. The input of each neural unit may be connected to a local receptive field of the previous layer to extract a feature of the local receptive field, and the local receptive field may be a region composed of several neural units.
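As a numeric illustration of formula (1), a single neural unit with a sigmoid activation can be computed directly. The input and weight values below are arbitrary.

```python
import math

def neural_unit(x, w, b):
    # f(sum_s W_s * x_s + b), with f the sigmoid activation function.
    z = sum(ws * xs for ws, xs in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

out = neural_unit(x=[1.0, 2.0], w=[0.5, -0.25], b=0.0)
print(round(out, 4))  # weighted sum is 0.0, and sigmoid(0.0) = 0.5
```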
2. Loss function
In the process of training a neural network, because the output of the neural network is expected to be as close as possible to the value that is actually desired to be predicted, the weight vector of each layer of the neural network can be updated according to the difference between the predicted value of the current network and the actually desired target value. (Of course, there is usually an initialization process before the first update, that is, parameters are preconfigured for each layer in the neural network.) For example, if the predicted value of the network is high, the weight vector is adjusted to make the prediction lower, and the adjustment continues until the neural network can predict the actually desired target value or a value very close to it. Therefore, it is necessary to define in advance "how to compare the difference between the predicted value and the target value". This is the purpose of the loss function (loss function) or objective function (objective function), which are important equations for measuring the difference between the predicted value and the target value. Taking the loss function as an example, a higher output value (loss) of the loss function indicates a larger difference, so training the neural network becomes a process of reducing this loss as much as possible.
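As a concrete example of a loss function, mean squared error makes the "smaller loss means a better prediction" relationship explicit:

```python
def mse_loss(predicted, target):
    # Mean squared error between predicted values and target values.
    return sum((p - t) ** 2 for p, t in zip(predicted, target)) / len(target)

bad = mse_loss([0.9, 0.1], [0.0, 1.0])   # far from the targets
good = mse_loss([0.1, 0.9], [0.0, 1.0])  # close to the targets
print(bad, good)  # 0.81 0.01: the better prediction has the smaller loss
```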
3. Back propagation algorithm
The convolutional neural network may use a back propagation (BP) algorithm to adjust the values of the parameters in the initial super-resolution model during training, so that the reconstruction error loss of the super-resolution model becomes smaller and smaller. Specifically, the input signal is propagated forward until an error loss is produced at the output, and the parameters in the initial super-resolution model are updated by propagating the error loss information backward, so that the error loss converges. The back propagation algorithm is a back propagation motion dominated by the error loss, and aims to obtain the optimal parameters of the super-resolution model, for example, a weight matrix.
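The idea of updating a parameter against the gradient of the error loss can be shown with a toy one-parameter model. This is a plain gradient-descent sketch for illustration, not the patent's actual training procedure.

```python
def train_weight(x, target, w=0.0, lr=0.1, steps=50):
    # Toy model y = w * x with squared error (y - target)^2:
    # measure the error at the output, compute its gradient with
    # respect to w, and move w against the gradient.
    for _ in range(steps):
        y = w * x
        grad = 2 * (y - target) * x   # d/dw of (w*x - target)^2
        w -= lr * grad
    return w

w = train_weight(x=2.0, target=6.0)
print(round(w, 3))  # converges toward 3.0, since 3.0 * 2.0 = 6.0
```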
A sleep monitoring method provided in the embodiments of the present application is described in detail below.
Fig. 3 is a flowchart illustrating a sleep monitoring method according to an embodiment of the present application. As shown in FIG. 3, the method may include steps S101-S108. Wherein:
S101, the bracelet 200 preliminarily determines, by using the sleep model, that the first user enters a sleep state.
The sleep model may be a trained neural network model.
In some embodiments, the bracelet 200 may collect acceleration data in real time through an acceleration sensor, and collect heart rate data of the first user in real time through a heart rate sensor. Based on the acceleration data and the heart rate data, the bracelet 200 may use the sleep model to preliminarily determine whether the first user enters a sleep state. When the user is in a sleep state, the posture of the bracelet 200 usually changes only slightly, or even remains unchanged for a period of time. The posture of the bracelet 200 also differs between the state in which it is worn on a motionless user and the state in which it is placed on a table. Moreover, the heart rate typically drops gradually after the user enters a sleep state. Therefore, by combining the acceleration data and the heart rate data it detects, the bracelet 200 can preliminarily determine whether the user is asleep.
The training data used to train the sleep model may include acceleration data and heart rate data of bracelet 200 when the user is actually in a sleep state and acceleration data and heart rate data of bracelet 200 when the user is actually in a non-sleep state. These data may be obtained by big data collection. That is, the data may be acceleration data and heart rate data in a sleep state and acceleration data and heart rate data in a non-sleep state of a general user when wearing the bracelet 200. Alternatively, the data may be acceleration data and heart rate data in a sleep state and acceleration data and heart rate data in a non-sleep state of the first user while wearing the bracelet 200. The sleep model obtained by training the data of the first user can be used for better pre-judging whether the first user enters the sleep state or not.
The trained sleep model can identify the characteristics of the acceleration data and the heart rate data of the bracelet 200 when the first user is actually in the sleep state, so as to pre-judge whether the first user enters the sleep state. That is to say, under the condition that the acceleration data monitored by the bracelet 200 is consistent with the acceleration data of the user in the sleep state, and the monitored heart rate data is consistent with the heart rate data of the user in the sleep state, the bracelet 200 may pre-determine that the first user enters the sleep state.
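A hand-rolled stand-in for the trained sleep model illustrates the two cues described above: little posture change (low acceleration variance) and a gradually declining heart rate. The thresholds and sample values are invented for illustration; the actual model is a trained neural network, not these rules.

```python
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def prejudge_asleep(accel, heart_rate, accel_var_max=0.01, hr_drop_min=3.0):
    # Cue 1: the wrist is nearly motionless over the window.
    still = variance(accel) < accel_var_max
    # Cue 2: heart rate has drifted down over the window.
    hr_declining = heart_rate[0] - heart_rate[-1] >= hr_drop_min
    return still and hr_declining

accel = [9.81, 9.80, 9.81, 9.80]   # acceleration magnitude samples (m/s^2)
hr = [72, 70, 67, 64]              # heart rate samples (bpm), drifting down
print(prejudge_asleep(accel, hr))  # True
```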
In some embodiments, the bracelet 200 may also utilize a sleep model to pre-determine whether the first user enters a sleep state based on angular velocity data collected by a gyroscope sensor, ambient light data collected by an ambient light sensor, ambient sound data collected by a microphone, and the like. That is, the training data of the sleep model may further include angular velocity data, ambient light data, ambient sound data, and the like.
The embodiment of the application does not limit the training data for training the sleep model and the method for training the sleep model. The method for the bracelet 200 to pre-determine whether the first user enters the sleep state may further refer to a method for determining whether the first user enters the sleep state by using electronic devices such as a bracelet in the prior art.
S102, the bracelet 200 sends a request to the mobile phone 100 to confirm whether the first user enters the sleep state.
S103, the mobile phone 100 determines whether it is in a static state.
The mobile phone 100 being in a non-stationary state may indicate that a user is using the mobile phone 100. The mobile phone 100 being in a stationary state, however, does not directly indicate that no user is using the mobile phone 100. For example, in a scenario where the user uses the mobile phone 100 by placing it in a phone cradle, the mobile phone 100 is in a stationary state but is still in use.
When receiving the request from the bracelet 200 to determine whether the first user enters the sleep state, the mobile phone 100 may determine whether the mobile phone 100 is in a stationary state through acceleration data collected by the acceleration sensor.
If it is determined that the mobile phone 100 is in the static state, the mobile phone 100 may perform the following step S104.
If it is determined that the mobile phone 100 is in the non-stationary state, the mobile phone 100 may perform the following step S106.
The embodiment of the present application does not limit the method for determining whether the mobile phone 100 is in a stationary state. For example, the mobile phone 100 may determine whether it is in a stationary state through the angular velocity data collected by the gyroscope sensor.
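Step S103 can be sketched as a simple stationarity test over recent accelerometer samples. The window of three samples and the deviation threshold below are assumptions for illustration.

```python
def is_stationary(samples, threshold=0.05):
    """samples: list of (ax, ay, az) accelerometer readings in m/s^2."""
    mags = [(ax * ax + ay * ay + az * az) ** 0.5 for ax, ay, az in samples]
    mean = sum(mags) / len(mags)
    # Small deviation of magnitude from its mean => no posture change.
    return max(abs(m - mean) for m in mags) < threshold

still = [(0.0, 0.0, 9.81), (0.01, 0.0, 9.80), (0.0, 0.01, 9.81)]
moving = [(0.0, 0.0, 9.81), (3.0, 1.0, 8.0), (0.5, 2.0, 9.0)]
print(is_stationary(still))   # True
print(is_stationary(moving))  # False
```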
The mobile phone 100 monitors whether the first user enters the sleep state by judging whether the mobile phone 100 is in a static state; this can reduce misjudgment of whether the first user enters the sleep state in scenarios where the mobile phone 100 is still being used by the first user.
S104, the mobile phone 100 determines whether the user operation is monitored within a preset time.
The monitoring of a user operation by the mobile phone 100 within the preset time may indicate that a user is using the mobile phone 100. The absence of a monitored user operation within the preset time, however, cannot directly indicate that no user is using the mobile phone 100. For example, if the preset time is 10 minutes and the user places the mobile phone 100 on a phone holder to watch a 30-minute video, the mobile phone 100 may receive no user operation during video playback but is still in use.
The user operation may be, for example, a touch operation applied to the screen of the cellular phone 100, a user operation applied to a key of the cellular phone 100, an input operation of a voice command, an input operation of a space gesture, or the like. The embodiment of the present application does not limit the specific type of the user operation.
If the user operation is not monitored within the preset time, the mobile phone 100 may perform the following step S105.
If the user operation is monitored within the preset time, the mobile phone 100 may perform the following step S106.
The embodiment of the present application does not limit the length of the preset time.
Under the condition that the mobile phone 100 is in the static state, the mobile phone 100 monitors whether the first user enters the sleep state by judging whether the user operation is monitored within the preset time, so that the misjudgment that whether the first user enters the sleep state in a scene (for example, a scene in which the mobile phone is in the static state and a video is played) in which the mobile phone 100 is in the static state and the user operation is not available within the preset time can be reduced.
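The preset-time check in S104 can be sketched as an idle timer: the phone records the timestamp of each user operation (touch, key, voice command, space gesture) and compares the elapsed time against the preset duration. The class name and the 10-minute value below are illustrative assumptions, not part of this application.

```python
import time

PRESET_IDLE_SECONDS = 10 * 60  # illustrative: a 10-minute preset time

class OperationMonitor:
    """Tracks the most recent user operation of any type."""

    def __init__(self, now=None):
        self.last_operation = now if now is not None else time.monotonic()

    def record_operation(self, now=None):
        # Called whenever a touch, key press, voice command, or gesture arrives.
        self.last_operation = now if now is not None else time.monotonic()

    def operation_within_preset_time(self, now=None):
        """True if any user operation was monitored within the preset time."""
        now = now if now is not None else time.monotonic()
        return (now - self.last_operation) < PRESET_IDLE_SECONDS
```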
S105, the mobile phone 100 determines whether a person looks at the screen.
The cell phone 100 can capture images through a front-facing camera. Based on the above images, the mobile phone 100 can determine whether a human eye gazes at the screen by using the human eye gazing recognition model.
The human eye gaze recognition model may be a neural network model.
The training data for training the human eye gaze recognition model may include image data of a human eye gaze screen and image data of a human eye non-gaze screen. The trained human eye gaze recognition model can recognize the characteristics of the image of the human eye gaze screen, so as to judge whether the human eye gazes at the screen of the mobile phone 100. The method for training the eye gaze recognition model may refer to a training method of a neural network model in the prior art, which is not described herein again.
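At inference time, the gaze check reduces to thresholding the model's output score. The sketch below assumes the trained human eye gaze recognition model is available as a callable returning a gaze probability for an image; the dummy model and the 0.5 threshold are stand-ins for illustration only, not the model described in this application.

```python
GAZE_THRESHOLD = 0.5  # illustrative decision threshold

def eyes_on_screen(image, gaze_model, threshold=GAZE_THRESHOLD):
    """Return True when the gaze-recognition model judges that human
    eyes are gazing at the screen in the given front-camera image."""
    return gaze_model(image) >= threshold

def dummy_gaze_model(image):
    # Stand-in for a trained neural network: any callable mapping an
    # image to a probability in [0, 1] can be plugged in. This toy
    # heuristic uses mean pixel brightness as a fake proxy score.
    return sum(image) / (255.0 * len(image))
```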
In some embodiments, the front camera for capturing images may be a low power consumption camera. Such as an infrared camera. The low-power-consumption camera can be in a working state in real time. The embodiment of the present application does not limit the type of the front camera.
In some embodiments, the mobile phone 100 may turn on the front camera to capture an image when it is determined that there is no user operation on the mobile phone 100 within a preset time period. Alternatively, the mobile phone 100 may turn on the front camera when receiving a request from the bracelet 200 to confirm whether the first user enters the sleep state. The embodiment of the present application does not limit the time when the front camera is turned on by the mobile phone 100.
If it is recognized that the eyes are looking at the screen, the mobile phone 100 may perform the following step S106.
If it is recognized that no human eyes are looking at the screen, the mobile phone 100 may perform the following step S108.
The mobile phone 100 monitors whether the first user enters the sleep state by judging whether a person looks at the screen, so that misjudgment of whether the first user enters the sleep state in a scene (for example, a scene in which the mobile phone plays a video in the static state but the first user falls asleep in the video playing process) that the mobile phone 100 is in the static state and has no user operation within a preset time can be reduced.
S106, the mobile phone 100 determines whether the user of the mobile phone 100 is the first user.
In any of the following cases, the mobile phone 100 can determine that a user is using it: the mobile phone 100 determines that it is in a non-stationary state; the mobile phone 100 determines that it is in a stationary state but a user operation is monitored within the preset time; or the mobile phone 100 determines that it is in a stationary state and no user operation is monitored within the preset time, but human eyes are recognized to be gazing at the screen. Further, the mobile phone 100 can determine whether that user is the first user.
In one possible implementation, the mobile phone 100 can determine whether the user is the first user through face recognition.
Wherein, the bracelet 200 and the mobile phone 100 have a binding relationship. The first user wearing bracelet 200 is the owner of cell phone 100. The mobile phone 100 may compare the face image collected by the front camera with the face image of the owner stored in the mobile phone 100 to determine whether the user of the mobile phone 100 is the first user. The stored face image of the owner of the mobile phone 100 may be a face image used for face recognition to unlock the mobile phone 100. The embodiment of the present application does not limit the method for the mobile phone 100 to compare whether the face image acquired by the front camera and the face image of the owner stored in the mobile phone 100 are the face images of the same user.
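One common way to compare two face images — an assumption for illustration, since this application does not limit the comparison method — is to compare their feature embeddings by cosine similarity against a match threshold:

```python
import math

MATCH_THRESHOLD = 0.8  # illustrative cosine-similarity threshold

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_owner(captured_embedding, stored_owner_embedding):
    """Compare the face embedding from the front camera with the stored
    owner embedding (e.g., the one enrolled for face unlock)."""
    similarity = cosine_similarity(captured_embedding, stored_owner_embedding)
    return similarity >= MATCH_THRESHOLD
```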
Optionally, the mobile phone 100 may further determine whether the user of the mobile phone 100 is the first user by using a biometric identification method such as voiceprint identification and fingerprint identification. The embodiment of the present application does not limit the specific method for determining whether the user of the mobile phone 100 is the first user.
If the user of the mobile phone 100 is determined to be the first user, the mobile phone 100 may perform the following step S107.
If the user of the mobile phone 100 is determined not to be the first user, the mobile phone 100 can perform the following step S108.
Under the condition that the mobile phone 100 is being used, the mobile phone 100 detects whether the first user enters the sleep state by identifying whether the user is the first user, so that misjudgment of whether the first user enters the sleep state can be reduced in the case where the user of the mobile phone 100 and the first user wearing the bracelet 200 are not the same person.
S107, the mobile phone 100 sends the first determination result to the bracelet 200, indicating that the first user is using the mobile phone 100.
If the mobile phone 100 is determined to be used by the user and the user is the first user, the mobile phone 100 may determine that the first user is using the mobile phone 100. Then, the mobile phone 100 may send the first determination result to the bracelet 200, indicating that the first user is using the mobile phone 100.
In some embodiments, after receiving the first determination result from the mobile phone 100, the bracelet 200 may use the sleep model again to pre-determine whether the first user enters the sleep state, and request the mobile phone 100 to confirm the pre-determination. After it is determined that the first user is not in the sleep state, the bracelet 200 may pre-determine whether the first user enters the sleep state by using the sleep model once every preset duration (e.g., 5 minutes).
In some embodiments, after receiving the first determination result from the mobile phone 100, the bracelet 200 may record the monitored data (e.g., heart rate data) as data of the first user in the non-sleep state.
S108, the mobile phone 100 sends the second determination result to the bracelet 200, indicating that the first user does not use the mobile phone 100.
If it is determined that no person is looking at the screen of the mobile phone 100 or if it is determined that the mobile phone 100 is used by the user but the user is not the first user, the mobile phone 100 may determine that the first user is not using the mobile phone 100. Then, the mobile phone 100 may send a second determination result to the bracelet 200, indicating that the first user does not use the mobile phone 100.
When receiving the second determination result from the mobile phone 100, the bracelet 200 may determine the time when the second determination result is received as the time when the first user enters the sleep state. Alternatively, the bracelet 200 may use the time when it pre-determined, by the sleep model, that the first user entered the sleep state. The embodiment of the present application does not specifically limit how the bracelet 200 determines the time when the first user enters the sleep state. The bracelet 200 may record data (e.g., heart rate data) monitored after that moment as data of the first user in the sleep state.
As can be seen from the sleep monitoring method shown in fig. 3, the bracelet 200 may utilize the mobile phone 100 to confirm whether the first user enters the sleep state. Compared with monitoring with the bracelet 200 alone, this may reduce the misjudgment caused by the first user maintaining a fixed posture for a long time while not being in the sleep state, and improve the accuracy of monitoring the time when the first user enters the sleep state. Thus, the bracelet 200 can improve the accuracy of sleep quality monitoring.
In addition, after determining that the mobile phone 100 is in the non-stationary state, the mobile phone 100 may directly determine whether the user of the mobile phone 100 is the first user. In this way, the cellular phone 100 may not perform the steps S104 and S105, thereby saving power consumption of the cellular phone 100. After the mobile phone 100 determines that the user operation is not monitored within the preset time, it may be directly determined whether the user of the mobile phone 100 is the first user. In this way, the cellular phone 100 may not need to perform step S105, thereby saving power consumption of the cellular phone 100.
In some embodiments, the execution order of the steps S103 and S104 may be reversed. That is, after receiving the request from the bracelet 200 to confirm whether the first user enters the sleep state, the mobile phone 100 may first determine whether there is a user operation within a preset time. If it is determined that the user operation is performed within the preset time, the mobile phone 100 may execute step S106. If it is determined that there is no user operation within the preset time, the mobile phone 100 may further determine whether it is in a static state. If it is determined that the mobile phone 100 is in the static state, the mobile phone 100 can perform step S105. If the mobile phone 100 determines that it is in the non-stationary state, the step S106 can be executed.
In other embodiments, after receiving the request from the bracelet 200 to confirm whether the first user enters the sleep state, the mobile phone 100 may simultaneously perform the above steps S103, S104, and S105 to determine whether it is being used by a user. If it is determined that the mobile phone 100 is being used, the mobile phone 100 may further perform step S106 to determine whether the user is the first user. Otherwise, the mobile phone 100 may perform step S108 to send the second determination result to the bracelet 200, indicating that the first user does not use the mobile phone 100.
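The S103-S108 decision flow of fig. 3 can be summarized in a short sketch. The helper names and the test double below are illustrative assumptions; each predicate stands for the corresponding check described above (stationary state, preset-time operation, eye gaze, and user identification).

```python
def confirm_for_bracelet(phone):
    """Decide which determination result the phone sends to the bracelet,
    following the S103 -> S104 -> S105 -> S106 ordering of fig. 3."""
    in_use = (
        not phone.is_stationary()                 # S103: moving -> in use
        or phone.operation_within_preset_time()   # S104: recent user operation
        or phone.eyes_on_screen()                 # S105: eyes gazing at screen
    )
    if in_use and phone.user_is_first_user():     # S106: identify the user
        return "first"    # S107: first user is using the phone
    return "second"       # S108: first user is not using the phone

class FakePhone:
    """Test double standing in for the sensor and recognition checks."""
    def __init__(self, stationary, operated, gazed, first_user):
        self._vals = (stationary, operated, gazed, first_user)
    def is_stationary(self): return self._vals[0]
    def operation_within_preset_time(self): return self._vals[1]
    def eyes_on_screen(self): return self._vals[2]
    def user_is_first_user(self): return self._vals[3]
```

The variants of fig. 4 to fig. 6 reorder or drop the first three predicates; the short-circuit `or` above also reflects the power-saving note: once one check indicates use, the later checks are skipped.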
Fig. 4 is a flowchart illustrating another sleep monitoring method provided in an embodiment of the present application.
As shown in fig. 4, the method may include steps S201 to S207. Wherein:
S201, the bracelet 200 pre-determines that the first user enters a sleep state by using a sleep model.
S202, the bracelet 200 sends a request to the mobile phone 100 to confirm whether the first user enters the sleep state.
The above steps S201 and S202 can refer to steps S101 and S102, respectively, in the method shown in fig. 3.
S203, the mobile phone 100 determines whether it is in a stationary state.
The method for the mobile phone 100 to determine whether it is in a static state may refer to step S103 in the method shown in fig. 3.
When determining that the mobile phone 100 is in the static state, the mobile phone 100 may execute step S204. That is, the cellular phone 100 can determine whether a person looks at the screen.
When it is determined that the mobile phone is in the non-stationary state, the mobile phone 100 may execute step S205.
S204, the mobile phone 100 determines whether a person looks at the screen.
S205, the mobile phone 100 determines whether the user of the mobile phone 100 is the first user.
S206, the mobile phone 100 sends the first determination result to the bracelet 200, indicating that the first user is using the mobile phone 100.
S207, the mobile phone 100 sends the second determination result to the bracelet 200, indicating that the first user does not use the mobile phone 100.
The steps S204 to S207 can refer to the steps S105 to S108 in the method shown in fig. 3, and are not described herein again.
Fig. 5 is a flowchart illustrating another sleep monitoring method provided in an embodiment of the present application.
As shown in fig. 5, the method may include steps S301 to S307. Wherein:
S301, the bracelet 200 pre-determines that the first user enters a sleep state by using a sleep model.
S302, the bracelet 200 sends a request to the mobile phone 100 to confirm whether the first user enters the sleep state.
The above steps S301 and S302 can refer to steps S101 and S102, respectively, in the method shown in fig. 3.
S303, the mobile phone 100 determines whether the user operation is monitored within a preset time.
The method for determining whether a user operation is monitored within the preset time by the mobile phone 100 may refer to step S104 in the method shown in fig. 3.
When it is determined that the user operation is not monitored within the preset time, the mobile phone 100 may perform step S304. That is, the cellular phone 100 can determine whether a person looks at the screen.
When it is determined that the user operation is monitored within the preset time, the mobile phone 100 may perform step S305.
S304, the mobile phone 100 determines whether a person looks at the screen.
S305, the mobile phone 100 determines whether the user of the mobile phone 100 is the first user.
S306, the mobile phone 100 sends the first determination result to the bracelet 200, indicating that the first user is using the mobile phone 100.
S307, the mobile phone 100 sends the second determination result to the bracelet 200, indicating that the first user does not use the mobile phone 100.
The above steps S304 to S307 can refer to steps S105 to S108 in the method shown in fig. 3, and are not described herein again.
Fig. 6 is a flowchart illustrating another sleep monitoring method provided in an embodiment of the present application.
As shown in fig. 6, the method may include steps S401 to S406. Wherein:
S401, the bracelet 200 pre-determines that the first user enters a sleep state by using a sleep model.
S402, the bracelet 200 sends a request to the mobile phone 100 to confirm whether the first user enters the sleep state.
The above steps S401 and S402 can refer to steps S101 and S102, respectively, in the method shown in fig. 3.
S403, the mobile phone 100 determines whether a person looks at the screen.
S404, the mobile phone 100 determines whether the user of the mobile phone 100 is the first user.
S405, the mobile phone 100 sends the first determination result to the bracelet 200, indicating that the first user is using the mobile phone 100.
S406, the mobile phone 100 sends the second determination result to the bracelet 200, indicating that the first user does not use the mobile phone 100.
The steps S403 to S406 may refer to the steps S105 to S108 in the method shown in fig. 3, and are not described herein again.
In some embodiments, upon receiving the request from the bracelet 200 to confirm whether the first user enters the sleep state, the mobile phone 100 may monitor whether an application program for screen projection is running on itself. A screen-projection application running on the mobile phone 100 may indicate that a user is using the mobile phone 100. When it is monitored that such an application program is running, the mobile phone 100 may detect whether the user of the mobile phone 100 is the first user. In one possible implementation, the mobile phone 100 may send a message for acquiring an image to a screen projection device, such as a television. When receiving the message for acquiring an image, the screen projection device may acquire an image of the area from which it is watched and send the image to the mobile phone 100. The mobile phone 100 can then determine whether the image from the screen projection device contains the face image of the first user.
The image acquired by the screen projection device contains the face image of the first user, which can indicate that the first user projects a screen through the mobile phone 100 and watches the content played on the screen projection device. That is, the first user does not enter the sleep state. When it is determined that the image acquired by the screen projection device includes the face image of the first user, the mobile phone 100 may send the first determination result in the foregoing embodiment to the bracelet 200, indicating that the first user is using the mobile phone 100.
The fact that the face image of the first user is not included in the image acquired by the screen projection device may indicate that the first user is not included in the users who use the mobile phone 100 to project a screen and view the content played on the screen projection device. When it is determined that the image acquired by the screen projection device does not include the face image of the first user, the mobile phone 100 may send the second determination result in the foregoing embodiment to the bracelet 200, indicating that the first user does not use the mobile phone 100.
In an application scenario where screen projection is enabled on the mobile phone 100, the mobile phone 100 may monitor that it is in a static state, that there is no user operation within the preset time, and that no human eyes gaze at its screen, while the first user has not gone to sleep but is watching the screen projection device. The above method monitors whether the mobile phone 100 runs a screen-projection application program and, when one is running, monitors whether the first user enters the sleep state by means of the image collected by the screen projection device. This may reduce the misjudgment of whether the first user enters the sleep state in the case where the mobile phone 100 is in a stationary state with no user operation and no eye gaze within the preset time, but is still being used by the first user.
In some embodiments, upon receiving a request from the bracelet 200 to determine whether the first user enters a sleep state, the cell phone 100 may request other electronic devices with an image capture device (e.g., a camera), such as a television, to capture images. When the image captured by the electronic device having the image capturing apparatus is obtained, the mobile phone 100 may determine whether the image includes the first user and determine the state of the first user. In this way, the mobile phone 100 can monitor whether the first user enters a sleep state, and send the monitoring result to the bracelet 200.
Illustratively, the mobile phone 100 may send a message for acquiring an image to the television. The television may send images captured by its camera to the mobile phone 100. The presence of the face image of the first user in the image acquired by the television can indicate that the first user is watching the television, that is, the first user has not entered the sleep state. If the mobile phone 100 determines that the image from the television includes the face image of the first user, the mobile phone 100 may send the first determination result in the foregoing embodiment to the bracelet 200, indicating that the first user is using the mobile phone 100.
The embodiment of the present application does not limit the manner in which the mobile phone 100 establishes a communication connection with an electronic device having an image capturing device, such as a television. The communication connection mode can be, for example, a bluetooth connection, a Wi-Fi network connection, and the like.
In some embodiments, bracelet 200 may establish a communication connection with other electronic devices having image capture devices. When it is determined that the first user enters the sleep state, the bracelet 200 may send a message of acquiring an image to an electronic device having an image acquisition apparatus. The electronic device having the image capturing apparatus may transmit the captured image to the wristband 200. The bracelet 200 may determine the state of the first user from the images to determine whether the first user enters a sleep state. Alternatively, the bracelet 200 may transmit the images to an electronic device with stronger processing capability, such as the mobile phone 100, among electronic devices with which the bracelet establishes a communication connection. The mobile phone 100 may determine the state of the first user according to the received image, thereby determining whether the first user enters a sleep state.
The embodiment of the present application does not limit the manner in which the bracelet 200 establishes a communication connection with an electronic device having an image capturing apparatus.
In some embodiments, the bracelet 200 may determine whether the first user exits the sleep state (i.e., the first user wakes up) with the cell phone 100.
Specifically, the mobile phone 100 may send a message to the bracelet 200 for instructing the first user to exit the sleep state when the user operation for unlocking the mobile phone 100 is monitored for the first time within the second time period. When receiving the message indicating that the first user exits from the sleep state, the bracelet 200 may determine the time when the user operation for unlocking the mobile phone 100 is monitored as the time when the first user exits from the sleep state. In combination with the time when the first user enters the sleep state, which is determined by the sleep monitoring method in the foregoing embodiment, the bracelet 200 may determine the total duration of the sleep state of the first user, and evaluate the sleep quality of the first user in the time period from entering the sleep state to exiting the sleep state.
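With the entry time recorded when the second determination result was received and the exit time reported by the mobile phone 100 (e.g., at the first unlock operation), computing the total sleep duration is a simple subtraction. The timestamps below are illustrative only:

```python
from datetime import datetime, timedelta

def sleep_duration(entered_at, exited_at):
    """Total duration of the sleep state, from the moment the bracelet
    recorded that the first user entered the sleep state to the moment
    the phone reported that the first user exited it."""
    if exited_at < entered_at:
        raise ValueError("exit time precedes entry time")
    return exited_at - entered_at

entered = datetime(2021, 1, 15, 23, 30)  # second determination result received
exited = datetime(2021, 1, 16, 7, 5)     # first unlock in the second time period
```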
Specifically, the unlocking method may be a method of unlocking using biometric information. The biometric information may be, for example, face information, voiceprint information, fingerprint information, and the like. When the monitored biometric information belongs to the biometric information of the first user, the mobile phone 100 may determine that the unlocked user is the first user.
The second time period may be a time period of a preset length (e.g., 12 hours) starting from when the mobile phone 100 sends the second determination result, which indicates that the first user does not use the mobile phone 100, to the bracelet 200. Alternatively, the second time period may be a preset time period, for example, the time period from 5 am to 10 am. The second time period may also be a time period during which the bracelet 200 estimates that the first user will exit the sleep state. The bracelet 200 may estimate this time period according to the data of the first user detected multiple times.
Optionally, when monitoring the user operation of turning off the alarm clock of the mobile phone 100 in the second time period, the mobile phone 100 may determine whether the user turning off the alarm clock of the mobile phone 100 is the first user. If the user turning off the alarm clock of the mobile phone 100 is determined to be the first user, the mobile phone 100 may send a message to the bracelet 200 to instruct the first user to exit the sleep state. When receiving the message for instructing the first user to exit the sleep state, the bracelet 200 may determine the time when the user operation for turning off the alarm clock of the mobile phone 100 is monitored as the time when the first user exits the sleep state. The method for judging whether the user turning off the alarm clock of the mobile phone 100 is the first user by the mobile phone 100 in the embodiment of the present application is not limited. For example, the mobile phone 100 may determine whether the user turning off the alarm clock is the first user by using a method of biometric information recognition such as face recognition, voiceprint recognition, fingerprint recognition, and the like.
Optionally, the mobile phone 100 may further determine whether the first user uses the mobile phone 100 by using the methods shown in fig. 3 to fig. 6 during the second time period. If it is determined that the first user is using the mobile phone 100, the mobile phone 100 may send a message to the bracelet 200 to instruct the first user to exit the sleep state. The implementation method for determining whether the first user exits the sleep state by the mobile phone 100 in the embodiment of the present application is not limited.
In addition to monitoring the unlocking operation and the alarm-clock-off operation, and using the methods shown in fig. 3 to fig. 6 to determine whether the first user uses the mobile phone 100, the mobile phone 100 may also determine whether the first user uses it by other methods, so as to assist the bracelet 200 in determining whether the first user exits the sleep state.
The bracelet 200 may utilize the sleep model in the foregoing embodiments to determine whether the first user exits the sleep state based on the acceleration data and the heart rate data. However, in some application scenarios (e.g., an application scenario in which the first user uses a mobile phone while lying in bed after waking up), the first user has already exited the sleep state, but the amount of change in the pose of the bracelet 200 is small or the pose even remains unchanged. The sleep model would then judge that the first user is still in the sleep state, which reduces the accuracy of sleep quality monitoring. By relying on the mobile phone 100 to determine whether the first user exits the sleep state, the bracelet 200 can reduce such misjudgment in scenarios where the first user has woken up but not yet gotten up, and improve the accuracy of sleep quality monitoring.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and these modifications or substitutions do not depart from the scope of the technical solutions of the embodiments of the present application.
Claims (33)
1. A sleep monitoring method, the method comprising:
the method comprises the steps that a first electronic device receives a first request of a second electronic device; the first electronic equipment and the second electronic equipment have a binding relationship; the first request is sent when the second electronic equipment is in a wearing state and first data are monitored, and the first data are consistent with data of a user entering a sleep state;
the first electronic equipment judges whether a first user wearing the second electronic equipment uses the first electronic equipment or not, and sends a first judgment result or a second judgment result to the second electronic equipment; the first determination result indicates that the first user is using the first electronic device, and the second determination result indicates that the first user is not using the first electronic device.
2. The method according to claim 1, wherein the determining, by the first electronic device, whether the first user wearing the second electronic device is using the first electronic device, specifically comprises:
the first electronic equipment judges that the first electronic equipment is used by a user and the user is the first user under a first condition, and the first electronic equipment obtains a first judgment result; the first condition includes one or more of: the first electronic device judges that the first electronic device is in a non-static state, the first electronic device monitors user operation in a first time period, the first electronic device monitors that human eyes watch a screen of the first electronic device, and the first electronic device monitors that an application program for screen projection is running on the first electronic device;
the first electronic device determines that the first electronic device is used by a user and the user is not the first user under the first condition, or the first electronic device determines that the first electronic device is not used by the user, and the first electronic device obtains the second determination result.
3. The method of claim 2, wherein the determining, by the first electronic device, that the user is the first user comprises:
the first electronic equipment acquires a first image through a camera and judges that the first image contains a face image of the first user.
4. The method of any of claims 1-3, wherein the first data comprises one or more of: physiological characteristic data of the user, motion data of the second electronic device.
5. The method according to any one of claims 1-4, further comprising:
the first electronic device monitors user operation for unlocking the first electronic device in a second time period, and the first electronic device sends a first message to the second electronic device, wherein the first message is used for indicating that the first user uses the first electronic device; or,
and the first electronic equipment monitors the user operation of closing the alarm clock in the second time period, the user closing the alarm clock is the first user, and the first electronic equipment sends the first message to the second electronic equipment.
6. The method according to claim 5, wherein the second time period is a first time period from the first electronic device sending the second determination result to the second electronic device; or, the second time period is a preset time period.
7. The method according to any one of claims 1-6, further comprising:
the first electronic device and the second electronic device establish a binding relationship through Bluetooth pairing; or,
the first electronic device establishes a binding relationship with the second electronic device in response to a first user operation, wherein the first user operation is used for indicating that the owner of the first electronic device is the first user; or,
the first electronic device and the second electronic device establish a binding relationship by logging in to the same account.
8. A sleep monitoring method, the method comprising:
a second electronic device monitors first data while in a wearing state, wherein the first data is consistent with data of a user entering a sleep state;
the second electronic device sends a first request to a first electronic device, wherein the first electronic device and the second electronic device have a binding relationship;
in a case that a first determination result from the first electronic device is received, the second electronic device determines that a first user wearing the second electronic device has not entered a sleep state, wherein the first determination result is a result of the first electronic device determining, after receiving the first request, that the first user is using the first electronic device.
9. The method of claim 8, wherein after the second electronic device determines that a first user wearing the second electronic device has not entered a sleep state, the method further comprises:
the second electronic device monitors second data while in a wearing state, and determines whether the second data is consistent with data of the user entering a sleep state.
10. The method of claim 8 or 9, wherein after the second electronic device determines that the first user wearing the second electronic device has not entered a sleep state, the method further comprises:
the second electronic device records the monitored physiological characteristic data of the user as non-sleep-state data.
11. The method according to any one of claims 8-10, further comprising:
the second electronic device determines that the first user has entered a sleep state in a case that a second determination result from the first electronic device is received, wherein the second determination result is a result of the first electronic device determining, after receiving the first request, that the first user is not using the first electronic device.
12. The method of claim 11, wherein after the second electronic device determines that the first user has entered a sleep state, the method further comprises:
the second electronic device records the monitored physiological characteristic data of the user as sleep-state data.
13. The method of any of claims 9-12, wherein the first data and the second data comprise one or more of: physiological characteristic data of a user, motion data of the second electronic device.
14. The method according to any one of claims 11-13, further comprising:
in a case that the second electronic device receives a first message from the first electronic device while detecting that the state of the first user is a sleep state, the second electronic device determines that the state of the first user is a non-sleep state, wherein the first message is used for indicating that the first user is using the first electronic device.
15. A sleep monitoring method, the method comprising:
a second electronic device monitors first data while in a wearing state, wherein the first data is consistent with data of a user entering a sleep state;
the second electronic device sends a first request to a first electronic device, wherein the first electronic device and the second electronic device have a binding relationship;
the first electronic device receives the first request from the second electronic device;
the first electronic device determines whether a first user wearing the second electronic device is using the first electronic device, and sends a first determination result or a second determination result to the second electronic device, wherein the first determination result indicates that the first user is using the first electronic device, and the second determination result indicates that the first user is not using the first electronic device;
in a case that the first determination result is received, the second electronic device determines that the first user has not entered a sleep state.
16. The method according to claim 15, wherein the determining, by the first electronic device, whether the first user wearing the second electronic device is using the first electronic device comprises:
the first electronic device determines, under a first condition, that the first electronic device is used by a user and that the user is the first user, and obtains the first determination result, wherein the first condition comprises one or more of the following: the first electronic device is in a non-static state; a user operation is detected within a first time period; human eyes are detected gazing at a screen of the first electronic device; an application program is detected running on the first electronic device with the screen on;
the first electronic device determines, under the first condition, that the first electronic device is used by a user who is not the first user, or determines that the first electronic device is not used by any user, and obtains the second determination result.
17. The method of claim 16, wherein the determining, by the first electronic device, that the user is the first user comprises:
the first electronic device captures a first image through a camera and determines that the first image contains a face image of the first user.
18. The method according to any one of claims 15-17, further comprising:
in a case that the second determination result is received, the second electronic device determines that the first user has entered a sleep state.
19. The method according to any one of claims 15-18, further comprising:
the first electronic device detects a user operation for unlocking the first electronic device within a second time period, and sends a first message to the second electronic device, wherein the first message is used for indicating that the first user is using the first electronic device; or,
the first electronic device detects a user operation for turning off an alarm clock within the second time period, the user turning off the alarm clock being the first user, and sends the first message to the second electronic device.
20. The method according to any one of claims 15-19, further comprising:
the first electronic device and the second electronic device establish a binding relationship through Bluetooth pairing; or,
the first electronic device establishes a binding relationship with the second electronic device in response to a first user operation, wherein the first user operation is used for indicating that the owner of the first electronic device is the first user; or,
the first electronic device and the second electronic device establish a binding relationship by logging in to the same account.
21. A sleep monitoring method, the method comprising:
in a case that a second electronic device receives a first message from a first electronic device while detecting that the state of a first user wearing the second electronic device is a sleep state, the second electronic device determines that the state of the first user is a non-sleep state, wherein the first electronic device and the second electronic device have a binding relationship, and the first message is sent by the first electronic device after the first electronic device detects, within a first monitoring time period, a user operation for unlocking the first electronic device.
22. The method of claim 21, wherein before the second electronic device detects that the state of the first user wearing the second electronic device is a sleep state, the method further comprises:
the second electronic device monitors first data while in a wearing state, wherein the first data is consistent with data of a user entering a sleep state;
the second electronic device sends a first request to the first electronic device;
the second electronic device receives, from the first electronic device, a determination result indicating that the first user is not using the first electronic device.
23. The method of claim 21 or 22, wherein the first monitoring time period is a time period in which the second electronic device estimates that the first user will exit the sleep state.
24. The method of claim 22, wherein the first monitoring time period is a period of a first duration starting from when the first electronic device sends, to the second electronic device, the determination result indicating that the first user is not using the first electronic device.
25. A sleep monitoring method, the method comprising:
a first electronic device detects, within a first monitoring time period, a user operation for unlocking the first electronic device, and sends a first message to a second electronic device, wherein the first electronic device and the second electronic device have a binding relationship;
in a case that the second electronic device receives the first message while detecting that the state of a first user is a sleep state, the second electronic device determines that the state of the first user is a non-sleep state.
26. The method of claim 25, wherein before the second electronic device detects that the state of the first user is a sleep state, the method further comprises:
the second electronic device monitors first data while in a wearing state, wherein the first data is consistent with data of a user entering a sleep state;
the second electronic device sends a first request to the first electronic device;
the first electronic device receives the first request, determines whether the first user is using the first electronic device, obtains a determination result indicating that the first user is not using the first electronic device, and sends the determination result to the second electronic device.
27. The method of claim 25 or 26, wherein the first monitoring time period is a time period in which the second electronic device estimates that the first user will exit the sleep state.
28. The method of claim 26, wherein the first monitoring time period is a period of a first duration starting from when the first electronic device sends, to the second electronic device, the determination result indicating that the first user is not using the first electronic device.
29. An electronic device, the electronic device being a first electronic device, comprising: a camera, a communication module, a memory, and a processor; the camera is configured to capture images; the communication module is configured to establish a communication connection with a second electronic device; the memory is configured to store a computer program; and the processor is configured to invoke the computer program to cause the first electronic device to perform the method of any one of claims 1-7.
30. An electronic device, the electronic device being a second electronic device, comprising: a communication module, a memory, and a processor; the communication module is configured to establish a communication connection with a first electronic device; the memory is configured to store a computer program; and the processor is configured to invoke the computer program to cause the second electronic device to perform the method of any one of claims 8-14 or 21-24.
31. A sleep monitoring system, wherein the system comprises the first electronic device according to claim 29 and the second electronic device according to claim 30.
32. A computer storage medium comprising computer instructions which, when executed on an electronic device, cause the electronic device to perform the method of any one of claims 1-7, 8-14, or 21-24.
33. A computer program product which, when run on an electronic device, causes the electronic device to perform the method of any one of claims 1-7, 8-14, or 21-24.
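The cross-device check claimed above (claims 8, 15, and 21) amounts to a simple confirmation protocol: the wearable detects sleep-like data, asks the bound phone whether its owner is actively using it, marks the user asleep only if the phone reports no use, and corrects a mistaken sleep state if the phone is later unlocked. The sketch below is an illustrative model of that message flow only, not the patented implementation; all class and method names are invented.

```python
from dataclasses import dataclass, field

# Stand-ins for the claimed "first determination result" (user is using
# the phone) and "second determination result" (user is not using it).
USING, NOT_USING = "first_determination", "second_determination"

@dataclass
class Phone:
    """First electronic device: answers whether its owner is using it."""
    in_use_by_owner: bool = False

    def handle_request(self) -> str:
        # Claim 16 infers "in use" from motion, recent user operations,
        # gaze on the screen, or a running screen-on app; collapsed here
        # into a single flag for illustration.
        return USING if self.in_use_by_owner else NOT_USING

@dataclass
class Wearable:
    """Second electronic device: monitors data while in a wearing state."""
    phone: Phone
    asleep: bool = False
    records: list = field(default_factory=list)

    def on_sleep_like_data(self, data) -> None:
        # Claims 8/11: monitored data looks like sleep onset, so confirm
        # with the bound phone before committing to a sleep state.
        result = self.phone.handle_request()
        self.asleep = (result == NOT_USING)
        self.records.append((data, "sleep" if self.asleep else "non-sleep"))

    def on_phone_unlock_message(self) -> None:
        # Claims 21/25: the phone was unlocked during the monitoring
        # period, so the detected sleep state is corrected to non-sleep.
        self.asleep = False
```

For example, if the owner is scrolling on the phone while the wearable sees low heart rate, the data is recorded as non-sleep; once the phone is idle, the same data confirms sleep, and a later unlock message flips the state back.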
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110057952.7A CN114762588A (en) | 2021-01-15 | 2021-01-15 | Sleep monitoring method and related device |
PCT/CN2021/137461 WO2022151887A1 (en) | 2021-01-15 | 2021-12-13 | Sleep monitoring method and related apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114762588A true CN114762588A (en) | 2022-07-19 |
Family
ID=82364577
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117481614A (en) * | 2023-12-26 | 2024-02-02 | 荣耀终端有限公司 | Sleep state detection method and related equipment |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117678970B (en) * | 2022-09-09 | 2024-10-15 | 荣耀终端有限公司 | Sleep state detection method, electronic equipment and system |
Also Published As
Publication number | Publication date |
---|---|
WO2022151887A1 (en) | 2022-07-21 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||