CN113688736A - Method for analyzing and monitoring health condition of patient
- Publication number
- CN113688736A (application CN202110980852.1A)
- Authority
- CN
- China
- Prior art keywords
- patient
- posture
- key points
- health
- nursing staff
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/253—Fusion techniques of extracted features
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Abstract
The invention discloses a method for analyzing and monitoring the health condition of a patient, belonging to the technical field of target recognition. A system for detecting the health condition of a patient is constructed from a monitoring camera installed in the ward and a monitoring terminal placed in the nursing staff duty room; the system combines facial expression recognition, human body posture estimation and hand posture estimation, fuses the expression, posture and gesture data of the patient acquired by the camera, and comprehensively analyzes the patient's health condition from the fused information. The system raises an alarm when any of four conditions occurs in the ward and notifies the nursing staff to observe the patient's health, reducing the workload of hospital nursing staff. The invention realizes a comprehensive analysis of the health condition of patients in the ward, so that nursing staff can respond in time when a patient is in danger, safeguarding the patient's life and health while easing the nursing staff's workload.
Description
Technical Field
The invention relates to the technical field of target recognition, and in particular to a method for analyzing and monitoring the health condition of a patient.
Background
Population aging is becoming increasingly severe, and the number of hospitalized patients requiring special care, including the elderly, keeps growing; yet the number of people working in the nursing industry in China is relatively small, with a shortfall of medical care personnel exceeding ten million, so patient care has become an urgent problem to solve.
With the continuous development of artificial intelligence, applying AI technology to patient care in wards can relieve the shortage of ward nursing staff to a certain extent. Existing methods mainly realize unattended patient care by combining smart wearables with intelligent nursing robots; however, smart wearables and intelligent robots are costly and limited in function, and in practice it is difficult for them to achieve the expected effect.
In view of the above, there is a need to develop a method for analyzing and monitoring the health condition of a patient by using a camera installed in a ward.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method for analyzing and monitoring the health condition of a patient: a system for detecting the patient's health condition is built around a camera installed in the ward; facial expression recognition, human body posture estimation and hand posture estimation are combined; the expression, posture, gesture and other data of the patient collected by the camera are fused; and the fused information is used to comprehensively analyze the patient's health condition, effectively easing the shortage of hospital nursing staff.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
A method for analyzing and monitoring the health condition of a patient, characterized in that: a system for detecting the health condition of a patient is constructed from a monitoring camera installed in the ward and a monitoring terminal placed in the nursing staff duty room; facial expression recognition, human body posture estimation and hand posture estimation are combined, the expression, posture and gesture data of the patient acquired by the camera are fused, and the fused information is used to comprehensively analyze the patient's health condition; the system raises an alarm when any of four conditions occurs in the ward and notifies the nursing staff to observe the patient's health, thereby reducing the workload of hospital nursing staff.
As a further improvement of the technical scheme of the invention: the recognition process of facial expression recognition comprises two parts, face preprocessing and facial expression classification; facial expressions are divided into dangerous expressions, expressions to be observed and safe expressions. When the system identifies the patient's expression as dangerous, it issues a warning and immediately notifies the nursing staff to go to the ward and check on the patient; when the expression is one to be observed, the system only issues an early warning reminding medical personnel to keep an eye on the patient in the ward; when the expression is safe, the system issues no warning and the nursing staff need not devote their full attention to observing that patient.
As a further improvement of the technical scheme of the invention: the patient facial expression recognition process specifically comprises the following steps:
S1, first, a CascadeCNN-based face detection algorithm locates the target area in the input picture; the face area is cropped according to the key points detected by Intraface, which are used to extract the contours of the eyebrows, eyes, nose and mouth; and the picture is normalized;
S2, feature information that best expresses the whole face is extracted from the normalized picture;
S3, finally, the feature information obtained in S2 is sent to a Softmax classifier for expression classification.
As a further improvement of the technical scheme of the invention: the dangerous expressions at least comprise anger, disgust, fear and sadness; the expressions to be observed at least comprise surprise; the safe expressions at least comprise neutral and happy.
As a further improvement of the technical scheme of the invention: human body posture estimation generates a human body proposal box with a target detection algorithm and then performs human body key point regression inside the box with a heat-map-based key point regression algorithm; human body posture is divided into standing, sitting, lying and falling; 18 key points of the patient's body are extracted and the posture is estimated from the Euclidean distances between key points, with the Euclidean distance between the knee and hip key points determining whether the patient is standing or sitting. Combined with the positional relationship between the patient and the bed: if the patient is on the bed and the vertical Euclidean distance between the hip and neck key points is small, the patient is lying on the bed, and the system further analyzes the patient's health through facial expression or hand posture; if the patient is not on the bed and the vertical distance between the hip and neck key points is small, the patient is judged to have fallen, an alarm must be raised immediately and the nursing staff notified to go to the ward and check the patient's health.
As a further improvement of the technical scheme of the invention: the human body posture estimation process comprises: first, a human body is detected by an improved MobileNet to obtain a bounding box containing the person; then a human body proposal box is generated by a target detection algorithm, and human body key point regression is performed inside the box by a heat-map-based key point regression algorithm; the rough locations of the key points are represented by heat maps, and the maximum of each heat map corresponds to the actual location of the key point.
As a further improvement of the technical scheme of the invention: hand posture estimation comprises extracting 21 key points of the hand and ordering them, with the palm root as the initial key point and four key points on each finger; the hand posture is identified by calculating the Euclidean distances between hand key points and the angles between the key points and the palm-root key point; the hand posture is divided into gripping and open, judged from the Euclidean distance and angle relationship between the palm center and the fingertips.
As a further improvement of the technical scheme of the invention: for estimating the patient's hand posture, a ResNet-50 network model is adopted for hand key point detection, and the hand posture is identified by calculating the Euclidean distances among the 21 extracted hand key points and the angles between the key points and the palm-root key point; a patient in danger generally clenches both hands, and based on this characteristic the obtained hand posture assists the facial expression and body posture in analyzing the patient's health condition.
As a further improvement of the technical scheme of the invention: the four conditions specifically comprise:
(1) if the system can detect the patient's facial expression, the patient's psychological state is obtained by analyzing it, so that when a dangerous expression appears the system raises an alarm and the nursing staff can respond in time;
(2) if the system can detect both the facial expression and the hand posture, the patient's psychological state is judged from the two combined: when a dangerous expression appears, a fist-clenching posture appears as well, and the system raises an alarm to notify the nursing staff to check the patient's health in time;
(3) if the system fails to detect the facial expression, the patient's health is analyzed from the body posture: once the coordinates of the neck and hip key points are found to be close in the vertical direction, whether the patient has fallen or is lying down is judged from the positional relationship between the patient and the bed; lying flat on the bed is not a fall, while a fall requires the nursing staff to be notified in time; when the patient has not fallen, the patient's psychological state can be further judged from the hand posture, and when the patient clenches both hands the nursing staff must stay vigilant and keep watching the patient's health;
(4) if the system can detect neither the facial expression nor the body posture, the health condition can only be judged from the hand posture, and whether the patient's hands are clenched is determined from the change in the distances between hand key points; when neither facial expression nor body posture can be detected and both hands are clenched, the nursing staff receives the system's alarm in time, pays attention to the patient's health and can respond quickly.
As a further improvement of the technical scheme of the invention: the system is based on the PyTorch 1.9.0 deep learning framework and comprises at least a MobileNet network model and a ResNet-50 network model; the algorithms used comprise at least a CascadeCNN-based face detection algorithm, a target detection algorithm and a key point regression algorithm.
Due to the adoption of the above technical scheme, the invention achieves the following technical progress:
1. The invention rapidly integrates facial expression recognition, human body posture estimation and gesture estimation into a health condition detection system for patients in the ward. The system no longer relies on a single kind of data to analyze the patient's health; by rapidly integrating facial expressions, body postures and hand postures, it analyzes the patient's health faster and better. By analyzing the patient's expressions and actions in the ward, real-time health data are obtained; nursing staff rely on these data to care for patients, which greatly improves work efficiency and alleviates the shortage of nursing staff to a certain extent.
2. The invention takes the patient's facial expression as the key basis for judging the health condition, while jointly detecting the body posture and hand posture to judge the health condition more specifically and accurately.
3. The invention fuses the obtained facial expression, body posture and hand posture data and analyzes them comprehensively to obtain the health condition of the patient in the ward; fusing data from multiple aspects allows the health condition to be analyzed more accurately and comprehensively.
4. Compared with smart-wearable and intelligent-robot nursing schemes, the invention is low in cost, requires the patient to wear no device, is easy to implement, and has value for popularization and use.
5. When data from one aspect cannot be detected, the invention can still collect other data to detect the patient's health, comprehensively realizing the analysis of the patient's condition in the ward, so that the nursing staff can react in time when the patient is in danger, safeguarding the patient's life and health and easing the nursing staff's workload.
6. The invention realizes the detection and analysis of patients' health in the ward, reduces the risk of danger for patients with limited mobility, and relieves the pressure caused by the shortage of hospital nursing staff.
Drawings
FIG. 1 is a flow chart of patient facial expression recognition of the present invention;
FIG. 2 is a flow chart of the present invention for estimating the body posture of a patient;
FIG. 3 is a flow chart of the patient hand pose estimation of the present invention;
FIG. 4 is a flow chart of a patient health analysis of the present invention;
FIG. 5 is a screenshot of the recognition of facial expressions, body gestures and hand movements proposed by the present invention;
FIG. 6 is a screenshot of the present invention detecting a patient sitting down;
FIG. 7 is a screenshot of the present invention detecting a patient lying down;
FIG. 8 is a screenshot of a fall warning for a patient according to the present invention;
FIG. 9 is a screenshot of the two hand gesture actions (gripping and opening) proposed by the present invention.
Detailed Description
The invention provides a method that fuses facial expression data, human body posture data and hand posture data to comprehensively analyze a patient's health condition; it can identify the patient's expression, body posture and the corresponding hand actions, and from the obtained data analyze the patient's facial expression, psychological activity and body movements so as to determine the patient's health condition.
The present invention will be described in further detail with reference to the following examples:
A method for analyzing and monitoring the health condition of a patient constructs a system for detecting the patient's health condition from a monitoring camera installed in the ward and a monitoring terminal placed in the nursing staff duty room. The system is based on the PyTorch 1.9.0 deep learning framework and comprises at least a MobileNet network model and a ResNet-50 network model; the algorithms used comprise at least a CascadeCNN-based face detection algorithm, a target detection algorithm and a key point regression algorithm.
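As an illustrative sketch only (the patent does not disclose its exact network definitions), the two backbones named here can be instantiated with torchvision; the stock model constructors and the 21-point regression head below are assumptions for illustration:

```python
# Sketch: instantiating the two backbone models named in the text with
# torchvision. The class counts (21 hand key points) follow the description;
# the stock MobileNetV2/ResNet-50 constructors are illustrative stand-ins.
import torch
import torchvision.models as models

# The text uses an "improved MobileNet" for person detection; a stock
# MobileNetV2 stands in for it here.
mobilenet = models.mobilenet_v2(pretrained=True)

# ResNet-50 backbone for hand key point detection, with its classifier head
# replaced to regress 21 (x, y) key point coordinates.
resnet50 = models.resnet50(pretrained=True)
resnet50.fc = torch.nn.Linear(resnet50.fc.in_features, 21 * 2)

x = torch.randn(1, 3, 224, 224)            # dummy input frame crop
hand_keypoints = resnet50(x).view(-1, 21, 2)
print(hand_keypoints.shape)                 # torch.Size([1, 21, 2])
```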
The system combines facial expression recognition, human body posture estimation and hand posture estimation, fuses the expression, posture and gesture data of the patient collected by the camera, and comprehensively analyzes the patient's health condition from the fused information; the system raises an alarm when any of four conditions occurs in the ward and notifies the nursing staff to observe the patient's health, thereby reducing the workload of hospital nursing staff.
As shown in FIG. 1, the patient expression recognition flow identifies the patient's facial expression in two parts: face preprocessing and facial expression classification. First, a CascadeCNN-based face detection algorithm locates the target area in the input picture; the face area is cropped according to the key points detected by Intraface, which extract the contours of the eyebrows, eyes, nose and mouth, and the picture is normalized. Feature information that best expresses the whole face is then extracted from the processed picture, and finally these features are sent to a Softmax classifier for expression classification. Facial expressions are grouped into 7 classes: anger, disgust, fear, sadness, surprise, neutral and happy. The system classifies anger, disgust, fear and sadness as dangerous expressions, surprise as an expression to be observed, and neutral and happy as safe expressions, so that medical staff can react according to the degree of danger indicated by the patient's expression.
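A minimal sketch of this three-stage flow is given below. CascadeCNN and Intraface are not publicly packaged, so OpenCV's Haar cascade stands in for the face detector, and the toy CNN is an assumption; only the 48 x 48 grayscale input and 7-class Softmax output follow the text:

```python
# Sketch of the expression recognition flow: detect face -> crop/normalize ->
# CNN features -> Softmax over the 7 classes. OpenCV's Haar cascade is a
# stand-in for the CascadeCNN detector described in the text.
import cv2
import torch
import torch.nn as nn

EXPRESSIONS = ["anger", "disgust", "fear", "sadness", "surprise", "neutral", "happy"]

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

class ExpressionNet(nn.Module):
    """Toy CNN; the patent does not specify its feature extractor."""
    def __init__(self, n_classes=7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.classifier = nn.Linear(64 * 12 * 12, n_classes)  # 48 -> 24 -> 12

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def classify_expression(frame_bgr, model):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None                         # no face: fall back to posture/hands
    x, y, w, h = faces[0]
    crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0  # normalize
    inp = torch.tensor(crop, dtype=torch.float32).view(1, 1, 48, 48)
    probs = torch.softmax(model(inp), dim=1)
    return EXPRESSIONS[int(probs.argmax())]
```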
When the patient's expression is identified as a negative, dangerous emotion such as anger, sadness, disgust or fear, the system issues a warning and immediately notifies the nursing staff to go to the ward and check on the patient. When the expression is identified as surprise, an expression to be observed, the system only issues an early warning reminding medical staff to watch the patient's condition in the ward. When the expression is identified as a safe expression such as neutral or happy, no warning is issued and the nursing staff need not devote their full attention to that patient, which greatly improves their work efficiency. Analyzing the patient's health condition from facial expressions is the core of the invention; the facial expressions and the corresponding measures for the nursing staff are shown in Table 1.
TABLE 1 Patient facial expressions and the corresponding caregiver responses

Expression | Caregiver response
---|---
Anger | Dangerous expression; notify nursing staff in time
Sadness | Dangerous expression; notify nursing staff in time
Fear | Dangerous expression; notify nursing staff in time
Disgust | Dangerous expression; notify nursing staff in time
Surprise | Expression to be observed; issue an early warning and observe
Neutral | Safe expression; no warning issued
Happy | Safe expression; no warning issued
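In code, Table 1 reduces to a small lookup; the alert-level names below are illustrative, not from the patent:

```python
# Sketch: mapping each recognized expression to the caregiver response of
# Table 1. The alert-level labels are illustrative assumptions.
RESPONSE = {
    "anger":    "ALARM",     # dangerous expression: notify nursing staff now
    "sadness":  "ALARM",
    "fear":     "ALARM",
    "disgust":  "ALARM",
    "surprise": "WARNING",   # expression to be observed: early warning
    "neutral":  "NONE",      # safe expression: no warning
    "happy":    "NONE",
}

def expression_alert(expression):
    return RESPONSE.get(expression, "WARNING")  # unknown labels err on caution

print(expression_alert("fear"))  # ALARM
```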
As shown in FIG. 2, the human body posture estimation flow estimates the patient's body posture. An improved MobileNet network model is adopted for human body key point detection: a bounding box containing the person is obtained by human body detection, a human body proposal box is then generated by the target detection algorithm, and human body key point regression is performed inside the box by a heat-map-based key point regression algorithm. The rough position of each key point is represented by a heat map: the trained network outputs a probability for each pixel of the feature map, the maximum of the heat map corresponds to the key point's position, and the ground truth of the heat map is a two-dimensional Gaussian distribution centered on the key point.
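This heat-map scheme can be sketched in a few lines: the ground-truth map is a 2-D Gaussian centered on the key point, and decoding takes the argmax of the predicted map. A minimal illustration, not the patent's exact implementation:

```python
# Sketch of heat-map key point regression: a 2-D Gaussian ground-truth map
# centered on the key point, and argmax decoding of a predicted map.
import numpy as np

def gaussian_heatmap(h, w, cx, cy, sigma=2.0):
    """Ground truth: 2-D Gaussian centered on key point (cx, cy)."""
    ys, xs = np.mgrid[0:h, 0:w]
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))

def decode_keypoint(heatmap):
    """The heat-map maximum corresponds to the key point location."""
    cy, cx = np.unravel_index(np.argmax(heatmap), heatmap.shape)
    return int(cx), int(cy)

gt = gaussian_heatmap(64, 48, cx=20, cy=30)
print(decode_keypoint(gt))  # (20, 30)
```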
Human body posture is classified into 4 types: standing, sitting, lying and falling. Eighteen key points of the patient's body are extracted, and the posture is estimated from the Euclidean distances between key points: the Euclidean distance between the knee and hip key points determines whether the patient is standing or sitting. Because the Euclidean distances between body key points are very similar when the patient lies down and when the patient has fallen, an additional condition is needed to distinguish the two. The invention therefore introduces the positional relationship between the patient and the bed: if the patient is on the bed and the vertical Euclidean distance between the hip and neck key points is small, the patient is lying on the bed, and the system further analyzes the patient's health through facial expression or hand posture; if the patient is not on the bed and the vertical distance between the hip and neck key points is small, the patient is judged to have fallen, an alarm must be raised immediately and the nursing staff notified to go to the ward and check the patient's health.
The Euclidean distance is calculated as shown in Equation 1-1, where ρ is the Euclidean distance between the points (x₁, y₁) and (x₂, y₂):

ρ = √((x₂ − x₁)² + (y₂ − y₁)²)    (1-1)
As shown in FIG. 3, the hand posture estimation flow estimates the patient's hand posture using a ResNet-50 network model for hand key point detection: the two hands are first located by hand detection, then 21 key points of each hand are extracted and ordered, with the palm root as the initial key point and four key points on each finger, and the hand posture is identified from the Euclidean distances between hand key points and the angles between the key points and the palm-root key point. The Euclidean distance between hand key points is calculated by Equation 1-1; the angle between a key point and the palm-root key point is calculated by the inverse cosine function, as shown in Equation 1-2, where θ is the angle between the point (x₁, y₁) and the palm root (x₀, y₀).
Hand gestures are classified into 2 types: gripping and open. The palm-center coordinates are computed as the mean of the finger-root key point coordinates; when the Euclidean distances between the fingertip key points of the index, middle, ring and little fingers and the palm center are smaller than a specified threshold, the gesture is judged to be gripping, and when they are larger than the threshold it is judged to be open. When the patient's facial expression can be collected, the hand information is combined with the facial expression to analyze the patient's psychological state; when the facial expression cannot be collected, the obtained hand posture assists the body posture in analyzing the patient's health condition.
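A sketch of this palm-center rule follows. The 21-point layout (palm root at index 0, fingertips at 8/12/16/20, finger roots at 5/9/13/17) is the common OpenPose-style ordering and is assumed here; the threshold corresponds to the dst_thr used in the embodiment below:

```python
# Sketch of the grip/open rule: palm center = mean of the finger-root key
# points; the hand is "gripping" when the index/middle/ring/little fingertips
# are all within dst_thr of the palm center.
import math

FINGER_ROOTS = [5, 9, 13, 17]   # assumed OpenPose-style indices
FINGER_TIPS = [8, 12, 16, 20]

def euclid(p, q):
    return math.hypot(q[0] - p[0], q[1] - p[1])

def hand_pose(kp21, dst_thr=10):
    """kp21: list of 21 (x, y) hand key points. Returns 'grip' or 'open'."""
    cx = sum(kp21[i][0] for i in FINGER_ROOTS) / 4.0
    cy = sum(kp21[i][1] for i in FINGER_ROOTS) / 4.0
    tip_dists = [euclid(kp21[i], (cx, cy)) for i in FINGER_TIPS]
    return "grip" if all(d < dst_thr for d in tip_dists) else "open"
```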
As shown in FIG. 4, the invention rapidly integrates the above expression recognition, human body posture estimation and gesture estimation into a health condition detection system for patients in the ward. The system no longer relies on a single kind of data to analyze the patient's health; by rapidly integrating facial expressions, body postures and hand postures, it analyzes the patient's health faster and better. The invention takes the facial expression as the key basis for judging the patient's health condition, while jointly detecting the body posture and hand posture for a more specific and accurate judgment. The obtained facial expression, body posture and hand posture data are fused and comprehensively analyzed to obtain the health condition of the patient in the ward. An alarm is raised to notify the nursing staff to observe the patient's health when any of the following four conditions occurs in the ward:
(1) If the patient's facial expression can be detected, the patient's psychological state is obtained by analyzing it, so that the nursing staff can react in time when a dangerous expression appears.
(2) If both the facial expression and the hand posture can be detected, the patient's mental state is judged from the two combined: when a dangerous emotion such as anger, sadness, disgust or fear appears, a fist-clenching posture appears as well, and the nursing staff should be notified in time to check the patient's health.
(3) If the facial expression cannot be detected, the patient's health can be analyzed from the body posture. For example, once the vertical coordinates of the neck and hip key points are found to be close, whether the patient has fallen or is lying down is judged from the positional relationship between the patient and the bed: lying flat on the bed is not a fall, while a fall requires the nursing staff to be notified in time. When the patient has not fallen, the patient's psychological state can be further judged from the hand posture: when the patient clenches both hands, the nursing staff must stay vigilant and keep watching the patient's health.
(4) If neither the facial expression nor the body posture can be detected, the health condition can only be judged from the hand posture, and whether the patient's hands are clenched is determined from the change in the distances between hand key points. When neither facial expression nor body posture can be detected and both hands are clenched, the nursing staff receives the alarm in time, pays attention to the patient's health and can respond quickly.
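The four conditions reduce to a single decision function, sketched below; inputs are None when the corresponding modality cannot be detected, and the returned labels are illustrative:

```python
# Sketch of the four-condition fusion logic described above.
DANGEROUS = {"anger", "sadness", "disgust", "fear"}

def assess(expression, posture, hands):
    if expression is not None:
        if expression in DANGEROUS:
            if hands == "grip":                      # case (2)
                return "ALARM: dangerous expression with clenched fists"
            return "ALARM: dangerous expression"     # case (1)
        if expression == "surprise":
            return "WARNING: observe the patient"    # expression to be observed
        return "OK"                                  # safe expression
    if posture is not None:                          # case (3)
        if posture == "fallen":
            return "ALARM: patient has fallen"
        if hands == "grip":
            return "WARNING: stay vigilant, hands clenched"
        return "OK"
    if hands == "grip":                              # case (4)
        return "ALARM: both hands clenched, face and posture not visible"
    return "OK"
```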
Examples
A method for analyzing and monitoring the health condition of a patient constructs a system for detecting the patient's health condition from a monitoring camera installed in the ward and a monitoring terminal placed in the nursing staff duty room. The hardware used by the system is a set of monitoring equipment: the monitoring camera is placed where the field of view in the ward is wide, so as to capture the patient's activity to the greatest extent. The camera is a Hikvision indoor panoramic camera, rotating 0-350° horizontally and -10-110° vertically, and supports night vision and large-capacity storage. The monitoring terminal is placed in the nursing staff duty room, where nursing staff can observe the health of different patients in several wards at the same time, greatly relieving the pressure caused by the shortage of nursing staff.
The hand action, facial expression and body posture are shown in sequence in the upper-left corner of the monitoring interface. When the detected facial expression is a dangerous negative emotion, the body posture is falling, or the hand posture is gripping, any one of these conditions makes the text more conspicuous and sends a warning in time, so that the nursing staff learn the real-time situation in the ward.
In this embodiment, the experiment on the PC side is based on the PyTorch 1.9.0 deep learning framework, with OpenCV 4.5.2 and Python 3.8 as the programming environment, running on the 64-bit Windows 10 operating system; the hardware platform is an Intel i7-10750H 3.2 GHz CPU and an NVIDIA RTX 3060 GPU with 6 GB of memory.
In this embodiment, 7-class facial expression experiments are performed on the Fer2013 dataset, which contains 35886 facial expression pictures: 28708 in the training set, 3589 in the validation set and 3589 in the test set. Each picture is a 48 x 48 grayscale image, covering 7 classes of expression: anger, disgust, fear, sadness, surprise, neutral and happy. Human body key points are detected on the coco2017 dataset, which comprises more than 200,000 images and 250,000 human instances labeled with key points; more than 150,000 people and 1.7 million labeled key points in the training and validation sets are publicly available. Hand key points are detected on a multi-view gesture dataset containing 49062 samples, each with a corresponding json file marking the hand key point locations.
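As a sketch, assuming the widely distributed fer2013.csv format (columns emotion, pixels, Usage), the dataset can be loaded and reshaped into 48 x 48 images like this:

```python
# Sketch: loading the public fer2013.csv (columns: emotion, pixels, Usage)
# and reshaping each row into a 48 x 48 grayscale image.
import numpy as np
import pandas as pd

def load_fer2013(csv_path="fer2013.csv", usage="Training"):
    df = pd.read_csv(csv_path)
    df = df[df["Usage"] == usage]   # Training / PublicTest / PrivateTest
    images = np.stack([
        np.asarray(p.split(), dtype=np.uint8).reshape(48, 48)
        for p in df["pixels"]
    ])
    labels = df["emotion"].to_numpy()   # integer class codes defined by the dataset
    return images, labels
```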
As shown in FIG. 5, in the screenshot of the recognized facial expression, body posture and hand action, the facial expression is recognized as sadness, the body posture as standing, and the hand posture as gripping. This embodiment analyzes the patient's health condition from the facial expression, body posture and hand action identified above, so as to determine whether the patient needs the help of the nursing staff.
The patient's facial expression is crucial when judging the health condition; sometimes, as long as a facial expression is detected, the health condition can be judged from it alone. When the expression is recognized as a dangerous negative emotion such as anger, sadness, disgust or fear, the system issues a warning so that the nursing staff go to the ward in time to check the patient's health. When the expression is recognized as surprise, an expression to be observed, the system issues an early warning and the nursing staff only need to pay attention to the patient's health. When the expression is recognized as a safe expression, neutral or happy, the patient's condition is stable and the system issues no warning. If the patient's facial expression data cannot be collected, the system detects the body posture and hand posture and judges the patient's health from the two jointly, as follows:
by analysing the patientTo determine the current health condition of the patient. As shown in FIG. 6, when the coordinates of the key points of the left and right knees and the hip in the vertical direction are less than the predetermined threshold y1In time, it can be judged that the patient is sitting. In order to distinguish the case where the coordinates of the key points of the left and right knees and the vertical coordinates of the key points of the hip are close to each other when the patient lies down, the determination conditions are supplemented, that is, the following conditions are required: when the patient sits, the Euclidean distance between the key point of the neck and the key point of the hip is larger than a specified threshold value y2. Thus, the sitting posture and the lying posture of the patient can be distinguished. The calculation of the prescribed threshold requires extensive experimentation to determine the size of the threshold based on the patient-specific motion and the euclidean distance between the corresponding keypoints. The specified threshold value y between two key points of the knee and the hip in the invention1150, a defined threshold value y between two key points of the neck and the hip2300. the posture of the patient when lying flat is detected as shown in fig. 7.
The lying and falling postures are similar and cannot be distinguished from the relative positions of the body key points alone; clearly, most of the time the patient lies on the bed, and falls are rare. To detect whether the patient has fallen, the body posture and the positional relationship between the patient and the bed are analyzed together: the bed position is marked, the distance between the patient and the bed is calculated to judge whether the patient is on the bed, and this is combined with whether the vertical coordinates of the patient's neck and hip key points are close, so as to judge whether the patient is lying flat or has fallen. As shown in FIG. 8, in the test video a chair is used instead of a bed because of limited conditions and is marked with a blue box. When the patient is a certain distance away from the bed, the patient is judged not to be on the bed; since the body posture is then similar to lying down, a fall can be determined. The system must send a warning in time to notify the nursing staff to go to the ward for rescue, with auxiliary confirmation from whether the facial expression is dangerous.
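A minimal sketch of this bed-position check, continuing the basic_posture sketch above; the bounding-box containment test and the margin are illustrative assumptions, not the patent's exact rule:

```python
# Sketch: resolving lying vs. fallen from the marked bed box. The bed box is
# (x1, y1, x2, y2) in the frame; the patient's position is taken from the hip
# key point.
def on_bed(hip_xy, bed_box, margin=20):
    x, y = hip_xy
    x1, y1, x2, y2 = bed_box
    return (x1 - margin) <= x <= (x2 + margin) and (y1 - margin) <= y <= (y2 + margin)

def lying_or_fallen(posture, hip_xy, bed_box):
    if posture != "lying-like":
        return posture                  # standing/sitting need no bed check
    return "lying" if on_bed(hip_xy, bed_box) else "fallen"
```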
When the patient's facial expression cannot be detected, identifying only the body posture in the ward cannot achieve the desired effect, so the hand posture is used to assist in judging the patient's health. As shown in FIG. 9, the Euclidean distances between the palm-center coordinates and the finger coordinates are calculated; when the distance is less than the specified threshold dst_thr = 10, the hand posture is judged to be gripping, and when it is greater than dst_thr = 10, the hand posture is judged to be open. The invention takes the clenching of both hands as the focus of attention, since clenched hands can be an important indicator of the patient's condition. When a facial expression can be detected, the hand gesture is combined with it to determine whether the patient is under stress: if the patient clenches both hands and a dangerous expression appears, the system raises an alarm and the nursing staff can react in time. When no facial expression can be detected, if the patient clenches both hands the system issues an early warning reminding the nursing staff to observe the patient's health. In addition, when the facial expression cannot be detected, the patient's posture and motion can be judged from the hand posture together with the body posture. For example, when the Euclidean distance between the patient's wrist and shoulder key points keeps changing while the hand posture remains gripping, it can be judged that the patient is holding something; the system reminds the nursing staff of this behavior in time so that they can respond specifically and danger is avoided.
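The holding-something heuristic at the end of this paragraph can be sketched as a check over a short window of frames; the window length and variation threshold are illustrative assumptions:

```python
# Sketch of the holding-something heuristic: the wrist-shoulder distance keeps
# changing while the hand stays gripped.
import math
from collections import deque

class HoldingDetector:
    def __init__(self, window=30, var_thr=15.0):   # assumed values
        self.dists = deque(maxlen=window)
        self.var_thr = var_thr

    def update(self, wrist_xy, shoulder_xy, hand_state):
        d = math.hypot(wrist_xy[0] - shoulder_xy[0], wrist_xy[1] - shoulder_xy[1])
        self.dists.append(d)
        moving = (len(self.dists) == self.dists.maxlen
                  and (max(self.dists) - min(self.dists)) > self.var_thr)
        return moving and hand_state == "grip"     # True: remind nursing staff
```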
In conclusion, the invention rapidly integrates facial expression recognition, human body posture estimation and gesture estimation into a health condition detection system for patients in the ward. The system no longer relies on a single kind of data to analyze the patient's health; by rapidly integrating facial expressions, body postures and hand postures, it analyzes the patient's health faster and better. By analyzing the patient's expressions and actions in the ward, real-time health data are obtained; nursing staff rely on these data to care for patients, which greatly improves work efficiency and alleviates the shortage of nursing staff to a certain extent.
The above examples only describe preferred embodiments of the present invention and are not intended to limit its scope; without departing from the spirit of the invention, various modifications and improvements made to the technical scheme by those skilled in the art shall fall within the protection scope defined by the claims.
Claims (10)
1. A method for analyzing and monitoring the health condition of a patient, characterized in that: a system for detecting the health condition of a patient is constructed from a monitoring camera installed in the ward and a monitoring terminal placed in the nursing staff duty room; facial expression recognition, human body posture estimation and hand posture estimation are combined, the expression, posture and gesture data of the patient acquired by the camera are fused, and the fused information is used to comprehensively analyze the patient's health condition; the system raises an alarm when any of four conditions occurs in the ward and notifies the nursing staff to observe the patient's health, thereby reducing the workload of hospital nursing staff.
2. The method for analyzing and monitoring the health condition of a patient according to claim 1, characterized in that: the recognition process of facial expression recognition comprises two parts, face preprocessing and facial expression classification; facial expressions are divided into dangerous expressions, expressions to be observed and safe expressions; when the system identifies the patient's expression as dangerous, it issues a warning and immediately notifies the nursing staff to go to the ward and check on the patient; when the expression is one to be observed, the system only issues an early warning reminding medical personnel to keep an eye on the patient in the ward; when the expression is safe, the system issues no warning and the nursing staff need not devote their full attention to observing that patient.
3. The method for analyzing and monitoring the health condition of a patient according to claim 2, characterized in that the patient facial expression recognition process specifically comprises the following steps:
S1, first, a CascadeCNN-based face detection algorithm locates the target area in the input picture; the face area is cropped according to the key points detected by Intraface, which are used to extract the contours of the eyebrows, eyes, nose and mouth; and the picture is normalized;
S2, feature information that best expresses the whole face is extracted from the normalized picture;
S3, finally, the feature information obtained in S2 is sent to a Softmax classifier for expression classification.
4. The method for analyzing and monitoring the health condition of a patient according to claim 2, characterized in that: the dangerous expressions at least comprise anger, disgust, fear and sadness; the expressions to be observed at least comprise surprise; the safe expressions at least comprise neutral and happy.
5. The method for analyzing and monitoring the health condition of a patient according to claim 1, characterized in that: human body posture estimation generates a human body proposal box with a target detection algorithm and then performs human body key point regression inside the box with a heat-map-based key point regression algorithm; human body posture is divided into standing, sitting, lying and falling; 18 key points of the patient's body are extracted and the posture is estimated from the Euclidean distances between key points, with the Euclidean distance between the knee and hip key points determining whether the patient is standing or sitting; combined with the positional relationship between the patient and the bed, if the patient is on the bed and the vertical Euclidean distance between the hip and neck key points is small, the patient is lying on the bed and the system further analyzes the patient's health through facial expression or hand posture; if the patient is not on the bed and the vertical distance between the hip and neck key points is small, the patient is judged to have fallen, an alarm must be raised immediately and the nursing staff notified to go to the ward and check the patient's health.
6. The method for analyzing and monitoring the health condition of a patient according to claim 5, characterized in that the human body posture estimation process comprises: first, a human body is detected by an improved MobileNet to obtain a bounding box containing the person; then a human body proposal box is generated by a target detection algorithm, and human body key point regression is performed inside the box by a heat-map-based key point regression algorithm; the rough locations of the key points are represented by heat maps, and the maximum of each heat map corresponds to the actual location of the key point.
7. The method for analyzing and monitoring the health condition of a patient according to claim 1, characterized in that: hand posture estimation comprises extracting 21 key points of the hand and ordering them, with the palm root as the initial key point and four key points on each finger; the hand posture is identified by calculating the Euclidean distances between hand key points and the angles between the key points and the palm-root key point; the hand posture is divided into gripping and open, judged from the Euclidean distance and angle relationship between the palm center and the fingertips.
8. The method for analyzing and monitoring the health condition of a patient according to claim 7, characterized in that: for estimating the patient's hand posture, a ResNet-50 network model is adopted for hand key point detection, and the hand posture is identified by calculating the Euclidean distances among the 21 extracted hand key points and the angles between the key points and the palm-root key point; a patient in danger generally clenches both hands, and based on this characteristic the obtained hand posture assists the facial expression and body posture in analyzing the patient's health condition.
9. The method for analyzing and monitoring the health condition of a patient according to claim 1, characterized in that the four conditions specifically comprise:
(1) if the system can detect the patient's facial expression, the patient's psychological state is obtained by analyzing it, so that when a dangerous expression appears the system raises an alarm and the nursing staff can respond in time;
(2) if the system can detect both the facial expression and the hand posture, the patient's psychological state is judged from the two combined: when a dangerous expression appears, a fist-clenching posture appears as well, and the system raises an alarm to notify the nursing staff to check the patient's health in time;
(3) if the system fails to detect the facial expression, the patient's health is analyzed from the body posture: once the coordinates of the neck and hip key points are found to be close in the vertical direction, whether the patient has fallen or is lying down is judged from the positional relationship between the patient and the bed; lying flat on the bed is not a fall, while a fall requires the nursing staff to be notified in time; when the patient has not fallen, the patient's psychological state can be further judged from the hand posture, and when the patient clenches both hands the nursing staff must stay vigilant and keep watching the patient's health;
(4) if the system can detect neither the facial expression nor the body posture, the health condition can only be judged from the hand posture, and whether the patient's hands are clenched is determined from the change in the distances between hand key points; when neither facial expression nor body posture can be detected and both hands are clenched, the nursing staff receives the system's alarm in time, pays attention to the patient's health and can respond quickly.
10. The method for analyzing and monitoring the health condition of a patient according to claim 1, characterized in that: the system is based on the PyTorch 1.9.0 deep learning framework and comprises at least a MobileNet network model and a ResNet-50 network model; the algorithms used comprise at least a CascadeCNN-based face detection algorithm, a target detection algorithm and a key point regression algorithm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110980852.1A CN113688736B (en) | 2021-08-25 | 2021-08-25 | Method for analyzing and monitoring health condition of patient |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110980852.1A CN113688736B (en) | 2021-08-25 | 2021-08-25 | Method for analyzing and monitoring health condition of patient |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113688736A true CN113688736A (en) | 2021-11-23 |
CN113688736B CN113688736B (en) | 2024-08-02 |
Family
ID=78582514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110980852.1A Active CN113688736B (en) | 2021-08-25 | 2021-08-25 | Method for analyzing and monitoring health condition of patient |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113688736B (en) |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105184239A (en) * | 2015-08-27 | 2015-12-23 | 沈阳工业大学 | Ward auxiliary medical-care system and auxiliary medical-care method based on facial expression recognition of patients |
Non-Patent Citations (1)
Title |
---|
Yang Ning; Zhang Hong: "A Health Monitoring System Based on the Internet of Things and Expression Recognition", Computer Science, no. 1, 15 October 2011 (2011-10-15) *
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115019220A (en) * | 2022-04-19 | 2022-09-06 | 北京拙河科技有限公司 | Posture tracking method and system based on deep learning |
CN115019220B (en) * | 2022-04-19 | 2023-02-03 | 北京拙河科技有限公司 | Posture tracking method and system based on deep learning |
CN114974538A (en) * | 2022-07-11 | 2022-08-30 | 雅图(重庆)医疗器械有限公司 | Ward nursing early warning management system based on big data |
CN114974538B (en) * | 2022-07-11 | 2023-12-22 | 广东安护通信息科技有限公司 | Ward nursing early warning management system based on big data |
WO2024105297A1 (en) * | 2022-11-18 | 2024-05-23 | Domotik Mind S.L | Security device |
CN116759044A (en) * | 2023-05-30 | 2023-09-15 | 急尼优医药科技(上海)有限公司 | Block chain-based intelligent pharmacy management system and method |
CN116700490A (en) * | 2023-06-06 | 2023-09-05 | 山东格物智能科技有限公司 | Man-machine interaction system and method based on computer vision gesture recognition |
CN117275699B (en) * | 2023-11-23 | 2024-02-13 | 四川省医学科学院·四川省人民医院 | Wisdom ward system |
CN118197657A (en) * | 2024-03-20 | 2024-06-14 | 中国人民解放军陆军军医大学第一附属医院 | Critical care cross-institution collaboration method and system based on virtual reality |
CN118197657B (en) * | 2024-03-20 | 2024-08-06 | 中国人民解放军陆军军医大学第一附属医院 | Critical care cross-institution collaboration method and system based on virtual reality |
Also Published As
Publication number | Publication date |
---|---|
CN113688736B (en) | 2024-08-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||