CN111583333A - Temperature measurement method and device based on visual guidance, electronic equipment and storage medium - Google Patents
Info
- Publication number
- CN111583333A (application CN202010428280.1A)
- Authority
- CN
- China
- Prior art keywords
- image
- identity
- temperature
- temperature measurement
- face
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000009529 body temperature measurement Methods 0.000 title claims abstract description 80
- 238000000034 method Methods 0.000 title claims abstract description 45
- 230000000007 visual effect Effects 0.000 title claims abstract description 23
- 238000001514 detection method Methods 0.000 claims abstract description 80
- 230000036760 body temperature Effects 0.000 claims abstract description 46
- 238000004422 calculation algorithm Methods 0.000 claims description 21
- 238000004891 communication Methods 0.000 claims description 12
- 238000004458 analytical method Methods 0.000 claims description 11
- 238000004590 computer program Methods 0.000 claims description 8
- 238000005516 engineering process Methods 0.000 claims description 8
- 238000012216 screening Methods 0.000 claims description 5
- 238000004861 thermometry Methods 0.000 claims description 4
- 230000005540 biological transmission Effects 0.000 description 10
- 208000035473 Communicable disease Diseases 0.000 description 7
- 238000010586 diagram Methods 0.000 description 7
- 230000006870 function Effects 0.000 description 6
- 230000009471 action Effects 0.000 description 4
- 201000010099 disease Diseases 0.000 description 4
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 4
- 210000001061 forehead Anatomy 0.000 description 4
- 208000015181 infectious disease Diseases 0.000 description 4
- 239000011159 matrix material Substances 0.000 description 4
- 238000012549 training Methods 0.000 description 4
- 238000011835 investigation Methods 0.000 description 3
- 230000008569 process Effects 0.000 description 3
- 238000012545 processing Methods 0.000 description 3
- 206010037660 Pyrexia Diseases 0.000 description 2
- 239000003086 colorant Substances 0.000 description 2
- 238000000605 extraction Methods 0.000 description 2
- 238000007726 management method Methods 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000012935 Averaging Methods 0.000 description 1
- XAGFODPZIPBFFR-UHFFFAOYSA-N aluminium Chemical compound [Al] XAGFODPZIPBFFR-UHFFFAOYSA-N 0.000 description 1
- 229910052782 aluminium Inorganic materials 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 238000012790 confirmation Methods 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 238000000354 decomposition reaction Methods 0.000 description 1
- 230000036541 health Effects 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 230000001788 irregular Effects 0.000 description 1
- 238000002372 labelling Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 230000005541 medical transmission Effects 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000005180 public health Effects 0.000 description 1
- 238000001454 recorded image Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 230000011218 segmentation Effects 0.000 description 1
- 238000000926 separation method Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 210000000707 wrist Anatomy 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01K—MEASURING TEMPERATURE; MEASURING QUANTITY OF HEAT; THERMALLY-SENSITIVE ELEMENTS NOT OTHERWISE PROVIDED FOR
- G01K13/00—Thermometers specially adapted for specific purposes
- G01K13/20—Clinical contact thermometers for use with humans or animals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/46—Multiprogramming arrangements
- G06F9/54—Interprogram communication
- G06F9/544—Buffers; Shared memory; Pipes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
- G06V10/267—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/62—Text, e.g. of license plates, overlay texts or captions on TV images
- G06V20/63—Scene text, e.g. street names
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Software Systems (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The application provides a temperature measurement method and device based on visual guidance, an electronic device and a storage medium, relating to the technical field of automatic temperature measurement. The method comprises the following steps: acquiring, from a camera, a color image and a depth image captured of a detection area, the detection area being the area where a temperature measurement object is located; performing face detection on the color image; when a face image is detected in the color image, acquiring an identity corresponding to the face image; determining the relative position relationship between a target part of the temperature measurement object and the camera based on the face image and the depth image; and, based on the relative position relationship, controlling a robot provided with a temperature measuring device to move to the detection area and measure the temperature of the target part, so as to obtain body temperature data of the temperature measurement object corresponding to the identity. By guiding automatic temperature measurement through visual recognition, the accuracy and efficiency of obtaining temperature data matched to a specific identified person can be improved.
Description
Technical Field
The application relates to the technical field of automatic temperature measurement, in particular to a temperature measurement method and device based on visual guidance, electronic equipment and a storage medium.
Background
With the growth of the global population and increasing population density, the monitoring and control of large-scale epidemics and infectious diseases has become an important public health problem. However, existing epidemic and infectious disease control measures basically rely on patients or health supervision departments actively investigating the condition and reporting it to relevant departments such as disease control centers; such investigation consumes large amounts of manpower and material resources and is also inefficient.
Most infectious diseases cause fever in the human body, so body temperature screening reflects the transmission of infectious diseases and allows them to be monitored effectively. However, most existing body temperature detection is either performed manually and recorded on forms before being filed and reported, or measured automatically by a machine and reported only after the result has been matched to an identity already registered in the system; the result for a person whose data has not been recorded beforehand must therefore still be entered manually, so body temperature detection remains inefficient.
Disclosure of Invention
In view of the above, an object of the embodiments of the present application is to provide a temperature measurement method and device based on visual guidance, an electronic device and a storage medium, so as to solve the problem of low body temperature detection efficiency in the prior art.
The embodiment of the application also provides a temperature measurement method based on visual guidance, which comprises the following steps: acquiring, from a camera, a color image and a depth image captured of a detection area, the detection area being the area where a temperature measurement object is located; performing face detection on the color image; when a face image is detected in the color image, acquiring an identity corresponding to the face image; determining the relative position relationship between a target part of the temperature measurement object and the camera based on the face image and the depth image; and, based on the relative position relationship, controlling a robot provided with a temperature measuring device to move to the detection area and measure the temperature of the target part of the temperature measurement object, so as to obtain body temperature data of the temperature measurement object corresponding to the identity.
In this implementation, the identity of the temperature measurement object is determined by face recognition on the color image, which improves the accuracy of matching the temperature measurement result to the temperature measurement object; the position of the temperature measurement object is determined from the depth image so that the robot can be controlled to move to the target part of the temperature measurement object for accurate temperature measurement, improving both the efficiency and the accuracy of body temperature detection.
Optionally, the performing face detection on the color image includes: and performing face detection on the color image based on an Eigenfaces algorithm, and determining whether the color image has a face image.
In this implementation, face detection is performed on the color image with the Eigenfaces algorithm, which finds the basic components of the distribution of face images from a statistical viewpoint, improving the applicability and accuracy of face recognition when a large number of people need to be screened.
Optionally, the identity is an identification card number, the color image further includes an identification card image, and the obtaining of the identity corresponding to the face image includes: converting the color image into a grayscale image; determining a dynamic threshold of the gray level image, performing connected domain analysis on the gray level image based on a screening area and a rectangular length-width ratio, and identifying the identity card image in the gray level image; cutting out an identification card number area in the identification card image through an ROI cutting technology; and identifying the identity card number of the temperature measurement object based on the identity card number area.
In this implementation, the identification card image is located by connected-domain analysis and the identification card number is recognized with ROI cropping, so that both steps are implemented simply and with fast processing, improving the efficiency of the identity recognition step of body temperature detection.
Optionally, the identifying the id card number of the temperature measurement object based on the id card number region includes: and identifying the identity card number of the temperature measurement object from the identity card number area based on the height and width of a detection character preset by a font model of the identity card number.
In the implementation mode, the identification of the identity card number is carried out based on the font model, so that the identification efficiency of the temperature measurement object and the subsequent identity card number entry efficiency can be improved.
Optionally, the determining a relative position relationship between the target portion of the thermometric object and the camera based on the face image and the depth image includes: determining the position of the target part in the face image based on Eigenfaces algorithm; determining the corresponding position of the target part in the depth image based on the corresponding relation between the face image and the depth image; and determining the relative position relation between the target part of the temperature measurement object and the camera based on the depth image and the corresponding position.
In the implementation mode, the position of the target part in the color image is determined based on the Eigenfaces algorithm, and then the spatial position of the target part in the depth image is determined based on the corresponding relation between the color image and the depth image, so that the relative position relation between the target part and the camera in the actual space is determined, and the accurate positioning of the temperature measurement part can be automatically performed.
Optionally, before the robot with the thermometric device is controlled to move to the detection area to measure the temperature of the target portion of the thermometric object, so as to obtain the body temperature data of the thermometric object corresponding to the identity, the method further includes: and sending the relative position relation to the robot based on Socket communication.
In this implementation, data is exchanged with the robot via Socket communication, which can support a variety of data transmission protocols and improves the transmission compatibility of the body temperature data.
Optionally, after the robot with the thermometric device is controlled to move to the detection area to measure the temperature of the target portion of the thermometric object, so as to obtain the body temperature data of the thermometric object corresponding to the identity, the method further includes: and sending the identity identification and the body temperature data to a background over-the-air download platform through Socket communication, and sending the body temperature data to user equipment corresponding to the identity identification through the background over-the-air download platform.
In this implementation, the body temperature data is sent through the over-the-air download platform to the user equipment corresponding to the identity, which ensures that the user obtains the body temperature data in time and speeds up the information feedback for epidemic and infectious disease management and control.
The embodiment of the present application further provides a temperature measuring device based on vision guide, the device includes: the image acquisition module is used for acquiring a color image and a depth image which are obtained by shooting a detection area by the camera from the camera, wherein the detection area is an area where a temperature measurement object is located; the face detection module is used for carrying out face detection on the color image; the identity determining module is used for determining an identity label corresponding to the face image when the face image is detected in the color image; the position determining module is used for determining the relative position relation between the target part of the temperature measuring object and the camera based on the face image and the depth image; and the body temperature detection module is used for controlling the robot provided with the temperature measuring device to move to the detection area to measure the temperature of the target part of the temperature measuring object based on the relative position relation so as to obtain body temperature data of the temperature measuring object corresponding to the identity mark.
In this implementation, the identity of the temperature measurement object is determined by face recognition on the color image, which improves the accuracy of matching the temperature measurement result to the temperature measurement object; the position of the temperature measurement object is determined from the depth image so that the robot can be controlled to move to the target part of the temperature measurement object for accurate temperature measurement, improving both the efficiency and the accuracy of body temperature detection.
Optionally, the face detection module is specifically configured to: and performing face detection on the color image based on an Eigenfaces algorithm, and determining whether the color image has a face image.
In this implementation, face detection is performed on the color image with the Eigenfaces algorithm, which finds the basic components of the distribution of face images from a statistical viewpoint, improving the applicability and accuracy of face recognition when a large number of people need to be screened.
Optionally, the identity determination module is specifically configured to: converting the color image into a grayscale image; determining a dynamic threshold of the gray level image, performing connected domain analysis on the gray level image based on a screening area and a rectangular length-width ratio, and identifying the identity card image in the gray level image; cutting out an identification card number area in the identification card image through an ROI cutting technology; and identifying the identity card number of the temperature measurement object based on the identity card number area.
In this implementation, the identification card image is located by connected-domain analysis and the identification card number is recognized with ROI cropping, so that both steps are implemented simply and with fast processing, improving the efficiency of the identity recognition step of body temperature detection.
Optionally, the identity determination module is specifically configured to: and identifying the identity card number of the temperature measurement object from the identity card number area based on the height and width of a detection character preset by a font model of the identity card number.
In the implementation mode, the identification of the identity card number is carried out based on the font model, so that the identification efficiency of the temperature measurement object and the subsequent identity card number entry efficiency can be improved.
Optionally, the position determining module is specifically configured to: determining the position of the target part in the face image based on Eigenfaces algorithm; determining the corresponding position of the target part in the depth image based on the corresponding relation between the face image and the depth image; and determining the relative position relation between the target part of the temperature measurement object and the camera based on the depth image and the corresponding position.
In the implementation mode, the position of the target part in the color image is determined based on the Eigenfaces algorithm, and then the spatial position of the target part in the depth image is determined based on the corresponding relation between the color image and the depth image, so that the relative position relation between the target part and the camera in the actual space is determined, and the accurate positioning of the temperature measurement part can be automatically performed.
Optionally, the visual guidance-based thermometry device further comprises: and the data transmission module is used for sending the relative position relation to the robot based on Socket communication.
In this implementation, data is exchanged with the robot via Socket communication, which can support a variety of data transmission protocols and improves the transmission compatibility of the body temperature data.
Optionally, the data transmission module is further configured to: and sending the identity identification and the body temperature data to a background over-the-air download platform through Socket communication, and sending the body temperature data to user equipment corresponding to the identity identification through the background over-the-air download platform.
In this implementation, the body temperature data is sent through the over-the-air download platform to the user equipment corresponding to the identity, which ensures that the user obtains the body temperature data in time and speeds up the information feedback for epidemic and infectious disease management and control.
An embodiment of the present application further provides an electronic device, where the electronic device includes a memory and a processor, where the memory stores program instructions, and the processor executes steps in any one of the above implementation manners when reading and executing the program instructions.
An embodiment of the present application further provides a storage medium, where computer program instructions are stored in the storage medium, and when the computer program instructions are read and executed by a processor, the steps in any one of the above implementation manners are performed.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic flowchart of a temperature measurement method based on visual guidance according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an identity obtaining step according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a temperature measuring device based on visual guidance according to an embodiment of the present disclosure.
Icon: 20-a temperature measuring device based on visual guidance; 21-an image acquisition module; 22-a face detection module; 23-an identity determination module; 24-a location determination module; 25-body temperature detection module.
Detailed Description
The technical solution in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
The applicant has found through research that existing body temperature detection cannot accurately locate the temperature measurement object and generally cannot guarantee that the correct target part is measured, so the accuracy of automatic temperature measurement cannot be ensured; in addition, manually matching identities and manually summarizing and reporting the temperature measurement data leads to low efficiency.
In order to solve the above problems, an embodiment of the present application provides a temperature measurement method based on visual guidance, please refer to fig. 1, and fig. 1 is a schematic flow diagram of the temperature measurement method based on visual guidance provided in the embodiment of the present application. The temperature measurement method based on visual guidance comprises the following specific steps:
step S11: a color image and a depth image obtained by shooting a detection area by a camera are obtained from the camera, and the detection area is an area where a temperature measurement object is located.
The detection area in this embodiment may be an area suitable for shooting that is determined in advance based on the parameters of the camera; a line may be drawn on the ground of the area to prompt the temperature measurement object, or the temperature measurement object may be guided by voice or by a worker. It should be understood that only one temperature measurement object should occupy the detection area at a time, so as to avoid face detection errors and the like affecting the temperature measurement.
Since both a color image and a depth image need to be acquired, a depth camera is required for depth image acquisition in addition to an ordinary camera for capturing the color image.
Optionally, the depth camera for acquiring the depth image in this embodiment may be an Intel RealSense D435. The D435 integrates a vision processor (VPU) and a depth-sensing module in a small, powerful, low-cost, instantly deployable aluminium housing, is equipped with a global shutter and a wide field of view, has a minimum sensing depth of about 0.11 m, a long range (10 m) and a depth resolution of up to 1280 x 720, is well suited to capturing depth data in fast-moving VR or robotic applications, and is highly integrated, making it suitable for mounting on the robot used for temperature measurement in this embodiment.
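By way of illustration, the following is a minimal sketch of capturing an aligned color/depth frame pair from a D435 with the pyrealsense2 SDK; the chosen resolutions, frame rate and the alignment of depth to the color stream are assumptions made for the example, not requirements of the method.

```python
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# 1280x720 depth and color at 30 fps; other D435 modes would work as well
config.enable_stream(rs.stream.depth, 1280, 720, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 1280, 720, rs.format.bgr8, 30)
profile = pipeline.start(config)

# scale that converts the raw uint16 depth values to metres
depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()
align = rs.align(rs.stream.color)  # align depth pixels to the color image

try:
    frames = align.process(pipeline.wait_for_frames())
    color_image = np.asanyarray(frames.get_color_frame().get_data())  # BGR, HxWx3
    depth_image = np.asanyarray(frames.get_depth_frame().get_data())  # uint16, HxW
    depth_m = depth_image.astype(np.float32) * depth_scale            # metres
finally:
    pipeline.stop()
```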
Step S12: and carrying out face detection on the color image.
Optionally, the present embodiment may perform face detection on the color image based on the Eigenfaces algorithm to determine whether a face image exists in the color image. An eigenface in the Eigenfaces algorithm is one of a set of eigenvectors used for the face recognition problem in machine vision, obtained by PCA (Principal Component Analysis). PCA is a commonly used linear dimensionality reduction method that maps high-dimensional data into a low-dimensional space through a linear projection chosen so that the variance of the projected data is maximized, preserving the character of the original data with fewer dimensions. The steps of the Eigenfaces algorithm can be roughly summarized as: (1) collect training face images of different people or of different poses of the same person, and flatten each training face image matrix row by row into a one-dimensional vector, so that each training face is a vector; (2) sum the training faces dimension by dimension and average them to obtain the "average face"; (3) subtract the "average face" vector from each image; (4) compute the covariance matrix; (5) perform eigenvalue decomposition of the covariance matrix to obtain a face feature model; (6) compare the actually captured image to be recognized with the trained face feature model using the eigenfaces, so as to recognize the face.
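A compact numpy sketch of steps (1) through (6) follows; it uses the standard trick of eigen-decomposing the small N x N Gram matrix instead of the full covariance matrix, and the number of retained components is an arbitrary illustrative choice, not a value from this application.

```python
import numpy as np

def train_eigenfaces(faces, num_components=20):
    """faces: (N, H*W) array, one flattened training face per row (step 1)."""
    mean_face = faces.mean(axis=0)                      # step 2: "average face"
    centered = faces - mean_face                        # step 3
    gram = centered @ centered.T                        # step 4 (N x N surrogate)
    eigvals, eigvecs = np.linalg.eigh(gram)             # step 5
    order = np.argsort(eigvals)[::-1][:num_components]
    eigenfaces = centered.T @ eigvecs[:, order]         # back to pixel space
    eigenfaces /= np.linalg.norm(eigenfaces, axis=0)    # unit-length eigenfaces
    return mean_face, eigenfaces

def project(face, mean_face, eigenfaces):
    """Weights of a flattened face in eigenface space."""
    return (face - mean_face) @ eigenfaces

def recognize(face, mean_face, eigenfaces, gallery_weights):
    """Step 6: nearest neighbour among the projected training faces."""
    w = project(face, mean_face, eigenfaces)
    dists = np.linalg.norm(gallery_weights - w, axis=1)
    return int(np.argmin(dists)), float(dists.min())
```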
After a face image is recognized by the Eigenfaces algorithm, this embodiment can draw a rectangular frame around the face position on the color image shown in the display. If the target part to be measured is set to the forehead or the ear, the forehead position can also be marked by the Eigenfaces algorithm. Optionally, to ensure the recognition accuracy of the face image, the next step may be deferred until a face is recognized.
Step S13: and when the face image is detected in the color image, acquiring an identity corresponding to the face image.
Optionally, the identity in this embodiment may be an identification card number, and the identification card number of the temperature measurement object may be obtained from the color image by instructing the measurement object, via voice, video or signage prompts, to face the camera in a preset posture while the color image is being captured.
As an alternative implementation manner, please refer to fig. 2, and fig. 2 is a schematic flowchart of an identity obtaining step provided in an embodiment of the present application. The identity obtaining step may be as follows:
step S131: the color image is converted into a grayscale image.
A color image is an image in which each pixel is made up of R, G and B components, each described by its own gray level. A grayscale image, also known as a grayscale digital image, is an image in which each pixel has only one sample value. Such images are typically displayed in shades of gray from darkest black to brightest white, although in theory the sample could represent shades of any color, or even different colors at different brightnesses.
Step S132: and determining a dynamic threshold of the gray level image, performing connected domain analysis on the gray level image based on the screening area and the rectangular length-width ratio, and identifying the identity card image in the gray level image.
The dynamic threshold is a concept from dynamic threshold segmentation: starting from a global threshold, the threshold is adjusted by a weighted value of the difference between the neighborhood average gray level and the global threshold. Because an identification card has a fixed aspect ratio and size, connected-domain analysis based on area and aspect ratio can identify the identification card image. Blob analysis performs connected-component extraction and labelling on a binary image after foreground/background separation; it can provide a machine vision application with the number, location, shape and orientation of blobs in an image, as well as the topology between related blobs.
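The sketch below shows one way such a dynamic threshold plus area/aspect-ratio filter could be written with OpenCV; the grayscale conversion of step S131 is included, and the block size, area fraction and aspect-ratio window are illustrative assumptions rather than values taken from this application.

```python
import cv2

def find_id_card(color_image):
    """Return the bounding box (x, y, w, h) of a candidate ID-card region."""
    gray = cv2.cvtColor(color_image, cv2.COLOR_BGR2GRAY)       # step S131
    # dynamic threshold: each pixel is compared with its local neighbourhood mean
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY, 51, 5)
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    img_area = gray.shape[0] * gray.shape[1]
    for i in range(1, num):                     # label 0 is the background
        x, y, w, h, area = stats[i]
        aspect = w / float(h) if h else 0.0
        # a card is a large, roughly 1.58:1 rectangle (85.6 mm x 54 mm)
        if area > 0.05 * img_area and 1.3 < aspect < 1.9:
            return x, y, w, h
    return None
```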
Step S133: and cutting out the ID card number area in the ID card image by the ROI cutting technology.
In machine vision and image processing, a region to be processed that is outlined on the image in the form of a box, circle, ellipse, irregular polygon or the like is called a region of interest (ROI). This embodiment extracts the ROI of the identification number area from the image based on ROI cropping.
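As a sketch of the ROI-cropping step, the number strip can be cut out of the detected card with plain array slicing; the relative offsets below (the lower band of a Chinese second-generation ID card front) are illustrative assumptions.

```python
def crop_number_roi(gray, card_box):
    """Cut the ID-number strip out of the detected card region."""
    x, y, w, h = card_box
    card = gray[y:y + h, x:x + w]
    # the 18-digit number sits in the lower band of the card front
    return card[int(0.78 * h):int(0.93 * h), int(0.30 * w):int(0.95 * w)]
```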
Step S134: and identifying the identity card number of the temperature measurement object based on the identity card number area.
Specifically, the step S134 may specifically be to identify the identification number of the temperature measurement object from the identification number region based on the height and width of the detection character preset by the font model of the identification number.
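A minimal sketch of that step: characters are segmented by size, where the expected character width and height (char_w, char_h) supplied by the font model are assumed inputs, and the resulting patches could then be matched against per-character templates of the ID-card font.

```python
import cv2

def segment_characters(number_roi, char_w, char_h):
    """Return per-character patches from the number strip, left to right."""
    _, binary = cv2.threshold(number_roi, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours]
    # keep boxes close to the expected character size given by the font model
    boxes = sorted(b for b in boxes
                   if 0.6 * char_w < b[2] < 1.6 * char_w
                   and 0.6 * char_h < b[3] < 1.4 * char_h)
    return [binary[y:y + h, x:x + w] for (x, y, w, h) in boxes]
```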
In other embodiments, the identity identifier may also be the face image itself, a specially designated identity number, or the like. For example, when the identity is identified as a face image, the face image which is already recorded and the identity of the person corresponding to the face image exist in the database, the shot face image is identified and determined to be matched with the recorded image, and then identity confirmation is completed.
Step S14: and determining the relative position relation between the target part of the temperature measurement object and the camera based on the face image and the depth image.
In the depth camera used in this embodiment, the infrared dot-matrix projector in the middle emits infrared structured light and the left and right infrared cameras measure depth based on the triangulation principle; different colors in the resulting depth map represent different distances, and the relative position relationship between the target part and the depth camera can then be obtained from the correspondence between the color image and the depth image.
It should be appreciated that determining the position of the target part from the depth image also requires calculation with reference to the intrinsic parameters of the depth camera.
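For example, under the pinhole model the pixel at (u, v) with depth d (in metres) can be back-projected into camera coordinates as below; pyrealsense2 also offers rs2_deproject_pixel_to_point for the same purpose. The sketch assumes fx, fy, cx, cy are the intrinsics of the stream the pixel comes from.

```python
def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project an image pixel into 3D camera coordinates (metres)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m
```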
Alternatively, the target part may be any part at which temperature can be measured, such as the forehead, the neck or the wrist. In this embodiment, the forehead is taken as the target part. Because the target part is located before the temperature is measured, the method provided by this embodiment can accurately measure the temperature at target parts of different heights for temperature measurement objects of different heights.
Specifically, the specific step of determining the relative position relationship between the target region and the camera in this embodiment may include:
step S141: and determining the position of the target part in the face image based on Eigenfaces algorithm.
The method for recognizing the target part based on the Eigenfaces algorithm is similar to the method for recognizing the face based on the Eigenfaces algorithm, and the description is omitted in this embodiment.
Step S142: and determining the corresponding position of the target part in the depth image based on the corresponding relation between the face image and the depth image.
Optionally, the correspondence between the face image and the depth image may be established by an image-registration algorithm between the planar image and the depth image, or may be determined from the parameter association between the camera that captures the color image and the depth camera.
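When the depth stream has already been aligned to the color stream, as in the capture sketch earlier, the correspondence is simply pixel-for-pixel, so the depth at the detected forehead pixel can be read directly; this small sketch assumes a pyrealsense2 depth frame.

```python
def depth_at_color_pixel(depth_frame, u, v):
    """Distance in metres at color pixel (u, v) of an aligned depth frame."""
    return depth_frame.get_distance(int(u), int(v))
```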
Step S143: and determining the relative position relation between the target part of the temperature measurement object and the camera based on the depth image and the corresponding position.
The corresponding position in the depth image can be obtained by performing depth mapping based on OpenCV.
Step S15: and based on the relative position relation, controlling the robot provided with the temperature measuring device to move to a detection area to measure the temperature of the target part of the temperature measuring object so as to obtain the body temperature data of the temperature measuring object corresponding to the identity identification.
Optionally, the robot in this embodiment may be any robot that works together with the depth camera and the ordinary two-dimensional camera, is equipped with the temperature measuring device, and can drive the temperature measuring device to the detection area. Specifically, the robot may be a robot-arm type collaborative robot, a movable vehicle type robot, or the like.
The method in this embodiment may be executed by a processor connected to the camera, and then Socket communication may be used to send the relative position relationship to the robot, so as to control the robot to move to the detection area to measure the temperature of the target portion of the temperature measurement object.
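A minimal Socket-communication sketch is given below; the JSON, newline-delimited message layout and the host/port arguments are assumptions made for illustration, since any framing the robot controller understands would serve equally well.

```python
import json
import socket

def send_target_position(robot_host, robot_port, identity, xyz):
    """Send the identity and the camera-frame target position to the robot."""
    message = json.dumps({"id": identity,
                          "x": xyz[0], "y": xyz[1], "z": xyz[2]}) + "\n"
    with socket.create_connection((robot_host, robot_port), timeout=5.0) as sock:
        sock.sendall(message.encode("utf-8"))
        return sock.recv(64)        # optional acknowledgement from the robot
```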
Optionally, the temperature measuring device in this embodiment may be a temperature measuring gun, an electronic thermometer, or the like capable of quickly acquiring temperature data of a temperature measuring object. Optionally, the temperature measuring gun in this embodiment may be a serial temperature measuring gun, and the serial temperature measuring gun may directly transmit the body temperature data of the object to be measured through a data connection line or a wireless network.
Further, after the body temperature data is obtained, in order to match and report the body temperature data of the temperature measurement object, this embodiment may also send the identity and the body temperature data to the relevant database and to a background over-the-air download platform through Socket communication, and the background over-the-air download platform then sends the body temperature data to the user equipment corresponding to the identity. Over-the-air (OTA) technology manages the data of a mobile terminal device and its SIM card remotely over the air interface of mobile communication.
In order to cooperate with the temperature measuring method based on visual guidance, the embodiment of the application further provides a temperature measuring device 20 based on visual guidance.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a temperature measuring device based on visual guidance according to an embodiment of the present disclosure.
The visual guidance-based temperature measuring device 20 includes:
the image acquisition module 21 is configured to acquire a color image and a depth image obtained by shooting a detection area by a camera, where the detection area is an area where a temperature measurement object is located;
a face detection module 22, configured to perform face detection on the color image;
the identity determining module 23 is configured to determine an identity identifier corresponding to the face image when the face image is detected in the color image;
the position determining module 24 is used for determining the relative position relationship between the target part of the temperature measuring object and the camera based on the face image and the depth image;
and the body temperature detection module 25 is used for controlling the robot provided with the temperature measuring device to move to a detection area to measure the temperature of the target part of the temperature measuring object based on the relative position relationship so as to obtain body temperature data of the temperature measuring object corresponding to the identity identification.
Optionally, the face detection module 22 is specifically configured to: and performing face detection on the color image based on an Eigenfaces algorithm, and determining whether the color image has a face image.
Optionally, the identity determining module 23 is specifically configured to: converting the color image into a gray image; determining a dynamic threshold of the gray level image, performing connected domain analysis on the gray level image based on the screening area and the rectangular length-width ratio, and identifying an identity card image in the gray level image; cutting out an identification card number area in the identification card image through an ROI cutting technology; and identifying the identity card number of the temperature measurement object based on the identity card number area.
Optionally, the identity determining module 23 is specifically configured to: and identifying the identity card number of the temperature measurement object from the identity card number area based on the height and width of the detection character preset by the font model of the identity card number.
Optionally, the position determining module 24 is specifically configured to: determining the position of a target part in the face image based on an Eigenfaces algorithm; determining the corresponding position of the target part in the depth image based on the corresponding relation between the face image and the depth image; and determining the relative position relation between the target part of the temperature measurement object and the camera based on the depth image and the corresponding position.
Optionally, the temperature measuring device 20 based on visual guidance further comprises: and the data transmission module is used for sending the relative position relation to the robot based on Socket communication.
Optionally, the data transmission module is further configured to: and sending the identity identification and the body temperature data to a background over-the-air download platform through Socket communication, and sending the body temperature data to user equipment corresponding to the identity identification through the background over-the-air download platform.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the memory stores program instructions, and when the processor reads and executes the program instructions, the processor executes steps in any one of the methods of the visual guidance-based temperature measurement method provided in this embodiment.
It should be understood that the electronic device may be a Personal Computer (PC), a tablet PC, a smart phone, a Personal Digital Assistant (PDA), or other electronic device having a logical computing function.
The embodiment of the application also provides a readable storage medium, wherein computer program instructions are stored in the readable storage medium, and when the computer program instructions are read and operated by a processor, the steps in the temperature measurement method based on the visual guidance are executed.
In summary, the embodiment of the present application further provides a temperature measurement method and apparatus based on visual guidance, an electronic device, and a storage medium, where the method includes: acquiring a color image and a depth image obtained by shooting a detection area by a camera from the camera, wherein the detection area is an area where a temperature measurement object is located; carrying out face detection on the color image; when a face image is detected in the color image, acquiring an identity corresponding to the face image; determining the relative position relation between the target part of the temperature measurement object and the camera based on the face image and the depth image; and controlling the robot provided with the temperature measuring device to move to the detection area to measure the temperature of the target part of the temperature measuring object based on the relative position relation so as to obtain the body temperature data of the temperature measuring object corresponding to the identity mark.
In this implementation, the identity of the temperature measurement object is determined by face recognition on the color image, which improves the accuracy of matching the temperature measurement result to the temperature measurement object; the position of the temperature measurement object is determined from the depth image so that the robot can be controlled to move to the target part of the temperature measurement object for accurate temperature measurement, improving both the efficiency and the accuracy of body temperature detection.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. The apparatus embodiments described above are merely illustrative, and for example, the block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices according to various embodiments of the present application. In this regard, each block in the block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams, and combinations of blocks in the block diagrams, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Therefore, the present embodiment further provides a readable storage medium, in which computer program instructions are stored, and when the computer program instructions are read and executed by a processor, the steps of any of the methods described above are performed. Based on such understanding, the technical solution of the present application, or the portion thereof that substantially contributes to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Claims (10)
1. A method for thermometry based on visual guidance, the method comprising:
acquiring a color image and a depth image obtained by shooting a detection area by a camera from the camera, wherein the detection area is an area where a temperature measurement object is located;
carrying out face detection on the color image;
when a face image is detected in the color image, acquiring an identity corresponding to the face image;
determining the relative position relation between the target part of the temperature measurement object and the camera based on the face image and the depth image;
and controlling the robot provided with the temperature measuring device to move to the detection area to measure the temperature of the target part of the temperature measuring object based on the relative position relation so as to obtain the body temperature data of the temperature measuring object corresponding to the identity mark.
2. The method according to claim 1, wherein the performing face detection on the color image comprises:
and performing face detection on the color image based on an Eigenfaces algorithm, and determining whether the color image has a face image.
3. The method of claim 1, wherein the identification is an identification number, the color image further includes an identification image, and the obtaining the identification corresponding to the face image includes:
converting the color image into a grayscale image;
determining a dynamic threshold of the gray level image, performing connected domain analysis on the gray level image based on a screening area and a rectangular length-width ratio, and identifying the identity card image in the gray level image;
cutting out an identification card number area in the identification card image through an ROI cutting technology;
and identifying the identity card number of the temperature measurement object based on the identity card number area.
4. The method of claim 3, wherein identifying the identification number of the temperature measurement object based on the identification number region comprises:
and identifying the identity card number of the temperature measurement object from the identity card number area based on the height and width of a detection character preset by a font model of the identity card number.
5. The method of claim 2, wherein the determining the relative position relationship between the target part of the thermometric object and the camera based on the face image and the depth image comprises:
determining the position of the target part in the face image based on Eigenfaces algorithm;
determining the corresponding position of the target part in the depth image based on the corresponding relation between the face image and the depth image;
and determining the relative position relation between the target part of the temperature measurement object and the camera based on the depth image and the corresponding position.
6. The method according to claim 1, wherein before the robot with the thermometric device is controlled to move to the detection area to measure the temperature of the target part of the thermometric object so as to obtain the body temperature data of the thermometric object corresponding to the identification mark, the method further comprises:
and sending the relative position relation to the robot based on Socket communication.
7. The method according to claim 1 or 6, wherein after the robot with the thermometric device is controlled to move to the detection area to measure the temperature of the target part of the thermometric object so as to obtain the body temperature data of the thermometric object corresponding to the identity mark, the method further comprises:
and sending the identity identification and the body temperature data to a background over-the-air download platform through Socket communication, and sending the body temperature data to user equipment corresponding to the identity identification through the background over-the-air download platform.
8. A vision-based thermometry device, the device comprising:
the image acquisition module is used for acquiring a color image and a depth image which are obtained by shooting a detection area by the camera from the camera, wherein the detection area is an area where a temperature measurement object is located;
the face detection module is used for carrying out face detection on the color image;
the identity determining module is used for determining an identity label corresponding to the face image when the face image is detected in the color image;
the position determining module is used for determining the relative position relation between the target part of the temperature measuring object and the camera based on the face image and the depth image;
and the body temperature detection module is used for controlling the robot provided with the temperature measuring device to move to the detection area to measure the temperature of the target part of the temperature measuring object based on the relative position relation so as to obtain body temperature data of the temperature measuring object corresponding to the identity mark.
9. An electronic device comprising a memory having stored therein program instructions and a processor that, when executed, performs the steps of the method of any of claims 1-7.
10. A storage medium having stored thereon computer program instructions for executing the steps of the method according to any one of claims 1 to 7 when executed by a processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010428280.1A CN111583333A (en) | 2020-05-19 | 2020-05-19 | Temperature measurement method and device based on visual guidance, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010428280.1A CN111583333A (en) | 2020-05-19 | 2020-05-19 | Temperature measurement method and device based on visual guidance, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111583333A true CN111583333A (en) | 2020-08-25 |
Family
ID=72126944
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010428280.1A Pending CN111583333A (en) | 2020-05-19 | 2020-05-19 | Temperature measurement method and device based on visual guidance, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111583333A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112364740A (en) * | 2020-10-30 | 2021-02-12 | 交控科技股份有限公司 | Unmanned machine room monitoring method and system based on computer vision |
CN113785783A (en) * | 2021-08-26 | 2021-12-14 | 北京市农林科学院智能装备技术研究中心 | Livestock grouping system and method |
CN116030562A (en) * | 2022-11-17 | 2023-04-28 | 北京声智科技有限公司 | Data processing method, device, equipment and medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1542416A (en) * | 2003-05-02 | 2004-11-03 | 北京行者华视网络系统集成技术有限公 | Temperature measuring method and apparatus thereof |
CN109190539A (en) * | 2018-08-24 | 2019-01-11 | 阿里巴巴集团控股有限公司 | Face identification method and device |
CN208926337U (en) * | 2017-07-20 | 2019-06-04 | 歌尔股份有限公司 | A kind of temperature taking device |
CN110348326A (en) * | 2019-06-21 | 2019-10-18 | 安庆师范大学 | The family health care information processing method of the identification of identity-based card and the access of more equipment |
KR20200002504A (en) * | 2018-06-29 | 2020-01-08 | 다빈치온 주식회사 | Apparatus for Measuring Temperature of Things Using Thermal Image Sensor and a Method Thereof |
CN110916620A (en) * | 2019-10-30 | 2020-03-27 | 深圳市华盛昌科技实业股份有限公司 | Body temperature measuring method and terminal |
- 2020-05-19: application CN202010428280.1A filed in CN; published as CN111583333A (en), legal status: active, Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1542416A (en) * | 2003-05-02 | 2004-11-03 | 北京行者华视网络系统集成技术有限公 | Temperature measuring method and apparatus thereof |
CN208926337U (en) * | 2017-07-20 | 2019-06-04 | 歌尔股份有限公司 | A kind of temperature taking device |
KR20200002504A (en) * | 2018-06-29 | 2020-01-08 | 다빈치온 주식회사 | Apparatus for Measuring Temperature of Things Using Thermal Image Sensor and a Method Thereof |
CN109190539A (en) * | 2018-08-24 | 2019-01-11 | 阿里巴巴集团控股有限公司 | Face identification method and device |
CN110348326A (en) * | 2019-06-21 | 2019-10-18 | 安庆师范大学 | The family health care information processing method of the identification of identity-based card and the access of more equipment |
CN110916620A (en) * | 2019-10-30 | 2020-03-27 | 深圳市华盛昌科技实业股份有限公司 | Body temperature measuring method and terminal |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112364740A (en) * | 2020-10-30 | 2021-02-12 | 交控科技股份有限公司 | Unmanned machine room monitoring method and system based on computer vision |
CN112364740B (en) * | 2020-10-30 | 2024-04-19 | 交控科技股份有限公司 | Unmanned aerial vehicle room monitoring method and system based on computer vision |
CN113785783A (en) * | 2021-08-26 | 2021-12-14 | 北京市农林科学院智能装备技术研究中心 | Livestock grouping system and method |
CN116030562A (en) * | 2022-11-17 | 2023-04-28 | 北京声智科技有限公司 | Data processing method, device, equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11627726B2 (en) | System and method of estimating livestock weight | |
JP6441932B2 (en) | Eye level scanner and display pointer tracking | |
CN111583333A (en) | Temperature measurement method and device based on visual guidance, electronic equipment and storage medium | |
WO2020046960A1 (en) | System and method for optimizing damage detection results | |
CN111639629B (en) | Pig weight measurement method and device based on image processing and storage medium | |
US10515459B2 (en) | Image processing apparatus for processing images captured by a plurality of imaging units, image processing method, and storage medium storing program therefor | |
US20210216758A1 (en) | Animal information management system and animal information management method | |
JP2010157093A (en) | Motion estimation device and program | |
CN113762229A (en) | Intelligent identification method and system for building equipment in building site | |
CN112525355A (en) | Image processing method, device and equipment | |
JP5047658B2 (en) | Camera device | |
CN111368698A (en) | Subject recognition method, subject recognition device, electronic device, and medium | |
KR102597692B1 (en) | Method, apparatus, and computer program for measuring volume of objects by using image | |
CN114463779A (en) | Smoking identification method, device, equipment and storage medium | |
CN114743224B (en) | Animal husbandry livestock body temperature monitoring method and system based on computer vision | |
JP6567638B2 (en) | Noseprint matching system, noseprint matching method, and noseprint matching program | |
KR20210044127A (en) | Visual range measurement and alarm system based on video analysis and method thereof | |
JP6893812B2 (en) | Object detector | |
CN112883809B (en) | Target detection method, device, equipment and medium | |
CN112507783A (en) | Mask face detection, identification, tracking and temperature measurement method based on attention mechanism | |
US20220354091A1 (en) | Animal information management system and animal information management method | |
CN111931674B (en) | Article identification management method, device, server and readable storage medium | |
US20240197193A1 (en) | Livestock heart rate monitoring | |
US20180313696A1 (en) | Temperature Monitoring Systems and Processes | |
CN116189023B (en) | Method and system for realizing environment emergency monitoring based on unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20200825 |