
CN111985377A - Temperature measuring method and device, electronic equipment and storage medium

Info

Publication number
CN111985377A
Authority
CN
China
Prior art keywords
temperature
image
detected
processed
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010813228.8A
Other languages
Chinese (zh)
Inventor
黄王爵
林佩材
徐肇虎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202010813228.8A
Publication of CN111985377A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G06V40/162 - Detection; Localisation; Normalisation using pixel segmentation or colour matching
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 - Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radiation Pyrometers (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The application discloses a temperature measuring method and device, electronic equipment and a storage medium. The method comprises the following steps: acquiring an image to be processed; under the condition that the image to be processed contains a face region and the object to be detected corresponding to the face region is a living body, acquiring temperature data of the image to be processed; the temperature data carries temperature information of object points corresponding to pixel points in the image to be processed; and obtaining the temperature of the object to be detected according to the temperature data.

Description

Temperature measuring method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer vision technologies, and in particular, to a temperature measuring method and apparatus, an electronic device, and a storage medium.
Background
At present, human body temperature is usually measured manually: a worker holds a temperature measuring gun and measures the temperature of each object to be detected. This approach not only incurs a high labor cost and low detection efficiency, but also tends to bring people too close together, which poses a considerable safety hazard when respiratory infectious diseases are prevalent.
Disclosure of Invention
The application provides a temperature measuring method and device, electronic equipment and a storage medium.
In a first aspect, a method for measuring temperature is provided, the method comprising:
acquiring an image to be processed;
under the condition that the image to be processed contains a face region and an object to be detected corresponding to the face region is a living body, acquiring temperature data of the image to be processed; the temperature data carries temperature information of object points corresponding to pixel points in the image to be processed;
and obtaining the temperature of the object to be detected according to the temperature data.
In this aspect, the temperature measuring device performs face detection processing on the image to be processed to determine whether it contains a face region, and thus whether it contains an object to be detected. The device acquires temperature data only after determining that the object to be detected is a living body, and obtains the temperature of the object to be detected from that data. This reduces the probability of measuring the temperature of an inanimate object (that is, something that is not a person) and of a non-living body, and thereby reduces the amount of data to be processed. In addition, because the temperature is obtained from the temperature data rather than by hand, contact between workers and the persons being measured is avoided and detection efficiency is improved.
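The flow described in this aspect can be summarized in a short Python sketch. It is only an illustration of the conditional logic (face detected, then liveness confirmed, then temperature read); the helper callables face_detector, liveness_checker, and temperature_reader are hypothetical placeholders and are not defined by this application.

```python
from typing import Optional

def measure_temperature(image, face_detector, liveness_checker,
                        temperature_reader) -> Optional[float]:
    """Illustrative end-to-end flow: detect a face region, confirm the object
    to be detected is a living body, and only then read its temperature."""
    face_region = face_detector(image)              # None if no face region found
    if face_region is None:
        return None                                 # no object to be detected
    if not liveness_checker(image, face_region):
        return None                                 # skip photos, screens, statues
    temperatures = temperature_reader(face_region)  # per-pixel temperature info
    if not temperatures:
        return None
    # Reduce the per-pixel readings to one temperature for the detected object.
    return float(sum(temperatures) / len(temperatures))
```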
With reference to any one of the embodiments of the present application, the temperature data includes a temperature thermodynamic diagram and a temperature file, the temperature file carries temperature information of object points corresponding to pixel points in the temperature thermodynamic diagram, and obtaining the temperature of the object to be detected according to the temperature data includes:
determining a pixel point region corresponding to the face region from the temperature thermodynamic diagram to obtain a first target pixel point region;
and determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region according to the temperature file, and obtaining the temperature of the face region as the temperature of the object to be detected.
In combination with any embodiment of the present application, the temperature thermodynamic diagram is acquired by a thermal imaging device, and the image to be processed is acquired by an RGB imaging device;
the determining a pixel point region corresponding to the face region from the temperature thermodynamic diagram to obtain a first target pixel point region includes:
performing first face recognition processing on the image to be processed to obtain a first position of the face region in the image to be processed;
acquiring a first pose of the thermal imaging device in the process of acquiring the temperature thermodynamic diagram and a second pose of the RGB imaging device in the process of acquiring the image to be processed;
obtaining a pose conversion relation between the thermal imaging equipment and the RGB imaging equipment according to the first pose and the second pose, wherein the pose conversion relation is used as a coordinate system conversion relation between a pixel coordinate system of the temperature thermodynamic diagram and a pixel coordinate system of the image to be processed;
and converting the first position according to the coordinate system conversion relation, and determining the first target pixel point region from the temperature thermodynamic diagram.
In combination with any embodiment of the present application, the method further comprises:
and carrying out accessory detection processing on the image to be processed, and determining whether the object to be detected wears a preset accessory or not to obtain a detection result, wherein the preset accessory comprises a mask.
With reference to any embodiment of the present application, the performing accessory detection processing on the image to be processed to determine whether the object to be detected wears a predetermined accessory, and obtaining a detection result includes:
performing feature extraction processing on the image to be processed to obtain first feature data; the first characteristic data carries information whether the object to be detected wears a mask or not;
and obtaining the detection result according to the first characteristic data.
With reference to any embodiment of the present application, in a case that the detection result is that the subject to be detected wears the mask, the method further includes:
determining that the mask is worn correctly by the object to be detected under the condition that the key point of the nostril of the object to be detected and the key point of the mouth of the object to be detected are both located in the pixel point area covered by the facial mask; the facial mask is a mask worn by the object to be detected;
and under the condition that the key point of the nostril of the object to be detected and/or the key point of the mouth of the object to be detected are/is located outside the pixel point area covered by the facial mask, determining that the mask is not worn correctly by the object to be detected.
With reference to any embodiment of the present application, the temperature of the object to be detected is obtained by the following steps: according to the temperature file, determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region to obtain the temperature of the face region as the temperature of the object to be detected, wherein the at least one pixel point belongs to a second target pixel point region in the first target pixel point region under the condition that the mask is worn by the object to be detected, and the second target pixel point region is a pixel point region corresponding to the forehead region in the face region.
In combination with any embodiment of the present application, before determining, according to the temperature file, a temperature of an object point corresponding to at least one pixel point in the first target pixel point region to obtain the temperature of the face region, the method further includes:
performing second face recognition processing on the image to be processed to obtain the position of at least one face key point in the face region;
determining a second position of the forehead area in the image to be processed according to the position of the at least one face key point; the forehead area belongs to the face area;
and determining the second target pixel point region from the temperature thermodynamic diagram according to the coordinate system conversion relation and the second position.
In combination with any embodiment of the present application, the method further comprises:
carrying out face comparison processing on the image to be processed and images in a face image library to obtain an image with the similarity exceeding a face similarity threshold value with the image to be processed as an identity image;
and determining the object to be detected as a person corresponding to the identity image.
In combination with any embodiment of the present application, the temperature measurement method is applied to a first temperature measurement terminal, the first temperature measurement terminal belongs to a temperature measurement system, the temperature measurement system further includes a temperature measurement background device, and the method further includes:
the temperature measurement background equipment receives the temperature of the object to be detected sent by the first temperature measurement terminal, and stores the position of the first temperature measurement terminal as the measurement position of the temperature of the object to be detected in a database;
the temperature measurement background equipment acquires the position information of a monitoring area;
the temperature measurement background equipment uses the position information of the monitoring area to retrieve the database, and obtains the abnormal temperature of the measurement position in the monitoring area as the monitoring temperature; the abnormal temperature is a temperature exceeding a temperature safety threshold;
the temperature measurement background equipment obtains the safety level of the monitoring area according to the quantity of the monitoring temperatures and the mapping relation; the mapping relationship is a mapping relationship between the number of temperature data exceeding a temperature safety threshold and a safety level.
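As an illustration of how the temperature measurement background equipment might turn the number of monitoring temperatures into a safety level, the Python sketch below uses an assumed mapping relationship; the thresholds, level names, and the 37.3 °C safety threshold are illustrative values, not values specified by this application.

```python
TEMPERATURE_SAFETY_THRESHOLD = 37.3   # assumed temperature safety threshold, in Celsius

def safety_level(abnormal_count: int) -> str:
    """Map the number of abnormal temperatures measured in a monitoring area
    to a safety level, using an assumed mapping relationship."""
    if abnormal_count == 0:
        return "normal"
    if abnormal_count <= 2:
        return "watch"
    if abnormal_count <= 5:
        return "warning"
    return "high risk"

# Example: grade a monitoring area from the temperatures retrieved for it.
retrieved_temperatures = [36.5, 37.8, 36.9, 38.1]   # measured at positions in the area
abnormal = sum(1 for t in retrieved_temperatures if t > TEMPERATURE_SAFETY_THRESHOLD)
print(safety_level(abnormal))   # two abnormal temperatures -> "watch"
```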
In combination with any embodiment of the present application, the temperature measurement system further includes a second temperature measurement terminal, and a position of the second temperature measurement terminal is different from a position of the first temperature measurement terminal; the database of the temperature measurement background equipment comprises images sent by the first temperature measurement terminal and images sent by the second temperature measurement terminal; the method further comprises the following steps:
under the condition that the temperature of the object to be detected exceeds the temperature safety threshold, the temperature measurement background equipment uses the image to be processed sent by the first temperature measurement terminal to search the database, and an image containing the object to be detected is obtained and serves as a monitoring image;
and the temperature measurement background equipment obtains the track of the object to be detected according to the position of the temperature measurement terminal which sends the monitoring image.
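A minimal sketch of the trajectory retrieval described above follows; the record layout, the face-similarity callable, and the 0.8 threshold are assumptions made for illustration only.

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence

@dataclass
class StoredImage:
    terminal_position: str          # position of the terminal that sent the image
    face_features: Sequence[float]  # face features extracted from the stored image

def trajectory(query_features: Sequence[float],
               database: List[StoredImage],
               similarity: Callable[[Sequence[float], Sequence[float]], float],
               threshold: float = 0.8) -> List[str]:
    """Return the positions of the terminals whose stored images contain the
    object to be detected, i.e. a coarse trace of where it was captured."""
    return [record.terminal_position
            for record in database
            if similarity(query_features, record.face_features) > threshold]
```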
In a second aspect, a thermometric apparatus is provided, the thermometric apparatus comprising:
the first acquisition unit is used for acquiring an image to be processed;
the second acquisition unit is used for acquiring the temperature data of the image to be processed under the condition that the image to be processed contains a face area and an object to be detected corresponding to the face area is a living body; the temperature data carries temperature information of object points corresponding to pixel points in the image to be processed;
and the processing unit is used for obtaining the temperature of the object to be detected according to the temperature data.
In combination with any embodiment of the present application, the temperature data includes a temperature thermodynamic diagram and a temperature file, the temperature file carries temperature information of object points corresponding to pixel points in the temperature thermodynamic diagram, and the processing unit is configured to:
determining a pixel point region corresponding to the face region from the temperature thermodynamic diagram to obtain a first target pixel point region;
and determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region according to the temperature file, and obtaining the temperature of the face region as the temperature of the object to be detected.
In combination with any embodiment of the present application, the temperature thermodynamic diagram is acquired by a thermal imaging device, and the image to be processed is acquired by an RGB imaging device;
the processing unit is configured to:
performing first face recognition processing on the image to be processed to obtain a first position of the face region in the image to be processed;
acquiring a first pose of the thermal imaging device in the process of acquiring the temperature thermodynamic diagram and a second pose of the RGB imaging device in the process of acquiring the image to be processed;
obtaining a pose conversion relation between the thermal imaging equipment and the RGB imaging equipment according to the first pose and the second pose, wherein the pose conversion relation is used as a coordinate system conversion relation between a pixel coordinate system of the temperature thermodynamic diagram and a pixel coordinate system of the image to be processed;
and converting the first position according to the coordinate system conversion relation, and determining the first target pixel point region from the temperature thermodynamic diagram.
In combination with any embodiment of the present application, the processing unit is further configured to:
and carrying out accessory detection processing on the image to be processed, and determining whether the object to be detected wears a preset accessory or not to obtain a detection result, wherein the preset accessory comprises a mask.
In combination with any embodiment of the present application, the processing unit is further configured to:
performing feature extraction processing on the image to be processed to obtain first feature data; the first characteristic data carries information whether the object to be detected wears a mask or not;
and obtaining the detection result according to the first characteristic data.
With reference to any embodiment of the present application, in a case that the detection result indicates that the mask is worn by the object to be detected, the processing unit is further configured to:
determining that the mask is worn correctly by the object to be detected under the condition that the key point of the nostril of the object to be detected and the key point of the mouth of the object to be detected are both located in the pixel point area covered by the facial mask; the facial mask is a mask worn by the object to be detected;
and under the condition that the key point of the nostril of the object to be detected and/or the key point of the mouth of the object to be detected are/is located outside the pixel point area covered by the facial mask, determining that the mask is not worn correctly by the object to be detected.
With reference to any embodiment of the present application, the temperature of the object to be detected is obtained by the following steps: according to the temperature file, determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region to obtain the temperature of the face region as the temperature of the object to be detected, wherein the at least one pixel point belongs to a second target pixel point region in the first target pixel point region under the condition that the mask is worn by the object to be detected, and the second target pixel point region is a pixel point region corresponding to the forehead region in the face region.
In combination with any embodiment of the present application, the processing unit is further configured to:
determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region according to the temperature file, and performing second face recognition processing on the image to be processed before the temperature of the face region is obtained to obtain the position of at least one face key point in the face region;
determining a second position of the forehead area in the image to be processed according to the position of the at least one face key point; the forehead area belongs to the face area;
and determining the second target pixel point region from the temperature thermodynamic diagram according to the coordinate system conversion relation and the second position.
In combination with any embodiment of the present application, the processing unit is further configured to:
carrying out face comparison processing on the image to be processed and images in a face image library to obtain an image with the similarity exceeding a face similarity threshold value with the image to be processed as an identity image;
and determining the object to be detected as a person corresponding to the identity image.
In combination with any embodiment of the present application, the temperature measuring device belongs to a temperature measuring system, the temperature measuring system further includes a temperature measuring background device, and the method further includes:
the temperature measurement background equipment receives the temperature of the object to be detected sent by the temperature measurement device, and stores the position of the temperature measurement device as the measurement position of the temperature of the object to be detected in a database;
the temperature measurement background equipment acquires the position information of a monitoring area;
the temperature measurement background equipment uses the position information of the monitoring area to retrieve the database, and obtains the abnormal temperature of the measurement position in the monitoring area as the monitoring temperature; the abnormal temperature is a temperature exceeding a temperature safety threshold;
the temperature measurement background equipment obtains the safety level of the monitoring area according to the quantity of the monitoring temperatures and the mapping relation; the mapping relationship is a mapping relationship between the number of temperature data exceeding a temperature safety threshold and a safety level.
In combination with any embodiment of the present application, the temperature measurement system further includes a second temperature measurement terminal, and a position of the second temperature measurement terminal is different from a position of the temperature measurement device; the database of the temperature measurement background equipment comprises images sent by the temperature measurement device and images sent by the second temperature measurement terminal;
under the condition that the temperature of the object to be detected exceeds the temperature safety threshold, the temperature measurement background equipment uses the image to be processed sent by the temperature measurement device to search the database, and an image containing the object to be detected is obtained and serves as a monitoring image;
and the temperature measurement background equipment obtains the track of the object to be detected according to the position of the temperature measurement terminal which sends the monitoring image.
In a third aspect, a processor is provided, which is configured to perform the method according to the first aspect and any one of the possible implementations thereof.
In a fourth aspect, an electronic device is provided, comprising: a processor, transmitting means, input means, output means, and a memory for storing computer program code comprising computer instructions, which, when executed by the processor, cause the electronic device to perform the method of the first aspect and any one of its possible implementations.
In a fifth aspect, there is provided a computer-readable storage medium having stored therein a computer program comprising program instructions which, if executed by a processor, cause the processor to perform the method of the first aspect and any one of its possible implementations.
A sixth aspect provides a computer program product comprising a computer program or instructions which, when run on a computer, causes the computer to perform the method of the first aspect and any of its possible implementations.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flow chart of a temperature measurement method according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a temperature measurement system according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a temperature measuring device according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a hardware structure of a temperature measuring device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
As is well known, respiratory infectious diseases spread easily. When a person with a respiratory infectious disease appears in a public place or a crowd, the disease can cause great harm to public safety (for example, the harm caused to human safety by the spread of novel coronavirus pneumonia, COVID-19).
Because it is difficult to control the spread of respiratory infectious diseases at the transmission pathway, isolating people who have, or are suspected of having, such diseases is one of the most effective means of controlling their spread. It is therefore important to determine whether a person has, or is suspected of having, a respiratory infection.
Because respiratory infections are often accompanied by fever, a person's body temperature can be used to decide whether further testing is needed to determine whether that person has a respiratory infection.
At present, human body temperature is usually measured manually: a worker holds a temperature measuring gun and measures the temperature of each object to be detected. This approach not only incurs a high labor cost and low detection efficiency, but also tends to bring people too close together. Based on this, an embodiment of the present application provides a temperature measurement method implemented by a temperature measuring device.
The execution subject of the embodiments of the present application is the temperature measuring device. Optionally, the temperature measuring device may be one of the following: a mobile phone, a computer, a tablet computer, or an access control device. The embodiments of the present application are described below with reference to the drawings.
Referring to fig. 1, fig. 1 is a schematic flow chart of a temperature measuring method according to an embodiment of the present disclosure.
101. Acquiring an image to be processed.
In the embodiment of the present application, the image to be processed may be any image. For example, the image to be processed may contain a human face; the image to be processed can also contain human faces and objects; the image to be processed may also contain no persons. The content contained in the image to be processed is not limited.
In one implementation of acquiring the image to be processed, the temperature measuring device receives the image to be processed input by the user through an input component, such as a keyboard, a mouse, a touch screen, a touch pad, or an audio input device.
In another implementation, the temperature measuring device receives the image to be processed sent by a first terminal. Optionally, the first terminal may be any one of the following: a mobile phone, a computer, a tablet computer, a server, or a wearable device.
In yet another implementation, the temperature measuring device is equipped with an imaging device (e.g., a camera) and acquires the image to be processed through that imaging device. Optionally, the temperature measuring device is integrated with an access control device.
102. Acquiring temperature data of the image to be processed under the condition that the image to be processed contains a face region and the object to be detected corresponding to the face region is a living body.
In the embodiments of the present application, face detection processing is used to determine whether an image contains a face region. The temperature measuring device therefore determines whether the image to be processed contains a face region by performing face detection processing on it.
In one implementation of the face detection processing, the temperature measuring device performs face feature extraction processing on the image to be processed to obtain face feature data, and then determines from the face feature data whether the image to be processed contains a face region.
In another implementation manner of performing face detection processing on an image to be processed, the temperature measuring device uses a face detection neural network to process the image to be processed so as to determine whether the image to be processed contains a face region. The face detection neural network is obtained by training with a first training image with labeling information as training data, wherein the labeling information comprises whether the first training image contains a face or not.
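As one concrete (and purely optional) way to realize the face detection processing described above, the sketch below uses OpenCV's bundled Haar cascade as the detector; the application itself does not prescribe any particular detection network, so this choice is an assumption made for illustration.

```python
import cv2

def contains_face_region(image_path: str) -> bool:
    """Return True if the image to be processed contains at least one face
    region, using OpenCV's stock Haar cascade as an illustrative detector."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    image = cv2.imread(image_path)
    if image is None:
        return False                      # the image could not be read
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0
```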
When the image to be processed contains a face region, the temperature measuring device further determines whether the object to be detected corresponding to that face region is a living body by performing living body detection processing on the face region.
If the temperature measuring device determines that the image to be processed contains a face region and that the corresponding object to be detected is a living body, this indicates that the image to be processed contains a live object to be detected. At this point, the temperature measuring device acquires the temperature data of the image to be processed so as to obtain the temperature of the object to be detected.
In the embodiments of the present application, the temperature data carries temperature information of the object points corresponding to the pixel points in the image to be processed. An object point is the point in the real world that corresponds to a pixel point in the image. For example, a table is photographed with a camera to obtain image A; the table contains an object point a, and pixel point b in image A is obtained by imaging object point a, so object point a corresponds to pixel point b.
In one implementation of acquiring the temperature data, the temperature measuring device receives temperature data input by the user through an input component, such as a keyboard, a mouse, a touch screen, a touch pad, or an audio input device.
In another implementation, the temperature measuring device receives the temperature data sent by a second terminal. Optionally, the second terminal may be any one of the following: a mobile phone, a computer, a tablet computer, a server, or a wearable device. The second terminal may be the same as or different from the first terminal; this is not limited in this application.
In yet another implementation, the temperature measuring device is equipped with an infrared thermal imaging device (e.g., an infrared thermal imaging camera) and acquires the temperature data through that device. Optionally, the temperature measuring device is integrated with an access control device.
103. Obtaining the temperature of the object to be detected according to the temperature data.
In the embodiments of the present application, the image to be processed contains at least one face. The temperature measuring device can determine, from the temperature data, the temperature of the object to be detected corresponding to each face region separately.
For example, the temperature measuring device uses a camera to photograph Zhang San's face and obtains an image to be processed containing Zhang San's face region. In this case, the object to be detected is Zhang San, and the temperature measuring device obtains Zhang San's temperature from the temperature data. As another example, the image to be processed contains Zhang San's face region and Li Si's face region. In this case, both Zhang San and Li Si are objects to be detected, and the temperature measuring device can obtain Zhang San's temperature and Li Si's temperature from the temperature data.
In a possible implementation, the image to be processed contains not only the face region of the object to be detected but also its body region (such as the trunk and limb regions). The temperature measuring device can determine the temperature of the face region and the temperature of the body region from the temperature data, and then obtain the temperature of the object to be detected. For example, if the image to be processed contains Zhang San's face region and Zhang San's body region, the temperature measuring device takes the average of the face-region temperature and the body-region temperature as Zhang San's temperature.
In another possible implementation manner, the image to be processed includes a face region of the object to be detected, and the temperature measuring device may determine the temperature of the face region as the temperature of the object to be detected according to the temperature data.
In yet another possible implementation manner, the image to be processed includes a body region (such as a trunk region and an extremity region) of the object to be detected, and the temperature measuring device can determine the temperature of the body region as the temperature of the object to be detected according to the temperature data.
In the implementation of the present application, the temperature measuring device performs face detection processing on the image to be processed to determine whether it contains a face region, and thus whether it contains an object to be detected. The device acquires temperature data only after determining that the object to be detected is a living body, and obtains the temperature of the object to be detected from that data. This reduces the probability of measuring the temperature of an inanimate object (that is, something that is not a person) and of a non-living body, and thereby reduces the amount of data to be processed. In addition, because the temperature is obtained from the temperature data rather than by hand, contact between workers and the persons being measured is avoided and detection efficiency is improved.
As an optional implementation, the temperature measuring device implements the living body detection processing on the face region in one of the following possible ways:
In one possible implementation, the temperature measuring device acquires an infrared (IR) image that contains the object to be detected and whose acquisition time is the same as that of the image to be processed. The temperature measuring device can then determine from the IR image whether the object to be detected is a living body.
For example, an access control device is equipped with an RGB camera and an IR camera, and captures an IR image with the IR camera at the same time as it captures the image to be processed with the RGB camera. The temperature measuring device obtains the IR image by receiving it from the access control device, determines from the IR image the pixel point region corresponding to the face region in the image to be processed to obtain an infrared heat region, and can then determine from the information in the infrared heat region whether the object to be detected is a living body, obtaining a first detection result.
As another example, the temperature measuring device itself is equipped with an RGB camera and an IR camera. It captures the image to be processed with the RGB camera while capturing an IR image with the IR camera, determines from the IR image the pixel point region corresponding to the face region in the image to be processed to obtain an infrared heat region, and can then determine from the information in the infrared heat region whether the object to be detected is a living body, obtaining a first detection result.
In another possible implementation, the temperature measuring device performs feature extraction processing on the image to be processed to obtain texture features of the face region, and determines from those texture features whether the object to be detected is a living body, obtaining a first detection result. The texture features carry at least one of the following kinds of information: skin color information, glossiness information, wrinkle information, and texture information of the facial skin.
The above-described feature extraction processing is used to extract texture features from the face region. Optionally, the feature extraction processing may be implemented by any one of the following methods: open-source face recognition networks, multi-task cascaded convolutional neural networks (MTCNN), Tuned Convolutional Neural Networks (TCNN), or task-constrained deep convolutional neural networks (TCDCN).
Because the texture features of a real person's face carry more information than the texture features of a non-living body, whether the person in the image to be processed is a living body can be determined from the information carried by the texture features of the image to be processed.
Optionally, a confidence that the object to be detected is a living body can be obtained from the texture features. If the confidence exceeds a confidence threshold, the living body detection result is that the object to be detected is a living body; if it does not, the result is that the object to be detected is not a living body.
In yet another possible implementation, the above-described liveness detection process may be implemented by a liveness detection network. The living body detection network can be obtained by training the convolutional neural network by using at least one second training image with labeling information as training data, wherein the labeling information comprises whether an object (including a person and an object) in the second training image is a living body.
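The confidence-threshold decision described above can be written as a few lines of Python; the texture model is a placeholder for whichever network (an open-source face recognition network, MTCNN, TCNN, TCDCN, or a dedicated living body detection network) produces the liveness confidence, and the 0.5 threshold is an assumed value.

```python
import numpy as np

CONFIDENCE_THRESHOLD = 0.5   # assumed value for the confidence threshold

def is_living_body(face_crop: np.ndarray, texture_model) -> bool:
    """Decide whether the object to be detected is a living body from the
    texture features of its face region. `texture_model` is any callable that
    returns the confidence that the face belongs to a living body."""
    confidence = float(texture_model(face_crop))
    return confidence > CONFIDENCE_THRESHOLD
```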
As an optional implementation manner, the temperature data includes a temperature thermodynamic diagram and a temperature file, and the temperature file carries temperature information of object points corresponding to pixel points in the temperature thermodynamic diagram. Optionally, the temperature data is acquired by an infrared thermal imaging device, and the temperature file is a binary file. The temperature measuring device executes the following steps in the process of executing step 103:
1. and determining a pixel point region corresponding to the face region from the temperature thermodynamic diagram to obtain a first target pixel point region.
In the embodiment of the application, the object points corresponding to the pixel points in the first target pixel point region belong to the face of the object to be detected, and the object points corresponding to the pixel points in the face region also belong to the face of the object to be detected.
In a possible implementation manner, the temperature measuring device determines a pixel point region corresponding to the face region from the temperature thermodynamic diagram by performing image matching processing on the temperature thermodynamic diagram and the image to be processed, so as to obtain a first target pixel point region.
In another possible implementation, the temperature thermodynamic diagram is acquired by a thermal imaging device and the image to be processed is acquired by an RGB imaging device. In this case, the temperature measuring device performs the following steps in the process of performing step 1:
11. Carrying out first face recognition processing on the image to be processed to obtain a first position of the face region in the image to be processed.
In the embodiment of the application, the position of the face region in the image to be processed may be the position of the face region in a pixel coordinate system of the image to be processed.
In a possible implementation manner, the temperature measuring device performs first face recognition processing on the image to be processed, so as to obtain the position of a face frame including the face region in a pixel coordinate system of the image to be processed. The temperature measuring device further takes the position of the face frame under the pixel coordinate system of the image to be processed as the position of the face area under the pixel coordinate system of the image to be processed.
In another possible implementation manner, the temperature measuring device performs first face recognition processing on the image to be processed, so as to obtain the positions of the key points of the face contour included in the face region in the pixel coordinate system of the image to be processed. And the temperature measuring device further takes the position of the key point of the face contour under the pixel coordinate system of the image to be processed as the position of the face area under the pixel coordinate system of the image to be processed.
12. Acquiring a first pose of the thermal imaging device in the process of acquiring the temperature thermodynamic diagram and a second pose of the RGB imaging device in the process of acquiring the image to be processed.
In the embodiments of the present application, the temperature thermodynamic diagram and the image to be processed are acquired by different devices, but at the same time. The first pose is the pose of the thermal imaging device when it acquires the temperature thermodynamic diagram, and the second pose is the pose of the RGB imaging device when it acquires the image to be processed. For example, at 9:37:50 on 20 July 2020, the thermal imaging device acquires a temperature thermodynamic diagram and the RGB imaging device acquires an image to be processed; the first pose is then the pose of the thermal imaging device at that moment, and the second pose is the pose of the RGB imaging device at that moment.
It should be understood that, in the embodiment of the present application, both the first pose and the second pose are poses in a world coordinate system.
In one implementation of acquiring the first pose and the second pose, the temperature measuring device receives the first pose and the second pose input by the user through an input component, such as a keyboard, a mouse, a touch screen, a touch pad, or an audio input device.
In another implementation, the temperature measuring device receives the first pose and the second pose sent by a third terminal. Optionally, the third terminal may be any one of the following: a mobile phone, a computer, a tablet computer, a server, or a wearable device. The third terminal may be the same as or different from the first terminal; this is not limited in this application.
13. Obtaining a pose conversion relation between the thermal imaging device and the RGB imaging device according to the first pose and the second pose, the pose conversion relation serving as a coordinate system conversion relation between the pixel coordinate system of the temperature thermodynamic diagram and the pixel coordinate system of the image to be processed.
14. Converting the first position according to the coordinate system conversion relation, and determining the first target pixel point region from the temperature thermodynamic diagram.
By converting the first position according to the coordinate system conversion relation, the temperature measuring device determines, in the temperature thermodynamic diagram, the position of the pixel point region corresponding to the face region, which serves as the first target position; the first target pixel point region can then be determined from this first target position.
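The conversion of the first position into the temperature thermodynamic diagram's pixel coordinate system can be sketched as below. Representing the coordinate system conversion relation as a single 3x3 matrix is an assumption made for illustration; in practice it would be derived from the first pose, the second pose, and the two devices' intrinsics.

```python
import numpy as np

def convert_first_position(face_box, conversion_matrix: np.ndarray):
    """Map a face box (x0, y0, x1, y1) from the pixel coordinate system of the
    image to be processed into the pixel coordinate system of the temperature
    thermodynamic diagram, yielding the first target pixel point region."""
    x0, y0, x1, y1 = face_box
    corners = np.array([[x0, y0, 1.0], [x1, y1, 1.0]]).T   # homogeneous coordinates
    mapped = conversion_matrix @ corners
    mapped = mapped[:2] / mapped[2]                        # de-homogenise
    (u0, u1), (v0, v1) = mapped
    return (min(u0, u1), min(v0, v1), max(u0, u1), max(v0, v1))
```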
Optionally, the face-recognizable deflection angle of the RGB imaging device ranges from -30° to +30°.
The face-recognizable deflection angle is the angle between the shooting direction of the RGB imaging device and the perpendicular to the face region of the object to be detected. Viewed downward from the top of the object's head, the angle is positive when the shooting direction deviates clockwise from the perpendicular to the face region, and negative when it deviates counterclockwise.
When the face-recognizable deflection angle of the RGB imaging device is kept between -30° and +30°, using the RGB camera to capture the image to be processed improves the accuracy of determining whether the image to be processed contains a face region, and in turn the accuracy of the first position.
2. Determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region according to the temperature file, to obtain the temperature of the face region.
The temperature measuring device can determine the temperature of an object point corresponding to any pixel point in the temperature thermodynamic diagram according to the temperature information carried by the temperature file.
After the temperature measuring device determines the temperature of the object point corresponding to at least one pixel point in the first target pixel point region, the temperature measuring device can obtain the temperature of the face region according to the temperature of the object point corresponding to at least one pixel point.
In a possible implementation manner, the temperature measuring device uses the average temperature value of the object points corresponding to at least one pixel point as the temperature of the face region. For example, the first target pixel point region includes: the temperature of the object point corresponding to the pixel point a is 36.9 ℃, and the temperature of the object point corresponding to the pixel point b is 36.3 ℃. The temperature measuring device can take the average value (36.6 ℃) of the temperature of the object point corresponding to the pixel point a and the temperature of the object point corresponding to the pixel point b as the temperature of the face area; the temperature measuring device can also take the temperature (36.9 ℃) of the object point corresponding to the pixel point a as the temperature of the face area; the temperature measuring device can also take the temperature (36.3 ℃) of the object point corresponding to the pixel point b as the temperature of the face area.
In another possible implementation, the temperature measuring device first calculates the average temperature of the object points corresponding to the pixel points in the first target pixel point region, selects at least one pixel point whose corresponding temperature differs from this average by no more than a first threshold, and takes the average temperature of the object points corresponding to the selected pixel points as the temperature of the face region. For example, the first target pixel point region contains pixel points a, b, c and d, whose corresponding object points have temperatures of 36.9 °C, 36.3 °C, 37.9 °C and 35.2 °C respectively, giving an average of 36.575 °C. With a first threshold of 1 °C, the difference between pixel point c's temperature and the average (1.325 °C) and the difference between pixel point d's temperature and the average (1.375 °C) both exceed 1 °C, so the temperature measuring device selects at least one pixel point from pixel points a and b. It can then take the average of the temperatures of the object points corresponding to pixel points a and b (36.6 °C) as the temperature of the face region, or take the temperature of the object point corresponding to pixel point a (36.9 °C), or that corresponding to pixel point b (36.3 °C), as the temperature of the face region.
Because errors can occur when the thermal imaging device acquires the temperature thermodynamic diagram, the temperature information of the object points corresponding to some pixel points may be inaccurate. In this possible implementation, the temperature measuring device reduces the influence of such pixel points in the first target pixel point region on the temperature of the face region, thereby improving its accuracy.
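The outlier-filtering implementation just described amounts to the following Python sketch; the 1 °C first threshold is taken from the example above and is otherwise an assumption.

```python
import numpy as np

def face_region_temperature(temps: np.ndarray, first_threshold: float = 1.0) -> float:
    """Estimate the face-region temperature from the per-pixel temperatures of
    the first target pixel point region, discarding readings that differ from
    the region average by more than `first_threshold`."""
    mean = temps.mean()
    kept = temps[np.abs(temps - mean) <= first_threshold]
    # Fall back to the plain average if every reading was filtered out.
    return float(kept.mean()) if kept.size else float(mean)

# Worked example from the text: the readings 36.9, 36.3, 37.9 and 35.2 average
# to 36.575; pixel points c and d are dropped, leaving (36.9 + 36.3) / 2 = 36.6.
print(face_region_temperature(np.array([36.9, 36.3, 37.9, 35.2])))   # ~36.6
```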
As an optional implementation manner, on the basis of executing the foregoing steps, the temperature measuring device further executes the following steps:
3. and carrying out accessory detection processing on the image to be processed, and determining whether the object to be detected wears a preset accessory or not to obtain a detection result.
In the embodiments of the present application, the predetermined accessory includes at least one of the following: a mask, glasses, earrings, and a hat. The accessory detection processing may be implemented by an accessory detection network, which is obtained by training a neural network on images carrying annotation information, where the annotation information includes whether the image contains an accessory and the type of that accessory.
The detection result is one of the following: the object to be detected is not wearing the accessory, or the object to be detected is wearing the accessory. Optionally, wearing the accessory includes at least one of the following: the object to be detected wears a mask, wears earrings, wears glasses, or wears a hat.
In some specific scenarios, a person wearing an accessory may pose a safety risk, and the temperature measuring device can reduce that risk by performing accessory detection processing on the image to be processed. For example, wearing earrings in a workshop carries a safety risk; when the detection result indicates that the object to be detected is wearing earrings, the temperature measuring device can issue a warning reminding the object to be detected to remove them.
As an alternative embodiment, the predetermined accessory comprises a mask. At this time, the temperature measuring device obtains a detection result through the following implementation mode:
In a possible implementation manner, the temperature measuring device performs feature extraction processing on the image to be processed to obtain first feature data, where the first feature data carries information about whether the object to be detected wears a mask. The temperature measuring device then obtains the detection result according to the first feature data.
Optionally, the feature extraction processing may be implemented by a mask detection network. The mask detection network can be obtained by training the deep convolutional neural network by taking at least one third training image with the labeling information as training data, wherein the labeling information comprises whether a person in the third training image wears the mask.
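As a rough illustration of how such a mask detection network could be put together, the sketch below defines a tiny binary classifier in PyTorch whose convolutional part plays the role of the feature extraction processing and whose linear head maps the first feature data to a mask / no-mask decision. The architecture, layer sizes and input resolution are assumptions chosen for brevity; the application itself only specifies that the network is trained on third training images annotated with whether the person wears a mask.

```python
import torch
import torch.nn as nn

class MaskDetectionNet(nn.Module):
    """Toy mask / no-mask classifier: feature extraction followed by a 2-way head."""

    def __init__(self):
        super().__init__()
        # Feature extraction processing: produces the "first feature data".
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Classification head: maps the feature data to the detection result.
        self.classifier = nn.Linear(32, 2)  # 0 = no mask, 1 = mask worn

    def forward(self, image):
        first_feature_data = self.features(image).flatten(1)
        return self.classifier(first_feature_data)


# Inference on a single RGB image to be processed (random data as a stand-in).
net = MaskDetectionNet().eval()
image_to_process = torch.rand(1, 3, 128, 128)
with torch.no_grad():
    logits = net(image_to_process)
detection_result = "mask worn" if logits.argmax(dim=1).item() == 1 else "no mask"
print(detection_result)
```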
As an optional implementation manner, in the case that the second detection result is that the object to be detected wears a mask, the temperature measuring device further performs the following steps on the basis of performing the foregoing steps:
4. Under the condition that the nostril key points and the mouth key points of the object to be detected are both located in the pixel point area covered by the facial mask, determine that the mask is worn correctly by the object to be detected.
In the embodiment of the application, the facial mask is a mask worn by an object to be detected. The positions of the key points of the nostrils of the object to be detected in the image to be processed can be used for representing the positions of the nostrils of the object to be detected in the image to be processed. The position of the key point of the mouth of the object to be detected in the image to be processed can be used for representing the position of the mouth of the object to be detected in the image to be processed.
If the nostril key points are located in the pixel point area covered by the facial mask, the facial mask covers the nostrils of the object to be detected; if the mouth key points are located in the pixel point area covered by the facial mask, the facial mask covers the mouth of the object to be detected.
Under the condition that the mouth and nostrils of the object to be detected are covered by the facial mask, the temperature measuring device determines that the mask is worn by the object to be detected correctly.
5. Under the condition that the nostril key points and/or the mouth key points of the object to be detected are located outside the pixel point area covered by the facial mask, determine that the mask is not worn correctly by the object to be detected.
If the nostril key points of the object to be detected are located outside the pixel point area covered by the facial mask, the facial mask does not cover the nostrils of the object to be detected; if the mouth key points are located outside that area, the facial mask does not cover the mouth of the object to be detected.
Whether the facial mask fails to cover the mouth, fails to cover the nostrils, or fails to cover both, the temperature measuring device determines that the mask is not worn correctly by the object to be detected.
In a possible implementation manner, the temperature measuring device performs face key point detection processing on the image to be processed to obtain the position of the nostril key point of the object to be detected in the image to be processed, the position of the mouth key point of the object to be detected in the image to be processed, and the position of the pixel point area covered by the facial mask in the image to be processed. The temperature measuring device determines whether the mask is worn correctly by the object to be detected according to the position of the nostril key point of the object to be detected in the image to be processed, the position of the mouth key point of the object to be detected in the image to be processed and the position of the pixel point area covered by the facial mask in the image to be processed.
In another possible implementation manner, the temperature measuring device processes the image to be processed with a correct-wearing mask detection network to obtain feature data carrying information about whether the mask is worn correctly by the object to be detected. The temperature measuring device may then input the feature data into a classifier or a support vector machine to obtain a third detection result, which indicates whether the mask is worn correctly by the object to be detected.
The correct-wearing mask detection network can be obtained by training a deep convolutional neural network with at least one fourth training image carrying annotation information as training data, where the annotation information includes whether the person in the fourth training image wears the mask correctly. Optionally, the mask detection network and the correct-wearing mask detection network may be the same deep convolutional neural network.
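Reduced to its geometric core, the rule in steps 4 and 5 is a point-in-region test: the mask is worn correctly only when every nostril and mouth key point falls inside the pixel point area covered by the facial mask. The sketch below illustrates this with the mask-covered area approximated by an axis-aligned bounding box; the box representation, the coordinates and the helper names are assumptions made for illustration.

```python
from typing import Iterable, Tuple

Point = Tuple[float, float]
Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

def inside(point: Point, box: Box) -> bool:
    """True if the key point lies inside the pixel point area covered by the facial mask."""
    x, y = point
    x_min, y_min, x_max, y_max = box
    return x_min <= x <= x_max and y_min <= y <= y_max

def mask_worn_correctly(nostril_keypoints: Iterable[Point],
                        mouth_keypoints: Iterable[Point],
                        mask_region: Box) -> bool:
    """Mask is worn correctly only if every nostril and mouth key point is covered."""
    points = list(nostril_keypoints) + list(mouth_keypoints)
    return all(inside(p, mask_region) for p in points)

# Example: the mask region covers the lower half of a 200x200 face crop.
mask_region = (20.0, 100.0, 180.0, 200.0)
print(mask_worn_correctly([(90, 120), (110, 121)], [(100, 150)], mask_region))  # True
print(mask_worn_correctly([(90, 80), (110, 82)], [(100, 150)], mask_region))    # False: nostrils exposed
```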
As an optional implementation manner, when the temperature measuring device performs step 2 and the detection result is that the object to be detected wears a mask, the at least one pixel point belongs to a second target pixel point region within the first target pixel point region, where the second target pixel point region is the pixel point region corresponding to the forehead region in the face region.
Under the condition that the object to be detected wears a mask, the face region includes the pixel point area covered by the mask. Obviously, determining the temperature of the face region from pixel points in the area covered by the mask would introduce a large error. Therefore, under the condition that the object to be detected wears a mask, the temperature measuring device determines the temperature of the object to be detected according to the temperature of the forehead region.
To this end, the temperature measuring device determines the pixel point region corresponding to the forehead region from the first target pixel point region to obtain the second target pixel point region, obtains the temperature of the forehead region according to the temperature of at least one pixel point in the second target pixel point region, and takes the temperature of the forehead region as the temperature of the object to be detected.
As an optional implementation manner, before performing step 2, the temperature measuring device determines the second target pixel point region from the first target pixel point region by performing the following steps:
6. Carry out second face recognition processing on the image to be processed to obtain the position of at least one face key point in the face region.
In the embodiment of the present application, the second face recognition processing is used to determine the position of at least one face key point in the image to be processed. Optionally, the second face recognition processing and the first face recognition processing may be implemented by the same face recognition algorithm. For example, the temperature measuring device uses an open source face recognition network OpenFace to realize first face recognition processing and second face recognition processing of the image to be processed.
7. Determine a second position of the forehead region in the image to be processed according to the position of the at least one face key point.
In an embodiment of the present application, the forehead region belongs to the face region. After the position of at least one face key point is obtained, the position of the forehead area in the image to be processed can be determined as a second position.
For example, the at least one face keypoint includes an eyebrow keypoint and a hairline keypoint. The temperature measuring device can determine the position of the forehead area in the image to be processed according to the positions of the key points of the eyebrows in the image to be processed and the positions of the key points of the hairline in the image to be processed.
8. Determine the second target pixel point region from the temperature thermodynamic diagram according to the coordinate system conversion relation and the second position.
The temperature measuring device converts the second position according to the coordinate system conversion relation, and the position of the pixel point region corresponding to the forehead region in the temperature thermodynamic diagram can be determined from the temperature thermodynamic diagram to serve as the second target position. And then, a second target pixel point region can be determined according to the second target position.
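Steps 6 to 8 can be pictured as a small geometric computation: bound the forehead with the eyebrow and hairline key points in the image to be processed, then map that bound into the temperature thermodynamic diagram with the coordinate system conversion relation. In the Python sketch below the conversion relation is stood in for by a 3x3 homography; the matrix values, the key-point coordinates and the helper names are illustrative assumptions.

```python
import numpy as np

def forehead_box(eyebrow_pts, hairline_pts):
    """Second position: bounding box of the forehead in the image to be processed."""
    pts = np.asarray(list(eyebrow_pts) + list(hairline_pts), dtype=float)
    xs, ys = pts[:, 0], pts[:, 1]
    # The forehead lies between the hairline (top) and the eyebrows (bottom).
    return xs.min(), ys.min(), xs.max(), ys.max()

def to_thermal(points, H):
    """Apply the coordinate system conversion relation (here a homography H)."""
    pts = np.asarray(points, dtype=float)
    homogeneous = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = homogeneous @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Illustrative conversion: thermal image at half the RGB resolution, slightly shifted.
H = np.array([[0.5, 0.0, 4.0],
              [0.0, 0.5, 2.0],
              [0.0, 0.0, 1.0]])

x0, y0, x1, y1 = forehead_box(eyebrow_pts=[(80, 120), (150, 118)],
                              hairline_pts=[(85, 60), (145, 58)])
corners_rgb = [(x0, y0), (x1, y1)]
corners_thermal = to_thermal(corners_rgb, H)
print(corners_thermal)  # corners of the second target pixel point region
```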
As an optional implementation manner, on the basis of executing the foregoing steps, the temperature measuring device further executes the following steps:
9. Carry out face comparison processing on the image to be processed and the images in the face image library, and take an image whose similarity with the image to be processed exceeds the face similarity threshold as the identity image.
In the embodiment of the application, the images in the face image library are all face images carrying identity information. For example, the face image library includes the face image of Zhang San and the face image of Li Si, where the face image of Zhang San carries the identity information of Zhang San and the face image of Li Si carries the identity information of Li Si.
By performing face comparison processing on the image to be processed and the images in the face image library, the temperature measuring device obtains the face similarity between the image to be processed and each image in the face image library. For example, the temperature measuring device compares the image to be processed with the face image of Zhang San to obtain the face similarity between the face image of Zhang San and the image to be processed, and compares the image to be processed with the face image of Li Si to obtain the face similarity between the face image of Li Si and the image to be processed.
A face similarity that exceeds the face similarity threshold indicates that the person in the corresponding face image and the object to be detected are the same person. For example, if the face similarity between the face image of Zhang San and the image to be processed exceeds the face similarity threshold, the object to be detected is Zhang San.
Therefore, the temperature measuring device takes an image whose similarity with the image to be processed exceeds the face similarity threshold as the identity image, and performs step 10 after obtaining the identity image.
10. Determine the object to be detected to be the person corresponding to the identity image.
It should be understood that in the embodiment of the present application, in the case that the mask is worn by the object to be detected, the temperature measuring device can still determine the identity of the object to be detected by performing steps 9 and 10.
Optionally, after the identity of the object to be detected is obtained, the temperature measuring device may store the temperature of the object to be detected, so as to facilitate subsequent checking of the historical temperature of the object to be detected.
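Steps 9 and 10, together with the optional temperature history, reduce to a best-match search over the face image library followed by a threshold check. The sketch below assumes the per-image face similarities have already been produced by the face comparison processing; the dictionary layout, the 0.8 threshold and the example values are assumptions made for illustration.

```python
def identify(similarities, face_similarity_threshold=0.8):
    """Return the identity whose library image is most similar to the image to be
    processed, provided the similarity exceeds the threshold; otherwise None.

    `similarities` maps identity information to the face similarity with the
    image to be processed, as produced by the face comparison processing.
    """
    if not similarities:
        return None
    identity, best = max(similarities.items(), key=lambda kv: kv[1])
    return identity if best > face_similarity_threshold else None

# Example library comparison results for one image to be processed.
similarities = {"Zhang San": 0.92, "Li Si": 0.41}
identity = identify(similarities)
print(identity)  # "Zhang San": the object to be detected is Zhang San

# Optionally store the measured temperature against the identity for later review.
temperature_history = {}
if identity is not None:
    temperature_history.setdefault(identity, []).append(36.6)
```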
As an optional implementation manner, when the temperature of the object to be detected exceeds the temperature safety threshold, the temperature measuring device may output first warning information that the temperature of the object to be detected is abnormal, where the first warning information may be output in at least one of the following manners: light, voice, text.
As an optional implementation manner, in a case that the second detection result is that the mask of the object to be detected is not worn, the temperature measuring device may output second warning information that the mask of the object to be detected is not worn, where the second warning information may be output in at least one of the following manners: light, voice, text.
As an optional implementation manner, in a case that the third detection result is that the mask is not correctly worn by the object to be detected, the temperature measuring device may output third warning information that the mask is not correctly worn by the object to be detected, where the third warning information may be output in at least one of the following manners: light, voice, text.
As an optional implementation manner, the temperature measuring device is a first temperature measurement terminal. In one possible implementation, the component architecture of the temperature measurement system can be seen in FIG. 2. As shown in FIG. 2, the temperature measurement system includes not only the first temperature measurement terminal but also the temperature measurement background equipment.
Optionally, the temperature measurement system further includes a second temperature measurement terminal. The second temperature measurement terminal may be one of the following: a mobile phone, a computer, a tablet computer, or an access control device. The position of the second temperature measurement terminal is different from the position of the first temperature measurement terminal. For example, the first temperature measurement terminal and the second temperature measurement terminal are both access control devices, where the first temperature measurement terminal is installed in the lobby of building A and the second temperature measurement terminal is installed in the lobby of building B.
In this embodiment of the application, the temperature measurement background equipment may be a computer or a server. The temperature measurement background equipment is in communication connection with the temperature measurement terminals (including the first temperature measurement terminal and the second temperature measurement terminal). After the first temperature measurement terminal obtains the temperature of the object to be detected, it can send the temperature of the object to be detected to the temperature measurement background equipment through the communication connection. The temperature measurement background equipment stores the position of the first temperature measurement terminal in the database as the measurement position of the temperature of the object to be detected.
Relevant managers can analyze the temperatures sent by the temperature measurement terminals through the temperature measurement background equipment, so as to control the spread of respiratory infectious diseases according to the analysis results.
In a possible application scenario, relevant managers input the position information of a monitoring area to the temperature measurement background equipment through an input component, or send the position information of the monitoring area to the temperature measurement background equipment through a control terminal. For example, a manager may wish to monitor, via the temperature measurement system, the number of people with fever in area A; the position information of area A can then be input to the temperature measurement background equipment as the position information of the monitoring area. It should be understood that the monitoring area contains at least one temperature measurement terminal.
The temperature measurement background equipment retrieves the database using the position information of the monitoring area, and obtains the abnormal temperatures whose measurement positions are in the monitoring area as the monitoring temperatures, where an abnormal temperature is a temperature exceeding the temperature safety threshold. For example, the database includes temperature 1, temperature 2 and temperature 3, where the measurement positions of temperature 1 and temperature 3 are in the monitoring area and the measurement position of temperature 2 is not. Temperature 1 is 37.8 ℃, temperature 2 is 37.7 ℃, and temperature 3 is 36.9 ℃. Assuming that the temperature safety threshold is 37.3 ℃, the temperature measurement background equipment retrieves the database using the position information of the monitoring area and obtains temperature 1 as the monitoring temperature.
Optionally, after a temperature measurement terminal obtains the temperature of the object to be detected, it can determine whether that temperature is an abnormal temperature according to the temperature of the object to be detected and the temperature safety threshold, obtain an abnormality detection result, and send the abnormality detection result to the temperature measurement background equipment.
The more people in an area who are suspected of having a respiratory infectious disease, and the more people who actually have one, the more dangerous the area is. Therefore, the temperature measurement background equipment obtains the safety level of the monitoring area according to the number of monitoring temperatures and a mapping relationship, where the mapping relationship is the mapping between the number of temperatures exceeding the temperature safety threshold and the safety level.
In one possible implementation, the mapping relationship can be seen in Table 1. In Table 1, the higher the safety level of the monitoring area, the safer the monitoring area.

Table 1
Number of temperatures exceeding the temperature safety threshold    Safety level
Not less than 5                                                       1
More than 3 and not more than 5                                       2
More than 1 and not more than 3                                       3
0                                                                     4
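The retrieval and grading just described can be sketched as two small functions, one filtering the database by measurement position and threshold and one applying the mapping of Table 1. The record layout, the function names and the handling of a count of exactly 1 (which Table 1 does not list) are assumptions made for illustration.

```python
TEMPERATURE_SAFETY_THRESHOLD = 37.3  # illustrative value from the example above

def monitoring_temperatures(records, monitoring_positions):
    """Retrieve the abnormal temperatures whose measurement position lies in the
    monitoring area. `records` is a list of (measurement_position, temperature)
    pairs taken from the database."""
    return [t for pos, t in records
            if pos in monitoring_positions and t > TEMPERATURE_SAFETY_THRESHOLD]

def safety_level(num_abnormal):
    """Map the number of temperatures exceeding the safety threshold to a safety
    level following Table 1 (a higher level means a safer monitoring area)."""
    if num_abnormal >= 5:
        return 1
    if num_abnormal > 3:
        return 2
    if num_abnormal > 1:
        return 3
    if num_abnormal == 0:
        return 4
    # A count of exactly 1 is not listed in Table 1; it is treated like the
    # "more than 1 and not more than 3" band here purely for illustration.
    return 3

records = [("building A lobby", 37.8), ("building A lobby", 37.5),
           ("building B lobby", 37.7), ("building A lobby", 36.9)]
abnormal = monitoring_temperatures(records, monitoring_positions={"building A lobby"})
print(abnormal, safety_level(len(abnormal)))  # [37.8, 37.5] 3
```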
In another possible application scenario, the whereabouts of persons whose temperature exceeds the temperature safety threshold can be determined based on the temperature measurement system, so that relevant managers can better control the spread of respiratory infectious diseases according to those whereabouts.
As an optional implementation manner, the database of the temperature measurement background equipment contains not only the images sent by the first temperature measurement terminal (i.e. images whose collection position is the position of the first temperature measurement terminal), but also the images sent by the second temperature measurement terminal (i.e. images whose collection position is the position of the second temperature measurement terminal).
11. Under the condition that the temperature of the object to be detected exceeds the temperature safety threshold, the temperature measurement background equipment retrieves the database using the image to be processed sent by the first temperature measurement terminal, and obtains the images containing the object to be detected as monitoring images.
A temperature of the object to be detected that exceeds the temperature safety threshold indicates that the object to be detected is suspected of having, or has, a respiratory infectious disease. In this case, the temperature measurement background equipment can further determine the track of the object to be detected by retrieving the database with the image to be processed and taking the images containing the object to be detected as monitoring images.
In a possible implementation manner, the temperature measurement background equipment performs person comparison processing on the image to be processed and the images in the database to determine which images in the database contain the object to be detected. For example, the database includes image a and image b. The temperature measurement background equipment performs person comparison processing on the image to be processed and image a to obtain the similarity between the object to be detected and the person in image a, denoted similarity 1, and performs person comparison processing on the image to be processed and image b to obtain the similarity between the object to be detected and the person in image b, denoted similarity 2. Assume that similarity 1 exceeds the second threshold and similarity 2 does not. The temperature measurement background equipment then determines that image a contains the object to be detected and image b does not.
It should be understood that, in the case that an image contains at least two persons, the temperature measurement background equipment obtains the similarity between the object to be detected and each person in the image by performing person comparison processing on the image to be processed and that image. For example, image a includes person a and person b; by performing person comparison processing on the image to be processed and image a, the temperature measurement background equipment obtains the similarity between the object to be detected and person a and the similarity between the object to be detected and person b.
12. The temperature measurement background equipment obtains the track of the object to be detected according to the positions of the temperature measurement terminals that sent the monitoring images.
For example, the monitoring images include image a and image b, where image a was sent to the temperature measurement background equipment by the first temperature measurement terminal and image b was sent by the second temperature measurement terminal. The temperature measurement background equipment can then determine that the places where the object to be detected appeared include the position of the first temperature measurement terminal and the position of the second temperature measurement terminal, and thus obtain the track of the object to be detected.
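Steps 11 and 12 amount to filtering the database with a person comparison and then reading off the positions of the terminals that sent the matching images. The sketch below illustrates this; the record layout, the second threshold value and the toy similarity function (exact label match) are assumptions for illustration, not the person comparison processing itself.

```python
def trajectory(image_to_process, database, terminal_positions,
               person_similarity, second_threshold=0.8):
    """Return the ordered list of places where the object to be detected appeared.

    `database` is a list of dicts: {"image": ..., "terminal": ..., "time": ...}.
    `person_similarity(a, b)` compares the object to be detected in `a` with the
    persons in `b` and returns the highest similarity.
    """
    monitoring_images = [rec for rec in database
                         if person_similarity(image_to_process, rec["image"]) > second_threshold]
    # Sort by acquisition time so the positions form a track.
    monitoring_images.sort(key=lambda rec: rec["time"])
    return [terminal_positions[rec["terminal"]] for rec in monitoring_images]


# Toy example: "images" are plain labels and similarity is exact label match.
database = [
    {"image": "person_x", "terminal": "T1", "time": 1},
    {"image": "person_y", "terminal": "T2", "time": 2},
    {"image": "person_x", "terminal": "T2", "time": 3},
]
terminal_positions = {"T1": "building A lobby", "T2": "building B lobby"}
sim = lambda a, b: 1.0 if a == b else 0.0
print(trajectory("person_x", database, terminal_positions, sim))
# ['building A lobby', 'building B lobby']
```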
It should be understood that the number of temperature measurement terminals in the temperature measurement system may exceed 2; the foregoing first temperature measurement terminal and second temperature measurement terminal are merely examples and should not constitute a limitation on this application.
Optionally, under the condition that the temperature measuring device is an access control device, the database of the temperature measurement background equipment contains the information of registered persons. For example, if the temperature measuring device is an access control device of company A, the database of the temperature measurement background equipment stores the identity information (such as name, employee number, serial number and the like) of all employees of company A.
Under the condition that the temperature measurement background equipment determines that the temperature of the object to be detected exceeds the temperature safety threshold and the object to be detected is a registered person, the temperature measurement background equipment can retrieve the database using the identity information of the object to be detected to obtain the track of the object to be detected, thereby reducing the data processing amount of the temperature measurement background equipment.
As an alternative embodiment, before the temperature measuring device performs step 12, a relevant manager may complete face registration on the temperature measuring device through the temperature measurement background equipment, where face registration means inputting a face image used for registration (hereinafter referred to as a registration image) into the temperature measuring device, so as to obtain the face image library of the temperature measuring device.
For example, the temperature measuring device is an access control device of company a. The manager of company A can input the face image of the employee of company A to the temperature measurement background equipment and input the registration instruction of the temperature measurement device to the temperature measurement background equipment, so that the temperature measurement background equipment sends the face image of the employee of company A to the temperature measurement device. And the temperature measuring device stores the received face image of the employee of the company A to a face image library. Therefore, the temperature measuring device can determine whether the object to be detected is an employee of company A while measuring the temperature of the object to be detected. Further, the object to be detected is released under the condition that the object to be detected is an employee of company A and the temperature of the object to be detected does not exceed the temperature safety threshold.
Optionally, the process of performing face comparison processing on the image to be processed and the images in the face image library includes: performing face feature extraction processing on the image to be processed and on the images in the face image library to obtain their respective face feature data, and comparing the face feature data of the image to be processed with the face feature data of the images in the face image library to obtain the similarity between the image to be processed and each image in the face image library.
Because face feature extraction processing involves a large amount of data processing, the face feature extraction of the registration images can be completed by the temperature measurement background equipment in order to reduce the data processing amount of the temperature measuring device. For example, during face registration of the temperature measuring device, after receiving a registration image, the temperature measurement background equipment can perform face feature extraction processing on it to obtain the face feature data of the registration image. The temperature measuring device stores the face feature data sent by the temperature measurement background equipment in the face image library. In this way, when performing face comparison processing between the image to be processed and the images in the face image library, the temperature measuring device only needs to perform face feature extraction processing on the image to be processed, which reduces its data processing amount.
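The division of work described above, where the temperature measurement background equipment extracts features of the registration images once and the temperature measuring device only extracts features of the image to be processed before comparing, can be sketched as follows. The feature extractor is a deliberately fake placeholder (it derives a fixed vector from the input label rather than running a face recognition network), and cosine similarity stands in for the comparison; both are assumptions for illustration.

```python
import numpy as np

def extract_face_features(image) -> np.ndarray:
    """Placeholder for the face feature extraction processing (e.g. a face
    recognition network); here it just derives a fixed unit vector from the input."""
    rng = np.random.default_rng(abs(hash(image)) % (2**32))
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Registration (done once on the temperature measurement background equipment):
registration_images = {"Zhang San": "zhangsan.jpg", "Li Si": "lisi.jpg"}
face_image_library = {name: extract_face_features(img)
                      for name, img in registration_images.items()}

# On the temperature measuring device: extract features of the image to be
# processed only, then compare against the precomputed library features.
query = extract_face_features("zhangsan.jpg")
scores = {name: cosine_similarity(query, feat) for name, feat in face_image_library.items()}
print(max(scores, key=scores.get))  # best match in the face image library
```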
As an optional implementation mode, the temperature measuring device can send the first detection result, the second detection result and the third detection result to the temperature measuring background equipment, so that relevant managers can check the detection results through the temperature measuring background equipment, and the spread of respiratory infectious diseases can be controlled better.
Optionally, the temperature measuring device sends the acquisition time of the image to be processed to the temperature measuring background device as the temperature measuring time of the object to be detected, so that the temperature of the object to be detected can be traced back by the temperature measuring background device.
Optionally, the temperature measurement background device may obtain an analysis result by analyzing the temperature in the database, and display the analysis result through the display device. For example, the display content of the analysis result display page includes the number of the objects to be detected, the number of temperatures that do not exceed the temperature safety threshold, the number of temperatures that exceed the temperature safety threshold, the number of second detection results that do not wear the mask, and the to-be-processed image corresponding to the temperatures that exceed the temperature safety threshold.
Optionally, the temperature measurement background device may monitor whether the temperature measurement terminal fails, and the display content of the analysis result display page further includes whether the temperature measurement terminal fails, the number of failed temperature measurement terminals, and the position of the failed temperature measurement terminal.
It will be understood by those skilled in the art that, in the methods of the embodiments of the present application, the order in which the steps are written does not imply a strict order of execution or constitute any limitation on the implementation process; the specific order of execution of the steps should be determined by their functions and possible internal logic.
The method of the embodiments of the present application is set forth above in detail and the apparatus of the embodiments of the present application is provided below.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a temperature measuring device according to an embodiment of the present application, where the temperature measuring device 1 includes: a first acquisition unit 11, a second acquisition unit 12, a processing unit 13, wherein:
a first acquisition unit 11 configured to acquire an image to be processed;
a second obtaining unit 12, configured to obtain temperature data of the image to be processed when it is determined that the image to be processed includes a face region and an object to be detected corresponding to the face region is a living body; the temperature data carries temperature information of object points corresponding to pixel points in the image to be processed;
and the processing unit 13 is configured to obtain the temperature of the object to be detected according to the temperature data.
In combination with any embodiment of the present application, the temperature data includes a temperature thermodynamic diagram and a temperature file, the temperature file carries temperature information of object points corresponding to pixel points in the temperature thermodynamic diagram, and the processing unit 13 is configured to:
determining a pixel point region corresponding to the face region from the temperature thermodynamic diagram to obtain a first target pixel point region;
and determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region according to the temperature file, and obtaining the temperature of the face region as the temperature of the object to be detected.
In combination with any embodiment of the present application, the temperature thermodynamic diagram is acquired by a thermal imaging device, and the image to be processed is acquired by an RGB imaging device;
the processing unit 13 is configured to:
performing first face recognition processing on the image to be processed to obtain a first position of the face region in the image to be processed;
acquiring a first pose of the thermal imaging device in the process of acquiring the temperature thermodynamic diagram and a second pose of the RGB imaging device in the process of acquiring the image to be processed;
obtaining a pose conversion relation between the thermal imaging equipment and the RGB imaging equipment according to the first pose and the second pose, wherein the pose conversion relation is used as a coordinate system conversion relation between a pixel coordinate system of the temperature thermodynamic diagram and a pixel coordinate system of the image to be processed;
and converting the first position according to the coordinate system conversion relation, and determining the first target pixel point region from the temperature thermodynamic diagram.
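As a rough geometric picture of how a pose conversion relation between the two devices could be obtained from the first pose and the second pose, the sketch below composes two 4x4 camera-to-world pose matrices into the relative transform from the RGB camera frame to the thermal camera frame. The pose values are invented, and the additional projection through the camera intrinsics that would be needed to reach a pixel-to-pixel mapping is omitted; this is an illustrative simplification, not the application's method.

```python
import numpy as np

def relative_pose(pose_thermal_world: np.ndarray, pose_rgb_world: np.ndarray) -> np.ndarray:
    """Pose conversion relation: maps points in the RGB camera frame into the
    thermal camera frame. Both inputs are 4x4 camera-to-world pose matrices."""
    return np.linalg.inv(pose_thermal_world) @ pose_rgb_world

# Illustrative poses: the thermal camera sits 5 cm to the right of the RGB camera.
pose_rgb = np.eye(4)
pose_thermal = np.eye(4)
pose_thermal[0, 3] = 0.05

T_thermal_from_rgb = relative_pose(pose_thermal, pose_rgb)
point_rgb = np.array([0.1, 0.0, 1.0, 1.0])      # a 3D point in the RGB camera frame
point_thermal = T_thermal_from_rgb @ point_rgb  # the same point in the thermal camera frame
print(point_thermal[:3])                        # [0.05, 0.0, 1.0]
```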
In combination with any embodiment of the present application, the processing unit 13 is further configured to:
and carrying out accessory detection processing on the image to be processed, and determining whether the object to be detected wears a preset accessory or not to obtain a detection result, wherein the preset accessory comprises a mask.
In combination with any embodiment of the present application, the processing unit 13 is further configured to:
performing feature extraction processing on the image to be processed to obtain first feature data; the first characteristic data carries information whether the object to be detected wears a mask or not;
and obtaining the detection result according to the first characteristic data.
With reference to any embodiment of the present application, in a case that the detection result indicates that the subject to be detected wears the mask, the processing unit 13 is further configured to:
determining that the mask is worn correctly by the object to be detected under the condition that the key point of the nostril of the object to be detected and the key point of the mouth of the object to be detected are both located in the pixel point area covered by the facial mask; the facial mask is a mask worn by the object to be detected;
and under the condition that the key point of the nostril of the object to be detected and/or the key point of the mouth of the object to be detected are/is located outside the pixel point area covered by the facial mask, determining that the mask is not worn correctly by the object to be detected.
With reference to any embodiment of the present application, the temperature of the object to be detected is obtained by the following steps: according to the temperature file, determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region to obtain the temperature of the face region as the temperature of the object to be detected, wherein the at least one pixel point belongs to a second target pixel point region in the first target pixel point region under the condition that the mask is worn by the object to be detected, and the second target pixel point region is a pixel point region corresponding to the forehead region in the face region.
In combination with any embodiment of the present application, the processing unit 13 is further configured to:
determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region according to the temperature file, and performing second face recognition processing on the image to be processed before the temperature of the face region is obtained to obtain the position of at least one face key point in the face region;
determining a second position of the forehead area in the image to be processed according to the position of the at least one face key point; the forehead area belongs to the face area;
and determining the second target pixel point region from the temperature thermodynamic diagram according to the coordinate system conversion relation and the second position.
In combination with any embodiment of the present application, the processing unit 13 is further configured to:
carrying out face comparison processing on the image to be processed and images in a face image library to obtain an image with the similarity exceeding a face similarity threshold value with the image to be processed as an identity image;
and determining the object to be detected as a person corresponding to the identity image.
In combination with any embodiment of the present application, the temperature measuring device belongs to a temperature measuring system, the temperature measuring system further includes a temperature measuring background device, and the method further includes:
the temperature measurement background equipment receives the temperature of the object to be detected sent by the temperature measurement device, and stores the position of the temperature measurement device as the measurement position of the temperature of the object to be detected in a database;
the temperature measurement background equipment acquires the position information of a monitoring area;
the temperature measurement background equipment uses the position information of the monitoring area to retrieve the database, and obtains the abnormal temperature of the measurement position in the monitoring area as the monitoring temperature; the abnormal temperature is a temperature exceeding a temperature safety threshold;
the temperature measurement background equipment obtains the safety level of the monitoring area according to the quantity of the monitoring temperatures and the mapping relation; the mapping relationship is a mapping relationship between the number of temperature data exceeding a temperature safety threshold and a safety level.
In combination with any embodiment of the present application, the temperature measurement system further includes a second temperature measurement terminal, and a position of the second temperature measurement terminal is different from a position of the temperature measurement device; the database of the temperature measurement background equipment comprises images sent by the temperature measurement device and images sent by the second temperature measurement terminal;
under the condition that the temperature of the object to be detected exceeds the temperature safety threshold, the temperature measurement background equipment uses the image to be processed sent by the temperature measurement device to search the database, and an image containing the object to be detected is obtained and serves as a monitoring image;
and the temperature measurement background equipment obtains the track of the object to be detected according to the position of the temperature measurement terminal which sends the monitoring image.
In this implementation, the temperature measuring device determines whether the image to be processed contains a face region by performing face detection processing on the image to be processed, so as to determine whether the image to be processed contains an object to be detected. The temperature measuring device acquires the temperature data only when it determines that the object to be detected is a living body, and obtains the temperature of the object to be detected according to the temperature data. This reduces the probability of measuring the temperature of something that is not a person and the probability of measuring the temperature of a non-living body, and thus reduces the amount of data processing. In addition, because the temperature measuring device obtains the temperature of the object to be detected from the temperature data, contact between staff and the person to be detected can be avoided, which improves detection efficiency.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present application may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Fig. 4 is a schematic diagram of a hardware structure of a temperature measuring device according to an embodiment of the present application. The thermometry device 2 includes a processor 21, a memory 22, an input device 23, and an output device 24. The processor 21, the memory 22, the input device 23 and the output device 24 are coupled by a connector, which includes various interfaces, transmission lines or buses, etc., and the embodiment of the present application is not limited thereto. It should be appreciated that in various embodiments of the present application, coupled refers to being interconnected in a particular manner, including being directly connected or indirectly connected through other devices, such as through various interfaces, transmission lines, buses, and the like.
The processor 21 may be one or more Graphics Processing Units (GPUs), and in the case that the processor 21 is one GPU, the GPU may be a single-core GPU or a multi-core GPU. Alternatively, the processor 21 may be a processor group composed of a plurality of GPUs, and the plurality of processors are coupled to each other through one or more buses. Alternatively, the processor may be other types of processors, and the like, and the embodiments of the present application are not limited.
Memory 22 may be used to store computer program instructions, as well as various types of computer program code for executing the program code of aspects of the present application. Alternatively, the memory includes, but is not limited to, Random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), which is used for associated instructions and data.
The input means 23 are for inputting data and/or signals and the output means 24 are for outputting data and/or signals. The input device 23 and the output device 24 may be separate devices or may be an integral device.
It is understood that, in the embodiment of the present application, the memory 22 may be used to store not only the relevant instructions, but also relevant data, for example, the memory 22 may be used to store the image to be processed acquired through the input device 23, or the memory 22 may also be used to store the temperature of the object to be detected obtained through the processor 21, and the like, and the embodiment of the present application is not limited to the data specifically stored in the memory.
It will be appreciated that figure 4 shows only a simplified design of the thermometric device. In practical applications, the temperature measuring devices may also respectively include other necessary components, including but not limited to any number of input/output devices, processors, memories, etc., and all temperature measuring devices that can implement the embodiments of the present application are within the scope of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It is also clear to those skilled in the art that the descriptions of the various embodiments of the present application have different emphasis, and for convenience and brevity of description, the same or similar parts may not be repeated in different embodiments, so that the parts that are not described or not described in detail in a certain embodiment may refer to the descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the application to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in or transmitted over a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)), or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., Digital Versatile Disk (DVD)), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media that can store program codes, such as a read-only memory (ROM) or a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Claims (12)

1. A method of measuring temperature, the method comprising:
acquiring an image to be processed;
under the condition that the image to be processed contains a face region and the object to be detected corresponding to the face region is a living body, acquiring temperature data of the image to be processed; the temperature data carries temperature information of object points corresponding to pixel points in the image to be processed;
and obtaining the temperature of the object to be detected according to the temperature data.
2. The method according to claim 1, wherein the temperature data includes a temperature thermodynamic diagram and a temperature file, the temperature file carries temperature information of object points corresponding to pixel points in the temperature thermodynamic diagram, and obtaining the temperature of the object to be detected according to the temperature data includes:
determining a pixel point region corresponding to the face region from the temperature thermodynamic diagram to obtain a first target pixel point region;
and determining the temperature of an object point corresponding to at least one pixel point in the first target pixel point region according to the temperature file, and obtaining the temperature of the face region as the temperature of the object to be detected.
3. The method of claim 2, wherein the thermographic map is acquired by a thermographic device and the image to be processed is acquired by an RGB imaging device;
the determining a pixel point region corresponding to the face region from the temperature thermodynamic diagram to obtain a first target pixel point region includes:
performing first face recognition processing on the image to be processed to obtain a first position of the face region in the image to be processed;
acquiring a first pose of the thermal imaging device in the process of acquiring the temperature thermodynamic diagram and a second pose of the RGB imaging device in the process of acquiring the image to be processed;
obtaining a pose conversion relation between the thermal imaging equipment and the RGB imaging equipment according to the first pose and the second pose, wherein the pose conversion relation is used as a coordinate system conversion relation between a pixel coordinate system of the temperature thermodynamic diagram and a pixel coordinate system of the image to be processed;
and converting the first position according to the coordinate system conversion relation, and determining the first target pixel point region from the temperature thermodynamic diagram.
4. The method according to any one of claims 1 to 3, further comprising:
and carrying out accessory detection processing on the image to be processed, and determining whether the object to be detected wears a preset accessory or not to obtain a detection result, wherein the preset accessory comprises a mask.
5. The method according to claim 4, wherein the performing of the accessory detection processing on the image to be processed to determine whether the object to be detected wears a predetermined accessory or not to obtain a detection result comprises:
performing feature extraction processing on the image to be processed to obtain first feature data; the first characteristic data carries information whether the object to be detected wears a mask or not;
and obtaining the detection result according to the first characteristic data.
6. The method according to claim 5, wherein in the case that the detection result is that the subject to be detected wears a mask, the method further comprises:
determining that the mask is worn correctly by the object to be detected under the condition that the key point of the nostril of the object to be detected and the key point of the mouth of the object to be detected are both located in the pixel point area covered by the facial mask; the facial mask is a mask worn by the object to be detected;
and under the condition that the key point of the nostril of the object to be detected and/or the key point of the mouth of the object to be detected are/is located outside the pixel point area covered by the facial mask, determining that the mask is not worn correctly by the object to be detected.
7. The method according to claim 5 or 6, wherein when the claim cited in claim 5 includes claim 2, and the detection result is that the mask is worn by the subject to be detected, the at least one pixel belongs to a second target pixel region in the first target pixel region, and the second target pixel region is a pixel region corresponding to a forehead region in the face region.
8. The method according to claim 7, wherein before the determining, according to the temperature file, the temperature of the object point corresponding to at least one pixel point in the first target pixel point region to obtain the temperature of the face region, the method further comprises:
performing second face recognition processing on the image to be processed to obtain the position of at least one face key point in the face region;
determining a second position of the forehead area in the image to be processed according to the position of the at least one face key point; the forehead area belongs to the face area;
and determining the second target pixel point region from the temperature thermodynamic diagram according to the coordinate system conversion relation and the second position.
9. The method according to any one of claims 1 to 8, further comprising:
carrying out face comparison processing on the image to be processed and images in a face image library to obtain an image with the similarity exceeding a face similarity threshold value with the image to be processed as an identity image;
and determining the object to be detected as a person corresponding to the identity image.
10. A temperature measuring device, said device comprising:
the first acquisition unit is used for acquiring an image to be processed;
the second acquisition unit is used for acquiring the temperature data of the image to be processed under the condition that the image to be processed contains a face area and an object to be detected corresponding to the face area is a living body; the temperature data carries temperature information of object points corresponding to pixel points in the image to be processed;
and the processing unit is used for obtaining the temperature of the object to be detected according to the temperature data.
11. An electronic device, comprising: a processor and a memory for storing computer program code comprising computer instructions which, if executed by the processor, the electronic device performs the method of any of claims 1 to 9.
12. A computer-readable storage medium, in which a computer program is stored, which computer program comprises program instructions which, if executed by a processor, cause the processor to carry out the method of any one of claims 1 to 9.
CN202010813228.8A 2020-08-13 2020-08-13 Temperature measuring method and device, electronic equipment and storage medium Pending CN111985377A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010813228.8A CN111985377A (en) 2020-08-13 2020-08-13 Temperature measuring method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111985377A true CN111985377A (en) 2020-11-24

Family

ID=73434272

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010813228.8A Pending CN111985377A (en) 2020-08-13 2020-08-13 Temperature measuring method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111985377A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111323125A (en) * 2020-02-28 2020-06-23 北京格灵深瞳信息技术有限公司 Temperature measurement method and device, computer storage medium and electronic equipment
CN111339951A (en) * 2020-02-26 2020-06-26 北京迈格威科技有限公司 Body temperature measuring method, device and system
CN111414831A (en) * 2020-03-13 2020-07-14 深圳市商汤科技有限公司 Monitoring method and system, electronic device and storage medium
CN111428559A (en) * 2020-02-19 2020-07-17 北京三快在线科技有限公司 Method and device for detecting wearing condition of mask, electronic equipment and storage medium
CN111522073A (en) * 2020-04-26 2020-08-11 北京都是科技有限公司 Method for detecting mask wearing condition of target object and thermal infrared image processor



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination