CN113936124B - Personnel identity recognition method and device based on big data collision detection - Google Patents
Personnel identity recognition method and device based on big data collision detection
- Publication number
- CN113936124B CN113936124B CN202111051069.3A CN202111051069A CN113936124B CN 113936124 B CN113936124 B CN 113936124B CN 202111051069 A CN202111051069 A CN 202111051069A CN 113936124 B CN113936124 B CN 113936124B
- Authority
- CN
- China
- Prior art keywords
- mobile equipment
- face
- face information
- address
- track
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 24
- 238000001514 detection method Methods 0.000 title claims abstract description 22
- 238000013507 mapping Methods 0.000 claims abstract description 49
- 239000000523 sample Substances 0.000 claims abstract description 20
- 238000004422 calculation algorithm Methods 0.000 claims abstract description 11
- 238000010586 diagram Methods 0.000 claims abstract description 10
- 238000004364 calculation method Methods 0.000 claims abstract description 8
- 238000012544 monitoring process Methods 0.000 claims description 38
- 230000004927 fusion Effects 0.000 claims description 9
- 238000004590 computer program Methods 0.000 claims description 3
- 238000013527 convolutional neural network Methods 0.000 claims description 3
- 238000005516 engineering process Methods 0.000 description 3
- 238000004458 analytical method Methods 0.000 description 2
- 238000013528 artificial neural network Methods 0.000 description 2
- 238000009432 framing Methods 0.000 description 2
- 239000011159 matrix material Substances 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 230000008878 coupling Effects 0.000 description 1
- 238000010168 coupling process Methods 0.000 description 1
- 238000005859 coupling reaction Methods 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000003203 everyday effect Effects 0.000 description 1
- 238000011176 pooling Methods 0.000 description 1
- 238000012163 sequencing technique Methods 0.000 description 1
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
Landscapes
- Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
The invention discloses a personnel identity recognition method based on big data collision detection, which comprises the following steps: acquiring face information and a mobile device address; establishing a preliminary mapping relation between the face information and the mobile device address according to the acquired data; establishing a face track and a mobile device address track according to the preliminary mapping relation, and obtaining a track diagram between the two tracks by calculation with the Fréchet distance algorithm; calculating the Fréchet distance between the acquired face track and the mobile device address track. The invention can effectively associate persons with their devices and track their whereabouts without requiring registration: a person who merely carries a mobile device through an area equipped with probes and face-capture devices can be detected and associated, which makes it convenient to track the mobile device signal, and thus the person's whereabouts, at a later stage.
Description
Technical Field
The invention relates to a personnel identity recognition method and device based on big data collision detection, and belongs to the technical field of identity recognition.
Background
With the development of science and technology, vehicle and pedestrian traffic has grown rapidly, and criminal methods have grown more sophisticated along with it. To improve case analysis and the tracking and positioning of persons, investment in the construction of vehicle checkpoints, face checkpoints, WiFi probes and similar infrastructure has been increased, and deployment capability has improved.
These sites generate large amounts of pedestrian, vehicle and probe data every day, but each data source is isolated and no consistent association can be formed between them. In places covered by only a single type of equipment, a person's whereabouts cannot be reconstructed, registering persons is cumbersome, and the resulting data is inaccurate.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a personnel identity recognition method and device based on big data collision detection, which solve the problems that existing equipment cannot reconstruct personnel whereabouts, that personnel registration is cumbersome, and that the data is inaccurate.
In order to achieve the above purpose, the invention is realized by adopting the following technical scheme:
in a first aspect, the present invention provides a person identification method based on big data collision detection, including:
acquiring face information and a mobile device address, wherein the face information and the mobile device address are obtained by selecting a plurality of monitoring time points in a pre-selected target area;
According to the obtained face information and the mobile equipment address, establishing a preliminary mapping relation between the face information and the mobile equipment address;
According to the preliminary mapping relation, establishing a face track and a mobile device address track, and obtaining a track diagram between the face track and the mobile device address track by calculation with the Fréchet distance algorithm;
calculating the Fréchet distance between the acquired face track and the mobile device address track and comparing it with a preset threshold value; if the Fréchet distance is smaller than the preset threshold value, determining a real mapping relation; if the Fréchet distance is larger than or equal to the threshold value, the face information does not match the mobile device address and the preliminary mapping relation is removed;
traversing all face tracks to obtain fusion data of the face information and mobile device addresses of all persons.
Further, the face information is captured by a face acquisition device and obtained by inputting the captured image into a deep convolutional neural network to extract face features, and the mobile device address is acquired and stored by a probe, wherein the face acquisition device and the probe are arranged in the target area in advance.
Further, according to the obtained face information and the mobile device address, a preliminary mapping relationship between the face information and the mobile device address is established, which comprises:
dividing the plurality of monitoring time points into {T1, T2, …, Tn}, n being a positive integer;
For the monitoring time T1, generating collision information according to the face information and the mobile device address, and recording the number of collisions and the collision time;
Updating the collision information by using, in sequence, the face information and the mobile device addresses acquired at the monitoring times T2 to Tn;
If the number of collisions reaches a preset threshold value, establishing a collision table relating the face information to the mobile device address according to the collision information;
And updating the collision table by using, in sequence, the face information and the mobile device addresses acquired at the monitoring times {T2, …, Tn}, further comprising: sorting the mobile device addresses corresponding to each piece of face information in the collision table by the number of collisions;
setting a time threshold, extracting the records whose collision count is 1 from the collision table, and deleting a record if the interval between its collision time and the current time exceeds the time threshold;
if at a monitoring time Ti (1 ≤ i ≤ n) the number of faces in the target area is 1, establishing a preliminary mapping relation between the face information and the mobile device address.
Further, one collision record is generated whenever the face information and the mobile device address are present together at the same time.
Further, the mobile device address includes the International Mobile Equipment Identity (IMEI), the Bluetooth MAC and the WiFi MAC.
In a second aspect, the present invention provides a person identification device based on big data collision detection, including:
the device comprises an acquisition unit, a monitoring unit and a control unit, wherein the acquisition unit is used for acquiring face information and a mobile device address obtained by monitoring at a plurality of monitoring time points in a pre-selected target area;
the mapping relation establishing unit is used for establishing a preliminary mapping relation between the face information and the mobile equipment address according to the acquired face information and the mobile equipment address;
The track relation establishing unit is used for establishing a face track and a mobile device address track according to the preliminary mapping relation, and for obtaining a track diagram between the face track and the mobile device address track by calculation with the Fréchet distance algorithm;
The computing unit is used for computing the Fréchet distance between the acquired face track and the mobile device address track and comparing it with a preset threshold value, determining a real mapping relation if the Fréchet distance is smaller than the preset threshold value, and removing the preliminary mapping relation if the Fréchet distance is larger than or equal to the threshold value, in which case the face information does not match the mobile device address;
and the traversing unit is used for traversing all face tracks to obtain fusion data of the face information and mobile device addresses of all persons.
Further, the device comprises a face acquisition device and a probe, which are arranged in the target area in advance; the face acquisition device is used for acquiring the face information, and the probe is used for acquiring and storing the mobile device address.
Further, the mobile device address includes the International Mobile Equipment Identity (IMEI), the Bluetooth MAC and the WiFi MAC.
In a third aspect, the invention provides a personnel identity recognition device based on big data collision detection, which comprises a processor and a storage medium;
The storage medium is used for storing instructions;
the processor is operative to perform the steps of any one of the methods described above in accordance with the instructions.
In a fourth aspect, the present invention provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of any of the methods described above.
Compared with the prior art, the invention has the beneficial effects that:
The invention combines probe and face recognition technology with several algorithms to perform association by collision. It can effectively associate persons with their devices and track their whereabouts without requiring registration: a person who merely carries a mobile device through an area equipped with probes and face-capture devices can be detected and associated, which makes it convenient to track the mobile device signal, and thus the person's whereabouts, at a later stage.
Drawings
Fig. 1 is a flowchart of a person identification method based on big data collision detection according to an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings. The following examples are only for more clearly illustrating the technical aspects of the present invention, and are not intended to limit the scope of the present invention.
Example 1
The embodiment introduces a personnel identity recognition method based on big data collision detection, which comprises the following steps:
acquiring face information and a mobile device address, wherein the face information and the mobile device address are obtained by selecting a plurality of monitoring time points in a pre-selected target area;
According to the obtained face information and the mobile equipment address, establishing a preliminary mapping relation between the face information and the mobile equipment address;
According to the preliminary mapping relation, establishing a face track and a mobile device address track, and obtaining a track diagram between the face track and the mobile device address track by calculation with the Fréchet distance algorithm;
calculating the Fréchet distance between the acquired face track and the mobile device address track and comparing it with a preset threshold value; if the Fréchet distance is smaller than the preset threshold value, determining a real mapping relation; if the Fréchet distance is larger than or equal to the threshold value, the face information does not match the mobile device address and the preliminary mapping relation is removed;
traversing all face tracks to obtain fusion data of the face information and mobile device addresses of all persons.
The application process of the personnel identity recognition method based on big data collision detection provided by the embodiment specifically relates to the following steps:
Referring to fig. 1, the whole method flow is as follows:
Step one, information acquisition
Face recognition equipment and probe equipment are deployed at each checkpoint for face capture and device detection.
Select a plurality of monitoring time points {T1, T2, …, Tn} (n is a positive integer), capture images of persons in the target area with the face recognition equipment and extract face information from them, acquire the mobile device identification numbers in the target area with the probe equipment, and store the information.
Mobile device identification number: includes the International Mobile Equipment Identity (IMEI), the Bluetooth MAC, the WiFi MAC, etc.
For the face information, a color histogram and a gradient histogram are extracted from the head-shoulder region of the image and combined into a feature representing the face, and a face feature value is obtained by computing the frame-to-frame matching degree of the head-shoulder features. After the face is aligned and calibrated based on facial feature points, it is input to a deep convolutional neural network to extract features of dimension 1024. The deep convolutional neural network is composed of a residual network whose convolutional layers are built from multi-dimensional convolution kernels and pooling layers; it uses a cosine-distance loss function as the evaluation function and finally produces the convolutional-neural-network parameters used to extract face features.
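The following is a minimal Python sketch of such a feature extractor. The ResNet-18 backbone, the 112×112 input size and the unit-norm step are illustrative assumptions, not the patent's exact network, which only requires a residual convolutional network producing 1024-dimensional features compared by cosine distance.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class FaceEmbedder(nn.Module):
    """Sketch of a residual-network face feature extractor with 1024-dimensional output."""
    def __init__(self, dim=1024):
        super().__init__()
        backbone = models.resnet18(weights=None)               # residual network backbone (assumed choice)
        backbone.fc = nn.Linear(backbone.fc.in_features, dim)  # project to 1024-dimensional face features
        self.backbone = backbone

    def forward(self, x):                                      # x: batch of aligned, calibrated face crops
        feats = self.backbone(x)
        return nn.functional.normalize(feats, dim=1)           # unit norm so cosine distance is meaningful

# cosine similarity between two face feature vectors
model = FaceEmbedder()
a = model(torch.randn(1, 3, 112, 112))
b = model(torch.randn(1, 3, 112, 112))
similarity = torch.nn.functional.cosine_similarity(a, b)
```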
Step two, collision relation
Collision: when a face and a mobile device identification number appear together at the same time, one collision is recorded.
For the monitoring time T1, generate collision information (i.e. records pairing faces with mobile device identification numbers) from the face and mobile device information acquired in the previous step, and record the number of collisions and the collision time. Then update this collision information using, in sequence, the faces and mobile device identification numbers acquired at the monitoring times T2 to Tn.
If the number of collisions reaches the threshold value set by the system, a correspondence between the face and the mobile device identification number is established according to the collision information. The update is performed as follows:
(1) For each face acquired at a monitoring time Tj (2 ≤ j ≤ n), judge in turn whether it is a face already extracted during the T1 monitoring period;
(2) If so, check against the collision information whether each mobile device identification number acquired at Tj has already collided with the current face; if it has, increment the corresponding collision count by one and update the collision time to Tj; otherwise, add the mobile device identification number to the collision information.
(3) If the face is not one monitored at T1, add the face and the mobile device identification numbers that collide with it to the collision information, and record the number of collisions and the collision time.
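As an illustration of steps (1)-(3), the following Python sketch keeps the collision information in a nested dictionary; the field names, key format and timestamp values are hypothetical, chosen only to mirror the counting and time-stamping logic described above.

```python
from collections import defaultdict

# collision_info[face_id][device_id] -> [collision_count, last_collision_time]
# face_id / device_id / timestamps are illustrative placeholders, not the patent's fields.
collision_info = defaultdict(lambda: defaultdict(lambda: [0, None]))

def update_collisions(collision_info, observations):
    """Update the collision information with the faces and mobile device identification
    numbers captured together at one monitoring time point Tj."""
    for face_id, device_id, timestamp in observations:
        record = collision_info[face_id][device_id]  # new faces/devices start at count 0
        record[0] += 1                               # the collision count is incremented by one
        record[1] = timestamp                        # the collision time is updated to Tj
    return collision_info

# observations at monitoring time T2: (face, device, time) triples seen together
update_collisions(collision_info, [("face_A", "imei_123", "T2"), ("face_A", "mac_456", "T2")])
```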
Step three, collision analysis
Update the collision table using, in sequence, the faces and mobile device identification numbers acquired at the monitoring times {T2, …, Tn}, and then sort the mobile device identification numbers corresponding to each face in the collision table by the number of collisions.
Step four: set a time threshold, extract the records whose collision count is 1 from the collision table, and delete a record if the interval between its collision time and the current time exceeds the time threshold.
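A possible sketch of the sorting in step three and the pruning in step four, continuing the hypothetical collision_info structure from the previous sketch; the numeric-timestamp arithmetic is an assumption made only for illustration.

```python
def sort_devices_by_collisions(collision_info, face_id):
    """Return the device identification numbers for one face, sorted by collision count, descending."""
    devices = collision_info[face_id]
    return sorted(devices, key=lambda dev: devices[dev][0], reverse=True)

def prune_stale_records(collision_info, current_time, time_threshold):
    """Delete records whose collision count is 1 and whose last collision time is too old."""
    for face_id in list(collision_info):
        for device_id in list(collision_info[face_id]):
            count, last_time = collision_info[face_id][device_id]
            if count == 1 and current_time - last_time > time_threshold:
                del collision_info[face_id][device_id]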
Step five: if at a monitoring time Ti (1 ≤ i ≤ n) the number of faces in the target area is 1, establish a preliminary mapping relation between that face and the mobile device identification numbers acquired by the probes. A local mapping relation is thus formed.
Step six: find all device and person trajectories according to the mapping relation, and compute the trajectory diagram of devices and persons with the Fréchet distance algorithm.
Device trajectory: f: [a, b] → V
Person trajectory: g: [a', b'] → V
Fréchet distance: δF(f, g) = inf over all reparameterizations α, β of max over t ∈ [0, 1] of d(f(α(t)), g(β(t)))
This expression means that, over all possible reparameterization functions α(t) and β(t), we take the smallest achievable value of the longest distance between the person and the device during the whole movement; this minimum of the maxima is the Fréchet distance.
A recursive method may be employed to calculate the discrete Fréchet distance.
We use a matrix to find an upper bound on the Fréchet distance: the two continuous curves are sampled at key points taken as endpoints, and the coupling distance is then computed between these endpoint pairs.
ca(i, j) is the coupling (Fréchet) distance when the two curves are considered up to points i and j.
d(ui, vj) is the Euclidean distance between the points ui and vj.
Wherein: ca (i, j) =max { min (c (i-1, j), c (i-1, j-1), c (i, j-1)), d (ui, vj) }
The person and the device can reach points i and j from the three predecessor positions above. We take the smallest of those three values, compare it with the distance at the point pair (i, j), and choose the larger of the two as the Fréchet distance at (i, j).
ca is an n × m matrix, where n and m are the numbers of points on curves 1 and 2 respectively; it stores all intermediate results.
The overall calculation works as follows: the function is called with the indices of the last points of the two curves and then recurses; the base case is reached when an index reaches 0, i.e. the first point.
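A minimal Python sketch of the discrete Fréchet distance using the ca(i, j) recurrence and the n × m matrix described above; it fills the matrix bottom-up rather than by recursion, which gives the same result without deep call stacks. The point format and function name are assumptions for illustration.

```python
import math

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between two trajectories P and Q,
    each given as a list of (x, y) points."""
    n, m = len(P), len(Q)
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])  # Euclidean distance d(ui, vj)
    ca = [[0.0] * m for _ in range(n)]                        # n x m matrix of partial results
    for i in range(n):
        for j in range(m):
            d_ij = dist(P[i], Q[j])
            if i == 0 and j == 0:
                ca[i][j] = d_ij
            elif i == 0:                 # can only come from (0, j-1)
                ca[i][j] = max(ca[0][j - 1], d_ij)
            elif j == 0:                 # can only come from (i-1, 0)
                ca[i][j] = max(ca[i - 1][0], d_ij)
            else:                        # the three predecessor positions of the recurrence
                ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1], ca[i][j - 1]), d_ij)
    return ca[n - 1][m - 1]
```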
In order to associate each acquired mobile device with a face, the degree of coincidence between the face track and the mobile device address track is computed. For each face track Fi in the set F, compute the Fréchet distance between Fi and every mobile device address track Cj in the set C, and take the mobile device address track C with the minimum Fréchet distance. If the Fréchet distance between Fi and C is smaller than the set threshold, Fi and C are judged to be the face and mobile device address of the same person and a real mapping relation is determined; if it is larger than or equal to the threshold, the face and the mobile device address are considered not to match and the preliminary mapping relation from the previous step is removed. Traversing all face tracks yields the fused data of the faces and mobile device addresses of all persons.
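The matching step just described can be sketched as follows, reusing the discrete_frechet function above; the dictionary-based track containers and the threshold variable are assumptions made for illustration.

```python
def match_face_to_device_tracks(face_tracks, device_tracks, threshold):
    """For each face track Fi, find the device address track C with the smallest
    Fréchet distance and keep the pair only if that distance is below the threshold."""
    real_mapping = {}
    for face_id, f_track in face_tracks.items():
        best_device, best_dist = None, float("inf")
        for device_addr, c_track in device_tracks.items():
            d = discrete_frechet(f_track, c_track)
            if d < best_dist:
                best_device, best_dist = device_addr, d
        if best_device is not None and best_dist < threshold:
            real_mapping[face_id] = best_device   # same person: real mapping confirmed
        # otherwise the preliminary mapping for this face is removed
    return real_mapping
```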
Example 2
The embodiment provides a personnel identity recognition device based on big data collision detection, which comprises:
the device comprises an acquisition unit, a monitoring unit and a control unit, wherein the acquisition unit is used for acquiring face information and a mobile device address obtained by monitoring at a plurality of monitoring time points in a pre-selected target area;
the mapping relation establishing unit is used for establishing a preliminary mapping relation between the face information and the mobile equipment address according to the acquired face information and the mobile equipment address;
The track relation establishing unit is used for establishing a face track and a mobile device address track according to the preliminary mapping relation, and for obtaining a track diagram between the face track and the mobile device address track by calculation with the Fréchet distance algorithm;
The computing unit is used for computing the Fréchet distance between the acquired face track and the mobile device address track and comparing it with a preset threshold value, determining a real mapping relation if the Fréchet distance is smaller than the preset threshold value, and removing the preliminary mapping relation if the Fréchet distance is larger than or equal to the threshold value, in which case the face information does not match the mobile device address;
and the traversing unit is used for traversing all face tracks to obtain fusion data of the face information and mobile device addresses of all persons.
Further, the device comprises a face acquisition device and a probe, which are arranged in the target area in advance; the face acquisition device is used for acquiring the face information, and the probe is used for acquiring and storing the mobile device address.
Further, the mobile device address includes the International Mobile Equipment Identity (IMEI), the Bluetooth MAC and the WiFi MAC.
Example 3
The embodiment provides a personnel identity recognition device based on big data collision detection, which comprises a processor and a storage medium;
The storage medium is used for storing instructions;
the processor being operative according to the instructions to perform the steps of any one of the methods described above:
acquiring face information and a mobile device address, wherein the face information and the mobile device address are obtained by selecting a plurality of monitoring time points in a pre-selected target area;
According to the obtained face information and the mobile equipment address, establishing a preliminary mapping relation between the face information and the mobile equipment address;
According to the preliminary mapping relation, establishing a face track and a mobile device address track, and obtaining a track diagram between the face track and the mobile device address track by calculation with the Fréchet distance algorithm;
calculating the Fréchet distance between the acquired face track and the mobile device address track and comparing it with a preset threshold value; if the Fréchet distance is smaller than the preset threshold value, determining a real mapping relation; if the Fréchet distance is larger than or equal to the threshold value, the face information does not match the mobile device address and the preliminary mapping relation is removed;
traversing all face tracks to obtain fusion data of the face information and mobile device addresses of all persons.
Example 4
The present embodiment provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods described above:
acquiring face information and a mobile device address, wherein the face information and the mobile device address are obtained by selecting a plurality of monitoring time points in a pre-selected target area;
According to the obtained face information and the mobile equipment address, establishing a preliminary mapping relation between the face information and the mobile equipment address;
According to the preliminary mapping relation, establishing a face track and a mobile device address track, and obtaining a track diagram between the face track and the mobile device address track by calculation with the Fréchet distance algorithm;
calculating the Fréchet distance between the acquired face track and the mobile device address track and comparing it with a preset threshold value; if the Fréchet distance is smaller than the preset threshold value, determining a real mapping relation; if the Fréchet distance is larger than or equal to the threshold value, the face information does not match the mobile device address and the preliminary mapping relation is removed;
traversing all face tracks to obtain fusion data of the face information and mobile device addresses of all persons.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that modifications and variations could be made by those skilled in the art without departing from the technical principles of the present invention, and such modifications and variations should also be regarded as being within the scope of the invention.
Claims (9)
1. The personnel identity recognition method based on big data collision detection is characterized by comprising the following steps of:
acquiring face information and a mobile device address, wherein the face information and the mobile device address are obtained by selecting a plurality of monitoring time points in a pre-selected target area;
According to the obtained face information and the mobile equipment address, establishing a preliminary mapping relation between the face information and the mobile equipment address;
According to the preliminary mapping relation, establishing a face track and a mobile device address track, and obtaining a track diagram between the face track and the mobile device address track by calculation with the Fréchet distance algorithm;
calculating the Fréchet distance between the acquired face track and the mobile device address track and comparing it with a preset threshold value; if the Fréchet distance is smaller than the preset threshold value, determining a real mapping relation; if the Fréchet distance is larger than or equal to the threshold value, the face information does not match the mobile device address and the preliminary mapping relation is removed;
Traversing all face tracks to obtain fusion data of the face information and mobile device addresses of all persons;
according to the obtained face information and the mobile equipment address, establishing a preliminary mapping relation between the face information and the mobile equipment address, wherein the method comprises the following steps:
dividing the plurality of monitoring time points into {T1, T2, …, Tn}, n being a positive integer;
For the monitoring time T1, generating collision information according to the face information and the mobile device address, and recording the number of collisions and the collision time;
Updating the collision information by using, in sequence, the face information and the mobile device addresses acquired at the monitoring times T2 to Tn;
If the number of collisions reaches a preset threshold value, establishing a collision table relating the face information to the mobile device address according to the collision information;
and updating the collision table by using, in sequence, the face information and the mobile device addresses acquired at the monitoring times {T2, …, Tn}, further comprising: sorting the mobile device addresses corresponding to each piece of face information in the collision table by the number of collisions;
setting a time threshold, extracting the records whose collision count is 1 from the collision table, and deleting a record if the interval between its collision time and the current time exceeds the time threshold;
if at a monitoring time Ti (1 ≤ i ≤ n) the number of faces in the target area is 1, establishing a preliminary mapping relation between the face information and the mobile device address.
2. The personal identification method based on big data collision detection according to claim 1, wherein: the face information is captured by a face acquisition device and obtained by inputting the captured image into a deep convolutional neural network to extract face features, and the mobile device address is acquired and stored by a probe, wherein the face acquisition device and the probe are arranged in a target area in advance.
3. The personal identification method based on big data collision detection according to claim 2, wherein: one collision record is generated whenever the face information and the mobile device address are present together at the same time.
4. A person identification method based on big data collision detection as claimed in claim 3, characterized in that: the mobile device address comprises the International Mobile Equipment Identity (IMEI), the Bluetooth MAC and the WiFi MAC.
5. Personnel identification device based on big data collision detection, characterized by comprising:
the device comprises an acquisition unit, a monitoring unit and a control unit, wherein the acquisition unit is used for acquiring face information and a mobile device address obtained by monitoring at a plurality of monitoring time points in a pre-selected target area;
the mapping relation establishing unit is used for establishing a preliminary mapping relation between the face information and the mobile equipment address according to the acquired face information and the mobile equipment address;
The track relation establishing unit is used for establishing a face track and a mobile device address track according to the preliminary mapping relation, and for obtaining a track diagram between the face track and the mobile device address track by calculation with the Fréchet distance algorithm;
The computing unit is used for computing the Fréchet distance between the acquired face track and the mobile device address track and comparing it with a preset threshold value, determining a real mapping relation if the Fréchet distance is smaller than the preset threshold value, and removing the preliminary mapping relation if the Fréchet distance is larger than or equal to the threshold value, in which case the face information does not match the mobile device address;
the traversing unit is used for traversing all face tracks to obtain fusion data of the face information and mobile device addresses of all persons;
in the mapping relation establishing unit, according to the obtained face information and the mobile device address, a preliminary mapping relation between the face information and the mobile device address is established, which comprises the following steps:
dividing the plurality of monitoring time points into {T1, T2, …, Tn}, n being a positive integer;
For the monitoring time T1, generating collision information according to the face information and the mobile device address, and recording the number of collisions and the collision time;
Updating the collision information by using, in sequence, the face information and the mobile device addresses acquired at the monitoring times T2 to Tn;
If the number of collisions reaches a preset threshold value, establishing a collision table relating the face information to the mobile device address according to the collision information;
and updating the collision table by using, in sequence, the face information and the mobile device addresses acquired at the monitoring times {T2, …, Tn}, further comprising: sorting the mobile device addresses corresponding to each piece of face information in the collision table by the number of collisions;
setting a time threshold, extracting the records whose collision count is 1 from the collision table, and deleting a record if the interval between its collision time and the current time exceeds the time threshold;
if at a monitoring time Ti (1 ≤ i ≤ n) the number of faces in the target area is 1, establishing a preliminary mapping relation between the face information and the mobile device address.
6. The personal identification device based on big data collision detection according to claim 5, further comprising a face acquisition device and a probe, which are arranged in advance in a target area, wherein the face acquisition device is configured to acquire the face information, and the probe is configured to acquire and store the mobile device address.
7. The personal identification device based on big data collision detection of claim 6, wherein the mobile device address comprises the International Mobile Equipment Identity (IMEI), the Bluetooth MAC and the WiFi MAC.
8. A personnel identity recognition device based on big data collision detection is characterized in that: comprises a processor and a storage medium;
The storage medium is used for storing instructions;
the processor being operative according to the instructions to perform the steps of the method according to any one of claims 1 to 4.
9. A computer-readable storage medium having stored thereon a computer program, characterized by: the program, when executed by a processor, implements the steps of the method of any of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111051069.3A CN113936124B (en) | 2021-09-08 | 2021-09-08 | Personnel identity recognition method and device based on big data collision detection |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111051069.3A CN113936124B (en) | 2021-09-08 | 2021-09-08 | Personnel identity recognition method and device based on big data collision detection |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113936124A CN113936124A (en) | 2022-01-14 |
CN113936124B true CN113936124B (en) | 2024-09-27 |
Family
ID=79275350
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111051069.3A Active CN113936124B (en) | 2021-09-08 | 2021-09-08 | Personnel identity recognition method and device based on big data collision detection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113936124B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108536749A (en) * | 2018-03-12 | 2018-09-14 | 南京甄视智能科技有限公司 | The method for building personnel's Track View based on collision detection method |
CN109711370A (en) * | 2018-12-29 | 2019-05-03 | 北京博睿视科技有限责任公司 | A kind of data anastomosing algorithm based on WIFI detection and face cluster |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10198625B1 (en) * | 2016-03-26 | 2019-02-05 | Videomining Corporation | Association of unique person to a mobile device using repeat face image matching |
CN107026831B (en) * | 2016-09-26 | 2020-03-10 | 阿里巴巴集团控股有限公司 | Identity recognition method and device |
US10885758B2 (en) * | 2018-11-20 | 2021-01-05 | Transocean Sedeo Forex Ventures Limited | Proximity-based personnel safety system and method |
CN111949699A (en) * | 2019-05-14 | 2020-11-17 | 西安光启未来技术研究院 | Trajectory collision method and system based on multiple verifications |
- 2021
- 2021-09-08 CN CN202111051069.3A patent/CN113936124B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108536749A (en) * | 2018-03-12 | 2018-09-14 | 南京甄视智能科技有限公司 | The method for building personnel's Track View based on collision detection method |
CN109711370A (en) * | 2018-12-29 | 2019-05-03 | 北京博睿视科技有限责任公司 | A kind of data anastomosing algorithm based on WIFI detection and face cluster |
Also Published As
Publication number | Publication date |
---|---|
CN113936124A (en) | 2022-01-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110418114B (en) | Object tracking method and device, electronic equipment and storage medium | |
CN107483416A (en) | The method and device of authentication | |
CN109766755B (en) | Face recognition method and related product | |
CN101496074A (en) | Device and method for detecting suspicious activity, program, and recording medium | |
CN103942811A (en) | Method and system for determining motion trajectory of characteristic object in distributed and parallel mode | |
CN101430827B (en) | Taxi wireless video monitoring system and method based on GPS | |
CN108536749B (en) | Method for constructing person track view based on collision detection method | |
CN110991397B (en) | Travel direction determining method and related equipment | |
CN110263633A (en) | The personnel that are involved in drug traffic based on space time correlation detect method for early warning, system and storage medium | |
CN108710841B (en) | Human face living body detection device and method based on MEMs infrared array sensor | |
CN108052887A (en) | A kind of doubtful illegal land automatic recognition system and method for merging SLAM/GNSS information | |
CN106934773B (en) | Video moving target and Mac address matching method | |
CN112770265B (en) | Pedestrian identity information acquisition method, system, server and storage medium | |
CN109634946A (en) | A kind of track intelligent Matching association analysis algorithm model excavated based on big data | |
CN112528706A (en) | Personnel identification system and method thereof | |
CN114492590B (en) | Boundary channel generation method and device based on track clustering | |
CN106452716A (en) | Unmanned aerial vehicle signal recognition and detection algorithm based on Hash fingerprint | |
CN106326835A (en) | Human face data collection statistical system and method for gas station convenience store | |
WO2018176191A1 (en) | Method and apparatus for identifying vehicle with fake registration plate | |
CN110765823A (en) | Target identification method and device | |
CN112733598A (en) | Vehicle law violation determination method and device, computer equipment and storage medium | |
CN109963113B (en) | Method and device for monitoring interested target | |
CN113724289A (en) | Video image multi-target tracking method, system, storage medium and equipment | |
CN113936124B (en) | Personnel identity recognition method and device based on big data collision detection | |
Jaikumar et al. | Fingerprint based student attendance system with SMS alert to parents |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||