CN114037880A - Data processing method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN114037880A (application number CN202010698030.XA)
- Authority
- CN
- China
- Prior art keywords
- target
- data
- characteristic data
- biological characteristic
- historical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F18/00—Pattern recognition
        - G06F18/20—Analysing
          - G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
            - G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
          - G06F18/22—Matching criteria, e.g. proximity measures
          - G06F18/24—Classification techniques
            - G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
      - G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
        - G06F21/60—Protecting data
          - G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
            - G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
              - G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
Abstract
The application provides a data processing method comprising the following steps: obtaining target biometric data; establishing, in a target terminal, a correspondence between the target biometric data and target user information; and labeling the target biometric data in the target terminal according to the correspondence to obtain labeled data corresponding to the target biometric data, and using that labeled data as data for training a biometric data processing model, the biometric data processing model being a network model used by a server to process biometric data. The method enables a user's biometric data to be labeled for training the biometric data processing model without the target user uploading any user data to a server or cloud, thereby eliminating the privacy leakage that such uploading would cause.
Description
Technical Field
The application relates to the field of computer technology, and in particular to a data processing method. The application also relates to a data processing apparatus, an electronic device, and a storage medium.
Background
With the rapid development of computer technology and the arrival of the big-data era, network platforms and artificial intelligence devices have emerged in large numbers. To serve users better, these platforms and devices often need to collect user data and use it to provide accurate, personalized services. For example, e-commerce platforms collect user data to provide personalized product recommendations, and intelligent voice terminals collect user biometric data to perform personalized biometric authentication, content recommendation, and instruction recommendation. Providing such services mostly relies on deep neural network models; an intelligent voice terminal, for instance, typically recognizes and processes the collected biometric data with speech recognition and image recognition network models in order to identify and authenticate the user.
To keep a deep neural network model accurate, it must be trained and updated continuously. During training, user data has traditionally been collected by a server or cloud and given specific annotations, such as data-attribute labels, to complete the training and updating of the network model. Because user data, and biometric data in particular, involves personal privacy, users who are sensitive to security and privacy are very likely to refuse to share it. Moreover, under the GDPR (General Data Protection Regulation), which has formally taken effect, no organization or company may upload and store the data of European Union users without their explicit permission. Given these factors, how to label user data so as to train and update a deep network model when the user is unwilling to upload that data to a server or cloud has become a challenging problem that urgently needs solving.
The core idea in the prior art for training and updating a deep network model without the user uploading data to a server or cloud is to complete the training on the user terminal and upload only part of the deep learning model's weights. The prior art, however, offers no solution to the prerequisite problem: how to annotate the user data on the terminal in the first place.
Disclosure of Invention
The application provides a data processing method, a data processing apparatus, an electronic device, and a storage medium, so that a user's biometric data can be labeled without the target user uploading any user data to a server or cloud.
The application provides a data processing method comprising the following steps:
obtaining target biometric data;
establishing, in a target terminal, a correspondence between the target biometric data and target user information;
and labeling the target biometric data in the target terminal according to the correspondence to obtain labeled data corresponding to the target biometric data, and using that labeled data as data for training a biometric data processing model, the biometric data processing model being a network model used by a server to process biometric data.
Optionally, establishing the correspondence between the target biometric data and the target user information in the target terminal includes:
determining whether target historical biometric data matching the target biometric data exists in the target terminal;
and if so, taking the user information corresponding to the target historical biometric data as the target user information and establishing the correspondence.
Optionally, the method further includes: if no target historical biometric data matching the target biometric data exists in the target terminal, creating user information for the target biometric data in the target terminal as the target user information, and establishing the correspondence.
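The match-or-create decision described above can be sketched as follows. The cosine metric, the threshold value, the record layout, and the `user_N` naming scheme are all illustrative assumptions; the application does not prescribe any of them.

```python
import math

SIMILARITY_THRESHOLD = 0.9  # assumed value; the claims only require "a similarity threshold"

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_or_create_user(target_features, history):
    """Return the user id of a matching historical record; when no record
    reaches the threshold, create a new user entry on the terminal.

    `history` is a list of dicts: {"user_id": str, "features": [float]}.
    """
    best = max(history,
               key=lambda r: cosine_similarity(target_features, r["features"]),
               default=None)
    if best and cosine_similarity(target_features, best["features"]) >= SIMILARITY_THRESHOLD:
        return best["user_id"]
    new_id = f"user_{len(history) + 1}"   # hypothetical naming scheme
    history.append({"user_id": new_id, "features": list(target_features)})
    return new_id
```

Either branch ends with a correspondence between the target biometric data and some user record held entirely on the terminal.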
Optionally, determining whether target historical biometric data matching the target biometric data exists in the target terminal includes:
obtaining historical biometric data in the target terminal;
comparing the target biometric data with the historical biometric data to obtain their similarity;
and taking historical biometric data whose similarity reaches a similarity threshold as the target historical biometric data.
Optionally, comparing the target biometric data with the historical biometric data to obtain their similarity includes: comparing the physiological feature data and the behavioral feature data of the target object in the target biometric data with the physiological feature data and the behavioral feature data of the object in the historical biometric data, respectively, to obtain a first similarity between the two sets of physiological feature data and a second similarity between the two sets of behavioral feature data;
and taking historical biometric data whose similarity reaches a similarity threshold as the target historical biometric data includes: taking historical biometric data whose first similarity reaches a first similarity threshold, or whose second similarity reaches a second similarity threshold, as the target historical biometric data.
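The disjunctive matching rule above (first similarity OR second similarity reaching its threshold) can be sketched as below. The distance metric, the threshold values, and the record fields are assumptions for illustration only.

```python
def similarity(a, b):
    # Placeholder metric: 1 - normalized L1 distance, assuming components in [0, 1].
    # A real system would use an embedding-space metric instead.
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def find_target_history(target_phys, target_behav, history,
                        first_threshold=0.9, second_threshold=0.85):
    """Return the user ids of historical records that match the target:
    a record matches when its physiological similarity reaches the first
    threshold OR its behavioral similarity reaches the second."""
    matches = []
    for rec in history:
        s1 = similarity(target_phys, rec["phys"])    # first similarity
        s2 = similarity(target_behav, rec["behav"])  # second similarity
        if s1 >= first_threshold or s2 >= second_threshold:
            matches.append(rec["user_id"])
    return matches
```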
Optionally, labeling the target biometric data in the target terminal according to the correspondence to obtain labeled data corresponding to the target biometric data includes:
fusing, according to the correspondence, the target biometric data with the target historical biometric data in the target terminal to obtain the target biometric data corresponding to the target user information;
and labeling, in the target terminal, the target biometric data corresponding to the target user information to obtain the labeled data corresponding to the target biometric data.
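Once the correspondence exists, labeling on the terminal amounts to attaching the target user information to the feature data as its annotation. The record fields below (`features`, `label`, `data_type`) are hypothetical names, not part of the claims:

```python
def label_biometric_data(target_features, user_id, data_type="face"):
    """Produce an annotation record entirely on the terminal.

    The claims only require that the annotation encode the correspondence
    to the target user information; the exact fields are illustrative.
    """
    return {
        "features": list(target_features),
        "label": user_id,        # the established correspondence
        "data_type": data_type,  # attribute/type information for training
    }
```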
Optionally, fusing the target biometric data with the target historical biometric data in the target terminal according to the correspondence includes: obtaining a third similarity between the physiological feature data of the target object in the target biometric data and the physiological feature data of the object in the target historical biometric data, and merging the two sets of physiological feature data when the third similarity is greater than a third similarity threshold.
Optionally, the fusing includes: obtaining a fourth similarity between the behavioral feature data of the target object in the target biometric data and the behavioral feature data of the object in the target historical biometric data, and merging the two sets of behavioral feature data when the fourth similarity is greater than a fourth similarity threshold.
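The conditional merging in the two fusion clauses above might look like this; element-wise averaging is an assumed stand-in for whatever fusion operation an implementation actually uses:

```python
def fuse_features(target_vec, history_vec, sim, threshold):
    """Merge two feature vectors when their similarity exceeds the threshold.

    Below the threshold the historical data is kept unchanged; above it,
    element-wise averaging stands in for the real fusion operation.
    """
    if sim <= threshold:
        return list(history_vec)
    return [(t + h) / 2.0 for t, h in zip(target_vec, history_vec)]
```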
Optionally, the physiological feature data of the target object includes facial physiological feature data of the target object.
Optionally, the behavioral feature data of the target object includes voice behavioral feature data of the target object.
Optionally, obtaining the target biometric data includes:
obtaining video data including a target object;
and determining the target object in the video data and extracting its biometric features to obtain the target biometric data corresponding to the target object.
Optionally, obtaining the video data including the target object includes: acquiring the video data through the target terminal.
Optionally, the method further includes: after the video data is obtained, blurring the portions of the video data corresponding to objects other than the target object; and after the target biometric data is obtained, blurring the portions of the video data corresponding to the target object.
In another aspect, the application provides a data processing apparatus applied to a target terminal, including:
a biometric data obtaining unit for obtaining target biometric data;
a correspondence establishing unit for establishing a correspondence between the target biometric data and target user information;
and a data labeling unit for labeling the target biometric data according to the correspondence to obtain labeled data corresponding to the target biometric data, and for using that labeled data as data for training a biometric data processing model, the biometric data processing model being a network model used by a server to process biometric data.
In another aspect, the application provides an electronic device including:
a processor; and
a memory storing a program for the data processing method; after the device is powered on and the processor runs the program, the device performs the following steps:
obtaining target biometric data;
establishing, in a target terminal, a correspondence between the target biometric data and target user information;
and labeling the target biometric data in the target terminal according to the correspondence to obtain labeled data corresponding to the target biometric data, and using that labeled data as data for training a biometric data processing model, the biometric data processing model being a network model used by a server to process biometric data.
In another aspect, the application provides a storage medium storing a program for the data processing method; when executed by a processor, the program performs the following steps:
obtaining target biometric data;
establishing, in a target terminal, a correspondence between the target biometric data and target user information;
and labeling the target biometric data in the target terminal according to the correspondence to obtain labeled data corresponding to the target biometric data, and using that labeled data as data for training a biometric data processing model, the biometric data processing model being a network model used by a server to process biometric data.
In another aspect, the application provides a data processing method including:
obtaining target biometric data;
establishing, in a target terminal, a correspondence between the target biometric data and target user information;
labeling the target biometric data in the target terminal according to the correspondence to obtain labeled data corresponding to the target biometric data, and using that labeled data as data for training a biometric data recognition model, the biometric data recognition model being a network model used by a server to process biometric data;
and obtaining a target biometric data recognition model to be trained, and training it in the user terminal with the labeled data corresponding to the target biometric data.
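The on-terminal training step can be illustrated with a deliberately tiny stand-in for the network model: a logistic-regression classifier trained by stochastic gradient descent on labeled records that never leave the device. The model, hyperparameters, and data layout are all assumptions for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_on_device(records, epochs=200, lr=0.5):
    """Train a tiny binary classifier on labeled biometric records,
    entirely on the terminal. `records` is a list of
    (feature_vector, label) pairs with labels 0/1."""
    dim = len(records[0][0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in records:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of the log loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0
```

In the claimed scheme only the trained weights `(w, b)`, not the records, would ever be shared with the server.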
Compared with the prior art, the application has the following advantages:
In the data processing method provided by the application, after the target biometric data is obtained, a correspondence between the target biometric data and the target user information is first established in the target terminal; the target biometric data is then labeled in the target terminal according to that correspondence, yielding labeled data that serves as training data for a biometric data processing model. Because both the correspondence and the labeling are produced on the terminal itself, the user's biometric data is labeled for model training without the target user uploading any user data to a server or cloud, which eliminates the privacy leakage that such uploading would cause.
Drawings
Fig. 1 is a schematic view of an application scenario of a data processing method according to an embodiment of the present application.
Fig. 2 is a flowchart of a data processing method provided in the first embodiment of the present application.
Fig. 3 is a flowchart of a biometric data matching determination method provided in the first embodiment of the present application.
Fig. 4 is a schematic diagram of a data processing apparatus provided in a second embodiment of the present application.
Fig. 5 is a schematic diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The application can, however, be implemented in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from its spirit; the application is therefore not limited to the specific implementations disclosed below.
To present the data processing method provided by the application more clearly, an application scenario for it is introduced first.
The data processing method provided by the application is generally applied in scenarios where labeled data for training a biometric data processing model is generated on a target terminal. In practice, models that help artificial intelligence devices or network platforms perform biometric recognition or classification, such as facial recognition and speech recognition, are usually trained and updated on a server or in the cloud, which requires user terminals to upload the training data. Since training a biometric data processing model necessarily involves the subjects' biometric data, uploading that training data to a server or cloud leaks the corresponding privacy. Completing the training on the user terminal and uploading only the trained network model to the server or cloud addresses this problem. Training on the terminal, however, first requires labeling the biometric data used for training so as to obtain the labeled data for the biometric data processing model. In this application scenario, the labeled data corresponding to the biometric data is generally annotation data generated by labeling the attributes of the biometric data, indicating information such as the type of the data, its affiliation, and its capacity.
In this application scenario, biometric data includes physiological feature data of an object and behavioral feature data of the object. Physiological feature data generally includes facial, fingerprint, and iris data, while behavioral feature data generally includes voice feature data, morphological feature data, and the like. The biometric data processing model is a model for biometric recognition or processing; it may process physiological feature data alone, behavioral feature data alone, or both together, for example a face recognition model, a speech recognition model, or an audio classification model.
In this application scenario, the target terminal is an electronic device loaded with software or a program for transmitting and processing information or data so as to provide the corresponding functions, for example a personal handheld intelligent terminal, a smart speaker, a tablet computer, or a personal computer. Fig. 1 shows an application scenario of the data processing method; the scenario is implemented by a program for the data processing method installed in the target terminal, and in the scenario corresponding to Fig. 1 the target terminal is a smartphone.
First, a video acquisition module 101 in the program acquires video data including a target object through the capture device built into the smartphone, then separates the target audio data corresponding to the target object from the target image data corresponding to the target object in the video data, and provides the target audio data and target image data to a voice feature extraction module 102 and an image feature extraction module 103, respectively. One way to separate the target audio data is to obtain it with a first data separation model 101-1 from the audio data and image data in the video data: first, the audio data and image data are input into the first data separation model 101-1 to obtain audio-related motion feature data in the image data; second, the original audio data corresponding to the audio data is obtained; finally, the original audio data is analyzed according to the audio-related motion feature data to obtain the target audio data. Correspondingly, one way to separate the target image data is to obtain it with a second data separation model 101-2 from the image data in the video data: the image data and the target object are input into the second data separation model 101-2 to obtain the target image data.
In addition, when video data is acquired through the target terminal, a sensor for target sensing can be installed on the capture device provided on, or connected to, the target terminal. When this sensor detects preset target information, such as the face information of a target object or infrared information from an object, it can trigger the capture device to acquire video data automatically. For example, an infrared sensor can be added to the capture device on the target terminal so that, once it detects infrared information, the capture device automatically acquires video data; likewise, a millimeter-wave radar sensor can be added so that, once it detects millimeter waves in a preset frequency range, the capture device acquires video data.
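The sensor-triggered capture described above reduces to a simple predicate over sensor events; the event format and trigger names below are hypothetical:

```python
def should_start_capture(sensor_events, trigger_types=("infrared", "face")):
    """Decide whether the capture device should start recording,
    based on the sensor events seen so far.

    `sensor_events` is a list of dicts like {"type": "infrared"};
    both the field name and the trigger names are illustrative.
    """
    return any(e["type"] in trigger_types for e in sensor_events)
```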
Second, after receiving the target audio data from the video acquisition module 101, the voice feature extraction module 102 extracts the voice features of the target object, such as its voiceprint, audio frequency, and tone features, to obtain the voice feature data corresponding to the target audio data. Correspondingly, after receiving the target image data from the video acquisition module 101, the image feature extraction module 103 extracts the physiological features of the target object, such as its facial physiological features, to obtain the physiological feature data corresponding to the target image data.
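As a concrete, if deliberately simple, example of an audio behavioral feature, the zero-crossing rate of a waveform can be computed as below; real voiceprint features (e.g. learned embeddings) are far richer, and this function is only a sketch:

```python
def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs whose signs differ, a classical
    lightweight audio feature computable on the terminal itself."""
    if len(samples) < 2:
        return 0.0
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a >= 0) != (b >= 0))
    return crossings / (len(samples) - 1)
```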
It should be noted that, to protect users' private data, after the video data is collected, the portions of the video data corresponding to objects other than the target object must be blurred. That is, the video data that involves private user data and is unrelated to the target biometric data, for example video frames showing objects other than the target object, is blurred. Furthermore, after the target biometric data has been obtained, the portions of the video data corresponding to the target object must also be blurred, for example the video frames showing the target object. Blurring the privacy-related portions of the video data prevents private user data from leaking in other scenarios involving the video data, such as its transmission and display.
In an application scenario embodiment of the data processing method provided by the present application, a specific implementation manner of performing fuzzification processing on the video data related to user privacy data may be as follows: reducing the image quality of the corresponding video frames, for example: reducing the definition of the video frames, adding noise to the video frames, and the like; it may also be: applying mosaic masking to the video frames, or to the relevant parts of the video frames, that relate to user privacy.
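As an illustration of the quality-reduction idea above, the following is a minimal sketch, not the claimed implementation: it blurs a rectangular region of a frame by mean filtering, assuming a grayscale frame represented as a list of pixel rows; the function name `box_blur_region` and the representation are hypothetical.

```python
def box_blur_region(frame, top, left, height, width, k=3):
    """Blur a rectangular region of a grayscale frame (a list of rows of
    integer pixel values) by replacing each pixel in the region with the
    mean of its k x k neighborhood. A crude stand-in for the
    quality-reduction step described above; the original frame is kept
    intact and a blurred copy is returned."""
    blurred = [row[:] for row in frame]
    for r in range(top, min(top + height, len(frame))):
        for c in range(left, min(left + width, len(frame[0]))):
            vals = []
            for dr in range(-(k // 2), k // 2 + 1):
                for dc in range(-(k // 2), k // 2 + 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < len(frame) and 0 <= cc < len(frame[0]):
                        vals.append(frame[rr][cc])
            blurred[r][c] = sum(vals) // len(vals)
    return blurred
```

In practice a library routine (for example, a Gaussian blur from an image-processing library) would replace this loop; the point is only that the privacy-sensitive region is made unrecognizable while the rest of the frame is untouched.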
Thirdly, the correspondence establishing unit 104 in the program for the data processing method establishes a correspondence between the target biometric data and the target user information in the target terminal, and the specific process is as follows. First, historical biometric data in the target terminal is obtained. Second, the physiological characteristic data and the behavior characteristic data of the target object are compared with the physiological characteristic data and the behavior characteristic data of the object in the historical biometric data, respectively, to obtain a first similarity between the physiological characteristic data of the target object and the physiological characteristic data of the object, and a second similarity between the behavior characteristic data of the target object and the behavior characteristic data of the object. Third, the historical biometric data whose first similarity reaches a first similarity threshold or whose second similarity reaches a second similarity threshold is taken as the target historical biometric data. Fourth, the user information corresponding to the target historical biometric data is determined as the target user information. Fifth, the correspondence between the target biometric data and the target user information is established. Sixth, the target biometric data and the correspondence are supplied to the data labeling unit 105 in the program for the data processing method.
In an application scenario embodiment of the data processing method provided by the present application, the user information of the target user includes identification information of the user, and generally, the identification of the user is an ID (Identity Document) identification of the user.
Finally, after obtaining the target biometric data and the correspondence, the data labeling unit 105 in the program for the data processing method performs data labeling on the target biometric data corresponding to the target user information in the target terminal to obtain labeled data corresponding to the target biometric data. Specifically, the physiological characteristic data of the target object is merged with the sufficiently similar physiological characteristic data of the object in the target historical biometric data, and the behavior characteristic data of the target object is merged with the sufficiently similar behavior characteristic data of the object in the target historical biometric data, to obtain the target biometric data corresponding to the target user information; data labeling is then performed on the target biometric data corresponding to the target user information in the target terminal to obtain the labeled data corresponding to the target biometric data. The target biometric data corresponding to the target user information is target biometric data carrying the target user information; when the target user information is the ID identification information of the target user, the target biometric data corresponding to the target user information may be: the first target physiological characteristic data of user 1, the second target physiological characteristic data of user 1, the first target behavior characteristic data of user 1, the second target behavior characteristic data of user 1, and so on.
In addition, in an application scenario embodiment of the data processing method provided in the present application, the data labeling unit 105 in the program for the data processing method further provides labeled data corresponding to the target biometric data to the model training unit 106 in the program for the data processing method, so as to train the biometric data processing model. After receiving the labeled data corresponding to the target biometric data, the model training unit 106 performs model training on the target biometric data recognition model in the user terminal by using the labeled data corresponding to the target biometric data, obtains the trained target biometric data recognition model, and uploads the trained target biometric data recognition model to the server corresponding to the target terminal.
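To illustrate the on-terminal training step, the sketch below fits a toy linear scorer to labeled feature vectors so that only the trained weights, and not the raw biometric data, would leave the device. The model, the perceptron-style update rule, and the name `train_on_device` are illustrative assumptions, not the patent's actual recognition model.

```python
def train_on_device(model_weights, labeled_data, lr=0.1, epochs=5):
    """Fit a tiny linear scorer to labeled biometric vectors on the
    terminal. labeled_data is a list of (feature_vector, label) pairs
    with label in {0, 1}. Returns the trained weights, which are what
    would be uploaded to the server, matching the privacy goal above."""
    w = list(model_weights)
    for _ in range(epochs):
        for x, y in labeled_data:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            err = y - pred  # perceptron-style error signal
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w
```

A real deployment would use the neural-network feature models mentioned later in the document; the structural point is only that training consumes the locally labeled data and produces an uploadable model artifact.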
In the embodiment of the present application, an application scenario of the data processing method provided by the present application is not specifically limited, for example: the data processing method provided by the application can also be applied to other types of target terminals. The application scenario of the data processing method is only one embodiment of the application scenario of the data processing method provided in the present application, and the application scenario embodiment is provided to facilitate understanding of the data processing method provided in the present application, and is not intended to limit the data processing method provided in the present application. In the embodiment of the present application, no further description is given to other application scenarios of the provided data processing method.
First embodiment
A first embodiment of the present application provides a data processing method, which is described below with reference to fig. 2 to 3.
Please refer to fig. 2, which is a flowchart illustrating a data processing method according to a first embodiment of the present application.
In step S201, target biometric data is obtained.
In the first embodiment of the present application, the target biometric data is biometric data of a target object; the target object includes, but is not limited to, a target human object. The target biometric data may include only physiological characteristic data of the target object, only behavior characteristic data of the target object, or both physiological characteristic data and behavior characteristic data of the target object. The physiological characteristic data of the target object includes: facial feature data of the target object, iris feature data of the target object, fingerprint feature data of the target object, palm print feature data of the target object, and the like; the behavior characteristic data of the target object includes: speech characteristic data of the target object. The data processing method provided in the first embodiment of the present application is described in detail below, taking as an example that the target biometric data includes both physiological characteristic data and behavior characteristic data of the target object.
In practical applications, the video data including the target object is generally obtained by a video acquisition device installed on the target terminal. The video data is generally a video clip, but may also be an image (for example, a single video frame) or audio data (generally an audio clip). When the video data is a video clip, in the process of obtaining the target biometric data, the video data needs to be separated first to obtain the image data in the video data and the audio data corresponding to the target object, that is, the video clip is separated into an image set and a target audio clip corresponding to the target object. Physiological characteristic data of the target object and behavior characteristic data of the target object are then extracted from the image set and the target audio clip, respectively, so as to obtain the target biometric data corresponding to the target object. That is, the target object in the video data is determined, and biological features of the target object are extracted to obtain the target biometric data corresponding to the target object.
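The separation step above can be sketched as splitting one clip record into its two modalities; the `VideoClip` container and field names are hypothetical stand-ins for whatever media representation the terminal actually uses.

```python
from dataclasses import dataclass

@dataclass
class VideoClip:
    frames: list         # the image set (video frames)
    audio_samples: list  # the audio clip, as raw samples

def separate(clip):
    """Separate a clip into its image set and audio segment, so that
    physiological features (from frames) and behavior features (from
    audio) can each be extracted in a single modality."""
    return list(clip.frames), list(clip.audio_samples)
```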
It should be noted that, before the video clip is separated into the image set and the target audio clip, it is also necessary to determine whether the audio clip in the video clip is the target audio clip corresponding to the target object; if so, the operation of extracting the behavior feature data of the target object can then be performed. A case in which an audio clip in a video clip is not the target audio clip corresponding to the target object is, for example: the audio clip is a voice clip corresponding to an object other than the target object in the video.
The specific implementation manner of determining whether the audio clip in the video clip is the target audio clip corresponding to the target object is as follows:
first, audio-related motion feature data in a video segment is obtained.
In the first embodiment of the present application, the audio-related motion feature data in the video segment is lip motion feature data of a target object in the video segment, and the lip motion feature data of the target object may be one or more of lip vibration frequency data, lip vibration phase data, and lip vibration amplitude data of the target object. The lip vibration frequency data is used for representing the speaking speed of the sounding subject, the lip vibration phase data is used for representing the time points at which the lips of the sounding subject open and close, and the lip vibration amplitude data is used for representing the amplitude of the opening and closing of the lips of the sounding subject.
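The three lip-motion quantities above can be sketched from a per-frame lip-opening measurement series; the heuristics below (mean-crossing onsets as phase, cycle count over duration as frequency, opening range as amplitude) are illustrative assumptions, not the patent's actual extraction method.

```python
def lip_motion_features(openings, fps):
    """Derive coarse lip-motion statistics from a per-frame lip-opening
    series: vibration frequency (opening onsets per second), the time
    points of the onsets (phase), and the opening amplitude."""
    mean = sum(openings) / len(openings)
    # Phase: frames where the opening crosses its mean upward.
    onsets = [i / fps for i in range(1, len(openings))
              if openings[i - 1] < mean <= openings[i]]
    freq = len(onsets) * fps / len(openings)   # onsets per second
    amplitude = max(openings) - min(openings)  # opening range
    return {"frequency_hz": freq, "onset_times": onsets,
            "amplitude": amplitude}
```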
Secondly, an original audio clip corresponding to the video clip is obtained.
After the audio-related motion characteristic data in the video segment is obtained in the above step, this step is used to obtain an original audio segment corresponding to the above video segment, that is, an original audio segment corresponding to the above video segment in the recording time is obtained, and the target object corresponding to the original audio segment may appear in the above video segment in whole or in part.
And thirdly, analyzing the original audio segment according to the audio-related motion characteristic data to obtain the target audio segment.
After the audio-related motion characteristic data in the video segment is obtained and the original audio segment corresponding to the video segment is obtained in the above steps, the step is configured to analyze and obtain a target audio segment from the original audio segment according to the above audio-related motion characteristic data, that is, perform voice separation on the original audio segment according to the above obtained information, and obtain a target audio segment corresponding to a target object in the video segment.
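A very rough sketch of the lip-guided separation step is gating the original audio by lip activity: samples falling in frames where the target object's lips are not moving are suppressed. Real voice separation would operate in the spectral domain; this function and its names are hypothetical simplifications.

```python
def gate_audio_by_lip_activity(samples, lip_active, samples_per_frame):
    """Keep only the audio samples that fall inside video frames where
    the target object's lips are moving; zero out the rest. A crude
    stand-in for lip-motion-guided voice separation: lip_active[i] says
    whether the lips move in video frame i."""
    out = []
    for i, s in enumerate(samples):
        frame = min(i // samples_per_frame, len(lip_active) - 1)
        out.append(s if lip_active[frame] else 0)
    return out
```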
The reason for separating the video data is that the target biometric data includes both physiological characteristic data and behavior characteristic data of the target object; after the video data is separated into the image set and the target audio clip, feature extraction can be performed more simply and conveniently for each. Extracting the physiological characteristic data and the behavior characteristic data of the target object directly from the video data, compared with separating the video data first and then performing feature extraction on the image set and the target audio clip respectively, increases the data processing dimensionality. Specifically, when features are extracted from the image set and the target audio clip respectively, the data processing dimensionality of each extraction is one, namely the image dimension or the audio dimension alone. When features are extracted directly from the video data, the data processing dimensionality is two, namely the image dimension and the audio dimension together, and the complexity of processing data of dimensionality two is generally far higher than that of separately processing data of dimensionality one.
In addition, even if the physiological characteristic data and the behavior characteristic data of the target object were extracted directly from the video data, it would still be necessary to distinguish the image information from the audio information first, and then perform the extraction on the image set and the target audio clip respectively.
In order to protect the user privacy of the target object and of other objects, in the first embodiment of the present application, after the video data is obtained, it is necessary to perform blurring processing on the video data corresponding to objects other than the target object in the video data. In addition, after the target biometric data is obtained, it is necessary to perform blurring processing on the video data corresponding to the target object in the video data.
In step S202, a correspondence between the target biometric data and the target user information is established in the target terminal.
In the first embodiment of the present application, the target terminal is an electronic device loaded with software or a program for information or data transmission and information or data processing to implement corresponding functions, for example: a personal handheld intelligent terminal device, a smart speaker, a tablet computer, a personal computer, and the like.
When the corresponding relationship between the target biometric data and the target user information is established in the target terminal, it is necessary to first determine whether target historical biometric data matched with the target biometric data exists in the target terminal, and then respectively implement the following two different corresponding relationship establishment methods according to the difference of the determination results:
the first method is that when target historical biometric data matched with the target biometric data exists in the target terminal, user information corresponding to the target historical biometric data is determined as target user information, and a corresponding relation is established.
The second method is that if there is no target historical biometric data matching the target biometric data in the target terminal, user information for the target biometric data is created in the target terminal as target user information, and a correspondence relationship is established.
By implementing two different correspondence construction modes according to the different determination results, the correspondence can be established regardless of whether historical biometric data matched with the target biometric data exists in the target terminal.
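The two construction modes above reduce to a match-or-create branch over the terminal's local registry; the sketch below assumes a plain dict keyed by user id and a caller-supplied `match_fn`, both hypothetical names.

```python
def establish_correspondence(target_features, registry, match_fn):
    """Match the target biometric data against the terminal's historical
    records (registry maps user_id -> stored features). If a match
    exists, reuse that user's id as the target user information;
    otherwise create a new user entry in the registry. Returns the
    user id; the registry is updated in place on creation."""
    for user_id, stored in registry.items():
        if match_fn(target_features, stored):
            return user_id  # mode 1: matched historical data
    new_id = f"user_{len(registry) + 1}"  # mode 2: create new user info
    registry[new_id] = target_features
    return new_id
```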
In the first embodiment of the present application, please refer to fig. 3, which is a flowchart of a method for determining biometric data matching in the first embodiment of the present application; the method for determining whether target historical biometric data matched with the target biometric data exists in the target terminal is described below.
Step S301: historical biometric data in the target terminal is obtained.
In the first embodiment of the present application, the historical biometric data is biometric data already existing in the target terminal, and is typically object feature data for which label data has been determined. The historical biometric data may include not only historical biometric data corresponding to the target object, but also historical biometric data corresponding to the non-target object.
Step S302: and comparing the target biological characteristic data with the historical biological characteristic data to obtain the similarity between the target biological characteristic data and the historical biological characteristic data.
In the first embodiment of the present application, the specific operation of obtaining the similarity between the target biometric data and the historical biometric data is as follows: comparing the physiological characteristic data of the target object and the behavior characteristic data of the target object in the target biological characteristic data with the physiological characteristic data of the object and the behavior characteristic data of the object in the historical biological characteristic data respectively to obtain a first similarity between the physiological characteristic data of the target object and the physiological characteristic data of the object, and a second similarity between the behavior characteristic data of the target object and the behavior characteristic data of the object.
Step S303: and taking the historical biological characteristic data with the similarity reaching the similarity threshold as target historical biological characteristic data.
In the first embodiment of the present application, taking historical biometric data whose similarity reaches a similarity threshold as target historical biometric data includes: taking the historical biometric data whose first similarity reaches a first similarity threshold or whose second similarity reaches a second similarity threshold as the target historical biometric data. That is, if the similarity between the physiological characteristic data of the object in the historical biometric data and the physiological characteristic data of the target object, or the similarity between the behavior characteristic data of the object and the behavior characteristic data of the target object, reaches the corresponding preset similarity threshold, the historical biometric data is taken as the target historical biometric data.
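Steps S301 to S303 can be sketched as an OR-threshold filter over the history. Cosine similarity and the thresholds of 0.9 are illustrative assumptions; the patent does not fix a similarity measure.

```python
def cosine(a, b):
    """Cosine similarity between two non-zero feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def find_matches(target_phys, target_behav, history, t1=0.9, t2=0.9):
    """Return historical records whose physiological similarity reaches
    t1 OR whose behavior similarity reaches t2, per step S303. Each
    history record holds 'phys' and 'behav' feature vectors."""
    matches = []
    for record in history:
        s1 = cosine(target_phys, record["phys"])   # first similarity
        s2 = cosine(target_behav, record["behav"])  # second similarity
        if s1 >= t1 or s2 >= t2:
            matches.append(record)
    return matches
```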
In step S203, labeling the target biometric data in the target terminal according to the correspondence, obtaining labeled data corresponding to the target biometric data, and using the labeled data corresponding to the target biometric data as data for training a biometric data processing model, where the biometric data processing model is a network model used by the server to process the biometric data.
In the first embodiment of the present application, the labeled data corresponding to biometric data is generally generated by labeling the attributes corresponding to the biometric data, and is used to indicate information such as the type to which the biometric data belongs, the affiliation of the data, and the volume of the data, for example: the labeling data of the target biometric data includes: the target biometric data is biometric data belonging to the object whose ID is 1; and the target biometric data includes the first physiological characteristic data, the second physiological characteristic data, the first behavior characteristic data, the second behavior characteristic data, and so on.
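The labeling format described above can be sketched as simple records tagging each feature vector with its owning user ID, its kind, and its ordinal within that kind; the record schema is a hypothetical illustration, not the patent's storage format.

```python
def label_features(user_id, phys_list, behav_list):
    """Build annotation records of the form sketched above: each
    feature vector is tagged with the owning user's ID, the feature
    kind, and an ordinal within that kind (first, second, ...)."""
    labels = []
    for kind, feats in (("physiological", phys_list),
                        ("behavior", behav_list)):
        for i, f in enumerate(feats, start=1):
            labels.append({"user_id": user_id, "kind": kind,
                           "ordinal": i, "data": f})
    return labels
```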
In practical applications, since the labeled data corresponding to the target biometric data is obtained by performing data labeling according to the correspondence between the target biometric data and the target user information established in the target terminal, in the first embodiment of the present application, when obtaining the labeled data corresponding to the target biometric data, it is necessary to first obtain the target biometric data corresponding to the target user information according to the correspondence, and then perform data labeling on the target biometric data corresponding to the target user information in the target terminal, so as to obtain the labeled data corresponding to the target biometric data. In practical applications, matched target historical biometric data may exist in the target terminal, that is, the physiological characteristic data and the behavior characteristic data in the target biometric data may partially or completely coincide with the physiological characteristic data and the behavior characteristic data of the target historical biometric data. By fusing the target biometric data with the target historical biometric data, not only can duplication of the biometric data corresponding to the target user information be avoided, but also the labeled data already attached to the coinciding physiological characteristic data and behavior characteristic data of the target historical biometric data can be directly reused as the labeled data of the coinciding portion of the target biometric data, thereby reducing the workload of performing data labeling on the physiological characteristic data and the behavior characteristic data in the target biometric data.
When the target biological characteristic data only comprises physiological characteristic data of a target object, fusing the target biological characteristic data and target historical biological characteristic data in the target terminal according to the corresponding relation, wherein the fusing comprises the following steps: and acquiring a third similarity of the physiological characteristic data of the target object in the target biological characteristic data and the physiological characteristic data of the object in the target historical biological characteristic data, and merging the physiological characteristic data of the target object with the third similarity larger than a third similarity threshold value and the physiological characteristic data of the object.
When the target biological characteristic data only comprises behavior characteristic data of the target object, fusing the target biological characteristic data and target historical biological characteristic data in the target terminal according to the corresponding relation, wherein the fusing comprises the following steps: and acquiring fourth similarity of the behavior characteristic data of the target object in the target biological characteristic data and the behavior characteristic data of the object in the target historical biological characteristic data, and merging the behavior characteristic data of the target object with the fourth similarity larger than a fourth similarity threshold value and the behavior characteristic data of the object.
When the target biometric data includes both the physiological characteristic data and the behavior characteristic data of the target object, it is necessary to merge the physiological characteristic data of the target object with the physiological characteristic data of the object whose third similarity is greater than the third similarity threshold, and to merge the behavior characteristic data of the target object with the behavior characteristic data of the object whose fourth similarity is greater than the fourth similarity threshold.
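The fusion described in the three cases above can be sketched as similarity-based deduplication: target features that nearly coincide with a historical feature are dropped (the historical copy, with its existing labels, is kept), and the rest are appended. The dict layout and `sim_fn` are hypothetical.

```python
def fuse(target, history, sim_fn, t3=0.9, t4=0.9):
    """Merge target biometric data into matched historical data.
    Physiological pairs whose similarity exceeds t3 and behavior pairs
    whose similarity exceeds t4 are deduplicated, keeping the
    historical copy so its existing labels can be reused; other target
    features are appended as new entries."""
    fused = {"phys": list(history["phys"]),
             "behav": list(history["behav"])}
    for f in target.get("phys", []):
        if not any(sim_fn(f, h) > t3 for h in history["phys"]):
            fused["phys"].append(f)
    for f in target.get("behav", []):
        if not any(sim_fn(f, h) > t4 for h in history["behav"]):
            fused["behav"].append(f)
    return fused
```

When the target data carries only one kind of feature, the corresponding list is simply empty and the other branch is a no-op, which matches the "only physiological" and "only behavior" cases above.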
In the first embodiment of the present application, the extraction of the physiological characteristic data of the target object and the extraction of the behavior characteristic data of the target object are generally realized based on existing models such as an image feature extraction model and a voice feature extraction model. These models may be based on a convolutional neural network, a deep neural network, or a generative adversarial network; the neural network on which the image feature extraction model, the voice feature extraction model, and the like are based is not particularly limited.
In the data processing method provided in the first embodiment of the present application, after obtaining target biometric data, first, a corresponding relationship between the target biometric data and target user information is established in a target terminal, and then, according to the corresponding relationship, the target biometric data is labeled in the target terminal to obtain labeled data corresponding to the target biometric data, and the labeled data corresponding to the target biometric data is used as data for training a biometric data processing model. The data processing method provided in the first embodiment of the application can mark the target biological characteristic data in the target terminal after the corresponding relationship between the target biological characteristic data and the target user information is established in the target terminal, and obtain the marked data corresponding to the target biological characteristic data, so that the data marking of the user biological characteristic data is realized under the condition that the target user does not upload the user data to the server or the cloud, and the data processing method is used for training a biological characteristic data processing model, and further solves the privacy disclosure problem caused by uploading the user biological characteristic data to the server or the cloud.
Second embodiment
Corresponding to the application scenario embodiment of the data processing method and the data processing method provided by the first embodiment, a second embodiment of the present application provides a data processing apparatus. Since the embodiment of the apparatus is basically similar to the embodiment of the application scenario and the first embodiment, the description is relatively simple, and for the relevant points, reference may be made to the embodiment of the application scenario and the partial description of the first embodiment. The device embodiments described below are merely illustrative.
Please refer to fig. 4, which is a diagram illustrating a data processing apparatus according to a second embodiment of the present application.
The data processing apparatus includes:
a biometric data obtaining unit 401 for obtaining target biometric data;
a correspondence relationship establishing unit 402, configured to establish a correspondence relationship between the target biometric data and target user information;
a data labeling unit 403, configured to label the target biometric data according to the correspondence, obtain labeled data corresponding to the target biometric data, and use the labeled data corresponding to the target biometric data as data used for training a biometric data processing model, where the biometric data processing model is a network model used by a server to process biometric data.
Optionally, the correspondence establishing unit 402 is specifically configured to determine whether target historical biometric data matched with the target biometric data exists in the target terminal; if yes, determining the user information corresponding to the target historical biological characteristic data as the target user information, and establishing the corresponding relation.
Optionally, the method further includes: and if the target historical biological characteristic data matched with the target biological characteristic data does not exist in the target terminal, creating user information aiming at the target biological characteristic data in the target terminal as the target user information, and establishing the corresponding relation.
Optionally, the determining whether the target terminal has target historical biometric data matched with the target biometric data includes:
obtaining historical biological characteristic data in the target terminal;
comparing the target biological characteristic data with the historical biological characteristic data to obtain the similarity between the target biological characteristic data and the historical biological characteristic data;
and taking the historical biological characteristic data with the similarity reaching a similarity threshold as the target historical biological characteristic data.
Optionally, the comparing the target biometric data with the historical biometric data to obtain the similarity between the target biometric data and the historical biometric data includes: comparing the physiological characteristic data of the target object and the behavior characteristic data of the target object in the target biological characteristic data with the physiological characteristic data of the object and the behavior characteristic data of the object in the historical biological characteristic data respectively to obtain a first similarity between the physiological characteristic data of the target object and the physiological characteristic data of the object, and a second similarity between the behavior characteristic data of the target object and the behavior characteristic data of the object;
the taking the historical biological feature data with the similarity reaching a similarity threshold as the target historical biological feature data includes: and taking the historical biological feature data of which the first similarity reaches a first similarity threshold or the second similarity reaches a second similarity threshold as the target historical biological feature data.
Optionally, the data labeling unit 403 is specifically configured to fuse, in the target terminal, the target biometric data and the target historical biometric data according to the correspondence, so as to obtain the target biometric data corresponding to the target user information; and performing data annotation on the target biological characteristic data corresponding to the target user information in the target terminal to obtain annotation data corresponding to the target biological characteristic data.
Optionally, the fusing, according to the corresponding relationship, the target biometric data and the target historical biometric data in the target terminal includes: and acquiring a third similarity of the physiological characteristic data of the target object in the target biological characteristic data and the physiological characteristic data of the object in the target historical biological characteristic data, and merging the physiological characteristic data of the target object and the physiological characteristic data of the object, wherein the third similarity is greater than a third similarity threshold value.
Optionally, the fusing, according to the corresponding relationship, the target biometric data and the target historical biometric data in the target terminal includes: and acquiring fourth similarity of the behavior characteristic data of the target object in the target biological characteristic data and the behavior characteristic data of the object in the target historical biological characteristic data, and merging the behavior characteristic data of the target object and the behavior characteristic data of the object, wherein the fourth similarity is greater than a fourth similarity threshold value.
Optionally, the physiological characteristic data of the target object includes facial physiological characteristic data of the target object.
Optionally, the behavior feature data of the target object includes voice behavior feature data of the target object.
Optionally, the biometric data obtaining unit is configured to obtain video data including a target object; and determining a target object in the video data, and extracting biological features of the target object to obtain target biological feature data corresponding to the target object.
Optionally, the obtaining video data including the target object includes: and acquiring the video data through the target terminal.
Optionally, the data processing device provided in the second embodiment of the present application further includes: a video data processing unit, configured to blur video data corresponding to objects other than the target object in the video data after the video data is obtained, and to blur video data corresponding to the target object in the video data after the target biological characteristic data is obtained.
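The two-stage blurring policy above (blur non-target regions once the video is obtained, then blur the target's region once its features have been extracted) might be sketched as follows on a toy greyscale frame. Real systems would use per-frame object detection and a proper blur such as a Gaussian filter; all helper names here are illustrative.

```python
def blur_region(frame, box):
    """Replace a rectangular region with its mean grey value (a crude blur)."""
    r0, r1, c0, c1 = box
    total, count = 0, 0
    for r in range(r0, r1):
        for c in range(c0, c1):
            total += frame[r][c]
            count += 1
    mean = total / count if count else 0
    out = [row[:] for row in frame]
    for r in range(r0, r1):
        for c in range(c0, c1):
            out[r][c] = mean
    return out

def blur_non_target(frame, target_box):
    """Stage 1: blur everything, then restore the target's pixels."""
    h, w = len(frame), len(frame[0])
    out = blur_region(frame, (0, h, 0, w))
    r0, r1, c0, c1 = target_box
    for r in range(r0, r1):
        for c in range(c0, c1):
            out[r][c] = frame[r][c]
    return out

# Stage 2, after feature extraction, is simply blur_region(frame, target_box),
# so that no identifiable pixels of any object remain in the stored video.
```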
Third embodiment
Corresponding to the application scenario embodiment of the data processing method provided by the present application and the data processing method provided by the first embodiment, a third embodiment of the present application further provides an electronic device. Since the third embodiment is basically similar to the application scenario embodiment and the first embodiment, its description is relatively brief; for relevant points, reference may be made to the corresponding descriptions of the application scenario embodiment and the first embodiment. The third embodiment described below is merely illustrative.
As shown in fig. 5, fig. 5 is a schematic view of an electronic device provided in an embodiment of the present application.
The electronic device includes:
a processor 501; and
a memory 502 for storing a program of the data processing method, wherein, after the device is powered on and the processor 501 runs the program of the data processing method, the device performs the following steps:
obtaining target biological characteristic data;
establishing a corresponding relation between the target biological characteristic data and target user information in a target terminal;
and labeling the target biological characteristic data in the target terminal according to the corresponding relation to obtain labeled data corresponding to the target biological characteristic data, and taking the labeled data corresponding to the target biological characteristic data as data for training a biological characteristic data processing model, wherein the biological characteristic data processing model is a network model used for a server to process biological characteristic data.
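The three program steps above — obtain the data, establish the correspondence, label the data — can be sketched as one on-terminal flow. `TERMINAL_DB`, `match_user`, and the user-ID scheme below are illustrative assumptions only, not part of this application.

```python
TERMINAL_DB = {}  # user_id -> list of labeled biometric records (assumed store)

def process(target_features, match_user):
    """match_user(features) returns an existing user_id or None."""
    # Step 1: the target biometric data is assumed already obtained.
    # Step 2: establish the correspondence with target user information;
    # create new user information when no matching history exists.
    user_id = match_user(target_features)
    if user_id is None:
        user_id = f"user_{len(TERMINAL_DB) + 1}"
    # Step 3: label the data on the terminal; the label ties the features
    # to the user so they can later train the server-side processing model.
    record = {"user": user_id, "features": target_features, "label": user_id}
    TERMINAL_DB.setdefault(user_id, []).append(record)
    return record
```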
It should be noted that, for the detailed description of the electronic device provided in the third embodiment of the present application, reference may be made to the application scenario embodiment of the data processing method provided in the present application and the related description of the data processing method provided in the first embodiment, which are not described herein again.
Fourth embodiment
Corresponding to the application scenario embodiment of the data processing method provided by the present application and the data processing method provided by the first embodiment, a fourth embodiment of the present application further provides a storage medium. Since the fourth embodiment is basically similar to the application scenario embodiment and the first embodiment, its description is relatively brief; for relevant points, reference may be made to the corresponding descriptions of the application scenario embodiment and the first embodiment. The fourth embodiment described below is merely illustrative.
A storage medium provided in a fourth embodiment stores a program for a data processing method, the program being executed by a processor to perform the steps of:
obtaining target biological characteristic data;
establishing a corresponding relation between the target biological characteristic data and target user information in a target terminal;
and labeling the target biological characteristic data in the target terminal according to the corresponding relation to obtain labeled data corresponding to the target biological characteristic data, and taking the labeled data corresponding to the target biological characteristic data as data for training a biological characteristic data processing model, wherein the biological characteristic data processing model is a network model used for a server to process biological characteristic data.
It should be noted that, for the detailed description of the storage medium provided in the fourth embodiment of the present application, reference may be made to the application scenario embodiment of the data processing method provided in the present application and the related description of the data processing method provided in the first embodiment, which are not described herein again.
Fifth embodiment
Corresponding to the application scenario embodiment of the data processing method provided in the present application and the data processing method provided in the first embodiment, a fifth embodiment of the present application provides another data processing method. Since the data processing method of the fifth embodiment is basically similar to the application scenario embodiment and the first embodiment, its description is relatively brief; for relevant points, reference may be made to the corresponding descriptions of the application scenario embodiment and the first embodiment. The fifth embodiment described below is merely illustrative.
The data processing method provided in the fifth embodiment of the present application includes the following steps:
first, target biometric data is obtained.
In the fifth embodiment of the present application, the target biometric data obtained is biometric data of a target object, where the target object includes, but is not limited to, a human object. The target biometric data includes physiological characteristic data of the target object and/or behavior characteristic data of the target object: it may include only the physiological characteristic data, only the behavior characteristic data, or both. The physiological characteristic data of the target object includes facial characteristic data, iris characteristic data, fingerprint characteristic data, palm print characteristic data, and the like of the target object; the behavior characteristic data of the target object includes voice characteristic data of the target object.
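One way to model this composition — physiological features, behavior features, or both — is a record with optional fields, as in the illustrative sketch below; the field names are assumptions, not terms from this application.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class BiometricData:
    # Physiological characteristic data (any subset may be present).
    face: Optional[List[float]] = None
    iris: Optional[List[float]] = None
    fingerprint: Optional[List[float]] = None
    palm_print: Optional[List[float]] = None
    # Behavior characteristic data.
    voice: Optional[List[float]] = None

    def has_physiological(self) -> bool:
        return any(v is not None for v in
                   (self.face, self.iris, self.fingerprint, self.palm_print))

    def has_behavioral(self) -> bool:
        return self.voice is not None
```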
And secondly, establishing a corresponding relation between the target biological characteristic data and the target user information in the target terminal.
In the fifth embodiment of the present application, the target terminal is an electronic device loaded with software or programs for information or data transmission and processing, so as to implement the corresponding functions, for example: a personal handheld smart terminal device, a smart speaker, a tablet computer, a personal computer, and the like.
Then, the target biological characteristic data is labeled in the target terminal according to the corresponding relation to obtain labeled data corresponding to the target biological characteristic data, and the labeled data corresponding to the target biological characteristic data is used as data for training a biological characteristic data identification model, where the biological characteristic data identification model is a network model used by the server to process biological characteristic data.
In the fifth embodiment of the present application, the historical biometric data is biometric data already existing in the target terminal, and is typically object feature data for which label data has already been determined. The historical biometric data may include not only historical biometric data corresponding to the target object, but also historical biometric data corresponding to the non-target object.
And finally, obtaining a target biological characteristic data identification model to be trained, and performing model training on the target biological characteristic data identification model in the user terminal by using the labeled data corresponding to the target biological characteristic data.
In the fifth embodiment of the present application, the labeled data corresponding to the biometric data is generally label data generated by performing data labeling on attributes of the biometric data, and is used to indicate information such as the type to which the biometric data belongs, the affiliation of the data, and the volume of the data. For example, the label data of the target biological characteristic data may include: an indication that the target biological characteristic data is biometric data belonging to the object with ID 1, and an indication that the target biological characteristic data includes fifth physiological characteristic data, second physiological characteristic data, fifth behavior characteristic data, second behavior characteristic data, and the like.
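The final training step can be illustrated with a deliberately simple stand-in model: labeled feature/ID pairs train a nearest-centroid classifier on the terminal. The application requires a network model; the centroid classifier below is only a minimal sketch of the train-on-labeled-data idea, and all names are assumptions.

```python
class CentroidModel:
    """Toy identification model: one mean feature vector per labeled identity."""

    def __init__(self):
        self.centroids = {}  # label -> mean feature vector

    def train(self, labeled_data):
        # labeled_data: iterable of (feature_vector, label) pairs,
        # i.e. the annotation produced on the terminal.
        sums, counts = {}, {}
        for features, label in labeled_data:
            acc = sums.setdefault(label, [0.0] * len(features))
            for i, x in enumerate(features):
                acc[i] += x
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {l: [x / counts[l] for x in acc]
                          for l, acc in sums.items()}

    def predict(self, features):
        # Return the label whose centroid is nearest (squared distance).
        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(features, c))
        return min(self.centroids, key=lambda l: dist(self.centroids[l]))
```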
Although the present invention has been described with reference to the preferred embodiments, it should be understood that the scope of the present invention is not limited to the embodiments described above, and that various changes and modifications may be made by one skilled in the art without departing from the spirit and scope of the present invention.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (Flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Claims (17)
1. A data processing method, comprising:
obtaining target biological characteristic data;
establishing a corresponding relation between the target biological characteristic data and target user information in a target terminal;
and labeling the target biological characteristic data in the target terminal according to the corresponding relation to obtain labeled data corresponding to the target biological characteristic data, and taking the labeled data corresponding to the target biological characteristic data as data for training a biological characteristic data processing model, wherein the biological characteristic data processing model is a network model used for a server to process biological characteristic data.
2. The data processing method according to claim 1, wherein the establishing of the correspondence between the target biometric data and the target user information in the target terminal comprises:
judging whether target historical biological characteristic data matched with the target biological characteristic data exists in the target terminal;
if yes, determining the user information corresponding to the target historical biological characteristic data as the target user information, and establishing the corresponding relation.
3. The data processing method of claim 2, further comprising: and if the target historical biological characteristic data matched with the target biological characteristic data does not exist in the target terminal, creating user information aiming at the target biological characteristic data in the target terminal as the target user information, and establishing the corresponding relation.
4. The data processing method according to claim 3, wherein the determining whether there is target historical biometric data matching the target biometric data in the target terminal comprises:
obtaining historical biological characteristic data in the target terminal;
comparing the target biological characteristic data with the historical biological characteristic data to obtain the similarity between the target biological characteristic data and the historical biological characteristic data;
and taking the historical biological characteristic data with the similarity reaching a similarity threshold as the target historical biological characteristic data.
5. The data processing method according to claim 4, wherein the comparing the target biometric data with the historical biometric data to obtain the similarity between the target biometric data and the historical biometric data comprises: comparing the physiological characteristic data of the target object and the behavior characteristic data of the target object in the target biological characteristic data with the physiological characteristic data of the object and the behavior characteristic data of the object in the historical biological characteristic data respectively to obtain a first similarity between the physiological characteristic data of the target object and the physiological characteristic data of the object, and a second similarity between the behavior characteristic data of the target object and the behavior characteristic data of the object;
the taking the historical biological feature data with the similarity reaching a similarity threshold as the target historical biological feature data includes: and taking the historical biological feature data of which the first similarity reaches a first similarity threshold or the second similarity reaches a second similarity threshold as the target historical biological feature data.
6. The data processing method according to claim 2, wherein the labeling the target biometric data in the target terminal according to the correspondence to obtain labeled data corresponding to the target biometric data includes:
according to the corresponding relation, fusing the target biological characteristic data with the target historical biological characteristic data in the target terminal to obtain the target biological characteristic data corresponding to the target user information;
and performing data annotation on the target biological characteristic data corresponding to the target user information in the target terminal to obtain annotation data corresponding to the target biological characteristic data.
7. The data processing method according to claim 6, wherein the fusing the target biometric data with the target historical biometric data in the target terminal according to the correspondence comprises: and acquiring a third similarity of the physiological characteristic data of the target object in the target biological characteristic data and the physiological characteristic data of the object in the target historical biological characteristic data, and merging the physiological characteristic data of the target object and the physiological characteristic data of the object, wherein the third similarity is greater than a third similarity threshold value.
8. The data processing method according to claim 6, wherein the fusing the target biometric data with the target historical biometric data in the target terminal according to the correspondence comprises: and acquiring fourth similarity of the behavior characteristic data of the target object in the target biological characteristic data and the behavior characteristic data of the object in the target historical biological characteristic data, and merging the behavior characteristic data of the target object and the behavior characteristic data of the object, wherein the fourth similarity is greater than a fourth similarity threshold value.
9. The data processing method according to claim 5 or 7, wherein the physiological characteristic data of the target object includes facial physiological characteristic data of the target object.
10. The data processing method according to claim 5 or 8, wherein the behavioral characteristic data of the target object comprises voice behavioral characteristic data of the target object.
11. The data processing method of claim 1, wherein the obtaining target biometric data comprises:
obtaining video data including a target object;
and determining a target object in the video data, and extracting biological features of the target object to obtain target biological feature data corresponding to the target object.
12. The data processing method of claim 11, wherein the obtaining video data including a target object comprises: and acquiring the video data through the target terminal.
13. The data processing method according to claim 11 or 12, further comprising: after the video data are obtained, fuzzifying the video data corresponding to other objects except the target object in the video data; and after the target biological characteristic data is obtained, performing fuzzification processing on video data corresponding to the target object in the video data.
14. A data processing device, applied to a target terminal, comprising:
a biometric data obtaining unit for obtaining target biometric data;
a corresponding relation establishing unit, configured to establish a corresponding relation between the target biometric data and target user information;
and the data labeling unit is used for labeling the target biological characteristic data according to the corresponding relation to obtain labeled data corresponding to the target biological characteristic data, and using the labeled data corresponding to the target biological characteristic data as data for training a biological characteristic data processing model, wherein the biological characteristic data processing model is a network model used for a server to process biological characteristic data.
15. An electronic device, comprising:
a processor; and
a memory for storing a program of a data processing method, wherein, after the device is powered on and the processor runs the program of the data processing method, the device performs the following steps:
obtaining target biological characteristic data;
establishing a corresponding relation between the target biological characteristic data and target user information in a target terminal;
and labeling the target biological characteristic data in the target terminal according to the corresponding relation to obtain labeled data corresponding to the target biological characteristic data, and taking the labeled data corresponding to the target biological characteristic data as data for training a biological characteristic data processing model, wherein the biological characteristic data processing model is a network model used for a server to process biological characteristic data.
16. A storage medium storing a program for a data processing method, the program being executed by a processor to perform the steps of:
obtaining target biological characteristic data;
establishing a corresponding relation between the target biological characteristic data and target user information in a target terminal;
and labeling the target biological characteristic data in the target terminal according to the corresponding relation to obtain labeled data corresponding to the target biological characteristic data, and taking the labeled data corresponding to the target biological characteristic data as data for training a biological characteristic data processing model, wherein the biological characteristic data processing model is a network model used for a server to process biological characteristic data.
17. A data processing method, comprising:
obtaining target biological characteristic data;
establishing a corresponding relation between the target biological characteristic data and target user information in a target terminal;
according to the corresponding relation, labeling the target biological characteristic data in the target terminal to obtain labeled data corresponding to the target biological characteristic data, and taking the labeled data corresponding to the target biological characteristic data as data for training a biological characteristic data identification model, wherein the biological characteristic data identification model is a network model used for a server to process biological characteristic data;
and obtaining a target biological characteristic data identification model to be trained, and performing model training on the target biological characteristic data identification model in the user terminal by using the marking data corresponding to the target biological characteristic data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010698030.XA CN114037880A (en) | 2020-07-20 | 2020-07-20 | Data processing method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114037880A true CN114037880A (en) | 2022-02-11 |
Family
ID=80134040
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010698030.XA Pending CN114037880A (en) | 2020-07-20 | 2020-07-20 | Data processing method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114037880A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008060320A2 (en) * | 2006-03-30 | 2008-05-22 | Major Gadget Software, Inc. | Method and system for enterprise network access control and management for government and corporate entities |
WO2017039140A1 (en) * | 2015-09-02 | 2017-03-09 | 삼성전자(주) | User terminal device and method for recognizing user's location using sensor-based behavior recognition |
CN107705259A (en) * | 2017-09-24 | 2018-02-16 | 合肥麟图信息科技有限公司 | A kind of data enhancement methods and device under mobile terminal preview, screening-mode |
CN108154398A (en) * | 2017-12-27 | 2018-06-12 | 广东欧珀移动通信有限公司 | Method for information display, device, terminal and storage medium |
CN109886324A (en) * | 2019-02-01 | 2019-06-14 | 广州云测信息技术有限公司 | Icon-based programming method and apparatus |
CN110765939A (en) * | 2019-10-22 | 2020-02-07 | Oppo广东移动通信有限公司 | Identity recognition method and device, mobile terminal and storage medium |
CN110874649A (en) * | 2020-01-16 | 2020-03-10 | 支付宝(杭州)信息技术有限公司 | State machine-based federal learning method, system, client and electronic equipment |
CN111242710A (en) * | 2018-11-29 | 2020-06-05 | 北京京东尚科信息技术有限公司 | Business classification processing method and device, service platform and storage medium |
WO2020134231A1 (en) * | 2018-12-28 | 2020-07-02 | 杭州海康威视数字技术股份有限公司 | Information pushing method and device, and information display system |
Legal Events
Date | Code | Title | Description
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||