CN113468544B - Training method and device for application model - Google Patents
- Publication number
- CN113468544B (application CN202010238628.0A)
- Authority
- CN
- China
- Prior art keywords
- training
- file
- key
- application model
- preset
- Prior art date
- Legal status: Active (assumed; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F21/602 — Providing cryptographic facilities or services (G06F21/60 Protecting data; G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity)
- G06F21/32 — User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints (G06F21/31 User authentication; G06F21/30 Authentication)
- G06N3/045 — Combinations of networks (G06N3/04 Architecture; G06N3/02 Neural networks; G06N3/00 Computing arrangements based on biological models)
- G06N3/08 — Learning methods (G06N3/02 Neural networks; G06N3/00 Computing arrangements based on biological models)
Abstract
The embodiment of the application provides a training method and apparatus for an application model, wherein the method comprises the following steps: acquiring a random key; encrypting a training file and training data by using the random key to obtain an encrypted file; encrypting the random key by using a root key to obtain an encryption key; generating an input file based on the encrypted file and the encryption key; and inputting the input file into a preset training framework for training to obtain an application model. The technical solution provided by the embodiment of the application solves the problem that training files are easily stolen and leaked, and reduces losses for vendors of application models.
Description
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a training method and apparatus for an application model.
Background
Currently, to improve the accuracy and efficiency of data processing, an application model can be trained on a large amount of data and then used to process data.
However, both the data and the training files required to train an application model are in plaintext, where the training files include the pre-trained model as well as the algorithm network, algorithm hyperparameters, and the like. This makes the training files easy to steal and leak, causing losses for the vendor of the application model.
Summary of the application
An embodiment of the application aims to provide a training method and apparatus for an application model, so as to solve the problem that training files are easily stolen and leaked, and to reduce losses for vendors of application models. The specific technical solution is as follows:
in a first aspect, an embodiment of the present application provides a training method for an application model, where the method includes:
acquiring a random key;
encrypting the training file and the training data by using the random key to obtain an encrypted file;
encrypting the random key by using a root key to obtain an encryption key;
generating an input file based on the encrypted file and the encryption key;
and inputting the input file into a preset training framework for training to obtain an application model.
Optionally, the root key is generated by:
generating a first number of character strings;
selecting a second number of strings from the first number of strings, the second number being smaller than the first number;
and generating the root key according to the second number of character strings.
Optionally, the step of generating the root key according to the second number of character strings includes:
processing the second number of character strings according to a preset symbol operation;
merging the processed character strings to obtain a merged string;
and generating the root key according to the merged string.
Optionally, the step of generating the root key according to the merged string includes:
hash-encoding the merged string to obtain a hash-encoded string;
searching a preset character dictionary for a new string corresponding to the hash-encoded string;
and using the new string as the root key.
Optionally, a static library for encrypting and decrypting by using the root key is integrated in the preset training framework.
Optionally, the step of inputting the input file into a preset training framework for training to obtain an application model includes:
inputting the input file into the preset training framework, so that the preset training framework runs the static library, decrypts the encrypted file in the input file by using the root key and the encryption key in the input file to obtain the training file and training data, and trains the training file with the training data to obtain the application model.
Optionally, after obtaining the application model, the method further includes:
and encrypting the application model by using the preset training framework to obtain an encrypted application model.
Optionally, the preset training framework has a program shell.
Optionally, the training data is face image data, and the application model is a face recognition model; after obtaining the application model, the method further comprises:
and carrying out face recognition on the face image to be recognized by utilizing the application model.
In a second aspect, an embodiment of the present application provides a training apparatus for an application model, where the apparatus includes:
an acquisition unit configured to acquire a random key;
the first encryption unit is used for encrypting the training file and the training data by utilizing the random key to obtain an encrypted file;
the second encryption unit is used for encrypting the random key by utilizing the root key to obtain an encryption key;
a first generation unit configured to generate an input file based on the encrypted file and an encryption key;
and the training unit is used for inputting the input file into a preset training framework for training to obtain an application model.
Optionally, the apparatus further includes: a second generation unit configured to generate the root key; the second generation unit includes:
a first generation subunit configured to generate a first number of character strings;
a selection subunit configured to select a second number of strings from the first number of strings, the second number being smaller than the first number;
and a second generation subunit configured to generate the root key according to the second number of character strings.
Optionally, the second generation subunit is specifically configured to:
process the second number of character strings according to a preset symbol operation;
merge the processed character strings to obtain a merged string;
and generate the root key according to the merged string.
Optionally, the second generation subunit is specifically configured to:
hash-encode the merged string to obtain a hash-encoded string;
search a preset character dictionary for a new string corresponding to the hash-encoded string;
and use the new string as the root key.
Optionally, a static library for encrypting and decrypting by using the root key is integrated in the preset training framework.
Optionally, the training unit is specifically configured to:
input the input file into the preset training framework, so that the preset training framework runs the static library, decrypts the encrypted file in the input file by using the root key and the encryption key in the input file to obtain the training file and training data, and trains the training file with the training data to obtain the application model.
Optionally, the apparatus further includes:
and the third encryption unit is used for encrypting the application model by using the preset training framework after the application model is obtained, to obtain an encrypted application model.
Optionally, the preset training framework has a program shell.
Optionally, the training data is face image data, and the application model is a face recognition model; the training device of the application model further comprises:
and the application unit is used for carrying out face recognition on the face image to be recognized by utilizing the application model after the application model is obtained.
In a third aspect, embodiments of the present application provide an electronic device including a processor and a memory; the memory is used for storing a computer program; the processor is configured to implement any of the method steps provided in the first aspect when executing the program stored in the memory.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored therein, which when executed by a processor, implements any of the method steps provided in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the method steps provided in the first aspect.
The beneficial effects of the embodiment of the application are that:
in the technical scheme provided by the embodiment of the application, the training file and the training data are encrypted by utilizing the random key, and the random key is encrypted by utilizing the root key, so that the training file and the training data are subjected to double encryption, the safety of the training file and the training data is improved, and the training file and the training data are prevented from being leaked. In addition, the random key is utilized for encryption, so that the encryption keys of different files are different, the input files can be effectively prevented from being cracked by violence, even if one input file is cracked by violence, other input files cannot be cracked, the problem that training files are easy to steal and leak is effectively solved, and the benefit loss of manufacturers of application models is reduced.
Of course, not all of the above-described advantages need be achieved simultaneously in practicing any one of the products or methods of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present application; other drawings may be derived from them by a person of ordinary skill in the art without inventive effort.
Fig. 1 is a schematic flow chart of a training method of an application model according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a decryption process of an input file according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a root key generation process according to an embodiment of the present application;
fig. 4 is another schematic diagram of a root key generating process provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a root key generation process according to an embodiment of the present disclosure;
FIG. 6 is a schematic flow chart of a training process of an application model according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a training device for application model according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without inventive effort fall within the scope of protection of the present application.
In order to solve the problem that training files are easily stolen and leaked, and to reduce losses for vendors of application models, an embodiment of the present application provides a training method for an application model. The method can be applied to electronic devices such as a back-end server or a personal computer, which the embodiment of the application does not specifically limit. The application model may be a face recognition model, a license plate localization model, a vehicle recognition model, or the like, which the embodiment of the application likewise does not specifically limit.
The training method of the application model comprises the following steps: acquiring a random key; encrypting a training file and training data by using the random key to obtain an encrypted file; encrypting the random key by using a root key to obtain an encryption key; generating an input file based on the encrypted file and the encryption key; and inputting the input file into a preset training framework for training to obtain an application model.
In the technical solution provided by the embodiments of the application, the training file and the training data are encrypted with a random key, and the random key is in turn encrypted with a root key, so that the training file and training data are doubly encrypted. This improves their security and prevents them from being leaked. In addition, because encryption uses a random key, the encryption keys of different files all differ, which effectively prevents brute-force cracking of the input files: even if one input file is brute-forced, the other input files cannot be cracked. This effectively solves the problem of training files being easily stolen and leaked, and reduces losses for vendors of application models.
The training method of the application model provided by the embodiment of the application is described in detail below through a specific embodiment.
Referring to fig. 1, fig. 1 is a schematic flow chart of a training method of an application model according to an embodiment of the present application. In the following, for ease of understanding, a server is taken as the execution subject; this is not a limitation. The method comprises the following steps.
Step 101, a random key is acquired.
In the embodiment of the application, the server may use white-box cryptography to randomly generate a key, that is, the random key. Because each file is encrypted with its own random key, the encryption keys of the files all differ, which avoids the risk that brute-forcing one encrypted file also compromises other files, and improves file security.
Step 102, encrypting the training file and the training data by using the random key to obtain an encrypted file.
In the embodiment of the application, the training file may include a pre-trained model, an algorithm network, algorithm hyperparameters, and the like. The training data may be image data, text data, or the like, and can be chosen according to the application model to be trained. For example, if the application model is a face recognition model, the training data is face images; if the application model is a vehicle recognition model, the training data is vehicle images.
In one embodiment, the training file includes a pre-trained model, an algorithm network, and algorithm hyperparameters, and the training data is image data. The server encrypts the pre-trained model, the algorithm network, and the algorithm hyperparameters separately with the random key to obtain an encrypted pre-trained model, an encrypted algorithm network, and encrypted algorithm hyperparameters, and encrypts the image data with the random key to obtain encrypted image data.
Step 103, encrypting the random key by using the root key to obtain an encryption key.
In the embodiment of the application, the white-box encryption scheme mainly relies on the root key for encryption protection. After the training file and training data are encrypted with the random key, the server may further encrypt the random key with the root key to obtain the encryption key, so that an attacker who steals the encryption key cannot use it to decrypt the encrypted file and obtain the training file and training data.
At this point the training file and training data are doubly encrypted: even if an attacker steals the encryption key, the encrypted file cannot be decrypted, which improves the security of the training file and training data.
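As an illustration of steps 101-103, the two-layer scheme can be sketched in Python. The cipher below is a toy SHA-256 keystream standing in for whatever white-box cipher a real implementation would use; the `xor_stream` helper and all key values are illustrative assumptions, not part of the patent.

```python
import hashlib
import secrets

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy keystream cipher (SHA-256 in counter mode); a stand-in for a real
    # cipher such as AES -- the patent does not name a specific algorithm.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

# Step 101: obtain a random key.
random_key = secrets.token_bytes(32)

# Step 102: encrypt the training file and training data with the random key.
training_file = b"pre-trained model + network + hyperparameters"
encrypted_file = xor_stream(random_key, training_file)

# Step 103: encrypt the random key with the root key to get the encryption key.
root_key = b"root-key-from-string-derivation"  # hypothetical value
encryption_key = xor_stream(root_key, random_key)

# Decryption reverses the two layers.
recovered_key = xor_stream(root_key, encryption_key)
assert xor_stream(recovered_key, encrypted_file) == training_file
```

Because the keystream XOR is its own inverse, applying `xor_stream` twice with the same key recovers the plaintext, mirroring the decryption flow described later in steps 201-203.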
Step 104, based on the encrypted file and the encryption key, an input file is generated.
In this embodiment of the present application, after obtaining the encrypted file and the encryption key, the server may generate the input file based on them. The encryption key may be located at a preset position in the input file.
The preset position can be set according to actual requirements. For example, the preset position may be a head of the input file, a tail of the input file, or the like.
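The layout of the input file in step 104 can be sketched as follows, assuming (hypothetically) a fixed-length encryption key placed at the head of the file; the patent leaves the preset position open (head, tail, etc.).

```python
KEY_LEN = 32  # assumed fixed length of the encryption key

def pack_input_file(encryption_key: bytes, encrypted_file: bytes) -> bytes:
    # Step 104: place the encryption key at the preset position (here, the
    # head), followed by the encrypted file.
    assert len(encryption_key) == KEY_LEN
    return encryption_key + encrypted_file

def unpack_input_file(blob: bytes) -> tuple[bytes, bytes]:
    # The training framework later reads the key back from the same position.
    return blob[:KEY_LEN], blob[KEY_LEN:]

blob = pack_input_file(b"k" * KEY_LEN, b"ciphertext...")
key, body = unpack_input_file(blob)
```

Any fixed, agreed position works equally well, as long as the packing and unpacking sides use the same convention.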
Step 105, inputting the input file into a preset training framework for training to obtain an application model.
In the embodiment of the application, the preset training framework may be a Convolutional Architecture for Fast Feature Embedding (CAFFE) training framework. After the server obtains the application model, it can use the application model to process the data to be processed.
In one embodiment, a static library that performs encryption and decryption with the root key is integrated into the preset training framework. Compiling the preset training framework against the static library increases the complexity of cracking and effectively avoids the algorithm being exposed through the cracking of a single standalone encryption/decryption library. In another embodiment, to increase the difficulty of decompilation and further raise the cracking complexity, the preset training framework may be packed so that it has a program shell.
In one embodiment, the server inputs the input file into the preset training framework, and the preset training framework trains on the input file. Specifically, the preset training framework runs the static library, decrypts the encrypted file in the input file by using the root key and the encryption key in the input file to obtain the training file and training data, and then trains the training file with the training data to obtain the application model.
Specifically, as shown in fig. 2, the decryption process of the input file may include the following steps.
Step 201, obtaining an encryption key in an input file.
Step 202, decrypting the encryption key with the root key to obtain a random key.
Step 203, decrypting the encrypted file in the input file by using the random key to obtain the training file and training data.
Through steps 201-203, the plaintext training file and training data are obtained, and training then proceeds in the preset training framework to obtain the application model.
In one embodiment, to further improve the security of the application model, the server encrypts the application model by using the preset training framework to obtain an encrypted application model.
Specifically, after the preset training framework obtains the application model, the server obtains a random key, encrypts the application model with the random key to obtain an encrypted application model, and encrypts the random key with the root key to obtain an encryption key. The encrypted application model and the encryption key form an output file, which may be provided to users.
In the technical solution provided by the embodiments of the application, the training file and the training data are encrypted with a random key, and the random key is in turn encrypted with a root key, so that the training file and training data are doubly encrypted. This improves their security and prevents them from being leaked. In addition, because encryption uses a random key, the encryption keys of different files all differ, which effectively prevents brute-force cracking of the input files: even if one input file is brute-forced, the other input files cannot be cracked. This effectively solves the problem of training files being easily stolen and leaked, and reduces losses for vendors of application models.
In one embodiment of the present application, the root key may be generated as shown in FIG. 3, using the following steps.
Step 301 generates a first number of character strings.
In this embodiment of the present application, the server may set a first number of random seeds and randomly generate the first number of character strings based on them.
The first number may be set according to actual requirements. For example, if a high-security root key is required, the first number may be larger; if the computational resources occupied need to be reduced, the first number may be smaller.
Step 302, a second number of strings is selected from the first number of strings, the second number being smaller than the first number.
In this embodiment of the present application, after the first number of strings is generated, the server may select the second number of strings from them according to a preset selection rule. The preset selection rule can be set according to actual requirements; for example, it may be "the first second-number of strings" or "the last second-number of strings".
For example, suppose the first number is 10 and the second number is 2. If the preset selection rule is "the first 2 strings", the server selects the first 2 of the 10 strings.
In the embodiment of the application, a first number of strings is randomly generated and only a subset of them is used to generate the root key. This effectively prevents an attacker from guessing, during de-obfuscation and decompilation, which strings are actually used, and increases the difficulty of cracking the root key.
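The generation and selection in steps 301-302 can be sketched as follows; the string length, the alphabet, and the "first 2 of 10" rule are taken from the example above or assumed for illustration.

```python
import secrets
import string

FIRST_NUMBER = 10   # first number of strings to generate
SECOND_NUMBER = 2   # second number of strings to select

# Step 301: randomly generate a first number of 8-character strings
# (the length and alphabet are illustrative assumptions).
alphabet = string.ascii_letters + string.digits
strings = ["".join(secrets.choice(alphabet) for _ in range(8))
           for _ in range(FIRST_NUMBER)]

# Step 302: select a second number of them by a preset rule --
# here, "the first 2 strings", as in the example above.
selected = strings[:SECOND_NUMBER]
```

Since only `SECOND_NUMBER` of the generated strings actually contribute to the root key, the remaining strings act as decoys against an attacker inspecting the decompiled program.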
Step 303, generating a root key according to the second number of character strings.
In one embodiment, the server may directly merge the second number of strings to obtain the root key. For example, the second number is 2, and the selected 2 strings include: aaa and bbb. The server may obtain the root key as aaabbb.
In another embodiment, the server may be preconfigured with a symbol operation, that is, the preset symbol operation. The preset symbol operation may include at least one of adding characters, removing characters, shifting characters, and the like. As shown in fig. 4, step 303 may be subdivided into steps 3031, 3032 and 3033.
Step 3031, the second number of strings is processed according to the preset symbol operation.
For example, the second number is 2, and the selected 2 strings are abcd and 12345. The preset symbol operation removes the first two characters of each string, so the processed strings are cd and 345.
In the embodiment of the present application, the operations for each string may be the same or different.
Step 3032, merging the processed second number of character strings to obtain a merged character string.
Continuing the example from step 3031: the processed strings cd and 345 are merged to obtain the merged string cd345.
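Steps 3031-3032 can be sketched with the example strings from the text (abcd and 12345); the `drop_first_two` helper is an illustrative stand-in for whatever symbol operation is preset.

```python
def drop_first_two(s: str) -> str:
    # Preset symbol operation used in the text's example:
    # remove the first two characters of the string.
    return s[2:]

selected = ["abcd", "12345"]
processed = [drop_first_two(s) for s in selected]  # ["cd", "345"]
merged = "".join(processed)                        # "cd345"
```

The operation applied to each string may differ per string, as the text notes; a per-index table of operations would implement that variant.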
Step 3033, a root key is generated from the merged string.
In this embodiment of the present application, after selecting the second number of strings, the server further processes them according to the preset symbol operation before generating the root key. This increases the complexity of the root-key generation process and further raises the difficulty of cracking the root key.
In one embodiment, the server may directly use the merged string as the root key.
In yet another embodiment, the server may be preconfigured with a character dictionary, that is, the preset character dictionary, which contains a number of characters, for example 1000. Based on this preset character dictionary, as shown in fig. 5, step 3033 may be subdivided into steps 30331, 30332 and 30333.
Step 30331, hash-coding the combined string to obtain a hash-coded string.
In the embodiment of the application, after obtaining the merged string, the server may hash-encode it according to a hash algorithm to obtain a hash-encoded string.
Step 30332, searching the preset character dictionary for a new string corresponding to the hash-encoded string.
In this embodiment of the present application, a lookup rule may be set in the server. Based on this rule, the character corresponding to each character of the hash-encoded string is looked up in the preset character dictionary, and the characters found form the new string.
The above-mentioned searching rule may be set according to actual needs, which is not specifically limited in the embodiment of the present application.
Step 30333, the new character string is used as the root key.
In the embodiment of the application, the server takes the new character string searched based on the preset character dictionary as the root key. This further increases the complexity of the root key generation process, further increasing the difficulty of root key cracking.
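Steps 30331-30333 can be sketched as follows. The 64-character dictionary and the byte-modulo lookup rule are illustrative assumptions; the patent only requires some preset dictionary (e.g. 1000 characters) and some lookup rule.

```python
import hashlib

# A toy preset character dictionary (the patent mentions e.g. 1000 characters).
CHAR_DICT = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789+/"

def root_key_from_merged(merged: str) -> str:
    # Step 30331: hash-encode the merged string.
    digest = hashlib.sha256(merged.encode()).digest()
    # Steps 30332-30333: look up a dictionary character for each digest byte;
    # the resulting new string is used as the root key.
    return "".join(CHAR_DICT[b % len(CHAR_DICT)] for b in digest)

root_key = root_key_from_merged("cd345")
```

The derivation is deterministic, so the same merged string always yields the same root key, while the hash and dictionary indirection hide the relationship between the generated strings and the final key.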
In the embodiment of the application, the root key of the white-box encryption scheme can be generated and managed entirely within a program on the server. No interaction between a client and the server is required, which reduces the possibility of an attacker obtaining the root key through network interception, and improves the security of the training file and training data as well as the application model.
In the embodiment of the application, after the server obtains the application model, the data to be processed can be processed by using the application model.
For example, the training data is face image data, and the application model is a face recognition model; after the application model is obtained, the application model is utilized to carry out face recognition on the face image to be recognized, and a face recognition result is obtained.
For another example, the training data is vehicle image data, and the application model is a vehicle recognition model; and after the application model is obtained, the application model is utilized to identify the vehicle image to be identified, and a vehicle identification result is obtained.
The following describes the training of the application model according to the embodiment of the present application with reference to the training flow of the application model shown in fig. 6. The protection of the preset training frame as the CAFFE training frame is mainly divided into input file protection and output file protection. The input file can comprise a pre-training model, an algorithm network, an algorithm super-parameter, image data and the like; the output files may include log information files, application model files, state information files, and the like.
For the security of the training files and training data, encryption and decryption tools can be generated according to the encryption and decryption processes in fig. 1-5, and the static library of the encryption and decryption processes in fig. 1-5 can be integrated in the CAFFE training framework. In the embodiment of the application, the CAFFE training frame and the encrypted output file can be provided externally, and an encryption and decryption tool is not provided externally. Because the program code of the encryption and decryption flow in the CAFFE training frame needs to be cracked, the difficulty of cracking is greatly increased, and the safety of the output file is improved.
In the embodiment of the application, the encryption and decryption tool is used to encrypt the pre-trained model, algorithm network, algorithm hyperparameters, and image data included in the input file, obtaining an encrypted pre-trained model, an encrypted algorithm network, encrypted algorithm hyperparameters, and encrypted image data. An input file comprising these encrypted items is then fed into the CAFFE training framework.
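The scheme above is a form of envelope encryption: each artifact is encrypted under a fresh random key, and only that random key (wrapped under the root key) travels with the bundle. A minimal sketch follows; the function names and the SHA-256-based XOR stream cipher are illustrative assumptions (a real implementation would use a proper cipher such as AES), since the patent does not fix a concrete algorithm:

```python
import hashlib
import os

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher derived from SHA-256 in counter mode.
    # Stand-in for a real cipher (e.g., AES); for illustration only.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

def build_input_file(root_key: bytes, training_files: dict) -> dict:
    """Encrypt each training artifact with a fresh random key, then
    wrap (encrypt) that random key under the root key."""
    random_key = os.urandom(32)                            # per-input random key
    encrypted = {name: _keystream_xor(random_key, blob)
                 for name, blob in training_files.items()}
    encryption_key = _keystream_xor(root_key, random_key)  # wrapped key
    return {"files": encrypted, "key": encryption_key}

def open_input_file(root_key: bytes, bundle: dict) -> dict:
    """What the framework-side static library does: unwrap, then decrypt."""
    random_key = _keystream_xor(root_key, bundle["key"])
    return {name: _keystream_xor(random_key, blob)
            for name, blob in bundle["files"].items()}
```

Because the random key differs per input file, brute-forcing one bundle reveals nothing about the others, which is the benefit the embodiment claims.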
The CAFFE training framework decrypts the encrypted pre-trained model, encrypted algorithm network, encrypted algorithm hyperparameters, and encrypted image data included in the input file to recover the pre-trained model, algorithm network, algorithm hyperparameters, and image data. The framework then trains the pre-trained model, algorithm network, and algorithm hyperparameters using the image data to obtain the application model, and records a state information file and a log information file during training. The framework encrypts the application model and the state information file to obtain an encrypted application model and an encrypted state information file, and outputs the encrypted application model, the encrypted state information file, and the log information file. Algorithm-related output may be deleted from, or masked in, the log information file.
Corresponding to the embodiment of the training method for the application model, an embodiment of the application also provides a training device for the application model. Referring to fig. 7, fig. 7 is a schematic structural diagram of a training device for an application model according to an embodiment of the present application. The device comprises: an acquisition unit 701, a first encryption unit 702, a second encryption unit 703, a first generation unit 704, and a training unit 705.
An acquisition unit 701 for acquiring a random key;
a first encryption unit 702, configured to encrypt the training file and the training data with a random key to obtain an encrypted file;
a second encryption unit 703, configured to encrypt the random key with the root key, to obtain an encryption key;
a first generation unit 704 for generating an input file based on the encrypted file and the encryption key;
the training unit 705 is configured to input the input file into a preset training framework for training, to obtain an application model.
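The five units compose a single pipeline: acquire a random key, encrypt, wrap the key, assemble the input file, train. A thin orchestration sketch of the device in fig. 7 is below; the class and parameter names are hypothetical, and the encrypt/train callables are injected stubs, since the patent does not bind the units to concrete algorithms:

```python
import os

class ApplicationModelTrainer:
    """Sketch of the device in fig. 7. Comments map each step to a unit."""

    def __init__(self, root_key: bytes, encrypt, train):
        self.root_key = root_key
        self.encrypt = encrypt     # encrypt(key, data) -> bytes
        self.train = train         # train(input_file) -> model bytes

    def acquire_random_key(self) -> bytes:                   # acquisition unit 701
        return os.urandom(32)

    def build_and_train(self, training_file: bytes, training_data: bytes) -> bytes:
        rk = self.acquire_random_key()
        encrypted_file = self.encrypt(rk, training_file + training_data)  # unit 702
        encryption_key = self.encrypt(self.root_key, rk)                  # unit 703
        input_file = {"file": encrypted_file, "key": encryption_key}      # unit 704
        return self.train(input_file)                                     # unit 705
```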
In one embodiment, the training device for the application model may further include: a second generation unit (not shown in fig. 7) for generating the root key; the second generation unit may include:
a first generation subunit configured to generate a first number of character strings;
a selection subunit configured to select a second number of strings from the first number of strings, the second number being smaller than the first number;
and the second generation subunit is used for generating a root key according to the second number of character strings.
In one embodiment, the second generating subunit may be specifically configured to:
processing the second number of character strings according to a preset symbol operation;
merging the processed character strings to obtain a merged character string;
and generating the root key from the merged character string.
In one embodiment, the second generating subunit may be specifically configured to:
carrying out hash coding on the merged character string to obtain a hash-coded character string;
searching a preset character dictionary for a new character string corresponding to the hash-coded character string;
and taking the new character string as the root key.
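The root-key pipeline described above (generate a first number of strings, select a smaller second number, apply a preset symbol operation, merge, hash-encode, and map through a preset character dictionary) can be sketched as follows. The concrete choices here are all assumptions: XOR with '#' as the "preset symbol operation", SHA-256 as the hash coding, and a 16-letter table as the "preset character dictionary":

```python
import hashlib
import secrets
import string

# Hypothetical preset character dictionary: one letter per hex digit.
CHAR_DICT = dict(zip("0123456789abcdef", "GQWERTYUPASDFHKZ"))

def generate_root_key(first_number: int = 16, second_number: int = 4) -> str:
    # 1) Generate a first number of random character strings.
    pool = ["".join(secrets.choice(string.ascii_letters) for _ in range(8))
            for _ in range(first_number)]
    # 2) Select a second (smaller) number of them.
    idx = secrets.SystemRandom().sample(range(first_number), second_number)
    chosen = [pool[i] for i in sorted(idx)]
    # 3) Preset symbol operation: XOR each character with '#' (assumption).
    processed = ["".join(chr(ord(ch) ^ ord("#")) for ch in s) for s in chosen]
    # 4) Merge the processed strings, then hash-encode the merged string.
    merged = "".join(processed)
    hashed = hashlib.sha256(merged.encode()).hexdigest()
    # 5) Map each hash character through the preset character dictionary.
    return "".join(CHAR_DICT[c] for c in hashed)
```

Because the inputs are random and the mapping is fixed, every deployment derives a distinct root key while the derivation logic itself stays hidden inside the framework.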
In one embodiment, the preset training framework integrates a static library that performs encryption and decryption using the root key.
In one embodiment, the training unit 705 may be specifically configured to:
inputting the input file into the preset training framework, so that the preset training framework runs the static library and decrypts the encrypted file in the input file by using the root key and the encryption key in the input file, obtaining the training file and the training data; and training the training file with the training data to obtain the application model.
In one embodiment, the training device for the application model may further include: a third encryption unit, configured to encrypt the application model by using the preset training framework after the application model is obtained, to obtain an encrypted application model.
In one embodiment, the preset training framework is protected by a program shell (code packing).
In one embodiment, the training data is face image data, and the application model is a face recognition model; the training device for the application model may further include:
and the application unit is used for carrying out face recognition on the face image to be recognized by utilizing the application model after the application model is obtained.
In the technical solution provided by the embodiment of the application, the training file and the training data are encrypted with a random key, and the random key is in turn encrypted with a root key, so the training file and the training data are doubly encrypted. This improves their security and prevents leakage. In addition, because encryption uses a random key, different files are encrypted under different keys, which effectively prevents brute-force cracking of the input files: even if one input file is brute-forced, the other input files remain secure. This effectively solves the problem that training files are easily stolen and leaked, and reduces the loss of benefits for vendors of application models.
Corresponding to the training method of the application model, the embodiment of the application also provides an electronic device, as shown in fig. 8, including a processor 801 and a memory 802; a memory 802 for storing a computer program; the processor 801 is configured to implement any step of the training method of the application model when executing the program stored in the memory 802.
The memory may include random access memory (Random Access Memory, RAM) or non-volatile memory (Non-Volatile Memory, NVM), such as at least one disk storage. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), and the like; it may also be a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In accordance with the foregoing application model training method, in a further embodiment of the present application, a computer readable storage medium is further provided, where a computer program is stored, and the computer program when executed by a processor implements any step of the foregoing application model training method.
In accordance with the above-described method of training an application model, in a further embodiment provided herein, there is also provided a computer program product comprising instructions that, when run on a computer, cause the computer to perform any of the steps of the above-described method of training an application model.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), etc.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for apparatus, electronic devices, computer readable storage media and computer program product embodiments, the description is relatively simple as it is substantially similar to method embodiments, as relevant points are found in the partial description of method embodiments.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, etc. that are within the spirit and principles of the present application are intended to be included within the scope of the present application.
Claims (9)
1. A method of training an application model, the method comprising:
acquiring a random key;
encrypting the training file and the training data by using the random key to obtain an encrypted file;
encrypting the random key by using a root key to obtain an encryption key;
generating an input file based on the encrypted file and the encryption key;
inputting the input file into a preset training framework for training to obtain an application model;
wherein the root key is generated by:
generating a first number of character strings;
selecting a second number of strings from the first number of strings, the second number being smaller than the first number;
and generating the root key according to the second number of character strings.
2. The method of claim 1, wherein the step of generating the root key from the second number of strings comprises:
processing the second number of character strings according to a preset symbol operation;
merging the processed second number of character strings to obtain a merged character string;
and generating the root key according to the merged character string.
3. The method of claim 2, wherein the step of generating the root key from the merged string comprises:
carrying out hash coding on the merged character string to obtain a hash-coded character string;
searching a preset character dictionary for a new character string corresponding to the hash-coded character string;
and taking the new character string as the root key.
4. The method of claim 1, wherein the preset training framework integrates a static library that is encrypted and decrypted using the root key.
5. The method of claim 4, wherein the step of inputting the input file into a preset training framework for training to obtain an application model comprises:
inputting the input file into the preset training framework, so that the preset training framework runs the static library and decrypts the encrypted file in the input file by using the root key and the encryption key in the input file to obtain the training file and training data; and training the training file by utilizing the training data to obtain an application model.
6. The method according to any one of claims 1-5, wherein after deriving the application model, the method further comprises:
and encrypting the application model by using the preset training framework to obtain an encrypted application model.
7. The method of any one of claims 1-5, wherein the preset training framework has a program shell.
8. The method according to any one of claims 1 to 5, wherein the training data is face image data and the application model is a face recognition model; after obtaining the application model, the method further comprises:
and carrying out face recognition on the face image to be recognized by utilizing the application model.
9. A training apparatus for an application model, the apparatus comprising:
an acquisition unit configured to acquire a random key;
the first encryption unit is used for encrypting the training file and the training data by utilizing the random key to obtain an encrypted file;
the second encryption unit is used for encrypting the random key by utilizing the root key to obtain an encryption key;
a first generation unit configured to generate an input file based on the encrypted file and an encryption key;
the training unit is used for inputting the input file into a preset training framework for training, to obtain an application model;
the apparatus further comprises: a second generation unit configured to generate the root key; the second generating unit includes:
a first generation subunit configured to generate a first number of character strings;
a selection subunit configured to select a second number of strings from the first number of strings, the second number being smaller than the first number;
and the second generation subunit is used for generating the root key according to the second number of character strings.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010238628.0A CN113468544B (en) | 2020-03-30 | 2020-03-30 | Training method and device for application model |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113468544A CN113468544A (en) | 2021-10-01 |
CN113468544B true CN113468544B (en) | 2024-04-05 |
Family
ID=77866133
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010238628.0A Active CN113468544B (en) | 2020-03-30 | 2020-03-30 | Training method and device for application model |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113468544B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106788995A (en) * | 2016-12-07 | 2017-05-31 | 武汉斗鱼网络科技有限公司 | File encrypting method and device |
CN108432178A (en) * | 2015-12-14 | 2018-08-21 | 萨基姆宽带简易股份有限公司 | Method for protecting multimedia content record security in storage medium |
CN108898028A (en) * | 2018-07-06 | 2018-11-27 | 成都大象分形智能科技有限公司 | It is related to the neural network model encryption protection system and method for iteration and accidental enciphering |
US10289816B1 (en) * | 2018-06-08 | 2019-05-14 | Gsfm Llc | Methods, systems, and devices for an encrypted and obfuscated algorithm in a computing environment |
CN110062014A (en) * | 2019-06-11 | 2019-07-26 | 苏州思必驰信息科技有限公司 | The encryption and decryption method and system of network model |
JP2019168590A (en) * | 2018-03-23 | 2019-10-03 | Kddi株式会社 | Information processing method and information processing system |
CN110417544A (en) * | 2019-06-28 | 2019-11-05 | 腾讯科技(深圳)有限公司 | A kind of generation method of root key, device and medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102006031870B4 (en) * | 2006-06-01 | 2008-07-31 | Siemens Ag | Method and system for providing a Mobile IP key |
CN102231181B (en) * | 2009-10-22 | 2014-08-06 | 鸿富锦精密工业(深圳)有限公司 | Computer system used for file encryption and file encryption method |
AU2014253868B2 (en) * | 2013-04-18 | 2016-05-19 | RISOFTDEV, Inc. | System and methods for encrypting data |
US20200019882A1 (en) * | 2016-12-15 | 2020-01-16 | Schlumberger Technology Corporation | Systems and Methods for Generating, Deploying, Discovering, and Managing Machine Learning Model Packages |
- 2020-03-30: application CN202010238628.0A filed in China; granted as CN113468544B (status: active)
Non-Patent Citations (2)
Title |
---|
Privacy-preserving face recognition with convolutional neural networks; Zhang Jianwu et al.; Journal of Image and Graphics (05); full text *
A data hashing encryption algorithm based on artificial neural networks; Cui Haixia et al.; Electronic Engineer (05); full text *
Also Published As
Publication number | Publication date |
---|---|
CN113468544A (en) | 2021-10-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9934400B2 (en) | System and methods for encrypting data | |
US9740849B2 (en) | Registration and authentication of computing devices using a digital skeleton key | |
US8254571B1 (en) | Cryptographic system with halting key derivation function capabilities | |
US8121294B2 (en) | System and method for a derivation function for key per page | |
CN106776904B (en) | The fuzzy query encryption method of dynamic authentication is supported in a kind of insincere cloud computing environment | |
TW201812638A (en) | Storage design method of blockchain encrypted radio frequency chip | |
US20030219121A1 (en) | Biometric key generation for secure storage | |
JP2000315999A (en) | Cryptographic key generating method | |
CN102156843B (en) | Data encryption method and system as well as data decryption method | |
US11227037B2 (en) | Computer system, verification method of confidential information, and computer | |
US20090077388A1 (en) | Information processing apparatus and computer readable medium | |
Liu et al. | A novel security key generation method for SRAM PUF based on Fourier analysis | |
CN113468544B (en) | Training method and device for application model | |
CN113626645A (en) | Hierarchical optimization efficient ciphertext fuzzy retrieval method and related equipment | |
KR102375973B1 (en) | Security server using case based reasoning engine and storage medium for installing security function | |
JP7024709B2 (en) | Cryptographic information collation device, cryptographic information collation method, and cryptographic information collation program | |
US9882879B1 (en) | Using steganography to protect cryptographic information on a mobile device | |
CN107579987A (en) | A kind of encryption of server high in the clouds diagnostic system rule base two level, access method and system | |
CN117992989B (en) | Decryption method, system, device and storage medium | |
US20240232389A9 (en) | Memory Inline Cypher Engine with Confidentiality, Integrity, and Anti-Replay for Artificial Intelligence or Machine Learning Accelerator | |
US11121884B2 (en) | Electronic system capable of self-certification | |
CN115150074A (en) | Virtual OTP decryption method and device, electronic equipment and storage medium | |
McGuffey et al. | Implementation and optimization of a biometric cryptosystem using iris recognition | |
CN117113377A (en) | Matrix-based coded lock password encryption method and system | |
CN118944913A (en) | Data encryption management method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||