CN109325584A - Neural-network-based federated modeling method, device, and readable storage medium - Google Patents

Neural-network-based federated modeling method, device, and readable storage medium

Info

Publication number
CN109325584A
CN109325584A
Authority
CN
China
Prior art keywords
terminal
model to be trained
loss value
neural network
gradient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810913868.9A
Other languages
Chinese (zh)
Other versions
CN109325584B (en)
Inventor
刘洋
陈天健
范涛
成柯葳
杨强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date: 2018-08-10
Publication date: 2019-02-12
Application filed by WeBank Co Ltd
Priority to CN201810913868.9A
Publication of CN109325584A
Application granted
Publication of CN109325584B
Current legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a neural-network-based federated modeling method, device, and readable storage medium. The method includes: a first terminal inputs labeled first sample data into a first neural network of a model to be trained for iteration, and homomorphically encrypts the first output of the first neural network after the iteration; the first terminal receives a homomorphically encrypted second output sent by a second terminal; according to the homomorphically encrypted first output and second output, it computes an encrypted loss value and gradient value and transmits them to a third terminal; after the third terminal decrypts the encrypted loss value, whether the model to be trained has converged is judged according to the decrypted loss value; if the model to be trained has converged, training ends and the trained model is obtained. The invention can improve both the privacy and the utilization of each party's sample data.

Description

Neural-network-based federated modeling method, device, and readable storage medium
Technical field
The present invention relates to the technical field of machine learning, and more particularly to a neural-network-based federated modeling method, device, and readable storage medium.
Background technique
With the rapid development of machine learning, it can be applied in many fields, such as data mining, computer vision, natural language processing, biometric recognition, medical diagnosis, credit-card fraud detection, securities market analysis, and DNA sequencing. Machine learning comprises a learning part and an execution part: the learning part uses sample data to modify the system's knowledge base so as to improve the efficiency with which the execution part completes its tasks, while the execution part completes tasks according to the knowledge base and feeds the acquired information back to the learning part.
At present, because the sample data of the various parties are closely related, a model learned from only one party's sample data is inaccurate. To solve this problem, the parties' sample data are combined and machine learning is performed with single-layer, simple models such as logistic regression or decision trees. However, because the parties' sample data must be combined, one party's sample data may become known to another party; in addition, machine learning with a single-layer simple model cannot make effective use of each party's sample data.
Therefore, how to improve the privacy and utilization of each party's sample data is an urgent problem to be solved.
Summary of the invention
The main purpose of the present invention is to provide a neural-network-based federated modeling method, device, and readable storage medium, aiming to improve the privacy and utilization of each party's sample data.
To achieve the above object, the present invention provides a neural-network-based federated modeling method comprising the following steps:
a first terminal inputs labeled first sample data into a first neural network of a model to be trained for iteration, and homomorphically encrypts the first output of the first neural network after the iteration;
the first terminal receives a homomorphically encrypted second output sent by a second terminal, wherein the second terminal inputs second sample data into a second neural network of the model to be trained for iteration, homomorphically encrypts the second output of the second neural network after the iteration, and transmits it to the first terminal;
according to the homomorphically encrypted first output and second output, the first terminal computes an encrypted loss value and gradient value, and transmits the encrypted loss value and gradient value to a third terminal;
after the third terminal decrypts the encrypted loss value and gradient value, whether the model to be trained has converged is judged according to the decrypted loss value; if the model to be trained has converged, training ends and the trained model is obtained.
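For illustration only, the following is a minimal sketch of this three-terminal flow under the additively homomorphic Paillier scheme (via the python-paillier package "phe"); the one-layer networks, the squared-error surrogate, and all variable names are assumptions made for the sketch, not the patent's prescribed implementation.

```python
# Minimal three-terminal sketch, assuming the "phe" Paillier package.
import numpy as np
from phe import paillier

rng = np.random.default_rng(0)

# Third terminal: generates the key pair; only the public key is distributed.
pub_key, priv_key = paillier.generate_paillier_keypair(n_length=1024)

# First terminal: labeled sample through the first network, then encrypt.
x_a, y = rng.normal(size=4), 1.0
u = float(np.tanh(x_a @ rng.uniform(-0.5, 0.5, size=4)))
enc_u = pub_key.encrypt(u)

# Second terminal: its sample through the second network, then encrypt
# and transmit the result to the first terminal.
x_b = rng.normal(size=4)
v = float(np.tanh(x_b @ rng.uniform(-0.5, 0.5, size=4)))
enc_v = pub_key.encrypt(v)

# First terminal: combine the encrypted outputs. The residual u+v-y and
# the squared-error gradient 2*(u+v-y) are linear in u and v, so an
# additive scheme suffices; both stay encrypted in transit.
enc_residual = enc_u + enc_v - y
enc_gradient = enc_residual * 2.0

# Third terminal: decrypt and judge convergence from the loss value.
residual = priv_key.decrypt(enc_residual)
loss, gradient = residual ** 2, priv_key.decrypt(enc_gradient)
print(f"loss={loss:.6f} gradient={gradient:.6f}")
```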
Further, the step in which, after the third terminal decrypts the encrypted loss value and gradient value, whether the model to be trained has converged is judged according to the decrypted loss value comprises:
the third terminal receives the encrypted loss value sent by the first terminal, and obtains the history loss value previously sent by the first terminal;
the encrypted loss value and the history loss value are decrypted according to a pre-stored private key, and whether the model to be trained has converged is judged according to the decrypted loss value and history loss value.
Further, the step of judging, according to the decrypted loss value and history loss value, whether the model to be trained has converged comprises:
computing the difference between the decrypted loss value and the history loss value, and judging whether the difference is less than or equal to a preset threshold;
if the difference is less than or equal to the preset threshold, determining that the model to be trained has converged, and otherwise determining that the model to be trained has not converged.
Further, after the step in which, following decryption of the encrypted loss value and gradient value by the third terminal, whether the model to be trained has converged is judged according to the decrypted loss value, the method further comprises:
if the model to be trained has not converged, the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the first output and returns it to the first terminal;
the first terminal back-propagates, according to the decrypted gradient of the objective function with respect to the first output returned by the third terminal, to adjust the local gradients of the first neural network;
the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the second output and returns it to the second terminal;
the second terminal back-propagates, according to the decrypted gradient of the objective function with respect to the second output returned by the third terminal, to adjust the local gradients of the second neural network.
Further, the step in which the first terminal back-propagates, according to the decrypted gradient of the objective function with respect to the first output returned by the third terminal, to adjust the local gradients of the first neural network comprises:
the first terminal applies polynomial fitting to the gradient function of the model to be trained;
the first terminal receives the gradient of the objective function of the model to be trained with respect to the first output returned after decryption by the third terminal;
according to the polynomially fitted gradient function and the gradient of the objective function with respect to the first output, back-propagation is performed to adjust the local gradients of the first neural network.
Further, the step of homomorphically encrypting the first output of the first neural network after the iteration comprises:
receiving the public key sent by the third terminal, and storing the public key in a preset area;
homomorphically encrypting the first output of the first neural network after the iteration according to the public key in the preset area.
Further, the neural-network-based federated modeling method further comprises:
when a configuration instruction for initial weights is detected, counting the number of synapses in the first neural network, and invoking a preset random number generator to generate a group of random numbers corresponding to the synapse count;
configuring the initial weight of each synapse in the first neural network according to the generated group of random numbers.
Further, the step of configuring the initial weight of each synapse in the first neural network according to the generated group of random numbers comprises:
following the magnitude order of the generated group of random numbers, successively selecting a random number from the group as an initial weight and assigning it to a synapse in the first neural network, wherein each synapse is configured with one initial weight.
In addition, to achieve the above object, the present invention also provides a neural-network-based federated modeling device, comprising a memory, a processor, and a neural-network-based federated modeling program stored on the memory and runnable on the processor, wherein the program, when executed by the processor, implements the steps of the neural-network-based federated modeling method described above.
The present invention also provides a readable storage medium on which a neural-network-based federated modeling program is stored, wherein the program, when executed by a processor, implements the steps of the neural-network-based federated modeling method described above.
The present invention provides a neural-network-based federated modeling method, device, and readable storage medium. One party's labeled sample data are input into one neural network of the model to be trained, and the other party's sample data are input into another neural network of the model. When both parties' sample data have passed through their respective neural networks and reached the last layer, the outputs of both networks are homomorphically encrypted, and one party transmits its encrypted output to the other party, which combines the two encrypted outputs to compute the encrypted loss value and gradient value and transmits them to a third party. The third party decrypts the encrypted loss value and gradient value, and whether the model to be trained has converged is judged according to the decrypted loss value; if the model has converged, training ends and the trained model is obtained. Because the data that both parties need to transmit are encrypted by homomorphic encryption and joint training can proceed on the encrypted form, the privacy of each party's sample data is effectively improved; at the same time, jointly training the parties' multi-layer neural networks makes effective use of each party's sample data and improves its utilization.
Detailed description of the invention
Fig. 1 is a schematic structural diagram of the device of the hardware operating environment involved in embodiments of the present invention;
Fig. 2 is a flow diagram of a first embodiment of the neural-network-based federated modeling method of the present invention;
Fig. 3 is a schematic diagram of jointly training the model to be trained with both parties' sample data in the first embodiment of the present invention;
Fig. 4 is a flow diagram of a second embodiment of the neural-network-based federated modeling method of the present invention.
The realization of the objects, functional characteristics, and advantages of the present invention are further described below with reference to the accompanying drawings and in connection with the embodiments.
Specific embodiment
It should be understood that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
As shown in Fig. 1, Fig. 1 is a schematic structural diagram of the device of the hardware operating environment involved in embodiments of the present invention.
The neural-network-based federated modeling device of the embodiments of the present invention may be a fixed terminal device such as a PC, or a portable terminal device with a display function such as a smartphone, a tablet computer, or a laptop computer.
As shown in Fig. 1, the neural-network-based federated modeling device may include: a processor 1001 (e.g., a CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 realizes connection and communication among these components. The user interface 1003 may include a display and an input unit such as a keyboard, and may optionally include standard wired and wireless interfaces. The network interface 1004 may optionally include standard wired and wireless interfaces (such as a Wi-Fi interface). The memory 1005 may be a high-speed RAM memory, or a stable non-volatile memory such as a magnetic disk memory, and may optionally be a storage device independent of the processor 1001.
Those skilled in the art will understand that the structure shown in Fig. 1 does not limit the neural-network-based federated modeling device, which may include more or fewer components than illustrated, combine certain components, or arrange components differently.
As shown in Fig. 1, the memory 1005, as a kind of computer storage medium, may include an operating system, a network communication module, a user interface module, and a neural-network-based federated modeling program.
In the neural-network-based federated modeling device shown in Fig. 1, the network interface 1004 is mainly used to connect to a backend server and exchange data with it; the user interface 1003 is mainly used to connect to a client (user terminal) and exchange data with it; and the processor 1001 may be used to invoke the neural-network-based federated modeling program stored in the memory 1005 and execute the following steps:
a first terminal inputs labeled first sample data into a first neural network of a model to be trained for iteration, and homomorphically encrypts the first output of the first neural network after the iteration;
the first terminal receives a homomorphically encrypted second output sent by a second terminal, wherein the second terminal inputs second sample data into a second neural network of the model to be trained for iteration, homomorphically encrypts the second output of the second neural network after the iteration, and transmits it to the first terminal;
according to the homomorphically encrypted first output and second output, the first terminal computes an encrypted loss value and gradient value, and transmits the encrypted loss value and gradient value to a third terminal;
after the third terminal decrypts the encrypted loss value and gradient value, whether the model to be trained has converged is judged according to the decrypted loss value; if the model to be trained has converged, training ends and the trained model is obtained.
Further, the processor 1001 may be used to invoke the neural-network-based federated modeling program stored in the memory 1005 and also execute the following steps:
the third terminal receives the encrypted loss value sent by the first terminal, and obtains the history loss value previously sent by the first terminal;
the encrypted loss value and the history loss value are decrypted according to a pre-stored private key, and whether the model to be trained has converged is judged according to the decrypted loss value and history loss value.
Further, the processor 1001 may be used to invoke the neural-network-based federated modeling program stored in the memory 1005 and also execute the following steps:
computing the difference between the decrypted loss value and the history loss value, and judging whether the difference is less than or equal to a preset threshold;
if the difference is less than or equal to the preset threshold, determining that the model to be trained has converged, and otherwise determining that the model to be trained has not converged.
Further, the processor 1001 may be used to invoke the neural-network-based federated modeling program stored in the memory 1005 and also execute the following steps:
if the model to be trained has not converged, the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the first output and returns it to the first terminal;
the first terminal back-propagates, according to the decrypted gradient of the objective function with respect to the first output returned by the third terminal, to adjust the local gradients of the first neural network;
the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the second output and returns it to the second terminal;
the second terminal back-propagates, according to the decrypted gradient of the objective function with respect to the second output returned by the third terminal, to adjust the local gradients of the second neural network.
Further, the processor 1001 may be used to invoke the neural-network-based federated modeling program stored in the memory 1005 and also execute the following steps:
the first terminal applies polynomial fitting to the gradient function of the model to be trained;
the first terminal receives the gradient of the objective function of the model to be trained with respect to the first output returned after decryption by the third terminal;
according to the polynomially fitted gradient function and the gradient of the objective function with respect to the first output, back-propagation is performed to adjust the local gradients of the first neural network.
Further, the processor 1001 may be used to invoke the neural-network-based federated modeling program stored in the memory 1005 and also execute the following steps:
receiving the public key sent by the third terminal, and storing the public key in a preset area;
homomorphically encrypting the first output of the first neural network after the iteration according to the public key in the preset area.
Further, the processor 1001 may be used to invoke the neural-network-based federated modeling program stored in the memory 1005 and also execute the following steps:
when a configuration instruction for initial weights is detected, counting the number of synapses in the first neural network, and invoking a preset random number generator to generate a group of random numbers corresponding to the synapse count;
configuring the initial weight of each synapse in the first neural network according to the generated group of random numbers.
Further, the processor 1001 may be used to invoke the neural-network-based federated modeling program stored in the memory 1005 and also execute the following step:
following the magnitude order of the generated group of random numbers, successively selecting a random number from the group as an initial weight and assigning it to a synapse in the first neural network, wherein each synapse is configured with one initial weight.
The specific embodiments of the neural-network-based federated modeling device of the present invention are substantially the same as the embodiments of the neural-network-based federated modeling method below, and are not repeated here.
Referring to Fig. 2, Fig. 2 is a flow diagram of the first embodiment of the neural-network-based federated modeling method of the present invention.
Step S101: the first terminal inputs the labeled first sample data into the first neural network of the model to be trained for iteration, and homomorphically encrypts the first output of the first neural network after the iteration.
In this embodiment, the model to be trained includes at least two machine learning models, at least one of which is a neural network model. The following description takes, as an example, a model to be trained composed of two neural networks and one association neuron. As shown in Fig. 3, the two neural networks are a first neural network, deployed on the first terminal, and a second neural network, deployed on the second terminal. It should be noted that the network parameters of the first and second neural networks can be configured by those skilled in the art based on actual conditions, and this embodiment does not specifically limit them. The network parameters include, but are not limited to, the number of nodes in each layer, the number of hidden layers, the initial weight of each synapse, the learning rate, the momentum parameter, the allowable error, the number of iterations, and the activation function.
In this embodiment, after the developers determine the model to be trained, the first neural network of the model is deployed on the first terminal, the second neural network is deployed on the second terminal, and the public key required by the first and second terminals is stored on the third terminal. The labeled first sample data are stored locally on the first terminal, and the unlabeled second sample data are stored locally on the second terminal. During training of the model, the first terminal inputs the locally stored labeled first sample data into the first neural network, and homomorphically encrypts the first output of the first neural network after the iteration, i.e., the values of the neurons in the last layer of the first neural network and the parameter values of those last-layer neurons required for the loss-value and gradient-value computations in the gradient function and loss function of the model to be trained. That is, when the first sample data have been processed by the first neural network and reached its last layer, the first output of the first neural network is homomorphically encrypted with the public key obtained from the third terminal. Homomorphic encryption has the property that processing homomorphically encrypted data yields an output which, when decrypted, is the same as the output obtained by processing the unencrypted original data in the same way.
Specifically, the first terminal receives the public key sent by the third terminal and stores it in a preset area; then, after the labeled first sample data are input into the first neural network, the first output of the first neural network after the iteration is homomorphically encrypted according to the public key in the preset area.
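A minimal sketch of this key handling, again assuming the python-paillier package "phe"; the layer shapes, data, and the dictionary standing in for the "preset area" are illustrative assumptions.

```python
# Sketch of step S101's key handling, assuming the "phe" package.
import numpy as np
from phe import paillier

# Third terminal: generates the key pair; only the public key is sent out.
pub_key, priv_key = paillier.generate_paillier_keypair(n_length=1024)

# First terminal: store the received public key in its "preset area",
# run the first network, and encrypt the last-layer output element by
# element (Paillier encrypts scalars, so a vector is encrypted per entry).
preset_area = {"public_key": pub_key}
rng = np.random.default_rng(1)
x, w = rng.normal(size=4), rng.uniform(-0.5, 0.5, size=(4, 3))
first_output = np.tanh(x @ w)
enc_first_output = [preset_area["public_key"].encrypt(float(a))
                    for a in first_output]
```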
Step S102: receive the homomorphically encrypted second output sent by the second terminal, wherein the second terminal inputs the second sample data into the second neural network of the model to be trained for iteration, homomorphically encrypts the second output of the second neural network after the iteration, and transmits it to the first terminal.
In this embodiment, during training of the model, the second terminal inputs the locally stored second sample data into the second neural network of the model to be trained, homomorphically encrypts the second output of the second neural network after the iteration, i.e., the values of the neurons in the last layer of the second neural network and the parameter values of those last-layer neurons required for the loss-value and gradient-value computations in the gradient function and loss function of the model to be trained, and transmits it to the first terminal. That is, when the second sample data have been processed by the second neural network and reached its last layer, the second output of the second neural network is homomorphically encrypted with the public key obtained from the third terminal and transmitted to the first terminal, which receives the homomorphically encrypted second output sent by the second terminal.
Step S103: according to the homomorphically encrypted first output and second output, compute the encrypted loss value and gradient value, and transmit the encrypted loss value and gradient value to the third terminal.
In this embodiment, the first terminal computes the encrypted loss value and gradient value according to the homomorphically encrypted first and second outputs, and transmits the encrypted loss value and gradient value to the third terminal. That is, it combines the first output of the first neural network and the second output of the second neural network (in each case, the last-layer neuron values and the parameter values required for the loss-value and gradient-value computations in the gradient function and loss function of the model to be trained), and computes the loss value and gradient value in ciphertext form, so that the computed loss value and gradient value remain encrypted.
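As an illustration of the ciphertext-side arithmetic in this step: Paillier is additively homomorphic, so only ciphertext+ciphertext and ciphertext*plaintext are available. The sketch below therefore uses a squared-error surrogate whose gradient 2*(u+v-y) is linear in the outputs and hence computable under encryption, while the loss itself is squared by the third terminal only after decryption; the outputs and labels are illustrative values.

```python
# Sketch of step S103's ciphertext arithmetic, assuming the "phe" package.
from phe import paillier

pub_key, priv_key = paillier.generate_paillier_keypair(n_length=1024)
enc_u = [pub_key.encrypt(v) for v in (0.3, -0.2, 0.8)]   # first output
enc_v = [pub_key.encrypt(v) for v in (0.4, 0.1, -0.5)]   # second output
labels = [1.0, 0.0, 1.0]

enc_residuals = [eu + ev - y                              # all linear ops
                 for eu, ev, y in zip(enc_u, enc_v, labels)]
enc_gradients = [r * 2.0 for r in enc_residuals]          # d/du (u+v-y)^2

# enc_residuals and enc_gradients stay encrypted until the third terminal
# decrypts them; squaring a decrypted residual yields the loss term.
```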
In a specific implementation, to further improve the security of both parties' data, the first terminal and the second terminal periodically obtain a new public key from the third terminal during training of the model, to update the locally stored public key. Specifically, a timer is set on the third terminal; when model training starts, the timer begins timing, and whenever it reaches a preset duration, the third terminal generates a new public/private key pair, issues the public key to the first and second terminals, and restarts the timer, whereupon the first and second terminals update their locally stored public keys. It should be noted that the preset duration can be configured by those skilled in the art based on actual conditions, and this embodiment does not specifically limit it.
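A sketch of this timed key rotation follows; the class name, the callback mechanism, and the 10-minute interval are illustrative placeholders for the preset duration, not details from the patent.

```python
# Sketch of the third terminal's timed key rotation, assuming "phe".
import threading
from phe import paillier

class KeyAuthority:
    """Third terminal: regenerates the key pair each preset duration."""

    def __init__(self, interval_s=600.0):
        self.interval_s = interval_s
        self.subscribers = []          # first/second terminals' callbacks
        self._rotate()

    def _rotate(self):
        # New key pair; only the public key is pushed to subscribers.
        self.pub_key, self._priv_key = paillier.generate_paillier_keypair(
            n_length=1024)
        for push_new_key in self.subscribers:
            push_new_key(self.pub_key)
        timer = threading.Timer(self.interval_s, self._rotate)
        timer.daemon = True            # restart the timer
        timer.start()
```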
Step S104: after the third terminal decrypts the encrypted loss value and gradient value, judge according to the decrypted loss value whether the model to be trained has converged; if the model to be trained has converged, end training and obtain the trained model.
In this embodiment, the third terminal receives the encrypted loss value and gradient value sent by the first terminal, decrypts them, and judges according to the decrypted loss value whether the model to be trained has converged. That is, it obtains the public key used to encrypt the loss value and gradient value and the private key corresponding to that public key, decrypts the encrypted loss value and gradient value with the private key, and judges convergence from the decrypted loss value. Specifically, upon receiving the encrypted loss value sent by the first terminal, the third terminal obtains the history loss value previously sent by the first terminal, decrypts the encrypted loss value and the history loss value with the corresponding private key, and judges convergence from the decrypted loss value and history loss value: it computes the difference between the decrypted loss value and the history loss value and judges whether the difference is less than or equal to a preset threshold; if so, the model to be trained is determined to have converged, and otherwise it is determined not to have converged. It should be noted that the preset threshold can be configured by those skilled in the art based on actual conditions, and this embodiment does not specifically limit it.
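A sketch of this convergence test follows; the threshold value, the example loss values, and the use of an absolute difference are illustrative choices.

```python
# Sketch of step S104's convergence test, assuming the "phe" package.
from phe import paillier

pub_key, priv_key = paillier.generate_paillier_keypair(n_length=1024)
enc_loss = pub_key.encrypt(0.2315)          # sent by the first terminal
enc_history_loss = pub_key.encrypt(0.2311)  # previously sent loss value

PRESET_THRESHOLD = 1e-3
difference = abs(priv_key.decrypt(enc_loss)
                 - priv_key.decrypt(enc_history_loss))
converged = difference <= PRESET_THRESHOLD  # True -> training ends
```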
Further, if the model to be trained has not converged, the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the first output and returns it to the first terminal, and the first terminal back-propagates this decrypted gradient to adjust the local gradients of the first neural network. The third terminal likewise decrypts the gradient of the objective function of the model to be trained with respect to the second output and returns it to the second terminal, and the second terminal back-propagates this decrypted gradient to adjust the local gradients of the second neural network.
Specifically, when the gradient function or loss function of the model to be trained cannot be computed under encryption, the first terminal applies polynomial fitting to the gradient function of the model to be trained, receives the gradient of the objective function with respect to the first output returned after decryption by the third terminal, and then back-propagates according to the polynomially fitted gradient function and the returned gradient to adjust the local gradients of the first neural network. The second terminal adjusts the local gradients of the second neural network in the same way. Applying polynomial fitting to the gradient function solves the problem that a nonlinear gradient function or loss function cannot be homomorphically encrypted. It should be noted that approximating the activation function by polynomial fitting makes a gradient function that cannot be computed under encryption amenable to homomorphic encryption; in specific implementations, other ways of processing the gradient function may also be adopted so that the nonlinear gradient function can be homomorphically encrypted.
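The sketch below shows only the fitting step itself: the nonlinear sigmoid derivative is approximated by a low-degree polynomial, reducing the gradient computation to additions and multiplications by plaintext coefficients. The activation function, the degree, and the fitting interval are illustrative assumptions; note that under a purely additive scheme such as Paillier only the linear part of such a polynomial can be evaluated on ciphertexts.

```python
# Sketch of the polynomial-fitting step (fitting only).
import numpy as np

z = np.linspace(-4.0, 4.0, 200)
sigmoid = 1.0 / (1.0 + np.exp(-z))
sigmoid_grad = sigmoid * (1.0 - sigmoid)      # the nonlinear target

coeffs = np.polyfit(z, sigmoid_grad, deg=3)   # least-squares fit
poly_grad = np.poly1d(coeffs)

max_err = np.max(np.abs(poly_grad(z) - sigmoid_grad))
print(f"max fitting error on [-4, 4]: {max_err:.4f}")
```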
In this embodiment, one party's labeled sample data are input into one neural network of the model to be trained, and the other party's sample data are input into another neural network of the model; then, when both parties' sample data have passed through their respective neural networks and reached the last layer, the outputs of both networks are homomorphically encrypted, and one party transmits its homomorphically encrypted output to the other party. The other party combines the two homomorphically encrypted outputs, computes the encrypted loss value and gradient value, and transmits them to a third party; the third party decrypts the encrypted loss value and gradient value, and whether the model to be trained has converged is judged according to the decrypted loss value. If the model to be trained has converged, training ends and the trained model is obtained. Because the data that both parties need to transmit are encrypted by homomorphic encryption and joint training can proceed on the encrypted form, the privacy of each party's sample data is effectively improved; at the same time, jointly training the parties' multi-layer neural networks makes effective use of each party's sample data and improves the utilization of that data.
Further, referring to Fig. 4, based on the first embodiment above, a second embodiment of the neural-network-based federated modeling method of the present invention is proposed. It differs from the previous embodiment in that the neural-network-based federated modeling method further includes:
Step S105: when a configuration instruction for initial weights is detected, count the number of synapses in the first neural network, and invoke a preset random number generator to generate a group of random numbers corresponding to the synapse count.
In this embodiment, before the model to be trained is trained, the initial weight of each synapse in the model needs to be configured. When a configuration instruction for initial weights is detected, the first terminal counts the number of synapses in the first neural network and invokes a preset random number generator to generate a group of random numbers corresponding to that synapse count; meanwhile, the second terminal counts the number of synapses in the second neural network and invokes a preset random number generator to generate another group of random numbers corresponding to that synapse count. It should be noted that the value range of the random numbers can be configured by those skilled in the art based on actual conditions, and this embodiment does not specifically limit it; preferably, the value range is -0.5 to +0.5.
Step S106: configure the initial weight of each synapse in the first neural network according to the generated group of random numbers.
In this embodiment, the first terminal configures the initial weight of each synapse in the first neural network according to the generated group of random numbers: following the magnitude order of the generated random numbers, it successively selects a random number from the group as an initial weight and assigns it to a synapse in the first neural network. The second terminal configures the initial weights of the synapses in the second neural network according to the other generated group of random numbers in the same way, wherein each synapse is configured with one initial weight.
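A sketch of steps S105 and S106 follows: count the synapses of a small example network, draw one random number per synapse in the preferred range [-0.5, +0.5], order them by magnitude, and assign one initial weight per synapse. The layer sizes are illustrative assumptions.

```python
# Sketch of steps S105-S106: per-synapse random initial weights.
import numpy as np

layer_sizes = [4, 8, 3]                    # example first neural network
pairs = list(zip(layer_sizes, layer_sizes[1:]))
synapse_count = sum(a * b for a, b in pairs)

rng = np.random.default_rng()              # the preset generator
group = np.sort(rng.uniform(-0.5, 0.5, synapse_count))  # magnitude order

weights, start = [], 0
for a, b in pairs:                         # one initial weight per synapse
    weights.append(group[start:start + a * b].reshape(a, b))
    start += a * b
```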
In this embodiment, a random number generator is used to configure random initial weights for the synapses of the first and second neural networks of the model to be trained, preventing the synapses from having identical initial weights, which would keep the weights of the synapses equal throughout training; this effectively improves the accuracy of the trained model.
In addition, an embodiment of the present invention also proposes a readable storage medium on which a neural-network-based federated modeling program is stored; when the neural-network-based federated modeling program is executed by a processor, the following steps are executed:
a first terminal inputs labeled first sample data into a first neural network of a model to be trained for iteration, and homomorphically encrypts the first output of the first neural network after the iteration;
the first terminal receives a homomorphically encrypted second output sent by a second terminal, wherein the second terminal inputs second sample data into a second neural network of the model to be trained for iteration, homomorphically encrypts the second output of the second neural network after the iteration, and transmits it to the first terminal;
according to the homomorphically encrypted first output and second output, the first terminal computes an encrypted loss value and gradient value, and transmits the encrypted loss value and gradient value to a third terminal;
after the third terminal decrypts the encrypted loss value and gradient value, whether the model to be trained has converged is judged according to the decrypted loss value; if the model to be trained has converged, training ends and the trained model is obtained.
Further, when the neural-network-based federated modeling program is executed by the processor, the following steps are also executed:
the step in which, after the third terminal decrypts the encrypted loss value and gradient value, whether the model to be trained has converged is judged according to the decrypted loss value comprises:
the third terminal, upon receiving the encrypted loss value sent by the first terminal, obtains the history loss value previously sent by the first terminal;
the encrypted loss value and the history loss value are decrypted according to a pre-stored private key, and whether the model to be trained has converged is judged according to the decrypted loss value and history loss value.
Further, when the neural-network-based federated modeling program is executed by the processor, the following steps are also executed:
the step of judging, according to the decrypted loss value and history loss value, whether the model to be trained has converged comprises:
computing the difference between the decrypted loss value and the history loss value, and judging whether the difference is less than or equal to a preset threshold;
if the difference is less than or equal to the preset threshold, determining that the model to be trained has converged, and otherwise determining that the model to be trained has not converged.
Further, when the neural-network-based federated modeling program is executed by the processor, the following steps are also executed:
if the model to be trained has not converged, the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the first output and returns it to the first terminal;
the first terminal back-propagates, according to the decrypted gradient of the objective function with respect to the first output returned by the third terminal, to adjust the local gradients of the first neural network;
the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the second output and returns it to the second terminal;
the second terminal back-propagates, according to the decrypted gradient of the objective function with respect to the second output returned by the third terminal, to adjust the local gradients of the second neural network.
Further, when the neural-network-based federated modeling program is executed by the processor, the following steps are also executed:
the first terminal applies polynomial fitting to the gradient function of the model to be trained;
the first terminal receives the gradient of the objective function of the model to be trained with respect to the first output returned after decryption by the third terminal;
according to the polynomially fitted gradient function and the gradient of the objective function with respect to the first output, back-propagation is performed to adjust the local gradients of the first neural network.
Further, when the neural-network-based federated modeling program is executed by the processor, the following steps are also executed:
receiving the public key sent by the third terminal, and storing the public key in a preset area;
homomorphically encrypting the first output of the first neural network after the iteration according to the public key in the preset area.
Further, when the neural-network-based federated modeling program is executed by the processor, the following steps are also executed:
when a configuration instruction for initial weights is detected, counting the number of synapses in the first neural network, and invoking a preset random number generator to generate a group of random numbers corresponding to the synapse count;
configuring the initial weight of each synapse in the first neural network according to the generated group of random numbers.
Further, when the neural-network-based federated modeling program is executed by the processor, the following step is also executed:
following the magnitude order of the generated group of random numbers, successively selecting a random number from the group as an initial weight and assigning it to a synapse in the first neural network, wherein each synapse is configured with one initial weight.
The specific embodiments of the readable storage medium of the present invention are substantially the same as the embodiments of the neural-network-based federated modeling method above, and are not repeated here.
It should be noted that, herein, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or system that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or system. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or system that includes that element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the merits of the embodiments.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware, though in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium as described above (such as ROM/RAM, a magnetic disk, or an optical disc), including several instructions to cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit its patent scope; any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (10)

1. A neural-network-based federated modeling method, characterized in that the neural-network-based federated modeling method comprises the following steps:
a first terminal inputs labeled first sample data into a first neural network of a model to be trained for iteration, and homomorphically encrypts the first output of the first neural network after the iteration;
the first terminal receives a homomorphically encrypted second output sent by a second terminal, wherein the second terminal inputs second sample data into a second neural network of the model to be trained for iteration, homomorphically encrypts the second output of the second neural network after the iteration, and transmits it to the first terminal;
according to the homomorphically encrypted first output and second output, the first terminal computes an encrypted loss value and gradient value, and transmits the encrypted loss value and gradient value to a third terminal;
after the third terminal decrypts the encrypted loss value and gradient value, whether the model to be trained has converged is judged according to the decrypted loss value; if the model to be trained has converged, training is ended and the trained model is obtained.
2. The neural-network-based federated modeling method of claim 1, characterized in that the step in which, after the third terminal decrypts the encrypted loss value and gradient value, whether the model to be trained has converged is judged according to the decrypted loss value comprises:
the third terminal, upon receiving the encrypted loss value sent by the first terminal, obtains the history loss value previously sent by the first terminal;
the encrypted loss value and the history loss value are decrypted according to a pre-stored private key, and whether the model to be trained has converged is judged according to the decrypted loss value and history loss value.
3. The neural-network-based federated modeling method of claim 1, characterized in that the step of judging, according to the decrypted loss value and history loss value, whether the model to be trained has converged comprises:
computing the difference between the decrypted loss value and the history loss value, and judging whether the difference is less than or equal to a preset threshold;
if the difference is less than or equal to the preset threshold, determining that the model to be trained has converged, and otherwise determining that the model to be trained has not converged.
4. The neural-network-based federated modeling method of claim 1, characterized in that, after the step in which, following decryption of the encrypted loss value and gradient value by the third terminal, whether the model to be trained has converged is judged according to the decrypted loss value, the method further comprises:
if the model to be trained has not converged, the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the first output and returns it to the first terminal;
the first terminal back-propagates, according to the decrypted gradient of the objective function with respect to the first output returned by the third terminal, to adjust the local gradients of the first neural network;
the third terminal decrypts the gradient of the objective function of the model to be trained with respect to the second output and returns it to the second terminal;
the second terminal back-propagates, according to the decrypted gradient of the objective function with respect to the second output returned by the third terminal, to adjust the local gradients of the second neural network.
5. The neural-network-based federated modeling method of claim 4, characterized in that the step in which the first terminal back-propagates, according to the decrypted gradient of the objective function with respect to the first output returned by the third terminal, to adjust the local gradients of the first neural network comprises:
the first terminal applies polynomial fitting to the gradient function of the model to be trained;
the first terminal receives the gradient of the objective function of the model to be trained with respect to the first output returned after decryption by the third terminal;
according to the polynomially fitted gradient function and the gradient of the objective function with respect to the first output, back-propagation is performed to adjust the local gradients of the first neural network.
6. The neural-network-based federated modeling method of any one of claims 1 to 5, characterized in that the step of homomorphically encrypting the first output of the first neural network after the iteration comprises:
receiving the public key sent by the third terminal, and storing the public key in a preset area;
homomorphically encrypting the first output of the first neural network after the iteration according to the public key in the preset area.
7. The neural-network-based federated modeling method of any one of claims 1 to 5, characterized in that the neural-network-based federated modeling method further comprises:
when a configuration instruction for initial weights is detected, counting the number of synapses in the first neural network, and invoking a preset random number generator to generate a group of random numbers corresponding to the synapse count;
configuring the initial weight of each synapse in the first neural network according to the generated group of random numbers.
8. The neural-network-based federated modeling method of claim 7, characterized in that the step of configuring the initial weight of each synapse in the first neural network according to the generated group of random numbers comprises:
following the magnitude order of the generated group of random numbers, successively selecting a random number from the group as an initial weight and assigning it to a synapse in the first neural network, wherein each synapse is configured with one initial weight.
9. A neural-network-based federated modeling device, characterized in that the neural-network-based federated modeling device comprises: a memory, a processor, and a neural-network-based federated modeling program stored on the memory and runnable on the processor, wherein the neural-network-based federated modeling program, when executed by the processor, implements the steps of the neural-network-based federated modeling method of any one of claims 1 to 8.
10. A readable storage medium, characterized in that a neural-network-based federated modeling program is stored on the readable storage medium, and the neural-network-based federated modeling program, when executed by a processor, implements the steps of the neural-network-based federated modeling method of any one of claims 1 to 8.
CN201810913868.9A 2018-08-10 2018-08-10 Federal modeling method and device based on neural network and readable storage medium Active CN109325584B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810913868.9A CN109325584B (en) 2018-08-10 2018-08-10 Federal modeling method and device based on neural network and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810913868.9A CN109325584B (en) 2018-08-10 2018-08-10 Federal modeling method and device based on neural network and readable storage medium

Publications (2)

Publication Number Publication Date
CN109325584A true CN109325584A (en) 2019-02-12
CN109325584B CN109325584B (en) 2021-06-25

Family

ID=65263630

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810913868.9A Active CN109325584B (en) 2018-08-10 2018-08-10 Federal modeling method and device based on neural network and readable storage medium

Country Status (1)

Country Link
CN (1) CN109325584B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160350648A1 (en) * 2014-11-07 2016-12-01 Microsoft Technology Licensing, Llc. Neural networks for encrypted data
US20180173572A1 (en) * 2016-02-05 2018-06-21 Sas Institute Inc. Automated transfer of neural network definitions among federated areas
US20170330054A1 (en) * 2016-05-10 2017-11-16 Baidu Online Network Technology (Beijing) Co., Ltd. Method And Apparatus Of Establishing Image Search Relevance Prediction Model, And Image Search Method And Apparatus
CN107688493A * 2016-08-05 2018-02-13 阿里巴巴集团控股有限公司 Method, apparatus and system for training a deep neural network
CN107871160A * 2016-09-26 2018-04-03 谷歌公司 Communication efficient federated learning
CN108229646A * 2017-08-08 2018-06-29 北京市商汤科技开发有限公司 Neural network model compression method, device, storage medium and electronic equipment

Cited By (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020029585A1 (en) * 2018-08-10 2020-02-13 深圳前海微众银行股份有限公司 Neural network federation modeling method and device employing transfer learning, and storage medium
CN109886417B (en) * 2019-03-01 2024-05-03 深圳前海微众银行股份有限公司 Model parameter training method, device, equipment and medium based on federal learning
EP3893170A4 (en) * 2019-03-01 2022-08-31 Webank Co.,Ltd Federated learning-based model parameter training method, apparatus and device, and medium
CN109886417A * 2019-03-01 2019-06-14 深圳前海微众银行股份有限公司 Model parameter training method, device, equipment and medium based on federal learning
CN110162995A * 2019-04-22 2019-08-23 阿里巴巴集团控股有限公司 Method and device for evaluating data contribution degree
CN110162995B (en) * 2019-04-22 2023-01-10 创新先进技术有限公司 Method and device for evaluating data contribution degree
CN110084063A * 2019-04-23 2019-08-02 中国科学技术大学 Gradient descent algorithm for protecting private data
CN110084063B (en) * 2019-04-23 2022-07-15 中国科学技术大学 Gradient descent calculation method for protecting private data
CN112085206A (en) * 2019-06-13 2020-12-15 北京百度网讯科技有限公司 Joint logistic regression modeling method and device and terminal
CN112085206B (en) * 2019-06-13 2024-04-09 北京百度网讯科技有限公司 Combined logistic regression modeling method, device and terminal
CN110263919A * 2019-06-20 2019-09-20 福州数据技术研究院有限公司 Back-propagation neural network training method based on medical and health data security
CN112149174B (en) * 2019-06-28 2024-03-12 北京百度网讯科技有限公司 Model training method, device, equipment and medium
CN112149174A (en) * 2019-06-28 2020-12-29 北京百度网讯科技有限公司 Model training method, device, equipment and medium
CN112182635B (en) * 2019-07-03 2024-02-23 北京百度网讯科技有限公司 Method, device, equipment and medium for realizing joint modeling
CN112182635A (en) * 2019-07-03 2021-01-05 北京百度网讯科技有限公司 Method, device, equipment and medium for realizing joint modeling
CN110674941B (en) * 2019-09-25 2023-04-18 南开大学 Data encryption transmission method and system based on neural network
CN110674941A (en) * 2019-09-25 2020-01-10 南开大学 Data encryption transmission method and system based on neural network
WO2021082633A1 (en) * 2019-10-29 2021-05-06 支付宝(杭州)信息技术有限公司 Multi-party joint neural network training method and apparatus for achieving security defense
WO2021082647A1 (en) * 2019-10-29 2021-05-06 华为技术有限公司 Federated learning system, training result aggregation method, and device
CN110852430A (en) * 2019-10-29 2020-02-28 清华大学 Neural network encryption method and device for nonvolatile computing system
CN110807528A (en) * 2019-10-30 2020-02-18 深圳前海微众银行股份有限公司 Feature correlation calculation method, device and computer-readable storage medium
CN111222628B (en) * 2019-11-20 2023-09-26 深圳前海微众银行股份有限公司 Method, device, system and readable storage medium for optimizing training of recurrent neural network
CN111222628A (en) * 2019-11-20 2020-06-02 深圳前海微众银行股份有限公司 Method, device and system for optimizing recurrent neural network training and readable storage medium
WO2021114585A1 (en) * 2019-12-13 2021-06-17 支付宝(杭州)信息技术有限公司 Model training method and apparatus, and electronic device
CN110955907A (en) * 2019-12-13 2020-04-03 支付宝(杭州)信息技术有限公司 Model training method based on federal learning
CN111144576A (en) * 2019-12-13 2020-05-12 支付宝(杭州)信息技术有限公司 Model training method and device and electronic equipment
CN111125735B (en) * 2019-12-20 2021-11-02 支付宝(杭州)信息技术有限公司 Method and system for model training based on private data
CN111125735A (en) * 2019-12-20 2020-05-08 支付宝(杭州)信息技术有限公司 Method and system for model training based on private data
WO2021120855A1 (en) * 2019-12-20 2021-06-24 支付宝(杭州)信息技术有限公司 Method and system for carrying out model training on the basis of privacy data
WO2021120888A1 (en) * 2019-12-20 2021-06-24 支付宝(杭州)信息技术有限公司 Method and system for performing model training on the basis of private data
CN111210003A (en) * 2019-12-30 2020-05-29 深圳前海微众银行股份有限公司 Longitudinal federated learning system optimization method, device, equipment and readable storage medium
CN111291897B (en) * 2020-02-10 2024-10-11 深圳前海微众银行股份有限公司 Semi-supervision-based transverse federal learning optimization method, equipment and storage medium
CN111291897A (en) * 2020-02-10 2020-06-16 深圳前海微众银行股份有限公司 Semi-supervision-based horizontal federal learning optimization method, equipment and storage medium
CN111260061A (en) * 2020-03-09 2020-06-09 厦门大学 Differential noise adding method and system in federated learning gradient exchange
CN111260061B (en) * 2020-03-09 2022-07-19 厦门大学 Differential noise adding method and system in federated learning gradient exchange
WO2021184347A1 (en) * 2020-03-20 2021-09-23 云图技术有限公司 Data processing method and apparatus for realizing privacy protection
CN111460478A (en) * 2020-03-30 2020-07-28 西安电子科技大学 Privacy protection method for collaborative deep learning model training
US11764941B2 (en) 2020-04-30 2023-09-19 International Business Machines Corporation Decision tree-based inference on homomorphically-encrypted data without bootstrapping
CN111695675B (en) * 2020-05-14 2024-05-07 平安科技(深圳)有限公司 Federal learning model training method and related equipment
CN111695675A (en) * 2020-05-14 2020-09-22 平安科技(深圳)有限公司 Federal learning model training method and related equipment
CN113723604A (en) * 2020-05-26 2021-11-30 杭州海康威视数字技术股份有限公司 Neural network training method and device, electronic equipment and readable storage medium
CN113723604B (en) * 2020-05-26 2024-03-26 杭州海康威视数字技术股份有限公司 Neural network training method and device, electronic equipment and readable storage medium
CN111915004B (en) * 2020-06-17 2024-08-02 北京迈格威科技有限公司 Neural network training method and device, storage medium and electronic equipment
CN111915004A (en) * 2020-06-17 2020-11-10 北京迈格威科技有限公司 Neural network training method and device, storage medium and electronic equipment
CN112348197A (en) * 2020-07-01 2021-02-09 北京沃东天骏信息技术有限公司 Model generation method and device based on federal learning
CN111800265A (en) * 2020-07-07 2020-10-20 上海大学 Privacy protection-based material reverse design method and system
CN113988254B (en) * 2020-07-27 2023-07-14 腾讯科技(深圳)有限公司 Method and device for determining neural network model for multiple environments
CN113988254A (en) * 2020-07-27 2022-01-28 腾讯科技(深圳)有限公司 Method and device for determining neural network model for multiple environments
CN112016632B (en) * 2020-09-25 2024-04-26 北京百度网讯科技有限公司 Model joint training method, device, equipment and storage medium
CN112016632A (en) * 2020-09-25 2020-12-01 北京百度网讯科技有限公司 Model joint training method, device, equipment and storage medium
CN112565254A (en) * 2020-12-04 2021-03-26 深圳前海微众银行股份有限公司 Data transmission method, device, equipment and computer readable storage medium
CN112565254B (en) * 2020-12-04 2023-03-31 深圳前海微众银行股份有限公司 Data transmission method, device, equipment and computer readable storage medium
CN112613577A (en) * 2020-12-31 2021-04-06 上海商汤智能科技有限公司 Neural network training method and device, computer equipment and storage medium
CN112613577B (en) * 2020-12-31 2024-06-11 上海商汤智能科技有限公司 Neural network training method and device, computer equipment and storage medium
CN113536667B (en) * 2021-06-22 2024-03-01 同盾科技有限公司 Federal model training method, federal model training device, readable storage medium and federal model training device
CN113536667A (en) * 2021-06-22 2021-10-22 同盾科技有限公司 Federal model training method and device, readable storage medium and equipment
CN115169589A (en) * 2022-09-06 2022-10-11 北京瑞莱智慧科技有限公司 Parameter updating method, data processing method and related equipment
CN116151370A (en) * 2023-04-24 2023-05-23 西南石油大学 Model parameter optimization selection system

Also Published As

Publication number Publication date
CN109325584B (en) 2021-06-25

Similar Documents

Publication Publication Date Title
CN109325584A Federated modeling method, device, and readable storage medium based on neural network
CN109165725A Neural network federated modeling method, device, and storage medium based on transfer learning
CN109255444A Federated modeling method, device, and readable storage medium based on transfer learning
US11947680B2 (en) Model parameter training method, terminal, and system based on federation learning, and medium
US10944751B2 (en) Generating cryptographic function parameters from compact source code
EP3449451B1 (en) Implementing logic gate functionality using a blockchain
CN110263936A Horizontal federated learning method, device, equipment and computer storage medium
US11316665B2 (en) Generating cryptographic function parameters based on an observed astronomical event
CN109460966A (en) Contract signing method, apparatus and terminal device based on requesting party's classification
CN102474416B (en) Authentication token with incremental key establishment capability
EP3286747B1 (en) Generating cryptographic function parameters from a puzzle
CN110099069A Smart charging pile data transmission device, processing system and method
CN115409198A (en) Distributed prediction method and system thereof
Qi et al. Scalable decentralized privacy-preserving usage-based insurance for vehicles
CN109167662A Seed generation method and device
CN109543458A Blockchain-based prescription data verification method, device and server
Levshun et al. A technique for design of secure data transfer environment: Application for I2C protocol
CN113298404A Method and device for determining workload of federated learning participant
CN109885790A Method and apparatus for obtaining satisfaction evaluation data
Chitra et al. Agent-based simulations of blockchain protocols illustrated via kadena’s chainweb
Gancher et al. A core calculus for equational proofs of cryptographic protocols
CN112949868B (en) Asynchronous federal learning method and device based on block chain and electronic equipment
CN106874813A Method and smart card for automatically selecting M1 cards
Szymoniak et al. Timed models of security protocols including delays in the network
Szymoniak et al. SAT-based verification of NSPK protocol including delays in the network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant