CN112949866A - Poisson regression model training method and device, electronic equipment and storage medium - Google Patents
- Publication number: CN112949866A (application number CN202110322421.6A)
- Authority: CN (China)
- Prior art keywords: model, parameter, aggregation, participant
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N20/00—Machine learning
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
        - G06F21/60—Protecting data
          - G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
Abstract
The application provides a method and a device for training a Poisson regression model, applied to a federated learning system. The method comprises the following steps: the parameter aggregation device acquires model parameter pairs corresponding to at least two Poisson regression models, the model parameter pairs and the Poisson regression models being in one-to-one correspondence; each model parameter pair includes the model parameters of a first sub-model in a first participant device and the model parameters of a corresponding second sub-model in a second participant device; model parameter aggregation is performed on the model parameters of the first sub-models and the model parameters of the second sub-models in the at least two model parameter pairs, respectively, to obtain corresponding first aggregation model parameters and second aggregation model parameters; and the first aggregation model parameters are sent to the first participant device, and the second aggregation model parameters are sent to each second participant device, so as to update the model parameters of the Poisson regression models. By the method and the device, the number of feature providers in a Poisson regression model under federated learning can be increased, and the prediction capability of the model can be improved.
Description
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular to a method and an apparatus for training a Poisson regression model, an electronic device, and a computer-readable storage medium.
Background
Federated learning is a novel privacy-preserving technology that can effectively combine the data of multiple parties for model training on the premise that the data never leave each party's local environment.
In related vertical Poisson regression model training, owing to the characteristics of Poisson modeling, a single modeling run can currently support only a single label provider and a single feature provider. In practice, however, one label provider may need to perform Poisson regression modeling with multiple feature providers. For example, when an insurance enterprise predicts a user's future number of claims, it may need to combine several feature providers because an insurance product is new or the number of samples of a particular type is small. Training with the related vertical Poisson regression model then easily overfits the resulting model, making the model's predictions inaccurate, for instance a predicted number of claims that is too high or too low.
Disclosure of Invention
The embodiments of the application provide a method and an apparatus for training a Poisson regression model, an electronic device, a computer-readable storage medium, and a computer program product, which can expand the number of feature providers in a Poisson regression model under federated learning, widen the application range of the Poisson regression model, and improve the prediction capability of the model.
The technical scheme of the embodiment of the application is realized as follows:
An embodiment of the application provides a method for training a Poisson regression model, applied to a federated learning system, wherein the system comprises a first participant device holding model training labels, at least two second participant devices, and a parameter aggregation device, and the method comprises the following steps:
the parameter aggregation device acquires at least two model parameter pairs corresponding to Poisson regression models, wherein the model parameter pairs and the Poisson regression models are in one-to-one correspondence;
wherein each of the model parameter pairs comprises: model parameters of a first submodel in the first participant device and model parameters of a corresponding second submodel in the second participant device;
performing model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the at least two model parameter pairs respectively to obtain corresponding first aggregation model parameters and second aggregation model parameters;
and sending the first aggregation model parameters to the first participant device, and sending the second aggregation model parameters to each second participant device, so as to update the model parameters of the Poisson regression models.
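For illustration only, one aggregation round of the steps above can be sketched in Python as follows. The function and variable names are assumptions for this sketch and do not come from the patent; plain unweighted averaging is used (a weighted variant appears later in the scheme), and all second sub-models are assumed to share one dimensionality, as implied by sending one set of second aggregation model parameters to every second participant device.

```python
def aggregation_round(model_parameter_pairs):
    """One round at the parameter aggregation device.

    model_parameter_pairs: list of (first_params, second_params), one
    pair per Poisson regression model; first_params are the first
    sub-model parameters held by the label-side (first participant)
    device, second_params the second sub-model parameters held by the
    corresponding feature-side (second participant) device.
    """
    n = len(model_parameter_pairs)
    dim_first = len(model_parameter_pairs[0][0])
    dim_second = len(model_parameter_pairs[0][1])
    # Aggregate the first sub-model parameters and the second sub-model
    # parameters separately, here by simple elementwise averaging.
    first_agg = [sum(pair[0][j] for pair in model_parameter_pairs) / n
                 for j in range(dim_first)]
    second_agg = [sum(pair[1][j] for pair in model_parameter_pairs) / n
                  for j in range(dim_second)]
    # first_agg would be sent to the first participant device,
    # second_agg to each second participant device.
    return first_agg, second_agg

# Two Poisson regression models, i.e. two model parameter pairs:
pairs = [([1.0, 2.0], [3.0]), ([3.0, 4.0], [5.0])]
first_agg, second_agg = aggregation_round(pairs)
```

After such a round, each participant device replaces its local sub-model parameters with the returned aggregation parameters and continues training.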
The embodiment of the application provides a training device of a Poisson regression model, which is applied to a federated learning system, wherein the system comprises a first participant device with a model training label, at least two second participant devices and a parameter aggregation device, and the device comprises:
the acquisition module is configured to obtain, by the parameter aggregation device, model parameter pairs corresponding to at least two Poisson regression models, wherein the model parameter pairs and the Poisson regression models are in one-to-one correspondence;
wherein each of the model parameter pairs comprises: model parameters of a first submodel in the first participant device and model parameters of a corresponding second submodel in the second participant device;
the aggregation module is used for respectively carrying out model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the at least two model parameter pairs to obtain corresponding first aggregation model parameters and second aggregation model parameters;
and the training module is used for sending the first aggregation model parameters to the first participant equipment and sending the second aggregation model parameters to the second participant equipment so as to update the model parameters of the Poisson regression model.
In the above scheme, the aggregation module is further configured to obtain a first weight of a model parameter of a first sub-model in each of the model parameter pairs and a second weight of a model parameter of a second sub-model in each of the model parameter pairs, respectively;
based on the obtained first weights, carrying out weighted summation on model parameters of at least two first submodels to obtain a first aggregation model parameter;
and carrying out weighted summation on the model parameters of at least two second submodels based on the obtained second weights to obtain second aggregation model parameters.
In the above scheme, the obtaining module is further configured to obtain a first number of training samples provided by the first participant device for model training with each second participant device, a second number of training samples provided by each second participant device, and the total number of training samples used for Poisson regression model training;
determining a first ratio of the first number to the total number, the first ratio serving as the first weight;
determining a second ratio of the second number to the total number, the second ratio serving as the second weight of the model parameters of the corresponding second sub-model.
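The sample-count weighting above can be sketched as a FedAvg-style weighted average; the names are illustrative assumptions, not from the patent:

```python
def sample_count_weights(sample_counts):
    """Weight of each sub-model = its number of training samples
    divided by the total number of samples used for training."""
    total = sum(sample_counts)
    return [count / total for count in sample_counts]

def weighted_aggregate(param_vectors, weights):
    """Elementwise weighted sum of equally sized parameter vectors."""
    dim = len(param_vectors[0])
    return [sum(w * v[j] for w, v in zip(weights, param_vectors))
            for j in range(dim)]

# Two second participant devices providing 600 and 400 training samples:
weights = sample_count_weights([600, 400])
second_agg = weighted_aggregate([[1.0, 0.0], [0.0, 1.0]], weights)
```

Weighting by sample count lets the parties holding more data pull the aggregated parameters proportionally harder, which is the usual motivation for the ratios defined above.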
In the above scheme, when the number of the first participant devices is the same as the number of the second participant devices, the first participant devices and the second participant devices are in one-to-one correspondence; the obtaining module is further configured to obtain, respectively, a third number of training samples provided by each first participant device, a fourth number of training samples provided by the corresponding second participant device, and the total number of training samples used for Poisson regression model training;
determining a third ratio of the third number to the total number, the third ratio serving as the first weight;
determining a fourth ratio of the fourth number to the total number, the fourth ratio serving as the second weight.
In the above scheme, the obtaining module is further configured to cause the creating module to create a key pair, the key pair comprising a public key and a private key;
distributing the public key to the first participant device and the at least two second participant devices;
the public key is used for the first participant device and each second participant device to encrypt the model parameter based on the public key.
In the above scheme, the obtaining module is further configured to receive the encrypted model parameters of the first sub-model sent by the first participant device and the encrypted model parameters of the second sub-models sent by the at least two second participant devices;
constructing at least two encrypted model parameter pairs corresponding to the Poisson regression model based on the encrypted model parameters of the first sub-model and the encrypted model parameters of the second sub-model;
and respectively decrypting the model parameter of the first sub-model and the model parameter of the second sub-model in the encrypted model parameter pair based on the private key to obtain the decrypted model parameter pair.
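The encrypt-upload-decrypt flow above can be illustrated with the toy scheme below. This is an assumption-laden sketch: a real deployment would use an asymmetric (often additively homomorphic) cryptosystem such as Paillier, whereas the additive mask here is symmetric and not secure; it exists only to make the message flow runnable.

```python
import secrets

class ToyMaskScheme:
    """NOT secure: a stand-in for the claim's public/private key pair,
    using a single shared additive mask over a fixed modulus."""
    MOD = 2**61 - 1

    def __init__(self):
        # Plays both roles; a real scheme would publish only a public key.
        self._key = secrets.randbelow(self.MOD)

    def encrypt(self, value):       # done by the participant devices
        return (value + self._key) % self.MOD

    def decrypt(self, ciphertext):  # done by the parameter aggregation device
        return (ciphertext - self._key) % self.MOD

scheme = ToyMaskScheme()
# Model parameters pre-scaled to non-negative integers for the sketch:
first_params, second_params = [105, 23], [7, 310]
enc_pair = ([scheme.encrypt(v) for v in first_params],
            [scheme.encrypt(v) for v in second_params])
# The aggregation device decrypts the received pair with its private key:
dec_pair = ([scheme.decrypt(c) for c in enc_pair[0]],
            [scheme.decrypt(c) for c in enc_pair[1]])
```

The point of the flow is that model parameters travel over the network only in encrypted form and are recovered only at the parameter aggregation device.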
In the foregoing scheme, when the number of the first participant devices is at least two, the aggregation module is further configured to obtain a mapping relationship between the first participant devices and the second participant devices, the mapping relationship indicating the second participant devices that perform Poisson regression model training with each first participant device;
constructing a model parameter group corresponding to each first participant device based on the mapping relationship and the obtained at least two model parameter pairs;
wherein each model parameter group comprises a first number of model parameter pairs, the first number being the number of second participant devices performing Poisson regression model training with that first participant device;
performing parameter aggregation on the model parameter pairs in each model parameter group, respectively, to obtain initial aggregation parameters corresponding to each first sub-model and initial aggregation parameters corresponding to each second sub-model;
and performing parameter aggregation on the initial aggregation parameters corresponding to the first sub-models to obtain the first aggregation model parameters, and performing parameter aggregation on the initial aggregation parameters corresponding to the second sub-models to obtain the second aggregation model parameters.
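The two-stage (group-level, then global) aggregation described above might be sketched as follows; the data layout and names are assumptions for the example:

```python
def hierarchical_aggregate(groups):
    """groups maps each first participant device id to its model
    parameter group, i.e. the list of (first_params, second_params)
    pairs built from the mapping relationship.

    Stage 1 averages within each group, yielding the initial
    aggregation parameters; stage 2 averages across groups, yielding
    the first and second aggregation model parameters.
    """
    def average(vectors):
        n = len(vectors)
        return [sum(v[j] for v in vectors) / n
                for j in range(len(vectors[0]))]

    initial_first, initial_second = [], []
    for pairs in groups.values():
        initial_first.append(average([p[0] for p in pairs]))
        initial_second.append(average([p[1] for p in pairs]))
    return average(initial_first), average(initial_second)

groups = {
    "first_device_A": [([2.0], [4.0]), ([4.0], [8.0])],
    "first_device_B": [([6.0], [2.0])],
}
first_agg, second_agg = hierarchical_aggregate(groups)
```

Averaging within groups first keeps each first participant device's contribution balanced even when groups contain different numbers of model parameter pairs.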
In the foregoing scheme, the obtaining module is further configured to receive a first notification message sent by a target second participant device among the at least two second participant devices, the first notification message indicating that the Poisson regression model in whose training the target second participant device participates meets the convergence condition;
storing the model parameter pair at the time the Poisson regression model reaches the convergence condition, based on the first notification message;
correspondingly, obtaining the at least two model parameter pairs corresponding to the Poisson regression models includes:
obtaining the model parameter pairs of the Poisson regression models, corresponding to the second participant devices other than the target second participant device, that have not reached the convergence condition;
correspondingly, the aggregation module is further configured to perform model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in both the stored model parameter pairs and the model parameter pairs of the Poisson regression models that have not reached the convergence condition, so as to obtain the corresponding first aggregation model parameters and second aggregation model parameters.
In the above scheme, the training module is further configured to obtain the model convergence conditions of the at least two Poisson regression models;
based on the model convergence conditions, when it is determined that a Poisson regression model satisfying its model convergence condition exists among the at least two Poisson regression models, sending a second notification message to the second participant device corresponding to the Poisson regression model satisfying the model convergence condition;
wherein the second notification message is used to notify the second participant device to stop training the corresponding Poisson regression model.
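One simple way to realize the convergence check that triggers the second notification message, sketched under the assumption (not stated in the patent) that convergence is judged by the change in training loss between rounds:

```python
def converged_model_ids(prev_losses, cur_losses, tol=1e-4):
    """Return the ids of Poisson regression models whose training loss
    changed by less than tol between rounds; the parameter aggregation
    device would send the matching second participant devices a
    notification to stop training."""
    return [model_id for model_id in cur_losses
            if abs(prev_losses[model_id] - cur_losses[model_id]) < tol]

stopped = converged_model_ids(
    {"model_1": 0.520, "model_2": 0.310},
    {"model_1": 0.440, "model_2": 0.31000001},
)
```

Models whose loss is still moving keep training; only the second participant devices of converged models receive the stop notification.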
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and a processor, configured to implement the method for training a Poisson regression model provided in the embodiments of the present application when executing the executable instructions stored in the memory.
An embodiment of the present application provides a computer-readable storage medium storing executable instructions for causing a processor to execute the method for training a Poisson regression model provided in the embodiments of the present application.
An embodiment of the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the method for training a Poisson regression model provided in the embodiments of the present application.
The embodiment of the application has the following beneficial effects:
compared with the technology that the modeling of the related single Poisson regression can only support the modeling of one first participant device and one second participant device, the Federal learning system which comprises the first participant device with the model training label, at least two second participant devices and the parameter aggregation device is built to support the multi-party combined Poisson regression modeling. The parameter aggregation equipment obtains model parameter pairs corresponding to at least two Poisson regression models, each model parameter pair comprises a model parameter of a first sub-model in the first participant equipment and a model parameter of a corresponding second sub-model in the second participant equipment, namely the first participant equipment and each second parameter party equipment respectively carry out Poisson regression model training, each training obtains model parameter pairs with the same number as the second participant equipment, and the model parameter pairs are uploaded to the parameter aggregation equipment, so that the number of the second participant equipment trained by the Poisson regression models can be supplemented. The parameter aggregation equipment carries out model parameter aggregation on model parameters of a first sub-model and model parameters of a second sub-model in at least two model parameter pairs respectively to obtain corresponding first aggregation model parameters and second aggregation model parameters; and sending a first aggregation model parameter to the first participant device, and sending a second aggregation model parameter to each second participant device to update the model parameters of the Poisson regression model.
Drawings
Fig. 1 is a schematic diagram of an alternative architecture of a training system of a Poisson regression model provided in an embodiment of the present application;
Fig. 2 is a schematic diagram of an alternative structure of an electronic device provided in an embodiment of the present application;
Fig. 3 is a schematic flow chart of an alternative method for training a Poisson regression model provided in an embodiment of the present application;
Figs. 4A-4C are schematic flow charts of an alternative method for training a Poisson regression model provided in an embodiment of the present application;
Fig. 5 is a schematic flow chart of an alternative method for training a Poisson regression model provided in an embodiment of the present application;
Fig. 6 is a schematic flow chart of an alternative method for training a Poisson regression model provided in an embodiment of the present application;
Fig. 7 is a schematic flow chart of an alternative method for training a Poisson regression model provided in an embodiment of the present application;
Fig. 8 is a schematic flow chart of an alternative method for training a Poisson regression model provided in an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
Where the terms "first", "second", and "third" appear in the specification, they are used merely to distinguish similar items and do not indicate a particular ordering of items. It is to be understood that "first", "second", and "third" may be interchanged in a particular order or sequence where appropriate, so that the embodiments of the application described herein can be practiced in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before the embodiments of the present application are described in further detail, the terms and expressions involved in the embodiments are explained as follows.
1) Federated Learning: a distributed machine learning framework under which efficient machine learning is carried out among the computing devices of multiple parties, on the premise of guaranteeing information security during data exchange, protecting the privacy of terminal data and personal data, and ensuring legal compliance.
2) Vertical federated learning (also called sample-aligned federated learning) increases the feature dimension of the training data: when the participants' data features overlap little but their users overlap substantially, the subset of samples that share the same users but have different user data features is taken out to jointly train a machine learning model.
For example, there are two participants a and B belonging to the same region, where participant a is a bank and participant B is an e-commerce platform. Participants a and B have more users in the same area, but a and B have different services and different recorded user data characteristics. In particular, the user data characteristics of the a and B records may be complementary. In such a scenario, vertical federated learning may be used to help a and B build a joint machine learning predictive model, helping a and B provide better service to their customers.
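In vertical federated learning, the parties must first align on their shared users. A minimal plaintext sketch is shown below; production systems instead use private set intersection (PSI) so that neither party reveals its non-overlapping users:

```python
def align_samples(ids_bank, ids_ecommerce):
    """Return the sorted common user ids on which participants A (bank)
    and B (e-commerce platform) would jointly train. Plaintext
    intersection is for illustration only."""
    return sorted(set(ids_bank) & set(ids_ecommerce))

common_users = align_samples(["u1", "u3", "u5"], ["u2", "u3", "u5", "u7"])
```

Only the aligned rows participate in training; each party keeps its own feature columns local.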
3) Poisson regression is mainly suitable for analyzing the incidence of rare outcomes. The data typically have the following characteristics: the denominator of the rate is large relative to the numerator, or the events occur within a limited window of time or space, so that the number of event occurrences can be approximated as following a Poisson distribution. Common examples are the number of accidental deaths of soldiers, the number of misdialed telephone calls, the incidence of rare diseases, and so on.
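As a concrete (non-patent) illustration of Poisson regression, the sketch below fits a one-feature model with rate = exp(w*x + b) by gradient descent on the negative log-likelihood; the data and hyperparameters are made up for the example:

```python
import math

def fit_poisson(xs, ys, lr=0.05, steps=2000):
    """Gradient descent on the Poisson negative log-likelihood.
    For rate lam_i = exp(w*x_i + b), the gradient contributions are
    (lam_i - y_i) * x_i for w and (lam_i - y_i) for b."""
    w = b = 0.0
    n = len(xs)
    for _ in range(steps):
        grad_w = grad_b = 0.0
        for x, y in zip(xs, ys):
            lam = math.exp(w * x + b)
            grad_w += (lam - y) * x / n
            grad_b += (lam - y) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Counts that double with each unit of x, i.e. true w = ln 2, b = 0:
w, b = fit_poisson([0.0, 1.0, 2.0, 3.0], [1.0, 2.0, 4.0, 8.0])
```

The log link keeps the predicted rate positive, which is why the vertical federated setting in this application can split the linear term w*x + b across the label-holding and feature-holding parties.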
The inventors found that in the related vertical Poisson regression model technology, owing to the characteristics of Poisson modeling, a single modeling run can support only a single label provider and a single feature provider, which easily causes the resulting model to overfit and makes the model's predictions inaccurate.
Based on this, embodiments of the present application provide a method, an apparatus, an electronic device, a computer-readable storage medium, and a computer program product for training a poisson regression model, which can extend an application range of the poisson regression model, support multi-party combined modeling, ensure security of multi-party data, and improve prediction capability of the model.
First, the training system of the Poisson regression model provided in this embodiment is described. Referring to Fig. 1, Fig. 1 is a schematic diagram of an alternative architecture of the training system 100 of the Poisson regression model provided in this embodiment. A first participant device 400 and second participant devices 410 (two second participant devices are shown by way of example, denoted 410-1 and 410-2 for distinction) are connected to each other through a network 300 and, through the same network 300, to a parameter aggregation device 200. The network 300 may be a wide area network or a local area network, or a combination of the two, and data transmission is achieved using wireless links.
In some embodiments, the first participant device 400 and the second participant devices 410 may be, but are not limited to, a laptop computer, a tablet computer, a desktop computer, a smart phone, a dedicated messaging device, a portable gaming device, a smart speaker, a smart watch, and the like, and may also be client terminals of federated learning participants, such as participant devices storing user feature data at banks or financial institutions. The parameter aggregation device 200 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a Content Delivery Network (CDN), and big data and artificial intelligence platforms, and is configured to assist each participant device in performing federated learning to obtain a federated learning model. The first participant device 400, the second participant devices 410, and the parameter aggregation device 200 may be directly or indirectly connected through wired or wireless communication, which is not limited in the embodiments of the present application.
The first participant device 400 is configured to perform Poisson regression model training with the second participant device 410-1 and the second participant device 410-2, respectively, generate the model parameters of a local first sub-model, send the model parameters of the first sub-model to the parameter aggregation device 200, receive the first aggregation model parameters sent by the parameter aggregation device 200, and update the model parameters of the Poisson regression model according to the first aggregation model parameters.
The second participant device 410 is configured to perform poisson regression model training with the first participant device 400, generate a model parameter of a local second sub-model, send the model parameter of the second sub-model to the parameter aggregation device 200, receive the second aggregation model parameter sent by the parameter aggregation device 200, and update the model parameter of the poisson regression model according to the second aggregation model parameter.
The parameter aggregation device 200 is configured to obtain at least two model parameter pairs corresponding to Poisson regression models, the model parameter pairs and the Poisson regression models being in one-to-one correspondence; each model parameter pair comprises the model parameters of a first sub-model in the first participant device 400 and the model parameters of a corresponding second sub-model in a second participant device 410. The parameter aggregation device 200 performs model parameter aggregation on the model parameters of the first sub-models and the model parameters of the second sub-models in the at least two model parameter pairs, respectively, to obtain the corresponding first aggregation model parameters and second aggregation model parameters; it sends the first aggregation model parameters to the first participant device 400 and the second aggregation model parameters to the second participant device 410-1 and the second participant device 410-2, so as to update the model parameters of the Poisson regression models.
Referring to Fig. 2, Fig. 2 is a schematic diagram of an alternative structure of an electronic device 500 provided in the embodiment of the present application. In practical applications, the electronic device 500 may be implemented as the first participant device 400, a second participant device 410, or the parameter aggregation device 200 in Fig. 1; the electronic device implementing the method for training a Poisson regression model of the embodiment of the present application is described taking the parameter aggregation device 200 shown in Fig. 1 as an example. The electronic device 500 shown in Fig. 2 includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in the electronic device 500 are coupled together by a bus system 540. It will be appreciated that the bus system 540 is used to enable connection and communication among these components. In addition to a data bus, the bus system 540 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 540 in Fig. 2.
The processor 510 may be an integrated circuit chip having signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, where the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 530 includes one or more output devices 531 enabling presentation of media content, including one or more speakers and/or one or more visual display screens. The user interface 530 also includes one or more input devices 532, including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 550 optionally includes one or more storage devices physically located remote from processor 510.
The memory 550 may comprise volatile memory or nonvolatile memory, and may also comprise both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a Random Access Memory (RAM). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 can store data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a network communication module 552 for communicating with other computing devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 including: Bluetooth, Wireless Fidelity (WiFi), and Universal Serial Bus (USB), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
an input processing module 554 to detect one or more user inputs or interactions from one of the one or more input devices 532 and to translate the detected inputs or interactions.
In some embodiments, the training apparatus of the poisson regression model provided in the embodiments of the present application may be implemented in software, and fig. 2 illustrates the training apparatus 555 of the poisson regression model stored in the memory 550, which may be software in the form of programs and plug-ins, and includes the following software modules: an acquisition module 5551, an aggregation module 5552, and a training module 5553, which are logical and thus may be arbitrarily combined or further split depending on the functions implemented. The functions of the respective modules will be explained below.
In other embodiments, the training Device of the poisson regression model provided in the embodiments of the present Application may be implemented in hardware, and as an example, the training Device of the poisson regression model provided in the embodiments of the present Application may be a processor in the form of a hardware decoding processor, which is programmed to execute the training method of the poisson regression model provided in the embodiments of the present Application, for example, the processor in the form of the hardware decoding processor may employ one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components.
In a Poisson regression model training system in the related art, a single round of modeling supports only a single label provider and a single feature provider, so the prediction result easily overfits when model prediction is performed with few training samples. To increase the number of training samples and adapt to application scenarios in which multiple feature providers participate jointly, the present application provides a Poisson regression model training system based on federated learning. The training system comprises at least one first participant device with model training labels (i.e., a label provider), at least two second participant devices (i.e., multiple feature providers), and a parameter aggregation device. Compared with the architecture of a model training system in the related art, the number of feature providers is increased: multiple feature providers and the label provider jointly model, and the multiple feature providers supply training samples to enlarge the training set. On this basis, the label provider performs Poisson regression modeling with each feature provider separately, yielding as many model parameter pairs as there are feature providers; each model parameter pair comprises the model parameters of the label provider's local sub-model and the model parameters of the corresponding feature provider's local sub-model. The parameter aggregation device then aggregates all the model parameters sent by the label provider to obtain the aggregation model parameters corresponding to the label provider, and aggregates the model parameters sent by all the feature providers to obtain the aggregation model parameters corresponding to the feature providers. The aggregation model parameters used to update the local models of the label provider and of all the feature providers are thus obtained through parameter aggregation, so that training of the Poisson regression model is achieved in a scenario where the feature providers and the label provider jointly model.
Next, with reference to an exemplary application and implementation of the parameter aggregation device provided in the embodiment of the present application, the method for training a Poisson regression model provided in the embodiment of the present application is described. The method is applied to a federated learning system, and the federated learning system includes at least one first participant device having model training labels, at least two second participant devices, and a parameter aggregation device.
Referring to fig. 3, fig. 3 is an alternative flowchart of a method for training a poisson regression model provided in the embodiment of the present application, and will be described with reference to the steps shown in fig. 3.
In step 101, the parameter aggregation device obtains at least two model parameter pairs corresponding to the poisson regression models, where the model parameter pairs are in a one-to-one correspondence with the poisson regression models, and each model parameter pair includes a model parameter of a first sub-model in the first participant device and a model parameter of a corresponding second sub-model in the second participant device.
Here, the first participant device and the at least two second participant devices perform poisson regression model training, respectively, and there are poisson regression models equal in number to the second participant devices.
Explaining the at least two Poisson regression models: each second participant device cooperatively trains one Poisson regression model with the corresponding first participant device, so the number of Poisson regression models equals the number of second participant devices. The reason multiple Poisson regression models exist is that the sample intersections of the training samples provided by the first participant device and each second participant device differ; that is, the numbers of samples that the first participant device and the second participant devices provide for Poisson regression model training are often different. Nevertheless, all the Poisson regression models have the same meaning, i.e., the prediction results obtained by the trained models have the same meaning.
In some embodiments, each poisson regression model, after each round of training, generates a corresponding pair of model parameters, where each pair of model parameters includes a model parameter of a first sub-model in a first participant device and a model parameter of a corresponding second sub-model in a second participant device. That is, for each round of model training, a pair of model parameters corresponding to the poisson regression model one to one is generated.
In practical implementation, when the number of the first participant devices is one, the sample intersection of the first participant devices and each of the second participant devices is different, each of the second participant devices trains a poisson regression model through a common training sample with the first participant device, and the number of the poisson regression models is consistent with the number of the second participant devices.
Referring to fig. 4A, fig. 4A is an alternative flowchart of a method for training a Poisson regression model according to an embodiment of the present disclosure. The first participant device provides training sample 1 and training sample 2. Training sample 1 (providing feature data x1, x2 and label data y) and the second participant 1 (providing feature data x3, x4) train Poisson regression model 1 to obtain model parameter pair 1 {Θ1, Θ2, Θ3, Θ4}, where {Θ1, Θ2} comes from the first sub-model corresponding to the first participant device and {Θ3, Θ4} from the second sub-model corresponding to the second participant device. Training sample 2 (providing feature data x1, x2) and the second participant 2 (providing feature data x3, x4) train Poisson regression model 2 to obtain model parameter pair 2. Poisson regression model 1 and Poisson regression model 2 have the same meaning, i.e., the model predicted value y has the same meaning (for example, for a certain newly added vehicle insurance type, the number of reimbursements in the next year is predicted). Here x1, x2, x3 and x4 have the same meaning across the two models; only the specific data differ.
In practical implementation, when the first participant device corresponds to the second participant device, each first participant device and one second participant device train a poisson regression model together, that is, the number of poisson regression models is consistent with the number of second participants.
For example, referring to fig. 4B, fig. 4B is an alternative flowchart of a method for training a Poisson regression model provided in the embodiment of the present application. A first participant device 1 (providing feature data x1, x2 and label data y) and a second participant device 1 (providing feature data x3, x4) train Poisson regression model 1 to obtain model parameter pair 1 {Θ1, Θ2, Θ3, Θ4}; the first participant device 2 and the second participant device 2 train Poisson regression model 2 to obtain model parameter pair 2. Poisson regression model 1 and Poisson regression model 2 have the same meaning, i.e., both models predict the value y.
In practical implementation, there may be multiple first participant devices and multiple second participant devices, and a first participant device can perform model training with any second participant device.
For example, referring to fig. 4C, fig. 4C is an alternative flowchart of a method for training a poisson regression model provided in the embodiment of the present application. The number of the first participant devices is two, the number of the second participant devices is two, the first participant device 1 (providing the feature data x1, x2 and the label data y) provides a training sample 1 and a training sample 2, and the training sample 1 and the second participant 1 (providing the feature data x3 and x4) jointly train a poisson regression model 1 to obtain a model parameter pair 1; the training sample 2 and a training sample 3 provided by the second participant device 2 jointly train a Poisson regression model 2 to obtain a model parameter pair 2; the first participant device 2 (providing the feature data x1, x2, tag data y) trains the poisson regression model 3 together with the training samples 4 provided by the second participant 2 (providing the feature data x3, x4), resulting in model parameter pairs 3. The Poisson regression model 1, the Poisson regression model 2 and the Poisson regression model 3 have the same meaning, namely the predicted values of the models are y.
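The pairing logic of fig. 4C can be sketched as follows; this is an illustrative sketch, and all identifiers and parameter values are hypothetical, not taken from the application:

```python
# Pairing implied by fig. 4C: one Poisson model, hence one model parameter
# pair, per Guest-Host training; names and values are hypothetical.
trainings = [
    ("guest1", "host1"),  # Poisson regression model 1 -> model parameter pair 1
    ("guest1", "host2"),  # Poisson regression model 2 -> model parameter pair 2
    ("guest2", "host2"),  # Poisson regression model 3 -> model parameter pair 3
]

def make_pair(guest_params, host_params):
    # e.g. {Θ1, Θ2} from the Guest's first sub-model,
    #      {Θ3, Θ4} from the Host's second sub-model
    return {"first_submodel": guest_params, "second_submodel": host_params}

pairs = {t: make_pair([0.1, 0.2], [0.3, 0.4]) for t in trainings}
assert len(pairs) == len(trainings)  # pairs correspond one-to-one with models
```

Each training pair produces exactly one model parameter pair, which is why the number of Poisson regression models tracks the number of Guest-Host pairings rather than the number of devices.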
In some embodiments, before the parameter aggregation device obtains the model parameter pairs corresponding to at least two Poisson regression models, each Poisson regression model needs to be trained for a preset number of rounds. Specifically, the parameter aggregation device creates a key pair comprising a public key and a private key, and distributes the public key to the first participant device and to the at least two second participant devices; the first participant device and each second participant device then encrypt their model parameters based on the public key.
Further, based on the received public key, the first participant device performing poisson regression model training, and the corresponding second participant device perform the following operations, respectively: respectively calculating local intermediate results based on the public key to obtain encrypted intermediate results; respectively calculating encryption model parameters of respective local models based on the encryption intermediate result to obtain the model parameters of the encrypted first sub-model and the encrypted second sub-model; and respectively sending the encrypted model parameters of the first sub-model and the encrypted model parameters of the second sub-model to the parameter aggregation equipment.
In some embodiments, the parameter aggregation device performs a decryption operation on the received model parameters of the first sub-model encrypted by using the public key and the encrypted model parameters of the second sub-model by using the corresponding private key, specifically, receives the encrypted model parameters of the first sub-model sent by the first participant device and the encrypted model parameters of the second sub-model sent by at least two second participant devices; constructing at least two encrypted model parameter pairs corresponding to the Poisson regression model based on the encrypted model parameters of the first sub-model and the encrypted model parameters of the second sub-model; and respectively decrypting the model parameter of the first sub-model and the model parameter of the second sub-model in the encrypted model parameter pair based on the private key to obtain the decrypted model parameter pair.
Illustratively, based on fig. 3, before step 101, a first participant device and a corresponding second participant device perform longitudinal poisson regression model training, referring to fig. 5, fig. 5 is an optional flowchart of a method for training a poisson regression model provided in this embodiment of the present application, and a training process of the longitudinal poisson regression model will be described with reference to each step.
In step 201, the coordinator device creates a key pair and sends the public key to the first participant device and the corresponding second participant device.
Here, the coordinator device is a device that passes a security authentication protocol, and may be the aforementioned parameter aggregation device, a newly added trusted third party server, or a first participant device.
In some embodiments, the computational formula for the model tag in the first participant device is set:
y=exp(wx)
in the above formula, y is the model label, which can be understood as the number of times a certain event occurs within some indefinite time range, such as the number of claims of a certain insurance type in a year. The model label y is predicted directly, and multiplicative homomorphic encryption is used when the encrypted residual is subsequently calculated. By contrast, the label calculation formula in the related Poisson regression models generally adopts the following form:
log y=ax+b
This formula actually takes the logarithm of y and then performs linear-model training to obtain the model parameters a and b when the model converges; what is predicted is the value of log y, so the actually predicted quantity has changed, and additive homomorphic encryption is used when the encrypted residual is subsequently calculated.
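The practical consequence of choosing y = exp(wx) over log y = ax + b can be seen in how the prediction factorises across parties. The following sketch, with made-up parameters and features, checks the identity exp(w_g·x_g + w_h·x_h) = exp(w_g·x_g)·exp(w_h·x_h) that makes multiplicative homomorphic encryption applicable:

```python
import math

def lin(w, x):
    # inner product w·x of parameters and features
    return sum(wi * xi for wi, xi in zip(w, x))

# Illustrative values: w_g, x_g on the Guest side; w_h, x_h on the Host side.
w_g, x_g = [0.3, -0.1], [1.0, 2.0]
w_h, x_h = [0.2, 0.4], [0.5, 1.5]

# Because y = exp(w·x), the joint prediction factorises across the parties:
joint = math.exp(lin(w_g, x_g) + lin(w_h, x_h))
factored = math.exp(lin(w_g, x_g)) * math.exp(lin(w_h, x_h))
assert abs(joint - factored) < 1e-12
```

This factorisation is what allows the Host to ship exp(w_h·x_h) under a multiplicatively homomorphic encryption while the Guest multiplies in its own factor in the residual calculation that follows.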
In step 202, the first and second participant devices encrypt and exchange intermediate results, where the intermediate results are used to calculate model parameters of the first submodel and model parameters of the second submodel.
Here, the first sub-model is provided by the first participant device and the second sub-model by the second participant device. The second participant device (i.e., the Host party) obtains the public key from step 201 and uses it to encrypt the intermediate calculation results of its local model, w_hj·x_hj and exp(w_hj·x_hj), obtaining the encrypted intermediate result [[exp(w_hj·x_hj)]], which it sends to the corresponding first participant device for model training; the first participant device uses its own feature data to calculate the corresponding w_g·x_g and exp(w_g·x_g).
then, obtaining an encryption residual value, wherein the encryption mode adopts multiplication homomorphic encryption, and a formula for calculating the encryption residual value is as follows:
in the above-mentioned formula,is provided by the first participant device [ [ exp (w)hjxhj)]]The calculation of the residual error is performed by the second party device, and based on the model label prediction formula corresponding to the poisson regression model provided in the embodiment of the present application, multiplicative homomorphic encryption is used.
It should be noted that, when multiplicative homomorphic encryption is used, the data provided by the first participant device is unencrypted, because under multiplicative homomorphic encryption an encrypted value cannot be multiplied with another encrypted value, whereas an encrypted value can be multiplied with an unencrypted value. The above formula for calculating the encrypted residual is for the case of one Guest and one Host; if multiple Hosts participate, the formula becomes:

[[y']] = exp(w_g·x_g) · [[exp(w_h1·x_h1)]] · [[exp(w_h2·x_h2)]]
and [ [ exp (w) in the above formulah2xh2)]]And [ [ exp (w) ]h1xh1)]]Both are homomorphic encryption parameters, the two cannot carry out multiplication operation, the result obtained by multiplying the two cannot be restored (in cryptography, if p and q can also be decomposed, the decomposition form of the product of p and q is various, so that decryption cannot be carried out after encryption), and the model parameters received in model training must be restored.
In step 203, the first participant device calculates a first model parameter corresponding to the local first sub-model, and the second participant device calculates a second model parameter corresponding to the local second sub-model, and encrypts the respective model parameters by a public key and sends the encrypted model parameters to the coordinator device.
In step 204, the coordinator device decrypts the encrypted model parameters of the first sub-model and the encrypted model parameters of the second sub-model through the private key, sends the decrypted model parameters of the first sub-model to the first participant device, and sends the decrypted model parameters of the second sub-model to the second participant device, so that the first participant device and the second participant device update the model parameters of their respective local models respectively.
In some embodiments, when the Poisson regression model that a second participant device participates in training reaches the convergence condition, that second participant device sends a notification message to the parameter aggregation device, and the parameter aggregation device stores the model parameter pair at the time the Poisson regression model reached the convergence condition. Specifically, the parameter aggregation device receives a first notification message, sent by a target second participant device of the at least two second participant devices, indicating that the Poisson regression model the target second participant device participates in training has reached the convergence condition, and, based on the first notification message, stores the model parameter pair at the time the Poisson regression model reached the convergence condition. Correspondingly, obtaining the model parameter pairs corresponding to at least two Poisson regression models includes: obtaining the model parameter pairs of the Poisson regression models, corresponding to the second participant devices other than the target second participant device, that have not yet reached the convergence condition.
In practical implementation, when the poisson regression model in which any one of the second participant devices participates in training reaches the convergence condition, the model parameter pair corresponding to the poisson regression model needs to be prestored for use in parameter aggregation.
Illustratively, referring to fig. 4A, when a poisson regression model 1 trained by a training sample 1 together with a second participant device reaches a convergence condition, the obtained model parameter pair 1 is pre-stored.
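The arbiter-side bookkeeping implied here (freeze the parameter pair of any model that reports convergence, then merge it with the fresh pairs of the still-training models when aggregating) might be sketched as follows; the function and variable names are hypothetical:

```python
# Hypothetical arbiter-side bookkeeping for converged Poisson regression models.
converged = {}

def on_first_notification(model_id, pair):
    # first notification message: model `model_id` reached the convergence
    # condition, so its parameter pair is pre-stored for later aggregation
    converged[model_id] = pair

def pairs_for_aggregation(fresh_pairs):
    # fresh_pairs: pairs from models that have NOT yet converged this round;
    # the stored pairs of converged models are merged back in
    return {**converged, **fresh_pairs}

# e.g. Poisson regression model 1 converged (fig. 4A); model 2 is still training
on_first_notification("model_1", {"first": [0.1], "second": [0.2]})
combined = pairs_for_aggregation({"model_2": {"first": [0.3], "second": [0.4]}})
assert set(combined) == {"model_1", "model_2"}
```

This mirrors the fig. 4A example: pair 1 is pre-stored at convergence and still participates in every subsequent parameter aggregation.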
In step 102, model parameter aggregation is performed on the model parameters of the first sub-model and the model parameters of the second sub-model in the at least two model parameter pairs, so as to obtain the corresponding first aggregation model parameters and second aggregation model parameters.
Here, the first aggregate model parameter is obtained by performing parameter aggregation on the model parameter of the first submodel, and the second aggregate model parameter is obtained by performing parameter aggregation on the model parameter of the second submodel.
Illustratively, referring to fig. 4A, training Poisson regression model 1 with training sample 1 provided by the first participant and the second participant 1 yields model parameter pair 1 {Θ1, Θ2, Θ3, Θ4}, and training Poisson regression model 2 with training sample 2 provided by the first participant and the second participant 2 yields model parameter pair 2 {Θ1, Θ2, Θ3, Θ4}, assuming that in each model parameter pair {Θ1, Θ2} comes from the first sub-model of the first participant device and {Θ3, Θ4} from the second sub-model of the second participant device. Securely aggregating model parameter pair 1 and model parameter pair 2 actually means securely aggregating {Θ1, Θ2} in model parameter pair 1 with {Θ1, Θ2} in model parameter pair 2 to obtain the first aggregation model parameters, and securely aggregating {Θ3, Θ4} in model parameter pair 1 with {Θ3, Θ4} in model parameter pair 2 to obtain the second aggregation model parameters.
In some embodiments, the aggregated model parameter is calculated by obtaining a weight corresponding to the model parameter, specifically, a first weight of the model parameter of the first sub-model in each model parameter pair and a second weight of the model parameter of the second sub-model in each model parameter pair are respectively obtained; based on the obtained first weights, carrying out weighted summation on the model parameters of at least two first submodels to obtain first aggregation model parameters; and carrying out weighted summation on the model parameters of the at least two second submodels based on the obtained second weights to obtain second aggregation model parameters.
Illustratively, one first participant device (Guest) and J Hosts send the encrypted model parameters of their respective local models to the third-party Arbiter; the parameter aggregation device aggregates all the model parameters provided by the first participant device into the first aggregation model parameters by weighted averaging, and aggregates all the model parameters of the second participant devices into the second aggregation model parameters by weighted averaging.
In some embodiments, the weights corresponding to the model parameters may be determined according to the number of training samples provided by the first participant device and each second participant device and the total data amount of the training samples participating in the model training when the poisson regression model training is performed. Specifically, a first number of training samples provided by the first participant device for performing model training with each second participant device, a second number of training samples provided by each second participant device, and a total number of training samples used for poisson regression model training are obtained; determining a first ratio of the first number to the total number, the first ratio being the first weight; determining a second ratio of the second number to the total number, the second ratio being used as a second weight of the model parameters of the respective second submodel.
Illustratively, the first participant device and each second participant device send their encrypted model parameters, together with the amount of data used for modeling, to the parameter aggregation device: the first participant device sends the encrypted model parameter [[w_g]] and the data quantity n_g used in modeling, and each second participant device sends the encrypted model parameter [[w_h]] and the data quantity n_h used in modeling. The parameter aggregation device performs a secure aggregation operation on the model parameters to obtain the aggregation model parameters. The parameter aggregation formula usable here is:

[[w]] = Σ_i (n_i / N) · [[w_i]]

where [[w]] is the aggregation model parameter obtained after secure aggregation, [[w_i]] is the i-th received model parameter (first or second), n_i is the amount of data the corresponding devices used for modeling, and N = Σ_i n_i is the total amount of modeled data. For example, with 1 first participant device (Guest) and 2 second participant devices (Host1, Host2), the first stage models twice: in the first modeling, the intersection data volume of Guest and Host1 is 1000; in the second modeling, the intersection data volume of Guest and Host2 is 2000. The total modeled data volume is then 1000 + 2000 = 3000, and the corresponding weights are 1000/3000 and 2000/3000.
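The weighted aggregation can be sketched on plaintext parameters for readability (in the application the parameters remain encrypted and the same weighted sum would be evaluated homomorphically); the numbers reuse the 1000/3000 and 2000/3000 weights from the example above, while the parameter values themselves are made up:

```python
# Weighted aggregation of model parameter vectors by modeling data volume.
def aggregate(param_vectors, sample_counts):
    total = sum(sample_counts)
    dim = len(param_vectors[0])
    return [
        sum(n / total * v[i] for v, n in zip(param_vectors, sample_counts))
        for i in range(dim)
    ]

# Guest-side parameters from the two modelings in the example above:
w1, w2 = [0.2, 0.5], [0.8, 0.2]
agg = aggregate([w1, w2], [1000, 2000])   # weights 1000/3000 and 2000/3000
assert abs(agg[0] - 0.6) < 1e-9 and abs(agg[1] - 0.3) < 1e-9
```

The Host-side halves of the pairs would be aggregated by the same function with the same counts to produce the second aggregation model parameters.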
in some embodiments, when the number of the first participant devices is the same as the number of the second participant devices, the first participant devices and the second participant devices have a one-to-one correspondence relationship, and the weights for obtaining the corresponding model parameters are described, specifically, a third number of training samples provided by the first participant devices, a fourth number of training samples provided by the second participant devices, and a total number of training samples used for poisson regression model training are respectively obtained; determining a third ratio of the third number to the total number, the third ratio being the first weight; determining a fourth ratio of the fourth number to the total number, the fourth ratio being the second weight.
In some embodiments, when there are at least two first participant devices, each performs Poisson regression model training with second participant devices, and each first participant device marks the second participant devices that train with it, constructing a mapping relation. Specifically, when the number of first participant devices is at least two, a mapping relation between the first participant devices and the second participant devices is obtained, the mapping relation indicating which second participant devices perform Poisson regression model training with each first participant device. Based on the mapping relation and the obtained at least two model parameter pairs, a model parameter group is constructed for each first participant device; each model parameter group comprises a first number of model parameter pairs, the first number being the number of second participant devices that perform Poisson regression model training with that first participant device. Parameter aggregation is then performed on the model parameter pairs in each model parameter group to obtain the initial aggregation parameters corresponding to each first sub-model and each second sub-model; finally, the initial aggregation parameters corresponding to the first sub-models are aggregated to obtain the first aggregation model parameters, and the initial aggregation parameters corresponding to the second sub-models are aggregated to obtain the second aggregation model parameters.
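The two-stage aggregation for multiple Guests might be sketched as follows; the grouping keys, parameter values, and the choice to weight the second stage by per-group sample counts are all assumptions made for illustration:

```python
# Two-stage aggregation: group pairs per Guest via the mapping relation,
# aggregate within each group, then aggregate the per-group results.
def weighted_mean(vectors, counts):
    total = sum(counts)
    return [sum(n / total * v[i] for v, n in zip(vectors, counts))
            for i in range(len(vectors[0]))]

# mapping relation: Guest -> (guest-side parameters, sample count) per Host
groups = {
    "guest1": [([0.2, 0.5], 1000), ([0.8, 0.2], 2000)],
    "guest2": [([0.5, 0.5], 3000)],
}

# stage 1: initial aggregation parameters per model parameter group
initial = {g: (weighted_mean([v for v, _ in ps], [n for _, n in ps]),
               sum(n for _, n in ps))
           for g, ps in groups.items()}

# stage 2: aggregate the initial parameters into the first aggregation
# parameters (second-stage weighting by group sample count is an assumption)
first_agg = weighted_mean([v for v, _ in initial.values()],
                          [n for _, n in initial.values()])
```

The second-sub-model halves of the pairs would pass through the same two stages to yield the second aggregation model parameters.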
In some embodiments, if there is a pair of prestored model parameters in step 101, performing secure aggregation by combining the pair of prestored model parameters and the pair of model parameters that do not reach the convergence condition, specifically, performing model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the pair of at least two model parameters, respectively, to obtain corresponding first aggregation model parameters and second aggregation model parameters, includes: and respectively carrying out model parameter aggregation on the stored model parameter pair, the model parameter of the first sub-model and the model parameter of the second sub-model in the model parameter pair of the Poisson regression model which does not reach the convergence condition, and obtaining corresponding first aggregation model parameters and second aggregation model parameters.
For example, referring to fig. 4A, there are 2 model parameter pairs in the graph, when a poisson regression model 1 trained by a training sample 1 and a second participant device reaches a convergence condition, the obtained model parameter pair 1 is pre-stored, a parameter aggregation operation is performed on the model parameter pair 2 uploaded during the poisson regression model 2 training which does not reach the convergence condition, so as to obtain a first aggregation model parameter and a second aggregation model parameter, the first participant updates a local model parameter according to the first aggregation model parameter, and the second participant 1 and the second participant 2 update the local model parameter according to the second aggregation model parameter.
In step 103, the first aggregation model parameter is sent to the first participant device, and the second aggregation model parameter is sent to each second participant device, so as to update the model parameter of the poisson regression model.
Here, the first aggregation model parameter is distributed to each first participant device, the second aggregation model parameter is distributed to each second participant device, and each first participant device and each second participant device acquire the relevant model parameter of the local submodel, decrypt the model parameter, and update the relevant model parameter of the local model.
In some embodiments, after the last security parameter aggregation is performed, a second notification message for stopping model training is also sent to each participant, specifically, a model convergence condition of the at least two poisson regression models is obtained; based on the model convergence condition, when determining that a Poisson regression model meeting the model convergence condition exists in the at least two Poisson regression models, sending a second notification message to second participant equipment corresponding to the Poisson regression model meeting the model convergence condition; wherein the second notification message is configured to notify the second participant device to stop training of the corresponding poisson regression model.
According to the method and device of the present application, a federated learning system comprising a first participant device with model training labels, at least two second participant devices, and a parameter aggregation device is built to support multi-party joint Poisson regression modeling, thereby obtaining the first model parameters and the second model parameters. The first and second model parameters undergo secure parameter aggregation through a secure aggregation technique to obtain the first aggregation model parameters and the corresponding second aggregation model parameters, and the first participant device and each second participant device update the model parameters of their respective local models based on the obtained first and second aggregation model parameters. During longitudinal Poisson regression model training, the participants do not expose their local training data to one another, so the data security of all participants is guaranteed; securely aggregating the model parameters of the multi-party Poisson regression models expands the application range of Poisson regression in federated learning, improves model prediction accuracy, greatly reduces the number of interactions among participants, and lowers the communication requirement.
In the following, an exemplary application of the embodiment of the present application in an actual application scenario will be described, where the embodiment is applied in a scenario where multiple parties jointly perform poisson regression model training in a federated learning system.
In a federated learning scenario, the parameter aggregation device is set as an Arbiter, the first participant device is also called the tag provider or data application party (Guest), and a second participant device is also called a feature provider or data provider (Host). For convenience of calculation, the features of the sample data in each Host party are consistent while the contents of the sample data are not, and the Arbiter party acts as the coordinator that coordinates the joint modeling of the participants. Both Host and Guest need to train and predict with the Poisson regression model without revealing each other's labels or specific feature contents.
In practical applications, there may be a scenario in which a tag provider (Guest) needs to model with multiple feature providers (Host). For example, when an insurance enterprise predicts the future number of claims of a user, modeling with only a single Host easily over-fits the resulting model when the insurance type is new or certain samples are scarce, and training stops prematurely, so the effect is not ideal in practical application.
Therefore, jointly modeling with a plurality of feature-providing Hosts (such as retail enterprises and banks), i.e., letting multiple Hosts with complementary features participate in modeling, supplements the features and improves the practical effect of the model. However, limited by the characteristics of Poisson modeling, ordinary Poisson regression modeling in the related art only supports a single Host and a single Guest; the embodiments of the present application expand the application range of ordinary Poisson regression modeling by securely aggregating the multi-party models, thereby achieving simultaneous joint modeling by multiple parties.
The embodiment of the application provides a multi-party joint Poisson regression training method that integrates a plurality of longitudinal federated Poisson regression models through secure aggregation.
Assume 1 Guest party and J Host parties, where J is an integer greater than 1; the Guest party holds data samples X_gk with the same features and labels of the same meaning, and the Host parties each hold feature data samples X_hj. As can be seen from the figure, the process of the multi-party joint modeling provided by the embodiment of the present application is roughly divided into two stages:
the first stage is a poisson regression model training stage (1. model training), and 1 Guest and J hosts respectively carry out longitudinal modeling. Calculating a local model result by a data application (Guest), calculating an encryption gradient according to an encryption intermediate result received from a data provider (Host), and sending the encryption gradient to an Arbiter (Arbiter); the data provider calculates a local model result based on an encrypted intermediate result exchanged with the data application party, calculates an encryption gradient and sends the encryption gradient to the coordinator; and the coordinating party decrypts the encryption gradient of each party and distributes the encryption gradient to each party, and each party updates local model parameters to carry out model training.
The second stage is the parameter aggregation stage (2. secure aggregation): 1 Guest and J Hosts encrypt the model parameters of their respective local models and send them to the Arbiter; the Arbiter aggregates all the model parameters through a weighted average to obtain aggregation model parameters and sends them to the Guest and each Host, so that each participant obtains the model parameters corresponding to its own local model from the aggregation model parameters and updates the model parameters of its local model to the weighted-average parameter values.
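As an illustrative sketch of the weighted-average aggregation described above (a hypothetical plaintext implementation; in the actual scheme the parameters would be encrypted, and the function and variable names are not from the embodiment):

```python
def aggregate(params, sample_counts):
    """Weighted average of per-party parameter vectors (plaintext sketch).

    params: list of parameter vectors, one per participant
    sample_counts: number of training samples each participant contributed
    """
    total = sum(sample_counts)
    dim = len(params[0])
    return [sum(p[i] * n for p, n in zip(params, sample_counts)) / total
            for i in range(dim)]

# e.g. two parties contributing 1000 and 2000 samples
agg = aggregate([[0.3, 1.2], [0.6, 0.9]], [1000, 2000])
```

Each participant would then overwrite its local model parameters with the returned weighted-average values.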
It can be understood that the embodiment of the present application is an extension of the original scheme of the common single Host participating in the trained poisson regression model, and the extension aims to allow multiple feature providers to participate in the longitudinal poisson regression modeling.
For a specific process of the first-stage Poisson regression model training, referring to fig. 7, fig. 7 is an optional schematic diagram of the training method of the Poisson regression model provided in an embodiment of the present application. In the first-stage longitudinal Poisson regression model training, Guest performs longitudinal Poisson regression modeling with J Host parties, where J ∈ {2, 3, 4, ……}.
Precondition: guest and each Host party complete common sample screening through encrypted ID intersection, and only common samples are interacted in the following operation steps.
The Guest party and each Host party respectively carry out Poisson regression model training and execute the following operations:
In step 301, the Arbiter generates a public-private key pair and transmits the public key in the pair to the Guest party and each Host_j participating in the round of training.
In step 302, Guest determines the amount of training data in each round and sends the training data to the corresponding Host party participating in modeling.
In step 303, the Host and Guest parties respectively initialize the model parameters of the local model, and calculate the intermediate calculation result of the local model.
In step 304, the current Host side encrypts the intermediate calculation result of the local model and sends the intermediate calculation result to the Guest side.
Here, the Host party obtains the public key issued in step 301 and uses it to encrypt the intermediate calculation results w_hj·x_hj and exp(w_hj·x_hj) of the local model, obtaining an encrypted intermediate calculation result, which is sent to the corresponding Guest party for model training.
In step 305, the Guest party calculates the encryption residual value according to the intermediate calculation result of the Host party and sends the encryption residual value to the corresponding Host party.
Here, the encryption method adopts multiplicatively homomorphic encryption, and the encrypted residual value is calculated as:

[[d]] = [[y − exp(w_g·x_g) · exp(w_hj·x_hj)]]

In the above formula, the Host side calculates w_hj·x_hj and exp(w_hj·x_hj) based on its local sample feature data x_hj, and encrypts the result to obtain [[exp(w_hj·x_hj)]]. The Guest party uses its own feature data x_g to calculate w_g·x_g and exp(w_g·x_g), and combines them with the tag data y, where y is the actual tag value, i.e., the number of times a certain event occurred within a certain indefinite time frame, such as the number of claims made within a year for a certain type of insurance. The residual value [[d]] contains both w_hj·x_hj and w_g·x_g; therefore neither party can calculate it alone, and the Host party and the Guest party must compute it cooperatively.
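The cooperative residual computation can be sketched in plaintext as follows (an illustrative, hypothetical implementation: in the actual protocol the Host's factor exp(w_hj·x_hj) arrives multiplicatively homomorphically encrypted, so neither party sees the other's share):

```python
import math

def host_share(w_h, x_h):
    """Host-side factor exp(w_h · x_h); encrypted before sending in reality."""
    u = sum(w * x for w, x in zip(w_h, x_h))
    return math.exp(u)

def guest_residual(w_g, x_g, y, host_factor):
    """Guest combines its own factor exp(w_g · x_g) with the Host's factor
    and the label y to form the residual d = y - exp(w_g·x_g + w_h·x_h)."""
    u = sum(w * x for w, x in zip(w_g, x_g))
    mu = math.exp(u) * host_factor
    return y - mu

d = guest_residual([0.1], [2.0], 3, host_share([0.2], [1.0]))
```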
In step 306, Guest and Host respectively calculate an encrypted local gradient by using the own characteristic data and the encrypted residual error and send the encrypted local gradient to the Arbiter.
Here, once [[d]] is obtained, the gradient of the loss function with respect to the training parameters can be expressed by the following formula:

[[∂L/∂w]] = −Σ_i [[d_i]] · x_i

In the above formula, L represents the loss function in model training; taking the derivative with respect to the training parameter w yields the corresponding derivative, i.e., the gradient information.
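A minimal plaintext sketch of the per-party gradient computation (the sign convention and the 1/m averaging are assumptions for illustration, not taken from the patent; in the real protocol the residuals stay encrypted):

```python
def local_gradient(residuals, features):
    """Gradient of the Poisson loss w.r.t. the local weights:
    g_k = -(1/m) * sum_i d_i * x_ik."""
    m = len(residuals)
    dim = len(features[0])
    return [-sum(d * x[k] for d, x in zip(residuals, features)) / m
            for k in range(dim)]
```

For example, `local_gradient([1.0, -1.0], [[1.0, 0.0], [0.0, 2.0]])` yields `[-0.5, 1.0]`.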
In practical implementation, the Arbiter party is mainly responsible for decrypting the encrypted model parameters sent by the Host and Guest parties and for coordinating the training and evaluation processes. To prevent the Arbiter party from learning relevant information about the Host or Guest parties, the Host or Guest party may add an encrypted random mask to its corresponding model parameters (e.g., gradient information). No iteration exposes any information to the Arbiter party, because all the Arbiter obtains are gradients hidden by random masking, which ensures randomness and confidentiality.
In step 307, the Arbiter party decrypts all gradients by using the private key in the public and private key pair and sends the decrypted gradients to the Guest party and the Host party.
In step 308, the Host party and the Guest party update the model parameters of the local models respectively, and determine whether the local models reach the convergence condition.
Here, the determination as to whether the convergence condition is reached may be to determine whether the model converges according to a gradient norm change.
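The gradient-norm-change convergence check mentioned here might be sketched as follows (the tolerance value is an assumption; the patent does not specify a threshold):

```python
def converged(prev_grad_norm, grad_norm, tol=1e-4):
    """Stop when the gradient norm changes by less than tol between rounds."""
    return abs(prev_grad_norm - grad_norm) < tol
```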
Steps 303 to 308 are iterated until all the data has been used for modeling.
Steps 302 to 308 are iterated until the convergence condition is reached or the maximum number of model training rounds is reached.
Second-stage parameter aggregation: after the first-stage model training, a trusted third-party server (which may be the Arbiter of the first training stage) obtains the first model parameters and second model parameters of the local models of the Guest and Host parties when they reach the convergence condition. The first and second model parameters are encrypted (for example, with the SHA-256 algorithm), and security parameter aggregation is performed on the encrypted first model parameters provided by the Guest party and the encrypted second model parameters provided by the Host parties to obtain aggregation model parameters. The number of parameter aggregations can be preset; in actual implementation, if the Poisson regression model needs to be applied in a rapid or preliminary modeling scenario, parameter aggregation can be performed only once, which reduces the number of interactions among the participants and the network communication overhead.
For the process of parameter aggregation in the second stage, referring to fig. 8, fig. 8 is an alternative schematic diagram of the training method of the poisson regression model provided in the embodiment of the present application.
In step 401, the Guest party and the Host party respectively send each encrypted model parameter and the data amount used for model modeling to the Arbiter party.
Here, the Guest party sends the encrypted model parameter [[w_g]] and the data quantity n_g used for model modeling, and the Host side sends the encrypted model parameter [[w_h]] and the data quantity n_h used for model modeling.
In step 402, the Arbiter performs a secure aggregation operation on each model parameter to obtain aggregation model parameters, where the aggregation model parameters include a first aggregation model parameter and a second aggregation model parameter.
Here, the Arbiter may be the Guest party itself.
In carrying out the parameter aggregation, the aggregation formula that can be used is as follows:

w̄ = Σ_i (n_i · [[w_i]]) / Σ_i n_i

The above formula can be understood as follows: the model parameters from each run are multiplied by the amount of data involved in that modeling run and then divided by the total modeled data amount, where w̄ is the aggregation model parameter obtained after secure aggregation, [[w_i]] is a received first or second model parameter, and n_g, n_h are the modeled data amounts of Guest and Host.
Illustratively, 1 Guest and 2 Hosts are modeled twice in the first stage: in the first modeling run the intersection data amount with Host1 is 1000, and in the second run the intersection data amount with Host2 is 2000, so the total modeled data amount is 1000 + 2000 = 3000, and the corresponding aggregation parameter is calculated as w̄ = (1000·[[w_1]] + 2000·[[w_2]]) / 3000.
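Reproducing the worked example numerically (the Guest weight values are illustrative, not from the patent):

```python
n1, n2 = 1000, 2000                            # intersection sizes with Host1, Host2
w_g1, w_g2 = 0.9, 0.6                          # illustrative Guest weights per run
w_agg = (n1 * w_g1 + n2 * w_g2) / (n1 + n2)    # weighted average over 3000 samples
```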
in step 403, Arbiter sends the first polymerization model parameter to the Guest party, and sends the second polymerization model parameter to the corresponding Host party.
In step 404, the Guest party decrypts the first aggregation model parameters and updates the model parameters of its local model according to the decrypted first aggregation model parameters; the Host side decrypts the second aggregation model parameters and updates the model parameters of its local model according to the decrypted second aggregation model parameters.
Here, after receiving the aggregation model parameters, the Guest party and the Host party screen model parameters matching with their respective local models, and update the model parameters of their respective local models.
The Guest party and the Host party iterate steps 401 to 404 until the Poisson regression model reaches the convergence condition or the preset number of training rounds is reached.
By integrating the Poisson regression model parameters of multiple participants through secure aggregation, the embodiment of the application enables multiple data providers to participate in training, expands the application range of the Poisson regression model in federated learning, allows multiple feature providers to participate in modeling at the same time, and improves the accuracy and applicability of model prediction.
Continuing with the exemplary structure of the poisson regression model training device 555 provided in the embodiments of the present application as a software module, in some embodiments, as shown in fig. 2, the software module stored in the poisson regression model training device 555 in the memory 540 may include:
an obtaining module 5551, configured to, by a parameter aggregation device, obtain at least two model parameter pairs corresponding to the poisson regression models, where the model parameter pairs and the poisson regression models are in a one-to-one correspondence relationship;
wherein each of the model parameter pairs comprises: model parameters of a first submodel in the first participant device and model parameters of a corresponding second submodel in the second participant device;
the aggregation module 5552 is configured to perform model parameter aggregation on a model parameter of a first sub-model and a model parameter of a second sub-model in the at least two model parameter pairs, respectively, to obtain a corresponding first aggregation model parameter and a corresponding second aggregation model parameter;
a training module 5553, configured to send the first aggregation model parameter to the first participant device, and send the second aggregation model parameter to each second participant device, so as to update the model parameter of the poisson regression model.
In some embodiments, the aggregating module 5552 is further configured to obtain a first weight of the model parameter of the first sub-model in each of the model parameter pairs and a second weight of the model parameter of the second sub-model in each of the model parameter pairs, respectively;
based on the obtained first weights, carrying out weighted summation on model parameters of at least two first submodels to obtain a first aggregation model parameter;
and carrying out weighted summation on the model parameters of at least two second submodels based on the obtained second weights to obtain second aggregation model parameters.
In some embodiments, the obtaining module 5551 is further configured to obtain a first number of training samples provided by the first participant device for performing model training with each of the second participant devices, a second number of training samples provided by each of the second participant devices, and a total number of training samples used for poisson regression model training;
determining a first ratio of the first number to the total number, the first ratio being the first weight;
determining a second ratio of the second number to the total number, the second ratio being used as a second weight of the model parameters of the respective second submodel.
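The ratio-based weight determination above can be sketched as follows (a hypothetical helper; names and the grouping of counts are assumptions for illustration):

```python
def sample_weights(first_counts, second_counts):
    """Each party's weight is its sample count divided by the total number
    of training samples, as in the ratio scheme described above."""
    total = sum(first_counts) + sum(second_counts)
    return ([n / total for n in first_counts],
            [n / total for n in second_counts])

# e.g. a first participant with 1000 samples and a second with 3000
first_w, second_w = sample_weights([1000], [3000])
```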
In some embodiments, when the number of the first participant devices is the same as the number of the second participant devices, the first participant devices are in a one-to-one correspondence with the second participant devices; the obtaining module 5551 is further configured to obtain a third number of training samples provided by the first participant device, a fourth number of training samples provided by the corresponding second participant device, and a total number of training samples used for poisson regression model training, respectively;
determining a third ratio of the third number to the total number, the third ratio being the first weight;
determining a fourth ratio of the fourth number to the total number, the fourth ratio being the second weight.
In some embodiments, a creating module is further configured to create a key pair, where the key pair includes a public key and a private key;
distributing the public key to the first participant device and the at least two second participant devices;
the public key is used for the first participant device and each second participant device to encrypt the model parameter based on the public key.
In some embodiments, the obtaining module 5551 is further configured to receive the model parameters of the encrypted first sub-model sent by the first participant device and the model parameters of the encrypted second sub-model sent by the at least two second participant devices;
constructing at least two encrypted model parameter pairs corresponding to the Poisson regression model based on the encrypted model parameters of the first sub-model and the encrypted model parameters of the second sub-model;
and respectively decrypting the model parameter of the first sub-model and the model parameter of the second sub-model in the encrypted model parameter pair based on the private key to obtain the decrypted model parameter pair.
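As a purely illustrative stand-in for the encrypt/decrypt flow above (a real system would use a homomorphic cryptosystem such as Paillier; the reversible multiplicative mask below only demonstrates the key-pair round trip and is not the patent's scheme):

```python
import random

def make_key_pair(seed=42):
    """Toy reversible mask standing in for a public/private key pair."""
    rng = random.Random(seed)
    mask = rng.uniform(1.5, 2.5)
    return mask, 1.0 / mask          # ("public" key, "private" key)

def encrypt(param, public):
    # Parties mask their model parameter with the distributed "public" key.
    return param * public

def decrypt(cipher, private):
    # The aggregation device removes the mask with the "private" key.
    return cipher * private
```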
In some embodiments, when the number of the first participant devices is at least two, the aggregating module 5552 is further configured to obtain a mapping relationship between the first participant device and the second participant device, where the mapping relationship is used to indicate the second participant devices performing poisson regression model training with the first participant devices;
constructing a model parameter group corresponding to each first participant device based on the mapping relation and the obtained at least two model parameter pairs;
wherein the model parameter set comprises a first number of model parameter pairs; the first number is the number of the second participant devices performing poisson regression model training with the first participant device;
performing parameter aggregation on the model parameter pairs in each model parameter group respectively to obtain initial aggregation parameters corresponding to each first sub-model and initial aggregation parameters corresponding to each second sub-model;
and carrying out parameter aggregation on the initial aggregation parameters corresponding to the first submodels to obtain first aggregation model parameters, and carrying out parameter aggregation on the initial aggregation parameters corresponding to the second submodels to obtain second aggregation model parameters.
In some embodiments, the obtaining module 5551 is further configured to receive a first notification message, where the first notification message is sent by a target second participant device in at least two second participant devices, and is used to indicate that a poisson regression model in which the target second participant device participates in training reaches a convergence condition;
storing a pair of model parameters when the Poisson regression model reaches a convergence condition based on the first notification message;
correspondingly, obtaining at least two model parameter pairs corresponding to the poisson regression model includes:
obtaining a model parameter pair when the Poisson regression models corresponding to other second participant devices except the target second participant device do not reach a convergence condition;
correspondingly, the aggregation module is further configured to perform model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the stored model parameter pairs and in the model parameter pairs of the Poisson regression models that have not reached the convergence condition, to obtain corresponding first aggregation model parameters and second aggregation model parameters.
In some embodiments, the training module 5553 is further configured to obtain a model convergence condition of the at least two poisson regression models;
based on the model convergence condition, when determining that a Poisson regression model meeting the model convergence condition exists in the at least two Poisson regression models, sending a second notification message to second participant equipment corresponding to the Poisson regression model meeting the model convergence condition;
wherein the second notification message is configured to notify the second participant device to stop training of the corresponding poisson regression model.
It should be noted that the description of the apparatus in the embodiment of the present application is similar to the description of the method embodiment, and has similar beneficial effects to the method embodiment, and therefore, the description is not repeated.
The embodiment of the present application provides a computer program product comprising a computer program that, when executed by a processor, implements the method for training a Poisson regression model provided in the embodiments of the present application.
Embodiments of the present application provide a computer-readable storage medium storing executable instructions, which when executed by a processor, will cause the processor to perform a method provided by embodiments of the present application, for example, a method for training a poisson regression model as shown in fig. 3.
In some embodiments, the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In conclusion, the embodiments of the present application can increase the number of feature providers for the Poisson regression model in federated learning, expand the application range of the Poisson regression model, and improve the model prediction capability.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.
Claims (13)
1. A method for training a Poisson regression model, which is applied to a federated learning system including at least one first participant device having a model training label, at least two second participant devices, and a parameter aggregation device, the method comprising:
the parameter aggregation equipment acquires at least two model parameter pairs corresponding to the Poisson regression models, wherein the model parameter pairs and the Poisson regression models are in one-to-one correspondence;
wherein each of the model parameter pairs comprises: model parameters of a first submodel in the first participant device and model parameters of a corresponding second submodel in the second participant device;
performing model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the at least two model parameter pairs respectively to obtain corresponding first aggregation model parameters and second aggregation model parameters;
and sending the first aggregation model parameters to the first participant equipment, and sending the second aggregation model parameters to each second participant equipment so as to update the model parameters of the Poisson regression model.
2. The method of claim 1, wherein the performing model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the at least two model parameter pairs to obtain corresponding first aggregation model parameters and second aggregation model parameters comprises:
respectively obtaining a first weight of a model parameter of a first submodel in each model parameter pair and a second weight of a model parameter of a second submodel in each model parameter pair;
based on the obtained first weights, carrying out weighted summation on model parameters of at least two first submodels to obtain a first aggregation model parameter;
and carrying out weighted summation on the model parameters of at least two second submodels based on the obtained second weights to obtain second aggregation model parameters.
3. The method of claim 2, wherein separately obtaining a first weight of a model parameter of a first sub-model in each of the pairs of model parameters and a second weight of a model parameter of a second sub-model in each of the pairs of model parameters comprises:
acquiring a first number of training samples provided by the first participant device for model training with the second participant devices, a second number of training samples provided by the second participant devices, and a total number of training samples for Poisson regression model training;
determining a first ratio of the first number to the total number, the first ratio being the first weight;
determining a second ratio of the second number to the total number, the second ratio being used as a second weight of the model parameters of the respective second submodel.
4. The method of claim 2, wherein separately obtaining a first weight of a model parameter of a first sub-model in each of the pairs of model parameters and a second weight of a model parameter of a second sub-model in each of the pairs of model parameters comprises:
when the number of the first participant devices is the same as the number of the second participant devices, the first participant devices and the second participant devices are in one-to-one correspondence;
respectively acquiring a third number of training samples provided by the first participant device, a fourth number of training samples provided by the corresponding second participant device, and a total number of training samples used for poisson regression model training;
determining a third ratio of the third number to the total number, the third ratio being the first weight;
determining a fourth ratio of the fourth number to the total number, the fourth ratio being the second weight.
5. The method according to claim 1, wherein before the parameter aggregation device obtains at least two corresponding pairs of model parameters of the poisson regression model, the method comprises:
creating a key pair, the key pair comprising a public key and a private key;
distributing the public key to the first participant device and the at least two second participant devices;
the public key is used for the first participant device and each second participant device to encrypt the model parameter based on the public key.
6. The method according to claim 5, wherein the obtaining, by the parameter aggregation device, at least two corresponding pairs of model parameters of the Poisson regression model comprises:
receiving the model parameters of the encrypted first sub-model sent by the first participant device and the model parameters of the encrypted second sub-model sent by the at least two second participant devices;
constructing at least two encrypted model parameter pairs corresponding to the Poisson regression model based on the encrypted model parameters of the first sub-model and the encrypted model parameters of the second sub-model;
and respectively decrypting the model parameter of the first sub-model and the model parameter of the second sub-model in the encrypted model parameter pair based on the private key to obtain the decrypted model parameter pair.
7. The method of claim 1, wherein the performing model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the at least two model parameter pairs to obtain corresponding first aggregation model parameters and second aggregation model parameters comprises:
when the number of the first participant devices is at least two, obtaining a mapping relationship between the first participant device and the second participant device, where the mapping relationship is used to indicate the second participant devices performing poisson regression model training with the first participant devices;
constructing a model parameter group corresponding to each first participant device based on the mapping relation and the obtained at least two model parameter pairs;
wherein the model parameter set comprises a first number of model parameter pairs; the first number is the number of the second participant devices performing poisson regression model training with the first participant device;
performing parameter aggregation on the model parameter pairs in each model parameter group respectively to obtain initial aggregation parameters corresponding to each first sub-model and initial aggregation parameters corresponding to each second sub-model;
and carrying out parameter aggregation on the initial aggregation parameters corresponding to the first submodels to obtain first aggregation model parameters, and carrying out parameter aggregation on the initial aggregation parameters corresponding to the second submodels to obtain second aggregation model parameters.
8. The method of claim 1, further comprising:
receiving a first notification message sent by a target second participant device among the at least two second participant devices, the first notification message indicating that the Poisson regression model in whose training the target second participant device participates has reached a convergence condition;
storing, based on the first notification message, the model parameter pair obtained when the Poisson regression model reached the convergence condition;
correspondingly, the obtaining at least two model parameter pairs corresponding to the Poisson regression models comprises:
obtaining the model parameter pairs of the Poisson regression models that correspond to the second participant devices other than the target second participant device and have not reached a convergence condition;
correspondingly, the performing model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the at least two model parameter pairs respectively to obtain corresponding first aggregation model parameters and second aggregation model parameters comprises:
performing model parameter aggregation on the stored model parameter pair together with the model parameters of the first sub-model and the model parameters of the second sub-model in the model parameter pairs of the Poisson regression models that have not reached the convergence condition, to obtain the corresponding first aggregation model parameters and second aggregation model parameters.
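Claim 8's treatment of an early-converged participant can be sketched as below: the parameter pair stored at convergence keeps contributing to aggregation alongside the pairs of the models still training. Averaging and flat parameter lists are again assumptions, and `aggregate_with_frozen` is an illustrative name not taken from the claim.

```python
def average(vectors):
    """Element-wise mean of equally shaped parameter vectors (assumed aggregation rule)."""
    n = len(vectors)
    return [sum(vals) / n for vals in zip(*vectors)]

def aggregate_with_frozen(stored_pairs, live_pairs):
    """stored_pairs: parameter pairs saved when their Poisson model converged.
    live_pairs: current pairs from models that have not reached convergence."""
    all_pairs = list(stored_pairs) + list(live_pairs)
    first_agg = average([p[0] for p in all_pairs])    # first sub-model parameters
    second_agg = average([p[1] for p in all_pairs])   # second sub-model parameters
    return first_agg, second_agg

stored = [([2.0], [1.0])]                    # target device converged earlier
live = [([4.0], [3.0]), ([6.0], [5.0])]      # still-training devices
fa, sa = aggregate_with_frozen(stored, live)
# fa == [4.0], sa == [3.0]
```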
9. The method of claim 1, wherein after sending the first aggregation model parameters to the first participant device and the second aggregation model parameters to each of the second participant devices, the method further comprises:
obtaining model convergence conditions of the at least two Poisson regression models;
based on the model convergence conditions, when it is determined that a Poisson regression model meeting its model convergence condition exists among the at least two Poisson regression models, sending a second notification message to the second participant device corresponding to the Poisson regression model meeting the model convergence condition;
wherein the second notification message is configured to notify the second participant device to stop training of the corresponding Poisson regression model.
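The post-aggregation check in claim 9 can be sketched as below. The claim does not specify what the convergence condition is, so a loss-difference threshold stands in for it here; the function names and the per-model loss records are illustrative assumptions.

```python
def converged(prev_loss, curr_loss, eps=1e-3):
    """Stand-in convergence condition: loss change below a threshold."""
    return abs(prev_loss - curr_loss) < eps

def notify_converged(models):
    """models: second-device id -> (previous loss, current loss).
    Returns ids of devices whose model met the condition and should be
    sent the second notification message to stop training."""
    return [mid for mid, (p, c) in models.items() if converged(p, c)]

models = {"B1": (0.500, 0.4999), "B2": (0.80, 0.60)}
stop_ids = notify_converged(models)
# stop_ids == ["B1"]
```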
10. A Poisson regression model training device applied to a federated learning system, wherein the system comprises a first participant device holding model training labels, at least two second participant devices, and a parameter aggregation device; the training device comprises:
an acquisition module, configured to acquire, by the parameter aggregation device, model parameter pairs corresponding to at least two Poisson regression models, the model parameter pairs being in one-to-one correspondence with the Poisson regression models;
wherein each of the model parameter pairs comprises: model parameters of a first submodel in the first participant device and model parameters of a corresponding second submodel in the second participant device;
an aggregation module, configured to respectively perform model parameter aggregation on the model parameters of the first sub-model and the model parameters of the second sub-model in the at least two model parameter pairs to obtain corresponding first aggregation model parameters and second aggregation model parameters;
and a sending module, configured to send the first aggregation model parameters to the first participant device and send the second aggregation model parameters to each second participant device, so as to update the model parameters of the Poisson regression models.
11. An electronic device, comprising:
a memory for storing executable instructions;
a processor for implementing the method of training a Poisson regression model as claimed in any one of claims 1 to 9 when executing executable instructions stored in the memory.
12. A computer-readable storage medium storing executable instructions for implementing the method of training a Poisson regression model according to any one of claims 1 to 9 when executed by a processor.
13. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the method of training a Poisson regression model according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110322421.6A CN112949866B (en) | 2021-03-25 | 2021-03-25 | Training method and device of poisson regression model, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112949866A true CN112949866A (en) | 2021-06-11 |
CN112949866B CN112949866B (en) | 2024-08-16 |
Family
ID=76228169
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110322421.6A Active CN112949866B (en) | 2021-03-25 | 2021-03-25 | Training method and device of poisson regression model, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112949866B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113536667A (en) * | 2021-06-22 | 2021-10-22 | 同盾科技有限公司 | Federal model training method and device, readable storage medium and equipment |
CN113704776A (en) * | 2021-07-15 | 2021-11-26 | 杭州医康慧联科技股份有限公司 | Machine learning method based on federal learning |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111340247A (en) * | 2020-02-12 | 2020-06-26 | 深圳前海微众银行股份有限公司 | Longitudinal federated learning system optimization method, device and readable storage medium |
CN111428884A (en) * | 2020-03-30 | 2020-07-17 | 深圳前海微众银行股份有限公司 | Federal modeling method, device and readable storage medium based on forward law |
WO2021008017A1 (en) * | 2019-07-17 | 2021-01-21 | 深圳前海微众银行股份有限公司 | Federation learning method, system, terminal device and storage medium |
CN112257873A (en) * | 2020-11-11 | 2021-01-22 | 深圳前海微众银行股份有限公司 | Training method, device, system, equipment and storage medium of machine learning model |
US20210073639A1 (en) * | 2018-12-04 | 2021-03-11 | Google Llc | Federated Learning with Adaptive Optimization |
Non-Patent Citations (1)
Title |
---|
ZHOU Jun; FANG Guoying; WU Nan: "A Survey of Security and Privacy Protection in Federated Learning", Journal of Xihua University (Natural Science Edition), no. 04, 10 July 2020 (2020-07-10) *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113159327B (en) | Model training method and device based on federal learning system and electronic equipment | |
US11100095B2 (en) | Service processing system and method based on blockchain | |
CN110245510B (en) | Method and apparatus for predicting information | |
CN113204787B (en) | Block chain-based federated learning privacy protection method, system, device and medium | |
US10790971B1 (en) | Preventing misrepresentation of input data by participants in a secure multi-party computation | |
CN113032840B (en) | Data processing method, device, equipment and computer readable storage medium | |
CN111784001B (en) | Model training method and device and computer readable storage medium | |
CN112347500B (en) | Machine learning method, device, system, equipment and storage medium of distributed system | |
CN113127916A (en) | Data set processing method, data processing device and storage medium | |
WO2022156594A1 (en) | Federated model training method and apparatus, electronic device, computer program product, and computer-readable storage medium | |
CN110610093A (en) | Distributed training in parametric datasets | |
CN112202554B (en) | Information processing method, device and equipment for generating key based on attribute of information | |
US20230029087A1 (en) | Systems and methods for managing secure sharing of online data | |
CN114785556B (en) | Encryption communication method, device, computer equipment and storage medium | |
WO2023124219A1 (en) | Joint learning model iterative update method, apparatus, system, and storage medium | |
Kaur et al. | Blockchain technology for convergence: an overview, applications, and challenges | |
CN112199697A (en) | Information processing method, device, equipment and medium based on shared root key | |
CN112765642A (en) | Data processing method, data processing apparatus, electronic device, and medium | |
CN112949866B (en) | Training method and device of poisson regression model, electronic equipment and storage medium | |
CN114417364A (en) | Data encryption method, federal modeling method, apparatus and computer device | |
CN116502732A (en) | Federal learning method and system based on trusted execution environment | |
CN109818965B (en) | Personal identity verification device and method | |
CN116186755A (en) | Privacy calculating method, device, terminal equipment and storage medium | |
CN117036023A (en) | Method, system, equipment and medium for identifying abnormal transaction network in meta universe | |
WO2022241945A1 (en) | Data processing method and device, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||