
CN114880449A - Reply generation method and device of intelligent question answering, electronic equipment and storage medium - Google Patents

Reply generation method and device of intelligent question answering, electronic equipment and storage medium Download PDF

Info

Publication number
CN114880449A
Authority
CN
China
Prior art keywords
answer
question
historical
training
answers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210542214.6A
Other languages
Chinese (zh)
Other versions
CN114880449B (en)
Inventor
林凌峰
李剑锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202210542214.6A priority Critical patent/CN114880449B/en
Publication of CN114880449A publication Critical patent/CN114880449A/en
Application granted granted Critical
Publication of CN114880449B publication Critical patent/CN114880449B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 - Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33 - Querying
    • G06F16/332 - Query formulation
    • G06F16/3329 - Natural language query formulation or dialogue systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 - Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33 - Querying
    • G06F16/3331 - Query processing
    • G06F16/334 - Query execution
    • G06F16/3344 - Query execution using natural language analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/044 - Recurrent networks, e.g. Hopfield networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/06 - Buying, selling or leasing transactions
    • G06Q30/0601 - Electronic shopping [e-shopping]
    • G06Q30/0613 - Third-party assisted

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Biophysics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Electrically Operated Instructional Devices (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to artificial intelligence technology and discloses a reply generation method for intelligent question answering, which comprises the following steps: acquiring a current question of a user, and screening historical question-answer information that meets a first preset condition from preset historical question-answer information as a reference corpus, wherein the historical question-answer information comprises historical questions and the historical answers corresponding to the historical questions; obtaining the opposite answers corresponding to the historical answers, and performing answer classification training on a pre-constructed question-answer model by using the historical answers and the opposite answers; performing answer prediction training, using the current question and the reference corpus, on the question-answer model that has completed answer classification training, and generating predicted answers; and selecting the answer that meets a fourth preset condition from the predicted answers as the final reply. The invention also provides a reply generation apparatus for intelligent question answering, an electronic device and a storage medium. The invention can improve the logical consistency of intelligent question answering.

Description

Reply generation method and device of intelligent question answering, electronic equipment and storage medium
Technical Field
The present invention relates to the field of artificial intelligence technologies, and in particular, to a reply generation method and apparatus for intelligent question answering, an electronic device, and a storage medium.
Background
Intelligent question answering is widely used in daily life and work, for example in intelligent customer service for online shopping and in self-service systems for common attendance questions.
At present, a common approach is to train a deep-learning-based question-answer model on dialogue corpora so that it generates a reply related to the user's current question. With this kind of reply generation, the final reply usually has some correlation with the user's current question and meets the basic requirements of the question-answering scenario, but the current reply may still conflict with, or be inconsistent with, historical replies.
Therefore, for the artificial intelligence to answer the user's question accurately, the logical consistency between earlier and later turns of the intelligent dialogue needs to be ensured, so that an appropriate reply can be given to the user's question.
Disclosure of Invention
The invention provides a reply generation method and apparatus for intelligent question answering, an electronic device and a storage medium, with the main aim of improving logical consistency in intelligent question answering.
In order to achieve the above object, the present invention provides a reply generation method for intelligent question answering, which comprises:
acquiring a current question of a user, and screening historical question-answer information meeting a first preset condition from preset historical question-answer information as reference corpora according to the current question, wherein the historical question-answer information comprises historical questions and historical answers corresponding to the historical questions;
obtaining an opposite answer corresponding to a historical answer in the reference corpus, and performing answer classification training on a pre-constructed question-answer model by using the historical answer and the opposite answer in the reference corpus, ending the answer classification training once it meets a second preset condition;
performing answer prediction training on the question-answer model which completes answer classification training by using the current question and the reference corpus until the answer prediction training meets a third preset condition, ending the answer prediction training, and generating a prediction answer;
and selecting a response meeting a fourth preset condition from the predicted responses as a final response.
Optionally, the screening, according to the current question, historical question-answer information meeting a first preset condition from preset historical question-answer information as a reference corpus, where the historical question-answer information includes historical questions and historical answers corresponding to the historical questions, includes:
respectively calculating the comprehensive similarity between each historical question in the preset historical question-answering information and the current question;
and selecting historical question-answer information corresponding to the historical questions with the comprehensive similarity meeting the first preset condition as reference corpora.
Optionally, the respectively calculating a comprehensive similarity between each historical question in the preset historical question-answer information and the current question includes:
extracting overlapped words of each historical question and the current question in a mode based on inverted indexes, and calculating a first similarity between each historical question and the current question according to the word frequency of the overlapped words;
converting each historical question and the current question into a low-dimensional word vector, and calculating a second similarity of each historical question word vector and the current question word vector by adopting a cosine similarity-based method;
extracting a word vector extreme value of each historical question word vector and the current question word vector, and calculating a third similarity of each historical question word vector and the current question word vector according to the word vector extreme value;
carrying out global word vector coding on each historical question and the current question by using a BERT-based model to obtain a global word vector of each historical question and the current question, and calculating a fourth similarity according to the global word vector of each historical question and the current question;
and normalizing the first similarity, the second similarity, the third similarity and the fourth similarity, calculating the mean value of all normalized similarities, and taking the mean value as the comprehensive similarity between the corresponding historical question and the current question.
Optionally, the performing response prediction training on the pre-constructed question-answer model by using the current question and the reference corpus to generate a prediction response includes:
performing text splicing on the reference corpus and the current question to obtain a question-answer training text;
performing word vector conversion operation on the question-answer training text to obtain a text vector;
according to the text vector, performing answer prediction on the current question by using a feedforward neural network layer in the pre-constructed question-answer model to obtain a predicted answer;
calculating a prediction loss value between the prediction answer and the current question real answer, and judging whether the prediction loss value meets the third preset condition;
when the predicted loss value does not meet the third preset condition, adjusting parameters of the pre-constructed question-answer model, and returning to the step of performing answer prediction on the current question by using a feedforward neural network layer in the pre-constructed question-answer model;
and when the prediction loss value meets a third preset condition, ending the prediction training of the pre-constructed question-answer model, and extracting the prediction answer.
Optionally, the calculating a prediction loss value between the predicted answer and the real answer to the current question comprises:
using the following loss function (presented as an image in the original publication; not reproduced here):
wherein p_i is the predicted answer, y_i is the real answer, Loss_1 is the prediction loss value between the predicted answer and the real answer, L is the number of neurons of the pre-constructed question-answer model, and m is the total number of neurons in the pre-constructed question-answer model.
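Since the published formula is available only as an image, the following is a hedged sketch of one loss consistent with the symbols just listed, a squared-error loss over the model's L outputs normalized by the total neuron count m; this is an assumption, not the claimed formula:

\mathrm{Loss}_1 = \frac{1}{m}\sum_{i=1}^{L}\left(p_i - y_i\right)^2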
Optionally, the performing answer classification training on the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus includes:
taking the historical answers in the reference corpus as positive labels of the answer classification training and taking the opposite answers as negative labels of the answer classification training;
performing answer classification training on the answer classification of the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus;
calculating a classification loss value corresponding to each positive label and each negative label by using a preset loss function, and judging whether the classification loss value meets the second preset condition;
when the classification loss value does not meet the second preset condition, adjusting parameters of the pre-constructed question-answer model, and returning to the step of performing answer classification training on answer classifications of the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus;
and when the classification loss value meets the second preset condition, finishing the classification training of the pre-constructed question-answering model.
Optionally, the selecting, as the final reply, a reply satisfying the fourth preset condition from the predicted replies includes:
calculating a confidence level for each of the predicted responses;
and extracting the predicted reply with the confidence coefficient meeting the fourth preset condition as the final reply.
In order to solve the above problem, the present invention further provides an answer generating device for intelligent question answering, the device comprising:
the training corpus selecting module is used for acquiring a current question of a user, and screening historical question-answer information meeting a first preset condition from preset historical question-answer information according to the current question as reference corpus, wherein the historical question-answer information comprises historical questions and historical answers corresponding to the historical questions;
the answer consistency training module is used for obtaining an opposite answer corresponding to a historical answer in the reference corpus, and performing answer classification training on a pre-constructed question-answer model by using the historical answer and the opposite answer in the reference corpus until the answer classification training meets a second preset condition, and ending the answer classification training;
the answer prediction module is used for performing answer prediction training on the question-answer model which completes answer classification training by using the current question and the reference corpus until the answer prediction training meets a third preset condition, ending the answer prediction training and generating a prediction answer;
and the answer screening module is used for selecting an answer meeting a fourth preset condition from the predicted answers as a final answer.
In order to solve the above problem, the present invention also provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform the method for generating answers to intelligent questions and answers described above.
In order to solve the above problem, the present invention also provides a computer-readable storage medium, in which at least one computer program is stored, the at least one computer program being executed by a processor in an electronic device to implement the answer generation method for smart question answering described above.
According to the embodiment of the invention, historical question-answer information that meets the first preset condition is screened from the preset historical question-answer information as the reference corpus according to the current question of the user, so that historical question-answer information irrelevant to the current question can be removed and the usefulness of the reference corpus is improved. Further, the opposite answers corresponding to the historical answers in the reference corpus are obtained, and answer classification training is performed on the pre-constructed question-answer model by using the historical answers in the reference corpus and the opposite answer corresponding to each historical answer, so that the pre-constructed question-answer model gains the ability to recognize logically consistent and logically inconsistent answers. Answer prediction training is then performed, using the current question and the reference corpus, on the question-answer model that has completed answer classification training, which ensures the logical consistency between the answers predicted by the question-answer model and the historical answers.
Drawings
Fig. 1 is a schematic flowchart of a reply generation method for intelligent question answering according to an embodiment of the present invention;
Fig. 2 is a detailed flowchart of one step of the reply generation method for intelligent question answering according to an embodiment of the present invention;
Fig. 3 is a detailed flowchart of one step of the reply generation method for intelligent question answering according to an embodiment of the present invention;
Fig. 4 is a detailed flowchart of one step of the reply generation method for intelligent question answering according to an embodiment of the present invention;
Fig. 5 is a functional block diagram of a reply generation apparatus for intelligent question answering according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of an electronic device for implementing the reply generation method for intelligent question answering according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The embodiment of the application provides a reply generation method for intelligent question answering. The execution subject of the reply generation method includes, but is not limited to, at least one of electronic devices such as a server and a terminal that can be configured to execute the method provided by the embodiments of the application. In other words, the reply generation method may be executed by software or hardware installed in a terminal device or a server device, and the software may be a blockchain platform. The server includes, but is not limited to, a single server, a server cluster, a cloud server or a cloud server cluster, and the like. The server may be an independent server, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a Content Delivery Network (CDN), and big data and artificial intelligence platforms.
Fig. 1 is a schematic flow diagram of a reply generation method for an intelligent question answering according to an embodiment of the present invention. In this embodiment, the method for generating answers to the intelligent question answering includes the following steps S1-S4:
s1, obtaining a current question of a user, and screening historical question-answer information meeting a first preset condition from preset historical question-answer information as a reference corpus according to the current question, wherein the historical question-answer information comprises historical questions and historical answers corresponding to the historical questions;
in this embodiment of the present invention, the first preset condition refers to a certain similarity threshold that needs to be satisfied by a similarity between a current question and a historical question in the historical question-answering information, for example, the similarity threshold may be set to 0.95.
According to the embodiment of the invention, the historical question-answer information meeting the first preset condition is screened from the preset historical question-answer information according to the current question as the reference corpus, so that the historical question-answer information irrelevant to the current question can be removed, and the use value of the reference corpus is improved.
As an embodiment of the present invention, the screening, according to the current question, historical question-answer information meeting a first preset condition from preset historical question-answer information as a reference corpus includes: respectively calculating the comprehensive similarity between each historical question and the current question; and selecting historical question-answer information corresponding to the historical questions with the comprehensive similarity meeting the first preset condition as reference corpora.
In the embodiment of the present invention, the preset similarity algorithm library refers to a set of similarity algorithms constructed from a plurality of similarity algorithms; for example, it includes, but is not limited to, inverted-index and word-vector based similarity calculation methods such as Greedy Matching and Vector Extrema, and similarity calculation methods based on the BERTScore pre-training model.
Furthermore, by respectively calculating the comprehensive similarity between each historical question and the current question, and jointly evaluating the similarities produced by multiple similarity calculation methods, the embodiment of the invention can further eliminate the influence of irrelevant historical question-answer information on reply generation.
In detail, referring to fig. 2, the calculating the comprehensive similarity between each of the historical questions and the current question respectively includes the following steps S100 to S104:
s100, extracting overlapped words of each historical question and the current question in a mode based on inverted index, and calculating a first similarity between each historical question and the current question according to the word frequency of the overlapped words;
s101, converting each historical question and the current question into low-dimensional word vectors, and calculating a second similarity of each historical question word vector and the current question word vector by adopting a cosine similarity-based method;
s102, extracting a word vector extreme value of each historical question word vector and the current question word vector, and calculating a third similarity of each historical question word vector and the current question word vector according to the word vector extreme value;
s103, carrying out global word vector coding on each historical question and the current question by using a BERT-based model to obtain a global word vector of each historical question and the current question, and calculating a fourth similarity according to the global word vector of each historical question and the current question;
s104, normalizing the first similarity, the second similarity, the third similarity and the fourth similarity, calculating the mean value of all normalized similarities, and taking the mean value as the comprehensive similarity between the corresponding historical question and the current question.
In the embodiment of the invention, the BERT model is a natural-language-processing deep learning model that is pre-trained on large-scale text data and then fine-tuned for downstream tasks in small-scale, data-limited business scenarios.
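For illustration, here is a minimal Python sketch of the four-way similarity combination in steps S100 to S104. It is a simplification under stated assumptions: whitespace tokenization, term-frequency-weighted word overlap, averaged word vectors, Vector-Extrema-style max pooling, and a BERT-style sentence encoder stand in for the concrete components the embodiment would use, and `word_vec` and `sent_vec` are hypothetical callables, not APIs taken from the patent.

```python
# Minimal sketch of the composite similarity in steps S100 to S104.
# `word_vec` (token -> embedding) and `sent_vec` (sentence -> global vector) are
# hypothetical callables standing in for the word-embedding table and the
# BERT-style encoder assumed by the embodiment.
from collections import Counter
import numpy as np

def _cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0

def composite_similarity(history_q, current_q, word_vec, sent_vec):
    h_toks, c_toks = history_q.split(), current_q.split()

    # S100: overlap-word similarity weighted by term frequency
    h_tf, c_tf = Counter(h_toks), Counter(c_toks)
    overlap = set(h_tf) & set(c_tf)
    s1 = sum(h_tf[w] + c_tf[w] for w in overlap) / (len(h_toks) + len(c_toks))

    # S101: cosine similarity of averaged low-dimensional word vectors
    h_vecs = np.array([word_vec(w) for w in h_toks])
    c_vecs = np.array([word_vec(w) for w in c_toks])
    s2 = _cosine(h_vecs.mean(axis=0), c_vecs.mean(axis=0))

    # S102: similarity from word-vector extreme values (Vector Extrema pooling)
    s3 = _cosine(h_vecs.max(axis=0), c_vecs.max(axis=0))

    # S103: cosine similarity of global (BERT-style) sentence vectors
    s4 = _cosine(np.asarray(sent_vec(history_q)), np.asarray(sent_vec(current_q)))

    # S104: normalize each similarity to [0, 1] and take the mean
    scores = [s1, (s2 + 1) / 2, (s3 + 1) / 2, (s4 + 1) / 2]
    return sum(scores) / len(scores)
```

The reference corpus would then be built from the historical question-answer pairs whose composite score clears the first preset threshold (for example the 0.95 mentioned above).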
S2, obtaining an opposite answer corresponding to the historical answer in the reference corpus, and performing answer classification training on the pre-constructed question-answer model by using the historical answer in the reference corpus and the opposite answer until the answer classification training meets a second preset condition, and ending the answer classification training;
in the embodiment of the invention, the pre-constructed question-answering model refers to a language task processing model constructed based on a pre-training model GPT.
In the embodiment of the present invention, when obtaining the opposite answer corresponding to a historical answer in the reference corpus, semantic analysis is performed on the historical answer through a pre-constructed semantic model, and an answer with the opposite semantics is generated according to the semantics of the historical answer as the opposite answer corresponding to that historical answer, where the pre-constructed semantic model may adopt an LDA model.
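As a purely illustrative stand-in for this LDA-based semantic inversion (the embodiment does not spell out the generation step, so the rule-based negation below is an assumption, not the patented approach), an opposite answer used as a negative label could be produced as follows:

```python
# Toy, rule-based stand-in for producing an "opposite answer" used as a negative
# label; the embodiment's LDA-based semantic inversion is not spelled out, so
# this negation scheme is purely an illustrative assumption.
def opposite_answer(historical_answer: str) -> str:
    flips = {"can": "cannot", "is": "is not", "will": "will not",
             "yes": "no", "support": "do not support"}
    words = historical_answer.split()
    flipped = [flips.get(w.lower(), w) for w in words]
    if flipped == words:  # nothing was flipped: negate the statement as a whole
        return "It is not the case that " + historical_answer
    return " ".join(flipped)
```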
According to the embodiment of the invention, the answer classification training is carried out on the pre-constructed question-answer model by acquiring the opposite answer corresponding to the historical answer in the reference corpus and utilizing the historical answer and the opposite answer in the reference corpus, so that the dimensionality of the training corpus during the answer classification training can be expanded, and the classification accuracy is improved.
In detail, referring to fig. 3, the performing answer classification training on the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus includes the following steps S200 to S204:
s200, taking the historical answer in the reference corpus as a positive label of the answer classification training, and taking the opposite answer as a negative label of the answer classification training;
s201, performing answer classification training on answer classifications of the pre-constructed question-answer model by using historical answers and the opposite answers in the reference corpus;
s202, calculating a classification loss value corresponding to each positive label and each negative label by using a preset loss function, and judging whether the classification loss value meets the second preset condition;
s203, when the classification loss value does not meet the second preset condition, adjusting parameters of the pre-constructed question-answer model, and returning to the step of performing answer classification training on answer classification of the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus;
and S204, when the classification loss value meets the second preset condition, finishing the classification training of the pre-constructed question-answer model.
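A minimal PyTorch-style sketch of steps S200 to S204 follows. It assumes the pre-constructed question-answer model exposes a binary classification head, uses binary cross-entropy as the preset loss function, and treats a simple loss threshold as the second preset condition; `model` and `encode` are hypothetical placeholders.

```python
# Hypothetical PyTorch-style sketch of steps S200 to S204 (answer classification
# training). `model` is assumed to expose a binary classification head and
# `encode` to map text to model inputs; both names are illustrative.
import torch
import torch.nn.functional as F

def train_answer_classifier(model, encode, historical_answers, opposite_answers,
                            loss_threshold=0.1, lr=1e-5, max_epochs=50):
    texts = historical_answers + opposite_answers       # S200: positive / negative labels
    labels = torch.tensor([1.0] * len(historical_answers) + [0.0] * len(opposite_answers))
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    for _ in range(max_epochs):                          # S201: classification training
        logits = torch.stack([model(encode(t)).squeeze() for t in texts])
        loss = F.binary_cross_entropy_with_logits(logits, labels)   # S202: loss value
        if loss.item() <= loss_threshold:                # S204: second preset condition met
            break
        optimizer.zero_grad()                            # S203: adjust parameters and retry
        loss.backward()
        optimizer.step()
    return model
```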
Further, the preset loss function is used to calculate the classification loss value corresponding to each of the positive labels and each of the negative labels, and the following loss function may be used:
(The loss function is presented as an image in the original publication and is not reproduced here.)
wherein y_i is the classification category, p_i is the predicted probability that the classification result is a positive label, and Loss_2 is the classification loss value for classifying the historical answer under the positive label category; and
wherein y_i is the classification category, p_i is the predicted probability that the classification result is a negative label, and Loss_2 is the classification loss value for classifying the historical answer under the negative label category.
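The symbols described above, a ground-truth category y_i and a predicted class probability p_i, match the standard binary cross-entropy loss. As a hedged sketch (an assumption, since the published formula is only available as an image), it can be written as:

\mathrm{Loss}_2 = -\left[\, y_i \log p_i + (1 - y_i)\log(1 - p_i) \,\right]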
S3, performing answer prediction training on the question-answer model which completes answer classification training by using the current question and the reference corpus until the answer prediction training meets a third preset condition, ending the answer prediction training, and generating a prediction answer;
in this embodiment of the present invention, the third preset condition refers to a loss value threshold requirement that needs to be met by a loss value in response to the prediction training.
In detail, referring to fig. 4, the performing answer prediction training on the question-answer model after answer classification training by using the current question and the reference corpus until the answer prediction training satisfies a third preset condition, ending the answer prediction training, and generating a predicted answer, includes the following steps S300 to S305:
s300, performing text splicing on the reference corpus and the current question to obtain a question and answer training text;
s301, performing word vector conversion operation on the question and answer training text to obtain a text vector;
s302, according to the text vector, performing answer prediction on the current question by using a feedforward neural network layer in the pre-constructed question-answer model to obtain a prediction answer;
s303, calculating a prediction loss value between the prediction answer and the current question real answer, and judging whether the prediction loss value meets the third preset condition;
s304, when the prediction loss value does not meet the third preset condition, adjusting parameters of the pre-constructed question-answer model, and returning to the step of performing answer prediction on the current question by using a feedforward neural network layer in the pre-constructed question-answer model;
s305, when the prediction loss value meets a third preset condition, ending the prediction training of the pre-constructed question-answer model, and extracting the prediction answer.
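A minimal sketch of steps S300 to S305, with the same caveats: `tokenize`, the prediction head inside `model`, the cross-entropy loss and the loss threshold standing in for the third preset condition are all assumptions made for illustration.

```python
# Hypothetical sketch of steps S300 to S305 (answer prediction training).
# `tokenize`, the prediction head inside `model`, the cross-entropy loss and the
# loss threshold (third preset condition) are all illustrative assumptions.
import torch
import torch.nn.functional as F

def train_answer_prediction(model, tokenize, reference_corpus, current_question,
                            target_ids, loss_threshold=0.5, lr=1e-5, max_steps=100):
    # S300: splice the reference corpus and the current question into one text
    training_text = " ".join(reference_corpus) + " " + current_question
    # S301: convert the spliced text into vectors (token ids here)
    input_ids = tokenize(training_text)

    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(max_steps):
        logits = model(input_ids)                        # S302: feed-forward prediction
        loss = F.cross_entropy(logits, target_ids)       # S303: loss vs. the real answer
        if loss.item() <= loss_threshold:                # S305: third preset condition met
            break
        optimizer.zero_grad()                            # S304: adjust parameters and retry
        loss.backward()
        optimizer.step()

    return model(input_ids).argmax(dim=-1)               # extract the predicted answer ids
```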
Further, the calculating a prediction loss value between the predicted answer and the real answer to the current question comprises:
using the following loss function (presented as an image in the original publication; not reproduced here):
wherein p_i is the predicted answer, y_i is the real answer, Loss_1 is the prediction loss value between the predicted answer and the real answer, L is the number of neurons of the pre-constructed question-answer model, and m is the total number of neurons in the pre-constructed question-answer model.
S4, selecting a response satisfying a fourth preset condition from the predicted responses as a final response.
In the embodiment of the present invention, the fourth preset condition refers to a preset confidence threshold that the confidence of a predicted reply needs to meet.
In detail, the selecting, from the predicted replies, a reply satisfying the fourth preset condition as the final reply includes: calculating the confidence of each predicted reply; and extracting the predicted reply whose confidence meets the fourth preset condition as the final reply.
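As a simple sketch of this selection step (the confidence measure is not specified here, so average per-token probability is an assumption), the final reply could be chosen as follows:

```python
# Simple sketch of step S4: keep the predicted reply whose confidence clears the
# fourth preset condition. Average per-token probability as the confidence
# measure is an assumption; the patent does not fix a particular definition.
import math

def select_final_reply(predicted_replies, token_logprobs, confidence_threshold=0.9):
    best_reply, best_conf = None, 0.0
    for reply, logprobs in zip(predicted_replies, token_logprobs):
        confidence = math.exp(sum(logprobs) / max(len(logprobs), 1))
        if confidence >= confidence_threshold and confidence > best_conf:
            best_reply, best_conf = reply, confidence
    return best_reply
```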
According to the embodiment of the invention, historical question-answer information that meets the first preset condition is screened from the preset historical question-answer information as the reference corpus according to the current question of the user, so that historical question-answer information irrelevant to the current question can be removed and the usefulness of the reference corpus is improved. Further, the opposite answers corresponding to the historical answers in the reference corpus are obtained, and answer classification training is performed on the pre-constructed question-answer model by using the historical answers in the reference corpus and the opposite answer corresponding to each historical answer, so that the pre-constructed question-answer model gains the ability to recognize logically consistent and logically inconsistent answers. Answer prediction training is then performed, using the current question and the reference corpus, on the question-answer model that has completed answer classification training, which ensures the logical consistency between the answers predicted by the question-answer model and the historical answers.
Fig. 5 is a functional block diagram of a reply generation apparatus for intelligent question answering according to an embodiment of the present invention.
The answer generating device 100 of the intelligent question answering according to the present invention may be installed in an electronic device. According to the implemented functions, the response generation apparatus 100 for intelligent question answering may include a corpus selecting module 101, a response consistency training module 102, a response predicting module 103, and a response filtering module 104. The module of the present invention, which may also be referred to as a unit, refers to a series of computer program segments that can be executed by a processor of an electronic device and that can perform a fixed function, and that are stored in a memory of the electronic device.
In the present embodiment, the functions regarding the respective modules/units are as follows:
the corpus selecting module 101 is configured to obtain a current question of a user, and screen historical question-answer information meeting a first preset condition from preset historical question-answer information according to the current question as a reference corpus, where the historical question-answer information includes historical questions and historical answers corresponding to the historical questions;
in this embodiment of the present invention, the first preset condition refers to a certain similarity threshold that needs to be satisfied by a similarity between a current question and a historical question in the historical question-answering information, for example, the similarity threshold may be set to 0.95.
According to the embodiment of the invention, the historical question-answer information meeting the first preset condition is screened from the preset historical question-answer information according to the current question as the reference corpus, so that the historical question-answer information irrelevant to the current question can be removed, and the use value of the reference corpus is improved.
As an embodiment of the present invention, the screening, according to the current question, historical question-answer information meeting a first preset condition from preset historical question-answer information as a reference corpus includes: respectively calculating the comprehensive similarity between each historical question and the current question; and selecting historical question-answer information corresponding to the historical questions with the comprehensive similarity meeting the first preset condition as reference corpora.
In the embodiment of the present invention, the preset similarity algorithm library refers to a set of similarity algorithms constructed from a plurality of similarity algorithms; for example, it includes, but is not limited to, inverted-index and word-vector based similarity calculation methods such as Greedy Matching and Vector Extrema, and similarity calculation methods based on the BERTScore pre-training model.
Furthermore, by respectively calculating the comprehensive similarity between each historical question and the current question, and jointly evaluating the similarities produced by multiple similarity calculation methods, the embodiment of the invention can further eliminate the influence of irrelevant historical question-answer information on reply generation.
In detail, the calculating the comprehensive similarity between each of the historical questions and the current question respectively includes: extracting overlapped words of each historical question and the current question in a mode based on inverted indexes, and calculating a first similarity between each historical question and the current question according to the word frequency of the overlapped words; converting each historical question and the current question into a low-dimensional word vector, and calculating a second similarity of each historical question word vector and the current question word vector by adopting a cosine similarity-based method; extracting a word vector extreme value of each historical question word vector and the current question word vector, and calculating a third similarity of each historical question word vector and the current question word vector according to the word vector extreme value; carrying out global word vector coding on each historical question and the current question by using a BERT-based model to obtain a global word vector of each historical question and the current question, and calculating a fourth similarity according to the global word vector of each historical question and the current question; and normalizing the first similarity, the second similarity, the third similarity and the fourth similarity, calculating the mean value of all normalized similarities, and taking the mean value as the comprehensive similarity between the corresponding historical question and the current question.
In the embodiment of the invention, the BERT model is a natural-language-processing deep learning model that is pre-trained on large-scale text data and then fine-tuned for downstream tasks in small-scale, data-limited business scenarios.
The reply consistency training module 102 is configured to obtain an opposite reply corresponding to a historical reply in the reference corpus, and perform reply classification training on a pre-constructed question-answer model by using the historical reply and the opposite reply in the reference corpus until the reply classification training meets a second preset condition, and then end the reply classification training;
in the embodiment of the invention, the pre-constructed question-answering model refers to a language task processing model constructed based on a pre-training model GPT.
In the embodiment of the present invention, when obtaining the opposite reply corresponding to a historical reply in the reference corpus, semantic analysis is performed on the historical reply through a pre-constructed semantic model, and a reply with the opposite semantics is generated according to the semantics of the historical reply as the opposite reply corresponding to that historical reply, where the pre-constructed semantic model may adopt an LDA model.
According to the embodiment of the invention, the answer classification training is carried out on the pre-constructed question-answer model by acquiring the opposite answer corresponding to the historical answer in the reference corpus and utilizing the historical answer and the opposite answer in the reference corpus, so that the dimensionality of the training corpus during the answer classification training can be expanded, and the classification accuracy is improved.
In detail, the performing response classification training on the pre-constructed question-answer model by using the historical responses and the opposite responses in the reference corpus includes: taking the historical answers in the reference corpus as positive labels of the answer classification training and taking the opposite answers as negative labels of the answer classification training; performing answer classification training on the answer classification of the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus; calculating a classification loss value corresponding to each positive label and each negative label by using a preset loss function, and judging whether the classification loss value meets the second preset condition; when the classification loss value does not meet the second preset condition, adjusting parameters of the pre-constructed question-answer model, and returning to the step of performing answer classification training on answer classifications of the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus; and when the classification loss value meets the second preset condition, finishing the classification training of the pre-constructed question-answering model.
Further, the preset loss function is used to calculate the classification loss value corresponding to each of the positive labels and each of the negative labels, and the following loss function may be used:
(The loss function is presented as an image in the original publication and is not reproduced here.)
wherein y_i is the classification category, p_i is the predicted probability that the classification result is a positive label, and Loss_2 is the classification loss value for classifying the historical answer under the positive label category; and
wherein y_i is the classification category, p_i is the predicted probability that the classification result is a negative label, and Loss_2 is the classification loss value for classifying the historical answer under the negative label category.
The answer prediction module 103 is configured to perform answer prediction training on the question-answer model that has completed answer classification training by using the current question and the reference corpus, and to end the answer prediction training and generate a predicted answer when the answer prediction training meets a third preset condition;
in this embodiment of the present invention, the third preset condition refers to a loss value threshold requirement that needs to be met by a loss value in response to the prediction training.
In detail, the performing answer prediction training on the question-answer model which is subjected to answer classification training by using the current question and the reference corpus until the answer prediction training meets a third preset condition, ending the answer prediction training, and generating a predicted answer includes: performing text splicing on the reference corpus and the current question to obtain a question-answer training text; performing word vector conversion operation on the question-answer training text to obtain a text vector; according to the text vector, performing answer prediction on the current question by using a feedforward neural network layer in the pre-constructed question-answer model to obtain a predicted answer; calculating a prediction loss value between the prediction answer and the current question real answer, and judging whether the prediction loss value meets the third preset condition or not; when the predicted loss value does not meet the third preset condition, adjusting parameters of the pre-constructed question-answer model, and returning to the step of performing answer prediction on the current question by using a feedforward neural network layer in the pre-constructed question-answer model; and when the prediction loss value meets a third preset condition, ending the prediction training of the pre-constructed question-answer model, and extracting the prediction answer.
Further, said calculating a predicted loss value between said predicted answer and said current question true answer comprises:
using the following loss function (presented as an image in the original publication; not reproduced here):
wherein p_i is the predicted answer, y_i is the real answer, Loss_1 is the prediction loss value between the predicted answer and the real answer, L is the number of neurons of the pre-constructed question-answer model, and m is the total number of neurons in the pre-constructed question-answer model.
The final answer screening module 104 is configured to select an answer satisfying a fourth preset condition from the predicted answers as a final answer.
In the embodiment of the present invention, the fourth preset condition refers to a preset confidence threshold that the confidence of a predicted reply needs to meet.
In detail, the selecting, from the predicted replies, a reply satisfying the fourth preset condition as the final reply includes: calculating the confidence of each predicted reply; and extracting the predicted reply whose confidence meets the fourth preset condition as the final reply.
Fig. 6 is a schematic structural diagram of an electronic device implementing a reply generation method for intelligent question answering according to an embodiment of the present invention.
The electronic device 1 may include a processor 10, a memory 11, a communication bus 12, and a communication interface 13, and may further include a computer program, such as a question and answer response generation program, stored in the memory 11 and executable on the processor 10.
In some embodiments, the processor 10 may be composed of an integrated circuit, for example, a single packaged integrated circuit, or may be composed of a plurality of integrated circuits packaged with the same function or different functions, and includes one or more Central Processing Units (CPUs), a microprocessor, a digital Processing chip, a graphics processor, a combination of various control chips, and the like. The processor 10 is a Control Unit (Control Unit) of the electronic device, connects various components of the whole electronic device by using various interfaces and lines, and executes various functions and processes data of the electronic device by running or executing programs or modules (for example, executing a reply generation program of a smart question and answer, etc.) stored in the memory 11 and calling data stored in the memory 11.
The memory 11 includes at least one type of readable storage medium including flash memory, removable hard disks, multimedia cards, card-type memory (e.g., SD or DX memory, etc.), magnetic memory, magnetic disks, optical disks, etc. The memory 11 may in some embodiments be an internal storage unit of the electronic device, for example a removable hard disk of the electronic device. The memory 11 may also be an external storage device of the electronic device in other embodiments, such as a plug-in mobile hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the electronic device. Further, the memory 11 may also include both an internal storage unit and an external storage device of the electronic device. The memory 11 may be used not only to store application software installed in the electronic device and various types of data, such as codes of a response generation program of a smart question and answer, etc., but also to temporarily store data that has been output or is to be output.
The communication bus 12 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. The bus is arranged to enable connection communication between the memory 11 and at least one processor 10 or the like.
The communication interface 13 is used for communication between the electronic device and other devices, and includes a network interface and a user interface. Optionally, the network interface may include a wired interface and/or a wireless interface (e.g., WI-FI interface, bluetooth interface, etc.), which are typically used to establish a communication connection between the electronic device and other electronic devices. The user interface may be a Display (Display), an input unit such as a Keyboard (Keyboard), and optionally a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable, among other things, for displaying information processed in the electronic device and for displaying a visualized user interface.
Fig. 6 only shows an electronic device with components, and it will be understood by a person skilled in the art that the structure shown in fig. 6 does not constitute a limitation of the electronic device 1, and may comprise fewer or more components than shown, or a combination of certain components, or a different arrangement of components.
For example, although not shown, the electronic device may further include a power supply (such as a battery) for supplying power to each component, and preferably, the power supply may be logically connected to the at least one processor 10 through a power management device, so that functions of charge management, discharge management, power consumption management and the like are realized through the power management device. The power supply may also include any component of one or more dc or ac power sources, recharging devices, power failure detection circuitry, power converters or inverters, power status indicators, and the like. The electronic device may further include various sensors, a bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
It is to be understood that the described embodiments are for purposes of illustration only and that the scope of the appended claims is not limited to such structures.
The answer generating program of the smart question-answer stored in the memory 11 of the electronic device 1 is a combination of a plurality of instructions, which when executed in the processor 10, can realize:
acquiring a current question of a user, and screening historical question-answer information meeting a first preset condition from preset historical question-answer information as reference corpora according to the current question, wherein the historical question-answer information comprises historical questions and historical answers corresponding to the historical questions;
obtaining an opposite answer corresponding to a historical answer in the reference corpus, and performing answer classification training on a pre-constructed question-answer model by using the historical answer and the opposite answer in the reference corpus, ending the answer classification training once it meets a second preset condition;
performing answer prediction training on the question-answer model which completes answer classification training by using the current question and the reference corpus until the answer prediction training meets a third preset condition, ending the answer prediction training, and generating a prediction answer;
and selecting a response meeting a fourth preset condition from the predicted responses as a final response.
Specifically, the specific implementation method of the instruction by the processor 10 may refer to the description of the relevant steps in the embodiment corresponding to the drawings, which is not described herein again.
Further, the integrated modules/units of the electronic device 1, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. The computer readable storage medium may be volatile or non-volatile. For example, the computer-readable medium may include: any entity or device capable of carrying said computer program code, recording medium, U-disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM).
The present invention also provides a computer-readable storage medium, storing a computer program which, when executed by a processor of an electronic device, may implement:
acquiring a current question of a user, and screening historical question-answer information meeting a first preset condition from preset historical question-answer information as reference corpora according to the current question, wherein the historical question-answer information comprises historical questions and historical answers corresponding to the historical questions;
obtaining an opposite answer corresponding to a historical answer in the reference corpus, and performing answer classification training on a pre-constructed question-answer model by using the historical answer and the opposite answer in the reference corpus, ending the answer classification training once it meets a second preset condition;
performing answer prediction training on the question-answer model which completes answer classification training by using the current question and the reference corpus until the answer prediction training meets a third preset condition, ending the answer prediction training, and generating a prediction answer;
and selecting a response satisfying a fourth preset condition from the predicted responses as a final response.
In the several embodiments provided in the present invention, it should be understood that the disclosed apparatus, device and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned.
The blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms and encryption algorithms. A blockchain is essentially a decentralized database: a series of data blocks associated using cryptographic methods, where each data block contains information about a batch of network transactions and is used to verify the validity (anti-counterfeiting) of the information and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
The embodiments of the present application may acquire and process related data based on artificial intelligence technology. Artificial intelligence (AI) refers to the theories, methods, techniques and application systems that use a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use the knowledge to obtain optimal results.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the system claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. A reply generation method for intelligent question answering, the method comprising:
acquiring a current question of a user, and screening, according to the current question, historical question-answer information that meets a first preset condition from preset historical question-answer information as a reference corpus, wherein the historical question-answer information comprises historical questions and historical answers corresponding to the historical questions;
obtaining opposite answers corresponding to the historical answers in the reference corpus, and performing answer classification training on a pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus until the answer classification training meets a second preset condition, and then ending the answer classification training;
performing answer prediction training on the question-answer model that has completed the answer classification training by using the current question and the reference corpus until the answer prediction training meets a third preset condition, ending the answer prediction training, and generating predicted answers;
and selecting, from the predicted answers, an answer that meets a fourth preset condition as a final answer.
2. The reply generation method for intelligent question answering according to claim 1, wherein the screening, according to the current question, of historical question-answer information meeting a first preset condition from the preset historical question-answer information as a reference corpus, the historical question-answer information including historical questions and historical answers corresponding to the historical questions, comprises:
respectively calculating a comprehensive similarity between each historical question in the preset historical question-answer information and the current question;
and selecting historical question-answer information corresponding to the historical questions with the comprehensive similarity meeting the first preset condition as reference corpora.
3. The reply generation method for intelligent question answering according to claim 2, wherein the respectively calculating of the comprehensive similarity between each historical question in the preset historical question-answer information and the current question comprises:
extracting, based on an inverted index, overlapping words between each historical question and the current question, and calculating a first similarity between each historical question and the current question according to the word frequencies of the overlapping words;
converting each historical question and the current question into low-dimensional word vectors, and calculating a second similarity between each historical question word vector and the current question word vector by using a cosine-similarity-based method;
extracting a word vector extreme value of each historical question word vector and of the current question word vector, and calculating a third similarity between each historical question word vector and the current question word vector according to the word vector extreme values;
carrying out global word vector coding on each historical question and the current question by using a BERT-based model to obtain a global word vector of each historical question and the current question, and calculating a fourth similarity according to the global word vector of each historical question and the current question;
and normalizing the first similarity, the second similarity, the third similarity and the fourth similarity, calculating the mean value of all normalized similarities, and taking the mean value as the comprehensive similarity between the corresponding historical question and the current question.
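A toy, runnable sketch of the combination step in claim 3 is given below. It assumes that "normalizing" means min-max scaling each of the four similarity types across all historical questions before averaging, and the numeric inputs are invented stand-ins for the inverted-index, averaged-word-vector, word-vector-extreme and BERT-based scores; none of the values or function names come from the patent.

```python
# Fuse four per-question similarity signals into one comprehensive similarity.

def min_max_normalize(scores):
    lo, hi = min(scores), max(scores)
    if hi == lo:                       # all historical questions scored the same
        return [1.0] * len(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def comprehensive_similarity(sim_table):
    """sim_table[k] holds the k-th similarity (first..fourth) for every historical
    question; returns one combined score per historical question."""
    normalized = [min_max_normalize(column) for column in sim_table]
    n_questions = len(sim_table[0])
    return [sum(col[i] for col in normalized) / len(normalized)
            for i in range(n_questions)]

# Example: three historical questions, four similarity signals each (made-up numbers).
first  = [0.42, 0.10, 0.77]   # word-overlap / term-frequency score
second = [0.65, 0.30, 0.81]   # cosine over averaged word vectors
third  = [0.58, 0.25, 0.74]   # cosine over word-vector extremes
fourth = [0.70, 0.20, 0.90]   # BERT global-vector similarity
print(comprehensive_similarity([first, second, third, fourth]))
```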
4. The reply generation method for intelligent question answering according to claim 1, wherein the performing answer prediction training on a pre-constructed question-answer model by using the current question and the reference corpus to generate predicted answers comprises:
performing text splicing on the reference corpus and the current question to obtain a question-answer training text;
performing word vector conversion operation on the question-answer training text to obtain a text vector;
according to the text vector, performing answer prediction on the current question by using a feedforward neural network layer in the pre-constructed question-answer model to obtain a predicted answer;
calculating a prediction loss value between the predicted answer and the real answer to the current question, and judging whether the prediction loss value meets the third preset condition;
when the prediction loss value does not meet the third preset condition, adjusting parameters of the pre-constructed question-answer model, and returning to the step of performing answer prediction on the current question by using the feedforward neural network layer in the pre-constructed question-answer model;
and when the prediction loss value meets the third preset condition, ending the prediction training of the pre-constructed question-answer model, and extracting the predicted answer.
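The training loop of claim 4 can be pictured with the PyTorch-style sketch below. The hashing tokenizer, embedding size, MSE loss and loss threshold are all invented stand-ins (the claim does not fix these details, and the patent's own loss is the one referenced in claim 5); only the loop structure follows the claim: splice the text, convert it to a vector, predict with a feed-forward layer, compute the loss, and adjust parameters until the third preset condition holds.

```python
import torch
import torch.nn as nn

EMBED_DIM, VOCAB = 32, 1000
LOSS_THRESHOLD = 0.05                      # stand-in for the "third preset condition"

embedding = nn.Embedding(VOCAB, EMBED_DIM)
feed_forward = nn.Sequential(nn.Linear(EMBED_DIM, 64), nn.ReLU(), nn.Linear(64, EMBED_DIM))
optimizer = torch.optim.Adam(list(embedding.parameters()) + list(feed_forward.parameters()), lr=1e-2)
loss_fn = nn.MSELoss()                     # illustrative loss, not the patented formula

def splice(reference_corpus, current_question):
    # Text splicing: concatenate the reference corpus with the current question.
    return " [SEP] ".join(list(reference_corpus) + [current_question])

def to_vector(text):
    # Toy word-vector conversion: hash tokens into the embedding table and average.
    ids = torch.tensor([hash(tok) % VOCAB for tok in text.split()])
    return embedding(ids).mean(dim=0)

def train_prediction(reference_corpus, current_question, true_answer_vector, max_steps=500):
    predicted = None
    for _ in range(max_steps):
        text_vector = to_vector(splice(reference_corpus, current_question))
        predicted = feed_forward(text_vector)             # answer prediction
        loss = loss_fn(predicted, true_answer_vector)     # prediction loss value
        if loss.item() < LOSS_THRESHOLD:                  # third preset condition met
            break
        optimizer.zero_grad()
        loss.backward()                                   # otherwise adjust parameters and retry
        optimizer.step()
    return predicted.detach()

reference = ["Q: how to reset a password A: use the reset link on the login page"]
target = torch.randn(EMBED_DIM)            # stand-in vector for the real answer
reply_vector = train_prediction(reference, "how do I reset my password", target)
```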
5. The reply generation method for intelligent question answering according to claim 4, wherein the calculating of the prediction loss value between the predicted answer and the real answer to the current question comprises:
calculating the prediction loss value by using the following formula:
[Formula FDA0003648699760000021]
wherein p_i is the predicted answer, y_i is the real answer, Loss_1 is the prediction loss value between the predicted answer and the real answer, L is the number of neuron cells of the pre-constructed question-answer model, and m is the total number of neuron cells in the pre-constructed question-answer model.
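The formula in claim 5 is filed as an image, so its exact form is not reproduced in this text. One conventional loss that is consistent with the listed symbols (predicted answer p_i compared against real answer y_i over the m neuron cells) is the averaged binary cross-entropy below; it is offered only as a plausible reading, not as the patented expression.

```latex
\mathrm{Loss}_1 = -\frac{1}{m}\sum_{i=1}^{m}\Big[\, y_i \log p_i + (1 - y_i)\log(1 - p_i) \Big]
```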
6. The reply generation method for intelligent question answering according to claim 1, wherein the performing answer classification training on the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus comprises:
taking the historical answers in the reference corpus as positive labels of the answer classification training and taking the opposite answers as negative labels of the answer classification training;
performing answer classification training on the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus;
calculating a classification loss value corresponding to each positive label and each negative label by using a preset loss function, and judging whether the classification loss value meets the second preset condition;
when the classification loss value does not meet the second preset condition, adjusting parameters of the pre-constructed question-answer model, and returning to the step of performing answer classification training on the pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus;
and when the classification loss value meets the second preset condition, ending the classification training of the pre-constructed question-answer model.
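As a sketch of claim 6's classification stage, the snippet below trains a small binary classifier with historical answers as positive labels and "opposite" answers as negative labels, stopping once the classification loss drops under an assumed threshold standing in for the second preset condition. The hashing encoder, network sizes and example texts are illustrative assumptions, not the patent's model.

```python
import torch
import torch.nn as nn

EMBED_DIM, VOCAB = 32, 1000
LOSS_THRESHOLD = 0.1                       # stand-in for the "second preset condition"

embedding = nn.Embedding(VOCAB, EMBED_DIM)
classifier = nn.Sequential(nn.Linear(EMBED_DIM, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(list(embedding.parameters()) + list(classifier.parameters()), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()

def encode(text):
    # Toy encoder: hash tokens into the embedding table and average the vectors.
    ids = torch.tensor([hash(tok) % VOCAB for tok in text.split()])
    return embedding(ids).mean(dim=0)

def train_answer_classifier(historical_answers, opposite_answers, max_steps=500):
    texts = historical_answers + opposite_answers
    labels = torch.tensor([1.0] * len(historical_answers) + [0.0] * len(opposite_answers))
    loss = None
    for _ in range(max_steps):
        logits = torch.stack([classifier(encode(t)).squeeze(-1) for t in texts])
        loss = loss_fn(logits, labels)     # classification loss over positive/negative labels
        if loss.item() < LOSS_THRESHOLD:   # second preset condition met: stop training
            break
        optimizer.zero_grad()
        loss.backward()                    # otherwise adjust parameters and retrain
        optimizer.step()
    return loss.item()

train_answer_classifier(
    ["Use the reset link on the login page.", "Branches open 9 am to 5 pm."],
    ["NOT: Use the reset link on the login page.", "NOT: Branches open 9 am to 5 pm."])
```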
7. The reply generation method for intelligent question answering according to claim 1, wherein the selecting, from the predicted answers, of an answer satisfying a fourth preset condition as a final answer comprises:
calculating a confidence of each of the predicted answers;
and extracting the predicted answer whose confidence meets the fourth preset condition as the final answer.
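A minimal sketch of claim 7's selection step, assuming each predicted answer already carries a confidence score and that the fourth preset condition is a simple confidence threshold (the threshold value and function name are illustrative):

```python
CONFIDENCE_THRESHOLD = 0.9                 # stand-in for the "fourth preset condition"

def select_final_answer(predicted_answers):
    """predicted_answers: list of (answer_text, confidence) pairs."""
    qualified = [(text, conf) for text, conf in predicted_answers
                 if conf >= CONFIDENCE_THRESHOLD]
    if not qualified:                      # no candidate meets the condition
        return None
    return max(qualified, key=lambda tc: tc[1])[0]   # highest-confidence qualifying answer

print(select_final_answer([("Answer A", 0.95), ("Answer B", 0.72), ("Answer C", 0.91)]))
```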
8. A reply generation apparatus for intelligent question answering, comprising:
the training corpus selecting module is used for acquiring a current question of a user, and screening, according to the current question, historical question-answer information meeting a first preset condition from preset historical question-answer information as a reference corpus, wherein the historical question-answer information comprises historical questions and historical answers corresponding to the historical questions;
the answer consistency training module is used for obtaining opposite answers corresponding to the historical answers in the reference corpus, and performing answer classification training on a pre-constructed question-answer model by using the historical answers and the opposite answers in the reference corpus until the answer classification training meets a second preset condition, and then ending the answer classification training;
the answer prediction module is used for performing answer prediction training on the question-answer model that has completed the answer classification training by using the current question and the reference corpus until the answer prediction training meets a third preset condition, ending the answer prediction training, and generating predicted answers;
and the answer screening module is used for selecting an answer meeting a fourth preset condition from the predicted answers as a final answer.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, and the computer program, when executed by the at least one processor, enables the at least one processor to perform the reply generation method for intelligent question answering according to any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the reply generation method for intelligent question answering according to any one of claims 1 to 7.
CN202210542214.6A 2022-05-17 2022-05-17 Method and device for generating answers of intelligent questions and answers, electronic equipment and storage medium Active CN114880449B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210542214.6A CN114880449B (en) 2022-05-17 2022-05-17 Method and device for generating answers of intelligent questions and answers, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210542214.6A CN114880449B (en) 2022-05-17 2022-05-17 Method and device for generating answers of intelligent questions and answers, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114880449A true CN114880449A (en) 2022-08-09
CN114880449B CN114880449B (en) 2024-05-10

Family

ID=82676115

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210542214.6A Active CN114880449B (en) 2022-05-17 2022-05-17 Method and device for generating answers of intelligent questions and answers, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114880449B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107807933A (en) * 2016-09-09 2018-03-16 阿里巴巴集团控股有限公司 A kind of answering method and device for puing question to
US20210035132A1 (en) * 2019-08-01 2021-02-04 Qualtrics, Llc Predicting digital survey response quality and generating suggestions to digital surveys
CN112052310A (en) * 2020-09-28 2020-12-08 平安普惠企业管理有限公司 Information acquisition method, device, equipment and storage medium based on big data
CN113836296A (en) * 2021-09-28 2021-12-24 平安科技(深圳)有限公司 Method, device, equipment and storage medium for generating Buddhist question-answer abstract

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115412745A (en) * 2022-08-12 2022-11-29 联想(北京)有限公司 Information processing method and electronic equipment
CN115412745B (en) * 2022-08-12 2024-02-27 联想(北京)有限公司 Information processing method and electronic equipment
CN115952274A (en) * 2023-03-10 2023-04-11 北京百度网讯科技有限公司 Data generation method, training method and device based on deep learning model

Also Published As

Publication number Publication date
CN114880449B (en) 2024-05-10

Similar Documents

Publication Publication Date Title
CN113822494B (en) Risk prediction method, device, equipment and storage medium
CN112988963A (en) User intention prediction method, device, equipment and medium based on multi-process node
CN113821622A (en) Answer retrieval method and device based on artificial intelligence, electronic equipment and medium
CN113378970A (en) Sentence similarity detection method and device, electronic equipment and storage medium
CN114781832A (en) Course recommendation method and device, electronic equipment and storage medium
CN113886691A (en) Intelligent recommendation method and device based on historical data, electronic equipment and medium
CN114880449B (en) Method and device for generating answers of intelligent questions and answers, electronic equipment and storage medium
CN113706291A (en) Fraud risk prediction method, device, equipment and storage medium
CN113887930A (en) Question-answering robot health degree evaluation method, device, equipment and storage medium
CN114398557A (en) Information recommendation method and device based on double portraits, electronic equipment and storage medium
CN113807973A (en) Text error correction method and device, electronic equipment and computer readable storage medium
CN112269875A (en) Text classification method and device, electronic equipment and storage medium
CN115510188A (en) Text keyword association method, device, equipment and storage medium
CN114595321A (en) Question marking method and device, electronic equipment and storage medium
CN114840684A (en) Map construction method, device and equipment based on medical entity and storage medium
CN113918704A (en) Question-answering method and device based on machine learning, electronic equipment and medium
CN112347739A (en) Application rule analysis method and device, electronic equipment and storage medium
CN112560427A (en) Problem expansion method, device, electronic equipment and medium
CN114625340B (en) Commercial software research and development method, device, equipment and medium based on demand analysis
CN114708073B (en) Intelligent detection method and device for surrounding mark and serial mark, electronic equipment and storage medium
CN115099680A (en) Risk management method, device, equipment and storage medium
CN115221274A (en) Text emotion classification method and device, electronic equipment and storage medium
CN113888265A (en) Product recommendation method, device, equipment and computer-readable storage medium
CN114676307A (en) Ranking model training method, device, equipment and medium based on user retrieval
CN113806540A (en) Text labeling method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant