CN112862305A - Method, device, equipment and storage medium for determining risk state of object
- Publication number: CN112862305A
- Application number: CN202110157249.3A
- Authority: CN (China)
- Legal status: Pending
Classifications
- G06Q10/0635—Risk analysis of enterprise or organisation activities
- G06F16/3344—Query execution using natural language analysis
- G06F40/289—Phrasal analysis, e.g. finite state techniques or chunking
- G06N3/02, G06N3/08—Neural networks; learning methods
- G06Q40/03—Credit; loans; processing thereof
- G06Q50/18—Legal services
Abstract
The present disclosure provides a method, an apparatus, a device and a storage medium for determining the risk state of an object, which are applicable to the technical field of information processing and, in particular, to the technical fields of big data, deep learning and natural language processing. The specific implementation scheme is as follows: stability information and public opinion information of a target object are acquired, the stability information including risk indication information of the target object and legal information associated with the target object; the confidence of the public opinion information is determined by using an emotion analysis model to obtain confidence information of the target object; target keywords are detected in the public opinion information and the legal information to obtain a detection result; and the risk state of the target object is determined according to the risk indication information, the confidence information and the detection result.
Description
Technical Field
The present disclosure relates to the field of information processing technologies, in particular to the technical fields of big data, deep learning and natural language processing, and more particularly to a method, an apparatus, a device and a storage medium for determining the risk state of an object.
Background
With economic development, more and more business entities enter the market. To maintain order in the market and allow users to select business entities that suit their needs, these entities usually need to be evaluated. For example, when a business entity applies for a loan, its assets are evaluated to determine whether it is at risk of being unable to repay the loan. When a business entity deals in hazardous chemicals, its safety risk needs to be evaluated so that its business scope can be reasonably managed and the safety of the hazardous chemicals purchased by users can be improved.
Disclosure of Invention
Provided are a method, an apparatus, a device and a storage medium for determining the risk state of an object, which can improve the accuracy of the determined risk state.
According to a first aspect, there is provided a method of determining a risk state of an object, comprising: acquiring stability information and public opinion information of a target object, wherein the stability information comprises risk indication information of the target object and legal information associated with the target object; determining the confidence of the public opinion information by using an emotion analysis model to obtain confidence information of the target object; detecting target keywords in the public opinion information and the legal information to obtain a detection result; and determining the risk state of the target object according to the risk indication information, the confidence information and the detection result.
According to a second aspect, there is provided an apparatus for determining a risk state of an object, comprising: an information obtaining module for acquiring stability information and public opinion information of a target object, wherein the stability information comprises risk indication information of the target object and legal information associated with the target object; a confidence determining module for determining the confidence of the public opinion information by using an emotion analysis model to obtain confidence information of the target object; a keyword detection module for detecting target keywords in the public opinion information and the legal information to obtain a detection result; and a risk state determination module for determining the risk state of the target object according to the risk indication information, the confidence information and the detection result.
According to a third aspect, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of determining a risk state of an object provided by the present disclosure.
According to a fourth aspect, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of determining a risk state of an object provided by the present disclosure.
According to a fifth aspect, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of determining a risk state of an object provided by the present disclosure.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a schematic diagram of an application scenario of the method, apparatus, device and storage medium for determining a risk state of an object according to an embodiment of the present disclosure;
FIG. 2 is a schematic flowchart of a method of determining a risk state of an object according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating the principle of obtaining confidence information of a target object according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating the principle of detecting target keywords to obtain a detection result according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the principle of obtaining public opinion information of a target object according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the principle of determining a risk state of an object according to an embodiment of the present disclosure;
FIG. 7 is a block diagram of an apparatus for determining a risk state of an object according to an embodiment of the present disclosure; and
FIG. 8 is a block diagram of an electronic device for implementing the method of determining a risk state of an object according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The present disclosure provides a method of determining a risk state of an object, which comprises an information acquisition process, a confidence determination process, a keyword detection process and a risk state determination process. In the information acquisition process, stability information and public opinion information of the target object are acquired, the stability information comprising risk indication information of the target object and legal information associated with the target object. In the confidence determination process, the confidence of the public opinion information is determined by using an emotion analysis model to obtain confidence information of the target object. In the keyword detection process, target keywords are detected in the public opinion information and the legal information to obtain a detection result. In the risk state determination process, the risk state of the target object is determined according to the risk indication information, the confidence information and the detection result.
An application scenario of the method and apparatus provided by the present disclosure will be described below with reference to fig. 1.
FIG. 1 is a diagram of an application scenario for a method, apparatus, device, medium, and program product for determining a risk status of an object according to embodiments of the present disclosure.
As shown in fig. 1, the application scenario 100 of this embodiment may include, for example, a first device 110 and a second device 120. The first device 110 and the second device 120 may communicate over a network, which may include, for example, a wired or wireless communication network.
According to an embodiment of the present disclosure, the first device 110 may provide, for example, a human-machine interaction interface, and a user may interact with the second device 120 using the first device 110 to receive or send a message, and the like. The first device 110 may have installed thereon various client applications such as, for example only, a shopping-type application, a web browser application, a search-type application, a web-disk-type application, a mailbox client, social platform software, and the like.
The first device 110 may be a variety of electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablets, laptop and desktop computers, and the like. The second device 120 may be an electronic device providing various services, such as a server, a desktop computer, and the like. For example, the second device 120 may be an application server, a server of a distributed system, or a server incorporating a blockchain. Alternatively, the server may also be a virtual server or a cloud server, etc. for providing support for running the client application installed to the first device 110.
In an embodiment, the second device 120 may, for example, analyze the request information sent by the first device 110, and feed back the processing result to the first device 110. The request information may be used to request any information such as audio, video, text, or image. For example, the requested information may be a risk status of the target object.
In one embodiment, as shown in fig. 1, the application scenario 100 further includes an information database 130 maintaining stability information and a public opinion database 140 maintaining public opinion information. The stability information may be information capable of indicating credit of each business entity in the market and business stability of each business entity, and the public opinion information may include news information generated by each news platform or text published by each social application. The information database 130 and the public opinion database 140 may be two independent databases, may also be virtual databases operating on the same physical device, or may also be two storage partitions in the same physical device.
In one embodiment, the second device 120, in response to the request information for acquiring the risk status of the target object, may perform a comprehensive evaluation of the stability information of the target object in the information database 130 and the public opinion information associated with the target object in the public opinion database 140, determine the risk status of the target object according to the evaluation result, and feed back the risk status of the target object to the first device 110.
It should be noted that the method of determining the risk state of an object provided by the present disclosure may be performed by the second device 120. Accordingly, the apparatus for determining the risk state of an object provided by the present disclosure may be arranged in the second device 120.
It should be understood that the types and numbers of the first device, the second device, the information database, and the public opinion database in fig. 1 are merely illustrative. There may be any type and number of the first device, the second device, the information database, and the public opinion database, according to implementation needs.
The method for determining the risk status of an object provided by the present disclosure will be described in detail with reference to fig. 2 to 6 in the application scenario described in fig. 1.
Fig. 2 is a schematic flowchart of a method of determining a risk state of an object according to an embodiment of the present disclosure.
As shown in fig. 2, the method 200 of determining a risk state of an object of this embodiment may include operations S210, S230, S250 and S270.
In operation S210, stability information and public opinion information of a target object are acquired.
According to the embodiment of the present disclosure, when information is acquired, the public opinion information associated with the target object is queried from the public opinion database described above by using the object identifier of the target object as the query condition, and the stability information of the target object is queried from the information database described above. The object identifier of the target object may be information that uniquely indicates the target object, such as the name or the number of the target object.
According to embodiments of the present disclosure, the target object may be an entity such as an enterprise, an individual or an organization. The stability information may include risk indication information and legal information. The risk indication information may be an evaluation value determined according to a historical risk state, or an evaluation value determined according to attribute information of the target object. The attribute information may include, for example, at least one of: the type of the target object, asset information of the target object, the age of the target object, and the like. The legal information may, for example, include at least one of the following information associated with the target object: official court documents, court announcements, information on parties subject to enforcement, records of dishonest (defaulting) parties, and the like. The legal information associated with the target object may include legal information relating to the target object itself, or legal information relating to other objects that have an association with the target object.
In one embodiment, the risk indication information may be determined according to the business scope of the target object. For example, if the target object is an individual, the risk indication information may be determined according to the business scope corresponding to the work performed by that individual. If the target object is an enterprise, the risk indication information may be determined according to the range of product categories the enterprise deals in. For example, when the target object is an enterprise operating hazardous chemicals, the risk indication information may be determined according to the types, the number of types and the properties of the hazardous chemicals the enterprise deals in. For instance, if the hazardous chemicals dealt in by the enterprise are flammable products with a low ignition point, the risk indication information indicates that the risk of the enterprise is high, i.e. its stability is poor. If the hazardous chemicals dealt in by the enterprise are flammable products with a high ignition point, the risk indication information indicates that the risk of the enterprise is low, i.e. its stability is good.
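As a concrete illustration of the rule above, the following sketch maps hypothetical hazardous-chemical attributes to a risk indication value. The attribute names, the ignition-point threshold and the per-category scores are assumptions introduced for this example and are not specified by the disclosure.

```python
# Illustrative only: deriving risk indication information from the business
# scope of an enterprise operating hazardous chemicals. The attribute names,
# the ignition-point threshold and the scores are assumptions.

def risk_indication_score(chemicals):
    """Return a higher score for a riskier business scope (poorer stability)."""
    score = 0.0
    for chem in chemicals:
        # A flammable product with a low ignition point implies higher risk.
        if chem.get("flammable") and chem.get("ignition_point_c", 1000) < 100:
            score += 3.0
        elif chem.get("flammable"):
            score += 1.0
        # Every additional category of hazardous chemical adds some risk.
        score += 0.5
    return score


if __name__ == "__main__":
    scope = [
        {"name": "solvent A", "flammable": True, "ignition_point_c": 45},
        {"name": "salt B", "flammable": False},
    ]
    print(risk_indication_score(scope))  # 4.0 -> relatively high risk
```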
According to an embodiment of the present disclosure, the public opinion information may include at least one of the following information generated by a news platform or a social platform: audio, video, text, and images. The public opinion information of the target object may be information in which an occurred event is recorded, and the occurred event may be an event occurred in the target object or an event occurred in another object associated with the target object.
In operation S230, the confidence of the public opinion information is determined using the emotion analysis model, and the confidence information of the target object is obtained.
According to an embodiment of the disclosure, an emotion analysis model (for example, a recurrent neural network model) may be used to determine the negative confidence of the public opinion information. The negative confidence may be used to characterize the degree of risk of the target object indicated by the content of the public opinion information: the greater the degree of risk, the greater the negative confidence. The confidence information of the target object in this embodiment may be determined according to the negative confidence of the public opinion information. For example, the confidence information of the target object may be a confidence value that is positively correlated with the negative confidence; the form of the positive correlation may be set according to actual requirements, and the present disclosure does not limit this.
In operation S250, a target keyword is detected in the public sentiment information and the legal information to obtain a detection result.
According to the embodiment of the present disclosure, the name of the target object, the public opinion information and the legal information may be concatenated and used as the input of a natural language processing model, and the detection result is output by the natural language processing model. The natural language processing model may be, for example, a convolutional neural network model. By considering the public opinion information and the legal information together when determining the detection result, the influence of misleading public opinion information on the detection result can be avoided, so the accuracy of the detection result can be improved to a certain extent. Alternatively, a natural language processing model may be used to extract keywords from the public opinion information and the legal information, the extracted keywords may be compared with words in a predetermined word bank, and the extracted keywords that belong to the predetermined word bank may be determined as target keywords. The detection result may be, for example, the number of detected target keywords.
According to the embodiment of the present disclosure, character recognition may be performed on the public opinion information and the legal information respectively, the number of words with negative sentiment contained in the public opinion information may be determined as a first number, and similarly the number of words with negative sentiment contained in the legal information may be determined as a second number. After the first number and the second number are obtained, a weighted sum of the first number and the second number may be computed according to the weights assigned to the public opinion information and the legal information, and this weighted sum may be used as the detection result, as sketched below.
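A minimal sketch of this weighted-count variant follows. The negative-word list and the weights assigned to the public opinion information and the legal information are assumptions, not values given by the disclosure.

```python
# Minimal sketch of the weighted-count detection result described above.
# The negative word list and the weights are assumptions.

NEGATIVE_WORDS = {"violation", "accident", "penalty", "default", "pollution"}


def count_negative(text):
    """Number of words with negative sentiment contained in the text."""
    return sum(1 for token in text.lower().split() if token in NEGATIVE_WORDS)


def detection_result(public_opinion_text, legal_text, w_opinion=0.6, w_legal=0.4):
    first_number = count_negative(public_opinion_text)  # from public opinion information
    second_number = count_negative(legal_text)          # from legal information
    return w_opinion * first_number + w_legal * second_number
```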
In operation S270, a risk status of the target object is determined according to the risk indication information, the confidence information, and the detection result.
According to the embodiments of the present disclosure, predetermined weights may be assigned in advance to the risk indication information, the confidence information and the detection result. Operation S270 may first convert the risk indication information, the confidence information and the detection result into numerical values, then determine the weighted sum of the three numerical values as the risk indication value of the target object according to the predetermined weights, and determine the risk state according to the risk indication value. The risk indication value may, for example, have a mapping relationship with the risk state. For example, if the risk indication value is higher than a first predetermined risk value, the risk state is a high-risk state; if the risk indication value is between the first predetermined risk value and a second predetermined risk value, the risk state is a medium-risk state; and if the risk indication value is less than the second predetermined risk value, the risk state is a low-risk state. It can be understood that the first predetermined risk value is greater than the second predetermined risk value, and the values of the two may be set according to actual requirements, which the present disclosure does not limit. In an embodiment, the determined risk indication value may also be used directly as the determined risk state.
According to the embodiment of the present disclosure, when the risk state of the target object is determined, the risk indication information, the confidence information and the detection result may each be converted into a numerical value on a predetermined scale, so as to obtain a first value, a second value and a third value respectively. For example, if the predetermined scale is a 100-point scale and the risk indication information is an evaluation value a whose maximum possible value is 31, the 100-point value A of the risk indication information is a × 100/31. After the first, second and third values are obtained, the weighted sum of the three values may be used as the risk indication value of the target object. In an embodiment where the weights assigned to the first, second and third values are all equal, the average of the three values may be used as the risk indication value of the target object. It is understood that the predetermined scale may be any scoring rule such as a 100-point, 10-point or 5-point scale and may be set according to actual requirements, which the present disclosure does not limit. The determined risk indication value may be indicative of the risk state of the target object; a minimal sketch of this computation follows.
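The sketch below follows the description above: the three inputs are converted to a 100-point scale, combined with predetermined weights, and the resulting risk indication value is mapped to a risk state. The maximum values, the weights and the two thresholds are assumptions introduced for illustration.

```python
# Minimal sketch of operation S270. The maximum values, weights and
# thresholds are assumptions.

def to_hundred_point(value, max_value):
    # e.g. an evaluation value a with a maximum of 31 becomes a * 100 / 31
    return value * 100.0 / max_value


def risk_state(risk_indication, confidence, detection,
               weights=(1.0 / 3, 1.0 / 3, 1.0 / 3),
               first_risk_value=70.0, second_risk_value=40.0):
    values = (
        to_hundred_point(risk_indication, 31.0),  # evaluation value, assumed max 31
        to_hundred_point(confidence, 1.0),        # confidence, assumed in [0, 1]
        to_hundred_point(detection, 20.0),        # keyword-based score, assumed max 20
    )
    risk_indication_value = sum(w * v for w, v in zip(weights, values))
    if risk_indication_value > first_risk_value:
        return "high risk"
    if risk_indication_value >= second_risk_value:
        return "medium risk"
    return "low risk"
```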
When the risk state of the target object is determined, the existing risk indication information is integrated with the confidence information determined from the public opinion information and with the detection result obtained from the public opinion information and the legal information of the target object, so that more factors are taken into account when determining the risk state. The risk state of the enterprise can therefore be evaluated more comprehensively, and the accuracy of the determined risk state is improved.
Fig. 3 is a schematic diagram of the principle of determining confidence information of a target object according to an embodiment of the present disclosure.
According to the embodiment of the present disclosure, when determining the confidence information of the target object, negative public opinion information may first be selected from the public opinion information according to the result of analyzing the confidence of the public opinion information, and the confidence information of the target object is then determined from the negative public opinion information. This avoids the influence of positive public opinion information on the confidence information of the target object, improves the accuracy of the determined confidence information, and improves processing efficiency by reducing the amount of information to be processed.
According to an embodiment of the disclosure, a positive confidence and a negative confidence of each piece of public opinion information may be determined first. The positive confidence represents the confidence that the content described by the piece of public opinion information is positive content; the negative confidence represents the confidence that the content is negative content. Negative content may indicate a degree of risk of the target object: the more negative the content, the higher the risk of the target object.
Illustratively, whether a piece of public opinion information is negative public opinion information may be determined according to the number of words with negative sentiment it contains. For example, if the proportion of words with negative sentiment to the total number of words in the piece of public opinion information exceeds a predetermined proportion, the piece of public opinion information is determined to be negative public opinion information. The predetermined proportion may be set according to actual requirements, which the present disclosure does not limit.
Illustratively, an emotion analysis model can be employed to determine positive and negative confidences of public opinion information. And determining whether the public opinion information is negative public opinion information according to the magnitude relation between the positive confidence coefficient and the negative confidence coefficient. For example, if the negative confidence is greater than the positive confidence, the public sentiment information may be determined to be negative public sentiment information.
According to the embodiment of the present disclosure, the emotion analysis model may be constructed based on a deep learning model, for example a recurrent neural network model such as a Long Short-Term Memory (LSTM) model or a Gated Recurrent Unit (GRU) model, or a Transformer-based model such as BERT (Bidirectional Encoder Representations from Transformers). By using an emotion analysis model constructed on a deep learning model, deep semantic analysis and information mining of the public opinion information can be performed, so that the accuracy of the determined confidence information of the target object can be improved.
Illustratively, the emotion analysis model can be constructed based on an emotion analysis operator (e.g., NLPC-sentment-Classify-104) maintained by a natural language Processing Cloud platform (NLPC).
For example, as in the embodiment 300 shown in fig. 3, a plurality of pieces of public opinion information may be obtained. The embodiment may poll each piece of public opinion information 311 of the plurality of pieces of public opinion information 310, and first determine a positive confidence 331 and a negative confidence 341 of each piece of public opinion information 311 by using the emotion analysis model 320. After the positive confidence 331 and the negative confidence 341 are obtained, the magnitude relationship 351 between them may be determined for each piece of public opinion information 311. By performing these operations on every piece of public opinion information polled from the plurality of pieces of public opinion information 310, the magnitude relationships between the positive and negative confidences of all pieces of public opinion information 310 are obtained, i.e. magnitude relationships in one-to-one correspondence with the plurality of pieces of public opinion information 310. According to the magnitude relationship between the positive confidence and the negative confidence of each piece of public opinion information, the negative public opinion information can be selected from the public opinion information. The negative public opinion information may be, for example, public opinion information whose positive confidence is smaller than its negative confidence, or public opinion information for which the difference between the negative confidence and the positive confidence is larger than a predetermined difference. The predetermined difference may be set according to actual requirements and may be any value greater than zero, which the present disclosure does not limit.
Illustratively, after the negative public opinion information 360 is obtained, the confidence information of the target object may be determined according to the positive confidence of the negative public opinion information 360 and the negative confidence of the negative public opinion information 360. For example, the confidence information of the target object may be determined according to the difference between the negative confidence and the positive confidence of the negative public opinion information 360. In the case where there are a plurality of negative public opinion information 360, the sum of a plurality of differences corresponding to the plurality of negative public opinion information 360 one to one may be used as the confidence information of the target object.
According to an embodiment of the present disclosure, when the number of the negative public opinion information 360 is determined to be large, the confidence information of the target object may also be determined to be a predetermined maximum confidence. When the number of the negative public opinion information 360 is small, the confidence information of the target object is determined according to the difference between the negative confidence and the positive confidence of the negative public opinion information 360.
For example, as shown in fig. 3, after the negative public opinion information 360 is obtained, the number of pieces of negative public opinion information may be determined to obtain the information count 370, and the confidence information 380 of the target object is then determined according to the size of the information count 370. For example, when the number of pieces of negative public opinion information 360 is equal to or greater than a first predetermined number, the confidence information 380 of the target object is determined to be a first predetermined value. When the number of pieces of negative public opinion information 360 is less than the first predetermined number, the difference between the negative confidence and the positive confidence of each piece of negative public opinion information may be determined as the difference for that piece, and the confidence information 380 of the target object is determined from the differences for all pieces of negative public opinion information. The first predetermined value used as the confidence of the target object may be equal to the first predetermined number, or may be positively correlated with it. It is understood that the first predetermined number and the confidence value used as the confidence information of the target object may be set according to actual requirements, which the present disclosure does not limit.
Illustratively, the difference of the negative confidence and the positive confidence may be employed to characterize the difference for each negative public opinion information. Or the square of the difference of the negative confidence and the positive confidence can be used to characterize the difference for each negative public opinion information. After obtaining the difference for each negative public opinion information, all values characterizing the differences of all negative public opinion information may be summed, and the summed value may be used as the confidence value of the target object.
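The confidence computation of FIG. 3 can be sketched as follows. Here `sentiment_model` is a hypothetical stand-in for the emotion analysis model (e.g. an LSTM, GRU or BERT classifier) and returns a positive and a negative confidence for a piece of public opinion text; the first predetermined number and the first predetermined value are assumptions.

```python
# Minimal sketch of the confidence computation of FIG. 3. `sentiment_model`
# is a hypothetical callable returning (positive_confidence,
# negative_confidence); the thresholds are assumptions.

def confidence_information(opinion_texts, sentiment_model,
                           first_predetermined_number=5,
                           first_predetermined_value=5.0):
    # Keep only negative public opinion: negative confidence > positive confidence.
    negatives = []
    for text in opinion_texts:
        positive, negative = sentiment_model(text)
        if negative > positive:
            negatives.append((positive, negative))

    # Many negative items: use the predetermined maximum confidence directly.
    if len(negatives) >= first_predetermined_number:
        return first_predetermined_value

    # Otherwise sum the differences between negative and positive confidence.
    return sum(negative - positive for positive, negative in negatives)
```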
Fig. 4 is a schematic diagram illustrating a principle of detecting a target keyword to obtain a detection result according to an embodiment of the present disclosure.
According to the embodiment of the present disclosure, a predetermined word bank may be maintained in advance; it contains a plurality of words, each of which reflects a safety risk. When determining the detection result, this embodiment may perform semantic recognition on the public opinion information and the legal information and extract target keywords from each of them; that is, target keywords are recalled from the public opinion information and the legal information, and the number of recalled target keywords can reflect the degree of risk of the target object. For example, the greater the number of recalled target keywords, the higher the risk of the target object. Compared with semantic analysis of the entire information, this approach maintains detection precision while improving the efficiency of determining the risk state of the target object.
According to the embodiment of the disclosure, when target keywords are recalled from the public opinion information, the recall may be performed only on the negative public opinion information determined as described above, which improves both the accuracy and the recall rate of the target keyword recall.
Illustratively, as shown in fig. 4, the embodiment 400 may poll each piece of information 431 in the information set formed by the negative public opinion information 410 and the legal information 420. For each piece of information 431 obtained, the keywords in it that belong to the predetermined word bank 440 are detected, resulting in target keywords 451 for that piece of information. After the target keywords 451 for every piece of the negative public opinion information 410 and the legal information 420 have been obtained by polling, the total number of all obtained target keywords may be determined to obtain the total keyword count 460, and the detection result 470 can be determined from the total keyword count 460.
For example, when each piece of information is analyzed to obtain its target keywords, the piece of information may first be preprocessed by word segmentation and stop-word removal, yielding a plurality of character strings. Each of the character strings is then matched against the words in the predetermined word bank 440; when a matching word is found, that word is taken as a target keyword for the piece of information to which the character string belongs. During matching, for example, the similarity between a character string and a word in the predetermined word bank may be determined, and the two may be considered to match if the similarity is greater than a predetermined similarity. The similarity may be expressed, for example, by cosine similarity or by the Pearson correlation coefficient. It is understood that the predetermined similarity may be set according to actual requirements, which the present disclosure does not limit.
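The matching step can be sketched as below. The word bank, the stop-word list, the choice of a character-bigram cosine similarity and the similarity threshold are all assumptions made for this example; the disclosure only requires some similarity measure and threshold.

```python
# Minimal sketch of lexicon matching: preprocess each piece of information into
# character strings, then match each string against the predetermined word bank
# using a character-bigram cosine similarity. All constants are assumptions.

from collections import Counter
from math import sqrt

PREDETERMINED_WORD_BANK = {"aging", "overload", "violation", "leakage"}
STOP_WORDS = {"the", "a", "of", "and", "was"}


def bigrams(s):
    s = s.lower()
    return Counter(s[i:i + 2] for i in range(len(s) - 1))


def cosine(a, b):
    dot = sum(count * b[gram] for gram, count in a.items())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def target_keywords(text, predetermined_similarity=0.8):
    # Rough stand-in for word segmentation and stop-word removal.
    strings = [t for t in text.lower().split() if t not in STOP_WORDS]
    matches = []
    for s in strings:
        for word in PREDETERMINED_WORD_BANK:
            if cosine(bigrams(s), bigrams(word)) > predetermined_similarity:
                matches.append(word)
    return matches
```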
Illustratively, after each piece of information is preprocessed, a keyword recall model built on the predetermined word bank 440 may also be used to determine the target keywords for that piece of information. The keyword recall model may include a FastText fast text classifier, a Bi-LSTM-CRF model composed of a bidirectional long short-term memory network and a conditional random field, or the like. The input of the keyword recall model is the preprocessed information, and its output is the category of each character string in the preprocessed information, the categories including a risk word category constructed based on the words in the predetermined word bank 440. Character strings belonging to the risk word category are taken as target keywords for the piece of information; words in the risk word category are words indicating negative content.
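The classifier-based recall can be sketched with the open-source fastText package; the training-file path, the label names and the hyperparameters below are assumptions, and a Bi-LSTM-CRF variant would follow the same train/predict pattern.

```python
# Sketch of classifier-based keyword recall using the open-source fastText
# package (pip install fasttext). File path, labels and epochs are assumptions.

import fasttext

# One training example per line, e.g.
#   __label__risk equipment aging caused repeated failures
#   __label__other quarterly report released on schedule
model = fasttext.train_supervised(input="keyword_recall_train.txt", epoch=25)


def recall_risk_strings(strings):
    """Return the preprocessed strings classified into the risk word category."""
    risky = []
    for s in strings:
        labels, _probs = model.predict(s, k=1)
        if labels[0] == "__label__risk":
            risky.append(s)
    return risky
```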
According to an embodiment of the present disclosure, the detection result may be, for example, a risk value indicating the magnitude of risk. The total keyword count 460 and the detection result 470 can be positively correlated, so after the total keyword count 460 is obtained, the risk value can be calculated according to this positive correlation. The form of the positive correlation may be determined according to the scale adopted for the risk value, or according to actual requirements, which the present disclosure does not limit.
For example, when the total number of keywords is large, the risk value may simply be set to a maximum value. It may be determined whether the total number of keywords is greater than or equal to a second predetermined number; if so, the detection result (i.e. the risk value) is determined to be a second predetermined value; otherwise, the detection result is determined according to the positive correlation between the risk value and the total number of keywords. When the total number of keywords is smaller than the second predetermined number, the total number of keywords itself may, for example, be used as the detection result. The second predetermined value may be equal to the second predetermined number, or may be positively correlated with it. It is understood that the second predetermined number and the second predetermined value may be set according to actual requirements, which the present disclosure does not limit.
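A minimal sketch of this capped mapping from keyword count to detection result follows; the second predetermined number and the second predetermined value are assumptions.

```python
# Minimal sketch of turning the total number of recalled target keywords into
# the detection result. Both thresholds below are assumptions.

def detection_result_from_keywords(total_keywords,
                                   second_predetermined_number=10,
                                   second_predetermined_value=10.0):
    if total_keywords >= second_predetermined_number:
        return second_predetermined_value
    # Below the cap the risk value simply grows with the keyword count.
    return float(total_keywords)
```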
Illustratively, the predetermined word bank 440 may contain different words for different types of target objects. For example, when the target object is an entity enterprise, the words in the predetermined word bank 440 may include: equipment aging, equipment weight problems, equipment faults, process defects, device failures, non-conforming basic design, excessive pollutant emission, technical defects, design defects, equipment defects, misoperation, illegal operation, working without certification, working without training, operation in violation of regulations, non-standard operation, and the like. When the target object is a financial institution, the words may include, for example: absconding, being blacklisted, indebtedness, and the like. It is to be understood that the words listed above are merely examples to facilitate understanding of the present disclosure, and the present disclosure is not limited thereto.
Fig. 5 is a schematic diagram of a principle of obtaining public opinion information of a target object according to an embodiment of the present disclosure.
According to an embodiment of the present disclosure, as shown in fig. 5, for example, a mapping 540 between an object identifier 530 and public opinion information 510 may be stored in a public opinion database 550 of the embodiment 500, so as to obtain the public opinion information of a target object from the public opinion database 550 according to needs.
For example, after a news platform or the like generates new public opinion information 510, the embodiment of the disclosure may first extract entity features from the public opinion information 510 by using a lexical analysis model 520, and use a word representing an entity name in the extracted entity features as an object identifier 530. The information stored in the public opinion database 550 can be obtained by establishing a mapping 540 between the object identifier 530 and the public opinion information 510. In establishing the mapping 540, the object identifier 530 may be used as an index of the public opinion information 510, for example.
Illustratively, when public opinion information is acquired from the public opinion database 550, the public opinion information of the target object may be acquired from the predetermined public opinion database 550 according to the object identification of the target object. For example, the public opinion database 550 may be queried using the object identifier 560 of the target object as a query condition, and the public opinion information indexed by the object identifier 560 of the target object may be used as the public opinion information 570 of the target object.
Illustratively, the lexical analysis model 520 may employ a joint Chinese lexical analysis (LAC) model, which provides word segmentation, part-of-speech tagging and proper-name recognition. With this lexical analysis model, an organization name is extracted from the public opinion information, and this organization name can be used as the object identifier. In an embodiment, the nlpc-lac operator in the natural language processing cloud platform can be used as the lexical analysis model. The nlpc-lac operator is implemented using a stacked GRU model, and the training and prediction of the model can be implemented on PaddlePaddle. By using a lexical analysis model to determine the object identifier, this embodiment can improve the accuracy of the extracted object identifier and therefore the accuracy of the acquired public opinion information.
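The indexing scheme of FIG. 5 can be sketched as follows. Here `extract_org_name` is a hypothetical stand-in for the proper-name recognition of the LAC model, and the organization names are illustrative only.

```python
# Minimal sketch of FIG. 5: extract an organization name from each new piece of
# public opinion information and index the public opinion database by that
# object identifier. `extract_org_name` is a hypothetical placeholder.

from collections import defaultdict

KNOWN_ORGS = {"Acme Chemical Co.", "Example Finance Ltd."}  # illustrative only

public_opinion_db = defaultdict(list)


def extract_org_name(text):
    for org in KNOWN_ORGS:
        if org in text:
            return org
    return None


def ingest(text):
    org = extract_org_name(text)
    if org is not None:
        public_opinion_db[org].append(text)  # object identifier used as the index


def query(object_identifier):
    # Query condition: the object identifier of the target object.
    return public_opinion_db.get(object_identifier, [])
```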
The principle of determining the risk state of an object according to an embodiment of the present disclosure is described below as a whole, by way of example, with reference to fig. 6.
Fig. 6 is a schematic diagram of the principle of determining a risk status of a subject according to an embodiment of the present disclosure.
As shown in fig. 6, in this embodiment 600, the target object may be, for example, an enterprise operating hazardous chemicals. The assessment of such an enterprise may rely on the public opinion database 601 and the information database 602. The public opinion database 601 may be, for example, a user data warehouse that uses platform storage management, data construction process management and metadata management technologies to provide comprehensive, consistent, high-quality and analysis-oriented basic information on user behavior, including web texts associated with users. The information database 602 may be, for example, a database storing stability information, which holds information such as an enterprise's legal representative, registered capital, date of establishment, shareholders, business information, official court documents and intellectual property information.
In determining the risk state, the embodiment may, through operation S610, filter effective public opinion information, i.e. public opinion information about the target object, from the public opinion database 601. When filtering the information in the public opinion database 601, for example, only news information may be retained, so as to ensure the authority of the obtained public opinion information; the news information can be filtered out according to the source of each piece of information.
In determining the risk status, legal information of the target object may be further obtained from the information database 602 through operation S620, and a basic risk score of the enterprise operating the hazardous chemical may be obtained from the information database 602 through operation S630. The basic risk score may be used as the risk indication information described above, and will not be described herein.
For the filtered effective public opinion information, a sentiment tendency confidence may be added to each piece of public opinion information by using the emotion analysis operator in the NLPC (operation S640). The sentiment tendency confidences include the positive confidence and the negative confidence described above. A public opinion score is then obtained from the sentiment tendency confidences added to the effective public opinion information (operation S650) and used as the confidence information of the target object. Operation S650 can be implemented, for example, by the method described above of first determining the negative public opinion information and then determining the confidence information of the target object from the positive confidence and the negative confidence of the negative public opinion information, and details are not repeated here.
For the filtered effective public opinion information and the acquired legal information, safety production labels may be added as target keywords via operation S660. Operation S660 may be implemented by using the method described above of identifying each piece of information on the basis of the predetermined word bank to obtain its target keywords, and details are not repeated here.
According to the added safety production labels, a safety production label recall score may be obtained as the detection result described above through operation S670. Operation S670 may be implemented by using the method described above of determining the detection result from the total number of target keywords, and details are not repeated here.
After the public opinion score, the safe production label recall score, and the basic risk score are acquired, an enterprise risk assessment score may be determined through operation S680 to indicate a risk status of the target object. The operation S680 may be implemented by the method for determining the risk status of the target object according to the risk indication information, the confidence information, and the detection result, which is described above and will not be described herein again.
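The flow of FIG. 6 can be wired together as sketched below. It assumes the helper functions sketched earlier in this description are available; the database record formats, field names and the equal weighting of the three scores are assumptions, not requirements of the disclosure.

```python
# End-to-end sketch of the flow of FIG. 6 (S610-S680), reusing the earlier
# sketches (confidence_information, target_keywords,
# detection_result_from_keywords). Data formats and weights are assumptions.

def assess_enterprise(object_identifier, public_opinion_db, information_db,
                      sentiment_model):
    # S610: keep only effective public opinion (news items about the target object).
    opinions = [o for o in public_opinion_db.get(object_identifier, [])
                if o.get("source_type") == "news"]
    texts = [o["text"] for o in opinions]

    # S620 / S630: legal information and basic risk score from the information database.
    record = information_db[object_identifier]
    legal_text = record["legal_info"]
    basic_risk_score = record["basic_risk_score"]

    # S640 / S650: sentiment tendency confidences -> public opinion score.
    public_opinion_score = confidence_information(texts, sentiment_model)

    # S660 / S670: safety production labels (target keywords) -> recall score.
    labels = []
    for text in texts + [legal_text]:
        labels.extend(target_keywords(text))
    recall_score = detection_result_from_keywords(len(labels))

    # S680: combine the three scores (equal weights assumed) into the
    # enterprise risk assessment score indicating the risk state.
    return (basic_risk_score + public_opinion_score + recall_score) / 3.0
```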
The method described in this embodiment can remedy the shortcoming in the prior art that risk assessment of enterprises operating hazardous chemicals takes too few factors into account. By using natural language processing tools to mine and process information, it effectively increases the number of dimensions used in the risk assessment, so that users can assess the risk of enterprises operating hazardous chemicals more comprehensively.
Based on the method of determining the risk state of an object described above, the present disclosure also provides an apparatus for determining the risk state of an object. The apparatus is described in detail below with reference to fig. 7.
Fig. 7 is a block diagram of an apparatus for determining a risk state of an object according to an embodiment of the present disclosure.
As shown in fig. 7, the apparatus 700 for determining a risk state of an object of this embodiment may include an information obtaining module 710, a confidence determining module 730, a keyword detection module 750 and a risk state determination module 770.
The information obtaining module 710 is configured to obtain stability information and public opinion information of the target object, where the stability information includes risk indication information and legal information related to the target object. In an embodiment, the information obtaining module 710 may be configured to perform the operation S210 described above, which is not described herein again.
The confidence determining module 730 is configured to determine the confidence of the public opinion information by using the emotion analysis model to obtain the confidence information of the target object. In an embodiment, the confidence determining module 730 may be configured to perform the operation S230 described above, which is not described herein again.
The keyword detection module 750 is configured to detect a target keyword in the public sentiment information and the legal information to obtain a detection result. In an embodiment, the keyword detection module 750 may be configured to perform the operation S250 described above, which is not described herein again.
The risk status determining module 770 is configured to determine a risk status of the target object according to the risk indication information, the confidence information, and the detection result. In an embodiment, the risk status determining module 770 may be configured to perform the operation S270 described above, which is not described herein again.
According to an embodiment of the present disclosure, there are a plurality of pieces of public opinion information, and the confidence determination module 730 may include a confidence determination sub-module, a negative information determination sub-module, and an information determination sub-module. The confidence determination sub-module is configured to determine, for each piece of public opinion information among the plurality of pieces of public opinion information, the positive confidence and the negative confidence of that piece of public opinion information by using the emotion analysis model. The negative information determination sub-module is configured to determine the negative public opinion information among the plurality of pieces of public opinion information according to the magnitude relation between the positive confidence and the negative confidence of each piece of public opinion information. The information determination sub-module is configured to determine the confidence information of the target object according to the positive confidence and the negative confidence of the negative public opinion information.
According to the embodiment of the disclosure, the emotion analysis model is constructed based on a deep learning model.
According to an embodiment of the present disclosure, the information determination sub-module includes a number determination unit and a confidence determination unit. The number determination unit is configured to determine the number of pieces of negative public opinion information. The confidence determination unit is configured to: determine the confidence information of the target object as a first predetermined value when the number of pieces of negative public opinion information is greater than or equal to a first predetermined number; when the number of pieces of negative public opinion information is smaller than the first predetermined number, determine, for each piece of negative public opinion information, the difference between its positive confidence and its negative confidence as the confidence difference for that piece of negative public opinion information, and then determine the confidence information of the target object according to the confidence differences of the pieces of negative public opinion information.
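A minimal sketch of this aggregation is given below, reusing the PublicOpinionItem type from the earlier sketch. The first predetermined number, the first predetermined value, the score returned when no negative public opinion exists, and the linear mapping of the mean confidence difference are all assumptions made for illustration; the disclosure does not fix these choices.

```python
from typing import List


def confidence_information(items: List[PublicOpinionItem],
                           first_predetermined_number: int = 5,    # assumed threshold
                           first_predetermined_value: float = 0.0  # assumed floor score
                           ) -> float:
    """Aggregate per-item confidences into the target object's confidence information.

    Scores are assumed to lie on a 0-100 scale, with 100 meaning no negative
    public opinion at all; the disclosure does not prescribe a concrete scale.
    """
    # Negative public opinion: the negative confidence exceeds the positive confidence.
    negatives = [item for item in items
                 if item.negative_confidence > item.positive_confidence]
    if not negatives:
        return 100.0
    if len(negatives) >= first_predetermined_number:
        return first_predetermined_value
    # The confidence difference (positive minus negative) of each negative item lies
    # in [-1, 0); mapping its mean linearly onto (0, 100) is one assumed aggregation.
    mean_diff = sum(item.positive_confidence - item.negative_confidence
                    for item in negatives) / len(negatives)
    return 100.0 * (1.0 + mean_diff)
```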
According to an embodiment of the present disclosure, the keyword detection module 750 may include, for example, a keyword determination sub-module and a detection result determination sub-module. The keyword determination sub-module is configured to detect, for each piece of information among the negative public opinion information and the legal information, the keywords in that piece of information that belong to a predetermined lexicon, as the target keywords for that piece of information. The detection result determination sub-module is configured to determine the detection result according to the total number of target keywords for the negative public opinion information and the legal information. The predetermined lexicon includes a plurality of words, each of which reflects a safety risk.
According to the embodiment of the disclosure, the detection result determining submodule is configured to determine that the detection result is the second predetermined value when the total number of the target keywords is greater than or equal to the second predetermined number. The detection result determining submodule is further used for determining that the detection result is the total number of the target keywords under the condition that the total number of the target keywords is smaller than a second preset number.
According to an embodiment of the present disclosure, the information obtaining module 710 is configured to obtain the public opinion information of the target object from a predetermined public opinion information base according to the object identifier of the target object. The public opinion information stored in the predetermined public opinion information base is indexed by the object identifier related to that public opinion information, and the object identifier related to the public opinion information is obtained by analyzing the public opinion information with a lexical analysis model.
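The indexing and retrieval described here could look like the following sketch. The extract_object_identifiers callable stands in for the lexical analysis model (for example, a named-entity or word-segmentation step that returns the organization names mentioned in a document); it is a hypothetical placeholder rather than an interface defined by this disclosure.

```python
from collections import defaultdict
from typing import Callable, Dict, Iterable, List


def build_public_opinion_base(documents: Iterable[str],
                              extract_object_identifiers: Callable[[str], List[str]],
                              ) -> Dict[str, List[str]]:
    """Index public opinion documents by the object identifiers they mention."""
    base: Dict[str, List[str]] = defaultdict(list)
    for doc in documents:
        for object_id in extract_object_identifiers(doc):
            base[object_id].append(doc)
    return base


def get_public_opinion(base: Dict[str, List[str]], object_id: str) -> List[str]:
    """Retrieve the public opinion information stored under a target object's identifier."""
    return base.get(object_id, [])
```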
According to an embodiment of the present disclosure, the apparatus 700 for determining a risk status of an object further includes an indication information determining module, configured to determine risk indication information of the target object according to a service range related to the target object.
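For illustration only, the indication information determining module might derive the risk indication information from the target object's registered business scope as sketched below; the keyword list and the two-level output are assumptions, since the disclosure does not prescribe a concrete mapping.

```python
# Hypothetical keywords marking a business scope that involves dangerous chemicals.
DANGEROUS_SCOPE_KEYWORDS = ("dangerous chemicals", "hazardous chemicals", "explosives")


def risk_indication_from_scope(business_scope: str) -> str:
    """Derive risk indication information from the target object's business scope."""
    scope = business_scope.lower()
    if any(keyword in scope for keyword in DANGEROUS_SCOPE_KEYWORDS):
        return "high-risk business scope"
    return "ordinary business scope"
```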
According to an embodiment of the present disclosure, the risk status determination module 770 includes a numerical conversion sub-module and a status determination sub-module. The numerical conversion sub-module is configured to convert the risk indication information, the confidence information, and the detection result into numerical values on a predetermined scoring scale, respectively, so as to obtain a first numerical value, a second numerical value, and a third numerical value. The status determination sub-module is configured to determine, according to the predetermined weights assigned to the first, second, and third numerical values, a weighted sum of the first, second, and third numerical values, the weighted sum being used to indicate the risk status of the target object.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
Fig. 8 is a schematic block diagram of an electronic device 800 for implementing a method of determining a risk status of an object according to an embodiment of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 8, the device 800 includes a computing unit 801 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 802 or a computer program loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the device 800 can also be stored. The computing unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
A number of components in the device 800 are connected to the I/O interface 805, including: an input unit 806, such as a keyboard, a mouse, or the like; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, or the like; and a communication unit 809 such as a network card, modem, wireless communication transceiver, etc. The communication unit 809 allows the device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel or sequentially or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.
Claims (13)
1. A method of determining a risk status of an object, comprising:
acquiring stability information and public opinion information of a target object, wherein the stability information comprises risk indication information of the target object and legal information related to the target object;
determining a confidence of the public opinion information by using an emotion analysis model, so as to obtain confidence information of the target object;
detecting a target keyword in the public opinion information and the legal information to obtain a detection result; and
determining the risk status of the target object according to the risk indication information, the confidence information, and the detection result.
2. The method of claim 1, wherein there are a plurality of pieces of public opinion information, and the determining of the confidence of the public opinion information by using the emotion analysis model to obtain the confidence information of the target object comprises:
for each piece of public opinion information among the plurality of pieces of public opinion information, determining a positive confidence and a negative confidence of that piece of public opinion information by using the emotion analysis model;
determining negative public opinion information among the plurality of pieces of public opinion information according to a magnitude relation between the positive confidence and the negative confidence of each piece of public opinion information; and
determining the confidence information of the target object according to the positive confidence of the negative public opinion information and the negative confidence of the negative public opinion information.
3. The method of claim 2, wherein the emotion analysis model is constructed based on a deep learning model.
4. The method of claim 2, wherein the determining the confidence information of the target object according to the positive confidence of the negative public opinion information and the negative confidence of the negative public opinion information comprises:
determining the number of pieces of negative public opinion information;
determining the confidence information of the target object as a first predetermined value under the condition that the number of pieces of negative public opinion information is greater than or equal to a first predetermined number; and
under the condition that the number of pieces of negative public opinion information is less than the first predetermined number:
determining, for each piece of negative public opinion information, a difference between the positive confidence of that piece of negative public opinion information and the negative confidence of that piece of negative public opinion information, as a confidence difference for that piece of negative public opinion information; and
determining the confidence information of the target object according to the confidence differences for the pieces of negative public opinion information.
5. The method of claim 2, wherein the detecting a target keyword in the public opinion information and the legal information to obtain a detection result comprises:
for each piece of information among the negative public opinion information and the legal information, detecting keywords in that piece of information that belong to a predetermined lexicon, as target keywords for that piece of information; and
determining the detection result according to the total number of target keywords for the negative public opinion information and the legal information,
wherein the predetermined lexicon comprises a plurality of words, each of which reflects a safety risk.
6. The method of claim 5, wherein the determining the detection result according to the total number of target keywords for the negative public opinion information and the legal information comprises:
determining the detection result to be a second predetermined value under the condition that the total number of the target keywords is greater than or equal to a second predetermined number; and
determining the detection result to be the total number of the target keywords under the condition that the total number of the target keywords is less than the second predetermined number.
7. The method of claim 1, wherein the acquiring of the public opinion information of the target object comprises:
acquiring the public opinion information of the target object from a preset public opinion information base according to the object identification of the target object,
the public opinion information stored in the preset public opinion information base takes an object identification related to the public opinion information as an index, and the object identification related to the public opinion information is obtained by analyzing the public opinion information through a lexical analysis model.
8. The method of claim 1, further comprising:
determining the risk indication information of the target object according to the service range related to the target object.
9. The method of claim 1, wherein determining the risk status of the target object comprises:
converting the risk indication information, the confidence information, and the detection result respectively into numerical values on a predetermined scoring scale, so as to obtain a first numerical value, a second numerical value, and a third numerical value; and
determining, according to predetermined weights assigned to the first, second, and third numerical values, a weighted sum of the first, second, and third numerical values for indicating the risk status of the target object.
10. An apparatus for determining a risk status of an object, comprising:
an information acquisition module, configured to acquire stability information and public opinion information of a target object, wherein the stability information comprises risk indication information of the target object and legal information related to the target object;
a confidence determination module, configured to determine a confidence of the public opinion information by using an emotion analysis model, so as to obtain confidence information of the target object;
a keyword detection module, configured to detect a target keyword in the public opinion information and the legal information to obtain a detection result; and
a risk status determination module, configured to determine a risk status of the target object according to the risk indication information, the confidence information, and the detection result.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.
12. A non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-9.
13. A computer program product comprising a computer program which, when executed by a processor, implements a method according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110157249.3A CN112862305A (en) | 2021-02-03 | 2021-02-03 | Method, device, equipment and storage medium for determining risk state of object |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112862305A true CN112862305A (en) | 2021-05-28 |
Family
ID=75988634
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110157249.3A Pending CN112862305A (en) | 2021-02-03 | 2021-02-03 | Method, device, equipment and storage medium for determining risk state of object |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112862305A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113990352A (en) * | 2021-10-22 | 2022-01-28 | 平安科技(深圳)有限公司 | User emotion recognition and prediction method, device, equipment and storage medium |
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130173264A1 (en) * | 2012-01-03 | 2013-07-04 | Nokia Corporation | Methods, apparatuses and computer program products for implementing automatic speech recognition and sentiment detection on a device |
CN108256078A (en) * | 2018-01-18 | 2018-07-06 | 北京百度网讯科技有限公司 | Information acquisition method and device |
CN108846547A (en) * | 2018-05-06 | 2018-11-20 | 成都信息工程大学 | A kind of Enterprise Credit Risk Evaluation method of dynamic adjustment |
WO2019227710A1 (en) * | 2018-05-31 | 2019-12-05 | 平安科技(深圳)有限公司 | Network public opinion analysis method and apparatus, and computer-readable storage medium |
CN109657894A (en) * | 2018-09-27 | 2019-04-19 | 深圳壹账通智能科技有限公司 | Credit Risk Assessment of Enterprise method for early warning, device, equipment and storage medium |
CN109684481A (en) * | 2019-01-04 | 2019-04-26 | 深圳壹账通智能科技有限公司 | The analysis of public opinion method, apparatus, computer equipment and storage medium |
CN109800976A (en) * | 2019-01-07 | 2019-05-24 | 平安科技(深圳)有限公司 | Investment decision methods, device, computer equipment and storage medium |
CN109993448A (en) * | 2019-04-08 | 2019-07-09 | 湖北风口网络科技有限公司 | A kind of appraisal procedure and system of enterprise network public sentiment potential risk |
CN110458399A (en) * | 2019-07-05 | 2019-11-15 | 深圳壹账通智能科技有限公司 | Risk information generation method, device, computer equipment and storage medium |
CN111914087A (en) * | 2020-07-30 | 2020-11-10 | 广州城市信息研究所有限公司 | Public opinion analysis method |
CN112035658A (en) * | 2020-08-05 | 2020-12-04 | 海纳致远数字科技(上海)有限公司 | Enterprise public opinion monitoring method based on deep learning |
Non-Patent Citations (2)
Title |
---|
Pan Haixia et al., "Deep Learning Engineer Certification: Elementary Tutorial", Beihang University Press, 31 March 2020, pages 202-203 *
Wei Hua; Li Huabiao, "Sentiment Analysis Based on Phrase Patterns", E-science Technology & Application, no. 6, pages 12-17 *
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| TA01 | Transfer of patent application right | Effective date of registration: 20240904. Applicant after: Big data center of emergency management department, No. A4, Hepingli District 9, Dongcheng District, Beijing 100013, China. Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd., 2/F, Baidu Building, 10 Shangdi 10th Street, Haidian District, Beijing 100085, China.