US20220334856A1 - Automated pop-up display alerting a user of a machine determined medical best practice recommendation - Google Patents
Automated pop-up display alerting a user of a machine determined medical best practice recommendation
- Publication number
- US20220334856A1 (application US 17/852,323)
- Authority
- US
- United States
- Prior art keywords
- user
- medical
- best practice
- machine determined
- machine
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/174—Form filling; Merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/28—Databases characterised by their database models, e.g. relational or object models
- G06F16/284—Relational databases
- G06F16/285—Clustering or classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
- G06F16/353—Clustering; Classification into predefined classes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
- G06N5/022—Knowledge engineering; Knowledge acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/04—Inference or reasoning models
- G06N5/046—Forward inferencing; Production systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/01—Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
- G06N5/022—Knowledge engineering; Knowledge acquisition
- G06N5/025—Extracting rules from data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N7/00—Computing arrangements based on specific mathematical models
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
Definitions
- the present invention relates to a computer program product, system, and method for a user interface for determining real-time changes to content entered into the user interface to provide to a classifier program and rules engine to generate results for the content
- Natural Language Processing (NLP) algorithms are used to classify information provided by a user via a computer user interface.
- NLP algorithms are used to process medical information from patient records or from clinician free form narration (audio or text) to extract clinical facts from the processed content. These extracted clinical facts may then be used to determine the completeness and accuracy of the gathered information, to alert the clinician of conditions to further explore and develop, to provide alerts, and to determine alternative hypotheses of diagnosis for the extracted facts. In this way, patient health records and doctor entered free form narration may be processed to provide diagnosis of patient conditions.
- An embodiment may comprise a computer program product, system, and method for determining real-time changes to content entered into a user interface to provide to a classifier program and rules engine to generate results for the content.
- First content from a user in a user input field is rendered in a user interface.
- a determination is made that entry of the first content is completed.
- the first content is provided to a classification program to classify the first content into a first machine classification to provide to a rules engine to apply a series of rules to determine a first machine determined proposition for the first machine classification.
- the first machine determined proposition is rendered in the user interface.
- a determination is made of second content in the user input field from the user that differs from the first content for which the first machine determined proposition is rendered.
- the second content is provided to the classification program to classify the second content into a second machine classification to provide to the rules engine to apply the series of rules to determine a second machine determined proposition for the second machine classification.
- the second machine determined proposition is rendered in the user interface with the second content.
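The classify-then-apply-rules flow above can be sketched in a few lines. This is a minimal illustration with a hypothetical keyword classifier and a hypothetical rule table standing in for the trained classifier program 132 and rules engine 134; the function and rule names are not from the patent.

```python
def classify(content: str) -> str:
    """Toy stand-in for the classifier program: maps user entered
    findings to a machine classification."""
    if "aneurysm" in content.lower():
        return "AAA"
    return "unclassified"

# Toy stand-in for the rules engine: one proposition per classification.
RULES = {
    "AAA": "Recommend follow-up imaging and vascular consultation.",
    "unclassified": "No best-practice recommendation available.",
}

def machine_proposition(content: str) -> str:
    """Run content through the classifier, then apply the rules engine
    to obtain the machine determined proposition to render."""
    classification = classify(content)
    return RULES[classification]
```

Re-running `machine_proposition` whenever the content changes yields the first and second machine determined propositions described above.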
- a first fingerprint is generated from the first content that uniquely identifies the first content.
- a second fingerprint is generated from the second content that uniquely identifies the second content.
- a determination is made as to whether the first fingerprint differs from the second fingerprint, wherein the second content is provided to the classifier program in response to determining that the first fingerprint is different from the second fingerprint.
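The fingerprint comparison described above can be sketched with a cryptographic hash, which for practical purposes uniquely identifies the content. This is an illustrative sketch; the patent does not specify the fingerprinting algorithm.

```python
import hashlib

def fingerprint(content: str) -> str:
    """Return a digest that uniquely identifies the content
    (hash collisions are negligible in practice)."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

def content_changed(first: str, second: str) -> bool:
    """Re-classify only when the fingerprints differ."""
    return fingerprint(first) != fingerprint(second)
```

Comparing short digests rather than the full content lets the periodic check stay cheap even for long report text.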
- the first and the second content comprise user entered findings.
- the classifier program classifies the user entered findings as a machine classification.
- the first and second machine determined propositions each comprise at least one of reference material, calculations, summary of information from other sources, and best practices related to the machine classification.
- the user input field comprises a first user input field.
- the first and the second content comprises user entered observations.
- User input is received indicating a user classification of the user entered observations in a second user input field.
- the classifier program additionally receives the user classification in the second user input field with the user entered observations in the first user input field to process to determine the first and the second machine classifications to provide to the rules engine.
- the user entered observations comprise medical observations of a patient
- the user classification comprises a user diagnosis of the user entered observations
- the first and the second machine classifications comprise clinical diagnoses based on at least one of the user entered medical observations and the user diagnosis
- the first and second machine determined propositions from the rules engine comprise first and second medical best practices based on applying a series of rules of the rules engine to the clinical diagnosis from the classifier program.
- the term “best practices” may refer to best practice recommendations, best recommendations, and other types of preferred or relevant recommendations or actions, etc., to take based on an observed condition or diagnosis.
- an action user interface is rendered with the second machine determined proposition and an acceptance control to allow a user to accept or reject the second machine determined proposition.
- the second machine determined proposition is rendered in the user interface with the second content in response to receiving, via the acceptance control in the action user interface, indication that the user accepted the second machine determined proposition.
- the second machine determined proposition is rendered in the user interface with the second content and indication that the second machine proposition was not accepted in response to receiving rejection of the rendered second machine determined proposition through the action user interface.
- At least one of a preferred classification and a preferred proposition is received from the user rejecting the second machine determined proposition with the acceptance control.
- a message is sent with the at least one of the preferred classification and the preferred proposition to an interface program managing access to the classifier program and the rules engine.
- Including the preferred classification in the rules engine trains the classification program to produce the preferred classification based on the second content used to produce the second machine determined proposition.
- Including the preferred proposition and the preferred classification programs the rules engine to associate the preferred proposition with the preferred classification.
- Including the preferred proposition with no preferred classification programs the rules engine to associate the preferred proposition with the second machine classification used to determine the second machine determined proposition.
- content entered into the user input field is written into a log file.
- Content from the user input field is saved in the log file as previously entered content.
- the log file is periodically processed by determining whether the content from the user input field written to the log file matches the previously entered content.
- the determining that the second content differs from the first content comprises determining that the content from the user input field in the log file does not match the previously entered content.
- Indication is made of the content from the user input field in the log file as the previously entered content.
- the content from the user input field in the log file is sent to the classification program in response to the content in the log file not matching the previously entered content to obtain a new result from the rules engine to provide real time updating of the new result in the user interface.
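One pass of the periodic log-file check described above can be sketched as a pure function: given the latest logged content and the previously processed content, it decides whether to invoke the classifier and what to remember for the next pass. The function name and return convention are illustrative only.

```python
def process_log(log_entry, previously_entered):
    """One pass of the periodic log check.

    Returns (content_to_classify, new_previously_entered), where
    content_to_classify is None when the logged content matches what
    was already processed, so no classifier call is needed.
    """
    if log_entry == previously_entered:
        return None, previously_entered   # unchanged; skip the classifier
    return log_entry, log_entry           # changed; send it and remember it
```

A scheduler would call this on a timer, forwarding any non-`None` content to the classifier program to obtain a new result from the rules engine for real-time updating of the user interface.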
- FIG. 1 illustrates an embodiment of a computing environment.
- FIG. 2 a illustrates an embodiment of a user interface into which user findings are entered and results based on the findings are rendered.
- FIG. 2 b illustrates an embodiment of a user interface in which machine generated best practices are rendered to a user.
- FIG. 3 illustrates an embodiment of a log file.
- FIG. 4 illustrates an embodiment of user content information used to generate information in the user interface.
- FIG. 5 illustrates an embodiment of operations to generate a report through the user interface.
- FIG. 6 illustrates an embodiment of operations to periodically process a log file to determine content to transmit to a classifier program.
- FIG. 7 illustrates an embodiment of operations for a classifier program and rules engine to generate results from findings entered in the user interface.
- FIG. 8 illustrates an embodiment of operations to process results from the rules engine to render in the user interface.
- FIG. 9 illustrates an embodiment of operations performed to retrain the classifier program and/or rules engine based on a message rejecting a machine proposition.
- FIG. 10 illustrates a computing environment in which the components of FIG. 1 may be implemented.
- Described embodiments provide improvements to computer technology to provide real-time machine propositions for user content, including observations and user classification of the observations, entered into a user interface via an input device, such as a keyboard or voice-to-text device, where the machine propositions may include actions, recommendations, reference material, calculations, summaries of information from other sources, best practices based on the findings, etc.
- a doctor or clinician may be entering observations and possibly a diagnosis of a condition, such as by observing the patient, lab test results, and medical images, and the clinician or doctor may want to be immediately informed of the relevant best practices based on the observed findings.
- the clinician or doctor may want to see changes to the recommendations based on changes the clinician or doctor makes to the findings entered into a medical report rendered in a user interface to provide real-time feedback on such changes.
- other types of professionals wanting to obtain recommended actions or best practices for observed findings such as engineers analyzing a physical structure, device, manufacturing process, etc., or repair technicians may want to be informed real-time of the actions or best practices to take based on real-time entry of findings and observations and modifications to the findings.
- Described embodiments provide improvements to the computer technology for determining real-time entry or changes to inputted content, such as user entered observations and classifications, entered into a user interface to process and present to a program, such as a machine learning classifier program and rules engine, to provide for real-time feedback when making changes to the inputted observations.
- a determination is made when entry of the first content is completed.
- the first content is provided to a classification program to classify the first content into a first machine classification to provide to a rules engine to apply a series of rules to determine a first machine determined proposition for the first machine classification.
- the first machine determined proposition is rendered in the user interface.
- the second content is provided to the classification program to classify the second content into a second machine classification to provide to the rules engine to apply the series of rules to determine a second machine determined proposition for the second machine classification.
- the second machine determined proposition is rendered in the user interface with the second content.
- Further embodiments provide improvements to computer technology to determine changes to the user entered content, such as observations and classifications, entered into the user interface to provide to the classifier program to provide real time-feedback on actions to take based on the user entered findings.
- Content entered into the user input field is written into a log file and content from the user input field in the log file is saved as previously entered content.
- the log file is periodically processed to determine whether the content from the user input field written to the log file matches the previously entered content.
- the determining that currently entered content differs from the previously entered and processed content comprises determining that the content from the user input field in the log file does not match the previously entered content.
- Content from the user input field in the log file is indicated as the previously entered content.
- the content from the user input field in the log file is sent to the classification program in response to the content in the log file not matching the previously entered content to obtain a new result from the rules engine to provide real time updating of the new result in the user interface.
- the described embodiments may be applied to processing user entered medical observations of patient data, patient conditions and attributes, digital images, and physical samples, such as biopsies and bodily fluid samples, to provide real-time feedback of best practices and recommendations for the user entered medical observations.
- the new inputted observations may be sent to a classifier program to determine from the user entered medical observations a predefined machine classification, such as a clinical diagnosis or recognized condition, to provide to a rules engine.
- the rules engine determines medical best practices based on applying a series of rules from the rules engine to the machine classification.
- the best practices and/or reference material, recommendations, calculations, summary of information from other sources, actions and other relevant information may then be immediately returned to render in the user interface to provide immediate feedback, such as for best practices and changes to best practices for user entered observations.
- FIG. 1 illustrates an embodiment of a computing environment in which embodiments are implemented.
- a computing device 100 includes a processor 102 , a main memory 104 , and a storage 106 .
- the main memory 104 includes various program components including an operating system 108 , a report user interface 200 , a report generator 110 to generate the report user interface 200 , and input drivers 112 to interact with components connected on a bus 114 .
- the memory 104 further includes a log file 300 in which content entered in the report user interface 200 is written, user content information 400 having information on content a user has entered in the report user interface 200 , and a report 116 that may be generated from the content entered in the report user interface 200 .
- the report user interface 200 may be continually displayed waiting to take action in response to user input.
- the bus interface 114 may connect the processor 102 , main memory 104 , a communication transceiver 118 to communicate (via wireless communication or a wired connection) with external devices, including a network, such as the Internet, a cellular network, etc.; a microphone 120 to transmit and receive sound for the personal computing device 100 ; a display screen 122 to render display output to a user of the personal computing device 100 ; a speaker 124 to generate sound output to the user; and input controls 126 such as buttons and other software or mechanical buttons, including a keyboard, to receive user input.
- the components connected on the bus 114 may communicate over one or more bus interfaces 114 and other electronic interfaces.
- the computing device 100 may communicate with a server 128 over a network 130 to transmit user content or findings and observations entered into the report user interface 200 to a classifier program 132 in the server 128 that processes the received user content to generate a classification based on the user entered observation. For instance, if the user observations comprise observations and findings with respect to a medical examination, medical images (e.g., X-Ray or magnetic resonance imaging digital image (MRI)), the classifier program 132 may produce machine classification, such as a diagnosis or description, of the user entered findings. In this way, the classifier program 132 presents relevant information based on user entered descriptions, such as a specified pathology.
- the classifier program 132 may implement a machine learning technique such as decision tree learning, association rule learning, neural network, inductive programming logic, support vector machines, Bayesian network, etc., to determine a classification based on content entered by the user in the report user interface 200 , such as medical observations and findings.
- the classifier program 132 may comprise a machine learning program that is trained using a training set comprising previously generated reports that have been classified with a ground truth classification, and the classifier program 132 is trained to produce the ground truth classifications provided for the training set of reports. For instance, if the training set comprises doctor entered findings, such as findings based on an MRI reading, then the provided ground truths would be doctor determined classifications or clinical diagnosis based on those findings. The classifier program 132 would then be trained with those findings to produce the clinical diagnosis assigned to those findings and observations.
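Training on (findings, ground-truth diagnosis) pairs can be illustrated with a deliberately simple bag-of-words model. This toy sketch stands in for the machine learning techniques named above (decision trees, neural networks, etc.); the function names and scoring scheme are assumptions, not the patent's method.

```python
from collections import Counter

def train(reports):
    """Build a toy bag-of-words model from a training set of
    (findings_text, ground_truth_diagnosis) pairs."""
    model = {}
    for text, diagnosis in reports:
        model.setdefault(diagnosis, Counter()).update(text.lower().split())
    return model

def predict(model, text):
    """Score each diagnosis by word overlap with the findings and
    return the best-matching classification."""
    words = text.lower().split()
    return max(model, key=lambda d: sum(model[d][w] for w in words))
```

A production classifier would use a proper learning algorithm, but the interface is the same: train on labeled reports, then map new findings to a classification for the rules engine.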
- the rules engine 134 would then take that machine classification, such as a clinical diagnosis, and determine a proposition, such as a result, action or best practices based on the classification using a decision tree or table that associates specific courses of action or results with the classification outputted from the classifier program 132 . For instance, if the output of the classifier program 132 comprises a clinical diagnosis, then the rules engine 134 would determine the best practices to treat a clinical diagnosis outputted from the classifier program 132 , such as a drug therapy, surgical treatment, further testing, further follow-up visit, etc. In this way, the rules engine 134 may provide a proposition, such as an action or best practices, reference material, calculations, summary of information from other systems, etc., for each possible classified diagnosis output from the classifier program 132 .
- the findings may comprise a size of an observed condition on a patient, such as a size of an abdominal aortic aneurysm (AAA), or observed features, such as size, shape, etc., of an incidental thyroid nodule, ovarian cyst, non-incidental thyroid nodule, enlarged thyroid, simple ovarian cyst, etc.
- the rules engine 134 may specify particular best practices given different sizes of the AAA, such as recommended follow-ups after so many years or a follow-up and an additional vascular consultation.
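A size-keyed rule of the kind described above can be sketched as a simple threshold function. The diameter cut-offs and recommendation strings below are purely illustrative placeholders, not actual clinical guidance or the patent's rule set.

```python
def aaa_best_practice(diameter_cm: float) -> str:
    """Toy rules-engine entry mapping an observed AAA size to a
    best-practice recommendation (thresholds are hypothetical)."""
    if diameter_cm < 3.0:
        return "No surveillance required."
    if diameter_cm < 5.5:
        return "Recommend periodic follow-up imaging."
    return "Recommend follow-up imaging and vascular consultation."
```

In the described system such rules would live in the rules engine 134, so the thresholds can be updated independently of the classifier.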
- the content is provided to the classifier program 132 in response to the radiologist adding text to the impressions section 206 .
- the server 128 may also include a rules engine 134 providing a decision tree of rules to map received predefined machine classifications outputted from the classifier program 132 to produce a machine determined proposition based on the outputted machine classification, such as a result or action to take based on the classification. For instance, if the machine classification from the classifier program 132 comprises a clinical diagnosis recognized by the rules engine 134 , then the rules engine 134 may map that recognized clinical diagnosis to a best practices course of action to address that clinical diagnosis, such as drug therapy, surgery, further testing, follow-up visit, etc.
- the server 128 may further include a client interface 136 to interface and manage communications with client systems to receive application calls from the report generator 110 operating in multiple client computing devices 100 , such as at different facilities, companies or places of business.
- the client interface 136 may receive application calls from another component.
- An advantage of having the rules engine 134 separate from the classifier program 132 is that the rules engine 134 may be independently updated to provide new results, actions or best practices for the classified condition or diagnosis outputted from the classifier program 132 .
- the network 130 may comprise one or more networks including Local Area Networks (LAN), Storage Area Networks (SAN), Wide Area Network (WAN), peer-to-peer network, wireless network, the Internet, etc.
- the computing device 100 may store program components and corresponding data, such as 108 , 110 , 112 , 116 , 200 , 300 in a non-volatile storage 106 , which may comprise one or more storage devices known in the art, such as a solid state storage device (SSD) comprised of solid state electronics, NAND storage cells, EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory, flash disk, Random Access Memory (RAM) drive, storage-class memory (SCM), Phase Change Memory (PCM), resistive random access memory (RRAM), spin transfer torque memory (STM-RAM), conductive bridging RAM (CBRAM), magnetic hard disk drive, optical disk, tape, etc.
- the storage devices may further be configured into an array of devices, such as Just a Bunch of Disks (JBOD), Direct Access Storage Device (DASD), Redundant Array of Independent Disks (RAID) array, virtualization device, etc. Further, the storage devices may comprise heterogeneous storage devices from different vendors or from the same vendor.
- the memory 104 may comprise suitable volatile or non-volatile memory devices, including those described above.
- program modules such as the program components 108 , 110 , 112 , 200 , 132 , 134 , 136 may comprise routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types.
- the program components and hardware devices of the computing device 100 and server 128 of FIG. 1 may be implemented in one or more computer systems, where if they are implemented in multiple computer systems, then the computer systems may communicate over a network.
- the program components 108 , 110 , 112 , 200 may be accessed by the processor 102 from the memory 104 to execute. Alternatively, some or all of the program components 108 , 110 , 112 , 200 , 132 , 134 , 136 may be implemented in separate hardware devices, such as Application Specific Integrated Circuit (ASIC) hardware devices.
- the functions described as performed by the programs 108 , 110 , 112 , 200 , 132 , 134 , 136 may be implemented as program code in fewer program modules than shown or implemented as program code throughout a greater number of program modules than shown.
- FIG. 1 shows the classifier program 132 and rules engine 134 implemented in a server 128 over a network 130 .
- the classifier program 132 and rules engine 134 may be implemented in the computing device 100 including the report generator 110 and user interface 200 .
- the server 128 may comprise a cloud server providing cloud services for classifying user input with the classifier program 132 and determining a course of action or best practices for the classification of the classifier program 132 .
- FIG. 2 a illustrates an embodiment of a user interface 200 in which a user may input data, and includes an observations section 202 in which a user may enter observations about an entity being diagnosed, such as a person, device, or natural phenomenon.
- the user may describe observations from an image or video, direct observation of a phenomenon, etc.
- the user may further enter impressions 206 , such as classifications and conclusions, in an impressions section 208 providing a classification of the observations, such as a condition, medical diagnosis, natural phenomena, state of a device, etc.
- the impressions section 208 may further render an action, solution, best practices, etc. 210 produced by the classifier program 132 and rules engine 134 .
- the classifier program 132 may receive the user entered observations 302 and, optionally, the user classification 304, such as a diagnosis or conclusion the user enters in field 206 of the user interface 200, and classify them into a machine classification, such as a clinical diagnosis, category, etc. The machine classification is then provided to the rules engine 134 to determine a machine proposition, e.g., result, action, recommendation, best practices, etc. 210 to render in the user interface 200 concerning the machine classified findings.
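The classify-then-recommend flow above can be sketched as follows. This is a hypothetical illustration, not the actual classifier program 132 or rules engine 134: the keyword test and the best-practices table are invented stand-ins for a trained model and a real rule base.

```python
# Hypothetical sketch of the flow: user findings -> machine classification
# -> machine proposition. The keyword matching and the table below are
# illustrative stand-ins for a trained classifier and a real rules engine.

def classify(observations, user_classification=""):
    """Map user entered findings to a machine classification."""
    text = (observations + " " + user_classification).lower()
    if "nodule" in text:
        return "pulmonary nodule"
    return "unclassified"

BEST_PRACTICES = {
    "pulmonary nodule": "Recommend follow-up CT in 6-12 months.",
    "unclassified": "No machine recommendation available.",
}

def machine_proposition(observations, user_classification=""):
    """Classify the findings, then look up the proposition for that class."""
    return BEST_PRACTICES[classify(observations, user_classification)]
```

The user classification, when provided, simply joins the observations as additional input to the classification step, mirroring how field 206 supplements field 204 above.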
- the report user interface 200 may be continually displayed waiting to take action in response to user input into the impressions section 208 providing the classification.
- FIG. 2b illustrates an embodiment of an alert user interface 220, which may be displayed as a dialog box or pop-up alert rendered on top of the user interface 200. The alert user interface 220 alerts the user of a machine determined proposition 222 based on the user entered content and findings 204 and 206, which may comprise machine determined best practices from the rules engine 134 based on the user entered observations and the user's classification, or diagnosis, of the observations.
- the user may select an accept selection control 224 to cause the machine determined best practices 222 to be entered into field 210 of the user interface 200 to accept the machine generated best practices from the rules engine 134 .
- the user may also select a reject selection control 226 to reject the machine determined best practices.
- the user may enter the machine determined best practices 222 into the report user interface 200 by voice command, by copying and pasting from the field 222 in the alert user interface 220 into the best practices field 210 of the user interface 200, or by clicking a button, such as the accept control 224, to cause the automatic insertion of the best practices recommendation in the field 222 of the alert user interface 220 into the user interface 200, such as into field 210.
- the user interface 200 provides improvements to computer technology for rendering machine generated results, such as from a machine learning classifier program 132 and rules engine 134. In real time, user entered observations 204 are provided to the classifier program 132 and rules engine 134 to allow immediate computation of new machine propositions based on user entered findings, which include new observations 204 and user entered classifications 206, and to immediately display the machine propositions 210, such as recommended actions, best practices, calculations, reference material, summarized information from other sources, etc., in the user interface 200 as the findings are being entered in the observations section 202. Further, at some point the observations 204, user entered classifications 206, and machine propositions 210 may be saved as a report 116 for later use. In further embodiments, other types of content and findings, in addition to or different from observations and classifications, may be entered in the user interface.
- FIG. 3 illustrates an embodiment of a log file 300 to which content entered in the user interface 200 is written, including observations 302 entered in the observations section 202 of the user interface 200 , such as exams and other patient related information; impressions 304 comprising the user entered classifications 206 entered in the impressions section 208 ; and the machine propositions 306 from the rules engine 134 from processing the machine classification produced by the classifier program 132 , which is rendered as a machine determined proposition 210 , e.g., best practices, etc. 210 in the impressions section 208 .
- FIG. 4 illustrates an embodiment of user content information 400 generated by the report generator 110 to determine when to transmit the observations 302 to the classifier program 132, and includes a previous user content fingerprint 402 comprising a unique fingerprint/hash code calculated from a previous state of the user entered observations 302 and a current user content fingerprint 404 comprising a unique fingerprint/hash code calculated from a current state of the user entered observations 302.
- the fingerprints 402 and 404 may be calculated using any algorithm that generates a unique code from the input content, such as a fingerprinting method or a hash code generating algorithm.
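As one concrete (and purely hypothetical) choice of such an algorithm, a cryptographic hash such as SHA-256 yields a code that changes whenever the input content changes:

```python
import hashlib

def content_fingerprint(content):
    """Return a hash code identifying the content.

    Identical content always yields the identical code, so a changed code
    signals changed content. SHA-256 is only one possible choice of
    hashing algorithm for the fingerprints 402 and 404."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()
```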
- FIG. 5 illustrates an embodiment of operations performed by the report generator 110 to render the report user interface 200 , rendered on the display screen 122 , to generate a report 116 .
- the report generator 110 receives (at block 502) user input via input controls 126, such as a keyboard, mouse, touchscreen, or microphone 120, to render in the report user interface 200.
- the report generator 110 continually writes (at block 504 ) user input entered into the observations 202 and impressions 208 sections into observations 302 and impressions (e.g., classifications) 304 fields, respectively, of the log file 300 . In this way, the log file 300 is updated in real time with new information the user enters into the observations 202 and impressions 208 sections.
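The continual logging above can be sketched as follows, assuming a JSON log file. The file format and field names are assumptions chosen to mirror the observations 302 and impressions 304 fields of log file 300.

```python
import json

def write_log(path, observations, impressions):
    """Overwrite the log file with the latest user entered content,
    mirroring the observations 302 and impressions 304 fields."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"observations": observations,
                   "impressions": impressions}, f)

def read_log(path):
    """Read back the log file for the periodic processing step."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

Each call to `write_log` replaces the file with the current state of the user interface sections, so the log always reflects the most recent input.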
- FIG. 6 illustrates an embodiment of operations performed by the report generator 110 to periodically process the log file 300 to implement real-time updating of the machine determined propositions 210 rendered in the user interface 200 .
- the report generator 110 may periodically process the log file 300 after a time period, such as a fraction of a second or in response to the writing to the log file 300 , etc.
- the report generator 110 processes (at block 602 ) the content of the observations field 302 to generate a content fingerprint/hash code comprising a unique identifier of the user entered findings.
- the report generator 110 determines (at block 606) whether the user entered information in the impressions field 304, which may in certain embodiments signal that the user has completed entering the observations 302. In further implementations, the report generator 110 may continuously provide feedback entered through the user interface 200 without waiting for the user to input content into the impressions section 208. Other tests may be used to determine that the entering of the observations 302 has completed, such as selection of a button or user interface element. If (at block 606) a determination is made that the user completed entering observations 302, then the content fingerprint calculated from the current observations 302 is stored (at block 608) as the current user content fingerprint 404. The user entered findings, such as observations 302 and impressions 304, are transmitted (at block 610) to the classifier program 132.
- the current user content fingerprint 404 is indicated (at block 614) as the previous user content fingerprint 402 and the generated content fingerprint is stored (at block 614) as the new current user content fingerprint 404. If (at block 616) the current 404 and previous 402 user content fingerprints do not match, then the user entered observations 302 have changed since last checked and the report generator 110 transmits the new user entered observations 302 to the classifier program 132 to generate new results. If (at block 616) the fingerprints 402 and 404 are not different, then control ends, as there are no new observations 302 from which to determine new results, actions, or best practices to render.
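The fingerprint bookkeeping of blocks 614-616 can be sketched as a small helper. This is a hypothetical illustration; SHA-256 stands in for whatever fingerprinting method an embodiment uses.

```python
import hashlib

def fingerprint(text):
    """One possible fingerprinting choice for the content codes."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

class ChangeDetector:
    """Tracks the previous (402) and current (404) content fingerprints
    to decide whether changed observations should be resent to the
    classifier program."""

    def __init__(self):
        self.previous = None  # previous user content fingerprint 402

    def should_transmit(self, observations):
        current = fingerprint(observations)  # current fingerprint 404
        changed = current != self.previous
        self.previous = current  # current becomes previous for next check
        return changed
```

Each periodic pass calls `should_transmit` with the latest observations; only a changed fingerprint triggers a new round trip to the classifier.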
- FIG. 6 provides improved computer technology for an immediate and real-time determination of whether new user observations 204 have been entered in the user interface 200, providing real-time updating of the results by immediately sending any changed user entered observations 302 to the classifier program 132 and rules engine 134 to process. Any new results from the rules engine 134 may be displayed in the alert user interface 220 (FIG. 2b) in the determined proposition field 222, which the user may then cause to be inputted into the report user interface 200. This allows for the immediate updating of the machine determined propositions 210 rendered in the alert user interface 220 based on new user entered findings, such as observations and/or impressions/classifications.
- FIG. 7 illustrates an embodiment of operations performed by the classifier program 132 and the rules engine 134 to process observations 302 from the user to generate results.
- Upon receiving (at block 700) information from the user impressions 304 and other information, such as content entered in observations 204, a classification of the observations 302, metadata, etc., the classifier program 132 classifies (at block 702) the entered information, e.g., impressions 304, etc., into a predefined machine classification, such as a clinical diagnosis or recognized condition.
- further information may be inputted into the classifier program 132 , such as the user impressions 206 , or user classification, and additional information from other sources.
- the classifier program 132 may receive medical lab results, information from other reports or sources, etc., all that may assist the classifier program 132 to produce a more accurate machine classification of the user entered findings, e.g., observations 302 and/or impressions 304 .
- the classifier program 132 sends (at block 704 ) the determined machine classification to the rules engine 134 .
- Upon receiving the machine classification, the rules engine 134 applies (at block 706) a decision tree of rules to the machine classification to determine a machine proposition, such as an action to take, best practices, etc., for the provided classification.
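A decision tree of rules can be sketched as nested nodes, each testing the machine classification and descending until a proposition leaf is reached. The conditions and recommendations below are invented for illustration and do not reflect any actual rule base.

```python
# Each internal node holds a condition on the classification plus
# "yes"/"no" branches; a leaf is the machine proposition itself.
RULES = {
    "condition": lambda c: "nodule" in c,
    "yes": {
        "condition": lambda c: "large" in c,
        "yes": "Refer for biopsy.",
        "no": "Follow-up imaging in 12 months.",
    },
    "no": "No recommendation for this classification.",
}

def apply_rules(node, classification):
    """Walk the decision tree until a leaf (the proposition) is reached."""
    if isinstance(node, str):
        return node
    branch = "yes" if node["condition"](classification) else "no"
    return apply_rules(node[branch], classification)
```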
- the determined result is returned (at block 708) to the report generator 110 to render in the alert user interface 220 in field 222, allowing the user to add the determined result to the report.
- the report generator 110 or another program component may generate and manage the alert user interface 220 .
- the classifier program 132 and the rules engine 134 provide immediate machine propositions, e.g., actions, best practices, etc., to return to the report generator 110, which immediately renders the results in the alert user interface 220, such as in field 222.
- This enables the user to see in real time the changes in the results, actions, or best practices resulting from entering new findings, and to determine whether to have them entered into the report being generated.
- FIG. 8 illustrates an embodiment of operations performed by the report generator 110 to process results received from the rules engine 134 .
- the report generator 110 determines (at block 802 ) whether the received machine proposition matches the machine proposition 306 saved in the log file currently rendered in field 210 of the user interface 200 . If (at block 802 ) there is a match, then the machine proposition has not changed due to the change in the findings, e.g., observations and impressions, and control ends.
- if there is no match, the report generator 110 displays (at block 804) a dialog box, such as the alert notification dialog box 220 in FIG. 2b, showing the received machine determined recommendation 222 and graphical controls 224, 226 to allow the user to accept or reject, respectively, the new result in the dialog box 220.
- the report generator 110 renders (at block 808) the received machine proposition in the field 210 in the user interface, e.g., in the impressions section 208, and refreshes the user interface display 200 with the new machine proposition.
- the user can also copy/paste or use a provided macro to move the machine proposition anywhere into the report.
- the new machine proposition is written (at block 810) to the machine proposition field 306 in the log file 300. If (at block 806) the user rejects the received machine proposition, such as by selecting the reject graphical control 226, then the received machine proposition is indicated in a field of the user interface 200, such as the action field 210, and indication is also made that the user, such as a radiologist, disagreed with the machine proposition, such as actions, recommendations, reference material, calculations, a summary of information from other sources, best practices based on the findings, etc.
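The accept/reject bookkeeping of blocks 806-810 can be sketched as follows. This is a hypothetical illustration in which a plain dictionary stands in for the log file 300.

```python
def record_decision(log, proposition, accepted, user_role="radiologist"):
    """Record the machine proposition and whether the user accepted it.

    On acceptance the proposition is simply written to the machine
    proposition field (306 in log file 300); on rejection it is still
    recorded, together with an indication that the user disagreed."""
    log["machine_proposition"] = proposition
    log["accepted"] = accepted
    if not accepted:
        log["note"] = "%s disagreed with the machine proposition" % user_role
    return log
```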
- a user interface is then rendered (at block 814 ) to allow the user to specify a preferred proposition, e.g., best practices, and/or a preferred classification based on the current content in the user interface 200 , as well as additional user feedback or comments.
- the report generator 110 sends (at block 816 ) a message to the client interface 136 at the server 128 with information on the rejected machine proposition, preferred proposition/preferred classification, the content (observations, impressions, etc.) that was previously sent to generate the rejected machine determined proposition, and any other feedback.
- machine determined propositions are provided in real time in response to the recently sent findings.
- the report generator 110 renders an alert notification dialog 220 to allow the user to review the results, such as machine determined propositions, actions to take, best practices for medical diagnosis, etc., and then determine whether to accept the machine proposition to include in the report or user interface 200 being generated.
- described embodiments provide improved techniques for allowing the user to provide input on the rejected machine proposition to improve the calculations and operations of the classifier program 132 and rules engine 134. By providing a user preferred classification and/or preferred proposition, based on the current content rendered in the user interface 200, to the server 128, the rules engine 134 and/or classifier program 132 may be retrained to further optimize operations and results.
- the rejected machine propositions may be entered into the log file 300 .
- FIG. 9 illustrates an embodiment of operations performed by the client interface 136 , classifier program 132 and/or rules engine 134 to process a message sent from the report generator 110 where the user rejects the provided machine proposition.
- the message includes information on the rejected machine proposition, the preferred proposition/preferred classification, the content (observations, impressions, etc.) that was previously sent to generate the rejected machine proposition, and any other feedback. If (at block 902) a preferred classification was provided, then the machine learning algorithm of the classifier program 132 is trained (at block 904) to output the preferred classification as the machine classification based on the previously sent content that resulted in the rejected machine proposition.
- the rules engine 134 is updated (at block 908 ) to output the preferred proposition as the machine proposition for the preferred classification, which the classifier program 132 has been retrained to produce for the given current content.
- the rules engine 134 may further be updated (at block 910 ) to not associate the rejected machine proposition with the machine classification used to produce the rejected machine proposition, so as not to produce again the machine proposition that the user rejected given the current content, findings, impressions, observations, etc. in the user interface 200 .
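Blocks 908 and 910 can be sketched with a flat classification-to-proposition table standing in for the rules engine 134. This is an illustrative simplification of a real rule base; the class and method names are invented.

```python
class RulesTable:
    """Flat classification -> proposition mapping standing in for the
    rules engine 134; real embodiments would use a decision tree."""

    def __init__(self, table):
        self.table = dict(table)

    def propose(self, classification):
        return self.table.get(classification, "No recommendation.")

    def apply_rejection(self, classification, rejected, preferred=None):
        """Adopt the user's preferred proposition if supplied (block 908);
        otherwise just stop producing the rejected one (block 910)."""
        if preferred is not None:
            self.table[classification] = preferred
        elif self.table.get(classification) == rejected:
            del self.table[classification]
```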
- any additional comments provided with the message are forwarded to a server administrator to consider for improving the performance of the classifier program 132 and/or rules engine 134.
- FIG. 9 provides improved computer technology for retraining the classifier program 132 and/or rules engine 134 to produce user preferred output when the user rejects a machine proposition previously sent to allow continual improvement of the machine propositions produced for given content, such as improved actions, best practices, etc.
- the present invention may be a system, a method, and/or a computer program product.
- the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
- the computer program product comprises a computer readable storage medium implemented using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof.
- the described operations may be implemented as code or logic maintained in a “computer readable storage medium”.
- code and “program code” as used herein refers to software program code, hardware logic, firmware, microcode, etc.
- the computer readable storage medium includes a tangible element, including at least one of electronic circuitry, storage materials, a casing, a housing, a coating, hardware, and other suitable materials.
- a computer readable storage medium may comprise, but is not limited to, a magnetic storage medium (e.g., hard disk drives, floppy disks, tape, etc.), optical storage (CD-ROMs, DVDs, optical disks, etc.), volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, Flash Memory, firmware, programmable logic, etc.), Solid State Devices (SSD), computer encoded and readable punch cards, etc.
- the computer readable storage medium may further comprise a hardware device implementing firmware, microcode, etc., such as in an integrated circuit chip, a programmable logic device, a Programmable Gate Array (PGA), field-programmable gate array (FPGA), Application Specific Integrated Circuit (ASIC), etc.
- a computer readable storage medium is not comprised solely of transmission signals and includes physical and tangible components.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures.
- two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- the computational components of FIG. 1 may be implemented in one or more computer systems, having a computer architecture as shown in FIG. 10 , and including a processor 1002 (e.g., one or more microprocessors and cores), a memory 1004 (e.g., a volatile memory device), and storage 1006 (e.g., a non-volatile storage, such as magnetic disk drives, solid state devices (SSDs), optical disk drives, a tape drive, etc.).
- the storage 1006 may comprise an internal storage device or an attached or network accessible storage. Programs, including an operating system 1008 and applications 1010, stored in the storage 1006 are loaded into the memory 1004 and executed by the processor 1002.
- the applications 1010 may include the report user interface 200 , report generator 110 , input drivers 112 , classifier program 132 , rules engine 134 , and client interface 136 and other program components described above.
- the architecture 1000 further includes a network card 1012 to enable communication with the network 130.
- An input device 1014 is used to provide user input to the processor 1002 , and may include a keyboard, mouse, pen-stylus, microphone, touch sensitive display screen, or any other activation or input mechanism known in the art.
- An output device 1016 such as a display monitor, printer, storage, etc., is capable of rendering information transmitted from a graphics card or other component.
- the output device 1016 may render the GUIs described with respect to the figures above, and the input device 1014 may be used to interact with the graphical controls and elements in those GUIs.
- the architecture 1000 may be implemented in any number of computing devices, such as a server, mainframe, desktop computer, laptop computer, hand held computer, tablet computer, personal digital assistant (PDA), telephony device, cell phone, etc.
- an embodiment means “one or more (but not all) embodiments of the present invention(s)” unless expressly specified otherwise.
- Devices that are in communication with each other need not be in continuous communication with each other, unless expressly specified otherwise.
- devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.
Description
- The present invention relates to a computer program product, system, and method for a user interface for determining real-time changes to content entered into the user interface to provide to a classifier program and rules engine to generate results for the content.
- Natural Language Processing (NLP) algorithms are used to classify information provided by a user via a computer user interface. In the medical field, NLP algorithms are used to process medical information from patient records or from clinician free form narration (audio or text) to extract clinical facts from the processed content. These extracted clinical facts may then be used to determine the completeness and accuracy of the gathered information and to alert the clinician of conditions to further explore and develop, to provide alerts, and to determine alternative hypothesis of diagnosis for the extracted facts. In this way, patient health records and doctor entered free form narration may be processed to provide diagnosis of patient conditions.
- There is a need in the art for improved techniques for extracting information from a user interface in which user observations and findings are entered, and for processing the extracted information to determine results, actions, and best practices.
- An embodiment may comprise a computer program product, system, and method for determining real-time changes to content entered into a user interface to provide to a classifier program and rules engine to generate results for the content. First content from a user in a user input field is rendered in a user interface. A determination is made that entry of the first content is completed. In response to determining that the entry of the first content is completed, the first content is provided to a classification program to classify the first content into a first machine classification to provide to a rules engine to apply a series of rules to determine a first machine determined proposition for the first machine classification. The first machine determined proposition is rendered in the user interface. A determination is made of second content in the user input field from the user that differs from the first content for which the first machine determined proposition is rendered. The second content is provided to the classification program to classify the second content into a second machine classification to provide to the rules engine to apply the series of rules to determine a second machine determined proposition for the second machine classification. The second machine determined proposition is rendered in the user interface with the second content.
- In a further embodiment, a first fingerprint is generated from the first content that uniquely identifies the first content. A second fingerprint is generated from the second content that uniquely identifies the second content. A determination is made as to whether the first fingerprint differs from the second fingerprint, wherein the second content is provided to the classifier program in response to determining that the first fingerprint is different from the second fingerprint.
- In a further embodiment, the user input field comprises a first user input field. Determining that the first content is completed comprises detecting user input into a second input field in the user interface.
- In a further embodiment, the first and the second content comprise user entered findings. The classifier program classifies the user entered findings as a machine classification. The first and second machine determined propositions each comprise at least one of reference material, calculations, a summary of information from other sources, and best practices related to the machine classification.
- In a further embodiment, the user input field comprises a first user input field. The first and the second content comprises user entered observations. User input is received indicating a user classification of the user entered observations in a second user input field. The classifier program additionally receives the user classification in the second user input field with the user entered observations in the first user input field to process to determine the first and the second machine classifications to provide to the rules engine.
- In a further embodiment, the user entered observations comprise medical observations of a patient, the user classification comprises a user diagnosis of the user entered observations, the first and the second machine classifications comprise clinical diagnoses based on at least one of the user entered medical observations and the user diagnosis, and the first and second machine determined propositions from the rules engine comprise first and second medical best practices based on applying a series of rules of the rules engine to the clinical diagnosis from the classifier program. The term “best practices” may refer to best practice recommendations, best recommendations, and other types of preferred or relevant recommendations or actions, etc., to take based on an observed condition or diagnosis.
- In a further embodiment, an action user interface is rendered with the second machine determined proposition and an acceptance control to allow a user to accept or reject the second machine determined proposition. The second machine determined proposition is rendered in the user interface with the second content in response to receiving, via the acceptance control in the action user interface, indication that the user accepted the second machine determined proposition.
- In a further embodiment, the second machine determined proposition is rendered in the user interface with the second content and indication that the second machine proposition was not accepted in response to receiving rejection of the rendered second machine determined proposition through the action user interface.
- In a further embodiment, at least one of a preferred classification and a preferred proposition is received from the user rejecting the second machine determined proposition with the acceptance control. A message is sent with the at least one of the preferred classification and the preferred proposition to an interface program managing access to the classifier program and the rules engine. Including the preferred classification in the message trains the classification program to produce the preferred classification based on the second content used to produce the second machine determined proposition. Including the preferred proposition and the preferred classification programs the rules engine to associate the preferred proposition with the preferred classification. Including the preferred proposition with no preferred classification programs the rules engine to associate the preferred proposition with the second machine classification used to determine the second machine determined proposition.
- In a further embodiment, content entered into the user input field is written into a log file. Content from the user input field is saved in the log file as previously entered content. The log file is periodically processed by determining whether the content from the user input field written to the log file matches the previously entered content. Determining that the second content differs from the first content comprises determining that the content from the user input field in the log file does not match the previously entered content. Indication is made of the content from the user input field in the log file as the previously entered content. The content from the user input field in the log file is sent to the classification program in response to the content in the log file not matching the previously entered content to obtain a new result from the rules engine to provide real-time updating of the new result in the user interface.
-
FIG. 1 illustrates an embodiment of a computing environment. -
FIG. 2a illustrates an embodiment of a user interface into which user findings are entered and results based on the findings are rendered. -
FIG. 2b illustrates an embodiment of a user interface in which machine generated best practices are rendered to a user. -
FIG. 3 illustrates an embodiment of a log file. -
FIG. 4 illustrates an embodiment of user content information used to generate information in the user interface. -
FIG. 5 illustrates an embodiment of operations to generate a report through the user interface. -
FIG. 6 illustrates an embodiment of operations to periodically process a log file to determine content to transmit to a classifier program. -
FIG. 7 illustrates an embodiment of operations for a classifier program and rules engine to generate results from findings entered in the user interface. -
FIG. 8 illustrates an embodiment of operations to process results from the rules engine to render in the user interface. -
FIG. 9 illustrates an embodiment of operations performed to retrain the classifier program and/or rules engine based on a message rejecting a machine proposition. -
FIG. 10 illustrates a computing environment in which the components of FIG. 1 may be implemented. - Described embodiments provide improvements to computer technology to provide real-time machine propositions for user content, including observations and user classification of the observations, entered into a user interface via an input device, such as a keyboard or voice-to-text device, where the machine propositions may include actions, recommendations, reference material, calculations, summaries of information from other sources, best practices based on the findings, etc. For instance, a doctor or clinician may be entering observations and possibly a diagnosis of a condition, such as by observing the patient, lab test results, and medical images, and the clinician or doctor may want to be immediately informed of the relevant best practices based on the observed findings. Further, the clinician or doctor may want to see changes to the recommendations based on changes the clinician or doctor makes to the findings entered into a medical report rendered in a user interface to provide real-time feedback on such changes. Further, other types of professionals wanting to obtain recommended actions or best practices for observed findings, such as engineers analyzing a physical structure, device, manufacturing process, etc., or repair technicians, may want to be informed in real time of the actions or best practices to take based on real-time entry of findings and observations and modifications to the findings.
- Described embodiments provide improvements to the computer technology for determining real-time entry or changes to inputted content, such as user entered observations and classifications, entered into a user interface to process and present to a program, such as a machine learning classifier program and rules engine, to provide for real-time feedback when making changes to the inputted observations. In described embodiments, when first content, such as user inputted observations, is entered in the user interface, a determination is made when entry of the first content is completed. In response to entry of the first content, the first content is provided to a classification program to classify the first content into a first machine classification to provide to a rules engine to apply a series of rules to determine a first machine determined proposition for the first machine classification. The first machine determined proposition is rendered in the user interface. A determination is then made of second content in a user input field from the user, such as user entered observations, that differs from the first content for which the first machine determined proposition is rendered. The second content is provided to the classification program to classify the second content into a second machine classification to provide to the rules engine to apply the series of rules to determine a second machine determined proposition for the second machine classification. The second machine determined proposition is rendered in the user interface with the second content.
- Further embodiments provide improvements to computer technology to determine changes to the user entered content, such as observations and classifications, entered into the user interface to provide to the classifier program to provide real-time feedback on actions to take based on the user entered findings. Content entered into the user input field is written into a log file and content from the user input field in the log file is saved as previously entered content. To provide for real-time feedback and real-time machine determined propositions based on changes to the user content entered into the user input field, the log file is periodically processed to determine whether the content from the user input field written to the log file matches the previously entered content. The determining that currently entered content differs from the previously entered and processed content comprises determining that the content from the user input field in the log file does not match the previously entered content. Content from the user input field in the log file is indicated as the previously entered content. The content from the user input field in the log file is sent to the classification program in response to the content in the log file not matching the previously entered content to obtain a new result from the rules engine to provide real time updating of the new result in the user interface.
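For illustration only, the periodic log processing described above may be sketched as follows in Python; the class and function names, and the use of a SHA-256 digest as the fingerprint, are illustrative assumptions and not part of the described embodiments:

```python
import hashlib

def fingerprint(text):
    # Unique fingerprint/hash code calculated from the user entered content
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

class LogProcessor:
    """Periodically compares logged content against the previously entered
    content and reports only changed content for classification."""

    def __init__(self):
        self.previous_fingerprint = None

    def process_log(self, logged_content):
        current = fingerprint(logged_content)
        if current == self.previous_fingerprint:
            return None  # content unchanged; nothing to send to the classifier
        # Indicate the current content as the previously entered content
        self.previous_fingerprint = current
        return logged_content  # changed content to send to the classification program

processor = LogProcessor()
print(processor.process_log("2 cm nodule in left thyroid lobe"))  # changed: returned
print(processor.process_log("2 cm nodule in left thyroid lobe"))  # unchanged: None
```

Only content whose fingerprint differs from the previous fingerprint is forwarded, which keeps repeated polling of an unchanged log file from triggering redundant classification calls.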
- In certain embodiments, the user entered observations may apply to processing user entered medical observations of patient data, observation of the patient conditions and attributes, digital images, and physical samples, such as biopsies and bodily fluid samples, to provide real-time feedback of best practices and recommendations to the user entered medical observations. Upon determining changes in the user entered findings, such as observations and classifications, the new inputted observations may be sent to a classifier program to determine from the user entered medical observations a predefined machine classification, such as a clinical diagnosis or recognized condition, to provide to a rules engine. The rules engine determines medical best practices by applying a series of rules to the machine classification. The best practices and/or reference material, recommendations, calculations, summaries of information from other sources, actions and other relevant information may then be immediately returned to render in the user interface to provide immediate feedback, such as best practices and changes to best practices for user entered observations.
FIG. 1 illustrates an embodiment of a computing environment in which embodiments are implemented. A computing device 100 includes a processor 102, a main memory 104, and a storage 106. The main memory 104 includes various program components including an operating system 108, a report user interface 200, a report generator 110 to generate the report user interface 200, and input drivers 112 to interact with components connected on a bus 114. The memory 104 further includes a log file 300 in which content entered in the report user interface 200 is written, user content information 400 having information on content a user has entered in the report user interface 200, and a report 116 that may be generated from the content entered in the report user interface 200. The report user interface 200 may be continually displayed waiting to take action in response to user input. - The
bus interface 114 may connect the processor 102, main memory 104, a communication transceiver 118 to communicate (via wireless communication or a wired connection) with external devices, including a network, such as the Internet, a cellular network, etc.; a microphone 120 to transmit and receive sound for the personal computing device 100; a display screen 122 to render display output to a user of the personal computing device 100; a speaker 124 to generate sound output to the user; and input controls 126 such as buttons and other software or mechanical buttons, including a keyboard, to receive user input. The components connected on the bus 114 may communicate over one or more bus interfaces 114 and other electronic interfaces. - The
computing device 100 may communicate with a server 128 over a network 130 to transmit user content or findings and observations entered into the report user interface 200 to a classifier program 132 in the server 128 that processes the received user content to generate a classification based on the user entered observation. For instance, if the user observations comprise observations and findings with respect to a medical examination or medical images (e.g., an X-Ray or magnetic resonance imaging (MRI) digital image), the classifier program 132 may produce a machine classification, such as a diagnosis or description, of the user entered findings. In this way, the classifier program 132 presents relevant information based on user entered descriptions, such as a specified pathology. The classifier program 132 may implement a machine learning technique such as decision tree learning, association rule learning, neural network, inductive programming logic, support vector machines, Bayesian network, etc., to determine a classification based on content entered by the user in the report user interface 200, such as medical observations and findings. - In one embodiment, the
classifier program 132 may comprise a machine learning program that is trained using a training set comprising previously generated reports that have been classified with a ground truth classification, and the classifier program 132 is trained to produce the ground truth classifications provided for the training set of reports. For instance, if the training set comprises doctor entered findings, such as findings based on an MRI reading, then the provided ground truths would be doctor determined classifications or clinical diagnoses based on those findings. The classifier program 132 would then be trained with those findings to produce the clinical diagnosis assigned to those findings and observations. The rules engine 134 would then take that machine classification, such as a clinical diagnosis, and determine a proposition, such as a result, action, or best practices based on the classification, using a decision tree or table that associates specific courses of action or results with the classification outputted from the classifier program 132. For instance, if the output of the classifier program 132 comprises a clinical diagnosis, then the rules engine 134 would determine the best practices to treat a clinical diagnosis outputted from the classifier program 132, such as a drug therapy, surgical treatment, further testing, a further follow-up visit, etc. In this way, the rules engine 134 may provide a proposition, such as an action or best practices, reference material, calculations, summary of information from other systems, etc., for each possible classified diagnosis output from the classifier program 132. - In one embodiment, the findings may comprise a size of an observed condition on a patient, such as a size of an abdominal aortic aneurysm (AAA), or observed features, such as size, shape, etc., of an incidental thyroid nodule, ovarian cyst, non-incidental thyroid nodule, enlarged thyroid, simple ovarian cyst, etc. The
rules engine 134 may specify particular best practices given different sizes of the AAA, such as recommended follow-ups after so many years or a follow-up and an additional vascular consultation. In certain embodiments, the content is provided to the classifier program 132 in response to the radiologist adding text to the impressions section 206. - The
server 128 may also include a rules engine 134 providing a decision tree of rules to map received predefined machine classifications outputted from the classifier program 132 to produce a machine determined proposition based on the outputted machine classification, such as a result or action to take based on the classification. For instance, if the machine classification from the classifier program 132 comprises a clinical diagnosis recognized by the rules engine 134, then the rules engine 134 may map that recognized clinical diagnosis to a best practices course of action to address that clinical diagnosis, such as drug therapy, surgery, further testing, a follow-up visit, etc. The server 128 may further include a client interface 136 to interface and manage communications with client systems to receive application calls from the report generator 110 operating in multiple client computing devices 100, such as at different facilities, companies or places of business. In alternative embodiments, the client interface 136 may receive application calls from another component. An advantage of having the rules engine 134 separate from the classifier program 132 is that the rules engine 134 may be independently updated to provide new results, actions or best practices for the classified condition or diagnosis outputted from the classifier program 132. - The
network 130 may comprise one or more networks including Local Area Networks (LAN), Storage Area Networks (SAN), Wide Area Networks (WAN), peer-to-peer networks, wireless networks, the Internet, etc. - The
computing device 100 may store program components and corresponding data, such as 108, 110, 112, 116, 200, 300, in a non-volatile storage 106, which may comprise one or more storage devices known in the art, such as a solid state storage device (SSD) comprised of solid state electronics, NAND storage cells, EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory, flash disk, Random Access Memory (RAM) drive, storage-class memory (SCM), Phase Change Memory (PCM), resistive random access memory (RRAM), spin transfer torque memory (STM-RAM), conductive bridging RAM (CBRAM), magnetic hard disk drive, optical disk, tape, etc. The storage devices may further be configured into an array of devices, such as Just a Bunch of Disks (JBOD), Direct Access Storage Device (DASD), Redundant Array of Independent Disks (RAID) array, virtualization device, etc. Further, the storage devices may comprise heterogeneous storage devices from different vendors or from the same vendor. - The
memory 104 may comprise suitable volatile or non-volatile memory devices, including those described above. - Generally, program modules, such as the program components of the computing device 100 and server 128 of FIG. 1, may be implemented in one or more computer systems, where if they are implemented in multiple computer systems, then the computer systems may communicate over a network. - The program components may be accessed by the processor 102 from the memory 104 to execute. Alternatively, some or all of the program components may be implemented in separate hardware devices. - The functions described as performed by the programs may be performed by fewer or additional program components than described. -
FIG. 1 shows the classifier program 132 and rules engine 134 implemented in a server 128 over a network 130. In an alternative embodiment, the classifier program 132 and rules engine 134 may be implemented in the computing device 100 including the report generator 110 and user interface 200. In further embodiments, the server 128 may comprise a cloud server providing cloud services for classifying user input with the classifier program 132 and determining a course of action or best practices for the classification of the classifier program 132. -
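As a sketch of the separation between the classifier program 132 and the rules engine 134, a lookup table can stand in for the decision tree of rules while a trivial keyword matcher stands in for the trained machine learning classifier; every name, classification, and recommendation below is an illustrative placeholder, not actual medical guidance or the embodiments' implementation:

```python
def classify(findings):
    """Stand-in for the machine learning classifier program: maps user
    entered findings to a predefined machine classification."""
    text = findings.lower()
    if "aneurysm" in text or "aaa" in text:
        return "abdominal_aortic_aneurysm"
    if "thyroid nodule" in text:
        return "incidental_thyroid_nodule"
    return "unclassified"

# Stand-in for the rules engine: maps each machine classification to a
# machine determined proposition (e.g., a best practice recommendation).
# Because it is a separate table, it can be updated independently of the
# classifier, mirroring the advantage described above.
RULES = {
    "abdominal_aortic_aneurysm": "Recommend follow-up imaging and vascular consultation.",
    "incidental_thyroid_nodule": "Recommend follow-up per nodule size criteria.",
    "unclassified": "No machine determined proposition available.",
}

def machine_proposition(findings):
    return RULES[classify(findings)]

print(machine_proposition("3.2 cm infrarenal aneurysm observed"))
```

Swapping in new best practices only requires editing the table, leaving the classifier untouched, which is the benefit of keeping the two components separate.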
FIG. 2a illustrates an embodiment of a user interface 200 in which a user may input data, and includes an observations section 202 in which a user may enter observations about an entity being diagnosed, such as a person, device, or natural phenomenon. The user may describe observations from an image or video, direct observation of a phenomenon, etc. The user may further enter impressions 206, such as classifications and conclusions, in an impressions section 208 providing a classification of the observations, such as a condition, medical diagnosis, natural phenomenon, state of a device, etc. The impressions section 208 may further render an action, solution, best practices, etc. 210 produced by the classifier program 132 and rules engine 134. The classifier program 132 may receive the user entered observations 302 and optionally the user classification 304, such as a diagnosis or conclusion of the observations the user enters in field 206 of the user interface 200, and classify them into a machine classification, such as a clinical diagnosis, category, etc., which is then provided to the rules engine 134 to determine a machine proposition, e.g., result, action, recommendation, best practices, etc. 210 to render in the user interface 200 concerning the machine classified findings. - In certain embodiments, the
report user interface 200 may be continually displayed waiting to take action in response to user input into the impressions section 208 providing the classification. -
FIG. 2b illustrates an embodiment of an alert user interface 220, which may be displayed as a dialog box or pop-up alert rendered on top of the user interface 200, that alerts the user of a machine determined proposition 222 based on the user entered content and findings, determined by the classifier program 132 and rules engine 134 from the user entered observations and classification, or diagnosis, of the observations. The user may select an accept selection control 224 to cause the machine determined best practices 222 to be entered into field 210 of the user interface 200 to accept the machine generated best practices from the rules engine 134. The user may also select a reject selection control 226 to reject the machine determined best practices. The user may enter the machine determined best practices 222 into the report user interface 200 by voice command, by copy and paste from the field 222 in the alert user interface 220 into the best practices field 210 of the user interface 200, or by clicking a button, such as 224 and 226, to cause the automatic insertion of the best practices recommendation in the field 222 of the alert user interface 220 into the user interface 200, such as into field 210. - The
user interface 200 provides improvements to computer technology for rendering machine generated results, such as from a machine learning classifier program 132 and rules engine 134, by, in real-time, providing user entered observations 204 to the classifier program 132 and rules engine 134 to allow immediate computation of new machine propositions based on user entered findings, which include new observations 204 and user entered classifications 206, to immediately display in real-time the machine propositions 210, such as recommended actions, best practices, calculations, reference material, summarized information from other sources, etc., in the user interface 200 as they are being entered in the observations section 202. Further, at some point the observations 204, user entered classifications 206, and machine propositions 210 may be saved as a report 116 for later use. In further embodiments, other types of content and findings, in addition to or different from observations and classifications, may be entered in the user interface. -
FIG. 3 illustrates an embodiment of a log file 300 to which content entered in the user interface 200 is written, including observations 302 entered in the observations section 202 of the user interface 200, such as exams and other patient related information; impressions 304 comprising the user entered classifications 206 entered in the impressions section 208; and the machine propositions 306 from the rules engine 134 resulting from processing the machine classification produced by the classifier program 132, which are rendered as a machine determined proposition, e.g., best practices, etc. 210, in the impressions section 208. -
FIG. 4 illustrates an embodiment of user content information 400 generated by the report generator 110 to determine when to transmit the observations 302 to the classifier program 132, and includes a previous user content fingerprint 402 comprising a unique fingerprint/hash code calculated from a previous state of the user entered impressions 304 and a current user content fingerprint 404 comprising a unique fingerprint/hash code calculated from a current state of the user entered impressions 304. The fingerprints 402, 404 may be compared to determine whether the user entered content has changed since the log file 300 was last processed. -
FIG. 5 illustrates an embodiment of operations performed by the report generator 110 to render the report user interface 200, rendered on the display screen 122, to generate a report 116. Upon initiating (at block 500) an operation to generate a report 116 as part of a start of a report generation session, the report generator 110 receives (at block 502) user input via input controls 126, such as a keyboard, mouse, touchscreen, or microphone 120, to render in the report user interface 200. The report generator 110 continually writes (at block 504) user input entered into the observations 202 and impressions 208 sections into the observations 302 and impressions (e.g., classifications) 304 fields, respectively, of the log file 300. In this way, the log file 300 is updated in real time with new information the user enters into the observations 202 and impressions 208 sections. -
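The continual writing of user input into the log file might be sketched as follows, with an illustrative JSON layout mirroring the observations 302, impressions 304, and machine proposition 306 fields; the described embodiments do not prescribe a log format, so the file name and structure here are assumptions:

```python
import json

# Illustrative log layout mirroring the observations, impressions, and
# machine proposition fields of the log file
log = {"observations": "", "impressions": "", "machine_proposition": ""}

def on_user_input(section, text, path="report_log.json"):
    """Continually write input from the observations and impressions
    sections of the user interface into the log file."""
    log[section] = text
    with open(path, "w", encoding="utf-8") as f:
        json.dump(log, f)  # the log file is updated in real time

on_user_input("observations", "Incidental 1.4 cm thyroid nodule")
on_user_input("impressions", "Likely benign nodule")
```

Because every keystroke-level update rewrites the log, a separate periodic reader can later fingerprint the logged fields without coordinating with the user interface code.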
FIG. 6 illustrates an embodiment of operations performed by the report generator 110 to periodically process the log file 300 to implement real-time updating of the machine determined propositions 210 rendered in the user interface 200. The report generator 110 may periodically process the log file 300 after a time period, such as a fraction of a second, or in response to the writing to the log file 300, etc. Upon processing (at block 600) the log file 300, the report generator 110 processes (at block 602) the content of the observations field 302 to generate a content fingerprint/hash code comprising a unique identifier of the user entered findings. If (at block 604) this is the first processing of the log file 300 for a report generation session, then the report generator 110 determines (at block 606) whether the user entered information in the impressions field 304, which may in certain embodiments signal that the user has completed entering the observations 302. In further implementations, the report generator 110 may continuously provide feedback entered through the user interface 200 without waiting for the user to input content into the impressions section 208. Other tests may be used to determine that the entering of the observations 302 has completed, such as selection of a button or user interface element. If (at block 606) a determination is made that the user completed entering observations, then the content fingerprint calculated from the current observations 302 is stored (at block 608) as the current user content fingerprint 404. The user entered findings, such as observations 302 and impressions 304, are transmitted (at block 610) to the classifier program 132.
user content fingerprint 404 is indicated (at block 614) as the previoususer content fingerprint 402 and the generated content fingerprint is stored (at block 614) as the new currentuser content fingerprint 404. If (at block 616) the current 404 and previous 402 user content fingerprints do not match, then the user enteredobservations 304 has changed since last checked and thereport generator 110 transmits the new user enteredobservations 302 to theclassifier program 132 to generate new results. If (at block 616) thefingerprints new impressions 304 to determine if there are new results, actions or best practices to render. - The embodiment of
FIG. 6 provides improved computer technology to provide an immediate and real-time determination of whether new user observations 204 have been entered in the user interface 200 to provide for real-time updating of the results by immediately sending any changed user entered impressions 304 to the classifier program 132 and rules engine 134 to process. Any new results from the rules engine 134 may be displayed in the alert user interface 220 (FIG. 2b) in the determined proposition field 222, which the user may then cause to be inputted into the report user interface 200. This allows for the immediate updating of the machine determined propositions 210 rendered in the alert user interface 220 based on new user entered findings, such as observations and/or impressions/classifications. -
FIG. 7 illustrates an embodiment of operations performed by the classifier program 132 and the rules engine 134 to process observations 302 and impressions 304 from the user to generate results. Upon receiving (at block 700) information from the user impressions 304 and other information, such as that entered in observations 204, a classification of the observations 302, metadata, etc., the classifier program 132 classifies (at block 702) the entered information, e.g., impressions 304, etc., into a predefined machine classification, such as a clinical diagnosis or recognized condition. In additional embodiments, further information may be inputted into the classifier program 132, such as the user impressions 206, or user classification, and additional information from other sources. For instance, for medical diagnostics, in addition to receiving the user observations 204 and impressions 206, the classifier program 132 may receive medical lab results, information from other reports or sources, etc., all of which may assist the classifier program 132 to produce a more accurate machine classification of the user entered findings, e.g., observations 302 and/or impressions 304. The classifier program 132 sends (at block 704) the determined machine classification to the rules engine 134. - Upon receiving the machine classification, the
rules engine 134 applies (at block 706) a decision tree of rules to the machine classification to determine a machine proposition, such as an action to take, best practices, etc., for the provided classification. The determined result is returned (at block 708) to the report generator 110 to render the result in the alert user interface 220 in field 222 to allow the user to add the determined result to the report. The report generator 110 or another program component may generate and manage the alert user interface 220. - With the embodiment of
FIG. 7, user entered findings, such as observations and/or impressions, entered in real time are sent to the classifier program 132 and the rules engine 134 to provide immediate machine propositions, e.g., actions, best practices, etc., to return to the report generator 110 to immediately render the results in the alert user interface 220, such as in field 222. This enables the user to see in real-time the changes in the results, actions or best practices resulting from entering new findings, and determine whether to have them entered into the report being generated. -
FIG. 8 illustrates an embodiment of operations performed by the report generator 110 to process results received from the rules engine 134. Upon receiving (at block 800) a result, recommendation or action from the rules engine 134 in response to the sent observations 302, the report generator 110 determines (at block 802) whether the received machine proposition matches the machine proposition 306 saved in the log file and currently rendered in field 210 of the user interface 200. If (at block 802) there is a match, then the machine proposition has not changed due to the change in the findings, e.g., observations and impressions, and control ends. If (at block 802) there is not a match, meaning the machine proposition or suggested action has changed since the previous findings, then the report generator 110 displays (at block 804) a dialog box, such as the alert notification dialog box 220 in FIG. 2b, showing the received machine determined recommendations 222 and graphical controls 224, 226 in the dialog box 220. If (at block 806) the user has indicated to accept the received machine proposition, by selecting the accept graphical control 224, then the report generator 110 renders (at block 808) the received machine proposition in the field 210 in the user interface, e.g., in the impressions field 208, and refreshes the user interface display 200 with the new machine proposition. In further embodiments, the user can also copy/paste or use a provided macro to move the machine proposition anywhere into the report. - The new machine proposition is written (at block 810) to the
machine proposition field 306 in the log file 300. If (at block 806) the user rejects the received machine proposition, such as by selecting the reject graphical control 226, then the received machine proposition may be indicated in a field of the user interface 200, such as the action field 210, and indication is also made that the user, such as a radiologist, disagreed with the machine proposition, such as actions, recommendations, reference material, calculations, summary of information from other sources, best practices based on the findings, etc. A user interface is then rendered (at block 814) to allow the user to specify a preferred proposition, e.g., best practices, and/or a preferred classification based on the current content in the user interface 200, as well as additional user feedback or comments. Upon receiving the user input, the report generator 110 sends (at block 816) a message to the client interface 136 at the server 128 with information on the rejected machine proposition, the preferred proposition/preferred classification, the content (observations, impressions, etc.) that was previously sent to generate the rejected machine determined proposition, and any other feedback. - With the embodiment of
FIG. 8, machine determined propositions are provided in real time in response to the recently sent findings. In response, the report generator 110 renders an alert notification dialog 220 to allow the user to review the results, such as machine determined propositions, actions to take, best practices for medical diagnosis, etc., and then determine whether to accept the machine proposition to include in the report or user interface 200 being generated. Further, described embodiments provide improved techniques for allowing the user to provide input on a rejected machine proposition to improve the calculations and operations of the classifier program 132 and rules engine 134 by providing a user preferred classification and/or preferred proposition, based on the current content rendered in the user interface 200, to the server 128 to retrain the rules engine 134 and/or classifier program 132 to further optimize operations and results. Further, the rejected machine propositions may be entered into the log file 300. -
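The comparison and accept flow of FIG. 8 can be sketched as follows; the dialog box is reduced to a returned flag, and all function and field names are illustrative assumptions rather than the embodiments' implementation:

```python
def process_result(received, log):
    """Compare a machine proposition received from the rules engine with the
    proposition saved in the log file; alert only when it has changed."""
    if received == log.get("machine_proposition"):
        return "no_change"  # proposition unchanged; nothing new to render
    return "show_alert"     # changed: render the alert dialog with accept/reject controls

def on_accept(received, log):
    """User accepted: record the proposition as the one rendered in the report."""
    log["machine_proposition"] = received

log = {"machine_proposition": "Follow-up imaging in 12 months."}
print(process_result("Follow-up imaging in 12 months.", log))  # no_change
print(process_result("Vascular consultation advised.", log))   # show_alert
on_accept("Vascular consultation advised.", log)
print(process_result("Vascular consultation advised.", log))   # no_change
```

Keeping the last rendered proposition in the log is what prevents the pop-up alert from re-appearing for an unchanged recommendation.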
FIG. 9 illustrates an embodiment of operations performed by the client interface 136, classifier program 132 and/or rules engine 134 to process a message sent from the report generator 110 when the user rejects the provided machine proposition. Upon receiving (at block 900) the message, including information on the rejected machine proposition, the preferred proposition/preferred classification, the content (observations, impressions, etc.) that was previously sent to generate the rejected machine proposition, and any other feedback, if (at block 902) a preferred classification was provided, then the machine learning algorithm of the classifier program 132 is trained (at block 904) to output the preferred classification as the machine classification based on the previously sent content that resulted in the rejected machine proposition. If (at block 906) a preferred proposition is provided, then the rules engine 134 is updated (at block 908) to output the preferred proposition as the machine proposition for the preferred classification, which the classifier program 132 has been retrained to produce for the given current content. The rules engine 134 may further be updated (at block 910) to not associate the rejected machine proposition with the machine classification used to produce the rejected machine proposition, so as not to produce again the machine proposition that the user rejected given the current content, findings, impressions, observations, etc. in the user interface 200.
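On the rules engine side, the branch structure of blocks 902 through 914 can be sketched as follows; this is a minimal sketch assuming the rules engine reduces to a classification-to-proposition mapping, with classifier retraining (block 904) abstracted to a comment, and all names are illustrative:

```python
def handle_rejection(rules, machine_classification, rejected,
                     preferred_classification=None, preferred_proposition=None):
    """Update the rules engine mapping after a user rejects a machine
    proposition (mirrors the branches of blocks 902-914)."""
    if preferred_classification is not None:
        # Block 904 (abstracted): the classifier would be retrained to output
        # preferred_classification for the previously sent content.
        if preferred_proposition is not None:
            # Block 908: associate the preferred proposition with the
            # preferred classification.
            rules[preferred_classification] = preferred_proposition
        # Block 910: no longer associate the rejected proposition with the
        # machine classification that produced it.
        if rules.get(machine_classification) == rejected:
            del rules[machine_classification]
    elif preferred_proposition is not None:
        # Block 914: associate the preferred proposition with the machine
        # classification used to produce the rejected proposition.
        rules[machine_classification] = preferred_proposition
    return rules

rules = {"dx_a": "old best practice"}
handle_rejection(rules, "dx_a", "old best practice",
                 preferred_proposition="new best practice")
print(rules)  # {'dx_a': 'new best practice'}
```

Because the mapping is mutated in place, subsequent classifications immediately yield the user preferred proposition without redeploying the classifier.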
rules engine 134 is updated (at block 914) to output the preferred proposition as the machine proposition for the machine classifier used to produce the rejected machine proposition, so that the user suggested preferred proposition will be produced instead of the user rejected machine proposition for that machine classifier. From the no branch ofblock 912, block 914 or 910, any additional comments provided with the message are forward to a server administrator to consider for improving the performance of theclassifier program 132 and/orrules engine 134. - The embodiment of
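The branching through blocks 900-914 can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the patent's implementation: the toy classifier and rules engine, and the assumption that the feedback message carries the machine classification that produced the rejected proposition, are introduced for the sketch.

```python
class ToyClassifier:
    """Stand-in for the classifier program: memorizes content -> classification."""
    def __init__(self):
        self.examples = {}

    def train(self, content, classification):
        self.examples[content] = classification      # block 904: learn preference

    def classify(self, content):
        return self.examples.get(content, "unclassified")

class ToyRulesEngine:
    """Stand-in for the rules engine: classification -> set of propositions."""
    def __init__(self):
        self.rules = {}

    def associate(self, classification, proposition):
        self.rules.setdefault(classification, set()).add(proposition)

    def dissociate(self, classification, proposition):
        self.rules.get(classification, set()).discard(proposition)

def process_rejection(message, classifier, rules_engine):
    """Follow the FIG. 9 branches for one rejection-feedback message."""
    content = message["content"]
    rejected = message["rejected_proposition"]
    original_cls = message["machine_classification"]   # assumed to be in the message
    preferred_cls = message.get("preferred_classification")
    preferred_prop = message.get("preferred_proposition")

    if preferred_cls is not None:                                  # block 902: yes
        classifier.train(content, preferred_cls)                   # block 904
        if preferred_prop is not None:                             # block 906: yes
            rules_engine.associate(preferred_cls, preferred_prop)  # block 908
        rules_engine.dissociate(original_cls, rejected)            # block 910
    elif preferred_prop is not None:                               # block 912: yes
        rules_engine.associate(original_cls, preferred_prop)       # block 914
        rules_engine.dissociate(original_cls, rejected)            # replace, not add
```

After processing, the classifier maps the triggering content to the user's preferred classification, and the rules engine no longer offers the rejected proposition for that classification.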
FIG. 9 provides improved computer technology for retraining the classifier program 132 and/or rules engine 134 to produce user preferred output when the user rejects a previously sent machine proposition, allowing continual improvement of the machine propositions produced for given content, such as improved actions, best practices, etc. - The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
- The computer program product comprises a computer readable storage medium implemented using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. The described operations may be implemented as code or logic maintained in a “computer readable storage medium”. The terms “code” and “program code” as used herein refer to software program code, hardware logic, firmware, microcode, etc. The computer readable storage medium, as that term is used herein, includes a tangible element, including at least one of electronic circuitry, storage materials, a casing, a housing, a coating, hardware, and other suitable materials. A computer readable storage medium may comprise, but is not limited to, a magnetic storage medium (e.g., hard disk drives, floppy disks, tape, etc.), optical storage (CD-ROMs, DVDs, optical disks, etc.), volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, Flash Memory, firmware, programmable logic, etc.), Solid State Devices (SSD), computer encoded and readable punch cards, etc. The computer readable storage medium may further comprise a hardware device implementing firmware, microcode, etc., such as in an integrated circuit chip, a programmable logic device, a Programmable Gate Array (PGA), field-programmable gate array (FPGA), Application Specific Integrated Circuit (ASIC), etc. A computer readable storage medium is not comprised solely of transmission signals and includes physical and tangible components. Those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope of the present invention, and that the article of manufacture may comprise a suitable information bearing medium known in the art.
- Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
- The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
- The computational components of
FIG. 1, including the computer device 100 and the server 128, may be implemented in one or more computer systems, having a computer architecture as shown in FIG. 10, and including a processor 1002 (e.g., one or more microprocessors and cores), a memory 1004 (e.g., a volatile memory device), and storage 1006 (e.g., non-volatile storage, such as magnetic disk drives, solid state devices (SSDs), optical disk drives, a tape drive, etc.). The storage 1006 may comprise an internal storage device or an attached or network accessible storage. Programs, including an operating system 1008 and applications 1010 stored in the storage 1006, are loaded into the memory 1004 and executed by the processor 1002. The applications 1010 may include the report user interface 200, report generator 110, input drivers 112, classifier program 132, rules engine 134, and client interface 136, and other program components described above. The architecture 1000 further includes a network card 1012 to enable communication with the network 16. An input device 1014 is used to provide user input to the processor 1002, and may include a keyboard, mouse, pen-stylus, microphone, touch sensitive display screen, or any other activation or input mechanism known in the art. An output device 1016, such as a display monitor, printer, storage, etc., is capable of rendering information transmitted from a graphics card or other component. The output device 1016 may render the GUIs described with respect to the figures, and the input device 1014 may be used to interact with the graphical controls and elements in the GUIs described above. The architecture 1000 may be implemented in any number of computing devices, such as a server, mainframe, desktop computer, laptop computer, handheld computer, tablet computer, personal digital assistant (PDA), telephony device, cell phone, etc. 
- The terms “an embodiment”, “embodiment”, “embodiments”, “the embodiment”, “the embodiments”, “one or more embodiments”, “some embodiments”, and “one embodiment” mean “one or more (but not all) embodiments of the present invention(s)” unless expressly specified otherwise.
- The terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless expressly specified otherwise.
- The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
- The terms “a”, “an” and “the” mean “one or more”, unless expressly specified otherwise.
- Devices that are in communication with each other need not be in continuous communication with each other, unless expressly specified otherwise. In addition, devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.
- A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary a variety of optional components are described to illustrate the wide variety of possible embodiments of the present invention.
- When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the present invention need not include the device itself.
- The foregoing description of various embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto. The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims herein after appended.
Claims (13)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/852,323 US20220334856A1 (en) | 2018-06-28 | 2022-06-28 | Automated pop-up display altering a user of a machine determined medical best practice recommendation |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/022,581 US10970089B2 (en) | 2018-06-28 | 2018-06-28 | User interface for determining real-time changes to content entered into the user interface to provide to a classifier program and rules engine to generate results for the content |
US17/180,678 US20210173679A1 (en) | 2018-06-28 | 2021-02-19 | Training a machine learning module and updating a rules engine to determine medical best practice recommendations for user entered medical observations |
US17/852,323 US20220334856A1 (en) | 2018-06-28 | 2022-06-28 | Automated pop-up display altering a user of a machine determined medical best practice recommendation |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/180,678 Continuation US20210173679A1 (en) | 2018-06-28 | 2021-02-19 | Training a machine learning module and updating a rules engine to determine medical best practice recommendations for user entered medical observations |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220334856A1 true US20220334856A1 (en) | 2022-10-20 |
Family
ID=69008119
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/022,581 Active US10970089B2 (en) | 2018-06-28 | 2018-06-28 | User interface for determining real-time changes to content entered into the user interface to provide to a classifier program and rules engine to generate results for the content |
US17/180,678 Pending US20210173679A1 (en) | 2018-06-28 | 2021-02-19 | Training a machine learning module and updating a rules engine to determine medical best practice recommendations for user entered medical observations |
US17/852,328 Abandoned US20220326971A1 (en) | 2018-06-28 | 2022-06-28 | User interface for recommending insertion of an updated medical best practice recommendation in response to user entry of updated medical observation information for a patient
US17/852,347 Abandoned US20220326973A1 (en) | 2018-06-28 | 2022-06-28 | Log file comparison apparatus and method for providing an updated medical best practice recommendation based on user entry of new medical observation information for a patient |
US17/852,297 Abandoned US20220326970A1 (en) | 2018-06-28 | 2022-06-28 | User interface for providing a medical best practice recommendation based on a user-entered medical observation of a patient |
US17/852,323 Abandoned US20220334856A1 (en) | 2018-06-28 | 2022-06-28 | Automated pop-up display altering a user of a machine determined medical best practice recommendation |
US17/852,336 Abandoned US20220326972A1 (en) | 2018-06-28 | 2022-06-28 | Server-based machine learning classifier and rules engine for providing a medical best practice recommendation based on a user-entered medical observation of a patient |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/022,581 Active US10970089B2 (en) | 2018-06-28 | 2018-06-28 | User interface for determining real-time changes to content entered into the user interface to provide to a classifier program and rules engine to generate results for the content |
US17/180,678 Pending US20210173679A1 (en) | 2018-06-28 | 2021-02-19 | Training a machine learning module and updating a rules engine to determine medical best practice recommendations for user entered medical observations |
US17/852,328 Abandoned US20220326971A1 (en) | 2018-06-28 | 2022-06-28 | User interface for recommending insertion of an updated medical best practice recommendation in response to user entry of updated medical observation information for a patient
US17/852,347 Abandoned US20220326973A1 (en) | 2018-06-28 | 2022-06-28 | Log file comparison apparatus and method for providing an updated medical best practice recommendation based on user entry of new medical observation information for a patient |
US17/852,297 Abandoned US20220326970A1 (en) | 2018-06-28 | 2022-06-28 | User interface for providing a medical best practice recommendation based on a user-entered medical observation of a patient |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/852,336 Abandoned US20220326972A1 (en) | 2018-06-28 | 2022-06-28 | Server-based machine learning classifier and rules engine for providing a medical best practice recommendation based on a user-entered medical observation of a patient |
Country Status (1)
Country | Link |
---|---|
US (7) | US10970089B2 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11838036B2 (en) | 2016-05-09 | 2023-12-05 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for detection in an industrial internet of things data collection environment |
US11327475B2 (en) | 2016-05-09 | 2022-05-10 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for intelligent collection and analysis of vehicle data |
US11774944B2 (en) | 2016-05-09 | 2023-10-03 | Strong Force Iot Portfolio 2016, Llc | Methods and systems for the industrial internet of things |
US10678233B2 (en) | 2017-08-02 | 2020-06-09 | Strong Force Iot Portfolio 2016, Llc | Systems and methods for data collection and data sharing in an industrial environment |
EP3637425A1 (en) * | 2018-10-08 | 2020-04-15 | Smart Reporting GmbH | Method and system for generating a report |
US11539541B1 (en) * | 2019-03-18 | 2022-12-27 | 8X8, Inc. | Apparatuses and methods involving data-communications room predictions |
US11720621B2 (en) * | 2019-03-18 | 2023-08-08 | Apple Inc. | Systems and methods for naming objects based on object content |
US12131475B2 (en) | 2020-03-19 | 2024-10-29 | Unitedhealth Group Incorporated | Systems and methods for automated digital image selection and pre-processing for automated content analysis |
CN112417271B (en) * | 2020-11-09 | 2023-09-01 | 杭州讯酷科技有限公司 | Intelligent system construction method with field recommendation |
US20220254514A1 (en) * | 2021-02-11 | 2022-08-11 | Nuance Communications, Inc. | Medical Intelligence System and Method |
US12056387B2 (en) * | 2022-06-03 | 2024-08-06 | Bmc Software, Inc. | Writing and reading data sets to and from cloud storage for legacy mainframe applications |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110112848A1 (en) * | 2005-07-28 | 2011-05-12 | Roberto Beraja | Medical decision system including question mapping and cross referencing system and associated methods |
US20120059667A1 (en) * | 1998-11-13 | 2012-03-08 | Anuthep Benja-Athon | System and method for preventing malpractices and negligences |
US20130253940A1 (en) * | 2012-03-20 | 2013-09-26 | Zilla Collections Llc | System and method for diagnosis involving crowdsourcing |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020078165A1 (en) * | 2000-12-14 | 2002-06-20 | International Business Machines Corporation | System and method for prefetching portions of a web page based on learned preferences |
WO2002084560A1 (en) * | 2001-04-06 | 2002-10-24 | Florence Comite | System and method for delivering integrated health care |
US7181375B2 (en) | 2001-11-02 | 2007-02-20 | Siemens Medical Solutions Usa, Inc. | Patient data mining for diagnosis and projections of patient states |
US20030225663A1 (en) * | 2002-04-01 | 2003-12-04 | Horan James P. | Open platform system and method |
US7098815B1 (en) * | 2005-03-25 | 2006-08-29 | Orbital Data Corporation | Method and apparatus for efficient compression |
US20070016442A1 (en) * | 2005-06-27 | 2007-01-18 | Richard Stroup | System and method for collecting, organizing, and presenting patient-oriented medical information |
US20070038037A1 (en) * | 2005-08-15 | 2007-02-15 | General Electric Company | Method and apparatus for symptom-based order protocoling within the exam ordering process |
US7801721B2 (en) | 2006-10-02 | 2010-09-21 | Google Inc. | Displaying original text in a user interface with translated text |
US20110124975A1 (en) * | 2007-03-16 | 2011-05-26 | Arthur Solomon Thompson | Method for Medical Diagnosis Utilizing PDA Software Robots |
EP2181726B1 (en) * | 2008-10-31 | 2017-04-19 | ResMed Ltd. | Systems for guiding transitions between therapy modes in connection with treatment and/or diagnosis of sleep-disordered breathing |
WO2012024450A2 (en) * | 2010-08-17 | 2012-02-23 | Wisercare Llc | Medical care treatment decision support system |
US9043901B2 (en) | 2010-09-01 | 2015-05-26 | Apixio, Inc. | Intent-based clustering of medical information |
BR112013013876A2 (en) | 2010-12-10 | 2016-09-13 | Koninkl Philips Electronics Nv | system for detecting an error in a record, workstation, method of detecting an error in a record, and computer program product |
US9916420B2 (en) | 2011-02-18 | 2018-03-13 | Nuance Communications, Inc. | Physician and clinical documentation specialist workflow integration |
US10181360B1 (en) | 2012-09-04 | 2019-01-15 | D.R. Systems, Inc. | Report links |
WO2014071330A2 (en) | 2012-11-02 | 2014-05-08 | Fido Labs Inc. | Natural language processing system and method |
EP2984601A4 (en) | 2013-04-13 | 2016-12-07 | Univ Pennsylvania | System and method for medical image analysis and probabilistic diagnosis |
US10755804B2 (en) | 2016-08-10 | 2020-08-25 | Talix, Inc. | Health information system for searching, analyzing and annotating patient data |
US10748226B2 (en) | 2016-09-07 | 2020-08-18 | UCB Biopharma SRL | Method of generating, storing and mining data related to key opinion leaders in scientific fields and computer system configured for presenting an explorable graphical user interface |
US20180218737A1 (en) | 2017-01-30 | 2018-08-02 | University Of Maryland, Baltimore | System and method for performing real time analysis of dictated reports for improving compliance efficiency |
- 2018-06-28 US US16/022,581 patent/US10970089B2/en active Active
- 2021-02-19 US US17/180,678 patent/US20210173679A1/en active Pending
- 2022-06-28 US US17/852,328 patent/US20220326971A1/en not_active Abandoned
- 2022-06-28 US US17/852,347 patent/US20220326973A1/en not_active Abandoned
- 2022-06-28 US US17/852,297 patent/US20220326970A1/en not_active Abandoned
- 2022-06-28 US US17/852,323 patent/US20220334856A1/en not_active Abandoned
- 2022-06-28 US US17/852,336 patent/US20220326972A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120059667A1 (en) * | 1998-11-13 | 2012-03-08 | Anuthep Benja-Athon | System and method for preventing malpractices and negligences |
US20110112848A1 (en) * | 2005-07-28 | 2011-05-12 | Roberto Beraja | Medical decision system including question mapping and cross referencing system and associated methods |
US20130253940A1 (en) * | 2012-03-20 | 2013-09-26 | Zilla Collections Llc | System and method for diagnosis involving crowdsourcing |
Also Published As
Publication number | Publication date |
---|---|
US20220326973A1 (en) | 2022-10-13 |
US20220326970A1 (en) | 2022-10-13 |
US20220326971A1 (en) | 2022-10-13 |
US20210173679A1 (en) | 2021-06-10 |
US20220326972A1 (en) | 2022-10-13 |
US20200004561A1 (en) | 2020-01-02 |
US10970089B2 (en) | 2021-04-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220334856A1 (en) | Automated pop-up display altering a user of a machine determined medical best practice recommendation | |
AU2020260078B2 (en) | Computer-implemented machine learning for detection and statistical analysis of errors by healthcare providers | |
US20190370383A1 (en) | Automatic Processing of Ambiguously Labeled Data | |
CN112201359A (en) | Artificial intelligence-based critical illness inquiry data identification method and device | |
US11017572B2 (en) | Generating a probabilistic graphical model with causal information | |
US10586615B2 (en) | Electronic health record quality enhancement | |
Ullah et al. | Detecting High‐Risk Factors and Early Diagnosis of Diabetes Using Machine Learning Methods | |
Hussein et al. | Auto-detection of the coronavirus disease by using deep convolutional neural networks and X-ray photographs | |
WO2022121544A1 (en) | Normalizing oct image data | |
US11551817B2 (en) | Assessing unreliability of clinical risk prediction | |
Dhali et al. | Artificial intelligence assisted endoscopic ultrasound for detection of pancreatic space-occupying lesion: A systematic review and meta-analysis | |
US11908552B2 (en) | Methods for quality data extraction, alignment, and reporting | |
CN114201613B (en) | Test question generation method, test question generation device, electronic device, and storage medium | |
CN114579626B (en) | Data processing method, data processing device, electronic equipment and medium | |
WO2023084254A1 (en) | Diagnosic method and system | |
CN112542244B (en) | Auxiliary information generation method, related device and computer program product | |
WO2023095042A1 (en) | A system and method for medical queries | |
US11113338B2 (en) | Concepts for iterative and collaborative generation of data reports via distinct computing entities | |
Alhashem et al. | Diabetes Detection and Forecasting using Machine Learning Approaches: Current State-of-the-art | |
CN111275558A (en) | Method and device for determining insurance data | |
US20230018521A1 (en) | Systems and methods for generating targeted outputs | |
Cálem et al. | Intelligent systems in healthcare: A systematic survey of explainable user interfaces | |
US12014220B2 (en) | Learning-based automatic selection of AI applications | |
Naik et al. | MediMind: A Comprehensive Health Prediction and Record-Keeping Platform | |
WO2023057516A1 (en) | Conformal training of machine-learning models |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RADIOLOGY PARTNERS, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TOBIAS, THOMAS;KOTTLER, NINA;MITSKY, JASON R.;AND OTHERS;SIGNING DATES FROM 20180628 TO 20180706;REEL/FRAME:060354/0436 |
AS | Assignment |
Owner name: RADIOLOGY PARTNERS, INC., CALIFORNIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE FIRST ASSIGNOR'S NAME PREVIOUSLY RECORDED AT REEL: 060354 FRAME: 0436. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:TOBIAS, THOMAS N.;KOTTLER, NINA;MITSKY, JASON R.;AND OTHERS;SIGNING DATES FROM 20180628 TO 20180706;REEL/FRAME:060574/0362 |
AS | Assignment |
Owner name: RADIOLOGY PARTNERS, INC., CALIFORNIA Free format text: ASSIGNEE CHANGE OF ADDRESS;ASSIGNOR:RADIOLOGY PARTNERS, INC.;REEL/FRAME:060921/0675 Effective date: 20220717 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |