WO2021110576A1 - Providing information for assisting in the treatment of a ventilated and sedated patient
- Publication number
- WO2021110576A1 (PCT/EP2020/083865)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- patient
- health
- ventilation
- ultrasound images
- Prior art date
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/10—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to drugs or medications, e.g. for ensuring correct administration to patients
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Definitions
- the present invention relates to the field of patient ventilation, and in particular to providing information for aiding a clinician in treating a ventilated and sedated patient.
- ICU intensive care unit
- a mechanical ventilator inevitably requires a clinician, such as a doctor/physician or ICU nurse, to make decisions on whether to wean the patient from the mechanical ventilator. This may comprise deciding whether a patient should be removed from the ventilation to undertake a spontaneous breathing trial (SBT), which is a first stage in the process for allowing ICU patients to breathe on their own.
- SBT spontaneous breathing trial
- a decision on whether to initiate weaning of a patient from a ventilator is based upon physiological and/or oxygenation information of the patient.
- guidelines may define physiological and/or oxygenation criteria for recommending whether or not the patient should be weaned.
- one criterion may be that the patient’s partial pressure of oxygen (PaO2) must be less than 50 mmHg for weaning to be recommended, whereas another criterion may be that the patient’s fraction of inspired oxygen (FiO2) must be greater than 0.5.
- Other characteristics of the patient’s ventilation may also be controlled by the clinician based on characteristics of the patient, e.g. to increase oxygen flow to a patient with a low SpO2 level.
- sedation monitoring is generally a process that follows guidelines, such as those suggested by the Richmond Agitation-Sedation Scale (RASS). Co-morbidities, weaning failure, and higher mortality risk are only a few of the consequences of prolonged sedation in the ICU.
- RASS Richmond Agitation-Sedation Scale
- properties of sedation may be controlled based on patient characteristics, e.g. to reduce an amount of sedative if the patient exhibits certain co-morbidities.
- cardiac, lung and diaphragm ultrasound imaging e.g. echocardiograms
- diaphragm ultrasound imaging can be performed via a sub-costal or an inter-costal approach with low and high frequency transducers, respectively.
- a computer-implemented method of providing information for assisting in the treatment of a ventilated and sedated patient is provided.
- the computer-implemented method comprises: obtaining one or more ultrasound images of a portion of the patient’s thorax; processing the one or more ultrasound images, using a machine-learning algorithm (i.e. deep learning), to obtain health information, the health information being responsive to changes in the health of the patient; obtaining sedation information of the patient, the sedation information providing details on a dosage of sedative provided to the patient; obtaining ventilation information of the patient, the ventilation information providing details on settings of a ventilator providing ventilation to the patient; and displaying the obtained health information, the sedation information and the ventilation information at a same user interface.
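As a rough illustration of how these method steps could be wired together in software, the following Python sketch bundles the three pieces of information for a single display. The data structure and function names (DashboardRecord, assemble_dashboard) and the dictionary keys are illustrative assumptions, not part of the disclosure, and the trained machine-learning model is represented by a placeholder callable.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

import numpy as np


@dataclass
class DashboardRecord:
    """Everything that is shown together at the single user interface."""
    health_info: dict       # e.g. {"diaphragm_thickness_mm": 2.1, "ejection_fraction": 0.55}
    sedation_info: dict     # e.g. {"propofol_ug_per_kg_per_min": 30.0}
    ventilation_info: dict  # e.g. {"peep_cmH2O": 5.0, "tidal_volume_ml": 420}


def assemble_dashboard(
    thorax_images: Sequence[np.ndarray],
    sedation_info: dict,
    ventilation_info: dict,
    ml_model: Callable[[Sequence[np.ndarray]], dict],
) -> DashboardRecord:
    """Run the (placeholder) trained model on the ultrasound image(s) and bundle
    the resulting health information with the sedation and ventilation details."""
    health_info = ml_model(thorax_images)
    return DashboardRecord(health_info, sedation_info, ventilation_info)


if __name__ == "__main__":
    # Dummy callable standing in for the trained machine-learning algorithm.
    fake_model = lambda imgs: {"diaphragm_thickness_mm": 2.1, "ejection_fraction": 0.55}
    record = assemble_dashboard(
        [np.zeros((256, 256))],
        sedation_info={"propofol_ug_per_kg_per_min": 30.0},
        ventilation_info={"peep_cmH2O": 5.0, "tidal_volume_ml": 420},
        ml_model=fake_model,
    )
    print(record)
```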
- a decision as to whether to take the patient off the ventilator can be at least partially based on information contained in ultrasound images of parts of the patient’s thorax, e.g. an ultrasound image of the heart and/or lungs and/or diaphragm.
- the underlying concept of the invention relies on the recognition that both ventilation settings (of a ventilator) and sedative dosages affect the health of a patient’s thorax, and in particular can cause diaphragm dysfunction that affects the health of anatomical features within the patient’s thorax (such as the diaphragm or heart).
- the present invention proposes to provide a clinician, at a same user interface or dashboard, information about sedative dosages, ventilation settings and health information. This provides a tool for assisting in the treatment of the patient, by aiding in the visualization of the patient’s current state and treatment information.
- the health information may comprise any one or more health indicators that are responsive to a change in the patient’s health, such as a diaphragm thickness or ejection fraction.
- the health indicator may be numerical or categorical, so that the value for a health indicator may be descriptive, numeric or a binary indicator (amongst other value formats).
- the health information comprises any one or more health indicators that are responsive to a change in the probability of (health) complications if the patient is weaned or taken off the ventilator.
- Suitable examples include (an indicator responsive to) a diaphragm thickness (fraction), an ejection fraction, diaphragm excursion, presence/absence of lung sliding, presence/absence of lung infection and so on (all of which have been shown to indicate the probability of the patient having further health problems if ventilation is removed).
- the invention provides a user interface that provides a user with information for managing ventilation and sedation of a patient within a clinical setting, thereby aiding to reduce post-ventilation (e.g. post-extubation) complications.
- information is presented in a meaningful way to a clinician to assist them in performing a clinical task of treating a patient in intensive care undergoing ventilation.
- the health information may comprise at least one measurement of an anatomical feature of the patient’s thorax.
- the health information may comprise at least one measurement for evaluating: diaphragmatic dysfunction, cardiac dysfunction and/or lung dysfunction.
- the health information comprises, or is derived from, at least one of: a diaphragm thickness; an ejection fraction; a diaphragmatic excursion; a presence or absence of lung aeration (i.e. lung sliding, consolidation); and/or indices of cardiac function.
- the health information may comprise an indicator of variance of a health indicator (e.g. diaphragm thickness) over a period of time.
- the health information comprises a health indicator responsive to a diaphragm thickness, i.e. derived from the diaphragm thickness.
- Example health indicators may comprise the diaphragm thickness itself or a trend/ differential of the diaphragm thickness that indicates how the diaphragm thickness changes over time.
- the health information may comprise an indicator of variance of diaphragm thickness over a period of time.
- the health indicator derived from the diaphragm thickness comprises a difference between a current diaphragm thickness and a baseline diaphragm thickness (e.g. obtained before ventilation took place).
- Information derived from a diaphragm thickness has been identified as being particularly useful in assessing the likelihood of a patient having further health problems if ventilation has been removed.
- a trend or baseline comparative diaphragm thickness is able to indicate how the patient has deteriorated over time, and thereby indicates a likely success of them breathing independently.
- the health information comprises a health indicator responsive to a trend of the diaphragm thickness, e.g. over time.
- the health information comprises a health indicator responsive to a difference between a measured diaphragm thickness (e.g. a current diaphragm thickness) and a historic/baseline diaphragm thickness, e.g. measured before ventilation.
- the health indicator may be the difference itself, or a score (e.g. on a scale of 0-10) responsive to the difference. This provides particularly helpful clinical information for assessing the likelihood of complications if ventilation is removed.
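A minimal sketch of one such baseline-comparative indicator, assuming the diaphragm thickness is available in millimetres; the 0-10 scale and the assumed worst-case loss are purely illustrative choices, not clinical values.

```python
def thickness_change_score(current_mm: float, baseline_mm: float,
                           max_expected_loss_mm: float = 1.0) -> float:
    """Map the drop in diaphragm thickness relative to a pre-ventilation baseline
    onto a 0-10 score (10 = no thinning, 0 = thinning at or beyond the assumed
    worst-case loss)."""
    loss = max(0.0, baseline_mm - current_mm)
    fraction = min(loss / max_expected_loss_mm, 1.0)
    return round(10.0 * (1.0 - fraction), 1)


print(thickness_change_score(current_mm=1.7, baseline_mm=2.2))  # -> 5.0
```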
- the health information comprises a plurality of health indicators each responsive to changes in a health of the patient; and the step of displaying the obtained health information comprises: combining the obtained plurality of indicators to form a combined health indicator; and displaying the combined health indicator at the user interface.
- a first combined health indicator may be a combination of a first and second health indicator and a second combined health indicator may be a combination of a third and fourth indicator.
- the one or more ultrasound images comprises a sequence of ultrasound images captured over a period of time; and the step of processing the one or more ultrasound images comprises processing, using a machine-learning algorithm, the sequence of ultrasound images to obtain the health information.
- the sedation information may provide details on a dosage of sedative provided to the patient over at least one period of time.
- the sedation information may indicate that no sedation has been provided or administered to the subject.
- the ventilation information may provide details on settings of a ventilator providing ventilation to the patient over at least one period of time.
- the one or more ultrasound images comprises two or more sets of one or more ultrasound images of the patient’s thorax, wherein each set of one or more ultrasound images is captured at a different time;
- the step of processing the one or more ultrasound images comprises processing each set of one or more ultrasound images, using a machine-learning algorithm, to obtain, for each set, a value for one or more health indicators responsive to a change in the patient’s health;
- the step of displaying health information comprises displaying the obtained health indicator responsive to a change in the patient’s health, to thereby display a plurality of values, each value corresponding to a different time, for each one or more health indicators.
- a time series for a particular health indicator may be generated and displayed by processing more than one set of ultrasound images, each set being acquired at different time points or over different period of time. In this way, the change of a health indicator over time can be tracked and displayed.
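A possible way to assemble such a time series, assuming each set of images is already paired with its acquisition time and that `model` is a placeholder for the trained machine-learning algorithm returning a dictionary of health indicators per set; the indicator key is illustrative.

```python
def build_indicator_series(image_sets, model, indicator="diaphragm_thickness_mm"):
    """Evaluate the (placeholder) model on each timestamped set of ultrasound
    images and return a chronologically ordered (timestamp, value) series for
    one health indicator.  image_sets is an iterable of (timestamp, images) pairs."""
    series = [(timestamp, model(images)[indicator])
              for timestamp, images in image_sets]
    series.sort(key=lambda item: item[0])  # chronological order for plotting
    return series
```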
- the sedation information may similarly provide details on dosages of sedative provided to the patient at different times.
- the ventilation information may provide details on settings of a ventilator providing ventilation to the patient at different times.
- Some embodiments further comprise a step of displaying at least one of the one or more ultrasound images at the user interface.
- Some embodiments further comprise, responsive to a user input, displaying at least one of the one or more ultrasound images at the user interface, optionally wherein the displayed at least one of the one or more ultrasound images is dependent upon the user input.
- the information provision system comprises one or more processing systems adapted to: obtain one or more ultrasound images of a portion of the patient’s thorax; process the one or more ultrasound images, using a machine-learning algorithm, to obtain health information, the health information being responsive to changes in the health of the patient; obtain sedation information of the patient, the sedation information providing details on a dosage of sedative provided to the patient; and obtain ventilation information of the patient, the ventilation information providing details on settings of a ventilator providing ventilation to the patient, and a user interface adapted to display the obtained health information, the sedation information and the ventilation information.
- an ultrasound imaging system comprising: the information provision system herein described; and an ultrasound probe system adapted to perform, responsive to a user’s control, ultrasound imaging to generate one or more ultrasound images of a portion of the patient’s thorax.
- Figure 1 illustrates an ultrasound imaging system having an information provision system according to an embodiment of the invention
- Figure 2 illustrates a template for a display of a user interface according to an embodiment of the invention
- Figures 3 to 5 each illustrate a different display of a user interface according to an embodiment of the invention.
- Figure 6 is a flowchart illustrating a method according to an embodiment of the invention.
- the invention provides a method and system for providing information useful for treating a ventilated and sedated patient at a same screen or user interface.
- One or more ultrasound images (of the patient) are processed using a machine-learning algorithm to derive health information of the patient.
- the health information is displayed at a user interface.
- Sedation and ventilation information is also displayed at the same user interface, to thereby provide a unified dashboard that provides information for treating a ventilated and sedated patient.
- Embodiments may be employed in any clinical setting in which a patient is undergoing ventilation and sedation, such as in an intensive care unit.
- Figure 1 illustrates an ultrasound imaging system 1 comprising an information provision system 100, according to an embodiment of the invention, and an ultrasound probe system 190.
- the information provision system 100 is adapted to provide and display information for assisting in the treatment of a ventilated and sedated patient.
- the information provision system 100 comprises one or more processors 110 for generating the information and a user interface 120 for displaying the generated information.
- the ultrasound probe system 190 is adapted to generate one or more ultrasound images responsive to a user’s control.
- the operation of an ultrasound probe system 190 is well known in the prior art.
- the ultrasound probe system comprises one or more ultrasound probes or transducers 191 for transmitting ultrasound waves and receiving echo information.
- a signal processor 192 receives the echo information and generates one or more ultrasound images from the echo information, e.g. by using a scan converter to arrange the received echo signals in a particular spatial relationship to thereby generate an ultrasound image.
- the information provision system 100 is adapted to obtain one or more ultrasound images 152 at the one or more processors 110. In the illustrated example, these are directly provided by the ultrasound probe system 190, but the skilled person would appreciate that the one or more ultrasound images may be obtained from a storage or memory (e.g. memory 130).
- the obtained ultrasound image(s) 152 each provide an image of (a portion of) the patient’s thorax.
- the ultrasound image(s) may comprise an ultrasound image of the patient’s heart, the patient’s lungs, the patient’s diaphragm, the patient’s bronchial tract, the patient’s entire thorax and so on.
- the information provision system 100 also obtains sedation information 152 of the patient at the one or more processors 110, the sedation information providing details on a dosage of sedative provided to the patient. In the illustrated example, this is obtained from a memory or storage 130 (e.g. storing an electronic medical record of the patient). In other examples, the sedation information is obtained from an infusion system pump (not shown) or the like that controls the delivery of medication to the patient.
- the sedation information may comprise one or more sedation indicators, each sedation indicator providing information on dosages of sedative provided to the patient at a different point or period of time.
- the sedation indicator may be a dose of sedative provided to the patient over a period of time (e.g. µg/kg), a flow rate of the sedative at a particular point in time (e.g. in µg/kg per second, minute, or hour) and/or a relative indicator of the intensity or strength of the sedative (e.g. on a scale of 0 to 10 indicating levels of potential under-, regular or over-sedation).
- the sedation information may be able to indicate when no sedation has been provided or administered to the subject. Other examples will be apparent to the skilled person.
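One hypothetical way to derive such a relative sedation indicator from an infusion rate; the reference bounds below are illustrative placeholders rather than clinical values, and the function name is an assumption.

```python
from typing import Optional


def sedation_intensity(rate_ug_per_kg_per_min: Optional[float],
                       low: float = 20.0, high: float = 60.0) -> str:
    """Classify an infusion rate against illustrative reference bounds;
    returns a 'no sedation' label when no sedative is being administered."""
    if not rate_ug_per_kg_per_min:
        return "no sedation"
    if rate_ug_per_kg_per_min < low:
        return "potential under-sedation"
    if rate_ug_per_kg_per_min > high:
        return "potential over-sedation"
    return "regular sedation"
```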
- the information provision system 100 also obtains, at the one or more processors 110, ventilation information 153 providing details on one or more settings of a ventilator providing ventilation to the patient.
- the ventilation information 153 may be obtained, for example, from a memory or storage 130 or directly from a (mechanical) ventilator 195 ventilating the patient.
- the ventilation information may comprise one or more ventilation indicators, each ventilation indicator providing information on settings of the ventilator providing ventilation to the patient at a different point or period of time.
- a ventilation indicator may be a value representing one or more settings of the ventilator.
- the ventilation indicator may comprise a value of a parameter of a mechanical ventilator ventilating the patient (e.g. a delivered tidal volume or delivered respiratory rate), a combination of values of parameters of a mechanical ventilator (e.g. a positive end-expiratory pressure and a positive ventilation pressure), and/or a relative indicator of the intensity of the provided ventilation (e.g. on a scale of 0 to 10).
- a sedation indicator and/or a ventilation indicator may be numerical or categorical.
- the one or more processors 110 of the information provision system 100 processes the one or more ultrasound images 152, using a machine-learning algorithm, to obtain health information of the patient, wherein the health information is responsive to a change in the patient’s health.
- the health information may comprise one or more indicators or values that are responsive to changes in the health of the patient.
- the health information comprises any one or more health indicators that are responsive to a change in the probability of (health) complications if the patient is weaned or taken off the ventilator (i.e. should the ventilator stop ventilating the patient).
- health information may comprise (an indicator responsive to) at least one characteristic of at least one anatomical feature of the patient’s thorax. Characteristics of an anatomical feature of the patient’s thorax have been shown to be responsive to a likelihood that the patient would face complications if they were removed from ventilation.
- the health information may, for example, comprise (an indicator responsive to) a measurement of an anatomical feature of the patient’s thorax, such as a diaphragm thickness or ejection fraction (of the patient’s heart).
- Other example measurements might include, for example, thickness of respiratory epithelium or diaphragm excursion. A change in any of these (non-exhaustive) measurements has been shown to indicate a change in the probability of the patient facing (medical) complications if they were removed from ventilation.
- the health information is not, however, limited to measurements.
- the health information may, for example, comprise one or more indicators of: a presence or absence of aeration (e.g. lung sliding, consolidation), indices of cardiac function, presence or absence of infection in the lung, presence or absence (and optionally number) of lung nodules, presence or absence (and optionally number) of B-lines within the ultrasound image and so on.
- Other examples will be apparent to the skilled person.
- the one or more processors may be adapted to generate a health indicator from a single ultrasound image or a sequence/set of ultrasound images (e.g. an ultrasound video).
- the one or more processors may be adapted to generate different values for a health indicator at different points in time.
- the one or more processors may be adapted to obtain two or more sets of one or more ultrasound images (each set being associated with a different point or period of time, e.g. starting at different times).
- Each set of one or more ultrasound images may be processed to obtain a value for a health indicator, so that together the sets provide more than one value, each value representing a different point/period of time.
- health information may comprise a time series of a health indicator, wherein a time series comprises values for a health indicator obtained at different times (where each value may be associated with a certain point in time, e.g. timestamped).
- the health information may comprise more than one time series for different health indicators (e.g. a first time series for measurements of diaphragm thickness (e.g. baseline vs follow-up measurement(s)), and a second time series for measurements of ejection fraction).
- the sedation information may comprise a time series of sedation indicators providing information of/on a dosage of sedative provided to the patient at different times.
- the ventilation information may comprise a time series of ventilation information providing information of ventilation settings for ventilating the patient at different times.
- the health information is health information concerning the patient’s thorax, and is derived from ultrasound images of the patient’s thorax.
- the health information may alternatively be labelled: “thorax health information”, “ultrasound-derived health information” and/or “ultrasound-derived thorax health information”, to improve clarity and to make this distinction clear.
- the information provision system 100 is adapted to display the obtained health information, the sedation information and the ventilation information at a same user interface 120.
- the user interface 120 displays the information obtained by the one or more processors 110.
- the one or more processors 110 may control the display of the user interface (e.g. by generating display data that defines what is displayed by the user interface 120).
- the user interface 120 may comprise, for example, a two-dimensional screen that is adapted to display a visual representation of the health, sedation and ventilation information.
- the visual representation may be defined by the one or more processors 110.
- relevant information for a ventilated and sedated patient is presented at a same user interface, ensuring that the necessary information for making a clinical decision for the patient is provided in a same location.
- the user interface 120 may be interactive, so that a user is able to control which pieces of information are provided as a visual representation by the user interface. For example, the user may be able to select what portion of sedation information is displayed at the user interface or may be able to select what (type of) health information is obtained by the one or more processors 110.
- the user interface 120 may be designed to control the content of the health information, the sedation information and/or the ventilation information.
- the user interface 120 may also provide an input to the one or more processors to control or define the processing performed by the one or more processors 110
- the one or more processors are further adapted to provide one or more ultrasound images of a portion of the patient’s thorax to the user interface for display. This may comprise all of the ultrasound images obtained by the processor (for generating the health information) or a subset of the same.
- the user interface 120 obtains one or more ultrasound images of a portion of the patient’s thorax directly from the ultrasound probe system (or memory), bypassing the one or more processors.
- the user interface may display one or more ultrasound images of the subject.
- the displayed ultrasound images may be selected by a user (via interaction with the user interface) or may be based upon health information displayed at the user interface (e.g. to display the ultrasound image(s) from which displayed health information was derived).
- the one or more processors 110 are adapted to obtain additional patient information (e.g. physiological or historical patient information, such as heart-rate information, personal patient information and/or demographic information) and control the user interface 120 to display the additional patient information.
- additional patient information may be obtained, for example, from an electronic medical record (e.g. stored in the database, storage or memory 130) or from a patient monitor (not shown).
- the user interface is adapted to process the health information, the sedation information and/or the ventilation information and generate one or more alarms (e.g. an audio or visual alarm) based on a comparison between said information and predetermined information.
- the predetermined information may comprise, for example, one or more threshold values derived from medical guidelines and/or population data.
- the health information may comprise a numerical measure of diaphragm thickness.
- the diaphragm thickness may be compared to the patient’s baseline values and to threshold values derived either from a population of patients with the same medical history or from guidelines for clinically acceptable diaphragm thicknesses, and an alarm may be generated if the diaphragm thickness exceeds this threshold value.
- the health information may comprise an indicator that indicates whether or not B-lines are detected in the one or more ultrasound images.
- An alarm may be generated in response to detecting that B-lines are present (i.e. as a result of comparing the value “present” to a predetermined information value “B-lines present”).
- the user interface may comprise one or more alarm modules (not shown) adapted to generate one or more user-perceptible outputs (i.e. alarms) based on a comparison between information (displayed by the user interface) and predetermined information.
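A sketch of how such a comparison against predetermined information might be implemented; the dictionary keys, the baseline-deviation rule and the B-line rule are assumptions used only for illustration.

```python
def check_alarms(health_info: dict, predetermined: dict) -> list:
    """Compare displayed health information with predetermined information
    (e.g. thresholds from guidelines or a matched patient population) and
    return a list of alarm messages (empty when nothing is triggered)."""
    alarms = []
    thickness = health_info.get("diaphragm_thickness_mm")
    baseline = predetermined.get("baseline_diaphragm_thickness_mm")
    limit = predetermined.get("max_thickness_deviation_mm")
    if None not in (thickness, baseline, limit) and abs(thickness - baseline) > limit:
        alarms.append(
            f"Diaphragm thickness {thickness} mm deviates from baseline "
            f"{baseline} mm by more than {limit} mm"
        )
    # B-line example: alarm when the detected state matches the predetermined value.
    if health_info.get("b_lines") == predetermined.get("alarm_if_b_lines", "present"):
        alarms.append("B-lines present in the ultrasound image")
    return alarms
```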
- Figures 2 to 5 illustrate different displays of information obtained/generated by the one or more processors 110 at the user interface 120. These Figures aid in understanding different embodiments and versions of information displayed by the user interface.
- Figure 2 illustrates a template for a display 200 of a user interface for displaying the information generated by the one or more processors of the information provision system.
- a first area 250 of the display 200 may be designed or assigned for displaying information (health information, sedation information or ventilation information) obtained by the one or more processors.
- a first set of interactive icons 201-203 enables the user to select which information (obtained by the one or more processors 110) is to be displayed in the first area 250.
- a VENTILATION icon 201 may trigger the display of the ventilation information (or removal of said display)
- a SEDATION icon 202 may trigger the display of the sedation information (or removal of said display)
- the ULTRASOUND icon 203 may trigger the display of the health information (or removal of said display).
- a second set of interactive icons 211-214 may enable the user to select what available health indicators (that can act as health information) contribute to the display of the health information within the first area 250.
- the icons enable a user to select whether any one or more of a diaphragm thickness (DTF), diaphragm excursion (DE), ejection fraction (EF) or number of identified B-lines (B-lines) contributes to the display of the health information in the first area.
- DTF diaphragm thickness
- DE diaphragm excursion
- EF ejection fraction
- B-lines number of identified B-lines
- Baseline ultrasound parameters related to diaphragmatic function could also be measured while subjects are temporarily disconnected from the ventilator or if they are receiving non-invasive ventilation; the goal is to avoid the ventilation burden contaminating the measurements.
- diaphragm examinations such as diaphragm thickness (DTF) can be performed during quiet tidal breathing and maximum inspiration throughout continuous breathing cycles.
- the DTF measurement is an indicator of muscle thickening, which reflects the ventilation burden and whether the ventilation has been properly selected for the subject.
- the diaphragm excursion can be measured during quiet and deep breathing or during inspiration in M-mode ultrasound imaging allowing the placement of the M-mode line parallel to the diaphragmatic excursion.
- DE is the result of a given diaphragmatic contraction for a given mechanical burden.
- PEEP positive end-expiratory pressure
- diaphragmatic function parameters should be within a certain threshold (i.e. DTF during spontaneous breathing trial >11mm) to increase the likelihood of successful breathing trial and patient extubation.
- diaphragmatic information can aid a clinician to make a clinical decision on whether to extubate the patient, and/or during or immediately following extubation of the patient.
- the display 200 may also comprise a second area 260 that displays other monitoring parameters or health information of the patient (e.g. heart rate or respiratory rate).
- a time series is obtained for each other parameter (e.g. PaO2, airway pressure, heart rate) and displayed in graphical form.
- the second area 260 may further or alternatively display the health information as an ensemble of patients’ clinical parameters (color coded) acquired at different points in time.
- the health information comprises a time series of indicators of the patient’s health derived from sets of one or more ultrasound image(s) (e.g. a time series of the patient’s diaphragm thickness)
- this time series may be displayed in the second area.
- the assessment of different elements of health information (e.g. diaphragm and cardiac function along with pleural effusion) in subjects may all be related, meaning that the evaluation of multiple parameters combined together can give a clinician/operator a better understanding of the subject’s clinical status.
- the proposed display 200, providing information on various characteristics of the subject, can thereby help users have a broader view of the health status of the patient.
- the display 200 may also comprise a third area 270 that is adapted to display one or more ultrasound images. These ultrasound images may be provided by the one or more processors, may be obtained directly from an ultrasound imaging system or may be obtained from a memory/storage.
- Figure 3 illustrates one possible display for a user interface according to an embodiment of the invention.
- the display 300 is one embodiment of the template illustrated in Figure 2, in which the information displayed in the first area 250 comprises health information 350a, 355a (illustrated with horizontal hatching), sedation information 350b, 355b (illustrated with diagonal hatching) and ventilation information 350c, 355c (illustrated using stippling).
- the user has appropriately interacted with each of the first set of icons 201-203 so that the relevant information is displayed in the first area 250.
- the health information comprises at least one numerical health indicator 350a, 355a (i.e. a value) responsive to changes in the patient’s health.
- Each numerical health indicator is derived from a set of one or more ultrasound images, by using a machine-learning algorithm to process the set of one or more ultrasound images to generate the health indicator.
- the health information may comprise, as illustrated, a plurality of numerical health indicators 350a, 355a (each responsive to changes in the patient’s health) representing different points or periods in time.
- a first health indicator 350a may provide a numerical health indicator indicative of the patient’s health at a first point/period of time (e.g. baseline health information)
- a second health indicator 355a may provide a numerical health indicator indicative of the patient’s health at a second, different point/period of time (e.g. later than the first point in time), such as current health information.
- Each health indicator of the health information may be derived from a different set of one or more ultrasound images, where each set of one or more ultrasound images is captured at a different point or period of time.
- the first health indicator 350a may represent the health of a patient after four hours of ventilation and the second health indicator 355a may represent the health of the patient after twelve hours of ventilation.
- Each health indicator 350a, 355a of the health information may represent the health of the patient over a predetermined period of time (e.g. over the course of an hour or day). This may comprise, for example, averaging a plurality of numerical health indicators obtained at different points in time over the predetermined period of time.
- each health indicator is normalized, e.g. with respect to a population average (e.g. of a population with similar medical history) or a clinically acceptable value.
- Each displayed health indicator 350a, 355a may be a health indicator that represents a plurality of health sub-indicators of the health of the patient.
- each displayed health indicator 350a, 355a may represent a combination of health sub-indicators of the health of the patient.
- each displayed health indicator 350a, 355a may represent a combination of a measured diaphragm thickness and ejection fraction.
- Each of these individual health sub-indicators may be normalized before combination (e.g. normalized with respect to a population average (e.g. of a population with similar medical history) or a clinically acceptable value, e.g. as indicated in clinical guidelines).
- the combination may be performed using any suitable combining technique, e.g. a sum, a weighted sum, a product, a weighted product model and so on.
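A minimal sketch of the normalization-and-combination step described above, here using a weighted sum; the indicator names, reference values and weights are illustrative assumptions rather than values taken from the disclosure.

```python
def combined_health_indicator(sub_indicators: dict, references: dict,
                              weights: dict) -> float:
    """Normalize each sub-indicator by a reference value (e.g. a population
    average or guideline value) and combine the results as a weighted sum."""
    total, weight_sum = 0.0, 0.0
    for name, value in sub_indicators.items():
        normalized = value / references[name]
        total += weights.get(name, 1.0) * normalized
        weight_sum += weights.get(name, 1.0)
    return total / weight_sum


score = combined_health_indicator(
    {"diaphragm_thickness_mm": 1.8, "ejection_fraction": 0.50},
    references={"diaphragm_thickness_mm": 2.2, "ejection_fraction": 0.60},
    weights={"diaphragm_thickness_mm": 1.0, "ejection_fraction": 1.0},
)
print(round(score, 2))  # combined, normalized indicator
```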
- the display of a health indicator may further identify a relationship between the (value of the) health indicator and a desired (value for the) health indicator. This may be performed by comparing the value of the health indicator to a desired value for the health indicator (e.g. a clinically desired value, based on clinical guidelines, or a population average value) and controlling the display (e.g. a color, size or pattern) of the health indicator based on the comparison.
- the sedation information and ventilation information may be formatted in a similar manner to the health information.
- the sedation information may comprise one or more sedation indicators 350b, 355b providing details on a dosage of sedative provided to the patient (e.g. amount of dosage provided or the like) at a certain point/period of time.
- Each sedation indicator may correspond (e.g. temporally correspond) to a health indicator (i.e. may indicate sedation information for a same point/period as a respective health indicator).
- the display of a sedation indicator may further identify a relationship between the (value of the) sedation indicator and a desired (value for the) sedation indicator.
- Previously described methods (in the context of health indicators) may be appropriately adapted for this step.
- the ventilation information may comprise one or more ventilation indicators 350c, 355c providing details on settings of a ventilator ventilating the patient (e.g. air flow rate, respiration rate, oxygen level and so on) at a certain point/period of time.
- Each ventilation indicator may correspond (e.g. temporally correspond) to a health indicator (i.e. may indicate ventilation information for a same point/period as a respective health indicator). Of course, this may result in each ventilation indicator also corresponding to a respective sedation indicator.
- the display of a ventilation indicator may further identify a relationship between the (value of the) ventilation indicator and a desired (value for the) ventilation indicator. Previously described methods (in the context of health indicators) may be appropriately adapted for this step.
- each indicator of health information may correspond to a respective sedation indicator and a respective ventilation indicator.
- the respective indicators may be grouped (e.g. into a first group 350 or a second group 355) to spatially represent different points/periods of time for the three pieces of information.
- Providing the information obtained at different points/period of time enables a progress or change of the patient over time to be recognized and monitored by a user of the user interface. This can be used to enable a user of the user interface to observe deterioration of the patient and/or changes in their status, to improve a clinical decision making process.
- the third area 270 is populated with one or more ultrasound images 375, 376.
- Each ultrasound image may correspond to an ultrasound image that was used when generating a respective health indicator.
- each ultrasound image for that health indicator may be displayed in turn (e.g. the video may be played) within the third area 270.
- Figure 4 illustrates another possible display 400 for the user interface according to an embodiment of the invention.
- the display of Figure 4 differs from the display of Figure 3 in that, rather than being displayed using a bar chart format, the health information, sedation information and ventilation information are all displayed using a scatter plot.
- the first area 250 comprises a display of a first time series 410 of health indicators, a second time series 420 of sedation indicators and a third time series 430 of ventilation indicators.
- more than one health indicator may be displayed in the first area 250.
- a time series for each of a plurality of different health indicators may be displayed.
- the first time series 410 may represent a diaphragm thickness over time and a fourth time series 440 may represent an ejection fraction over time.
- the selection of which health indicators 410, 440 are displayed in the first area 250 may be dependent upon a user interaction with the second set of icons 211-214.
- a concept of displaying more than one health indicator is not limited to the display 400, and may be implemented in any embodiment of the display.
- This approach enables an overview of the relevant parameter of the subject (and their treatment) over a period of time to be visualized at a same interface.
- Figure 5 illustrates another possible display 500 for the user interface according to an embodiment of the invention.
- the display of Figure 5 differs from the displays of Figures 3 and 4 in that, rather than being displayed using a bar chart or scatter plot format, the health information, sedation information and ventilation information are all displayed using a pie chart 510.
- the pie chart may indicate a relative variation of the health information, the sedation information and the ventilation information over the course of a predetermined period. This can help to indicate which parameters fluctuate or are more stable. There is, of course, a desire to maintain stability of the patient to further improve the patient’s health over time.
- each of the health information, sedation information and ventilation information displayed in the first area 250 comprises an indicator of variance.
- the health information comprises an indicator of variance of a health indicator (e.g. diaphragm thickness) over a period of time.
- This may comprise, for example, a calculated variance of the health indicator over the period of time or a count of the number of times that a value of the health indicator changes by more than a predetermined amount (e.g. by more than a predetermined percentage or magnitude).
- the sedation/ventilation information may similarly comprise an indicator of variance of a sedation/ventilation indicator over a period of time. This may comprise, for example, a calculated variance of the sedation/ventilation information or a count of the number of times that a value of the sedation/ventilation indicator changes by more than a predetermined amount.
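Both forms of variance indicator mentioned above could be computed along the following lines; this is a sketch, and the change threshold is an illustrative parameter.

```python
from statistics import pvariance


def variance_indicators(values, change_threshold):
    """Return both variance indicators for a list of indicator values: the
    calculated variance of the series and the number of successive changes
    whose magnitude exceeds a predetermined amount."""
    large_changes = sum(
        1 for prev, curr in zip(values, values[1:])
        if abs(curr - prev) > change_threshold
    )
    return {"variance": pvariance(values), "large_changes": large_changes}


print(variance_indicators([2.2, 2.1, 1.8, 1.9], change_threshold=0.2))
```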
- the health, sedation and ventilation information is displayed in a pie chart form, so that a user can easily compare a variance of each of these pieces of information, thereby readily detecting which information has the greatest variance.
- the health information 610 forms one portion of a pie chart
- the sedation information 620 forms another portion of the pie chart
- the ventilation information forms yet another portion of the pie chart 630.
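A sketch of how the three variance indicators might be rendered as pie-chart portions, assuming matplotlib is used for the visual representation; the labels and percentage formatting are illustrative choices.

```python
import matplotlib.pyplot as plt


def show_variance_pie(health_var, sedation_var, ventilation_var):
    """Render the three variance indicators as portions of a pie chart so
    that the least stable quantity stands out (illustrative rendering only)."""
    labels = ["Health information", "Sedation information", "Ventilation information"]
    plt.pie([health_var, sedation_var, ventilation_var],
            labels=labels, autopct="%1.0f%%")
    plt.title("Relative variation over the selected period")
    plt.show()
```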
- the health/sedation/ventilation information displayed may comprise statistical information of a health/sedation/ventilation indicator (e.g. over a period of time).
- Such a concept may be applied to any embodiment of the invention, i.e. is not limited to the pie chart embodiment described above. This can assist a user of the user interface to conceptually understand statistical information of such indicators to aid in the treating of the patient.
- the content of the first area 250 may be changed in response to a user input, e.g. interacting with a dedicated icon. This may allow, for example, the user to switch the content of the first area 250 between the examples given in Figures 3 to 5 (e.g. to switch between a bar graph, a scatter plot and/or a pie chart).
- the described methods of providing content in the first area 250 are not exhaustive, and the skilled person would readily contemplate alternative methods of displaying the health information, the sedation information and the ventilation information, such as line charts, radar charts, stem-and-leaf plots and/or box plots. In some examples, the information may be presented using alphanumeric characters, rather than in the form of plots.
- Figure 6 illustrates a method 600 of providing information for assisting in the treatment of a ventilated and sedated patient according to an embodiment of the invention.
- the method 600 comprises a step 601 of obtaining one or more ultrasound images of a portion of the patient’s thorax; and a step 602 of processing the one or more ultrasound images, using a machine-learning algorithm, to obtain health information, the health information being responsive to changes in the health of the patient.
- the method also comprises a step 603 of obtaining sedation information of the patient, the sedation information providing details on a dosage of sedative provided to the patient; and a step 604 of obtaining ventilation information of the patient, the ventilation information providing details on settings of a ventilator providing ventilation to the patient.
- the method further comprises a step 605 of displaying the obtained health information, the sedation information and the ventilation information at a same user interface.
- Embodiments make use of a machine-learning algorithm (e.g. in the form of deep learning) to process an ultrasound image of the patient’s thorax to obtain or predict health information.
- a machine-learning algorithm is any self-training algorithm that processes input data in order to produce or predict output data.
- the input data comprises one or more ultrasound images of the patient’s thorax (“thorax ultrasound images”) and the output data comprises health information, such as one or more health indicators, responsive to changes in the subject’s health.
- Suitable machine-learning algorithms for being employed in the present invention will be apparent to the skilled person.
- suitable machine-learning algorithms include decision tree algorithms and artificial or convolutional neural networks.
- Other machine-learning algorithms such as logistic regression, support vector machines or Naive Bayesian model are suitable alternatives.
- Neural networks are comprised of layers, each layer comprising a plurality of neurons.
- Each neuron comprises a mathematical operation.
- each neuron may comprise a different weighted combination of a single type of transformation (e.g. the same type of transformation, such as a sigmoid, but with different weightings).
- the mathematical operation of each neuron is performed on the input data to produce a numerical output, and the outputs of each layer in the neural network are fed into the next layer sequentially. The final layer provides the output.
- Methods of training a machine-learning algorithm are well known.
- such methods comprise obtaining a training dataset (with annotation), comprising training input data entries and corresponding training output data entries.
- An initialized machine-learning algorithm is applied to each input data entry to generate predicted output data entries.
- An error between the predicted output data entries and corresponding training output data entries is used to modify the machine-learning algorithm. This process can be repeated until the error converges, and the predicted output data entries are sufficiently similar (e.g. ±1%) to the training output data entries.
- This is commonly known as a supervised learning technique.
- the machine-learning algorithm is formed from a neural network
- (weightings of) the mathematical operation of each neuron may be modified until the error converges.
- Known methods of modifying a neural network include gradient descent, backpropagation algorithms and so on.
- the training input data entries correspond to example thorax ultrasound images.
- the training output data entries correspond to examples of at least one health indicator responsive to changes in the health of the subject.
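A minimal supervised-training sketch in PyTorch, assuming the training output is a single numeric health indicator (e.g. a diaphragm thickness) and that a data loader of (image, target) pairs is available; the network architecture and hyperparameters are illustrative placeholders rather than the algorithm actually used.

```python
import torch
from torch import nn


class ThoraxIndicatorNet(nn.Module):
    """Small CNN mapping a single-channel thorax ultrasound image to one
    numeric health indicator (illustrative architecture only)."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))


def train(model, loader, epochs=10, lr=1e-3):
    """Standard supervised loop: predict, measure the error against the
    annotated training outputs, backpropagate, and repeat."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for images, targets in loader:   # images: (N, 1, H, W), targets: (N, 1)
            optimizer.zero_grad()
            loss = loss_fn(model(images), targets)
            loss.backward()
            optimizer.step()
    return model
```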
- the deep learning algorithm that processes the ultrasound image(s) performs a segmentation process on the ultrasound image(s).
- the segmentation identifies elements of the ultrasound image that are able to provide an indicator responsive to the health of the patient.
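For example, if the segmentation yields a binary diaphragm mask and the pixel spacing is known, a thickness estimate could be read off the mask along the following lines; this is a sketch under those assumptions, and the median-per-column rule is an illustrative choice.

```python
import numpy as np


def diaphragm_thickness_mm(mask: np.ndarray, pixel_spacing_mm: float) -> float:
    """Estimate diaphragm thickness as the median number of diaphragm pixels
    per image column, converted to millimetres (binary mask: 1 = diaphragm)."""
    per_column = mask.sum(axis=0)                       # diaphragm pixels in each column
    columns_with_diaphragm = per_column[per_column > 0]
    if columns_with_diaphragm.size == 0:
        return 0.0
    return float(np.median(columns_with_diaphragm)) * pixel_spacing_mm
```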
- each step of the flow chart may represent a different action performed by a processing system, and may be performed by a respective module of the processing system.
- Embodiments may therefore make use of a processing system.
- the processing system can be implemented in numerous ways, with software and/or hardware, to perform the various functions required.
- a processor is one example of a processing system which employs one or more microprocessors that may be programmed using software (e.g., microcode) to perform the required functions.
- a processing system may however be implemented with or without employing a processor, and also may be implemented as a combination of dedicated hardware to perform some functions and a processor (e.g., one or more programmed microprocessors and associated circuitry) to perform other functions.
- Examples of processing system components include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
- a processor or processing system may be associated with one or more storage media such as volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM.
- the storage media may be encoded with one or more programs that, when executed on one or more processors and/or processing systems, perform the required functions.
- Various storage media may be fixed within a processor or processing system or may be transportable, such that the one or more programs stored thereon can be loaded into a processor or processing system.
- a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
Abstract
A method and system for providing information useful for treating a ventilated and sedated patient at a same screen or user interface. One or more ultrasound images (of the patient) are processed using a machine-learning algorithm to derive health information of the patient. The health information is displayed at a user interface. Sedation and ventilation information is also displayed at the same user interface, to thereby provide a unified dashboard that provides information for treating a ventilated and sedated patient.
Description
PROVIDING INFORMATION FOR ASSISTING IN THE TREATMENT OF A VENTILATED AND SEDATED PATIENT
FIELD OF THE INVENTION
The present invention relates to the field of patient ventilation, and in particular to providing information for aiding a clinician in treating a ventilated and sedated patient.
BACKGROUND OF THE INVENTION
The management of (invasive mechanical) ventilation and sedation of a patient constitutes a major part of patient care in an intensive care unit (ICU). Prolonged mechanical ventilation and/or sedation can result in post-extubation complications, such as cardiopulmonary failure.
The use of a mechanical ventilator inevitably requires a clinician, such as a doctor/physician or ICU nurse, to make decisions on whether to wean the patient from the mechanical ventilator. This may comprise deciding whether a patient should be removed from the ventilation to undertake a spontaneous breathing trial (SBT), which is a first stage in the process for allowing ICU patients to breathe on their own.
Typically, a decision on whether to initiate weaning of a patient from a ventilator is based upon physiological and/or oxygenation information of the patient. In particular, guidelines may define physiological and/or oxygenation criteria for recommending whether or not the patient should be weaned. For example, one criterion may be that the patient’s partial pressure of oxygen (PaO2) must be less than 50 mmHg for weaning to be recommended, whereas another criterion may be that the patient’s fraction of inspired oxygen (FiO2) must be greater than 0.5.
Other characteristics of the patient’s ventilation (e.g. tidal volume or mode of the ventilator) may also be controlled by the clinician based on characteristics of the patient, e.g. to increase oxygen flow to a patient with a low SpO2 level.
Similarly, sedation monitoring is generally a process that follows guidelines, such as those suggested by the Richmond Agitation-Sedation Scale (RASS). Co-morbidities, weaning failure, and higher mortality risk are only a few of the consequences of prolonged sedation in the ICU.
Thus, properties of sedation (e.g. amount or type of sedative) may be controlled based on patient characteristics, e.g. to reduce an amount of sedative if the patient exhibits certain co-morbidities.
There is therefore an ongoing desire to provide information that assists a clinician in making clinical decisions about a ventilated and sedated patient, with the aim of improving patient outcome.
One area of clinical research has investigated the use of ultrasound imaging in this scenario, and has demonstrated how cardiac, lung and diaphragm ultrasound imaging (e.g. echocardiograms) can generate information that can non-invasively support decision making for a ventilated and sedated patient. For example, diaphragm ultrasound imaging can be performed via a sub-costal or an inter-costal approach with low and high frequency transducers, respectively.
SUMMARY OF THE INVENTION
The invention is defined by the claims.
According to examples in accordance with an aspect of the invention, there is provided a computer-implemented method of providing information for assisting in the treatment of a ventilated and sedated patient.
The computer-implemented method comprises: obtaining one or more ultrasound images of a portion of the patient’s thorax; processing the one or more ultrasound images, using a machine-learning algorithm (i.e. deep learning), to obtain health information, the health information being responsive to changes in the health of the patient; obtaining sedation information of the patient, the sedation information providing details on a dosage of sedative provided to the patient; obtaining ventilation information of the patient, the ventilation information providing details on settings of a ventilator providing ventilation to the patient; and displaying the obtained health information, the sedation information and the ventilation information at a same user interface.
As previously explained, a decision as to whether to take the patient off the ventilator can be at least partially based on information contained in ultrasound images of parts of the patient’s thorax, e.g. an ultrasound image of the heart and/or lungs and/or diaphragm.
The underlying concept of the invention relies on the recognition that both ventilation settings (of a ventilator) and sedative dosages affect the health of a patient’s thorax, and in particular can cause diaphragm dysfunction that affects the health of anatomical features within the patient’s thorax (such as the diaphragm or heart).
The present invention proposes to provide a clinician, at a same user interface or dashboard, information about sedative dosages, ventilation settings and health information. This provides a tool for assisting in the treatment of the patient, by aiding in the visualization of the patient’s current state and treatment information.
The health information may comprise any one or more health indicators that are responsive to a change in the patient’s health, such as a diaphragm thickness or ejection fraction. The health indicator may be numerical or categorical, so that the value for a health indicator may be descriptive, numeric or a binary indicator (amongst other value formats). Preferably, the health information comprises any one or more health indicators that are responsive to a change in the probability of (health) complications if the patient is weaned or taken off the ventilator. Suitable examples include (an indicator responsive to) a diaphragm thickness (fraction), an ejection fraction, diaphragm excursion, presence/absence of lung sliding, presence/absence of lung infection and so on (all of which have been shown to indicate the probability of the patient having further health problems if ventilation is removed).
In particular, the invention provides a user interface that provides a user with information for managing ventilation and sedation of a patient within a clinical setting, thereby aiding to reduce post-ventilation (e.g. post-extubation) complications. Moreover, information is presented in a meaningful way to a clinician to assist them in performing a clinical task of treating a patient in intensive care undergoing ventilation.
The health information may comprise at least one measurement of an anatomical feature of the patient’s thorax. The health information may comprise at least one measurement for evaluating: diaphragmatic dysfunction, cardiac dysfunction and/or lung dysfunction.
In some embodiments, the health information comprises, or is derived from, at least one of: a diaphragm thickness; an ejection fraction; a diaphragmatic excursion; a presence or absence of lung aeration (i.e. lung sliding, consolidation); and/or indices of cardiac function. For example, the health information may comprise an indicator of variance of a health indicator (e.g. diaphragm thickness) over a period of time.
In particularly preferable embodiments, the health information comprises a health indicator responsive to a diaphragm thickness, i.e. derived from the diaphragm thickness. Example health indicators may comprise the diaphragm thickness itself or a trend/differential of the diaphragm thickness that indicates how the diaphragm thickness changes over time. For example, the health information may comprise an indicator of variance of the diaphragm thickness over a period of time. In some examples, the health indicator derived from the diaphragm thickness comprises a difference between a current diaphragm thickness and a baseline diaphragm thickness (e.g. obtained before ventilation took place).
Information derived from a diaphragm thickness has been identified as being particularly useful in assessing the likelihood of a patient having further health problems if ventilation has been removed. In particular, a trend or baseline comparative diaphragm thickness is able to indicate how the patient has deteriorated over time, and thereby indicates a likely success of them breathing independently.
Preferably, the health information comprises a health indicator responsive to a trend of the diaphragm thickness, e.g. over time.
In particular examples, the health information comprises a health indicator responsive to a difference between a measured diaphragm thickness (e.g. a current diaphragm thickness) and a historic/baseline diaphragm thickness, e.g. measured before ventilation. The health indicator may be the difference itself, or a score (e.g. on a scale of 0-10) responsive to the difference. This provides particularly helpful clinical information for assessing the likelihood of complications if ventilation is removed.
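By way of a non-limiting illustration, the Python sketch below shows one possible way of turning a measured and a baseline diaphragm thickness into such a health indicator. The function name, the 0-10 scale and the 50% scaling constant are assumptions made for this example only and are not prescribed by the disclosure.

```python
# Minimal sketch (not the claimed implementation): deriving a health indicator
# from a current and a baseline diaphragm thickness. The scoring scale and the
# 50% scaling constant are illustrative assumptions.

def diaphragm_thickness_indicator(current_mm: float, baseline_mm: float) -> dict:
    """Return the raw difference and a 0-10 score responsive to that difference."""
    difference_mm = current_mm - baseline_mm
    # Map the relative loss of thickness onto a 0-10 scale: 10 = no loss,
    # 0 = loss of 50% or more of the baseline thickness (assumed scaling).
    relative_loss = max(0.0, -difference_mm) / baseline_mm
    score = max(0.0, 10.0 * (1.0 - relative_loss / 0.5))
    return {"difference_mm": difference_mm, "score_0_to_10": min(10.0, score)}

# Example: baseline 2.4 mm before ventilation, 1.9 mm after several days.
print(diaphragm_thickness_indicator(1.9, 2.4))
```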
In some embodiments, the health information comprises a plurality of health indicators each responsive to changes in a health of the patient; and the step of displaying the obtained health information comprises: combining the obtained plurality of indicators to form a combined health indicator; and displaying the combined health indicator at the user interface.
Of course, more than one combined health indicator may be formed and/or displayed. For example, a first combined health indicator may be a combination of a first and second health indicator and a second combined health indicator may be a combination of a third and fourth indicator.
In some embodiments, the one or more ultrasound images comprises a sequence of ultrasound images captured over a period of time; and the step of processing the one or more ultrasound images comprises processing, using a machine-learning algorithm, the sequence of ultrasound images to obtain the health information.
Optionally, the sedation information provides details on a dosage of sedative provided to the patient over at least one period of time. Of course, the sedation information may indicate that no sedation has been provided or administered to the subject.
The ventilation information may provide details on settings of a ventilator providing ventilation to the patient over at least one period of time.
In at least one embodiment, the one or more ultrasound images comprises two or more sets of one or more ultrasound images of the patient’s thorax, wherein each set of one or more ultrasound images is captured at a different time; the step of processing the one or more ultrasound images comprises processing each set of one or more ultrasound images, using a machine-learning algorithm, to obtain, for each set, a value for one or more health indicators responsive to a change in the patient’s health; and the step of displaying health information comprises displaying the obtained health indicator responsive to a change in the patient’s health, to thereby display a plurality of values, each value corresponding to a different time, for each one or more health indicators.
Thus, a time series for a particular health indicator may be generated and displayed by processing more than one set of ultrasound images, each set being acquired at different time points or over different period of time. In this way, the change of a health indicator over time can be tracked and displayed.
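A minimal sketch of this time-series construction is given below, assuming a hypothetical `estimate_indicators` callable that stands in for the machine-learning processing step; neither the function names nor the data layout are defined by the disclosure.

```python
# Illustrative sketch: turning per-acquisition indicator values into a displayable
# time series, one timestamped value per health indicator per image set.
from datetime import datetime
from typing import Any, Callable, Dict, List, Sequence, Tuple

ImageSet = Sequence[Any]  # e.g. a list of frames from one acquisition

def build_time_series(
    image_sets: Sequence[Tuple[datetime, ImageSet]],
    estimate_indicators: Callable[[ImageSet], Dict[str, float]],
) -> Dict[str, List[Tuple[datetime, float]]]:
    """Collect one timestamped value per health indicator for each acquisition."""
    series: Dict[str, List[Tuple[datetime, float]]] = {}
    for acquired_at, images in image_sets:
        for name, value in estimate_indicators(images).items():
            series.setdefault(name, []).append((acquired_at, value))
    return series

# Stub estimator used purely to show the call pattern.
stub = lambda images: {"diaphragm_thickness_mm": 2.0, "ejection_fraction_pct": 55.0}
print(build_time_series([(datetime(2020, 1, 1, 8), []), (datetime(2020, 1, 1, 20), [])], stub))
```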
The sedation information may similarly provide details on dosages of sedative provided to the patient at different times. Of course, the ventilation information may provide details on settings of a ventilator providing ventilation to the patient at different times.
Some embodiments further comprise a step of displaying at least one of the one or more ultrasound images at the user interface.
Some embodiments further comprise, responsive to a user input, displaying at least one of the one or more ultrasound images at the user interface, optionally wherein the displayed at least one of the one or more ultrasound images is dependent upon the user input.
There is also proposed a computer program product comprising computer program code means which, when executed on one or more processing systems, causes the one or more processing systems to perform all of the steps of any herein described method.
There is also proposed an information provision system for providing information for assisting in the treatment of a ventilated and sedated patient. The information provision system comprises one or more processing systems adapted to: obtain one or more ultrasound images of a portion of the patient’s thorax; process the one or more ultrasound images, using a machine-learning algorithm, to obtain health information, the health
information being responsive to changes in the health of the patient; obtain sedation information of the patient, the sedation information providing details on a dosage of sedative provided to the patient; and obtain ventilation information of the patient, the ventilation information providing details on settings of a ventilator providing ventilation to the patient, and a user interface adapted to display the obtained health information, the sedation information and the ventilation information.
There is also proposed an ultrasound imaging system comprising: the information provision system herein described; and an ultrasound probe system adapted to perform, responsive to a user’s control, ultrasound imaging to generate one or more ultrasound images of a portion of the patient’s thorax.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the invention, and to show more clearly how it may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings, in which:
Figure 1 illustrates an ultrasound imaging system having an information provision system according to an embodiment of the invention;
Figure 2 illustrates a template for a display of a user interface according to an embodiment of the invention;
Figures 3 to 5 each illustrate a different display of a user interface according to an embodiment of the invention; and
Figure 6 is a flowchart illustrating a method according to an embodiment of the invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The invention will be described with reference to the Figures.
It should be understood that the detailed description and specific examples, while indicating exemplary embodiments of the apparatus, systems and methods, are intended for purposes of illustration only and are not intended to limit the scope of the invention. These and other features, aspects, and advantages of the apparatus, systems and methods of the present invention will become better understood from the following description, appended claims, and accompanying drawings. It should be understood that the
Figures are merely schematic and are not drawn to scale. It should also be understood that the same reference numerals are used throughout the Figures to indicate the same or similar parts.
The invention provides a method and system for providing information useful for treating a ventilated and sedated patient at a same screen or user interface. One or more ultrasound images (of the patient) are processed using a machine-learning algorithm to derive health information of the patient. The health information is displayed at a user interface. Sedation and ventilation information is also displayed at the same user interface, to thereby provide a unified dashboard that provides information for treating a ventilated and sedated patient.
Embodiments may be employed in any clinical setting in which a patient is undergoing ventilation and sedation, such as in an intensive care unit.
Figure 1 illustrates an ultrasound imaging system 1 comprising an information provision system 100, according to an embodiment of the invention, and an ultrasound probe system 190.
The information provision system 100 is adapted to provide and display information for assisting in the treatment of a ventilated and sedated patient. In particular, the information provision system 100 comprises one or more processors 110 for generating the information and a user interface 120 for displaying the generated information.
The ultrasound probe system 190 is adapted to generate one or more ultrasound images responsive to a user’s control. The operation of an ultrasound probe system 190 is well known in the prior art. Here, the ultrasound probe system comprises one or more ultrasound probes or transducers 191 for transmitting ultrasound waves and receiving echo information. A signal processor 192 receives the echo information and generates one or more ultrasound images from the echo information, e.g. by using a scan converter to arrange the received echo signals in a particular spatial relationship to thereby generate an ultrasound image.
The information provision system 100 is adapted to obtain one or more ultrasound images 151 at the one or more processors 110. In the illustrated example, these are directly provided by the ultrasound probe system 190, but the skilled person would appreciate that the one or more ultrasound images may be obtained from a storage or memory (e.g. memory 130).
The obtained ultrasound image(s) 152 each provide an image of (a portion of) the patient’s thorax. For example, the ultrasound image(s) may comprise an ultrasound image
of the patient’s heart, the patient’s lungs, the patient’s diaphragm, the patient’s bronchial tract, the patient’s entire thorax and so on.
The information provision system 100 also obtains sedation information 152 of the patient at the one or more processors 110, the sedation information providing details on a dosage of sedative provided to the patient. In the illustrated example, this is obtained from a memory or storage 130 (e.g. storing an electronic medical record of the patient). In other examples, the sedation information is obtained from an infusion system pump (not shown) or the like that controls the delivery of medication to the patient.
The sedation information may comprise one or more sedation indicators, each sedation indicator providing information on dosages of sedative provided to the patient at a different point or period of time.
The sedation indicator may be a dose of sedative provided to the patient over a period of time (e.g. in µg/kg), a flow rate of the sedative at a particular point in time (e.g. in µg/kg per second, minute, or hour) and/or a relative indicator of the intensity or strength of the sedative (e.g. on a scale of 0 to 10 indicating levels of potential under-, regular, or over-sedation). Of course, the sedation information may be able to indicate when no sedation has been provided or administered to the subject. Other examples will be apparent to the skilled person.
The information provision system 100 also obtains, at the one or more processors 110, ventilation information 153 providing details on one or more settings of a ventilator providing ventilation to the patient. The ventilation information 153 may be obtained, for example, from a memory or storage 130 or directly from a (mechanical) ventilator 195 ventilating the patient.
The ventilation information may comprise one or more ventilation indicators, each ventilation indicator providing information on settings of the ventilator providing ventilation to the patient at a different point or period of time.
A ventilation indicator may be a value representing one or more settings of the ventilator. For example, the ventilation indicator may comprise a value of a parameter of a mechanical ventilator ventilating the patient (e.g. a delivered tidal volume or delivered respiratory rate), a combination of values of parameters of a mechanical ventilator (e.g. a positive end-expiratory pressure and a positive ventilation pressure), and/or a relative indicator of the intensity of the provided ventilation (e.g. on a scale of 0 to 10). Other examples will be apparent to the skilled person.
A sedation indicator and/or a ventilation indicator may be numerical or categorical.
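As one hedged illustration only, timestamped sedation and ventilation indicators could be carried in simple records such as the Python dataclasses below; all field names and units are assumptions made for this sketch rather than requirements of the disclosure.

```python
# Assumed record layout for timestamped sedation and ventilation indicators;
# every field name and unit here is illustrative, not defined by the disclosure.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Union

@dataclass
class SedationIndicator:
    timestamp: datetime
    dose_ug_per_kg: Optional[float] = None          # dose over a period of time
    rate_ug_per_kg_per_hr: Optional[float] = None   # flow rate at a point in time
    relative_level: Optional[Union[int, str]] = None  # e.g. 0-10 scale or "none"

@dataclass
class VentilationIndicator:
    timestamp: datetime
    tidal_volume_ml: Optional[float] = None
    respiratory_rate: Optional[float] = None
    peep_cmh2o: Optional[float] = None              # positive end-expiratory pressure
    relative_intensity: Optional[Union[int, str]] = None  # e.g. 0-10 scale

# Example: a categorical sedation entry and a numerical ventilation entry.
print(SedationIndicator(datetime(2020, 1, 1, 8), relative_level="none"))
print(VentilationIndicator(datetime(2020, 1, 1, 8), tidal_volume_ml=450, peep_cmh2o=5.0))
```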
The one or more processors 110 of the information provision system 100 processes the one or more ultrasound images 152, using a machine-learning algorithm, to obtain health information of the patient, wherein the health information is responsive to a change in the patient’s health. In particular, the health information may comprise one or more indicators or values that are responsive to changes in the health of the patient.
Preferably, the health information comprises any one or more health indicators that are responsive to a change in the probability of (health) complications if the patient is weaned or taken off the ventilator (i.e. should the ventilator stop ventilating the patient).
By way of example, health information may comprise (an indicator responsive to) at least one characteristic of at least one anatomical feature of the patient’s thorax. Characteristics of an anatomical feature of the patient’s thorax have been shown to be responsive to a likelihood that the patient would face complications if they were removed from ventilation.
The health information may, for example, comprise (an indicator responsive to) a measurement of an anatomical feature of the patient’s thorax, such as a diaphragm thickness or ejection fraction (of the patient’s heart). Other example measurements might include, for example, thickness of respiratory epithelium or diaphragm excursion. A change in any of these (non-exhaustive) measurements has been shown to indicate a change in the probability of the patient facing (medical) complications if they were removed from ventilation.
The health information is not, however, limited to measurements. In some examples, the health information may, for example, comprise one or more indicators of: a presence or absence of aeration (e.g. lung sliding, consolidation), indices of cardiac function, presence or absence of infection in the lung, presence or absence (and optionally number) of lung nodules, presence or absence (and optionally number) of B-lines within the ultrasound image and so on. Other examples will be apparent to the skilled person.
The one or more processors may be adapted to generate a health indicator from a single ultrasound image or a sequence/set of ultrasound images (e.g. an ultrasound video).
The one or more processors may be adapted to generate different values for a health indicator at different points in time. In particular, the one or more processors may be adapted to obtain two or more sets of one or more ultrasound images (each set being
associated with a different point or period of time, e.g. starting at different times). Each set of one or more ultrasound images may be processed to obtain more than one value for a health indicator, each value representing a different point/period of time.
Thus, health information may comprise a time series of a health indicator, wherein a time series comprises values for a health indicator obtained at different times (where each value may be associated with a certain point in time, e.g. timestamped). Of course, the health information may comprise more than one time series for different health indicators (e.g. a first time series for measurements of diaphragm thickness (e.g. baseline vs follow-up measurement(s)), and a second time series for measurements of ejection fraction).
Similarly, the sedation information may comprise a time series of sedation indicators providing information on a dosage of sedative provided to the patient at different times. The ventilation information may comprise a time series of ventilation indicators providing information on ventilation settings for ventilating the patient at different times.
It should be apparent that the health information is health information concerning the patient’s thorax, and is derived from ultrasound images of the patient’s thorax. Thus, the health information may alternatively be labelled: “thorax health information”, “ultrasound-derived health information” and/or “ultrasound-derived thorax health information”, to improve clarity and to make this distinction clear.
The information provision system 100 is adapted to display the obtained health information, the sedation information and the ventilation information at a same user interface 120. In other words, the user interface 120 displays the information obtained by the one or more processors 110. The one or more processors 110 may control the display of the user interface (e.g. by generating display data that defines what is displayed by the user interface 120).
The user interface 120 may comprise, for example, a two-dimensional screen that is adapted to display a visual representation of the health, sedation and ventilation information. The visual representation may be defined by the one or more processors 110.
Thus, relevant information for a ventilated and sedated patient is presented at a same user interface, ensuring that the necessary information for making a clinical decision for the patient is provided in a same location.
The user interface 120 may be interactive, so that a user is able to control which pieces of information are provided as a visual representation by the user interface. For example, the user may be able to select what portion of sedation information is displayed at
the user interface or may be able to select what (type of) health information is obtained by the one or more processors 110.
In some examples, the user interface 120 may be designed to control the content of the health information, the sedation information and/or the ventilation information.
In other words, the user interface 120 may also provide an input to the one or more processors to control or define the processing performed by the one or more processors 110.
In some embodiments, the one or more processors are further adapted to provide one or more ultrasound images of a portion of the patient’s thorax to the user interface for display. This may comprise all of the ultrasound images obtained by the processor (for generating the health information) or a subset of the same. In other/further embodiments, the user interface 120 obtains one or more ultrasound images of a portion of the patient’s thorax directly from the ultrasound probe system (or memory), bypassing the one or more processors.
In other words, the user interface may display one or more ultrasound images of the subject. The displayed ultrasound images may be selected by a user (via interaction with the user interface) or may be based upon health information displayed at the user interface (e.g. to display the ultrasound image(s) from which displayed health information was derived).
In some embodiments, the one or more processors 110 are adapted to obtain additional patient information (e.g. physiological or historical patient information, such as heart-rate information, personal patient information and/or demographic information) and control the user interface 120 to display the additional patient information. This additional patient information may be obtained, for example, from an electronic medical record (e.g. stored in the database, storage or memory 130) or from a patient monitor (not shown).
These features are entirely optional, and it is not essential that the user interface 120 displays an ultrasound image or any other patient monitoring information.
In some embodiments, the user interface is adapted to process the health information, the sedation information and/or the ventilation information and generate one or more alarms (e.g. an audio or visual alarm) based on a comparison between said information and predetermined information. The predetermined information may comprise, for example, one or more threshold values derived from medical guidelines and/or population data.
By way of example, the health information may comprise a numerical measure of diaphragm thickness. The diaphragm thickness may be compared to the patient’s baseline values and to threshold values derived either from a population of patients with the same medical history or from guidelines for clinically acceptable diaphragm thicknesses, and an alarm may be generated if the diaphragm thickness exceeds this threshold value.
As another example, the health information may comprise an indicator that indicates whether or not B-lines are detected in the one or more ultrasound images. An alarm may be generated in response to detecting that B-lines are present (i.e. as a result of comparing the value “present” to a predetermined information value “B-lines present”).
Accordingly, the user interface may comprise one or more alarm modules (not shown) adapted to generate one or more user-perceptible outputs (i.e. alarms) based on a comparison between information (displayed by the user interface) and predetermined information.
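A minimal sketch of such rule-based alarm generation is given below; the rule names, dictionary keys and threshold values are invented for illustration and are not part of the described system.

```python
# Hedged sketch of the threshold/value comparison described above; rule names,
# keys and thresholds are assumptions for illustration only.
from typing import Any, Callable, Dict, List

AlarmRule = Callable[[Dict[str, Any]], bool]

def evaluate_alarms(health_info: Dict[str, Any], rules: Dict[str, AlarmRule]) -> List[str]:
    """Return the names of every rule whose condition is met."""
    return [name for name, condition in rules.items() if condition(health_info)]

example_rules: Dict[str, AlarmRule] = {
    # Numerical comparison against a baseline/population-derived threshold.
    "diaphragm_thinning": lambda h: h.get("diaphragm_thickness_mm", float("inf"))
    < h.get("diaphragm_thickness_threshold_mm", 0.0),
    # Categorical comparison, e.g. B-lines detected in the ultrasound images.
    "b_lines_present": lambda h: h.get("b_lines") == "present",
}

print(evaluate_alarms({"diaphragm_thickness_mm": 1.6,
                       "diaphragm_thickness_threshold_mm": 1.8,
                       "b_lines": "present"}, example_rules))
```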
Figures 2 to 5 illustrate different displays of information obtained/generated by the one or more processors 110 at the user interface 120. These Figures aid in understanding different embodiments and versions of information displayed by the user interface.
Figure 2 illustrates a template for a display 200 of a user interface for displaying the information generated by the one or more processors of the information provision system.
A first area 250 of the display 200 may be designed or assigned for displaying information (health information, sedation information or ventilation information) obtained by the one or more processors.
A first set of interactive icons 201-203 enables the user to select which information (obtained by the one or more processors 110) is to be displayed in the first area 250. For example, a VENTILATION icon 201 may trigger the display of the ventilation information (or removal of said display), a SEDATION icon 202 may trigger the display of the sedation information (or removal of said display) and the ULTRASOUND icon 203 may trigger the display of the health information (or removal of said display). In the hereafter-described embodiments, it is assumed that all these forms of information are to be displayed in the first area 250.
A second set of interactive icons 211-214 may enable the user to select which available health indicators (that can act as health information) contribute to the display of the health information within the first area 250. In the illustrated example, the icons enable a user to select whether any one or more of a diaphragm thickness (DTF), diaphragm excursion (DE),
ejection fraction (EF) or number of identified B-lines (B-lines) contributes to the display of the health information in the first area.
It has been recognized that the visualization of the above parameters can be extremely helpful for ICU operators or clinicians.
Baseline ultrasound parameters related to diaphragmatic function could also be measured while subjects are temporarily disconnected from the ventilator or if they are receiving non-invasive ventilation; the goal is to avoid contamination of the measurements by the ventilation burden. For example, diaphragm examinations such as diaphragm thickness (DTF) can be performed during quiet tidal breathing and maximum inspiration throughout continuous breathing cycles. The DTF measurement is an indicator of muscle thickening, which reflects the ventilation burden and whether the ventilation has been properly selected for the subject.
The diaphragm excursion (DE) can be measured during quiet and deep breathing or during inspiration in M-mode ultrasound imaging allowing the placement of the M-mode line parallel to the diaphragmatic excursion. In patients that are breathing spontaneously, DE is the result of a given diaphragmatic contraction for a given mechanical burden. For patients receiving mechanical ventilation support, DE depends on the amount of support and PEEP (Positive end-expiratory pressure) level.
These diaphragmatic function parameters should satisfy a certain threshold (e.g. DTF during a spontaneous breathing trial >11mm) to increase the likelihood of a successful breathing trial and patient extubation.
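For context, the diaphragm thickening fraction is conventionally computed from end-inspiratory and end-expiratory thickness measurements. The short Python sketch below illustrates that standard calculation; the example values and any threshold applied to the result are illustrative assumptions and should follow local clinical guidelines rather than this sketch.

```python
# Standard diaphragm thickening fraction (DTF) calculation, given per-breath
# end-inspiratory and end-expiratory thickness measurements.

def thickening_fraction(insp_thickness_mm: float, exp_thickness_mm: float) -> float:
    """DTF = (end-inspiratory - end-expiratory) / end-expiratory thickness."""
    return (insp_thickness_mm - exp_thickness_mm) / exp_thickness_mm

def mean_dtf_over_cycles(cycles: list) -> float:
    """Average DTF over a sequence of (inspiratory, expiratory) thickness pairs."""
    values = [thickening_fraction(i, e) for i, e in cycles]
    return sum(values) / len(values)

# Example: three breathing cycles during a spontaneous breathing trial (assumed values).
print(mean_dtf_over_cycles([(2.6, 2.0), (2.7, 2.1), (2.5, 2.0)]))
```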
Thus, it is clear that diaphragmatic information can aid a clinician in making a clinical decision on whether to extubate the patient, and/or during or immediately following extubation of the patient.
The display 200 may also comprise a second area 260 that displays other monitoring parameters or health information of the patient (e.g. heart rate or respiratory rate). In the illustrated example, a time series is obtained for each other parameter (e.g. PaO2, airway pressure, heart rate) and displayed in graphical form.
The second area 260 may further or alternatively display the health information as an ensemble of patients’ clinical parameters (color coded) acquired at different points in time. For example, if the health information comprises a time series of indicators of the patient’s health derived from sets of one or more ultrasound image(s) (e.g. a time series of the patient’s diaphragm thickness), then this time series may be displayed in the second area.
The assessment of different elements of health information (e.g. diaphragm and cardiac function along with pleural effusion) in subjects may all be related, meaning that the evaluation of multiple parameters combined together can give a clinician/operator a better understanding of the subject’s clinical status.
Consider a scenario in which a subject has left ventricle diastolic and diaphragmatic dysfunction. If a loss of aeration occurs during the weaning process and the spontaneous breathing trials, it will result in increased respiratory work and, hence, an increased cardiac demand. Such a scenario could result in a weaning failure with related cardiogenic pulmonary edema. The proposed display 200, providing information on various characteristics of the subject, can thereby help users have a broader vision of the health status of the patient.
The display 200 may also comprise a third area 270 that is adapted to display one or more ultrasound images. These ultrasound images may be provided by the one or more processors, may be obtained directly from an ultrasound imaging system or may be obtained from a memory/storage.
Figure 3 illustrates one possible display for a user interface according to an embodiment of the invention.
The display 300 is one embodiment of the template illustrated in Figure 2, in which the information displayed in the first area 250 comprises health information 350a,
355a (illustrated with horizontal hatching), sedation information 350b, 355b (illustrated with diagonal hatching) and ventilation information 350c, 355c (illustrated using stippling).
In the illustrated example, the user has appropriately interacted with each of the first set of icons 201-203 so that the relevant information is displayed in the first area 250.
The health information comprises at least one numerical health indicator 350a, 355a (i.e. a value) responsive to changes in the patient’s health. Each numerical health indicator is derived from a set of one or more ultrasound images, by using a machine-learning algorithm to process the set of one or more ultrasound images to generate the health indicator.
The health information may comprise, as illustrated, a plurality of numerical health indicators 350a, 355a (each responsive to changes in the patient’s health) representing different points or period in time. Thus, a first health indicator 350a may provide a numerical health indicator indicative of the patient’s health at a first point/period of time (e.g. baseline health information), and a second health indicator 355a may provide a numerical health
indicator indicative of the patient’s health at a second, different point/period of time (e.g. later than the first point in time), such as current health information.
Each health indicator of the health information may be derived from a different set of one or more ultrasound images, where each set of one or more ultrasound images is captured at a different point or period of time.
Purely by way of example, the first health indicator 350a may represent the health of a patient after four hours of ventilation and the second health indicator 355a may represent the health of the patient after twelve hours of ventilation.
Each health indicator 350a, 355a of the health information may represent the health of the patient over a predetermined period of time (e.g. over the course of an hour or day). This may comprise, for example, averaging a plurality of numerical health indicators obtained at different points in time over the predetermined period of time.
In some embodiments, each health indicator is normalized, e.g. with respect to a population average (e.g. of a population with similar medical history) or a clinically acceptable value.
Each displayed health indicator 350a, 355a may be a health indicator that represents a plurality of health sub-indicators of the health of the patient. For example, each displayed health indicator 350a, 355a may represent a combination of health sub-indicators of the health of the patient. By way of example only, each displayed health indicator 350a, 355a may represent a combination of a measured diaphragm thickness and ejection fraction.
Each of these individual health sub-indicators may be normalized before combination (e.g. normalized with respect to a population average (e.g. of a population with similar medical history) or a clinically acceptable value, e.g. as indicated in clinical guidelines). The combination may be performed using any suitable combining technique, e.g. a sum, a weighted sum, a product, a weighted product model and so on.
The display of a health indicator may further identify a relationship between the (value of the) health indicator and a desired (value for the) health indicator. This may be performed by comparing the value of the health indicator to a desired value for the health indicator (e.g. a clinically desired value, based on clinical guidelines, or a population average value) and controlling the display (e.g. a color, size or pattern) of the health indicator based on the comparison.
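A hedged sketch of such a normalized, weighted combination and of a simple color-coded comparison against a desired value is shown below; the weights, reference values and color bands are assumptions chosen purely for illustration.

```python
# Possible realization of the normalization + weighted combination described
# above; weights, reference values and colour bands are assumptions.
from typing import Dict

def combined_health_indicator(
    values: Dict[str, float],
    reference: Dict[str, float],
    weights: Dict[str, float],
) -> float:
    """Weighted sum of sub-indicators, each normalized by a reference value
    (e.g. a population average or a guideline value)."""
    total_weight = sum(weights.values())
    return sum(weights[k] * (values[k] / reference[k]) for k in weights) / total_weight

def display_colour(indicator: float, desired: float = 1.0, tolerance: float = 0.1) -> str:
    """Colour-code the indicator relative to a desired value (assumed bands)."""
    if abs(indicator - desired) <= tolerance:
        return "green"
    return "amber" if abs(indicator - desired) <= 2 * tolerance else "red"

score = combined_health_indicator(
    values={"diaphragm_thickness_mm": 1.9, "ejection_fraction_pct": 52.0},
    reference={"diaphragm_thickness_mm": 2.4, "ejection_fraction_pct": 60.0},
    weights={"diaphragm_thickness_mm": 0.6, "ejection_fraction_pct": 0.4},
)
print(score, display_colour(score))
```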
The sedation information and ventilation information may be formatted in a similar manner to the health information.
Thus, the sedation information may comprise one or more sedation indicators 350b, 355b providing details on a dosage of sedative provided to the patient (e.g. amount of dosage provided or the like) at a certain point/period of time. Each sedation indicator may correspond (e.g. temporally correspond) to a health indicator (i.e. may indicate sedation information for a same point/period as a respective health indicator).
The display of a sedation indicator may further identify a relationship between the (value of the) sedation indicator and a desired (value for the) sedation indicator. Previously described methods (in the context of health indicators) may be appropriately adapted for this step.
Similarly, the ventilation information may comprise one or more ventilation indicators 350c, 355c providing details on settings of a ventilator ventilating the patient (e.g. air flow rate, respiration rate, oxygen level and so on) at a certain point/period of time. Each ventilation indicator may correspond (e.g. temporally correspond) to a health indicator (i.e. may indicate ventilation information for a same point/period as a respective health indicator). Of course, this may result in each ventilation indicator also corresponding to a respective sedation indicator.
The display of a ventilation indicator may further identify a relationship between the (value of the) ventilation indicator and a desired (value for the) ventilation indicator. Previously described methods (in the context of health indicators) may be appropriately adapted for this step.
As discussed above, each indicator of health information may correspond to a respective sedation indicator and a respective ventilation indicator. The respective indicators may be grouped (e.g. into a first group 350 or a second group 355) to spatially represent different points/periods of time for the three pieces of information.
Providing the information obtained at different points/period of time enables a progress or change of the patient over time to be recognized and monitored by a user of the user interface. This can be used to enable a user of the user interface to observe deterioration of the patient and/or changes in their status, to improve a clinical decision making process.
In the illustrated example, the third area 270 is populated with one or more ultrasound images 375, 376. Each ultrasound image may correspond to an ultrasound image that was used when generating a respective health indicator.
In some examples, if a health indicator is generated by processing a plurality of ultrasound images (e.g. from a video), then each ultrasound image for that health indicator may be displayed in turn (e.g. the video may be played) within the third area 270.
Figure 4 illustrates another possible display 400 for the user interface according to an embodiment of the invention.
The display of Figure 4 differs from the display of Figure 3 in that, rather than being displayed using a bar chart format, the health information, sedation information and ventilation information are all displayed using a scatter plot.
In other words, the first area 250 comprises a display of a first time series 410 of health indicators, a second time series 420 of sedation indicators and a third time series 430 of ventilation indicators.
In some examples, more than one health indicator may be displayed in the first area 250. In particular, a time series for each of a plurality of different health indicators may be displayed. For example, the first time series 410 may represent a diaphragm thickness over time and a fourth time series 440 may represent an ejection fraction over time.
The selection of which health indicators 410, 440 are displayed in the first area 250 may be dependent upon a user interaction with the second set of icons 211-214.
A concept of displaying more than one health indicator is not limited to the display 400, and may be implemented in any embodiment of the display.
This approach enables an overview of the relevant parameter of the subject (and their treatment) over a period of time to be visualized at a same interface.
Figure 5 illustrates another possible display 500 for the user interface according to an embodiment of the invention.
The display of Figure 5 differs from the displays of Figures 3 and 4 in that, rather than being displayed using a bar chart or scatter plot format, the health information, sedation information and ventilation information are all displayed using a pie chart 510.
The pie chart may indicate a relative variation of the health information, the sedation information and the ventilation information over the course of a predetermined period. This can help to indicate which parameters fluctuate or are more stable. There is, of course, a desire to maintain stability of the patient to further improve the patient’s health over time.
In particular, each of the health information, sedation information and ventilation information displayed in the first area 250 comprises an indicator of variance.
For example, the health information comprises an indicator of variance of a health indicator (e.g. diaphragm thickness) over a period of time. This may comprise, for example, a calculated variance of the health indicator over the period of time or a count of the
number of times that a value of the health indicator changes by more than a predetermined amount (e.g. by more than a predetermined percentage or magnitude).
The sedation/ventilation information may similarly comprise an indicator of variance of a sedation/ventilation indicator over a period of time. This may comprise, for example, a calculated variance of the sedation/ventilation information or a count of the number of times that a value of the sedation/ventilation indicator changes by more than a predetermined amount.
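The two variance-style indicators described above could, purely as an assumed illustration, be computed as follows; the parameter name and example PEEP values are invented for this sketch.

```python
# Sketch of the two variance-style indicators: a plain variance over the period
# and a count of consecutive changes exceeding a predetermined step size.
from statistics import pvariance
from typing import Sequence

def variance_indicator(values: Sequence[float]) -> float:
    """Population variance of an indicator over the period."""
    return pvariance(values)

def change_count_indicator(values: Sequence[float], min_change: float) -> int:
    """Number of consecutive-sample changes larger than `min_change`."""
    return sum(1 for a, b in zip(values, values[1:]) if abs(b - a) > min_change)

# Example ventilation indicator (assumed PEEP settings over a shift).
ventilation_peep = [5.0, 5.0, 8.0, 8.0, 10.0, 8.0]
print(variance_indicator(ventilation_peep), change_count_indicator(ventilation_peep, 1.0))
```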
The health, sedation and ventilation information is displayed in a pie chart form, so that a user can easily compare a variance of each of these pieces of information, thereby readily detecting which information has the greatest variance. Thus, the health information 610 forms one portion of the pie chart, the sedation information 620 forms another portion of the pie chart and the ventilation information 630 forms yet another portion of the pie chart.
From the foregoing embodiment, it will be clear that the health/sedation/ventilation information displayed may comprise statistical information of a health/sedation/ventilation indicator (e.g. over a period of time). Such a concept may be applied to any embodiment of the invention, i.e. is not limited to the pie chart embodiment described above. This can assist a user of the user interface to conceptually understand statistical information of such indicators to aid in the treating of the patient.
It will be appreciated that the content of the first area 250 may be changed in response to a user input, e.g. interacting with a dedicated icon. This may allow, for example, the user to switch the content of the first area 250 between the examples given in Figures 3 to 5 (e.g. to switch between a bar graph, a scatter plot and/or a pie chart).
The skilled person would also appreciate that the described methods of providing content in the first area 250 are not exhaustive, and would readily contemplate alternative methods of displaying the health information, the sedation information and the ventilation information, such as line charts, radar charts, stem-and-leaf plots and/or box plots. In some examples, the information may be presented using alphanumeric characters, rather than in the form of plots.
The skilled person would be readily capable of developing a method to carry out any herein described concept.
Nonetheless, for the sake of completion, Figure 6 illustrates a method 600 of providing information for assisting in the treatment of a ventilated and sedated patient according to an embodiment of the invention.
The method 600 comprises a step 601 of obtaining one or more ultrasound images of a portion of the patient’s thorax; and a step 602 of processing the one or more ultrasound images, using a machine-learning algorithm, to obtain health information, the health information being responsive to changes in the health of the patient.
The method also comprises a step 603 of obtaining sedation information of the patient, the sedation information providing details on a dosage of sedative provided to the patient; and a step 604 of obtaining ventilation information of the patient, the ventilation information providing details on settings of a ventilator providing ventilation to the patient.
The method further comprises a step 605 of displaying the obtained health information, the sedation information and the ventilation information at a same user interface.
Embodiments make use of a machine-learning algorithm (e.g. in the form of a deep learning algorithm) to process an ultrasound image of the patient’s thorax to obtain or predict health information.
A machine-learning algorithm is any self-training algorithm that processes input data in order to produce or predict output data. Here, the input data comprises one or more ultrasound images of the patient’s thorax (“thorax ultrasound images”) and the output data comprises health information, such as one or more health indicators, responsive to changes in the subject’s health.
Suitable machine-learning algorithms for being employed in the present invention will be apparent to the skilled person. Examples of suitable machine-learning algorithms include decision tree algorithms and artificial or convolutional neural networks. Other machine-learning algorithms such as logistic regression, support vector machines or Naive Bayesian model are suitable alternatives.
The structure of an artificial or convolutional neural network (or, simply, neural network) is inspired by the human brain. Neural networks are composed of layers, each layer comprising a plurality of neurons. Each neuron comprises a mathematical operation. In particular, each neuron may comprise a different weighted combination of a single type of transformation (e.g. the same type of transformation, such as a sigmoid, but with different weightings). In the process of processing input data, the mathematical operation of each neuron is performed on the input data to produce a numerical output, and the outputs of each layer in the neural network are fed into the next layer sequentially. The final layer provides the output.
Methods of training a machine-learning algorithm are well known. Typically, such methods comprise obtaining a training dataset (with annotation), comprising training
input data entries and corresponding training output data entries. An initialized machine-learning algorithm is applied to each input data entry to generate predicted output data entries. An error between the predicted output data entries and corresponding training output data entries is used to modify the machine-learning algorithm. This process can be repeated until the error converges, and the predicted output data entries are sufficiently similar (e.g. ±1%) to the training output data entries. This is commonly known as a supervised learning technique.
For example, where the machine-learning algorithm is formed from a neural network, (weightings of) the mathematical operation of each neuron may be modified until the error converges. Known methods of modifying a neural network include gradient descent, backpropagation algorithms and so on.
The training input data entries correspond to example thorax ultrasound images. The training output data entries correspond to examples of at least one health indicator responsive to changes in the health of the subject.
In preferable embodiments, the deep learning algorithm that processes the ultrasound image(s) performs a segmentation process on the ultrasound image(s). The segmentation identifies elements of the ultrasound image that are able to provide an indicator responsive to the health of the patient.
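For illustration only, the following PyTorch sketch shows a minimal supervised set-up of the kind described above: a small convolutional network regressing one numerical health indicator (e.g. a diaphragm thickness) from a single-channel ultrasound image. The architecture, image size, number of epochs and the synthetic stand-in data are all assumptions; the disclosure does not prescribe this network or any particular framework.

```python
# Minimal, illustrative training loop: a CNN predicting one health indicator
# from an ultrasound image, trained with a supervised (regression) objective.
import torch
import torch.nn as nn

class IndicatorNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 1)  # one numerical health indicator

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = IndicatorNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in training data: 32 random "images" with random target indicator values.
images = torch.rand(32, 1, 64, 64)
targets = torch.rand(32, 1)

for epoch in range(5):        # in practice, repeat until the error converges
    optimiser.zero_grad()
    loss = loss_fn(model(images), targets)
    loss.backward()           # backpropagation of the prediction error
    optimiser.step()          # gradient-based weight update
```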
The skilled person would be readily capable of developing a processing system for carrying out any herein described method. Thus, each step of the flow chart may represent a different action performed by a processing system, and may be performed by a respective module of the processing system.
Embodiments may therefore make use of a processing system. The processing system can be implemented in numerous ways, with software and/or hardware, to perform the various functions required. A processor is one example of a processing system which employs one or more microprocessors that may be programmed using software (e.g., microcode) to perform the required functions. A processing system may however be implemented with or without employing a processor, and also may be implemented as a combination of dedicated hardware to perform some functions and a processor (e.g., one or more programmed microprocessors and associated circuitry) to perform other functions.
Examples of processing system components that may be employed in various embodiments of the present disclosure include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
In various implementations, a processor or processing system may be associated with one or more storage media such as volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM. The storage media may be encoded with one or more programs that, when executed on one or more processors and/or processing systems, perform the required functions. Various storage media may be fixed within a processor or processing system or may be transportable, such that the one or more programs stored thereon can be loaded into a processor or processing system.
It will be understood that disclosed methods are preferably computer-implemented methods. As such, there is also proposed the concept of a computer program comprising code means for implementing any described method when said program is run on a processing system, such as a computer. Thus, different portions, lines or blocks of code of a computer program according to an embodiment may be executed by a processing system or computer to perform any herein described method. In some alternative implementations, the functions noted in the block diagram(s) or flow chart(s) may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. If a computer program is discussed above, it may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. If the term "adapted to" is used in the claims or description, it is noted the term "adapted to" is intended to be equivalent to the term "configured to". Any reference signs in the claims should not be construed as limiting the scope.
Claims
1. A computer-implemented method (600) of providing information for assisting in the treatment of a ventilated and sedated patient, the computer-implemented method comprising: obtaining (601) one or more ultrasound images (151) of a portion of the patient’s thorax; processing (602) the one or more ultrasound images, using a machine-learning algorithm, to obtain health information, the health information being responsive to changes in the health of the patient; obtaining (603) sedation information (152) of the patient, the sedation information providing details on a dosage of sedative provided to the patient; obtaining (604) ventilation information (153) of the patient, the ventilation information providing details on settings of a ventilator (195) providing ventilation to the patient; and displaying (605) the obtained health information, the sedation information and the ventilation information at a same user interface (120).
2. The computer-implemented method of claim 1, wherein the health information comprises at least one measurement of an anatomical feature of the patient’s thorax.
3. The computer-implemented method of claim 1 or claim 2, wherein the health information comprises at least one measurement for evaluating: diaphragmatic dysfunction, cardiac dysfunction and/or lung dysfunction.
4. The computer-implemented method of any of claims 1 to 3, wherein: the health information comprises a plurality of health indicators each responsive to changes in a health of the patient; and the step of displaying the obtained health information comprises: combining the obtained plurality of indicators to form a combined health indicator; and
displaying the combined health indicator at the user interface.
5. The computer-implemented method of any of claims 1 to 4, wherein: the one or more ultrasound images (151) comprises a sequence of ultrasound images captured over a period of time; and the step of processing the one or more ultrasound images comprises processing, using a machine-learning algorithm, the sequence of ultrasound images to obtain the health information.
6. The computer-implemented method of any of claims 1 to 4, wherein the health information is responsive to at least one of: a diaphragm thickness; an ejection fraction; a diaphragmatic excursion; a presence or absence of lung aeration; and/or cardiac function.
7. The computer-implemented method of claim 6, wherein the health information comprises a health indicator responsive to a trend of the diaphragm thickness.
8. The computer-implemented method of any of claims 1 to 7, wherein: the one or more ultrasound images comprises two or more sets of one or more ultrasound images of the patient’s thorax, wherein each set of one or more ultrasound images is captured at a different time; the step of processing the one or more ultrasound images comprises processing each set of one or more ultrasound images, using a machine-learning algorithm, to obtain, for each set, a value for one or more health indicators responsive to a change in the patient’s health; and the step of displaying the obtained health information comprises displaying the obtained health indicator responsive to a change in the patient’s health, to thereby display a plurality of values, each value corresponding to a different time, for each one or more health indicators.
9. The computer-implemented method of any of claims 1 to 8, wherein the sedation information provides details on dosages of sedative provided to the patient at different times.
10. The computer-implemented method of any of claims 1 to 9, wherein the ventilation information provides details on settings of a ventilator (195) providing ventilation to the patient at different times.
11. The computer-implemented method of any of claims 1 to 10, further comprising displaying at least one of the one or more ultrasound images at the user interface (120).
12. The computer-implemented method of any of claims 1 to 11, further comprising, responsive to a user input, displaying at least one of the one or more ultrasound images at the user interface, optionally wherein the displayed at least one of the one or more ultrasound images is dependent upon the user input.
13. A computer program product comprising computer program code means which, when executed on one or more processing systems, causes the one or more processing systems to perform all of the steps of the method according to any of claims 1 to 12.
14. An information provision system (100) for providing information for assisting in the treatment of a ventilated and sedated patient comprising: one or more processing systems (110) adapted to: obtain (601) one or more ultrasound images of a portion of the patient’s thorax; process (602) the one or more ultrasound images, using a machine- learning algorithm, to obtain health information, the health information being responsive to changes in the health of the patient; obtain (603) sedation information of the patient, the sedation information providing details on a dosage of sedative provided to the patient; and obtain (604) ventilation information of the patient, the ventilation information providing details on settings of a ventilator providing ventilation to the patient, and a user interface (120) adapted to display (605) the obtained health information, the sedation information and the ventilation information.
15. An ultrasound imaging system (1) comprising: the information provision system of claim 14; and an ultrasound probe system (190) adapted to perform, responsive to a user’s control, ultrasound imaging to generate one or more ultrasound images of a portion of the patient’s thorax.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201962942590P | 2019-12-02 | 2019-12-02 | |
| US62/942590 | 2019-12-02 | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2021110576A1 (en) | 2021-06-10 |