
WO2024111386A1 - Information processing device, information processing method, program, and information processing system - Google Patents


Info

Publication number
WO2024111386A1
WO2024111386A1 (PCT/JP2023/039825)
Authority
WO
WIPO (PCT)
Prior art keywords
data
display screen
patient
information processing
display
Prior art date
Application number
PCT/JP2023/039825
Other languages
French (fr)
Japanese (ja)
Inventor
弘泰 馬場
宇紀 深澤
奈々 河村
Original Assignee
Sony Group Corporation
Priority date
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2024111386A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/235: Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording
    • H04N 5/91: Television signal processing therefor
    • H04N 5/92: Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 5/93: Regeneration of the television signal or of selected parts thereof
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This technology relates to an information processing device, an information processing method, a program, and an information processing system, and in particular to an information processing device, an information processing method, a program, and an information processing system that enable efficient review of a patient's condition.
  • In ICUs (intensive care units), doctors, nurses, and other medical professionals often assess a patient's condition not only from changes in vital signs and test data, but also by comprehensively judging information from the patient's appearance, such as facial expressions and breathing patterns, to estimate the severity of the patient's condition.
  • Patent Document 1 discloses a technology that analyzes a patient's movements based on hospital bed image data and estimates the severity of the patient's condition based on the analyzed movements.
  • Some hospitals have cameras installed on the ceilings above beds, but their primary purpose is to eliminate blind spots from the nurses' station, and currently the recorded footage is neither reviewed nor analyzed. Furthermore, even if medical staff were to review the recordings, it would be difficult to thoroughly review footage covering a huge number of patients over an enormous span of time.
  • This technology was developed in light of these circumstances, and makes it possible to efficiently review a patient's condition.
  • An information processing device according to one aspect of the present technology includes a data processing unit that calculates an overall feature based on a data set consisting of video features, which are features of video data obtained by imaging a patient, and patient data, which is at least one item of the patient's clinical data, and a display screen generation unit that generates a first display screen that displays the calculated overall feature in chronological order together with a display component indicating the playback position of the video data.
  • In one aspect of the present technology, an overall feature is calculated based on a data set consisting of video features, which are features of video data obtained by imaging a patient, and patient data, which is at least one item of the patient's clinical data. A first display screen is then generated that displays the calculated overall feature in chronological order together with a display component indicating the playback position of the video data.
  • FIG. 1 is a diagram illustrating a configuration example of a review display system according to an embodiment of the present technology.
  • FIG. 2 is a block diagram showing a detailed configuration of the review display system of FIG. 1.
  • FIG. 3 is a block diagram showing an example of the functional configuration of the data processing unit in FIG. 2.
  • FIG. 4 is a diagram showing an example of the configuration of a review display screen used for reviewing video data.
  • FIG. 5 is an enlarged view of the video operation unit in FIG. 4.
  • FIG. 6 is a diagram showing an example of the configuration of patient features displayed on the playback seek bar in FIG. 5.
  • FIG. 7 is a diagram showing an example of the display of patient features in the case of the constituent data set of FIG. 6.
  • FIG. 8 is a diagram showing an example of the display of patient features in the case of the overall feature of FIG. 6.
  • FIG. 9 is a diagram illustrating examples of overall features and constituent data sets.
  • FIG. 10 is a diagram illustrating an example of customization of an overall feature.
  • FIG. 11 is a diagram showing an example of a superimposed display of overall features.
  • FIG. 12 is a flowchart illustrating the process of setting patient features.
  • FIG. 13 is a flowchart illustrating the process of generating an event label.
  • FIG. 14 is a diagram showing an example of threshold setting in the time-series data display operation unit.
  • FIG. 15 is a diagram showing an example of the display of the playback position of video data currently being played back.
  • FIG. 16 is a diagram showing an example of displaying correlated data.
  • FIG. 17 is a block diagram illustrating an example of the configuration of a computer.
  • FIG. 1 is a block diagram showing an example of a configuration of a review display system according to an embodiment of the present technology.
  • The review display system 1 in FIG. 1 is used in settings such as intensive care units (ICUs), where the patient's condition must be carefully observed, and allows medical staff such as doctors and nurses to efficiently review video of the patient's condition around the bed.
  • "Efficiently" here means that, since video of patients (hereafter, patient video) recorded continuously can span an enormous amount of time, only the important scenes can be selected and played back.
  • The review display system 1 displays information such as the video, video features extracted from the video, and clinical data such as vital signs recorded in an existing patient information system 2 together on a single display screen.
  • The review display system 1 also uses the patient's video features and event labels (tags) indicating events to clearly present the patient's important scenes, and can play back the video data from those important scenes.
  • Video features include information obtained through image analysis (facial feature point detection, facial expression estimation, whole-body skeletal point estimation, etc.), such as the subject's facial expression, joint point positions, and the number of people in the frame, as well as information converted into values per unit time, such as the amount of movement in the entire image.
  • Examples of video features include the average agonized-expression score per unit time, the frequency peak and amplitude of positional changes of the relevant region associated with shoulder breathing or mandibular breathing, the total amount of movement of each joint point per unit time associated with the patient's movement, the average number of people on screen per unit time, and the total amount of image shift per unit time.
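The "per unit time" conversion described above can be illustrated with a short sketch. The following computes a total amount of image movement per unit time from inter-frame differences; the array layout, function name, and windowing are illustrative assumptions, not details taken from this disclosure.

```python
import numpy as np

def motion_per_unit_time(frames: np.ndarray, fps: float, window_s: float = 1.0) -> np.ndarray:
    """frames: (T, H, W) grayscale video; returns one motion value per time window."""
    # Sum of absolute inter-frame differences, then aggregated per window.
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0)).sum(axis=(1, 2))
    win = max(1, int(round(fps * window_s)))
    n = len(diffs) // win
    return diffs[: n * win].reshape(n, win).sum(axis=1)
```

For example, at 30 fps with a 1-second window, each output value would aggregate 30 frame-to-frame differences.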
  • The review display system 1 is broadly configured to include a sensing block 11, a data processing block 12, and a display block 13.
  • The sensing block 11 uses sensing devices such as an RGB camera, an IR camera, a depth camera, and a microphone to capture video and audio of the patient and the area around the bed.
  • The data processing block 12 processes the patient data recorded in the patient information system 2 and the video and audio data acquired by the sensing block 11 in response to a predetermined program or an instruction signal sent from the display block 13, and generates display screen data.
  • The display block 13 displays a display screen corresponding to the display screen data generated by the data processing block 12.
  • The display block 13 also transmits instruction signals to the data processing block 12 in response to operations by medical staff.
  • Patient information system 2 is an information system within the hospital that records patient attributes, vital sign data, and test result data, such as electronic medical records and ICU information systems.
  • FIG. 2 is a block diagram showing the detailed configuration of the review display system 1 of FIG. 1.
  • The review display system 1 in FIG. 2 is configured to include a bedside terminal 21, an information server 22, and a display terminal 23.
  • In FIG. 2, the dashed rectangles indicate that the corresponding devices may be configured as separate hardware or separate systems.
  • Each device is configured to include a communication unit (not shown). Although not specifically mentioned, information and data are sent and received between the devices via each communication unit.
  • The bedside terminal 21 is installed at the bedside where the patient lies.
  • The bedside terminal 21 is configured to include the sensing block 11.
  • The sensing block 11 is configured to include a sensing data acquisition unit 31 and the sensing device 32 described above.
  • The sensing data acquisition unit 31 acquires the video data, audio data, and depth information obtained by sensing performed by the sensing device 32.
  • The acquired data and information are transmitted to the information server 22.
  • Alternatively, the data and information acquired by the sensing device 32 may be transmitted to the information server 22 without passing through the sensing data acquisition unit 31.
  • The information server 22 is a device installed in the hospital or on the cloud.
  • The information server 22 is configured to include the patient information system 2 and the data processing block 12. Note that the patient information system 2 may be installed on a server other than the information server 22.
  • The data processing block 12 is configured to include a data collection unit 41, a data processing unit 42, and a display screen generation unit 43.
  • The data collection unit 41 collects the video data, audio data, and depth information, as well as the data recorded in the patient information system 2.
  • The data collected from the patient information system 2 is, for example, clinical data such as the patient's vital sign data, respiratory data, blood test data, blood gas test data, medication record data, and nursing record data, as well as patient attribute data (sex, height, age, medical history, etc.).
  • The data collection unit 41 outputs the collected data to the data processing unit 42.
  • The data processing unit 42 performs data processing on the data supplied from the data collection unit 41, including image analysis such as facial expression estimation and whole-body skeletal point estimation, as well as voice analysis, text analysis, and correlation evaluation.
  • The display screen generation unit 43 generates display screen data using the results of the data processing by the data processing unit 42, in response to instruction signals and other signals transmitted from the display terminal 23.
  • The generated display screen data is transmitted to the display terminal 23.
  • The display terminal 23 is a device on which a medical professional actually views and operates the display screen.
  • The display terminal 23 is, for example, a PC located at the bedside, the nurses' station, or a medical office, or a mobile device such as a tablet or smartphone.
  • The display terminal 23 is configured to include the display block 13.
  • The display block 13 is configured to include a display control unit 51, a display unit 52, and an operation input unit 53.
  • The display control unit 51 causes the display unit 52 to display a display screen corresponding to the display screen data transmitted from the display screen generation unit 43.
  • The display unit 52 is composed of an LCD (Liquid Crystal Display) panel, an organic EL (Electro-Luminescence) panel, or the like.
  • The operation input unit 53 inputs instruction signals in response to operations performed by a medical professional using a touch panel, mouse, or the like. The input instruction signals are transmitted to the information server 22 (the display screen generation unit 43).
  • FIG. 3 is a block diagram showing an example of the functional configuration of the data processing unit 42 in FIG. 2.
  • The functional configuration in FIG. 3 is realized by the CPU of the information server 22 loading a predetermined program into memory and executing it.
  • The data processing unit 42 is configured to include a storage unit 71, an image analysis unit 72, a voice analysis unit 73, a text analysis unit 74, a correlation analysis unit 75, and an overall feature calculation unit 76.
  • The storage unit 71 stores the data collected by the data collection unit 41 and the data processed by the data processing unit 42 as necessary.
  • The storage unit 71 also stores an event list generated by the text analysis unit 74. The event list will be described later.
  • The image analysis unit 72 performs image analysis, such as facial feature point detection, facial expression estimation, and whole-body skeletal point estimation, on the video data collected by the data collection unit 41 to obtain video features.
  • The obtained video features are supplied to the storage unit 71 and the overall feature calculation unit 76.
  • The voice analysis unit 73 performs voice analysis on the audio data collected by the data collection unit 41 to obtain voice features.
  • The obtained voice features are supplied to the storage unit 71 and the overall feature calculation unit 76.
  • The text analysis unit 74 performs character recognition on the patient's nursing record data collected by the data collection unit 41 and generates an event list consisting of event labels indicating the patient's events.
  • The generated event list is registered in the storage unit 71.
  • The correlation analysis unit 75 analyzes which other video features or clinical data are correlated with the video feature or clinical data for the time period specified by the instruction signal from the display terminal 23, obtained via the display screen generation unit 43. Information on the video features or clinical data found to be correlated is supplied to the display screen generation unit 43.
  • The overall feature calculation unit 76 calculates the overall feature, which is a comprehensive feature, using the video features and clinical data specified by an instruction signal from the display terminal 23 (obtained via the display screen generation unit 43) or set in advance as defaults. Note that voice features may also be used to calculate the overall feature. Details of the overall feature will be described later.
  • The calculated overall feature is supplied to the display screen generation unit 43.
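This disclosure does not fix a concrete formula for the overall feature calculation unit 76. As one hedged illustration only, the constituent data set could be combined with a normalized weighted sum; the function name and weighting scheme below are assumptions.

```python
def overall_feature(samples: dict, weights: dict) -> float:
    """samples and weights map a series name to a value; returns a combined score."""
    # Only series present in this sample contribute; weights are renormalized.
    total_w = sum(weights[k] for k in samples)
    return sum(samples[k] * weights[k] for k in samples) / total_w
```

Missing series (e.g. a clinical value not yet measured at that time) are simply omitted from `samples`, and the remaining weights are renormalized.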
  • FIG. 4 is a diagram showing an example of the configuration of a review display screen used for reviewing video data.
  • The review display screen 100 is configured to include a video display unit 101, a video operation unit 102, and a time-series data display operation unit 103.
  • The video display unit 101 is located at the upper left of the review display screen 100. Video of the patient and the area around the bed is displayed on the video display unit 101. In the case of FIG. 4, as indicated by the LIVE icon displayed on the video display unit 101, real-time live video of the patient is displayed.
  • The video operation unit 102 is located at the bottom left of the review display screen 100, directly below the video display unit 101.
  • The location of the video operation unit 102 is not limited to below the video display unit 101; it may be above or beside it. However, the video operation unit 102 is preferably located near the video display unit 101 so that operations can be performed while viewing the displayed video.
  • The video operation unit 102 is composed of an operation button display unit 111 and a playback seek bar display unit 112.
  • The operation button display unit 111 displays various operation buttons that allow medical staff to perform operations related to playback of the video displayed on the video display unit 101.
  • The playback seek bar display unit 112 displays a playback seek bar that medical staff can operate to search the video data for a scene to view and to specify the playback position (playback start position).
  • The time-series data display operation unit 103 is located to the right of the video display unit 101.
  • The time-series data display operation unit 103 displays time-series data of clinical data and video features from the past to the present.
  • The playback position of the video displayed on the video display unit 101 is indicated on the time-series data of the time-series data display operation unit 103 in conjunction with the playback seek bar of the video operation unit 102.
  • The review display screen 100 may also include other components, such as a data-set reading unit and a display unit that works in conjunction with the patient information system 2 to read and display patient attributes (age, sex, patient ID number, etc.), underlying diseases, medical history, and the like.
  • FIG. 5 is an enlarged view of the video operation unit 102 in FIG. 4.
  • The operation button display unit 111 has operation buttons arranged on it, such as a previous-event button, a 10-second rewind button, a rewind button, a play button, a fast-forward button, a 15-second forward button, a next-event button, a pause button, and a back-to-live button.
  • The playback seek bar display unit 112 displays a playback seek bar 120.
  • A graph showing the time-series changes in the patient features is superimposed on the playback seek bar 120.
  • The playback seek bar 120 is a tool (display component) that medical staff operate to search the video data for a scene to view and to specify the playback start position.
  • The patient features consist of a constituent data set, composed of clinical data and video features, or of an overall feature calculated comprehensively from the constituent data set.
  • FIG. 5 shows the case where the patient features are overall features. Note that both the constituent data set and the overall feature may be displayed as patient features.
  • Event label 121a indicates a route management event, such as an IV drip. It is composed of the event title "Route management", an event point 122a indicating the start time of the event on the playback seek bar 120, and an event preview image 123a obtained from the video data around the event time.
  • Event label 121b indicates a position change event. It is composed of the event title "Position change", an event point 122b indicating the start time of the event on the playback seek bar 120, and an event preview image 123b obtained from the video data around the event time.
  • Event label 121c indicates a restlessness event ("disturbing" here refers to restless movement). It is composed of the event title "Disturbing", an event point 122c indicating the start time of the event on the playback seek bar 120, and an event preview image 123c obtained from the video data around the event time.
  • Hereinafter, when there is no need to distinguish between the event labels 121a to 121c, the event points 122a to 122c, and the event preview images 123a to 123c, they will be referred to simply as the event label 121, the event point 122, and the event preview image 123, respectively.
  • A play icon, a right-facing triangle within a rectangular frame, is superimposed on the event preview image 123.
  • When the play icon is clicked, the video data is played back from the event point 122 on the video display unit 101.
  • In other words, the event point 122 can also be regarded as the playback start point of the video data.
  • FIG. 6 is a diagram showing an example of the configuration of patient features displayed on the playback seek bar in FIG. 5.
  • As described above, patient features consist of a constituent data set or an overall feature.
  • In FIG. 6, the constituent data set is composed of clinical data A, clinical data B, video feature A, and video feature B.
  • The overall feature is calculated comprehensively using the constituent data set: clinical data A, clinical data B, video feature A, and video feature B.
  • FIG. 7 is a diagram showing an example of the display of patient features in the case of the constituent data set of FIG. 6.
  • In FIG. 7, a graph showing the time-series changes in the constituent data set (clinical data B, clinical data F, video feature C, and video feature G) is displayed on the playback seek bar 120 of the playback seek bar display unit 112.
  • On the playback seek bar 120, a playback point 151 indicating the point currently being played back and a preview image 152 at the playback point 151 are displayed.
  • The preview image 152 is displayed by selecting the playback point 151.
  • FIG. 8 is a diagram showing an example of the display of patient features in the case of the overall feature of FIG. 6.
  • In FIG. 8, a graph showing the time-series change in the overall feature, calculated comprehensively from the constituent data set (clinical data B, clinical data F, video feature C, and video feature G), is displayed on the playback seek bar 120 of the playback seek bar display unit 112.
  • On the playback seek bar 120, a playback point 151 indicating the point currently being played back and a preview image 152 at the playback point 151 are displayed.
  • The preview image 152 is displayed by selecting the playback point 151.
  • FIG. 9 is a diagram showing examples of overall features and constituent data sets.
  • The overall features and constituent data sets can be selected according to the type of observation to be made, the disease of the target patient, and the doctor operating the review display system 1.
  • FIG. 9 shows examples of combinations of overall features and constituent data sets.
  • Overall features are shown as solid rectangles, clinical data as hatched rectangles, and video features as dashed rectangles.
  • The overall feature "periodic fluctuation of the shoulders" is a movement that can occur during shoulder breathing, a type of labored breathing. It can be calculated from the respiratory rate (clinical data) and the amount and periodicity of vertical shoulder movement (video features). The amount of vertical shoulder movement is composed of the amounts for the right shoulder and the left shoulder.
  • The overall feature "suspected agitation" can be calculated from blood pH (clinical data) and the amount of whole-body movement, the amount of change in gaze, and the volume of the voice (video features). The amount of whole-body movement is composed of the amounts of head, shoulder, arm, and leg movement.
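The "periodicity of the vertical shoulder movement" used above can be sketched as the dominant frequency and its amplitude in the shoulder-position time series. The FFT-based approach, function name, and sampling rate below are illustrative assumptions, not details taken from this disclosure.

```python
import numpy as np

def dominant_frequency(shoulder_y: np.ndarray, fs: float):
    """Return (peak frequency in Hz, peak amplitude) of a position time series."""
    x = shoulder_y - shoulder_y.mean()      # remove the DC offset
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    k = int(np.argmax(spec[1:])) + 1        # skip the zero-frequency bin
    return freqs[k], spec[k]
```

For shoulder breathing, the peak frequency would be expected to sit near the respiratory rate, which is why the two are combined in this overall feature.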
  • The overall feature "heart rate excluding intervention" is the heart rate excluding abnormal values that occur during interventions. It can be calculated from the heart rate (clinical data) and the number of interventions (video feature).
  • The overall feature "Doctor A original" is the parameter set that veteran Doctor A focuses on. It can be calculated from clinical data X1, clinical data X2, and video feature X. Video feature X is composed of video features X1 and X2.
  • The overall feature "for sepsis patients" is the parameter set that should be focused on for the case in question (here, sepsis). Video feature Y is composed of video features Y1 and Y2.
  • A default constituent data set prepared in advance may be used for the overall feature (e.g., "suspected agitation"), or a constituent data set customized (created) by medical staff may be used (e.g., "Doctor A original").
  • The constituent data set is composed of clinical data and/or video features.
  • The overall feature is a value calculated from the constituent data set according to a set calculation basis. However, the overall feature does not necessarily have to be a value calculated from the constituent data set; it may simply be the group name of the constituent data set, used to display the constituent data set as patient features. In that case, the value of the overall feature is left empty.
  • The overall feature may also be a severity score used in the ICU, such as SOFA (Sepsis-related Organ Failure Assessment).
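The structure described above, a named constituent data set plus an overall-feature value that may be empty when the name is only a grouping label, might be represented as follows; the class and field names are assumptions for illustration.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PatientFeature:
    name: str                                    # e.g. a default or custom group name
    constituent_dataset: List[str] = field(default_factory=list)
    value: Optional[float] = None                # left empty when used only as a group name

feature = PatientFeature("Periodic fluctuation of the shoulders",
                         ["respiratory rate", "vertical shoulder movement"])
```

A name-only grouping keeps `value` as `None`, matching the "empty overall feature" case described above.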
  • FIG. 10 is a diagram showing an example of customization of an overall feature.
  • The overall feature can be customized using simple programming methods or existing programming languages.
  • The calculation basis for a default overall feature can also be displayed, allowing medical staff to view and confirm how the overall feature is calculated.
  • In the example of FIG. 10, the inputs are the heart rate and the number of interventions.
  • If the number of interventions is zero, the heart rate excluding intervention is set to the patient's actual heart rate; if it is not zero, the heart rate excluding intervention is set to None.
  • The output is the heart rate excluding intervention.
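The customization described above can be sketched as a small function. The function name is an assumption, but the rule follows the description: a non-zero intervention count masks the heart-rate value.

```python
from typing import Optional

def heart_rate_excluding_intervention(heart_rate: float,
                                      n_interventions: int) -> Optional[float]:
    """Mask the heart-rate sample whenever an intervention is detected."""
    return heart_rate if n_interventions == 0 else None
```

Applied sample by sample over the time series, this yields the overall feature plotted in FIG. 11, with gaps wherever interventions occurred.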
  • FIG. 11 is a diagram showing an example of a superimposed display of overall features.
  • Time-series data can be displayed superimposed on one another.
  • In FIG. 11, the heart rate excluding intervention, which is the overall feature, is superimposed on the heart rate (hatched), which is the clinical data.
  • During intervention periods, the heart rate excluding intervention is not displayed, so only the heart rate (hatched) appears.
  • In this way, the heart rate with and without intervention can be displayed in an easy-to-understand manner.
  • FIG. 12 is a flowchart illustrating the process of setting patient features.
  • In step S11, the display screen generation unit 43 selects the patient features to be displayed in response to an operation of an input unit (not shown) by a medical professional.
  • In step S12, the display screen generation unit 43 determines whether or not the selected patient feature has already been registered in the existing list in the storage unit 71. If it is determined in step S12 that the selected patient feature has not yet been registered in the existing list, the process proceeds to step S13.
  • In step S13, the display screen generation unit 43 creates a new patient feature. For example, the constituent data set and overall feature that make up the patient feature, and the calculation basis for the overall feature, are created.
  • In step S14, the display screen generation unit 43 registers the created patient feature in the existing list. The process then proceeds to step S15.
  • If it is determined in step S12 that the selected patient feature is already registered in the existing list, steps S13 and S14 are skipped and the process proceeds to step S15.
  • In step S15, the display screen generation unit 43 reads the existing list from the storage unit 71.
  • In step S16, the display screen generation unit 43 selects, from the loaded existing list, the patient feature selected in step S11.
  • In step S17, it is determined whether display of the constituent data set has been selected as the display method for the selected patient feature. If it is determined in step S17 that display of the constituent data set has been selected, the process proceeds to step S18.
  • In step S18, the display screen generation unit 43 displays the constituent data set as the patient feature.
  • If it is determined in step S17 that display of the constituent data set has not been selected, the process proceeds to step S19.
  • In step S19, the display screen generation unit 43 displays the overall feature as the patient feature.
  • After step S18 or S19, the patient feature setting process ends.
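The steps above (S11 to S19) amount to a registry lookup followed by a display-mode branch. The following is an illustrative sketch only: the PatientFeature structure, the dict used as the "existing list," and the function name are assumptions for clarity, not the actual implementation of the display screen generation unit 43.

```python
from dataclasses import dataclass, field

@dataclass
class PatientFeature:
    name: str
    constituent_data_set: list[str] = field(default_factory=list)  # e.g. clinical data / video features
    calculation_basis: str = "mean"  # assumed basis for computing the overall feature

def set_patient_feature(existing_list: dict, name: str,
                        constituents: list[str], show_constituents: bool):
    # S12-S14: create and register the feature only if it is not yet in the list
    if name not in existing_list:
        existing_list[name] = PatientFeature(name, constituents)
    # S15-S16: read the list back and select the feature chosen in S11
    feature = existing_list[name]
    # S17-S19: branch on the selected display method
    if show_constituents:
        return feature.constituent_data_set  # S18: display the constituent data set
    return feature.name                      # S19: display the overall feature

registry = {}
print(set_patient_feature(registry, "agitation", ["expression", "motion"], True))
print(set_patient_feature(registry, "agitation", [], False))  # already registered: S13/S14 skipped
```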
  • FIG. 13 is a flowchart illustrating the process of generating an event label.
  • The event label is linked to the nursing record data in the patient information system 2; text information entered by a nurse or other medical staff is recognized, an event label is generated from it, and the label is displayed on the playback seek bar 120.
  • In step S31, the data collection unit 41 imports nursing record data from the patient information system 2.
  • In step S32, the text analysis unit 74 performs character recognition on the nursing record data and generates an event label 121 based on the recognized characters. For example, if there is a nursing record entry stating "At 1:32, XX behavior occurred. There is a possibility of agitation," characters such as "1:32" and "agitation" are recognized.
  • In step S33, the text analysis unit 74 registers the generated event label 121 in the event list of the storage unit 71.
  • In step S34, the display screen generation unit 43 displays the event label 121 registered in the event list in the storage unit 71 on the playback seek bar 120.
  • In this way, event labels indicating important scenes are generated without changing the on-site workflow of doctors and nurses.
  • Alternatively, event labels could be generated by medical staff entering nursing records directly into a terminal.
  • The time of an event label may also be recorded by pressing a button, such as a nurse call button, placed at the bedside.
  • The event label does not necessarily need to have a title; only the time the event occurred may be recorded.
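The step-S32 recognition described above can be sketched as follows, assuming the nursing record is plain text: a time pattern such as "1:32" is extracted with a regular expression and the text is matched against a keyword list. The keyword vocabulary and the dictionary shape of the label are illustrative assumptions, not the actual behavior of the text analysis unit 74.

```python
import re

KEYWORDS = ["agitation", "delirium", "fall"]  # assumed vocabulary of important events

def generate_event_label(record_text: str):
    # Recognize a time such as "1:32" and any known event keywords
    time_match = re.search(r"\b(\d{1,2}:\d{2})\b", record_text)
    found = [kw for kw in KEYWORDS if kw in record_text.lower()]
    if time_match is None:
        return None
    # The label does not necessarily need a title; the time alone may be recorded
    return {"time": time_match.group(1), "title": found[0] if found else None}

label = generate_event_label("At 1:32, XX behavior occurred. There is a possibility of agitation.")
print(label)  # {'time': '1:32', 'title': 'agitation'}
```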
  • FIG. 14 is a diagram showing an example of threshold setting in the time-series data display operation unit 103.
  • the time series data display operation unit 103 shown on the right side of A in Figure 14 displays the time series data X.
  • an adjustment bar 181 is displayed at a position indicating the threshold value 100.
  • The time period where the value exceeds the adjustment bar 181, i.e., the time period 191a where the value is higher than the threshold value 100, is shown in a different color.
  • This time period 191a is reflected on the playback seek bar 120 of the video operation unit 102 as time period 192a during which the time series data X is greater than the threshold value 100, as shown on the left side of A in FIG. 14.
  • an event preview image 193a is displayed in association with the time period 192a.
  • A play icon, which is a right-facing triangle within a rectangular frame, is superimposed on the event preview image 193a.
  • When the play icon is clicked, the video data in the video display section 101 is played from the beginning of the time period 192a.
  • FIG. 14B shows a time-series data display operation unit 103 that displays the time-series data X when a medical professional moves the adjustment bar 181, which is at the position indicating the threshold value 100 shown in FIG. 14A, to a position indicating the threshold value 80 by dragging and dropping it down.
  • time periods 191a to 191d are reflected on the playback seek bar 120 of the video operation unit 102 as time periods 192a to 192d whose values are higher than the threshold value 80, as shown on the left side of FIG. 14B.
  • preview images 193a to 193d are displayed in association with time periods 192a to 192d, respectively.
  • Similar to the event preview image 193a, the play icon is also superimposed on the event preview images 193b to 193d.
  • The video data in the video display section 101 can be played from the beginning of each of the time periods 192b to 192d, in the same way as for the event preview image 193a.
  • the time periods when the time series data X exceeds the threshold are displayed in different colors, but depending on the threshold that is set, the time periods when the time series data X falls below the threshold may be displayed in different colors.
  • the method of displaying them differently is not limited to different colors, and may be in a different format.
  • In the above example, the threshold value was set by dragging and dropping the adjustment bar 181 up and down, but a text box may instead be displayed so that a numerical value can be entered directly.
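The threshold behavior of FIG. 14 can be sketched as follows: given sampled time-series data X and a threshold, find the contiguous time periods whose values exceed it (the periods shown in a different color, 191a to 191d, and reflected on the playback seek bar as 192a to 192d). The sample data and function name are illustrative assumptions.

```python
def periods_above(values, times, threshold):
    """Return (start, end) time pairs where consecutive samples exceed threshold."""
    periods, start = [], None
    for t, v in zip(times, values):
        if v > threshold and start is None:
            start = t                      # a colored period opens
        elif v <= threshold and start is not None:
            periods.append((start, t))     # the period closes
            start = None
    if start is not None:                  # still above threshold at the end
        periods.append((start, times[-1]))
    return periods

times = list(range(10))
x = [60, 90, 110, 120, 95, 70, 105, 130, 85, 60]
print(periods_above(x, times, 100))  # [(2, 4), (6, 8)]
print(periods_above(x, times, 80))   # lowering the bar to 80 reveals more periods: [(1, 5), (6, 9)]
```

Lowering the threshold, as in FIG. 14B, simply re-runs the same computation and yields more (or longer) periods to reflect onto the seek bar.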
  • FIG. 15 is a diagram showing an example of a display of the playback position of a video currently being played back.
  • the playback position 200 of the video currently being played is displayed on the playback seek bar 120 of the playback seek bar display section 112 of the video operation section 102.
  • the playback position corresponding to playback position 200 displayed on the video operation unit 102 is also displayed on each piece of time series data on the time series data display operation unit 103, as shown on the right side of Figure 15.
  • playback positions corresponding to playback position 200 are displayed as playback positions 201a to 201c on the time-series data of clinical data A to C.
  • playback positions corresponding to playback position 200 are displayed as playback positions 211a to 211c on the time-series data of video features A to C.
  • FIG. 16 is a diagram showing an example of displaying correlated data.
  • FIG. 16 shows the time series data display operation unit 103 on which the time series data of clinical data A to C and the time series data of image features A to C are displayed.
  • A small screen 252 is displayed showing other data candidates, selected from among the clinical data and image features, that have a positive or negative correlation, as shown by arrow P1.
  • the small screen 252 shows other data candidates, namely clinical data K and image feature N, which are positively correlated, and image feature J, which is negatively correlated.
  • The small screen 252 also shows, for each of the other data candidates, the time difference with respect to clinical data A and the correlation coefficient.
  • a cross-correlation display screen 253 with clinical data A is displayed, as shown by arrow P2.
  • The cross-correlation display screen 253 shows a graph of the correlation coefficient between clinical data A and clinical data K shifted in time by Δt.
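The graph on the cross-correlation display screen 253 can be sketched by computing the Pearson correlation coefficient between clinical data A and a candidate series K for a range of time shifts Δt; the lag with the highest coefficient corresponds to the time difference shown on the small screen 252. The data below is synthetic and the implementation is an assumption, not the actual correlation analysis unit 75.

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def cross_correlation(a, k, max_lag):
    """Correlation coefficient of a vs k for each lag in [-max_lag, max_lag]."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            xs, ys = a[lag:], k[:len(k) - lag]
        else:
            xs, ys = a[:len(a) + lag], k[-lag:]
        out[lag] = pearson(xs, ys)
    return out

clinical_a = [1, 2, 3, 4, 5, 4, 3, 2, 1, 2]
clinical_k = [0, 1, 2, 3, 4, 5, 4, 3, 2, 1]   # same shape as A, one step later
corr = cross_correlation(clinical_a, clinical_k, 2)
best_lag = max(corr, key=corr.get)
print(best_lag)  # -1: K follows A by one time step
```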
  • As described above, an overall feature is calculated based on a data set consisting of video features, which are features of video data obtained by imaging a patient, and patient data, which is at least one of the patient's clinical data, and a first display screen is generated that displays the calculated overall feature in chronological order together with a display component that indicates the playback position of the video data.
  • Because the display screen shows changes over time along with a display component that indicates the playback position of the video data, users can efficiently review the patient's condition.
  • This technology makes it possible to grasp patients' conditions efficiently, allowing a small number of staff to check on multiple patients, for example at night when fewer staff are on duty, in hospitals with medical staff shortages, and in support centers for remote ICUs.
  • Medical professionals can load feature datasets that are prepared in advance or that they have created themselves, allowing them to focus on the data that interests them. It also allows young doctors to learn the perspectives of veteran doctors.
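As one possible sketch of the overall feature calculation described above: normalize each constituent series of the data set and combine them with weights, sample by sample. The min-max normalization and the particular weights are illustrative assumptions; the description leaves the calculation basis configurable per patient feature.

```python
def normalize(series):
    """Min-max normalize a series to the range [0, 1]."""
    lo, hi = min(series), max(series)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in series]

def overall_feature(dataset: dict, weights: dict):
    """Weighted average of normalized constituent series, sample by sample."""
    normed = {name: normalize(vals) for name, vals in dataset.items()}
    n = len(next(iter(normed.values())))
    total = sum(weights.values())
    return [sum(weights[name] * normed[name][i] for name in normed) / total
            for i in range(n)]

dataset = {
    "motion_amount": [0, 5, 10, 5, 0],       # a video feature
    "heart_rate":    [60, 80, 100, 80, 60],  # clinical data
}
score = overall_feature(dataset, {"motion_amount": 0.5, "heart_rate": 0.5})
print(score)  # [0.0, 0.5, 1.0, 0.5, 0.0] -- peaks when both constituents peak
```

Adjusting the weights corresponds to the customization of the overall feature shown in FIG. 10: a medical professional can emphasize the constituents that interest them.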
  • FIG. 17 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes using a program.
  • A CPU (Central Processing Unit) 301, a ROM (Read Only Memory) 302, and a RAM (Random Access Memory) 303 are interconnected by a bus 304.
  • Further connected to the bus 304 is an input/output interface 305. Connected to the input/output interface 305 are an input unit 306 consisting of a keyboard, mouse, etc., and an output unit 307 consisting of a display, speakers, etc. Also connected to the input/output interface 305 are a storage unit 308 consisting of a hard disk or non-volatile memory, a communication unit 309 consisting of a network interface, etc., and a drive 310 that drives removable media 311.
  • the CPU 301 performs the above-mentioned series of processes, for example by loading a program stored in the storage unit 308 into the RAM 303 via the input/output interface 305 and the bus 304 and executing the program.
  • the programs executed by the CPU 301 are provided, for example, by being recorded on removable media 311, or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and are installed in the storage unit 308.
  • the program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at the required timing, such as when called.
  • a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in a single housing, are both systems.
  • this technology can be configured as cloud computing, in which a single function is shared and processed collaboratively by multiple devices over a network.
  • each step described in the above flowchart can be executed by a single device, or can be shared and executed by multiple devices.
  • When one step includes multiple processes, the multiple processes included in that one step can be executed by one device, or can be shared and executed by multiple devices.
  • the present technology can also be configured as follows.
  • (1) An information processing device including: a data processing unit that calculates a comprehensive feature amount based on a data set including image feature amounts, which are feature amounts of video data obtained by imaging a patient, and patient data, which is at least one of clinical data of the patient; and a display screen generating unit that generates a first display screen that displays the calculated comprehensive feature amount in chronological order together with a display component that indicates a playback position of the video data.
  • (2) The information processing device according to (1), wherein the data processing unit generates an event label indicating an event related to the patient based on the nursing record data of the patient, and the display screen generating unit superimposes the event label together with the display component on the first display screen.
  • The event label includes a point indicating the start time of the event and a preview image of the event.
  • the event label includes a title of the event.
  • the display screen generation unit displays the patient data together with the display part on the first display screen.
  • the display screen generating unit displays the patient data on the first display screen by superimposing the patient data on the comprehensive feature amount.
  • the display screen generation unit displays, on the first display screen, a point corresponding to a playback position of the video data together with the display part.
  • the information processing device according to any one of (1) to (7), wherein the display screen generation unit generates the first display screen in a periphery of a second display screen that displays an image corresponding to the video data being played back.
  • the display screen generation unit generates a third display screen that displays the patient data in chronological order.
  • (10) The information processing device according to (9), wherein the data processing unit extracts correlation data, which is other patient data that is correlated with the patient data in the time period selected on the third display screen, and the display screen generation unit generates a fourth display screen that displays the extracted correlation data.
  • (11) The information processing device according to (10), wherein the correlation data is other patient data in a time zone different from the time zone selected on the third display screen.
  • (12) The information processing device according to (9), wherein a threshold value for the patient data can be set on the third display screen, and the display screen generating unit displays, on the third display screen, a time period in which the patient data is equal to or greater than the threshold and a time period in which the patient data is less than the threshold in different formats.
  • (13) The information processing device according to (12), wherein the display screen generating unit displays, on the first display screen, a time period in which the patient data is equal to or greater than the threshold and a time period in which the patient data is less than the threshold in different formats.
  • the display screen generation unit displays, on the first display screen, a point corresponding to a playback position of the video data together with the display part;
  • A program that causes a computer to function as: a data processing unit that calculates a comprehensive feature amount based on a data set including image feature amounts, which are feature amounts of video data obtained by imaging a patient, and patient data, which is at least one of clinical data of the patient; and a display screen generating unit that generates a first display screen that displays the calculated comprehensive feature amount in time series together with a display component that indicates a playback position of the video data.
  • An information processing system including: a sensor that captures an image of a patient to obtain video data; an information processing device including a data processing unit that calculates a comprehensive feature amount based on a data set including image feature amounts, which are feature amounts of the video data, and patient data, which is at least one of clinical data of the patient, and a display screen generation unit that generates a first display screen that displays the calculated comprehensive feature amount in time series together with a display component that indicates a playback position of the video data; and a terminal that controls the display of the first display screen.
  • 1 Retrospective display system, 2 Patient information system, 11 Sensing block, 12 Data processing block, 13 Display block, 21 Bedside terminal, 22 Information server, 23 Display terminal, 41 Data collection unit, 42 Data processing unit, 43 Display screen generation unit, 51 Display control unit, 52 Display unit, 53 Operation input unit, 71 Storage unit, 72 Image analysis unit, 73 Audio analysis unit, 74 Text analysis unit, 75 Correlation analysis unit, 76 Overall feature calculation unit, 100 Retrospective display screen, 101 Video display unit, 102 Video operation unit, 103 Time-series data display operation unit, 111 Operation button display unit, 112 Playback seek bar display unit, 120 Playback seek bar

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The present technology relates to an information processing device, an information processing method, a program, and an information processing system which make it possible to efficiently look back on the condition of a patient. This information processing device calculates a total feature amount on the basis of a data set consisting of patient data that is at least one of an image feature amount, which is the feature amount of image data obtained by capturing an image of a patient, and the clinical data of the patient, and generates a first display screen for displaying, in chronological order, the calculated total feature amount together with a display component indicating the playback position of the image data. The present technology can be applied to a look-back display system.

Description

Information processing device, information processing method, program, and information processing system

This technology relates to an information processing device, an information processing method, a program, and an information processing system, and in particular to an information processing device, an information processing method, a program, and an information processing system that enable efficient review of a patient's condition.
In intensive care units (ICUs) and other locations where careful observation of patients is required, doctors, nurses, and other medical professionals often assess a patient's condition not only by looking at changes in vital signs and test data, but also by comprehensively judging information from the patient's appearance, such as facial expressions and breathing patterns, to estimate the severity of the patient's condition.

For example, Patent Document 1 discloses a technology that analyzes a patient's movements based on hospital bed image data and estimates the severity of the patient's condition based on the analyzed movements.

In addition, cases frequently occur in the field where a significant change appears in the measured vital signs but is in fact due to, for example, a nurse temporarily removing a vital sensor during an intervention.

Although the information obtained from a patient's appearance is important, it is difficult to constantly observe patients and their bedsides in medical settings due to factors such as staff shortages and overwork.

Some hospitals have cameras installed on the ceilings above the beds, but the primary purpose of these is to eliminate blind spots from the nurses' station, and currently recorded footage is neither reviewed nor analyzed. Furthermore, even if medical staff were to review the recorded footage, it would be difficult to thoroughly check footage covering a huge number of patients over a huge amount of time.
International Publication No. 2020/203015
Therefore, there is a need for technology that can clearly display important scenes from continuously recorded patient footage.

This technology was developed in light of these circumstances, and makes it possible to efficiently review a patient's condition.

An information processing device according to one aspect of the present technology includes a data processing unit that calculates an overall feature amount based on a data set consisting of video feature amounts, which are feature amounts of video data obtained by imaging a patient, and patient data, which is at least one of the patient's clinical data, and a display screen generating unit that generates a first display screen that displays the calculated overall feature amount in chronological order together with a display component that indicates the playback position of the video data.

In one aspect of the present technology, an overall feature amount is calculated based on a data set consisting of video feature amounts, which are feature amounts of video data obtained by imaging a patient, and patient data, which is at least one of the patient's clinical data. Then, a first display screen is generated that displays the calculated overall feature amount in chronological order together with a display component that indicates the playback position of the video data.
FIG. 1 is a diagram illustrating a configuration example of a review display system according to an embodiment of the present technology.
FIG. 2 is a block diagram showing the detailed configuration of the review display system of FIG. 1.
FIG. 3 is a block diagram showing an example of the functional configuration of the data processing unit of FIG. 2.
FIG. 4 is a diagram showing a configuration example of a review display screen used for reviewing video data.
FIG. 5 is an enlarged view of the video operation unit of FIG. 4.
FIG. 6 is a diagram showing a configuration example of the patient features displayed on the playback seek bar of FIG. 5.
FIG. 7 is a diagram showing a display example of patient features in the case of the constituent data set of FIG. 6.
FIG. 8 is a diagram showing a display example of patient features in the case of the overall feature of FIG. 6.
FIG. 9 is a diagram showing an example of an overall feature and a constituent data set.
FIG. 10 is a diagram showing an example of customization of an overall feature.
FIG. 11 is a diagram showing an example of a superimposed display of overall features.
FIG. 12 is a flowchart illustrating the process of setting patient features.
FIG. 13 is a flowchart illustrating the process of generating an event label.
FIG. 14 is a diagram showing an example of threshold setting in the time-series data display operation unit.
FIG. 15 is a diagram showing an example of the display of the playback position of video data currently being played back.
FIG. 16 is a diagram showing an example of displaying correlated data.
FIG. 17 is a block diagram showing an example of the configuration of a computer.
Hereinafter, an embodiment of the present technology will be described in the following order.
1. System configuration and device configuration
2. Screen configuration example
3. Other
<1. System configuration and device configuration>
<Configuration of the review display system>
FIG. 1 is a block diagram showing an example of the configuration of a review display system according to an embodiment of the present technology.
The review display system 1 in FIG. 1 is a system used in places such as intensive care units (ICUs), where a patient's condition needs to be carefully observed, that allows medical professionals such as doctors and nurses to efficiently review video of the patient's condition around the bed. "Efficiently" here means that, because video of a patient recorded at all times (hereinafter, patient video) spans an enormous amount of time, only important scenes can be selected and played back.

The review display system 1 displays information such as video, video features extracted from the video, and clinical data such as vital signs recorded in an existing patient information system 2 together on a display screen. The review display system 1 also uses the patient's video features and event labels (tags) indicating events to clearly display important scenes of the patient, and can play back video data from those important scenes.

Here, video features are information obtained through image analysis (facial feature point detection, facial expression estimation, whole-body skeletal point estimation, etc.), such as the subject's facial expression, joint point positions, and the number of people in the image, or information such as the amount of movement of the entire image converted into values per unit time.

More specifically, video features include the average value of an agonized-expression score per unit time, the frequency peak and amplitude of positional changes of the relevant body parts associated with shoulder breathing or mandibular breathing, the total amount of movement of each joint point per unit time associated with the patient's movement, the average number of people in the image per unit time, and the total amount of image shift per unit time.

The review display system 1 is broadly configured to include a sensing block 11, a data processing block 12, and a display block 13.

The sensing block 11 uses sensing devices such as an RGB camera, an IR camera, a depth camera, and a microphone to capture video and sound of the patient and the area around the bed.

The data processing block 12 processes the patient-related data recorded in the patient information system 2 and the video and audio data acquired by the sensing block 11 in response to a predetermined program or an instruction signal transmitted from the display block 13, and generates display screen data.

The display block 13 displays a display screen corresponding to the display screen data generated by the data processing block 12. The display block 13 also transmits instruction signals corresponding to operations by medical professionals to the data processing block 12.

The patient information system 2 is an in-hospital information system, such as an electronic medical record system or an ICU information system, in which patient attributes, vital sign data, and test result data are recorded.
<Detailed configuration of the review display system>
FIG. 2 is a block diagram showing the detailed configuration of the review display system 1 of FIG. 1.
The review display system 1 in FIG. 2 is configured to include a bedside terminal 21, an information server 22, and a display terminal 23. In FIG. 2, the dashed rectangles indicate that the corresponding devices may be configured as separate hardware or as separate systems.

Each device includes a communication unit (not shown). Although not specifically mentioned below, information and data are transmitted and received between the devices via these communication units.

The bedside terminal 21 is a terminal installed, for example, at the bedside where a patient is sleeping. The bedside terminal 21 is configured to include the sensing block 11.

In FIG. 2, the sensing block 11 is configured to include a sensing data acquisition unit 31 and the sensing devices 32 described above. The sensing data acquisition unit 31 acquires video data, audio data, and depth information obtained by sensing performed by the sensing devices 32. The acquired data and information are transmitted to the information server 22.

In the case of a sensing device 32 that can transmit video data directly, such as an IP camera, the data and information acquired by the sensing device 32 are transmitted to the information server 22 without passing through the sensing data acquisition unit 31.

The information server 22 is a device installed in a hospital or on the cloud. The information server 22 is configured to include the patient information system 2 and the data processing block 12. Note that the patient information system 2 may be provided in a server other than the information server 22.

In FIG. 2, the data processing block 12 is configured to include a data collection unit 41, a data processing unit 42, and a display screen generation unit 43.

The data collection unit 41 collects video data, audio data, and depth information, as well as data recorded in the patient information system 2. The data collected from the patient information system 2 is, for example, clinical data such as the patient's vital sign data, ventilator data, blood test data, blood gas test data, medication record data, nursing record data, and patient attribute data (gender, height, age, medical history, etc.). The data collection unit 41 outputs the collected clinical data to the data processing unit 42.

The data processing unit 42 performs data processing on the data supplied from the data collection unit 41, such as image analysis including facial expression estimation and whole-body skeletal point estimation, as well as audio analysis, text analysis, and correlation evaluation.

The display screen generation unit 43 generates display screen data using the results of data processing by the data processing unit 42 in response to instruction signals transmitted from the display terminal 23. The generated display screen data is transmitted to the display terminal 23.

The display terminal 23 is a device on which a medical professional actually views the display screen and performs operations. The display terminal 23 is, for example, a PC located at the bedside, at a nurses' station, or in a medical office, or a mobile device such as a tablet or smartphone. The display terminal 23 is configured to include the display block 13.

The display block 13 is configured to include a display control unit 51, a display unit 52, and an operation input unit 53.

The display control unit 51 causes the display unit 52 to display a display screen corresponding to the display screen data transmitted from the display screen generation unit 43. The display unit 52 is composed of an LCD (Liquid Crystal Display) panel, an organic EL (Electro-Luminescence) panel, or the like. The operation input unit 53 inputs instruction signals in response to operations performed by a medical professional using a touch panel, a mouse, or the like. The input instruction signals are transmitted to the information server 22 (its display screen generation unit 43).
 <データ処理部の機能構成例>
 図3は、図2のデータ処理部42の機能構成例を示すブロック図である。
<Example of functional configuration of data processing unit>
FIG. 3 is a block diagram showing an example of the functional configuration of the data processing unit 42 in FIG. 2.
　図3の機能構成は、情報サーバ22のCPUが所定のプログラムをメモリなどに展開することで実現される。 The functional configuration in FIG. 3 is realized by the CPU of the information server 22 loading a predetermined program into memory or the like.
 データ処理部42は、記憶部71、画像解析部72、音声解析部73、テキスト解析部74、相関性解析部75、および総合特徴量算出部76を含むように構成される。 The data processing unit 42 is configured to include a memory unit 71, an image analysis unit 72, a voice analysis unit 73, a text analysis unit 74, a correlation analysis unit 75, and a comprehensive feature calculation unit 76.
 記憶部71は、データ収集部41により収集されたデータや、データ処理部42により処理されたデータを必要に応じて記憶する。例えば、記憶部71は、テキスト解析部74により生成されたイベントリストなどを記憶する。イベントリストについては後述される。 The storage unit 71 stores data collected by the data collection unit 41 and data processed by the data processing unit 42 as necessary. For example, the storage unit 71 stores an event list generated by the text analysis unit 74. The event list will be described later.
 画像解析部72は、データ収集部41により収集された映像データに対して、顔特徴点検出、表情推定、全身骨格点推定などの画像解析を行い、映像特徴量を得る。得られた映像特徴量は、記憶部71および総合特徴量算出部76に供給される。 The image analysis unit 72 performs image analysis such as facial feature point detection, facial expression estimation, and whole body skeletal point estimation on the video data collected by the data collection unit 41 to obtain video feature amounts. The obtained video feature amounts are supplied to the memory unit 71 and the overall feature amount calculation unit 76.
 音声解析部73は、データ収集部41により収集された音声データに対して、音声解析を行い、音声特徴量を得る。得られた音声特徴量は、記憶部71および総合特徴量算出部76に供給される。 The voice analysis unit 73 performs voice analysis on the voice data collected by the data collection unit 41 to obtain voice features. The obtained voice features are supplied to the storage unit 71 and the overall feature calculation unit 76.
 テキスト解析部74は、データ収集部41により収集された患者の看護記録データに対して文字認識を行い、患者のイベントを示すイベントラベルからなるイベントリストを生成する。生成されたイベントリストは、記憶部71などに登録される。 The text analysis unit 74 performs character recognition on the patient's nursing record data collected by the data collection unit 41, and generates an event list consisting of event labels indicating events of the patient. The generated event list is registered in the memory unit 71, etc.
 相関性解析部75は、表示画面生成部43を介して得られる表示端末23からの指示信号により指定された時間帯の映像特徴量またはクリニカルデータと相関性のある他の映像特徴量またはクリニカルデータを解析する。相関性があると解析された映像特徴量またはクリニカルデータの情報は、表示画面生成部43に供給される。 The correlation analysis unit 75 analyzes other image features or clinical data that are correlated with the image features or clinical data for the time period specified by the instruction signal from the display terminal 23 obtained via the display screen generation unit 43. Information on the image features or clinical data that are analyzed to be correlated is supplied to the display screen generation unit 43.
 総合特徴量算出部76は、表示画面生成部43を介して得られる表示端末23からの指示信号により指定された、または予め設定されたデフォルトの映像特徴量とクリニカルデータを用いて総合的な特徴量である総合特徴量を算出する。なお、総合特徴量の算出には、音声特徴量が用いられてもよい。総合特徴量の詳細は後述される。算出された総合特徴量は、表示画面生成部43に供給される。 The overall feature calculation unit 76 calculates an overall feature, which is a comprehensive feature, using default video features and clinical data specified by an instruction signal from the display terminal 23 obtained via the display screen generation unit 43 or set in advance. Note that audio features may also be used to calculate the overall feature. Details of the overall feature will be described later. The calculated overall feature is supplied to the display screen generation unit 43.
<2.画面構成例>
 <振り返り表示画面の構成>
 図4は、映像データの振り返りに用いられる振り返り表示画面の構成例を示す図である。
<2. Screen configuration example>
<Configuration of the review display screen>
FIG. 4 is a diagram showing an example of the configuration of a review display screen used for reviewing video data.
 図4において、振り返り表示画面100は、映像表示部101、映像操作部102、並びに、時系列データ表示操作部103を含むように構成される。 In FIG. 4, the review display screen 100 is configured to include a video display section 101, a video operation section 102, and a time-series data display operation section 103.
 映像表示部101は、振り返り表示画面100の左上に配置される。映像表示部101には、ベッド周辺の患者を撮像した映像が表示される。図4の場合、映像表示部101上に表示されるLIVEのアイコンに示されるように、映像表示部101には、患者のリアルタイムのLIVE映像が表示されている。 The video display unit 101 is located in the upper left of the review display screen 100. Video captured of the patient around the bed is displayed on the video display unit 101. In the case of FIG. 4, as shown by the LIVE icon displayed on the video display unit 101, real-time live video of the patient is displayed on the video display unit 101.
 映像操作部102は、振り返り表示画面100の左下、映像表示部101の真下に配置される。なお、映像操作部102の配置場所は、映像表示部101の下に限らず、上であってもよいし、隣であってもよいが、映像操作部102は、映像表示部101に表示される映像を見ながらの操作を行うべく、映像表示部101の周辺に配置されることが望ましい。 The video operation unit 102 is located at the bottom left of the review display screen 100, directly below the video display unit 101. The location of the video operation unit 102 is not limited to below the video display unit 101, but it may be above or next to it. However, it is preferable that the video operation unit 102 is located near the video display unit 101 so that operations can be performed while viewing the image displayed on the video display unit 101.
 映像操作部102は、操作ボタン表示部111および再生シークバー表示部112から構成される。 The video operation unit 102 is composed of an operation button display unit 111 and a playback seek bar display unit 112.
 操作ボタン表示部111には、医療従事者が、映像表示部101に表示される映像の再生に関する操作を行うための各操作ボタンが表示される。 The operation button display section 111 displays various operation buttons that enable medical personnel to perform operations related to the playback of the video displayed on the video display section 101.
 再生シークバー表示部112には、医療従事者が、映像表示部101に表示される映像の映像データから見るべきシーンを探索し、再生位置(再生開始位置)を指定するために操作する再生シークバーが表示される。 The playback seek bar display unit 112 displays a playback seek bar that medical personnel can operate to search for a scene to view from the video data of the video displayed on the video display unit 101 and to specify the playback position (playback start position).
 時系列データ表示操作部103は、映像表示部101の右側に配置される。時系列データ表示操作部103には、過去から現在までのクリニカルデータや映像特徴量の時系列データが表示される。 The time-series data display operation unit 103 is located to the right of the video display unit 101. The time-series data display operation unit 103 displays time-series data of clinical data and video features from the past to the present.
 時系列データ表示操作部103の時系列データ上には、映像操作部102の再生シークバーと連動して、映像表示部101に表示される映像の再生位置などが表示される。 The playback position of the video displayed on the video display unit 101 is displayed on the time series data of the time series data display operation unit 103 in conjunction with the playback seek bar of the video operation unit 102.
 なお、振り返り表示画面100には、他の構成要素として、データセットの読み込み部や、患者情報システム2と連携し、患者属性(年齢、性別、患者ID番号など)や基礎疾患、既往歴などを読み込み、表示する表示部などが構成されてもよい。 In addition, the review display screen 100 may also be configured with other components such as a data set reading section and a display section that works in conjunction with the patient information system 2 to read and display patient attributes (age, sex, patient ID number, etc.), underlying diseases, medical history, etc.
 <映像操作部の構成例>
 図5は、図4の映像操作部102を拡大した図である。
<Example of video operation unit configuration>
FIG. 5 is an enlarged view of the video operation unit 102 in FIG. 4.
 操作ボタン表示部111には、例えば、前のイベント(に戻る)ボタン、10秒戻しボタン、巻き戻しボタン、再生ボタン、早送りボタン、15秒送りボタン、次のイベント(に進む)ボタン、一時停止ボタン、およびライブ(に戻る)ボタンなどの操作ボタンが配置されている。 Operation button display section 111 has operation buttons arranged thereon, such as a (back to) previous event button, a 10 second rewind button, a rewind button, a play button, a fast forward button, a 15 second forward button, a (forward to) next event button, a pause button, and a (back to) live button.
 操作ボタン表示部111の直下には、再生シークバー120が表示される再生シークバー表示部112が表示される。再生シークバー120上には、患者特徴量の時系列の変化を示すグラフが重畳表示される。再生シークバー120は、上述したように、医療従事者が、映像表示部101に表示される映像の映像データから見るべきシーンを探索し、再生開始位置を指定するために操作するためのツール(表示部品)である。 Directly below the operation button display section 111, a playback seek bar display section 112 is displayed, which displays a playback seek bar 120. A graph showing the time series changes in the patient characteristics is superimposed on the playback seek bar 120. As described above, the playback seek bar 120 is a tool (display component) that medical staff operates to search for a scene to be viewed from the video data of the video displayed on the video display section 101 and to specify the playback start position.
 患者特徴量は、クリニカルデータおよび映像特徴量からなる構成データセット、または、構成データセットから総合的に算出される総合特徴量からなる。図5においては、患者特徴量が総合特徴量である場合が示されている。なお、構成データセットと総合特徴量の両方を患者特徴量として表示させてもよい。 The patient features consist of constituent datasets consisting of clinical data and image features, or comprehensive features calculated comprehensively from the constituent datasets. Figure 5 shows a case where the patient features are comprehensive features. Note that both the constituent datasets and the comprehensive features may be displayed as patient features.
 再生シークバー120上には、また、患者情報システム2に記録されている看護記録データを用いて生成された患者に関するイベントを示すイベントラベル121a乃至121cが表示される。 Also displayed on the playback seek bar 120 are event labels 121a to 121c that indicate events related to the patient that were generated using nursing record data recorded in the patient information system 2.
 イベントラベル121aは、点滴などのルート管理のイベントを示すイベントラベルである。イベントラベル121aは、イベントタイトル「ルート管理」、再生シークバー120上でイベントの開始時刻を示すイベントポイント122a、およびイベント時刻周辺の映像データの中から得られるイベントプレビュー画像123aから構成される。 Event label 121a is an event label that indicates a route management event such as an IV drip. Event label 121a is composed of the event title "Route Management", an event point 122a that indicates the start time of the event on the playback seek bar 120, and an event preview image 123a obtained from the video data around the event time.
 イベントラベル121bは、体位交換のイベントを示すイベントラベルである。イベントラベル121bは、イベントタイトル「体位交換」、再生シークバー120上でイベントの開始時刻を示すイベントポイント122b、およびイベント時刻周辺の映像データの中から得られるイベントプレビュー画像123bから構成される。 Event label 121b is an event label that indicates a position change event. Event label 121b is composed of the event title "Position change", an event point 122b that indicates the start time of the event on the playback seek bar 120, and an event preview image 123b obtained from the video data around the event time.
　イベントラベル121cは、不穏のイベントを示すイベントラベルである。不穏とは、もぞもぞした動きを表す。イベントラベル121cは、イベントタイトル「不穏」、再生シークバー120上でイベントの開始時刻を示すイベントポイント122c、およびイベント時刻周辺の映像データの中から得られるイベントプレビュー画像123cから構成される。 Event label 121c is an event label that indicates an agitation event. Agitation refers to restless movement. Event label 121c is composed of the event title "agitation," an event point 122c that indicates the start time of the event on the playback seek bar 120, and an event preview image 123c obtained from the video data around the event time.
 なお、以下、イベントラベル121a乃至121c、イベントポイント122a乃至122c、イベントプレビュー画像123a乃至123cを個々に区別する必要がない場合、それぞれ、イベントラベル121、イベントポイント122、イベントプレビュー画像123と称する。 Note that, hereinafter, when there is no need to distinguish between the event labels 121a to 121c, the event points 122a to 122c, and the event preview images 123a to 123c, they will be referred to as the event label 121, the event point 122, and the event preview image 123, respectively.
 イベントプレビュー画像123には、四角形の枠の中に右向き三角形を記した再生アイコンが重畳されている。再生アイコンがクリックされると、映像表示部101において、イベントポイント122から映像データが再生される。すなわち、イベントポイント122は、映像データの再生開始ポイントとも言える。 A play icon, which is a right-facing triangle within a rectangular frame, is superimposed on the event preview image 123. When the play icon is clicked, the video data is played from the event point 122 in the video display unit 101. In other words, the event point 122 can also be considered the playback start point of the video data.
 <患者特徴量の構成例>
 図6は、図5の再生シークバー上に表示される患者特徴量の構成例を示す図である。
<Example of patient feature configuration>
FIG. 6 is a diagram showing an example of the configuration of patient features displayed on the playback seek bar in FIG. 5.
 患者特徴量は、上述したように、構成データセットまたは総合特徴量からなる。 Patient features consist of constituent datasets or overall features, as described above.
 図6の場合、構成データセットは、クリニカルデータA、クリニカルデータB、患者特徴量A、および患者特徴量Bから構成される。 In the case of Figure 6, the constituent dataset is composed of clinical data A, clinical data B, patient feature A, and patient feature B.
 また、総合特徴量は、構成データセットであるクリニカルデータA、クリニカルデータB、患者特徴量A、および患者特徴量Bを用いて総合的に算出される。 In addition, the overall feature is calculated comprehensively using the constituent data sets, clinical data A, clinical data B, patient feature A, and patient feature B.
 <患者特徴量の表示例>
 図7は、図6の構成データセットである場合の患者特徴量の表示例を示す図である。
<Display example of patient features>
FIG. 7 is a diagram showing a display example of patient features in the case of the constituent data set of FIG. 6.
 図7において、再生シークバー表示部112の再生シークバー120上には、構成データセットであるクリニカルデータB、クリニカルデータF、患者特徴量C、および患者特徴量Gの時系列の変化を示すグラフが表示されている。 In FIG. 7, a graph showing the time series changes in the constituent data sets, clinical data B, clinical data F, patient characteristic C, and patient characteristic G, is displayed on the playback seek bar 120 of the playback seek bar display section 112.
 さらに、再生シークバー120上には、再生中のポイントを示す再生ポイント151と、再生ポイント151におけるプレビュー画像152が示されている。プレビュー画像152は、再生ポイント151を選択することで表示される。 Furthermore, on the playback seek bar 120, a playback point 151 indicating the point currently being played back and a preview image 152 at the playback point 151 are displayed. The preview image 152 is displayed by selecting the playback point 151.
 <患者特徴量の表示例>
 図8は、図6の総合特徴量である場合の患者特徴量の表示例を示す図である。
<Display example of patient features>
FIG. 8 is a diagram showing a display example of patient features in the case of the comprehensive features of FIG. 6.
 図8において、再生シークバー表示部112の再生シークバー120上には、構成データセットであるクリニカルデータB、クリニカルデータF、患者特徴量C、および患者特徴量Gから総合的に算出された総合特徴量の時系列の変化を示すグラフが表示されている。 In FIG. 8, a graph showing the time series change in the overall feature calculated comprehensively from the constituent data sets, clinical data B, clinical data F, patient feature C, and patient feature G, is displayed on the playback seek bar 120 of the playback seek bar display section 112.
 図7の場合と同様に、再生シークバー120上には、再生中のポイントを示す再生ポイント151と、再生ポイント151におけるプレビュー画像152が示されている。プレビュー画像152は、再生ポイント151を選択することで表示される。 As in FIG. 7, a playback point 151 indicating the point currently being played back and a preview image 152 at the playback point 151 are displayed on the playback seek bar 120. The preview image 152 is displayed by selecting the playback point 151.
 <総合特徴量と構成データセットの例>
 図9は、総合特徴量と構成データセットの例を示す図である。
<Examples of comprehensive features and constituent datasets>
FIG. 9 is a diagram showing an example of the comprehensive feature amount and the constituent data sets.
　総合特徴量と構成データセットは、観察したい事象の種類、対象とする患者の疾患、振り返り表示システム1を操作する医師の好みなどに応じて、任意のものを選ぶことができる。 The overall features and constituent data sets can be freely selected according to the type of phenomenon to be observed, the disease of the target patient, and the preference of the doctor operating the review display system 1.
 図9においては、総合特徴量と構成データセットの組み合わせ例が示されている。なお、図9において、総合特徴量は実線の矩形で示されている。クリニカルデータはハッチングされた矩形で示されている。映像特徴量は破線の矩形で示されている。 In Figure 9, an example of a combination of comprehensive features and constituent data sets is shown. In Figure 9, comprehensive features are shown as solid rectangles. Clinical data are shown as hatched rectangles. Video features are shown as dashed rectangles.
 総合特徴量である肩の周期変動は、努力呼吸の一種である肩呼吸の場合に起こりうる動作である。 The periodic fluctuation of the shoulders, which is an overall feature, is a movement that can occur when shoulder breathing, which is a type of labored breathing.
 肩の周期変動は、呼吸数(クリニカルデータ)と、肩の上下方向の動き量および肩の上下方向の周期性(映像特徴量)から算出することができる。なお、肩の上下方向の動き量は、右肩の上下方向の動き量および左肩の上下方向の動き量から構成される。 The periodic fluctuation of the shoulders can be calculated from the respiratory rate (clinical data), the amount of vertical shoulder movement, and the periodicity of the vertical shoulder movement (image feature). The amount of vertical shoulder movement is composed of the amount of vertical shoulder movement of the right shoulder and the amount of vertical shoulder movement of the left shoulder.
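As a non-limiting illustration only, the calculation described above could be sketched as follows: the dominant period of the averaged vertical shoulder positions (an image feature) is estimated by autocorrelation and compared against the period implied by the respiratory rate (clinical data). The function names, the autocorrelation method, the 1-second minimum period, and the 20% tolerance are all assumptions of this sketch, not part of the specification.

```python
def dominant_period(samples, fps):
    """Return the dominant period (seconds) of a sampled signal,
    found as the autocorrelation peak over lags of 1 s or longer."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]
    best_lag, best_corr = None, float("-inf")
    # Skip lags shorter than 1 second to avoid the trivial peak at lag 1
    for lag in range(fps, n // 2):
        corr = sum(centered[i] * centered[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return best_lag / fps


def shoulder_periodicity_feature(right_y, left_y, fps, resp_rate_bpm):
    """Overall feature (binary sketch): 1 if the shoulders move up and
    down with a period close to the respiratory period, else 0."""
    avg = [(r + l) / 2 for r, l in zip(right_y, left_y)]
    period = dominant_period(avg, fps)
    resp_period = 60.0 / resp_rate_bpm  # seconds per breath
    return 1 if abs(period - resp_period) / resp_period < 0.2 else 0
```

The sketch assumes the position samples span several breathing cycles; in practice the specification leaves the exact calculation basis to be set per feature.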
 総合特徴量である不穏の疑いは、血液pH(クリニカルデータ)と、全身の動き量、視線の変化量、および声の大きさ(映像特徴量)から算出することができる。なお、全身の動き量は、頭の動き量、肩の動き量、腕の動き量、および脚の動き量から構成される。 The overall feature of suspected agitation can be calculated from blood pH (clinical data), the amount of whole-body movement, the amount of change in gaze, and the volume of the voice (visual features). The amount of whole-body movement is composed of the amount of head movement, shoulder movement, arm movement, and leg movement.
 総合特徴量である介入時を除く心拍数は、介入時の心拍数の異常値を除いた心拍数のことである。 The total feature, heart rate excluding intervention, is the heart rate excluding abnormal heart rate values during intervention.
 介入時を除く心拍数は、心拍数(クリニカルデータ)と介入人数(映像特徴量)から算出することができる。 Heart rate excluding intervention times can be calculated from the heart rate (clinical data) and the number of interventions (video feature).
 総合特徴量であるA医師オリジナルは、ベテランであるA医師が着目するパラメータセットである。 The comprehensive feature, Doctor A's original, is the parameter set that the veteran Doctor A focuses on.
 A医師オリジナルは、クリニカルデータX1およびクリニカルデータX2と、映像特徴量Xから算出することができる。映像特徴量Xは、映像特徴量X1と映像特徴量X2から構成される。 The Doctor A original can be calculated from clinical data X1, clinical data X2, and image feature X. Image feature X is composed of image feature X1 and image feature X2.
 総合特徴量である敗血症患者用は、症例(この場合、敗血症)で着目すべきパラメータセットである。 The overall feature for sepsis patients is the parameter set that should be focused on in the case (in this case, sepsis).
 敗血症患者用は、クリニカルデータY1およびクリニカルデータY2と、映像特徴量Yから算出することができる。映像特徴量Yは、映像特徴量Y1と映像特徴量Y2から構成される。 For sepsis patients, it can be calculated from clinical data Y1, clinical data Y2, and image feature Y. Image feature Y is composed of image feature Y1 and image feature Y2.
 このように、総合特徴量には、予め用意されたデフォルトの構成データセットが利用されてもよいし(例:不穏の疑い)、医療関係者によりカスタマイズ(作成)された構成データセットが利用されてもよい(例:A医師オリジナル)。 In this way, a default configuration dataset prepared in advance may be used for the comprehensive feature (e.g., suspicion of a disturbance), or a configuration dataset customized (created) by medical personnel may be used (e.g., original by Doctor A).
　構成データセットは、クリニカルデータと映像特徴量の両方またはいずれかで構成される。総合特徴量は、構成データセットを用いて設定された算出根拠で算出される特徴量(値)である。ただし、総合特徴量は必ずしも構成データセットから算出される“値”でなくてもよく、患者特徴量として構成データセットを表示させるための、構成データセットの単なる“グループ名”であってもよい。その場合、総合特徴量の値は空となる。総合特徴量は、ICUで用いられる重症度スコアなどでもよい。 The constituent datasets are composed of clinical data and/or image features. The overall feature is a feature (value) calculated based on the calculation basis set using the constituent datasets. However, the overall feature does not necessarily have to be a "value" calculated from the constituent datasets, and may simply be the "group name" of the constituent datasets in order to display the constituent datasets as patient features. In that case, the value of the overall feature is left empty. The overall feature may also be a severity score used in the ICU, or the like.
 重症度スコアは、例えば、バイタルデータや検査データから多臓器不全を評価するSOFA(Sepsis-related Organ Failure Assessment)スコアなどである。 An example of a severity score is the SOFA (Sepsis-related Organ Failure Assessment) score, which evaluates multiple organ failure based on vital signs and test data.
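The two roles described above (a computed value versus a mere group name with an empty value) can be sketched, purely as an illustration, with a small data structure; the class and attribute names are assumptions of this sketch, not taken from the specification.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class PatientFeature:
    """A patient feature: a constituent data set plus an optional
    calculation basis for the overall feature."""
    name: str                 # e.g. "suspected agitation"
    constituents: list        # names of clinical data / image features
    calc: Optional[Callable[[dict], Optional[float]]] = None

    def value(self, data: dict) -> Optional[float]:
        # Without a calculation basis, the overall feature is only a
        # group name for displaying its constituents; its value is empty.
        if self.calc is None:
            return None
        return self.calc(data)
```

A customized feature such as "Doctor A's original" would then supply its own `calc`, while a group-name-only feature would leave it unset.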
 <総合特徴量のカスタマイズ>
 図10は、総合特徴量のカスタマイズの例を示す図である。
<Customization of comprehensive features>
FIG. 10 is a diagram showing an example of customization of the comprehensive feature amount.
 総合特徴量は、簡易的なプログラミング方法または既存のプログラミング言語によりカスタマイズが可能である。また、デフォルトで用意された総合特徴量の算出根拠も表示が可能であり、医療従事者は、総合特徴量の算出根拠を見て確認することができる。 The overall feature amount can be customized using simple programming methods or existing programming languages. In addition, the calculation basis for the default overall feature amount can also be displayed, allowing medical professionals to view and confirm the calculation basis for the overall feature amount.
 図10においては、総合特徴量が介入時を除く心拍数で、構成データセットが、心拍数(クリニカルデータ)と介入人数(映像特徴量)である場合の総合特徴量の算出根拠が、右側に示されている。 In Figure 10, the basis for calculating the overall feature when the overall feature is the heart rate excluding intervention and the constituent data sets are the heart rate (clinical data) and the number of interventions (video feature) is shown on the right.
 入力には、心拍数と介入人数が設定されている。 The inputs include heart rate and number of participants.
 介入人数がゼロである場合、介入時を除く心拍数は、実際の患者の心拍数に設定される。介入人数がゼロでない場合、介入時を除く心拍数は、Noneとなる。 If the number of interventions is zero, the heart rate excluding interventions is set to the actual heart rate of the patient. If the number of interventions is not zero, the heart rate excluding interventions is set to None.
 出力には、介入時を除く心拍数が設定されている。 The output is set to the heart rate excluding intervention.
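The calculation basis shown in FIG. 10 can be sketched, as a non-limiting illustration, in the following way (the function name and the pointwise application are assumptions of this sketch):

```python
def heart_rate_excluding_intervention(heart_rate, num_intervening):
    """Overall feature of FIG. 10: pass the clinical heart rate through
    only when no staff are detected intervening in the video."""
    if num_intervening == 0:
        return heart_rate
    return None  # excluded (empty) while an intervention is in progress
```

Applied pointwise to the paired time series, this produces the gaps during interventions that appear in the superimposed display described with reference to FIG. 11.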
 <総合特徴量の重畳例>
 図11は、総合特徴量の重畳表示の例を示す図である。
<Example of superimposing comprehensive features>
FIG. 11 is a diagram showing an example of a superimposed display of comprehensive feature amounts.
 時系列データ同士は重畳表示を行うことができる。 Time series data can be displayed on top of each other.
　図11においては、総合特徴量である介入時を除く心拍数と、クリニカルデータである心拍数(ハッチ)とが重畳表示されている。介入時には、介入時を除く心拍数が表示されないため、心拍数(ハッチ)のみが表示される。 In FIG. 11, the heart rate excluding intervention, which is the overall feature, and the heart rate (hatched), which is clinical data, are superimposed. During an intervention, the heart rate excluding intervention is not displayed, so only the heart rate (hatched) is displayed.
 このように、介入時を除く心拍数と心拍数(ハッチ)とを重畳表示することにより、介入の有無による心拍数をわかり易く表示することができる。 In this way, by superimposing the heart rate (hatch) and the heart rate excluding intervention, the heart rate with and without intervention can be displayed in an easy-to-understand manner.
 <患者特徴量の設定処理>
 図12は、患者特徴量の設定処理を説明するフローチャートである。
<Setting process of patient features>
FIG. 12 is a flowchart illustrating the process of setting patient features.
 ステップS11において、表示画面生成部43は、医療従事者による入力部(不図示)の操作に応じて、表示する患者特徴量を選択する。 In step S11, the display screen generation unit 43 selects the patient features to be displayed in response to the operation of an input unit (not shown) by a medical professional.
 ステップS12において、表示画面生成部43は、選択した患者特徴量がすでに記憶部71の既存リストに登録されているか否かを判定する。選択した患者特徴量がすでに既存リストに登録されていないとステップS12において判定された場合、処理は、ステップS13に進む。 In step S12, the display screen generating unit 43 determines whether or not the selected patient feature has already been registered in the existing list in the storage unit 71. If it is determined in step S12 that the selected patient feature has not already been registered in the existing list, the process proceeds to step S13.
 ステップS13において、表示画面生成部43は、新規患者特徴量を作成する。例えば、患者特徴量として、患者特徴量を構成する構成データセットと総合特徴量、および総合特徴量の算出根拠などが作成される。 In step S13, the display screen generation unit 43 creates new patient features. For example, as the patient features, a constituent data set and an overall feature that make up the patient features, and the calculation basis for the overall feature are created.
 ステップS14において、表示画面生成部43は、作成した患者特徴量を既存リストに登録する。その後、処理は、ステップS15に進む。 In step S14, the display screen generator 43 registers the created patient features in an existing list. Then, the process proceeds to step S15.
 ステップS12において、選択した患者特徴量がすでに既存リストに登録されていると判定された場合、ステップS13およびS14の処理をスキップし、処理は、ステップS15に進む。 If it is determined in step S12 that the selected patient feature is already registered in the existing list, steps S13 and S14 are skipped and processing proceeds to step S15.
 ステップS15において、表示画面生成部43は、既存リストを記憶部71から読み込む。 In step S15, the display screen generation unit 43 reads the existing list from the storage unit 71.
 ステップS16において、表示画面生成部43は、読み込んだ既存リストから、ステップS11において選択した患者特徴量を選択する。 In step S16, the display screen generation unit 43 selects the patient features selected in step S11 from the existing list that was loaded.
　ステップS17において、表示画面生成部43は、選択した患者特徴量の表示方法として、構成データセットの表示が選択されたか否かを判定する。構成データセットの表示が選択されたとステップS17において判定された場合、処理は、ステップS18に進む。 In step S17, the display screen generation unit 43 determines whether display of the constituent data set has been selected as the display method for the selected patient features. If it is determined in step S17 that display of the constituent data set has been selected, the process proceeds to step S18.
 ステップS18において、表示画面生成部43は、患者特徴量として、構成データセットを表示させる。 In step S18, the display screen generation unit 43 displays the configuration data set as the patient features.
 ステップS17において、構成データセットの表示が選択されていないと判定された場合、処理は、ステップS19に進む。 If it is determined in step S17 that display of the configuration data set has not been selected, processing proceeds to step S19.
 ステップS19において、表示画面生成部43は、患者特徴量として、総合特徴量を表示させる。 In step S19, the display screen generation unit 43 displays the overall feature as the patient feature.
 ステップS18またはS19の後、患者特徴量の設定処理は終了となる。 After step S18 or S19, the patient feature setting process ends.
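The flow of FIG. 12 (steps S11 to S19) can be sketched, as a non-limiting illustration, as follows; `registry` stands in for the existing list held in the storage unit 71, and all names are assumptions of this sketch.

```python
def set_patient_feature(name, registry, create_feature, show_constituents):
    """Sketch of the patient-feature setting process of FIG. 12."""
    # S12: if the selected feature is not yet in the existing list,
    # S13/S14: create it and register it.
    if name not in registry:
        registry[name] = create_feature(name)
    # S15/S16: read the existing list and select the feature.
    feature = registry[name]
    # S17: branch on the chosen display method.
    if show_constituents:
        return ("constituent data set", feature["constituents"])  # S18
    return ("overall feature", feature["overall"])                # S19
```

Here `create_feature` would produce the constituent data set, the overall feature, and its calculation basis, as described for step S13.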
 <イベントラベルの生成処理>
 図13は、イベントラベルの生成処理を説明するフローチャートである。
<Event label generation process>
FIG. 13 is a flowchart illustrating the process of generating an event label.
 イベントラベルは、図5を参照して上述したように、患者情報システム2の看護記録データと連携されており、看護師や医療従事者によって入力された文字情報が認識され、生成されて、再生シークバー120上に表示される。 As described above with reference to Figure 5, the event label is linked to the nursing record data in the patient information system 2, and the text information entered by the nurse or medical staff is recognized, generated, and displayed on the playback seek bar 120.
 図13においては、イベントラベルの生成処理が示されている。 Figure 13 shows the process of generating an event label.
 ステップS31において、データ収集部41は、患者情報システム2から看護記録データを取り込む。 In step S31, the data collection unit 41 imports nursing record data from the patient information system 2.
 ステップS32において、テキスト解析部74は、看護記録データを用いて文字認識を行い、認識した文字に基づいてイベントラベル121を生成する。例えば、「1時32分、〇〇の言動あり。不穏の恐れ」という看護記録データがある場合、「1時32分」「不穏」などの文字が認識される。 In step S32, the text analysis unit 74 performs character recognition using the nursing record data, and generates an event label 121 based on the recognized characters. For example, if there is nursing record data stating "At 1:32, XX behavior occurred. There is a possibility of agitation," characters such as "1:32" and "agitation" are recognized.
 ステップS33において、テキスト解析部74は、生成したイベントラベル121を記憶部71のイベントリストに登録する。 In step S33, the text analysis unit 74 registers the generated event label 121 in the event list of the storage unit 71.
 ステップS34において、表示画面生成部43は、記憶部71のイベントリストに登録されているイベントラベル121を、再生シークバー120上に表示させる。 In step S34, the display screen generating unit 43 displays the event label 121 registered in the event list in the storage unit 71 on the playback seek bar 120.
 以上の処理により、現場の医師や看護師のワークフローを変えることなく、重要シーンを示すイベントラベルが生成される。 Through the above process, event labels indicating important scenes are generated without changing the workflow of doctors and nurses on-site.
 なお、自動生成以外の方法としては、現場の負担が多少発生することになるが、医療従事者が端末に対して看護記録を入力することにより、イベントラベルが生成されるようにしてもよい。 As an alternative to automatic generation, although this would impose some burden on the field, event labels could be generated by medical staff entering nursing records into a terminal.
 または、ベッドサイドに置かれたナースコールなどのボタンが押されることにより、イベントラベルの時刻が刻まれるようにしてもよい。この場合、イベントラベルのタイトルは必ずしもなくてもよく、イベントが発生した時刻だけが記録される形式であってもよい。 Alternatively, the time on the event label may be recorded by pressing a button such as a nurse call button placed at the bedside. In this case, the event label does not necessarily need to have a title, and only the time the event occurred may be recorded.
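The character-recognition step S32 can be sketched, purely as an illustration, as extracting a time expression and an event keyword from one nursing-record entry; the keyword list, the record format, and the function name are assumptions of this sketch.

```python
import re

# Illustrative event keywords; a real system would draw these from
# the event list registered in the storage unit 71.
EVENT_KEYWORDS = ["不穏", "体位交換", "ルート管理"]


def make_event_label(record_text):
    """Return (HH:MM, title) extracted from a nursing-record entry,
    or None if no known event keyword is found."""
    m = re.search(r"(\d{1,2})時(\d{1,2})分", record_text)
    time = f"{int(m.group(1)):02d}:{int(m.group(2)):02d}" if m else None
    for kw in EVENT_KEYWORDS:
        if kw in record_text:
            return (time, kw)
    return None
```

For the example entry given above, the sketch would yield a label with the time "01:32" and the title "不穏" (agitation), which is then registered in the event list (S33) and displayed on the playback seek bar 120 (S34).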
 <閾値設定の例>
 図14は、時系列データ表示操作部103における閾値設定の例を示す図である。
<Example of threshold setting>
FIG. 14 is a diagram showing an example of threshold setting in the time-series data display operation unit 103.
 図14のAの右側に示される時系列データ表示操作部103においては、時系列データXが表示されている。 The time series data display operation unit 103 shown on the right side of A in Figure 14 displays the time series data X.
 時系列データX上には、閾値100を示す位置にある調整バー181が表示されている。そして、時系列データX上においては、調整バー181を値が上回る時間帯、すなわち、閾値100より値が高い時間帯191aが異なる色で示されている。 On the time series data X, an adjustment bar 181 is displayed at a position indicating the threshold value 100. On the time series data X, the time period where the value exceeds the adjustment bar 181, i.e., the time period 191a where the value is higher than the threshold value 100, is shown in a different color.
 この時間帯191aは、図14のAの左側に示されるように、映像操作部102の再生シークバー120上に、時系列データXが閾値100より高い値の時間帯192aとして反映される。 This time period 191a is reflected on the playback seek bar 120 of the video operation unit 102 as time period 192a during which the time series data X is greater than the threshold value 100, as shown on the left side of A in FIG. 14.
 その際、時間帯192aに紐づいて、イベントプレビュー画像193aが表示される。 At that time, an event preview image 193a is displayed in association with the time period 192a.
 イベントプレビュー画像193aには、四角形の枠内に右向き三角形を記した再生アイコンが重畳されている。再生アイコンがクリックされると、映像表示部101における映像データが、時間帯192aの先頭から再生される。 A play icon, which is a right-facing triangle within a rectangular frame, is superimposed on the event preview image 193a. When the play icon is clicked, the video data in the video display section 101 is played from the beginning of the time period 192a.
 これにより、時系列データXが閾値より高い時間帯、すなわち、時系列データXが正常ではない時間帯に何が起こっているのかを映像で容易に確認することができる。 This makes it easy to see from the video what is happening during the time period when the time series data X is higher than the threshold, i.e., during the time period when the time series data X is not normal.
 また、調整バー181をドラッグアンドドロップで上下に移動させることにより、医療従事者は時系列データXに対して任意の閾値を設定することができる。 In addition, by dragging and dropping the adjustment bar 181 up and down, medical personnel can set any threshold value for the time series data X.
 図14のBの右側においては、医療関係者が、図14のAに示される閾値100を示す位置にある調整バー181をドラッグアンドドロップにより下げることで、閾値80を示す位置に移動させた場合の時系列データXが表示される時系列データ表示操作部103が示されている。 The right side of FIG. 14B shows a time-series data display operation unit 103 that displays the time-series data X when a medical professional moves the adjustment bar 181, which is at the position indicating the threshold value 100 shown in FIG. 14A, to a position indicating the threshold value 80 by dragging and dropping it down.
 図14のBの場合、時系列データX上においては、調整バー181を値が上回る時間帯、すなわち、閾値80より値が高い時間帯191a乃至191dが異なる色で表示されている。この場合、図14のAの場合よりも閾値を下げたことにより、より多くの時間帯がピックアップ(異なる色で表示)されている。 In the case of FIG. 14B, on the time series data X, time periods 191a to 191d whose values exceed the adjustment bar 181, i.e., whose values are higher than the threshold value 80, are displayed in different colors. In this case, by lowering the threshold value compared to the case of FIG. 14A, more time periods are picked up (displayed in different colors).
 これらの時間帯191a乃至191dは、図14のBの左側に示されるように、映像操作部102の再生シークバー120上に、閾値80より値が高い時間帯192a乃至192dとして反映される。 These time periods 191a to 191d are reflected on the playback seek bar 120 of the video operation unit 102 as time periods 192a to 192d whose values are higher than the threshold value 80, as shown on the left side of FIG. 14B.
　その際、時間帯192a乃至192dに紐づいて、イベントプレビュー画像193a乃至193dがそれぞれ表示される。 At that time, event preview images 193a to 193d are displayed in association with time periods 192a to 192d, respectively.
 イベントプレビュー画像193b乃至193dにも、イベントプレビュー画像193aと同様に、再生アイコンがそれぞれ重畳されている。 Similar to event preview image 193a, the play icon is also superimposed on event preview images 193b to 193d.
 イベントプレビュー画像193b乃至193dに重畳されている各再生アイコンをクリックすることで、イベントプレビュー画像193aの場合と同様に、映像表示部101における映像データを、時間帯192b乃至192dの先頭からそれぞれ再生することができる。 By clicking each of the play icons superimposed on the event preview images 193b to 193d, the video data in the video display section 101 can be played from the beginning of each of the time periods 192b to 192d, in the same way as for the event preview image 193a.
 なお、図14においては、時系列データXが閾値より上回る時間帯を異なる色で表示させるようにしたが、設定される閾値によっては、時系列データXが閾値より下回る時間帯を異なる色で表示させてもよい。異なるように表示する方法としては、異なる色に限らず、異なる形式であればよい。 In FIG. 14, the time periods when the time series data X exceeds the threshold are displayed in different colors, but depending on the threshold that is set, the time periods when the time series data X falls below the threshold may be displayed in different colors. The method of displaying them differently is not limited to different colors, and may be in a different format.
 また、閾値の設定方法は、調整バー181をドラッグアンドドロップで上下に変更する方法を例として説明したが、テキストボックスを表示させて、数値を直接入力する方法でもよい。 The threshold value was set by dragging and dropping the adjustment bar 181 up and down, but a text box can also be displayed and a numerical value can be entered directly.
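The threshold-based highlighting described above (picking out the time periods 191a to 191d where the series crosses the adjustment bar 181) can be sketched as follows. This is an illustrative reconstruction, not the patent's actual implementation; the function name and data layout are assumptions:

```python
def pick_time_periods(values, times, threshold, above=True):
    """Return (start, end) time pairs for the runs where the series is
    above (or below) the threshold, mirroring the highlighted bands
    191a to 191d on the time series data X."""
    mask = [(v > threshold) if above else (v < threshold) for v in values]
    periods = []
    start = None
    for i, flag in enumerate(mask):
        if flag and start is None:
            start = times[i]                       # a highlighted band begins
        elif not flag and start is not None:
            periods.append((start, times[i - 1]))  # the band ends
            start = None
    if start is not None:                          # band runs to the end of the data
        periods.append((start, times[-1]))
    return periods

x = [60, 90, 110, 85, 70, 95, 105, 75, 82, 60]
t = list(range(10))
print(pick_time_periods(x, t, 100))  # [(2, 2), (6, 6)]
print(pick_time_periods(x, t, 80))   # [(1, 3), (5, 6), (8, 8)]
```

As in FIG. 14, lowering the threshold from 100 to 80 picks up more time periods; the same period list can then be mapped onto the playback seek bar 120 as the bands 192a to 192d.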
 <再生位置の表示例>
 図15は、現在再生中の映像の再生位置の表示例を示す図である。
<Example of playback position display>
FIG. 15 is a diagram showing an example of a display of the playback position of a video currently being played back.
 図15の左側においては、映像操作部102の再生シークバー表示部112の再生シークバー120上に現在再生中の映像の再生位置200が表示されている。 On the left side of FIG. 15, the playback position 200 of the video currently being played is displayed on the playback seek bar 120 of the playback seek bar display section 112 of the video operation section 102.
 映像操作部102上に表示される再生位置200に相当する再生位置は、図15の右側に示されるように、時系列データ表示操作部103の各時系列データ上にもそれぞれ表示される。 The playback position corresponding to playback position 200 displayed on the video operation unit 102 is also displayed on each piece of time series data on the time series data display operation unit 103, as shown on the right side of Figure 15.
 例えば、再生位置200に相当する再生位置は、クリニカルデータA乃至Cの時系列データ上に、再生位置201a乃至201cとして表示されている。 For example, playback positions corresponding to playback position 200 are displayed as playback positions 201a to 201c on the time-series data of clinical data A to C.
 同様に、再生位置200に相当する再生位置は、映像特徴量A乃至Cの時系列データ上に、再生位置211a乃至211cとして表示されている。 Similarly, playback positions corresponding to playback position 200 are displayed as playback positions 211a to 211c on the time-series data of video features A to C.
 映像操作部102上に表示される再生位置を、時系列データ表示操作部103の各時系列データ上にもそれぞれ表示することで、時系列データの変化と映像による患者の様子の関係を確認しやすくすることができる。 By displaying the playback position displayed on the video operation unit 102 on each piece of time series data on the time series data display operation unit 103, it is easy to check the relationship between changes in the time series data and the patient's condition in the video.
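Drawing the playback markers 201a to 201c and 211a to 211c requires mapping the current video playback position onto each series' own sampling times. A minimal sketch, assuming each clinical-data or video-feature series carries a sorted list of sample times in seconds (the function name is an assumption):

```python
from bisect import bisect_left

def playback_marker_index(series_times, playback_time):
    """Return the index of the sample closest to the playback position,
    so that a marker such as 201a to 201c can be drawn on each series."""
    i = bisect_left(series_times, playback_time)
    if i == 0:                       # before the first sample
        return 0
    if i == len(series_times):       # after the last sample
        return len(series_times) - 1
    before, after = series_times[i - 1], series_times[i]
    return i - 1 if playback_time - before <= after - playback_time else i

# Clinical data sampled every 60 s; playback position 200 is at 130 s.
times = [0, 60, 120, 180, 240]
print(playback_marker_index(times, 130))  # 2  (the sample at t=120 is closest)
```

In a real viewer this lookup would run once per series each time the seek bar position changes, keeping every marker in sync with the video.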
 <相関性のあるデータの表示>
 図16は、相関性のあるデータの表示例を示す図である。
<Displaying correlated data>
FIG. 16 is a diagram showing an example of displaying correlated data.
 図16において、クリニカルデータA乃至Cの時系列データ、映像特徴量A乃至Cの時系列データが表示されている時系列データ表示操作部103が示されている。 FIG. 16 shows the time series data display operation unit 103 on which the time series data of clinical data A to C and the time series data of image features A to C are displayed.
 クリニカルデータAの時系列データにおいて、枠251に示されるように、任意の時間帯が選択されて、相関性評価が実行されると、矢印P1に示されるように、クリニカルデータや映像特徴量の中から、正または負の相関性がある他のデータ候補を示す小画面252が表示される。 When an arbitrary time period is selected in the time series data of clinical data A as shown in box 251 and a correlation evaluation is performed, a small screen 252 is displayed showing other data candidates with positive or negative correlation from among the clinical data and image features as shown by arrow P1.
 小画面252には、他のデータ候補として、正の相関があるクリニカルデータKと映像特徴量Nが示されており、負の相関がある映像特徴量Jが示されている。 The small screen 252 shows other data candidates, namely clinical data K and image feature N, which are positively correlated, and image feature J, which is negatively correlated.
 また、小画面252には、各他のデータ候補とともに、クリニカルデータAとの時間差、相関係数の度合いもそれぞれ示されている。 In addition, the small screen 252 also shows the time difference with clinical data A and the degree of correlation coefficient for each of the other data candidates.
 すなわち、他のデータ候補としては、同じ時間帯の波形の相関性だけではなく、時間をΔtずらした場合の他の時間帯の相関性(相互相関)も評価される。 In other words, as other data candidates, not only the correlation of waveforms in the same time period is evaluated, but also the correlation (cross-correlation) of other time periods when shifted by Δt.
 小画面252において、例えば、クリニカルデータKが選択されると、矢印P2に示されるように、クリニカルデータAとの相互相関表示画面253が表示される。 For example, when clinical data K is selected on the small screen 252, a cross-correlation display screen 253 with clinical data A is displayed, as shown by arrow P2.
 相互相関表示画面253には、クリニカルデータAと、時間をΔtずらしたクリニカルデータKとの相関係数のグラフが示されている。 The cross-correlation display screen 253 shows a graph of the correlation coefficient between clinical data A and clinical data K shifted in time by Δt.
 以上のように、時系列データ表示操作部103において選択した時間帯の時系列データと相関性のある他の時系列データを探索することができる。 As described above, it is possible to search for other time series data that are correlated with the time series data for the time period selected in the time series data display operation unit 103.
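The lagged-correlation search described above (evaluating waveform correlation at time shifts Δt, as plotted on the cross-correlation display screen 253) can be sketched as follows. The lag convention, guard behavior, and function name are assumptions, not the patent's specification:

```python
def best_lag_correlation(a, b, max_lag):
    """Evaluate the Pearson correlation between series a and series b
    shifted by each lag dt in [-max_lag, max_lag]; return (best_dt, best_r).
    Positive dt means b lags a by dt samples."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        sxx = sum((xi - mx) ** 2 for xi in x)
        syy = sum((yi - my) ** 2 for yi in y)
        if sxx == 0 or syy == 0:     # constant slice: correlation undefined
            return 0.0
        return sxy / (sxx * syy) ** 0.5

    best = (0, -2.0)
    for dt in range(-max_lag, max_lag + 1):
        if dt >= 0:                  # pair a[i] with b[i + dt]
            x, y = a[:len(a) - dt], b[dt:]
        else:
            x, y = a[-dt:], b[:len(b) + dt]
        if len(x) > 1:
            r = pearson(x, y)
            if r > best[1]:
                best = (dt, r)
    return best

# b repeats a delayed by 2 samples, so the best shift is dt = 2 with r ≈ 1.
a = [1, 2, 3, 4, 5, 4, 3, 2]
b = [0, 0, 1, 2, 3, 4, 5, 4]
dt, r = best_lag_correlation(a, b, 3)
print(dt, round(r, 3))  # 2 1.0
```

Candidates whose best correlation exceeds some magnitude could then be listed, with their Δt and coefficient, like the entries on the small screen 252.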
 この機能の有用性としては、クリニカルデータおよび映像特徴量の相互の関係性を知ることにより、従来医師や看護師の経験に頼り続けられている現場の暗黙知を形式知化することにつながることがある。さらにその関係性を利用して、患者映像とクリニカルデータによる急変予測技術に発展する可能性が考えられる。 This function is useful because understanding the mutual relationships between clinical data and video features can turn the tacit knowledge of the clinical field, which has traditionally depended on the experience of doctors and nurses, into explicit knowledge. Furthermore, these relationships could be exploited to develop technology for predicting sudden patient deterioration from patient video and clinical data.
<3.その他>
 <本技術の効果>
 以上のように、本技術においては、患者を撮像して得られる映像データの特徴量である映像特徴量および患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量が算出され、算出された総合特徴量を、映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面が生成される。
<3. Other>
<Effects of this technology>
As described above, in the present technology, an overall feature is calculated based on a dataset consisting of video features, which are features of video data obtained by imaging a patient, and patient data, which is at least one of the patient's clinical data, and a first display screen is generated that displays the calculated overall feature in chronological order together with a display component that indicates the playback position of the video data.
 すなわち、映像データの再生位置を示す表示部品とともに経時的な変化を表示する表示画面が生成されるので、患者の様子を効率良く振り返ることができる。 In other words, a display screen is generated that shows changes over time along with a display component that indicates the playback position of the video data, allowing users to efficiently review the patient's condition.
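The patent leaves the exact calculation of the comprehensive feature open (configurations (15) and (16) make both the dataset and the calculation basis user-settable). Purely as an illustration, one plausible sketch z-score-normalizes each selected series and combines them with user-set weights; the function name, weighting scheme, and data layout below are all assumptions:

```python
def comprehensive_feature(dataset, weights=None):
    """Illustrative only: combine normalized patient-data series
    (clinical data and/or video features) into one time series.
    Defaults to equal weights when none are supplied."""
    names = list(dataset)
    if weights is None:
        weights = {n: 1.0 / len(names) for n in names}
    length = len(next(iter(dataset.values())))

    def zscore(xs):
        m = sum(xs) / len(xs)
        sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5 or 1.0
        return [(x - m) / sd for x in xs]

    normed = {n: zscore(dataset[n]) for n in names}
    return [sum(weights[n] * normed[n][t] for n in names) for t in range(length)]

# Hypothetical dataset: one clinical series and one video feature.
data = {"heart_rate": [70, 72, 90, 88, 75], "motion": [0.1, 0.2, 0.9, 0.8, 0.2]}
combined = comprehensive_feature(data)
print(len(combined))  # 5
```

The resulting single series is what would be plotted in time sequence alongside the playback seek bar on the first display screen.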
 本技術によれば、患者の様子を効率よく把握することができるので、勤務人数が減少する夜間帯や医療従事者が不足している病院、遠隔ICUにおける支援センターなどにおいて、少ない人手で効率よく複数人の患者の状態を確認することができる。 Because this technology makes it possible to grasp patients' conditions efficiently, the states of multiple patients can be checked with a small number of staff, for example at night when fewer people are on duty, in hospitals with a shortage of medical staff, or in support centers for remote ICUs.
 これにより、医師不足の解消、働き方改革、遠隔医療の発展などに貢献することができる。 This will contribute to resolving the shortage of doctors, reforming work styles, and advancing remote medical care.
 医療従事者が、日勤と夜勤の交代時に行われる“申し送り”において、映像を交えて患者の様子をわかりやすく相手に伝えることができる。 When medical staff hand over information between day and night shifts, they can easily communicate the patient's condition to others using video.
 医療従事者が、予め用意された、または自分で作成した特徴量データセットを読み込むことにより、気になるデータを重点的に確認することができる。また、若手医師にとってはベテラン医師の着眼点を学ぶことができる。 Medical professionals can load feature datasets that are prepared in advance or that they have created themselves, allowing them to focus on the data that interests them. It also allows young doctors to learn the perspectives of veteran doctors.
 クリニカルデータと映像特徴量の相関性の発見により、暗黙知の形式知化に繋がる。さらには、映像を用いた急変予測技術開発につながる。 The discovery of correlations between clinical data and video features will lead to the conversion of tacit knowledge into explicit knowledge. It will also lead to the development of technology to predict sudden changes using video.
 <コンピュータの構成例>
 上述した一連の処理は、ハードウェアにより実行することもできるし、ソフトウェアにより実行することもできる。一連の処理をソフトウェアにより実行する場合には、そのソフトウェアを構成するプログラムが、専用のハードウェアに組み込まれているコンピュータ、または汎用のパーソナルコンピュータなどに、プログラム記録媒体からインストールされる。
<Example of computer configuration>
The above-mentioned series of processes can be executed by hardware or software. When the series of processes is executed by software, the program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware or a general-purpose personal computer.
 図17は、上述した一連の処理をプログラムにより実行するコンピュータのハードウェアの構成例を示すブロック図である。 FIG. 17 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes using a program.
 CPU(Central Processing Unit)301、ROM(Read Only Memory)302、RAM(Random Access Memory)303は、バス304により相互に接続されている。 CPU (Central Processing Unit) 301, ROM (Read Only Memory) 302, and RAM (Random Access Memory) 303 are interconnected by bus 304.
 バス304には、さらに、入出力インタフェース305が接続されている。入出力インタフェース305には、キーボード、マウスなどよりなる入力部306、ディスプレイ、スピーカなどよりなる出力部307が接続される。また、入出力インタフェース305には、ハードディスクや不揮発性のメモリなどよりなる記憶部308、ネットワークインタフェースなどよりなる通信部309、リムーバブルメディア311を駆動するドライブ310が接続される。 Further connected to the bus 304 is an input/output interface 305. Connected to the input/output interface 305 are an input unit 306 consisting of a keyboard, mouse, etc., and an output unit 307 consisting of a display, speakers, etc. Also connected to the input/output interface 305 are a storage unit 308 consisting of a hard disk or non-volatile memory, a communication unit 309 consisting of a network interface, etc., and a drive 310 that drives removable media 311.
 以上のように構成されるコンピュータでは、CPU301が、例えば、記憶部308に記憶されているプログラムを入出力インタフェース305及びバス304を介してRAM303にロードして実行することにより、上述した一連の処理が行われる。 In a computer configured as described above, the CPU 301 performs the above-mentioned series of processes, for example by loading a program stored in the storage unit 308 into the RAM 303 via the input/output interface 305 and the bus 304 and executing the program.
 CPU301が実行するプログラムは、例えばリムーバブルメディア311に記録して、あるいは、ローカルエリアネットワーク、インターネット、デジタル放送といった、有線または無線の伝送媒体を介して提供され、記憶部308にインストールされる。 The programs executed by the CPU 301 are provided, for example, by being recorded on removable media 311, or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and are installed in the storage unit 308.
 なお、コンピュータが実行するプログラムは、本明細書で説明する順序に沿って時系列に処理が行われるプログラムであっても良いし、並列に、あるいは呼び出しが行われたとき等の必要なタイミングで処理が行われるプログラムであっても良い。 The program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at the required timing, such as when called.
 なお、本明細書において、システムとは、複数の構成要素(装置、モジュール(部品)等)の集合を意味し、すべての構成要素が同一筐体中にあるか否かは問わない。したがって、別個の筐体に収納され、ネットワークを介して接続されている複数の装置、及び、1つの筐体の中に複数のモジュールが収納されている1つの装置は、いずれも、システムである。 In this specification, a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in a single housing, are both systems.
 また、本明細書に記載された効果はあくまで例示であって限定されるものでは無く、また他の効果があってもよい。 Furthermore, the effects described in this specification are merely examples and are not limiting, and other effects may also exist.
 本技術の実施の形態は、上述した実施の形態に限定されるものではなく、本技術の要旨を逸脱しない範囲において種々の変更が可能である。 The embodiment of this technology is not limited to the above-mentioned embodiment, and various modifications are possible without departing from the gist of this technology.
 例えば、本技術は、1つの機能を、ネットワークを介して複数の装置で分担、共同して処理するクラウドコンピューティングの構成をとることができる。 For example, this technology can be configured as cloud computing, in which a single function is shared and processed collaboratively by multiple devices over a network.
 また、上述のフローチャートで説明した各ステップは、1つの装置で実行する他、複数の装置で分担して実行することができる。 In addition, each step described in the above flowchart can be executed by a single device, or can be shared and executed by multiple devices.
 さらに、1つのステップに複数の処理が含まれる場合には、その1つのステップに含まれる複数の処理は、1つの装置で実行する他、複数の装置で分担して実行することができる。 Furthermore, when one step includes multiple processes, the multiple processes included in that one step can be executed by one device, or can be shared and executed by multiple devices.
<構成の組み合わせ例>
 本技術は、以下のような構成をとることもできる。
(1)
 患者を撮像して得られる映像データの特徴量である映像特徴量および前記患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量を算出するデータ処理部と、
 算出された前記総合特徴量を、前記映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面を生成する表示画面生成部と
 を備える情報処理装置。
(2)
 前記データ処理部は、前記患者の看護記録データに基づいて前記患者に関するイベントを示すイベントラベルを生成し、
 前記表示画面生成部は、前記第1の表示画面において、前記イベントラベルを前記表示部品とともに重畳する
 前記(1)に記載の情報処理装置。
(3)
 前記イベントラベルは、前記イベントの開始時刻を示すポイントおよび前記イベントのプレビュー画像を含む
 前記(2)に記載の情報処理装置。
(4)
 前記イベントラベルは、前記イベントのタイトルを含む
 前記(3)に記載の情報処理装置。
(5)
 前記表示画面生成部は、前記第1の表示画面において、前記患者データを前記表示部品とともに表示する
 前記(1)乃至(4)のいずれかに記載の情報処理装置。
(6)
 前記表示画面生成部は、前記第1の表示画面において、前記患者データを前記総合特徴量に重畳して表示する
 前記(5)に記載の情報処理装置。
(7)
 前記表示画面生成部は、前記第1の表示画面において、前記映像データの再生位置に対応するポイントを前記表示部品とともに表示する
 前記(1)乃至(6)のいずれかに記載の情報処理装置。
(8)
 前記表示画面生成部は、再生中の前記映像データに対応する映像を表示する第2の表示画面の周辺に、前記第1の表示画面を生成する
 前記(1)乃至(7)のいずれかに記載の情報処理装置。
(9)
 前記表示画面生成部は、前記患者データを時系列にそれぞれ表示する第3の表示画面を生成する
 前記(1)乃至(8)のいずれかに記載の情報処理装置。
(10)
 前記データ処理部は、前記第3の表示画面において選択された時間帯の前記患者データと相関性がある他の前記患者データである相関データを抽出し、
 前記表示画面生成部は、抽出された前記相関データを表示する第4の表示画面を生成する
 前記(9)に記載の情報処理装置。
(11)
 前記相関データは、前記第3の表示画面において選択された時間帯とは異なる時間帯の他の前記患者データである
 前記(10)に記載の情報処理装置。
(12)
 前記第3の表示画面において、前記患者データにおける閾値を設定可能であり、
 前記表示画面生成部は、前記第3の表示画面において、前記患者データが前記閾値以上である時間帯と前記閾値未満である時間帯とを異なる形式で表示する
 前記(9)に記載の情報処理装置。
(13)
 前記表示画面生成部は、前記第1の表示画面において、前記患者データが前記閾値以上である時間帯と前記閾値未満である時間帯とを異なる形式で表示する
 前記(12)に記載の情報処理装置。
(14)
 前記表示画面生成部は、前記第1の表示画面において、前記映像データの再生位置に対応するポイントを前記表示部品とともに表示し、
 前記第3の表示画面において、前記映像データの再生位置に対応するポイントを、前記患者データ上に表示する
 前記(9)に記載の情報処理装置。
(15)
 前記総合特徴量の算出に用いられる前記データセットはユーザにより設定可能である
 前記(1)乃至(14)のいずれかに記載の情報処理装置。
(16)
 前記総合特徴量の算出根拠はユーザにより設定可能である
 前記(15)に記載の情報処理装置。
(17)
 前記表示部品は、ユーザが、表示される映像の前記映像データから見るべきシーンを探索し、再生位置を指定するために操作する再生指定ツールである
 前記(1)乃至(16)のいずれかに記載の情報処理装置。
(18)
 情報処理装置が、
 患者を撮像して得られる映像データの特徴量である映像特徴量および前記患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量を算出し、
 算出された前記総合特徴量を、前記映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面を生成する
 情報処理方法。
(19)
 患者を撮像して得られる映像データの特徴量である映像特徴量および前記患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量を算出するデータ処理部と、
 算出された前記総合特徴量を、前記映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面を生成する表示画面生成部として、
 コンピュータを機能させるプログラム。
(20)
 患者を撮像して映像データを得るセンサと、
 前記映像データの特徴量である映像特徴量および前記患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量を算出するデータ処理部と、
 算出された前記総合特徴量を、前記映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面を生成する表示画面生成部とを備える情報処理装置と、
 前記第1の表示画面の表示を制御する端末と
 からなる情報処理システム。
<Examples of configuration combinations>
The present technology can also be configured as follows.
(1)
a data processing unit that calculates a comprehensive feature based on a data set including image feature amounts that are feature amounts of image data obtained by imaging a patient and patient data that is at least one of clinical data of the patient;
a display screen generating unit that generates a first display screen that displays the calculated comprehensive feature amount in chronological order together with a display component that indicates a playback position of the video data.
(2)
The data processing unit generates an event label indicating an event related to the patient based on the nursing record data of the patient;
The information processing device according to (1), wherein the display screen generating unit superimposes the event label together with the display component on the first display screen.
(3)
The information processing device according to (2), wherein the event label includes a point indicating a start time of the event and a preview image of the event.
(4)
The information processing device according to (3), wherein the event label includes a title of the event.
(5)
The information processing device according to any one of (1) to (4), wherein the display screen generation unit displays the patient data together with the display part on the first display screen.
(6)
The information processing device according to (5), wherein the display screen generating unit displays the patient data on the first display screen by superimposing the patient data on the comprehensive feature amount.
(7)
The information processing device according to any one of (1) to (6), wherein the display screen generation unit displays, on the first display screen, a point corresponding to a playback position of the video data together with the display part.
(8)
The information processing device according to any one of (1) to (7), wherein the display screen generation unit generates the first display screen in a periphery of a second display screen that displays an image corresponding to the video data being played back.
(9)
The information processing device according to any one of (1) to (8), wherein the display screen generation unit generates a third display screen that displays the patient data in chronological order.
(10)
The data processing unit extracts correlation data, which is other patient data that is correlated with the patient data for the time period selected on the third display screen,
The information processing device according to (9), wherein the display screen generation unit generates a fourth display screen that displays the extracted correlation data.
(11)
The information processing device according to (10), wherein the correlation data is other patient data in a time zone different from the time zone selected on the third display screen.
(12)
A threshold value for the patient data can be set on the third display screen,
The information processing device according to (9), wherein the display screen generating unit displays, on the third display screen, a time period in which the patient data is equal to or greater than the threshold and a time period in which the patient data is less than the threshold in different formats.
(13)
The information processing device according to (12), wherein the display screen generating unit displays, on the first display screen, a time period in which the patient data is equal to or greater than the threshold and a time period in which the patient data is less than the threshold in different formats.
(14)
the display screen generation unit displays, on the first display screen, a point corresponding to a playback position of the video data together with the display part;
The information processing device according to (9), wherein a point corresponding to a playback position of the video data is displayed on the third display screen on the patient data.
(15)
The information processing device according to any one of (1) to (14), wherein the data set used for calculating the comprehensive feature amount is set by a user.
(16)
The information processing device according to (15), wherein a basis for calculating the comprehensive feature amount is set by a user.
(17)
The information processing device according to any one of (1) to (16), wherein the display part is a playback designation tool that is operated by a user to search for a scene to be viewed from the video data of the displayed video and to designate a playback position.
(18)
An information processing device,
Calculating a comprehensive feature based on a data set including image feature amounts, which are feature amounts of image data obtained by imaging a patient, and patient data, which is at least one of clinical data of the patient;
generating a first display screen that displays the calculated comprehensive feature amount in chronological order together with a display component that indicates a playback position of the video data.
(19)
a data processing unit that calculates a comprehensive feature based on a data set including image feature amounts that are feature amounts of image data obtained by imaging a patient and patient data that is at least one of clinical data of the patient;
a display screen generating unit that generates a first display screen that displays the calculated comprehensive feature amount in time series together with a display component that indicates a playback position of the video data,
The programs that make a computer function.
(20)
A sensor for capturing an image of a patient to obtain video data;
a data processing unit that calculates a comprehensive feature based on a data set including image feature amounts that are feature amounts of the image data and patient data that is at least one of clinical data of the patient;
an information processing device including a display screen generation unit that generates a first display screen that displays the calculated comprehensive feature amount in time series together with a display component that indicates a playback position of the video data;
and a terminal that controls the display of the first display screen.
 1 振り返り表示システム, 2 患者情報システム, 11 センシングブロック, 12 データ処理ブロック, 13 表示ブロック, 21 ベッドサイド端末, 22 情報サーバ, 23 表示端末, 41 データ収集部, 42 データ処理部, 43 表示画面生成部, 51 表示制御部, 52 表示部, 53 操作入力部, 71 記憶部, 72 画像解析部, 73 音声解析部, 74 テキスト解析部, 75 相関性解析部, 76 総合特徴量算出部, 100 振り返り表示画面, 101 映像表示部, 102 映像操作部, 103 時系列データ表示操作部, 111 操作ボタン表示部, 112 再生シークバー表示部, 120 再生シークバー 1. Retrospective display system, 2. Patient information system, 11. Sensing block, 12. Data processing block, 13. Display block, 21. Bedside terminal, 22. Information server, 23. Display terminal, 41. Data collection unit, 42. Data processing unit, 43. Display screen generation unit, 51. Display control unit, 52. Display unit, 53. Operation input unit, 71. Memory unit, 72. Image analysis unit, 73. Audio analysis unit, 74. Text analysis unit, 75. Correlation analysis unit, 76. Overall feature calculation unit, 100. Retrospective display screen, 101. Video display unit, 102. Video operation unit, 103. Time series data display operation unit, 111. Operation button display unit, 112. Playback seek bar display unit, 120. Playback seek bar

Claims (20)

  1.  患者を撮像して得られる映像データの特徴量である映像特徴量および前記患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量を算出するデータ処理部と、
     算出された前記総合特徴量を、前記映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面を生成する表示画面生成部と
     を備える情報処理装置。
    a data processing unit that calculates a comprehensive feature based on a data set including image feature amounts that are feature amounts of image data obtained by imaging a patient and patient data that is at least one of clinical data of the patient;
    a display screen generating unit that generates a first display screen that displays the calculated comprehensive feature amount in chronological order together with a display component that indicates a playback position of the video data.
  2.  前記データ処理部は、前記患者の看護記録データに基づいて前記患者に関するイベントを示すイベントラベルを生成し、
     前記表示画面生成部は、前記第1の表示画面において、前記イベントラベルを前記表示部品とともに重畳する
     請求項1に記載の情報処理装置。
    The data processing unit generates an event label indicating an event related to the patient based on the nursing record data of the patient;
    The information processing apparatus according to claim 1 , wherein the display screen generating unit superimposes the event label together with the display component on the first display screen.
  3.  前記イベントラベルは、前記イベントの開始時刻を示すポイントおよび前記イベントのプレビュー画像を含む
     請求項2に記載の情報処理装置。
    The information processing device according to claim 2 , wherein the event label includes a point indicating a start time of the event and a preview image of the event.
  4.  前記イベントラベルは、前記イベントのタイトルを含む
     請求項3に記載の情報処理装置。
    The information processing device according to claim 3 , wherein the event label includes a title of the event.
  5.  前記表示画面生成部は、前記第1の表示画面において、前記患者データを前記表示部品とともに表示する
     請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1 , wherein the display screen generating unit displays the patient data together with the display parts on the first display screen.
  6.  前記表示画面生成部は、前記第1の表示画面において、前記患者データを前記総合特徴量に重畳して表示する
     請求項5に記載の情報処理装置。
    The information processing apparatus according to claim 5 , wherein the display screen generating unit displays the patient data on the first display screen in a manner superimposed on the comprehensive feature amount.
  7.  前記表示画面生成部は、前記第1の表示画面において、前記映像データの再生位置に対応するポイントを前記表示部品とともに表示する
     請求項1に記載の情報処理装置。
    The information processing device according to claim 1 , wherein the display screen generating section displays, on the first display screen, a point corresponding to a playback position of the video data together with the display part.
  8.  前記表示画面生成部は、再生中の前記映像データに対応する映像を表示する第2の表示画面の周辺に、前記第1の表示画面を生成する
     請求項1に記載の情報処理装置。
    The information processing device according to claim 1 , wherein the display screen generating section generates the first display screen in a periphery of a second display screen that displays an image corresponding to the video data being reproduced.
  9.  前記表示画面生成部は、前記患者データを時系列にそれぞれ表示する第3の表示画面を生成する
     請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1 , wherein the display screen generating unit generates a third display screen that displays the patient data in chronological order.
  10.  前記データ処理部は、前記第3の表示画面において選択された時間帯の前記患者データと相関性がある他の前記患者データである相関データを抽出し、
     前記表示画面生成部は、抽出された前記相関データを表示する第4の表示画面を生成する
     請求項9に記載の情報処理装置。
    The data processing unit extracts correlation data, which is other patient data that is correlated with the patient data for the time period selected on the third display screen,
    The information processing apparatus according to claim 9 , wherein the display screen generating unit generates a fourth display screen that displays the extracted correlation data.
  11.  前記相関データは、前記第3の表示画面において選択された時間帯とは異なる時間帯の他の前記患者データである
     請求項10に記載の情報処理装置。
    The information processing apparatus according to claim 10 , wherein the correlation data is other patient data in a time period different from the time period selected on the third display screen.
  12.  前記第3の表示画面において、前記患者データにおける閾値を設定可能であり、
     前記表示画面生成部は、前記第3の表示画面において、前記患者データが前記閾値以上である時間帯と前記閾値未満である時間帯とを異なる形式で表示する
     請求項9に記載の情報処理装置。
    A threshold value for the patient data can be set on the third display screen,
    The information processing device according to claim 9 , wherein the display screen generating unit displays, on the third display screen, a time period in which the patient data is equal to or greater than the threshold value and a time period in which the patient data is less than the threshold value in different formats.
  13.  前記表示画面生成部は、前記第1の表示画面において、前記患者データが前記閾値以上である時間帯と前記閾値未満である時間帯とを異なる形式で表示する
     請求項12に記載の情報処理装置。
    The information processing device according to claim 12 , wherein the display screen generating unit displays, on the first display screen, a time period in which the patient data is equal to or greater than the threshold value and a time period in which the patient data is less than the threshold value in different formats.
  14.  前記表示画面生成部は、前記第1の表示画面において、前記映像データの再生位置に対応するポイントを前記表示部品とともに表示し、
     前記第3の表示画面において、前記映像データの再生位置に対応するポイントを、前記患者データ上に表示する
     請求項9に記載の情報処理装置。
    the display screen generation unit displays, on the first display screen, a point corresponding to a playback position of the video data together with the display part;
    The information processing apparatus according to claim 9 , wherein a point corresponding to a playback position of the video data is displayed on the third display screen on the patient data.
  15.  前記総合特徴量の算出に用いられる前記データセットはユーザにより設定可能である
     請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1 , wherein the data set used for calculating the comprehensive feature amount can be set by a user.
  16.  前記総合特徴量の算出根拠はユーザにより設定可能である
     請求項15に記載の情報処理装置。
    The information processing apparatus according to claim 15 , wherein a basis for calculating the comprehensive feature amount can be set by a user.
  17.  前記表示部品は、ユーザが、表示される映像の前記映像データから見るべきシーンを探索し、再生位置を指定するために操作する再生指定バーである
     請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1 , wherein the display part is a playback designation bar that is operated by a user to search for a scene to be viewed from the video data of the displayed video and to designate a playback position.
  18.  情報処理装置が、
     患者を撮像して得られる映像データの特徴量である映像特徴量および前記患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量を算出し、
     算出された前記総合特徴量を、前記映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面を生成する
     情報処理方法。
    An information processing device,
    Calculating a comprehensive feature based on a data set including image feature amounts, which are feature amounts of image data obtained by imaging a patient, and patient data, which is at least one of clinical data of the patient;
    generating a first display screen that displays the calculated comprehensive feature amount in chronological order together with a display component that indicates a playback position of the video data.
  19.  患者を撮像して得られる映像データの特徴量である映像特徴量および前記患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量を算出するデータ処理部と、
     算出された前記総合特徴量を、前記映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面を生成する表示画面生成部として、
     コンピュータを機能させるプログラム。
    a data processing unit that calculates a comprehensive feature based on a data set including image feature amounts that are feature amounts of image data obtained by imaging a patient and patient data that is at least one of clinical data of the patient;
    a display screen generating unit that generates a first display screen that displays the calculated comprehensive feature amount in time series together with a display component that indicates a playback position of the video data,
    The programs that make a computer function.
  20.  患者を撮像して映像データを得るセンサと、
     前記映像データの特徴量である映像特徴量および前記患者のクリニカルデータの少なくとも一方である患者データからなるデータセットに基づいて総合特徴量を算出するデータ処理部と、
     算出された前記総合特徴量を、前記映像データの再生位置を示す表示部品とともに時系列に表示する第1の表示画面を生成する表示画面生成部とを備える情報処理装置と、
     前記第1の表示画面の表示を制御する端末と
     からなる情報処理システム。
    A sensor for capturing an image of a patient to obtain video data;
    a data processing unit that calculates a comprehensive feature based on a data set including image feature amounts that are feature amounts of the image data and patient data that is at least one of clinical data of the patient;
    an information processing device including a display screen generation unit that generates a first display screen that displays the calculated comprehensive feature amount in time series together with a display component that indicates a playback position of the video data;
    and a terminal that controls the display of the first display screen.
PCT/JP2023/039825 2022-11-21 2023-11-06 Information processing device, information processing method, program, and information processing system WO2024111386A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022185763 2022-11-21
JP2022-185763 2022-11-21

Publications (1)

Publication Number Publication Date
WO2024111386A1 true WO2024111386A1 (en) 2024-05-30

Family

ID=91195518

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/039825 WO2024111386A1 (en) 2022-11-21 2023-11-06 Information processing device, information processing method, program, and information processing system

Country Status (1)

Country Link
WO (1) WO2024111386A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021069033A (en) * 2019-10-24 2021-04-30 セコム医療システム株式会社 Video recording system, video recording method, and video recording program
JP2022520701A (en) * 2019-02-21 2022-04-01 シアター・インコーポレイテッド Systems and methods for analysis of surgical videos
WO2022224524A1 (en) * 2021-04-22 2022-10-27 ソニーグループ株式会社 Patient monitoring system


Similar Documents

Publication Publication Date Title
US11645745B2 (en) System and method for adverse event detection or severity estimation from surgical data
US11666288B2 (en) Systems and methods for graphical user interfaces for medical device trends
US8869115B2 (en) Systems and methods for emotive software usability
JP2007516011A (en) Data entry system for endoscopy
US20110218822A1 (en) Remote patient management system adapted for generating a teleconsultation report
JP6261183B2 (en) Medical image data information exchange system
US20210065889A1 (en) Systems and methods for graphical user interfaces for a supervisory application
US20080120548A1 (en) System And Method For Processing User Interaction Information From Multiple Media Sources
WO2021041500A1 (en) Systems and methods for graphical user interfaces for medical device trends
US20140222805A1 (en) Apparatus, method and computer readable medium for tracking data and events
EP4376402A1 (en) Information processing system, information processing method, and program
US20230363851A1 (en) Methods and systems for video collaboration
US20160295086A1 (en) System for enabling remote annotation of media data captured using endoscopic instruments and the creation of targeted digital advertising in a documentation environment using diagnosis and procedure code entries
CN109155019A (en) For tracking the system and method unofficially observed by caregiver about nursing recipient
US20190362859A1 (en) System for enabling remote annotation of media data captured using endoscopic instruments and the creation of targeted digital advertising in a documentation environment using diagnosis and procedure code entries
WO2024111386A1 (en) Information processing device, information processing method, program, and information processing system
JP2023526412A (en) Information processing method, electronic device, and computer storage medium
CN112712876A (en) Image recording system, image recording method, and image recording program
JP6769417B2 (en) Servers, doctor equipment, instructor equipment, computer programs and telemedicine support methods
US20240237953A1 (en) Methods for collecting and presenting physiological signal data and location information, and servers and systems implementing the same
Cavaro-Ménard et al. QoE for telemedicine: Challenges and trends
US20150109307A1 (en) Systems and methods to present points of interest in longitudinal data
JP2020096703A (en) Medical image management device and medical image management system
CN114121208A (en) Operation record quality control method based on visual data
JP2002215797A (en) Hospital information system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23894398

Country of ref document: EP

Kind code of ref document: A1