
WO2024180593A1 - Image processing device, image processing method, and storage medium - Google Patents


Info

Publication number
WO2024180593A1
Authority
WO
WIPO (PCT)
Prior art keywords
lesion
image
progression
endoscopic image
image processing
Prior art date
Application number
PCT/JP2023/007007
Other languages
French (fr)
Japanese (ja)
Inventor
雅弘 西光
Original Assignee
NEC Corporation (日本電気株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corporation (日本電気株式会社)
Priority to PCT/JP2023/007007 priority Critical patent/WO2024180593A1/en
Priority to PCT/JP2023/031840 priority patent/WO2024180796A1/en
Priority to US18/410,361 priority patent/US20240289959A1/en
Priority to US18/410,293 priority patent/US20240289949A1/en
Publication of WO2024180593A1 publication Critical patent/WO2024180593A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments as above combined with photographic or television appliances
    • A61B1/045: Control thereof

Definitions

  • This disclosure relates to the technical fields of image processing devices, image processing methods, and storage media that process images acquired during endoscopic examinations.
  • Patent Document 1 discloses a medical image processing device that detects lesion candidates from medical images and identifies the malignancy of the detected lesion candidates and the organs contained in the medical images.
  • One of the objectives of the present disclosure is to provide an image processing device, an image processing method, and a storage medium that can appropriately determine the degree of progression of a lesion.
  • One aspect of the image processing device includes: an acquisition means for acquiring an endoscopic image of a subject; a detection means for detecting, based on the endoscopic image, a lesion area that is a candidate area for a lesion of the subject in the endoscopic image; a first determination means for determining whether or not the endoscopic image is suitable for determining the degree of progression or the depth of invasion of the lesion, based on at least one of the size of the lesion area or the reliability that the lesion area is lesion-like; and a second determination means for determining the degree of progression or the depth of invasion based on the endoscopic image determined to be suitable for that determination.
  • One aspect of the image processing method is an image processing method in which a computer: acquires an endoscopic image of a subject; detects, based on the endoscopic image, a lesion area that is a candidate area for a lesion of the subject in the endoscopic image; determines whether or not the endoscopic image is suitable for determining the degree of progression or the depth of invasion of the lesion, based on at least one of the size of the lesion area or the reliability of the lesion area; and determines the degree of progression or the depth of invasion based on the endoscopic image determined to be suitable for that determination.
  • One aspect of the storage medium is a storage medium storing a program that causes a computer to execute a process of: acquiring an endoscopic image of a subject; detecting, based on the endoscopic image, a lesion area that is a candidate area for a lesion of the subject in the endoscopic image; determining whether or not the endoscopic image is suitable for determining the degree of progression or the depth of invasion of the lesion, based on at least one of the size of the lesion area or the reliability of the lesion area; and determining the degree of progression or the depth of invasion based on the endoscopic image determined to be suitable for that determination.
  • FIG. 1 shows a schematic configuration of an endoscopic examination system.
  • FIG. 2 shows the hardware configuration of an image processing device.
  • FIG. 3 is a diagram showing an overview of a progression determination process executed by the image processing device in the first embodiment.
  • FIG. 4 is an example of functional blocks of the progression determination process in the first embodiment.
  • FIG. 5 shows a first display example displayed on a display device during an endoscopic examination.
  • FIG. 6 shows a second display example displayed on the display device during an endoscopic examination.
  • FIG. 7 is an example of a flowchart illustrating an overview of a process executed by the image processing device during an endoscopic examination in the first embodiment.
  • FIG. 8 is a schematic configuration diagram of an endoscopic examination system according to a modified example.
  • FIG. 9 is a block diagram of an image processing device according to a second embodiment.
  • FIG. 10 is an example of a flowchart executed by the image processing device in the second embodiment.
  • FIG. 1 shows a schematic configuration of an endoscopic examination system 100.
  • the endoscopic examination system 100 is a system that detects a lesion site, which is a site of a subject suspected of having a lesion, based on an image captured by an endoscope, judges the progression of the lesion at the detected lesion site, and presents the judgment result.
  • the "progression level" may refer to a comprehensive degree (grade) of progression of a lesion including the depth of invasion (degree of invasion), or may be the depth of invasion (degree of invasion) itself.
  • the endoscopic examination system 100 mainly includes an image processing device 1, a display device 2, and an endoscope scope 3 that is connected to the image processing device 1 and is handled by an examiner such as a doctor who performs an examination or treatment.
  • the image processing device 1 acquires images (also called "endoscopic images Ia") captured by the endoscope scope 3 in time series from the endoscope scope 3, and displays a screen based on the endoscopic images Ia on the display device 2.
  • the endoscopic images Ia are images captured at a predetermined frame cycle during at least one of the insertion or withdrawal of the endoscope scope 3 into or from the subject.
  • the image processing device 1 detects, from the time series of endoscopic images Ia, an endoscopic image Ia in which a region that is a candidate for a lesion site (also called a "lesion region") is present.
  • the image processing device 1 selects an image from the detected endoscopic images Ia that is suitable for determining the progression of the lesion, determines the progression of the lesion for the selected image, and presents information on the determination result. Furthermore, when an image suitable for determining the progression of the lesion cannot be obtained, the image processing device 1 outputs a suggestion for capturing such an image.
  • the display device 2 is a display or the like that performs a predetermined display based on a display signal supplied from the image processing device 1.
  • the endoscope 3 mainly comprises an operation section 36 that allows the examiner to perform predetermined inputs, a flexible shaft 37 that is inserted into the subject's organ to be imaged, a tip section 38 that incorporates an imaging section such as a miniature image sensor, and a connection section 39 for connecting to the image processing device 1.
  • the configuration of the endoscopic examination system 100 shown in FIG. 1 is an example, and various modifications may be made.
  • the image processing device 1 may be configured integrally with the display device 2.
  • the image processing device 1 may be configured from multiple devices.
  • the subject of the endoscopic examination in the present disclosure may be any organ that can be examined endoscopically, such as the large intestine, esophagus, stomach, or pancreas.
  • endoscopes that are the subject of the present disclosure include pharyngeal endoscopes, bronchoscopes, upper gastrointestinal endoscopes, duodenoscopes, small intestinal endoscopes, colonoscopes, capsule endoscopes, thoracoscopes, laparoscopes, cystoscopes, cholangioscopes, arthroscopes, spinal endoscopes, vascular endoscopes, and epidural endoscopes.
  • Examples of the pathology of the lesion site that is the subject of the endoscopic examination include (a) to (f) below.
  • FIG. 2 shows the hardware configuration of the image processing device 1.
  • the image processing device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, and a sound output unit 16. These elements are connected via a data bus 19.
  • the processor 11 executes predetermined processing by executing programs stored in the memory 12.
  • the processor 11 is a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit).
  • the processor 11 may be composed of multiple processors.
  • the processor 11 is an example of a computer.
  • the memory 12 is composed of various memories such as a RAM (Random Access Memory) used as a working memory and a ROM (Read Only Memory), as well as non-volatile memory that stores information necessary for the processing of the image processing device 1.
  • the memory 12 may include an external storage device such as a hard disk connected to or built into the image processing device 1, or may include a removable storage medium such as a flash memory.
  • the memory 12 stores programs that enable the image processing device 1 to execute each process in this embodiment.
  • the memory 12 also stores lesion detection model information D1, which is information related to the lesion detection model, and progression determination model information D2, which is information related to the progression determination model. Details of the lesion detection model and the progression determination model will be described later.
  • the memory 12 may also include any other information necessary for the image processing device 1 to execute each process in this embodiment.
  • the interface 13 performs interface operations between the image processing device 1 and an external device. For example, the interface 13 supplies the display information "Ib" generated by the processor 11 to the display device 2. The interface 13 also supplies light generated by the light source unit 15 to the endoscope scope 3. The interface 13 also supplies an electrical signal indicating the endoscopic image Ia supplied from the endoscope scope 3 to the processor 11.
  • the interface 13 may be a communication interface such as a network adapter for communicating with an external device by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), etc.
  • the input unit 14 generates an input signal based on the operation by the examiner.
  • the input unit 14 is, for example, a button, a touch panel, a remote controller, a voice input device, etc.
  • the light source unit 15 generates light to be supplied to the tip 38 of the endoscope 3.
  • the light source unit 15 may also incorporate a pump or the like for sending water or air to be supplied to the endoscope 3.
  • the sound output unit 16 outputs sound based on the control of the processor 11.
  • the lesion detection model is a machine learning model that generates an inference result regarding a lesion area corresponding to a disease to be detected in an endoscopic examination, and the parameters required for the model are stored in the lesion detection model information D1. For example, when an endoscopic image is input, the lesion detection model outputs an inference result regarding a lesion area in the input endoscopic image.
  • the lesion detection model is a model that has learned the relationship between an image input to the lesion detection model and a lesion area in the image.
  • the lesion detection model may be a model (including a statistical model, the same applies below) that includes an architecture adopted in any machine learning such as a neural network or a support vector machine.
  • the lesion detection model information D1 includes various parameters such as the layer structure, the neuron structure of each layer, the number of filters and filter size in each layer, and the weights of each element of each filter.
  • in the first output mode, the lesion detection model outputs, as an inference result, a map indicating the reliability (also called the "lesion reliability") that each unit area of the input endoscopic image is a lesion area. This map will also be called the "lesion reliability map."
  • the lesion reliability map is an image indicating the lesion reliability for each unit pixel (which may include subpixels) or for each pixel block delimited by a predetermined rule. Note that the higher the lesion reliability of an area, the more likely it is that the area is a lesion area.
  • the lesion reliability map may be a mask image indicating lesion areas using binary values.
  • in the second output mode, the lesion detection model outputs an inference result indicating a bounding box that marks the extent of the lesion area in the input endoscopic image, together with the confidence that the area enclosed by the bounding box is a lesion area.
  • the confidence here is, for example, a confidence level that is a score indicating the degree of confidence output from the output layer of a neural network when the lesion detection model is configured using a neural network. Note that the above-mentioned mode of inference result output by the lesion detection model is one example, and various modes of inference results may be output from the lesion detection model.
  • the lesion detection model is trained in advance on pairs of an input image conforming to the input format of the lesion detection model and correct answer data (in the above example, the correct lesion reliability map or bounding box) indicating the correct inference result that the lesion detection model should output when the input image is input. The parameters obtained by training are then stored in the memory 12 as the lesion detection model information D1.
  • the lesion detection model may include a feature extraction model that extracts features from an endoscopic image, or may be a model separate from the feature extraction model. In the latter case, the lesion detection model is a model trained to output the above-mentioned inference result when a feature (tensor with a predetermined number of dimensions) output by the feature extraction model to which an endoscopic image is input is input.
  • the progression assessment model is a machine learning model that infers (classifies) the progression of a lesion indicated by a lesion area included in an input endoscopic image, and parameters required for the model are stored in the progression assessment model information D2.
  • the progression assessment model outputs an inference result indicating the progression of the lesion in the input endoscopic image (more specifically, a classification result indicating the class of the progression).
  • the progression assessment model is a model that has learned the relationship between the image input to the progression assessment model and the progression of the lesion in the image.
  • the progression assessment model may be a model (including a statistical model, the same applies below) that includes an architecture adopted in any machine learning such as a neural network or a support vector machine.
  • the progression assessment model includes various parameters such as a layer structure, a neuron structure of each layer, the number of filters and filter size in each layer, and the weight of each element of each filter.
  • the progress assessment model is trained in advance based on a pair of an input image conforming to the input format of the progress assessment model and correct answer data (i.e., the progress class that is the correct answer) that indicates the correct inference result that the progress assessment model should output when the input image is input. Then, the parameters of each model obtained by training are stored in memory 12 as progress assessment model information D2.
  • the endoscopic image input to the progression determination model may be the entire image of the endoscopic image Ia generated by the endoscopic scope 3, or may be an image cut out from the endoscopic image so as to include at least the lesion area detected by the lesion detection model (i.e., a partial image of the endoscopic image Ia). Furthermore, instead of the endoscopic image, the progress determination model may be input with the feature quantities of the image calculated by the lesion detection model or feature extraction model described above.
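As described above, the input to the progression determination model may be a partial image cut out from the endoscopic image Ia so as to include at least the detected lesion area. A minimal sketch of such a cut-out, assuming an inclusive (x_min, y_min, x_max, y_max) bounding box and a hypothetical margin parameter (neither value appears in the disclosure):

```python
def crop_lesion_region(image, bbox, margin=16):
    """Cut out a partial image containing at least the detected lesion area.

    image:  2D list of pixel rows (a stand-in for the endoscopic image Ia).
    bbox:   (x_min, y_min, x_max, y_max), inclusive pixel coordinates.
    margin: extra context around the lesion area, clamped to image bounds.
    """
    h, w = len(image), len(image[0])
    x_min, y_min, x_max, y_max = bbox
    x0, y0 = max(0, x_min - margin), max(0, y_min - margin)
    x1, y1 = min(w - 1, x_max + margin), min(h - 1, y_max + margin)
    return [row[x0:x1 + 1] for row in image[y0:y1 + 1]]
```

The clamping keeps the crop valid even when the lesion area sits at the image border, so the partial image always includes the whole detected area.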
  • FIG. 3 is a diagram showing an overview of the progression determination process executed by the image processing device 1 in the first embodiment.
  • the image processing device 1 detects a lesion area in each endoscopic image Ia obtained from the endoscope scope 3 at a predetermined frame period using a lesion detection model. Then, the image processing device 1 obtains an inference result regarding the lesion area in the endoscopic image Ia from the lesion detection model.
  • the image processing device 1 obtains, as the lesion detection result, either a lesion reliability map indicating the inference result of the lesion detection model in the first output mode, or a bounding box indicating the inference result of the lesion detection model in the second output mode.
  • in the lesion reliability map, pixels with higher lesion reliability are represented by a color closer to white.
  • the above-mentioned bounding box is displayed superimposed on the endoscopic image Ia input to the lesion detection model.
  • the image processing device 1 does not perform a progression determination based on the progression determination model for an endoscopic image Ia in which a lesion area could not be detected.
  • in the case of the first output mode, the image processing device 1 regards a unit area (e.g., a pixel) having a lesion reliability equal to or greater than a predetermined threshold (also referred to as the "first threshold") as a partial area constituting the lesion area, and determines that a lesion area could not be detected if no such unit area exists, or if no connected area of such unit areas having at least a predetermined number of pixels exists.
  • the above-mentioned first threshold is, for example, a default value stored in advance in the memory 12 or the like.
  • in the case of the second output mode, the image processing device 1 regards the area within a bounding box as the lesion area, and determines that a lesion area could not be detected if no bounding box is obtained.
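For the first output mode, the rule above (treat unit areas at or above the first threshold as lesion pixels, and require a connected area of at least a predetermined number of pixels) can be sketched as follows. The 4-connectivity, threshold, and minimum pixel count are illustrative assumptions, not values taken from the disclosure:

```python
from collections import deque

def detect_lesion_area(reliability_map, first_threshold=0.5, min_pixels=10):
    """Return the largest connected region of unit areas (pixels) whose lesion
    reliability is >= first_threshold, or None if no region with at least
    min_pixels pixels exists (i.e. no lesion area was detected)."""
    h, w = len(reliability_map), len(reliability_map[0])
    seen = [[False] * w for _ in range(h)]
    best = None
    for y in range(h):
        for x in range(w):
            if seen[y][x] or reliability_map[y][x] < first_threshold:
                continue
            # BFS over 4-connected neighbours above the threshold
            region, queue = [], deque([(y, x)])
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                region.append((cy, cx))
                for ny, nx in ((cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)):
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and reliability_map[ny][nx] >= first_threshold):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(region) >= min_pixels and (best is None or len(region) > len(best)):
                best = region
    return best
```

Returning None corresponds to the "lesion area could not be detected" case, for which the progression determination is skipped.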
  • the image processing device 1 judges whether each endoscopic image Ia in which a lesion area has been detected is an endoscopic image suitable for judging the degree of progression (also called the "progression suitability judgment"). In this case, the image processing device 1 makes this judgment for the target endoscopic image Ia based on at least one of the size of the lesion area detected from the target endoscopic image Ia or the reliability of the detected lesion area.
  • if the image processing device 1 determines that the endoscopic image Ia in which the lesion area was detected is suitable for judging the degree of progression, it uses that endoscopic image Ia to judge the degree of progression of the lesion with the progression judgment model. This allows the image processing device 1 to automatically select an endoscopic image Ia suitable for judging the degree of progression and to judge the degree of progression of the lesion with high accuracy. On the other hand, if the image processing device 1 determines that the endoscopic image Ia in which the lesion area was detected is not suitable for judging the degree of progression, it outputs, via the display device 2 or the sound output unit 16, a suggestion to capture an endoscopic image Ia suitable for judging the degree of progression using the endoscope scope 3. This allows the image processing device 1 to assist the examiner in operating the endoscope scope 3 so as to capture an endoscopic image Ia suitable for judging the degree of progression, and to promote the acquisition of such an image.
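The overall flow described above (detect a lesion area, judge suitability, then either judge progression or suggest re-capturing) can be sketched end-to-end. The model callables, result format, and threshold values below are hypothetical stand-ins, not part of the disclosure:

```python
import numpy as np

def progression_pipeline(endoscopic_image, lesion_detector, progression_model,
                         size_threshold=500, reliability_threshold=0.7):
    """Sketch: detect -> judge suitability -> judge progression.

    lesion_detector: returns a per-pixel lesion reliability map in [0, 1].
    progression_model: returns (progression_class, confidence) for an image.
    """
    # Detection step: threshold the reliability map into a lesion mask.
    reliability_map = lesion_detector(endoscopic_image)
    lesion_mask = reliability_map >= reliability_threshold
    if not lesion_mask.any():
        return None  # no lesion area detected; progression is not determined

    size = int(lesion_mask.sum())                             # lesion-area size
    reliability = float(reliability_map[lesion_mask].mean())  # its reliability

    # Suitability judgment based on size and reliability of the lesion area.
    if size < size_threshold or reliability < reliability_threshold:
        return {"suitable": False}  # caller may suggest re-capturing the image

    # Progression judgment on the image judged suitable.
    progression_class, confidence = progression_model(endoscopic_image)
    return {"suitable": True, "progression": progression_class,
            "confidence": confidence}
```

In the actual device this loop runs once per acquired frame, and the unsuitable branch feeds the suggestion output described below.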
  • Fig. 4 is an example of functional blocks of the progress assessment process in the first embodiment.
  • the processor 11 of the image processing device 1 functionally has an endoscopic image acquisition unit 30, a lesion detection unit 31, a suitability assessment unit 32, a progress assessment unit 33, and an output control unit 34. Note that in Fig. 4, blocks between which data is exchanged are connected by solid lines, but the combination of blocks between which data is exchanged is not limited to this. The same applies to other functional block diagrams described later.
  • the endoscopic image acquisition unit 30 acquires the endoscopic image Ia captured by the endoscopic scope 3 via the interface 13 at predetermined intervals.
  • the endoscopic image acquisition unit 30 then supplies the acquired endoscopic image Ia to the lesion detection unit 31 and the output control unit 34, respectively.
  • the processing units in the subsequent stages perform the processing described below at a period equal to the time interval at which the endoscopic image acquisition unit 30 acquires the endoscopic images Ia.
  • the lesion detection unit 31 detects a lesion area in the endoscopic image Ia supplied from the endoscopic image acquisition unit 30 based on the lesion detection model information D1.
  • the lesion detection unit 31 inputs the endoscopic image Ia to a lesion detection model constructed by referring to the lesion detection model information D1, and obtains an inference result of lesion detection output by the lesion detection model.
  • the inference result of lesion detection may be a lesion reliability map based on the first output mode, or a set of a bounding box and lesion reliability based on the second output mode.
  • when the lesion detection unit 31 determines that a lesion area has been detected, it supplies the lesion detection result corresponding to the above-mentioned inference result of lesion detection, together with the endoscopic image Ia, to the suitability determination unit 32. In addition, the lesion detection unit 31 supplies the lesion detection result to the output control unit 34 regardless of whether or not a lesion area has been detected. Note that the lesion detection result when a lesion area has not been detected is, for example, information indicating that no lesion area was detected.
  • the suitability determination unit 32 performs a progression suitability determination, which is a suitability determination of whether the endoscopic image Ia supplied from the lesion detection unit 31 is suitable for determining the progression of the lesion.
  • the suitability determination unit 32 acquires at least one of the size of the lesion area detected from the endoscopic image Ia or the reliability of the detected lesion area based on the inference result of the lesion detection, and performs a progression suitability determination based on at least one of the acquired size or reliability. Then, if the suitability determination unit 32 determines that the endoscopic image Ia is suitable for determining the progression of the lesion, it supplies the endoscopic image Ia to the progression determination unit 33.
  • if the suitability determination unit 32 determines that the endoscopic image Ia is not suitable for determining the progression of the lesion, it supplies a determination result indicating this (also called an "unsuitable determination result") to the output control unit 34. Furthermore, when the suitability determination unit 32 determines that the endoscopic image Ia is suitable for determining the degree of progression of the lesion, it may supply the output control unit 34 with a determination result indicating this (also referred to as a "suitable determination result").
  • the progress determination unit 33 determines the progress of the lesion area included in the endoscopic image Ia that is determined by the suitability determination unit 32 to be suitable for use in progression determination (i.e., classifies the progress class to which the lesion area belongs).
  • the progress determination unit 33 inputs the endoscopic image Ia to a progress determination model configured by referring to the progress determination model information D2, and obtains the progress inference result output by the progress determination model.
  • the progress determination model outputs the most likely progress class (e.g., a grade representing the progress) and the confidence level (degree of confidence) for each progress class that is a candidate for classification as the progress inference result.
  • the progress determination unit 33 may input a partial image of the endoscopic image Ia including the lesion area or the feature amount of the endoscopic image Ia calculated by the lesion detection unit 31 to the progress determination model instead of the endoscopic image Ia. Then, the progress determination unit 33 supplies the progress determination result based on the above-mentioned progress inference result to the output control unit 34.
  • the progress level determination result is information obtained by determining the progress level, and is, for example, information indicating the progress level class determined to be most likely and the degree of certainty of that class.
  • instead of determining the progression to be output to the output control unit 34 from a single progression inference result, the progress determination unit 33 may determine the final progression based on a predetermined number (two or more) of progression inference results obtained by the progression determination model from a corresponding number of endoscopic images Ia. In this case, when the predetermined number of progression inference results output by the progression determination model have been accumulated, the progress determination unit 33 determines the progression to be output to the output control unit 34 based on the accumulated results.
  • the progress determination unit 33 counts the predetermined number of progress inference results, and determines the progress class that is most frequently determined to be most likely (i.e., the class determined by majority vote) as the final progress.
  • the progress determination result output by the progress determination unit 33 includes, for example, the class determined by majority vote and the average value of the certainty of the class (or a representative value other than the average value).
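The majority-vote aggregation just described can be sketched as follows; the (class, confidence) pair format is a simplifying assumption about the progression inference result:

```python
from collections import Counter
from statistics import mean

def aggregate_progression(inference_results):
    """Aggregate a predetermined number of progression inference results.

    Each result is a (most_likely_class, confidence) pair output by the
    progression determination model. The final class is the one most
    frequently judged most likely (majority vote), and its confidence is
    the mean confidence over the results that voted for it.
    """
    votes = Counter(cls for cls, _ in inference_results)
    final_class, _ = votes.most_common(1)[0]
    confidences = [conf for cls, conf in inference_results if cls == final_class]
    return final_class, mean(confidences)
```

A median or another representative value could replace the mean, as the text above allows.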
  • the output control unit 34 generates display information Ib based on the latest endoscopic image Ia supplied from the endoscopic image acquisition unit 30, the lesion detection result output by the lesion detection unit 31, and the progress determination result output by the progress determination unit 33.
  • the output control unit 34 then supplies the generated display information Ib to the display device 2, thereby causing the display device 2 to display the latest endoscopic image Ia, the lesion detection result, and the progress determination result, etc.
  • the output control unit 34 may control the sound output of the sound output unit 16 so as to output a warning sound or voice guidance, etc., to notify the user that a lesion area has been detected, based on the lesion detection result.
  • the output control unit 34 also outputs, via the display device 2 or the sound output unit 16, a suggestion to capture an endoscopic image Ia suitable for judging the degree of progression using the endoscope scope 3, based on the unsuitable determination result supplied from the suitability determination unit 32. For example, the output control unit 34 outputs the above-mentioned suggestion when unsuitable determination results have been generated in succession a predetermined number of times or for a predetermined period, without a suitable determination result being generated.
  • the output control unit 34 outputs information encouraging the user to move the shooting position closer to the lesion area in order to obtain an endoscopic image Ia with a larger magnified lesion area.
  • the output control unit 34 displays, or outputs by audio, the message "Please move the camera closer to the lesion."
  • the output control unit 34 outputs information indicating a target range of the lesion area on the endoscopic image Ia.
  • the output control unit 34 displays a frame indicating the range in which the lesion area is preferably displayed, superimposed on the latest endoscopic image Ia.
  • the range on the endoscopic image Ia in which the frame should be displayed may be a preset range stored in advance in the memory 12, etc., or may be a range in which at least one of the shape and size has been adjusted based on the lesion detection result.
  • a specific aspect of the second example will be described in detail in the display example of Figure 6 described later.
  • if the output control unit 34 determines that, even after the suggestion has been output, unsuitable determination results have been generated in succession a predetermined number of times or for a predetermined period without a suitable determination result being generated, it displays, or outputs by audio, information indicating that the progression determination cannot be performed.
  • the output control unit 34 may also output a suggestion based on the progression determination result output by the progress determination unit 33. For example, when the confidence level of the most likely progression class is less than a predetermined threshold, the output control unit 34 may output a suggestion based on the first or second example described above, regardless of the presence or absence of an unsuitable determination result.
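The trigger condition described above (unsuitable results generated in succession without a suitable result, and a "cannot judge" notice if the suggestion does not help) can be sketched as a small counter; the specific counts are hypothetical defaults, not from the disclosure:

```python
class SuggestionTrigger:
    """Tracks consecutive unsuitable determination results and signals when a
    capture suggestion, or a 'cannot judge' notice, should be output."""

    def __init__(self, suggest_after=5, give_up_after=10):
        self.suggest_after = suggest_after    # consecutive misses before suggesting
        self.give_up_after = give_up_after    # consecutive misses before giving up
        self.consecutive_unsuitable = 0

    def update(self, suitable):
        """Feed one suitability determination result; return an action string."""
        if suitable:
            self.consecutive_unsuitable = 0   # a suitable result resets the run
            return "none"
        self.consecutive_unsuitable += 1
        if self.consecutive_unsuitable >= self.give_up_after:
            return "report_cannot_judge"      # even the suggestion did not help
        if self.consecutive_unsuitable >= self.suggest_after:
            return "output_suggestion"        # e.g. "move the camera closer"
        return "none"
```

A time-based variant would compare timestamps instead of counts, matching the "predetermined period of time" alternative in the text.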
  • each of the components of the endoscopic image acquisition unit 30, the lesion detection unit 31, the suitability determination unit 32, the progress determination unit 33, and the output control unit 34 can be realized, for example, by the processor 11 executing a program. Each component may also be realized by recording the necessary programs in any non-volatile storage medium and installing them as needed. Note that at least a portion of each of these components need not be realized by software through a program, but may be realized by any combination of hardware, firmware, and software. At least a portion of each of these components may also be realized using a user-programmable integrated circuit, such as an FPGA (Field-Programmable Gate Array) or a microcontroller. In this case, a program implementing each of the above components may be realized by this integrated circuit.
  • each component may be configured by an ASSP (Application Specific Standard Product), an ASIC (Application Specific Integrated Circuit), or a quantum processor (quantum computer control chip).
  • the suitability judgment unit 32 when judging the suitability of the degree of progression based on the size of the lesion area, the suitability judgment unit 32 generates a suitable judgment result when the size of the lesion area is equal to or larger than a predetermined size, and generates an inappropriate judgment result when the size of the lesion area is less than the predetermined size.
  • the above-mentioned predetermined size is, for example, a default value determined as the size necessary for accurate judgment of the degree of progression, and is stored in advance in the memory 12, etc.
  • the size of the lesion area in the first output mode is, for example, the size (e.g., the number of pixels) of the connected area of unit areas having a lesion reliability equal to or greater than a first threshold value.
  • the size of the lesion area in the second output mode is, for example, the size of a bounding box.
  • the suitability judgment unit 32 when the above-mentioned suitability judgment is made based on the reliability of the lesion area, the suitability judgment unit 32 generates a suitable judgment result when the reliability of the lesion area is equal to or greater than a predetermined threshold (also referred to as the "second threshold"), and generates an inappropriate judgment result when the reliability of the lesion area is less than the second threshold.
  • the reliability of the lesion area in the case of the first output mode is, for example, the average value, median, or other representative value of the lesion reliability of the unit areas that make up the lesion area.
  • the reliability of the lesion area in the case of the second output mode is, for example, the reliability associated with the bounding box.
  • the above-mentioned second threshold is, for example, a default value determined as the reliability required for accurate judgment of the degree of progression, and is stored in advance in the memory 12, etc.
  • the second threshold in the case of the first output mode is preferably set to a value equal to or greater than the first threshold that is compared with the lesion reliability of each unit area when determining whether or not it is a lesion area.
  • the suitability judgment unit 32 when the progress suitability is judged based on both the size and reliability of the lesion area, the suitability judgment unit 32 generates a suitable judgment result when the size of the lesion area is equal to or larger than a predetermined size and the reliability of the lesion area is equal to or larger than a second threshold. On the other hand, the suitability judgment unit 32 generates an inappropriate judgment result when the size of the lesion area is less than the predetermined size or the reliability of the lesion area is less than the second threshold.
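As a minimal sketch of the suitability judgment described above, assuming a per-pixel lesion reliability map as in the first output mode (the function name and threshold values are illustrative assumptions, and the connected-region computation is simplified to a plain threshold count):

```python
import numpy as np


def judge_suitability(reliability_map: np.ndarray,
                      first_threshold: float = 0.5,
                      min_size: int = 400,
                      second_threshold: float = 0.7) -> bool:
    """Return True (suitable) only when the lesion area is both large enough
    and reliable enough; otherwise return False (inappropriate)."""
    # Lesion area: unit areas whose lesion reliability meets the first threshold
    # (the connected-region step described above is simplified here).
    lesion_mask = reliability_map >= first_threshold
    size = int(lesion_mask.sum())  # size of the lesion area in pixels
    if size == 0:
        return False  # no lesion area at all
    # Representative reliability of the lesion area (here: the mean value).
    region_reliability = float(reliability_map[lesion_mask].mean())
    return size >= min_size and region_reliability >= second_threshold
```

Judging on both size and reliability, as in the example above, corresponds to generating a suitable judgment result only when both criteria are met.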
  • the suitability determination unit 32 may change the criteria used to determine the suitability of the progress level based on the progress level determination result by the progress level determination unit 33 (i.e., information obtained by judging the progress level).
  • the above-mentioned criteria correspond to at least one of a predetermined size to be compared with the size of the lesion area and a second threshold value to be compared with the reliability of the lesion area.
  • the suitability determination unit 32 changes the criteria used to determine whether the progression is appropriate based on the confidence level for the progression class determined to be the most likely by the progression determination unit 33. In this case, for example, if the confidence level for the most likely progression class is less than a predetermined threshold, the suitability determination unit 32 changes the criteria used to determine whether the progression is appropriate to be stricter. In this case, the suitability determination unit 32 increases the predetermined size and/or the second threshold used as the criteria by a predetermined value or a predetermined percentage. As a result, the suitability determination unit 32 tightens the criteria for determining that the endoscopic image Ia is appropriate for determining the progression of the lesion, and promotes the determination of the progression using more carefully selected endoscopic images Ia.
  • the above-mentioned predetermined threshold is, for example, a default value stored in advance in the memory 12, etc.
  • the predetermined size is an example of a "first criterion," and the second threshold is an example of a "second criterion."
  • the suitability determination unit 32 changes the criteria used to determine whether an image is appropriate for progression determination based on the degree of change over time in the progression class determined to be most likely by the progression determination unit 33. In this case, for example, if the progression classes determined to be most likely in the progression determination results obtained in time series over a predetermined number of times do not match, the suitability determination unit 32 determines that the progression determination results are fluctuating (i.e., unstable) and tightens the criteria.
  • the suitability determination unit 32 tallies the most likely progression class based on the progression determination results obtained in time series over a predetermined number of times, and if the proportion of the most frequent class is less than a predetermined proportion, tightens the criteria used to determine whether an image is appropriate for progression determination.
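The class-tallying and criteria-tightening behavior above can be sketched as follows (a hypothetical Python fragment; the function name, step sizes, and the 60% ratio are illustrative assumptions, not values from this disclosure):

```python
from collections import Counter, deque


def tighten_if_unstable(recent_classes, min_size, second_threshold,
                        min_ratio=0.6, size_step=50, threshold_step=0.05):
    """Tally the most likely progression class over the recent determination
    results and, when no class reaches min_ratio of the window (i.e., the
    results are fluctuating), return stricter suitability criteria."""
    if not recent_classes:
        return min_size, second_threshold
    counts = Counter(recent_classes)
    _, top_count = counts.most_common(1)[0]
    if top_count / len(recent_classes) < min_ratio:
        min_size += size_step  # tighten the first criterion (size)
        second_threshold = min(1.0, second_threshold + threshold_step)  # second criterion
    return min_size, second_threshold
```

A bounded `deque` works naturally as the sliding window of the most recent progression determination results.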
  • FIG. 5 shows a first display example displayed by the display device 2 during an endoscopic examination.
  • the first display example shows an example of a display screen displayed by the display device 2 when the progression determination unit 33 generates a determination result of the progression (here, the invasion depth).
  • the output control unit 34 of the image processing device 1 outputs to the display device 2 display information Ib generated based on the latest endoscopic image Ia supplied from the endoscopic image acquisition unit 30, the lesion detection result output by the lesion detection unit 31, and the progress determination result output by the progress determination unit 33.
  • the output control unit 34 transmits the display information Ib to the display device 2, thereby causing the display device 2 to display the above-mentioned display screen.
  • the output control unit 34 of the image processing device 1 provides a real-time image display area 70, a lesion detection result display area 71, and an invasion depth determination result display area 72 on the display screen.
  • the output control unit 34 displays a moving image representing the latest endoscopic image Ia in the real-time image display area 70. Furthermore, in the lesion detection result display area 71, the output control unit 34 displays the lesion detection result by the lesion detection unit 31. Note that at the time of displaying the display screen shown in FIG. 5, the lesion detection unit 31 has generated a lesion reliability map as the lesion detection result, so the output control unit 34 displays an image based on the lesion reliability map (here, a mask image indicating the lesion area) in the lesion detection result display area 71.
  • the output control unit 34 displays an image in which the above-mentioned bounding box is superimposed on the latest endoscopic image Ia in the real-time image display area 70 or the lesion detection result display area 71.
  • the output control unit 34 may further display a text message in the lesion detection result display area 71 or the like to the effect that there is a high possibility that a lesion is present, and may output a sound (including voice) from the sound output unit 16 to notify the effect that there is a high possibility that a lesion is present.
  • the output control unit 34 displays the result of the progression (here, depth of invasion) judgment made by the progress determination unit 33 in the depth of invasion judgment result display area 72.
  • the progress determination unit 33 judges that the most likely depth of invasion class is "T3", and the output control unit 34 displays "T3" in the depth of invasion judgment result display area 72.
  • the output control unit 34 may output the depth of invasion class judged by the progress determination unit 33 via the sound output unit 16.
  • the progression determination unit 33 determines the depth of invasion based on the endoscopic image Ia selected by the suitability determination unit 32 as suitable for determining the progression of the lesion. Therefore, the output control unit 34 can present a highly accurate depth of invasion determination result to the examiner.
  • FIG. 6 shows a second display example displayed by the display device 2 during an endoscopic examination.
  • the second display example shows an example of a display screen displayed by the display device 2 when the suitability judgment unit 32 continuously generates unsuitability judgment results and the output control unit 34 suggests image capture.
  • the output control unit 34 of the image processing device 1 provides a real-time image display area 70, a lesion detection result display area 71, and an invasion depth judgment result display area 72 on the display screen.
  • the output control unit 34 displays a mask image based on the lesion detection result in the lesion detection result display area 71, similar to the first display example.
  • the output control unit 34 superimposes a frame 73 representing a target range (i.e., the target position and size) of a preferred lesion area in the endoscopic image Ia in the real-time image display area 70 as a suggestion for capturing an endoscopic image Ia suitable for judging the degree of progression using the endoscopic scope 3.
  • since the depth of invasion has not yet been judged, the output control unit 34 displays in the depth of invasion judgment result display area 72, instead of a judgment result, a message urging the examiner to adjust the lesion area so that it falls within the frame 73.
  • this prompts the examiner to operate the endoscopic scope 3 so that the outer edge of the lesion area shown in the lesion detection result display area 71 overlaps with the frame 73.
  • the information shown in the lesion detection result display area 71 may be superimposed on the endoscopic image Ia in the real-time image display area 70.
  • the output control unit 34 may determine at least one of the size and shape of the frame 73 based on the lesion detection result. For example, the output control unit 34 may increase the size of the frame 73 as the size of the lesion area detected by the lesion detection unit 31 increases. In another example, the output control unit 34 may recognize the shape of the lesion area detected by the lesion detection unit 31, and determine the shape of the frame 73 (e.g., the ratio of the length to the width) according to the recognized shape. In this case, for example, the output control unit 34 may approximate the lesion area detected by the lesion detection unit 31 with a figure such as an ellipse or a rectangle, and display the frame 73 according to the approximated figure.
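One hedged illustration of fitting the frame 73 to a detected lesion area: the sketch below approximates the lesion mask with an axis-aligned rectangle enlarged by a relative margin (the function name and margin value are assumptions; an ellipse fit would be an analogous alternative):

```python
import numpy as np


def target_frame_from_mask(lesion_mask: np.ndarray, margin: float = 0.2):
    """Approximate the detected lesion area with an axis-aligned rectangle,
    enlarge it by a relative margin, and clip it to the image; returns
    (top, left, bottom, right) in pixels, or None when no lesion is present."""
    ys, xs = np.nonzero(lesion_mask)
    if ys.size == 0:
        return None  # no lesion area detected, so no frame to draw
    top, bottom = int(ys.min()), int(ys.max())
    left, right = int(xs.min()), int(xs.max())
    dy = int((bottom - top + 1) * margin)  # grow the frame so the target
    dx = int((right - left + 1) * margin)  # range leaves room around the lesion
    h, w = lesion_mask.shape
    return (max(0, top - dy), max(0, left - dx),
            min(h - 1, bottom + dy), min(w - 1, right + dx))
```

Because the rectangle is derived from the mask, the frame's size and aspect ratio follow the detected lesion area, as described above.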
  • the image processing device 1 can suggest capturing an image and promote the acquisition of an endoscopic image Ia suitable for determining the degree of progression.
  • FIG. 7 is an example of a flowchart outlining the processing executed by the image processing device 1 during endoscopic examination in the first embodiment.
  • the image processing device 1 acquires an endoscopic image Ia (step S11).
  • the endoscopic image acquisition unit 30 of the image processing device 1 receives the endoscopic image Ia from the endoscopic scope 3 via the interface 13.
  • the image processing device 1 performs lesion detection on the endoscopic image Ia acquired in step S11 (step S12).
  • the image processing device 1 inputs the endoscopic image Ia to a lesion detection model configured with reference to the lesion detection model information D1, thereby acquiring a lesion detection result output from the lesion detection model.
  • the image processing device 1 determines whether or not a lesion area has been detected from the endoscopic image Ia (step S13). If the image processing device 1 determines that a lesion area has been detected from the endoscopic image Ia (step S13; Yes), it performs a progression suitability determination for the endoscopic image Ia (step S14). In this case, the image processing device 1 performs the suitability determination based on at least one of the size and reliability of the detected lesion area. On the other hand, if the image processing device 1 determines that a lesion area has not been detected from the endoscopic image Ia (step S13; No), it proceeds to step S17 without performing a suitability determination or a determination of the progression of the lesion.
  • the image processing device 1 assesses the progression of the lesion (step S16). In this case, for example, the image processing device 1 obtains an inference result of the progression output by the progression assessment model when the endoscopic image Ia is input to the progression assessment model constructed with reference to the progression assessment model information D2.
  • the image processing device 1 displays on the display device 2 information based on the endoscopic image Ia obtained in step S11, the lesion detection result generated in step S12, and the progress assessment result generated in step S16 (step S17). If the image processing device 1 determines in step S13 that a lesion area has not been detected, it displays, for example, the endoscopic image Ia and information indicating that a lesion area has not been detected in step S17.
  • if it is determined in step S15 that the endoscopic image Ia is not suitable for progression assessment (step S15; No), the image processing device 1 outputs a suggestion to capture an image suitable for progression assessment (step S19).
  • the image processing device 1 may execute step S19 only if it is determined in step S15 that the image is not suitable for progression assessment a predetermined number of times or for a predetermined period of time. Note that in this case, if the number of times or the period of time that the image is continuously determined in step S15 to be not suitable for progression assessment does not reach the above-mentioned predetermined number of times or predetermined period of time, the image processing device 1 performs a process of displaying the endoscopic image Ia and the lesion detection result without outputting a suggestion, for example.
  • in step S18, the image processing device 1 determines whether or not the endoscopic examination has ended. For example, the image processing device 1 determines that the endoscopic examination has ended when it detects a predetermined input to the input unit 14 or the operation unit 36. If the image processing device 1 determines that the endoscopic examination has ended (step S18; Yes), it ends the processing of the flowchart. On the other hand, if the image processing device 1 determines that the endoscopic examination has not ended (step S18; No), it returns the processing to step S11.
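The per-image flow of FIG. 7 might be sketched as below; all callables passed in are hypothetical stand-ins for the lesion detection model, the suitability determination, the progression determination model, and the output control described above:

```python
def process_frame(image, detect_lesion, is_suitable, judge_progression,
                  display, suggest_capture):
    """One pass of the FIG. 7 flow for a single endoscopic image Ia.

    detect_lesion stands in for steps S12-S13, is_suitable for steps S14-S15,
    judge_progression for step S16, display for step S17, and suggest_capture
    for step S19 (all are assumed interfaces, not the actual implementation)."""
    lesion = detect_lesion(image)                    # step S12
    if lesion is None:                               # step S13: No
        display(image, lesion=None, progression=None)
        return None
    if is_suitable(lesion):                          # steps S14-S15: Yes
        progression = judge_progression(image)       # step S16
        display(image, lesion=lesion, progression=progression)  # step S17
        return progression
    suggest_capture()                                # step S15: No -> step S19
    display(image, lesion=lesion, progression=None)
    return None
```

Calling this function once per time-series frame, until the examination (or video) ends, mirrors the loop back to step S11.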
  • the image processing device 1 may process the video composed of the endoscopic images Ia generated during the endoscopic examination after the examination.
  • the image processing device 1 sequentially performs the processing of the flowchart in FIG. 7 on the time-series endoscopic images Ia that constitute the specified video. When the image processing device 1 determines in step S18 that the target video has ended, it ends the processing of the flowchart; when the target video has not ended, it returns to step S11 and performs the processing of the flowchart on the next endoscopic image Ia in the time series.
  • the suitability determining unit 32 may perform the progression suitability determination based on a reliability calculated separately from the reliability output by the lesion detection model.
  • the suitability determination unit 32 calculates the reliability based on the endoscopic image Ia including the detected lesion area.
  • the suitability determination unit 32 calculates the reliability from the endoscopic image Ia using a model that outputs a reliability score that a lesion area exists in the input endoscopic image.
  • the above-mentioned model is, for example, a model based on machine learning such as a neural network, and learned parameters are stored in advance in the memory 12 or the like.
  • the above-mentioned model may be a classification model that performs a binary classification of whether or not a lesion area exists.
  • the suitability determination unit 32 acquires the confidence level corresponding to the class in which a lesion area exists, which is output from the above-mentioned classification model, as the above-mentioned reliability.
  • the suitability determination unit 32 can appropriately perform progress suitability determination.
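If the separately learned model is a binary classifier as described, the confidence for the "lesion area exists" class could be obtained roughly as follows (a sketch assuming the model exposes two raw logits ordered as [no lesion, lesion]; the ordering and function name are assumptions):

```python
import numpy as np


def lesion_presence_confidence(logits: np.ndarray) -> float:
    """Convert the two raw outputs (logits) of a hypothetical binary
    classification model into the confidence for the 'lesion area exists'
    class via a softmax; this value can then be used as the reliability in
    the progression suitability determination."""
    z = logits - logits.max()            # shift for numerical stability
    probs = np.exp(z) / np.exp(z).sum()  # softmax over the two classes
    return float(probs[1])               # confidence of the lesion class
```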
  • the lesion detection model information D1 and the progression determination model information D2 may be stored in a storage device separate from the image processing device 1.
  • FIG. 8 is a schematic diagram of an endoscopic examination system 100A in a modified example.
  • the endoscopic examination system 100A includes a server device 4 that stores lesion detection model information D1 and progression assessment model information D2.
  • the endoscopic examination system 100A also includes multiple image processing devices 1 (1A, 1B, ...) that can communicate data with the server device 4 via a network.
  • each image processing device 1 refers to the lesion detection model information D1 and the progression determination model information D2 via the network.
  • the interface 13 of each image processing device 1 includes a communication interface such as a network adapter for communication.
  • each image processing device 1 can refer to the lesion detection model information D1 and the progression determination model information D2, as in the above-mentioned embodiment, and suitably execute processing related to determining the progression of the lesion.
  • the server device 4 may instead execute at least a part of the processing executed by each functional block of the processor 11 of the image processing device 1 shown in FIG. 4.
  • Second Embodiment: FIG. 9 is a block diagram of an image processing device 1X according to the second embodiment.
  • the image processing device 1X includes an acquisition unit 30X, a detection unit 31X, a first determination unit 32X, and a second determination unit 33X.
  • the image processing device 1X may be composed of a plurality of devices.
  • the acquisition means 30X acquires an endoscopic image of the subject.
  • the acquisition means 30X can be, for example, the endoscopic image acquisition section 30 in the first embodiment (including modified examples, the same applies below).
  • the acquisition means 30X may immediately acquire an endoscopic image generated by the imaging unit, or may acquire, at a predetermined timing, an endoscopic image generated in advance by the imaging unit and stored in a storage device.
  • the detection means 31X detects a lesion area, which is a candidate area for a lesion in the subject in the endoscopic image, based on the endoscopic image.
  • the detection means 31X can be, for example, the lesion detection unit 31 in the first embodiment.
  • the first determination means 32X determines whether or not the endoscopic image is suitable for determining the progression or depth of a lesion based on at least one of the size of the lesion area or the reliability of the lesion area as a lesion.
  • the first determination means 32X can be, for example, the suitability determination unit 32 in the first embodiment.
  • the second determination means 33X determines the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion.
  • the second determination means 33X can be, for example, the degree of progression determination unit 33 in the first embodiment.
  • FIG. 10 is an example of a flowchart showing the processing procedure in the second embodiment.
  • the acquisition means 30X acquires an endoscopic image of the subject (step S21).
  • the detection means 31X detects a lesion area in the endoscopic image that is a candidate area for a lesion in the subject based on the endoscopic image (step S22).
  • the first determination means 32X determines whether the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area and the reliability of the lesion area as a lesion (step S23).
  • the second determination means 33X determines the progression or depth based on the endoscopic image determined to be suitable for determining the progression or depth (step S24).
  • the image processing device 1X can accurately determine the progression or depth of invasion based on the selected endoscopic image.
  • Non-transitory computer readable media include various types of tangible storage media.
  • Examples of non-transitory computer-readable media include magnetic storage media (e.g., floppy disks, magnetic tapes, hard disk drives), optical storage media (e.g., optical disks), CD-ROMs (Read Only Memory), CD-Rs, CD-R/Ws, and semiconductor memories (e.g., mask ROMs, PROMs (Programmable ROMs), EPROMs (Erasable PROMs), flash ROMs, and RAMs (Random Access Memories)).
  • Programs may also be supplied to computers by various types of transient computer-readable media.
  • Examples of transient computer-readable media include electrical signals, optical signals, and electromagnetic waves.
  • Transient computer-readable media can supply programs to computers via wired communication paths such as electric wires and optical fibers, or wireless communication paths.
  • An image processing device comprising:
  • [Appendix 2] The image processing device described in Appendix 1, wherein the first judgment means changes the criteria used to judge whether an image is suitable for judging the degree of progression or depth of invasion based on information obtained by judging the degree of progression or depth of invasion.
  • the first determination means changes the criterion based on a degree of confidence for the class of the progression or depth of invasion determined by the second determination means.
  • the first determination means changes the criterion based on a degree of change in the class of the progression or depth of invasion over time determined by the second determination means.
  • the criterion is at least one of a first criterion related to the size of the lesion area and a second criterion related to the reliability.
  • [Appendix 10] An image processing method in which a computer: acquires an endoscopic image of the subject; detects a lesion area in the endoscopic image, the lesion area being a candidate area for a lesion in the subject, based on the endoscopic image; determines whether the endoscopic image is suitable for determining the progression or depth of invasion of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area; and determines the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion.
  • A storage medium storing a program for causing a computer to execute a process of: acquiring an endoscopic image of the subject; detecting a lesion area in the endoscopic image, the lesion area being a candidate area for a lesion in the subject, based on the endoscopic image; determining whether the endoscopic image is suitable for determining the progression or depth of invasion of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area; and determining the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

This image processing device 1X comprises an acquisition means 30X, a detection means 31X, a first determination means 32X, and a second determination means 33X. The acquisition means 30X acquires an endoscopic image obtained by photographing a subject. The detection means 31X detects a lesion region, that is, a candidate region for a lesion in the subject in the endoscopic image, on the basis of the endoscopic image. The first determination means 32X determines whether or not the endoscopic image is suitable for determining a progression degree or an invasion depth of the lesion on the basis of the size of the lesion region and/or the degree of reliability in terms of the likelihood of a lesion in the lesion region. The second determination means 33X determines the progression degree or the invasion depth on the basis of an endoscopic image which is determined as being suitable for determining the progression degree or the invasion depth.

Description

Image processing device, image processing method, and storage medium
This disclosure relates to the technical fields of image processing devices, image processing methods, and storage media that process images acquired during endoscopic examinations.
Image processing systems that process images of the inside of organ lumens have been known for some time. For example, Patent Document 1 discloses a medical image processing device that detects lesion candidates from medical images and identifies the malignancy of the detected lesion candidates and the organs contained in the medical images.
JP 2021-083821 A
When judging the stage of progression (including the depth of invasion; the same applies below) from endoscopic images, many of the endoscopic images obtained in time series contain noise such as blur, glare, and splashes of water, and are not suitable for judging the stage of progression. Furthermore, even an image with little noise that captures what appears to be a lesion is not necessarily suitable for judging the stage of progression.
In view of the above-mentioned problems, one object of the present disclosure is to provide an image processing device, an image processing method, and a storage medium capable of suitably determining the degree of progression.
One aspect of the image processing device is
An acquisition means for acquiring an endoscopic image of a subject;
a detection means for detecting a lesion area, which is a candidate area for a lesion of the subject in the endoscopic image, based on the endoscopic image;
a first determination means for determining whether or not the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area with respect to the lesion-likeness;
a second determination means for determining the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion;
The image processing device has the following features.
One aspect of the image processing method includes:
The computer
An endoscopic image of the subject is acquired,
detecting a lesion area in the endoscopic image, the lesion area being a candidate area for a lesion in the subject based on the endoscopic image;
determining whether the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area;
determining the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion;
An image processing method.
One aspect of the storage medium is
An endoscopic image of the subject is acquired,
detecting a lesion area in the endoscopic image, the lesion area being a candidate area for a lesion in the subject based on the endoscopic image;
determining whether the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area;
The storage medium stores a program that causes a computer to execute a process for determining the degree of progression or depth of invasion based on the endoscopic image that has been determined to be an image suitable for determining the degree of progression or depth of invasion.
As an example of one effect of the present disclosure, it becomes possible to appropriately determine the degree of progression (including the depth of invasion).
FIG. 1 shows a schematic configuration of an endoscopic examination system.
FIG. 2 shows the hardware configuration of an image processing device.
FIG. 3 is a diagram showing an overview of the progression determination process executed by the image processing device in the first embodiment.
FIG. 4 is an example of functional blocks of the progression determination process in the first embodiment.
FIG. 5 shows a first display example displayed by the display device during an endoscopic examination.
FIG. 6 shows a second display example displayed by the display device during an endoscopic examination.
FIG. 7 is an example of a flowchart outlining the processing executed by the image processing device during an endoscopic examination in the first embodiment.
FIG. 8 is a schematic configuration diagram of an endoscopic examination system according to a modified example.
FIG. 9 is a block diagram of an image processing device according to the second embodiment.
FIG. 10 is an example of a flowchart executed by the image processing device in the second embodiment.
 以下、図面を参照しながら、画像処理装置、画像処理方法及び記憶媒体の実施形態について説明する。 Below, embodiments of an image processing device, an image processing method, and a storage medium will be described with reference to the drawings.
 <第1実施形態>
 (1)システム構成
 図1は、内視鏡検査システム100の概略構成を示す。図1に示すように、内視鏡検査システム100は、内視鏡により撮影した画像に基づいて病変の疑いがある被検体の部位である病変部位を検出し、検出した病変部位における病変の進行度の判定及び判定結果の提示を行うシステムである。以後において、「進行度」とは、深達度(浸潤度)を含む総合的な病変進行の度合い(グレード)を指すものであってもよく、深達度(浸潤度)そのものであってもよい。内視鏡検査システム100は、主に、画像処理装置1と、表示装置2と、画像処理装置1に接続され、検査又は治療を行う医師等の検査者が扱う内視鏡スコープ3と、を備える。
First Embodiment
(1) System Configuration FIG. 1 shows a schematic configuration of an endoscopic examination system 100. As shown in FIG. 1, the endoscopic examination system 100 is a system that detects a lesion site, which is a site of a subject suspected of having a lesion, based on an image captured by an endoscope, judges the progression of the lesion at the detected lesion site, and presents the judgment result. Hereinafter, the "progression level" may refer to a comprehensive degree (grade) of progression of a lesion including the depth of invasion (degree of invasion), or may be the depth of invasion (degree of invasion) itself. The endoscopic examination system 100 mainly includes an image processing device 1, a display device 2, and an endoscope scope 3 that is connected to the image processing device 1 and is handled by an examiner such as a doctor who performs an examination or treatment.
 画像処理装置1は、内視鏡スコープ3が時系列により撮影する画像(「内視鏡画像Ia」とも呼ぶ。)を内視鏡スコープ3から取得し、内視鏡画像Iaに基づく画面を表示装置2に表示させる。内視鏡画像Iaは、被検者への内視鏡スコープ3の挿入工程又は排出工程の少なくとも一方において所定のフレーム周期により撮影された画像である。本実施形態においては、画像処理装置1は、時系列の内視鏡画像Iaから、病変部位の候補となる領域(「病変領域」とも呼ぶ。)が存在する内視鏡画像Iaを検出する。そして、画像処理装置1は、検出した内視鏡画像Iaのうち病変の進行度の判定に好適な画像を選択し、選択した画像に対して病変の進行度の判定を行い、その判定結果に関する情報を提示する。また、画像処理装置1は、病変の進行度の判定に好適な画像が得られない場合に、そのような画像を得るための撮影の示唆を出力する。 The image processing device 1 acquires images (also called "endoscopic images Ia") captured by the endoscope scope 3 in a time series from the endoscope scope 3, and displays a screen based on the endoscopic images Ia on the display device 2. The endoscopic images Ia are images captured at a predetermined frame cycle during at least one of the process of inserting the endoscope scope 3 into the subject and the process of withdrawing it from the subject. In this embodiment, the image processing device 1 detects, from the time series of endoscopic images Ia, an endoscopic image Ia in which a region that is a candidate for a lesion site (also called a "lesion region") is present. The image processing device 1 then selects an image from the detected endoscopic images Ia that is suitable for determining the progression of the lesion, determines the progression of the lesion for the selected image, and presents information on the determination result. Furthermore, when an image suitable for determining the progression of the lesion cannot be obtained, the image processing device 1 outputs a suggestion for capturing such an image.
 表示装置2は、画像処理装置1から供給される表示信号に基づき所定の表示を行うディスプレイ等である。 The display device 2 is a display or the like that performs a predetermined display based on a display signal supplied from the image processing device 1.
 内視鏡スコープ3は、主に、検査者が所定の入力を行うための操作部36と、被検者の撮影対象となる臓器内に挿入され、柔軟性を有するシャフト37と、超小型撮像素子などの撮影部を内蔵した先端部38と、画像処理装置1と接続するための接続部39とを有する。 The endoscope 3 mainly comprises an operation section 36 that allows the examiner to perform predetermined inputs, a flexible shaft 37 that is inserted into the subject's organ to be imaged, a tip section 38 that incorporates an imaging section such as a miniature image sensor, and a connection section 39 for connecting to the image processing device 1.
 図1に示される内視鏡検査システム100の構成は一例であり、種々の変更が行われてもよい。例えば、画像処理装置1は、表示装置2と一体に構成されてもよい。他の例では、画像処理装置1は、複数の装置から構成されてもよい。 The configuration of the endoscopic examination system 100 shown in FIG. 1 is an example, and various modifications may be made. For example, the image processing device 1 may be configured integrally with the display device 2. In another example, the image processing device 1 may be configured from multiple devices.
 なお、本開示における内視鏡検査の被検体は、大腸、食道、胃、膵臓などの内視鏡検査が可能な任意の臓器であってもよい。例えば、本開示において対象となる内視鏡は、咽頭内視鏡、気管支鏡、上部消化管内視鏡、十二指腸内視鏡、小腸内視鏡、大腸内視鏡、カプセル内視鏡、胸腔鏡、腹腔鏡、膀胱鏡、胆道鏡、関節鏡、脊椎内視鏡、血管内視鏡、硬膜外腔内視鏡などが挙げられる。また、内視鏡検査において検知対象となる病変部位の病状は、以下の(a)~(f)ように例示される。 The subject of the endoscopic examination in the present disclosure may be any organ that can be examined endoscopically, such as the large intestine, esophagus, stomach, or pancreas. For example, endoscopes that are the subject of the present disclosure include pharyngeal endoscopes, bronchoscopes, upper gastrointestinal endoscopes, duodenoscopes, small intestinal endoscopes, colonoscopes, capsule endoscopes, thoracoscopes, laparoscopes, cystoscopes, cholangioscopes, arthroscopes, spinal endoscopes, vascular endoscopes, and epidural endoscopes. Examples of the pathology of the lesion site that is the subject of the endoscopic examination include (a) to (f) below.
 (a)頭頚部:咽頭ガン、悪性リンパ腫、乳頭腫
 (b)食道:食道ガン、食道炎、食道裂孔ヘルニア、食道静脈瘤、食道アカラシア、食道粘膜下腫瘍、食道良性腫瘍
 (c)胃:胃ガン、胃炎、胃潰瘍、胃ポリープ、胃腫瘍
 (d)十二指腸:十二指腸ガン、十二指腸潰瘍、十二指腸炎、十二指腸腫瘍、十二指腸リンパ腫
 (e)小腸:小腸ガン、小腸腫瘍性疾患、小腸炎症性疾患、小腸血管性疾患
 (f)大腸:大腸ガン、大腸腫瘍性疾患、大腸炎症性疾患、大腸ポリープ、大腸ポリポーシス、クローン病、大腸炎、腸結核、痔
(a) Head and neck: pharyngeal cancer, malignant lymphoma, papilloma (b) Esophagus: esophageal cancer, esophagitis, hiatal hernia, esophageal varices, esophageal achalasia, esophageal submucosal tumor, benign esophageal tumor (c) Stomach: gastric cancer, gastritis, gastric ulcer, gastric polyp, gastric tumor (d) Duodenum: duodenal cancer, duodenal ulcer, duodenitis, duodenal tumor, duodenal lymphoma (e) Small intestine: small intestine cancer, small intestine neoplastic disease, small intestine inflammatory disease, small intestine vascular disease (f) Large intestine: large intestine cancer, large intestine neoplastic disease, large intestine inflammatory disease, large intestine polyp, large intestine polyposis, Crohn's disease, colitis, intestinal tuberculosis, hemorrhoids
 (2)ハードウェア構成
 図2は、画像処理装置1のハードウェア構成を示す。画像処理装置1は、主に、プロセッサ11と、メモリ12と、インターフェース13と、入力部14と、光源部15と、音出力部16と、を含む。これらの各要素は、データバス19を介して接続されている。
(2) Hardware Configuration Fig. 2 shows the hardware configuration of the image processing device 1. The image processing device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, and a sound output unit 16. These elements are connected via a data bus 19.
 プロセッサ11は、メモリ12に記憶されているプログラム等を実行することにより、所定の処理を実行する。プロセッサ11は、CPU(Central Processing Unit)、GPU(Graphics Processing Unit)、TPU(Tensor Processing Unit)などのプロセッサである。プロセッサ11は、複数のプロセッサから構成されてもよい。プロセッサ11は、コンピュータの一例である。 The processor 11 executes predetermined processing by executing programs stored in the memory 12. The processor 11 is a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit). The processor 11 may be composed of multiple processors. The processor 11 is an example of a computer.
 メモリ12は、RAM(Random Access Memory)、ROM(Read Only Memory)などの、作業メモリとして使用される各種の揮発性メモリ及び画像処理装置1の処理に必要な情報を記憶する不揮発性メモリにより構成される。なお、メモリ12は、画像処理装置1に接続又は内蔵されたハードディスクなどの外部記憶装置を含んでもよく、着脱自在なフラッシュメモリなどの記憶媒体を含んでもよい。メモリ12には、画像処理装置1が本実施形態における各処理を実行するためのプログラムが記憶される。 The memory 12 is composed of various volatile memories used as working memories, such as RAM (Random Access Memory) and ROM (Read Only Memory), and non-volatile memories that store information necessary for the processing of the image processing device 1. The memory 12 may include an external storage device such as a hard disk connected to or built into the image processing device 1, or may include a removable storage medium such as a flash memory. The memory 12 stores programs that enable the image processing device 1 to execute each process in this embodiment.
 また、メモリ12は、病変検出モデルに関する情報である病変検出モデル情報D1と、進行度判定モデルに関する情報である進行度判定モデル情報D2とを記憶している。病変検出モデル及び進行度判定モデルの詳細については後述する。また、メモリ12には、画像処理装置1が本実施形態における各処理を実行するために必要なその他の情報を任意に含んでもよい。 The memory 12 also stores lesion detection model information D1, which is information related to the lesion detection model, and progression determination model information D2, which is information related to the progression determination model. Details of the lesion detection model and the progression determination model will be described later. The memory 12 may also include any other information necessary for the image processing device 1 to execute each process in this embodiment.
 インターフェース13は、画像処理装置1と外部装置とのインターフェース動作を行う。例えば、インターフェース13は、プロセッサ11が生成した表示情報「Ib」を表示装置2に供給する。また、インターフェース13は、光源部15が生成する光等を内視鏡スコープ3に供給する。また、インターフェース13は、内視鏡スコープ3から供給される内視鏡画像Iaを示す電気信号をプロセッサ11に供給する。インターフェース13は、外部装置と有線又は無線により通信を行うためのネットワークアダプタなどの通信インターフェースであってもよく、USB(Universal Serial Bus)、SATA(Serial AT Attachment)などに準拠したハードウェアインターフェースであってもよい。 The interface 13 performs interface operations between the image processing device 1 and an external device. For example, the interface 13 supplies the display information "Ib" generated by the processor 11 to the display device 2. The interface 13 also supplies light generated by the light source unit 15 to the endoscope scope 3. The interface 13 also supplies an electrical signal indicating the endoscopic image Ia supplied from the endoscope scope 3 to the processor 11. The interface 13 may be a communication interface such as a network adapter for communicating with an external device by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), etc.
 入力部14は、検査者による操作に基づく入力信号を生成する。入力部14は、例えば、ボタン、タッチパネル、リモートコントローラ、音声入力装置等である。光源部15は、内視鏡スコープ3の先端部38に供給するための光を生成する。また、光源部15は、内視鏡スコープ3に供給する水や空気を送り出すためのポンプ等も内蔵してもよい。音出力部16は、プロセッサ11の制御に基づき音を出力する。 The input unit 14 generates an input signal based on the operation by the examiner. The input unit 14 is, for example, a button, a touch panel, a remote controller, a voice input device, etc. The light source unit 15 generates light to be supplied to the tip 38 of the endoscope 3. The light source unit 15 may also incorporate a pump or the like for sending water or air to be supplied to the endoscope 3. The sound output unit 16 outputs sound based on the control of the processor 11.
 次に、病変検出モデルと進行度判定モデルの詳細について説明する。 Next, we will explain the details of the lesion detection model and progression assessment model.
 病変検出モデルは、内視鏡検査において検知対象となる疾患に該当する病変領域に関する推論結果を生成する機械学習モデルであり、当該モデルに必要なパラメータが病変検出モデル情報D1に記憶されている。病変検出モデルは、例えば、内視鏡画像が入力された場合に、入力された内視鏡画像における病変領域に関する推論結果を出力する。言い換えると、病変検出モデルは、病変検出モデルに入力される画像と、当該画像における病変領域との関係を学習したモデルである。病変検出モデルは、ニューラルネットワークやサポートベクターマシーンなどの任意の機械学習において採用されるアーキテクチャを含むモデル(統計モデルを含む、以下同じ。)であってもよい。このようなニューラルネットワークの代表モデルとして、例えば、Fully Convolutional Network、SegNet、U-Net、V-Net、Feature Pyramid Network、Mask R-CNN、DeepLabなどが存在する。病変検出モデルがニューラルネットワークにより構成される場合、病変検出モデル情報D1は、例えば、層構造、各層のニューロン構造、各層におけるフィルタ数及びフィルタサイズ、並びに各フィルタの各要素の重みなどの各種パラメータを含む。 The lesion detection model is a machine learning model that generates an inference result regarding a lesion area corresponding to a disease to be detected in an endoscopic examination, and the parameters required for the model are stored in the lesion detection model information D1. For example, when an endoscopic image is input, the lesion detection model outputs an inference result regarding a lesion area in the input endoscopic image. In other words, the lesion detection model is a model that has learned the relationship between an image input to the lesion detection model and a lesion area in the image. The lesion detection model may be a model (including a statistical model, the same applies below) that includes an architecture adopted in any machine learning such as a neural network or a support vector machine. Representative models of such neural networks include, for example, Fully Convolutional Network, SegNet, U-Net, V-Net, Feature Pyramid Network, Mask R-CNN, DeepLab, and the like. When the lesion detection model is configured using a neural network, the lesion detection model information D1 includes various parameters such as the layer structure, the neuron structure of each layer, the number of filters and filter size in each layer, and the weights of each element of each filter.
 ここで、病変検出モデルが出力する推論結果の具体的な態様(第1出力態様及び第2出力態様)について説明する。 Here, we will explain the specific aspects (first output aspect and second output aspect) of the inference results output by the lesion detection model.
 第1出力態様では、病変検出モデルは、入力された内視鏡画像の単位領域ごとに、病変領域であることの信頼度(「病変信頼度」とも呼ぶ。)を示したマップを、推論結果として出力する。上述のマップを、以後では、「病変信頼度マップ」とも呼ぶ。例えば、病変信頼度マップは、病変信頼度を単位画素(サブピクセルを含んでもよい)又は所定の規則により区切られた画素ブロックごとに示した画像である。なお、病変信頼度は、病変信頼度が高い領域ほど、病変領域である信頼度が高いことを表すものとする。病変信頼度マップは、病変領域を2値により示したマスク画像であってもよい。 In the first output mode, the lesion detection model outputs, as an inference result, a map indicating the reliability (also called "lesion reliability") that each unit area of the input endoscopic image is a lesion area. Hereinafter, the above map will also be called a "lesion reliability map." For example, the lesion reliability map is an image indicating the lesion reliability for each unit pixel (which may include subpixels) or for each pixel block separated by a predetermined rule. Note that a higher lesion reliability for an area indicates a higher confidence that the area is a lesion area. The lesion reliability map may be a mask image indicating lesion areas using binary values.
 第2出力態様では、病変検出モデルは、入力された内視鏡画像における病変領域の存在範囲を示すバウンディングボックスと、バウンディングボックスにより囲まれた領域が病変領域であることの信頼度と、を夫々示す推論結果を出力する。ここでの信頼度は、例えば、病変検出モデルがニューラルネットワークにより構成されている場合にニューラルネットワークの出力層から出力される確信の度合いを示すスコアである確信度である。なお、上述した病変検出モデルが出力する推論結果の態様は一例であり、種々の態様の推論結果が病変検出モデルから出力されてもよい。 In the second output mode, the lesion detection model outputs an inference result indicating a bounding box indicating the extent of the lesion area in the input endoscopic image, and the confidence that the area surrounded by the bounding box is a lesion area. The confidence here is, for example, a confidence level that is a score indicating the degree of confidence output from the output layer of a neural network when the lesion detection model is configured using a neural network. Note that the above-mentioned mode of inference result output by the lesion detection model is one example, and various modes of inference results may be output from the lesion detection model.
 病変検出モデルは、病変検出モデルの入力形式に即した入力画像と当該入力画像が入力された場合に病変検出モデルが出力すべき推論結果の正解を示す正解データ(上述の例では、正解の病変信頼度マップ又はバウンディングボックス)との組に基づき予め学習される。そして、学習により得られた各モデルのパラメータ等が病変検出モデル情報D1としてメモリ12に記憶される。 The lesion detection model is trained in advance based on a pair of an input image conforming to the input format of the lesion detection model and correct answer data (in the above example, the correct lesion confidence map or bounding box) indicating the correct inference result that the lesion detection model should output when the input image is input. Then, the parameters of each model obtained by training are stored in memory 12 as lesion detection model information D1.
 なお、病変検出モデルは、内視鏡画像から特徴量を抽出する特徴抽出モデルを含んでもよく、当該特徴抽出モデルと別体のモデルであってもよい。後者の場合、病変検出モデルは、内視鏡画像が入力された特徴抽出モデルが出力する特徴量(所定次元数のテンソル)が入力された場合に上述した推論結果を出力するように学習されたモデルとなる。 The lesion detection model may include a feature extraction model that extracts features from an endoscopic image, or may be a model separate from the feature extraction model. In the latter case, the lesion detection model is a model trained to output the above-mentioned inference result when a feature (tensor with a predetermined number of dimensions) output by the feature extraction model to which an endoscopic image is input is input.
 進行度判定モデルは、入力された内視鏡画像に含まれる病変領域が示す病変の進行度を推論(分類)する機械学習モデルであり、当該モデルに必要なパラメータが進行度判定モデル情報D2に記憶されている。進行度判定モデルは、内視鏡画像が入力された場合に、入力された内視鏡画像における病変の進行度を示す推論結果(詳しくは進行度のクラスを示す分類結果)を出力する。言い換えると、進行度判定モデルは、進行度判定モデルに入力される画像と、当該画像における病変の進行度との関係を学習したモデルである。進行度判定モデルは、ニューラルネットワークやサポートベクターマシーンなどの任意の機械学習において採用されるアーキテクチャを含むモデル(統計モデルを含む、以下同じ。)であってもよい。進行度判定モデルがニューラルネットワークにより構成される場合、進行度判定モデルは、例えば、層構造、各層のニューロン構造、各層におけるフィルタ数及びフィルタサイズ、並びに各フィルタの各要素の重みなどの各種パラメータを含む。 The progression assessment model is a machine learning model that infers (classifies) the progression of a lesion indicated by a lesion area included in an input endoscopic image, and parameters required for the model are stored in the progression assessment model information D2. When an endoscopic image is input, the progression assessment model outputs an inference result indicating the progression of the lesion in the input endoscopic image (more specifically, a classification result indicating the class of the progression). In other words, the progression assessment model is a model that has learned the relationship between the image input to the progression assessment model and the progression of the lesion in the image. The progression assessment model may be a model (including a statistical model, the same applies below) that includes an architecture adopted in any machine learning such as a neural network or a support vector machine. When the progression assessment model is configured by a neural network, the progression assessment model includes various parameters such as a layer structure, a neuron structure of each layer, the number of filters and filter size in each layer, and the weight of each element of each filter.
 進行度判定モデルは、進行度判定モデルの入力形式に即した入力画像と当該入力画像が入力された場合に進行度判定モデルが出力すべき推論結果の正解を示す正解データ(即ち、正解となる進行度のクラス)との組に基づき予め学習される。そして、学習により得られた各モデルのパラメータ等が進行度判定モデル情報D2としてメモリ12に記憶される。 The progress assessment model is trained in advance based on a pair of an input image conforming to the input format of the progress assessment model and correct answer data (i.e., the progress class that is the correct answer) that indicates the correct inference result that the progress assessment model should output when the input image is input. Then, the parameters of each model obtained by training are stored in memory 12 as progress assessment model information D2.
 なお、進行度判定モデルに入力される内視鏡画像は、内視鏡スコープ3が生成する内視鏡画像Iaの全体画像であってもよく、病変検出モデルにより検出された病変領域を少なくとも含むように内視鏡画像から切り取られた画像(即ち内視鏡画像Iaの部分画像)であってもよい。また、進行度判定モデルには、内視鏡画像に代えて、上述した病変検出モデル又は特徴抽出モデルが算出した画像の特徴量が入力されてもよい。 The endoscopic image input to the progression determination model may be the entire image of the endoscopic image Ia generated by the endoscopic scope 3, or may be an image cut out from the endoscopic image so as to include at least the lesion area detected by the lesion detection model (i.e., a partial image of the endoscopic image Ia). Furthermore, instead of the endoscopic image, the progress determination model may be input with the feature quantities of the image calculated by the lesion detection model or feature extraction model described above.
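As an illustration of the partial-image input mentioned above, the following sketch crops a window around a detected bounding box before it would be handed to the progression determination model. This is not taken from the disclosure: the function name, the (top, left, bottom, right) box convention, and the margin ratio are all hypothetical.

```python
def crop_lesion_region(image_h, image_w, box, margin_ratio=0.2):
    """Return a crop window (top, left, bottom, right) that contains the
    detected lesion bounding box plus a margin, clamped to the image.

    `box` is (top, left, bottom, right) in pixel coordinates; the margin
    ratio is a hypothetical parameter, not specified in the disclosure.
    """
    top, left, bottom, right = box
    mh = int((bottom - top) * margin_ratio)
    mw = int((right - left) * margin_ratio)
    return (max(0, top - mh),
            max(0, left - mw),
            min(image_h, bottom + mh),
            min(image_w, right + mw))
```

A crop like this keeps the lesion area and a little surrounding context while discarding the rest of the frame, which matches the intent of feeding a partial image rather than the whole endoscopic image Ia.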
 (3)概要説明
 図3は、第1実施形態における画像処理装置1が実行する進行度の判定に関する進行度判定処理の概要を示す図である。
(3) Overview FIG. 3 is a diagram showing an overview of a progress determination process related to the determination of the progress degree executed by the image processing device 1 in the first embodiment.
 まず、画像処理装置1は、内視鏡スコープ3から所定のフレーム周期により得られる各内視鏡画像Iaにおける病変領域を、病変検出モデルを用いて検出する。そして、画像処理装置1は、内視鏡画像Iaにおける病変領域に関する推論結果を病変検出モデルから取得する。図3の例では、画像処理装置1は、病変検出結果として、第1出力態様に基づく病変検出モデルの推論結果を示す病変信頼度マップと、第2出力態様に基づく病変検出モデルの推論結果を示すバウンディングボックスとのいずれかを取得する。ここでは、病変信頼度マップは、病変信頼度が高いピクセルほど白に近い色により表されている。また、上述のバウンディングボックスは、説明便宜上、病変検出モデルに入力された内視鏡画像Iaに重畳して表示されている。 First, the image processing device 1 detects a lesion area in each endoscopic image Ia obtained from the endoscope scope 3 at a predetermined frame period using a lesion detection model. Then, the image processing device 1 obtains an inference result regarding the lesion area in the endoscopic image Ia from the lesion detection model. In the example of FIG. 3, the image processing device 1 obtains, as the lesion detection result, either a lesion confidence map indicating the inference result of the lesion detection model based on the first output mode, or a bounding box indicating the inference result of the lesion detection model based on the second output mode. Here, in the lesion confidence map, pixels with higher lesion confidence are represented by a color closer to white. Also, for ease of explanation, the above-mentioned bounding box is displayed superimposed on the endoscopic image Ia input to the lesion detection model.
 ここで、画像処理装置1は、病変領域が検出できなかった内視鏡画像Iaについては、進行度判定モデルに基づく進行度の判定を行わない。なお、画像処理装置1は、第1出力態様では、所定の閾値(「第1閾値」とも呼ぶ。)以上となる病変信頼度となる単位領域(例えば画素)を病変領域を構成する部分領域とみなし、そのような単位領域が存在しなかった又は所定画素数以上の上述の単位領域の連結領域が存在しなかった場合に、病変領域が検出できなかったと判定する。上述の第1閾値は、例えば、予めメモリ12等に記憶された既定値である。また、画像処理装置1は、第2出力態様では、バウンディングボックス内の領域を病変領域とみなし、バウンディングボックスが得られなかった場合に、病変領域が検出できなかったと判定する。 Here, the image processing device 1 does not perform a progression determination based on the progression determination model for an endoscopic image Ia in which a lesion area could not be detected. In the first output mode, the image processing device 1 regards a unit area (e.g., pixels) having a lesion reliability equal to or greater than a predetermined threshold (also referred to as the "first threshold") as a partial area constituting the lesion area, and determines that the lesion area could not be detected if no such unit area exists or if no connected area of the above-mentioned unit areas having a predetermined number of pixels or more exists. The above-mentioned first threshold is, for example, a default value stored in advance in the memory 12 or the like. In the second output mode, the image processing device 1 regards the area within a bounding box as the lesion area, and determines that the lesion area could not be detected if no bounding box was obtained.
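The first-output-mode decision rule described above — a lesion counts as detected only when pixels whose lesion reliability reaches the first threshold form a connected region of at least a predetermined number of pixels — could be sketched as follows. The concrete threshold values and the use of 4-connectivity are illustrative assumptions, not taken from the disclosure.

```python
from collections import deque

def lesion_detected(conf_map, first_threshold=0.5, min_pixels=10):
    """Decide, per the first output mode, whether a lesion region exists:
    pixels with confidence >= first_threshold must form at least one
    connected region (4-connectivity assumed) of min_pixels or more."""
    h, w = len(conf_map), len(conf_map[0])
    mask = [[conf_map[y][x] >= first_threshold for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not mask[y][x] or seen[y][x]:
                continue
            # BFS over one connected component of above-threshold pixels
            size, queue = 0, deque([(y, x)])
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                size += 1
                for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                    if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if size >= min_pixels:
                return True
    return False
```

If no above-threshold pixel exists, or every connected component is smaller than the minimum, the function reports that no lesion region was detected, mirroring the rule in the paragraph above.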
 次に、画像処理装置1は、病変領域が検出された内視鏡画像Iaの各々について、進行度の判定に適した内視鏡画像であるか否かの判定(「進行度適否判定」とも呼ぶ。)を行う。この場合、画像処理装置1は、対象となる内視鏡画像Iaから検出した病変領域のサイズ又は検出した病変領域についての信頼度の少なくとも一方に基づき、対象となる内視鏡画像Iaを対象とする進行度適否判定を行う。 Next, the image processing device 1 judges whether each of the endoscopic images Ia in which the lesion area has been detected is an endoscopic image suitable for judging the degree of progression (also called a "progression suitability judgment"). In this case, the image processing device 1 judges the suitability of the degree of progression for the target endoscopic image Ia based on at least one of the size of the lesion area detected from the target endoscopic image Ia or the reliability of the detected lesion area.
 そして、画像処理装置1は、病変領域が検出された内視鏡画像Iaが進行度の判定に適した内視鏡画像であると判定した場合には、当該内視鏡画像Iaを用いて進行度判定モデルにより病変の進行度の判定を行う。これにより、画像処理装置1は、病変の進行度に好適な内視鏡画像Iaを自動選定し、病変の進行度を高精度に判定することができる。一方、画像処理装置1は、病変領域が検出された内視鏡画像Iaが進行度の判定に適した内視鏡画像でないと判定した場合には、進行度の判定に適した内視鏡画像Iaを内視鏡スコープ3により撮影するための示唆を、表示装置2又は音出力部16により出力する。これにより、画像処理装置1は、進行度の判定に適した内視鏡画像Iaを撮影するように検査者の内視鏡スコープ3の操作を支援し、進行度の判定に適した内視鏡画像Iaの取得を促進することができる。 If the image processing device 1 determines that the endoscopic image Ia in which the lesion area is detected is an endoscopic image suitable for judging the degree of progression, the image processing device 1 uses the endoscopic image Ia to judge the degree of progression of the lesion using a progression judgment model. This allows the image processing device 1 to automatically select an endoscopic image Ia suitable for the degree of progression of the lesion and judge the degree of progression of the lesion with high accuracy. On the other hand, if the image processing device 1 determines that the endoscopic image Ia in which the lesion area is detected is not an endoscopic image suitable for judging the degree of progression, it outputs a suggestion to capture an endoscopic image Ia suitable for judging the degree of progression using the endoscopic scope 3 via the display device 2 or the sound output unit 16. This allows the image processing device 1 to assist the examiner in operating the endoscopic scope 3 to capture an endoscopic image Ia suitable for judging the degree of progression, and promote the acquisition of an endoscopic image Ia suitable for judging the degree of progression.
 (4)機能ブロック
 図4は、第1実施形態における進行度判定処理の機能ブロックの一例である。画像処理装置1のプロセッサ11は、機能的には、内視鏡画像取得部30と、病変検出部31と、適否判定部32と、進行度判定部33と、出力制御部34と、を有する。なお、図4では、データの授受が行われるブロック同士を実線により結んでいるが、データの授受が行われるブロックの組合せはこれに限定されない。後述する他の機能ブロックの図においても同様である。
(4) Functional Blocks Fig. 4 is an example of functional blocks of the progress assessment process in the first embodiment. The processor 11 of the image processing device 1 functionally has an endoscopic image acquisition unit 30, a lesion detection unit 31, a suitability assessment unit 32, a progress assessment unit 33, and an output control unit 34. Note that in Fig. 4, blocks between which data is exchanged are connected by solid lines, but the combination of blocks between which data is exchanged is not limited to this. The same applies to other functional block diagrams described later.
 内視鏡画像取得部30は、インターフェース13を介して内視鏡スコープ3が撮影した内視鏡画像Iaを所定間隔により取得する。そして、内視鏡画像取得部30は、取得した内視鏡画像Iaを、病変検出部31及び出力制御部34に夫々供給する。そして、内視鏡画像取得部30が内視鏡画像Iaを取得する時間間隔を周期として、後段の各処理部が後述の処理を行う。 The endoscopic image acquisition unit 30 acquires the endoscopic image Ia captured by the endoscopic scope 3 via the interface 13 at predetermined intervals. The endoscopic image acquisition unit 30 then supplies the acquired endoscopic image Ia to the lesion detection unit 31 and the output control unit 34, respectively. Then, the respective processing units in the subsequent stages perform the processing described below, with the time interval at which the endoscopic image acquisition unit 30 acquires the endoscopic image Ia being set as a period.
 病変検出部31は、病変検出モデル情報D1に基づき、内視鏡画像取得部30から供給された内視鏡画像Iaにおける病変領域の検出を行う。この場合、病変検出部31は、病変検出モデル情報D1を参照することで構成した病変検出モデルに内視鏡画像Iaを入力し、当該病変検出モデルが出力する病変検出の推論結果を取得する。病変検出の推論結果は、第1出力態様に基づく病変信頼度マップであってもよく、第2出力態様に基づくバウンディングボックス及び病変信頼度の組であってもよい。病変検出部31は、病変領域を検出したと判定した場合に、上述の病変検出の推論結果に相当する病変検出結果を内視鏡画像Iaと共に適否判定部32に供給する。また、病変検出部31は、病変領域を検出したか否かに関わらず、病変検出結果を出力制御部34に供給する。なお、病変領域を検出しなかった場合の病変検出結果は、例えば、病変領域が検出されなかったことを示す情報である。 The lesion detection unit 31 detects a lesion area in the endoscopic image Ia supplied from the endoscopic image acquisition unit 30 based on the lesion detection model information D1. In this case, the lesion detection unit 31 inputs the endoscopic image Ia to a lesion detection model constructed by referring to the lesion detection model information D1, and obtains an inference result of lesion detection output by the lesion detection model. The inference result of lesion detection may be a lesion reliability map based on the first output mode, or a set of a bounding box and lesion reliability based on the second output mode. When the lesion detection unit 31 determines that a lesion area has been detected, it supplies the lesion detection result corresponding to the inference result of the above-mentioned lesion detection together with the endoscopic image Ia to the suitability determination unit 32. In addition, the lesion detection unit 31 supplies the lesion detection result to the output control unit 34 regardless of whether or not a lesion area has been detected. Note that the lesion detection result when a lesion area has not been detected is, for example, information indicating that a lesion area has not been detected.
 適否判定部32は、病変検出部31から供給される内視鏡画像Iaが病変の進行度の判定に適しているか否かの適否判定である進行度適否判定を行う。この場合、適否判定部32は、病変検出の推論結果に基づき、内視鏡画像Iaから検出した病変領域のサイズ又は検出した病変領域についての信頼度の少なくとも一方を取得し、取得したサイズ又は信頼度の少なくとも一方に基づき、進行度適否判定を行う。そして、適否判定部32は、内視鏡画像Iaが病変の進行度の判定に適していると判定した場合、内視鏡画像Iaを、進行度判定部33へ供給する。また、適否判定部32は、内視鏡画像Iaが病変の進行度の判定に適していないと判定した場合、内視鏡画像Iaが病変の進行度の判定に適していない旨を示す判定結果(「不適判定結果」とも呼ぶ。)を出力制御部34へ供給する。また、適否判定部32は、内視鏡画像Iaが病変の進行度の判定に適していると判定した場合においても、内視鏡画像Iaが病変の進行度の判定に適している旨を示す判定結果(「好適判定結果」とも呼ぶ。)を出力制御部34へ供給してもよい。 The suitability determination unit 32 performs a progression suitability determination, which is a determination of whether the endoscopic image Ia supplied from the lesion detection unit 31 is suitable for determining the progression of the lesion. In this case, the suitability determination unit 32 acquires at least one of the size of the lesion area detected from the endoscopic image Ia or the reliability of the detected lesion area based on the inference result of the lesion detection, and performs a progression suitability determination based on at least one of the acquired size or reliability. Then, if the suitability determination unit 32 determines that the endoscopic image Ia is suitable for determining the progression of the lesion, it supplies the endoscopic image Ia to the progression determination unit 33. Also, if the suitability determination unit 32 determines that the endoscopic image Ia is not suitable for determining the progression of the lesion, it supplies a determination result (also called an "unsuitability determination result") indicating that the endoscopic image Ia is not suitable for determining the progression of the lesion to the output control unit 34. 
Furthermore, even if the suitability determination unit 32 determines that the endoscopic image Ia is suitable for determining the degree of progression of the lesion, the suitability determination unit 32 may supply the output control unit 34 with a determination result indicating that the endoscopic image Ia is suitable for determining the degree of progression of the lesion (also referred to as a "suitable determination result").
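A minimal sketch of the suitability determination described above, using the size of the detected lesion area and its reliability, might look like the following. The concrete thresholds, and the choice to require both criteria at once, are assumptions for illustration only; the disclosure also permits using either criterion alone.

```python
def is_suitable_for_progression(lesion_pixels, total_pixels, confidence,
                                min_area_ratio=0.05, min_confidence=0.8):
    """Return True if the endoscopic image is suitable for progression
    determination: the lesion region must occupy a sufficient fraction of
    the image AND be detected with sufficient confidence.  Both threshold
    values here are hypothetical placeholders."""
    area_ratio = lesion_pixels / total_pixels
    return area_ratio >= min_area_ratio and confidence >= min_confidence
```

An image that passes would be forwarded to the progression determination unit 33; one that fails would instead produce an unsuitability determination result for the output control unit 34.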
 進行度判定部33は、進行度判定モデル情報D2に基づき、進行度の判定に用いる適性があると適否判定部32により判定された内視鏡画像Iaに含まれる病変領域の進行度を判定する(即ち病変領域が属する進行度のクラスを分類する)。この場合、例えば、進行度判定部33は、進行度判定モデル情報D2を参照することで構成した進行度判定モデルに内視鏡画像Iaを入力し、当該進行度判定モデルが出力する進行度の推論結果を取得する。この場合、例えば、進行度判定モデルは、進行度の推論結果として、最も確からしい進行度のクラス(例えば進行度を表すグレード)と、分類の候補となる進行度の各クラスにおける確信度(確信の度合い)と、を出力する。なお、進行度判定部33は、内視鏡画像Iaに代えて、病変領域を含む内視鏡画像Iaの部分画像又は病変検出部31により算出された内視鏡画像Iaの特徴量を進行度判定モデルに入力してもよい。そして、進行度判定部33は、上述した進行度の推論結果に基づく進行度判定結果を出力制御部34へ供給する。進行度判定結果は、進行度の判定により得られた情報であり、例えば、最も確からしいと判定された進行度のクラスと、当該クラスの確信度とを示す情報である。 Based on the progression determination model information D2, the progress determination unit 33 determines the progress of the lesion area included in the endoscopic image Ia that is determined by the suitability determination unit 32 to be suitable for use in progression determination (i.e., classifies the progress class to which the lesion area belongs). In this case, for example, the progress determination unit 33 inputs the endoscopic image Ia to a progress determination model configured by referring to the progress determination model information D2, and obtains the progress inference result output by the progress determination model. In this case, for example, the progress determination model outputs the most likely progress class (e.g., a grade representing the progress) and the confidence level (degree of confidence) for each progress class that is a candidate for classification as the progress inference result. Note that the progress determination unit 33 may input a partial image of the endoscopic image Ia including the lesion area or the feature amount of the endoscopic image Ia calculated by the lesion detection unit 31 to the progress determination model instead of the endoscopic image Ia. Then, the progress determination unit 33 supplies the progress determination result based on the above-mentioned progress inference result to the output control unit 34. 
The progress level determination result is information obtained by determining the progress level, and is, for example, information indicating the progress level class determined to be most likely and the degree of certainty of that class.
 なお、進行度判定部33は、進行度判定モデルが1つの内視鏡画像Iaから算出する進行度の推論結果に基づき、出力制御部34に出力させる最終的な進行度を決定する代わりに、2以上の所定個数の内視鏡画像Iaに基づく所定個数の進行度の推論結果に基づき、出力制御部34に出力させる最終的な進行度を決定してもよい。この場合、進行度判定部33は、進行度判定モデルが出力した進行度の推論結果が所定個数分だけ蓄積された場合に、蓄積された所定個数の進行度の推論結果に基づき、出力制御部34に出力させる進行度を決定する。この場合、例えば、進行度判定部33は、所定個数の進行度の推論結果を集計し、最も確からしいと判定された頻度が最も高い進行度のクラス(即ち多数決により定めたクラス)を、最終的な進行度として判定する。この場合、進行度判定部33が出力する進行度判定結果は、例えば、多数決により定めたクラスと、当該クラスの確信度の平均値(又は平均値以外の代表値)とを含む。 Note that, instead of determining the final progress level to be output by the output control unit 34 based on the progress inference result that the progress determination model calculates from a single endoscopic image Ia, the progress determination unit 33 may determine the final progress level to be output by the output control unit 34 based on a predetermined number (two or more) of progress inference results based on the same predetermined number of endoscopic images Ia. In this case, when a predetermined number of progress inference results output by the progress determination model have been accumulated, the progress determination unit 33 determines the progress level to be output by the output control unit 34 based on the accumulated predetermined number of progress inference results. In this case, for example, the progress determination unit 33 counts the predetermined number of progress inference results, and determines the progress class that was most frequently judged to be most likely (i.e., the class determined by majority vote) as the final progress level. In this case, the progress determination result output by the progress determination unit 33 includes, for example, the class determined by majority vote and the average value of the confidence of the class (or a representative value other than the average value).
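The multi-image aggregation described in this paragraph — accumulate a predetermined number of per-image inference results, take the class judged most likely most often, and report the mean confidence of that class — can be sketched as follows. The (grade, confidence) tuple format is an assumption for illustration.

```python
from collections import Counter

def aggregate_progression(results):
    """Aggregate a predetermined number of per-image inference results.

    `results` is a list of (grade, confidence) pairs; the final grade is
    decided by majority vote, and the reported confidence is the mean
    confidence over the results that voted for the winning grade."""
    votes = Counter(grade for grade, _ in results)
    winner, _ = votes.most_common(1)[0]
    confs = [conf for grade, conf in results if grade == winner]
    return winner, sum(confs) / len(confs)
```

Averaging only the confidences of the winning class matches the example above, in which the determination result contains the majority-vote class together with the average confidence for that class.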
 The output control unit 34 generates display information Ib based on the latest endoscopic image Ia supplied from the endoscopic image acquisition unit 30, the lesion detection result output by the lesion detection unit 31, and the progression determination result output by the progression determination unit 33. The output control unit 34 then supplies the generated display information Ib to the display device 2, causing it to display the latest endoscopic image Ia, the lesion detection result, the progression determination result, and so on. The output control unit 34 may also control the sound output unit 16 so as to output, based on the lesion detection result, a warning sound, voice guidance, or the like notifying the user that a lesion site has been detected.
 The output control unit 34 also outputs, via the display device 2 or the sound output unit 16, a suggestion for capturing with the endoscope scope 3 an endoscopic image Ia suitable for judging the degree of progression, based on the unsuitable determination results supplied from the suitability determination unit 32. For example, the output control unit 34 outputs this suggestion when unsuitable determination results have been generated consecutively a predetermined number of times or for a predetermined period without any suitable determination result being generated.
 In a first example of the suggestion, the output control unit 34 outputs information prompting the user to move the imaging position closer to the lesion region so as to obtain an endoscopic image Ia in which the lesion region is more magnified. For example, the output control unit 34 displays or announces the message "Please move the camera closer to the lesion."
 In a second example, the output control unit 34 outputs information indicating a target range for the lesion region on the endoscopic image Ia. For example, the output control unit 34 displays a frame indicating the range within which the lesion region should preferably appear, superimposed on the latest endoscopic image Ia. The range on the endoscopic image Ia in which this frame is displayed may be a default range stored in advance in the memory 12 or the like, or a range whose shape and/or size is adjusted based on the lesion detection result. A concrete form of the second example is described in detail with the display example of FIG. 6 below.
 If, even after outputting the suggestion, the output control unit 34 determines that unsuitable determination results are still being generated consecutively a predetermined number of times or for a predetermined period without any suitable determination result, it displays or announces information indicating that the degree of progression cannot be judged.
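The two-stage behavior above — suggest after a streak of unsuitable judgments, then report that progression cannot be judged if the streak continues — can be sketched as a small state holder. This is an illustrative assumption about one possible implementation (class and parameter names are invented; the disclosure counts either occurrences or elapsed time):

```python
class SuggestionController:
    """Tracks consecutive unsuitable-image judgments (hypothetical sketch).

    After `suggest_after` consecutive unsuitable results, a capture hint
    is issued; if the streak continues for another `give_up_after`
    results, the controller reports that progression cannot be judged.
    Any suitable result resets the streak.
    """

    def __init__(self, suggest_after=5, give_up_after=5):
        self.suggest_after = suggest_after
        self.give_up_after = give_up_after
        self.streak = 0

    def update(self, suitable):
        if suitable:
            self.streak = 0               # a suitable image resets the streak
            return None
        self.streak += 1
        if self.streak == self.suggest_after:
            return "suggest"              # e.g. "move the camera closer"
        if self.streak == self.suggest_after + self.give_up_after:
            return "cannot_judge"         # progression cannot be determined
        return None
```

A time-based variant would compare timestamps instead of counts; the reset-on-success behavior would be the same.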
 The output control unit 34 may also output a suggestion based on the progression determination result output by the progression determination unit 33. For example, when the confidence of the most likely progression class is below a predetermined threshold, the output control unit 34 may output a suggestion based on the first or second example described above, regardless of whether an unsuitable determination result exists.
 Each of the components described above — the endoscopic image acquisition unit 30, the lesion detection unit 31, the suitability determination unit 32, the progression determination unit 33, and the output control unit 34 — can be realized, for example, by the processor 11 executing a program. Each component may also be realized by recording the necessary program on an arbitrary non-volatile storage medium and installing it as needed. At least some of these components need not be realized by program-based software alone; they may be realized by any combination of hardware, firmware, and software. At least some of them may also be realized using a user-programmable integrated circuit such as an FPGA (Field-Programmable Gate Array) or a microcontroller, in which case the integrated circuit may implement a program made up of the components above. At least some of the components may also be configured by an ASSP (Application Specific Standard Product), an ASIC (Application Specific Integrated Circuit), or a quantum processor (quantum computer control chip). Thus, each component may be realized by various kinds of hardware; the same applies to the other embodiments described later. Furthermore, these components may be realized through the cooperation of multiple computers, for example using cloud computing technology.
 (5) Progression Suitability Determination
 The progression suitability determination is now described in concrete terms.
 For example, when performing the progression suitability determination based on the size of the lesion region, the suitability determination unit 32 generates a suitable determination result when the size of the lesion region is equal to or greater than a predetermined size, and an unsuitable determination result when it is smaller. The predetermined size is, for example, a default value defined as the size necessary for accurately judging the degree of progression, stored in advance in the memory 12 or the like. In the first output mode, the size of the lesion region is, for example, the size (e.g., the number of pixels) of the connected region of unit regions whose lesion reliability is equal to or greater than a first threshold. In the second output mode, the size of the lesion region is, for example, the size of the bounding box.
 In another example, when performing the suitability determination based on the reliability of the lesion region, the suitability determination unit 32 generates a suitable determination result when the reliability of the lesion region is equal to or greater than a predetermined threshold (also called the "second threshold"), and an unsuitable determination result when it is below the second threshold. In the first output mode, the reliability of the lesion region is, for example, the mean, median, or another representative value of the lesion reliabilities of the unit regions constituting the lesion region. In the second output mode, it is, for example, the reliability associated with the bounding box. The second threshold is, for example, a default value defined as the reliability necessary for accurately judging the degree of progression, stored in advance in the memory 12 or the like. In the first output mode, the second threshold is preferably set to a value equal to or greater than the first threshold against which each unit region's lesion reliability is compared when deciding whether it belongs to the lesion region.
 In yet another example, when performing the progression suitability determination based on both the size and the reliability of the lesion region, the suitability determination unit 32 generates a suitable determination result when the size of the lesion region is equal to or greater than the predetermined size and its reliability is equal to or greater than the second threshold. Conversely, it generates an unsuitable determination result when the size is below the predetermined size or the reliability is below the second threshold.
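The combined size-and-reliability criterion above amounts to a conjunction of two threshold tests. A minimal sketch (threshold values are illustrative defaults, not taken from the disclosure):

```python
def is_suitable(lesion_pixels, confidence,
                min_pixels=4000, min_confidence=0.7):
    """Progression-suitability check on a detected lesion region.

    Returns True (suitable determination) only when the region is at
    least `min_pixels` large AND its reliability is at least
    `min_confidence`; otherwise the image is judged unsuitable.
    `lesion_pixels` would be the connected-region pixel count in the
    first output mode, or the bounding-box area in the second.
    """
    return lesion_pixels >= min_pixels and confidence >= min_confidence
```

The size-only and reliability-only variants described earlier correspond to dropping one of the two conditions.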
 Preferably, the suitability determination unit 32 may also change the criteria used for the progression suitability determination based on the progression determination result produced by the progression determination unit 33 (i.e., the information obtained by judging the degree of progression). These criteria correspond to at least one of the predetermined size compared against the size of the lesion region and the second threshold compared against its reliability.
 For example, the suitability determination unit 32 changes the criteria based on the confidence of the progression class judged to be most likely by the progression determination unit 33. In this case, when that confidence is below a predetermined threshold, the suitability determination unit 32 tightens the criteria: it raises the predetermined size and/or the second threshold by a predetermined value or a predetermined percentage. This makes the criterion for judging an endoscopic image Ia suitable for progression assessment stricter, encouraging the assessment to be performed on more carefully selected images. The predetermined threshold, value, and percentage are, for example, default values stored in advance in the memory 12 or the like. The predetermined size is an example of a "first criterion," and the second threshold is an example of a "second criterion."
 In another example, the suitability determination unit 32 changes the criteria based on how the progression class judged most likely varies over time. For instance, when the most likely class does not match across a predetermined number of time-series determination results, the suitability determination unit 32 judges the results to be fluctuating (i.e., unstable) and tightens the criteria. Alternatively, it tallies the most likely classes over a predetermined number of time-series results and tightens the criteria when the proportion of the most frequent class falls below a predetermined ratio.
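The frequency-based tightening rule just described can be sketched as follows. This is a hypothetical illustration (the stability ratio and the tightening steps are invented defaults; the disclosure only says the criteria become stricter by a predetermined value or percentage):

```python
from collections import Counter

def tighten_if_unstable(history, min_pixels, min_confidence,
                        stable_ratio=0.8, size_step=1.1, conf_step=0.05):
    """Adjust suitability criteria from recent progression classes.

    `history` holds the most-likely class from each of the last N
    determinations. If the most frequent class accounts for less than
    `stable_ratio` of them, the results are treated as unstable and
    both criteria are tightened: the size criterion is scaled up and
    the reliability threshold is raised (capped at 1.0).
    Returns the (possibly updated) pair of criteria.
    """
    if not history:
        return min_pixels, min_confidence
    _, top_count = Counter(history).most_common(1)[0]
    if top_count / len(history) < stable_ratio:
        min_pixels = int(min_pixels * size_step)
        min_confidence = min(1.0, min_confidence + conf_step)
    return min_pixels, min_confidence
```

The confidence-based variant in the preceding paragraph would apply the same tightening step when the winning class's confidence falls below a threshold instead.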
 (6) Display Examples
 FIG. 5 shows a first display example presented by the display device 2 during an endoscopic examination. It illustrates the display screen shown when the progression determination unit 33 has generated a determination result for the degree of progression (here, the depth of invasion).
 The output control unit 34 of the image processing device 1 outputs to the display device 2 the display information Ib generated from the latest endoscopic image Ia supplied by the endoscopic image acquisition unit 30, the lesion detection result output by the lesion detection unit 31, and the progression determination result output by the progression determination unit 33. By transmitting the display information Ib to the display device 2, the output control unit 34 causes it to show the screen described above.
 In the first display example shown in FIG. 5, the output control unit 34 of the image processing device 1 provides a real-time image display area 70, a lesion detection result display area 71, and an invasion depth determination result display area 72 on the display screen.
 In the real-time image display area 70, the output control unit 34 displays a moving image representing the latest endoscopic image Ia. In the lesion detection result display area 71, it displays the lesion detection result from the lesion detection unit 31. At the time the screen in FIG. 5 is displayed, the lesion detection unit 31 has generated a lesion reliability map as the detection result, so the output control unit 34 shows an image based on that map (here, a mask image indicating the lesion region) in the lesion detection result display area 71. When the lesion detection result is instead a bounding box, the output control unit 34 displays, for example, an image in which the bounding box is superimposed on the latest endoscopic image Ia in the real-time image display area 70 or the lesion detection result display area 71.
 The output control unit 34 may additionally display a text message in the lesion detection result display area 71 or elsewhere indicating that a lesion is likely present, and may output a sound (including voice) to the same effect via the sound output unit 16.
 In the invasion depth determination result display area 72, the output control unit 34 displays the determination result for the degree of progression (here, the depth of invasion) from the progression determination unit 33. In this example, the progression determination unit 33 has judged "T3" to be the most likely invasion depth class, and the output control unit 34 displays "T3" in the invasion depth determination result display area 72. The output control unit 34 may also announce the judged invasion depth class via the sound output unit 16.
 In the first display example, the progression determination unit 33 judges the depth of invasion from endoscopic images Ia selected by the suitability determination unit 32 as suitable for assessing the lesion's progression. The output control unit 34 can therefore present a highly accurate invasion depth result to the examiner.
 FIG. 6 shows a second display example presented by the display device 2 during an endoscopic examination. It illustrates the screen shown when the suitability determination unit 32 has generated unsuitable determination results consecutively and the output control unit 34 therefore suggests how to capture the image. As in the first display example, the output control unit 34 of the image processing device 1 provides a real-time image display area 70, a lesion detection result display area 71, and an invasion depth determination result display area 72 on the screen, and displays a mask image based on the lesion detection result in the lesion detection result display area 71.
 In the second display example, as a suggestion for capturing with the endoscope scope 3 an endoscopic image Ia suitable for progression assessment, the output control unit 34 superimposes on the endoscopic image Ia in the real-time image display area 70 a frame 73 representing the preferred target range (i.e., target position and size) of the lesion region. Because the depth of invasion has not yet been determined, the output control unit 34 displays in the invasion depth determination result display area 72, instead of a determination result, a message urging the examiner to adjust the view so that the lesion region falls within the frame 73. The examiner handling the endoscope scope 3 then operates it so that the outer edge of the lesion region shown in the lesion detection result display area 71 comes to overlap the frame 73. In this case, the information shown in the lesion detection result display area 71 may also be superimposed on the endoscopic image Ia in the real-time image display area 70.
 The output control unit 34 may determine the size and/or shape of the frame 73 based on the lesion detection result. For example, the larger the lesion region detected by the lesion detection unit 31, the larger the output control unit 34 may make the frame 73. In another example, the output control unit 34 may recognize the shape of the detected lesion region and set the shape of the frame 73 (e.g., its aspect ratio) accordingly; for instance, it may approximate the lesion region with a figure such as an ellipse or rectangle and display a frame 73 following that figure.
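One way to derive such a frame — sizing and shaping it from the detected region — is to take the bounding rectangle of the lesion mask and enlarge it around its center. This is a hedged sketch of that idea only (the `scale` margin and the rectangle approximation are assumptions; the disclosure also allows elliptical approximation):

```python
import numpy as np

def target_frame(mask, scale=1.5):
    """Derive a target frame from a binary lesion mask (illustrative).

    The lesion region is approximated by its axis-aligned bounding
    rectangle, which is then enlarged by `scale` around its center to
    give the frame within which the lesion should be captured.
    Returns (top, left, bottom, right) in pixel coordinates.
    """
    ys, xs = np.nonzero(mask)                 # pixels of the lesion region
    cy, cx = ys.mean(), xs.mean()             # region center
    h = (ys.max() - ys.min() + 1) * scale     # enlarged height
    w = (xs.max() - xs.min() + 1) * scale     # enlarged width
    return (int(cy - h / 2), int(cx - w / 2),
            int(cy + h / 2), int(cx + w / 2))
```

A rendering step would then clip the frame to the image bounds and draw it over the real-time view.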
 As described above, according to the second display example, when an endoscopic image Ia suitable for judging the degree of progression (here, the depth of invasion) cannot be obtained, the image processing device 1 can issue a capture suggestion and thereby promote obtaining a suitable endoscopic image Ia.
 (7) Processing Flow
 FIG. 7 is an example of a flowchart outlining the processing executed by the image processing device 1 during an endoscopic examination in the first embodiment.
 First, the image processing device 1 acquires an endoscopic image Ia (step S11). Here, the endoscopic image acquisition unit 30 of the image processing device 1 receives the endoscopic image Ia from the endoscope scope 3 via the interface 13.
 Next, the image processing device 1 performs lesion detection on the endoscopic image Ia acquired in step S11 (step S12). Here, the image processing device 1 inputs the endoscopic image Ia to the lesion detection model configured with reference to the lesion detection model information D1 and obtains the lesion detection result output by the model.
 The image processing device 1 then determines whether a lesion region has been detected in the endoscopic image Ia (step S13). If it determines that a lesion region has been detected (step S13; Yes), it performs the progression suitability determination for that image (step S14), based on at least one of the size and the reliability of the detected lesion region. If it determines that no lesion region has been detected (step S13; No), it proceeds to step S17 without performing either the progression suitability determination or the progression assessment.
 If the endoscopic image Ia is judged suitable for progression assessment (step S15; Yes), the image processing device 1 assesses the progression of the lesion (step S16). For example, it inputs the endoscopic image Ia to the progression determination model configured with reference to the progression determination model information D2 and obtains the inference result for the degree of progression that the model outputs.
 The image processing device 1 then displays on the display device 2 information based on the endoscopic image Ia obtained in step S11, the lesion detection result generated in step S12, and the progression determination result generated in step S16 (step S17). If it determined in step S13 that no lesion region was detected, it displays in step S17, for example, the endoscopic image Ia together with information indicating that no lesion region was detected.
 If, on the other hand, the endoscopic image Ia is judged not suitable for progression assessment (step S15; No), the image processing device 1 outputs a suggestion for capturing an image that is suitable (step S19). The image processing device 1 may execute step S19 only when step S15 has judged images unsuitable a predetermined number of times or for a predetermined period in succession; until that count or duration is reached, it performs, for example, the processing of displaying the endoscopic image Ia and the lesion detection result without outputting a suggestion.
 After step S17 or step S19, the image processing device 1 determines whether the endoscopic examination has ended (step S18), for example by detecting a predetermined input to the input unit 14 or the operation unit 36. If it determines that the examination has ended (step S18; Yes), the processing of the flowchart ends; if not (step S18; No), processing returns to step S11.
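The per-frame loop of steps S11-S18 can be sketched as follows. This is a hypothetical skeleton only: the callables stand in for the detection, suitability, determination, and output units described in the text, and their names are invented:

```python
def process_stream(images, detect, is_suitable, judge, display, suggest):
    """Per-frame loop mirroring flowchart steps S11-S18 (sketch).

    `detect` returns a lesion detection result or None (S12/S13);
    `is_suitable` performs the progression suitability determination
    (S14/S15); `judge` infers the degree of progression (S16);
    `display` and `suggest` stand in for the output control (S17/S19).
    """
    for image in images:                         # S11: acquire image
        lesion = detect(image)                   # S12: lesion detection
        if lesion is None:                       # S13: no lesion region
            display(image, None, None)           # S17: image only
            continue
        if is_suitable(lesion):                  # S14/S15: suitability
            progression = judge(image)           # S16: progression
            display(image, lesion, progression)  # S17: full result
        else:
            suggest(image)                       # S19: capture suggestion
```

In the real device, the loop would also check the end-of-examination condition (S18) and could defer the suggestion until unsuitable results have repeated, as described above.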
 (8) Modifications
 Preferred modifications of the first embodiment are described next. The following modifications may also be applied to the first embodiment in combination.
 (Modification 1)
 The image processing device 1 may, after the examination, process a video composed of the endoscopic images Ia generated during the endoscopic examination.
 For example, when a video to be processed is designated at an arbitrary time after the examination, based on user input via the input unit 14 or the like, the image processing device 1 sequentially applies the processing of the flowchart in FIG. 7 to the time-series endoscopic images Ia constituting that video. When it determines in step S18 that the target video has ended, it ends the flowchart processing; otherwise, it returns to step S11 and processes the next endoscopic image Ia in the time series.
 (Modification 2)
 The suitability determination unit 32 may perform the progression suitability determination based on a reliability calculated separately from the reliability output by the lesion detection model.
 In this case, for example, when the lesion detection unit 31 detects a lesion region, the suitability determination unit 32 calculates a reliability from the endoscopic image Ia containing the detected region. For example, it uses a model that, given an endoscopic image, outputs a reliability score indicating that a lesion region exists in that image. Such a model is based on machine learning, for example a neural network, with trained parameters stored in advance in the memory 12 or the like. The model may also be a classification model performing a binary classification of whether a lesion region exists; in that case, the suitability determination unit 32 takes the confidence that the classification model outputs for the class in which a lesion region exists as the reliability.
 With this modification as well, the suitability determination unit 32 can suitably perform the progression suitability determination.
 (Modification 3)
 The lesion detection model information D1 and the progression determination model information D2 may be stored in a storage device separate from the image processing device 1.
 FIG. 8 is a schematic configuration diagram of an endoscopic examination system 100A according to this modification. For simplicity, the display device 2, the endoscope scope 3, and so on are not shown. The endoscopic examination system 100A includes a server device 4 that stores the lesion detection model information D1 and the progression determination model information D2, as well as multiple image processing devices 1 (1A, 1B, ...) capable of data communication with the server device 4 via a network.
 この場合、各画像処理装置1は、ネットワークを介して病変検出モデル情報D1及び進行度判定モデル情報D2の参照を行う。この場合、各画像処理装置1のインターフェース13は、通信を行うためのネットワークアダプタなどの通信インターフェースを含む。この構成では、各画像処理装置1は、上述の実施形態と同様、病変検出モデル情報D1及び進行度判定モデル情報D2を参照し、病変の進行度の判定に関する処理を好適に実行することができる。なお、サーバ装置4が図4に示される画像処理装置1のプロセッサ11の各機能ブロックが実行する処理の少なくとも一部を代わりに実行してもよい。 In this case, each image processing device 1 refers to the lesion detection model information D1 and the progression determination model information D2 via the network. In this case, the interface 13 of each image processing device 1 includes a communication interface such as a network adapter for communication. In this configuration, each image processing device 1 can refer to the lesion detection model information D1 and the progression determination model information D2, as in the above-mentioned embodiment, and suitably execute processing related to determining the progression of the lesion. Note that the server device 4 may instead execute at least a part of the processing executed by each functional block of the processor 11 of the image processing device 1 shown in FIG. 4.
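The choice between holding the model information in the device's own memory and referencing it on the server device 4 can be sketched as a single storage interface shared by both configurations. All names here are hypothetical, and the `fetch` callback merely stands in for a real network client; the text does not prescribe any API.

```python
class ModelStore:
    """Common interface for resolving model information, whether it is
    held locally or on a separate server device."""
    def get(self, key):
        raise NotImplementedError

class LocalStore(ModelStore):
    """Model information kept in the device's own memory (as in the
    first embodiment)."""
    def __init__(self, records):
        self._records = dict(records)
    def get(self, key):
        return self._records[key]

class RemoteStore(ModelStore):
    """Model information referenced over the network (as in this
    variation); `fetch` stands in for a real network client."""
    def __init__(self, fetch):
        self._fetch = fetch
    def get(self, key):
        return self._fetch(key)

def load_models(store):
    """Resolve the lesion detection model information (D1) and the
    progression determination model information (D2) through the same
    interface, regardless of where they are stored."""
    return store.get("D1"), store.get("D2")
```

With this design each image processing device 1 runs unchanged whether D1 and D2 are local or served over the network.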
 <第2実施形態>
 図9は、第2実施形態における画像処理装置1Xのブロック図である。画像処理装置1Xは、取得手段30Xと、検出手段31Xと、第1判定手段32Xと、第2判定手段33Xと、を備える。画像処理装置1Xは、複数の装置から構成されてもよい。
Second Embodiment
FIG. 9 is a block diagram of an image processing device 1X according to the second embodiment. The image processing device 1X includes an acquisition unit 30X, a detection unit 31X, a first determination unit 32X, and a second determination unit 33X. The image processing device 1X may be composed of a plurality of devices.
 取得手段30Xは、被検体を撮影した内視鏡画像を取得する。取得手段30Xは、例えば、第1実施形態(変形例を含む、以下同じ。)における内視鏡画像取得部30とすることができる。なお、取得手段30Xは、撮影部が生成した内視鏡画像を即時に取得してもよく、予め撮影部が生成して記憶装置に記憶された内視鏡画像を、所定のタイミングにおいて取得してもよい。 The acquisition means 30X acquires an endoscopic image of the subject. The acquisition means 30X can be, for example, the endoscopic image acquisition section 30 in the first embodiment (including modified examples, the same applies below). The acquisition means 30X may instantly acquire an endoscopic image generated by the imaging section, or may acquire an endoscopic image generated in advance by the imaging section and stored in a storage device at a predetermined timing.
 検出手段31Xは、内視鏡画像に基づき、内視鏡画像における被検体の病変の候補領域である病変領域を検出する。検出手段31Xは、例えば、第1実施形態における病変検出部31とすることができる。 The detection means 31X detects a lesion area, which is a candidate area for a lesion in the subject in the endoscopic image, based on the endoscopic image. The detection means 31X can be, for example, the lesion detection unit 31 in the first embodiment.
 第1判定手段32Xは、病変領域の大きさ又は病変領域の病変らしさに関する信頼度の少なくとも一方に基づき、内視鏡画像が病変の進行度又は深達度の判定に適した画像であるか否か判定する。第1判定手段32Xは、例えば、第1実施形態における適否判定部32とすることができる。 The first determination means 32X determines whether or not the endoscopic image is suitable for determining the progression or depth of a lesion based on at least one of the size of the lesion area or the reliability of the lesion area as a lesion. The first determination means 32X can be, for example, the suitability determination unit 32 in the first embodiment.
 第2判定手段33Xは、進行度又は深達度の判定に適した画像と判定された内視鏡画像に基づき、進行度又は深達度を判定する。第2判定手段33Xは、例えば、第1実施形態における進行度判定部33とすることができる。 The second determination means 33X determines the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion. The second determination means 33X can be, for example, the degree of progression determination unit 33 in the first embodiment.
 図10は、第2実施形態における処理手順を示すフローチャートの一例である。取得手段30Xは、被検体を撮影した内視鏡画像を取得する(ステップS21)。次に、検出手段31Xは、内視鏡画像に基づき、内視鏡画像における被検体の病変の候補領域である病変領域を検出する(ステップS22)。第1判定手段32Xは、病変領域の大きさ又は病変領域の病変らしさに関する信頼度の少なくとも一方に基づき、内視鏡画像が病変の進行度又は深達度の判定に適した画像であるか否か判定する(ステップS23)。第2判定手段33Xは、進行度又は深達度の判定に適した画像と判定された内視鏡画像に基づき、進行度又は深達度を判定する(ステップS24)。 FIG. 10 is an example of a flowchart showing the processing procedure in the second embodiment. The acquisition means 30X acquires an endoscopic image of the subject (step S21). Next, the detection means 31X detects a lesion area in the endoscopic image that is a candidate area for a lesion in the subject based on the endoscopic image (step S22). The first determination means 32X determines whether the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area and the reliability of the lesion area as a lesion (step S23). The second determination means 33X determines the progression or depth based on the endoscopic image determined to be suitable for determining the progression or depth (step S24).
 第2実施形態によれば、画像処理装置1Xは、選定された内視鏡画像に基づき、進行度又は深達度を的確に判定することができる。 According to the second embodiment, the image processing device 1X can accurately determine the progression or depth of invasion based on the selected endoscopic image.
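The processing procedure of steps S21 to S24 can be sketched as follows. The threshold values and the AND-combination of the two criteria are illustrative assumptions: the text states only that at least one of the lesion area's size and its reliability is used, and leaves the concrete criteria open. The `detect` and `grade` callables stand in for the trained detection and progression-determination models.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class Lesion:
    bbox: Tuple[int, int, int, int]  # (x, y, width, height) in pixels
    reliability: float               # lesion-likeness confidence in [0, 1]

# Illustrative thresholds only; the concrete criteria are implementation-defined.
MIN_AREA = 32 * 32
MIN_RELIABILITY = 0.8

def is_suitable(lesion: Lesion) -> bool:
    """Step S23: judge suitability from the lesion area's size and its
    reliability. Here both criteria are required together, though using
    either one alone would also match the text."""
    _, _, w, h = lesion.bbox
    return w * h >= MIN_AREA and lesion.reliability >= MIN_RELIABILITY

def process(image,
            detect: Callable[[object], Optional[Lesion]],
            grade: Callable[[object, Lesion], str]) -> Optional[str]:
    """Steps S21-S24: detect a lesion candidate, check suitability, and
    only then determine the progression/invasion-depth class."""
    lesion = detect(image)                         # S22
    if lesion is None or not is_suitable(lesion):  # S23
        return None          # unsuitable image: skip grading
    return grade(image, lesion)                    # S24
```

Gating the grading model on the suitability check in this way is what lets the device output a progression class only for images on which that class can be determined accurately.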
 なお、上述した各実施形態において、プログラムは、様々なタイプの非一時的なコンピュータ可読媒体(Non-transitory computer readable medium)を用いて格納され、コンピュータであるプロセッサ等に供給することができる。非一時的なコンピュータ可読媒体は、様々なタイプの実体のある記憶媒体(Tangible storage medium)を含む。非一時的なコンピュータ可読媒体の例は、磁気記憶媒体(例えばフレキシブルディスク、磁気テープ、ハードディスクドライブ)、光磁気記憶媒体(例えば光磁気ディスク)、CD-ROM(Read Only Memory)、CD-R、CD-R/W、半導体メモリ(例えば、マスクROM、PROM(Programmable ROM)、EPROM(Erasable PROM)、フラッシュROM、RAM(Random Access Memory))を含む。また、プログラムは、様々なタイプの一時的なコンピュータ可読媒体(Transitory computer readable medium)によってコンピュータに供給されてもよい。一時的なコンピュータ可読媒体の例は、電気信号、光信号、及び電磁波を含む。一時的なコンピュータ可読媒体は、電線及び光ファイバ等の有線通信路、又は無線通信路を介して、プログラムをコンピュータに供給できる。 In each of the above-described embodiments, the program can be stored using various types of non-transitory computer readable media and supplied to a computer, such as a processor. Non-transitory computer readable media include various types of tangible storage media. Examples of non-transitory computer readable media include magnetic storage media (e.g., floppy disks, magnetic tapes, hard disk drives), magneto-optical storage media (e.g., magneto-optical disks), CD-ROMs (Read Only Memory), CD-Rs, CD-R/Ws, and semiconductor memories (e.g., mask ROMs, PROMs (Programmable ROMs), EPROMs (Erasable PROMs), flash ROMs, and RAMs (Random Access Memory)). The program may also be supplied to the computer by various types of transitory computer readable media. Examples of transitory computer readable media include electrical signals, optical signals, and electromagnetic waves. Transitory computer readable media can supply the program to the computer via wired communication paths such as electric wires and optical fibers, or via wireless communication paths.
 その他、上記の各実施形態(変形例を含む、以下同じ)の一部又は全部は、以下の付記のようにも記載され得るが以下には限られない。 In addition, all or part of the above-described embodiments (including modified examples, the same applies below) can be described as, but are not limited to, the following notes.
 [付記1]
 被検体を撮影した内視鏡画像を取得する取得手段と、
 前記内視鏡画像に基づき、前記内視鏡画像における前記被検体の病変の候補領域である病変領域を検出する検出手段と、
 前記病変領域の大きさ又は前記病変領域の前記病変らしさに関する信頼度の少なくとも一方に基づき、前記内視鏡画像が前記病変の進行度又は深達度の判定に適した画像であるか否か判定する第1判定手段と、
 前記進行度又は深達度の判定に適した画像と判定された前記内視鏡画像に基づき、前記進行度又は深達度を判定する第2判定手段と、
を有する画像処理装置。
 [付記2]
 前記第1判定手段は、前記進行度又は深達度の判定により得られた情報に基づき、前記進行度又は深達度の判定に適した画像であるか否かの判定に用いる基準を変更する、付記1に記載の画像処理装置。
 [付記3]
 前記第1判定手段は、前記第2判定手段により判定された前記進行度又は深達度のクラスに対する確信の度合いに基づき、前記基準を変更する、付記2に記載の画像処理装置。
 [付記4]
 前記第1判定手段は、前記第2判定手段により判定された前記進行度又は深達度のクラスの時系列での変化の度合いに基づき、前記基準を変更する、付記3に記載の画像処理装置。
 [付記5]
 前記基準は、前記病変領域の大きさに関する第1基準と、前記信頼度に関する第2基準との少なくとも一方である、付記3に記載の画像処理装置。
 [付記6]
 前記進行度又は深達度の判定に適した画像でないと判定された場合に、前記内視鏡画像の撮影に関する示唆を表示装置又は音声出力装置により出力する出力制御手段を有する、付記1に記載の画像処理装置。
 [付記7]
 前記出力制御手段は、前記内視鏡画像上での前記病変領域の目標となる範囲を示す情報を出力する、付記6に記載の画像処理装置。
 [付記8]
 前記出力制御手段は、前記検出手段による前記病変領域の検出結果に基づき、前記範囲の形状又は大きさの少なくとも一方を決定する、付記7に記載の画像処理装置。
 [付記9]
 前記出力制御手段は、前記示唆として、前記病変領域に撮影位置を近づけることを促す情報を出力する、付記6に記載の画像処理装置。
 [付記10]
 コンピュータが、
 被検体を撮影した内視鏡画像を取得し、
 前記内視鏡画像に基づき、前記内視鏡画像における前記被検体の病変の候補領域である病変領域を検出し、
 前記病変領域の大きさ又は前記病変領域の前記病変らしさに関する信頼度の少なくとも一方に基づき、前記内視鏡画像が前記病変の進行度又は深達度の判定に適した画像であるか否か判定し、
 前記進行度又は深達度の判定に適した画像と判定された前記内視鏡画像に基づき、前記進行度又は深達度を判定する、
画像処理方法。
 [付記11]
 被検体を撮影した内視鏡画像を取得し、
 前記内視鏡画像に基づき、前記内視鏡画像における前記被検体の病変の候補領域である病変領域を検出し、
 前記病変領域の大きさ又は前記病変領域の前記病変らしさに関する信頼度の少なくとも一方に基づき、前記内視鏡画像が前記病変の進行度又は深達度の判定に適した画像であるか否か判定し、
 前記進行度又は深達度の判定に適した画像と判定された前記内視鏡画像に基づき、前記進行度又は深達度を判定する処理をコンピュータに実行させるプログラムを格納した記憶媒体。
[Appendix 1]
An acquisition means for acquiring an endoscopic image of a subject;
a detection means for detecting a lesion area, which is a candidate area for a lesion of the subject in the endoscopic image, based on the endoscopic image;
a first determination means for determining whether or not the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area and the reliability of the lesion area with respect to the lesion-likeness;
a second determination means for determining the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion;
An image processing device comprising:
[Appendix 2]
The image processing device described in Appendix 1, wherein the first judgment means changes the criteria used to judge whether an image is suitable for judging the degree of progression or depth of invasion based on information obtained by judging the degree of progression or depth of invasion.
[Appendix 3]
The image processing device described in Appendix 2, wherein the first determination means changes the criterion based on a degree of confidence for the class of the progression or depth of invasion determined by the second determination means.
[Appendix 4]
The image processing device described in Appendix 3, wherein the first determination means changes the criterion based on a degree of change over time in the class of the progression or depth of invasion determined by the second determination means.
[Appendix 5]
The image processing device described in Appendix 3, wherein the criterion is at least one of a first criterion related to the size of the lesion area and a second criterion related to the reliability.
[Appendix 6]
An image processing device as described in Appendix 1, having an output control means for outputting suggestions regarding the capture of the endoscopic image via a display device or audio output device when it is determined that the image is not suitable for determining the progression or depth of invasion.
[Appendix 7]
The image processing device described in Appendix 6, wherein the output control means outputs information indicating a target range of the lesion area on the endoscopic image.
[Appendix 8]
The image processing device described in Appendix 7, wherein the output control means determines at least one of the shape and size of the range based on the detection result of the lesion area by the detection means.
[Appendix 9]
The image processing device described in Appendix 6, wherein the output control means outputs, as the suggestion, information encouraging the imaging position to be brought closer to the lesion area.
[Appendix 10]
The computer
An endoscopic image of the subject is acquired,
detecting a lesion area in the endoscopic image, the lesion area being a candidate area for a lesion in the subject based on the endoscopic image;
determining whether the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area with respect to the lesion-likeness;
determining the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion;
An image processing method.
[Appendix 11]
An endoscopic image of the subject is acquired,
detecting a lesion area in the endoscopic image, the lesion area being a candidate area for a lesion in the subject based on the endoscopic image;
determining whether the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area with respect to the lesion-likeness;
A storage medium storing a program for causing a computer to execute a process for determining the degree of progression or depth of invasion based on the endoscopic image that has been determined to be an image suitable for determining the degree of progression or depth of invasion.
 以上、実施形態を参照して本願発明を説明したが、本願発明は上記実施形態に限定されるものではない。本願発明の構成や詳細には、本願発明のスコープ内で当業者が理解し得る様々な変更をすることができる。すなわち、本願発明は、請求の範囲を含む全開示、技術的思想にしたがって当業者であればなし得るであろう各種変形、修正を含むことは勿論である。また、引用した上記の特許文献及び非特許文献の各開示は、本書に引用をもって繰り込むものとする。 The present invention has been described above with reference to the embodiments, but the present invention is not limited to the above-mentioned embodiments. Various modifications that a person skilled in the art can understand can be made to the configuration and details of the present invention within the scope of the present invention. In other words, the present invention naturally includes various modifications and amendments that a person skilled in the art could make in accordance with the entire disclosure, including the claims, and the technical ideas. Furthermore, the disclosures of the above cited patent documents and non-patent documents are incorporated by reference into this document.
 1、1A、1B、1X 画像処理装置
 2 表示装置
 3 内視鏡スコープ
 11 プロセッサ
 12 メモリ
 13 インターフェース
 14 入力部
 15 光源部
 16 音出力部
 100、100A 内視鏡検査システム
Reference Signs List
1, 1A, 1B, 1X Image processing device
2 Display device
3 Endoscope
11 Processor
12 Memory
13 Interface
14 Input section
15 Light source section
16 Sound output section
100, 100A Endoscopy system

Claims (11)

  1.  被検体を撮影した内視鏡画像を取得する取得手段と、
     前記内視鏡画像に基づき、前記内視鏡画像における前記被検体の病変の候補領域である病変領域を検出する検出手段と、
     前記病変領域の大きさ又は前記病変領域の前記病変らしさに関する信頼度の少なくとも一方に基づき、前記内視鏡画像が前記病変の進行度又は深達度の判定に適した画像であるか否か判定する第1判定手段と、
     前記進行度又は深達度の判定に適した画像と判定された前記内視鏡画像に基づき、前記進行度又は深達度を判定する第2判定手段と、
    を有する画像処理装置。
    An acquisition means for acquiring an endoscopic image of a subject;
    a detection means for detecting a lesion area, which is a candidate area for a lesion of the subject in the endoscopic image, based on the endoscopic image;
    a first determination means for determining whether or not the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area and the reliability of the lesion area with respect to the lesion-likeness;
    a second determination means for determining the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion;
    An image processing device comprising:
  2.  前記第1判定手段は、前記進行度又は深達度の判定により得られた情報に基づき、前記進行度又は深達度の判定に適した画像であるか否かの判定に用いる基準を変更する、請求項1に記載の画像処理装置。 The image processing device according to claim 1, wherein the first determination means changes the criteria used to determine whether an image is suitable for determining the degree of progression or depth of invasion based on information obtained by the determination of the degree of progression or depth of invasion.
  3.  前記第1判定手段は、前記第2判定手段により判定された前記進行度又は深達度のクラスに対する確信の度合いに基づき、前記基準を変更する、請求項2に記載の画像処理装置。 The image processing device according to claim 2, wherein the first determination means changes the criteria based on the degree of confidence in the class of the progression or depth determined by the second determination means.
  4.  前記第1判定手段は、前記第2判定手段により判定された前記進行度又は深達度のクラスの時系列での変化の度合いに基づき、前記基準を変更する、請求項3に記載の画像処理装置。 The image processing device according to claim 3, wherein the first determination means changes the criteria based on the degree of change over time of the progression or depth of invasion class determined by the second determination means.
  5.  前記基準は、前記病変領域の大きさに関する第1基準と、前記信頼度に関する第2基準との少なくとも一方である、請求項3に記載の画像処理装置。 The image processing device according to claim 3, wherein the criterion is at least one of a first criterion related to the size of the lesion area and a second criterion related to the reliability.
  6.  前記進行度又は深達度の判定に適した画像でないと判定された場合に、前記内視鏡画像の撮影に関する示唆を表示装置又は音声出力装置により出力する出力制御手段を有する、請求項1に記載の画像処理装置。 The image processing device according to claim 1, further comprising an output control means for outputting, via a display device or audio output device, a suggestion regarding the capture of the endoscopic image when the image is determined to be unsuitable for determining the progression or depth of invasion.
  7.  前記出力制御手段は、前記内視鏡画像上での前記病変領域の目標となる範囲を示す情報を出力する、請求項6に記載の画像処理装置。 The image processing device according to claim 6, wherein the output control means outputs information indicating a target range of the lesion area on the endoscopic image.
  8.  前記出力制御手段は、前記検出手段による前記病変領域の検出結果に基づき、前記範囲の形状又は大きさの少なくとも一方を決定する、請求項7に記載の画像処理装置。 The image processing device according to claim 7, wherein the output control means determines at least one of the shape and size of the range based on the detection result of the lesion area by the detection means.
  9.  前記出力制御手段は、前記示唆として、前記病変領域に撮影位置を近づけることを促す情報を出力する、請求項6に記載の画像処理装置。 The image processing device according to claim 6, wherein the output control means outputs, as the suggestion, information encouraging the imaging position to be brought closer to the lesion area.
  10.  コンピュータが、
     被検体を撮影した内視鏡画像を取得し、
     前記内視鏡画像に基づき、前記内視鏡画像における前記被検体の病変の候補領域である病変領域を検出し、
     前記病変領域の大きさ又は前記病変領域の前記病変らしさに関する信頼度の少なくとも一方に基づき、前記内視鏡画像が前記病変の進行度又は深達度の判定に適した画像であるか否か判定し、
     前記進行度又は深達度の判定に適した画像と判定された前記内視鏡画像に基づき、前記進行度又は深達度を判定する、
    画像処理方法。
    The computer
    An endoscopic image of the subject is acquired,
    detecting a lesion area in the endoscopic image, the lesion area being a candidate area for a lesion in the subject based on the endoscopic image;
    determining whether the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area with respect to the lesion-likeness;
    determining the degree of progression or the depth of invasion based on the endoscopic image determined to be an image suitable for determining the degree of progression or the depth of invasion;
    An image processing method.
  11.  被検体を撮影した内視鏡画像を取得し、
     前記内視鏡画像に基づき、前記内視鏡画像における前記被検体の病変の候補領域である病変領域を検出し、
     前記病変領域の大きさ又は前記病変領域の前記病変らしさに関する信頼度の少なくとも一方に基づき、前記内視鏡画像が前記病変の進行度又は深達度の判定に適した画像であるか否か判定し、
     前記進行度又は深達度の判定に適した画像と判定された前記内視鏡画像に基づき、前記進行度又は深達度を判定する処理をコンピュータに実行させるプログラムを格納した記憶媒体。
    An endoscopic image of the subject is acquired,
    detecting a lesion area in the endoscopic image, the lesion area being a candidate area for a lesion in the subject based on the endoscopic image;
    determining whether the endoscopic image is suitable for determining the progression or depth of the lesion based on at least one of the size of the lesion area or the reliability of the lesion area with respect to the lesion-likeness;
    A storage medium storing a program for causing a computer to execute a process for determining the degree of progression or depth of invasion based on the endoscopic image that has been determined to be an image suitable for determining the degree of progression or depth of invasion.
PCT/JP2023/007007 2023-02-27 2023-02-27 Image processing device, image processing method, and storage medium WO2024180593A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/JP2023/007007 WO2024180593A1 (en) 2023-02-27 2023-02-27 Image processing device, image processing method, and storage medium
PCT/JP2023/031840 WO2024180796A1 (en) 2023-02-27 2023-08-31 Image processing device, image processing method, and storage medium
US18/410,361 US20240289959A1 (en) 2023-02-27 2024-01-11 Image processing device, image processing method, and storage medium
US18/410,293 US20240289949A1 (en) 2023-02-27 2024-01-11 Image processing device, image processing method, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/007007 WO2024180593A1 (en) 2023-02-27 2023-02-27 Image processing device, image processing method, and storage medium

Publications (1)

Publication Number Publication Date
WO2024180593A1

Family

ID=92589407

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2023/007007 WO2024180593A1 (en) 2023-02-27 2023-02-27 Image processing device, image processing method, and storage medium
PCT/JP2023/031840 WO2024180796A1 (en) 2023-02-27 2023-08-31 Image processing device, image processing method, and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/031840 WO2024180796A1 (en) 2023-02-27 2023-08-31 Image processing device, image processing method, and storage medium

Country Status (1)

Country Link
WO (2) WO2024180593A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020170791A1 (en) * 2019-02-19 2020-08-27 富士フイルム株式会社 Medical image processing device and method
WO2020174572A1 (en) * 2019-02-26 2020-09-03 オリンパス株式会社 Endoscope device and program
JP2022505205A (en) * 2018-10-19 2022-01-14 武田薬品工業株式会社 Image scoring for intestinal pathology
WO2022014258A1 (en) * 2020-07-17 2022-01-20 富士フイルム株式会社 Processor device and processor device operation method
WO2022185651A1 (en) * 2021-03-04 2022-09-09 Hoya株式会社 Program, information processing method, and information processing device
WO2022209390A1 (en) * 2021-03-31 2022-10-06 富士フイルム株式会社 Endoscope system and operation method of same

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4204185B2 (en) * 2000-11-17 2009-01-07 株式会社リコー Character recognition device, character recognition method, and recording medium
WO2019054045A1 (en) * 2017-09-15 2019-03-21 富士フイルム株式会社 Medical image processing device, medical image processing method, and medical image processing program


Also Published As

Publication number Publication date
WO2024180796A1 (en) 2024-09-06

Similar Documents

Publication Publication Date Title
JP6657480B2 (en) Image diagnosis support apparatus, operation method of image diagnosis support apparatus, and image diagnosis support program
CN110049709B (en) Image processing apparatus
EP1994878B1 (en) Medical image processing device and medical image processing method
WO2012153568A1 (en) Medical image processing device and medical image processing method
CN116745861B (en) Control method, device and recording medium of lesion judgment system obtained through real-time image
JP7485193B2 (en) Image processing device, image processing method, and program
JP6824868B2 (en) Image analysis device and image analysis method
JP4749732B2 (en) Medical image processing device
WO2024180593A1 (en) Image processing device, image processing method, and storage medium
WO2022224446A1 (en) Image processing device, image processing method, and storage medium
WO2023042273A1 (en) Image processing device, image processing method, and storage medium
WO2022185369A1 (en) Image processing device, image processing method, and storage medium
WO2023126999A1 (en) Image processing device, image processing method, and storage medium
WO2023187886A1 (en) Image processing device, image processing method, and storage medium
WO2024084838A1 (en) Image processing device, image processing method, and storage medium
WO2023162216A1 (en) Image processing device, image processing method, and storage medium
US20240289959A1 (en) Image processing device, image processing method, and storage medium
WO2024013848A1 (en) Image processing device, image processing method, and storage medium
WO2023181353A1 (en) Image processing device, image processing method, and storage medium
WO2024075240A1 (en) Image processing device, image processing method, and storage medium
WO2023233453A1 (en) Image processing device, image processing method, and storage medium
WO2024075242A1 (en) Image processing device, image processing method, and storage medium
WO2024142490A1 (en) Image processing device, image processing method, and storage medium
WO2024018581A1 (en) Image processing device, image processing method, and storage medium
WO2024024022A1 (en) Endoscopic examination assistance device, endoscopic examination assistance method, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23925151

Country of ref document: EP

Kind code of ref document: A1