WO2022190366A1 - Shape measurement system for endoscope and shape measurement method for endoscope - Google Patents
Shape measurement system for endoscope and shape measurement method for endoscope
- Publication number
- WO2022190366A1 (PCT/JP2021/010118)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- endoscope
- information
- region
- image
- dimensional shape
- Prior art date
Classifications
- G06T7/50 — Image analysis; depth or shape recovery
- G06T7/62 — Analysis of geometric attributes of area, perimeter, diameter or volume
- A61B1/000094 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, extracting biological structures
- A61B1/000096 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, using artificial intelligence
- A61B1/0004 — Operational features of endoscopes provided with input arrangements for the user, for electronic operation
- A61B1/0005 — Display arrangement combining images, e.g. side-by-side, superimposed or tiled
- A61B1/00193 — Optical arrangements adapted for stereoscopic vision
- A61B1/00194 — Optical arrangements adapted for three-dimensional imaging
- G06T2207/10004 — Still image; photographic image
- G06T2207/10012 — Stereo images
- G06T2207/10028 — Range image; depth image; 3D point clouds
- G06T2207/10068 — Endoscopic image
- G06T2207/20081 — Training; learning
- G06T2207/20104 — Interactive definition of region of interest [ROI]
- G06T2207/30096 — Tumor; lesion
Definitions
- the present disclosure relates to an endoscope shape measurement system and an endoscope shape measurement method.
- Endoscopic submucosal dissection (ESD) is a therapeutic method that removes lesions en bloc using a dedicated instrument, and is attracting attention as a less invasive treatment in place of conventional surgical treatment. Because ESD targets lesions with no risk of lymph node metastasis, and because the metastasis rate correlates with tumor size, it is important to measure the tumor size accurately.
- Patent Literature 2 discloses a technique of determining a reference plane by determining three-dimensional coordinates of a plurality of points on the surface of an object near anomalies such as dents, cracks, and pitting corrosion.
- An object of the present disclosure is to provide a technique for accurately specifying information regarding the size of a lesion contained in an image captured by an endoscope.
- An endoscopic shape measurement system according to the present disclosure includes a processor. The processor displays, on a display device, an endoscopic image of living tissue in a body cavity captured by an endoscope, receives a user operation for setting a region of interest in the endoscopic image, sets the region of interest in the endoscopic image based on the user operation, acquires three-dimensional shape information of the living tissue photographed by the endoscope, derives three-dimensional shape information of a virtual surface in the region of interest from three-dimensional shape information of a region different from the region of interest, and specifies information about the size of the virtual surface from the derived three-dimensional shape information of the virtual surface.
- the endoscopic shape measurement system may have a plurality of processors, and the plurality of processors may work together to perform the above processing.
- An endoscopic shape measurement method according to the present disclosure displays, on a display device, an endoscopic image of living tissue in a body cavity captured by an endoscope, receives a user operation for setting a region of interest in the endoscopic image, sets the region of interest in the endoscopic image based on the user operation, acquires three-dimensional shape information of the living tissue photographed by the endoscope, derives three-dimensional shape information of a virtual surface in the region of interest from three-dimensional shape information of a region different from the region of interest, and specifies information about the size of the virtual surface from the derived three-dimensional shape information of the virtual surface.
- FIGS. 1(a) and (b) are diagrams showing an example of an endoscopic image and the situation during image capture.
- FIG. 2 is a diagram showing the configuration of the endoscope shape measurement system.
- FIG. 3 is a diagram showing the functional blocks of the endoscope shape measurement system.
- FIG. 4 is a diagram showing an example of an endoscopic image displayed on the display device.
- FIG. 5 is a flowchart of the lesion shape measurement processing.
- FIG. 6 is a diagram showing the set region of interest.
- FIG. 7 is a diagram showing the set reference region.
- FIG. 8 is a diagram showing the reference region arranged on the xy plane.
- FIGS. 9(a) and (b) are diagrams showing the three-dimensional shape of the reference region.
- FIG. 10 is a diagram explaining a technique for deriving the three-dimensional shape of the virtual surface.
- FIGS. 11(a) and (b) are diagrams showing the three-dimensional shape of the derived virtual surface.
- FIGS. 12(a) to (c) are diagrams explaining a method of calculating the size of the virtual surface.
- FIG. 13 is a diagram showing an example of a confirmation screen for the region of interest.
- FIG. 14 is a diagram showing an example in which a discontinuity is included in the reference region.
- FIG. 15 is a diagram showing the set exclusion region.
- FIG. 1(a) shows an example of an image of living tissue in a body cavity captured by an endoscope. A lesion 2 is included in the central region, and the living body surface 3, which is the inner wall surface of the digestive tract, surrounds it.
- FIG. 1(b) schematically shows the situation during endoscopic image capture. In this example the lesion 2 is a tumor protruding from the curved living body surface 3, although some lesions are instead recessed into the living body surface 3.
- the length L between points A and B on the lesion 2 is not the length of the straight line segment connecting A and B, but the length along the curved living body surface 3. Since the endoscopic image does not include the living body surface 3 hidden by the lesion 2, the embodiment estimates the three-dimensional shape of the living body surface 3 hidden by the lesion 2 (hereinafter, the "virtual surface") in order to measure the size of the lesion 2.
- FIG. 2 shows the configuration of the endoscope shape measurement system 1 according to the embodiment.
- An endoscopic shape measurement system 1 is provided in a medical facility such as a hospital where endoscopy is performed.
- the endoscope observation device 5, the information processing device 10a, the information processing device 10b, and the image analysis device 8 are connected so that they can communicate with one another via a network 4 such as a LAN (local area network), together with the image storage server 9.
- the endoscope observation device 5 is installed in the examination room and connected to an endoscope 7 that is inserted into the patient's gastrointestinal tract.
- the endoscope 7 has a light guide for transmitting the illumination light supplied from the endoscope observation device 5 to illuminate the inside of the gastrointestinal tract, an illumination window for emitting the transmitted light onto the living tissue, and an imaging unit that captures images of the living tissue at a predetermined cycle and outputs imaging signals to the endoscope observation device 5.
- the endoscope observation device 5 supplies the endoscope 7 with illumination light according to the observation mode.
- the imaging unit includes a solid-state imaging device (such as a CCD image sensor or a CMOS image sensor) that converts incident light into electrical signals.
- the endoscope observation device 5 generates an endoscope image by performing image processing on the imaging signal photoelectrically converted by the solid-state imaging device of the endoscope 7, and displays it on the display device 6 in real time.
- the endoscope observation device 5 may have a function of performing special image processing for the purpose of highlighting, etc., in addition to normal image processing such as A/D conversion and noise removal. Because the endoscope observation device 5 is equipped with such a special image processing function, it can generate, from an imaging signal captured with the same illumination light, both an endoscopic image without special image processing and an endoscopic image with special image processing.
- the endoscopic image may be, for example, a WLI (White Light Imaging) observation image generated from an imaging signal captured using normal light (white light), or a TXI (Texture and Color Enhancement Imaging) observation image generated by applying special image processing to an imaging signal captured using normal light. It may also be an RDI (Red Dichromatic Imaging) observation image, an NBI (Narrow Band Imaging) observation image, or an AFI (Autofluorescence Imaging) observation image, an image in which the unevenness of the subject is pseudo-colored, or any other image generated by performing image processing on an imaging signal of the endoscope 7.
- the doctor observes the endoscopic image displayed on the display device 6 according to the examination procedure.
- the endoscope observation device 5 captures (stores) an endoscopic image at the timing when the release switch is operated, and accumulates the captured endoscopic image.
- the endoscopic images stored in the image storage server 9 are used by doctors to create examination reports.
- the information processing device 10a is installed in an examination room, and is used by users such as doctors and nurses to confirm information regarding the size of lesions included in endoscopic images in real time during endoscopic examinations.
- the information processing device 10a may cooperate with the image analysis device 8 to provide the user with information regarding the size of the lesion.
- the information processing device 10b is installed in a room other than the examination room, and is used by the doctor when creating an examination report. For example, the doctor may use the lesion shape measurement function of the information processing device 10b to confirm whether the lesion imaged in the current detailed examination has a size that will be subject to ESD in the next examination. Once the physician confirms that the imaged lesion is large enough for ESD, he may decide to en bloc resect the lesion using ESD at the next endoscopy.
- the image analysis device 8 has an image analysis function that, upon input of an endoscopic image, detects lesions and outputs regions where lesions exist (lesion regions).
- the image analysis device 8 may use a trained model generated by machine learning with endoscopic images for learning and information about the lesion areas included in those endoscopic images as teacher data; when an endoscopic image is input, this trained model outputs the position of the lesion area.
- the image analysis device 8 may be provided with an endoscopic image from the endoscopic observation device 5 and supply the image analysis result to the endoscopic observation device 5 or the information processing device 10a.
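The patent does not fix a model architecture or API for this detector. As one hedged illustration, a segmentation-style model could be wrapped as follows (a minimal Python sketch; `model` and its `predict` method are hypothetical placeholders, not part of the disclosure):

```python
import numpy as np

def detect_lesion_region(model, endoscopic_image: np.ndarray, threshold: float = 0.5):
    """Run a (hypothetical) trained segmentation model on an endoscopic image
    and return a binary lesion mask plus its bounding box, or None if nothing
    is detected. `model.predict` is assumed to map an HxWx3 image to an HxW
    probability map; the patent only requires that the trained model output
    the position of the lesion area."""
    prob_map = model.predict(endoscopic_image)       # HxW lesion probabilities
    mask = prob_map >= threshold                     # binary lesion region
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    bbox = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
    return mask, bbox
```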
- FIG. 3 shows functional blocks of the endoscopic shape measurement system 1. Each functional block can be implemented, in hardware, by elements such as processors, memories, auxiliary storage devices, and other LSIs, and, in software, by programs loaded into memory; FIG. 3 depicts the functional blocks realized by their cooperation. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware alone, by software alone, or by a combination thereof.
- the functional blocks of the endoscopic shape measurement system 1 may be realized by any one of the endoscope observation device 5, the information processing device 10a, the information processing device 10b, and the image analysis device 8, or by a combination of two or more of these devices. Accordingly, the functional blocks shown as the processing device 20 may be implemented by one or more processors contained in a single device, or by multiple processors contained in two or more devices.
- the endoscope observation device 5 displays an image of the inside of the gastrointestinal tract captured by the endoscope 7 on the display device 6 .
- the doctor observes the endoscopic image displayed on the display device 6 while moving the endoscope 7 , and when the lesion is displayed on the display device 6 , operates the release switch of the endoscope 7 .
- the endoscopic observation device 5 captures an endoscopic image at the timing when the release switch is operated, and transmits the captured endoscopic image to the image storage server 9 together with information (an image ID) for identifying that endoscopic image.
- the endoscopic observation device 5 may collectively transmit a plurality of captured endoscopic images to the image storage server 9 after the end of the examination.
- the image storage server 9 records the endoscopic image transmitted from the endoscopic observation device 5 in association with the examination ID that identifies the endoscopic examination.
- the endoscopic observation device 5 of the embodiment has a function of measuring three-dimensional shape information of living tissue included in the captured endoscopic image.
- the endoscope observation device 5 may measure the three-dimensional shape of the photographed living tissue by a known technique.
- the endoscope observation device 5 measures the three-dimensional shape of the living tissue included in the captured endoscope image, adds the measured three-dimensional shape information to the endoscope image, and transmits them to the image storage server 9. The image storage server 9 therefore records the endoscopic image in association with the three-dimensional shape information of the photographed living tissue.
- the endoscope 7 may be equipped with a stereo camera, and the endoscope observation device 5 may measure the three-dimensional shape of the living tissue from the images captured by the two cameras using the principle of triangulation. Alternatively, as disclosed in Patent Literature 1, the endoscope observation device 5 may project a measurement pattern onto the living tissue with laser light and measure the three-dimensional shape of the living tissue based on the imaging result of the pattern projected onto the living tissue. The endoscope observation device 5 may also measure the three-dimensional shape of the living tissue from an endoscopic image captured by a monocular camera, using a trained model that has undergone machine learning with images acquired by a stereo camera and distance information of the living tissue included in those images as training data.
- the endoscope observation device 5 may also measure the three-dimensional shape of the living tissue from inter-frame feature amounts of the endoscopic images captured by a monocular camera. In this manner, the endoscope observation device 5 measures the three-dimensional shape of the living tissue in the endoscope image using a known measurement technique, for instance the stereo approach sketched below.
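As a concrete illustration of the stereo approach, per-pixel depth can be recovered from the disparity between the two camera images by the standard triangulation relation z = f·b/d. The sketch below assumes a rectified stereo pair and known calibration (focal length in pixels, baseline in millimetres); none of these specifics are prescribed by the patent.

```python
import numpy as np

def depth_from_disparity(disparity: np.ndarray, f_px: float, baseline_mm: float) -> np.ndarray:
    """Triangulate per-pixel depth (in mm) from a rectified stereo disparity map.

    disparity: HxW disparity in pixels (0 or negative where no match was found).
    Depth follows z = f * b / d; pixels without a valid match become NaN."""
    depth = np.full(disparity.shape, np.nan, dtype=np.float64)
    valid = disparity > 0
    depth[valid] = f_px * baseline_mm / disparity[valid]
    return depth
```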
- the three-dimensional shape measurement function of the photographed living tissue may be installed in a device other than the endoscope observation device 5.
- the three-dimensional shape measurement function may be installed in the information processing device 10 a, the image analysis device 8 , or the image storage server 9 .
- the image storage server 9 records the endoscopic image transmitted from the endoscope observation device 5 in association with the three-dimensional shape information of the photographed living tissue, or with data for calculating that three-dimensional shape information.
- the doctor operates the information processing device 10b to create an examination report.
- the information processing device 10b reads the endoscopic image taken in the examination from the image storage server 9 and displays it on the display device 12b, and the doctor diagnoses the lesion included in the displayed endoscopic image.
- the information processing apparatus 10b of the embodiment performs processing that derives the three-dimensional shape of the living body surface (virtual surface) hidden behind a lesion included in an endoscopic image, and specifies the size of the virtual surface, that is, the size of the lesion bed.
- A case in which the information processing device 10b implements the lesion shape measurement function of the processing device 20 will be described below.
- FIG. 4 shows an example of an endoscopic image displayed on the display device 12b.
- the display processing unit 46 acquires from the image storage server 9 the endoscopic image associated with the examination ID of the examination for which the report is to be created, and displays it on the display device 12b.
- the endoscopic image is an image of the living tissue inside the body cavity captured by the endoscope 7.
- FIG. 5 shows a flowchart of lesion shape measurement processing.
- a user who is a doctor operates the user interface 70 to set a region of interest in the endoscopic image displayed on the display device 12b (S10).
- FIG. 6 shows an example of the set region of interest 110. The user operates the user interface 70, such as a mouse, to draw a boundary line between the lesion 100 and the living body surface 102 in the endoscopic image displayed on the display device 12b. The region-of-interest setting unit 34 sets the region of interest 110 on the endoscopic image based on the user's operation; the region of interest 110 is the lesion area enclosed by the boundary line.
- a user who is a doctor can recognize the boundary between the lesion 100 and the living body surface 102 from medical findings such as three-dimensional shape and color tone characteristics, and can therefore set the region of interest 110 accurately.
- the reference region setting unit 36 sets a reference region surrounding the region of interest 110 based on the region of interest 110 (S12). The reference region 120 includes at least the entire region of interest 110 and is set to a range larger than the region of interest 110.
- FIG. 7 shows an example of the set reference region 120. The reference region setting unit 36 preferably sets the reference region 120 so as to leave a margin of at least a predetermined length outside the outer edge of the region of interest 110. For example, the reference region setting unit 36 may set, as the reference region 120, a rectangular region formed by left and right sides passing through positions separated by a predetermined length l1 from the leftmost and rightmost ends of the region of interest 110, and by upper and lower sides passing through positions separated by a predetermined length l2 from the top and bottom ends of the region of interest 110.
- the user may instead operate the user interface 70 to set the reference region 120 for the region of interest 110 displayed on the display device 12b. In this case, the operation accepting unit 30 accepts the user operation for setting the reference region on the endoscopic image, and the reference region setting unit 36 sets the reference region 120 on the endoscopic image based on that operation. Since the set reference region 120 defines the range of three-dimensional shape information used when deriving the virtual surface of the region of interest 110, the user preferably designates, as the reference region 120, a range from which the virtual surface of the region of interest 110 can be suitably derived. (A sketch of the automatic rectangle rule follows.)
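The automatic rectangle rule above can be sketched as follows, assuming the region of interest arrives as a binary mask and the margins l1 and l2 are given in pixels (the concrete margin values are left open by the patent):

```python
import numpy as np

def set_reference_region(roi_mask: np.ndarray, l1: int, l2: int):
    """Return (x0, y0, x1, y1) of a rectangular reference region that
    surrounds the region of interest with horizontal margin l1 and
    vertical margin l2, clipped to the image bounds."""
    h, w = roi_mask.shape
    ys, xs = np.nonzero(roi_mask)
    x0 = max(int(xs.min()) - l1, 0)
    x1 = min(int(xs.max()) + l1, w - 1)
    y0 = max(int(ys.min()) - l2, 0)
    y1 = min(int(ys.max()) + l2, h - 1)
    return x0, y0, x1, y1
```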
- the three-dimensional information acquisition unit 38 acquires the three-dimensional shape information of the living tissue photographed by the endoscope (S14). As described above, the three-dimensional shape information of the living tissue is recorded in the image storage server 9 in association with the endoscopic image, so the three-dimensional information acquisition unit 38 acquires the three-dimensional shape information linked to the endoscopic image from the image storage server 9.
- the virtual surface derivation unit 48 derives the three-dimensional shape information of the virtual surface in the region of interest 110 from the three-dimensional shape information of a region different from the region of interest 110 (S16). The three-dimensional shape information of the virtual surface may include the two-dimensional coordinates of each pixel of the virtual surface in the endoscopic image and the distance information (depth information) of each pixel. In the embodiment, the virtual surface derivation unit 48 derives the three-dimensional shape information of the virtual surface in the region of interest 110 from the three-dimensional shape information of the reference region 120. Here, the virtual surface in the region of interest 110 means the virtual living body surface existing behind the lesion 100 (the living body surface 102 as it would be if the lesion 100 did not exist).
- FIG. 8 shows a state in which the reference region 120 is arranged on the xy plane. The horizontal position of the endoscopic image is represented by the x coordinate, the vertical position by the y coordinate, and the position in the distance direction from the camera (the depth position) may be represented by the z coordinate.
- in FIG. 8, the hatched area indicates the reference region 120 in the xy plane. The virtual surface derivation unit 48 derives the three-dimensional shape information of the virtual surface in the region of interest 110 from the three-dimensional shape information of the reference region 120; in doing so, it does not use the three-dimensional shape information of the region of interest 110 itself.
- FIG. 9 shows an example of the three-dimensional shape of the reference area 120.
- FIG. 9A is a perspective view of the three-dimensional shape of the reference area 120
- FIG. 9B is a view of the three-dimensional shape of the reference area 120 viewed in the y-axis direction.
- FIG. 10 is a diagram for explaining a method of deriving the three-dimensional shape of the virtual surface in the attention area 110.
- the virtual surface derivation unit 48 performs a fitting process on the three-dimensional shape of the region of interest 110 based on the three-dimensional shape information of the reference region 120, for each column in the y-axis direction at predetermined intervals in the x-axis direction. Specifically, as shown in FIG. 10, a polynomial approximation curve is obtained from the three-dimensional point cloud data in the reference region 120 for each column extending in the y-axis direction.
- the virtual surface derivation unit 48 divides the region of interest 110 into N equal intervals in the x-axis direction and derives a polynomial approximation curve that fits each division line (column). After that, the virtual surface deriving unit 48 may also perform a fitting process in the x-axis direction based on the point group data of the N polynomial approximation curves, to derive three-dimensional shape information of a smooth virtual surface. In this way, by using the reference region 120 set by the reference region setting unit 36, the virtual surface deriving unit 48 can derive the three-dimensional shape information of the virtual surface with high accuracy even when the living body surface has a complicated curved shape; a minimal sketch of the column-wise fitting follows.
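The column-wise fitting can be sketched with `numpy.polyfit` as below; the polynomial degree and the second smoothing pass in the x direction are implementation choices that the patent leaves open, so this is an assumption-laden sketch rather than the disclosed algorithm.

```python
import numpy as np

def fit_virtual_surface(depth: np.ndarray, roi_mask: np.ndarray, degree: int = 3) -> np.ndarray:
    """Estimate the virtual-surface depth inside the region of interest,
    column by column.

    depth:    HxW depth map over the reference region (NaN where unknown).
    roi_mask: HxW boolean mask of the region of interest; these samples
              are withheld from the fit and replaced by fitted values."""
    out = depth.copy()
    for x in np.unique(np.nonzero(roi_mask)[1]):       # columns crossing the ROI
        col = depth[:, x]
        known = ~roi_mask[:, x] & ~np.isnan(col)       # reference-region samples only
        if known.sum() <= degree:
            continue                                   # not enough support to fit
        coeffs = np.polyfit(np.nonzero(known)[0], col[known], degree)
        ys = np.nonzero(roi_mask[:, x])[0]
        out[ys, x] = np.polyval(coeffs, ys)            # fitted virtual-surface depth
    return out
```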
- FIG. 11(a) shows an example of the virtual surface 130 derived for the three-dimensional shape of the reference region 120 shown in FIG. 9(b), and FIG. 11(b) shows an example of the derived three-dimensional shape of the virtual surface 130.
- although the virtual surface deriving unit 48 of the embodiment performs a three-dimensional shape fitting process that derives polynomial approximation curves, other types of fitting processes may be performed.
- the virtual surface derivation unit 48 of the embodiment derives the virtual surface 130 in the region of interest 110 by smoothly bridging the empty space in the center of the three-dimensional shape of the reference region 120. The virtual surface 130 generated in this way corresponds to the virtual bottom surface of the lesion 100 in the region of interest 110.
- the size information specifying unit 50 specifies information about the size of the virtual surface 130 from the three-dimensional shape information of the virtual surface 130 (S18). For example, the size information specifying unit 50 may derive the major axis and minor axis of the virtual surface 130 as follows.
- FIG. 12(a) is a diagram for explaining a method of calculating the major axis of the virtual surface 130. The size information specifying unit 50 calculates the path distance along the virtual surface 130 between every pair of points on the boundary, and specifies the maximum path distance as the major axis.
- FIG. 12(b) shows the combination of two points giving the maximum path distance. When the size information specifying unit 50 determines that the path distance between points P and Q on the boundary is the maximum, it specifies the path distance between P and Q as the "major axis" of the virtual surface 130. After specifying the major axis, the size information specifying unit 50 derives the combination of two points giving the maximum path distance among paths orthogonal to the major axis.
- FIG. 12(c) shows the combination of two points giving the maximum path distance in the direction intersecting the major axis; this path distance is the "minor axis".
- since the virtual surface 130 corresponds to the virtual bottom surface of the lesion 100, the size information specifying unit 50 obtains the major axis and minor axis of the lesion 100 by deriving the major axis and minor axis of the virtual surface 130. The display processing unit 46 may display the derived major axis and minor axis of the lesion bottom on the display device 12b. A simplified sketch of the axis search follows.
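In this simplified sketch, the path distance between two boundary points is approximated by sampling the fitted surface along the straight segment between them in image coordinates and summing the 3D segment lengths, and the major axis is found by an exhaustive pairwise search. Converting pixel spacing to physical units via camera calibration, and the orthogonality constraint for the minor axis, are omitted; both the sampling scheme and the search strategy are assumptions, not details given in the patent.

```python
import numpy as np
from itertools import combinations

def path_length(surface_z: np.ndarray, p, q, samples: int = 100) -> float:
    """Approximate the distance along the virtual surface between boundary
    points p = (x, y) and q, sampling the straight image-plane segment."""
    xs = np.linspace(p[0], q[0], samples)
    ys = np.linspace(p[1], q[1], samples)
    zs = surface_z[ys.round().astype(int), xs.round().astype(int)]
    pts = np.stack([xs, ys, zs], axis=1)
    return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())

def major_axis(surface_z: np.ndarray, boundary_pts):
    """Return the boundary point pair with the maximum on-surface path
    distance, together with that distance (the major axis)."""
    best = max(combinations(boundary_pts, 2),
               key=lambda pq: path_length(surface_z, *pq))
    return best, path_length(surface_z, *best)
```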
- FIG. 13 shows an example of the attention area confirmation screen displayed on the display device 12b.
- when the user operates the user interface 70 to set the region of interest 110, the user may be allowed to check whether the set region of interest 110 is appropriate.
- when the region of interest 110 is set, the image generator 40 generates a three-dimensional image of the living tissue based on the three-dimensional shape information of the living tissue.
- the display processing unit 46 synthesizes the three-dimensional image of the living tissue and the image indicating the position of the attention area 110, and displays the synthesized image on the display device 12b.
- in this example the region of interest 110 is synthesized at a position separated from the three-dimensional image of the living tissue in the z-axis direction, but it may instead be synthesized at a position aligned with it in the z-axis direction.
- on this screen the user can check whether the contour of the region of interest 110 matches the boundary between the lesion 100 and the living body surface 102. If it does not match, the user resets the region of interest 110 so that it matches the area of the lesion 100.
- the display processing unit 46 may synthesize the three-dimensional image of the living tissue and the image indicating the position of the reference region 120 and display them on the display device 12b. By synthesizing and displaying the three-dimensional image of the living tissue and the image indicating the position of the reference region 120, the user can check whether the reference region 120 is set appropriately.
- the display processing unit 46 may synthesize the virtual surface 130 with the three-dimensional image of the biological tissue and display it on the display device.
- in the endoscopic image itself, the virtual surface 130 is hidden behind the lesion 100 and cannot be visually recognized. By synthesizing the virtual surface 130 with the three-dimensional image of the living tissue, the user can confirm whether the virtual surface 130 has been derived appropriately.
- as described above, the virtual surface deriving unit 48 derives the three-dimensional shape information of the virtual surface 130 from the three-dimensional shape information of the reference region 120, which is different from the region of interest 110. The reference region 120 may, however, contain objects that impair the continuity of the living body surface. If the virtual surface derivation unit 48 performs the fitting process on the three-dimensional shape of the region of interest 110 from such discontinuous three-dimensional shape information, the three-dimensional shape information of the virtual surface 130 cannot be estimated with high accuracy.
- FIG. 14 shows an example in which the discontinuity 104 is included in the reference area 120.
- the discontinuity 104 is, for example, a protrusion or recess that impairs the continuity of the biological surface in the reference region 120, and may be a lesion. Since the discontinuity 104 adversely affects the fitting accuracy when performing the three-dimensional shape fitting process, it is preferably removed manually by the user before performing the fitting process.
- FIG. 15 shows the exclusion area 140 that has been set.
- the user operates the user interface 70 to set the exclusion area 140 in the endoscopic image displayed on the display device 12b.
- the virtual surface derivation unit 48 then derives the three-dimensional shape information of the virtual surface 130 from the three-dimensional shape information of the reference region 120 excluding the three-dimensional shape information of the exclusion region 140. This allows the virtual surface derivation unit 48 to derive the virtual surface 130 with high accuracy.
- note that the virtual surface derivation unit 48 may also perform a fitting process on the three-dimensional shape of the exclusion region 140 using the three-dimensional shape information of the reference region 120 excluding the exclusion region 140, thereby deriving a correction surface (virtual surface) for the exclusion region 140. After correcting the three-dimensional shape information of the exclusion region 140 in this manner, the virtual surface derivation unit 48 may derive the virtual surface 130 of the region of interest 110 from the corrected three-dimensional shape information of the reference region 120. A sketch of the exclusion handling follows.
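Exclusion handling composes naturally with the masked fitting shown earlier: the exclusion mask is simply OR-ed into the set of pixels withheld from the fit, and only the region-of-interest pixels are overwritten. This sketch reuses the hypothetical `fit_virtual_surface` from above and is likewise an assumption, not the disclosed implementation.

```python
import numpy as np

def fit_with_exclusion(depth: np.ndarray, roi_mask: np.ndarray,
                       exclusion_mask: np.ndarray, degree: int = 3) -> np.ndarray:
    """Derive the virtual surface while ignoring a user-set exclusion region.

    Pixels in `exclusion_mask` contribute nothing to the fit; only pixels of
    the region of interest are overwritten with fitted values."""
    withheld = roi_mask | exclusion_mask       # samples removed from the fit
    fitted = fit_virtual_surface(depth, withheld, degree)
    out = depth.copy()
    out[roi_mask] = fitted[roi_mask]           # keep original data elsewhere
    return out
```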
- the above describes a mode in which the information processing device 10b realizes the lesion shape measurement function of the processing device 20 when the user creates an examination report after the examination has ended. The information processing device 10a may instead cooperate with the image analysis device 8 to realize the lesion shape measurement function of the processing device 20 during the endoscopy.
- the image analysis device 8 has an image analysis unit 42 and a learned model holding unit 60 (see FIG. 3). The learned model holding unit 60 holds a trained model generated by machine learning using endoscopic images for learning and information about the lesion areas included in those images as training data. This trained model is configured to detect a lesion area and output the position of the lesion area when an endoscopic image is input.
- during the endoscopy, the endoscopic observation device 5 displays the endoscopic image captured by the endoscope 7 on the display device 6 in real time, and transmits the endoscopic image to the information processing device 10a and the image analysis device 8.
- in the image analysis device 8, the image analysis unit 42 inputs the acquired endoscopic image to the trained model held in the learned model holding unit 60.
- when the trained model receives an endoscopic image and detects a lesion area, it outputs the position information of the lesion area.
- the image analysis unit 42 transmits the positional information of the lesion area output by the learned model to the information processing apparatus 10a together with the information (image ID) for identifying the endoscopic image.
- the display processing unit 46 displays the endoscopic image captured by the endoscope 7 on the display device 12a, and receives the position information and image ID of the lesion area provided from the image analysis device 8. Based on this, information indicating the position of the lesion area is displayed on the endoscopic image. At this time, the display processing unit 46 synchronizes the endoscopic image to be displayed with the information indicating the position of the lesion area to be displayed based on the image ID. For example, as shown in FIG. 6, the display processing unit 46 may superimpose the boundary line of the lesion 100 on the endoscopic image including the lesion 100 and display it.
- the user operating the information processing device 10a may decide to set the attention area 110 by operating the user interface 70 after confirming that the boundary line of the lesion 100 output by the learned model is correct.
- the operation receiving unit 30 receives a user's operation for setting a region of interest in the endoscopic image, and the region-of-interest setting unit 34 sets the region of interest in the endoscopic image based on the user's operation.
- after the region of interest is set, the information processing apparatus 10a performs steps S12, S14, S16, and S18 shown in FIG. 5 to specify information regarding the size of the lesion. This information may be provided to the endoscope observation device 5 and displayed on the display device 6 to inform the physician operating the endoscope 7.
- the virtual surface deriving unit 48 derives the three-dimensional shape information of the virtual surface 130 from the three-dimensional shape of the area surrounding the region of interest 110; if that peripheral area is small, the information available for estimating the three-dimensional shape of the virtual surface 130 is insufficient, and it is difficult to derive the three-dimensional shape information of the virtual surface 130 with high accuracy. Therefore, the auxiliary information generation unit 44 may generate auxiliary information regarding the imaging range of the endoscope 7 during imaging, based on the position of the lesion area provided from the image analysis device 8. For example, if the ratio of the lesion area to the endoscopic image exceeds a predetermined threshold (for example, 60%), a sufficiently large reference region cannot be secured around the lesion area, so auxiliary information may be generated to inform the physician that the camera of the endoscope 7 is too close to the lesion site.
- the auxiliary information generator 44 provides the generated auxiliary information to the endoscope observation device 5, and the endoscope observation device 5 displays it on the display device 6 so that the doctor operating the endoscope 7 can see it. The doctor can thereby recognize that the camera of the endoscope 7 is too close to the lesion site and move the endoscope 7 so that the camera moves away from the lesion site. A sufficiently large peripheral area is then secured around the lesion area, and the virtual surface derivation unit 48 can derive the three-dimensional shape information of the virtual surface 130 with high accuracy. A minimal sketch of the ratio check follows.
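A minimal sketch of this ratio check, assuming the lesion position arrives as a binary mask and adopting the 60% figure mentioned above as the threshold:

```python
import numpy as np

def camera_distance_warning(lesion_mask: np.ndarray, threshold: float = 0.60):
    """Return an advisory message when the lesion fills too much of the frame,
    leaving no room for a reference region around it; otherwise None."""
    ratio = float(lesion_mask.sum()) / lesion_mask.size
    if ratio > threshold:
        return ("Lesion occupies {:.0%} of the image: move the endoscope "
                "camera away from the lesion to secure a reference region."
                .format(ratio))
    return None
```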
- the present disclosure can be used in the technical field of measuring the shape of a lesion site.
- SYMBOLS: 1 endoscope shape measurement system; 5 endoscope observation device; 6 display device; 7 endoscope; 8 image analysis device; 9 image storage server; 10a, 10b information processing device; 12a, 12b display device; 20 processing device; 30 operation receiving unit; 32 region setting unit; 34 attention region setting unit; 36 reference region setting unit; 38 three-dimensional information acquisition unit; 40 image generation unit; 42 image analysis unit; 44 auxiliary information generation unit; 46 display processing unit; 48 virtual surface deriving unit; 50 size information specifying unit; 60 learned model holding unit; 70 user interface.
Claims (12)
- 1. An endoscope shape measurement system comprising a processor, wherein the processor: displays, on a display device, an endoscopic image of living tissue in a body cavity captured by an endoscope; receives a user operation for setting a region of interest in the endoscopic image; sets the region of interest in the endoscopic image based on the user operation; acquires three-dimensional shape information of the living tissue photographed by the endoscope; derives three-dimensional shape information of a virtual surface in the region of interest from three-dimensional shape information of a region different from the region of interest; and specifies information about the size of the virtual surface from the derived three-dimensional shape information of the virtual surface.
- 2. The endoscope shape measurement system according to claim 1, wherein the processor sets a reference region surrounding the region of interest, and derives the three-dimensional shape information of the virtual surface from the three-dimensional shape information of the reference region.
- 3. The endoscope shape measurement system according to claim 2, wherein the processor receives a user operation for setting the reference region in the endoscopic image, and sets the reference region in the endoscopic image based on the user operation.
- 4. The endoscope shape measurement system according to claim 1, wherein the processor generates a three-dimensional image of the living tissue based on the three-dimensional shape information of the living tissue, and synthesizes the three-dimensional image of the living tissue with an image indicating the position of the region of interest for display on the display device.
- 5. The endoscope shape measurement system according to claim 2, wherein the processor generates a three-dimensional image of the living tissue based on the three-dimensional shape information of the living tissue, and synthesizes the three-dimensional image of the living tissue with an image indicating the position of the reference region for display on the display device.
- 6. The endoscope shape measurement system according to claim 2, wherein the processor receives a user operation for setting an exclusion region in the reference region, and derives the three-dimensional shape information of the virtual surface from the three-dimensional shape information of the reference region excluding the three-dimensional shape information of the exclusion region.
- 7. The endoscope shape measurement system according to claim 4, wherein the processor synthesizes the virtual surface with the three-dimensional image of the living tissue for display on the display device.
- 8. The endoscope shape measurement system according to claim 1, wherein the processor derives the three-dimensional shape information of the virtual surface in the region of interest by performing a three-dimensional shape fitting process on three-dimensional shape information of a region different from the region of interest.
- 9. The endoscope shape measurement system according to claim 1, further comprising a trained model generated by machine learning using endoscopic images for learning and information about lesion areas included in the endoscopic images as teacher data, the trained model outputting the position of a lesion area when an endoscopic image is input, wherein the processor displays information indicating the position of the lesion area output by the trained model.
- 10. The endoscope shape measurement system according to claim 1, further comprising a trained model generated by machine learning using endoscopic images for learning and information about lesion areas included in the endoscopic images as teacher data, the trained model outputting the position of a lesion area when an endoscopic image is input, wherein the processor generates auxiliary information regarding the imaging range of the endoscope during imaging based on the position of the lesion area output by the trained model, and displays the auxiliary information on the display device.
- 11. The endoscope shape measurement system according to claim 2, wherein the processor sets the reference region based on the region of interest.
- 12. An endoscope shape measurement method comprising: displaying, on a display device, an endoscopic image of living tissue in a body cavity captured by an endoscope; receiving a user operation for setting a region of interest in the endoscopic image; setting the region of interest in the endoscopic image based on the user operation; acquiring three-dimensional shape information of the living tissue photographed by the endoscope; deriving three-dimensional shape information of a virtual surface in the region of interest from three-dimensional shape information of a region different from the region of interest; and specifying information about the size of the virtual surface from the derived three-dimensional shape information of the virtual surface.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023505052A JP7462255B2 (ja) | 2021-03-12 | 2021-03-12 | 内視鏡用形状計測システムおよび内視鏡用形状計測方法 |
PCT/JP2021/010118 WO2022190366A1 (ja) | 2021-03-12 | 2021-03-12 | 内視鏡用形状計測システムおよび内視鏡用形状計測方法 |
CN202180095271.8A CN116940274A (zh) | 2021-03-12 | 2021-03-12 | 内窥镜用形状测量系统及内窥镜用形状测量方法 |
US18/244,394 US20230419517A1 (en) | 2021-03-12 | 2023-09-11 | Shape measurement system for endoscope and shape measurement method for endoscope |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/010118 WO2022190366A1 (ja) | 2021-03-12 | 2021-03-12 | 内視鏡用形状計測システムおよび内視鏡用形状計測方法 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/244,394 Continuation US20230419517A1 (en) | 2021-03-12 | 2023-09-11 | Shape measurement system for endoscope and shape measurement method for endoscope |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022190366A1 true WO2022190366A1 (ja) | 2022-09-15 |
Family
ID=83227584
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/010118 WO2022190366A1 (ja) | 2021-03-12 | 2021-03-12 | 内視鏡用形状計測システムおよび内視鏡用形状計測方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230419517A1 (ja) |
JP (1) | JP7462255B2 (ja) |
CN (1) | CN116940274A (ja) |
WO (1) | WO2022190366A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024195100A1 (ja) * | 2023-03-23 | 2024-09-26 | 日本電気株式会社 | 内視鏡検査支援装置、内視鏡検査支援方法、及び、記録媒体 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9412054B1 (en) * | 2010-09-20 | 2016-08-09 | Given Imaging Ltd. | Device and method for determining a size of in-vivo objects |
JP2018197674A (ja) * | 2017-05-23 | 2018-12-13 | オリンパス株式会社 | 計測装置の作動方法、計測装置、計測システム、3次元形状復元装置、およびプログラム |
US20190043188A1 (en) * | 2017-08-04 | 2019-02-07 | CapsoVision, Inc. | Method and Apparatus for Area or Volume of Object of Interest from Gastrointestinal Images |
CN110811491A (zh) * | 2019-12-05 | 2020-02-21 | 中山大学附属第一医院 | 一种具有三维重建功能的在线疾病识别内窥镜 |
Also Published As
Publication number | Publication date |
---|---|
CN116940274A (zh) | 2023-10-24 |
JP7462255B2 (ja) | 2024-04-05 |
JPWO2022190366A1 (ja) | 2022-09-15 |
US20230419517A1 (en) | 2023-12-28 |
Legal Events
- 121 — EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number 21930218; country EP; kind code A1)
- ENP — Entry into the national phase (ref document number 2023505052; country JP; kind code A)
- WWE — WIPO information: entry into national phase (ref document number 202180095271.8; country CN)
- NENP — Non-entry into the national phase (ref country code DE)
- 122 — EP: PCT application non-entry in European phase (ref document number 21930218; country EP; kind code A1)