WO2024096840A1 - Method and device for endoscopy evaluation

Info

Publication number
WO2024096840A1
Authority
WO
WIPO (PCT)
Prior art keywords
trajectory
endoscope
organ
dimensional
endoscopy
Prior art date
Application number
PCT/TR2023/051219
Other languages
French (fr)
Inventor
Cem Simsek
Original Assignee
Cem Simsek
Priority date
Filing date
Publication date
Priority claimed from TR2022/016566 external-priority patent/TR2022016566A1/en
Application filed by Cem Simsek filed Critical Cem Simsek
Publication of WO2024096840A1 publication Critical patent/WO2024096840A1/en
Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/31 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the rectum, e.g. proctoscopes, sigmoidoscopes, colonoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163 Optical arrangements
    • A61B1/00194 Optical arrangements adapted for three-dimensional imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/62 Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000094 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000096 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images
    • G06V2201/031 Recognition of patterns in medical or anatomical images of internal organs

Definitions

  • the present invention relates to a method and a device for evaluating operator procedures based on data obtained during an endoscopy procedure. More particularly, the present invention relates to a computer-implemented method and a device configured to execute said method, for assessing endoscopy procedures based on analysis of image data acquired during the endoscopic examination.
  • Endoscopy refers to the visualization of interior cavities and hollow organs of the body for diagnostic and therapeutic applications in medicine.
  • the field of endoscopy has witnessed tremendous growth and innovation over the past few decades. Endoscopic procedures provide minimally invasive means to screen various medical conditions, enabling early diagnosis and treatment.
  • Endoscopes are slender instruments equipped with lighting and imaging systems to capture visuals from inside the body. Based on the site of application, endoscopes can be classified as gastrointestinal (GI), respiratory, urological, gynecological, neurological, arthroscopic, laparoscopic etc.
  • GI endoscopes examine organs like esophagus, stomach, small intestine, colon, bile and pancreatic ducts. Respiratory endoscopes access airways and lungs.
  • Urological endoscopes inspect urinary tract organs such as bladder and urethra.
  • Hysteroscopy visualizes the cervix and inside of the uterus.
  • Neuroendoscopy enables intracranial procedures through nasal or oral access. Arthroscopy is applied for joint spaces including knee, shoulder, elbow etc.
  • Laparoscopy employs small incisions to insert an endoscope and examine organs inside the abdomen.
  • Flexible endoscopes have become highly popular for most endoscopy applications due to superior maneuverability and access compared to rigid scopes.
  • Modern flexible endoscopes typically employ a Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) sensor as imaging element located at the distal tip.
  • CCD: Charge Coupled Device
  • CMOS: Complementary Metal Oxide Semiconductor
  • LEDs: Light Emitting Diodes
  • the proximal end has an eyepiece for direct viewing and a connector for attaching external video processor and light source equipment.
  • Flexible endoscope models also incorporate small lumens for delivering fluids, gases, accessories and operative instruments if necessary.
  • Gastrointestinal (GI) endoscopy is one of the most widely performed endoscopic examinations for visualization of the digestive tract. It serves as an effective screening and diagnostic tool for detection of structural abnormalities, inflammation, ulcers, polyps, tumors and cancers affecting the esophagus, stomach, small intestine (duodenum), colon and rectum.
  • Some of the common GI endoscopy procedures include esophagogastroduodenoscopy (EGD), colonoscopy, sigmoidoscopy, enteroscopy etc.
  • EGD, commonly referred to as upper endoscopy, is employed to diagnose conditions of the upper GI tract including esophagus, stomach and duodenum.
  • Colonoscopy examines the entire large intestine from rectum to cecum for colorectal cancer screening. Sigmoidoscopy inspects the distal colon beginning at the rectum and ending at the sigmoid. Small bowel enteroscopy is carried out on patients presenting with obscure gastrointestinal bleeding (OGIB).
  • OGIB: obscure gastrointestinal bleeding
  • Colonoscopy is one of the most widely performed endoscopic procedures worldwide. It serves as an effective diagnostic and screening tool for colorectal cancer. Colonoscopy enables inspection of the mucosa throughout the colon by advancement of the scope through anus. It also facilitates biopsy sampling and therapeutic interventions like removal of polyps, tumors and other abnormal tissue.
  • Typical adult colonoscope length is around 168 cm with diameter of approximately 13 mm.
  • Pediatric colonoscopes are smaller with length around 106 cm and diameter nearly 10 mm.
  • the key components of a video colonoscope are CCD/CMOS image sensor, light bundle, working/biopsy channel, insufflation port and control head. Water jet channels for mucosal cleaning and lens wash nozzles are also integrated.
  • a critical quality indicator for colonoscopy is the adenoma detection rate (ADR).
  • Adenomas are precursor polyp lesions that may develop into colorectal carcinoma if left untreated.
  • ADR is defined as the proportion of screening colonoscopies in which at least one colorectal adenoma is identified and excised. Studies have revealed ADR to be strongly associated with the risk of interval colorectal cancer after colonoscopy.
  • the recommended benchmark ADR is >25% in men and >15% in women for average-risk screening colonoscopies.
  • the cecal intubation rate describing successful advancement of the colonoscope tip to the cecum, cecal landmarks or terminal ileum, is another significant metric. A cecal intubation rate greater than 90% is considered optimal.
  • the withdrawal time corresponding to the time taken for scope withdrawal from the cecum to the rectum, also impacts ADR. Slow scope withdrawal with careful mucosal inspection enables higher ADR. At least 6 minutes of withdrawal time is advised, with some guidelines recommending withdrawal times up to 10 minutes.
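The quality indicators above reduce to simple arithmetic. The following sketch (not part of the disclosure; field names and record shapes are assumptions) computes the ADR as defined earlier and checks it against the stated benchmarks of >25% in men and >15% in women:

```python
# Illustrative only: ADR = fraction of screening colonoscopies in which
# at least one adenoma was identified. Record fields are hypothetical.

def adenoma_detection_rate(procedures):
    """procedures: list of dicts with an 'adenomas_found' count."""
    if not procedures:
        return 0.0
    detected = sum(1 for p in procedures if p["adenomas_found"] >= 1)
    return detected / len(procedures)

def meets_adr_benchmark(adr, sex):
    # Benchmarks stated in the text: >25% in men, >15% in women.
    return adr > (0.25 if sex == "male" else 0.15)

procedures = [
    {"adenomas_found": 1},
    {"adenomas_found": 0},
    {"adenomas_found": 2},
    {"adenomas_found": 0},
]
adr = adenoma_detection_rate(procedures)  # 2 of 4 -> 0.5
print(adr, meets_adr_benchmark(adr, "male"))
```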
  • Colonoscopy is a technically challenging procedure requiring considerable skill. Complex maneuvering is necessitated due to the numerous twists and turns of the colon anatomy. Insufficient visualization of the mucosal surface and rapid scope withdrawal lead to missed lesions. Patient factors like presence of diverticula or redundant colons further increase difficulty.
  • Endoscopists exhibit a broad range in competency levels, resulting in variability in adenoma yields. Studies have revealed increased adenoma and polyp detection rates for endoscopists with higher procedure volumes and experience. However, technical skill assessment based solely on such surrogate measures remains suboptimal. Direct monitoring and quantification of the complex psychomotor skills involved in scope maneuvering is warranted to reliably evaluate competency. This can enable targeted training and feedback to improve proficiency.
  • Video image analysis techniques allow extraction of clinically relevant information from recorded endoscopy videos.
  • Computer vision methods can automate detection of mucosal abnormalities with high accuracy to complement human interpretation.
  • Image processing algorithms may also enable localization of the endoscope tip based on tissue surface characteristics. The estimated trajectory can help assess critical maneuvers like circular movements around haustral folds and behind colonic flexures.
  • Reconstruction of the projected path traversed by the endoscope can reveal segments with inadequate visual coverage and acceleration/deceleration patterns indicative of suboptimal technique.
  • Such innovative computational tools can help objectively quantify the quality of examination in terms of thoroughness of mucosal visualization.
  • Automated assessment can grade competency and stratify training needs to ultimately enhance adenoma detection.
  • Patent document WO2019092940A1 employs a position sensor-enabled endoscopy capsule.
  • the capsule localization data allows estimation of the 3D trajectory only at sparse sampling points. Continuous trajectory reconstruction from densely sampled video frames is not feasible. This restricts assessment of dynamic manipulations based on trajectory characteristics.
  • the data acquisition method of document CN109448041A also utilizes an endoscopy capsule.
  • trajectories are obtained by tracking corresponding features across sequential capsule images. While enabling 3D organ reconstruction, this technique does not facilitate quantitative evaluation of the endoscopist’s maneuvering performance.
  • Patent document WO2018025444A1 describes a method to determine endoscopy capsule trajectory based on intensity of received signals.
  • competency assessment is not within the purview of this prior art.
  • Patent document WO2015111292A1 details an image compression technique but does not provide solutions for procedural evaluation via image analysis.
  • the present invention aims to address this need and provide techniques to quantitatively analyze endoscopic procedures using computer vision and trajectory analysis methods.
  • the inventive method and system enable reconstruction of the 3D endoscope trajectory from video sequences. Characteristics derived from this trajectory are used to objectively assess procedural skill and completeness.
  • the generated trajectory representation also facilitates compact storage of examination data.
  • the present invention discloses a computer-implemented method and system for comprehensive evaluation of endoscopic examinations based on endoscope maneuvering pattern analysis.
  • the primary objective of the invented technique is to enable standardized and quantitative assessment of endoscopy procedural competence based on the motion trajectory of the endoscope reconstructed from video sequences acquired during the intervention.
  • Endoscopy procedures like colonoscopy, upper endoscopy, bronchoscopy, cystoscopy etc. involve navigating a flexible endoscope through the intricate pathways and obstructions of hollow organs to visualize the interior surface. Thorough examination necessitates complex tip manipulations to negotiate tight turns and access convoluted areas while providing stable views. There is a broad range of competency in scope maneuvering skills among endoscopists impacting diagnostic yields. Furthermore, longer procedure duration and unstable visualization increase patient discomfort.
  • the present invention proposes innovative techniques leveraging computer vision, image processing and information fusion algorithms to reconstruct the complete three-dimensional motion trajectory of the endoscope tip from standard endoscopy videos. Characteristics derived from this reconstructed trajectory provide robust and explainable metrics for quantitative evaluation of endoscopic interventions.
  • the automated competency assessment can benchmark endoscopists against standards and/or prior best trajectories from experts. Proceduralist-specific learning curves may be plotted over time to track skill progression. Real-time feedback during live procedures can assist course correction.
  • Another major utility of the reconstructed endoscope trajectory is enabling compressed storage and replay of procedures. Recording entire videos requires prohibitive storage capacity.
  • the trajectory provides a compact procedural representation supporting rapid review and retrieval of salient sections, while requiring multiple orders of magnitude lower space compared to raw videos.
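The "multiple orders of magnitude" claim can be made concrete with back-of-the-envelope arithmetic. All parameters below (resolution, frame rate, pose sample size) are assumptions chosen for illustration, not figures from the disclosure:

```python
# Rough, illustrative storage comparison: raw uncompressed video frames
# versus a timestamped 6-DoF pose sample per frame.

def video_bytes(duration_s, fps=30, width=1280, height=720, bytes_per_px=3):
    # Uncompressed frame size times frame count.
    return duration_s * fps * width * height * bytes_per_px

def trajectory_bytes(duration_s, sample_hz=30, floats_per_sample=7):
    # One sample per frame: timestamp + (x, y, z) + 3 orientation angles,
    # each stored as an 8-byte double.
    return duration_s * sample_hz * floats_per_sample * 8

dur = 10 * 60  # a 10-minute procedure
ratio = video_bytes(dur) / trajectory_bytes(dur)
print(f"trajectory is smaller by a factor of about {ratio:,.0f}")
```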
  • the method involves automated processing of video frames acquired by standard endoscopy systems during routine clinical procedures. No modifications or attachments to existing endoscope hardware are necessitated.
  • the video feed is input to software implementing computer vision techniques like feature detection, optical flow estimation and bundle adjustment to robustly track the endoscope tip position across frames. This is used to reconstruct the complete 3D trajectory of the tip movement through the organ interior.
  • the software maps each point on the trajectory to corresponding timestamped source video frames to annotate visual coverage.
  • Visualization of the 3D trajectory overlaid on organ surface reconstructed from the video provides an intuitive demonstration of scope navigation and coverage completeness.
  • Objective metrics derived from the trajectory include insertion/retraction speed, acceleration/deceleration magnitudes, tip velocities, distance traveled, trajectory smoothness, looping patterns and stability during viewing. Values are compared against predefined benchmarks and prior expert trajectories to compute competency scores.
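One simple way to turn benchmarked metrics into a competency score is the fraction of metrics falling inside their target intervals. This is a hypothetical scoring rule; the metric names and ranges below are invented for illustration and are not the patent's actual benchmarks:

```python
# Hypothetical benchmark table: metric name -> (low, high) acceptable range.
BENCHMARKS = {
    "withdrawal_time_min": (6.0, float("inf")),   # at least 6 minutes
    "mean_tip_speed_mm_s": (0.0, 15.0),           # assumed upper bound
    "trajectory_smoothness": (0.7, 1.0),          # assumed normalized score
}

def competency_score(metrics, benchmarks=BENCHMARKS):
    """Fraction of benchmarked metrics whose value lies in range."""
    in_range = 0
    for name, (lo, hi) in benchmarks.items():
        value = metrics.get(name)
        if value is not None and lo <= value <= hi:
            in_range += 1
    return in_range / len(benchmarks)

metrics = {"withdrawal_time_min": 7.5,
           "mean_tip_speed_mm_s": 12.0,
           "trajectory_smoothness": 0.6}
print(competency_score(metrics))  # 2 of 3 metrics in range
```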
  • the organ interior is digitally partitioned into semantic regions reflecting natural subdivisions like colonic sections.
  • the software tracks time spent visualizing each region based on estimated trajectory and compares it against recommended observation durations to detect insufficiently surveyed areas.
  • Critical maneuvers like circular tip motions around intestinal folds and behind flexures are recognized by classifying trajectory patterns using machine learning algorithms trained on expert demonstrations.
  • the system provides configurable reporting of procedural metrics highlighting deficiencies. Suggestions for improving technique are provided for low-scoring elements. Real-time audible, haptic or visual alerts may be activated when metrics breach thresholds to enable mid-procedure corrections. The endoscopist can review reconstructed trajectories of prior procedures for selfassessment.
  • fold detection algorithms are applied to identify expressed colonic haustra from video frames.
  • the reconstructed trajectory is correlated with detected folds to quantify visualization of critical hidden surfaces prone to being missed. This furnishes an explainable quality metric based on folding coverage and circular maneuver measurements.
  • the invented technique aims to enhance procedural consistency and improve clinical outcomes by enabling objective skills evaluation.
  • Automated competence scoring can standardize assessments and identify training needs. It can help shorten the learning curve for novice endoscopists by providing explanatory feedback.
  • the abbreviated procedure representation allows convenient storage for documentation, review and training purposes.
  • the proposed invention has potential to significantly advance quality and training in endoscopy.
  • Figure 1: Schematic view of an endoscope
  • Figure 3: Schematic view showing the optimal motion trajectory of an endoscope in an intestine
  • the present invention is related to a method and a device for evaluating operator procedures based on data obtained during an endoscopy procedure.
  • the aforementioned computer-implemented method takes as input images obtained from an endoscope, which are processed by a data processing unit (100) through the steps to be described.
  • the data processing unit (100) can be integrated into the aforementioned endoscope or can be an entirely external device.
  • a program containing the method steps may be operated by the data processing unit (100).
  • the aforementioned program can also be stored in a medium that can be read by a computer such as a CD, USB, etc.
  • the endoscopy device is configured with a distal end (10) to enter the organ (O).
  • the distal end (10) contains an imaging element (11), preferably a camera or lens, and an illumination element (12), preferably a light guide guiding the light coming from a light source.
  • the imaging element (11) and the illumination element (12) are arranged such that they do not protrude from the front surface of the distal end (10).
  • the imaging element (11) is configured to acquire multiple images, especially video recording, from the inner surface of the organ (O) where endoscopy will be performed, and the illumination element (12) is configured to illuminate the field of the related images.
  • the endoscope contains elements known in the art such as gas and liquid suction channels and control elements (for distal end entry movements).
  • organ (O) images obtained by means of an endoscopy device are acquired.
  • images or video can be directly obtained from the endoscopy device for use in the computer-implemented method.
  • Images can be obtained directly from the endoscopy device instantaneously (obtained during the procedure) or may have been obtained as a result of a previously performed procedure.
  • multiple images may be data stored in a memory unit acquired by the endoscopy device. It is sufficient that the data is acquired by the endoscopy device.
  • the provided images are subjected to an image matching process by the data processing unit (100).
  • images can be processed by image matching or by feeding to an artificial neural network based on image matching.
  • Convolutional neural networks (CNN) or vSLAM (visual Simultaneous Localisation and Mapping) can also be used.
  • at the image acquisition stage, each of the multiple images is matched with the next image (ordered according to the time variable).
  • each frame is matched with the next sequential frame.
  • the positions of the reference points and features of the first image are compared with those in the next frame.
  • the difference in the positions of these reference points is used to estimate the 3D motion of the endoscope in 6 degrees of freedom.
  • This estimate obtained from matching will then be converted to a 3D translation to create 3D trajectories (T).
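Once per-frame relative motions are estimated from matching, chaining them yields the absolute trajectory. A minimal dead-reckoning sketch, assuming each relative motion is given as a rotation matrix and translation vector (the disclosure does not fix this representation):

```python
# Illustrative only: compose per-frame relative (R, t) estimates into an
# absolute 3D trajectory of the endoscope tip, starting at the origin.

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def accumulate_trajectory(relative_motions):
    """relative_motions: list of (R, t) pairs; R is a 3x3 rotation
    (list of lists), t a 3-vector, expressing frame i+1 in frame i."""
    R = [[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]]   # identity orientation
    p = [0.0, 0.0, 0.0]                            # start at origin
    trajectory = [list(p)]
    for R_rel, t_rel in relative_motions:
        # Express the relative step in world coordinates, then advance
        # the position and compose the orientation.
        step = mat_vec(R, t_rel)
        p = [p[i] + step[i] for i in range(3)]
        R = mat_mul(R, R_rel)
        trajectory.append(list(p))
    return trajectory

I3 = [[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]]
# Two forward steps of 5 mm along z with no rotation:
print(accumulate_trajectory([(I3, [0, 0, 5.0]), (I3, [0, 0, 5.0])]))
```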
  • once the trajectories (T) are created, they are preferably normalized and/or denoised.
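The disclosure does not fix a denoising method; one plausible choice, shown purely as an assumption, is a centered moving average over the sampled trajectory points:

```python
# Illustrative denoising: centered moving average over (x, y, z) samples.

def moving_average(points, window=3):
    """points: list of (x, y, z) tuples; returns a smoothed copy."""
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        n = hi - lo
        smoothed.append(tuple(sum(p[d] for p in points[lo:hi]) / n
                              for d in range(3)))
    return smoothed

noisy = [(0, 0, 0), (1, 0, 3), (2, 0, 0), (3, 0, 3), (4, 0, 0)]
print(moving_average(noisy))
```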
  • the obtained trajectories (T) are used to evaluate the endoscopy procedure in terms of operator skill.
  • the trajectory (T) can be specifically used to evaluate operator movement accuracy, agility, and economy (completing the procedure with minimal movements). Additionally, this trajectory (T) data can also be used for operator training.
  • the aforementioned trajectory motions can be evaluated based on predetermined metrics and intervals related to these metrics.
  • the aforementioned metrics are variables that can be derived from any movement performed by the operator with the endoscope distal end during the operation.
  • the metrics can be at least one of total trajectory (T) length, linear or angular acceleration, linear and angular speed, rotation of the trajectory (T), or tremors during the motion, preferably more than one, especially all of them.
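Extraction of these metrics from a timestamped trajectory is straightforward; the tremor proxy below (counting reversals of the forward step) is an assumption, since the text does not define tremor quantitatively:

```python
# Illustrative metric extraction from a sampled 3D trajectory.
import math

def path_length(points):
    """Total trajectory (T) length as the sum of segment lengths."""
    return sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))

def speeds(points, timestamps):
    """Per-segment linear speed from timestamped samples."""
    return [math.dist(points[i], points[i + 1]) /
            (timestamps[i + 1] - timestamps[i])
            for i in range(len(points) - 1)]

def tremor_count(points):
    """Crude tremor proxy: sign reversals of the forward (z) step."""
    steps = [points[i + 1][2] - points[i][2] for i in range(len(points) - 1)]
    return sum(1 for a, b in zip(steps, steps[1:]) if a * b < 0)

pts = [(0, 0, 0), (0, 0, 2), (0, 0, 1), (0, 0, 3)]
ts = [0.0, 1.0, 2.0, 3.0]
print(path_length(pts), speeds(pts, ts), tremor_count(pts))
```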
  • the obtained motion trajectory (T) can also be fed to an artificial neural network to obtain a prediction result.
  • the aforementioned artificial neural network may have been trained with a dataset containing motion trajectories (T) and various inputs indicating their adequacy. As a result, it can provide a response regarding whether the estimated motion trajectory (T) is adequate or not, and preferably actions required to correct the motion trajectory (T).
  • a predetermined motion trajectory (T) and the estimated motion trajectory (T) can be directly compared geometrically, and based on the amount of similarity, the sufficiency of the performed endoscopy procedure can be evaluated.
  • the previously determined motion trajectory (T) is a motion trajectory obtained during a procedure performed on the patient undergoing the operation that was defined as successful.
  • the newly obtained motion trajectory (T) is compared to this previous trajectory (T) to enable both general and real-time evaluation of the operator (if images are provided in real-time during the operation).
  • a skill score can also be generated for an operator based on the evaluation result.
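The geometric comparison between an estimated trajectory and a reference "successful" trajectory can use any curve-similarity measure; the discrete Frechet distance is one reasonable choice, shown here as an assumption rather than the disclosed method:

```python
# Illustrative curve comparison: discrete Frechet distance between two
# sampled 3D trajectories (smaller means more similar).
import math
from functools import lru_cache

def discrete_frechet(p, q):
    @lru_cache(maxsize=None)
    def c(i, j):
        d = math.dist(p[i], q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    return c(len(p) - 1, len(q) - 1)

reference = [(0, 0, 0), (0, 0, 5), (0, 0, 10)]   # prior successful trajectory
estimated = [(0, 1, 0), (0, 1, 5), (0, 1, 10)]   # constant 1 mm offset
print(discrete_frechet(reference, estimated))    # -> 1.0
```

A similarity threshold on this distance could then feed the skill score mentioned above.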
  • the minimum duration is determined as 6 minutes, and procedures above 6 minutes are considered successful. However, these 6 minutes may be distributed inefficiently. That is, each region of the organ (O) may not have been examined equally or at least for similar durations.
  • the organ (O) for instance intestine, is divided into at least two regions (S), and each region is assigned a certain time criterion. For example, the intestine is divided into two regions (S) and it is required to spend at least 3 minutes in each region.
  • the time spent in each region (S) is determined based on the time and position data found in the estimated trajectory (T). Additionally, if there are regions (S) that need to be imaged longer than other regions (S), the durations defined for the regions (S) can be specified differently from each other.
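The region-based check can be sketched as follows, assuming each trajectory sample carries a timestamp and a region label (the two-region, 3-minutes-each example comes from the text; the data layout is an assumption):

```python
# Illustrative per-region dwell-time check from a labeled trajectory.

def region_dwell_times(samples):
    """samples: list of (timestamp_s, region) pairs in time order."""
    dwell = {}
    for (t0, region), (t1, _next_region) in zip(samples, samples[1:]):
        dwell[region] = dwell.get(region, 0.0) + (t1 - t0)
    return dwell

def underexamined_regions(dwell, minimums):
    """Regions (S) whose dwell time falls below the required minimum."""
    return [r for r, needed in minimums.items()
            if dwell.get(r, 0.0) < needed]

samples = [(0, "S1"), (100, "S1"), (200, "S2"), (380, "S2"), (400, "S2")]
dwell = region_dwell_times(samples)  # S1: 200 s, S2: 200 s
# Two regions, at least 3 minutes (180 s) each, as in the example above:
print(underexamined_regions(dwell, {"S1": 180.0, "S2": 180.0}))
```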
  • the accuracy of the evaluation can increase when trajectory (T) and region-based (S) assessments are made together.
  • predefined trajectory (T) motions for some regions (S) can be specified differently than others according to procedural needs.
  • images obtained by the imaging element (11) can be fed to an artificial neural network trained to specifically detect folded regions (F) and folded regions (F) can be detected in the images.
  • the adequacy of the obtained folded region (F) can also be assessed by taking the ratio of the area of the related folded region (F) to the total area of the obtained images.
  • the aforementioned total area includes the folded region (F) as well as non-informative areas and lumen view areas.
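The fold-coverage ratio described above reduces to summing per-frame pixel areas. In practice those areas would come from the fold-detection network; the numbers below are made up for illustration:

```python
# Illustrative fold-coverage ratio: fold area over total image area,
# where total = fold + lumen view + non-informative areas.

def fold_coverage_ratio(frames):
    """frames: list of dicts with per-class pixel areas for each image."""
    fold = sum(f["fold_area"] for f in frames)
    total = sum(f["fold_area"] + f["lumen_area"] + f["non_informative_area"]
                for f in frames)
    return fold / total if total else 0.0

frames = [
    {"fold_area": 3000, "lumen_area": 5000, "non_informative_area": 2000},
    {"fold_area": 1000, "lumen_area": 7000, "non_informative_area": 2000},
]
print(fold_coverage_ratio(frames))  # 4000 / 20000 = 0.2
```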
  • a three-dimensional reconstruction of the organ (O) can be performed on the obtained trajectory (T) and a visual evaluation can be made by matching the aforementioned three-dimensional reconstruction with the trajectory (T).
  • the regions observed on the trajectory (T) can be more easily identified.
  • instant feedback can also be generated in the present method depending on the detected motion trajectory (T); when the detected trajectory warrants it, a warning response can be generated.
  • This warning response may be arranged to trigger a warning element provided on the data processing unit (100) where the related method is operated or on another device connected to the data processing unit (100).
  • This warning element can be of a visual, audio, or haptic type.
  • a screen or a haptic device, especially a haptic device provided on the endoscope can be a warning element.
  • the data processing unit (100) configured to execute the method of the invention contains appropriate elements to execute the method steps. These elements may be a processing unit to process data and a memory unit to store data. Additionally, it may contain an input unit to receive multiple images provided by the endoscopy. Here, the input unit may directly receive data from the endoscopy device or imaging element (11) or may be provided as a USB, compact disk input for input of previously provided data. Additionally, receiving multiple images is possible through online data acquisition methods.

Abstract

The present invention discloses a computerized method and system for evaluating endoscopy procedures based on endoscope maneuvering patterns derived from video data obtained during the procedure. In the disclosed technique, video frames acquired by an endoscope imaging element are processed to reconstruct the 3D motion trajectory of the endoscope tip through the organ. Metrics extracted from analysis of this trajectory provide objective measures to assess endoscopist proficiency and procedural completeness. The method involves digitally partitioning the organ into segments and tracking visualization duration for each region compared to recommended observation times. Fold detection algorithms also enable quantification of inspection of convoluted areas based on the reconstructed scope movements. The automated competency metrics aim to enhance procedural consistency and training. The compact trajectory representation further enables efficient examination data storage and review. Thus, the invention provides techniques to grade endoscopic skills based on instrument maneuvering patterns through innovative application of computer vision and trajectory mapping. The objective is to improve clinical outcomes by standardizing quantitative evaluation of core techniques.

Description

METHOD AND DEVICE FOR ENDOSCOPY EVALUATION
Technical Field
The present invention relates to a method and a device for evaluating operator procedures based on data obtained during an endoscopy procedure. More particularly, the present invention relates to a computer-implemented method and a device configured to execute said method, for assessing endoscopy procedures based on analysis of image data acquired during the endoscopic examination.
Background Art
Flexible endoscopy refers to the visualization of interior cavities and hollow organs of the body for diagnostic and therapeutic applications in medicine. The field of endoscopy has witnessed tremendous growth and innovation over the past few decades. Endoscopic procedures provide minimally invasive means to screen various medical conditions, enabling early diagnosis and treatment.
Endoscopes are slender instruments equipped with lighting and imaging systems to capture visuals from inside the body. Based on the site of application, endoscopes can be classified as gastrointestinal (GI), respiratory, urological, gynecological, neurological, arthroscopic, laparoscopic etc. GI endoscopes examine organs like esophagus, stomach, small intestine, colon, bile and pancreatic ducts. Respiratory endoscopes access airways and lungs. Urological endoscopes inspect urinary tract organs such as bladder and urethra. Hysteroscopy visualizes the cervix and inside of the uterus. Neuroendoscopy enables intracranial procedures through nasal or oral access. Arthroscopy is applied for joint spaces including knee, shoulder, elbow etc. Laparoscopy employs small incisions to insert an endoscope and examine organs inside the abdomen.
Flexible endoscopes have become highly popular for most endoscopy applications due to superior maneuverability and access compared to rigid scopes. Modern flexible endoscopes typically employ a Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) sensor as the imaging element located at the distal tip. Light Emitting Diodes (LEDs) or optical fibers connected to an external light source are generally used for illumination. The proximal end has an eyepiece for direct viewing and a connector for attaching external video processor and light source equipment. Flexible endoscope models also incorporate small lumens for delivering fluids, gases, accessories and operative instruments if necessary.
Gastrointestinal (GI) endoscopy is one of the most widely performed endoscopic examinations for visualization of the digestive tract. It serves as an effective screening and diagnostic tool for detection of structural abnormalities, inflammation, ulcers, polyps, tumors and cancers affecting the esophagus, stomach, small intestine (duodenum), colon and rectum. Some of the common GI endoscopy procedures include esophagogastroduodenoscopy (EGD), colonoscopy, sigmoidoscopy, enteroscopy etc. EGD, commonly referred to as upper endoscopy, is employed to diagnose conditions of the upper GI tract including esophagus, stomach and duodenum. Colonoscopy examines the entire large intestine from rectum to cecum for colorectal cancer screening. Sigmoidoscopy inspects the distal colon beginning at the rectum and ending at the sigmoid. Small bowel enteroscopy is carried out on patients presenting with obscure gastrointestinal bleeding (OGIB).
Colonoscopy is one of the most widely performed endoscopic procedures worldwide. It serves as an effective diagnostic and screening tool for colorectal cancer. Colonoscopy enables inspection of the mucosa throughout the colon by advancement of the scope through anus. It also facilitates biopsy sampling and therapeutic interventions like removal of polyps, tumors and other abnormal tissue. Typical adult colonoscope length is around 168 cm with diameter of approximately 13 mm. Pediatric colonoscopes are smaller with length around 106 cm and diameter nearly 10 mm. The key components of a video colonoscope are CCD/CMOS image sensor, light bundle, working/biopsy channel, insufflation port and control head. Water jet channels for mucosal cleaning and lens wash nozzles are also integrated.
A critical quality indicator for colonoscopy is the adenoma detection rate (ADR). Adenomas are precursor polyp lesions that may develop into colorectal carcinoma if left untreated. ADR is defined as the proportion of screening colonoscopies in which at least one colorectal adenoma is identified and excised. Studies have revealed ADR to be strongly associated with the risk of interval colorectal cancer after colonoscopy. The recommended benchmark ADR is >25% in men and >15% in women for average-risk screening colonoscopies.
The cecal intubation rate, describing successful advancement of the colonoscope tip to the cecum, cecal landmarks or terminal ileum, is another significant metric. A cecal intubation rate greater than 90% is considered optimal. The withdrawal time, corresponding to the time taken for scope withdrawal from the cecum to the rectum, also impacts ADR. Slow scope withdrawal with careful mucosal inspection enables higher ADR. At least 6 minutes of withdrawal time is advised, with some guidelines recommending withdrawal times up to 10 minutes.
Colonoscopy is a technically challenging procedure requiring considerable skill. Complex maneuvering is necessitated due to the numerous twists and turns of the colon anatomy. Insufficient visualization of the mucosal surface and rapid scope withdrawal lead to missed lesions. Patient factors like presence of diverticula or redundant colons further increase difficulty.
These quality indicators provide basic metrics to assess colonoscopy efficacy and ensure adequate mucosal coverage. However, they have limitations. ADR only furnishes an overall estimate without localization of detected lesions. Recording withdrawal time is prone to errors and inconsistencies in measurement. Furthermore, these parameters reflect outcomes and do not monitor procedural skills. The actual endoscope manipulation technique employed by the colonoscopist significantly influences inspection quality but remains unquantified.
Endoscopists exhibit a broad range in competency levels, resulting in variability in adenoma yields. Studies have revealed increased adenoma and polyp detection rates for endoscopists with higher procedure volumes and experience. However, technical skill assessment based solely on such surrogate measures remains suboptimal. Direct monitoring and quantification of the complex psychomotor skills involved in scope maneuvering is warranted to reliably evaluate competency. This can enable targeted training and feedback to improve proficiency.
Recent technological advances have paved the way for computer-assisted evaluation of endoscopy procedures. Video image analyses techniques allow extraction of clinically relevant information from recorded endoscopy videos. Computer vision methods can automate detection of mucosal abnormalities with high accuracy to complement human interpretation. Image processing algorithms may also enable localization of the endoscope tip based on tissue surface characteristics. The estimated trajectory can help assess critical maneuvers like circular movements around haustral folds and behind colonic flexures.
Reconstruction of the projected path traversed by the endoscope can reveal segments with inadequate visual coverage and acceleration/deceleration patterns indicative of suboptimal technique. Such innovative computational tools can help objectively quantify the quality of examination in terms of thoroughness of mucosal visualization. Automated assessment can grade competency and stratify training needs to ultimately enhance adenoma detection.
However, prior art solutions for endoscopy evaluation have certain limitations. Patent document WO2019092940A1 employs a position sensor-enabled endoscopy capsule. The capsule localization data allows estimation of the 3D trajectory only at sparse sampling points. Continuous trajectory reconstruction from densely sampled video frames is not feasible. This restricts assessment of dynamic manipulations based on trajectory characteristics.
The data acquisition method of document CN109448041A also utilizes an endoscopy capsule. Here, trajectories are obtained by tracking corresponding features across sequential capsule images. While enabling 3D organ reconstruction, this technique does not facilitate quantitative evaluation of the endoscopist's maneuvering performance.
Patent document WO2018025444A1 describes a method to determine endoscopy capsule trajectory based on intensity of received signals. However, competency assessment is not within the purview of this prior art.
Patent document WO2015111292A1 details an image compression technique but does not provide solutions for procedural evaluation via image analysis.
Therefore, there is a need for an improved technique applying advanced image processing and trajectory reconstruction algorithms to comprehensively evaluate endoscopic interventions based on quantified endoscope manipulation patterns. This can potentially enhance quality assurance and training for endoscopy procedures through objective skill assessment. The present invention aims to address this need and provide techniques to quantitatively analyze endoscopic procedures using computer vision and trajectory analysis methods. The inventive method and system enable reconstruction of the 3D endoscope trajectory from video sequences. Characteristics derived from this trajectory are used to objectively assess procedural skill and completeness. The generated trajectory representation also facilitates compact storage of examination data.
Disclosure of Invention
The present invention discloses a computer-implemented method and system for comprehensive evaluation of endoscopic examinations based on endoscope maneuvering pattern analysis.
The primary objective of the invented technique is to enable standardized and quantitative assessment of endoscopy procedural competence based on the motion trajectory of the endoscope reconstructed from video sequences acquired during the intervention.
Specific aims of the proposed invention are:
- To develop image processing algorithms that can accurately track the endoscope tip across a sequence of endoscopy video frames and estimate its three-dimensional motion trajectory.
- To extract metrics from the reconstructed trajectory that can evaluate skill, completeness and quality benchmarks of the conducted endoscopy procedure.
- To quantify endoscopist proficiency by comparing maneuvering metrics against predefined benchmarks and prior established trajectories from expert procedures.
- To provide real-time feedback and personalized training based on objective procedural competence assessment for individual endoscopists.
- To detect insufficiently visualized regions based on comparison of imaged areas with reconstructed endoscope trajectory.
- To enable automated detection of complex maneuvers like circular movements around intestinal folds using computer vision techniques.
- To develop an abbreviated representation of the complete examination based on the endoscope trajectory enabling compressed storage and replay of procedures.
Endoscopy procedures like colonoscopy, upper endoscopy, bronchoscopy, cystoscopy etc. involve navigating a flexible endoscope through the intricate pathways and obstructions of hollow organs to visualize the interior surface. Thorough examination necessitates complex tip manipulations to negotiate tight turns and access convoluted areas while providing stable views. There is a broad range of competency in scope maneuvering skills among endoscopists impacting diagnostic yields. Furthermore, longer procedure duration and unstable visualization increase patient discomfort.
However, prevailing methods for technical skills assessment rely on indirect surrogate measures of procedural quality and outcomes such as adenoma detection rate for colonoscopies. Direct monitoring and quantification of endoscope handling proficiency currently involves expensive simulation setups unsuitable for routine implementation. Automated computational assessment tools enabling objective competency evaluation by analyzing the endoscope maneuvering patterns could potentially enhance training and improve clinical outcomes.
The present invention proposes innovative techniques leveraging computer vision, image processing and information fusion algorithms to reconstruct the complete three-dimensional motion trajectory of the endoscope tip from standard endoscopy videos. Characteristics derived from this reconstructed trajectory provide robust and explainable metrics for quantitative evaluation of endoscopic interventions. The automated competency assessment can benchmark endoscopists against standards and/or prior best trajectories from experts. Proceduralist-specific learning curves may be plotted over time to track skill progression. Real-time feedback during live procedures can assist course correction. Another major utility of the reconstructed endoscope trajectory is enabling compressed storage and replay of procedures. Recording entire videos requires prohibitive storage capacity. The trajectory provides a compact procedural representation supporting rapid review and retrieval of salient sections, while requiring multiple orders of magnitude lower space compared to raw videos.
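The claim that the trajectory representation requires multiple orders of magnitude less space than raw video can be illustrated with back-of-the-envelope arithmetic. All figures below are assumptions chosen for the example (procedure duration, video bitrate, trajectory sampling rate and sample size), not values specified by the invention:

```python
# Illustrative storage comparison, assuming a 30-minute procedure,
# compressed HD video at ~5 Mbit/s, and a trajectory sampled at 30 Hz
# as six 32-bit floats per sample (3 position + 3 orientation values).
SECONDS = 30 * 60
video_bytes = 5_000_000 / 8 * SECONDS        # ~1.1 GB of video recording
trajectory_bytes = 30 * SECONDS * 6 * 4      # ~1.3 MB of trajectory data
ratio = video_bytes / trajectory_bytes       # roughly three orders of magnitude
```

Under these assumptions the trajectory is close to a thousandfold smaller than the video, consistent with the "multiple orders of magnitude" statement above.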
The method involves automated processing of video frames acquired by standard endoscopy systems during routine clinical procedures. No modifications or attachments to existing endoscope hardware are necessitated. The video feed is input to software implementing computer vision techniques like feature detection, optical flow estimation and bundle adjustment to robustly track the endoscope tip position across frames. This is used to reconstruct the complete 3D trajectory of the tip movement through the organ interior.
The software maps each point on the trajectory to corresponding timestamped source video frames to annotate visual coverage. Visualization of the 3D trajectory overlaid on organ surface reconstructed from the video provides an intuitive demonstration of scope navigation and coverage completeness. Objective metrics derived from the trajectory include insertion/retraction speed, acceleration/deceleration magnitudes, tip velocities, distance traveled, trajectory smoothness, looping patterns and stability during viewing. Values are compared against predefined benchmarks and prior expert trajectories to compute competency scores.
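The kinematic metrics listed above (distance traveled, speeds, acceleration magnitudes) follow directly from a timestamped trajectory by finite differencing. The sketch below is illustrative only; the function name and the particular set of returned metrics are choices made for the example:

```python
import numpy as np

def trajectory_metrics(points, timestamps):
    """Illustrative maneuvering metrics from a timestamped 3D trajectory.

    `points` is an (N, 3) array of tip positions and `timestamps` the N
    acquisition times in seconds. Returns total path length, mean speed
    and peak acceleration magnitude, computed by finite differences.
    """
    points = np.asarray(points, dtype=float)
    t = np.asarray(timestamps, dtype=float)
    steps = np.diff(points, axis=0)              # per-frame displacement
    dt = np.diff(t)
    path_length = np.linalg.norm(steps, axis=1).sum()
    velocity = steps / dt[:, None]
    speed = np.linalg.norm(velocity, axis=1)
    accel = np.diff(velocity, axis=0) / dt[1:, None]
    return {
        "path_length": float(path_length),
        "mean_speed": float(speed.mean()),
        "peak_acceleration": float(np.linalg.norm(accel, axis=1).max())
        if len(accel) else 0.0,
    }
```

Values such as these would then be compared against benchmark ranges or expert trajectories to contribute to a competency score.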
In a preferred embodiment, the organ interior is digitally partitioned into semantic regions reflecting natural subdivisions like colonic sections. The software tracks time spent visualizing each region based on estimated trajectory and compares it against recommended observation durations to detect insufficiently surveyed areas. Critical maneuvers like circular tip motions around intestinal folds and behind flexures are recognized by classifying trajectory patterns using machine learning algorithms trained on expert demonstrations.
The system provides configurable reporting of procedural metrics highlighting deficiencies. Suggestions for improving technique are provided for low-scoring elements. Real-time audible, haptic or visual alerts may be activated when metrics breach thresholds to enable mid-procedure corrections. The endoscopist can review reconstructed trajectories of prior procedures for self-assessment.
In another embodiment, fold detection algorithms are applied to identify expressed colonic haustra from video frames. The reconstructed trajectory is correlated with detected folds to quantify visualization of critical hidden surfaces prone to being missed. This furnishes an explainable quality metric based on folding coverage and circular maneuver measurements.
The invented technique aims to enhance procedural consistency and improve clinical outcomes by enabling objective skills evaluation. Automated competence scoring can standardize assessments and identify training needs. It can help shorten the learning curve for novice endoscopists by providing explanatory feedback. The abbreviated procedure representation allows convenient storage for documentation, review and training purposes. Thus, the proposed invention has potential to significantly advance quality and training in endoscopy.
Brief Description of Drawings
The drawings used to better describe the device developed with this invention and related explanations are given below.
Figure 1. Schematic view of an endoscope
Figure 2. Schematic view for zoning the organ
Figure 3. Schematic view showing the optimal motion trajectory of an endoscope in an intestine
Description of the Components/Parts/Elements Constituting the Invention
To better describe the device developed with this invention, the parts and components in the figures are numbered and each number corresponds to:
10. Endoscope head
11. Imaging element
12. Illumination element
100. Data processing unit
O. Organ
F. Fold region
T. Trajectory
S. Region
Modes of Carrying Out the Invention
The present invention is related to a method and a device for evaluating operator procedures based on data obtained during an endoscopy procedure.
The aforementioned computer-implemented method takes as input images obtained from an endoscope, which are processed by a data processing unit (100) through the steps described below. The data processing unit (100) can be integrated into the aforementioned endoscope or can be an entirely external device. Here, a program containing the method steps may be executed by the data processing unit (100). The aforementioned program can also be stored in a computer-readable medium such as a CD, USB drive, etc.
With reference to Figure 1, which is given for visual description and does not reflect the actual geometry of an endoscopy device: the endoscopy device is configured with a distal end (10) to enter the organ (O). The distal end (10) contains an imaging element (11), preferably a camera or lens, and an illumination element (12), preferably a light guide directing the light coming from a light source. Generally, the imaging element (11) and illumination element (12) are arranged such that they do not protrude from the front surface of the distal end (10).
Here, the imaging element (11) is configured to acquire multiple images, especially video recording, from the inner surface of the organ (O) where endoscopy will be performed, and the illumination element (12) is configured to illuminate the field of the related images. In addition, the endoscope contains elements known in the art such as gas and liquid suction channels and control elements (for distal end entry movements).
In the method of the invention, first, organ (O) images obtained by means of an endoscopy device are acquired. Here, images or video can be obtained directly from the endoscopy device for use in the computer-implemented method. Images can be obtained from the endoscopy device instantaneously (during the procedure) or may have been obtained in a previously performed procedure. Additionally, the multiple images may be data acquired by the endoscopy device and stored in a memory unit. It is sufficient that the data is acquired by the endoscopy device.
The provided images are subjected to an image matching process by the data processing unit (100). Here, images can be processed by image matching or by feeding to an artificial neural network based on image matching. Convolutional neural networks (CNN) or vSLAM (visual Simultaneous Localisation and Mapping) can also be used.
Here, in order to determine the movement/trajectory (T) of the distal end (10), or more precisely of the image acquisition point, each image is matched with the next image in the sequence (according to the time variable). For video input, each frame is matched with the next sequential frame. After matching, the positions of the reference points and features in the first image are compared with their positions in the next frame. The differences in the positions of these reference points are used to estimate the motion of the endoscope in three dimensions and six degrees of freedom. The estimates obtained from matching are then converted into 3D translations to create the 3D trajectory (T). After the trajectories (T) are created, they are preferably normalized and/or denoised.
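The conversion of frame-to-frame motion estimates into a 3D trajectory (T) can be sketched as follows. This is an illustrative sketch only: the function name and the assumption that each matching step yields a rotation matrix and translation vector (as produced, for example, by an essential-matrix decomposition, up to scale) are choices made for the example, not requirements of the method:

```python
import numpy as np

def chain_relative_poses(relative_poses):
    """Compose per-frame relative motions into a global 3D trajectory (T).

    Each element of `relative_poses` is the estimated camera motion
    between two consecutive frames: a 3x3 rotation matrix R and a
    3-vector translation t. Returns an (N+1, 3) array of distal-end
    positions, starting at the origin of the first frame.
    """
    R_global = np.eye(3)            # accumulated orientation
    p_global = np.zeros(3)          # accumulated position
    trajectory = [p_global.copy()]
    for R_rel, t_rel in relative_poses:
        # Express the relative step in the global frame, then accumulate.
        p_global = p_global + R_global @ t_rel
        R_global = R_global @ R_rel
        trajectory.append(p_global.copy())
    return np.asarray(trajectory)
```

Normalization and denoising (e.g. smoothing filters on the resulting point sequence) would be applied to the output of such a composition step.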
The obtained trajectories (T) are used to evaluate the endoscopy procedure in terms of operator skill. Here, the trajectory (T) can be specifically used to evaluate operator movement accuracy, agility and economy (completing the procedure with minimal movements). Additionally, this trajectory (T) data can also be used for operator training.
Industrial Applicability
The aforementioned trajectory motions can be evaluated based on predetermined metrics and intervals related to these metrics. The aforementioned metrics are variables that can be derived from any movement performed by the operator with the endoscope distal end during the operation.
In a preferred embodiment, the metrics can be at least one of total trajectory (T) length, linear or angular acceleration, linear and angular speed, rotation of the trajectory (T), or tremors during the motion, preferably more than one, especially all of them. Here, while the aforementioned metrics may rely solely on the obtained motion trajectory (T) data, data obtained from different sensors and devices can also be used as criteria for this assessment.
Alternatively, the obtained motion trajectory (T) can also be fed to an artificial neural network to obtain a prediction result. Here, the aforementioned artificial neural network may have been trained with a dataset containing motion trajectories (T) and various inputs indicating their adequacy. As a result, it can provide a response regarding whether the estimated motion trajectory (T) is adequate or not, and preferably actions required to correct the motion trajectory (T).
In another alternative, a predetermined motion trajectory (T) and the estimated motion trajectory (T) can be directly compared geometrically, and based on the amount of similarity, the sufficiency of the performed endoscopy procedure can be evaluated.
In another alternative, the previously determined motion trajectory (T) is a trajectory obtained during an earlier procedure, defined as successful, that was performed on the same patient undergoing the current operation. The newly obtained motion trajectory (T) is compared to this previous trajectory (T) to enable both general and real-time evaluation of the operator (if images are provided in real-time during the operation).
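The direct geometric comparison of an estimated trajectory (T) with a predetermined reference trajectory can be sketched, for illustration, with the discrete Fréchet distance. The choice of this particular similarity measure is an assumption of the example; any trajectory similarity metric could serve:

```python
import numpy as np

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between two 3D trajectories.

    A small value indicates the estimated trajectory closely follows
    the reference; dynamic-programming formulation over the pairwise
    distance matrix of the two point sequences.
    """
    P, Q = np.asarray(P, dtype=float), np.asarray(Q, dtype=float)
    n, m = len(P), len(Q)
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)
    ca = np.empty((n, m))
    ca[0, 0] = d[0, 0]
    for i in range(1, n):                      # first column
        ca[i, 0] = max(ca[i - 1, 0], d[i, 0])
    for j in range(1, m):                      # first row
        ca[0, j] = max(ca[0, j - 1], d[0, j])
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1],
                               ca[i, j - 1]), d[i, j])
    return float(ca[-1, -1])
```

The resulting distance could be thresholded, or mapped to a similarity score, to judge the sufficiency of the performed procedure.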
In addition to determining whether the procedure is sufficient or not, a skill score can also be generated for an operator based on the evaluation result.
With reference to Figure 2; for the proper execution of endoscopy, especially colonoscopy, the minimum duration is determined as 6 minutes, and procedures above 6 minutes are considered successful. However, these 6 minutes may be distributed inefficiently. That is, each region of the organ (O) may not have been examined equally or at least for similar durations. Thus, the organ (O), for instance the intestine, is divided into at least two regions (S), and each region is assigned a certain time criterion. For example, the intestine is divided into two regions (S) and it is required to spend at least 3 minutes in each region. The time spent in each region (S) is determined based on the time and position data found in the estimated trajectory (T). Additionally, if there are regions (S) that need to be imaged longer than other regions (S), the durations defined for the regions (S) can be specified differently from each other.
Additionally, the accuracy of the evaluation can increase when trajectory (T) and region-based (S) assessments are made together. Furthermore, predefined trajectory (T) motions for some regions (S) can be specified differently than others according to procedural needs.
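The per-region timing check described above can be sketched as follows. This is an illustrative sketch under stated assumptions: the function name is hypothetical, and it presumes each trajectory sample has already been assigned to a region (S), for example from its estimated position along the organ:

```python
import numpy as np

def time_per_region(timestamps, region_labels, required_seconds):
    """Check per-region observation time against required durations.

    `region_labels[i]` is the region (S) assigned to trajectory sample i
    and `timestamps` the sample times in seconds. `required_seconds`
    maps each region to its minimum observation time. Returns the
    seconds spent per region and the set of deficient regions.
    """
    spent = {}
    dt = np.diff(np.asarray(timestamps, dtype=float))
    for label, step in zip(region_labels[:-1], dt):
        # Attribute each inter-sample interval to the region it started in.
        spent[label] = spent.get(label, 0.0) + float(step)
    deficient = {r for r, need in required_seconds.items()
                 if spent.get(r, 0.0) < need}
    return spent, deficient
```

For the two-region colonoscopy example with a 3-minute requirement per region, a procedure spending 200 seconds in one region and 160 seconds in the other would flag the second region as insufficiently surveyed.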
With reference to Figure 3; in endoscopy, especially colonoscopy, inspecting the folded regions (F) is of great importance. Under normal circumstances, it is not possible to visualize folded regions (F) during colonoscopy performed with linear motion, or only partial visuals are obtained, yet polyps may also be present in these regions. For this reason, the operator should move the distal end (10) of the endoscopy device in a circular, helical trajectory (T) during colonoscopy. Whether this motion is performed or not can be determined by inspecting the obtained trajectory (T). For this check, the aforementioned metrics can be evaluated for the presence of circular motion, for example by the artificial neural network.
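One simple way to check for a circular or helical maneuver from the trajectory alone is to measure the total angle the tip sweeps around the lumen axis. The sketch below is illustrative: the fixed axis is a simplifying assumption (in practice the local lumen axis would itself be estimated), and the 2*pi threshold interpretation is a choice made for the example:

```python
import numpy as np

def angular_sweep(points, axis=(0.0, 0.0, 1.0)):
    """Total angle swept by the distal end around a lumen axis.

    Projects the trajectory (T) points onto the plane normal to `axis`
    and sums the frame-to-frame angle changes; a cumulative sweep of
    about 2*pi or more is taken here as evidence of a circular/helical
    maneuver around a folded region (F).
    """
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    pts = np.asarray(points, dtype=float)
    # Build two orthonormal vectors spanning the plane normal to the axis.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(axis @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, helper)
    u = u / np.linalg.norm(u)
    v = np.cross(axis, u)
    # Unwrapped polar angle of each projected point, then total variation.
    angles = np.unwrap(np.arctan2(pts @ v, pts @ u))
    return float(np.abs(np.diff(angles)).sum())
```

A purely linear advance projects to a single point in the plane and yields a sweep near zero, while one full helical loop yields a sweep near 2*pi.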
Additionally, images obtained by the imaging element (11) can be fed to an artificial neural network trained to specifically detect folded regions (F) and folded regions (F) can be detected in the images.
Additionally, the adequacy of the visualization of a folded region (F) can also be assessed by taking the ratio of the total area of the obtained images to the area of the related folded region (F). Here, the aforementioned total area includes the folded region (F) as well as non-informative areas and lumen view areas.
By comparing the obtained folded regions (F) with the motion trajectory (T), it can be determined whether sufficient images are obtained from these folded regions and/or sufficient duration is allocated. In a preferred embodiment, a three-dimensional reconstruction of the organ (O) can be performed on the obtained trajectory (T) and a visual evaluation can be made by matching the aforementioned three-dimensional reconstruction with the trajectory (T). Thus, the regions observed on the trajectory (T) can be more easily identified.
Moreover, instant feedback can also be generated in the present method based on the detected motion trajectory (T). During processing and evaluation of instantaneous multiple images provided from the endoscopy device, if the evaluation results are negative, a warning response can be generated. This warning response may be arranged to trigger a warning element provided on the data processing unit (100) where the related method is operated, or on another device connected to the data processing unit (100). This warning element can be of a visual, audio or haptic type. For example, a screen or a haptic device, especially a haptic device provided on the endoscope, can serve as the warning element.
The data processing unit (100) configured to execute the method of the invention contains appropriate elements to execute the method steps. These elements may be a processing unit to process data and a memory unit to store data. Additionally, it may contain an input unit to receive multiple images provided by the endoscopy. Here, the input unit may directly receive data from the endoscopy device or imaging element (11) or may be provided as a USB, compact disk input for input of previously provided data. Additionally, receiving multiple images is possible through online data acquisition methods.

Claims

CLAIMS
1. A computer-implemented method for evaluating an endoscopy procedure, comprising: obtaining organ (O) images from an endoscope, determining a three-dimensional movement trajectory (T) of the endoscope by subjecting said multiple images to an image matching process, evaluating the endoscope movement based on said trajectory (T) by comparison with predefined trajectory motions.
2. The method of claim 1, wherein said multiple images are fed into an image matching based artificial neural network to determine the three-dimensional movement trajectory (T) of the endoscope.
3. The method of claim 1, wherein the endoscope movement is evaluated by an artificial neural network trained on predefined trajectory (T) motions.
4. The method of claim 1, wherein the endoscope trajectory (T) is evaluated by an artificial neural network trained on predetermined metrics.
5. The method of claim 4, wherein said metrics are variables derivable from the motion trajectory and positional changes.
6. The method of claim 4 or 5, wherein said metrics comprise at least one of total trajectory (T) length, linear or angular acceleration, linear and angular velocity, rotation of the trajectory (T), or tremors during the motion.
7. The method of claim 1, wherein the endoscope movement is evaluated by comparing the obtained endoscope trajectory (T) with predefined trajectory (T) motions.
8. The method of claim 1, comprising: defining multiple regions (S) in the imaged organ (O), determining from said trajectory (T) the time spent by the endoscope in each of said regions (S), evaluating the time spent in each region (S) based on predefined time parameters.
9. The method of claim 8, wherein evaluating based on said time parameter comprises comparing the predefined time for a region (S) with the time spent in said region (S) determined from the trajectory (T).
10. The method of claim 1, comprising: detecting organ (O) fold regions (F) by feeding endoscope images into a neural network, comparing the obtained three-dimensional motion trajectory with the detected fold regions (F).
11. The method of claim 10, wherein the comparison determines if adequate images are obtained from the fold region (F) based on the three-dimensional motion trajectory.
12. The method of claim 11, wherein determining if adequate images are obtained comprises ratioing the total imaged area with the area of the imaged fold region (F).
13. The method of claim 12, wherein the total imaged area includes non-informative areas and lumen view areas besides the fold region (F).
14. The method of claim 11, wherein the comparison determines if adequate duration is spent visualizing the fold region (F) based on the three-dimensional motion trajectory.
15. The method of any preceding claim, wherein said predefined trajectory (T) motions comprise circular motions.
16. The method of any preceding claim, comprising generating a warning response based on the evaluation outcome.
17. The method of claim 16, wherein said warning response triggers a visual, audio or haptic alert element.
18. The method of any preceding claim, wherein said evaluation is performed in real-time, at defined intervals or post-procedure.
19. The method of any preceding claim, wherein said evaluation is performed after completion of the procedure.
20. The method of any preceding claim, comprising reconstructing a three-dimensional visualization of the organ (O) based on said trajectory (T).
21. The method of claim 20, comprising performing evaluation by correlating the reconstructed three-dimensional organ (O) and the three-dimensional trajectory (T).
22. The method of claim 1, comprising normalization and/or noise reduction of said three-dimensional trajectory (T).
23. A device for evaluating an endoscopy procedure comprising elements configured to execute the steps of any of claims 1-22.

24. A computer program for evaluating an endoscopy procedure, which when executed by a data processing unit (100) causes the steps of any of claims 1-22 to be performed.
25. A computer-readable medium storing instructions which when executed by a data processing unit (100) causes the steps of any of claims 1-22 to be performed.
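Claims 4-6 evaluate the trajectory (T) via metrics derivable from the motion trajectory and positional changes, such as total trajectory length, velocity, acceleration, and tremors. As a purely illustrative sketch (not the patented implementation), the following assumes the trajectory is available as a sampled sequence of three-dimensional tip positions with timestamps; the function name, units, and the moving-average tremor proxy are hypothetical choices:

```python
import numpy as np

def trajectory_metrics(positions, timestamps):
    """Illustrative motion metrics from a sampled 3-D endoscope trajectory.

    positions:  (N, 3) array-like of estimated tip positions (e.g. mm)
    timestamps: (N,) array-like of acquisition times in seconds
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)

    segments = np.diff(positions, axis=0)       # (N-1, 3) displacement vectors
    seg_len = np.linalg.norm(segments, axis=1)  # per-segment distance
    dt = np.diff(timestamps)                    # per-segment duration

    total_length = seg_len.sum()                # total trajectory (T) length
    speed = seg_len / dt                        # linear velocity magnitude
    accel = np.diff(speed) / dt[1:]             # linear acceleration magnitude

    # Tremor proxy (assumption): high-frequency jitter measured as the
    # standard deviation of the speed residual after a 5-sample moving average.
    smoothed = np.convolve(speed, np.ones(5) / 5, mode="same")
    tremor = float(np.std(speed - smoothed))

    return {
        "total_length": float(total_length),
        "mean_speed": float(speed.mean()),
        "max_abs_accel": float(np.abs(accel).max()) if accel.size else 0.0,
        "tremor_index": tremor,
    }
```

A downstream evaluator, per claim 4, could feed such metric vectors into a neural network trained on reference procedures; the sketch only shows how the raw metrics might be derived from positional changes.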
PCT/TR2023/051219 2022-11-02 2023-10-30 Method and device for endoscopy evaluation WO2024096840A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR2022/016566 TR2022016566A1 (en) 2022-11-02 A METHOD AND DEVICE FOR ENDOSCOPY ASSESSMENT
TR2022016566 2022-11-02

Publications (1)

Publication Number Publication Date
WO2024096840A1 true WO2024096840A1 (en) 2024-05-10

Family

ID=90931213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2023/051219 WO2024096840A1 (en) 2022-11-02 2023-10-30 Method and device for endoscopy evaluation

Country Status (1)

Country Link
WO (1) WO2024096840A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021156159A1 (en) * 2020-02-03 2021-08-12 Cosmo Artificial Intelligence - AI Limited Systems and methods for contextual image analysis
US20210280312A1 (en) * 2020-03-06 2021-09-09 Verily Life Sciences Llc Detecting deficient coverage in gastroenterological procedures
CN113763360A (en) * 2021-09-08 2021-12-07 山东大学 Digestive endoscopy simulator inspection quality assessment method and system
WO2023057986A2 (en) * 2021-10-08 2023-04-13 Cosmo Artificial Intelligence - AI Limited Computer-implemented systems and methods for analyzing examination quality for an endoscopic procedure

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23886465

Country of ref document: EP

Kind code of ref document: A1