EP3092951B1 - Method and apparatus for synthesizing medical images - Google Patents
Method and apparatus for synthesizing medical images
- Publication number
- EP3092951B1 (application EP16156252.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- medical image
- signal information
- image frames
- ecg
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5288—Devices using data or image processing specially adapted for radiation diagnosis involving retrospective matching to a physiological signal
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
- A61B8/085—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0891—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5207—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5246—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
- A61B8/5253—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from the same or different imaging techniques, e.g. color Doppler and B-mode combining overlapping images, e.g. spatial compounding
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5269—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
- A61B8/5276—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts due to motion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5284—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving retrospective matching to a physiological signal
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01R—MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
- G01R33/00—Arrangements or instruments for measuring magnetic variables
- G01R33/20—Arrangements or instruments for measuring magnetic variables involving magnetic resonance
- G01R33/44—Arrangements or instruments for measuring magnetic variables involving magnetic resonance using nuclear magnetic resonance [NMR]
- G01R33/48—NMR imaging systems
- G01R33/54—Signal processing systems, e.g. using pulse sequences ; Generation or control of pulse sequences; Operator console
- G01R33/56—Image enhancement or correction, e.g. subtraction or averaging techniques, e.g. improvement of signal-to-noise ratio and resolution
- G01R33/5608—Data processing and visualization specially adapted for MR, e.g. for feature analysis and pattern recognition on the basis of measured MR data, segmentation of measured MR data, edge contour detection on the basis of measured MR data, for enhancing measured MR data in terms of signal-to-noise ratio by means of noise filtering or apodization, for enhancing measured MR data in terms of resolution by means for deblurring, windowing, zero filling, or generation of gray-scaled images, colour-coded images or images displaying vectors instead of pixels
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8934—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration
- G01S15/8936—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration using transducers mounted for mechanical movement in three dimensions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/5206—Two-dimensional coordinated display of distance and direction; B-scan display
- G01S7/52065—Compound scan display, e.g. panoramic imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52085—Details related to the ultrasound signal acquisition, e.g. scan sequences
- G01S7/52087—Details related to the ultrasound signal acquisition, e.g. scan sequences using synchronization techniques
- G01S7/52088—Details related to the ultrasound signal acquisition, e.g. scan sequences using synchronization techniques involving retrospective scan line rearrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/003—Reconstruction from projections, e.g. tomography
- G06T11/008—Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/503—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of the heart
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/504—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of blood vessels, e.g. by angiography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/506—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of nerves
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
- A61B6/5241—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT combining overlapping images of the same imaging modality, e.g. by stitching
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5258—Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise
- A61B6/5264—Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0883—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/488—Diagnostic techniques involving Doppler signals
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01R—MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
- G01R33/00—Arrangements or instruments for measuring magnetic variables
- G01R33/20—Arrangements or instruments for measuring magnetic variables involving magnetic resonance
- G01R33/44—Arrangements or instruments for measuring magnetic variables involving magnetic resonance using nuclear magnetic resonance [NMR]
- G01R33/48—NMR imaging systems
- G01R33/54—Signal processing systems, e.g. using pulse sequences ; Generation or control of pulse sequences; Operator console
- G01R33/56—Image enhancement or correction, e.g. subtraction or averaging techniques, e.g. improvement of signal-to-noise ratio and resolution
- G01R33/567—Image enhancement or correction, e.g. subtraction or averaging techniques, e.g. improvement of signal-to-noise ratio and resolution gated by physiological signals, i.e. synchronization of acquired MR data with periodical motion of an object of interest, e.g. monitoring or triggering system for cardiac or respiratory gating
- G01R33/5673—Gating or triggering based on a physiological signal other than an MR signal, e.g. ECG gating or motion monitoring using optical systems for monitoring the motion of a fiducial marker
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8979—Combined Doppler and pulse-echo imaging systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8995—Combining images from different aspect angles, e.g. spatial compounding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30048—Heart; Cardiac
Definitions
- the exemplary embodiments relate to methods and apparatuses for synthesizing medical images in consideration of bio-signal information corresponding to a physical activity of an object, and more particularly, to methods and apparatuses for synthesizing medical images in consideration of electrocardiogram (ECG) signal information of an object.
- Ultrasound diagnostic apparatuses transmit ultrasound signals generated by transducers of a probe to an object and receive echo signals reflected from the object, thereby obtaining at least one image of an internal part of the object (e.g., soft tissues or blood flow).
- ultrasound diagnostic apparatuses are used for medical purposes including observation of the interior of an object, detection of foreign substances, and diagnosis of damage to the object.
- Such ultrasound diagnostic apparatuses provide high stability, display images in real time, and, unlike X-ray apparatuses, are safe because they do not expose the patient to radiation. Therefore, ultrasound diagnostic apparatuses are widely used together with other image diagnostic apparatuses, including computed tomography (CT) apparatuses, magnetic resonance imaging (MRI) apparatuses, and the like.
- An ultrasound system provides panoramic images based on ultrasound images that are continuously obtained by moving an ultrasonic probe along a surface of an object. That is, the ultrasound system continuously obtains ultrasound images by moving the ultrasonic probe along the surface of the object and forms panoramic images by synthesizing the obtained ultrasound images.
- US 2009198134 A1, which is considered to constitute the closest prior art, discloses a method and device that use selected image data and the positioning and overlapped area of 3D image data acquired from different echo windows in the same time phase of a signal synchronized with the working of the heart, and synthesize (combine) the resultant data, thereby forming combined panoramic 4D image data consisting of panoramic three-dimensional image data that are continuous in time and have a display area larger than that of the three-dimensional image data.
- the term "and/or" includes any and all combinations of one or more of the correlated listed items. Expressions such as "at least one of," when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
- the term "image" used herein may refer to multi-dimensional data including discrete image elements (e.g., pixels for two-dimensional (2D) images and voxels for three-dimensional (3D) images).
- the image may be, but is not limited to being, a medical image (e.g., an ultrasound image, a computed tomography (CT) image, or a magnetic resonance (MR) image) of an object that is obtained by an ultrasound apparatus, a CT apparatus, or a magnetic resonance imaging (MRI) apparatus.
- the ultrasound image may refer to an image obtained by emitting an ultrasound signal, which is generated from a transducer of a probe, to the object and receiving information of an echo signal reflected from the object.
- the ultrasound image may be formed in various ways.
- the ultrasound image may be at least one of an amplitude (A)-mode image, a brightness (B)-mode image, a color (C)-mode image, and a Doppler (D)-mode image.
- the CT image may refer to an image obtained by synthesizing a plurality of X-ray images that are obtained by taking a picture of the object through rotation about at least one axis of the object.
- the MR image may refer to an image of the object that is obtained by using nuclear magnetic resonance (NMR).
- the term "object” may refer to a human, an animal, or a part of a human or animal.
- the object may be an organ (e.g., the liver, the heart, the womb, the brain, a breast, or the abdomen), a blood vessel, or a combination thereof.
- the object may be a phantom.
- the phantom refers to a material having a density, an effective atomic number, and a volume that are approximately the same as those of an organism.
- the phantom may refer to a spherical phantom having properties similar to a human body.
- the term "user” may refer to, but is not limited to referring to, a medical expert, for example, a medical doctor, a nurse, a medical laboratory technologist, or a medical imaging expert, or a technician who repairs medical apparatuses.
- FIG. 1 is a block diagram illustrating a configuration of an apparatus 100 for synthesizing medical images, according to an exemplary embodiment.
- the apparatus 100 may include a data acquirer 110, an image processor 120, and a display 130.
- the data acquirer 110 may acquire image data of an object.
- the data acquirer 110 may transmit an ultrasound signal to the object and may receive an echo signal reflected from the object.
- the data acquirer 110 may process the received echo signal and may generate ultrasound image data of the object.
- the data acquirer 110 may transmit a radio frequency (RF) signal to the object and may receive an MR signal that is emitted from the object.
- the data acquirer 110 may process the received MR signal and may generate MR image data of the object.
- the data acquirer 110 may transmit X-rays to the object and may detect an X-ray signal transmitted through the object.
- the data acquirer 110 may process the detected X-ray signal and may generate CT image data of the object.
- the data acquirer 110 may receive image data that is generated by an ultrasound diagnostic apparatus, an MR apparatus, or a CT apparatus that is located outside the apparatus 100, without receiving an ultrasound signal, an MR signal, or an X-ray signal from the object and directly generating image data of the object.
- the image processor 120 may generate a plurality of first medical image frames based on the image data that is received from the data acquirer 110.
- the plurality of first medical image frames may be a plurality of image frames that are temporally adjacent to one another.
- the image processor 120 may generate a panoramic image based on the plurality of first medical image frames.
- the image processor 120 may generate the panoramic image by synthesizing second medical image frames that are selected based on bio-signal information of the object from among the first medical image frames.
- the bio-signal information of the object may include information related to a body movement corresponding to a physical activity of the object for a predetermined period of time for which the first medical image frames are generated.
- the bio-signal information may be obtained from at least one medical image obtained by taking a picture of the body movement of the object.
- the at least one medical image may include, but is not limited to including, at least one of a blood vessel image, a musculoskeletal image, and an electrocardiogram (ECG) image.
- the image processor 120 may select second medical image frames corresponding to points in time that have the same ECG signal information of the object from among the plurality of first medical image frames and may generate a panoramic image by synthesizing only the second medical image frames.
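- The following is a minimal illustrative sketch, not taken from the patent, of how frames sharing the same ECG phase might be selected; the function name, the frames/frame_phases/target_phase parameters, and the tolerance value are all assumptions introduced for illustration. A frame is treated as having "the same ECG signal information" as the target when its cardiac phase falls within a small tolerance window.

```python
import numpy as np

def select_gated_frames(frames, frame_phases, target_phase, tol=0.05):
    """Return the frames whose cardiac phase matches the target phase.

    frames       : list of 2-D numpy arrays (the first medical image frames)
    frame_phases : per-frame cardiac phases in [0, 1), e.g. derived from
                   ECG R-peaks (0 = R-peak, values increase through the cycle)
    target_phase : phase at which the panorama should be built
    tol          : tolerance on the phase match (fraction of a cycle)
    """
    frame_phases = np.asarray(frame_phases, dtype=float)
    # Circular phase distance so that phases 0.98 and 0.02 are considered close.
    dist = np.abs((frame_phases - target_phase + 0.5) % 1.0 - 0.5)
    return [f for f, d in zip(frames, dist) if d <= tol]
```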
- the display 130 may output the panoramic image that is generated by the image processor 120.
- the display 130 may output and display various pieces of information processed by the apparatus 100 as well as the panoramic image through a graphical user interface (GUI) onto a screen.
- the apparatus 100 may include two or more displays 130 according to a type of the apparatus 100.
- FIG. 2 is a block diagram illustrating a configuration of an apparatus 1000 for synthesizing medical images, according to an exemplary embodiment.
- the apparatus 1000 may be an ultrasound diagnostic apparatus.
- the apparatus 1000 may include a probe 20, an ultrasound transceiver 1100, an image processor 1200, a communication module 1300 (e.g., communicator), a display 1400, a memory 1500, an input device 1600, and a controller 1700, which may be connected to one another via a bus 1800.
- the data acquirer 110 of FIG. 1 may correspond to the ultrasound transceiver 1100 of FIG. 2
- the image processor 120 of FIG. 1 may correspond to the image processor 1200 of FIG. 2
- the display 130 of FIG. 1 may correspond to the display 1400 of FIG. 2 .
- the apparatus 1000 may be a cart-type apparatus or a portable apparatus.
- portable ultrasound diagnostic apparatuses may include, but are not limited to including, a picture archiving and communication system (PACS) viewer, a smartphone, a laptop computer, a personal digital assistant (PDA), and a tablet PC.
- the probe 20 transmits ultrasound waves to an object 10 in response to a driving signal applied by the ultrasound transceiver 1100 and receives echo signals reflected from the object 10.
- the probe 20 includes a plurality of transducers, and the plurality of transducers oscillate in response to electrical signals and generate acoustic energy, that is, ultrasound waves.
- the probe 20 may be connected to a main body of the apparatus 1000 by wire or wirelessly, and according to exemplary embodiments, the apparatus 1000 may include a plurality of the probes 20.
- the probe 20 may continuously transmit ultrasound signals to the object 10 while moving along a surface of the object 10 and may continuously receive echo signals reflected from the object 10.
- a transmitter 1110 applies a driving signal to the probe 20.
- the transmitter 1110 includes a pulse generator 1112, a transmission delaying unit 1114 (e.g., transmission delayer), and a pulser 1116.
- the pulse generator 1112 generates pulses for forming transmission ultrasound waves based on a predetermined pulse repetition frequency (PRF), and the transmission delaying unit 1114 delays the pulses by delay times used for determining transmission directionality.
- the pulses which have been delayed correspond to a plurality of piezoelectric vibrators included in the probe 20, respectively.
- the pulser 1116 applies a driving signal (or a driving pulse) to the probe 20 based on timing corresponding to each of the pulses which have been delayed.
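- As a rough illustration of the transmission delays described above (not the patent's implementation), the sketch below computes geometric focusing delays for a linear array; the element pitch, focal depth, and speed of sound are assumed values.

```python
import numpy as np

def transmit_focus_delays(n_elements, pitch, focus_depth, c=1540.0):
    """Geometric transmit delays (seconds) that focus a linear array at a depth.

    n_elements  : number of transducer elements
    pitch       : element spacing in metres
    focus_depth : focal depth in metres along the array's centre axis
    c           : assumed speed of sound in tissue (m/s)
    """
    # Element x-positions, centred on the array axis.
    x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch
    # Path length from each element to the focal point.
    path = np.sqrt(x**2 + focus_depth**2)
    # Outer elements have the longest path, so they must fire first;
    # delays are referenced to the earliest-firing element.
    return (path.max() - path) / c
```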
- a receiver 1120 generates ultrasound data by processing echo signals received from the probe 20.
- the receiver 1120 may include an amplifier 1122, an analog-to-digital converter (ADC) 1124, a reception delaying unit 1126 (e.g., reception delayer), and a summing unit 1128 (e.g., summer).
- the amplifier 1122 amplifies echo signals in each channel, and the ADC 1124 performs analog-to-digital conversion with respect to the amplified echo signals.
- the reception delaying unit 1126 delays digital echo signals output by the ADC 1124 by delay times used for determining reception directionality, and the summing unit 1128 generates ultrasound data by summing the echo signals processed by the reception delaying unit 1126.
- the receiver 1120 may not include the amplifier 1122; that is, if the sensitivity of the probe 20 or the bit-processing capability of the ADC 1124 is sufficiently high, the amplifier 1122 may be omitted.
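- The receive chain above (amplify, digitize, delay, sum) amounts to delay-and-sum beamforming. A simplified, assumption-laden sketch follows; apodization, dynamic focusing, and fractional-sample delays are deliberately omitted, delays are assumed non-negative, and all names are illustrative.

```python
import numpy as np

def delay_and_sum(channel_data, delays_s, fs):
    """Very simplified delay-and-sum receive beamforming.

    channel_data : array of shape (n_channels, n_samples), digitized echoes
                   (i.e. the output of the per-channel amplifier and ADC)
    delays_s     : per-channel reception delays in seconds (non-negative)
    fs           : ADC sampling rate in Hz
    """
    n_channels, n_samples = channel_data.shape
    shifts = np.round(np.asarray(delays_s) * fs).astype(int)
    out = np.zeros(n_samples)
    for ch in range(n_channels):
        # Shift each channel by its integer-sample delay, then sum coherently.
        s = shifts[ch]
        out[s:] += channel_data[ch, :n_samples - s]
    return out
```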
- the ultrasound transceiver 1100 may generate ultrasound image data by continuously transmitting ultrasound signals to the object 10 and continuously receiving response signals to the transmitted ultrasound signals.
- the image processor 1200 generates an ultrasound image by scan-converting ultrasound data generated by the ultrasound transceiver 1100 and displays the ultrasound image.
- the ultrasound image may be not only a grayscale ultrasound image obtained by scanning the object 10 in an amplitude (A)-mode, a brightness (B)-mode, and a motion (M)-mode, but also a Doppler image showing a movement of the object 10 via a Doppler effect.
- the Doppler image may be a blood flow Doppler image showing the flow of blood (also referred to as a color Doppler image), a tissue Doppler image showing a movement of tissue, or a spectral Doppler image showing a moving speed of the object 10 as a waveform.
- a B-mode processor 1212 extracts B-mode components from the ultrasound data and processes the B-mode components.
- An image generator 1220 may generate an ultrasound image indicating signal intensities as brightness based on the extracted B-mode components.
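- A hedged sketch of how B-mode brightness values are commonly derived from beamformed RF data (envelope detection followed by log compression); this is a generic illustration, not the patent's B-mode processor 1212, and the dynamic-range value is an assumption.

```python
import numpy as np
from scipy.signal import hilbert

def bmode_line(rf_line, dynamic_range_db=60.0):
    """Convert one beamformed RF scan line to B-mode brightness values."""
    envelope = np.abs(hilbert(rf_line))          # envelope detection
    envelope /= envelope.max() + 1e-12           # normalize to the strongest echo
    db = 20.0 * np.log10(envelope + 1e-12)       # log compression
    # Map [-dynamic_range_db, 0] dB to [0, 255] grayscale brightness.
    img = np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0) * 255.0
    return img.astype(np.uint8)
```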
- a Doppler processor 1214 may extract Doppler components from the ultrasound data, and the image generator 1220 may generate a Doppler image indicating a movement of the object 10 as colors or waveforms based on the extracted Doppler components.
- the image generator 1220 may generate a 3D ultrasound image via volume-rendering with respect to volume data and may also generate an elasticity image by imaging deformation of the object 10 due to pressure. Furthermore, the image generator 1220 may display various pieces of additional information in an ultrasound image by using text and graphics. In addition, the generated ultrasound image may be stored in the memory 1500.
- the image processor 1200 may generate a plurality of first medical image frames based on the ultrasound image data that is received from the ultrasound transceiver 1100.
- the plurality of first medical image frames may be a plurality of image frames that are temporally adjacent to one another.
- the image processor 1200 may generate a panoramic image based on the plurality of first medical image frames.
- the image processor 1200 may generate the panoramic image by synthesizing second medical image frames that are selected based on bio-signal information of the object 10 from among the first medical image frames.
- the bio-signal information of the object 10 may include information related to a body movement corresponding to a physical activity of the object 10 during a predetermined period of time for which the first medical image frames are generated.
- the process of synthesizing the second medical image frames may be performed according to many different synthesizing techniques, as would be appreciated by one of ordinary skill in the art.
- the process of synthesizing according to an exemplary embodiment may employ various types of synthesis algorithms, although it is not limited thereto.
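- One such synthesis approach, shown only as an illustrative sketch (the patent does not prescribe it), is to paste each already-registered frame onto a wider canvas at its estimated offset and average the pixels where frames overlap; offsets_px is a hypothetical input assumed to come from a separate registration step.

```python
import numpy as np

def composite_frames(frames, offsets_px, frame_shape):
    """Compose already-registered frames into one wider (panoramic) image.

    frames     : list of 2-D arrays, all of shape frame_shape
    offsets_px : horizontal pixel offset of each frame in the panorama
                 (e.g. estimated by correlating overlapping regions)
    frame_shape: (height, width) of a single frame
    """
    h, w = frame_shape
    width = max(offsets_px) + w
    acc = np.zeros((h, width))
    cnt = np.zeros((h, width))
    for frame, off in zip(frames, offsets_px):
        acc[:, off:off + w] += frame
        cnt[:, off:off + w] += 1.0
    cnt[cnt == 0] = 1.0          # avoid division by zero in uncovered columns
    return acc / cnt             # average where frames overlap
```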
- the bio-signal information may be obtained from at least one medical image that is obtained by taking a picture of the body movement of the object 10.
- the at least one medical image according to an exemplary embodiment includes a medical image that is different from the first medical image frames.
- the bio-signal information of the object 10 may be directly obtained by the controller 1700 based on at least one medical image received from an apparatus for obtaining medical images.
- bio-signal information may be directly obtained from the apparatus for obtaining medical images and may be received through the communication unit 1300.
- an apparatus for obtaining images may be an apparatus that obtains medical images of the object 10.
- Examples of the apparatus for obtaining images (not shown) according to an exemplary embodiment may include, but are not limited to including, a CT apparatus, an MRI apparatus, an angiography apparatus, and an ultrasound apparatus.
- the apparatus for obtaining images may include a plurality of apparatuses for obtaining images, and may include different types of apparatuses for obtaining images using different image obtaining methods or the same type of apparatuses for obtaining images using the same image obtaining method.
- the at least one medical image may include at least one of a blood vessel image, a musculoskeletal image, and an ECG image, although it is not limited thereto, as long as the medical image is obtained by taking a picture of the body movement of the object 10.
- the communication unit 1300 may receive the at least one medical image obtained by taking a picture of the body movement of the object 10 from the apparatus for obtaining medical images through a network. Also, the communication unit 1300 according to an exemplary embodiment may directly obtain the bio-signal information from the apparatus for obtaining images.
- the image processor 1200 may select second medical image frames corresponding to points in time that have the same ECG signal information of the object 10 from the first medical image frames and may generate panoramic images by synthesizing only the second medical image frames.
- the display 1400 displays the generated ultrasound image.
- the display 1400 may display not only the ultrasound image, but also various pieces of information processed by the apparatus 1000 on a screen through a GUI.
- the apparatus 1000 may include two or more displays 1400 according to exemplary embodiments.
- the display 1400 may output the panoramic images that are generated by the image processor 1200.
- the communication unit 1300 is connected to a network 30 by wire or wirelessly to communicate with an external device or a server.
- the communication unit 1300 may exchange data with a hospital server or another medical apparatus in a hospital, which is connected thereto via a PACS.
- the communication unit 1300 may perform data communication according to the digital imaging and communications in medicine (DICOM) standard.
- the communication unit 1300 may transmit or receive data related to diagnosis of the object 10, e.g., an ultrasound image, ultrasound data, and Doppler data of the object 10, via the network 30 and may also transmit or receive medical images captured by another medical apparatus, e.g., a CT apparatus, an MRI apparatus, or an X-ray apparatus. Furthermore, the communication unit 1300 may receive information about a diagnosis history or a medical treatment schedule of a patient from a server and utilize the received information to diagnose the patient. Furthermore, the communication unit 1300 may perform data communication not only with a server or a medical apparatus in a hospital, but also with a portable terminal of a medical doctor or the patient.
- the communication unit 1300 is connected to the network 30 by wire or wirelessly to exchange data with a server 32, a medical apparatus 34, or a portable terminal 36.
- the communication unit 1300 may include one or more components for communication with external devices.
- the communication unit 1300 may include a local area communication module 1310 (e.g., local area communicator), a wired communication module 1320 (e.g., wired communicator), and a mobile communication module 1330 (e.g., mobile communicator).
- the local area communication module 1310 refers to a module for local area communication within a predetermined distance.
- Examples of local area communication techniques according to an exemplary embodiment may include, but are not limited to including, wireless LAN, Wi-Fi, Bluetooth, ZigBee, Wi-Fi Direct (WFD), ultra wideband (UWB), infrared data association (IrDA), Bluetooth low energy (BLE), and near field communication (NFC).
- the wired communication module 1320 refers to a module for communication using electrical signals or optical signals. Examples of wired communication techniques according to an exemplary embodiment may include communication via a twisted pair cable, a coaxial cable, an optical fiber cable, and an Ethernet cable.
- the mobile communication module 1330 transmits or receives wireless signals to or from at least one selected from a base station, an external terminal, and a server on a mobile communication network.
- the wireless signals may be voice call signals, video call signals, or various types of data for transmission and reception of text/multimedia messages.
- the memory 1500 stores various types of data processed by the apparatus 1000.
- the memory 1500 may store medical data related to diagnosis of the object 10, such as ultrasound data and an ultrasound image that are input or output, and may also store algorithms or programs which are to be executed in the apparatus 1000.
- the memory 1500 may be any of various types of storage media, e.g., a flash memory, a hard disk drive, EEPROM, etc. Furthermore, the apparatus 1000 may utilize web storage or a cloud server that performs the storage function of the memory 1500 online.
- the input device 1600 refers to a unit via which a user inputs data for controlling the apparatus 1000.
- the input device 1600 may include hardware components, such as a keypad, a mouse, a touch pad, a touch screen, and a jog switch.
- exemplary embodiments are not limited thereto, and the input device 1600 may further include any of various other types of input units including an ECG measuring module, a respiration measuring module, a voice recognition sensor, a gesture recognition sensor, a fingerprint recognition sensor, an iris recognition sensor, a depth sensor, a distance sensor, or any other type of sensor known to those skilled in the art.
- the controller 1700 may control all operations of the apparatus 1000. In other words, the controller 1700 may control operations among the probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication unit 1300, the display 1400, the memory 1500, and the input device 1600 shown in FIG. 2.
- All or some of the probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication unit 1300, the display 1400, the memory 1500, the input device 1600, and the controller 1700 may be implemented as software modules. Also, at least one of the ultrasound transceiver 1100, the image processor 1200, and the communication unit 1300 may be included in the controller 1700; however, the exemplary embodiments are not limited thereto.
- FIG. 3 is a flowchart of a method of synthesizing medical images to generate a panoramic image, according to an exemplary embodiment.
- In operation S100, the apparatus 100 generates a plurality of first medical image frames.
- the apparatus 100 may acquire image data of an object and may generate the plurality of first image frames based on the acquired image data.
- the apparatus 100 may transmit an ultrasound signal to the object and may generate the first image frames based on ultrasound image data that is acquired by receiving a response signal to the transmitted ultrasound signal.
- the apparatus 100 may transmit an RF signal to the object and may receive an MR signal that is emitted from the object.
- the apparatus 100 may acquire MR image data of the object by processing the received MR signal and may generate the first image frames based on the acquired MR image data.
- the apparatus 100 may transmit X-rays to the object and may detect an X-ray signal transmitted through the object.
- the apparatus 100 may acquire CT image data of the object by processing the detected X-ray signal and may generate the first medical image frames based on the acquired CT image data.
- the plurality of first medical image frames, which are continuously acquired while the probe 20 moves along a surface of the object for a predetermined period of time, may reflect a body movement related to a physical activity of the object that occurs during a medical procedure.
- the plurality of first medical image frames that are temporally adjacent to one another may be obtained while ECG signal information of the object is obtained.
- In operation S110, the apparatus 100 selects second medical image frames corresponding to points in time that have the same ECG signal information of the object from among the first medical image frames that are generated in operation S100.
- An ultrasound diagnostic apparatus may select the second medical image frames based on bio-signal information of the object from among the first medical image frames.
- the bio-signal information may be the body movement related to the physical activity of the object that occurs while the first medical image frames are generated.
- the bio-signal information according to an exemplary embodiment may be a value that is determined in advance, before the medical procedure performed using the apparatus 1000, which is an ultrasound diagnostic apparatus.
- At least one medical image that is generated by an apparatus for acquiring medical images may be at least one medical image acquired by taking a picture of the body movement of the object before the apparatus 1000 generates the first medical image frames.
- the at least one medical image generated by the apparatus for acquiring medical images may include at least one of a blood vessel image, a musculoskeletal image, and an ECG image.
- examples of the body movement related to the physical activity of the object may include, but are not limited to including, a change in a thickness of a blood vessel and a change in a muscle type according to a heartbeat of the object.
- the bio-signal information according to an exemplary embodiment, which is time information corresponding to a cycle of the body movement of the object, may be obtained based on the at least one medical image generated by the apparatus for obtaining medical images, instead of from the first medical image frames according to an exemplary embodiment.
- the body movement is not limited to the above examples and may be any of many other types of body movements according to exemplary embodiments, such as other movements related to the circulatory system, other types of muscles, neural activity, or movements of other parts of the human body, e.g., other organs or bones.
- the bio-signal information may include, but is not limited to including, at least one of a cycle of the heartbeat of the object, a cycle of the change in the thickness of the blood vessel, and a cycle of the change in the muscle type that are included in the at least one medical image.
- bio-signal information according to an exemplary embodiment, which is state information corresponding to the body movement of the object, may be obtained based on the at least one medical image that is generated by the apparatus for obtaining medical images, instead of from the first medical image frames according to an exemplary embodiment.
- the bio-signal information may include, but is not limited to including, at least one of a state of the heartbeat, a state of the thickness of the blood vessel, and a state of the muscle type of the object that are included in the at least one medical image.
- the apparatus 1000 may correlate the bio-signal information that is obtained based on a medical image that is previously generated before ultrasound imaging with the first medical image frames that are generated in real time during the ultrasound imaging.
- a periodic body movement of the object included in the first medical image frames may be derived from the bio-signal information that is previously determined.
- When the body movement of the object included in the medical image is a periodic movement and the time information corresponding to a cycle of the body movement of the object is T (sec), the apparatus 1000 according to an exemplary embodiment has to obtain the plurality of first medical image frames using an imaging time of at least 2T (sec).
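- As a brief worked illustration of this 2T requirement (the frame rate and the numbers below are assumed, not from the patent), the minimum number of frames follows directly from the imaging time:

```latex
N_{\min} \;\ge\; 2T \cdot f_{\text{frame}},
\qquad \text{e.g. } T = 0.8\,\text{s},\ f_{\text{frame}} = 30\ \text{frames/s}
\;\Rightarrow\; N_{\min} \ge 48\ \text{frames}.
```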
- the bio-signal information of the object may include ECG signal information of the object, and the first medical image frames may be obtained while the ECG signal information of the object is generated.
- the first medical image frames that are generated in real time during ultrasound imaging may be correlated with the ECG signal information of the object that is obtained in real time during the ultrasound imaging. A method of correlating the first medical image frames with the ECG signal information of the object will be explained below in detail with reference to FIG. 7 .
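- A minimal sketch (not FIG. 7 of the patent) of one way such a correlation could be implemented: each frame's acquisition time is converted into a cardiac phase by interpolating between the ECG R-peaks recorded during imaging. The function name and parameters are assumptions.

```python
import numpy as np

def tag_frames_with_ecg_phase(frame_times, r_peak_times):
    """Assign each image frame a cardiac phase in [0, 1) from concurrent ECG data.

    frame_times  : acquisition time (s) of each first medical image frame
    r_peak_times : times (s) of the ECG R-peaks recorded during imaging
    Frames acquired before the first or after the last R-peak get phase NaN.
    """
    frame_times = np.asarray(frame_times, dtype=float)
    r = np.asarray(r_peak_times, dtype=float)
    phases = np.full(frame_times.shape, np.nan)
    for i, t in enumerate(frame_times):
        k = np.searchsorted(r, t) - 1          # index of the preceding R-peak
        if 0 <= k < len(r) - 1:
            phases[i] = (t - r[k]) / (r[k + 1] - r[k])
    return phases
```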
- the second medical image frames may be a plurality of image frames having the same bio-signal information among the first medical image frames.
- the bio-signal information is information related to the body movement corresponding to the physical activity of the object.
- this correlation may indicate that the body movements of the object included in the two medical image frames are the same.
- the second medical image frames according to another exemplary embodiment may be a plurality of image frames corresponding to a plurality of pieces of bio-signal information among the first medical image frames.
- the bio-signal information of the object includes ECG signal information of the object, and only second medical image frames corresponding to points in time that have the same ECG signal information of the object may be selected from among the first medical image frames, which will be explained below in detail with reference to FIGS. 4 and 10 .
- In operation S120, the apparatus 100 generates a panoramic image by synthesizing the second medical image frames that are selected in operation S110.
- the panoramic image may be generated by synthesizing a plurality of second medical image frames having the same bio-signal information among the first medical image frames.
- the bio-signal information may include ECG signal information
- the panoramic image may be generated by synthesizing only a plurality of second medical image frames corresponding to points in time that have the same ECG signal information of the object among the first medical image frames.
- the apparatus 100 displays the panoramic image that is generated in operation S120 on the display 1400.
- FIG. 4 is a flowchart of a method of synthesizing medical images to generate a plurality of panoramic images, according to an exemplary embodiment.
- Operation S200 corresponds to operation S100 of FIG. 3 , and thus, a detailed explanation thereof will not be given.
- In operation S210, the apparatus 100 selects second medical image frames corresponding to points in time that have a plurality of pieces of ECG signal information of an object from among the first medical image frames that are generated in operation S200.
- In operation S220, the apparatus 1000 generates a plurality of panoramic image frames by synthesizing image frames corresponding to points in time that have the same ECG signal information among the second medical image frames that are selected in operation S210, which will be explained below in detail with reference to FIG. 10.
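- The sketch below illustrates, under the same assumptions as the earlier sketches, how one panoramic image could be built per ECG phase bin by grouping the tagged frames and synthesizing each group separately; it reuses the hypothetical composite_frames() helper from the synthesis sketch above.

```python
import numpy as np

def panoramas_per_phase(frames, phases, offsets_px, frame_shape, n_bins=8):
    """Build one panoramic image per ECG phase bin.

    frames, phases and offsets_px are index-aligned; phases are the per-frame
    cardiac phases in [0, 1) (NaN-tagged frames are skipped).
    """
    phases = np.asarray(phases, dtype=float)
    bins = np.full(len(phases), -1, dtype=int)
    valid = ~np.isnan(phases)
    bins[valid] = np.floor(phases[valid] * n_bins).astype(int)

    panoramas = {}
    for b in range(n_bins):
        idx = [i for i in range(len(frames)) if bins[i] == b]
        if idx:  # synthesize only frames sharing this ECG phase bin
            panoramas[b] = composite_frames(
                [frames[i] for i in idx],
                [offsets_px[i] for i in idx],
                frame_shape,
            )
    return panoramas
```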
- the apparatus 1000 continuously outputs the plurality of panoramic image frames that are generated in operation S220 to the display 1400, which will be explained below in detail with reference to FIGS. 11 through 14 .
- FIG. 5 is a perspective view illustrating first medical image frames that are continuously obtained by the apparatus 1000, according to an exemplary embodiment.
- the apparatus 1000 may acquire ultrasound image data by continuously transmitting ultrasound signals to the object 10 while moving the probe 20 along a surface of the object 10 and continuously receiving echo signals reflected from the object 10.
- the apparatus 1000 may generate a plurality of first medical image frames, as shown in FIG. 5 , based on the continuously acquired ultrasound image data.
- the plurality of first medical image frames, which are a plurality of medical image frames including the image frames 200, 201, 202, 210, 211, 212, and 220, may be image frames that are continuously acquired while the probe 20 moves along a surface of the object 10 for a predetermined period of time and may reflect a body movement related to a physical activity of the object 10 which occurs during a medical procedure.
- Examples of the body movement related to the physical activity of the object may include, but are not limited to, a change in a thickness of a blood vessel and a change in a muscle type according to a heartbeat of the object 10.
- examples of the body movement related to the physical activity of the object 10 may include a contraction or expansion of the heart of the object 10 and bio-signal information of the object 10 may include ECG signal information of the object 10.
- the first medical image frames may be obtained while the ECG signal information of the object is generated.
- a conventional ultrasound diagnostic apparatus generates panoramic images by sequentially synthesizing a plurality of medical image frames that are continuously obtained in real time as shown in FIG. 5 .
- panoramic imaging, which is a process of generating an image with a field of view greater than the field of view of an independent frame that is generated from one transducer, increases the field of view of an image to be equal to or greater than the generally limited field of view of a transducer.
- a scan plane may be extended by manually moving a transducer in a direction parallel to the scan plane.
- old echo signal information of previous frames may be retained while a new echo signal is added in order to generate an image in a direction in which the scan plane moves.
- a greater field of view obtained as a result may show a large organ or a wide anatomical region on one image.
- since the new echo signal that is obtained while the transducer moves is added, it may be very important to accurately locate the new echo signal on an existing image. This is accomplished by correlating locations of echo signals common to adjacent frames so that new information on a new frame is located accurately.
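- A minimal sketch of the frame-alignment idea described above, assuming adjacent frames are equally sized 2-D NumPy arrays and the transducer moves roughly along the image's lateral (column) axis; a production implementation would typically use a more robust, sub-pixel registration method:

```python
import numpy as np

def estimate_lateral_shift(prev_frame: np.ndarray, new_frame: np.ndarray,
                           max_shift: int = 64) -> int:
    """Estimate how many columns new_frame is shifted relative to prev_frame by
    maximizing the normalized cross-correlation of the overlapping region."""
    width = prev_frame.shape[1]
    best_shift, best_score = 0, -np.inf
    for shift in range(0, min(max_shift, width - 1) + 1):
        # Overlap: rightmost columns of the previous frame vs. leftmost of the new one.
        a = prev_frame[:, shift:].ravel().astype(np.float64)
        b = new_frame[:, :width - shift].ravel().astype(np.float64)
        a -= a.mean()
        b -= b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        score = float(a @ b) / denom if denom > 0 else -np.inf
        if score > best_score:
            best_score, best_shift = score, shift
    return best_shift
```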
- since ultrasound image frames that are continuously obtained during a medical procedure reflect the physical activity of the object 10, for example, the change in the blood vessel or the change in the muscle type according to the heartbeat of the object 10 as described above, connectivity between regions of interest in panoramic images that are generated by sequentially synthesizing the continuously obtained ultrasound image frames is reduced.
- since panoramic images show wide regions of interest but do not provide a video function reflecting movements in the wide regions of interest, it may be difficult to detect the movements in the wide regions of interest as time passes.
- the apparatus 1000 may select only some from among a plurality of medical image frames that are continuously obtained in consideration of the bio-signal information corresponding to the physical activity of the object 10 and may synthesize a panoramic image, thereby outputting the panoramic image with high accuracy.
- the term "panoramic image” may refer to an image including information obtained from a wide field of view which is wider than a field of view of a single image frame.
- panoramic image may refer to an image including information obtained from a field of view which changes according to the movement of the probe 20 along a single direction of an object, along multiple directions of the object, at a single angle, at multiple angles, 2-D images, 3-D images, etc.
- the apparatus 1000 may generate a panoramic image by selecting and synthesizing only medical image frames corresponding to points in time that have the same ECG signal information from among a plurality of medical image frames that are continuously obtained.
- this correlation may indicate that the medical image frames correspond to points in time that have the same contractions or expansions of the heart of the object 10.
- consistency of a state of observed tissue of the object 10 that is included in the medical image frames corresponding to the points in time that have the same contractions or expansions of the heart of the object 10 may be maintained.
- consistency of the thickness of the blood vessel or the muscle type included in the medical image frames corresponding to the points in time that have the same ECG signal information of the object 10 may be maintained.
- a panoramic image in which connectivity between regions of interest is improved may be generated by synthesizing only the medical image frames corresponding to the points in time that have the same ECG signal information of the object 10.
- the apparatus 1000 may provide a panoramic image with improved connectivity between regions of interest by generating the panoramic images by synthesizing only a plurality of second medical image frames 200, 210, and 220 corresponding to points in time that have the same ECG signal information of the object 10 among the plurality of first medical image frames of FIG. 5 .
- FIG. 6 is a graph illustrating ECG signal information of an object, according to an exemplary embodiment.
- Bio-signal information may indicate a body movement related to a physical activity of the object which occurs while first medical image frames are generated.
- the bio-signal information according to an exemplary embodiment may be determined based on the medical image that is generated by another apparatus for obtaining medical images before the apparatus 1000 according to an exemplary embodiment performs a medical procedure. This feature is possible on the assumption that the body movement of the object is similarly maintained before and after the medical procedure is performed.
- the medical image that is generated by the apparatus for obtaining medical images may be a medical image obtained by previously taking a picture of a periodic body movement of the object before the apparatus 1000 according to an exemplary embodiment generates the first medical image frames.
- the medical image that is generated by the apparatus for obtaining medical images may include at least one of a blood vessel image, a musculoskeletal image, and an ECG image.
- while the apparatus 1000 according to an exemplary embodiment generates the first medical image frames that are continuously obtained while the probe 20 moves along a surface of the object for a predetermined period of time, the bio-signal information according to an exemplary embodiment may be obtained.
- for example, while the apparatus 1000 obtains ultrasound images of the object, the medical image including the bio-signal information of the object may be obtained by the apparatus for obtaining medical images.
- the body movement related to the physical activity of the object may be a contraction or expansion of the object and may include the ECG signal information of the object.
- FIG. 6 is a graph illustrating an ECG image that is a medical image including the body movement related to a heartbeat of the object.
- the ECG image shows an ECG that is measured from an ECG signal that is received through a plurality of electrodes that are attached to the object.
- the apparatus 1000 may obtain the bio-signal information of the object based on the ECG image of FIG. 6 .
- the bio-signal information may be time information corresponding to a cycle of the body movement of the object.
- the bio-signal information may include information about a cycle of the heartbeat of the object that may be calculated in the ECG image.
- the bio-signal information may include the ECG signal information of the object.
- the controller 1700 may extract points in time that have the same ECG signal information from the received ECG signal.
- the controller 1700 may calculate at least one type of information such as a length of the ECG signal in each interval, a point of time when a voltage is the highest, or a gradient of a waveform of the ECG signal from the medical image and may extract points in time that have the same (or substantially similar) ECG signal information from the ECG image.
- the points in time that have the same ECG signal may include points in time that have the same singular points extracted from the ECG image.
- the controller 1700 may extract an R-point 400 where a voltage of a QRS group is the highest and an R-point 410 where a voltage of a QRS group of a next heartbeat is the highest in FIG. 6 as singular points.
- controller 1700 may extract a point 402 where a T-wave starts and a point 412 where a T-wave of the next heartbeat starts as singular points.
- the controller 1700 may extract at least one singular point for calculating a cardiac cycle from the received ECG signal.
- the present exemplary embodiment is not limited thereto, and various points or intervals for calculating the cardiac cycle may be extracted as singular points.
- the controller 1700 may calculate the cardiac cycle based on the extracted singular points. For example, when the R-point 400 and the R-point 410 are extracted as singular points, the controller 1700 may measure a time T1 corresponding to an RR interval between the R-point 400 and the R-point 410 and may calculate the cardiac cycle based on the measured time T1. For example, when the time T1 corresponding to the RR interval is 0.6 seconds, the cardiac cycle may be calculated to be 0.6 seconds.
- the controller 1700 may measure a time T2 corresponding to a TT interval between the point 402 where the T-wave starts and the point 412 where the T-wave of the next heartbeat starts and may calculate the cardiac cycle based on the measured time T2.
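- A minimal sketch of how singular points (R-peaks) and the cardiac cycle might be computed from a sampled ECG signal; the threshold-based peak picking, the sampling rate fs, and the refractory period are illustrative assumptions, and a real system would normally use a more robust detector such as Pan-Tompkins:

```python
import numpy as np

def detect_r_peaks(ecg: np.ndarray, fs: float, threshold_ratio: float = 0.6,
                   refractory_sec: float = 0.25) -> np.ndarray:
    """Return sample indices of R-peaks found by simple thresholding with a
    refractory period (assumes upright R-waves)."""
    threshold = threshold_ratio * float(ecg.max())
    refractory = int(refractory_sec * fs)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        is_local_max = ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]
        if ecg[i] >= threshold and is_local_max and i - last >= refractory:
            peaks.append(i)
            last = i
    return np.asarray(peaks)

def cardiac_cycle_sec(ecg: np.ndarray, fs: float) -> float:
    """Estimate the cardiac cycle T (e.g. T1) as the mean R-R interval in seconds."""
    r = detect_r_peaks(ecg, fs)
    if len(r) < 2:
        raise ValueError("at least two R-peaks are needed to measure an R-R interval")
    return float(np.mean(np.diff(r)) / fs)
```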
- FIG. 7 shows a graph and a perspective view illustrating a correlation between ECG signal information of an object and first medical image frames, according to an exemplary embodiment.
- since first medical image frames are image frames that are continuously obtained while the probe 20 moves along a surface of the object for a predetermined period of time and a physical activity of the object continuously occurs during the predetermined period of time, information related to a periodic body movement of the object may be included in the first medical image frames.
- since the image processing of generating a panoramic image performed by the apparatus 1000 is a part of a pre-process that is performed before data is stored in a memory, it may be difficult to detect the periodic body movement of the object by analyzing the plurality of first medical image frames.
- bio-signal information may be obtained from at least one medical image that is previously generated by an apparatus for obtaining medical images, instead of from the first medical image frames according to an exemplary embodiment.
- the at least one medical image that reflects the periodic body movement of the object may include at least one of a blood vessel image, a musculoskeletal image, and an ECG image.
- the apparatus 1000 may correlate the bio-signal information that is obtained based on the at least one medical image that is previously generated before ultrasound imaging with the first medical image frames that are generated in real time during the ultrasound imaging.
- although a time when the apparatus 1000 according to an exemplary embodiment generates the first medical image frames and a time when the apparatus for obtaining medical images according to an exemplary embodiment generates the at least one medical image for extracting the bio-signal information are different from each other, since consistency of the periodic body movement of the object is maintained irrespective of a time when imaging is performed, as shown in FIG. 7, the at least one medical image and the first medical image frames may be correlated with each other in relation to the periodic body movement that is the physical activity of the object. Accordingly, the periodic body movement of the object that is included in the first medical image frames may be derived from the periodic body movement of the object that is calculated from the at least one medical image.
- the apparatus 1000 may synthesize panoramic images by obtaining the plurality of first medical image frames using an imaging time of at least 2T (sec) as shown in FIG. 7.
- movements of the object at the R-points 400, 410, and 420, where a voltage is the highest in each cycle of the medical image that is the ECG image of FIG. 6, respectively correspond to movements of the object in the medical image frames 300, 310, and 320 among the first medical image frames of FIG. 5.
- a change in a thickness of a blood vessel of the object periodically occurs according to periodic contractions or expansions of the heart corresponding to a pumping action of the heart that supplies blood to all body parts.
- the time T1 that is a cardiac cycle calculated based on the R-point 400 and the R-point 410 that are extracted as singular points in the ECG image may be obtained as the bio-signal information including time information corresponding to a cycle of the body movement of the object.
- a body movement corresponding to the R-points 400, 410, and 420 where the voltage of the QRS group in the ECG image is the highest may be a contraction of the heart, and thus, blood is supplied to all body parts, thereby causing a thickness of a blood vessel included in the first medical image frames to become greater than usual.
- the thickness of the blood vessel included in the first medical image frames according to an exemplary embodiment may vary according to the cardiac cycle of the object in the ECG image according to an exemplary embodiment.
- the change in the thickness of the blood vessel that occurs according to the pumping action of the heart is periodic, like the cardiac cycle.
- a cycle of the change in the thickness of the blood vessel included in the first medical image frames may be derived by using the time T1 that is the cardiac cycle already obtained from the ECG image.
- the apparatus 1000 may select the plurality of second medical image frames 300, 310, and 320 respectively corresponding to the plurality of R-points 400, 410, and 420 in different cycles by using the bio-signal information that is obtained in the ECG image.
- the thickness of the blood vessel included in the first medical image frames may become smaller than usual.
- when the second medical image frame 302 corresponding to the point 402 where the T-wave starts in the ECG image is determined, the second medical image frames 312 and 322 in other cycles may also be determined based on the time T2 that is a cardiac cycle.
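- A minimal sketch of selecting second medical image frames at the same cardiac phase, assuming each first medical image frame carries an acquisition timestamp and that the cycle T and the time of a reference singular point (e.g., an R-point) are already known from the ECG image; all names are illustrative:

```python
def select_frames_at_phase(frame_times_sec, reference_time_sec, cycle_sec):
    """For each repetition of the reference singular point (reference time,
    reference + T, reference + 2T, ...), pick the index of the first medical
    image frame whose acquisition time is closest to that repetition."""
    selected = []
    t_end = max(frame_times_sec)
    t_target = reference_time_sec
    while t_target <= t_end:
        idx = min(range(len(frame_times_sec)),
                  key=lambda i: abs(frame_times_sec[i] - t_target))
        selected.append(idx)
        t_target += cycle_sec
    return selected

# Example: frames every 40 ms for 2 s, reference R-point at 0.1 s, cycle T1 = 0.6 s
# -> one selected frame index per cardiac cycle.
frame_times = [i * 0.04 for i in range(50)]
print(select_frames_at_phase(frame_times, 0.1, 0.6))
```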
- a periodic body movement of the object that is included in the first medical image frames that are continuously obtained by the apparatus 1000 may be derived from the bio-signal information that is obtained in the at least one medical image.
- the bio-signal information may include at least one of time information corresponding to a cycle of the body movement of the object, for example, a cardiac cycle, and state information corresponding to the body movement of the object, for example, a state of a thickness of a blood vessel.
- the apparatus 1000 since the apparatus 1000 according to an exemplary embodiment generates panoramic images by synthesizing only second medical image frames that are derived to have the same body movement among the plurality of first medical image frames, the apparatus 1000 may generate the panoramic images in consideration of the physical activity of the object, which will be explained below with reference to FIGS. 8 through 10 .
- the apparatus 1000 obtains the first medical image frames of the object as shown in FIG. 7
- the ECG image including the ECG signal information of the object may be obtained by the apparatus for obtaining medical images.
- the ECG signal information of the ECG image and the first medical image frames may be correlated with each other.
- the ECG signal information of the ECG image may be calculated based on at least one type of information such as a length of the ECG signal in each interval, a point of time when a voltage is the highest, and a gradient of a waveform.
- the apparatus 1000 may extract points in time that have the same ECG signal information from the ECG image as shown in FIG. 7 .
- the points in time that have the same ECG signal information may include points in time that have the same singular points extracted in the ECG image.
- the R-points 400, 410, and 420 where the voltage of the QRS group in the ECG image is the highest may correspond to the points in time that have the same ECG signal information.
- the second medical image frames 300, 310, and 320 respectively corresponding to the R-points 400, 410, and 420 in the ECG image may be selected from among the first medical image frames.
- the points 402, 412, and 422 where the T-wave starts in the ECG image may correspond to the points in time that have the same ECG signal information.
- the second medical image frames 302, 312, and 322 respectively corresponding to the points 402, 412, and 422 where the T-wave starts in the ECG image may be selected from among the first medical image frames.
- FIGS. 8 and 9 are views illustrating various examples where the apparatus 1000 generates panoramic images by synthesizing second medical image frames corresponding to points in time that have the same ECG signal information among first medical image frames, according to an exemplary embodiment.
- the second medical image frames 300, 310, and 320 respectively corresponding to the R-points 400, 410, and 420 in the ECG image may be selected from among the first medical image frames.
- a body movement of an object included in the second medical image frames 300, 310, and 320 includes a state where a blood vessel is relatively expanded to be wide.
- the apparatus 1000 may generate a panoramic image D1 500 by selecting and synthesizing only second medical image frames A1, B1, and C1 that correspond to points in time that have the same ECG signal information of the object and thus are derived to have the same thickness of a blood vessel in the first medical image frames.
- since the panoramic image D1 500 is obtained by selecting and synthesizing only image frames corresponding to a state where the blood vessel is expanded (e.g., a thickness of the blood vessel is d1) in a periodic body movement of the object and only the panoramic image D1 500 in the state where the blood vessel is expanded is displayed, connectivity between regions of interest may be improved.
- the apparatus 1000 may synthesize the second medical image frames A1, B1, and C1 by using a synthesis algorithm which combines the second medical image frames A1, B1, and C1 side-by-side or in some other continuous form, although the synthesizing is not limited thereto.
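- A naive sketch of the side-by-side combination mentioned above: each selected second medical image frame is pasted at an estimated lateral offset (for example, accumulated shifts from frame registration) and overlapping columns are averaged. This is only one possible synthesis algorithm, not the apparatus's actual implementation:

```python
import numpy as np

def stitch_panorama(frames, lateral_offsets):
    """Combine selected second medical image frames side-by-side into one wide
    image: frame i is pasted starting at column lateral_offsets[i] and any
    overlapping columns are averaged."""
    height = frames[0].shape[0]
    width = lateral_offsets[-1] + frames[-1].shape[1]
    acc = np.zeros((height, width), dtype=np.float64)
    weight = np.zeros((height, width), dtype=np.float64)
    for frame, x0 in zip(frames, lateral_offsets):
        acc[:, x0:x0 + frame.shape[1]] += frame
        weight[:, x0:x0 + frame.shape[1]] += 1.0
    weight[weight == 0] = 1.0  # avoid division by zero in uncovered columns
    return acc / weight
```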
- the second medical image frames 302, 312, and 322 respectively corresponding to the points 402, 412, and 422 where the T-wave starts in the ECG image may be selected from among the first medical image frames.
- a body movement of the object included in the second medical image frames 302, 312, and 322 includes a state where the blood vessel is relatively contracted to be narrow.
- the apparatus 1000 may generate a panoramic image D2 510 by selecting and synthesizing only second medical image frames A2, B2, and C2 that correspond to points in time that have the same ECG signal information of the object and thus are derived to have the same thickness of the blood vessel in the first medical image frames.
- since the panoramic image D2 510 is obtained by selecting and synthesizing only image frames corresponding to a state where the blood vessel is contracted (e.g., a thickness of the blood vessel is d2) in the periodic body movement of the object and only the panoramic image D2 510 in the state where the blood vessel is contracted is displayed, connectivity between regions of interest may be improved.
- FIG. 10 is a view illustrating an example where the apparatus 1000 generates a plurality of panoramic images by synthesizing image frames corresponding to points in time that have the same ECG signal information of an object among first medical image frames, according to an exemplary embodiment.
- the apparatus 1000 may select a plurality of second medical image frames having a plurality of pieces of ECG signal information and may generate a plurality of panoramic images.
- the panoramic images according to an exemplary embodiment may be generated by synthesizing second medical image frames corresponding to points in time that have the same ECG signal information.
- second medical image frames 304, 314, and 324 respectively corresponding to the R-points 400, 410, and 420 in the ECG image and second medical image frames 306, 316, and 326 respectively corresponding to the points 402, 412, and 422 where the T-wave starts in the ECG image may be selected from among first medical image frames.
- a panoramic image D3 520 may be generated by synthesizing only second medical image frames A3, B3, and C3 having bio-signal information corresponding to a state where a blood vessel is expanded (e.g., a thickness of the blood vessel is d3).
- a panoramic image D4 530 may be generated by synthesizing only second medical image frames A4, B4, and C4 having bio-signal information corresponding to a state where the blood vessel is contracted (e.g., a thickness of the blood vessel is d4).
- the apparatus 1000 may generate a plurality of panoramic images by synthesizing second medical image frames corresponding to points in time that have the same ECG signal information of the object from among the same first medical image frames. Accordingly, the apparatus 1000 according to an exemplary embodiment may provide a panoramic video function by continuously outputting the plurality of panoramic images corresponding to points in time of various body movements of the object.
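- A sketch of how such a panoramic video function might be assembled, reusing the hypothetical select_frames_at_phase and stitch_panorama helpers sketched earlier; the per-frame offsets are assumed to be non-decreasing because the probe moves in a single direction:

```python
def build_panoramic_video(frames, frame_times_sec, phase_times_sec, cycle_sec, offsets):
    """Generate one panoramic image per cardiac phase (e.g. an R-point and a
    T-wave onset) and return them in display order so they can be output
    repeatedly as a panoramic video."""
    panoramas = []
    for phase_t in phase_times_sec:
        idx = select_frames_at_phase(frame_times_sec, phase_t, cycle_sec)
        if not idx:
            continue
        selected = [frames[i] for i in idx]
        # Re-base the lateral offsets so the first selected frame starts at column 0.
        base = offsets[idx[0]]
        selected_offsets = [offsets[i] - base for i in idx]
        panoramas.append(stitch_panorama(selected, selected_offsets))
    return panoramas
```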
- FIGS. 11 through 14 are views illustrating various examples where the apparatus 1000 displays generated panoramic images, according to an exemplary embodiment.
- Doppler images 800 and 810 that may show a movement of blood flow in various colors and/or B-mode images 700 and 710, wherein the Doppler images 800 and 810 and the B-mode images 700 and 710 are panoramic images, may be displayed on the display 1400 of the apparatus 1000.
- the B-mode image 700 and the Doppler image 800 of FIG. 11 show the panoramic image D3 520 that is generated by synthesizing only the second medical image frames A3, B3, and C3 having bio-signal information corresponding to a state where a blood vessel is expanded (e.g., a thickness of the blood vessel is d3) in FIG. 10.
- the B-mode image 710 and the Doppler image 810 of FIG. 12 show the panoramic image D4 530 that is generated by synthesizing only the second medical image frames A4, B4, and C4 having bio-signal information corresponding to a state where the blood vessel is contracted (e.g., a thickness of the blood vessel is d4) in FIG. 10.
- images 720 and 820 of FIGS. 13A and 13B show the panoramic image D4 530 that is generated by synthesizing only second medical image frames having bio-signal information corresponding to a state where a blood vessel of a wrist is contracted.
- images 730 and 830 of FIGS. 14A and 14B show the panoramic image D4 530 that is generated by synthesizing only second medical image frames having bio-signal information corresponding to a state where a blood vessel of a back portion of a lower leg is contracted.
- FIGS. 15 through 17 are views illustrating various examples where the apparatus 1000 displays panoramic images along with ECG signal information of an object, according to an exemplary embodiment.
- FIGS. 15 through 17 are views for explaining a post-process of processing panoramic images by using ECG signal information of an object after the panoramic images are generated and stored in a memory.
- An image processing unit may correlate the ECG signal information of the object with the panoramic images and may store the ECG signal information and the panoramic images that are correlated with each other, and a display according to an exemplary embodiment may display the ECG signal information and the panoramic images together.
- the plurality of panoramic images according to an exemplary embodiment may be correlated with the ECG signal information of the object and may be stored as a video file in the memory.
- a panoramic image 840 that is generated by synthesizing only second medical image frames corresponding to points in time that have the same ECG signal information of the object may be displayed on the display along with an ECG image 900 of the object.
- a GUI related to a function of reproducing a panoramic video may also be output to the display.
- the GUI related to the function of reproducing the panoramic video may include a user interface for reproducing, editing, or re-storing the panoramic images.
- the display may display together the ECG image 900 correlated with the panoramic image 840, an interval marker 901 of the ECG image 900 corresponding to the panoramic image 840 that is currently output to the display, and a processor bar 910.
- the display may display together an ECG image 920 correlated with the panoramic image 850, an interval marker 921 of the ECG image 920 corresponding to the panoramic image 850 that is currently output to the display, and a processor bar 930.
- the display may also output a plurality of ECG images 940a, 940b, and 940c that are correlated with a panoramic image 860.
- the apparatus 1000 may reproduce, edit, or re-store an interval of a panoramic image selected by a user through a user interface related to a function of reproducing the panoramic image.
- only an interval of a panoramic image desired to be seen by the user may be output to the display based on a user input through the processor bars 910, 930, and 950.
- the apparatus 1000 may store in the memory, as a separate file, only a panoramic image that is selected based on a user input in a panoramic video file that is stored in the memory.
- the selected panoramic image may be correlated with ECG signal information corresponding to the selected panoramic image and may be stored in the memory.
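- A sketch of one possible way to keep each stored panoramic image correlated with its ECG interval and to re-store a user-selected interval as a separate file; the record layout and the pickle-based storage are illustrative assumptions, not the apparatus's actual file format:

```python
import pickle
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np

@dataclass
class PanoramicRecord:
    panorama: np.ndarray               # one synthesized panoramic image
    ecg_interval: Tuple[float, float]  # (start_sec, end_sec) of the correlated ECG segment

def save_selected_interval(records: List[PanoramicRecord],
                           start_sec: float, end_sec: float, path: str) -> None:
    """Re-store, as a separate file, only the panoramic images whose correlated
    ECG interval lies inside the interval selected by the user."""
    selected = [r for r in records
                if start_sec <= r.ecg_interval[0] and r.ecg_interval[1] <= end_sec]
    with open(path, "wb") as f:
        pickle.dump(selected, f)
```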
- when the user selects an interval of the ECG signal, a panoramic image corresponding to the interval may be output. Also, only the panoramic image corresponding to the interval that is determined to have an abnormality in an ECG signal may be re-stored in the memory based on the user's input.
- since the apparatus 1000 according to an exemplary embodiment provides panoramic images in consideration of a periodic body movement of the object, and thus may obtain more physical information than that obtained by using panoramic images that are provided as still images, the apparatus 1000 may be used for various medical tests.
- the apparatus 1000 may detect a contraction or expansion of a blood vessel or a position and the amount of a blood clot by using a panoramic video function.
- the apparatus 1000 according to an exemplary embodiment may be used to evaluate blood flow in an artery or examine an aortic aneurysm, and may use a panoramic video function to evaluate a state where a blood vessel is reconnected after blood vessel bypass surgery.
- the apparatus 1000 may provide to the user a panoramic video function that is highly intuitive and highly useful.
Description
- The exemplary embodiments relate to methods and apparatuses for synthesizing medical images in consideration of bio-signal information corresponding to a physical activity of an object, and more particularly, to methods and apparatuses for synthesizing medical images in consideration of electrocardiogram (ECG) signal information of an object.
- Ultrasound diagnostic apparatuses transmit ultrasound signals generated by transducers of a probe to an object and receive echo signals reflected from the object, thereby obtaining at least one image of an internal part of the object (e.g., soft tissues or blood flow). In particular, ultrasound diagnostic apparatuses are used for medical purposes including observation of the interior of an object, detection of foreign substances, and diagnosis of damage to the object. Such ultrasound diagnostic apparatuses provide high stability, display images in real time, and are safe due to the lack of radioactive exposure, compared to X-ray apparatuses. Therefore, ultrasound diagnostic apparatuses are widely used together with other image diagnostic apparatuses including a computed tomography (CT) apparatus, a magnetic resonance imaging (MRI) apparatus, and the like.
- An ultrasound system provides panoramic images based on ultrasound images that are continuously obtained by moving an ultrasonic probe along a surface of an object. That is, the ultrasound system continuously obtains ultrasound images by moving the ultrasonic probe along the surface of the object and forms panoramic images by synthesizing the obtained ultrasound images.
- Here, reference is made to US 2009198134 A1, which is considered to constitute the closest prior art and which discloses a method and device using selected image data and positioning and overlapped area of 3D image data acquired from the different echo windows in the same time phase of a signal synchronizing with the working of a heart, and synthesizing (combining) the resultant data, thereby forming combined panorama 4D image data consisting of panorama three-dimensional image data, which are continued in time and have a display area larger than the three-dimensional image data.
- Additional publications that are considered to constitute relevant prior art are DE 19541987 A1; US 2005107688 A1; EP 2037413 A2; DE 102011075287 A1; JP 2009022459 A; US 2010022877 A1; JPH 0975339 A; and US 2005/222506 A1.
- According to an aspect of the invention, there is provided a method exhibiting features of the appended independent method claim.
- Alternative or additional exemplary embodiments of such a method may exhibit features specified in dependent method claims.
- According to another aspect of the invention, there is provided an apparatus exhibiting features of the appended independent apparatus claim.
- Alternative or additional exemplary embodiments of an apparatus according to the present invention may exhibit features of appended dependent apparatus claims.
- These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings in which:
- FIG. 1 is a block diagram illustrating a configuration of an apparatus for synthesizing images, according to an exemplary embodiment;
- FIG. 2 is a block diagram illustrating a configuration of an apparatus for synthesizing images, according to an exemplary embodiment;
- FIG. 3 is a flowchart of a method of synthesizing medical images to generate a panoramic image, according to an exemplary embodiment;
- FIG. 4 is a flowchart of a method of synthesizing medical images to generate a plurality of panoramic images, according to an exemplary embodiment;
- FIG. 5 is a perspective view of first medical image frames that are continuously obtained by an ultrasound diagnostic apparatus, according to an exemplary embodiment;
- FIG. 6 is a graph illustrating electrocardiogram (ECG) signal information of an object, according to an exemplary embodiment;
- FIG. 7 shows a graph and a perspective view illustrating a correlation between ECG signal information of an object and first medical image frames, according to an exemplary embodiment;
- FIGS. 8 and 9 are views illustrating various examples where the ultrasound diagnostic apparatus generates panoramic images by selecting and synthesizing second medical image frames corresponding to points in time that have the same ECG signal information of an object among first medical image frames, according to an exemplary embodiment;
- FIG. 10 is a view illustrating an example where the ultrasound diagnostic apparatus generates a plurality of panoramic images by synthesizing image frames corresponding to points in time that have the same ECG signal information of an object among first medical image frames, according to an exemplary embodiment;
- FIGS. 11, 12, 13A, 13B, 14A and 14B are views illustrating various examples where the ultrasound diagnostic apparatus displays generated panoramic images, according to an exemplary embodiment; and
- FIGS. 15, 16 and 17 are views illustrating various examples where the ultrasound diagnostic apparatus displays panoramic images along with ECG signal information of an object, according to an exemplary embodiment.
- The terms used herein are those general terms currently widely used in the art in consideration of functions regarding the disclosure, but the terms may vary according to the intention of one of ordinary skill in the art, precedents, or new technology in the art. Also, some terms may be arbitrarily selected by the applicant, and in this case, the meaning of the selected terms will be described in detail in the detailed description of the present specification. Thus, the terms used herein should be defined based on the meaning of the terms together with the description throughout the specification.
- Throughout the specification, it will also be understood that when a component "includes" an element, unless there is another opposite description thereto, it should be understood that the component does not exclude another element and may further include another element. In addition, terms such as "... unit", "... module", or the like refer to units that perform at least one function or operation, and the units may be implemented as hardware or software or as a combination of hardware and software.
- As used herein, the term "and/or" includes any and all combinations of one or more of the correlated listed items. Expressions such as "at least one of," when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
- The term "image" used herein may refer to multi-dimensional data including discrete image elements (e.g., pixels for two-dimensional (2D) images and voxels for three-dimensional (3D) images). For example, the image may be, but is not limited to being, a medical image (e.g., an ultrasound image, a computed tomography (CT) image, or a magnetic resonance (MR) image) of an object that is obtained by an ultrasound apparatus, a CT apparatus, or a magnetic resonance imaging (MRI) apparatus.
- The ultrasound image may refer to an image obtained by emitting an ultrasound signal, which is generated from a transducer of a probe, to the object and receiving information of an echo signal reflected from the object. Also, the ultrasound image may be formed in various ways. For example, the ultrasound image may be at least one of an amplitude (A)-mode image, a brightness (B)-mode image, a color (C)-mode image, and a Doppler (D)-mode image.
- The CT image may refer to an image obtained by synthesizing a plurality of X-ray images that are obtained by taking a picture of the object through rotation about at least one axis of the object.
- The MR image may refer to an image of the object that is obtained by using nuclear magnetic resonance (NMR).
- Furthermore, the term "object" may refer to a human, an animal, or a part of a human or animal. For example, the object may be an organ (e.g., the liver, the heart, the womb, the brain, a breast, or the abdomen), a blood vessel, or a combination thereof. Also, the object may be a phantom. The phantom refers to a material having a density, an effective atomic number, and a volume that are approximately the same as those of an organism. For example, the phantom may refer to a spherical phantom having properties similar to a human body.
- Throughout the specification, the term "user" may refer to, but is not limited to referring to, a medical expert, for example, a medical doctor, a nurse, a medical laboratory technologist, or a medical imaging expert, or a technician who repairs medical apparatuses.
- Hereinafter, exemplary embodiments will be described in detail with reference to the accompanying drawings.
- FIG. 1 is a block diagram illustrating a configuration of an apparatus 100 for synthesizing medical images, according to an exemplary embodiment. The apparatus 100 may include a data acquirer 110, an image processor 120, and a display 130.
- The data acquirer 110 may acquire image data of an object. For example, the data acquirer 110 may transmit an ultrasound signal to the object and may receive an echo signal reflected from the object. The data acquirer 110 may process the received echo signal and may generate ultrasound image data of the object. - Alternatively, the
data acquirer 110 may transmit a radio frequency (RF) signal to the object and may receive an MR signal that is emitted from the object. Thedata acquirer 110 may process the received MR signal and may generate MR image data of the object. - Alternatively, the
data acquirer 110 may transmit X-rays to the object and may detect an X-ray signal transmitted through the object. Thedata acquirer 110 may process the detected X-ray signal and may generate CT image data of the object. - Alternatively, the
data acquirer 110 may receive image data that is generated by an ultrasound diagnostic apparatus, an MR apparatus, or a CT apparatus that is located outside theapparatus 100, without receiving an ultrasound signal, an MR signal, or an X-ray signal from the object and directly generating image data of the object. - The
image processor 120 may generate a plurality of first medical image frames based on the image data that is received from thedata acquirer 110. For example, the plurality of first medical image frames may be a plurality of image frames that are temporally adjacent to one another. - Also, the
image processor 120 may generate a panoramic image based on the plurality of first medical image frames. Theimage processor 120 may generate the panoramic image by synthesizing second medical image frames that are selected based on bio-signal information of the object from among the first medical image frames. The bio-signal information of the object may include information related to a body movement corresponding to a physical activity of the object for a predetermined period of time for which the first medical image frames are generated. Also, the bio-signal information may be obtained from at least one medical image obtained by taking a picture of the body movement of the object. In this case, the at least one medical image may include, but is not limited to including, at least one of a blood vessel image, a musculoskeletal image, and an electrocardiogram (ECG) image. - The
image processor 120 according to an exemplary embodiment may select second medical image frames corresponding to points in time that have the same ECG signal information of the object from among the plurality of first medical image frames and may generate a panoramic image by synthesizing only the second medical image frames. - The
display 130 according to an exemplary embodiment may output the panoramic image that is generated by theimage processor 120. - The
display 130 according to an exemplary embodiment may output and display various pieces of information processed by theapparatus 100 as well as the panoramic image through a graphical user interface (GUI) onto a screen. Theapparatus 100 may include two ormore displays 130 according to a type of theapparatus 100. -
FIG. 2 is a block diagram illustrating a configuration of an apparatus 1000 for synthesizing medical images, according to an exemplary embodiment. Referring to FIG. 2, the apparatus 1000 may be an ultrasound diagnostic apparatus. The apparatus 1000 may include a probe 20, an ultrasound transceiver 1100, an image processor 1200, a communication module 1300 (e.g., communicator), a display 1400, a memory 1500, an input device 1600, and a controller 1700, which may be connected to one another via a bus 1800.
- The data acquirer 110 of FIG. 1 may correspond to the ultrasound transceiver 1100 of FIG. 2, the image processor 120 of FIG. 1 may correspond to the image processor 1200 of FIG. 2, and the display 130 of FIG. 1 may correspond to the display 1400 of FIG. 2. - The
apparatus 1000 may be a cart-type apparatus or a portable apparatus. Examples of portable ultrasound diagnostic apparatuses may include, but are not limited to including, a picture archiving and communication system (PACS) viewer, a smartphone, a laptop computer, a personal digital assistant (PDA), and a tablet PC. - The
probe 20 transmits ultrasound waves to anobject 10 in response to a driving signal applied by theultrasound transceiver 1100 and receives echo signals reflected from theobject 10. Theprobe 20 includes a plurality of transducers, and the plurality of transducers oscillate in response to electrical signals and generate acoustic energy, that is, ultrasound waves. Furthermore, theprobe 20 may be connected to a main body of theapparatus 1000 by wire or wirelessly, and according to exemplary embodiments, theapparatus 1000 may include a plurality of theprobes 20. - The
probe 20 according to an exemplary embodiment may continuously transmit ultrasound signals to theobject 10 while moving along a surface of theobject 10 and may continuously receive echo signals reflected from theobject 10. - A
transmitter 1110 applies a driving signal to theprobe 20. Thetransmitter 1110 includes apulse generator 1112, a transmission delaying unit 1114 (e.g., transmission delayer), and apulser 1116. Thepulse generator 1112 generates pulses for forming transmission ultrasound waves based on a predetermined pulse repetition frequency (PRF), and thetransmission delaying unit 1114 delays the pulses by delay times used for determining transmission directionality. The pulses which have been delayed correspond to a plurality of piezoelectric vibrators included in theprobe 20, respectively. Thepulser 1116 applies a driving signal (or a driving pulse) to theprobe 20 based on timing corresponding to each of the pulses which have been delayed. - A
receiver 1120 generates ultrasound data by processing echo signals received from theprobe 20. Thereceiver 1120 may include anamplifier 1122, an analog-to-digital converter (ADC) 1124, a reception delaying unit 1126 (e.g., reception delayer), and a summing unit 1128 (e.g., summer). Theamplifier 1122 amplifies echo signals in each channel, and theADC 1124 performs analog-to-digital conversion with respect to the amplified echo signals. Thereception delaying unit 1126 delays digital echo signals output by theADC 1124 by delay times used for determining reception directionality, and the summingunit 1128 generates ultrasound data by summing the echo signals processed by the reception delaying unit 1166. In some exemplary embodiments, thereceiver 1120 may not include theamplifier 1122. In other words, if the sensitivity of theprobe 20 or the capability of theADC 1124 to process bits is enhanced, theamplifier 1122 may be omitted. - The
ultrasound transceiver 1100 according to an exemplary embodiment may generate ultrasound image data by continuously transmitting ultrasound signals to theobject 10 and continuously receiving response signals to the transmitted ultrasound signals. Theimage processor 1200 generates an ultrasound image by scan-converting ultrasound data generated by theultrasound transceiver 1100 and displays the ultrasound image. The ultrasound image may be not only a grayscale ultrasound image obtained by scanning theobject 10 in an amplitude (A)-mode, a brightness (B)-mode, and a motion (M)-mode, but also a Doppler image showing a movement of theobject 10 via a Doppler effect. The Doppler image may be a blood flow Doppler image showing the flow of blood (also referred to as a color Doppler image), a tissue Doppler image showing a movement of tissue, or a spectral Doppler image showing a moving speed of theobject 10 as a waveform. - A B-
mode processor 1212 extracts B-mode components from the ultrasound data and processes the B-mode components. Animage generator 1220 may generate an ultrasound image indicating signal intensities as brightness based on the extracted B-mode components. - Similarly, a
Doppler processor 1214 may extract Doppler components from the ultrasound data, and theimage generator 1220 may generate a Doppler image indicating a movement of theobject 10 as colors or waveforms based on the extracted Doppler components. - According to an exemplary embodiment, the
image generator 1220 may generate a 3D ultrasound image via volume-rendering with respect to volume data and may also generate an elasticity image by imaging deformation of theobject 10 due to pressure. Furthermore, theimage generator 1220 may display various pieces of additional information in an ultrasound image by using text and graphics. In addition, the generated ultrasound image may be stored in thememory 1500. - The
image processor 1200 according to an exemplary embodiment may generate a plurality of first medical image frames based on the ultrasound image data that is received from theultrasound transceiver 1100. - For example, the plurality of first medical image frames may be a plurality of image frames that are temporally adjacent to one another.
- Also, the
image processor 1200 according to an exemplary embodiment may generate a panoramic image based on the plurality of first medical image frames. - In particular, the
image processor 1200 according to an exemplary embodiment may generate the panoramic image by synthesizing second medical image frames that are selected based on bio-signal information of theobject 10 from among the first medical image frames. For example, the bio-signal information of theobject 10 may include information related to a body movement corresponding to a physical activity of theobject 10 during a predetermined period of time for which the first medical image frames are generated. According to exemplary embodiments, the process of synthesizing the second medical image frames may be performed according to many different synthesizing techniques, as would be appreciated by one of ordinary skill in the art. For example, the process of synthesizing according to an exemplary embodiment may employ various types of synthesis algorithms, although is not limited thereto. - In this case, the bio-signal information may be obtained from at least one medical image that is obtained by taking a picture of the body movement of the
object 10. The at least one medical image according to an exemplary embodiment includes a medical image that is different from the first medical image frames. - For example, the bio-signal information of the
object 10 may be directly obtained by thecontroller 1700 based on at least one medical image received from an apparatus for obtaining medical images. - Also, the bio-signal information may be directly obtained from the apparatus for obtaining medical images and may be received through the
communication unit 1300. - For example, an apparatus for obtaining images (not shown) may be an apparatus that obtains medical images of the
object 10. Examples of the apparatus for obtaining images (not shown) according to an exemplary embodiment may include, but are not limited to including, a CT apparatus, an MRI apparatus, an angiography apparatus, and an ultrasound apparatus. - Also, the apparatus for obtaining images may include a plurality of apparatuses for obtaining images, and may include different types of apparatuses for obtaining images using different image obtaining methods or the same type of apparatuses for obtaining images using the same image obtaining method.
- For example, the at least one medical image may include at least one of a blood vessel image, a musculoskeletal image, and an ECG image, although is not limited thereto as long as the medical image is a medical image obtained by taking a picture of the body movement of the
object 10. - The
communication unit 1300 according to an exemplary embodiment may receive the at least one medical image obtained by taking a picture of the body movement of theobject 10 from the apparatus for obtaining medical images through a network. Also, thecommunication unit 1300 according to an exemplary embodiment may directly obtain the bio-signal information from the apparatus for obtaining images. - For example, the
image processor 1200 may select second medical image frames corresponding to points in time that have the same ECG signal information of theobject 10 from the first medical image frames and may generate panoramic images by synthesizing only the second medical image frames. - The
display 1400 displays the generated ultrasound image. Thedisplay 1400 may display not only the ultrasound image, but also various pieces of information processed by theapparatus 1000 on a screen through a GUI. In addition, theapparatus 1000 may include two ormore displays 1400 according to exemplary embodiments. - The
display 1400 according to an exemplary embodiment may output the panoramic images that are generated by theimage processor 1200. - The
communication unit 1300 is connected to anetwork 30 by wire or wirelessly to communicate with an external device or a server. Thecommunication unit 1300 may exchange data with a hospital server or another medical apparatus in a hospital, which is connected thereto via a PACS. Furthermore, thecommunication unit 1300 may perform data communication according to the digital imaging and communications in medicine (DICOM) standard. - The
communication unit 1300 may transmit or receive data related to diagnosis of theobject 10, e.g., an ultrasound image, ultrasound data, and Doppler data of theobject 10, via thenetwork 30 and may also transmit or receive medical images captured by another medical apparatus, e.g., a CT apparatus, an MRI apparatus, or an X-ray apparatus. Furthermore, thecommunication unit 1300 may receive information about a diagnosis history or a medical treatment schedule of a patient from a server and utilizes the received information to diagnose the patient. Furthermore, thecommunication unit 1300 may perform data communication not only with a server or a medical apparatus in a hospital, but also with a portable terminal of a medical doctor or the patient. - The
communication unit 1300 is connected to thenetwork 30 by wire or wirelessly to exchange data with aserver 32, amedical apparatus 34, or aportable terminal 36. Thecommunication unit 1300 may include one or more components for communication with external devices. For example, thecommunication unit 1300 may include a local area communication module 1310 (e.g., local area communicator), a wired communication module 1320 (e.g., wired communicator), and a mobile communication module 1330 (e.g., mobile communicator). - The local
area communication module 1310 refers to a module for local area communication within a predetermined distance. Examples of local area communication techniques according to an exemplary embodiment may include, but are not limited to including, wireless LAN, Wi-Fi, Bluetooth, ZigBee, Wi-Fi Direct (WFD), ultra wideband (UWB), infrared data association (IrDA), Bluetooth low energy (BLE), and near field communication (NFC). - The
wired communication module 1320 refers to a module for communication using electrical signals or optical signals. Examples of wired communication techniques according to an exemplary embodiment may include communication via a twisted pair cable, a coaxial cable, an optical fiber cable, and an Ethernet cable. - The
mobile communication module 1330 transmits or receives wireless signals to or from at least one selected from a base station, an external terminal, and a server on a mobile communication network. The wireless signals may be voice call signals, video call signals, or various types of data for transmission and reception of text/multimedia messages. - The
memory 1500 stores various types of data processed by theapparatus 1000. For example, thememory 1500 may store medical data related to diagnosis of theobject 10, such as ultrasound data and an ultrasound image that are input or output, and may also store algorithms or programs which are to be executed in theapparatus 1000. - The
memory 1500 may be any of various types of storage media, e.g., a flash memory, a hard disk drive, EEPROM, etc. Furthermore, theapparatus 1000 may utilize web storage or a cloud server that performs the storage function of thememory 1500 online. - The
input device 1600 refers to a unit via which a user inputs data for controlling theapparatus 1000. Theinput device 1600 may include hardware components, such as a keypad, a mouse, a touch pad, a touch screen, and a jog switch. However, exemplary embodiments are not limited thereto, and theinput device 1600 may further include any of various other types of input units including an ECG measuring module, a respiration measuring module, a voice recognition sensor, a gesture recognition sensor, a fingerprint recognition sensor, an iris recognition sensor, a depth sensor, a distance sensor, or any other type of sensor known to those skilled in the art. - The
controller 1700 may control all operations of theapparatus 1000. In other words, thecontroller 1700 may control operations among theprobe 20, theultrasound transceiver 1100, theimage processor 1200, thecommunication unit 1300, thedisplay 1400, thememory 1500, and theinput device 1600 shown inFIG. 1 . - All or some of the
probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication unit 1300, the display 1400, the memory 1500, the input device 1600, and the controller 1700 may be implemented as software modules. Also, at least one of the ultrasound transceiver 1100, the image processor 1200, and the communication unit 1300 may be included in the controller 1700; however, the exemplary embodiments are not limited thereto. -
FIG. 3 is a flowchart of a method of synthesizing medical images to generate a panoramic image, according to an exemplary embodiment. - In operation S100, the
apparatus 100 receives a plurality of first medical image frames. - The
apparatus 100 may acquire image data of an object and may generate the plurality of first image frames based on the acquired image data. - For example, the
apparatus 100 may transmit an ultrasound signal to the object and may generate the first image frames based on ultrasound image data that is acquired by receiving a response signal to the transmitted ultrasound signal. - Alternatively, the
apparatus 100 may transmit an RF signal to the object and may receive an MR signal that is emitted from the object. Theapparatus 100 may acquire MR image data of the object by processing the received MR signal and may generate the first image frames based on the acquired MR image data. - Alternatively, the
apparatus 1000 may transmit X-rays to the object and may detect an X-ray signal transmitted through the object. The apparatus 1000 may acquire CT image data of the object by processing the detected X-ray signal and may generate the first medical image frames based on the acquired CT image data. Also, the plurality of first medical image frames, which are continuously acquired while the probe 20 moves along a surface of the object for a predetermined period of time, may reflect a body movement related to a physical activity of the object that occurs during the medical procedure. For example, the plurality of first medical image frames that are temporally adjacent to one another may be obtained while ECG signal information of the object is obtained.
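As a concrete illustration of acquiring image frames while an ECG is recorded over the same period, the following minimal Python sketch pairs each frame with the ECG sample nearest its acquisition time. The data layout (timestamped frames, a uniformly sampled ECG) and the helper names are assumptions for illustration, not the data structures of the apparatus 1000 itself.

```python
# Sketch only: pairing continuously acquired frames with concurrent ECG samples.
from dataclasses import dataclass
import numpy as np

@dataclass
class FrameRecord:
    t: float                # acquisition time of the frame, in seconds
    pixels: np.ndarray      # 2-D image frame (rows x cols)
    ecg_value: float = 0.0  # ECG sample closest in time to the frame

def tag_frames_with_ecg(frames, ecg_samples, ecg_fs):
    """Attach to each frame the ECG sample nearest its acquisition time."""
    ecg_t = np.arange(len(ecg_samples)) / ecg_fs
    for f in frames:
        idx = int(np.argmin(np.abs(ecg_t - f.t)))
        f.ecg_value = float(ecg_samples[idx])
    return frames

# Tiny demo: ten frames over 0.45 s, ECG sampled at 100 Hz for 3 s.
frames = [FrameRecord(t=0.05 * k, pixels=np.zeros((4, 4))) for k in range(10)]
tag_frames_with_ecg(frames, ecg_samples=np.sin(np.linspace(0, 3, 300)), ecg_fs=100)
```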
- In operation S110, the apparatus 1000 selects second medical image frames corresponding to points in time that have the same ECG signal information of the object from among the first medical image frames that are generated in operation S100. - An ultrasound diagnostic apparatus according to an exemplary embodiment may select the second medical image frames based on bio-signal information of the object from among the first medical image frames.
- For example, the bio-signal information may be the body movement related to the physical activity of the object that occurs while the first medical image frames are generated. Also, the bio-signal information according to an exemplary embodiment may be a value that is previously determined before the medical procedure using the
apparatus 1000 that is an ultrasound diagnostic apparatus. - For example, at least one medical image that is generated by an apparatus for acquiring medical images may be at least one medical image acquired by taking a picture of the body movement of the object before the
apparatus 1000 generates the first medical image frames. - For example, the at least one medical image generated by the apparatus for acquiring medical images may include at least one of a blood vessel image, a musculoskeletal image, and an ECG image.
- In this case, examples of the body movement related to the physical activity of the object may include, but are not limited to including, a change in a thickness of a blood vessel and a change in a muscle type according to a heartbeat of the object. The bio-signal information according to an exemplary embodiment that is time information corresponding to a cycle of the body movement of the object may be obtained based on the at least one medical image generated by the apparatus for obtaining medical images, instead of the first medical image frames according to an exemplary embodiment. Of course, it is understood that the body movement is not limited to the above examples, and may instead be many other types of body movements according to exemplary embodiments, such as other types of movements related to the circulatory system, other types of muscles, neural activities, or movements of other parts of the human body, e.g., other organs, bones, etc.
- For example, the bio-signal information may include, but is not limited to including, at least one of a cycle of the heartbeat of the object, a cycle of the change in the thickness of the blood vessel, and a cycle of the change in the muscle type that are included in the at least one medical image.
- Also, the bio-signal information according to an exemplary embodiment that is state information corresponding to the body movement of the object may be obtained based on the at least one medical image that is generated by the apparatus for obtaining medical images, instead of the first medical image frames according to an exemplary embodiment.
- For example, the bio-signal information may include, but is not limited to including, at least one of a state of the heartbeat, a state of the thickness of the blood vessel, and a state of the muscle type of the object that are included in the at least one medical image.
- The
apparatus 1000 according to an exemplary embodiment may correlate the bio-signal information that is obtained based on a medical image that is previously generated before ultrasound imaging with the first medical image frames that are generated in real time during the ultrasound imaging. - For example, a periodic body movement of the object included in the first medical image frames may be derived from the bio-signal information that is previously determined.
- When the body movement of the object included in the medical image is a periodic movement and time information corresponding to a cycle of the body movement of the object is T (sec), the
apparatus 1000 according to an exemplary embodiment has to obtain the plurality of first medical image frames by using an imaging time of at least 2 T (sec). - The bio-signal information of the object according to an exemplary embodiment may include ECG signal information of the object, and the first medical image frames may be obtained while the ECG signal information of the object is generated. In this case, the first medical image frames that are generated in real time during ultrasound imaging may be correlated with the ECG signal information of the object that is obtained in real time during the ultrasound imaging. A method of correlating the first medical image frames with the ECG signal information of the object will be explained below in detail with reference to
FIG. 7 . - The second medical image frames according to an exemplary embodiment may be a plurality of image frames having the same bio-signal information among the first medical image frames.
- In this case, the bio-signal information is information related to the body movement corresponding to the physical activity of the object. When two medical image frames have the same bio-signal information, this correlation may indicate that the body movements of the object included in the two medical image frames are the same.
- Since a movement of the object included in the first medical image frames derived from the bio-signal information is periodic, medical image frames corresponding to points in time that have the same movements of the object exist in each cycle. Accordingly, only some medical image frames whose body movements correspond to predetermined points in time may be selected as the second medical image frames from among the plurality of first medical image frames.
- Also, the second medical image frames according to another exemplary embodiment may be a plurality of image frames corresponding to a plurality of pieces of bio-signal information among the first medical image frames.
- For example, the bio-signal information of the object includes ECG signal information of the object, and only second medical image frames corresponding to points in time that have the same ECG signal information of the object may be selected from among the first medical image frames, which will be explained below in detail with reference to
FIGS. 4 and 10. - In operation S120, the
apparatus 1000 generates a panoramic image by synthesizing the second medical image frames that are selected in operation S110. - The panoramic image may be generated by synthesizing a plurality of second medical image frames having the same bio-signal information among the first medical image frames. For example, the bio-signal information may include ECG signal information, and the panoramic image may be generated by synthesizing only a plurality of second medical image frames corresponding to points in time that have the same ECG signal information of the object among the first medical image frames.
- In operation S130, the
apparatus 1000 displays the panoramic image that is generated in operation S120 on the display 1400.
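The following compact Python sketch ties operations S100 through S130 together under stated assumptions: frames are (time, image) pairs, "the same ECG signal information" is approximated by a list of ECG event times (for example, R-peak times), and synthesis is a plain side-by-side concatenation. The helper names select_frames_at and stitch are illustrative, not the apparatus's API.

```python
# Sketch of the S100-S130 flow under the assumptions stated above.
import numpy as np

def select_frames_at(frames, event_times):
    """S110: for each ECG event time, pick the frame acquired closest to it."""
    times = np.array([t for t, _ in frames])
    return [frames[int(np.argmin(np.abs(times - et)))][1] for et in event_times]

def stitch(images):
    """S120: one possible synthesis - place the selected frames side by side."""
    return np.hstack(images)

# S100: thirty dummy 64x64 frames acquired at 0.1 s intervals.
frames = [(0.1 * k, np.random.rand(64, 64)) for k in range(30)]
r_peak_times = [0.3, 1.5, 2.7]          # assumed points with the same ECG information
panorama = stitch(select_frames_at(frames, r_peak_times))   # S120
print(panorama.shape)                    # S130 would display this wider image
```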
- FIG. 4 is a flowchart of a method of synthesizing medical images to generate a plurality of panoramic images, according to an exemplary embodiment. - Operation S200 corresponds to operation S100 of
FIG. 3, and thus, a detailed explanation thereof will not be given. - In operation S210, the
apparatus 1000 selects second medical image frames corresponding to points in time that have a plurality of pieces of ECG signal information of an object from among the first medical image frames that are generated in operation S200. - In operation S220, the
apparatus 1000 generates a plurality of panoramic image frames by synthesizing image frames corresponding to points in time that have the same ECG signal information among the second medical image frames that are selected in operation S210, which will be explained below in detail with reference to FIG. 10. - In operation S230, the
apparatus 1000 continuously outputs the plurality of panoramic image frames that are generated in operation S220 to the display 1400, which will be explained below in detail with reference to FIGS. 11 through 14. -
FIG. 5 is a perspective view illustrating first medical image frames that are continuously obtained by the apparatus 1000, according to an exemplary embodiment. - In an exemplary embodiment, the
apparatus 1000 may acquire ultrasound image data by continuously transmitting ultrasound signals to the object 10 while moving the probe 20 along a surface of the object 10 and continuously receiving echo signals reflected from the object 10. In this case, the apparatus 1000 may generate a plurality of first medical image frames, as shown in FIG. 5, based on the continuously acquired ultrasound image data. - For example, the plurality of first medical image frames, which include image frames 200, 201, 202, 210, 211, 212, and 220, may be image frames that are continuously acquired while the
probe 20 moves along a surface of the object 10 for a predetermined period of time and may reflect a body movement related to a physical activity of the object 10 which occurs during a medical procedure. - Examples of the body movement related to the physical activity of the object may include, but are not limited to, a change in a thickness of a blood vessel and a change in a muscle type according to a heartbeat of the
object 10. - Also, examples of the body movement related to the physical activity of the
object 10 according to an exemplary embodiment may include a contraction or expansion of the heart of the object 10, and bio-signal information of the object 10 may include ECG signal information of the object 10. - For example, the first medical image frames may be obtained while the ECG signal information of the object is generated.
- A conventional ultrasound diagnostic apparatus generates panoramic images by sequentially synthesizing a plurality of medical image frames that are continuously obtained in real time as shown in
FIG. 5 . - In general, panoramic imaging, which is a process of generating an image with a field of view greater than a field of view of an independent frame that is generated from one transducer, increases a field of view of an image to be equal to or greater than a field of view of a transducer that is generally limited.
- For example, in panoramic imaging, a scan plane may be extended by manually moving a transducer in a direction parallel to the scan plane. At the same time, old echo signal information of previous frames may be retained while a new echo signal is added in order to generate an image in a direction in which the scan plane moves. A greater field of view obtained as a result may show a large organ or a wide anatomical region on one image. While the new echo signal that is obtained while the transducer moves is added, it may be very important to accurately locate the new echo signal on an existing image. This is accomplished by correlating locations of echo signals common to adjacent frames so that new information on a new frame is located accurately.
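As an illustration of this frame-to-frame correlation step, the following minimal Python sketch estimates the lateral offset between two adjacent frames by cross-correlating their column-intensity profiles. The profile-based correlation and the helper name estimate_lateral_shift are assumptions made for illustration; they are not the registration method actually used by the apparatus 1000.

```python
# Sketch only: estimating how far the probe moved between adjacent frames
# by correlating echo information common to both frames.
import numpy as np

def estimate_lateral_shift(prev_frame, next_frame):
    """Return the column lag at which the two lateral intensity profiles align best."""
    a = prev_frame.mean(axis=0) - prev_frame.mean()
    b = next_frame.mean(axis=0) - next_frame.mean()
    corr = np.correlate(a, b, mode="full")        # cross-correlation over all lags
    return int(np.argmax(corr)) - (len(b) - 1)    # lag of maximum similarity

# Example: next frame content sits 7 columns to the left of the previous frame.
prev = np.random.rand(64, 128)
nxt = np.roll(prev, -7, axis=1)
print(estimate_lateral_shift(prev, nxt))          # ~7 under these assumptions
```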
- However, since ultrasound image frames that are continuously obtained during a medical procedure reflect the physical activity of the
object 10, for example, the change in the blood vessel or the change in the muscle type according to the heartbeat of the object 10 as described above, connectivity between regions of interest in panoramic images that are generated by sequentially synthesizing the continuously obtained ultrasound image frames is reduced.
- In contrast, the
apparatus 1000 according to an exemplary embodiment may select only some from among a plurality of medical image frames that are continuously obtained in consideration of the bio-signal information corresponding to the physical activity of the object 10 and may synthesize a panoramic image, thereby outputting the panoramic image with high accuracy. According to an exemplary embodiment, the term "panoramic image" may refer to an image including information obtained from a wide field of view which is wider than a field of view of a single image frame. However, exemplary embodiments are not limited thereto, and the term "panoramic image" may refer to an image including information obtained from a field of view which changes according to the movement of the probe 20 along a single direction of an object, along multiple directions of the object, at a single angle, at multiple angles, 2-D images, 3-D images, etc. - For example, the
apparatus 1000 according to an exemplary embodiment may generate a panoramic image by selecting and synthesizing only medical image frames corresponding to points in time that have the same ECG signal information from among a plurality of medical image frames that are continuously obtained. - For example, when medical image frames correspond to points in time that have the same ECG signal information of the
object 10, this correlation may indicate that the medical image frames correspond to points in time that have the same contractions or expansions of the heart of theobject 10. - In this case, consistency of a state of observed tissue of the
object 10 that is included in the medical image frames corresponding to the points in time that have the same contractions or expansions of the heart of theobject 10 may be maintained. For example, consistency of the thickness of the blood vessel or the muscle type included in the medical image frames corresponding to the points in time that have the same ECG signal information of theobject 10 may be maintained. - Accordingly, a panoramic image in which connectivity between regions of interest is improved may be generated by synthesizing only the medical image frames corresponding to the points in time that have the same ECG signal information of the
object 10. - That is, the
apparatus 1000 according to an exemplary embodiment may provide a panoramic image with improved connectivity between regions of interest by generating the panoramic images by synthesizing only a plurality of second medical image frames 200, 210, and 220 corresponding to points in time that have the same ECG signal information of theobject 10 among the plurality of first medical image frames ofFIG. 5 . -
FIG. 6 is a graph illustrating ECG signal information of an object, according to an exemplary embodiment. Bio-signal information according to an exemplary embodiment may indicate a body movement related to a physical activity of the object which occurs while first medical image frames are generated. - The bio-signal information according to an exemplary embodiment may be determined based on the medical image that is generated by another apparatus for obtaining medical images before the
apparatus 1000 according to an exemplary embodiment performs a medical procedure. This feature is possible on the assumption that the body movement of the object is similarly maintained before and after the medical procedure is performed. - For example, the medical image that is generated by the apparatus for obtaining medical images may be a medical image obtained by previously taking a picture of a periodic body movement of the object before the
apparatus 1000 according to an exemplary embodiment generates the first medical image frames. - For example, the medical image that is generated by the apparatus for obtaining medical images may include at least one of a blood vessel image, a musculoskeletal image, and an ECG image.
- Also, while the
apparatus 1000 according to an exemplary embodiment generates the first medical image frames that are continuously obtained while theprobe 20 moves along a surface of the object for a predetermined period of time, the bio-signal information according to an exemplary embodiment may be obtained. - That is, while the
apparatus 1000 according to an exemplary embodiment obtains ultrasound images of the object, the medical image including the bio-signal information of the object may be obtained by the apparatus for obtaining medical images. - For example, the body movement related to the physical activity of the object may be a contraction or expansion of the object and may include the ECG signal information of the object.
FIG. 6 is a graph illustrating an ECG image that is a medical image including the body movement related to a heartbeat of the object. - As shown in
FIG. 6 , the ECG image shows an ECG that is measured from an ECG signal that is received through a plurality of electrodes that are attached to the object. - The
apparatus 1000 according to an exemplary embodiment may obtain the bio-signal information of the object based on the ECG image ofFIG. 6 . - In this case, the bio-signal information according to an exemplary embodiment may be time information corresponding to a cycle of the body movement of the object. For example, the bio-signal information may include information about a cycle of the heartbeat of the object that may be calculated in the ECG image.
- Also, the bio-signal information according to an exemplary embodiment may include the ECG signal information of the object.
- Referring to
FIG. 6 , thecontroller 1700 may extract points in time that have the same ECG signal information from the received ECG signal. - In an exemplary embodiment, the
controller 1700 may calculate at least one type of information such as a length of the ECG signal in each interval, a point of time when a voltage is the highest, or a gradient of a waveform of the ECG signal from the medical image and may extract points in time that have the same (or substantially similar) ECG signal information from the ECG image. For example, the points in time that have the same ECG signal according to an exemplary embodiment may include points in time that have the same singular points extracted from the ECG image. - For example, the
controller 1700 according to an exemplary embodiment may extract an R-point 400 where a voltage of a QRS group is the highest and an R-point 410 where a voltage of a QRS group of a next heartbeat is the highest in FIG. 6 as singular points. - Also, the
controller 1700 according to another exemplary embodiment may extract a point 402 where a T-wave starts and a point 412 where a T-wave of the next heartbeat starts as singular points. - Also, referring to
FIG. 6 , thecontroller 1700 may extract at least one singular point for calculating a cardiac cycle from the received ECG signal. - However, the present exemplary embodiment is not limited thereto, and various points or intervals for calculating the cardiac cycle may be extracted as singular points. The
controller 1700 according to an exemplary embodiment may calculate the cardiac cycle based on the extracted singular points. For example, when the R-point 400 and the R-point 410 are extracted as singular points, thecontroller 1700 may measure a time T1 corresponding to an RR interval between the R-point 400 and the R-point 410 and may calculate the cardiac cycle based on the measured time T1. For example, when the time T1 corresponding to the RR interval is 0.6 seconds, the cardiac cycle may be calculated to be 0.6 seconds. - Also, when the
point 402 where the T-wave starts and the point 412 where the T-wave of the next heartbeat starts are extracted as singular points instead of the R-points, the controller 1700 may measure a time T2 corresponding to a TT interval between the point 402 and the point 412 and may calculate the cardiac cycle based on the measured time T2.
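The sketch below illustrates this calculation under stated assumptions: R-points are found with a simple threshold-based peak picker (an assumption, not the controller 1700's extraction method), and the cardiac cycle is taken as the mean RR interval, matching the 0.6-second example above.

```python
# Illustrative sketch: extracting R-points as singular points and deriving
# the cardiac cycle T1 from the RR interval.
import numpy as np

def extract_r_points(ecg, fs, threshold_ratio=0.6, refractory_s=0.25):
    """Return sample indices of R-peaks: local maxima above a relative threshold."""
    thr = threshold_ratio * np.max(ecg)
    peaks, last = [], -int(refractory_s * fs)
    for i in range(1, len(ecg) - 1):
        if ecg[i] > thr and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if i - last >= int(refractory_s * fs):
                peaks.append(i)
                last = i
    return np.array(peaks)

def cardiac_cycle_seconds(r_points, fs):
    """Cardiac cycle T1 = mean RR interval, in seconds."""
    return float(np.mean(np.diff(r_points)) / fs)

# Synthetic ECG: a spike every 0.6 s sampled at 500 Hz, so T1 should be ~0.6 s.
fs = 500
ecg = np.zeros(5 * fs)
ecg[np.arange(0, len(ecg), int(0.6 * fs))] = 1.0
print(cardiac_cycle_seconds(extract_r_points(ecg, fs), fs))   # ~0.6
```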
- FIG. 7 shows a graph and a perspective view illustrating a correlation between ECG signal information of an object and first medical image frames, according to an exemplary embodiment. - Since a plurality of first medical image frames according to an exemplary embodiment are image frames that are continuously obtained while the
probe 20 moves along a surface of the object for a predetermined period of time and a physical activity of the object continuously occurs during the predetermined period of time, information related to a periodic body movement of the object may be included in the first medical image frames. - However, since image processing of generating a panoramic image performed by the
apparatus 1000 according to an exemplary embodiment is a part of a pre-process that is performed before data is stored in a memory, it may be difficult to detect the periodic body movement of the object by analyzing the plurality of first medical image frames. - Accordingly, on the assumption that consistency of the periodic body movement of the object is maintained before or after an ultrasound medical procedure, bio-signal information according to an exemplary embodiment may be obtained from at least one medical image that is previously generated by an apparatus for obtaining medical images, instead of from the first medical image frames according to an exemplary embodiment. For example, the at least one medical image that reflects the periodic body movement of the object may include at least one of a blood vessel image, a musculosketal image, and an ECG image.
- The
apparatus 1000 according to an exemplary embodiment may correlate the bio-signal information that is obtained based on the at least one medical image that is previously generated before ultrasound imaging with the first medical image frames that are generated in real time during the ultrasound imaging. - For example, although a time when the
apparatus 1000 according to an exemplary embodiment generates the first medical image frames and a time when the apparatus for obtaining medical images according to an exemplary embodiment generates the at least one medical image for extracting the bio-signal information are different from each other, since consistency of the periodic body movement of the object is maintained irrespective of a time when imaging is performed, as shown inFIG. 7 , the at least one medical image and the first medical image frames may be correlated with each other in relation to the periodic body movement that is the physical activity of the object. Accordingly, the periodic body movement of the object that is included in the first medical image frames may be derived from the periodic body movement of the object that is calculated from the at least one medical image. - As shown in
FIG. 6 , when the body movement of the object that is included in the at least one medical image is periodic and, in this case, time information corresponding to a cycle of the body movement of the object is T (sec), theapparatus 1000 according to an exemplary embodiment may synthesize panoramic images by obtaining the plurality of first medical image frames using an imaging time of at least 2 T (sec) as shown inFIG. 7 . - As shown in
FIG. 7, it may be derived that the movement of the object at the R-points of FIG. 6 respectively corresponds to the movement of the object in the medical image frames 300, 310, and 320 of the first medical image frames of FIG. 5. -
- As shown in
FIG. 7 , the time T1 that is a cardiac cycle calculated based on the R-point 400 and the R-point 410 that are extracted as singular points in the ECG image may be obtained as the bio-signal information including time information corresponding to a cycle of the body movement of the object. - In this case, a body movement corresponding to the R-
points - That is, the thickness of the blood vessel included in the first medical image frames according to an exemplary embodiment may vary according to the cardiac cycle of the object in the ECG image according to an exemplary embodiment. Also, the change in the thickness of the blood vessel that occurs according to the pumping action of the heart is periodic, like the cardiac cycle. A cycle of a change in the thickness of the blood vessel including the first medical image frames may be derived by using the time T1 that the cardiac cycle already obtained in the ECG image.
- For example, when the second
medical image frame 300 corresponding to the R-point 400, where the voltage of the QRS group in the ECG image is the highest, is determined, the second medical image frames 310 and 320 in other cycles may also be determined based on the time T1 that is the cardiac cycle. Accordingly, the apparatus 1000 according to an exemplary embodiment may select the plurality of second medical image frames 300, 310, and 320 respectively corresponding to the plurality of R-points in the ECG image.
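The sketch below illustrates this selection under stated assumptions: given the time t0 of the frame matched to one R-point and the cardiac cycle T1 obtained from the ECG image, frames nearest to t0 + k·T1 are taken as the second medical image frames (300, 310, 320 in the figures). The timestamp layout and nearest-frame rule are assumptions for illustration.

```python
# Sketch only: deriving the second medical image frames in other cycles from
# one reference frame and the period T1 of the body movement.
import numpy as np

def second_frame_indices(frame_times, t0, period, tolerance=None):
    """Indices of frames nearest to t0, t0+T1, t0+2*T1, ... within the sweep."""
    frame_times = np.asarray(frame_times)
    if frame_times[-1] - frame_times[0] < 2 * period:
        raise ValueError("sweep must cover at least 2T of the periodic movement")
    if tolerance is None:
        tolerance = float(np.median(np.diff(frame_times)))  # about one frame interval
    indices, t = [], t0
    while t <= frame_times[-1]:
        i = int(np.argmin(np.abs(frame_times - t)))
        if abs(frame_times[i] - t) <= tolerance:
            indices.append(i)
        t += period
    return indices

# 3 s sweep at 20 frames/s, cardiac cycle T1 = 0.6 s, reference frame at 0.25 s.
times = np.arange(0, 3.0, 0.05)
print(second_frame_indices(times, t0=0.25, period=0.6))  # frames one cycle apart
```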
- In contrast, when a body movement corresponding to the points in time where the T-wave starts is to be observed, once the second medical image frame 302 corresponding to the point 402 where the T-wave starts in the ECG image is determined, the second medical image frames 312 and 322 in other cycles may also be determined based on the time T2 that is the cardiac cycle. - As described above, a periodic body movement of the object that is included in the first medical image frames that are continuously obtained by the
apparatus 1000 according to an exemplary embodiment may be derived from the bio-signal information that is obtained in the at least one medical image. In this case, the bio-signal information may include at least one of time information corresponding to a cycle of the body movement of the object, for example, a cardiac cycle, and state information corresponding to the body movement of the object, for example, a state of a thickness of a blood vessel. - In this case, since the
apparatus 1000 according to an exemplary embodiment generates panoramic images by synthesizing only second medical image frames that are derived to have the same body movement among the plurality of first medical image frames, theapparatus 1000 may generate the panoramic images in consideration of the physical activity of the object, which will be explained below with reference toFIGS. 8 through 10 . - While the
apparatus 1000 according to an exemplary embodiment obtains the first medical image frames of the object as shown inFIG. 7 , the ECG image including the ECG signal information of the object may be obtained by the apparatus for obtaining medical images. - In this case, as shown in
FIG. 7 , the ECG signal information of the ECG image and the first medical image frames may be correlated with each other. - For example, the ECG signal information of the ECG image may be calculated based on at least one type of information such as a length of the ECG signal in each interval, a point of time when a voltage is the highest, and a gradient of a waveform.
- Also, the
apparatus 1000 according to an exemplary embodiment may extract points in time that have the same ECG signal information from the ECG image as shown inFIG. 7 . - For example, the points in time that have the same ECG signal information according to an exemplary embodiment may include points in time that have the same singular points extracted in the ECG image.
- As shown in
FIG. 7, the R-points extracted from the ECG image may be used as the points in time that have the same ECG signal information. - Alternatively, the points where the T-wave starts may be used as the points in time that have the same ECG signal information. -
FIGS. 8 and 9 are views illustrating various examples where the apparatus 1000 generates panoramic images by synthesizing second medical image frames corresponding to points in time that have the same ECG signal information among first medical image frames, according to an exemplary embodiment. - As shown in
FIG. 7, the second medical image frames 300, 310, and 320 respectively corresponding to the R-points may be selected from among the first medical image frames. - The
apparatus 1000 according to an exemplary embodiment may generate a panoramic image D1 500 by selecting and synthesizing only second medical image frames A1, B1, and C1 that correspond to points in time that have the same ECG signal information of the object and thus are derived to have the same thickness of a blood vessel in the first medical image frames. In this case, as shown in FIG. 8, since the panoramic image D1 500 is obtained by selecting and synthesizing only image frames corresponding to a state where the blood vessel is expanded (e.g., a thickness of the blood vessel is d1) in a periodic body movement of the object and only the panoramic image D1 500 in the state where the blood vessel is expanded is displayed, connectivity between regions of interest may be improved. For example, the apparatus 1000 may synthesize the second medical image frames A1, B1, and C1 by using a synthesis algorithm which combines the second medical image frames A1, B1, and C1 side-by-side or in some other continuous form, although it is not limited thereto.
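A minimal Python sketch of one such combination follows: each phase-matched frame is pasted onto a wide canvas at a lateral offset (for example, an offset estimated by correlating overlapping echo data) and overlapping columns are averaged. The offsets and the blending rule are assumptions for illustration, not the apparatus's synthesis algorithm.

```python
# Sketch only: composing A1, B1, C1 into a D1-like panoramic image.
import numpy as np

def compose_panorama(frames, col_offsets):
    """Paste equally-sized frames onto one canvas at the given column offsets."""
    rows, cols = frames[0].shape
    width = max(col_offsets) + cols
    canvas = np.zeros((rows, width))
    weight = np.zeros((rows, width))
    for frame, off in zip(frames, col_offsets):
        canvas[:, off:off + cols] += frame
        weight[:, off:off + cols] += 1.0
    return canvas / np.maximum(weight, 1.0)   # average where frames overlap

a1, b1, c1 = (np.random.rand(64, 128) for _ in range(3))
d1 = compose_panorama([a1, b1, c1], col_offsets=[0, 90, 180])
print(d1.shape)   # (64, 308): a field of view wider than any single frame
```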
- Also, as shown in FIG. 7, the second medical image frames 302, 312, and 322 respectively corresponding to the points where the T-wave starts may be selected from among the first medical image frames. - The
apparatus 1000 according to an exemplary embodiment may generate a panoramic image D2 510 by selecting and synthesizing only second medical image frames A2, B2, and C2 that correspond to points in time that have the same ECG signal information of the object and thus are derived to have the same thickness of the blood vessel in the first medical image frames. In this case, as shown in FIG. 9, since the panoramic image D2 510 is obtained by selecting and synthesizing only image frames corresponding to a state where the blood vessel is contracted (e.g., a thickness of the blood vessel is d2) in the periodic body movement of the object and only the panoramic image D2 510 in the state where the blood vessel is contracted is displayed, connectivity between regions of interest may be improved. -
FIG. 10 is a view illustrating an example where theapparatus 1000 generates a plurality of panoramic images by synthesizing image frames corresponding to points in time that have the same ECG signal information of an object among first medical image frames, according to an exemplary embodiment. - As shown in
FIG. 10 , theapparatus 1000 according to an exemplary embodiment may select a plurality of second medical image frames having a plurality of pieces of ECG signal information and may generate a plurality of panoramic images. In this case, the panoramic images according to an exemplary embodiment may be generated by synthesizing second medical image frames corresponding to points in time that have the same ECG signal information. - As shown in
FIG. 10, second medical image frames 304, 314, and 324 respectively corresponding to the R-points, and second medical image frames respectively corresponding to the points where the T-wave starts, may be selected from among the first medical image frames. - In this case, a
panoramic image D3 520 may be generated by synthesizing only second medical image frames A3, B3, and C3 having bio-signal information corresponding to a state where a blood vessel is expanded (e.g., a thickness of the blood vessel is d3). Also, a panoramic image D4 530 may be generated by synthesizing only second medical image frames A4, B4, and C4 having bio-signal information corresponding to a state where the blood vessel is contracted (e.g., a thickness of the blood vessel is d4). - In this case, the
apparatus 1000 according to an exemplary embodiment may generate a plurality of panoramic images by synthesizing second medical image frames corresponding to points in time that have the same ECG signal information of the object from among the same first medical image frames. Accordingly, the apparatus 1000 according to an exemplary embodiment may provide a panoramic video function by continuously outputting the plurality of panoramic images corresponding to points of various body movements of the object.
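The sketch below illustrates this idea under stated assumptions: second medical image frames are grouped by the ECG point they correspond to (for example, R-point frames versus T-wave-onset frames), one panoramic image is synthesized per group, and the resulting panoramas are emitted in a repeating sequence like video frames. The grouping keys and plain hstack synthesis are assumptions for illustration.

```python
# Sketch only: one panorama per ECG phase, played back as a panoramic video.
import numpy as np

def panoramas_by_ecg_phase(labeled_frames):
    """labeled_frames: list of (phase_label, frame). One hstack panorama per label."""
    groups = {}
    for label, frame in labeled_frames:
        groups.setdefault(label, []).append(frame)
    return {label: np.hstack(frames) for label, frames in groups.items()}

def panoramic_video(panoramas, n_cycles=2):
    """Yield the per-phase panoramas in a fixed order, repeatedly, like video frames."""
    order = sorted(panoramas)
    for _ in range(n_cycles):
        for label in order:
            yield label, panoramas[label]

frames = [("R", np.random.rand(64, 96)) for _ in range(3)] + \
         [("T", np.random.rand(64, 96)) for _ in range(3)]
for label, pano in panoramic_video(panoramas_by_ecg_phase(frames)):
    print(label, pano.shape)    # alternates between the D3-like and D4-like panoramas
```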
- FIGS. 11 through 14 are views illustrating various examples where the apparatus 1000 displays generated panoramic images, according to an exemplary embodiment. - As shown in
FIGS. 11 and 12, Doppler images 800 and 810 and corresponding B-mode images 700 and 710, wherein the Doppler images 800 and 810 and the B-mode images 700 and 710 are panoramic images, may be displayed on the display 1400 of the apparatus 1000. - The B-mode image 700 and the
Doppler image 800 of FIG. 11 show the panoramic image D3 520 that is generated by synthesizing only the second medical image frames A3, B3, and C3 having bio-signal information corresponding to a state where a blood vessel is expanded (e.g., a thickness of the blood vessel is d3) in FIG. 10. - Also, the B-
mode image 710 and the Doppler image 810 of FIG. 12 show the panoramic image D4 530 that is generated by synthesizing only the second medical image frames A4, B4, and C4 having bio-signal information corresponding to a state where the blood vessel is contracted (e.g., a thickness of the blood vessel is d4) in FIG. 10. - For example,
images of FIGS. 13A and 13B show the panoramic image D4 530 that is generated by synthesizing only second medical image frames having bio-signal information corresponding to a state where a blood vessel of a wrist is contracted. - Also,
images of FIGS. 14A and 14B show the panoramic image D4 530 that is generated by synthesizing only second medical image frames having bio-signal information corresponding to a state where a blood vessel of a back portion of a lower leg is contracted. -
FIGS. 15 through 17 are views illustrating various examples where theapparatus 1000 displays panoramic images along with ECG signal information of an object, according to an exemplary embodiment. -
FIGS. 15 through 17 are views for explaining a post-process of processing panoramic images by using ECG signal information of an object after the panoramic images are generated and stored in a memory. An image processor according to an exemplary embodiment may correlate the ECG signal information of the object with the panoramic images and may store the ECG signal information and the panoramic images that are correlated with each other, and a display according to an exemplary embodiment may display the ECG signal information and the panoramic images together.
- For example, as shown in
FIG. 15 , apanoramic image 840 that is generated by synthesizing only second medical image frames corresponding to points in time that have the same ECG signal information of the object may be displayed on the display along with anECG image 900 of the object. - In this case, a GUI related to a function of reproducing a panoramic video may also be output to the display.
- For example, the GUI related to the function of reproducing the panoramic video may include a user interface for reproducing, editing, or re-storing the panoramic images.
- For example, as shown in
FIG. 15, when the panoramic image 840 that is stored in the memory is output, the display may display together the ECG image 900 correlated with the panoramic image 840, an interval marker 901 of the ECG image 900 corresponding to the panoramic image 840 that is currently output to the display, and a processor bar 910. - Also, as shown in
FIG. 16, when a panoramic image 850 that is stored in the memory is output, the display may display together an ECG image 920 correlated with the panoramic image 850, an interval marker 921 of the ECG image 920 corresponding to the panoramic image 850 that is currently output to the display, and a processor bar 930. - Also, as shown in
FIG. 17, the display may also output a plurality of ECG images correlated with a panoramic image 860. - Also, the
apparatus 1000 according to an exemplary embodiment may reproduce, edit, or re-store an interval of a panoramic image selected by a user through a user interface related to a function of reproducing the panoramic image. - For example, as shown in
FIGS. 15 through 17, only the interval of a panoramic image that the user desires to see may be output to the display based on a user input through the processor bars 910, 930, and 950. - Also, the
apparatus 1000 according to an exemplary embodiment may store in the memory, as a separate file, only a panoramic image that is selected based on a user input from a panoramic video file that is stored in the memory. In this case, the selected panoramic image may be correlated with the ECG signal information corresponding to the selected panoramic image and may be stored in the memory.
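The sketch below illustrates this post-processing step under stated assumptions: each stored panoramic image carries the ECG time interval it was built from, and a user selection on the processor bar is a (start, end) time pair; only the panoramas whose intervals fall inside the selection are kept for re-storage. The StoredPanorama structure and select_interval helper are illustrative names, not the apparatus's storage format.

```python
# Sketch only: extracting the user-selected interval of a stored panoramic video.
from dataclasses import dataclass
from typing import List, Tuple
import numpy as np

@dataclass
class StoredPanorama:
    ecg_interval: Tuple[float, float]   # (start, end) seconds on the correlated ECG
    image: np.ndarray

def select_interval(video: List[StoredPanorama], start: float, end: float):
    """Return only the panoramas whose correlated ECG interval lies in [start, end]."""
    return [p for p in video if start <= p.ecg_interval[0] and p.ecg_interval[1] <= end]

video = [StoredPanorama((k * 0.6, (k + 1) * 0.6), np.zeros((64, 256))) for k in range(10)]
clip = select_interval(video, 1.2, 3.0)     # e.g. an interval the user marked as abnormal
print(len(clip))                             # 3 panoramas would be re-stored separately
```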
- Accordingly, since the
apparatus 1000 according to an exemplary embodiment provides panoramic images in consideration of a periodic body movement of the object, and thus may obtain more physical information than is obtained from panoramic images that are provided as still images, the apparatus 1000 may be used for various medical tests. - For example, the
apparatus 1000 according to an exemplary embodiment may detect a contraction or expansion of a blood vessel or a position and the amount of a blood clot by using a panoramic video function. Also, the apparatus 1000 according to an exemplary embodiment may be used to evaluate blood flow in an artery or examine an aortic aneurysm, and may use a panoramic video function to evaluate a state where a blood vessel is reconnected after blood vessel bypass surgery. - Also, since the
apparatus 1000 according to an exemplary embodiment provides a function of reproducing, editing, or re-storing a panoramic image of an interval desired by the user through a user interface related to a function of reproducing the panoramic video, the apparatus 1000 may provide to the user a panoramic video function with high intuitiveness and high utility. - While one or more exemplary embodiments have been described with reference to the figures, it will be understood by one of ordinary skill in the art that various changes in form and details may be made therein without departing from the scope of the following claims.
Claims (14)
- A method of synthesizing medical images, the method comprising: acquiring image data of an object; acquiring an electrocardiogram (ECG) of the object; generating first medical image frames of the object based on the image data; displaying, on a display, ECG signal information along with an interval marker on the ECG signal information; receiving a user input of selecting a time interval of the ECG signal information; and, in response to the user input being received: selecting, from among the first medical image frames, second medical image frames corresponding to first points of time that have the same ECG signal information, in the time interval that is selected; generating a panoramic image by synthesizing the second medical image frames that are selected; correlating the ECG signal information in the time interval that is selected with the panoramic image that is generated; and displaying, on the display, the panoramic image, correlated with the ECG signal information in the time interval that is selected.
- The method of claim 1, wherein the acquiring of the image data of the object comprises: transmitting an ultrasound signal to the object; receiving a response signal, based on the ultrasound signal that is transmitted; and acquiring the image data, based on the response signal that is received.
- The method of claim 1, wherein the acquiring of the image data of the object comprises: transmitting a radio frequency (RF) signal to the object; receiving a magnetic resonance (MR) signal that is emitted from the object based on the RF signal that is transmitted; and acquiring the image data based on the MR signal that is received.
- The method of claim 1, wherein the acquiring of the image data of the object comprises: transmitting an X-ray signal to the object; detecting the X-ray signal that is transmitted through the object; and acquiring computed tomography (CT) image data based on the X-ray signal that is detected.
- The method of any preceding claim, further comprising:
storing the ECG signal information with the panoramic image correlated with the time interval that is selected. - The method of any preceding claim, wherein the first points in time that have the same ECG signal information have the same singular points that are extracted from an ECG image.
- The method of any preceding claim, further comprising, in response to the user input being received: selecting, from among the first medical image frames, third medical image frames different from the second medical image frames and corresponding to second points of time that have the same ECG signal information, in the time interval that is selected, wherein the second points of time are different from the first points of time; generating a further panoramic image by synthesizing the third medical image frames that are selected; and displaying the further panoramic image on the display.
- An apparatus for synthesizing medical images, the apparatus comprising: a data acquirer configured to acquire image data of an object; an electrocardiogram (ECG) acquirer configured to acquire an ECG signal of the object; an image processor configured to generate first medical image frames of the object based on the image data; a display configured to display ECG signal information along with an interval marker on the ECG signal information, wherein the image processor is further configured to: receive a user input of selecting a time interval of the ECG signal information; and, in response to the user input being received: select, from among the first medical image frames, second medical image frames corresponding to first points of time that have the same ECG signal information, in the time interval that is selected; generate a panoramic image by synthesizing the second medical image frames that are selected; correlate the ECG signal information in the time interval that is selected with the panoramic image that is generated; and control the display to display the panoramic image that is correlated with the ECG signal information in the time interval that is selected.
- The apparatus of claim 8, wherein the data acquirer further comprises an ultrasound transceiver configured to: transmit an ultrasound signal to the object; receive a response signal based on the ultrasound signal that is transmitted; and acquire the image data based on the response signal that is received.
- The apparatus of claim 8, wherein the data acquirer is configured to: transmit a radio frequency (RF) signal to the object; receive a magnetic resonance (MR) signal that is emitted from the object based on the RF signal that is transmitted; and acquire the image data based on the MR signal that is received.
- The apparatus of claim 8, wherein the data acquirer is configured to: transmit an X-ray signal to the object; detect the X-ray signal that is transmitted through the object; and acquire computed tomography (CT) image data based on the X-ray signal that is detected.
- The apparatus of any of the claims 8-11, further comprising a memory configured to store the ECG signal information with the panoramic image correlated with the time interval that is selected.
- The apparatus of any of the claims 8-12, wherein the first points of time that have the same ECG signal information have the same singular points that are extracted from the ECG image.
- The apparatus of any of the claims 8-13, wherein
the image processor is configured to, in response to the user input being received: select, from among the first medical image frames, third medical image frames corresponding to second points of time that have the same ECG signal information, in the time interval that is selected, wherein the second points of time are different from the first points of time; generate a further panoramic image by synthesizing the third medical image frames that are selected; and control the display to display the further panoramic image.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150068187A KR101797042B1 (en) | 2015-05-15 | 2015-05-15 | Method and apparatus for synthesizing medical images |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3092951A1 EP3092951A1 (en) | 2016-11-16 |
EP3092951B1 true EP3092951B1 (en) | 2020-08-19 |
Family
ID=55701681
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16156252.5A Active EP3092951B1 (en) | 2015-05-15 | 2016-02-18 | Method and apparatus for synthesizing medical images |
Country Status (5)
Country | Link |
---|---|
US (1) | US10957013B2 (en) |
EP (1) | EP3092951B1 (en) |
KR (1) | KR101797042B1 (en) |
CN (1) | CN107635464A (en) |
WO (1) | WO2016186279A1 (en) |
- 2015
  - 2015-05-15 KR KR1020150068187A patent/KR101797042B1/en active IP Right Grant
  - 2015-11-30 US US14/953,531 patent/US10957013B2/en active Active
  - 2015-12-01 WO PCT/KR2015/013006 patent/WO2016186279A1/en active Application Filing
  - 2015-12-01 CN CN201580080024.5A patent/CN107635464A/en active Pending
- 2016
  - 2016-02-18 EP EP16156252.5A patent/EP3092951B1/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0975339A (en) * | 1995-09-14 | 1997-03-25 | Hitachi Medical Corp | Ultrasonic diagnostic device |
US20050222506A1 (en) * | 2004-03-24 | 2005-10-06 | Masao Takimoto | Ultrasound diagnostic apparatus |
US20090198134A1 (en) * | 2008-01-31 | 2009-08-06 | Shinichi Hashimoto | Ultrasonic diagnostic apparatus, diagnostic imaging apparatus, and program |
Also Published As
Publication number | Publication date |
---|---|
EP3092951A1 (en) | 2016-11-16 |
WO2016186279A1 (en) | 2016-11-24 |
US10957013B2 (en) | 2021-03-23 |
KR101797042B1 (en) | 2017-11-13 |
CN107635464A (en) | 2018-01-26 |
KR20160134320A (en) | 2016-11-23 |
US20160335742A1 (en) | 2016-11-17 |
Similar Documents
Publication | Title |
---|---|
EP3092951B1 (en) | Method and apparatus for synthesizing medical images |
US20140108053A1 (en) | Medical image processing apparatus, a medical image processing method, and ultrasonic diagnosis apparatus |
CN107666863B (en) | Method of displaying elastographic image and ultrasonic diagnostic apparatus for performing the method |
US11191524B2 (en) | Ultrasonic diagnostic apparatus and non-transitory computer readable medium |
EP2989987B1 (en) | Ultrasound diagnosis apparatus and method and computer readable storage medium |
EP3143939B1 (en) | Ultrasound apparatus and method of obtaining information from contrast image |
EP3173026B1 (en) | Medical imaging apparatus and method of operating same |
US20150173716A1 (en) | Apparatus and method for displaying ultrasound image |
KR20150082006A (en) | The Method and Apparatus for Displaying Medical Image |
US20160157829A1 (en) | Medical imaging apparatus and method of generating medical image |
US20150206323A1 (en) | Method and apparatus for displaying medical image |
US10761198B2 (en) | Method and apparatus for acquiring image using ultrasound |
EP3037041B1 (en) | Method and apparatus for generating body marker |
KR20160056163A (en) | Ultrasound Diagnostic Method and Ultrasound Diagnostic Apparatus |
KR101563501B1 (en) | Apparatus and method for measuring vessel stress |
EP3000401B1 (en) | Method and apparatus for generating ultrasound image |
JP7356229B2 (en) | Ultrasound diagnostic equipment |
US11291429B2 (en) | Medical imaging apparatus and method of generating medical image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20170117 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20181010 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20200414 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602016042112 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1303032 Country of ref document: AT Kind code of ref document: T Effective date: 20200915 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20200819 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201119 |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201119 |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201221 |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201120 |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1303032 Country of ref document: AT Kind code of ref document: T Effective date: 20200819 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201219 |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602016042112 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
26N | No opposition filed |
Effective date: 20210520 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20210228 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210228 |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210228 |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210218 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210228 |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210218 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210228 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20160218 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240122 Year of fee payment: 9 |
Ref country code: GB Payment date: 20240122 Year of fee payment: 9 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200819 |