
CN111012377B - Echocardiogram heart parameter calculation and myocardial strain measurement method and device - Google Patents

Echocardiogram heart parameter calculation and myocardial strain measurement method and device

Info

Publication number
CN111012377B
CN111012377B (application CN201911242976.9A)
Authority
CN
China
Prior art keywords
section
segmentation
heart
result
network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911242976.9A
Other languages
Chinese (zh)
Other versions
CN111012377A (en)
Inventor
陈晓天
罗志鹏
张培芳
吴振洲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ande Yizhi Technology Co ltd
Original Assignee
Beijing Ande Yizhi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ande Yizhi Technology Co ltd
Priority to CN201911242976.9A priority Critical patent/CN111012377B/en
Publication of CN111012377A publication Critical patent/CN111012377A/en
Application granted granted Critical
Publication of CN111012377B publication Critical patent/CN111012377B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0883 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Cardiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to a method and a device for calculating echocardiogram heart parameters and measuring myocardial strain, wherein the method executes processing through a trained neural network, and the processing at least comprises the following steps: classifying the heart ultrasonic video to obtain a section classification result; obtaining a segmentation result by carrying out image segmentation on the section classification result; and obtaining cardiac parameters and myocardial strain according to the segmentation result. In the embodiment of the disclosure, the trained neural network is used for carrying out automatic section classification processing and image segmentation on the heart ultrasonic video, and further automatically obtaining the measurement results of the heart parameters and the myocardial strain, thereby effectively reducing the workload of doctors and improving the working efficiency.

Description

Echocardiogram heart parameter calculation and myocardial strain measurement method and device
Technical Field
The present disclosure relates to the field of computer image processing, and in particular, to a method and an apparatus for calculating echocardiography heart parameters and measuring myocardial strain.
Background
In recent years, artificial intelligence technology, represented by deep learning neural networks, has developed rapidly, and its application in the field of medical image processing is currently a research focus. Echocardiography, a commonly used medical examination method, is also beginning to serve as an analysis target for deep learning models; however, the related art offers neither automatic measurement of cardiac parameters nor automatic calculation of myocardial strain for echocardiography.
Disclosure of Invention
In view of the above, the present disclosure provides a method and an apparatus for calculating echocardiography cardiac parameters and measuring myocardial strain.
According to an aspect of the present disclosure, there is provided an echocardiographic heart parameter calculation and myocardial strain measurement method, which performs processing by a trained neural network, the processing at least including:
classifying the heart ultrasonic video to obtain a section classification result;
obtaining a segmentation result by carrying out image segmentation on the section classification result;
and obtaining cardiac parameters and myocardial strain according to the segmentation result.
In one possible implementation, the trained neural network includes: a classification network and a segmentation network.
In a possible implementation manner, the obtaining a section classification result by performing classification processing on the cardiac ultrasound video includes:
acquiring a heart ultrasonic original image;
converting each section sequence in the ultrasonic original image into a section video with the same resolution and the same frame number;
and inputting the section video into the classification network to obtain a section classification result.
In a possible implementation manner, the obtaining a segmentation result by performing image segmentation on the section classification result includes:
screening the section classification result to obtain an appointed section;
the designated section comprises: at least one of an apical two-chamber two-dimensional section, an apical three-chamber two-dimensional section, and an apical four-chamber two-dimensional section;
carrying out segmentation processing on the specified section through the segmentation network to obtain a segmentation result;
the segmentation result comprises: at least one of the contours of the left ventricular endocardium, left ventricular epicardium, left atrial endocardium, right ventricular endocardium, and right ventricular epicardium.
In one possible implementation, the obtaining of the cardiac parameter and the left ventricular myocardial strain according to the segmentation result includes:
calculating cardiac parameters according to the segmentation result;
and calculating the left ventricle myocardial strain according to the segmentation result and the speckle tracking.
In a possible implementation manner, the segmenting the designated section through the segmentation network to obtain a segmentation result includes:
preprocessing the designated section to obtain a preprocessed designated section sequence;
inputting the specified section sequence into the segmentation network to obtain masks of all structures of the heart;
obtaining structural regions of the heart according to the mask;
and obtaining the initial frame number of the cardiac cycle and the positions of the apex and the root of the mitral valve according to each structural region of the heart.
In one possible implementation, the classification network is a three-dimensional convolutional neural network.
According to another aspect of the present disclosure, there is provided an echocardiographic heart parameter calculation and myocardial strain measurement apparatus, which performs processing by a trained neural network, including:
the classification module is used for classifying the cardiac ultrasonic video to obtain a section classification result;
the segmentation module is used for carrying out image segmentation on the section classification result to obtain a segmentation result;
and the calculation module is used for obtaining the heart parameters and the myocardial strain according to the segmentation result.
According to another aspect of the present disclosure, there is provided an echocardiographic heart parameter calculation and myocardial strain measurement apparatus comprising: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the above-described method.
In the embodiment of the disclosure, the trained neural network is used for carrying out automatic section classification processing and image segmentation on the heart ultrasonic video, and further automatically obtaining the measurement results of the heart parameters and the myocardial strain, thereby effectively reducing the workload of doctors and improving the working efficiency.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 shows a block diagram of a 2D convolutional neural network VGGNET;
fig. 2 shows a schematic diagram of ultrasound image segmentation using the convolutional neural network model Unet;
FIG. 3 illustrates a flow chart of a method of echocardiographic heart parameter calculation and myocardial strain measurement according to an embodiment of the present disclosure;
FIG. 4 illustrates a block diagram of a three-dimensional convolutional neural network model, according to an embodiment of the present disclosure;
FIG. 5 illustrates a block diagram of a multitasking convolutional neural network model according to one embodiment of the present disclosure;
FIG. 6 shows a left ventricular myocardium segmentation schematic according to an embodiment of the present disclosure;
FIG. 7 illustrates a flow chart of a method of echocardiographic heart parameter calculation and myocardial strain measurement according to an embodiment of the present disclosure;
FIG. 8 shows a block diagram of an echocardiographic heart parameter calculation and myocardial strain measurement apparatus according to an embodiment of the present disclosure;
FIG. 9 illustrates a block diagram of an apparatus for echocardiographic heart parameter calculation and myocardial strain measurement according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
In recent years, the artificial intelligence technology represented by the deep learning neural network has undergone a rapid development, and its application in the field of medical image processing is also currently a focus of research. Echocardiography, a commonly used medical examination method, is also beginning to be used as an analysis object of deep learning models.
On the one hand, in the related art, when a deep learning model is used to automatically classify ultrasound sections, an image is generally extracted at random directly from the ultrasound video file and classified with the 2D convolutional neural network VGGNET shown in fig. 1 to obtain the ultrasound section classification result. However, extracting single images from the video and feeding them into a 2D convolutional neural network for section classification discards the motion information of the heart, so the classification is not accurate enough.
On the other hand, in the related art, ultrasound image segmentation generally segments only the left ventricle of the ultrasound image with a convolutional neural network model in order to calculate a few cardiac parameters. As shown in fig. 2, the convolutional neural network model Unet is used to segment the left ventricular chamber of the apical four-chamber section: the raw ultrasound frames (Echo Cine Raw Frames) and the optical flow (Echo Cine Optical Flow) serve as input; within a sliding temporal window, two U-net encoders act as the convolutional encoder for each image in the video; after the encoding results are concatenated, the resulting feature maps are either processed by a bidirectional ConvLSTM layer or passed directly to the decoder through skip connections; finally a U-net decoder acts as the convolutional decoder and decodes the feature map of each image into the final segmentation result (the left-ventricle segmentation of the current frame). However, this segmentation method mainly handles the apical four-chamber section and can segment only the left ventricular structure in the image, which is a strong limitation.
In addition, in the related art, a speckle tracking method is usually adopted to track the motion of pixel points in the myocardium across the ultrasound video to obtain a myocardial strain result. Commercial ultrasound analysis software such as TOMTEC is representative: a speckle tracking technique tracks the speckle motion of the myocardium in the ultrasound image to obtain strain indices during myocardial motion. However, with this method the strain result is strongly affected by image quality, and the position of the left ventricular chamber must be specified manually as initialization, so the operation is cumbersome and both repeatability and accuracy are poor.
Therefore, the embodiment of the disclosure provides an ultrasonic cardiogram heart parameter calculation and myocardial strain measurement scheme based on artificial intelligence, automatic section classification processing and image segmentation are performed on a heart ultrasonic video through a trained neural network, and a measurement result of the heart parameter and myocardial strain is further obtained automatically, so that the workload of a doctor is effectively reduced, and the work efficiency is improved.
Figure 3 illustrates a flow chart of a method of echocardiographic cardiac parameter calculation and myocardial strain measurement according to an embodiment of the present disclosure. As shown in fig. 3, the method performs a process by the trained neural network, and the process at least includes:
101, classifying cardiac ultrasonic videos to obtain a section classification result;
102, carrying out image segmentation on the section classification result to obtain a segmentation result;
and 103, obtaining heart parameters and myocardial strain according to the segmentation result.
In order to realize automatic measurement of cardiac parameters and automatic calculation of myocardial strain, in the embodiment of the disclosure a trained neural network automatically classifies the sections of a cardiac ultrasound video; based on the classified images, some of the sections are selected for image segmentation to obtain a segmentation result (such as the contours of cardiac structures including the left ventricle, right ventricle, left atrium, and right atrium in the image). Further based on the segmentation result, a variety of cardiac parameters are calculated and measurements of myocardial strain are derived automatically.
In one possible implementation, the trained neural network may include: a classification network and a segmentation network. Both may be convolutional neural networks, including but not limited to V-NET, U-NET, VGG, ResNet, DenseNet, and the like; this embodiment is not limited in this respect. The classification network classifies the cardiac ultrasound video to obtain a section classification result; the segmentation network performs image segmentation on the section classification result output by the classification network to obtain a segmentation result.
In a possible implementation manner, in step 101, obtaining a slice classification result by performing classification processing on the cardiac ultrasound video may include the following steps:
step 10101, acquiring a heart ultrasonic original image;
10102, converting each section sequence in the ultrasonic original image into a section video with the same resolution and the same frame number;
10103, inputting the section video into the classification network to obtain a section classification result.
A cardiac ultrasound original image (i.e. an echocardiogram) is obtained by the ultrasonic ranging principle: pulsed ultrasound penetrates the chest wall and soft tissue to measure the periodic activity of structures such as the heart walls, ventricles, and valves, which is displayed as curves of each structure's activity over time. The original echocardiogram usually contains several sequences, each corresponding to one section in the ultrasound examination. To make full use of the heart's motion information, in the embodiment of the disclosure the acquired cardiac ultrasound original image is preprocessed before classification by the trained neural network: each section sequence to be classified is converted into a video with the same resolution and the same frame count, so that the processed section video, rather than a single section picture, serves as the input to the neural network. The information contained in the section video is thus fully exploited and the classification accuracy improved.
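As a concrete illustration of this preprocessing step, the sketch below resamples one section sequence to a fixed frame count and resolution so that every clip entering the classification network has the same shape. It is a minimal, dependency-free sketch: the function name, target shape, and nearest-neighbour resampling scheme are illustrative assumptions, not the patent's actual implementation (a real pipeline would use proper interpolation, e.g. `cv2.resize`).

```python
import numpy as np

def standardize_clip(frames: np.ndarray, out_hw=(128, 128), out_frames=32) -> np.ndarray:
    """Resample a section sequence of shape (T, H, W) to a fixed frame count
    and resolution so every clip has the same shape before classification."""
    t, h, w = frames.shape
    # uniform temporal sampling down (or up) to out_frames
    t_idx = np.linspace(0, t - 1, out_frames).round().astype(int)
    # nearest-neighbour spatial resampling to out_hw
    r_idx = np.linspace(0, h - 1, out_hw[0]).round().astype(int)
    c_idx = np.linspace(0, w - 1, out_hw[1]).round().astype(int)
    clip = frames[t_idx][:, r_idx][:, :, c_idx]
    return clip.astype(np.float32)
```

With this, a 50-frame 300x400 sequence and a 20-frame 600x800 sequence both become (32, 128, 128) tensors and can be batched together.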
In one possible implementation, the classification network is a three-dimensional convolutional neural network. FIG. 4 illustrates a block diagram of a three-dimensional convolutional neural network model according to an embodiment of the present disclosure. As shown in fig. 4, the three-dimensional convolutional neural network is trained in advance on sample section video files to obtain a trained 3D convolutional neural network. The preprocessed section videos, all with the same resolution and frame count, are input into the trained 3D convolutional neural network for section classification, yielding the classification result of the ultrasound sections (i.e., the section category corresponding to each video file). Classifying ultrasound section videos with a 3D convolutional neural network thus effectively exploits the cardiac motion information in the ultrasound videos, so the section type is judged more accurately.
The ultrasound sections that the three-dimensional convolutional neural network can classify cover most conventional cardiac ultrasound video sections, for example: the parasternal long-axis section, the parasternal long-axis ascending aorta section, the parasternal aortic short-axis sections, and the parasternal left ventricular short-axis sections, among other standard views.
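To illustrate why a 3D convolution preserves the motion information that a 2D network loses, the minimal sketch below applies a spatio-temporal kernel (here, a hand-written temporal-difference kernel) across a clip: the response is zero on a static clip and non-zero wherever pixel values change between frames. This is purely didactic numpy code, not the patent's architecture; a real classifier stacks many learned 3D kernels (e.g. `torch.nn.Conv3d`), and, as in deep learning libraries, the operation below is actually cross-correlation.

```python
import numpy as np

def conv3d_valid(volume: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Minimal 'valid' 3D convolution (cross-correlation, as in DL libraries)
    over a (T, H, W) volume: a 3D kernel mixes information across frames as
    well as across pixels, which is what lets it respond to motion."""
    kt, kh, kw = kernel.shape
    t, h, w = volume.shape
    out = np.zeros((t - kt + 1, h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(volume[i:i + kt, j:j + kh, k:k + kw] * kernel)
    return out

# temporal-difference kernel: responds only where pixels change over time
kernel = np.zeros((2, 1, 1))
kernel[0] = -1.0
kernel[1] = 1.0

static = np.ones((4, 3, 3))                                # no motion
moving = np.stack([np.full((3, 3), float(i)) for i in range(4)])  # brightness ramps
```

On `static` every output is 0; on `moving` every output is 1, i.e. only the clip with inter-frame change excites the kernel.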
In a possible implementation manner, in step 102, the obtaining a segmentation result by performing image segmentation on the section classification result may include the following steps:
step 10201, screening the section classification result to obtain a designated section; the designated section comprises: at least one of an apical two-chamber two-dimensional section (A2C, Apical 2 Chamber), an apical three-chamber two-dimensional section (A3C, Apical 3 Chamber), and an apical four-chamber two-dimensional section (A4C, Apical 4 Chamber);
in this embodiment, according to the section category corresponding to each section video obtained in step 101, the apical two-chamber two-dimensional section, apical three-chamber two-dimensional section, and apical four-chamber two-dimensional section in each case can be automatically selected for subsequent image segmentation.
Step 10202, segmenting the designated section through the segmentation network to obtain a segmentation result; the segmentation result comprises: at least one of the contours of the left ventricular endocardium, left ventricular epicardium, left atrial endocardium, right ventricular endocardium, and right ventricular epicardium.
In the embodiment of the present disclosure, the segmentation network may be a classification network, a regression network, or both. For example, the segmentation network may be a multitask convolutional neural network; FIG. 5 illustrates a block diagram of a multitask convolutional neural network model according to one embodiment of the present disclosure. As shown in fig. 5, the multitask convolutional neural network model is trained in advance on segmentation samples to obtain a trained segmentation network; the video of the designated section obtained in the previous step is input into the trained segmentation network, and segmentation yields the contours of the left ventricular endocardium, left ventricular epicardium, left atrial endocardium, right ventricular endocardium, and right ventricular epicardium on each frame of the video. By exploiting dilated convolution and multitask learning, the multitask segmentation network outputs a distance map and a curvature map alongside the segmentation result, which effectively guide the segmentation operation and thus produce segmentation results with higher accuracy and continuity. The distance map gives, for each pixel in the image, the shortest straight-line distance to the contour of the segmented object; the curvature map gives the degree of bending at each point of the segmented object's contour.
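The distance map used as an auxiliary regression target can be made concrete with the brute-force sketch below: for every pixel it returns the shortest straight-line distance to the contour of a binary mask. The 4-neighbour contour definition and the brute-force search are illustrative assumptions; production code would typically use `scipy.ndimage.distance_transform_edt` instead.

```python
import numpy as np

def distance_map(mask: np.ndarray) -> np.ndarray:
    """For each pixel, the shortest Euclidean distance to the contour of the
    segmented object (mask boundary). Brute force, for illustration only."""
    # contour = foreground pixels with at least one background 4-neighbour
    padded = np.pad(mask, 1)
    neigh_bg = ((padded[:-2, 1:-1] == 0) | (padded[2:, 1:-1] == 0) |
                (padded[1:-1, :-2] == 0) | (padded[1:-1, 2:] == 0))
    contour = np.argwhere((mask == 1) & neigh_bg)
    ys, xs = np.indices(mask.shape)
    pts = np.stack([ys.ravel(), xs.ravel()], axis=1)
    d = np.sqrt(((pts[:, None, :] - contour[None, :, :]) ** 2).sum(-1)).min(1)
    return d.reshape(mask.shape)
```

For a 3x3 square object inside a 5x5 image, the border pixels of the square get distance 0 and its centre pixel gets distance 1, matching the definition in the text.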
In one possible implementation manner, in step 10202, the performing, by the segmentation network, a segmentation process on the designated section to obtain a segmentation result may include:
step 1020201, preprocessing the designated section to obtain a preprocessed designated section sequence;
step 1020202, inputting the specified tangent plane sequence into the segmentation network to obtain the mask of each structure of the heart;
step 1020203, obtaining structural regions of the heart according to the mask;
and 1020204, obtaining the initial frame number of the cardiac cycle and the positions of the apex and the root of the mitral valve according to each structural region of the heart.
For example, the selected video with the designated section may be preprocessed by using image processing techniques, such as: peripheral character information irrelevant to image segmentation is eliminated, the pixel spacing of the image is unified, and pixel gray scale is subjected to normalization processing, so that the accuracy of image segmentation is improved.
Furthermore, each picture of the screened designated-section video is preprocessed as above and input in turn into the trained segmentation network (the multitask convolutional neural network); segmentation outputs the masks of the structural regions of the heart, and each structural region is extracted by multiplying its mask with the original picture. After all pictures in the designated-section video are processed, the segmentation result of the heart structures over the whole video period is obtained. On top of the pixel-level classification task, the segmentation network of the embodiment of the disclosure thus also completes the distance and curvature regression tasks for the contour, which effectively improves the segmentation accuracy; the segmentation result can further help determine the positions of the apex and the mitral valve roots.
Based on the obtained segmentation result of the heart structures over the whole video period, image processing techniques can further determine the starting frame numbers of a cardiac cycle and the positions of the apex and the mitral valve roots, providing a basis for the subsequent strain measurement. For example, the region corresponding to the left ventricle in each frame can be taken from the segmentation result and its pixel count calculated; the frames at which the left ventricular pixel count reaches its maxima are found, and two adjacent such frames are selected as one cardiac cycle. Meanwhile, the region corresponding to the left atrium can be taken from the segmentation result, the boundary between the left ventricle and the left atrium determined with a support vector machine, and the boundary position adjusted according to the image size; the two intersection points of this boundary with the left atrium are taken as the positions of the mitral valve roots. The point in the left ventricular region farthest from the line joining the mitral valve roots is then identified as the apex.
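The cardiac-cycle and apex logic above can be sketched as follows, under two simplifying assumptions labelled in the comments: end-diastolic frames are taken as local maxima of the left ventricular pixel count, and the apex as the LV pixel farthest from the line joining the two mitral valve roots. The mitral roots are passed in directly here, rather than derived with a support vector machine as in the text.

```python
import numpy as np

def end_diastole_frames(lv_masks: np.ndarray) -> np.ndarray:
    """Frame indices where the left-ventricle pixel count peaks (taken here
    as end diastole); two consecutive peaks delimit one cardiac cycle.
    No smoothing of the area curve is applied -- a deliberate simplification."""
    areas = lv_masks.reshape(lv_masks.shape[0], -1).sum(axis=1)
    peaks = [i for i in range(1, len(areas) - 1)
             if areas[i] >= areas[i - 1] and areas[i] > areas[i + 1]]
    return np.array(peaks)

def apex_from_mask(lv_mask: np.ndarray, root_a, root_b):
    """Apex = the LV pixel with the largest perpendicular distance from the
    line joining the two mitral-valve root positions (row, col)."""
    a = np.asarray(root_a, dtype=float)
    ab = np.asarray(root_b, dtype=float) - a
    pts = np.argwhere(lv_mask == 1).astype(float)
    rel = pts - a
    # perpendicular distance of each LV pixel to the line through a and b
    dist = np.abs(ab[0] * rel[:, 1] - ab[1] * rel[:, 0]) / np.linalg.norm(ab)
    return tuple(int(v) for v in pts[dist.argmax()])
```

For a vertical strip of LV pixels with the mitral roots at the bottom row, the apex is correctly reported at the top of the strip.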
In one possible implementation manner, in step 103, the obtaining a cardiac parameter and a left ventricular myocardial strain according to the segmentation result may include:
step 10301, calculating cardiac parameters based on the segmentation result;
based on the start frame number of the cardiac cycle and the apex and mitral valve root positions obtained in step 1020204, various cardiac parameters including: ventricular septal thickness, ejection fraction, left ventricular end-diastolic diameter, left ventricular end-systolic diameter, left ventricular end-diastolic volume, left ventricular end-systolic volume, right ventricular lateral diameter, left atrial anterior posterior diameter, and the like.
Step 10302, calculating left ventricular myocardial strain based on the segmentation result and speckle tracking.
On the basis of the segmentation result, the left ventricular myocardial strain of the three selected sections (for example, the strain of each of the 17 myocardial segments) is calculated by combining speckle tracking with the segmentation network's output. FIG. 6 shows a left ventricular myocardium segmentation schematic according to an embodiment of the present disclosure. The left ventricular myocardium in each frame of the video is extracted with the left ventricular myocardium mask obtained from segmentation, and the myocardium is divided according to the 17-segment model using the obtained apex and mitral valve root positions (as shown in fig. 6). The motion information of the feature points in each myocardial segment is obtained with the speckle tracking technique and corrected with the left ventricular myocardium mask from the segmentation result: after the corresponding feature point in the current frame is obtained from a feature point in the previous frame, its position is compared against the mask contour, and if the deviation exceeds a certain threshold the position is corrected onto the mask contour. This reduces the speckle tracking error, and finally the strain of each segment is calculated from the tracking result.
It should be noted that, in the embodiment of the present disclosure, the cardiac structure segmentation result obtained by deep learning is used to determine the left ventricular myocardium range, the endocardial contour, the apex, and the mitral valve root positions; speckle tracking is performed on this basis, the tracking result is corrected with the endocardial contour obtained by the deep learning segmentation, and the left ventricular myocardial strain result is finally calculated. By combining deep learning with speckle tracking in this way, the accuracy and reliability of the strain calculation are improved, and its repeatability is stronger.
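The correction and strain steps described above can be sketched as follows: a tracked feature point that drifts off the segmented myocardium mask beyond a threshold is snapped back to the nearest myocardium pixel, and the segment strain is the usual Lagrangian strain of a segment length. The threshold value and the nearest-pixel snapping rule are illustrative assumptions, not the patent's exact correction procedure.

```python
import numpy as np

def correct_tracked_point(point, myo_mask, max_dev=3.0):
    """If a speckle-tracked point lies farther than max_dev pixels from the
    myocardium mask, snap it to the nearest myocardium pixel; otherwise keep
    the tracked position. Threshold is an illustrative assumption."""
    p = np.asarray(point, dtype=float)
    myo = np.argwhere(myo_mask == 1).astype(float)
    d = np.linalg.norm(myo - p, axis=1)
    i = int(d.argmin())
    if d[i] > max_dev:
        return tuple(int(v) for v in myo[i])  # snap back onto the myocardium
    return tuple(int(v) for v in p)           # keep the tracked position

def lagrangian_strain(initial_len: float, tracked_len: float) -> float:
    """Lagrangian strain (%) of a myocardial segment: (L - L0) / L0 * 100.
    Negative values indicate shortening (systolic contraction)."""
    return 100.0 * (tracked_len - initial_len) / initial_len
```

For instance, a segment that shortens from 10 to 8 length units over systole has a strain of -20%, and a tracked point 8 pixels off a vertical myocardium strip is pulled back onto it.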
Illustratively, fig. 7 shows a flow chart of an echocardiographic heart parameter calculation and myocardial strain measurement method according to an embodiment of the present disclosure. As shown in fig. 7, a cardiac ultrasound original image is obtained, a 3D convolutional neural network is used to perform automatic section classification on the cardiac ultrasound image, a designated section is selected based on the classified image, the convolutional neural network is used to perform image segmentation and cardiac parameter calculation on the designated section, and a measurement result of myocardial strain is obtained automatically by a method combining speckle tracking and the neural network. Therefore, cardiac parameter calculation and strain calculation based on the cardiac ultrasound original image are completed automatically without manual intervention, so that the workload of doctors is effectively reduced, and the working efficiency is greatly improved.
It should be noted that, although the echocardiographic heart parameter calculation and myocardial strain measurement method is described above by taking the above embodiments as examples, those skilled in the art will understand that the present disclosure should not be limited thereto. In fact, each implementation can be set flexibly according to personal preference and/or the actual application scenario, as long as it conforms to the technical scheme of the present disclosure.
Therefore, the embodiment of the disclosure can perform automatic section classification processing and image segmentation on the heart ultrasonic video through the trained neural network, and further automatically obtain the measurement results of the heart parameters and the myocardial strain, thereby effectively reducing the workload of doctors and improving the working efficiency.
FIG. 8 illustrates a block diagram of an echocardiographic heart parameter calculation and myocardial strain measurement apparatus according to an embodiment of the present disclosure. The apparatus performs processing through the trained neural network and may include: a classification module 81, configured to classify the cardiac ultrasound video to obtain a section classification result; a segmentation module 82, configured to perform image segmentation on the section classification result to obtain a segmentation result; and a calculation module 83, configured to obtain the cardiac parameters and the myocardial strain according to the segmentation result.
In one possible implementation, the trained neural network includes: a classification network and a segmentation network.
In one possible implementation, the classification module may include: an acquisition unit, configured to acquire a cardiac ultrasound original image; a conversion unit, configured to convert each section sequence in the ultrasound original image into a section video with the same resolution and the same number of frames; and a classification unit, configured to input the section video into the classification network to obtain a section classification result.
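The conversion unit's normalization step (every section video brought to the same resolution and frame count) might look like the sketch below. The target sizes of 128×128 pixels and 32 frames and the nearest-neighbour sampling are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def normalize_clip(frames, out_hw=(128, 128), out_frames=32):
    """Resample a variable-length echo sequence to a fixed (T, H, W) clip.

    Uses uniform temporal sampling and nearest-neighbour spatial resizing;
    both the method and the output sizes are illustrative.
    """
    frames = np.asarray(frames, dtype=np.float32)  # (T, H, W)
    t, h, w = frames.shape
    # Indices for uniform temporal sampling and nearest-neighbour spatial resizing.
    ti = np.linspace(0, t - 1, out_frames).round().astype(int)
    ri = np.linspace(0, h - 1, out_hw[0]).round().astype(int)
    ci = np.linspace(0, w - 1, out_hw[1]).round().astype(int)
    return frames[ti][:, ri][:, :, ci]
```

After this step every section sequence has an identical shape, so the clips can be batched directly into the 3D classification network.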
In one possible implementation, the segmentation module may include: a screening submodule, configured to screen the section classification result to obtain a designated section, the designated section comprising at least one of an apical two-chamber two-dimensional section, an apical three-chamber two-dimensional section, and an apical four-chamber two-dimensional section; and a segmentation submodule, configured to segment the designated section through the segmentation network to obtain a segmentation result, the segmentation result comprising at least one of the contours of the left ventricular endocardium, the left ventricular epicardium, the left atrial endocardium, the right ventricular endocardium, and the right ventricular epicardium.
In one possible implementation, the calculation module may include: a heart parameter calculation unit, configured to calculate cardiac parameters according to the segmentation result; and a myocardial strain calculation unit, configured to calculate the left ventricular myocardial strain according to the segmentation result and speckle tracking.
In one possible implementation, the segmentation submodule may include: a preprocessing unit, configured to preprocess the designated section to obtain a preprocessed designated section sequence; a mask unit, configured to input the designated section sequence into the segmentation network to obtain masks of the cardiac structures; a heart region unit, configured to obtain each structural region of the heart from the masks; and an apex and mitral valve root position unit, configured to obtain the starting frame number of the cardiac cycle and the apex and mitral valve root positions from the structural regions of the heart.
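As a rough illustration of the apex and mitral valve root position unit, the sketch below derives the two landmarks from a binary left-ventricle mask, assuming the standard apical-view orientation (apex toward the top of the image). The heuristics (topmost mask pixel as apex, basal corners of the bottom mask row as the mitral valve root) are hypothetical simplifications, not the disclosure's actual post-processing.

```python
import numpy as np

def apex_and_mitral_root(lv_mask):
    """Extract rough apex and mitral valve root landmarks from an LV mask.

    Assumes the apex points upward in the image (standard apical view).
    Returns (apex, mitral_left, mitral_right) as (row, col) tuples.
    """
    rows, cols = np.nonzero(lv_mask)
    apex_row = rows.min()                     # topmost mask row = apex level
    apex = (int(apex_row), int(cols[rows == apex_row].mean()))
    base_row = rows.max()                     # bottommost mask row = basal level
    base_cols = cols[rows == base_row]
    mitral_left = (int(base_row), int(base_cols.min()))
    mitral_right = (int(base_row), int(base_cols.max()))
    return apex, mitral_left, mitral_right
```

Landmarks like these are what anchor the 17-segment division of the myocardium described earlier.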
In one possible implementation, the classification network is a three-dimensional convolutional neural network.
It should be noted that, although the echocardiographic heart parameter calculation and myocardial strain measurement apparatus is described above by taking the above embodiment as an example, those skilled in the art will understand that the present disclosure should not be limited thereto. In fact, each implementation can be set flexibly according to personal preference and/or the actual application scenario, as long as it conforms to the technical scheme of the present disclosure.
Therefore, the embodiment of the disclosure can perform automatic section classification processing and image segmentation on the heart ultrasonic video through the trained neural network, and further automatically obtain the measurement results of the heart parameters and the myocardial strain, thereby effectively reducing the workload of doctors and improving the working efficiency.
Fig. 9 shows a block diagram of an apparatus 1900 for echocardiographic heart parameter calculation and myocardial strain measurement according to an embodiment of the present disclosure. For example, the apparatus 1900 may be provided as a server. Referring to fig. 9, the device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by the processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The device 1900 may also include a power component 1926 configured to perform power management of the device 1900, a wired or wireless network interface 1950 configured to connect the device 1900 to a network, and an input/output (I/O) interface 1958. The device 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the apparatus 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry such as a programmable logic circuit, a Field-Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA) may execute the computer-readable program instructions by utilizing state information of the instructions to personalize the electronic circuitry, thereby implementing aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (9)

1. A method of echocardiographic cardiac parameter calculation and myocardial strain measurement, the method performing processing by a trained neural network, the processing comprising at least:
classifying the cardiac ultrasound video to obtain a section classification result;
obtaining a segmentation result by performing image segmentation on the section classification result; the segmentation result comprises: at least one of the contours of the left ventricular endocardium, the left ventricular epicardium, the left atrial endocardium, the right ventricular endocardium, and the right ventricular epicardium; calculating cardiac parameters according to the segmentation result;
and obtaining the motion information of the feature points in each segment of the left ventricular myocardium through speckle tracking, and correcting the motion information of the feature points by using the left ventricular endocardial contour to obtain the left ventricular myocardial strain.
2. The method of claim 1, wherein the trained neural network comprises: a classification network and a segmentation network.
3. The method of claim 2, wherein the obtaining a section classification result by classifying the cardiac ultrasound video comprises:
acquiring a heart ultrasonic original image;
converting each section sequence in the ultrasound original image into a section video with the same resolution and the same number of frames;
and inputting the section video into the classification network to obtain a section classification result.
4. The method of claim 2, wherein the obtaining a segmentation result by performing image segmentation on the section classification result comprises:
screening the section classification result to obtain an appointed section;
the designated section comprises: at least one of an apical two-chamber two-dimensional section, an apical three-chamber two-dimensional section, and an apical four-chamber two-dimensional section;
and carrying out segmentation processing on the specified section through the segmentation network to obtain a segmentation result.
5. The method of claim 4, wherein the segmenting the designated section through the segmentation network to obtain a segmentation result comprises:
preprocessing the designated section to obtain a preprocessed designated section sequence;
inputting the specified section sequence into the segmentation network to obtain masks of all structures of the heart;
obtaining structural regions of the heart according to the mask;
and obtaining the initial frame number of the cardiac cycle and the positions of the apex and the root of the mitral valve according to each structural region of the heart.
6. The method of claim 2, wherein the classification network is a three-dimensional convolutional neural network.
7. An echocardiographic heart parameter calculation and myocardial strain measurement apparatus, wherein the apparatus performs processing via a trained neural network, comprising:
a classification module, configured to classify the cardiac ultrasound video to obtain a section classification result;
a segmentation module, configured to perform image segmentation on the section classification result to obtain a segmentation result; the segmentation result comprises: at least one of the contours of the left ventricular endocardium, the left ventricular epicardium, the left atrial endocardium, the right ventricular endocardium, and the right ventricular epicardium;
a calculation module, configured to calculate cardiac parameters according to the segmentation result, obtain the motion information of the feature points in each segment of the left ventricular myocardium through speckle tracking, and correct the motion information of the feature points by using the left ventricular endocardial contour to obtain the left ventricular myocardial strain.
8. An echocardiographic heart parameter calculation and myocardial strain measurement apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of claims 1 to 6 when executing the memory-stored executable instructions.
9. A non-transitory computer readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method of any of claims 1 to 6.
CN201911242976.9A 2019-12-06 2019-12-06 Echocardiogram heart parameter calculation and myocardial strain measurement method and device Active CN111012377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911242976.9A CN111012377B (en) 2019-12-06 2019-12-06 Echocardiogram heart parameter calculation and myocardial strain measurement method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911242976.9A CN111012377B (en) 2019-12-06 2019-12-06 Echocardiogram heart parameter calculation and myocardial strain measurement method and device

Publications (2)

Publication Number Publication Date
CN111012377A CN111012377A (en) 2020-04-17
CN111012377B true CN111012377B (en) 2020-11-03

Family

ID=70207429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911242976.9A Active CN111012377B (en) 2019-12-06 2019-12-06 Echocardiogram heart parameter calculation and myocardial strain measurement method and device

Country Status (1)

Country Link
CN (1) CN111012377B (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111508004B (en) * 2020-04-29 2021-01-15 中国人民解放军总医院 Wall motion abnormity ultrasonic processing method, system and equipment based on deep learning
CN113768533B (en) * 2020-06-10 2024-05-14 无锡祥生医疗科技股份有限公司 Ultrasonic developing device and ultrasonic developing method
CN111739000B (en) * 2020-06-16 2022-09-13 山东大学 System and device for improving left ventricle segmentation accuracy of multiple cardiac views
CN111915557B (en) * 2020-06-23 2024-09-17 杭州深睿博联科技有限公司 Deep learning atrial septal defect detection method and device
CN111915562A (en) * 2020-07-02 2020-11-10 杭州深睿博联科技有限公司 Deep learning children echocardiogram standard tangent plane identification method and device
US11382595B2 (en) * 2020-08-28 2022-07-12 GE Precision Healthcare LLC Methods and systems for automated heart rate measurement for ultrasound motion modes
CN112075956B (en) * 2020-09-02 2022-07-22 深圳大学 Method, terminal and storage medium for estimating ejection fraction based on deep learning
CN112381895A (en) * 2020-10-19 2021-02-19 深圳蓝韵医学影像有限公司 Method and device for calculating cardiac ejection fraction
CN112259227B (en) * 2020-10-29 2021-08-27 中国医学科学院北京协和医院 Calculation method and system for evaluating quantitative index of myocardial involvement of SLE patient
CN112381777B (en) * 2020-11-09 2024-10-18 深圳开立生物医疗科技股份有限公司 Image processing method and device, electronic equipment and storage medium
CN112560637B (en) * 2020-12-10 2024-03-15 长沙理工大学 Deep learning-based clothing analysis method, equipment and storage medium
CN112842384B (en) * 2020-12-30 2023-05-30 无锡触典科技有限公司 Method, device and storage medium for measuring echocardiographic myocardial envelope
CN112734748B (en) * 2021-01-21 2022-05-17 广东工业大学 Image segmentation system for hepatobiliary and biliary calculi
CN112914610B (en) * 2021-01-22 2023-03-24 华中科技大学同济医学院附属同济医院 Contrast-enhanced echocardiography wall thickness automatic analysis system and method based on deep learning
CN112932535B (en) * 2021-02-01 2022-10-18 杜国庆 Medical image segmentation and detection method
CN115082574B (en) * 2021-03-16 2024-05-14 上海软逸智能科技有限公司 Network model training method and viscera ultrasonic section code generation method and device
CN113384293A (en) * 2021-06-12 2021-09-14 北京医院 Integrated machine learning method for coronary heart disease screening based on two-dimensional spot tracking technology
CN113570569B (en) * 2021-07-26 2024-04-16 东北大学 Automatic heart chamber interval jitter detection system based on deep learning
CN115482190A (en) * 2021-11-10 2022-12-16 中山大学附属第七医院(深圳) Fetal heart structure segmentation measurement method and device and computer storage medium
CN114565622B (en) * 2022-03-03 2023-04-07 北京安德医智科技有限公司 Atrial septal defect length determination method and device, electronic device and storage medium
CN115587971B (en) * 2022-09-21 2023-10-24 四川大学华西医院 Organism reaction and hemodynamic monitoring method and system based on heart ultrasonic segment activity

Citations (5)

Publication number Priority date Publication date Assignee Title
CN104794706A (en) * 2015-04-03 2015-07-22 哈尔滨医科大学 Method for examining cardiac muscles and measuring features by aid of ultrasonic images
WO2018210714A1 (en) * 2017-05-18 2018-11-22 Koninklijke Philips N.V. Convolutional deep learning analysis of temporal cardiac images
CN109690554A (en) * 2016-07-21 2019-04-26 西门子保健有限责任公司 Method and system for the medical image segmentation based on artificial intelligence
CN110136828A (en) * 2019-05-16 2019-08-16 杭州健培科技有限公司 A method of medical image multitask auxiliary diagnosis is realized based on deep learning
CN110310256A (en) * 2019-05-30 2019-10-08 上海联影智能医疗科技有限公司 Coronary stenosis detection method, device, computer equipment and storage medium

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN104794706A (en) * 2015-04-03 2015-07-22 哈尔滨医科大学 Method for examining cardiac muscles and measuring features by aid of ultrasonic images
CN109690554A (en) * 2016-07-21 2019-04-26 西门子保健有限责任公司 Method and system for the medical image segmentation based on artificial intelligence
WO2018210714A1 (en) * 2017-05-18 2018-11-22 Koninklijke Philips N.V. Convolutional deep learning analysis of temporal cardiac images
CN110914865A (en) * 2017-05-18 2020-03-24 皇家飞利浦有限公司 Convolution depth learning analysis of temporal cardiac images
CN110136828A (en) * 2019-05-16 2019-08-16 杭州健培科技有限公司 A method of medical image multitask auxiliary diagnosis is realized based on deep learning
CN110310256A (en) * 2019-05-30 2019-10-08 上海联影智能医疗科技有限公司 Coronary stenosis detection method, device, computer equipment and storage medium

Non-Patent Citations (2)

Title
"Motion Analysis of Two-Dimensional Cardiac Ultrasound Images Based on Speckle Tracking"; Hu Jia; Master's thesis, Nanjing University of Science and Technology; 2008-06-01; pages 12, 22, and 35-50 of the main text *
"Research and Application of Ventricular MRI Image Segmentation Based on Deep Learning"; Yin Hang; Master's thesis, Lanzhou University; 2019-04-01; full text *

Also Published As

Publication number Publication date
CN111012377A (en) 2020-04-17

Similar Documents

Publication Publication Date Title
CN111012377B (en) Echocardiogram heart parameter calculation and myocardial strain measurement method and device
CN110009640B (en) Method, apparatus and readable medium for processing cardiac video
Smistad et al. Real-time automatic ejection fraction and foreshortening detection using deep learning
US11282206B2 (en) Image segmentation based on a shape-guided deformable model driven by a fully convolutional network prior
CN110766691A (en) Method and device for cardiac magnetic resonance image analysis and cardiomyopathy prediction
WO2021027571A1 (en) Artificial intelligence-based medical image processing method, medical device and storage medium
US9147258B2 (en) Methods and systems for segmentation in echocardiography
US20230326034A1 (en) Automated right ventricle medical imaging and computation of clinical parameters
US9142030B2 (en) Systems, methods and computer readable storage media storing instructions for automatically segmenting images of a region of interest
CN113362272A (en) Medical image segmentation with uncertainty estimation
CN111145160B (en) Method, device, server and medium for determining coronary artery branches where calcified regions are located
CN111448614B (en) Method and apparatus for analyzing echocardiography
CN110070529A (en) A kind of Endovascular image division method, system and electronic equipment
US12020806B2 (en) Methods and systems for detecting abnormalities in medical images
CN113744215B (en) Extraction method and device for central line of tree-shaped lumen structure in three-dimensional tomographic image
CN112075956B (en) Method, terminal and storage medium for estimating ejection fraction based on deep learning
CN115511703B (en) Method, device, equipment and medium for generating two-dimensional heart ultrasonic section image
CN114638878B (en) Two-dimensional echocardiogram pipe diameter detection method and device based on deep learning
CN114787867A (en) Organ deformation compensation for medical image registration
CN114010227B (en) Right ventricle characteristic information identification method and device
US11803967B2 (en) Methods and systems for bicuspid valve detection with generative modeling
CN116051538A (en) Left ventricle segmentation method based on transducer echocardiography
US20230123169A1 (en) Methods and systems for use of analysis assistant during ultrasound imaging
CN116168099A (en) Medical image reconstruction method and device and nonvolatile storage medium
US11903898B2 (en) Ultrasound imaging with real-time visual feedback for cardiopulmonary resuscitation (CPR) compressions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Echocardiographic cardiac parameter calculation and myocardial strain measurement methods and devices

Effective date of registration: 20220715

Granted publication date: 20201103

Pledgee: Su Qiwen

Pledgor: BEIJING ANDE YIZHI TECHNOLOGY Co.,Ltd.

Registration number: Y2022990000432

PC01 Cancellation of the registration of the contract for pledge of patent right

Granted publication date: 20201103

Pledgee: Su Qiwen

Pledgor: BEIJING ANDE YIZHI TECHNOLOGY Co.,Ltd.

Registration number: Y2022990000432