
CN113486195A - Ultrasonic image processing method and device, ultrasonic equipment and storage medium - Google Patents


Info

Publication number
CN113486195A
CN113486195A
Authority
CN
China
Prior art keywords
processed
ultrasonic image
target
standard surface
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110944161.6A
Other languages
Chinese (zh)
Inventor
董振鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Wisonic Medical Technology Co ltd
Original Assignee
Shenzhen Wisonic Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Wisonic Medical Technology Co ltd
Priority to CN202110944161.6A
Publication of CN113486195A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51 Indexing; Data structures therefor; Storage structures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5207 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36 Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367 Ontology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30048 Heart; Cardiac

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Databases & Information Systems (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Quality & Reliability (AREA)
  • Library & Information Science (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The invention discloses an ultrasonic image processing method and device, ultrasonic equipment and a storage medium. The method comprises the following steps: acquiring an ultrasonic image to be processed; performing standard surface recognition on the ultrasonic image to be processed and determining target standard surface features; performing tissue structure recognition on the ultrasonic image to be processed and determining target tissue features; performing character recognition on the ultrasonic image to be processed and determining imaging equipment features; labeling the ultrasonic image to be processed based on the target standard surface features, the target tissue features and the imaging equipment features to obtain a keyword association map; and storing the ultrasonic image to be processed and the keyword association map in a system database in an associated manner. In this scheme, the keyword association map formed from the target standard surface features, target tissue features and imaging equipment features of the ultrasonic image to be processed is associated with that image, which improves the efficiency of subsequent processing such as queries and statistics.

Description

Ultrasonic image processing method and device, ultrasonic equipment and storage medium
Technical Field
The present invention relates to the field of ultrasound imaging, and in particular, to an ultrasound image processing method and apparatus, an ultrasound device, and a storage medium.
Background
With the continuing spread of ultrasound devices, more and more medical institutions are equipped with them to scan and identify human tissue of a target object and form ultrasound images, so that doctors can read the target object's physiological detection data from the images and thereby assist in detecting and assessing the target object's health state. Generally, the ultrasound device stores the ultrasound images produced during an examination in a system database, so that a doctor can retrieve the desired ultrasound image by querying the database and generate an ultrasound analysis report from it.
In existing devices, when scanning produces an ultrasound image, an image identifier tied to the image and its generation time is created according to a specific naming rule, and the identifier and the image are stored in the system database in an associated manner. This direct storage approach has two disadvantages. First, when the database holds many ultrasound images, the required image cannot be found simply and quickly, so the query process takes a long time and the efficiency of ultrasound image analysis suffers. Second, because only the image identifier and the image itself are stored, with no image feature information, statistical processing such as case-type and pathology-type statistics cannot be run over the images in the database; manual statistical analysis is needed instead, and the statistical analysis process is inefficient.
Disclosure of Invention
The embodiments of the invention provide an ultrasonic image processing method and device, ultrasonic equipment and a storage medium, aiming to solve the problem that ultrasound images stored in a system database by conventional means cannot be quickly queried and statistically analyzed.
An ultrasound image processing method comprising:
acquiring an ultrasonic image to be processed;
performing standard surface identification on the ultrasonic image to be processed, and determining a target standard surface characteristic corresponding to the ultrasonic image to be processed;
identifying the tissue structure of the ultrasonic image to be processed, and determining the target tissue characteristics corresponding to the ultrasonic image to be processed;
performing character recognition on the ultrasonic image to be processed, and determining the imaging equipment characteristics corresponding to the ultrasonic image to be processed;
labeling the ultrasonic image to be processed based on the target standard surface feature, the target tissue feature and the imaging equipment feature to obtain a keyword association map corresponding to the ultrasonic image to be processed;
and storing the ultrasound image to be processed and the keyword association map in a system database in an associated manner.
An ultrasound image processing apparatus comprising:
the ultrasonic image acquisition module is used for acquiring an ultrasonic image to be processed;
the standard surface recognition module is used for performing standard surface recognition on the ultrasonic image to be processed and determining a target standard surface characteristic corresponding to the ultrasonic image to be processed;
the tissue structure identification module is used for identifying the tissue structure of the ultrasonic image to be processed and determining the target tissue characteristics corresponding to the ultrasonic image to be processed;
the character recognition module is used for carrying out character recognition on the ultrasonic image to be processed and determining the imaging equipment characteristics corresponding to the ultrasonic image to be processed;
the labeling processing module is used for performing labeling processing on the ultrasonic image to be processed based on the target standard surface feature, the target tissue feature and the imaging equipment feature to acquire a keyword association map corresponding to the ultrasonic image to be processed;
and the association storage module is used for storing the ultrasound image to be processed and the keyword association map in a system database in an associated manner.
An ultrasound apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the ultrasound image processing method when executing the computer program.
A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, implements the ultrasound image processing method described above.
According to the ultrasonic image processing method and device, ultrasonic equipment and storage medium above, the ultrasound image to be processed is analyzed to determine its target standard surface feature, target tissue feature, imaging equipment feature and other features, i.e. the different features that can be used to label it. These features are then integrated into a keyword association map, which is stored together with the ultrasound image to be processed in a system database in an associated manner. Labeling the image with the keyword association map in this way allows subsequent statistics, queries and other processing to use the map, improving processing efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a schematic view of an ultrasound apparatus in an embodiment of the present invention;
FIG. 2 is a flowchart of a method for processing ultrasound images according to an embodiment of the present invention;
FIG. 3 is another flow chart of a method for processing an ultrasound image according to an embodiment of the present invention;
FIG. 4 is another flow chart of a method for processing an ultrasound image according to an embodiment of the present invention;
FIG. 5 is another flow chart of a method for processing an ultrasound image according to an embodiment of the present invention;
FIG. 6 is another flow chart of a method for processing an ultrasound image according to an embodiment of the present invention;
FIG. 7 is another flow chart of a method for processing an ultrasound image according to an embodiment of the present invention;
fig. 8 is a schematic diagram of an ultrasound image processing apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The ultrasound image processing method provided by the embodiment of the invention can be applied to the ultrasound equipment shown in fig. 1, and the ultrasound equipment comprises a main controller, an ultrasound probe connected with the main controller, a beam forming processor, an image processor and a display screen.
The main controller is a controller of the ultrasonic equipment, and the main controller is connected with other functional modules in the ultrasonic equipment, including but not limited to an ultrasonic probe, a beam forming processor, an image processor, a display screen and the like, and is used for controlling the work of each functional module.
An ultrasound probe is a device that transmits and receives ultrasonic waves. In this example, the probe emits ultrasonic waves outward; as the waves propagate through media such as human tissue, they produce echo analog signals such as reflected and scattered waves. The probe converts these echo analog signals into echo electrical signals, amplifies them, performs analog-to-digital conversion to obtain echo digital signals, and sends the digital signals to the beam forming processor.
The beam forming processor is connected with the ultrasonic probe and is used for receiving the echo digital signals sent by the probe, beamforming the echo digital signals of one or more channels to obtain one or more echo synthesis signals, and sending them to the image processor.
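As a rough illustration of the beam forming step described above, the sketch below sums per-channel echo samples after applying per-channel delays (classic delay-and-sum). This is a minimal, hypothetical model, not the patent's implementation; the function name and the use of integer-sample delays are assumptions.

```python
# Hypothetical sketch of delay-and-sum beamforming: each channel's echo
# samples are shifted by a per-channel delay and summed into one output line.
from typing import List

def delay_and_sum(channels: List[List[float]], delays: List[int]) -> List[float]:
    """Sum echo samples across channels after applying integer sample delays.

    channels: per-channel digitized echo signals (all the same length)
    delays:   per-channel delay in samples (non-negative)
    """
    n = len(channels[0])
    out = [0.0] * n
    for sig, d in zip(channels, delays):
        for i in range(n):
            j = i - d  # input sample that lands at output index i after delay d
            if 0 <= j < n:
                out[i] += sig[j]
    return out

# Two channels whose echoes align at index 1 once the second is delayed by one sample.
beamformed = delay_and_sum([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]], [0, 1])
```

Aligning the delays to the echo arrival times is what focuses the beam; here the delays are simply given.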
The image processor is connected with the beam forming processor and is used for receiving the echo synthesis signals it sends, performing image processing such as image synthesis, spatial compounding and frame correlation on those signals to form an ultrasound image, and sending the ultrasound image to the display screen so that it is displayed there.
In an embodiment, an ultrasound image processing method is provided, which is exemplified by applying the ultrasound image processing method to the image processor or the main controller in fig. 1, and the ultrasound image processing method includes the following steps:
s201: acquiring an ultrasonic image to be processed;
s202: performing standard surface recognition on the ultrasonic image to be processed, and determining the target standard surface feature corresponding to the ultrasonic image to be processed;
s203: performing tissue structure recognition on the ultrasonic image to be processed, and determining the target tissue feature corresponding to the ultrasonic image to be processed;
s204: performing character recognition on the ultrasonic image to be processed, and determining the imaging equipment feature corresponding to the ultrasonic image to be processed;
s205: performing labeling processing on the ultrasonic image to be processed based on the target standard surface feature, the target tissue feature and the imaging equipment feature to obtain a keyword association map corresponding to the ultrasonic image to be processed;
s206: and storing the ultrasound image to be processed and the keyword association map in a system database in an associated manner.
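Steps S201 to S206 above can be sketched as a small pipeline. Everything below is hypothetical: the stub functions stand in for the trained recognition models and the OCR engine described in the embodiments, and a plain dict stands in for the system database.

```python
# Hypothetical pipeline mirroring steps S201-S206; each stage is a stub
# standing in for a trained model or OCR engine described in the text.
def recognize_standard_surface(image):      # S202: target standard surface feature
    return "cardiac four-chamber view"

def recognize_tissue_structure(image):      # S203: target tissue feature
    return "left ventricle"

def recognize_device_text(image):           # S204: imaging equipment features
    return {"mode": "B", "probe": "P5-1"}

def build_keyword_map(surface, tissue, device):   # S205: keyword association map
    return {"standard_surface": surface, "tissue": tissue, **device}

def process_ultrasound_image(image, database):    # S201 acquisition + S206 storage
    keyword_map = build_keyword_map(
        recognize_standard_surface(image),
        recognize_tissue_structure(image),
        recognize_device_text(image),
    )
    database[id(image)] = (image, keyword_map)    # store image with its map
    return keyword_map
```

The point of the structure is that every stored image carries its keyword map, so later queries never need to reopen the image itself.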
The ultrasound image to be processed is an ultrasound image which needs to be labeled.
As an example, in step S201, the image processor may acquire an ultrasound image to be processed, i.e. one that needs to be labeled. This is the ultrasound image obtained after the image processor applies image processing such as image synthesis, spatial compounding and frame correlation to the echo synthesis signals, that is, the ultrasound image finally displayed on the display screen. In this example, the image processor may acquire a single frame to be processed, or acquire an ultrasound video and extract multiple frames to be processed from it.
The target standard surface feature is the standard surface feature recognized from the ultrasound image to be processed.
As an example, in step S202, the image processor may perform standard surface recognition on the ultrasound image to be processed using a pre-trained standard surface recognition model, and determine the target standard surface feature of the image from the model's recognition result. For example, if the ultrasound image to be processed was formed by scanning the heart region of the target object, its target standard surface type is the four-chamber cardiac standard surface, and the target standard surface feature corresponding to that type is the cardiac four-chamber view.
The target tissue feature is the tissue feature recognized from the ultrasound image to be processed.
As an example, in step S203, the image processor may perform tissue structure recognition on the ultrasound image to be processed using a pre-trained tissue structure recognition model, and determine the target tissue feature of the image from the model's recognition result. In this example, the image processor may use the tissue structure recognition model corresponding to the target standard surface feature, which helps improve the efficiency and accuracy of the recognition. For example, when the target standard surface feature is the cardiac four-chamber view, the target tissue feature may be a specific tissue within it, for example the tissue structure of the upper left chamber.
The imaging equipment features are the measurement-related features of the ultrasound device recognized from the ultrasound image to be processed, including but not limited to information such as the ultrasound imaging mode, ultrasound probe, ultrasound markers and ultrasound measurements.
As an example, in step S204, the image processor may perform character recognition on the ultrasound image to be processed using, but not limited to, OCR or another image character recognition technology, recognizing a text to be processed from the image; it then extracts keywords from this text and determines the imaging equipment features of the image. The text to be processed is the text content recognized from the ultrasound image. In this example, the text characters in an ultrasound image generally follow a structural pattern: what is recorded is typically predetermined structured information, such as the ultrasound imaging mode, ultrasound probe, ultrasound markers and ultrasound measurements, and each structured item sits in a fixed position. Therefore, OCR or another image recognition technology can be applied to just the text character regions of the image, which helps improve the efficiency of text recognition.
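As a sketch of the keyword extraction step, assuming the OCR engine has already returned the text to be processed, structured fields such as the imaging mode and probe could be pulled out with regular expressions. The field labels ("Mode:", "Probe:", "Depth:") are invented for illustration; real on-screen annotation layouts vary by device.

```python
# Hypothetical parse of text recognized (e.g. by OCR) from the on-screen
# annotation area of an ultrasound image; real field layouts vary by vendor.
import re

def extract_device_features(ocr_text: str) -> dict:
    """Pull imaging-mode, probe and measurement keywords from OCR output."""
    features = {}
    mode = re.search(r"\bMode:\s*(\w+)", ocr_text)
    probe = re.search(r"\bProbe:\s*([\w-]+)", ocr_text)
    depth = re.search(r"\bDepth:\s*([\d.]+)\s*cm", ocr_text)
    if mode:
        features["imaging_mode"] = mode.group(1)
    if probe:
        features["probe"] = probe.group(1)
    if depth:
        features["depth_cm"] = float(depth.group(1))
    return features

features = extract_device_features("Mode: B  Probe: C5-2  Depth: 12.0 cm")
```

Because each structured item sits in a fixed position, a production system could restrict OCR to those regions first and only then run this kind of field parsing.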
The keyword association map is a knowledge map formed by collecting the keywords extracted or recognized from the ultrasound image to be processed; specifically, the keywords corresponding to features including, but not limited to, the target standard surface feature, target tissue feature and imaging equipment feature are collected to form the keyword association map of the image.
As an example, in step S205, the image processor may integrate the features of the ultrasound image to be processed, such as the target standard surface feature, target tissue feature and imaging equipment feature, using a pre-configured feature integration rule, thereby obtaining the keyword association map of the image. In this example, the image processor may splice the target standard surface feature, target tissue feature, imaging equipment feature and other features together according to a preset feature splicing rule to form a keyword association map in character-string form. Alternatively, the image processor may record those features in a knowledge map data table to form a keyword association map expressed as a data table.
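The two representations described above, a spliced character string and a data-table record, could look like the following. The separator and field names are assumptions, not the patent's preset splicing rule.

```python
# Two hypothetical encodings of the keyword association map described above:
# a delimiter-joined string and a flat record for a knowledge-map data table.
def splice_keywords(surface: str, tissue: str, device: dict, sep: str = ";") -> str:
    """Join all keywords into one string (the 'character-string form')."""
    return sep.join([surface, tissue, *device.values()])

def tabulate_keywords(surface: str, tissue: str, device: dict) -> dict:
    """Lay the same keywords out as one data-table row (the 'data table form')."""
    return {"standard_surface": surface, "tissue": tissue, **device}

text = splice_keywords("four-chamber heart", "left ventricle", {"imaging_mode": "B"})
row = tabulate_keywords("four-chamber heart", "left ventricle", {"imaging_mode": "B"})
```

The string form is compact to store; the table form keeps each keyword addressable for queries.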
As an example, in step S206, after obtaining the keyword association map of each ultrasound image to be processed, the image processor stores the image and its map in the system database in an associated manner. Subsequent statistical analysis over all the images can then be based on the maps, which improves statistical analysis efficiency, and multi-dimensional data queries on the images become faster as well. In this example, the system database may be a local database or a cloud database.
In this example, the keyword association map of each ultrasound image to be processed includes the target standard surface feature, target tissue feature and imaging equipment feature, where the imaging equipment feature covers information such as the ultrasound imaging mode, ultrasound probe, ultrasound markers and ultrasound measurements. Storing the image and its keyword association map in the system database in an associated manner amounts to labeling the image with the map, so that statistics, queries and other processing over all the images in the database can be driven by the keyword association maps, improving processing efficiency.
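Below is a minimal sketch of the associated storage and the multi-dimensional query it enables, using SQLite in place of the system database; the table layout and the JSON encoding of the keyword association map are assumptions made for illustration.

```python
# Hypothetical associated storage (S206) in SQLite: an image reference and its
# keyword association map share one row, so later queries filter on keywords
# without ever opening the image files themselves.
import json
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the system database
conn.execute("CREATE TABLE ultrasound_images (image_path TEXT, keyword_map TEXT)")

def store_with_keywords(path: str, keyword_map: dict) -> None:
    conn.execute(
        "INSERT INTO ultrasound_images VALUES (?, ?)",
        (path, json.dumps(keyword_map)),
    )

def query_by_keyword(key: str, value: str) -> list:
    """Multi-dimensional query: image paths whose map contains key=value."""
    rows = conn.execute("SELECT image_path, keyword_map FROM ultrasound_images")
    return [p for p, m in rows if json.loads(m).get(key) == value]

store_with_keywords("exam_001.png", {"standard_surface": "four-chamber heart"})
store_with_keywords("exam_002.png", {"standard_surface": "parasternal long axis"})
matches = query_by_keyword("standard_surface", "four-chamber heart")
```

Counting rows per keyword value would give the case-type statistics the background section says direct storage cannot support.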
In the ultrasound image processing method provided by this embodiment, the ultrasound image to be processed is analyzed to determine its target standard surface feature, target tissue feature, imaging equipment feature and other features, i.e. the different features that can be used to label it. These features are integrated into a keyword association map, which is stored together with the image in a system database in an associated manner. Labeling the image with the keyword association map in this way lets subsequent statistics, queries and other processing use the map, improving processing efficiency.
In an embodiment, as shown in fig. 3, in step S202, performing standard surface recognition on the ultrasound image to be processed, and determining a target standard surface feature corresponding to the ultrasound image to be processed, includes:
s301: performing standard surface recognition on the ultrasonic image to be processed by adopting a standard surface recognition model, obtaining a standard surface similarity for each of at least one preset standard surface type in the model;
s302: determining a target standard surface type from the at least one preset standard surface type according to the standard surface similarities;
s303: determining the target standard surface feature corresponding to the ultrasonic image to be processed according to the target standard surface type.
The standard surface recognition model is a model for standard surface recognition that has been trained in advance using a neural network. In this example, the model can be trained using, but not limited to, neural network models such as ResNet, Faster R-CNN, DenseNet, YOLO, VGG16 and VGG19. A preset standard surface type is a standard surface type fixed during model training. Generally, during training of the standard surface recognition model, training ultrasound images corresponding to each preset standard surface type are used to train the neural network and update its model parameters, so that the trained model can perform standard surface recognition. The standard surface similarity is the similarity between the ultrasound image to be processed and the training ultrasound images of a preset standard surface type.
As an example, in step S301, the image processor may perform standard surface recognition on the ultrasound image to be processed using the pre-trained standard surface recognition model, computing the image similarity between the image and the training ultrasound images of each of the at least one preset standard surface type, and taking each image similarity as the standard surface similarity for that type. For example, if the model covers preset standard surface types A, B, C and D with training ultrasound images Pa, Pb, Pc and Pd, the image to be processed is compared with Pa, Pb, Pc and Pd to determine the corresponding standard surface similarities Sa, Sb, Sc and Sd.
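One way such standard surface similarities could be computed, purely as an illustration, is cosine similarity between a feature vector of the image to be processed and one reference vector per preset standard surface type; the patent does not specify the model's actual similarity measure, so everything below is an assumption.

```python
# Hypothetical similarity scoring for S301: compare the feature vector of the
# image to be processed against one reference vector per preset standard
# surface type (A-D in the example) using cosine similarity.
import math

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def score_standard_surfaces(image_vec, references: dict) -> dict:
    """Return {preset type: standard surface similarity} for every reference."""
    return {name: cosine_similarity(image_vec, ref) for name, ref in references.items()}

refs = {"A": [1.0, 0.0], "B": [0.0, 1.0]}
scores = score_standard_surfaces([1.0, 0.0], refs)
```

In practice the vectors would come from the recognition network's feature layer rather than being hand-written.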
The target standard surface type refers to the standard surface type identified from the ultrasound image to be processed.
As an example, in step S302, the image processor determines the target standard surface type from the at least one preset standard surface type according to the standard surface similarity corresponding to each preset standard surface type. Specifically, the preset standard surface type corresponding to the maximum standard surface similarity may be determined as the target standard surface type; alternatively, a preset standard surface type whose standard surface similarity is greater than a first similarity threshold may be determined as the target standard surface type. The first similarity threshold is a preset threshold for evaluating whether a standard surface similarity reaches the preset standard.
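The two selection rules described above (maximum similarity, or a first similarity threshold) can be sketched in a few lines; the function name and the sample similarity values are hypothetical:

```python
def select_target_type(similarities, threshold=None):
    """Pick the target standard surface type from per-type similarities.

    With no threshold, apply the maximum-similarity rule; with a threshold,
    return every preset type whose similarity exceeds it.
    """
    if threshold is None:
        return max(similarities, key=similarities.get)
    return [t for t, s in similarities.items() if s > threshold]

sims = {"A": 0.42, "B": 0.91, "C": 0.15, "D": 0.67}
best = select_target_type(sims)        # maximum-similarity rule
above = select_target_type(sims, 0.6)  # first-similarity-threshold rule
```

Note that the threshold rule may return several candidate types; an implementation would then need a tie-breaking policy (for example, falling back to the maximum).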
The standard surface knowledge graph is a pre-configured graph used for recording keywords related to standard surface features.
As an example, in step S303, after determining the target standard surface type corresponding to the ultrasound image to be processed, the image processor may query the standard surface knowledge graph according to the target standard surface type and determine the target standard surface feature corresponding to the ultrasound image to be processed. For example, when the target standard surface type of the ultrasound image to be processed is identified as a four-chamber standard surface, the standard surface knowledge graph is queried according to the four-chamber standard surface; the standard surface keyword that best matches the target standard surface type of the ultrasound image to be processed is selected from the standard surface keywords pre-recorded in the knowledge graph for the four-chamber standard surface, such as the four-chamber heart, the subxiphoid four-chamber heart, and the parasternal four-chamber heart, and is determined as the target standard surface feature corresponding to the ultrasound image to be processed, for example, the cardiac four-chamber view.
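The knowledge graph query in step S303 reduces to a lookup from the target standard surface type to its pre-recorded keywords, optionally ranked by a matching function. A minimal sketch, with the graph modeled as a plain dictionary (the structure and keyword strings here are illustrative assumptions, not the patent's actual data model):

```python
# Stand-in for the standard surface knowledge graph: each target standard
# surface type maps to its pre-recorded standard surface keywords.
STANDARD_SURFACE_GRAPH = {
    "four-chamber standard surface": [
        "four-chamber heart",
        "subxiphoid four-chamber heart",
        "parasternal four-chamber heart",
    ],
}

def query_standard_surface_feature(target_type, match=None):
    """Return the best-matching keyword, or the first recorded one by default."""
    keywords = STANDARD_SURFACE_GRAPH.get(target_type, [])
    if not keywords:
        return None
    if match is not None:
        # Rank keywords by a caller-supplied matching score, highest first.
        return max(keywords, key=match)
    return keywords[0]

feature = query_standard_surface_feature("four-chamber standard surface")
```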
In this embodiment, the standard surface recognition model trained from a neural network model is used to identify the ultrasound image to be processed and determine the target standard surface type, which makes the identification of the target standard surface type intelligent, requires no manual identification and judgment by a doctor, and improves processing efficiency and accuracy. The standard surface knowledge graph is then queried based on the target standard surface type to determine the target standard surface feature, which ensures the efficiency and standardization of determining the target standard surface feature, requires no standard surface identification and judgment by doctors, and avoids the inconsistent labeling that would result from different doctors defining target standard surface features on their own.
In an embodiment, as shown in fig. 4, the step S203 of performing tissue structure identification on the ultrasound image to be processed and determining a target tissue feature corresponding to the ultrasound image to be processed includes:
s401: adopting an organization structure identification model, identifying an organization structure of the ultrasonic image to be processed, and acquiring the organization similarity corresponding to at least one preset organization structure in the organization structure identification model;
s402: determining a target tissue structure from at least one preset tissue structure according to the tissue similarity corresponding to the at least one preset tissue structure;
s403: and determining the target tissue characteristics corresponding to the ultrasonic image to be processed according to the target tissue structure.
The tissue structure recognition model is a model trained in advance with a neural network and used for tissue structure recognition. In this example, the tissue structure recognition model may be trained using, but not limited to, neural network models such as U-Net and V-Net. The preset tissue structure is a tissue structure determined during model training. Generally, during training of the tissue structure recognition model, training ultrasound images corresponding to the preset tissue structures are used to train the neural network model and update its model parameters, so that the trained tissue structure recognition model can perform tissue structure recognition. The tissue similarity is the similarity between the ultrasound image to be processed and the training ultrasound image corresponding to a preset tissue structure.
As an example, in step S401, the image processor may perform tissue structure recognition on the ultrasound image to be processed by using the pre-trained tissue structure recognition model, identify the image similarity between the ultrasound image to be processed and the training ultrasound image corresponding to each of the at least one preset tissue structure in the model, and determine each image similarity as the tissue similarity corresponding to that preset tissue structure.
In this example, in order to improve the accuracy and efficiency of tissue structure identification, the tissue structure recognition model corresponding to the target standard surface feature may be used to identify the tissue structure of the ultrasound image to be processed, and the image similarity between the ultrasound image to be processed and the training ultrasound image corresponding to each preset tissue structure of the target standard surface feature is identified. For example, when the target standard surface feature is the cardiac four-chamber view, its preset tissue structures include the left atrium, right atrium, left ventricle, and right ventricle; the ultrasound image to be processed may be compared by the model against the training ultrasound images corresponding to these four preset tissue structures to determine the corresponding tissue similarities S1, S2, S3, and S4.
As an example, in step S402, the image processor determines the target tissue structure from the at least one preset tissue structure according to the tissue similarity corresponding to each preset tissue structure. Specifically, the preset tissue structure corresponding to the maximum tissue similarity may be determined as the target tissue structure, or a preset tissue structure whose tissue similarity is greater than a second similarity threshold may be determined as the target tissue structure. The second similarity threshold is a preset threshold for evaluating whether a tissue similarity reaches the preset standard.
The tissue structure knowledge graph is a pre-configured graph for recording keywords related to tissue structures.
As an example, in step S403, after determining the target tissue structure corresponding to the ultrasound image to be processed, the image processor may query the tissue structure knowledge graph according to the target tissue structure and determine the target tissue feature corresponding to the ultrasound image to be processed. For example, when the target tissue structure is identified as the left atrium, the tissue structure knowledge graph may be queried according to the left atrium, a tissue keyword matching the ultrasound image to be processed is selected from the tissue keywords corresponding to the left atrium, and that tissue keyword is determined as the target tissue feature.
Further, after the tissue structure recognition model determines the target tissue structure of the ultrasound image to be processed, shape contour recognition may also be performed on the target tissue structure to identify tissue contour shapes such as circles, triangles, ellipses, and straight lines, so that the tissue contour shape can subsequently be used as a keyword in the keyword association map, improving the efficiency of statistical analysis, querying, and other processing of the ultrasound image to be processed.
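One simple way to assign a coarse contour-shape label of the kind described above is a circularity measure, 4πA/P², which is 1.0 for a perfect circle and lower for elongated or angular contours. The sketch below is only a geometric illustration under that assumption (the thresholds and the function names are invented for this example; a real implementation would likely use a contour-analysis library):

```python
import math

def polygon_area(pts):
    """Area of a closed polygon via the shoelace formula."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def perimeter(pts):
    n = len(pts)
    return sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

def classify_contour(pts):
    """Rough shape label from circularity = 4*pi*A / P^2."""
    a, p = polygon_area(pts), perimeter(pts)
    if p == 0 or a == 0:
        return "straight line"     # degenerate contour
    c = 4 * math.pi * a / (p * p)
    if c > 0.85:
        return "circle"
    if c > 0.7:
        return "ellipse"           # crude threshold for this sketch
    return "triangle"

# A regular 64-gon approximates a circle; a 3-4-5 triangle is far from one.
circle_pts = [(math.cos(t * 2 * math.pi / 64), math.sin(t * 2 * math.pi / 64))
              for t in range(64)]
triangle_pts = [(0, 0), (4, 0), (0, 3)]
```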
In this embodiment, the tissue structure recognition model trained from a neural network model is used to identify the ultrasound image to be processed and determine the target tissue structure, which makes the identification of the target tissue structure intelligent, requires no manual identification and judgment by a doctor, and improves processing efficiency and accuracy. The tissue structure knowledge graph is then queried based on the target tissue structure to determine the target tissue feature, which ensures the efficiency and standardization of determining the target tissue feature, requires no tissue structure identification and judgment by doctors, and avoids the inconsistent labeling that would result from different doctors defining target tissue features on their own.
In an embodiment, as shown in fig. 5, in step S204, performing text recognition on the ultrasound image to be processed, and determining an imaging device feature corresponding to the ultrasound image to be processed includes:
s501: performing character recognition on the ultrasonic image to be processed, and determining a text to be processed corresponding to the ultrasonic image to be processed;
s502: and inquiring the equipment feature knowledge graph based on the text to be processed, and determining the imaging equipment feature corresponding to the ultrasonic image to be processed.
As an example, in step S501, the image processor may perform character recognition on the ultrasound image to be processed by using, but not limited to, OCR or other image character recognition technologies, and recognize the text to be processed from the ultrasound image to be processed. Specifically: first, a text region is quickly located in the ultrasound image to be processed by using, but not limited to, CRAFT or other text region detection technologies; then a target screenshot of the text region is cropped from the ultrasound image to be processed; finally, character recognition is performed on the target screenshot by using, but not limited to, OCR or other image character recognition technologies, and the text to be processed corresponding to the ultrasound image to be processed is determined. Understandably, locating and cropping the target screenshot of the text region before performing character recognition with OCR or other image character recognition technologies helps ensure the recognition efficiency of the text to be processed.
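The locate-then-crop step can be illustrated with a toy stand-in for the detection stage: burned-in annotation text in ultrasound images is typically near-white on a dark background, so a simple intensity threshold can find its bounding box. This is only a sketch of the cropping logic; a real pipeline would use CRAFT (or another detector) plus an OCR engine, neither of which is shown here:

```python
import numpy as np

def locate_and_crop_text_region(image: np.ndarray, intensity_threshold=200):
    """Bounding-box crop of bright (text-like) pixels, or None if none found.

    A simplified stand-in for the CRAFT text-region detection step.
    """
    ys, xs = np.nonzero(image >= intensity_threshold)
    if ys.size == 0:
        return None
    top, bottom = ys.min(), ys.max() + 1
    left, right = xs.min(), xs.max() + 1
    return image[top:bottom, left:right]

# Synthetic 100x100 "ultrasound image" with a bright text band at rows 5-9.
img = np.full((100, 100), 40, dtype=np.uint8)
img[5:10, 20:80] = 255
crop = locate_and_crop_text_region(img)   # this crop would then go to OCR
```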
The device feature knowledge graph is a pre-configured graph for recording keywords related to imaging device features. Because the text recorded in an ultrasound image is usually predetermined structured information, such as the ultrasound imaging mode, ultrasound probe, ultrasound markers, and ultrasound measurements, all imaging device features corresponding to the training ultrasound images can be collected and aggregated during model training to form the device feature knowledge graph.
As an example, in step S502, after determining the text to be processed corresponding to the ultrasound image to be processed, the image processor may perform keyword recognition on that text and determine the keywords to be processed corresponding to the ultrasound image to be processed; the keywords to be processed are then matched against the device feature knowledge graph, and the successfully matched keywords are determined as the imaging device features corresponding to the ultrasound image to be processed. That is, the keywords to be processed related to information such as the ultrasound imaging mode, ultrasound probe, ultrasound markers, and ultrasound measurements are determined as the imaging device features.
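Filtering recognized keywords through the device feature knowledge graph can be sketched as simple set membership; the graph contents and keyword strings below are invented for illustration:

```python
# Hypothetical structured vocabulary aggregated from training images.
DEVICE_FEATURE_GRAPH = {
    "B-mode", "CFM", "PW", "linear probe", "convex probe", "depth 6 cm",
}

def extract_device_features(keywords):
    """Keep only the keywords to be processed that the graph records,
    discarding unmatched text (e.g. patient identifiers)."""
    return [k for k in keywords if k in DEVICE_FEATURE_GRAPH]

recognized_text_keywords = ["B-mode", "linear probe", "patient name"]
features = extract_device_features(recognized_text_keywords)
```

A production system would likely use fuzzy matching rather than exact membership, since OCR output is noisy.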
In this embodiment, the device feature knowledge graph constructed in advance based on the structural information such as the ultrasonic imaging mode, the ultrasonic probe, the ultrasonic marker, the ultrasonic measurement and the like is queried for the text to be processed identified by the ultrasonic image to be processed, so that the imaging device features of the ultrasonic image to be processed can be determined quickly and accurately, and the efficiency and standardization of obtaining the imaging device features are facilitated.
In an embodiment, as shown in fig. 6, in step S205, performing tagging processing on the ultrasound image to be processed based on the target standard surface feature, the target tissue feature and the imaging device feature, and acquiring a keyword association map corresponding to the ultrasound image to be processed, includes:
s601: determining the target standard surface characteristics, the target organization characteristics and the imaging equipment characteristics as keywords to be labeled;
s602: matching the keywords to be labeled with all standard keywords in the target knowledge graph to obtain the keyword matching degree between each keyword to be labeled and each standard keyword;
s603: if the matching degree of the keywords is greater than the target matching degree threshold value, performing labeling processing on the ultrasonic image to be processed by adopting standard keywords to form a keyword association map corresponding to the ultrasonic image to be processed;
s604: and if the matching degree of the keywords is not greater than the target matching degree threshold value, performing labeling processing on the ultrasonic image to be processed by adopting the keywords to be labeled to form a keyword association map corresponding to the ultrasonic image to be processed, and updating the target knowledge map based on the keywords to be labeled.
The keywords to be labeled are keywords which need to be subjected to labeling processing on the ultrasound image to be processed.
As an example, in step S601, the image processor may determine a target standard surface feature, a target tissue feature, and an imaging device feature extracted or identified from the ultrasound image to be processed as a keyword to be labeled, so as to perform labeling processing on the ultrasound image to be processed by using the target standard surface feature, the target tissue feature, and the imaging device feature, so as to perform summary analysis and fast query on the ultrasound image to be processed subsequently.
Wherein the target knowledge-graph is a knowledge-graph formed by all standard keywords related to the ultrasonic image. The standard keywords refer to keywords recorded on the target knowledge graph before the current time of the system and used for recording related information of the ultrasonic image. The keyword matching degree is used for reflecting the matching degree or similarity between the keywords to be labeled and the standard keywords.
As an example, in step S602, the image processor may match each keyword to be labeled of the ultrasound image to be processed against all standard keywords in the target knowledge graph, so as to determine the keyword matching degree between each keyword to be labeled and each standard keyword. For example, if the ultrasound image to be processed has N keywords to be labeled and the target knowledge graph records M standard keywords, a keyword matching algorithm or a similarity matching algorithm may be used to match the i-th keyword to be labeled Ti against the j-th standard keyword Tj in the target knowledge graph to determine the keyword matching degree Sij, where 1 ≤ i ≤ N and 1 ≤ j ≤ M, so that each keyword to be labeled Ti corresponds to M keyword matching degrees Sij.
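The N×M matching-degree computation can be sketched with the standard library's `difflib.SequenceMatcher` standing in for the unspecified keyword/similarity matching algorithm (the patent does not name one, so this choice is an assumption):

```python
from difflib import SequenceMatcher

def keyword_matching_degrees(to_label, standard):
    """Matrix S where S[i][j] is the matching degree between the i-th
    keyword to be labeled Ti and the j-th standard keyword Tj (0..1)."""
    return [[SequenceMatcher(None, ti, tj).ratio() for tj in standard]
            for ti in to_label]

to_label = ["cardiac four-chamber", "B-mode"]          # N = 2
standard = ["cardiac four-chamber view", "M-mode", "B-mode"]  # M = 3
S = keyword_matching_degrees(to_label, standard)
```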
The target matching degree threshold is a pre-configured threshold for evaluating whether a keyword matching degree is high enough for the keyword to be labeled and the standard keyword to be regarded as matching.
As an example, in step S603, when a keyword matching degree is greater than the target matching degree threshold, the image processor performs labeling processing on the ultrasound image to be processed with the corresponding standard keyword to form the keyword association map corresponding to the ultrasound image to be processed. In this way, the ultrasound image to be processed acquired at the current system time is labeled with the standard keyword, which ensures the standardization of its labeling, avoids labeling the ultrasound image to be processed and the ultrasound images acquired before the current system time with two keywords that differ only slightly, keeps all keyword association maps standardized, facilitates subsequent query and statistics, and improves processing efficiency.
As an example, in step S604, when the keyword matching degree is not greater than the target matching degree threshold, the image processor performs labeling processing on the ultrasound image to be processed with the keyword to be labeled to form the keyword association map corresponding to the ultrasound image to be processed, so that labeling can still be performed when the target knowledge graph records no standard keyword sufficiently close to the keyword to be labeled, making subsequent statistical analysis possible. In this example, after the ultrasound image to be processed is labeled with the keyword to be labeled, the target knowledge graph also needs to be updated with that keyword, so that it can serve as a standard keyword for ultrasound images processed after the current system time, ensuring the consistency of labeling across all ultrasound images to be processed.
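Steps S601 through S604 together form a simple decision loop, which can be sketched as follows; `difflib` again stands in for the unspecified matching algorithm, and the threshold, graph contents, and function names are assumptions made for this sketch:

```python
from difflib import SequenceMatcher

def match_ratio(a, b):
    """Keyword matching degree via difflib (a stand-in similarity measure)."""
    return SequenceMatcher(None, a, b).ratio()

def label_ultrasound_image(keywords_to_label, knowledge_graph, threshold=0.8):
    """Sketch of S601-S604: reuse the best-matching standard keyword when
    its matching degree exceeds the threshold; otherwise label with the
    keyword to be labeled and add it to the target knowledge graph."""
    association_map = []
    for kw in keywords_to_label:
        best = max(knowledge_graph,
                   key=lambda std: match_ratio(kw, std), default=None)
        if best is not None and match_ratio(kw, best) > threshold:
            association_map.append(best)   # S603: reuse the standard keyword
        else:
            association_map.append(kw)     # S604: label with the new keyword
            knowledge_graph.add(kw)        #        and update the target graph
    return association_map

graph = {"cardiac four-chamber view", "left atrium"}
labels = label_ultrasound_image(
    ["cardiac four-chamber vew", "convex probe"], graph)  # note the OCR typo
```

The first keyword (with a typo) is snapped to the existing standard keyword; the second is genuinely new, so it is added to the graph for future images.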
In this embodiment, the keywords to be labeled are determined according to the target standard surface features, the target organization features and the imaging device features, then the keyword matching degree between the keywords to be labeled and the standard keywords is calculated, and according to the comparison result between the keyword matching degree and the target matching degree threshold, the standard keywords and the keywords to be labeled are respectively adopted to perform labeling processing on the ultrasound images to be processed, so that the consistency of labeling processing on all the ultrasound images to be processed is ensured.
In an embodiment, as shown in fig. 7, after step S206, that is, after performing tagging processing on the ultrasound image to be processed based on the target standard surface feature, the target tissue feature and the imaging device feature, and acquiring the keyword association map corresponding to the ultrasound image to be processed, the ultrasound image processing method further includes:
s701: acquiring an ultrasonic analysis report corresponding to an ultrasonic image to be processed and file attribute information corresponding to the ultrasonic analysis report;
s702: extracting keywords from the ultrasonic analysis report and the file attribute information to obtain report keywords and attribute keywords;
s703: and updating the keyword association map corresponding to the ultrasonic image to be processed according to the report keywords and the attribute keywords.
The ultrasound analysis report is a report file formed by a doctor's analysis of the ultrasound image to be processed. The file attribute information is information related to the storage of the ultrasound analysis report, including but not limited to the file name, folder, and file suffix.
As an example, in step S701, the image processor may further obtain an ultrasound analysis report formed by analyzing the ultrasound image to be processed by the doctor, and store the ultrasound analysis report in the system database, so as to obtain file attribute information corresponding to the ultrasound analysis report according to a storage address and a storage manner of the ultrasound analysis report in the system database.
Further, the image processor of the ultrasound device may also perform the following steps: (1) query the system database according to the target standard surface feature and the target tissue feature corresponding to the ultrasound image to be processed to obtain the standard ultrasound image corresponding to those features. The standard ultrasound image is a pre-stored ultrasound image related to the target standard surface feature and the target tissue feature that reflects the normal state of the examined tissue. (2) Perform difference analysis on the ultrasound image to be processed and the standard ultrasound image, and determine the difference region between them. For example, the image processor may match the ultrasound image to be processed against the standard ultrasound image with an image matching analysis algorithm and determine the region with a lower matching degree as the difference region. (3) When the ultrasound image to be processed is displayed on the display screen, highlight its difference region, for example with a red frame, so that the doctor can form the ultrasound analysis report based on the difference region, improving the efficiency of the doctor's analysis of the ultrasound image to be processed.
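The difference analysis of step (2) can be illustrated with a pixel-wise comparison that returns the bounding box of the mismatching region; real systems would use a more robust matching analysis than a raw absolute difference, so treat this as a minimal sketch under that assumption:

```python
import numpy as np

def difference_region(image, standard, diff_threshold=30):
    """Bounding box (top, bottom, left, right) of pixels whose absolute
    difference from the standard ultrasound image exceeds the threshold;
    None when the images agree everywhere within the threshold."""
    diff = np.abs(image.astype(np.int16) - standard.astype(np.int16))
    ys, xs = np.nonzero(diff > diff_threshold)
    if ys.size == 0:
        return None
    return int(ys.min()), int(ys.max()) + 1, int(xs.min()), int(xs.max()) + 1

standard = np.full((64, 64), 100, dtype=np.uint8)
image = standard.copy()
image[10:20, 30:40] = 200          # simulated abnormal area
box = difference_region(image, standard)  # region to highlight with a red frame
```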
As an example, in step S702, the image processor may perform keyword extraction on the ultrasound analysis report by using a keyword extraction algorithm, and obtain a report keyword corresponding to the ultrasound analysis report; and a keyword extraction algorithm is adopted to extract keywords from the file attribute information, and attribute keywords corresponding to the file attribute information are obtained, so that the ultrasonic image to be processed is subjected to labeling processing based on the report keywords and the attribute keywords, and the method is favorable for ensuring that the subsequent ultrasonic image to be processed is subjected to statistical analysis, query analysis and other processing, and improving the processing efficiency.
As an example, in step S703, the image processor may update the keyword association map corresponding to the ultrasound image to be processed according to the report keywords and attribute keywords; specifically, the ultrasound image to be processed is labeled with the report keywords and attribute keywords to update its keyword association map, so that the updated map contains not only the target standard surface feature, target tissue feature, and imaging device features, but also the report keywords and attribute keywords. The imaging device features include information such as the ultrasound imaging mode, ultrasound probe, ultrasound markers, and ultrasound measurements; the attribute keywords include information such as the file name, folder, and file suffix; the target tissue feature also corresponds to a tissue contour shape, which can likewise serve as a keyword in the keyword association map, helping improve the efficiency of statistical analysis, querying, and other processing of the ultrasound image to be processed.
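Merging report keywords and attribute keywords into an existing keyword association map can be sketched with the map modeled as a dictionary of keyword sets; the category names and sample keywords below are invented for illustration:

```python
def update_association_map(association_map, report_keywords, attribute_keywords):
    """Merge report and attribute keywords into the existing keyword
    association map (modeled here as a dict of keyword sets)."""
    association_map.setdefault("report", set()).update(report_keywords)
    association_map.setdefault("attribute", set()).update(attribute_keywords)
    return association_map

amap = {
    "standard_surface": {"cardiac four-chamber view"},
    "tissue": {"left atrium"},
    "device": {"B-mode", "linear probe"},
}
amap = update_association_map(
    amap, {"mild regurgitation"}, {"exam_report.pdf", "cardiology"})
```

After the update, the map carries the original feature keywords alongside the new report and attribute keywords, which is what makes later statistics and queries over all labels possible.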
In this embodiment, the keywords of the ultrasound analysis report and the file attribute information corresponding to the ultrasound image to be processed are respectively extracted to determine report keywords and attribute keywords, and the report keywords and the attribute keywords are utilized to update the keyword associated map, so as to improve the tag diversity of the keyword associated map, and to help improve the processing efficiency of analyzing statistics, querying and other processing of the ultrasound image to be processed.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
In an embodiment, an ultrasound image processing apparatus is provided, and the ultrasound image processing apparatus corresponds to the ultrasound image processing methods in the above embodiments one to one. As shown in fig. 8, the ultrasound image processing apparatus includes an ultrasound image acquisition module 801, a standard plane identification module 802, an organization structure identification module 803, a character identification module 804, a labeling processing module 805, and an associated storage module 806. The functional modules are explained in detail as follows:
an ultrasound image acquisition module 801, configured to acquire an ultrasound image to be processed;
the standard surface recognition module 802 is configured to perform standard surface recognition on the ultrasound image to be processed, and determine a target standard surface feature corresponding to the ultrasound image to be processed;
the tissue structure identification module 803 is configured to perform tissue structure identification on the ultrasound image to be processed, and determine a target tissue characteristic corresponding to the ultrasound image to be processed;
the character recognition module 804 is used for performing character recognition on the ultrasonic image to be processed and determining the imaging equipment characteristics corresponding to the ultrasonic image to be processed;
the labeling processing module 805 is configured to perform labeling processing on the ultrasound image to be processed based on the target standard surface feature, the target tissue feature and the imaging device feature, and acquire a keyword association map corresponding to the ultrasound image to be processed;
and the association storage module 806 is used for storing the association between the ultrasonic image to be processed and the keyword association map in a system database.
Preferably, the standard surface recognition module 802 includes:
the standard surface similarity obtaining unit is used for performing standard surface recognition on the ultrasonic image to be processed by adopting a standard surface recognition model and obtaining standard surface similarity corresponding to at least one preset standard surface type in the standard surface recognition model;
the target standard surface type determining unit is used for determining a target standard surface type from at least one preset standard surface type according to the standard surface similarity corresponding to the at least one preset standard surface type;
and the target standard surface feature determining unit is used for determining the target standard surface feature corresponding to the ultrasonic image to be processed according to the type of the target standard surface.
Preferably, the tissue structure identification module 803 includes:
the tissue similarity obtaining unit is used for identifying the tissue structure of the ultrasonic image to be processed by adopting the tissue structure identification model and obtaining the tissue similarity corresponding to at least one preset tissue structure in the tissue structure identification model;
the target tissue structure determining unit is used for determining a target tissue structure from at least one preset tissue structure according to the tissue similarity corresponding to the at least one preset tissue structure;
and the target tissue characteristic determining unit is used for determining the target tissue characteristics corresponding to the ultrasonic image to be processed according to the target tissue structure.
Preferably, the word recognition module 804 includes:
the to-be-processed text acquisition unit is used for performing character recognition on the to-be-processed ultrasonic image and determining a to-be-processed text corresponding to the to-be-processed ultrasonic image;
and the imaging equipment characteristic acquisition unit is used for inquiring the equipment characteristic knowledge graph based on the text to be processed and determining the imaging equipment characteristic corresponding to the ultrasonic image to be processed.
Preferably, the labeling process module 805 includes:
the system comprises a to-be-labeled keyword determining unit, a labeling unit and a labeling unit, wherein the to-be-labeled keyword determining unit is used for determining the target standard surface characteristics, the target organization characteristics and the imaging equipment characteristics as to-be-labeled keywords;
the keyword matching degree acquisition unit is used for matching the keywords to be labeled with all standard keywords in the target knowledge graph to acquire the keyword matching degree corresponding to the keywords to be labeled and each standard keyword;
the first labeling processing unit is used for performing labeling processing on the ultrasonic image to be processed by adopting a standard keyword when the keyword matching degree is greater than a target matching degree threshold value to form a keyword association map corresponding to the ultrasonic image to be processed;
and the second labeling processing unit is used for performing labeling processing on the ultrasonic image to be processed by adopting the keywords to be labeled when the matching degree of the keywords is not greater than the target matching degree threshold value, forming a keyword association map corresponding to the ultrasonic image to be processed, and updating the target knowledge map based on the keywords to be labeled.
Preferably, the ultrasound image processing apparatus further includes:
the report and attribute acquisition module is used for acquiring an ultrasonic analysis report corresponding to the ultrasonic image to be processed and file attribute information corresponding to the ultrasonic analysis report;
the keyword extraction module is used for extracting keywords from the ultrasonic analysis report and the file attribute information to obtain report keywords and attribute keywords;
and the associated map updating module is used for updating the keyword associated map corresponding to the ultrasonic image to be processed according to the report keywords and the attribute keywords.
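A minimal sketch of the association-map update described by the three modules above: report keywords and file-attribute keywords are merged into the image's existing keyword association map. Modeling the map as a dict of keyword sets is an assumption for illustration; image IDs and keywords are hypothetical.

```python
# Hedged sketch of the associated map updating module. The keyword association
# map is modeled as {image_id: set of keywords}; a real system would update
# graph edges in a database instead.
association_map = {"img_001": {"liver", "convex probe"}}

def update_association_map(image_id, report_keywords, attribute_keywords):
    # Merge report keywords and file-attribute keywords into the image's map.
    entry = association_map.setdefault(image_id, set())
    entry.update(report_keywords, attribute_keywords)
    return entry

print(sorted(update_association_map("img_001", ["fatty liver"], ["2021-08-17"])))
```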
For the specific definition of the ultrasound image processing apparatus, reference may be made to the definition of the ultrasound image processing method above, which is not repeated here. Each module in the ultrasound image processing apparatus may be implemented wholly or partially by software, by hardware, or by a combination of the two. The modules may be embedded in, or independent of, a processor of the ultrasound device in hardware form, or may be stored in a memory of the ultrasound device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, an ultrasound device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor. When the processor executes the computer program, the ultrasound image processing method of the foregoing embodiments is implemented, for example S201-S206 shown in fig. 2 or the steps shown in figs. 3 to 7; details are not repeated here. Alternatively, when executing the computer program, the processor implements the functions of the modules/units in the embodiment of the ultrasound image processing apparatus, such as the functions of the ultrasound image acquisition module 801, the standard surface identification module 802, the tissue structure identification module 803, the text identification module 804, the labeling processing module 805, and the association storage module 806 shown in fig. 8; details are not repeated here.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored. When executed by a processor, the computer program implements the ultrasound image processing method of the foregoing embodiments, for example S201-S206 shown in fig. 2 or the steps shown in figs. 3 to 7; details are not repeated here. Alternatively, when executed by the processor, the computer program implements the functions of the modules/units in the embodiment of the ultrasound image processing apparatus, such as the functions of the ultrasound image acquisition module 801, the standard surface identification module 802, the tissue structure identification module 803, the text identification module 804, the labeling processing module 805, and the association storage module 806 shown in fig. 8; details are not repeated here.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the functional units and modules described above is illustrated. In practical applications, the functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above.
The above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention and are intended to be included within the scope of the present invention.

Claims (14)

1. An ultrasound image processing method, comprising:
acquiring an ultrasonic image to be processed;
performing standard surface identification on the ultrasonic image to be processed, and determining a target standard surface characteristic corresponding to the ultrasonic image to be processed;
identifying the tissue structure of the ultrasonic image to be processed, and determining the target tissue characteristics corresponding to the ultrasonic image to be processed;
performing character recognition on the ultrasonic image to be processed, and determining the imaging equipment characteristics corresponding to the ultrasonic image to be processed;
labeling the ultrasonic image to be processed based on the target standard surface feature, the target tissue feature and the imaging equipment feature to obtain a keyword association map corresponding to the ultrasonic image to be processed;
and storing the ultrasound image to be processed and the keyword association map in a system database in an associated manner.
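The six steps of claim 1 can be sketched end to end as follows. Every recognition step is stubbed with a fixed output, and the in-memory "system database" is an assumption for illustration; the claim does not prescribe any concrete model or storage backend.

```python
# Minimal end-to-end sketch of claim 1. Each stub stands in for the
# corresponding recognition step; outputs are illustrative.
def standard_surface_features(img): return ["four-chamber view"]  # step: standard surface
def tissue_features(img):           return ["left ventricle"]     # step: tissue structure
def device_features(img):           return ["WISONIC-L2"]         # step: OCR + device graph

system_database = {}  # image_id -> (image, keyword association map)

def process(image_id, img):
    # Labeling: combine the three feature groups into a keyword association map.
    keywords = (standard_surface_features(img)
                + tissue_features(img)
                + device_features(img))
    association_map = {"image_id": image_id, "keywords": keywords}
    # Associated storage: image and map are stored together.
    system_database[image_id] = (img, association_map)
    return association_map

print(process("img_001", object()))
```

Stored this way, the association map supports later keyword-based retrieval of the image from the system database.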
2. The method for processing the ultrasonic image according to claim 1, wherein the performing standard surface recognition on the ultrasonic image to be processed and determining the target standard surface feature corresponding to the ultrasonic image to be processed includes:
adopting a standard surface recognition model to perform standard surface recognition on the ultrasonic image to be processed, and acquiring standard surface similarity corresponding to at least one preset standard surface type in the standard surface recognition model;
determining a target standard surface type from at least one preset standard surface type according to the standard surface similarity corresponding to at least one preset standard surface type;
and determining the target standard surface characteristics corresponding to the ultrasonic image to be processed according to the target standard surface type.
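The selection logic of claim 2 amounts to taking the preset standard surface type with the highest model similarity. The sketch below assumes a classifier that returns one similarity per preset type; the type names and scores are invented for illustration.

```python
# Sketch of standard surface recognition: the recognition model yields a
# similarity for each preset standard surface type, and the target type is
# the one with the highest similarity. Scores below stand in for a real model.
PRESET_SURFACE_TYPES = ["four-chamber view", "abdominal transverse", "thyroid long-axis"]

def surface_similarities(img):
    # Placeholder for the standard surface recognition model's output.
    return {"four-chamber view": 0.91,
            "abdominal transverse": 0.06,
            "thyroid long-axis": 0.03}

def target_surface_type(img):
    sims = surface_similarities(img)
    return max(sims, key=sims.get)  # preset type with the highest similarity

print(target_surface_type(object()))
```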
3. The method for processing an ultrasound image according to claim 1, wherein the identifying the tissue structure of the ultrasound image to be processed and determining the target tissue feature corresponding to the ultrasound image to be processed includes:
adopting an organization structure identification model to identify the organization structure of the ultrasonic image to be processed, and acquiring the organization similarity corresponding to at least one preset organization structure in the organization structure identification model;
determining a target tissue structure from at least one preset tissue structure according to the tissue similarity corresponding to at least one preset tissue structure;
and determining the target tissue characteristics corresponding to the ultrasonic image to be processed according to the target tissue structure.
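Tissue structure identification in claim 3 follows the same similarity-based pattern. One plausible reading, sketched below under that assumption, is that an image may contain several structures, so every preset structure whose similarity clears a threshold is retained; the threshold, structure names, and scores are all illustrative.

```python
# Hedged sketch of tissue structure identification: keep every preset tissue
# structure whose model similarity exceeds a threshold. Whether the claim
# selects one structure or several is an interpretation; values are invented.
TISSUE_THRESHOLD = 0.5

def tissue_similarities(img):
    # Placeholder for the tissue structure recognition model's output.
    return {"left ventricle": 0.88, "mitral valve": 0.74, "liver": 0.02}

def target_tissue_features(img):
    sims = tissue_similarities(img)
    return sorted(s for s, v in sims.items() if v > TISSUE_THRESHOLD)

print(target_tissue_features(object()))
```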
4. The method for processing the ultrasonic image according to claim 1, wherein the performing character recognition on the ultrasonic image to be processed and determining the imaging device characteristic corresponding to the ultrasonic image to be processed includes:
performing character recognition on the ultrasonic image to be processed, and determining a text to be processed corresponding to the ultrasonic image to be processed;
and querying an equipment characteristic knowledge graph based on the text to be processed, and determining the imaging equipment characteristic corresponding to the ultrasonic image to be processed.
5. The method for processing an ultrasound image according to claim 1, wherein the labeling the ultrasound image to be processed based on the target standard surface feature, the target tissue feature and the imaging device feature to obtain a keyword association map corresponding to the ultrasound image to be processed comprises:
determining the target standard surface feature, the target tissue feature and the imaging equipment feature as keywords to be labeled;
matching the keywords to be labeled with all standard keywords in a target knowledge graph to obtain a keyword matching degree between the keywords to be labeled and each standard keyword;
if the matching degree of the keywords is greater than a target matching degree threshold value, performing labeling processing on the ultrasonic image to be processed by adopting the standard keywords to form a keyword association map corresponding to the ultrasonic image to be processed;
and if the matching degree of the keywords is not greater than a target matching degree threshold value, performing labeling processing on the ultrasonic image to be processed by adopting the keywords to be labeled to form a keyword association map corresponding to the ultrasonic image to be processed, and updating the target knowledge map based on the keywords to be labeled.
6. The method of processing an ultrasound image according to claim 1, wherein after the labeling processing is performed on the ultrasound image to be processed based on the target standard surface feature, the target tissue feature and the imaging device feature, and a keyword association map corresponding to the ultrasound image to be processed is obtained, the method of processing an ultrasound image further includes:
acquiring an ultrasonic analysis report corresponding to the ultrasonic image to be processed and file attribute information corresponding to the ultrasonic analysis report;
extracting keywords from the ultrasonic analysis report and the file attribute information to obtain report keywords and attribute keywords;
and updating the keyword association map corresponding to the ultrasonic image to be processed according to the report keyword and the attribute keyword.
7. An ultrasound image processing apparatus characterized by comprising:
the ultrasonic image acquisition module is used for acquiring an ultrasonic image to be processed;
the standard surface recognition module is used for performing standard surface recognition on the ultrasonic image to be processed and determining a target standard surface characteristic corresponding to the ultrasonic image to be processed;
the tissue structure identification module is used for identifying the tissue structure of the ultrasonic image to be processed and determining the target tissue characteristics corresponding to the ultrasonic image to be processed;
the character recognition module is used for carrying out character recognition on the ultrasonic image to be processed and determining the imaging equipment characteristics corresponding to the ultrasonic image to be processed;
the labeling processing module is used for performing labeling processing on the ultrasonic image to be processed based on the target standard surface feature, the target tissue feature and the imaging equipment feature to acquire a keyword association map corresponding to the ultrasonic image to be processed;
and the association storage module is used for storing the ultrasound image to be processed and the keyword association map in a system database in an associated manner.
8. The ultrasound image processing apparatus of claim 7, wherein the standard face identification module comprises:
a standard surface similarity obtaining unit, configured to perform standard surface recognition on the ultrasound image to be processed by using a standard surface recognition model, and obtain a standard surface similarity corresponding to at least one preset standard surface type in the standard surface recognition model;
the target standard surface type determining unit is used for determining a target standard surface type from at least one preset standard surface type according to the standard surface similarity corresponding to the at least one preset standard surface type;
and the target standard surface feature determining unit is used for determining the target standard surface feature corresponding to the ultrasonic image to be processed according to the target standard surface type.
9. The ultrasound image processing apparatus of claim 7, wherein the tissue structure identification module comprises:
the tissue similarity obtaining unit is used for identifying the tissue structure of the ultrasonic image to be processed by adopting a tissue structure identification model and obtaining the tissue similarity corresponding to at least one preset tissue structure in the tissue structure identification model;
the target tissue structure determining unit is used for determining a target tissue structure from at least one preset tissue structure according to the tissue similarity corresponding to at least one preset tissue structure;
and the target tissue characteristic determining unit is used for determining the target tissue characteristic corresponding to the ultrasonic image to be processed according to the target tissue structure.
10. The ultrasound image processing apparatus of claim 7, wherein the text recognition module comprises:
a to-be-processed text acquisition unit, configured to perform character recognition on the to-be-processed ultrasound image, and determine a to-be-processed text corresponding to the to-be-processed ultrasound image;
and the imaging equipment characteristic acquisition unit is used for querying an equipment characteristic knowledge graph based on the text to be processed, and determining the imaging equipment characteristic corresponding to the ultrasonic image to be processed.
11. The ultrasound image processing apparatus of claim 7, wherein the tagging processing module comprises:
the to-be-labeled keyword determining unit is used for determining the target standard surface feature, the target tissue feature and the imaging equipment feature as keywords to be labeled;
a keyword matching degree obtaining unit, configured to perform matching processing on the keywords to be labeled and all standard keywords in a target knowledge graph, and obtain a keyword matching degree between the keywords to be labeled and each standard keyword;
the first labeling processing unit is used for performing labeling processing on the ultrasonic image to be processed by adopting the standard keyword when the keyword matching degree is greater than a target matching degree threshold value to form a keyword associated map corresponding to the ultrasonic image to be processed;
and the second labeling processing unit is used for performing labeling processing on the ultrasonic image to be processed by adopting the keywords to be labeled when the keyword matching degree is not greater than a target matching degree threshold value, forming a keyword associated map corresponding to the ultrasonic image to be processed, and updating the target knowledge map based on the keywords to be labeled.
12. The ultrasound image processing apparatus of claim 7, further comprising:
the report and attribute acquisition module is used for acquiring an ultrasonic analysis report corresponding to the ultrasonic image to be processed and file attribute information corresponding to the ultrasonic analysis report;
the keyword extraction module is used for extracting keywords from the ultrasonic analysis report and the file attribute information to obtain report keywords and attribute keywords;
and the associated map updating module is used for updating the keyword associated map corresponding to the ultrasonic image to be processed according to the report keyword and the attribute keyword.
13. An ultrasound apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the ultrasound image processing method according to any one of claims 1 to 6 when executing the computer program.
14. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, implements the ultrasound image processing method according to any one of claims 1 to 6.
CN202110944161.6A 2021-08-17 2021-08-17 Ultrasonic image processing method and device, ultrasonic equipment and storage medium Pending CN113486195A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110944161.6A CN113486195A (en) 2021-08-17 2021-08-17 Ultrasonic image processing method and device, ultrasonic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110944161.6A CN113486195A (en) 2021-08-17 2021-08-17 Ultrasonic image processing method and device, ultrasonic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113486195A true CN113486195A (en) 2021-10-08

Family

ID=77946752

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110944161.6A Pending CN113486195A (en) 2021-08-17 2021-08-17 Ultrasonic image processing method and device, ultrasonic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113486195A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024139719A1 (en) * 2022-12-26 2024-07-04 重庆微海软件开发有限公司 Control method and apparatus for ultrasound therapy apparatus, medium, and ultrasound therapy system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108388580A (en) * 2018-01-24 2018-08-10 平安医疗健康管理股份有限公司 Merge the dynamic knowledge collection of illustrative plates update method of medical knowledge and application case
CN109583440A (en) * 2017-09-28 2019-04-05 北京西格码列顿信息技术有限公司 It is identified in conjunction with image and reports the medical image aided diagnosis method edited and system
CN109918513A (en) * 2019-03-12 2019-06-21 北京百度网讯科技有限公司 Image processing method, device, server and storage medium
CN111816301A (en) * 2020-07-07 2020-10-23 平安科技(深圳)有限公司 Medical inquiry assisting method, device, electronic equipment and medium
CN111933251A (en) * 2020-06-24 2020-11-13 安徽影联云享医疗科技有限公司 Medical image labeling method and system
CN112420151A (en) * 2020-12-07 2021-02-26 医惠科技有限公司 Method, system, equipment and medium for structured analysis after ultrasonic report
CN112580613A (en) * 2021-02-24 2021-03-30 深圳华声医疗技术股份有限公司 Ultrasonic video image processing method, system, equipment and storage medium
CN113052116A (en) * 2021-04-06 2021-06-29 深圳华声医疗技术股份有限公司 Ultrasonic video data processing method and device, ultrasonic equipment and storage medium


Similar Documents

Publication Publication Date Title
CN112070119B (en) Ultrasonic section image quality control method, device and computer equipment
CN113505262B (en) Ultrasonic image searching method and device, ultrasonic equipment and storage medium
Nurmaini et al. Accurate detection of septal defects with fetal ultrasonography images using deep learning-based multiclass instance segmentation
CN112580613B (en) Ultrasonic video image processing method, system, equipment and storage medium
CN111583249B (en) Medical image quality monitoring system and method
CN112672691B (en) Ultrasonic imaging method and equipment
CN112102247B (en) Machine learning-based pathological section quality evaluation method and related equipment
CN111462049A (en) Automatic lesion area form labeling method in mammary gland ultrasonic radiography video
CN110298820A (en) Image analysis methods, computer equipment and storage medium
CN113768546B (en) Ultrasound elastography generation and processing system and method
CN113241138B (en) Medical event information extraction method and device, computer equipment and storage medium
US20240257347A1 (en) Tissue identification and classification based on vibrational signatures
CN112580404A (en) Ultrasonic parameter intelligent control method, storage medium and ultrasonic diagnostic equipment
CN113570594A (en) Method and device for monitoring target tissue in ultrasonic image and storage medium
CN113486195A (en) Ultrasonic image processing method and device, ultrasonic equipment and storage medium
CN115082487B (en) Ultrasonic image section quality evaluation method and device, ultrasonic equipment and storage medium
CN113052116B (en) Ultrasonic video data processing method and device, ultrasonic equipment and storage medium
CN113298773A (en) Heart view identification and left ventricle detection device and system based on deep learning
CN113693625B (en) Ultrasonic imaging method and ultrasonic imaging apparatus
US20210251601A1 (en) Method for ultrasound imaging and related equipment
CN114820483A (en) Image detection method and device and computer equipment
CN114049485A (en) Method and device for intelligently identifying and judging abnormality of heart section in embryonic development period
CN116687445B (en) Automatic positioning and tracking method, device, equipment and storage medium for ultrasonic fetal heart
CN111493931A (en) Ultrasonic imaging method and device and computer readable storage medium
CN115272771A (en) Fetal ultrasonic image processing method, device, program product and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211008