
CN114557772B - Virtual-real fusion navigation system and method in breast surgery appearance remodelling operation - Google Patents


Info

Publication number
CN114557772B
Authority
CN
China
Prior art keywords
breast
image
virtual
difference
navigation
Prior art date
Legal status
Active
Application number
CN202210125090.1A
Other languages
Chinese (zh)
Other versions
CN114557772A
Inventor
李小兵
李健一
方玲玲
刘惠
邱天爽
王欣
Current Assignee
Dalian University of Technology
Liaoning Cancer Hospital and Institute
Original Assignee
Dalian University of Technology
Liaoning Cancer Hospital and Institute
Application filed by Dalian University of Technology and Liaoning Cancer Hospital and Institute
Priority to CN202210125090.1A
Publication of CN114557772A
Application granted
Publication of CN114557772B
Legal status: Active


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2065: Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Image Processing (AREA)

Abstract

The invention belongs to the technical field of breast surgery and provides a virtual-real fusion navigation system and method for breast surgical contour remodelling. The system comprises a 3D camera, a projector, a main control computer and its peripherals; the 3D camera and the projector are mounted on a fixed base plate connected to a universal suspension boom, which is fixed to the ceiling above the operating table. On this system, the invention provides a bilateral breast difference assessment method and an affected-side breast trimming navigation method based on 3D imaging and virtual-real fusion. With the patient supine and one side of the operating table's upper section raised to 0, 15 and 30 degrees in turn, and with reference to pre-placed marker points, the 3D camera captures full-size images (three-dimensional point clouds and related data) of both breasts; dedicated software analyses the differences and similarities between the affected and healthy sides, projects the results onto the patient through the virtual-real fusion technique, and outputs the bilateral breast difference evaluation result and trimming navigation information. The invention provides an accurate and practical navigation system and method for breast surgical contour remodelling.

Description

Virtual-real fusion navigation system and method in breast surgery appearance remodelling operation
Technical Field
The invention belongs to the technical field of breast surgery and relates to a virtual-real fusion navigation system and method for breast surgical contour remodelling.
Background
Currently, breast tumors rank first in incidence among tumors in women. Owing to greater health awareness and improved clinical detection, about 90% of breast tumors are now discovered early, and the 5-year and 10-year survival rates of early-stage breast tumor patients reach 80% and 90%, respectively. Therefore, beyond curing patients by resecting the tumor, breast contour remodelling techniques and procedures are becoming increasingly important to both the medical community and patients.
Navigation systems and techniques are critical to breast surgical contour remodelling. The navigation technique described here mainly covers two aspects: differential evaluation of the affected-side breast against the healthy-side breast, and guidance and navigation of the surgical repair of the affected-side breast.
Current breast measurement and bilateral breast difference (or symmetry) assessment methods are mainly subjective or objective. Subjective methods evaluate breast difference or symmetry by visual inspection by the surgeon or others. Objective methods fall into three classes: mechanical measurement of dimensions and volume; measurement based on magnetic resonance imaging; and measurement based on 2D or 3D imaging of the body surface.
However, among these methods, subjective visual inspection cannot evaluate bilateral breast differences or symmetry accurately because of visual error and inter-surgeon variability. Among the existing objective methods, mechanical scale or volume measurement is contact-based, inconvenient in the operating environment, and ill-suited to accurately measuring and describing a non-rigid curved shape. Magnetic-resonance-based measurement and evaluation requires bulky MR equipment and subsequent image reconstruction, and is unsuitable for intraoperative use during breast reconstruction. Existing 2D or 3D surface-imaging methods mostly image the subject standing, for pre- or postoperative evaluation. 3D symmetry evaluation with a hand-held 3D camera has been reported, but the operation is relatively complex, demands much of the user, and the scan stitching and subsequent processing are slow, so it serves for postoperative evaluation rather than intraoperative navigation. None of these three classes of methods is suitable for intraoperative use as a navigation system or as part of one.
No complete and practical navigation system for breast surgical contour remodelling has yet been reported.
In summary, existing bilateral breast difference (or symmetry) measurement and assessment methods either perform poorly or are unsuitable for full-size measurement and analytical evaluation of the breast in clinical settings, particularly during breast remodelling procedures, and no further navigation techniques or systems have been reported. In view of this, the invention provides a virtual-real fusion navigation system and algorithm for breast surgical contour remodelling that, based on 3D imaging and the co-presentation of virtual and real scenes (termed virtual-real fusion), performs accurate non-contact 3D measurement and difference evaluation of both breasts with the patient supine and at different elevation angles during surgery, thereby realising accurate intraoperative navigation for trimming the affected-side breast.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a virtual-real fusion navigation system and algorithm for breast surgical contour remodelling that solves the problem of accurate non-contact 3D measurement and difference evaluation of the bilateral breasts in clinical surgery, and further realises accurate navigation of the intraoperative surgical trimming of the patient's breast.
In order to achieve the above purpose, the present invention provides the following technical solutions:
the navigation system in the breast surgery appearance remodelling operation comprises a main control computer 1, a 3D camera 2 and a projector 3 which are in signal transmission with the main control computer 1, and a universal suspension arm 4, wherein the external equipment configuration of the main control computer 1 comprises a color liquid crystal display, an input keyboard and a mouse. The 3D camera 2 and the projector 3 are fixed on a fixed bottom plate through bolts after being overlapped up and down, the upper part of the fixed bottom plate is connected with the bottom end of the universal suspension arm 4 through bolts, and the top end of the universal suspension arm 4 is fixed on a ceiling above the operating table 5. The position and orientation of the 3D camera 2 and the projector 3 can be adjusted universally by the universal boom 4. 2 mark points are arranged on the body surface of a patient in advance and used as a navigation system to automatically determine datum points or reference points of the 3D image. The 3D camera 2 is used for shooting a full-size 3D image of the breast of a patient in operation to obtain 3D point cloud data information containing the breast, and the camera 2 is controlled by the main control computer 1 to perform data acquisition and analysis processing. The surgical navigation image and data are projected to the human body again by the projector 3.
Furthermore, the main control computer 1 is a conventional desktop or laptop computer, placed apart from the 3D camera 2 and the projector 3 at a distance of no more than 3 metres and connected to them by data communication cables.
The navigation method for breast surgical contour remodelling is realised on the navigation system above. With the patient supine and one side of the operating table's upper section raised to different angles, and with reference to the placed marker points, the 3D camera captures full-size images of both breasts (three-dimensional point clouds and related data) in real time; the main control computer 1 completes the real-time acquisition of the full-size breast 3D image data and the bilateral breast difference analysis and evaluation, and from the results generates the difference indices and the whole-breast partitioned difference-analysis evaluation map, i.e. the virtual image. The difference indices and the evaluation map are displayed on the colour liquid-crystal display fused with the real 3D breast image, forming one '3D virtual-real fusion'; the whole-breast partitioned difference-analysis evaluation map is also projected by the projector 3 onto the surface of the patient's breast during surgery, forming another '3D virtual-real fusion'. During the operation, the data are updated in real time, the differences and similarities between the affected and healthy sides are analysed, the results are projected onto the patient through the virtual-real fusion technique, and the bilateral breast difference evaluation result and the trimming navigation information are given. The method specifically comprises the following steps:
First step: set the marker points and construct the navigation system
1.1) Two marker points are placed on the patient's body surface and a rectangular 3D coordinate system of the body is established, as shown in figure 5. One of the two marker points (e.g. the Huagai acupoint) is defined as the origin of the coordinate system; the X axis is the transverse direction of the body surface, the Y axis the central axis of the body surface, and the Z axis the direction perpendicular to the body surface, toward the 3D camera 2.
The marker points are a key measure for realising the 3D virtual-real fusion technique: they let the system automatically determine the datum or reference points of the 3D image. The markers are placed, for convenience of operation or by reference to existing medically regulated landmarks, above and below the breasts on the body's central axis (e.g. the Huagai and Zhongwan acupoints), and carry features that are easy to identify automatically, such as a cross shape in a colour contrasting strongly with the patient's skin. One marker is placed at the Huagai acupoint (on the chest midline at the level of the first intercostal space, roughly 12 cm below the midpoint between the two clavicles) and one at the Zhongwan acupoint (roughly 13 cm above the navel). In the 3D rectangular coordinate system of figure 5, the line through the two markers forms the Y axis and is called the central axis of the body surface; the YOZ plane through this axis is called the central axial section of the body. The two markers give the system the symmetry axis (the central axis) and symmetry plane (the central axial section) of the actual body, a measurement scale, and the feature points used for tracking, matching and superimposing the virtual breast on the physical breast (a coordinate-frame sketch follows step 1.3 below).
1.2) The boom 4 is pulled so that the 3D camera 2 and projector 3 attached to it lie on the Z axis of the rectangular 3D coordinate system, aimed at the patient's chest, with the front of the 3D camera 2 lens kept 900 mm to 1000 mm from the chest surface.
1.3) The main control computer 1 is signal-connected to the 3D camera 2 and the projector 3; the full-size 3D breast image data (point-cloud data) obtained by the 3D camera 2 are read into the main control computer 1 over gigabit Ethernet through a GigE interface and stored for subsequent analysis and processing. The specific flow is shown in figure 8.
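As an illustration of step 1.1), the following is a minimal sketch, assuming the two marker positions have already been extracted from the camera's point cloud; the function name, the marker coordinates and the assumed camera viewing direction are illustrative, not part of the patent.

```python
# Minimal sketch of step 1.1): building the body coordinate frame from the two
# marker points. Marker positions are assumed to be given in the camera frame.
import numpy as np

def body_frame(huagai: np.ndarray, zhongwan: np.ndarray):
    """Return (origin, R) where the rows of R are the X, Y, Z body axes.

    Y runs along the central axis (Zhongwan -> Huagai); Z points from the body
    surface toward the 3D camera; X completes a right-handed frame.
    """
    origin = huagai                        # one marker is defined as the origin
    y = huagai - zhongwan
    y = y / np.linalg.norm(y)              # central axis of the body surface
    z_cam = np.array([0.0, 0.0, 1.0])      # assumed camera viewing direction
    x = np.cross(y, z_cam)
    x = x / np.linalg.norm(x)              # transverse direction of the body
    z = np.cross(x, y)                     # perpendicular to the body surface
    return origin, np.vstack([x, y, z])

# Express camera-frame points in the body frame.
origin, R = body_frame(np.array([0.10, 0.40, 0.90]), np.array([0.12, 0.15, 0.92]))
pts_cam = np.random.rand(5, 3)             # stand-in for 3D camera point cloud
pts_body = (pts_cam - origin) @ R.T
```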
Second step: acquire 3D full-size breast images of the patient in different poses
The patient lies supine on the operating table with the upper body on the operating-table support plate 6. 3D full-size breast images are acquired in different poses: the 3D camera 2 is started and breast image data are captured with the operating table 5 and patient at 30°, 15° and 0°; the 0° state is the operating state, while the data of the other two angles provide surgical guidance data for measuring breast symmetry in different postoperative postures. Each full-size breast 3D image captured by the 3D camera 2 serves as the 'real image' and comprises 3D point-cloud data or other display forms such as a 3D curved surface.
Third step: complete the 3D full-size breast image preprocessing, segmentation, registration and breast data fusion through the main control computer 1
the main control computer 1 processes the breast data of each angle obtained in the second step, tracks the mark points in the image by utilizing the characteristics (color, shape and the like) of the mark points, establishes a virtual space coordinate system, and establishes information such as double-breast difference three-dimensional data and the like required by operation navigation by using the coordinates: and obtaining data focused by surgical navigation such as a 3D point cloud image of the affected side breast, a surface image of the 3D point cloud image, sagging information of the affected side breast image, a height difference between symmetrical points of the affected side breast and the healthy side breast, an evaluation index and the like through calculation methods such as fusion of multi-angle breast data and matching of 3D images. The following process is required.
3.1) Image preprocessing: the full-size breast 3D image obtained in the second step is preprocessed to remove abnormal discontinuous points from the data and to ensure the smoothness of the breast-surface 3D curved surface;
3.2) Image region segmentation: the preprocessed image is processed with a deep-learning method based on a U-shaped network, realising region segmentation of the bilateral breast 3D image;
3.3) Different-pose registration: to realise the expected symmetry of the healthy and affected breasts in any posture, i.e. to compensate the bilateral asymmetry caused by breast sagging when upright, while also respecting the constraints of the operating conditions, 3D data are acquired in three body positions with the supine upper body at 0°, 15° and 30° to the horizontal. The 3D data of each pose are recorded: when lying flat the left breast is denoted $L_0$ and the right breast $R_0$, and for the $k$-th of the remaining two poses they are denoted $L_k$ and $R_k$. The 3D image of each pose is registered to the 0° image so that the nipples, key focus points and deformation points stay aligned (to ensure a pleasing postoperative shape, the surgeon usually marks several key points on both breasts and keeps them relatively symmetric). Through the registration deformation field, every pose is registered to the 3D coordinates of the breast with the patient lying flat (i.e. 0°), and a deviation field $\Delta_k(x,y,z)$ is obtained from the deformation between the $k$-th pose and the lying pose.
3.4) Fusion of the breast 3D structure and sagging deformation: the deviation fields are fused to compute the 3D information of breast sagging; in particular, the difference between the healthy and affected sides is weighted more heavily when the angle between the upper body and the horizontal is large and deformation occurs, namely:

$$d(x,y,z)=\sum_{k} w_k\,\Delta_k(x,y,z),\qquad (x,y,z)\in\Omega_0 \tag{1}$$

where $\Omega_0$ is the 3D curved surface of the breast in the 0° pose and $w_k$ is the weight of the $k$-th pose deformation field, which can be adjusted to suit the situation, for example:

$$w_k=\frac{\sin\theta_k}{\sum_{j}\sin\theta_j} \tag{2}$$

where $\theta_k$ is the angle between the upper body and the horizontal in the $k$-th pose, and $\Omega_k$ is the 3D curved surface of the breast in the $k$-th pose. In these formulas $(x,y,z)$ denotes the three coordinate variables of the specific 3D coordinate system $O(x,y,z)$ established by the invention (figure 5).

The resulting multi-fusion information, containing both the breast 3D information and the sagging-deformation information, is recorded as

$$\tilde{\Omega}_0=\{(x,y,z,d(x,y,z))\mid (x,y,z)\in\Omega_0\} \tag{3}$$

and, for later convenience, is written simply as

$$(x,y,z,d). \tag{4}$$

(A numerical sketch of this fusion follows at the end of this step.)
The above data provide a data source for the next breast differential analysis.
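As an illustration of step 3.4), the sketch below fuses per-pose deviation fields into the sagging field $d$ of equation (1) using the angle-dependent weighting of equation (2); the array shapes and the sine weighting are assumptions, not the patent's prescribed implementation.

```python
# Minimal sketch of step 3.4): deviation fields Delta_k are assumed to be HxW
# arrays sampled over the 0-degree surface Omega_0 after registration, and the
# example weighting of equation (2) grows with the elevation angle theta_k.
import numpy as np

def fuse_sagging(deltas, thetas_deg):
    """Weighted fusion d = sum_k w_k * Delta_k over the registered poses."""
    w = np.sin(np.deg2rad(thetas_deg))
    w = w / w.sum()                         # normalise the weights to sum to 1
    d = np.zeros_like(deltas[0])
    for wk, delta in zip(w, deltas):
        d += wk * delta                     # heavier weight at larger angles
    return d

# Deviation fields of the 15- and 30-degree poses, registered to 0 degrees.
delta_15 = np.random.rand(64, 64) * 2.0     # stand-in data, millimetres
delta_30 = np.random.rand(64, 64) * 5.0
d = fuse_sagging([delta_15, delta_30], [15.0, 30.0])
```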
Fourth step: bilateral breast difference analysis and assessment
According to the preprocessed and segmented bilateral breast 3D image point-cloud data, the bilateral breast difference analysis is divided into two parts: 2D size difference analysis of the two breasts in the XOY plane, and 3D curved-surface difference analysis after bilateral breast 3D fusion across the different poses.
For the 2D size difference analysis of the bilateral breasts in the XOY plane, the patient's supine-position surface, in the rectangular 3D coordinate system established in step 1.1, is recorded as $\Omega_0(x,y,z)$; after segmentation of the bilateral breast 3D image, $\Omega_0^+(x,y,z)$ and $\Omega_0^-(x,y,z)$ denote the 3D curved surfaces of the segmented affected-side and healthy-side (reference) breasts, respectively. Setting $z=0$ and comparing the extents of $\Omega_0^+(x,y,0)$ and $\Omega_0^-(x,y,0)$ in the XOY plane yields the size difference between the affected-side and healthy-side breasts in the 2D plane. The 2D size error function is further computed as:

$$G^{(2D)}(x,y,0)=\Omega_0^+(x,y,0)-\Omega_0^-(-x,y,0) \tag{5}$$
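A small sketch of equation (5) under assumptions: each breast's XOY footprint is represented as a boolean occupancy mask on a common grid, and the healthy side is mirrored about the YOZ plane before differencing; the masks here are synthetic stand-ins for the segmented images.

```python
# Sketch of the 2D size comparison of equation (5): mirror the healthy-side
# footprint about the central axis (x -> -x) and difference it against the
# affected side. Masks are assumed binary occupancy grids in the XOY plane.
import numpy as np

affected = np.zeros((100, 100), dtype=bool)   # stand-in segmented footprints
healthy = np.zeros((100, 100), dtype=bool)
affected[30:70, 55:90] = True                 # affected breast lies at x > 0
healthy[28:72, 8:48] = True                   # healthy breast lies at x < 0

mirrored = healthy[:, ::-1]                   # reflect about the YOZ plane
g2d = affected.astype(int) - mirrored.astype(int)
# g2d > 0: area present only on the affected side; g2d < 0: area missing there.
print("excess px:", (g2d > 0).sum(), "missing px:", (g2d < 0).sum())
```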
For the 3D curved-surface difference analysis after bilateral 3D fusion across poses, the YOZ plane is taken as the mirror plane and the supine breast 3D surface $\Omega_0$ as the reference; the bilateral breast differences are analysed and the bilateral difference fusing breast 3D and sagging information is measured. The breast surface $\Omega_0$ is extracted from the multi-fusion information $(x,y,z,d)$ and denoted $B$, which can be expressed as:

$$B=\{(x,y,z,d)\mid (x,y,z)\in\Omega_0\} \tag{6}$$

Without loss of generality, $\Omega_0^+$ is taken as the affected-side breast and $\Omega_0^-$ as the healthy-side, i.e. reference, breast (if they are the other way round, the x-axis direction is reversed), where:

$$\Omega_0^+=\{x>0\mid \Omega_0\},\qquad \Omega_0^-=\{x'<0\mid \Omega_0\} \tag{7}$$
From the mirror symmetry about the YOZ plane, the points of $\Omega_0^+$ and $\Omega_0^-$ are in one-to-one correspondence. The difference analysis then compares the corresponding quantities over $\Omega_0$, namely:

$$E^{(3D)}(x,y)=D\big(\Omega_0^+(x,y),\,\Omega_0^-(-x,y)\big)$$

where $D(\cdot,\cdot)$ denotes a functional relationship of its two arguments and $E^{(3D)}(\cdot,\cdot)$ fuses the differences in the $z$ and $d$ dimensions; for example, one may take

$$E^{(3D)}(x,y)=\lambda_1\,\Phi\big(z^+(x,y),\,z^-(-x,y)\big)+\lambda_2\,\Pi\big(d^+(x,y),\,d^-(-x,y)\big)$$

where $\Phi(\cdot,\cdot)$ and $\Pi(\cdot,\cdot)$ can be taken as the absolute or squared error of their two arguments, and $\lambda_1$ and $\lambda_2$ are the weighting parameters fusing the 3D information and the sagging-deformation information. This yields $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, the breast difference information fusing 3D information and sagging-deformation information. Its 1st, 2nd and 4th elements, $x$, $y$ and $E^{(3D)}$, form $\Xi(x,y,E^{(3D)})$, $(x,y,z)\in\Omega_0$, used for the projector 3's live-scene presentation and called the 'virtual breast bilateral contrast information'; the full four elements $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, containing the breast 3D information and the virtual breast bilateral difference-degree information, are used for the virtual-real fused display on the computer screen.
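A one-function sketch of the fused pointwise difference above, with absolute-error $\Phi$ and $\Pi$; the weights and grids are assumptions.

```python
# Sketch of E3D = lam1 * |z+ - z-| + lam2 * |d+ - d-| at mirrored point pairs;
# inputs are assumed to be arrays resampled on a common (x, y) grid, with the
# healthy-side arrays already mirrored about the YOZ plane.
import numpy as np

def e3d(z_aff, z_heal_mirr, d_aff, d_heal_mirr, lam1=0.7, lam2=0.3):
    """Fuse height (z) and sagging (d) differences into one difference map."""
    return lam1 * np.abs(z_aff - z_heal_mirr) + lam2 * np.abs(d_aff - d_heal_mirr)
```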
The difference between the affected-side and healthy-side breasts in the 3D image can also be examined further by dividing the breast 3D image into many small regions. As shown in figure 5, the segmented healthy-side and affected-side breast images are divided into small regions at 10 mm spacing, and the sagging-fused breast 3D surface of each small region is written $\Omega_0^+(x_i,y_j,z_{i,j})$ and $\Omega_0^-(x_i,y_j,z_{i,j})$, where $i$ and $j$ respectively index the small regions along the x and y directions of the coordinate system and $z_{i,j}$ represents the height and shape of the curved-surface patch at $(x_i,y_j)$, containing the error information fused across the different angle conditions. The 3D curved-surface error function of the affected-side and healthy-side breasts is defined as:

$$G^{(3D)}(x_i,y_j,z_{i,j})=\Omega_0^+(x_i,y_j,z_{i,j})-\Omega_0^-(-x_i,y_j,z_{i,j}) \tag{8}$$

From $G^{(3D)}(x_i,y_j,z_{i,j})$, the error between the 3D surface of each small region of the affected-side breast and the 3D surface of the corresponding small region of the healthy-side breast, used as the comparison reference, can be determined.
The surface error index of each small region of the affected-side and healthy-side breast 3D surfaces is defined as:

$$\eta_{i,j}=100\cdot\frac{G^{(3D)}(x_i,y_j,z_{i,j})}{\max_{(i,j)\in S}\big|G^{(3D)}(x_i,y_j,z_{i,j})\big|} \tag{9}$$

This quantitatively indicates the difference between the affected and healthy sides over the breast 3D surface. The overall difference index of the affected-side breast 3D surface is defined as:

$$C=\frac{1}{|S|}\sum_{(i,j)\in S}\eta_{i,j} \tag{10}$$

where $S$ is the set of all small regions of the affected-side or healthy-side breast obtained by image segmentation, and $|S|$ its size. This reflects the overall difference between the 3D surfaces of the affected-side and healthy-side breasts.
The quantities $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, and $G^{(3D)}(x_i,y_j,z_{i,j})$ both reflect, in practice, the error between the affected-side breast 3D surface and the corresponding part of the healthy-side breast 3D surface.

Considering $G^{(3D)}(x_i,y_j,z_{i,j})$: if $G^{(3D)}(x_i,y_j,z_{i,j})>0$, the surface of that small region of the affected breast is higher than the healthy breast's; if $G^{(3D)}(x_i,y_j,z_{i,j})<0$, it is lower. This can therefore serve as navigation information guiding the surgeon's further surgical trimming. The small-region 3D surface error index $\eta_{i,j}$ is normalised relative to the maximum error and ranges over $[-100,+100]$: the more positive the value, the higher the affected-side surface sits above the healthy-side surface, and the more negative, the lower it sits below it. The overall difference index $C$ is a macroscopic measure of the overall difference between the affected-side and healthy-side breasts: if $C\approx 0$, the affected-side 3D surface is close to the healthy-side 3D surface, showing good bilateral symmetry.
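A numerical sketch of equations (8) to (10) under assumptions: the two surfaces are height maps sampled on the 10 mm grid, with the healthy side already mirrored about the YOZ plane so that cells $(i,j)$ correspond across sides; the data are synthetic.

```python
# Sketch of equations (8)-(10): per-region error, normalised index in
# [-100, +100], and the overall difference index C (mean of the indices).
import numpy as np

z_affected = np.random.rand(12, 15) * 40.0          # stand-in heights, mm
z_healthy_mirrored = np.random.rand(12, 15) * 40.0

g3d = z_affected - z_healthy_mirrored               # equation (8), per region
eta = 100.0 * g3d / np.abs(g3d).max()               # equation (9)
C = eta.mean()                                      # equation (10)

# eta > 0: the affected-side region sits above the healthy reference;
# eta < 0: it sits below; C close to 0 indicates good overall symmetry.
print(f"overall difference index C = {C:+.1f}")
```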
The breast difference analysis and evaluation index and graphic display part takes, from the bilateral difference analysis and evaluation part, the surface difference data $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, or the corresponding small-region surface errors $G^{(3D)}(x_i,y_j,z_{i,j})$, the small-region surface error indices $\eta_{i,j}$ of the affected-side and healthy-side breasts, and their overall difference index $C$, and displays on the monitor of the main control computer 1, as graphics with colour bars, the error $G^{(3D)}(x_i,y_j,z_{i,j})$ and/or error index $\eta_{i,j}$ of each small region of the affected breast against the corresponding small region of the healthy breast, guiding the surgeon in trimming the corresponding small regions. The main control computer 1 also displays the value of the overall symmetry index $C$, which indicates the overall difference or symmetry between the affected-side and healthy-side breasts and can serve as the surgeon's reference for ending the correction.
Fifth step: output and display the breast difference analysis and evaluation indices and the virtual image, realising virtual-real fusion navigation
The marker points and the breasts are tracked with an automatic tracking algorithm, the projector is started, and the difference, evaluation and navigation data obtained in the preceding steps are projected onto the physical breast, realising 'virtual-real fusion' surgical navigation. The flow of automatic marker-referenced breast tracking and virtual-real fusion navigation is shown in figure 11; the main steps are: set the marker points and measure the distance between them and related parameters, start the 3D camera to acquire data, compute and generate the virtual breast, project the virtual breast and difference data, automatically match the projected image to the physical breast for virtual-real fusion, and output the navigation data. The computational analysis and virtual-breast reproduction are realised mainly by independently designed algorithms, assisted by mature commercial 3D software (such as GOM Inspect, CloudCompare, Mountain Map, etc.) for image display and comparison, to increase system compatibility and user-friendliness. The independently designed algorithm is as follows:
5.1 On the basis of the arrangement of the marker points and the measurement of the distance between the marker points, the measuring and navigation process can be entered.
5.2 (x, y, z, E) comparing the difference data (x, y, z, E) representing the affected side breast compared to the healthy side breast generated by the fourth step calculation (3D) ),(x,y,z)∈Ω 0 Or watchThe color graph showing the difference of each small area is denoted as G (3D) (x i ,y j ,z i,j ) I, j e S, is referred to as a "virtual breast image".
5.3 In the state that the patient is positioned at 0 DEG in the operation, the laser projector 3 is started, and the virtual breast projector is utilized to project the virtual breast image, the difference data obtained through calculation and other information to the breast position of the patient, so as to form a virtual-real fused breast image.
5.4 The marking points of the entity image and the virtual breast image are overlapped, then the acquired entity image and the image generated by the projector are subjected to matching work such as shifting, stretching, rotating and the like, so that the superposition of the entity image and the virtual breast image is realized, the entity image and the virtual breast image are completely overlapped under ideal conditions, and if the entity image and the virtual breast image are not overlapped, the subsequent treatment of the affected side is guided; and delineate data for surgical navigation on the physical breast, including: the missing position, size, the position, size, peripheral characteristics (such as radian) and the like which need to be filled are convenient for an operator to intuitively know the difference condition between each part of the affected side breast and the healthy side breast, and provide guidance or navigation for further correction of the operator.
Furthermore, the difference indices and the whole-breast partitioned difference-analysis evaluation map give a clear numerical and graphical representation of the overall difference index and the local-region differences between the affected-side and healthy-side breasts during breast remodelling surgery, providing the operating surgeon with qualitative and quantitative indication and navigation information for further surgical trimming of the affected-side breast.
The innovation of the invention lies in: registering the 3D data of the physical breast and the virtual breast by means of the marker points, and projecting the 2D surgical-navigation data (including the previously obtained bilateral breast difference data, surgical planning data and the like) onto the affected-side physical breast through the projector to assist surgical navigation.
The beneficial effects of the invention are as follows:
Addressing the problems in measuring and evaluating the difference between the affected-side and healthy-side breasts in existing breast contour remodelling surgery, a virtual-real fusion navigation system and algorithm for breast surgical contour remodelling, with 3D imaging and digital video technology at its core, is constructed. A marker-point placement method and a virtual-real image fusion technique are designed; a specific 3D rectangular coordinate system is established; deviation fields of the breast images across poses are defined; an error-compensation weighting function and a multivariate information-fusion method are designed; the 2D-plane size differences and 3D curved-surface errors between the affected-side and healthy-side breasts are defined, together with the corresponding small-region surface error index and overall difference index; macroscopic and microscopic bilateral breast surgical difference evaluation methods based on these indices are designed; and a virtual-real fused information display mode and a surgical navigation method based on the difference information are provided. The breast information and surgical navigation images and data obtained in real time through the 3D camera are comparatively objective and give the surgeon an important baseline reference; the virtual-real fused navigation images and data are projected directly onto the operated (affected-side) breast in real time and can provide reliable, accurate navigation information for further trimming. The measuring equipment is simple and accurate and the evaluation method simple and practicable, so the method can be used in clinical surgery.
Drawings
FIG. 1 is a schematic diagram of the overall layout structure of the present invention;
FIG. 2 is an overall logic block diagram of the present invention;
FIG. 3 is a schematic diagram showing the structural connection of the base plate and the 3D camera 2 and projector 3;
FIG. 4 is a schematic view of the structure and installation of the universal boom 4 of the present invention;
FIG. 5 is a rectangular coordinate system depicting a bilateral breast and regional division of the breast in accordance with the present invention;
fig. 6 is a logic and electrical connection diagram of the main control computer 1 and its external devices, the 3D camera 2 and the projector 3;
FIG. 7 is a schematic view showing the elevation of one side of the upper body of the operating table 5 at different angles during surgery;
FIG. 8 is a flow chart of a 3D data acquisition portion of the proprietary software system of the present invention;
FIG. 9 is a flow chart of a 3D image processing portion of a proprietary software system;
FIG. 10 is a flow chart of a control portion of the dedicated software system of the present invention;
FIG. 11 is a flow chart of automatic tracking, virtual-real fusion navigation of the breast.
In the figures: 1, main control computer; 2, 3D camera; 3, projector; 4, universal suspension boom; 5, operating table; 6, operating-table support plate.
Detailed Description
The invention is further illustrated below with reference to specific examples.
A navigation system for breast surgical contour remodelling comprises a main control computer 1, a 3D camera 2 and a projector 3 in signal communication with the main control computer 1, and a universal suspension boom 4. The 3D camera 2 and the projector 3 are stacked vertically and bolted to a fixed base plate, the top of which is bolted to the lower end of the universal boom 4; the top end of the boom 4 is fixed to the ceiling above the operating table 5. The position and orientation of the 3D camera 2 and the projector 3 can be adjusted freely through the universal boom 4. Two marker points placed in advance on the patient's body surface serve as datum or reference points from which the navigation system automatically registers the 3D image. The 3D camera 2 captures full-size 3D images of the patient's breasts during surgery, producing 3D point-cloud data of the breasts, with acquisition, analysis and processing controlled by the main control computer 1. The computed surgical navigation images and data, such as the bilateral breast difference and evaluation data, are projected onto the affected-side breast for 'virtual-real fusion' surgical navigation.
The main control computer 1 is a conventional desktop or laptop computer, placed apart from the 3D camera 2 and the projector 3 at a distance of 2.5 metres and connected to them by data communication cables; its peripherals comprise a colour liquid-crystal display, an input keyboard and a mouse.
A virtual-real fusion navigation method in breast surgery appearance remodelling operation comprises the following steps:
preparation for acquisition of 3D images
A1. According to the overall structure layout shown in fig. 1, signal connection lines of a main control computer 1, a 3D camera 2 and a projector 3 are connected.
A2. Marker points are placed in the Huagai acupoint and Zhongwan acupoint of the patient.
A3. The boom 4 is pulled so that the 3D camera 2 and projector 3 attached to it are positioned approximately on the Z axis of the defined 3D coordinate system, aimed at the patient's chest, with the front of the 3D camera 2 lens kept at a distance of 950 mm from the chest surface.
B. Obtaining intra-operative 3D full-size breast images of a patient
B1. The upper-body side of the operating table 5 is adjusted so as to rise to 0°, 15° and 30° in turn.
B2. The main control computer 1, the 3D camera 2 and the virtual-breast projector (i.e. projector 3) are powered on. The main control computer begins acquiring and processing the camera data.
B3. The 3D data acquisition function of the dedicated software system on the main control computer 1 is started; the main control computer 1 reads the 3D image data from the 3D camera 2, with an acquisition time of no more than 10 seconds.
B4. The marker points in the image are tracked, the virtual-space coordinate system is established, and the bilateral-difference 3D data required for surgical navigation are built in those coordinates.
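A sketch of the marker tracking in B4 under assumptions: the markers are high-contrast coloured crosses, so a simple HSV colour threshold plus contour centroids stands in for the patent's unspecified tracking algorithm; the threshold values are illustrative.

```python
# Minimal marker detection sketch: threshold an assumed marker colour in HSV
# and return the centroid of each sufficiently large blob.
import cv2
import numpy as np

def find_markers(bgr: np.ndarray):
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (40, 80, 80), (80, 255, 255))  # assumed green markers
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 50:                                  # ignore tiny blobs
            centers.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return centers
```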
C. Whole breast 3D image preprocessing and breast image segmentation
C1. The 3D full-size breast image obtained from the 3D camera 2 is filtered as preprocessing to remove possible noise interference (a simple stand-in filter is sketched after C4 below).
C2. The preprocessed 3D full-size breast image is subjected to the necessary geometric adjustments, including scaling and angular rotation.
C3. The geometrically adjusted 3D full-size breast image is segmented to obtain image data of the affected side breast and the healthy side breast respectively.
C4. According to the foregoing formulas (1) to (4), the multi-fusion information (x, y, z, d) comprising the breast 3D information and the sagging-deformation information is calculated.
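As an illustration of the C1 filtering step, the sketch below removes statistical outliers from the point cloud; this generic z-score filter is an assumption, not the dedicated software's actual filter.

```python
# Sketch of C1 (noise filtering): drop points whose distance to the centroid
# is a statistical outlier of the distance distribution.
import numpy as np

def remove_outliers(points: np.ndarray, k_std: float = 2.5) -> np.ndarray:
    """points is an (N, 3) array from the 3D camera; returns the kept points."""
    d = np.linalg.norm(points - points.mean(axis=0), axis=1)
    keep = np.abs(d - d.mean()) < k_std * d.std()
    return points[keep]

cloud = np.random.rand(10_000, 3) * 300.0   # stand-in point cloud, millimetres
clean = remove_outliers(cloud)
```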
D. Bilateral breast differential analysis and assessment
D1. The specific 3D rectangular coordinate system $O(x,y,z)$ describing the bilateral breast curved surfaces is established as shown in figure 5, where $(x,y)$ is the plane approximating the body surface and the z axis is the dimension of the body surface along the viewing direction of the 3D camera 2 (i.e. perpendicular to the page in the figure).
D2. The segmented 3D images of the affected-side and healthy-side breasts are placed, together with the background portion, into this 3D coordinate system, as shown in figure 5; $\Omega_0^+(x,y,z)$ and $\Omega_0^-(x,y,z)$ denote the 3D curved surfaces of the segmented affected and healthy (reference) breasts, respectively.
D3. The 2D error function $G^{(2D)}(x,y,0)$ is calculated according to equation (5), determining the difference in geometric size between the affected-side and healthy-side breasts in the $(x,y)$ plane.
D4a. The 3D error information of the affected-side and healthy-side breasts, $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, is calculated.
Or D4b. As shown in figure 5, on the $(x,y)$ plane of the bilateral breast image, the affected-side and healthy-side breast images are each divided into small regions at 10 mm spacing, and the sagging-fused breast 3D surfaces are expressed as $\Omega_0^+(x_i,y_j,z_{i,j})$ and $\Omega_0^-(x_i,y_j,z_{i,j})$, where $i$ and $j$ respectively index the small regions along the x and y directions of the coordinate system and $z_{i,j}$ represents the height and shape of the curved-surface patch at $(x_i,y_j)$, containing the error information fused under the different angle conditions.
D5a. From $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, the quantity $\Xi(x,y,E^{(3D)})$, $(x,y,z)\in\Omega_0$, is further calculated; it can serve as the virtual-image error information shown on the main control computer's display, and can also be kept for later projection onto the patient's surgical site.
Or D5b. The 3D curved-surface error function of the affected-side and healthy-side breasts is calculated from $G^{(3D)}(x_i,y_j,z_{i,j})$ defined in equation (8); the surface error index $\eta_{i,j}$ of each corresponding small region of the two breasts' 3D surfaces is calculated from equation (9); and the overall difference index of the two breasts' 3D surfaces is calculated from $C$ defined in equation (10).
D6. The indices calculated in D4a/D4b and D5a/D5b quantitatively evaluate the difference between the affected-side and healthy-side breasts and can serve as navigation information guiding the surgeon's further surgical trimming.
E. Output display of breast difference analysis and evaluation index and virtual image, namely virtual-real fusion
E1. The $G^{(2D)}(x,y,0)$ 2D curve is displayed on the monitor of the main control computer 1, representing the difference in size between the affected-side and healthy-side breasts in the $(x,y)$ plane.
E2a. The error virtual-image information of the affected-side and healthy-side breasts, $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, is displayed on the computer 1, and the calculated $\Xi(x,y,E^{(3D)})$, $(x,y,z)\in\Omega_0$, is further projected and displayed, carrying the virtual-image information into further virtual-real fusion.
Or E2b. Based on the obtained 3D difference analysis and evaluation of the affected-side and healthy-side breasts, each small region of the affected breast is displayed on the monitor of computer 1, over the bilateral-breast coordinate system and region-divided breast image described above, as a pseudo-colour virtual image with numerical values: the $G^{(3D)}(x_i,y_j,z_{i,j})$ virtual error image and the $\eta_{i,j}$ values with their corresponding colours, plus a colour bar indicating the error magnitude or symmetry that each colour represents, providing the surgeon with a microscopic difference evaluation (a display sketch follows E4 below).
E3. The virtual-breast projector (i.e. projector 3) simultaneously projects the information of E2a and E2b, i.e. the virtual images, onto the patient's breast, realising virtual-real fusion and providing the surgeon with a more intuitive difference evaluation and surgical trimming navigation.
E4. The value of the bilateral overall symmetry index C is displayed on the monitor of the main control computer 1, providing the surgeon with a macroscopic difference evaluation.
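A sketch of the E2b pseudo-colour display under assumptions: matplotlib's pcolormesh with a diverging colormap and a colour bar stands in for the dedicated software's rendering of the $\eta_{i,j}$ indices; the data are synthetic.

```python
# Pseudo-colour display of the per-region error indices with a colour bar.
import matplotlib.pyplot as plt
import numpy as np

eta = np.random.uniform(-100, 100, size=(12, 15))  # stand-in error indices

fig, ax = plt.subplots()
mesh = ax.pcolormesh(eta, cmap="coolwarm", vmin=-100, vmax=100)
fig.colorbar(mesh, ax=ax, label="small-region error index")
ax.set_xlabel("region index along x")
ax.set_ylabel("region index along y")
ax.set_title("Affected vs. healthy breast: per-region difference")
plt.show()
```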
F. The surgeon trims the breast contour and repeats the B to E difference evaluation and virtual-real fusion until satisfied.
The examples described above represent only embodiments of the invention and are not to be understood as limiting the scope of the patent; it should be pointed out that several variants and modifications may be made by those skilled in the art without departing from the concept of the invention, and these fall within the scope of protection of the invention.

Claims (5)

1. A navigation method for breast surgical contour remodelling, realised on a navigation system, characterised in that the navigation system comprises a main control computer (1), a 3D camera (2) and a projector (3) in signal communication with the main control computer (1), and a universal suspension boom (4), the peripherals of the main control computer (1) comprising a colour liquid-crystal display, an input keyboard and a mouse;
the 3D camera (2) and the projector (3) are stacked vertically and bolted to a fixed base plate, the top of which is bolted to the lower end of the universal boom (4); the top end of the universal boom (4) is fixed to the ceiling above the operating table (5), and the position and orientation of the 3D camera (2) and the projector (3) are adjusted through the universal boom (4);
two marker points are placed in advance on the patient's body surface and serve as datum or reference points from which the navigation system automatically registers the 3D image; full-size 3D images of the patient's breasts are captured during surgery by the 3D camera (2), and the data are acquired and analysed by the main control computer (1) to obtain 3D point-cloud data of the breasts; the surgical navigation images and data are then projected onto the patient by the projector (3);
the navigation method comprises the following steps:
first, the marker points are set and the navigation system constructed; next, full-size images of the patient's bilateral breasts are captured in real time with the 3D camera, and the main control computer (1) completes the real-time acquisition of the full-size breast 3D image data and the bilateral breast difference analysis and evaluation; finally, the analysis and evaluation results generate the difference indices and the whole-breast partitioned difference-analysis evaluation map, i.e. the virtual breast image, the marker points being used to register the 3D data of the physical breast with the virtual breast; the difference indices and the whole-breast partitioned difference-analysis evaluation map are displayed on the colour liquid-crystal display fused with the real 3D breast image, forming one '3D virtual-real fusion'; the whole-breast partitioned difference-analysis evaluation map is projected in real time by the projector (3) onto the surface of the physical breast being operated on, forming another '3D virtual-real fusion'; during the operation, continuous tracking and real-time data updating are carried out, the differences and similarities between the affected and healthy sides are analysed in real time, and the results are projected onto the breast surface through the virtual-real fusion technique, yielding the bilateral breast difference evaluation result and the trimming navigation information.
2. The navigation method for breast surgical contour remodelling based on a navigation system according to claim 1, characterised by comprising the following steps:
first step: setting the marker points and constructing the navigation system
1.1 Arranging two mark points on the body surface of a patient, and establishing a three-dimensional space rectangular coordinate system of the human body; defining the position of one of the two mark points as the origin of the coordinate system, wherein the X axis represents the transverse direction of the human body surface, the Y axis represents the central axis of the human body surface, and the Z axis represents the direction perpendicular to the human body surface and facing the 3D camera (2);
1.2 Pulling the suspension arm (4) to enable the 3D camera (2) and the projector (3) which are connected to the suspension arm (4) to be positioned on the Z axis of the three-dimensional space rectangular coordinate system;
1.3 The main control computer (1) analyzes and processes the full-size 3D breast image data obtained by the 3D camera (2);
second step: acquiring 3D full-size breast images of the patient in different poses
the human body lies supine on the operating table with the upper body on the operating-table support plate (6); 3D full-size breast images in different poses are acquired, captured in real time by the 3D camera (2) as the 'real image';
third step: processing the breast data of each angle acquired in the second step through the main control computer (1): tracking the marker points in the image by their features, establishing a virtual-space coordinate system, building in those coordinates the bilateral-difference three-dimensional data required for surgical navigation, and completing the 3D full-size breast image preprocessing, segmentation, registration and breast data fusion; specifically:
3.1 Image preprocessing: abnormal discontinuous points in the 3D full-size breast image data are removed, and the smoothness of a 3D curved surface of the breast surface is ensured;
3.2) Image region segmentation: processing the preprocessed image with a deep-learning method based on a U-shaped network, so as to realise the region segmentation of the bilateral breast 3D image;
3.3) Different-pose registration: collecting 3D data in three body positions with the supine upper body at 0°, 15° and 30° to the horizontal, and recording the 3D data collected in each pose, the left breast when lying flat being $L_0$ and the right breast $R_0$, and the $k$-th of the remaining two poses being $L_k$ and $R_k$; registering the 3D image of each pose with the 0° image to ensure alignment of the nipples, key focus points and deformation points or other key positions; registering all poses, through the registration deformation field, to the 3D coordinates of the breast with the patient lying flat, and obtaining the deviation field $\Delta_k(x,y,z)$ from the deformation between the $k$-th pose and the lying pose, wherein the 0° state is the lying state during the operation and the data of the other two angles provide surgical guidance data for measuring breast symmetry in different postoperative postures;
3.4) Fusion of the breast 3D structure and sagging deformation: fusing the deviation fields to calculate the 3D information of breast sagging, and weighting the difference between the healthy and affected sides more heavily when the angle between the upper body and the horizontal is larger and deformation occurs, namely:

$$d(x,y,z)=\sum_{k} w_k\,\Delta_k(x,y,z),\qquad (x,y,z)\in\Omega_0 \tag{1}$$

where $\Omega_0$ is the 3D curved surface of the breast in the 0° pose and $w_k$ is the weight of the $k$-th pose deformation field, with the adjustment formula:

$$w_k=\frac{\sin\theta_k}{\sum_{j}\sin\theta_j} \tag{2}$$

where $\theta_k$ is the angle between the upper body and the horizontal in the $k$-th pose, and $\Omega_k$ is the 3D curved surface of the breast in the $k$-th pose; $(x,y,z)$ in these formulas denotes the three coordinate variables of the 3D coordinate system $O(x,y,z)$;

whereby the resulting multi-fusion information comprising the breast 3D information and the sagging-deformation information is recorded as

$$\tilde{\Omega}_0=\{(x,y,z,d(x,y,z))\mid (x,y,z)\in\Omega_0\} \tag{3}$$

and is expressed, for convenience, as

$$(x,y,z,d); \tag{4}$$
fourth step: bilateral breast difference analysis and assessment
dividing the bilateral breast difference analysis, according to the preprocessed and segmented bilateral breast 3D image point-cloud data, into two parts: bilateral breast 2D size difference analysis in the XOY plane, and 3D curved-surface difference analysis after bilateral breast 3D fusion across the different poses;
for bilateral breast 2D size variability analysis on XOY plane, three-dimensional spatial right angles established at step 1.1)In the coordinate system, the patient is photographed in the supine position and marked as omega 0 Segmentation of (x, y, z) bilateral breast 3D images, respectively in Ω 0 + (x, y, z) and Ω 0 - (x, y, z) represents a 3D curved surface of the segmented patient side and the healthy side breast, with the healthy side as a reference side; let z=0, compare Ω 0 + (x, y, 0) and Ω 0 - (x, y, 0) dimension in the XOY plane, resulting in a difference in dimension in the 2D plane between the segmented patient side breast and the healthy side breast; further calculation results in a 2D size error function of:
G (2D) (x,y,0)=Ω 0 + (x,y,0)-Ω 0 - (-x,y,0) (5)
for the 3D curved-surface difference analysis after bilateral breast 3D fusion across poses: taking the YOZ plane as the mirror plane and the supine breast 3D surface $\Omega_0$ as the benchmark, analysing the bilateral breast differences and measuring the bilateral difference fusing breast 3D and sagging information; extracting the breast surface $\Omega_0$ from the multi-fusion information $(x,y,z,d)$, denoted $B$ and expressed as:

$$B=\{(x,y,z,d)\mid (x,y,z)\in\Omega_0\} \tag{6}$$

wherein $\Omega_0^+$ is the affected-side breast and $\Omega_0^-$ the healthy-side breast (if they are the other way round, the x-axis direction is reversed), with:

$$\Omega_0^+=\{x>0\mid \Omega_0\},\qquad \Omega_0^-=\{x'<0\mid \Omega_0\} \tag{7}$$
as is known from the mirror symmetry about the YOZ plane, the points of $\Omega_0^+$ and $\Omega_0^-$ are in correspondence; the difference analysis therefore compares the corresponding quantities over $\Omega_0$, namely:

$$E^{(3D)}(x,y)=D\big(\Omega_0^+(x,y),\,\Omega_0^-(-x,y)\big)$$

wherein $D(\cdot,\cdot)$ represents a functional relationship of its two arguments, and $E^{(3D)}(\cdot,\cdot)$ takes the differences in the $z$ and $d$ dimensions:

$$E^{(3D)}(x,y)=\lambda_1\,\Phi\big(z^+(x,y),\,z^-(-x,y)\big)+\lambda_2\,\Pi\big(d^+(x,y),\,d^-(-x,y)\big)$$

wherein $\Phi(\cdot,\cdot)$ and $\Pi(\cdot,\cdot)$ are obtained as measures such as the absolute or squared error of their two arguments, and $\lambda_1$ and $\lambda_2$ are the weighting parameters fusing the 3D information and the sagging-deformation information; this yields $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, the breast difference information fusing 3D information and sagging-deformation information; its 1st, 2nd and 4th elements $x$, $y$, $E^{(3D)}$ form $\Xi(x,y,E^{(3D)})$, $(x,y,z)\in\Omega_0$, used for the live-scene display of the projector (3) and called the 'virtual breast bilateral difference-degree information'; all four elements $(x,y,z,E^{(3D)})$, $(x,y,z)\in\Omega_0$, of the target breast difference range, comprising the breast 3D information and the virtual breast bilateral difference-degree information, are displayed as the virtual-real fusion on the computer display screen;
or further examining the difference between the affected side breast and the healthy side breast in the 3D image by dividing the 3D image of the breast into a plurality of small areas; dividing the segmented breast image of the healthy side and the affected side into a plurality of small areas, wherein each small area comprises a breast 3D curved surface which fuses sagging information and is expressed as omega 0 + (x i ,y j ,z i,j ) And omega 0 - (x i ,y j ,z i,j ) Wherein i and j respectively represent the serial numbers of small areas in the x and y directions in the coordinate system, z i,j Representation and x i ,y j The height and shape of the curved surface corresponding to the small area comprises the condition of fusing different anglesError information of (2); defining the 3D curved surface error function of the affected side breast and the healthy side breast as follows:
G (3D) (x i ,y j ,z i,j )=Ω 0 + (x i ,y j ,z i,j )-Ω 0 - (-x i ,y j ,z i,j ) (8)
From G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ), the error between the 3D surface of each small region of the affected-side breast and the 3D surface of the corresponding small region of the healthy-side breast, which serves as the comparison reference, is determined;
The surface error index of each small region of the affected-side breast relative to the corresponding region of the healthy-side breast is defined, for example, as the magnitude of the surface error:

ε(i, j) = |G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ)|

This index quantitatively reflects the difference of each small region between the affected side and the healthy side on the breast 3D surface. The overall difference index of the affected-side breast 3D surface is then defined, for example, as the mean of the region indices:

C = (1 / |S|) · Σ(i,j)∈S ε(i, j)

where S denotes the set of all small regions of the affected-side or healthy-side breast obtained by image segmentation; C reflects the overall difference between the 3D surfaces of the affected-side and healthy-side breasts;
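A sketch of the small-region analysis under stated assumptions: the common grid is tiled into square patches, the signed error G^(3D) is taken as the per-patch mean of the pointwise surface difference, and the aggregation of C as a mean over S mirrors the illustrative formulas above (the patent's exact formulas survive only as images):

```python
import numpy as np

def region_errors(z_aff, z_heal_mirr, block=16):
    """Per-region signed error G^(3D), per-region index eps = |G^(3D)|,
    and overall difference index C over the region set S.

    z_aff, z_heal_mirr : height maps on a common grid, healthy side
                         already mirrored about the YOZ plane.
    block              : patch size in grid cells (an assumption).
    """
    h, w = z_aff.shape
    h, w = h - h % block, w - w % block          # crop to whole patches
    diff = z_aff[:h, :w] - z_heal_mirr[:h, :w]   # pointwise signed error
    # average the signed error inside each (i, j) patch -> G^(3D)(xi, yj, zij)
    g3d = diff.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    eps = np.abs(g3d)                            # per-region error index
    C = eps.mean()                               # overall difference index over S
    return g3d, eps, C
```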
Both (x, y, z, E^(3D)), (x, y, z) ∈ Ω₀ and G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ) in practice reflect the error between the 3D surface of the affected-side breast and the 3D surface of the corresponding part of the healthy-side breast;
Considering the sign of G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ): if G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ) > 0, the surface of that small region of the affected-side breast is higher than that of the healthy-side breast; if G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ) < 0, it is lower than that of the healthy-side breast; this sign is used as navigation information to guide the surgeon in further surgical trimming;
The breast difference analysis, evaluation index and graphic display part obtains from the bilateral breast difference analysis and evaluation part the surface difference data (x, y, z, E^(3D)), (x, y, z) ∈ Ω₀ of the affected-side and healthy-side breasts, or the corresponding small-region surface errors G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ), the small-region surface error index ε(i, j) of the affected-side breast relative to the healthy side, and the overall difference index C of the affected-side and healthy-side breasts. The error G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ) and/or the error index ε(i, j) of each small region of the affected-side breast relative to the corresponding small region of the healthy-side breast is displayed on the display of the main control computer (1) in the form of graphs and color strips, guiding the surgeon to trim the corresponding small region; the main control computer (1) also displays the value of the overall symmetry index C to show the overall difference, or symmetry, between the affected-side and healthy-side breasts, serving as the operator's reference for stopping the correction operation;
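One plausible rendering of the color-strip display, assuming matplotlib and a diverging colormap (warm for regions higher than the healthy reference, i.e. candidate trim regions; cool for lower); the ±limit of the color scale in millimetres is illustrative:

```python
import matplotlib.pyplot as plt

def show_navigation_map(g3d, C, limit=5.0):
    """Display the signed per-region error G^(3D) as a color map with the
    overall difference index C shown as the stop-criterion readout."""
    fig, ax = plt.subplots()
    im = ax.imshow(g3d, cmap="bwr", vmin=-limit, vmax=limit)
    fig.colorbar(im, ax=ax, label="G^(3D) per region (mm)")
    ax.set_title(f"overall difference index C = {C:.2f} mm")
    ax.set_xlabel("region index i (x direction)")
    ax.set_ylabel("region index j (y direction)")
    plt.show()
```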
Fifthly, outputting and displaying the breast difference analysis and evaluation indexes and the virtual image, and realizing virtual-real fusion navigation
The automatic tracking algorithm is used to track the marker points and the breasts; the projector is started and the difference, evaluation and navigation data obtained in the preceding steps are projected onto the physical breasts, realizing "virtual-real fusion" surgical navigation. Specifically: the marker points are set and the distance information between them is measured; the 3D camera is started to acquire data; the virtual breast is generated by calculation; the virtual breast and the difference data are projected; automatic matching of the projected image and virtual-real fusion with the physical breast are realized; and the navigation data are output;
the automatic tracking algorithm is as follows:
5.1) on the basis of arranging the marker points and measuring the distances between them, the measurement and navigation process is entered;
5.2) the difference data (x, y, z, E^(3D)), (x, y, z) ∈ Ω₀ generated by the fourth-step calculation, representing the affected-side breast compared with the healthy side, or the color graphic representing the difference of each small region, G^(3D)(xᵢ, yⱼ, zᵢ,ⱼ), i, j ∈ S, is called the "virtual breast image";
5.3) with the patient positioned at 0° during the operation, the laser projector (3) is started and the calculated virtual breast image and difference data information are projected onto the patient's breast position, forming a virtual-real fused breast image;
5.4) the marker points of the physical image and of the virtual breast image are brought into coincidence; the acquired physical image and the image generated by the projector are then processed so that the physical image and the virtual breast image are superimposed. Under ideal conditions they coincide completely; where they do not coincide, the mismatch guides the subsequent treatment of the affected side, and the surgical navigation data are presented on the physical breast, providing guidance or navigation for the subsequent trimming.
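A sketch of the projection-matching step in 5.4), under the assumption that the marker points detected in the 3D camera's color image and their intended positions in the projector frame are related by a planar homography; the patent does not commit to this model, and a homography needs at least four correspondences, more than the two midline markers the claims mention, so additional reference points are assumed here:

```python
import numpy as np
import cv2

def projector_alignment(cam_markers, proj_markers, virtual_img, proj_size):
    """Warp the virtual breast image into projector coordinates so the
    projected markers land on the physical markers (virtual-real overlap).

    cam_markers  : (N, 2) marker pixel positions detected in the camera image
    proj_markers : (N, 2) corresponding marker positions in projector pixels
    virtual_img  : virtual breast image rendered in camera coordinates
    proj_size    : (width, height) of the projector frame
    """
    # Robustly estimate the camera-to-projector homography from the markers.
    H, _ = cv2.findHomography(np.asarray(cam_markers, np.float32),
                              np.asarray(proj_markers, np.float32),
                              cv2.RANSAC)
    return cv2.warpPerspective(virtual_img, H, proj_size)
```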
3. The navigation method in breast surgery appearance remodelling based on the navigation system according to claim 1, wherein the marker points are used by the system to automatically determine the fiducial or reference points of the 3D image; the marker points are arranged above and below the breasts on the body surface along the median axis of the human body and are designed to be easy to identify automatically; the two marker points define the y-axis of the rectangular coordinate system, called the body-surface median axis, and the YOZ plane containing this axis is called the human median-axis section.
4. The navigation method in breast surgery appearance remodelling based on the navigation system according to claim 2, wherein in step 1.2) the distance between the front end of the lens of the 3D camera (2) and the chest surface of the patient is 900 mm to 1000 mm.
5. The navigation method in breast surgery appearance remodelling based on the navigation system according to claim 1, wherein the difference indexes and the overall breast-partition difference analysis and evaluation chart express, in numerical values and graphs, the overall difference index and the local-region differences between the affected-side and healthy-side breasts during breast remodelling, providing qualitative and quantitative indication and navigation information for the operator to further perform the affected-side breast operation.