CN114338936A - Image processing apparatus and method - Google Patents
- Publication number
- CN114338936A (application CN202111134028.0A)
- Authority
- CN
- China
- Legal status: Granted (the legal status is an assumption and not a legal conclusion; Google has not performed a legal analysis)
Landscapes
- Magnetic Resonance Imaging Apparatus (AREA)
Abstract
An image processing apparatus according to an embodiment includes a display control unit, a selection unit, a reception unit, and an image processing unit. The display control unit causes a display unit to display a plurality of magnetic resonance images in association with a plurality of position display images indicating the positions of the imaging cross sections of those magnetic resonance images. The selection unit selects a plurality of position display images from among the position display images displayed on the display unit. The reception unit receives a user operation instructing image processing on the selected position display images. The image processing unit applies the image processing to the selected position display images collectively.
Description
Cross reference to related applications
This application is based on and claims priority from Japanese Patent Application No. 2020-.
Technical Field
Embodiments of the present invention relate to an image processing apparatus and method.
Background
Conventionally, there is known a technique of displaying medical images, such as magnetic resonance images, in association with images showing the positions of the imaging cross sections of those medical images. An image showing the position of an imaging cross section in a medical image is generally called an illustration (inset) image.
A user sometimes performs image processing on such illustration images. When there are a plurality of medical images to be displayed, the image processing must be performed individually on the illustration image corresponding to each medical image, which can take time and effort.
Disclosure of Invention
An image processing apparatus according to an embodiment includes a display control unit, a selection unit, a reception unit, and an image processing unit. The display control unit causes the display unit to display a plurality of magnetic resonance images in association with a plurality of first position display images indicating the positions of the imaging cross sections of those magnetic resonance images. The selection unit selects a plurality of second position display images included in the plurality of first position display images. The reception unit receives a user operation instructing image processing on the selected plurality of second position display images. The image processing unit applies the image processing to the selected plurality of second position display images collectively.
Drawings
Fig. 1 is a block diagram showing an example of a magnetic resonance imaging apparatus according to a first embodiment.
Fig. 2 is a diagram showing an example of a positioning image according to the first embodiment.
Fig. 3 is a diagram showing an example of a diagnostic image according to the first embodiment.
Fig. 4 is a diagram showing an example of a reference image according to the first embodiment.
Fig. 5 is a diagram showing an example of an image with an illustration according to the first embodiment.
Fig. 6 is a diagram showing an example of a virtual film editing screen according to the first embodiment.
Fig. 7 is a diagram for explaining an example of the operation of specifying a selection target according to the first embodiment.
Fig. 8 is a flowchart showing an example of the flow of image processing according to the first embodiment.
Fig. 9 is a diagram showing an example of a virtual film editing screen according to the second embodiment.
Fig. 10 is a flowchart showing an example of the flow of image processing according to the second embodiment.
Fig. 11 is a diagram showing an example of a virtual film editing screen according to a first modification.
Detailed Description
Embodiments of an image processing apparatus and method are described in detail below with reference to the drawings.
(first embodiment)
An image processing apparatus according to an embodiment includes a display control unit, a selection unit, a reception unit, and an image processing unit. The display control unit causes the display unit to display a plurality of magnetic resonance images in association with a plurality of first position display images indicating the positions of the imaging cross sections of those magnetic resonance images. The selection unit selects a plurality of second position display images included in the plurality of first position display images. The reception unit receives a user operation instructing image processing on the selected plurality of second position display images. The image processing unit applies the image processing to the selected plurality of second position display images collectively.
Fig. 1 is a block diagram showing an example of a Magnetic Resonance Imaging (MRI) apparatus 100 according to a first embodiment. The magnetic resonance imaging apparatus 100 is an example of the image processing apparatus in the present embodiment.
The magnetic resonance imaging apparatus 100 includes a static field magnet 101, a static field power supply (not shown), a gradient coil 103, a gradient power supply 104, a bed 105, a bed control circuit 106, a transmission coil 107, a transmission circuit 108, a reception coil 109, a reception circuit 110, a sequence control circuit 120, and a computer system 130.
The configuration shown in fig. 1 is merely an example. For example, the sequence control circuit 120 and the respective parts in the computer system 130 may be integrated or separated as appropriate. In addition, the magnetic resonance imaging apparatus 100 does not include the subject P (e.g., a human body).
The X, Y, and Z axes shown in fig. 1 constitute an apparatus coordinate system unique to the magnetic resonance imaging apparatus 100. For example, the Z-axis direction is set along the magnetic flux of the static magnetic field generated by the static magnetic field magnet 101, so as to coincide with the axial direction of the cylinder of the gradient magnetic field coil 103. The Z-axis direction is the same as the longitudinal direction of the bed 105, that is, the head-to-foot direction of the subject P placed on the bed 105. The X-axis direction is set along the horizontal direction orthogonal to the Z-axis direction, and the Y-axis direction along the vertical direction orthogonal to the Z-axis direction. In the present embodiment, the term "circular" includes elliptical.
The static magnetic field magnet 101 is a magnet formed in a hollow substantially cylindrical shape, and generates a static magnetic field in an internal space. The static magnetic field magnet 101 is, for example, a superconducting magnet, and is excited by receiving a current from a static magnetic field power supply. The static magnetic field power supply supplies a current to the static magnetic field magnet 101. As another example, the static magnetic field magnet 101 may be a permanent magnet, and in this case, the magnetic resonance imaging apparatus 100 may not include a static magnetic field power supply. The static magnetic field power supply may be provided separately from the magnetic resonance imaging apparatus 100.
The gradient magnetic field coil 103 is a coil formed in a hollow, substantially cylindrical shape and is disposed inside the static field magnet 101. The gradient magnetic field coil 103 is formed by combining three coils corresponding to the mutually orthogonal X, Y, and Z axes; each coil individually receives a current from the gradient magnetic field power supply 104 and generates a gradient magnetic field whose magnetic field strength changes along the corresponding axis. The gradient magnetic field power supply 104 supplies current to the gradient magnetic field coil 103 under the control of the sequence control circuit 120.
The bed 105 includes a top plate 105a for placing the subject P thereon, and the top plate 105a is inserted into the imaging port with the subject P such as a patient placed thereon under the control of the bed control circuit 106. The bed control circuit 106 drives the bed 105 under the control of the computer system 130 to move the top plate 105a in the longitudinal direction and the vertical direction.
The transmission coil 107 applies a radio frequency magnetic field to excite an arbitrary region of the subject P. The transmission coil 107 is, for example, a whole-body coil that surrounds the entire body of the subject P. Upon receiving an RF pulse supplied from the transmission circuit 108, the transmission coil 107 generates the radio frequency magnetic field and applies it to the subject P. The transmission circuit 108 supplies RF pulses to the transmission coil 107 under the control of the sequence control circuit 120.
The receiving coil 109 is disposed inside the gradient magnetic field coil 103 and receives a magnetic resonance signal (hereinafter referred to as an MR (Magnetic Resonance) signal) emitted from the subject P under the influence of the radio frequency magnetic field. Upon receiving the MR signal, the receiving coil 109 outputs it to the receiving circuit 110.
In fig. 1, the receiving coil 109 is provided separately from the transmitting coil 107, but this is an example and is not limited to this configuration. For example, the receiving coil 109 may also serve as the transmitting coil 107.
The receiving circuit 110 performs analog-to-digital (AD) conversion on the analog MR signal output from the receiving coil 109 to generate MR data. The receiving circuit 110 transmits the generated MR data to the sequence control circuit 120. Further, the AD conversion may be performed in the reception coil 109. The reception circuit 110 can perform arbitrary signal processing in addition to AD conversion.
The sequence control circuit 120 drives the gradient magnetic field power supply 104, the transmission circuit 108, and the reception circuit 110 based on the sequence information transmitted from the computer system 130, thereby performing imaging of the subject P. The sequence information defines the order in which imaging is performed. The sequence information includes, for example, the intensity of the current supplied from the gradient magnetic field power supply 104 to the gradient magnetic field coil 103 and the timing of supplying the current, the intensity of the RF pulse supplied from the transmission circuit 108 to the transmission coil 107 and the timing of applying the RF pulse, and the timing of detecting the MR signal by the reception circuit 110. The sequence control circuit 120 may be implemented by a processor, and may also be implemented by a combination of software and hardware.
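The sequence information described above can be pictured as a simple record of intensities and timings. The following sketch is purely illustrative; the field names are assumptions for explanation, not the patent's actual data format:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SequenceInfo:
    # Hypothetical fields mirroring the description:
    # (intensity, timing) pairs for each control target.
    gradient_supply: List[Tuple[float, float]]   # current to gradient coil 103
    rf_pulses: List[Tuple[float, float]]         # RF pulses to transmission coil 107
    mr_detection_times: List[float]              # MR-signal detection timings (circuit 110)

# The sequence control circuit would drive each circuit according to
# such a record, then return the acquired MR data upstream.
seq = SequenceInfo(gradient_supply=[(1.0, 0.5)],
                   rf_pulses=[(0.8, 0.6)],
                   mr_detection_times=[1.2])
```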
When the sequence control circuit 120 receives MR data from the receiving circuit 110 as a result of driving the gradient magnetic field power supply 104, the transmitting circuit 108, and the receiving circuit 110 to image the subject P, the received MR data is transferred to the computer system 130.
The computer system 130 performs overall control of the magnetic resonance imaging apparatus 100, generation of an MR image, and the like. As shown in fig. 1, the computer system 130 includes an NW (network) interface 131, a storage circuit 132, a processing circuit 133, an input interface 134, and a display 135.
The NW interface 131 communicates with the sequence control circuit 120 and the bed control circuit 106. For example, the NW interface 131 sends sequence information to the sequence control circuit 120 and receives MR data from the sequence control circuit 120.
The storage circuit 132 stores the MR data received via the NW interface 131, k-space data arranged in k-space by the processing circuit 133 described later, image data generated by the processing circuit 133, and the like. The storage circuit 132 is, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, a hard disk, an optical disc, or the like. The storage circuit 132 may also be provided outside the magnetic resonance imaging apparatus 100.
The input interface 134 receives various instructions and information input by an operator. The input interface 134 is implemented by, for example, a trackball, a switch button, a mouse, a keyboard, a touch pad on which input operations are performed by touching the operation surface, a touch screen in which a display screen and a touch pad are integrated, a non-contact input circuit using an optical sensor, a voice input circuit, and the like. The input interface 134 is connected to the processing circuit 133; it converts an input operation received from the operator into an electrical signal and outputs the electrical signal to the processing circuit 133. In this specification, the input interface is not limited to one including physical operation members such as a mouse or a keyboard; it also includes, for example, an electrical signal processing circuit that receives an electrical signal corresponding to an input operation from an external input device provided separately from the computer system 130 and outputs that signal to the processing circuit 133. The input interface 134 is an example of an operation unit in the present embodiment.
The display 135 displays a GUI (Graphical User Interface) for accepting input of imaging conditions, a magnetic resonance image (MR image) generated by the processing circuit 133, and the like under the control of the processing circuit 133. The display 135 is a display device such as a liquid crystal display. The display 135 is an example of a display unit. In addition, the display 135 may also be provided outside the magnetic resonance imaging apparatus 100.
In the present embodiment, the magnetic resonance images include a positioning image and a diagnostic image. The positioning image is also called a locator image or a scout image. The diagnostic image is captured based on the FOV (Field Of View) and the slice position that the user determines using the positioning image. The slice position is the position of the imaging cross section in the diagnostic image. In the present embodiment, determining at least the slice position on the positioning image is referred to as positioning.
The positioning image is also referred to as a main image, and a diagnostic image captured based on the positioning image is referred to as a sub-image. Hereinafter, in the present embodiment, an image simply referred to as a "magnetic resonance image" means a diagnostic image.
The processing circuit 133 controls the entire magnetic resonance imaging apparatus 100. More specifically, the processing circuit 133 includes, for example, an imaging processing function 133a, a display control function 133b, a reception function 133c, a selection function 133d, and an image processing function 133e. The imaging processing function 133a is an example of an imaging processing unit. The display control function 133b is an example of a display control unit. The reception function 133c is an example of a reception unit. The selection function 133d is an example of a selection unit. The image processing function 133e is an example of an image processing unit.
Here, the processing functions of the imaging processing function 133a, the display control function 133b, the reception function 133c, the selection function 133d, and the image processing function 133e, which are the components of the processing circuit 133, are stored in the storage circuit 132 in the form of programs executable by a computer. The processing circuit 133 is a processor: it reads out the programs from the storage circuit 132 and executes them, thereby realizing the function corresponding to each program. In other words, the processing circuit 133 in the state of having read out the programs has the functions shown within the processing circuit 133 in fig. 1. In fig. 1, the processing functions of the display control function 133b, the reception function 133c, the selection function 133d, and the image processing function 133e are described as being realized by a single processor, but the processing circuit 133 may be configured by combining a plurality of independent processors, each of which realizes a function by executing a program. Likewise, although fig. 1 assumes that a single storage circuit 132 stores the programs corresponding to the respective processing functions, a plurality of storage circuits may be arranged in a distributed manner, and the processing circuit 133 may read the corresponding program from each individual storage circuit.
In the above description, an example in which the "processor" reads out and executes a program corresponding to each function from the storage circuit has been described, but the embodiment is not limited to this. The term "processor" means, for example, a circuit such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an application specific integrated circuit (ASIC), or a programmable logic device (e.g., a simple programmable logic device (SPLD), a complex programmable logic device (CPLD), or a field programmable gate array (FPGA)). When the processor is, for example, a CPU, it realizes its functions by reading out and executing programs held in the storage circuit. When the processor is an ASIC, on the other hand, the functions are directly embedded in the circuit of the processor as logic circuits instead of being stored as programs in the storage circuit. The processors of the present embodiment are not limited to being configured as individual circuits; a plurality of independent circuits may be combined into a single processor that realizes their functions. Furthermore, a plurality of the components in fig. 1 may be integrated into a single processor that realizes their functions.
The imaging processing function 133a controls each part of the magnetic resonance imaging apparatus 100 to perform imaging of a magnetic resonance image. For example, the imaging processing function 133a executes generation of sequence information, acquisition of MR data, generation of k-space data, and generation of a magnetic resonance image. The generation of the magnetic resonance image is to perform reconstruction processing such as fourier transform on k-space data. The imaging processing function 133a may also perform various corrections on the reconstructed magnetic resonance image.
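The reconstruction step mentioned above, an inverse Fourier transform applied to k-space data, can be sketched with NumPy. This is a simplified single-slice, fully sampled case; a real scanner pipeline would add coil combination and the various corrections noted in the text:

```python
import numpy as np

def reconstruct(kspace: np.ndarray) -> np.ndarray:
    """Turn centered 2-D k-space data into a magnitude image.

    ifftshift undoes the centering, ifft2 performs the inverse
    Fourier transform, and abs discards the phase.
    """
    return np.abs(np.fft.ifft2(np.fft.ifftshift(kspace)))
```

A round trip checks the convention: simulating k-space from a known image with `fftshift(fft2(image))` and reconstructing it recovers the original magnitudes.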
In the present embodiment, the imaging processing function 133a captures a positioning image and a diagnostic image.
The display control function 133b causes the display 135 to display a plurality of diagnostic images and, in association with them, a plurality of illustration images indicating the positions of the imaging cross sections of those diagnostic images.
The illustration image is an example of the position display image in the present embodiment. An image in which an illustration image is superimposed on a diagnostic image for display is referred to as an image with an illustration. In addition to the illustration image, the reference image described later may also be an example of the position display image.
Here, the relationship among the positioning image, the diagnostic image, the reference image, the illustration image, and the image with an illustration in the present embodiment will be described by way of example.
Fig. 2 is a diagram showing an example of the positioning image 9 according to the first embodiment. Fig. 3 is a diagram showing an example of the diagnostic images 8a to 8o according to the first embodiment. The diagnostic images 8a to 8o are captured based on the result of positioning based on the positioning image 9. When the positioning image 9 is used as the main image, the diagnostic images 8a to 8o are sub-images of the positioning image 9. Hereinafter, the diagnostic images 8a to 8o are simply referred to as diagnostic images 8 without distinguishing them.
Fig. 4 is a diagram showing an example of a reference image 91 according to the first embodiment. The reference image 91 is an image in which the reference lines 6a to 6p are superimposed on the positioning image 9. Reference lines 6a to 6p indicate positions of imaging sections of the plurality of diagnostic images 8 imaged with the positioning image 9 as a main image. Hereinafter, the reference lines 6a to 6p are simply referred to as reference lines 6 without particularly distinguishing them.
Fig. 5 is a diagram showing an example of the image with illustration 7 according to the first embodiment. As shown in fig. 5, the image with illustration 7 includes the diagnostic image 8 and an illustration image 92 showing the position of the imaging cross section of the diagnostic image 8.
The illustration image 92 is smaller than the diagnostic image 8 and is displayed superimposed on it. The illustration image 92 is an image in which one reference line 6 is superimposed on the positioning image 9. Whereas the reference image 91 described with reference to fig. 4 displays the plurality of reference lines 6a to 6p indicating the positions of the imaging cross sections of the plurality of diagnostic images 8, the illustration image 92 displays only the one reference line 6 corresponding to one diagnostic image 8. The image region in which the illustration image 92 is displayed is called an illustration frame (inset frame).
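Superimposing the smaller illustration image on the diagnostic image amounts to pasting one pixel array into a region of another. A minimal grayscale NumPy sketch (the corner parameter is an assumption standing in for the inset frame position):

```python
import numpy as np

def overlay_inset(diagnostic: np.ndarray, inset: np.ndarray,
                  corner: tuple = (0, 0)) -> np.ndarray:
    """Paste the inset image into a copy of the diagnostic image at the
    given top-left corner (row, col)."""
    out = diagnostic.copy()
    r, c = corner
    h, w = inset.shape
    if r + h > out.shape[0] or c + w > out.shape[1]:
        raise ValueError("inset frame exceeds diagnostic image bounds")
    out[r:r + h, c:c + w] = inset
    return out
```

Returning a copy keeps the original diagnostic pixels intact, which matters when the same diagnostic image is displayed elsewhere without the inset.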
The positioning image 9, the diagnostic image 8, the reference image 91, and the illustration image 92 according to the present embodiment are medical images conforming to the DICOM (Digital Imaging and Communications in Medicine) standard. Each medical image can be specified by a unique image identifier (UID). The accompanying information of the reference image 91 and the illustration image 92 includes the image identifier of the positioning image 9 contained in them. The accompanying information of the illustration image 92 also includes information indicating the position of the imaging cross section corresponding to the reference line 6 displayed in the illustration image 92.
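The UID-based association can be modeled as a small registry keyed by image identifier. This is a plain-Python illustration of the bookkeeping only; the attribute names are assumptions, and real DICOM files would be read with a DICOM library rather than a dict:

```python
class ImageRegistry:
    """Associate each image with its UID and, for illustration or
    reference images, the UID of the positioning image they contain."""

    def __init__(self):
        self._entries = {}

    def register(self, uid, image, positioning_uid=None, section_position=None):
        # positioning_uid / section_position model the accompanying
        # information described in the text.
        self._entries[uid] = {
            "image": image,
            "positioning_uid": positioning_uid,
            "section_position": section_position,
        }

    def positioning_image_of(self, uid):
        """Resolve the positioning image an illustration image refers to."""
        ref = self._entries[uid]["positioning_uid"]
        return self._entries[ref]["image"] if ref else None
```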
In fig. 5, for convenience of explanation, only one image with illustration 7 is shown, but the display control function 133b can cause the display 135 to display a plurality of images with illustrations 7 simultaneously.
In the present embodiment, film imaging processing is given as an example of displaying a plurality of images with illustrations 7. The film imaging process includes image processing of the medical images to be printed on a film and adjustment of the print layout. In the present embodiment, a film imaging process in which magnetic resonance images are printed on a film will be described as an example.
The image processing in the present embodiment includes, for example, at least one of enlargement, reduction, panning, brightness change, rotation, addition of a comment (annotation), addition of an ROI (Region Of Interest), color filtering, change of the window level (WL), and change of the window width (WW). Panning refers to moving the displayed image range. These processes are examples, and the content of the image processing is not particularly limited.
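Of the listed operations, the WL/WW change is the most MRI-specific: it linearly maps the intensity band [WL − WW/2, WL + WW/2] onto the display range and clips everything outside it. A sketch of this standard windowing formula:

```python
import numpy as np

def apply_window(pixels, wl: float, ww: float, out_max: float = 255.0):
    """Linear windowing: values at wl - ww/2 map to 0, values at
    wl + ww/2 map to out_max, and everything outside is clipped."""
    lo = wl - ww / 2.0
    scaled = (np.asarray(pixels, dtype=float) - lo) / ww * out_max
    return np.clip(scaled, 0.0, out_max)
```

For example, with WL = 100 and WW = 200, raw value 0 maps to display 0, 100 to mid-gray, and anything at or above 200 saturates at 255.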
Fig. 6 is a diagram showing an example of a virtual film editing screen 50a according to the first embodiment. The display control function 133b causes the display 135 to display the virtual film editing screen 50 a.
The virtual film editing screen 50a includes a virtual film 5 and edit boxes 51a to 51c.
On the virtual film 5, a plurality of images with illustrations 7a to 7k and the reference image 91 to be printed on the film are arranged in accordance with user operations. Hereinafter, the images with illustrations 7a to 7k are simply referred to as images with illustrations 7 when they are not distinguished. Similarly, when the illustration images 92a to 92k are not distinguished, they are simply referred to as illustration images 92.
Note that reference numerals of the diagnostic images 8 are omitted in fig. 6. On the virtual film editing screen 50a shown in fig. 6, each image with illustration 7 includes a diagnostic image 8 and an illustration image 92 showing the position of the imaging cross section of that diagnostic image 8, as in fig. 5.
In the present embodiment, placing an image with illustration 7 on the virtual film 5 is referred to as registering the image with illustration 7 on the virtual film 5. In addition to the images with illustrations 7, the reference image 91 or a diagnostic image 8 to which no illustration is attached may also be registered on the virtual film 5.
The illustration images 92a to 92k registered on the virtual film 5 shown in fig. 6 display the positions of the imaging cross sections of the diagnostic images 8 included in the respective images with illustrations 7a to 7k, in the positioning image 9 that was used for positioning those diagnostic images 8.
The user can perform image processing operations on the plurality of illustration images 92a to 92k included in the plurality of images with illustrations 7a to 7k on the virtual film 5 shown in fig. 6.
The user can input the values of parameters used for image processing into the edit boxes 51a to 51c. In the present embodiment, the parameter values that can be input into the edit boxes 51a to 51c are numerical values: the window level can be input into the edit box 51a, the window width into the edit box 51b, and the magnification into the edit box 51c. That is, the values of the parameters that define the content of the image processing corresponding to the edit boxes 51a to 51c can be set as numerical values.
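Since the edit boxes accept only numeric parameter values, the reception side needs a small validation step before applying anything. A hedged sketch; the range limits are assumptions for illustration, not values from the patent:

```python
def parse_parameter(text: str, lo: float, hi: float):
    """Parse an edit-box string into a numeric parameter value.
    Returns None when the text is not a number or is out of range."""
    try:
        value = float(text)
    except ValueError:
        return None
    return value if lo <= value <= hi else None
```

For example, a magnification entry of "150.0" would parse to 150.0, while non-numeric or out-of-range input is rejected rather than silently clamped.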
The number of edit boxes 51a to 51c and the types of parameters shown in fig. 6 are examples and are not limiting. Hereinafter, the edit boxes 51a to 51c are simply referred to as edit boxes 51 when they are not distinguished. The check box 54 labeled "Select inset frame" in fig. 6 switches between enabling and disabling the function of collectively selecting a plurality of illustration images 92. When the check box 54 is checked and, for example, a region including a plurality of images with illustrations 7 is selected by a mouse drag operation, only the plurality of illustration images 92 in the selected region are selected, not the diagnostic images 8. When the check box 54 is unchecked, conversely, only the plurality of diagnostic images 8 included in the region selected by the drag operation are selected, not the illustration images 92.
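The behavior of the check box 54 can be pictured as a filter over the frames hit by the drag rectangle: only frames of the active kind that lie inside the dragged region are selected. A geometric sketch; the frame records and the full-containment test are assumptions for illustration:

```python
def select_frames(frames, region, inset_only: bool):
    """frames: list of (kind, rect); kind is 'illustration' or 'diagnostic';
    rect and region are (x, y, width, height) tuples. A frame is selected
    when it lies fully inside the dragged region and matches the kind
    chosen by the check box."""
    rx, ry, rw, rh = region
    kind_wanted = "illustration" if inset_only else "diagnostic"
    selected = []
    for kind, (x, y, w, h) in frames:
        inside = rx <= x and ry <= y and x + w <= rx + rw and y + h <= ry + rh
        if kind == kind_wanted and inside:
            selected.append((kind, (x, y, w, h)))
    return selected
```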
Returning to fig. 1, the reception function 133c receives various operations by the user via the input interface 134. For example, the reception function 133c receives an operation of a user for registering the illustration image 7 on the virtual film 5.
The reception function 133c receives a user operation designating, as selection targets, a plurality of illustration images 92 among the plurality of illustration images 92 displayed on the display 135.
Fig. 7 is a diagram for explaining an example of the operation of specifying a selection target according to the first embodiment. In the example shown in fig. 7, among the plurality of illustration images 92a to 92k displayed on the display 135, the plurality of illustration images 92a to 92g are specified as selection targets by a user operation.
The number of illustration images designated as selection targets is not particularly limited. A single illustration image 92 may be designated, or all of the illustration images 92a to 92k displayed on the display 135 may be designated. Not only the illustration images 92 but also the reference image 91 may be designated as a selection target.
The receiving function 133c notifies the selection function 133d, which will be described later, of the plurality of illustration images 92 designated as selection targets by the received user operation.
The reception function 133c receives image processing corresponding to a user operation. More specifically, the reception function 133c receives a user operation instructing image processing on the plurality of illustration images 92 selected by the selection function 133d described later.
One user operation instructing image processing is, for example, inputting a parameter value into an edit box 51. In the present embodiment, the parameter values are numerical; the reception function 133c receives image processing corresponding to at least one numerical value input by the user as a parameter value.
The reception function 133c also accepts image processing corresponding to the amount of operation of the input interface 134 performed by the user. For example, when the user moves the mouse while pressing a predetermined button on a selected illustration image 92, the reception function 133c accepts an enlargement operation of the illustration image 92, and the magnification is determined according to the amount of movement. The image processing designated by the user may differ depending on the direction of movement; for example, upward movement may instruct enlargement and downward movement reduction. Similarly, when the user moves the mouse while pressing a predetermined button different from the one used for enlargement, the reception function 133c accepts a trimming operation corresponding to the direction of movement. The assignment of mouse buttons to operations may be changeable by the user.
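The movement-amount-to-magnification mapping might look like the following. The sensitivity constant and the clamping are assumptions for illustration, not values from the patent:

```python
def drag_to_scale(dy: float, sensitivity: float = 0.005) -> float:
    """Map vertical mouse travel (screen coordinates: negative = upward)
    to a zoom factor: upward drags enlarge, downward drags shrink."""
    scale = 1.0 - dy * sensitivity
    return max(scale, 0.1)  # clamp so the image never collapses to zero
```

With this mapping, dragging 100 pixels upward yields a 1.5x enlargement and 100 pixels downward a 0.5x reduction; very long downward drags bottom out at the clamp.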
The reception function 133c transmits the content of the operation performed by the user to the image processing function 133e described later. For example, the reception function 133c transmits the parameter values of the image processing input by the user into the edit box 51 to the image processing function 133e described later. The reception function 133c transmits the content of image processing corresponding to the amount of operation of the input interface 134 by the user to the image processing function 133e described later.
Returning to fig. 1, the selection function 133d selects a plurality of illustration images 92a to 92g from among the plurality of illustration images 92a to 92k displayed on the display 135. The selection function 133d of the present embodiment selects the plurality of illustration images 92a to 92g specified as selection targets by the user. The plurality of illustration images 92a to 92k are an example of the plurality of first position display images in the present embodiment. The illustration images 92a to 92g are an example of the plurality of second position display images in the present embodiment. When the reference image 91 is included in the selection targets, the selection function 133d also selects the reference image 91.
The image processing function 133e collectively applies image processing based on the user's operation to the selected plurality of illustration images 92a to 92g or the reference image 91.
For example, in the example shown in fig. 7, among the plurality of illustration images 92a to 92k displayed on the display 135, the plurality of illustration images 92a to 92g are specified as selection targets by a user operation. When the user performs an operation of image processing in this state, the image processing function 133e applies the image processing to the plurality of illustration images 92a to 92g collectively. More specifically, in fig. 7, the user inputs a magnification of "150.0%" in the edit box 51 c.
In this case, the image processing function 133e simultaneously performs the enlargement processing of "150.0%" on the illustration images 92a to 92g. Here, the image processing function 133e does not perform the enlargement processing of "150.0%" on the unselected illustration images 92h to 92k among the illustration images 92a to 92k displayed on the display 135, nor on the reference image 91.
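The selective, collective application described above can be sketched as a pure function over per-image magnification values. This is an illustrative model only; the dictionary representation and the function name `apply_collectively` are assumptions, not from the patent.

```python
def apply_collectively(magnifications, selected_ids, new_magnification):
    """Return a new mapping of image id to magnification (%), applying
    the new magnification only to the selected images; unselected
    images and the reference image keep their current values."""
    return {
        image_id: new_magnification if image_id in selected_ids else current
        for image_id, current in magnifications.items()
    }
```

With images 92a to 92g selected and 150.0 entered, only those seven entries change; 92h to 92k and the reference image 91 are untouched.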
Although one version of the virtual film 5 is shown in fig. 6 and 7, the virtual film editing screen 50a may include a plurality of versions of the virtual film 5. For example, the display control function 133b may change the version to be displayed in accordance with a user operation such as a scroll operation or selection of a version number. In the present embodiment, the illustration images 92 that the user can specify as selection targets are the illustration images 92 registered in the version displayed on the virtual film editing screen 50a. If the user desires image processing on an illustration image 92 registered in a version not displayed on the virtual film editing screen 50a, the user performs an operation to display that other version. When the reception function 133c receives the operation, the display control function 133b causes the display 135 to display the version designated by the user. In this case, as described with reference to fig. 6 and 7, the user can designate a desired illustration image 92 as a selection target in the newly displayed version and perform an image processing operation on the designated illustration image 92.
In fig. 7, the case where image processing is collectively performed on the plurality of illustration images 92a to 92g is illustrated, but the user may also perform operations that apply different image processing to each illustration image 92 individually. In the present embodiment, image processing on the illustration images 92 has mainly been described, but the user may also perform an image processing operation on the diagnostic image 8 included in the image with illustration 7. For example, when a plurality of diagnostic images 8 are designated as selection targets by the user, the image processing function 133e may collectively perform image processing on the plurality of diagnostic images 8.
Next, a flow of image processing executed in the magnetic resonance imaging apparatus 100 of the present embodiment configured as described above will be described.
Fig. 8 is a flowchart showing an example of the flow of image processing according to the first embodiment. As a premise of the processing of this flowchart, it is assumed that the positioning image 9 and the plurality of diagnostic images 8 based on the positioning image 9 have been captured. The positioning image 9 and the diagnostic image 8 are stored in the storage circuit 132, for example.
For example, the reception function 133c determines whether or not an operation by the user to start the film creation process is received (S1). When the operation to start the film creation process is not received (no in S1), the reception function 133c repeats the process of S1.
When the reception function 133c receives the operation to start the film creation process (yes in S1), it notifies the display control function 133b of the operation. In this case, the display control function 133b causes the display 135 to display the virtual film editing screen 50a (S2).
The reception function 133c then determines whether or not a user operation specifying the image with illustration 7 to be registered on the virtual film 5 has been received (S3). When the user operation specifying the image with illustration 7 to be registered is not received (no in S3), the reception function 133c repeats the process of S3.
When the reception function 133c receives a user operation specifying the image with illustration 7 to be registered (yes in S3), the display control function 133b registers the selected image with illustration 7 on the virtual film (S4). The reception function 133c may also receive a medical image other than the image with illustration 7 as a registration target. Here, as shown in fig. 6 and 7, it is assumed that the images with illustration 7a to 7k and the reference image 91 are registered.
Then, the reception function 133c determines whether or not a user operation specifying the selection targets is received (S5). When the user operation specifying the selection targets is not received (no in S5), the reception function 133c repeats the process of S5. Here, as shown in fig. 7, it is assumed that, of the illustration images 92a to 92k included in the images with illustration 7a to 7k, the illustration images 92a to 92g are specified as selection targets in accordance with a user operation. In this case, the reception function 133c determines that the user operation specifying the selection targets is received (yes in S5).
The selection function 133d then selects the medical image specified as the selection target by the user (S6). Here, the selection function 133d selects the illustration images 92a to 92g specified as selection targets.
Then, the reception function 133c determines whether or not an image processing operation by the user is received (S7). When the reception function 133c does not receive the image processing operation (no in S7), the process of S7 is repeated. Here, as shown in fig. 7, it is assumed that, as the image processing operation, the user inputs a magnification of "150.0%" in the edit box 51c.
Then, the image processing function 133e collectively applies the image processing to the selected medical images (S8). Here, the image processing function 133e performs the enlargement processing of "150.0%" on the selected illustration images 92a to 92g.
The reception function 133c then determines whether or not an operation by the user to end the film creation process is received (S9). When the operation to end the film creation process is not received (no in S9), the reception function 133c returns to the process of S5. When the reception function 133c receives the operation to end the film creation process (yes in S9), the process of this flowchart ends.
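The S1 to S9 flow above can be modeled as a small event loop. The event-tuple API below is purely illustrative (the patent describes a GUI, not this interface); the function name and event names are assumptions.

```python
def film_creation_flow(events):
    """Minimal sketch of the S1-S9 flow: wait for the start operation,
    register images, then loop select -> process until an 'end' event."""
    state = {"registered": [], "selected": [], "applied": []}
    it = iter(events)
    # S1: wait for the user's operation to start the film creation process
    for ev in it:
        if ev == ("start",):
            break
    # S2: display the editing screen (omitted here); S3/S4: register images
    for ev in it:
        if ev[0] == "register":
            state["registered"] = list(ev[1])
            break
    # S5-S9: repeat selecting targets and collectively applying processing
    for ev in it:
        if ev == ("end",):
            break
        if ev[0] == "select":
            state["selected"] = list(ev[1])
        elif ev[0] == "process":
            state["applied"].append((tuple(state["selected"]), ev[1]))
    return state
```

Note that, as in the flowchart, a `process` event only affects the images chosen by the most recent `select` event.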
In this way, the magnetic resonance imaging apparatus 100 of the present embodiment collectively applies image processing to the selected plurality of illustration images 92. Therefore, according to the magnetic resonance imaging apparatus 100 of the present embodiment, the user does not need to perform image processing on each illustration image 92 individually, so the workload of the user when editing a plurality of illustration images 92 can be reduced.
The magnetic resonance imaging apparatus 100 of the present embodiment selects the plurality of illustration images 92 specified by the user as selection targets and collectively applies image processing to the selected plurality of illustration images 92. Therefore, according to the magnetic resonance imaging apparatus 100 of the present embodiment, the user can collectively perform image processing on any of the plurality of illustration images 92. This can reduce variations in the results of image processing compared with the case where the user performs image processing on each illustration image 92 individually.
The magnetic resonance imaging apparatus 100 of the present embodiment receives image processing corresponding to at least one numerical value input by the user. Therefore, when the user performs image processing on a plurality of illustration images 92 included in different versions, inputting the same numerical value yields image processing of the same content, reducing work that depends on visual inspection and the feel of the operation.
The magnetic resonance imaging apparatus 100 of the present embodiment receives image processing corresponding to the amount of operation of the input interface 134 by the user. Therefore, according to the magnetic resonance imaging apparatus 100 of the present embodiment, image processing based on an intuitive operation by the user can be collectively applied to the selected plurality of illustration images 92.
(second embodiment)
In the first embodiment described above, the illustration images to be selected are manually specified by the user and selected as targets of the collective application of image processing. In the second embodiment, the selection of illustration images is performed automatically.
The magnetic resonance imaging apparatus 100 of the present embodiment has the same hardware configuration as that of the first embodiment. The processing circuit 133 of the present embodiment includes an imaging processing function 133a, a display control function 133b, a reception function 133c, a selection function 133d, and an image processing function 133e, as in the first embodiment. The imaging processing function 133a and the image processing function 133e have the same functions as those of the first embodiment.
The receiving function 133c of the present embodiment receives a user operation for specifying at least one illustration image 92 among a plurality of illustration images 92 displayed on the display 135, in addition to the same functions as those of the first embodiment.
The selection function 133d of the present embodiment selects illustration images 92 including the same positioning image 9 as the positioning image 9 included in the illustration image 92 designated by the user, in addition to having the same functions as those of the first embodiment. That is, in the present embodiment, in addition to the illustration image 92 itself manually designated by the user, the selection function 133d automatically selects the illustration images 92 including the same positioning image 9 as that of the designated illustration image 92.
Further, among the illustration images 92 registered on the virtual film 5, the selection function 133d of the present embodiment also selects an illustration image 92 included in a version not displayed on the display 135 at the time of the user's selection, provided that it includes the same positioning image 9 as the positioning image 9 included in the illustration image 92 designated by the user. The illustration images 92 included in versions not displayed on the display 135 at the time of the user's selection are an example of the plurality of illustration images 92 not displayed on the display 135 in the present embodiment. That is, the selection function 133d of the present embodiment selects, from among the illustration images 92 registered on the virtual film 5, the illustration images 92 including the same positioning image 9 as the positioning image 9 included in the illustration image 92 designated by the user, regardless of whether those illustration images 92 are displayed on the display 135 at the time of the user's selection.
Fig. 9 is a diagram showing an example of a virtual film editing screen 50b according to the second embodiment.
The display control function 133b of the present embodiment causes the display 135 to display the virtual film editing screen 50b. When operations such as enlargement can be performed directly on an image in accordance with a user operation, the virtual film editing screen 50b need not include the edit box 51.
In the example shown in fig. 9, the virtual film editing screen 50b includes two versions of the virtual film, 5a and 5b. When the first version of the virtual film 5a and the second version of the virtual film 5b need not be particularly distinguished, they are simply referred to as the virtual film 5. The two versions of the virtual films 5a and 5b may be displayed simultaneously on the virtual film editing screen 50b, or may be displayed one version at a time.
On the virtual films 5a and 5b of the virtual film editing screen 50b, a plurality of images with illustration 71 and reference images 91a to 91e included in five series 40a to 40e are registered. A series is a unit representing a group of a plurality of medical images. In general, one examination includes a plurality of series, and one series includes a plurality of medical images.
Each of the series 40a to 40d includes images with illustration 71 including diagnostic images 8 captured based on the same positioning image 9. The reference images 91a to 91d included in the series 40a to 40d are images obtained by superimposing, on that same positioning image 9, a plurality of reference lines 6 indicating the imaging cross-sectional positions in each series.
For example, the images with illustration 71a to 71g included in the series 40a include diagnostic images 8 captured at cross-sectional positions corresponding to the plurality of reference lines 6 drawn on the reference image 91a. The images with illustration 71a to 71g include illustration images 921a to 921g including the same positioning image 9 as the reference image 91a. The illustration images 921a to 921g each include one reference line 6 indicating the imaging cross-sectional position of the diagnostic image 8 included in the corresponding image with illustration 71a to 71g.
The series 40e includes images with illustration 71 including diagnostic images 8 captured based on a positioning image 9 different from that of the series 40a to 40d. The series 40e includes a reference image 91e in which the reference lines 6 are superimposed on a positioning image 9 different from that of the reference images 91a to 91d included in the series 40a to 40d.
For example, when the user designates the illustration image 921a and performs an image processing operation for enlarging the illustration image 921a, the selection function 133d selects the illustration images 921 including the same positioning image 9 as the positioning image 9 included in the illustration image 921a.
In the example shown in fig. 9, the illustration images 921 including the same positioning image 9 as the positioning image 9 included in the illustration image 921a are all of the illustration images 921 included in the series 40a to 40d. The selection function 133d also selects the reference images 91a to 91d, which include the same positioning image 9 as the positioning image 9 included in the illustration image 921a.
When a plurality of versions of the virtual film 5 are included in the virtual film editing screen 50b, the selection function 133d specifies selection targets from among the illustration images 921 and the reference images 91 included in all the versions. For example, when, as shown in fig. 9, illustration images 921 or reference images 91 including the same positioning image 9 as that included in the illustration image 921a are registered across a plurality of versions, the selection function 133d selects those illustration images 921 or reference images 91 from the plurality of versions.
The selection function 133d determines whether each illustration image 921 and reference image 91 registered on the virtual film 5 includes the same positioning image 9 as the positioning image 9 included in the illustration image 921a designated by the user, based on the image identifier of the positioning image 9 included in the incidental information of each illustration image 921 and each reference image 91. Specifically, the selection function 133d selects each illustration image 921 and reference image 91 whose incidental information includes an image identifier of the positioning image 9 identical to the image identifier of the positioning image 9 included in the incidental information of the illustration image 921a designated by the user.
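The identifier-matching step above amounts to filtering by a shared positioning-image identifier. The sketch below models each image's incidental information as a dict with a hypothetical `positioning_uid` key; both the key name and the function name are assumptions for illustration.

```python
def select_same_positioning(designated, registered):
    """Select every registered illustration or reference image whose
    incidental information carries the same positioning-image
    identifier as the designated image."""
    uid = designated["positioning_uid"]
    return [img for img in registered if img.get("positioning_uid") == uid]
```

Images whose incidental information carries a different identifier (such as those of the series 40e in fig. 9) simply fall out of the result.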
Since the series 40e does not include any illustration image 921 or reference image 91 including the same positioning image 9 as the positioning image 9 included in the user-designated illustration image 921a, the selection function 133d does not select the illustration images 921 or the reference image 91 of the series 40e.
When the user designates a reference image 91, the selection function 133d selects the illustration images 921 and reference images 91 to which the image processing is applied from among the illustration images 921 and reference images 91 registered on the virtual film 5, as in the case where the illustration image 921a is designated. The illustration images 921 and reference images 91 to which the image processing is applied are those whose incidental information includes an image identifier of the positioning image 9 identical to the image identifier of the positioning image 9 included in the incidental information of the reference image 91 designated by the user.
In the present embodiment, when the user designates the illustration image 921a, the selection function 133d sets only the incidental information of the illustration images 921 and the reference images 91 as the search target, and excludes the incidental information of the diagnostic images 8 registered on the virtual film 5 from the search target.
Fig. 10 is a flowchart showing an example of the flow of image processing according to the second embodiment. The processing from the determination of receiving the operation to start the film creation process at S1 through the registration on the virtual film at S4 is the same as the processing of the first embodiment described with reference to fig. 8.
The reception function 133c determines whether or not a user operation designating the illustration image 921 or the reference image 91 on which image processing is to be performed is received (S101). When determining that the user operation designating the illustration image 921 or the reference image 91 is not received (no in S101), the reception function 133c repeats the process of S101.
When the reception function 133c determines that the user operation designating the illustration image 921 or the reference image 91 on which image processing is to be performed has been received (yes in S101), the selection function 133d selects the illustration images 921 or reference images 91 including the same positioning image 9 as the positioning image 9 included in the designated illustration image 921 or reference image 91 (S102).
Then, the image processing function 133e collectively applies image processing based on the user operation to the illustration images 921 or the reference images 91 selected by the selection function 133d (S8). The processing from the determination of ending the film creation process at S9 is the same as the processing of the first embodiment described with reference to fig. 8.
In this way, the magnetic resonance imaging apparatus 100 of the present embodiment receives a user operation designating at least one illustration image 921 among the plurality of illustration images 921 displayed on the display 135, and selects the illustration images 921 including the same positioning image 9 as the positioning image 9 included in the illustration image 921 designated by the user. Therefore, according to the magnetic resonance imaging apparatus 100 of the present embodiment, in addition to the same effects as those of the first embodiment, the same image processing can be collectively performed on the illustration images 921 including the same positioning image 9 even if the user does not manually select each illustration image 921.
In general, the illustration images 921 on the display 135 are small in display size and their incidental information is not displayed; however, according to the magnetic resonance imaging apparatus 100 of the present embodiment, the illustration images 921 to be subjected to image processing can be selected automatically even if the user does not visually search for the illustration images 921 including the same positioning image 9.
Further, when the user manually specifies the selection targets, the candidate illustration images 921 must be displayed on the display 135 as a precondition; in contrast, when the illustration images 921 are selected automatically, illustration images 921 registered in versions not currently displayed can also be selected, so the illustration images 921 to be subjected to image processing can be selected across a plurality of versions.
(modification 1)
In each of the above embodiments, the illustration images 92, 921 or the reference images 91 already registered on the virtual film 5 are the targets of the collective image processing; however, image processing performed on an illustration image 92 or a reference image 91 newly registered on the virtual film 5 may also be collectively applied to the illustration images 92 or the reference images 91 already registered on the virtual film 5.
Fig. 11 is a diagram showing an example of a virtual film editing screen 50c according to a first modification. The virtual film editing screen 50c of the present modification is a screen on which a plurality of diagnostic images 8 and illustration images 92 printed on a film can be edited. The display control function 133b of the present modification causes the display 135 to display the virtual film editing screen 50 c.
More specifically, the virtual film editing screen 50c of the present modification includes a virtual film 5c, a main frame 52, and a selection field 53 for registration candidate images.
In the selection field 53 for registration candidate images, for example, a plurality of diagnostic images 8 are arranged and displayed in a matrix defined by series and slice position. The user selects one or more diagnostic images 8 among the plurality of diagnostic images 8 displayed in the selection field 53 for registration candidate images and then presses the registration button 501, thereby designating the selected diagnostic images 8 as targets to be registered on the virtual film 5c.
The reception function 133c of the present modification receives a user operation for instructing image processing on the illustration image 92 newly registered on the virtual film 5c on the virtual film editing screen 50 c.
In the example shown in fig. 11, images with illustration 7 and a reference image 91 corresponding to the diagnostic images 8 in the selection field 53 for registration candidate images are registered on the virtual film 5c; each image with illustration 7 includes a diagnostic image 8 designated as a registration target in the selection field 53 for registration candidate images and an illustration image 92 corresponding to that diagnostic image 8.
The main frame 52 is an image area in which the user can edit the various medical images already registered on the virtual film 5c or a medical image newly registered on the virtual film 5c.
In the example shown in fig. 11, the main frame 52 displays an image with illustration 7 that includes a diagnostic image 8 designated as a registration target in the selection field 53 for registration candidate images and an illustration image 92 corresponding to that diagnostic image 8. This image with illustration 7 is a medical image newly registered on the virtual film 5c. The display control function 133b refers to the image identifier of the positioning image 9 used for capturing the diagnostic image 8, which is included in the incidental information of the diagnostic image 8, and displays the illustration image 92 corresponding to that image identifier in the main frame 52 of the display 135, thereby displaying the image with illustration 7.
When the user performs an image processing operation on the illustration image 92 displayed on the main frame 52, the selection function 133d of the present modification selects the illustration image 92 or the reference image 91 including the same positioning image 9 as the positioning image 9 included in the newly registered illustration image 92, from among the illustration images 92 or the reference images 91 already registered in the virtual film 5 c.
The method of selection can be, for example, a search based on the image identifier of the positioning image 9, as in the second embodiment. When the virtual film 5c has a plurality of versions, the selection function 133d selects the corresponding illustration image 92 or reference image 91 from all the versions, as in the second embodiment. In addition, the method of selection may also be selected by the user as in the first embodiment.
The image processing function 133e of the present modification collectively applies the image processing received from the user to the newly registered illustration image 92 and the illustration images 92 selected by the selection function 133d. The user may also be able to set whether or not the image processing operation performed in the main frame 52 is applied collectively.
Note that, although fig. 11 takes image processing on a newly registered illustration image 92 as an example, when image processing is performed on a newly registered reference image 91, the image processing can likewise be applied to the illustration images 92 or reference images 91 that include the same positioning image 9 as the positioning image 9 included in the newly registered reference image 91, among the illustration images 92 or reference images 91 already registered on the virtual film 5c.
According to the magnetic resonance imaging apparatus 100 of the present modification, the image processing on a newly registered illustration image 92 can be collectively applied to the illustration images 92 or reference images 91 already registered on the virtual film 5c; therefore, in addition to the effects of the above-described embodiments, the effort of the user to edit one by one each illustration image 92 or reference image 91 already registered on the virtual film 5c is reduced.
When the user performs an image processing operation on the illustration image 92 displayed in the main frame 52, the user may be allowed to select whether to apply the image processing only to the newly registered illustration image 92 or to apply it collectively also to the illustration images 92 or reference images 91 already registered on the virtual film 5c.
For example, the display control function 133b may display, on the virtual film editing screen 50c, a first operation unit with which the user can specify that the image processing operation on the illustration image 92 displayed in the main frame 52 applies only to the newly registered illustration image 92, and a second operation unit with which the user can specify that the operation is also collectively applied to the illustration images 92 or reference images 91 already registered on the virtual film 5c.
(modification 2)
In the second embodiment described above, the selection function 133d selects the illustration image 92 including the same positioning image 9 as the positioning image 9 included in the illustration image 92 on which the user has performed the image processing operation, but may add a condition to the selection.
The selection function 133d according to the present modification does not select an illustration image that does not satisfy a predetermined condition from among illustration images 92 including the same positioning images 9 as the positioning images 9 included in the illustration image 92 designated by the user.
For example, the predetermined condition is that the position of the imaging section is within a predetermined distance from the position of the imaging section of the illustration image 92 selected by the user.
The distance between imaging sections may be the distance, on the illustration image 92, between the reference line 6 of the illustration image 92 selected by the user and the reference line 6 of an illustration image 92 registered on the virtual film 5, or may be that distance converted into a distance on the body of the subject P being imaged. For example, the selection function 133d may use the distance between the plan centers of the diagnostic images 8 corresponding to the respective illustration images 92 as the distance between the imaging sections. The plan center of a diagnostic image 8 is the center of the imaging range set based on the positioning for capturing that diagnostic image 8. The selection function 133d may calculate the distance between the imaging sections from the information on the imaging section positions registered in the incidental information of each illustration image 92.
The length of the predetermined distance is not limited, and may be, for example, a length to the extent that the imaging target region drawn on the diagnostic image 8 changes. The predetermined distance may be changeable by a user.
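The distance condition described above can be sketched as a filter over plan centers. The `plan_center` key (a coordinate in mm) and the 30 mm default threshold are illustrative assumptions, not values from the patent; as noted above, the threshold would be user-changeable.

```python
import math

def within_condition(designated, candidates, max_distance_mm=30.0):
    """Keep only candidates whose imaging-section position (modeled as
    the plan center in patient coordinates, mm) lies within
    max_distance_mm of the designated image's plan center."""
    def distance(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    center = designated["plan_center"]
    return [c for c in candidates
            if distance(c["plan_center"], center) <= max_distance_mm]
```

In practice this filter would run after the positioning-image identifier match, further narrowing the images that receive the collective processing.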
For example, even when the reference lines 6 of a plurality of illustration images 92 including the same positioning image 9 are located far from each other, if the magnification of one illustration image 92 is applied to another illustration image 92, the reference line 6 of the other illustration image 92 may move outside the illustration frame and no longer be displayed. The selection function 133d can reduce such a phenomenon by selecting only the illustration images 92 whose imaging section positions are within a predetermined distance from the imaging section position of the illustration image 92 selected by the user.
When the imaging range of the positioning image 9 is wide, a plurality of imaging target regions may be included in the positioning image 9. In this case, it may not be appropriate to apply image processing intended for one region to another region. The selection function 133d can therefore appropriately restrict the targets to which the image processing is collectively applied by selecting only the illustration images 92 whose imaging section positions are within a predetermined distance from the imaging section position of the illustration image 92 selected by the user.
The predetermined condition may also differ depending on the type of image processing. For example, the user may set whether the predetermined condition applies for each type of image processing, such as enlargement, reduction, adjustment of brightness, rotation, addition of a comment, addition of an ROI, color filtering, change of the window level, or change of the window width. For example, the predetermined condition described above may be enabled for enlargement processing as a default setting, while no condition is set for the window level and the window width. Further, the length of the distance between imaging sections used as the predetermined condition may differ depending on the type of image processing.
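One way to realize per-processing-type conditions is a lookup table of thresholds, as in the sketch below. The table contents and names are hypothetical defaults for illustration, not values from the disclosure:

```python
# Hypothetical default settings: distance threshold per processing type,
# or None when no distance condition is imposed (as suggested for the
# window level and window width).
DISTANCE_CONDITION = {
    "enlargement": 10.0,   # condition enabled by default
    "rotation": 10.0,
    "window_level": None,  # no distance condition
    "window_width": None,
}

def condition_applies(processing_type, distance):
    """True if an image at `distance` from the user-selected section
    may still receive this processing type uniformly. Unknown types
    default to no condition."""
    threshold = DISTANCE_CONDITION.get(processing_type)
    return threshold is None or distance <= threshold

print(condition_applies("enlargement", 25.0))   # False
print(condition_applies("window_width", 25.0))  # True
```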
The predetermined condition is not limited to the above example. For example, having the same imaging target region may be used as the predetermined condition.
(modification 3)
In each of the above embodiments, the example in which the film imaging process is executed in the magnetic resonance imaging apparatus 100 has been described, but the film imaging process may be executed in an information processing apparatus other than the magnetic resonance imaging apparatus 100.
The information processing device other than the magnetic resonance imaging apparatus 100 is, for example, a workstation or a terminal for a doctor, but is not limited thereto. In the present modification, an information processing apparatus other than the magnetic resonance imaging apparatus 100 is an example of an image processing apparatus.
The functions of the magnetic resonance imaging apparatus 100 described in the above embodiments may also be realized in a distributed manner across a cloud environment, a central server apparatus, and an operation terminal.
(modification 4)
In the above embodiments, the film imaging process was described as an example, but the functions of the display control function 133b, the reception function 133c, the selection function 133d, and the image processing function 133e may be applied to processes other than the film imaging process. For example, these functions can be applied to various viewing or editing apparatuses for medical images.
(modification 5)
In the above-described embodiments, the magnetic resonance image captured by the magnetic resonance imaging apparatus 100 was taken as the processing target, but the above-described embodiments may also be applied to medical images captured by other modalities.
(modification 6)
In each of the above embodiments, the same image processing is applied both to the illustration image 92 or 921 on which the user performed the image-processing operation and to the other illustration images 92 and 921 selected by user operation or automatic processing, but the timing of application is not limited to this.
For example, the user may first perform image processing on one illustration image 92 or 921, and the image processing may then be applied to the other illustration images 92 and 921 afterward. More specifically, suppose the user has performed an image-processing operation on the illustration image 921a shown in fig. 9. In this case, after the image processing is determined, the image processing function 133e may apply it to the illustration images 921 or the reference images 91 that contain the same positioning image 9 as the one contained in the illustration image 921a. The determination of the image processing is realized, for example, by the operator selecting a processing determination button (not shown) displayed on the display 135 using the input interface 134.
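This deferred application can be sketched as a small session object that buffers the operations performed on the edited image and replays them on the related images once the processing is determined. The class and field names are assumptions for illustration only:

```python
class ImageProcessingSession:
    """Buffers image-processing operations performed on one illustration
    image and applies them to related images only after the user
    confirms (e.g. via a processing-determination button)."""

    def __init__(self):
        self.pending = []  # operations recorded on the edited image

    def record(self, operation):
        """Store one operation (a callable taking an image)."""
        self.pending.append(operation)

    def confirm(self, related_images):
        """Replay all buffered operations on each related image,
        then clear the buffer."""
        for image in related_images:
            for op in self.pending:
                op(image)
        self.pending.clear()

session = ImageProcessingSession()
# User enlarges illustration image 921a; nothing else changes yet.
session.record(lambda img: img.__setitem__("zoom", 2.0))
# On confirmation, the same processing reaches the related images.
others = [{"id": "921b"}, {"id": "91"}]
session.confirm(others)
print([img.get("zoom") for img in others])  # [2.0, 2.0]
```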
In addition, various data used in the above-described embodiments of the present specification are typically digital data.
The image pickup processing function 133a, the display control function 133b, the reception function 133c, the selection function 133d, and the image processing function 133e in the above embodiments may be realized not only by the processing circuit 133 described in the embodiments but also by hardware alone, by software alone, or by a combination of hardware and software.
According to at least one embodiment described above, when editing a plurality of illustration images, the workload of the user can be reduced.
Several embodiments have been described, but they are presented as examples and are not intended to limit the scope of the invention. These embodiments may be implemented in various other forms, and various omissions, substitutions, changes, and combinations may be made without departing from the spirit of the invention. Such embodiments and modifications are included in the scope and gist of the invention, and are likewise included in the invention described in the claims and its equivalents.
Claims (13)
1. An image processing apparatus includes:
a display control unit that causes a display unit to display a plurality of magnetic resonance images and a plurality of first position display images in correspondence with each other, the plurality of first position display images indicating positions of imaging sections of the plurality of magnetic resonance images;
a selection unit that selects a plurality of second position display images included in the plurality of first position display images;
a receiving unit configured to receive a user operation instructing image processing for the selected plurality of second position display images;
and an image processing unit configured to apply the image processing to the selected plurality of second position display images collectively.
2. The image processing apparatus according to claim 1,
the receiving unit receives the user operation for designating a plurality of second position display images included in a plurality of first position display images displayed on the display unit as selection targets,
the selection unit selects the plurality of second position display images specified as selection targets by the user.
3. The image processing apparatus according to claim 1,
the first position display images and the second position display images are images in which the positions of imaging cross sections of the magnetic resonance images are displayed in a positioning image used for positioning the magnetic resonance images,
the receiving unit receives the user operation specifying at least one position display image among the plurality of first position display images displayed on the display unit,
the selection unit selects, as the second position display image, a position display image including a positioning image that is the same as a positioning image included in the position display image specified by the user, from among at least one of the plurality of first position display images displayed on the display unit and the plurality of first position display images not displayed on the display unit.
4. The image processing apparatus according to claim 3,
the selection unit does not select a position display image that does not satisfy a predetermined condition from among position display images including the same positioning image as the positioning image included in the position display image specified by the user.
5. The image processing apparatus according to claim 4,
the predetermined condition is that a position of an imaging section is within a predetermined distance from a position of an imaging section of the position display image selected by the user.
6. The image processing apparatus according to claim 4,
the prescribed condition differs depending on the type of the image processing.
7. The image processing apparatus according to claim 3,
the display control unit causes the display unit to display a virtual film editing screen capable of editing the plurality of magnetic resonance images and the plurality of first position display images printed on the film,
the accepting unit accepts a user operation instructing image processing for a position display image newly registered in the virtual film on the virtual film editing screen,
the selection unit selects a position display image including a positioning image identical to a positioning image included in the newly registered position display image from among the plurality of first position display images registered in the virtual film,
the image processing unit applies the image processing to the newly registered position display image and the position display image selected by the selection unit collectively.
8. The image processing apparatus according to claim 1,
the receiving unit receives the image processing corresponding to at least one numerical value input by the user.
9. The image processing apparatus according to claim 8,
the numerical value is a value of a parameter that defines the content of the image processing.
10. The image processing apparatus according to claim 1,
the receiving unit receives the image processing corresponding to the user operation.
11. The image processing apparatus according to claim 10,
the receiving unit receives the image processing corresponding to an operation amount of an operation unit performed by the user.
12. The image processing apparatus according to any one of claims 1 to 11,
the plurality of first position display images and the plurality of second position display images are smaller than each of the plurality of magnetic resonance images and are displayed superimposed on each of the plurality of magnetic resonance images.
13. A method, comprising:
a display control step of causing a display unit to display a plurality of magnetic resonance images and a plurality of first position display images in correspondence with each other, the plurality of first position display images indicating positions of imaging sections of the plurality of magnetic resonance images;
a selection step of selecting a plurality of second position display images included in the plurality of first position display images;
a receiving step of receiving a user operation instructing image processing for the selected plurality of second position display images;
and an image processing step of applying the image processing to the selected plurality of second position display images collectively.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-161984 | 2020-09-28 | ||
JP2020161984A JP7489881B2 (en) | 2020-09-28 | 2020-09-28 | Image processing device and program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114338936A true CN114338936A (en) | 2022-04-12 |
CN114338936B CN114338936B (en) | 2024-10-18 |
Family
ID=80998103
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111134028.0A Active CN114338936B (en) | 2020-09-28 | 2021-09-27 | Image processing apparatus and method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7489881B2 (en) |
CN (1) | CN114338936B (en) |
Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02237548A (en) * | 1989-03-10 | 1990-09-20 | Toshiba Corp | Image display apparatus |
CN1484199A (en) * | 2002-08-13 | 2004-03-24 | Kabushiki Kaisha Toshiba | Method and device for processing image by three-dimension interested area |
JP2004194704A (en) * | 2002-12-16 | 2004-07-15 | Hitachi Medical Corp | Medical image diagnostic system |
JP2005296064A (en) * | 2004-04-06 | 2005-10-27 | Konica Minolta Medical & Graphic Inc | Control unit and display controlling program |
CN101014288A (en) * | 2004-07-21 | 2007-08-08 | 株式会社日立医药 | Fault image pick-up device |
JP2008018086A (en) * | 2006-07-13 | 2008-01-31 | Toshiba Corp | Magnetic resonance imaging apparatus |
CN101264017A (en) * | 2007-03-05 | 2008-09-17 | 株式会社东芝 | Magnetic resonance diagnosis device and medical image displaying device |
JP2009153966A (en) * | 2007-12-07 | 2009-07-16 | Toshiba Corp | Image display device and magnetic resonance imaging apparatus |
JP2009160092A (en) * | 2007-12-28 | 2009-07-23 | Ge Medical Systems Global Technology Co Llc | Image display device, display method, and magnetic resonance imaging apparatus |
JP2010051615A (en) * | 2008-08-29 | 2010-03-11 | Hitachi Medical Corp | Magnetic resonance imaging apparatus |
JP2012065768A (en) * | 2010-09-22 | 2012-04-05 | Fujifilm Corp | Portable radiation imaging system, holder used therein, and portable set for radiation imaging |
CN103443579A (en) * | 2011-04-06 | 2013-12-11 | 爱克发医疗保健公司 | Method and system for optical coherence tomography |
US20150185302A1 (en) * | 2013-12-27 | 2015-07-02 | Kabushiki Kaisha Toshiba | Magnetic resonance imaging apparatus |
CN106163405A (en) * | 2014-02-12 | 2016-11-23 | 三星电子株式会社 | Tomographic apparatus and the method by tomographic apparatus display tomoscan image |
CN107110943A (en) * | 2014-12-11 | 2017-08-29 | 三星电子株式会社 | Magnetic resonance imaging apparatus and image processing method thereof |
US20180028148A1 (en) * | 2016-07-26 | 2018-02-01 | Toshiba Medical Systems Corporation | Ultrasound diagnosis apparatus, medical image processing apparatus, and medical image processing method |
CN108065967A (en) * | 2016-11-10 | 2018-05-25 | 东芝医疗系统株式会社 | Diagnostic ultrasound equipment, medical image-processing apparatus and medical image processing method |
JP2019072457A (en) * | 2017-03-24 | 2019-05-16 | キヤノンメディカルシステムズ株式会社 | Magnetic resonance imaging apparatus, magnetic resonance imaging method, and magnetic resonance imaging system |
CN110100282A (en) * | 2017-03-17 | 2019-08-06 | 株式会社理光 | Information processing unit, information processing method, program and biosignal measurement set |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6674449B1 (en) | 1998-11-25 | 2004-01-06 | Ge Medical Systems Global Technology Company, Llc | Multiple modality interface for imaging systems |
JP4820680B2 (en) | 2006-04-12 | 2011-11-24 | 株式会社東芝 | Medical image display device |
JP2008043435A (en) * | 2006-08-11 | 2008-02-28 | Toshiba Corp | Medical image processor and medical image processing method |
JP2015011647A (en) * | 2013-07-02 | 2015-01-19 | キヤノン株式会社 | Operation device, image forming apparatus including the same, and control method of operation device |
- 2020-09-28: JP application JP2020161984A, granted as JP7489881B2 (Active)
- 2021-09-27: CN application CN202111134028.0A, granted as CN114338936B (Active)
Also Published As
Publication number | Publication date |
---|---|
JP2022054781A (en) | 2022-04-07 |
JP7489881B2 (en) | 2024-05-24 |
CN114338936B (en) | 2024-10-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5566860B2 (en) | Magnetic resonance imaging system | |
US8368397B2 (en) | Magnetic resonance imaging apparatus and magnetic resonance imaging method | |
JP7278785B2 (en) | Medical image diagnostic apparatus and medical image diagnostic system | |
US20210121092A1 (en) | Magnetic resonance imaging system and position display method | |
US10429472B2 (en) | Magnetic resonance imaging apparatus and method for magnetic resonance imaging with copying and setting of parameter values | |
US9072497B2 (en) | Method for an image data acquisition | |
US20020151785A1 (en) | Mehtod and magnetic resonance tomography apparatus for preparing a data acquisition using previously obtained data acquisitions | |
CN114338936B (en) | Image processing apparatus and method | |
JP7106291B2 (en) | Magnetic resonance imaging system | |
JP5689623B2 (en) | Magnetic resonance imaging system | |
US9964620B2 (en) | Visual indication of the magic angle in orthopedic MRI | |
JP2012066005A (en) | Magnetic resonance imaging apparatus | |
JP5283866B2 (en) | Magnetic resonance imaging system | |
JP6104627B2 (en) | MRI equipment | |
JP7106292B2 (en) | Magnetic resonance imaging system | |
JP5361113B2 (en) | MRI image stitching method and apparatus, and MRI apparatus | |
US4860221A (en) | Magnetic resonance imaging system | |
US20240027555A1 (en) | Magnetic resonance imaging apparatus | |
JP2009207755A (en) | Magnetic resonance imaging apparatus | |
US20230293040A1 (en) | Method for Determining a Position of at Least One Coil Element of a Radiofrequency Coil That Can Be Inserted in a Patient Placement Region of a Magnetic Resonance Apparatus | |
JP2014076137A (en) | Medical image diagnostic device and image processing management device | |
JP3743734B2 (en) | Magnetic resonance imaging system | |
US20210353212A1 (en) | Magnetic resonance imaging apparatus, method, and storage medium | |
US11668776B2 (en) | Method for providing a process plan of a magnetic resonance examination | |
JP2022150820A (en) | Display control device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||