
WO2011004568A1 - Image processing method for fertilized egg observation, image processing program, image processing device, and fertilized egg production method - Google Patents

Image processing method for fertilized egg observation, image processing program, image processing device, and fertilized egg production method Download PDF

Info

Publication number
WO2011004568A1
Authority
WO
WIPO (PCT)
Prior art keywords
fertilized egg
image
observation
image processing
objects
Prior art date
Application number
PCT/JP2010/004328
Other languages
English (en)
Japanese (ja)
Inventor
三村正文
佐々木秀貴
伊藤啓
Original Assignee
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ニコン filed Critical 株式会社ニコン
Priority to JP2011521808A priority Critical patent/JPWO2011004568A1/ja
Publication of WO2011004568A1 publication Critical patent/WO2011004568A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/94 Investigating contamination, e.g. dust
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/68 Food, e.g. fruit or vegetables

Definitions

  • The present invention relates to an image processing means for automatically discriminating between a fertilized egg and foreign matter in an observation image acquired during fertilized egg observation, and to a fertilized egg production method using the image processing means.
  • An example of an apparatus for observing the state of a culture such as a fertilized egg is a culture microscope (see, for example, Patent Document 1).
  • The culture microscope includes a culture apparatus (incubator) that forms an environment suitable for culturing fertilized eggs, and a microscopic observation system that microscopically observes the state of the fertilized eggs in the culture containers housed in the culture apparatus.
  • With such an apparatus, observation images of the fertilized egg are acquired at regular intervals, and the user can recognize the fertilized egg by visual observation and automatically observe, record and manage it.
  • However, the culture medium in the culture container contains foreign matter such as dust and bubbles in addition to the fertilized egg to be observed. Since such foreign objects may have an appearance similar to that of a fertilized egg, the fertilized egg is easily confused with them in the observation image, and it is difficult to recognize the fertilized egg automatically with simple image processing.
  • The present invention has been made in view of the problems described above, and its object is to provide a means for automatically recognizing a fertilized egg to be observed by discriminating it from other foreign matter during fertilized egg observation.
  • To this end, an image processing method for fertilized egg observation is provided in which an observation image obtained by photographing, with an imaging device, a plurality of objects located in the observation field of view is acquired, the plurality of objects captured in the observation image are extracted, a plurality of image feature amounts corresponding to the attributes of a fertilized egg are calculated for each object included in the observation image, and a fertilized egg is identified from among the plurality of objects based on the calculated feature amounts.
  • Also provided is a computer-readable image processing program for fertilized egg observation that causes a computer to function as an image processing apparatus which acquires an image captured by an imaging device and performs image processing. The program causes the computer to execute a step of acquiring an observation image obtained by photographing a plurality of objects located in the field of view with the imaging device, a step of extracting the plurality of objects captured in the observation image, a step of calculating a plurality of image feature amounts corresponding to the attributes of a fertilized egg for each object included in the observation image, a step of identifying a fertilized egg from among the plurality of objects based on the calculated feature amounts, and a step of outputting the identification result for the objects.
  • an imaging device that captures a plurality of objects, a plurality of objects are extracted from observation images captured by the imaging device, and a fertilized egg is identified from the plurality of objects.
  • An image analysis unit, and an output unit that outputs an identification result determined by the image analysis unit to the outside, and the image analysis unit determines the feature amount of the image according to the attribute of the fertilized egg for each object included in the observation image
  • a fertilized egg observation image processing apparatus is provided, which is configured to identify a fertilized egg from a plurality of objects based on a plurality of calculated feature values.
  • Further provided is a fertilized egg production method in which a fertilized egg is cultured under predetermined environmental conditions and the fertilized egg is identified in the culture container in which it exists, using the image processing apparatus configured as described above.
  • Also provided is a fertilized egg production method in which a fertilized egg is cultured under predetermined environmental conditions, an observation image is acquired by photographing, with an imaging device, a plurality of objects located in the observation field of view in the culture container in which the fertilized egg exists, the plurality of objects captured in the observation image are extracted, a plurality of image feature amounts corresponding to the attributes of the fertilized egg are calculated for each object included in the observation image, and the fertilized egg is identified from among the plurality of objects in the culture container based on the calculated feature amounts.
  • With the image processing method, image processing program and image processing apparatus for fertilized egg observation, and with the fertilized egg production method described above, the fertilized egg can be accurately identified from among the plurality of objects included in the observation image by image processing that discriminates objects based on a plurality of feature amounts corresponding to the attributes of a fertilized egg.
  • The drawings include a flowchart showing an outline of the processing of the image processing program for fertilized egg discrimination (FIG. 1), a schematic configuration diagram and a block diagram of the culture observation system (FIGS. 2 and 3), a plan view of a culture container (A) and a perspective view of a dish (B) (FIG. 4), a figure for explaining the characteristics of a fertilized egg and other foreign matter, a figure illustrating the contour extraction process for extracting objects (FIG. 6), a figure for explaining the difference in luminance value inside the contour between a fertilized egg and a foreign object, a figure illustrating the mask generation process (FIG. 8), a figure illustrating the contraction mask region (FIG. 9), a block diagram showing a schematic configuration of the image processing apparatus (FIG. 10), and a flowchart showing an outline of the fertilized egg production method.
  • As an example of a system to which the image processing apparatus according to the present embodiment is applied, a schematic configuration diagram and a block diagram of a culture observation system are shown in FIGS. 2 and 3, respectively.
  • The culture observation system BS is roughly composed of a culture chamber 2 provided in the upper part of a housing 1, a shelf-like stocker 3 that accommodates and holds a plurality of culture containers 10, an observation unit 5 that observes the sample in a culture container 10, a transfer unit 4 that transfers the culture container 10 between the stocker 3 and the observation unit 5, a control unit 6 that comprehensively controls the operation of the system, an operation panel 7 provided with an image display device, and the like.
  • The culture chamber 2 is a room that forms the culture environment, and is kept sealed after the sample is loaded in order to prevent environmental changes and contamination.
  • The culture chamber 2 is provided with a temperature adjustment device 21 that raises and lowers the temperature in the chamber, a humidifier 22 that adjusts the humidity, a gas supply device 23 that supplies gases such as CO2 gas and N2 gas, a circulation fan 24 that makes the environment of the entire culture chamber 2 uniform, and an environment sensor 25 that detects the temperature, humidity, carbon dioxide concentration, etc. of the culture chamber 2.
  • The operation of each of these devices is controlled by the control unit 6, and the culture environment defined by the temperature, humidity, carbon dioxide concentration, etc. of the culture chamber 2 is maintained in a state that matches the culture conditions set on the operation panel 7.
  • The stocker 3 is formed in a shelf shape partitioned into a plurality of compartments in the front-rear and up-down directions, and each shelf has its own unique address. For example, when the front-rear direction is designated by rows A to C and the vertical direction by levels 1 to 7, the shelf at row A, level 5 is addressed as A-5.
  • As the culture container 10, an appropriate one such as a flask, a dish, or a well plate is selected according to the type and purpose of the culture.
  • The fertilized egg a to be cultured is injected into each dish 10a together with a medium drop D of culture medium containing a pH indicator such as phenol red.
  • In the dish 10a, medium drops D of about 20 µl each are formed by dropping with a pipette or the like (only one is shown in FIG. 4(B)).
  • Into each medium drop D, for example, one fertilized egg a collected from the same mother at the same time and fertilized in vitro is inserted.
  • Each culture container 10 is assigned a code number and is stored in association with a designated address of the stocker 3.
  • The transfer unit 4 includes a Z stage 41 provided inside the culture chamber 2 so as to be movable in the vertical direction and moved up and down by a Z-axis drive mechanism, a Y stage 42 attached to the Z stage 41 so as to be movable in the front-rear direction and moved back and forth by a Y-axis drive mechanism, an X stage 43 attached to the Y stage 42 so as to be movable in the left-right direction and moved left and right by an X-axis drive mechanism, and a support arm 45 provided on the tip side of the X stage 43 to lift and support the culture container 10.
  • The transfer unit 4 has a moving range in which the support arm 45 can move between all the shelves of the stocker 3 and the observation unit 5.
  • The X-axis, Y-axis and Z-axis drive mechanisms are each configured by, for example, a servo motor with a ball screw and an encoder, and their operation is controlled by the control unit 6.
  • The observation unit 5 includes a first illumination unit 51 that illuminates the sample from below the sample stage 15, a second illumination unit 52 that illuminates the sample from above the sample stage 15 along the optical axis of the microscopic observation system, a third illumination unit 53 that illuminates the sample from below the sample stage 15, a macro observation system 54 that performs macro observation of the sample, a microscopic observation system 55 that performs micro observation of the sample, an image processing apparatus 100 (see FIG. 10), and the like.
  • the sample stage 15 is made of a light-transmitting material and has a transparent window 16 in the observation area.
  • The sample stage 15 is composed of a fine drive stage that can be moved in the XY directions (horizontal plane) and the Z direction (vertical) under operation control from the control unit 6. By moving the culture container 10 placed on its upper surface in the XY directions, the culture container 10 can be positioned on the optical axis of the macro observation system 54 or on the optical axis of the microscopic observation system 55.
  • the first illumination unit 51 is composed of a surface-emitting light source provided on the lower frame 1b side, and backlight-illuminates the entire culture vessel 10 from the lower side of the sample stage 15.
  • the second illumination unit 52 includes a light source such as an LED and an illumination optical system including a phase ring, a condenser lens, and the like.
  • The second illumination unit 52 is provided in the culture chamber 2 and illuminates the sample in the culture container 10 from above the sample stage 15 along the optical axis of the microscopic observation system 55.
  • The third illumination unit 53 includes a plurality of light sources, such as LEDs and a mercury lamp, that emit light of wavelengths suitable for epi-illumination observation and fluorescence observation, and an illumination optical system composed of a beam splitter, fluorescence filters and the like that superimposes the light emitted from each light source on the optical axis of the microscopic observation system 55. The third illumination unit 53 is disposed in the lower frame 1b located below the culture chamber 2 and illuminates the sample in the culture container 10 from below the sample stage 15 along the optical axis of the microscopic observation system 55.
  • The macro observation system 54 includes an observation optical system 54a and an imaging device 54c, such as a CCD camera, that captures the image of the sample formed by the observation optical system 54a, and is provided in the culture chamber 2, positioned above the first illumination unit 51.
  • The macro observation system 54 captures a whole observation image (macro image) from above the culture container 10 backlit by the first illumination unit 51.
  • The microscopic observation system 55 includes an observation optical system 55a composed of an objective lens, an intermediate zoom lens, a fluorescence filter and the like, and an imaging device 55c, such as a cooled CCD camera, that captures the image of the sample formed by the observation optical system 55a, and is disposed inside the lower frame 1b.
  • the second illumination unit 52 and the microscopic observation system 55 constitute a phase difference observation microscope.
  • a plurality of objective lenses and intermediate zoom lenses are provided, and are configured to be set to a plurality of magnifications using a displacement mechanism such as a revolver or a slider (not shown in detail).
  • The microscopic observation system 55 captures microscopic observation images of the sample in the culture container 10, such as a phase contrast image formed by the light transmitted through the sample illuminated by the second illumination unit 52 and a fluorescence image formed by the fluorescence emitted from the sample illuminated by the third illumination unit 53.
  • The image processing apparatus 100 performs A/D conversion on the signals input from the imaging device 54c of the macro observation system 54 and the imaging device 55c of the microscopic observation system 55, and performs various kinds of image processing to generate image data of the whole observation image or the microscopic observation image. The image processing apparatus 100 further performs image analysis on the image data of these observation images (whole observation image and microscopic observation image), calculating the feature amounts of the objects present in the image, calculating a score corresponding to each feature amount, and determining the fertilized egg based on the total score. Specifically, the image processing apparatus 100 is realized by executing an image processing program stored in the ROM of the control unit 6 described below, and is described in detail later.
  • The control unit 6 includes a CPU 61 that executes processing, a ROM 62 that stores the control programs and control data of the culture observation system BS, and a RAM 63 that temporarily stores observation conditions, image data and the like, and controls the operation of the culture observation system BS. As shown in FIG. 3, the constituent devices of the culture chamber 2, the transfer unit 4, the observation unit 5 and the operation panel 7 are therefore connected to the control unit 6.
  • In the RAM 63, the environmental conditions of the culture chamber 2 according to the observation program, the observation schedule, the observation type, observation position and observation magnification in the observation unit 5, and the like are set and stored. The RAM 63 is also provided with an image data storage area for recording the image data photographed by the observation unit 5, and index data including the code number of the culture container 10 and the photographing date and time are recorded in association with the image data.
  • The operation panel 7 is provided with an operation panel 71 having input/output devices such as a keyboard and switches, through which operation commands and the like are input, and a display panel 72 that displays an operation screen, image data and the like.
  • the communication unit 65 is configured in accordance with a wired or wireless communication standard, and data can be transmitted to and received from a computer or the like externally connected to the communication unit 65.
  • The CPU 61 controls the operation of each part based on the control program stored in the ROM 62, in accordance with the conditions of the observation program set on the operation panel 7, and automatically photographs the sample in the culture container 10. That is, when the observation program is started by a panel operation on the operation panel 71 (or by a remote operation via the communication unit 65), the CPU 61 reads the condition values of the environmental conditions stored in the RAM 63, detects the environmental state of the culture chamber 2 input from the environment sensor 25, and operates the temperature adjustment device 21, the humidifier 22, the gas supply device 23, the circulation fan 24 and the like according to the difference between the condition values and the measured values, thereby performing feedback control of the culture environment such as temperature, humidity and carbon dioxide concentration; a minimal sketch of such a control loop is given below.
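The patent only states that the devices are driven according to the difference between the set condition values and the measured values; the following Python sketch shows one minimal way such a loop could look. The function name, the dictionary layout and the proportional gain are illustrative assumptions, not details from the source.

```python
def regulate_environment(setpoints, readings, actuators, gain=0.5):
    """Illustrative proportional control of the culture environment.

    setpoints / readings: dicts keyed by quantity, e.g. "temperature",
    "humidity", "co2"; actuators: dict of callables driving the corresponding
    device (temperature adjustment device 21, humidifier 22, gas supply
    device 23, ...). All names and the gain are assumptions.
    """
    for quantity, target in setpoints.items():
        error = target - readings[quantity]   # condition value minus measured value
        actuators[quantity](gain * error)     # drive the corresponding device
```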
  • The CPU 61 also reads the observation conditions stored in the RAM 63, operates the Z, Y and X stages 41, 42 and 43 of the transfer unit 4 based on the observation schedule to transport the culture container 10 to be observed from the stocker 3 to the sample stage 15 of the observation unit 5, and starts observation by the observation unit 5.
  • When the observation set in the observation program is macro observation, the culture container 10 transported from the stocker 3 by the transfer unit 4 is placed on the sample stage 15 and positioned on the optical axis of the macro observation system 54. The light source of the first illumination unit 51 is turned on, and the whole observation image is captured by the imaging device 54c from above the backlit culture container 10. The signal input from the imaging device 54c to the control unit 6 is processed by the image processing apparatus 100 to generate the whole observation image, and its image data is stored in the image data storage area of the RAM 63 together with index data such as the photographing date and time.
  • When the observation set in the observation program is micro observation of the sample at a specific position in the culture container 10, the specific position in the culture container 10 transported by the transfer unit 4 is positioned on the optical axis of the microscopic observation system 55 and photographed by the imaging device 55c. The signal input to the control unit 6 is processed by the image processing apparatus 100 to generate a microscopic observation image (phase contrast image, fluorescence image, etc.), and its image data is stored in the image data storage area of the RAM 63 together with index data such as the photographing date and time.
  • the CPU 61 sequentially executes the above-described whole observation image photographing and microscopic observation image photographing based on the observation schedule set in the observation program.
  • The image data stored in the RAM 63 is read out in response to an image display command input from the operation panel 71, and, for example, the whole observation image, the microscopic observation image at a specified time, the results of image analysis and the like are displayed on the display panel 72.
  • When the growth state and the like of the fertilized egg in the culture container 10 are observed, the image processing apparatus 100 has a function of discriminating the fertilized egg to be observed from foreign matter (objects other than the observation target) such as dust and bubbles. That is, in addition to the fertilized egg, the medium drop D in the culture container 10 may contain other objects such as dust and bubbles which have an appearance similar to that of a fertilized egg and are easily confused with it during observation; the image processing apparatus 100 therefore has a function of automatically recognizing the fertilized egg to be observed by discriminating it from such foreign matter.
  • The fertilized egg, dust, oil particles and bubbles each have the following characteristics.
  • A fertilized egg has the features that a transparent zone (zona pellucida) is present at the contour, the outer shape is spherical, and an egg cell is present inside the contour.
  • Dust has the characteristics that there is no zona pellucida at the contour, a halo appears at the contour in the phase contrast image, and the structure inside the contour is indefinite.
  • Oil particles and bubbles have the characteristics that the contour portion is dark and the inside is bright, the outer shape is spherical or elliptical, and there is little structure inside the contour.
  • In other words, compared with these foreign objects, a fertilized egg has a higher degree of circularity, has a zona pellucida at its contour, and always has an internal structure (egg cell).
  • The image processing method for fertilized egg observation performed by the image processing apparatus 100 acquires images in which the objects located in the observation field of the observation unit 5 are photographed by the imaging device, calculates the three feature amounts described below for the objects captured in these images, performs scoring based on these feature amounts, and determines that the object with the highest score is the fertilized egg. This image processing method is described below, starting from its basic concept. In the following description, the case where a fertilized egg is observed based on a phase contrast image (microscopic observation image) photographed by the phase contrast microscope constituted by the second illumination unit 52, the microscopic observation system 55 and the like is taken as an example.
  • FIG. 6 is a diagram showing a state of the contour extraction process. A plurality of objects are extracted from the observation image (A) photographed and acquired by the imaging device 55c as in the binary image shown in (B).
  • Each segmented object (binary image) is labeled to give a unique label.
  • Together with the labeling, the area of each object (the area enclosed by the extracted contour) is calculated. Since the size of the image of a fertilized egg (for example, about 100 µm in diameter in the case of a human fertilized egg) is almost determined by the image acquisition conditions (observation magnification, etc.), upper and lower limit values for the possible area of an object subject to discrimination are set in advance, and objects whose area falls outside this range are excluded from the labeling targets (fertilized egg discrimination candidates); a sketch of this screening is given below.
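As an illustration of the labeling and area screening just described, the following Python/OpenCV sketch extracts candidate objects from an observation image that has already been binarized (for example by the Level Set contour extraction mentioned later). The function name, the dictionary layout and the use of connected-component labeling are illustrative assumptions rather than details taken from the patent.

```python
import cv2
import numpy as np

def extract_candidate_objects(binary_img, min_area, max_area):
    """Label connected regions of a binarized observation image (uint8, 0/255)
    and keep only those whose area lies within the preset range expected for
    a fertilized egg."""
    num_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(binary_img)
    candidates = []
    for lab in range(1, num_labels):          # label 0 is the background
        area = stats[lab, cv2.CC_STAT_AREA]
        if min_area <= area <= max_area:      # outside the preset range -> excluded
            candidates.append({"label": lab,
                               "mask": (labels == lab).astype(np.uint8),
                               "area": int(area),
                               "centroid": tuple(centroids[lab])})
    return candidates
```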
  • (Feature amount 1) Contour circularity: Utilizing the fact that the fertilized egg to be observed has a spherical shape (its cross section is close to a perfect circle), the circularity of each object is used as feature amount 1.
  • The circularity is a measure of how close an object is to a circle. For example, after determining the center of gravity of each object in the image plane and detecting the edge indicating the contour of each object, the shortest distance and the longest distance from the center of gravity to the contour are calculated, and the circularity is obtained from the following equation (α).
  • Contour circularity = (shortest distance from the center of gravity to the contour) / (longest distance from the center of gravity to the contour) ... (α)
  • The circularity obtained in this way takes a value in the range of 0 to 1, and approaches 1 as the distance from the center of gravity to the contour becomes uniform, i.e. as the shape becomes circular. That is, an object whose circularity is close to 1 has a higher score as a fertilized egg (is more likely to be a fertilized egg).
  • As other indices representing the circularity of the contour, the shape complexity ((contour perimeter)² / area), the eccentricity obtained from second moments, and the like can also be used.
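A minimal sketch of equation (α), assuming the per-object binary masks produced by the extraction step above; the helper name and the use of OpenCV contour and moment functions are assumptions.

```python
import cv2
import numpy as np

def contour_circularity(mask):
    """Feature amount 1, equation (alpha): ratio of the shortest to the longest
    distance from the object's center of gravity to its contour. Values close
    to 1 indicate a near-perfect circle, as expected for a fertilized egg."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    pts = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(np.float64)
    m = cv2.moments(mask, binaryImage=True)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]   # center of gravity
    dist = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy)     # distance to every contour point
    return float(dist.min() / dist.max())
```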
  • (Feature amount 2) Luminance difference between the outside and the inside of the contour: A transparent thin film called the zona pellucida exists at the contour of the fertilized egg to be observed, so the difference in luminance value between the outside and the inside of the contour is small.
  • On the other hand, the inside of the contour of oil particles and bubbles is darker than the background outside, and in the phase contrast image the inside of the contour of dust tends to be brightened by the halo, so the difference in luminance value between the outside and the inside of the contour is large.
  • As this feature amount, the average luminance values (for example, 8-bit gradation, 0 to 255) of an outer annular zone and an inner annular zone of the contour of the object are used, and the luminance difference is obtained by the following equation (β).
  • Luminance difference between outside and inside of contour = 255 − |average luminance of outer annular zone − average luminance of inner annular zone| ... (β)
  • FIG. 8 is a diagram illustrating the state of mask generation processing when calculating the average luminance value of the outer annular zone and the inner annular zone.
  • For the mask generation, the labeled binary images (label images) obtained by the processing described above are used.
  • The binary image (label image) of each object is subjected to expansion and contraction processing, and two kinds of mask images (an expansion mask M1 and a contraction mask M2) showing the expanded and contracted regions are prepared for each object.
  • Then, with their centers of gravity aligned, the differences between each of the two mask images M1 and M2 and the original label image are calculated, and these substantially ring-shaped difference images are used as the outer annular zone and the inner annular zone of the object's contour.
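The mask-difference construction can be sketched as follows. Equation (β) is truncated in the source text; this sketch reads it as 255 minus the absolute difference of the two ring averages, consistent with the statement that a fertilized egg yields a small difference, and the ring width used for dilation and erosion is an assumed parameter.

```python
import cv2
import numpy as np

def luminance_difference(gray_img, mask, band=5):
    """Feature amount 2, equation (beta) as reconstructed here:
    255 - |mean luminance of outer annular zone - mean luminance of inner
    annular zone|, the zones being the differences between the dilated /
    eroded masks and the original label mask."""
    kernel = np.ones((2 * band + 1, 2 * band + 1), np.uint8)
    dilated = cv2.dilate(mask, kernel)            # expansion mask M1
    eroded = cv2.erode(mask, kernel)              # contraction mask M2
    outer_ring = (dilated > 0) & (mask == 0)      # difference mask M3 (outer annular zone)
    inner_ring = (mask > 0) & (eroded == 0)       # difference mask M4 (inner annular zone)
    outer_mean = float(gray_img[outer_ring].mean())
    inner_mean = float(gray_img[inner_ring].mean())
    return 255.0 - abs(outer_mean - inner_mean)
```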
  • (Feature amount 3) Texture feature amount inside the contour: Egg cells are contained inside the fertilized egg to be observed, and a texture (internal structure) is therefore considered to exist in the image. Oil particles and bubbles, on the other hand, have no internal structure. Therefore, by detecting the internal structure of each object, a fertilized egg can be differentiated from oil particles and bubbles.
  • As the region of the object in which the internal structure is detected, as shown in FIG. 9, the image region of the observation image corresponding to the contraction mask M2 generated by the above processing (hereinafter also referred to as the contraction mask region) is used.
  • As the feature amount, the average value of the edge strength (intensity of gradation change) over the entire image in the contraction mask region is used (see the following equation (γ)).
  • The edge strength is obtained pixel by pixel by a convolution operation in which a differential filter, such as a Laplacian filter represented by, for example, a 3 × 3 or 5 × 5 matrix, is applied to the image of the contraction mask region of each object.
  • Texture feature amount inside contour = (total edge strength in the contraction mask region) / (number of pixels in the contraction mask region) ... (γ)
  • That is, the edge strength calculated pixel by pixel in the contraction mask region of each object is divided by the number of pixels in the contraction mask region, and the resulting average value is used as the texture feature amount inside the contour, which is feature amount 3.
  • For feature amount 3, however, this average edge strength is not used as it is; instead, the ratio of each object's average edge strength to the maximum average edge strength among all labeled objects (average edge strength / maximum average edge strength) is used. This is because the edge strength varies greatly with the contrast of the image, so that setting a fixed threshold would be meaningless; normalizing across all objects in the image removes the influence of contrast. Furthermore, when there is no internal texture anywhere (for example, when only bubbles are captured), the overall edge strength is inherently small, yet normalization could make feature amount 3 excessively large for every object. To avoid this, a permissible value is set in advance, and when the maximum average edge strength falls below this permissible value, feature amount 3 is rejected for all objects.
  • In that case, the lower limit value is set to 0; when the maximum average edge strength is less than the permissible value, feature amount 3 is set to 0 and no scoring of feature amount 3 is performed. A sketch of this computation follows.
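A sketch of equation (γ) and the normalization just described, assuming the per-object contraction masks from the previous step; the Laplacian kernel size and the permissible value are placeholder choices.

```python
import cv2
import numpy as np

def texture_features(gray_img, shrink_masks, permissible_max=3.0):
    """Feature amount 3, equation (gamma): mean |Laplacian| edge strength inside
    each object's contraction-mask region, expressed as a ratio to the maximum
    mean over all labeled objects. If that maximum is below the permissible
    value (placeholder figure), feature amount 3 is rejected (0) for every
    object, as when only bubbles are present."""
    edge = np.abs(cv2.Laplacian(gray_img.astype(np.float64), cv2.CV_64F, ksize=3))
    means = np.array([edge[m > 0].mean() if np.any(m) else 0.0 for m in shrink_masks])
    if means.size == 0 or means.max() < permissible_max:
        return np.zeros_like(means)
    return means / means.max()   # ratio to the maximum average edge strength
```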
  • the feature quantities normalized and converted to a common scale are set as scores 1, 2, and 3, and a total score that is the sum of them is calculated.
  • Overall score = Score 1 + Score 2 + Score 3
  • the object having the maximum total score in the medium drop D is determined as a fertilized egg.
  • In this way, a score indicating how much each object resembles a fertilized egg is calculated based on the feature amounts, and the fertilized egg can be clearly recognized by separating it from foreign matter according to the magnitude of this score.
  • As methods of determining a fertilized egg from the scores, in addition to ranking by the total score as the simple sum of scores 1 to 3, it is also possible, for example, to calculate a total score with different weights applied to scores 1 to 3, to rank the objects separately for each of scores 1 to 3 and judge the object with the best combined rank to be the fertilized egg, or to judge the object with the highest score 3 among the candidates to be the fertilized egg.
  • A configuration in which any object whose total score exceeds a threshold value is recognized as a fertilized egg may also be used; a sketch of the basic normalize, sum and select scheme follows.
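The following sketch shows the basic scheme of scale normalization to preset upper and lower limits, summation into a total score, and selection of the maximum. The clipping to a 0-1 range is a hedged reading of the "full scale" normalization described in steps S5 to S7; the function name and array layout are assumptions.

```python
import numpy as np

def identify_fertilized_egg(features, lower_limits, upper_limits):
    """Normalize each feature amount to a 0-1 common scale between preset
    limits, sum the three scores per object, and return the index of the
    object with the maximum total score together with all total scores."""
    feats = np.asarray(features, dtype=np.float64)   # shape: (n_objects, 3)
    lo = np.asarray(lower_limits, dtype=np.float64)
    hi = np.asarray(upper_limits, dtype=np.float64)
    scores = np.clip((feats - lo) / (hi - lo), 0.0, 1.0)
    totals = scores.sum(axis=1)                      # score 1 + score 2 + score 3
    return int(np.argmax(totals)), totals
```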
  • FIG. 1 is a flowchart showing an outline of processing in the image processing program GP for fertilized egg discrimination
  • FIG. 10 is a block diagram showing a schematic configuration of an image processing apparatus 100 that executes image processing for fertilized egg discrimination.
  • The image processing apparatus 100 includes an image storage unit 110 that acquires and stores the observation image in which the fertilized egg to be observed is captured by the imaging device 55c, an image analysis unit 120 that analyzes the observation image and determines whether or not each object captured in the observation image is a fertilized egg, a feature amount storage unit 130 that stores the three feature amounts calculated for each object by the image analysis unit 120 in association with the labels assigned to the objects, and an output unit 140 that outputs the determination result of the image analysis unit 120 to the outside. The identification result as to whether or not an object is a fertilized egg is output to and displayed on, for example, the display panel 72.
  • the image processing apparatus 100 is configured such that an image processing program GP preset and stored in the ROM 62 is read by the CPU 61 and processing based on the image processing program GP is sequentially executed by the CPU 61.
  • The fertilized egg in the designated culture container 10 is observed at predetermined time intervals according to the observation conditions set in the observation program.
  • The CPU 61 operates the stages of the transfer unit 4 to transport the culture container 10 to be observed from the stocker 3 to the observation unit 5 (in this embodiment, it is placed on the optical axis of the microscopic observation system 55), and an observation image (phase contrast image) produced by the microscopic observation system 55 using the second illumination unit 52 is captured by the imaging device 55c.
  • In step S1, the image processing apparatus 100 first acquires the observation image (phase contrast image) taken by the imaging device 55c and stores it in the image storage unit 110 together with index data such as the code number of the culture container 10, the observation position and the observation time.
  • In step S2, the image analysis unit 120 performs contour extraction processing, such as the Level Set method, on the observation image (phase contrast image) acquired from the imaging device 55c, and the objects included in the observation image are extracted as shown in FIG. 6.
  • Next, each object in the contour-extracted observation image (binary image) is labeled.
  • At this time, an appropriate range (upper and lower limit values) is set for the area of a fertilized egg image, the area of each contour-extracted object region is calculated, objects whose region area falls outside this set range are excluded from the fertilized egg discrimination candidates, and only objects whose region area falls within the set range are given unique labels as fertilized egg discrimination candidates.
  • the image analysis unit 120 calculates the circularity of the contour, the brightness difference between the outside and inside of the contour, and the texture feature amount inside the contour as the image feature amount of the fertilized egg (step S4).
  • the image analysis unit 120 records the unique label assigned to each object and the three feature amounts calculated below in the feature amount storage unit 130 in association with each other.
  • To obtain the contour circularity (feature amount 1), the image analysis unit 120 calculates the center of gravity of the object from the label image, and calculates the ratio of the shortest distance to the longest distance from the center of gravity to the contour (edge) as the circularity of the contour (step S11).
  • In the processing flow F20 for obtaining the luminance difference between the outside and the inside of the contour (feature amount 2), the image analysis unit 120 generates two mask images, the expansion mask M1 and the contraction mask M2, for each object from the label image, and generates difference masks M3 and M4 from the differences between these two mask images and the original label image. The averages of the luminance values in the regions of the observation image captured by the imaging device 55c that correspond to the outer difference mask M3 and the inner difference mask M4 are then obtained, and the luminance difference between the outside and the inside of the contour is calculated using equation (β) above (step S21).
  • To obtain the texture feature amount inside the contour (feature amount 3), the contraction mask M2 generated in step S21 is used: a differential filter is applied to the observation image captured by the imaging device 55c, and the average edge strength in the region corresponding to the contraction mask M2 is calculated as the texture feature amount inside the contour (step S31).
  • The image analysis unit 120 then compares each feature amount calculated for each object and recorded in the feature amount storage unit 130 with the threshold value preset for that feature amount, and performs rejection based on each threshold (steps S12, S22, S32). Only objects whose three feature amounts all satisfy the thresholds are treated as fertilized egg candidates. As described above, the texture feature amount inside the contour (feature amount 3) is treated as a ratio to the maximum average edge strength among all objects so that the feature amount does not change greatly under the influence of contrast.
  • In step S5, each feature amount is normalized with a predetermined upper and lower limit range as the full scale, and in step S6 the feature amounts converted to a common scale are obtained as scores 1, 2 and 3.
  • In step S7, the total score consisting of the sum of scores 1, 2 and 3 is calculated for each object, and the objects are sorted by total score (rearranged in descending order of score).
  • The object with the label having the maximum total score is then recognized as the fertilized egg to be observed (step S8), and in step S9 the determination result that the object with this label is a fertilized egg is output from the output unit 140; an end-to-end sketch combining these steps is given below.
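Putting the pieces together, the following is a hypothetical end-to-end sketch corresponding roughly to steps S2 through S9, reusing the helper functions sketched earlier; all function names and the fixed erosion kernel size are assumptions.

```python
import cv2
import numpy as np

def process_observation_image(gray_img, binary_img, min_area, max_area,
                              lower_limits, upper_limits):
    """Extract and label candidate objects, compute the three feature amounts
    for each, score them, and return the label judged to be the fertilized
    egg together with the total scores."""
    objects = extract_candidate_objects(binary_img, min_area, max_area)
    kernel = np.ones((11, 11), np.uint8)
    shrink_masks = [cv2.erode(o["mask"], kernel) for o in objects]   # contraction masks
    tex = texture_features(gray_img, shrink_masks)
    feats = [[contour_circularity(o["mask"]),
              luminance_difference(gray_img, o["mask"]),
              t] for o, t in zip(objects, tex)]
    best, totals = identify_fertilized_egg(feats, lower_limits, upper_limits)
    return objects[best]["label"], totals
```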
  • The determination result output from the output unit 140 is displayed on the display panel 72 of the operation panel 7, and a display indicating a fertilized egg is attached to the object having the highest total score in the observation image. Exemplary interfaces discriminate and display the fertilized egg and the other foreign objects by, for example, attaching a symbol indicating a fertilized egg (for example, "J") to the object, displaying the fertilized egg with a hue or brightness different from that of the foreign objects, or displaying an image in which the foreign objects have been painted out and removed.
  • The discrimination data output from the output unit 140 can also be transmitted via the communication unit 65 to an externally connected computer or the like, to display the same image there or to be used as basic data for observing the growth state of the fertilized egg.
  • By referring to the image displayed on the display panel 72 or on the monitor of an externally connected computer or the like, the observer can immediately determine whether or not each object included in the image being observed (or whose observation image has already been acquired) is a fertilized egg. In addition, by using the data in which the fertilized egg and other foreign matter have been discriminated in this way, the growth state of the fertilized egg can be observed efficiently.
  • In step S110, the fertilized egg is injected into the culture container 10 (dish 10a) together with the medium drop D, the culture container 10 is stored in the culture chamber 2 maintained at environmental conditions suitable for culturing fertilized eggs, and the fertilized egg is cultured under those environmental conditions. The environmental conditions, such as the temperature, humidity and carbon dioxide concentration in the culture chamber 2, are adjusted by the control unit 6 according to the culture environment of the fertilized egg.
  • In step S120, as observation of the fertilized egg in the culture container 10, the image processing steps S1 to S9 described above (see FIG. 1) are executed, and the fertilized egg is identified from among the plurality of objects captured in the observation image. At this time, one fertilized egg is identified for each medium drop D in the culture container 10 (dish 10a).
  • Next, from among the fertilized eggs identified for the respective medium drops D, fertilized eggs are selected based on a predetermined selection criterion.
  • As the selection criterion for a fertilized egg, the grade of the fertilized egg is determined based on the timing of cleavage, the shape of the blastomeres and the like, and eggs of good quality that satisfy this criterion are selected. For example, the selection is made based on whether cleavage occurs at the same time in all the egg cells in the egg, which indicates that the egg has passed through a good growth state. That is, in the cleavage of a normal fertilized egg, cells of the same generation divide at almost the same timing, and only cells of the same generation exist in the embryo. In the cleavage of an abnormal fertilized egg, on the other hand, the division timing is shifted even among cells of the same generation, and cells of different generations are mixed in the embryo.
  • the selected fertilized eggs (good fertilized eggs that have grown to a state called a blastocyst) are collected and stored frozen, for example, in liquid nitrogen at minus 196 ° C.
  • the fertilized egg (blastocyst) is returned to the mother (embryo transfer) at a predetermined cycle.
  • the fertilized eggs to be cultured may be fertilized eggs such as humans, cows, horses, pigs and mice.
  • The fertilized egg may be stored in the blastocyst state or in a cleavage stage (4-cell stage embryo, 8-cell stage embryo).
  • In the image processing method and image processing apparatus 100 for fertilized egg observation configured by executing the image processing program GP, and in the fertilized egg production method described above, a processing method that recognizes a fertilized egg based on the three feature amounts of contour circularity, the luminance difference between the outside and the inside of the contour, and the texture feature amount inside the contour has been exemplified.
  • However, the present invention is not limited to this embodiment. The same effect can be obtained when it is applied to methods that recognize a fertilized egg based on other feature amounts (such as shape feature amounts and texture feature amounts), or based on four or five feature amounts with additional ones added. For example, based on an image with a relatively low observation magnification, the luminance value at the contour of the object, the variance of the luminance inside the object, and the like may be used as new feature amounts, either replacing one of the above three feature amounts or being added to them to form four or five feature amounts.
  • The observation magnification can be varied according to the lens settings, such as the objective lens, and among the feature amounts obtained from the observation image there are some that appear more prominently in a low-magnification image (low-magnification phase contrast image) and some that appear more prominently in a high-magnification image (high-magnification phase contrast image). The three feature amounts exemplified in this embodiment are preferably obtained from a high-resolution, high-magnification image so that numerical values and textures can be detected, whereas the new feature amounts mentioned above, such as the luminance inside the contour, are preferably obtained from a low-magnification image in which luminance changes are noticeable.
  • The high-magnification image is, for example, an image with an observation magnification of 10x or 20x, and the low-magnification image is, for example, an image with an observation magnification of about 2x; an appropriate magnification can be used according to the size of the fertilized egg, egg cell, embryo or the like to be observed.
  • In the embodiment described above, a configuration that identifies and processes one fertilized egg a injected into each medium drop D has been exemplified, but the present invention is not limited to this embodiment; it may also be configured to identify all of a plurality of fertilized eggs a injected into the medium drop D.
  • In that case, a threshold value for identifying a fertilized egg may be set in advance for the total score and objects exceeding this threshold judged to be fertilized eggs, or the number of fertilized eggs a injected into the medium drop D may be set in advance and that number of objects with the highest total scores recognized as fertilized eggs.
  • Reference signs: BS culture observation system; GP image processing program; a fertilized egg; 5 observation unit; 6 control unit; 7 operation panel; 54 macro observation system; 54c imaging device; 55 microscopic observation system; 55c imaging device; 61 CPU; 62 ROM; 63 RAM; 100 image processing apparatus; 120 image analysis unit; 140 output unit

Landscapes

  • Biochemistry (AREA)
  • Signal Processing (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Apparatus Associated With Microorganisms And Enzymes (AREA)
  • Image Processing (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Micro-Organisms Or Cultivation Processes Thereof (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a means for automatically recognizing fertilized eggs to be observed by distinguishing the fertilized eggs from contaminants during fertilized egg observation. A computer executes the following steps: a step (S1) of acquiring an observation image showing a plurality of objects positioned in the field of view of an imaging device; a step (S2) of extracting the plurality of objects shown in the observation image; steps (S11, S21, S31) of calculating, for each object included in the observation image, a plurality of image feature values corresponding to the attributes of fertilized eggs; a step (S8) of identifying fertilized eggs from among the plurality of objects on the basis of the calculated feature values; and a step (S9) of outputting the identification results for the objects.
PCT/JP2010/004328 2009-07-08 2010-07-01 Procédé de traitement d’images pour l’observation d’Œufs fécondés, programme de traitement d’images, dispositif de traitement d’images et procédé de production d’Œufs fécondés WO2011004568A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011521808A JPWO2011004568A1 (ja) 2009-07-08 2010-07-01 受精卵観察の画像処理方法、画像処理プログラム及び画像処理装置、並びに受精卵の製造方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009162149 2009-07-08
JP2009-162149 2009-07-08

Publications (1)

Publication Number Publication Date
WO2011004568A1 true WO2011004568A1 (fr) 2011-01-13

Family

ID=43428999

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/004328 WO2011004568A1 (fr) 2009-07-08 2010-07-01 Procédé de traitement d’images pour l’observation d’Œufs fécondés, programme de traitement d’images, dispositif de traitement d’images et procédé de production d’Œufs fécondés

Country Status (3)

Country Link
JP (1) JPWO2011004568A1 (fr)
TW (1) TW201102040A (fr)
WO (1) WO2011004568A1 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013099045A1 (fr) * 2011-12-28 2013-07-04 大日本スクリーン製造株式会社 Dispositif d'affichage d'image et procédé d'affichage d'image
JP2015029461A (ja) * 2013-08-02 2015-02-16 克昌 藤田 撮像装置
JP2015130805A (ja) * 2014-01-09 2015-07-23 大日本印刷株式会社 成育情報管理システム及び成育情報管理プログラム
JP2015130806A (ja) * 2014-01-09 2015-07-23 大日本印刷株式会社 成育情報管理システム及び成育情報管理プログラム
JP2017167817A (ja) * 2016-03-16 2017-09-21 ヤフー株式会社 画像処理装置、画像処理方法及び画像処理プログラム
JP2017191609A (ja) * 2017-04-14 2017-10-19 ソニー株式会社 画像処理装置および画像処理方法
WO2018189875A1 (fr) * 2017-04-14 2018-10-18 株式会社日立ハイテクノロジーズ Dispositif de capture d'image et procédé d'affichage de données de caractéristiques morphologiques
CN109444044A (zh) * 2018-03-26 2019-03-08 宁波华仪宁创智能科技有限公司 受精卵观察装置及其工作方法
EP3456812A4 (fr) * 2016-06-21 2019-06-26 Sony Corporation Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JP6956302B1 (ja) * 2021-01-20 2021-11-02 医療法人浅田レディースクリニック 胚培養装置および培養環境に保持された処置卵を表示する方法
JP7005087B1 (ja) * 2021-01-20 2022-01-21 医療法人浅田レディースクリニック 胚培養装置およびその撮像装置

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0469775A (ja) * 1990-07-10 1992-03-04 Hitachi Ltd 細胞自動分類装置
JPH04318460A (ja) * 1991-04-17 1992-11-10 Omron Corp 血液細胞分析装置
JPH06111021A (ja) * 1992-09-30 1994-04-22 Omron Corp 画像判別装置
JPH09126987A (ja) * 1995-10-27 1997-05-16 Hitachi Ltd 粒子画像解析装置
JP2001500744A (ja) * 1996-11-01 2001-01-23 ユニバーシティ オブ ピッツバーグ 細胞を保持する方法及び装置
JP2002524134A (ja) * 1998-09-02 2002-08-06 ランゲルハンス・アンパルトセルスカブ 粒子分離装置
JP2006333710A (ja) * 2005-05-31 2006-12-14 Nikon Corp 細胞の自動良否判定システム
JP2007101199A (ja) * 2005-09-30 2007-04-19 Sysmex Corp 標本撮像装置及び標本撮像方法とその装置を制御するプログラム、並びに、標本分析装置
JP2007327843A (ja) * 2006-06-07 2007-12-20 Olympus Corp 画像処理装置および画像処理プログラム
WO2009081832A1 (fr) * 2007-12-20 2009-07-02 Nikon Corporation Procédé de traitement d'image pour image à intervalles préréglés, programme de traitement d'image et dispositif de traitement d'image

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0469775A (ja) * 1990-07-10 1992-03-04 Hitachi Ltd 細胞自動分類装置
JPH04318460A (ja) * 1991-04-17 1992-11-10 Omron Corp 血液細胞分析装置
JPH06111021A (ja) * 1992-09-30 1994-04-22 Omron Corp 画像判別装置
JPH09126987A (ja) * 1995-10-27 1997-05-16 Hitachi Ltd 粒子画像解析装置
JP2001500744A (ja) * 1996-11-01 2001-01-23 ユニバーシティ オブ ピッツバーグ 細胞を保持する方法及び装置
JP2002524134A (ja) * 1998-09-02 2002-08-06 ランゲルハンス・アンパルトセルスカブ 粒子分離装置
JP2006333710A (ja) * 2005-05-31 2006-12-14 Nikon Corp 細胞の自動良否判定システム
JP2007101199A (ja) * 2005-09-30 2007-04-19 Sysmex Corp 標本撮像装置及び標本撮像方法とその装置を制御するプログラム、並びに、標本分析装置
JP2007327843A (ja) * 2006-06-07 2007-12-20 Olympus Corp 画像処理装置および画像処理プログラム
WO2009081832A1 (fr) * 2007-12-20 2009-07-02 Nikon Corporation Procédé de traitement d'image pour image à intervalles préréglés, programme de traitement d'image et dispositif de traitement d'image

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013137635A (ja) * 2011-12-28 2013-07-11 Dainippon Screen Mfg Co Ltd 画像表示装置および画像表示方法
WO2013099045A1 (fr) * 2011-12-28 2013-07-04 大日本スクリーン製造株式会社 Dispositif d'affichage d'image et procédé d'affichage d'image
JP2015029461A (ja) * 2013-08-02 2015-02-16 克昌 藤田 撮像装置
JP2015130805A (ja) * 2014-01-09 2015-07-23 大日本印刷株式会社 成育情報管理システム及び成育情報管理プログラム
JP2015130806A (ja) * 2014-01-09 2015-07-23 大日本印刷株式会社 成育情報管理システム及び成育情報管理プログラム
JP2017167817A (ja) * 2016-03-16 2017-09-21 ヤフー株式会社 画像処理装置、画像処理方法及び画像処理プログラム
EP3456812A4 (fr) * 2016-06-21 2019-06-26 Sony Corporation Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
US11321585B2 (en) 2017-04-14 2022-05-03 Hitachi High-Tech Corporation Imaging device and morphological feature data display method
JP2017191609A (ja) * 2017-04-14 2017-10-19 ソニー株式会社 画像処理装置および画像処理方法
WO2018189875A1 (fr) * 2017-04-14 2018-10-18 株式会社日立ハイテクノロジーズ Dispositif de capture d'image et procédé d'affichage de données de caractéristiques morphologiques
JPWO2018189875A1 (ja) * 2017-04-14 2020-02-27 株式会社日立ハイテクノロジーズ 撮像装置および形態特徴データ表示方法
CN109444044A (zh) * 2018-03-26 2019-03-08 宁波华仪宁创智能科技有限公司 受精卵观察装置及其工作方法
CN109444044B (zh) * 2018-03-26 2023-07-18 广州市华粤行医疗科技有限公司 受精卵观察装置及其工作方法
JP7005087B1 (ja) * 2021-01-20 2022-01-21 医療法人浅田レディースクリニック 胚培養装置およびその撮像装置
JP6956302B1 (ja) * 2021-01-20 2021-11-02 医療法人浅田レディースクリニック 胚培養装置および培養環境に保持された処置卵を表示する方法
WO2022157855A1 (fr) * 2021-01-20 2022-07-28 医療法人浅田レディースクリニック Appareil de culture d'embryons et procédé d'affichage d'oeufs traités maintenus dans un environnement de culture
WO2022157854A1 (fr) * 2021-01-20 2022-07-28 医療法人浅田レディースクリニック Dispositif de culture d'embryons et dispositif d'imagerie associé

Also Published As

Publication number Publication date
TW201102040A (en) 2011-01-16
JPWO2011004568A1 (ja) 2012-12-20

Similar Documents

Publication Publication Date Title
WO2011004568A1 (fr) Procédé de traitement d’images pour l’observation d’Œufs fécondés, programme de traitement d’images, dispositif de traitement d’images et procédé de production d’Œufs fécondés
EP2234061B1 (fr) Procédé de traitement d'image pour image à intervalles préréglés, programme de traitement d'image et dispositif de traitement d'image
US8588504B2 (en) Technique for determining the state of a cell aggregation image processing program and image processing device using the technique, and method for producing a cell aggregation
JP4953092B2 (ja) 細胞観察における生細胞の判別手法、細胞観察の画像処理プログラム及び画像処理装置
WO2009119330A1 (fr) Procédé d'analyse d'image pour observation de cellule, programme et dispositif de traitement d'image
US20120134571A1 (en) Cell classification method, image processing program and image processing device using the method, and method for producing cell aggregation
EP2444479A1 (fr) Méthode permettant de déterminer de l'état d'un amas cellulaire, programme de traitement d'image et dispositif de traitement d'image utilisant ladite méthode et méthode de production d'un amas cellulaire
US20120122143A1 (en) Technique for determining maturity of a cell aggregation, image processing program and image processing device using the technique, and method for producing a cell aggregation
WO2012029817A1 (fr) Dispositif d'observation, programme d'observation, et système d'observation
WO2012117647A1 (fr) Programme d'observation et dispositif d'observation
WO2009119329A1 (fr) Procédé d'analyse d'image pour observation cellulaire, programme et dispositif de traitement d'image
JP6102166B2 (ja) 心筋細胞の運動検出方法、心筋細胞の培養方法、薬剤評価方法、画像処理プログラム及び画像処理装置
JP5516108B2 (ja) 観察装置、観察方法、及びプログラム
JP2011004638A (ja) 受精卵観察の画像処理方法、画像処理プログラム及び画像処理装置
JP2012039929A (ja) 受精卵観察の画像処理方法、画像処理プログラム及び画像処理装置、並びに受精卵の製造方法
JP2009229274A (ja) 細胞観察の画像解析方法、画像処理プログラム及び画像処理装置
JP2011017964A (ja) 培養観察装置
JP2016165274A (ja) 弁別的細胞イベントの定量的な識別を行う装置及び方法
JP2012039930A (ja) 培養物観察の画像処理方法、画像処理プログラム及び画像処理装置、並びに培養物の製造方法
JP2009229276A (ja) 細胞観察の画像解析方法、画像処理プログラム及び画像処理装置
JP2011010621A (ja) 培養物観察の画像処理方法、画像処理プログラム及び画像処理装置
JP2012039931A (ja) 観察装置、観察方法、及び培養物の製造方法
JP2012042327A (ja) 培養物観察の画像処理方法、画像処理プログラム及び画像処理装置、並びに培養物の製造方法
JP6567244B2 (ja) 細胞の運動観察方法、画像処理プログラム及び画像処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10796882

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011521808

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10796882

Country of ref document: EP

Kind code of ref document: A1