WO2017061155A1 - Information processing apparatus, information processing method, and information processing system - Google Patents
- Publication number
- WO2017061155A1 (PCT/JP2016/070121)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- detector
- unit
- region
- analysis
- information processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/255—Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and an information processing system.
- Patent Literature 1 discloses a technique for executing a plurality of region extraction algorithms on a plurality of image data and selecting the algorithm that extracts, with the highest accuracy, the feature in a region of interest designated by the user in one image.
- Patent Literature 2 discloses a technique for analyzing a cell by selecting an algorithm according to the type of the cell.
- In the technique disclosed in Patent Literature 1, an algorithm is determined according to the characteristics of the cell shown in a single image. Therefore, when the cell changes through growth or proliferation, it is difficult to analyze those changes using the algorithm determined once. Further, in the technique disclosed in Patent Literature 2, a detector for analyzing the state of a cell at a certain point in time is selected based on the cell type, so it is difficult to continuously analyze temporal changes in the shape or state of the cell, such as cell proliferation or cell death.
- the present disclosure proposes a new and improved information processing apparatus, information processing method, and information processing system capable of performing highly accurate analysis of cell changes.
- According to the present disclosure, there is provided an information processing apparatus including a detector determining unit that determines at least one detector according to an analysis method, and an analysis unit that performs analysis by the analysis method using the at least one detector determined by the detector determining unit.
- Also, there is provided an information processing method including determining at least one detector according to an analysis method, and performing analysis by the analysis method using the determined at least one detector.
- Further, there is provided an information processing system including an imaging apparatus including an imaging unit that generates a captured image, and an information processing apparatus including a detector determining unit that determines at least one detector according to an analysis method and an analysis unit that performs analysis by the analysis method on the captured image using the at least one detector determined by the detector determining unit.
- FIG. 2 is a block diagram illustrating a configuration example of an information processing device according to a first embodiment of the present disclosure.
- FIG. 3 is a table for explaining a detection recipe.
- FIG. 3 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
- FIG. 1 is a diagram illustrating an outline of a configuration of an information processing system 1 according to an embodiment of the present disclosure.
- the information processing system 1 includes an imaging device 10 and an information processing device 20.
- the imaging device 10 and the information processing device 20 are connected by various wired or wireless networks.
- the imaging device 10 is a device that generates a captured image (moving image).
- the imaging device 10 according to the present embodiment is realized by a digital camera, for example.
- the imaging device 10 may be realized by any device having an imaging function, such as a smartphone, a tablet, a game machine, or a wearable device.
- the imaging apparatus 10 captures real space using various members, such as an imaging element, for example a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor, and a lens for controlling the formation of a subject image on the imaging element.
- the imaging device 10 includes a communication device for transmitting and receiving captured images and the like with the information processing device 20.
- the imaging device 10 is provided above the imaging stage S for imaging the culture medium M in which the cells to be analyzed are cultured, and generates moving image data by imaging the culture medium M at a specific frame rate.
- the imaging device 10 may image the culture medium M directly (without passing through other members), or may image the culture medium M through other members such as a microscope.
- the frame rate is not particularly limited, but is preferably set according to the degree of change of the observation target. Note that the imaging device 10 images a certain imaging region including the culture medium M in order to correctly track changes in the observation target.
- the moving image data generated by the imaging device 10 is transmitted to the information processing device 20.
- the imaging device 10 is a camera installed in an optical microscope or the like, but the present technology is not limited to such an example.
- the imaging device 10 may be an imaging device included in an electron microscope using an electron beam, such as an SEM (Scanning Electron Microscope) or a TEM (Transmission Electron Microscope), or an imaging device included in an SPM (Scanning Probe Microscope) using a probe, such as an AFM (Atomic Force Microscope) or an STM (Scanning Tunneling Microscope).
- the captured image generated by the imaging device 10 is, for example, an image obtained by irradiating the observation target with an electron beam in the case of an electron microscope, or an image obtained by tracing the observation target with a probe in the case of an SPM.
- These captured images can also be analyzed by the information processing apparatus 20 according to the present embodiment.
- the information processing apparatus 20 is an apparatus having an image analysis function.
- the information processing apparatus 20 is realized by any apparatus having an image analysis function, such as a PC (Personal Computer), a tablet, and a smartphone. Further, the information processing apparatus 20 may be realized by one or a plurality of information processing apparatuses on a network.
- the information processing apparatus 20 acquires a captured image from the imaging apparatus 10 and performs tracking of a region to be observed on the acquired captured image.
- the analysis result of the tracking process by the information processing device 20 is output to a storage device or a display device provided inside or outside the information processing device 20. A functional configuration for realizing each function of the information processing apparatus 20 will be described later.
- Although the information processing system 1 is composed of the imaging device 10 and the information processing apparatus 20 in this example, the present technology is not limited to this example.
- For example, the imaging device 10 may perform the processing (for example, tracking processing) handled by the information processing device 20. In that case, the information processing system 1 is realized by an imaging device having a function of tracking an observation target.
- the cells to be observed, unlike ordinary subjects such as humans, animals, plants, living tissues, or inanimate structures, exhibit phenomena such as growth, division, fusion, deformation, and necrosis in a short time.
- In a technique in which a detector is selected on the basis of an image of a cell at a certain time point, when the cell changes its shape or state, it is difficult to keep analyzing the cell using the detector selected once. Likewise, in the technique disclosed in Japanese Patent No. 4852890, a detector for analyzing the state of the cell at a certain point in time is selected from the cell type, so it is difficult to continuously analyze temporal changes in cell shape or state, such as cell proliferation or cell death.
- Even when the observation target is an animal, a plant, or an inanimate structure, if the structure or shape of the observation target changes significantly in a short time, as in the growth of a thin film or a nanocluster crystal, it is difficult to continue analyzing the observation target using a detector selected merely according to its type.
- the information processing system 1 selects a detector associated with the analysis method or the evaluation method of the observation target from the detector group, and performs analysis using the selected detector.
- the information processing system 1 is mainly used for evaluating a change or the like of an observation target.
- the change or the like of the observation target is analyzed; for example, the information processing system 1 performs analysis on the observation target using the analysis method BB or CC selected for an evaluation. That is, analysis using a detector selected according to an evaluation method is included in analysis using a detector selected according to an analysis method. Therefore, in the present disclosure, the analysis method will be described as including an evaluation method.
- the overview of the information processing system 1 according to an embodiment of the present disclosure has been described above.
- the information processing apparatus 20 included in the information processing system 1 according to an embodiment of the present disclosure is realized in a plurality of embodiments.
- a specific configuration example and operation processing of the information processing apparatus 20 will be described.
- FIG. 2 is a block diagram illustrating a configuration example of the information processing apparatus 20-1 according to the first embodiment of the present disclosure.
- the information processing apparatus 20-1 includes a detector database (DB) 200, an analysis method acquisition unit 210, a detector determination unit 220, an image acquisition unit 230, a detection unit 240, a detection parameter adjustment unit 250, an area drawing unit 260, an analysis unit 270, and an output control unit 280.
- the detector DB 200 is a database that stores detectors necessary for detecting an analysis target.
- the detector stored by the detector DB 200 is used to calculate a feature amount from a captured image obtained by capturing an observation target, and to detect a region corresponding to the observation target based on the feature amount.
- a plurality of detectors are stored in the detector DB 200, and these detectors are optimized according to an analysis method or an evaluation method performed on a specific observation target. For example, in order to detect a specific change in the observation target, a plurality of detectors are associated with the specific change.
- a set of a plurality of detectors for detecting this specific change is defined herein as a “detection recipe”.
- the combination of detectors included in the detection recipe is determined in advance for each observation target and for each phenomenon that the observation target can develop.
- FIG. 3 is a table for explaining the detection recipe according to the present embodiment.
- each detection recipe is associated with a change (and an observation object) of a cell to be observed, and provides the detectors (and corresponding feature amounts) for detecting the change of the associated cell.
- the feature amount means a variable used for detecting an observation target.
- the attention area detector is a detector for detecting an area where an observation target exists from a captured image.
- the attention area detector includes, for example, a cell area detector when the observation target is a cell. This attention area detector is used, for example, to detect the existence area of an observation target by calculating feature quantities such as edges or shading.
- the identification area detector is a detector for detecting, from the captured image, an area that changes due to part or all of the observation target.
- the identification region detectors include, for example, a proliferation region detector, a rhythm region detector, a differentiation region detector, a lumen region detector, a death region detector, a neuronal cell body region detector, and an axon region detector when the observation target is a cell.
- This identification area detector is used, for example, to detect a change area of an observation target by calculating a feature quantity such as motion between a plurality of frames or LBP (Local Binary Pattern). Thereby, it becomes easy to analyze the characteristic change seen in the observation target.
- the detection recipe described above has an attention area detector and an identification area detector. By using such a detection recipe, it is possible to detect a region (a region of interest) corresponding to an observation target and identify a region in which the change of the observation target further occurs in the region of interest.
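As a rough illustration of the LBP feature mentioned above, a basic 8-neighbour LBP can be computed as follows (a minimal sketch, not part of the disclosure; the function name is ours):

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour Local Binary Pattern: each interior pixel gets a
    byte whose bits record whether each neighbour is at least as bright
    as the centre pixel."""
    g = np.asarray(gray, dtype=float)
    center = g[1:-1, 1:-1]
    # neighbour offsets, clockwise from top-left
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(center.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        neigh = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= (neigh >= center).astype(np.uint8) << bit
    return code
```

A detector would then work on histograms of such codes rather than on the raw pixels; variants with larger radii or rotation invariance are common.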
- the detection recipe may include only the attention area detector, or may include only the identification area detector.
- the detection recipe A is a detection recipe for detecting changes such as cell migration or infiltration. Therefore, the detection recipe A includes a cell region detector for detecting a cell region and a growth region detector for detecting a cell growth region that causes cell migration or invasion.
- a region corresponding to cancer cells is detected using a cell region detector, and further, cancer cells are detected using a growth region detector. A region causing infiltration can be detected.
- the detection recipe A may be prepared for each observation target: for example, a detection recipe Aa for detecting cancer cells, a detection recipe Ab for detecting blood cells, and a detection recipe Ac for detecting lymphocytes. This is because the features used for detection differ for each observation target.
- a plurality of identification region detectors may be included for one detection recipe.
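A detection recipe of this kind can be sketched as a simple data structure; the class and detector names below are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class DetectionRecipe:
    """A named set of detectors tied to one change of one observation target."""
    target: str
    change: str
    attention_detectors: list = field(default_factory=list)       # find where the target is
    identification_detectors: list = field(default_factory=list)  # find where it changes

# e.g. a recipe "Aa" specialised for cancer-cell migration/infiltration
recipe_aa = DetectionRecipe(
    target="cancer cell",
    change="migration/infiltration",
    attention_detectors=["cell_region"],
    identification_detectors=["growth_region"],
)
```

A recipe specialised per observation target (such as the recipe Aa for cancer cells mentioned above) then simply carries different detector lists, including several identification detectors where needed.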
- Thereby, even a new observation target can be detected and analyzed without preparing a dedicated detector for it anew.
- a region having a specific feature can be identified and analyzed.
- the detector as described above may be generated by machine learning using a set of an analysis method or an evaluation method for an observation target and a captured image including an image of the observation target as learning data.
- the analysis method or the evaluation method for the observation target is associated with at least one detection recipe. Therefore, detection accuracy can be improved by performing machine learning in advance using a captured image including an image of an observation target that is an object of an analysis method or an evaluation method corresponding to the detection recipe.
- the feature quantity used in the identification region detector may include time series information such as vector data. This is because time series information allows the degree of temporal change of the region to be identified in the observation target to be detected with higher accuracy.
- the machine learning described above may be, for example, machine learning using boosting, a support vector machine, or the like. According to these methods, a detector is generated for feature amounts that a plurality of images of the observation target have in common.
- the feature amount used in these methods may be, for example, an edge, LBP, or Haar-like feature amount.
- Deep Learning may be used as machine learning. In Deep Learning, feature quantities for detecting the above regions are automatically generated, so that a detector can be generated simply by machine learning of a set of learning data.
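To make the boosting option concrete, the following is a minimal AdaBoost over decision-stump weak learners on precomputed feature vectors (an illustrative sketch only; real detectors would be trained on feature amounts such as edges, LBP, or Haar-like features):

```python
import numpy as np

def train_stumps(X, y, rounds=10):
    """Minimal AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # sample weights
    ensemble = []
    for _ in range(rounds):
        best = None
        # exhaustively pick the stump (feature, threshold, polarity)
        # with the smallest weighted error
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-12)        # avoid log(0) on a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        pred = sign * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)   # upweight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all stumps."""
    score = np.zeros(X.shape[0])
    for alpha, j, thr, sign in ensemble:
        score += alpha * sign * np.where(X[:, j] >= thr, 1, -1)
    return np.where(score >= 0, 1, -1)
```

In this framing, each "detector" is such an ensemble trained on labelled patches of the observation target; a Deep Learning detector would replace the hand-chosen feature vectors with learned ones.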
- the analysis method acquisition unit 210 acquires information regarding an analysis method or an evaluation method for analyzing an observation target (hereinafter collectively called the analysis method, because the evaluation method is included in the analysis method as described above).
- the analysis method acquisition unit 210 may acquire an analysis method input by the user via an input unit (not shown) when the observation target is analyzed using the information processing apparatus 20-1.
- the analysis method acquisition unit 210 may acquire the analysis method from a storage unit (not shown) at a predetermined time.
- the analysis method acquisition unit 210 may acquire an analysis method via a communication unit (not shown).
- the analysis method acquisition unit 210 acquires information related to an analysis method (evaluation method) such as “scratch assay of cancer cells” and “evaluation of drug efficacy of cardiomyocytes”, for example.
- the analysis method may also simply be "size analysis", "motion analysis", or the like.
- the analysis method acquisition unit 210 may acquire information on the type of cell to be observed in addition to the analysis method.
- Information regarding the analysis method acquired by the analysis method acquisition unit 210 is output to the detector determination unit 220.
- the detector determination unit 220 determines at least one detector according to the information on the analysis method acquired from the analysis method acquisition unit 210. For example, the detector determining unit 220 determines a detection recipe associated with the type of the acquired analysis method, and acquires the detector included in the detection recipe from the detector DB 200.
- FIG. 4 is a table showing an example of a detection recipe corresponding to the analysis method.
- one analysis method is associated with at least one change (and observation object) of cells to be observed. This is because cell analysis is performed for specific changes in the cell. Further, as shown in FIG. 3, each change in the observation target is associated with a detection recipe. Therefore, if the analysis method is determined, the detector used for the detection process is also determined according to the analysis method.
- the detector determining unit 220 determines a detection recipe A corresponding to the cancer cell scratch assay. This is because the cancer cell scratch assay evaluates cancer cell migration and invasion.
- the detection recipe A determined here may be a detection recipe Aa corresponding to a cancer cell. Thereby, detection accuracy and analysis accuracy can be further improved.
- the detector determination unit 220 acquires the detectors included in the detection recipe A from the detector DB 200.
- the detector determining unit 220 determines detection recipe B, detection recipe C, and detection recipe D as the detection recipes corresponding to cardiomyocyte drug efficacy evaluation. This is because cardiomyocyte drug efficacy evaluation assesses the rhythm, proliferation, division, or cell death of cardiomyocytes caused by drug administration. In this case, a detection recipe B corresponding to rhythm, a detection recipe C corresponding to proliferation and division, and a detection recipe D corresponding to cell death are determined. By detecting using the detectors included in these detection recipes, it is possible to classify the rhythmic region, the dividing region, the cell death region, and the like of the cardiomyocytes. Thereby, the analysis results can be further enriched.
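The mapping from an analysis method to detection recipes, and from recipes to detectors, can be sketched as two table lookups; the table contents below are illustrative placeholders following the examples above:

```python
# Hypothetical recipe tables; recipe letters follow the examples in the text,
# detector names are placeholders.
RECIPES = {
    "A": ["cell_region", "growth_region"],         # migration / infiltration
    "B": ["cell_region", "rhythm_region"],         # rhythm
    "C": ["cell_region", "proliferation_region"],  # proliferation / division
    "D": ["cell_region", "death_region"],          # cell death
}
METHOD_TO_RECIPES = {
    "cancer cell scratch assay": ["A"],
    "cardiomyocyte drug efficacy": ["B", "C", "D"],
}

def detectors_for(method):
    """Collects, without duplicates, every detector required by the
    recipes associated with one analysis method."""
    names = []
    for recipe in METHOD_TO_RECIPES[method]:
        for det in RECIPES[recipe]:
            if det not in names:
                names.append(det)
    return names
```

Deduplication matters because several recipes for one evaluation (B, C, and D here) typically share the same attention area detector.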
- When the detector determination unit 220 determines a plurality of detectors according to the analysis method, the following analysis is also possible. For example, there are cases where it is desired to analyze a plurality of types of cells simultaneously.
- the detector determining unit 220 can detect a plurality of types of cells at a time by acquiring detectors according to a plurality of analysis methods. Thereby, for example, when analyzing fertilization, an egg and a sperm can each be detected and analyzed. When the interaction between cancer cells and immune cells is to be analyzed, the two cell types can each be detected and analyzed. It is also possible to identify the cells (red blood cells, white blood cells, or platelets) included in a blood cell group.
- the function of the detector determination unit 220 has been described above. Information regarding the detector determined by the detector determination unit 220 is output to the detection unit 240.
- the image acquisition unit 230 acquires image data including a captured image generated by the imaging device 10 via a communication device (not shown). For example, the image acquisition unit 230 acquires the moving image data generated by the imaging device 10 in time series. The acquired image data is output to the detection unit 240.
- the image acquired by the image acquisition unit 230 includes an RGB image or a grayscale image.
- when the acquired captured image is an RGB image, the image acquisition unit 230 may convert it into a grayscale image.
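Such an RGB-to-grayscale conversion can be sketched as a weighted channel sum; the BT.601 luma weights below are one common choice, assumed here for illustration:

```python
import numpy as np

def to_grayscale(rgb):
    """Converts an H x W x 3 RGB array to a grayscale H x W array using
    the common ITU-R BT.601 luma weights."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb @ np.array([0.299, 0.587, 0.114])
```

Because the weights sum to 1, a pure white pixel maps to the same full-scale value in grayscale.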
- the detection unit 240 detects a region of interest for the captured image acquired by the image acquisition unit 230 using the detector determined by the detector determination unit 220.
- the attention area is an area corresponding to the observation target as described above.
- the detection unit 240 detects a region corresponding to the observation target in the captured image by using the attention area detector included in the detection recipe. Moreover, the detection unit 240 detects a region in which the observation target changes by using the identification region detector included in the detection recipe.
- the detection unit 240 calculates a feature amount designated by the detector from the acquired captured image, and generates feature amount data regarding the captured image.
- the detection unit 240 detects the attention area from the captured image using the feature amount data.
- Boosting, for example, may be used as an algorithm for the detection unit 240 to detect the attention area.
- the feature amount data generated for the captured image is data regarding the feature amount specified by the detector used by the detection unit 240. If the detector used by the detection unit 240 is generated by a learning method that does not require preset feature amounts, such as Deep Learning, the detection unit 240 calculates, from the captured image, the feature amounts automatically set by the detector.
- the detection unit 240 may detect each region of interest using the plurality of detectors.
- the detection unit 240 may detect a region of interest using the attention area detector, and may further detect, from the previously detected region of interest, a region desired to be identified using the identification region detector. Thereby, the specific change of the observation target to be analyzed can be detected in more detail.
- the detection unit 240 detects an observation target using the detection recipe A (see FIG. 3) determined by the detector determination unit 220.
- the detection recipe A includes a cell region detector and a growth region detector for cancer cells.
- the detection unit 240 can detect a region corresponding to a cancer cell using the cell region detector, and can further detect a region in which the cancer cell causes infiltration using the growth region detector.
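A minimal form of such region detection, reduced to thresholding a precomputed feature map and taking the bounding box of the responding pixels, might look like this (illustrative only; the actual detectors are learned):

```python
import numpy as np

def detect_attention_region(feature_map, threshold):
    """Returns (top, left, bottom, right) of the pixels whose feature
    value exceeds the threshold, or None when nothing responds."""
    ys, xs = np.nonzero(np.asarray(feature_map) > threshold)
    if ys.size == 0:
        return None
    return int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())
```

Running one such pass per detector in the recipe yields the attention region first and then the identification regions inside it.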
- the detection unit 240 may perform processing for associating the detected attention area with the analysis result obtained by the analysis unit 270. For example, as described later in detail, the detection unit 240 may assign an ID for identifying an analysis method or the like to each detected attention area. Thereby, for example, each analysis result obtained in the post-analysis processing of each attention area can be managed more easily. Moreover, the detection unit 240 may determine the value of the ID assigned to each attention area according to the detector used for detection.
- For example, the detection unit 240 may assign IDs "10000001" and "10000002" to two regions of interest detected using a first detector, and may assign an ID "00010001" to one attention area detected using a second detector.
- Alternatively, the detection unit 240 may assign an ID "1000001" to the attention area.
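The ID examples above suggest IDs whose leading digits encode the detector and whose trailing digits count regions per detector; a sketch under that assumption (the four-plus-four digit split is our reading of the examples, not stated in the text):

```python
from collections import defaultdict

class RegionIdAllocator:
    """Builds IDs whose first four digits encode the detector and whose
    last four digits are a per-detector sequence number."""
    def __init__(self):
        self._counts = defaultdict(int)

    def next_id(self, detector_code):
        self._counts[detector_code] += 1
        return f"{detector_code:04d}{self._counts[detector_code]:04d}"
```

With detector codes 1000 and 1, this reproduces "10000001", "10000002", and "00010001" from the examples.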
- the detection unit 240 may detect the attention area based on the detection parameter.
- the detection parameter means a parameter that can be adjusted according to the state of the captured image that varies depending on the state of the observation target or the observation condition, or the imaging condition or specification of the imaging device 10. More specifically, the detection parameters include the scale of the captured image, the size of the observation target, the speed of movement, the size of the cluster formed by the observation target, a random variable, and the like.
- the detection parameter may be automatically adjusted according to the state of the observation target or the observation conditions as described above, or according to the imaging parameters of the imaging apparatus 10 (for example, imaging magnification, imaging frame, or brightness). Further, the detection parameter may be adjusted by the detection parameter adjustment unit described later.
- the detection unit 240 outputs the detection result (information such as the attention region, the identification region, and the label) to the region drawing unit 260 and the analysis unit 270.
- the detection parameter adjustment unit 250 adjusts the detection parameters related to the detection processing of the detection unit 240 according to the state of the observation target, the observation conditions, the imaging conditions of the imaging device 10, or the like. For example, the detection parameter adjustment unit 250 may automatically adjust the detection parameter according to each of the above states and conditions, or the detection parameter may be adjusted by a user operation.
- FIG. 5 is a diagram illustrating an example of an interface for inputting adjustment contents to the detection parameter adjustment unit 250 according to the present embodiment.
- the interface 2000 for adjusting the detection parameters includes a detection parameter type 2001 and a slider 2002.
- the detection parameter types 2001 include Size Ratio (the reduction ratio of the captured image), Object Size (a threshold on detection size), Cluster Size (a threshold for determining whether the observation targets corresponding to detected attention areas are the same), and Step Size (the frame interval of the detection processing).
- other detection parameters such as a luminance threshold value may be included in the detection parameter type 2001 as an adjustment target. These detection parameters are changed by operating the slider 2002.
- the detection parameter adjusted by the detection parameter adjustment unit 250 is output to the detection unit 240.
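The adjustable parameters of the interface 2000 can be held in a small configuration object; the defaults and clamping ranges below are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class DetectionParams:
    """Adjustable detection parameters mirroring the sliders of FIG. 5."""
    size_ratio: float = 1.0   # Size Ratio: reduction ratio of the captured image
    object_size: int = 20     # Object Size: minimum detection size (pixels)
    cluster_size: int = 50    # Cluster Size: same-target merging threshold
    step_size: int = 1        # Step Size: process every N-th frame

    def clamp(self):
        """Keeps slider values inside sane ranges before they are
        handed to the detection unit."""
        self.size_ratio = min(max(self.size_ratio, 0.1), 1.0)
        self.object_size = max(self.object_size, 1)
        self.cluster_size = max(self.cluster_size, 0)
        self.step_size = max(self.step_size, 1)
        return self
```

Additional parameters, such as a luminance threshold, would be added as further fields of the same object.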
- the area drawing unit 260 superimposes the detection results such as the attention area, the identification area, and the ID on the captured image that is the target of the detection process of the detection unit 240.
- the area drawing unit 260 may indicate the attention area, the identification area, and the like by a graphic such as a straight line, a curve, or a plane closed by a curve, for example.
- the shape of the plane showing these regions may be an arbitrary shape such as a rectangle, a circle, an ellipse, or the like, or may be a shape formed according to the contour of the region corresponding to the observation target.
- the area drawing unit 260 may display the ID in the vicinity of the attention area or the identification area. Specific drawing processing by the area drawing unit 260 will be described later.
- the area drawing unit 260 outputs the drawing processing result to the output control unit 280.
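Superimposing a rectangular region figure on a grayscale frame, as the area drawing unit 260 does, can be sketched as follows (a minimal version that draws only a box outline on a copy of the frame):

```python
import numpy as np

def draw_region_box(image, top, left, bottom, right, value=255):
    """Superimposes a rectangular outline (one kind of region figure)
    onto a copy of a grayscale frame; the original is left untouched."""
    out = np.array(image, copy=True)
    out[top, left:right + 1] = value      # top edge
    out[bottom, left:right + 1] = value   # bottom edge
    out[top:bottom + 1, left] = value     # left edge
    out[top:bottom + 1, right] = value    # right edge
    return out
```

Curved outlines, contour-following shapes, and ID labels near the region would be further drawing primitives layered on the same copy.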
- the analysis unit 270 analyzes the attention area (and the identification area) detected by the detection unit 240. For example, the analysis unit 270 performs an analysis based on an analysis method associated with the detector used for detecting the attention area on the attention area.
- the analysis performed by the analysis unit 270 is an analysis for quantitatively evaluating, for example, the growth, proliferation, division, cell death, movement, or shape change of a cell to be observed. In this case, the analysis unit 270 calculates, for example, feature quantities such as cell size, area, number, shape (for example, roundness), and motion vector from the attention area or the identification area.
- the analysis unit 270 analyzes the degree of migration or invasion of the region of interest corresponding to the cancer cells. Specifically, the analysis unit 270 analyzes a region in which a phenomenon of migration or invasion occurs in a region of interest corresponding to a cancer cell. The analysis unit 270 calculates the area, size, motion vector, and the like of the region of interest as a feature amount of the region of interest or a region where migration or infiltration occurs.
- when drug efficacy evaluation is performed on cardiomyocytes, the analysis unit 270 performs analysis separately for, among the regions of interest corresponding to the cardiomyocytes, the region in which rhythm is generated, the region in which proliferation (division) occurs, and the region in which cell death occurs. More specifically, the analysis unit 270 may analyze the magnitude of the rhythm of the region where the rhythm is generated, analyze the division speed of the region where proliferation occurs, and analyze the area of the region where cell death occurs. Thus, the analysis unit 270 may perform analysis for each detection result obtained by the detection unit 240 using each detector. As a result, even for a single type of cell, a plurality of analyses can be performed at a time, so an evaluation requiring a plurality of analyses can be performed comprehensively.
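Feature quantities such as area, centroid, and a motion vector can be computed from binary region masks; a minimal sketch (treating the motion vector as centroid displacement between frames is a simplifying assumption):

```python
import numpy as np

def area_and_centroid(mask):
    """Area (pixel count) and centroid of a binary region mask."""
    ys, xs = np.nonzero(mask)
    return ys.size, (float(ys.mean()), float(xs.mean()))

def motion_vector(mask_prev, mask_curr):
    """Centroid displacement of a region between two frames - a simple
    stand-in for the motion feature amount described above."""
    _, (y0, x0) = area_and_centroid(mask_prev)
    _, (y1, x1) = area_and_centroid(mask_curr)
    return (y1 - y0, x1 - x0)
```

Shape features such as roundness would additionally require a perimeter estimate, which is why they are usually computed with a contour-extraction routine rather than from the raw mask alone.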
- the analysis unit 270 outputs an analysis result including the calculated feature amount and the like to the output control unit 280.
- the output control unit 280 outputs the drawing information acquired from the region drawing unit 260 (the captured image after the region superimposition) and the analysis result acquired from the analysis unit 270 as output data.
- the output control unit 280 may display the output data on a display unit (not shown) provided inside or outside the information processing apparatus 20-1.
- the output control unit 280 may store the output data in a storage unit (not shown) provided inside or outside the information processing apparatus 20-1.
- the output control unit 280 may transmit the output data to an external device (server, cloud, terminal device) or the like via a communication unit (not shown) included in the information processing device 20-1.
- the output control unit 280 may display a captured image on which the region drawing unit 260 has superimposed an ID and a figure indicating at least one of the region of interest and the identification region.
- the output control unit 280 may output the analysis result acquired from the analysis unit 270 in association with the region of interest.
- the output control unit 280 may output the analysis result with an ID for identifying the region of interest. Thereby, the observation object corresponding to the attention area can be output in association with the analysis result.
- the output control unit 280 may process the analysis results acquired from the analysis unit 270 into a table, graph, chart, or the like, or may output them as a data file in a format suitable for analysis by another analysis device.
- the output control unit 280 may further superimpose a display indicating the analysis result on a captured image including a graphic indicating the region of interest and output the captured image.
- the output control unit 280 may output a heat map that is color-coded according to the analysis result (for example, the magnitude of the movement) of the specific movement of the observation target, superimposed on the captured image.
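Such a color-coded overlay can be produced by mapping each analysis value to a color. The following is a minimal sketch, not the disclosed implementation, that maps a scalar magnitude linearly from blue (low) to red (high):

```python
def heatmap_color(value, vmin, vmax):
    """Map a scalar analysis result (e.g. motion magnitude) to an
    RGB color from blue (low) to red (high), for overlaying on the
    captured image as a heat map. Values outside [vmin, vmax] are
    clamped to the ends of the scale."""
    if vmax <= vmin:
        t = 0.0
    else:
        t = min(max((value - vmin) / (vmax - vmin), 0.0), 1.0)
    r = int(round(255 * t))
    b = int(round(255 * (1.0 - t)))
    return (r, 0, b)
```

Each pixel or region of the overlay would be tinted with the returned color before being alpha-blended onto the captured image.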
- FIG. 6 is a flowchart illustrating an example of processing performed by the information processing device 20-1 according to the first embodiment of the present disclosure.
- the analysis method acquisition unit 210 acquires information on an analysis method through a user operation or batch processing (S101).
- the detector determination unit 220 acquires information on the analysis method from the analysis method acquisition unit 210, and selects and determines a detection recipe associated with the analysis method from the detector DB 200 (S103).
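The association between analysis methods and detection recipes can be pictured as a lookup table. The sketch below is purely illustrative; all method and detector names are hypothetical, since the disclosure does not fix concrete identifiers:

```python
# Illustrative detection-recipe table: each analysis method is
# associated with the set of detectors needed to carry it out.
# All method and detector names here are hypothetical examples.
DETECTION_RECIPES = {
    "cancer_migration": ["cell_region_detector", "migration_detector"],
    "cardiomyocyte_efficacy": ["rhythm_detector",
                               "proliferation_detector",
                               "cell_death_detector"],
}

def determine_detectors(analysis_method):
    """Return the detection recipe (list of detector names)
    associated with the requested analysis method."""
    try:
        return DETECTION_RECIPES[analysis_method]
    except KeyError:
        raise ValueError(f"no detection recipe for {analysis_method!r}")
```

The detection step (S107) would then iterate over the returned list, applying each detector to the captured image in turn.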
- the image acquisition unit 230 acquires data related to the captured image generated by the imaging device 10 via a communication unit (not shown) (S105).
- FIG. 7 is a diagram illustrating an example of a captured image generated by the imaging device 10 according to the present embodiment.
- a captured image 1000 includes cancer cell regions 300a, 300b, and 300c, and immune cell regions 400a and 400b.
- This captured image 1000 is a captured image obtained by the imaging device 10 imaging cancer cells and immune cells present in the medium M.
- regions of interest corresponding to cancer cells and immune cells are detected, and each region of interest is analyzed.
- the detection unit 240 detects a region of interest using a detector included in the detection recipe determined by the detector determination unit 220 (S107). Then, the detection unit 240 performs labeling on the detected attention area (S109).
- the detection unit 240 detects the region of interest using all the detectors (S111). For example, in the example shown in FIG. 7, the detection unit 240 uses two detectors: a detector for detecting cancer cells and a detector for detecting immune cells.
- the region drawing unit 260 draws the region of interest and its associated ID on the captured image used for the detection process (S113).
- FIG. 8 is a diagram illustrating an example of a drawing process performed by the area drawing unit 260 according to the present embodiment.
- rectangular attention regions 301a, 301b, and 301c are drawn around the cancer cell regions 300a, 300b, and 300c.
- rectangular attention regions 401a and 401b are drawn around the immune cell regions 400a and 400b.
- the region drawing unit 260 may change the line type of the outline indicating the region of interest, for example to a solid line or a broken line.
- the area drawing unit 260 may attach an ID indicating the attention area in the vicinity of each of the attention areas 301 and 401 (in the example illustrated in FIG. 8, outside the frame of the attention area).
- IDs 302a, 302b, 302c, 402a, and 402b may be attached in the vicinity of the attention areas 301a, 301b, 301c, 401a, and 401b.
- ID 302a is displayed as “ID: 00000001”, and ID 402a is displayed as “ID: 00010001”.
- the ID numbering is not limited to the above example; IDs may be numbered so that regions can be easily distinguished according to the type of analysis or the state of the cell.
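One plausible reading of the IDs shown in FIG. 8 ("00000001" for a cancer cell, "00010001" for an immune cell) is a block-numbering scheme in which each cell type owns a range of serial numbers. The exact scheme is not specified by the disclosure; this Python sketch shows one way it could be done:

```python
def make_region_id(type_index, serial, block=10000, width=8):
    """Number a region of interest so its cell type can be read off
    the ID: each type gets its own block of serial numbers.
    type_index 0, serial 1 -> '00000001'
    type_index 1, serial 1 -> '00010001'
    (One plausible encoding of the IDs shown in FIG. 8; the
    disclosure does not fix the exact scheme.)"""
    return str(type_index * block + serial).zfill(width)
```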
- the output control unit 280 outputs the drawing information by the region drawing unit 260 (S115).
- the analysis unit 270 analyzes the attention area detected by the detection unit 240 (S117).
- the output control unit 280 outputs the analysis result by the analysis unit 270 (S119).
- FIG. 9 is a diagram illustrating an output example by the output control unit 280 according to the present embodiment.
- the display unit D (provided inside or outside the information processing apparatus 20-1) shows the captured image 1000 drawn by the region drawing unit 260 and a table 1100 showing the analysis results of the analysis unit 270. The regions of interest and IDs are superimposed on the captured image 1000. The table 1100 shows the length (Length), size (Size), roundness (Circularity), and cell type of the region of interest corresponding to each ID. For example, the row for ID "00000001" in table 1100 shows the length (150), size (1000), roundness, and cell type of the corresponding region of interest.
- the output control unit 280 may output the analysis results as a table, or the output control unit 280 may output the analysis results in a format such as a graph or mapping.
- a detection recipe (detector) is determined according to the analysis method acquired by the analysis method acquisition unit 210, and the detection unit 240 detects a region of interest from the captured image using the determined detector.
- the analysis unit 270 analyzes the region of interest.
- the user can detect the observation target from the captured image and analyze the observation target only by determining the analysis method of the observation target.
- a detector suitable for each shape and state of the observation object that changes with the passage of time is selected. This makes it possible to analyze the observation target with high accuracy regardless of the change in the observation target.
- a detector suitable for detecting a change in the observation target is automatically selected, which improves convenience for the user who wants to analyze the change in the observation target.
- the detection unit 240 first detects regions of interest for a plurality of cells using one detector, and then narrows down, using another detector, the regions of interest corresponding to observation targets that show a specific change. Thereby, from the plurality of regions of interest, only those corresponding to observation targets showing the specific change can be made analysis targets.
- FIG. 10 is a diagram showing a first output example by the region-of-interest narrowing processing by the plurality of detectors according to the present embodiment.
- captured image 1001 includes cancer cell regions 311a, 311b, 410a and 410b.
- the cancer cell regions 311a and 311b are regions that have changed from the cancer cell regions 310a and 310b one frame before due to the proliferation of the cancer cells.
- the cancer cell regions 410a and 410b have not changed (e.g., due to cell death or inactivity).
- the detection unit 240 first detects the region of interest using a detector (cell region detector) that detects the region of the cancer cell. Then, the detection unit 240 further narrows down the attention area where the proliferation phenomenon has occurred from the attention area detected previously using a detector (growth area detector) that detects the area where the cells are growing.
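The two-stage narrowing described above (a cell region detector followed by a growth area detector) can be sketched as a generic pipeline in which the first detector proposes candidate regions and the second keeps only those showing the specific change. Both detector interfaces below are assumptions for illustration, not the disclosed implementation:

```python
def narrow_regions(image, primary_detector, secondary_detector):
    """Two-stage narrowing of regions of interest.

    The primary detector proposes candidate regions (e.g. all
    cancer-cell regions); the secondary detector keeps only those
    showing the specific change (e.g. proliferation).  The primary
    is assumed to map an image to a list of regions; the secondary
    is assumed to be a per-region predicate."""
    candidates = primary_detector(image)
    return [r for r in candidates if secondary_detector(image, r)]
```

With this shape, the same helper narrows by proliferation, migration, or any other change simply by swapping the secondary detector.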
- attention regions 312a and 312b are drawn around the cancer cell regions 311a and 311b.
- motion vectors 313a and 313b which are feature quantities indicating motion, are drawn inside the attention areas 312a and 312b.
- rectangular regions 411a and 411b are drawn around the cancer cell regions 410a and 410b, but the line type of the rectangular regions 411 is set to be different from the line type of the regions of interest 312.
- the analysis results corresponding to the narrowed attention area 312 are displayed.
- the growth rate of cancer cells corresponding to the attention area 312 is displayed in the table 1200.
- the state of the cancer cells corresponding to the region of interest 312 is indicated as "Carcinoma Proliferation", and table 1200 shows that the cancer cells are in a proliferating state.
- the detection unit 240 detects a plurality of regions of interest of one type of cell using a plurality of detectors. Thereby, even if one cell has a plurality of different features, it is possible to analyze the attention area detected according to each feature. Therefore, for example, even when one cell has a specific feature such as an axon like a nerve cell, it is possible to detect and analyze only the region of the axon.
- FIG. 11 is a diagram showing a second output example by the region-of-interest narrowing processing by the plurality of detectors according to the present embodiment.
- the captured image 1002 includes a nerve cell region 320.
- the nerve cell includes a nerve cell body and an axon. Since the nerve cell body has a planar structure, the region 320A of the nerve cell body included in the captured image 1002 is easy to detect, but the axon is an elongated, three-dimensional structure, so it is difficult to distinguish the axon region 320B from the background of the captured image 1002, as shown in FIG. 11. Therefore, the detection unit 240 according to the present embodiment detects the nerve cell body and the axon separately, using two detectors: a detector for detecting the nerve cell body region and a detector for detecting the axon region.
- when the detection unit 240 uses the detector for detecting the nerve cell body region, it detects the region of interest 321 corresponding to the nerve cell body.
- when the detection unit 240 uses the detector for detecting the axon region, it detects the region of interest 322 corresponding to the axon.
- the attention area 322 may be drawn by a curve indicating an axon area.
- FIG. 12 is a block diagram illustrating a configuration example of the information processing device 20-2 according to the second embodiment of the present disclosure.
- the information processing apparatus 20-2 includes a detector database (DB) 200, an analysis method acquisition unit 210, a detector determination unit 220, an image acquisition unit 230, a detection unit 240, a detection parameter adjustment unit 250, a region drawing unit 260, an analysis unit 270, and an output control unit 280, and further includes a shape setting unit 290 and a region specifying unit 295.
- functions of the shape setting unit 290 and the region specifying unit 295 will be described.
- the shape setting unit 290 sets a display shape indicating the region of interest drawn by the region drawing unit 260.
- FIG. 13 is a diagram illustrating an example of the shape setting process of the region of interest by the shape setting unit 290 according to the present embodiment.
- a region of interest 331 is drawn around the region 330 to be observed.
- the shape setting unit 290 may set the display shape indicating the attention area 331 to a rectangle (area 331a) or an ellipse (area 331b).
- the shape setting unit 290 may detect a region corresponding to the contour of the observation target region 330 by image analysis of the captured image (not shown), and set the shape obtained from that detection result as the shape of the region of interest 331. For example, as illustrated in FIG. 13, the shape setting unit 290 may detect the contour of the observation target region 330 by image analysis and use the closed curve (or curve) tracing the detected contour as the shape of the region of interest 331 (for example, region 331c). Thereby, the observation target region 330 and the region of interest 331 can be associated more closely on the captured image.
- for detecting such a contour, a fitting technique such as Snakes or Level Set can be used.
- the region drawing unit 260 may perform the shape setting process of the region of interest based on the shape of the outline of the region to be observed as described above.
- the region drawing unit 260 may set the shape of the attention region using the attention region detection result by the detection unit 240.
- the detection result can be used as it is for setting the shape of the region of interest, and there is no need to perform image analysis on the captured image again.
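Reusing the detection result for shape setting can be as simple as deriving the bounding rectangle and an inscribed ellipse directly from the detected pixels, with no second image-analysis pass. A minimal sketch under that assumption (the pixel-list input format is illustrative):

```python
def bounding_shapes(pixels):
    """Derive display shapes for a region of interest directly from
    the detector's pixel-level result, so no second image-analysis
    pass over the captured image is needed.  `pixels` is an iterable
    of (x, y) coordinates belonging to the detected region."""
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    rect = (x0, y0, x1, y1)                      # axis-aligned rectangle
    ellipse = (((x0 + x1) / 2, (y0 + y1) / 2),   # center
               (x1 - x0) / 2, (y1 - y0) / 2)     # semi-axes
    return {"rectangle": rect, "ellipse": ellipse}
```

The region drawing unit could then pick either shape depending on the display style set for the region of interest.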
- the region specifying unit 295 specifies a region of interest that is to be analyzed by the analysis unit 270 from the region of interest detected by the detection unit 240.
- the region specifying unit 295 specifies a region of interest to be analyzed among a plurality of regions of interest detected by the detection unit 240 according to a user operation or a predetermined condition.
- the analysis unit 270 analyzes the region of interest specified by the region specifying unit 295. More specifically, when a region of interest is specified by a user operation, the region specifying unit 295 determines, according to that operation, which of the plurality of regions of interest displayed by the output control unit 280 is specified, and the analysis unit 270 analyzes the selected region of interest.
- FIG. 14 is a diagram illustrating an example of a region-of-interest specifying process by the region specifying unit 295 according to the present embodiment.
- the display unit D includes a captured image 1000 and a table 1300 indicating analysis results.
- the captured image 1000 includes cancer cell regions 350a, 350b, and 350c, and other cell regions 400a and 400b.
- the detection unit 240 detects a region of interest corresponding to the cancer cell region 300.
- the region drawing unit 260 draws attention regions around the cancer cell regions 350a, 350b, and 350c, and the output control unit 280 displays each attention region.
- the region of interest 351a corresponding to the cancer cell region 350a and the region of interest 351b corresponding to the cancer cell region 350b are selected as the region of interest to be analyzed by the region specifying unit 295.
- the region of interest corresponding to the cancer cell region 350c is excluded from the selection, and is therefore not subject to analysis. Thereby, only the selected regions of interest 351a and 351b are analyzed.
- Table 1300 includes the IDs (IDs 352a and 352b) corresponding to the regions of interest 351a and 351b, and entries for the length, size, roundness, and cell type of each region of interest.
- only the analysis results for the regions of interest specified by the region specifying unit 295 are displayed in the table 1300. Alternatively, as with the selection of regions of interest described above, analysis results for all regions of interest detected before the region specification processing by the region specifying unit 295 may be displayed in the table 1300. In this case, the analysis results for regions of interest not specified by the region specifying unit 295 may be removed from the table 1300.
- the region specifying unit 295 may specify a region of interest as an analysis target again by re-selecting a region of interest once removed from the analysis targets; in this case, the analysis result of that region of interest may be displayed again in the table 1300.
- in this way, the user can freely select the necessary analysis results and extract those needed for an evaluation. Further, for example, it is possible to compare the analysis results of a plurality of regions of interest and to perform a new analysis based on the comparison.
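The select/deselect behavior of the region specifying unit 295 can be sketched as a small bookkeeping class: toggling a region out of the analysis targets removes its row from the results table, and toggling it back restores the row. Names and data layout here are illustrative assumptions, not the disclosed implementation:

```python
class RegionSpecifier:
    """Tracks which detected regions of interest are analysis targets.

    A region can be removed from the analysis targets and later
    re-selected, at which point its analysis result reappears in
    the results table."""

    def __init__(self, all_ids):
        self.selected = set(all_ids)

    def toggle(self, region_id):
        """Flip a region in or out of the analysis targets."""
        if region_id in self.selected:
            self.selected.discard(region_id)
        else:
            self.selected.add(region_id)

    def table(self, analysis_rows):
        """Return only the analysis rows for selected regions."""
        return [r for r in analysis_rows if r["id"] in self.selected]
```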
- a display 340 (340a and 340b) indicating a region of interest specified by the region specifying unit 295 may be shown in the vicinity of the region of interest 351. This makes it possible to see which regions of interest are specified as analysis targets.
- the configuration example of the information processing apparatus 20-2 according to the second embodiment of the present disclosure has been described.
- according to the second embodiment, the shape of the figure that indicates a region of interest can be set; for example, a shape fitted to the contour of the observation target region can be set as the shape of the region of interest.
- also, the region to be analyzed can be specified from among the detected regions of interest.
- the information processing apparatus 20-2 includes both the shape setting unit 290 and the region specifying unit 295, but the present technology is not limited to such an example.
- the information processing apparatus may add only the shape setting unit 290, or only the region specifying unit 295, to the configuration of the information processing apparatus according to the first embodiment of the present disclosure.
- FIG. 15 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
- the illustrated information processing apparatus 900 can be realized, for example, by the information processing apparatus 20 in the above-described embodiment.
- the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 925, and a communication device 929.
- the information processing apparatus 900 may include a processing circuit called DSP (Digital Signal Processor) or ASIC (Application Specific Integrated Circuit) instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing unit and a control unit, and controls all or a part of the operation in the information processing apparatus 900 according to various programs recorded in the ROM 903, the RAM 905, the storage apparatus 919, or the removable recording medium 923.
- the CPU 901 controls the overall operation of each functional unit included in the information processing apparatus 20 in the above embodiment.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device 927 such as a mobile phone that supports the operation of the information processing device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
- the output device 917 is a device that can notify the user of the acquired information visually or audibly.
- the output device 917 can be, for example, a display device such as an LCD, PDP, and OELD, an acoustic output device such as a speaker and headphones, and a printer device.
- the output device 917 outputs the results obtained by the processing of the information processing apparatus 900 as video, such as text or an image, or as audio, such as voice.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the drive 921 is a reader / writer for a removable recording medium 923 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 923 and outputs the information to the RAM 905.
- the drive 921 writes data to the attached removable recording medium 923.
- the connection port 925 is a port for directly connecting a device to the information processing apparatus 900.
- the connection port 925 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like. Further, the connection port 925 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 929 is a communication interface configured with a communication device for connecting to the communication network NW, for example.
- the communication device 929 may be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 929 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 929 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
- the communication network NW connected to the communication device 929 is a network connected by wire or wireless, and is, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
- Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
- the information processing system 1 is configured to include the imaging device 10 and the information processing device 20, but the present technology is not limited to such an example.
- the imaging apparatus 10 may include functions (detection function and analysis function) that the information processing apparatus 20 has.
- the information processing system 1 is realized by the imaging device 10.
- the information processing apparatus 20 may include a function (imaging function) of the imaging apparatus 10.
- the information processing system 1 is realized by the information processing apparatus 20.
- the imaging apparatus 10 may have a part of the functions of the information processing apparatus 20, and the information processing apparatus 20 may have a part of the functions of the imaging apparatus 10.
- the cells are listed as the observation target of the analysis by the information processing system 1, but the present technology is not limited to such an example.
- the observation target may be an organelle, a biological tissue, an organ, a human, an animal, a plant, an inanimate structure, or the like; when the structure or shape of such an object changes in a short time, the information processing system 1 can be used to analyze that change.
- each step in the processing of the information processing apparatus of the present specification does not necessarily have to be processed in time series in the order described as a flowchart.
- each step in the processing of the information processing apparatus may be processed in an order different from the order described in the flowchart, or may be processed in parallel.
- (1) An information processing apparatus including: a detector determining unit that determines at least one detector according to an analysis method; and an analysis unit that performs analysis by the analysis method using the at least one detector determined by the detector determining unit.
- (2) The information processing apparatus according to (1), further including a detection unit that detects a region of interest from within a captured image using the at least one detector determined by the detector determining unit, wherein the analysis unit analyzes the region of interest.
- (3) The information processing apparatus according to (2), wherein, when a plurality of detectors are determined by the detector determining unit, the detection unit determines the region of interest based on a plurality of detection results obtained using the plurality of detectors.
- An area drawing unit that draws a display indicating the region of interest on the captured image based on a detection result of the detection unit;
- a display shape corresponding to the attention area includes a shape detected based on an image analysis on the captured image.
- a display shape corresponding to the attention area includes a shape calculated based on a detection result of the attention area by the detection unit.
- the information processing apparatus according to any one of (2) to (9), further including a region specifying unit that specifies a target region to be analyzed by the analysis unit from the detected target region.
- the detector is a detector generated by machine learning using as a learning data a set of the analysis method and image data related to an analysis object analyzed by the analysis method, The information processing apparatus according to any one of (2) to (10), wherein the detection unit detects the region of interest based on feature data obtained from the captured image using the detector.
- the detector determining unit determines the at least one detector according to a type of change exhibited by the analysis target analyzed by the analysis method.
- the information processing apparatus wherein the analysis target analyzed by the analysis method includes a cell, an organelle, or a biological tissue formed by the cell.
- An information processing method including: determining at least one detector according to an analysis method; and performing analysis by the analysis method using the determined at least one detector.
- (15) An information processing system including: an imaging device including an imaging unit that generates a captured image; and an information processing apparatus including a detector determining unit that determines at least one detector according to an analysis method, and an analysis unit that performs analysis by the analysis method on the captured image using the at least one detector determined by the detector determining unit.
- imaging device 20 information processing device 200 detector DB 210 analysis method acquisition unit 220 detector determination unit 230 image acquisition unit 240 detection unit 250 detection parameter adjustment unit 260 region drawing unit 270 analysis unit 280 output control unit 290 shape setting unit 295 region specifying unit
Description
1. Overview of the information processing system
2. First embodiment
2.1. Configuration example of the information processing apparatus
2.2. Processing example of the information processing apparatus
2.3. Effects
2.4. Application examples
3. Second embodiment
3.1. Configuration example of the information processing apparatus
3.2. Effects
4. Hardware configuration example
5. Summary
FIG. 1 is a diagram showing an overview of the configuration of an information processing system 1 according to an embodiment of the present disclosure. As shown in FIG. 1, the information processing system 1 includes an imaging device 10 and an information processing apparatus 20. The imaging device 10 and the information processing apparatus 20 are connected via various wired or wireless networks.
The imaging device 10 is a device that generates captured images (moving images). The imaging device 10 according to the present embodiment is realized by, for example, a digital camera. Alternatively, the imaging device 10 may be realized by any device having an imaging function, such as a smartphone, a tablet, a game console, or a wearable device. For example, the imaging device 10 images real space using an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor and various members such as a lens for controlling the formation of a subject image on the image sensor. The imaging device 10 also includes a communication device for transmitting and receiving captured images and the like to and from the information processing apparatus 20. In the present embodiment, the imaging device 10 is provided above an imaging stage S for imaging a culture medium M in which the cells to be analyzed are cultured. The imaging device 10 generates moving image data by imaging the culture medium M at a specific frame rate. The imaging device 10 may image the culture medium M directly (without other intervening members) or through other members such as a microscope. The frame rate is not particularly limited, but is preferably set according to the degree of change of the observation target. To correctly track changes in the observation target, the imaging device 10 images a fixed imaging region including the culture medium M. The moving image data generated by the imaging device 10 is transmitted to the information processing apparatus 20.
The information processing apparatus 20 is a device having an image analysis function. The information processing apparatus 20 is realized by any device having an image analysis function, such as a PC (Personal Computer), a tablet, or a smartphone. The information processing apparatus 20 may also be realized by one or more information processing apparatuses on a network. The information processing apparatus 20 according to the present embodiment acquires captured images from the imaging device 10 and tracks the region of the observation target in the acquired captured images. The analysis results of the tracking processing by the information processing apparatus 20 are output to a storage device, display device, or the like provided inside or outside the information processing apparatus 20. The functional configuration that realizes each function of the information processing apparatus 20 will be described later.
First, the information processing apparatus 20-1 according to the first embodiment of the present disclosure will be described with reference to FIGS. 2 to 11.
FIG. 2 is a block diagram showing a configuration example of the information processing apparatus 20-1 according to the first embodiment of the present disclosure. As shown in FIG. 2, the information processing apparatus 20-1 includes a detector database (DB) 200, an analysis method acquisition unit 210, a detector determination unit 220, an image acquisition unit 230, a detection unit 240, a detection parameter adjustment unit 250, a region drawing unit 260, an analysis unit 270, and an output control unit 280.
The detector DB 200 is a database that stores the detectors needed to detect an analysis target. A detector stored in the detector DB 200 is used to calculate feature quantities from a captured image of the observation target and to detect the region corresponding to the observation target based on those feature quantities. The detector DB 200 stores a plurality of detectors, each optimized for the analysis method or evaluation method applied to a particular observation target. For example, to detect a specific change in an observation target, a plurality of detectors are associated with that specific change. A set of detectors for detecting such a specific change is defined herein as a "detection recipe". The combination of detectors included in a detection recipe is determined in advance, for example, for each observation target and for each phenomenon the observation target can exhibit.
The analysis method acquisition unit 210 acquires information on the analysis method or evaluation method for analyzing the observation target (since, as described above, evaluation methods are included among analysis methods, evaluation methods and analysis methods are hereinafter collectively referred to as "analysis methods"). For example, when an observation target is analyzed with the information processing apparatus 20-1, the analysis method acquisition unit 210 may acquire an analysis method input by the user via an input unit (not shown). When analysis is performed according to a predetermined schedule, the analysis method acquisition unit 210 may acquire the analysis method from a storage unit (not shown) at a predetermined time. The analysis method acquisition unit 210 may also acquire the analysis method via a communication unit (not shown).
The detector determination unit 220 determines at least one detector according to the information on the analysis method acquired from the analysis method acquisition unit 210. For example, the detector determination unit 220 determines the detection recipe associated with the type of the acquired analysis method and acquires the detectors included in that detection recipe from the detector DB 200.
The image acquisition unit 230 acquires image data including the captured images generated by the imaging device 10 via a communication device (not shown). For example, the image acquisition unit 230 acquires the moving image data generated by the imaging device 10 in time series. The acquired image data is output to the detection unit 240.
検出部240は、画像取得部230が取得した撮像画像について、検出器決定部220において決定された検出器を用いて注目領域を検出する。注目領域とは、上述したように、観察対象に対応する領域である。
検出パラメータ調整部250は、上述したように、観察対象の状態、観察条件または撮像装置10の撮像条件等に応じて、検出部240の検出処理に関する検出パラメータを調整する。検出パラメータ調整部250は、例えば上記の各状態および各条件に応じて検出パラメータを自動的に調整してもよいし、ユーザの操作により検出パラメータが調整されてもよい。
The region drawing unit 260 superimposes detection results, such as regions of interest, identification regions, and IDs, on the captured image targeted by the detection processing of the detection unit 240. The region drawing unit 260 may indicate regions of interest, identification regions, and the like by figures such as straight lines, curves, or planes enclosed by curves or the like. The shape of a plane indicating such a region may be any shape, such as a rectangle, circle, or ellipse, or may be a shape formed to follow the contour of the region corresponding to the observation target. The region drawing unit 260 may also display the above-mentioned ID near the region of interest or identification region. Specific drawing processing by the region drawing unit 260 will be described later. The region drawing unit 260 outputs the result of the drawing processing to the output control unit 280.
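A minimal sketch of the drawing step, assuming regions are represented as rectangular bounding boxes burned into a grayscale frame (the publication also allows circles, ellipses, and contour-fitted shapes; the array representation here is an assumption for illustration):

```python
import numpy as np


def draw_region(frame: np.ndarray, box: tuple,
                value: float = 1.0) -> np.ndarray:
    """Return a copy of `frame` with the outline of box (r0, c0, r1, c1)
    drawn at intensity `value`; the original frame is left untouched."""
    r0, c0, r1, c1 = box
    out = frame.copy()
    out[r0, c0:c1 + 1] = value      # top edge
    out[r1, c0:c1 + 1] = value      # bottom edge
    out[r0:r1 + 1, c0] = value      # left edge
    out[r0:r1 + 1, c1] = value      # right edge
    return out
```

Working on a copy keeps the captured image itself available for analysis while the annotated frame goes to the output control stage.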
The analysis unit 270 analyzes the regions of interest (and identification regions) detected by the detection unit 240. For example, the analysis unit 270 analyzes a region of interest on the basis of the analysis method associated with the detector used to detect that region. The analysis performed by the analysis unit 270 is, for example, analysis for quantitatively evaluating the growth, proliferation, division, death, movement, or shape change of the cells being observed. In this case, the analysis unit 270 calculates, from the region of interest or identification region, feature amounts such as the size, area, number, shape (for example, circularity), and motion vector of the cells.
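One of the shape features mentioned above, circularity, is commonly computed as 4πA/P², which equals 1.0 for a perfect circle and decreases as the contour deviates from circular. The formula is standard; taking it as the concrete definition used here is an assumption, not something stated in the publication.

```python
import math


def circularity(area: float, perimeter: float) -> float:
    """4*pi*area / perimeter**2: 1.0 for a circle, lower otherwise."""
    return 4.0 * math.pi * area / (perimeter ** 2)
```

For a circle of radius r (area πr², perimeter 2πr) the value is exactly 1.0; a unit square (area 1, perimeter 4) gives π/4 ≈ 0.785, so elongated or ragged cells score noticeably below 1.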
The output control unit 280 outputs, as output data, the drawing information acquired from the region drawing unit 260 (such as the captured image with regions superimposed) and the analysis results acquired from the analysis unit 270. For example, the output control unit 280 may display the output data on a display unit (not shown) provided inside or outside the information processing device 20-1. The output control unit 280 may also store the output data in a storage unit (not shown) provided inside or outside the information processing device 20-1. Further, the output control unit 280 may transmit the output data to an external device (a server, cloud, or terminal device) via a communication unit (not shown) of the information processing device 20-1.
The configuration example of the information processing device 20-1 according to an embodiment of the present disclosure has been described above. Next, an example of processing by the information processing device 20-1 according to an embodiment of the present disclosure will be described with reference to FIGS. 6 to 9.
The configuration example and processing example of the information processing device 20-1 according to the first embodiment of the present disclosure have been described above. According to the present embodiment, a detection recipe (a set of detectors) is determined according to the analysis method acquired by the analysis method acquisition unit 210, the detection unit 240 detects a region of interest in the captured image using the determined detector, and the analysis unit 270 analyzes the region of interest. Accordingly, simply by deciding the analysis method for an observation target, the user can have that observation target detected in the captured image and analyzed. Because the detector is determined on the basis of the analysis method, a detector suited to each shape and state of the observation target, which changes over time, is selected. This makes it possible to analyze the observation target with high accuracy regardless of how it changes. Moreover, since selecting an analysis method automatically selects a detector suited to detecting the relevant changes, convenience is also improved for users who want to analyze changes in an observation target.
Next, application examples of processing by the information processing device 20-1 according to the first embodiment of the present disclosure will be described with reference to FIGS. 10 and 11.
First, a first example of narrowing down regions of interest using a plurality of detectors will be described. In this application example, the detection unit 240 first detects the regions of interest of a plurality of cells using one detector, and then narrows down the detected regions of interest, using another detector, to those corresponding to observation targets exhibiting a specific change. As a result, from among a plurality of regions of interest, only those corresponding to observation targets exhibiting the specific change are made the subject of analysis. Thus, for example, among a plurality of cancer cells, proliferating cells and dying cells can be distinguished from one another and analyzed.
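The narrowing step described above can be sketched as keeping only those regions from a broad detector that a second, change-specific detector confirms. The overlap criterion used here (intersection-over-union between bounding boxes, with an assumed 0.5 cutoff) is an illustrative choice; the publication does not specify how the two detectors' results are matched.

```python
def iou(a, b):
    """Intersection-over-union of two boxes (r0, c0, r1, c1), inclusive."""
    r0, c0 = max(a[0], b[0]), max(a[1], b[1])
    r1, c1 = min(a[2], b[2]), min(a[3], b[3])
    if r1 < r0 or c1 < c0:
        return 0.0
    inter = (r1 - r0 + 1) * (c1 - c0 + 1)

    def area(box):
        return (box[2] - box[0] + 1) * (box[3] - box[1] + 1)

    return inter / (area(a) + area(b) - inter)


def narrow_regions(broad, specific, min_iou=0.5):
    """Keep broad-detector regions that the specific detector confirms."""
    return [b for b in broad
            if any(iou(b, s) >= min_iou for s in specific)]
```

For instance, a general cell-body detector supplies `broad`, a mitosis-specific detector supplies `specific`, and only the overlapping regions survive for analysis.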
Next, a second example of narrowing down regions of interest using a plurality of detectors will be described. In this application example, the detection unit 240 detects a plurality of regions of interest for one type of cell using a plurality of detectors. As a result, even when one cell has a plurality of different features, the region of interest detected for each feature can be analyzed separately. Thus, even for a cell with a distinctive structural feature, such as the axon of a neuron, only the axon region can be detected and analyzed.
Next, the information processing device 20-2 according to the second embodiment of the present disclosure will be described with reference to FIGS. 12 to 14.
FIG. 12 is a block diagram showing a configuration example of the information processing device 20-2 according to the second embodiment of the present disclosure. As shown in FIG. 12, in addition to the detector database (DB) 200, the analysis method acquisition unit 210, the detector determination unit 220, the image acquisition unit 230, the detection unit 240, the detection parameter adjustment unit 250, the region drawing unit 260, the analysis unit 270, and the output control unit 280, the information processing device 20-2 further includes a shape setting unit 290 and a region specifying unit 295. The functions of the shape setting unit 290 and the region specifying unit 295 are described below.
The shape setting unit 290 sets the shape of the display, drawn by the region drawing unit 260, that indicates a region of interest.
The region specifying unit 295 specifies, from among the regions of interest detected by the detection unit 240, the region of interest to be analyzed by the analysis unit 270. For example, the region specifying unit 295 specifies the region of interest to be analyzed, from among the plurality of detected regions of interest, according to user operation or a predetermined condition. The analysis unit 270 then analyzes the region of interest specified by the region specifying unit 295. More specifically, when specification is by user operation, the user selects which of the plurality of regions of interest displayed by the output control unit 280 should be specified, and the analysis unit 270 analyzes the selected region of interest.
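Specifying analysis targets by a predetermined condition can be sketched as filtering the detected regions with a predicate. The minimum-area condition below is a hypothetical example of such a condition, not one named in the publication.

```python
def box_area(box):
    """Inclusive pixel area of a bounding box (r0, c0, r1, c1)."""
    r0, c0, r1, c1 = box
    return (r1 - r0 + 1) * (c1 - c0 + 1)


def specify_regions(regions, condition):
    """Keep only the detected regions satisfying the given predicate;
    only these are passed on to the analysis step."""
    return [r for r in regions if condition(r)]
```

A user-operated selection fits the same interface: the predicate simply tests membership in the set of regions the user clicked.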
The configuration example of the information processing device 20-2 according to the second embodiment of the present disclosure has been described above. According to the present embodiment, the shape of the figure defining a region of interest can be set; for example, a shape that fits the contour of the observation target's region can be set as the shape of the region of interest. This allows the region of the observation target and the region of interest to be associated more closely in analysis. Also, according to the present embodiment, the region of interest to be analyzed can be specified from among the detected regions of interest. This makes it possible to extract the analysis results needed for evaluation and to compare analysis results.
Next, the hardware configuration of the information processing device according to an embodiment of the present disclosure will be described with reference to FIG. 15. FIG. 15 is a block diagram showing a hardware configuration example of the information processing device according to an embodiment of the present disclosure. The illustrated information processing device 900 can realize, for example, the information processing device 20 in the above embodiments.
Preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure could conceive of various changes and modifications within the scope of the technical ideas described in the claims, and it should be understood that these also naturally belong to the technical scope of the present disclosure.
(1)
An information processing device including:
a detector determination unit configured to determine at least one detector according to an analysis method; and
an analysis unit configured to perform analysis by the analysis method using the at least one detector determined by the detector determination unit.
(2)
The information processing device according to (1), further including a detection unit configured to detect a region of interest in a captured image using the at least one detector determined by the detector determination unit,
wherein the analysis unit analyzes the region of interest.
(3)
The information processing device according to (2), wherein, when a plurality of detectors are determined by the detector determination unit, the detection unit determines the region of interest on the basis of a plurality of detection results obtained using the plurality of detectors.
(4)
The information processing device according to (2) or (3), wherein the detection unit associates the region of interest detected using the detector with an analysis result obtained by the analysis unit's analysis of the region of interest.
(5)
The information processing device according to any one of (2) to (4), further including a detection parameter adjustment unit configured to adjust a detection parameter of the detector,
wherein the detection unit detects the region of interest in the captured image on the basis of the detection parameter of the determined detector.
(6)
The information processing device according to any one of (2) to (5), further including an output control unit configured to output an analysis result of the analysis unit in association with the region of interest corresponding to the analysis result.
(7)
The information processing device according to (6), further including a region drawing unit configured to draw, on the captured image, a display indicating the region of interest on the basis of a detection result of the detection unit,
wherein the output control unit outputs the captured image including the display corresponding to the region of interest drawn by the region drawing unit.
(8)
The information processing device according to (7), wherein a shape of the display corresponding to the region of interest includes a shape detected on the basis of image analysis of the captured image.
(9)
The information processing device according to (7), wherein a shape of the display corresponding to the region of interest includes a shape calculated on the basis of a detection result of the region of interest by the detection unit.
(10)
The information processing device according to any one of (2) to (9), further including a region specifying unit configured to specify, from the detected regions of interest, a region of interest to be analyzed by the analysis unit.
(11)
The information processing device according to any one of (2) to (10), wherein the detector is a detector generated by machine learning using, as learning data, pairs of the analysis method and image data on an analysis target analyzed by the analysis method, and
the detection unit detects the region of interest on the basis of feature data obtained from the captured image using the detector.
(12)
The information processing device according to any one of (1) to (11), wherein the detector determination unit determines at least one detector according to a type of change exhibited by an analysis target analyzed by the analysis method.
(13)
The information processing device according to (12), wherein the analysis target analyzed by the analysis method includes a cell, an organelle, or biological tissue formed by the cells.
(14)
An information processing method including:
determining at least one detector according to an analysis method; and
performing analysis by the analysis method using the determined at least one detector.
(15)
An information processing system including:
an imaging device including an imaging unit configured to generate a captured image; and
an information processing device including a detector determination unit configured to determine at least one detector according to an analysis method, and an analysis unit configured to perform analysis of the captured image by the analysis method using the at least one detector determined by the detector determination unit.
20 Information processing device
200 Detector DB
210 Analysis method acquisition unit
220 Detector determination unit
230 Image acquisition unit
240 Detection unit
250 Detection parameter adjustment unit
260 Region drawing unit
270 Analysis unit
280 Output control unit
290 Shape setting unit
295 Region specifying unit
Claims (15)

1. An information processing device including:
a detector determination unit configured to determine at least one detector according to an analysis method; and
an analysis unit configured to perform analysis by the analysis method using the at least one detector determined by the detector determination unit.
2. The information processing device according to claim 1, further including a detection unit configured to detect a region of interest in a captured image using the at least one detector determined by the detector determination unit, wherein the analysis unit analyzes the region of interest.
3. The information processing device according to claim 2, wherein, when a plurality of detectors are determined by the detector determination unit, the detection unit determines the region of interest on the basis of a plurality of detection results obtained using the plurality of detectors.
4. The information processing device according to claim 2, wherein the detection unit associates the region of interest detected using the detector with an analysis result obtained by the analysis unit's analysis of the region of interest.
5. The information processing device according to claim 2, further including a detection parameter adjustment unit configured to adjust a detection parameter of the detector, wherein the detection unit detects the region of interest in the captured image on the basis of the detection parameter of the determined detector.
6. The information processing device according to claim 2, further including an output control unit configured to output an analysis result of the analysis unit in association with the region of interest corresponding to the analysis result.
7. The information processing device according to claim 6, further including a region drawing unit configured to draw, on the captured image, a display indicating the region of interest on the basis of a detection result of the detection unit, wherein the output control unit outputs the captured image including the display corresponding to the region of interest drawn by the region drawing unit.
8. The information processing device according to claim 7, wherein a shape of the display corresponding to the region of interest includes a shape detected on the basis of image analysis of the captured image.
9. The information processing device according to claim 7, wherein a shape of the display corresponding to the region of interest includes a shape calculated on the basis of a detection result of the region of interest by the detection unit.
10. The information processing device according to claim 2, further including a region specifying unit configured to specify, from the detected regions of interest, a region of interest to be analyzed by the analysis unit.
11. The information processing device according to claim 2, wherein the detector is a detector generated by machine learning using, as learning data, pairs of the analysis method and image data on an analysis target analyzed by the analysis method, and the detection unit detects the region of interest on the basis of feature data obtained from the captured image using the detector.
12. The information processing device according to claim 1, wherein the detector determination unit determines at least one detector according to a type of change exhibited by an analysis target analyzed by the analysis method.
13. The information processing device according to claim 12, wherein the analysis target analyzed by the analysis method includes a cell, an organelle, or biological tissue formed by the cells.
14. An information processing method including:
determining at least one detector according to an analysis method; and
performing analysis by the analysis method using the determined at least one detector.
15. An information processing system including:
an imaging device including an imaging unit configured to generate a captured image; and
an information processing device including a detector determination unit configured to determine at least one detector according to an analysis method, and an analysis unit configured to perform analysis of the captured image by the analysis method using the at least one detector determined by the detector determination unit.
Priority Applications (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017544391A (JP6777086B2) | 2015-10-08 | 2016-07-07 | Information processing device, information processing method, and information processing system |
| US15/761,572 (US20180342078A1) | 2015-10-08 | 2016-07-07 | Information processing device, information processing method, and information processing system |

Applications Claiming Priority (2)

| Application Number | Priority Date |
|---|---|
| JP2015199990 | 2015-10-08 |
| JP2015-199990 | 2015-10-08 |

Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2017061155A1 | 2017-04-13 |
Also Published As

| Publication Number | Publication Date |
|---|---|
| JPWO2017061155A1 | 2018-08-02 |
| US20180342078A1 | 2018-11-29 |
| JP6777086B2 | 2020-10-28 |
Legal Events

| Code | Title | Reference |
|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16853310; Country: EP; Kind code: A1 |
| ENP | Entry into the national phase | Ref document number: 2017544391; Country: JP; Kind code: A |
| WWE | Wipo information: entry into national phase | Ref document number: 15761572; Country: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16853310; Country: EP; Kind code: A1 |