WO2017179304A1 - Color unevenness portion evaluation method and color unevenness portion evaluation apparatus - Google Patents
- Publication number
- WO2017179304A1 (PCT/JP2017/006522)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- skin image
- color unevenness
- skin
- image
- spot
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/444—Evaluating skin marks, e.g. mole, nevi, tumour, scar
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1032—Determining colour for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/443—Evaluating skin constituents, e.g. elastin, melanin, water
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/337—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30088—Skin; Dermal
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
Definitions
- the present application relates to a color unevenness portion evaluation method and a color unevenness portion evaluation apparatus.
- color information or skin pigment component information is quantified, using a spectrocolorimeter or other device, for spots and freckles identified visually or otherwise on the entire face, around the eyes, and on the cheeks.
- the quantification of the color information is based on, for example, an average value over the measurement region of principal component scores related to melanin, obtained by principal component analysis of spectral reflectance data at a plurality of wavelengths (see, for example, Patent Document 1).
- the state of the uneven color part such as a stain may change with aging or may change due to the influence of the season.
- the state of a certain color unevenness portion may also change under the influence of medical agents such as a whitening agent, for example.
- Patent Document 2 describes an image alignment method for aligning two or more images: for each image, a specific-structure-emphasized image that emphasizes a specific structure is acquired, a structural correspondence positional relationship is obtained between the acquired emphasized images, and the two or more images are aligned based on the obtained positional relationship. However, such a method cannot associate the color unevenness portions of two or more images with each other.
- an object of the present invention is to capture and evaluate a change in each color unevenness part included in a skin image.
- the evaluation method for a color unevenness portion includes: a color unevenness portion detecting step of detecting a plurality of color unevenness portions from each of a first skin image and a second skin image different from the first skin image; a barycentric position calculating step of calculating the barycentric position coordinates of each color unevenness portion in each of the first skin image and the second skin image; and a matching processing step of associating, based on the barycentric position coordinates calculated in the barycentric position calculating step, the plurality of color unevenness portions included in the first skin image with the plurality of color unevenness portions included in the second skin image.
- in the following, a stain (spot) will be described as an example of a color unevenness portion, and a stain evaluation process as an example of the method for evaluating a color unevenness portion.
- the term “stain” refers to a state in which pigments such as melanin are deposited on the skin, the boundary between the pigmented site and the non-pigmented region is clear, and the pigment is deposited to some extent.
- stains include, for example, senile pigment spots (solar lentigines), post-inflammatory pigmentation, freckles, chloasma (liver spots), and the like.
- FIG. 1 is a diagram illustrating an example of a functional configuration of a spot evaluation apparatus 10 according to the present embodiment.
- the spot evaluation device 10 includes an input unit 11, an output unit 12, a storage unit 13, an image acquisition unit 14, a spot detection unit 15, a centroid position calculation unit 16, a matching processing unit 17, a spot evaluation unit 18, and a control unit 19.
- the input unit 11 receives inputs such as the start, end, and setting of various instructions related to the stain evaluation process from, for example, a user who uses the stain evaluation device 10.
- the output unit 12 outputs the content input by the input unit 11 and the content executed based on the input content. For example, the output unit 12 performs a process of displaying the results of the processes performed by the image acquisition unit 14, the spot detection unit 15, the gravity center position calculation unit 16, the matching processing unit 17, the spot evaluation unit 18, and the like on a display described later.
- the image acquisition unit 14 acquires a first skin image and a second skin image different from the first skin image.
- the first skin image and the second skin image can be skin images obtained by photographing the same target portion at different times of the same subject.
- the first skin image and the second skin image can be, for example, skin images in which the entire cheek of the subject is photographed.
- the first skin image and the second skin image can be obtained, for example, by photographing the subject's face with an SIA (Skin Image Analyzer) system composed of a diffuse lighting box and a digital camera, and extracting skin regions such as around the eyes and the cheeks from the face image.
- the spot detection unit 15 detects a plurality of spots from each of the first skin image and the second skin image.
- the center-of-gravity position calculation unit 16 calculates the center-of-gravity position coordinates of each stain on each of the first skin image and the second skin image.
- the matching processing unit 17 associates the plurality of spots included in the first skin image with the plurality of spots included in the second skin image based on the barycentric position coordinates of each spot calculated by the barycentric position calculating unit 16.
- the stain evaluation unit 18 evaluates a change in the stain based on the correspondence relationship of the stains associated with the matching processing unit 17.
- the control unit 19 controls the entire components of the stain evaluation apparatus 10.
- the control unit 19 controls, for example, at least one of spot detection, gravity center calculation, matching processing, and spot evaluation on the skin image, but the contents controlled by the control unit 19 are not limited to this.
- the storage unit 13 stores various information necessary for the present embodiment. Specifically, the storage unit 13 stores various programs for executing the stain evaluation process of the present embodiment, various setting information, and the like.
- the storage unit 13 stores information about the first skin image and the second skin image, information about the spots included in each skin image (the number of spots and the area, darkness, barycentric position coordinates, and correspondence of each spot, etc.), evaluation results, and the like.
- the storage unit 13 holds a collection of the various types of information described above, and may function as a database structured systematically so that information can be searched and extracted using keywords or the like.
- the information stored in the storage unit 13 may be acquired from an external device via a communication network represented by the Internet or a LAN (Local Area Network), for example.
- the stain evaluation apparatus 10 can be realized by installing an execution program (stain evaluation program), which causes a computer to execute each function of the stain evaluation apparatus 10 illustrated in FIG. 1, on a general-purpose computer such as a PC (Personal Computer), a smartphone, or a tablet terminal.
- FIG. 2 is a diagram illustrating an example of a hardware configuration capable of realizing the spot evaluation process.
- the stain evaluation apparatus 10 includes an input device 21, an output device 22, a drive device 23, an auxiliary storage device 24, a memory device 25, a CPU (Central Processing Unit) 26 that performs various controls, and a network connection device 27. These are connected to each other by a system bus B.
- the input device 21 can be a keyboard and a pointing device such as a mouse operated by the user or the like.
- the input device 21 may be a voice input device such as a microphone that can be input by voice or the like.
- the output device 22 can be a display, a speaker, or the like.
- the output device 22 may be a printing device such as a printer.
- the input device 21 and the output device 22 described above may have an input / output integrated configuration such as a touch panel when the stain evaluation device 10 is a smartphone, a tablet terminal, or the like.
- the execution program installed in the spot evaluation apparatus 10 in the present embodiment is provided by a portable recording medium 28 such as a USB (Universal Serial Bus) memory or a CD-ROM, for example.
- the recording medium 28 can be set in the drive device 23, and an execution program included in the recording medium 28 is installed from the recording medium 28 to the auxiliary storage device 24 via the drive device 23.
- the auxiliary storage device 24 is a storage means such as a hard disk, and stores the execution program of the present embodiment, a control program provided in the computer, and the like, and can perform input / output as necessary.
- the memory device 25 stores an execution program read from the auxiliary storage device 24 by the CPU 26.
- the memory device 25 is a ROM (Read Only Memory), a RAM (Random Access Memory), or the like. Note that the auxiliary storage device 24 and the memory device 25 described above may be configured integrally as a single storage device.
- the CPU 26 controls processing of the entire computer, such as various operations and input / output of data with each hardware component, based on a control program such as an OS (Operating System) and an execution program stored in the memory device 25.
- the network connection device 27 acquires, for example, an execution program and various data from other devices connected to the communication network by connecting to a communication network such as the Internet or a LAN.
- the network connection device 27 can also provide the execution result obtained by executing the program to other devices.
- FIG. 3 is a flowchart illustrating an example of the spot evaluation process.
- the image acquisition unit 14 acquires two skin images, a first skin image and a second skin image (step S102).
- the first skin image and the second skin image can be skin images obtained by photographing the same target portion of the same subject at different times: for example, images before and after application of a medical agent such as a whitening agent, images capturing aging changes of the same subject, or images of the same subject taken in different seasons.
- in the following, the second skin image is a skin image obtained by photographing the same target portion of the same subject as the first skin image, at a time later than the first skin image.
- the image acquisition unit 14 can acquire, as the first skin image and the second skin image, skin images (for example, regions of 500 × 500 pixels) obtained by photographing a predetermined region of the subject's cheek as the analysis region. Alternatively, the image acquisition unit 14 may acquire a face image of the subject, extract a predetermined cheek region (such as a 500 × 500 pixel region) as the analysis region based on the facial contour and the like, and use the extracted regions as the first skin image and the second skin image. However, when the first skin image and the second skin image are aligned later in step S106, the first skin image 100 and the second skin image 102 can be configured to include an area wider than the analysis region, so that a sufficient analysis region is secured even if a deviation occurs.
- the spot detection unit 15 performs image processing on the first skin image and the second skin image acquired by the image acquisition unit 14, and detects spots from each of the two images (step S104).
- the processing of the spot detection unit 15 (stain detection processing) will be described in detail with reference to FIG. 4.
- FIG. 4 is a flowchart showing in detail the spot detection process in step S104. The following processing is performed for each of the first skin image and the second skin image.
- the spot detection unit 15 calculates pigment components such as the melanin component and the hemoglobin component of the skin image serving as the analysis region, and converts them into an image (pigment component distribution image) indicating the density of each pigment component and its distribution (step S11). Specifically, the spot detection unit 15 acquires the RGB values of the analysis region in the RGB color system, the CIE-XYZ values (the CIE international standard values converted from the RGB color system), color data such as Lab values, and the like.
- the RGB color system RGB values can be converted into CIE-XYZ values using, for example, the following equation.
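The conversion equation itself is not reproduced in this text, so the following is only an illustrative sketch: it applies the standard sRGB (D65) linear-RGB-to-XYZ matrix, which may differ from the matrix the patent actually uses.

```python
import numpy as np

# Illustrative stand-in: the patent's own conversion equation is not
# reproduced here, so the standard sRGB (D65) linear-RGB -> CIE-XYZ
# matrix is used instead.
M_RGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def rgb_to_xyz(rgb):
    """Convert an (H, W, 3) array of linear RGB values in [0, 1] to CIE-XYZ."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb @ M_RGB_TO_XYZ.T

xyz = rgb_to_xyz(np.ones((2, 2, 3)))  # a pure-white 2x2 patch
```

Under this matrix, pure white maps to the D65 white point (X ≈ 0.9505, Y = 1.0, Z ≈ 1.089).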
- the spot detection unit 15 removes the low-frequency component from the distribution image of the pigment component obtained in step S11. Thereby, for example, the influence of a large swell corresponding to a shadow due to the shape of the face can be excluded.
- the spot detection unit 15 removes a band having a half width of about 40.0 mm or more as an influence of a shadow.
- the stain detection unit 15 first generates an image of a low frequency component using a bandpass filter such as a Gaussian function (step S12).
- the spot detection unit 15 subtracts the low-frequency component image obtained in step S12 from the pigment component distribution image obtained in step S11 (step S13).
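Steps S12–S13 can be sketched as below. A Gaussian filter is used per the text's mention of a Gaussian function; `sigma_px` is a hypothetical value standing in for the cutoff corresponding to the ~40 mm half-width, since the pixel equivalent depends on the unstated image resolution.

```python
import numpy as np
from scipy import ndimage

def remove_low_frequency(pigment, sigma_px=20.0):
    """Subtract a Gaussian-blurred (low-frequency) image from the pigment
    distribution image, removing large-scale shading due to face shape."""
    low = ndimage.gaussian_filter(pigment, sigma=sigma_px)
    return pigment - low

flat = np.full((50, 50), 0.5)
residual = remove_low_frequency(flat)  # a uniform image leaves no detail
```

On a uniform image the blurred version equals the input, so the residual is (numerically) zero, which is the desired behavior: only local deviations such as pigmented spots survive the subtraction.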
- the spot detection unit 15 performs a binarization process on the image obtained in step S13 (step S14).
- in the binarization processing, for example, a threshold of the average value + 0.01 to 0.30 is set with respect to the density of the melanin component, and pixels having a melanin value equal to or higher than the threshold (high melanin values) are regarded as high-melanin portions.
- a normal skin part and a high melanin part are distinguishable.
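The thresholding in step S14 can be sketched as follows; the offset 0.15 is an arbitrary illustrative choice within the 0.01–0.30 range given above.

```python
import numpy as np

def binarize_melanin(melanin, offset=0.15):
    """Mark pixels at or above (mean melanin + offset) as high-melanin."""
    threshold = melanin.mean() + offset
    return melanin >= threshold

melanin = np.array([[0.1, 0.1, 0.1],
                    [0.1, 0.9, 0.1],
                    [0.1, 0.1, 0.1]])
mask = binarize_melanin(melanin)  # only the dark center pixel is marked
```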
- the spot detection unit 15 performs noise processing on the image obtained in step S14 (step S15).
- the noise processing can be performed using, for example, a median filter (5 ⁇ 5 pixels or the like), but is not limited thereto.
- the spot detection unit 15 performs a labeling process for labeling a region where pixels having high melanin values are continuous as one pigmentation site in the image obtained in step S15 (step S16). For example, the spot detection unit 15 connects adjacent portions of pixels identified as pixels having a high melanin value, and extracts the connected pixel group as one pigmentation site.
- the spot detection unit 15 removes, from the pigmentation sites labeled in step S16, regions having an area equal to or smaller than a predetermined area (for example, an actual size of 1.0 mm²), and detects the remaining pigmentation sites as spots (step S17). Thereby, for example, small extracted regions such as pores can be removed, and spots can be detected with high accuracy.
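Steps S16–S17 can be sketched with connected-component labeling as below; `min_area_px` is a hypothetical pixel threshold standing in for the 1.0 mm² actual-size limit, whose pixel equivalent depends on the image resolution.

```python
import numpy as np
from scipy import ndimage

def label_spots(mask, min_area_px=3):
    """Label connected high-melanin pixels as pigmentation sites and keep
    only the sites larger than a minimum area."""
    labels, n = ndimage.label(mask)  # 4-connected labeling by default
    spots = []
    for i in range(1, n + 1):
        area = int((labels == i).sum())
        if area > min_area_px:       # discard pore-sized regions
            spots.append({"label": i, "area": area})
    return spots

mask = np.array([[1, 1, 0, 0, 1],
                 [1, 1, 0, 0, 0],
                 [0, 0, 0, 0, 0]], dtype=bool)
spots = label_spots(mask)  # the lone single-pixel region is removed as noise
```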
- the center-of-gravity position calculation unit 16 aligns the first skin image and the second skin image based on the plurality of spots detected from each of the two images, and determines a coordinate reference point (step S106).
- the alignment can be performed by pattern recognition based on the distribution of spots detected from each of the first skin image and the second skin image.
- FIG. 5A to 5C are schematic diagrams showing an example of this process.
- FIG. 5A shows the first skin image 100
- FIG. 5B shows the second skin image 102.
- FIG. 5C is a diagram in which the first skin image 100 and the second skin image 102 are overlaid and aligned.
- the center-of-gravity position calculation unit 16 aligns the first skin image 100 and the second skin image 102 based on the distribution of a plurality of spots 106 included in the first skin image 100 and the second skin image 102.
- the coordinates of the first skin image 100 and the second skin image 102 are thereby aligned, and, for example, a predetermined position at the lower left can be determined as the reference point (origin (0, 0)) of the XY coordinate system.
- the stain 106 included in the first skin image 100 and the stain included in the second skin image 102 can be accurately associated.
- next, the barycentric position calculation unit 16 calculates the barycentric position coordinates of each spot 106 in each of the first skin image 100 and the second skin image 102, with respect to the reference point determined in step S106 (step S108).
- FIG. 6 is a schematic diagram showing an example of this process.
- the centroid position calculation unit 16 calculates the position coordinates of the centroid 108 of each of the spots 106a to 106d in the first skin image 100 (and the second skin image 102).
- the position coordinates of the center of gravity 108 can be calculated from the position coordinates of all the pixels constituting the spot area, for example. Specifically, the position coordinates of the center of gravity 108 can be obtained by calculating the average of the X coordinates and Y coordinates of all the pixels constituting the spot area.
- the position coordinates of the center of gravity 108 may instead be calculated by one of the following procedures:
  (1) Calculation from all pixels constituting the outline of each spot: if the number of pixels constituting the outline is n, the X coordinate of the center of gravity 108 is the sum of the X coordinates of all those pixels divided by n, and the Y coordinate is calculated similarly.
  (2) Weighted calculation from all pixels constituting each spot area: after weighting the coordinates of each pixel by its melanin density (multiplying the coordinates by the density), the averages of the X coordinates and the Y coordinates are obtained.
  (3) Ellipse fitting: an ellipse is fitted to the spot, and its center is used.
  The above are examples, and other methods can be selected as appropriate.
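The default centroid calculation (average of the coordinates of all pixels in the spot area) and the melanin-weighted variant of method (2) can be sketched as:

```python
import numpy as np

def centroid(pixels):
    """pixels: list of (x, y) coordinates of all pixels in the spot area."""
    pts = np.asarray(pixels, dtype=float)
    return pts.mean(axis=0)  # average of X and of Y coordinates

def weighted_centroid(pixels, melanin):
    """Each pixel's coordinates are weighted by its melanin density."""
    pts = np.asarray(pixels, dtype=float)
    w = np.asarray(melanin, dtype=float)
    return (pts * w[:, None]).sum(axis=0) / w.sum()

pixels = [(0, 0), (2, 0), (0, 2), (2, 2)]
c = centroid(pixels)                              # geometric center (1, 1)
wc = weighted_centroid(pixels, [1, 1, 1, 3])      # pulled toward the dark pixel
```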
- next, the matching processing unit 17 associates the spots 106 with each other based on the barycentric position coordinates of the spots 106 in the first skin image 100 and the second skin image 102 (step S110).
- specifically, the matching processing unit 17 performs a first search step of detecting, for each spot 106 in the first skin image 100, the corresponding spot 106 in the second skin image 102, and a second search step of detecting, for each spot 106 in the second skin image 102, the corresponding spot 106 in the first skin image 100.
- FIG. 7 is a schematic diagram showing an example of this process.
- FIG. 7 is a diagram schematically illustrating an example of the relationship between the coordinate axis of each skin image and the amount of melanin.
- (a) corresponds to the first skin image 100
- (b) corresponds to the second skin image 102.
- in the first search step, for each spot 106a of the first skin image 100, the matching processing unit 17 searches the second skin image 102 within a predetermined search range 110 centered on the target coordinate (“a” in FIG. 7B) of the second skin image 102 corresponding to the barycentric position coordinate 109a of the spot 106a.
- the matching processing unit 17 associates with the spot 106a of the first skin image 100 the spot 106g whose barycentric position coordinate 109g is closest to the target coordinate a among the spots having barycentric position coordinates within the search range 110 (indicated by the dashed arrow in the figure), and performs this process for each spot of the first skin image 100.
- when no spot has barycentric position coordinates within the search range 110, the matching processing unit 17 records, for that spot of the first skin image 100, that no corresponding spot exists in the second skin image 102.
- similarly, in the second search step, for each stain 106g of the second skin image 102, the matching processing unit 17 searches the first skin image 100 within a predetermined search range 112 centered on the target coordinate (“g” in FIG. 7A) of the first skin image 100 corresponding to the barycentric position coordinate 109g of the stain 106g.
- the matching processing unit 17 associates with the stain 106g of the second skin image 102 the stain 106a whose barycentric position coordinate 109a is closest to the target coordinate g among the stains having barycentric position coordinates within the search range 112 (indicated by the solid arrow in the figure), and performs this process for each spot of the second skin image 102.
- when no spot has barycentric position coordinates within the search range 112, the matching processing unit 17 records, for that spot of the second skin image 102, that no corresponding spot exists in the first skin image 100.
- the range (size) of the search range 110 can be, for example, within a circle centered on the target coordinates with a radius of at least 1 pixel and less than the number of pixels corresponding to an actual size of 40 mm; preferably, it can be within a circle centered on the target coordinates with a radius of the number of pixels corresponding to an actual size of 1 mm to 2 mm.
- the actual size refers to the actual size of the subject from which the skin image is captured, and is the actual size of a part such as a cheek.
- the range (size) of the search range 112 can be the same as the search range 110, for example.
- the search range 110 and the size of the search range 112 may be different.
- the size of the search range may also be determined dynamically based on the average area of the spots detected in step S104 of FIG. 3. In this case too, the sizes of the search range 110 and the search range 112 may be the same or different.
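The nearest-centroid search of step S110 can be sketched as below, a minimal version assuming centroids are already in a common coordinate system after alignment; `radius` and the spot ids are illustrative. Running it in both directions corresponds to the first and second search steps.

```python
import math

def match_spots(src, dst, radius):
    """For each centroid in src, find the nearest centroid in dst within
    radius, or None if none lies inside. src, dst: id -> (x, y) dicts."""
    result = {}
    for sid, (sx, sy) in src.items():
        best, best_d = None, radius
        for did, (dx, dy) in dst.items():
            d = math.hypot(sx - dx, sy - dy)
            if d <= best_d:
                best, best_d = did, d
        result[sid] = best   # None means "no corresponding spot"
    return result

first = {"a": (10.0, 10.0), "b": (12.5, 10.0), "d": (40.0, 40.0)}
second = {"g": (11.0, 10.0)}
fwd = match_spots(first, second, radius=3.0)   # first search step
rev = match_spots(second, first, radius=3.0)   # second search step
```

In this toy data both "a" and "b" map to "g" (a merge candidate), while "d" has no match within the radius.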
- FIG. 8 is a diagram schematically showing an example of a correspondence relationship between a plurality of spots on the first skin image 100 and the second skin image 102 when the above processing is performed.
- (a) corresponds to the first skin image 100
- (b) corresponds to the second skin image 102.
- Broken arrows in the figure indicate the results associated in the first search step.
- the solid line arrow in the figure indicates the result associated in the second search step.
- for example, in the second search step, the spot 106g of the second skin image 102 is associated only with the spot 106b of the first skin image 100, whereas in the first search step, both the stain 106a and the stain 106b of the first skin image 100 are associated with the stain 106g of the second skin image 102. Thereby, the combining (merging) of stains can be detected.
- likewise, in the first search step, the spot 106c of the first skin image 100 is associated only with the spot 106i of the second skin image 102, whereas in the second search step, both the stain 106h and the stain 106i of the second skin image 102 are associated with the stain 106c of the first skin image 100. Thereby, the division (splitting) of a stain can be detected.
- in the first search step, it is possible to detect that no stain corresponding to the stain 106d of the first skin image 100 is included in the second skin image 102. Thereby, the disappearance of a stain can be detected.
- in the second search step, it is also possible to detect stains included in the second skin image 102 that have no corresponding stain in the first skin image 100. Thereby, the occurrence of a new spot can be detected.
- the correspondence between each spot can be detected in detail by performing both the first search step and the second search step. Note that either the first search step or the second search step may be performed first.
- the spot evaluation unit 18 evaluates a spot change pattern and the like based on the result of the matching processing step by the matching processing unit 17 (step S112).
- FIGS. 9A to 9E are diagrams showing change patterns of spots.
- in the first search step by the matching processing unit 17, when no spot having barycentric position coordinates within the search region of the second skin image 102 corresponds to a spot 106 of the first skin image 100, the spot evaluation unit 18 evaluates that the spot of the first skin image 100 has disappeared (FIG. 9A).
- in the second search step by the matching processing unit 17, when no spot having barycentric position coordinates within the search region of the first skin image 100 corresponds to a spot of the second skin image 102, the spot evaluation unit 18 evaluates that the spot of the second skin image 102 is newly generated (FIG. 9B).
- when, in the first search step by the matching processing unit 17, a plurality of spots 106 of the first skin image 100 are associated with one spot 106 of the second skin image 102, the spot evaluation unit 18 evaluates that the plurality of spots 106 of the first skin image 100 have combined (FIG. 9C).
- when, in the second search step by the matching processing unit 17, a plurality of spots of the second skin image 102 are associated with one spot of the first skin image 100, the spot evaluation unit 18 evaluates that the spot 106 of the first skin image 100 has split (FIG. 9D).
- when, as a result of the matching processing step by the matching processing unit 17, a spot 106 of the first skin image 100 and a spot 106 of the second skin image 102 are associated with each other one to one, the spot evaluation unit 18 determines that the spot is maintained (FIG. 9E).
- the figures shown with broken lines between the first skin image 100 and the second skin image 102 indicate the transition from the state of the first skin image 100 to the state of the second skin image 102.
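The five change patterns of FIGS. 9A to 9E can be derived from the two search results; the sketch below assumes the two directions are available as dictionaries (`None` marking "no corresponding spot"), with the spot ids purely illustrative.

```python
def classify_changes(fwd, rev):
    """fwd: first-image spot id -> matched second-image id (or None).
    rev: second-image spot id -> matched first-image id (or None)."""
    patterns = []
    for sid, did in fwd.items():
        if did is None:
            patterns.append((sid, None, "disappeared"))      # FIG. 9A
        elif sum(1 for v in fwd.values() if v == did) > 1:
            patterns.append((sid, did, "combined"))          # FIG. 9C
        elif sum(1 for v in rev.values() if v == sid) > 1:
            patterns.append((sid, did, "split"))             # FIG. 9D
        else:
            patterns.append((sid, did, "maintained"))        # FIG. 9E
    for did, sid in rev.items():
        if sid is None:
            patterns.append((None, did, "generated"))        # FIG. 9B
    return patterns

fwd = {"a": "g", "b": "g", "c": "i", "d": None}
rev = {"g": "b", "h": "c", "i": "c", "j": None}
changes = classify_changes(fwd, rev)
```

With this toy data, "a" and "b" combine into "g", "c" splits into "h" and "i", "d" disappears, and "j" is newly generated.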
- FIG. 10A to FIG. 10C are diagrams showing an example of information regarding each stain obtained by the processes of the spot detecting unit 15, the center-of-gravity position calculating unit 16, the matching processing unit 17, and the spot evaluating unit 18. Such information is stored in the storage unit 13.
- FIG. 10A and FIG. 10B show an example of the stain information related to each stain obtained by the processing of the stain detection unit 15, the gravity center position calculation unit 16, and the stain evaluation unit 18.
- the stain information includes items such as a stain ID, a gravity center position, an area, a melanin amount, and a stain classification.
- the spot ID is information for identifying each spot.
- the barycentric position indicates the barycentric position coordinate of each spot.
- the area indicates the area of each stain.
- the amount of melanin indicates the amount of melanin in each stain.
- the melanin amount may be an average value over the spot, or may be stored in association with each pixel constituting the spot.
- the spot classification indicates the classification of the spots evaluated by the spot evaluation unit 18 based on the area of each spot and the amount of melanin.
- FIG. 10C shows an example of the spot correspondence information, obtained by the processing of the matching processing unit 17 and the spot evaluation unit 18 based on the spot information shown in FIGS. 10A and 10B, which indicates the correspondence of each spot between the first skin image 100 and the second skin image 102 together with its change pattern and the like.
- The spot correspondence information includes items such as the spot ID in the first skin image 100, the spot ID in the second skin image 102, a change pattern, and a state change.
- The change pattern indicates how a spot included in the first skin image 100 has changed in the second skin image 102.
- The state change indicates a change of state such as a change in the area of the spot or a change in the amount of melanin in the spot.
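One way to hold the spot information of FIGS. 10A and 10B and the spot correspondence information of FIG. 10C is as a pair of plain records. The field names below follow the items listed above, but the record layout itself is illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SpotInfo:                      # one row of FIG. 10A / 10B
    spot_id: int
    centroid: Tuple[float, float]    # center-of-gravity position (x, y)
    area: float                      # area of the spot
    melanin: float                   # e.g. average melanin amount
    classification: str              # spot classification

@dataclass
class SpotCorrespondence:            # one row of FIG. 10C
    first_id: Optional[int]          # None if the spot newly appeared
    second_id: Optional[int]         # None if the spot disappeared
    change_pattern: str              # maintained / split / merged / ...
    state_change: str                # e.g. "area -20%, melanin -8%"

def area_change(first: SpotInfo, second: SpotInfo) -> float:
    """Relative area change between the two photographing times,
    one possible ingredient of the 'state change' item."""
    return (second.area - first.area) / first.area
```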
- In the accuracy verification, a predetermined medicine was applied to a subject for a predetermined period; the skin image before application was defined as the first skin image 100, and the skin image after application was defined as the second skin image 102. The matching result was compared with a result confirmed visually one spot at a time. As a result, it was confirmed that the spot evaluation apparatus 10 of the present embodiment can match spots between the first skin image 100 and the second skin image 102 with high accuracy.
- FIGS. 11A and 11B are diagrams extracting a part of the skin image before application (first skin image 100) and the skin image after application (second skin image 102) used in the accuracy verification example. FIG. 11A shows the first skin image 100, and FIG. 11B shows the second skin image 102.
- The spot 106e shown in FIG. 11A is associated with the spots 106f and 106g shown in FIG. 11B.
- As described above, the spot evaluation process in this embodiment can capture and evaluate changes in each spot included in the skin image, which could not be grasped conventionally. Specifically, by associating the plurality of spots respectively included in the first skin image and the second skin image, which are obtained by photographing the same target portion of the same subject at different times, the change of each spot can be grasped: for example, how small spots change, how large spots change, how dark spots change, and how light spots change.
- Although the first skin image 100 and the second skin image 102 have been described above, the number of skin images acquired by the image acquisition unit 14 may be three or more. In this case, for example, it is possible to detect the correspondence between spots included in skin images that are adjacent in time series.
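When three or more skin images are acquired, the pairwise correspondences between consecutive images can be chained to follow a single spot through the whole series. A minimal sketch of that idea, with invented names and a plain-dict representation of the correspondences:

```python
def chain_correspondences(pairwise):
    """pairwise[t] maps a spot ID at time t to its counterpart at
    time t+1 (or None when it has no counterpart).  Returns, for
    each spot present at time 0, its trajectory of IDs through
    the series; a trailing None marks a disappeared spot."""
    tracks = {}
    for spot in pairwise[0]:
        trail, cur = [spot], spot
        for step in pairwise:
            cur = step.get(cur)      # follow the spot one step forward
            trail.append(cur)
            if cur is None:          # spot disappeared mid-series
                break
        tracks[spot] = trail
    return tracks
```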
- The color of the skin is mainly determined by light absorption by the hemoglobin pigment and the melanin pigment.
- The pigments constituting the skin color, such as the hemoglobin and melanin components, are not uniformly distributed inside the skin; when a pigment is generated locally and excessively, the color of the skin surface becomes uneven (this state is generally referred to as color unevenness).
- Among color unevenness, symptoms caused by the hemoglobin pigment include "acne" and "acne scars", and symptoms caused by the melanin pigment include "spots (senile pigment spots or solar lentigines, post-inflammatory pigmentation, freckles, melasma, etc.)" and "moles".
- In the present embodiment, the case where the uneven color portion is a spot has been described, but the uneven color portion is not limited to a spot; it may be, for example, acne, an acne scar, a pimple, a burn mark, or a mole, and may be detected based on a pigment component such as a melanin component or a hemoglobin component, a color value, or the like.
- The color unevenness portion evaluation process of the present application is particularly useful when a color unevenness portion that changes with time is targeted.
- The spot evaluation unit 18 in the present embodiment can display the first skin image 100 and the second skin image 102 as shown in FIGS. 11A and 11B on the output device 22, such as a display, via the output unit 12.
- The spot evaluation device 10 can be configured so that, when a predetermined spot 106c is selected from the first skin image 100 with a pointer or the like, the corresponding spots 106h and 106i in the second skin image 102 are highlighted. Thereby, the user can also visually confirm how each spot in the first skin image 100 has changed in the second skin image 102.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dermatology (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Geometry (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
(Hardware configuration)
FIG. 2 is a diagram showing an example of a hardware configuration capable of realizing the spot evaluation process. The spot evaluation apparatus 10 includes an input device 21, an output device 22, a drive device 23, an auxiliary storage device 24, a memory device 25, a CPU (Central Processing Unit) 26 that performs various kinds of control, and a network connection device 27, which are mutually connected via a system bus B.
(Spot evaluation process)
FIG. 3 is a flowchart showing an example of the spot evaluation process.
(Spot detection process)
FIG. 4 is a flowchart showing in detail the spot detection process of step S104. The following process is performed on each of the first skin image and the second skin image.
Y = 0.001110 × R + 0.002080 × G + 0.000065 × B + 2.359088
Z = 0.000439 × R + 0.000610 × G + 0.002439 × B + 2.757769   (Equation 1)
The XYZ values obtained from (Equation 1) can be converted into pigment components such as the melanin component and the hemoglobin component using the following (Equation 2), by the method described in Japanese Patent No. 3727807 by the present applicant.
Hemoglobin amount = -32.218 × log10(1/X) + 37.499 × log10(1/Y) - 4.495 × log10(1/Z) + 0.444   (Equation 2)
Next, the spot detection unit 15 removes low-frequency components from the pigment component distribution image obtained in step S11. This makes it possible to exclude the influence of large undulations corresponding to, for example, shadows caused by the shape of the face. The spot detection unit 15 removes, for example, a band with a half width of about 40.0 mm or more as the influence of shadows. Specifically, the spot detection unit 15 first generates a low-frequency component image using a bandpass filter such as a Gaussian function (step S12). Next, the spot detection unit 15 subtracts the low-frequency component image obtained in step S12 from the pigment component distribution image obtained in step S11 (step S13).
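The shadow-removal idea of steps S12 and S13 (blur to isolate the low-frequency component, then subtract it from the original) can be sketched in one dimension. This is a simplified illustration of the principle, not the patent's filter; the kernel size, sigma, and test signal are arbitrary choices for the example.

```python
import math

def gaussian_kernel(sigma, radius):
    """Discrete, normalized Gaussian weights on [-radius, radius]."""
    w = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    s = sum(w)
    return [v / s for v in w]

def remove_low_frequency(signal, sigma=5.0, radius=15):
    """Step S12: Gaussian blurring extracts the low-frequency component
    (slow undulation such as a shadow).  Step S13: subtracting it from
    the original leaves the local pigment variation."""
    k = gaussian_kernel(sigma, radius)
    n = len(signal)
    low = []
    for i in range(n):
        acc = wsum = 0.0
        for j, kv in enumerate(k):
            idx = i + j - radius
            if 0 <= idx < n:            # renormalize at the borders
                acc += kv * signal[idx]
                wsum += kv
        low.append(acc / wsum)
    return [s - l for s, l in zip(signal, low)]  # step S13
```

Applied to a slow ramp (standing in for a shadow) with a narrow spike (standing in for a spot), the ramp is flattened to near zero while the spike survives.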
(1) Calculation from all pixels constituting the contour of each spot
When the contour of a spot consists of n pixels, the X coordinate of the center of gravity 108 is obtained by summing the X coordinates of all those points and dividing by n. The Y coordinate of the center of gravity 108 is calculated in the same way.
(2) Calculation by weighting all pixels constituting each spot region by melanin concentration
The center of gravity is obtained by weighting all pixels constituting the spot region by melanin concentration (multiplying the coordinates by it) and then calculating the averages of the X coordinates and of the Y coordinates.
(3) Obtaining the center of an ellipse by ellipse fitting
The above are merely examples, and other methods can be selected as appropriate.
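Methods (1) and (2) above reduce to a plain mean and a weighted mean of pixel coordinates. A minimal sketch, reading "average after weighting" in (2) as the usual weighted centroid (divide by the total weight); function names are illustrative:

```python
def centroid_from_contour(contour):
    """Method (1): mean of all contour pixel coordinates (x, y)."""
    n = len(contour)
    return (sum(x for x, _ in contour) / n,
            sum(y for _, y in contour) / n)

def centroid_melanin_weighted(pixels, melanin):
    """Method (2): melanin-concentration-weighted mean over all
    pixels of the spot region; pixels and melanin run in parallel."""
    total = sum(melanin)
    x = sum(px * m for (px, _), m in zip(pixels, melanin)) / total
    y = sum(py * m for (_, py), m in zip(pixels, melanin)) / total
    return (x, y)
```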
(Accuracy verification example)
Next, the results of verifying the accuracy of the spot matching process by the spot evaluation apparatus 10 of the present embodiment will be described. Here, a predetermined medicine was applied to a subject for a predetermined period, the skin image before application was taken as the first skin image 100, and the skin image after application was taken as the second skin image 102; the spot evaluation apparatus 10 of the present embodiment was then used to associate the spots in the first skin image 100 with the spots in the second skin image 102. The result was compared with a result confirmed visually one spot at a time.
11 Input unit
12 Output unit
13 Storage unit
14 Image acquisition unit
15 Spot detection unit
16 Center-of-gravity position calculation unit
17 Matching processing unit
18 Spot evaluation unit
19 Control unit
21 Input device
22 Output device
23 Drive device
24 Auxiliary storage device
25 Memory device
26 CPU
27 Network connection device
28 Recording medium
100 First skin image
102 Second skin image
106 Spot
108 Center of gravity
110 Search range
112 Search range
Claims (11)
- A color unevenness portion evaluation method comprising:
a color unevenness portion detection step of detecting a plurality of color unevenness portions from each of a first skin image and a second skin image different from the first skin image;
a center-of-gravity position calculation step of calculating the center-of-gravity position coordinates of each color unevenness portion in each of the first skin image and the second skin image; and
a matching processing step of associating the plurality of color unevenness portions included in the first skin image with the plurality of color unevenness portions included in the second skin image, based on the center-of-gravity position coordinates of each color unevenness portion calculated in the center-of-gravity position calculation step.
- The color unevenness portion evaluation method according to claim 1, wherein the first skin image and the second skin image are skin images obtained by photographing the same target portion of the same subject at different times.
- The color unevenness portion evaluation method according to claim 1, further comprising a step of aligning the first skin image and the second skin image based on the plurality of color unevenness portions detected from the first skin image and the plurality of color unevenness portions detected from the second skin image, and determining a reference point for the coordinates,
wherein in the center-of-gravity position calculation step, the center-of-gravity position coordinates of each color unevenness portion relative to the reference point are calculated.
- The color unevenness portion evaluation method according to claim 1, wherein the color unevenness portion is a spot, and in the color unevenness portion detection step, the plurality of color unevenness portions are detected based on a distribution image of a melanin component or a hemoglobin component.
- The color unevenness portion evaluation method according to claim 1, wherein the matching processing step includes a first search step of searching, for each color unevenness portion of the first skin image, the second skin image within a search region defined as a predetermined range centered on the target coordinates of the second skin image corresponding to the center-of-gravity position coordinates of that color unevenness portion; associating, among the color unevenness portions whose center-of-gravity position coordinates lie within the search region, the color unevenness portion whose center-of-gravity position coordinates are closest to the target coordinates with that color unevenness portion of the first skin image; and, when no color unevenness portion has its center-of-gravity position coordinates within the search region, associating with that color unevenness portion of the first skin image an indication that no corresponding color unevenness portion exists in the second skin image.
- The color unevenness portion evaluation method according to claim 5, wherein the matching processing step includes a second search step of searching, for each color unevenness portion of the second skin image, the first skin image within a search region defined as a predetermined range centered on the target coordinates of the first skin image corresponding to the center-of-gravity position coordinates of that color unevenness portion; associating, among the color unevenness portions whose center-of-gravity position coordinates lie within the search region, the color unevenness portion whose center-of-gravity position coordinates are closest to the target coordinates with that color unevenness portion of the second skin image; and, when no color unevenness portion has its center-of-gravity position coordinates within the search region, associating with that color unevenness portion of the second skin image an indication that no corresponding color unevenness portion exists in the first skin image.
- The color unevenness portion evaluation method according to claim 6, wherein the second skin image is a skin image obtained by photographing the same target portion of the same subject as the first skin image at a later time than the first skin image, and
the method further comprises an evaluation step of evaluating a change of the color unevenness portions between the first skin image and the second skin image, based on the association made in the matching processing step between the plurality of color unevenness portions included in the first skin image and the plurality of color unevenness portions included in the second skin image.
- The color unevenness portion evaluation method according to claim 7, wherein in the evaluation step, when no color unevenness portion has its center-of-gravity position coordinates within the search region in the first search step of the matching processing step, the color unevenness portion of the first skin image is evaluated as having disappeared, and when no color unevenness portion has its center-of-gravity position coordinates within the search region in the second search step, the color unevenness portion of the second skin image is evaluated as having newly appeared.
- The color unevenness portion evaluation method according to claim 7, wherein in the evaluation step, when a plurality of color unevenness portions of the first skin image are associated with one color unevenness portion of the second skin image in the first search step, the plurality of color unevenness portions of the first skin image are evaluated as having merged, and when a plurality of color unevenness portions of the second skin image are associated with one color unevenness portion of the first skin image in the second search step, the color unevenness portion of the first skin image is evaluated as having split.
- A color unevenness portion evaluation apparatus comprising:
a color unevenness portion detection unit that detects a plurality of color unevenness portions from each of a first skin image and a second skin image different from the first skin image;
a center-of-gravity position calculation unit that calculates the center-of-gravity position coordinates of each color unevenness portion in each of the first skin image and the second skin image; and
a matching processing unit that associates the plurality of color unevenness portions included in the first skin image with the plurality of color unevenness portions included in the second skin image, based on the center-of-gravity position coordinates of each color unevenness portion calculated by the center-of-gravity position calculation unit.
- A computer-readable recording medium recording a color unevenness portion evaluation program for causing a computer to function as:
color unevenness portion detection means for detecting a plurality of color unevenness portions from each of a first skin image and a second skin image different from the first skin image;
center-of-gravity position calculation means for calculating the center-of-gravity position coordinates of each color unevenness portion in each of the first skin image and the second skin image; and
matching processing means for associating the plurality of color unevenness portions included in the first skin image with the plurality of color unevenness portions included in the second skin image, based on the center-of-gravity position coordinates of each color unevenness portion calculated by the center-of-gravity position calculation means.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020187031765A KR20180130553A (ko) | 2016-04-15 | 2017-02-22 | 색 얼룩 부위 평가 방법 및 색 얼룩 부위 평가 장치 |
US16/090,666 US10786197B2 (en) | 2016-04-15 | 2017-02-22 | Evaluation method for site of color irregularity and color irregularity site evaluation apparatus |
EP17782122.0A EP3443899A4 (en) | 2016-04-15 | 2017-02-22 | METHOD FOR EVALUATING A COLOR RUNNING LIQUID AND COLOR INSPECTION MEASUREMENT APPARATUS |
CN201780022594.8A CN109069065B (zh) | 2016-04-15 | 2017-02-22 | 色斑部位的评价方法及色斑部位评价装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016082380A JP6650819B2 (ja) | 2016-04-15 | 2016-04-15 | 色ムラ部位の評価方法、色ムラ部位評価装置及び色ムラ部位評価プログラム |
JP2016-082380 | 2016-04-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017179304A1 true WO2017179304A1 (ja) | 2017-10-19 |
Family
ID=60042637
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/006522 WO2017179304A1 (ja) | 2016-04-15 | 2017-02-22 | 色ムラ部位の評価方法及び色ムラ部位評価装置 |
Country Status (7)
Country | Link |
---|---|
US (1) | US10786197B2 (ja) |
EP (1) | EP3443899A4 (ja) |
JP (1) | JP6650819B2 (ja) |
KR (1) | KR20180130553A (ja) |
CN (1) | CN109069065B (ja) |
TW (1) | TW201738842A (ja) |
WO (1) | WO2017179304A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3560375B1 (en) * | 2016-12-20 | 2023-08-16 | Shiseido Company, Ltd. | Application control device, application device, application control method, and recording medium |
US11612350B2 (en) * | 2017-11-07 | 2023-03-28 | Canfield Scientific, Incorporated | Enhancing pigmentation in dermoscopy images |
JP7512167B2 (ja) | 2020-10-26 | 2024-07-08 | 株式会社 資生堂 | 頬画像中のシミ分布に基づく肌分類 |
JP7152727B2 (ja) * | 2021-02-25 | 2022-10-13 | エバ・ジャパン 株式会社 | 生体情報算出装置及び生体情報算出方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090327890A1 (en) * | 2008-06-26 | 2009-12-31 | Raytheon Company | Graphical user interface (gui), display module and methods for displaying and comparing skin features |
JP2012211886A (ja) * | 2011-03-24 | 2012-11-01 | Nippon Menaade Keshohin Kk | メラニン合成能力評価方法及び美容アドバイス方法並びにそれらを用いるメラニン合成能力評価システム及び美容アドバイスシステム |
JP2013090752A (ja) * | 2011-10-25 | 2013-05-16 | Fujifilm Corp | シミ分類方法、シミ分類装置およびシミ分類プログラム |
JP2015198785A (ja) * | 2014-04-08 | 2015-11-12 | キヤノン株式会社 | 診療支援装置、診療支援方法及びプログラム |
JP2015205222A (ja) * | 2015-08-20 | 2015-11-19 | 花王株式会社 | 体表評価方法および体表評価装置 |
JP2016096931A (ja) * | 2014-11-19 | 2016-05-30 | 株式会社 資生堂 | シミ評価装置、シミ評価方法、及びシミ評価プログラム |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6624860B1 (en) * | 1998-01-26 | 2003-09-23 | Sharp Kabushiki Kaisha | Color filter layer providing transmitted light with improved brightness and display device using same |
JP3727807B2 (ja) | 1999-06-14 | 2005-12-21 | 株式会社資生堂 | 皮膚中成分および皮膚特性の測定方法および測定装置 |
JP4294880B2 (ja) | 2000-03-06 | 2009-07-15 | 富士フイルム株式会社 | 画像の位置合わせ方法および装置 |
JP3734741B2 (ja) | 2001-11-12 | 2006-01-11 | 株式会社資生堂 | しみ・そばかす評価方法 |
EP1681709A4 (en) * | 2003-10-16 | 2008-09-17 | Nikon Corp | DEVICE AND METHOD FOR MEASURING OPTICAL CHARACTERISTICS, EXPOSURE SYSTEM AND EXPOSURE METHOD AND COMPONENT MANUFACTURING METHOD |
US20070002342A1 (en) * | 2005-06-29 | 2007-01-04 | Xerox Corporation | Systems and methods for evaluating named colors against specified print engines |
JP4958483B2 (ja) * | 2006-06-19 | 2012-06-20 | キヤノン株式会社 | 記録装置 |
JP5571651B2 (ja) * | 2008-03-18 | 2014-08-13 | コーニンクレッカ フィリップス エヌ ヴェ | 皮膚撮像装置及び皮膚分析システム |
US8194952B2 (en) * | 2008-06-04 | 2012-06-05 | Raytheon Company | Image processing system and methods for aligning skin features for early skin cancer detection systems |
US20110040192A1 (en) * | 2009-05-21 | 2011-02-17 | Sara Brenner | Method and a system for imaging and analysis for mole evolution tracking |
JP5426475B2 (ja) * | 2010-05-21 | 2014-02-26 | 株式会社 資生堂 | 肌の色ムラ解析装置、肌の色ムラ解析方法、及び肌の色ムラ解析プログラム |
US20130188878A1 (en) | 2010-07-20 | 2013-07-25 | Lockheed Martin Corporation | Image analysis systems having image sharpening capabilities and methods using same |
JP5809498B2 (ja) * | 2010-10-19 | 2015-11-11 | キヤノン株式会社 | 光源ユニットの調整装置及び製造方法 |
US8554016B2 (en) | 2010-11-10 | 2013-10-08 | Raytheon Company | Image registration system and method for registering images for deformable surfaces |
JP6080427B2 (ja) * | 2012-08-10 | 2017-02-15 | キヤノン株式会社 | シャック・ハルトマンセンサとそれを利用した波面計測方法 |
JP6299594B2 (ja) * | 2012-08-17 | 2018-03-28 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラムおよび画像処理システム |
US20140313303A1 (en) * | 2013-04-18 | 2014-10-23 | Digimarc Corporation | Longitudinal dermoscopic study employing smartphone-based image registration |
WO2014172671A1 (en) * | 2013-04-18 | 2014-10-23 | Digimarc Corporation | Physiologic data acquisition and analysis |
JP6179196B2 (ja) | 2013-05-31 | 2017-08-16 | 富士通株式会社 | データセンター |
JP6040103B2 (ja) * | 2013-06-06 | 2016-12-07 | 浜松ホトニクス株式会社 | 補償光学システムの対応関係特定方法、補償光学システム、および補償光学システム用プログラム |
JP6026655B2 (ja) * | 2013-06-07 | 2016-11-16 | 富士フイルム株式会社 | 透明感評価装置、透明感評価装置の作動方法、透明感評価方法および透明感評価プログラム |
GB201310854D0 (en) * | 2013-06-18 | 2013-07-31 | Isis Innovation | Photoactive layer production process |
CN104318239A (zh) * | 2014-11-14 | 2015-01-28 | 江南大学 | 基于纹理分析的快速图像特征提取方法 |
JP6868847B2 (ja) * | 2016-06-29 | 2021-05-12 | パナソニックIpマネジメント株式会社 | 画像処理装置および画像処理方法 |
- 2016
- 2016-04-15 JP JP2016082380A patent/JP6650819B2/ja active Active
- 2017
- 2017-02-22 KR KR1020187031765A patent/KR20180130553A/ko unknown
- 2017-02-22 WO PCT/JP2017/006522 patent/WO2017179304A1/ja active Application Filing
- 2017-02-22 CN CN201780022594.8A patent/CN109069065B/zh active Active
- 2017-02-22 EP EP17782122.0A patent/EP3443899A4/en not_active Withdrawn
- 2017-02-22 US US16/090,666 patent/US10786197B2/en active Active
- 2017-04-11 TW TW106111985A patent/TW201738842A/zh unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP3443899A4 * |
Also Published As
Publication number | Publication date |
---|---|
KR20180130553A (ko) | 2018-12-07 |
JP2017189536A (ja) | 2017-10-19 |
TW201738842A (zh) | 2017-11-01 |
CN109069065B (zh) | 2021-08-10 |
EP3443899A4 (en) | 2019-10-30 |
EP3443899A1 (en) | 2019-02-20 |
US20190117147A1 (en) | 2019-04-25 |
US10786197B2 (en) | 2020-09-29 |
CN109069065A (zh) | 2018-12-21 |
JP6650819B2 (ja) | 2020-02-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017179304A1 (ja) | 色ムラ部位の評価方法及び色ムラ部位評価装置 | |
TWI701018B (zh) | 資訊處理裝置、資訊處理方法、及程式 | |
US20160035109A1 (en) | Skin dullness evaluation apparatus and skin dullness evaluation method | |
US9135693B2 (en) | Image calibration and analysis | |
WO2013042436A1 (ja) | シワ検出方法、シワ検出装置およびシワ検出プログラム、並びに、シワ評価方法、シワ評価装置およびシワ評価プログラム | |
JP6297941B2 (ja) | うるおい感評価装置、うるおい感評価装置の作動方法およびうるおい感評価プログラム | |
WO2016080266A1 (ja) | シミ評価装置、シミ評価方法、及びプログラム | |
JP5426475B2 (ja) | 肌の色ムラ解析装置、肌の色ムラ解析方法、及び肌の色ムラ解析プログラム | |
EP3142045B1 (en) | Predicting accuracy of object recognition in a stitched image | |
WO2015019573A1 (ja) | 情報処理装置の制御方法および画像処理方法 | |
WO2020095739A1 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
US9286513B2 (en) | Image processing apparatus, method, and storage medium | |
US20160345887A1 (en) | Moisture feeling evaluation device, moisture feeling evaluation method, and moisture feeling evaluation program | |
JP2006061170A (ja) | 皮膚の鑑別法 | |
JP2017012384A (ja) | シワ状態分析装置及びシワ状態分析方法 | |
EP3435281B1 (en) | Skin undertone determining method and an electronic device | |
Kurniastuti et al. | Color Feature Extraction of Fingernail Image based on HSV Color Space as Early Detection Risk of Diabetes Mellitus | |
JP6351550B2 (ja) | ハリ感評価装置、ハリ感評価方法およびハリ感評価プログラム | |
Ferri et al. | Size functions for the morphological analysis of melanocytic lesions | |
JP2023032776A (ja) | 画像処理装置、画像処理方法、及びプログラム | |
KR101038674B1 (ko) | 컬러 인식 방법 및 장치 | |
Breneman | Towards early-stage malignant melanoma detection using consumer mobile devices | |
CN112257782A (zh) | 与问题皮肤部位检测相关的方法、系统及其神经网络训练方法、存储介质 | |
CN114463792A (zh) | 一种多光谱识别方法、装置、设备及可读存储介质 | |
JP2023075384A (ja) | 脈波検出システム、脈波検出方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20187031765 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017782122 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017782122 Country of ref document: EP Effective date: 20181115 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17782122 Country of ref document: EP Kind code of ref document: A1 |