
WO2021131686A1 - Inspection method, program, and inspection system - Google Patents

Inspection method, program, and inspection system

Info

Publication number
WO2021131686A1
Authority
WO
WIPO (PCT)
Prior art keywords
inspection
image
shape
information
inspection method
Prior art date
Application number
PCT/JP2020/045763
Other languages
French (fr)
Japanese (ja)
Inventor
Takanobu Ojima
Hideto Motomura
Rina Akaho
Yoshinori Matsui
Original Assignee
Panasonic IP Management Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2021131686A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00: Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46: Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/55: Specular reflectivity
    • G01N21/57: Measuring gloss
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/60: Analysis of geometric attributes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/90: Determination of colour characteristics

Definitions

  • This disclosure relates to inspection methods, programs, and inspection systems.
  • More specifically, the present disclosure relates to inspection methods, programs, and inspection systems for inspecting the surface condition of an object using images.
  • Patent Document 1 discloses a coloring inspection apparatus.
  • The coloring inspection apparatus disclosed in Patent Document 1 includes a camera having three spectral sensitivities that are linear transformations of the CIE XYZ color-matching functions, an arithmetic processing device that acquires the three images captured with those spectral sensitivities and computes coloring data converted into CIE XYZ tristimulus values, and an illumination unit that illuminates an automobile, which is an example of a measurement object.
  • The coloring inspection apparatus inspects colors by calculating a color distribution matching index that indicates the overlap ratio of the two xyz chromaticity histogram distributions of an inspection object and a reference object.
  • However, the appearance of the color of the inspection object (measurement object, object) can be affected by the shape of its surface. Therefore, even if the color distribution matching index of Patent Document 1, which indicates the overlap ratio of the two xyz chromaticity histogram distributions of the inspection object and the reference object, is high, the visual impressions a person receives from the two objects may differ.
  • An object of the present disclosure is to provide an inspection method, a program, and an inspection system capable of improving the accuracy of inspection of the surface condition of an object.
  • the inspection method of one aspect of the present disclosure includes an acquisition step and an inspection step.
  • the acquisition step is a step of acquiring an object image relating to the surface of the object.
  • the inspection step is a step of inspecting the state of the surface of the object based on the target information including the shape information representing the shape of the frequency distribution of the parameter based on the pixel value obtained from the target image.
  • Another aspect of the program of the present disclosure is a program for causing one or more processors to execute the inspection method of the present disclosure.
  • The inspection system of another aspect of the present disclosure includes an acquisition unit that acquires a target image of the surface of an object, and an inspection unit that inspects the surface condition of the object based on target information including shape information representing the shape of a frequency distribution of a parameter based on pixel values obtained from the target image.
  • FIG. 1 is a flowchart of an inspection method of one embodiment.
  • FIG. 2 is a block diagram of an inspection system that implements the above inspection method.
  • FIG. 3 is an explanatory diagram of a method for generating shape information used in the above inspection method.
  • FIG. 4 is an explanatory diagram of an example of shape information obtained by the inspection system.
  • FIG. 5 is an explanatory diagram of another example of the shape information obtained by the inspection system.
  • FIG. 6 is an explanatory diagram of another example of the shape information obtained by the inspection system.
  • FIG. 7 is an explanatory diagram of an example of the target information and the reference information obtained by the inspection system.
  • FIG. 8 is an explanatory diagram of another example of the target information and the reference information obtained by the inspection system.
  • FIG. 9 is an explanatory diagram of another example of the target information and the reference information obtained by the inspection system.
  • FIG. 10 is a graph showing the matching rate of a plurality of samples prepared as an object with the reference object.
  • FIG. 11 is a graph showing the results of visual evaluation of a plurality of samples prepared as objects.
  • FIG. 12 is a graph showing the relationship between the matching rate and the results of visual evaluation of a plurality of samples prepared as objects.
  • FIG. 13 is an explanatory diagram of a method of generating shape information in one modification.
  • FIG. 14 is an explanatory diagram of a method of generating shape information in one modification.
  • FIG. 15 is an explanatory diagram of a method of acquiring a target image in an inspection method of a modified example.
  • FIG. 16 is an explanatory diagram of an example of the target image.
  • FIG. 17 is an explanatory diagram of an example of the target image.
  • FIG. 18 is an explanatory diagram of an example of a histogram obtained from the target image.
  • FIG. 19 is an explanatory diagram of an example of a histogram obtained from the target image.
  • FIG. 20 is an explanatory diagram of an example of an image of the surface of an object.
  • FIG. 21 is an explanatory diagram of a method of acquiring a target image in an inspection method of a modified example.
  • FIG. 22 is an explanatory diagram of a method of generating shape information of one modification.
  • FIG. 23 is an explanatory diagram of a method of acquiring a target image in the inspection method of one modified example.
  • FIG. 24 is a diagram showing an example of inspection results presented by the inspection system.
  • FIG. 25 is a diagram showing another example of the inspection result presented by the inspection system.
  • FIG. 26 is a diagram showing another example of the inspection result presented by the inspection system.
  • FIG. 27 is a diagram showing another example of the inspection result presented by the inspection system.
  • FIG. 28 is a diagram showing another example of the inspection result presented by the inspection system.
  • Embodiment. (1.1) Outline: The inspection method of one embodiment can be used for inspecting the surface condition of the object 100 as shown in FIG.
  • the surface condition of the object 100 may include the color, shape, texture, glossiness, gradation, and coating condition of the surface of the object 100.
  • The gradation may include not only gradation that arises from the incident and reflection angles when the surface is curved, but also the flip-flop property, the color travel effect, brilliance, and graininess.
  • the object 100 is assumed to be an automobile.
  • the surface of the object 100 is a part of the outer surface of the vehicle body of the automobile.
  • the object 100 is not limited to an automobile.
  • The object 100 may be a moving body other than an automobile. Examples of moving bodies include motorcycles, trains, drones, aircraft, construction machinery, and ships. The object 100 may also be an electric appliance, tableware, a container, furniture, clothing, a building material, or the like. In short, the object 100 may be any object having a surface.
  • FIG. 1 shows a flowchart of the inspection method of the present embodiment.
  • the inspection method of the present embodiment includes acquisition step S11 and inspection step S12.
  • the acquisition step S11 is a step of acquiring an object image relating to the surface of the object 100.
  • the inspection step S12 inspects the state of the surface of the object 100 based on the object information including the shape information representing the shape of the frequency distribution of the parameter based on the pixel value obtained from the object image.
  • the state of the surface of the object 100 is inspected by using the shape information representing the shape of the frequency distribution of the parameters based on the pixel values obtained from the object image on the surface of the object 100.
  • the shape of the frequency distribution itself is used.
  • the inspection method of this embodiment can be executed by the inspection system 1 shown in FIG.
  • the inspection system 1 is a system for inspecting the surface of the object 100.
  • the inspection system 1 has a function as a coloring inspection device.
  • the inspection system 1 can also paint the object 100.
  • the inspection system 1 can paint the object 100 according to the result of the inspection, whereby the object 100 with the desired coating can be obtained.
  • the inspection system 1 includes a determination system 10, a lighting system 20, an imaging system 30, and a painting system 40.
  • the lighting system 20 is a system for irradiating the surface of the object 100 with light.
  • the lighting system 20 includes one or more lamps that illuminate the object 100 with light.
  • the lamp is, for example, an LED (Light Emitting Diode) lamp.
  • the lamp also emits white light.
  • the number of lamps is not particularly limited, and the type of lamp may be a light source other than the LED.
  • the emission color of the lamp is not limited to white.
  • the emission color of the lamp can be appropriately set in consideration of the color of the object 100 and the color detectable by the imaging system 30. Further, the wavelength of the light emitted by the lighting system 20 may be changeable.
  • the imaging system 30 is a system for generating an image (digital image) of the surface of the object 100.
  • the imaging system 30 images the surface of the object 100 illuminated by the lighting system 20 to generate an image of the surface of the object 100.
  • the imaging system 30 includes one or more cameras.
  • the camera comprises one or more image sensors.
  • the camera may include one or more line sensors.
  • the painting system 40 is a system for painting the surface of the object 100.
  • The painting system 40 includes one or more painting units (painting robots). Since a painting robot may have a conventionally known configuration, detailed description is omitted.
  • the determination system 10 includes an input / output unit 11, a storage unit 12, and a processing unit 13.
  • the determination system 10 can be realized by a computer system.
  • a computer system may include one or more processors, one or more connectors, one or more communication devices, one or more memories, and the like.
  • the input / output unit 11 inputs / outputs information between the lighting system 20, the imaging system 30, and the painting system 40.
  • the input / output unit 11 is communicably connected to the lighting system 20, the imaging system 30, and the painting system 40.
  • the input / output unit 11 includes one or more input / output devices and uses one or more input / output interfaces.
  • the storage unit 12 is used to store the information used by the processing unit 13.
  • the storage unit 12 includes one or more storage devices.
  • The storage device is, for example, a RAM (Random Access Memory) or an EEPROM (Electrically Erasable Programmable Read-Only Memory).
  • the processing unit 13 can be realized by, for example, one or more processors (microprocessors). That is, one or more processors execute one or more programs (computer programs) stored in one or more memories, thereby functioning as the processing unit 13.
  • The one or more programs may be recorded in advance in the one or more memories, may be provided through a telecommunications line such as the Internet, or may be provided recorded on a non-transitory recording medium such as a memory card.
  • The processing unit 13 has an acquisition unit F11, an inspection unit F12, a presentation unit F13, and a control unit F14.
  • The acquisition unit F11, the inspection unit F12, the presentation unit F13, and the control unit F14 do not represent physical components; they represent functions realized by the processing unit 13.
  • the acquisition unit F11 executes acquisition step S11 (see FIG. 1) to acquire an object image relating to the surface of the object 100.
  • the target image is an image obtained by capturing the surface of the object 100 illuminated by the lighting system 20 with the imaging system 30.
  • The imaging system 30 images a part of the surface of the object 100, not the entire surface. Therefore, a plurality of target images, each capturing a different portion of the surface of the object 100, are acquired. That is, the inspection is performed for each portion of the surface of the object 100.
  • The acquisition unit F11 acquires an image of the surface of the object 100 from the imaging system 30. That is, the acquisition unit F11 receives an image from the imaging system 30 via the input / output unit 11.
  • the image acquired by the acquisition unit F11 from the imaging system 30 is determined by the imaging conditions of the imaging system 30.
  • the imaging condition may include a relative positional relationship between the object 100, the lighting system 20, and the imaging system 30 (that is, information on the positional relationship between the object to be photographed, the lighting, and the camera).
  • the pixel value of the target image may include the values of three colors.
  • the values of the three colors can be an R value corresponding to red, a G value corresponding to green, and a B value corresponding to blue.
  • the pixel value is represented by the RGB color system.
  • the inspection unit F12 executes the inspection step S12 (see FIG. 1) for inspecting the surface condition of the object 100.
  • For the inspection of the surface condition of the object 100, the inspection unit F12 generates target information from the target image (S12a, S12b in FIG. 1).
  • the target information includes shape information representing the shape of the frequency distribution of the parameter based on the pixel value.
  • the parameter is a color parameter.
  • the inspection unit F12 converts the pixel value of the target image from the RGB color system to the LCh color system.
  • The inspection unit F12 obtains the frequency distribution of each parameter (lightness, saturation, hue) based on the pixel values. Therefore, in the present embodiment, the target information includes shape information for each of the three parameters (lightness, saturation, and hue).
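The patent text does not spell out the RGB-to-LCh conversion. As a reference point only, a minimal pure-Python sketch of the standard sRGB → CIE XYZ → CIE L*a*b* → LCh pipeline (D65 white point assumed; function name hypothetical) might look like this:

```python
import math

def rgb_to_lch(r, g, b):
    """Convert an 8-bit sRGB pixel to (lightness L*, chroma C*, hue h in degrees)
    by way of CIE XYZ and CIE L*a*b*, assuming a D65 white point."""
    # sRGB -> linear RGB
    def lin(u):
        u /= 255.0
        return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    # linear RGB -> XYZ (standard sRGB matrix, D65)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # XYZ -> L*a*b* (D65 reference white)
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b2 = 200 * (fy - fz)
    # L*a*b* -> LCh (chroma and hue angle)
    C = math.hypot(a, b2)
    h = math.degrees(math.atan2(b2, a)) % 360
    return L, C, h
```

Applying this per pixel yields the lightness, saturation (chroma), and hue values from which the three frequency distributions are built.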
  • the shape information includes a feature amount representing the feature of the shape of the frequency distribution.
  • FIG. 3 shows a histogram H10 representing a frequency distribution of lightness.
  • The horizontal axis represents the parameter (lightness), and the vertical axis represents the frequency.
  • the shape of the frequency distribution is the shape of a histogram representing the frequency distribution.
  • the feature quantity includes a plurality of evaluation values and representative values.
  • the plurality of evaluation values are values representing the shape of the histogram representing the frequency distribution.
  • the plurality of evaluation values are treated as vectors.
  • a plurality of evaluation values may be referred to as evaluation vectors.
  • The evaluation vector is calculated based on the area of each of a plurality of divisions along the parameter axis of the frequency distribution. For example, as shown in FIG. 3, the inspection unit F12 obtains the area Si (S1 to S7) of each of the plurality of divisions Di (D1 to D7) along the parameter (lightness) axis of the histogram H10.
  • i is an integer of 1 or more.
  • the plurality of divisions Di have the same width (difference between the upper limit and the lower limit of the division).
  • The inspection unit F12 uses the value obtained by dividing the area Si of a division by the total area S as the evaluation value of that division. That is, the evaluation value corresponds to the ratio of each division to the entire histogram H10.
  • Denoting the evaluation value of the division Di by wi, the evaluation vector of the histogram H10 is (w1, w2, w3, w4, w5, w6, w7).
  • the representative value is the representative value of the frequency distribution.
  • the representative value corresponds to the value (center value) of the parameter (brightness in FIG. 3) that is the center of the frequency distribution.
  • The representative value is obtained from the ratio (the evaluation vector) of each of the plurality of divisions to the whole of the frequency distribution and the values of the parameter corresponding to those divisions.
  • the value of the parameter corresponding to the division Di is Li (L1 to L7).
  • The value Li of the parameter corresponding to the division Di is a representative value of the range of the parameter corresponding to the division Di, and may be any of the average value, mode, median, maximum value, or minimum value.
  • The representative value C satisfies C = Σi (Li * wi), where "*" represents multiplication.
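The evaluation vector and representative value described above can be sketched as follows (a minimal illustration; the function name is hypothetical, equal-width divisions are used as stated in the text, and the division midpoint is chosen as Li):

```python
def shape_features(values, num_bins, lo, hi):
    """Compute the evaluation vector (w1..wn) and the representative value C
    of a frequency distribution: wi is the share of division Di in the whole
    histogram, and C = sum(Li * wi) with Li the midpoint of division Di."""
    width = (hi - lo) / num_bins           # equal division width
    counts = [0] * num_bins
    for v in values:
        i = min(int((v - lo) / width), num_bins - 1)  # clamp the upper edge
        counts[i] += 1
    total = sum(counts)                    # total area S (bin width cancels)
    w = [c / total for c in counts]        # evaluation vector: Si / S
    mids = [lo + (i + 0.5) * width for i in range(num_bins)]  # Li values
    C = sum(Li * wi for Li, wi in zip(mids, w))  # representative value
    return w, C
```

For example, four lightness values falling into a 4-bin histogram over [0, 40) yield an evaluation vector whose entries sum to 1 and a representative value inside the occupied range.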
  • the inspection unit F12 generates shape information for each parameter based on the pixel value.
  • the inspection unit F12 generates shape information for each of lightness, saturation, and hue with respect to the target image.
  • the feature amount includes an evaluation vector and a representative value.
  • FIGS. 4, 5 and 6 are explanatory views of examples of shape information of lightness, saturation and hue, respectively.
  • In FIG. 4, regarding lightness, HT1 shows a histogram, wT1 an evaluation vector, and CT1 a representative value.
  • In FIG. 5, regarding saturation, HT2 shows a histogram, wT2 an evaluation vector, and CT2 a representative value.
  • In FIG. 6, regarding hue, HT3 shows a histogram, wT3 an evaluation vector, and CT3 a representative value.
  • The feature amount of the shape information is a simplified representation of the frequency distribution that preserves its shape and position.
  • the inspection unit F12 compares the target information generated in this way with the reference information (S12c in FIG. 1).
  • the reference information can be stored in advance in the storage unit 12.
  • the reference information includes shape information obtained from a reference image regarding the surface of the reference object that is the reference of the object 100.
  • the reference information is generated by the same method as the target information.
  • As necessary, the inspection unit F12 converts the pixel values of the reference image from the RGB color system to the LCh color system and obtains the frequency distribution of each parameter (lightness, saturation, hue) based on the pixel values.
  • Like the target information, the reference information includes shape information for each of the three parameters (lightness, saturation, and hue).
  • the feature amount includes the evaluation vector and the representative value in each of the lightness, saturation, and hue shape information.
  • FIGS. 7, 8 and 9 are explanatory views of examples of target information and reference information regarding lightness, saturation and hue, respectively.
  • In FIG. 7, wR1 shows an evaluation vector based on the shape of the frequency distribution from the reference image, and CR1 a representative value of that frequency distribution.
  • In FIG. 8, wR2 shows an evaluation vector based on the shape of the frequency distribution from the reference image, and CR2 a representative value of that frequency distribution.
  • In FIG. 9, wR3 shows an evaluation vector based on the shape of the frequency distribution from the reference image, and CR3 a representative value of that frequency distribution.
  • the inspection unit F12 obtains the difference in the feature amount of the shape information for each parameter in the comparison between the target information and the reference information.
  • the difference in the feature amount includes the difference in the evaluation vector and the difference in the representative value.
  • For lightness, the difference Dw1 of the evaluation vectors is wT1 - wR1, and the difference Dc1 of the representative values is CT1 - CR1; Table 1 below shows an example of Dw1, and Table 2 below shows an example of Dc1.
  • For saturation, the difference Dw2 of the evaluation vectors is wT2 - wR2, and the difference Dc2 of the representative values is CT2 - CR2; Table 3 below shows an example of Dw2, and Table 4 below shows an example of Dc2.
  • For hue, the difference Dw3 of the evaluation vectors is wT3 - wR3, and the difference Dc3 of the representative values is CT3 - CR3; Table 5 below shows an example of Dw3, and Table 6 below shows an example of Dc3.
  • The inspection unit F12 obtains a determination value of the evaluation vectors (first determination value E1) and a determination value of the representative values (second determination value E2) from the differences in the feature amounts of the shape information for each parameter.
  • The second determination value satisfies E2 = (Dc1^2 + Dc2^2 + Dc3^2)^(1/2).
  • The inspection unit F12 determines whether the surface condition of the object 100 passes or fails based on the result of the comparison between the target information and the reference information (S12d in FIG. 1). In the present embodiment, the inspection unit F12 judges the surface condition of the object 100 to pass when the first determination value E1 is less than a first specified value and the second determination value E2 is less than a second specified value. Conversely, it judges the surface condition of the object 100 to fail when E1 is equal to or greater than the first specified value or E2 is equal to or greater than the second specified value.
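The judgment step can be sketched as follows. E2 follows the formula given in the text; the exact formula for E1 is not reproduced in this text, so the root-sum-square of the evaluation-vector differences is assumed here as one plausible reading (function and parameter names hypothetical):

```python
import math

def judge(target, reference, spec1, spec2):
    """Pass/fail judgment sketch.  `target` and `reference` each map a
    parameter name to (evaluation_vector, representative_value)."""
    sq_w = 0.0
    sq_c = 0.0
    for param in ("lightness", "saturation", "hue"):
        wT, cT = target[param]
        wR, cR = reference[param]
        dw = [t - r for t, r in zip(wT, wR)]   # Dw = wT - wR, per element
        sq_w += sum(d * d for d in dw)
        sq_c += (cT - cR) ** 2                 # Dc = CT - CR
    E1 = math.sqrt(sq_w)                       # assumed form for E1
    E2 = math.sqrt(sq_c)                       # E2 = (Dc1^2+Dc2^2+Dc3^2)^(1/2)
    passed = E1 < spec1 and E2 < spec2         # pass only if both are small
    return E1, E2, passed
```

When the target equals the reference, both determination values are zero and the surface passes for any positive specified values.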
  • the specified values may be determined by using visual evaluation.
  • a brief explanation will be given on how to determine the specified value using visual evaluation.
  • a plurality of samples d1 to dn with different coating conditions are prepared as the object 100.
  • n is an arbitrary integer of 2 or more.
  • the coating conditions may be set so that the colors become lighter in the order of samples d1 to dn.
  • FIG. 10 shows the relationship between the samples d1 to dn and the matching rate.
  • The matching rate is highest for the sample dk.
  • Each of the samples d1 to dn is visually evaluated by a plurality of people (30 people, as an example) to obtain the correct answer rate.
  • the correct answer rate is the ratio of the number of people who selected the sample dk to the number of people who performed the visual evaluation.
  • FIG. 11 shows the result of visual evaluation.
  • A sample with a correct answer rate of 1.0 can clearly be rejected, and a sample whose correct answer rate is about 0.5, the same as random guessing, can be passed.
  • For the samples d2 to dk-2 and dk+2 to dn-1, whose correct answer rates lie between 0.5 and 1.0, it is further determined which samples are within the permissible range.
  • k is an integer.
  • For example, among the samples dk+2 to dn-1, the sample dm (k+2 ≤ m ≤ n-1) at which the gradient (slope) of the matching rate is large is selected, and visual evaluation is performed on it together with the sample dk+2. If the correct answer rate is 1.0, the sample dm is rejected. Next, the sample dm-1, which has the next highest matching rate after the sample dm, is visually evaluated together with the sample dm; if the correct answer rate is 1.0, the sample dm-1 is also rejected. Then the sample dm-2, which has the next highest matching rate after the sample dm-1, is visually evaluated together with the sample dm-1. If the correct answer rate is 0.5, the sample dm-1 is set as the sample within the permissible range. In this way, the visual evaluation is repeated until the correct answer rate becomes 0.5, and the sample at that point is taken as the sample within the permissible range. However, if the sample dk+2 is reached before the correct answer rate reaches 0.5, the visual evaluation ends with the sample dk+2 as the sample within the permissible range. Note that m is an integer.
  • Conversely, if the correct answer rate is 0.5, the sample dm is passed. Then the sample dm+1, which has the next lowest matching rate after the sample dm, is visually evaluated together with the sample dm, and if the correct answer rate is 0.5, the sample dm+1 is also passed. Then the sample dm+2 and the sample dm+1, which has the next lowest matching rate after the sample dm+1, are visually evaluated; if the correct answer rate is 1.0, the sample dm+1 is set as the sample within the allowable range. In this way, the visual evaluation is repeated until the correct answer rate becomes 1.0, and the sample at that point is set as the sample within the allowable range. However, if the sample dn-1 is reached before the correct answer rate reaches 1.0, the visual evaluation ends with the sample dn-2, immediately before it, as the sample within the allowable range.
  • In this way, the samples at the limits of the allowable range are determined. Then, based on the matching rate of the limit sample selected from the samples d2 to dk-2 and the matching rate of the limit sample selected from the samples dk+2 to dn-1, the specified values (the first specified value and the second specified value) are determined.
  • For example, the specified values (first specified value and second specified value) may be set based on the larger of the two matching rates, the smaller of the two, or their average value.
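The final selection step can be sketched as follows (a hypothetical helper; "larger", "smaller", and "average" are the three policies named in the text, and the mapping from a matching rate to a specified value is left as a direct pass-through for illustration):

```python
def specified_value(limit_rate_low, limit_rate_high, policy="larger"):
    """Derive a specified value from the matching rates of the two
    allowable-range limit samples (one selected from d2..dk-2, the other
    from dk+2..dn-1), using one of the three policies named above."""
    if policy == "larger":
        return max(limit_rate_low, limit_rate_high)
    if policy == "smaller":
        return min(limit_rate_low, limit_rate_high)
    if policy == "average":
        return (limit_rate_low + limit_rate_high) / 2
    raise ValueError(f"unknown policy: {policy}")
```

Which policy is appropriate depends on how strictly borderline coatings should be treated: "smaller" is the most permissive threshold, "larger" the strictest.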
  • the presentation unit F13 executes the presentation step S13 (see FIG. 1) for presenting based on the inspection result (inspection result in the inspection step) in the inspection unit F12. That is, the presentation unit F13 makes a presentation based on the result of the inspection by the inspection unit F12.
  • the presentation based on the inspection result may also include the presentation of the judgment result in the inspection unit F12 using the result of comparison between the target information and the reference information. Therefore, the presentation unit F13 may present the result of the determination by the inspection unit F12.
  • the presentation unit F13 outputs the result of the determination by the inspection unit F12 to the external device through the input / output unit 11.
  • the external device may present the result of the determination by the inspection unit F12, that is, the result of the inspection by the inspection system 1.
  • The control unit F14 executes a control step S14 (see FIG. 1) that outputs control information based on the inspection result of the inspection step S12 to the painting system 40, which paints the surface of the object 100.
  • For example, when the inspection unit F12 determines that the surface condition of the object 100 fails, the control unit F14 outputs control information to the painting system 40 to repaint the object 100.
  • the control information may include data in the format shown in Table 7 below.
  • the control information includes a plurality of items.
  • the plurality of items are "model", “camera”, “part”, “result”, "E1", “E2”, "color space", and "number of bins".
  • "model” indicates information for identifying the inspection system 1.
  • “Camera” indicates the number of the camera of the imaging system 30 used to generate the target image.
  • "Part" indicates information for identifying the part of the object 100 shown in the target image; for example, it indicates a number assigned to that part.
  • "Result" indicates the result of the inspection by the inspection unit F12. In Table 7, the "result" is "NG", indicating that the inspection result is a failure.
  • “E1” indicates a first determination value corresponding to the target image.
  • “E2” indicates a second determination value corresponding to the target image.
  • the “color space” indicates information regarding the color system of the frequency distribution of the target image. Table 7 shows that the "color space” is the LCh color system.
  • “Number of bins” indicates the number of elements of the evaluation vector of each color system. Table 7 shows that the number of elements of the evaluation vector for each of lightness, saturation, and hue is 64.
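A control-information record in the format of Table 7 might be represented as follows. This is a hypothetical sketch: the field values are illustrative only, and the JSON serialization is an assumption, not something the text specifies.

```python
import json

# One control-information record with the items listed above.
control_info = {
    "model": "inspection-system-1",  # identifies the inspection system (illustrative)
    "camera": 3,                     # number of the camera in the imaging system 30
    "part": 12,                      # number assigned to the imaged part of the object
    "result": "NG",                  # inspection result (here: fail)
    "E1": 0.42,                      # first determination value (illustrative)
    "E2": 1.7,                       # second determination value (illustrative)
    "color space": "LCh",            # color system of the frequency distributions
    "number of bins": 64,            # evaluation-vector elements per parameter
}

# Serialized form, e.g. for transmission to the painting system 40.
payload = json.dumps(control_info)
```

The painting system can then parse such a record and decide, from "result", E1, and E2, whether and how to repaint.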
  • The control unit F14 controls the painting system 40 based on the comparison result between the target information and the reference information (the differences in the feature amounts of the shape information for each parameter) contained in the control information. That is, the control unit F14 controls the painting system 40, which paints the surface of the object 100, based on the result of the inspection by the inspection unit F12. As a result, the surface condition of the object 100 can be brought closer to the target condition.
  • the control unit F14 may output control information to the coating system 40 to finish the coating of the object 100.
  • the acquisition unit F11 acquires an object image relating to the surface of the object 100 from the imaging system 30 (S11).
  • the inspection unit F12 inspects the surface condition of the object 100 (S12).
  • the inspection unit F12 obtains the frequency distribution for each parameter based on the pixel value from the target image (S12a).
  • since there are three parameters, lightness, saturation, and hue, the inspection unit F12 obtains a frequency distribution of lightness, a frequency distribution of saturation, and a frequency distribution of hue.
  • the inspection unit F12 generates target information based on the frequency distribution (S12b).
  • the target information includes shape information regarding lightness, saturation, and hue.
  • the shape information includes a feature amount representing the feature of the shape of the frequency distribution, and includes an evaluation vector (plural evaluation values) and a representative value.
  • the inspection unit F12 compares the target information with the reference information (S12c). In the comparison between the target information and the reference information, the inspection unit F12 obtains the difference in the feature amount of the shape information for each parameter (here, brightness, saturation, hue).
  • the difference in the feature amount includes the difference in the evaluation vector and the difference in the representative value.
  • the determination value (first determination value) of the evaluation vector and the determination value (second determination value) of the representative value are obtained from the difference in the feature amount of the shape information.
  • the inspection unit F12 determines whether the surface condition of the object 100 passes or fails based on the result of comparison between the target information and the reference information (S12d). In the present embodiment, the inspection unit F12 judges the surface condition of the object 100 to pass when the first determination value E1 is less than the first specified value and the second determination value E2 is less than the second specified value. On the other hand, the inspection unit F12 judges the surface condition of the object 100 to fail when the first determination value E1 is equal to or greater than the first specified value or the second determination value E2 is equal to or greater than the second specified value. Then, in the inspection system 1, the presentation unit F13 outputs the result of the inspection by the inspection unit F12 to the external device through the input/output unit 11 (S13). Further, in the inspection system 1, the control unit F14 outputs control information based on the inspection result to the painting system 40 (S14).
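Steps S12a through S12d can be sketched roughly as follows. The concrete choices here — a normalized 64-bin histogram as the evaluation vector, an L1 sum of element differences for E1, and a weighted mean of bin centers as the representative value whose difference gives E2 — are illustrative assumptions; the description only requires that E1 and E2 be obtained from differences in the feature amounts of the shape information.

```python
import numpy as np

def evaluation_vector(values, num_bins=64, value_range=(0.0, 100.0)):
    """Shape of the frequency distribution of one parameter,
    expressed as the area ratio of each division (bin)."""
    hist, _ = np.histogram(values, bins=num_bins, range=value_range)
    return hist / hist.sum()

def representative_value(vec, value_range=(0.0, 100.0)):
    """Weighted mean of division centers (one possible representative value)."""
    lo, hi = value_range
    centers = lo + (np.arange(len(vec)) + 0.5) * (hi - lo) / len(vec)
    return float(np.dot(vec, centers))

def inspect(target, reference, e1_limit, e2_limit):
    """Compare target and reference shape information per parameter and
    return (pass/fail, E1, E2).  `target` and `reference` map parameter
    names (e.g. lightness, saturation, hue) to arrays of pixel-wise values."""
    e1 = e2 = 0.0
    for param in target:
        w_t = evaluation_vector(target[param])
        w_r = evaluation_vector(reference[param])
        e1 += float(np.abs(w_t - w_r).sum())  # evaluation-vector difference
        e2 += abs(representative_value(w_t) - representative_value(w_r))
    # Pass only when both determination values are below their specified values.
    return (e1 < e1_limit and e2 < e2_limit), e1, e2
```

With identical target and reference pixel values both determination values are zero and the inspection passes; a large shift in, say, lightness drives E2 past its specified value and the inspection fails.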
  • the inspection system 1 described above includes an acquisition unit F11 and an inspection unit F12.
  • the acquisition unit F11 acquires an object image relating to the surface of the object 100.
  • the inspection unit F12 inspects the state of the surface of the object 100 based on the target information including the shape information representing the shape of the frequency distribution of the parameter based on the pixel value obtained from the target image. According to this inspection system 1, the accuracy of inspection of the surface condition of the object 100 can be improved.
  • the inspection system 1 executes the following method (inspection method).
  • the inspection method includes acquisition step S11 and inspection step S12.
  • the acquisition step S11 acquires an object image relating to the surface of the object 100.
  • the inspection step S12 inspects the state of the surface of the object 100 based on the object information including the shape information representing the shape of the frequency distribution of the parameter based on the pixel value obtained from the object image. According to this inspection method, the accuracy of inspection of the surface condition of the object 100 can be improved as in the inspection system 1.
  • the inspection method is realized by executing a program (computer program) by one or more processors.
  • This program is a program for causing one or more processors to execute the above-mentioned inspection method. According to such a program, the accuracy of the inspection of the surface condition of the object 100 can be improved as in the inspection method.
  • the program may then be provided by a storage medium.
  • This storage medium is a non-transitory storage medium that can be read by a computer and stores the above program. According to such a storage medium, the accuracy of the inspection of the surface condition of the object 100 can be improved as in the inspection method.
  • the target information includes shape information corresponding to each of a plurality of parameters, but the target information may include shape information corresponding to a single parameter instead of a plurality of parameters.
  • the feature amount included in the shape information may include at least one of a plurality of evaluation values (evaluation vectors) and representative values.
  • the pixel value of the target image is not limited to the RGB color system.
  • examples of the color system include CIE color systems such as the XYZ color system, the xyY color system, the L*u*v* color system, the L*a*b* color system, and the LCh color system.
  • the imaging system 30 may generate an image in which the pixel value is represented by the XYZ color system instead of an image in which the pixel value is represented by the RGB color system.
  • the color system may be converted by arithmetic processing.
  • an image in which the pixel value is represented by the RGB color system may be converted into an image in which the pixel value is represented by the XYZ color system.
  • a model expression, a look-up table, or the like can be used for the arithmetic processing in this case. Thereby, it is possible to convert the pixel value of the target image into a parameter of a desired color system.
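As a concrete instance of such arithmetic processing, a linear-RGB image can be converted to the XYZ color system with the standard sRGB (D65) conversion matrix; a look-up table could be used instead. This is a minimal sketch, assuming linear RGB values in [0, 1]:

```python
import numpy as np

# Linear-RGB -> CIE XYZ matrix for sRGB primaries and the D65 white point
# (a standard published matrix; this sketch assumes already-linearized RGB).
RGB_TO_XYZ = np.array([
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
])

def rgb_to_xyz(image):
    """Convert an (H, W, 3) linear-RGB image with values in [0, 1] to XYZ."""
    return np.asarray(image) @ RGB_TO_XYZ.T
```

For reference, pure white (1, 1, 1) maps to approximately (0.9505, 1.0000, 1.0889), the D65 white point.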
  • the inspection unit F12 does not need to use all of the shape information of a plurality of parameters obtained from the pixel values of the target image for inspection.
  • the parameters are lightness, saturation, and hue.
  • the hue can change significantly with a slight difference in color.
  • the use of hue in the inspection may reduce the accuracy of the inspection. Therefore, when the surface of the object 100 is achromatic, it is better not to use the shape information regarding the hue.
  • certain parameters of the plurality of parameters may not be valid for inspection.
  • the inspection unit F12 does not use the shape information regarding the second parameter for the inspection when the first parameter satisfies a predetermined condition.
  • a predetermined condition is that the saturation is equal to or less than the threshold value.
  • the threshold value can be determined based on whether or not the surface of the object 100 can be determined to be achromatic if the saturation is equal to or less than the threshold value.
  • the combination of the first parameter and the second parameter is not limited to saturation and hue, and may be other parameters.
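The conditional exclusion described above (do not use the second parameter's shape information when the first parameter satisfies a predetermined condition) can be sketched as follows. The threshold value is a placeholder assumption, since the description only says it should distinguish achromatic surfaces:

```python
def parameters_to_use(mean_saturation, saturation_threshold=10.0):
    """Select which parameters enter the inspection.

    When the first parameter (saturation) is at or below the threshold, the
    surface is treated as achromatic and the shape information of the second
    parameter (hue) is excluded, because hue can change significantly for
    small color differences near the achromatic axis.  The threshold of 10.0
    is a placeholder, not a value given in the description.
    """
    params = ["lightness", "saturation"]
    if mean_saturation > saturation_threshold:
        params.append("hue")
    return params
```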
  • weighted values may be set individually for a plurality of parameters. That is, the inspection unit F12 may weight a plurality of parameters.
  • the hue can change significantly with a slight difference in color.
  • the use of hue in the inspection may reduce the accuracy of the inspection.
  • the closer the surface of the object 100 is to an achromatic color, the smaller the weight value for the hue may be made (lightening the weighting), thereby reducing the influence of the hue on the inspection result. For example, as shown in FIG., the inspection unit F12 may set a weight value in the evaluation vector w11 to generate an evaluation vector w12 in which the influence on the inspection is suppressed, and perform the inspection based on the evaluation vector w12. In this way, the inspection unit F12 may individually change the weight value for a plurality of parameters according to the surface of the object 100.
  • weighted values may also be set individually for a plurality of evaluation values. That is, the inspection unit F12 may weight each element of the evaluation vector corresponding to each parameter. For example, regarding the state of the surface of the object 100, there are cases where it is desired to perform an inspection paying attention to a feeling of brilliance. At this time, if the area of the high-brightness region in the target image is small, the desired inspection accuracy may not be obtained with respect to the brilliance. In such a case, the inspection unit F12 may set a weight value that increases as the brightness increases. For example, as shown in FIG., the inspection unit F12 may set a weight value for each element of the evaluation vector w11 to generate an evaluation vector w13 in which a region having a large brightness is emphasized, and perform the inspection based on this evaluation vector w13. In this way, the inspection unit F12 may individually change the weight value for the plurality of evaluation values according to the surface of the object 100. Further, regarding the feeling of brilliance, the result of the inspection by the inspection unit F12 may be a pass even though the result would not be a pass when visually observed by a person. That is, the result of the inspection by the inspection unit F12 may not match the way a person feels. In such a case, it is conceivable to adopt, as a weight value, a psychophysical quantity that reflects the psychological quantity according to the way a person feels. This makes it possible to reflect the psychological quantity of a person in the inspection by the inspection unit F12.
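Per-element weighting of an evaluation vector (the w11 to w13 example above) can be sketched as below. The particular evaluation vector and the weight profile, increasing linearly with lightness, are assumptions for illustration:

```python
import numpy as np

def weight_evaluation_vector(vec, weights):
    """Apply per-element weight values to an evaluation vector and
    renormalize so the elements again sum to 1."""
    out = np.asarray(vec, dtype=float) * np.asarray(weights, dtype=float)
    return out / out.sum()

# Illustrative evaluation vector w11 and a weight profile that grows with
# lightness, emphasizing the bright (brilliance-related) divisions -> w13.
w11 = np.array([0.05, 0.10, 0.20, 0.30, 0.20, 0.10, 0.05])
bright_emphasis = np.linspace(0.2, 2.0, len(w11))
w13 = weight_evaluation_vector(w11, bright_emphasis)
```

After weighting, the bright-side elements of w13 carry a larger share of the total than in w11, so differences in that region contribute more to the inspection result.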
  • the target information may be obtained from a deviation image obtained from the target image.
  • the deviation image is an image having a value obtained by subtracting a reference value of the pixel value of the target image from the pixel value of the target image as a pixel value.
  • the reference value can be selected from the average value, the mode value, the median value, and the representative value of the frequency distribution obtained from the target image.
  • Such a deviation image is effective for inspecting the texture of the surface of the object 100. That is, when inspecting the texture of the surface of the object 100 as the state of the surface of the object 100, the inspection unit F12 generates a deviation image from the target image and generates the target information based on the deviation image. In this case, a deviation image is generated by the same method for the reference object to be compared with the object 100. By generating the target information using the deviation image in this way, it is possible to improve the accuracy of the inspection of the texture of the surface of the object 100.
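A minimal sketch of the deviation image, here using the mean of the target image's pixel values as the reference value (the mode, median, or a representative value of the frequency distribution could be substituted, as stated above):

```python
import numpy as np

def deviation_image(target, reference_value=None):
    """Deviation image: the target image's pixel values minus a reference
    value of those pixel values (here the mean by default; the mode,
    median, or representative value of the frequency distribution could
    be used instead)."""
    target = np.asarray(target, dtype=float)
    if reference_value is None:
        reference_value = target.mean()
    return target - reference_value
```

The deviation image's frequency distribution is centered near zero, so its shape reflects texture variation rather than the overall brightness level.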
  • the acquisition unit F11 may acquire a plurality of target images obtained by imaging the surface of the object 100 under different imaging conditions. That is, the acquisition step S11 may be a step of acquiring a plurality of target images obtained by imaging the surface of the object 100 under different imaging conditions.
  • imaging conditions include imaging range, imaging direction, resolution, and the like.
  • the imaging system 30 may use a plurality of cameras 31 and 32 having different imaging ranges and resolutions, as shown in FIG.
  • FIG. 15 is an explanatory diagram of a method of acquiring a target image in an inspection method of a modified example. In FIG. 15, the camera 31 can capture a wider range than the camera 32, but the camera 32 has a higher resolution than the camera 31.
  • the camera 32 is arranged closer to the surface of the object 100 than the camera 31.
  • the camera 31 outputs an image of the first imaging range 111 of the surface 110 of the object 100 as the target image P11 (see FIG. 16).
  • the camera 32 outputs an image of the second imaging range 112 of the surface 110 of the object 100 as the object image P12 (see FIG. 17).
  • the second imaging range 112 is narrower than the first imaging range 111.
  • the second imaging range 112 need not be included in the first imaging range 111.
  • the camera 31 is a macro camera suitable for inspecting the surface 110 of the object 100 from a more macroscopic viewpoint than the camera 32 (for example, inspection related to gradation).
  • the camera 32 is a micro camera suitable for inspecting the surface 110 of the object 100 from a more microscopic viewpoint than the camera 31 (for example, inspection related to graininess).
  • FIG. 18 shows a histogram H11 corresponding to the target image P11
  • FIG. 19 shows a histogram H12 corresponding to the target image P12.
  • the inspection unit F12 obtains the target information for each of the target image P11 and the target image P12, and compares the target information with the corresponding reference information. Thereby, the accuracy of the inspection of the state of the surface 110 of the object 100 can be improved.
  • the inspection unit F12 may inspect the uneven state of the surface of the object 100 as the state of the surface of the object 100.
  • FIG. 20 is an explanatory diagram of an example of an image of the surface of the object 100 in this modified example.
  • the acquisition unit F11 acquires a plurality of target images obtained by imaging the surface of the object 100 under different imaging conditions. Examples of imaging conditions include imaging range, imaging direction, resolution, and the like.
  • the imaging system 30 may use a plurality of cameras 33, 34 having different imaging directions, as shown in FIG. In FIG.
  • the surface 110 of the object 100 is illuminated by the lighting system 20.
  • a part of the light 21 from the lighting system 20 becomes the light 22 specularly reflected on the surface 110, and the rest of the light 21 from the lighting system 20 is diffusely reflected.
  • the camera 33 is arranged at a position where the light diffusely reflected by the surface 110 of the object 100 is received.
  • the object image generated by the camera 33 reflects the diffuse reflection component on the surface 110 of the object 100.
  • the camera 34 is arranged at a position where it receives the light specularly reflected on the surface 110 of the object 100.
  • the specular reflection component on the surface 110 of the object 100 is well reflected in the object image generated by the camera 34.
  • the shape of the surface 110 of the object 100 is reflected more strongly in the specular reflection component than in the diffuse reflection component, while the color itself of the surface 110 of the object 100 is reflected more strongly in the diffuse reflection component than in the specular reflection component.
  • the state of the surface 110 of the object 100 can be inspected by taking into account not only the color of the surface of the object 100 itself but also the shape of the surface of the object 100.
  • the accuracy of inspection of the surface condition of the object 100 can be improved.
  • the surface 110 of the object 100 may be imaged from different positions with the same camera.
  • the specular reflection portion 130 may be imaged by a macro camera and a micro camera to obtain object images. Further, a target image obtained by using high dynamic range imaging may be used. Since the target image obtained by high dynamic range composition can handle a wider dynamic range than usual, both a component with a relatively small amount of light, such as the diffuse reflection component, and a component with a relatively large amount of light, such as the specular reflection component, can be captured in a single image.
  • the division Di when obtaining a plurality of evaluation values does not necessarily have to be the same width.
  • the plurality of divisions D1 to D7 have the same width.
  • the plurality of divisions Di may have different widths.
  • when the width of the division Di is narrowed, the resolution of the evaluation vector is improved, and the accuracy of the inspection can be improved.
  • FIG. 22 shows a histogram H20 obtained from a target image containing a large amount of specular reflection components. The peak of the histogram H20 is biased toward the bright side (right side).
  • a plurality of divisions Di (D21 to D30) are set for the histogram H20.
  • the widths of the plurality of divisions D21 to D30 are narrowed in this order.
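Divisions Di of unequal width, narrowing in order toward the bright side as in the D21 to D30 example, can be produced with geometrically shrinking bin widths. The shrink ratio here is an arbitrary illustrative choice, not a value from the description:

```python
import numpy as np

def uneven_edges(lo=0.0, hi=100.0, num_bins=10, ratio=0.7):
    """Bin edges whose widths shrink geometrically from the dark side to
    the bright side, giving finer resolution where the histogram peak is
    biased (as with H20).  The shrink ratio 0.7 is an arbitrary choice."""
    widths = ratio ** np.arange(num_bins)
    widths = widths / widths.sum() * (hi - lo)
    edges = lo + np.concatenate(([0.0], np.cumsum(widths)))
    edges[-1] = hi  # guard against floating-point drift
    return edges

edges = uneven_edges()
values = np.random.default_rng(1).uniform(0.0, 100.0, 1000)
hist, _ = np.histogram(values, bins=edges)
```

`np.histogram` accepts such a monotonically increasing edge array directly, so the rest of the evaluation-vector computation is unchanged.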
  • the inspection unit F12 may inspect the surface mapping property of the object 100 as the state of the surface of the object 100.
  • the mapability is an index of the sharpness of an image (reflection image) of an object reflected on the surface of the object 100. When the mapability is poor, the reflected image is unclear; when the mapability is good, the reflected image is clear.
  • the mapability is affected by the surface roughness and shape of the object 100.
  • the inspection image 200 is used.
  • a chart to which a predetermined test pattern is applied is used for the inspection image 200. Light 23 is applied to the inspection image 200 from the lighting system 20 so that the reflection image 210 of the inspection image 200 is projected on the surface 110 of the object 100.
  • the imaging system 30 is arranged at a position where the light 24 from the reflected image 210 is received.
  • the imaging system 30 captures an image of the reflection image 210 reflected on the surface 110 of the object 100 and outputs it as the target image.
  • the target image is an image of a reflection image of the inspection image 200 by the surface 110 of the object 100.
  • the inspection unit F12 may generate target information from such a target image, compare the target information with the corresponding reference information, and inspect the state (mapping property) of the surface 110 of the target object 100.
  • a self-luminous display displaying a predetermined test pattern may be used. In this case, it is not necessary to use the lighting system 20 to project the test pattern on the surface of the object 100.
  • the presentation unit F13 may present difference information that visually indicates the difference between the target information and the reference information based on the result of the inspection by the inspection unit F12.
  • FIG. 24 shows an example of difference information.
  • the difference information is a graph showing the evaluation vector wT1 and the representative value CT1 obtained from the target information and the evaluation vector wR1 and the representative value CR1 obtained from the reference information.
  • the difference information includes an upper limit value Th1 and a lower limit value Th2 in the allowable range of the representative value CT1 of the target information. The permissible range is set around the representative value CR1 obtained from the reference information.
  • FIG. 25 shows another example of the difference information.
  • in the difference information, the difference between the evaluation vectors of the target information and the reference information is emphasized. More specifically, the difference information is a graph showing the evaluation vector wT1 obtained from the target information and the evaluation vector wR1 obtained from the reference information in FIG. 25. Further, in FIG. 25, the regions R11 and R12 that are the differences between the evaluation vector wT1 of the target information and the evaluation vector wR1 of the reference information are emphasized and displayed. Therefore, according to the difference information of FIG. 25, the user can visually grasp the difference between the evaluation vector wT1 of the target information and the evaluation vector wR1 of the reference information.
  • FIG. 26 shows another example of the difference information.
  • FIG. 26 shows the difference vector of the evaluation vectors of the target information and the reference information as the difference information. More specifically, the difference information in FIG. 26 displays the regions R11 and R12 that are the differences between the evaluation vector wT1 of the target information and the evaluation vector wR1 of the reference information in FIG. 25. Therefore, according to the difference information of FIG. 26, the user can visually grasp the difference between the evaluation vector wT1 of the target information and the evaluation vector wR1 of the reference information, and can determine whether or not the evaluation vector wT1 of the target information is within the allowable range with respect to the evaluation vector wR1 of the reference information.
  • the presentation unit F13 may present an image of the object 100 that reflects the difference between the target information and the reference information as a result of the inspection by the inspection unit F12 on the surface of the object 100. Further, the presentation unit F13 may make a presentation that clearly distinguishes the portion determined to be acceptable by the inspection unit F12 from the portion determined to be unacceptable. For example, in the image P30 of the object 100 shown in FIG. 27, the surface of the object 100 is classified into four regions R31, R32, R33, and R34. The four regions R31, R32, R33, and R34 indicate that the difference between the representative values of the target information and the reference information increases in this order. The region R34 is a portion of the surface of the object 100 that is determined to be unacceptable by the inspection unit F12.
  • the surface of the object 100 is classified into four regions R41, R42, R43, and R44.
  • the four regions R41, R42, R43, and R44 indicate that the difference between the evaluation vectors of the target information and the reference information increases in this order.
  • the region R44 is a portion of the surface of the object 100 that is determined to be unacceptable by the inspection unit F12.
  • the input / output unit 11 may include an image display device.
  • the presentation unit F13 may display the result of the inspection by the inspection unit F12 on the image display device of the input / output unit 11.
  • the input / output unit 11 may include an audio output device.
  • the presentation unit F13 may output the result of the inspection by the inspection unit F12 from the audio output device of the input / output unit 11.
  • the wavelength of the light emitted by the lighting system 20 may be changeable. This can be achieved by using light sources having different emission colors or color filters. Thus, in the inspection system 1, at least one of the wavelength of the light emitted by the lighting system 20 and the wavelength of the light detected by the imaging system 30 may be changeable.
  • the inspection system 1 may be composed of a plurality of computers.
  • the functions of the inspection system 1 (determination system 10) may be distributed to a plurality of devices.
  • at least a part of the functions of the inspection system 1 (determination system 10) may be realized by, for example, the cloud (cloud computing).
  • the execution subject of the inspection system 1 includes a computer system.
  • a computer system has a processor and memory as hardware.
  • when the processor executes the program recorded in the memory of the computer system, the function as the execution subject of the inspection system 1 (determination system 10) in the present disclosure is realized.
  • the program may be pre-recorded in the memory of the computer system or may be provided through a telecommunication line. Further, the program may be provided by being recorded on a computer-readable non-transitory recording medium such as a memory card, an optical disc, or a hard disk drive.
  • a processor in a computer system is composed of one or more electronic circuits including a semiconductor integrated circuit (IC) or a large scale integrated circuit (LSI).
  • Logical devices can be used for the same purpose.
  • a plurality of electronic circuits may be integrated on one chip, or may be distributed on a plurality of chips. The plurality of chips may be integrated in one device, or may be distributed in a plurality of devices.
  • the first aspect is an inspection method, which includes an acquisition step (S11) and an inspection step (S12).
  • the acquisition step (S11) is a step of acquiring an object image relating to the surface of the object (100).
  • the inspection step (S12) is a step of inspecting the surface condition of the object (100) based on the object information including the shape information representing the shape of the frequency distribution of the parameter based on the pixel value obtained from the object image. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the second aspect is an inspection method based on the first aspect.
  • the shape information includes a feature quantity representing a feature of the shape of the frequency distribution. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the third aspect is an inspection method based on the second aspect.
  • the feature quantity includes a plurality of evaluation values based on the plurality of divisions (Di) of the parameters of the shape of the frequency distribution and the area (Si) of each. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the fourth aspect is an inspection method based on the third aspect.
  • weighted values are individually set for a plurality of evaluation values. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the fifth aspect is an inspection method based on the second aspect.
  • the feature quantity includes a representative value of the frequency distribution. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the sixth aspect is an inspection method based on the fifth aspect.
  • the representative value is a value obtained from the ratio (wi) of each of the plurality of divisions (Di) of the parameter of the shape of the frequency distribution to the entire shape of the frequency distribution, and the parameter values (Li) corresponding to the plurality of divisions (Di). According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
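One natural realization of this sixth aspect is the weighted mean C = Σi wi·Li. The description states only that the representative value is obtained from wi and Li, so treating it as the weighted mean is an assumption of this sketch:

```python
import numpy as np

def representative_value(ratios, division_values):
    """C = sum_i w_i * L_i, with w_i the area ratio of division D_i relative
    to the entire frequency distribution and L_i the parameter value (for
    example the division center) corresponding to D_i.  The ratios are
    normalized here so that they sum to 1."""
    w = np.asarray(ratios, dtype=float)
    return float(np.dot(w / w.sum(), np.asarray(division_values, dtype=float)))
```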
  • the seventh aspect is an inspection method based on any one of the first to sixth aspects.
  • the parameter is a color parameter. According to this aspect, the accuracy of the color inspection can be improved as the surface condition of the object (100).
  • the eighth aspect is an inspection method based on any one of the first to seventh aspects.
  • the target information includes shape information for each of the plurality of parameters. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the ninth aspect is an inspection method based on the eighth aspect.
  • weighted values are individually set for the plurality of parameters. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the tenth aspect is an inspection method based on the eighth or ninth aspect.
  • the plurality of parameters include a first parameter and a second parameter.
  • in the inspection step (S12), when the first parameter satisfies a predetermined condition, the shape information regarding the second parameter is not used for the inspection. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the eleventh aspect is an inspection method based on the tenth aspect.
  • the first parameter is saturation.
  • the second parameter is hue. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the twelfth aspect is an inspection method based on any one of the first to eleventh aspects.
  • the inspection step (S12) compares the target information with the reference information including the shape information obtained from the reference image regarding the surface of the reference object as the reference of the object (100). According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the thirteenth aspect is an inspection method based on any one of the first to twelfth aspects.
  • the target information is obtained from a deviation image obtained from the target image.
  • the deviation image is an image having a value obtained by subtracting a reference value of the pixel value of the target image from the pixel value of the target image as a pixel value. According to this aspect, it is possible to inspect the texture as the state of the surface of the object (100).
  • the 14th aspect is an inspection method based on any one of the 1st to 13th aspects.
  • the acquisition step (S11) acquires a plurality of target images obtained by imaging the surface of the object (100) under different imaging conditions. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the fifteenth aspect is an inspection method based on any one of the first to fourteenth aspects.
  • the target image is an image of a reflection image (210) of the inspection image (200) by the surface of the object (100).
  • according to this aspect, the mapability can be inspected as the state of the surface of the object (100).
  • the sixteenth aspect is an inspection method based on any one of the first to fifteenth aspects.
  • the inspection method further includes a presentation step (S13) for making a presentation based on the result of the inspection in the inspection step (S12). According to this aspect, the result of the inspection of the object (100) can be presented.
  • the 17th aspect is an inspection method based on any one of the 1st to 16th aspects.
  • the inspection method further includes a control step (S14) of outputting, to a coating system (40) that paints the surface of the object (100), control information based on the result of the inspection in the inspection step (S12). According to this aspect, it is possible to improve the quality of the coating on the surface of the object (100).
  • the eighteenth aspect is an inspection method based on any one of the first to seventeenth aspects.
  • the shape of the frequency distribution is the shape of a histogram. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the nineteenth aspect is a program for causing one or more processors to execute the inspection method of any one of the first to eighteenth aspects. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the twentieth aspect is an inspection system (1), which includes an acquisition unit (F11) and an inspection unit (F12).
  • the acquisition unit (F11) acquires an object image on the surface of the object (100).
  • the inspection unit (F12) inspects the surface condition of the object (100) based on the object information including the shape information representing the shape of the frequency distribution of the parameter based on the pixel value obtained from the object image. According to this aspect, the accuracy of the inspection of the surface condition of the object (100) can be improved.
  • the second to eighteenth aspects can be appropriately modified and applied to the nineteenth and twentieth aspects.
  • the accuracy of inspection of the surface condition of the object can be improved. Therefore, the invention according to the present disclosure is industrially useful.


Abstract

Provided are an inspection method, program, and inspection system that make it possible to enhance the accuracy of inspection of the state of the surface of a subject. This inspection method comprises an acquisition step (S11) and an inspection step (S12). The acquisition step (S11) is a step for acquiring a subject image relating to the surface of a subject. In the inspection step (S12), the state of the surface of the subject is inspected on the basis of subject information that has been obtained from the subject image and includes shape information representing the shape of the frequency distribution of a pixel-value-based parameter.

Description

Inspection method, program, and inspection system
The present disclosure relates to inspection methods, programs, and inspection systems. In particular, the present disclosure relates to inspection methods, programs, and inspection systems for inspecting the surface condition of an object by means of images.
Patent Document 1 discloses a coloring inspection apparatus. The coloring inspection apparatus disclosed in Patent Document 1 includes a camera having three spectral sensitivities obtained by a linear transformation equivalent to the CIEXYZ color matching functions, an arithmetic processing device that acquires and processes coloring data obtained by converting the three-spectral-sensitivity images captured by the camera into tristimulus values of the CIEXYZ color system, and an illumination unit that illuminates an automobile, which is an example of a measurement object. The coloring inspection apparatus inspects color by calculating a color distribution matching index indicating the overlap ratio of the xyz chromaticity histogram distributions of an inspection object and a reference object.
Patent Document 1: Japanese Unexamined Patent Publication No. 2015-155892
The apparent color of an inspection object (measurement object, object) can be affected by the shape of its surface. Therefore, even when the color distribution matching index obtained in Patent Document 1, which indicates the overlap ratio of the two xyz chromaticity histogram distributions of the inspection object and the reference object, is high, the appearance perceived by a person may still differ.
An object of the present disclosure is to provide an inspection method, a program, and an inspection system capable of improving the accuracy of inspection of the surface condition of an object.
An inspection method according to one aspect of the present disclosure includes an acquisition step and an inspection step. The acquisition step is a step of acquiring a target image relating to the surface of an object. The inspection step is a step of inspecting the state of the surface of the object based on target information that is obtained from the target image and includes shape information representing the shape of a frequency distribution of a parameter based on pixel values.
A program according to another aspect of the present disclosure is a program for causing one or more processors to execute the inspection method of the present disclosure.
An inspection system according to another aspect of the present disclosure includes an acquisition unit that acquires a target image of the surface of an object, and an inspection unit that inspects the state of the surface of the object based on target information that is obtained from the target image and includes shape information representing the shape of a frequency distribution of a parameter based on pixel values.
According to the aspects of the present disclosure, the accuracy of inspection of the surface condition of an object can be improved.
FIG. 1 is a flowchart of an inspection method of one embodiment.
FIG. 2 is a block diagram of an inspection system that implements the inspection method.
FIG. 3 is an explanatory diagram of a method for generating the shape information used in the inspection method.
FIG. 4 is an explanatory diagram of an example of shape information obtained by the inspection system.
FIG. 5 is an explanatory diagram of another example of shape information obtained by the inspection system.
FIG. 6 is an explanatory diagram of another example of shape information obtained by the inspection system.
FIG. 7 is an explanatory diagram of an example of target information and reference information obtained by the inspection system.
FIG. 8 is an explanatory diagram of another example of target information and reference information obtained by the inspection system.
FIG. 9 is an explanatory diagram of another example of target information and reference information obtained by the inspection system.
FIG. 10 is a graph showing the matching rates, with respect to a reference object, of a plurality of samples prepared as objects.
FIG. 11 is a graph showing the results of visual evaluation of a plurality of samples prepared as objects.
FIG. 12 is a graph showing the relationship between the matching rate and the visual evaluation results for a plurality of samples prepared as objects.
FIG. 13 is an explanatory diagram of a method of generating shape information in one modification.
FIG. 14 is an explanatory diagram of a method of generating shape information in one modification.
FIG. 15 is an explanatory diagram of a method of acquiring a target image in an inspection method of a modification.
FIG. 16 is an explanatory diagram of an example of a target image.
FIG. 17 is an explanatory diagram of an example of a target image.
FIG. 18 is an explanatory diagram of an example of a histogram obtained from a target image.
FIG. 19 is an explanatory diagram of an example of a histogram obtained from a target image.
FIG. 20 is an explanatory diagram of an example of an image of the surface of an object.
FIG. 21 is an explanatory diagram of a method of acquiring a target image in an inspection method of a modification.
FIG. 22 is an explanatory diagram of a method of generating shape information in a modification.
FIG. 23 is an explanatory diagram of a method of acquiring a target image in an inspection method of a modification.
FIG. 24 is a diagram showing an example of inspection results presented by the inspection system.
FIG. 25 is a diagram showing another example of inspection results presented by the inspection system.
FIG. 26 is a diagram showing another example of inspection results presented by the inspection system.
FIG. 27 is a diagram showing another example of inspection results presented by the inspection system.
FIG. 28 is a diagram showing another example of inspection results presented by the inspection system.
 (1) Embodiment
 (1.1) Overview
 The inspection method of one embodiment can be used for inspecting the surface condition of an object 100 as shown in FIG. 2. The surface condition of the object 100 may include the color, shape, texture, glossiness, gradation, and coating condition of the surface of the object 100. In the present embodiment, the gradation may include not only gradation that arises from the dependence on the incident angle and reflection angle when the surface is curved, but also flip-flop properties, color travel effects, brilliance, and graininess. In the present embodiment, the object 100 is assumed to be an automobile. In particular, the surface of the object 100 is a part of the outer surface of the vehicle body. The object 100 is not limited to an automobile. For example, the object 100 may be a moving body other than an automobile, or may not be a moving body at all. Examples of moving bodies include motorcycles, trains, drones, aircraft, construction machinery, and ships. The object 100 may also be an electric appliance, tableware, a container, furniture, clothing, a building material, or the like. In short, the object 100 may be any object having a surface.
 FIG. 1 shows a flowchart of the inspection method of the present embodiment. The inspection method of the present embodiment includes an acquisition step S11 and an inspection step S12. The acquisition step S11 is a step of acquiring a target image relating to the surface of the object 100. The inspection step S12 inspects the state of the surface of the object 100 based on target information that is obtained from the target image and includes shape information representing the shape of a frequency distribution of a parameter based on pixel values.
 In the inspection method of the present embodiment, the state of the surface of the object 100 is inspected using shape information that represents the shape of the frequency distribution of a parameter based on the pixel values obtained from the target image of the surface of the object 100. That is, the inspection method of the present embodiment uses the shape of the frequency distribution itself. As a result, according to the inspection method of the present embodiment, the accuracy of inspection of the surface condition of the object 100 can be improved.
 (1.2) Details
 Hereinafter, the inspection method of the present embodiment will be described in more detail with reference to FIGS. 1 to 12. The inspection method of the present embodiment can be executed by the inspection system 1 shown in FIG. 2. The inspection system 1 is a system for inspecting the surface of the object 100. The inspection system 1 functions as a coloring inspection apparatus. The inspection system 1 can also paint the object 100. The inspection system 1 can paint the object 100 according to the result of the inspection, whereby an object 100 with the desired coating is obtained.
 As shown in FIG. 2, the inspection system 1 includes a determination system 10, a lighting system 20, an imaging system 30, and a painting system 40.
 The lighting system 20 is a system for irradiating the surface of the object 100 with light. The lighting system 20 includes one or more lamps that illuminate the object 100. The lamp is, for example, an LED (Light Emitting Diode) lamp that emits white light. In the lighting system 20, the number of lamps is not particularly limited, and the lamps may use light sources other than LEDs. The emission color of the lamps is not limited to white; it can be set appropriately in consideration of the color of the object 100 and the colors detectable by the imaging system 30. Further, the wavelength of the light emitted by the lighting system 20 may be changeable.
 The imaging system 30 is a system for generating an image (digital image) of the surface of the object 100. In the present embodiment, the imaging system 30 images the surface of the object 100 illuminated by the lighting system 20 to generate an image of that surface. The imaging system 30 includes one or more cameras, each of which includes one or more image sensors. A camera may instead include one or more line sensors.
 The painting system 40 is a system for painting the surface of the object 100. The painting system 40 includes one or more painting units (painting robots). Since the painting robot may have a conventionally well-known configuration, a detailed description is omitted.
 As shown in FIG. 2, the determination system 10 includes an input/output unit 11, a storage unit 12, and a processing unit 13. The determination system 10 can be realized by a computer system. The computer system may include one or more processors, one or more connectors, one or more communication devices, one or more memories, and the like.
 The input/output unit 11 inputs and outputs information to and from the lighting system 20, the imaging system 30, and the painting system 40. In the present embodiment, the input/output unit 11 is communicably connected to the lighting system 20, the imaging system 30, and the painting system 40. The input/output unit 11 includes one or more input/output devices and uses one or more input/output interfaces.
 The storage unit 12 is used to store the information used by the processing unit 13. The storage unit 12 includes one or more storage devices. The storage device is, for example, a RAM (Random Access Memory) or an EEPROM (Electrically Erasable Programmable Read Only Memory).
 The processing unit 13 can be realized by, for example, one or more processors (microprocessors). That is, the one or more processors function as the processing unit 13 by executing one or more programs (computer programs) stored in one or more memories. The one or more programs may be recorded in advance in the one or more memories, or may be provided through a telecommunication line such as the Internet, or recorded on a non-transitory recording medium such as a memory card.
 As shown in FIG. 2, the processing unit 13 has an acquisition unit F11, an inspection unit F12, a presentation unit F13, and a control unit F14. The acquisition unit F11, the inspection unit F12, the presentation unit F13, and the control unit F14 do not represent physical components, but rather functions realized by the processing unit 13.
 The acquisition unit F11 executes the acquisition step S11 (see FIG. 1) of acquiring a target image relating to the surface of the object 100. In the present embodiment, the target image is an image obtained by imaging, with the imaging system 30, the surface of the object 100 illuminated by the lighting system 20. Here, the imaging system 30 images a part of the surface of the object 100 rather than the entire surface. Therefore, a plurality of target images, each capturing a different portion of the surface of the object 100, are acquired from the object 100. That is, the inspection is performed for each portion of the surface of the object 100. In the present embodiment, the acquisition unit F11 acquires the image of the surface of the object 100 from the imaging system 30; that is, the acquisition unit F11 receives the image from the imaging system 30 via the input/output unit 11. The image acquired by the acquisition unit F11 is determined by the imaging conditions of the imaging system 30. The imaging conditions may include the relative positional relationship among the object 100, the lighting system 20, and the imaging system 30 (that is, information on the positional relationship among the imaging target, the lighting, and the camera). In the present embodiment, a pixel value of the target image may include values of three colors: an R value corresponding to red, a G value corresponding to green, and a B value corresponding to blue. Thus, in the present embodiment, pixel values are represented in the RGB color system.
 The inspection unit F12 executes the inspection step S12 (see FIG. 1) of inspecting the surface condition of the object 100. For this inspection, the inspection unit F12 generates target information from the target image (S12a, S12b in FIG. 1). The target information includes shape information representing the shape of a frequency distribution of a parameter based on pixel values. In the present embodiment, the parameters relate to color. In particular, the parameters used are lightness (L), saturation (C), and hue (h), which are parameters of the LCh color system. Therefore, the inspection unit F12 converts the pixel values of the target image from the RGB color system to the LCh color system. The inspection unit F12 then obtains the frequency distribution of each pixel-value-based parameter (lightness, saturation, hue). Thus, in the present embodiment, the target information includes shape information for each of the plurality of parameters (lightness, saturation, and hue).
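The conversion from RGB pixel values to the LCh parameters described above can be sketched as follows. This is a minimal illustration only: it assumes sRGB input in [0, 1] with a D65 white point and uses the conventional sRGB-to-CIEXYZ matrix and CIELAB formulas, none of which are specified by the disclosure itself.

```python
import numpy as np

def rgb_to_lch(rgb):
    """Convert an (N, 3) array of sRGB values in [0, 1] to LCh
    (lightness, chroma, hue in degrees) via linear RGB -> XYZ -> CIELAB.
    Assumes sRGB primaries and a D65 reference white."""
    rgb = np.asarray(rgb, dtype=float)
    # sRGB gamma expansion to linear RGB
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # linear sRGB -> CIEXYZ (D65)
    m = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = lin @ m.T
    # normalize by the D65 reference white
    xyz = xyz / np.array([0.95047, 1.0, 1.08883])
    # CIELAB nonlinearity
    f = np.where(xyz > (6 / 29) ** 3, np.cbrt(xyz),
                 xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[:, 1] - 16
    a = 500 * (f[:, 0] - f[:, 1])
    b = 200 * (f[:, 1] - f[:, 2])
    C = np.hypot(a, b)                      # chroma (saturation axis)
    h = np.degrees(np.arctan2(b, a)) % 360  # hue angle in degrees
    return np.stack([L, C, h], axis=1)
```

Once each pixel is expressed in LCh, the frequency distribution of each parameter over the image can be computed with an ordinary histogram.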
 In the present embodiment, the shape information includes feature quantities representing features of the shape of the frequency distribution. For example, FIG. 3 shows a histogram H10 representing a frequency distribution of lightness. In FIG. 3, the horizontal axis represents the parameter (lightness) and the vertical axis represents the frequency. In the present embodiment, the shape of the frequency distribution is the shape of the histogram representing the frequency distribution. The feature quantities include a plurality of evaluation values and a representative value.
 The plurality of evaluation values are values representing the shape of the histogram of the frequency distribution. In the present embodiment, the plurality of evaluation values are treated as a vector, hereinafter sometimes referred to as the evaluation vector. The evaluation vector is calculated based on the area of each of a plurality of sections into which the parameter of the frequency distribution shape is divided. For example, as shown in FIG. 3, the inspection unit F12 obtains the area Si (S1 to S7) of each of a plurality of sections Di (D1 to D7) of the parameter (lightness) of the histogram H10, where i is an integer of 1 or more. In the present embodiment, the plurality of sections Di have the same width (the difference between the upper and lower limits of a section). Further, the inspection unit F12 obtains the total area S (= ΣSi) of the histogram H10. The inspection unit F12 takes the value obtained by dividing the area Si of a section by the total area S as the evaluation value of that section. That is, each evaluation value corresponds to the ratio of its section to the entire histogram H10. If the evaluation value of section Di is denoted wi, then wi = Si / S. In the example of FIG. 3, since there are seven sections D1 to D7, the evaluation vector of the histogram H10 is (w1, w2, w3, w4, w5, w6, w7).
 The representative value is a representative value of the frequency distribution. In the present embodiment, the representative value corresponds to the value (center value) of the parameter (lightness in FIG. 3) at the center of the frequency distribution. The representative value is obtained from the ratios (the evaluation vector) of the sections of the frequency distribution shape to the whole shape, together with the parameter values corresponding to those sections. For example, in FIG. 3, let Li (L1 to L7) be the parameter value corresponding to section Di (D1 to D7). Here, the value Li corresponding to section Di is a representative value of the parameter range of section Di, and may be any of the mean, mode, median, maximum, or minimum of that range. In this case, if the representative value of the frequency distribution is C, then C = ΣLi * wi, where "*" denotes multiplication.
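The evaluation vector wi = Si / S and the representative value C = ΣLi * wi can be computed from a set of parameter values as in the following sketch. The seven equal-width sections follow the example of FIG. 3, and the midpoint of each section is used as Li; the actual section boundaries and the choice of Li used by the inspection unit F12 are assumptions here, not prescribed values.

```python
import numpy as np

def shape_features(values, n_bins=7, value_range=(0.0, 100.0)):
    """Compute the shape feature quantities of a frequency distribution:
    the evaluation vector w (wi = Si / S, each section's share of the
    total histogram area) and the representative value C = sum(Li * wi),
    using each section's midpoint as Li."""
    counts, edges = np.histogram(values, bins=n_bins, range=value_range)
    total = counts.sum()
    w = counts / total                   # evaluation vector (ratios summing to 1)
    mids = (edges[:-1] + edges[1:]) / 2  # Li: representative parameter value per section
    c = float(np.dot(mids, w))           # representative value C
    return w, c
```

Applied to the lightness, saturation, and hue values of all pixels in the target image, this yields the three (evaluation vector, representative value) pairs that make up the target information.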
 The inspection unit F12 generates shape information for each pixel-value-based parameter. In the present embodiment, the inspection unit F12 generates shape information for each of the lightness, saturation, and hue of the target image. In each of the lightness, saturation, and hue shape information, the feature quantities include an evaluation vector and a representative value. For example, FIGS. 4, 5, and 6 are explanatory diagrams of examples of the shape information of lightness, saturation, and hue, respectively. In FIG. 4, regarding lightness, HT1 is a histogram, wT1 is an evaluation vector, and CT1 is a representative value. In FIG. 5, regarding saturation, HT2 is a histogram, wT2 is an evaluation vector, and CT2 is a representative value. In FIG. 6, regarding hue, HT3 is a histogram, wT3 is an evaluation vector, and CT3 is a representative value. In this way, the feature quantities of the shape information can be said to be a simplified representation of the frequency distribution that preserves its shape and position.
 The inspection unit F12 compares the target information generated in this way with reference information (S12c in FIG. 1). The reference information can be stored in advance in the storage unit 12. The reference information includes shape information obtained from a reference image of the surface of a reference object that serves as the standard for the object 100. The reference information is generated by the same method as the target information. Like the target information, the reference information uses the LCh color system parameters of lightness (L), saturation (C), and hue (h). Therefore, the inspection unit F12 converts the pixel values of the reference image from the RGB color system to the LCh color system as necessary, and obtains the frequency distribution of each pixel-value-based parameter (lightness, saturation, hue). Thus, in the present embodiment, the reference information, like the target information, includes shape information for each of the plurality of parameters (lightness, saturation, and hue), and in each of these the feature quantities include an evaluation vector and a representative value. For example, FIGS. 7, 8, and 9 are explanatory diagrams of examples of target information and reference information regarding lightness, saturation, and hue, respectively. In FIG. 7, regarding lightness, wR1 is an evaluation vector based on the shape of the frequency distribution from the reference image, and CR1 is a representative value of that frequency distribution. In FIG. 8, regarding saturation, wR2 is an evaluation vector based on the shape of the frequency distribution from the reference image, and CR2 is a representative value of that frequency distribution. In FIG. 9, regarding hue, wR3 is an evaluation vector based on the shape of the frequency distribution from the reference image, and CR3 is a representative value of that frequency distribution.
 In comparing the target information with the reference information, the inspection unit F12 obtains, for each parameter, the difference in the feature quantities of the shape information. The difference in the feature quantities includes the difference of the evaluation vectors and the difference of the representative values. In the example shown in FIG. 7, the evaluation vector difference Dw1 is wT1 - wR1, and the representative value difference Dc1 is CT1 - CR1. Table 1 below shows an example of the evaluation vector difference Dw1, and Table 2 below shows an example of the representative value difference Dc1.
[Table 1 (image: JPOXMLDOC01-appb-T000001)]
[Table 2 (image: JPOXMLDOC01-appb-T000002)]
 In the example shown in FIG. 8, the evaluation vector difference Dw2 is wT2 - wR2, and the representative value difference Dc2 is CT2 - CR2. Table 3 below shows an example of the evaluation vector difference Dw2, and Table 4 below shows an example of the representative value difference Dc2.
[Table 3 (image: JPOXMLDOC01-appb-T000003)]
[Table 4 (image: JPOXMLDOC01-appb-T000004)]
 In the example shown in FIG. 9, the evaluation vector difference Dw3 is wT3 - wR3, and the representative value difference Dc3 is CT3 - CR3. Table 5 below shows an example of the evaluation vector difference Dw3, and Table 6 below shows an example of the representative value difference Dc3.
[Table 5]
[Table 6]
 Then, the inspection unit F12 obtains a judgment value for the evaluation vectors (first judgment value) and a judgment value for the representative values (second judgment value) from the per-parameter differences in the feature amounts of the shape information. With the first judgment value denoted E1, E1 = (Dw1^2 + Dw2^2 + Dw3^2)^(1/2). With the second judgment value denoted E2, E2 = (Dc1^2 + Dc2^2 + Dc3^2)^(1/2).
 Then, the inspection unit F12 determines whether the surface state of the object 100 passes or fails based on the result of the comparison between the target information and the reference information (S12d in FIG. 1). In the present embodiment, the inspection unit F12 judges the surface state of the object 100 to pass when the first judgment value E1 is less than a first specified value and the second judgment value E2 is less than a second specified value. Conversely, the inspection unit F12 judges the surface state of the object 100 to fail when the first judgment value E1 is equal to or greater than the first specified value, or the second judgment value E2 is equal to or greater than the second specified value.
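The comparison (S12c) and judgment (S12d) described above can be sketched as follows. This is a minimal illustration, not the implementation of the present disclosure: the function names and the dictionary layout (each parameter mapped to an (evaluation vector, representative value) pair) are assumptions made for the example.

```python
import math

def vector_difference_norm(w_target, w_reference):
    """Magnitude of the difference between two evaluation vectors,
    e.g. Dw1 = wT1 - wR1 in FIG. 7."""
    return math.sqrt(sum((t - r) ** 2 for t, r in zip(w_target, w_reference)))

def judge_surface(target_info, reference_info, limit_e1, limit_e2):
    """Compute E1 = (Dw1^2 + Dw2^2 + Dw3^2)^(1/2) and
    E2 = (Dc1^2 + Dc2^2 + Dc3^2)^(1/2) over the three parameters
    (lightness, saturation, hue), then judge pass (True) / fail (False)
    against the first and second specified values."""
    e1_sq = 0.0
    e2_sq = 0.0
    for param in ("lightness", "saturation", "hue"):
        w_t, c_t = target_info[param]      # evaluation vector, representative value
        w_r, c_r = reference_info[param]
        e1_sq += vector_difference_norm(w_t, w_r) ** 2
        e2_sq += (c_t - c_r) ** 2
    e1 = math.sqrt(e1_sq)
    e2 = math.sqrt(e2_sq)
    return (e1 < limit_e1) and (e2 < limit_e2), e1, e2
```

The surface passes only when both judgment values fall below their respective specified values, matching the pass/fail rule above.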
 As an example, the specified values (the first specified value and the second specified value) may be determined using visual evaluation. How to determine the specified values using visual evaluation is briefly described below. For example, a plurality of samples d1 to dn with different coating conditions are prepared as objects 100, where n is an arbitrary integer of 2 or more. As an example, the coating conditions may be set so that the color becomes lighter in the order of samples d1 to dn.
 Each of the samples d1 to dn is compared with the reference object, and a matching rate is obtained based on the first and second judgment values. FIG. 10 shows the relationship between the samples d1 to dn and the matching rate; in FIG. 10, the matching rate is highest for sample dk. Each of the samples d1 to dn is then visually evaluated by a plurality of people (30 people, as an example) to obtain an accuracy rate. In the visual evaluation, one of the samples d1 to dn is compared with sample dk. The accuracy rate is the ratio of the number of people who selected sample dk to the number of people who performed the visual evaluation. FIG. 11 shows the result of the visual evaluation. A sample with an accuracy rate of 1.0 may clearly be rejected. A sample with an accuracy rate of 0.5 may be passed, since its accuracy rate is the same as chance. For the samples d2 to dk-2 and dk+2 to dn-1, whose accuracy rates lie between 0.5 and 1.0, which samples fall within the permissible range is then determined. Note that k is an integer.
 For example, as shown in FIG. 12, for the samples dk+2 to dn-1, the sample dm (k+2 ≤ m ≤ n-1) at which the gradient (slope) of the matching rate is large is visually evaluated against the passing sample dk+2. Here, if the accuracy rate is 1.0, sample dm is rejected. Next, sample dm-1, which has the next-highest matching rate after sample dm, is visually evaluated against sample dm; if the accuracy rate is 1.0, sample dm-1 is rejected. Then sample dm-2, which has the next-highest matching rate after sample dm-1, is visually evaluated against sample dm-1; if the accuracy rate is 0.5, sample dm-1 is taken as the limit of the permissible range. The visual evaluation is repeated in this way until the accuracy rate reaches 0.5, and the sample at that point is taken as the limit of the permissible range. However, if sample dk+2 is reached before the accuracy rate reaches 0.5, the visual evaluation ends with sample dk+2 as the limit of the permissible range. Note that m is an integer.
 On the other hand, if the accuracy rate in the visual evaluation of sample dm against the passing sample dk+2 is 0.5, sample dm is passed. Next, sample dm+1, which has the next-lowest matching rate after sample dm, is visually evaluated against sample dm; if the accuracy rate is 0.5, sample dm+1 is passed. Then sample dm+2, which has the next-lowest matching rate after sample dm+1, is visually evaluated against sample dm+1; if the accuracy rate is 1.0, sample dm+1 is taken as the limit of the permissible range. The visual evaluation is repeated in this way until the accuracy rate reaches 1.0, and the last sample judged to pass is taken as the limit of the permissible range. However, if sample dn-1 is reached before the accuracy rate reaches 1.0, the visual evaluation ends with sample dn-2, the sample immediately before dn-1, as the limit of the permissible range.
 The limit sample of the permissible range may be determined in the same way for samples d2 to dk-2. The specified values (the first and second specified values) are then determined based on the matching rate of the limit sample selected from samples d2 to dk-2 and the matching rate of the limit sample selected from samples dk+2 to dn-1. For example, the specified values may be set based on the larger, the smaller, or the average of these two matching rates.
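The stopping rule of the visual evaluation described above can be sketched as a short loop. This is one reading of the procedure, under the assumptions that the candidates are ordered from the steepest-gradient sample toward the passing side and that the pairwise accuracy rates have already been collected; the function and argument names are hypothetical.

```python
def permissible_limit(samples, accuracy_vs_next):
    """samples[0] is the steepest-gradient sample dm, and samples runs
    toward the passing sample; accuracy_vs_next[i] is the visual-evaluation
    accuracy rate when samples[i] is compared with the next candidate.
    The limit of the permissible range is the sample just before the
    accuracy rate first drops to 0.5 (chance level)."""
    for i, accuracy in enumerate(accuracy_vs_next):
        if accuracy <= 0.5:
            return samples[max(i - 1, 0)]
    return samples[-1]  # chance level never reached within the candidates
```

With the accuracies of the worked example (1.0 for dm, 1.0 for dm-1, then 0.5), the loop returns dm-1, matching the text.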
 The presentation unit F13 executes a presentation step S13 (see FIG. 1) of making a presentation based on the result of the inspection by the inspection unit F12 (the result of the inspection step). That is, the presentation unit F13 makes a presentation based on the result of the inspection by the inspection unit F12. The presentation based on the inspection result may include presentation of the result of the judgment made by the inspection unit F12 using the result of the comparison between the target information and the reference information; the presentation unit F13 may therefore present the result of that judgment. In the present embodiment, the presentation unit F13 outputs the result of the judgment by the inspection unit F12 to an external device through the input/output unit 11, and the external device may present that result, that is, the result of the inspection by the inspection system 1.
 The control unit F14 executes a control step S14 (see FIG. 1) of outputting control information based on the result of the inspection in the inspection step S12 to a coating system 40 that paints the surface of the object 100. In the present embodiment, when the inspection unit F12 determines that the surface state of the object 100 fails, the control unit F14 outputs control information to the coating system 40 to have the object 100 repainted. As an example, the control information may include data in the format shown in Table 7 below. The control information includes a plurality of items: "model", "camera", "part", "result", "E1", "E2", "color space", and "number of bins". Here, "model" indicates information for identifying the inspection system 1. "Camera" indicates the number of the camera of the imaging system 30 used to generate the target image. "Part" indicates information for identifying the part of the object 100 shown in the target image; here, it indicates the number assigned to the part. "Result" indicates the result of the inspection by the inspection unit F12; here, the result is "NG", indicating that the inspection failed. "E1" indicates the first judgment value corresponding to the target image, and "E2" indicates the second judgment value corresponding to the target image. "Color space" indicates the color system of the frequency distributions of the target image; Table 7 shows that the color space is the LCh color system. "Number of bins" indicates the number of elements of the evaluation vector for each parameter; Table 7 shows that the evaluation vectors for lightness, saturation, and hue each have 64 elements. By providing such control information, which part of the object 100 should be repainted, and how, can be conveyed to the coating system 40.
[Table 7]
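As a sketch of the record format of Table 7, the control information could be carried as a simple key-value structure. Only the field set follows the description above; every concrete value below (the model string, camera and part numbers, and E1/E2 figures) is hypothetical.

```python
# One control-information record sent from the control unit F14
# to the coating system 40 (illustrative values only).
control_info = {
    "model":       "inspection-system-1",  # identifies the inspection system
    "camera":      2,                      # camera number in the imaging system 30
    "part":        7,                      # number assigned to the imaged part
    "result":      "NG",                   # inspection failed -> repaint this part
    "E1":          0.12,                   # first judgment value (evaluation vectors)
    "E2":          0.08,                   # second judgment value (representative values)
    "color_space": "LCh",                  # color system of the frequency distributions
    "bins":        64,                     # elements per evaluation vector
}
```

A record like this tells the coating system both which part needs repainting ("camera"/"part") and how far off it is ("E1"/"E2").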
 In repainting, the control unit F14 controls the coating system 40 based on the result of the comparison between the target information and the reference information obtained from the control information (the per-parameter differences in the feature amounts of the shape information). That is, the control unit F14 controls the coating system 40, which paints the surface of the object 100, based on the result of the inspection by the inspection unit F12. This allows the surface state of the object 100 to be brought closer to the target state. When the inspection unit F12 determines that the surface state of the object 100 passes, the control unit F14 may output control information to the coating system 40 to end the painting of the object 100.
 (1.3) Operation
 Next, the inspection method performed by the inspection system 1 described above is briefly described with reference to the flowchart of FIG. 1. In the inspection system 1, the acquisition unit F11 acquires a target image of the surface of the object 100 from the imaging system 30 (S11). Next, the inspection unit F12 inspects the surface state of the object 100 (S12). Here, the inspection unit F12 obtains, from the target image, a frequency distribution for each parameter based on the pixel values (S12a). In the present embodiment there are three parameters, lightness, saturation, and hue, so a frequency distribution of lightness, a frequency distribution of saturation, and a frequency distribution of hue are obtained. Next, the inspection unit F12 generates target information based on the frequency distributions (S12b). The target information includes shape information regarding lightness, saturation, and hue. The shape information includes feature amounts representing the features of the shapes of the frequency distributions: an evaluation vector (a plurality of evaluation values) and a representative value. Next, the inspection unit F12 compares the target information with the reference information (S12c). In this comparison, the inspection unit F12 obtains, for each parameter (here, lightness, saturation, and hue), the difference in the feature amounts of the shape information; the difference in the feature amounts includes the difference between the evaluation vectors and the difference between the representative values. From these differences, the judgment value for the evaluation vectors (first judgment value) and the judgment value for the representative values (second judgment value) are obtained. The inspection unit F12 then determines whether the surface state of the object 100 passes or fails based on the result of the comparison between the target information and the reference information (S12d). In the present embodiment, the inspection unit F12 judges the surface state of the object 100 to pass when the first judgment value E1 is less than the first specified value and the second judgment value E2 is less than the second specified value, and to fail when E1 is equal to or greater than the first specified value or E2 is equal to or greater than the second specified value. In the inspection system 1, the presentation unit F13 then outputs the result of the inspection by the inspection unit F12 to the external device through the input/output unit 11 (S13). Further, in the inspection system 1, the control unit F14 outputs control information based on the inspection result to the coating system 40 (S14).
 (1.4) Summary
 The inspection system 1 described above includes an acquisition unit F11 and an inspection unit F12. The acquisition unit F11 acquires a target image of the surface of the object 100. The inspection unit F12 inspects the state of the surface of the object 100 based on target information that includes shape information representing the shape of a frequency distribution of a parameter based on the pixel values obtained from the target image. According to the inspection system 1, the accuracy of the inspection of the surface state of the object 100 can be improved.
 In other words, the inspection system 1 can be said to execute the following method (inspection method). The inspection method includes an acquisition step S11 and an inspection step S12. The acquisition step S11 acquires a target image of the surface of the object 100. The inspection step S12 inspects the state of the surface of the object 100 based on target information that includes shape information representing the shape of a frequency distribution of a parameter based on the pixel values obtained from the target image. According to this inspection method, as with the inspection system 1, the accuracy of the inspection of the surface state of the object 100 can be improved.
 The inspection method is realized by one or more processors executing a program (computer program). This program is a program for causing one or more processors to execute the above-described inspection method. According to such a program, as with the inspection method, the accuracy of the inspection of the surface state of the object 100 can be improved. The program may be provided via a storage medium. This storage medium is a non-transitory computer-readable storage medium that stores the above program. According to such a storage medium, as with the inspection method, the accuracy of the inspection of the surface state of the object 100 can be improved.
 (2) Modified Examples
 The embodiment of the present disclosure is not limited to the above embodiment. The above embodiment may be modified in various ways depending on the design and the like, as long as the object of the present disclosure can be achieved. Modified examples of the above embodiment are listed below.
 In one modified example, although the target information includes shape information corresponding to each of a plurality of parameters, the target information may instead include shape information corresponding to a single parameter. Further, the feature amounts included in the shape information need only include at least one of the plurality of evaluation values (evaluation vector) and the representative value.
 In one modified example, the pixel values of the target image are not limited to the RGB color system. Examples of color systems include CIE color systems such as the XYZ color system, the xyY color system, the L*u*v* color system, the L*a*b* color system, and the LCh color system. As an example, the imaging system 30 may generate an image whose pixel values are expressed in the XYZ color system instead of an image whose pixel values are expressed in the RGB color system. Alternatively, the color system may be converted by arithmetic processing; for example, an image whose pixel values are expressed in the RGB color system may be converted into an image whose pixel values are expressed in the XYZ color system. A model formula, a lookup table, or the like can be used for the arithmetic processing in this case. In this way, the pixel values of the target image can be converted into parameters of a desired color system.
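As a concrete instance of such a conversion by arithmetic processing, linear-sRGB pixel values can be mapped to the XYZ color system with the standard sRGB matrix for the D65 white point. This sketch assumes the RGB values are already linear (gamma-decoded) and normalized to the range 0..1; it is one example, not the conversion prescribed by the present disclosure.

```python
def linear_srgb_to_xyz(r, g, b):
    """Convert one linear-sRGB pixel (components in 0..1) to CIE XYZ
    using the D65 sRGB conversion matrix."""
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z
```

Applying this per pixel yields an XYZ image from which the per-parameter frequency distributions can then be built.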
 In one modified example, the inspection unit F12 need not use the shape information of all of the plurality of parameters obtained from the pixel values of the target image for the inspection. For example, suppose the parameters are lightness, saturation, and hue. When the saturation of the surface of the object 100 is low and the surface can be regarded as achromatic, the hue can change significantly with a slight difference in color. In such a case, using the hue in the inspection may reduce the accuracy of the inspection, so when the surface of the object 100 is achromatic, it is better not to use the shape information regarding the hue. Thus, under certain conditions, a particular parameter among the plurality of parameters may not be effective for the inspection. In short, when the plurality of parameters includes a first parameter and a second parameter, the inspection unit F12 may refrain from using the shape information regarding the second parameter for the inspection when the first parameter satisfies a predetermined condition. For example, the first parameter is saturation, the second parameter is hue, and the predetermined condition is that the saturation is equal to or less than a threshold value. Here, the threshold value can be determined based on whether the surface of the object 100 can be judged achromatic when the saturation is equal to or less than the threshold value. The combination of the first parameter and the second parameter is not limited to saturation and hue, and may be other parameters.
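The conditional exclusion described above can be sketched in a few lines. The threshold value and the function name are hypothetical; the rule (drop hue when the first parameter, saturation, satisfies the predetermined condition) follows the text.

```python
ACHROMATIC_THRESHOLD = 5.0  # hypothetical threshold on mean saturation (chroma)

def parameters_for_inspection(mean_saturation):
    """Exclude hue from the inspection when the surface is judged
    achromatic, i.e. when the saturation is at or below the threshold."""
    params = ["lightness", "saturation"]
    if mean_saturation > ACHROMATIC_THRESHOLD:
        params.append("hue")
    return params
```

The inspection unit would then compare shape information only for the parameters returned here.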
 In one modified example, weight values may be set individually for the plurality of parameters; that is, the inspection unit F12 may weight the plurality of parameters. For example, when the saturation of the surface of the object 100 is low and the surface can be regarded as achromatic, the hue can change significantly with a slight difference in color, and using the hue in the inspection may reduce the accuracy of the inspection. In this case, the closer the surface of the object 100 is to achromatic, the smaller the weight value for the hue may be made (the lighter the weighting), thereby reducing the influence of the hue on the inspection result. For example, as shown in FIG. 13, the inspection unit F12 may set weight values on the evaluation vector w11 to generate an evaluation vector w12 whose influence on the inspection is suppressed, and perform the inspection based on the evaluation vector w12. In this way, the inspection unit F12 may individually change the weight values for the plurality of parameters according to the surface of the object 100.
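Per-parameter weighting can be folded directly into the judgment values: each per-parameter difference is scaled by its weight before the root-sum-of-squares, so a near-zero weight (e.g. for hue on a near-achromatic surface) suppresses that parameter's influence. This is a sketch under the assumption that the weights are applied multiplicatively; the function name is hypothetical.

```python
import math

def weighted_judgment_value(differences, weights):
    """Weighted version of E1 (or E2): differences holds the per-parameter
    difference magnitudes (Dw1, Dw2, Dw3 or Dc1, Dc2, Dc3), and weights
    holds the corresponding per-parameter weight values."""
    return math.sqrt(sum((w * d) ** 2 for w, d in zip(weights, differences)))
```

With the hue weight set to zero, the hue difference no longer contributes to the judgment value at all.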
 In one modified example, weight values may be set individually for the plurality of evaluation values; that is, the inspection unit F12 may weight each element of the evaluation vector corresponding to each parameter. For example, regarding the state of the surface of the object 100, one may wish to perform the inspection with a focus on glitter. In that case, if the area of the high-lightness region in the target image is small, the desired inspection accuracy may not be obtained with respect to glitter. The inspection unit F12 may then set weight values that increase with lightness. For example, as shown in FIG. 14, the inspection unit F12 may set a weight value on each element of the evaluation vector w11 to generate an evaluation vector w13 in which high-lightness regions are emphasized, and perform the inspection based on the evaluation vector w13. In this way, the inspection unit F12 may individually change the weight values for the plurality of evaluation values according to the surface of the object 100. Regarding glitter, there are also cases where the inspection by the inspection unit F12 passes but visual inspection by a person does not; that is, the result of the inspection by the inspection unit F12 may not match how a person perceives the surface. In such a case, it is conceivable to adopt, as the weight values, psychophysical quantities that reflect psychological quantities corresponding to human perception. This makes it possible to reflect human psychological quantities in the inspection by the inspection unit F12.
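Element-wise weighting of an evaluation vector can be sketched as below. The linear ramp that grows with the bin's lightness is an illustrative choice standing in for the weights of FIG. 14 (which could equally be psychophysical quantities); the function name is hypothetical.

```python
def emphasize_bright_bins(w11):
    """Generate a w13-style weighted evaluation vector: each element of
    w11 is scaled by a weight that grows linearly with the lightness of
    its bin, so high-lightness regions dominate the comparison."""
    n = len(w11)
    return [value * (i + 1) / n for i, value in enumerate(w11)]
```

The weighted vector is then used in place of w11 when computing the evaluation-vector difference.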
 In one modified example, the target information may be obtained from a deviation image derived from the target image. The deviation image is an image whose pixel values are obtained by subtracting a reference value from the pixel values of the target image. The reference value may be selected from the average, the mode, and the median of the pixel values of the target image, and the representative value of the frequency distribution obtained from the target image. Such a deviation image is effective for inspecting the texture of the surface of the object 100. That is, when inspecting the texture of the surface of the object 100 as the surface state, the inspection unit F12 generates a deviation image from the target image and generates the target information based on the deviation image. In this case, a deviation image is generated by the same method for the reference object to be compared with the object 100. Generating the target information from the deviation image in this way can improve the accuracy of the inspection of the texture of the surface of the object 100.
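The deviation image is a simple per-pixel subtraction; the sketch below uses the mean as the reference value, one of the choices listed above, on a flat list of single-channel pixel values.

```python
def deviation_image(pixels):
    """Deviation image for one channel: subtract the reference value
    (here the mean; the mode, the median, or the representative value of
    the frequency distribution are equally valid choices) from every pixel."""
    reference = sum(pixels) / len(pixels)
    return [p - reference for p in pixels]
```

The frequency distributions and shape information are then computed from the deviation values instead of the raw pixel values, for both the target and the reference object.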
 In one modified example, the acquisition unit F11 may acquire a plurality of target images obtained by imaging the surface of the object 100 under different imaging conditions. That is, the acquisition step S11 may be a step of acquiring a plurality of target images obtained by imaging the surface of the object 100 under different imaging conditions. Examples of imaging conditions include the imaging range, the imaging direction, and the resolution. For example, to vary the imaging conditions, the imaging system 30 may use a plurality of cameras 31 and 32 with different imaging ranges and resolutions, as shown in FIG. 15. FIG. 15 is an explanatory diagram of a method of acquiring target images in this modified example. In FIG. 15, the camera 31 can image a wider range than the camera 32, but the camera 32 has a higher resolution than the camera 31 and is arranged closer to the surface of the object 100. As shown in FIG. 15, the camera 31 outputs an image of a first imaging range 111 of the surface 110 of the object 100 as a target image P11 (see FIG. 16), and the camera 32 outputs an image of a second imaging range 112 of the surface 110 as a target image P12 (see FIG. 17). Here, the second imaging range 112 is narrower than the first imaging range 111; the second imaging range 112 need not be included in the first imaging range 111. The camera 31 is a macro camera, better suited than the camera 32 to inspection of the surface 110 of the object 100 from a larger viewpoint (for example, inspection of gradation). The camera 32 is a micro camera, better suited than the camera 31 to inspection of the surface 110 from a smaller viewpoint (for example, inspection of graininess). For example, FIG. 18 shows a histogram H11 corresponding to the target image P11, and FIG. 19 shows a histogram H12 corresponding to the target image P12; naturally, different imaging conditions yield different histograms. The inspection unit F12 obtains target information for each of the target images P11 and P12 and compares each with its corresponding reference information. This can improve the accuracy of the inspection of the state of the surface 110 of the object 100.
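The per-image frequency distributions compared here (H11 for the macro image P11, H12 for the micro image P12) can be sketched as plain binning. The 64-bin count matches the "number of bins" used elsewhere in this description, while the function name and the assumed 0..1 value range are illustrative.

```python
def frequency_distribution(values, bins=64, lo=0.0, hi=1.0):
    """Bin one parameter (e.g. lightness) of a target image into a
    frequency distribution; the macro and micro target images are
    binned separately and each compared with its own reference."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for v in values:
        index = min(int((v - lo) / width), bins - 1)  # clamp v == hi into last bin
        counts[index] += 1
    return counts
```

Running this once per camera yields the two histograms from which the two sets of target information are built.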
 In one modification, the inspection unit F12 may inspect, as the state of the surface of the object 100, the unevenness of the surface of the object 100. FIG. 20 is an explanatory diagram of an example of an image of the surface of the object 100 in this modification. Here, the features of the unevenness of the surface of the object 100 are considered to appear clearly in the specular reflection portion 130 of the surface of the object 100 in the target image P20 shown in FIG. 20. Therefore, the acquisition unit F11 acquires a plurality of target images obtained by imaging the surface of the object 100 under different imaging conditions. Examples of imaging conditions include the imaging range, the imaging direction, and the resolution. For example, to vary the imaging conditions, the imaging system 30 may use a plurality of cameras 33 and 34 having different imaging directions, as shown in FIG. 21. In FIG. 21, the surface 110 of the object 100 is illuminated by the lighting system 20. Part of the light 21 from the lighting system 20 becomes light 22 specularly reflected by the surface 110, and the rest of the light 21 from the lighting system 20 is diffusely reflected. The camera 33 is arranged at a position where it receives the light diffusely reflected by the surface 110 of the object 100; the target image generated by the camera 33 therefore reflects the diffuse reflection component at the surface 110 of the object 100. The camera 34 is arranged at a position where it receives the light specularly reflected by the surface 110 of the object 100; the target image generated by the camera 34 therefore strongly reflects the specular reflection component at the surface 110 of the object 100. The specular reflection component reflects the shape of the surface 110 of the object 100, such as its unevenness, better than the diffuse reflection component does, while the diffuse reflection component reflects the color itself of the surface 110 of the object 100 better than the specular reflection component does. Consequently, the state of the surface 110 of the object 100 can be inspected taking into account not only the color of the surface of the object 100 itself but also the shape of the surface of the object 100. As a result, the inspection system 1 can improve the accuracy of the inspection of the state of the surface of the object 100. When varying the imaging direction, the surface 110 of the object 100 may instead be imaged from different positions with the same camera. Further, to better inspect the unevenness of the surface of the object 100, the specular reflection portion 130 may be imaged with both a macro camera and a micro camera to obtain target images. A target image obtained using high dynamic range (HDR) imaging may also be used. Because a target image obtained by HDR composition covers a wider dynamic range than usual, components with relatively little light, such as the diffuse reflection component, and components with relatively much light, such as the specular reflection component, can be captured in a single image.
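The HDR composition mentioned above can be sketched with a common exposure-merging recipe. This is an illustrative assumption, not the patent's own algorithm: each frame is divided by its exposure time to estimate scene radiance, and a hat-shaped weight de-emphasizes under- and over-exposed pixels.

```python
import numpy as np

def merge_hdr(exposures, times):
    """Merge differently exposed 8-bit frames into one radiance map.

    exposures: frames of the same scene; times: their exposure times.
    A simple hat weight (0 at pixel values 0 and 255, 1 at mid-gray)
    discounts clipped and noisy pixels before averaging.
    """
    acc = np.zeros(exposures[0].shape, dtype=float)
    wsum = np.zeros_like(acc)
    for img, t in zip(exposures, times):
        x = img.astype(float)
        w = 1.0 - np.abs(x / 255.0 - 0.5) * 2.0  # hat weight
        acc += w * (x / t)                        # weighted radiance estimate
        wsum += w
    return acc / np.maximum(wsum, 1e-6)

# One dark diffuse pixel and one specular highlight: the highlight is
# clipped in the long exposure, the diffuse pixel is noisy in the short
# one, yet both are recovered in a single radiance map.
short = np.array([[10, 200]], dtype=np.uint8)   # short exposure (t = 1)
long_ = np.array([[80, 255]], dtype=np.uint8)   # long exposure (t = 8)
radiance = merge_hdr([short, long_], times=[1.0, 8.0])
```

The clipped value 255 receives zero weight, so only the unclipped short exposure contributes to the highlight's radiance, while both frames contribute to the diffuse pixel.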
 In one modification, the divisions Di used when obtaining the plurality of evaluation values need not all have the same width. For example, in the above embodiment, as shown in FIG. 3, the plurality of divisions D1 to D7 have the same width. Alternatively, the plurality of divisions Di may have different widths. The narrower a division Di, the higher the resolution of the evaluation vector and the better the accuracy of the inspection. In particular, it is preferable to make a division Di narrower the closer it is to the peak of the histogram. For example, FIG. 22 shows a histogram H20 obtained from a target image containing a large specular reflection component. The peak of the histogram H20 is biased toward the bright side (the right side). In FIG. 22, a plurality of divisions Di (D21 to D30) are set for the histogram H20. Here, the widths of the divisions D21 to D30 decrease in this order. Setting the widths of the divisions D21 to D30 in this way allows the evaluation vector obtained from them to reflect the shape of the histogram H20 more faithfully. Therefore, the accuracy of the inspection of the state of the surface of the object 100 can be improved. In particular, in ranges where the frequency of the histogram H20 changes sharply, narrowing the divisions allows the evaluation vector to better reflect the shape of the histogram H20. Conversely, in ranges where the frequency of the histogram H20 changes little, the divisions may be widened, which reduces the amount of data in the evaluation vector.
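Unequal division widths amount to computing the histogram over a custom set of bin edges. A minimal sketch follows; the specific edge values and the Gaussian stand-in for a bright specular image are illustrative assumptions, not values from the patent.

```python
import numpy as np

def evaluation_vector_nonuniform(image, edges):
    """Evaluation values from divisions of unequal width.

    edges: monotonically increasing bin boundaries; narrower bins are
    placed where the histogram changes fast (here, near the bright
    peak of a specular image). Values are area ratios, so the vector
    still sums to 1.
    """
    hist, _ = np.histogram(image, bins=edges)
    return hist / hist.sum()

# Divisions analogous to D21..D30 in FIG. 22: progressively narrower
# toward the bright end where the peak sits (edge values illustrative).
edges = np.array([0, 96, 160, 200, 224, 240, 248, 252, 254, 255, 256])

rng = np.random.default_rng(1)
bright = np.clip(rng.normal(245, 4, size=10000), 0, 255)  # peak near 245
w = evaluation_vector_nonuniform(bright, edges)
print(w.round(3))  # most mass lands in the narrow bins near the peak
```

The wide dark-side divisions carry almost no information for this image and could be merged further, which is exactly the data-reduction trade-off described above.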
 In one modification, the inspection unit F12 may inspect, as the state of the surface of the object 100, the image clarity (distinctness of image) of the surface of the object 100. Image clarity is an index of the sharpness of the image of an object (a reflected image) appearing on the surface of the object 100. When the image clarity is poor the reflected image becomes blurred, and when the image clarity is good the reflected image becomes sharp. Image clarity is affected by the roughness and shape of the surface of the object 100. For the image clarity inspection, an inspection image 200 is used as shown in FIG. 23. A chart bearing a predetermined test pattern is used as the inspection image 200. Light 23 from the lighting system 20 is applied to the inspection image 200 so that a reflected image 210 of the inspection image 200 appears on the surface 110 of the object 100. The imaging system 30 is then arranged at a position where it receives light 24 from the reflected image 210. The imaging system 30 thereby captures the reflected image 210 appearing on the surface 110 of the object 100 and outputs it as the target image. That is, in the image clarity inspection, the target image is an image of the reflection of the inspection image 200 by the surface 110 of the object 100. The inspection unit F12 may generate target information from such a target image, compare the target information with the corresponding reference information, and inspect the state (image clarity) of the surface 110 of the object 100. Instead of the chart, a self-luminous display showing a predetermined test pattern may be used. In that case, the lighting system 20 need not be used to project the test pattern onto the surface of the object 100.
 In one modification, the presentation unit F13 may present, based on the result of the inspection by the inspection unit F12, difference information that visually indicates the difference between the target information and the reference information. FIG. 24 shows an example of the difference information. In FIG. 24, the difference between the representative values of the frequency distributions is emphasized as the difference information. More specifically, as shown in FIG. 24, the difference information is a graph showing the evaluation vector wT1 and the representative value CT1 obtained from the target information, and the evaluation vector wR1 and the representative value CR1 obtained from the reference information. The difference information further includes the upper limit Th1 and the lower limit Th2 of the allowable range for the representative value CT1 of the target information. The allowable range is set around the representative value CR1 obtained from the reference information. With the difference information of FIG. 24, the user can visually determine whether the representative value CT1 of the target information is within the allowable range with respect to the representative value CR1 of the reference information. FIG. 25 shows another example of the difference information. In FIG. 25, the difference between the evaluation vectors of the target information and the reference information is emphasized as the difference information. More specifically, in FIG. 25 the difference information is a graph showing the evaluation vector wT1 obtained from the target information and the evaluation vector wR1 obtained from the reference information. Further, in FIG. 25, the regions R11 and R12 constituting the difference between the evaluation vector wT1 of the target information and the evaluation vector wR1 of the reference information are highlighted. With the difference information of FIG. 25, the user can therefore visually grasp the difference between the evaluation vector wT1 of the target information and the evaluation vector wR1 of the reference information, and can thus determine whether the evaluation vector wT1 of the target information is within the allowable range with respect to the evaluation vector wR1 of the reference information. FIG. 26 shows yet another example of the difference information. FIG. 26 shows, as the difference information, the vector of the difference between the evaluation vectors of the target information and the reference information. More specifically, the difference information of FIG. 26 displays the regions R11 and R12 constituting the difference between the evaluation vector wT1 of the target information and the evaluation vector wR1 of the reference information in FIG. 25. With the difference information of FIG. 26 as well, the user can visually grasp the difference between the evaluation vector wT1 of the target information and the evaluation vector wR1 of the reference information, and can thus determine whether the evaluation vector wT1 of the target information is within the allowable range with respect to the evaluation vector wR1 of the reference information.
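Computing the data behind such difference displays can be sketched as follows. The band width and the example vectors are illustrative assumptions; the patent does not prescribe a particular tolerance rule.

```python
import numpy as np

def difference_report(w_target, w_reference, band=0.03):
    """Difference information between target and reference vectors.

    Returns the per-division difference vector (the kind of data
    plotted in FIG. 26) and the indices of divisions outside an
    allowable band around the reference (the regions highlighted in
    FIG. 25). The band width is an assumption for illustration.
    """
    diff = w_target - w_reference
    outside = np.flatnonzero(np.abs(diff) > band)
    return diff, outside

wT1 = np.array([0.05, 0.10, 0.30, 0.35, 0.20])  # target evaluation vector
wR1 = np.array([0.05, 0.15, 0.35, 0.30, 0.15])  # reference evaluation vector
diff, outside = difference_report(wT1, wR1)
print(diff)      # signed per-division differences
print(outside)   # divisions exceeding the band -> [1 2 3 4]
```

Plotting `diff` as bars, with the `outside` divisions colored, reproduces the at-a-glance judgment the user makes from FIGS. 25 and 26.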
 Further, the presentation unit F13 may present an image of the object 100 in which the difference between the target information and the reference information, obtained as the result of the inspection by the inspection unit F12, is reflected on the surface of the object 100. The presentation unit F13 may also make a presentation that clearly distinguishes the portions judged acceptable by the inspection unit F12 from the portions judged unacceptable. For example, in the image P30 of the object 100 shown in FIG. 27, the surface of the object 100 is classified into four regions R31, R32, R33, and R34. The four regions R31, R32, R33, and R34 indicate, in this order, increasing differences between the representative values of the target information and the reference information. The region R34 is the portion of the surface of the object 100 judged unacceptable by the inspection unit F12. Such a presentation by the presentation unit F13 makes it possible to indicate the portion of the target image judged unacceptable (the portion with an abnormality) in an easily understandable manner. Similarly, in the image P40 of the object 100 shown in FIG. 28, the surface of the object 100 is classified into four regions R41, R42, R43, and R44. The four regions R41, R42, R43, and R44 indicate, in this order, increasing differences between the evaluation vectors of the target information and the reference information. The region R44 is the portion of the surface of the object 100 judged unacceptable by the inspection unit F12. Such a presentation by the presentation unit F13 likewise makes it possible to indicate the portion of the target image judged unacceptable (the portion with an abnormality) in an easily understandable manner.
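The four-region classification can be sketched as thresholding a per-patch difference map. The threshold values, patch layout, and difference magnitudes below are illustrative assumptions.

```python
import numpy as np

def classify_regions(diff_map, thresholds):
    """Classify surface patches by difference magnitude.

    diff_map: per-patch difference between target and reference
    information (e.g. representative-value distance). thresholds:
    three ascending boundaries separating classes 0..3, standing in
    for regions such as R31..R34; the last class corresponds to
    patches judged unacceptable.
    """
    return np.digitize(diff_map, thresholds)  # 0..3

diff_map = np.array([[0.01, 0.04],
                     [0.09, 0.30]])           # one patch per surface area
labels = classify_regions(diff_map, thresholds=[0.03, 0.08, 0.20])
fail_mask = labels == 3                        # "unacceptable" patches
print(labels)
print(fail_mask)
```

Rendering `labels` as a color overlay on the object image yields a presentation of the kind shown in FIGS. 27 and 28.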
 In one modification, the input/output unit 11 may include an image display device. In this case, the presentation unit F13 may display the result of the inspection by the inspection unit F12 on the image display device of the input/output unit 11. The input/output unit 11 may also include an audio output device. In this case, the presentation unit F13 may output the result of the inspection by the inspection unit F12 from the audio output device of the input/output unit 11.
 In one modification, the wavelength of the light emitted by the lighting system 20 may be changeable. This can be realized using light sources having different emission colors, or using color filters. In short, in the inspection system 1, at least one of the wavelength of the light emitted by the lighting system 20 and the wavelength of the light detected by the imaging system 30 may be changeable.
 In one modification, the inspection system 1 (determination system 10) may be composed of a plurality of computers. For example, the functions of the inspection system 1 (determination system 10) (in particular, the acquisition unit F11, the inspection unit F12, the presentation unit F13, and the control unit F14) may be distributed across a plurality of devices. Furthermore, at least some of the functions of the inspection system 1 (determination system 10) may be realized by, for example, the cloud (cloud computing).
 The execution subject of the inspection system 1 (determination system 10) described above includes a computer system. The computer system has a processor and a memory as hardware. The functions of the execution subject of the inspection system 1 (determination system 10) in the present disclosure are realized by the processor executing a program recorded in the memory of the computer system. The program may be recorded in advance in the memory of the computer system, may be provided through a telecommunications line, or may be provided recorded on a non-transitory recording medium readable by the computer system, such as a memory card, an optical disc, or a hard disk drive. The processor of the computer system is composed of one or more electronic circuits including a semiconductor integrated circuit (IC) or a large-scale integrated circuit (LSI). A field-programmable gate array (FPGA) programmed after manufacture of the LSI, an application-specific integrated circuit (ASIC), or a reconfigurable logic device whose internal connections can be reconfigured or whose internal circuit partitions can be set up after manufacture of the LSI can be used for the same purpose. The plurality of electronic circuits may be integrated on one chip or distributed over a plurality of chips. The plurality of chips may be integrated in one device or distributed over a plurality of devices.
 (3) Aspects
 As is clear from the above embodiments and modifications, the present disclosure includes the following aspects. In the following, reference signs are given in parentheses only to indicate the correspondence with the embodiments.
 A first aspect is an inspection method including an acquisition step (S11) and an inspection step (S12). The acquisition step (S11) is a step of acquiring a target image relating to a surface of an object (100). The inspection step (S12) is a step of inspecting a state of the surface of the object (100) based on target information obtained from the target image, the target information including shape information representing a shape of a frequency distribution of a parameter based on pixel values. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A second aspect is an inspection method based on the first aspect. In the second aspect, the shape information includes a feature amount representing a feature of the shape of the frequency distribution. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A third aspect is an inspection method based on the second aspect. In the third aspect, the feature amount includes a plurality of evaluation values based on the respective areas (Si) of a plurality of divisions (Di) of the parameter of the shape of the frequency distribution. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A fourth aspect is an inspection method based on the third aspect. In the fourth aspect, weight values are individually set for the plurality of evaluation values. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A fifth aspect is an inspection method based on the second aspect. In the fifth aspect, the feature amount includes a representative value of the frequency distribution. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A sixth aspect is an inspection method based on the fifth aspect. In the sixth aspect, the representative value is a value obtained from the ratios (wi) of the plurality of divisions (Di) of the parameter of the shape of the frequency distribution to the whole of the shape of the frequency distribution, and the parameter values (Li) corresponding to the plurality of divisions (Di). According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
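The representative value of the sixth aspect can be read as a ratio-weighted mean over the divisions: each division's share wi of the histogram area times a parameter value Li for that division. A minimal sketch, with the bin midpoint as an assumed choice of Li:

```python
import numpy as np

def representative_value(image, bins=8, value_range=(0, 256)):
    """C = sum_i wi * Li, the ratio-weighted representative value.

    wi: share of the histogram area in division Di; Li: parameter
    value for Di (the bin midpoint is assumed here for illustration).
    """
    hist, edges = np.histogram(image, bins=bins, range=value_range)
    w = hist / hist.sum()                 # ratios wi (sum to 1)
    L = (edges[:-1] + edges[1:]) / 2.0    # division values Li
    return float(np.sum(w * L))

img = np.full((4, 4), 100)                # uniform mid-gray patch
print(representative_value(img))          # all mass in the 96-128 bin -> 112.0
```

Because it weights each division by its share of the area, the value tracks where the distribution's mass sits rather than any single peak.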
 A seventh aspect is an inspection method based on any one of the first to sixth aspects. In the seventh aspect, the parameter is a parameter relating to color. According to this aspect, the accuracy of the color inspection of the state of the surface of the object (100) can be improved.
 An eighth aspect is an inspection method based on any one of the first to seventh aspects. In the eighth aspect, the target information includes shape information for each of a plurality of the parameters. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A ninth aspect is an inspection method based on the eighth aspect. In the ninth aspect, weight values are individually set for the plurality of parameters. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A tenth aspect is an inspection method based on the eighth or ninth aspect. In the tenth aspect, the plurality of parameters include a first parameter and a second parameter, and when the first parameter satisfies a predetermined condition, the inspection step (S12) does not use the shape information relating to the second parameter for the inspection. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 An eleventh aspect is an inspection method based on the tenth aspect. In the eleventh aspect, the first parameter is saturation, and the second parameter is hue. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A twelfth aspect is an inspection method based on any one of the first to eleventh aspects. In the twelfth aspect, the inspection step (S12) compares the target information with reference information including shape information obtained from a reference image relating to a surface of a reference object serving as a reference for the object (100). According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A thirteenth aspect is an inspection method based on any one of the first to twelfth aspects. In the thirteenth aspect, the target information is obtained from a deviation image obtained from the target image. The deviation image is an image whose pixel values are obtained by subtracting a reference value of the pixel values of the target image from the pixel values of the target image. According to this aspect, the texture of the surface of the object (100) can be inspected.
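The deviation image of the thirteenth aspect can be sketched directly from its definition. The choice of the image mean as the reference value is an assumption for illustration; the aspect only requires some reference value of the target image's pixel values.

```python
import numpy as np

def deviation_image(target, reference_value=None):
    """Pixel-wise deviation: target pixel minus a reference value.

    If no reference value is given, the mean of the target image is
    assumed (an illustrative choice, not mandated by the aspect).
    """
    t = target.astype(float)
    if reference_value is None:
        reference_value = t.mean()
    return t - reference_value

patch = np.array([[118, 120],
                  [122, 120]])
dev = deviation_image(patch)   # fluctuations around the mean (120)
# Texture (graininess) can then be judged from the spread of the
# deviation histogram rather than from the absolute color level.
print(dev)
```

Subtracting the reference value removes the overall color level, so the frequency distribution of the deviation image isolates the texture component.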
 A fourteenth aspect is an inspection method based on any one of the first to thirteenth aspects. In the fourteenth aspect, the acquisition step (S11) acquires a plurality of target images obtained by imaging the surface of the object (100) under different imaging conditions. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A fifteenth aspect is an inspection method based on any one of the first to fourteenth aspects. In the fifteenth aspect, the target image is an image of a reflected image (210) of an inspection image (200) formed by the surface of the object (100). According to this aspect, the image clarity of the surface of the object (100) can be inspected.
 A sixteenth aspect is an inspection method based on any one of the first to fifteenth aspects. In the sixteenth aspect, the inspection method further includes a presentation step (S13) of making a presentation based on the result of the inspection in the inspection step (S12). According to this aspect, the result of the inspection of the object (100) can be presented.
 A seventeenth aspect is an inspection method based on any one of the first to sixteenth aspects. In the seventeenth aspect, the inspection method further includes a control step (S14) of outputting control information based on the result of the inspection in the inspection step (S12) to a painting system (40) that paints the surface of the object (100). According to this aspect, the quality of the painting of the surface of the object (100) can be improved.
 An eighteenth aspect is an inspection method based on any one of the first to seventeenth aspects. In the eighteenth aspect, the shape of the frequency distribution is the shape of a histogram. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A nineteenth aspect is a program for causing one or more processors to execute the inspection method of any one of the first to eighteenth aspects. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 A twentieth aspect is an inspection system (1) including an acquisition unit (F11) and an inspection unit (F12). The acquisition unit (F11) acquires a target image of a surface of an object (100). The inspection unit (F12) inspects a state of the surface of the object (100) based on target information obtained from the target image, the target information including shape information representing a shape of a frequency distribution of a parameter based on pixel values. According to this aspect, the accuracy of the inspection of the state of the surface of the object (100) can be improved.
 The second to eighteenth aspects can also be applied, with appropriate modifications, to the nineteenth and twentieth aspects.
 According to the inspection method, program, and inspection system of the present disclosure, the accuracy of inspection of the state of the surface of an object can be improved. The invention according to the present disclosure is therefore industrially useful.
 1 Inspection system
 F11 Acquisition unit
 F12 Inspection unit
 100 Object
 200 Inspection image
 210 Reflected image
 S11 Acquisition step
 S12 Inspection step
 S13 Presentation step
 S14 Control step

Claims (20)

  1.  An inspection method comprising:
     an acquisition step of acquiring a target image relating to a surface of an object; and
     an inspection step of inspecting a state of the surface of the object based on target information obtained from the target image, the target information including shape information representing a shape of a frequency distribution of a parameter based on pixel values.
  2.  The inspection method of claim 1, wherein
     the shape information includes a feature amount representing a feature of the shape of the frequency distribution.
  3.  The inspection method of claim 2, wherein
     the feature amount includes a plurality of evaluation values based on respective areas of a plurality of divisions of the parameter of the shape of the frequency distribution.
  4.  The inspection method of claim 3, wherein weight values are individually set for the plurality of evaluation values.
  5.  The inspection method of claim 2, wherein the feature amount includes a representative value of the frequency distribution.
  6.  The inspection method of claim 5, wherein the representative value is a value obtained from ratios of respective divisions of the parameter in the shape of the frequency distribution to the whole of the shape of the frequency distribution, and from values of the parameter corresponding to the plurality of divisions.
  7.  The inspection method of any one of claims 1 to 6, wherein the parameter is a parameter relating to color.
  8.  The inspection method of any one of claims 1 to 7, wherein the target information includes shape information for each of a plurality of the parameters.
  9.  The inspection method of claim 8, wherein weight values are individually set for the plurality of parameters.
  10.  The inspection method of claim 8 or 9, wherein the plurality of parameters include a first parameter and a second parameter, and the inspection step does not use the shape information regarding the second parameter for the inspection when the first parameter satisfies a predetermined condition.
  11.  The inspection method of claim 10, wherein the first parameter is saturation and the second parameter is hue.
  12.  The inspection method of any one of claims 1 to 11, wherein the inspection step compares the target information with reference information including the shape information obtained from a reference image of a surface of a reference object serving as a reference for the object.
  13.  The inspection method of any one of claims 1 to 12, wherein the target information is obtained from a deviation image obtained from the target image, the deviation image being an image whose pixel values are values obtained by subtracting a reference value of the pixel values of the target image from the pixel values of the target image.
  14.  The inspection method of any one of claims 1 to 13, wherein the acquisition step acquires a plurality of the target images obtained by imaging the surface of the object under different imaging conditions.
  15.  The inspection method of any one of claims 1 to 14, wherein the target image is an image of a reflection of an inspection image on the surface of the object.
  16.  The inspection method of any one of claims 1 to 15, further comprising a presentation step of making a presentation based on a result of the inspection in the inspection step.
  17.  The inspection method of any one of claims 1 to 16, further comprising a control step of outputting control information based on a result of the inspection in the inspection step to a coating system that coats the surface of the object.
  18.  The inspection method of any one of claims 1 to 17, wherein the shape of the frequency distribution is a shape of a histogram.
  19.  A program for causing one or more processors to execute the inspection method of any one of claims 1 to 18.
  20.  An inspection system comprising:
      an acquisition unit that acquires a target image of a surface of an object; and
      an inspection unit that inspects a state of the surface of the object based on target information obtained from the target image, the target information including shape information representing a shape of a frequency distribution of a parameter based on pixel values.
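The deviation image of claim 13 and the weighted per-division evaluation values of claims 3 and 4 can be combined into a short sketch. The division edges, the weights, and the tolerance below are assumptions introduced for illustration, not values from this publication.

```python
import numpy as np

def deviation_image(target, reference_value):
    """Deviation image: target pixel values minus a reference value (claim 13)."""
    return target.astype(float) - reference_value

def weighted_division_score(values, edges, weights):
    """Combine the areas of the divisions of the frequency distribution using
    individually set weight values (claims 3 and 4)."""
    hist, _ = np.histogram(values, bins=edges)
    areas = hist / hist.sum()          # area (proportion) of each division
    return float(np.dot(weights, areas))

rng = np.random.default_rng(1)
target = rng.normal(loc=128.0, scale=5.0, size=(32, 32))
dev = deviation_image(target, reference_value=128.0)

# Three divisions of the deviation axis; the outer ones are weighted so that
# large deviations dominate the score (edges and weights are assumed).
edges = np.array([-np.inf, -10.0, 10.0, np.inf])
weights = np.array([1.0, 0.0, 1.0])
score = weighted_division_score(dev.ravel(), edges, weights)
passed = score < 0.1  # hypothetical tolerance
```

Setting the middle weight to zero ignores small deviations entirely; nonzero weights for every division would instead yield a graded score over the whole distribution.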
PCT/JP2020/045763 2019-12-27 2020-12-09 Inspection method, program, and inspection system WO2021131686A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-239862 2019-12-27
JP2019239862A JP2023015418A (en) 2019-12-27 2019-12-27 Inspection method, program and inspection system

Publications (1)

Publication Number Publication Date
WO2021131686A1 true WO2021131686A1 (en) 2021-07-01

Family

ID=76575410

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/045763 WO2021131686A1 (en) 2019-12-27 2020-12-09 Inspection method, program, and inspection system

Country Status (2)

Country Link
JP (1) JP2023015418A (en)
WO (1) WO2021131686A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09178558A (en) * 1995-12-22 1997-07-11 Kawasaki Heavy Ind Ltd Method and apparatus for determining color of object
US6901163B1 (en) * 1998-05-19 2005-05-31 Active Silicon Limited Method of detecting objects
JP2015184147A (en) * 2014-03-25 2015-10-22 ダイハツ工業株式会社 Color matching method of bright material containing coating film, and structure member applied with bright material containing coating film
JP2018004423A (en) * 2016-06-30 2018-01-11 有限会社パパラボ Color determination device and color determination method
US20190011252A1 (en) * 2016-01-07 2019-01-10 Arkema Inc. Optical method to measure the thickness of coatings deposited on substrates
JP2019144059A (en) * 2018-02-20 2019-08-29 株式会社パパラボ Device and method for determining surface roughness by white light source


Also Published As

Publication number Publication date
JP2023015418A (en) 2023-02-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20908284

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20908284

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP