AU2002338353A1 - Method and device for surface evaluation - Google Patents
Info
- Publication number
- AU2002338353A1
- Authority
- AU
- Australia
- Prior art keywords
- light
- sample
- light source
- imaging device
- recording
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Description
METHOD AND DEVICE FOR SURFACE EVALUATION
The invention relates to a method and a device for recording the visual properties of a surface, for example the colour, gloss, texture, etc. of pigmented coating films, using an imaging device, a light source, and a sample area for positioning a sample with a surface to be examined. The visual appearance of a surface can be dependent on optical geometry, optical geometry being defined as the positioning of an observed object relative to an observer and a light source. This dependency can occur, for example, in films of coatings comprising effect pigments. The dependency of visual properties on optical geometry is generally called goniochromatism, or flop behaviour. In optical geometry, the observation direction is the line between the observer and an observed point on the observed object, and the illumination direction is the line between the light source and the observed point. The specular direction is the direction of a line mirroring the illumination direction by a mirror plane perpendicular to the sample surface. The flop angle, also called the aspecular angle, is the angle between the observation direction and the specular direction. The gloss of both effect and non-effect films is also dependent on the optical geometry.
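To make these geometric definitions concrete, the sketch below computes the flop (aspecular) angle from unit vectors for the illumination and observation directions, using the usual specular-reflection construction for a flat sample. The function, the vector names, and the numerical example are illustrative assumptions, not part of the original disclosure.

```python
import numpy as np

def flop_angle(illumination, observation, normal):
    """Return the flop (aspecular) angle in degrees.

    illumination: vector from the observed point towards the light source
    observation:  vector from the observed point towards the observer
    normal:       surface normal of the sample
    """
    i = np.asarray(illumination, dtype=float)
    o = np.asarray(observation, dtype=float)
    n = np.asarray(normal, dtype=float)
    i, o, n = (v / np.linalg.norm(v) for v in (i, o, n))

    # Specular direction: the illumination direction mirrored about the normal.
    s = 2.0 * np.dot(i, n) * n - i

    # Flop angle: angle between the observation direction and the specular direction.
    cos_theta = np.clip(np.dot(o, s), -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

# Hypothetical example: light at 45 degrees, camera straight above the sample.
print(flop_angle(illumination=[-1, 0, 1], observation=[0, 0, 1], normal=[0, 0, 1]))  # 45.0
```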
Effect pigments are used in coatings to obtain optical effects, such as a metallic or pearlescent look. Typically, in a film of a coating comprising metallic pigments, the lightness is dependent on the optical geometry, whereas with coatings comprising pearlescent pigments the hue changes with the optical geometry. This complicates the characterization of the visual appearance of such coating films. A further complication in this respect is the appearance of local strong scattering spots in the coating film when it is observed from a short distance, due to the presence of pigment flakes. Whereas a solid colour can be characterized by the spectral distribution of reflectance values, coatings comprising effect pigments require the angular and spatial dependency to be taken into account.
Hitherto, the dependency of the lightness and colour of metallic and pearlescent coatings on optical geometry was examined by using a spectrophotometer at a limited number of different measuring geometries. However, this gives an incomplete picture, since data are obtained at only a very limited number of optical geometries.
US 5,550,632 discloses a method and an arrangement for evaluating paint films using a digital photo camera. Only one optical geometry is used per recording, and since the camera is focussed, only one flop angle. Since the appearance of a coating layer comprising effect pigments is dependent on the flop angle, this method cannot be used for the evaluation of such effect coatings in one recording.
The object of the invention is a system allowing the evaluation of a surface in a single recording over a continuous range of flop angles.
The object of the invention is achieved by a device for recording the flop angle dependent properties of a surface, comprising an imaging device for recording light interaction by a surface, a light source, and a sample area for positioning a sample with a surface to be examined, characterized in that the scope of the imaging device covers a continuous range of observation directions and in that the imaging device, the light source, and the sample area are arranged in such a way that in one image at least one of the surface properties is recorded as a function of the flop angle. The recording can be used for visual inspection and comparison or, depending on the capability of the imaging device to quantify recorded signals, for measurement or data processing. Particular examples of use are gloss measurement and colour matching.
A device or arrangement according to the present invention is particularly suitable for evaluating the flop behaviour, or goniochromatism, of a surface coated with a coating containing effect pigments.
To get a useful picture of flop behaviour, the range of flop angles, as defined above, should preferably span more than 40 degrees, more preferably more than 50 degrees.
For some types of surfaces, such as coating films containing metallic pigments, the colour is darker at larger flop angles. To utilize the full measurement range in such a case, the light distribution is preferably varied over the viewing scope of the imaging device, preferably according to an increasing or decreasing function. This function is dependent on the type of material. The light distribution can be varied by varying the light output of the light source as a function of the illumination angle. Alternatively, the light distribution can be varied by using suitable filters.
In a preferred arrangement, the light source can be a line source, such as a TL strip light, a horizontal slit in a light diffusor, an array of point sources, such as LEDs or glass fibres, etc. Alternatively, the light source may be a point source.
A suitable imaging device in an arrangement according to the invention is a CCD camera, or Charge Coupled Device camera. Suitable CCD photo cameras are for instance the Ricoh® RDC 5000, the Olympus® C-2000Z, the Minolta® Dimage® RD 3000, and the Nikon® Coolpix® 950.
Digital video cameras form another group of suitable devices which can be used to implement the present invention. Using a video camera, the flop angle can be varied not only as a function of location, but, alternatively or additionally, as a function of time. Using a video camera also makes it possible to monitor time-dependent changes in the visual appearance of the examined surface over a period of time, for instance the appearance of a coating film during curing.
Using a digital camera, every recorded image is composed of a large number of pixels. Every pixel has a red value R, a green value G, and a blue value B.
Ideally, calibrated R, G, and B values for a purely black surface should all be 0, whereas for an ideal purely white surface each of these three values should be equal to a predefined maximum value. The maximum value is equal to 2^n − 1, wherein n is the number of bits defining a pixel. If an 8-bit pixel depth is used, this maximum value is 255.
When metallic coatings are examined, the lightness can exceed the lightness of white locally. This should be taken into account, for example by choosing a maximum white value lower than 2^n − 1.
For accurate colour measurement, it is preferred to calibrate the measurements periodically. When a CCD camera is used, calibration can for example be carried out by first separately recording a black sample and a white sample. The R, G, and B values of the black sample are subtracted from the R, G, and B values of the white sample and of the measured sample in question. Subsequently, the R, G, and B values of the measured sample are divided by the corresponding values of the white calibration sample and multiplied by the maximum white value. This means that for every pixel in the image, a calibrated value R_cal of the R value is calculated using the following formula:
R_cal = 255 * (R − R_black) / (R_white − R_black)
In this formula, R_black is the R value of a pixel in the black sample, whereas R_white is the R value of a pixel in the white sample. Calibrated values for the B and G values are calculated correspondingly. This correction accounts for deviations in the light sensitivity of the pixels and for the variation in illuminant intensity as a function of optical geometry.
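A minimal sketch of this per-pixel black/white calibration, assuming the three channel images are available as numpy arrays of equal shape (the array and function names are illustrative):

```python
import numpy as np

def calibrate_channel(sample, black, white, max_white=255.0):
    """Per-pixel calibration of one colour channel (R, G or B).

    sample, black, white: arrays of raw channel values recorded for the
    measured sample, the black reference, and the white reference.
    """
    sample = sample.astype(float)
    black = black.astype(float)
    white = white.astype(float)

    denom = white - black
    denom[denom == 0] = np.nan   # guard against pixels with no usable dynamic range
    return max_white * (sample - black) / denom

# The same formula is applied to the G and B channels with their own references.
```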
Optionally, the R, G, and B values can be corrected for time-dependent variations in light intensity. This can be done for example by applying a parallel
white strip to the sample. For calculation purposes, the sample and the white strip are considered to be divided along the longitudinal axis of the sample into a large number of virtual parts. For every sample part, the average R, G, and B values R_av, G_av, and B_av are determined. Similarly, for every white strip part, the average R, G, and B values R_white-av, G_white-av, and B_white-av are determined. The corrected R value R_cor for each sample part is then calculated using the following formula:
R_cor = 255 * (R_av / R_white-av)
G_cor and B_cor are calculated accordingly.
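A sketch of this correction with a parallel white strip; the split into virtual parts along the image columns and the names used are assumptions for illustration:

```python
import numpy as np

def correct_with_white_strip(sample_img, white_strip_img, n_parts, max_white=255.0):
    """Correct one channel of a sample image for intensity variations along its
    length, using the average of a parallel white strip per virtual part.

    sample_img, white_strip_img: 2-D arrays of one colour channel; the
    longitudinal axis of the sample is assumed to run along axis 1.
    """
    parts_sample = np.array_split(sample_img.astype(float), n_parts, axis=1)
    parts_white = np.array_split(white_strip_img.astype(float), n_parts, axis=1)

    corrected = []
    for s_part, w_part in zip(parts_sample, parts_white):
        r_av = s_part.mean()           # average channel value of the sample part
        r_white_av = w_part.mean()     # average channel value of the white strip part
        corrected.append(max_white * r_av / r_white_av)
    return np.array(corrected)         # one corrected value per virtual part
```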
The most common systems for colorimetric data have been laid down by the Commission Internationale de l'Eclairage (CIE), e.g., CIELab (L*, a*, b*), CIEXYZ (X, Y, Z), and CIELuv (L*, u*, v*). These systems take the sensitivity of the human eye into account. The R, G, and B values, as measured by a CCD camera, can be converted to L*, a*, b* values of the CIELab system.
The mathematical model selected may be any model known to the skilled person. Examples are mentioned in H.R. Kang, Color Technology for Electronic Imaging Devices, SPIE Optical Engineering Press, 1997, chapters 3 and 11, and in U.S. 5,850,472. The model may be non-linear or linear. Examples of non-linear models are a 2nd order polynomial having 10 parameters and a 3rd order polynomial having 20 parameters. Preferably, use is made of a linear model. More preferably, the linear model used has 4 model parameters.
One example of a linear model having 4 parameters is the following model, where the measured colour signals of the calibration colours, in this case R, G, and B data, are converted to colorimetric data, in this case CIELab data:
L_i* = c0 + c1·R_i + c2·G_i + c3·B_i
a_i* = d0 + d1·R_i + d2·G_i + d3·B_i
b_i* = e0 + e1·R_i + e2·G_i + e3·B_i
wherein R_i, G_i, and B_i are the measured signals and L_i*, a_i*, and b_i* are the colorimetric data of calibration colour i.
Linear regression is used to calculate the 12 model parameters c0-c3, d0-d3, and e0-e3 from the measured RGB data and the known CIELab data (CIE 1964 standard colorimetric observer) of the calibration colours. These model parameters are used to convert the measured RGB data of the selected colour to CIELab data.
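An illustrative sketch of this fit, using numpy's ordinary least-squares solver for the linear regression; the calibration arrays and function names are assumptions:

```python
import numpy as np

def fit_linear_model(rgb, lab):
    """Fit L*, a*, b* each as c0 + c1*R + c2*G + c3*B.

    rgb: (n, 3) measured R, G, B values of the calibration colours
    lab: (n, 3) known L*, a*, b* values of the same colours
    Returns a (3, 4) array of model parameters, one row per coordinate.
    """
    design = np.column_stack([np.ones(len(rgb)), rgb])       # columns [1, R, G, B]
    params, *_ = np.linalg.lstsq(design, lab, rcond=None)    # solves all 12 parameters at once
    return params.T

def apply_linear_model(params, rgb):
    """Convert measured RGB values of a selected colour to CIELab."""
    rgb = np.atleast_2d(rgb)
    design = np.column_stack([np.ones(len(rgb)), rgb])
    return design @ params.T
```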
One example of a non-linear 3rd order polynomial having 20 parameters is:
L_i* = c0 + c1R_i + c2G_i + c3B_i + c4R_i^2 + c5G_i^2 + c6B_i^2 + c7R_iG_i + c8R_iB_i + c9G_iB_i + c10R_i^3 + c11G_i^3 + c12B_i^3 + c13R_i^2G_i + c14R_i^2B_i + c15G_i^2R_i + c16G_i^2B_i + c17B_i^2R_i + c18B_i^2G_i + c19R_iG_iB_i
a_i* = d0 + d1R_i + d2G_i + d3B_i + d4R_i^2 + d5G_i^2 + d6B_i^2 + d7R_iG_i + d8R_iB_i + d9G_iB_i + d10R_i^3 + d11G_i^3 + d12B_i^3 + d13R_i^2G_i + d14R_i^2B_i + d15G_i^2R_i + d16G_i^2B_i + d17B_i^2R_i + d18B_i^2G_i + d19R_iG_iB_i
b_i* = e0 + e1R_i + e2G_i + e3B_i + e4R_i^2 + e5G_i^2 + e6B_i^2 + e7R_iG_i + e8R_iB_i + e9G_iB_i + e10R_i^3 + e11G_i^3 + e12B_i^3 + e13R_i^2G_i + e14R_i^2B_i + e15G_i^2R_i + e16G_i^2B_i + e17B_i^2R_i + e18B_i^2G_i + e19R_iG_iB_i
Linear regression is used to calculate the 60 model parameters c0-c19, d0-d19, and e0-e19 from the measured RGB data and the known CIELab data of the calibration colours. These model parameters are used to convert the measured RGB data of the selected colour to CIELab data.
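The 20-parameter model only differs from the linear one in the design matrix. A sketch of building the 3rd-order feature vector, with term order following the reconstructed equations above (the helper name is illustrative):

```python
import numpy as np

def cubic_features(rgb):
    """Return the 20 polynomial terms [1, R, G, B, R^2, G^2, B^2, RG, RB, GB,
    R^3, G^3, B^3, R^2G, R^2B, G^2R, G^2B, B^2R, B^2G, RGB] per colour."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.column_stack([
        np.ones_like(r), r, g, b,
        r**2, g**2, b**2, r*g, r*b, g*b,
        r**3, g**3, b**3,
        r**2*g, r**2*b, g**2*r, g**2*b, b**2*r, b**2*g, r*g*b,
    ])

# The same least-squares fit as for the linear model can then be applied to this
# design matrix, yielding 20 parameters per colorimetric coordinate (60 in total).
```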
Notwithstanding the above, it is possible to lend greater weight to the calibration colours in the vicinity of the selected colour when calculating the model parameters. In the case of the above example of a linear model having 4 parameters, this means that during the linear regression each calibration colour is given a weighting factor based on the distance in the RGB colour space between the calibration colour in question and the selected colour. In the linear regression procedure the following sum of squares is minimized:
∑_i w_i (y_i − ŷ_i)²
wherein w_i is a weighting factor, y_i is the L_i*, a_i*, or b_i* based on spectral measurements, and ŷ_i is the calculated value for L_i*, a_i*, or b_i* based on the RGB to CIELab conversion.
With ŷ_i being equal to c0 + c1R_i + c2G_i + c3B_i (see above), and w_i being equal to ((R_i − R)² + (G_i − G)² + (B_i − B)²)^-2, this sum is as follows:
∑_{i=1}^{n} (L_i* − c0 − c1R_i − c2G_i − c3B_i)² ((R_i − R)² + (G_i − G)² + (B_i − B)²)^-2
∑_{i=1}^{n} (a_i* − d0 − d1R_i − d2G_i − d3B_i)² ((R_i − R)² + (G_i − G)² + (B_i − B)²)^-2
∑_{i=1}^{n} (b_i* − e0 − e1R_i − e2G_i − e3B_i)² ((R_i − R)² + (G_i − G)² + (B_i − B)²)^-2
wherein n is the number of calibration colours and R, G, B are the measured signals of the selected colour.
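A sketch of this distance-weighted regression for the linear model; the weighting exponent follows the reconstructed formula above, and the function and argument names are assumptions:

```python
import numpy as np

def fit_weighted_linear_model(rgb_cal, lab_cal, rgb_selected, eps=1e-6):
    """Fit c0..c3 (and d, e analogously) with weights favouring calibration
    colours close to the selected colour in RGB space."""
    design = np.column_stack([np.ones(len(rgb_cal)), rgb_cal])

    # Weighting factor per calibration colour: (squared RGB distance)^-2.
    d2 = np.sum((np.asarray(rgb_cal, float) - rgb_selected) ** 2, axis=1) + eps
    w = d2 ** -2.0

    # Weighted least squares: scale rows by sqrt(w) and solve an ordinary LSQ problem.
    sw = np.sqrt(w)[:, None]
    params, *_ = np.linalg.lstsq(design * sw, np.asarray(lab_cal, float) * sw, rcond=None)
    return params.T    # (3, 4): one row of parameters for each of L*, a*, b*
```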
Alternatively, it is possible to use the calibration colours in the vicinity of the selected colour for interpolation.
If so desired, grey balancing may be performed on the signals measured for black, white, and grey according to the formula R = G = B = f(L*) or a comparable value for L* in a different colorimetric system. Such grey balancing is described in H.R. Kang, Color Technology for Electronic Imaging Devices, SPIE Optical Engineering Press, 1997, chapter 11.
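Grey balancing could, for instance, be sketched as building a per-channel lookup table so that the measured signals of neutral patches map onto a common target value f(L*); the patch data, the interpolation approach, and the names below are assumptions, not the procedure from the cited reference:

```python
import numpy as np

def grey_balance_luts(grey_signals, grey_targets, levels=256):
    """Build per-channel lookup tables mapping measured signals of neutral
    (black/grey/white) patches onto a common target value f(L*).

    grey_signals: (n, 3) measured R, G, B of the neutral patches
    grey_targets: (n,) target values f(L*) for those patches
    """
    x = np.arange(levels)
    luts = []
    for ch in range(3):
        order = np.argsort(grey_signals[:, ch])
        luts.append(np.interp(x, grey_signals[order, ch], grey_targets[order]))
    return np.stack(luts)          # shape (3, levels)

def apply_grey_balance(img, luts):
    # img: integer image of shape (h, w, 3) with values in [0, levels)
    return np.stack([luts[c][img[..., c]] for c in range(3)], axis=-1)
```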
Using image processing software, such as the computer program Optimas® commercially available from Media Cybernetics, or the program Image ProPlus® available from the same company, individual particles can be identified by recognition of differing luminance relative to the particle's background. These particles can for instance be metallic pigments or clusters of metallic pigments. After identification of the particles, the number of particles and image parameters like particle size, particle shape, the length of the shortest and the longest axis, as well as the R, G, and B values of the particles can be determined by the image processing software. These data can optionally be averaged per strip part, or per larger part if so required.
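As a rough illustration of the same idea, the sketch below thresholds on luminance contrast against a smoothed local background and labels connected regions; the use of scipy.ndimage, the filter size, and the contrast threshold are assumptions, not the commercial packages' algorithm:

```python
import numpy as np
from scipy import ndimage

def find_bright_particles(rgb_img, contrast=30, background_size=31):
    """Identify bright particles (e.g. metallic flakes) as connected regions
    whose luminance exceeds the local background by a given contrast."""
    lum = rgb_img[..., :3].astype(float).mean(axis=-1)
    background = ndimage.uniform_filter(lum, size=background_size)  # smoothed local background
    mask = lum > background + contrast

    labels, n_particles = ndimage.label(mask)
    idx = np.arange(1, n_particles + 1)
    sizes = ndimage.sum(mask, labels, index=idx)                    # area in pixels per particle
    mean_rgb = np.array([ndimage.mean(rgb_img[..., c].astype(float), labels, idx)
                         for c in range(3)]).T                      # average R, G, B per particle
    return n_particles, sizes, mean_rgb
```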
The data determined on the basis of the images can for example be used to search for a coating formulation giving a matching surface coating. To this end, the measured data can be compared with data from a colour formulation database.
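A minimal sketch of such a database search, ranking candidate formulations by a simple CIE76 colour difference averaged over the recorded flop angles; the database layout and names are hypothetical:

```python
import numpy as np

def best_matches(measured_lab, database, top=5):
    """Rank candidate formulations by average colour difference.

    measured_lab: (k, 3) L*, a*, b* of the measured sample at k flop angles
    database: dict mapping formulation name -> (k, 3) L*a*b* at the same angles
    """
    measured_lab = np.asarray(measured_lab, dtype=float)
    scores = {}
    for name, lab in database.items():
        delta_e = np.linalg.norm(np.asarray(lab, float) - measured_lab, axis=1)  # CIE76 per angle
        scores[name] = delta_e.mean()
    return sorted(scores, key=scores.get)[:top]
```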
To enhance the scope of the imaging device, the light source can optionally comprise a set of mirrors. It has been found that using mirrors in a suitable arrangement can enhance the scope to about 90 degrees or even more.
Although the light source may be a permanent light source, a flash light is preferred to minimize the use of energy. If permanent light sources are used, the camera should be set at a suitable exposure time. Suitable light sources are for instance tungsten halogen lamps or xenon lamps.
In a particularly preferred embodiment, the light source comprises a light diffusing housing discharging light via a slit. The longitudinal side of the slit is arranged substantially parallel to the sample surface, whereas the short side is substantially perpendicular to the sample surface. In such an arrangement, a light sensor can be used to control the light output. In a preferred embodiment of such a diffusor, the slit is bordered by a substantially horizontal wall at the inner side of the diffusor. This way, the light intensity at the sample surface is a function of the flop angle. At positions at a smaller angular distance to the diffusor, the light intensity is lower than at positions at larger angular distances.
In a further preferred embodiment of the device according to the present invention, the spectral distribution of the light is varied in one image as a function of the position on the sample, e.g., by using different light sources or using a set of filters, a grating, or a prism. This enhances the number of independent measurement data and thus improves colour accuracy. This can be done for example by varying the light from the light source or by varying the spectral distribution of the light just before it enters the imaging device. Preferably, the spectral distribution of the illumination is varied perpendicular to the variation of optical geometry.
To eliminate any effect of environmental light, the device according to the invention preferably comprises a housing.
The present invention includes a method of surface evaluation as described above wherein, in general, the recorded light interaction is the light reflection of a sample. However, if the sample is transparent or semi-transparent, the recorded light interaction can be light transmission. In that case, the sample is positioned between the imaging device and the light source.
Flat samples can be used. However, if so required, curved samples may also be suitable for the examination of flop behaviour.
The invention is further described and illustrated by the drawings. In the drawings:
Fig. 1: shows a schematic overview of a recording arrangement according to the invention;
Fig. 2: shows a recording by the arrangement of Figure 1;
Fig. 3: shows schematically an alternative arrangement according to the invention;
Fig. 4: shows a plot of the flop angle as a function of the position, as recorded by the arrangement of Figure 3;
Fig. 5: shows a third alternative arrangement according to the invention;
Fig. 6: shows a fourth alternative embodiment of the invention;
Fig. 7: shows the change in the filtered wavelength range over a sample imaged in a device according to Figure 6;
Fig. 8: shows a sample with a parallel reference sample for gloss measurement.
Figure 1 shows an arrangement 1 according to the present invention, comprising a light source 2 and a CCD camera 3 as a recording device, having a viewing angle α ranging from a first outer end observation direction 4 closest to the light source to a second outer end observation direction 5. A coated sample 6 is located under the camera 3. The light source 2 is a line source parallel to the sample surface. The light source 2 is located outside the direct scope of the CCD camera 3. The line between the light source 2 and the point where the observation direction 4 meets the sample 6 defines a first illumination direction 7, which is reflected by the sample 6 in a direction defined as the first specular direction 8. Similarly, the line between the light source 2 and the point where the observation direction 5 meets the sample 6 defines a second illumination direction 9, which is reflected by the sample 6 in a direction defined as the second specular direction 10. In the drawing, the outer flop angle Θ1 is the angle between the first observation direction 4 and the first specular direction 8, whereas the outer flop angle Θ2 is the angle between the second observation direction 5 and the second specular direction 10. The angular range between Θ1 and Θ2 may span up to about 90 degrees.
Figure 2 shows a recorded image recorded by the arrangement of Figure 1. It is the image of a sample coated with a metallic paint. The figure shows how brightness varies with the flop angle. The image in Figure 2 also shows a change in coarseness over the length of the sample, as would also be experienced by the human eye.
Figure 3 shows an alternative arrangement according to the invention, where the flop angle scope is enhanced by the use of a mirror 11. The mirror 11 is arranged in such a way that it reflects the sample part right next to the outer end of the camera's viewing angle scope most distant from the light source 2. In the image observed by the camera 3, the scope closest to the light source 2 is replaced by an extension of the scope on the other side. From right to left, the recording shows the scope from Θ3 to Θ5 and subsequently the scope from Θ4 to Θ2. The scope from Θ1 to Θ4 is not visible anymore on the recording.
Figure 4 shows the flop angle as a function of the position in an arrangement similar to the arrangement of Figure 3, position 0 being right under the camera. The mirrored part of the sample overlaps the part from about 20 mm to 25 mm.
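For an arrangement like the one in Figure 1, a flop angle profile of this kind can be computed from the camera and light source positions; the sketch below assumes a flat, horizontal sample and purely hypothetical coordinates, and reuses the same specular-reflection construction as before:

```python
import numpy as np

def flop_angle_profile(x_positions, camera_pos, light_pos):
    """Flop angle (degrees) for points (x, 0) on a flat horizontal sample,
    with camera and light source positions given as (x, z) coordinates."""
    angles = []
    for x in x_positions:
        obs = np.array([camera_pos[0] - x, camera_pos[1]])   # towards the camera
        ill = np.array([light_pos[0] - x, light_pos[1]])     # towards the light source
        spec = np.array([-ill[0], ill[1]])                    # illumination mirrored about the normal
        cos_t = np.dot(obs, spec) / (np.linalg.norm(obs) * np.linalg.norm(spec))
        angles.append(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))
    return np.array(angles)

# Hypothetical geometry: camera 300 mm above the sample centre,
# line light source 200 mm to the side at 150 mm height.
profile = flop_angle_profile(np.linspace(-50, 50, 11),
                             camera_pos=(0, 300), light_pos=(-200, 150))
```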
Optionally, the sample can be recorded in two (or even more) separate parallel strips, one with a mirror enhanced scope and the other without. This way, the full enlarged scope ranging from Θ1 to Θ2 and the scope ranging from Θ5 to Θ3 can be recorded. If Θ5 is equal to Θ2, then a closed range from Θ1 to Θ3 is covered.
Figure 5 shows an arrangement similar to the arrangement of Figure 1 with a camera 3 right above a sample 6 where a light source 12 comprises a standard
flash light 13, available under the trademark Metz 45CT-1. The flash light 13 comprises a transparent side 14. At the transparent side 14, the flash light is connected to the flat upper side 15 of a diffusor 16 comprising a semi-cylindrical part 17. This flat side 15 is open where it is in connection with the transparent side 14 of the flash light 13. The inner side of the diffusor 16 is covered with a white coating. Where it does not coincide with the transparent side 14 of the flash light 13, the flat side 15 is closed by a horizontal wall 18, which on its inside is provided with a white coating. The edge between the outer end of the horizontal wall 18 and the semi-cylindrical part 17 is provided with a vertical slit 19 extending over the width of the diffusor 16.
When the flash light 13 flashes, the light is diffused by the reflective coatings at the inner side of the diffusor 16. Part of the light is reflected by the reflective layer of the horizontal wall 18 through the slit 19 onto the part of the sample 6 to be imaged. As can be seen in Figure 5, at the outer viewing direction of the camera's viewing angle scope closest to the diffusor 16, the sample 6 is illuminated by a considerably smaller part of the reflective horizontal wall 18 than at the camera's outer viewing direction distant from the diffusor 16. This way, the light density is a function of the angular distance from the light source. The function can be varied with the orientation of the slit with respect to the sample surface.
A glass fibre cable 20 connects the space enclosing the flash light 13 to a light sensor 21 controlling the time interval during which the flash light 13 flashes. Part of the light diffused in the diffusor 16 escapes via the glass fibre cable 20 to the light sensor 21. The quantity of light passing through the glass fibre cable 20 is measured by the light sensor 21. When a pre-defined quantity of light has passed through the glass fibre cable 20, the sensor 21 stops the flash light 13. This way, it is ensured that every flash gives exactly the same quantity of light.
Figure 6 shows a further alternative embodiment of a device according to the invention, comprising an imaging device 3 and a light source 2. A sample 6 is located under the imaging device 3. A set of filters and a grating or a prism 24 are positioned between the light source 2 and the sample 6. The spectral distribution of the illumination is varied in one image as a function of the position on the sample. Figure 7 shows the result of the preferred embodiment where the spectral distribution of the illumination is varied perpendicular to the variation of optical geometry.
Figure 8 shows that the gloss behaviour of a sample can be characterized as a function of optical geometry. This particular example shows in one image the difference between a high gloss sample 25 and a low gloss sample 26.
Claims (13)
1. Device for recording the visual properties of a surface, comprising an imaging device for recording light interaction with a surface, a light source, and a sample area for positioning a sample with a surface to be examined, characterized in that the imaging device, the light source, and the sample area are arranged in such a way that in one image at least one of the surface properties is recordable as a function of a continuous range of angles between the illumination direction and the observation direction.
2. Device according to claim 1, characterized in that the camera, the light source, and the sample area are positioned in a triangular arrangement.
3. Device according to claim 1 or 2, characterized in that the flop angle is more than 40 degrees, preferably more than 50 degrees.
4. Device according to any one of the preceding claims, characterized in that the light distribution by the light source varies over the scope of the imaging device.
5. Device according to any one of the preceding claims, characterized in that the light source is a line source.
6. Device according to any one of the preceding claims, characterized in that the imaging device is a CCD camera.
7. Device according to any one of the preceding claims, characterized in that the device comprises a set of mirrors enhancing the scope of the imaging device.
8. Device according to any one of the preceding claims, characterized in that the light source is a flash light comprising a light output control.
9. Device according to any one of the preceding claims, characterized in that the device comprises a set of filters and a grating or a prism for variation of the spectral distribution of the illumination or the spectral distribution of the light entering the recording device as a function of the position on the sample.
10. Method for recording the visual properties of a surface, using an imaging device for recording light interaction with a surface, a light source, and a sample area for positioning a sample with a surface to be examined, characterized in that the imaging device, the light source, and the sample area are arranged in such a way that in one image at least one of the surface properties is recorded as a function of a continuous range of angles between the illumination direction and the observation direction.
11. Method according to claim 10, characterized in that the visual property is the gloss of the sample.
12. Method according to claim 10 or 11, characterized in that the recorded light interaction is the light reflection of a sample.
13. Method according to claim 12, characterized in that the recorded light interaction is the light transmission of a semi-transparent sample.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP01201276.1 | 2001-04-06 | | |
| EP01201276 | 2001-04-06 | | |
| PCT/EP2002/003682 (WO2002082063A1) | 2001-04-06 | 2002-03-28 | Method and device for surface evaluation |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| AU2002338353A1 | 2003-04-10 |
| AU2002338353B2 | 2006-05-25 |
Family
ID=8180117
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU2002338353A (ceased) AU2002338353B2 (en) | Method and device for surface evaluation | | 2002-03-28 |
Country Status (9)
Country | Link |
---|---|
EP (1) | EP1373867A1 (en) |
JP (1) | JP3933581B2 (en) |
KR (1) | KR100875806B1 (en) |
CN (1) | CN100403011C (en) |
AU (1) | AU2002338353B2 (en) |
BR (1) | BR0208660A (en) |
RU (1) | RU2292037C2 (en) |
WO (1) | WO2002082063A1 (en) |
ZA (1) | ZA200307712B (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0625890D0 (en) * | 2006-12-23 | 2007-02-07 | Colormatrix Holdings Inc | Polymeric materials |
US9025153B2 (en) * | 2011-11-16 | 2015-05-05 | Axalta Coating Systems Ip Co., Llc | Process for predicting degree of mottling in coating compositions by wet color measurement |
JP5475057B2 (en) | 2012-04-20 | 2014-04-16 | 株式会社 オフィス・カラーサイエンス | Variable angle spectroscopic imaging measurement method and apparatus |
TWI592652B (en) * | 2012-05-09 | 2017-07-21 | 希捷科技有限公司 | Apparatus and device for surface features mapping |
CN103674903A (en) * | 2013-12-31 | 2014-03-26 | 爱彼思(苏州)自动化科技有限公司 | Non-contact vancometer |
US9880098B2 (en) * | 2014-10-28 | 2018-01-30 | Axalta Coatings Systems Ip Co., Llc | Method and systems for quantifying differences between colored surfaces |
US9678018B2 (en) | 2015-03-30 | 2017-06-13 | Gemological Institute Of America Inc. (Gia) | Apparatus and method for assessing optical quality of gemstones |
CN106908149A (en) * | 2017-04-11 | 2017-06-30 | 上海电机学院 | A kind of robot object color identifying system and method |
KR102047206B1 (en) * | 2018-10-31 | 2019-11-20 | 한국과학기술원 | Swcc-hyperspectral cam test method and apparatus for measuring reflectance by volumetric water content |
JP7341835B2 (en) * | 2019-10-09 | 2023-09-11 | 株式会社日立製作所 | Powder mixing system and powder mixing method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2588297B2 (en) * | 1990-06-20 | 1997-03-05 | 日産自動車株式会社 | Evaluation method for sharpness of painted surface |
US5078496A (en) * | 1990-08-14 | 1992-01-07 | Autospect, Inc. | Machine vision surface characterization system |
US5640237A (en) * | 1995-08-29 | 1997-06-17 | Kla Instruments Corporation | Method and apparatus for detecting non-uniformities in reflective surafaces |
JP3312849B2 (en) * | 1996-06-25 | 2002-08-12 | 松下電工株式会社 | Defect detection method for object surface |
JP4527205B2 (en) * | 1997-03-31 | 2010-08-18 | リアル・タイム・メトロジー,インコーポレーテッド | Optical inspection module and method for detecting particles and defects on a substrate in an integrated process tool |
2002
- 2002-03-28 JP JP2002579783A patent/JP3933581B2/en not_active Expired - Fee Related
- 2002-03-28 EP EP02742872A patent/EP1373867A1/en not_active Withdrawn
- 2002-03-28 CN CNB028079019A patent/CN100403011C/en not_active Expired - Fee Related
- 2002-03-28 RU RU2003132476/28A patent/RU2292037C2/en active
- 2002-03-28 AU AU2002338353A patent/AU2002338353B2/en not_active Ceased
- 2002-03-28 BR BR0208660-3A patent/BR0208660A/en not_active IP Right Cessation
- 2002-03-28 KR KR1020037012945A patent/KR100875806B1/en not_active IP Right Cessation
- 2002-03-28 WO PCT/EP2002/003682 patent/WO2002082063A1/en active Application Filing

2003
- 2003-10-02 ZA ZA200307712A patent/ZA200307712B/en unknown
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6039109B2 (en) | Coloring inspection apparatus and coloring inspection method | |
KR101162078B1 (en) | Method for matching paint | |
Casini et al. | Image spectroscopy mapping technique for noninvasive analysis of paintings | |
US5850472A (en) | Colorimetric imaging system for measuring color and appearance | |
JP3809838B2 (en) | Image highlight correction method using image source specific HSV color coordinates, image highlight correction program, and image acquisition system | |
Liang et al. | A new multi-spectral imaging system for examining paintings | |
CN102124723B (en) | Method and device for the true-to-original representation of colors on screens | |
JP6371237B2 (en) | Coloring evaluation apparatus and coloring evaluation method | |
JP5640812B2 (en) | Paint color evaluation method | |
US7027165B2 (en) | Method and device for surface evaluation | |
AU2002338353B2 (en) | Method and device for surface evaluation | |
JP2020012668A (en) | Evaluation device, measurement device, evaluation method and evaluation program | |
AU2002338353A1 (en) | Method and device for surface evaluation | |
JP7099474B2 (en) | Multi-angle colorimeter | |
JP2016194449A (en) | Coloring checkup device, and coloring checkup method | |
Maczkowski et al. | Integrated method for three-dimensional shape and multispectral color measurement | |
JP7362121B2 (en) | Color/texture measuring device and method | |
WO2019097826A1 (en) | Multi-angle colorimeter | |
Fiorentin et al. | A multispectral imaging device for monitoring of colour in art works | |
JP2017153054A (en) | Coloring inspection apparatus and coloring inspection method | |
Hunt et al. | Imaging tristimulus colorimeter for the evaluation of color in printed textiles | |
KR100255500B1 (en) | Apparatus and method for measurement and decision of color matching for 3d objects | |
Rich | Critical parameters in the measurement of the color of nonimpact printing | |
Rossi et al. | Towards image-based measurement of perceived lightness applied to paintings lighting | |
Paviotti | Acquisition and processing of multispectral data for texturing 3D models |