Recent Trends and Advances in Color and Spectral Sensors: 2nd Edition

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensing and Imaging".

Deadline for manuscript submissions: 30 April 2025 | Viewed by 1120

Special Issue Editors


Guest Editor
Research Center of Graphic Communication, Printing and Packaging, Wuhan University, Wuhan 430079, China
Interests: spectral printing; ICC color management; printing quality control; fine art reproduction; chromatic adaptation; LED lighting design

Guest Editor
Research Institute of Photonics, Dalian Polytechnic University, Dalian 116038, China
Interests: color imaging

Guest Editor
Faculty of Light Industry, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
Interests: spectral imaging

Special Issue Information

Dear Colleagues,

We are pleased to announce the second edition of this topic. You can view the original Special Issue here: https://www.mdpi.com/journal/sensors/special_issues/P00N2L5Z5Z. Color and spectral sensors are key technologies for color acquisition and reproduction in applications such as image visualization, cultural heritage, color recognition and inspection, medical diagnosis, and remote sensing. Color acquisition and reproduction have not yet been comprehensively investigated and remain challenging problems in the development of general, accurate color-sensing systems. Furthermore, test benchmark design, experimental guidelines, visual perception, and numerical analysis models for evaluating the performance of color and spectral sensors are also key to achieving accurate, material-aware color. At the same time, new areas in the study of applied color and spectral sensors have emerged over the past decade, driven by advances in cameras, displays, smartphones, VR, AR, MR, and smart lighting.

Manuscripts should contain both theoretical and practical/experimental results. Potential topics include but are not limited to the following: color sensors, spectral sensors, image sensors, hyper/multispectral imaging, imaging spectroscopy, color/spectral filter arrays, radiometric calibration, calibration site design, CCD/CMOS, spectral recovery, color perception, color appearance models, and image vision.

Dr. Qiang Liu
Dr. Jean-Baptiste Thomas
Dr. Xufen Xie
Dr. Guangyuan Wu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • color sensors
  • spectral sensors
  • imaging spectroscopy
  • spectral recovery
  • color perception
  • color appearance model
  • color reproduction
  • image vision

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

10 pages, 1474 KiB  
Communication
Comparative Analysis of Low-Cost Portable Spectrophotometers for Colorimetric Accuracy on the RAL Design System Plus Color Calibration Target
by Jaša Samec, Eva Štruc, Inese Berzina, Peter Naglič and Blaž Cugmas
Sensors 2024, 24(24), 8208; https://doi.org/10.3390/s24248208 - 23 Dec 2024
Viewed by 221
Abstract
Novel low-cost portable spectrophotometers could be an alternative to traditional spectrophotometers and calibrated RGB cameras by offering lower prices and convenient measurements but retaining high colorimetric accuracy. This study evaluated the colorimetric accuracy of low-cost, portable spectrophotometers on the established color calibration target—RAL Design System Plus (RAL+). Four spectrophotometers with a listed price between USD 100–1200 (Nix Spectro 2, Spectro 1 Pro, ColorReader, and Pico) and a smartphone RGB camera were tested on a representative subset of 183 RAL+ colors. Key performance metrics included the devices’ ability to match and measure RAL+ colors in the CIELAB color space using the color difference CIEDE2000 ΔE. The results showed that Nix Spectro 2 had the best performance, matching 99% of RAL+ colors with an estimated ΔE of 0.5–1.05. Spectro 1 Pro and ColorReader matched approximately 85% of colors with ΔE values between 1.07 and 1.39, while Pico and the Asus 8 smartphone matched 54–77% of colors, with ΔE of around 1.85. Our findings showed that low-cost, portable spectrophotometers offered excellent colorimetric measurements. They mostly outperformed existing RGB camera-based colorimetric systems, making them valuable tools in science and industry.
(This article belongs to the Special Issue Recent Trends and Advances in Color and Spectral Sensors: 2nd Edition)
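
The core metric in this study is the CIEDE2000 color difference (ΔE00) between a device reading and a reference CIELAB value. As a rough, self-contained illustration of how such a comparison is computed, the Python sketch below implements the standard CIEDE2000 formula; the reference and measured Lab triplets at the end are hypothetical and are not taken from the paper.

# Minimal sketch: CIEDE2000 colour difference between two CIELAB triplets.
# The example patch values at the bottom are hypothetical, not the paper's data.
import math

def delta_e_ciede2000(lab1, lab2, kL=1.0, kC=1.0, kH=1.0):
    """Return the CIEDE2000 colour difference between two CIELAB triplets."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2

    # Step 1: chroma-dependent rescaling of a*
    C1, C2 = math.hypot(a1, b1), math.hypot(a2, b2)
    C_bar = (C1 + C2) / 2.0
    G = 0.5 * (1.0 - math.sqrt(C_bar ** 7 / (C_bar ** 7 + 25.0 ** 7)))
    a1p, a2p = (1.0 + G) * a1, (1.0 + G) * a2
    C1p, C2p = math.hypot(a1p, b1), math.hypot(a2p, b2)

    # Hue angles in degrees, folded into [0, 360)
    h1p = math.degrees(math.atan2(b1, a1p)) % 360.0
    h2p = math.degrees(math.atan2(b2, a2p)) % 360.0

    # Step 2: lightness, chroma, and hue differences
    dLp = L2 - L1
    dCp = C2p - C1p
    if C1p * C2p == 0.0:
        dhp = 0.0
    else:
        dhp = h2p - h1p
        if dhp > 180.0:
            dhp -= 360.0
        elif dhp < -180.0:
            dhp += 360.0
    dHp = 2.0 * math.sqrt(C1p * C2p) * math.sin(math.radians(dhp) / 2.0)

    # Step 3: weighting functions and rotation term
    L_bar_p = (L1 + L2) / 2.0
    C_bar_p = (C1p + C2p) / 2.0
    if C1p * C2p == 0.0:
        h_bar_p = h1p + h2p
    elif abs(h1p - h2p) <= 180.0:
        h_bar_p = (h1p + h2p) / 2.0
    elif h1p + h2p < 360.0:
        h_bar_p = (h1p + h2p + 360.0) / 2.0
    else:
        h_bar_p = (h1p + h2p - 360.0) / 2.0

    T = (1.0 - 0.17 * math.cos(math.radians(h_bar_p - 30.0))
         + 0.24 * math.cos(math.radians(2.0 * h_bar_p))
         + 0.32 * math.cos(math.radians(3.0 * h_bar_p + 6.0))
         - 0.20 * math.cos(math.radians(4.0 * h_bar_p - 63.0)))
    d_theta = 30.0 * math.exp(-(((h_bar_p - 275.0) / 25.0) ** 2))
    R_C = 2.0 * math.sqrt(C_bar_p ** 7 / (C_bar_p ** 7 + 25.0 ** 7))
    S_L = 1.0 + 0.015 * (L_bar_p - 50.0) ** 2 / math.sqrt(20.0 + (L_bar_p - 50.0) ** 2)
    S_C = 1.0 + 0.045 * C_bar_p
    S_H = 1.0 + 0.015 * C_bar_p * T
    R_T = -math.sin(math.radians(2.0 * d_theta)) * R_C

    return math.sqrt((dLp / (kL * S_L)) ** 2
                     + (dCp / (kC * S_C)) ** 2
                     + (dHp / (kH * S_H)) ** 2
                     + R_T * (dCp / (kC * S_C)) * (dHp / (kH * S_H)))

# Hypothetical example: reference RAL+ patch vs. a hand-held device reading
reference = (52.1, 42.5, 21.3)
measured = (51.6, 43.0, 20.8)
print(f"CIEDE2000 dE = {delta_e_ciede2000(reference, measured):.2f}")

For a whole calibration target, the same function would be applied patch by patch and the results summarized, for example as a mean or percentile ΔE00 per device.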
Figures

Figure 1. Color calibration target (CCT) RAL Design System Plus (©RAL gGmbH, Bonn, Germany; reproduced with permission from RAL gGmbH).

Figure 2. Spectrophotometers: (a) Nix Spectro 2, (b) Spectro 1 Pro, (c) ColorReader, and (d) Pico ((a) ©Nix Sensor Ltd., Hamilton, ON, Canada; (b) ©Variable Inc., Chattanooga, TN, USA; (c) ©Datacolor GmbH, Marl, Germany; (d) ©Palette Pty Ltd., Melbourne, Victoria, Australia; images are reproduced with permissions from Nix Sensor Ltd., Variable Inc., Datacolor GmbH, and Palette Pty Ltd.).
20 pages, 14473 KiB  
Article
Digitizing the Appearance of 3D Printing Materials Using a Spectrophotometer
by Alina Pranovich, Morten Rieger Hannemose, Janus Nørtoft Jensen, Duc Minh Tran, Henrik Aanæs, Sasan Gooran, Daniel Nyström and Jeppe Revall Frisvad
Sensors 2024, 24(21), 7025; https://doi.org/10.3390/s24217025 - 31 Oct 2024
Viewed by 593
Abstract
The conventional approach to appearance prediction for 3D printed parts is to print a thin slab of material and measure its reflectance or transmittance with a spectrophotometer. Reflectance works for opaque printing materials. Transmittance works for transparent printing materials. However, the conventional approach does not work convincingly for translucent materials. For these, we need to separate scattering and absorption. We suggest printing a collection of thin slabs of different thicknesses and using these in a spectrophotometer to obtain the scattering and absorption properties of the material. A model is fitted to the measured data in order to estimate the scattering and absorption properties. To this end, we compare the use of Monte Carlo light transport simulation and the use of an analytic model that we developed from the theory of radiative transfer in plane-parallel media. We assess the predictive capabilities of our method through a multispectral photo-render comparison based on the estimated optical properties.
(This article belongs to the Special Issue Recent Trends and Advances in Color and Spectral Sensors: 2nd Edition)
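
The central idea in this abstract is that a single slab measurement cannot separate absorption from scattering, but measurements at several thicknesses can. The sketch below illustrates that idea using the classical Kubelka-Munk two-flux slab transmittance as a simple stand-in for the authors' analytic radiative-transfer model and Monte Carlo simulation; the thicknesses and transmittance values are synthetic, not the paper's data, and the fitted K and S are Kubelka-Munk coefficients rather than the paper's σ_a and σ_s.

# Minimal sketch: fit absorption (K) and scattering (S) to transmittances of slabs
# of several thicknesses, using the Kubelka-Munk two-flux model as a stand-in.
import numpy as np
from scipy.optimize import least_squares

def km_transmittance(thickness_mm, K, S):
    """Kubelka-Munk total transmittance of a slab with absorption K and
    scattering S (both per mm) at the given thickness."""
    a = 1.0 + K / S
    b = np.sqrt(a * a - 1.0)
    bSX = b * S * thickness_mm
    return b / (a * np.sinh(bSX) + b * np.cosh(bSX))

# Hypothetical spectrophotometer data at one wavelength band
thicknesses = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])       # mm
measured_T = np.array([0.82, 0.68, 0.48, 0.35, 0.26, 0.20])  # synthetic values

def residuals(params):
    K, S = params
    return km_transmittance(thicknesses, K, S) - measured_T

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=([1e-6, 1e-6], [np.inf, np.inf]))
K_fit, S_fit = fit.x
print(f"absorption K = {K_fit:.3f} /mm, scattering S = {S_fit:.3f} /mm")

Repeating such a fit for each wavelength band yields spectral absorption and scattering curves of the kind the paper reports for its estimated optical properties.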
Figures

Figure 1. Using thin 3D printed slabs of different thicknesses, we can estimate the spectral optical properties of 3D printing materials. After this estimation, the photorealism of 3D renderings based on these optical properties is, however, unknown. We suggest validating the correctness of acquired optical properties by comparing a multispectral photograph of a 3D printed object with corresponding renderings. In this example, we show differences (green for positive, red for negative) per wavelength and a FLIP error map [8] (values in the colourbar) for the reconstructed RGB image. The white numbers are the root mean squared error for the spectral bands and the average FLIP error for sRGB.

Figure 2. Left: spectrophotometer measurement of diffuse transmission and reflection. For the transmission measurement, we have a diffuse light source with a circular cross-section (9 mm diameter). For the reflection mode, the sample is illuminated by a collimated light source with a circular cross-section and 9 mm diameter (at an angle of incidence of 45°). The detection aperture is in both cases an 8 mm wide circle with an acceptance angle of (0 ± 2)°. Right: VideometerLab instrument for the multispectral imaging of a sample that is diffusely illuminated due to the use of an integrating sphere. The sketch of this instrument is courtesy of Videometer, www.videometer.com (accessed on 27 October 2024).

Figure 3. PolyJet 3D printed objects with different material mixtures of varying translucency (blending the red primary with white material).

Figure 4. Schematic of our radiometric validation procedure. Note that the printed object exists both physically and digitally, and the pose estimation of the digital twin is based on the captured multispectral image of the physical object and calibration of the camera.

Figure 5. Appearance maps generated with Monte Carlo path tracing simulations for layer thicknesses of 0.1 mm (left column) and 1.0 mm (right column). The surface roughness was set to 0.03, and the reflectances of the white and black backgrounds were set to 1 and 0, respectively, in the top row but 0.93 and 0.03 in the bottom row.

Figure 6. Appearance maps plotted using our model for layer thicknesses of 0.1 mm (left) and 1.0 mm (right). Roughness was set to 0.016, and the reflectances of the white and black backgrounds were set to 0.93 and 0.03, respectively. This plot illustrates the single-scattering limitation of our model.

Figure 7. Fit of our model (solid curves) to the spectrophotometer measurements (dots). The dashed curves were generated using Monte Carlo simulation with parameters estimated using appearance map interpolation.

Figure 8. Estimated spectral absorption coefficients (σ_a, left) and scattering coefficients (σ_s, right) of the Red Vero material when using our model (solid curves) and appearance maps averaged over 10 values for sample thicknesses of 0.1 to 1 mm (dashed curves).

Figure 9. Mixtures of primary materials for samples of 1 mm thickness as measured (top) and reconstructed using our estimated optical properties (bottom). Colors are as observed on a white background. From left to right: 1:1 mixtures of Red and Blue, Red and Yellow, and Yellow and Blue.

Figure 10. The dragon objects from Figure 3 were PolyJet 3D printed using different mixtures of the materials Vero Red and Vero Pure White. The references are diffusely illuminated multispectral images converted to sRGB. We show images rendered based on our radiometric validation procedure (Section 5) using optical properties estimated with appearance maps [4] and our model (Section 4), and we include the FLIP error maps [8] and average FLIP errors of the rendered images (an average of less than 0.15 seems a reasonably good prediction of the appearance of the printed object).

Figure 11. sRGB reconstruction of an unfiltered multispectral Videometer image (left) and fluorescence activated by a source with a wavelength of 365 nm (right).

Figure 12. Predicted appearance of PolyJet 3D printed objects with different material mixtures of varying translucency (blending Vero Red and Vero Pure White). For the image at the top, the scene was set up to be qualitatively comparable to the photo in Figure 3. In the middle, the same scene is seen from a different view. At the bottom, we used a different lighting environment. The lighting environments are shown as small inserts in the lower right corner (the top and middle used the same).