Search Results (7)

Search Parameters:
Keywords = machine-readable patterns

11 pages, 5810 KiB  
Article
Reading Dye-Based Colorimetric Inks: Achieving Color Consistency Using Color QR Codes
by Ismael Benito-Altamirano, Laura Engel, Ferran Crugeira, Miriam Marchena, Jürgen Wöllenstein, Joan Daniel Prades and Cristian Fàbrega
Chemosensors 2024, 12(12), 260; https://doi.org/10.3390/chemosensors12120260 - 13 Dec 2024
Viewed by 307
Abstract
Color consistency when reading colorimetric sensors is a key factor for this technology. Here, we demonstrate how machine-readable patterns, such as QR codes, can solve the problem. We present our approach of using back-compatible color QR codes as colorimetric sensors: common QR codes that also embed a set of hundreds of color references alongside colorimetric indicators. The method allows locating the colorimetric sensor within the captured scene and performing automated color correction to ensure color consistency regardless of the hardware used. To demonstrate this, a CO2-sensitive colorimetric indicator, formulated for Modified Atmosphere Packaging (MAP) applications, was screen-printed on a paper-based substrate. To verify the method, the sensors were exposed to several environmental conditions (both gas composition and light conditions), and images were captured with an 8-megapixel digital camera sensor, similar to those used in smartphones. Our results show that the sensors have a relative error of 9% when exposed to a CO2 concentration of 20%. This is a good result for low-cost disposable sensors that are not intended for permanent use. However, as soon as light conditions change (2500–6500 K), this error increases up to ϵ20 = 440% (relative error at 20% CO2 concentration), rendering the sensors unusable. Within this work, we demonstrate that our color QR codes can reduce the relative error to ϵ20 = 14%. Furthermore, we show that the most common color correction, white balance, is not sufficient to address the color consistency issue, resulting in a relative error of ϵ20 = 90%. Full article
(This article belongs to the Special Issue Novel Gas Sensing Approaches: From Fabrication to Application)
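The core idea in the abstract — use the known reference colors embedded in the QR code to undo the camera's color cast — can be sketched as a least-squares color-correction fit. This is an illustrative toy, not the authors' actual pipeline: the reference values and the simple 3×3 linear model are assumptions.

```python
import numpy as np

def fit_color_correction(captured, reference):
    """Least-squares fit of a 3x3 matrix M such that captured @ M ~ reference.

    `captured` are RGB values of the reference patches as seen by the camera;
    `reference` are their known printed (D65) values.
    """
    M, *_ = np.linalg.lstsq(np.asarray(captured, float),
                            np.asarray(reference, float), rcond=None)
    return M

def correct(M, rgb):
    """Apply the fitted correction to any measured RGB triplet."""
    return np.asarray(rgb, float) @ M

# Toy data: a pure per-channel gain (e.g. a warm light cast) is recovered exactly.
reference = np.eye(3) * 200.0             # three known reference patches
captured = reference * [1.2, 1.0, 0.7]    # camera sees a color cast
M = fit_color_correction(captured, reference)
print(np.round(correct(M, [120.0, 100.0, 70.0])))  # cast removed -> [100. 100. 100.]
```

With hundreds of reference patches, as in the paper, the same least-squares fit simply becomes overdetermined and more robust to noise.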
Figure 1
<p>A Back-compatible Color QR Code [<a href="#B18-chemosensors-12-00260" class="html-bibr">18</a>] for the evaluation of colorimetric indicators. This QR code is read by commercial scanners and should display the URL: <a href="http://c-s.is/#38RmtGVV6RQSf" target="_blank">c-s.is/#38RmtGVV6RQSf</a> (accessed on 12 December 2024). It includes up to 125 reference colors, and the colorimetric dye is printed above the lower finder pattern, represented here as seven purple modules.</p>
Figure 2
<p>The structure of the color QR code from <a href="#chemosensors-12-00260-f001" class="html-fig">Figure 1</a>: (<b>a</b>,<b>b</b>) Possible sensor ink placements. (<b>a</b>) Large sensor outside the QR code. (<b>b</b>) Smaller form factors (<math display="inline"><semantics> <mrow> <mn>3</mn> <mo>×</mo> <mn>2</mn> </mrow> </semantics></math>, <math display="inline"><semantics> <mrow> <mn>1</mn> <mo>×</mo> <mn>1</mn> </mrow> </semantics></math>, …) within the QR code. (<b>c</b>) Color references and how they are spread over the QR code area. (<b>d</b>) Whole sensor layout of the gas-sensitive color QR code.</p>
Figure 3
<p>The sensor changes from purple to yellow when exposed to <math display="inline"><semantics> <mrow> <mi>C</mi> <msub> <mi>O</mi> <mn>2</mn> </msub> </mrow> </semantics></math>.</p>
Figure 4
<p>A mass-flow controller station, a capture station, and a user-access computer. The mass-flow controller station supplies modified atmospheres to a chamber in which the gas sensors are placed. The capture station takes time-lapse images of the sensor through an optical window of the chamber under controlled light settings. Finally, the user computer presents a web page interface to operate the system.</p>
Figure 5
<p>A printed sensor featuring a color QR code and two different colorimetric indicators (<math display="inline"><semantics> <mrow> <mi>C</mi> <msub> <mi>O</mi> <mn>2</mn> </msub> </mrow> </semantics></math> indicator above, <math display="inline"><semantics> <mrow> <mi>N</mi> <msub> <mi>H</mi> <mn>3</mn> </msub> </mrow> </semantics></math> below, which was not used in this experiment) inside the sensor chamber. The image shows the sensor before exposure to the target gas under three different light conditions: 2500 K (<b>left</b>), 4500 K (<b>middle</b>) and 6500 K (<b>right</b>).</p>
Figure 6
<p>Response of the green channel under nine different light conditions (2500 K to 6500 K) with all pulses overlapped in the same time frame and after correction of the measured values using a color correction method. Each target gas concentration (20%, 30%, 35%, 40%, 50%) was exposed three times under the respective light condition, resulting in a total of 27 pulses for every gas concentration.</p>
Figure 7
<p><b>Up</b>: Fitting the responses to a model without performing any color correction (NONE), the worst-case scenario; different colors of the data points indicate different illumination conditions, and different transparency indicates different repetition samples. <b>Down</b>: Fitting the responses to a model for the ground-truth responses (PERF), the best-case scenario, where all color corrections recover the D65 color of the sensor perfectly.</p>
3 pages, 414 KiB  
Abstract
The Application of Back-Compatible Color QR Codes to Colorimetric Sensors
by Ismael Benito-Altamirano, Ferran Crugeira, Míriam Marchena and J. Daniel Prades
Proceedings 2024, 97(1), 3; https://doi.org/10.3390/proceedings2024097003 - 13 Mar 2024
Viewed by 726
Abstract
We present the application of QR codes as carriers for colorimetric dyes. This refined version of machine-readable patterns applied to colorimetric sensing maintains the data from the QR code standard in a back-compatible way: the QR code is still able to encode digital data (readable with a standard QR code decoder) alongside a hundred colorimetric references and the dyes. We also discuss in detail the effectiveness of different color correction methods in attaining color accuracy levels suited for sensing via colorimetry. Moreover, we illustrate how color correction techniques can take advantage of having hundreds of color references, with an exemplary case of a printed CO2 sensor used to monitor the integrity of modified atmosphere packaging (MAP). Full article
(This article belongs to the Proceedings of XXXV EUROSENSORS Conference)
Figure 1
<p>Evolution of machine-readable patterns embedding colorimetric dyes, from 2018 to 2023. (<b>a</b>) Our first proposal for such patterns, presented at Eurosensors in 2018 [<a href="#B1-proceedings-97-00003" class="html-bibr">1</a>]; (<b>b</b>) our second attempt to fabricate the patterns [<a href="#B2-proceedings-97-00003" class="html-bibr">2</a>]; (<b>c</b>) the proposal of Escobedo et al. [<a href="#B3-proceedings-97-00003" class="html-bibr">3</a>] to embed sensors in a pattern with digital data; and (<b>d</b>) our proposal for a similar concept, but maximizing back-compatibility [<a href="#B4-proceedings-97-00003" class="html-bibr">4</a>].</p>
28 pages, 5701 KiB  
Article
Agile Methodology for the Standardization of Engineering Requirements Using Large Language Models
by Archana Tikayat Ray, Bjorn F. Cole, Olivia J. Pinon Fischer, Anirudh Prabhakara Bhat, Ryan T. White and Dimitri N. Mavris
Systems 2023, 11(7), 352; https://doi.org/10.3390/systems11070352 - 10 Jul 2023
Cited by 8 | Viewed by 5337
Abstract
The increased complexity of modern systems is calling for an integrated and comprehensive approach to system design and development and, in particular, a shift toward Model-Based Systems Engineering (MBSE) approaches for system design. The requirements that serve as the foundation for these intricate systems are still primarily expressed in Natural Language (NL), which can contain ambiguities and inconsistencies and suffer from a lack of structure that hinders their direct translation into models. The colossal developments in the field of Natural Language Processing (NLP), in general, and Large Language Models (LLMs), in particular, can serve as an enabler for the conversion of NL requirements into machine-readable requirements. Doing so is expected to facilitate their standardization and use in a model-based environment. This paper discusses a two-fold strategy for converting NL requirements into machine-readable requirements using language models. The first approach involves creating a requirements table by extracting information from free-form NL requirements. The second approach consists of an agile methodology that facilitates the identification of boilerplate templates for different types of requirements based on observed linguistic patterns. For this study, three different LLMs are utilized. Two of these models are fine-tuned versions of Bidirectional Encoder Representations from Transformers (BERTs), specifically, aeroBERT-NER and aeroBERT-Classifier, which are trained on annotated aerospace corpora. Another LLM, called flair/chunk-english, is utilized to identify sentence chunks present in NL requirements. All three language models are utilized together to achieve the standardization of requirements. The effectiveness of the methodologies is demonstrated through the semi-automated creation of boilerplates for requirements from Parts 23 and 25 of Title 14 Code of Federal Regulations (CFRs). Full article
(This article belongs to the Section Systems Engineering)
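The two-step idea in the abstract — recognize entity spans in a natural-language requirement, then slot them into a boilerplate template — can be sketched without the actual models. Below, a hard-coded dictionary stands in for aeroBERT-NER output, and the slot names (SYS, RES) and template are illustrative, not the paper's actual label set or boilerplates.

```python
# Illustrative boilerplate with named slots; the real work lists several
# per requirement type (design, functional, performance).
BOILERPLATE = "The <SYS> shall meet <RES>."

def standardize(entities):
    """Fill the boilerplate from {slot_name: extracted text span}.

    In the paper this mapping comes from LLM-based NER and classification;
    here the spans are supplied directly.
    """
    out = BOILERPLATE
    for slot, span in entities.items():
        out = out.replace(f"<{slot}>", span)
    return out

req = standardize({
    "SYS": "state estimates supplied to the flight recorder",
    "RES": "the aircraft-level system requirements in Section 23-2500",
})
print(req)  # a machine-derived, standardized restatement of the NL requirement
```

The value of the standardized form is that each filled slot can become a data object in a model-based environment, rather than a substring of free text.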
Figure 1
<p>Pipeline for converting NL requirements to standardized requirements using various LLMs.</p>
Figure 2
<p>Steps of requirements engineering, starting with gathering requirements from various stakeholders, followed by using NLP techniques to standardize them, and, lastly, converting the standardized requirements into models. The main focus of this work is to convert NL requirements into machine-readable requirements (where parts of the requirement become data objects) as shown in Step 2.</p>
Figure 3
<p>A SysML requirement table with three columns, namely, Name, Paragraph Text, and Traced Elements. More columns can be added to capture other properties pertaining to the requirements.</p>
Figure 4
<p>An aerospace requirement along with its POS tags and sentence chunks. Each word has a POS tag associated with it, which can then be combined together to obtain a higher-level representation called sentence chunks (NP: Noun Phrase; VP: Verb Phrase; PP: Prepositional Phrase).</p>
Figure 5
<p>Flowchart showcasing the creation of the requirements table for the requirement “<span class="html-italic">The state estimates supplied to the flight recorder shall meet the aircraft level system requirements and functionality specified in Section 23–2500</span>” using two LMs, namely, aeroBERT-Classifier [<a href="#B9-systems-11-00352" class="html-bibr">9</a>] and aeroBERT-NER [<a href="#B8-systems-11-00352" class="html-bibr">8</a>], to populate various columns of the table. A zoomed-in version of the figure can be found <a href="https://archanatikayatray19.github.io/Practitioners_Guide/flowchart_requirements_Table.html" target="_blank">here</a> and more context can be found in [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure 6
<p>Flowchart showcasing the creation of boilerplate templates using three language models, namely, aeroBERT-Classifier [<a href="#B9-systems-11-00352" class="html-bibr">9</a>], aeroBERT-NER [<a href="#B8-systems-11-00352" class="html-bibr">8</a>], and flair/chunk-english. A zoomed-in version of the figure can be found <a href="https://archanatikayatray19.github.io/Practitioners_Guide/boilerplate_flowchart.html" target="_blank">https://archanatikayatray19.github.io/Practitioners_Guide/boilerplate_flowchart.html</a> and more context can be found in [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure 7
<p>Sankey diagram showing the text chunk patterns in design requirements. A part of the figure is shown due to space constraints; however, the full diagram can be found <a href="https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Design_pos_chunk_Requirements_Diagram.png" target="_blank">https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Design_pos_chunk_Requirements_Diagram.png</a> [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure 8
<p>Examples 1 and 2 show a design requirement beginning with a prepositional phrase (PP) and subordinate clause (SBAR), which is uncommon in the requirements dataset used for this work. The uncommon starting sentence chunks (PP, SBAR) are, however, followed by a noun phrase (NP) and verb phrase (VP). Most of the design requirements start with a NP.</p>
Figure 9
<p>Example 3 shows a design requirement starting with a verb phrase (VP). Example 4 shows the requirement starting with a noun phrase (NP), which was the most commonly observed pattern.</p>
Figure 10
<p>Sankey diagram showing the named entity patterns in design requirements. A part of the figure is shown here due to space constraints; however, the full diagram can be found <a href="https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Design_NER_Requirements_Diagram.png" target="_blank">https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Design_NER_Requirements_Diagram.png</a> [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure 11
<p>The general textual pattern observed in requirements was &lt;Prefix&gt; + &lt;Body&gt; + &lt;Suffix&gt; out of which Prefix and Suffix are optional and can be used to provide more context about the requirement. The different variations in the requirement Body, Prefix, and Suffix are shown as well [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure 12
<p>The schematics of the first boilerplate for <b>design requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 74 of the 149 design requirements (∼50%) used for this study and is tailored toward requirements that mandate the way a &lt;system&gt; should be designed and/or installed, its location, and whether it should protect another &lt;system/sub-system&gt; given a certain &lt;condition&gt; or &lt;state&gt;. Parts of the NL requirements shown here are matched with their corresponding boilerplate elements via the use of the same color scheme. In addition, the sentence chunks and named entity tags are displayed below and above the boilerplate structure, respectively.</p>
Figure 13
<p>The schematics of the second boilerplate for <b>design requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 8 of the 149 design requirements (∼5%) used for this study and focuses on requirements that mandate a &lt;functional attribute&gt;, &lt;design attribute&gt;, or the inclusion of a &lt;system/sub-system&gt; by design. Two of the example requirements highlight the &lt;design attribute&gt; element, which emphasizes additional details regarding the system design to facilitate a certain function. The last example shows a requirement where a &lt;sub-system&gt; is to be included in a system by design.</p>
Figure 14
<p>The schematics of the first boilerplate for <b>functional requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 20 of the 100 functional requirements (20%) used for this study and is tailored toward requirements that describe the capability of a &lt;system&gt; to be in a certain &lt;state&gt; or have a certain &lt;functional attribute&gt;. The first example requirement focuses on the handling characteristics of the system (the airplane in this case).</p>
Figure 15
<p>The schematics of the second boilerplate for <b>functional requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 15 of the 100 functional requirements (15%) used for this study and is tailored toward requirements that require the &lt;system&gt; to have a certain &lt;functional attribute&gt; or maintain a particular &lt;state&gt;.</p>
Figure 16
<p>The schematics of the third boilerplate for <b>functional requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 7 of the 100 functional requirements (7%) used for this study and is tailored toward requirements that require the &lt;system&gt; to protect another &lt;sub-system/system&gt; or &lt;user&gt; against a certain &lt;state&gt; or another &lt;sub-system/system&gt;.</p>
Figure 17
<p>The schematics of the fourth boilerplate for <b>functional requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 15 of the 100 functional requirements (15%) used for this study and is tailored toward requirements that require the &lt;system&gt; to provide a certain &lt;functional attribute&gt; given a certain &lt;condition&gt;.</p>
Figure 18
<p>The schematics of the fifth boilerplate for <b>functional requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 6 of the 100 functional requirements (6%) used for this study and is specifically focused on requirements related to the <span class="html-italic">cockpit voice recorder</span>, since a total of six requirements in the entire dataset were about this particular system and its &lt;functional attribute&gt; given a certain &lt;condition&gt;.</p>
Figure 19
<p>The schematics of the first boilerplate for <b>performance requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 20 of the 61 performance requirements (∼33%) used for this study. This particular boilerplate has the element &lt;system attribute&gt;, which is unique as compared to the other boilerplate structures. In addition, this boilerplate caters to the performance requirements specifying a &lt;system&gt; or &lt;system attribute&gt; to satisfy a certain &lt;condition&gt; or have a certain &lt;functional attribute&gt;.</p>
Figure 20
<p>The schematics of the second boilerplate for <b>performance requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 12 of the 61 performance requirements (∼20%) used for this study. This boilerplate focuses on performance requirements that specify a &lt;functional attribute&gt; that a &lt;system&gt; should have or maintain given a certain &lt;state&gt; or &lt;condition&gt;.</p>
Figure 21
<p>The schematics of the third boilerplate for <b>performance requirements</b> along with some examples that fit the boilerplate are shown here. This boilerplate accounts for 3 of the 61 performance requirements (∼5%) used for this study and focuses on a &lt;system&gt; being able to <span class="html-italic">withstand</span> a certain &lt;condition&gt; with or without ending up in a certain &lt;state&gt;.</p>
Figure 22
<p>Practitioner’s Guide to creation of aeroBERT-NER and aeroBERT-Classifier. A zoomed-in version of this figure can be found <a href="https://archanatikayatray19.github.io/Practitioners_Guide/" target="_blank">https://archanatikayatray19.github.io/Practitioners_Guide/</a> [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure A1
<p>Sankey diagram showing the text chunk patterns in functional requirements. A part of the figure is shown due to space constraints; however, the full diagram can be found <a href="https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Functional_pos_chunk_Requirements_Diagram.png" target="_blank">https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Functional_pos_chunk_Requirements_Diagram.png</a> [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure A2
<p>Examples 1, 2, and 3 show functional requirements starting with an NP, PP, and SBAR. Most of the functional requirements start with an NP, however.</p>
Figure A3
<p>Sankey diagram showing the named entity patterns in functional requirements. A part of the figure is shown due to space constraints; however, the full diagram can be found <a href="https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Functional_NER_Requirements_Diagram.png" target="_blank">https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Functional_NER_Requirements_Diagram.png</a> [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure A4
<p>Sankey diagram showing the text chunk patterns in performance requirements. A part of the figure is shown due to space constraints; however, the full diagram can be found <a href="https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Performance_pos_chunk_Requirements_Diagram.png" target="_blank">https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Performance_pos_chunk_Requirements_Diagram.png</a> [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
Figure A5
<p>Examples 1 and 2 show performance requirements starting with NP and PP, respectively.</p>
Figure A6
<p>Sankey diagram showing the named entity patterns in performance requirements. A part of the figure is shown due to space constraints; however, the full diagram can be found <a href="https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Performance_NER_Requirements_Diagram.png" target="_blank">https://github.com/archanatikayatray19/Sankey_diagram_Requirements/blob/main/Performance_NER_Requirements_Diagram.png</a> [<a href="#B10-systems-11-00352" class="html-bibr">10</a>].</p>
12 pages, 482 KiB  
Article
Classification of CO Environmental Parameter for Air Pollution Monitoring with Grammatical Evolution
by Evangelos D. Spyrou, Chrysostomos Stylios and Ioannis Tsoulos
Algorithms 2023, 16(6), 300; https://doi.org/10.3390/a16060300 - 15 Jun 2023
Cited by 1 | Viewed by 1856
Abstract
Air pollution is a pressing concern in urban areas, necessitating the critical monitoring of air quality to understand its implications for public health. Internet of Things (IoT) devices are widely utilized in air pollution monitoring due to their sensor capabilities and seamless data transmission over the Internet. Artificial intelligence (AI) and machine learning techniques play a crucial role in classifying patterns derived from sensor data. Environmental stations offer a multitude of parameters that can be obtained to uncover hidden patterns showcasing the impact of pollution on the surrounding environment. This paper focuses on utilizing the CO parameter as an indicator of pollution in two datasets collected from wireless environmental monitoring devices in the greater Port area and the Town Hall of Igoumenitsa City in Greece. The datasets are normalized to facilitate their utilization in classification algorithms. The k-means algorithm is applied, and the elbow method is used to determine the optimal number of clusters. Subsequently, the datasets are introduced to the grammatical evolution algorithm to calculate the percentage fault. This method constructs classification programs in a human-readable format, making it suitable for analysis. Finally, the proposed method is compared against four state-of-the-art models: the Adam optimizer for optimizing artificial neural network parameters, a genetic algorithm for training an artificial neural network, the Bayes model, and the limited-memory BFGS method applied to a neural network. The comparison reveals that the GenClass method outperforms the other approaches in terms of classification error. Full article
(This article belongs to the Special Issue Machine Learning Algorithms in Prediction Model)
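The preprocessing pipeline in the abstract — normalize the readings, run k-means, pick the cluster count with the elbow method — can be sketched in a few lines. This is a toy with made-up readings, not the actual Port or Town Hall datasets, and uses a minimal 1-D k-means rather than a library implementation.

```python
import random

def kmeans_1d(xs, k, iters=25, seed=0):
    """Lloyd's algorithm on 1-D data; returns the within-cluster sum of squares."""
    rng = random.Random(seed)
    centers = rng.sample(xs, k)           # initialize from the data points
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:                      # assign each point to its nearest center
            groups[min(range(k), key=lambda i: (x - centers[i]) ** 2)].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]  # recompute centers
    return sum(min((x - c) ** 2 for c in centers) for x in xs)

raw = [0.2, 0.25, 0.3, 2.1, 2.2, 2.3, 5.0, 5.1]   # hypothetical CO readings
lo, hi = min(raw), max(raw)
xs = [(x - lo) / (hi - lo) for x in raw]           # min-max normalization
wcss = {k: kmeans_1d(xs, k) for k in (1, 2, 3, 4)}
print(wcss)  # the elbow is where the WCSS stops dropping sharply
```

In the paper, the cluster labels chosen this way become the classes that the grammatical evolution classifier is trained to predict.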
Figure 1
<p>Environmental monitoring wireless communication.</p>
Figure 2
<p>Optimal number of clusters using elbow.</p>
Figure 3
<p>The grammar used, in BNF notation.</p>
Figure 4
<p>Example of one-point crossover.</p>
Figure 5
<p>Average classification error measured on Port dataset.</p>
Figure 6
<p>Average classification error on Town Hall dataset.</p>
14 pages, 6187 KiB  
Concept Paper
Communication of Design Data in Manufacturing Democratization
by Bhairavsingh Ghorpade and Shivakumar Raman
J. Manuf. Mater. Process. 2023, 7(3), 108; https://doi.org/10.3390/jmmp7030108 - 1 Jun 2023
Viewed by 1526
Abstract
Part design is the principal source of communicating design intent to manufacturing and inspection. Design data are often communicated through computer-aided design (CAD) systems. Modern analytics tools and artificial intelligence integration into manufacturing have significantly advanced machine recognition of design specification and manufacturing constraints. These algorithms require data to be uniformly structured and easily consumable; however, the design data are represented in a graphical structure and contain a nonuniform structure, which limits the use of machine learning algorithms for a variety of tasks. This paper proposes an algorithm for extracting dimensional data from three-dimensional (3D) part designs in a structured manner. The algorithm extracts face dimensions and their relationships with other faces, enabling the recognition of underlying patterns and expanding the applicability of machine learning for various tasks. The extracted part dimensions can be stored in a dimension-based numeric extensible markup language (XML) file, allowing for easy storage and use in machine-readable formats. The resulting XML file provides a dimensional representation of the part data based on their features. The proposed algorithm reads and extracts dimensions with respect to each face of the part design, preserving the dimensional and face relevance. The uniform structure of the design data facilitates the processing of data by machine learning algorithms, enabling the detection of hidden patterns and the development of pattern-based predictive algorithms. Full article
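The abstract's output format — per-face dimensions with their relationships, serialized to a dimension-based XML file — can be illustrated with the standard library. The element names and attributes below are invented for the sketch; they are not the paper's actual schema.

```python
import xml.etree.ElementTree as ET

def faces_to_xml(part_name, faces):
    """Serialize per-face dimensions into a uniform, machine-readable XML string.

    `faces` is a list of dicts with keys: id, width, height (mm), adjacent
    (ids of neighboring faces), standing in for data extracted from a CAD model.
    """
    root = ET.Element("part", name=part_name)
    for f in faces:
        face = ET.SubElement(root, "face", id=str(f["id"]))
        ET.SubElement(face, "width", unit="mm").text = str(f["width"])
        ET.SubElement(face, "height", unit="mm").text = str(f["height"])
        ET.SubElement(face, "adjacent").text = " ".join(str(a) for a in f["adjacent"])
    return ET.tostring(root, encoding="unicode")

xml_doc = faces_to_xml("connector", [
    {"id": 1, "width": 10.0, "height": 4.0, "adjacent": [2, 3]},
    {"id": 2, "width": 4.0, "height": 4.0, "adjacent": [1]},
])
print(xml_doc)
```

Because every face follows the same element structure, downstream learning algorithms can consume the file without the nonuniform graph parsing that raw CAD data requires.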
Figure 1
<p>Multilevel collaborative manufacturing.</p>
Figure 2
<p>Flowchart of data extraction algorithm.</p>
Figure 3
<p>Snapshot of XML storage schema.</p>
Figure 4
<p>Connector part design.</p>
Figure 5
<p>Output of dimensional data extraction algorithm for connector part.</p>
Figure 6
<p>Internal face of connector part.</p>
Figure 7
<p>XML file output of connector part.</p>
Figure 8
<p>Communication of design data in manufacturing democratization chart.</p>
Figure 9
<p>Data sharing in collaborative manufacturing.</p>
15 pages, 472 KiB  
Review
Knowledge Generation with Rule Induction in Cancer Omics
by Giovanni Scala, Antonio Federico, Vittorio Fortino, Dario Greco and Barbara Majello
Int. J. Mol. Sci. 2020, 21(1), 18; https://doi.org/10.3390/ijms21010018 - 18 Dec 2019
Cited by 9 | Viewed by 4062
Abstract
The explosion of omics data availability in cancer research has boosted the knowledge of the molecular basis of cancer, although the strategies for its definitive resolution are still not well established. The complexity of cancer biology, given by the high heterogeneity of cancer cells, leads to the development of pharmacoresistance in many patients, hampering the efficacy of therapeutic approaches. Machine learning techniques have been implemented to extract knowledge from cancer omics data in order to address fundamental issues in cancer research, such as the classification of clinically relevant sub-groups of patients and the identification of biomarkers for disease risk and prognosis. Rule induction algorithms are a group of pattern discovery approaches that represent discovered relationships in the form of human-readable associative rules. The application of such techniques to the modern plethora of collected cancer omics data can effectively boost our understanding of cancer-related mechanisms. In fact, the capability of these methods to extract a huge amount of human-readable knowledge will eventually help to uncover unknown relationships between molecular attributes and the malignant phenotype. In this review, we describe applications and strategies for the usage of rule induction approaches in cancer omics data analysis. In particular, we explore the canonical applications and the future challenges and opportunities posed by multi-omics integration problems. Full article
(This article belongs to the Special Issue Data Analysis and Integration in Cancer Research)
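What "human-readable associative rules" means in practice can be shown with a toy 1R-style learner: pick the single binarized omics attribute that best separates the classes and emit it as an IF-THEN rule. Real rule learners covered by the review (e.g., RIPPER, CN2) grow multi-condition rules, but the readable output is the same idea; the gene names and labels here are invented.

```python
def one_rule(samples, labels, attributes):
    """Return (accuracy, rule) for the best single-attribute rule.

    Each sample maps attribute name -> 0/1. The rule predicts the class
    for attribute value 1 (with the opposite class implied otherwise).
    """
    best = None
    for a in attributes:
        # how often "attribute is 1" coincides with the tumor label
        hits = sum((s[a] == 1) == (y == "tumor") for s, y in zip(samples, labels))
        acc = max(hits, len(labels) - hits) / len(labels)
        pred = "tumor" if hits >= len(labels) - hits else "normal"
        if best is None or acc > best[0]:
            best = (acc, f"IF {a} = 1 THEN class = {pred}")
    return best

samples = [{"TP53_mut": 1, "high_MKI67": 1}, {"TP53_mut": 1, "high_MKI67": 0},
           {"TP53_mut": 0, "high_MKI67": 1}, {"TP53_mut": 0, "high_MKI67": 0}]
labels = ["tumor", "tumor", "normal", "normal"]
acc, rule = one_rule(samples, labels, ["TP53_mut", "high_MKI67"])
print(rule)  # a rule a biologist can read directly, unlike a black-box model
```

The appeal the review highlights is exactly this: the learned artifact is an inspectable statement about molecular attributes, not an opaque weight vector.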
Figure 1
<p>Typical shape of an omics data matrix. Blue arrows link the columns of the matrix to the different omics data types that are frequently found in a multi-omics experiment.</p>
5 pages, 1025 KiB  
Proceeding Paper
Machine-Readable Pattern for Colorimetric Sensor Interrogation
by Ismael Benito-Altamirano, Peter Pfeiffer, Oriol Cusola and J. Daniel Prades
Proceedings 2018, 2(13), 906; https://doi.org/10.3390/proceedings2130906 - 29 Nov 2018
Cited by 2 | Viewed by 1738
Abstract
We present a systematic methodology to generate machine-readable patterns embodying all the elements needed to carry out colorimetric measurements with conventional color cameras in an automated, robust, and accurate manner. Our approach relies on the well-established machine-readable features of QR codes to detect the pattern, identify the color reference elements and the colorimetric spots, calibrate the color of the image, and conclude a quantitative measurement. We illustrate our approach with an NH3 colorimetric indicator operating under ambient lights of distinct color temperatures, demonstrating that, with our design, consistent measurements can be achieved independently of the illumination conditions. Full article
(This article belongs to the Proceedings of EUROSENSORS 2018)
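The final "quantitative measurement" step — mapping a calibrated indicator color to a gas concentration — can be sketched as interpolation over a calibration curve. The curve values below are entirely made up for illustration; they are not the paper's NH3 calibration data.

```python
# Hypothetical calibration curve: (NH3 ppm, calibrated green-channel value).
# The green channel is assumed to decrease monotonically with concentration.
CURVE = [(0, 180.0), (25, 150.0), (50, 120.0), (75, 95.0), (100, 80.0)]

def ppm_from_green(g):
    """Linearly interpolate the concentration for a calibrated green value."""
    for (c0, g0), (c1, g1) in zip(CURVE, CURVE[1:]):
        if g1 <= g <= g0:  # the reading falls inside this curve segment
            return c0 + (c1 - c0) * (g0 - g) / (g0 - g1)
    raise ValueError("reading outside the calibration range")

print(ppm_from_green(135.0))  # halfway between the 25 and 50 ppm points -> 37.5
```

The point of the color calibration in the paper is that this lookup only works if the channel value is illumination-independent; without it, the same gas concentration would map to wildly different readings under different lights.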
Figure 1
<p>(<b>a</b>) RGB 8-bit color data acquired from a colorimetric sensor captured with a digital camera under 5500 K color-temperature illumination (2D color plane with Red and Green channels), with the centers of 32 clusters generated by K-means clustering; (<b>b</b>) the 32 cluster centers from (<b>a</b>), and color clustering regions.</p>
Figure 2
<p>(<b>a</b>) Colors from K-means clustering (<a href="#proceedings-02-00906-f001" class="html-fig">Figure 1</a>b) spread over a square distribution with a certain pattern (block size of 3 × 3 pixels, 32 different colors, 7-times redundancy), (<b>b</b>) computer vision patterns of a QR Code version 7 (position, alignment and timing patterns) and (<b>c</b>) a machine-readable pattern for colorimetric sensor interrogation.</p>
Figure 3
<p>(<b>a</b>) A machine-readable pattern with two reserved regions to accommodate the colorimetric indicator, and (<b>b</b>) its captured version with two replicas of the colorimetric indicator showing color artifacts due to ambient light, camera setup, etc.</p>
Figure 4
<p>Color data (3D RGB 8-bit cube and corresponding 2D projections) of the NH3 indicators exposed to gas concentrations from 0 to 100 ppm in humid air (50% RH) acquired under different illumination conditions (4500 K, 5000 K, 5500 K and 6000 K color temperature); (<b>a</b>) before and (<b>b</b>) after color calibration.</p>