Towards Sensor-Actuator Coupling in an Automated Order Picking System by Detecting Sealed Seams on Pouch Packed Goods
Figure 1. Different solutions for automated order picking.
Figure 2. Common shapes of flat pouches used for bag chains: (a,b) tubular pouches produced with or without side gussets; (c) a chain of tubular bags; (d) 3-side sealed bag; (e) 4-side sealed bag; (f) sealed seam bags as double or chain.
Figure 3. Systematic overview of possible ways to mark a pouch.
Figure 4. (a) Drawing of the functional principle and the required components; (b) the real setup at the Fraunhofer Institute for Material Flow and Logistics.
Figure 5. Interaction of the sensor (red) and the actuators (green) for the two control strategies of the picking system: (a) clocked and (b) "on the fly" mode of operation. Figure 8 depicts the details of the "marker detection" module (dashed), which are presented throughout this paper; the module serves as the (algorithmic) interface between the sensor and the actuators.
Figure 6. Two different types of marker: (a) a silver sealed seam as a color-based marker and (b) a plastic pouch with a specially designed user-defined marker.
Figure 7. Schematic concept of (Left) the extraction of $k \le M$ salient columns ${}^{\rho}\mathbf{x}^{(j)} = \{{}^{\rho}x_{0}^{(j)}, \ldots, {}^{\rho}x_{i}^{(j)}, \ldots, {}^{\rho}x_{N-1}^{(j)}\}$ from an image of size $N \times M$ pixels; the RGB channel is represented by $\rho \in \{r, g, b\}$. The salient columns undergo a classification process leading to a time series that assigns each image row a specific probability of containing the marker (Right).
Figure 8. Pipeline of the vision-based marker detection system (dashed boxes in Figure 5). The classification step of the color-based marker detection uses template matching, while the user-defined marker is detected by supervised classification (SVMs).
Figure 9. Model of the color-based marker depending on (a) the vertical image dimension $\theta_v$ of the marker and (b) the three-dimensional RGB histogram. The size of the spheres symbolizes the pixel count in the subquads.
Figure 10. Flow chart for constructing confidence vectors $\mathbf{c}^{(j)}$ for a particular time series $\mathbf{x}^{(j)}$. Input and output are visualized in Figure 11a,c. $\alpha$ represents predefined marker candidate pixels based on the RGB color model and $\beta$ represents pixels from the rest of the image.
Figure 11. Classification of the time series $\mathbf{x}^{(j)}$ of the synthetic image shown in (a) into marker candidates ($\alpha \in \mathbb{N}$) and no-marker time steps ($\beta = 0$). The subsequent process of constructing a confidence vector $\mathbf{c}^{(j)}$ based on (b) is depicted in (c).
Figure 12. Detected marker position $i$ with $\hat{z}_i \ge \eta$ in the global confidence vector $\hat{\mathbf{z}}$.
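To make the column-wise detection scheme of Figures 7 and 10-12 concrete, the following minimal Python sketch (not the authors' implementation) labels each row of a salient column as marker candidate ($\alpha$) or background ($\beta$), smooths the labels into a per-column confidence, fuses the columns into a global confidence vector $\hat{\mathbf{z}}$ and thresholds it with $\eta$. The helper is_candidate, the moving-average smoothing and the equal weighting of columns are assumptions made for illustration.

```python
import numpy as np

def detect_marker_rows(salient_columns, is_candidate, eta=0.5, window=5):
    """Sketch of the column-wise confidence scheme (Figures 10-12).

    salient_columns : list of k 1D arrays of length N (one per salient column)
    is_candidate    : callable returning a boolean array of length N that is
                      True where a pixel matches the marker color model (alpha)
    eta             : detection threshold on the global confidence vector
    """
    k = len(salient_columns)
    N = len(salient_columns[0])
    z_hat = np.zeros(N)  # global confidence vector

    for col in salient_columns:
        labels = np.asarray(is_candidate(col), dtype=float)   # alpha = 1, beta = 0
        # Per-column confidence c^(j): here simply a moving average of the labels
        c = np.convolve(labels, np.ones(window) / window, mode="same")
        z_hat += c / k                                         # fuse the columns

    return np.flatnonzero(z_hat >= eta)  # rows treated as marker positions
```

Returning all rows with $\hat{z}_i \ge \eta$ mirrors the caption of Figure 12; a single cutting position could then be chosen, for example, at the maximum of $\hat{\mathbf{z}}$.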
Figure 13. The design of the marker (a) allows characteristic intensity profiles (b) to be deduced for each image column of the marker (vertical red line in (a)) by analyzing the intensity values. The marker is integrated into the bar code (vertical bars), which does not influence the detection of the submarkers (horizontal bars).
Figure 14. Definition of the region of interest (ROI) and the vertical bounds of the upper and lower marker parts within the ROI.
Figure 15. Schematic visualization of the learning classification model: extraction of feature vectors $\mathbf{x}_i \in \mathbb{R}^{l_{\min}}$ for the learning set of the SVM.
Figure 16. Extracting feature vectors $\tilde{\mathbf{x}}_i^{(j)}$ from a salient column $\mathbf{x}^{(j)}$ (represented as grey values) leads to a set $S^{(j)} = \{\tilde{\mathbf{x}}_0^{(j)}, \ldots, \tilde{\mathbf{x}}_{N-1-l_{\min}}^{(j)}\}$ of feature vectors.
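The sliding-window feature extraction of Figures 15 and 16 can be sketched as follows. Assumptions: grey-value columns, an RBF-kernel SVM via scikit-learn, and placeholder training data; the window indices follow $S^{(j)} = \{\tilde{\mathbf{x}}_0^{(j)}, \ldots, \tilde{\mathbf{x}}_{N-1-l_{\min}}^{(j)}\}$, but the concrete classifier settings are not taken from the paper.

```python
import numpy as np
from sklearn import svm

def sliding_windows(column, l_min):
    """Build the set S^(j) of overlapping feature vectors of length l_min
    from one salient column given as grey values (caption of Figure 16)."""
    column = np.asarray(column, dtype=float)
    return np.stack([column[i:i + l_min]
                     for i in range(len(column) - l_min)])

# Hypothetical training step: windows labelled 1 if they cover the marker's
# intensity profile, 0 otherwise (labels would come from annotated images).
# X_train = np.vstack([sliding_windows(c, L_MIN) for c in training_columns])
# clf = svm.SVC(kernel="rbf").fit(X_train, y_train)
# scores = clf.decision_function(sliding_windows(new_column, L_MIN))
```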
Figure 17. Image states in template-matching-based marker detection: (a) grayscale image, (b) filtered grayscale image, (c) histogram-equalized grayscale image and (d) binary image.
Figure 18. Color marker with Gaussian noise.
Figure 19. User-defined marker with Gaussian noise.
Figure 20. Evaluation of the robustness of the detection algorithm under varying (a) random Gaussian noise with standard deviation $\sigma = 5; 10; 20; 40; 80$ and (b) illumination conditions with $\gamma = 0.125; 0.25; 0.5; 1.5; 2$.
Figure 21. Color marker with synthetic brightening.
Figure 22. User-defined marker with synthetic brightening.
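The synthetic degradations used in the robustness evaluation (Figures 18-22) can be approximated as follows. The exact noise and illumination models of the paper are not spelled out here, so this sketch assumes additive zero-mean Gaussian noise and a standard gamma transform on 8-bit images.

```python
import numpy as np

def add_gaussian_noise(img, sigma):
    """Additive zero-mean Gaussian noise, sigma = 5, 10, 20, 40, 80 (Figure 20a)."""
    noisy = img.astype(float) + np.random.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

def gamma_correct(img, gamma):
    """Gamma transform simulating changed illumination,
    gamma = 0.125, 0.25, 0.5, 1.5, 2 (Figure 20b); gamma < 1 brightens."""
    normalized = img.astype(float) / 255.0
    return np.clip(255.0 * normalized ** gamma, 0, 255).astype(np.uint8)
```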
Figure 23. Accuracy of the detection process with UD-Marker_SVM on (a) transparent pouches. The varied parameters are the distance of the user-defined marker bars, $d_1 \in \{0.6\,\text{mm}, 0.8\,\text{mm}, 1.2\,\text{mm}\}$, and the distance of the marker to the optimal cutting line, $d_{cut} \in \{1\,\text{cm}, 5\,\text{cm}\}$. (b) Accuracy of the detection process of the template approaches: UD-Marker_Template on transparent pouches with the user-defined marker and Color-Marker_Template on colored pouches. The y-axis of each box-whisker plot indicates the deviation of the detected cutting line from the optimal cutting line.
Figure 24. Runtimes of the different marker detection methods on (a) a mobile device (Intel i5M, 2.53 GHz, 4 GB RAM, Windows 7 64 bit) and (b) a desktop PC (Intel i5, 2.80 GHz, 8 GB RAM, Windows 7 64 bit).
Figure 25. Trade-off between detection quality and the number of columns: the detection rate (%) depends on the number of salient columns used as input for the marker detection algorithms Color-Marker_Template, UD-Marker_SVM and UD-Marker_Template.
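As a rough illustration of how the trade-off of Figure 25 could be measured, the hypothetical harness below sweeps the number of salient columns and reports the detection rate; the detector interface, the tolerance and the data handling are assumptions, not the authors' evaluation code.

```python
def detection_rate_vs_columns(detector, images, truth_rows, ks, tol=5):
    """detector(img, k) -> detected marker row (or None) using k salient columns;
    truth_rows holds the ground-truth marker row for each image."""
    rates = {}
    for k in ks:
        hits = 0
        for img, row in zip(images, truth_rows):
            detected = detector(img, k)
            if detected is not None and abs(detected - row) <= tol:
                hits += 1
        rates[k] = 100.0 * hits / len(images)
    return rates  # detection rate (%) per number of salient columns
```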
Abstract
1. Introduction
2. State of the Art
2.1. Fully Automated Order Picking
2.2. Types of Pouches in Industrial Systems
2.3. Cutting Line Detection
- binary registration (print) mark sensors based on contrast or color (e.g., SICK KT 10-2),
- binary camera-based sensors,
- camera systems (e.g., SICK IVC-2D or IVC 3D),
- and sensors for detecting streams of marks.
3. Automated Order Picking System
4. Marker Detection System
4.1. Marker Detection
4.2. Color Marker Detection
4.2.1. Preprocessing
4.2.2. Histogram-Enhanced Template Matching (Color-Marker_Template)
- the interval that contains the red channel intensity values of the subquad,
- the interval that contains the green channel intensity values of the subquad,
- the interval that contains the blue channel intensity values of the subquad (a sketch of this interval test follows below).
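A minimal sketch of how such per-channel intervals could be used to flag marker-candidate pixels; the interval bounds and the helper name are hypothetical, and the paper derives the intervals from the dominant subquad of the 3D RGB histogram (Figure 9b).

```python
def is_marker_candidate(pixel, r_interval, g_interval, b_interval):
    """True if an RGB pixel (8-bit values) lies inside all three
    per-channel intensity intervals of the dominant subquad."""
    r, g, b = pixel
    return (r_interval[0] <= r <= r_interval[1] and
            g_interval[0] <= g <= g_interval[1] and
            b_interval[0] <= b <= b_interval[1])

# Purely illustrative values for a silver/grey sealed seam:
# is_marker_candidate((180, 182, 179), (160, 200), (160, 200), (160, 200))
```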
4.3. User-Defined Marker Detection
4.3.1. User-Defined Marker Design
4.3.2. Machine Learning Based Marker Detection
4.3.3. Template Matching Based Marker Detection
5. Evaluation
5.1. Vision System Setup
5.2. Detection Quality, Cutting Line Accuracy and Runtime
5.2.1. Detection Quality
5.2.2. Cutting Line Accuracy
5.2.3. Runtime Performance
5.3. Trade-off Considerations: Detection Quality vs. Number of Columns
6. Conclusion
Author Contributions
Conflicts of Interest
References
© 2014 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Weichert, F.; Böckenkamp, A.; Prasse, C.; Timm, C.; Rudak, B.; Hölscher, K.; Hompel, M.T. Towards Sensor-Actuator Coupling in an Automated Order Picking System by Detecting Sealed Seams on Pouch Packed Goods. J. Sens. Actuator Netw. 2014, 3, 245-273. https://doi.org/10.3390/jsan3040245