Review: Development and Technical Design of Tangible User Interfaces in Wide-Field Areas of Application
Figure 1. The process of selecting articles on the design and technical solutions of tangible user interfaces (TUIs).
Figure 2. Tinker system for simulation of a warehouse [5].
Figure 3. Formation of an acute angle [16].
Figure 4. Salt analysis: a student presents reaction details [16].
Figure 5. Overall structure of a TUI for learning flow charts and algorithms [16].
Figure 6. Proposed TUI system with a laptop [18].
Figure 7. Proposed SmartBlocks: connector, question, and a screenshot of the SmartBlocks interface (from the left) [19].
Figure 8. Proposed BlackBlocks with a running example of three-letter words [20].
Figure 9. Sensetable for teaching chemical reactions [21].
Figure 10. Printed TangibleCircuits with an audio interface on a smartphone [24].
Figure 11. Design of MICOO [26].
Figure 12. Interaction with the Tangible Graph Builder (a); the Tangible Graph Builder with a tangible grid and tangible objects (b) [28].
Figure 13. Teacher-child session with a TUI (a); self-learning using a multitouch interface (b) [30].
Figure 14. Prototype of the tabletop (a); a user manipulating a tangible object (b) [31].
Figure 15. Smart cube for early detection of motoric impairments in childhood [32].
Figure 16. (a) Smart toys for detecting developmental delays in children; (b) a child manipulating smart cubes to build a tower [33].
Figure 17. A child using SIG-Blocks with segmented animal faces to match the displayed image while an adult observes the cognitive skills in the graphical user interface (GUI) (a); hardware design of a SIG-Block for TAG-Games (b) [34].
Figure 18. Two children playing with TangToys [36].
Figure 19. Activity Board 1.0, consisting of a wooden box with a radiofrequency identification (RFID) reader and antenna, a tablet, and tangible objects with RFID tags [37].
Figure 20. Creative Hybrid Environment for Robotic Programming (CHERP) tangible blocks and the LEGO WeDo robotics kit, with the computer screen showing the CHERP graphical user interface [38].
Figure 21. Kiwi robotics kit and CHERP programming blocks [39].
Figure 22. Tangible programming blocks for programming a LEGO NXT robot [40].
Figure 23. Creation of the maze area with blocks (a); the user interface of the maze creation (b) [41].
Figure 24. Programming with P-CUBEs (a); Pro-Tan programming panel and cards (b) [41].
Figure 25. CodeRhythm blocks [44].
Figure 26. Sensor, actuator, and shape modules (a); programming and controlling modules (b); construction of an underwater vehicle (c) [44].
Figure 27. Tangible user interfaces (a) and their corresponding robots (b) [47].
Figure 28. Creation of a database query with Sifteo Cubes (a); representation of the query results (b) [49].
Figure 29. Using Sifteo Cubes to construct complex database queries [50].
Figure 30. Tangible objects for creating data queries [51].
Figure 31. Spyractable: the Reactable and tokens with tags in action [55].
Figure 32. A tangible rhythm sequencer with parameter controls, camera, and illumination [55].
Figure 33. Modular, interchangeable parts for constructing the skeleton of an elephant (a); manipulating the elephant (b) [57].
Figure 34. RFID-based tangible query for role-based visualization [58].
Figure 35. (a) Creation of a vase using SPATA tools; (b) sculpting decorative features with SPATA tools; (c) checking the size of the model; (d) exploring flower-hole angles; (e) the printed object [59].
Figure 36. Manipulation of SandScape, with the result projected onto the surface of the sand in real time (a) [61]; Illuminating Clay in use (b) [62].
Figure 37. Interchangeable emotional faces, characters, and objects (a); interactive diorama supporting storytelling (b) [69].
Figure 38. Storytelling model with two users viewing the story (a); the storyteller's view (b); the audience's view (c) [70].
Figure 39. Two ACTOs, a smartphone, and an Arduino with an RF module [71].
Figure 40. The TangiSense interactive table using LEDs [72].
Figure 41. Application of tangible objects for pairing smart devices [77].
Figure 42. A user playing an arcade puzzler with Sifteo Cubes [48,78,79].
Abstract
1. Introduction
2. Structure of Review
3. TUI Application Areas
3.1. TUI as a Method for Learning
3.2. Application of TUI in Medicine and Psychology
3.3. TUI for Programming and Controlling a Robot
3.4. TUI for Construction of Database Queries
3.5. TUI in Music and Arts
3.6. TUI for Modeling 3D Objects
3.7. TUI for Modeling in Architecture
3.8. TUIs in Literature and Storytelling
3.9. Adjustable TUI Solution
3.10. Commercial TUI Smart Toys
4. TUI Technical Solution Analysis
4.1. Sensory Technical Solution
4.1.1. Wireless Technologies
4.1.2. Sensors
4.1.3. Feedback Possibilities
4.2. Image Processing
4.2.1. Marker Approach for Object Recognition
4.2.2. Markerless Approach for Object Recognition
4.2.3. Used Cameras for Image Processing
4.2.4. Used Computer Vision Platforms for TUI Applications
5. Discussion
6. Conclusions
Funding
Conflicts of Interest
References
- Blackwell, A.F.; Fitzmaurice, G.; Holmquist, L.E.; Ishii, H.; Ullmer, B. Tangible user interfaces in context and theory. In CHI’07, Proceedings of the Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 28 April–3 May 2007; ACM Press: New York, NY, USA, 2007; pp. 2817–2820. [Google Scholar]
- Ishii, H.; Mazalek, A.; Lee, J. Bottles as a minimal interface to access digital information. In CHI’01, Proceedings of the Extended Abstracts on Human Factors in Computing Systems, Seattle, WA, USA, 31 March–5 April 2001; ACM Press: New York, NY, USA, 2001; pp. 187–188. [Google Scholar]
- Fiebrink, R.; Morris, D.; Morris, M.R. Dynamic mapping of physical controls for tabletop groupware. In CHI’09, Proceedings of the 27th International Conference on Human Factors in Computing Systems, Boston, MA, USA, 4–9 April 2009; ACM Press: New York, NY, USA, 2009; pp. 471–480. [Google Scholar]
- Shaer, O.; Hornecker, E. Tangible User Interfaces: Past, Present, and Future Directions. Found. Trends Hum. Comput. Interact. 2009, 3, 1–137. [Google Scholar] [CrossRef] [Green Version]
- Do-Lenh, S.; Jermann, P.; Cuendet, S.; Zufferey, G.; Dillenbourg, P. Task performance vs. learning outcomes: A study of a tangible user interface in the classroom. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Proceedings of the European Conference on Technology Enhanced Learning 2010: Sustaining TEL: From Innovation to Learning and Practice, Barcelona, Spain, 28 September–1 October 2010; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6383, pp. 78–92. [Google Scholar]
- Mateu, J.; Lasala, M.J.; Alamán, X. Tangible interfaces and virtual worlds: A new environment for inclusive education. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Proceedings of the Ubiquitous Computing and Ambient Intelligence. Context-Awareness and Context-Driven Interaction, Carrillo, Costa Rica, 2–6 December 2013; Springer: Cham, Switzerland, 2013; Volume 8276, pp. 119–126. [Google Scholar]
- Lucchi, A.; Jermann, P.; Zufferey, G.; Dillenbourg, P. An empirical evaluation of touch and tangible interfaces for tabletop displays. In TEI’10, Proceedings of the 4th International Conference on Tangible, Embedded, and Embodied Interaction, Cambridge, MA, USA, 24–27 January 2010; Association for Computing Machinery: New York, NY, USA, 2010; pp. 177–184. [Google Scholar]
- Yui, T.; Hashida, T. Floatio: Floating tangible user interface based on animacy perception. In UIST 2016 Adjunct, Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 43–45. [Google Scholar]
- Schneider, B.; Wallace, J.; Blikstein, P.; Pea, R. Preparing for future learning with a tangible user interface: The case of neuroscience. IEEE Trans. Learn. Technol. 2013, 6, 117–129. [Google Scholar] [CrossRef]
- Ma, J.; Sindorf, L.; Liao, I.; Frazier, J. Using a tangible versus a multi-touch graphical user interface to support data exploration at a museum exhibit. In TEI 2015, Proceedings of the 9th International Conference on Tangible, Embedded, and Embodied Interaction, Stanford, CA, USA, 15–19 January 2015; Association for Computing Machinery: New York, NY, USA, 2015; pp. 33–40. [Google Scholar]
- Vaz, R.I.F.; Fernandes, P.O.; Veiga, A.C.R. Proposal of a Tangible User Interface to Enhance Accessibility in Geological Exhibitions and the Experience of Museum Visitors. Procedia Comput. Sci. 2016, 100, 832–839. [Google Scholar] [CrossRef] [Green Version]
- Kaltenbrunner, M.; Bencina, R. reacTIVision. In TEI ’07, Proceedings of the 1st International Conference on Tangible and Embedded Interaction, Baton Rouge, LA, USA, 15–17 February 2007; Association for Computing Machinery: New York, NY, USA, 2007; pp. 69–74. [Google Scholar]
- Schneider, B.; Sharma, K.; Cuendet, S.; Zufferey, G.; Dillenbourg, P.; Pea, E.R. Using mobile eye-trackers to unpack the perceptual benefits of a tangible user interface for collaborative learning. ACM Trans. Comput. Interact. 2016, 23. [Google Scholar] [CrossRef]
- Schneider, B.; Jermann, P.; Zufferey, G.; Dillenbourg, P. Benefits of a tangible interface for collaborative learning and interaction. IEEE Trans. Learn. Technol. 2011, 4, 222–232. [Google Scholar] [CrossRef]
- Starcic, A.I.; Cotic, M.; Zajc, M. Design-based research on the use of a tangible user interface for geometry teaching in an inclusive classroom. Br. J. Educ. Technol. 2013, 44, 729–744. [Google Scholar] [CrossRef]
- Sorathia, K.; Servidio, R. Learning and Experience: Teaching Tangible Interaction & Edutainment. Procedia Soc. Behav. Sci. 2012, 64, 265–274. [Google Scholar] [CrossRef] [Green Version]
- Pulli, K.; Baksheev, A.; Kornyakov, K.; Eruhimov, V. Real-time computer vision with OpenCV. Commun. ACM 2012, 55, 61–69. [Google Scholar] [CrossRef]
- Campos, P.; Pessanha, S. Designing augmented reality tangible interfaces for kindergarten children. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Proceedings of the VMR 2011, Virtual and Mixed Reality-New Trends, Orlando, FL, USA, 9–14 July 2011; Springer: Berlin/Heidelberg, Germany, 2011; Volume 6773 LNCS, pp. 12–19. [Google Scholar]
- Girouard, A.; Solovey, E.T.; Hirshfield, L.M.; Ecott, S.; Shaer, O.; Jacob, R.J.K. Smart Blocks: A tangible mathematical manipulative. In TEI’07, Proceedings of the First International Conference on Tangible and Embedded Interaction, Baton Rouge, LA, USA, 15–17 February 2007; Association for Computing Machinery: New York, NY, USA, 2007; pp. 183–186. [Google Scholar]
- Almukadi, W.; Stephane, A.L. BlackBlocks: Tangible interactive system for children to learn 3-letter words and basic math. In ITS’15, Proceedings of the 2015 International Conference on Interactive Tabletops & Surfaces, Madeira, Portugal, 15–18 November 2015; ACM Press: New York, NY, USA, 2015; pp. 421–424. [Google Scholar]
- Patten, J.; Ishii, H.; Hines, J.; Pangaro, G. Sensetable: A wireless object tracking platform for tangible user interfaces. In Proceedings of the Conference on Human Factors in Computing Systems, Seattle, WA, USA, 31 March–5 April 2001; pp. 253–260. [Google Scholar]
- Reinschlüssel, A.; Alexandrovsky, D.; Döring, T.; Kraft, A.; Braukmüller, M.; Janßen, T.; Reid, D.; Vallejo, E.; Bikner-Ahsbahs, A.; Malaka, R. Multimodal Algebra Learning: From Math Manipulatives to Tangible User Interfaces. i-com 2018, 17, 201–209. [Google Scholar] [CrossRef]
- Gajadur, D.; Bekaroo, G. TangiNet: A Tangible User Interface System for Teaching the Properties of Network Cables. In Proceedings of the 2019 Conference on Next Generation Computing Applications (NextComp), Pointe-aux-Piments, Mauritius, 19–21 September 2019; pp. 1–6. [Google Scholar]
- Davis, J.U.; Wu, T.-Y.; Shi, B.; Lu, H.; Panotopoulou, A.; Whiting, E.; Yang, X.-D. TangibleCircuits: An Interactive 3D Printed Circuit Education Tool for People with Visual Impairments. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, 2020; pp. 1–13. [Google Scholar]
- Nathoo, A.; Bekaroo, G.; Gangabissoon, T.; Santokhee, A. Using tangible user interfaces for teaching concepts of internet of things. Interact. Technol. Smart Educ. 2020, 17, 133–158. [Google Scholar] [CrossRef]
- Manshad, M.S.; Pontelli, E.; Manshad, S.J. MICOO (Multimodal Interactive Cubes for Object Orientation): A tangible user interface for the blind and visually impaired. In ASSETS’11, Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility, Dundee, Scotland, UK, 24–26 October 2011; Association for Computing Machinery: New York, NY, USA, 2011; pp. 261–262. [Google Scholar]
- Billinghurst, M.; Kato, H. Collaborative augmented reality. Commun. ACM 2002, 45, 65–70. [Google Scholar] [CrossRef]
- McGookin, D.; Robertson, E.; Brewster, S. Clutching at straws: Using tangible interaction to provide non-visual access to graphs. In Proceedings of the Conference on Human Factors in Computing Systems, Atlanta, GA, USA, 10–15 April 2010; Volume 3, pp. 1715–1724. [Google Scholar]
- De La Guía, E.; Lozano, M.D.; Penichet, V.M.R. Educational games based on distributed and tangible user interfaces to stimulate cognitive abilities in children with ADHD. Br. J. Educ. Technol. 2015, 46, 664–678. [Google Scholar] [CrossRef]
- Jadan-Guerrero, J.; Jaen, J.; Carpio, M.A.; Guerrero, L.A. Kiteracy: A kit of tangible objects to strengthen literacy skills in children with Down syndrome. In IDC 2015, Proceedings of the 14th International Conference on Interaction Design and Children; Association for Computing Machinery: New York, NY, USA, 2015; pp. 315–318. [Google Scholar]
- Haro, B.P.M.; Santana, P.C.; Magaña, M.A. Developing reading skills in children with Down syndrome through tangible interfaces. In Proceedings of the ACM International Conference Proceeding Series, Singapore, 7 August 2012; pp. 28–34. [Google Scholar]
- Martin-Ruiz, M.L. Foundations of a Smart Toy Development for the Early Detection of Motoric Impairments at Childhood. Int. J. Pediatr. Res. 2015, 1, 1–5. [Google Scholar] [CrossRef] [Green Version]
- Rivera, D.; García, A.; Alarcos, B.; Velasco, J.R.; Ortega, J.E.; Martínez-Yelmo, I. Smart toys designed for detecting developmental delays. Sensors 2016, 16, 1953. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Lee, K.; Jeong, D.; Schindler, R.C.; Hlavaty, L.E.; Gross, S.I.; Short, E.J. Interactive block games for assessing children’s cognitive skills: Design and preliminary evaluation. Front. Pediatr. 2018, 6, 111. [Google Scholar] [CrossRef] [Green Version]
- Al Mahmud, A.; Soysa, A.I. POMA: A tangible user interface to improve social and cognitive skills of Sri Lankan children with ASD. Int. J. Hum. Comput. Stud. 2020, 144, 102486. [Google Scholar] [CrossRef]
- Woodward, K.; Kanjo, E.; Brown, D.J.; Inkster, B. TangToys: Smart Toys to Communicate and Improve Children’s Wellbeing. In Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2020 ACM International Symposium on Wearable Computers; ACM: New York, NY, USA, 2020; pp. 497–499. [Google Scholar]
- Di Fuccio, R.; Siano, G.; De Marco, A. The Activity Board 1.0: RFID-NFC WI-FI Multitags Desktop Reader for Education and Rehabilitation Applications. In Proceedings of the WorldCIST 2017: Recent Advances in Information Systems and Technologies, Terceira Island, Portugal, 30 March–2 April 2017; pp. 677–689. [Google Scholar]
- Strawhacker, A.; Bers, M.U. “I want my robot to look for food”: Comparing Kindergartner’s programming comprehension using tangible, graphic, and hybrid user interfaces. Int. J. Technol. Des. Educ. 2015, 25, 293–319. [Google Scholar] [CrossRef]
- Sullivan, A.; Bers, M.U. Robotics in the early childhood classroom: Learning outcomes from an 8-week robotics curriculum in pre-kindergarten through second grade. Int. J. Technol. Des. Educ. 2016, 26, 3–20. [Google Scholar] [CrossRef]
- Sapounidis, T.; Demetriadis, S. Tangible versus graphical user interfaces for robot programming: Exploring cross-age children’s preferences. Pers. Ubiquitous Comput. 2013, 17, 1775–1786. [Google Scholar] [CrossRef]
- Wang, D.; Zhang, C.; Wang, H. T-Maze: A tangible programming tool for children. In IDC 2011, Proceedings of the 10th International Conference on Interaction Design and Children, Ann Arbor, MI, USA, 20–23 June 2011; Association for Computing Machinery: New York, NY, USA, 2011; pp. 1–10. [Google Scholar]
- Motoyoshi, T.; Tetsumura, N.; Masuta, H.; Koyanagi, K.; Oshima, T.; Kawakami, H. Tangible gimmick for programming education using RFID systems. IFAC-PapersOnLine 2016, 49, 514–518. [Google Scholar] [CrossRef]
- Kakehashi, S.; Motoyoshi, T.; Koyanagi, K.; Oshima, T.; Masuta, H.; Kawakami, H. Improvement of P-CUBE: Algorithm education tool for visually impaired persons. In Proceedings of the 2014 IEEE Symposium on Robotic Intelligence in Informationally Structured Space (RiiSS), Orlando, FL, USA, 9–12 December 2014; pp. 1–6. [Google Scholar]
- Rong, Z.; Chan, N.F.; Chen, T.; Zhu, K. CodeRhythm: A Tangible Programming Toolkit for Visually Impaired Students. In AsianCHI’20, Proceedings of the 2020 Symposium on Emerging Research from Asia and on Asian Contexts and Cultures; ACM: New York, NY, USA, 2020; pp. 57–60. [Google Scholar]
- Nathoo, A.; Gangabissoon, T.; Bekaroo, G. Exploring the Use of Tangible User Interfaces for Teaching Basic Java Programming Concepts: A Usability Study. In Proceedings of the 2019 Conference on Next Generation Computing Applications (NextComp), Pointe-aux-Piments, Mauritius, 19–21 September 2019; pp. 1–5. [Google Scholar]
- Liu, L.; Wang, J.; Gong, H.; Guo, J.; Wang, P.; Wang, Z.; Huang, L.; Yao, C. ModBot: A Tangible and Modular Making Toolkit for Children to Create Underwater Robots. In CHI’20, Proceedings of the CHI Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, 2020; pp. 1–8. [Google Scholar]
- Merrad, W.; Héloir, A.; Kolski, C.; Krüger, A. RFID-based tangible and touch tabletop for dual reality in crisis management context. J. Multimodal User Interfaces 2021. [Google Scholar] [CrossRef]
- Merrill, D.; Sun, E.; Kalanithi, J. Sifteo cubes. In CHI’12, Proceedings of the CHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; Association for Computing Machinery: New York, NY, USA, 2012; pp. 1015–1018. [Google Scholar]
- Langner, R.; Augsburg, A.; Dachselt, R. Cubequery: Tangible interface for creating and manipulating database queries. In Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces, Dresden, Germany, 16–19 November 2014; pp. 423–426. [Google Scholar]
- Valdes, C.; Eastman, D.; Grote, C.; Thatte, S.; Shaer, O.; Mazalek, A.; Ullmer, B.; Konkel, M.K. Exploring the design space of gestural interaction with active tokens through user-defined gestures. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 4107–4116. [Google Scholar]
- Jofre, A.; Szigeti, S.; Keller, S.T.; Dong, L.-X.; Czarnowski, D.; Tomé, F.; Diamond, S. A tangible user interface for interactive data visualization. In Proceedings of the 25th Annual International Conference on Computer Science and Software Engineering, Markham, ON, Canada, 2–4 November 2015; pp. 244–247. [Google Scholar]
- Villafuerte, L.; Markova, M.S.; Jorda, S. Acquisition of social abilities through musical tangible user interface: Children with autism spectrum condition and the reactable. In CHI EA’12, Proceedings of the Extended Abstracts on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; Association for Computing Machinery: New York, NY, USA, 2012; pp. 745–760. [Google Scholar]
- Xambó, A.; Hornecker, E.; Marshall, P.; Jordà, S.; Dobbyn, C.; Laney, R. Exploring Social Interaction with a Tangible Music Interface. Interact. Comput. 2016, 29, 248–270. [Google Scholar] [CrossRef] [Green Version]
- Waranusast, R.; Bang-Ngoen, A.; Thipakorn, J. Interactive tangible user interface for music learning. In Proceedings of the International Conference Image and Vision Computing, Wellington, New Zealand, 27–29 November 2013; pp. 400–405. [Google Scholar]
- Potidis, S.; Spyrou, T. Spyractable: A tangible user interface modular synthesizer. In Proceedings of the International Conference on Human-Computer Interaction, Crete, Greece, 22–27 June 2014; Volume 8511, pp. 600–611. [Google Scholar]
- Gohlke, K.; Hlatky, M.; De Jong, B. Physical construction toys for rapid sketching of Tangible User Interfaces. In Proceedings of the 9th International Conference on Tangible, Embedded, and Embodied Interaction, Stanford, CA, USA, 15–19 January 2015; pp. 643–648. [Google Scholar]
- Jacobson, A.; Panozzo, D.; Glauser, O.; Pradalier, C.; Hilliges, O.; Sorkine-Hornung, O. Tangible and modular input device for character articulation. ACM Trans. Graph. 2014, 33, 82:1–82:12. [Google Scholar] [CrossRef]
- Lee, J.Y.; Seo, D.W.; Rhee, G.W. Tangible authoring of 3D virtual scenes in dynamic augmented reality environment. Comput. Ind. 2011, 62, 107–119. [Google Scholar] [CrossRef]
- Weichel, C.; Alexander, J.; Karnik, A.; Gellersen, H. SPATA: Spatio-tangible tools for fabrication-aware design. In Proceedings of the 9th International Conference on Tangible, Embedded, and Embodied Interaction, Stanford, CA, USA, 15–19 January 2015; pp. 189–196. [Google Scholar]
- Ishii, H.; Ullmer, B. Tangible bits: Towards seamless interfaces between people, bits and atoms. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, Atlanta, GA, USA, 22 March 1997; pp. 234–241. [Google Scholar]
- Ishii, H. The tangible user interface and its evolution. Commun. ACM 2008, 51, 32–36. [Google Scholar] [CrossRef]
- Piper, B.; Ratti, C.; Ishii, H. Illuminating clay: A 3-D tangible interface for landscape analysis. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Minneapolis, MN, USA, 20–25 April 2002; Volume 4, pp. 355–362. [Google Scholar]
- Ishii, H.; Ratti, C.; Piper, B.; Wang, Y.; Biderman, A.; Ben-Joseph, E. Bringing clay and sand into digital design: Continuous tangible user interfaces. BT Technol. J. 2004, 22, 287–299. [Google Scholar] [CrossRef]
- Nasman, J.; Cutler, B. Evaluation of user interaction with daylighting simulation in a tangible user interface. Autom. Constr. 2013, 36, 117–127. [Google Scholar] [CrossRef]
- Cutler, B.; Sheng, Y.; Martin, S.; Glaser, D.; Andersen, M. Interactive selection of optimal fenestration materials for schematic architectural daylighting design. Autom. Constr. 2008, 17, 809–823. [Google Scholar] [CrossRef] [Green Version]
- Maquil, V.; De Sousa, L.; Leopold, U.; Tobias, E. A geospatial tangible user interface to support stakeholder participation in urban planning. In Proceedings of the 2015 1st International Conference on Geographical Information Systems Theory, Applications and Management (GISTAM), Barcelona, Spain, 28–30 April 2015; pp. 1–8. [Google Scholar]
- Ha, T.; Woo, W.; Lee, Y.; Lee, J.; Ryu, J.; Choi, H.; Lee, K. ARtalet: Tangible user interface based immersive augmented reality authoring tool for digilog book. In Proceedings of the 2010 International Symposium on Ubiquitous Virtual Reality, Gwangju, Korea, 7–10 July 2010; pp. 40–43. [Google Scholar]
- Smith, A.; Reitsma, L.; Van Den Hoven, E.; Kotzé, P.; Coetzee, L. Towards preserving indigenous oral stories using tangible objects. In Proceedings of the 2011 2nd International Conference on Culture and Computing, Culture and Computing, Kyoto, Japan, 20–22 October 2011; pp. 86–91. [Google Scholar]
- Wallbaum, T.; Ananthanarayan, S.; Borojeni, S.S.; Heuten, W.; Boll, S. Towards a Tangible Storytelling Kit for Exploring Emotions with Children. In Thematic Workshops’17, Proceedings of the on Thematic Workshops of ACM Multimedia 2017, Mountain View, CA, USA, 23–27 October 2017; ACM Press: New York, NY, USA, 2017; pp. 10–16. [Google Scholar]
- Song, Y.; Yang, C.; Gai, W.; Bian, Y.; Liu, J. A new storytelling genre: Combining handicraft elements and storytelling via mixed reality technology. Vis. Comput. 2020, 36, 2079–2090. [Google Scholar] [CrossRef]
- Vonach, E.; Gerstweiler, G.; Kaufmann, H. ACTO: A modular actuated tangible user interface object. In Proceedings of the 2014 9th ACM International Conference on Interactive Tabletops and Surfaces (ITS 2014), Dresden, Germany, 16–19 November 2014; pp. 259–268. [Google Scholar]
- Kubicki, S.; Lepreux, S.; Kolski, C. RFID-driven situation awareness on TangiSense, a table interacting with tangible objects. Pers. Ubiquitous Comput. 2012, 16, 1079–1094. [Google Scholar] [CrossRef] [Green Version]
- Zappi, P.; Farella, E.; Benini, L. Hidden Markov models implementation for tangible interfaces. In Proceedings of the Third International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN 2009), Amsterdam, The Netherlands, 22–24 June 2009; Volume 9, pp. 258–263. [Google Scholar]
- Henderson, S.; Feiner, S. Opportunistic tangible user interfaces for augmented reality. IEEE Trans. Vis. Comput. Graph. 2010, 16, 4–16. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Lee, J.Y.; Kim, M.S.; Kim, J.S.; Lee, S.M. Tangible user interface of digital products in multi-Displays. Int. J. Adv. Manuf. Technol. 2012, 59, 1245–1259. [Google Scholar] [CrossRef]
- Wu, F.G.; Wang, P.C.; Tseng, W.T.; Cheng, C.M.; Chou, Y.C.; Sung, Y.H. Use pattern-recognition-based technology to explore a new interactive system on smart dining table. In Proceedings of the 2013 1st International Conference on Orange Technologies (ICOT), Tainan, Taiwan, 12–16 March 2013; pp. 252–255. [Google Scholar]
- Park, D.; Choo, H. Vibration Based Tangible Tokens for Intuitive Pairing Among Smart Devices. In Proceedings of the 4th International Conference on Human Aspects of Information Security, Privacy, and Trust, Toronto, ON, Canada, 17–22 July 2016; pp. 48–56. [Google Scholar]
- Merrill, D.; Kalanithi, J.; Maes, P. Siftables: Towards sensor network user interfaces. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction, Baton Rouge, LA, USA, 15–17 February 2007; pp. 75–78. [Google Scholar]
- Garber, L. Tangible user interfaces: Technology you can touch. Computer 2012, 45, 15–18. [Google Scholar] [CrossRef]
- Cooper, J.R.; Tentzeris, M.M. Novel “Smart Cube” Wireless Sensors with Embedded Processing/Communication/Power Core for “Smart Skins” Applications. In Proceedings of the 2012 IEEE SENSORS, Taipei, Taiwan, 28–31 October 2012; pp. 1–4. [Google Scholar]
- Cutler, B.; Nasman, J. Interpreting Physical Sketches as Architectural Models. In Advances in Architectural Geometry 2010; Springer: Heidelberg, Germany, 2010; pp. 15–35. [Google Scholar]
TUI Application Area | Short Description |
---|---|
Teaching | Stimulation of the learning process in traditional study subjects; increased attractiveness at different education levels (school, kindergarten, and exhibitions). |
Medicine and psychology | Solutions for blind people and for children with ADHD (attention deficit hyperactivity disorder) or Down syndrome; early detection of motoric impairments and developmental delays in childhood. |
Programming and controlling robots | Teaching the basics of programming and robot control to children, students, and visually impaired persons. |
Database development | Construction of database queries with tangible blocks. |
Music and arts | Composition of music. |
Modeling of 3D objects | Design of 3D objects using augmented reality, TUIs, and tangible measuring tools. |
Modeling in architecture | Geospatial design of buildings; configuration and control of basic simulations of urban shadows, light reflection, wind flow, and traffic jams; simulation of daylight in rooms. |
Literature and storytelling | Development of imagination and vocabulary; preservation of stories of cultural heritage. |
Technology | Function | References |
---|---|---|
ATmega328P | Microcontroller | [33]
ATmega168 | Microcontroller | [73]
Raspberry Pi | Single-board computer | [33]
Arduino | Microcontroller board | [8,11,16,41,42,44,71,77]
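
Most of the prototypes surveyed here build their tangible objects around Arduino-class microcontrollers. Purely as an illustration (the sketch is not taken from any of the cited systems, and the pin assignments and baud rate are assumptions), the following minimal Arduino C++ firmware shows the typical pattern of such an object: poll a local input, debounce it, give immediate local feedback, and report state changes to a host application over a serial link.

```cpp
// Minimal illustrative sketch (assumed wiring, not from any cited prototype):
// a tangible block reports its push-button state to a host over serial.
const int BUTTON_PIN = 2;               // assumption: momentary switch inside the block
const int LED_PIN = 13;                 // assumption: on-board LED used as local feedback
const unsigned long DEBOUNCE_MS = 30;   // debounce interval in milliseconds

int lastStableState = HIGH;             // debounced state (HIGH = released with pull-up)
int lastReading = HIGH;                 // last raw reading
unsigned long lastChangeMs = 0;         // time of the last raw change

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);    // button connects the pin to GND when pressed
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);                   // host application listens on this serial link
}

void loop() {
  int reading = digitalRead(BUTTON_PIN);
  if (reading != lastReading) {         // raw level changed: restart the debounce timer
    lastChangeMs = millis();
    lastReading = reading;
  }
  if (millis() - lastChangeMs > DEBOUNCE_MS && reading != lastStableState) {
    lastStableState = reading;          // accepted (debounced) state change
    digitalWrite(LED_PIN, reading == LOW ? HIGH : LOW);        // local feedback
    Serial.println(reading == LOW ? "PRESSED" : "RELEASED");   // report to the host
  }
}
```

The same skeleton scales to the richer objects in the tables that follow by swapping the button for an RFID reader, accelerometer, or radio module.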
Technology | Used with | References |
---|---|---|
RFID | RFID reader, RFID tag | [16,29,30,33,42,43,58,64,68,72,80] |
ZigBee module | | [34]
Bluegiga WT12 | Bluetooth 2.0 transmitter | [40,73]
nRF24L01+ mini | 2.4 GHz RF transceiver | [39]
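
RFID is by far the most common wireless solution in the table above: each tangible object carries a passive tag, and a reader in the table or board identifies the object by its tag UID. The sketch below illustrates only this identification step and assumes a hobbyist MFRC522 reader on SPI together with the open-source MFRC522 Arduino library; none of the cited systems is confirmed to use this particular reader or library.

```cpp
// Illustrative sketch assuming an MFRC522 reader on SPI and the MFRC522 Arduino library;
// the cited prototypes use various RFID readers, so treat the wiring and library as assumptions.
#include <SPI.h>
#include <MFRC522.h>

const uint8_t SS_PIN = 10;    // assumed chip-select pin
const uint8_t RST_PIN = 9;    // assumed reset pin
MFRC522 rfid(SS_PIN, RST_PIN);

void setup() {
  Serial.begin(9600);
  SPI.begin();
  rfid.PCD_Init();            // initialize the reader
}

void loop() {
  // Wait until a tagged tangible object is placed on the antenna and its UID can be read.
  if (!rfid.PICC_IsNewCardPresent() || !rfid.PICC_ReadCardSerial()) {
    return;
  }
  // Print the tag UID; a host application would map it to the corresponding tangible object.
  Serial.print("Object UID:");
  for (byte i = 0; i < rfid.uid.size; i++) {
    Serial.print(' ');
    Serial.print(rfid.uid.uidByte[i], HEX);
  }
  Serial.println();
  rfid.PICC_HaltA();          // stop talking to this tag until it is presented again
}
```

On the host side, the reported UID is simply a key into a lookup table that associates each physical object with its digital content or behavior.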
Sensor | Used as | References |
---|---|---|
Optical sensor | Determining the relative position | [34,42,48,57,73,78,79] |
Accelerometers, gyroscopes, magnetometers | Motion measurement | [32,33,34,49,57,73,78,79] |
RFID sensors | Antenna, RFID chip | [16,29,30,33,42,43,58,64,68,72,80] |
Hall sensor | | [57]
Light-dependent resistor | Determining the relative position | [32,33] |
Tilt sensor | Detecting the motion of objects for managing the energy saving of TUIs | [32,33,71]
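
The accelerometers, gyroscopes, and tilt sensors listed above are typically used to decide whether a smart cube or toy is currently being manipulated, both for interaction and for power management. As a hedged sketch, assuming an MPU-6050 accelerometer on I2C (address 0x68) rather than any specific part used in the cited prototypes, manipulation can be flagged whenever the measured acceleration magnitude deviates noticeably from 1 g:

```cpp
// Illustrative sketch assuming an MPU-6050 accelerometer on I2C at address 0x68;
// the cited smart cubes and toys use various IMUs, so this is only an example.
#include <Wire.h>
#include <math.h>

const uint8_t MPU_ADDR = 0x68;

// Read one 16-bit big-endian axis value from the sensor's data stream.
int16_t readAxis() {
  uint8_t hi = Wire.read();
  uint8_t lo = Wire.read();
  return (int16_t)((hi << 8) | lo);
}

void setup() {
  Serial.begin(9600);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                    // PWR_MGMT_1 register
  Wire.write(0x00);                    // clear the sleep bit to wake the sensor
  Wire.endTransmission(true);
}

void loop() {
  // Request the six raw acceleration bytes starting at register 0x3B (ACCEL_XOUT_H).
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)6);
  int16_t ax = readAxis();
  int16_t ay = readAxis();
  int16_t az = readAxis();

  // At the default +/-2 g range, 16384 LSB correspond to 1 g.
  float gx = ax / 16384.0, gy = ay / 16384.0, gz = az / 16384.0;
  float magnitude = sqrt(gx * gx + gy * gy + gz * gz);

  // A resting object measures roughly 1 g; larger deviations indicate manipulation.
  if (fabs(magnitude - 1.0) > 0.3) {
    Serial.println("Object is being moved");
  }
  delay(50);
}
```

The same threshold test can drive energy saving: if no manipulation is detected for some time, the firmware can put the radio and microcontroller to sleep, as the tilt-sensor entries in the table suggest.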