Abstract
We seek to increase the sophistication of the visual system of our insect-like hexapod robot MantisBot. We assembled a benchtop robotic testbed with which to evaluate our dynamical neural model of the insect visual system; here we specifically model wide-field vision and the optomotor response. The system is composed of a Raspberry Pi with a camera outfitted with a 360° lens. The camera sits on a motorized turntable, which represents the “robot”. Above the turntable sits another motorized system that rotates a drum with printed patterns around the camera, which represents the visual “background”. The camera downsamples the visual scene and sends it to a synthetic nervous system (SNS) model of the insect optic lobe. The optic lobe model is columnar: each column detects changes in receptor intensity (retina), inhibits adjacent columns to increase dynamic range (lamina), and compares time-delayed activities of adjacent columns to detect motion (medulla); the motion signals from all columns are then pooled via direction-specific connectivity to compute the direction and speed of wide-field scene motion (lobula plate). Our robotic model successfully encodes lateral wide-field visual speed in the activity of a pair of opposing Lobula Plate Tangential Cells (LPTCs). Furthermore, the optomotor response can be recreated by using the LPTCs to stimulate the neck motor neurons (MNs), producing a real-time, closed-loop dynamical model of the optomotor response.
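The columnar pipeline summarized above (retina → lamina → medulla → lobula plate → neck command) can be illustrated with a short rate-based sketch. The Python/NumPy code below is not the authors' SNS implementation; it is a simplified approximation written to clarify the signal flow, and all names, constants, and filter choices are assumptions made for illustration.

```python
# Illustrative sketch of the columnar motion pipeline described in the abstract.
# NOT the authors' synthetic nervous system (SNS) model; a simplified rate-based
# NumPy approximation. All names and parameters are assumptions.
import numpy as np

N_COLUMNS = 64   # assumed number of columns sampled around the 360-degree image
DT = 0.01        # simulation time step [s]
TAU_DELAY = 0.05 # low-pass ("delay") time constant for the correlator arm [s]

class WideFieldMotionModel:
    def __init__(self, n=N_COLUMNS):
        self.prev = np.zeros(n)     # previous retinal intensities
        self.delayed = np.zeros(n)  # low-pass-filtered (delayed) lamina signals

    def step(self, intensities):
        """intensities: 1-D array of per-column brightness (downsampled camera ring)."""
        # Retina: respond to changes in receptor intensity (temporal high-pass).
        change = intensities - self.prev
        self.prev = intensities.copy()

        # Lamina: inhibition from adjacent columns increases dynamic range.
        neighbors = np.roll(change, 1) + np.roll(change, -1)
        lamina = change - 0.25 * neighbors

        # Medulla: compare each column's time-delayed signal with its neighbor's
        # instantaneous signal (delay-and-compare elementary motion detection).
        self.delayed += (DT / TAU_DELAY) * (lamina - self.delayed)
        rightward = self.delayed * np.roll(lamina, -1)  # motion toward higher index
        leftward = np.roll(self.delayed, -1) * lamina   # motion toward lower index

        # Lobula plate: pool all columns into two opposing LPTC-like signals.
        lptc_right = np.maximum(rightward - leftward, 0.0).sum()
        lptc_left = np.maximum(leftward - rightward, 0.0).sum()

        # Optomotor output: drive the neck in the direction of wide-field motion.
        neck_command = lptc_right - lptc_left
        return lptc_left, lptc_right, neck_command

if __name__ == "__main__":
    # Drive the model with a drifting sinusoidal grating, standing in for the
    # rotating patterned drum, and report the resulting neck command.
    model = WideFieldMotionModel()
    phase = np.linspace(0.0, 2.0 * np.pi, N_COLUMNS, endpoint=False)
    for t in np.arange(0.0, 1.0, DT):
        grating = 0.5 + 0.5 * np.sin(phase - 2.0 * np.pi * t)  # 1 cycle/s drift
        left, right, cmd = model.step(grating)
    print(f"LPTC left={left:.3f}, right={right:.3f}, neck command={cmd:.3f}")
```

Under this sketch, a drum rotating in one direction drives the corresponding LPTC-like sum more strongly than its opponent, and the signed difference stands in for the neck motor command that closes the optomotor loop.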
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Sedlackova, A., Szczecinski, N.S., Quinn, R.D. (2020). A Synthetic Nervous System Model of the Insect Optomotor Response. In: Vouloutsi, V., Mura, A., Tauber, F., Speck, T., Prescott, T.J., Verschure, P.F.M.J. (eds) Biomimetic and Biohybrid Systems. Living Machines 2020. Lecture Notes in Computer Science, vol. 12413. Springer, Cham. https://doi.org/10.1007/978-3-030-64313-3_30
DOI: https://doi.org/10.1007/978-3-030-64313-3_30
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-64312-6
Online ISBN: 978-3-030-64313-3
eBook Packages: Computer Science, Computer Science (R0)