Open Access. Published by De Gruyter on October 17, 2023, under a CC BY 4.0 license.

A fully automated touch-response behavior inspection pipeline on zebrafish larvae

Yanke Wang, Christian Pylatiuk, Ralf Mikut, Ravindra Peravali and Markus Reischl

Abstract

The touch-evoked response of zebrafish larvae provides information on the mechanisms of gene functional expression. Recently, an automated system was developed for precise, repeatable touch-response experimentation with minimal human intervention. To quantify the collected data, we propose a fully automated multi-larvae touch-response behavior inspection pipeline based on larva tracking and segmentation. Experimental data from different treatments are analyzed with the proposed inspection platform for demonstration, and the results show that the platform generates comparable touch-response behavior readouts efficiently and automatically. The initial results were published at the 31st Workshop Computational Intelligence; this paper summarizes and extends that work.

Zusammenfassung

A response of zebrafish larvae evoked by mechanical touch stimuli provides information about the mechanism of gene functional expression. Recently, an automated system was developed to conduct accurate and repeated touch-response experiments with little human assistance. To quantify the collected data, we propose a fully automated inspection pipeline for the touch-evoked responses of multiple larvae, based on larva tracking and segmentation. Experimental data with different chemical treatments are analyzed with the proposed inspection platform. The evaluation confirms that the platform can generate comparable touch-response readouts efficiently and automatically. The original results were published in a conference contribution at the 31st Workshop Computational Intelligence and are summarized and extended in this article.

1 Introduction

Zebrafish larvae are commonly used animal models for organism-based screenings due to their small size, high fecundity, and short reproductive cycle [1]. Their specific (repeatable and obvious) behaviors indicate certain functional mechanisms of mutants under treatments [2, 3], making large-scale high-throughput screening of chemicals or drugs possible. Automated experimental systems for acquiring data on these behaviors have been developed [3–7], so automated high-throughput inspection of the resulting data is in increasingly high demand, as visual inspection is time-consuming and not statistically comparable. In particular, an automated touch-response experimental system for zebrafish larvae has been developed that conducts the experiment by controlling a blunt needle to touch a larva at a specific position and with a predefined force (Figure 1). The touch-evoked response of a zebrafish larva consists of three components: a C-Bend (the larva bends its body into a C shape), reverse C-Bends, and an escape movement (a change of position). The touch-response experimental data (videos) are recorded at a high frame rate [8, 9], so automated inspection is essential.

During the touch-evoked response, four time points are of importance: the time when the touch is applied ($t_1$), when the response begins ($t_2$), the response peak ($t_3$), and when the response stops ($t_4$). Five criteria are to be quantified: the latency time ($t_l$), the C-Bend curvature maximum ($c_m$), the C-Bend peak time ($t_{cp}$), the response time ($t_r$), and the escape distance ($d_e$). However, it is challenging to compute precise C-Bend curvatures and escape distances manually [7]. Furthermore, operators cannot apply the same criteria consistently to every video, as a video contains more than ten thousand frames on average. We therefore proposed a touch-response quantification pipeline for a single zebrafish larva in [8], but the multi-larvae case poses additional challenges: (i) multiple larvae need to be tracked and segmented at the same time; (ii) the larva that is actually touched must be identified; (iii) the quantification of multiple larvae incurs higher computational costs. To solve these problems, we proposed an AI-based Multi-larvae Touch-response behavior Inspection Pipeline (AMTIP) in work published at the 31st Workshop Computational Intelligence [10], whose main contributions are summarized and extended in this paper.

Figure 1: Diagram of the touch-response experiment and the corresponding response.

In AMTIP, the tracking procedure plays a vital role, especially for multiple larvae [9]. Recently, machine learning and deep learning based tracking methods have emerged that improve tracking accuracy [11, 12], and much previous work has focused on the tracking and segmentation of single or multiple adult zebrafish [13–16]. To exploit deep learning, we use a U-Net [17] based segmentation method to initialize the tracking. However, such computationally expensive methods are difficult to apply to every frame of our high-frame-rate videos. To keep the inspection pipeline lightweight, we propose an optical flow based needle tracking procedure and a particle filter based larvae tracking procedure. In addition, the segmentation of each larva is important for analyzing its movements. In [18], a Gaussian Mixture Model (GMM) based segmentation detects moving objects, and noise is filtered by region size using a global Otsu method (a conventional automated thresholding). However, considering global information would make our platform more computationally expensive, so a local region growing based segmentation is applied to each larva, seeded by the result of the tracking procedure. Based on the tracking and segmentation results, AMTIP identifies the touched larva and generates the behavior quantification according to the proposed experiment criteria. To test the performance of the platform, we conduct six sets of experiments with different drugs and analyze the experiment criteria and the detected errors (failure cases). The experimental results verify that AMTIP analyzes touch-response data with high efficiency and reduces the effort of the operators involved in the experiments. The methods used in AMTIP can contribute to the optimization of object tracking for videos that are computationally expensive to analyze. Moreover, AMTIP can be adapted to inspect other organisms (such as medaka) and extended with additional quantification criteria.

The article is organized as follows. Section 2 describes the design of the proposed AMTIP. Section 3 provides the experimental setup, the quantification criteria and results, and the discussion. Conclusions are drawn in Section 4.

2 Multi-larvae touch-response inspection pipeline

The touch-response inspection procedure transforms the raw data collected by the acquisition platform into variables (criteria) that are meaningful to humans. Figure 2 visualizes the architecture of AMTIP, which takes the videos collected by the data acquisition system [8, 9] and generates the quantification criteria of the touch-response behaviors: latency time $t_l$, C-Bend curvature maximum $c_m$, C-Bend curvature peak time $t_{cp}$, response time $t_r$, and escape distance $d_e$. Four time points are vital to the quantification of the touch response: $t_1$ (touch applied), $t_2$ (response begins), $t_3$ (response peak), and $t_4$ (response stops). AMTIP consists of three essential parts: initialization, the tracking and segmentation procedure, and quantification.

Figure 2: Diagram of the AI-based multi-larvae touch-response behavior inspection pipeline (AMTIP), including the initialization, the tracking and segmentation procedure, the quantification, and the quantification criteria. The steps are marked in red.

The initial positions of the needle and larvae, obtained from the first frame (initialization, Step 1 in Figure 2), are vital to the accuracy of the tracking procedure in AMTIP. Thus, a U-Net is used to segment the needle and larvae, and the segments are used directly as the initial positions for the tracking procedure; a minimal sketch of this step is given below.
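As an illustration, the sketch below turns a first-frame U-Net segmentation into initial positions. The `unet` callable, its label convention, and the area threshold are hypothetical stand-ins for AMTIP's actual model; the centroid extraction uses standard OpenCV calls.

```python
# Sketch of Step 1 (initialization). `unet` is a hypothetical callable that
# maps a grayscale frame to a label mask {0: background, 1: larva, 2: needle}.
import cv2
import numpy as np

def initial_positions(first_frame, unet):
    """Return the needle centroid and one centroid per larva from frame 0."""
    labels = unet(first_frame)                       # assumed model interface
    needle_mask = (labels == 2).astype(np.uint8)
    larvae_mask = (labels == 1).astype(np.uint8)

    # Needle position: centroid of the needle blob via image moments.
    m = cv2.moments(needle_mask, binaryImage=True)
    needle_xy = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

    # One centroid per connected component; tiny specks filtered by area.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(larvae_mask)
    larvae_xy = [centroids[i] for i in range(1, n)
                 if stats[i, cv2.CC_STAT_AREA] > 50]  # heuristic threshold
    return needle_xy, larvae_xy
```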

Given the initial positions, each subsequent frame of the video is processed by the tracking and segmentation procedure (Step 2 in Figure 2), which comprises optical flow based needle tracking [19, 20], particle filter based larva tracking [11], and region growing based larva segmentation. As the needle moves slowly, optical flow is sufficient to estimate the needle position in each frame, with coordinates denoted as $(X_j^n, Y_j^n, t_j)$, where the superscript $n$ indicates the needle and $t_j$ indicates frame $j$. Optical flow, however, cannot track the rapidly moving larvae, so particle filter based tracking is used to generate the position of each larva in each frame, $(X_j^l, Y_j^l, t_j)$, where $l$ indexes the larva. To enable a detailed analysis of the touch-response behaviors, a region growing based larva segmentation is applied, with the seed points chosen from the larva positions generated by the tracking procedure, as described in [10]. The outputs of tracking and segmentation are the image patches of all larvae as well as the positions of the larvae and the needle in each frame. A sketch of these three components follows.
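A compact sketch of the three components under stated assumptions: pyramidal Lucas-Kanade optical flow (a standard OpenCV routine) for the slow needle, a bootstrap particle filter with a random-walk motion model for each larva, and flood-fill based region growing for segmentation. The darkness-based particle likelihood and all parameter values are illustrative assumptions, not AMTIP's exact models.

```python
# Sketch of Step 2, assuming grayscale frames with dark larvae on a bright
# background. Likelihood model and parameters are illustrative only.
import cv2
import numpy as np

rng = np.random.default_rng(0)

def track_needle(prev_gray, gray, needle_xy):
    # Pyramidal Lucas-Kanade optical flow; adequate for the slow needle.
    pts = np.asarray(needle_xy, np.float32).reshape(1, 1, 2)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    return nxt.reshape(2) if status[0] else needle_xy  # keep old pose on failure

def track_larva(gray, particles, motion_std=5.0):
    """particles: (N, 2) array, initialized around the larva's U-Net position."""
    # Predict: random-walk motion model for the fast, erratic larva.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    h, w = gray.shape
    xs = np.clip(particles[:, 0], 0, w - 1).astype(int)
    ys = np.clip(particles[:, 1], 0, h - 1).astype(int)
    # Update: weight particles by pixel darkness (assumed larva appearance).
    weights = (255.0 - gray[ys, xs]).astype(float) + 1e-6
    weights /= weights.sum()
    estimate = weights @ particles          # weighted-mean position estimate
    # Resample to avoid weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], estimate

def segment_larva(gray, seed_xy, tol=25):
    # Region growing via flood fill around the tracked position; mask-only
    # mode leaves the frame intact and returns a binary mask of the larva.
    mask = np.zeros((gray.shape[0] + 2, gray.shape[1] + 2), np.uint8)
    flags = 4 | cv2.FLOODFILL_MASK_ONLY | (255 << 8)
    cv2.floodFill(gray, mask, (int(seed_xy[0]), int(seed_xy[1])),
                  0, loDiff=tol, upDiff=tol, flags=flags)
    return mask[1:-1, 1:-1]
```

The particle positions are reused later: the percentage of particles that actually move between frames serves as the motion cue thresholded by $T_{mq}$ in Step 3-2 of Section 2.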

With the results of the tracking and segmentation procedure, the quantification criteria (latency time $t_l$, C-Bend curvature maximum $c_m$, C-Bend curvature peak time $t_{cp}$, response time $t_r$, and escape distance $d_e$) are generated in the following three steps:

  1. Each video contains multiple larvae (Larva #1, Larva #2, ..., Larva #l), so the larva that is actually touched by the needle must first be identified among the larvae in the video for the subsequent quantification (Step 3-1 in Figure 2). The needle stops at the larva position after the touch is applied, so the touched larva is the one whose initial position ($X_0^l$ at $t = 0$) is closest to the final position of the needle ($X_{t_f}^n$ at $t = t_f$).

  2. For the latency time $t_l$ and response time $t_r$, the essential time points $t_1$ (touch applied), $t_2$ (response begins), and $t_4$ (response stops) need to be computed, as detailed in Figure 2. As shown in Step 3-2 in Figure 2, the distance between the needle and the larva is computed for each frame from $t = 0$ until it falls below a heuristic threshold $T_{nl}$; this time point is $t_1$ (touch applied). $t_2$ is the time point when the larva starts to move and the response begins, detected with a heuristic threshold $T_{mq}$ on the percentage of moved particles in the particle filter of Step 2. Similarly, as shown in Step 3-3 in Figure 2, $t_4$ is found by searching backwards from $t_f$ for the time point when the response stops. Consequently, the latency time is $t_l = t_2 - t_1$ and the response time is $t_r = t_4 - t_2$. The escape distance is the sum of the distances between consecutive larva positions from $t_2$ to $t_4$: $d_e = \sum_{j=t_2}^{t_4-1} \sqrt{(X_j^l - X_{j+1}^l)^2 + (Y_j^l - Y_{j+1}^l)^2}$.

  3. To quantify the amplitude of the touch-response behaviors, the curvature of the C-Bend is analyzed for each frame based on the skeleton of the larva. From the curvature trace, the C-Bend curvature maximum $c_m$ is computed, as well as its time point ($t_3$, response peak), yielding $t_{cp} = t_3 - t_2$. A sketch of these three steps is given after this list.
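An illustrative sketch of Steps 3-1 to 3-3. The array shapes, the `moving` flags (assumed to be the particle-motion cue already thresholded with $T_{mq}$), the bend measure, and the edge-case handling are assumptions, not AMTIP's exact implementation.

```python
# Sketch of the quantification under the definitions above.
import numpy as np

def quantify(needle, larvae, moving, fps, T_nl=10.0):
    """needle: (F, 2) positions; larvae: (L, F, 2); moving: (L, F) booleans."""
    # Step 3-1: the touched larva starts closest to the needle's final position.
    touched = int(np.argmin(np.linalg.norm(larvae[:, 0] - needle[-1], axis=1)))
    pos, mov = larvae[touched], moving[touched]

    # Step 3-2: t1 = first frame whose needle-larva distance is below T_nl,
    # t2 = first "moving" frame at or after t1.
    dist = np.linalg.norm(pos - needle, axis=1)
    t1 = int(np.argmax(dist < T_nl))
    t2 = t1 + int(np.argmax(mov[t1:]))

    # Step 3-3: t4 = last "moving" frame, found by searching backwards.
    t4 = len(mov) - 1 - int(np.argmax(mov[::-1]))

    # Criteria: latency, response time, and summed frame-to-frame displacement.
    escape = float(np.linalg.norm(np.diff(pos[t2:t4 + 1], axis=0), axis=1).sum())
    return {"larva": touched, "t_l": (t2 - t1) / fps,
            "t_r": (t4 - t2) / fps, "d_e": escape}

def cbend_proxy(mask):
    # A simple bend measure from the larva skeleton (not the paper's exact
    # curvature): skeleton length over its end-to-end chord; ~1 when straight.
    # c_m is then the per-frame maximum between t2 and t4, and t_cp its time.
    from scipy.spatial.distance import pdist
    from skimage.morphology import skeletonize
    pts = np.argwhere(skeletonize(mask > 0))
    return len(pts) / max(float(pdist(pts).max()), 1.0)
```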

3 Experiment

3.1 Experiment setup

Different chemicals can have different influences on the touch-evoked behaviors of zebrafish larvae. Thus, experiments with long-term treatments (denoted $E_{lt}$) of different chemicals are conducted on zebrafish larvae to analyze the differences between the chemicals; the protocol is visualized on a timeline in Figure 3, and the experimental setup is outlined in Table 1.

Figure 3: Protocol of the quantification experiment $E_{lt}$, which is marked in red.

Table 1: The setup of experiment $E_{lt}$. Wild: larvae in fish water. DMSO: larvae in dimethyl sulfoxide. Dia: larvae treated with diazepam. Iso: larvae treated with isoprenaline hydrochloride. Caffi: larvae treated with caffeine. Saha: larvae treated with suberoylanilide hydroxamic acid.

Parameter                              Quantity
$T_{nl}$ (used in AMTIP, Section 2)    10 pixels
$T_{mq}$ (used in AMTIP, Section 2)    50 %
Volume of water                        4 mL
Age of larvae                          73 hpf
Dechorionation                         27 hpf
Touching position                      Body
Number of larvae per well              4
Type of larvae                         Wild and treated with DMSO (1 %), Dia, Iso, Caffi, Saha
Concentration of chemicals             100 μmol/mL
Total number of larvae                 173

In experiment $E_{lt}$, larvae at 73 h post-fertilization (hpf) are placed in the well plate, and the touch is applied to the body. The details of the treatments are outlined in Table 1. Six cases are examined: wild type (without treatment), larvae in dimethyl sulfoxide (DMSO),[1] larvae treated with diazepam (Dia), which reduces movement [21], isoproterenol hydrochloride (Iso), with unknown effects, caffeine (Caffi), which also reduces movement [21], and suberoylanilide hydroxamic acid (Saha), with unknown effects. Each treatment is applied at a concentration of 100 μmol/mL for the demonstration. For the long-term treatment, the larvae are dechorionated and treated at 27 hpf. The proposed AMTIP is used to quantify the data collected in experiment $E_{lt}$ to verify the assumption that AMTIP generates different touch-response behavior criteria for different long-term chemical treatments of zebrafish larvae. Although AMTIP is designed for the multi-larvae case, it can also quantify data from the single-larva case.

3.2 Experiment results

The experiment on long-term treatment ($E_{lt}$) yields 173 videos (24 for Wild, 27 for DMSO, 38 for Dia, 30 for Iso, 24 for Caffi, and 30 for Saha; the dataset is denoted DA-Elt in Table 2), and the quantification is run via AMTIP. The results are visualized in Figure 4: latency time $t_l$, C-Bend curvature maximum $c_m$, C-Bend curvature peak time $t_{cp}$, response time $t_r$, and escape distance $d_e$. The results in Figure 4 verify the assumption of experiment $E_{lt}$ in Section 3.1 that AMTIP generates different quantification results (chemical effects) for different chemicals. In detail, larvae with a longer latency time have a lower response strength (lower $c_m$), a shorter time to reach the C-Bend peak (lower $t_{cp}$), and a shorter response duration (lower $t_r$ and $d_e$), as seen for Dia and Caffi. This result also shows that larvae treated with Dia and Caffi respond less than the wild-type and DMSO controls. The treatments with Iso and Saha do not change the touch-response behaviors significantly, with results similar to the wild type and DMSO.

Table 2: The datasets of experiment $E_{lt}$.

Dataset denotation   Chemical    Number
DA-Elt               Wild type   24 videos
                     DMSO        27 videos
                     Dia         38 videos
                     Iso         30 videos
                     Caffi       24 videos
                     Saha        30 videos
                     Total       173 videos
Figure 4: Five quantification indices on DA-Elt for the six experiment cases (Wild, DMSO, Dia, Iso, Caffi, and Saha) generated by AMTIP. (a) Latency time $t_l$. (b) C-Bend curvature maximum $c_m$. (c) C-Bend curvature peak time $t_{cp}$. (d) Response time $t_r$. (e) Escape distance $d_e$.

3.3 Evaluation of AMTIP

The proposed AMTIP can fail in the touch-response quantification owing to inaccuracies of the segmentation method and objects missed by the tracking procedure. Thus, the detected errors (failure cases) are analyzed to evaluate AMTIP. The collected videos contain some unquantifiable ones, e.g., videos in which no larva is touched or in which the larvae or the needle cannot be detected. The dataset DA-Elt from the long-term treatment experiment $E_{lt}$ is used to analyze the detected errors, namely the number of videos with no larvae touched (#NT) and the number of quantification failures (#QF). For the collected videos (#C) shown in Table 3, the ground-truth numbers of videos with no larvae touched (#NT_g, generated by visual screening) are compared with the numbers output by AMTIP (#NT_p), reported as false positive rate (FPR) and false negative rate (FNR). The numbers of quantification failures (#QF) are given together with their percentage $E_{QF} = \#QF / (\#C - \#NT_g)$; a small consistency sketch of these rates follows Table 3. AMTIP generates #NT_p and #QF automatically and finds more than 90 % of the videos without any touched larva on average (1 − FNR). Around 10 % of the valid videos (#C − #NT_g) cannot be quantified by AMTIP (failure cases).

In addition, larvae treated with Dia are assumed to show scarcely any response. The latency time would therefore ideally be infinite, and the other quantification criteria (C-Bend curvature maximum, C-Bend curvature peak time, response time, and escape distance) would be 0. AMTIP, however, can only output finite numbers bounded by the video duration (15 s in our case); still, the latency time of Dia in Figure 4a remains useful for comparison with the controls, as it is much longer than those of Wild and DMSO. Furthermore, the results in Figure 4b–e lie above zero instead of the expected zero, for the following reasons: (i) some larvae still respond slightly; (ii) the movement of the needle can push a larva away (fake response); (iii) the tracking procedure registers larva movements caused by slight environmental changes or other inaccuracies. Nonetheless, the results of Dia in Figure 4b–e are much lower than those of Wild and DMSO. In other words, even with slight variance, AMTIP verifies our assumption that the Dia treatment reduces the touch response of zebrafish larvae. Finally, AMTIP achieves the quantification with high efficiency (on average 63 ms per frame on a CPU) via the proposed tracking and segmentation procedure, compared with applying the U-Net to every frame (on average 2.60 s per frame on a CPU).

Table 3: The analysis of the detected errors (failure cases) of the proposed AMTIP. #C: number of collected videos. #NT_g: ground-truth number of videos with no larvae touched, generated by visual inspection. #NT_p: predicted number of videos with no larvae touched, generated by AMTIP. FPR: false positive rate for videos without touched larvae. FNR: false negative rate for videos without touched larvae. #QF: number of quantification failures. $E_{QF} = \#QF / (\#C - \#NT_g)$: percentage of quantification failures.

Type      #C   #NT_g   #NT_p   FPR      FNR      #QF   E_QF
Wild      24   4       7       16.7 %   4.2 %    1     5 %
DMSO      27   3       8       22.2 %   3.7 %    0     0 %
Dia       38   4       1       0 %      7.9 %    8     23.5 %
Iso       30   6       6       6.7 %    6.7 %    2     8.3 %
Caffi     24   7       5       4.2 %    12.5 %   3     17.6 %
Saha      30   5       6       13.3 %   10 %     2     8 %
Average                        10.5 %   7.5 %          10.4 %
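To make the relation between the raw counts and the reported rates explicit, the sketch below recomputes the Wild row of Table 3. The split of #NT_p into true and false positives (the TP argument) is inferred from the published percentages, not reported in the paper.

```python
# Consistency check for Table 3: FPR = FP/#C, FNR = FN/#C, and
# E_QF = #QF / (#C - #NT_g), with FP/FN inferred from a hypothetical TP split.
def table3_rates(C, NT_g, NT_p, QF, TP):
    FP = NT_p - TP    # touched videos wrongly flagged as "no larva touched"
    FN = NT_g - TP    # untouched videos the pipeline missed
    return FP / C, FN / C, QF / (C - NT_g)

fpr, fnr, e_qf = table3_rates(C=24, NT_g=4, NT_p=7, QF=1, TP=3)  # Wild row
print(f"{fpr:.1%}, {fnr:.1%}, {e_qf:.0%}")   # -> 16.7%, 4.2%, 5%
```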

3.4 Discussion

The results verify that the proposed inspection pipeline AMTIP works as an automated quantification tool for touch-response data recorded at a high frame rate. AMTIP relies on the following strategies:

  1. The time point when the touch is applied ($t_1$), as well as the actually touched larva, is obtained from the final position of the needle and the initial positions of the larvae, because the local segmentation during tracking is not as accurate as the initial U-Net segmentation.

  2. The response of a larva is detected from the particles of the particle filter based larva tracking (details in Section 2) instead of the change of the larva center, as the larva centers can change slightly but constantly during tracking even if the larvae do not move.

  3. The time point when the touch response stops ($t_4$) is computed by searching backwards from the last frame, since a larva can move slowly for a moment (no significant pixel changes) and then start moving strongly again.

  4. The quantification is performed after the tracking and segmentation of all frames in the video, making it possible to consider the global information of the video.

However, some drawbacks must be considered carefully when users apply AMTIP to their own data. The tracking procedure and the local segmentation of the larvae are the keys to AMTIP, but they may fail when (i) larvae overlap with each other while moving; (ii) the well edge area has a brightness similar to the larvae; (iii) the needle overlaps with a larva. Since touch-response experiments must be conducted on a large scale, an automated pipeline such as AMTIP remains vital in spite of these limitations.

4 Conclusions

In this work, we introduce an AI-based inspection platform for the touch response of zebrafish larvae that generates five quantification indices (latency time, C-Bend curvature maximum, C-Bend curvature peak time, response time, and escape distance) automatically, without human intervention. The platform uses an automated inspection pipeline based on a multi-larvae tracking procedure, with a U-Net for the initialization of the tracking, optical flow and a particle filter for tracking, and region growing for the local segmentation of the larvae. Six sets of experiments (two controls and four treatments) are conducted, and the results generated by the platform, together with the analysis of the detected errors, verify its effectiveness. AMTIP reproduces the conclusions expected from the treatment assumptions in the corresponding experimental results. High efficiency is also achieved, with on average 63 ms per frame for the inspection pipeline on a CPU. AMTIP can be applied to the inspection of other animal behaviors and to systems that need to analyze position changes in videos and quantify movements into criteria.


Corresponding author: Yanke Wang, Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen, Germany, E-mail:

About the authors

Yanke Wang

He received the Bachelor's and Master's degrees in automation from Harbin Engineering University, China, in 2017 and 2019, respectively. Since 2019, he has been a doctoral researcher at the Faculty of Mechanical Engineering, working in the research group “Machine Learning for High-Throughput and Mechatronics” at the Institute for Automation and Applied Informatics at the Karlsruhe Institute of Technology. Research interests: automated systems, bioinformatics, computer vision.

Christian Pylatiuk

He received the MD degree from the University of Marburg, Germany, in 1997. Since 2014, he has been an Adjunct Professor at the Faculty of Mechanical Engineering and leads the research group “Biomedical Engineering & Robotics” at the Karlsruhe Institute of Technology. Research interests: mechatronics, medical technology, medical automation, and image analysis.

Ralf Mikut

He received the Dipl.-Ing. degree in automatic control from the University of Technology, Dresden, Germany, in 1994, and the Ph.D. degree in mechanical engineering from the University of Karlsruhe, Karlsruhe, Germany, in 1999. Since 2011, he has been an Adjunct Professor at the Faculty of Mechanical Engineering and the head of the research field “Automated Image and Data Analysis”. He is leading the research group “Machine Learning for Time Series and Images” at the Institute for Automation and Applied Informatics of the Karlsruhe Institute of Technology (KIT), Germany. His current research interests include machine learning, image processing, life science applications and smart grids.

Ravindra Peravali

He received his Bachelor of Engineering in Electronics and Communications Engineering from the University of Mysore, India, his M.S. in Electrical Engineering from the University of Maryland, Baltimore County, U.S.A., and his PhD in Biosciences from Heidelberg University, Germany. He is currently the head of the Screening Center at the Institute of Biological and Chemical Systems – Biological Information Processing at the Karlsruhe Institute of Technology, Germany. His research interests are in understanding the neural mechanisms involved in animal behavior using zebrafish and medaka as model organisms. In addition, he works on high-throughput imaging and the mathematical modeling of biological information.

Markus Reischl

He received the Dipl.-Ing. and Ph.D. degrees in mechanical engineering from the University of Karlsruhe, Germany, in 2001 and 2006, respectively. Since 2020, he has been an Adjunct Professor at the Faculty of Mechanical Engineering. He is the head of the research field “Automation for Laboratories” and heads the research group “Machine Learning for High-Throughput and Mechatronics” at the Institute for Automation and Applied Informatics at the Karlsruhe Institute of Technology. Research interests: man-machine interfaces, image processing, machine learning, data analytics.

  1. Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

  2. Conflict of interest statement: The authors declare no conflicts of interest regarding this article.

  3. Research funding: This work was financially supported by the China Scholarship Council (CSC). The authors would also like to thank the program of Natural, Artificial and Cognitive Information Processing (NACIP) and the BioInterfaces International Graduate School (BIF-IGS) at the Karlsruhe Institute of Technology (KIT).

References

[1] A. A. Popova, D. Marcato, R. Peravali, I. Wehl, U. Schepers, and P. A. Levkin, “Fish-microarray: a miniaturized platform for single-embryo high-throughput screenings,” Adv. Funct. Mater., vol. 28, no. 3, p. 1703486, 2018. https://doi.org/10.1002/adfm.201703486.

[2] L. Saint-Amant and P. Drapeau, “Time course of the development of motor behaviors in the zebrafish embryo,” J. Neurobiol., vol. 37, no. 4, pp. 622–632, 1998. https://doi.org/10.1002/(SICI)1097-4695(199812)37:4<622::AID-NEU10>3.0.CO;2-S.

[3] R. M. Basnet, D. Zizioli, S. Taweedet, D. Finazzi, and M. Memo, “Zebrafish larvae as a behavioral model in neuropharmacology,” Biomedicines, vol. 7, no. 1, p. 23, 2019. https://doi.org/10.3390/biomedicines7010023.

[4] D. Marcato, R. Alshut, H. Breitwieser, et al., “An automated and high-throughput photomotor response platform for chemical screens,” in 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2015, pp. 7728–7731. https://doi.org/10.1109/EMBC.2015.7320183.

[5] G. Audira, B. P. Sampurna, S. Juniardi, S. T. Liang, Y. H. Lai, and C. D. Hsiao, “A simple setup to perform 3D locomotion tracking in zebrafish by using a single camera,” Inventions, vol. 3, no. 1, p. 11, 2018. https://doi.org/10.3390/inventions3010011.

[6] M. Schutera, T. Dickmeis, M. Mione, et al., “Automated phenotype pattern recognition of zebrafish for high-throughput screening,” Bioengineered, vol. 7, no. 4, pp. 261–265, 2016. https://doi.org/10.1080/21655979.2016.1197710.

[7] V. Bedell, E. Buglo, D. Marcato, et al., “Zebrafish: a pharmacogenetic model for anesthesia,” Methods Enzymol., vol. 602, pp. 189–209, 2018. https://doi.org/10.1016/bs.mie.2018.02.004.

[8] Y. Wang, D. Marcato, V. Tirumalasetty, et al., “An automated experimentation system for the touch-response quantification of zebrafish larvae,” IEEE Trans. Autom. Sci. Eng., vol. 19, no. 4, pp. 3007–3019, 2022. https://doi.org/10.1109/TASE.2021.3104507.

[9] Y. Wang, N. K. Kanagaraj, C. Pylatiuk, R. Mikut, R. Peravali, and M. Reischl, “High-throughput data acquisition platform for multi-larvae touch-response behavior screening of zebrafish,” IEEE Rob. Autom. Lett., vol. 7, no. 2, pp. 858–865, 2022. https://doi.org/10.1109/LRA.2021.3134281.

[10] Y. Wang, C. Pylatiuk, R. Mikut, R. Peravali, and M. Reischl, “Quantification platform for touch response of zebrafish larvae using machine learning,” in Proceedings 31. Workshop Computational Intelligence, vol. 25, 2021, p. 37.

[11] Q. Zhu, Y. Wang, Y. He, and X. Hong, “Object tracking with particles weighted by region proposal network,” Multimed. Tool. Appl., vol. 78, no. 9, pp. 12083–12101, 2019. https://doi.org/10.1007/s11042-018-6743-5.

[12] S. Ren, K. He, R. Girshick, et al., “Faster R-CNN: towards real-time object detection with region proposal networks,” Adv. Neural Inf. Process. Syst., vol. 28, pp. 91–99, 2015.

[13] X. Wang, E. Cheng, I. S. Burnett, Y. Huang, and D. Wlodkowic, “Crowdsourced generation of annotated video datasets: a zebrafish larvae dataset for video segmentation and tracking evaluation,” in IEEE Life Sciences Conference (LSC), 2017, pp. 274–277. https://doi.org/10.1109/LSC.2017.8268196.

[14] X. Wang, E. Cheng, I. S. Burnett, R. Wilkinson, and M. Lech, “Automatic tracking of multiple zebrafish larvae with resilience against segmentation errors,” in IEEE 15th International Symposium on Biomedical Imaging (ISBI), 2018, pp. 1157–1160. https://doi.org/10.1109/ISBI.2018.8363776.

[15] Y. X. Bai, S. H. Zhang, Z. Fan, et al., “Automatic multiple zebrafish tracking based on improved HOG features,” Sci. Rep., vol. 8, no. 1, pp. 1–14, 2018. https://doi.org/10.1038/s41598-018-29185-0.

[16] F. Romero-Ferrero, M. G. Bergomi, R. C. Hinz, F. J. Heras, and G. G. de Polavieja, “idtracker.ai: tracking all individuals in small or large collectives of unmarked animals,” Nat. Methods, vol. 16, no. 2, pp. 179–182, 2019. https://doi.org/10.1038/s41592-018-0295-5.

[17] O. Ronneberger, P. Fischer, and T. Brox, “U-net: convolutional networks for biomedical image segmentation,” in International Conference on Medical Image Computing and Computer-Assisted Intervention, 2015, pp. 234–241. https://doi.org/10.1007/978-3-319-24574-4_28.

[18] X. Wang, E. Cheng, I. S. Burnett, Y. Huang, and D. Wlodkowic, “Automatic multiple zebrafish larvae tracking in unconstrained microscopic video conditions,” Sci. Rep., vol. 7, no. 1, pp. 1–8, 2017. https://doi.org/10.1038/s41598-017-17894-x.

[19] B. K. Horn and B. G. Schunck, “Determining optical flow,” Artif. Intell., vol. 17, nos. 1–3, pp. 185–203, 1981. https://doi.org/10.1016/0004-3702(81)90024-2.

[20] T. Senst, V. Eiselein, and T. Sikora, “Robust local optical flow for feature tracking,” IEEE Trans. Circ. Syst. Video Technol., vol. 22, no. 9, pp. 1377–1387, 2012. https://doi.org/10.1109/TCSVT.2012.2202070.

[21] H. Richendrfer, S. D. Pelkowski, R. M. Colwill, and R. Creton, “On the edge: pharmacological evidence for anxiety-related behavior in zebrafish larvae,” Behav. Brain Res., vol. 228, no. 1, pp. 99–106, 2012. https://doi.org/10.1016/j.bbr.2011.11.041.

Received: 2023-02-14
Accepted: 2023-08-06
Published Online: 2023-10-17
Published in Print: 2023-10-26

© 2023 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
