
Search Results (34)

Search Parameters:
Keywords = smart PIG

16 pages, 6692 KiB  
Article
Behavior Tracking and Analyses of Group-Housed Pigs Based on Improved ByteTrack
by Shuqin Tu, Haoxuan Ou, Liang Mao, Jiaying Du, Yuefei Cao and Weidian Chen
Animals 2024, 14(22), 3299; https://doi.org/10.3390/ani14223299 - 16 Nov 2024
Viewed by 584
Abstract
Daily behavioral analysis of group-housed pigs provides critical insights into early warning systems for pig health issues and animal welfare in smart pig farming. In this study, our main objective was to develop an automated method for monitoring and analyzing the behavior of group-reared pigs so that health problems can be detected promptly and animal welfare improved. We developed a method named Pig-ByteTrack. Our approach addresses target detection, Multi-Object Tracking (MOT), and behavioral time computation for each pig. The YOLOX-X detection model is employed for pig detection and behavior recognition, followed by Pig-ByteTrack for tracking behavioral information. On 1 min videos, the Pig-ByteTrack algorithm achieved a Higher Order Tracking Accuracy (HOTA) of 72.9%, Multi-Object Tracking Accuracy (MOTA) of 91.7%, identification F1 score (IDF1) of 89.0%, and 41 ID switches (IDs). Compared with ByteTrack and TransTrack, Pig-ByteTrack achieved significant improvements in HOTA, IDF1, MOTA, and IDs. On 10 min videos, Pig-ByteTrack achieved an HOTA of 59.3%, MOTA of 89.6%, IDF1 of 53.0%, and 198 IDs. Experiments on video datasets demonstrate the method's efficacy in behavior recognition and tracking, offering technical support for health and welfare monitoring of pig herds. Full article
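The core idea behind ByteTrack-style trackers like Pig-ByteTrack is a two-stage data association: tracks are matched to high-confidence detections first, and leftover tracks get a second chance against low-confidence detections. The sketch below illustrates this with a greedy IoU matcher in plain Python; it is our simplified illustration with hypothetical data structures, not the authors' Pig-ByteTrack implementation (which builds on Kalman-predicted boxes and Hungarian assignment).

```python
# Illustrative BYTE-style two-stage association (simplified sketch, not the
# paper's code): tracks match high-score detections first, then low-score ones.

def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def greedy_match(tracks, dets, thresh):
    """Greedily pair (track_id, box) tuples with detections by IoU."""
    pairs, used = [], set()
    for ti, t in tracks:
        best, best_iou = None, thresh
        for di, d in enumerate(dets):
            if di not in used and iou(t, d["box"]) >= best_iou:
                best, best_iou = di, iou(t, d["box"])
        if best is not None:
            used.add(best)
            pairs.append((ti, best))
    matched_ids = {p[0] for p in pairs}
    return pairs, [(ti, t) for ti, t in tracks if ti not in matched_ids]

def byte_associate(tracks, dets, score_thresh=0.6, iou_thresh=0.3):
    """Stage 1: associate with high-score detections; stage 2: recover
    remaining tracks from the low-score pool (occluded or blurred pigs)."""
    high = [d for d in dets if d["score"] >= score_thresh]
    low = [d for d in dets if d["score"] < score_thresh]
    matched_high, leftovers = greedy_match(tracks, high, iou_thresh)
    matched_low, unmatched = greedy_match(leftovers, low, iou_thresh)
    return matched_high, matched_low, unmatched
```

The second stage is what lets the tracker hold on to pigs whose detection score drops during clustering or occlusion, instead of discarding those detections outright.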
Show Figures

Figure 1: Process diagram of tracking and behavioral time statistics for group-housed pigs.
Figure 2: Flow chart of the Pig-ByteTrack algorithm.
Figure 3: Flow chart of the Byte data association algorithm.
Figure 4: Comparison of tracking boxes between Pig-ByteTrack and ByteTrack.
Figure 5: Comparison of Pig-ByteTrack, ByteTrack, and TransTrack results on private datasets.
Figure 6: Visualized tracking results comparison of Pig-ByteTrack, ByteTrack, and TransTrack.
Figure 7: Visualized tracking results of Pig-ByteTrack in the 10 min videos (red arrows indicate pigs with ID switches).
Figure 8: Pig behavior statistics for videos 14–17.
20 pages, 5826 KiB  
Article
Novel Method for Detecting Coughing Pigs with Audio-Visual Multimodality for Smart Agriculture Monitoring
by Heechan Chae, Junhee Lee, Jonggwan Kim, Sejun Lee, Jonguk Lee, Yongwha Chung and Daihee Park
Sensors 2024, 24(22), 7232; https://doi.org/10.3390/s24227232 - 12 Nov 2024
Viewed by 632
Abstract
While the pig industry is crucial in global meat consumption, accounting for 34% of total consumption, respiratory diseases in pigs can cause substantial economic losses to pig farms. To alleviate this issue, we propose an advanced audio-visual monitoring system for the early detection of coughing, a key symptom of respiratory diseases in pigs, that will enhance disease management and animal welfare. The proposed system is structured into three key modules: the cough sound detection (CSD) module, which detects coughing sounds using audio data; the pig object detection (POD) module, which identifies individual pigs in video footage; and the coughing pig detection (CPD) module, which pinpoints which pigs are coughing among the detected pigs. These modules, using a multimodal approach, detect coughs from continuous audio streams amidst background noise and accurately pinpoint specific pens or individual pigs as the source. This method enables continuous 24/7 monitoring, leading to efficient action and reduced human labor stress. It achieved a substantial detection accuracy of 0.95 on practical data, validating its feasibility and applicability. The potential to enhance farm management and animal welfare is shown through proposed early disease detection. Full article
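The fusion logic described above can be caricatured in a few lines: the audio module proposes cough-time windows, and a visual activity score per detected pig decides which pen-mate produced the cough. This is a toy stand-in with invented names and thresholds, not the paper's CSD/POD/CPD networks, but it shows the shape of the audio-visual attribution step.

```python
# Toy sketch of the audio-visual attribution idea (illustrative names and
# thresholds, not the paper's models): a cough window found in the audio
# stream is attributed to the pig with the highest motion score there.

def detect_cough_windows(cough_scores, thresh=0.8):
    """CSD stand-in: frame indices whose audio cough score exceeds a threshold."""
    return [i for i, s in enumerate(cough_scores) if s >= thresh]

def attribute_cough(cough_frames, motion_by_pig):
    """CPD stand-in: for each cough frame, pick the most active pig."""
    culprits = {}
    for f in cough_frames:
        scores = {pig: m[f] for pig, m in motion_by_pig.items()}
        culprits[f] = max(scores, key=scores.get)
    return culprits

audio = [0.1, 0.9, 0.2, 0.95]              # per-frame cough scores (CSD)
motion = {"pig_a": [0.2, 0.7, 0.1, 0.1],   # per-pig motion from POD crops
          "pig_b": [0.3, 0.2, 0.2, 0.8]}
print(attribute_cough(detect_cough_windows(audio), motion))
# → {1: 'pig_a', 3: 'pig_b'}
```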
(This article belongs to the Section Smart Agriculture)
Show Figures

Figure 1: System architecture. The system comprises three modules: cough sound detection (CSD), pig object detection (POD), and coughing pig detection (CPD).
Figure 2: CSD module. (a) An overview of the overall structure of the CSD module; (b) the specific architecture of the sound-based cough detection model used within the CSD module.
Figure 3: POD module. High-score frames are extracted from the video clip to detect pig objects, and pig-specific crop frames are created based on the detected regions.
Figure 4: Differences in individual pig videos cropped according to the bounding box. (a) A coughing pig using the high-score bounding box (HB); (b) a moving pig using the HB; (c) a moving pig without using the HB.
Figure 5: Coughing pig detection module. The CPD module ultimately combines audio information from the CSD module with the visual information from the POD module to detect coughing pigs within the area of interest.
Figure 6: Visual encoder. Visual features are extracted from the cropped pig object videos.
Figure 7: Attention block. It determines the activity status of pigs using attention and the GRU structure with their audio and visual features.
Figure 8: Panels (a,b) qualitatively show the sound detection results in different environments. The label 'estimated' is the cough area predicted by the model's CSD module, and 'reference' corresponds to the ground truth.
Figure 9: Qualitative results of the neural network model in the POD module (YOLOv7).
Figure 10: Mov results of pig object cropped videos according to the situation.
Figure 11: Qualitative comparison of the feature representation results of the CSD module and the audio encoder's feature representation results.
Figure 12: Qualitative results of applying the HB method. (a,b) are still shots of video clips in different situations, where the numbers indicate the index of the predicted pig; (c) shows the coughing pig prediction results obtained by applying the HB method.
Figure 13: Results obtained by applying the proposed system to actual pig barn data. (a) The number of cough detections inside and outside the RoI; (b) a heatmap of coughing that occurred inside the RoI, where red indicates areas with higher cough frequency, from which users can identify areas where coughing frequently occurs.
18 pages, 3015 KiB  
Review
Chest Tubes and Pleural Drainage: History and Current Status in Pleural Disease Management
by Claudio Sorino, David Feller-Kopman, Federico Mei, Michele Mondoni, Sergio Agati, Giampietro Marchetti and Najib M. Rahman
J. Clin. Med. 2024, 13(21), 6331; https://doi.org/10.3390/jcm13216331 - 23 Oct 2024
Viewed by 6776
Abstract
Thoracostomy and chest tube placement are key procedures in treating pleural diseases involving the accumulation of fluids (e.g., malignant effusions, serous fluid, pus, or blood) or air (pneumothorax) in the pleural cavity. Initially described by Hippocrates and refined through the centuries, chest drainage achieved a historical milestone in the 19th century with the creation of closed drainage systems to prevent the entry of air into the pleural space and reduce infection risk. The introduction of plastic materials and the Heimlich valve further revolutionized chest tube design and function. Technological advancements led to the availability of various chest tube designs (straight, angled, and pig-tail) and drainage systems, including PVC and silicone tubes with radiopaque stripes for better radiological visualization. Modern chest drainage units can incorporate smart digital systems that monitor and graphically report pleural pressure and evacuated fluid/air, improving patient outcomes. Suction application via wall systems or portable digital devices enhances drainage efficacy, although careful regulation is needed to avoid complications such as re-expansion pulmonary edema or prolonged air leak. To prevent recurrent effusion, particularly due to malignancy, pleurodesis agents can be applied through the chest tube. In cases of non-expandable lung, maintaining a long-term chest drain may be the most appropriate approach and procedures such as the placement of an indwelling pleural catheter can significantly improve quality of life. Continued innovations and rigorous training ensure that chest tube insertion remains a cornerstone of effective pleural disease management. This review provides a comprehensive overview of the historical evolution and modern advancements in pleural drainage. 
By addressing both current technologies and procedural outcomes, it serves as a valuable resource for healthcare professionals aiming to optimize pleural disease management and patient care. Full article
(This article belongs to the Section Pulmonology)
Show Figures

Figure 1: Main types of pleural drainage with details of the tips. (A) Small-bore straight catheter with a Verres-type needle dilator; (B) small-bore pig-tail catheter; (C) small-bore straight catheter with guide wire for placement by means of the Seldinger technique; (D) large-bore catheter with trocar.
Figure 2: Exemplification of the classic underwater-seal chest drainage systems with one (A), two (B), and three (C) chambers, and a modern collection box (D).
Figure 3: Different methods for anchoring a pleural drain. (A) Simple stitch and Roman sandal technique in a small-bore chest tube; (B) simple stitch and tie of the drainage tube; (C) purse-string sutures in a large-bore chest tube; (D) indwelling pleural catheter (IPC) secured by two simple stitches and a Roman sandal at the proximal end.
Figure 4: Progressive steps (A–F) to secure a large-bore chest tube using the purse-string technique.
18 pages, 5897 KiB  
Article
Tracking and Behavior Analysis of Group-Housed Pigs Based on a Multi-Object Tracking Approach
by Shuqin Tu, Jiaying Du, Yun Liang, Yuefei Cao, Weidian Chen, Deqin Xiao and Qiong Huang
Animals 2024, 14(19), 2828; https://doi.org/10.3390/ani14192828 - 30 Sep 2024
Viewed by 852
Abstract
Smart farming technologies to track and analyze pig behaviors in natural environments are critical for monitoring the health status and welfare of pigs. This study aimed to develop a robust multi-object tracking (MOT) approach named YOLOv8 + OC-SORT(V8-Sort) for the automatic monitoring of the different behaviors of group-housed pigs. We addressed common challenges such as variable lighting, occlusion, and clustering between pigs, which often lead to significant errors in long-term behavioral monitoring. Our approach offers a reliable solution for real-time behavior tracking, contributing to improved health and welfare management in smart farming systems. First, the YOLOv8 is employed for the real-time detection and behavior classification of pigs under variable light and occlusion scenes. Second, the OC-SORT is utilized to track each pig to reduce the impact of pigs clustering together and occlusion on tracking. And, when a target is lost during tracking, the OC-SORT can recover the lost trajectory and re-track the target. Finally, to implement the automatic long-time monitoring of behaviors for each pig, we created an automatic behavior analysis algorithm that integrates the behavioral information from detection and the tracking results from OC-SORT. On the one-minute video datasets for pig tracking, the proposed MOT method outperforms JDE, Trackformer, and TransTrack, achieving the highest HOTA, MOTA, and IDF1 scores of 82.0%, 96.3%, and 96.8%, respectively. And, it achieved scores of 69.0% for HOTA, 99.7% for MOTA, and 75.1% for IDF1 on sixty-minute video datasets. In terms of pig behavior analysis, the proposed automatic behavior analysis algorithm can record the duration of four types of behaviors for each pig in each pen based on behavior classification and ID information to represent the pigs’ health status and welfare. 
These results demonstrate that the proposed method exhibits excellent performance in behavior recognition and tracking, providing technical support for prompt anomaly detection and health status monitoring for pig farming managers. Full article
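The final step the abstract describes, turning per-frame detection and tracking outputs into per-pig behavior durations, is straightforward bookkeeping. The sketch below is our minimal illustration of that accumulation step (hypothetical data layout and a fixed frame rate, not the authors' algorithm): each frame contributes one tick to the (track ID, behavior) pair it reports.

```python
# Minimal sketch of behavior-time bookkeeping (illustrative, not the paper's
# code): per-frame (track_id -> behavior) outputs are accumulated into
# per-pig duration totals, assuming a fixed frame rate.
from collections import defaultdict

def accumulate_behaviors(frames, fps=25):
    """frames: list of per-frame dicts mapping track_id -> behavior label.
    Returns seconds spent in each behavior, keyed by track id."""
    counts = defaultdict(lambda: defaultdict(int))
    for frame in frames:
        for track_id, behavior in frame.items():
            counts[track_id][behavior] += 1  # one frame = one tick
    return {tid: {b: n / fps for b, n in beh.items()}
            for tid, beh in counts.items()}

# 50 frames of pig 1 lying, then 25 frames standing; pig 2 eats throughout.
frames = [{1: "lying", 2: "eating"}] * 50 + [{1: "standing", 2: "eating"}] * 25
totals = accumulate_behaviors(frames, fps=25)
# → {1: {'lying': 2.0, 'standing': 1.0}, 2: {'eating': 3.0}}
```

Because the totals are keyed by track ID, every ID switch in the tracker splits one pig's time budget across two records, which is why the papers above report IDs alongside HOTA and MOTA.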
(This article belongs to the Section Pigs)
Show Figures

Figure 1: Part of the group-housed pig images.
Figure 2: The overall structure of V8-Sort.
Figure 3: The pipeline of the YOLOv8n algorithm.
Figure 4: The flowchart of OC-SORT.
Figure 5: OC-SORT tracking process for pigs.
Figure 6: Comparison between V8-Sort and other tracking methods on public datasets.
Figure 7: Comparison between V8-Sort and other tracking methods on private datasets.
Figure 8: The visual results of V8-Sort on the public dataset.
Figure 9: Visualization of V8-Sort tracking results on the private dataset.
Figure 10: Visualization of long-term tracking results (the first row shows the tracking results for videos 2001 and 3010; the second, third, and fourth rows depict the tracking results of two frames each from videos 2002, 2003, and 2004).
Figure 11: Time allocation and proportion of pig behaviors.
Figure 12: The proportional occurrence of the four behaviors.
18 pages, 1289 KiB  
Article
The Impact of African Swine Fever on the Efficiency of China’s Pig Farming Industry
by Shiyong Piao, Xijie Jin, Shuangyu Hu and Ji-Yong Lee
Sustainability 2024, 16(17), 7819; https://doi.org/10.3390/su16177819 - 8 Sep 2024
Cited by 1 | Viewed by 1287
Abstract
African Swine Fever (ASF) is a severe viral disease that has significantly impacted the pig farming industry in China. It first broke out in China in 2018 and quickly spread to multiple provinces, significantly affecting the production efficiency of the pig farming industry. This study utilized pig production data from 17 provinces in China from 2010 to 2022 and applied the Malmquist production efficiency index and panel regression methods to assess the impact of the ASF epidemic on the efficiency of the pig farming industry. The results indicated that the outbreak of ASF significantly reduced overall production efficiency, which magnified the vulnerabilities of the production system. Although there was a general decline in technological change and pure technical efficiency, the increase in scale efficiency suggested effective resource optimization by farmers under resource-constrained conditions. In light of these findings, it is recommended to strengthen biosecurity education and epidemic prevention measures in the pig farming industry and to enhance technological innovation and the application of smart technologies to improve production efficiency and disease response capabilities. Additionally, timely adjustments in farming scale and resource optimization will be key to addressing future challenges. Through these strategies, the pig farming industry can maintain stable production efficiency during future epidemics and push towards a more efficient and refined production model. Full article
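The Malmquist index mentioned above measures total factor productivity (TFP) change as the geometric mean of two period-relative productivity ratios, and it factors exactly into efficiency change (EC) times technical change (TC). The sketch below works through that arithmetic with hypothetical distance-function values; in the study these values would come from DEA estimation, not be supplied by hand.

```python
# Worked sketch of the Malmquist index arithmetic (hypothetical distance-
# function values; the paper estimates these via frontier methods):
# TFP change = geometric mean of two period-relative ratios,
# and decomposes into efficiency change (EC) x technical change (TC).
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """d_a_b = distance function of period-b data against period-a frontier."""
    tfp = math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
    ec = d_t1_t1 / d_t_t                                    # catching up
    tc = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))   # frontier shift
    return tfp, ec, tc

tfp, ec, tc = malmquist(d_t_t=0.8, d_t_t1=0.9, d_t1_t=0.7, d_t1_t1=0.85)
assert abs(tfp - ec * tc) < 1e-12  # identity: TFP = EC x TC
# tfp > 1 signals productivity growth; tfp < 1 signals decline, as the
# study observed in the provinces hit hardest by the ASF outbreak.
```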
Show Figures

Figure 1: Changes in the TFP of the pig industry in the study area from 2018 to 2019.
Figure 2: Changes in the TFP of the pig industry in the study area from 2019 to 2020.
Figure 3: Changes in the TFP of the pig industry in the study area from 2010 to 2022.
16 pages, 6518 KiB  
Article
DCNN for Pig Vocalization and Non-Vocalization Classification: Evaluate Model Robustness with New Data
by Vandet Pann, Kyeong-seok Kwon, Byeonghyeon Kim, Dong-Hwa Jang and Jong-Bok Kim
Animals 2024, 14(14), 2029; https://doi.org/10.3390/ani14142029 - 9 Jul 2024
Viewed by 877
Abstract
Since pig vocalization is an important indicator of monitoring pig conditions, pig vocalization detection and recognition using deep learning play a crucial role in the management and welfare of modern pig livestock farming. However, collecting pig sound data for deep learning model training takes time and effort. Acknowledging the challenges of collecting pig sound data for model training, this study introduces a deep convolutional neural network (DCNN) architecture for pig vocalization and non-vocalization classification with a real pig farm dataset. Various audio feature extraction methods were evaluated individually to compare the performance differences, including Mel-frequency cepstral coefficients (MFCC), Mel-spectrogram, Chroma, and Tonnetz. This study proposes a novel feature extraction method called Mixed-MMCT to improve the classification accuracy by integrating MFCC, Mel-spectrogram, Chroma, and Tonnetz features. These feature extraction methods were applied to extract relevant features from the pig sound dataset for input into a deep learning network. For the experiment, three datasets were collected from three actual pig farms: Nias, Gimje, and Jeongeup. Each dataset consists of 4000 WAV files (2000 pig vocalization and 2000 pig non-vocalization) with a duration of three seconds. Various audio data augmentation techniques are utilized in the training set to improve the model performance and generalization, including pitch-shifting, time-shifting, time-stretching, and background-noising. In this study, the performance of the predictive deep learning model was assessed using the k-fold cross-validation (k = 5) technique on each dataset. By conducting rigorous experiments, Mixed-MMCT showed superior accuracy on Nias, Gimje, and Jeongeup, with rates of 99.50%, 99.56%, and 99.67%, respectively. Robustness experiments were performed to prove the effectiveness of the model by using two farm datasets as a training set and a farm as a testing set. 
The average performance of the Mixed-MMCT in terms of accuracy, precision, recall, and F1-score reached rates of 95.67%, 96.25%, 95.68%, and 95.96%, respectively. All results demonstrate that the proposed Mixed-MMCT feature extraction method outperforms other methods regarding pig vocalization and non-vocalization classification in real pig livestock farming. Full article
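The k-fold cross-validation (k = 5) used to assess the model partitions each 4000-file dataset so that every sample serves as test data exactly once. A standard-library sketch of the index split is shown below; real pipelines would typically reach for `sklearn.model_selection.KFold` (with shuffling) instead.

```python
# Plain-Python sketch of k-fold cross-validation index splitting (k = 5),
# illustrating the evaluation protocol; production code would normally use
# sklearn.model_selection.KFold.

def k_fold_indices(n_samples, k=5):
    """Yield (train_idx, test_idx) pairs; every sample is tested exactly once."""
    idx = list(range(n_samples))
    fold = n_samples // k
    for i in range(k):
        start = i * fold
        end = start + fold if i < k - 1 else n_samples  # last fold takes the remainder
        test = idx[start:end]
        train = idx[:start] + idx[end:]
        yield train, test

folds = list(k_fold_indices(4000, k=5))  # e.g. one farm's 4000 WAV files
# Each test fold holds 800 files; together they cover the whole dataset once.
```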
Show Figures

Figure 1: The installation of recording devices inside the pig farm.
Figure 2: Data augmentation visualization in the wave signals of each method.
Figure 3: The overall flow diagram of the classification method.
Figure 4: Visualization of a pig vocalization sample for each feature extraction method (the first row displays non-vocalization samples; the second row shows vocalization samples).
Figure 5: The overall structure of the proposed deep learning network architecture.
Figure 6: Confusion matrix of the classification results of the different feature extraction methods.
Figure 7: ROC curve visualization of the model classification performance for each feature extraction method. (a–c) are the ROC curves of Nias, Gimje, and Jeongeup, respectively.
Figure 8: ROC curve visualization of the model robustness classification performance. (a–c) are the ROC curves of NGdb, NJdb, and GJdb, respectively.
Figure 9: Confusion matrix results of the model robustness classification performance.
29 pages, 995 KiB  
Review
Smart Pig Farming—A Journey Ahead of Vietnam
by Md Sharifuzzaman, Hong-Seok Mun, Keiven Mark B. Ampode, Eddiemar B. Lagua, Hae-Rang Park, Young-Hwa Kim, Md Kamrul Hasan and Chul-Ju Yang
Agriculture 2024, 14(4), 555; https://doi.org/10.3390/agriculture14040555 - 31 Mar 2024
Cited by 3 | Viewed by 4597
Abstract
Vietnam heavily relies on pork as its primary source of animal protein. Traditional farming methods, characterized by small-scale operations, dominate the industry. However, challenges such as rising feed costs, disease outbreaks, and market volatility are prompting many farmers to abandon their businesses. Recognizing the pivotal role of the swine sector in both economic development and nutrition, authorities must intervene to prevent its collapse. In developed nations, smart pig farming, utilizing technologies like sensors and cameras for data collection and real-time decision-making, has significantly improved health and productivity. These technologies can detect subtle indicators of animal well-being, enabling prompt intervention. This review aims to analyze the drivers of Vietnam’s swine farming, identify existing production system flaws, and explore innovative precision farming methods worldwide. Embracing precision farming promises to enhance Vietnam’s competitiveness in export markets and bolster consumer confidence. However, reliance solely on expensive foreign technologies may benefit large-scale farms, leaving smaller ones behind. Therefore, fostering local innovation and validating cost-effective solutions will be crucial for the sustainable growth of small- and medium-scale pig farming in Vietnam. Full article
Show Figures

Figure 1: Gradual changes in the concentration of pigs in different regions. Generated from [13].
Figure 2: Influencing factors, obstacles, and proposed methods and technologies to overcome obstacles in the Vietnamese pig farming system.
16 pages, 9686 KiB  
Article
A Deep-Learning-Based System for Pig Posture Classification: Enhancing Sustainable Smart Pigsty Management
by Chanhui Jeon, Haram Kim and Dongsoo Kim
Sustainability 2024, 16(7), 2888; https://doi.org/10.3390/su16072888 - 29 Mar 2024
Viewed by 1446
Abstract
This paper presents a deep-learning-based system for classifying pig postures, aiming to improve the management of sustainable smart pigsties. The classification of pig postures is a crucial concern for researchers investigating pigsty environments and for on-site pigsty managers. To address this issue, we developed a comprehensive system framework for pig posture classification within a pigsty. We collected image datasets from an open data sharing site operated by a public organization and systematically conducted the following steps: object detection, data labeling, image preprocessing, model development, and training. These processes were carried out using the acquired datasets to ensure comprehensive and effective training for our pig posture classification system. Subsequently, we analyzed and discussed the classification results using techniques such as Grad-CAM. As a result of visual analysis through Grad-CAM, it is possible to identify image features when posture is correctly classified or misclassified in a pig image. By referring to these results, it is expected that the accuracy of pig posture classification can be further improved. Through this analysis and discussion, we can identify which features of pig postures in images need to be emphasized to improve the accuracy of pig posture classification. The findings of this study are anticipated to significantly improve the accuracy of pig posture classification. In practical applications, the proposed pig posture classification system holds the potential to promptly detect abnormal situations in pigsties, leading to prompt responses. Ultimately, this can greatly contribute to increased productivity in pigsty operations, fostering efficiency enhancements in pigsty management. Full article
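The Grad-CAM analysis the authors rely on weights each convolutional feature map by the spatial mean of its gradient, sums the weighted maps, and applies ReLU to keep only positively contributing regions. The toy sketch below illustrates just that weighting arithmetic on nested lists (invented numbers; real use would operate on framework tensors with hooks to capture gradients).

```python
# Schematic of the Grad-CAM weighting step (toy numbers, plain nested lists
# instead of tensors): each feature map is weighted by the spatial mean of
# its gradient, summed across channels, and passed through ReLU.

def grad_cam(feature_maps, gradients):
    """feature_maps, gradients: lists of K HxW grids of matching shape."""
    h, w = len(feature_maps[0]), len(feature_maps[0][0])
    # channel weight = global average of that channel's gradient
    weights = [sum(sum(row) for row in g) / (h * w) for g in gradients]
    cam = [[0.0] * w for _ in range(h)]
    for wk, amap in zip(weights, feature_maps):
        for i in range(h):
            for j in range(w):
                cam[i][j] += wk * amap[i][j]
    # ReLU keeps only regions that push the class score up
    return [[max(0.0, v) for v in row] for row in cam]

fmaps = [[[1.0, 0.0], [0.0, 0.0]], [[0.0, 0.0], [0.0, 1.0]]]
grads = [[[1.0, 1.0], [1.0, 1.0]], [[-1.0, -1.0], [-1.0, -1.0]]]
print(grad_cam(fmaps, grads))
# → [[1.0, 0.0], [0.0, 0.0]]
```

The heatmap highlights which image regions drove a posture prediction, which is how the study identifies the features behind correct and misclassified postures.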
(This article belongs to the Special Issue Sustainable Technology in Agricultural Engineering)
Show Figures

Figure 1: Overall system framework.
Figure 2: Acquired raw data image of the pigsty.
Figure 3: Object-detected smart pigsty.
Figure 4: Dataset configuration criteria.
Figure 5: Class labeling.
Figure 6: Image preprocessing process.
Figure 7: Learning curve of EfficientNetV2-S.
Figure 8: Confusion matrix of classification results.
Figure 9: Grad-CAM result.
Figure 10: Misclassified images by class.
Figure 11: Example utilization scenarios.
19 pages, 7206 KiB  
Article
A Lightweight Pig Face Recognition Method Based on Automatic Detection and Knowledge Distillation
by Ruihan Ma, Hassan Ali, Seyeon Chung, Sang Cheol Kim and Hyongsuk Kim
Appl. Sci. 2024, 14(1), 259; https://doi.org/10.3390/app14010259 - 27 Dec 2023
Cited by 2 | Viewed by 1627
Abstract
Identifying individual pigs is crucial for efficient breeding, health management, and disease control in modern farming. Traditional animal face identification methods are labor-intensive and prone to inaccuracies, while existing CNN-based pig face recognition models often struggle with high computational demands, large sizes, and reliance on extensive labeled data, which limit their practical application. This paper addresses these challenges by proposing a novel, decoupled approach to pig face recognition that separates detection from identification. This strategy employs a detection model as a pre-processing step, significantly reducing the need for extensive re-annotation for new datasets. Additionally, the paper introduces a method that integrates offline knowledge distillation with a lightweight pig face recognition model, aiming to build an efficient and embedding-friendly system. To achieve these objectives, the study constructs a small-scale, high-quality pig face detection dataset consisting of 1500 annotated images from a selection of 20 pigs. An independent detection model, trained on this dataset, then autonomously generates a large-scale pig face recognition dataset with 56 pig classes. In the face recognition stage, a robust teacher model guides the student model through a distillation process informed by a knowledge distillation loss, enabling the student model to learn relational features from the teacher. Experimental results confirm the high accuracy of the pig face detection model on the small-scale detection dataset and the ability to generate a large-scale dataset for pig face recognition on unlabeled data. The recognition experiments further verify that the distilled lightweight model outperforms its non-distilled counterparts and approaches the performance of the teacher model. This scalable, cost-effective solution shows significant promise for broader computer vision applications beyond agriculture. Full article
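The relational distillation idea the abstract refers to can be illustrated compactly: instead of matching teacher and student outputs directly, the student is penalized when the pairwise distance structure of its embeddings differs from the teacher's. The sketch below is our toy rendition of an RKD-style distance loss on plain tuples (the paper's RKD formulation uses Huber loss over mini-batch embeddings; this simplified mean-squared version only shows the relational idea).

```python
# Toy sketch of a relational distillation term in the spirit of RKD
# (simplified: MSE instead of Huber loss): the student should reproduce
# the teacher's normalized pairwise distance structure.
import math

def pairwise_dists(embs):
    """Mean-normalized pairwise Euclidean distances between embeddings."""
    d = [math.dist(a, b) for i, a in enumerate(embs) for b in embs[i + 1:]]
    mean = sum(d) / len(d)
    return [x / mean for x in d] if mean else d

def rkd_distance_loss(teacher_embs, student_embs):
    """Mean squared error between teacher and student distance structures."""
    t, s = pairwise_dists(teacher_embs), pairwise_dists(student_embs)
    return sum((a - b) ** 2 for a, b in zip(t, s)) / len(t)

teacher = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
scaled = [(0.0, 0.0), (3.0, 0.0), (0.0, 6.0)]  # same geometry, 3x scale
loss = rkd_distance_loss(teacher, scaled)      # ~0: relations preserved
```

Because the distances are mean-normalized, a student embedding that is a uniformly scaled copy of the teacher's incurs (near-)zero loss: only the *relations* between samples matter, which is what lets a small ShuffleNet-V2 inherit structure from a much larger ViT teacher.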
Show Figures

Figure 1: Proposed pig face recognition architecture based on automatic detection and a knowledge distillation technique.
Figure 2: The workflow of the proposed decoupled pig face recognition system.
Figure 3: Sample image visualization from the pig face detection dataset.
Figure 4: The MS-SSIM filters similar images in the pig face dataset.
Figure 5: The architecture of ViT-base as the teacher model.
Figure 6: The architecture of ShuffleNet-V2 as the student model.
Figure 7: Visualization of the training loss and average precision (AP50–95) from four YOLO models.
Figure 8: Predictions of YOLO-v8 on the pig face detection testing dataset.
Figure 9: Predicted unlabeled images of 56 pigs using YOLO-v8.
Figure 10: Training loss and accuracy curves for the teacher, student, and distilled student models: the ViT-base teacher, the standalone ShuffleNet-V2 student, and the ShuffleNet-V2 student after Relational Knowledge Distillation (RKD).
Figure 11: Confusion matrices of the teacher model, lightweight model, and distilled student model.
Figure 12: Comparison of teacher models' loss and accuracy curves.
Figure 13: Comparison of student models' loss and accuracy curves.
12 pages, 2694 KiB  
Article
Assessment of the Smartpill, a Wireless Sensor, as a Measurement Tool for Intra-Abdominal Pressure (IAP)
by Andréa Soucasse, Arthur Jourdan, Lauriane Edin, Elise Meunier, Thierry Bege and Catherine Masson
Sensors 2024, 24(1), 54; https://doi.org/10.3390/s24010054 - 21 Dec 2023
Cited by 1 | Viewed by 1379
Abstract
Background: The SmartPill, a multisensor ingestible capsule, is marketed for intestinal motility disorders. It includes a pressure sensor, which could be used to study intra-abdominal pressure (IAP) variations. However, validation data for this use are lacking. Material and Methods: An experimental study [...] Read more.
Background: The SmartPill, a multisensor ingestible capsule, is marketed for intestinal motility disorders. It includes a pressure sensor, which could be used to study intra-abdominal pressure (IAP) variations. However, validation data for this use are lacking. Material and Methods: An experimental study was conducted on anesthetized pigs with stepwise variations of IAP (from 0 to 15 mmHg in 3 mmHg steps) generated by laparoscopic insufflation. A SmartPill, inserted by endoscopy, provided intragastric pressure data. These data were compensated to account for intra-abdominal temperature. They were compared to the pressures recorded by intragastric (IG) and intraperitoneal (IP) wired sensors using Spearman correlation and Bland–Altman analyses. Results: More than 4500 pressure values per sensor were generated on two animals. The IG pressure values obtained with the SmartPill were correlated with those obtained with the wired sensor (Spearman ρ coefficients 0.90 ± 0.08 and 0.72 ± 0.25; bias of −28 ± −0.3 mmHg and −29.2 ± 0.5 mmHg for pigs 1 and 2, respectively). The intragastric SmartPill values were also correlated with the IAP measured intra-peritoneally (Spearman ρ coefficients 0.49 ± 0.18 and 0.57 ± 0.30; bias of −29 ± 1 mmHg and −31 ± 0.7 mmHg for pigs 1 and 2, respectively). Conclusions: The SmartPill is a wireless and painless sensor that appears to correctly monitor IAP variations. Full article
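The statistical comparison the abstract describes, Spearman rank correlation plus Bland–Altman bias analysis, can be sketched in a few lines. This is an illustrative re-implementation, not the authors' analysis code; note the rank transform below assigns arbitrary ordinal ranks to ties, whereas a full implementation would use average ranks.

```python
import numpy as np

def spearman_rho(x, y):
    # Spearman's rho = Pearson correlation of the rank-transformed data.
    # argsort-of-argsort yields ordinal ranks (ties handled arbitrarily).
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

def bland_altman(a, b):
    # Bias = mean of paired differences; 95% limits of agreement
    # = bias ± 1.96 * SD of the differences.
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

In the study's setting, `a` and `b` would be simultaneous pressure readings from the SmartPill and a wired reference sensor; a negative bias means the capsule reads systematically lower than the reference.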
(This article belongs to the Special Issue Wearable Sensors for Monitoring Athletic and Clinical Cohorts)
Figure 1: Image and schematic of the components of the SmartPill®; reprinted from https://doi.org/10.1016/S1369-7021(09)70272-X (accessed on 2 December 2023) [21].
Figure 2: Schema of the experimental setup.
Figure 3: Intra-gastric and intra-peritoneal pressure as functions of pressure steps; patterns of the different signals.
Figure 4: Correlation between intra-gastric pressure measured by the SmartPill™ and by the wired sensor.
Figure 5: Measurement bias between intra-gastric pressure measured by the SmartPill™ and intra-gastric pressure measured by the wired sensor.
Figure 6: Correlation between pressure measured by the SmartPill™ and by the wired intra-peritoneal sensor.
Figure 7: Measurement bias between intra-gastric pressure measured by the SmartPill™ and intra-peritoneal pressure measured by a wired sensor.
Figure 8: Illustration of the SmartPill™ measurement bias: the bias that appears on the pressure (mmHg) versus time (s) curve when the SmartPill™ (SP) capsule is placed in pig 1's stomach. The temperature (°C) as a function of time (s) is also shown.
Figure 9: Comparison of the pressure measured by the SmartPill™ depending on the chosen data export mode.
22 pages, 1794 KiB  
Article
Smart Temperature and Humidity Control in Pig House by Improved Three-Way K-Means
by Haopu Li, Haoming Li, Bugao Li, Jiayuan Shao, Yanbo Song and Zhenyu Liu
Agriculture 2023, 13(10), 2020; https://doi.org/10.3390/agriculture13102020 - 18 Oct 2023
Cited by 2 | Viewed by 2480
Abstract
Efficiently managing temperature and humidity in a pig house is crucial for enhancing animal welfare. This research endeavors to develop an intelligent temperature and humidity control system grounded in a three-way decision and clustering algorithm. To establish and validate the effectiveness of this [...] Read more.
Efficiently managing temperature and humidity in a pig house is crucial for enhancing animal welfare. This research endeavors to develop an intelligent temperature and humidity control system grounded in a three-way decision and clustering algorithm. To establish and validate the effectiveness of this intelligent system, experiments were conducted to compare its performance against a naturally ventilated pig house without any control system. Additionally, comparisons were made with a threshold-based control system to evaluate the duration of temperature anomalies. The experimental findings demonstrate a substantial improvement in temperature regulation within the experimental pig house. Over a 24 h period, the minimum temperature increased by 4 °C, while the maximum temperature decreased by 8 °C, approaching the desired range. Moreover, the average air humidity decreased from 73.4% to 68.2%. In summary, this study presents a precision-driven intelligent control strategy for optimizing temperature and humidity management in pig housing facilities. Full article
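The abstract's core idea, clustering combined with a three-way decision, can be sketched as follows: run k-means on the sensed temperature/humidity data, then split each cluster's membership into a positive (core) region, a boundary region needing further evaluation, and a negative region. This is a hedged illustration only: the `alpha`/`beta` distance thresholds and the plain k-means routine are my assumptions, not the paper's improved algorithm.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    # Plain Lloyd's k-means (illustrative; the paper uses an improved variant).
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def three_way_regions(X, centers, alpha=1.2, beta=2.5):
    # Three-way decision per cluster: points within alpha * (mean nearest
    # distance) of a centre belong to its positive region; points between
    # the alpha and beta thresholds fall in the boundary region (defer the
    # decision); everything else is in the negative region.
    d = np.linalg.norm(X[:, None] - centers[None], axis=2)
    scale = d.min(axis=1).mean()
    core = d <= alpha * scale
    boundary = (d > alpha * scale) & (d <= beta * scale)
    return core, boundary
```

In a control loop, core membership could trigger the cluster's associated regulation action immediately, while boundary points would be passed to a secondary classifier (the paper's Figure 2 mentions KNN) before acting.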
(This article belongs to the Special Issue Computer Vision and Sensor Networks in Agriculture)
Figure 1: Block diagram of the black-box pattern used to choose the control scheme.
Figure 2: Flow chart of microcosmic regulation of temperature and humidity. KNN denotes the k-nearest-neighbor algorithm.
Figure 3: Temperature and humidity control system block diagram.
Figure 4: Cross-sectional view of the experimental pig house.
Figure 5: Results of the weather station (26 September 2022 to 28 September 2022). T_i: indoor air temperature, T_o: outdoor air temperature, H_i: indoor relative humidity, H_o: outdoor relative humidity, W_s: wind speed, T_s: surface temperature, W_d: wind direction, P_s: surface pressure.
Figure 6: ACC of each algorithm. (a): average ACC, (b): optimal ACC.
Figure 7: ARI of each algorithm. (a): average ARI, (b): optimal ARI.
Figure 8: AS of each algorithm. (a): average AS, (b): optimal AS.
Figure 9: Temperature conditions (26 September 2022 to 28 September 2022). (a): inside temperature, (b): outside temperature.
Figure 10: Humidity conditions (26 September 2022 to 28 September 2022). (a): inside humidity, (b): outside humidity.
Figure 11: The external climate conditions (26 September 2022 to 28 September 2022). (a): wind speed, (b): wind direction, (c): surface temperature, (d): surface pressure.
Figure 12: Comparison of inside temperatures with and without control (26 September 2022 to 28 September 2022).
Figure 13: Comparison of inside humidity with and without control (26 September 2022 to 28 September 2022).
Figure 14: Percentage of abnormal time in a day.
23 pages, 2748 KiB  
Article
Demonstration of an Integrated Methodology for the Sustainable Valorisation of Bakery Former Food Products as a Pig Feed Ingredient: A Circular Bioeconomy Paradigm
by Apostolos Malamakis, Sotiris I. Patsios, Lefteris Melas, Anna Dedousi, Konstantinos N. Kontogiannopoulos, Konstantinos Vamvakas, Nikos Tsotsolas, Eleni Koutsouraki, Evangelia N. Sossidou and George F. Banias
Sustainability 2023, 15(19), 14385; https://doi.org/10.3390/su151914385 - 29 Sep 2023
Cited by 1 | Viewed by 1320
Abstract
This study aims to demonstrate an integrated methodology for the valorisation of bakery former food products (FFP) as an ingredient of pig feed diets. The methodology involves: conducting a needs analysis and a full path traceability scheme based on Global Standards 1 (GS1) [...] Read more.
This study aims to demonstrate an integrated methodology for the valorisation of bakery former food products (FFP) as an ingredient of pig feed diets. The methodology involves: conducting a needs analysis and a full path traceability scheme based on Global Standards 1 (GS1) Organisation (Brussels, Belgium) standards, designing digital tools to support the implementation of the traceability scheme, and assessing the valorisation of FFP and, more specifically, of bakery by-products in bakery meal (BM) production, and its implementation in pig feed diet. BM production comprises various bakery by-products, which were collected, unpacked, ground, and thermally treated. Physicochemical and microbiological analyses were conducted on BM samples, focusing mainly on nutrient composition and the presence of aflatoxins, mycotoxins, and pathogenic microorganisms. The BM was then fed to finishing pigs (at an inclusion rate of 20% w/w), in parallel to a control group fed a conventional pig feed diet. The animals in both dietary groups were evaluated for growth performance, and meat samples were analysed for specific quality parameters and sensory characteristics. The results show that the addition of 20% w/w BM does not significantly affect the growth performance or the meat quality of the pigs. Moreover, a sensory evaluation revealed only minor differences in the sensory characteristics of the meat samples, indicating that the BM addition does not diminish the quality of the final meat product. Full article
(This article belongs to the Section Sustainable Materials)
Figure 1: Mapping of possible by-product sources. The three zones of interest are depicted in each map section. The QR code is an open data source of the mapped sources.
Figure 2: Sensory characteristics of pig meat. Characteristics were judged by a panel on a scale from 1 to 5.
Figure 3: A screenshot of the B2B App.
Figure 4: The Traceability App.
Figure 5: A screenshot of the Label Creation App.
15 pages, 7528 KiB  
Article
Efficient Aggressive Behavior Recognition of Pigs Based on Temporal Shift Module
by Hengyi Ji, Guanghui Teng, Jionghua Yu, Yanbin Wen, Huixiang Deng and Yanrong Zhuang
Animals 2023, 13(13), 2078; https://doi.org/10.3390/ani13132078 - 23 Jun 2023
Cited by 7 | Viewed by 2320
Abstract
Aggressive behavior among pigs is a significant social issue that has severe repercussions on both the profitability and welfare of pig farms. Due to the complexity of aggression, recognizing it requires the consideration of both spatial and temporal features. To address this problem, [...] Read more.
Aggressive behavior among pigs is a significant social issue that has severe repercussions on both the profitability and welfare of pig farms. Because aggression is complex, recognizing it requires the consideration of both spatial and temporal features. To address this problem, we proposed an efficient method that utilizes the temporal shift module (TSM) for automatic recognition of pig aggression. Specifically, TSM is inserted into four 2D convolutional neural network models, namely ResNet50, ResNeXt50, DenseNet201, and ConvNext-t, enabling the models to process both spatial and temporal features without increasing the model parameters or computational complexity. The proposed method was evaluated on the dataset established in this study, and the results indicate that the ResNeXt50-T model (TSM inserted into ResNeXt50) achieved the best balance between recognition accuracy and model parameters. On the test set, the ResNeXt50-T model achieved accuracy, recall, precision, F1 score, speed, and model parameters of 95.69%, 95.25%, 96.07%, 95.65%, 29 ms, and 22.98 M, respectively. These results show that the proposed method can effectively improve the accuracy of recognizing aggressive behavior in pigs and provide a reference for behavior recognition in practical smart livestock farming scenarios. Full article
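The temporal shift operation at the heart of TSM is simple enough to sketch directly. The NumPy snippet below shifts the first C/fold_div channels one step along the time axis in one direction and the next C/fold_div in the other, zero-filling the vacated frames; `fold_div=8` follows the common TSM default, and the models in the paper wrap this shift inside their 2D-CNN blocks rather than using it standalone.

```python
import numpy as np

def temporal_shift(x, fold_div=8):
    # x: (N, T, C, H, W) feature tensor.
    # First C/fold_div channels receive features from the *next* frame,
    # the next C/fold_div from the *previous* frame, and the remaining
    # channels are copied unchanged. Vacated time steps are zero-filled.
    n, t, c, h, w = x.shape
    fold = c // fold_div
    out = np.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                # shift left in time
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]  # shift right in time
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]           # untouched channels
    return out
```

Because the shift is a pure memory rearrangement, it adds temporal mixing to a 2D CNN with zero extra parameters and negligible compute, which is exactly the property the abstract highlights.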
Figure 1: Composition and examples of aggressive and non-aggressive behaviors in the datasets.
Figure 2: Schematic diagram of the temporal shift module: (a) the input features with C channels and T frames; (b) TSM moves the features along the temporal dimension; (c) the output features mingle both past and future frames with the current frame.
Figure 3: Schematic diagram of inserting TSM into blocks of different 2D CNN models: (a) ResNet block with temporal shift module; (b) ResNeXt block with temporal shift module; (c) ConvNext block with temporal shift module; (d) DenseNet block with temporal shift module and dense connections in the DenseNet stage.
Figure 4: The accuracy of ResNet50, ResNeXt50, ConvNext-t, and DenseNet201 with TSM inserted: (a) accuracy on the training set; (b) accuracy on the validation set.
Figure 5: ResNeXt50-T heat-map comparison after extracting video features of aggressive and non-aggressive behaviors: (a) when the two pigs in the center of the video engaged in mutual biting with significant displacement, ResNeXt50-T successfully focused on their positions; (b) when no aggressive behavior was observed in the video, ResNeXt50-T tended to focus on the region where the pigs were clustered.
Figure 6: Examples and heat maps of videos misclassified by ResNeXt50-T: (a) because the two pigs in the top left corner exhibited treading behavior, characterized by significant overlap of the pigs' bodies without notable displacement, ResNeXt50-T had difficulty recognizing the aggressive behavior; (b) because the heads of the three pigs in the middle overlapped with significant displacement, ResNeXt50-T erroneously recognized this behavior as aggression.
20 pages, 2680 KiB  
Review
A Review of Posture Detection Methods for Pigs Using Deep Learning
by Zhe Chen, Jisheng Lu and Haiyan Wang
Appl. Sci. 2023, 13(12), 6997; https://doi.org/10.3390/app13126997 - 9 Jun 2023
Cited by 8 | Viewed by 2766
Abstract
Analysis of pig posture is significant for improving the welfare and yield of captive pigs under different conditions. Detection of pig postures, such as standing, lateral lying, sternal lying, and sitting, can facilitate a comprehensive assessment of the psychological and physiological conditions of [...] Read more.
Analysis of pig posture is significant for improving the welfare and yield of captive pigs under different conditions. Detection of pig postures, such as standing, lateral lying, sternal lying, and sitting, can facilitate a comprehensive assessment of the psychological and physiological conditions of pigs, prediction of their abnormal or detrimental behavior, and evaluation of the farming conditions to improve pig welfare and yield. With the introduction of smart farming into the farming industry, effective and applicable posture detection methods become indispensable for realizing the above purposes in an intelligent and automatic manner. From early manual modeling to traditional machine vision, and then to deep learning, multifarious detection methods have been proposed to meet the practical demand. Posture detection methods based on deep learning show great superiority in terms of performance (such as accuracy, speed, and robustness) and feasibility (such as simplicity and universality) compared with most traditional methods. It is promising to popularize deep learning technology in actual commercial production on a large scale to automate pig posture monitoring. This review comprehensively introduces the data acquisition methods and sub-tasks for pig posture detection and their technological evolutionary processes, and also summarizes the application of mainstream deep learning models in pig posture detection. Finally, the limitations of current methods and the future directions for research will be discussed. Full article
(This article belongs to the Special Issue Feature Review Papers in Agricultural Science and Technology)
Figure 1: The correlation between external factors, the psychological and physiological state of pigs, pig postures, and pig welfare and production. (The external factors include breeding conditions, environmental parameters, social interaction between pigs within the same enclosure, and invasive human activities [5,6,7].)
Figure 2: The process of utilizing acquired data to address real-world problems.
Figure 3: Examples of pig localization: (a) by contours; (b) by bounding boxes; (c) by key points. (Blue marks indicate left and right neck, purple marks left and right shoulder, green marks left and right abdomen, red marks left and right hip, and yellow marks left and right tail [39].)
Figure 4: Example of pig posture classification.
Figure 5: (a) Example of a pig segmentation mask; (b) example of pig identification and tracking.
Figure 6: A typical two-stage model pipeline for pig posture detection.
Figure 7: A typical one-stage model diagram for pig posture detection. (The image is divided into grids; within each grid cell, bounding boxes, confidence scores, and class probabilities for the different posture types are predicted simultaneously.)
13 pages, 1605 KiB  
Article
SmartPill™ Administration to Assess Gastrointestinal Function after Spinal Cord Injury in a Porcine Model—A Preliminary Study
by Chase A. Knibbe, Rakib Uddin Ahmed, Felicia Wilkins, Mayur Sharma, Jay Ethridge, Monique Morgan, Destiny Gibson, Kimberly B. Cooper, Dena R. Howland, Manicka V. Vadhanam, Shirish S. Barve, Steven Davison, Leslie C. Sherwood, Jack Semler, Thomas Abell and Maxwell Boakye
Biomedicines 2023, 11(6), 1660; https://doi.org/10.3390/biomedicines11061660 - 7 Jun 2023
Cited by 4 | Viewed by 2095
Abstract
Gastrointestinal (GI) complications, including motility disorders, metabolic deficiencies, and changes in gut microbiota following spinal cord injury (SCI), are associated with poor outcomes. After SCI, the autonomic nervous system becomes unbalanced below the level of injury and can lead to severe GI dysfunction. [...] Read more.
Gastrointestinal (GI) complications, including motility disorders, metabolic deficiencies, and changes in gut microbiota following spinal cord injury (SCI), are associated with poor outcomes. After SCI, the autonomic nervous system becomes unbalanced below the level of injury and can lead to severe GI dysfunction. The SmartPill™ is a non-invasive capsule that, when ingested, transmits pH, temperature, and pressure readings that can be used to assess effects in GI function post-injury. Our minipig model allows us to assess these post-injury changes to optimize interventions and ultimately improve GI function. The aim of this study was to compare pre-injury to post-injury transit times, pH, and pressures in sections of GI tract by utilizing the SmartPill™ in three pigs after SCI at 2 and 6 weeks. Tributyrin was administered to two pigs to assess the influences on their gut microenvironment. We observed prolonged GET (Gastric Emptying Time) and CTT (Colon Transit Time), decreases in contraction frequencies (Con freq) in the antrum of the stomach, colon, and decreases in duodenal pressures post-injury. We noted increases in Sum amp generated at 2 weeks post-injury in the colon, with corresponding decreases in Con freq. We found transient changes in pH in the colon and small intestine at 2 weeks post-injury, with minimal effect on stomach pH post-injury. Prolonged GETs and CTTs can influence the absorptive profile in the gut and contribute to pathology development. This is the first pilot study to administer the SmartPill™ in minipigs in the context of SCI. Further investigations will elucidate these trends and characterize post-SCI GI function. Full article
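The transit-time landmarks the abstract mentions (GET, CTT) are read off the capsule's pH trace: gastric emptying, for example, is marked by a permanent rise in pH as the capsule leaves the acidic stomach. The toy function below detects that landmark as the first sample after which pH never falls back below a threshold; this is a simplified illustration with an assumed threshold of 4, whereas the real analysis also uses pressure and temperature and is performed with the vendor's software.

```python
def gastric_emptying_index(ph, threshold=4.0):
    # GET corresponds to a *permanent* rise in pH above the threshold as
    # the capsule passes the pyloric sphincter. A momentary spike (e.g.
    # a swallowed buffer) does not count, so we require every subsequent
    # sample to stay above the threshold. Returns the sample index, or
    # None if the capsule never permanently leaves the acidic range.
    for i, p in enumerate(ph):
        if p > threshold and all(q > threshold for q in ph[i:]):
            return i
    return None
```

Colon arrival and body exit could be detected analogously (a sustained ~1-unit pH drop, and a permanent rise above pH 7, respectively), turning one pH trace into the segment-wise transit times compared pre- and post-injury.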
(This article belongs to the Special Issue Porcine Models of Neurotrauma and Neurological Disorders)
Graphical abstract

Figure 1: Whole-gut pH recorded by the SmartPill™ in the pre-injury condition. The green line depicts pig 1 and the red line pig 2. Graphs show raw data recorded by the SmartPill™. A drop in mean pH to between 1 and 3 indicates that the wireless motility capsule (WMC) has been ingested and resides in the stomach. A permanent rise in mean pH above 4, to between approximately 7 and 9, with a concomitant increase in pressure, indicates passage through the pyloric sphincter (GET) into the small intestine. A subsequent decrease in pH to approximately 6 to 8, with a congruent increase in pressure, indicates passage through the ileocecal valve (SITT) into the colon, marking the colon arrival time (CAT) and the start of CTT. A permanent and continuous rise above pH 7 indicates that the WMC has exited the body, completing the whole-gut transit time (WGTT).
Figure 2: Whole-gut pH recorded by the SmartPill™ at week 2 post-injury. The green line is pig 1, the red line is pig 2, and the dark blue line is pig 3. Graphs show raw data recorded by the SmartPill™, analyzed as described for Figure 1.
Figure 3: Whole-gut pH recorded by the SmartPill™ six weeks post-injury. The green line is pig 1, the red line is pig 2, and the dark blue line is pig 3. Graphs show raw data recorded by the SmartPill™, analyzed as described for Figure 1.