Search Results (281)

Search Parameters:
Keywords = multi-date image

39 pages, 9959 KiB  
Article
Utilization of Non-Composted Human Hair Hydrolysate as a Natural and Nutrient-Rich Liquid Fertilizer for Sustainable Agro-Applications and Bio-Waste Management
by Kaan Yetilmezsoy, Fatih Ilhan and Emel Kıyan
Sustainability 2025, 17(4), 1641; https://doi.org/10.3390/su17041641 - 16 Feb 2025
Viewed by 633
Abstract
Human hair, commonly considered a discarded organic waste, is a keratin-rich material with remarkable potential for sustainable agriculture as an innovative resource. This study systematically explored the potential of non-composted human hair hydrolysates as eco-friendly and nutrient-rich liquid fertilizers, emphasizing their ability to enhance agricultural sustainability and mitigate organic waste accumulation. Eight distinct hydrolysates prepared with alkaline solutions were evaluated for their effects on plant growth using red-hot chili pepper (Capsicum frutescens) as the primary model under greenhouse conditions. The present study introduces a novel approach by employing an advanced digital image analysis technique to quantitatively assess 37 distinct plant growth parameters, providing an unprecedented depth of understanding regarding the impact of liquid human hair hydrolysates on plant development. Additionally, the integration of pilot-scale field trials and multi-species evaluations highlights the broader applicability and scalability of these hydrolysates as sustainable fertilizers. Collectively, these features establish this research as a pioneering contribution to sustainable agriculture and bio-waste management. The top-performing hydrolysates (KCaMgN, KMgN, KCaN) demonstrated significant enhancements in plant growth metrics, with fresh weight reaching up to 3210 mg, projected leaf area of approximately 132 cm², and crown diameter of 20.91 cm for the best-performing formulations, outperforming a commercial organomineral fertilizer by 20–46% in overall growth performance. Furthermore, observational studies on various species (such as bird of paradise flower (Strelitzia reginae), avocado (Persea americana), lemon (Citrus limon L.), Mazafati date (Phoenix dactylifera L.), and red mini conical hot pepper (Capsicum annuum var. conoides)) and field trials on long sweet green peppers (Capsicum annuum) confirmed the broad applicability of these hydrolysates.
Toxicity assessments using shortfin molly fish (Poecilia sphenops) validated the environmental safety of plants cultivated with hydrolysates. These findings highlight that human hair hydrolysates offer a sustainable alternative to synthetic fertilizers, contributing to waste management efforts while enhancing agricultural productivity.
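The abstract highlights a digital image analysis pipeline that quantifies 37 growth parameters, including projected leaf area. As a rough illustration only (not the authors' method), projected leaf area can be estimated from a calibrated top-down photo by counting green-dominant pixels; the `cm_per_px` scale factor and the simple excess-green rule below are assumptions for the sketch:

```python
def projected_leaf_area_cm2(rgb, cm_per_px):
    """Estimate projected leaf area from a top-down plant photo.

    rgb: image as a list of rows of (R, G, B) tuples (0-255).
    cm_per_px: centimeters per pixel, obtained from a calibration
    target of known size placed in the scene.
    A pixel counts as foliage when its green channel dominates both
    red and blue -- a crude excess-green rule; a real pipeline would
    also correct for illumination and shadows.
    """
    foliage_pixels = sum(
        1 for row in rgb for (r, g, b) in row if g > r and g > b
    )
    # Each foliage pixel covers cm_per_px * cm_per_px of ground area.
    return foliage_pixels * cm_per_px ** 2
```

For example, a 10 × 10 image whose top half is pure green at 0.1 cm/px yields 50 × 0.01 = 0.5 cm².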
Figures:
Figure 1: Schematic representation of waste human hair hydrolysis and subsequent application as a liquid fertilizer.
Figure 2: Multi-stage digital image analysis for comprehensive plant growth parameter quantification.
Figure 3: Definition of key morphological parameters for Capsicum frutescens growth via digital image analysis approach.
Figure 4: Flowchart of the experimental design illustrating the different treatments applied to each experimental group from initial preparations to final evaluations.
Figure 5: Static bioassay apparatus for evaluating acute toxicity of dried chili pepper leaves (grown with waste human hair fertilizers) in Poecilia sphenops.
Figure 6: Comparative growth response of Capsicum frutescens to human hair hydrolysates, commercial fertilizer, and control treatments.
Figure 7: A controlled laboratory-based field model for evaluating human hair hydrolysate effects on shallot growth: (a) model construction (10 July 2023), (b) soil preparation and initial liquid fertilization (13 July 2023), (c) planting of shallot bulbs (13 July 2023), and (d) development of green shoots (5 September 2023) and subsequent harvest.
Figure 8: Growth observation of five temperate and tropical climate plant species under controlled environment: (a) Capsicum annuum var. conoides (22 April 2024), (b) Phoenix dactylifera L. (5 September 2023), (c) Citrus limon L. (16 October 2023), (d) Persea americana (22 May 2023), (e) Strelitzia reginae (14 August 2023).
Figure 9: Photographs showing the enhanced growth of long sweet green pepper (Capsicum annuum) plants treated with human hair hydrolysate (KCaMgN set) compared to controls in a pilot-scale garden (12 September 2023).
26 pages, 7164 KiB  
Article
Leveraging Semantic Segmentation for Photovoltaic Plants Mapping in Optimized Energy Planning
by Giulia Ronchetti, Martina Aiello and Alberto Maldarella
Remote Sens. 2025, 17(3), 483; https://doi.org/10.3390/rs17030483 - 30 Jan 2025
Viewed by 487
Abstract
The growth of photovoltaic (PV) installations is essential for the global energy transition; however, comprehensive data regarding their spatial distribution are limited, which complicates effective energy planning. This research introduces a methodology for automatic recognition of ground-mounted PV systems in Italy, using semantic segmentation and Sentinel-2 RGB images with a resolution of 10 m. The objective of this methodology is to accurately identify both the locations and the sizes of these installations, estimate their capacity, and facilitate regular updates to maps, thereby supporting energy planning strategies. The segmentation model, which is founded on a U-Net architecture, is trained using a dataset from 2019 and evaluated on two separate cases that involve different dates and geographical areas. We propose a multi-temporal approach, applying the model to a sequence of images taken throughout the year and aggregating the results to create a PV detection probability map. Users have the flexibility to modify probability thresholds to enhance accuracy: lower thresholds increase producer accuracy, ensuring continuous area detection for capacity estimation, while higher thresholds boost user accuracy by reducing false positives. Additionally, post-processing techniques, such as filtering for plastic-covered greenhouses, assist in minimizing detection errors. However, there is a need for improved model generalizability across various landscapes, necessitating retraining with images from a range of environmental contexts.
(This article belongs to the Special Issue Remote Sensing: 15th Anniversary)
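The multi-temporal step described in the abstract (run the segmentation model on many acquisition dates, aggregate the binary outputs into a detection probability map, then threshold it) can be sketched as follows. This is a simplified illustration under assumed inputs; the function and variable names are not from the paper:

```python
def probability_map(masks):
    """PV detection probability per pixel: the fraction of acquisition
    dates on which the segmentation model classified that pixel as PV.

    masks: list of same-shaped binary masks (list of rows of 0/1),
    one mask per Sentinel-2 acquisition date.
    """
    n = len(masks)
    rows, cols = len(masks[0]), len(masks[0][0])
    return [
        [sum(m[i][j] for m in masks) / n for j in range(cols)]
        for i in range(rows)
    ]

def pv_map(prob, threshold):
    """Binarize the probability map. As in the abstract: a lower
    threshold favors producer accuracy (misses fewer plant pixels),
    a higher threshold favors user accuracy (fewer false positives).
    """
    return [[p >= threshold for p in row] for row in prob]
```

For instance, a pixel flagged on only one of two dates gets probability 0.5 and survives a 0.4 threshold but not a 0.6 one.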
Figures:
Figure 1: A flowchart illustrating the primary stages of this study.
Figure 2: Study areas: (a) SA Puglia, used for both model training and testing; (b) SA Emilia-Romagna, used for model testing. Basemap: © OpenStreetMap contributors.
Figure 3: Our proposed framework for the segmentation of PV installations on Sentinel-2 images.
Figure 4: An example of an RGB image and its corresponding label included in the dataset used for model training. Blue pixels represent background; yellow pixels represent PV plants.
Figure 5: The evaluation of accuracy and loss function during model training: (a) accuracy score curve, (b) cross-entropy curve.
Figure 6: Examples taken from the test set, representing the RGB image, ground truth, and model prediction. Numbers in the maps indicate pixel coordinates (unitless).
Figure 7: SA Puglia 2023: PV plant maps obtained for varying threshold values.
Figure 8: Detection performance for varying threshold values: (a) examples of PV plants fully recognized with any threshold; (b) example of PV plants fully recognized only with threshold 0. Basemap: Google Satellite.
Figure 9: Detection performance for varying threshold values: examples of PV plants as visible in (a) the Google Satellite basemap, (b) the Sentinel-2 image from January 2023, and (c) the Sentinel-2 image from October 2023.
Figure 10: SA Emilia-Romagna 2019: PV plant maps obtained for varying threshold values.
Figure A1: Example of undetected small PV plants: (a) Google Satellite basemap; (b) Sentinel-2 image from January 2023; (c) Sentinel-2 image from October 2023.
Figure A2: Example of undetected PV plants: (a) Google Satellite basemap; (b) Sentinel-2 image from January 2023; (c) Sentinel-2 image from October 2023.
Figure A3: Example of partially undetected PV plants: (a) Google Satellite basemap; (b) Sentinel-2 image from January 2023; (c) Sentinel-2 image from October 2023.
47 pages, 20555 KiB  
Article
Commissioning an All-Sky Infrared Camera Array for Detection of Airborne Objects
by Laura Domine, Ankit Biswas, Richard Cloete, Alex Delacroix, Andriy Fedorenko, Lucas Jacaruso, Ezra Kelderman, Eric Keto, Sarah Little, Abraham Loeb, Eric Masson, Mike Prior, Forrest Schultz, Matthew Szenher, Wesley Andrés Watters and Abigail White
Sensors 2025, 25(3), 783; https://doi.org/10.3390/s25030783 - 28 Jan 2025
Cited by 2 | Viewed by 946
Abstract
To date, there is little publicly available scientific data on unidentified aerial phenomena (UAP) whose properties and kinematics purportedly reside outside the performance envelope of known phenomena. To address this deficiency, the Galileo Project is designing, building, and commissioning a multi-modal, multi-spectral ground-based observatory to continuously monitor the sky and collect data for UAP studies via a rigorous long-term aerial census of all aerial phenomena, including natural and human-made. One of the key instruments is an all-sky infrared camera array using eight uncooled long-wave-infrared FLIR Boson 640 cameras. In addition to performing intrinsic and thermal calibrations, we implement a novel extrinsic calibration method using airplane positions from Automatic Dependent Surveillance–Broadcast (ADS-B) data that we collect synchronously on site. Using a You Only Look Once (YOLO) machine learning model for object detection and the Simple Online and Realtime Tracking (SORT) algorithm for trajectory reconstruction, we establish a first baseline for the performance of the system over five months of field operation. Using an automatically generated real-world dataset derived from ADS-B data, a dataset of synthetic 3D trajectories, and a hand-labeled real-world dataset, we find an acceptance rate (fraction of in-range airplanes passing through the effective field of view of at least one camera that are recorded) of 41% for ADS-B-equipped aircraft, and a mean frame-by-frame aircraft detection efficiency (fraction of recorded airplanes in individual frames which are successfully detected) of 36%. The detection efficiency is heavily dependent on weather conditions, range, and aircraft size. Approximately 500,000 trajectories of various aerial objects are reconstructed from this five-month commissioning period. These trajectories are analyzed with a toy outlier search focused on the large sinuosity of apparent 2D reconstructed object trajectories. 
About 16% of the trajectories are flagged as outliers and manually examined in the IR images. From these ∼80,000 outliers, 144 trajectories remain ambiguous; these are likely mundane objects but cannot be further elucidated at this stage of development without information about distance and kinematics or other sensor modalities. We demonstrate the application of a likelihood-based statistical test to evaluate the significance of this toy outlier analysis. Our observed count of ambiguous outliers combined with systematic uncertainties yields an upper limit of 18,271 outliers for the five-month interval at a 95% confidence level. This test is applicable to all of our future outlier searches.
(This article belongs to the Section Sensors and Robotics)
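The outlier search above ranks reconstructed 2D trajectories by sinuosity. The standard definition (traveled path length divided by the straight-line distance between the endpoints) can be computed as in this generic sketch, which is not the project's own code:

```python
import math

def sinuosity(points):
    """Sinuosity of a 2D trajectory given as a list of (x, y) points:
    total path length divided by the endpoint chord length.
    Equals 1.0 for a straight path, grows with meandering, and is
    infinite for a closed loop (zero chord).
    """
    # Sum of segment lengths between consecutive detections.
    path = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    # Straight-line distance between first and last detection.
    chord = math.dist(points[0], points[-1])
    return path / chord if chord > 0 else math.inf
```

A straight three-point track has sinuosity 1.0, while a right-angle turn over two unit segments has sinuosity 2/√2 = √2, which is why large sinuosity flags erratic apparent motion.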
Figures:
Figure 1: (Left) Mechanical design drawing of the Dalek IR camera array. (Right) Photograph of the Dalek as constructed at the development site.
Figure 2: (Left) Illustration of fields of view (FOVs) and their overlap between the eight cameras of the Dalek. The orange areas represent the FOVs of the seven cameras arranged hemispherically; the purple area shows the FOV of the zenith camera on top of the Dalek. (Right) Side view of the fields of view for the hemispherical Dalek cameras. As the centers of their optical axes point 30° above the horizon, the bottom of the images they capture corresponds to 10° above the horizon.
Figure 3: Map view of a mosaic of images from the seven hemispheric cameras and the one zenith Boson IR camera, and their orientation with respect to a visible all-sky camera photograph from the Dalek's location (background, center of the image). The shaded (purple) semi-translucent overlays show the corresponding treeline masks that are used in post-processing to ignore all but the sky area of the images. All camera frames are taken from a video recording from 7 May 2024, except for camera 1 (3 April 2024) and camera 7 (10 May 2024).
Figure 4: Stacked area plot showing the evolution over time of the sum of all cameras' recording efficiencies, defined as recording duration per camera per day divided by the expected duration based on the recording schedule, which varies per camera. If all cameras were recording according to the schedule all the time, the summed efficiencies would add up to 8; the few data points above 8 are due to manual enabling of the recording for testing purposes. The timeline runs from November 2023 to May 2024. Some cameras, such as cameras 6 and 7, show a drastic improvement over time.
Figure 5: Metal chessboard used for intrinsic calibration of the FLIR Boson 640 cameras. (Left) Boson camera image. (Right) Dark-painted metal cutout chessboard on a dark-painted metal base plate.
Figure 6: (Left) Four cameras mounted on a fixture and pointed at the black metal plate used for removing INUs through lens gain correction and supplemental flat-field correction (SFFC). (Middle) Four cameras mounted on a fixture and pointing at the black polyurethane foam in the incubator, used for thermal calibration of four Boson cameras at the same time. (Right) Example of a 16-bit image taken with a Boson camera using the thermal calibration setup described in Section 2.2.5. The two small pieces of reflective tape, pinned at the top and left side of the foam, are used to find the geometrical center of the foam in the image, where the target temperature was measured using a thermometer. The "ears" of foam minimize thermal reflections from the sides and door of the freezer or incubator.
Figure 7: Euler angles representing the orientation matrix of camera 1 in the Dalek over a period of three months (January to March 2024). The error bars are estimated from the inverse Hessian matrix after minimization; large error bars are correlated with a lack of statistics. Overall the fluctuations are of the order of 1%. The dashed lines represent the fixed Euler angle values that were used throughout this paper.
Figure 8: (Left) Map of measurements made on four cameras of the Dalek. (Right) Visualization of the fitted polynomial. The maximum pixel values occur when the camera and target are both "hot". Pixel values are divided by 10^4 on the z-axis.
Figure 9: (Left) Example of a 16-bit infrared image of an aircraft taken with a Boson camera. (Right) Zoom in on the airplane in the scene. The red lines cross at the pixel of maximum brightness; the area between the green rectangles is used to estimate the ambient brightness.
Figure 10: Comparison of straight (left), curved (middle), and piecewise (right) 3D trajectories from the synthetic video dataset. Trajectories are identified by color and unique number.
Figure 11: Comparison of the YOLOv5 model trained on the synthetic-only dataset versus the model trained on a mix of synthetic and real-world datasets. The horizontal axis units are pixels squared. The model trained on synthetic-only data has better overall performance and is the one used for commissioning in this paper.
Figure 12: (Left) Track quality metrics MT, PT, and ML as a function of IoU threshold; values are normalized by the total number of ground truth trajectories (209) and together add up to 1. Higher values of MT (mostly tracked) are better. (Right) IDSs normalized by the total number of ground truth trajectories (209) as a function of IoU threshold. A lower number of IDSs (track identity switches) is better.
Figure 13: MOTA, IDF1, and MOTP tracking performance metrics for our SORT implementation on synthetic trajectories, as a function of the BBox scale factor.
Figure 14: (Left) YOLO detection count per camera, first divided by the camera's visible sky area and total recordings count, and then normalized across cameras so all values sum to 1. This histogram includes all detections of any object from a subset of the five months of commissioning data; camera 8 was offline during this interval. (Middle and Right) 2D histograms showing examples of the spatial distribution of YOLO detections with confidence score > 0.5, for the month of May 2024, for cameras 5 (SE) and 7 (SW), respectively. A regular commercial aircraft route is clearly visible for camera 5 (center).
Figure 15: (Left) Average hourly detection count for different cameras throughout the day. (Middle) A 2D histogram of the detections' bounding box width and height, showing pronounced clustering that may be useful for object classification. (Right) A 2D histogram of the detections' bounding box area and aspect ratio, again with strong clustering. The histograms include all detections of any object from a subset of the five months of commissioning data.
Figure 16: (Upper left) For the commissioning period, average daily count vs. distance (km) from the observatory of ADS-B-equipped airplanes in range of the site (criterion 1: in range); of those, also within the effective field of view of at least one camera (criteria 1-3: viewable); of those, also within that camera's recording time window (criteria 1-4: recorded); and, of those, also detected by YOLOv5 (criteria 1-5: detected). (Middle left) The number detected divided by the number recorded (efficiency) vs. distance (km). (Lower left) The number recorded divided by the number viewable (acceptance) vs. distance (km). (Right) Acceptance and efficiency as functions of apparent airplane size. The histogram shows the number of ADS-B records that contributed to each point on the graph. Error bars are computed by propagating statistical errors from all ADS-B counts, assuming Poisson distributions.
Figure 17: (Left) Efficiency as a function of actual airplane size. Error bars are computed by propagating statistical errors from all ADS-B counts, assuming Poisson distributions. (Middle) Actual airplane size distribution throughout the day. (Right) Actual airplane size distribution compared to projected size in the image.
Figure 18: (Left) Efficiency as a function of precipitation. (Middle) Efficiency as a function of visibility. (Right) Efficiency as a function of relative humidity. Error bars are computed by propagating statistical errors from all ADS-B counts, assuming Poisson distributions.
Figure 19: (Left) Efficiency as a function of temperature. (Right) Uneven temperature distribution throughout the day in our dataset. Error bars are computed by propagating statistical errors from all ADS-B counts, assuming Poisson distributions.
Figure 20: (Left) Efficiency as a function of camera. (Middle) Efficiency as a function of horizontal position; zero corresponds to the left of the image. (Right) Efficiency as a function of vertical position; zero corresponds to the top of the image. Error bars are computed by propagating statistical errors from all ADS-B counts, assuming Poisson distributions.
Figure 21: Fraction of aircraft in range which are detected for each camera and spatial location in the camera image frame. Each bin represents an area of 80 × 64 px in the original camera image. The origin of both axes is set to the upper left corner, following the computer vision convention. White bins are either entirely included in the treeline mask, where we do not look for detections, or happen to lack any aircraft in range during the commissioning period.
Figure 22: (Left) A 2D histogram of reconstructed and true apparent speed, computed from the apparent 2D distance between two consecutive bounding box centers, for matched true and reconstructed trajectories. (Middle) Purity as a function of reconstructed apparent speed. (Right) Efficiency as a function of true apparent speed. Error bars are statistical uncertainties reflecting the bin population count.
Figure 23: (Left) A 2D histogram of reconstructed and true apparent area. (Middle) Purity as a function of reconstructed apparent area. (Right) Efficiency as a function of true apparent area. Error bars are statistical uncertainties reflecting the bin population count.
Figure 24: (Left) Purity as a function of reconstructed curvature for curved trajectories. (Right) Efficiency as a function of inflection point counts for piecewise trajectories. Error bars are statistical uncertainties reflecting the bin population count.
Figure 25: (Left) Distribution of individual trajectory counts per video. The reconstructed trajectory count is higher on average due to trajectory fragmentation. (Right) Distribution of trajectory point counts. True trajectories can have at most 100 points due to the dataset generation parameters, and reconstructed trajectories have fewer points on average along the trajectory compared to true trajectories due to missed or dropped detections.
Figure 26: Overlay of multiple reconstructed trajectories randomly sampled from different videos in the commissioning dataset. Each trajectory is plotted as a set of consecutive 2D points, representing the unique detections underlying the trajectory, and assigned a unique color. The numbers next to each trajectory are unique identifiers assigned by the SORT algorithm.
Figure 27: Normalized histogram of the log-transformed distribution of sinuosity values computed for all reconstructed trajectories from the five months of commissioning. The lines show the fit with four models for probability density estimation: Kernel Density Estimation (KDE) with different bandwidths (colored solid lines) and a Gaussian fit (dashed line).
Figure 28: (Left) Examples of reconstructed trajectory data points. (Right) For each corresponding reconstructed object, we overlay three snapshots of the frame-by-frame YOLO object detections inside green outlines, annotated with red text (unique identifiers). The snapshots are taken and located at the start, middle, and end of each trajectory. The background is a frame of the corresponding camera which does not contain any detected objects, overlaid for reference. (Top row) Objects with low reconstructed trajectory sinuosity, between 1.0 and 1.2. (Middle row) Objects with reconstructed trajectory sinuosity ranging from 1.2 to 3.0. (Bottom row) Objects with reconstructed trajectory sinuosity > 3.0.
Figure 29: (Left) Airplanes on camera 7. (Middle) Birds and clouds on camera 4. (Right) Helicopters and the Moon on camera 7. All are examples of detected objects that can be visually identified by inspecting the detection bounding box and the corresponding video recording. We show here examples of airplane, bird, cloud, helicopter, and Moon detections. For each reconstructed trajectory, we overlay three snapshots of the frame-by-frame YOLO object detections inside green outlines, annotated with red text (unique identifiers). The snapshots are taken and located at the start, middle, and end of each trajectory. The background is a frame of the corresponding camera which does not contain any detected objects.
Figure 30: Examples of trajectories created by objects with small apparent size. (Left) For each corresponding reconstructed trajectory, we overlay three snapshots of the frame-by-frame YOLO object detections inside green outlines, annotated with red text (unique identifiers). The snapshots are taken at different points along the trajectories. (Right) Corresponding examples of reconstructed trajectory data points.
Figure 31: Manual classification of reconstructed trajectories from Dalek recordings, a small subset of the full commissioning dataset.
Figure 32: After manual classification of reconstructed trajectories, we sample typical objects (in pairs) from each category. These images are crops of the objects for illustration purposes. First row: flocks of birds and the Moon; second row: planes and single birds; third row: clouds.
Figure 33: Graphical representation of the observed and expected upper limit calculation. This shows the upper limit on the ambiguous outlier count at a 0.05 significance level (red line) for a counting experiment where N_b = 627,339 ± 324,668 and N_s = 180 ± 10,999, assuming Gaussian uncertainties. The observed upper limit at a 95% confidence level is 18,271.
71 pages, 7585 KiB  
Systematic Review
Unmanned Aerial Geophysical Remote Sensing: A Systematic Review
by Farzaneh Dadrass Javan, Farhad Samadzadegan, Ahmad Toosi and Mark van der Meijde
Remote Sens. 2025, 17(1), 110; https://doi.org/10.3390/rs17010110 - 31 Dec 2024
Viewed by 5829
Abstract
Geophysical surveys, a means of analyzing the Earth and its environments, have traditionally relied on ground-based methodologies. However, up-to-date approaches encompass remote sensing (RS) techniques, employing both spaceborne and airborne platforms. The emergence of Unmanned Aerial Vehicles (UAVs) has notably catalyzed interest in UAV-borne geophysical RS. The objective of this study is to comprehensively review the state-of-the-art UAV-based geophysical methods, encompassing magnetometry, gravimetry, gamma-ray spectrometry/radiometry, electromagnetic (EM) surveys, ground penetrating radar (GPR), traditional UAV RS methods (i.e., photogrammetry and LiDARgrammetry), and integrated approaches. Each method is scrutinized concerning essential aspects such as sensors, platforms, challenges, applications, etc. Drawing upon an extensive systematic review of over 435 scholarly works, our analysis reveals the versatility of these systems, which ranges from geophysical development to applications over various geoscientific domains. Among the UAV platforms, rotary-wing multirotors were the most used (64%), followed by fixed-wing UAVs (27%). Unmanned helicopters and airships comprise the remaining 9%. In terms of sensors and methods, imaging-based methods and magnetometry were the most prevalent, which accounted for 35% and 27% of the research, respectively. Other methods had a more balanced representation (6–11%). From an application perspective, the primary use of UAVs in geoscience included soil mapping (19.6%), landslide/subsidence mapping (17.2%), and near-surface object detection (13.5%). The reviewed studies consistently highlight the advantages of UAV RS in geophysical surveys. UAV geophysical RS effectively balances the benefits of ground-based and traditional RS methods regarding cost, resolution, accuracy, and other factors. Integrating multiple sensors on a single platform and fusion of multi-source data enhance efficiency in geoscientific analysis. 
However, implementing geophysical methods on UAVs poses challenges, prompting ongoing research and development efforts worldwide to find optimal solutions from both hardware and software perspectives. Full article
(This article belongs to the Special Issue Advances in Remote Sensing of Geophysical Surveys Based on UAV)
Figure 1
<p>A review of the collected publications by the following categories: (<b>a</b>) Year; (<b>b</b>) Type; (<b>c</b>) Subject area; (<b>d</b>) Country (first author); (<b>e</b>) Affiliation (first author); (<b>f</b>) Source (Source: Scopus).</p>
Figure 2
<p>Refinement of studies based on the PRISMA workflow.</p>
Figure 3
<p>Categorization of geophysical methods applicable to UAVs.</p>
Figure 4
<p>UAV-borne magnetometry system (A ground-based magnetometer is typically used in extended aerial operations where the diurnal variations of the Earth’s magnetic field are significant. The data from this base station is essential for modeling these variations and correcting the data captured by the aerial method).</p>
Figure 5
<p>Different arrangements for mounting magnetometers on UAVs: (<b>a-i</b>–<b>a-iv</b>) fixed-boom design for rotary-wings, fixed-wings, helicopters, and airships; (<b>b-i</b>–<b>b-iv</b>) towed sensor design for the mentioned UAV types; (<b>c-i</b>–<b>c-iv</b>) towed bird design for the mentioned UAV types; (<b>d</b>) fixed wing-tip design for fixed-wing UAV.</p>
Figure 6
<p>UAV-based gravimetry system (the scheme was depicted based on [<a href="#B24-remotesensing-17-00110" class="html-bibr">24</a>,<a href="#B126-remotesensing-17-00110" class="html-bibr">126</a>,<a href="#B127-remotesensing-17-00110" class="html-bibr">127</a>]).</p>
Figure 7
<p>Operational modes in UAV-borne gravimetry (R: flight runs, P: sampling points).</p>
Figure 8
<p>The manner in which a UAV GRS system operates (depicted based on the outputs of [<a href="#B162-remotesensing-17-00110" class="html-bibr">162</a>]).</p>
Figure 9
<p>AEM survey—induced vs. measured magnetic fields (depicted based on [<a href="#B310-remotesensing-17-00110" class="html-bibr">310</a>,<a href="#B312-remotesensing-17-00110" class="html-bibr">312</a>]).</p>
Figure 10
<p>Single-drone and dual-drone configurations in UAV-borne EM: The EM primary field is produced using one of the following: (<b>a</b>) A compact mobile current loop transported by a UAV; (<b>b</b>) A large loop placed on the ground. In both scenarios, the EM response is detected with a receiver transported by another UAV (depicted based on [<a href="#B308-remotesensing-17-00110" class="html-bibr">308</a>]).</p>
Figure 11
<p>The processing chain of UAV-borne EM data (depicted based on [<a href="#B365-remotesensing-17-00110" class="html-bibr">365</a>]).</p>
Figure 12
<p>Payload assembly architectures in UAV-GPR systems: (<b>a</b>) Independent payload; (<b>b</b>) Integrated payload architectures. The principles were borrowed from [<a href="#B25-remotesensing-17-00110" class="html-bibr">25</a>,<a href="#B413-remotesensing-17-00110" class="html-bibr">413</a>], with the flowcharts being reconfigured.</p>
Figure 13
<p>Observation modes in UAV-borne GPR survey (reconfigured based on [<a href="#B25-remotesensing-17-00110" class="html-bibr">25</a>]).</p>
Figure 14
<p>Fully/semi-airborne UAV-GPR: (<b>a</b>,<b>b</b>) Fully airborne GPR using Tx and Rx onboard a single UAV operating in DLGPR and/or FLGPR modes; (<b>c</b>) Fully airborne GPR using double UAVs; (<b>d</b>) Semi-airborne scheme combining ground-based FLGPR and UAV-borne DLGPR (subfigure ‘d’ conceptualized from [<a href="#B415-remotesensing-17-00110" class="html-bibr">415</a>,<a href="#B424-remotesensing-17-00110" class="html-bibr">424</a>]).</p>
Figure 15
<p>UAV-GPR data processing workflow (reconfigured based on [<a href="#B25-remotesensing-17-00110" class="html-bibr">25</a>,<a href="#B425-remotesensing-17-00110" class="html-bibr">425</a>,<a href="#B426-remotesensing-17-00110" class="html-bibr">426</a>]).</p>
Figure 16
<p>UAV-borne GPR imaging problem (reconfigured based on [<a href="#B417-remotesensing-17-00110" class="html-bibr">417</a>,<a href="#B434-remotesensing-17-00110" class="html-bibr">434</a>]).</p>
16 pages, 14248 KiB  
Article
Holocene Activity Characteristics and Seismic Risk of Major Earthquakes in the Middle Segment of the Jinshajiang Fault Zone, East of the Qinghai–Tibetan Plateau
by Mingjian Liang, Naifei Luo, Yunxi Dong, Ling Tan, Jinrong Su and Weiwei Wu
Appl. Sci. 2025, 15(1), 9; https://doi.org/10.3390/app15010009 - 24 Dec 2024
Viewed by 445
Abstract
The Jinshajiang fault zone is the western boundary fault of the Sichuan–Yunnan block, located east of the Qinghai–Tibetan Plateau. It is a complex tectonic suture belt with multi-phase activity and is characterized by multiple sets of parallel or intersecting faults. Using high-resolution image interpretation, seismic geological surveys, and trench studies, we examined the Holocene activity and obtained the paleoseismic sequences on the middle segment of the fault zone. Thus, we could analyze the kinematic characteristics of the fault and its potential risk of strong earthquakes. Our results indicated that the predominant movement of the fault zone was strike-slip motion. In the Jinshajiang fault zone, the Late Quaternary horizontal slip rates of the north-northeast-trending Yarigong fault and the northeast-trending Ciwu fault were 3.6 ± 0.6 mm/a and 2.5 ± 0.5 mm/a, respectively. Three paleoseismic events were identified on the Yarigong fault, dated 6745–3848, 3742–1899, and 1494–1112 cal BP, and on the Ciwu fault, constrained to 32,566–29,430, 24,056–22,990, and 2875–2723 cal BP. The last major earthquake on the Ciwu fault occurred approximately 2800 years ago; therefore, its future seismic hazard deserves attention. Full article
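Late Quaternary slip rates such as those reported above are obtained by dividing a measured lateral offset by the age of the displaced landform, propagating the uncertainty of both. A minimal sketch of that arithmetic (the offset and age below are illustrative values, not the paper's measurements):

```python
def slip_rate_mm_per_a(offset_m, offset_err_m, age_a, age_err_a):
    """Horizontal slip rate (mm/a) from an offset landform.

    Uncertainty is propagated assuming independent errors:
    (dr/r)^2 = (do/o)^2 + (da/a)^2.
    """
    rate = offset_m * 1000.0 / age_a  # convert m to mm
    rel_err = ((offset_err_m / offset_m) ** 2 + (age_err_a / age_a) ** 2) ** 0.5
    return rate, rate * rel_err

# Illustrative only: an 18 +/- 2 m offset accumulated over ~5000 +/- 500 years
rate, err = slip_rate_mm_per_a(18.0, 2.0, 5000.0, 500.0)
print(f"{rate:.1f} +/- {err:.1f} mm/a")  # 3.6 +/- 0.5 mm/a
```

With independent errors the relative uncertainties add in quadrature, which is why well-dated terraces matter as much as well-measured offsets.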
(This article belongs to the Special Issue Paleoseismology and Disaster Prevention)
Figure 1
<p>Tectonic framework and seismic activity of the Jinshajiang fault zone and adjacent regions. The black dashed rectangle represents the study area.</p>
Figure 2
<p>(<b>a</b>) Distribution map of major faults and field survey sites in the middle section of the Jinshajiang fault zone and its adjacent areas. (<b>b</b>) On the basis of the GF-7 satellite data, a hillshade map was generated to interpret the detailed fault tracks of the Yarigong and Ciwu faults.</p>
Figure 3
<p>(<b>a</b>,<b>b</b>) Geological profile of the Jinshajiang fault zone in Hongdong Village. (<b>c</b>) Photograph of the top of the geological section, showing the uppermost strata displaced by the fault.</p>
Figure 4
<p>Right-lateral offset of a gully and profiles of the dating sample collection south of Yarigong Town. (<b>a</b>,<b>b</b>) The right-lateral offset of the gully was obtained using unmanned aerial vehicle (UAV) photogrammetry. The white rectangle in panel (<b>b</b>) indicates the sampling location of the dating samples. (<b>c</b>,<b>d</b>) The dating samples were collected from the T3 and T4 terraces of the Muqu River, respectively.</p>
Figure 5
<p>Fault profiles exposed along the Yarigong fault. (<b>a</b>–<b>c</b>) The fault profiles are exposed north of the Ran, Dalong, and Lide villages, respectively, where the fault has displaced Late Quaternary strata: in (<b>a</b>), the Holocene alluvial layer; in (<b>b</b>), the late Pleistocene alluvial layer. The red arrows indicate the fault traces.</p>
Figure 6
<p>Photograph of the fault profile and its explanatory profile.</p>
Figure 7
<p>Structural landform of the Ciwu fault southwest of Nidou Village. (<b>a</b>) Fault track of the newest activity of the Ciwu fault on the alluvial fan. (<b>b</b>,<b>c</b>) Linear fault trough gullies and reverse fault scarps. The red arrows indicate the fault traces.</p>
Figure 8
<p>(<b>a</b>) Right-lateral offset of the T2 terrace of the Ciwu River and location of the dating sample. The image was obtained using UAV photogrammetry. (<b>b</b>) Photograph showing the dating sample collection section in the T2 terrace of the Ciwu River.</p>
Figure 9
<p>(<b>a</b>) Tectonic landform near trench TC1 according to UAV photogrammetry. (<b>b</b>) Photograph of the linear fault trough landform. The red arrows indicate the fault trace.</p>
Figure 10
<p>Photograph of the southern wall of the Bugge trench and explanatory profile. The radiocarbon dating sample ages of the TC1 trench are detailed in <a href="#applsci-15-00009-t002" class="html-table">Table 2</a>.</p>
Figure 11
<p>(<b>a</b>) Tectonic landform near trench TC2 shown via UAV photogrammetry. The gully to the south of the site is right-laterally offset by approximately 18 ± 2 m. (<b>b</b>) Photograph of the trench site.</p>
Figure 12
<p>Photograph of the southern wall of the Bugge trench and explanatory profile. The radiocarbon dating sample ages of the TC2 trench are detailed in <a href="#applsci-15-00009-t002" class="html-table">Table 2</a>.</p>
Figure 13
<p>Paleoseismic sequences of the Yarigong and Ciwu faults, the raw carbon ages of which were calibrated using OxCal 4.4.4 [<a href="#B42-applsci-15-00009" class="html-bibr">42</a>]. (<b>a</b>,<b>b</b>) are the paleoseismic sequences revealed in trench TC1 and TC2, respectively.</p>
17 pages, 15062 KiB  
Article
Dynamics of Irrigated Land Expansion in the Ouémé River Basin Using Field and Remote Sensing Data in the Google Earth Engine
by David Houéwanou Ahoton, Taofic Bacharou, Aymar Yaovi Bossa, Luc Ollivier Sintondji, Benjamin Bonkoungou and Voltaire Midakpo Alofa
Land 2024, 13(11), 1926; https://doi.org/10.3390/land13111926 - 15 Nov 2024
Viewed by 986
Abstract
The availability of reliable and quantified information on the spatiotemporal distribution of irrigated land at the river basin scale is an essential step towards sustainable management of water resources. This research aims to assess the spatiotemporal extent of irrigated land in the Ouémé River basin using multi-temporal Landsat images and ground truth data. The methodology was built around supervised classification and the application of an algorithm based on logical expressions and the thresholding of a combination of surface temperature (Ts) and the normalized difference vegetation index (NDVI). The supervised classification showed that agricultural areas covered 16,003 km2, 19,732 km2, and 22,850 km2 in the years 2014, 2018, and 2022, respectively. The irrigated land areas were 755 km2, 1143 km2, and 1883 km2 for the same years. A significant increase in irrigated areas was recorded throughout the study period. The overall accuracy values of 79%, 82%, and 83% obtained during validation of the irrigated land maps indicate good performance of the algorithm. The results suggest a promising application of the algorithm to obtain up-to-date information on the distribution of irrigated land in several regions of Africa. Full article
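As an illustration of the Ts–NDVI decision rule described in this abstract: irrigated pixels in the dry season typically combine vigorous vegetation (high NDVI) with a cool surface (low Ts). The sketch below assumes reflectance bands and a Ts raster are already available; the threshold values are placeholders, not the study's calibrated ones:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index, (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids divide-by-zero

def irrigated_mask(ts, ndvi_vals, ts_max=305.0, ndvi_min=0.4):
    """Flag pixels as irrigated where vegetation is vigorous (high NDVI)
    yet the surface stays cool (low Ts). Thresholds are illustrative."""
    return (np.asarray(ndvi_vals) >= ndvi_min) & (np.asarray(ts) <= ts_max)

nir = np.array([0.50, 0.45, 0.20])
red = np.array([0.10, 0.30, 0.18])
ts = np.array([300.0, 310.0, 302.0])  # land surface temperature, K
mask = irrigated_mask(ts, ndvi(nir, red))
print(mask)  # [ True False False]
```

Only the first pixel is both green enough (NDVI ≈ 0.67) and cool enough to pass the combined test.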
(This article belongs to the Special Issue Water Resources and Land Use Planning II)
Figure 1
<p>Geographical location and climatic conditions of the Ouémé River basin.</p>
Figure 2
<p>Flow chart of the methodological approach used in this study.</p>
Figure 3
<p>(<b>a</b>) Land use and land cover sampling points (red: built-up area and bare land; blue: water body; green: vegetation; purple: agricultural area) from GEE; (<b>b</b>) irrigated area sampling points (yellow) from Google Earth Pro.</p>
Figure 4
<p>Spatiotemporal distribution of Ts and NDVI across the study area.</p>
Figure 5
<p>Seasonal variation of NDVI and Ts in the irrigated and rainfed areas during (<b>a</b>) 2014, (<b>b</b>) 2018, and (<b>c</b>) 2022.</p>
Figure 6
<p>Distribution of agricultural cropland across the study area.</p>
Figure 7
<p>Distribution of irrigated croplands in the Ouémé River basin in 2014, 2018, and 2022.</p>
Figure 8
<p>Trends in the area of agricultural land with respect to general and irrigated cropland in particular.</p>
31 pages, 35674 KiB  
Article
Discussion Points of the Remote Sensing Study and Integrated Analysis of the Archaeological Landscape of Rujm el-Hiri
by Olga Khabarova, Michal Birkenfeld and Lev V. Eppelbaum
Remote Sens. 2024, 16(22), 4239; https://doi.org/10.3390/rs16224239 - 14 Nov 2024
Viewed by 5605
Abstract
Remote sensing techniques provide crucial insights into ancient settlement patterns in various regions by uncovering previously unknown archaeological sites and clarifying the topological features of known ones. Meanwhile, in the northern part of the Southern Levant, megalithic structures remain largely underexplored with these methods. This study addresses this gap by analyzing the landscape around Rujm el-Hiri, one of the most prominent Southern Levantine megaliths dated to the Chalcolithic/Early Bronze Age, for the first time. We discuss the type and extent of the archaeological remains identified in satellite images within a broader context, focusing on the relationships between landscapes and these objects and the implications of their possible function. Our analysis of multi-year satellite imagery covering the 30 km region surrounding the Sea of Galilee reveals several distinct patterns: 40–90-m-wide circles and thick walls primarily constructed along streams, possibly as old as Rujm el-Hiri itself; later-period linear thin walls forming vast rectangular fields and flower-like clusters of round fences ~20 m in diameter, found in wet areas; and tumuli, topologically linked to the linear walls and flower-like fences. Although the tumuli share similar forms and likely construction techniques, their spatial distribution, their connections to other archaeological features, and the statistical distribution of their sizes suggest that they may have served diverse functions. The objects and patterns identified may be used to further train neural networks to analyze their spatial properties and interrelationships. Most archaeological structures in the region were reused long after their original construction. This involved adding new features, building walls over older ones, and reshaping the landscape with new objects. Rujm el-Hiri is a prime example of such a complex sequence. 
Geomagnetic analysis shows that, because the entire region has rotated over time, Rujm el-Hiri's location has shifted by tens of meters from its original position over the thousands of years of the structure's existence, challenging theories that its walls were aligned with astronomical bodies and raising questions about its possible identification as an observatory. Full article
(This article belongs to the Section Remote Sensing for Geospatial Science)
Figure 1
<p>Rujm el-Hiri. (<b>a</b>) Geographic location, (32°54′30.87″N, 35°48′3.89″E); (<b>b</b>) Aerial view, adapted from [<a href="#B21-remotesensing-16-04239" class="html-bibr">21</a>]; (<b>c</b>) Distance-height profile of the surrounding area relative to the northernmost point of the Sea of Galilee (vertical axis—altitude below/above sea level, in m; horizontal axis—the distance in km). The vertical line indicates the location of Rujm el-Hiri.</p>
Figure 2
<p>Results of the combined geophysical analysis in the area under study. (<b>A</b>): combined paleomagnetic–magnetic–radiometric scheme of the Sea of Galilee (modified and supplemented after [<a href="#B72-remotesensing-16-04239" class="html-bibr">72</a>]). (1) outcropped Cenozoic basalts, (2) points with the radiometric age of basalts (in m.y.), (3) wells, (4) faults, (5) general direction of the discovered buried basaltic plate dipping in the southern part of the Sea of Galilee, (6) counter clockwise (a) and clockwise (b) rotation of faults and tectonic blocks, (7) pull-apart basin of the Sea of Galilee, (8) suggested boundaries of the paleomagnetic zones in the sea, data of land paleomagnetic measurements: (9 and 10) (9) reverse magnetization, (10) normal magnetization, (11 and 12) results of magnetic anomalies analysis: (11) normal magnetization, (12) reverse magnetization, (13) reversely magnetized basalts, (14) normal magnetized basalts, (15) Miocene basalts and sediments with the complex paleomagnetic characteristics, (16) Pliocene–Pleistocene basalts and sediments with complex paleomagnetic characteristics, (17) index of paleomagnetic zonation. (<b>B</b>): The generalized results of the paleomagnetic–geodynamic studies in northern Israel (after [<a href="#B71-remotesensing-16-04239" class="html-bibr">71</a>,<a href="#B72-remotesensing-16-04239" class="html-bibr">72</a>]) overlaid on the geological map of Israel (map after [<a href="#B97-remotesensing-16-04239" class="html-bibr">97</a>]; geological captions are omitted for simplicity).</p>
Figure 3
<p>Rujm el-Hiri site, as seen from space in different years and seasons. High-resolution images from Pleiades satellites processed by CNES/Airbus are provided by Google Earth Pro. Eye altitude is 460 m, tilt—zero.</p>
Figure 4
<p>Landscape around the Rujm el-Hiri site, large-scale view. <b>Upper</b> panel: general view of the Rujm el-Hiri area with distinct types of archaeological objects indicated by arrows. <b>Bottom</b> panels: examples of the key types of archaeological objects identified in satellite images. Here and below, the north direction is as shown in <a href="#remotesensing-16-04239-f003" class="html-fig">Figure 3</a>.</p>
Figure 5
<p>Linear-shaped walls and rectangular fields, and livestock enclosures beneath the Revaya reservoir. (<b>a</b>) General view of the reservoir during the full water period in 2018. (<b>b</b>) Bottom of the lake during the low water period in 2021. (<b>c</b>) Close-up of (<b>b</b>) indicated by a green rectangle. (<b>d</b>–<b>f</b>) Close-up of (<b>b</b>) indicated by the turquoise rectangle and two objects related to the human exploitation of the area surrounding the former small lake before the reservoirs’ dike was constructed. Here and below, the location is given with coordinates in white corresponding to the center of the site under study.</p>
Figure 6
<p>Walls, rectangular livestock enclosures, and old wide walls built along the former stream near Rujm el-Hiri.</p>
Figure 7
<p>Examples of round-shaped walls or fences forming flower-like clusters of ~100 m diameter. (<b>a</b>) Well-preserved site on the bottom of the Dvash reservoir; (<b>b</b>) Flower-like cluster of fences found along the Wadi Hafina stream; (<b>c</b>) Flower-like structures near the Revaya reservoir; (<b>d</b>) Analogous structures located 4 km to the south of Rujm el-Hiri; (<b>e</b>) Flower-like structures on the hill by the Nachal Akbara stream 28 km to the north-west of Rujm el-Hiri; (<b>f</b>) Merging clusters connected by walls 12 km to the north of Rujm el-Hiri. Archaeological objects of this type are found in the nearest vicinity of water sources.</p>
Figure 8
<p>Examples of more complex round-shaped fences forming flower-like clusters. (<b>a</b>) flower-like conglomerate of fences located 6.5 km southwest of Rujm el-Hiri; (<b>b</b>) analogous cluster located 14.3 km north of Rujm el-Hiri featuring rectangular structures around the center; (<b>c</b>) two clusters with tumuli in the center linked by the wall, located one kilometer north of Rujm el-Hiri.</p>
Figure 9
<p>Examples of round-shaped large structures of different types. (<b>a</b>,<b>b</b>)—objects with double walls, probably built in the same period as Rujm el-Hiri. (<b>c</b>,<b>d</b>)—singular-wall objects of the later period filled with linear structures. There are remains of the buildings or tumuli in the circular structure shown in (<b>d</b>).</p>
Figure 10
<p>Examples of round-shaped ~60–90 m-wide structures, with the entrance facing southeast and signatures of active secondary use. (<b>a</b>) round-shaped structure situated 3 km northeast of Rujm el-Hiri; (<b>b</b>) round-shaped structure located 13.5 km north of Rujm el-Hiri; (<b>c</b>) analogous object located 13.5 km northwest of Rujm el-Hiri.</p>
Figure 11
<p>Tumuli observed in different landscapes. (<b>a</b>) Agglomerate of tumuli along the Dalyiot stream 500 m north of Rujm el-Hiri. The distance between the tumuli is small, ~3–10 m. Most tumuli are linked by walls, and some of them are surrounded by fences; (<b>b</b>) Several tumuli among rectangular walls located 0.7 km southwest of Rujm el-Hiri. The distance between the tumuli is tens of meters; (<b>c</b>) Agglomerate of poorly-preserved tumuli on the hill 28 km east of Rujm el-Hiri. The tumuli are located close to each other, similar to (<b>a</b>), inside rectangular walls.</p>
Figure 12
<p>Distribution of tumuli sizes observed in different landscapes. The black color shows all tumuli in three selected areas (the tumuli field shown in <a href="#remotesensing-16-04239-f011" class="html-fig">Figure 11</a>a, the tumuli field located to the northwest from the Revaya reservoir, the Revaya reservoir tumuli, and the tumuli field to the southwest from Rujm el-Hiri). A total of 304 tumuli. The white color indicates tumuli on the bottom of the Revaya reservoir, shown in <a href="#remotesensing-16-04239-f005" class="html-fig">Figure 5</a>.</p>
Figure 13
<p>Combined types of archaeological objects belonging to different epochs. (<b>a</b>) The site, located 3 km northwest of Rujm el-Hiri, features interlinked objects such as tumuli, round-shaped structures, and walls. Modern activities damage the site. (<b>b</b>) Unfinished or damaged Rujm el-Hiri-type object with thick walls, located 1.7 km south of Rujm el-Hiri. The internal space is filled with flower-like circular walls of the later period. (<b>c</b>) Rujm el-Hiri-size circular object situated 13 km north of Rujm el-Hiri. The site was intensively reused.</p>
Figure 14
<p>Walls of different periods in the archaeological landscape. (<b>a</b>) An example of the later period walls built upon older-period walls; (<b>b</b>) Walls of different periods as seen in Rujm el-Hiri, aerial view.</p>
23 pages, 16528 KiB  
Article
Mortars in the Archaeological Site of Hierapolis of Phrygia (Denizli, Turkey) from Imperial to Byzantine Age
by Matteo Maria Niccolò Franceschini, Sara Calandra, Silvia Vettori, Tommaso Ismaelli, Giuseppe Scardozzi, Maria Piera Caggia and Emma Cantisani
Minerals 2024, 14(11), 1143; https://doi.org/10.3390/min14111143 - 11 Nov 2024
Viewed by 1010
Abstract
Hierapolis of Phrygia, an archaeological site in southwestern Turkey, has been a UNESCO World Heritage Site since 1988. During archaeological campaigns, 71 mortar samples from public buildings were collected, dating from the Julio-Claudian to the Middle Byzantine period. The samples were analyzed using a multi-analytical approach including polarized optical microscopy (POM), digital image analysis (DIA), X-ray powder diffraction (XRPD) and SEM–EDS to trace the raw materials and understand the evolution of mortar composition and technology over time. During the Roman period, travertine and marble were commonly used in binder production, while marble dominated in the Byzantine period. The aggregates come mainly from sands of the Lycian Nappe and Menderes Massif, with carbonate and silicate rock fragments. Variations in composition, average size and circularity suggest changes in raw material sources in both Roman and Byzantine periods. Cocciopesto mortar was used in water-related structures from the Flavian to the Severan period, but, in the Byzantine period, it also appeared in non-hydraulic contexts. Straw became a common organic additive in Byzantine renders, marking a shift from the exclusively inorganic aggregates of Roman renders. This study illustrates the evolving construction technologies and material sources used throughout the city’s history. Full article
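The circularity reported by digital image analysis (DIA) of aggregate grains is conventionally 4πA/P², equal to 1.0 for a perfect circle and approaching 0 for elongated grains. A minimal sketch, assuming grain area and perimeter have already been measured:

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P^2: 1.0 for a circle, lower for less compact grains."""
    return 4.0 * math.pi * area / perimeter ** 2

# A circle of radius r has circularity exactly 1.0
r = 3.0
print(round(circularity(math.pi * r**2, 2 * math.pi * r), 3))  # 1.0

# A square of side s is less circular: 4*pi*s^2 / (4s)^2 = pi/4 ~ 0.785
s = 2.0
print(round(circularity(s * s, 4 * s), 3))  # 0.785
```

In practice, area and perimeter per grain would come from a segmentation step (e.g., thresholded thin-section images), after which changes in the circularity distribution can signal a change of sand source.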
(This article belongs to the Special Issue The Significance of Applied Mineralogy in Archaeometry)
Figure 1
<p>Plan of the ancient city of Hierapolis.</p>
Figure 2
<p>(<b>a</b>) Tectonic map of western Anatolia (A: Gediz or Alaşehir Graben, B: Küçük Menderes Graben and C: Büyük Menderes Graben), modified from [<a href="#B44-minerals-14-01143" class="html-bibr">44</a>]; (<b>b</b>) geological map of the Denizli Basin (modified from [<a href="#B45-minerals-14-01143" class="html-bibr">45</a>]).</p>
Figure 3
<p>Some of the main monuments of Hierapolis: (<b>a</b>) The Apollo Sanctuary, Building A; (<b>b</b>) the Stoa of the Springs; (<b>c</b>) Nymphaeum of the Tritons; (<b>d</b>) the Theatre; (<b>e</b>) the Ploutonion; and (<b>f</b>) Church of St. Philip.</p>
Figure 4
<p>Some examples of (<b>a</b>) bedding mortar (SP5 sample—Church of St. Philip); (<b>b</b>) grouting mortar (NT4 sample—Nymphaeum of the Tritons); (<b>c</b>) coating mortar (SS5—Stoa of the Springs); and (<b>d</b>) concrete fill (P7 sample—Ploutonion).</p>
Figure 5
<p>Microphotographs of main aspects of binder and lumps in analyzed mortars (cross polarized nicols): (<b>a</b>) SA-C1 homogeneous micritic binder; (<b>b</b>) SS7 non-homogeneous binder; (<b>c</b>) NA3 binder with recrystallization; (<b>d</b>) P2 binder heterogeneous from micritic to sparitic; (<b>e</b>) P2 travertine underburned fragment; (<b>f</b>) GB2 travertine underburned fragment; (<b>g</b>) P1 marble underburned fragment; (<b>h</b>) SP29 marble underburned fragment.</p>
Figure 6
<p>Microphotographs of most common aggregate in analyzed mortars (crossed polarized nicols): (<b>a</b>) TH2 gneiss; (<b>b</b>) SS7 meta-sandstone and schist; (<b>c</b>) SS6 calc-schist and quartzite; (<b>d</b>) SA-C1 amphibolite; (<b>e</b>) TH2 breccias; (<b>f</b>) P6 fossiliferous limestone; (<b>g</b>) P1 on the left micritic limestone and on the right travertine; (<b>h</b>) SS5 crushed ceramic fragment; (<b>i</b>) SA-A2 gabbro-like igneous rock; (<b>j</b>) SA-A1 serpentine fragment; (<b>k</b>) SS6 marble on the left and schist on the right; (<b>l</b>) P6 phyllite with schists.</p>
Figure 7
<p>Microphotographs of sample; (<b>a</b>) SA-C1 bedding mortar; (<b>b</b>) NT1 bedding mortar; (<b>c</b>) SP11, render mortar; (<b>d</b>) SS7 render mortar; (<b>e</b>) SP30 render mortar; (<b>f</b>) SP29 render mortar; (<b>g</b>) SP36 opus sectile bedding mortar; (<b>h</b>) NT4 grouting mortar.</p>
Figure 8
<p>Backscattered SEM image of binder: (<b>a</b>,<b>c</b>) reaction rim between binder and ceramic fragment (NT2 and SS5, respectively); (<b>b</b>,<b>d</b>) air lime binder (P7 and SA-A1, respectively); (<b>e</b>) HI calculated on selected samples using microchemical SEM–EDS data of binders and lumps. All the acquired data are presented in <a href="#app1-minerals-14-01143" class="html-app">Table S4</a>.</p>
14 pages, 936 KiB  
Review
Application of Artificial Intelligence Models to Predict the Onset or Recurrence of Neovascular Age-Related Macular Degeneration
by Francesco Saverio Sorrentino, Marco Zeppieri, Carola Culiersi, Antonio Florido, Katia De Nadai, Ginevra Giovanna Adamo, Marco Pellegrini, Francesco Nasini, Chiara Vivarelli, Marco Mura and Francesco Parmeggiani
Pharmaceuticals 2024, 17(11), 1440; https://doi.org/10.3390/ph17111440 - 28 Oct 2024
Viewed by 1133
Abstract
Neovascular age-related macular degeneration (nAMD) is one of the major causes of vision impairment, affecting millions of people worldwide. Early detection of nAMD is crucial because, if untreated, it can lead to blindness. Software and algorithms that utilize artificial intelligence (AI) have become valuable tools for early detection, assisting doctors in diagnosing and facilitating differential diagnosis. AI is particularly important for remote or isolated communities, as it allows patients to undergo tests and receive rapid initial diagnoses without the necessity of extensive travel and long wait times for medical consultations. AI is similarly notable in large hubs, where cutting-edge technologies and networking accelerate processes such as detection, diagnosis, and follow-up. The automatic detection of retinal changes might be optimized by AI, allowing one to choose the most effective treatment for nAMD. The complex retinal tissue is well-suited for scanning and easily accessible by modern AI-assisted multi-imaging techniques. AI enables us to enhance patient management by effectively evaluating extensive data, facilitating timely diagnosis and long-term prognosis. Novel applications of AI to nAMD have focused on image analysis, specifically the automated segmentation, extraction, and quantification of imaging-based features within optical coherence tomography (OCT) images. To date, we cannot state that AI can accurately forecast the therapy a given patient would need to achieve the best visual outcome. The small number of large datasets with high-quality OCT, the lack of data on alternative treatment strategies, and the absence of OCT standards are the main challenges for the development of AI models for nAMD. Full article
Figure 1
<p>Methods to treat neovascular age-related macular degeneration.</p>
Figure 2
<p>Artificial intelligence models for therapy prediction, assessing the anatomical response to anti-VEGF.</p>
22 pages, 6160 KiB  
Article
WaterGPT: Training a Large Language Model to Become a Hydrology Expert
by Yi Ren, Tianyi Zhang, Xurong Dong, Weibin Li, Zhiyang Wang, Jie He, Hanzhi Zhang and Licheng Jiao
Water 2024, 16(21), 3075; https://doi.org/10.3390/w16213075 - 27 Oct 2024
Cited by 3 | Viewed by 2330
Abstract
This paper introduces WaterGPT, a language model designed for complex multimodal tasks in hydrology. WaterGPT is applied in three main areas: (1) processing and analyzing data such as images and text in water resources, (2) supporting intelligent decision-making for hydrological tasks, and (3) enabling interdisciplinary information integration and knowledge-based Q&A. The model has achieved promising results. One core aspect of WaterGPT involves the meticulous segmentation of training data for the supervised fine-tuning phase, sourced from real-world data and annotated with high quality using both manual methods and GPT-series model annotations. These data are carefully categorized into four types: knowledge-based, task-oriented, negative samples, and multi-turn dialogues. Additionally, another key component is the development of a multi-agent framework called Water_Agent, which enables WaterGPT to intelligently invoke various tools to solve complex tasks in the field of water resources. This framework handles multimodal data, including text and images, allowing for deep understanding and analysis of complex hydrological environments. Based on this framework, WaterGPT has achieved over a 90% success rate in tasks such as object detection and waterbody extraction. For the waterbody extraction task, using Dice and mIoU metrics, WaterGPT’s performance on high-resolution images from 2013 to 2022 has remained stable, with accuracy exceeding 90%. Moreover, we have constructed a high-quality water resources evaluation dataset, EvalWater, which covers 21 categories and approximately 10,000 questions. Using this dataset, WaterGPT achieved the highest accuracy to date in the field of water resources, reaching 83.09%, which is about 17.83 points higher than GPT-4. Full article
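The Dice and mIoU scores quoted above for waterbody extraction are standard overlap metrics between predicted and reference segmentation masks; a minimal sketch for binary masks represented as sets of pixel coordinates (an illustrative representation, not the paper's implementation; mIoU is the IoU averaged over classes):

```python
def dice_and_iou(pred, truth):
    """Overlap metrics for binary masks given as sets of pixel coordinates.

    Dice = 2|P ∩ T| / (|P| + |T|);  IoU = |P ∩ T| / |P ∪ T|.
    Empty-vs-empty masks are scored as a perfect match.
    """
    inter = len(pred & truth)
    dice = 2 * inter / (len(pred) + len(truth)) if (pred or truth) else 1.0
    union = len(pred | truth)
    iou = inter / union if union else 1.0
    return dice, iou
```

mIoU over several classes is then the mean of the per-class IoU values.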
Figure 1: SFT training data production process.
Figure 2: EvalWater dataset.
Figure 3: Water_Agent framework diagram.
Figure 4: Subtasks supported by Water_Agent.
Figure 5: Calculation flowchart.
Figure 6: Training process change curves.
Figure 7: Evaluation results of each model classified by EvalWater.
Figure 8: Comparison chart of evaluation results for various models.
Figure 9: Water_Agent operation diagram.
Figure 10: Accuracy of water body extraction in different years.
Figure 11: Completion rates of different models on simple and complex hydrology tasks.
Figure 12: Average completion rate of different models on overall hydrology tasks.
Figure 13: GPT-4 evaluation results of model answer quality from different dimensions.
Figure 14: GPT-4's overall evaluation results of model answer quality.
Figure 15: The outcome of GPT-4 and WaterGPT.
17 pages, 3569 KiB  
Article
A Cippus from Turris Libisonis: Evidence for the Use of Local Materials in Roman Painting on Stone in Northern Sardinia
by Roberta Iannaccone, Stefano Giuliani, Sara Lenzi, Matteo M. N. Franceschini, Silvia Vettori and Barbara Salvadori
Minerals 2024, 14(10), 1040; https://doi.org/10.3390/min14101040 - 17 Oct 2024
Viewed by 1085
Abstract
The ancient Roman town of Turris Libisonis was located on the northern coast of Sardinia and was known in the past as an important naval port. Located in the Gulf of Asinara, it became a Roman colony in the 1st century BCE and grew into one of the richest towns on the island. Among the archaeological finds in the area, the cippus exhibited in the Antiquarium Turritano is of great interest for its well-preserved traces of polychromy. The artefact dates back to the early Imperial Age and could have had a funerary or votive function. It was first examined using a portable, non-invasive protocol involving multi-band imaging (MBI), portable X-ray fluorescence (p-XRF), portable FT-IR in external reflectance mode (ER FT-IR), and Raman spectroscopy. After this initial examination, a few microfragments were collected and investigated by optical microscopy (OM), X-ray powder diffraction (XRPD), Fourier-transform infrared spectroscopy in ATR mode (ATR FT-IR) and micro-ATR mode (μATR FT-IR), and scanning electron microscopy with energy-dispersive spectroscopy (SEM-EDS) to characterize the materials and determine their provenance. The results shed light on pigments on stone and their use outside the Italian peninsula, in particular in Roman Sardinia. Full article
(This article belongs to the Special Issue Geomaterials and Cultural Heritage)
Graphical abstract
Figure 1: (a) The position of the cippus in the Antiquarium of Porto Torres (SS) and the two sides analyzed: (b) side A; (c) side B (courtesy of Ministero della Cultura–Direzione Regionale Musei Sardegna).
Figure 2: (a) Raking-light detail of flaking areas; (b) high-magnification image (60×) of the area in the red square; (c) SEM image in BSE mode from the sample (red point); (d) EDS analysis results of a point on recrystallized salt.
Figure 3: Raman spectra of points corresponding to (a) a black area and (b) a yellow area, with optical microscopic details of the points (60×) on the right. In (b), the Raman spectrum of the yellow area (black) is shown with a goethite reference spectrum (yellow; RRUFF mineral database).
Figure 4: (a) p-XRF spectra of point 8 (black), point 10 (dotted black), and the reference background (red); (b) locations of measured points 8 and 10; (c) Raman spectra of green earth pigment at point 8 (green) and point 10 (gray).
Figure 5: Microphotographs of thin sections of the carbonate rock of the cippus (a,b) and of the mortar covering the cippus (c,d), under a polarized light microscope: (a,c) parallel nicols; (b,d) crossed nicols.
Figure 6: Geological map of Porto Torres and its surrounding area, modified from [46].
Figure 7: Macro-photos of the carbonate rock of the cippus, in which various fossils were observed: (a) gastropods; (b) ammonites; (c) algae thallus.
Figure 8: Fragment analyzed by SEM-EDS: (a) optical microscope image at 50×; (b) backscattered-electron detail at 127×; (c) EDS analysis results.
Figure 9: (a) FT-IR micro-ATR spectrum obtained from the green sample using point analysis with a TE-MCT detector and (b) FT-IR FPA-ATR spectrum extracted from the chemical map shown above. The arrow indicates the position of the spectrum.
15 pages, 3035 KiB  
Article
Application of Kirchhoff Migration from Two-Dimensional Fresnel Dataset by Converting Unavailable Data into a Constant
by Won-Kwang Park
Mathematics 2024, 12(20), 3253; https://doi.org/10.3390/math12203253 - 17 Oct 2024
Viewed by 645
Abstract
In this contribution, we consider an application of the Kirchhoff migration (KM) technique for fast and accurate identification of small dielectric objects from the two-dimensional Fresnel experimental dataset. Generally, successful application of the KM requires a complete set of elements of the so-called multi-static response (MSR) matrix; however, in the Fresnel experimental dataset, many of the elements of the MSR matrix are not measurable. Nevertheless, the existence, location, and outline shape of small objects can be retrieved using the KM by converting the unavailable data into the zero constant. However, the theoretical reason behind such a conversion has not been confirmed to date. To explain it, we convert the unavailable measurement data into a constant and demonstrate that the imaging function of the KM can be expressed as an infinite series of Bessel functions of integer order of the first kind, the object's material properties, and the converted constant. Following the theoretical result, we confirm that converting the unknown data into the zero constant guarantees good results and unique determination of the objects. Finally, various numerical simulation results from the Fresnel experimental dataset are presented and discussed to validate the theoretical result. Full article
(This article belongs to the Special Issue Inverse Problems and Numerical Computation in Mathematical Physics)
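The zero-constant conversion discussed in the abstract can be made concrete with a small sketch: back-propagate the MSR matrix over transmitter–receiver pairs, substituting a constant `fill` for entries the measurement setup cannot provide. This is an illustrative far-field form of a KM-type imaging function, not the paper's exact formulation:

```python
import cmath
import math

def km_image(msr, mask, transmitters, receivers, wavenumber, point, fill=0.0):
    """Kirchhoff-migration-style imaging value at `point`.

    Each measured MSR entry is back-propagated with the conjugate of the
    transmit + receive travel phase; unmeasured entries (mask False) are
    replaced by the constant `fill` (zero by default).
    """
    total = 0.0 + 0.0j
    for m, rx in enumerate(receivers):
        for n, tx in enumerate(transmitters):
            entry = msr[m][n] if mask[m][n] else fill
            travel = math.dist(point, tx) + math.dist(point, rx)
            total += entry * cmath.exp(-1j * wavenumber * travel)
    return abs(total)
```

For a point scatterer, the phases cancel exactly at the scatterer's location, so the image peaks there; filling unmeasured entries with zero simply drops their contribution from the sum.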
Figure 1: Receiver arrangements corresponding to the transmitter location.
Figure 2: Visualization of the modulus of M(C): (a) C = 0; (b) C = 0.05; (c) C = 0.1.
Figure 3: 1D plots of Φ(x, 0) at f = 2, 5, 10 GHz.
Figure 4: 1D plots of Ψ(x) at f = 2, 5, 10 GHz.
Figure 5: (a) Illustration of a single object; (b) map of F(x, 0) at 1 GHz; (c) map of F(x, 0) at 4 GHz; (d) illustration of multiple objects; (e) map of F(x, 0) at 2 GHz; (f) map of F(x, 0) at 5 GHz. The white dashed lines describe the boundaries of the objects.
Figure 6: Maps of F(x, C) at 1 GHz: (a) C = 0.05; (b) C = 0.05i; (c) C = 0.1; (d) C = 0.1i; (e) C = 0.3; (f) C = 0.3i. The white dashed lines describe the boundaries of the objects.
Figure 7: Maps of F(x, C) at 4 GHz: (a) C = 0.1; (b) C = 0.1i; (c) C = 0.3; (d) C = 0.3i; (e) C = 1; (f) C = i. The white dashed lines describe the boundaries of the objects.
Figure 8: Maps of F(x, C) at 2 GHz: (a) C = 0.1; (b) C = 0.1i; (c) C = 0.3; (d) C = 0.3i; (e) C = 0.5; (f) C = 0.5i. The white dashed lines describe the boundaries of the objects.
Figure 9: Maps of F(x, C) at 5 GHz: (a) C = 0.1; (b) C = 0.1i; (c) C = 0.3; (d) C = 0.3i; (e) C = 0.5; (f) C = 0.5i. The white dashed lines describe the boundaries of the objects.
19 pages, 8120 KiB  
Article
The Impacts of Phenological Stages within the Annual Cycle on Mapping Forest Stock Volume Using Multi-Band Dual-Polarization SAR Images in Boreal Forests
by Jiangping Long, Huanna Zheng, Zilin Ye, Tingchen Zhang and Xunwei Li
Forests 2024, 15(9), 1660; https://doi.org/10.3390/f15091660 - 20 Sep 2024
Cited by 1 | Viewed by 703
Abstract
SAR images with two polarizations show strong potential for mapping forest stock volume (FSV) from limited samples. However, accurately mapping FSV still presents challenges in selecting the optimal acquisition date for SAR images during specific phenological stages within the annual forest cycle (growth and dormant stages). To clarify the impacts of phenological stages within the annual cycle on FSV mapping, SAR images with various polarization modes and bands (Sentinel-1 (S), GaoFen-3 (GF-3 (G)), and ALOS-2 (A)) were acquired within the growth and dormant stages of an annual cycle in a boreal evergreen coniferous forest (Chinese pine) and a deciduous coniferous forest (Larch). Subsequently, single-band (G, S, and A) and multi-band combined alternative variable sets (A + G, A + S, S + G, and A + S + G) were extracted within the same stage. Finally, the forward selection approach was utilized in conjunction with four different models (MLR, KNN, RF, and SVR) to obtain the most suitable variable sets and generate FSV maps. The results demonstrated a strong correlation between the intensity of backscattering coefficients and the phenological stages of the forest. Within the dormant stage, the differences between backscattering coefficients obtained from the same polarization decreased significantly compared to the growth stage. Furthermore, the results revealed that more signals from inside the canopy could be detected during the dormant stage in both evergreen and deciduous coniferous forests. The accuracy in mapping FSV from single-band SAR images within the dormant stage was slightly higher than that within the growth stage, though accuracy was still significantly affected by both overestimation and underestimation. Moreover, combining different bands significantly improved the reliability of the mapped FSV.
The rRMSE values in the four multi-band combinations ranged from 22.37% to 29.40% for Chinese pine forests and from 21.27% to 34.38% for Larch forests, and the optimal result was observed for the A + S + G combination acquired within the dormant stage. This confirms that SAR signals and their sensitivity to FSV depend on the stage of the forest's annual growth cycle. Compared with the growth stage, dual-polarization SAR data acquired during the dormant stage are more suitable for estimating FSV in boreal forests. Full article
(This article belongs to the Section Forest Meteorology and Climate Change)
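The rRMSE values reported above follow the usual definition of relative root-mean-square error: the RMSE of predicted FSV normalized by the mean observed FSV, expressed as a percentage. A minimal sketch (one common convention; the paper may normalize differently):

```python
import math

def rrmse(predicted, observed):
    """Relative RMSE in percent: RMSE(predicted, observed) / mean(observed) * 100."""
    n = len(observed)
    rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
    return 100.0 * rmse / (sum(observed) / n)
```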
Figure 1: Maps of the study area (a) and the distribution of samples (b).
Figure 2: Map of the Digital Elevation Model (DEM).
Figure 3: Scatterplots between forest FSV and backscattering coefficients of different polarization modes in the growth stage (a–d) and the dormant stage (e–h).
Figure 4: Pearson correlation coefficients between FSV and various variables (derived and texture variables): (a,b,e,f) derived variables (X1 to X18); (c,d,g,h) texture variables (T1 to T8 extracted from co-polarization backscattering coefficients, T9 to T18 from cross-polarization backscattering coefficients).
Figure 5: Ground-measured versus predicted FSV obtained from optimal models using single-band dual-polarization SAR images acquired in the dormant stage: (a–c) Chinese pine; (d–f) Larch.
Figure 6: Measured versus predicted FSV derived from optimal models using single-band data acquired in the growth stage: (a–f) results for Chinese pine and Larch, respectively.
Figure 7: Measured versus predicted FSV from optimal models using multi-band dual-polarization SAR images acquired during the dormant stage: (a–d) Chinese pine; (e–h) Larch.
Figure 8: Measured versus predicted FSV from optimal models using multi-band dual-polarization SAR images acquired during the growth stage: (a–d) Chinese pine; (e–h) Larch.
Figure 9: Maps of FSV obtained from optimal models using multi-band data acquired during the dormant stage.
Figure 10: Radar charts of Pearson correlation coefficients between FSV and backscattering coefficients of different polarizations extracted from single-band SAR images during the growth and dormant stages.
Figure 11: Sorted Pearson correlation coefficients between FSV and the top 20 variables extracted from single dual-polarization SAR data: (a,b) Chinese pine; (c,d) Larch, during the growth and dormant stages.
Figure 12: Histograms of accuracy indices in mapping FSV using single- and multi-band polarimetric SAR data during the growth and dormant stages in planted forests.
19 pages, 10946 KiB  
Article
Crop Growth Analysis Using Automatic Annotations and Transfer Learning in Multi-Date Aerial Images and Ortho-Mosaics
by Shubham Rana, Salvatore Gerbino, Ehsan Akbari Sekehravani, Mario Brandon Russo and Petronia Carillo
Agronomy 2024, 14(9), 2052; https://doi.org/10.3390/agronomy14092052 - 7 Sep 2024
Cited by 1 | Viewed by 1954
Abstract
Growth monitoring of crops is a crucial aspect of precision agriculture, essential for optimal yield prediction and resource allocation. Traditional crop growth monitoring methods are labor-intensive and prone to errors. This study introduces an automated segmentation pipeline utilizing multi-date aerial images and ortho-mosaics to monitor the growth of cauliflower crops (Brassica oleracea var. botrytis) using an object-based image analysis approach. The methodology employs YOLOv8, a Grounding Detection Transformer with Improved Denoising Anchor Boxes (DINO), and the Segment Anything Model (SAM) for automatic annotation and segmentation. The YOLOv8 model was trained on aerial image datasets, which then facilitated the training of the Grounded Segment Anything Model framework. This approach generated automatic annotations and segmentation masks, classifying crop rows for temporal monitoring and growth estimation. The study's findings, based on a multi-modal monitoring approach, highlight the efficiency of this automated system in providing accurate crop growth analysis, promoting informed decision-making in crop management and sustainable agricultural practices. The results indicate consistent and comparable growth patterns between aerial images and ortho-mosaics, with significant periods of rapid expansion and minor fluctuations over time. The results also indicate a correlation between the time and method of observation, which points to the future possibility of integrating such techniques to increase the accuracy of crop growth monitoring based on automatically derived temporal crop-row segmentation masks. Full article
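The correlation analysis between the aerial-image and ortho-mosaic growth series uses the standard Pearson product-moment coefficient; a minimal, dependency-free sketch:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sy = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1 indicates the two observation methods track the same growth trend, which is what the study reports per crop row.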
Figure 1: (a) Location of the experimental farm in the Department of Agronomy, (b) University of Napoli Federico II, Portici (study area marked with a red polygon). Source: Google Earth, 19 October 2022.
Figure 2: (a) Aerial instance from 29 October 2020; (b) ortho-mosaic dated 21 October 2020 [34].
Figure 3: Training graphs for (a) 8 October, (b) 21 October, and (c) 29 October 2020 [34].
Figure 4: Instance crop-row masks derived from automatically segmented aerial images dated 21 October 2020: (a–g) Rows 1–7.
Figure 5: Relative crop growth rate across multi-date aerial imagery.
Figure 6: Relative crop growth percentage across multi-date ortho-mosaic imagery.
Figure 7: Time series analysis of multi-date aerial imagery.
Figure 8: Time series analysis of multi-date ortho-mosaics.
Figure 9: Pearson correlation analysis between multi-date aerial imagery and ortho-mosaics: (a–g) Crop Rows 1–7.
18 pages, 14483 KiB  
Article
Digital Surface Model Generation from Satellite Images Based on Double-Penalty Bundle Adjustment Optimization
by Henan Li, Junping Yin and Liguo Jiao
Appl. Sci. 2024, 14(17), 7777; https://doi.org/10.3390/app14177777 - 3 Sep 2024
Cited by 2 | Viewed by 1401
Abstract
Digital Surface Model (DSM) generation from high-resolution optical satellite images is an important research topic in the remote sensing field. In optical satellite imaging systems, the attitude information of the cameras recorded by satellite sensors is often biased, which leads to errors in the Rational Polynomial Camera (RPC) model of satellite imaging. These errors in the RPC model can mislead DSM generation. To solve this problem, we propose an automatic DSM generation method from satellite images based on the Double-Penalty bundle adjustment (DPBA) optimization algorithm. In the proposed method, two penalty functions, representing the camera's attitude and the spatial 3D points, respectively, are added to the reprojection error model of the traditional bundle adjustment optimization algorithm. Instead of acting on images directly, the penalty functions are used to adjust the reprojection error model and improve the RPC parameters. We evaluate the performance of the proposed method using high-resolution satellite image pairs and multi-date satellite images, comparing the accuracy and completeness of the DSMs generated by the proposed method, the Satellite Stereo Pipeline (S2P) method, and the traditional bundle adjustment (BA) method. Compared to the S2P method, the satellite image pair experiments indicate that the proposed method can significantly improve the accuracy and completeness of the generated DSM by about 1–5 m and 20%–60%, respectively, in most cases. Compared to the traditional BA method, the proposed method improves the accuracy and completeness of the generated DSM by about 0.01–0.05 m and 1%–3% in most cases. These experimental results attest to the feasibility and effectiveness of the proposed method. Full article
(This article belongs to the Section Earth Sciences)
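The double-penalty idea described above can be illustrated schematically: the usual sum of squared reprojection residuals is augmented with quadratic penalties on the camera-attitude corrections and on the 3D-point corrections. The exact penalty functions and weights are defined in the paper; the form below, with hypothetical weights `lam_att` and `lam_pt`, is only a sketch:

```python
def dpba_cost(residuals, d_attitude, d_points, lam_att, lam_pt):
    """Illustrative double-penalty bundle-adjustment objective:
    squared reprojection error plus quadratic penalties on the
    attitude update `d_attitude` and the 3D-point update `d_points`."""
    reproj = sum(r ** 2 for r in residuals)
    pen_attitude = lam_att * sum(a ** 2 for a in d_attitude)
    pen_points = lam_pt * sum(p ** 2 for p in d_points)
    return reproj + pen_attitude + pen_points
```

Larger weights keep the optimized camera attitude and points closer to their initial values, which is how such penalties regularize the adjustment.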
Figure 1: Block diagram of the proposed DSM generation method.
Figure 2: Reconstruction results from different image pairs: (a) image pair A; (b) image pair B.
Figure 3: The stereo rectification of the proposed method.
Figure 4: Test data in the IARPA MVS3DM dataset: (a) the input satellite image; (b) the ground-truth DSM.
Figure 5: DSMs generated from (a) the satellite image pair with more than a 30° sun-angle difference; (b) the satellite image pair with more than a 20° intersection angle.
Figure 6: Generated DSMs of the S2P method [7], the traditional BA method [32], and the proposed method.
Figure 7: Details of the generated DSMs from image pair 9.
Figure 8: Input satellite images of Site 2, Site 3, and Site 4.
Figure 9: Ground-truth DSM and generated DSMs of the multi-date S2P method [20], the traditional BA method [32], and the proposed method.
Figure 10: Ground-truth DSM and generated DSMs of the multi-date S2P method [20], the traditional BA method [32], and the proposed method.