Optimal Altitude, Overlap, and Weather Conditions for Computer Vision UAV Estimates of Forest Structure
"> Figure 1
<p>Maps showing location of 3 study sites in Maryland, USA, with local area insets (<b>a</b>). Maps of each 250 m × 250 m study area for Herbert Run (<b>b</b>), Knoll (<b>c</b>), and SERC (<b>d</b>). Overhead view of Ecosynth 3D point cloud for Herbert Run site from 2013-08-26 (<b>e</b>) and oblique view of the same point cloud from approximately the point of view of the red arrow (<b>f</b>). In all panels, the red squares are 250 m × 250 m in size. Imagery in (a–d) from Google Earth, image date 2014-10-23.</p> "> Figure 2
<p>Relationship between the error in Ecosynth top-of-canopy height (TCH) estimates of field canopy height and the displacement of the Ecosynth point cloud relative to the digital terrain model (DTM) used for extracting heights, as measured by the value Launch Location Elevation Difference (LLED) in meters (n = 82).</p> "> Figure 3
<p>Point cloud density (PD) and Ecosynth TCH error relative to field and LIDAR TCH for a single replicate sampled from every image to every 10th image, decreasing forward photographic overlap. Top axis is forward overlap: 60%, 64%, 68%, 72%, 76%, 80%, 84%, 88%, 92%, and 96%. Left plots show error without dense post-processing, plots at right show error of the same point clouds with dense post-processing.</p> "> Figure 4
<p>Cross-sections of a 100 m × 5 m swath of forest at the Herbert Run site showing Ecosynth point clouds produced from high forward overlap images (96%, (<b>a</b>)) and low forward overlap images (60%, (<b>b</b>)), relative to the LIDAR first return point cloud over the same area.</p> "> Figure 5
<p>Average variation or contrast of point cloud point color values per channel within forest areas under clear and cloudy lighting conditions. All per channel differences in average variation under different lighting were significantly different based on analysis of variance (<span class="html-italic">p</span> < 0.0001).</p> "> Figure 6
<p>Linear models of Ecosynth average TCH across optimal replicates (n = 7) to field (<b>a</b>) and LIDAR TCH (<b>b</b>), average Ecosynth TCH estimated above ground biomass density (AGB Mg·ha<sup>−1</sup>) relative to field estimated AGB (<b>c</b>). Solid line is one to one line, error bars are standard deviation, dotted lines are linear model.</p> ">
Abstract
1. Introduction
2. Methods
2.1. Data Collection
2.1.1. Study Area and Field Data
2.1.2. UAV Image Acquisition under a Controlled Experimental Design
2.1.3. Airborne LIDAR
2.2. Data Processing
2.3. Data Analysis
2.3.1. Measurements of Position Accuracy
2.3.2. Measurements of Canopy Structure
2.3.3. Measures of Canopy Sampling
2.3.4. Radiometric Quality of Ecosynth Point Clouds
3. Results
3.1. Point Cloud Positioning Quality
Point cloud quality metrics by lighting condition (CLEAR vs. CLOUDY) and by altitude above canopy (20–80 m); standard deviations in parentheses.

| Metric | CLEAR | CLOUDY | p < | 20 m | 40 m | 60 m | 80 m | R2 |
|---|---|---|---|---|---|---|---|---|
| N | 43 | 39 | | 9 | 15 | 17 | 41 | |
| Path-XY Error RMSE (m) | 1.2 (0.6) | 1.4 (1.3) | NS | 0.61 (0.25) | 1.0 (0.5) | 1.2 (0.7) | 1.6 (1.2) | 0.98 |
| Path-Z Error RMSE (m) | 0.44 (0.13) | 0.44 (0.12) | NS | 0.4 (0.1) | 0.5 (0.1) | 0.4 (0.1) | 0.5 (0.1) | NS |
| ICP-XY Error RMSE (m) | 1.8 (0.8) | 2.3 (1.2) | 0.05 | 2.2 (1.1) | 1.8 (0.7) | 2.2 (1.2) | 2.0 (1.2) | NS |
| ICP-Z Error RMSE (m) | 2.0 (1.0) | 3.8 (1.7) | 0.00001 | 3.4 (0.9) | 2.7 (1.2) | 2.4 (1.5) | 3.0 (1.9) | NS |
| LLED MAD (m) | 2.2 (1.3) | 3.8 (1.9) | 0.00001 | 3.2 (0.9) | 3.4 (1.2) | 2.4 (1.6) | 3.0 (2.1) | NS |
| Ecosynth TCH to Field RMSE (m) | 4.2 (0.6) | 4.3 (0.6) | NS | 5.3 (0.6) | 4.2 (0.3) | 4.2 (0.3) | 4.0 (0.4) | NS |
| Ecosynth to LIDAR TCH RMSE (m) | 2.5 (0.6) | 2.5 (0.7) | NS | 2.2 (0.4) | 2.5 (0.7) | 2.3 (0.6) | 2.6 (0.7) | NS |
| Point Density (points m−2) | 33 (14) | 43 (27) | 0.05 | 80 (24) | 53 (13) | 38 (7) | 23 (9) | 0.97 |
| Canopy Penetration (% CV) | 18 (3) | 16 (2) | 0.01 | 17 (2) | 16 (2) | 17 (2) | 17 (2) | NS |
| Average Computation Time (hours) | 44 (38) | 49 (46) | NS | 104 (14) | 70 (59) | 53 (33) | 23 (16) | NS |
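The RMSE and MAD metrics reported above are the standard definitions. As a minimal illustration (the paired height values below are hypothetical, not from the study), they can be computed as:

```python
import numpy as np

def rmse(estimate, reference):
    """Root mean square error between paired estimate and reference values."""
    d = np.asarray(estimate, float) - np.asarray(reference, float)
    return float(np.sqrt(np.mean(d ** 2)))

def mad(estimate, reference):
    """Mean absolute deviation, as used for the LLED metric."""
    d = np.asarray(estimate, float) - np.asarray(reference, float)
    return float(np.mean(np.abs(d)))

# Hypothetical paired canopy heights (m): Ecosynth TCH vs. field measurement
ecosynth = [22.1, 18.4, 25.0, 30.2]
field = [20.0, 17.5, 26.1, 28.0]
print(rmse(ecosynth, field))
print(mad(ecosynth, field))
```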
Point cloud quality metrics by image side overlap (20%–80%) and by image forward overlap a (96% vs. 60%); standard deviations in parentheses.

| Metric | Side 20% | Side 40% | Side 60% | Side 80% | R2 | Forward 96% | Forward 60% | R2 |
|---|---|---|---|---|---|---|---|---|
| N | 10 | 10 | 29 | 33 | | 5 | 5 | |
| Path-XY Error RMSE (m) | 1.9 (1.3) | 2.2 (1.7) | 1.2 (0.6) | 0.9 (0.4) | NS | 1.3 (0.2) | 1.7 (0.2) | 0.65 |
| Path-Z Error RMSE (m) | 0.5 (0.1) | 0.4 (0.1) | 0.5 (0.1) | 0.4 (0.1) | NS | 0.36 (0.1) | 0.41 (0.1) | 0.88 |
| ICP-XY Error RMSE (m) | 2.1 (1.3) | 2.8 (1.5) | 1.7 (0.8) | 2.1 (1.0) | NS | 1.7 (0.3) | 1.9 (0.3) | NS |
| ICP-Z Error RMSE (m) | 2.3 (1.5) | 3.3 (2.0) | 2.6 (1.2) | 3.1 (1.8) | NS | 1.9 (0.8) | 2.2 (1.1) | 0.41 |
| LLED MAD (m) | 2.0 (1.5) | 3.4 (2.0) | 3.0 (1.4) | 3.2 (2.0) | NS | 2.1 (1.2) | 2.5 (1.5) | 0.47 |
| Ecosynth TCH to Field RMSE (m) | 4.1 (0.2) | 4.5 (0.5) | 4.1 (0.3) | 4.4 (0.8) | NS | 3.6 (0.1) | 7.0 (0.3) | 1.0 |
| Ecosynth to LIDAR TCH RMSE (m) | 2.6 (0.7) | 2.2 (0.6) | 2.5 (0.6) | 2.6 (0.7) | NS | 3.4 (0.6) | 2.7 (0.1) | NS |
| Point Density (points m−2) | 14 (0.5) | 18 (0.7) | 34 (10) | 54 (23) | 0.93 | 36 (1) | 0.8 (0.1) | 0.67 |
| Canopy Penetration (% CV) | 15 (3) | 17 (3) | 17 (3) | 18 (2) | 0.93 | 18 (0.02) | 0.02 (0.01) | 0.91 |
| Average Computation Time (hours) | 8 (0.7) | 10 (0.5) | 26 (3) | 87 (38) | 0.93 | 45 (1.5) | 0.5 (0.01) | 0.91 |
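Forward and side overlap follow directly from flying altitude, camera field of view, and the spacing between photo centers. A simple pinhole-camera sketch of that geometry (the 80 m altitude, 60° field of view, and 4 m photo spacing below are hypothetical mission parameters, not the study's camera specification):

```python
import math

def ground_footprint(altitude_m, fov_deg):
    """Ground distance covered by one image along a given axis,
    for an idealized pinhole camera with the given field of view."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def overlap_fraction(spacing_m, footprint_m):
    """Overlap between consecutive images: 1 - spacing/footprint,
    clipped at zero when images no longer overlap."""
    return max(0.0, 1.0 - spacing_m / footprint_m)

# Hypothetical mission: 80 m above canopy, 60 deg along-track FOV,
# one photo every 4 m of flight -> forward overlap fraction
fp = ground_footprint(80.0, 60.0)
print(round(overlap_fraction(4.0, fp), 3))
```

The same relation applies to side overlap with the across-track field of view and the flight-line spacing.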
3.2. Canopy Structure and Canopy Sampling
3.3. Influence of Wind on Point Cloud Quality
3.4. Radiometric Quality of Ecosynth Point Clouds
3.5. Optimal Conditions for Ecosynth UAV-SFM Remote Sensing of Forest Structure
Average Ecosynth Quality Traits and Metrics

| Point Cloud Traits and Metrics | Herbert Run | Knoll | SERC |
|---|---|---|---|
N | 7 | 1 | 1 |
Path-XY Error RMSE (m) | 1.1 | 0.7 | 1.0 |
Path-Z Error RMSE (m) | 0.5 | 0.6 | 0.4 |
ICP-XY Error RMSE (m) | 1.7 | 0.5 | 1.8 |
ICP-Z Error RMSE (m) | 1.6 | 4.0 | 1.8 |
Launch Location Elevation Difference (m) | 1.2 | 3.1 | 2.1 |
Ecosynth TCH to Field Height RMSE (m) | 3.6 | 5.2 | 3.6 |
Ecosynth TCH to LIDAR TCH RMSE (m) | 3.0 | 1.6 | 3.2 |
Ecosynth TCH to Field Height R2 | 0.86 | 0.79 | 0.19 |
Ecosynth TCH to LIDAR TCH R2 | 0.99 | 0.99 | 0.89 |
Average Forest Point Density (points m−2) | 35 | 33 | 39 |
Average Forest Canopy Penetration (% CV) | 20 | 24 | 11 |
Computation Time (hours) | 45 | 50 | 15 |
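The TCH values compared above follow the common LIDAR-literature definition of top-of-canopy height: average the maximum canopy height within each grid cell over the plot. A minimal sketch of that computation (a constant ground elevation stands in for the LIDAR DTM for simplicity; the point coordinates are hypothetical):

```python
import numpy as np

def top_of_canopy_height(points_xyz, ground_elevation, cell=1.0):
    """Toy TCH: subtract ground elevation to get canopy height, take the
    maximum height per grid cell, then average the per-cell maxima."""
    pts = np.asarray(points_xyz, float)
    heights = pts[:, 2] - ground_elevation           # canopy height above ground
    ix = np.floor(pts[:, 0] / cell).astype(int)      # grid cell column index
    iy = np.floor(pts[:, 1] / cell).astype(int)      # grid cell row index
    cells = {}
    for i, j, h in zip(ix, iy, heights):
        cells[(i, j)] = max(cells.get((i, j), h), h)
    return float(np.mean(list(cells.values())))

# Hypothetical cloud: two occupied cells with canopy tops 25 m and 15 m
# above a ground elevation of 100 m
pts = [(0.2, 0.3, 125.0), (0.6, 0.1, 110.0), (1.5, 0.4, 115.0)]
print(top_of_canopy_height(pts, ground_elevation=100.0))  # -> 20.0
```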
3.6. Influence of Computation on Ecosynth Point Cloud Quality
SFM Algorithm | Photoscan v0.84 a | Photoscan v0.91 a | Photoscan v1.04 a Sparse | Photoscan v1.04 a Dense | Ecosynther v1.0 b Sparse | Ecosynther v1.0 b Dense |
---|---|---|---|---|---|---|
Path-XY Error RMSE (m) | 1.1 | 1.1 | 1.1 | 1.1 | 1.1 | 1.1 |
Path-Z Error RMSE (m) | 0.3 | 0.3 | 0.3 | 0.3 | 0.7 | 0.7 |
ICP-XY Error RMSE (m) | 1.6 | 1.6 | 1.6 | 1.6 | 1.9 | 1.9 |
ICP-Z Error RMSE (m) | 1.0 | 0.9 | 0.9 | 0.9 | 0.8 | 0.8 |
Launch Location Elevation Difference (m) | 0.9 | 0.9 | 0.9 | 0.9 | 0.6 | 0.6 |
Ecosynth TCH to Field Height RMSE (m) | 3.8 | 3.9 | 3.9 | 4.6 | 3.8 | 5.3 |
Ecosynth TCH to LIDAR TCH RMSE (m) | 3.4 | 3.0 | 2.9 | 2.0 | 2.9 | 2.0 |
Forest Point Cloud Density (points m−2) | 88 | 36 | 34 | 138 | 7 | 59 |
Forest Canopy Penetration (% CV) | 18 | 18 | 18 | 11 | 9 | 13 |
Computation Time (hours) | 30 | 45 | 16 | +40 c | 61 | +5 c |
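The ICP error metrics in the tables come from registering point clouds by iterative closest point. As a hedged illustration of the idea only (a translation-only toy version, not the registration software used in the study), each iteration matches every source point to its nearest target point and shifts the source by the mean residual:

```python
import numpy as np

def align_translation(source, target, iters=20):
    """Toy ICP, translation only: repeatedly match each source point to its
    nearest target point and shift the source by the mean residual."""
    src = np.asarray(source, float).copy()
    tgt = np.asarray(target, float)
    total = np.zeros(src.shape[1])
    for _ in range(iters):
        # brute-force nearest target neighbor for every source point
        d2 = ((src[:, None, :] - tgt[None, :, :]) ** 2).sum(axis=2)
        nearest = tgt[d2.argmin(axis=1)]
        step = (nearest - src).mean(axis=0)   # mean residual = translation step
        src += step
        total += step
    return total

# Hypothetical example: a small cloud shifted by a known offset
pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0], [0.5, 0.5, 1]], float)
offset = np.array([0.1, -0.05, 0.2])
print(align_translation(pts, pts + offset))   # recovers the applied offset
```

A full ICP also estimates rotation (e.g., via SVD of the matched-point covariance); the translation-only version above is just the simplest case of the same match-then-update loop.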
4. Discussion
4.1. The Importance of Accurate DTM Alignment
4.2. The Importance of Image Overlap
4.3. The Importance of Lighting, Contrast, and Radiometric Quality
4.4. The Importance of Wind Speed
4.5. Factors Influencing Tree Height Estimates
4.6. Future Research: The Path Forward for UAV-SFM Remote Sensing
4.6.1. Optimizing Data Collection with Computation Time
4.6.2. The Role of the Camera Sensor: Multi- and Hyperspectral Structure from Motion
4.6.3. Computer Vision Image Features: The New Pixel
5. Conclusions
Supplementary Files
Supplementary File 1
Acknowledgements
Author Contributions
Conflicts of Interest
References
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Dandois, J.P.; Olano, M.; Ellis, E.C. Optimal Altitude, Overlap, and Weather Conditions for Computer Vision UAV Estimates of Forest Structure. Remote Sens. 2015, 7, 13895-13920. https://doi.org/10.3390/rs71013895