Mapping Banana Plants from High Spatial Resolution Orthophotos to Facilitate Plant Health Assessment
">
<p>The study area on the Sunshine Coast (yellow outline), Queensland, Australia.</p> ">
<p>Ten selected orthophoto tiles in the Sunshine Coast Region represented as true colour composites each covering 1 km × 1 km. (<b>a</b>,<b>b</b>) area around Kin Kin; (<b>c</b>–<b>e</b>) area around Nambour; (<b>f</b>,<b>g</b>) area around Buderim; and (<b>h</b>–<b>j</b>) area around Beerwah. Coordinates represent the approximate centre of each image tile.</p> ">
<p>(<b>a</b>) Sunlit and shaded banana plant leaves; and (<b>b</b>) torn banana plant leaves.</p> ">
<p>Orthophoto of banana plants showing the star shape of the leaves and their associated shadows on the ground spatially “in touch” in the photo with the actual banana plant leaves. Approximate location: 153°02′00E″E, 26°41′45″S.</p> ">
<p>(<b>a</b>) True and (<b>b</b>) false colour orthophotos; (<b>c</b>) the Normalized Difference Vegetation Index (NDVI) index; (<b>d</b>) the Green Ratio index and (<b>e</b>) the vegetation (green) classification results. Location: 153°02′00″E, 26°41′45″S.</p> ">
<p>(<b>a</b>) Base object segmentation; (<b>b</b>–<b>d</b>) bright and dark edge detection; (<b>e</b>) smoothed edge detection; (<b>f</b>–<b>g</b>) line features; and (<b>h</b>) updated segmentation result. Location: 153°02′00″E, 26°41′45″S.</p> ">
<p>Classification of (<b>a</b>) banana plant candidates; (<b>b</b>) potential banana plant leaf (green lines) and banana plant leaf shadow (red lines) candidates; (<b>c</b>) coinciding banana plant candidates and banana plant leaf and banana plant leaf shadow candidates (red); and (<b>d</b>) classification of banana plants (red). Location: 153°02′00″E, 26°41′45″S.</p> ">
<p>(<b>a</b>) Banana plant classification result; (<b>b</b>) distance map; (<b>c</b>) smoothed banana plant objects; <b>d</b>) distance map; (<b>e</b>) buffer around the smoothed banana plant objects; (<b>f</b>,<b>g</b>) remaining buffers, and (<b>h</b>) final classification result after the visual assessment (red outlines = banana plants, green outlines = excluded false positives). Location: 153°02′00″E, 26°41′45″S.</p> ">
<p>Banana plantation outlined by blue lines. Location: 152°53′50″E, 26°41′25″S.</p> ">
Abstract
The Banana Bunchy Top Virus (Genus: Babuvirus) reduces plant growth and prevents banana production. Because of the very large number of properties with banana plants in South East Queensland, Australia, a mapping approach was developed to delineate individual banana plants and clusters of banana plants, to help plant identification and enable prioritization of plant inspections for Banana Bunchy Top Virus. Due to current outbreaks in South East Queensland, there are concerns that the virus may spread to the major banana growing districts further north. The mapping approach was based on very high spatial resolution airborne orthophotos. Object-based image analysis was used to: (1) detect banana plants using edge and line detection approaches; (2) produce accurate and realistic outlines around classified banana plants; and (3) evaluate the mapping results. The approach was developed using 10 image tiles of 1 km × 1 km and was then applied to orthophotos (3600 image tiles) from September 2011 covering the entire Sunshine Coast Region in South East Queensland. Based on field inspections of the classified maps, a user’s mapping accuracy of 88% (n = 146) was achieved. The results will facilitate the detection of banana plants and increase the inspection rate of Banana Bunchy Top Virus in the future.
1. Introduction
While banana cultivation has a long history, the Banana Bunchy Top Virus (BBTV) was first documented in Fiji in 1889, where it significantly affected banana production [1]. BBTV is a circular single-stranded DNA virus that causes the Bunchy Top disease in bananas. The virus has been assigned to the Nanoviridae family in the genus Babuvirus [2]. BBTV was introduced into Australia in planting material from Fiji in 1913. During the 1920s and 1930s, BBTV devastated the banana industry in the state of New South Wales, Australia, but at the same time a considerable effort was put into BBTV research to effectively control the disease [1].
Banana plants can only become infected through (1) infected planting material and (2) the banana aphid when it carries the virus [3]. BBTV can therefore spread when people give away young banana plants or suckers. The aphid is a small black insect that becomes a carrier of BBTV once it has fed on an infected banana plant. Wind can carry aphids many kilometres, which may spread BBTV further. Hook et al. [3] showed that the incubation period of BBTV, i.e., the time until symptoms appear in banana plants, ranged from 25 to 85 days after aphid inoculation.
No outbreaks of BBTV have been identified in North Queensland, Australia since a single outbreak in a plantation near Innisfail in 1954. More than 90% of Australia’s banana production takes place in North Queensland [4]. While the movement of bananas and control of BBTV is still under official regulation, the responsibility of detecting BBTV is increasingly left to the banana growers. The diagnosis of BBTV is generally performed by visual inspection of the symptoms of the infection. Symptom-bearing leaves develop dark green streaks of variable length in the leaf veins, midribs and petioles. Leaves become progressively dwarfed and develop marginal chlorosis or yellowing. As the disease develops, the leaves become more upright and crowded or bunched at the apex of the plant [1]. Plant growth is stunted and the production of bananas will significantly decrease or stop once a banana plant becomes infected. Infected banana plants cannot recover from the disease and new suckers will also be infected [3].
Once BBTV is detected, the only way to eliminate it is to destroy the infected banana plant. This is done by injecting the infected plant with (1) a systemic insecticide to prevent aphids from spreading to other plants and (2) glyphosate to kill the plant. Bio Pest Oil is also used to suppress aphid activity and prevent aphids from flying from infected banana plants [5].
BBTV is currently present in New South Wales and in South East Queensland, with the Sunshine Coast in Queensland being the northernmost point of detection at present. It is essential to prevent the spread of BBTV to the major banana growing areas in Queensland, such as Tully and Innisfail, as such a spread would have significant economic consequences for the Australian banana industry. The annual farm gate value of banana production in Australia is approximately AU$450 million, making it the largest horticulture industry in Queensland [4]. In South East Queensland there are many backyard growers spread over a large area, which makes inspection of banana plants and identification of BBTV very time-consuming and labour-intensive. Cook et al. [6] used a simulation model to estimate the likely benefits of excluding BBTV from commercial banana plantations over time relative to a nil management scenario in which no surveillance or containment activities take place. A range of possible future incursion scenarios showed that disease exclusion would prevent annual losses of AU$15.9–27.0 million for the Australian banana industry, and that improvements in disease surveillance and incursion response would greatly enhance future cost benefits.
The Queensland Government collects very high spatial resolution aerial orthophotos on a regular basis over urban and peri-urban areas of South East Queensland. These orthophotos represent a potential means of detecting individual banana plants and clusters of banana plants over large areas on an annual basis. A mapping approach for automatic detection and delineation of banana plants from very high spatial resolution orthophotos may facilitate the detection of banana plants and increase the BBTV inspection rate by targeting field-based searches and inspections of banana plants, especially in the vicinity of areas where plants with BBTV have been detected, as neighbouring areas are more likely to be infected.
In response to the threat of BBTV to the Australian banana industry, the National Banana Bunchy Top Project was established in 2009. The project is administered through the Australian Banana Growers’ Council, using funds from compulsory grower levies, matched by funding from the Federal Government through Horticulture Australia Limited, to eradicate BBTV from commercial plantations, backyard banana plants and feral banana plants. This research was part of the National Banana Bunchy Top Project and aimed to develop a mapping approach for detection and delineation of banana plants. Specifically, the objective of this research was to develop a mapping approach using object-based image analysis to map banana plants within 10 selected 1 km × 1 km tiles using existing very high spatial resolution orthophotos, and to apply the approach to the whole Sunshine Coast Regional Council area (~3100 km²). No automatic mapping approach currently exists for mapping individual banana plants and clusters of banana plants.
2. Study Area
The study area was located in the Sunshine Coast Region, Queensland, Australia (Figure 1) and included 10 tiles each covering 1 km × 1 km (Figure 2). The Sunshine Coast Region is located north of Brisbane and contains residential and semi-rural areas that have experienced rapid population growth over the last decades. Small tropical fruit farms, including banana farms, exist on the Sunshine Coast. However, the majority of banana plants are found in backyards. The 10 orthophoto tiles selected for this research represented a number of different environments from rural to high density urban areas (Figure 2).
3. Data and Methods
3.1. Image and Field Data
Ten orthophoto tiles, each covering 1 km × 1 km, were selected for this research project from a total of 3600 tiles covering the whole of the Sunshine Coast Region. The 10 tiles, selected to represent the majority of landscape variation on the Sunshine Coast, were captured between 10 and 16 September 2011 and consisted of four multi-spectral bands with 0.10 m pixels located in the blue, green, red and near infrared (NIR) parts of the spectrum. The 10 tiles all contained a large number of banana plants or clusters of banana plants, occurring within different types of urban, peri-urban and semi-rural environments typical of the Sunshine Coast Region. In addition, the 10 tiles were selected because BBTV had been detected within these tiles and there was an interest in identifying other banana plants in the vicinity of the infected plants.
GPS recordings of banana plant locations were collected on 16 October 2012 through an initial inspection of several properties in the Sunshine Coast Region, including one of the 10 image tiles, and were used for calibration purposes during the development of the mapping approach. Forty-two sites with banana plants were visited for calibration purposes. A two-day field campaign was carried out on 25 and 26 March 2013 to collect independent validation data to assess whether the banana plants identified in the remaining nine orthophotos were in fact banana plants, and whether the mapping method had omitted any banana plants. This was done once the maps of delineated banana plants had been completed. To provide an understanding of why some banana plants were omitted by the mapping approach, omission errors were assessed by systematic door-knocking to inspect all properties along the streets within four of the 10 selected image tiles (two tiles in Kin Kin and two tiles in Buderim). In addition to the four image tiles systematically assessed, 97 out of 134 of the banana plant sites delineated in the five remaining orthophotos used for validation were inspected to assess the presence of banana plants. A total of 164 sites with banana plants were visited for validation purposes within the nine image tiles. One hundred and forty-six out of 183 sites mapped with banana plants (80% of sites) were checked in the field; access constraints prevented the remaining 37 mapped banana plant sites from being checked. A GPS receiver was used to record a single point-based location of all 164 identified sites with banana plants to an accuracy within 3 m. The extent of banana plants at each site varied from approximately 1 m² (single dwarf plant) to 3000 m² (small plantation). In the majority of cases, a cluster of fewer than 20 banana plants was found at each location, with the exception of three small plantations and 11 clusters of banana plants varying in size from approximately 200 to 500 m². Banana plants that were identified in the field but could not be visually identified in the orthophotos, and could not be determined to be more than 1.5 years old, were omitted from the accuracy assessment because of the 1.5 year time gap between the capture date of the orthophotos and the field validation campaign. Classification accuracies were calculated based on the collected independent field data.
3.2. Initial Evaluation of Banana Plant Detection and Delineation Approaches
Most tree crown detection and delineation algorithms are based on two distinct spectral properties of tree crowns and their representation in high spatial resolution image data: (1) the association of a tree apex with a local maximum image brightness value; and (2) delineation of the crown boundary by local minima brightness values [7]. One of the most fundamental assumptions of crown delineation algorithms is that the centre of a crown appears radiometrically brighter than the edge [8,9] due to the more direct illumination of the crown peak [10]. The valley-following approach (local minima approach) was developed in Canada [11] based on delineation of tree crown boundaries by finding the shaded parts between the tree crowns, which appear to have low brightness values [12].
Quackenbush et al. [13] found canopy illumination and background effects to significantly reduce the mapping accuracies of conventional pixel-based classifications of individual trees, especially in highly heterogeneous landscapes such as urban areas [14]. Hence, object-based image analysis is now considered more appropriate for single tree classification because of the added information available when using objects [15]. Bunting and Lucas [16] used an object-based image analysis approach to delineate tree crowns of mixed species and achieved accuracies between 48% and 88%. Their approach also used object maxima as seeds for tree crown identification, with subsequent expansion from the seed location to the edges of the tree crowns and clusters.
However, these tree crown delineation and object-based approaches were not found suitable for the delineation of individual and clusters of banana plants because of the banana plants’ structure, composition and leaf shape, which prevent the detection of a sunlit apex. Leaves are spirally arranged and may grow up to 3 m long and about 0.60 m wide (Figure 3a). As leaves are easily torn by the wind, they often have a frond-like appearance (Figure 3b). This causes some parts of the leaves to be sunlit and other parts to be shaded (Figure 3a,b), reducing the ability to use spectral information for the detection of banana plants. Hence, the detection of banana plants focused mainly on leaf shape and length and on the contrast between the leaves/leaf shadows and the ground. As most banana plants are <7 m tall, distinct shadows cast by the leaves onto the ground were in most cases “in touch” with the banana plant leaves from an aerial viewpoint (Figure 4).
3.3. Banana Plant Detection and Delineation
The objective of geographic object-based image analysis (GEOBIA) is to develop and apply theory, methods and tools for replicating and improving human interpretation of remotely sensed image data in an automated manner. GEOBIA consists of image segmentation, i.e., clustering of pixels into homogeneous objects, subsequent classification or labelling of the objects, and modelling based on the characteristics of objects. In high spatial resolution image data, individual features on the ground usually consist of multiple pixels, which may exhibit variable spectral reflectance characteristics [17]. GEOBIA overcomes problems of traditional pixel-based analysis of high spatial resolution image data by first defining segments, rather than pixels, to classify, and by allowing spectral reflectance variability to be used as an attribute for discriminating features in the segmentation approach. In addition, GEOBIA allows the inclusion of additional information to guide the classification and modelling processes, such as: object average reflectance; object standard deviation; object maximum, minimum and median values; area and shape of objects; texture of objects; location of objects in relation to other objects and land cover classes in the landscape; relation of objects to the image scene characteristics; exploitation of existing geographic information from spatial data infrastructures; and many other contextual properties not available in traditional pixel-based approaches [18,19]. Very high spatial resolution image data are required to detect individual banana leaves in order to identify and map individual and small clusters of banana plants. Using orthophotos with 0.10 m pixels, GEOBIA was deemed the only suitable approach for mapping banana plants.
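To make the object-attribute idea concrete, the sketch below computes a few of the per-object features listed above (band means, standard deviations and area) from a generic segmentation label image using NumPy. This is an illustration only; the actual analysis used the feature set built into eCognition, and the array layout and function name here are assumptions.

```python
import numpy as np

# Minimal illustration (not the eCognition feature set): compute a few per-object
# attributes from a 4-band image stack with shape (bands, rows, cols) and an
# integer segmentation label image (0 = background, 1..N = object IDs).
def object_features(image, labels):
    """Return per-object band means, standard deviations and area in pixels."""
    features = {}
    for obj_id in np.unique(labels):
        if obj_id == 0:
            continue
        mask = labels == obj_id
        pixels = image[:, mask]              # shape: (bands, n_pixels)
        features[obj_id] = {
            "mean": pixels.mean(axis=1),     # object average reflectance per band
            "std": pixels.std(axis=1),       # within-object spectral variability
            "area_px": int(mask.sum()),      # object area in pixels
        }
    return features
```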
GEOBIA builds on the older segmentation, edge-detection, feature extraction and classification concepts of remote sensing image analysis and the idea of incorporating contextual information in the classification [20,21]. The increasing availability of high spatial resolution digital image data in the 2000s coincided with the release of the first commercially available object-based image analysis software, eCognition [22]. A variety of software packages focusing on GEOBIA have been developed since the launch of eCognition in 2000 [18]. However, the eCognition software is currently the most advanced and sophisticated GEOBIA software available. The eCognition Server allows batch processing of very large data sets by utilising multiple computers for the processing generally using a tiling and stitching approach. The tiling and stitching approach cuts an image into multiple subsets, sends each subset to separate computers for processing and upon completion of the processing, the subsets are stitched back together.
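A simplified, hypothetical sketch of the tiling and stitching idea is given below: the scene is split into fixed-size subsets, each subset is classified independently, and the results are written back into a full-size output. The real eCognition Server additionally distributes the subsets across multiple computers and handles objects that straddle tile boundaries, which this sketch ignores.

```python
import numpy as np

def tile_slices(height, width, tile_size):
    """Yield (row, col) slice pairs covering an image of the given size."""
    for r in range(0, height, tile_size):
        for c in range(0, width, tile_size):
            yield (slice(r, min(r + tile_size, height)),
                   slice(c, min(c + tile_size, width)))

def process_in_tiles(image, tile_size, classify_fn):
    """Classify each tile independently and stitch the results back together.
    classify_fn is assumed to map an image subset to a class-code array."""
    rows, cols = image.shape[-2:]
    result = np.zeros((rows, cols), dtype=np.uint8)
    for rs, cs in tile_slices(rows, cols, tile_size):
        result[rs, cs] = classify_fn(image[..., rs, cs])
    return result
```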
The eCognition Client version was used for the rule set development, while testing of the developed rule sets was performed with the eCognition Server for faster processing. A total of 16 computers were used for the processing (4 × 2.66 GHz with 16 GB RAM, 6 × 2.9 GHz with 64 GB RAM, 6 × 2.53 GHz with 24 GB RAM). The sites with banana plants and clusters of banana plants identified in the field for calibration purposes were used to train the rule set and select suitable algorithm parameter values for each process in the rule set. The individual steps in the processing approach are illustrated in Figures 5–8, using a subset of orthophoto g in Figure 2. This subset was selected because three clusters of banana plants of varying sizes were located in close proximity and next to different land-cover classes. The overall processing approach in the eCognition software included a number of main stages:
Mapping of vegetation features (Table 1);
Production of edge extraction bands (Table 2);
Detection of bright and dark lines from the edge extraction bands (Table 2);
Extraction of banana plant candidates based on lines and areas with a high density of edges (Table 3);
Creation of buffers around banana plant candidates (Table 4);
Manual/Visual acceptance or decline of banana plant candidate objects; and
Export results to shapefiles.
In eCognition Architect an Action Library and an associated Architect Solution were produced to manually accept or decline each of the banana plant candidate buffers (Figure 8h). The visually assessed maps were subsequently exported as a shapefile. In ArcMap, the delineated banana plant objects in the shapefile were further assessed and assigned to be mapped with “high confidence” or “low confidence” as some banana plants and clusters of banana plants were easier than others to confidently select in the eCognition Architect.
3.4. Map Validation
Upon completion of the maps of banana plants, nine out of the 10 image tiles were visited in the field to assess errors of commission and omission (the banana plant location information from the remaining image tile had been used for calibration purposes). The orthophotos and the derived banana plant maps were brought on the field trip as A3 printouts and on a laptop. Areas that appeared to have changed (9% of sites visited) were excluded from the accuracy assessment, i.e., (1) locations where no banana plant-looking features were present during the field trip on 25 and 26 March 2013 although they had been mapped in the orthophotos (six locations); and (2) locations where banana plants were present at the time of the field trip but did not appear in the orthophotos (12 locations). Even though the three small banana plantations within the image tiles produced multiple objects (28 objects for the largest plantation) in the analysis, because of the separation between some of the banana plants, these objects were only counted as a single location in the accuracy assessment to provide a more realistic measure of the detection rate.
4. Results and Discussion
4.1. Banana Plant Mapping Results
The eCognition processing of each image tile took 15–20 min when the 1 km × 1 km orthophoto tiles were cut into four subsets for batch processing in the eCognition Server software. Generally, the identification and delineation of clusters of banana plants worked very well using the developed rule set (Figures 9 and 10), whereas individual banana plants (16) were in some cases omitted. Those that were omitted generally had limited contrast against the ground (Figure 11), whereas those showing more contrast or casting distinct leaf-shaped shadows on the ground were automatically identified (Figure 12).
An example of the final classification results is presented in Figure 13. Based on the field validation data, 94 of the 96 banana plant objects classified with high confidence (yellow outlines) were in fact banana plants, yielding a user’s accuracy of 98%. Of the 50 low confidence objects (orange outlines) inspected in the field, 35 were in fact objects with banana plants, yielding a user’s accuracy of 70%. The false positives (features incorrectly outlined as banana plants) were a combination of palms, Heliconia sp. and Strelitzia sp. Combined, 129 of the 146 mapped objects inspected in the field contained banana plants, i.e., an overall user’s accuracy of 88%.
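As a check on the arithmetic reported above, the snippet below reproduces the user’s accuracy figures from the counts of correctly classified high and low confidence objects; the variable names are illustrative.

```python
# Counts from the field validation of mapped objects (as reported above).
high_correct, high_total = 94, 96     # high confidence objects that were bananas
low_correct, low_total = 35, 50       # low confidence objects that were bananas

print(f"High confidence user's accuracy: {high_correct / high_total:.0%}")  # 98%
print(f"Low confidence user's accuracy:  {low_correct / low_total:.0%}")    # 70%

overall = (high_correct + low_correct) / (high_total + low_total)
print(f"Overall user's accuracy: {overall:.0%}")                            # 88%
```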
A total of 35 banana plants/clusters of banana plants were not identified by the mapping approach. Nineteen of these were located underneath trees and could not be identified in the orthophotos. The other unidentified banana plants were either too small to identify in the images (six banana plants) or lacked contrast against the ground (10 banana plants). Dwarf banana plant varieties were in general more difficult to identify because of their smaller size and shorter leaves, which made the automatic detection of dark and bright lines in the eCognition software difficult. The lack of contrast between banana plant leaves and leaf shadows and the ground prevented edge and line detection in the rule set and hence caused the omission of these banana plants. After visual reassessment of the orthophotos for all 35 missed objects with banana plants, it was concluded that only nine (three dwarf banana plants and six plants with limited contrast against the ground) could be confidently identified visually in the images. These nine locations were missed as part of the automatic processing. In some cases, the lack of edge detection in the eCognition software caused the smoothing of the added bright and dark line images to produce brightness values that fell below the threshold set in the rule set for identification of banana plant candidates. While the automatic approach did select most of them, they were in some cases difficult to confidently accept in the final manual step using the eCognition Architect. The fact that not all banana plants can be identified from the orthophotos presents a challenge for field inspection teams searching for banana plants infected with BBTV. However, with a detection rate of 79% (129 out of 164 locations), field inspection teams will save significant amounts of time and be able to inspect a much larger number of sites with banana plants than previously.
The more densely populated areas, e.g., Buderim, had many single banana plants or small clusters of banana plants, which made these areas the most challenging to map, especially in combination with the heterogeneous urban landscape containing many other distinct edge and line features. In urban areas, other types of trees and buildings were often found adjacent to single or small numbers of banana plants, which in some cases were partly obstructed or partly shaded as a result. This reduced the ability to detect banana plant leaves and their shadows as line features. The presence of many varieties of palms in urban areas increased the number of false positives.
The number of objects classified as banana plant candidates by the automatic rule set processing of the 10 tiles is presented in Table 5. The automatic processing produced a large number of false positives, which emphasizes the need for the final step using the eCognition Architect solution to visually accept or decline banana plant objects. The time spent on the visual assessment in the eCognition Architect was between 5 and 31 min per image tile. This is significantly faster than visually assessing the whole image tile and manually delineating banana plants without the object-based image classification approach, which took on average 73 min per image tile for the 10 image tiles.
4.2. Large Area Mapping
The eCognition Server was found suitable for large area mapping. A total of 16 computers were used to process the 3600 orthophoto tiles of the Sunshine Coast, with about 500–600 tiles processed per 24 h. Given the vast extent of the Sunshine Coast, it would be beneficial to reduce the number of false positives while still avoiding the omission of objects representing banana plants. Some forested areas produced a large number of false positives, as several features occurred with distinct edge and line characteristics fulfilling the criteria used to map banana plants. LiDAR data collected with properly selected acquisition specifications, and at the same time as the orthophotos, would be useful for preventing false positives in forested areas based on height information. Banana plants vary in height depending on the variety and growing conditions, but they rarely exceed a height of 8 m; most varieties are between 2 m and 6 m tall. As many palms and trees are taller than banana plants, height information derived from LiDAR data could be used to reduce the number of false positives.
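As a hypothetical illustration of how such LiDAR-derived height information could be used, the function below discards candidate objects whose median canopy height exceeds the ~8 m maximum noted above. It assumes a canopy height model (CHM) co-registered with the orthophotos, which was not available in this study; the names and thresholds are illustrative.

```python
import numpy as np

MAX_BANANA_HEIGHT_M = 8.0   # banana plants rarely exceed this height

def filter_candidates_by_height(candidate_masks, chm):
    """Keep candidate objects whose median canopy height is plausible for bananas.

    candidate_masks: list of boolean arrays, one per candidate object.
    chm: canopy height model in metres above ground, aligned with the imagery.
    """
    return [mask for mask in candidate_masks
            if np.median(chm[mask]) <= MAX_BANANA_HEIGHT_M]
```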
Along drainage lines and streams within forested areas, palms were often present, which also resulted in false positives because of their compositional similarity to banana plants (Figure 14). Palms in general were the cause of many false positives in both peri-urban and urban areas (Figure 15). The area around Buderim in particular produced many false positives because of the large number of palms in this area. Man-made features and their associated shadows appearing with distinct edges and line objects between 1 and 5 m in length also resulted in false positives; examples included rooftops, footpaths and power poles and their shadows. While this was not a major issue for the 10 image tiles mapped in this research project, there were many such examples in the processing of the whole Sunshine Coast data set, highlighting the need to implement additional mapping steps in the rule set for large areas to reduce the number of false positives and improve the detection of banana plants.
The main limitation of the automatic mapping approach was the inclusion of many false positives. While this was not a major issue for the 10 tiles, because the eCognition Architect solution developed as part of the rule set automatically steps through each outlined object and allows the image analyst to accept or decline it, it does increase the time spent on completing the mapping of banana plants and is impractical for large area mapping. Based on a sample of 100 assessed image tiles, the number of false positives ranged from 5 to 2759, with an average of 497. The visual assessment step in the eCognition Architect was only applied to this sample of 100 image tiles. While banana plants occurred in 52 of the selected 100 image tiles, there were no obvious trends in their distribution. In addition, the final results rely on the expertise of the image interpreter conducting the final visual mapping step and the ability to visually discriminate banana plants from similar looking features also outlined by the automatic mapping approach. Reducing the number of false positives by changing thresholds would leave fewer objects to visually accept or decline, making this process faster, but would risk omitting banana plants, especially those with less contrast against the background. Hence, the overall mapping approach is currently a compromise between accuracy and time spent per image tile. There may be other approaches to explore in order to optimise the eCognition rule set further for large area mapping, but that was outside the scope of this project.
Despite this compromise, the rule set included novel individual mapping approaches that will be useful for many mapping applications in the future. An example is the automatic creation of buffers around the banana plant candidates. This approach can be used for mapping other features of interest, such as different types of trees and agricultural areas with distinct characteristics, e.g., damage, to clearly and neatly outline these features in an automated manner. The approach is particularly useful when mapping features of which only parts can be detected. For example, in this research only some banana plant leaves were identified based on their edge and line characteristics, but the whole banana plant or cluster of plants was generally delineated automatically because of the buffering approach.
4.3. Accuracies, Errors, and Uncertainties
The main strength of the developed object-based image analysis approach was the high detection rate of banana plants. While the producer’s accuracy was 79%, over half of the omission errors were due to the obstruction of banana plants by tall trees, making it impossible to detect these banana plants. The main limitation of the mapping approach was the large number of false positives, which required the final visual assessment step to reduce the errors of commission. After the application of the final visual assessment step, the errors of commission were reduced to 12%, representing Heliconia sp., Strelitzia sp. and some palms that looked very similar to banana plants. Chemura et al. [24] reported similar accuracies for mapping individual oil palms and found that palm age influenced the user’s and producer’s accuracies. Differences in banana plant varieties, their density in clusters, their health, height, age and location in the landscape contributed to variation in their appearance in the orthophotos, which carried some errors of commission through to the final visual assessment step. Johansen et al. [25] also found that young, variably aged and damaged banana plants reduced the mapping accuracy of banana plantations in north Queensland, Australia. Although the final visual assessment step focussed on reducing errors of commission, this step was ultimately a compromise between user’s and producer’s accuracies.
The only uncertainties introduced in the accuracy assessment were attributed to the time gap of 1.5 years between the orthophoto and field data acquisition times. Twenty-six locations with banana plants identified in the field were not identified in the orthophotos, in most cases because of the obstruction of tall trees. Because of the 1.5 year time gap between the orthophoto and field data collections, some of these 26 banana plants/clusters of banana plants may not have been present at the time of the orthophoto acquisition. Hence, the producer’s accuracy may in fact be higher than the reported 79%.
5. Conclusions and Future Work
This research focussed on automatically identifying and delineating banana plants to facilitate the detection of banana plants and increase the inspection rate of Banana Bunchy Top Virus on the Sunshine Coast, Queensland, Australia. An innovative object-based image analysis approach was developed to automatically identify and delineate potential banana plants, including a final visual step to accept or decline the outlined banana plant candidates. User’s and producer’s accuracies of 88% and 79%, respectively, were achieved. The visual interpretation step to identify actual banana plants among the banana plant candidates took on average 13 min per image tile, which is significantly faster than solely using manual image interpretation, which took on average 73 min per image tile. The developed rule set also included a novel approach, applicable to other mapping tasks, for the automatic creation of buffers around features of which only parts can be detected. Hence, this research advances the current knowledge on, and application value of, object-based image analysis of very high spatial resolution optical image data. The developed mapping approach has the potential to significantly contribute to identifying banana plants and prioritising the field inspections of banana plants for detection of Banana Bunchy Top Virus in the future.
Future work should focus on improving the rule set in the eCognition software to reduce the number of false positives for large area mapping to reduce the time spent on the final visual interpretation step. This can be achieved if man-made features are automatically mapped and forested areas can be eliminated from the search for banana plants. This may be facilitated by the inclusion of LiDAR data collected coincidently with very high spatial resolution orthophotos. Often, only subtle changes appear in banana plants infected with BBTV. Therefore, it may be difficult for untrained people to recognise the symptoms in the field. Because of spectral reflectance variation encountered among various plant varieties, differing illumination angles on leaves, young versus old leaves, and leaves with leaf spots and other diseases, it is uncertain if even high spatial resolution hyper-spectral remote sensing techniques will enable detection of BBTV infection in mapped banana plants. However, future work may explore in further detail whether the detection of BBTV from remote sensing techniques is possible.
Acknowledgments
This research project was administered by the Australian Banana Growers’ Council and was funded from banana grower levies, the National Banana Bunchy Top Project and Horticulture Australia Limited. The Banana Bunchy Top inspection team from NSW is thanked for their help with the field validation campaign in March 2013. The School of Geography, Planning and Environmental Management and the Biophysical Remote Sensing Group at the University of Queensland provided a field spectroradiometer for the fieldwork and image processing software for this project.
Author Contributions
Kasper Johansen was the main developer and coordinator of the project and developed parts of the eCognition rule set, conducted all GIS analysis and validation, participated in the fieldwork and wrote the journal manuscript and produced the figures. Malte Sohlbach developed significant parts of the rule set together with Kasper Johansen and provided feedback on the manuscript. Barry Sullivan, Samantha Stringer and David Peasley all participated actively in the research, project development and fieldwork, facilitated access to the orthophotos, and provided large amounts of previously collected field data. Stuart Phinn participated actively in the project development and provided valuable editing of the manuscript.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Dale, J.L. Banana bunchy top: An economically important tropical plant virus disease. Adv. Virus Res 1987, 33, 301–325. [Google Scholar]
- Harding, R.; Burns, T.; Dale, J. Virus-like particles associated with banana bunchy top disease contain small single-stranded DNA. J. Gen. Virol 1991, 72, 225–230. [Google Scholar]
- Hook, C.R.R.; Wright, M.G.; Kabasawa, D.D.; Manandhar, R.; Almeida, R.P.P. Effect of banana bunchy top virus infection on morphology and growth characteristics of banana. Ann. Appl. Biol 2008, 153, 1–9. [Google Scholar]
- Australian Banana Growers’ Council. Available online: http://www.abgc.org.au (accessed on 11 August 2014).
- Sacoa: Biopest Paraffinic Oil. Available online: http://www.sacoa.com.au/contents/products/biopest.htm?id=8 (accessed on 11 August 2014).
- Cook, D.C.; Liu, S.; Edwards, J.; Villalta, O.N.; Aurambout, J.-P.; Kriticos, D.J.; Drenth, A.; de Barro, P.J. Predicting the benefits of banana bunchy top virus exclusion from commercial plantations in Australia. PLoS One 2012, 7, e42391. [Google Scholar]
- Pouliot, D.D.; King, D.J.; Bell, F.W.; Pitt, D.G. Automated tree crown detection and delineation in high-resolution digital camera imagery of coniferous forest regeneration. Remote Sens. Environ 2002, 82, 322–334. [Google Scholar]
- Culvenor, D.S. TIDA: An algorithm for the delineation of tree crowns in high spatial resolution remotely sensed imagery. Comput. Geosci 2002, 28, 33–44. [Google Scholar]
- Gougeon, F.A.; Leckie, D.G. Individual tree crown image analysis—A step towards precision forestry. In Proceedings of the First International Precision Forestry Symposium, Seattle, WA, USA; June 2001; pp. 17–20. [Google Scholar]
- DeLucia, E.H.; Nelson, K.; Vogelmann, T.C.; Smith, W.K. Contribution of intercellular reflectance to photosynthesis in shaded leaves. Plant Cell Environ 1996, 19, 159–170. [Google Scholar]
- Gougeon, F.A. A crown-following approach to the automatic delineation of individual tree crowns in high spatial resolution aerial images. Can. J. Remote Sens 1995, 21, 274–284. [Google Scholar]
- Brandtberg, T. Remote Sensing for Forestry Applications—A Historical Retrospect; Centre for Image Analysis, Swedish University of Agricultural Sciences: Umeå, Sweden, 1999. [Google Scholar]
- Quackenbush, L.J.; Hopkins, P.F.; Kinn, G.J. Developing forestry products from high resolution digital aerial imagery. Photogramm. Eng. Remote Sens 2000, 66, 1337–1346. [Google Scholar]
- Mathieu, R.; Aryal, J.; Chong, A.K. Object-based classification of IKONOS imagery for mapping large-scale vegetation communities in urban areas. Sensors 2007, 7, 2860–2880. [Google Scholar]
- Pu, R.; Landry, S. A comparative analysis of high spatial resolution IKONOS and WorldView-2 imagery for mapping urban tree species. Remote Sens. Environ 2012, 124, 516–533. [Google Scholar]
- Bunting, P.; Lucas, R. The delineation of tree crowns in Australian mixed species forests using hyperspectral Compact Airborne Spectrographic Imager (CASI) data. Remote Sens. Environ 2006, 101, 230–248. [Google Scholar]
- Hay, G.J.; Castilla, G.; Wulder, M.A.; Ruiz, J.R. An automated object-based approach for the multiscale image segmentation of forest scenes. Int. J. Appl. Earth Obs. Geoinf 2005, 7, 339–359. [Google Scholar]
- Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogram. Remote Sens 2010, 65, 2–16. [Google Scholar]
- Johansen, K.; Bartolo, R.; Phinn, S. Special feature—Geographic object-based image analysis. J. Spat. Sci 2010, 55, 3–7. [Google Scholar]
- Haralick, R.M.; Shapiro, L. Survey: Image segmentation techniques. Comput. Vis. Graph. Image Process 1985, 29, 100–132. [Google Scholar]
- Kettig, R.; Landgrebe, D. Classification of multispectral image data by extraction and classification of homogenous objects. IEEE Trans. Geosci. Elect 1976, GE-14, 29–44. [Google Scholar]
- Benz, U.C.; Hofmann, P.; Willhauck, G.; Lingenfelder, I.; Heynen, M. Multi-resolution, object-oriented fuzzy analysis of remote sensing data for GIS-ready information. ISPRS J. Photogram. Remote Sens 2004, 58, 239–258. [Google Scholar]
- Baatz, M.; Benz, U.; Dehghani, S.; Heynen, M.; Höltje, A.; Hofmann, P.; Lingenfelder, I.; Mimler, M.; Sohlbach, M.; Weber, M. eCognition Professional User Guide; Definiens Imaging GmbH: München, Germany, 2004. [Google Scholar]
- Chemura, A.; van Duren, I.; van Leeuwen, L.M. Determination of the age of oil palm from crown projection area detected from WorldView-2 multispectral remote sensing data: The case of Ejisu-Juaben district, Ghana. ISPRS J. Photogram. Remote Sens 2014, in press. [Google Scholar]
- Johansen, K.; Phinn, S.; Witte, C.; Philip, S.; Newton, L. Mapping banana plantations from object-oriented classification of SPOT-5 imagery. Photogramm. Eng. Remote Sens 2009, 75, 1069–1081. [Google Scholar]
Step | Processing Routines and Algorithms |
---|---|
1 | Using the orthophotos (Figure 5a,b), thresholds were set at the scene level using the Update Variable algorithm for the Normalized Difference Vegetation Index (NDVI) and Green Ratio bands to be produced. |
2 | An NDVI ((NIR − Red)/(NIR + Red)) band was produced using the Layer Arithmetics algorithm (Figure 5c). |
3 | A Green Ratio (Figure 5d), i.e., Green/((Red + Green + Blue)/3), band was produced using the Layer Arithmetics algorithm. |
4 | Multi-threshold segmentation, to separate vegetation from non-vegetation, was based on the Green Ratio band. |
5 | Multi-threshold segmentation based on the NDVI band was used to adjust the initial vegetation mapping. |
6 | Objects classified as vegetation were merged (Figure 5e). |
7 | Vegetation objects were refined to adjust object boundaries using the Pixel-Based Object Resizing algorithm and the Growing mode, and the lower NDVI threshold was set at the scene level (in step 1). |
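A minimal sketch of the index calculation and vegetation masking in steps 1–7 is shown below using NumPy. The threshold values are placeholders only, since in the rule set they were set per scene (step 1) rather than fixed, and the boundary refinement of step 7 is omitted.

```python
import numpy as np

def vegetation_mask(blue, green, red, nir, ndvi_thresh=0.3, green_ratio_thresh=1.05):
    """Approximate steps 2-5: compute NDVI and the Green Ratio and threshold
    them to a vegetation mask. Threshold values are illustrative only."""
    blue, green, red, nir = (np.asarray(b, dtype=float) for b in (blue, green, red, nir))
    ndvi = (nir - red) / (nir + red + 1e-6)                    # step 2
    green_ratio = green / ((red + green + blue) / 3.0 + 1e-6)  # step 3
    return (green_ratio > green_ratio_thresh) & (ndvi > ndvi_thresh)  # steps 4-5
```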
Step | Processing Routines and Algorithms |
---|---|
8 | Base objects were created from the Blue, Green, Red and NIR bands, using the Multiresolution Segmentation algorithm [23] with a scale parameter of 10, a shape factor of 0.2 and a compactness factor of 0.5. |
9 | A customized Multi-Condition-Based Fusion algorithm was used to merge neighbouring objects (Figure 6a) if they fulfilled all the following criteria:
|
10 | The Edge Extraction Lee Sigma algorithm was used for detection of both dark and bright edges in the Green band. A sigma value of 5 was used. A higher sigma value results in stronger edge detection (Figure 6b,c). |
11 | An additional band was produced by adding together the dark and bright edge Lee Sigma bands using the Layer Arithmetics algorithm (Figure 6d). |
12 | The Pixel Filters Sliding Window algorithm was used to apply a Gaussian filter with a window size of 39 × 39 pixels to create a smoothed band of the added dark and bright edge Lee Sigma bands, producing bright areas (high pixel values) where many dark and bright edges appeared, such as areas with banana plants (Figure 6e). |
13 | The Line Extraction algorithm was used to search for both dark (blue lines in Figure 6f) and bright (green lines in Figure 6f) lines. |
14 | A number of layer statistics were produced at the scene level using the Update Variable algorithm including:
|
15 | Multi-threshold segmentation based on the layer statistics for the bright line band was used to identify banana plant leaf candidates. |
16 | Banana plant leaf candidate objects were refined using the Pixel-Based Object Resizing algorithm and the Growing mode using different thresholds for bright lines. |
17 | Banana plant leaf candidate objects (bright lines) with a length >50 pixels, i.e., 5 m, were unclassified, as banana plant leaves are generally shorter than 5 m. |
18 | Banana plant leaf candidate objects (bright lines) with a length <10 pixels, i.e., 1 m (green outlined objects in Figure 6g), were unclassified, as most banana plant leaves were longer than 1 m. This also removed a lot of false positives (features incorrectly outlined as banana plants). |
19 | Multi-threshold segmentation based on the layer statistics for the dark line band was used to identify banana plant leaf shadow candidates. |
20 | Banana plant leaf shadow candidate objects were refined using the Pixel-Based Object Resizing algorithm and the Growing mode by setting different thresholds for dark lines. |
21 | Banana plant leaf shadow candidate objects (dark lines) with a length >50 pixels, i.e., 5 m, were unclassified, as banana plant leaf shadows were generally shorter than 5 m. |
22 | Banana plant leaf shadow candidate objects (dark lines) with a length <10 pixels, i.e., 1 m (red outlined objects in Figure 6g), were unclassified, as most banana plant leaf shadows were longer than 1 m. This also removed a lot of false positives. |
23 | Banana plant leaf candidate and leaf shadow candidate objects were transferred to the same level of the base object segmentation (Figure 6h) to ensure the segmented image included all the delineated banana plant leaf candidate and leaf shadow candidate objects. |
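The edge-density and line-length logic of steps 10–22 can be approximated outside eCognition as sketched below. A Sobel gradient stands in for the proprietary Lee Sigma edge extraction, a Gaussian filter approximates the 39 × 39 pixel smoothing window, and the 1–5 m line-length filter assumes the 0.10 m pixel size of the orthophotos; all parameter values are illustrative.

```python
import numpy as np
from scipy import ndimage

def edge_density_band(green, smoothing_sigma_px=6.5):
    """Approximate steps 10-12: detect edges in the green band and smooth them
    so that areas dense in edges (e.g. banana leaves and their shadows) appear
    bright. A Sobel gradient replaces the Lee Sigma edge extraction used in
    eCognition; sigma ~6.5 px roughly matches a 39 x 39 pixel Gaussian window."""
    gx = ndimage.sobel(green.astype(float), axis=1)
    gy = ndimage.sobel(green.astype(float), axis=0)
    edge_strength = np.hypot(gx, gy)              # combines bright and dark edges
    return ndimage.gaussian_filter(edge_strength, sigma=smoothing_sigma_px)

def keep_leaf_length_lines(line_lengths_px, pixel_size_m=0.10, min_m=1.0, max_m=5.0):
    """Steps 17-18 and 21-22: retain only line candidates between 1 m and 5 m,
    the typical length range of banana leaves and their shadows."""
    lengths_m = np.asarray(line_lengths_px, dtype=float) * pixel_size_m
    return (lengths_m >= min_m) & (lengths_m <= max_m)
```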
Step | Processing Routines and Algorithms |
---|---|
24 | Vegetation objects (see Figure 5e) were classified as banana plant candidates (light blue in Figure 7a) using the thresholds set in the layer statistics (step 14a) to identify bright areas in the smoothed dark and bright edge Lee Sigma band, representing areas within which many features with distinct edges occurred. |
25 | Banana plant candidate objects with an area >200 pixels were excluded, as banana plants are small and heterogeneous features producing smaller objects. |
26 | Banana plant candidate objects containing sub-level banana plant leaf candidate and banana plant leaf shadow candidate line objects (Figure 7b) were classified as banana plants (red objects in Figure 7c). |
27 | Banana plant objects were unclassified if they were further away than a distance of 50 pixels, i.e., 5 m, to a neighbouring banana plant object, as banana plants and clusters of banana plants appeared with multiple adjacent objects due to their spectrally heterogeneous appearance. |
28 | Remaining banana plant candidate objects were classified as vegetation (Figure 7d). |
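A rough analogue of the candidate classification in steps 24–28, written with NumPy, is shown below. It keeps vegetation objects that are small, edge-dense and contain leaf or leaf-shadow line pixels; the proximity rule of step 27 and the reassignment in step 28 are noted but omitted for brevity, and all thresholds are illustrative.

```python
import numpy as np

def banana_plant_candidates(veg_labels, edge_density, line_mask,
                            density_thresh, max_area_px=200):
    """Keep vegetation objects that are (a) no larger than 200 px (step 25),
    (b) bright in the smoothed edge band (step 24) and (c) contain leaf or
    leaf-shadow line objects (step 26). Step 27 (drop objects >5 m from any
    other banana object) and step 28 are omitted here for brevity."""
    keep = np.zeros_like(veg_labels, dtype=bool)
    for obj_id in np.unique(veg_labels):
        if obj_id == 0:                                  # 0 = background / non-vegetation
            continue
        mask = veg_labels == obj_id
        if mask.sum() > max_area_px:                     # step 25: too large
            continue
        if edge_density[mask].mean() < density_thresh:   # step 24: not edge-dense
            continue
        if not line_mask[mask].any():                    # step 26: no line objects inside
            continue
        keep |= mask
    return keep
```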
Step | Processing Routines and Algorithms |
---|---|
29 | A buffer distance threshold (of 6 pixels) and a minimum mapping unit area (of 250 pixels) were set using the Update Variable algorithm. |
30 | Vegetation objects were unclassified and banana plant objects were reassigned to banana plant candidate objects, and all neighbouring objects of the same class were merged, i.e., banana plant candidate objects and unclassified objects (Figure 8a). |
31 | A Distance Map was produced with the distance calculated in relation to all banana plant candidate objects (Figure 8b). |
32 | The Pixel-Based Object Resizing algorithm and the Growing mode were used to grow the banana plant candidate objects based on the buffer distance threshold (set in step 29) (Figure 8c). |
33 | Using a number of processing loops, the buffered banana plant candidate objects had their edges smoothed, and if they fell below an area threshold (set in step 29) they were removed (Figure 8c). |
34 | A new Distance Map was produced with the distance calculated in relation to all banana plant candidate objects (Figure 8d). |
35 | The Pixel-Based Object Resizing algorithm and the Coating mode were used to buffer each object (line width of 4 pixels based on the Distance Map) (Figure 8e). |
36 | Banana plant candidate objects were unclassified to leave only the created buffers surrounding the banana plant candidate objects (Figure 8f). |
37 | Buffers covering an area of less than the area threshold (set in step 29) were removed (Figure 8g). |
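The buffering in steps 29–37 can be emulated with a distance transform, as in the hedged sketch below. It returns filled, buffered candidate objects above the minimum mapping unit rather than the thin "coating" ring produced in steps 35–36, and the pixel thresholds simply mirror the values set in step 29.

```python
import numpy as np
from scipy import ndimage

def buffer_candidates(candidate_mask, buffer_px=6, min_area_px=250):
    """Sketch of Table 4: grow candidate objects outwards by a fixed buffer
    distance using a distance map, then discard buffered objects smaller than
    the minimum mapping unit. Simplified relative to steps 29-37."""
    # distance of every background pixel to the nearest candidate pixel (step 31)
    dist = ndimage.distance_transform_edt(~candidate_mask)
    grown = candidate_mask | (dist <= buffer_px)          # step 32: grow by buffer
    # remove grown objects below the minimum mapping unit (steps 33/37)
    labels, n = ndimage.label(grown)
    sizes = ndimage.sum(grown, labels, index=np.arange(1, n + 1))
    keep_ids = np.flatnonzero(sizes >= min_area_px) + 1
    return np.isin(labels, keep_ids)
```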
Image Tile | Number of Banana Plant Candidate Objects Mapped | Approximate Time Spent on Visual Assessment | Final Number of Objects Classified as Banana Plants |
---|---|---|---|
1 | 1012 | 12 min | 16 |
2 | 1761 | 15 min | 17 |
3 | 1060 | 13 min | 23 |
4 | 698 | 10 min | 40 |
5 | 2536 | 31 min | 35 |
6 | 542 | 7 min | 32 |
7 | 942 | 11 min | 44 |
8 | 314 | 5 min | 82 |
9 | 820 | 10 min | 22 |
10 | 924 | 11 min | 20 |
© 2014 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).