
WO2021078663A1 - Aerial vehicle detection - Google Patents

Aerial vehicle detection

Info

Publication number
WO2021078663A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
aerial vehicle
aircraft
captured
image sensor
Prior art date
Application number
PCT/EP2020/079308
Other languages
French (fr)
Inventor
William Tulloch
Original Assignee
Airbus Operations Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Airbus Operations Limited filed Critical Airbus Operations Limited
Publication of WO2021078663A1 publication Critical patent/WO2021078663A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00Aircraft indicators or protectors not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/08Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/04Systems determining the presence of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/292Multi-camera tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261Obstacle

Definitions

  • the present invention relates to detection of an aerial vehicle. Particularly, although not exclusively, the present invention relates to the detection of an aerial vehicle by an aircraft.
  • UAVs: unmanned aerial vehicles
  • a first aspect of the present invention provides a method of detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight.
  • the method comprises receiving image data representing a first image captured by a first aircraft-mounted image sensor having a first field of view and processing the image data to determine whether an external aerial vehicle candidate is present in a target space of the first captured image; receiving image data representing a second image captured by a second aircraft-mounted image sensor having a second field of view, which encompasses the target space, and processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image; and generating a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
  • the images are processed to identify the presence of an external aerial vehicle candidate by comparing the images to one or more stored representations of existing aerial vehicles.
  • the one or more stored representations of aerial vehicles are determined by a classifier which is trained to recognise different types of aerial vehicles using supervised training procedures based on images from a library of aerial vehicle images. A number of such libraries already exist and therefore this may reduce the time and complexity of training the image processor.
  • the image data is captured using a 360° image sensor. This may provide an increased field of vision compared to regular image sensors and may result in a larger area of the surface of the aircraft captured.
  • the method may comprise receiving an indication that the external aerial vehicle candidate is present based on an external image captured by a ground-based image sensor.
  • a ground-based image sensor will provide additional verification that the UAV is present in the vicinity of the aircraft in flight.
  • processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image is triggered in response to a determination that an external aerial vehicle candidate is present in a target space of the first captured image.
  • This may provide the advantage of minimising image sensor processing power and associated resources unless an external aerial vehicle is suspected or detected by another image sensor.
  • a location of the external aerial vehicle is triangulated using the first and second captured images.
  • this provides location data about the external aerial vehicle which can be used to help minimise the risk the external aerial vehicle poses.
  • a second aspect of the present invention provides a machine-readable storage medium storing instructions executable by a processor to implement the method according to the first aspect of the present invention.
  • a third aspect of the present invention provides a system for detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight.
  • the system comprises a first image sensor device having a first field of view to capture a first image comprising an external aerial vehicle candidate in a target space of the first image that is in the vicinity of an aircraft; a second image sensor device having a second field of view, which encompasses the target space, to capture a second image comprising the external aerial vehicle candidate; and a processor to generate a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
  • an improved set of data for the authorities is generated with this system. This can lead to a more efficient management of the airspace in the vicinity of the detected UAV and reduce the risks that UAVs pose.
  • At least one image sensor is aircraft mounted.
  • external aerial vehicles in the vicinity of the aircraft are detected with use of on-board image sensors.
  • the field of view of the image sensors may be outwardly facing from the aircraft.
  • At least one image sensor is ground mounted. Such an arrangement provides added protection against external aerial vehicles that are spotted near to an airfield or other ground-based locations.
  • a fourth aspect of the present invention is an aircraft comprising the system according to the third aspect of the present invention.
  • a fifth aspect of the present invention is a processor and stored program code, and at least a first image sensor and a second image sensor, to perform the method of the first aspect of the present invention.
  • Figure 1A is an illustrative plan view of an aircraft, according to an example
  • Figure 1B is an illustrative side elevation of an aircraft, according to an example
  • Figure 2 is an illustrative side elevation of an aircraft in flight, according to an example
  • Figure 3 is another illustrative side elevation of an aircraft in flight, according to an example
  • Figure 4 illustrates two overhead images of a scene superimposed on one another, according to an example
  • Figure 5 is an illustrative view of a scenario, according to an example
  • Figure 6 is a process flow chart of a method, according to an example
  • The present invention takes into account that UAVs are readily available for anyone to purchase and there is little guidance and there are few rules relating to their ownership or use. Where rules exist, they may not be internationally recognised or applied. There have been reported incidents involving commercial aircraft and suspected UAVs, which have resulted in the shutdown of major airports. These are extremely disruptive incidents and come at a large cost to flight operators and flight passengers alike. There have even been incidents in which the presence of a UAV has been reported, causing subsequent disruption, without the presence even being verified. Such is the seriousness of the threat UAVs pose that a mere alleged sighting can ground aircraft for long periods of time.
  • Figure 1A illustrates a plan view of an aircraft 102 in flight
  • Figure 1B illustrates a side elevation of the aircraft 102 in flight.
  • the figures show image sensors mounted at various positions on the aircraft.
  • An image sensor 104, 107 is mounted at a position on the leading edge of each horizontal stabiliser and another image sensor 106 is mounted at a position at the top of the vertical stabiliser.
  • Another image sensor 108, 116 is mounted on the leading edge of each wing tip.
  • An image sensor 110, 118 is mounted on the underside of each wing and an image sensor 112, 119 is mounted atop each engine.
  • An additional image sensor 128 is mounted on the top of the fuselage above the cabin area.
  • an image sensor is any kind of device that is able to capture an image.
  • the device may operate in colour or monochrome, and may operate in the visible, near-IR or IR regions of the electromagnetic spectrum.
  • Such a device typically captures and stores images digitally, and is controllable to communicate captured image data to a local or remote processor for image processing.
  • Known imaging sensors, for example in digital cameras that are adapted for use in adverse (i.e. in-flight) conditions, are suitable for use in examples herein.
  • the plurality of image sensors may be controlled by one or more processors (not pictured).
  • the processor, or processors, may be mounted within the fuselage of the aircraft 102.
  • each image sensor may include a co-located processor that performs at least some control and/or image processing.
  • the image sensors may be controlled centrally.
  • the image sensors may be powered by local power connections taken from the aircraft power network. Control signals and image data may be communicated to and from image sensors via wired or wireless connections.
  • the processor(s) are arranged to process images and/or video captured by the plurality of image sensors to identify external aerial vehicle candidates, such as UAVs.
  • At least some of the image sensors may have a wide or a panoramic field of view, for example greater than 160° horizontally and/or greater than 75° vertically. What each image sensor can see for any given field of view is of course dictated by where the image sensor is mounted on the aircraft and in which direction it is directed. At least one of the image sensors may have a 360° field of view horizontally and 90° or greater vertically. Image sensors may be fixed, for example as applied in catadioptric cameras, and derive their wide fields of view from fixed elements, such as lenses or mirrors. Other image sensors may be movable, such as rotatable, to achieve their fields of view. In any case the image sensors may be interconnected and be in communication with one another either directly or via a central system.
  • Connectivity may use a wireless protocol, such as an Internet of Things (IoT) protocol such as Bluetooth, WiFi, ZigBee, MQTT IoT, CoAP, DSS, NFC, Cellular, AMQP, RFID, Z-Wave, EnOcean and the like.
  • IoT: Internet of Things
  • the fields of view of various image sensors mounted on the plane 102 overlap giving multiple viewpoints of a vicinity or space around the aircraft 102.
  • the fields of view are arranged to be outwardly-looking away from the aircraft 102, so that all regions around the aircraft are visible.
  • Figures 1A and 1B illustrate a resultant field of view 126 that encompasses the entire area around the aircraft 102.
  • in Figure 1B, the aircraft 102 is depicted being approached by a UAV 130.
  • the UAV 130 poses a threat to the safety of the aircraft and is likely to cause disruption if it is not dealt with in an efficient manner.
  • Dealing with the UAV 130 may include, for example, recording its detection, location, velocity, alerting the pilot and notifying other aircraft and aircraft controllers at airports in the vicinity. This can give an improved set of data for the authorities, which can lead to a more efficient management of the airspace in the vicinity of the detected UAV.
  • FIG. 2 illustrates a side elevation of the aircraft 102 in flight.
  • the UAV 130 is approaching the aircraft 102.
  • the UAV 130 is within a field of view 134 of the image sensor 132.
  • Each of the image sensors captures images which are stored and processed in near-real-time to determine whether a UAV candidate is present.
  • a UAV that is spotted by one image sensor is referred to herein as a candidate, whereas the same candidate, if it is spotted by more than one image sensor, is determined to be a UAV sighting.
  • the image sensor 132 determines that there is a UAV candidate 130 in a region of the vicinity of the aircraft 102, referred to herein as a target space.
  • the processor controlling the image sensor 132 notifies other image sensors, having overlapping fields of view with the image sensor 132, to scan their respective target areas in their captured images to determine if the UAV candidate 130 is also present therein.
  • FIG. 3 shows the image sensor 114 having a field of view 136 which overlaps with the field of view 134.
  • the field of view 136 provides an alternate view of the UAV 130, which is used to verify or confirm that a UAV is in the vicinity of the aircraft 102. Therefore, image sensor 114, if its image also comprises the UAV candidate, is used to confirm the presence of the UAV 130. Based on the successful detection and confirmation of the UAV 130, an output indicating the presence of the UAV (as opposed to a ‘UAV candidate’) 130 is generated, along with any other pertinent details that have been surmised, such as size, distance and velocity.
  • any of the plurality of image sensors with overlapping fields of view may be used to confirm that the UAV 130 is in the external area of the aircraft 102.
  • any two of the plurality of image sensors may be used to triangulate the location of the UAV 130.
  • Figure 3 illustrates other objects beneath the aircraft 102.
  • the objects include a first tree 140, a second tree 142 and a car 144.
  • the car 144 may be stationary or moving.
  • the objects are in the field of view of at least one image sensor.
  • FIG. 4 illustrates two overhead images of the ground 138 captured at two different times.
  • the images are superimposed over one another.
  • the two images were captured, one after the other, for instance 0.2s apart, by the same aircraft-mounted image sensor.
  • Each image includes a first tree 141, a second tree 142, a road 143, a vehicle 144 on the road and a UAV 130.
  • Each object in the image is designated with a first reference numeral (e.g. 141 for the first tree) that denotes the position of the object in the first image, and a second reference numeral (e.g. 141′ for the first tree) that denotes the position of the object in the second image.
  • the objects are at least initially assumed to be on the ground 138.
  • Non-moving objects may be identified by reference to the respective locations in the consecutive images and with knowledge of the ground velocity and altitude of the aircraft.
  • non-moving object may be identified by reference to libraries of similar images (e.g. such as for roads and trees), by using a trained classifier, or by reference to satellite images and maps of a respective landscape.
  • the processor is arranged to compare the two images and determine that certain matching objects have not moved (e.g. the trees and the road), whereas certain other objects (e.g. the car 144 and the UAV 130) have moved.
  • the speed or velocity of the objects that are moving can be determined by reference to their different positions in the images relative to the static objects, and with knowledge of the ground velocity and altitude of the aircraft. For example, d1 is estimated to be about 1.8m, whereas d2 is estimated to be about 6x that distance, or 10.8m.
  • a car travelling 1.8m in 0.2 seconds has a ground speed of 32.4 km/h. Meanwhile, the UAV has a calculated apparent ground speed of 194.4 km/h.
  • the processor is arranged to determine that an object moving at such a high apparent ground speed (for example, a threshold apparent ground speed may be higher than 120 km/h or higher than 150 km/h) is in flight and, in fact, nearer to the aircraft.
  • the UAV 130 has moved the greatest distance across the field of view of the image sensor, and it is determined not to be moving as it would if it were a moving object on the ground 138 such as, for example, a car.
  • the processor controlling the image sensor differentiates between objects on the ground that are moving as they should be, given knowledge of the altitude and ground velocity of the aircraft, and objects that are not moving as they should be. In the latter case, if it is clear that the objects are moving further/faster than a typical ground object (static or moving relatively slowly), the processor deduces that they could be UAVs.
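The differentiation described above can be sketched numerically. The helpers below assume that the per-frame displacement has already been scaled to metres using the aircraft's altitude and corrected for the aircraft's own ground velocity; the function names and the default 150 km/h threshold are illustrative, not taken from the patent.

```python
def apparent_ground_speed_kmh(displacement_m: float, interval_s: float) -> float:
    """Speed implied by an object's displacement between two frames,
    assuming the displacement is already in metres on the ground plane
    and corrected for the aircraft's own motion."""
    return displacement_m / interval_s * 3.6  # m/s -> km/h

def is_airborne_candidate(speed_kmh: float, threshold_kmh: float = 150.0) -> bool:
    # An apparent ground speed well above plausible road speeds suggests
    # the object is nearer the aircraft than the ground, i.e. in flight.
    return speed_kmh > threshold_kmh
```

With the figures from the example, the car's 1.8 m displacement over 0.2 s gives 32.4 km/h and is left as a ground object, while the UAV's displacement of about six times that gives 194.4 km/h and is flagged as a possible airborne object.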
  • once the UAV candidate has been identified by two image sensors, it is determined to be a UAV
  • triangulation can be performed (given a known spacing on the aircraft between the respective image sensors) to determine the altitude of the UAV, its distance from the aircraft and its velocity. The distance from the aircraft and velocity determine a threat level posed by the UAV.
  • the image processing uses known object detection algorithms to identify moving objects and compare the objects to a library of known objects, including known UAVs.
  • the image processing comprises a trained classifier to identify UAV candidates. The classifier may be trained to identify UAV candidates based on movement characteristics and/or based on pattern or image matching.
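As a stand-in for the pattern- or image-matching route just described, the sketch below scores a small template at every position of an image using normalised cross-correlation. A production system would use a trained classifier and a real library of known objects; every name here is illustrative.

```python
import numpy as np

def best_match(image: np.ndarray, template: np.ndarray):
    """Return (row, col, score) of the image patch most correlated with
    the template, using normalised cross-correlation."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best = (0, 0, -1.0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            p_norm = np.linalg.norm(p)
            if p_norm == 0:  # flat patch: correlation undefined, skip
                continue
            score = float((t * p).sum() / (t_norm * p_norm))
            if score > best[2]:
                best = (r, c, score)
    return best
```

An exact copy of the template scores 1.0 at its true location, so a threshold on the returned score can serve as the "candidate present / absent" decision for one sensor.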
  • FIG. 5 is an illustrative view of a scenario according to an example.
  • the processor, or processors controlling the image sensors on aircraft 102 may connect to a wide area network of other systems.
  • For example, an image sensor 162, mounted on a control tower 160, has a field of view 164 encompassing the UAV.
  • the image processor controlling the image sensor 162 determines that the UAV is present and generates a signal indicating it.
  • There is also a second aircraft 170 with a plurality of image sensors mounted in similar positions to those on aircraft 102.
  • the processor controlling the image sensor 172 determines the presence of the UAV 130 and generates a signal indicating its presence.
  • the image sensor is described as being mounted on the control tower 160 but may be mounted at other ground-based locations such as aircraft hangars, other buildings, masts, or ground-based vehicles.
  • the system controlling the plurality of image sensors sends out a signal to other systems that have image sensors with overlapping views of the external area of the aircraft 102. Where fields of view of other image sensors are controllable, those image sensors may adjust their respective field of view to view the UAV candidate.
  • the other image sensors perform a similar process to that which has been described to verify that the UAV 130 is present in the external area of the aircraft 102. More broadly, a first sighting of a UAV candidate by any of the sensors (i.e. ground or aircraft-mounted) illustrated in Figure 5 may generate a signal that causes any other of the image sensors that has an overlapping field of view to confirm the UAV candidate as being a UAV. Such an arrangement provides added protection against UAVs that are spotted near to an airfield.
  • Figure 6 is a process flow chart of a method 600 of determining if there is an aerial vehicle in the external area of an aircraft, for example, using the plurality of image sensors mounted on the aircraft 102 as described in relation to Figures 1 to 5.
  • the method receives first image data having a first field of view of the external area of the aircraft 102.
  • the received first image data is processed to determine the presence of a UAV candidate.
  • second image data is received from a second image sensor having a field of view that encompasses the target area of the first image containing the UAV candidate.
  • the second image data is processed to determine if the UAV candidate is present.
  • a signal is generated to indicate the presence of the UAV if it is determined that there is one in the first and second image data.
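Taken together, the steps of method 600 amount to a gated confirmation: the second image is only consulted once the first reports a candidate, and the presence signal is generated only when both sensors agree. A minimal sketch, in which the detector callables and the returned signal format are assumptions for illustration rather than anything specified by the patent:

```python
from typing import Callable, Optional, Tuple

Position = Tuple[float, float]  # candidate location in a shared reference frame

def confirm_uav(
    first_image: object,
    second_image: object,
    detect_in_first: Callable[[object], Optional[Position]],
    detect_in_second: Callable[[object, Position], Optional[Position]],
) -> Optional[dict]:
    # Process the first image: is a candidate present in the target space?
    candidate = detect_in_first(first_image)
    if candidate is None:
        return None  # nothing to confirm, second sensor not consulted
    # Process the second image, whose field of view encompasses the target
    # space, to check whether the same candidate appears there.
    confirmed = detect_in_second(second_image, candidate)
    if confirmed is None:
        return None  # candidate not verified by the second sensor
    # Both sensors agree: generate the presence signal.
    return {"signal": "UAV_PRESENT", "first_fix": candidate, "second_fix": confirmed}
```

A caller would pass in whatever per-sensor detector it uses (template matching, a trained classifier, and so on); only a non-None result from both detectors produces the signal.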

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Astronomy & Astrophysics (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed is a method of detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight. The method comprises receiving image data representing a first image captured by a first aircraft-mounted image sensor having a first field of view and processing the image data to determine whether an external aerial vehicle candidate is present in a target space of the first captured image; and receiving image data representing a second image captured by a second aircraft-mounted image sensor having a second field of view, which encompasses the target space, and processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image. Finally, the method comprises generating a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.

Description

AERIAL VEHICLE DETECTION
TECHNICAL FIELD
[0001] The present invention relates to detection of an aerial vehicle. Particularly, although not exclusively, the present invention relates to the detection of an aerial vehicle by an aircraft.
BACKGROUND
[0002] Commercial unmanned aerial vehicles (UAVs) are now widely available and affordable. UAVs pose a potential threat to aircraft and aircraft operations.
SUMMARY
[0003] A first aspect of the present invention provides a method of detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight. The method comprises receiving image data representing a first image captured by a first aircraft-mounted image sensor having a first field of view and processing the image data to determine whether an external aerial vehicle candidate is present in a target space of the first captured image; receiving image data representing a second image captured by a second aircraft-mounted image sensor having a second field of view, which encompasses the target space, and processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image; and generating a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image. With this method an improved set of data for the authorities is generated. This can lead to more efficient management of the airspace in the vicinity of the detected UAV and reduce the risks that UAVs pose.
[0004] Optionally, the images are processed to identify the presence of an external aerial vehicle candidate by comparing the images to one or more stored representations of existing aerial vehicles. Furthermore, the one or more stored representations of aerial vehicles are determined by a classifier which is trained to recognise different types of aerial vehicles using supervised training procedures based on images from a library of aerial vehicle images. A number of such libraries already exist and therefore this may reduce the time and complexity of training the image processor.
[0005] Optionally, the image data is captured using a 360° image sensor. This may provide an increased field of vision compared to regular image sensors and may result in a larger area of the surface of the aircraft captured.
[0006] Optionally, the method may comprise receiving an indication that the external aerial vehicle candidate is present based on an external image captured by a ground-based image sensor. A ground-based image sensor will provide additional verification that the UAV is present in the vicinity of the aircraft in flight.
[0007] Optionally, processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image is triggered in response to a determination that an external aerial vehicle candidate is present in a target space of the first captured image. This may provide the advantage of minimising image sensor processing power and associated resources unless an external aerial vehicle is suspected or detected by another image sensor.
[0008] Optionally, a location of the external aerial vehicle is triangulated using the first and second captured images. Beneficially, this provides location data about the external aerial vehicle which can be used to help minimise the risk the external aerial vehicle poses.
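A minimal planar sketch of that triangulation, assuming each sensor reports the bearing of the candidate relative to the baseline joining the two sensors (the known spacing between them on the aircraft); the function and variable names are illustrative:

```python
import math

def triangulate_range(baseline_m: float, bearing1_rad: float, bearing2_rad: float):
    """Given two sensors separated by baseline_m, each measuring the angle
    between the baseline and its line of sight to the target, return
    (range from sensor 1, perpendicular offset of the target from the baseline)."""
    apex = math.pi - bearing1_rad - bearing2_rad  # angle at the target
    if apex <= 0:
        raise ValueError("bearings do not intersect in front of the baseline")
    # Law of sines in the sensor1-sensor2-target triangle.
    range1 = baseline_m * math.sin(bearing2_rad) / math.sin(apex)
    offset = range1 * math.sin(bearing1_rad)
    return range1, offset
```

For instance, with a 30 m sensor spacing and two 45° bearings, the target sits 15 m off the baseline; repeating the fix across consecutive frame pairs would then yield a velocity estimate.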
[0009] A second aspect of the present invention provides a machine-readable storage medium storing instructions executable by a processor to implement the method according to the first aspect of the present invention.
[0010] A third aspect of the present invention provides a system for detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight. The system comprises a first image sensor device having a first field of view to capture a first image comprising an external aerial vehicle candidate in a target space of the first image that is in the vicinity of an aircraft; a second image sensor device having a second field of view, which encompasses the target space, to capture a second image comprising the external aerial vehicle candidate; and a processor to generate a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image. Beneficially, an improved set of data for the authorities is generated with this system. This can lead to more efficient management of the airspace in the vicinity of the detected UAV and reduce the risks that UAVs pose.
[0011] Optionally, at least one image sensor is aircraft mounted. Advantageously, external aerial vehicles in the vicinity of the aircraft are detected with use of on-board image sensors. Beneficially, the field of view of the image sensors may be outwardly facing from the aircraft.
[0012] Optionally, at least one image sensor is ground mounted. Such an arrangement provides added protection against external aerial vehicles that are spotted near to an airfield or other ground-based locations.
[0013] A fourth aspect of the present invention is an aircraft comprising the system according to the third aspect of the present invention.
[0014] A fifth aspect of the present invention is a processor and stored program code, and at least a first image sensor and a second image sensor, to perform the method of the first aspect of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
[0016] Figure 1A is an illustrative plan view of an aircraft, according to an example;
[0017] Figure 1B is an illustrative side elevation of an aircraft, according to an example;
[0018] Figure 2 is an illustrative side elevation of an aircraft in flight, according to an example;
[0019] Figure 3 is another illustrative side elevation of an aircraft in flight, according to an example;
[0020] Figure 4 illustrates two overhead images of a scene superimposed on one another, according to an example;
[0021] Figure 5 is an illustrative view of a scenario, according to an example; and

[0022] Figure 6 is a process flow chart of a method, according to an example.
DETAILED DESCRIPTION
[0023] The present invention takes into account that UAVs are readily available for anyone to purchase, and that there are few rules and little guidance relating to their ownership or use. Where rules exist, they may not be internationally recognised or applied. There have been reported incidents involving commercial aircraft and suspected UAVs which have resulted in the shutdown of major airports. These are extremely disruptive incidents and come at a large cost to flight operators and flight passengers alike. There have even been incidents where the presence of a UAV has been reported, causing subsequent disruption, without that presence ever being verified. Such is the seriousness of the threat UAVs pose that a mere alleged sighting can ground aircraft for long periods of time.
[0024] Figure 1A illustrates a plan view of an aircraft 102 in flight and Figure 1B illustrates a side elevation of the aircraft 102 in flight. The figures show image sensors mounted at various positions on the aircraft. An image sensor 104, 107 is mounted at a position on the leading edge of each horizontal stabiliser and another image sensor 106 is mounted at a position at the top of the vertical stabiliser. Another image sensor 108, 116 is mounted on the leading edge of each wing tip. An image sensor 110, 118 is mounted on the underside of each wing and an image sensor 112, 119 is mounted atop each engine. There is also an image sensor 114 mounted on the underside of the fuselage toward the cockpit. The image sensor 114 may be mounted on the centre line. An additional image sensor 128 is mounted on the top of the fuselage above the cabin area.
[0025] As used herein, an image sensor is any kind of device that is able to capture an image. The device may operate in colour or monochrome, and may operate in the visible or near-infrared (or infrared) regions of the electromagnetic spectrum. Such a device typically captures and stores images digitally, and is controllable to communicate captured image data to a local or remote processor for image processing. Known imaging sensors, for example those in digital cameras that are adapted for use in adverse (i.e. in-flight) conditions, are suitable for use in the examples herein.
[0026] The plurality of image sensors may be controlled by one or more processors (not pictured). The processor, or processors, may be mounted within the fuselage of the aircraft 102. In some examples, each image sensor may include a co-located processor that performs at least some control and/or image processing. In other examples, the image sensors may be controlled centrally. The image sensors may be powered by local power connections taken from the aircraft power network. Control signals and image data may be communicated to and from image sensors via wired or wireless connections. In some examples, the processor(s) are arranged to process images and/or video captured by the plurality of image sensors to identify external aerial vehicle candidates, such as UAVs.
[0027] At least some of the image sensors may have a wide or a panoramic field of view, for example greater than 160° horizontally and/or greater than 75° vertically. What each image sensor can see for any given field of view is, of course, dictated by where the image sensor is mounted on the aircraft and in which direction it is directed. At least one of the image sensors may have a field of view of 360° horizontally and 90° or greater vertically. Image sensors may be fixed, for example as applied in catadioptric cameras, and derive their wide fields of view from fixed elements, such as lenses or mirrors. Other image sensors may be movable, such as rotatable, to achieve their fields of view. In any case, the image sensors may be interconnected and in communication with one another, either directly or via a central system. Connectivity may use a wireless or Internet of Things (IoT) protocol, such as Bluetooth, WiFi, ZigBee, MQTT, CoAP, DDS, NFC, Cellular, AMQP, RFID, Z-Wave, EnOcean and the like.
[0028] According to examples, the fields of view of the various image sensors mounted on the aircraft 102 overlap, giving multiple viewpoints of a vicinity or space around the aircraft 102. According to the present example, the fields of view are arranged to be outwardly-looking away from the aircraft 102, so that all regions around the aircraft are visible. Figures 1A and 1B illustrate a resultant field of view 126 that encompasses the entire area around the aircraft 102.
[0029] The aircraft 102 in Figure 1B is depicted being approached by a UAV 130. The UAV 130 poses a threat to the safety of the aircraft and is likely to cause disruption if it is not dealt with in an efficient manner. Dealing with the UAV 130 may include, for example, recording its detection, location and velocity, alerting the pilot, and notifying other aircraft and aircraft controllers at airports in the vicinity. This can give an improved set of data for the authorities, which can lead to more efficient management of the airspace in the vicinity of the detected UAV.
[0030] Figure 2 illustrates a side elevation of the aircraft 102 in flight. As previously described, the UAV 130 is approaching the aircraft 102. In this example, the UAV 130 is within a field of view 134 of the image sensor 132. Each of the image sensors captures images, which are stored and processed in near-real-time to determine whether a UAV candidate is present. A UAV that is spotted by one image sensor is referred to herein as a candidate, whereas the same candidate, if it is spotted by more than one image sensor, is determined to be a UAV sighting. As such, the image sensor 132 determines that the UAV candidate 130 is in the vicinity, referred to herein as a target space, of the aircraft 102.

[0031] In response to determining that the UAV candidate 130 is present in the target space of the first captured image, the processor controlling the image sensor 132 notifies other image sensors, having overlapping fields of view with the image sensor 132, to scan the respective target areas in their captured images to determine whether the UAV candidate 130 is also present therein.
[0032] The illustration of Figure 3 shows the image sensor 114 having a field of view 136 which overlaps with the field of view 134. The field of view 136 provides an alternate view of the UAV 130 that is used to verify or confirm that a UAV is in the vicinity of the aircraft 102. Therefore, the image sensor 114, if its captured image also comprises the UAV candidate, is used to confirm the presence of the UAV 130. Based on the successful detection and confirmation of the UAV 130, an output indicating the presence of the UAV (as opposed to a 'UAV candidate') 130 is generated, along with any other pertinent details that have been determined, such as size, distance and velocity. Of course, any of the plurality of image sensors with overlapping fields of view may be used to confirm that the UAV 130 is in the area external to the aircraft 102. Moreover, any two of the plurality of image sensors may be used to triangulate the location of the UAV 130.
[0033] Figure 3 illustrates other objects beneath the aircraft 102. By way of example, the objects include a first tree 140, a second tree 142 and a car 144. The car 144 may be stationary or moving. The objects are in the field of view of at least one image sensor.
[0034] One way of determining that an object in a field of view is a UAV candidate will now be described with reference to Figure 4, which illustrates two overhead images of the ground 138 captured at two different times. The images are superimposed over one another. In this example, the two images were captured, one after the other, for instance 0.2 s apart, by the same aircraft-mounted image sensor. Each image includes a first tree 141, a second tree 142, a road 143, a vehicle 144 on the road and a UAV 130. Each object in the image is designated with a first reference numeral (e.g. 141 for the first tree) that denotes the position of the object in the first image, and a second reference numeral (e.g. 141′ for the first tree) that denotes the position of the object in the second image. The objects are at least initially assumed to be on the ground 138.
[0035] The images are superimposed over one another by object-matching and aligning the non-moving objects. Non-moving objects may be identified by reference to their respective locations in the consecutive images, with knowledge of the ground velocity and altitude of the aircraft. In addition, or alternatively, non-moving objects may be identified by reference to libraries of similar images (e.g. of roads and trees), by using a trained classifier, or by reference to satellite images and maps of the respective landscape.
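The alignment step described above can be illustrated with a minimal sketch: the camera-induced shift between frames is estimated from landmarks assumed to be non-moving, and any object whose displacement differs from that shift is flagged as moving. This is not code from the patent; the pure-translation model, function names and coordinates are all simplifying assumptions for illustration.

```python
# Illustrative sketch: superimposing two frames by estimating the
# camera-induced shift from landmarks assumed to be non-moving.
# A pure-translation model and all coordinates are assumptions.

def estimate_shift(static_matches):
    """Average displacement of matched non-moving landmarks.

    static_matches: list of ((x1, y1), (x2, y2)) pixel pairs giving the same
    landmark's position in frame 1 and frame 2.
    """
    n = len(static_matches)
    dx = sum(p2[0] - p1[0] for p1, p2 in static_matches) / n
    dy = sum(p2[1] - p1[1] for p1, p2 in static_matches) / n
    return dx, dy

def residual_motion(p1, p2, shift):
    """Object displacement left over once the camera-induced shift is removed."""
    dx, dy = shift
    return (p2[0] - p1[0] - dx, p2[1] - p1[1] - dy)

# Two trees shift uniformly with the aircraft's own motion; an object with a
# large residual after alignment is a moving-object (possible UAV) cue.
shift = estimate_shift([((10, 10), (14, 10)), ((80, 40), (84, 40))])
print(residual_motion((50, 50), (54, 50), shift))   # prints (0.0, 0.0)
print(residual_motion((30, 20), (60, 35), shift))   # prints (26.0, 15.0)
```

A real implementation would also correct for rotation and perspective, but a translation-only model is enough to show the principle.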
[0036] According to Figure 4, it is clear that, as between the two images, the first and second trees, 141, 142, have not moved: they are non-moving objects. The vehicle 144, 144′, meanwhile, has moved by a relatively small distance, d1, along the road 143 (which has, of course, not moved). The UAV 130, 130′, in the same period of time, has moved a relatively significant distance, d2, in a direction that is not aligned with the road 143.
[0037] The processor is arranged to compare the two images and determine that certain matching objects have not moved (e.g. the trees and the road), whereas certain other objects (e.g. the car 144 and the UAV 130) have moved. The speed or velocity of the moving objects can be determined by reference to their different positions in the images relative to the static objects, and with knowledge of the ground velocity and altitude of the aircraft. For example, d1 is estimated to be about 1.8 m, whereas d2 is estimated to be about six times that distance, or 10.8 m. A car travelling 1.8 m in 0.2 seconds has a ground speed of 32.4 km/h. Meanwhile, the UAV has a calculated apparent ground speed of 194.4 km/h. According to one example, the processor is arranged to determine that an object moving at such a high apparent ground speed (for example, a threshold apparent ground speed may be higher than 120 km/h or higher than 150 km/h) is in flight and, in fact, nearer to the aircraft.
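The arithmetic in this example can be sketched as a simple threshold test. The function names and the 150 km/h default threshold are illustrative; the displacements follow the example's figures (1.8 m for the car, and 6 × 1.8 m = 10.8 m over 0.2 s for the UAV, which yields the quoted 194.4 km/h).

```python
# Illustrative sketch of the apparent-ground-speed test described above.
def apparent_ground_speed_kmh(displacement_m, interval_s):
    """Convert a displacement over an inter-frame interval to km/h."""
    return displacement_m / interval_s * 3.6

def is_airborne_candidate(displacement_m, interval_s, threshold_kmh=150.0):
    """Flag objects whose apparent ground speed exceeds a threshold,
    suggesting they are airborne and nearer to the aircraft."""
    return apparent_ground_speed_kmh(displacement_m, interval_s) > threshold_kmh

print(apparent_ground_speed_kmh(1.8, 0.2))    # car: ~32.4 km/h
print(apparent_ground_speed_kmh(10.8, 0.2))   # UAV candidate: ~194.4 km/h
print(is_airborne_candidate(10.8, 0.2))       # prints True
```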
[0038] In more general terms, the UAV 130 has moved the greatest distance across the field of view of the image sensor, and it is determined not to be moving as it would be expected to if it were a moving object on the ground 138, such as a car. The processor controlling the image sensor differentiates between objects on the ground that are moving as expected, given knowledge of the altitude and ground velocity of the aircraft, and objects that are not. In the latter case, if the objects are clearly moving further or faster than a typical ground object (static or moving relatively slowly), the processor deduces that they could be UAVs.
[0039] Once the UAV candidate has been identified by two image sensors, and is determined to be a UAV, triangulation can be performed (given a known spacing on the aircraft between the respective image sensors) to determine the altitude of the UAV, its distance from the aircraft and its velocity. The distance from the aircraft and velocity determine a threat level posed by the UAV.
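The triangulation mentioned above can be sketched in two dimensions: with the two sensors a known baseline apart and a bearing to the candidate from each, the two bearing rays can be intersected to give a position and range. The flat geometry, angle convention and parameter names are simplifying assumptions, not the patent's method in full.

```python
import math

# Illustrative 2-D triangulation sketch. Sensor A is at (0, 0), sensor B at
# (baseline_m, 0); each angle is the bearing from the baseline to the target.
def triangulate(baseline_m, angle_a_deg, angle_b_deg):
    a = math.radians(angle_a_deg)
    b = math.radians(angle_b_deg)
    # Intersect the bearing rays: y = x*tan(a) and y = (baseline - x)*tan(b)
    x = baseline_m * math.tan(b) / (math.tan(a) + math.tan(b))
    y = x * math.tan(a)
    return x, y

x, y = triangulate(10.0, 45.0, 45.0)    # symmetric case: target at (5, 5)
print(round(math.hypot(x, y), 2))       # range from sensor A: prints 7.07
```

Repeating the fix over successive frame pairs would give the candidate's velocity, from which a threat level can be derived as described above.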
[0040] An alternative or additional way to identify UAV candidates applies image processing. In one example, the image processing uses known object detection algorithms to identify moving objects and compare the objects to a library of known objects, including known UAVs. In another example, the image processing comprises a trained classifier to identify UAV candidates. The classifier may be trained to identify UAV candidates based on movement characteristics and/or based on pattern or image matching.
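One way to realise the library comparison described above is nearest-neighbour matching of a candidate's feature vector against stored signatures of known aerial vehicles. The feature values, library entries and distance threshold below are all hypothetical; a deployed system would use a trained classifier as the text notes.

```python
import math

# Hypothetical signature library: feature vectors for known aerial vehicle
# types. The features, entries and threshold are illustrative assumptions.
LIBRARY = {
    "quadcopter": (0.9, 0.2, 0.4),
    "fixed_wing": (0.3, 0.8, 0.6),
}

def closest_match(features, library, max_distance=0.5):
    """Return the library entry nearest to the candidate's feature vector,
    or None if nothing is close enough to count as a match."""
    best_name, best_dist = None, float("inf")
    for name, reference in library.items():
        d = math.dist(features, reference)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= max_distance else None

print(closest_match((0.85, 0.25, 0.4), LIBRARY))   # prints quadcopter
print(closest_match((0.0, 0.0, 0.0), LIBRARY))     # prints None
```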
[0041] Figure 5 is an illustrative view of a scenario according to an example. The processor, or processors, controlling the image sensors on the aircraft 102 may connect to a wide area network of other systems. For example, an image sensor 162 mounted on a control tower 160 has a field of view 164 that encompasses the UAV. The image processor controlling the image sensor 162 determines that the UAV is present and generates a signal indicating this. Also nearby the aircraft 102 is a second aircraft 170 with a plurality of image sensors mounted in similar positions to those on the aircraft 102. An image sensor 172 mounted on the aircraft 170 has a field of view 174 that encompasses the target area of the UAV 130. The processor controlling the image sensor 172 determines the presence of the UAV 130 and generates a signal indicating its presence. In this example, the image sensor 162 is described as being mounted on the control tower 160, but it may instead be mounted at other ground-based locations such as aircraft hangars, other buildings, masts, or ground-based vehicles.
[0042] When the aircraft 102 identifies an aerial vehicle, such as the UAV 130, the system controlling the plurality of image sensors sends out a signal to other systems that have image sensors with overlapping views of the area external to the aircraft 102. Where the fields of view of other image sensors are controllable, those image sensors may adjust their respective fields of view to view the UAV candidate. The other image sensors perform a similar process to that which has been described, to verify that the UAV 130 is present in the area external to the aircraft 102. More broadly, a first sighting of a UAV candidate by any of the sensors (i.e. ground- or aircraft-mounted) illustrated in Figure 5 may generate a signal that causes any other of the image sensors that has an overlapping field of view to confirm the UAV candidate as being a UAV. Such an arrangement provides added protection against UAVs that are spotted near to an airfield.
[0043] Figure 6 is a process flow chart of a method 600 of determining whether there is an aerial vehicle in the area external to an aircraft, for example using the plurality of image sensors mounted on the aircraft 102 as described in relation to Figures 1 to 5. At block 602, the method receives first image data having a first field of view of the external area of the aircraft 102. Next, at block 604, the received first image data is processed to determine the presence of a UAV candidate. At block 606, second image data is received from a second image sensor having a field of view that encompasses part of the first field of view, or the target area containing the UAV candidate. At block 608, the second image data is processed to determine whether the UAV candidate is present. Finally, at block 610, a signal is generated to indicate the presence of the UAV if it is determined to be present in both the first and second image data.
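The flow of method 600 can be sketched as a two-stage confirmation, where a hypothetical per-image detector stands in for the image processing of blocks 604 and 608; the detector and signal format are illustrative only.

```python
# Sketch of method 600 (blocks 602-610). find_candidate is a hypothetical
# per-image detector standing in for blocks 604 and 608; it returns the
# detected candidate, or None if no candidate is found.
def method_600(first_image, second_image, find_candidate):
    candidate = find_candidate(first_image)       # blocks 602 and 604
    if candidate is None:
        return None                               # no candidate to confirm
    if find_candidate(second_image) is None:      # blocks 606 and 608
        return None                               # sighting not confirmed
    return {"signal": "UAV_PRESENT", "candidate": candidate}   # block 610

# Toy detector: treats any image (a list of labels) containing "uav" as a hit.
detect = lambda image: "uav" if "uav" in image else None
print(method_600(["tree", "uav"], ["uav"], detect))  # signal generated
print(method_600(["tree"], ["uav"], detect))         # prints None
```

Requiring both sensors to agree before generating the signal is what distinguishes a confirmed UAV from a mere candidate in the scheme described above.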
[0044] It is to be noted that the term "or" as used herein is to be interpreted to mean "and/or", unless expressly stated otherwise. The term "aircraft" has been used, but it will be appreciated that any vehicle may be used, such as boats, cars, lorries, ships, or unmanned aerial vehicles.

Claims

CLAIMS:
1. A method of detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight comprising: receiving image data representing a first image captured by a first aircraft- mounted image sensor having a first field of view and processing the image data to determine whether an external aerial vehicle candidate is present in a target space of the first captured image; receiving image data representing a second image captured by a second aircraft-mounted image sensor having a second field of view, which encompasses the target space, and processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image; and generating a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
2. A method according to claim 1, wherein the images are processed to identify the presence of an external aerial vehicle candidate by comparing the images to one or more stored representations of existing aerial vehicles.
3. A method according to claim 2, wherein the one or more stored representations of aerial vehicles is determined by a classifier which is trained to recognise different types of aerial vehicles using supervised training procedures based on images from a library of aerial vehicle images.
4. A method according to claim 1, wherein the images are processed to identify the presence of an external aerial vehicle candidate by detecting the relative motion of the external aerial vehicle candidate.
5. A method according to any preceding claim, wherein the image data is captured using a 360° image sensor.
6. A method according to any preceding claim, comprising receiving an indication that the external aerial vehicle candidate is present based on an external image captured by a ground-based image sensor.
7. A method according to any one of the preceding claims, wherein the processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image is triggered in response to a determination that an external aerial vehicle candidate is present in a target space of the first captured image.
8. A method according to any one of the preceding claims, wherein a location of the external aerial vehicle is triangulated using the first and second captured images.
9. A machine-readable storage medium comprising instructions executable by a processor to implement the method according to any one of claims 1 to 8.
10. A system for detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight comprising: a first image sensor device having a first field of view to capture a first image comprising an external aerial vehicle candidate in a target space of the first image that is in the vicinity of an aircraft; a second image sensor device having a second field of view, which encompasses the target space, to capture a second image comprising the external aerial vehicle candidate; and a processor to generate a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
11. A system according to claim 10, wherein at least one image sensor is aircraft mounted.
12. A system according to claim 11, wherein at least one image sensor is ground mounted.
13. An aircraft comprising a system according to any one of claims 10 to 12.
14. An aircraft comprising a processor and stored program code, and at least a first image sensor and a second image sensor, to perform the method of any one of claims 1 to 8.
PCT/EP2020/079308 2019-10-23 2020-10-19 Aerial vehicle detection WO2021078663A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1915388.1 2019-10-23
GB1915388.1A GB2588893A (en) 2019-10-23 2019-10-23 Aerial vehicle detection

Publications (1)

Publication Number Publication Date
WO2021078663A1 true WO2021078663A1 (en) 2021-04-29

Family

ID=68728380

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/079308 WO2021078663A1 (en) 2019-10-23 2020-10-19 Aerial vehicle detection

Country Status (2)

Country Link
GB (1) GB2588893A (en)
WO (1) WO2021078663A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5581250A (en) * 1995-02-24 1996-12-03 Khvilivitzky; Alexander Visual collision avoidance system for unmanned aerial vehicles
US6804607B1 (en) * 2001-04-17 2004-10-12 Derek Wood Collision avoidance system and method utilizing variable surveillance envelope
US20150302858A1 (en) * 2014-04-22 2015-10-22 Brian Hearing Drone detection and classification methods and apparatus
EP3121763A1 (en) * 2015-07-24 2017-01-25 Honeywell International Inc. Helo bumper system using a camera for obstacle detection
CN107831777A (en) * 2017-09-26 2018-03-23 中国科学院长春光学精密机械与物理研究所 A kind of aircraft automatic obstacle avoiding system, method and aircraft
CN108168706A (en) * 2017-12-12 2018-06-15 河南理工大学 A kind of multispectral infrared imaging detecting and tracking system for monitoring low-altitude unmanned vehicle
US20190025858A1 (en) * 2016-10-09 2019-01-24 Airspace Systems, Inc. Flight control using computer vision
EP3447436A1 (en) * 2017-08-25 2019-02-27 Aurora Flight Sciences Corporation Aerial vehicle interception system
WO2019163454A1 (en) * 2018-02-20 2019-08-29 ソフトバンク株式会社 Image processing device, flying object, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108447075B (en) * 2018-02-08 2020-06-23 烟台欣飞智能系统有限公司 Unmanned aerial vehicle monitoring system and monitoring method thereof


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GIANCARMINE FASANO ET AL: "Multi-sensor data fusion: A tool to enable UAS integration into civil airspace", DIGITAL AVIONICS SYSTEMS CONFERENCE (DASC), 2011 IEEE/AIAA 30TH, IEEE, 16 October 2011 (2011-10-16), pages 5C3 - 1, XP032069380, ISBN: 978-1-61284-797-9, DOI: 10.1109/DASC.2011.6096082 *
SCHUMANN ARNE ET AL: "Deep cross-domain flying object classification for robust UAV detection", 2017 14TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED VIDEO AND SIGNAL BASED SURVEILLANCE (AVSS), IEEE, 29 August 2017 (2017-08-29), pages 1 - 6, XP033233410, DOI: 10.1109/AVSS.2017.8078558 *

Also Published As

Publication number Publication date
GB201915388D0 (en) 2019-12-04
GB2588893A (en) 2021-05-19

Similar Documents

Publication Publication Date Title
US12066840B2 (en) Method and system for providing route of unmanned air vehicle
US20210358311A1 (en) Automated system of air traffic control (atc) for at least one unmanned aerial vehicle (uav)
RU2692306C2 (en) Tracking system for unmanned aerial vehicles
CN107871405B (en) Detection and assessment of air crash threats using visual information
US11827352B2 (en) Visual observer for unmanned aerial vehicles
CN110612234A (en) System and method for calibrating vehicle sensors
JP2023538589A (en) Unmanned aircraft with resistance to hijacking, jamming, and spoofing attacks
US11132909B2 (en) Drone encroachment avoidance monitor
JP2020509363A (en) Method and system using a networked phased array antenna application for detecting and / or monitoring moving objects
RU2755603C2 (en) System and method for detecting and countering unmanned aerial vehicles
US20210088652A1 (en) Vehicular monitoring systems and methods for sensing external objects
RU2746090C2 (en) System and method of protection against unmanned aerial vehicles in airspace settlement
KR20190021875A (en) System and its Method for Detecting and Defeating Small Unmanned Aircrafts
Zarandy et al. A novel algorithm for distant aircraft detection
KR102290533B1 (en) RTK-GPS interlocking system and method for detecting and responding to illegal flight
KR20200131081A (en) Drone control system and control method for countering hostile drones
KR101483058B1 (en) Ground control system for UAV anticollision
Minwalla et al. Experimental evaluation of PICAS: An electro-optical array for non-cooperative collision sensing on unmanned aircraft systems
WO2021078663A1 (en) Aerial vehicle detection
RU2746102C1 (en) System and method for protecting the controlled area from unmanned vehicles
CN115602003A (en) Unmanned aerial vehicle flight obstacle avoidance method, system and readable storage medium
US20240096099A1 (en) Intrusion determination device, intrusion detection system, intrusion determination method, and program storage medium
US20230010630A1 (en) Anti-collision system for an aircraft and aircraft including the anti-collision system
JP7574935B2 (en) Suspicious machine handling device, suspicious machine handling system, suspicious machine handling method, and computer program
US20240248477A1 (en) Multi-drone beyond visual line of sight (bvlos) operation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20799631

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20799631

Country of ref document: EP

Kind code of ref document: A1