US20190066523A1 - Method and system for aircraft taxi strike alerting using adaptive field of view - Google Patents
- Publication number
- US20190066523A1 (U.S. Application No. 15/683,215)
- Authority
- US
- United States
- Prior art keywords
- view
- aircraft
- illumination field
- structured light
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/04—Anti-collision systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/02—Arrangements or adaptations of signal or lighting devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/46—Indirect determination of position data
- G01S17/48—Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/04—Anti-collision systems
- G08G5/045—Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/06—Traffic control systems for aircraft, e.g. air-traffic control [ATC] for control when on the ground
- G08G5/065—Navigation or guidance aids, e.g. for taxiing or rolling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H04N5/2256—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/87—Combinations of systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B3/00—Audible signalling systems; Audible personal calling systems
- G08B3/10—Audible signalling systems; Audible personal calling systems using electric transmission; using electromagnetic transmission
Definitions
- Pilots are located in a central cockpit where they are well positioned to observe objects that are directly in front of the cabin of the aircraft. Objects that are not located directly in front of the cabin, however, can be more difficult to observe.
- Wings are attached to the cabin behind the cockpit and extend laterally from the cabin in both directions. Some commercial and some military aircraft have large wingspans, and so the wings on these aircraft laterally extend a great distance from the cabin and are thus positioned behind and out of the field of view of the cockpit. Some commercial and some military planes have engines that hang below the wings of the aircraft. Pilots, positioned in the cabin, can have difficulty knowing the risk of collisions between objects external to the aircraft and the wingtips and/or engines.
- The taxi-in and taxi-out phases require that the aircraft move between the runway and the terminal gates, for example.
- The aircraft must first transition from the runway to a taxiway and then to the gateway.
- The taxiway can include an elaborate network of roads, requiring the aircraft to travel over straight stretches as well as turns and transitions to/from the taxiway.
- Some high-speed taxi operation occurs on one-way taxiways dedicated to aircraft only. During such high-speed taxi operation, relatively distant objects located in the forward direction of the aircraft might present the greatest risk of collision to the aircraft.
- An adaptive field of view for an aircraft on-ground collision alerting system would be useful to facilitate surveillance of the areas most likely to contain external objects that present a risk of collision with the aircraft.
- Apparatus and associated methods relate to a system for calculating position values and/or range data of object(s) external to an aircraft.
- The system includes a mode selector configured to determine an illumination field of view.
- The system includes a projector mounted at a projector location on the aircraft and configured to project structured light within the illumination field of view, thereby illuminating the object(s) external to the aircraft that are within the illumination field of view.
- The system includes a camera mounted at a camera location on the aircraft and configured to receive a portion of the structured light reflected by the object(s) within the illumination field of view.
- The camera is further configured to focus the received portion onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of the object(s) within the illumination field of view.
- The image includes pixel data generated by the plurality of light-sensitive pixels.
- The system also includes an image processor configured to identify pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused.
- The image processor is further configured to use triangulation, based on the projector location, the camera location, and the identified pixel coordinates, to calculate the position values and/or range data of the object(s) within the illumination field of view.
- Some embodiments relate to a method for generating an alert signal of a potential aircraft collision for a taxiing aircraft.
- The method includes the step of determining an illumination field of view. Then, structured light is projected within the determined illumination field of view. A portion of the structured light reflected by object(s) within the illumination field of view is received. The received portion is focused onto a focal plane array having a plurality of light-sensitive pixels, thereby forming an image of the object(s) within the illumination field of view. The image includes pixel data generated by the plurality of light-sensitive pixels. Then, the method identifies pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused.
- The method then calculates, based on the projector location, the camera location, and the identified pixel coordinates, position values and/or range data of the object(s) within the illumination field of view by which the structured light is reflected.
- An alert signal is generated if the calculated position values and range data of the object(s) indicate that one or more of the object(s) are within a collision zone or on a collision trajectory.
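The alerting step above can be illustrated with a short decision function. This is a hypothetical sketch, not language from the claims: the function name, the time-to-collision test used as a stand-in for "collision trajectory," and all thresholds are assumptions for illustration.

```python
def alert_needed(detections, collision_zone_m, closing_speed_mps, ttc_limit_s=10.0):
    """Generate an alert if any detected object is inside the collision
    zone, or would be reached within ttc_limit_s at the current closing
    speed (a simple proxy for being on a collision trajectory).

    detections: list of (azimuth_deg, range_m) tuples from the
    triangulation step described above.
    """
    for _azimuth_deg, range_m in detections:
        if range_m <= collision_zone_m:
            return True
        if closing_speed_mps > 0 and range_m / closing_speed_mps <= ttc_limit_s:
            return True
    return False
```

A real system would evaluate each object's trajectory against the swept path of the wingtips and engines; the range-only test here is deliberately minimal.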
- FIG. 1A is a schematic view of an exemplary aircraft collision alerting system used by an aircraft on a taxiway.
- FIG. 1B depicts an image captured by a camera of the collision alerting system mounted to the aircraft depicted in FIG. 1A .
- FIG. 2A is a schematic view of an exemplary aircraft collision alerting system used by an aircraft approaching a gateway.
- FIG. 2B depicts an image captured by a camera of the collision alerting system mounted to the aircraft depicted in FIG. 2A .
- FIG. 3 is a plan view of an airport showing various regions in which various types of on-ground operations are performed.
- FIG. 4 is a schematic view depicting various fields of view of an aircraft collision alerting system used by an aircraft during on-ground operations.
- FIG. 5 is a detailed block diagram of the exemplary aircraft collision alerting system depicted in FIG. 2A.
- FIG. 6 is a schematic diagram depicting object location determination using both active and passive imaging.
- FIG. 7 is a block diagram of an embodiment of controller 26 depicted in FIG. 5.
- Apparatus and associated methods relate to controlling, based on a mode selector, the field of view of an external object detector during aircraft taxi operations.
- The field of view can be controlled to have a relatively-small solid-angle of detection capability.
- The relatively-small solid-angle field of view can be aligned so as to detect more distant objects within a narrow corridor extending forward of the aircraft's wingtips.
- The field of view can alternatively be controlled to have a relatively-large solid-angle of detection capability.
- The relatively-large solid-angle field of view can be aligned so as to detect close objects in the vicinity of the aircraft wings and engine nacelles.
- The object detector projects structured light within the controlled field of view, thereby illuminating the object(s) external to the aircraft that are within the illumination field of view.
- Determining locations and/or ranges of objects nearby an aircraft can be performed using triangulation of structured light projected upon and reflected by the objects. Only objects upon which the projected structured light falls can reflect that projected structured light.
- The structured light is projected by a projector that has a controlled field of view of projection. The field of view can be controlled based on a mode of ground operation of the aircraft.
- Triangulation can be used to calculate locations and/or ranges of objects from which the structured light is reflected. The locations and/or ranges can be calculated based on a location of a structured projector, a location of a camera or imager, and the pixel coordinate upon which the reflected structured light is focused.
- The structured light can be a pulse of light projected in a pattern, such as, for example, a pulse having a fixed azimuthal angle of projection but having an elevational angle of projection between +/−5 degrees from the horizontal.
- The structured light can be a collimated beam rastered or scanned in a pattern.
- Various other types of patterned light can be projected.
- The structured light is projected within a controlled field of view. This means that outside of the controlled field of view, substantially no light energy is projected.
- The term structured light indicates that light is projected within the solid-angle of the field of view in such a manner that the projected light is not uniformly projected throughout the solid-angle of projection.
- Light will be primarily projected along certain azimuthal and/or elevational angles comprising a subset of the azimuthal and elevational angles within the solid-angle of light projection.
- Other subsets of the solid-angle of light projection can be used for structured light projection.
- The structured light can have a wavelength corresponding to infrared light and/or to an atmospheric absorption band.
- Infrared light, because it is outside the visible spectrum, can minimize distraction to a pilot who is taxiing the aircraft.
- Infrared light that has a wavelength within an atmospheric absorption band can permit low-power projector illumination, as the illuminating power need not compete with the sun's illumination in such an absorption band. Knowing a first aircraft location from where the light is projected, a second aircraft location where the reflection is imaged, and a pixel coordinate within the image corresponding to an object from which the spatially patterned light is reflected permits a calculation of the location and/or range of that reflecting object.
- FIG. 1A is a schematic view of an exemplary aircraft collision alerting system used by an aircraft on a taxiway.
- First aircraft 10 is taxiing along one-way taxiway 12 at a relatively high speed.
- First aircraft 10 is approaching taxiway crossing 14 .
- Second aircraft 16 is near the taxiway crossing 14 on taxiway 18 .
- First aircraft 10 is equipped with aircraft collision alerting system 20 .
- Aircraft collision alerting system 20 includes projector 22 , camera 24 , and a controller 26 .
- Projector 22 is mounted on vertical stabilizer 28 of tail 30.
- Projector 22 is configured to project structured light 32 onto a scene external to first aircraft 10 , thereby illuminating objects external to first aircraft 10 .
- Projector 22 can be mounted at other locations on first aircraft 10 in other embodiments.
- Controller 26 controls the solid-angle of projection, such that projector 22 projects structured light 32 within the controlled solid-angle of projection.
- The solid-angle of projection includes azimuthal angle of projection 34A.
- The solid-angle of projection represents a small fraction of the full two pi steradians of a half sphere of projection.
- The relatively-small solid-angle of projection is configured to project structured light 32 onto objects, such as second aircraft 16, located within small azimuthal angle 36A of longitudinal axis 38 of first aircraft 10.
- Controller 26 also can control the optical focus and/or dispersion of projected structured light 32, as well as the emission power of structured light 32.
- When controller 26 controls a small solid-angle of projection, such as the one depicted in FIG. 1A, controller 26 might also increase the emission power and collimate the structured light pattern so as to reduce dispersion.
- One or more of such controlled parameters of structured light 32 can facilitate detection of objects that are relatively distant from first aircraft 10.
- FIG. 1B depicts an image captured by a camera of the collision alerting system mounted to the aircraft depicted in FIG. 1A .
- Captured image 40A has a field of view commensurate with the solid-angle of projection of structured light 32.
- In other embodiments, captured image 40A can have a field of view different from the field of view used for structured-light projection.
- Captured image 40 A depicts second aircraft 16 on taxiway 18 .
- Superimposed on taxiway 18 and second aircraft 16 are lines 32A-32D of structured light 32. Because projector 22 and camera 24 are mounted to first aircraft 10 at different locations, lines 32A-32D will have discontinuities 42 in captured image 40A where structured light 32 encounters objects, such as second aircraft 16.
- Such discontinuities 42 in captured image 40A are indicative of differences in the locations and/or ranges of the objects from which structured light 32 reflects.
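The discontinuity cue described above can be sketched in a few lines. Assuming each structured-light line has already been traced to a vertical pixel coordinate per image column (the tracing itself is not shown), a jump larger than a threshold marks an object boundary. The function name and threshold are illustrative, not from the disclosure.

```python
def find_discontinuities(row_coords, jump_px=5):
    """Locate discontinuities in one imaged structured-light line.

    row_coords[i] is the vertical pixel coordinate of the line at
    horizontal pixel column i. Because projector and camera are mounted
    at different locations, the line shifts vertically where it falls on
    an object at a different range; a jump larger than jump_px marks an
    object edge. Returns the column indices where jumps occur.
    """
    return [i + 1 for i in range(len(row_coords) - 1)
            if abs(row_coords[i + 1] - row_coords[i]) > jump_px]
```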
- FIG. 2A is a schematic view of an exemplary aircraft collision alerting system used by an aircraft approaching a gateway.
- First aircraft 10 is taxiing on tarmac 44 at a relatively low speed.
- First aircraft 10 is approaching gate 46 C.
- Aircraft 48 and 50 are parked at gates 46 A and 46 B, respectively.
- First aircraft 10 is again equipped with aircraft collision alerting system 20 .
- Aircraft collision alerting system 20 includes projector 22 , camera 24 , and a controller 26 .
- Projector 22 is mounted on vertical stabilizer 28 of tail 30.
- Projector 22 is configured to project structured light 33 onto a scene external to first aircraft 10 , thereby illuminating objects external to first aircraft 10 .
- Controller 26 controls the solid-angle of projection, such that projector 22 projects structured light 33 within the controlled solid-angle of projection.
- The solid-angle of projection includes azimuthal angle of projection 34B.
- The solid-angle of projection shown in FIG. 2A represents a wider field of view than the solid-angle of projection depicted in FIG. 1A.
- This relation is represented by azimuthal angle of projection 34B being greater than azimuthal angle of projection 34A.
- The relatively-large solid-angle of projection is configured to project structured light 33 onto objects, such as aircraft 48 and 50, located within large azimuthal angle 36B as measured from longitudinal axis 38 of first aircraft 10.
- Controller 26 also can control the optical focus and/or dispersion of projected structured light 33, as well as the emission power of structured light 33.
- When controller 26 controls a large solid-angle of projection, such as the one depicted in FIG. 2A, controller 26 might also decrease the emission power and relax the collimation of the structured light pattern so as to permit increased dispersion.
- One or more of such controlled parameters of structured light 33 can facilitate detection of objects that are relatively close to first aircraft 10.
- FIG. 2B depicts an image captured by a camera of the collision alerting system mounted to the aircraft depicted in FIG. 2A .
- Captured image 40B has a field of view commensurate with the solid-angle of projection of structured light 33.
- In other embodiments, captured image 40B can have a field of view different from the field of view used for structured-light projection.
- Captured image 40 B depicts aircraft 48 and 50 parked at gates 46 A and 46 B, respectively.
- Superimposed on tarmac 44 and aircraft 48 and 50 are lines 33A-33D of structured light 33.
- Lines 33A-33D will have discontinuities 42 in captured image 40B where structured light 33 encounters objects, such as aircraft 48 and 50.
- Such discontinuities 42 in captured image 40B are indicative of differences in the locations and/or ranges of the objects from which structured light 33 reflects.
- Collision alerting system 20 can determine locations and/or ranges of objects using triangulation.
- Projector 22 projects structured light 32 (FIG. 1A) or 33 (FIG. 2A) within a controlled field of view having a predetermined solid-angle of illumination. Structured light 32 or 33 illuminates various objects that reside within the predetermined solid-angle of illumination.
- Projector 22 has an optical axis that is coplanar with longitudinal axis 38 of first aircraft 10.
- Controller 26 is configured to control the field of view within which projector 22 projects structured light 32.
- In FIGS. 1A-2B, projector 22 is shown illuminating objects that are within various azimuthal angle ranges, for example, and within an elevation range of a projection horizon of projector 22.
- The elevation range of projection, for example, can be from about +3, +5, +10, +12, or +15 degrees to about −2, −5, −8, or −10 degrees of projection from a vertical location of projector 22.
- In some embodiments, structured light 32 or 33 can continuously illuminate objects within the solid-angle of illumination. In other embodiments, structured light 32 or 33 can intermittently illuminate objects within the solid-angle of illumination. Such illumination may use light of various wavelengths. For example, in some embodiments, infrared light, being invisible to a human eye, can be used to provide illumination of objects within the solid-angle of illumination. Infrared light can advantageously be non-distractive to pilots and to other people upon whom the collimated beam of light is projected.
- In some embodiments, the directed beam of light is pulsed on for a limited time, with image capture synchronized with the projector illumination. Shorter image-capture durations reduce the light captured from solar illumination, lowering the needed projector power.
- Projector 22 can be controlled so as to facilitate imaging of various objects within the scene by a left-side camera and/or a right-side camera.
- Projector 22 can be controlled, based on a mode of ground operation of first aircraft 10 , for example.
- Various parameters of structured light 33 produced by projector 22 can be controlled. For example, the azimuthal range of a field of view can be controlled. In some embodiments, focus/beam divergence and/or emission power can be controlled.
- The field of view can be controlled based upon an input device, such as a switch or keypad. In other embodiments, the field of view can be controlled in response to aircraft operating conditions, such as ground speed, steering-wheel orientation, etc.
- Intensity of the directed beam of light can be controlled based on the ground speed of the aircraft. A faster-moving aircraft may control the directed beam of light to have a greater intensity. Also, the intensity can be controlled such that objects at greater ranges are illuminated at a greater intensity than objects at closer range. In some embodiments, the intensity of the directed beam of light may be controlled based on atmospheric conditions (e.g., atmospheric attenuation). In an exemplary embodiment, power intensity of the directed beam can be varied while looking at known location(s) on first aircraft 10. A magnitude of the signal reflected from the known location(s) on first aircraft 10 can be compared to a predetermined reference signal level at a standard attenuation to determine the instantaneous attenuation of atmospheric conditions. Such a method can be used to normalize the measured reflected power intensity for various atmospheric conditions.
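The attenuation-normalization scheme described above can be sketched as follows; the function names and the simple linear transmission model are illustrative assumptions, not details from the disclosure.

```python
def atmospheric_correction(measured_ref, expected_ref):
    """Estimate a correction gain from a known reference reflection.

    measured_ref: reflected signal magnitude from a known location on the
    aircraft under current atmospheric conditions.
    expected_ref: predetermined reference signal level at a standard
    attenuation. The ratio estimates the current atmospheric
    transmission; its inverse is the normalization gain.
    """
    transmission = measured_ref / expected_ref
    return 1.0 / transmission

def normalize(reading, measured_ref, expected_ref):
    """Normalize a reflected-power measurement taken through the same
    atmosphere as the reference measurement."""
    return reading * atmospheric_correction(measured_ref, expected_ref)
```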
- Light having wavelengths within an atmospheric absorption band can be used. Careful selection of projector wavelength can permit projector 22 to compete favorably with solar energy. There are, however, certain wavelengths where the atmospheric absorption is so great that both projector energy and solar energy are attenuated equally.
- Sunlight is broadband, with maximum intensity falling in the visible spectrum. Sunlight having wavelengths within the infrared spectrum is of lower intensity than in the visible band, and so projected light having such wavelengths need not compete as strongly with sunlight. Using light having such wavelengths can thereby permit reduced power levels in projecting structured light. Atmospheric absorption bands may further reduce solar infrared illumination.
- Atmospheric absorption bands include infrared wavelengths of between about 1.35-1.4, 1.8-1.95, 2.5-2.9, and 5.5-7.2 microns.
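A wavelength-selection check against the bands listed above might be expressed as follows; the function name is illustrative, and the band endpoints are simply those stated in the text.

```python
# Infrared atmospheric absorption bands listed in the description, in microns.
ABSORPTION_BANDS_UM = [(1.35, 1.4), (1.8, 1.95), (2.5, 2.9), (5.5, 7.2)]

def in_absorption_band(wavelength_um):
    """Return True if the candidate projector wavelength (microns) falls
    within one of the listed atmospheric absorption bands."""
    return any(lo <= wavelength_um <= hi for lo, hi in ABSORPTION_BANDS_UM)
```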
- The structured light that is projected by projector 22 can be formed by a collimated beam scanned in a predetermined pattern so as to have a structure that can be identified in images formed by camera 24.
- Using knowledge of the location from which a specific feature is projected (e.g., the location of projector 22), the location of camera 24, and the location within the image (e.g., pixel coordinates) where that feature is imaged, a location and a range distance to the object from which that specific feature is reflected can be determined.
- Projector 22 can be located at an elevation on first aircraft 10 that is higher than the elevation where camera 24 is located.
- Projector 22 can emit structured light to produce a pattern that, when reflected from a surface having a normal direction to longitudinal axis 38 , is imaged as horizontal lines by camera 24 .
- One structured light beam, for example, might be projected at an angle of elevation of zero degrees (i.e., directed parallel to the horizon).
- A second structured light beam might be projected at an angle of negative five degrees from the horizon (i.e., directed at a slightly downward angle from projector 22).
- Each of these projected structured light beams, when reflected from an object, will be imaged at a different vertical location (e.g., each will have a different vertical pixel coordinate) within the camera image, depending on the range distance between the reflecting object and first aircraft 10.
- Knowing the elevation of projector 22 , the elevation of camera 24 , the specific feature of the structured light (e.g., which horizontal line is imaged), and the location within the camera image where the specific feature is imaged can permit a determination of the location of the object from which the specific feature has been reflected.
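The geometry just described can be expressed as a short calculation. The function below is an illustrative sketch, not code from the patent; the names and the simplified in-plane geometry (projector and camera vertically aligned, range measured horizontally) are assumptions.

```python
import math

def range_from_elevation(proj_h_m, cam_h_m, beam_elev_deg, imaged_elev_deg):
    """Triangulate the range to a reflecting object.

    proj_h_m: height of projector 22 (e.g., on the vertical stabilizer)
    cam_h_m: height of camera 24 (mounted lower on the aircraft)
    beam_elev_deg: known elevation angle of the structured-light line
        (identifies which horizontal line was imaged; 0 = horizon,
        negative = downward)
    imaged_elev_deg: elevation angle of the reflection seen by the
        camera, derived from the vertical pixel coordinate

    The reflecting point lies on both rays, so its height satisfies
    proj_h + R*tan(beam) == cam_h + R*tan(imaged); solving gives R.
    """
    t_beam = math.tan(math.radians(beam_elev_deg))
    t_img = math.tan(math.radians(imaged_elev_deg))
    return (cam_h_m - proj_h_m) / (t_beam - t_img)
```

Because the projector sits higher than the camera, the two rays diverge and the denominator stays nonzero for objects in front of the aircraft.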
- Pilots of first aircraft 10 can be informed of any potential collision hazards within the scene illuminated by projector 22. Pilots of first aircraft 10 can steer first aircraft 10 to avoid wingtip collisions and/or engine collisions based on the location and range information calculated by aircraft collision alerting system 20.
- FIG. 3 is a plan view of an airport showing various regions in which various types of on-ground operations are performed.
- Airport 52 includes runway 54, taxiways 56, apron 58, ramp 60, terminal gates 62, aircraft stands 64, and hangar 66.
- Various ground operations are performed when an aircraft is navigating around and/or over each of these different areas of airport 52. For example, when an aircraft is being pushed back from terminal gates 62, the aircraft might be turned so as to exit ramp 60. When the aircraft then navigates ramp 60, it must carefully avoid the objects located there, including parked aircraft and support vehicles, for example. When the aircraft navigates taxiways 56, it might taxi at a higher speed than when navigating ramp 60.
- Each of these different areas presents different risks of collision between the aircraft and objects external to the aircraft. Because different risks are present at different areas within airport 52 , different fields of view can be used to detect the objects that present the collision risks.
- FIG. 4 is a schematic view depicting various fields of view of an aircraft collision alerting system used by an aircraft during on-ground operations.
- aircraft 10 includes projector 22 mounted on vertical stabilizer 28 .
- Projector 22 is configured to project structured light within various controlled fields of view 68 A- 68 E.
- Fields of view 68 A- 68 E are progressively smaller.
- Field of view 68 A has a relatively large azimuthal angle 70 A compared with relatively small azimuthal angle 70 E corresponding to field of view 68 E.
- Each of fields of view 68 A- 68 E is configured to detect objects within narrow corridor 72 extending forward of the aircraft's wingtips.
- Structured light projected within field of view 68 E will illuminate objects that are located within narrow corridor 72 , but only if such objects are located at a distance from aircraft 10 greater than minimum distance 74 .
- structured light projected within field of view 68 A will illuminate objects that are located within narrow corridor 72 at any distance depicted in FIG. 4 .
- structured light projected within field of view 68 A will illuminate objects on the right-hand side of aircraft 10 , so as to facilitate detection of objects that would present a collision risk should aircraft 10 turn in the right-hand direction.
- relatively-wide field of view 68 A provides object detection in scenarios in which aircraft 10 might be turning
- relatively-narrow field of view 68 E provides object detection in scenarios in which aircraft 10 might be taxiing in a straight direction and not turning.
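The mode-dependent choice among fields of view 68 A- 68 E can be sketched as a simple selector. The speed and steering thresholds and the azimuthal widths below are assumptions chosen for illustration; the disclosure leaves the specific values open.

```python
def select_field_of_view(taxi_speed_kts, steering_angle_deg):
    """Map ground-operation state to an illumination field of view.

    Fast, straight taxiing gets a narrow, far-reaching field of view
    (like 68E); fast taxiing while turning gets an intermediate width;
    low-speed operation near the ramp gets the widest (like 68A).
    """
    if taxi_speed_kts > 20 and abs(steering_angle_deg) < 5:
        return {"mode": "high-speed straight taxi", "azimuth_deg": 15}
    if taxi_speed_kts > 20:
        return {"mode": "high-speed turning taxi", "azimuth_deg": 60}
    return {"mode": "low-speed taxi/ramp", "azimuth_deg": 120}

print(select_field_of_view(30, 0)["azimuth_deg"])  # -> 15
print(select_field_of_view(5, 0)["azimuth_deg"])   # -> 120
```

A selector like this would take its inputs from the aircraft avionics (taxi speed, steerable-wheel orientation) or from a pilot-operated selection input device, as the later embodiments describe.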
- FIG. 5 is a block diagram of an exemplary aircraft collision alerting system.
- Aircraft collision alerting system 20 includes projector(s) 22 , camera(s) 24 , controller 26 , and cockpit alarm and display module 76 .
- Projector(s) 22 is configured to be mounted at a projector location on an aircraft.
- Projector(s) 22 is further configured to project structured light from projector(s) 22 onto a scene external to the aircraft, thereby illuminating a spatially-patterned portion of the scene.
- Camera(s) 24 is configured to be mounted at one or more camera locations on the aircraft. Camera(s) 24 is further configured to receive light reflected from the scene. Camera(s) 24 is further configured to focus the received light onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of the scene.
- the image can include pixel data generated by the plurality of light-sensitive pixels.
- Controller 26 receives inputs from camera(s) 24 and from aircraft avionics 78 . Controller 26 generates commands that control the operation of projector(s) 22 and camera(s) 24 . Controller 26 outputs signals indicative of alarms, ranges, and/or images to cockpit alarms and display module 76 . Controller 26 is configured to identify pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the spatially-patterned light projected by projector(s) 22 and reflected from the spatially-patterned portion of the scene is focused.
- Controller 26 is further configured to use triangulation, based on the projector location of projector(s) 22 , the location(s) of camera(s) 24 and the identified pixel coordinates, to calculate range value data of object(s) in the scene from which the spatially-patterned light projected by projector(s) 22 is reflected.
- FIG. 6 is a schematic diagram depicting object location determination using both active and passive imaging.
- camera image 80 of tail 82 of aircraft 84 external to first aircraft 10 (depicted in FIG. 2A ) is shown.
- Camera image 80 is composed from intensity data of a two-dimensional array of light-sensitive pixels (not individually depicted).
- Tail 82 includes vertical stabilizer 86 and horizontal stabilizer 88 .
- Vertical stabilizer 86 depicts features 90 of structured light projected thereon.
- Features 90 are diagonal lines of light.
- Features 90 are imaged by a subset of the two-dimensional array of light-sensitive pixels composing the image. For each of the subset of the two-dimensional array of light-sensitive pixels containing the structured light projected upon tail 82 , a range value is calculated.
- range values can be calculated using the already calculated range values corresponding to nearby pixels.
- range values can be calculated for the pixels determined to be boundary pixels of an object.
- Range values for boundary pixels 92 may be calculated by modeling the range variations within a single object as a polynomial function of spatial coordinates, for example. Such a model may be used to calculate range values using the pixel coordinates and corresponding range values of pixels having already calculated range values that reside within the object boundary associated with boundary pixels 92 .
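As a concrete (and deliberately minimal) instance of the polynomial model just described, the sketch below fits a first-degree polynomial of pixel coordinates, a plane, to the already-calculated ranges inside an object and evaluates it at boundary pixels. The plane-only model and the function names are assumptions for illustration; the disclosure allows higher-degree polynomials as well.

```python
def extrapolate_boundary_ranges(known, boundary):
    """Estimate range values at object-boundary pixels.

    ``known`` maps interior (row, col) pixel coordinates to calculated
    range values; ``boundary`` lists (row, col) boundary pixels.  Range
    is modeled as a + b*row + c*col and fit by least squares via the
    3x3 normal equations (needs >= 3 non-collinear known pixels).
    """
    s = [[0.0] * 3 for _ in range(3)]
    t = [0.0, 0.0, 0.0]
    for (r, c), rng in known.items():
        x = (1.0, float(r), float(c))
        for i in range(3):
            t[i] += x[i] * rng
            for j in range(3):
                s[i][j] += x[i] * x[j]
    # Solve s * coef = t by Gaussian elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda k: abs(s[k][i]))
        s[i], s[p] = s[p], s[i]
        t[i], t[p] = t[p], t[i]
        for k in range(i + 1, 3):
            f = s[k][i] / s[i][i]
            for j in range(i, 3):
                s[k][j] -= f * s[i][j]
            t[k] -= f * t[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coef[i] = (t[i] - sum(s[i][j] * coef[j]
                              for j in range(i + 1, 3))) / s[i][i]
    a, b, c = coef
    return {(r, cl): a + b * r + c * cl for (r, cl) in boundary}

# Interior pixels lying on the plane 10 + 0.1*row + 0.05*col:
known = {(0, 0): 10.0, (10, 0): 11.0, (0, 10): 10.5, (10, 10): 11.5}
print(extrapolate_boundary_ranges(known, [(20, 20)]))  # -> {(20, 20): 13.0}
```

In practice the fit would use only pixels already determined to lie inside the object boundary associated with boundary pixels 92, exactly as the text specifies.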
- Various embodiments can use various structured light patterns having various features. For example, in some embodiments, vertical or diagonal lines can be projected upon a scene. In some embodiments, spots of light can be projected upon a scene. In an exemplary embodiment, both vertical lines and horizontal lines can be projected upon a scene, using projectors that are horizontally and/or vertically displaced, respectively, from the camera location.
- FIG. 7 is a block diagram of an embodiment of controller 26 depicted in FIG. 5 .
- the block diagram includes controller 26 , projector(s) 22 , camera(s) 24 , cockpit alarms and display module 76 , and aircraft avionics 78 .
- Controller 26 includes processor(s) 92 , input/output interface 94 , storage device(s) 96 , input devices 98 , and output devices 100 .
- Storage device(s) 96 has various storage or memory locations.
- Storage device(s) 96 includes program memory 102 , and data memory 104 .
- Controller 26 is in communication with cockpit alarms and display module 76 and aircraft avionics 78 via input/output interface 94 .
- Aircraft avionics 78 can provide controller 26 with metrics indicative of the aircraft's location, orientation, speed, etc.
- Controller 26 can provide cockpit alarms and display module 76 with signals indicative of risk of collision with an object(s) external to the aircraft.
- controller 26 includes processor(s) 92 , input/output interface 94 , storage device(s) 96 , input devices 98 , and output devices 100 .
- controller 26 can include more or fewer components.
- controller 26 may not include input devices 98 and/or output devices 100 .
- Controller 26 may include additional components such as a battery that provides power to components of controller 26 during operation.
- controller 26 can be located together with projector(s) 22 and/or camera(s) 24 .
- Processor(s) 92 , in one example, is configured to implement functionality and/or process instructions for execution within controller 26 .
- processor(s) 92 can be capable of processing instructions stored in storage device(s) 96 .
- Examples of processor(s) 92 can include any one or more of a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or other equivalent discrete or integrated logic circuitry.
- Input/output interface 94 includes a communications module. Input/output interface 94 , in one example, utilizes the communications module to communicate with external devices via one or more networks, such as one or more wireless or wired networks or both.
- the communications module can be a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information.
- Other examples of such network interfaces can include Bluetooth, 3G, 4G, and Wi-Fi radio computing devices as well as Universal Serial Bus (USB).
- communication with the aircraft can be performed via a communications bus, such as, for example, an Aeronautical Radio, Incorporated (ARINC) standard communications protocol.
- Alternately, communication with the aircraft can be performed via a communications bus, such as, for example, a Controller Area Network (CAN) bus.
- Storage device(s) 96 can be configured to store information within controller 26 during operation.
- Storage device(s) 96 , in some examples, is described as computer-readable storage media.
- a computer-readable storage medium can include a non-transitory medium.
- the term “non-transitory” can indicate that the storage medium is not embodied in a carrier wave or a propagated signal.
- a non-transitory storage medium can store data that can, over time, change (e.g., in RAM or cache).
- storage device(s) 96 is a temporary memory, meaning that a primary purpose of storage device(s) 96 is not long-term storage.
- Storage device(s) 96 , in some examples, is described as volatile memory, meaning that storage device(s) 96 does not maintain stored contents when power to controller 26 is turned off. Examples of volatile memories can include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories.
- storage device(s) 96 is used to store program instructions for execution by processor(s) 92 .
- Storage device(s) 96 , in one example, is used by software or applications running on controller 26 (e.g., a software program implementing aircraft collision alerting) to temporarily store information during program execution.
- Storage device(s) 96 also include one or more computer-readable storage media.
- Storage device(s) 96 can be configured to store larger amounts of information than volatile memory.
- Storage device(s) 96 can further be configured for long-term storage of information.
- Storage device(s) 96 include non-volatile storage elements. Examples of such non-volatile storage elements can include magnetic hard discs, optical discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
- Input devices 98 are configured to receive input from a user.
- Examples of input devices 98 can include a mouse, a keyboard, a microphone, a camera device, a presence-sensitive and/or touch-sensitive display, push buttons, arrow keys, or other type of device configured to receive input from a user.
- input communication from the user can be performed via a communications bus, such as, for example, an Aeronautical Radio, Incorporated (ARINC) standard communications protocol.
- Alternately, input communication from the user can be performed via a communications bus, such as, for example, a Controller Area Network (CAN) bus.
- Output devices 100 can be configured to provide output to a user.
- Examples of output devices 100 can include a display device, a sound card, a video graphics card, a speaker, a cathode ray tube (CRT) monitor, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, or other type of device for outputting information in a form understandable to users or machines.
- output communication to the user can be performed via a communications bus, such as, for example, an Aeronautical Radio, Incorporated (ARINC) standard communications protocol.
- Alternately, output communication to the user can be performed via a communications bus, such as, for example, a Controller Area Network (CAN) bus.
- Apparatus and associated methods relate to a system for calculating position values and/or range data of an object(s) external to an aircraft.
- the system includes a mode selector configured to determine an illumination field of view.
- the system includes a projector mounted at a projector location on the aircraft and configured to project structured light within the illumination field of view, thereby illuminating the object(s) external to the aircraft that are within the illumination field of view.
- the system includes a camera mounted at a camera location on the aircraft and configured to receive a portion of the structured light reflected by the object(s) within the illumination field of view.
- the camera is further configured to focus the received portion of the structured light onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of object(s) within the illumination field of view.
- the image includes pixel data generated by the plurality of light-sensitive pixels.
- the system also includes an image processor configured to identify pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused.
- the image processor is further configured to use triangulation, based on the projector location, the camera location, and the identified pixel coordinates, to calculate the position values and/or range data of the object(s) within the illumination field of view.
- the system of the preceding paragraph can optionally include, additionally and/or alternatively, any one or more of the following features, configurations and/or additional components:
- mode selector can be further configured to determine an emission power
- the projector can be further configured to project structured light of the determined emission power
- mode selector can be further configured to determine beam dispersion
- the projector can be further configured to project structured light of the determined beam dispersion
- mode selector can include a selection input device operable by a pilot of the aircraft.
- mode selector can determine the illumination field of view within which the structured light is projected, based on ground operation modes of the aircraft.
- If the ground operation mode is a high-speed taxi mode, then the mode selector can determine a first illumination field of view within which the structured light is projected. If the ground operation mode is a low-speed taxi mode, then the mode selector can determine a second illumination field of view within which the structured light is projected.
- the second illumination field of view can have a greater solid-angle than a solid-angle of the first illumination field of view.
- the mode selector can determine a third illumination field of view within which the structured light is projected.
- the third illumination field of view can have a greater solid-angle than the solid-angle of the first illumination field of view but less than the solid-angle of the second illumination field of view.
- a further embodiment of any of the foregoing systems can further include an aircraft system interface configured to receive signals indicative of aircraft operating parameters.
- aircraft operating parameters can include an aircraft taxi speed.
- the mode selector can be further configured to determine the illumination field of view in response to the received signal indicative of the aircraft taxi speed.
- aircraft operating parameters can include an orientation of a steerable wheel.
- the mode selector can be further configured to determine the illumination field of view in response to the received signal indicative of the orientation of the steerable wheel.
- a further embodiment of any of the foregoing systems can further include an audible alarm configured to generate an alert signal.
- the alert signal can be generated if the calculated position values and range data of the object(s) within the illumination field of view indicate that one or more of the object(s) are within a collision zone or on a collision trajectory.
- the projector can be an infrared projector configured to project structured light that is infrared light.
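The audible-alarm condition in the feature list above can be sketched as a simple geometric test. The rectangular collision zone ahead of the wingtips and the aircraft-fixed (cross-track, along-track) coordinates are assumptions for illustration; the disclosure does not fix a particular zone geometry or trajectory model.

```python
def alert_needed(objects, corridor_half_width_m, alert_range_m):
    """Return True if any detected object lies in the collision zone.

    ``objects`` holds (cross_track_m, along_track_m) positions produced
    by the triangulation step, in an aircraft-fixed frame with the
    along-track axis pointing forward.
    """
    return any(abs(x) <= corridor_half_width_m and 0.0 < y <= alert_range_m
               for x, y in objects)

print(alert_needed([(3.0, 50.0)], 5.0, 100.0))   # -> True
print(alert_needed([(30.0, 50.0)], 5.0, 100.0))  # -> False
```

A trajectory-based variant would additionally propagate the wingtip paths from the steerable-wheel orientation, matching the claim's "collision trajectory" language.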
- Some embodiments relate to a method for generating an alert signal of a potential aircraft collision for a taxiing aircraft.
- the method includes determining an illumination field of view.
- the method includes projecting structured light within the determined illumination field of view.
- the method includes receiving a portion of the structured light reflected by object(s) within the illumination field of view.
- the method includes focusing the received portion of the structured light onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of the object(s) within the illumination field of view.
- the image includes pixel data generated by the plurality of light-sensitive pixels.
- the method includes identifying pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused.
- the method includes calculating, based on a projector location, a camera location, and the identified pixel coordinates, position values and/or range data of the object(s) within the illumination field of view by which the structured light is reflected.
- the method also includes generating an alert signal if the calculated position values and range data of the object(s) indicate that one or more of the object(s) are within a collision zone or on a collision trajectory.
- the method of the preceding paragraph can optionally include, additionally and/or alternatively, any one or more of the following features, configurations and/or additional components:
- a further embodiment of the foregoing method wherein the illumination field of view external to the aircraft can be determined based on ground operation modes of the aircraft.
- a further embodiment of any of the foregoing methods can further include determining an emission power.
- the projected structured light can be of the determined emission power.
- a further embodiment of any of the foregoing methods can further include determining a beam dispersion.
- the projected structured light can be of the determined beam dispersion.
- a further embodiment of any of the foregoing methods can further include receiving signals indicative of aircraft operating parameters.
- the method can also include determining the illumination field of view in response to a received signal indicative of an aircraft taxi speed.
- a further embodiment of any of the foregoing methods can further include receiving signals indicative of aircraft operating parameters.
- the method can also include determining the illumination field of view in response to a received signal indicative of an orientation of a steerable wheel.
- determining the illumination field of view within which the structured light is projected can be based on ground operation modes of the aircraft.
- If the ground operation mode is a high-speed taxi mode, then the mode selector can determine a first illumination field of view within which the structured light is projected. If the ground operation mode is a low-speed taxi mode, then the mode selector can determine a second illumination field of view within which the structured light is projected.
- the second illumination field of view can have a greater solid-angle than a solid-angle of the first illumination field of view.
Abstract
Apparatus and associated methods relate to controlling, based on a mode selector, the field of view of an external object detector during aircraft taxi operations. For example, during high-speed taxi operations, the field of view can be controlled to have a relatively-small solid-angle of detection capability. The relatively-small solid-angle field of view can be aligned so as to detect more distant objects within a narrow corridor extending forward of the aircraft's wingtips. During low-speed taxi operations, for example, the field of view can be controlled to have a relatively-large solid-angle of detection capability. The relatively-large solid-angle field of view can be aligned so as to detect close objects in the vicinity of the aircraft wings and engine nacelle. The object detector projects structured light within the controlled field of view, thereby illuminating the object(s) external to the aircraft that are within the illumination field of view.
Description
- Each year, significant time and money are lost due to commercial aircraft accidents and incidents during ground operations, of which significant portions occur during taxiing maneuvers. During ground operations, aircraft share the taxiways with other aircraft, fuel vehicles, baggage carrying trains, mobile stairways and many other objects. Aircraft often taxi to and/or from fixed buildings and other fixed objects. Should an aircraft collide with any of these objects, the aircraft must be repaired and recertified as capable of operation. The cost of repair and recertification, as well as the lost opportunity costs associated with the aircraft being unavailable for use, can be substantial.
- Pilots are located in a central cockpit where they are well positioned to observe objects that are directly in front of the cabin of the aircraft. Objects that are not located directly in front of the cabin, however, can be more difficult to observe. Wings are attached to the cabin behind the cockpit and extend laterally from the cabin in both directions. Some commercial and some military aircraft have large wingspans, and so the wings on these aircraft laterally extend a great distance from the cabin and are thus positioned behind and out of the field of view of the cockpit. Some commercial and some military planes have engines that hang below the wings of the aircraft. Pilots, positioned in the cabin, can have difficulty knowing the risk of collisions between objects external to the aircraft and the wingtips and/or engines.
- There are various types of on-ground operations that an aircraft must perform at an airport, each of which presents different collision risks to the aircraft. The taxi-in and taxi-out phases require that the aircraft move between the runway and the terminal gates, for example. During taxi-in, the aircraft must first transition from the runway to a taxiway and then to the gateway. Sometimes, the taxiway can include an elaborate network of roads requiring the aircraft to travel over straight stretches as well as turns and transitions to/from the taxiway. Some high-speed taxi operation occurs on one-way taxiways dedicated to aircraft only. During such high-speed taxi operation, relatively distant objects located in the forward direction of the aircraft might present the greatest risk of collision to the aircraft. During low-speed taxiing and gateway approach, nearby objects in the vicinity of the wings and engine nacelles might present the greatest risk of collision to the aircraft. Thus, an adaptive field of view for an aircraft on-ground collision alerting system would be useful to facilitate surveillance of areas most likely to have objects external to the aircraft, which could present a risk of collision with the aircraft.
- Apparatus and associated methods relate to a system for calculating position values and/or range data of an object(s) external to an aircraft. The system includes a mode selector configured to determine an illumination field of view. The system includes a projector mounted at a projector location on the aircraft and configured to project structured light within the illumination field of view, thereby illuminating the object(s) external to the aircraft that are within the illumination field of view. The system includes a camera mounted at a camera location on the aircraft and configured to receive a portion of the structured light reflected by the object(s) within the illumination field of view. The camera is further configured to focus the received portion onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of object(s) within the illumination field of view. The image includes pixel data generated by the plurality of light-sensitive pixels. The system also includes an image processor configured to identify pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused. The image processor is further configured to use triangulation, based on the projector location, the camera location, and the identified pixel coordinates, to calculate the position values and/or range data of the object(s) within the illumination field of view.
- Some embodiments relate to a method for generating an alert signal of a potential aircraft collision for a taxiing aircraft. The method includes the step of determining an illumination field of view. Then, structured light is projected within the determined illumination field of view. A portion of the structured light reflected by object(s) within the illumination field of view is received. The received portion is focused onto a focal plane array having a plurality of light-sensitive pixels, thereby forming an image of object(s) within the illumination field of view. The image includes pixel data generated by the plurality of light-sensitive pixels. Then, the method identifies pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused. The method then calculates, based on the projector location, the camera location, and the identified pixel coordinates, position values and/or range data of the object(s) within the illumination field of view by which the structured light is reflected. An alert signal is generated if the calculated position values and range data of the object(s) indicate that one or more of the object(s) are within a collision zone or on a collision trajectory.
FIG. 1A is a schematic view of an exemplary aircraft collision alerting system used by an aircraft on a taxiway.
FIG. 1B depicts an image captured by a camera of the collision alerting system mounted to the aircraft depicted in FIG. 1A.
FIG. 2A is a schematic view of an exemplary aircraft collision alerting system used by an aircraft approaching a gateway.
FIG. 2B depicts an image captured by a camera of the collision alerting system mounted to the aircraft depicted in FIG. 2A.
FIG. 3 is a plan view of an airport showing various regions in which various types of on-ground operations are performed.
FIG. 4 is a schematic view depicting various fields of view of an aircraft collision alerting system used by an aircraft during on-ground operations.
FIG. 5 is a detailed block diagram of the exemplary aircraft collision alerting system depicted in FIG. 2.
FIG. 6 is a schematic diagram depicting object location determination using both active and passive imaging.
FIG. 7 is a block diagram of an embodiment of controller 26 depicted in FIG. 5.
- Apparatus and associated methods relate to controlling, based on a mode selector, the field of view of an external object detector during aircraft taxi operations. For example, during high-speed taxi operations, the field of view can be controlled to have a relatively-small solid-angle of detection capability. The relatively-small solid-angle field of view can be aligned so as to detect more distant objects within a narrow corridor extending forward of the aircraft's wingtips. During low-speed taxi operations, for example, the field of view can be controlled to have a relatively-large solid-angle of detection capability. The relatively-large solid-angle field of view can be aligned so as to detect close objects in the vicinity of the aircraft wings and engine nacelle. The object detector projects structured light within the controlled field of view, thereby illuminating the object(s) external to the aircraft that are within the illumination field of view.
- Determining locations and/or ranges of objects nearby an aircraft can be performed using triangulation of structured light projected upon and reflected by the objects. Only objects upon which the projected structured light falls can reflect that projected structured light. The structured light is projected by a projector that has a controlled field of view of projection. The field of view can be controlled based on a mode of ground operation of the aircraft. Triangulation can be used to calculate locations and/or ranges of objects from which the structured light is reflected. The locations and/or ranges can be calculated based on a location of a structured projector, a location of a camera or imager, and the pixel coordinate upon which the reflected structured light is focused.
- The structured light can be a pulse of light projected in a pattern, such as, for example, a pulse having a fixed azimuthal angle of projection but having an elevational angle of projection between +/−5 degrees from the horizontal. In some embodiments, the structured light can be a collimated beam rastered or scanned in a pattern. Various other types of patterned light can be projected. The structured light is projected within a controlled field of view. This means that outside of the controlled field of view, substantially no light energy is projected. Herein the term structured light indicates that light is projected within the solid-angle of the field of view in such a manner that the projected light is not uniformly projected throughout the solid-angle of projection. For example, light will be primarily projected along certain azimuthal and/or elevational angles comprising a subset of the azimuthal and elevational angles within the solid-angle of light projection. Other subsets of the solid-angle of light projection can be used for structured light projection.
- In some embodiments, the structured light can have a wavelength corresponding to infrared light and/or to an atmospheric absorption band. Using infrared light, because it is outside the visible spectrum, can minimize a distraction to a pilot who is taxiing the aircraft. Using infrared light that has a wavelength within an atmospheric absorption band can permit low-power projector illumination, as the illuminating power need not compete with the sun's illumination in such an absorption band. Knowing a first aircraft location from where the light is projected, a second aircraft location where the reflection is imaged, and a pixel coordinate within the image corresponding to an object from which the spatially-patterned light is reflected permits a calculation of the location and/or range of that reflecting object.
FIG. 1A is a schematic view of an exemplary aircraft collision alerting system used by an aircraft on a taxiway. In FIG. 1A, first aircraft 10 is taxiing along one-way taxiway 12 at a relatively high speed. First aircraft 10 is approaching taxiway crossing 14. Second aircraft 16 is near the taxiway crossing 14 on taxiway 18. First aircraft 10 is equipped with aircraft collision alerting system 20. Aircraft collision alerting system 20 includes projector 22, camera 24, and a controller 26. In the depicted embodiment, projector 22 is mounted on vertical stabilizer 28 of tail 30. Projector 22 is configured to project structured light 32 onto a scene external to first aircraft 10, thereby illuminating objects external to first aircraft 10. Projector 22 can be mounted at other locations on first aircraft 10 in other embodiments. Controller 26 controls the solid-angle of projection, such that projector 22 projects structured light 32 within the controlled solid-angle of projection. In the depicted embodiment, the solid-angle of projection includes azimuthal angle of projection 34A.
- The solid-angle of projection represents a small fraction of the full two pi steradians of a half sphere of projection. The relatively-small solid-angle of projection is configured to project structured light 32 onto objects, such as
second aircraft 16, located within smallazimuthal angle 36A oflongitudinal axis 38 offirst aircraft 10. By controlling the solid-angle of projection, the power required for projecting structured light 32 can be controlled. In some embodiments, in addition to controlling the solid-angle of projection,controller 26 also can control the optical focus and/or dispersion of projected structuredlight 32, and the emission power of structuredlight 32. For example, whencontroller 26 controls a small solid-angle, such as the one depicted inFIG. 1A ,controller 26 might also increase the emission power and collimate the structured light pattern so as to reduce dispersion. One or more of such controlled parameters of structured light 32 can facilitate detection of objects that are relatively distant fromfirst aircraft 10. -
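The power scaling described above can be illustrated with a brief sketch. All numeric values below are hypothetical (the disclosure specifies no emission powers); the sketch assumes the projector maintains a constant radiant intensity per steradian, so required emission power grows linearly with the solid-angle of projection:

```python
import math

def required_power_w(azimuth_deg, elevation_deg, intensity_w_per_sr):
    """Approximate emitted power needed to hold a constant radiant
    intensity (W/sr) across a small rectangular field of view.

    For small fields, the solid angle is approximately the product of
    the azimuthal and elevation extents expressed in radians."""
    solid_angle_sr = math.radians(azimuth_deg) * math.radians(elevation_deg)
    return intensity_w_per_sr * solid_angle_sr

# A narrow 10 x 10 degree field (FIG. 1A-style) needs far less power than a
# 90 x 10 degree field (FIG. 2A-style) at the same intensity per steradian:
narrow = required_power_w(10, 10, 100.0)
wide = required_power_w(90, 10, 100.0)
assert math.isclose(wide / narrow, 9.0)  # power scales with solid angle
```

This is why the controller can trade emission power against field of view: widening the illumination field at constant power necessarily thins the light projected onto any one object.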
FIG. 1B depicts an image captured by a camera of the collision alerting system mounted to the aircraft depicted in FIG. 1A. In FIG. 1B, captured image 40A has a field of view commensurate with the solid-angle of projection of structured light 32. In other embodiments, captured image 40A can have a field of view different than the field of view used for structured-light projection. Captured image 40A depicts second aircraft 16 on taxiway 18. Superimposed on taxiway 18 and second aircraft 16 are lines 32A-32D of structured light 32. Because projector 22 and camera 24 are mounted to first aircraft 10 at different locations, lines 32A-32D will have discontinuities 42 in captured image 40A where structured light 32 encounters objects, such as second aircraft 16. Such discontinuities 42 in captured image 40A are indicative of differences in the locations and/or ranges of the objects from which structured light 32 reflects. -
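Discontinuities of this kind can be detected with simple one-dimensional processing. The following is a hypothetical sketch (the disclosure prescribes no particular algorithm): for one projected horizontal line, record the vertical pixel row at which the line is imaged in each image column, and flag the columns where that row jumps:

```python
def find_discontinuities(line_rows, jump_threshold=3):
    """Given the vertical pixel row at which one projected line is imaged
    for each horizontal pixel column, return the column indices where the
    row jumps by more than jump_threshold pixels -- the signature of the
    line crossing from background onto a nearer (or farther) object."""
    return [i for i in range(1, len(line_rows))
            if abs(line_rows[i] - line_rows[i - 1]) > jump_threshold]

# A line imaged at row ~40 on the taxiway jumps to row ~25 where it falls
# on the tail of a nearer aircraft, then back to the taxiway:
rows = [40, 40, 41, 40, 25, 24, 25, 40, 40]
assert find_discontinuities(rows) == [4, 7]
```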
FIG. 2A is a schematic view of an exemplary aircraft collision alerting system used by an aircraft approaching a gateway. In FIG. 2A, first aircraft 10 is taxiing on tarmac 44 at a relatively low speed. First aircraft 10 is approaching gate 46C, one of the aircraft gates adjacent tarmac 44. First aircraft 10 is again equipped with aircraft collision alerting system 20. Aircraft collision alerting system 20 includes projector 22, camera 24, and a controller 26. In the depicted embodiment, projector 22 is mounted on vertical stabilizer 28 of tail 30. Projector 22 is configured to project structured light 33 onto a scene external to first aircraft 10, thereby illuminating objects external to first aircraft 10. Controller 26 controls the solid-angle of projection, such that projector 22 projects structured light 33 within the controlled solid-angle of projection. In the depicted embodiment, the solid-angle of projection includes azimuthal angle of projection 34B. - The solid-angle of projection shown in
FIG. 2A represents a wider field of view than the field of view of the solid-angle of projection depicted in FIG. 1A. This relation is represented by the azimuthal angle of projection 34B being greater than the azimuthal angle of projection 34A. The relatively-large solid-angle of projection is configured to project structured light 33 onto objects, such as nearby aircraft, located within large azimuthal angle 36B as measured from longitudinal axis 38 of first aircraft 10. By controlling the solid-angle of projection, the power required for projecting structured light 33 can be controlled. In some embodiments, in addition to controlling the solid-angle of projection, controller 26 also can control the optical focus and/or dispersion of projected structured light 33, and the emission power of structured light 33. For example, when controller 26 controls a large solid-angle, such as the one depicted in FIG. 2A, controller 26 might also decrease the emission power and reduce collimation of the structured light pattern so as to permit increased dispersion. One or more of such controlled parameters of structured light 33 can facilitate detection of objects that are relatively near first aircraft 10. -
FIG. 2B depicts an image captured by a camera of the collision alerting system mounted to the aircraft depicted in FIG. 2A. In FIG. 2B, captured image 40B has a field of view commensurate with the solid-angle of projection of structured light 33. In other embodiments, captured image 40B can have a field of view different than the field of view used for structured-light projection. Captured image 40B depicts aircraft at the gates. Superimposed on tarmac 44 and the aircraft are lines 33A-33D of structured light 33. Because projector 22 and camera 24 are mounted to first aircraft 10 at different locations, lines 33A-33D will have discontinuities 42 in captured image 40B where structured light 33 encounters objects, such as second aircraft 16. Such discontinuities 42 in captured image 40B are indicative of differences in the locations and/or ranges of the objects from which structured light 33 reflects. -
Collision alerting system 20 can determine locations and/or ranges of objects using triangulation. Projector 22 projects structured light 32 (FIG. 1A) or 33 (FIG. 2A) within a controlled field of view having a predetermined solid-angle of illumination. Structured light 32 or 33 illuminates various objects that reside within the predetermined solid-angle of illumination. In the depicted embodiment, projector 22 has an optical axis that is coplanar with longitudinal axis 38 of first aircraft 10. Controller 26 is configured to control the field of view within which projector 22 projects structured light 32. In FIGS. 1A-2B, projector 22 is shown illuminating objects that are within various azimuthal angle ranges, for example, and within an elevation range of a projection horizon of projector 22. The elevation range of projection, for example, can be from about +3, +5, +10, +12, or +15 degrees to about −2, −5, −8, or −10 degrees of projection from a vertical location of projector 22. - In some embodiments, structured light 32 or 33 can continuously illuminate objects within the solid-angle of illumination. In other embodiments, structured light 32 or 33 can intermittently illuminate objects within the solid-angle of illumination. Such illumination may use light of various wavelengths. For example, in some embodiments, infrared light, being invisible to a human eye, can be used to provide illumination of objects within the solid-angle of illumination. Infrared light can advantageously be non-distractive to pilots and to other people upon whom the collimated beam of light is projected.
- In some embodiments, the directed beam of light is pulsed on for a limited time, with image capture synchronized with the projector illumination. Shorter image capture durations reduce the light captured from solar illumination, lowering the needed projector power. In some embodiments,
projector 22 is controlled so as to facilitate imaging of various objects within the scene by a left-side camera and/or a right-side camera. Projector 22 can be controlled, based on a mode of ground operation of first aircraft 10, for example. Various parameters of structured light 33 produced by projector 22 can be controlled. For example, the azimuthal range of a field of view can be controlled. In some embodiments, focus/beam divergence and/or emission power can be controlled. In some embodiments, the field of view can be controlled based upon an input device, such as a switch or keypad. In other embodiments, the field of view can be controlled in response to aircraft operating conditions, such as ground speed, steering wheel orientation, etc. - In some embodiments, intensity of the directed beam of light can be controlled based on ground speed of the aircraft. A faster moving aircraft may control the directed beam of light to have a greater intensity. Also, the intensity can be controlled such that objects at greater ranges are illuminated at a greater intensity than objects at a closer range. In some embodiments, the intensity of the directed beam of light may be controlled based on atmospheric conditions (e.g., atmospheric attenuation). In an exemplary embodiment, power intensity of the directed beam can be varied while looking at a known location(s) on
first aircraft 10. A magnitude of the signal reflected from the known location(s) on first aircraft 10 can be compared to a predetermined reference signal level at a standard attenuation to determine instantaneous attenuation of atmospheric conditions. Such a method can be used to normalize the measured reflected power intensity for various atmospheric conditions. - In some embodiments, light having wavelengths within an atmospheric absorption band can be used. Careful selection of projector wavelength can permit
projector 22 to compete favorably with solar energy. There are, however, certain wavelengths where the atmospheric absorption is so great that both projector energy and solar energy are attenuated equally. Sunlight is broadband as emitted from the sun, with a maximum intensity falling in the visible spectrum. Sunlight having wavelengths within the infrared spectrum is of lower intensity than in the visible band, and so projected light having such wavelengths need not compete as strongly with sunlight. Using light having such wavelengths can thereby permit reduced power levels in projecting structured light. Atmospheric absorption bands may further reduce solar infrared illumination. For example, atmospheric absorption bands include infrared wavelengths of between about 1.35-1.4, 1.8-1.95, 2.5-2.9, and 5.5-7.2 microns. - The structured light that is projected by
projector 22 can be formed by a collimated beam scanned in a predetermined pattern so as to have a structure that can be identified in images formed by camera 24. Knowledge of the location from which the feature is projected (e.g., the location of projector 22), the location of camera 24, and the location within the images (e.g., pixel coordinates) where the feature is imaged can permit determination, using triangulation, of the location of the object reflecting the structured light. For example, projector 22 can be located at an elevation on first aircraft 10 that is higher than an elevation where camera 24 is located. A location of the imaged feature can be used to determine a location and a range distance to the object from which that specific feature is reflected. -
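The triangulation described above can be sketched as follows, assuming a pinhole camera model and hypothetical mounting heights (the disclosure fixes no specific geometry). A beam projected at a known depression angle and imaged at a known vertical pixel coordinate constrains the reflecting point to the intersection of two rays, from which range follows directly:

```python
import math

# Hypothetical mounting geometry: projector 22 on the vertical stabilizer,
# above camera 24.
PROJECTOR_HEIGHT_M = 10.0
CAMERA_HEIGHT_M = 5.0

def range_from_pixel(beam_depression_deg, pixel_row, principal_row, focal_px):
    """Triangulate range to a reflecting object from one structured-light
    feature: a beam projected at a known depression angle below horizontal,
    imaged at a known vertical pixel row by the lower-mounted camera.

    Pinhole model with a horizontal optical axis: pixel rows increase
    downward, so the depression angle at the camera is
    atan((row - principal) / focal)."""
    cam_depression = math.atan((pixel_row - principal_row) / focal_px)
    tan_p = math.tan(math.radians(beam_depression_deg))
    tan_c = math.tan(cam_depression)
    # Object height y satisfies y = Hp - R*tan_p = Hc - R*tan_c; solve for R.
    return (PROJECTOR_HEIGHT_M - CAMERA_HEIGHT_M) / (tan_p - tan_c)

# Round trip: synthesize the pixel row at which an object 50 m away would
# be imaged, then recover the range from that pixel row.
R_true, beam_deg, focal_px, principal_row = 50.0, 5.0, 800.0, 240.0
y = PROJECTOR_HEIGHT_M - R_true * math.tan(math.radians(beam_deg))
pixel_row = principal_row + focal_px * (CAMERA_HEIGHT_M - y) / R_true
assert abs(range_from_pixel(beam_deg, pixel_row, principal_row, focal_px) - R_true) < 1e-6
```

Because the denominator is the difference of the two ray slopes, a larger vertical baseline between projector and camera yields a larger pixel disparity per unit range, and hence better range resolution.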
Projector 22, for example, can emit structured light to produce a pattern that, when reflected from a surface whose normal is parallel to longitudinal axis 38, is imaged as horizontal lines by camera 24. One structured light beam, for example, might be projected at an angle of elevation of zero degrees (i.e., directed parallel to the horizon). A second structured light beam might be projected at an angle of negative five degrees from the horizon (i.e., directed at a slightly downward angle from projector 22). Each of these projected structured light beams, when reflected from an object, will be imaged at a different vertical location (e.g., each will have a different vertical pixel coordinate) within the camera image, depending on the range distance between the reflecting object and first aircraft 10. Knowing the elevation of projector 22, the elevation of camera 24, the specific feature of the structured light (e.g., which horizontal line is imaged), and the location within the camera image where the specific feature is imaged can permit a determination of the location of the object from which the specific feature has been reflected. - Using the calculated location information, pilots of
first aircraft 10 can be informed of any potential collision hazards within the scene illuminated by projector 22. Pilots of first aircraft 10 can steer first aircraft 10 to avoid wingtip collisions and/or engine collisions based on the location and range information that is calculated by aircraft collision alerting system 20. -
FIG. 3 is a plan view of an airport showing various regions in which various types of on-ground operations are performed. In FIG. 3, airport 52 includes runway 54, taxiways 56, apron 58, ramp 60, terminal gates 62, aircraft stands 64, and hangar 66. Various ground operations are performed when an aircraft is navigating around and/or over each of these different areas of airport 52. For example, when an aircraft is being pushed back from terminal gates 62, the aircraft might be turned so as to exit ramp 60. And when the aircraft then navigates ramp 60, it must carefully avoid the objects located there, including parked aircraft and support vehicles, for example. Then, when the aircraft navigates taxiways 56, it might taxi at a higher speed than when navigating ramp 60. Each of these different areas presents different risks of collision between the aircraft and objects external to the aircraft. Because different risks are present at different areas within airport 52, different fields of view can be used to detect the objects that present the collision risks. -
FIG. 4 is a schematic view depicting various fields of view of an aircraft collision alerting system used by an aircraft during on-ground operations. In FIG. 4, aircraft 10 includes projector 22 mounted on vertical stabilizer 28. Projector 22 is configured to project structured light within various controlled fields of view 68A-68E. Fields of view 68A-68E are progressively smaller. Field of view 68A has a relatively large azimuthal angle 70A compared with relatively small azimuthal angle 70E corresponding to field of view 68E. Each of fields of view 68A-68E is configured so as to detect objects within narrow corridor 72 extending forward of the aircraft's wingtips. Structured light projected within field of view 68E will illuminate objects that are located within narrow corridor 72, but only if such objects are located at a distance greater than minimum distance 74 from aircraft 10. Structured light projected within field of view 68A, however, will illuminate objects that are located within narrow corridor 72 at any distance depicted in FIG. 4. Furthermore, structured light projected within field of view 68A will illuminate objects on the right-hand side of aircraft 10, so as to facilitate detection of objects that present a collision risk should aircraft 10 turn in the right-hand direction. Thus, relatively-wide field of view 68A provides object detection in scenarios in which aircraft 10 might be turning, whereas relatively-narrow field of view 68E provides object detection in scenarios in which aircraft 10 might be taxiing in a straight direction and not turning. -
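A mode selector of the kind described here might map ground speed to an illumination field of view as in the following sketch. The thresholds, angles, and power values are hypothetical; the disclosure specifies only the ordering, i.e., that higher taxi speeds call for narrower, longer-range illumination fields:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IlluminationField:
    azimuth_deg: float       # azimuthal extent of the solid-angle of projection
    emission_power_w: float  # hypothetical emission power setting
    collimated: bool         # tighter beam (less dispersion) for longer range

# Hypothetical field-of-view settings, narrowest to widest:
NARROW = IlluminationField(azimuth_deg=15.0, emission_power_w=40.0, collimated=True)
MODERATE = IlluminationField(azimuth_deg=45.0, emission_power_w=25.0, collimated=False)
WIDE = IlluminationField(azimuth_deg=90.0, emission_power_w=10.0, collimated=False)

def select_field_of_view(ground_speed_kt: float) -> IlluminationField:
    """Mode selector sketch: choose the illumination field of view from
    taxi speed, mirroring the high/moderate/low-speed taxi modes."""
    if ground_speed_kt >= 20.0:   # high-speed taxi: look far ahead, narrowly
        return NARROW
    if ground_speed_kt >= 8.0:    # moderate-speed taxi: intermediate field
        return MODERATE
    return WIDE                   # low-speed taxi / gate approach: wide field

assert select_field_of_view(25.0) is NARROW
assert select_field_of_view(3.0).azimuth_deg > select_field_of_view(25.0).azimuth_deg
```

A steering-wheel-orientation input could be layered on the same selector, widening the field toward the side into which the aircraft is turning.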
FIG. 5 is a block diagram of an exemplary aircraft collision alerting system. Aircraft collision alerting system 20 includes projector(s) 22, camera(s) 24, controller 26, and cockpit alarm and display module 76. Projector(s) 22 is configured to be mounted at a projector location on an aircraft. Projector(s) 22 is further configured to project structured light from projector(s) 22 onto a scene external to the aircraft, thereby illuminating a spatially-patterned portion of the scene. - Camera(s) 24 is configured to be mounted at one or more camera locations on the aircraft. Camera(s) 24 is further configured to receive light reflected from the scene. Camera(s) 24 is further configured to focus the received light onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of the scene. The image can include pixel data generated by the plurality of light-sensitive pixels.
-
Controller 26 receives inputs from camera(s) 24 and from aircraft avionics 78. Controller 26 generates commands that control the operation of projector(s) 22 and camera(s) 24. Controller 26 outputs signals indicative of alarms, ranges, and/or images to cockpit alarms and display module 76. Controller 26 is configured to identify pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the spatially-patterned light projected by projector(s) 22 and reflected from the spatially-patterned portion of the scene is focused. Controller 26 is further configured to use triangulation, based on the projector location of projector(s) 22, the location(s) of camera(s) 24, and the identified pixel coordinates, to calculate range value data of object(s) in the scene from which the spatially-patterned light projected by projector(s) 22 is reflected. -
FIG. 6 is a schematic diagram depicting object location determination using both active and passive imaging. In FIG. 6, camera image 80 of tail 82 of aircraft 84 external to first aircraft 10 (depicted in FIG. 2A) is shown. Camera image 80 is composed from intensity data of a two-dimensional array of light-sensitive pixels (not individually depicted). Tail 82 includes vertical stabilizer 86 and horizontal stabilizer 88. Vertical stabilizer 86 depicts features 90 of structured light projected thereon. Features 90 are diagonal lines of light. Features 90 are imaged by a subset of the two-dimensional array of light-sensitive pixels composing the image. For each of the subset of the two-dimensional array of light-sensitive pixels containing the structured light projected upon tail 82, a range value is calculated. - Between the subset of pixels that have calculated range values are pixels upon which the collimated beam of light has not been projected. For some, if not all, of these pixels, range values can be calculated using the already-calculated range values corresponding to nearby pixels. For example, range values can be calculated for the pixels determined to be boundary pixels of an object. Range values for
boundary pixels 92 may be calculated by modeling the range variations within a single object as a polynomial function of spatial coordinates, for example. Such a model may be used to calculate range values using the pixel coordinates and corresponding range values of pixels having already-calculated range values that reside within the object boundary associated with boundary pixels 92. - Various embodiments can use various structured light patterns having various features. For example, in some embodiments, vertical or diagonal lines can be projected upon a scene. In some embodiments, spots of light can be projected upon a scene. In an exemplary embodiment, both vertical lines and horizontal lines can be projected upon a scene, using projectors that are horizontally and/or vertically displaced, respectively, from the camera location.
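A first-degree instance of the polynomial range model described above can be sketched as follows, simplified to one pixel coordinate (a hypothetical reduction; the disclosure contemplates a polynomial over both spatial coordinates). Pixels within the object that already have triangulated ranges are fit by least squares, and the model is evaluated at a boundary pixel:

```python
def fit_range_polynomial(pixel_rows, ranges):
    """Least-squares linear fit range(v) = a + b*v through pixels on one
    object that already have triangulated ranges; returns (a, b)."""
    n = len(pixel_rows)
    mean_v = sum(pixel_rows) / n
    mean_r = sum(ranges) / n
    b = sum((v - mean_v) * (r - mean_r) for v, r in zip(pixel_rows, ranges)) \
        / sum((v - mean_v) ** 2 for v in pixel_rows)
    a = mean_r - b * mean_v
    return a, b

# Ranges are known at rows 10, 20, and 30 inside the object boundary;
# estimate the range at boundary pixel row 35 by evaluating the model there.
a, b = fit_range_polynomial([10, 20, 30], [52.0, 50.0, 48.0])
boundary_range = a + b * 35
assert abs(boundary_range - 47.0) < 1e-9
```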
-
FIG. 7 is a block diagram of an embodiment of controller 26 depicted in FIG. 5. In FIG. 7, the block diagram includes controller 26, projector(s) 22, camera(s) 24, cockpit alarms and display module 76, and aircraft avionics 78. Controller 26 includes processor(s) 92, input/output interface 94, storage device(s) 96, input devices 98, and output devices 100. Storage device(s) 96 has various storage or memory locations. Storage device(s) 96 includes program memory 102 and data memory 104. Controller 26 is in communication with cockpit alarms and display module 76 and aircraft avionics 78 via input/output interface 94. Aircraft avionics 78 can provide controller 26 with metrics indicative of the aircraft's location, orientation, speed, etc. Controller 26 can provide cockpit alarms and display module 76 with signals indicative of risk of collision with an object(s) external to the aircraft. - As illustrated in
FIG. 7, controller 26 includes processor(s) 92, input/output interface 94, storage device(s) 96, input devices 98, and output devices 100. However, in certain examples, controller 26 can include more or fewer components. For instance, in examples where controller 26 is an avionics system, controller 26 may not include input devices 98 and/or output devices 100. Controller 26 may include additional components, such as a battery that provides power to components of controller 26 during operation. In some embodiments, controller 26 can be located together with projector(s) 22 and/or camera(s) 24. - Processor(s) 92, in one example, is configured to implement functionality and/or process instructions for execution within
controller 26. For instance, processor(s) 92 can be capable of processing instructions stored in storage device(s) 96. Examples of processor(s) 92 can include any one or more of a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or other equivalent discrete or integrated logic circuitry. - Input/
output interface 94, in some examples, includes a communications module. Input/output interface 94, in one example, utilizes the communications module to communicate with external devices via one or more networks, such as one or more wireless or wired networks or both. The communications module can be a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information. Other examples of such network interfaces can include Bluetooth, 3G, 4G, and Wi-Fi radio computing devices as well as Universal Serial Bus (USB). In some embodiments, communication with the aircraft can be performed via a communications bus, such as, for example, an Aeronautical Radio, Incorporated (ARINC) standard communications protocol. In an exemplary embodiment, communication with the aircraft can be performed via a communications bus, such as, for example, a Controller Area Network (CAN) bus. - Storage device(s) 96 can be configured to store information within
controller 26 during operation. Storage device(s) 96, in some examples, is described as computer-readable storage media. In some examples, a computer-readable storage medium can include a non-transitory medium. The term "non-transitory" can indicate that the storage medium is not embodied in a carrier wave or a propagated signal. In certain examples, a non-transitory storage medium can store data that can, over time, change (e.g., in RAM or cache). In some examples, storage device(s) 96 is a temporary memory, meaning that a primary purpose of storage device(s) 96 is not long-term storage. Storage device(s) 96, in some examples, is described as volatile memory, meaning that storage device(s) 96 do not maintain stored contents when power to controller 26 is turned off. Examples of volatile memories can include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories. In some examples, storage device(s) 96 is used to store program instructions for execution by processor(s) 92. Storage device(s) 96, in one example, is used by software or applications running on controller 26 (e.g., a software program implementing collision alerting) to temporarily store information during program execution. - Storage device(s) 96, in some examples, also include one or more computer-readable storage media. Storage device(s) 96 can be configured to store larger amounts of information than volatile memory. Storage device(s) 96 can further be configured for long-term storage of information. In some examples, storage device(s) 96 include non-volatile storage elements. Examples of such non-volatile storage elements can include magnetic hard discs, optical discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
-
Input devices 98, in some examples, are configured to receive input from a user. Examples of input devices 98 can include a mouse, a keyboard, a microphone, a camera device, a presence-sensitive and/or touch-sensitive display, push buttons, arrow keys, or other types of devices configured to receive input from a user. In some embodiments, input communication from the user can be performed via a communications bus, such as, for example, an Aeronautical Radio, Incorporated (ARINC) standard communications protocol. In an exemplary embodiment, input communication from the user can be performed via a communications bus, such as, for example, a Controller Area Network (CAN) bus. -
Output devices 100 can be configured to provide output to a user. Examples of output devices 100 can include a display device, a sound card, a video graphics card, a speaker, a cathode ray tube (CRT) monitor, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, or other type of device for outputting information in a form understandable to users or machines. In some embodiments, output communication to the user can be performed via a communications bus, such as, for example, an Aeronautical Radio, Incorporated (ARINC) standard communications protocol. In an exemplary embodiment, output communication to the user can be performed via a communications bus, such as, for example, a Controller Area Network (CAN) bus. - The following are non-exclusive descriptions of possible embodiments of the present invention.
- Apparatus and associated methods relate to a system for calculating position values and/or range data of an object(s) external to an aircraft. The system includes a mode selector configured to determine an illumination field of view. The system includes a projector mounted at a projector location on the aircraft and configured to project structured light within the illumination field of view, thereby illuminating the object(s) external to the aircraft that are within the illumination field of view. The system includes a camera mounted at a camera location on the aircraft and configured to receive a portion of the structured light reflected by the object(s) within the illumination field of view. The camera is further configured to focus the received portion of the structured light onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of object(s) within the illumination field of view. The image includes pixel data generated by the plurality of light-sensitive pixels. The system also includes an image processor configured to identify pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused. The image processor is further configured to use triangulation, based on the projector location, the camera location, and the identified pixel coordinates, to calculate the position values and/or range data of the object(s) within the illumination field of view.
- The system of the preceding paragraph can optionally include, additionally and/or alternatively, any one or more of the following features, configurations and/or additional components:
- A further embodiment of the foregoing system, wherein the mode selector can be further configured to determine an emission power, and the projector can be further configured to project structured light of the determined emission power.
- A further embodiment of any of the foregoing systems, wherein the mode selector can be further configured to determine beam dispersion, and the projector can be further configured to project structured light of the determined beam dispersion.
- A further embodiment of any of the foregoing systems, wherein the mode selector can include a selection input device operable by a pilot of the aircraft.
- A further embodiment of any of the foregoing systems, wherein the mode selector can determine the illumination field of view within which the structured light is projected, based on ground operation modes of the aircraft.
- A further embodiment of any of the foregoing systems, wherein, if the ground operation mode is a high-speed taxi mode, then the mode selector can determine a first illumination field of view within which the structured light is projected. If the ground operation mode is a low-speed taxi mode, then the mode selector can determine a second illumination field of view within which the structured light is projected. The second illumination field of view can have a greater solid-angle than a solid-angle of the first illumination field of view.
- A further embodiment of any of the foregoing systems, wherein, if the ground operation mode is a moderate-speed taxi mode, then the mode selector can determine a third illumination field of view within which the structured light is projected. The third illumination field of view can have a greater solid-angle than the solid-angle of the first illumination field of view but less than the solid-angle of the second illumination field of view.
- A further embodiment of any of the foregoing systems can further include an aircraft system interface configured to receive signals indicative of aircraft operating parameters.
- A further embodiment of any of the foregoing systems, wherein the aircraft operating parameters can include an aircraft taxi speed. The mode selector can be further configured to determine the illumination field of view in response to the received signal indicative of the aircraft taxi speed.
- A further embodiment of any of the foregoing systems, wherein the aircraft operating parameters can include an orientation of a steerable wheel. The mode selector can be further configured to determine the illumination field of view in response to the received signal indicative of the orientation of the steerable wheel.
- A further embodiment of any of the foregoing systems can further include an audible alarm configured to generate an alert signal. The alert signal can be generated if the calculated position values and range data of the object(s) within the illumination field of view indicate that one or more of the object(s) are within a collision zone or on a collision trajectory.
- A further embodiment of any of the foregoing systems, wherein the projector can be an infrared projector configured to project structured light that is infrared light.
- Some embodiments relate to a method for generating an alert signal of a potential aircraft collision for a taxiing aircraft. The method includes determining an illumination field of view. The method includes projecting structured light within the determined illumination field of view. The method includes receiving a portion of the structured light reflected by object(s) within the illumination field of view. The method includes focusing the received portion of the structured light onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of the object(s) within the illumination field of view. The image includes pixel data generated by the plurality of light-sensitive pixels. The method includes identifying pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused. The method includes calculating, based on a projector location, a camera location, and the identified pixel coordinates, position values and/or range data of the object(s) within the illumination field of view by which the structured light is reflected. The method also includes generating an alert signal if the calculated position values and range data of the object(s) indicate that one or more of the object(s) are within a collision zone or on a collision trajectory.
- The method of the preceding paragraph can optionally include, additionally and/or alternatively, any one or more of the following features, configurations and/or additional components:
- A further embodiment of the foregoing method, wherein the illumination field of view external to the aircraft can be determined based on ground operation modes of the aircraft.
- A further embodiment of any of the foregoing methods can further include determining an emission power. The projected structured light can be of the determined emission power.
- A further embodiment of any of the foregoing methods can further include determining a beam dispersion. The projected structured light can be of the determined beam dispersion.
- A further embodiment of any of the foregoing methods can further include receiving signals indicative of aircraft operating parameters. The method can also include determining the illumination field of view in response to a received signal indicative of an aircraft taxi speed.
- A further embodiment of any of the foregoing methods can further include receiving signals indicative of aircraft operating parameters. The method can also include determining the illumination field of view in response to a received signal indicative of an orientation of a steerable wheel.
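As an illustration of how a steerable-wheel signal could drive the illumination field of view, the sketch below biases the field's centre azimuth toward the commanded turn. The gain, steering limit, and function name are assumptions for illustration, not details taken from the disclosure.

```python
def fov_center_azimuth(nose_wheel_angle_deg: float,
                       gain: float = 0.5,
                       limit_deg: float = 30.0) -> float:
    """Bias the centre azimuth of the illumination field of view toward
    the direction of the steerable nose wheel (illustrative model).

    Positive angles are to the right of the aircraft centreline; the
    result is clamped to an assumed mechanical limit of the projector.
    """
    commanded = gain * nose_wheel_angle_deg
    return max(-limit_deg, min(limit_deg, commanded))
```

For example, with a nose wheel commanded 40 degrees left, the illumination field would be biased 20 degrees left of the centreline, so obstacles along the turning path are illuminated before the wingtip sweeps toward them.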
- A further embodiment of any of the foregoing methods, wherein determining the illumination field of view within which the structured light is projected can be based on ground operation modes of the aircraft.
- A further embodiment of any of the foregoing methods, wherein, if the ground operation mode is a high-speed taxi mode, then the mode selector can determine a first illumination field of view within which the structured light is projected. If the ground operation mode is a low-speed taxi mode, then the mode selector can determine a second illumination field of view within which the structured light is projected. The second illumination field of view can have a greater solid-angle than a solid-angle of the first illumination field of view.
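The speed-dependent mode selection described above can be sketched as a simple threshold rule. The solid-angle values and speed thresholds below are illustrative assumptions only; the disclosure specifies just the ordering (low-speed field wider than moderate-speed, moderate wider than high-speed).

```python
# Hypothetical solid angles in steradians (not from the disclosure):
NARROW_FOV_SR = 0.05    # high-speed taxi: long-range, narrow beam
MEDIUM_FOV_SR = 0.15    # moderate-speed taxi: intermediate beam
WIDE_FOV_SR = 0.40      # low-speed taxi: short-range, wide beam

def illumination_fov_for_speed(taxi_speed_knots: float,
                               low_threshold: float = 5.0,
                               high_threshold: float = 15.0) -> float:
    """Select an illumination field of view (solid angle) from the
    ground operation mode implied by taxi speed. Thresholds are assumed
    values for illustration."""
    if taxi_speed_knots >= high_threshold:
        return NARROW_FOV_SR      # high-speed taxi mode
    if taxi_speed_knots <= low_threshold:
        return WIDE_FOV_SR        # low-speed taxi mode
    return MEDIUM_FOV_SR          # moderate-speed taxi mode
```

The design rationale follows from energy conservation: at a fixed emission power, concentrating the structured light into a smaller solid angle extends detection range for fast taxiing, while a wider field at low speed covers close-in obstacles near the wingtips.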
- While the invention has been described with reference to an exemplary embodiment(s), it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment(s) disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
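For a concrete picture of the triangulation recited above, the sketch below ranges a reflecting point from the known projector-camera baseline and a camera pixel coordinate. The pinhole-camera model, the angle conventions, and all names and parameters are illustrative assumptions, not the patented implementation.

```python
import math

def camera_ray_angle(pixel_u: float,
                     principal_point_u: float,
                     focal_length_px: float) -> float:
    """Angle (from the projector-camera baseline) of the camera ray
    implied by a pixel coordinate, for a simple pinhole model whose
    optical axis is assumed perpendicular to the baseline."""
    offset = pixel_u - principal_point_u
    off_axis = math.atan2(offset, focal_length_px)  # angle off the optical axis
    return math.pi / 2 - off_axis

def triangulate_range(baseline_m: float,
                      projector_angle: float,
                      camera_angle: float) -> float:
    """Perpendicular distance from the baseline to the reflecting object.

    Both angles are measured from the baseline at the projector and
    camera ends; the range follows from the law of sines in the
    projector-object-camera triangle.
    """
    apex = math.pi - projector_angle - camera_angle  # angle at the object
    if apex <= 0.0:
        raise ValueError("rays do not converge in front of the baseline")
    # Camera-to-object distance, then its component perpendicular to baseline.
    r_cam = baseline_m * math.sin(projector_angle) / math.sin(apex)
    return r_cam * math.sin(camera_angle)
```

With a 1 m baseline and both rays at 45 degrees to it, the object lies 0.5 m from the baseline; the identified pixel coordinates supply the camera-side angle, and the known projection pattern supplies the projector-side angle.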
Claims (20)
1. A system for calculating position values and/or range data of an object(s) external to an aircraft, the system comprising:
a mode selector configured to determine an illumination field of view;
a projector mounted at a projector location on the aircraft and configured to project structured light within the illumination field of view, thereby illuminating the object(s) external to the aircraft that are within the illumination field of view;
a camera mounted at a camera location on the aircraft and configured to receive a portion of the structured light reflected by the object(s) within the illumination field of view, and further configured to focus the received portion of the structured light onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of object(s) within the illumination field of view, the image comprising pixel data generated by the plurality of light-sensitive pixels; and
an image processor configured to identify pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused, the image processor further configured to use triangulation, based on the projector location, the camera location, and the identified pixel coordinates, to calculate the position values and/or range data of the object(s) within the illumination field of view.
2. The system of claim 1 , wherein the mode selector is further configured to determine an emission power, and the projector is further configured to project structured light of the determined emission power.
3. The system of claim 1 , wherein the mode selector is further configured to determine beam dispersion, and the projector is further configured to project structured light of the determined beam dispersion.
4. The system of claim 1 , wherein the mode selector comprises a selection input device operable by a pilot of the aircraft.
5. The system of claim 1 , wherein the mode selector determines the illumination field of view within which the structured light is projected, based on ground operation modes of the aircraft.
6. The system of claim 5 , wherein, if the ground operation mode is a high-speed taxi mode, then the mode selector determines a first illumination field of view within which the structured light is projected, and wherein, if the ground operation mode is a low-speed taxi mode, then the mode selector determines a second illumination field of view within which the structured light is projected, wherein the second illumination field of view has a greater solid-angle than a solid-angle of the first illumination field of view.
7. The system of claim 6 , wherein, if the ground operation mode is a moderate-speed taxi mode, then the mode selector determines a third illumination field of view within which the structured light is projected, wherein the third illumination field of view has a greater solid-angle than the solid-angle of the first illumination field of view but less than the solid-angle of the second illumination field of view.
8. The system of claim 1 , further comprising:
an aircraft system interface configured to receive signals indicative of aircraft operating parameters.
9. The system of claim 8 , wherein the aircraft operating parameters include an aircraft taxi speed, wherein the mode selector is further configured to determine the illumination field of view in response to the received signal indicative of the aircraft taxi speed.
10. The system of claim 8 , wherein the aircraft operating parameters include an orientation of a steerable wheel, wherein the mode selector is further configured to determine the illumination field of view in response to the received signal indicative of the orientation of the steerable wheel.
11. The system of claim 1 , further comprising:
an audible alarm configured to generate an alert signal, wherein the alert signal is generated if the calculated position values and range data of the object(s) within the illumination field of view indicate that one or more of the object(s) are within a collision zone or on a collision trajectory.
12. The system of claim 1 , wherein the projector is an infrared projector configured to project structured light that is infrared light.
13. A method for generating an alert signal of a potential aircraft collision for a taxiing aircraft, the method comprising the steps of:
determining an illumination field of view;
projecting structured light within the determined illumination field of view;
receiving a portion of the structured light reflected by object(s) within the illumination field of view;
focusing the received portion of the structured light onto a focal plane array comprising a plurality of light-sensitive pixels, thereby forming an image of the object(s) within the illumination field of view, the image comprising pixel data generated by the plurality of light-sensitive pixels;
identifying pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels upon which the received portion of the structured light reflected by the object(s) within the illumination field of view is focused;
calculating, based on a projector location, a camera location, and the identified pixel coordinates, position values and/or range data of the object(s) within the illumination field of view by which the structured light is reflected; and
generating an alert signal if the calculated position values and range data of the object(s) indicate that one or more of the object(s) are within a collision zone or on a collision trajectory.
14. The method of claim 13 , wherein the illumination field of view external to the aircraft is determined based on ground operation modes of the aircraft.
15. The method of claim 13 , further comprising:
determining an emission power; wherein the projected structured light is of the determined emission power.
16. The method of claim 13 , further comprising:
determining a beam dispersion; wherein the projected structured light is of the determined beam dispersion.
17. The method of claim 13 , further comprising:
receiving signals indicative of aircraft operating parameters; and
determining the illumination field of view in response to a received signal indicative of an aircraft taxi speed.
18. The method of claim 13 , further comprising:
receiving signals indicative of aircraft operating parameters; and
determining the illumination field of view in response to a received signal indicative of an orientation of a steerable wheel.
19. The method of claim 13 , wherein determining the illumination field of view within which the structured light is projected is based on ground operation modes of the aircraft.
20. The method of claim 19 , wherein, if the ground operation mode is a high-speed taxi mode, then the mode selector determines a first illumination field of view within which the structured light is projected, and wherein, if the ground operation mode is a low-speed taxi mode, then the mode selector determines a second illumination field of view within which the structured light is projected, wherein the second illumination field of view has a greater solid-angle than a solid-angle of the first illumination field of view.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/683,215 US10217371B1 (en) | 2017-08-22 | 2017-08-22 | Method and system for aircraft taxi strike alerting using adaptive field of view |
EP18189046.8A EP3446984B1 (en) | 2017-08-22 | 2018-08-14 | Method and system for aircraft taxi strike alerting using adaptive field of view |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/683,215 US10217371B1 (en) | 2017-08-22 | 2017-08-22 | Method and system for aircraft taxi strike alerting using adaptive field of view |
Publications (2)
Publication Number | Publication Date |
---|---|
US10217371B1 US10217371B1 (en) | 2019-02-26 |
US20190066523A1 true US20190066523A1 (en) | 2019-02-28 |
Family
ID=63311797
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/683,215 Active 2037-10-03 US10217371B1 (en) | 2017-08-22 | 2017-08-22 | Method and system for aircraft taxi strike alerting using adaptive field of view |
Country Status (2)
Country | Link |
---|---|
US (1) | US10217371B1 (en) |
EP (1) | EP3446984B1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11097851B1 (en) | 2019-11-19 | 2021-08-24 | Rockwell Collins, Inc. | System and method to momentarily switch SVS mode |
US11315434B2 (en) | 2019-10-31 | 2022-04-26 | Rockwell Collins, Inc. | System and method to change SVS mode |
US11610497B2 (en) | 2020-12-15 | 2023-03-21 | Rockwell Collins, Inc. | System and method to display SVS taxi mode exocentric view of aircraft |
US12017792B2 (en) | 2020-04-14 | 2024-06-25 | Rockwell Collins, Inc. | System and method to change map range of airport moving map |
US12025456B2 (en) | 2021-03-29 | 2024-07-02 | Rockwell Collins, Inc. | System and method to display airport moving map and taxi routing guidance content |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190310373A1 (en) * | 2018-04-10 | 2019-10-10 | Rosemount Aerospace Inc. | Object ranging by coordination of light projection with active pixel rows of multiple cameras |
CN112027107B (en) * | 2019-06-04 | 2022-03-11 | 丰鸟航空科技有限公司 | Unmanned aerial vehicle avoidance test system, method and device, terminal equipment and storage medium |
US10867522B1 (en) | 2019-08-28 | 2020-12-15 | Honeywell International Inc. | Systems and methods for vehicle pushback collision notification and avoidance |
Family Cites Families (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4572662A (en) * | 1982-11-05 | 1986-02-25 | The United States Of America As Represented By The Secretary Of The Army | Wire and wire like object detection system |
US8538636B2 (en) * | 1995-06-07 | 2013-09-17 | American Vehicular Sciences, LLC | System and method for controlling vehicle headlights |
US6405975B1 (en) | 1995-12-19 | 2002-06-18 | The Boeing Company | Airplane ground maneuvering camera system |
US6937348B2 (en) * | 2000-01-28 | 2005-08-30 | Genex Technologies, Inc. | Method and apparatus for generating structural pattern illumination |
US6909381B2 (en) * | 2000-02-12 | 2005-06-21 | Leonard Richard Kahn | Aircraft collision avoidance system |
US6963293B1 (en) * | 2000-05-11 | 2005-11-08 | Rastar Corporation | System and method of preventing aircraft wingtip ground incursion |
US6571166B1 (en) | 2000-06-23 | 2003-05-27 | Rockwell Collins, Inc. | Airport surface operation advisory system |
US6754370B1 (en) * | 2000-08-14 | 2004-06-22 | The Board Of Trustees Of The Leland Stanford Junior University | Real-time structured light range scanning of moving scenes |
FR2854129B1 (en) | 2003-04-28 | 2007-06-01 | Airbus France | DISPLAY DEVICE IN AN AIRCRAFT COCKPIT WITH INFORMATION ABOUT SURROUNDING TRAFFIC |
JP2008512742A (en) * | 2004-09-07 | 2008-04-24 | ウィリアム・マイケル・バトラー | Collision avoidance warning and ground travel guidance device |
EP1842082A2 (en) * | 2005-01-20 | 2007-10-10 | Elbit Systems Electro-Optics Elop Ltd. | Laser obstacle detection and display |
US7592929B2 (en) | 2006-04-06 | 2009-09-22 | Honeywell International Inc. | Runway and taxiway turning guidance |
US7623044B2 (en) | 2006-04-06 | 2009-11-24 | Honeywell International Inc. | Runway and taxiway turning guidance |
US7737867B2 (en) | 2006-04-13 | 2010-06-15 | The United States Of America As Represented By The United States National Aeronautics And Space Administration | Multi-modal cockpit interface for improved airport surface operations |
US7974773B1 (en) | 2007-06-21 | 2011-07-05 | Rockwell Collins, Inc. | Methods and devices of an aircraft taxi navigation system |
US8687056B2 (en) | 2007-07-18 | 2014-04-01 | Elbit Systems Ltd. | Aircraft landing assistance |
US20090140887A1 (en) * | 2007-11-29 | 2009-06-04 | Breed David S | Mapping Techniques Using Probe Vehicles |
US9245450B1 (en) | 2008-07-03 | 2016-01-26 | Rockwell Collins, Inc. | System, apparatus, and method for generating runway visual aids on an aircraft display unit |
US8849477B2 (en) | 2008-10-14 | 2014-09-30 | Honeywell International Inc. | Avionics display system and method for generating three dimensional display including error-compensated airspace |
FR2959052B1 (en) | 2010-04-16 | 2012-12-28 | Thales Sa | ON-BOARD ASSISTANCE DEVICE FOR MONITORING AN AIRCRAFT ROUTE BY AN AIRCRAFT |
US8400330B2 (en) | 2010-09-07 | 2013-03-19 | Honeywell International Inc. | System for displaying multiple overlaid images to a pilot of an aircraft during flight |
US8692980B2 (en) * | 2010-11-01 | 2014-04-08 | Advanced Scientific Concepts, Inc. | Flash LADAR collision avoidance system |
US20120224058A1 (en) | 2011-03-02 | 2012-09-06 | Rosemount Aerospace Inc. | Airplane cockpit video system |
FR2973343B1 (en) | 2011-04-01 | 2013-11-29 | Latecoere | AIRCRAFT COMPRISING AN ENVIRONMENTAL OBSERVATION SYSTEM OF THIS AIRCRAFT |
US9013330B2 (en) | 2011-09-01 | 2015-04-21 | Honeywell International Inc. | Electric taxi system guidance |
US9958867B2 (en) | 2012-01-13 | 2018-05-01 | Borealis Technical Limited | Monitoring and control system for enhancing ground movement safety in aircraft equipped with non-engine drive means |
US8797191B2 (en) | 2012-07-13 | 2014-08-05 | Honeywell International Inc. | Systems and methods for displaying runway information |
CA2833985C (en) * | 2012-11-19 | 2020-07-07 | Rosemount Aerospace, Inc. | Collision avoidance system for aircraft ground operations |
FR3004423B1 (en) | 2013-04-12 | 2015-05-01 | Latecoere | METHOD AND SYSTEM FOR VISUALIZING THE EXTERNAL ENVIRONMENT OF AN AIRCRAFT, AND AIRCRAFT DOOR EQUIPPED WITH SUCH A SYSTEM |
US9047771B1 (en) | 2014-03-07 | 2015-06-02 | The Boeing Company | Systems and methods for ground collision avoidance |
US9174746B1 (en) | 2014-06-26 | 2015-11-03 | Rockwell Collins, Inc. | Visual aid generating system, device, and method |
FR3023406B1 (en) | 2014-07-07 | 2019-07-12 | Airbus | METHOD FOR AIDING THE FLIGHT OF AN AIRCRAFT AND SYSTEM FOR ITS IMPLEMENTATION |
EP3198582B1 (en) * | 2014-09-22 | 2019-06-12 | Gulfstream Aerospace Corporation | Methods and systems for collision avoidance using visual indication of wingtip path |
US9989357B2 (en) * | 2015-09-09 | 2018-06-05 | Faro Technologies, Inc. | Aerial device that cooperates with an external projector to measure three-dimensional coordinates |
FR3041938B1 (en) | 2015-10-02 | 2018-08-17 | Latecoere | METHOD AND ON-BOARD EQUIPMENT FOR AIDING THE TAXIING AND COLLISION AVOIDANCE OF A VEHICLE, IN PARTICULAR AN AIRCRAFT |
CN105391975A (en) | 2015-11-02 | 2016-03-09 | 中国民用航空总局第二研究所 | Video processing method in scene monitoring, device and scene monitoring system |
US10043404B2 (en) * | 2016-04-18 | 2018-08-07 | Rosemount Aerospace Inc. | Method and system for aircraft taxi strike alerting |
US11319086B2 (en) | 2016-05-23 | 2022-05-03 | Rosemount Aerospace Inc. | Method and system for aligning a taxi-assist camera |
US10559213B2 (en) * | 2017-03-06 | 2020-02-11 | Rosemount Aerospace Inc. | Method and system for aircraft taxi strike alerting |
US10096256B2 (en) * | 2017-03-07 | 2018-10-09 | Rosemount Aerospace Inc. | Method and system for aircraft taxi strike alerting |
US10720069B2 (en) * | 2017-04-17 | 2020-07-21 | Rosemount Aerospace Inc. | Method and system for aircraft taxi strike alerting |
Also Published As
Publication number | Publication date |
---|---|
US10217371B1 (en) | 2019-02-26 |
EP3446984B1 (en) | 2020-06-17 |
EP3446984A1 (en) | 2019-02-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3446984B1 (en) | Method and system for aircraft taxi strike alerting using adaptive field of view | |
EP3235735B1 (en) | Method and system for generating a collision alerting during aircraft taxiing | |
EP3372508B1 (en) | Method and system for aircraft taxi strike alerting | |
EP3392152B1 (en) | Method and system for aircraft taxi strike alerting | |
US10249203B2 (en) | Method and system for providing docking guidance to a pilot of a taxiing aircraft | |
US10838068B2 (en) | Obstacle avoidance system for aircraft | |
EP3597545B1 (en) | Taxi strike alert system | |
EP3372509B1 (en) | Method and system for aircraft taxi strike alerting | |
JP2016161572A (en) | System and methods of detecting intruding object | |
US11053005B2 (en) | Circular light source for obstacle detection | |
EP3546884B1 (en) | Ranging objects external to an aircraft using multi-camera triangulation | |
EP3552973B1 (en) | Object ranging by coordination of light projection with active pixel rows of multiple cameras | |
CN109385939B (en) | Multi-inlet runway scratch-proof system | |
AU2017298227B2 (en) | Interactive projection system | |
CN101850846B (en) | Intelligent laser device for positively avoiding flyer and method thereof for avoiding flyer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: ROSEMOUNT AEROSPACE INC., MINNESOTA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: PESIK, JOSEPH T.; RUTKIEWICZ, ROBERT; ELL, TODD ANTHONY; REEL/FRAME: 043357/0193. Effective date: 20170822 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |