WO2014003698A1 - An aircraft vision system - Google Patents
An aircraft vision system
- Publication number
- WO2014003698A1 (PCT/TR2013/000213)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- aircraft
- vision system
- video
- user device
- image
- Prior art date
Classifications
- G01C23/00—Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
- G06F16/29—Geographical information databases
- G06F16/444—Spatial browsing, e.g. 2D maps, 3D or virtual spaces
- G06F16/487—Retrieval characterised by using metadata, using geographical or spatial information, e.g. location
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
Abstract
The invention relates to an aircraft vision system, used especially in airliners, comprising: at least one panoptic camera positioned outside the aircraft, able to capture a spherical field-of-view segment around its point of view and at the same time adapted to output, upon request, depth information from a captured image/video; at least one positioning system adapted to determine the position and angles of the aircraft; at least one map database adapted to accommodate a digital virtual map and contents regarding the positions determined in said map; at least one user device which displays, via an imaging device, the sections captured with the panoptic camera that are desired to be displayed and that have been determined via tri-axial interfaces; and at least one main control unit which is connected with the panoptic camera, positioning system, map database, image database and user device, and provides the communication between said components.
Description
AN AIRCRAFT VISION SYSTEM
Technical Field
The present invention is related to vision systems especially used in airliners.
Prior Art
Augmented reality systems, which enrich a real image/video by superimposing various objects on it via computer graphics and thereby change a person's visual/audio perception of reality, are already known. When used together with digital maps and a 3-dimensional model database, augmented reality systems can superimpose the data held in the database over the image/video captured from the camera, using the position and angle information of the camera. For example, using the coordinates of various buildings stored in the database, the names of the buildings are superimposed on the image/video at the positions where these buildings appear according to the viewpoint of the camera. When a plurality of users wants to watch the image/video from the same fixed camera in such systems, every user is obliged to watch the same image/video and it is not possible for each user to look in a different direction.
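The superimposition described above ultimately reduces to projecting a stored coordinate into the camera's view. A minimal sketch of the horizontal part of that projection follows; the `project_to_screen` helper, the flat local coordinate frame (x east, y north) and the parameter values are illustrative assumptions, not taken from the document:

```python
import math

def project_to_screen(cam_pos, cam_yaw_deg, point, fov_deg=60.0, width=1920):
    """Project a world point (x east, y north) onto the horizontal axis of a
    screen for a camera at cam_pos whose optical axis points along
    cam_yaw_deg (degrees from north). Returns the pixel column where a label
    should be drawn, or None if the point is outside the field of view."""
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))              # bearing from north
    rel = (bearing - cam_yaw_deg + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    half = fov_deg / 2.0
    if abs(rel) > half:
        return None                                         # outside the view
    return int(round((rel + half) / fov_deg * (width - 1)))

# A building 100 m due north of a north-looking camera: its label lands
# at (approximately) the centre column of the screen.
col = project_to_screen((0.0, 0.0), 0.0, (0.0, 100.0))
```

A real system would extend this to both screen axes and use the full camera rotation, but the wrap-to-relative-bearing step is the essential part.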
In this case, even if an articulated or gimbal-mounted camera is used as a solution, the field of view of the camera is adjusted according to the request of only one user. For each user to be able to look in a different direction, a plurality of rotatable camera systems would need to be used; however, this becomes expensive with more than 2 or 3 users. Moreover, camera motions are not sufficiently fast, and since such a camera comprises mechanical components, it carries a higher risk of breakdown and requires maintenance. Another technique used to reach this aim nowadays is panoramic imaging systems. In such a case, a part of the captured panoramic image/video can be displayed on the device of a user, and augmented reality superimposition is carried out for the field of view that the user wants to display, according to the field of view at which the panoramic image/video was taken. By this means, the panoramic image/video captured from a certain monitoring point can be shared with a plurality of users. Users can not only indicate their field of view with various input units (joystick, keyboard, etc.); the field of view of a user can also be tracked with a head-tracking device, and the image/video can be monitored via imaging systems mounted on the headpiece (helmet/head-mounted display). However, panoramic imaging systems can only capture 2-dimensional images/video, which makes 3-dimensional visualisation more difficult. Panoptic cameras are formed of sensors adjusted to look in various directions, generally protruding from a certain point and positioned at different angles. These devices are able to capture a spherical field of view at certain refresh rates as 3-dimensional or 2-dimensional images/video. For example, when looked at from the centre, a video spanning a hemisphere around said centre can be obtained as an output. The distance to each point on an image/video can be determined with a panoptic camera, and by this means a depth map can be generated.
When airliners and the number of passengers they transport are taken into account, it is clear that not all passengers can easily look outside, for various reasons. The wings of the aircraft, the size of the windows and the positions of the seats block the view of some passengers. Passengers with window seats can see only a limited area, cannot obtain detailed information about what they are seeing, and even though taking pictures in the aircraft is not prohibited, they cannot always freely capture the image/video they desire. There are some mapping and information systems in commercial aircraft; however, it is difficult for passengers to relate such information to real images. It is obvious that articulated or fixed camera systems cannot solve these problems. Moreover, even when such systems are integrated into a vehicle, it is not possible to capture images in insufficient lighting or bad weather conditions.
The United States patent document numbered US6559846 describes a system wherein the data related to each frame of the image captured from a panoramic camera or a video recorder is mapped over a determined peripheral model, and wherein the user selects the part of the panoramic video corresponding to the direction in which he/she wants to look.
The United States patent document numbered US2011141254 discusses an augmented reality method wherein an augmented reality image is superimposed over a real image depending on the three-dimensional position of the imaging device.
The International patent document numbered WO2008147561 describes the capture of panoramic images and the addition of notes to said images. This system comprises a panoramic projector which captures at least a section of the panoramic image and an imaging unit which displays it. The imaging unit superimposes a three-dimensional image over a panoramic image, and the panoramic display rotates and matches this image so that it fits the panoramic image.
The United States patent document numbered US2002036649 describes an image adapting device which creates an augmented reality by forming a panoramic image from the images captured by a plurality of cameras, with multi-user support, determines the field of view of a user via a tracking device, and combines a virtual picture with the panoramic view of the area being looked at.
Brief Description of the Invention
The aim of this invention is to provide an aircraft vision system wherein all of the passengers and crew in an aircraft can see outside and watch in any direction, independently of each other, with augmented reality and digital map support in real time.
Another aim of this invention is to provide an aircraft vision system with which passengers can watch, even when there is no visibility outside the aircraft (like in bad weather conditions or at night).
Another aim of this invention is to provide an aircraft vision system which enables the passengers to view the images/video of a region without waiting for the aircraft to reach said region and which alerts the passengers when said region is reached.
Another aim of this invention is to provide an aircraft vision system with which all of the passengers and the crew can see other aircraft that are close by, independently of each other, with augmented reality and in real time.
Another aim of this invention is to provide an aircraft vision system wherein a wide angle view can be recorded during flights.
Detailed Description of the Invention
An aircraft vision system produced in order to reach the aims of the present invention has been illustrated in the attached figures, wherein said figures show the following:
Figure 1 - Is the schematic view of an aircraft vision system subject to the invention.
The parts in the figures have been each numbered and the references of said numbers have been given below:
1. Aircraft vision system
2. Main control unit
3. Panoptic camera
4. Positioning system
5. Digital Map database
6. User device
7. Printer
8. Aircraft
9. Image database
10. Panoptic image/video information transmitter
11. Ground station
An aircraft vision system (1) comprises:
- At least one panoptic camera (3) positioned outside the aircraft (8), able to capture a hemispherical field-of-view segment around the installation point,
- At least one positioning system (4) used to determine the position and angles of the aircraft (8),
At least one map database (5) used to accommodate a digital map and contents regarding the positions determined in said map,
- At least one user device (6) which displays the sections of the panoptic camera (3) image/video that have been requested, via tri-axial interfaces, to be displayed,
- At least one main control unit (2) (Figure 1) connected with the panoptic camera (3), positioning system (4), map database (5), image database (9) and user device (6); enabling the communication between said parts (3), (4), (5), (6), (9), and adapted to carry out the following functions:
- Determining the position and angles of the panoptic camera (3) using the data obtained from the positioning system (4),
- Recording the images/video and depth information received from the panoptic camera (3) during the whole journey, together with the position and angles of the panoptic camera at that moment,
- Submitting the requested section of the image/video, either recorded beforehand in the database (9) or captured live, to at least one user device (6) by superimposing an augmented reality layer,
- Carrying out an action, determined by the user, related to at least one region or point on a map that the user has determined via the user device (6), when the aircraft gets close to that region or point.
In a preferred embodiment of the invention, the panoptic camera (3) is a visible-light panoptic camera adapted to capture a whole hemispherical field of view, positioned at a point at the bottom of the aircraft (8) so as to have the widest view possible, unobstructed by aircraft (8) parts. By this means, the camera can have a 360-degree horizontal field of view and a 180-degree vertical field of view. For smooth video streaming, an imaging capability of at least 30 frames per second is preferred.
In another preferred embodiment the panoptic camera (3) operates in the near-infrared range. In another exemplary embodiment, a panoptic camera with a night vision feature combining visible and infrared light is used.
The positioning system (4) is preferably a Global Positioning System (GPS) working together with an Inertial Navigation System (INS).
The digital map database (5) comprises data related to air routes; geographical elements like mountains, lakes, rivers, seas and valleys; historical and touristic places; buildings; towns/villages/cities/countries and borders; as well as landforms and digital object models. All of these elements and pre-defined points/regions can be associated with pictures, videos, audio recordings and information, and the associated data are also stored in the map database (5). The stored digital map data may include satellite pictures besides vector maps. All of the map elements can be stored in different map layers and can be fetched independently from the database (5).
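The independently fetchable map layers can be sketched with a simple relational store; the schema, layer names and example rows below are illustrative assumptions, not details from the document:

```python
import sqlite3

# A minimal sketch of the layered map database (5): each element carries a
# layer name so that layers (landmarks, rivers, borders, ...) can be fetched
# independently, and an optional reference to associated media.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE map_elements (
    layer TEXT, name TEXT, lat REAL, lon REAL, media_ref TEXT)""")
db.executemany(
    "INSERT INTO map_elements VALUES (?, ?, ?, ?, ?)",
    [("landmark", "Hagia Sophia", 41.008, 28.980, "hagia.jpg"),
     ("river",    "Danube",       47.500, 19.050, None),
     ("landmark", "Eiffel Tower", 48.858,  2.294, "eiffel.jpg")])

def fetch_layer(layer):
    """Fetch the names of one map layer, independently of the others."""
    rows = db.execute(
        "SELECT name FROM map_elements WHERE layer = ? ORDER BY name",
        (layer,)).fetchall()
    return [r[0] for r in rows]

landmarks = fetch_layer("landmark")
```

A production system would use a spatial index rather than a flat table, but the layer-keyed query captures the "fetched independently" behaviour described above.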
The user device (6) has been adapted such that it can request from the main control unit (2) the image/video section that is to be displayed, together with all or any of the augmented reality overlays. Namely, a requested section of a wide-angle image/video captured with the panoptic camera (3) is processed by the main control unit (2) and submitted to the user device (6).
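The requested section of the wide-angle image/video can be thought of as a pixel window cut from an equirectangular panorama covering 360 x 180 degrees. A minimal sketch under that assumption; the equirectangular layout and the `view_window` helper are illustrative, not taken from the document:

```python
def view_window(pano_w, pano_h, yaw_deg, pitch_deg, hfov_deg, vfov_deg):
    """Return pixel bounds (left, top, right, bottom) of the section of an
    equirectangular panorama covering the requested viewing direction.
    Longitude wraps around, so the caller must stitch when left > right."""
    def lon_to_x(lon):
        return int(((lon % 360.0) / 360.0) * pano_w)
    def lat_to_y(lat):
        return int(((90.0 - lat) / 180.0) * pano_h)
    left   = lon_to_x(yaw_deg - hfov_deg / 2.0)
    right  = lon_to_x(yaw_deg + hfov_deg / 2.0)
    top    = lat_to_y(min(90.0, pitch_deg + vfov_deg / 2.0))
    bottom = lat_to_y(max(-90.0, pitch_deg - vfov_deg / 2.0))
    return left, top, right, bottom

# A 90 x 60 degree window looking straight ahead (yaw 0, pitch 0) in a
# 3600 x 1800 pixel panorama wraps across the longitude seam.
bounds = view_window(3600, 1800, 0.0, 0.0, 90.0, 60.0)
```

The yaw/pitch inputs here are exactly what a joystick, head tracker or IMU (described in the following embodiments) would supply.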
In a preferred embodiment of the invention, the user device (6) is located within the aircraft (8) while in another embodiment, the device is a portable device adapted suitably and brought by the passenger.
In a preferred embodiment of the invention, the section that is requested to be viewed is determined with a joystick in the user device (6).
In another preferred embodiment of the invention, the part of the image/video that is requested to be viewed is determined in the user device (6) with a head tracker.
In a preferred embodiment of the invention, the section of the image/video that is requested to be viewed in the user device (6) is determined with a motion and position sensor unit installed in the display device. This unit is preferably an inertial measurement unit (IMU), or an optical or magnetic motion and position sensor unit.
In a preferred embodiment of the invention the user device (6) displays the image/video in 2 or 3 dimensions, preferably via a monitor mounted behind a seat or via a portable device such as a tablet PC. When a portable device is used, the motion and position sensor unit is preferably an inertial measurement unit (IMU) embedded in the portable device.
In a preferred embodiment of the invention the user device (6) displays 2- or 3-dimensional image/video via a head-mounted display (HMD).
In a preferred embodiment of the invention the user device (6) displays the image/video via a 2 or 3 dimensional dome display device.
The main control unit (2) is responsible for the communication between all of the components (panoptic camera (3), positioning system (4), map database (5), image database (9), image/video transmitter (10) and user device (6)). It distributes the images/video captured from the panoptic camera (3) to the user devices (6) according to their requests. In order to superimpose augmented reality overlays correctly over the images/video, the position and angles of the panoptic camera (3) need to be known. For this reason, the position and angles of the panoptic camera (3) are calculated in the main control unit (2) using the data obtained from the positioning system (4).
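The camera pose calculation amounts to combining the aircraft attitude reported by the positioning system with the camera's fixed mounting rotation. A minimal sketch restricted to the yaw axis; the function names and the single-axis simplification are assumptions for illustration:

```python
import math

def rot_z(deg):
    """Rotation matrix about the vertical axis (heading/yaw)."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_mul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def camera_pose(gps_pos, ins_yaw_deg, mount_yaw_deg):
    """Combine the aircraft attitude from the INS with the fixed mounting
    rotation of the panoptic camera (3). Position is taken directly from the
    GPS; a full implementation would also apply a lever-arm correction and
    use all three Euler angles (roll, pitch, yaw)."""
    return gps_pos, mat_mul(rot_z(ins_yaw_deg), rot_z(mount_yaw_deg))

# Aircraft heading east (yaw 90 degrees), camera mounted without offset.
pos, R = camera_pose((41.0, 29.0, 10000.0), 90.0, 0.0)
```

With the resulting rotation matrix, any overlay coordinate from the map database can be transformed into the camera frame before being drawn.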
Different images/video can be submitted to various user devices (6) by superimposing an augmented reality layer over the chosen sections of the image/video received from the panoptic camera (3). By means of augmented reality, objects that cannot be viewed due to weather conditions can be viewed via the digital map. Nevertheless, if a user wants to view real image/video instead of the digital map in bad weather conditions, it is possible to use panoptic camera images/video recorded previously for the same flight. For this reason, the main control unit (2) has been adapted to record the images/video and depth data taken from the panoptic camera (3) throughout the journey to the database (9), together with the position and angles of the panoptic camera (3) at that time. As a result, given that the routes of aircraft are usually the same or very similar to each other, it is possible to display in the user device (6) the panoptic camera records for that position belonging to a prior flight.
Besides providing vision with augmented reality in unsuitable weather conditions and at night, even in good visibility conditions recordings belonging to another time frame or season can be enriched by superimposing augmented contents according to the current geographic location. By this means, in suitable weather conditions, for example, the night image/video can be watched during a daytime journey, or the winter image/video while travelling in summer, just like a live view. For this reason, the user device (6) has been adapted such that it can request a live or recorded image/video from the main control unit (2). At the same time, the user device (6) has been adapted such that it can request the augmentations that are to be superimposed and the transparencies of said augmentations.
The superimpositions that can be requested include air routes; geographical elements such as mountains, lakes, rivers, seas and valleys; historical and touristic places; buildings; and towns, villages, cities, countries and borders. Another augmentation that the user device (6) can request is the image/video and information of other aircraft that are close by. Determining the locations of nearby aircraft is straightforward for aircraft equipped with ADS-B transmitters. The flight information broadcast by these aircraft is received by ground stations, combined, and sent to airplanes. The received data is used to calculate the positions and angles of nearby airplanes, and the calculations are used to display information about those airplanes on the live images/video as augmented reality objects, at the positions where the aircraft are really located.
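Placing an ADS-B-reported aircraft on the live view amounts to converting its relative position into a viewing direction from the panoptic camera. The sketch below assumes a local flat-Earth metric frame (x = north, y = east, z = up) and illustrative function names; it is not the patented method.

```python
import math

def overlay_direction(camera_pos, target_pos):
    """Return (bearing_deg, elevation_deg) from the camera to the target.

    Bearing is measured clockwise from north. The renderer would place the
    augmented-reality label at this direction in the hemispherical image.
    """
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    dz = target_pos[2] - camera_pos[2]
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    horiz = math.hypot(dx, dy)            # horizontal range to the target
    elevation = math.degrees(math.atan2(dz, horiz))
    return bearing, elevation

# Another aircraft 1 km north and 1 km east at the same altitude:
b, e = overlay_direction((0.0, 0.0, 3000.0), (1000.0, 1000.0, 3000.0))
```

A complete implementation would first convert the ADS-B latitude/longitude/altitude reports into this local frame and then rotate the direction by the camera angles.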
In a preferred embodiment of the invention, an image/video transmitter (10) has been adapted such that it sends the image/video, location and angles of the panoptic camera (3) from the aircraft (8) wirelessly to an independent ground station (11).
In a preferred embodiment of the invention, a printer (7) connected to the main control unit (2) has been adapted such that it can print out an image displayed on a user device (6).
In a preferred embodiment of the invention, the main control unit (2) has been adapted such that the user is able to determine at least one point or region on a map via the user device (6), and when the aircraft (8) enters any of the determined regions, or comes as close to the point as determined by the user, the unit carries out an action related to said region or point. Preferably this action can be an audio and/or visual alarm, automatically capturing a photograph or video of that region, turning on the display device, or printing an output from the printer (7). By this means, the user does not need to wait to reach a zone or point of interest; he/she can have a photo/video of that region taken automatically or be warned when said region is reached. In a preferred embodiment, the action can be carried out only when there is an open view (suitable weather/lighting conditions), in accordance with the user preferences.
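The region/point trigger described above can be sketched as a proximity check run on every position update, firing the user's chosen action once on first entry. Class and callback names are assumptions for illustration.

```python
import math

class PointTrigger:
    """Fire a user-chosen action when the aircraft nears a chosen point."""

    def __init__(self, point, radius, action):
        self.point, self.radius, self.action = point, radius, action
        self.fired = False

    def update(self, aircraft_pos):
        """Call on every position update; fires the action once on entry."""
        if not self.fired and math.dist(aircraft_pos, self.point) <= self.radius:
            self.fired = True
            self.action()

events = []
trig = PointTrigger((5000.0, 5000.0), 1000.0,
                    lambda: events.append("capture_photo"))
trig.update((0.0, 0.0))        # still far away: nothing happens
trig.update((4800.0, 5200.0))  # within radius: action fires once
trig.update((5000.0, 5000.0))  # already fired: no repeat
```

A polygonal region would replace the distance check with a point-in-polygon test, and the weather/lighting preference would be an extra condition before firing.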
The main control unit (2) has been adapted such that all of the crew and/or passengers can place marks on the image/video via the user devices (6) and record said markings by associating them with the related position. By this means, for example, objects which cannot be identified by the passengers when seen from the sky can be defined by other users by displaying the marks on their screens. This function can be used, for example, by the crew to define objects as a tour guide to the passengers, or to create an interactive environment amongst passengers.
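The shared-marking function can be sketched as a store that keys each marker to the geographic position it refers to, so any user device viewing that area can display it. All names are illustrative assumptions.

```python
class MarkerBoard:
    """Minimal sketch of position-keyed markers shared between user devices."""

    def __init__(self):
        self._markers = []  # list of (position, label, author)

    def place(self, position, label, author):
        self._markers.append((position, label, author))

    def visible(self, min_xy, max_xy):
        """Markers whose position falls inside the axis-aligned viewing area."""
        (x0, y0), (x1, y1) = min_xy, max_xy
        return [(p, l, a) for (p, l, a) in self._markers
                if x0 <= p[0] <= x1 and y0 <= p[1] <= y1]

board = MarkerBoard()
board.place((10.0, 20.0), "old fortress", "crew")
board.place((500.0, 900.0), "unknown lake", "seat 12A")
in_view = board.visible((0.0, 0.0), (100.0, 100.0))
```

Because markers are keyed to position rather than to a pixel, they stay attached to the same ground feature as the view section changes.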
Within the scope of this basic concept, it is possible to develop many different applications of the aircraft vision system (1) subject to the invention; the invention cannot be limited to the examples described herein and is principally as set out in the claims.
Claims
1. An aircraft vision system (1) which enables users in an aircraft to view the outside in any direction and in any weather condition, comprising:
At least one panoptic camera (3) positioned on an aircraft (8), such that it can capture a hemispherical field of view around the viewpoint, and adapted such that it can also provide depth information from the captured image/video upon request,
At least one positioning system (4) adapted to determine the location and angles of the aircraft (8),
A digital map and at least one map database (5) adapted to store the contents related to the locations of said map,
At least one user device (6) which requests the section of the panoptic camera (3) image/video desired to be viewed, as determined with triaxial interfaces,
At least one main control unit (2) which is in connection with the panoptic camera (3), positioning system (4), map database (5), image database (9) and user device (6); provides the communication between said components (3), (4), (5), (6), (9); and has been adapted such that it can carry out the following functions:
• determines the location and angles of the panoptic camera (3) using the data obtained from the positioning system (4),
• records the real images/video and depth data taken from the panoptic camera (3) during the journey into the database (9), together with the location and angles of the panoptic camera (3) at that moment,
• submits the related section of the image/video, either recorded beforehand in the database (9) or captured live, to at least one user device (6) by superimposing augmented reality data according to the position and angles of the panoptic camera (3) and according to the options requested,
• determines at least one region or one point on a map set by the user via the user device (6), and carries out an action, also determined by the user, related to said region or point when the aircraft gets close to it.
2. An aircraft vision system (1) according to Claim 1, characterized in that, it comprises a panoptic camera (3) adapted to have a full hemisphere line of vision at the same time.
3. An aircraft vision system (1) according to Claim 1 or 2, characterized in that, it comprises a panoptic camera (3) positioned at a point below the aircraft (8) such that it shall have the widest possible view without any obstruction from the parts of the aircraft (8).
4. An aircraft vision system (1) according to claims 1 to 3, characterized in that it comprises a panoptic camera (3) operating with visible light.
5. An aircraft vision system (1) according to claims 1 to 4, characterized in that it comprises a panoptic camera (3) operating within near infrared ranges.
6. An aircraft vision system (1) according to claims 1 to 5, characterized in that it comprises a panoptic camera (3) which has night vision.
7. An aircraft vision system (1) according to claims 1 to 6, characterized in that it comprises a positioning system (4) which has a global positioning system (GPS) operating together with an inertial navigation (INS) system.
8. An aircraft vision system (1) according to claims 1 to 7, characterized in that it comprises a user device (6) adapted to request all of the augmented reality
overlay options and modify their transparencies provided by the main control unit (2).
9. An aircraft vision system (1) according to claims 1 to 8, characterized in that it comprises a user device (6) accommodated in an aircraft (8).
10. An aircraft vision system (1) according to claim 9, characterized in that it comprises a user device (6) mounted behind a seat.
11. An aircraft vision system (1) according to claims 1 to 10, characterized in that it comprises a portable user device (6).
12. An aircraft vision system (1) according to claims 1 to 7, characterized in that it comprises a user device (6) adapted such that the section of the image/video to view can be determined with at least a joystick.
13. An aircraft vision system (1) according to claims 1 to 12, characterized in that it comprises a user device (6) adapted such that the section of the image/video to view can be determined with at least a head tracker.
14. An aircraft vision system (1) according to claims 1 to 13, characterized in that it comprises a user device (6) adapted such that the section of the image/video to view can be determined with at least a motion and position sensor unit.
15. An aircraft vision system (1) according to claim 14, characterized in that it comprises a motion and position sensor unit which is an inertial measurement unit (IMU).
16. An aircraft vision system (1) according to claim 14, characterized in that it comprises a magnetic motion and position sensor unit.
17. An aircraft vision system (1) according to claim 14, characterized in that it comprises an optical motion and position sensor unit.
18. An aircraft vision system (1) according to claims 1 to 17, characterized in that it comprises a main control unit (2) adapted to determine the location and angles of the panoptic camera (3) using the data taken from the positioning system (4).
19. An aircraft vision system (1) according to claims 1 to 18, characterized in that it comprises a main control unit (2) adapted to include the image/video and information of other aircrafts that are close by into the augmented reality layer.
20. An aircraft vision system (1) according to claims 1 to 19, characterized in that it comprises an image/video information transmitter (10) in connection with the main control unit (2), adapted to submit the image/video captured by the panoptic camera (3), together with the location and angles of the panoptic camera, wirelessly to at least one ground station (11).
21. An aircraft vision system (1) according to claims 1 to 20, characterized in that it comprises a main control unit (2) adapted to associate and record the markings carried out on the image/video viewed over the user devices (6) with the related sections of the image/video.
22. An aircraft vision system (1) according to claims 1 to 21, characterized in that it comprises a main control unit (2) wherein the action associated with the region or point determined by the user is an audio alarm.
23. An aircraft vision system (1) according to claims 1 to 22, characterized in that it comprises a main control unit (2) wherein the action associated with the region or point determined by the user is a visual alarm.
24. An aircraft vision system (1) according to claims 1 to 23, characterized in that it comprises a main control unit (2) wherein the action associated with the region or point determined by the user is to capture a video/photograph of the region.
25. An aircraft vision system (1) according to claims 1 to 24, characterized in that it comprises a main control unit (2) wherein the action associated with the region or point determined by the user is to turn on an imaging device belonging to the user device (6).
26. An aircraft vision system (1) according to claims 1 to 25, characterized in that it comprises a user device (6) having a display device which is a monitor.
27. An aircraft vision system (1) according to claims 1 to 26, characterized in that it comprises a user device (6) having an imaging device which is a Head Mounted Display (HMD).
28. An aircraft vision system (1) according to claims 1 to 27, characterized in that it comprises a user device (6) having an imaging device which is a dome display.
29. An aircraft vision system (1) according to claims 1 to 28, characterized in that it comprises a user device (6) having an imaging device adapted to carry out two-dimensional imaging.
30. An aircraft vision system (1) according to claims 1 to 29, characterized in that it comprises a user device (6) having an imaging device adapted to carry out three-dimensional imaging.
31. An aircraft vision system (1) according to claims 1 to 30, characterized in that it comprises a printer (7) connected to the main control unit (2) which can print out the image/video displayed at said user device (6) at that moment, according to a request submitted to the main control unit (2).
32. An aircraft vision system (1) according to claims 1 to 31, characterized in that it comprises a main control unit (2) wherein the action associated with the region or point determined by the user is to print out from the printer (7).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TR2012/07589 | 2012-06-29 | ||
TR201207589 | 2012-06-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014003698A1 (en) | 2014-01-03 |
Family
ID=49230835
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/TR2013/000213 WO2014003698A1 (en) | 2012-06-29 | 2013-06-28 | An aircraft vision system |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014003698A1 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160020033A (en) * | 2014-08-12 | 2016-02-23 | 전자부품연구원 | Flight path guiding method based on augmented reality using mobile terminal |
KR101994898B1 (en) | 2014-08-12 | 2019-07-01 | 전자부품연구원 | Flight path guiding method based on augmented reality using mobile terminal |
US10220954B2 (en) | 2015-01-04 | 2019-03-05 | Zero Zero Robotics Inc | Aerial system thermal control system and method |
US10358214B2 (en) | 2015-01-04 | 2019-07-23 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial vehicle and method of operation |
US10824149B2 (en) | 2015-01-04 | 2020-11-03 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
US10824167B2 (en) | 2015-01-04 | 2020-11-03 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for automated aerial system operation |
US10435144B2 (en) | 2016-04-24 | 2019-10-08 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system propulsion assembly and method of use |
US11027833B2 (en) | 2016-04-24 | 2021-06-08 | Hangzhou Zero Zero Technology Co., Ltd. | Aerial system propulsion assembly and method of use |
US10067513B2 (en) * | 2017-01-23 | 2018-09-04 | Hangzhou Zero Zero Technology Co., Ltd | Multi-camera system and method of use |
US10303185B2 (en) | 2017-01-23 | 2019-05-28 | Hangzhou Zero Zero Technology Co., Ltd. | Multi-camera system and method of use |
WO2023284268A1 (en) * | 2021-07-13 | 2023-01-19 | 郭晓勤 | Passenger visual travel system configured on passenger aircraft |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6064335A (en) * | 1997-07-21 | 2000-05-16 | Trimble Navigation Limited | GPS based augmented reality collision avoidance system |
EP1160541A1 (en) * | 2000-05-30 | 2001-12-05 | Fuji Jukogyo Kabushiki Kaisha | Integrated vision system |
US20020036649A1 (en) | 2000-09-28 | 2002-03-28 | Ju-Wan Kim | Apparatus and method for furnishing augmented-reality graphic using panoramic image with supporting multiuser |
JP2003083745A (en) * | 2001-09-12 | 2003-03-19 | Starlabo Corp | Imaging apparatus mounted to aircraft, and aircraft imaging data processing apparatus |
US6559846B1 (en) | 2000-07-07 | 2003-05-06 | Microsoft Corporation | System and process for viewing panoramic video |
WO2008147561A2 (en) | 2007-05-25 | 2008-12-04 | Google Inc. | Rendering, viewing and annotating panoramic images, and applications thereof |
US20110141254A1 (en) | 2009-11-17 | 2011-06-16 | Roebke Mark J | Systems and methods for augmented reality |
Non-Patent Citations (3)
Title |
---|
ARTHUR III JARVIS J ET AL: "Enhanced/synthetic vision and head-worn display technologies for terminal maneuvering area NextGen operations", DISPLAY TECHNOLOGIES AND APPLICATIONS FOR DEFENSE, SECURITY, AND AVIONICS V; AND ENHANCED AND SYNTHETIC VISION 2011, SPIE, 1000 20TH ST. BELLINGHAM WA 98225-6705 USA, vol. 8042, no. 1, 13 May 2011 (2011-05-13), pages 1 - 15, XP060014717, DOI: 10.1117/12.883036 * |
HOSSEIN AFSHARI ET AL: "Hardware implementation of an omnidirectional camerawith real-time 3D imaging capability", 3DTV CONFERENCE: THE TRUE VISION - CAPTURE, TRANSMISSION AND DISPLAY OF 3D VIDEO (3DTV-CON), 2011, IEEE, 16 May 2011 (2011-05-16), pages 1 - 4, XP031993762, ISBN: 978-1-61284-161-8, DOI: 10.1109/3DTV.2011.5877192 * |
MUNA SHABANEH ET AL: "Probability Grid Mapping system for aerial search", SCIENCE AND TECHNOLOGY FOR HUMANITY (TIC-STH), 2009 IEEE TORONTO INTERNATIONAL CONFERENCE, IEEE, PISCATAWAY, NJ, USA, 26 September 2009 (2009-09-26), pages 521 - 526, XP031655850, ISBN: 978-1-4244-3877-8 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13766160 Country of ref document: EP Kind code of ref document: A1 |
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13766160 Country of ref document: EP Kind code of ref document: A1 |