EP3685211A1 - Method and system for displaying at least one spatial section, wherein the spatial section is displayed depending on an eye position of a person - Google Patents
Method and system for displaying at least one spatial section, wherein the spatial section is displayed depending on an eye position of a person
- Publication number
- EP3685211A1 EP3685211A1 EP18762765.8A EP18762765A EP3685211A1 EP 3685211 A1 EP3685211 A1 EP 3685211A1 EP 18762765 A EP18762765 A EP 18762765A EP 3685211 A1 EP3685211 A1 EP 3685211A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- person
- image
- display device
- space
- displayed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
Definitions
- Method and system for displaying at least one spatial section the spatial section being displayed as a function of an eye position of a person
- the present invention relates to a method for displaying at least one spatial section of at least one space on a first display surface of a first display device for a first person.
- Methods are already known from the prior art in which, for example, two persons can communicate with one another by video telephony, wherein an image of the first person is transmitted to the second person and an image of the second person is transmitted to the first person.
- the images are displayed on respective display devices for the respective persons.
- the respective spatial sections are detected with the respective cameras of the display devices, and the image information is transmitted via a network to the respective display device of the counterpart and displayed on the imaging device of the respective display device.
- A rigid image of the spatial section is displayed. If, for example, the person approaches the camera, the displayed spatial section remains the same for the person on the display device. In other words, in the prior art the displayed spatial section does not change with the viewing direction of the respective person.
- The object of the present invention is to provide a method and a system by means of which an improved representation of a space can be displayed as an image for a person on the display device. This object is achieved by a method and by a system according to the independent claims.
- One aspect of the invention relates to a method for displaying at least one spatial section of at least one space on a first display surface of a first display device for a first person.
- A second spatial section of a second space, in which the second display device is located, is detected, and the second spatial section is displayed as a first image on the first display surface.
- The second space is detected; by means of at least one detection device, an eye position of the first person is detected; and a second spatial section of the second space, which depends on the detected eye position, is displayed as the first image on the first display surface.
- This makes it possible for the first person, depending on his or her eye position, to see the image of the second space, in particular of the spatial section, displayed on the first display surface.
- The first person may look into the second space from different viewing angles and, for example, look around in the second space by means of the first display device. Similar to looking through a window, the first person can correspondingly view the second space, which then lies behind the "window."
- the second room can then be displayed "behind the window."
- A particularly realistic communication between the first display device and the second display device can thereby be realized.
- Objects which, from a first viewing angle, are hidden by another object in the second space can be revealed by a change of perspective.
- an improved representation of the second space for the first person can be realized.
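The "window" behaviour described in this aspect can be illustrated with a small geometric sketch. The patent specifies no algorithm, so the function below is only an illustrative assumption: the display is modelled as a window pane at z = 0, the remote scene as a plane behind it, and the eye position determines which rectangle of that plane is visible. All names and parameters are hypothetical.

```python
def visible_window(eye, display_w, display_h, scene_depth):
    """Project the display outline from the eye onto a remote scene
    plane located scene_depth behind the display (window metaphor).

    eye: (x, y, z) viewer eye position, z > 0 in front of the display;
    the display is centered at the origin in the z = 0 plane.
    Returns the rectangle (x0, y0, x1, y1) of the remote scene plane
    that should be shown on the display.
    """
    ex, ey, ez = eye
    # Ray parameter at which a ray from the eye through a display
    # corner reaches the scene plane z = -scene_depth.
    t = (ez + scene_depth) / ez
    corners = [(-display_w / 2, -display_h / 2), (display_w / 2, display_h / 2)]
    (x0, y0), (x1, y1) = [
        (ex + t * (cx - ex), ey + t * (cy - ey)) for cx, cy in corners
    ]
    return x0, y0, x1, y1
```

Moving the eye closer enlarges the visible rectangle, and moving it sideways shifts the rectangle in the opposite direction, exactly the "looking around" effect a real window produces and the behaviour the prior art lacks.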
- the second space is detected by the second camera.
- An eye position of a first person arranged outside the second space is detected by means of a detection device.
- The second spatial section of the second space is selected in accordance with the detected eye position of the person by means of a control device.
- By means of the control device, the first display device is actuated to display the selected second spatial section.
- The detection device can detect the eye position, for example, by means of image processing of an image captured by a camera. The eye position can also be detected by sensing a temperature distribution of the room in which the first person is located, or on the basis of markers attached to the first person.
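The patent leaves open how the detection device computes a 3D eye position from camera images. One textbook possibility, shown here purely as an illustrative sketch and not as the patent's method, is stereo triangulation with a rectified pinhole camera pair; the function name and parameters are assumptions.

```python
def eye_position_from_stereo(xl, yl, xr, baseline, focal_length):
    """Triangulate a 3D eye position from a rectified stereo pair
    (pinhole model, parallel optical axes).

    xl, yl: pixel coordinates of the eye in the left image, relative
    to the principal point; xr: x coordinate in the right image.
    baseline: distance between the cameras; focal_length in pixels.
    Returns (x, y, z) in the left camera's coordinate frame.
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("eye must be in front of both cameras")
    z = focal_length * baseline / disparity  # depth from disparity
    return xl * z / focal_length, yl * z / focal_length, z
```

In practice the 2D eye coordinates would first be found by face or eye detection in each camera image; the triangulated position then drives the spatial-section selection.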
- The first and/or the second display device can also communicate with at least one further, third display device, so that the displayed spatial section in which the third display device is located is correspondingly adapted to at least the detected perspective of the first person and can be displayed on the first and/or the second display device.
- a third image and / or the first and / or a second image may then be displayed on the first and / or the second display surface.
- The first person and a first space in which the first person is located can be detected by means of at least one first camera of the detection device, and a first spatial section can be displayed as a second image on a second display surface of the second display device.
- The first spatial section can thus be displayed accordingly on the second display surface of the second display device.
- The second person can thus, for example, also have the first person displayed on a display surface of the second display device, whereby improved communication between the first and the second person can be realized.
- the second image on the second display surface can also be adapted depending on the second eye position of the second person, so that a kind of window view also arises for the second person.
- The second image, which shows the first space with, for example, the first person, can also be adapted for the second person depending on the position or viewing angle of the second person. This makes it possible to realize particularly realistic communication between the first person and the second person depending on the respective viewing angles.
- The first person and the first spatial section can be displayed as a three-dimensional second image on the second display surface, and/or a second person and the second spatial section as a three-dimensional first image on the first display surface.
- the first display surface is made semitransparent and at least the first image is displayed as augmented reality on the first display surface.
- the second image is displayed as augmented reality on the second display surface.
- With augmented reality, the person can still perceive the environment behind the first display surface while the image is projected onto it. The person can, so to speak, look through the first image and continue to perceive the environment behind the first display surface.
- a communication can be carried out in which the person can still perceive the surroundings of the display device.
- A pane can be used as the first display surface, and a vehicle occupant, as the person, can continue to perceive the environment of the motor vehicle while still having the first image displayed on the first display surface.
- the second space is detected at least by means of at least two second cameras, in particular by means of three second cameras.
- the first display device, the first room and / or the person is detected by means of at least two first cameras, in particular by means of three first cameras.
- a particularly reliable recording of the room can be made possible by means of the at least two cameras, in particular by means of the three cameras, so that in particular a particularly advantageous representation of the spatial section can be realized as a function of the eye position of the person.
- The different viewing angles can thereby be represented particularly realistically, since the at least two cameras can detect the space from different directions and the space can thus be displayed in an improved manner. This also creates the opportunity to display a hologram or a three-dimensional image.
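How the two or three camera views are combined for a given viewing angle is not specified in the patent. A minimal stand-in for full view interpolation, sketched below with illustrative names, is to select the capture camera whose viewing direction best matches the viewer's current line of sight:

```python
def pick_camera(eye_dir, camera_dirs):
    """Select the capture camera whose viewing direction is closest
    to the viewer's line of sight (nearest-view selection; a real
    system might instead blend or interpolate between views).

    eye_dir and each entry of camera_dirs are unit vectors (x, y, z).
    Returns the index of the best-matching camera.
    """
    def dot(a, b):
        return sum(ai * bi for ai, bi in zip(a, b))
    # The largest dot product means the smallest angular deviation.
    return max(range(len(camera_dirs)), key=lambda i: dot(eye_dir, camera_dirs[i]))
```

With three cameras covering the space from different directions, this switches the displayed view as the detected eye position moves, approximating the perspective change described above.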
- At least one first camera of the first detection device and / or the second camera can be formed as an image-receiving film and / or at least the first display surface can be formed as an imaging film.
- the display device can be provided in a particularly space-saving manner, which has both the image-receiving film and the imaging film.
- a vehicle window or a mirror can be used as a display or recording device.
- Such a display device can be installed in a space-saving and simple manner on many carrier media of the film, so that communication between a person and the space, or between one person and another, can be carried out in a particularly simple and comfortable manner.
- the image-receiving film and / or the imaging film is formed on a first carrier element of the first display device.
- As a carrier element, for example, a pane or a mirror can be used.
- A vehicle window pane or a motor vehicle mirror can then be used as a carrier element within a motor vehicle. This makes it possible for an image to be both recorded and displayed by means of a single carrier element, which is particularly comfortable and space-saving.
- An LCD film can be arranged on the first carrier element, which can be darkened pixel by pixel, so that depth effects of the imaging film can be better perceived by the person.
- the image-receiving and the imaging film can be provided as a single one-piece film element.
- a single film element can be arranged on the carrier element, which is designed both image-receiving and imaging.
- a further aspect of the invention relates to a system having a first display device and a second display device, which are designed to carry out a method according to one of the preceding aspects.
- the first display device is coupled via a network to at least the second display device for exchanging information.
- the network can be configured both wired and wireless.
- Fig. 1 is a schematic view of an embodiment of a system with two display devices for communication between two persons; Fig. 2 is a schematic perspective view of an embodiment of the system having a first and a second space; and Fig. 3 is a further schematic perspective view of the system in a motor vehicle for communication between two persons.
- the exemplary embodiments explained below are preferred embodiments of the invention.
- The described components of the embodiments each represent individual features of the invention which are to be considered independently of one another, which also develop the invention independently of one another, and which are thus also to be regarded as part of the invention individually or in a combination other than the one shown.
- the described embodiments can also be supplemented by further features of the invention already described.
- functionally identical elements are each provided with the same reference numerals.
- Fig. 1 shows a schematic view of an embodiment of a system 1.
- the system 1 has at least one first display device 2 and at least one second display device 3.
- the first display device 2 is coupled via a network 4 to the second display device 3 for information exchange.
- the network 4 can be configured both wired and wireless.
- the first display device 2 has a first display surface 5 and the second display device 3 has a second display surface 6.
- the first display device 2 has at least one first camera 7, in the present case three first cameras 7. With the cameras 7, a first space 8 can be detected.
- Each camera 7 covers a respective detection area E1, E2, E3, so that the first space 8 can, in particular, be completely detected.
- the second display device 3 has at least one second camera 9, in the present case in particular three second cameras 9. With the second cameras 9, a second space 10 can be detected.
- a respective detection area E4, E5, E6 can be detected by a respective camera 9, as a result of which the space 10 can in particular be completely detected.
- The second spatial section 13 is adjusted for the first person 12 according to the eye position 11.
- further information about the second space 10 is provided on the first display device 2. For example, a temperature and / or a surface condition and / or a movement can be displayed on the display device 2.
- a first spatial section 15 of the first space 8 is displayed on the second display surface 6 of the second display device 3.
- A second eye position 16 of a second person 17, who is located in the second space 10, can be detected, and the second image 18 of the first spatial section 15 on the second display surface 6 can be adjusted accordingly depending on the second eye position 16.
- a kind of window view can thus also be created for the second person 17, so that the second person 17 can also look through the "window" via the second display device 3 and view the first space 8 as a function of the second eye position 16.
- the first person 12 can see through the semi-transparent display surface 5 and perceive an environment 19 of the first display device 2.
- The first image 14 may be projected onto the first display surface 5 and perceived as augmented reality.
- the second person 17 sees through the second display surface 6 and perceives the second environment 20 of the second display device 3 and the second image 18 is displayed on the display surface 6 as augmented reality.
- At least the first camera 7 and / or the second camera 9 are formed as respective image-receiving film 21 and / or at least the first display surface 5 and / or the second display surface 6 are formed as an imaging film 22.
- FIG. 2 shows a schematic perspective view of the system 1.
- the first person 12 looks through the first display device 2, such as through a window, and can recognize the second person 17 in the second room 10.
- the second person 17 looks through the second display device 3, and can perceive the first person 12 in the first room 8, as through a window.
- The respective display of the spatial sections 13, 15 can be adapted, so that particularly realistic communication between the first person 12 and the second person 17 can take place.
- Sound information, for example, can be transmitted in addition to the image.
- FIG. 3 shows a further schematic perspective view of the system 1.
- the first person 12 looks through the first display device 2, in particular through the first display surface 5, and can see the second person 17, which is shown here as a toddler.
- The first person 12 may sit on a front seat of a motor vehicle and look, for example, through a windshield which comprises the display device 2.
- The child 17 can sit in a rear area, and the second display device 3 can, for example, be accommodated in one of the front seats.
- The first person 12 looks toward the windshield and can view the second person 17, who is in the rear area, through the windshield or on the display device 2, 3 arranged on the windshield.
- The first person 12 and the first spatial section 15 can be displayed as a three-dimensional second image 18 on the second display surface 6, and/or the second person 17 and the second spatial section 13 can be displayed as a three-dimensional first image 14 on the first display surface 5.
- the image-receiving film 21 and / or the imaging film 22 can be formed on a first carrier element 23 of the first display device 2 and / or the second display device 3.
- The image-receiving and the imaging film 21, 22 can be provided as a single one-piece film element.
- An LCD film can be arranged on the first carrier element 23, which can be darkened pixel by pixel, so that depth effects of the imaging film 22 can be better perceived by a person 12, 17.
- Only the areas on which an image 14, 18 or parts of the image 14, 18 are displayed are darkened by the LCD film, so that the areas where no image 14, 18 is displayed remain transparent and a person 12, 17 can continue to see through them.
- FIG. 3 shows different perspectives in FIGS. 3a to 3c.
- In Fig. 3a, the first person 12 looks straight through the first display surface 5 and thus sees the second person 17 in a front view.
- In Fig. 3b, the first person 12 has changed the eye position 11, in the present case looking to the right, so that the second person 17 is now shown correspondingly rotated on the first display surface 5.
- In Fig. 3c, the first person 12 has turned the viewing direction to the left, so that the second person 17 is represented on the first display surface 5 in accordance with the leftward viewing angle of the first person 12.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Optics & Photonics (AREA)
- Controls And Circuits For Display Device (AREA)
- Studio Devices (AREA)
- Devices For Indicating Variable Information By Combining Individual Elements (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017216843.9A DE102017216843B4 (de) | 2017-09-22 | 2017-09-22 | Verfahren und System zum Anzeigen zumindest eines Raumausschnitts, wobei der Raumausschnitt abhängig von einer Augenposition einer Person angezeigt wird |
PCT/EP2018/070783 WO2019057378A1 (de) | 2017-09-22 | 2018-07-31 | Verfahren und system zum anzeigen zumindest eines raumausschnitts, wobei der raumausschnitt abhängig von einer augenposition einer person angezeigt wird |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3685211A1 true EP3685211A1 (de) | 2020-07-29 |
Family
ID=63452597
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18762765.8A Pending EP3685211A1 (de) | 2017-09-22 | 2018-07-31 | Verfahren und system zum anzeigen zumindest eines raumausschnitts, wobei der raumausschnitt abhängig von einer augenposition einer person angezeigt wird |
Country Status (5)
Country | Link |
---|---|
US (2) | US11068053B2 (de) |
EP (1) | EP3685211A1 (de) |
CN (1) | CN111133363A (de) |
DE (1) | DE102017216843B4 (de) |
WO (1) | WO2019057378A1 (de) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102009045748A1 (de) * | 2009-10-15 | 2011-04-21 | Continental Teves Ag & Co. Ohg | Verfahren und Vorrichtung zur Ermittlung des Status einer drahtlosen C2X-Kommunikation eines Fahrzeugs zu seiner Umgebung |
DE102014005976A1 (de) * | 2014-04-24 | 2014-09-25 | Daimler Ag | Anordnung und Verfahren zur Darstellung von optischen Informationen auf einer transparenten Anzeigefläche |
US20140375752A1 (en) * | 2012-12-14 | 2014-12-25 | Biscotti Inc. | Virtual Window |
WO2016154123A2 (en) * | 2015-03-21 | 2016-09-29 | Mine One Gmbh | Virtual 3d methods, systems and software |
US20170054949A1 (en) * | 2015-08-19 | 2017-02-23 | Faraday&Future Inc. | In-vehicle camera system |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AUPP727598A0 (en) * | 1998-11-23 | 1998-12-17 | Dynamic Digital Depth Research Pty Ltd | Improved teleconferencing system |
US6919907B2 (en) * | 2002-06-20 | 2005-07-19 | International Business Machines Corporation | Anticipatory image capture for stereoscopic remote viewing with foveal priority |
US20070002130A1 (en) * | 2005-06-21 | 2007-01-04 | David Hartkop | Method and apparatus for maintaining eye contact during person-to-person video telecommunication |
US20130021476A1 (en) * | 2008-10-10 | 2013-01-24 | Trummer Marcus A | Child seat safety system |
US8317329B2 (en) * | 2009-04-02 | 2012-11-27 | GM Global Technology Operations LLC | Infotainment display on full-windshield head-up display |
US8823769B2 (en) * | 2011-01-05 | 2014-09-02 | Ricoh Company, Ltd. | Three-dimensional video conferencing system with eye contact |
DE102011083662B4 (de) | 2011-09-29 | 2022-02-17 | Robert Bosch Gmbh | Anzeigevorrichtung für einen Insassen eines Fahrzeugs, Fahrzeug und Verfahren zur Generierung einer Anzeige in einem Sichtfeld des Insassen |
US20140362170A1 (en) * | 2012-02-15 | 2014-12-11 | Thomson Licensing | Video conference system and method for maintaining participant eye contact |
DE102013210887B4 (de) | 2013-06-11 | 2019-12-12 | Robert Bosch Gmbh | Optische Sensoranordnung für ein Fahrzeug und Fahrzeug mit einer derartigen Sensoranordnung |
CN203623529U (zh) * | 2013-11-15 | 2014-06-04 | 福特环球技术公司 | 监视系统 |
DE102014204691A1 (de) | 2014-03-13 | 2015-09-17 | Robert Bosch Gmbh | Bildaufnahmevorrichtung, insbesondere zur Fahrzeugvermessung |
US9602767B2 (en) * | 2014-10-10 | 2017-03-21 | Microsoft Technology Licensing, Llc | Telepresence experience |
CN105825493B (zh) * | 2015-01-09 | 2019-05-03 | 华为技术有限公司 | 图像配准方法和装置 |
US10009405B2 (en) * | 2015-04-27 | 2018-06-26 | International Business Machines Corporation | Dynamically adjusting quality of service using cognitive focus of attention detection |
CN106707576A (zh) * | 2015-11-13 | 2017-05-24 | 小米科技有限责任公司 | Lcd面板、终端及感光控制方法 |
US9681096B1 (en) * | 2016-07-18 | 2017-06-13 | Apple Inc. | Light field capture |
EP3513242B1 (de) * | 2016-09-13 | 2021-12-01 | Magic Leap, Inc. | Sensorische brille |
-
2017
- 2017-09-22 DE DE102017216843.9A patent/DE102017216843B4/de active Active
-
2018
- 2018-07-31 US US16/649,526 patent/US11068053B2/en active Active
- 2018-07-31 EP EP18762765.8A patent/EP3685211A1/de active Pending
- 2018-07-31 CN CN201880061799.1A patent/CN111133363A/zh active Pending
- 2018-07-31 WO PCT/EP2018/070783 patent/WO2019057378A1/de active Search and Examination
-
2021
- 2021-05-05 US US17/308,621 patent/US11422621B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102009045748A1 (de) * | 2009-10-15 | 2011-04-21 | Continental Teves Ag & Co. Ohg | Verfahren und Vorrichtung zur Ermittlung des Status einer drahtlosen C2X-Kommunikation eines Fahrzeugs zu seiner Umgebung |
US20140375752A1 (en) * | 2012-12-14 | 2014-12-25 | Biscotti Inc. | Virtual Window |
DE102014005976A1 (de) * | 2014-04-24 | 2014-09-25 | Daimler Ag | Anordnung und Verfahren zur Darstellung von optischen Informationen auf einer transparenten Anzeigefläche |
WO2016154123A2 (en) * | 2015-03-21 | 2016-09-29 | Mine One Gmbh | Virtual 3d methods, systems and software |
US20170054949A1 (en) * | 2015-08-19 | 2017-02-23 | Faraday&Future Inc. | In-vehicle camera system |
Non-Patent Citations (2)
Title |
---|
ALIEXPRESS: "TakTark", 20 May 2017 (2017-05-20), https://fr.aliexpress.com/item/32795024304.html, XP055611166, Retrieved from the Internet <URL:https://fr.aliexpress.com/item/32795024304.html> [retrieved on 20190806] * |
See also references of WO2019057378A1 * |
Also Published As
Publication number | Publication date |
---|---|
DE102017216843A1 (de) | 2019-03-28 |
WO2019057378A1 (de) | 2019-03-28 |
DE102017216843B4 (de) | 2024-03-21 |
US20210255702A1 (en) | 2021-08-19 |
CN111133363A (zh) | 2020-05-08 |
US20200310536A1 (en) | 2020-10-01 |
US11422621B2 (en) | 2022-08-23 |
US11068053B2 (en) | 2021-07-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102005000739B4 (de) | Fahrzeug-Sichtunterstützungssystem | |
EP3394708B1 (de) | Verfahren zum betreiben eines virtual-reality-systems und virtual-reality-system | |
DE102014006732A1 (de) | Bildüberlagerung von virtuellen Objekten in ein Kamerabild | |
DE102012203491B4 (de) | Elektronisches Rückspiegel-System | |
WO2019068477A1 (de) | Kinetosefreies betrachten eines digitalen inhalts in einem fahrzeug | |
DE102014002493A1 (de) | System mit und Verfahren zum automatischen Ein-/Umschalten einer Einstellvorrichtung für eine Head-up-Display-Einrichtung | |
DE102014119317A1 (de) | Verfahren zur Darstellung eines Bildüberlagerungselements in einem Bild mit 3D-Information, Fahrerassistenzsystem und Kraftfahrzeug | |
DE102014015871A1 (de) | Anzeigesystem für einen Kraftwagen, Kraftwagen mit einem Anzeigesystem und Verfahren zum Betreiben eines Anzeigesystems | |
WO2018215332A1 (de) | Externe darstellung von bildaufnahmen eines fahrzeuginnenraums in einer vr-brille | |
WO2016165799A1 (de) | Verfahren zum betreiben einer virtual-reality-brille und system mit einer virtual-reality-brille | |
DE102014208048A1 (de) | System und Verfahren zur Personalisierung eines Ausstellungsraums | |
DE102016217037B4 (de) | Verfahren zum Betreiben einer Anzeigeeinrichtung eines Kraftfahrzeugs mit einer digitalen Anzeigefläche und Kraftfahrzeug | |
DE102014010309B4 (de) | Anzeigen von zusätzlichen Inhalten in einer virtuellen Szenerie | |
DE102014116441A1 (de) | Verfahren zum Darstellen einer Sicherheitsinformation, Fahrerassistenzsystem und Kraftfahrzeug | |
EP3685211A1 (de) | Verfahren und system zum anzeigen zumindest eines raumausschnitts, wobei der raumausschnitt abhängig von einer augenposition einer person angezeigt wird | |
DE102015007246B4 (de) | Verfahren zum Betreiben eines Anzeigesystems und Anzeigesystem für ein Kraftfahrzeug | |
WO2020030312A1 (de) | Verfahren und system zum betreiben von zumindest zwei von jeweiligen fahrzeuginsassen am kopf getragenen anzeigeeinrichtungen | |
DE102011080556B4 (de) | Videosystem für ein Kraftfahrzeug und Verfahren zum Betreiben eines Videosystems | |
DE102011088492A1 (de) | Navigationssystem mit einer verbesserten Kartendarstellung | |
DE102005048232A1 (de) | Rückfahrkamerasystem und Verfahren zum Anzeigen von Informationen zu einer rückwärtigen Sicht aus einem Fahrzeug | |
WO2019063284A1 (de) | Verfahren und system zum durchführen eines virtuellen treffens zwischen wenigstens einer ersten person und einer zweiten person | |
DE60129059T2 (de) | 3D-visuelle Präsentationsmethode und Apparat für Autosimulator | |
DE102020134814A1 (de) | Vorrichtung zum Überwachen von Umgebungen eines Fahrzeugs | |
DE102004032586B4 (de) | Verfahren zur Erzeugung einer dreidimensionalen Darstellung | |
WO2013178358A1 (de) | Verfahren zur räumlichen visualisierung von virtuellen objekten |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20200422 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: KEZZE, MUHAMMAD ALI Inventor name: KLUG, MARKUS Inventor name: SCHWAGER, ANDRE Inventor name: SCHWARTZE, SEBASTIAN |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
PUAG | Search results despatched under rule 164(2) epc together with communication from examining division |
Free format text: ORIGINAL CODE: 0009017 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20220614 |
|
B565 | Issuance of search results under rule 164(2) epc |
Effective date: 20220614 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G02B 27/01 20060101ALI20220610BHEP Ipc: H04N 21/00 20110101ALI20220610BHEP Ipc: H04N 13/00 20180101ALI20220610BHEP Ipc: H04N 7/14 20060101ALI20220610BHEP Ipc: B60K 37/02 20060101ALI20220610BHEP Ipc: G02B 27/00 20060101AFI20220610BHEP |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230529 |