
WO2016067082A1 - Method and device for gesture control in a vehicle - Google Patents

Method and device for gesture control in a vehicle

Info

Publication number
WO2016067082A1
Authority
WO
WIPO (PCT)
Prior art keywords
occupant
gesture
detection unit
head
detection
Prior art date
2014-10-22
Application number
PCT/IB2015/001966
Other languages
German (de)
English (en)
Inventor
Frank Schliep
Oliver Kirsch
Christian Sinn
Stephen Winton
Xin Jiang
Original Assignee
Visteon Global Technologies, Inc.
Priority date
2014-10-22
Filing date
2015-10-22
Publication date
2016-05-06
Application filed by Visteon Global Technologies, Inc. filed Critical Visteon Global Technologies, Inc.
Publication of WO2016067082A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/65 Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146 Instrument input by gesture
    • B60K2360/1464 3D-gesture
    • B60K2360/149 Instrument input by detecting viewing direction not otherwise provided for
    • B60K2360/20 Optical features of instruments
    • B60K2360/21 Optical features of instruments using cameras
    • B60K2360/741 Instruments adapted for user detection

Definitions

  • The invention relates to a method for gesture control in a vehicle interior.
  • The invention further relates to a device for gesture control in a vehicle interior.
  • A TOF camera (short for time-of-flight camera) is based on active illumination of a scene, for example with infrared light, and uses sensors to measure the transit time of the light used for illumination. On this basis, the distance of the illuminated object from the TOF camera is determined (a time-of-flight distance sketch follows this list).
  • The object is achieved according to the invention with the features specified in claim 1.
  • The object is achieved according to the invention with the features specified in claim 10.
  • In the method, image data of at least one position of an occupant and of at least one gesture of the occupant are captured in a detection range of the detection unit.
  • The detection unit, designed as a 3D camera, captures the position of the occupant's head as well as head movements and gesture movements, which are combined and analyzed so that an ergonomic control of vehicle components based on an eye, head and/or hand gesture is possible.
  • The detection unit is part of an interactive system.
  • A 3D point cloud is generated on the basis of the captured image data in order to determine the position, the gesture and at least one distance of the occupant.
  • The 3D point cloud makes it possible to determine the distance and position of an object and of the occupant located in the detection area in a simple manner (see the point-cloud sketch after this list).
  • The detection area is subdivided into at least two interaction areas for recognizing and tracking a movement and/or a gesture of the occupant (see the interaction-zone sketch after this list).
  • In this way, the movement and/or gesture of the head and of the hand of the occupant can be detected separately from each other.
  • One possible embodiment of the method provides that the movement and/or gesture of a head and of a hand of the occupant are captured in the same interaction area. This allows, for example, a quick activation or deactivation of the vehicle application or vehicle function, because the movement and/or gesture of the head and of the hand of the occupant can be determined unambiguously and thus quickly.
  • The movement and the gesture executed by the occupant are detected at the same time. For example, the position of the head is detected and, in combination with a rigid 3D hand gesture of the occupant, triggers the vehicle application in a fast manner.
  • The movement and/or gesture of the head and of the hand carried out by the occupant are detected in at least one of the two interaction areas in different time periods.
  • In this way, the workload and execution effort of the occupant, in particular of the driver, when interacting with and/or controlling at least one vehicle component can be reduced. For example, the driver can perform the movement of the head and then that of the hand at intervals.
  • The interaction and/or control is determined by detecting a line of sight of the occupant. In particular, this allows the driver to easily activate or deactivate interaction with the vehicle application.
  • For this purpose, the viewing direction of the eyes and/or the eyelids of the occupant's eyes are evaluated. In this way, in particular, a tiredness and a state of the occupant can be determined.
  • A device for gesture control in a vehicle interior comprises a detection unit which captures image data of at least one position of an occupant and of at least one gesture of the occupant in a detection range, wherein the detection unit is designed as a 3D camera.
  • In particular, the detection unit is designed as a TOF camera (short for time-of-flight camera).
  • With the detection unit designed as a 3D camera or TOF camera, the driver's gesture control can be detected particularly accurately and precisely.
  • The body of the gesture-executing occupant, or certain body parts of the occupant, and the background serve as reference surfaces for the TOF camera.
  • In one embodiment, the detection unit is electrically connected to a multifunction display and/or integrated into a structural unit of the multifunction display. In this way, the gesture and movement of the occupant intended for operating the multifunction display can be specifically detected by the detection unit.
  • The detection unit is data-technologically coupled to a control unit for processing and evaluating the acquired image data, wherein the control unit compares the acquired image data with predetermined image data in order to evaluate the gesture or a state of the occupant. The comparison of acquired image data with predetermined image data enables, for example, a fast execution of the associated application.
  • Furthermore, the detection unit is signal-coupled to a lighting unit for illuminating the detection area. The lighting unit emits light in the infrared spectral range, for example event-driven or at predetermined intervals. Glare for the occupant is thereby avoided, while the detection area is sufficiently illuminated for a precise determination of the position or the gesture of the occupant.
  • FIG. 1 schematically shows a perspective view of a vehicle interior with a device for gesture control.
  • FIG. 2 schematically shows a block diagram of the device in one embodiment.
  • FIG. 3 schematically shows a block diagram of the device in a further embodiment.
  • FIG. 4 schematically shows a block diagram of the device in yet another embodiment.
  • FIG. 1 shows a vehicle interior 1 of a vehicle not shown in detail with a device V, comprising a detection unit 2, for gesture control in the vehicle.
  • The detection unit 2 is designed here as a 3D camera. In particular, the detection unit 2 is formed as a TOF camera, a stereo camera or another 3D camera suitable for the vehicle interior 1.
  • The detection unit 2 is arranged in a dashboard 3, preferably in close proximity to a multifunction display 3.1, such that at least an upper-body area of an occupant 4 can be captured.
  • The multifunction display 3.1 is in particular an infotainment system.
  • The detection unit 2 has a detection area E directed into the vehicle interior 1, within which at least one position of a head 4.1 of the occupant 4 and a movement performed by the occupant 4 are captured as a gesture.
  • As a gesture, for example, a movement of the head 4.1 and/or a movement of a hand 4.2 and/or of an arm of the occupant 4 is detected in order to trigger an application associated with the movement and/or gesture of the head 4.1 and/or the hand 4.2.
  • The detection unit 2 in the form of the TOF camera is connected to a control unit 6, which processes and evaluates the image data I captured with the detection unit 2.
  • The detection unit 2 is formed, for example, as a thermal imaging camera, in particular as an infrared camera, with active illumination of a scene using infrared light, and determines the distance of an illuminated object by a transit-time measurement of the light used for illumination.
  • An illuminated object is, for example, the upper-body region or the head region of the occupant 4.
  • A scene or gesture, in particular the temporal position of the head 4.1 and/or the hand 4.2, is captured in the detection area E of the detection unit 2. This scene or gesture is recorded by the TOF camera using a 3D point cloud, the distance of the illuminated object being determined by an amplitude evaluation of the intensity of the light signals reflected and detected in the detection area E.
  • The detection unit 2 and a lighting unit 5 are arranged in the dashboard 3 and signal-coupled in such a way that the lighting unit 5, which emits light at predetermined intervals, in particular in the infrared spectral range, illuminates the detection area E of the detection unit 2.
  • The detection unit 2 comprises optics by means of which local areas or local zones within the detection area E can be captured.
  • The detection unit 2 is in particular electrically connected to the multifunction display 3.1 and/or integrated into a structural unit of the multifunction display 3.1.
  • This makes it possible to detect the movement and/or gesture of the occupant 4, in particular an extrapolation of the movement and/or gesture, relative to the multifunction display 3.1 within the detection area E. Detection and/or tracking of the movement and/or gesture of the occupant 4 is performed by the detection unit 2 in the same detection area E, wherein the head 4.1 and the hand 4.2 of the occupant 4 can, independently of each other, trigger the application associated with the executed movement and/or gesture.
  • For this purpose, the detection range E of the detection unit 2 is subdivided into at least two interaction areas, in each of which the movement and/or gesture of the head 4.1 and of the hand 4.2 of the occupant 4 are detected separately from each other.
  • One possibility for the rapid activation or deactivation of the application desired by the occupant 4 is based on the processing of simultaneously acquired image data I in which different movements and/or gestures of the head 4.1 and of the hand 4.2 of the occupant 4 are detected.
  • In this way, two movements and/or gestures executed in the same interaction area and associated with the application can activate or deactivate it; for example, the position of the head 4.1 is detected and, in combination with a rigid 3D hand gesture of the occupant 4, the application is triggered.
  • With the second possibility, at least the workload of the occupant 4 can be reduced: which vehicle component is addressed is determined by detecting the line of sight of the eyes of the occupant 4 within the interaction area of the detection area E, and the interaction and/or control then takes place on the basis of the detected movement and/or gesture associated with the application, in particular a hand gesture of the occupant 4.
  • Alternatively, the interaction and/or control may be determined by the combination of the movement and/or gesture of the head 4.1 and of the hand 4.2 associated with the application. For example, on the basis of the extrapolation of the hand gesture, in particular a finger gesture, and a subsequent movement and/or gesture of the head 4.1, in particular nodding, the interaction with and/or the control of the at least one vehicle component takes place.
  • Such a combination can be detected by the detection unit 2, for example, as the movement and/or gesture of the occupant 4 associated with the application. To effectively monitor the behavior of the occupant 4, the detection unit 2 is directed into the vehicle interior 1, in particular off a (main) axis.
  • A method based on a 3D model is provided for the partial recognition of the head 4.1.
  • The position of the head 4.1 and/or of the eyes and/or a nose vector and/or facial features of the occupant 4 can thereby be detected.
  • This detection takes place in particular on the basis of a filter algorithm, for example a so-called Kalman filter, for processing statistically estimated and/or predicted values (see the Kalman-filter sketch after this list).
  • FIGS. 2 to 4 each show a block diagram of an embodiment of the device V for capturing and/or recognizing the head 4.1 of the occupant 4. In each case, a sequence of the method for capturing and/or recognizing the head 4.1 of the occupant 4 is shown.
  • For illuminating the detection area E, the detection unit 2 is electrically coupled to the illumination unit 5. For processing and evaluating the captured image data I, the detection unit 2 is electrically coupled to the control unit 6.
  • The image data I include, for example, movements, gestures and/or positions of the occupant 4. The image data I, in particular of the upper-body region of the occupant 4 in the detection area E, are determined on the basis of local coordinates and light reflections of the object.
  • Prominent facial features and/or facial structures are detected on the basis of the captured amplitude maps.
  • The device V comprises, for example, an interface S1 for capturing the image data I.
  • The capture and/or recognition of the head 4.1 of the occupant 4 is based on a combination of 2D and 3D algorithms in order to obtain largely suitable information from the 3D point cloud and the amplitude mapping.
  • The device V comprises a connection interface S2, which receives the image data I captured from the 3D point cloud and the amplitude mapping and is provided for noise filtering of the captured image data I.
  • The device V further comprises a segmentation stage in which the captured image data I are subdivided into segments.
  • A verification interface S4 of the device V is provided for checking the position of the head 4.1 within at least one segment, wherein the position of the head 4.1 is determined by a head-shoulder detector S4.1.
  • The segments are each compared with a head-shoulder template S4.2 serving as a reference (see the template-matching sketch after this list).
  • The head-shoulder template S4.2 is aligned to the segment in such a way that it is scaled at least to the segment size and aligned with respect to rotation.
  • For the segment with a largely high agreement with the head-shoulder template S4.2, further coordinates for completing the position of the head 4.1 are determined on the basis of at least heuristic assumptions S4.4.
  • Alternatively, the position of the head 4.1 is determined on the basis of an upper-body template S4.3, in which at least body parts of the upper body of the occupant 4 are compared with pre-stored shapes of a human upper body. Subsequently, in the event of a match, further coordinates for completing the position of the head 4.1 are determined on the basis of heuristic assumptions S4.4.
  • A head of the pre-stored shape of the upper body has, for example, an elliptical shape.
  • An evaluation interface S5 estimates the position and viewing direction of the head 4.1 of the occupant 4 within a current still image, a largely suitable estimation algorithm S5.1 having been determined in advance.
  • The position and viewing direction of the head 4.1 can largely be determined on the basis of the image data I acquired for the current still image. These are passed via an output interface S5.2, as a function of the algorithm S5.1, to the evaluation interface S5, wherein the captured image data I, as motion and position sequences S5.3, then determine a subsequent position and/or viewing direction of the head 4.1 of the occupant 4.
  • Alternatively, the position and/or viewing direction of the head 4.1 can be determined on the basis of the nose vector.
  • A viewing direction of the eyes can differ from the viewing direction of the head 4.1; the viewing direction of the eyes can be determined, for example, by detecting at least one pupil.
  • A condition of the occupant 4, in particular a tiredness of the driver, can be determined, for example, from a number of eyelid closures, the intervals between the respective eyelid closures and/or the duration of an eyelid closure (see the blink-statistics sketch after this list).
  • Such captured image data I are stored in a memory interface S6 or in the control unit 6, where the captured image data I can be retrieved at any time.
  • A decision is made on the basis of a decision tree preprogrammed in the control unit 6, which automatically selects from the stored image data I at least one suitable method, in particular detection by nose vector and/or eye position, for capturing the line of sight of the occupant 4 (see the method-selection sketch after this list). Furthermore, individual information about head size, face geometry and/or head-to-shoulder distance can be stored for different occupants 4 and is available at any time.
  • The method can also be used to improve driving experience and/or driving safety, and/or to monitor the driver and/or further occupants.
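
The excerpts above describe the building blocks of the method only in prose. The short Python sketches below illustrate some of them. They are editorial illustrations, not implementations from the patent; every function name, parameter and threshold in them is an invented assumption. First, the time-of-flight principle: the distance of the illuminated object follows from half the round-trip time of the emitted infrared light.

```python
# Minimal sketch of the time-of-flight distance principle; an assumption of
# how such a computation looks, since the patent specifies no code.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance of the illuminated object from the TOF camera.

    The emitted infrared light travels to the object and back, so the
    one-way distance is half the measured round trip.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A round trip of ~6.67 ns corresponds to an object about 1 m away.
print(tof_distance(6.67e-9))
```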
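Next, the 3D point cloud: a sketch, assuming a pinhole camera model with invented intrinsics, of how a depth image from the 3D camera can be back-projected into a point cloud from which the distance of the nearest object, for example a hand, is read off.

```python
import numpy as np

# Sketch: turn a depth image into a 3D point cloud and read off the
# distance of the nearest object. Intrinsics (fx, fy, cx, cy) are assumed
# example values, not values from the patent.

def depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=160.0, cy=120.0):
    """Back-project a depth map (metres) into an N x 3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth return

depth = np.full((240, 320), 1.2)   # flat background 1.2 m away
depth[100:140, 150:200] = 0.6      # an object (e.g. a hand) at 0.6 m
cloud = depth_to_point_cloud(depth)
print("nearest point:", np.linalg.norm(cloud, axis=1).min())  # ~0.6 m
```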
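The subdivision of the detection range E into interaction areas, and the triggering of an application only when head and hand gestures agree, might look as follows; the zone bounds and gesture labels are invented placeholders.

```python
from dataclasses import dataclass

# Sketch of interaction areas within the detection range and a combined
# head/hand trigger. All bounds and labels are assumptions for illustration.

@dataclass
class Zone:
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float

    def contains(self, p):
        x, y, z = p
        return (self.x_min <= x <= self.x_max and
                self.y_min <= y <= self.y_max and
                self.z_min <= z <= self.z_max)

ZONES = [
    Zone("head_area", -0.4, 0.4, 0.2, 0.8, 0.4, 1.0),
    Zone("hand_area", -0.6, 0.6, -0.4, 0.2, 0.2, 0.8),
]

def zone_of(point):
    """Name of the first interaction area containing the point, or None."""
    return next((z.name for z in ZONES if z.contains(point)), None)

def should_trigger(head_point, head_gesture, hand_point, hand_gesture):
    """Trigger the application only when a nod and a rigid hand gesture
    are detected in their respective interaction areas at the same time."""
    return (zone_of(head_point) == "head_area" and head_gesture == "nod" and
            zone_of(hand_point) == "hand_area" and
            hand_gesture == "rigid_open_palm")

print(should_trigger((0.0, 0.5, 0.7), "nod",
                     (0.1, -0.1, 0.5), "rigid_open_palm"))  # True
```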
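A minimal Kalman-filter step for smoothing and predicting a head coordinate from noisy per-frame measurements, as the filter-algorithm excerpt suggests; the constant-velocity model and the noise values are assumptions.

```python
import numpy as np

# Sketch of one-dimensional head-position tracking with a Kalman filter.
# Constant-velocity model per axis; Q and R are assumed noise values.

dt = 1 / 30.0                          # camera frame period (assumed 30 fps)
F = np.array([[1, dt], [0, 1]])        # state transition (position, velocity)
H = np.array([[1, 0]])                 # only position is measured
Q = np.eye(2) * 1e-4                   # process noise covariance
R = np.array([[1e-2]])                 # measurement noise covariance

x = np.zeros((2, 1))                   # initial state estimate
P = np.eye(2)                          # initial estimate covariance

def kalman_step(z):
    """One predict/update cycle for a scalar position measurement z."""
    global x, P
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = np.array([[z]]) - H @ x        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return float(x[0, 0])              # filtered position estimate

for z in [0.02, 0.01, 0.03, 0.02, 0.04]:  # noisy head x-coordinates (m)
    print(kalman_step(z))
```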
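The tiredness cues (number of eyelid closures, intervals between them, duration of a closure) reduce to simple statistics over a per-frame eye-open signal; a sketch with an invented boolean signal:

```python
# Sketch of blink statistics for tiredness estimation. The frame rate and
# the example signal are assumptions; the patent defines no thresholds.

FPS = 30

def blink_statistics(eye_open, fps=FPS):
    """eye_open: sequence of booleans, one per frame (True = eye open).

    Returns blink count, mean blink duration (s) and mean interval
    between blinks (s)."""
    blinks = []                 # (start_frame, end_frame) of each closure
    start = None
    for i, is_open in enumerate(eye_open):
        if not is_open and start is None:
            start = i
        elif is_open and start is not None:
            blinks.append((start, i))
            start = None
    if start is not None:
        blinks.append((start, len(eye_open)))

    durations = [(e - s) / fps for s, e in blinks]
    intervals = [(blinks[i + 1][0] - blinks[i][1]) / fps
                 for i in range(len(blinks) - 1)]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return len(blinks), mean(durations), mean(intervals)

# Long closures and short intervals would indicate driver tiredness.
signal = [True] * 20 + [False] * 12 + [True] * 40 + [False] * 3 + [True] * 15
print(blink_statistics(signal))  # (2, 0.25, ~1.33)
```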
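The head-shoulder detector S4.1 compares segments against a template. Below is a toy sketch of normalized template matching over binary masks; the silhouette, sizes and scoring are all invented, and a real system would additionally scale and rotate the template as described above.

```python
import numpy as np

# Toy sketch of head-shoulder template matching: slide a binary
# head-shoulder silhouette over a scene mask and keep the position with
# the highest normalized agreement. All shapes and values are assumptions.

def make_template(h=8, w=10):
    """Toy head-shoulder silhouette: elliptical head above a shoulder bar."""
    t = np.zeros((h, w))
    t[h // 2:, :] = 1.0                        # shoulders
    yy, xx = np.ogrid[:h // 2, :w]
    head = ((xx - w / 2) ** 2 / (w / 4) ** 2 +
            (yy - h / 4) ** 2 / (h / 4) ** 2) <= 1.0
    t[:h // 2][head] = 1.0                     # elliptical head
    return t

def match(image, template):
    """Return (row, col, score) of the best normalized match."""
    th, tw = template.shape
    tz = template - template.mean()
    best = (0, 0, -np.inf)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            win = image[r:r + th, c:c + tw]
            wz = win - win.mean()
            denom = np.sqrt((wz ** 2).sum() * (tz ** 2).sum()) or 1.0
            score = float((wz * tz).sum() / denom)
            if score > best[2]:
                best = (r, c, score)
    return best

scene = np.zeros((20, 24))
scene[6:14, 7:17] = make_template()    # occupant silhouette placed at (6, 7)
print(match(scene, make_template()))   # -> (6, 7, ~1.0)
```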
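Finally, the selection between nose-vector and eye-position gaze detection is a small decision procedure; the conditions below are invented placeholders for whatever the preprogrammed decision tree in the control unit 6 actually evaluates.

```python
# Sketch of the decision logic that picks a gaze-detection method from the
# available image data. The inputs and method names are assumptions.

def select_gaze_method(pupils_visible: bool, nose_visible: bool) -> str:
    if pupils_visible:
        return "eye_position"        # most direct gaze estimate
    if nose_visible:
        return "nose_vector"         # fall back to head orientation
    return "head_shoulder_pose"      # coarsest fallback

print(select_gaze_method(pupils_visible=False, nose_visible=True))
```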

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Multimedia (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for gesture control in an interior (1) of a vehicle. According to the invention, a detection unit (2) designed as a 3D camera captures image data (I) of at least one position of an occupant (4) and of at least one gesture of the occupant (4) in a detection range (E) of the detection unit (2). The invention further relates to a device (V) for gesture control in an interior (1) of a vehicle.
PCT/IB2015/001966 2014-10-22 2015-10-22 Method and device for gesture control in a vehicle WO2016067082A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102014221488.2 2014-10-22
DE102014221488 2014-10-22

Publications (1)

Publication Number Publication Date
WO2016067082A1 (fr) 2016-05-06

Family

ID=54695763

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/001966 WO2016067082A1 (fr) Method and device for gesture control in a vehicle

Country Status (2)

Country Link
DE (1) DE202015105611U1 (fr)
WO (1) WO2016067082A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016202526A1 (de) * 2016-02-18 2017-08-24 Bayerische Motoren Werke Aktiengesellschaft Method and device for recognizing an operating gesture of a user, in particular in a motor vehicle
DE102017209200A1 (de) * 2017-05-31 2018-12-06 Bayerische Motoren Werke Aktiengesellschaft Method, computer-readable medium, system, and vehicle comprising the system, for providing camera data of at least one vehicle camera on a display device of the vehicle while the vehicle is driving
DE102017211516A1 (de) * 2017-07-06 2019-01-10 Bayerische Motoren Werke Aktiengesellschaft Vehicle seat and vehicle comprising a vehicle seat
DE102017214009B4 (de) * 2017-08-10 2020-06-18 Volkswagen Aktiengesellschaft Method and device for detecting the presence and/or movement of a vehicle occupant
DE102018222127A1 (de) * 2018-12-18 2020-06-18 Volkswagen Aktiengesellschaft Vehicle with an infrared camera for driver identification and/or driver monitoring
GB2585247B (en) * 2019-07-05 2022-07-27 Jaguar Land Rover Ltd Occupant classification method and apparatus

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090278915A1 (en) * 2006-02-08 2009-11-12 Oblong Industries, Inc. Gesture-Based Control System For Vehicle Interfaces
US20140195096A1 (en) * 2011-06-30 2014-07-10 Johnson Controls Gmbh Apparatus and method for contactlessly detecting objects and/or persons and gestures and/or operating procedures made and/or carried out thereby

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019215286A1 (fr) * 2018-05-09 2019-11-14 Motherson Innovations Company Ltd. Device and method for operating an object detection system for the passenger compartment of a motor vehicle, and motor vehicle
US11697381B2 (en) 2018-05-09 2023-07-11 Motherson Innovations Company Limited Device and method for operating an object detection system for the passenger compartment of a motor vehicle, and a motor vehicle
CN113821106A (zh) * 2021-10-08 2021-12-21 江苏铁锚玻璃股份有限公司 Intelligent function navigation method and structure based on a smart transparent OLED vehicle window

Also Published As

Publication number Publication date
DE202015105611U1 (de) 2015-11-09

Similar Documents

Publication Publication Date Title
WO2016067082A1 Method and device for gesture control in a vehicle
EP3013621B1 Motor vehicle control interface with gesture recognition
DE102013012466B4 Operating system and method for operating a vehicle-side device
DE102011053449A1 Human-machine interface based on finger pointing and gestures for vehicles
DE102008019731B4 Method and device for driver-passenger differentiation
DE102014201036A1 Image-based classification of driver state and/or driver behavior
DE102014004395B3 Method for operating a driver assistance system, and motor vehicle
DE102012109622A1 Method for controlling a display component of an adaptive display system
DE102013013227B4 Motor vehicle with display unit and safeguard against driver distraction
EP2915022B1 Method for inputting a control command for a component of a motor vehicle
DE102016206126A1 Method and device for monitoring or regulating a driving-task handover in a self-driving vehicle, and system for a driving-task handover in a self-driving vehicle
DE102019005448B4 Device and method for controlling a passenger seat in a vehicle
WO2016096235A1 Method for determining the viewing direction of a person
DE102016215291A1 Method for classifying driver movements
EP3254172B1 Determination of the position of an object external to a vehicle, inside the vehicle
DE102020105566A1 Monitoring of steering-wheel engagement for autonomous vehicles
DE102016210088B3 Method and device for displaying the surroundings of a motor vehicle
US20220083786A1 Method and system for monitoring a person using infrared and visible light
EP3727936A1 Method for recognizing at least one object located on a motor vehicle, control device, and motor vehicle
EP3230828A1 Detection device for recognizing a gesture and/or a viewing direction of an occupant of a vehicle by synchronous actuation of lighting units
DE102015010421A1 Three-dimensional detection of the vehicle interior
DE102014200783A1 Method, device and computer program product for operating a motor vehicle
EP3934929B1 Method for classifying objects in a motor vehicle
DE102014216053A1 Adaptation of the illumination wavelength during eye detection
DE102019004692B3 Device and method for determining image data of the eyes, eye positions and/or a viewing direction of a vehicle user in a vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15798182

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15798182

Country of ref document: EP

Kind code of ref document: A1