
WO2016087902A1 - Operating device for a vehicle, in particular a passenger vehicle; as well as method for operating such an operating device


Info

Publication number
WO2016087902A1
WO2016087902A1 (PCT/IB2014/066621)
Authority
WO
WIPO (PCT)
Prior art keywords
movement
hand
operating device
gesture
axis
Prior art date
Application number
PCT/IB2014/066621
Other languages
French (fr)
Inventor
Carsten KAUSCH
Original Assignee
Audi Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi Ag filed Critical Audi Ag
Priority to PCT/IB2014/066621 priority Critical patent/WO2016087902A1/en
Priority to CN201480083850.0A priority patent/CN107003142A/en
Publication of WO2016087902A1 publication Critical patent/WO2016087902A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3664Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/23Head-up displays [HUD]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3635Guidance using 3D or perspective road maps
    • G01C21/3638Guidance using 3D or perspective road maps including 3D objects and buildings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146Instrument input by gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146Instrument input by gesture
    • B60K2360/14643D-gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/177Augmented reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/178Warnings

Definitions

  • the invention relates to an operating device for a vehicle according to the preamble of patent claim 1 as well as a method for operating such an operating device according to the preamble of patent claim 8.
  • Such an operating device for a vehicle, in particular a passenger vehicle, and such a method for operating an operating device are known from DE 10 2012 219 814 Al.
  • the operating device comprises a gesture recognition unit configured to recognize at least one gesture performed by a human user of the vehicle.
  • the operating device is configured to execute at least one function upon recognition of the gesture.
  • EP 1 830 244 A2 shows a method comprising selecting an audio system or an air conditioning system by an acoustic signal.
  • a function and/or an operating condition of the selected system is set depending on a gesture of a user.
  • the gesture for setting the audio system and the air conditioning system is adapted to a component-specific setting of the function and/or the operating condition.
  • the gesture assigned to the function and/or the operating condition is set depending on the detection of the gesture by a detecting device.
  • DE 10 2012 219 814 Al shows a method for providing an operating input by using a head mounted display carried by a user.
  • a first aspect of the invention relates to an operating device for a vehicle, in particular a passenger vehicle.
  • the operating device comprises a gesture recognition unit configured to recognize at least one gesture performed by a human user of the vehicle, the operating device being configured to execute at least one function upon recognition of the gesture.
  • the gesture recognition unit has at least one three-dimensional detection area which is, for example, arranged in the interior of the vehicle.
  • the operating device is configured to execute respective functions upon recognition of eight gestures performed within the detection area by the user.
  • one of said functions of the operating device can be effected or initiated by the user as the user performs the corresponding gesture within the detection area.
  • a first one of the gestures comprises a translational movement of a hand of the user along a first movement axis into a first direction.
  • a second one of the gestures comprises a translational movement of the hand along the first movement axis into a second direction opposite to the first direction.
  • the user can move their hand forwards and backwards along the first movement axis within the detection area so that, by moving their hand along the first movement axis into the first direction, a first one of said functions can be effected and, by moving the hand into the second direction along the first movement axis, a second one of the functions can be effected by the user.
  • a third one of the gestures comprises a translational movement of the hand of the user along a second movement axis into a third direction, the second movement axis being perpendicular to the first movement axis.
  • a fourth one of the gestures comprises a translational movement of the hand along the second movement axis into a fourth direction opposite to the third direction.
  • the first movement axis and the second movement axis are virtual axes extending perpendicularly in relation to each other.
  • a fifth one of the gestures comprises a translational movement of the hand along a third movement axis into a fifth direction, the third movement axis being perpendicular to the first movement axis and the second movement axis.
  • a sixth one of the gestures comprises a translational movement of the hand along the third movement axis into a sixth direction opposite to the fifth direction.
  • the third movement axis is a virtual axis extending perpendicularly to the first movement axis and the second movement axis.
  • a seventh one of the gestures comprises a rotational movement of the hand about the third movement axis into a first rotating direction which is a seventh direction.
  • An eighth one of the gestures comprises a rotational movement of the hand about the third movement axis into a second rotating direction opposite to the first rotating direction, the second rotating direction being an eighth direction. For example, by rotating their hand to the left and to the right about the third movement axis within the detection area, the user can effect a seventh one of the functions and an eighth one of the functions of the operating device.
  • the operating device according to the present invention can be operated particularly comfortably and easily by the user since the user can perform the gestures very easily with their hand within the detection area which can be kept particularly small. Furthermore, said gestures are very easy inputs by means of which complex systems such as the operating device can be operated very easily.
  • the gesture recognition unit is configured to be switched from a locked state into an unlocked state upon recognition of a ninth gesture performed within the detection area by the user, the gesture recognition unit being configured to detect the ninth gesture.
  • the ninth gesture comprises a movement of two fingers of the hand of the user towards each other. In other words, the user performs the ninth gesture by moving two fingers of their hand towards each other within the detection area.
  • in the locked state, the recognition of said first eight gestures is omitted. In other words, when the gesture recognition unit is in the locked state, the eight functions corresponding to the first eight gestures cannot be effected by performing said first eight gestures.
  • In the unlocked state, the gesture recognition unit is ready to recognize said first eight gestures so that, in the unlocked state, the user can effect the eight functions by performing the corresponding first eight gestures within the detection area. Thereby, the risk of faulty operation can be kept particularly low since, in order to effect the eight functions, the user first needs to switch the gesture recognition unit from the locked state into the unlocked state.
  • the gesture recognition unit is configured to detect the ninth gesture in the locked state.
  • the gesture recognition unit is configured to be switched from the unlocked state into the locked state upon a tenth gesture performed by the user within the detection area, the gesture recognition unit being configured to recognize the tenth gesture, wherein the tenth gesture comprises a movement of the fingers away from each other.
  • the gesture recognition unit is switched from the unlocked state to the locked state when the user moves their fingers away from each other within the detection area.
  • the gesture recognition unit can be switched from the unlocked state into the locked state particularly easily so that the risk of faulty operation can be kept particularly low.
  • the gesture recognition unit is configured to be kept in the unlocked state as long as the fingers are kept together. This means the gesture recognition unit is switched from the unlocked state into the locked state as soon as the user opens their fingers, i.e. moves their fingers away from each other. This also means the gesture recognition unit is ready to recognize the first eight gestures only as long as the user keeps their fingers together. Thus, the risk of faulty operation can be kept particularly low.
  • the operating device is configured as a navigation system for the vehicle.
  • a navigation system is a very complex system which, conventionally, can be cumbersome to operate.
  • the operating device according to the present invention allows for realizing a particularly easy and comfortable operation for any complex system. Thereby, the user can operate the navigation system in a fast and easy way without getting distracted from traffic.
  • the navigation device comprises a display unit configured to show a virtual map, the functions relating to adjusting the virtual map.
  • the virtual map can be adjusted by the user by performing said first eight gestures.
  • an area represented by the virtual map can be changed and/or enlarged and/or scaled down by performing the first eight gestures.
  • the user can confirm or decline inputs by performing the gestures and/or navigate through a menu structure comprising different menus and/or tabs and/or menu interfaces by performing said gestures.
  • a second aspect of the present invention relates to a vehicle, in particular a passenger vehicle, comprising an operating device according to the present invention.
  • Advantages and advantageous embodiments of the first aspect of the invention are to be regarded as advantages and advantageous embodiments of the second aspect of the invention and vice versa.
  • a third aspect of the invention relates to a method for operating an operating device for a vehicle, the operating device comprising a gesture recognition unit by means of which at least one gesture performed by a user of the vehicle is recognized, wherein at least one function is executed by the operating device upon recognition of the gesture.
  • the gesture recognition unit has at least one three-dimensional detection area, the operating device executing respective functions upon recognition of eight gestures performed within the detection area by the human user.
  • a first one of the gestures comprises a translational movement of a hand of the user along a first movement axis into a first direction
  • a second one of the gestures comprises a translational movement of the hand along the first movement axis into a second direction opposite to the first direction
  • a third one of the gestures comprises a translational movement of the hand of the user along a second movement axis into a third direction, the second movement axis being perpendicular to the first movement axis
  • a fourth one of the gestures comprises a translational movement of the hand along the second movement axis into a fourth direction opposite to the third direction
  • a fifth one of the gestures comprises a translational movement of the hand along a third movement axis into a fifth direction
  • the third movement axis being perpendicular to the first movement axis and the second movement axis, and a sixth one of the gestures comprises a translational movement of the hand along the third movement axis into a sixth direction opposite to the fifth direction
  • Fig. 1 a schematic perspective view of an operating device for a vehicle, the operating device comprising a gesture recognition unit configured to recognize at least eight gestures performed by a user of the vehicle within a three-dimensional detection area of the gesture recognition unit;
  • Fig. 2 a schematic perspective view of a hand of the human user performing a ninth gesture
  • Fig. 3 a schematic perspective view of the hand of the human user performing a tenth gesture.
  • Fig. 1 shows in a schematic perspective view an operating device in the form of a navigation system for a vehicle in the form of a passenger vehicle.
  • the navigation system comprises a display unit 10 configured to display a virtual map 12 of, for example, the surroundings.
  • at least a portion of the earth's surface is represented by the virtual map 12, wherein a human user of the navigation system can adjust said portion to be represented by the virtual map 12, i.e. to be displayed by the display unit 10.
  • the display unit shows a roll or barrel having an outer lateral circumferential surface 14 forming the three-dimensional virtual map 12 so that, for example, buildings and/or mountains arranged in the portion shown by the virtual map 12 are represented in a three-dimensional manner.
  • Fig. 1 shows a front surface 16 of said barrel or roll.
  • the navigation system comprises a gesture recognition unit 18 configured to recognize a plurality of gestures performed by the human user, wherein the navigation system is configured to execute functions upon recognition of the corresponding gestures.
  • the gesture recognition unit 18 has a three-dimensional detection area 20 which is arranged in the interior of the passenger vehicle, the operating device being configured to execute respective functions upon recognition of first eight gestures performed within the detection area 20 by the user.
  • a first one of the first eight gestures comprises a translational movement of a hand of the user along a first movement axis 22 into a first direction illustrated by a directional arrow 24.
  • a second one of the first eight gestures comprises a translational movement of the hand along the first movement axis 22 into a second direction opposite to a first direction and illustrated by a directional arrow 26.
  • a third one of the first eight gestures comprises a translational movement of the hand of the user along a second movement axis 28 into a third direction illustrated by a directional arrow 30, the second movement axis 28 extending perpendicularly to the first movement axis 22.
  • a fourth one of the first eight gestures comprises a translational movement of the hand along the second movement axis 28 into a fourth direction opposite to the third direction and illustrated by a directional arrow 32.
  • a fifth one of the first eight gestures comprises a translational movement of the hand along a third movement axis 34 into a fifth direction illustrated by a directional arrow 36, the third movement axis 34 extending perpendicularly to the first movement axis 22 and the second movement axis 28.
  • a sixth one of the first eight gestures comprises a translational movement of the hand along the third movement axis 34 into a sixth direction opposite to the fifth direction illustrated by directional arrow 38.
  • a seventh one of the first eight gestures comprises a rotational movement of the hand about the third movement axis 34 into a first rotating direction illustrated by a directional arrow 40.
  • an eighth one of the first eight gestures comprises a rotational movement of the hand about the third movement axis 34 into a second rotating direction opposite to the first rotating direction and illustrated by a directional arrow 42.
  • the first movement axis 22 corresponds to the transverse direction of the vehicle.
  • the second movement axis 28 corresponds to the vertical direction of the vehicle and the third movement axis 34 corresponds to the longitudinal direction of the vehicle.
  • the virtual map 12 and/or the barrel can be moved to the left and to the right.
  • the virtual map 12 and/or the barrel can be moved up and down.
  • the scale of the virtual map 12 can be enlarged.
  • the scale of the virtual map 12 can be decreased.
  • the user can navigate through a menu structure of a graphical user interface (GUI) of the navigation system by performing the gestures within the detection area 20.
  • the user can confirm an input into the navigation system.
  • the user can decline inputs and/or navigate to previous menu interfaces of the navigation system.
  • the gesture recognition unit 18 is configured to be switched from a locked state into an unlocked state upon recognition of a ninth gesture performed within the detection area 20 and illustrated in Fig. 2,
  • the ninth gesture comprising a movement of two fingers 44 and 46 of the hand 48 towards each other so that respective tips of the fingers 44 and 46 meet in one point 50.
  • in the locked state, the recognition of said first eight gestures is omitted.
  • in the unlocked state, the gesture recognition unit 18 is ready to recognize said first eight gestures.
  • the gesture recognition unit 18 is configured to be switched from the unlocked state into the locked state upon recognition of a tenth gesture illustrated in Fig. 3, the tenth gesture comprising a movement of the fingers 44 and 46 away from each other. As can be seen in Fig. 3, the tenth gesture is performed by opening the fingers 44 and 46 so that the finger tips are arranged at a distance from each other. In the present case, the finger 44 is the forefinger and the finger 46 is the thumb of the hand 48.
  • upon recognition of the gestures by the gesture recognition unit 18, respective sounds are played and/or the colour of the front surface 16 is changed, thereby realizing acoustical and optical feedback to the user.
  • the front surface 16 is grey as default colour and the gesture recognition unit 18 scans for input.
  • the gesture recognition unit 18 cannot recognize any gesture except for the ninth gesture.
  • the gesture recognition unit 18 is switched from the locked state into the unlocked state.
  • when the gesture recognition unit 18 detects the fingers 44 and 46 being arranged at a distance from each other, a short activation sound is played and the front surface 16 changes its colour from grey to green with a fast rotating colour effect.
  • when the fingers 44 and 46 are brought together, a short activation sound is played and the fast rotating colour effect is stopped so that the green colour of the front surface 16 is steady.
  • the gesture recognition unit 18 is kept in the unlocked state only as long as the fingers 44 and 46, in particular the finger tips, are kept together within the detection area 20. Then, the user can perform said first eight gestures in order to effect the corresponding eight functions of the navigation system. For example, every time the gesture recognition unit 18 recognizes one of said first eight gestures, a corresponding adjusting sound is played in order to inform the user that the respective gesture was detected properly.
  • the gesture recognition unit 18 is switched back from the unlocked state into the locked state, wherein the front surface 16 changes its colour from green back to static grey.
  • the navigation system is an operation device having a plurality of different functions.
  • the functions relate to adjusting the virtual map 12.
  • the navigation system is a very complex system.
  • this complex system can be operated by the user in a particularly easy and comfortable way by performing said gestures.
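The gesture-to-function assignment described in the embodiment above (pan left/right, pan up/down, enlarge/decrease the map scale, rotate the barrel) can be sketched loosely as a dispatch table. This is an illustrative sketch only, not part of the patent; all function names, gesture labels, and step sizes are invented for the example:

```python
def make_map_dispatch(map_state):
    """Build a table mapping the eight gestures to map adjustments.

    map_state is a dict with keys 'x', 'y', 'zoom', 'rotation';
    the gesture labels (axis1_pos, ...) are hypothetical names for the
    eight gestures along/about the three movement axes.
    """
    def pan(dx, dy):
        map_state["x"] += dx
        map_state["y"] += dy

    def zoom(factor):
        map_state["zoom"] *= factor

    def rotate(angle):
        map_state["rotation"] += angle

    return {
        "axis1_pos": lambda: pan(+1, 0),   # hand right: move map right
        "axis1_neg": lambda: pan(-1, 0),   # hand left: move map left
        "axis2_pos": lambda: pan(0, +1),   # hand up: move map up
        "axis2_neg": lambda: pan(0, -1),   # hand down: move map down
        "axis3_pos": lambda: zoom(2.0),    # hand forward: enlarge scale
        "axis3_neg": lambda: zoom(0.5),    # hand back: decrease scale
        "rot_pos":   lambda: rotate(+15),  # rotate barrel one way
        "rot_neg":   lambda: rotate(-15),  # rotate barrel the other way
    }
```

A recognized gesture label simply indexes the table and invokes the corresponding map adjustment.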

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an operating device for a vehicle, the operating device comprising a gesture recognition unit (18) configured to recognize at least one gesture performed by a user of the vehicle, the operating device being configured to execute at least one function upon recognition of the gesture, wherein the gesture recognition unit (18) has at least one three-dimensional detection area (20), the operating device being configured to execute respective functions upon recognition of eight gestures performed within the detection area (20) by the user.

Description

Operating Device for a Vehicle, in particular a Passenger Vehicle; as well as Method for Operating such an Operating Device
Field of the Invention
The invention relates to an operating device for a vehicle according to the preamble of patent claim 1 as well as a method for operating such an operating device according to the preamble of patent claim 8.
Background Art
Such an operating device for a vehicle, in particular a passenger vehicle, and such a method for operating an operating device are known from DE 10 2012 219 814 Al. The operating device comprises a gesture recognition unit configured to recognize at least one gesture performed by a human user of the vehicle. The operating device is configured to execute at least one function upon recognition of the gesture.
EP 1 830 244 A2 shows a method comprising selecting an audio system or an air conditioning system by an acoustic signal. A function and/or an operating condition of the selected system is set depending on a gesture of a user. The gesture for setting the audio system and the air conditioning system is adapted to a component-specific setting of the function and/or the operating condition. The gesture assigned to the function and/or the operating condition is set depending on the detection of the gesture by a detecting device.
Moreover, DE 10 2012 219 814 Al shows a method for providing an operating input by using a head mounted display carried by a user.
Summary of the Invention
Technical problem to be solved
It is an object of the present invention to provide an operating device for a vehicle and a method for operating an operating device for a vehicle by means of which a particularly easy and comfortable operation of the operating device can be realized.
Technical solution
This object is solved by an operating device having the features of patent claim 1 as well as a method having the features of patent claim 8. Advantageous embodiments with expedient developments of the invention are indicated in the other patent claims.
A first aspect of the invention relates to an operating device for a vehicle, in particular a passenger vehicle. The operating device comprises a gesture recognition unit configured to recognize at least one gesture performed by a human user of the vehicle, the operating device being configured to execute at least one function upon recognition of the gesture.
In order to realize a particularly easy and comfortable operation of said operating device, according to the present invention the gesture recognition unit has at least one three-dimensional detection area which is, for example, arranged in the interior of the vehicle. The operating device is configured to execute respective functions upon recognition of eight gestures performed within the detection area by the user. In other words, one of said functions of the operating device can be effected or initiated by the user as the user performs the corresponding gesture within the detection area. In the operating device according to the present invention, a first one of the gestures comprises a translational movement of a hand of the user along a first movement axis into a first direction. A second one of the gestures comprises a translational movement of the hand along the first movement axis into a second direction opposite to the first direction. In other words, the user can move their hand forwards and backwards along the first movement axis within the detection area so that, by moving their hand along the first movement axis into the first direction, a first one of said functions can be effected and, by moving the hand into the second direction along the first movement axis, a second one of the functions can be effected by the user.
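A minimal sketch of how such an axis-aligned translational gesture might be classified from a hand displacement measured in the detection area. The axis indexing, coordinate convention, and threshold value are assumptions for illustration, not taken from the patent:

```python
def classify_translation(start, end, threshold=0.05):
    """Classify a hand movement from 3D position `start` to `end`.

    Returns (axis_index, direction) for the dominant axis of the
    displacement, or None if the movement is below the threshold.
    axis_index: 0 = first movement axis, 1 = second, 2 = third.
    direction: +1 or -1 along that axis (the two opposite gestures).
    """
    disp = [e - s for s, e in zip(start, end)]
    # pick the axis with the largest absolute displacement
    axis = max(range(3), key=lambda i: abs(disp[i]))
    if abs(disp[axis]) < threshold:
        return None  # movement too small: no gesture recognized
    return axis, 1 if disp[axis] > 0 else -1
```

Each of the six translational gestures then corresponds to one (axis, direction) pair.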
A third one of the gestures comprises a translational movement of the hand of the user along a second movement axis into a third direction, the second movement axis being perpendicular to the first movement axis. A fourth one of the gestures comprises a translational movement of the hand along the second movement axis into a fourth direction opposite to the third direction. In other words, the first movement axis and the second movement axis are virtual axes extending perpendicularly in relation to each other. By moving their hand forwards and backwards along the second movement axis within the detection area, the user can effect a third one of the functions and a fourth one of the functions.
A fifth one of the gestures comprises a translational movement of the hand along a third movement axis into a fifth direction, the third movement axis being perpendicular to the first movement axis and the second movement axis. A sixth one of the gestures comprises a translational movement of the hand along the third movement axis into a sixth direction opposite to the fifth direction. This means the third movement axis is a virtual axis extending perpendicularly to the first movement axis and the second movement axis. By moving their hand forwards and backwards along the third movement axis within the detection area, the user can effect a fifth one of the functions and a sixth one of the eight functions.
Moreover, a seventh one of the gestures comprises a rotational movement of the hand about the third movement axis into a first rotating direction which is a seventh direction. An eighth one of the gestures comprises a rotational movement of the hand about the third movement axis into a second rotating direction opposite to the first rotating direction, the second rotating direction being an eighth direction. For example, by rotating their hand to the left and to the right about the third movement axis within the detection area, the user can effect a seventh one of the functions and an eighth one of the functions of the operating device.
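For illustration only (this sketch is not part of the disclosure), the classification of a tracked hand movement into one of the eight gestures described above could look as follows. The axis labels, thresholds, and gesture names are hypothetical assumptions, not taken from the application.

```python
# Illustrative sketch: classify a tracked hand movement into one of the
# eight gestures (three translation axes, two directions each, plus two
# rotating directions about the third axis). Thresholds are assumptions.

def classify_gesture(dx, dy, dz, d_yaw):
    """Map a displacement (dx, dy, dz) and a rotation about the third
    movement axis (d_yaw) to one of eight gesture labels, or None."""
    MOVE_THRESHOLD = 0.05  # minimum translation counted as deliberate
    ROT_THRESHOLD = 0.3    # minimum rotation counted as deliberate

    # A dominant rotation about the third axis takes precedence.
    if abs(d_yaw) >= ROT_THRESHOLD:
        return "rotate_cw" if d_yaw > 0 else "rotate_ccw"

    # Otherwise pick the dominant translation axis.
    magnitudes = {"x": dx, "y": dy, "z": dz}
    axis = max(magnitudes, key=lambda a: abs(magnitudes[a]))
    if abs(magnitudes[axis]) < MOVE_THRESHOLD:
        return None  # movement too small to be a gesture
    labels = {
        "x": ("move_right", "move_left"),    # first movement axis
        "y": ("move_up", "move_down"),       # second movement axis
        "z": ("move_forward", "move_back"),  # third movement axis
    }
    positive, negative = labels[axis]
    return positive if magnitudes[axis] > 0 else negative
```

In this sketch the three movement axes span an orthogonal frame, so each gesture reduces to the sign of the dominant component, which is why eight distinct inputs suffice.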
The operating device according to the present invention can be operated particularly comfortably and easily by the user since the user can perform the gestures very easily with their hand within the detection area which can be kept particularly small. Furthermore, said gestures are very easy inputs by means of which complex systems such as the operating device can be operated very easily.
In an advantageous embodiment of the invention, the gesture recognition unit is configured to be switched from a locked state into an unlocked state upon recognition of a ninth gesture performed within the detection area by the user, the gesture recognition unit being configured to detect the ninth gesture. The ninth gesture comprises a movement of two fingers of the hand of the user towards each other. In other words, the user performs the ninth gesture by moving two fingers of their hand towards each other within the detection area. In the locked state, the recognition of said first eight gestures is omitted. In other words, when the gesture recognition unit is in the locked state the eight functions corresponding to the first eight gestures cannot be effected by performing said first eight gestures. In the unlocked state, the gesture recognition unit is ready to recognize said first eight gestures so that, in the unlocked state, the user can effect the eight functions by performing the corresponding first eight gestures within the detection area. Thereby, the risk of faulty operation can be kept particularly low since, in order to effect the eight functions, the user first needs to switch the gesture recognition unit from the locked state into the unlocked state. However, the gesture recognition unit is configured to detect the ninth gesture in the locked state.
In a particularly advantageous embodiment of the invention, the gesture recognition unit is configured to be switched from the unlocked state into the locked state upon a tenth gesture performed by the user within the detection area, the gesture recognition unit being configured to recognize the tenth gesture, wherein the tenth gesture comprises a movement of the fingers away from each other. In other words, the gesture recognition unit is switched from the unlocked state to the locked state when the user moves their fingers away from each other within the detection area. Thereby, the gesture recognition unit can be switched from the unlocked state into the locked state particularly easily so that the risk of faulty operation can be kept particularly low.
In a further advantageous embodiment of the invention, the gesture recognition unit is configured to be kept in the unlocked state as long as the fingers are kept together. This means the gesture recognition unit is switched from the unlocked state into the locked state as soon as the user opens their fingers, i.e. moves their fingers away from each other. This also means the gesture recognition unit is ready to recognize the first eight gestures only as long as the user keeps their fingers together. Thus, the risk of faulty operation can be kept particularly low.
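The locked/unlocked behaviour described in the preceding embodiments can be sketched as a small state machine. This is a hypothetical illustration; the class and method names are assumptions and not part of the disclosure.

```python
# Illustrative sketch of the locked/unlocked states: the unit reports the
# eight movement gestures only while the two fingers are held together.

class GestureRecognitionUnit:
    def __init__(self):
        self.locked = True  # locked state is the default

    def on_fingers(self, together):
        """Ninth gesture (fingers together) unlocks; tenth (apart) locks."""
        self.locked = not together

    def recognize(self, gesture):
        """Return the gesture only in the unlocked state; otherwise omit it."""
        if self.locked:
            return None  # recognition of the eight gestures is omitted
        return gesture
```

The point of the design is that the unit stays unlocked only while the fingers remain together, so an accidental hand movement in the detection area cannot trigger one of the eight functions.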
In a further advantageous embodiment of the invention, the operating device is configured as a navigation system for the vehicle. Such a navigation system is a very complex system which, conventionally, can be cumbersome to operate. However, the operating device according to the present invention allows for realizing a particularly easy and comfortable operation for any complex system. Thereby, the user can operate the navigation system in a fast and easy way without getting distracted from traffic.
For example, the navigation device comprises a display unit configured to show a virtual map, the functions relating to adjusting the virtual map. In other words, the virtual map can be adjusted by the user by performing said first eight gestures. For example, an area represented by the virtual map can be changed and/or enlarged and/or scaled down by performing the first eight gestures. Moreover, for example, the user can confirm or decline inputs by performing the gestures and/or navigate through a menu structure comprising different menus and/or tabs and/or menu interfaces by performing said gestures.
A second aspect of the present invention relates to a vehicle, in particular a passenger vehicle, comprising an operating device according to the present invention. Advantages and advantageous embodiments of the first aspect of the invention are to be regarded as advantages and advantageous embodiments of the second aspect of the invention and vice versa.
A third aspect of the invention relates to a method for operating an operating device for a vehicle, the operating device comprising a gesture recognition unit by means of which at least one gesture performed by a user of the vehicle is recognized, wherein at least one function is executed by the operating device upon recognition of the gesture.
In order to realize a particularly easy and comfortable operation of the operating device, according to the present invention the gesture recognition unit has at least one three-dimensional detection area, the operating device executing respective functions upon recognition of eight gestures performed within the detection area by the human user. Therein, a first one of the gestures comprises a translational movement of a hand of the user along a first movement axis into a first direction, a second one of the gestures comprises a translational movement of the hand along the first movement axis into a second direction opposite to the first direction, a third one of the gestures comprises a translational movement of the hand of the user along a second movement axis into a third direction, the second movement axis being perpendicular to the first movement axis, a fourth one of the gestures comprises a translational movement of the hand along the second movement axis into a fourth direction opposite to the third direction, a fifth one of the gestures comprises a translational movement of the hand along a third movement axis into a fifth direction, the third movement axis being perpendicular to the first movement axis and the second movement axis, a sixth one of the gestures comprises a translational movement of the hand along the third movement axis into a sixth direction opposite to the fifth direction, a seventh one of the gestures comprises a rotational movement of the hand about the third movement axis into a first rotating direction, and an eighth one of the gestures comprises a rotational movement of the hand about the third movement axis into a second rotating direction opposite to the first rotating direction. Advantages and advantageous embodiments of the first aspect and the second aspect of the invention are to be regarded as advantages and advantageous embodiments of the third aspect of the invention and vice versa.
Further advantages, features and details of the invention derive from the following description of a preferred embodiment as well as from the drawing. The features and feature combinations previously mentioned in the description as well as the features and feature combinations mentioned in the following description of the figures and/or shown in the figures alone can be employed not only in the respective indicated combination but also in any other combination or taken alone without leaving the scope of the invention.
Brief Description of the Drawings
Fig. 1 a schematic perspective view of an operating device for a vehicle, the operating device comprising a gesture recognition unit configured to recognize at least eight gestures performed by a user of the vehicle within a three-dimensional detection area of the gesture recognition unit;
Fig. 2 a schematic perspective view of a hand of the human user performing a ninth gesture; and
Fig. 3 a schematic perspective view of the hand of the human user performing a tenth gesture.

Detailed Description of Embodiments
In the figures the same elements or elements having the same functions are indicated with the same reference signs.
Fig. 1 shows in a schematic perspective view an operating device in the form of a navigation system for a vehicle in the form of a passenger vehicle. The navigation system comprises a display unit 10 configured to display a virtual map 12 of, for example, the surroundings. In other words, at least a portion of the earth's surface is represented by the virtual map 12, wherein a human user of the navigation system can adjust said portion to be represented by the virtual map 12, i.e. to be displayed by the display unit 10.
In the present case, the display unit 10 shows a roll or barrel having an outer lateral circumferential surface 14 forming the three-dimensional virtual map 12 so that, for example, buildings and/or mountains arranged in the portion shown by the virtual map 12 are represented in a three-dimensional manner. Moreover, Fig. 1 shows a front surface 16 of said barrel or roll.
In order to realize a particularly easy and comfortable operation of the navigation system, the navigation system comprises a gesture recognition unit 18 configured to recognize a plurality of gestures performed by the human user, wherein the navigation system is configured to execute functions upon recognition of the corresponding gestures.
The gesture recognition unit 18 has a three-dimensional detection area 20 which is arranged in the interior of the passenger vehicle, the operating device being configured to execute respective functions upon recognition of first eight gestures performed within the detection area 20 by the user. A first one of the first eight gestures comprises a translational movement of a hand of the user along a first movement axis 22 into a first direction illustrated by a directional arrow 24. A second one of the first eight gestures comprises a translational movement of the hand along the first movement axis 22 into a second direction opposite to the first direction and illustrated by a directional arrow 26. A third one of the first eight gestures comprises a translational movement of the hand of the user along a second movement axis 28 into a third direction illustrated by a directional arrow 30, the second movement axis 28 extending perpendicularly to the first movement axis 22. A fourth one of the first eight gestures comprises a translational movement of the hand along the second movement axis 28 into a fourth direction opposite to the third direction and illustrated by a directional arrow 32. A fifth one of the first eight gestures comprises a translational movement of the hand along a third movement axis 34 into a fifth direction illustrated by a directional arrow 36, the third movement axis 34 extending perpendicularly to the first movement axis 22 and the second movement axis 28.
A sixth one of the first eight gestures comprises a translational movement of the hand along the third movement axis 34 into a sixth direction opposite to the fifth direction and illustrated by a directional arrow 38. A seventh one of the first eight gestures comprises a rotational movement of the hand about the third movement axis 34 into a first rotating direction illustrated by a directional arrow 40. Moreover, an eighth one of the first eight gestures comprises a rotational movement of the hand about the third movement axis 34 into a second rotating direction opposite to the first rotating direction and illustrated by a directional arrow 42. Thus, a four-plus-four gesture operation is realized to operate the navigation system.
For example, the first movement axis 22 corresponds to the transverse direction of the vehicle. The second movement axis 28 corresponds to the vertical direction of the vehicle and the third movement axis 34 corresponds to the longitudinal direction of the vehicle. Moreover, for example, by moving their hand along the first movement axis 22, i.e. to the left and to the right, the user can move the virtual map 12 and/or the barrel to the left and to the right. By moving the hand along the second movement axis 28, i.e. up and down, for example, the virtual map 12 and/or the barrel can be moved up and down. By rotating the hand about the third movement axis 34 into the first rotating direction, i.e. clockwise, for example, the scale of the virtual map 12 can be enlarged. Moreover, by rotating the hand about the third movement axis 34 into the second rotating direction, i.e. counter-clockwise, for example, the scale of the virtual map 12 can be decreased.
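The example mapping above (pan left/right and up/down, clockwise rotation enlarges the scale, counter-clockwise decreases it) could be sketched as a simple dispatch table. The gesture labels, the map state representation, and the step sizes are hypothetical assumptions for illustration.

```python
# Illustrative dispatch of recognized gestures to map adjustments,
# following the example mapping in the description. The (x, y, scale)
# state and the unit step sizes are assumptions.

def apply_gesture(map_state, gesture):
    """Return a new (x, y, scale) map state after applying one gesture."""
    x, y, scale = map_state
    actions = {
        "move_left":  (x - 1, y, scale),
        "move_right": (x + 1, y, scale),
        "move_up":    (x, y + 1, scale),
        "move_down":  (x, y - 1, scale),
        "rotate_cw":  (x, y, scale * 2),   # clockwise: enlarge the scale
        "rotate_ccw": (x, y, scale / 2),   # counter-clockwise: decrease it
    }
    return actions.get(gesture, (x, y, scale))  # unknown gesture: no change
```

For example, `apply_gesture((0, 0, 1.0), "rotate_cw")` would double the scale while leaving the pan position unchanged.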
Alternatively or additionally, the user can navigate through a menu structure of a graphical user interface (GUI) of the navigation system by performing the gestures within the detection area 20. For example, by moving the hand along the third movement axis 34 into the fifth direction, i.e. forwards, the user can confirm an input into the navigation system. Moreover, by moving the hand along the third movement axis 34 into the sixth direction, i.e. backwards, for example, the user can decline inputs and/or navigate to previous menu interfaces of the navigation system. The gesture recognition unit 18 is configured to be switched from a locked state into an unlocked state upon recognition of a ninth gesture performed within the detection area 20 and illustrated in Fig. 2, the ninth gesture comprising a movement of two fingers 44 and 46 of the hand 48 towards each other so that respective tips of the fingers 44 and 46 meet in one point 50. In the locked state, the recognition of said first eight gestures is omitted. In the unlocked state, the gesture recognition unit 18 is ready to recognize said first eight gestures.
The gesture recognition unit 18 is configured to be switched from the unlocked state into the locked state upon recognition of a tenth gesture illustrated in Fig. 3, the tenth gesture comprising a movement of the fingers 44 and 46 away from each other. As can be seen in Fig. 3, the tenth gesture is performed by opening the fingers 44 and 46 so that the finger tips are arranged at a distance from each other. In the present case, the finger 44 is the forefinger and the finger 46 is the thumb of the hand 48.
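The ninth and tenth gestures, i.e. the fingertips of the forefinger 44 and the thumb 46 meeting or separating, could be detected from tracked fingertip positions by a simple distance test. This sketch is an assumption for illustration; the threshold value and coordinate convention are not specified in the application.

```python
# Hypothetical detection of the ninth/tenth gestures: the fingers count
# as "together" when the forefinger and thumb tips are closer than a
# small distance threshold (an assumed value).
import math

def fingers_together(forefinger_tip, thumb_tip, threshold=0.01):
    """True when the two fingertip positions (x, y, z) meet at one point,
    i.e. their Euclidean distance falls below the threshold."""
    distance = math.dist(forefinger_tip, thumb_tip)
    return distance < threshold
```

Feeding this boolean into the locked/unlocked logic would unlock the unit on a pinch (ninth gesture) and lock it again as soon as the fingertips separate (tenth gesture).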
Moreover, upon recognition of the gestures respective sounds are played and/or the colour of the front surface 16 is changed, thereby realizing an acoustical and optical feedback to the user. For example, in the locked state, the front surface 16 is grey as the default colour and the gesture recognition unit 18 scans for input. Preferably, in the locked state, the gesture recognition unit 18 cannot recognize any gesture except for the ninth gesture. Thus, when the user performs the ninth gesture, i.e. moves the fingers 44 and 46 towards each other within the detection area 20, the gesture recognition unit 18 is switched from the locked state into the unlocked state. For example, when the user moves the fingers 44 and 46 into the detection area 20 and the gesture recognition unit 18 then detects the fingers 44 and 46 being arranged at a distance from each other, a short activation sound is played and the front surface 16 changes its colour from grey to green with a fast rotating colour effect. When the user moves the fingers 44 and 46 towards each other, i.e. performs the ninth gesture, and the gesture recognition unit 18 recognizes the ninth gesture, a short activation sound is played and the fast rotating colour effect is stopped so that the green colour of the front surface 16 is steady.
The gesture recognition unit 18 is kept in the unlocked state only as long as the fingers 44 and 46, in particular the finger tips, are kept together within the detection area 20. Then, the user can perform said first eight gestures in order to effect the corresponding eight functions of the navigation system. For example, every time the gesture recognition unit 18 recognizes one of said first eight gestures, a corresponding adjusting sound is played in order to inform the user that the respective gesture was detected properly. When the user opens the fingers 44 and 46, i.e. performs the tenth gesture, the gesture recognition unit 18 is switched back from the unlocked state into the locked state, wherein the front surface 16 changes its colour from green back to static grey.
The navigation system is an operating device having a plurality of different functions. In the present case, the functions relate to adjusting the virtual map 12. Thus, the navigation system is a very complex system. However, this complex system can be operated by the user in a particularly easy and comfortable way by performing said gestures.

Claims

What is claimed is:
1. An operating device for a vehicle, the operating device comprising a gesture recognition unit (18) configured to recognize at least one gesture performed by a user of the vehicle, the operating device being configured to execute at least one function upon recognition of the gesture,
characterized in that
the gesture recognition unit (18) has at least one three-dimensional detection area (20), the operating device being configured to execute respective functions upon recognition of eight gestures performed within the detection area (20) by the user, wherein:
- a first one of the gestures comprises a translational movement of a hand (48) of the user along a first movement axis (22) into a first direction (24);
- a second one of the gestures comprises a translational movement of the hand (48) along the first movement axis (22) into a second direction (26) opposite to the first direction (24);
- a third one of the gestures comprises a translational movement of the hand (48) of the user along a second movement axis (28) into a third direction (30), the second movement axis (28) being perpendicular to the first movement axis (22);
- a fourth one of the gestures comprises a translational movement of the hand (48) along the second movement axis (28) into a fourth direction (32) opposite to the third direction (30);
- a fifth one of the gestures comprises a translational movement of the hand (48) along a third movement axis (34) into a fifth direction (36), the third movement axis (34) being perpendicular to the first movement axis (22) and the second movement axis (28);
- a sixth one of the gestures comprises a translational movement of the hand (48) along the third movement axis (34) into a sixth direction (38) opposite to the fifth direction (36);
- a seventh one of the gestures comprises a rotational movement of the hand (48) about the third movement axis (34) into a first rotating direction (40); and
- an eighth one of the gestures comprises a rotational movement of the hand (48) about the third movement axis (34) into a second rotating direction (42) opposite to the first rotating direction (40).
2. The operating device according to claim 1,
characterized in that
the gesture recognition unit (18) is configured to be switched from a locked state into an unlocked state upon recognition of a ninth gesture performed within the detection area (20), the ninth gesture comprising a movement of two fingers (44, 46) of the hand (48) towards each other, wherein, in the locked state, the recognition of said eight gestures is omitted and, in the unlocked state, the gesture recognition unit (18) is ready to recognize said eight gestures.
3. The operating device according to claim 2,
characterized in that
the gesture recognition unit (18) is configured to be switched from the unlocked state into the locked state upon recognition of a tenth gesture performed within the detection area (20), the tenth gesture comprising a movement of the fingers (44, 46) away from each other.
4. The operating device according to any one of claims 2 or 3,
characterized in that the gesture recognition unit (18) is configured to be kept in the unlocked state as long as the fingers (44, 46) are kept together.
5. The operating device according to any one of the preceding claims, characterized in that
the operating device is configured as a navigation system for the vehicle.
6. The operating device according to claim 5,
characterized in that
the navigation device comprises a display unit (10) configured to show a virtual map (12), the functions relating to adjusting the virtual map (12).
7. A vehicle, in particular a passenger vehicle, comprising an operating device according to any one of the preceding claims.
8. A method for operating an operating device for a vehicle, the operating device comprising a gesture recognition unit (18) by means of which at least one gesture performed by a user of the vehicle is recognized, wherein at least one function is executed by the operating device upon recognition of the gesture,
characterized in that
the gesture recognition unit has at least one three-dimensional detection area, the operating device executing respective functions upon recognition of eight gestures performed within the detection area by the user, wherein:
- a first one of the gestures comprises a translational movement of a hand (48) of the user along a first movement axis (22) into a first direction (24);
- a second one of the gestures comprises a translational movement of the hand (48) along the first movement axis (22) into a second direction (26) opposite to the first direction (24);
- a third one of the gestures comprises a translational movement of the hand (48) of the user along a second movement axis (28) into a third direction (30), the second movement axis (28) being perpendicular to the first movement axis (22);
- a fourth one of the gestures comprises a translational movement of the hand (48) along the second movement axis (28) into a fourth direction (32) opposite to the third direction (30);
- a fifth one of the gestures comprises a translational movement of the hand (48) along a third movement axis (34) into a fifth direction (36), the third movement axis (34) being perpendicular to the first movement axis (22) and the second movement axis (28);
- a sixth one of the gestures comprises a translational movement of the hand (48) along the third movement axis (34) into a sixth direction (38) opposite to the fifth direction (36);
- a seventh one of the gestures comprises a rotational movement of the hand (48) about the third movement axis (34) into a first rotating direction (40); and
- an eighth one of the gestures comprises a rotational movement of the hand (48) about the third movement axis (34) into a second rotating direction (42) opposite to the first rotating direction (40).
PCT/IB2014/066621 2014-12-05 2014-12-05 Operating device for a vehicle, in particular a passenger vehicle; as well as method for operating such an operating device WO2016087902A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/IB2014/066621 WO2016087902A1 (en) 2014-12-05 2014-12-05 Operating device for a vehicle, in particular a passenger vehicle; as well as method for operating such an operating device
CN201480083850.0A CN107003142A (en) 2014-12-05 2014-12-05 The operation device and its operating method of vehicle particularly passenger stock

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2014/066621 WO2016087902A1 (en) 2014-12-05 2014-12-05 Operating device for a vehicle, in particular a passenger vehicle; as well as method for operating such an operating device

Publications (1)

Publication Number Publication Date
WO2016087902A1 true WO2016087902A1 (en) 2016-06-09

Family

ID=52424057

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/066621 WO2016087902A1 (en) 2014-12-05 2014-12-05 Operating device for a vehicle, in particular a passenger vehicle; as well as method for operating such an operating device

Country Status (2)

Country Link
CN (1) CN107003142A (en)
WO (1) WO2016087902A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030151592A1 (en) * 2000-08-24 2003-08-14 Dieter Ritter Method for requesting destination information and for navigating in a map view, computer program product and navigation unit
EP1830244A2 (en) 2006-03-01 2007-09-05 Audi Ag Method and device for operating at least two functional components of a system, in particular of a vehicle
US20130155237A1 (en) * 2011-12-16 2013-06-20 Microsoft Corporation Interacting with a mobile device within a vehicle using gestures
US20130169668A1 (en) * 2011-12-30 2013-07-04 James D. Lynch Path side imagery
US20130204457A1 (en) * 2012-02-06 2013-08-08 Ford Global Technologies, Llc Interacting with vehicle controls through gesture recognition
US20130265229A1 (en) * 2012-04-09 2013-10-10 Qualcomm Incorporated Control of remote device based on gestures
DE102012219852A1 (en) * 2012-10-30 2014-04-30 Robert Bosch Gmbh Method for manipulating text-to-speech output to operator, involves detecting gesture of operator in gesture information and evaluating gesture information to detect operator command, where parameter of text-to-speech output is adjusted
DE102012219814A1 (en) 2012-10-30 2014-04-30 Bayerische Motoren Werke Aktiengesellschaft Providing an operator input using a head-mounted display
US20140267035A1 (en) * 2013-03-15 2014-09-18 Sirius Xm Connected Vehicle Services Inc. Multimodal User Interface Design


Also Published As

Publication number Publication date
CN107003142A (en) 2017-08-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14830875; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14830875; Country of ref document: EP; Kind code of ref document: A1)