GB2568509A - Vehicle controller - Google Patents
Vehicle controller
- Publication number
- GB2568509A GB2568509A GB1719068.7A GB201719068A GB2568509A GB 2568509 A GB2568509 A GB 2568509A GB 201719068 A GB201719068 A GB 201719068A GB 2568509 A GB2568509 A GB 2568509A
- Authority
- GB
- United Kingdom
- Prior art keywords
- hand
- vehicle
- control device
- control
- dependence
- Prior art date
- Legal status
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
- B60K2360/1464—3D-gesture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/176—Camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/21—Optical features of instruments using cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
Abstract
A method and a controller for enabling control of a second vehicle system associated with a control device 15 located in a vehicle are provided. The method comprises detecting a hand 17 of a vehicle occupant within a volume of space 16 within the vehicle cabin; determining the relative position of the detected hand 17 with respect to the control device 15; and enabling control of the second vehicle system in dependence on a distance between at least a portion of the hand 17 and the control device 15 being less than or equal to a predefined threshold distance. A controller, a system and a vehicle including means by which the method may be carried out (e.g. via a computer program) are also disclosed.
Description
VEHICLE CONTROLLER
TECHNICAL FIELD
The present disclosure relates to a vehicle controller and particularly, but not exclusively, to a method and to a controller for contactless operation of a secondary function associated with a control device located within a vehicle cabin. Aspects of the invention relate to a method of enabling control of one or more vehicle systems using a control device located in a vehicle cabin, to a controller for enabling control of one or more vehicle systems associated with the control device, to a system, to a vehicle, to a computer program product and to a computer readable data carrier.
BACKGROUND
Modern vehicles, and in particular automobiles, comprise a myriad of different control systems. Some of these control systems relate to systems for controlling different aspects of a vehicle’s operation. Similarly, different control systems are often used to control vehicle cabin climate settings. Typically, control devices, such as switches and/or dials, are provided to enable operation and control of the different vehicle control systems. The control devices are usually located on a control panel within the vehicle cabin. As a result of the ever-growing number of different vehicle control systems found in modern vehicles, the number of control devices located on vehicle control panels has increased significantly. This has led to cluttered control panels, where the large number of control devices crowded onto the limited surface area of the panel makes it difficult for a vehicle occupant to locate the desired control device.
Furthermore, it is often challenging to operate a specific control device located on the control panel, whilst simultaneously operating the vehicle, due to the myriad of different control devices populating the control panel.
At least in certain embodiments, the present invention seeks to mitigate or overcome at least some of the above-mentioned problems.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a method, a controller, a system, a vehicle, a computer program product and a computer readable data carrier as claimed in the appended claims.
According to an aspect of the invention there is provided a method of enabling control of one or more vehicle systems using a control device located in a vehicle cabin. The control device may have a first function for controlling operation of a first vehicle system and a second function for controlling operation of a second vehicle system. The method may comprise: detecting a hand of a vehicle occupant within a volume of space within the vehicle cabin; determining the relative position of the hand with respect to the control device; and enabling control of the second vehicle system in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance. In this way, a single control device may advantageously be configured to control a plurality of different vehicle systems. Furthermore, control of the second vehicle system may be achieved via a contactless interaction, which in many real-world scenarios may be easier for a vehicle occupant to carry out. This is particularly true where the vehicle occupant is the driver, who may be operating the vehicle whilst attempting to control the second vehicle system.
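The enabling condition described above can be sketched in a few lines. This is an illustrative sketch only: the point type, the representation of the hand as a set of tracked points, and the function names are assumptions, not taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Point3D:
    """A point in an assumed cabin coordinate frame (metres)."""
    x: float
    y: float
    z: float

def distance(a: Point3D, b: Point3D) -> float:
    """Euclidean distance between two points in cabin coordinates."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

def second_function_enabled(hand_points, control_pos, threshold_m) -> bool:
    """Enable the second function if any tracked point of the hand lies
    within the predefined threshold distance of the control device."""
    return any(distance(p, control_pos) <= threshold_m for p in hand_points)
```

For example, with a 5 cm threshold, a fingertip 4 cm from the device would enable the second function even while the rest of the hand is further away, matching the "at least a portion of the hand" wording.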
The method may comprise determining the relative position of the hand with respect to a control proximity boundary associated with the control device. The control proximity boundary may define a boundary offset from the control device by the predefined threshold distance. The method may comprise enabling control of the second vehicle system in dependence on the position of at least a portion of the hand intersecting the control proximity boundary. The proximity boundary may define a volume of space adjacent to the control device; in dependence on the vehicle occupant’s hand being at or within the proximity boundary, control of the second vehicle system may be enabled.
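One way to represent such a boundary is as an explicit volume and test the hand against it. The axis-aligned box shape below is an assumption for illustration; the patent only requires a boundary offset from the device by the threshold distance.

```python
from dataclasses import dataclass

@dataclass
class ProximityBoundary:
    """One possible boundary shape: an axis-aligned box whose faces are
    offset from the control device by the threshold distance."""
    xmin: float
    xmax: float
    ymin: float
    ymax: float
    zmin: float
    zmax: float

    def contains(self, point) -> bool:
        """True if the point is at or within the boundary volume."""
        x, y, z = point
        return (self.xmin <= x <= self.xmax
                and self.ymin <= y <= self.ymax
                and self.zmin <= z <= self.zmax)

def hand_intersects_boundary(hand_points, boundary: ProximityBoundary) -> bool:
    """Control of the second system is enabled when any part of the
    tracked hand intersects the proximity boundary."""
    return any(boundary.contains(p) for p in hand_points)
```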
In certain embodiments operation of the first vehicle system may be arranged to be controlled by physical interaction with the control device by the vehicle occupant. Accordingly, control of the first vehicle system is enabled when physical contact is established between the vehicle occupant’s hand and the control device.
The method may comprise determining if the hand is the vehicle occupant’s left or right hand, and enabling control of the second vehicle system in dependence on whether the hand is the vehicle occupant’s left or right hand. This may comprise determining if the hand is oriented palm upwards or downwards within the volume of space relative to the control device; and determining if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand. In this way it is possible to enable control of the second vehicle system in dependence on whether the hand is the left or right hand.
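The orientation cue can be reduced to a simple heuristic. The geometry assumed below (a hand reaching forwards beneath a roof-mounted camera, where a right hand held palm-down appears with its thumb on the left of the image, and turning the palm up mirrors the apparent thumb side) is an illustrative assumption, not taken from the patent.

```python
def infer_hand_side(palm_up: bool, thumb_on_left: bool) -> str:
    """Infer left/right hand from palm orientation and apparent thumb
    side in the overhead image (heuristic sketch)."""
    if palm_up:
        # Flipping the palm mirrors which side the thumb appears on.
        thumb_on_left = not thumb_on_left
    return "right" if thumb_on_left else "left"

def second_function_allowed(hand_side: str, allowed_side: str = "left") -> bool:
    """Example policy: a right-hand-drive vehicle might accept only the
    driver's left hand, i.e. the hand nearest the centre console."""
    return hand_side == allowed_side
```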
The method may comprise determining which vehicle occupant the hand belongs to. Control of the second vehicle system may then be enabled in dependence on which vehicle occupant the hand belongs to. This enables control of the second vehicle system to be restricted to specific vehicle occupants, for example to a driver of the vehicle.
In certain embodiments the method may comprise determining a direction of entry of the hand into the volume of space relative to the control device; determining which vehicle occupant the hand belongs to in dependence on the direction of entry; and enabling control of the second vehicle system in dependence on which vehicle occupant the hand belongs to. This provides a convenient way of identifying whom the hand belongs to, which in turn may be used for selectively restricting control of the second vehicle system to specific vehicle occupants.
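A minimal sketch of that entry-direction check follows. The mapping of entry side to occupant, and the right-hand-drive default, are assumptions for illustration; the patent does not fix a particular mapping.

```python
def occupant_from_entry(entry_side: str, driver_side: str = "right") -> str:
    """Map the side from which the hand entered the monitored volume to
    an occupant (right-hand-drive cabin assumed by default)."""
    return "driver" if entry_side == driver_side else "passenger"

def enabled_for(occupant: str, allowed=("driver",)) -> bool:
    """Selectively restrict the second function to specific occupants,
    e.g. the driver only."""
    return occupant in allowed
```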
The method may comprise obtaining image data of the hand within the volume of space; receiving a reflectance signal reflected from the hand; determining a distance of the hand from a designated origin in dependence on the reflectance signal; and determining the relative position of the hand with respect to the control device in dependence on the distance of the hand from the designated origin, the obtained image data, and a known distance of the control device relative to the designated origin. For example, the time taken for the reflectance signal to be measured by a receiver is proportional to the distance of the hand from the sensor, and therefore provides a convenient way for distance information associated with the position of the hand to be obtained. In this way it is possible to determine the distance of the hand from the control device on the basis of a two-dimensional image of the hand relative to the control device, and distance information of the hand. This significantly simplifies the hardware required to carry out the method, and in particular obviates the need for using a complex system of two or more cameras, each configured to capture different perspective images of the hand relative to the control device, from which the distance of the hand relative to the control device may be determined.
In certain embodiments the designated origin may be coincident with a position of an image capture device.
The method may comprise identifying a gesture performed by the hand; and controlling operation of the second vehicle system in dependence on the identified gesture. The gesture may be a predefined gesture. This helps to reduce the likelihood of accidental activation of the second function, and provides a convenient means for controlling operation of the second vehicle system.
In certain embodiments the second vehicle system may be associated with two or more different functions, and the method may comprise identifying one of a plurality of different gestures; and selecting the function in dependence on the identified gesture. This provides a convenient way of selecting between different functions associated with the second vehicle system. For example, the different functions associated with the second vehicle system may relate to different settings or modes of operation associated with the second vehicle system. The use of different predefined hand gestures enables different settings or modes of operation to be selected.
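Gesture-based function selection reduces to a lookup from recognised gestures to settings. The gesture names and settings below are hypothetical placeholders; the patent does not specify any particular vocabulary.

```python
# Hypothetical gesture-to-setting mapping, for illustration only.
GESTURE_TO_SETTING = {
    "swipe_left": "previous mode",
    "swipe_right": "next mode",
    "tap": "toggle on/off",
}

def select_setting(gesture: str):
    """Return the setting for a recognised predefined gesture, or None
    so unrecognised motion is ignored, which helps reduce the
    likelihood of accidental activation."""
    return GESTURE_TO_SETTING.get(gesture)
```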
The method may comprise determining if a distance between at least a portion of the hand and the control device is less than or equal to the predefined threshold distance for a predefined threshold time period; and enabling control of the second vehicle system in dependence on the distance between the hand and control device being less than or equal to the predefined threshold distance for a period of time exceeding the predefined threshold time period. This helps to reduce the likelihood of unintentionally enabling control of the second vehicle system.
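The dwell-time condition can be implemented as a small per-frame state machine: the timer starts when the hand first comes within the threshold distance, resets if it leaves, and enables control once the full dwell period has elapsed. The class and its interface are an illustrative sketch.

```python
class DwellTimer:
    """Enable control only once the hand has remained within the
    threshold distance for the full dwell period."""

    def __init__(self, dwell_s: float):
        self.dwell_s = dwell_s
        self._entered_at = None  # time the hand last entered the zone

    def update(self, within_threshold: bool, now_s: float) -> bool:
        """Call once per frame; returns True once control is enabled."""
        if not within_threshold:
            self._entered_at = None  # hand left the zone: reset
            return False
        if self._entered_at is None:
            self._entered_at = now_s
        return (now_s - self._entered_at) >= self.dwell_s
```

A brief reach past the control device therefore never triggers the second function, which is the stated aim of the threshold time period.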
According to a further aspect of the invention there is provided a controller for enabling control of one or more vehicle systems associated with a control device located in a vehicle cabin. The control device may have a first function for controlling operation of a first vehicle system, and a second function for controlling operation of a second vehicle system. The controller may comprise an input configured to receive image data obtained from an image capture device, and a processor. The processor may be configured in use to: recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device; and determine a position of the hand with respect to the control device. The controller may further comprise an output arranged in use to output a control signal to the control device enabling control of the second vehicle system, in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold. The present controller benefits from the same advantages as set out in respect of the preceding aspect of the invention.
Embodiments of the controller may be provided with similar functionality as set out in respect of the embodiments set out in respect of the preceding aspect of the invention.
In certain embodiments the input may be configured to receive data from a 3D mapping device configured to generate a three-dimensional model of the vehicle occupant’s hand located within a volume of space within the vehicle cabin. The processor may be configured to determine the relative position of the hand with respect to the control device from the three-dimensional model.
In certain embodiments the input may be configured to receive data from a time-of-flight (ToF) image capture device comprising a sensor, the ToF image capture device being arranged in use to obtain image data of the hand, to illuminate the hand within the volume of space, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor, the data comprising the image data and the time of return of the reflected illumination signal. The processor may be arranged in use to determine the relative position of the hand with respect to the control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the control device relative to the sensor, wherein the distance of the hand from the sensor may be determined in dependence on the time of return of the reflected illumination signal. The ToF image capture device provides a convenient means for obtaining image data associated with image object distance data, and therefore simplifies determining the distance of the hand from the control device.
In certain embodiments the processor may be arranged in use to identify a gesture performed by the hand; and the output may be arranged in use to output the control signal enabling control of the second vehicle system in dependence on the identified gesture.
In certain embodiments the second vehicle system may be associated with two or more different functions, and the processor may be arranged in use to identify one of a plurality of different gestures. The output may be arranged in use to output a control signal selecting the function for operation in dependence on the identified gesture.
The processor may be configured to determine if a distance between at least a portion of the hand and the control device is less than or equal to the predefined threshold distance for a predefined threshold time period; and the output may be arranged in use to output the control signal enabling control of the second vehicle system in dependence on the distance between the hand and control device being less than or equal to the predefined threshold distance for a period of time exceeding the predefined threshold time period.
According to a further aspect of the invention there is provided a system comprising the aforementioned controller in combination with a time-of-flight (ToF) image capture device.
In accordance with yet a further aspect of the invention there is provided a vehicle configured in use to carry out the aforementioned method. In particular, the vehicle may be configured to: detect a hand of a vehicle occupant within a volume of space within a vehicle cabin of the vehicle; determine the relative position of the hand with respect to a control device located within the vehicle cabin, the control device having a first function for controlling operation of a first vehicle system and a second function for controlling operation of a second vehicle system; and enable control of the second vehicle system in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
Similarly, according to a further aspect of the invention there is provided a vehicle comprising the aforementioned controller or the aforementioned system. In particular, the vehicle may comprise a controller for enabling control of one or more vehicle systems associated with a control device located in a vehicle cabin of the vehicle. The control device may have a first function for controlling operation of a first vehicle system, and a second function for controlling operation of a second vehicle system. The controller may comprise an input configured to receive image data captured from an image capture device, and a processor. The processor may be configured in use to: recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which image objects are captured by the image capture device; and determine a position of the hand with respect to the control device. The controller may further comprise an output arranged in use to output a control signal to the control device enabling control of the second vehicle system, in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold.
In accordance with yet a further aspect of the invention there is provided a computer program product comprising instructions for carrying out the aforementioned method.
The instructions may comprise instructions for enabling control of one or more vehicle systems using a control device located in a vehicle cabin, the control device having a first function for controlling operation of a first vehicle system and a second function for controlling operation of a second vehicle system. When executed on a processor, the instructions may configure the processor to: detect a hand of a vehicle occupant within a volume of space within the vehicle cabin; determine the relative position of the hand with respect to the control device; and enable control of the second vehicle system in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
In accordance with a further aspect of the invention there is provided a computer readable data carrier having stored thereon instructions for carrying out the aforementioned method. Optionally, the computer readable data carrier comprises a non-transitory computer readable data carrier.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic cut-away illustration of a front portion of a vehicle cabin, showing a camera with a field of view arranged to obtain image data of a vehicle occupant’s hand within a volume of space within the vehicle cabin;
Figures 2a and 2b are schematic magnified illustrations of portions of the control panel of the vehicle cabin of Figure 1, showing control devices having a control proximity boundary used for enabling control of a second vehicle system associated with the control device, in dependence on at least a portion of the vehicle occupant’s hand intersecting at least a portion of the control proximity boundary;
Figure 3 is a schematic of a controller configured to enable operation of the second vehicle system associated with a control device located in the vehicle cabin of Figures 1,2a, or 2b;
Figure 4 is a process flow chart outlining a method for enabling control of the second vehicle system associated with a control device within the vehicle cabin, in dependence on the proximity of a hand to the control device, using the camera of Figure 1;
Figure 5 is a schematic illustration highlighting the principle of operation of a Time-of-Flight (ToF) camera, which may be used to determine the position of a vehicle occupant’s hand within the vehicle cabin of Figure 1;
Figures 6a and 6b are schematic illustrations showing a three-dimensional point cloud of a vehicle occupant’s hand generated using the ToF camera of Figure 5; and
Figure 7 is a schematic illustration of a vehicle comprising the camera of Figure 1 and the controller of Figure 3.
DETAILED DESCRIPTION
Figure 1 is a cut-away perspective view of a portion of the vehicle cabin 1, and in particular shows the driver 3 sitting in the driver’s seat 5. An image capture device in the form of a camera 7, having a field of view 9 delineated in Figure 1 by lines 11, is shown located in the cabin roof. Optionally, the camera 7 may comprise a ToF camera. The camera 7 is arranged to image objects located within the camera’s field of view 9. The field of view defines a volume of space within the vehicle cabin within which objects are imaged by the camera 7. The camera 7 is arranged such that a control panel 13 of the vehicle cabin 1 lies within the camera’s field of view 9. The control panel 13 comprises a plurality of different control devices 15, which may relate to, but are not limited to: air ventilation switches; air conditioning switches; vehicle infotainment system controls; air circulation switches; terrain mode selection switches; and any other control device configured to operate a control system of the vehicle. Each control device 15 may comprise two or more different functions. For example, the control device 15 may comprise a first function which is controlled via physical interaction with the control device 15, and enables control of an associated first vehicle system; and a second function which is activated in dependence on a distance between at least a portion of the vehicle occupant’s hand 17 and the control device 15 being less than or equal to a predefined threshold distance, activation of the second function enabling control of an associated second vehicle system. The vehicle cabin 1 may be comprised in the vehicle 43 of Figure 7.
Figures 2a and 2b provide a perspective view of a portion of the control panel 13 of the vehicle cabin 1. The control panel 13 comprises a plurality of control devices 15. As mentioned previously, each control device 15 may comprise a first function and a second function of operation. Activation of the second function of operation is dependent on a relative distance between at least a portion of a vehicle occupant’s hand 17 and the associated control device 15 being less than or equal to a predefined threshold. In certain embodiments the predefined threshold may delineate a control proximity boundary 16 as illustrated in Figures 2a and 2b. In Figure 2a the vehicle occupant’s hand 17 is shown lying at a distance such that no portion of the hand intersects the control proximity boundary 16. In Figure 2b, by contrast, the vehicle occupant’s hand is located at a position such that at least a portion of the hand intersects the control proximity boundary 16 of an associated control device 15, which enables activation of the second function associated with that specific control device 15. Activation of the second function enables control of the second vehicle system associated with the control device 15. This is explained in further detail in the ensuing description.
In certain embodiments, the camera 7 may be operatively coupled to a controller 19 (shown in Figure 3), which is configured to receive image data obtained by the camera 7 and to output a control signal to the specific control device 15, enabling control of the second vehicle system associated with the second function of the control device 15 in dependence on an analysis of the received image data. This enables selective operation of the second vehicle system associated with the second function of the control device 15.
Figure 3 provides a functional overview of the controller 19. The controller 19 may be functionally embedded into an existing electronic control unit of the vehicle 43. The controller 19 may be provided with an input 21 and an output 23. The input 21 may be configured to receive image data obtained by the camera 7, and the output 23 may be configured to output a control signal to the control device 15, which control signal activates the second function associated with the control device 15, and enables control of the second vehicle system associated with the second function. In addition, the controller 19 comprises a processor 25 arranged to analyse image data received from the camera 7, to identify image objects such as the hand 17 of a vehicle occupant within the obtained image data, and to generate control signals for controlling operation of the second vehicle systems of associated control devices 15, in dependence on the relative position of the hand 17 with respect to the control devices 15.
In certain embodiments, the second function of a control device 15 may comprise a plurality of different settings, each associated with a different mode of operation of the associated second vehicle system. Once at least a portion of the vehicle occupant’s hand has been identified as being located at or within the control proximity boundary 16, and control of the second vehicle system has been enabled, subsequent control of the second vehicle system may be achieved by use of specific hand gestures. Specific hand gestures may be used to select one of the plurality of different settings associated with the second vehicle system. For example, and for non-limiting illustrative purposes only, in Figure 2b the vehicle occupant’s hand 17 is shown swiping right or left, as indicated by direction arrows 18. The gesture of swiping right or left may be used to toggle between a plurality of different settings associated with different modes of operation of the associated second vehicle system. Different gestures are also envisaged as being usable to select between the different settings associated with the different modes of operation of the second vehicle system.
In use, when an image of a vehicle occupant’s hand is captured by the camera 7 and its position relative to a control device 15 is determined by the controller 19, typically by the processor 25 of the controller 19, the second function of the associated control device 15 may be activated by the controller 19 via a control signal output to the subject control device 15. In this way the second function associated with the desired control device 15 may be activated. This means that the second functions associated with the control devices 15 that the vehicle occupant is interested in operating are selectively and individually operable.
In certain embodiments, the controller 19 may be configured in use to output the control signal in dependence on a distance between at least a portion of a vehicle occupant’s hand 17 and the desired control device 15 being less than or equal to a predefined threshold distance. For example, as the camera 7 obtains image data of a vehicle occupant’s hand, such as the driver’s hand 17, the controller 19 may be configured to identify the image of the hand within the received image data. The relative position of the imaged hand with respect to a desired control device 15 may then be determined from the obtained image data. In order to identify an image of a hand, the controller 19, and specifically the processor 25, may be configured with image recognition software configured to identify a vehicle occupant’s hand 17 located within the camera’s field of view 9 from obtained image data. Similarly, image recognition software may be used to identify one of a plurality of different predefined gestures used to select between different settings associated with the second vehicle system. In such embodiments it is envisaged that the content of the output control signal is dependent on the identified gesture. For example, the controller 19, and more specifically the processor 25, may be configured to adapt the content of the control signal for subsequent output by the output 23, in dependence on the identified hand gesture. In this way it is possible to output different control signals which select different settings, or modes of operation, associated with the second vehicle system.
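By way of non-limiting illustration, the threshold check and the gesture-dependent control-signal content described above may be sketched as follows. The coordinates, the 0.05 m threshold, and the gesture-to-payload mapping are hypothetical examples, not values taken from this disclosure.

```python
import math

# Illustrative sketch: hand and device positions are hypothetical 3-D
# coordinates (metres) in a shared camera frame; the threshold is assumed.
THRESHOLD_M = 0.05

def within_threshold(hand_point, device_pos, threshold=THRESHOLD_M):
    """True if the hand point lies within the predefined threshold distance."""
    return math.dist(hand_point, device_pos) <= threshold

def control_signal_for(gesture):
    """Adapt the control-signal content to the identified gesture."""
    payloads = {"swipe_left": "previous_setting", "swipe_right": "next_setting"}
    return payloads.get(gesture)  # None for an unrecognised gesture
```

A distinct payload per gesture allows the same output 23 to select different settings of the second vehicle system.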
Figure 4 is a process flow chart outlining the method used in accordance with certain embodiments of the invention, to control operation of the second vehicle system associated with the second function of a control device 15, in dependence on the proximity of a vehicle occupant’s hand 17 to the control device 15, using the camera 7 in operative communication with the controller 19. The method is initiated by the camera 7 obtaining image data within the vehicle cabin 1, at step 301. In certain embodiments the camera 7 may be configured to obtain the image data continuously, or periodically at a predefined frequency. The obtained image data may be forwarded to the controller 19 for analysis where, at step 303, the obtained image data is analysed to identify a vehicle occupant’s hand 17. As mentioned previously, this may comprise the use of image recognition software. Once a vehicle occupant’s hand 17 has been identified within the obtained image data, the position of the hand 17 is determined relative to the control device 15, at step 305. Where the vehicle control panel 13 comprises a plurality of different control devices 15, step 305 may comprise determining the position of the hand 17 relative to the nearest control device 15. The position of the hand 17 relative to the control device 15 may be determined by the processor 25. At step 307 it is determined if at least a portion of the hand 17 lies at a distance that is less than or equal to a predefined threshold distance from the control device 15. If it is determined that no portion of the hand lies within the predefined threshold distance, then the processor 25 continues to analyse the obtained image data, and the method returns to step 303.
If instead it is determined by the processor 25 that at least a portion of the identified hand lies within the predefined threshold distance of the control device 15, then the processor generates a control signal for output to the relevant control device 15, at step 308. Upon receipt of the control signal, at step 310, the second function of the control device 15 is activated enabling control of the associated second vehicle system. In certain embodiments and as previously mentioned, control of the associated second vehicle system may be carried out by the use of specific predefined gestures.
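The flow of steps 303 to 308 may be sketched, for non-limiting illustrative purposes, as a single-frame function. The helper callables and device names below are hypothetical stand-ins for the image-recognition and geometry stages described above.

```python
# Hedged sketch of one pass of the Figure 4 flow (steps 303-308).

def process_frame(frame, detect_hand, nearest_device, distance_to, threshold):
    """Return the control device to signal, or None to loop back to step 303."""
    hand = detect_hand(frame)                    # step 303: identify hand
    if hand is None:
        return None
    device = nearest_device(hand)                # step 305: nearest device
    if distance_to(hand, device) <= threshold:   # step 307: threshold test
        return device                            # step 308: signal this device
    return None

# Illustrative stubs: a "frame" is a dict carrying a pre-computed hand
# position; devices are hypothetical named points in the same frame.
DEVICES = {"fan_switch": (0.0, 0.0, 0.0), "volume_dial": (0.3, 0.0, 0.0)}

def demo_detect(frame):
    return frame.get("hand")

def demo_distance(hand, device):
    dx, dy, dz = (h - p for h, p in zip(hand, DEVICES[device]))
    return (dx * dx + dy * dy + dz * dz) ** 0.5

def demo_nearest(hand):
    return min(DEVICES, key=lambda d: demo_distance(hand, d))
```

In practice the detection and distance stages would be supplied by the processor 25 from camera data; the control flow itself is unchanged.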
In certain embodiments the predefined threshold distance may relate to a few centimetres, for example any distance within the range of 1cm to 10cm, including 1cm and 10cm. In certain embodiments, and as mentioned previously, the predefined threshold may delineate a control proximity boundary 16 surrounding and offset from the control device 15 by the predefined threshold distance, which control proximity boundary 16 when intersected by at least a portion of the vehicle occupant’s hand 17 causes the controller 19 to generate the control signal for output to the relevant control device 15.
The control proximity boundary 16 may be geometrically shaped. For example, the control proximity boundary 16 may be box-shaped as illustrated in Figures 2a and 2b, or spherically shaped. Effectively, the control proximity boundary 16 relates to a volume of space offset from the control device 15 by the predefined threshold distance. In dependence on any portion of the control proximity boundary 16 being intersected by at least a portion of the vehicle occupant’s hand, the controller 19 generates the control signal for activating the associated control device’s second function, which enables control of the associated second vehicle system. It is to be appreciated that not all of the portions of the control proximity boundary 16 need to be offset from the control device 15 by the predefined threshold distance. For example, where the control proximity boundary 16 is box-shaped (e.g. cube shaped), it is to be appreciated that some faces of the cube may not be offset from the control device 15 by the predefined threshold distance.
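The two boundary shapes mentioned above may be sketched as simple containment tests. The coordinates are hypothetical; as noted, a box-shaped boundary need not have all faces offset from the control device by the same distance.

```python
# Hedged sketch of two control proximity boundary shapes.

def inside_sphere(point, centre, radius):
    """Spherical boundary: point on or inside a sphere about the device."""
    return sum((p - c) ** 2 for p, c in zip(point, centre)) <= radius ** 2

def inside_box(point, lo, hi):
    """Box-shaped boundary: axis-aligned box given by opposite corners.
    The per-axis extents may differ, so faces need not share one offset."""
    return all(l <= p <= h for l, p, h in zip(lo, point, hi))
```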
In certain embodiments, in order to enable the position of the hand 17 to be determined relative to the control device 15, the camera 7 may relate to a 3D mapping controller arranged to generate a 3D model of the hand within the field of view 9. For example, in certain embodiments the camera 7 may relate to a Time-of-Flight (ToF) camera, in which each captured image pixel is associated with a distance on the basis of a time of return of a reflected illumination signal. To achieve this the ToF camera may be configured with an illumination source arranged to illuminate the camera’s field of view. The incident illumination signal is subsequently reflected by objects present in the camera’s field of view, and the time of return of the reflected illumination signal is measured. In this way it is possible to associate a distance measurement to each imaged object. The illumination signal may relate to any electro-magnetic signal, and need not be comprised in the visible spectrum. For example, in certain embodiments the illumination signal may operate in the infrared spectrum.
In those embodiments where the camera 7 comprises a ToF camera 27, the controller 19, and specifically the input 21 may be configured to receive both camera image data and image object distance information data from the ToF camera 27. This enables the controller 19, and more specifically the processor 25 to determine the position of the vehicle occupant’s hand 17 relative to a control device 15 from the received data.
Figure 5 is a schematic diagram illustrating the principle of operation of a ToF camera 27. A modulated illumination source 29 is used to illuminate a desired target 31. The incident illumination 33 is reflected by the target 31 and captured on a sensor 35 comprising an array of pixels. However, whilst capturing the reflected modulated light 37, the pixels of the sensor 35 also simultaneously capture visible light reflected from the target. Since the illumination signal 33 is modulated, it may be distinguished from the visible light reflected from the target 31, which enables the time of flight of the modulated illumination signal to be measured. The time of flight is the time taken for the modulated illumination signal to travel to the target 31 and be reflected back to the sensor 35, and is measured when the reflected signal is incident on the sensor 35. In this way, each captured image pixel may be associated with a distance of the corresponding image object on the basis of the measured time of flight of the reflected modulated illumination signal 37. More specific details regarding operation of ToF cameras are widely available in the art, and for this reason a more detailed discussion is not necessary for present purposes.
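The round-trip time maps to range as d = c·t/2, since the signal traverses the camera-to-target distance twice. The direct form below is a simplification for illustration; practical ToF sensors commonly infer the round-trip time from the phase shift of the modulated signal rather than timing a pulse directly.

```python
# Illustrative time-of-flight range conversion: d = c * t / 2.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds):
    """Distance to the reflecting object, in metres."""
    return C * round_trip_seconds / 2.0
```

For example, a round-trip time of about 6.7 ns corresponds to a target roughly one metre from the sensor.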
Where the camera 7 of Figure 1 comprises a ToF camera 27, it is possible to generate a three-dimensional point cloud of the vehicle occupant’s hand located within the camera’s field of view 9. Figures 6a and 6b illustrate an example of a three-dimensional point cloud 39 of the vehicle occupant’s hand 17, generated using the ToF camera 27. In certain embodiments, the controller 19 may be configured to generate the three-dimensional point cloud using the image data and image object distance information received from the ToF camera 27. Figure 6a shows a point cloud 39 of the vehicle occupant’s hand 17 as it is approaching a rectangular-shaped control proximity boundary 41. In Figure 6b a portion of the point cloud 39 of the vehicle occupant’s hand 17 is intersecting a portion of the control proximity boundary 41. In this event, and as mentioned previously, the controller 19 is configured to generate a control signal enabling control of the second vehicle system associated with a control device.
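The intersection test of Figures 6a and 6b reduces to asking whether any point of the hand's point cloud falls inside the boundary volume. A minimal sketch, assuming a box-shaped boundary and hypothetical camera-frame coordinates in metres:

```python
# Hedged sketch: does any point of a hand point cloud breach a box-shaped
# control proximity boundary? `lo` and `hi` are the box's opposite corners.

def cloud_intersects_box(cloud, lo, hi):
    """True if at least one point of the cloud is on or inside the box."""
    return any(
        all(l <= c <= h for l, c, h in zip(lo, point, hi))
        for point in cloud
    )
```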
In order to enable the position of the vehicle occupant’s hand 17 to be determined relative to a control device 15, the position of the control device 15 relative to the ToF camera may be determined. Again, this may be done using image recognition software. Since the position of the control device 15 relative to the ToF camera 27 is known, and the position of the vehicle occupant’s hand 17 relative to the ToF camera 27 is known, the position of the vehicle occupant’s hand 17 relative to the control device 15 may be determined using trigonometry. In certain embodiments, and in order to facilitate computation during use, the controller 19 may be provided with distance information of each control device 15 relative to the ToF camera 27 during an initial configuration of the ToF camera 27. This distance information may be stored and accessed for subsequent use. This facilitates subsequent computation of the position of the hand relative to the control device, since only the distance of the vehicle occupant’s hand 17 with respect to the ToF camera 27, and its position relative to the known position of the control device 15, require calculation.
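With the device's camera-frame position stored at configuration time, the hand-to-device offset reduces to a subtraction of two vectors expressed in the camera frame. The device name and coordinates below are hypothetical.

```python
# Hedged sketch: device positions stored (camera frame, metres) at set-up.
DEVICE_POSITIONS = {"volume_dial": (0.30, -0.10, 0.90)}

def hand_to_device(hand_cam, device_id):
    """Vector from the device to the hand, both in the camera frame."""
    dev = DEVICE_POSITIONS[device_id]
    return tuple(h - d for h, d in zip(hand_cam, dev))

def hand_device_distance(hand_cam, device_id):
    """Euclidean distance used for the threshold test."""
    v = hand_to_device(hand_cam, device_id)
    return sum(x * x for x in v) ** 0.5
```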
In a further embodiment, the controller 19 may be configured to determine from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is the occupant’s left or right hand. The controller 19 may then be configured to output the control signal in dependence on whether the hand 17 is the occupant’s left or right hand. In this way it is possible to restrict control of the second vehicle system associated with a control device in dependence on whether a vehicle occupant’s left or right hand is attempting to control the associated second vehicle system.
In certain embodiments, the controller 19 may be configured to determine if the vehicle occupant’s hand 17 is oriented palm upwards or downwards relative to the camera 7, and to determine if the hand 17 is the vehicle occupant’s left or right hand in dependence on whether the hand is oriented palm upwards or downwards. This may be determined on the basis of the reflectance signal from the hand 17, and by image object analysis. The skin texture of the palm of a hand is different to the skin texture of the back of a hand, and as a result the amount of incident light absorbed by the palm differs from the amount absorbed by the back of the hand. Accordingly, by configuring the controller to analyse the intensity of the reflected signal, which is indicative of the amount of incident illumination absorbed by the hand, it is possible for the controller to determine whether the hand is oriented palm upwards or downwards.
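The reflectance cue above can be sketched as a mean-intensity classifier. The threshold value, and the assumption that the palm reflects more strongly than the back of the hand, are illustrative choices that would depend on sensor calibration, not values given in this disclosure.

```python
# Hedged heuristic sketch: classify palm-up vs palm-down from the mean
# reflected intensity over the hand's pixels. Threshold is assumed.
PALM_INTENSITY_THRESHOLD = 0.6  # normalised [0, 1]; calibration-dependent

def palm_up(hand_pixel_intensities, threshold=PALM_INTENSITY_THRESHOLD):
    """True if the hand is judged palm-up towards the camera."""
    mean = sum(hand_pixel_intensities) / len(hand_pixel_intensities)
    return mean >= threshold
```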
In certain embodiments, the controller 19 may be configured to determine which vehicle occupant the imaged hand belongs to. For example, whether the imaged hand 17 belongs to a driver of the vehicle or to a passenger. Control of the second vehicle system associated with a control device 15 may then be enabled in dependence on which vehicle occupant the hand belongs to. For example, control of the second vehicle system may be conditional on the hand 17 belonging to the driver 3 of the vehicle. This helps to prevent accidental activation of a control device’s second function and control of the associated second vehicle system by a passenger of the vehicle.
One non-limiting way in which the controller 19 may determine which vehicle occupant the hand belongs to is by monitoring and determining a direction of entry of the hand into the camera’s field of view 9 relative to the control device 15. This may be achieved from an analysis by the controller 19 of image data obtained by the camera 7. The direction of entry of the hand 17 into the camera’s field of view 9 may be indicative of where the vehicle occupant is seated in relation to the control device 15, and therefore provides a reasonable indication of which vehicle occupant the hand 17 belongs to.
Another non-limiting way in which the controller 19 may determine which vehicle occupant the hand belongs to, is by determining from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is an occupant’s left or right hand, as described in a foregoing embodiment. This may be particularly useful for control devices disposed between two vehicle occupants occupying a common seating row within the vehicle and facing the same direction, e.g. for control devices arranged between the front seat passengers. By way of further explanation, in this example a first occupant is most likely to operate the control devices with a left hand whereas a second occupant is most likely to operate the control devices with a right hand (or vice versa). In this manner, the controller may discriminate between two vehicle occupants seated adjacent one another within the vehicle.
The non-limiting examples described above may be used independently or in combination to discriminate between multiple vehicle occupants, and to control the second vehicle system associated with a control device in dependence on whom the hand belongs to.
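The two discrimination cues above, direction of entry and handedness, may be combined as sketched below. The seat conventions (a right-hand-drive layout with the driver's hand entering from the right of the image, and the driver using the left hand at a centre console) are illustrative assumptions only.

```python
# Hedged sketch of occupant discrimination from two independent cues.

def occupant_from_entry(first_seen_x, image_width):
    """Classify by the image half in which the hand first appeared."""
    return "driver" if first_seen_x > image_width / 2 else "passenger"

def occupant_from_handedness(handedness):
    """Centre-console heuristic: driver reaches with the left hand (assumed)."""
    return {"left": "driver", "right": "passenger"}[handedness]

def occupant(first_seen_x, image_width, handedness):
    """Use both cues in combination; None if they disagree (ambiguous)."""
    by_entry = occupant_from_entry(first_seen_x, image_width)
    by_hand = occupant_from_handedness(handedness)
    return by_entry if by_entry == by_hand else None
```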
Whilst the preceding embodiments of the invention have been described within the context of a ToF camera, it is to be appreciated that alternative camera configurations may be used in accordance with the herein described embodiments. Any configuration of cameras may be used that enables image data of a hand relative to a control device to be obtained, and the position of the hand relative to the control device to be determined. For example, a configuration of two or more cameras each configured to enable a different perspective image of the hand relative to the control device to be captured may also be used. In such an arrangement the different perspective images of the hand relative to the control device would enable the controller to determine the position of the hand with respect to the control device by triangulation.
Similarly, in an alternative embodiment, the ToF camera of the preceding embodiments may be replaced by a conventional camera, in combination with an optical ruler, such as a LIDAR for example. In such an embodiment the LIDAR provides the image object distance information, whilst the camera provides image data. The controller may be configured in such embodiments to analyse the LIDAR data in combination with the obtained image data in order to determine the position of the vehicle occupant’s hand relative to the control device.
In certain embodiments, it is envisaged that before activating the second function associated with a control device 15, it is determined if the distance between at least a portion of the vehicle occupant’s hand 17 and the associated control device 15 has been maintained at a position that is less than or equal to the predefined threshold distance for a predefined period of time. This helps to avoid unintentional activation of the second function.
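The dwell-time condition described above behaves like a debounce gate: the second function activates only once presence within the threshold distance has been sustained. A minimal sketch, with an assumed 0.5 s dwell period:

```python
# Hedged sketch of the predefined-period check before second-function
# activation. The dwell period is illustrative, not from the disclosure.
DWELL_SECONDS = 0.5

class DwellGate:
    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.entered_at = None  # time the hand entered the threshold zone

    def update(self, within_threshold, now):
        """Feed one observation; True once presence has been sustained."""
        if not within_threshold:
            self.entered_at = None  # hand left the zone: reset the timer
            return False
        if self.entered_at is None:
            self.entered_at = now
        return (now - self.entered_at) >= self.dwell
```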
Whilst the above embodiments have been described in relation to a single control device being configured to enable control of two different vehicle systems, in alternative embodiments it is envisaged that a single control device may be configured to enable control of more than two different vehicle systems. In such embodiments it is envisaged that a specific predefined hand gesture may be used to toggle between different functions of the control device, where each different mode enables operation of a different one of the associated vehicle systems. Similarly, a different predefined hand gesture may be used to toggle between the different settings associated with each vehicle system. It is to be appreciated that the herein described methods and controller may be used to control any arbitrary number of different vehicle systems associated with a control device.
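One gesture cycling through a device's functions while another cycles through the active function's settings, as described above, can be sketched as a small state machine. The function and setting names, and the gesture labels, are hypothetical examples.

```python
# Hedged sketch: a control device with more than two functions, each with
# its own settings, toggled by two distinct predefined gestures.

class MultiFunctionDevice:
    def __init__(self, functions):
        self.functions = functions        # {function_name: [settings]}
        self.fn_names = list(functions)
        self.fn_idx = 0
        self.setting_idx = 0

    def on_gesture(self, gesture):
        if gesture == "toggle_function":
            self.fn_idx = (self.fn_idx + 1) % len(self.fn_names)
            self.setting_idx = 0          # new function starts at first setting
        elif gesture == "toggle_setting":
            settings = self.functions[self.fn_names[self.fn_idx]]
            self.setting_idx = (self.setting_idx + 1) % len(settings)

    @property
    def state(self):
        fn = self.fn_names[self.fn_idx]
        return fn, self.functions[fn][self.setting_idx]
```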
In certain embodiments it is envisaged that the specific function associated with the control device that is enabled is displayed on a vehicle display located within the vehicle cabin. For example, when control of the second vehicle system is enabled, this may be indicated on the vehicle display unit.
It is envisaged that the above described methods and controller may be used for contactless operation of an audio device located within the vehicle cabin. For example, the control device may relate to a rotary dial. A first function associated with the rotary dial may relate to volume control and a second function associated with the rotary dial may relate to radio frequency scanning. The vehicle occupant, by swiping their hand left or right when located at least partially within the control proximity boundary, may change the selected radio channel. Alternatively, the first function may relate to radio frequency scanning, and the second function may relate to volume control.
Similarly, it is envisaged that the above described methods and controller may be used for contactless operation of a rotary shifter. For example, the rotary shifter may comprise a first function associated with establishing a gear setting, whilst the second function may be associated with a terrain mode of operation. The selected terrain mode of operation may be varied using a predefined gesture, such as by swiping left or right, or by rotating clockwise or anticlockwise.
It is to be appreciated that many modifications may be made to the above examples and embodiments without departing from the scope of the present invention as defined in the accompanying claims.
Claims (27)
1. A method of enabling control of one or more vehicle systems using a control device located in a vehicle cabin, the control device having a first function for controlling operation of a first vehicle system and a second function for controlling operation of a second vehicle system, the method comprising:
detecting a hand of a vehicle occupant within a volume of space within the vehicle cabin;
determining the relative position of the hand with respect to the control device; and
enabling control of the second vehicle system in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
2. The method of claim 1, comprising:
determining the relative position of the hand with respect to a control proximity boundary associated with the control device, the control proximity boundary defining a boundary offset from the control device by the predefined threshold distance; and
enabling control of the second vehicle system in dependence on the position of at least a portion of the hand intersecting the control proximity boundary.
3. The method of claim 1 or claim 2, wherein operation of the first vehicle system is arranged to be controlled by physical interaction with the control device by the vehicle occupant.
4. The method of any preceding claim, comprising:
determining if the hand is the vehicle occupant’s left or right hand; and
enabling control of the second vehicle system in dependence on whether the hand is the vehicle occupant’s left or right hand.
5. The method of claim 4, comprising:
determining if the hand is oriented palm upwards or downwards within the volume of space relative to the control device; and
determining if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
6. The method of any preceding claim, comprising:
determining which vehicle occupant the hand belongs to; and
enabling control of the second vehicle system in dependence on which vehicle occupant the hand belongs to.
7. The method of any preceding claim, comprising:
determining a direction of entry of the hand into the volume of space relative to the control device;
determining which vehicle occupant the hand belongs to in dependence on the direction of entry; and
enabling control of the second vehicle system in dependence on which vehicle occupant the hand belongs to.
8. The method of any preceding claim, comprising:
obtaining image data of the hand within the volume of space;
receiving a reflectance signal reflected from the hand;
determining a distance of the hand from a designated origin in dependence on the reflectance signal; and
determining the relative position of the hand with respect to the control device in dependence on the distance of the hand from the designated origin, the obtained image data, and a known distance of the control device relative to the designated origin.
9. The method of claim 8, wherein the designated origin is coincident with a position of an image capture device.
10. The method of any preceding claim, comprising:
identifying a gesture performed by the hand; and
controlling operation of the second vehicle system in dependence on the identified gesture.
11. The method of claim 10, wherein the second vehicle system is associated with two or more different functions, and the method comprises:
identifying one of a plurality of different gestures; and
selecting the function in dependence on the identified gesture.
12. The method of any preceding claim, comprising:
determining if a distance between at least a portion of the hand and the control device is less than or equal to the predefined threshold distance for a predefined threshold time period; and
enabling control of the second vehicle system in dependence on the distance between the hand and control device being less than or equal to the predefined threshold distance for a period of time exceeding the predefined threshold time period.
13. A controller for enabling control of one or more vehicle systems associated with a control device located in a vehicle cabin, the control device having a first function for controlling operation of a first vehicle system, and a second function for controlling operation of a second vehicle system, the controller comprising:
an input configured to receive image data obtained from an image capture device;
a processor configured in use to:
recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device;
determine a position of the hand with respect to the control device; and
an output arranged in use to output a control signal to the control device enabling control of the second vehicle system, in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold.
14. The controller of claim 13, wherein the processor is arranged in use to determine the relative position of the hand with respect to a control proximity boundary associated with the control device, the control proximity boundary defining a boundary offset from the control device by the predefined threshold distance; and the output is arranged in use to output the control signal enabling control of the second vehicle system in dependence on at least a portion of the hand intersecting the control proximity boundary.
15. The controller of claim 13 or 14, wherein the processor is arranged in use to determine if the hand is the vehicle occupant’s left or right hand; and the output is arranged in use to output the control signal enabling control of the second vehicle system in dependence on whether the hand is the vehicle occupant’s left or right hand.
16. The controller of any one of claims 13 to 15, wherein the processor is arranged in use to determine if the hand is oriented palm upwards or downwards within the volume of space relative to the control device; and to determine if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
17. The controller of any one of claims 13 to 16, wherein the processor is arranged in use to determine which vehicle occupant the hand belongs to; and the output is arranged in use to output the control signal enabling control of the second vehicle system in dependence on which vehicle occupant the hand belongs to.
18. The controller of any one of claims 13 to 17, wherein the processor is arranged in use to determine a direction of entry of the hand into the volume of space relative to the control device;
to determine which vehicle occupant the hand belongs to in dependence on the direction of entry; and
wherein the output is arranged in use to output the control signal enabling control of the second vehicle system in dependence on which vehicle occupant the hand belongs to.
19. The controller of any one of claims 13 to 18, wherein:
the input is configured to receive data from a time-of-flight (ToF) image capture device comprising a sensor, the ToF image capture device being arranged in use to obtain image data of the hand, to illuminate the hand within the volume of space, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor, the data comprising the image data and the time of return of the reflected illumination signal; and
wherein the processor is arranged in use to determine the relative position of the hand with respect to the control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the control device relative to the sensor, wherein the distance of the hand from the sensor is determined in dependence on the time of return of the reflected illumination signal.
20. The controller of any one of claims 13 to 19, wherein the processor is arranged in use to identify a gesture performed by the hand; and the output is arranged in use to output the control signal enabling control of the second vehicle system in dependence on the identified gesture.
21. The controller of claim 20, wherein the second vehicle system is associated with two or more different functions, the processor is arranged in use to identify one of a plurality of different gestures; and the output is arranged in use to output a control signal selecting the function for operation in dependence on the identified gesture.
22. The controller of any one of claims 13 to 21, wherein the processor is configured to determine if a distance between at least a portion of the hand and the control device is less than or equal to the predefined threshold distance for a predefined threshold time period; and the output is arranged in use to output the control signal enabling control of the second vehicle system in dependence on the distance between the hand and control device being less than or equal to the predefined threshold distance for a period of time exceeding the predefined threshold time period.
23. A system comprising the controller of any one of claims 13 to 22 in combination with a time-of-flight (ToF) image capture device.
24. A vehicle configured in use to carry out the method of any one of claims 1 to 12.
25. A vehicle comprising the controller of any one of claims 13 to 22 or the system of claim 23.
26. A computer program product comprising instructions for carrying out the method of any one of claims 1 to 12.
27. A computer readable data carrier having stored thereon instructions for carrying out the method of any one of claims 1 to 12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1719068.7A GB2568509B (en) | 2017-11-17 | 2017-11-17 | Vehicle controller |
Publications (3)
Publication Number | Publication Date |
---|---|
GB201719068D0 GB201719068D0 (en) | 2018-01-03 |
GB2568509A true GB2568509A (en) | 2019-05-22 |
GB2568509B GB2568509B (en) | 2020-03-18 |
Family
ID: 60805660
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1719068.7A Active GB2568509B (en) | 2017-11-17 | 2017-11-17 | Vehicle controller |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2568509B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101055193A (en) * | 2006-04-12 | 2007-10-17 | Hitachi, Ltd. | Noncontact input operation device for in-vehicle apparatus
DE102008023405A1 (en) * | 2008-05-13 | 2009-11-19 | Volkswagen Ag | Motor vehicle i.e. land vehicle, has display controller displaying image of control panel in display region formed in display, where image of control panel is aligned depending on direction of hand of operator on panel |
WO2014108152A2 (en) * | 2013-01-08 | 2014-07-17 | Audi Ag | Motor vehicle user interface comprising a control element for detecting a control action |
WO2017164835A1 (en) * | 2016-03-21 | 2017-09-28 | Ford Global Technologies, Llc | Virtual vehicle occupant rendering |
Similar Documents
Publication | Title
---|---
US10832064B2 (en) | Vacant parking space detection apparatus and vacant parking space detection method
US10345806B2 (en) | Autonomous driving system and method for same
US10962638B2 (en) | Vehicle radar sensing system with surface modeling
US10099576B2 (en) | Vehicle seat adjustment system
US20180032822A1 (en) | Vehicle exterior monitoring
US10499014B2 (en) | Image generation apparatus
CN104512332B (en) | Method and device for acquiring image of vehicle
EP3358840A1 (en) | Image processing device for vehicles
US20170305345A1 (en) | Image display control apparatus and image display system
KR20160145598A (en) | Method and device for the distortion-free display of an area surrounding a vehicle
US11283995B2 (en) | Image display apparatus
WO2015155715A2 (en) | Panoramic view blind spot eliminator system and method
EP4224293B1 (en) | System and method for determining a pointing direction in 3D space
US20170357271A1 (en) | Method and apparatus for visualization of an environment of a motor vehicle
KR102460043B1 (en) | Overtaking acceleration support for adaptive cruise control of the vehicle
GB2568511A (en) | Vehicle controller
GB2568669A (en) | Vehicle controller
GB2568509A (en) | Vehicle controller
GB2568508A (en) | Vehicle controller
JP7073237B2 (en) | Image display device, image display method
GB2570629A (en) | Vehicle controller
GB2568507A (en) | Vehicle controller
GB2568512A (en) | Vehicle controller
JP7286613B2 (en) | Operation detection device and operation detection method
GB2568510A (en) | Vehicle controller