US20160041562A1 - Method of controlling a component of a vehicle with a user device - Google Patents
Method of controlling a component of a vehicle with a user device
- Publication number
- US20160041562A1 (U.S. application Ser. No. 14/920,413)
- Authority
- US
- United States
- Prior art keywords
- user device
- vehicle
- component
- signal
- vehicle controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D3/00—Control of position or direction
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6893—Cars
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60C—VEHICLE TYRES; TYRE INFLATION; TYRE CHANGING; CONNECTING VALVES TO INFLATABLE ELASTIC BODIES IN GENERAL; DEVICES OR ARRANGEMENTS RELATED TO TYRES
- B60C9/00—Reinforcements or ply arrangement of pneumatic tyres
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
- B60K28/066—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver actuating a signalling device
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
- B60K35/81—Arrangements for controlling instruments for controlling displays
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/09—Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
- G02B27/0927—Systems for changing the beam intensity distribution, e.g. Gaussian to top-hat
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/09—Beam shaping, e.g. changing the cross-sectional area, not otherwise provided for
- G02B27/0938—Using specific optical elements
- G02B27/095—Refractive optical elements
- G02B27/0955—Lenses
- G02B27/0961—Lens arrays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/0001—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
- G02B6/0011—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
- G02B6/0033—Means for improving the coupling-out of light from the light guide
- G02B6/005—Means for improving the coupling-out of light from the light guide provided by one optical element, or plurality thereof, placed on the light output side of the light guide
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/02—Registering or indicating driving, working, idle, or waiting time only
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02405—Determining heart rate variability
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
- A61B5/0531—Measuring skin impedance
- A61B5/0533—Measuring galvanic skin response
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/177—Augmented reality
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/197—Blocking or enabling of input functions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/21—Optical features of instruments using cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/25—Optical features of instruments using filters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/33—Illumination features
- B60K2360/334—Projection means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/77—Instrument locations other than the dashboard
- B60K2360/785—Instrument locations other than the dashboard on or in relation to the windshield or windows
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8006—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying scenes of vehicle interior, e.g. for monitoring passengers or cargo
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2302/00—Responses or measures related to driver conditions
- B60Y2302/03—Actuating a signal or alarm device
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0112—Head-up displays characterised by optical features comprising device for generating colour display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0118—Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brilliance control visibility
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by optical features characterised by the informative content of the display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B2027/0192—Supplementary details
- G02B2027/0194—Supplementary details with combiner of laminated type, for optical or mechanical aspects
Definitions
- the software of the vehicle controller 20 may also be configured to modify a sensitivity of input gain into the user interface 30 of the user device 18 as a function of vehicle 10 operating conditions, user 24 preference, driving context, and the like.
- the graphical display of a visual menu list, displayed on the display screen 21, may respond differently to a gesture at different vehicle velocities and/or vehicle accelerations. Therefore, the vehicle controller 20 may be configured to interpret the input signal from the user device 18 such that the response from the vehicle controller 20 to the component 16 corresponds to the operating state of the vehicle.
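- As a concrete but non-authoritative illustration of the speed-dependent sensitivity adjustment described above, the short sketch below scales a gesture-input gain down as vehicle speed increases; the linear scaling and the constants are assumptions added for illustration rather than values from the disclosure.

```python
# Minimal sketch (assumed constants): reduce gesture-input sensitivity as vehicle
# speed rises, so on-screen menus respond less aggressively at highway speeds.
def input_gain(speed_kph: float, base_gain: float = 1.0, min_gain: float = 0.3) -> float:
    """Linearly reduce gain between 0 and 120 km/h, never dropping below min_gain."""
    clamped_speed = min(max(speed_kph, 0.0), 120.0)
    return max(min_gain, base_gain * (1.0 - clamped_speed / 120.0))

print(input_gain(0.0))    # parked: full sensitivity
print(input_gain(100.0))  # highway speed: reduced sensitivity
```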
- the vehicle controller 20 may transmit a signal to the component 16 , e.g., via the transmitter 28 , to operate.
- the vehicle controller 20 may transmit signals to speakers within the vehicle 10 to play the music.
- the user interface 30 of the user device 18 may also be configured to provide a haptic feedback to inputs into the user interface 30 . More specifically, haptic response provides tactile feedback, which takes advantage of the sense of touch by applying forces, vibrations, or motions to the user 24 .
- the user 24 may move a finger across the touchpad to change an operating mode on the HVAC controls simulated on the display screen 21 of the interface system 14 . As a different HVAC operating mode is highlighted and/or selected on the display screen 21 , the user interface 30 may vibrate, signaling to the user 24 that the different operating mode was selected.
- the haptic feedback may be provided for any desired software application.
- the user interface 30 may also be equipped with tactile sensors that may measure forces exerted by the user 24 on the user interface 30 .
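- The HVAC-mode example above lends itself to a brief sketch; the mode names and the vibrate call below are hypothetical stand-ins for whatever haptic interface the user device 18 actually exposes.

```python
# Sketch (hypothetical device API): pulse the user-device vibrator whenever a swipe
# highlights a different simulated HVAC mode on the display screen 21.
HVAC_MODES = ["off", "face", "feet", "defrost"]

def select_mode(current_index: int, swipe_steps: int,
                vibrate=lambda ms: print(f"vibrate {ms} ms")) -> int:
    """Advance through the HVAC modes; fire a short haptic pulse if the mode changed."""
    new_index = (current_index + swipe_steps) % len(HVAC_MODES)
    if new_index != current_index:
        vibrate(30)  # brief pulse confirms that a different operating mode was selected
    return new_index

index = select_mode(0, 2)  # swiping two steps highlights "feet" and fires a pulse
print(HVAC_MODES[index])
```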
- the user device 18 may include an operating system 34 , which may provide functionality such as authenticating the user device 18 to the component 16 through a handshaking process or other authenticating process and enabling one or more applications.
- the operating system 34 and/or user device 18 may include memory of sufficient size and type to store data and other information and to store and/or execute the plurality of applications.
- the vehicle controller 20 may be configured to interact with the user device 18 through a first communication link (arrow 36 ).
- the first communication link may be a wireless communication medium, for example, Bluetooth, Wi-Fi, etc., or may be a wired communication medium, for example, a universal serial bus (USB) or other hardwire cable.
- a protocol may be used over the communication link to project graphics, sound, operating instructions, and the like from the user device 18 to the vehicle 10 component 16 .
- the user device 18 may also utilize direct hardware video and/or audio out signals to project the contents of the user interface 30 of the user device 18 onto the display screen 21 included in the interface system 14.
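- The disclosure does not specify the projection protocol, so the following sketch only illustrates the kind of message framing such a link might use, with an entirely assumed schema: rendered frames flow from the user device 18 to the vehicle display, and input events flow back.

```python
# Purely illustrative message framing (assumed schema) for the first communication
# link (arrow 36): graphics frames travel toward the display screen 21, input events
# travel back to the user device 18; the disclosure does not define a format.
import json

def projection_frame(app: str, image_bytes: bytes) -> bytes:
    """Wrap one rendered frame of the projected user interface 30."""
    header = {"type": "frame", "app": app, "length": len(image_bytes)}
    return json.dumps(header).encode("utf-8") + b"\n" + image_bytes

def input_event(kind: str, **payload) -> bytes:
    """Encode an input event (touch, gesture, switch) reported back to the device."""
    return json.dumps({"type": "input", "kind": kind, **payload}).encode("utf-8")

frame = projection_frame("music", b"\x89PNG...")  # device -> vehicle display
event = input_event("touch", x=120, y=48)         # vehicle -> device
print(len(frame), event.decode("utf-8"))
```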
- the display screen 21 may be, for example, a video display screen 21 configured to display video content, an electronic visual display configured to display images, and other like devices for displaying content.
- the user device 18 may further include a communications interface 38 to selectively communicate with other devices, via a second communication link (arrow 40 ), which may include telephones, portable devices, and one or more off-board (e.g., off vehicle) servers or systems.
- the second communication link may be a wireless communication link in communication with a telecommunications network or the internet.
- An example of an off-board system may include one or more service providers, which may be configured as a server located off-board the vehicle 10 , e.g., at a location remote from the vehicle 10 .
- the off-board server may be a vehicle-integrated service provider, such as the OnStar® service system, which may be selectively linked to the vehicle 10 component 16 via the user device 18.
- the server may include an operating system 34 , which may provide functionality such as authenticating a device in communication with the server that may be, for example, the user device 18 or the component 16 , through a handshaking process or other authenticating process, and enabling one or more applications.
- the operating system 34 and/or server may include memory of sufficient size and type to store data and information and to store and execute the plurality of applications.
- the plurality of applications may include, for example, phone, voicemail, text messaging, email, navigation, web browser, message analysis including information feature extraction, message transcription including voice-to-text transcription using, for example, automatic speech recognition (ASR), and text-to-speech (TTS) conversion.
- the server further includes a communications interface 38, which may be used to enable interaction with the user device 18 and/or the vehicle 10 component 16. Such interaction may include sending and receiving data and information, including a message and/or an information feature, through the communications link, or providing other services, such as navigation instructions, telephone text, email, and/or other messaging services.
- One or more servers may be selectively linked to the user device 18 to, in turn, operate the component 16 of the vehicle 10 , through the vehicle controller 20 .
- a first server 46 may be selectively linked to the user device 18 , where the first server 46 is configured as a service provider or back-end server to process information features and provide services related thereto to the vehicle 10 .
- the first server 46 may be configured as a back-end such as the OnStar® system.
- a second server 48 may be selectively linked to the user device 18 and configured to receive a message from the vehicle 10 component 16 , and the integration application, and to extract the information feature(s) from the message and/or transcribe or convert the message and/or information feature(s).
- the vehicle controller 20 is programmed to provide communication between the user device 18 and the component 16 via execution of instructions embodying a method 100 , an example of which is described below with reference to FIG. 3 .
- the vehicle controller 20 of FIG. 1 may be embodied as one or more computer devices having a processor (P) 22 and tangible, non-transitory memory (M) 25 on which is recorded instructions for executing the method 100 .
- the memory 25 may include magnetic or optical memory, electrically-erasable programmable read only memory (EEPROM), and the like. Additional transitory memory may be included as needed, e.g., random access memory (RAM), memory for internal signal buffers, etc.
- Other hardware of the vehicle controller 20 may include a high-speed clock, analog-to-digital (A/D) circuitry, digital-to-analog (D/A) circuitry, and any required input/output (I/O) circuitry and devices, as well as signal conditioning and buffer electronics. Individual control algorithms resident in the vehicle controller 20 or readily accessible by the vehicle controller 20 may be stored in memory 25 and/or other suitable memory, and automatically executed via the processor 22 to provide the required control functionality.
- an example embodiment of the method 100 begins with step 102 , where the vehicle controller 20 receives a signal (arrow 36 ) from the user device 18 requesting to connect the user device 18 to the interface system 14 .
- the signal may automatically result from the user device 18 being docked in a docking station within the vehicle 10 .
- the signal may be the result of the vehicle 10 user 24 affirmatively activating the user device 18 , i.e., actuating the user interface 30 .
- the method proceeds to step 104 .
- the vehicle controller 20 may authenticate the user device 18 to the component 16 through a handshaking process or other authenticating process and enable one or more applications. Once the vehicle controller 20 authenticates the user device 18, the method proceeds to step 106.
- the user device 18 is connected to the interface system 14 .
- the connection between the user device 18 and the interface system 14 may be established over a wireless communication medium or a wired communication medium.
- the controller may receive a data signal (arrow 36 ) from the user device 18 .
- the data signal may correspond to graphics, video, etc., to be displayed on the display screen 21 of the interface system 14 .
- the data signal may correspond to graphics that may be a menu of an application, which was activated and is running on the user device 18 .
- the display screen 21 of the interface system 14 may display the graphics corresponding to the data signal received from the user device 18 .
- the display screen 21 may display a menu including a plurality of choices for music channels available for selection. The method then proceeds to step 112 .
- the controller receives a signal from the user device 18 indicating that the user device 18 has received an input (arrow 44) into the user interface 30.
- an input into the user interface 30 may include, but should not be limited to, an audio input, keypad actuation, touch screen actuation, switch actuation, activation of accelerometers within the user device 18 , and the like.
- the method proceeds to step 114 .
- the display screen 21 alters the display as a function of the signal received from the user device 18 .
- the display screen 21 may highlight a selected music channel on the menu in response to receiving the signal from the user device 18 . The method then proceeds to step 116 .
- the controller determines whether the signal received from the user device 18 corresponds to activating one or more components 16 of the vehicle 10 . If the vehicle controller 20 determines that the signal does not correspond to activating one or more components 16 , the method returns to step 114 . However, if the vehicle controller 20 determines the signal does correspond to activating one or more components 16 , the method proceeds to step 118 .
- in step 118, the vehicle controller 20 transmits a signal to the corresponding component 16 of the vehicle 10.
- the method then proceeds to step 120 .
- the component 16 receives the transmitted signal.
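- Read together, steps 102 through 120 form a simple control loop. The sketch below is only a schematic rendering of that loop; every helper name on the hypothetical controller object is invented for illustration and is not an interface defined by the disclosure.

```python
# Schematic sketch of method 100 (steps 102-120). All helper calls on the
# hypothetical controller object are illustrative; the disclosure names only the steps.
def method_100(controller, user_device):
    controller.receive_connection_request(user_device)         # step 102
    controller.authenticate(user_device)                        # step 104
    controller.connect(user_device)                             # step 106
    data_signal = controller.receive_data_signal(user_device)   # step 108
    controller.display_graphics(data_signal)                    # step 110
    while True:
        signal = controller.receive_input_signal(user_device)   # step 112
        controller.update_display(signal)                        # step 114
        if controller.signal_activates_component(signal):       # step 116
            controller.transmit_to_component(signal)             # step 118
            break                                                 # step 120: component receives it
        # otherwise keep updating the display (return to step 114)
```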
- the method is not limited to using the applications and components 16 described herein. Other applications may be executed and/or other components 16 may be actuated, so long as the applications remain resident within, and are executed by, the user device 18 .
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Mechanical Engineering (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Transportation (AREA)
- Combustion & Propulsion (AREA)
- Chemical & Material Sciences (AREA)
- Optics & Photonics (AREA)
- Veterinary Medicine (AREA)
- General Health & Medical Sciences (AREA)
- Automation & Control Theory (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Human Computer Interaction (AREA)
- Pathology (AREA)
- Cardiology (AREA)
- Physiology (AREA)
- Computer Networks & Wireless Communication (AREA)
- Ophthalmology & Optometry (AREA)
- Pulmonology (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Traffic Control Systems (AREA)
- Instrument Panels (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method of controlling a component of a vehicle with a user device includes receiving a data signal from the user device, within a vehicle controller. Graphics, corresponding to the data signal received from the user device, are displayed on a display screen of the vehicle. A signal is received from the user device that indicates the user device has received an input into a user interface of the user device. A determination is made, in the controller, that the signal received from the user device corresponds to controlling the component of the vehicle.
Description
- This application is a continuation of International Patent Application No. PCT/US2014/035396, filed on Apr. 25, 2014, which claims the benefit of U.S. Provisional Application No. 61/816,089, filed Apr. 25, 2013, both of which are hereby incorporated by reference in their entirety.
- The present disclosure is related to a system and method of controlling a component of a vehicle with a user device.
- Vehicles, such as cars, typically include displays or indicators to provide information to the vehicle user. Such displays or indicators may, for example, provide information regarding mileage, fuel consumption, and vehicle speed. The vehicle user usually has to shift his eye gaze away from the road scene and on to an in-vehicle display in order to visually process the information presented by these displays or indicators. In order to interact with the displayed information, the user has to utilize input controls that are built into the vehicle.
- One aspect of the disclosure provides a method of controlling a component of a vehicle with a user device. The method includes receiving a data signal from the user device, within a vehicle controller. Graphics, corresponding to the data signal received from the user device, are displayed on a display screen of the vehicle. A signal is received from the user device that indicates the user device has received an input into a user interface of the user device. A determination is made, in the controller, that the signal received from the user device corresponds to controlling the component of the vehicle.
- Another aspect of the disclosure provides a vehicle including a component and an interface system. The interface system includes a display screen and a vehicle controller. The vehicle controller is configured to be in selective communication with a user device. The vehicle controller is operable for receiving a data signal from the user device. The data signal is received within the vehicle controller. Graphics are displayed that correspond to the data signal received from the user device on the display screen of the interface system. A signal is received from the user device that indicates the user device has received an input into a user interface of the user device. A determination is made in the controller that the signal received from the user device corresponds to actuating the component of the vehicle.
- Yet another aspect of the disclosure provides an interface system for controlling a component of a vehicle with a user device. The interface system includes a display screen and a vehicle controller. The vehicle controller is configured to be in operative communication with the component and the user device. The vehicle controller is operable for receiving, by the interface system, a data signal from the user device. The data signal corresponds to the execution of at least one software application. Graphics are displayed that correspond to the data signal received from the user device on the display screen of the interface system. A signal is received from the user device indicating the user device has received an input into a user interface of the user device. A determination is made, in the controller, that the signal received from the user device corresponds to actuating the component of the vehicle.
- The above features and advantages and other features and advantages of the present teachings are readily apparent from the following detailed description of the best modes for carrying out the present teachings when taken in connection with the accompanying drawings.
- FIG. 1 is a schematic illustrative side view of a vehicle.
- FIG. 2 is a schematic diagrammatic view of an interior of a vehicle having an interface system and a user device.
- FIG. 3 is a schematic view of the vehicle, including a component and the interface system, illustrating the vehicle in communication with a user device.
- FIG. 4 is a schematic flow chart diagram of a method of alerting the user of the vehicle as to a scene, external to the vehicle, requiring the user's attention.
- Those having ordinary skill in the art will recognize that terms such as “above,” “below,” “upward,” “downward,” “top,” “bottom,” etc., are used descriptively for the figures, and do not represent limitations on the scope of the invention, as defined by the appended claims. Furthermore, the invention may be described herein in terms of functional and/or logical block components and/or various processing steps. It should be realized that such block components may be comprised of any number of hardware, software, and/or firmware components configured to perform the specified functions.
- Referring now to the drawings, wherein like numerals indicate corresponding parts throughout the several views, FIG. 1 schematically illustrates a vehicle 10 including a body 12. The vehicle 10 may be a land vehicle, such as a car, or any other type of vehicle such as an airplane, farm equipment, construction equipment, a boat, etc.
- With reference to FIG. 3, the vehicle 10 includes an interface system 14 and at least one component 16. The interface system 14 is configured to allow the operative connection between a user device 18 and a component 16 of the vehicle 10. The interface system 14 may include a display screen 21 and the vehicle controller 20.
- The component 16, which is resident within the vehicle 10, may include, but should not be limited to, a head unit, a heads-up display (HUD), instrument cluster, center display, speakers, a video screen, air blowers, speedometer, seat motors, door locks, window motors, window defrost actuators, and power doors, and may include, or be included in, a head unit, an infotainment system, a navigation system, an on-board telephone system, a heating, ventilation, and air conditioning (HVAC) system, and other like devices within the vehicle 10. The video screen and/or display screen 21 may be located anywhere within the interior of the vehicle 10, including, but not limited to, extending from a headliner, rear seat entertainment displays, side window displays, center front displays, and any other area capable of receiving a display.
- The component 16 is configured to operatively interact with the interface system 14. More specifically, the vehicle 10 component 16 may be operatively connected to the vehicle controller 20. The vehicle 10 component 16 may be operatively interconnected to the vehicle controller 20 (arrow 42) using a wireless communication medium, for example, Bluetooth, Wi-Fi, etc., or a wired communication medium, for example, a universal serial bus (USB) or other hardwire cable. It would be understood that the elements of the component 16 may include, but should not be limited to, a vehicle interface, while such things as the operating system 34, the applications, and the like, are resident within the user device 18.
- The user device 18 may be a portable device that is carried by the user 24 of the interface system 14, i.e., a user 24 of the vehicle 10. The user device 18 may be a smart phone, a tablet, a computer, a netbook, an e-reader, a personal digital assistant (PDA), a gaming device, a video player, a wristwatch, or another like device with at least one sensor, capable of running a plurality of software applications, either preloaded or downloaded by the user 24, which may be stored on and executed by the user device 18 to interact with one or more of the components 16, via the interface system 14. Examples of the plurality of software functions may include, but should not be limited to, providing music, DVD, video, phone, navigation, weather, e-mail, climate control, seat motor actuation, window motor actuation, window defrost actuation, and other like applications.
- The user device 18 may include a device memory 26, a transmitter 28, at least one sensor 29, and a user interface 30, also referred to as a human machine interface (HMI). The sensors 29 may be one or more of an accelerometer, a touch sensor, a pressure sensor, a camera, a proximity sensor, a physiological sensor, and other like sensors. The sensors 29 may be operatively connected to, or otherwise in operative communication with, the user interface 30. The touch sensor may be configured to sense gestures while the user is operatively contacting the touch sensor. Therefore, the touch sensor may be able to discern gestures corresponding to on/off and/or directional gestures, including, but not limited to, left, right, up, down, etc. The pressure sensor may be configured to sense on/off, pressure range, and pressure distribution into the user interface 30. The temperature sensor may be configured to sense a temperature in an area of the sensor and may be configured to enact a temperature change in an area of the vehicle 10. The camera may be configured to recognize an object, recognize features of an object, cause a state change, recognize motion to thereby adjust a feature, and the like. The accelerometer may be configured to sense a rate of acceleration to thereby bring about, i.e., via the vehicle controller 20, a corresponding movement of a component in the vehicle and/or a change in an orientation of the display screen within the vehicle 10. The proximity sensor may be configured to adjust a level within the vehicle. By way of a non-limiting example, the proximity sensor may be configured to sense light intensity to bring about, i.e., via the vehicle controller 20, a dimming of the display screen 21, an adjustment of a light intensity within the vehicle 10, and the like. The physiological sensor may be configured to monitor physiological state thresholds so as to bring about, i.e., via the vehicle controller 20, a change in vehicle operating conditions. By way of a non-limiting example, the physiological sensor may determine that the user 24 is perspiring and would, in turn, transmit a signal to the vehicle controller 20 to turn on air conditioning and play classical music through speakers within the vehicle. It should be appreciated that the user device may include other sensors 29, and the software of the vehicle controller 20 may be configured to recognize signals from any type of sensor 29.
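- As a minimal sketch of how a vehicle controller might dispatch these user-device sensor signals to component actions, the following example uses a simple handler table; the event names, thresholds, and handler functions are assumptions and are not taken from the disclosure.

```python
# Minimal sketch (assumed names): dispatching user-device sensor events to vehicle
# component actions via the vehicle controller, mirroring the examples above.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class SensorEvent:
    sensor: str    # e.g. "proximity" or "physiological"
    value: float   # normalized reading supplied by the user device 18

class VehicleControllerSketch:
    def __init__(self) -> None:
        # Hypothetical mapping from sensor type to a component action.
        self._handlers: Dict[str, Callable[[float], None]] = {
            "proximity": self._dim_display,
            "physiological": self._cool_cabin,
        }

    def handle_event(self, event: SensorEvent) -> None:
        handler = self._handlers.get(event.sensor)
        if handler is not None:
            handler(event.value)

    def _dim_display(self, light_level: float) -> None:
        # Low ambient light -> dim the display screen 21.
        print(f"set display brightness to {max(0.2, light_level):.2f}")

    def _cool_cabin(self, perspiration: float) -> None:
        # Physiological threshold crossed -> turn the air conditioning on.
        if perspiration > 0.7:
            print("HVAC: air conditioning on")

controller = VehicleControllerSketch()
controller.handle_event(SensorEvent("proximity", 0.1))
controller.handle_event(SensorEvent("physiological", 0.9))
```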
The user interface 30 may include an audio input and/or output, a keypad, a touch screen, a display screen 21, a switch, and/or other like interfaces. In use, the user 24 may activate the user device 18 by actuating one or more of the user interfaces 30. By way of a non-limiting example, the user 24 may turn on the air conditioning within the vehicle 10 by touching the touch pad on the user device 18. As such, the software and sensors 29 configured for controlling such components 16 are resident within the user device 18 and not resident within the vehicle 10.
Further, the vehicle controller 20 may be configured to receive a signal (arrow 36) in response to input by the user 24 into the user interface 30. Such input into the user interface 30 may be sensed by at least one of the sensors 29 and subsequently be transmitted to the vehicle controller 20. By way of a non-limiting example, accelerometers within the user device 18 may be configured to sense movement of the user device 18. More specifically, the accelerometers may sense an orientation of the user device 18 as the user device 18 is tilted and turned. Signals corresponding to the orientation of the user device may be transmitted to the vehicle controller 20. In response, the vehicle controller 20 may transmit a signal (arrow 42) to a corresponding component 16 to cause the component 16 to move to mimic or otherwise correspond to the movement and orientation of the user device 18. The component may be an outside mirror, a rearview mirror, a display on the display screen 21, a position of a vehicle seat, and the like.
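A hedged sketch of the mirror example follows: device pitch and roll from the accelerometer are clamped to a hypothetical mirror-actuator range before being sent as a command. The function names and ranges are assumptions, not the patent's method.

```python
# Illustrative sketch only: mapping user-device orientation (pitch/roll from an
# accelerometer) to a mirror-actuator command, with hypothetical limits.

def clamp(value, low, high):
    return max(low, min(high, value))

def mirror_command(device_pitch_deg, device_roll_deg,
                   pitch_range=(-10.0, 10.0), roll_range=(-15.0, 15.0)):
    """Mimic the device orientation within the mirror's mechanical range."""
    return {
        "pitch": clamp(device_pitch_deg, *pitch_range),
        "roll": clamp(device_roll_deg, *roll_range),
    }

print(mirror_command(4.2, -22.0))   # {'pitch': 4.2, 'roll': -15.0}
```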
In another non-limiting example, the pressure sensors within the user device 18 may be configured to sense the pressure being applied to the user interface 30. As such, the signal transmitted from the user device 18 to the vehicle controller 20 may be proportionate to the amount of pressure applied to the user interface 30. Therefore, if a large pressure is applied, the signal transmitted to the vehicle controller may instruct the vehicle controller to send a corresponding signal to the HVAC blower to increase a fan speed. Conversely, application of a lesser pressure to the user interface 30 may cause the signal transmitted to the vehicle controller to set the fan speed of the HVAC blower proportionately lower than the fan speed with the higher pressure. It should be appreciated that the pressure is not limited to controlling the fan speed of the HVAC blower, but may also be used to control temperatures of the HVAC system, seat temperatures, seat positions, lumbar support, lighting levels, speaker levels, etc.
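The proportional behavior described here can be illustrated with a tiny mapping from a normalized touch pressure to a blower step; the 0-to-1 pressure scale and the eight-step blower are assumptions made for the example.

```python
# Sketch under assumptions: a touch pressure in [0, 1] is mapped linearly onto
# the HVAC blower's discrete fan-speed steps.

def fan_speed_from_pressure(pressure, max_step=8):
    """Return a blower step proportional to the applied pressure (0.0..1.0)."""
    pressure = max(0.0, min(1.0, pressure))
    return round(pressure * max_step)

print(fan_speed_from_pressure(0.9))   # 7 -> strong press, high fan speed
print(fan_speed_from_pressure(0.2))   # 2 -> light press, low fan speed
```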
In yet another non-limiting example, the camera and/or the proximity sensor may be configured to sense a level of light surrounding the user device 18. As such, the user device 18 may transmit a signal to the vehicle controller 20 to adjust light levels within the vehicle 10 and/or exterior to the vehicle 10. It should be appreciated that the camera and/or proximity sensors may be used to control other features of the vehicle 10, as well.
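One possible (purely illustrative) way to turn the sensed ambient light into display and cabin light levels is a coarse lookup on the measured lux value; the thresholds and levels below are assumptions.

```python
# Illustrative mapping, not the patent's algorithm: ambient light measured by
# the device camera/proximity sensor sets display and cabin light levels.

def light_levels(ambient_lux):
    """Pick coarse brightness levels from the sensed ambient light."""
    if ambient_lux < 10:          # night driving
        return {"display": 0.3, "cabin": 0.2}
    if ambient_lux < 500:         # dusk / overcast
        return {"display": 0.6, "cabin": 0.4}
    return {"display": 1.0, "cabin": 0.0}   # bright daylight

print(light_levels(7))    # dims both the display screen and the cabin lighting
```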
In another example, the display screen 21 may display a menu of selectable software applications. The user 24 may physically gesture with their hand, whereby a camera resident within the user device 18 recognizes the gesture. A selectable software application, displayed on the display screen 21 within the interface system 14, may be highlighted in response to the gesture. Such a highlighting of the software application on the display screen 21 may be sufficient to signal the user device 18 to execute the selected software application. Alternatively, another input into the user interface 30 may be required to execute the highlighted software application on the user device 18, such as a voice command, toggling a switch, touching the touch screen, and the like.
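The highlight-then-confirm behavior could be modeled as a small selector object: a recognized gesture moves the highlight, and a separate confirmation input executes the highlighted application. The gesture names and menu entries below are assumptions, not taken from the text.

```python
# Illustrative two-step selection: a gesture highlights, a confirmation executes.

class MenuSelector:
    def __init__(self, items):
        self.items = items
        self.index = 0

    def on_gesture(self, gesture):
        if gesture == "swipe_down":
            self.index = (self.index + 1) % len(self.items)
        elif gesture == "swipe_up":
            self.index = (self.index - 1) % len(self.items)
        return self.items[self.index]          # currently highlighted entry

    def on_confirm(self):
        return ("execute", self.items[self.index])

menu = MenuSelector(["Music", "Navigation", "Climate"])
print(menu.on_gesture("swipe_down"))   # 'Navigation' highlighted
print(menu.on_confirm())               # ('execute', 'Navigation')
```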
The software of the vehicle controller 20 may also be configured to modify a sensitivity of input gain into the user interface 30 of the user device 18 as a function of vehicle 10 operating conditions, user 24 preference, driving context, and the like. By way of a non-limiting example, the graphical display of a visual menu list, displayed on the display screen 21, may respond differently to a gesture at different vehicle velocities and/or vehicle accelerations. Therefore, the vehicle controller 20 may be configured to interpret the input signal from the user device 18 such that the response from the vehicle controller 20 to the component 16 corresponds to the operating state of the vehicle.
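As a sketch of speed-dependent input gain (the scaling curve is an assumption; the text only says the response may vary with operating conditions), a scrolling gesture could be attenuated as vehicle speed rises:

```python
# Hedged sketch: scale the effective gain of a scrolling gesture by vehicle
# speed so the menu responds more coarsely while the vehicle is moving fast.

def scroll_gain(speed_kph):
    """Reduce gesture gain as speed rises (1.0 when parked, 0.4 at highway speed)."""
    if speed_kph <= 0:
        return 1.0
    return max(0.4, 1.0 - 0.006 * speed_kph)

def rows_to_scroll(gesture_delta_rows, speed_kph):
    return round(gesture_delta_rows * scroll_gain(speed_kph))

print(rows_to_scroll(5, 0))     # 5 rows when parked
print(rows_to_scroll(5, 100))   # 2 rows at 100 km/h
```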
Once the software application is executed, the vehicle controller 20 may transmit a signal to the component 16, e.g., via the transmitter 28, to operate. By way of a non-limiting example, if the selected software application is music, the vehicle controller 20 may transmit signals to speakers within the vehicle 10 to play the music.
The user interface 30 of the user device 18 may also be configured to provide haptic feedback in response to inputs into the user interface 30. More specifically, a haptic response provides tactile feedback, which takes advantage of the sense of touch by applying forces, vibrations, or motions to the user 24. By way of example, the user 24 may move a finger across the touchpad to change an operating mode on the HVAC controls simulated on the display screen 21 of the interface system 14. As a different HVAC operating mode is highlighted and/or selected on the display screen 21, the user interface 30 may vibrate, signaling to the user 24 that the different operating mode was selected. It should be appreciated that the haptic feedback may be provided for any desired software application. The user interface 30 may also be equipped with tactile sensors that may measure forces exerted by the user 24 on the user interface 30.
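A short illustrative sketch of the HVAC example: each time a swipe moves the highlighted operating mode, the device fires a brief haptic pulse. Here vibrate() is a stand-in for whatever haptic API the user device actually provides.

```python
# Illustrative only: fire a short vibration pulse on each HVAC mode change.

def vibrate(ms):
    print(f"haptic pulse: {ms} ms")

class HvacModeSelector:
    MODES = ["face", "face/feet", "feet", "defrost"]

    def __init__(self):
        self.current = 0

    def on_swipe(self, direction):
        step = 1 if direction == "right" else -1
        new = (self.current + step) % len(self.MODES)
        if new != self.current:
            vibrate(30)            # confirm the mode change by touch
            self.current = new
        return self.MODES[self.current]

sel = HvacModeSelector()
print(sel.on_swipe("right"))       # haptic pulse, then 'face/feet'
```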
The user device 18 may include an operating system 34, which may provide functionality such as authenticating the user device 18 to the component 16 through a handshaking process or other authenticating process and enabling one or more applications. The operating system 34 and/or user device 18 may include memory of sufficient size and type to store data and other information and to store and/or execute the plurality of applications.
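The handshaking process is not specified in the text; one minimal sketch, assuming a pre-shared pairing key, is an HMAC challenge-response between the vehicle controller and the user device:

```python
# Minimal handshake sketch. The pre-shared key and the HMAC scheme are
# assumptions for illustration; no particular pairing protocol is implied.

import hashlib, hmac, os

SHARED_KEY = b"pre-shared-pairing-key"     # assumption for the example

def controller_challenge():
    return os.urandom(16)

def device_response(challenge):
    return hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()

def controller_verify(challenge, response):
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = controller_challenge()
print(controller_verify(nonce, device_response(nonce)))   # True -> authenticated
```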
The vehicle controller 20 may be configured to interact with the user device 18 through a first communication link (arrow 36). The first communication link may be a wireless communication medium, for example, Bluetooth, Wi-Fi, etc., or a wired communication medium, for example, a universal serial bus (USB) or other hardwired cable. A protocol may be used over the communication link to project graphics, sound, operating instructions, and the like from the user device 18 to the vehicle 10 component 16. The user device 18 may also utilize direct hardware video and/or audio output signals to project the contents of the user interface 30 of the user device 18 onto the display screen 21 included in the interface system 14. The display screen 21 may be, for example, a video display screen 21 configured to display video content, an electronic visual display configured to display images, or another like device for displaying content.
The user device 18 may further include a communications interface 38 to selectively communicate, via a second communication link (arrow 40), with other devices, which may include telephones, portable devices, and one or more off-board (e.g., off-vehicle) servers or systems. The second communication link may be a wireless communication link in communication with a telecommunications network or the internet.
An example of an off-board system may include one or more service providers, which may be configured as a server located off-board the vehicle 10, e.g., at a location remote from the vehicle 10. The off-board server may be a vehicle-integrated service provider, such as the OnStar® service system, which may be selectively linked to the vehicle 10 component 16 via the user device 18. The server may include an operating system 34, which may provide functionality such as authenticating a device in communication with the server, which may be, for example, the user device 18 or the component 16, through a handshaking process or other authenticating process, and enabling one or more applications. The operating system 34 and/or server may include memory of sufficient size and type to store data and information and to store and execute the plurality of applications. The plurality of applications may include, for example, phone, voicemail, text messaging, email, navigation, web browser, message analysis including information feature extraction, message transcription including voice-to-text transcription using, for example, automatic speech recognition (ASR), and text-to-speech (TTS) conversion. The server further includes a communications interface 38, which may be used to enable interaction with the user device 18 and/or the vehicle 10 component 16, which may include sending and receiving data and information, including a message and/or an information feature, through the communications link, or providing other services, such as navigation instructions, telephone text, email, and/or other messaging services.
One or more servers may be selectively linked to the user device 18 to, in turn, operate the component 16 of the vehicle 10 through the vehicle controller 20. For example, a first server 46 may be selectively linked to the user device 18, where the first server 46 is configured as a service provider or back-end server to process information features and provide services related thereto to the vehicle 10. In one example, the first server 46 may be configured as a back-end system such as the OnStar® system. A second server 48 may be selectively linked to the user device 18 and configured to receive a message from the vehicle 10 component 16 and the integration application, and to extract the information feature(s) from the message and/or transcribe or convert the message and/or information feature(s).
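A small sketch of how a message might be routed through an off-board server: the user device forwards audio to the second server's transcription application and returns the text to the vehicle controller for, e.g., text-to-speech playback. No real OnStar or ASR API is implied; transcribe_off_board() is a placeholder.

```python
# Sketch with assumed endpoints: voice message -> off-board transcription ->
# text handed back to the vehicle controller.

def transcribe_off_board(audio_bytes):
    # placeholder for a network call to the second server's ASR application
    return "read me my last text message"

def handle_voice_message(audio_bytes, vehicle_controller_send):
    text = transcribe_off_board(audio_bytes)
    vehicle_controller_send({"type": "tts_request", "text": text})
    return text

print(handle_voice_message(b"\x00\x01",
                           lambda msg: print("to controller:", msg)))
```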
The vehicle controller 20 is programmed to provide communication between the user device 18 and the component 16 via execution of instructions embodying a method 100, an example of which is described below with reference to FIG. 3.
The vehicle controller 20 of FIG. 1 may be embodied as one or more computer devices having a processor (P) 22 and tangible, non-transitory memory (M) 25 on which are recorded instructions for executing the method 100. The memory 25 may include magnetic or optical memory, electrically-erasable programmable read-only memory (EEPROM), and the like. Additional transitory memory may be included as needed, e.g., random access memory (RAM), memory for internal signal buffers, etc. Other hardware of the vehicle controller 20 may include a high-speed clock, analog-to-digital (A/D) circuitry, digital-to-analog (D/A) circuitry, and any required input/output (I/O) circuitry and devices, as well as signal conditioning and buffer electronics. Individual control algorithms resident in the vehicle controller 20, or readily accessible by the vehicle controller 20, may be stored in the memory 25 and/or other suitable memory, and automatically executed via the processor 22 to provide the required control functionality.
Referring to FIG. 4, an example embodiment of the method 100 begins with step 102, where the vehicle controller 20 receives a signal (arrow 36) from the user device 18 requesting to connect the user device 18 to the interface system 14. The signal may automatically result from the user device 18 being docked in a docking station within the vehicle 10. Alternatively, the signal may be the result of the vehicle 10 user 24 affirmatively activating the user device 18, i.e., actuating the user interface 30. Once the vehicle controller 20 receives the signal, the method proceeds to step 104.
At step 104, the vehicle controller 20 may authenticate the user device 18 to the component 16 through a handshaking process or other authenticating process and enable one or more applications. Once the vehicle controller 20 authenticates the user device 18, the method proceeds to step 106.
At step 106, the user device 18 is connected to the interface system 14. As already described, the connection between the user device 18 and the interface system 14 may be made by a wireless communication medium or a wired communication medium. Once the user device 18 is connected to the interface system 14, the method proceeds to step 108.
At step 108, the vehicle controller 20 may receive a data signal (arrow 36) from the user device 18. The data signal may correspond to graphics, video, etc., to be displayed on the display screen 21 of the interface system 14. The data signal may correspond to graphics such as a menu of an application that was activated and is running on the user device 18. Once the vehicle controller 20 receives the data signal, the method proceeds to step 110.
At step 110, the display screen 21 of the interface system 14 may display the graphics corresponding to the data signal received from the user device 18. By way of example, the display screen 21 may display a menu including a plurality of choices for music channels available for selection. The method then proceeds to step 112.
At step 112, the vehicle controller 20 receives a signal from the user device 18 indicating that the user device 18 has received an input (arrow 44) into the user interface 30. Such an input into the user interface 30 may include, but should not be limited to, an audio input, keypad actuation, touch screen actuation, switch actuation, activation of accelerometers within the user device 18, and the like. The method proceeds to step 114.
At step 114, the display screen 21 alters the display as a function of the signal received from the user device 18. Continuing with the exemplary menu for the music channels provided above, the display screen 21 may highlight a selected music channel on the menu in response to receiving the signal from the user device 18. The method then proceeds to step 116.
At step 116, the vehicle controller 20 determines whether the signal received from the user device 18 corresponds to activating one or more components 16 of the vehicle 10. If the vehicle controller 20 determines that the signal does not correspond to activating one or more components 16, the method returns to step 114. However, if the vehicle controller 20 determines that the signal does correspond to activating one or more components 16, the method proceeds to step 118.
At step 118, the vehicle controller 20 transmits a signal to the corresponding component 16 of the vehicle 10. The method then proceeds to step 120.
At step 120, the component 16 receives the transmitted signal.
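For orientation, the sequence of steps 102 through 120 can be sketched as a single loop; every class and function below is an illustrative stand-in rather than an implementation prescribed by the method 100.

```python
# End-to-end sketch of steps 102-120 as a simple loop, using stub objects.

class StubDevice:
    def __init__(self, inputs):
        self.inputs = iter(inputs)

    def next_input(self):
        return next(self.inputs, None)

def run_method_100(device):
    print("step 102: connect request received")
    print("step 104: device authenticated (handshake)")
    print("step 106: device connected to interface system")
    print("steps 108-110: menu graphics displayed on display screen 21")
    while True:
        signal = device.next_input()                 # step 112
        if signal is None:
            break
        print(f"step 114: display updated for input {signal!r}")
        if signal.get("component"):                  # step 116
            print(f"steps 118-120: command sent to {signal['component']}")

run_method_100(StubDevice([
    {"gesture": "swipe", "component": None},
    {"gesture": "tap", "component": "hvac_blower"},
]))
```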
It should be appreciated that the method is not limited to using the applications and components 16 described herein. Other applications may be executed and/or other components 16 may be actuated, so long as the applications remain resident within, and are executed by, the user device 18.

While the best modes for carrying out the many aspects of the present teachings have been described in detail, those familiar with the art to which these teachings relate will recognize various alternative aspects for practicing the present teachings that are within the scope of the appended claims.
Claims (19)
1. A method of controlling a component of a vehicle with a user device, the method comprising:
receiving a data signal from the user device, wherein the data signal is received within a vehicle controller;
receiving a signal from the user device indicating the user device has received an input into a user interface of the user device, wherein the signal is received within the vehicle controller;
determining, in the controller, that the signal received from the user device corresponds to actuating the component of the vehicle;
transmitting a signal from the vehicle controller to the component;
receiving, by the component, the signal from the vehicle controller; and
actuating the component as a function of the transmitted signal.
2. A method, as set forth in claim 1, wherein the signal received from the user device is a function of a sensor reading of at least one sensor within the user device.
3. A method, as set forth in claim 2, further comprising determining a vehicle operating condition;
wherein transmitting a signal from the vehicle controller to the component is a function of the vehicle operating condition and the sensor reading of the at least one sensor.
4. A method, as set forth in claim 2, wherein transmitting a signal from the vehicle controller to the component is proportional to the sensor reading of the at least one sensor within the user device.
5. A method, as set forth in claim 1, further comprising:
displaying graphics corresponding to the data signal received from the user device on a display screen of the vehicle; and
altering the display of graphics on the display screen as a function of the signal received from the user device.
6. A method, as set forth in claim 1, further comprising:
operatively connecting the user device to an interface system of the vehicle;
receiving a signal, in a vehicle controller, from the user device; and
authenticating the user device such that at least one application is enabled on the user device.
7. A method, as set forth in claim 1, further comprising actuating a haptic actuator within the user device to provide a haptic response through the user device to indicate that the signal received from the user device corresponds to actuating the component of the vehicle.
8. A method, as set forth in claim 1, wherein actuating the component is further defined as changing an orientation of the component as a function of the transmitted signal.
9. A vehicle configured to communicate with a user device, the vehicle comprising:
a component; and
an interface system including a vehicle controller;
wherein the vehicle controller is configured to be in selective communication with a user device, the vehicle controller is operable for:
receiving a data signal from the user device, wherein the data signal is received within the vehicle controller;
receiving a signal from the user device indicating the user device has received an input into a user interface of the user device;
determining, in the controller, that the signal received from the user device corresponds to actuating the component of the vehicle;
transmitting a signal from the vehicle controller to the component; and
receiving, by the component, the signal from the vehicle controller; and
actuating the component as a function of the transmitted signal.
10. A vehicle, as set forth in claim 9, wherein the signal received from the user device is a function of a sensor reading of at least one sensor within the user device.
11. A vehicle, as set forth in claim 10, wherein the vehicle controller is further operable for determining a vehicle operating condition; and
wherein transmitting a signal from the vehicle controller to the component is a function of the vehicle operating condition and the sensor reading of the at least one sensor.
12. A vehicle, as set forth in claim 10, wherein transmitting a signal from the vehicle controller to the component is proportional to the sensor reading of the at least one sensor within the user device.
13. A vehicle, as set forth in claim 9, wherein the interface system further includes a display screen and the vehicle controller is further operable for:
displaying graphics corresponding to the data signal received from the user device on the display screen of the interface system; and
altering the display of graphics on the display screen as a function of the signal received from the user device.
14. A vehicle, as set forth in claim 9, wherein the vehicle controller is further operable for:
operatively connecting the user device to the interface system of the vehicle;
receiving a signal, in a vehicle controller, from the user device; and
authenticating the user device such that at least one application is enabled on the user device.
15. A vehicle, as set forth in claim 9, wherein the vehicle controller is further operable for providing haptic feedback through the user device to indicate the signal received from the user device corresponds to actuating the component of the vehicle.
16. A vehicle, as set forth in claim 9, wherein actuating the component is further defined as changing an orientation of the component as a function of the transmitted signal.
17. An interface system for controlling a component of a vehicle with a user device, the interface system comprising:
a vehicle controller configured to be in operative communication with the component and the user device, the vehicle controller operable for:
receiving a data signal from the user device, wherein the data signal is received within the vehicle controller;
receiving a signal from the user device indicating the user device has received an input into a user interface of the user device;
determining, in the controller, that the signal received from the user device corresponds to actuating the component of the vehicle;
transmitting a signal from the vehicle controller to the component to actuate the component.
18. A vehicle, as set forth in claim 9, wherein the interface system further includes a display screen and the vehicle controller is further operable for:
displaying graphics corresponding to the data signal received from the user device on the display screen of the interface system; and
altering the display of graphics on the display screen as a function of the signal received from the user device.
19. A vehicle, as set forth in claim 9, wherein the vehicle controller is further operable for:
operatively connecting the user device to the interface system of the vehicle;
receiving a signal, in a vehicle controller, from the user device; and
authenticating the user device such that at least one application is enabled on the user device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/920,413 US20160041562A1 (en) | 2013-04-25 | 2015-10-22 | Method of controlling a component of a vehicle with a user device |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361816089P | 2013-04-25 | 2013-04-25 | |
PCT/US2014/035396 WO2014176476A1 (en) | 2013-04-25 | 2014-04-25 | Method of controlling a component of a vehicle with a user device |
US14/920,413 US20160041562A1 (en) | 2013-04-25 | 2015-10-22 | Method of controlling a component of a vehicle with a user device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2014/035396 Continuation WO2014176476A1 (en) | 2013-04-25 | 2014-04-25 | Method of controlling a component of a vehicle with a user device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160041562A1 true US20160041562A1 (en) | 2016-02-11 |
Family
ID=51792397
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/918,073 Active 2034-06-28 US9688287B2 (en) | 2013-04-25 | 2015-10-20 | Situation awareness system and method |
US14/920,499 Active 2035-06-10 US10131364B2 (en) | 2013-04-25 | 2015-10-22 | Ambient display |
US14/920,420 Abandoned US20160039285A1 (en) | 2013-04-25 | 2015-10-22 | Scene awareness system for a vehicle |
US14/920,413 Abandoned US20160041562A1 (en) | 2013-04-25 | 2015-10-22 | Method of controlling a component of a vehicle with a user device |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/918,073 Active 2034-06-28 US9688287B2 (en) | 2013-04-25 | 2015-10-20 | Situation awareness system and method |
US14/920,499 Active 2035-06-10 US10131364B2 (en) | 2013-04-25 | 2015-10-22 | Ambient display |
US14/920,420 Abandoned US20160039285A1 (en) | 2013-04-25 | 2015-10-22 | Scene awareness system for a vehicle |
Country Status (4)
Country | Link |
---|---|
US (4) | US9688287B2 (en) |
CN (1) | CN105324268A (en) |
DE (1) | DE112014001607B4 (en) |
WO (4) | WO2014176476A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112918381A (en) * | 2019-12-06 | 2021-06-08 | 广州汽车集团股份有限公司 | Method, device and system for welcoming and delivering guests by vehicle-mounted robot |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9679471B2 (en) | 2014-04-18 | 2017-06-13 | Gentex Corporation | Trainable transceiver and cloud computing system architecture systems and methods |
US20150321604A1 (en) * | 2014-05-07 | 2015-11-12 | Ford Global Technologies, Llc | In-vehicle micro-interactions |
KR20170003906A (en) * | 2014-05-09 | 2017-01-10 | 삼성전자주식회사 | Terminal and method for displaying caller information |
DE102015212676A1 (en) * | 2015-07-07 | 2017-01-12 | Bayerische Motoren Werke Aktiengesellschaft | Determining the driving ability of the driver of a first motor vehicle |
CN105631977A (en) * | 2016-02-18 | 2016-06-01 | 广东百事泰电子商务股份有限公司 | Intelligent monitoring and recording instrument |
US10166996B2 (en) * | 2017-02-09 | 2019-01-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for adaptively communicating notices in a vehicle |
TW201836890A (en) * | 2017-03-31 | 2018-10-16 | 育全 李 | Method of showing the inside status of a vehicle via a plurality of first icons |
JP6325154B1 (en) * | 2017-06-07 | 2018-05-16 | スマート ビート プロフィッツ リミテッド | Information processing system |
JP7132127B2 (en) * | 2017-10-10 | 2022-09-06 | 積水化学工業株式会社 | VEHICLE WINDOW GLASS AND WARNING INDICATION METHOD |
US10709386B2 (en) | 2017-12-12 | 2020-07-14 | Lear Corporation | Electrocardiogram waveform identification and diagnostics via electrophysiological sensor system fusion |
CN110027473B (en) * | 2018-01-04 | 2024-04-30 | 哈曼国际工业有限公司 | Contextual sunroof for enhanced media experience in a car |
JPWO2020071169A1 (en) * | 2018-10-01 | 2021-09-02 | 富士フイルム株式会社 | display |
US11151810B2 (en) * | 2018-10-12 | 2021-10-19 | Aurora Flight Sciences Corporation | Adaptable vehicle monitoring system |
US20200376937A1 (en) * | 2019-05-29 | 2020-12-03 | Toyota Boshoku Kabushiki Kaisha | Light-adjusting system and vehicle light-adjusting system |
US11312300B1 (en) | 2021-01-29 | 2022-04-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | Object notification systems for identifying and notifying vehicles of relevant objects |
US11467401B2 (en) * | 2021-03-02 | 2022-10-11 | GM Global Technology Operations LLC | Display and light blocking screens |
US11506892B1 (en) | 2021-05-03 | 2022-11-22 | GM Global Technology Operations LLC | Holographic display system for a motor vehicle |
US11762195B2 (en) | 2021-05-06 | 2023-09-19 | GM Global Technology Operations LLC | Holographic display system with conjugate image removal for a motor vehicle |
US20240054528A1 (en) * | 2022-08-10 | 2024-02-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for measuring a reaction of a user to an advertisement |
GB2624974A (en) * | 2022-11-29 | 2024-06-05 | E Lead Electronic Co Ltd | Shield for helmet, helmet, and head-up display device |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040048622A1 (en) * | 1999-05-26 | 2004-03-11 | Johnson Controls Technology Company | System and method for radio frequency communication with a personal digital assistant in a vehicle |
US20040194479A1 (en) * | 2003-02-03 | 2004-10-07 | Makoto Umebayashi | Remotely operable air conditioning system for vehicle |
US20100004838A1 (en) * | 2008-07-01 | 2010-01-07 | Sony Corporation | Automatic speed limit adjust for road conditions |
US20100049528A1 (en) * | 2007-01-05 | 2010-02-25 | Johnson Controls Technology Company | System and method for customized prompting |
US20100063670A1 (en) * | 2006-11-14 | 2010-03-11 | Johnson Controls Technology Company | System and method of synchronizing an in-vehicle control system with a remote source |
US20110012720A1 (en) * | 2009-07-15 | 2011-01-20 | Hirschfeld Robert A | Integration of Vehicle On-Board Diagnostics and Smart Phone Sensors |
US20120303182A1 (en) * | 2010-01-20 | 2012-11-29 | In Ju Choi | System and method for managing vehicle through the wireless communications relay of a vehicle remote controller |
US20120303178A1 (en) * | 2011-05-26 | 2012-11-29 | Hendry Jeffrey C | Method and system for establishing user settings of vehicle components |
US20130018567A1 (en) * | 2011-07-12 | 2013-01-17 | Pantech Co., Ltd. | Mobile terminal, system and method for controlling an electronic control unit |
US20130274997A1 (en) * | 2012-04-13 | 2013-10-17 | Htc Corporation | Method of controlling interaction between mobile electronic device and in-vehicle electronic system and devices using the same |
US20130328670A1 (en) * | 2011-01-31 | 2013-12-12 | Andreas Brüninghaus | Operator control device |
US20140088793A1 (en) * | 2012-09-27 | 2014-03-27 | Dennis M. Morgan | Device, method, and system for portable configuration of vehicle controls |
US20140088794A1 (en) * | 2012-09-27 | 2014-03-27 | Mitsubishi Jidosha Kogyo Kabushiki Kaisha | Remote control system for in-vehicle device |
US20140142948A1 (en) * | 2012-11-21 | 2014-05-22 | Somya Rathi | Systems and methods for in-vehicle context formation |
US8751065B1 (en) * | 2012-12-14 | 2014-06-10 | Denso Corporation | Smartphone controller of vehicle settings |
US20140163771A1 (en) * | 2012-12-10 | 2014-06-12 | Ford Global Technologies, Llc | Occupant interaction with vehicle system using brought-in devices |
US20150210287A1 (en) * | 2011-04-22 | 2015-07-30 | Angel A. Penilla | Vehicles and vehicle systems for providing access to vehicle controls, functions, environment and applications to guests/passengers via mobile devices |
US20150243168A1 (en) * | 2012-10-31 | 2015-08-27 | Bayerische Motoren Werke Aktiengesellschaft | Vehicle Assistance Device |
Family Cites Families (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5883605A (en) * | 1992-02-25 | 1999-03-16 | Gentex Corporation | Automatic electrochromic control of light level of vacuum fluorescent display |
DE4211728A1 (en) | 1992-04-08 | 1993-10-14 | Zeiss Carl Fa | Holographic display device e.g. for vehicle or aircraft head=up display - uses curved windscreen incorporating monomode waveguide for supplied light and holographic gratings |
JPH08272321A (en) * | 1995-03-31 | 1996-10-18 | Toyoda Gosei Co Ltd | External display device of vehicle |
US6172613B1 (en) * | 1998-02-18 | 2001-01-09 | Donnelly Corporation | Rearview mirror assembly incorporating vehicle information display |
JP3562999B2 (en) * | 1999-07-27 | 2004-09-08 | 株式会社クボタ | Work vehicle |
US7167796B2 (en) * | 2000-03-09 | 2007-01-23 | Donnelly Corporation | Vehicle navigation system for use with a telematics system |
US7449081B2 (en) * | 2000-06-21 | 2008-11-11 | E. I. Du Pont De Nemours And Company | Process for improving the emission of electron field emitters |
US6580973B2 (en) * | 2000-10-14 | 2003-06-17 | Robert H. Leivian | Method of response synthesis in a driver assistance system |
US7565230B2 (en) * | 2000-10-14 | 2009-07-21 | Temic Automotive Of North America, Inc. | Method and apparatus for improving vehicle operator performance |
JP3846872B2 (en) * | 2002-06-27 | 2006-11-15 | パイオニア株式会社 | Driver mental state information provision system |
US6992580B2 (en) * | 2002-07-25 | 2006-01-31 | Motorola, Inc. | Portable communication device and corresponding method of operation |
US6859144B2 (en) * | 2003-02-05 | 2005-02-22 | Delphi Technologies, Inc. | Vehicle situation alert system with eye gaze controlled alert signal generation |
US20050084659A1 (en) * | 2003-10-20 | 2005-04-21 | General Atomics | Vehicle windshield head-up display |
US7801283B2 (en) * | 2003-12-22 | 2010-09-21 | Lear Corporation | Method of operating vehicular, hands-free telephone system |
JP4206928B2 (en) * | 2004-01-19 | 2009-01-14 | 株式会社デンソー | Collision possibility judgment device |
DE102004005816B4 (en) | 2004-02-06 | 2007-02-08 | Audi Ag | motor vehicle |
US7413328B2 (en) * | 2004-12-30 | 2008-08-19 | Honeywell International Inc. | Remotely coupled hybrid HUD backlight |
JP2006327527A (en) * | 2005-05-30 | 2006-12-07 | Honda Motor Co Ltd | Safety device for vehicle running |
DE102005059216A1 (en) | 2005-07-16 | 2007-01-25 | Ralf Michel | Supervision system in particular for motorcycle, comprises units for determination and evaluation of driving performance |
JP4617226B2 (en) * | 2005-08-30 | 2011-01-19 | 本田技研工業株式会社 | Vehicle display device |
JP2007259931A (en) * | 2006-03-27 | 2007-10-11 | Honda Motor Co Ltd | Visual axis detector |
KR100828965B1 (en) * | 2006-07-31 | 2008-05-13 | 삼성전자주식회사 | Method and apparatus for setting environment of cars in portable terminal |
WO2008029802A1 (en) * | 2006-09-04 | 2008-03-13 | Panasonic Corporation | Travel information providing device |
US20080158510A1 (en) * | 2007-01-02 | 2008-07-03 | Gm Global Technology Operations, Inc. | Apparatus And Method For Displaying Information Within A Vehicle Interior |
JP2010527464A (en) * | 2007-05-17 | 2010-08-12 | プリズム インコーポレイテッド | Multilayer screen with light-emitting stripes for beam display system scanning |
US7908060B2 (en) * | 2007-07-31 | 2011-03-15 | International Business Machines Corporation | Method and system for blind spot identification and warning utilizing portable and wearable devices |
JP2009156898A (en) * | 2007-12-25 | 2009-07-16 | Seiko Epson Corp | Display device |
JP5354514B2 (en) * | 2008-03-31 | 2013-11-27 | 現代自動車株式会社 | Armpit driving detection alarm system |
DE102008042521A1 (en) * | 2008-10-01 | 2010-04-08 | Robert Bosch Gmbh | Procedure for displaying a visual warning |
DE102009010623A1 (en) | 2009-02-26 | 2010-09-02 | Hella Kgaa Hueck & Co. | Device for issuing visual warning information to driver of vehicle, particularly motor vehicle, has vehicle window pane, through which vehicle driver visually captures area of environment of vehicle |
US8564502B2 (en) * | 2009-04-02 | 2013-10-22 | GM Global Technology Operations LLC | Distortion and perspective correction of vector projection display |
FR2946336B1 (en) * | 2009-06-03 | 2011-05-20 | Saint Gobain | LAMINATED GLAZING FOR HIGH HEAD VISUALIZATION SYSTEM |
US20110025584A1 (en) * | 2009-07-29 | 2011-02-03 | Gm Global Technology Operations, Inc. | Light-emitting diode heads-up display for a vehicle |
KR20110038563A (en) * | 2009-10-08 | 2011-04-14 | 최운호 | Method, vehicle terminal, biometrics card and system for controlling vehicle through authenticating driver, and method for providing passenger protecting/tracking function using biometrics card and terminal |
US8498757B2 (en) * | 2009-10-09 | 2013-07-30 | Visteon Global Technologies, Inc. | Portable and personal vehicle presets |
CN201525262U (en) * | 2009-11-25 | 2010-07-14 | 王辉 | Automobile front windshield glass with transparent LCD (liquid crystal display) device |
US8384534B2 (en) * | 2010-01-14 | 2013-02-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | Combining driver and environment sensing for vehicular safety systems |
US8599027B2 (en) * | 2010-10-19 | 2013-12-03 | Deere & Company | Apparatus and method for alerting machine operator responsive to the gaze zone |
KR20120075672A (en) * | 2010-12-29 | 2012-07-09 | 성균관대학교산학협력단 | System and method for safe driving induction using detection of gaze direction |
CN103442925B (en) * | 2011-03-25 | 2016-08-17 | Tk控股公司 | For determining the system and method for driver's Vigilance |
KR20120113579A (en) * | 2011-04-05 | 2012-10-15 | 현대자동차주식회사 | Apparatus and method for displaying road guide information on the windshield |
KR20120136721A (en) * | 2011-06-09 | 2012-12-20 | 현대자동차주식회사 | Apparatus and method for alarming use of mobile phone on driving |
EP2564776B1 (en) * | 2011-09-02 | 2019-08-28 | Volvo Car Corporation | Method, system and computer readable medium embodying a computer program product for determining a vehicle operator's expectation of a state of an object |
CN202357886U (en) * | 2011-12-09 | 2012-08-01 | 常州永旭车辆配件厂 | Dashboard of electric vehicle |
FR2985042B1 (en) * | 2011-12-22 | 2014-01-17 | Saint Gobain | DEVICE FOR VISUALIZING AN IMAGE ON A SHEET SUPPORT |
CN202806308U (en) * | 2012-08-21 | 2013-03-20 | 惠州市德赛西威汽车电子有限公司 | Automotive windshield |
US20160023604A1 (en) * | 2013-07-08 | 2016-01-28 | LightSpeed Automotive Technology | Head-Up Display Controller |
KR101555444B1 (en) * | 2014-07-10 | 2015-10-06 | 현대모비스 주식회사 | An apparatus mounted in vehicle for situational awareness and a method thereof |
CN105313898B (en) * | 2014-07-23 | 2018-03-20 | 现代摩比斯株式会社 | Driver status induction installation and its method |
US20160109701A1 (en) * | 2014-10-15 | 2016-04-21 | GM Global Technology Operations LLC | Systems and methods for adjusting features within a head-up display |
2014
- 2014-04-25 CN CN201480035937.0A patent/CN105324268A/en active Pending
- 2014-04-25 WO PCT/US2014/035396 patent/WO2014176476A1/en active Application Filing
- 2014-04-25 WO PCT/US2014/035387 patent/WO2014176474A1/en active Application Filing
- 2014-04-25 WO PCT/US2014/035398 patent/WO2014176478A1/en active Application Filing
- 2014-04-25 WO PCT/US2014/035385 patent/WO2014176473A1/en active Application Filing
- 2014-04-25 DE DE112014001607.1T patent/DE112014001607B4/en active Active
2015
- 2015-10-20 US US14/918,073 patent/US9688287B2/en active Active
- 2015-10-22 US US14/920,499 patent/US10131364B2/en active Active
- 2015-10-22 US US14/920,420 patent/US20160039285A1/en not_active Abandoned
- 2015-10-22 US US14/920,413 patent/US20160041562A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
ITmedia, "How Can a Screen Sense Touch? A Basic Understanding of Touch Panels," Sept. 27, 2010, ITmedia Inc., translated at http://www.eizo.com/library/basics/basic_understanding_of_touch_panel/ * |
Also Published As
Publication number | Publication date |
---|---|
DE112014001607B4 (en) | 2021-09-02 |
US20160085070A1 (en) | 2016-03-24 |
WO2014176473A1 (en) | 2014-10-30 |
US9688287B2 (en) | 2017-06-27 |
US20160082979A1 (en) | 2016-03-24 |
US10131364B2 (en) | 2018-11-20 |
WO2014176474A1 (en) | 2014-10-30 |
WO2014176478A1 (en) | 2014-10-30 |
CN105324268A (en) | 2016-02-10 |
US20160039285A1 (en) | 2016-02-11 |
DE112014001607T5 (en) | 2015-12-24 |
WO2014176476A1 (en) | 2014-10-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MATHIEU, ROY J.;SZCZERBA, JOSEPH F.;JONES, MICAH R.;REEL/FRAME:036882/0785 Effective date: 20151021 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |