
WO2021057352A1 - Information display method and apparatus for vehicle-mounted device, and vehicle - Google Patents

Information display method and apparatus for vehicle-mounted device, and vehicle

Info

Publication number
WO2021057352A1
WO2021057352A1 (PCT/CN2020/110506 · CN2020110506W)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
area
navigation
instruction
displayed
Prior art date
Application number
PCT/CN2020/110506
Other languages
English (en)
French (fr)
Inventor
郑维希
王冠华
陈子捷
黄雪妍
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021057352A1
Priority to US17/703,053 (published as US20220212690A1)

Links

Images

Classifications

    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W30/18 Propelling the vehicle
    • B60K35/00 Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle
    • B60K35/22 Display screens
    • B60K35/265 Voice output
    • B60K35/28 Output arrangements characterised by the type or purpose of the output information
    • B60K35/29 Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • B60K35/85 Arrangements for transferring vehicle- or driver-related data
    • B60W40/02 Estimation of driving parameters related to ambient conditions
    • B60W40/10 Estimation of driving parameters related to vehicle motion
    • B60K2360/1523 Matrix displays
    • B60K2360/166 Navigation
    • B60K2360/168 Target or limit values
    • B60K2360/175 Autonomous driving
    • B60K2360/177 Augmented reality
    • B60K2360/178 Warnings
    • B60K2360/179 Distances to obstacles or vehicles
    • B60K2360/27 Optical features using semi-transparent optical elements
    • B60K2360/31 Virtual images
    • B60W2050/146 Display means
    • B60W2552/05 Type of road, e.g. motorways, local streets, paved or unpaved roads
    • B60W2552/10 Number of lanes
    • B60W2552/35 Road bumpiness, e.g. potholes
    • B60W2552/53 Road markings, e.g. lane marker or crosswalk
    • B60W2554/4041 Position of dynamic objects
    • B60W2554/4042 Longitudinal speed of dynamic objects
    • B60W2554/406 Traffic density
    • B60W2554/802 Longitudinal distance to objects
    • B60W2555/20 Ambient conditions, e.g. wind or rain
    • B60W2555/60 Traffic rules, e.g. speed limits or right of way
    • B60W2556/45 External transmission of data to or from the vehicle
    • B60W2556/65 Data transmitted between vehicles

Definitions

  • This application relates to the field of intelligent vehicles or autonomous driving, and in particular, to an information display method and apparatus for a vehicle-mounted device, and a vehicle.
  • Autonomous driving technology relies on the collaboration of artificial intelligence, visual computing, radar, monitoring devices, and global positioning systems to allow motor vehicles to drive autonomously without active human operation. Since autonomous driving technology does not require a human to drive the vehicle, it can in theory effectively avoid human driving errors, reduce the occurrence of traffic accidents, and improve highway transportation efficiency. Therefore, autonomous driving technology is receiving more and more attention.
  • In the prior art, the on-board equipment inside the vehicle can display an automatic driving interface, and the automatic driving interface can display the lane of the vehicle and other vehicles located near the vehicle. However, the display content of the existing automatic driving interface can no longer meet drivers' needs.
  • In view of this, the embodiments of the present application provide an information display method and apparatus for in-vehicle equipment, and a vehicle, which enrich the display content of the automatic driving interface.
  • In a first aspect, the present application provides an information display method for in-vehicle equipment, including: acquiring lane line information of the road where the first vehicle is located, where the lane lines are at least two lines on the road surface used to divide different lanes; and displaying, according to the lane line information, a virtual lane line consistent with the type of the lane line.
  • In the embodiments of the present application, a virtual lane line consistent with the lane line corresponding to the acquired lane line information is displayed in the automatic driving interface, so that the driver can see from the automatic driving interface a virtual lane line of the same type as the actual lane line on the road. This not only enriches the display content of the automatic driving interface, but also improves driving safety.
  • It should be noted that "consistent with the type of the lane line" does not mean that the virtual lane line is exactly the same as the actual road lane line; there may always be some differences between the virtual lane line displayed on the screen and the actual lane line. The purpose of this application is to indicate the actual lane to the driver for reference: the indication is made as close to the actual lane line as possible, but the color, shape, material, and so on of the line may differ from the actual lane line. Further, other indication information may also be added and displayed on the basis of the virtual lane line.
  • In a possible implementation, acquiring the lane line information of the road where the first vehicle is located includes: acquiring the lane line information of the lane where the first vehicle is located.
  • the lane line includes at least one of the following lane lines: a dashed line, a solid line, a double dashed line, a double solid line, and a dashed solid line.
  • In a possible implementation, the lane line includes at least one of the following lane lines: a white dashed line, a white solid line, a yellow dashed line, a yellow solid line, a double white dashed line, a double yellow solid line, a yellow dashed-solid line, and a double white solid line. It should be noted that the type of the virtual lane line displayed on the automatic driving interface can be consistent with the shape and color of the actual lane line.
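The lane line types listed above could be mapped to rendering styles for the virtual lane line roughly as follows. This is an illustrative sketch only; the function name, style fields, and fallback behavior are assumptions, not taken from the application.

```python
# Hypothetical mapping from a detected lane line type to a virtual lane
# line style (color, dash pattern, number of parallel lines).
LANE_STYLES = {
    "white_dashed":        {"color": "white",  "pattern": "dashed", "count": 1},
    "white_solid":         {"color": "white",  "pattern": "solid",  "count": 1},
    "yellow_dashed":       {"color": "yellow", "pattern": "dashed", "count": 1},
    "yellow_solid":        {"color": "yellow", "pattern": "solid",  "count": 1},
    "double_white_dashed": {"color": "white",  "pattern": "dashed", "count": 2},
    "double_yellow_solid": {"color": "yellow", "pattern": "solid",  "count": 2},
    "double_white_solid":  {"color": "white",  "pattern": "solid",  "count": 2},
}

def virtual_lane_line(lane_type: str) -> dict:
    """Return a rendering style consistent with the detected lane line type;
    fall back to a plain white solid line for unrecognized types."""
    return LANE_STYLES.get(lane_type,
                           {"color": "white", "pattern": "solid", "count": 1})
```

The lookup keeps the displayed line consistent with the actual line's shape and color, while leaving the renderer free to vary material or transparency, as the passage allows.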
  • In a possible implementation, the method further includes: acquiring information of a non-motor vehicle object on the road, and displaying the non-motor vehicle object according to the information of the non-motor vehicle object.
  • In a possible implementation, the method further includes: sending second sharing information to the second vehicle, where the second sharing information includes location information of the non-motor vehicle object.
  • In a possible implementation, the method further includes: displaying an obstacle prompt on the navigation interface, where the obstacle prompt is used to indicate a non-motor vehicle object at a position corresponding to the location information.
  • In a possible implementation, the non-motor vehicle object includes at least road depressions, obstacles, and standing water on the road.
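The sharing flow described above (sending a non-motor-vehicle object's location to a second vehicle, which then draws an obstacle prompt) could be sketched like this. All message fields and function names are illustrative assumptions, not from the application.

```python
import json

def make_sharing_info(obstacle_type: str, lat: float, lon: float) -> str:
    """Sender side: build the 'second sharing information' carrying the
    position of a non-motor-vehicle object (e.g. a pothole)."""
    msg = {"type": obstacle_type, "position": {"lat": lat, "lon": lon}}
    return json.dumps(msg)

def on_sharing_info(payload: str) -> dict:
    """Receiver side: parse the shared message and produce the obstacle
    prompt to draw on the navigation interface at that position."""
    msg = json.loads(payload)
    return {"prompt": "obstacle", "at": msg["position"], "kind": msg["type"]}
```

A JSON payload is just a placeholder here; a production vehicle-to-vehicle link would use whatever V2X message format the system actually defines.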
  • In a possible implementation, the method further includes: displaying a navigation instruction and a lane change instruction, where the navigation instruction is used to indicate the navigation path of the first vehicle, and the lane change instruction is used to indicate a travel path along which the first vehicle avoids the non-motor vehicle object.
  • In a possible implementation, the method further includes: displaying a second warning prompt, where the second warning prompt is different from the first warning prompt. In a possible implementation, the display color or transparency of the first warning prompt and the second warning prompt are different.
  • In a possible implementation, the method further includes: displaying a navigation instruction based on the navigation information, where the navigation instruction is used to indicate the navigation path of the first vehicle.
  • In a possible implementation, the navigation instruction includes a first navigation instruction or a second navigation instruction, and displaying the navigation instruction based on the navigation information includes: displaying the second navigation instruction, where the first navigation instruction is different from the second navigation instruction. For example, the display color or transparency of the first navigation instruction and the second navigation instruction are different.
  • the driver or passenger can determine the current driving state of the vehicle based on the display of the navigation instructions in the navigation interface.
  • In a possible implementation, the navigation instruction includes a third navigation instruction or a fourth navigation instruction, and displaying the navigation instruction based on the navigation information includes: displaying the third navigation instruction based on the first vehicle being in the first environment, and displaying the fourth navigation instruction based on the first vehicle being in the second environment, where the first environment is different from the second environment, and the third navigation instruction is different from the fourth navigation instruction.
  • In a possible implementation, the first environment includes at least one of the following environments: the weather environment where the first vehicle is located, the road surface environment where the first vehicle is located, the weather environment where the first vehicle's navigation destination is located, the road environment where the first vehicle's navigation destination is located, the traffic congestion environment of the road where the first vehicle is located, the traffic congestion environment of the first vehicle's navigation destination, or the brightness environment where the first vehicle is located.
  • In a possible implementation, the first vehicle may display the first lane based on the first vehicle being in the first environment, and display the second lane based on the first vehicle being in the second environment, where the first lane and the second lane are each the lane in which the first vehicle travels, or a lane on the road where the first vehicle is located, the first environment is different from the second environment, and the first lane is different from the second lane.
  • In this way, the driver or passenger can learn the current environment of the vehicle based on the display of the automatic driving interface; especially at night or in other low-brightness scenes, this improves driving safety.
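Selecting a navigation instruction style from the current environment, as described above, could be sketched as follows. The thresholds, style values, and environment keys are illustrative assumptions, not from the application.

```python
def instruction_style(environment: dict) -> dict:
    """Pick a display style for the navigation instruction from the current
    environment (ambient brightness, weather). Hypothetical values:
    a high-contrast variant at night, reduced opacity in rain."""
    style = {"color": "green", "alpha": 1.0}          # default daytime style
    if environment.get("brightness", 1.0) < 0.3:      # night / low light
        style = {"color": "cyan", "alpha": 0.8}       # higher-contrast variant
    if environment.get("weather") == "rain":
        style["alpha"] = min(style["alpha"], 0.6)     # de-emphasize under rain
    return style
```

The same pattern would extend to the other environment factors the passage lists (road surface, congestion, destination weather), each contributing its own adjustment.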
  • In a possible implementation, the method further includes: displaying a second area, where the second area includes a scene area on the left front of the driving direction of the first vehicle that is larger than the scene area on the left front included in the first area.
  • In a possible implementation, the method further includes: displaying a fourth area, where the third area includes a scene area on the right rear of the driving direction of the first vehicle that is larger than the scene area on the right rear included in the fourth area.
  • In a possible implementation, the method further includes: displaying a sixth area, where the fifth area includes a scene area on the right front of the driving direction of the first vehicle that is larger than the scene area on the right front included in the sixth area.
  • In a possible implementation, the method further includes: displaying an eighth area, where the seventh area includes a scene area on the left rear of the driving direction of the first vehicle that is larger than the scene area on the left rear included in the eighth area.
  • In this way, the current display angle of view can be changed, so that the driver can learn information about areas that may pose safety risks, which improves driving safety.
  • In a possible implementation, the first vehicle may display the ninth area based on the first vehicle being at the first traveling speed, and display the tenth area based on the first vehicle being at the second traveling speed, where the ninth area and the tenth area are scene areas in which the first vehicle is traveling, the second traveling speed is greater than the first traveling speed, and the ninth area includes a larger scene area than the tenth area. In this way, a larger scene area can be displayed so that the driver can learn more road information when the first vehicle is traveling faster, which improves driving safety.
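Coupling the displayed scene area to the vehicle's speed, as described above, could be sketched as a simple scaling function. The linear relationship and all constants are illustrative assumptions; the application does not specify how the area is computed.

```python
def scene_depth_m(speed_kmh: float, base_depth_m: float = 50.0) -> float:
    """Return the forward depth (in metres) of the scene area to display
    on the automatic driving interface. A faster vehicle gets a larger
    displayed area. Hypothetical rule: base depth plus ~2 m per km/h."""
    return base_depth_m + 2.0 * max(speed_kmh, 0.0)
```

At 30 km/h this shows about 110 m ahead, at 100 km/h about 250 m, so the driver sees more road when reaction distances are longer.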
  • In a possible implementation, the method further includes: displaying a first image based on the geographic location, where the first image is used to indicate the type of geographic location where the first vehicle's navigation destination is located.
  • In a possible implementation, the method further includes: detecting the third vehicle, and displaying a second image based on the geographic location where the third vehicle's navigation destination is located, where the second image is used to indicate the type of geographic location where the third vehicle's navigation destination is located.
  • the type of the geographic location includes at least one of the following types: city, mountain, plain, forest, or seaside.
  • In a possible implementation, the first vehicle may obtain the geographic location of the navigation destination of the first vehicle, and display a first image based on the geographic location, where the first image is used to indicate the type of geographic location where the first vehicle's navigation destination is located. In this way, the first vehicle can display the corresponding image in the automatic driving interface based on the geographic location of the navigation destination, which enriches the content of the automatic driving interface.
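Mapping the destination's geographic type (city, mountain, plain, forest, seaside) to a themed image could be sketched as a lookup. The file names and function name are illustrative assumptions.

```python
from typing import Optional

# Hypothetical themed images, one per geographic-location type
# enumerated in the application.
DESTINATION_IMAGES = {
    "city":     "theme_city.png",
    "mountain": "theme_mountain.png",
    "plain":    "theme_plain.png",
    "forest":   "theme_forest.png",
    "seaside":  "theme_seaside.png",
}

def destination_image(geo_type: str) -> Optional[str]:
    """Return the image to show on the automatic driving interface for the
    destination's geographic type; None when the type is unknown."""
    return DESTINATION_IMAGES.get(geo_type)
```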
  • In a possible implementation, the intersection stop instruction includes a first intersection stop instruction or a second intersection stop instruction, and displaying the intersection stop instruction when it is detected that the first vehicle is driving to the intersection stop area includes: displaying the second intersection stop instruction, where the first intersection stop instruction is different from the second intersection stop instruction.
  • the intersection stop instruction includes: a third intersection stop instruction or a fourth intersection stop instruction, and it is detected that the first vehicle is traveling to the intersection stop area, and Display stop instructions at intersections, including:
  • the fourth intersection stop instruction is displayed, and the third intersection stop instruction is different from the fourth intersection stop instruction.
  • the method further includes:
  • the fourth vehicle is detected
  • a vehicle warning prompt is displayed.
  • the vehicle warning prompt includes a first vehicle warning prompt or a second vehicle warning prompt, and the displaying of the vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being less than the preset distance includes:
  • a second vehicle warning prompt is displayed.
  • the first distance is different from the second distance, and the first vehicle warning prompt is different from the second vehicle warning prompt.
  • the first vehicle may display a vehicle warning prompt on the automatic driving interface based on the distance between the nearby vehicle and the own vehicle. This enables the driver to know the risk of collision between the first vehicle and other vehicles through the warning prompt displayed on the automatic driving interface.
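  • The distance-based selection described above can be sketched as follows (the threshold values and prompt names are illustrative assumptions; the application only requires that the two prompts and the two distances differ):

```python
def vehicle_warning_prompt(distance_m: float,
                           preset_distance_m: float = 15.0,
                           second_distance_m: float = 5.0):
    """Pick the vehicle warning prompt to display for a nearby vehicle.

    No prompt is shown unless the other vehicle is closer than the
    preset distance; a closer (second) distance selects a more urgent
    prompt. Thresholds here are illustrative, not from the application.
    """
    if distance_m >= preset_distance_m:
        return None  # no collision risk indicated
    if distance_m <= second_distance_m:
        return "second_vehicle_warning_prompt"
    return "first_vehicle_warning_prompt"
```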
  • the method further includes:
  • the fifth vehicle is detected
  • a fourth image corresponding to the fifth vehicle is displayed, and the third image is different from the fourth image.
  • the present application provides an information display device for in-vehicle equipment, including:
  • An acquisition module configured to acquire lane line information on the road where the first vehicle is located, where the lane lines are at least two lines on the road surface that are used to divide different lanes;
  • the display module is used to display the virtual lane line consistent with the lane line type according to the lane line information.
  • the acquiring lane line information of the road where the first vehicle is located includes:
  • the lane line includes at least one of the following lane lines: a dashed line, a solid line, a double dashed line, a double solid line, and a dashed solid line.
  • the lane line includes at least one of the following lane lines: white dashed line, white solid line, yellow dashed line, yellow solid line, double white dashed line, double yellow solid line, yellow dashed-solid line, and double white solid line.
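  • To illustrate how a virtual lane line consistent with the detected type might be chosen for display (a minimal sketch; the style table and the `virtual_lane_line` helper are assumptions, not from this application):

```python
# Illustrative style table: each detected lane line type maps to the
# color, pattern, and line count used to draw the virtual lane line.
LANE_LINE_STYLES = {
    "white_dashed":        {"color": "white",  "pattern": "dashed", "lines": 1},
    "white_solid":         {"color": "white",  "pattern": "solid",  "lines": 1},
    "yellow_dashed":       {"color": "yellow", "pattern": "dashed", "lines": 1},
    "yellow_solid":        {"color": "yellow", "pattern": "solid",  "lines": 1},
    "double_white_dashed": {"color": "white",  "pattern": "dashed", "lines": 2},
    "double_yellow_solid": {"color": "yellow", "pattern": "solid",  "lines": 2},
    "double_white_solid":  {"color": "white",  "pattern": "solid",  "lines": 2},
}

def virtual_lane_line(lane_line_type: str) -> dict:
    # The displayed virtual lane line is kept consistent with the
    # detected lane line type.
    return LANE_LINE_STYLES[lane_line_type]
```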
  • the acquisition module is further configured to acquire information about non-motor vehicle objects on the road surface
  • the display module is also used to display the non-motor vehicle object.
  • the device further includes:
  • a receiving module configured to receive a sharing instruction, the sharing instruction carrying the address of the second vehicle
  • the sending module is configured to send second shared information to the second vehicle in response to the sharing instruction, where the second shared information includes location information of the non-motor vehicle object.
  • the receiving module is further configured to receive the first shared information sent by the server or the second vehicle, and the first shared information includes information about the non-motor vehicle object. location information;
  • the display module is further configured to display, based on the first vehicle starting navigation, an obstacle prompt on the navigation interface, where the obstacle prompt is used to indicate a non-motor vehicle object at the location corresponding to the location information.
  • the non-motor vehicle object includes at least one of: a road depression, an obstacle, or standing water on the road.
  • the display module is further configured to display lane change instructions based on the non-motor vehicle object being located on the navigation path indicated by the navigation instructions, wherein the navigation The indication is used to indicate the navigation path of the first vehicle, and the lane change indication is used to indicate the driving path of the first vehicle to avoid the non-motor vehicle object.
  • the display module is further configured to display a first alarm prompt based on the distance between the first vehicle and the non-motor vehicle object being a first distance;
  • a second alarm prompt is displayed, and the second alarm prompt is different from the first alarm prompt.
  • the color or transparency of the first alarm prompt and the second alarm prompt are different.
  • the acquisition module is further configured to acquire navigation information of the first vehicle
  • the display module is further configured to display navigation instructions based on the navigation information, and the navigation instructions are used to indicate the navigation path of the first vehicle.
  • the navigation instruction includes a first navigation instruction or a second navigation instruction
  • the display module is specifically configured to display the first navigation instruction;
  • the second navigation instruction is displayed, and the first navigation instruction is different from the second navigation instruction.
  • the display color or transparency of the first navigation instruction and the second navigation instruction are different.
  • the navigation instruction includes a third navigation instruction or a fourth navigation instruction
  • the display module is specifically configured to display the third navigation instruction based on the first vehicle being in the first environment;
  • the fourth navigation instruction is displayed based on the first vehicle being in the second environment, the first environment is different from the second environment, and the third navigation instruction is different from the fourth navigation instruction.
  • the first environment includes at least one of the following environments: the weather environment where the first vehicle is located, the road surface environment where the first vehicle is located, the weather environment where the first vehicle navigation destination is located, the road environment where the first vehicle navigation destination is located, the traffic jam environment of the road where the first vehicle is located, the traffic jam environment of the first vehicle navigation destination, or the brightness environment where the first vehicle is located.
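  • One way to realize environment-dependent navigation instructions is to vary the display style with the detected environment, as in this sketch (the specific colors, opacities, and environment labels are illustrative assumptions, not from this application):

```python
def navigation_indication_style(weather: str = "clear",
                                brightness: str = "day") -> dict:
    """Choose the display style of the navigation indication based on
    the current environment of the first vehicle (illustrative values).
    """
    if weather in ("rain", "snow", "fog") or brightness == "night":
        # Low-visibility conditions: brighter, more opaque indication.
        return {"color": "orange", "opacity": 0.9}
    # Normal conditions: subtler indication.
    return {"color": "blue", "opacity": 0.6}
```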
  • the display module is further configured to display the first area based on the first vehicle being in a straight-moving state
  • a second area is displayed, wherein the scene area on the left front of the driving direction of the first vehicle included in the second area is larger than the scene area on the left front included in the first area.
  • the display module is further configured to display the third area based on the left-turning state of the first vehicle
  • a fourth area is displayed, wherein the scene area on the right rear of the driving direction of the first vehicle included in the third area is larger than the scene area on the right rear included in the fourth area.
  • the display module is further configured to display the fifth area based on the first vehicle being in a straight-moving state
  • a sixth area is displayed, wherein the scene area on the right front of the driving direction of the first vehicle included in the fifth area is larger than the scene area on the right front included in the sixth area.
  • the display module is further configured to display the seventh area based on the state of the first vehicle turning right;
  • an eighth area is displayed, wherein the scene area on the left rear of the driving direction of the first vehicle included in the seventh area is larger than the scene area on the left rear included in the eighth area.
  • the display module is further configured to display the ninth area based on the first vehicle at the first traveling speed
  • a tenth area is displayed, wherein the ninth area and the tenth area are scene areas in which the first vehicle is driving, the second driving speed is greater than the first driving speed, and the scene area included in the tenth area is larger than the scene area included in the ninth area.
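  • The speed-dependent scene area could be computed as in the following sketch (the base width, gain, and cap are illustrative assumptions; the only property taken from this application is that a higher travel speed yields a larger displayed scene area):

```python
def scene_area_width_m(speed_kmh: float,
                       base_width_m: float = 40.0,
                       gain_m_per_kmh: float = 0.5,
                       max_width_m: float = 120.0) -> float:
    # The displayed scene area grows with travel speed so the driver
    # can see more road surface information at higher speeds, capped
    # to keep the interface readable.
    return min(base_width_m + gain_m_per_kmh * speed_kmh, max_width_m)
```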
  • the acquiring module is further configured to acquire the geographic location of the first vehicle navigation destination;
  • the display module is further configured to display a first image based on the geographic location, and the first image is used to indicate the type of geographic location where the first vehicle navigation destination is located.
  • the detection module is further configured to detect a third vehicle
  • the acquiring module is also used to acquire the geographic location of the third vehicle navigation destination;
  • the display module is further configured to display a second image based on the geographic location where the third vehicle navigation destination is located, and the second image is used to indicate the type of geographic location where the third vehicle navigation destination is located.
  • the type of geographic location includes at least one of the following types: city, mountain, plain, forest, or seaside.
  • the detection module is further configured to detect that the first vehicle has traveled to an intersection stop area and display a first intersection stop instruction.
  • the intersection stop instruction includes: a first intersection stop instruction or a second intersection stop instruction
  • the display module is further used for:
  • a second intersection stop instruction is displayed, and the first intersection stop instruction is different from the second intersection stop instruction.
  • the intersection stop instruction includes: a third intersection stop instruction or a fourth intersection stop instruction
  • the display module is further used for:
  • the fourth intersection stop instruction is displayed, and the third intersection stop instruction is different from the fourth intersection stop instruction.
  • the detection module is further configured to detect a fourth vehicle
  • the display module is further configured to display a vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being less than a preset distance.
  • the vehicle warning prompt includes a first vehicle warning prompt or a second vehicle warning prompt
  • the display module is further configured to display the first vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being the first distance;
  • a second vehicle warning prompt is displayed.
  • the first distance is different from the second distance, and the first vehicle warning prompt is different from the second vehicle warning prompt.
  • the detection module is further configured to detect the fifth vehicle
  • the display module is further configured to display a third image corresponding to the fifth vehicle based on the fifth vehicle being located on the lane line of the lane in front of the driving direction of the first vehicle;
  • a fourth image corresponding to the fifth vehicle is displayed, and the third image is different from the fourth image.
  • the present application provides a vehicle including a processor, a memory, and a display, and the processor is configured to acquire and execute the code in the memory to execute the method described in any one of the above-mentioned first aspects.
  • the vehicle supports an unmanned driving function.
  • the present application provides an in-vehicle device, which is characterized by comprising a processor and a memory, and the processor is configured to acquire and execute the code in the memory to execute the method described in any one of the above-mentioned first aspects.
  • the present application provides a computer storage medium, the computer-readable storage medium stores instructions, and when the instructions run on a computer, the computer executes the method described in any one of the above-mentioned first aspects.
  • the present application provides a computer program (or computer program product).
  • the computer program includes instructions.
  • when the instructions run on a computer, the computer executes the method described in any one of the above-mentioned first aspects.
  • This application provides an information display method for on-vehicle equipment, which is applied to the field of Internet of Vehicles, including: acquiring lane line information of the road where the first vehicle is located, where the lane lines are at least two lines on the road surface that are used to divide different lanes; and displaying, according to the lane line information, a virtual lane line consistent with the lane line.
  • This application can be applied to the automatic driving interface in a smart car, so that the driver can see the lane line type of the road being driven on from the automatic driving interface, which not only enriches the display content of the automatic driving interface but also improves driving safety.
  • FIG. 1 is a functional block diagram of an automatic driving device with automatic driving function provided by an embodiment of the application
  • FIG. 2 is a schematic structural diagram of an automatic driving system provided by an embodiment of the application.
  • FIG. 3a and FIG. 3b are schematic diagrams of an internal structure of a vehicle provided by an embodiment of the application.
  • FIG. 4a is a schematic flowchart of a method for displaying information of a vehicle-mounted device according to an embodiment of the application
  • FIG. 4b is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 5a is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 5b is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 5c is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • FIG. 5d is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 5e is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 5f is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • Fig. 6a is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • FIG. 6b is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 7a is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 7b is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 7c is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 8a is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 8b is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 8c is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 8d is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 8e is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 8f is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 9a is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 9b is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 9c is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 10 is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 11a is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 11b is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 11c is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 11d is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • Fig. 11e is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 11f is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 11g is a schematic diagram of an automatic driving interface provided in an embodiment of this application.
  • FIG. 11h is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • FIGS. 12a to 12d are schematic diagrams of an automatic driving interface provided in an embodiment of this application.
  • FIGS. 13a to 13c are schematic diagrams of an automatic driving interface provided in an embodiment of this application.
  • FIG. 14 is a schematic structural diagram of an information display device of an in-vehicle device according to an embodiment of the application.
  • the embodiments of the present application provide an information display method and device for in-vehicle equipment and a vehicle.
  • the vehicle described in this specification may be an internal combustion engine vehicle that uses an engine as a power source, a hybrid vehicle that uses an engine and an electric motor as a power source, an electric vehicle that uses an electric motor as a power source, and the like.
  • the vehicle may include an automatic driving device 100 with an automatic driving function.
  • FIG. 1 is a functional block diagram of an automatic driving device 100 with an automatic driving function provided by an embodiment of the present application.
  • the automatic driving device 100 is configured in a fully or partially automatic driving mode.
  • the automatic driving device 100 can control itself while in the automatic driving mode, and can determine, through human operation, the current state of the automatic driving device and its surrounding environment, determine the possible behavior of at least one other automatic driving device in the surrounding environment, determine the confidence level corresponding to the possibility of the other automatic driving device performing the possible behavior, and control the automatic driving device 100 based on the determined information.
  • the automatic driving device 100 can be set to operate without interacting with a human.
  • the autonomous driving device 100 may include various subsystems, such as a traveling system 102, a sensor system 104, a control system 106, one or more peripheral devices 108, and a power supply 110, a computer system 112, and a user interface 116.
  • the automatic driving device 100 may include more or fewer sub-systems, and each sub-system may include multiple elements.
  • each sub-system and element of the autonomous driving device 100 may be interconnected by wire or wirelessly.
  • the traveling system 102 may include components that provide power movement for the autonomous driving device 100.
  • the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121.
  • the engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of other types of engines, such as a hybrid engine composed of a gas oil engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air compression engine.
  • the engine 118 converts the energy source 119 into mechanical energy.
  • Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
  • the energy source 119 may also provide energy for other systems of the automatic driving device 100.
  • the transmission device 120 can transmit mechanical power from the engine 118 to the wheels 121.
  • the transmission device 120 may include a gearbox, a differential, and a drive shaft.
  • the transmission device 120 may also include other devices, such as a clutch.
  • the drive shaft may include one or more shafts that can be coupled to one or more wheels 121.
  • the sensor system 104 may include several sensors that sense information about the environment around the automatic driving device 100.
  • the sensor system 104 may include a positioning system 122 (the positioning system may be a global positioning system (GPS) system, a Beidou system or other positioning systems), an inertial measurement unit (IMU) 124, Radar 126, laser rangefinder 128, and camera 130.
  • the sensor system 104 may also include sensors that monitor the internal systems of the automatic driving device 100 (for example, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, direction, speed, etc.). Such detection and recognition are key functions for the safe operation of the automatic driving device 100.
  • the positioning system 122 may be used to estimate the geographic location of the automatic driving device 100.
  • the IMU 124 is used to sense changes in the position and orientation of the automatic driving device 100 based on inertial acceleration.
  • the IMU 124 may be a combination of an accelerometer and a gyroscope.
  • the radar 126 may use radio signals to sense objects in the surrounding environment of the automatic driving device 100. In some embodiments, in addition to sensing the object, the radar 126 may also be used to sense the speed and/or direction of the object.
  • the radar 126 may include an electromagnetic wave transmitting unit and a receiving unit.
  • the radar 126 can be implemented in a pulse radar mode or a continuous wave radar mode in the principle of radio wave transmission.
  • the radar 126 may be implemented as a frequency modulated continuous wave (FMCW) mode or a frequency shift keying (FSK) mode according to the signal waveform.
  • the radar 126 can use electromagnetic waves as a medium to detect an object based on a time of flight (TOF) method or a phase-shift method, and detect the position of the detected object, the distance to the detected object, and the relative speed.
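  • The TOF ranging principle mentioned here can be summarized numerically (a textbook sketch, not an implementation detail of this application): the electromagnetic wave travels to the object and back, so the one-way distance is half the round-trip path, and the relative speed follows from two consecutive range measurements.

```python
C_M_PER_S = 299_792_458.0  # propagation speed of the electromagnetic wave

def tof_distance_m(round_trip_s: float) -> float:
    # The measured time covers the path to the object and back,
    # so the distance to the object is half the total path length.
    return C_M_PER_S * round_trip_s / 2.0

def relative_speed_m_per_s(d1_m: float, d2_m: float, dt_s: float) -> float:
    # Relative speed estimated from two range measurements dt_s apart;
    # a negative value means the object is approaching.
    return (d2_m - d1_m) / dt_s
```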
  • the radar 126 may be arranged at an appropriate position outside the vehicle.
  • the lidar 126 can use laser as a medium to detect an object based on a TOF method or a phase shift method, and detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the lidar 126 may be arranged at an appropriate position outside the vehicle.
  • the laser rangefinder 128 can use laser light to sense objects in the environment where the automatic driving device 100 is located.
  • the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, as well as other system components.
  • the camera 130 may be used to capture multiple images of the surrounding environment of the automatic driving device 100.
  • the camera 130 may be a still camera or a video camera.
  • the camera 130 may be located at an appropriate position outside the vehicle.
  • In order to obtain an image of the front of the vehicle, the camera 130 may be arranged in the interior of the vehicle close to the front windshield.
  • the camera 130 may be arranged around the front bumper or the radiator grille.
  • In order to obtain an image of the rear of the vehicle, the camera 130 may be arranged in the interior of the vehicle close to the rear window glass.
  • the camera 130 may be arranged around the rear bumper, trunk or tailgate.
  • the camera 130 may be arranged close to at least one of the side windows in the interior of the vehicle.
  • the camera 130 may be arranged around a side mirror, a fender, or a car door.
  • the control system 106 controls the operation of the automatic driving device 100 and its components.
  • the control system 106 may include various components, including a steering system 132, a throttle 134, a braking unit 136, a sensor fusion algorithm 138, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
  • the steering system 132 is operable to adjust the forward direction of the automatic driving device 100.
  • In one embodiment, the steering system 132 may be a steering wheel system.
  • the throttle 134 is used to control the operating speed of the engine 118 and thereby control the speed of the automatic driving device 100.
  • the braking unit 136 is used to control the automatic driving device 100 to decelerate.
  • the braking unit 136 may use friction to slow down the wheels 121.
  • the braking unit 136 may convert the kinetic energy of the wheels 121 into electric current.
  • the braking unit 136 may also take other forms to slow down the rotation speed of the wheels 121 to control the speed of the automatic driving device 100.
  • the computer vision system 140 may be operable to process and analyze the images captured by the camera 130 in order to recognize objects and/or features in the surrounding environment of the autonomous driving device 100.
  • the objects and/or features may include traffic signals, road boundaries, and obstacles.
  • the computer vision system 140 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision technologies.
  • the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and so on.
  • the route control system 142 is used to determine the driving route of the automatic driving device 100.
  • the route control system 142 may combine data from the sensor fusion algorithm 138, the positioning system 122, and one or more predetermined maps to determine the driving route for the automatic driving device 100.
  • the obstacle avoidance system 144 is used to identify, evaluate and avoid or otherwise cross over potential obstacles in the environment of the automatic driving device 100.
  • the control system 106 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
  • the automatic driving device 100 interacts with external sensors, other automatic driving devices, other computer systems, or users through the peripheral device 108.
  • the peripheral device 108 may include a wireless communication system 146, an onboard computer 148, a microphone 150, and/or a speaker 152.
  • the peripheral device 108 provides a means for the user of the autonomous driving apparatus 100 to interact with the user interface 116.
  • the onboard computer 148 may provide information to the user of the automatic driving device 100.
  • the user interface 116 can also operate the onboard computer 148 to receive user input.
  • the on-board computer 148 can be operated through a touch screen.
  • the peripheral device 108 may provide a means for the autonomous driving device 100 to communicate with other devices located in the vehicle.
  • the microphone 150 may receive audio (eg, voice commands or other audio input) from the user of the autonomous driving device 100.
  • the speaker 152 may output audio to the user of the automatic driving device 100.
  • the wireless communication system 146 may wirelessly communicate with one or more devices directly or via a communication network.
  • the wireless communication system 146 may use 3G cellular communication, such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communication, such as long term evolution (LTE); or 5G cellular communication.
  • the wireless communication system 146 may use WiFi to communicate with a wireless local area network (WLAN).
  • the wireless communication system 146 may directly communicate with a device using an infrared link, Bluetooth, ZigBee, or other wireless protocols, such as various autonomous driving device communication systems.
  • the wireless communication system 146 may include one or more dedicated short-range communications (DSRC) devices, which may include public and/or private data communication between autonomous driving devices and/or roadside stations.
  • the power supply 110 may provide power to various components of the automatic driving device 100.
  • the power source 110 may be a rechargeable lithium ion or lead-acid battery.
  • One or more battery packs of such batteries may be configured as a power source to provide power to various components of the automatic driving device 100.
  • the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
  • the computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as the memory 114.
  • the computer system 112 may also be multiple computing devices that control individual components or subsystems of the automatic driving apparatus 100 in a distributed manner.
  • the processor 113 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application specific integrated circuit (ASIC) or other hardware-based processor.
  • Although FIG. 1 functionally illustrates the processor, memory, and other elements of the computer 110 in the same block, those of ordinary skill in the art should understand that the processor, computer, or memory may actually include multiple processors, computers, or memories that may or may not be stored in the same physical housing.
  • For example, the memory may be a hard disk drive or other storage medium located in a housing different from that of the computer 110. Therefore, a reference to a processor or computer will be understood to include a reference to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described here, some components, such as the steering component and the deceleration component, may each have their own processor that only performs calculations related to the function of that specific component.
  • the processor may be located far away from the automatic driving device and wirelessly communicate with the automatic driving device.
  • some of the processes described herein are executed on a processor arranged in the automatic driving device, while others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
  • the memory 114 may include instructions 115 (for example, program logic), and the instructions 115 may be executed by the processor 113 to perform various functions of the automatic driving device 100, including those described above.
  • the memory 114 may also contain additional instructions, including instructions for sending data to, receiving data from, interacting with, and/or controlling one or more of the traveling system 102, the sensor system 104, the control system 106, and the peripheral device 108.
  • the memory 114 may also store data, such as road maps, route information, the position, direction, and speed of the automatic driving device, and other such automatic driving device data, as well as other information. Such information may be used by the autonomous driving device 100 and the computer system 112 during the operation of the autonomous driving device 100 in autonomous, semi-autonomous, and/or manual modes.
  • the user interface 116 is used to provide information to or receive information from the user of the automatic driving device 100.
  • the user interface 116 may include one or more input/output devices in the set of peripheral devices 108, such as a wireless communication system 146, a car computer 148, a microphone 150, and a speaker 152.
  • the computer system 112 may control the functions of the automatic driving device 100 based on inputs received from various subsystems (for example, the traveling system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may utilize input from the control system 106 in order to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control of many aspects of the autonomous driving device 100 and its subsystems.
  • one or more of the aforementioned components may be installed or associated with the automatic driving device 100 separately.
  • the memory 114 may exist partially or completely separately from the automatic driving device 100.
  • the aforementioned components may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1 should not be construed as a limitation to the embodiments of the present application.
  • a self-driving car traveling on a road can recognize objects in its surrounding environment to determine the adjustment to the current speed.
  • the object may be other automatic driving devices, traffic control equipment, or other types of objects.
  • each recognized object can be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and distance from the automatic driving device, can be used to determine the speed to which the self-driving car is to adjust.
  • the self-driving car automatic driving device 100, or a computing device associated with the automatic driving device 100, may predict the behavior of the recognized object based on the characteristics of the recognized object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.).
  • optionally, the recognized objects depend on each other's behavior, so all of the recognized objects can also be considered together to predict the behavior of a single recognized object.
  • the automatic driving device 100 can adjust its speed based on the predicted behavior of the recognized object.
  • an autonomous vehicle can determine what stable state the autonomous driving device will need to adjust to (for example, accelerate, decelerate, or stop) based on the predicted behavior of the object.
  • other factors may also be considered to determine the speed of the automatic driving device 100, such as the lateral position of the automatic driving device 100 on the traveling road, the curvature of the road, the proximity of static and dynamic objects, and so on.
  • the computing device can also provide instructions to modify the steering angle of the self-driving device 100, so that the self-driving car follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects near the self-driving car (for example, a car in an adjacent lane on the road).
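The speed-selection logic described above (predicted gaps to recognized objects, road curvature, lateral margin) can be illustrated with a minimal sketch. Every function name and threshold below is an assumption for illustration, not the actual implementation in this application:

```python
def adjust_speed(current_speed, predicted_gaps, road_curvature, lateral_margin):
    """Pick a target speed (m/s) from predicted distances (m) to recognized
    objects and road geometry. All thresholds are illustrative."""
    # Slow down according to the closest predicted gap to any object.
    closest = min(predicted_gaps) if predicted_gaps else float("inf")
    if closest < 5.0:               # object dangerously close: stop
        return 0.0
    target = current_speed
    if closest < 30.0:              # object nearby: decelerate
        target = min(target, closest)  # rough "1 m/s per metre of gap" cap
    # Tight curves and small lateral margins also cap the speed.
    if road_curvature > 0.05:
        target = min(target, 40.0)
    if lateral_margin < 1.0:
        target = min(target, 30.0)
    return target
```

A fuller controller would also smooth the transition between targets rather than jumping to them, as described for the warning colors later in the text.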
  • the above-mentioned automatic driving device 100 can be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, playground vehicle, construction equipment, tram, golf cart, train, trolley, etc., which is not particularly limited in the embodiments of the present application.
  • FIG. 1 introduces a functional block diagram of the automatic driving device 100, and the automatic driving system 101 in the automatic driving device 100 is introduced below.
  • Fig. 2 is a schematic structural diagram of an automatic driving system provided by an embodiment of the application.
  • FIG. 1 and FIG. 2 describe the automatic driving device 100 from different angles.
  • the computer system 101 in FIG. 2 is the computer system 112 in FIG. 1.
  • the computer system 101 includes a processor 103, and the processor 103 is coupled with a system bus 105.
  • the processor 103 may be one or more processors, where each processor may include one or more processor cores.
  • the system bus 105 is coupled with an input/output (I/O) bus 113 through a bus bridge 111.
  • the I/O interface 115 is coupled to the I/O bus.
  • the I/O interface 115 communicates with various I/O devices, such as an input device 117 (such as a keyboard, a mouse, or a touch screen), a media tray 121 (such as a CD-ROM or a multimedia interface), and the like.
  • a transceiver 123, which can send and/or receive radio communication signals; a camera 155, which can capture static scenes and dynamic digital video images; and an external USB interface 125.
  • the interface connected to the I/O interface 115 may be a USB interface.
  • the processor 103 may be any conventional processor, including a reduced instruction set computing (“RISC”) processor, a complex instruction set computing (“CISC”) processor, or a combination of the foregoing.
  • the processor may be a dedicated device such as an application-specific integrated circuit ("ASIC").
  • the processor 103 may be a neural-network processing unit (NPU) or a combination of a neural-network processing unit and the foregoing traditional processors.
  • the processor 103 is mounted with a neural network processor.
  • the computer system 101 can communicate with a server 149 through a network interface 129.
  • the network interface 129 is a hardware network interface, such as a network card.
  • the network 127 may be an external network, such as the Internet, or an internal network, such as an Ethernet or a virtual private network (VPN).
  • the network 127 may also be a wireless network, such as a WiFi network, a cellular network, and so on.
  • the server 149 may be connected to a high-precision map server, and the vehicle can obtain high-precision map information through communication with the high-precision map server.
  • the server 149 may be a vehicle management server, and the vehicle management server may be used to process data uploaded by the vehicle, and may also send the data to the vehicle through the network.
  • the computer system 101 may perform wireless communication with other vehicles 160 (V2V) or pedestrians (V2P) via the network interface 129.
  • the hard disk drive interface is coupled to the system bus 105, and the hard disk drive interface is connected with the hard disk drive.
  • the system memory 135 is coupled to the system bus 105.
  • the data running in the system memory 135 may include the operating system 137 and application programs 143 of the computer system 101.
  • the operating system includes a shell (Shell) 139 and a kernel (kernel) 141.
  • the shell 139 is an interface between the user and the kernel of the operating system.
  • the shell 139 is the outermost layer of the operating system.
  • the shell 139 manages the interaction between the user and the operating system: waiting for the user's input, interpreting the user's input to the operating system, and processing the output results of various operating systems.
  • the kernel 141 is composed of those parts of the operating system that manage memory, files, peripherals, and system resources and that interact directly with the hardware. The operating system kernel usually runs processes and provides inter-process communication, CPU time-slice management, interrupt handling, memory management, I/O management, and so on.
  • the application programs 143 include programs related to automatic driving, such as programs that manage the interaction between the automatic driving device and obstacles on the road, programs that control the driving route or speed of the automatic driving device, and programs that control the interaction between the automatic driving device 100 and other automatic driving devices on the road.
  • the sensor 153 is associated with the computer system 101.
  • the sensor 153 is used to detect the environment around the computer system 101.
  • the sensor 153 can detect animals, cars, obstacles, and crosswalks.
  • the sensor can also detect the surrounding environment of the above-mentioned animals, cars, obstacles, and crosswalks, for example, the environment around an animal, such as other animals appearing around it, the weather conditions, the brightness of the surrounding environment, etc.
  • the sensor may be a camera, an infrared sensor, a chemical detector, a microphone, etc.
  • the sensor 153 senses information at preset intervals when activated and provides the sensed information to the computer system 101 in real time or near real time.
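The periodic sensing described in the bullet above can be sketched as a simple polling loop. The names `sensor_read` and `publish` are placeholders for the sensor 153 and the hand-off to the computer system 101, not interfaces defined in this application:

```python
import time

def sensing_loop(sensor_read, publish, interval_s=0.1, cycles=3):
    """Poll the sensor at a preset interval and forward each reading
    to the computer system in (near) real time."""
    readings = []
    for _ in range(cycles):
        data = sensor_read()   # sensor 153 senses the environment
        publish(data)          # hand the data to computer system 101
        readings.append(data)
        time.sleep(interval_s)
    return readings
```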
  • the computer system 101 is used to determine the driving state of the automatic driving device 100 according to the sensor data collected by the sensor 153, determine the driving operation to be performed by the automatic driving device 100 according to the driving state and the current driving task, and send a control command corresponding to the driving operation to the control system 106 (FIG. 1).
  • the driving state of the automatic driving device 100 may include the driving conditions of the automatic driving device 100 itself, such as its heading, speed, position, and acceleration, as well as the state of its surrounding environment, such as the locations of obstacles, the locations and speeds of other vehicles, the location of the crosswalk, the signals of traffic lights, etc.
  • the computer system 101 may include a task abstraction network and a shared policy network implemented by the processor 103.
  • the processor 103 determines the current automatic driving task; the processor 103 inputs at least one set of historical paths of the automatic driving task into the task abstraction network for feature extraction, and obtains a task feature vector that characterizes the automatic driving task; the processor 103 determines a state vector representing the current driving state of the automatic driving device according to the sensor data collected by the sensor 153; and the processor 103 inputs the task feature vector and the state vector into the shared policy network for processing, and obtains the driving operation currently to be performed by the automatic driving device.
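The task abstraction network / shared policy network pipeline above can be sketched with tiny placeholder "networks". The vector sizes, mean-pooling, and random weights below are illustrative stand-ins for the trained models described in the application, not the actual networks:

```python
import random

random.seed(0)

# Placeholder weights standing in for the two trained networks.
W_TASK = [[random.uniform(-1, 1) for _ in range(8)] for _ in range(16)]
W_POLICY = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(12)]

def matvec(vec, mat):
    """Multiply a row vector by a weight matrix (len(vec) rows x n_out cols)."""
    n_out = len(mat[0])
    return [sum(v * row[j] for v, row in zip(vec, mat)) for j in range(n_out)]

def task_feature(historical_paths):
    """Mean-pool the historical paths (each a 16-dim encoding) and pass the
    result through the 'task abstraction network' to get an 8-dim
    task feature vector."""
    pooled = [sum(col) / len(historical_paths) for col in zip(*historical_paths)]
    return matvec(pooled, W_TASK)

def driving_operation(task_vec, state_vec):
    """Concatenate the task feature vector and the state vector, run the
    'shared policy network', and return the index of the chosen driving
    operation (e.g. 0 = keep lane, 1 = decelerate, 2 = change lane)."""
    logits = matvec(task_vec + state_vec, W_POLICY)
    return logits.index(max(logits))
```

The point of the sketch is the data flow: historical paths → task feature vector, then (task feature vector, state vector) → driving operation, matching the bullet above.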
  • the computer system 101 may be located far away from the automatic driving device, and may perform wireless communication with the automatic driving device.
  • the transceiver 123 can send automatic driving tasks, sensor data collected by the sensor 153, and other data to the computer system 101; and can also receive control instructions sent by the computer system 101.
  • the automatic driving device can execute the control instructions from the computer system 101 received by the transceiver, and perform corresponding driving operations.
  • some of the processes described herein are executed on a processor provided in an autonomous vehicle, and others are executed by a remote processor, including taking actions required to perform a single manipulation.
  • the display adapter 107 can drive the display 109, and the display 109 is coupled to the system bus 105.
  • the display 109 may be used for the visual display, or the voice playback, of information input by the user or information provided to the user, as well as of the various menus of the in-vehicle equipment.
  • the display 109 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an electronic ink (e-ink) display.
  • the touch panel can cover the display 109.
  • when the touch panel detects a touch operation on or near it, the operation is sent to the processor to determine the type of the touch event, and the processor then provides a corresponding visual output on the display 109 according to the type of the touch event.
  • the touch panel and the display 109 may also be integrated to realize the input and output functions of the vehicle-mounted device.
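The touch-event dispatch just described can be sketched as follows. The event names, thresholds, and visual outputs are illustrative assumptions, not the application's actual event model:

```python
def handle_touch(x, y, duration_ms, moved):
    """Classify a raw touch into an event type, then return the visual
    output the display 109 should show for that event type."""
    if moved:
        event = "swipe"
    elif duration_ms >= 500:
        event = "long_press"
    else:
        event = "tap"
    # The processor picks the visual output on the display by event type.
    outputs = {
        "tap": f"activate control at ({x}, {y})",
        "long_press": f"open context menu at ({x}, {y})",
        "swipe": "scroll view",
    }
    return event, outputs[event]
```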
  • the display 109 may be implemented by a head-up display (HUD).
  • the display 109 may be provided with a projection module to output information through an image projected on a windshield or a vehicle window.
  • the display 109 may include a transparent display.
  • the transparent display can be attached to the windshield or car window.
  • the transparent display can display a predetermined screen with a predetermined transparency.
  • the transparent display may include one or more of a transparent thin film electroluminescent (TFEL) display, a transparent organic light-emitting diode (OLED) display, a transparent liquid crystal display (LCD), and a transmissive transparent light-emitting diode (LED) display. The transparency of the transparent display can be adjusted.
  • the display 109 may be configured in multiple areas inside the vehicle.
  • FIGS. 3a and 3b show the internal structure of the vehicle according to an embodiment of the present invention.
  • the display 109 can be configured in the areas 300 and 301 of the instrument panel, the area 302 of the seat 308, the area 303 of each pillar trim, the area 304 of the door, the area 305 of the center console, the area of the head lining, the area of the sun visor, or the area 306 of the windshield and the area 307 of the window. It should be noted that the above arrangement positions of the display 109 are only examples and do not constitute a limitation on the present application.
  • a human-computer interaction interface may be displayed on the display, for example, an automatic driving interface may be displayed when the vehicle is in automatic driving.
  • Fig. 4a is a schematic flowchart of a method for displaying information of a vehicle-mounted device according to an embodiment of the application. As shown in Fig. 4a, the method for displaying information of a vehicle-mounted device includes:
  • the lane line may be a driving lane, a side lane of the driving lane, or a lane where vehicles meeting vehicles travel.
  • the lane line may be a concept including lines forming the left and right sides of a lane. To put it another way, the lane line is at least two lines on the road surface used to divide different lanes.
  • the first vehicle may acquire an external image or video of the vehicle through its own camera or other shooting equipment, and send the acquired image or video to the processor, and the processor may obtain the lane line information contained in the external image or video through a recognition algorithm.
  • the first vehicle may obtain an external image or video of the vehicle through its own camera or other shooting equipment, and then upload the image or video to the vehicle management server; the vehicle management server processes the image or video and then issues the recognition result (the lane line information) to the first vehicle.
  • the first vehicle may also detect the surrounding environment of the vehicle body through a sensor (for example, radar or lidar) carried by itself, and obtain the information of the external lane line.
  • the first vehicle may also obtain lane line information of the road currently traveling on from the high-precision map server.
  • the first vehicle may also determine lane line-related information based on other data (for example, it may be based on the current driving speed, or historical driving data, etc.).
  • the above-mentioned lane line information may be the image information of the lane line.
  • when the vehicle is driving automatically, the automatic driving interface may be displayed on the above-mentioned display 109. Specifically, after the lane line information of the road where the first vehicle is located is acquired, a virtual lane line consistent with the lane line may be displayed on the automatic driving interface.
  • Fig. 4b is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the automatic driving interface includes: a first vehicle 401, a virtual lane line 402, and a virtual lane line 403.
  • the virtual lane line 402 is the lane line of the lane where the first vehicle is located
  • the virtual lane line 403 is not the virtual lane line corresponding to the lane line of the lane where the first vehicle 401 is located, but it is a virtual lane line corresponding to a lane line of the road where the first vehicle 401 is located.
  • the automatic driving interface may also only display the virtual lane line corresponding to the lane line of the lane where the first vehicle 401 is located (for example, the virtual lane line 402 shown in FIG. 4b).
  • the type of the virtual lane line displayed on the automatic driving interface may be the same as the type of the actual lane line, and specifically may be the same shape.
  • the lane line includes at least one of the following lane lines: a dashed line, a solid line, a double dashed line, a double solid line, and a dashed solid line.
  • the type of the virtual lane line displayed on the automatic driving interface may be consistent with the type of the actual lane line, and specifically may be the same in shape and color.
  • the lane line includes at least one of the following lane lines: a white dashed line, a white solid line, a yellow dashed line, a yellow solid line, a double white dashed line, a double yellow solid line, a yellow dashed solid line, and a double white solid line.
  • a double yellow solid line is used to separate traffic from opposite directions when it is drawn in the middle of a road section.
  • the yellow solid line, when drawn in the middle of a road section, is used to separate traffic flows from opposite directions or as a marking line for bus and school bus lanes; when drawn on the side of the road, it means that parking on the side of the road is prohibited.
  • the white solid line, when drawn in a road section, is used to separate motor vehicles and non-motor vehicles traveling in the same direction, or to indicate the edge of the roadway; when drawn at an intersection, it is used as a guide lane line or stop line, or to guide the trajectory of vehicles.
  • the yellow dashed-solid line is used to separate traffic flows from opposite directions when drawn in a road section; vehicles on the solid-line side are prohibited from crossing the line, while vehicles on the dashed-line side are allowed to cross it temporarily.
  • the lane lines can also include diversion lines and grid lines; a diversion line can be one or more white V-shaped or diagonal-line areas set according to the terrain of an intersection, used at intersections that are too wide, irregular, or have complicated driving conditions, at grade-separated ramps, or in other special places, where vehicles must drive on the prescribed route and must not drive on or over the line.
  • yellow grid lines indicate areas where stopping is prohibited; when marked as parking space markings, they indicate exclusive parking spaces. A vehicle can pass through the grid lines normally but cannot stop on them.
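The rendering of a virtual lane line whose shape and color match the detected lane line, as described above, can be sketched as a lookup from recognized type to draw style. The type keys and style fields are illustrative assumptions:

```python
# Map a recognized lane line type to the virtual lane line's draw style,
# so the on-screen line matches the real one in shape and color.
LINE_STYLES = {
    "white_dashed":        {"color": "white",  "pattern": "dashed", "strokes": 1},
    "white_solid":         {"color": "white",  "pattern": "solid",  "strokes": 1},
    "yellow_dashed":       {"color": "yellow", "pattern": "dashed", "strokes": 1},
    "yellow_solid":        {"color": "yellow", "pattern": "solid",  "strokes": 1},
    "double_white_dashed": {"color": "white",  "pattern": "dashed", "strokes": 2},
    "double_yellow_solid": {"color": "yellow", "pattern": "solid",  "strokes": 2},
}

def virtual_lane_style(detected_type):
    # Fall back to a plain white solid line for unrecognized types.
    return LINE_STYLES.get(detected_type, LINE_STYLES["white_solid"])
```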
  • the automatic driving interface may also include other display elements, such as the current driving speed of the first vehicle, the current speed limit of the road surface, other vehicles, etc., which are not limited by this application.
  • the "consistent" in this embodiment does not emphasize that the virtual lane line is exactly the same as the lane line on the road. There may always be some differences between the virtual lane line displayed on the computer display screen and the actual lane line.
  • the purpose of this application is to indicate the actual lane for the driver for reference.
  • the indication method is as close to the actual lane line as possible, but the color, shape, material, etc. of the line can be different from the actual lane line. Further, it is also possible to add and display other indication information on the basis of the virtual lane line.
  • a virtual lane line consistent with the lane line corresponding to the acquired lane line information is displayed in the automatic driving interface, so that the driver can see, from the automatic driving interface, a virtual lane line of the same type as the actual lane line on the road; this not only enriches the display content of the automatic driving interface but also improves driving safety.
  • the first vehicle may also obtain information of the non-motor vehicle object on the road surface, and display the identifier corresponding to the non-motor vehicle object according to the information of the non-motor vehicle object.
  • non-motor vehicle objects include at least road depressions, obstacles, and road water accumulation.
  • they may also include pedestrians, two-wheeled vehicles, traffic signals, street lights, trees and other plants, buildings, telephone poles, signal lights, bridges, hills, etc., which are not limited here.
  • the first vehicle may acquire an external image or video of the vehicle through its own camera or other shooting equipment, and send the acquired image or video to the processor, and the processor may obtain the information of the non-motor vehicle objects contained in the external image or video through a recognition algorithm.
  • the first vehicle may obtain an external image or video of the vehicle through its own camera or other shooting equipment, and then upload the image or video to the vehicle management server; the vehicle management server processes the image or video and then issues the recognition result (the information of the non-motor vehicle object) to the first vehicle.
  • the first vehicle may also detect the surrounding environment of the vehicle body through a sensor (for example, radar or lidar) carried by itself, and obtain information about external non-motor vehicle objects.
  • the identification corresponding to the non-motor vehicle object may be displayed on the automatic navigation interface.
  • the information of the non-motor vehicle object may include the position, shape, and size of the non-motor vehicle object.
  • the identification corresponding to the non-motor vehicle object can be displayed at the corresponding position of the non-motor vehicle object according to the shape and size of the non-motor vehicle object.
  • the identifier corresponding to the non-motor vehicle object may look the same as the non-motor vehicle object, or it may be a symbol that only indicates the shape and size of the non-motor vehicle object.
  • Fig. 5a is a schematic diagram of an automatic driving interface provided in an embodiment of the application. As shown in Fig. 5a, the automatic driving interface further includes: a non-motor vehicle object 501 (road depression).
  • Fig. 5b is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the automatic driving interface further includes: a non-motor vehicle object 501 (water accumulation on the road).
  • Fig. 5c is a schematic diagram of an automatic driving interface provided in an embodiment of the application. As shown in Fig. 5c, the automatic driving interface further includes: a non-motor vehicle object 501 (obstacle).
  • the lane change indication may also be displayed based on the non-motor vehicle object being located on the navigation path indicated by the navigation indication, where the navigation indication is used to indicate the navigation path of the first vehicle, and the lane change indication is used to indicate a travel path by which the first vehicle avoids the non-motor vehicle object.
  • navigation instructions may be displayed based on navigation information.
  • the navigation instructions are used to indicate the navigation path of the first vehicle.
  • a lane change instruction for instructing the first vehicle to avoid the non-motor vehicle object is displayed.
  • the first vehicle can acquire external images or videos of the vehicle through its own camera or other shooting equipment, and send the acquired images or videos to the processor for processing.
  • the processor can obtain the information of the non-motor vehicle object contained in the external image or video through the recognition algorithm.
  • the information of the non-motor vehicle object can include the size, shape and position of the non-motor vehicle object.
  • the processor can determine, according to the size, shape, and position of the non-motor vehicle object, whether the non-motor vehicle object is located on the current navigation path.
  • the first vehicle may obtain the external image or video of the vehicle through its own camera or other shooting equipment, and then upload the image or video to the vehicle management server; the vehicle management server processes the image or video and then sends the recognition result (whether the non-motor vehicle object is on the current navigation path, or whether the non-motor vehicle object will hinder the driving of the vehicle) to the first vehicle.
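The check of whether a detected object lies on the current navigation path can be sketched geometrically: treat the path as a polyline with a half-width, and test whether the object's footprint overlaps it. The 1.8 m half-width and the whole geometric model are illustrative assumptions:

```python
import math

def on_navigation_path(obj_pos, obj_radius, path_points, lane_half_width=1.8):
    """Return True if a detected object (center obj_pos, radius obj_radius)
    overlaps the navigation path, modelled as a polyline of (x, y) points."""

    def seg_dist(p, a, b):
        # Shortest distance from point p to the segment a-b.
        ax, ay, bx, by, px, py = *a, *b, *p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
        t = max(0.0, min(1.0, t))           # clamp to the segment
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

    return any(
        seg_dist(obj_pos, path_points[i], path_points[i + 1])
        <= lane_half_width + obj_radius
        for i in range(len(path_points) - 1)
    )
```

If this returns True for an obstacle, the interface would show the lane change indication described next; for a depression or water accumulation it could show only a warning.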
  • Fig. 5d is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • if a non-motor vehicle object 501 (an obstacle) is located on the navigation path indicated by the navigation indication 502, a lane change indication 503 for instructing the first vehicle to avoid the non-motor vehicle object is displayed.
  • the lane change indication 503 may be a belt-shaped path indication or a linear path indication, which is not limited here.
  • an obstacle is different from a road depression or road water accumulation: the first vehicle can drive directly over a road depression or water accumulation, whereas if there is an obstacle, the first vehicle needs to go around it.
  • the lane change instruction 503 for instructing the first vehicle to avoid the non-motor vehicle object can be displayed.
  • the lane change indication 503 can be displayed in a different color and/or a different shape from the current navigation indication.
  • when the first vehicle is bypassing the obstacle, the navigation indication 502 can be displayed as a curved indication (as shown in FIG. 5e), and after the first vehicle has bypassed the obstacle, the navigation indication 502 can be displayed as straight again (as shown in FIG. 5f).
  • a first warning prompt may be displayed based on the distance between the first vehicle and the non-motor vehicle object being a first distance, and a second warning prompt may be displayed based on the distance between the first vehicle and the non-motor vehicle object being a second distance, where the second warning prompt is different from the first warning prompt.
  • the first warning prompt and the second warning prompt differ in color or transparency.
  • the first vehicle may obtain the distance between the first vehicle and the non-motor vehicle object based on the distance sensor, and display an alarm prompt based on the distance between the first vehicle and the non-motor vehicle object.
  • the warning prompt may change between at least two colors according to the distance to the obstacle (the collision risk level); as the distance between the first vehicle and the obstacle increases or decreases, two adjacent colors transition smoothly.
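The smooth color transition between the two warning colors can be sketched as a linear interpolation over the measured distance. The red/yellow pair and the 5 m / 30 m thresholds are illustrative assumptions:

```python
def warning_color(distance, near=5.0, far=30.0):
    """Blend smoothly between red (close) and yellow (far) RGB colors as
    the distance (m) to the obstacle changes."""
    red, yellow = (255, 0, 0), (255, 215, 0)
    # Clamp to [0, 1], then interpolate channel by channel.
    t = max(0.0, min(1.0, (distance - near) / (far - near)))
    return tuple(round(r + (y - r) * t) for r, y in zip(red, yellow))
```

Transparency could be interpolated the same way to distinguish the first and second warning prompts mentioned above.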
  • the first vehicle may also receive a sharing instruction, where the sharing instruction carries the address of the second vehicle, and in response to the sharing instruction, send second sharing information to the second vehicle, where the second sharing information includes the location information of the non-motor vehicle object.
  • the first vehicle may also receive the first shared information sent by the server or the second vehicle.
  • the first shared information includes the location information of the non-motor vehicle object; based on the first vehicle having turned on navigation, an obstacle prompt is displayed on the navigation interface, and the obstacle prompt is used to indicate the non-motor vehicle object at the position corresponding to the position information.
  • the information may be reported to the vehicle management server, and the server issues it to vehicles whose navigation routes pass through the roads with depressions, water accumulation, or obstacles, so that these vehicles can learn this information in advance.
  • if the first vehicle obtains the information of the non-motor vehicle object through its sensors, the information (position, shape, size, etc.) of the non-motor vehicle object can be sent to another vehicle (the second vehicle).
  • the driver or a passenger can operate on the automatic driving interface (for example, trigger the sharing control on the display interface and enter the address of the second vehicle, or directly select a second vehicle that is connected to the first vehicle, etc.).
  • the first vehicle may receive a sharing instruction, where the sharing instruction carries the address of the second vehicle, and in response to the sharing instruction, send second sharing information to the second vehicle, where the second sharing information includes the location information of the non-motor vehicle object.
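The second sharing information sent to the second vehicle can be sketched as a small serialized message. Every field name here is an assumption for illustration; the application does not define a wire format:

```python
import json

def build_share_message(sender_id, second_vehicle_addr, obj):
    """Package the non-motor-vehicle-object info into a share message
    addressed to the second vehicle."""
    return json.dumps({
        "to": second_vehicle_addr,     # address carried by the sharing instruction
        "from": sender_id,
        "type": "hazard_share",
        "object": {
            "kind": obj["kind"],       # e.g. "road_depression"
            "lat": obj["lat"],
            "lon": obj["lon"],
            "size_m": obj.get("size_m"),
        },
    })
```

The receiving vehicle would parse this and, with navigation turned on, place the depression/obstacle prompt at the given coordinates, as in the driver A / driver B example below.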
  • Figure 6a is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • for example, driver A starts first, and on the path A passes there is a road depression; A can touch the display, tap the depression prompt, select the sharing control 601 "Send to a friend" (as shown in FIG. 6a), and select driver B (which is equivalent to entering the address of the second vehicle), so that driver B can receive the warning about the road depression in advance.
  • taking the first vehicle receiving shared information as an example, if the first vehicle receives first shared information sent by the server or the second vehicle, where the first shared information includes the location information of the non-motor vehicle object, and the first vehicle has started navigation, an obstacle prompt is displayed on the navigation interface, and the obstacle prompt is used to indicate the non-motor vehicle object at the position corresponding to the position information.
  • Figure 6b is a schematic diagram of an automatic driving interface provided in an embodiment of the application; the right figure in Figure 6b is a navigation interface that includes a navigation map, in which the thick solid line in the middle is the navigation route, the arrow is the position toward which the vehicle ahead is heading, and the black dot on the thick solid line is the road depression information collected by the vehicle management server or sent by another vehicle; accordingly, the depression prompt 602 is displayed on the navigation interface of the current first vehicle.
  • the first vehicle may also display different navigation instructions based on the driving speed.
  • the first vehicle may also obtain navigation information of the first vehicle, and display navigation instructions based on the navigation information, and the navigation instructions are used to indicate the navigation path of the first vehicle.
  • the navigation instruction includes a first navigation instruction or a second navigation instruction.
  • the first navigation instruction is displayed based on the stationary state of the first vehicle, and the second navigation instruction is displayed based on the driving state of the first vehicle; the first navigation instruction and the second navigation instruction are different.
  • the display color or transparency of the first navigation instruction and the second navigation instruction are different.
  • Fig. 7a is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the automatic driving interface includes a navigation instruction 701, which indicates the navigation route of the first vehicle. When the first vehicle determines that it is currently in the stationary state, or the driving speed is lower than the preset speed, the first navigation instruction 701 (as shown in FIG. 7b) is displayed; when the first vehicle determines that it is currently in the driving state, or the driving speed is higher than the preset speed, the second navigation instruction 701 (as shown in FIG. 7c) is displayed, wherein the color of the second navigation instruction 701 shown in FIG. 7c differs from that of the first navigation instruction 701 shown in FIG. 7b.
  • the driver or passenger can determine the current driving state of the vehicle based on the display of the navigation instructions in the navigation interface.
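As an illustration only, the state-dependent choice between the first and second navigation instruction described above could be sketched as follows; the preset speed threshold and the concrete color/transparency values are hypothetical assumptions, not taken from the embodiment:

```python
# Sketch: choose the navigation-instruction style from the vehicle state.
# PRESET_SPEED_KMH and the color/alpha values are illustrative assumptions.
PRESET_SPEED_KMH = 5.0

def pick_navigation_instruction(speed_kmh: float) -> dict:
    """Return a display style for navigation indication 701.

    At or below the preset speed (including stationary) the first
    navigation instruction is shown; above it, the second, with a
    different color/transparency, as in Figs. 7b and 7c.
    """
    if speed_kmh <= PRESET_SPEED_KMH:
        return {"kind": "first", "color": "#9ec9ff", "alpha": 0.6}
    return {"kind": "second", "color": "#1a6dff", "alpha": 1.0}
```

In this sketch, a driver or passenger could infer the driving state from which style is currently rendered, matching the behavior the embodiment describes.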
  • the first vehicle can also make the visual elements on the automatic navigation interface (virtual lane lines, road surfaces, navigation instructions, etc.) change in at least one of color, brightness, and material according to the current environment (weather, time information, etc.).
  • the navigation instruction includes a third navigation instruction or a fourth navigation instruction
  • the first vehicle may display the third navigation instruction based on the first vehicle being in a first environment, and display the fourth navigation instruction based on the first vehicle being in a second environment, wherein the first environment is different from the second environment, and the third navigation instruction is different from the fourth navigation instruction.
  • the first vehicle may display a first lane based on the first vehicle being in the first environment, and display a second lane based on the first vehicle being in the second environment, where the first lane and the second lane are the lanes in which the first vehicle travels, or lanes on the road where the first vehicle is located; the first environment is different from the second environment, and the first lane is different from the second lane.
  • the first environment includes at least one of the following: the weather environment where the first vehicle is located, the road environment where the first vehicle is located, the weather environment where the first vehicle's navigation destination is located, the road environment where the first vehicle's navigation destination is located, the traffic-jam environment on the road where the first vehicle is located, the traffic-jam environment where the first vehicle's navigation destination is located, or the brightness environment where the first vehicle is located.
  • the weather environment can be obtained through a network connection to a weather server.
  • the weather environment may include temperature, humidity, etc., as well as strong winds, heavy rains, and blizzards.
  • the brightness environment can be the brightness of the environment where the vehicle is currently located, and can represent the current time. For example, if the current time is morning, the colors of the virtual lane lines, road surfaces, navigation instructions, etc. are brighter than normal or become lighter; if the current time is night, their colors are darker than normal.
  • for example, in snowy weather, the materials of the virtual lane lines, road surfaces, navigation instructions, etc. can appear to be covered with snow.
  • the virtual lane lines, road surfaces, navigation instructions, and other visual elements can be displayed in an enhanced manner, for example with more vivid colors (increased purity), increased brightness, or enhanced materials.
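A minimal sketch of the environment-dependent styling of visual elements described above; the environment names, brightness factors, and material names are illustrative assumptions, not values from the embodiment:

```python
# Sketch: adjust the visual elements' color/brightness/material by environment.
# Environment names and numeric adjustment values are illustrative assumptions.
def style_visual_elements(weather: str, is_night: bool) -> dict:
    style = {"brightness": 1.0, "saturation": 1.0, "material": "default"}
    if is_night:                  # darker rendering at night
        style["brightness"] = 0.6
    else:                         # brighter/lighter colors in the morning
        style["brightness"] = 1.2
    if weather == "snow":         # snow-covered material, as in Fig. 8a
        style["material"] = "snow"
    elif weather in ("heavy_rain", "blizzard", "strong_wind"):
        style["saturation"] = 1.3  # enhanced, more vivid display
    return style
```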
  • Fig. 8a is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the road environment on which the first vehicle is driving is snowy.
  • on the automatic navigation interface, the road-surface material is rendered as covered by snow.
  • Fig. 8b is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the road environment on which the first vehicle is driving is desert.
  • on the automatic navigation interface, the road-surface material is rendered as desert.
  • the first vehicle may display a first lane based on the first vehicle being in the first environment, and display a second lane based on the first vehicle being in the second environment, where the first lane and the second lane are the lanes in which the first vehicle travels, or lanes on the road where the first vehicle is located; the first environment is different from the second environment, and the first lane is different from the second lane.
  • in this way, the driver or passenger can learn the current environment of the vehicle from the display of the automatic navigation interface, especially at night or in other low-brightness scenes, which improves driving safety.
  • the first vehicle may display the corresponding image in the automatic driving interface based on the geographic location of the navigation destination.
  • the first vehicle may obtain the geographic location of the navigation destination of the first vehicle, and display a first image based on that geographic location; the first image is used to indicate the type of geographic location where the first vehicle's navigation destination is located.
  • the type of geographic location may include at least one of the following types: city, mountain, plain, forest, or seaside.
  • the first vehicle can obtain the geographic location of its navigation destination through the GPS system, or through a high-definition map, and further obtain the attribute information (type) of that geographic location; for example, the geographic location of the first vehicle's navigation destination may belong to a city, a mountainous area, a plain, a forest, a seaside, etc.
  • the attribute information (type) of the geographic location can be obtained from the map system.
  • after the first vehicle obtains the geographic location of the navigation destination and the type of geographic location where the destination is located, it can identify the visual elements of the lane according to that type: a far-view picture (the first image) is presented at the end of the lane, or the material of the lane's visual elements is changed. The first image can also be displayed next to the speed indicator, overlapped with the speed indicator, or occupy the top of the entire display panel, and so on.
  • Fig. 8c is a schematic diagram of an automatic driving interface provided in an embodiment of the application. As shown in Fig. 8c, if the geographic location of the navigation destination of the first vehicle is at the seaside, a first image used to represent the seaside (which may include, for example, coconut trees and sea water) can be displayed on the automatic driving interface.
  • Fig. 8d is a schematic diagram of an automatic driving interface provided in an embodiment of the application. As shown in Fig. 8d, if the geographic location of the navigation destination of the first vehicle is in a mountainous area, a first image representing a mountain area (which may include, for example, a mountain) can be displayed on the automatic driving interface.
  • Fig. 8e is a schematic diagram of an automatic driving interface provided in an embodiment of the application. As shown in Fig. 8e, if the geographic location of the navigation destination of the first vehicle is in a forest, a first image used to represent the forest (which may include, for example, multiple trees) can be displayed on the automatic driving interface.
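The mapping from the destination's geographic-location type to the displayed first image (Figs. 8c to 8e) could be sketched as a simple lookup; the asset file names below are hypothetical placeholders, not names from the embodiment:

```python
# Sketch: map the destination's geographic-location type to a far-view image.
# The asset names are illustrative assumptions.
DESTINATION_IMAGES = {
    "city": "skyline.png",
    "mountain": "mountains.png",   # Fig. 8d
    "plain": "plain.png",
    "forest": "trees.png",         # Fig. 8e
    "seaside": "coconut_sea.png",  # Fig. 8c
}

def first_image_for_destination(geo_type):
    """Return the first image shown at the end of the lane, or None
    when the destination type has no associated far-view picture."""
    return DESTINATION_IMAGES.get(geo_type)
```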
  • the first vehicle may also detect a third vehicle, acquire the geographic location of the third vehicle navigation destination, and display the second image based on the geographic location of the third vehicle navigation destination, The second image is used to indicate the type of geographic location where the third vehicle navigation destination is located.
  • the type of the geographic location of the destination of the other vehicle may also be displayed on the automatic driving interface.
  • Fig. 8f is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle (the largest vehicle in the figure) can see that the vehicle in front and the vehicle on its left side are going to the same type of destination (forest) as itself, while the vehicle on the right is not: the vehicle in front and the left-side vehicle are identified by a special color and/or texture, or a second image (including a tree) is displayed around these vehicles.
  • the first vehicle may obtain the geographic location of the navigation destination of the first vehicle, and display a first image based on the geographic location; the first image is used to indicate the type of geographic location where the first vehicle's navigation destination is located.
  • the first vehicle can display the corresponding image in the automatic driving interface based on the geographic location of the navigation destination, which enriches the content of the automatic driving interface.
  • the first vehicle may display the intersection stop instruction on the automatic driving interface based on driving to the intersection stop area.
  • the first vehicle may detect that the first vehicle is driving to the intersection stop area, and display the intersection stop instruction 901.
  • the intersection stop area may be an area where the first vehicle travels to within a preset distance (for example, 20 m) from the red light intersection.
  • the first vehicle may determine that it has currently entered the intersection stop area based on captured images or camera data, or may determine this based on navigation information.
  • the first vehicle may obtain the status of the traffic light corresponding to the first vehicle at the current intersection, and display the first intersection stop instruction when the status of the traffic light is red or yellow.
  • Fig. 9a is a schematic diagram of an automatic driving interface provided in an embodiment of the application. As shown in Fig. 9a, when driving to an intersection stop area, the automatic driving interface displays an intersection stop line 901.
  • the navigation indication 701 can also be displayed; at the same time, the part of the navigation indication 701 that exceeds the intersection stop line is weakened. The weakening can consist of displaying only the outline of the navigation indication 701, or increasing the transparency of the navigation indication 701, etc., which is not limited here.
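A sketch of the intersection-stop-area check and the weakening of the navigation indication beyond the stop line; the 20 m preset distance follows the text, while the outline/transparency values are assumptions:

```python
# Sketch: detect entry into the intersection stop area by distance to the
# red-light intersection. The 20 m preset distance follows the text; the
# weakening (outline-only display / higher transparency) values are assumptions.
STOP_AREA_DISTANCE_M = 20.0

def in_stop_area(distance_to_intersection_m: float) -> bool:
    """True once the vehicle is within the preset distance of the intersection."""
    return distance_to_intersection_m <= STOP_AREA_DISTANCE_M

def nav_indication_style(beyond_stop_line: bool) -> dict:
    # The part of navigation indication 701 beyond the stop line is weakened.
    if beyond_stop_line:
        return {"outline_only": True, "alpha": 0.3}
    return {"outline_only": False, "alpha": 1.0}
```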
  • the intersection stop instruction includes: a first intersection stop instruction or a second intersection stop instruction. The first vehicle may display the first intersection stop instruction based on detecting that the front of the first vehicle does not exceed the intersection stop area, and display the second intersection stop instruction based on detecting that the front of the first vehicle exceeds the intersection stop area; the first intersection stop instruction is different from the second intersection stop instruction.
  • the display content of the first intersection stop instruction 901 can be changed.
  • the intersection stop instruction can be displayed in a weakened manner, for example by increasing its transparency, etc., which is not limited here.
  • Fig. 9b is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle can detect that the first vehicle is traveling to the intersection stop area, Correspondingly, an intersection stop instruction and a weakened navigation instruction 701 are displayed on the automatic driving interface.
  • Fig. 9c is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle can detect that it has driven beyond the intersection stop area (the front of the first vehicle exceeds the intersection stop area); correspondingly, the weakened intersection stop instruction is displayed on the autopilot interface, together with the enhanced navigation instruction 701 (displaying the complete outline of the navigation instruction 701, changing its color, or reducing its transparency).
  • the intersection stop instruction includes: a third intersection stop instruction or a fourth intersection stop instruction. The first vehicle may display the third intersection stop instruction based on detecting that the first vehicle is traveling to the intersection stop area and the traffic light corresponding to the intersection stop area is red or yellow, and display the fourth intersection stop instruction based on detecting that the first vehicle is traveling to the intersection stop area and the traffic light corresponding to the intersection stop area is green; the third intersection stop instruction is different from the fourth intersection stop instruction.
  • in addition to displaying the intersection stop instruction based on driving to the intersection stop area, the first vehicle also considers the traffic-light information of the current intersection. Specifically, when the first vehicle travels to the intersection stop area and the traffic light corresponding to the intersection stop area is red or yellow, the third intersection stop instruction is displayed; when the first vehicle travels to the intersection stop area and the traffic light corresponding to the intersection stop area is green, the fourth intersection stop instruction is displayed. For example, the fourth intersection stop instruction may be an enhanced third intersection stop instruction (with changed color, or reduced transparency).
  • the first vehicle may display a vehicle warning prompt on the automatic driving interface based on the distance between the nearby vehicle and the own vehicle.
  • the first vehicle may detect the fourth vehicle, and based on the distance between the fourth vehicle and the first vehicle being less than a preset distance, a vehicle warning prompt is displayed.
  • the vehicle warning prompt includes a first vehicle warning prompt or a second vehicle warning prompt
  • the first vehicle may display the first vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being a first distance, and display the second vehicle warning prompt based on that distance being a second distance.
  • the first vehicle warning prompt is different from the second vehicle warning prompt.
  • the first vehicle can obtain the distance between surrounding vehicles and the first vehicle based on its own distance sensor, and when it detects that the distance between a certain vehicle (the fourth vehicle) and the first vehicle is less than the preset distance, the vehicle warning prompt is displayed.
  • FIG. 10 is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle can detect the fourth vehicle 1001, and based on the distance between the fourth vehicle 1001 and the first vehicle being less than the preset distance, a vehicle warning prompt 1002 is displayed on the automatic driving interface.
  • the color of the warning prompt may be different, for example, it is displayed in red when it is very close, and it is displayed in yellow when it is relatively close.
  • the color change of the hazard warning graphic may be a gradual transition, instead of suddenly changing from red to yellow (or from yellow to red) when the corresponding threshold is crossed.
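The gradual red-to-yellow transition of the warning color could be implemented as a linear interpolation over distance; the distance bounds used here are illustrative assumptions:

```python
# Sketch: distance-based warning color with a gradual red-to-yellow
# transition instead of a hard switch at the threshold.
# The near/far distance bounds are illustrative assumptions.
def warning_color(distance_m: float, near_m: float = 2.0, far_m: float = 10.0) -> tuple:
    """Interpolate linearly from red (very close) to yellow (relatively close).

    Returns an (R, G, B) tuple; red is (255, 0, 0), yellow is (255, 255, 0),
    so only the green channel varies with distance.
    """
    t = (distance_m - near_m) / (far_m - near_m)
    t = max(0.0, min(1.0, t))          # clamp to [0, 1]
    return (255, round(255 * t), 0)
```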
  • the first vehicle may display a vehicle warning prompt on the automatic driving interface based on the distance between the nearby vehicle and the own vehicle. This enables the driver to know the risk of collision between the first vehicle and other vehicles through the warning prompt displayed on the automatic driving interface.
  • the first vehicle may change the display angle of view of the current automatic driving interface based on the change from the turning state to the straight state, or the change from the straight state to the turning state.
  • Fig. 11a is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle may display the first vehicle based on the first vehicle being in a straight state. area.
  • Fig. 11b is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle can be changed from the straight state to the left turning state based on the first vehicle.
  • a second area is displayed, wherein the scene area 1102 at the front left of the first vehicle's traveling direction included in the second area is larger than the front-left scene area 1101 included in the first area.
  • Fig. 11c is a schematic diagram of an automatic driving interface provided in an embodiment of the present application.
  • the first vehicle may display the third area based on the state of the first vehicle turning left.
  • Fig. 11d is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle can change from the left turning state to the straight traveling state based on the first vehicle.
  • a fourth area is displayed, wherein the scene area 1103 at the rear right of the first vehicle's traveling direction included in the third area is larger than the rear-right scene area 1104 included in the fourth area.
  • Fig. 11e is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle may display the fifth area based on the first vehicle being in a straight state.
  • FIG. 11f is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle can change from the straight state to the right turning state based on the first vehicle.
  • a sixth area is displayed, wherein the scene area 1106 at the front right of the first vehicle's traveling direction included in the sixth area is larger than the front-right scene area 1105 included in the fifth area.
  • FIG. 11g is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle may display the seventh area based on the first vehicle being in a right turning state.
  • Fig. 11h is a schematic diagram of an automatic driving interface provided in an embodiment of the application.
  • the first vehicle may change from the right turning state to the straight driving state based on the first vehicle.
  • an eighth area is displayed, wherein the scene area at the rear left of the first vehicle's traveling direction included in the seventh area is larger than the rear-left scene area included in the eighth area.
  • FIGS. 11a to 11h are only illustrations and do not constitute a limitation of the present application.
  • the first vehicle may change the viewing angle of the information displayed on the display according to the turning area of the intersection.
  • the turning state can be obtained by sensing whether the steering wheel is rotated to the left or right; or, when high-precision map navigation is enabled while driving, the navigation route can be used to determine whether the vehicle is approaching an intersection that requires a left or right turn; or, when the high-precision map is enabled but navigation is not used and the driver is driving, whether the vehicle is about to turn left or right can be determined by judging whether the vehicle has approached within a preset distance of the intersection and is driving in the left-turn lane or the right-turn lane.
  • the angle of view in this embodiment refers to the angle of view of the information displayed on the display.
  • a virtual camera can be used to track the position of the vehicle (the first vehicle), and the display presents the objects that can be seen in this camera's field of view. Changing the display angle of view means changing the relative position of the virtual camera and the vehicle (x-, y-, and z-axis coordinates, and the angle in each direction), and then presenting on the display the resulting change in the objects visible in the virtual camera's field of view.
  • the direction facing the front of the vehicle is the positive direction of the y-axis, and the traveling direction of the vehicle is the negative direction of the y-axis; facing the vehicle, the right-hand side of the vehicle is the positive direction of the x-axis, and the left-hand side is the negative direction of the x-axis.
  • the position of the virtual camera is above the z-axis, in the positive direction of the z-axis, and in the positive direction of the y-axis.
  • the viewing angle in this default state is referred to as the default viewing angle (referred to as the "default forward viewing angle" in the following embodiments).
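The virtual-camera model described above could be sketched as follows, with the axis convention taken from the text (+y toward the front of the vehicle, +x on the right-hand side, +z upward); the concrete offset values are assumptions, not from the embodiment:

```python
# Sketch: the default forward viewing angle as a virtual camera pose
# relative to the vehicle. Axis convention follows the text; the numeric
# offsets are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float  # lateral offset (m); +x is the vehicle's right-hand side
    y: float  # longitudinal offset (m); +y faces the front of the vehicle
    z: float  # height (m); +z is upward

DEFAULT_FORWARD_POSE = CameraPose(x=0.0, y=6.0, z=12.0)

def pose_for_turn(turn: str) -> CameraPose:
    """Shift the virtual camera so more of the turn-side scene is visible."""
    if turn == "left":    # reveal more of the left-front area (Fig. 11b)
        return CameraPose(x=-2.0, y=DEFAULT_FORWARD_POSE.y, z=DEFAULT_FORWARD_POSE.z)
    if turn == "right":   # reveal more of the right-front area (Fig. 11f)
        return CameraPose(x=2.0, y=DEFAULT_FORWARD_POSE.y, z=DEFAULT_FORWARD_POSE.z)
    return DEFAULT_FORWARD_POSE   # default forward viewing angle
```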
  • the current display angle of view can be changed so that the driver can see information about areas that may present safety risks, which improves driving safety.
  • the first vehicle may change the display angle of view of the current automatic driving interface based on the change of the driving speed.
  • the first vehicle may display the ninth area based on the first vehicle at the first traveling speed, and display the tenth area based on the first vehicle at the second traveling speed.
  • the ninth area and the tenth area are scene areas at the driving position of the first vehicle, the second driving speed is greater than the first driving speed, and the scene area contained in the ninth area is larger than the scene area contained in the tenth area.
  • Figures 12a to 12d are schematic diagrams of an automatic driving interface provided in an embodiment of the application. Figures 12a to 12d depict the case where the vehicle's speed is becoming lower and lower; it can be seen that, as the driving speed of the first vehicle decreases, the scene area at the driving position of the first vehicle shown in the automatic driving interface becomes smaller and smaller.
  • the faster the vehicle's driving speed, the higher the road viewing angle displayed on the automatic driving interface and, correspondingly, the larger the road display range (the scene area where the first vehicle is driving); the slower the vehicle's driving speed, the more road information (buildings, pedestrians, roadside traffic facilities on both sides of the lane, etc.) is displayed on the display panel.
  • Figures 12a to 12d describe the situation where the speed of the vehicle is getting lower and lower; it can be seen that the viewing angle also gets lower and lower. When the first vehicle is driving at high speed, the viewing angle is high (the position of the virtual camera has a large z-axis value), and at low speed the viewing angle is low (the position of the virtual camera has a small z-axis value). It should be noted that the speed values in Figs. 12a to 12d are merely illustrative and do not constitute a limitation of the present application.
  • when the vehicle speed is low, for example when driving on a city street, the driver pays more attention to the information around the car, such as details relevant to avoiding a collision. At this time, the viewing angle moves closer to the car itself, so that the driver can focus on the information he or she wants to attend to.
  • correspondingly, the lower the road viewing angle displayed on the autopilot interface, the smaller the road display range, and the more clearly road information (buildings, pedestrians, roadside traffic facilities on both sides of the lane, etc.) is displayed on the display panel.
  • the first vehicle may display the ninth area based on the first vehicle traveling at the first speed, and display the tenth area based on the first vehicle traveling at the second speed; the ninth area and the tenth area are scene areas where the first vehicle is traveling, the second speed is greater than the first speed, and the scene area contained in the ninth area is larger than that contained in the tenth area. In this way, a larger scene area can be displayed so that the driver can learn more road-surface information when the first vehicle is traveling faster, which improves driving safety.
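The speed-dependent viewing angle described above could be sketched as a linear mapping from driving speed to the virtual camera's z-axis value, so the camera sits higher (and shows a larger scene area) at higher speed; all numeric bounds are illustrative assumptions:

```python
# Sketch: camera height (z-axis value) grows with driving speed so a
# larger scene area is shown at high speed and a closer, lower view at
# low speed. The z/speed bounds are illustrative assumptions.
def camera_height(speed_kmh: float, z_min: float = 8.0, z_max: float = 30.0,
                  v_max: float = 120.0) -> float:
    """Linearly map speed in [0, v_max] km/h to a camera z in [z_min, z_max]."""
    t = max(0.0, min(1.0, speed_kmh / v_max))  # clamp speed ratio to [0, 1]
    return z_min + t * (z_max - z_min)
```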
  • the first vehicle may display a prompt that the side car is inserted into the current driving lane on the automatic driving interface.
  • the first vehicle may detect a fifth vehicle and, based on the fifth vehicle being located on a lane line of the lane in front of the first vehicle's traveling direction, display a third image corresponding to the fifth vehicle; the first vehicle may display a fourth image corresponding to the fifth vehicle based on the fifth vehicle having driven into the lane in front of the first vehicle's traveling direction, the third image being different from the fourth image.
  • the first vehicle when the first vehicle detects that a certain vehicle (fifth vehicle) is located on the lane line of the lane in front of the first vehicle traveling direction, it is determined that the fifth vehicle will overtake the first vehicle.
  • the first vehicle may also be determined based on that the fifth vehicle is located on the lane line of the lane in front of the first vehicle traveling direction, and the distance between the fifth vehicle and the first vehicle is less than a certain preset value The fifth vehicle will overtake the first vehicle.
  • the first vehicle may process captured images or camera data to determine that the fifth vehicle is located on the lane line of the lane in front of the first vehicle's driving direction; alternatively, the first vehicle may send the captured images or camera data to the server, the server determines that the fifth vehicle is located on the lane line of the lane in front of the first vehicle's driving direction, and the first vehicle receives the determination result sent by the server.
  • the fifth vehicle may be located behind the first vehicle (as shown in FIG. 13a). If the first vehicle detects that the fifth vehicle is overtaking, a special color mark (for example, white) can be used on the automatic driving interface to display the image corresponding to the fifth vehicle (such as the fifth vehicle 1301 shown in Figure 13b; at this time, the fifth vehicle 1302 is located on the lane line of the lane in front of the first vehicle's driving direction), indicating that the fifth vehicle is suppressing the speed of the first vehicle.
  • the display content of the fifth vehicle can be changed.
  • the first vehicle can display the fourth image corresponding to the fifth vehicle based on the fifth vehicle having driven into the lane in front of the first vehicle's traveling direction (the fifth vehicle 1301 shown in Figure 13c; at this time, the fifth vehicle 1302 is located in the lane in front of the first vehicle's driving direction, but not on the lane line), where the color and/or transparency of the fourth image may differ from the third image.
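The choice between the third and fourth image for the fifth vehicle could be sketched as follows; the state flags and image identifiers are hypothetical, chosen only to mirror the two cases in Figs. 13b and 13c:

```python
# Sketch: choose the third or fourth image for the fifth (cut-in) vehicle.
# State flags and image identifiers are illustrative assumptions.
def image_for_cut_in(on_lane_line: bool, in_front_lane: bool):
    """Return which image to render for the fifth vehicle, or None.

    on_lane_line:  the vehicle straddles the lane line ahead (Fig. 13b),
                   so the third image (special color mark) is shown.
    in_front_lane: the vehicle has fully entered the lane ahead (Fig. 13c),
                   so the fourth image (different color/transparency) is shown.
    """
    if on_lane_line:
        return "third_image_white_highlight"
    if in_front_lane:
        return "fourth_image_normal"
    return None
```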
  • FIG. 14 is a schematic structural diagram of an information display device for in-vehicle equipment according to an embodiment of the application. As shown in FIG. 14, the information display device includes:
  • the obtaining module 1401 is configured to obtain lane line information of the road where the first vehicle is located, where the lane lines are at least two lines on the road surface that are used to divide different lanes;
  • the display module 1402 is configured to display a virtual lane line consistent with the type of the lane line according to the information of the lane line.
  • the acquiring lane line information of the road where the first vehicle is located includes:
  • the lane line includes at least one of the following lane lines: a dashed line, a solid line, a double dashed line, a double solid line, and a dashed solid line.
  • the lane line includes at least one of the following lane lines: a white dashed line, a white solid line, a yellow dashed line, a yellow solid line, a double white dashed line, a double yellow solid line, a yellow dashed solid line, and a double white solid line .
  • the acquiring module 1401 is further configured to acquire information about non-motor vehicle objects on the road;
  • the display module 1402 is also used to display the non-motor vehicle object.
  • the device further includes:
  • a receiving module configured to receive a sharing instruction, the sharing instruction carrying the address of the second vehicle
  • the sending module is configured to send second shared information to the second vehicle in response to the sharing instruction, where the second shared information includes location information of the non-motor vehicle object.
  • the receiving module is further configured to receive first shared information sent by a server or a second vehicle, where the first shared information includes location information of non-motor vehicle objects;
  • the display module 1402 is further configured to start navigation based on the first vehicle, and display an obstacle prompt on a navigation interface, where the obstacle prompt is used to indicate a non-motor vehicle object at a location corresponding to the location information.
  • the non-motor vehicle object includes at least a road depression, an obstacle, and standing water on the road.
  • the display module 1402 is further configured to display a lane change indication based on the non-motor vehicle object being located on the navigation path indicated by the navigation indication, wherein the navigation indication is used to indicate the navigation of the first vehicle A path, the lane change indication is used to instruct the first vehicle to avoid a travel path of the non-motor vehicle object.
  • the display module 1402 is further configured to display a first warning prompt based on the distance between the first vehicle and the non-motor vehicle object being a first distance, and to display a second warning prompt based on the distance being a second distance, where the second warning prompt is different from the first warning prompt.
  • the color or transparency of the first alarm prompt and the second alarm prompt are different.
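As a rough illustration of the distance-dependent alarm styling described above, the following sketch maps the gap to a non-motor-vehicle object to a prompt color and transparency; the thresholds, function name, and concrete colors are illustrative assumptions, since the embodiment only requires that the two prompts differ in color or transparency:

```python
def warning_style(distance_m: float) -> dict:
    """Map the gap to a non-motor-vehicle object to an alarm style.

    Thresholds and colours are illustrative assumptions; the patent
    only requires that the first and second alarm prompts differ in
    colour or transparency.
    """
    if distance_m < 10.0:        # first (closer) distance -> stronger prompt
        return {"color": "red", "alpha": 1.0}
    if distance_m < 30.0:        # second distance -> weaker prompt
        return {"color": "yellow", "alpha": 0.6}
    return {}                    # far enough away: no prompt displayed
```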
  • the obtaining module 1401 is further configured to obtain navigation information of the first vehicle;
  • the display module 1402 is further configured to display navigation instructions based on the navigation information, and the navigation instructions are used to indicate the navigation path of the first vehicle.
  • the navigation instruction includes a first navigation instruction or a second navigation instruction, and the display module 1402 is specifically configured to display the first navigation instruction based on the first vehicle being in a stationary state, and to display the second navigation instruction based on the first vehicle being in a driving state, where the first navigation instruction is different from the second navigation instruction.
  • the display color or transparency of the first navigation instruction and the second navigation instruction are different.
  • the navigation instruction includes a third navigation instruction or a fourth navigation instruction, and the display module 1402 is specifically configured to display the third navigation instruction based on the first vehicle being in a first environment, and to display the fourth navigation instruction based on the first vehicle being in a second environment, where the first environment is different from the second environment and the third navigation instruction is different from the fourth navigation instruction.
  • the first environment includes at least one of the following environments: the weather environment where the first vehicle is located, the road environment where the first vehicle is located, the weather environment at the navigation destination of the first vehicle, the road environment at the navigation destination of the first vehicle, the traffic congestion on the road where the first vehicle is located, the traffic congestion at the navigation destination of the first vehicle, or the brightness of the environment where the first vehicle is located.
  • the display module 1402 is further configured to display a first area based on the first vehicle being in a straight-traveling state, and to display a second area based on the first vehicle changing from the straight-traveling state to a left-turn state, where the scene area to the front left of the first vehicle's traveling direction included in the second area is larger than the scene area to the front left included in the first area; or,
  • to display a third area based on the first vehicle being in a left-turn state, and to display a fourth area based on the first vehicle changing from the left-turn state to a straight-traveling state, where the scene area to the rear right of the first vehicle's traveling direction included in the third area is larger than the scene area to the rear right included in the fourth area; or,
  • to display a fifth area based on the first vehicle being in a straight-traveling state, and to display a sixth area based on the first vehicle changing from the straight-traveling state to a right-turn state, where the scene area to the front right of the first vehicle's traveling direction included in the fifth area is larger than the scene area to the front right included in the sixth area; or,
  • to display a seventh area based on the first vehicle being in a right-turn state, and to display an eighth area based on the first vehicle changing from the right-turn state to a straight-traveling state, where the scene area to the rear left of the first vehicle's traveling direction included in the seventh area is larger than the scene area to the rear left included in the eighth area.
  • the display module 1402 is further configured to display a ninth area based on the first vehicle traveling at a first traveling speed, and to display a tenth area based on the first vehicle traveling at a second traveling speed, where the ninth area and the tenth area are scene areas in which the first vehicle is traveling, the second traveling speed is greater than the first traveling speed, and the scene area included in the ninth area is larger than the scene area included in the tenth area.
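One way to realize a speed-dependent scene area is to grow the displayed radius with the traveling speed, following the embodiment's closing explanation that a faster-moving vehicle is shown a larger scene area so the driver sees more of the road; the function name, base radius, and gain below are illustrative assumptions, not values from the patent:

```python
def viewport_radius(speed_kmh: float,
                    base_radius_m: float = 50.0,
                    gain_m_per_kmh: float = 1.5) -> float:
    """Grow the displayed scene area with vehicle speed.

    base_radius_m and gain_m_per_kmh are illustrative assumptions; the
    embodiment only states that the scene area shown on the interface
    differs with traveling speed.
    """
    return base_radius_m + gain_m_per_kmh * max(0.0, speed_kmh)
```

A renderer would then clip the interface's scene to this radius around the first vehicle each frame.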
  • the acquiring module 1401 is further configured to acquire the geographic location of the first vehicle navigation destination;
  • the display module 1402 is further configured to display a first image based on the geographic location, and the first image is used to indicate the type of geographic location where the first vehicle navigation destination is located.
  • the detection module 1403 is also used to detect a third vehicle
  • the acquiring module 1401 is also used to acquire the geographic location of the third vehicle navigation destination;
  • the display module 1402 is further configured to display a second image based on the geographic location of the third vehicle navigation destination, where the second image is used to indicate the type of the geographic location of the third vehicle navigation destination.
  • the type of the geographic location includes at least one of the following types: city, mountain, plain, forest, or seaside.
  • the detection module 1403 is further configured to detect that the first vehicle has driven to the intersection stop area, and display the first intersection stop instruction.
  • the intersection stop instruction includes a first intersection stop instruction or a second intersection stop instruction, and the display module 1402 is further configured to: display the first intersection stop instruction based on detecting that the front of the first vehicle has not passed beyond the intersection stop area; and display the second intersection stop instruction based on detecting that the front of the first vehicle has passed beyond the intersection stop area, where the first intersection stop instruction is different from the second intersection stop instruction.
  • the intersection stop instruction includes a third intersection stop instruction or a fourth intersection stop instruction, and the display module 1402 is further configured to: display the third intersection stop instruction based on detecting that the first vehicle has driven into the intersection stop area and the traffic light corresponding to the intersection stop area is red or yellow; and display the fourth intersection stop instruction based on detecting that the first vehicle has driven into the intersection stop area and the traffic light corresponding to the intersection stop area is green, where the third intersection stop instruction is different from the fourth intersection stop instruction.
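The two pairs of intersection stop instructions can be combined into a single selection rule; a minimal sketch, in which the returned labels and the precedence given to the stop-line check are illustrative assumptions:

```python
def stop_indication(light: str, nose_over_line: bool) -> str:
    """Choose which intersection stop indication to display.

    Combines the two rules in the embodiment: the prompt differs when
    the vehicle's front crosses the stop area, and when the light is
    red/yellow versus green. Labels are illustrative stand-ins for the
    first-through-fourth intersection stop instructions.
    """
    if nose_over_line:
        return "stop-line-violation prompt"   # front passed beyond the stop area
    if light in ("red", "yellow"):
        return "wait prompt"                  # third intersection stop instruction
    return "proceed prompt"                   # fourth instruction (green light)
```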
  • the detection module 1403 is also used to detect the fourth vehicle
  • the display module 1402 is further configured to display a vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being less than a preset distance.
  • the vehicle warning prompt includes a first vehicle warning prompt or a second vehicle warning prompt, and the display module 1402 is further configured to display the first vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being a first distance, and to display the second vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being a second distance, where the first distance is different from the second distance and the first vehicle warning prompt is different from the second vehicle warning prompt.
  • the detection module 1403 is also used to detect the fifth vehicle
  • the display module 1402 is further configured to display a third image corresponding to the fifth vehicle based on the fifth vehicle being located on a lane line of the lane ahead of the first vehicle in its traveling direction, and to display a fourth image corresponding to the fifth vehicle based on the fifth vehicle having driven into the lane ahead of the first vehicle in its traveling direction, where the third image is different from the fourth image.
  • the present application also provides a vehicle, including a processor, a memory, and a display.
  • the processor is configured to acquire and execute the code in the memory to execute the information display method of the vehicle-mounted device described in any of the foregoing embodiments.
  • the vehicle may be a smart vehicle supporting an automatic driving function.
  • the device embodiments described above are only illustrative; the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units.
  • the physical unit can be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the connection relationship between the modules indicates that they have a communication connection between them, which can be specifically implemented as one or more communication buses or signal lines.
  • this application can be implemented by software plus the necessary general-purpose hardware, or by dedicated hardware, including dedicated integrated circuits, dedicated CPUs, dedicated memory, dedicated components, and so on. In general, any function completed by a computer program can easily be implemented with corresponding hardware, and the specific hardware structures used to achieve the same function can be diverse, such as analog circuits, digital circuits, or dedicated circuits. However, for this application, a software program implementation is the better implementation in most cases.
  • the technical solution of this application, essentially or the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a computer floppy disk, USB flash drive, removable hard disk, ROM, RAM, magnetic disk, or optical disk, and includes several instructions to make a computer device (which can be a personal computer, training device, or network device) execute the methods described in the embodiments of this application.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, training device, or data center to another website, computer, training device, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that a computer can store, or a data storage device, such as a training device or a data center, integrating one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).


Abstract

An information display method for a vehicle-mounted device, applied to the field of the Internet of Vehicles or autonomous driving, including: obtaining information about the lane lines of the road surface on which a first vehicle is located, where the lane lines are at least two lines on the road surface used to divide different lanes; and displaying, according to the information about the lane lines, virtual lane lines consistent with the type of the lane lines. The method can be applied to the autonomous driving interface of a smart vehicle, so that the driver can learn from the autonomous driving interface the lane-line type of the road currently being traveled, which enriches the display content of the autonomous driving interface.

Description

Information display method and apparatus for vehicle-mounted device, and vehicle
This application claims priority to Chinese Patent Application No. 201910912412.5, filed with the China National Intellectual Property Administration on September 25, 2019 and entitled "Information display method and apparatus for vehicle-mounted device, and vehicle", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of smart vehicles or autonomous driving, and in particular, to an information display method and apparatus for a vehicle-mounted device, and a vehicle.
Background
Autonomous driving technology relies on the cooperation of artificial intelligence, visual computing, radar, monitoring devices, and global positioning systems to allow motor vehicles to drive automatically without active human operation. Because autonomous driving does not require a human to drive the motor vehicle, it can in theory effectively avoid human driving errors, reduce traffic accidents, and improve the transport efficiency of roads. Autonomous driving technology has therefore attracted increasing attention.
During autonomous driving, a vehicle-mounted device inside the vehicle can display an autonomous driving interface, which can show the lane the vehicle is in and other vehicles near the vehicle. However, as road environments become increasingly complex, the display content of existing autonomous driving interfaces can no longer meet drivers' needs.
Summary
Embodiments of this application provide an information display method and apparatus for a vehicle-mounted device, and a vehicle, which enrich the display content of the autonomous driving interface.
According to a first aspect, this application provides an information display method for a vehicle-mounted device, including: obtaining information about the lane lines of the road surface on which a first vehicle is located, where the lane lines are at least two lines on the road surface used to divide different lanes; and displaying, according to the information about the lane lines, virtual lane lines consistent with the type of the lane lines.
In the embodiments of this application, by displaying in the autonomous driving interface virtual lane lines consistent with the lane lines corresponding to the obtained lane-line information, the driver can see from the autonomous driving interface virtual lane lines of the actual lane-line type of the road currently being traveled, which both enriches the display content of the autonomous driving interface and improves driving safety.
It should be noted that "consistent" here does not mean that the virtual lane lines are exactly identical to the lane lines on the road surface; there may always be some differences between the virtual lane lines displayed on a computer screen and the actual lane lines. The purpose of this application is to indicate the actual lane to the driver for reference, in a manner as close as possible to the actual lane lines, but the rendered color, form, material, and so on of the lines may differ from the actual lane lines. Further, other indication information may be displayed on top of the virtual lane lines.
Optionally, in an optional design of the first aspect, the obtaining information about the lane lines of the road surface on which the first vehicle is located includes: obtaining information about the lane lines of the lane in which the first vehicle is located.
Optionally, in an optional design of the first aspect, the lane lines include at least one of the following types: a dashed line, a solid line, a double dashed line, a double solid line, and a dashed-solid line. It should be noted that the type of the virtual lane lines displayed on the autonomous driving interface may be consistent with the type of the actual lane lines, for example, consistent in shape.
Optionally, in an optional design of the first aspect, the lane lines include at least one of the following: a white dashed line, a white solid line, a yellow dashed line, a yellow solid line, a double white dashed line, a double yellow solid line, a yellow dashed-solid line, and a double white solid line. It should be noted that the type of the virtual lane lines displayed on the autonomous driving interface may be consistent with the actual lane lines in both shape and color.
Optionally, in an optional design of the first aspect, the method further includes: obtaining information about a non-motor-vehicle object on the road surface; and displaying the non-motor-vehicle object according to the information about the non-motor-vehicle object.
Optionally, in an optional design of the first aspect, the method further includes: receiving a sharing instruction, where the sharing instruction carries the address of a second vehicle; and sending, in response to the sharing instruction, second shared information to the second vehicle, where the second shared information includes position information of the non-motor-vehicle object.
Optionally, in an optional design of the first aspect, the method further includes: receiving first shared information sent by a server or a second vehicle, where the first shared information includes position information of a non-motor-vehicle object; and, based on the first vehicle having started navigation, displaying an obstacle prompt on the navigation interface, where the obstacle prompt is used to indicate the non-motor-vehicle object at the position corresponding to the position information.
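The sharing flow above (a share instruction carrying the second vehicle's address, and shared information carrying the obstacle's position) can be sketched as a small message builder; the class, field names, and message layout are illustrative assumptions, since the patent only requires that the shared information include the non-motor-vehicle object's position:

```python
from dataclasses import dataclass

@dataclass
class SharedObstacle:
    """Payload of the shared information exchanged between vehicles.

    Field names are illustrative; the embodiment only requires that the
    shared information carry the non-motor-vehicle object's position.
    """
    kind: str      # e.g. "pothole", "standing_water", "obstacle"
    lat: float
    lon: float

def build_share_message(dest_vehicle_addr: str, obstacle: SharedObstacle) -> dict:
    """Build the second shared information sent in response to a share
    instruction that carries the second vehicle's address."""
    return {"to": dest_vehicle_addr,
            "obstacle": {"kind": obstacle.kind,
                         "lat": obstacle.lat,
                         "lon": obstacle.lon}}
```

The receiving vehicle (or server) would place an obstacle prompt on its navigation interface at the carried position.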
Optionally, in an optional design of the first aspect, the non-motor-vehicle object includes at least a road depression, an obstacle, and standing water on the road.
Optionally, in an optional design of the first aspect, the method further includes: displaying a lane-change indication based on the non-motor-vehicle object being located on the navigation path indicated by a navigation indication, where the navigation indication is used to indicate the navigation path of the first vehicle, and the lane-change indication is used to indicate a travel path by which the first vehicle avoids the non-motor-vehicle object.
Optionally, in an optional design of the first aspect, the method further includes: displaying a first warning prompt based on the distance between the first vehicle and the non-motor-vehicle object being a first distance; and displaying a second warning prompt based on the distance between the first vehicle and the non-motor-vehicle object being a second distance, where the second warning prompt is different from the first warning prompt.
Optionally, in an optional design of the first aspect, the first warning prompt and the second warning prompt differ in color or transparency.
Optionally, in an optional design of the first aspect, the method further includes: obtaining navigation information of the first vehicle; and displaying a navigation indication based on the navigation information, where the navigation indication is used to indicate the navigation path of the first vehicle.
Optionally, in an optional design of the first aspect, the navigation indication includes a first navigation indication or a second navigation indication, and the displaying a navigation indication based on the navigation information includes: displaying the first navigation indication based on the first vehicle being in a stationary state; and displaying the second navigation indication based on the first vehicle being in a driving state, where the first navigation indication is different from the second navigation indication.
Optionally, in an optional design of the first aspect, the first navigation indication and the second navigation indication differ in display color or transparency.
In the embodiments of this application, by displaying different navigation indications based on the driving state of the first vehicle, the driver or passenger can determine the current driving state of the vehicle from the navigation indication displayed in the navigation interface.
Optionally, in an optional design of the first aspect, the navigation indication includes a third navigation indication or a fourth navigation indication, and the displaying a navigation indication based on the navigation information includes: displaying the third navigation indication based on the first vehicle being in a first environment; and displaying the fourth navigation indication based on the first vehicle being in a second environment, where the first environment is different from the second environment, and the third navigation indication is different from the fourth navigation indication.
Optionally, in an optional design of the first aspect, the first environment includes at least one of the following environments: the weather environment where the first vehicle is located, the road environment where the first vehicle is located, the weather environment at the navigation destination of the first vehicle, the road environment at the navigation destination of the first vehicle, the traffic congestion on the road where the first vehicle is located, the traffic congestion at the navigation destination of the first vehicle, or the brightness of the environment where the first vehicle is located.
In the embodiments of this application, the first vehicle may display a first road surface based on the first vehicle being in the first environment, and display a second lane based on the first vehicle being in the second environment, where the first lane and the second lane are lanes in which the first vehicle travels, or lanes of the road surface on which the first vehicle is located; the first environment is different from the second environment, and the first lane is different from the second lane. The driver or passenger can learn the vehicle's current environment from the display of the automated navigation interface; especially at night or in other low-brightness scenarios, the driver or passenger can learn the current environment from the display, which improves driving safety.
Optionally, in an optional design of the first aspect, the method further includes: displaying a first area based on the first vehicle being in a straight-traveling state; and displaying a second area based on the first vehicle changing from the straight-traveling state to a left-turn state, where the scene area to the front left of the first vehicle's traveling direction included in the second area is larger than the scene area to the front left included in the first area.
Optionally, in an optional design of the first aspect, the method further includes: displaying a third area based on the first vehicle being in a left-turn state; and displaying a fourth area based on the first vehicle changing from the left-turn state to a straight-traveling state, where the scene area to the rear right of the first vehicle's traveling direction included in the third area is larger than the scene area to the rear right included in the fourth area.
Optionally, in an optional design of the first aspect, the method further includes: displaying a fifth area based on the first vehicle being in a straight-traveling state; and displaying a sixth area based on the first vehicle changing from the straight-traveling state to a right-turn state, where the scene area to the front right of the first vehicle's traveling direction included in the fifth area is larger than the scene area to the front right included in the sixth area.
Optionally, in an optional design of the first aspect, the method further includes: displaying a seventh area based on the first vehicle being in a right-turn state; and displaying an eighth area based on the first vehicle changing from the right-turn state to a straight-traveling state, where the scene area to the rear left of the first vehicle's traveling direction included in the seventh area is larger than the scene area to the rear left included in the eighth area.
In the embodiments of this application, when the first vehicle changes from a turning state to a straight-traveling state, or from a straight-traveling state to a turning state, the current display perspective can be changed, so that the driver can learn information about areas that may pose safety risks when turning, which improves driving safety.
Optionally, in an optional design of the first aspect, the method further includes: displaying a ninth area based on the first vehicle traveling at a first traveling speed; and displaying a tenth area based on the first vehicle traveling at a second traveling speed, where the ninth area and the tenth area are scene areas in which the traveling position of the first vehicle is located, the second traveling speed is greater than the first traveling speed, and the scene area included in the ninth area is larger than the scene area included in the tenth area.
In the embodiments of this application, the first vehicle may display the ninth area based on traveling at the first traveling speed and display the tenth area based on traveling at the second traveling speed, where the ninth area and the tenth area are scene areas in which the traveling position of the first vehicle is located. In this way, when the first vehicle travels faster, a larger scene area can be displayed, so that the driver can learn more road information when traveling at a higher speed, which improves driving safety.
Optionally, in an optional design of the first aspect, the method further includes: obtaining the geographic location of the navigation destination of the first vehicle; and displaying a first image based on the geographic location, where the first image is used to indicate the type of the geographic location of the navigation destination of the first vehicle.
Optionally, in an optional design of the first aspect, the method further includes: detecting a third vehicle; obtaining the geographic location of the navigation destination of the third vehicle; and displaying a second image based on the geographic location of the navigation destination of the third vehicle, where the second image is used to indicate the type of the geographic location of the navigation destination of the third vehicle.
Optionally, in an optional design of the first aspect, the type of the geographic location includes at least one of the following types: city, mountain, plain, forest, or seaside.
In the embodiments of this application, the first vehicle may obtain the geographic location of its navigation destination and display, based on the geographic location, a first image used to indicate the type of the geographic location of the navigation destination. The first vehicle can display a corresponding image in the autonomous driving interface based on the geographic location of the navigation destination, which enriches the content of the autonomous driving interface.
Optionally, in an optional design of the first aspect, the method further includes: detecting that the first vehicle has driven into an intersection stop area, and displaying a first intersection stop indication.
Optionally, in an optional design of the first aspect, the intersection stop indication includes a first intersection stop indication or a second intersection stop indication, and the detecting that the first vehicle has driven into the intersection stop area and displaying an intersection stop indication includes: displaying the first intersection stop indication based on detecting that the front of the first vehicle has not passed beyond the intersection stop area; and displaying the second intersection stop indication based on detecting that the front of the first vehicle has passed beyond the intersection stop area, where the first intersection stop indication is different from the second intersection stop indication.
Optionally, in an optional design of the first aspect, the intersection stop indication includes a third intersection stop indication or a fourth intersection stop indication, and the detecting that the first vehicle has driven into the intersection stop area and displaying an intersection stop indication includes: displaying the third intersection stop indication based on detecting that the first vehicle has driven into the intersection stop area and the traffic light corresponding to the intersection stop area being red or yellow; and displaying the fourth intersection stop indication based on detecting that the first vehicle has driven into the intersection stop area and the traffic light corresponding to the intersection stop area being green, where the third intersection stop indication is different from the fourth intersection stop indication.
Optionally, in an optional design of the first aspect, the method further includes: detecting a fourth vehicle; and displaying a vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being less than a preset distance.
Optionally, in an optional design of the first aspect, the vehicle warning prompt includes a first vehicle warning prompt or a second vehicle warning prompt, and the displaying a vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being less than the preset distance includes: displaying the first vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being a first distance; and displaying the second vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being a second distance, where the first distance is different from the second distance, and the first vehicle warning prompt is different from the second vehicle warning prompt.
In the embodiments of this application, the first vehicle can display a vehicle warning prompt on the autonomous driving interface based on the distance between a nearby vehicle and the first vehicle, so that the driver can learn of the collision risk between the first vehicle and other vehicles from the warning prompt displayed on the autonomous driving interface.
Optionally, in an optional design of the first aspect, the method further includes: detecting a fifth vehicle; displaying a third image corresponding to the fifth vehicle based on the fifth vehicle being located on a lane line of the lane ahead of the first vehicle in its traveling direction; and displaying a fourth image corresponding to the fifth vehicle based on the fifth vehicle having driven into the lane ahead of the first vehicle in its traveling direction, where the third image is different from the fourth image.
According to a second aspect, this application provides an information display apparatus for a vehicle-mounted device, including: an obtaining module, configured to obtain information about the lane lines of the road surface on which a first vehicle is located, where the lane lines are at least two lines on the road surface used to divide different lanes; and a display module, configured to display, according to the information about the lane lines, virtual lane lines consistent with the type of the lane lines.
Optionally, in an optional design of the second aspect, the obtaining information about the lane lines of the road surface on which the first vehicle is located includes: obtaining information about the lane lines of the lane in which the first vehicle is located.
Optionally, in an optional design of the second aspect, the lane lines include at least one of the following types: a dashed line, a solid line, a double dashed line, a double solid line, and a dashed-solid line.
Optionally, in an optional design of the second aspect, the lane lines include at least one of the following: a white dashed line, a white solid line, a yellow dashed line, a yellow solid line, a double white dashed line, a double yellow solid line, a yellow dashed-solid line, and a double white solid line.
Optionally, in an optional design of the second aspect, the obtaining module is further configured to obtain information about a non-motor-vehicle object on the road surface; and the display module is further configured to display the non-motor-vehicle object.
Optionally, in an optional design of the second aspect, the apparatus further includes: a receiving module, configured to receive a sharing instruction, where the sharing instruction carries the address of a second vehicle; and a sending module, configured to send, in response to the sharing instruction, second shared information to the second vehicle, where the second shared information includes position information of the non-motor-vehicle object.
Optionally, in an optional design of the second aspect, the receiving module is further configured to receive first shared information sent by a server or a second vehicle, where the first shared information includes position information of a non-motor-vehicle object; and the display module is further configured to display, based on the first vehicle having started navigation, an obstacle prompt on the navigation interface, where the obstacle prompt is used to indicate the non-motor-vehicle object at the position corresponding to the position information.
Optionally, in an optional design of the second aspect, the non-motor-vehicle object includes at least a road depression, an obstacle, and standing water on the road.
Optionally, in an optional design of the second aspect, the display module is further configured to display a lane-change indication based on the non-motor-vehicle object being located on the navigation path indicated by a navigation indication, where the navigation indication is used to indicate the navigation path of the first vehicle, and the lane-change indication is used to indicate a travel path by which the first vehicle avoids the non-motor-vehicle object.
Optionally, in an optional design of the second aspect, the display module is further configured to display a first warning prompt based on the distance between the first vehicle and the non-motor-vehicle object being a first distance; and display a second warning prompt based on the distance between the first vehicle and the non-motor-vehicle object being a second distance, where the second warning prompt is different from the first warning prompt.
Optionally, in an optional design of the second aspect, the first warning prompt and the second warning prompt differ in color or transparency.
Optionally, in an optional design of the second aspect, the obtaining module is further configured to obtain navigation information of the first vehicle; and the display module is further configured to display a navigation indication based on the navigation information, where the navigation indication is used to indicate the navigation path of the first vehicle.
Optionally, in an optional design of the second aspect, the navigation indication includes a first navigation indication or a second navigation indication, and the display module is specifically configured to display the first navigation indication based on the first vehicle being in a stationary state; and display the second navigation indication based on the first vehicle being in a driving state, where the first navigation indication is different from the second navigation indication.
Optionally, in an optional design of the second aspect, the first navigation indication and the second navigation indication differ in display color or transparency.
Optionally, in an optional design of the second aspect, the navigation indication includes a third navigation indication or a fourth navigation indication, and the display module is specifically configured to display the third navigation indication based on the first vehicle being in a first environment; and display the fourth navigation indication based on the first vehicle being in a second environment, where the first environment is different from the second environment, and the third navigation indication is different from the fourth navigation indication.
Optionally, in an optional design of the second aspect, the first environment includes at least one of the following environments: the weather environment where the first vehicle is located, the road environment where the first vehicle is located, the weather environment at the navigation destination of the first vehicle, the road environment at the navigation destination of the first vehicle, the traffic congestion on the road where the first vehicle is located, the traffic congestion at the navigation destination of the first vehicle, or the brightness of the environment where the first vehicle is located.
Optionally, in an optional design of the second aspect, the display module is further configured to display a first area based on the first vehicle being in a straight-traveling state; and display a second area based on the first vehicle changing from the straight-traveling state to a left-turn state, where the scene area to the front left of the first vehicle's traveling direction included in the second area is larger than the scene area to the front left included in the first area.
Optionally, in an optional design of the second aspect, the display module is further configured to display a third area based on the first vehicle being in a left-turn state; and display a fourth area based on the first vehicle changing from the left-turn state to a straight-traveling state, where the scene area to the rear right of the first vehicle's traveling direction included in the third area is larger than the scene area to the rear right included in the fourth area.
Optionally, in an optional design of the second aspect, the display module is further configured to display a fifth area based on the first vehicle being in a straight-traveling state; and display a sixth area based on the first vehicle changing from the straight-traveling state to a right-turn state, where the scene area to the front right of the first vehicle's traveling direction included in the fifth area is larger than the scene area to the front right included in the sixth area.
Optionally, in an optional design of the second aspect, the display module is further configured to display a seventh area based on the first vehicle being in a right-turn state; and display an eighth area based on the first vehicle changing from the right-turn state to a straight-traveling state, where the scene area to the rear left of the first vehicle's traveling direction included in the seventh area is larger than the scene area to the rear left included in the eighth area.
Optionally, in an optional design of the second aspect, the display module is further configured to display a ninth area based on the first vehicle traveling at a first traveling speed; and display a tenth area based on the first vehicle traveling at a second traveling speed, where the ninth area and the tenth area are scene areas in which the traveling position of the first vehicle is located, the second traveling speed is greater than the first traveling speed, and the scene area included in the ninth area is larger than the scene area included in the tenth area.
Optionally, in an optional design of the second aspect, the obtaining module is further configured to obtain the geographic location of the navigation destination of the first vehicle; and the display module is further configured to display a first image based on the geographic location, where the first image is used to indicate the type of the geographic location of the navigation destination of the first vehicle.
Optionally, in an optional design of the second aspect, the detection module is further configured to detect a third vehicle; the obtaining module is further configured to obtain the geographic location of the navigation destination of the third vehicle; and the display module is further configured to display a second image based on the geographic location of the navigation destination of the third vehicle, where the second image is used to indicate the type of the geographic location of the navigation destination of the third vehicle.
Optionally, in an optional design of the second aspect, the type of the geographic location includes at least one of the following types: city, mountain, plain, forest, or seaside.
Optionally, in an optional design of the second aspect, the detection module is further configured to detect that the first vehicle has driven into an intersection stop area, and a first intersection stop indication is displayed.
Optionally, in an optional design of the second aspect, the intersection stop indication includes a first intersection stop indication or a second intersection stop indication, and the display module is further configured to: display the first intersection stop indication based on the detection module detecting that the front of the first vehicle has not passed beyond the intersection stop area; and display the second intersection stop indication based on the detection module detecting that the front of the first vehicle has passed beyond the intersection stop area, where the first intersection stop indication is different from the second intersection stop indication.
Optionally, in an optional design of the second aspect, the intersection stop indication includes a third intersection stop indication or a fourth intersection stop indication, and the display module is further configured to: display the third intersection stop indication based on the detection module detecting that the first vehicle has driven into the intersection stop area and the traffic light corresponding to the intersection stop area being red or yellow; and display the fourth intersection stop indication based on the detection module detecting that the first vehicle has driven into the intersection stop area and the traffic light corresponding to the intersection stop area being green, where the third intersection stop indication is different from the fourth intersection stop indication.
Optionally, in an optional design of the second aspect, the detection module is further configured to detect a fourth vehicle; and the display module is further configured to display a vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being less than a preset distance.
Optionally, in an optional design of the second aspect, the vehicle warning prompt includes a first vehicle warning prompt or a second vehicle warning prompt, and the display module is further configured to display the first vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being a first distance; and display the second vehicle warning prompt based on the distance between the fourth vehicle and the first vehicle being a second distance, where the first distance is different from the second distance, and the first vehicle warning prompt is different from the second vehicle warning prompt.
Optionally, in an optional design of the second aspect, the detection module is further configured to detect a fifth vehicle; and the display module is further configured to display a third image corresponding to the fifth vehicle based on the fifth vehicle being located on a lane line of the lane ahead of the first vehicle in its traveling direction; and display a fourth image corresponding to the fifth vehicle based on the fifth vehicle having driven into the lane ahead of the first vehicle in its traveling direction, where the third image is different from the fourth image.
According to a third aspect, this application provides a vehicle, including a processor, a memory, and a display, where the processor is configured to obtain and execute the code in the memory to perform the method according to any one of the first aspect.
Optionally, in an optional design of the third aspect, the vehicle supports a driverless function.
According to a fourth aspect, this application provides a vehicle-mounted apparatus, including a processor and a memory, where the processor is configured to obtain and execute the code in the memory to perform the method according to any one of the first aspect.
According to a fifth aspect, this application provides a computer storage medium, where the computer-readable storage medium stores instructions that, when run on a computer, cause the computer to perform the method according to any one of the first aspect.
According to a sixth aspect, this application provides a computer program (also called a computer program product), where the computer program includes instructions that, when run on a computer, cause the computer to perform the method according to any one of the first aspect.
This application provides an information display method for a vehicle-mounted device, applied to the field of the Internet of Vehicles, including: obtaining information about the lane lines of the road surface on which a first vehicle is located, where the lane lines are at least two lines on the road surface used to divide different lanes; and displaying, according to the information about the lane lines, virtual lane lines consistent with the lane lines. This application can be applied to the autonomous driving interface of a smart vehicle, so that the driver can see from the autonomous driving interface the lane-line type of the road currently being traveled, which both enriches the display content of the autonomous driving interface and improves driving safety.
Brief Description of the Drawings
FIG. 1 is a functional block diagram of an autonomous driving apparatus with an autonomous driving function provided in an embodiment of this application;
FIG. 2 is a schematic structural diagram of an autonomous driving system provided in an embodiment of this application;
FIG. 3a and FIG. 3b show an internal structure of a vehicle provided in an embodiment of this application;
FIG. 4a is a schematic flowchart of an information display method for a vehicle-mounted device provided in an embodiment of this application;
FIG. 4b is a schematic diagram of an autonomous driving interface provided in an embodiment of this application;
FIG. 5a to FIG. 5f are schematic diagrams of autonomous driving interfaces provided in embodiments of this application;
FIG. 6a and FIG. 6b are schematic diagrams of autonomous driving interfaces provided in embodiments of this application;
FIG. 7a to FIG. 7c are schematic diagrams of autonomous driving interfaces provided in embodiments of this application;
FIG. 8a to FIG. 8f are schematic diagrams of autonomous driving interfaces provided in embodiments of this application;
FIG. 9a to FIG. 9c are schematic diagrams of autonomous driving interfaces provided in embodiments of this application;
FIG. 10 is a schematic diagram of an autonomous driving interface provided in an embodiment of this application;
FIG. 11a to FIG. 11h are schematic diagrams of autonomous driving interfaces provided in embodiments of this application;
FIG. 12a to FIG. 12d are schematic diagrams of autonomous driving interfaces provided in embodiments of this application;
FIG. 13a to FIG. 13c are schematic diagrams of autonomous driving interfaces provided in embodiments of this application;
FIG. 14 is a schematic structural diagram of an information display apparatus for a vehicle-mounted device provided in an embodiment of this application.
Detailed Description
Embodiments of this application provide an information display method and apparatus for a vehicle-mounted device, and a vehicle.
The embodiments of this application are described below with reference to the accompanying drawings. A person of ordinary skill in the art will appreciate that, as technology develops and new scenarios emerge, the technical solutions provided in the embodiments of this application are equally applicable to similar technical problems.
The terms "first", "second", and so on in the specification, claims, and drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that terms used in this way are interchangeable where appropriate; this is merely the manner used in the embodiments of this application to distinguish objects with the same attributes. In addition, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion, so that a process, method, system, product, or device comprising a series of units is not necessarily limited to those units, but may include other units not clearly listed or inherent to the process, method, product, or device.
The vehicle described in this specification may be an internal-combustion vehicle using an engine as its power source, a hybrid vehicle using an engine and an electric motor as its power sources, an electric vehicle using an electric motor as its power source, and so on.
In the embodiments of this application, the vehicle may include an autonomous driving apparatus 100 with an autonomous driving function.
Referring to FIG. 1, FIG. 1 is a functional block diagram of the autonomous driving apparatus 100 with an autonomous driving function provided in an embodiment of this application. In one embodiment, the autonomous driving apparatus 100 is configured in a fully or partially autonomous driving mode. For example, the autonomous driving apparatus 100 may control itself while in the autonomous driving mode, and may, through human operation, determine the current state of the autonomous driving apparatus and its surrounding environment, determine a possible behavior of at least one other autonomous driving apparatus in the surrounding environment, determine a confidence level corresponding to the possibility that the other autonomous driving apparatus performs the possible behavior, and control the autonomous driving apparatus 100 based on the determined information. When the autonomous driving apparatus 100 is in the autonomous driving mode, it may be set to operate without interacting with a person.
The autonomous driving apparatus 100 may include various subsystems, such as a travel system 102, a sensor system 104, a control system 106, one or more peripheral devices 108, a power supply 110, a computer system 112, and a user interface 116. Optionally, the autonomous driving apparatus 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, the subsystems and elements of the autonomous driving apparatus 100 may be interconnected by wire or wirelessly.
The travel system 102 may include components that provide powered motion for the autonomous driving apparatus 100. In one embodiment, the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121. The engine 118 may be an internal-combustion engine, an electric motor, an air-compression engine, or a combination of engine types, such as a hybrid engine composed of a gasoline engine and an electric motor, or a hybrid engine composed of an internal-combustion engine and an air-compression engine. The engine 118 converts the energy source 119 into mechanical energy.
Examples of the energy source 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed-gas-based fuels, ethanol, solar panels, batteries, and other power sources. The energy source 119 may also provide energy for other systems of the autonomous driving apparatus 100.
The transmission 120 may transmit mechanical power from the engine 118 to the wheels 121. The transmission 120 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 120 may also include other components, such as a clutch. The drive shaft may include one or more axles that can be coupled to one or more of the wheels 121.
The sensor system 104 may include several sensors that sense information about the environment around the autonomous driving apparatus 100. For example, the sensor system 104 may include a positioning system 122 (which may be a global positioning system (GPS), a BeiDou system, or another positioning system), an inertial measurement unit (IMU) 124, a radar 126, a laser rangefinder 128, and a camera 130. The sensor system 104 may also include sensors that monitor the internal systems of the autonomous driving apparatus 100 (for example, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, and so on). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, direction, speed, and so on). Such detection and recognition is a key function for the safe operation of the autonomous driving apparatus 100.
The positioning system 122 may be used to estimate the geographic location of the autonomous driving apparatus 100. The IMU 124 is used to sense changes in the position and orientation of the autonomous driving apparatus 100 based on inertial acceleration. In one embodiment, the IMU 124 may be a combination of an accelerometer and a gyroscope.
The radar 126 may use radio signals to sense objects in the surrounding environment of the autonomous driving apparatus 100. In some embodiments, in addition to sensing objects, the radar 126 may also be used to sense the speed and/or heading of the objects.
The radar 126 may include an electromagnetic-wave transmitting part and a receiving part. In terms of the radio-wave emission principle, the radar 126 may be implemented as a pulse radar or a continuous wave radar. As a continuous wave radar, the radar 126 may be implemented, depending on the signal waveform, as a frequency modulated continuous wave (FMCW) radar or a frequency shift keying (FSK) radar.
The radar 126 may use electromagnetic waves as a medium to detect objects based on a time of flight (TOF) method or a phase-shift method, and detect the positions of the detected objects, the distances to the detected objects, and the relative speeds. To detect objects located in front of, behind, or to the side of the vehicle, the radar 126 may be arranged at an appropriate position on the exterior of the vehicle. A lidar 126 may use laser light as a medium to detect objects based on a TOF method or a phase-shift method, and detect the positions of the detected objects, the distances to the detected objects, and the relative speeds.
Optionally, to detect objects located in front of, behind, or to the side of the vehicle, the lidar 126 may be arranged at an appropriate position on the exterior of the vehicle.
The laser rangefinder 128 may use laser light to sense objects in the environment in which the autonomous driving apparatus 100 is located. In some embodiments, the laser rangefinder 128 may include one or more laser sources, a laser scanner, and one or more detectors, as well as other system components.
The camera 130 may be used to capture multiple images of the surrounding environment of the autonomous driving apparatus 100. The camera 130 may be a still camera or a video camera.
Optionally, to obtain images of the exterior of the vehicle, the camera 130 may be located at an appropriate position on the exterior of the vehicle. For example, to obtain images in front of the vehicle, the camera 130 may be arranged inside the vehicle close to the front windshield, or arranged around the front bumper or radiator grille. For example, to obtain images behind the vehicle, the camera 130 may be arranged inside the vehicle close to the rear window glass, or arranged around the rear bumper, trunk, or tailgate. For example, to obtain images to the side of the vehicle, the camera 130 may be arranged inside the vehicle close to at least one of the side windows, or arranged around the side mirrors, fenders, or doors.
The control system 106 controls the operation of the autonomous driving apparatus 100 and its components. The control system 106 may include various elements, including a steering system 132, a throttle 134, a braking unit 136, a sensor fusion algorithm 138, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
The steering system 132 is operable to adjust the heading of the autonomous driving apparatus 100. For example, in one embodiment, it may be a steering wheel system.
The throttle 134 is used to control the operating speed of the engine 118 and thereby control the speed of the autonomous driving apparatus 100.
The braking unit 136 is used to control the autonomous driving apparatus 100 to decelerate. The braking unit 136 may use friction to slow the wheels 121. In other embodiments, the braking unit 136 may convert the kinetic energy of the wheels 121 into electric current. The braking unit 136 may also take other forms to slow the rotation of the wheels 121 and thereby control the speed of the autonomous driving apparatus 100.
The computer vision system 140 is operable to process and analyze images captured by the camera 130 in order to recognize objects and/or features in the surrounding environment of the autonomous driving apparatus 100. The objects and/or features may include traffic signals, road boundaries, and obstacles. The computer vision system 140 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and so on.
The route control system 142 is used to determine the driving route of the autonomous driving apparatus 100. In some embodiments, the route control system 142 may combine data from the sensor 138, the positioning system 122, and one or more predetermined maps to determine a driving route for the autonomous driving apparatus 100.
The obstacle avoidance system 144 is used to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the autonomous driving apparatus 100.
Of course, in one example, the control system 106 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
The autonomous driving apparatus 100 interacts with external sensors, other autonomous driving apparatuses, other computer systems, or users through the peripheral devices 108. The peripheral devices 108 may include a wireless communication system 146, an on-board computer 148, a microphone 150, and/or a speaker 152.
In some embodiments, the peripheral devices 108 provide a means for the user of the autonomous driving apparatus 100 to interact with the user interface 116. For example, the on-board computer 148 may provide information to the user of the autonomous driving apparatus 100. The user interface 116 may also operate the on-board computer 148 to receive user input. The on-board computer 148 may be operated through a touchscreen. In other cases, the peripheral devices 108 may provide a means for the autonomous driving apparatus 100 to communicate with other devices located in the vehicle. For example, the microphone 150 may receive audio (for example, voice commands or other audio input) from the user of the autonomous driving apparatus 100. Similarly, the speaker 152 may output audio to the user of the autonomous driving apparatus 100.
The wireless communication system 146 may communicate wirelessly with one or more devices directly or via a communication network. For example, the wireless communication system 146 may use 3G cellular communication such as code division multiple access (CDMA), EVD0, or global system for mobile communications (GSM)/general packet radio service (GPRS), or 4G cellular communication such as long term evolution (LTE), or 5G cellular communication. The wireless communication system 146 may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system 146 may communicate directly with devices using an infrared link, Bluetooth, or ZigBee. Other wireless protocols, such as various autonomous-driving-apparatus communication systems, may also be used; for example, the wireless communication system 146 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communication between autonomous driving apparatuses and/or roadside stations.
The power supply 110 may provide power to various components of the autonomous driving apparatus 100. In one embodiment, the power supply 110 may be a rechargeable lithium-ion or lead-acid battery. One or more battery packs of such batteries may be configured as a power supply to provide power to various components of the autonomous driving apparatus 100. In some embodiments, the power supply 110 and the energy source 119 may be implemented together, as in some all-electric vehicles.
Some or all of the functions of the autonomous driving apparatus 100 are controlled by the computer system 112. The computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer-readable medium such as the memory 114. The computer system 112 may also be multiple computing devices that control individual components or subsystems of the autonomous driving apparatus 100 in a distributed manner.
The processor 113 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application specific integrated circuit (ASIC) or another hardware-based processor. Although FIG. 1 functionally illustrates the processor, the memory, and other elements of the computer 110 in the same block, a person of ordinary skill in the art should understand that the processor, computer, or memory may actually include multiple processors, computers, or memories that may or may not be stored in the same physical housing. For example, the memory may be a hard disk drive or another storage medium located in a housing different from that of the computer 110. Thus, references to a processor or computer will be understood to include references to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described here, some components, such as the steering component and the deceleration component, may each have their own processor that performs only computations related to the component-specific function.
In various aspects described herein, the processor may be located far from the autonomous driving apparatus and communicate wirelessly with it. In other aspects, some of the processes described herein are executed on a processor arranged in the autonomous driving apparatus while others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
In some embodiments, the memory 114 may contain instructions 115 (for example, program logic) that can be executed by the processor 113 to perform various functions of the autonomous driving apparatus 100, including those described above. The memory 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 102, the sensor system 104, the control system 106, and the peripheral devices 108.
In addition to the instructions 115, the memory 114 may also store data, such as road maps, route information, the position, direction, and speed of the autonomous driving apparatus, other such autonomous-driving-apparatus data, and other information. Such information may be used by the autonomous driving apparatus 100 and the computer system 112 during operation of the autonomous driving apparatus 100 in autonomous, semi-autonomous, and/or manual modes.
The user interface 116 is used to provide information to or receive information from the user of the autonomous driving apparatus 100. Optionally, the user interface 116 may include one or more input/output devices in the set of peripheral devices 108, such as the wireless communication system 146, the on-board computer 148, the microphone 150, and the speaker 152.
The computer system 112 may control the functions of the autonomous driving apparatus 100 based on inputs received from various subsystems (for example, the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may use input from the control system 106 to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control over many aspects of the autonomous driving apparatus 100 and its subsystems.
Optionally, one or more of these components may be installed separately from or associated with the autonomous driving apparatus 100. For example, the memory 114 may exist partially or completely separately from the autonomous driving apparatus 100. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example; in practical applications, components in the above modules may be added or removed according to actual needs, and FIG. 1 should not be construed as a limitation on the embodiments of this application.
An autonomous vehicle traveling on a road, such as the autonomous driving apparatus 100 above, can recognize objects in its surrounding environment to determine adjustments to its current speed. The objects may be other autonomous driving apparatuses, traffic control devices, or other types of objects. In some examples, each recognized object may be considered independently, and the respective characteristics of each object, such as its current speed, acceleration, and distance from the autonomous driving apparatus, may be used to determine the speed to which the autonomous vehicle is to adjust.
Optionally, the autonomous driving apparatus 100 or a computing device associated with it (such as the computer system 112, the computer vision system 140, or the memory 114 of FIG. 1) may predict the behavior of the recognized objects based on the characteristics of the recognized objects and the state of the surrounding environment (for example, traffic, rain, ice on the road, and so on). Optionally, the recognized objects depend on each other's behavior, so all recognized objects may also be considered together to predict the behavior of a single recognized object. The autonomous driving apparatus 100 can adjust its speed based on the predicted behavior of the recognized objects. In other words, the autonomous vehicle can determine, based on the predicted behavior of the objects, what stable state the autonomous driving apparatus will need to adjust to (for example, accelerate, decelerate, or stop). In this process, other factors may also be considered to determine the speed of the autonomous driving apparatus 100, such as the lateral position of the autonomous driving apparatus 100 on the road being traveled, the curvature of the road, and the proximity of static and dynamic objects.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the autonomous driving apparatus 100, so that the autonomous vehicle follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects near the autonomous vehicle (for example, cars in adjacent lanes on the road).
The autonomous driving apparatus 100 may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement-park vehicle, construction equipment, tram, golf cart, train, cart, and so on; the embodiments of this application impose no particular limitation.
FIG. 1 introduced the functional block diagram of the autonomous driving apparatus 100; the following introduces the autonomous driving system 101 in the autonomous driving apparatus 100. FIG. 2 is a schematic structural diagram of an autonomous driving system provided in an embodiment of this application. FIG. 1 and FIG. 2 describe the autonomous driving apparatus 100 from different perspectives; for example, the computer system 101 in FIG. 2 is the computer system 112 in FIG. 1.
As shown in FIG. 2, the computer system 101 includes a processor 103, and the processor 103 is coupled to a system bus 105. The processor 103 may be one or more processors, each of which may include one or more processor cores. The system bus 105 is coupled to an input/output (I/O) bus 113 through a bus bridge 111. An I/O interface 115 is coupled to the I/O bus. The I/O interface 115 communicates with various I/O devices, such as an input device 117 (for example, a keyboard, a mouse, or a touchscreen), a media tray 121 (for example, a CD-ROM or a multimedia interface), a transceiver 123 (which can send and/or receive radio communication signals), a camera 155 (which can capture still and moving digital video images), and an external USB port 125. Optionally, the interface connected to the I/O interface 115 may be a USB port.
The processor 103 may be any conventional processor, including a reduced instruction set computing ("RISC") processor, a complex instruction set computing ("CISC") processor, or a combination of the above. Optionally, the processor may be a dedicated device such as an application specific integrated circuit ("ASIC"). Optionally, the processor 103 may be a neural-network processing unit (NPU), or a combination of a neural network processor and the above conventional processors. Optionally, a neural network processor is mounted on the processor 103.
The computer system 101 may communicate with a server 149 through a network interface 129. The network interface 129 is a hardware network interface, such as a network card. A network 127 may be an external network, such as the Internet, or an internal network, such as an Ethernet or a virtual private network (VPN). Optionally, the network 127 may also be a wireless network, such as a WiFi network or a cellular network.
The server 149 may be a high-precision map server, and the vehicle may obtain high-precision map information by communicating with the high-precision map server.
The server 149 may be a vehicle management server; the vehicle management server may be used to process data uploaded by the vehicle, and may also deliver data to the vehicle through the network.
In addition, the computer system 101 may communicate wirelessly with another vehicle 160 (vehicle to vehicle, V2V) or with a pedestrian (vehicle to pedestrian, V2P) through the network interface 129.
A hard disk drive interface is coupled to the system bus 105. The hard disk drive interface is connected to a hard disk drive. A system memory 135 is coupled to the system bus 105. Data running in the system memory 135 may include an operating system 137 and application programs 143 of the computer system 101.
The operating system includes a shell 139 and a kernel 141. The shell 139 is an interface between the user and the kernel of the operating system. The shell 139 is the outermost layer of the operating system. The shell 139 manages the interaction between the user and the operating system: it waits for user input, interprets the user input to the operating system, and handles the various output results of the operating system.
The kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel usually runs processes, provides inter-process communication, and provides CPU time-slice management, interrupts, memory management, I/O management, and so on.
The application programs 143 include programs related to autonomous driving, for example, a program that manages the interaction between the autonomous driving apparatus and obstacles on the road, a program that controls the driving route or speed of the autonomous driving apparatus, and a program that controls the interaction between the autonomous driving apparatus 100 and other autonomous driving apparatuses on the road.
The sensor 153 is associated with the computer system 101. The sensor 153 is used to detect the environment around the computer system 101. For example, the sensor 153 can detect animals, cars, obstacles, crosswalks, and so on; further, the sensor can also detect the environment around such objects, for example, the environment around an animal, such as other animals appearing around it, weather conditions, and the brightness of the surrounding environment. Optionally, if the computer system 101 is located on the autonomous driving apparatus, the sensor may be a camera, an infrared sensor, a chemical detector, a microphone, and so on. When activated, the sensor 153 senses information at preset intervals and provides the sensed information to the computer system 101 in real time or near real time.
The computer system 101 is configured to determine the driving state of the autonomous driving apparatus 100 according to sensor data collected by the sensor 153, determine, according to the driving state and the current driving task, the driving operation that the autonomous driving apparatus 100 needs to perform, and send a control instruction corresponding to the driving operation to the control system 106 (FIG. 1). The driving state of the autonomous driving apparatus 100 may include the driving conditions of the autonomous driving apparatus 100 itself, such as heading, speed, position, and acceleration, as well as the state of its surrounding environment, such as the positions of obstacles, the positions and speeds of other vehicles, the positions of crosswalks, and traffic light signals. The computer system 101 may include a task abstraction network and a shared policy network implemented by the processor 103. Specifically, the processor 103 determines the current autonomous driving task; the processor 103 inputs at least one set of historical paths of the autonomous driving task into the task abstraction network for feature extraction to obtain a task feature vector characterizing the features of the autonomous driving task; the processor 103 determines, according to the sensor data collected by the sensor 153, a state vector characterizing the current driving state of the autonomous driving apparatus; the processor 103 inputs the task feature vector and the state vector into the shared policy network for processing to obtain the driving operation that the autonomous driving apparatus currently needs to perform; the processor 103 performs the driving operation through the control system; and the processor 103 repeats the above steps of determining and performing driving operations until the autonomous driving task is completed.
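The decision loop described above (task abstraction network over historical paths yielding a task feature vector, combined with the state vector in the shared policy network to produce a driving operation) can be sketched with the two networks as injectable stand-in callables; the function and parameter names are illustrative assumptions, not the patent's implementation:

```python
def driving_step(task_paths, sensor_state, task_net, policy_net):
    """One decision step of the loop described above.

    task_net stands in for the task abstraction network: it turns the
    historical paths of the current autonomous driving task into a task
    feature vector. policy_net stands in for the shared policy network:
    it maps the task feature vector plus the current state vector to the
    next driving operation.
    """
    task_vec = task_net(task_paths)            # feature of the driving task
    return policy_net(task_vec, sensor_state)  # driving operation to execute
```

In the embodiment, this step would be repeated, with the resulting operation sent to the control system, until the driving task completes.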
Optionally, in the various embodiments described herein, the computer system 101 may be located far from the autonomous driving apparatus and may communicate wirelessly with it. The transceiver 123 may send the autonomous driving task, the sensor data collected by the sensor 153, and other data to the computer system 101, and may also receive control instructions sent by the computer system 101. The autonomous driving apparatus may execute the control instructions received by the transceiver from the computer system 101 and perform the corresponding driving operations. In other aspects, some of the processes described herein are executed on a processor arranged in the autonomous vehicle, while others are executed by a remote processor, including taking the actions needed to perform a single maneuver.
As shown in FIG. 2, a display adapter 107 may drive a display 109, and the display 109 is coupled to the system bus 105. The display 109 may be used for visual display or voice playback of information entered by the user or provided to the user, as well as the various menus of the vehicle-mounted device. The display 109 may include one or more of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display. A touch panel may cover the display 109; when the touch panel detects a touch operation on or near it, the operation is transmitted to the processor to determine the type of touch event, and the processor then provides a corresponding visual output on the display 109 according to the type of touch event. In addition, the touch panel and the display 109 may also be integrated to implement the input and output functions of the vehicle-mounted device.
In addition, the display 109 may be implemented by a head up display (HUD). The display 109 may also be provided with a projection module so as to output information through an image projected on the windshield or a window. The display 109 may include a transparent display, which may be attached to the windshield or a window. A transparent display can display a given image with a given transparency. To achieve transparency, the transparent display may include one or more of transparent thin film electroluminescent (TFEL), transparent organic light-emitting diode (OLED), transparent LCD, transmissive transparent display, and transparent LED display technologies. The transparency of the transparent display can be adjusted.
In addition, the display 109 may be arranged in multiple areas inside the vehicle. Referring to FIG. 3a and FIG. 3b, which show an internal structure of a vehicle according to an embodiment of the present invention, the display 109 may be arranged in areas 300 and 301 of the dashboard, an area 302 of a seat 308, an area 303 of each pillar trim, an area 304 of a door, an area 305 of the center console, an area of the head lining, an area of the sun visor, or may be implemented in an area 306 of the windshield or an area 307 of a window. It should be noted that the above arrangement positions of the display 109 are merely illustrative and do not constitute a limitation on this application.
In the embodiments of this application, a human-machine interaction interface may be displayed on the display; for example, when the vehicle is driving autonomously, an autonomous driving interface may be displayed.
参照图4a,图4a为本申请实施例提供的一种车载设备的信息显示方法的流程示意图,如图4a中示出的那样,车载设备的信息显示方法,包括:
41、获取第一车辆所在路面的车道线的信息,所述车道线为所述路面上用于划分不同车道的至少两条线。
本申请实施例中,车道线可以是行驶车线、行驶车线的旁边车线、会车的车辆行驶的车线。车道线可以是包含形成车线(lane)的左右侧的线(line)的概念,换一种表述方式,车道线为所述路面上用于划分不同车道的至少两条线。
可选地,本申请实施例中,第一车辆可以通过自身携带的照相机或其他拍摄设备获取到车辆的外部图像或影像,并将获取到的外部图像或影像发送至处理器,处理器可以通过识别算法获取到外部图像或影像中包含的车道线的信息。
可选地,本申请实施例中,第一车辆可以通过自身携带的照相机或其他拍摄设备获取到车辆的外部图像或影像后,将图像或影像上传至车辆管理服务器,由车辆管理服务器处理图像,再将识别结果(车道线的信息)下发到第一车辆。
可选地,本申请实施例中,第一车辆还可以通过自身携带的传感器(例如雷达或激光雷达)探测车身周围环境,并获取到外部的车道线的信息。
可选地,本申请实施例中,第一车辆还可以从高精度地图服务器获取到当前行驶的路面的车道线的信息。
可选地,本申请实施例中,第一车辆还可以根据其他数据确定车道线相关的信息(例如可以根据当前行驶的速度,或者是历史行驶数据等)。
本申请实施例中,上述车道线的信息可以为车道线的图像信息。
42、根据所述车道线的信息显示与所述车道线类型一致的虚拟车道线。
本申请实施例中,车辆在自动驾驶时,可以在上述显示器109中显示自动驾驶界面,具体的,在获取到第一车辆所在路面的车道线的信息之后,可以在自动驾驶界面中显示与所述车道线类型一致的虚拟车道线。
参照图4b,图4b为本申请实施例中提供的一种自动驾驶界面示意图,如图4b中示出的那样,自动驾驶界面包括:第一车辆401、虚拟车道线402以及虚拟车道线403。其中,虚拟车道线402为第一车辆所在车道的车道线,虚拟车道线403不是第一车辆401所在车道的车道线对应的虚拟车道线,但也是第一车辆401所在路面的车道线对应的虚拟车道线。
可选地,在一种实施例中,自动驾驶界面也可以只显示第一车辆401所在车道的车道线对应的虚拟车道线(例如图4b中示出的虚拟车道线402)。
本申请实施例中,自动驾驶界面显示的虚拟车道线的类型可以与实际的车道线的类型一致,具体可以是形状一致。具体的,所述车道线至少包括如下车道线中的至少一种:虚线、实线、双虚线、双实线和虚实线。
可选的,本申请实施例中,自动驾驶界面显示的虚拟车道线的类型可以与实际的车道线的类型一致,具体可以是形状和颜色都一致。具体的,所述车道线至少包括如下车道线中的至少一种:白色虚线、白色实线、黄色虚线、黄色实线、双白虚线、双黄实线、黄色虚实线和双白实线。
示例性的,双黄实线,划于路段中时,用以分隔对向行驶的交通。
黄色实线,划于路段中时,用以分隔对向行驶的交通流或作为公交车、校车专用停靠站标线,划于路侧上时,表示禁止路边停放车辆。
白色实线:划于路段中时,用以分隔同向行驶的机动车和非机动车,或指示车行道的边缘,划于路口时,用作导向车道线或停止线,或用以引导车辆行驶轨迹。
黄色虚实线,划于路段中时,用以分隔对向行驶的交通流,其中,实线侧禁止车辆越线,虚线侧准许车辆临时越线。
此外,车道线还可以包括导流线以及网格线等,其中,导流线可以为一个或几个根据路口地形设置的白色V形线或斜纹线区域,用于过宽、不规则或行驶条件比较复杂的交叉路口,立体交叉的匝道口或其他特殊地点,表示车辆必须按规定的路线行驶,不得压线或越线行驶。黄色网格线,表示禁止停车的区域,在划为停车位标线时,表示专属停车位。这意味着,车辆可以压线正常通过,但是不能在上面停留。
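上述“虚拟车道线与实际车道线类型一致”的对应关系,可以用如下示意性代码表达。其中的颜色、虚线段长度等渲染参数均为本文之外的假设取值,仅用于说明思路:

```python
# 示意:根据识别出的车道线类型,给出与之类型一致的虚拟车道线渲染参数。
# 下表中的颜色、虚线段长/间隔等取值均为假设,仅用于说明思路。
LINE_STYLES = {
    "白色虚线": {"color": "white",  "dash": (4.0, 6.0), "double": False},
    "白色实线": {"color": "white",  "dash": None,       "double": False},
    "黄色虚线": {"color": "yellow", "dash": (4.0, 6.0), "double": False},
    "黄色实线": {"color": "yellow", "dash": None,       "double": False},
    "双黄实线": {"color": "yellow", "dash": None,       "double": True},
}

def virtual_line_style(line_type: str) -> dict:
    """返回与实际车道线类型一致的虚拟车道线渲染参数。"""
    if line_type not in LINE_STYLES:
        raise ValueError("未知的车道线类型: " + line_type)
    return LINE_STYLES[line_type]
```

实际系统中,该映射表可以按同样方式扩展至双白虚线、黄色虚实线以及导流线、网格线等更多类型。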
应理解,自动驾驶界面还可以包括其他显示元素,例如当前第一车辆的行驶速度,当前路面的限速,其他车辆等等,本申请并不限定。
需要说明的是,本实施例中的“一致”并非强调虚拟车道线与路面的车道线完全一模一样,通过计算机显示屏显示出的虚拟车道线和实际的车道线可能总会存在一些区别。本申请目的是为驾驶者指示出实际的车道,供驾驶者参考,指示方式尽量贴近实际的车道线,但线条的颜色、形态、材质等呈现的效果可以与实际的车道线存在区别。进一步的,也可以在虚拟车道线的基础上再添加显示其他的指示信息。
本申请实施例中,通过在自动驾驶界面中显示与获取到的车道线的信息对应的车道线相一致的虚拟车道线,使得驾驶者可以从自动驾驶界面中看到此时行驶路面的实际车道线类型的虚拟车道线,既丰富了自动驾驶界面的显示内容,也提高了驾驶的安全性。
可选地,本申请实施例中,第一车辆还可以获取所述路面上的非机动车物体的信息,并根据所述非机动车物体的信息显示所述非机动车物体对应的标识。
本申请实施例中,非机动车物体至少包括道路凹陷、障碍物和道路积水,此外,还可以包括行人、二轮车、交通信号、路灯、树等各类植物、建筑物、电线杆、信号灯、桥、山、丘等,这里并不限定。
可选地,本申请实施例中,第一车辆可以通过自身携带的照相机或其他拍摄设备获取到车辆的外部图像或影像,并将获取到的外部图像或影像发送至处理器,处理器可以通过识别算法获取到外部图像或影像中包含的非机动车物体的信息。
可选地,本申请实施例中,第一车辆可以通过自身携带的照相机或其他拍摄设备获取到车辆的外部图像或影像后,将图像或影像上传至车辆管理服务器,由车辆管理服务器处理图像,再将识别结果(非机动车物体的信息)下发到第一车辆。
可选地,本申请实施例中,第一车辆还可以通过自身携带的传感器(例如雷达或激光雷达)探测车身周围环境,并获取到外部的非机动车物体的信息。
本申请实施例中,在获取所述路面上的非机动车物体的信息之后,可以在自动导航界面上显示非机动车物体对应的标识。具体的,非机动车物体的信息可以包括非机动车物体的位置、形状和大小等。相应的,可以在非机动车物体相应的位置上,根据非机动车物体的形状和大小显示非机动车物体对应的标识。
需要说明的是,该非机动车物体对应的标识可以与非机动车物体一致,也可以是一个示意,仅用来表示非机动车物体的形状和大小。
参照图5a,图5a为本申请实施例中提供的一种自动驾驶界面示意图,如图5a中示出的那样,自动驾驶界面还包括:非机动车物体501(道路凹陷)。
参照图5b,图5b为本申请实施例中提供的一种自动驾驶界面示意图,如图5b中示出的那样,自动驾驶界面还包括:非机动车物体501(道路积水)。
参照图5c,图5c为本申请实施例中提供的一种自动驾驶界面示意图,如图5c中示出的那样,自动驾驶界面还包括:非机动车物体501(障碍物)。
可选地,本申请实施例中,还可以基于所述非机动车物体位于导航指示指示的导航路径上,显示变道指示,其中,所述导航指示用于指示所述第一车辆的导航路径,所述变道指示用于指示所述第一车辆避开所述非机动车物体的行驶路径。
本申请实施例中,第一车辆在导航状态下,可以基于导航信息显示导航指示,所述导航指示用于指示所述第一车辆的导航路径,此时,当第一车辆识别出非机动车物体位于导航指示指示的导航路径上,则显示用于指示所述第一车辆避开所述非机动车物体的行驶路径的变道指示。
需要说明的是,本申请实施例中,第一车辆可以通过自身携带的照相机或其他拍摄设备获取到车辆的外部图像或影像,并将获取到的外部图像或影像发送至处理器,处理器可以通过识别算法获取到外部图像或影像中包含的非机动车物体的信息,此时,非机动车物体的信息可以包括非机动车物体的大小、形状和位置,处理器可以根据上述获取到的非机动车物体的大小、形状和位置判断出非机动车物体是否在当前的导航路径上。
可选地,本申请实施例中,第一车辆可以通过自身携带的照相机或其他拍摄设备获取到车辆的外部图像或影像后,将图像或影像上传至车辆管理服务器,由车辆管理服务器处理图像,再将识别结果(非机动车物体是否在当前的导航路径上、或非机动车物体是否会阻碍车辆的行驶)下发到第一车辆。
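根据非机动车物体的位置和大小判断其是否位于导航路径上,本质上是一个几何上“路径带与物体是否相交”的判断,可以用如下示意性代码表达(其中路径带半宽的取法、坐标的含义均为假设):

```python
import math

def _point_seg_dist(p, a, b):
    """点 p 到线段 ab 的最短距离(二维)。"""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def obstacle_on_path(path, center, radius, half_width):
    """判断半径为 radius 的非机动车物体是否落在导航路径带上。

    path 为导航路径折线的顶点序列,half_width 为路径带半宽
    (例如取车道半宽,这里的取值方式均为假设)。
    """
    for a, b in zip(path, path[1:]):
        if _point_seg_dist(center, a, b) <= radius + half_width:
            return True
    return False
```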
参照图5d,图5d为本申请实施例中提供的一种自动驾驶界面示意图,如图5d中示出的那样,非机动车物体501(障碍物)位于导航指示502指示的导航路径上,则显示用于指示所述第一车辆避开所述非机动车物体的行驶路径的变道指示503。
需要说明的是,变道指示503可以是带状的路径指示,也可以是线状的路径指示,这里并不限定。
本申请实施例中,障碍物不同于道路凹陷和道路积水:对于道路凹陷和道路积水,第一车辆是可以直接开过去的;如果是障碍物,第一车辆则需要绕行。在显示导航指示的情况下,如果遇到导航指示的导航路径上有障碍物,可以显示用于指示所述第一车辆避开所述非机动车物体的行驶路径的变道指示503,该变道指示503可以与当前的导航指示以不同的颜色显示和/或不同的形状显示,当第一车辆按照变道指示503绕行变道,导航指示502可以显示为弯曲的指示(如图5e中示出的那样),当第一车辆绕行通过障碍物,导航指示502可以重新变直显示(如图5f中示出的那样)。
可选地,本申请实施例中,还可以基于所述第一车辆与所述非机动车物体之间的距离为第一距离,显示第一告警提示,基于所述第一车辆与所述非机动车物体之间的距离为第二距离,显示第二告警提示,所述第二告警提示与所述第一告警提示不同。
可选地,本申请实施例中,所述第一告警提示和所述第二告警提示的颜色或透明度不同。
具体的,本申请实施例中,第一车辆可以基于距离传感器获取第一车辆与非机动车物体之间的距离,并基于第一车辆与非机动车物体之间的距离显示告警提示。其中告警提示可以根据障碍物远近(碰撞危险级别)发生至少两种颜色的变化,随着第一车辆与障碍物之间距离的增大/减小,两种相邻颜色的变化呈平滑过渡。
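按距离分级并在相邻两种颜色之间平滑过渡的告警颜色,可以用线性插值示意如下(红色/黄色的选取以及 near、far 两个距离阈值均为假设取值):

```python
def warning_color(distance, near=5.0, far=15.0):
    """根据第一车辆与障碍物的距离返回告警颜色 (R, G, B)。

    distance <= near 时为红色(危险级别高),distance >= far 时为黄色,
    两者之间按比例线性插值,实现相邻颜色的平滑过渡;near/far 为假设阈值。
    """
    red, yellow = (255, 0, 0), (255, 255, 0)
    if distance <= near:
        return red
    if distance >= far:
        return yellow
    t = (distance - near) / (far - near)
    return tuple(round(r + (y - r) * t) for r, y in zip(red, yellow))
```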
可选地,第一车辆还可以接收共享指令,所述共享指令携带有第二车辆的地址,响应于所述共享指令,向所述第二车辆发送第二共享信息,所述第二共享信息包括所述非机动车物体的位置信息。
可选地,第一车辆还可以接收到服务器或第二车辆发送的第一共享信息,所述第一共享信息包括非机动车物体的位置信息,并基于所述第一车辆开启导航,在导航界面上显示障碍提示,所述障碍提示用于指示所述位置信息对应的位置上的非机动车物体。
可以理解的是,如果路面凹陷、路面积水或障碍物比较大,可能严重影响车辆的驾驶,驾驶员可能更希望提早了解,而不是等到车辆行驶到路面凹陷、路面积水或障碍物的近前才知晓,因此,在这种情况下,只靠车辆的传感器不能做到事先预知。
可选地,本申请实施例中,可以通过交通系统中的监控摄像头、或者行驶过该路面的车辆的传感器获取到路面凹陷、积水或障碍物的信息后,将这些信息上报给车辆管理服务器,由服务器下发给导航路线中包括有这些路面凹陷、积水或障碍物的道路的车辆,使这些车辆可以提前了解到这些信息。
若第一车辆通过传感器获取到非机动车物体的信息,则可以将该非机动车物体的信息(位置、形状、大小等)发送至其他车辆(第二车辆)。具体的,驾驶员或者乘客可以在自动驾驶界面上进行操作(例如触发显示界面上的分享控件,并输入第二车辆的地址、或者直接选择和第一车辆建立了连接的第二车辆等),相应的,第一车辆可以接收共享指令,所述共享指令携带有第二车辆的地址,并响应于所述共享指令,向所述第二车辆发送第二共享信息,所述第二共享信息包括所述非机动车物体的位置信息。
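第二共享信息的构造可以用如下示意性代码表达,其中消息的字段名(to、position 等)均为本文之外的假设:

```python
import json

def build_share_message(target_address, obstacle):
    """构造发往第二车辆的第二共享信息(JSON 示意,字段名均为假设)。"""
    payload = {
        "to": target_address,              # 共享指令携带的第二车辆地址
        "type": "obstacle_share",
        "position": obstacle["position"],  # 非机动车物体的位置信息
        "kind": obstacle.get("kind", "unknown"),
    }
    return json.dumps(payload, ensure_ascii=False)
```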
参照图6a,图6a为本申请实施例中提供的一种自动驾驶界面示意图,如图6a中示出的那样,如驾驶员A和驾驶员B一起约定去某地出游,A先出发,并在途经路径上发现路面凹陷,那么A可以通过触摸显示器,点击凹陷提示,选择分享控件601“发送给朋友”(如图6a中示出的那样),并选择驾驶员B(此时相当于输入了第二车辆的地址),以便驾驶员B可以提前收到该路面凹陷的提示。
相应的,以第一车辆为接收共享信息为例,若第一车辆接收到服务器或第二车辆发送的第一共享信息,所述第一共享信息包括非机动车物体的位置信息,并基于所述第一车辆开启导航,在导航界面上显示障碍提示,所述障碍提示用于指示所述位置信息对应的位置上的非机动车物体。
参照图6b,图6b为本申请实施例中提供的一种自动驾驶界面示意图,如图6b中示出的那样,其中,图6b中的右图为导航界面,该导航界面包括导航地图,图中粗实线为导航路线,箭头处为当前车辆所行至位置,粗实线上黑色圆点处标示的位置为车辆管理服务器收集到的路面凹陷信息或者是其他车辆发送的路面凹陷信息,并在当前第一车辆的导航界面上显示凹陷提示602。
本申请实施例中,第一车辆还可以基于行驶速度显示不同的导航指示。
具体的,本申请实施例中,第一车辆还可以获取所述第一车辆的导航信息,并基于所述导航信息显示导航指示,所述导航指示用于指示所述第一车辆的导航路径。
本申请实施例中,所述导航指示包括第一导航指示或第二导航指示,基于所述第一车辆处于静止状态,显示所述第一导航指示,基于所述第一车辆处于行驶状态,显示所述第二导航指示,所述第一导航指示和所述第二导航指示不同。
具体的,第一导航指示和所述第二导航指示的显示颜色或透明度不同。
参照图7a,图7a为本申请实施例中提供的一种自动驾驶界面示意图,如图7a中示出的那样,自动驾驶界面包括导航指示701,该导航指示701指示了所述第一车辆的导航路径,当第一车辆判断出当前为静止状态,或者是行驶速度低于预设的速度,则显示所述第一导航指示701(如图7b中示出的那样),当第一车辆判断出当前为行驶状态,或者是行驶速度高于预设的速度,则显示第二导航指示701(如图7c中示出的那样),其中,图7c中示出的第二导航指示701的颜色与图7b中示出的第一导航指示701的颜色不同。
本申请实施例中,通过基于第一车辆的行驶状态显示不同的导航指示,使得驾驶者或乘客可以基于导航界面中的导航指示的显示确定出车辆当前的行驶状态。
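基于行驶状态选择第一导航指示或第二导航指示的逻辑,可以极简地示意如下(颜色、不透明度取值以及速度阈值均为假设):

```python
def nav_indication_style(speed_mps, moving_threshold=0.1):
    """根据第一车辆是否处于行驶状态,选择导航指示的显示样式。

    返回 (颜色, 不透明度);具体取值与速度阈值均为假设,
    仅示意第一导航指示与第二导航指示的显示颜色或透明度不同。
    """
    if speed_mps <= moving_threshold:   # 静止状态:显示第一导航指示
        return ("light_blue", 0.6)
    return ("blue", 0.9)                # 行驶状态:显示第二导航指示
```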
本申请实施例中,第一车辆还可以根据当前所处的环境(天气、时间信息等),使得自动导航界面上的视觉元素(虚拟车道线、车道路面、导航指示等)在颜色、亮度和材质中的至少一种上发生变化。
具体的,在一种实施例中,所述导航指示包括第三导航指示或第四导航指示,第一车辆可以基于所述第一车辆处于第一环境,显示所述第三导航指示,基于所述第一车辆处于第二环境,显示所述第四导航指示,其中,所述第一环境与所述第二环境不同,所述第三导航指示和所述第四导航指示不同。
可选地,在另一种实施例中,第一车辆可以基于所述第一车辆处于第一环境,显示第一车道,基于所述第一车辆处于第二环境,显示第二车道,其中,第一车道和第二车道为第一车辆行驶的车道,或者是第一车辆所在的路面的车道,所述第一环境与所述第二环境不同,所述第一车道和所述第二车道不同。
可选地,本申请实施例中,所述第一环境至少包括如下环境中的一种:所述第一车辆所处的天气环境、所述第一车辆所处的路面环境、所述第一车辆导航目的地所处的天气环境、所述第一车辆导航目的地所处的路面环境、所述第一车辆所在道路的交通拥堵环境、所述第一车辆导航目的地所处的交通拥堵环境或所述第一车辆所处的亮度环境。
其中,天气环境可以通过网络连接天气服务器获取。天气环境可以包括温度、湿度等,以及大风、暴雨、暴雪等。亮度环境可以是当前车辆所处的环境的亮度,可以表示当前的时间,例如,当前时间为早上,则虚拟车道线、车道路面、导航指示等的亮度比正常提高或者颜色变浅,当前时间是晚上,则虚拟车道线、车道路面、导航指示等的亮度比正常要低或颜色变深。
例如,当前为雪天,则虚拟车道线、车道路面、导航指示等的材质表现为积雪覆盖。
例如,当前天气环境为恶劣天气(如大风、暴雨、暴雪)的情况下,对虚拟车道线、车道路面、导航指示等视觉元素进行增强显示,如颜色更加鲜艳(纯度提高),或者亮度提高,或者采用增强型的材质。
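根据环境调整视觉元素的颜色、亮度和材质,可以用一个简单的映射示意如下(其中天气的取值、亮度与纯度系数等均为假设):

```python
def adjust_visual_elements(env):
    """根据当前环境调整视觉元素的颜色、亮度、材质(取值均为假设)。"""
    style = {"material": "asphalt", "brightness": 1.0, "saturation": 1.0}
    if env.get("weather") == "snow":
        style["material"] = "snow"          # 雪天:材质表现为积雪覆盖
    if env.get("weather") in ("gale", "rainstorm", "blizzard", "storm"):
        style["saturation"] = 1.5           # 恶劣天气:颜色更鲜艳(纯度提高)
        style["brightness"] = 1.3           # 并提高亮度,实现增强显示
    if env.get("time") == "night":
        style["brightness"] *= 0.6          # 夜间:整体亮度降低、颜色变深
    return style
```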
参照图8a,图8a为本申请实施例中提供的一种自动驾驶界面示意图,如图8a中示出的那样,此时,第一车辆行驶的路面环境为雪地,相应的,自动导航界面中的路面材质表现为积雪覆盖。
参照图8b,图8b为本申请实施例中提供的一种自动驾驶界面示意图,如图8b中示出的那样,此时,第一车辆行驶的路面环境为沙漠,相应的,自动导航界面中的路面材质表现为沙漠。
本申请实施例中,第一车辆可以基于所述第一车辆处于第一环境,显示第一车道,基于所述第一车辆处于第二环境,显示第二车道,其中,第一车道和第二车道为第一车辆行驶的车道,或者是第一车辆所在的路面的车道,所述第一环境与所述第二环境不同,所述第一车道和所述第二车道不同。驾驶者或者乘客可以基于自动导航界面的显示获知当前车辆所处的环境,尤其在夜晚或其他亮度较低的场景中,这提高了驾驶的安全性。
本申请实施例中,第一车辆可以基于导航目的地所处的地理位置在自动驾驶界面中显示对应的图像。
可选地,在一种实施例中,第一车辆可以获取所述第一车辆导航目的地所处的地理位置,并基于所述地理位置显示第一图像,所述第一图像用于指示所述第一车辆导航目的地所处的地理位置的类型。其中,地理位置的类型可以至少包括如下类型的一种:城市、山区、平原、森林或海边。
本申请实施例中,第一车辆可以通过GPS系统得到第一车辆导航目的地所处的地理位置,或者通过高清地图获取当前车的导航目的地所处的地理位置,并进一步获取这些地理位置的属性信息(类型),如第一车辆导航目的地所处的地理位置可以属于城市、山区、平原、森林、海边,等。地理位置的属性信息(类型)可以从地图系统获取。
本申请实施例中,在第一车辆获取到导航目的地所处的地理位置,以及导航目的地所处的地理位置的类型后,可以根据地理位置的类型,在用于标识车道的视觉元素的车道尽头位置呈现远景图片(第一图像),或者更改车道视觉元素的材质。
可以理解的是,第一图像显示的区域长、宽、位置都是可以变化的,本实施例中只是给出几种可能的例子,第一图像可以显示在速度标识旁边、与速度标识重叠显示、或占据整个显示面板的上方,等等。
参照图8c,图8c为本申请实施例中提供的一种自动驾驶界面示意图,如图8c中示出的那样,若第一车辆的导航目的地的地理位置位于海边,则可以在自动驾驶界面上显示用于表示海边的第一图像(例如,可以包括椰子树和海水)。
参照图8d,图8d为本申请实施例中提供的一种自动驾驶界面示意图,如图8d中示出的那样,若第一车辆的导航目的地的地理位置位于山区,则可以在自动驾驶界面上显示用于表示山区的第一图像(例如,可以包括山)。
参照图8e,图8e为本申请实施例中提供的一种自动驾驶界面示意图,如图8e中示出的那样,若第一车辆的导航目的地的地理位置位于森林,则可以在自动驾驶界面上显示用于表示森林的第一图像(例如,可以包括多棵树)。
以上第一图像仅为一种示意,并不构成对本申请的限定。
可选地,第一车辆还可以检测到第三车辆,并获取所述第三车辆导航目的地所处的地理位置,基于所述第三车辆导航目的地所处的地理位置显示第二图像,所述第二图像用于指示所述第三车辆导航目的地所处的地理位置的类型。
本申请实施例中,如果其他车辆(第三车辆)的驾驶员愿意公开自己的目的地(类型)信息,也可以在自动驾驶界面上显示其他车辆的目的地的地理位置的类型。
参照图8f,图8f为本申请实施例中提供的一种自动驾驶界面示意图,如图8f中示出的那样,第一车辆(图中最大的车辆)可以通过自动驾驶界面知道前方车辆和左方车辆跟自己是要去同样类型的目的地(森林),而右侧车辆则不是,因为前方车辆和左方车辆采用了一种特殊的颜色和/或纹理来标识,或者在这些车辆周围显示了指示所述第三车辆导航目的地所处的地理位置的类型的第二图像(包括树)。
本申请实施例中,第一车辆可以获取所述第一车辆导航目的地所处的地理位置,并基于所述地理位置显示第一图像,所述第一图像用于指示所述第一车辆导航目的地所处的地理位置的类型。第一车辆可以基于导航目的地所处的地理位置在自动驾驶界面中显示对应的图像,丰富了自动驾驶界面的内容。
本申请实施例中,第一车辆可以基于行驶到路口停止区域,在自动驾驶界面上显示路口停止指示。
具体的,本申请实施例中,第一车辆可以检测到所述第一车辆行驶至路口停止区域,显示路口停止指示901。可选地,本申请实施例中,路口停止区域可以是第一车辆行驶至距离红灯路口的预设距离(例如20m)以内的区域。
具体的,第一车辆可以基于图像或摄像判断出当前第一车辆进入路口停止区域,或者可以基于导航信息确定当前第一车辆进入路口停止区域。
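“行驶至距离红灯路口的预设距离以内即进入路口停止区域”的判断,可以写成如下极简示意(20m 为文中给出的示例预设距离,函数名与输入的来源均为假设):

```python
def in_stop_zone(dist_to_stop_line_m, zone_length_m=20.0):
    """判断第一车辆是否行驶至路口停止区域。

    文中以距离红灯路口 20m 以内为例,这里将 20.0 作为默认预设距离;
    dist_to_stop_line_m 为本车距停止线的剩余距离(假设已由定位或感知给出)。
    """
    return 0.0 <= dist_to_stop_line_m <= zone_length_m
```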
可选地,第一车辆可以获取到当前路口与第一车辆对应的红绿灯的状态,并在红绿灯状态处于红灯或黄灯时,显示第一路口停止指示。
参照图9a,图9a为本申请实施例中提供的一种自动驾驶界面示意图,如图9a中示出的那样,当行驶至路口停止区域时,自动驾驶界面显示路口停止线901。
需要说明的是,若第一车辆处于导航状态,则还可以显示导航指示701,同时导航指示701超出路口停止线的部分弱化显示,弱化的方式可以是只显示导航指示701的轮廓,或者提高导航指示701的透明度,等等,这里并不限定。
可选地,本申请实施例中,所述路口停止指示包括:第一路口停止指示或第二路口停止指示,第一车辆可以基于检测到所述第一车辆的车头未超出所述路口停止区域,显示第一路口停止指示,基于检测到所述第一车辆的车头超出所述路口停止区域,显示第二路口停止指示,所述第一路口停止指示与所述第二路口停止指示不同。
本申请实施例中,当第一车辆的车头超过路口停止指示901时,可以改变第一路口停止指示901的显示内容,例如,可以将路口停止指示弱化显示,弱化的方式可以是提高路口停止指示的透明度,等等,这里并不限定。
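“导航指示超出路口停止线的部分弱化显示”可以示意为按位置选择透明度(透明度数值为假设):

```python
def nav_segment_alpha(segment_y, stop_line_y, base_alpha=0.9, weak_alpha=0.3):
    """导航指示中超出路口停止线的部分弱化显示(提高透明度)。

    segment_y 为导航指示上某一段沿行驶方向的位置,数值越大离本车越远;
    base_alpha/weak_alpha 的取值均为假设。
    """
    return weak_alpha if segment_y > stop_line_y else base_alpha
```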
参照图9b,图9b为本申请实施例中提供的一种自动驾驶界面示意图,如图9b中示出的那样,此时,第一车辆可以检测到所述第一车辆行驶至路口停止区域,相应的,自动驾驶界面上显示路口停止指示,以及弱化后的导航指示701。
参照图9c,图9c为本申请实施例中提供的一种自动驾驶界面示意图,如图9c中示出的那样,此时,第一车辆可以检测到所述第一车辆行驶超过路口停止区域(第一车辆的车头超出所述路口停止区域),相应的,自动驾驶界面上显示弱化后的路口停止指示,以及强化(显示完整的导航指示701的轮廓、改变颜色、或者降低导航指示701的透明度)后的导航指示701。
可选地,在另一种实施例中,所述路口停止指示包括:第三路口停止指示或第四路口停止指示,第一车辆可以基于检测到所述第一车辆行驶至所述路口停止区域,且所述路口停止区域对应的红绿灯为红灯或黄灯,显示第三路口停止指示,基于检测到所述第一车辆行驶至所述路口停止区域,且所述路口停止区域对应的红绿灯为绿灯,显示第四路口停止指示,所述第三路口停止指示与所述第四路口停止指示不同。
本申请实施例中,第一车辆除了基于行驶至路口停止区域来显示路口停止指示,还会考虑到当前路口的红绿灯信息,具体的,当第一车辆行驶至所述路口停止区域,且所述路口停止区域对应的红绿灯为红灯或黄灯,显示第三路口停止指示,当第一车辆行驶至所述路口停止区域,且所述路口停止区域对应的红绿灯为绿灯,显示第四路口停止指示,例如第四路口停止指示可以是强化(改变颜色,或者降低导航指示701的透明度)后的第三路口停止指示。
本申请实施例中,第一车辆可以基于附近车辆与本车的距离在自动驾驶界面上显示车辆告警提示。
具体的,本申请实施例中,第一车辆可以检测到第四车辆,基于所述第四车辆与所述第一车辆之间的距离小于预设距离,显示车辆警告提示。
可选地,在一种实施例中,所述车辆警告提示包括第一车辆警告提示或第二车辆警告提示,第一车辆可以基于所述第四车辆与所述第一车辆之间的距离为第一距离,显示第一车辆警告提示,并基于所述第四车辆与所述第一车辆之间的距离为第二距离,显示第二车辆警告提示,所述第一距离与所述第二距离不同,所述第一车辆警告提示与所述第二车辆警告提示不同。
本申请实施例中,第一车辆可以基于自身携带的距离传感器获取到其余车辆与第一车辆之间的距离,并在检测到某一车辆(第四车辆)与第一车辆之间的距离小于预设距离后,显示车辆告警提示。
本申请实施例中,在第一车辆周围有其他车辆(第四车辆)时,可以在自动驾驶界面上以本车靠近第四车辆最近点为圆心显示警告提示(危险提示图形)。参照图10,图10为本申请实施例中提供的一种自动驾驶界面示意图,如图10中示出的那样,第一车辆可以检测到第四车辆1001,并基于所述第四车辆1001与所述第一车辆之间的距离小于预设距离,自动驾驶界面上显示车辆警告提示1002。
可选地,基于第四车辆与第一车辆距离的远近,警告提示的颜色可以不同,例如特别近时显示为红色,比较接近时显示为黄色。
可选地,当第四车辆与第一车辆之间的距离在不断发生变化时,危险提示图形的颜色变化可以是渐变过渡的,而不是在超过相应阈值时,突然由红色变为黄色(或者黄色变为红色)。
本申请实施例中,第一车辆可以基于附近车辆与本车的距离在自动驾驶界面上显示车辆告警提示,使得驾驶者可以通过自动驾驶界面上显示的告警提示知晓第一车辆与其他车辆的碰撞风险。
本申请实施例中,第一车辆可以基于转弯状态到直行状态的变化,或者直行状态到转弯状态的变化,改变当前自动驾驶界面的显示视角。
具体的,参照图11a,图11a为本申请实施例中提供的一种自动驾驶界面示意图,如图11a中示出的那样,第一车辆可以基于所述第一车辆处于直行状态,显示第一区域。
参照图11b,图11b为本申请实施例中提供的一种自动驾驶界面示意图,如图11b中示出的那样,第一车辆可以基于所述第一车辆由所述直行状态改变为左转弯状态,显示第二区域,其中,所述第二区域包含的所述第一车辆行驶方向的左前方的场景区域1102大于所述第一区域包含的所述左前方的场景区域1101。
本申请实施例中,当驾驶员在将要左转弯之前,比较关注左前方的信息,主要是看有无行人,因此,第二区域包含的所述第一车辆行驶方向的左前方的场景区域1102大于所述第一区域包含的所述左前方的场景区域1101。
参照图11c,图11c为本申请实施例中提供的一种自动驾驶界面示意图,如图11c中示出的那样,第一车辆可以基于所述第一车辆处于左转弯状态,显示第三区域。
参照图11d,图11d为本申请实施例中提供的一种自动驾驶界面示意图,如图11d中示出的那样,第一车辆可以基于所述第一车辆由所述左转弯状态改变为直行状态,显示第四区域,其中,所述第三区域包含的所述第一车辆行驶方向的右后方的场景区域1103大于所述第四区域包含的所述右后方的场景区域1104。
本申请实施例中,驾驶员在左转弯之后,比较关注右后方的信息,主要是看有无来车,因此,第三区域包含的所述第一车辆行驶方向的右后方的场景区域1103大于所述第四区域包含的所述右后方的场景区域1104。
参照图11e,图11e为本申请实施例中提供的一种自动驾驶界面示意图,如图11e中示出的那样,第一车辆可以基于所述第一车辆处于直行状态,显示第五区域。
参照图11f,图11f为本申请实施例中提供的一种自动驾驶界面示意图,如图11f中示出的那样,第一车辆可以基于所述第一车辆由所述直行状态改变为右转弯状态,显示第六区域,其中,所述第五区域包含的所述第一车辆行驶方向的右前方的场景区域1105大于所述第六区域包含的所述右前方的场景区域1106。
本申请实施例中,当驾驶员在将要右转弯之前,比较关注右前方的信息,主要是看有无行人,因此,所述第五区域包含的所述第一车辆行驶方向的右前方的场景区域1105大于所述第六区域包含的所述右前方的场景区域1106。
参照图11g,图11g为本申请实施例中提供的一种自动驾驶界面示意图,如图11g中示出的那样,第一车辆可以基于所述第一车辆处于右转弯状态,显示第七区域。
参照图11h,图11h为本申请实施例中提供的一种自动驾驶界面示意图,如图11h中示出的那样,第一车辆可以基于所述第一车辆由所述右转弯状态改变为直行状态,显示第八区域,其中,所述第七区域包含的所述第一车辆行驶方向的左后方的场景区域1107大于 所述第八区域包含的所述左后方的场景区域1108。
本申请实施例中,驾驶员在右转弯之后,比较关注左后方的信息,主要是看有无来车,因此,第七区域包含的所述第一车辆行驶方向的左后方的场景区域1107大于所述第八区域包含的所述左后方的场景区域1108。
需要说明的是,上述图11a至图11h中的场景区域的划分仅为一种示意,并不构成对本申请的限定。
换一种表述方式,本申请实施例中,第一车辆可以根据路口转弯区域,改变显示器上显示信息的显示视角。具体的,转弯区域可以通过如下方式确定:感知方向盘是否向左、向右旋转;或者,在车辆行驶时启动了高精度地图导航的情况下,通过导航路线判断是否行驶至需要左转或右转的路口;或者,在车辆行驶时只启用了高精度地图、并未使用导航而由驾驶员驾驶的情况下,通过判断车辆是否行驶在距离路口预设距离以内且行驶在左侧转弯车道或右侧转弯车道,进一步确定车辆是否要左转或右转。
本实施例中所说的视角,是指显示器上显示信息的视角,具体的,可以通过一个虚拟摄像机跟踪本车(第一车辆)的位置,在显示器上呈现这个虚拟摄像机的视野中能看到的对象。改变显示视角,就是通过改变这个虚拟摄像机与本车的相对位置(x、y、z轴坐标,以及各个方向的角度),进而在显示器上呈现虚拟摄像机视野中能看到的对象的改变。
示例性的,以本车为坐标原点,面向车辆正面的方向是y轴的正方向,车的行进方向是y轴的负方向;面向车辆,车辆的右手边为x轴的正方向,车辆的左手边为x轴的负方向。虚拟摄像机的位置是在z轴的上方,在z轴的正方向、y轴的正方向上。在该默认状态下的视角称为默认视角(以下实施例中称“默认向前视角”)。
可以理解的是,原点的位置和各轴的方向都可以由开发人员自定义。
以右转弯为例,当驾驶员在将要右转弯之前,比较关注右前方的信息,主要是看有无行人,在转弯之后,比较关注左后方的信息,主要是看有无来车,如果判断驾驶员将要右转弯,则由默认向前视角改变虚拟摄像机的视角先往右看(虚拟摄像机向右转,从面向y轴负方向向x轴负方向转动),再改变虚拟摄像机的视角向左看(虚拟摄像机向左转,向x轴正方向转动),当转弯结束后开始直行时,恢复默认向前视角(虚拟摄像机面向y轴负方向)。
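以本实施例的坐标约定为基础,转弯各阶段对应的虚拟摄像机偏航角可以用一个查表函数示意(角度数值均为假设,仅表示转动方向):

```python
def camera_yaw(turn_state):
    """按转弯阶段返回虚拟摄像机的偏航角(单位:度)。

    约定 0 度为默认向前视角(面向 y 轴负方向),负值表示向右转动
    (朝 x 轴负方向)、正值表示向左转动;角度数值均为假设,仅示意方向。
    """
    yaws = {
        "straight": 0.0,          # 直行:默认向前视角
        "pre_right_turn": -30.0,  # 右转前:先看右前方有无行人
        "post_right_turn": 25.0,  # 右转后:回看左后方有无来车
        "pre_left_turn": 30.0,    # 左转前:看左前方
        "post_left_turn": -25.0,  # 左转后:回看右后方
    }
    if turn_state not in yaws:
        raise ValueError("未知的转弯阶段: " + turn_state)
    return yaws[turn_state]
```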
本申请实施例中,在第一车辆处于转弯状态到直行状态的变化,或者在第一车辆处于直行状态到转弯状态的变化时,可以改变当前的显示视角,使得驾驶员可以知晓到转弯时有可能有安全风险的区域信息,提高了驾驶的安全性。
本申请实施例中,第一车辆可以基于行驶的速度的变化,改变当前自动驾驶界面的显示视角。
具体的,本申请实施例中,第一车辆可以基于所述第一车辆处于第一行驶速度,显示第九区域,基于所述第一车辆处于第二行驶速度,显示第十区域,所述第九区域和所述第十区域为所述第一车辆行驶位置所处的场景区域,所述第二行驶速度大于所述第一行驶速度,所述第九区域包含的场景区域大于所述第十区域包含的场景区域。
参照图12a至图12d,图12a至图12d为本申请实施例中提供的一种自动驾驶界面示意图,如图12a至图12d中示出的那样,从图12a至图12d描述的是车速越来越低的情形,可以看到,随着第一车辆的行驶速度的降低,自动驾驶界面中,第一车辆行驶位置所处的场景区域越来越小。
本申请实施例中,车辆的行驶速度越快,自动驾驶界面上显示的道路视角越高,相应的,道路显示的范围越大;车辆的行驶速度越慢,显示面板上道路信息(车道两边的建筑物、行人、路边交通设施等)显示得越多、越明显,显示面板上显示的道路视角越低,道路显示的范围(第一车辆行驶位置所处的场景区域)越小。
关于如何变换得自动驾驶界面上显示的道路视角,可以参照上述实施例中的描述,这里不再赘述。
如图12a至图12d中示出的那样,从图12a至图12d描述的是车速越来越低的情形,可以看到视角越来越低,当第一车辆高速行驶的时候视角高(虚拟摄像机的位置的z轴值大),低速的时候视角低(虚拟摄像机的位置的z轴值小)。需要说明的是,图12a至图12d中的速度数值仅为一种示意,并不构成对本申请的限定。
此外,当车速较低时,比如在街道里行驶,驾驶员会更加关注车周边的信息,比如碰撞信息的细节。此时视角会更贴近车本身,从而让驾驶员关注到他想关注的信息。显示面板上道路信息(车道两边的建筑物、行人、路边交通设施等)显示得越多,越明显,自动驾驶界面上显示的道路视角越低,道路显示的范围越小。如图12a至图12d中示出的那样,在图12a中第一车辆的行驶速度更快,则道路旁边的建筑物更加弱化显示(通过颜色减淡和/或增加透明度等方式),而图12d中第一车辆的行驶速度慢,道路旁边的建筑物增强显示(通过颜色加深和/或减小透明度等方式)。
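“车速越快、视角越高”的关系可以用两点线性插值示意如下(两个标定点的速度与高度数值均为假设):

```python
def camera_height(speed_kmh, low=(20.0, 8.0), high=(100.0, 30.0)):
    """按行驶速度对虚拟摄像机的 z 轴高度做线性插值:车速越快,视角越高。

    low/high 为两个 (速度, 高度) 标定点,数值均为假设;
    低速时视角贴近本车,高速时视角升高、道路显示的范围变大。
    """
    (v0, h0), (v1, h1) = low, high
    if speed_kmh <= v0:
        return h0
    if speed_kmh >= v1:
        return h1
    return h0 + (h1 - h0) * (speed_kmh - v0) / (v1 - v0)
```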
本申请实施例中,第一车辆可以基于所述第一车辆处于第一行驶速度,显示第九区域,基于所述第一车辆处于第二行驶速度,显示第十区域,所述第九区域和所述第十区域为所述第一车辆行驶位置所处的场景区域,所述第二行驶速度大于所述第一行驶速度,所述第九区域包含的场景区域大于所述第十区域包含的场景区域。通过上述方式,当第一车辆的行驶速度较快时,可以显示更大的场景区域,使得驾驶者可以在行驶速度较快时知晓到更多的路面信息,提高了驾驶的安全性。
本申请实施例中,第一车辆可以在自动驾驶界面上显示旁车插入当前行驶车道的提示。
具体的,本申请实施例中,第一车辆可以检测到第五车辆,并基于所述第五车辆位于所述第一车辆行驶方向的前方所在车道的车道线上,显示所述第五车辆对应的第三图像,第一车辆可以基于所述第五车辆行驶至所述第一车辆行驶方向的前方所在的车道上,显示所述第五车辆对应的第四图像,所述第三图像和所述第四图像不同。
本申请实施例中,当第一车辆检测到某一车辆(第五车辆)位于所述第一车辆行驶方向的前方所在车道的车道线上时,确定第五车辆会对第一车辆进行超车。
可选地,第一车辆还可以基于第五车辆位于所述第一车辆行驶方向的前方所在车道的车道线上,且第五车辆与第一车辆的距离小于某一预设值,来确定出第五车辆会对第一车辆进行超车。
可选地,第一车辆可以对拍摄到的图像或影像进行处理,来确定第五车辆位于所述第一车辆行驶方向的前方所在车道的车道线上。第一车辆也可以将拍摄到的图像或影像发送至服务器,由服务器来判断出第五车辆位于所述第一车辆行驶方向的前方所在车道的车道线上,并接收服务器发送的判断结果。
本申请实施例中,示例性的,第五车辆可以位于第一车辆的后方(如图13a中示出的那样),若第一车辆检测出第五车辆正在进行超车行为,则可以在自动驾驶界面上采用特殊颜色标识(例如白色)显示第五车辆对应的图像(如图13b中示出的第五车辆1301,此时第五车辆1301位于所述第一车辆行驶方向的前方所在车道的车道线上),表示第五车辆抑制了第一车辆的速度。
本申请实施例中,当第一车辆检测到第五车辆完成超车后,可以改变第五车辆的显示内容,具体的,第一车辆可以基于所述第五车辆行驶至所述第一车辆行驶方向的前方所在的车道上(如图13c中示出的第五车辆1301,此时第五车辆1301位于所述第一车辆行驶方向的前方所在车道上,而没有位于车道线上),则显示所述第五车辆对应的第四图像,其中,第四图像相对于第三图像,颜色和/或透明度可以不同。
需要说明的是,图13b和图13c中第三图像和第四图像仅为一种示意,只要可以区分超车时和超车完成的车辆,本申请并不限定第三图像和第四图像的显示内容。
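第五车辆“压在车道线上(正在插入)”与“已完全进入本车道(完成超车)”的区分,可以用横向位置的区间判断示意如下(坐标含义与半车宽数值均为假设):

```python
def cut_in_phase(vehicle_x, lane_left_x, lane_right_x, half_width=0.9):
    """按横向位置区分第五车辆相对本车道的状态(half_width 为假设的半车宽)。

    返回 "on_line":车身压在本车道车道线上(正在插入,对应第三图像);
    返回 "in_lane":已完全进入本车道(完成超车,对应第四图像);
    返回 "outside":不在本车道范围内。
    """
    left, right = vehicle_x - half_width, vehicle_x + half_width
    if right < lane_left_x or left > lane_right_x:
        return "outside"
    if left < lane_left_x or right > lane_right_x:
        return "on_line"
    return "in_lane"
```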
接下来介绍本申请实施例提供的一种车载设备的信息显示装置。参照图14,图14为本申请实施例提供的一种车载设备的信息显示装置的结构示意图,如图14示出的那样,信息显示装置包括:
获取模块1401,用于获取第一车辆所在路面的车道线的信息,所述车道线为所述路面上用于划分不同车道的至少两条线;
显示模块1402,用于根据所述车道线的信息显示与所述车道线类型一致的虚拟车道线。
可选地,所述获取第一车辆所在路面的车道线的信息,包括:
获取第一车辆所在车道的车道线的信息。
可选地,所述车道线至少包括如下车道线中的至少一种:虚线、实线、双虚线、双实线和虚实线。
可选地,所述车道线至少包括如下车道线中的至少一种:白色虚线、白色实线、黄色虚线、黄色实线、双白虚线、双黄实线、黄色虚实线和双白实线。
可选地,所述获取模块1401,还用于获取所述路面上的非机动车物体的信息;
所述显示模块1402,还用于根据所述非机动车物体的信息显示所述非机动车物体对应的标识。
可选地,所述装置还包括:
接收模块,用于接收共享指令,所述共享指令携带有第二车辆的地址;
发送模块,用于响应于所述共享指令,向所述第二车辆发送第二共享信息,所述第二共享信息包括所述非机动车物体的位置信息。
可选地,所述接收模块,还用于接收到服务器或第二车辆发送的第一共享信息,所述第一共享信息包括非机动车物体的位置信息;
所述显示模块1402,还用于基于所述第一车辆开启导航,在导航界面上显示障碍提示,所述障碍提示用于指示所述位置信息对应的位置上的非机动车物体。
可选地,所述非机动车物体至少包括道路凹陷、障碍物和道路积水。
可选地,所述显示模块1402,还用于基于所述非机动车物体位于导航指示指示的导航路径上,显示变道指示,其中,所述导航指示用于指示所述第一车辆的导航路径,所述变道指示用于指示所述第一车辆避开所述非机动车物体的行驶路径。
可选地,所述显示模块1402,还用于基于所述第一车辆与所述非机动车物体之间的距离为第一距离,显示第一告警提示;
基于所述第一车辆与所述非机动车物体之间的距离为第二距离,显示第二告警提示,所述第二告警提示与所述第一告警提示不同。
可选地,所述第一告警提示和所述第二告警提示的颜色或透明度不同。
可选地,所述获取模块1401,还用于获取所述第一车辆的导航信息;
所述显示模块1402,还用于基于所述导航信息显示导航指示,所述导航指示用于指示所述第一车辆的导航路径。
可选地,所述导航指示包括第一导航指示或第二导航指示,所述显示模块1402,具体用于基于所述第一车辆处于静止状态,显示所述第一导航指示;
基于所述第一车辆处于行驶状态,显示所述第二导航指示,所述第一导航指示和所述第二导航指示不同。
可选地,所述第一导航指示和所述第二导航指示的显示颜色或透明度不同。
可选地,所述导航指示包括第三导航指示或第四导航指示,所述显示模块1402,具体用于基于所述第一车辆处于第一环境,显示所述第三导航指示;
基于所述第一车辆处于第二环境,显示所述第四导航指示,所述第一环境与所述第二环境不同,所述第三导航指示和所述第四导航指示不同。
可选地,所述第一环境至少包括如下环境中的一种:所述第一车辆所处的天气环境、所述第一车辆所处的路面环境、所述第一车辆导航目的地所处的天气环境、所述第一车辆导航目的地所处的路面环境、所述第一车辆所在道路的交通拥堵环境、所述第一车辆导航目的地所处的交通拥堵环境或所述第一车辆所处的亮度环境。
可选地,所述显示模块1402,还用于基于所述第一车辆处于直行状态,显示第一区域;
基于所述第一车辆由所述直行状态改变为左转弯状态,显示第二区域,其中,所述第二区域包含的所述第一车辆行驶方向的左前方的场景区域大于所述第一区域包含的所述左前方的场景区域;或,
基于所述第一车辆处于左转弯状态,显示第三区域;
基于所述第一车辆由所述左转弯状态改变为直行状态,显示第四区域,其中,所述第三区域包含的所述第一车辆行驶方向的右后方的场景区域大于所述第四区域包含的所述右后方的场景区域;或,
基于所述第一车辆处于直行状态,显示第五区域;
基于所述第一车辆由所述直行状态改变为右转弯状态,显示第六区域,其中,所述第五区域包含的所述第一车辆行驶方向的右前方的场景区域大于所述第六区域包含的所述右前方的场景区域;或,
基于所述第一车辆处于右转弯状态,显示第七区域;
基于所述第一车辆由所述右转弯状态改变为直行状态,显示第八区域,其中,所述第七区域包含的所述第一车辆行驶方向的左后方的场景区域大于所述第八区域包含的所述左后方的场景区域。
可选地,所述显示模块1402,还用于基于所述第一车辆处于第一行驶速度,显示第九区域;
基于所述第一车辆处于第二行驶速度,显示第十区域,所述第九区域和所述第十区域为所述第一车辆行驶位置所处的场景区域,所述第二行驶速度大于所述第一行驶速度,所述第九区域包含的场景区域大于所述第十区域包含的场景区域。
可选地,所述获取模块1401,还用于获取所述第一车辆导航目的地所处的地理位置;
所述显示模块1402,还用于基于所述地理位置显示第一图像,所述第一图像用于指示所述第一车辆导航目的地所处的地理位置的类型。
可选地,所述装置还包括检测模块1403,用于检测到第三车辆;
所述获取模块1401,还用于获取所述第三车辆导航目的地所处的地理位置;
所述显示模块1402,还用于基于所述第三车辆导航目的地所处的地理位置显示第二图像,所述第二图像用于指示所述第三车辆导航目的地所处的地理位置的类型。
可选地,所述地理位置的类型至少包括如下类型的一种:城市、山区、平原、森林或海边。
可选地,所述检测模块1403,还用于检测所述第一车辆是否行驶至路口停止区域;所述显示模块1402,还用于基于检测到所述第一车辆行驶至路口停止区域,显示路口停止指示。
可选地,所述路口停止指示包括:第一路口停止指示或第二路口停止指示,所述显示模块1402,还用于:
基于所述检测模块1403检测到所述第一车辆的车头未超出所述路口停止区域,显示第一路口停止指示;
基于所述检测模块1403检测到所述第一车辆的车头超出所述路口停止区域,显示第二路口停止指示,所述第一路口停止指示与所述第二路口停止指示不同。
可选地,所述路口停止指示包括:第三路口停止指示或第四路口停止指示,所述显示模块1402,还用于:
基于所述检测模块1403检测到所述第一车辆行驶至所述路口停止区域,且所述路口停止区域对应的红绿灯为红灯或黄灯,显示第三路口停止指示;
基于所述检测模块1403检测到所述第一车辆行驶至所述路口停止区域,且所述路口停止区域对应的红绿灯为绿灯,显示第四路口停止指示,所述第三路口停止指示与所述第四路口停止指示不同。
可选地,所述检测模块1403,还用于检测到第四车辆;
所述显示模块1402,还用于基于所述第四车辆与所述第一车辆之间的距离小于预设距离,显示车辆警告提示。
可选地,所述车辆警告提示包括第一车辆警告提示或第二车辆警告提示,所述显示模块1402,还用于基于所述第四车辆与所述第一车辆之间的距离为第一距离,显示第一车辆警告提示;
基于所述第四车辆与所述第一车辆之间的距离为第二距离,显示第二车辆警告提示,所述第一距离与所述第二距离不同,所述第一车辆警告提示与所述第二车辆警告提示不同。
可选地,所述检测模块1403,还用于检测到第五车辆;
所述显示模块1402,还用于基于所述第五车辆位于所述第一车辆行驶方向的前方所在车道的车道线上,显示所述第五车辆对应的第三图像;
基于所述第五车辆行驶至所述第一车辆行驶方向的前方所在的车道上,显示所述第五车辆对应的第四图像,所述第三图像和所述第四图像不同。
本申请还提供一种车辆,包括处理器、存储器和显示器,所述处理器用于获取并执行所述存储器中的代码,以执行上述实施例中任一所述的车载设备的信息显示方法。
可选地,车辆可以是支持自动驾驶功能的智能车辆。
另外需说明的是,以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。另外,本申请提供的装置实施例附图中,模块之间的连接关系表示它们之间具有通信连接,具体可以实现为一条或多条通信总线或信号线。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到本申请可借助软件加必需的通用硬件的方式来实现,当然也可以通过专用硬件包括专用集成电路、专用CPU、专用存储器、专用元器件等来实现。一般情况下,凡由计算机程序完成的功能都可以很容易地用相应的硬件来实现,而且,用来实现同一功能的具体硬件结构也可以是多种多样的,例如模拟电路、数字电路或专用电路等。但是,对本申请而言更多情况下软件程序实现是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在可读取的存储介质中,如计算机的软盘、U盘、移动硬盘、ROM、RAM、磁碟或者光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,训练设备,或者网络设备等)执行本申请各个实施例所述的方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。
所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、训练设备或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、训练设备或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包含一个或多个可用介质集成的训练设备、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘(Solid State Disk,SSD))等。

Claims (29)

  1. 一种车载设备的信息显示方法,其特征在于,包括:
    获取第一车辆所在路面的车道线的信息,所述车道线为所述路面上用于划分不同车道的至少两条线;
    根据所述车道线的信息显示与所述车道线类型一致的虚拟车道线。
  2. 根据权利要求1所述的方法,其特征在于,所述获取第一车辆所在路面的车道线的信息,包括:
    获取第一车辆所在车道的车道线的信息。
  3. 根据权利要求1或2所述的方法,其特征在于,所述车道线至少包括如下车道线中的至少一种:虚线、实线、双虚线、双实线和虚实线。
  4. 根据权利要求1至3任一所述的方法,其特征在于,所述方法还包括:
    获取所述路面上的非机动车物体的信息;
    根据所述非机动车物体的信息显示所述非机动车物体对应的标识。
  5. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    接收共享指令,所述共享指令携带有第二车辆的地址;
    响应于所述共享指令,向所述第二车辆发送第二共享信息,所述第二共享信息包括所述非机动车物体的位置信息。
  6. 根据权利要求1至5任一所述的方法,其特征在于,所述方法还包括:
    接收到服务器或第二车辆发送的第一共享信息,所述第一共享信息包括非机动车物体的位置信息;
    基于所述第一车辆开启导航,在导航界面上显示障碍提示,所述障碍提示用于指示所述位置信息对应的位置上的非机动车物体。
  7. 根据权利要求4至6任一所述的方法,其特征在于,所述非机动车物体至少包括道路凹陷、障碍物和道路积水。
  8. 根据权利要求7所述的方法,其特征在于,所述方法还包括:
    基于所述非机动车物体位于导航指示指示的导航路径上,显示变道指示,其中,所述导航指示用于指示所述第一车辆的导航路径,所述变道指示用于指示所述第一车辆避开所述非机动车物体的行驶路径。
  9. 根据权利要求4至8任一所述的方法,其特征在于,所述方法还包括:
    基于所述第一车辆与所述非机动车物体之间的距离为第一距离,显示第一告警提示;
    基于所述第一车辆与所述非机动车物体之间的距离为第二距离,显示第二告警提示,所述第二告警提示与所述第一告警提示不同。
  10. 根据权利要求9所述的方法,其特征在于,所述第一告警提示和所述第二告警提示的颜色或透明度不同。
  11. 根据权利要求1至10任一所述的方法,其特征在于,所述方法还包括:
    获取所述第一车辆的导航信息;
    基于所述导航信息显示导航指示,所述导航指示用于指示所述第一车辆的导航路径。
  12. 根据权利要求11所述的方法,其特征在于,所述导航指示包括第一导航指示或第二导航指示,所述基于所述导航信息显示导航指示,包括:
    基于所述第一车辆处于静止状态,显示所述第一导航指示;
    基于所述第一车辆处于行驶状态,显示所述第二导航指示,所述第一导航指示和所述第二导航指示不同。
  13. 根据权利要求12所述的方法,其特征在于,所述第一导航指示和所述第二导航指示的显示颜色或透明度不同。
  14. 根据权利要求11所述的方法,其特征在于,所述导航指示包括第三导航指示或第四导航指示,所述基于所述导航信息显示导航指示,包括:
    基于所述第一车辆处于第一环境,显示所述第三导航指示;
    基于所述第一车辆处于第二环境,显示所述第四导航指示,所述第一环境与所述第二环境不同,所述第三导航指示和所述第四导航指示不同。
  15. 根据权利要求14所述的方法,其特征在于,所述第一环境至少包括如下环境中的一种:所述第一车辆所处的天气环境、所述第一车辆所处的路面环境、所述第一车辆导航目的地所处的天气环境、所述第一车辆导航目的地所处的路面环境、所述第一车辆所在道路的交通拥堵环境、所述第一车辆导航目的地所处的交通拥堵环境或所述第一车辆所处的亮度环境。
  16. 根据权利要求1至15任一所述的方法,其特征在于,所述方法还包括:
    基于所述第一车辆处于直行状态,显示第一区域;
    基于所述第一车辆由所述直行状态改变为左转弯状态,显示第二区域,其中,所述第二区域包含的所述第一车辆行驶方向的左前方的场景区域大于所述第一区域包含的所述左前方的场景区域;或,
    基于所述第一车辆处于左转弯状态,显示第三区域;
    基于所述第一车辆由所述左转弯状态改变为直行状态,显示第四区域,其中,所述第三区域包含的所述第一车辆行驶方向的右后方的场景区域大于所述第四区域包含的所述右后方的场景区域;或,
    基于所述第一车辆处于直行状态,显示第五区域;
    基于所述第一车辆由所述直行状态改变为右转弯状态,显示第六区域,其中,所述第五区域包含的所述第一车辆行驶方向的右前方的场景区域大于所述第六区域包含的所述右前方的场景区域;或,
    基于所述第一车辆处于右转弯状态,显示第七区域;
    基于所述第一车辆由所述右转弯状态改变为直行状态,显示第八区域,其中,所述第七区域包含的所述第一车辆行驶方向的左后方的场景区域大于所述第八区域包含的所述左后方的场景区域。
  17. 根据权利要求1至16任一所述的方法,其特征在于,所述方法还包括:
    基于所述第一车辆处于第一行驶速度,显示第九区域;
    基于所述第一车辆处于第二行驶速度,显示第十区域,所述第九区域和所述第十区域为所述第一车辆行驶位置所处的场景区域,所述第二行驶速度大于所述第一行驶速度,所述第九区域包含的场景区域大于所述第十区域包含的场景区域。
  18. 根据权利要求1至17任一所述的方法,其特征在于,所述方法还包括:
    获取所述第一车辆导航目的地所处的地理位置;
    基于所述地理位置显示第一图像,所述第一图像用于指示所述第一车辆导航目的地所处的地理位置的类型。
  19. 根据权利要求18所述的方法,其特征在于,所述方法还包括:
    检测到第三车辆;
    获取所述第三车辆导航目的地所处的地理位置;
    基于所述第三车辆导航目的地所处的地理位置显示第二图像,所述第二图像用于指示所述第三车辆导航目的地所处的地理位置的类型。
  20. 根据权利要求18或19所述的方法,其特征在于,所述地理位置的类型至少包括如下类型的一种:城市、山区、平原、森林或海边。
  21. 根据权利要求1至20任一所述的方法,其特征在于,所述方法还包括:
    检测到所述第一车辆行驶至路口停止区域,显示路口停止指示。
  22. 根据权利要求21所述的方法,其特征在于,所述路口停止指示包括:第一路口停止指示或第二路口停止指示,所述检测到所述第一车辆行驶至路口停止区域,显示路口停止指示,包括:
    基于检测到所述第一车辆的车头未超出所述路口停止区域,显示第一路口停止指示;
    基于检测到所述第一车辆的车头超出所述路口停止区域,显示第二路口停止指示,所述第一路口停止指示与所述第二路口停止指示不同。
  23. 根据权利要求21所述的方法,其特征在于,所述路口停止指示包括:第三路口停止指示或第四路口停止指示,所述检测到所述第一车辆行驶至路口停止区域,显示路口停止指示,包括:
    基于检测到所述第一车辆行驶至所述路口停止区域,且所述路口停止区域对应的红绿灯为红灯或黄灯,显示第三路口停止指示;
    基于检测到所述第一车辆行驶至所述路口停止区域,且所述路口停止区域对应的红绿灯为绿灯,显示第四路口停止指示,所述第三路口停止指示与所述第四路口停止指示不同。
  24. 根据权利要求1至23任一所述的方法,其特征在于,所述方法还包括:
    检测到第四车辆;
    基于所述第四车辆与所述第一车辆之间的距离小于预设距离,显示车辆警告提示。
  25. 根据权利要求24所述的方法,其特征在于,所述车辆警告提示包括第一车辆警告提示或第二车辆警告提示,所述基于所述第四车辆与所述第一车辆之间的距离小于预设距离,显示车辆警告提示,包括:
    基于所述第四车辆与所述第一车辆之间的距离为第一距离,显示第一车辆警告提示;
    基于所述第四车辆与所述第一车辆之间的距离为第二距离,显示第二车辆警告提示,所述第一距离与所述第二距离不同,所述第一车辆警告提示与所述第二车辆警告提示不同。
  26. 根据权利要求1至25任一所述的方法,其特征在于,所述方法还包括:
    检测到第五车辆;
    基于所述第五车辆位于所述第一车辆行驶方向的前方所在车道的车道线上,显示所述第五车辆对应的第三图像;
    基于所述第五车辆行驶至所述第一车辆行驶方向的前方所在的车道上,显示所述第五车辆对应的第四图像,所述第三图像和所述第四图像不同。
  27. 一种车载装置,其特征在于,包括处理器和存储器,所述处理器用于获取并执行所述存储器中的代码,以执行所述权利要求1至26任一所述的方法。
  28. 一种车辆,包括处理器、存储器和显示器,所述处理器用于获取并执行所述存储器中的代码,以执行所述权利要求1至26任一所述的方法。
  29. 根据权利要求28所述的车辆,其特征在于,所述车辆支持无人驾驶功能。
PCT/CN2020/110506 2019-09-25 2020-08-21 一种车载设备的信息显示方法、装置及车辆 WO2021057352A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/703,053 US20220212690A1 (en) 2019-09-25 2022-03-24 Vehicle-mounted device information display method, apparatus, and vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910912412.5A CN110775063B (zh) 2019-09-25 2019-09-25 一种车载设备的信息显示方法、装置及车辆
CN201910912412.5 2019-09-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/703,053 Continuation US20220212690A1 (en) 2019-09-25 2022-03-24 Vehicle-mounted device information display method, apparatus, and vehicle

Publications (1)

Publication Number Publication Date
WO2021057352A1 true WO2021057352A1 (zh) 2021-04-01

Family

ID=69384343

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/110506 WO2021057352A1 (zh) 2019-09-25 2020-08-21 一种车载设备的信息显示方法、装置及车辆

Country Status (3)

Country Link
US (1) US20220212690A1 (zh)
CN (1) CN110775063B (zh)
WO (1) WO2021057352A1 (zh)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110775063B (zh) * 2019-09-25 2021-08-13 华为技术有限公司 一种车载设备的信息显示方法、装置及车辆
CN111290386B (zh) * 2020-02-20 2023-08-04 北京小马慧行科技有限公司 路径规划方法及装置、运载工具
DE102020107739B3 (de) * 2020-03-20 2021-06-02 Webasto SE Fahrzeugdach mit Umfeldsensor und Reinigungseinrichtung
CN111959528B (zh) * 2020-08-20 2021-11-02 广州小马智行科技有限公司 移动载体的显示设备的控制方法、装置与处理器
CN112639580A (zh) * 2020-09-14 2021-04-09 华为技术有限公司 抬头显示装置、抬头显示方法及车辆
US12024170B2 (en) 2020-12-14 2024-07-02 Zoox, Inc. Lane change gap finder
JP2022138171A (ja) * 2021-03-10 2022-09-26 矢崎総業株式会社 車両用表示装置
JP7447039B2 (ja) * 2021-03-10 2024-03-11 矢崎総業株式会社 車両用表示装置
CN113183758A (zh) * 2021-04-28 2021-07-30 昭通亮风台信息科技有限公司 一种基于增强现实的辅助驾驶方法及系统
CN113232661B (zh) * 2021-05-28 2023-05-12 广州小鹏汽车科技有限公司 控制方法、车载终端及车辆
JP7308880B2 (ja) * 2021-06-03 2023-07-14 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
CN113256989B (zh) * 2021-07-07 2021-11-19 智道网联科技(北京)有限公司 行车警示方法、装置、车载终端和存储介质
CN113761007A (zh) * 2021-09-10 2021-12-07 阿波罗智联(北京)科技有限公司 地图界面显示方法、装置、设备、存储介质和程序产品
CN114440929A (zh) * 2022-01-28 2022-05-06 中国第一汽车股份有限公司 一种高精度地图的测试评估方法、装置、车辆及介质
CN116929351A (zh) * 2022-03-31 2023-10-24 华为技术有限公司 导航方法及电子设备
CN115220227A (zh) * 2022-04-18 2022-10-21 长城汽车股份有限公司 一种增强现实抬头显示方法、装置及终端设备
CN116974495A (zh) 2022-04-21 2023-10-31 金宝电子工业股份有限公司 显示后视图像的方法和使用所述方法的移动装置
TWI824496B (zh) * 2022-04-21 2023-12-01 金寶電子工業股份有限公司 顯示後視影像的方法和使用該方法的行動裝置
CN114964298A (zh) * 2022-05-23 2022-08-30 广州小鹏汽车科技有限公司 显示方法、车辆和计算机可读存储介质
CN115123303A (zh) * 2022-07-18 2022-09-30 腾讯科技(深圳)有限公司 车辆驾驶状态展示方法、装置、电子设备和存储介质
CN117622180A (zh) * 2022-08-11 2024-03-01 华为技术有限公司 一种显示方法、控制装置和车辆
CN115472031A (zh) * 2022-08-15 2022-12-13 北京罗克维尔斯科技有限公司 信息显示方法、装置、设备、介质、产品及车辆
US12077174B2 (en) * 2022-08-24 2024-09-03 Toyota Motor Engineering & Manufacturing North America, Inc. Compensating mismatch in abnormal driving behavior detection
CN116608879B (zh) * 2023-05-19 2024-11-01 亿咖通(湖北)技术有限公司 信息显示方法、设备、存储介质及程序产品

Citations (6)

Publication number Priority date Publication date Assignee Title
CN1833934A (zh) * 2005-09-09 2006-09-20 中国科学院自动化研究所 汽车行驶安全监控系统及监控方法
CN106094809A (zh) * 2015-04-30 2016-11-09 Lg电子株式会社 车辆驾驶辅助装置
CN108128243A (zh) * 2016-12-01 2018-06-08 株式会社斯巴鲁 车辆用显示装置
US20180297590A1 (en) * 2017-04-18 2018-10-18 Hyundai Motor Company Vehicle and method for supporting driving safety of vehicle
CN109387211A (zh) * 2017-08-14 2019-02-26 通用汽车环球科技运作有限责任公司 用于改进使用v2x通信系统时的障碍物感知的系统和方法
CN110775063A (zh) * 2019-09-25 2020-02-11 华为技术有限公司 一种车载设备的信息显示方法、装置及车辆

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP4777786B2 (ja) * 2006-02-01 2011-09-21 クラリオン株式会社 車載地図表示装置
US8676431B1 (en) * 2013-03-12 2014-03-18 Google Inc. User interface for displaying object-based indications in an autonomous driving system
TWI600558B (zh) * 2014-04-01 2017-10-01 Dynamic lane detection system and method
KR102263731B1 (ko) * 2014-11-11 2021-06-11 현대모비스 주식회사 주변차량의 위치정보 보정 시스템 및 방법
US9965957B2 (en) * 2014-11-26 2018-05-08 Mitsubishi Electric Corporation Driving support apparatus and driving support method
JP2016199204A (ja) * 2015-04-14 2016-12-01 トヨタ自動車株式会社 車両制御装置
US11270589B2 (en) * 2017-08-25 2022-03-08 Nissan Motor Co., Ltd. Surrounding vehicle display method and surrounding vehicle display device
JP6877571B2 (ja) * 2017-11-10 2021-05-26 本田技研工業株式会社 表示システム、表示方法、およびプログラム
US10595176B1 (en) * 2018-09-19 2020-03-17 Denso International America, Inc. Virtual lane lines for connected vehicles
JP7023817B2 (ja) * 2018-09-19 2022-02-22 本田技研工業株式会社 表示システム、表示方法、およびプログラム
CN109823268A (zh) * 2019-02-19 2019-05-31 百度在线网络技术(北京)有限公司 一种危险道路行为告警方法、装置、服务器及系统


Also Published As

Publication number Publication date
CN110775063B (zh) 2021-08-13
CN110775063A (zh) 2020-02-11
US20220212690A1 (en) 2022-07-07

Similar Documents

Publication Publication Date Title
WO2021057352A1 (zh) 一种车载设备的信息显示方法、装置及车辆
US11619940B2 (en) Operating an autonomous vehicle according to road user reaction modeling with occlusions
US10232713B2 (en) Lamp for a vehicle
WO2021135371A1 (zh) 一种自动驾驶方法、相关设备及计算机可读存储介质
CN112298207B (zh) 当存在失去行驶能力的自主车辆时维持道路安全
CN113968216B (zh) 一种车辆碰撞检测方法、装置及计算机可读存储介质
CN110789533B (zh) 一种数据呈现的方法及终端设备
CA3099840A1 (en) System and method for using v2x and sensor data
US20230168095A1 (en) Route providing device and route providing method therefor
US20190276044A1 (en) User interface apparatus for vehicle and vehicle including the same
EP3995379B1 (en) Behavior prediction for railway agents for autonomous driving system
US11745761B2 (en) Path providing device and path providing method thereof
EP4145409A1 (en) Pipeline architecture for road sign detection and evaluation
US20210039674A1 (en) Path providing device and path providing method thereof
US20180135972A1 (en) Using map information to smooth objects generated from sensor data
KR20210083048A (ko) 경로 제공 장치 및 그것의 경로 제공 방법
US20230168102A1 (en) Device for providing route and method for providing route therefor
US20230192134A1 (en) Methods and Systems for Providing Incremental Remote Assistance to an Autonomous Vehicle
CN114616153A (zh) 防碰撞的方法和控制装置
CN116278739A (zh) 一种风险提醒方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20869031

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20869031

Country of ref document: EP

Kind code of ref document: A1