
CN118025211A - Vehicle early warning method and electronic equipment - Google Patents

Vehicle early warning method and electronic equipment

Info

Publication number
CN118025211A
Authority
CN
China
Prior art keywords
vehicle
distance
obstacle
information
equal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211418783.6A
Other languages
Chinese (zh)
Inventor
邢海峰
李淑玲
饶刚
卢恒惠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211418783.6A priority Critical patent/CN118025211A/en
Publication of CN118025211A publication Critical patent/CN118025211A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/50 Barriers
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402 Type
    • B60W2554/80 Spatial relation or speed relative to objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a vehicle early warning method and an electronic device. The method includes: acquiring information of a blind area of a vehicle, and determining an obstacle located in the blind area according to that information; determining a display area for the obstacle on the vehicle-mounted HUD according to a first relative position of the obstacle with respect to the vehicle's direction of travel; and, if the distance between the vehicle and the obstacle, or between the vehicle and a curve around the vehicle, is less than or equal to a first distance, displaying an image of the obstacle in the display area in a gradual form, in which the smaller the distance, the greater the contrast between the parameter values of the obstacle image and those of the initial image displayed on the vehicle-mounted HUD. This reduces the safety risks of vehicle early warning.

Description

Vehicle early warning method and electronic equipment
Technical Field
The present application relates to the field of electronic devices, and in particular, to a vehicle early warning method and an electronic device.
Background
While a vehicle is moving, the driver needs to know whether pedestrians, other vehicles, or similar hazards are present in the vehicle's blind area, so that the driver can decide in advance whether to change direction or slow down. This reduces traffic accidents and improves traffic safety.
At present, several ways exist to warn about vehicle blind-area information. For roads not covered by internet-of-vehicles (vehicle to everything, V2X) devices, the driver can view the blind area through a roadside convex mirror. For roads covered by V2X devices, the vehicle can exchange information with a roadside V2X device through its on-board V2X device, obtain blind-area information, generate a blind-area image from that information, and show the image on the on-board display. However, when the driver looks at the blind-area image, the driver's line of sight must leave the direction of travel, which increases safety risks.
Existing vehicle early warning approaches therefore carry substantial safety risks.
Disclosure of Invention
The embodiment of the application provides a vehicle early warning method and electronic equipment, which are used for reducing potential safety hazards of vehicle early warning.
In a first aspect, the present application provides a vehicle early warning method. The method includes: acquiring information of a blind area of a vehicle, and determining an obstacle located in the blind area according to that information; determining a display area for the obstacle on the vehicle-mounted HUD according to a first relative position of the obstacle with respect to the vehicle's direction of travel; and displaying an image of the obstacle in the display area in a gradual form if the distance between the vehicle and the obstacle, or between the vehicle and a curve around the vehicle, is less than or equal to a first distance, where in the gradual form, the smaller the distance, the greater the contrast between the parameter values of the obstacle image and those of the initial image displayed on the vehicle-mounted HUD.
With this method, the display area of the obstacle on the vehicle-mounted HUD is determined from the relative position of the obstacle in the blind area with respect to the vehicle's direction of travel, and the obstacle image is shown in that area once the distance to the obstacle or curve falls to the first distance or below. The driver can therefore perceive the position of the obstacle at a glance from where it appears on the HUD, which improves early warning efficiency. Moreover, because the obstacle image does not appear abruptly on the vehicle-mounted HUD but fades in gradually as the distance to the obstacle or curve shrinks, it disturbs the driver less: a suddenly appearing image would distract the driver's attention, so safety risks are reduced.
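The gradual-display rule above can be sketched in code. This is a minimal illustration, not the patent's implementation; the function name, the 50 m first distance, and the linear contrast ramp are all assumptions:

```python
def obstacle_image_contrast(distance_m: float,
                            first_distance_m: float = 50.0,
                            max_contrast: float = 1.0) -> float:
    """Contrast of the obstacle image against the HUD's initial image.

    Returns 0.0 when the distance is at or beyond the first distance
    (image not shown), rising to max_contrast as the distance reaches 0.
    The linear ramp is an illustrative assumption.
    """
    if distance_m >= first_distance_m:
        return 0.0
    # Smaller distance -> larger contrast, per the gradual form.
    return max_contrast * (1.0 - distance_m / first_distance_m)
```

For example, at half the first distance the obstacle image would be rendered at half the maximum contrast; in a real system the ramp could equally be nonlinear or stepped.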
In one possible design, the method may further include: determining a probability of collision between the obstacle and the vehicle if the distance is less than or equal to a second distance, where the second distance is less than the first distance; and reducing the travel speed of the vehicle if the probability is greater than or equal to a first probability.
In this design, when the distance between the vehicle and the obstacle or curve is less than or equal to the second distance, the method determines whether the probability of collision is at least the first probability, that is, whether a collision is likely; if so, the travel speed of the vehicle is reduced, which lowers the safety risk.
In one possible design, reducing the travel speed of the vehicle if the probability is greater than or equal to the first probability may include: reducing the travel speed of the vehicle if the probability is greater than or equal to the first probability and the travel speed is greater than a first speed.
In this design, when a collision is likely, the method checks whether the travel speed exceeds the first speed, that is, whether the driver has already taken a deceleration measure; if the driver has not, the vehicle's travel speed is reduced, lowering the safety risk.
In one possible design, the method may further include: outputting information indicating that the travel speed of the vehicle should be reduced if the probability is less than or equal to the first probability.
In this design, when a collision is unlikely, information prompting the driver to slow down can be output, for example a voice message such as "obstacle in blind area, slow down" or a vibration of the tightening seat belt, which lowers the safety risk.
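The decision flow of the designs above (a collision-probability gate, followed by a check on whether the driver has already slowed down) can be sketched as follows. The threshold values and names are illustrative assumptions, not values from the patent:

```python
def react_to_obstacle(distance_m: float, speed_kmh: float,
                      collision_prob: float,
                      second_distance_m: float = 20.0,
                      first_prob: float = 0.5,
                      first_speed_kmh: float = 30.0) -> str:
    """Return the action suggested by the design: 'none', 'brake', or 'warn'."""
    if distance_m > second_distance_m:
        return "none"                # outside the second distance: no intervention
    if collision_prob >= first_prob:
        if speed_kmh > first_speed_kmh:
            return "brake"           # driver has not slowed down: reduce speed
        return "none"                # driver is already decelerating
    return "warn"                    # collision unlikely: prompt the driver instead
```

Sounding out the branches: a fast approach with a high collision probability triggers braking, while a low probability only triggers a prompt such as the voice message above.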
In one possible design, the method may further include: determining whether a no-horn sign is present around the vehicle if the distance is less than or equal to a second distance, where the second distance is less than the first distance; and sounding the vehicle's horn if no such sign is present.
In this design, when the distance between the vehicle and the obstacle or curve is less than or equal to the second distance and no no-horn sign is present around the vehicle, the vehicle's horn can be sounded to alert both the driver and the obstacle in the blind area (for example a pedestrian or another vehicle), lowering the safety risk.
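A minimal sketch of the horn design, with hypothetical names; the sign check would in practice come from a perception module:

```python
def should_sound_horn(distance_m: float, second_distance_m: float,
                      no_horn_sign_detected: bool) -> bool:
    """Sound the horn only inside the second distance and where horns are allowed."""
    return distance_m <= second_distance_m and not no_horn_sign_detected
```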
In one possible design, the method may further include: determining a projection angle of a headlight of the vehicle according to the first relative position and the distance if the distance is less than or equal to a second distance, where the second distance is less than the first distance; and controlling the headlight to project light toward the obstacle at that angle.
In one possible design, the method may further include: determining a projection angle of a headlight of the vehicle according to a second relative position and the distance if the distance is less than or equal to a second distance, where the second distance is less than the first distance and the second relative position is the position of the curve relative to the vehicle's direction of travel; and controlling the headlight to project light toward the curve at that angle.
In these designs, when the distance between the vehicle and the obstacle or curve is less than or equal to the second distance, the headlight can be steered toward the obstacle at an angle determined from the first relative position and the distance to the obstacle, or toward the curve at an angle determined from the second relative position and the distance to the curve, alerting obstacles in the blind area (such as pedestrians or other vehicles) and lowering the safety risk.
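Under a simple planar model, the projection angle in the headlight designs can be derived from the relative position and the distance. This sketch is an assumption for illustration, not the patent's formula:

```python
import math

def headlight_projection_angle(lateral_offset_m: float,
                               distance_m: float) -> float:
    """Angle (degrees) to steer the headlight toward the obstacle or curve.

    lateral_offset_m encodes the relative position: negative means left of
    the direction of travel, positive means right. A flat-ground, planar
    geometry is assumed.
    """
    return math.degrees(math.atan2(lateral_offset_m, distance_m))
```

An obstacle 5 m to the right at 5 m ahead would give a 45-degree steer to the right; an obstacle straight ahead gives 0 degrees.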
In one possible design, the method may further include: outputting information indicating that the travel speed of the vehicle should be reduced if the distance is less than or equal to the first distance and greater than a second distance, where the second distance is less than the first distance.
In this design, when the distance between the vehicle and the obstacle or curve lies between the second and first distances, information prompting the driver to slow down can be output, for example a voice message such as "obstacle in blind area, slow down" or a vibration of the tightening seat belt, lowering the safety risk.
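Taken together, the first and second distances partition the warning behaviour into tiers, which can be sketched as follows (the threshold values and tier labels are illustrative assumptions):

```python
def warning_tier(distance_m: float,
                 first_distance_m: float = 50.0,
                 second_distance_m: float = 20.0) -> str:
    """Map the distance to the obstacle/curve onto a warning tier."""
    if distance_m > first_distance_m:
        return "none"                 # too far: nothing shown
    if distance_m > second_distance_m:
        return "display_and_prompt"   # gradual HUD image + slow-down prompt
    return "intervene"                # horn / headlight / speed-reduction checks
```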
In one possible design, acquiring the blind-area information of the vehicle may include: receiving the blind-area information from a roadside internet-of-vehicles (V2X) device; or capturing, through the on-board camera, the blind-area information shown by a roadside convex mirror.
In this design, the blind-area information can be acquired using equipment already present on the road, such as a roadside V2X device or a roadside convex mirror, which lowers the cost of vehicle early warning.
In one possible design, capturing the blind-area information shown by the roadside convex mirror through the on-board camera may include: detecting a user operation or determining that the curve exists around the vehicle, where the user operation instructs the vehicle to start early warning; and capturing, through the on-board camera, the blind-area information shown by the roadside convex mirror if the brightness around the vehicle is greater than or equal to a first brightness.
In this design, because the ambient brightness affects the accuracy of the blind-area information obtained by photographing the roadside convex mirror, the method first checks, upon detecting the user operation or determining that a curve is present, whether the ambient brightness is at least the first brightness; only then is the mirror photographed, which improves early warning efficiency.
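The brightness-gated trigger described above can be sketched as follows (all names are assumptions; in practice the brightness would come from an ambient-light sensor or image statistics):

```python
def should_capture_mirror(ambient_brightness: float,
                          first_brightness: float,
                          user_requested: bool,
                          curve_ahead: bool) -> bool:
    """Photograph the roadside convex mirror only when early warning was
    triggered (by the user or by a detected curve) and the scene is bright
    enough for the captured blind-area information to be reliable."""
    triggered = user_requested or curve_ahead
    return triggered and ambient_brightness >= first_brightness
```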
In one possible design, determining the display area of the obstacle on the vehicle-mounted HUD according to the first relative position of the obstacle with respect to the vehicle's direction of travel may include: determining the display area to be the left region of the vehicle-mounted HUD if the obstacle is to the left of the direction of travel; or the right region if the obstacle is to the right; or the middle region if the obstacle is ahead.
In this design, the display area of the obstacle on the vehicle-mounted HUD is determined from the obstacle's position relative to the direction of travel, so the driver can perceive the obstacle's position at a glance from where it appears on the HUD, improving early warning efficiency.
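The position-to-region rule above is a simple lookup; a sketch with assumed labels:

```python
def hud_display_area(relative_position: str) -> str:
    """Map the obstacle's position relative to the direction of travel
    ('left', 'right', or 'front') to a HUD region."""
    mapping = {"left": "left", "right": "right", "front": "middle"}
    return mapping.get(relative_position, "middle")  # default is an assumption
```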
In one possible design, before displaying the obstacle in the display area in the gradual form, the method may further include: stopping display of the initial image in the display area; or stopping display, on the vehicle-mounted HUD, of first information included in the initial image, where the priority of the first information is lower than a first priority.
In this design, stopping display of the initial image in the display area, or of the first information whose priority is below the first priority, before the obstacle is shown prevents the information on the vehicle-mounted HUD from becoming cluttered, so the driver can perceive the obstacle's position at a glance, improving early warning efficiency and lowering the safety risk.
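The priority-based hiding of initial-image information can be sketched as a filter; the item structure and priority scale are assumptions:

```python
def filter_hud_info(initial_info: list, first_priority: int) -> list:
    """Keep only initial-image items whose priority is at or above the first
    priority; lower-priority items are hidden to free HUD space for the
    obstacle image."""
    return [item for item in initial_info if item["priority"] >= first_priority]
```

For example, with `first_priority = 1`, a speed readout of priority 2 survives while a priority-0 media widget is hidden.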
In a second aspect, the present application provides a vehicle warning device having a function for implementing the method described in the first aspect or any of the possible designs of the first aspect, where the function may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions, such as a first processing module, a second processing module, and a display module.
The first processing module is used for acquiring information of a blind area of the vehicle and determining an obstacle positioned in the blind area according to the information of the blind area;
The second processing module is configured to determine a display area of the obstacle on the vehicle-mounted augmented reality head-up display (AR-HUD) according to a first relative position of the obstacle with respect to the vehicle's direction of travel;
And a display module configured to display an image of the obstacle in the display area in a gradual form if the distance between the vehicle and the obstacle, or between the vehicle and a curve around the vehicle, is less than or equal to a first distance, where in the gradual form, the smaller the distance, the greater the contrast between the parameter values of the obstacle image and those of the initial image displayed on the vehicle-mounted HUD.
In one possible design, the apparatus further comprises a third processing module for: determining a probability of collision of the obstacle with the vehicle if the distance is less than or equal to a second distance, wherein the second distance is less than the first distance; if the probability is greater than or equal to the first probability, the travel speed of the vehicle is reduced.
In one possible design, the third processing module is specifically configured to, when reducing the running speed of the vehicle if the probability is greater than or equal to the first probability: and if the probability is greater than or equal to the first probability and the running speed of the vehicle is greater than the first speed, reducing the running speed of the vehicle.
In one possible design, the third processing module is further configured to: if the probability is less than or equal to the first probability, information indicating to reduce the running speed of the vehicle is output.
In one possible design, the apparatus further comprises a fourth processing module configured to: determine whether a no-horn sign is present around the vehicle if the distance is less than or equal to a second distance, where the second distance is less than the first distance; and sound the vehicle's horn if no such sign is present.
In one possible design, the apparatus further comprises a fifth processing module for: determining a projection angle of a headlight of the vehicle according to the first relative position and the distance if the distance is smaller than or equal to a second distance, wherein the second distance is smaller than the first distance; and controlling the headlight to project light to the obstacle according to the projection angle.
In one possible design, the apparatus further comprises a sixth processing module for: if the distance is smaller than or equal to a second distance, determining a projection angle of a headlight of the vehicle according to a second relative position and the distance, wherein the second distance is smaller than the first distance, and the second relative position is a relative position of the curve and a running direction of the vehicle; and controlling the headlight to project light to the curve according to the projection angle.
In one possible design, the apparatus further comprises a seventh processing module for: and outputting information indicating to reduce the running speed of the vehicle if the distance is less than or equal to the first distance and greater than a second distance, wherein the second distance is less than the first distance.
In one possible design, the first processing module is specifically configured to: receiving information of the blind area from the road side vehicle networking V2X equipment; or shooting the information of the blind area displayed by the road side convex mirror through the vehicle-mounted camera.
In one possible design, the first processing module is specifically configured to: detecting a user operation or determining that the curve exists around the vehicle, wherein the user operation is used for indicating the vehicle to start vehicle early warning; and if the brightness around the vehicle is greater than or equal to the first brightness, shooting the information of the blind area displayed by the road side convex mirror through the vehicle-mounted camera.
In one possible design, the second processing module is specifically configured to: if the obstacle is positioned at the left side of the running direction of the vehicle, determining the display area as a left side area of the vehicle-mounted HUD; or if the obstacle is located on the right side of the running direction of the vehicle, determining the display area to be the right side area of the vehicle-mounted HUD; or if the obstacle is located in front of the running direction of the vehicle, determining the display area as an intermediate area of the vehicle-mounted HUD.
In one possible design, the display module is further configured to, prior to displaying the obstacle in the display area in a gradual manner: stopping displaying the initial image on the display area; or stopping displaying the first information included in the initial image on the vehicle-mounted HUD, wherein the priority of the first information is smaller than the first priority.
In a third aspect, the present application also provides an electronic device, including: a processor, a memory, and one or more programs; wherein the one or more programs are stored in the memory, the one or more programs comprising instructions, which when executed by the processor, cause the electronic device to perform the method as described in the first aspect or any of the possible designs of the first aspect.
In a fourth aspect, the present application provides a computer readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method as described in the first aspect or any of the possible designs of the first aspect.
In a fifth aspect, the application provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method as described in the first aspect or any of the possible designs of the first aspect.
The advantages of the second to fifth aspects and possible designs thereof described above may be referred to the description of the advantages of the method described in the first aspect and any possible designs thereof.
Drawings
FIG. 1 is a schematic diagram of a vehicle warning scenario in the prior art;
FIG. 2 is a schematic diagram of the projected content of an AR-HUD according to the prior art;
Fig. 3 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a vehicle early warning method according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a vehicle early warning scene provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of another vehicle early warning scenario provided in an embodiment of the present application;
fig. 7 is a schematic hardware structure of another electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments are described clearly and completely below with reference to the accompanying drawings.
Some terms involved in the embodiments of the present application are explained below in order to understand the embodiments of the present application.
(1) In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. The words "first," "second," and the like are used only to distinguish between descriptions and do not indicate or imply relative importance or order; for example, "first object" and "second object" do not rank the two objects, but merely tell them apart. In the embodiments of the present application, "and/or" describes an association relationship covering three cases: for example, "A and/or B" may mean A alone, A and B together, or B alone. The character "/" generally indicates an "or" relationship between the associated objects.
In describing the embodiments of the present application, unless otherwise explicitly specified and limited, terms such as "mounted" and "connected" should be understood broadly: for example, "connected" may mean removably or non-removably connected, and directly connected or indirectly connected through an intermediate medium. Directional terms such as "upper," "lower," "left," "right," "inner," and "outer" refer only to the orientations of the drawings; they are used to describe the embodiments more clearly and do not indicate or imply that the devices or elements must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the embodiments. "Plurality" means at least two.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the specification. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
(2) Internet of vehicles (vehicle to everything, V2X) equipment
V2X is the interconnection of the vehicle with everything: communication between the vehicle and its surroundings based on LTE-vehicle (LTE-V) or dedicated short-range communication (DSRC), and a key technology of future intelligent transportation systems. V2X is an umbrella term for LTE-V- or DSRC-based vehicle-to-vehicle (V2V) communication, vehicle-to-infrastructure (V2I) communication, and the like. V2X enables bidirectional information transfer between the vehicle and any entity that may affect it, for example between the vehicle and pedestrians, other vehicles, or base stations, so as to obtain the vehicle's running information, real-time road conditions, road information, pedestrian information, and other data, improving driving safety, reducing congestion, increasing traffic efficiency, and providing in-vehicle infotainment.
(3) Vehicle Head Up Display (HUD)
The in-vehicle HUD may be a conventional HUD or an augmented reality (AR) HUD. A conventional HUD is a display device that projects an image into the driver's forward view: using the principle of optical reflection, it projects a two-dimensional image onto the windshield at roughly the driver's eye level, so that when the driver looks forward through the windshield, the projected image appears on a virtual image plane in front of the windshield. Compared with conventional instruments and the central control screen, the driver observing the HUD image does not need to look down, which avoids switching the line of sight back and forth between the display and the road, shortens reaction time in a crisis, and improves driving safety.
The AR-HUD fuses the projected AR content with real road information, enriching the driver's view of the road and enabling functions such as AR navigation and AR early warning. To realize these functions, three-dimensional perception data from the sensors must be fed into a virtual three-dimensional space for augmented-reality rendering; after rendering, the data are mapped onto the two-dimensional virtual image plane displayed by the HUD and finally mapped back into three-dimensional space through the driver's eyes. Throughout this process, the driver's eye, the HUD display content, and the real object must remain on one line ("three points, one line"), and the size and position of the HUD content as seen by the eye must match the real object, so that the virtual image fuses exactly with its real counterpart and the AR effect matches the scene. Moreover, when the driver's eye position changes, whether because the same driver shifts position while driving or because a different driver takes the seat, the HUD display must be adjusted accordingly so that it always stays fused with the real road information.
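The "three points, one line" constraint amounts to projecting the real object onto the HUD's virtual image plane along the ray from the eye. A pinhole-model sketch, with coordinates and names as illustrative assumptions:

```python
def project_to_virtual_image(eye: tuple, obj: tuple,
                             plane_distance: float) -> tuple:
    """Place a mark on the HUD virtual image plane so that the eye, the mark,
    and the real object lie on one line.

    eye, obj: (x, y, z) points in metres in a common frame, z = forward;
    plane_distance: z-distance of the virtual image plane from the eye.
    """
    ex, ey, ez = eye
    ox, oy, oz = obj
    t = plane_distance / (oz - ez)   # similar-triangles scale factor
    return (ex + t * (ox - ex), ey + t * (oy - ey))
```

An object 10 m ahead and 2 m to the right, seen through a virtual image plane 5 m away, lands 1 m to the right on that plane; when the eye moves, recomputing with the new eye position gives the adjusted display location.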
For example, fig. 1 is a schematic diagram of a conventional vehicle early warning scenario. As shown in fig. 1, the on-board V2X device of a target vehicle (e.g., an on-board unit (OBU)) may acquire information of surrounding pedestrians, motor vehicles, non-motor vehicles, and the like, particularly traffic information in the detection blind areas of on-board sensors such as the on-board camera and on-board radar, from a road side V2X device (e.g., a road side unit (RSU)) or the OBU of another vehicle. The V2X information received by the OBU of the target vehicle may come from road side sensors such as cameras or radars: for example, such sensors may detect information of pedestrians and non-motor vehicles and send the detected information to the RSU, which may forward it to the OBU of the target vehicle through the PC5 communication protocol or other protocols. Alternatively, the V2X information received by the OBU of the target vehicle may come from other vehicles, whose OBUs may send their own information to the OBU of the target vehicle through the PC5 communication protocol or other protocols.
The OBU of the target vehicle may transmit the received V2X information to an electronic control unit (ECU) of the advanced driver assistance system (ADAS) through the Ethernet protocol and the controller area network (CAN) protocol, so that the ECU of the ADAS can run its perception algorithms on information from the on-board camera, the on-board radar, and the OBU. From the image data around the vehicle body collected by the on-board camera, the ECU of the ADAS can run object detection and classification algorithms and, by fusing radar data collected by the on-board radar for positioning, identify the type (e.g., dotted line, solid line, single line, double line), color (e.g., yellow, white), position, and curvature change of the lane lines around the vehicle, as well as the type (e.g., pedestrian, motor vehicle, non-motor vehicle), position, size, and moving speed of obstacles around the vehicle. From the V2X information received by the OBU, the ECU of the ADAS can acquire the position, speed, and other information of pedestrians, motor vehicles, non-motor vehicles, and other traffic participants with V2X communication capability that lie in the detection blind areas of the on-board camera and on-board radar.
The ECU of the ADAS can send the results of the perception algorithms to the head unit through the Ethernet protocol, and the AR Core of the head unit combines information such as navigation, the inertial measurement unit (IMU), the global positioning system (GPS), and the vehicle state to generate the projection content of the AR-HUD and calibrate its position. Basic state information such as vehicle speed, road speed limit, and remaining fuel/battery level is projected on the near-view layer of the AR-HUD, generally 3 to 5 meters in front of the driver's line of sight; the position of this content in the near-view layer is fixed and requires no calibration. ADAS information (such as lane departure warning (LDW) and forward collision warning (FCW)) and AR navigation information (such as lane change indications and exit reminders) are projected on the far-view layer of the AR-HUD, generally more than 10 meters in front of the driver's line of sight; this content must be calibrated against the actual environment of the road ahead, e.g., the lane departure warning mark must be projected on the lane line, and the forward collision warning mark must be projected on the road surface below the tail of the vehicle ahead. For example, if the ECU of the ADAS recognizes from the V2X information received by the OBU that a traffic participant in the detection blind areas of the on-board camera and on-board radar is a pedestrian, it fuses a graphic mark representing a pedestrian into the projected image.
Fig. 2 is a schematic diagram of the projection content of an AR-HUD in the prior art. As shown in fig. 2, the projection content of the AR-HUD is divided into five parts: position A shows the current and impending state of the traffic signal and its waiting time, position B shows the current vehicle speed, position C shows map navigation information acquired from the network, position D shows the traffic condition ahead, and position E shows the surrounding-vehicle distance early warning, which may, for example, remind the driver that another vehicle is too close.
Currently, for roads not covered by V2X devices, the driver can view the vehicle blind area image through a road side convex mirror. For roads covered by V2X devices, the vehicle can interact with the road side V2X device through its on-board V2X device to acquire vehicle blind area information and generate a vehicle blind area image from that information, which the driver can watch on an on-board display (a common on-board display, or a special on-board display such as an on-board conventional HUD or on-board AR-HUD). When the driver views the vehicle blind area image through a road side convex mirror or a common on-board display, the line of sight must deviate from the traveling direction of the vehicle, which increases the potential safety hazard. When the driver views the vehicle blind area image through a special on-board display, as shown in fig. 1 and fig. 2, the line of sight need not deviate from the traveling direction of the vehicle; however, because the vehicle blind area image is projected directly onto the in-vehicle HUD without notifying the driver in advance, an image appearing suddenly in the traveling direction of the vehicle may distract the driver, which also increases the potential safety hazard.
In view of the above, an embodiment of the application provides a vehicle early warning method for reducing the potential safety hazards of vehicle early warning. In the method, the display area of an obstacle on the in-vehicle HUD is determined from the relative position of the obstacle in the blind area of the vehicle and the traveling direction of the vehicle, and when the distance between the vehicle and the obstacle, or between the vehicle and a curve, is smaller than or equal to a first distance, the image of the obstacle is displayed in that display area, so that the driver can immediately perceive the position of the obstacle from its display area on the in-vehicle HUD, improving the efficiency of vehicle early warning. Moreover, because the image of the obstacle on the in-vehicle HUD does not appear suddenly but changes gradually with the distance between the vehicle and the obstacle or the curve around the vehicle, its influence on the driver is smaller: the driver's attention is not scattered by a suddenly appearing image, and the potential safety hazard is reduced.
The vehicle early warning method provided by the embodiment of the application can be implemented by any electronic device capable of controlling the vehicle to execute the vehicle control function corresponding to a vehicle control signal. In some embodiments of the application, the electronic device may be a portable device external to the vehicle, such as a mobile phone, a tablet computer, a wearable device with wireless communication capability (e.g., a watch, bracelet, helmet, or headset), an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA). The electronic device may also be an in-vehicle terminal device in the vehicle. The embodiment of the application does not limit the specific type of the electronic device.
Fig. 3 is a schematic hardware structure of an electronic device according to an embodiment of the present application. As shown in fig. 3, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. The controller may be the neural hub and command center of the electronic device 100. The controller can generate operation control signals according to the instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution. A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving the efficiency of the system. The execution of the vehicle early warning method in the embodiment of the present application may be completed by the processor 110 controlling or calling other components, for example, calling the processing program of the embodiment of the present application stored in the internal memory 121, or calling, through the external memory interface 120, the processing program of the embodiment of the present application stored on a third-party device, so as to control the vehicle to execute the vehicle early warning function, improve the vehicle early warning efficiency, and reduce the potential safety hazards of vehicle early warning.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, software code of at least one application program, and the like. The storage data area may store data (e.g., captured images, recorded video, etc.) generated during use of the electronic device 100, and so forth. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as pictures and videos are stored in an external memory card.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate and amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies can include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS). For example, the electronic device 100 may communicate with an on-board V2X device (e.g., the OBU of a vehicle) through a wireless communication technology and acquire the information of the blind area of the vehicle that the on-board V2X device received from a road side V2X device (e.g., an RSU or the OBU of another vehicle).
The display 194 is used to display the display interface of an application, such as the display page of an application installed on the electronic device 100. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N displays 194, N being a positive integer greater than 1.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The sensor module 180 may include a pressure sensor 180A, an acceleration sensor 180B, a touch sensor 180C, and the like, among others.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194.
The touch sensor 180C, also referred to as a "touch panel". The touch sensor 180C may be disposed on the display 194, and the touch sensor 180C and the display 194 form a touch screen, which is also referred to as a "touch screen". The touch sensor 180C is used to detect a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180C may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc. The SIM card interface 195 is used to connect a SIM card. The SIM card may be contacted and separated from the electronic device 100 by inserting the SIM card interface 195 or extracting it from the SIM card interface 195.
It is to be understood that the components shown in fig. 3 are not to be construed as a specific limitation on the electronic device 100, and the electronic device 100 may include more or less components than illustrated, or may combine certain components, or may split certain components, or may have different arrangements of components. Furthermore, the combination/connection relationship between the components in fig. 3 is also adjustable and modifiable.
The vehicle early warning method provided by the embodiment of the application is described below with reference to the accompanying drawings.
Please refer to fig. 4, a flow chart of a vehicle early warning method provided by the embodiment of the application. The method may be implemented by the electronic device shown in fig. 3, or by an electronic device with a functional structure similar to that of the electronic device shown in fig. 3; the embodiments of the present application are not limited herein. It should be understood that in the embodiment of the present application the electronic device is taken as an example of, but does not limit, the execution subject of the method: the execution subject may also be a chip, a chip system, or a processor in the electronic device, or a larger device containing the electronic device, etc. In addition, for convenience of explanation, the embodiment of the present application takes the electronic device being an in-vehicle terminal device as an example.
S401, the vehicle-mounted terminal equipment acquires information of a blind area of the vehicle, and determines an obstacle located in the blind area according to the information of the blind area.
In some embodiments, the vehicle-mounted terminal device of the vehicle may acquire information of a blind area of the vehicle, and determine an obstacle located in the blind area, such as a pedestrian, other vehicle, or the like, according to the information of the blind area. The blind area of the vehicle may include a detection blind area of a vehicle-mounted sensor such as a vehicle-mounted camera and a vehicle-mounted radar of the vehicle and/or a visual field blind area of a driver.
For example, the in-vehicle terminal device may acquire information of the blind area of the vehicle through existing devices on the road the vehicle travels on, such as road side V2X devices or road side convex mirrors. For a road on which V2X devices are installed, the in-vehicle terminal device may receive the information of the blind area of the vehicle from a road side V2X device (e.g., an RSU or the OBU of another vehicle) through the in-vehicle V2X device (e.g., the OBU of the vehicle). Road side sensors such as cameras and radars may detect information of pedestrians, other vehicles, and the like around the vehicle and send the detected information to the RSU, so that the RSU can send it to the OBU of the vehicle through the PC5 communication protocol or other protocols; the OBUs of other vehicles may likewise send their own information to the OBU of the vehicle through the PC5 communication protocol or other protocols.
For a road on which a convex mirror has been installed, the in-vehicle terminal device may capture the information of the blind area of the vehicle shown by the road side convex mirror through the on-board camera. Since the brightness around the vehicle may affect the accuracy of the blind area information obtained by photographing the road side convex mirror, the in-vehicle terminal device may, upon detecting a user operation instructing the vehicle to start vehicle early warning, or upon determining that there is a curve around the vehicle, first determine whether the brightness around the vehicle is greater than or equal to a first brightness, and capture the blind area information shown by the road side convex mirror through the on-board camera only if it is.
S402, the vehicle-mounted terminal device determines a display area of the obstacle on the vehicle-mounted HUD according to a first relative position of the obstacle and the running direction of the vehicle.
In some embodiments, after determining the obstacle located in the blind area according to the information of the blind area, the in-vehicle terminal device may determine the display area of the obstacle on the in-vehicle HUD of the vehicle according to the first relative position of the obstacle and the traveling direction of the vehicle. The in-vehicle HUD may be a conventional HUD or an AR-HUD, which the application does not specifically limit. For example, if the obstacle is located on the left side of the traveling direction of the vehicle, the display area of the obstacle on the in-vehicle HUD is determined to be the left area of the in-vehicle HUD; if the obstacle is located on the right side of the traveling direction of the vehicle, the display area is determined to be the right area of the in-vehicle HUD; and if the obstacle is located ahead in the traveling direction of the vehicle, the display area is determined to be the middle area of the in-vehicle HUD.
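The left/middle/right mapping described above can be sketched roughly as follows; this is an illustrative assumption, and in particular the ±15° window used to decide "ahead" is an invented value, not one specified by the embodiment.

```python
from enum import Enum

class DisplayArea(Enum):
    LEFT = "left area"
    MIDDLE = "middle area"
    RIGHT = "right area"

def hud_display_area(bearing_deg: float) -> DisplayArea:
    """Map the obstacle's bearing relative to the traveling direction
    (0 = straight ahead, negative = left, positive = right) to a HUD
    display area. The +/-15 degree "ahead" window is assumed."""
    if bearing_deg < -15:
        return DisplayArea.LEFT
    if bearing_deg > 15:
        return DisplayArea.RIGHT
    return DisplayArea.MIDDLE
```

An obstacle detected well to the left of the traveling direction would thus be drawn in the left area of the HUD, matching the first-relative-position rule above.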
S403, if the distance between the obstacle or the curve existing around the vehicle and the vehicle is smaller than or equal to the first distance, the vehicle-mounted terminal device displays the image of the obstacle in a gradual change form in the display area.
In some embodiments, after determining the display area of the obstacle on the in-vehicle HUD, the in-vehicle terminal device may determine the distance between the vehicle and the obstacle or a curve existing around the vehicle. For example, the distance is calculated from the position of the vehicle, which may be determined by the on-board GPS or the on-board V2X device, and the position of the obstacle or the curve existing around the vehicle, which may be determined by the road side V2X device from information detected by sensors such as road side cameras and road side radars. The distance may be a time distance, e.g., if it is calculated from the current traveling speed of the vehicle that the vehicle will take 10 s to reach the intersection, the distance between the vehicle and the intersection is 10 s; or it may be a spatial distance, which the application does not specifically limit.
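A time distance of the kind described above can be sketched as a division of spatial distance by the current speed; this simplification assumes constant speed, and the embodiment does not prescribe a particular formula.

```python
def time_distance_s(spatial_distance_m: float, speed_mps: float) -> float:
    """Time distance: how long the vehicle needs to cover the given
    spatial distance at its current speed. Assumes constant speed and
    returns infinity when the vehicle is stationary."""
    if speed_mps <= 0:
        return float("inf")
    return spatial_distance_m / speed_mps

# A vehicle 100 m from an intersection traveling at 10 m/s has a
# time distance of 10 s to that intersection, as in the example above.
```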
The in-vehicle terminal device can perform vehicle early warning in various modes according to the distance between the vehicle and the obstacle or the curve existing around the vehicle; these modes are described by way of example below.
In the first mode, if the distance between the vehicle and the obstacle or a curve existing around the vehicle is smaller than or equal to the first distance, the in-vehicle terminal device may display the image of the obstacle in the display area in a gradual change form: the smaller the distance, the larger the contrast between the parameter values of the image of the obstacle and the parameter values of the initial image displayed on the in-vehicle HUD. For example, when the color of the initial image displayed on the in-vehicle HUD is light (e.g., white), the darker the color of the image of the obstacle (e.g., black), the larger its contrast with the initial image; likewise, the larger the brightness of the image of the obstacle, or the larger its size, the larger its contrast with the initial image displayed on the in-vehicle HUD.
For example, the in-vehicle terminal device may display the color, brightness, and size of the image of the obstacle in the display area in a gradual change form. The formula mapping the distance to the change ratio of the parameter values of the image of the obstacle may be A + B/(C + d), where A, B, C are constants, d is the distance between the vehicle and the obstacle or the curve existing around the vehicle, A < 1, and A + B/C > 1. When d is large, the value obtained from the change ratio formula is smaller than 1, and the color, brightness, and size of the image of the obstacle are weaker than those of the initial image displayed on the in-vehicle HUD; as d gradually decreases, the value obtained from the formula gradually increases, and the color, brightness, and size of the image of the obstacle gradually become stronger than those of the initial image. Suppose that at the initial time (d large) the brightness and size of the image of the obstacle are 60% of their values in normal display, and at the final time (d small) they are 150%; these bounds are adjusted according to the projection distance of the in-vehicle HUD and with ergonomic design in mind. The in-vehicle terminal device may then divide this range into 10 groups at intervals of 10%, and display the image of the obstacle using the group nearest to the value obtained from the change ratio formula, e.g., a value of 0.73 is displayed using the 70% group.
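The change ratio formula and the 10-group quantization can be sketched as follows. The constants A = 0.5, B = 5, C = 2 are illustrative choices that merely satisfy the stated constraints A < 1 and A + B/C > 1; they are not values given by the embodiment.

```python
def change_ratio(d: float, a: float = 0.5, b: float = 5.0, c: float = 2.0) -> float:
    """Change ratio A + B/(C + d): below 1 when the distance d is large,
    growing above 1 as the obstacle or curve gets closer. The constants
    here are assumed for illustration (A < 1, A + B/C > 1)."""
    return a + b / (c + d)

def quantize_ratio(ratio: float) -> float:
    """Snap the ratio to the nearest of the 10 display levels between
    60% and 150% (60%, 70%, ..., 150%), clamping out-of-range values,
    so e.g. 0.73 is displayed at the 70% level."""
    levels = [0.6 + 0.1 * i for i in range(10)]
    clamped = min(max(ratio, levels[0]), levels[-1])
    return min(levels, key=lambda lv: abs(lv - clamped))
```

With these assumed constants the ratio at d = 0 is 3.0 (clamped to the 150% level for display) and falls below 1 once the obstacle is far away, reproducing the weaker-to-stronger gradual change described above.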
Suppose the initial color of the image of the obstacle is a lighter color with RGB value a; since the color of the initial image displayed on the in-vehicle HUD is also light, let the final color of the image of the obstacle be a darker color with RGB value b, again divided into 10 groups: the RGB value of the first group is a + (b - a)/10 × 1, that of the second group is a + (b - a)/10 × 2, and so on. If the value obtained from the change ratio formula, e.g., 0.8, belongs to the third group, the RGB value of the color of the image of the obstacle is a + (b - a)/10 × 3. For example, if the two colors a and b are RGB(60, 50, 30) and RGB(160, 200, 0) respectively, the RGB values of the third group are: R = 60 + (160 - 60)/10 × 3 = 60 + 30 = 90; G = 50 + (200 - 50)/10 × 3 = 50 + 45 = 95; B = 30 + (0 - 30)/10 × 3 = 30 - 9 = 21.
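The group-wise color interpolation in this example can be reproduced with a small sketch (per-channel linear steps over 10 fixed groups, as in the text; integer floor division is used, which matches the example because every channel difference there is a multiple of 10):

```python
def gradient_color(a: tuple, b: tuple, group: int) -> tuple:
    """RGB value of the given group on the 10-step gradient from the
    initial color a to the final color b: a + (b - a)/10 * group,
    computed per channel."""
    return tuple(ca + (cb - ca) // 10 * group for ca, cb in zip(a, b))

# Third group between RGB(60, 50, 30) and RGB(160, 200, 0):
# gradient_color((60, 50, 30), (160, 200, 0), 3) -> (90, 95, 21)
```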
In addition, before displaying the obstacle in the display area in the gradual-change form, the in-vehicle terminal device may stop displaying the initial image in the display area, for example to hide part of the initial image displayed on the vehicle-mounted HUD and avoid interference or information redundancy. If the display area is the left area of the vehicle-mounted HUD, the device stops displaying the initial image in the left area and retains only the initial image displayed in the right area, so that the driver immediately perceives that the obstacle is located to the left of the vehicle's direction of travel. Alternatively, the in-vehicle terminal device may not stop displaying the initial image before displaying the obstacle in the gradual-change form; in that case the obstacle's image may cover the initial image. Alternatively, before displaying the obstacle in the gradual-change form, the in-vehicle terminal device may stop displaying, on the vehicle-mounted HUD, first information included in the initial image, where the priority of the first information is lower than a first priority. For example, the in-vehicle terminal device may treat the speed and the time as important information (information whose priority is not lower than the first priority) and treat road-line rendering and markers for roadside buildings as non-important information (information whose priority is lower than the first priority); since navigation may already provide voice prompts, excessive marker arrows may also be set as non-important information, and the device stops displaying the non-important information included in the initial image on the vehicle-mounted HUD.
The in-vehicle terminal device may also mark the obstacle's image with a strongly warning sign (such as a red frame), or highlight it with a high-contrast color, so that the driver perceives the obstacle immediately and can judge in advance whether the driving direction needs to be changed or the driving speed reduced, thereby reducing traffic accidents and improving traffic safety.
In the second mode, if the distance between the vehicle and the obstacle or the curve existing around the vehicle is less than or equal to the first distance and greater than a second distance, the in-vehicle terminal device may output information instructing the driver to reduce the running speed of the vehicle. For example, the in-vehicle terminal device may output voice information reminding the driver: "There is an obstacle in the blind area; slow down."
In the third mode, if the distance between the vehicle and the obstacle or the curve existing around the vehicle is less than or equal to the second distance (where the second distance is less than the first distance), the in-vehicle terminal device may determine the probability of the obstacle colliding with the vehicle, for example according to the position of the obstacle, the position of the vehicle, and the running speed of the vehicle. If the probability of collision is greater than or equal to a first probability (i.e., a collision is likely), the in-vehicle terminal device may determine whether the running speed of the vehicle is greater than a first speed. If the running speed is not greater than the first speed, the driver has already taken deceleration measures; if it is greater than the first speed, the driver has not, and the in-vehicle terminal device may actively reduce the running speed of the vehicle and output information instructing the driver to slow down, such as voice information reminding the driver that there is an obstacle in the blind area and to decelerate, or vibration information such as seat-belt tensioning or seat vibration. If the probability of collision is less than or equal to the first probability (i.e., a collision is unlikely), the in-vehicle terminal device may also output information instructing the driver to reduce the running speed of the vehicle.
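The three tiers above (gradual display within the first distance, a voice prompt between the two distances, and a collision-probability check within the second distance) can be sketched as a decision function. The action strings and thresholds are illustrative placeholders, not part of the embodiment:

```python
def warn_action(d, first_dist, second_dist, collision_prob, first_prob,
                speed, first_speed):
    """Return the warning action for an obstacle at distance d (second_dist < first_dist)."""
    if d > first_dist:
        return "none"
    if d > second_dist:
        # Tier 2: within the first distance but outside the second.
        return "voice: obstacle in blind area, slow down"
    # Tier 3: within the second distance -> check collision probability.
    if collision_prob >= first_prob:
        if speed > first_speed:
            # Driver has not decelerated: intervene actively.
            return "auto-decelerate + voice/vibration alert"
        return "monitor"  # driver already decelerating
    return "voice: obstacle in blind area, slow down"  # collision unlikely
```

For example, an obstacle 5 m away with a high collision probability while the vehicle is still fast triggers active deceleration, while the same obstacle after the driver has slowed only gets monitored.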
If the distance between the vehicle and the obstacle or the curve around the vehicle is less than or equal to the second distance, the in-vehicle terminal device may also determine whether a no-whistling sign exists around the vehicle. If no such sign exists, the device may control the vehicle to whistle, so that obstacles located in the blind area (such as pedestrians or other vehicles) are alerted by the active whistle. In this way the in-vehicle terminal device gives not only an internal early-warning prompt to the driver but also an external early-warning prompt to pedestrians or other vehicles, reducing traffic accidents and improving traffic safety.
If the distance between the vehicle and the obstacle or the curve around the vehicle is less than or equal to the second distance, the in-vehicle terminal device may also determine the projection angle of the vehicle's headlights and control them accordingly: it may project light toward the obstacle at a projection angle (such as a first projection angle) determined by the first relative position and the distance between the obstacle and the vehicle, or project light toward the curve at a projection angle (such as a second projection angle) determined by the second relative position and the distance between the curve and the vehicle. Obstacles in the blind area (such as pedestrians or other vehicles) can thus also be warned by the lights, providing both an internal warning to the driver and an external warning to pedestrians or other vehicles, thereby reducing traffic accidents and improving traffic safety.
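One simple way to derive a projection angle from a relative position and a distance is a planar `atan2` over the lateral offset. This is an illustrative geometry sketch, not the calculation fixed by the embodiment:

```python
import math

def projection_angle(lateral_offset, forward_distance):
    """Horizontal headlight angle in degrees toward a point at the given
    lateral offset (negative = left of travel direction) and forward distance."""
    return math.degrees(math.atan2(lateral_offset, forward_distance))

# An obstacle 3 m to the left at 10 m ahead: steer the beam about 16.7 degrees left.
print(projection_angle(-3.0, 10.0))
```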
Fig. 5 is a schematic diagram of a vehicle early-warning scene provided by an embodiment of the present application. As shown in Fig. 5, when the distance between a pedestrian located in the blind area and the vehicle is less than or equal to the first distance, the in-vehicle terminal device may display an image of the pedestrian in the display area of the vehicle-mounted HUD in the gradual-change form: at the initial time, parameter values such as the color, brightness, and size of the pedestrian's image are weaker than those of the initial image displayed on the vehicle-mounted HUD, and as the distance between the pedestrian and the vehicle decreases, those parameter values gradually increase. Meanwhile, when the distance between the pedestrian and the vehicle is greater than the second distance, the in-vehicle terminal device may output voice information reminding the user that there is a pedestrian in the blind area and to slow down. When the distance between the pedestrian and the vehicle is less than or equal to the second distance, the in-vehicle terminal device may determine the probability of collision between the pedestrian and the vehicle according to the position of the pedestrian, the position of the vehicle, and the running speed of the vehicle. If the probability is low, the device may refrain from reminding the driver or may output voice information reminding the user that there is a pedestrian in the blind area and to slow down; if the probability is high and the driver has not taken deceleration measures, the device may actively reduce the running speed of the vehicle and output voice information reminding the user that there is a pedestrian in the blind area and to slow down.
For example, Fig. 6 is a schematic diagram of another vehicle early-warning scene provided by an embodiment of the present application. As shown in Fig. 6, a convex mirror is usually disposed at a curve, and through the roadside convex mirror the driver can observe the traffic conditions in the blind zone of the curve, for example whether there is an oncoming vehicle or a pedestrian that needs to be avoided in advance. However, to observe the blind zone the driver's line of sight must deviate from the vehicle's direction of travel, which increases the potential safety hazard. Therefore, the in-vehicle terminal device may photograph the environment around the vehicle through the on-board camera and determine whether a curve exists around the vehicle from lane lines, road-edge contours, and so on, or may determine whether a curve exists through the on-board GPS. On determining that a curve exists around the vehicle, the in-vehicle terminal device may determine whether the brightness around the vehicle is greater than or equal to a first brightness.
If the brightness around the vehicle is less than the first brightness, the device outputs voice information reminding the user to watch for the curve and slow down, and determines whether the vehicle has whistled within a preset time. If it has not, the device may control the vehicle to whistle, or control the headlights to project light toward the curve at a projection angle determined by the second relative position of the curve with respect to the vehicle's direction of travel and the distance between the curve and the vehicle, so as to alert pedestrians or other vehicles located in the blind zone of the curve.
If the brightness around the vehicle is greater than or equal to the first brightness, the on-board camera photographs the blind-zone information of the vehicle shown by the roadside convex mirror. The device determines obstacles located in the blind zone, such as pedestrians or other vehicles, from that information, and determines the display area of each obstacle on the vehicle-mounted HUD according to the first relative position of the obstacle with respect to the vehicle's direction of travel. If the distance between the obstacle or the curve and the vehicle is less than or equal to the first distance, the device displays the image of the obstacle in the display area in the gradual-change form, and may also control the vehicle to actively reduce its running speed, output voice information reminding the user to watch for the curve and slow down, control the vehicle to whistle, or control the headlights to project light toward the curve at a projection angle determined by the second relative position of the curve with respect to the vehicle's direction of travel and the distance between the curve and the vehicle, so as to alert pedestrians or other vehicles located in the blind zone of the curve.
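The curve flow described in the two paragraphs above (a brightness check, then either a voice/whistle/headlight fallback or convex-mirror capture) can be sketched as a decision function; the action strings are illustrative labels only:

```python
def curve_blind_zone_flow(brightness, first_brightness, whistled_recently):
    """Return the list of actions taken once a curve is detected around the vehicle."""
    actions = []
    if brightness < first_brightness:
        # Too dark to read the convex mirror: warn by voice, then horn or headlights.
        actions.append("voice: curve ahead, slow down")
        if not whistled_recently:
            actions.append("whistle or project headlights toward the curve")
    else:
        # Bright enough: read the roadside convex mirror for blind-zone obstacles.
        actions.append("capture convex mirror with on-board camera")
        actions.append("detect obstacles and display them on the HUD in gradual form")
    return actions
```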
Based on the above embodiments and the same concept, an embodiment of the present application further provides an electronic device, where the electronic device is configured to implement the method executed by the electronic device in the embodiments provided above.
As shown in Fig. 7, an electronic device 700 may include: a memory 701, one or more processors 702, and one or more computer programs (not shown). These components may be coupled by one or more communication buses 703. Optionally, when the electronic device 700 is used to implement the method performed by the electronic device provided by the embodiments of the present application, the electronic device 700 may further include a display 704.
The memory 701 stores one or more computer programs (code) comprising computer instructions; the one or more processors 702 invoke the computer instructions stored in the memory 701 to cause the electronic device 700 to perform the vehicle early warning method provided by the embodiments of the present application. The display 704 is used to display images, videos, application interfaces, and other related user interfaces.
In particular implementations, memory 701 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 701 may store an operating system (hereinafter referred to as a system), such as ANDROID, IOS, WINDOWS, or an embedded operating system, such as LINUX. The memory 701 may be used to store an implementation program of the embodiment of the present application. The memory 701 may also store network communication programs that may be used to communicate with one or more additional devices, one or more user devices, and one or more network devices. The one or more processors 702 may be a general purpose central processing unit (central processing unit, CPU), microprocessor, application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present application.
It should be noted that fig. 7 is merely an implementation of the electronic device 700 according to an embodiment of the present application, and in practical application, the electronic device 700 may further include more or fewer components, which is not limited herein.
Based on the above embodiments and the same concept, an embodiment of the present application also provides a computer-readable storage medium storing a computer program which, when run on a computer, causes the computer to perform the method performed by the electronic device among the methods provided in the above embodiments.
Based on the above embodiments and the same concept, an embodiment of the present application also provides a computer program product comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method performed by the electronic device among the methods provided in the above embodiments.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (15)

1. A vehicle warning method, comprising:
Acquiring information of a blind area of a vehicle, and determining an obstacle positioned in the blind area according to the information of the blind area;
determining a display area of the obstacle on a vehicle head-up display (HUD) according to a first relative position of the obstacle and a running direction of the vehicle;
and displaying an image of the obstacle in a gradation form in the display area if a distance between the obstacle, or a curve existing around the vehicle, and the vehicle is less than or equal to a first distance, wherein in the gradation form, the smaller the distance, the greater the contrast between a parameter value of the image of the obstacle and a parameter value of an initial image displayed on the vehicle-mounted HUD.
2. The method of claim 1, wherein the method further comprises:
determining a probability of collision of the obstacle with the vehicle if the distance is less than or equal to a second distance, wherein the second distance is less than the first distance;
if the probability is greater than or equal to the first probability, the travel speed of the vehicle is reduced.
3. The method of claim 2, wherein reducing the travel speed of the vehicle if the probability is greater than or equal to a first probability comprises:
And if the probability is greater than or equal to the first probability and the running speed of the vehicle is greater than the first speed, reducing the running speed of the vehicle.
4. A method according to claim 2 or 3, wherein the method further comprises:
If the probability is less than or equal to the first probability, information indicating to reduce the running speed of the vehicle is output.
5. The method of any one of claims 1-4, wherein the method further comprises:
Determining whether a sign for prohibiting whistling exists around the vehicle if the distance is less than or equal to a second distance, wherein the second distance is less than the first distance;
and if the sign does not exist around the vehicle, controlling the vehicle to whistle.
6. The method of any one of claims 1-5, wherein the method further comprises:
determining a projection angle of a headlight of the vehicle according to the first relative position and the distance if the distance is smaller than or equal to a second distance, wherein the second distance is smaller than the first distance;
and controlling the headlight to project light to the obstacle according to the projection angle.
7. The method of any one of claims 1-6, wherein the method further comprises:
If the distance is smaller than or equal to a second distance, determining a projection angle of a headlight of the vehicle according to a second relative position and the distance, wherein the second distance is smaller than the first distance, and the second relative position is a relative position of the curve and a running direction of the vehicle;
And controlling the headlight to project light to the curve according to the projection angle.
8. The method of any one of claims 1-7, wherein the method further comprises:
And outputting information indicating to reduce the running speed of the vehicle if the distance is less than or equal to the first distance and greater than a second distance, wherein the second distance is less than the first distance.
9. The method of any one of claims 1-8, wherein obtaining information of a blind zone of the vehicle comprises:
receiving information of the blind area from a roadside vehicle-to-everything (V2X) device; or alternatively
And shooting the information of the blind area displayed by the road side convex mirror through the vehicle-mounted camera.
10. The method of claim 9, wherein capturing information of the blind zone displayed by the roadside convex mirror via the onboard camera comprises:
detecting a user operation or determining that the curve exists around the vehicle, wherein the user operation is used for indicating the vehicle to start vehicle early warning;
And if the brightness around the vehicle is greater than or equal to the first brightness, shooting the information of the blind area displayed by the road side convex mirror through the vehicle-mounted camera.
11. The method of any of claims 1-10, wherein determining a display area of the obstacle on the vehicle-mounted HUD based on a first relative position of the obstacle and a direction of travel of the vehicle comprises:
If the obstacle is positioned at the left side of the running direction of the vehicle, determining the display area as a left side area of the vehicle-mounted HUD; or alternatively
If the obstacle is positioned on the right side of the running direction of the vehicle, determining the display area as a right side area of the vehicle-mounted HUD; or alternatively
And if the obstacle is positioned in front of the running direction of the vehicle, determining the display area as the middle area of the vehicle-mounted HUD.
12. The method of any of claims 1-10, wherein prior to displaying the obstacle in the display area in a gradual manner, the method further comprises:
Stopping displaying the initial image on the display area; or alternatively
and stopping displaying first information included in the initial image on the vehicle-mounted HUD, wherein the priority of the first information is lower than a first priority.
13. An electronic device, the electronic device comprising:
A processor, a memory, and one or more programs;
Wherein the one or more programs are stored in the memory, the one or more programs comprising instructions, which when executed by the processor, cause the electronic device to perform the method of any of claims 1-12.
14. A computer readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1-12.
15. A computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method of any of the preceding claims 1-12.
CN202211418783.6A 2022-11-14 2022-11-14 Vehicle early warning method and electronic equipment Pending CN118025211A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211418783.6A CN118025211A (en) 2022-11-14 2022-11-14 Vehicle early warning method and electronic equipment


Publications (1)

Publication Number Publication Date
CN118025211A true CN118025211A (en) 2024-05-14

Family

ID=91002914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211418783.6A Pending CN118025211A (en) 2022-11-14 2022-11-14 Vehicle early warning method and electronic equipment

Country Status (1)

Country Link
CN (1) CN118025211A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118486172A (en) * 2024-07-15 2024-08-13 济南城市建设集团有限公司 Method, equipment and medium for monitoring and early warning of vehicle behavior in tunnel

Similar Documents

Publication Publication Date Title
US11562550B1 (en) Vehicle and mobile device interface for vehicle occupant assistance
US10957029B2 (en) Image processing device and image processing method
US20230106673A1 (en) Vehicle and mobile device interface for vehicle occupant assistance
US20230048230A1 (en) Method for displaying lane information and apparatus for executing the method
US10719990B2 (en) Electronic apparatus, control method thereof, computer program, and computer-readable recording medium
US20200086789A1 (en) Mixed reality left turn assistance to promote traffic efficiency and enhanced safety
US10704957B2 (en) Imaging device and imaging method
US11272115B2 (en) Control apparatus for controlling multiple camera, and associated control method
CN110091798B (en) Electronic device, control method of electronic device, and computer-readable storage medium
KR102233391B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
JP2016012277A (en) Vehicle communication device and communication system
JP7235906B2 (en) Solid-state imaging device
KR20160065724A (en) Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium
US20240045204A1 (en) Augmenting vehicle indicator lights with arhud for color vision impaired
US11443520B2 (en) Image processing apparatus, image processing method, and image processing system
CN118025211A (en) Vehicle early warning method and electronic equipment
KR102406490B1 (en) Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium
US11671700B2 (en) Operation control device, imaging device, and operation control method
US20240029559A1 (en) Augmented reality display for traffic signal awareness
WO2021164387A1 (en) Early warning method and apparatus for target object, and electronic device
KR101781689B1 (en) Vitual image generating apparatus, head mounted display and vehicle
KR20150049544A (en) Black Box
KR102371620B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
CN117742848A (en) Method and electronic device for navigation
CN117351757A (en) Signal lamp early warning method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination