
CN109115206A - Air navigation aid and navigation system - Google Patents


Info

Publication number
CN109115206A
CN109115206A (application CN201811084287.5A)
Authority
CN
China
Prior art keywords
navigation
plan
environmental condition
current
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811084287.5A
Other languages
Chinese (zh)
Inventor
许奔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201811084287.5A priority Critical patent/CN109115206A/en
Publication of CN109115206A publication Critical patent/CN109115206A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The present disclosure provides a navigation method, including: determining a navigation plan; acquiring the geographic position where a target object is currently located; determining the position of the target object in an electronic map based on the acquired geographic position; acquiring environmental condition information of the surrounding environment in which the target object is currently located; and determining a navigation strategy of electronic navigation software based on the environmental condition information and the position of the target object in the electronic map, so as to arrange or change the determined navigation plan. The present disclosure also provides a navigation system.

Description

Navigation method and navigation system
Technical Field
The present disclosure relates to a navigation method and a navigation system.
Background
At present, mobile-phone navigation and vehicle navigation share the same problem: the current traffic situation is completely disregarded during navigation. For example, when the light at a turn is clearly red, the navigation still prompts the user to turn right immediately, and a user who does not notice the light can easily run it. Similarly, the navigation issues the same merge and tight-turn prompts in bad conditions as in good weather, neither reserving sufficient preparation time for the driver nor changing its prompts to match the actual situation.
Disclosure of Invention
One aspect of the present disclosure provides a navigation method for safe navigation, including: determining a navigation plan; acquiring the geographic position where a target object is currently located; determining the position of the target object in an electronic map based on the acquired geographic position; acquiring environmental condition information of the surrounding environment in which the target object is currently located; and determining a navigation strategy of electronic navigation software based on the environmental condition information and the position of the target object in the electronic map, so as to arrange or change the determined navigation plan.
Optionally, determining the navigation strategy of the electronic navigation software so as to arrange or change the determined navigation plan includes: when the environmental condition information indicates that the current environmental condition satisfies a predetermined characteristic, executing one or more of the following: keeping the determined navigation plan unchanged and prompting a pause at the current navigation position, or prompting the pause near the current navigation position a predetermined time in advance; keeping the determined navigation plan unchanged and prompting a delay at the current navigation position before navigation continues according to the plan, or issuing that prompt near the current navigation position a predetermined time in advance; modifying the navigation plan at a designated navigation position on the basis of the determined plan, or modifying it and prompting, a predetermined time in advance, that part of the plan has been modified; resetting the subsequent navigation plan from the designated navigation position, or resetting it and prompting, a predetermined time in advance, that part of the plan has been reset; and outputting a voice prompt or a visual prompt for the determined navigation plan.
Optionally, the environmental condition information includes one or more of the following: environmental condition information caused by weather conditions; and condition information of the surrounding road surface, wherein the condition information comprises information related to a stationary object and/or information related to a moving object.
Optionally, the method further includes determining whether the current environmental condition satisfies the predetermined characteristic, which may be done in either of two ways: acquiring an image of the surrounding environment, identifying one or more identification objects in the image, and predicting from the identified objects whether the current environmental condition satisfies the predetermined characteristic; or sensing stationary and/or moving objects in the surrounding environment with a sensing device, and determining whether the current environmental condition satisfies the predetermined characteristic from the relative displacement relationship between the sensed object and the target object.
Optionally, the visual prompt includes one or more of the following: displaying some or all elements of the navigation interface in a changed color; displaying the navigation identifier in the navigation interface in a changed color; changing the display form of the navigation identifier in the navigation interface; and displaying an alarm identifier around the navigation identifier in the navigation interface.
Another aspect of the present disclosure provides a navigation system including: a first determination module for determining a navigation plan; the first acquisition module is used for acquiring the current geographic position of the target object; the second determination module is used for determining the position of the target object in the electronic map based on the acquired geographic position; the second acquisition module is used for acquiring the environmental condition information of the current surrounding environment where the target object is located; and a third determining module, configured to determine a navigation policy of electronic navigation software based on the environmental condition information and a position of the target object in an electronic map, so as to arrange or change the determined navigation plan.
Optionally, the third determining module is further configured to: when the environmental condition information indicates that the current environmental condition satisfies the predetermined characteristic, execute one or more of the following: keep the determined navigation plan unchanged and prompt a pause at the current navigation position, or prompt the pause near the current navigation position a predetermined time in advance; keep the determined navigation plan unchanged and prompt a delay at the current navigation position before navigation continues according to the plan, or issue that prompt near the current navigation position a predetermined time in advance; modify the navigation plan at a designated navigation position on the basis of the determined plan, or modify it and prompt, a predetermined time in advance, that part of the plan has been modified; reset the subsequent navigation plan from the designated navigation position, or reset it and prompt, a predetermined time in advance, that part of the plan has been reset; and output a voice prompt or a visual prompt for the determined navigation plan.
Optionally, the environmental condition information includes one or more of the following: environmental condition information caused by weather conditions; and condition information of the surrounding road surface, wherein the condition information comprises information related to a stationary object and/or information related to a moving object.
Optionally, the system further includes a determining module configured to determine whether the current environmental condition satisfies the predetermined characteristic, the determining module including either: an acquisition unit for acquiring an image of the surrounding environment, an identification unit for identifying one or more identification objects in the image, and a prediction unit for predicting, based on the identified objects, whether the current environmental condition satisfies the predetermined characteristic; or a sensing unit for sensing stationary and/or moving objects in the surrounding environment with a sensing device, and a determination unit for determining whether the current environmental condition satisfies the predetermined characteristic according to the relative displacement relationship between the sensed object and the target object.
Optionally, the visual prompt includes one or more of the following: displaying some or all elements of the navigation interface in a changed color; changing the display form of the navigation identifier in the navigation interface; and displaying an alarm identifier around the navigation identifier in the navigation interface.
Another aspect of the present disclosure provides a computer system comprising: one or more processors; memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIG. 1 schematically shows an application scenario of a navigation method and a navigation system according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow chart of a navigation method according to an embodiment of the present disclosure;
FIG. 3A schematically illustrates a schematic diagram of pausing navigation according to an embodiment of the present disclosure;
FIG. 3B schematically shows a schematic diagram of deferred navigation according to an embodiment of the present disclosure;
FIG. 3C schematically illustrates a schematic diagram of modifying a navigation plan according to an embodiment of the present disclosure;
FIG. 3D schematically illustrates a diagram of resetting a navigation plan, according to an embodiment of the present disclosure;
FIG. 4A schematically illustrates a flow chart of determining whether a current environmental condition satisfies a predetermined characteristic in accordance with an embodiment of the present disclosure;
FIG. 4B schematically illustrates a flow chart of determining whether a current environmental condition satisfies a predetermined characteristic according to another embodiment of the present disclosure;
FIG. 5 schematically shows a block diagram of a navigation system according to an embodiment of the present disclosure;
FIG. 6A schematically illustrates a block diagram of a determination module according to an embodiment of the disclosure;
FIG. 6B schematically illustrates a block diagram of a determination module according to another embodiment of the present disclosure; and
FIG. 7 schematically illustrates a block diagram of a computer system suitable for implementing a navigation method and a navigation system according to an embodiment of the disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" should be understood to include the possibility of "A", "B", or "A and B".
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
Embodiments of the present disclosure provide a navigation method for implementing safe navigation and a navigation system capable of applying the method. The method includes determining a navigation plan; acquiring the current geographic position of a target object; determining the position of the target object in the electronic map based on the acquired geographic position; acquiring environmental condition information of the current surrounding environment of a target object; and determining a navigation strategy of the electronic navigation software based on the environmental condition information and the position of the target object in the electronic map to arrange or change the determined navigation plan.
Fig. 1 schematically shows an application scenario of a navigation method and a navigation system according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be used in other environments or scenarios.
As shown in FIG. 1, the present disclosure may be applied to any scenario that uses electronic navigation software, such as in-vehicle navigation or live-action navigation. Specifically, a navigation plan can be preset by means of an electronic map; while proceeding according to that plan, the real-time condition of the surrounding environment is monitored, and a corresponding navigation strategy is applied to arrange or change the existing plan. This avoids mechanical navigation that ignores the live situation, achieving safe and flexible navigation.
Fig. 2 schematically shows a flow chart of a navigation method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S210 to S250.
In operation S210, a navigation plan is determined.
Specifically, an origin and a destination may be entered, and the electronic navigation software provides one or more candidate navigation plans for the user to choose from.
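Operation S210 can be sketched as a thin interface over a routing backend. Everything here (the function name, the waypoint representation, the candidate routes) is an illustrative assumption, not part of the disclosure:

```python
from typing import List

def determine_navigation_plans(origin: str, destination: str) -> List[List[str]]:
    """S210 sketch: return candidate navigation plans as waypoint lists,
    from which the user picks one. A real implementation would query a
    routing engine over the electronic map; this stub only shows the shape."""
    return [
        [origin, "Main St", "3rd Ave", destination],  # assumed candidate 1
        [origin, "Ring Rd", destination],             # assumed candidate 2
    ]

plans = determine_navigation_plans("Home", "Office")
print(len(plans))  # 2
```

Each returned plan starts at the origin and ends at the destination; the selected plan becomes the "determined navigation plan" that later operations arrange or change.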
In operation S220, a geographic location where the target object is currently located is acquired.
The target object depends on the application scenario: in vehicle navigation the target object is the vehicle, and in live-action navigation the target object is the user.
In operation S230, a location of the target object in the electronic map is determined based on the acquired geographic location.
That is, the current geographic position of the target object is mapped into a pre-established electronic map.
In operation S240, environmental condition information of the surrounding environment in which the target object is currently located is acquired.
Specifically, the environmental conditions may include, but are not limited to, conditions caused by construction, weather, traffic, and daily life.
For example: debris and roadblocks left by construction; water accumulation, landslides, and accident scenes caused by weather; pedestrian flows, vehicle flows, and accident scenes caused by traffic congestion; and litter from daily life.
In operation S250, a navigation strategy of the electronic navigation software is determined based on the environmental condition information and the position of the target object in the electronic map to arrange or change the determined navigation plan.
Specifically, on the basis of navigating by the target object's position as mapped in the electronic map, a more flexible navigation strategy may be determined with reference to the actual environmental conditions around the target object, deciding how navigation should proceed next under the determined navigation plan.
For example, in the case of in-vehicle navigation: if the live environment shows a red light ahead, the driver is prompted to pause and wait for the light; if it shows a green light ahead, the driver is prompted to continue through it; if it shows an obstacle ahead, the navigation plan is modified or reset according to the actual situation; and so on.
For another example, in the case of live-action navigation: if the live environment shows a red light ahead, the user is prompted to pause and wait for the light; if it shows a green light, the user is prompted to continue; if it shows a vehicle driving ahead, the user is prompted to wait or slow down; and so on.
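The decision made in operation S250 can be sketched as a small dispatcher from environmental condition information to a strategy of the kind described in the examples above (pause/delay, local modification, full reset, or simply continuing). All names, the condition fields, and the threshold logic are assumptions for illustration, not the patented implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class Action(Enum):
    CONTINUE = auto()  # proceed according to the determined plan
    DELAY = auto()     # keep the plan, wait (e.g. red light), then resume
    MODIFY = auto()    # adjust the plan locally at a designated position
    RESET = auto()     # recompute the remaining route from here on

@dataclass
class EnvironmentState:
    signal: Optional[str] = None    # "red", "green", or None if no light ahead
    blocked: bool = False           # obstruction ahead on the planned route
    minor_obstruction: bool = True  # True if a local detour can avoid it

def decide_strategy(env: EnvironmentState) -> Action:
    """S250 sketch: map environmental condition info to a navigation strategy."""
    if env.signal == "red":
        return Action.DELAY
    if env.blocked:
        return Action.MODIFY if env.minor_obstruction else Action.RESET
    return Action.CONTINUE

print(decide_strategy(EnvironmentState(signal="red")).name)  # DELAY
```

The same dispatcher works for both in-vehicle and live-action navigation; only the prompts attached to each `Action` would differ.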
In contrast to the prior art, which completely ignores the current environmental situation during navigation and therefore makes it easy to run red lights and cause traffic accidents, the embodiments of the present disclosure take the current environmental situation into account during navigation, avoiding red-light violations and accidents and making navigation safer and more flexible.
The method shown in fig. 2 is further described with reference to fig. 3A-3D, and fig. 4A-4B, in conjunction with specific embodiments.
As an alternative embodiment, determining the navigation strategy of the electronic navigation software so as to arrange or change the determined navigation plan includes: when the environmental condition information indicates that the current environmental condition satisfies the predetermined characteristic, executing one or more of the following A to E.
A: the determined navigation plan is kept unchanged and a pause is prompted at the current navigation position or a pause is prompted in the vicinity of the current navigation position a predetermined time ahead.
For example, as shown in FIG. 3A, while proceeding according to the determined navigation plan, the camera may detect that the road ahead is impassable because of rain or snow (a heavily snow-covered road, deep standing water, a landslide, a debris flow, and the like). Continuing according to the predetermined plan would likely take the target object into a dangerous area, or even cause a safety accident. For such a temporary blockage that will eventually be cleared, the determined navigation plan can be kept unchanged while the navigation progress of the software is paused at the current navigation position until the blockage ahead is removed; alternatively, the pause can be applied near the current navigation position a predetermined time in advance, leaving sufficient time to prompt the stop beforehand.
B: and keeping the determined navigation plan unchanged, and prompting to continue to navigate according to the navigation plan after delaying for a period of time at the current navigation position, or prompting to continue to navigate according to the navigation plan after delaying for a period of time near the current navigation position in advance by a preset time.
For example, as shown in FIG. 3B, while proceeding according to the determined navigation plan, the camera may detect a red light or a pedestrian ahead. To avoid running the light or endangering the pedestrian, the determined navigation plan can be kept unchanged while the navigation progress is delayed at the current navigation position until the light turns green or the pedestrian passes, after which navigation continues according to the previously determined plan; alternatively, the delay can be prompted near the current navigation position a predetermined time in advance, leaving sufficient time to warn the driver to stop.
C: on the basis of the determined navigation plan, the navigation plan at the specified navigation position is modified, or the navigation plan at the specified navigation position is modified and a predetermined time ahead is used for prompting that part of the navigation plan is modified.
For example, as shown in FIG. 3C, while proceeding according to the determined navigation plan, the camera may detect construction at a turn ahead where the construction area is small. Continuing exactly according to the previously determined plan would very likely strike a worker or a piece of equipment, so the plan needs to be finely adjusted to clear the turn, for example by replacing the originally planned tight turn with a wider one around the construction zone. Alternatively, while the plan at the specified navigation position is modified, the driver can be informed a predetermined time in advance that part of the navigation plan has been modified, leaving sufficient time to prompt for the exit, turn, merge, and so on.
D: and resetting the subsequent navigation plan from the designated navigation position, or resetting the subsequent navigation plan and prompting that part of the navigation plan is reset in advance by a preset time.
For example, as shown in FIG. 3D, while proceeding according to the determined navigation plan, the camera may detect construction at a turn ahead where the construction area is large. Continuing according to the predetermined plan would very likely strike a worker or a piece of equipment, and merely fine-tuning the plan cannot bypass the construction site. In this case the subsequent navigation plan must be reset starting from the specified navigation position ahead: for example, where the plan originally called for an immediate turn, the route can now be adjusted to go straight ahead instead.
In addition, in cases A to D, a corresponding voice prompt can be added at the same time to attract the driver's attention; furthermore, the voice prompt for a single event can be repeated several times so that it is sure to be noticed.
In addition, in cases A to D, in some navigation scenarios (especially in-vehicle navigation) merely arranging the navigation strategy or changing the determined navigation plan may well go unnoticed by the driver or user. To overcome this drawback, a voice prompt or a visual prompt may also be output at the same time.
E: and outputting voice prompt or visual prompt aiming at the determined navigation plan.
For example, if there is a school ahead, the driver may be prompted to slow down with respect to the determined navigation plan; or, if the road may be slippery in rain or snow, the driver may likewise be prompted to slow down; and so on.
Through the embodiments of the present disclosure, different navigation strategies can be formulated for different environmental conditions, and the predetermined navigation plan can then be arranged or changed on the basis of those strategies. This avoids the safety accidents that purely mechanical navigation may cause, and ultimately achieves flexible and safer navigation.
As an alternative embodiment, the environmental condition information includes, but is not limited to, one or more of the following: environmental condition information caused by weather, such as severe water accumulation on the road after heavy rain, partial collapse of a road section, hillside debris flow, or traffic accidents; and condition information of the surrounding road surface, which may include information about stationary objects and/or moving objects. The stationary-object information may include traffic-light information, roadblock information, and other obstacle information such as road debris; the moving-object information may include information about pedestrians, vehicles, and the like.
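The two categories of environmental condition information can be modeled as a simple data structure. The class and field names and the example values below are assumptions for illustration only:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StationaryObject:
    kind: str  # e.g. "traffic_light", "roadblock", "road_debris"

@dataclass
class MovingObject:
    kind: str         # e.g. "pedestrian", "vehicle"
    speed_mps: float  # speed over the ground, in m/s

@dataclass
class EnvironmentInfo:
    # Category 1: environmental conditions caused by weather.
    weather_events: List[str] = field(default_factory=list)
    # Category 2: condition of the surrounding road surface.
    stationary_objects: List[StationaryObject] = field(default_factory=list)
    moving_objects: List[MovingObject] = field(default_factory=list)

def involves_moving_object(info: EnvironmentInfo) -> bool:
    """True if the road-surface condition info contains moving-object data."""
    return bool(info.moving_objects)

info = EnvironmentInfo(
    weather_events=["deep_standing_water"],
    stationary_objects=[StationaryObject("roadblock")],
    moving_objects=[MovingObject("pedestrian", 1.4)],
)
print(involves_moving_object(info))  # True
```

A structure like this is what the second acquisition module would hand to the third determining module when a navigation strategy is chosen.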
As an optional embodiment, the method further includes determining whether the current environmental condition satisfies the predetermined characteristic, and the operation may include at least two implementations:
mode 1: as shown in fig. 4A, operations S410 to S430 are included:
operation S410 of acquiring an image about a surrounding environment;
operation S420, identifying one or more identified objects in the image; and
operation S430 predicts whether the current environmental condition satisfies a predetermined characteristic according to the identified identifying object.
It should be noted that the identification object may be predetermined, and may include, but is not limited to, a vehicle, a person, a traffic sign (e.g., a signal, a road sign, etc.), and the like.
For example, in the case of in-vehicle navigation, suppose that while the vehicle (the target object) is driving, a child (an identification object) in the surroundings is approaching it. Even if the child is currently ahead of the vehicle, walking on the sidewalk parallel to the driving direction, and so poses no immediate safety risk, the child's motion trajectory can be predicted from the characteristics of children in order to reduce risk, and the prediction can be used as part of the current environmental condition to formulate a corresponding navigation strategy and thereby arrange or change the predetermined navigation plan.
For another example, still taking in-vehicle navigation: if the camera captures a garbage bag in the surroundings flying toward the vehicle head-on while it is driving, then, considering that even a collision with the bag would cause no safety accident, this environmental condition information can be ignored and navigation can continue directly according to the predetermined plan.
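Mode 1 (S410 to S430) can be sketched as a filter over the output of an image object detector. The class lists and the confidence threshold are assumptions; a real system would plug in an actual detector and a trajectory-prediction model:

```python
from typing import List, Tuple

# Identification objects that can make the condition satisfy the
# predetermined characteristic (assumed set).
RISK_CLASSES = {"child", "pedestrian", "vehicle", "traffic_signal"}
# Objects whose collision poses no safety risk, e.g. a flying garbage bag.
IGNORABLE_CLASSES = {"garbage_bag", "leaf"}

def satisfies_characteristic(detections: List[Tuple[str, float]],
                             threshold: float = 0.5) -> bool:
    """S430 sketch: predict from recognized objects whether the current
    environmental condition satisfies the predetermined characteristic.
    `detections` holds (class_label, confidence) pairs from S410-S420."""
    for label, confidence in detections:
        if label in IGNORABLE_CLASSES:
            continue  # ignore: navigation continues per the determined plan
        if label in RISK_CLASSES and confidence >= threshold:
            return True  # triggers arranging/changing the navigation plan
    return False

print(satisfies_characteristic([("child", 0.9), ("garbage_bag", 0.99)]))  # True
```

The garbage-bag example above falls through the `IGNORABLE_CLASSES` branch, matching the behavior described in the text.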
Mode 2: as shown in FIG. 4B, operations S440-S450 are included:
operation S440, sensing stationary and/or moving objects in the surrounding environment by a sensing device; and
operation S450, judging whether the current environmental condition satisfies the predetermined characteristic according to the relative displacement relationship between the sensed object and the target object.
The relative displacement relationship includes, but is not limited to, a relative position relationship and a relative velocity relationship.
It should be noted that a sensor can only identify the relative position or relative displacement relationship between an object in the surrounding environment and the target object; it cannot identify what the object is, let alone predict the object's motion trajectory attribute. Consequently, it can only be determined whether the current environmental condition satisfies the predetermined characteristic, not whether a future environmental condition will still satisfy it. For example, a sensing device cannot distinguish a garbage bag flying ahead from a running child.
Based on this, to improve safety, it is preferable that whenever a sensed object moves fast relative to the target object, the navigation strategy of the electronic navigation software provided by the foregoing embodiments is arranged or changed, even if the object is still far from the target object.
It should be noted that "the sensed object moving fast relative to the target object" includes, but is not limited to, the following situations: (1) the sensed object is ahead of the target object and moves away from it, but the target object moves far faster than the sensed object; (2) the sensed object is ahead of the target object and moves toward it, where the target object moves fast, or the sensed object moves fast, or both move at ordinary speeds but the relative speed is high.
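A minimal sketch of the Mode 2 decision rule, assuming one-dimensional velocities along the driving direction (forward positive) and an illustrative speed threshold not taken from the source:

```python
# Hypothetical sketch of Mode 2 (operations S440-S450): a sensing device only
# yields relative displacement, so the rule is conservative - any sensed object
# whose speed relative to the target object exceeds a threshold triggers a
# strategy adjustment, regardless of how far away the object still is.
RELATIVE_SPEED_THRESHOLD = 8.0  # m/s; illustrative value, not from the source

def should_adjust(target_v, object_v):
    """Object sensed ahead of the target object; velocities are along the
    driving direction (forward positive). The closing speed covers both
    situations in the text: (1) the object recedes but the target is far
    faster; (2) the object approaches head-on."""
    closing_speed = target_v - object_v
    return closing_speed > RELATIVE_SPEED_THRESHOLD
```

For example, a vehicle at 20 m/s behind an object drifting forward at 2 m/s closes at 18 m/s and triggers an adjustment, while nearly matched speeds do not.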
With the embodiments of the present disclosure, different detection means, such as using a camera or a sensing device, may be used to predict or determine whether the current environmental conditions meet predetermined characteristics.
As an alternative embodiment, the visual cue comprises one or more of the following: displaying some or all elements of the navigation interface in a changed color, where the elements may be navigation marks in the navigation interface and may be shown in red or another distinctive color to indicate urgency; changing the display form of the navigation mark in the navigation interface, for example displaying the mark enlarged when the navigation plan is changed; and displaying an alarm identifier around the navigation mark in the navigation interface, for example adding a no-driving sign to the original navigation mark when navigation is paused.
In addition, to enrich the types of prompts, a voice prompt repeated a preset number of times may be added on the basis of the visual prompt. These visual prompts, with or without accompanying voice prompts, may also be issued a predetermined time in advance, so as to leave the user sufficient time to turn, merge, and the like.
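The prompt selection described above might be organized as follows; the event names and cue identifiers are hypothetical, chosen only to mirror the three visual cues listed:

```python
# Hypothetical mapping from a navigation event to the visual cues described
# above: color change, changed display form, and an alarm identifier.
def visual_cues(event):
    cues = []
    if event in {"pause", "delay", "plan_changed"}:
        cues.append("recolor_navigation_mark")   # e.g. show the mark in red
    if event == "plan_changed":
        cues.append("enlarge_navigation_mark")   # changed display form
    if event == "pause":
        cues.append("show_no_driving_sign")      # alarm identifier near mark
    return cues
```

A pause event would thus combine the color change with the alarm identifier, while an unremarkable event yields no cue at all.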
Through the embodiments of the present disclosure, whenever navigation deviates from the predetermined navigation plan, or even when it does not deviate but the driver or user needs to be reminded, the driver or user can be alerted through visual and auditory prompts.
FIG. 5 schematically shows a block diagram of a navigation system according to an embodiment of the present disclosure.
As shown in fig. 5, the navigation system 500 includes a first determining module 510, a first obtaining module 520, a second determining module 530, a second obtaining module 540, and a third determining module 550. The navigation system 500 may perform the methods described above with reference to fig. 2, fig. 3A-3D, and fig. 4A-4B.
A first determining module 510 for determining a navigation plan;
a first obtaining module 520, configured to obtain a current geographic location of a target object;
a second determining module 530, configured to determine a location of the target object in the electronic map based on the obtained geographic location;
a second obtaining module 540, configured to obtain environmental condition information of a surrounding environment where the target object is currently located; and
a third determining module 550, configured to determine a navigation strategy of the electronic navigation software based on the environmental condition information and the position of the target object in the electronic map, so as to arrange or change the determined navigation plan.
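The cooperation of the five modules can be sketched as a single navigation step; the function signatures and data shapes below are assumptions for illustration, not the disclosed interfaces:

```python
# Illustrative composition of the five modules of navigation system 500.
# Each comment names the module whose role the step plays.
def navigate_once(plan, gps, geocoder, env_sensor, strategy_fn):
    # first determining module 510: the navigation plan is given (determined)
    # first obtaining module 520: current geographic location of target object
    geo_location = gps()
    # second determining module 530: position of the target in the electronic map
    map_position = geocoder(geo_location)
    # second obtaining module 540: environmental condition information
    env_info = env_sensor()
    # third determining module 550: navigation strategy -> arrange/change plan
    return strategy_fn(env_info, map_position, plan)
```

With stub callables for the sensors and strategy, the pipeline returns either the unchanged plan or an adjusted one, depending on the environmental condition information.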
Compared with the prior art, in which the current environmental situation is not considered at all during navigation, so that red lights are easily run and traffic accidents easily caused, the embodiments of the present disclosure take the current environmental situation into account during navigation, avoiding running red lights and causing traffic accidents, so that navigation is safer and more flexible.
As an alternative embodiment, the third determining module is further configured to: when the environmental condition information indicates that the current environmental condition meets the predetermined characteristic, perform one or more of the following: keeping the determined navigation plan unchanged, and prompting to pause at the current navigation position, or prompting, a preset time in advance, to pause near the current navigation position; keeping the determined navigation plan unchanged, and prompting to continue navigating according to the navigation plan after delaying for a period of time at the current navigation position, or prompting, a preset time in advance, to delay for a period of time near the current navigation position and then continue navigating according to the navigation plan; modifying the navigation plan at a designated navigation position on the basis of the determined navigation plan, or modifying the navigation plan at the designated navigation position and prompting, a preset time in advance, that part of the navigation plan has been modified; resetting the subsequent navigation plan from the designated navigation position, or resetting the subsequent navigation plan and prompting, a preset time in advance, that part of the navigation plan has been reset; and outputting a voice prompt or a visual prompt for the determined navigation plan.
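One way the third determining module might select among the strategies enumerated above; the severity labels and the mapping itself are illustrative assumptions:

```python
# Hypothetical strategy selection for the third determining module: map an
# environmental assessment to one of the enumerated plan adjustments.
def choose_strategy(satisfies_characteristic, severity):
    """severity: 'blocking' (e.g. red light ahead), 'transient' (condition
    expected to clear soon), 'detour' (a route segment is affected)."""
    if not satisfies_characteristic:
        return "continue_per_plan"
    return {
        "blocking": "keep_plan_pause_at_current_position",
        "transient": "keep_plan_delay_then_continue",
        "detour": "modify_plan_at_designated_position",
    }.get(severity, "reset_subsequent_plan")
```

An unrecognized severity conservatively falls back to resetting the subsequent navigation plan.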
By the embodiment of the disclosure, different navigation strategies can be formulated according to different environmental conditions, and then the navigation plan which is determined in advance is arranged or changed based on the navigation strategies, so that safety accidents possibly caused by simple mechanical navigation are overcome, and finally, the aim of flexible and safer navigation is fulfilled.
As an alternative embodiment, the environmental condition information includes one or more of the following: environmental condition information caused by weather conditions; and condition information of the surrounding road surface, wherein the condition information comprises information related to a stationary object and/or information related to a moving object.
As an alternative embodiment, the navigation system 500 further includes a determining module 560 for determining whether the current environmental condition satisfies the predetermined characteristic. As shown in fig. 6A, the determining module 560 includes: an acquisition unit 561 for acquiring an image about the surrounding environment; a recognition unit 562 for recognizing one or more identified objects in the image; and a prediction unit 563 for predicting whether the current environmental condition satisfies the predetermined characteristic according to the identified object. Alternatively, as shown in fig. 6B, the determining module 560 includes: a sensing unit 564 for sensing stationary and/or moving objects in the surrounding environment with a sensing device; and a judging unit 565 for judging whether the current environmental condition satisfies the predetermined characteristic according to the relative displacement relationship between the sensed object and the target object.
With the embodiments of the present disclosure, different detection means, such as using a camera or a sensing device, may be used to predict or determine whether the current environmental conditions meet predetermined characteristics.
As an alternative embodiment, the visual cue comprises one or more of the following: performing color change display on local elements or all elements in the navigation interface; changing the display form of the navigation mark in the navigation interface; and displaying an alarm identifier around the navigation identifier in the navigation interface.
Any of the modules, units, or at least part of the functionality of any of them according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules and units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, units according to the embodiments of the present disclosure may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by any other reasonable means of hardware or firmware by integrating or packaging the circuits, or in any one of three implementations of software, hardware and firmware, or in any suitable combination of any of them. Alternatively, one or more of the modules, units according to embodiments of the present disclosure may be implemented at least partly as computer program modules, which, when executed, may perform the respective functions.
For example, any number of the first determining module 510, the first obtaining module 520, the second determining module 530, the second obtaining module 540, and the third determining module 550 may be combined and implemented in one module, or any one of the modules may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the first determining module 510, the first obtaining module 520, the second determining module 530, the second obtaining module 540, and the third determining module 550 may be at least partially implemented as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or implemented by any one of three implementations of software, hardware, and firmware, or by a suitable combination of any of them. Alternatively, at least one of the first determining module 510, the first obtaining module 520, the second determining module 530, the second obtaining module 540 and the third determining module 550 may be at least partially implemented as a computer program module, which when executed, may perform a corresponding function.
FIG. 7 schematically illustrates a block diagram of a computer system suitable for implementing a navigation method and a navigation system according to an embodiment of the disclosure. The computer system illustrated in FIG. 7 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 7, computer system 700 includes a processor 710, a computer-readable storage medium 720. The computer system 700 may perform a method according to an embodiment of the disclosure.
In particular, processor 710 may comprise, for example, a general purpose microprocessor, an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 710 may also include on-board memory for caching purposes. Processor 710 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 720, for example, may be a non-volatile computer-readable storage medium, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 720 may include a computer program 721, which computer program 721 may include code/computer-executable instructions that, when executed by the processor 710, cause the processor 710 to perform a method according to an embodiment of the disclosure, or any variation thereof.
The computer program 721 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, code in the computer program 721 may include one or more program modules, for example module 721A, module 721B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, so that when these program modules are executed by the processor 710, the processor 710 may perform the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, the processor 710 may interact with the signal transmitter 730 and the signal receiver 740 to perform a method according to an embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the first determining module 510, the first obtaining module 520, the second determining module 530, the second obtaining module 540, and the third determining module 550 may be implemented as a computer program module described with reference to fig. 7, which, when executed by the processor 710, may implement the respective operations described above.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or collocations of the features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or collocations are not expressly recited in the present disclosure. In particular, various combinations and/or collocations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or collocations fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. A navigation method, comprising:
determining a navigation plan;
acquiring the current geographic position of a target object;
determining the position of the target object in the electronic map based on the acquired geographic position;
acquiring environmental condition information of the current surrounding environment of the target object; and
determining a navigation strategy of electronic navigation software to arrange or alter the determined navigation plan based on the environmental condition information and the position of the target object in the electronic map.
2. The method of claim 1, wherein the determining a navigation strategy of electronic navigation software to arrange or alter the determined navigation plan comprises:
when the environmental condition information indicates that the current environmental condition meets a predetermined characteristic, performing one or more of the following:
keeping the determined navigation plan unchanged, and prompting to pause at the current navigation position, or prompting to pause near the current navigation position in advance by a preset time;
keeping the determined navigation plan unchanged, and prompting to continue to navigate according to the navigation plan after delaying for a period of time at the current navigation position, or prompting to continue to navigate according to the navigation plan after delaying for a period of time near the current navigation position in advance by a preset time;
on the basis of the determined navigation plan, modifying the navigation plan at the specified navigation position, or modifying the navigation plan at the specified navigation position and prompting that part of the navigation plan is modified in advance by a preset time;
resetting the subsequent navigation plan from the designated navigation position, or resetting the subsequent navigation plan and prompting that part of the navigation plan is reset in advance by a preset time;
outputting a voice prompt or a visual prompt for the determined navigation plan.
3. The method of claim 1 or 2, wherein the environmental condition information comprises one or more of:
environmental condition information caused by weather conditions;
condition information of a surrounding road surface, wherein the condition information comprises information about stationary objects and/or information about moving objects.
4. The method of claim 2, wherein the method further comprises determining whether the current environmental condition satisfies the predetermined characteristic, comprising:
acquiring an image about the surrounding environment;
identifying one or more identifying objects in the image; and
predicting whether the current environmental condition satisfies the predetermined characteristic based on the identified object,
or,
sensing stationary objects, and/or moving objects in the surrounding environment with a sensing device; and
judging whether the current environmental condition satisfies the predetermined characteristic according to a relative displacement relationship between the sensed object and the target object.
5. The method of claim 2, wherein the visual cue comprises one or more of:
performing color change display on local elements or all elements in the navigation interface;
changing the display form of the navigation mark in the navigation interface;
and displaying an alarm identifier around the navigation identifier in the navigation interface.
6. A navigation system, comprising:
a first determination module for determining a navigation plan;
the first acquisition module is used for acquiring the current geographic position of the target object;
the second determination module is used for determining the position of the target object in the electronic map based on the acquired geographic position;
the second acquisition module is used for acquiring the environmental condition information of the current surrounding environment where the target object is located; and
a third determination module, configured to determine a navigation policy of the electronic navigation software based on the environmental condition information and the position of the target object in the electronic map, so as to arrange or change the determined navigation plan.
7. The system of claim 6, wherein the third determination module is further to:
when the environmental condition information indicates that the current environmental condition meets a predetermined characteristic, performing one or more of the following:
keeping the determined navigation plan unchanged, and prompting to pause at the current navigation position, or prompting to pause near the current navigation position in advance by a preset time;
keeping the determined navigation plan unchanged, and prompting to continue to navigate according to the navigation plan after delaying for a period of time at the current navigation position, or prompting to continue to navigate according to the navigation plan after delaying for a period of time near the current navigation position in advance by a preset time;
on the basis of the determined navigation plan, modifying the navigation plan at the specified navigation position, or modifying the navigation plan at the specified navigation position and prompting that part of the navigation plan is modified in advance by a preset time;
resetting the subsequent navigation plan from the designated navigation position, or resetting the subsequent navigation plan and prompting that part of the navigation plan is reset in advance by a preset time;
outputting a voice prompt or a visual prompt for the determined navigation plan.
8. The system of claim 6 or 7, wherein the environmental condition information comprises one or more of:
environmental condition information caused by weather conditions;
condition information of a surrounding road surface, wherein the condition information comprises information about stationary objects and/or information about moving objects.
9. The system of claim 7, further comprising a determination module for determining whether the current environmental condition satisfies the predetermined characteristic, the determination module comprising:
an acquisition unit configured to acquire an image about the surrounding environment;
the identification unit is used for identifying one or more identification objects in the image; and
a prediction unit for predicting whether the current environmental condition satisfies the predetermined characteristic based on the identified identifying object,
or,
a sensing unit for sensing stationary objects, and/or moving objects in the surrounding environment by means of a sensing device; and
a judging unit for judging whether the current environmental condition satisfies the predetermined characteristic according to a relative displacement relationship between the sensed object and the target object.
10. The system of claim 7, wherein the visual cue comprises one or more of:
performing color change display on local elements or all elements in the navigation interface;
changing the display form of the navigation mark in the navigation interface;
and displaying an alarm identifier around the navigation identifier in the navigation interface.
CN201811084287.5A 2018-09-17 2018-09-17 Air navigation aid and navigation system Pending CN109115206A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811084287.5A CN109115206A (en) 2018-09-17 2018-09-17 Air navigation aid and navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811084287.5A CN109115206A (en) 2018-09-17 2018-09-17 Air navigation aid and navigation system

Publications (1)

Publication Number Publication Date
CN109115206A true CN109115206A (en) 2019-01-01

Family

ID=64859576

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811084287.5A Pending CN109115206A (en) 2018-09-17 2018-09-17 Air navigation aid and navigation system

Country Status (1)

Country Link
CN (1) CN109115206A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102334151A (en) * 2009-02-27 2012-01-25 丰田自动车株式会社 Movement trajectory generator
CN106205175A (en) * 2015-05-28 2016-12-07 Lg电子株式会社 display device and vehicle for vehicle
CN106289290A (en) * 2016-07-21 2017-01-04 触景无限科技(北京)有限公司 A kind of path guiding system and method
CN107111956A (en) * 2015-01-13 2017-08-29 日产自动车株式会社 Travel controlling system
CN107664502A (en) * 2016-07-28 2018-02-06 奥迪股份公司 The method, apparatus and system of dynamic programming path

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115755400A (en) * 2022-11-21 2023-03-07 江苏泽景汽车电子股份有限公司 Information display method and device, storage medium and electronic equipment
CN115755400B (en) * 2022-11-21 2023-10-27 江苏泽景汽车电子股份有限公司 Information display method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190101