CN104897165A - Shot scenery-based navigation method and system thereof - Google Patents
Shot scenery-based navigation method and system thereof
- Publication number
- CN104897165A (application number CN201410080837.1A)
- Authority
- CN
- China
- Prior art keywords
- scenery
- destination location
- station
- landmark
- direction information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Navigation (AREA)
Abstract
The invention provides a navigation method based on photographed scenery and a system thereof. The method comprises the following steps: presetting a scenery navigation database in a mobile terminal, which stores a landmark scene image set for each indoor environment requiring navigation; when the user inputs a starting position and a destination location for navigation, identifying, according to the starting position, the landmark scene image set of the indoor environment the user currently needs; photographing real-time scene images near the starting position with the mobile terminal, for example the camera of a smartphone, and matching the real-time scene images against the landmark scene images in the database to recognize the corresponding landmark scenery, so as to determine the direction information from the landmark scenery to the destination location and output a navigation direction. No dependence on the GPS system is needed, and the method can provide indoor navigation when the user is in an indoor environment with a poor GPS signal.
Description
Technical field
The present invention relates to the field of navigation technology, and in particular to a navigation method based on photographed scenery and a navigation system based on photographed scenery.
Background art
In recent years, digital electronics has advanced continuously, and people use many multi-functional electronic products and devices that make daily life more convenient. One of them is the smartphone, which is more and more popular because of its diverse functions and styling. One of the functions of a multi-functional smartphone is the Global Positioning System (GPS). The basic principle of GPS is to measure the distance between the user's receiver and satellites at known positions, and then combine the data from multiple satellites to determine the specific position. A smartphone uses GPS signals to let the user run a mobile navigation system through a smartphone application.
A mobile navigation system can conveniently show the user the current position and direction of movement. The position can be clearly presented on an electronic map so that the user can see it in more detail. Many navigation applications are currently on the market; their functions include letting the user find the location of a given address and providing the user with optimal travel routes and modes of travel, such as walking, taking a bus or subway, or driving.
However, current GPS technology is limited to outdoor use. In indoor spaces such as buildings, shopping centers, and subway stations, the GPS signal becomes weak or cannot be received at all. One reason is that indoor spaces contain many sources of electronic interference, which prevent the mobile phone from receiving the GPS signal. This limitation of GPS means that the user cannot effectively find the required exit inside a building. In a subway station especially, when the user gets off the subway, there are many exits, and without a GPS signal the user has to look for landmarks or ask passers-by, wasting considerable time finding the correct exit to reach the destination location.
Summary of the invention
To address the problem that existing navigation technology depends on GPS and cannot navigate when the GPS signal is subject to interference, the present invention proposes a navigation method based on photographed scenery. Its implementation does not rely on GPS technology and can provide the user with indoor navigation, based on data stored in a database, when the GPS signal is subject to interference.
A navigation method based on photographed scenery comprises the following steps:
obtaining a starting position and a destination location;
obtaining, from a preset scenery navigation database and according to the starting position, a landmark scene image set corresponding to the starting position, wherein the landmark scene image set includes the corresponding landmark scene images and the position information of each landmark scenery;
obtaining real-time scene images photographed by a mobile terminal, and matching each real-time scene image against the landmark scene images in the landmark scene image set; if the matching succeeds, obtaining the direction information from the landmark scenery to the destination location according to the position information of the landmark scenery and the destination location;
outputting a navigation direction according to the direction information from the landmark scenery to the destination location.
To address the same problem that existing navigation technology depends on GPS and cannot navigate when the GPS signal is subject to interference, the present invention also proposes a navigation system based on photographed scenery corresponding to the above method.
A navigation system based on photographed scenery comprises:
an input module, configured to obtain a starting position and a destination location;
a search module, configured to obtain, from a preset scenery navigation database and according to the starting position, a landmark scene image set corresponding to the starting position, wherein the landmark scene image set includes the corresponding landmark scene images and the position information of each landmark scenery;
a direction computing module, configured to obtain real-time scene images photographed by a mobile terminal and match each real-time scene image against the landmark scene images in the landmark scene image set, and, if the matching succeeds, to obtain the direction information from the landmark scenery to the destination location according to the position information of the landmark scenery and the destination location;
an output module, configured to output a navigation direction according to the direction information from the landmark scenery to the destination location.
In the navigation method based on photographed scenery and the system thereof of the present invention, the scenery navigation database is preset in the mobile terminal and stores a landmark scene image set for each indoor environment that requires navigation. When the user inputs the starting position and the destination location for navigation, the landmark scene image set of the indoor environment the user currently needs is first identified according to the starting position. The user then photographs real-time scene images near the starting position with the camera of the mobile terminal, for example a smartphone. After the program receives the real-time scene images, it matches them against each landmark scene image, thereby recognizing the corresponding landmark scenery, and, according to the position information of the landmark scenery and the destination location, determines the direction information from the landmark scenery to the destination location and outputs a navigation direction. There is thus no need to depend on the GPS system: when the user is in an indoor environment where the GPS signal is weak, indoor navigation can be achieved by the method of the present invention. The present invention recognizes physical features of the scenery to realize indoor navigation. It can be applied to finding the exit leading to a destination inside a subway station, and can run on a smartphone carried by the user, providing an indoor directional navigation system inside buildings where the GPS signal is weak, for example after getting off a subway train, so that the user can reach the destination smoothly and without wasting time.
Brief description of the drawings
Fig. 1 is a flow chart of the navigation method based on photographed scenery of the present invention;
Fig. 2 is a schematic diagram of the data in the scenery navigation database of the navigation method based on photographed scenery of the present invention;
Fig. 3 is a schematic diagram of real-time scene images photographed by the mobile terminal in the navigation method based on photographed scenery of the present invention;
Fig. 4 is a schematic diagram of the mobile terminal displaying a direction indicator in the navigation method based on photographed scenery of the present invention;
Fig. 5 is a structural diagram of the navigation system based on photographed scenery of the present invention.
Detailed description of the embodiments
Referring to Fig. 1, Fig. 1 is a flow chart of the navigation method based on photographed scenery of the present invention.
The navigation method based on photographed scenery comprises the following steps:
S101, obtaining a starting position and a destination location;
S102, obtaining, from a preset scenery navigation database and according to the starting position, a landmark scene image set corresponding to the starting position, wherein the landmark scene image set includes the corresponding landmark scene images and the position information of each landmark scenery;
S103, obtaining real-time scene images photographed by a mobile terminal, and matching each real-time scene image against the landmark scene images in the landmark scene image set; if the matching succeeds, obtaining the direction information from the landmark scenery to the destination location according to the position information of the landmark scenery and the destination location;
S104, outputting a navigation direction according to the direction information from the landmark scenery to the destination location.
In the navigation method based on photographed scenery of the present invention, the scenery navigation database is preset in the mobile terminal and stores a landmark scene image set for each indoor environment that requires navigation. When the user inputs the starting position and the destination location for navigation, the landmark scene image set of the indoor environment the user currently needs is first identified according to the starting position. The user then photographs real-time scene images near the starting position with the camera of the mobile terminal, for example a smartphone. After the program receives the real-time scene images, it matches them against each landmark scene image, thereby recognizing the corresponding landmark scenery, and, according to the position information of the landmark scenery and the destination location, determines the direction information from the landmark scenery to the destination location and outputs a navigation direction. There is thus no need to depend on the GPS system: when the user is in an indoor environment where the GPS signal is weak, indoor navigation can be achieved by the method of the present invention. The present invention recognizes physical features of the scenery to realize indoor navigation. It can be applied to finding the exit leading to a destination inside a subway station, and can run on a smartphone carried by the user, providing an indoor directional navigation system inside buildings where the GPS signal is weak, for example after getting off a subway train, so that the user can reach the destination smoothly and without wasting time.
In step S101, a starting position and a destination location are obtained. The starting position refers to the place where the user currently needs indoor navigation, for example a certain railway station, subway station, shopping mall, or building. The destination location generally refers to an address near the starting position, for example a hotel near a certain station.
In step S102, the landmark scene image set corresponding to the starting position is obtained from the preset scenery navigation database according to the starting position.
The scenery navigation database is an indoor navigation database stored locally on the mobile terminal. It stores each starting position that may require navigation, the landmark scene image set of that starting position, and the names of buildings or addresses near the starting position, so that the user can select one of them as the destination location.
Taking a station as an example, the database generally stores the station name (the starting position), the landmark scene image set inside the station, and common addresses near the station such as hotels and shopping malls. It may also store the position information of each exit of the station and the direction information from each exit to possible destination locations.
As shown in Fig. 2, the scenery navigation database includes an exit / destination location lookup table for subway stations and the bearings of their exits. The lookup table includes the subway station name; buildings near the subway station; the subway station exit corresponding to each nearby building; and the bearing from each exit to that nearby building. For example, if the user inputs the subway station name Orchard and the destination Four Seasons Hotel, exit B and a bearing of 200° are found from the exit / destination location lookup table.
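For illustration, the exit / destination location lookup table of Fig. 2 could be held as a nested dictionary. This is only a sketch: apart from the Orchard / Four Seasons Hotel entry quoted above, the structure and field names are assumptions rather than the patent's actual data format.

```python
# Hedged sketch of the exit / destination location lookup table of Fig. 2.
# Only the Orchard -> Four Seasons Hotel entry comes from the description;
# the dictionary layout and field names are illustrative assumptions.
EXIT_TABLE = {
    "Orchard": {
        "Four Seasons Hotel": {"exit": "B", "bearing_deg": 200.0},
    },
}

def lookup_exit(station, destination):
    """Return (exit name, bearing in degrees) for a destination, or None if unknown."""
    entry = EXIT_TABLE.get(station, {}).get(destination)
    if entry is None:
        return None
    return entry["exit"], entry["bearing_deg"]

# Example corresponding to the text: Orchard station, Four Seasons Hotel -> ("B", 200.0)
print(lookup_exit("Orchard", "Four Seasons Hotel"))
```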
The starting position and destination location entered by the user must have corresponding data stored in the scenery navigation database before navigation is possible. If the scenery navigation database does not contain the starting position and destination location, a download or update may be needed to obtain the relevant data.
For example, the user can, according to his or her itinerary, first update the scenery navigation database on the mobile terminal while a network connection is available, downloading the landmark scene image set of the station that requires indoor navigation into the local scenery navigation database. That is, before the starting position and destination location are obtained, the method further comprises the following step: according to the entered position information, searching a preset scenery navigation database on a server, and downloading the landmark scene image set corresponding to that position information from the server to the mobile terminal.
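A minimal sketch of this pre-download step is given below; the server URL and the zip-archive response format are assumptions made for illustration, since the patent does not specify the server interface.

```python
import os
import requests

# Hypothetical server endpoint returning the landmark scene image set of a
# station as a zip archive; the URL and response format are assumptions.
SERVER_URL = "https://example.com/scenery-nav/landmarks"

def download_landmark_set(station_name, local_dir="landmark_db"):
    """Fetch the landmark scene image set for one station into the local database."""
    os.makedirs(local_dir, exist_ok=True)
    resp = requests.get(SERVER_URL, params={"station": station_name}, timeout=30)
    resp.raise_for_status()
    archive_path = os.path.join(local_dir, f"{station_name}.zip")
    with open(archive_path, "wb") as f:
        f.write(resp.content)          # keep the image set locally for offline use
    return archive_path
```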
The landmark scene image set includes the landmark scene images of the starting position and the position information of each landmark scenery. The landmark scenery may be any indoor scene of the indoor environment at the starting position. For example, if the starting position is subway station A, the landmark scene images may be images of the escalators or stairs inside subway station A, images of the station name signboards, and images of the entrance or exit gates.
In step S103, the real-time scene images photographed by the mobile terminal are obtained, and each real-time scene image is matched against each landmark scene image in the landmark scene image set.
When navigating, the user photographs the real-time scenery near the starting position with the camera of the mobile terminal; the real-time scene images are obtained and saved in the mobile terminal's memory. After receiving the real-time scene images, the program matches each real-time scene image against each landmark scene image in the landmark scene image set.
As shown in Fig. 3, the real-time scene images photographed by the mobile terminal 200 may likewise be any indoor scene of the indoor environment at the starting position, for example an image 202 of escalators or stairs inside the subway station, an image 204 of a station name signboard, and an image 206 of the entrance or exit gates.
When matching, the features of the real-time scene image and of the landmark scene image are preferably extracted and compared. This comparison may use computer vision techniques, for example extracting Speeded-Up Robust Features (SURF) keypoints and then comparing the real-time scene image with the landmark scene image using the Fast Library for Approximate Nearest Neighbors (FLANN). According to the comparison result, if the matching succeeds, the position information of the landmark scenery is obtained, and optionally also the name of the landmark scenery. From the position information of the landmark scenery and the destination location, the direction information from the landmark scenery to the destination location can then be obtained.
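A minimal sketch of this SURF + FLANN comparison using OpenCV is shown below. It assumes an OpenCV build that includes the non-free contrib modules (SURF is patented and lives in cv2.xfeatures2d); the Hessian threshold, ratio-test constant, and match-count threshold are illustrative values, not parameters taken from the patent.

```python
import cv2

def match_scene(realtime_img_path, landmark_img_path, min_good_matches=10):
    """Return True if the real-time image matches the landmark image well enough."""
    img1 = cv2.imread(realtime_img_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(landmark_img_path, cv2.IMREAD_GRAYSCALE)

    # SURF keypoint detection and description (needs opencv-contrib with non-free enabled).
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(img1, None)
    kp2, des2 = surf.detectAndCompute(img2, None)
    if des1 is None or des2 is None:
        return False

    # FLANN matching with a KD-tree index over the float descriptors.
    index_params = dict(algorithm=1, trees=5)   # 1 = FLANN_INDEX_KDTREE
    search_params = dict(checks=50)
    flann = cv2.FlannBasedMatcher(index_params, search_params)
    matches = flann.knnMatch(des1, des2, k=2)

    # Lowe's ratio test keeps only distinctive correspondences.
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance]
    return len(good) >= min_good_matches
```

In practice the program would run this test against each landmark scene image in the set and accept the landmark with the most good matches.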
For the application scenario of a station, the direction information from the landmark scenery to the destination location may comprise two parts: one part is the direction information from the landmark scenery to the corresponding station exit, and the other part is the direction information from that station exit to the destination location.
The step of obtaining the direction information from the landmark scenery to the destination location comprises:
according to the destination location, obtaining the station exit information corresponding to the destination location and the direction information from that station exit to the destination location;
according to the position information of the landmark scenery and the station exit information, obtaining the direction information from the landmark scenery to the station exit.
The step of obtaining the direction information from the station exit to the destination location may be realized as follows: according to the position information of the station exit, looking up the pre-stored exit / destination location lookup table to obtain the direction information from the station exit to the destination location.
The step of obtaining the direction information from the landmark scenery to the station exit may be realized as follows: according to the position information of the landmark scenery and the position information of the station exit, calculating the direction information from the landmark scenery to the station exit.
Because the route from the landmark scenery to the station exit lies indoors, the navigation path from the landmark scenery to the station exit can be calculated from the position information of the landmark scenery, the position information of the station exit, and the indoor electronic map of the station; the direction information from the landmark scenery to the station exit is then obtained from that navigation path.
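As a minimal illustration of the final direction step, if the landmark and the exit are expressed as planar coordinates on the station's indoor map (an assumption; the patent does not fix a coordinate system), the bearing from the landmark to the exit can be computed with atan2:

```python
import math

def bearing_to_exit(landmark_xy, exit_xy):
    """Bearing in degrees, clockwise from map north, assuming y points north."""
    dx = exit_xy[0] - landmark_xy[0]
    dy = exit_xy[1] - landmark_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

# Example: an exit 10 m due east of the landmark gives a bearing of 90 degrees.
print(bearing_to_exit((0.0, 0.0), (10.0, 0.0)))   # -> 90.0
```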
Alternatively, the direction information from each landmark scenery to each station exit may be stored in the scenery navigation database in advance; the direction information from the landmark scenery to the corresponding station exit is then looked up according to the station exit information obtained in step S102.
In this step, the program can also estimate the bearing of the station exit relative to the user's current position. This calculation uses the accelerometer and gyroscope built into the smartphone to determine the device's orientation.
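The following sketch shows how such a device orientation could be combined with the target bearing to drive the rotating arrow described below. It assumes the platform's sensor fusion (accelerometer plus gyroscope) already yields a device heading in degrees; how that heading is obtained is platform-specific and not detailed in the patent.

```python
def arrow_rotation(target_bearing_deg, device_heading_deg):
    """Angle in degrees (clockwise) by which the on-screen arrow should be rotated."""
    return (target_bearing_deg - device_heading_deg) % 360.0

# Example: exit bearing 200 degrees, device currently facing 180 degrees
# -> the arrow points 20 degrees to the user's right.
print(arrow_rotation(200.0, 180.0))   # -> 20.0
```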
In step S104, the navigation direction is output according to the direction information from the landmark scenery to the destination location.
The navigation direction is usually output through the display screen of the mobile terminal or by voice. If it is output by display, a corresponding direction indicator can be generated according to the direction information from the landmark scenery to the destination location; the direction indicator is then displayed by the mobile terminal.
For example, a rotating arrow on the smartphone screen gives the user the corresponding bearing. As shown in Fig. 4, the smartphone screen image 400 shows the captured entrance or exit gate image 402 and displays an arrow image 404 to give the user the corresponding indoor bearing.
If the navigation direction is output by voice prompt, the direction information can be announced by voice.
The present invention provides an indoor navigation method that recognizes the physical features of the scenery found in a subway station. It can run on a smartphone without needing a GPS signal, allowing the user, after getting off the subway, to conveniently find the exit indoors and reach the destination. The method uses the camera of the smartphone to capture the real scene and then compares it with the database images on the smartphone, performing indoor navigation by providing the user's current location.
The system also includes an auxiliary program that uses the above method to update the user's indoor location in real time. Each image file in the database stores its own position data, and the destination data entered by the user likewise has its own direction data stored with it.
In summary, the present invention provides an innovative and convenient indoor navigation system that allows a smartphone user to reach the destination smoothly and quickly after getting off a subway train, even where the GPS signal is weak.
Referring to Fig. 5, Fig. 5 is a structural diagram of the navigation system based on photographed scenery of the present invention.
The navigation system based on photographed scenery comprises:
an input module 11, configured to obtain a starting position and a destination location;
a search module 12, configured to obtain, from a preset scenery navigation database and according to the starting position, a landmark scene image set corresponding to the starting position, wherein the landmark scene image set includes the corresponding landmark scene images and the position information of each landmark scenery;
a direction computing module 13, configured to obtain real-time scene images photographed by a mobile terminal and match each real-time scene image against the landmark scene images in the landmark scene image set, and, if the matching succeeds, to obtain the direction information from the landmark scenery to the destination location according to the position information of the landmark scenery and the destination location;
an output module 14, configured to output a navigation direction according to the direction information from the landmark scenery to the destination location.
In the navigation system based on photographed scenery of the present invention, the scenery navigation database is preset in the mobile terminal and stores a landmark scene image set for each indoor environment that requires navigation. When the user inputs the starting position and the destination location for navigation, the landmark scene image set of the indoor environment the user currently needs is first identified according to the starting position. The user then photographs real-time scene images near the starting position with the camera of the mobile terminal, for example a smartphone. After the program receives the real-time scene images, it matches them against each landmark scene image, thereby recognizing the corresponding landmark scenery, and, according to the position information of the landmark scenery and the destination location, determines the direction information from the landmark scenery to the destination location and outputs a navigation direction. There is thus no need to depend on the GPS system: when the user is in an indoor environment where the GPS signal is weak, indoor navigation can be achieved by the system of the present invention. The present invention recognizes physical features of the scenery to realize indoor navigation. It can be applied to finding the exit leading to a destination inside a subway station, and can run on a smartphone carried by the user, providing an indoor directional navigation system inside buildings where the GPS signal is weak, for example after getting off a subway train, so that the user can reach the destination smoothly and without wasting time.
The input module 11 obtains the starting position and the destination location. The starting position refers to the place where the user currently needs indoor navigation, for example a certain railway station, subway station, shopping mall, or building. The destination location generally refers to an address near the starting position, for example a hotel near a certain station.
The search module 12 obtains the landmark scene image set corresponding to the starting position from the preset scenery navigation database according to the starting position.
The scenery navigation database is an indoor navigation database stored locally on the mobile terminal. It stores each starting position that may require navigation, the landmark scene image set of that starting position, and the names of buildings or addresses near the starting position, so that the user can select one of them as the destination location.
Taking a station as an example, the database generally stores the station name (the starting position), the landmark scene image set inside the station, and common addresses near the station such as hotels and shopping malls. It may also store the position information of each exit of the station and the direction information from each exit to possible destination locations.
The scenery navigation database includes an exit / destination location lookup table for subway stations and the bearings of their exits. The lookup table includes the subway station name; buildings near the subway station; the subway station exit corresponding to each nearby building; and the bearing from each exit to that nearby building.
The starting position and destination location entered by the user must have corresponding data stored in the scenery navigation database before navigation is possible. If the scenery navigation database does not contain the starting position and destination location, a download or update may be needed to obtain the relevant data.
The user can, according to his or her itinerary, first update the scenery navigation database on the mobile terminal while a network connection is available, downloading the landmark scene image set of the station that requires indoor navigation into the local scenery navigation database. The navigation system based on photographed scenery therefore further comprises a download module, configured to search a preset scenery navigation database on a server according to the entered position information and to download the landmark scene image set corresponding to that position information from the server to the mobile terminal.
The landmark scene image set includes the landmark scene images of the starting position and the position information of each landmark scenery. The landmark scenery may be any indoor scene of the indoor environment at the starting position. For example, if the starting position is subway station A, the landmark scene images may be images of the escalators or stairs inside subway station A, images of the station name signboards, and images of the entrance or exit gates.
The direction computing module 13 obtains the real-time scene images photographed by the mobile terminal and matches each real-time scene image against each landmark scene image in the landmark scene image set.
When navigating, the user photographs the real-time scenery near the starting position with the camera of the mobile terminal; the real-time scene images are obtained and saved in the mobile terminal's memory. After receiving the real-time scene images, the program matches each real-time scene image against each landmark scene image in the landmark scene image set.
When matching, the direction computing module 13 preferably extracts and compares the features of the real-time scene image and of the landmark scene image. This comparison may use computer vision techniques, for example extracting Speeded-Up Robust Features (SURF) keypoints and then comparing the real-time scene image with the landmark scene image using the Fast Library for Approximate Nearest Neighbors (FLANN). According to the comparison result, if the matching succeeds, the position information of the landmark scenery is obtained, and optionally also the name of the landmark scenery. From the position information of the landmark scenery and the destination location, the direction information from the landmark scenery to the destination location can then be obtained.
For the application scenario of a station, the direction information from the landmark scenery to the destination location may comprise two parts: one part is the direction information from the landmark scenery to the corresponding station exit, and the other part is the direction information from that station exit to the destination location.
The direction computing module 13 then comprises:
a destination direction acquisition module, configured to obtain, according to the destination location, the station exit information corresponding to the destination location and the direction information from that station exit to the destination location;
and an exit direction acquisition module, configured to obtain, according to the position information of the landmark scenery and the station exit information, the direction information from the landmark scenery to the station exit.
The destination direction acquisition module looks up the pre-stored exit / destination location lookup table according to the position information of the station exit, and obtains the direction information from the station exit to the destination location.
The exit direction acquisition module calculates the direction information from the landmark scenery to the station exit according to the position information of the landmark scenery and the position information of the station exit.
Because the route from the landmark scenery to the station exit lies indoors, the navigation path from the landmark scenery to the station exit can be calculated from the position information of the landmark scenery, the position information of the station exit, and the indoor electronic map of the station; the direction information from the landmark scenery to the station exit is then obtained from that navigation path.
Alternatively, the direction information from each landmark scenery to each station exit may be stored in the scenery navigation database in advance; the direction information from the landmark scenery to the corresponding station exit is then looked up according to the obtained station exit information.
The direction computing module 13 can also estimate the bearing of the station exit relative to the user's current position. This calculation uses the accelerometer and gyroscope built into the smartphone to determine the device's orientation.
The output module 14 outputs the navigation direction according to the direction information from the landmark scenery to the destination location.
The output module 14 usually outputs the navigation direction through the display screen of the mobile terminal or by voice. If it is output by display, a corresponding direction indicator can be generated according to the direction information from the landmark scenery to the destination location; the direction indicator is then displayed by the mobile terminal, for example as a rotating arrow on the smartphone screen giving the user the corresponding bearing. If the navigation direction is output by voice prompt, the direction information can be announced by voice.
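A minimal sketch of how the four modules of Fig. 5 could be wired together is shown below. The class and method names are illustrative assumptions, not the patent's own API, and the matching and direction logic is delegated to the match_scene, lookup_exit, and arrow_rotation helpers sketched earlier in this description.

```python
class SceneryNavigationSystem:
    """Illustrative composition of the input, search, direction and output modules."""

    def __init__(self, database):
        # database: mapping from station name to a list of landmark records,
        # each with an "image" path; this format is an assumption for the sketch.
        self.database = database

    def input_module(self, start_position, destination):           # module 11
        self.start, self.destination = start_position, destination

    def search_module(self):                                        # module 12
        return self.database.get(self.start, [])

    def direction_module(self, realtime_image_path, landmark_set):  # module 13
        for landmark in landmark_set:
            if match_scene(realtime_image_path, landmark["image"]):
                found = lookup_exit(self.start, self.destination)
                if found is not None:
                    _exit_name, bearing = found
                    return bearing
        return None

    def output_module(self, bearing, device_heading):               # module 14
        if bearing is not None:
            print("Rotate arrow by", arrow_rotation(bearing, device_heading), "degrees")
        else:
            print("No landmark recognized; ask the user to photograph another scene")
```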
Those of ordinary skill in the art will appreciate that all or part of the processes in the above embodiments, and the corresponding system, may be implemented by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, and when executed it may include the processes of the embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the claims. It should be pointed out that a person of ordinary skill in the art can make various modifications and improvements without departing from the concept of the present invention, and all of these fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be determined by the appended claims.
Claims (10)
1. A navigation method based on photographed scenery, characterized in that it comprises the following steps:
obtaining a starting position and a destination location;
obtaining, from a preset scenery navigation database and according to the starting position, a landmark scene image set corresponding to the starting position, wherein the landmark scene image set includes the corresponding landmark scene images and the position information of each landmark scenery;
obtaining real-time scene images photographed by a mobile terminal, and matching each real-time scene image against the landmark scene images in the landmark scene image set; if the matching succeeds, obtaining the direction information from the landmark scenery to the destination location according to the position information of the landmark scenery and the destination location;
outputting a navigation direction according to the direction information from the landmark scenery to the destination location.
2. The navigation method based on photographed scenery according to claim 1, characterized in that the step of obtaining the direction information from the landmark scenery to the destination location according to the position information of the landmark scenery and the destination location comprises:
according to the destination location, obtaining the station exit information corresponding to the destination location and the direction information from that station exit to the destination location;
according to the position information of the landmark scenery and the station exit information, obtaining the direction information from the landmark scenery to the station exit.
3. The navigation method based on photographed scenery according to claim 2, characterized in that the step of obtaining the direction information from the station exit to the destination location comprises:
according to the position information of the station exit, looking up a pre-stored exit / destination location lookup table to obtain the direction information from the station exit to the destination location.
4. The navigation method based on photographed scenery according to claim 2, characterized in that the step of obtaining the direction information from the landmark scenery to the station exit comprises:
according to the position information of the landmark scenery and the position information of the station exit, calculating the direction information from the landmark scenery to the station exit.
5. The navigation method based on photographed scenery according to any one of claims 1 to 4, characterized in that the step of outputting a navigation direction according to the direction information from the landmark scenery to the destination location comprises:
according to the direction information from the landmark scenery to the destination location, generating a corresponding direction indicator;
displaying the direction indicator by the mobile terminal.
6. A navigation system based on photographed scenery, characterized in that it comprises:
an input module, configured to obtain a starting position and a destination location;
a search module, configured to obtain, from a preset scenery navigation database and according to the starting position, a landmark scene image set corresponding to the starting position, wherein the landmark scene image set includes the corresponding landmark scene images and the position information of each landmark scenery;
a direction computing module, configured to obtain real-time scene images photographed by a mobile terminal and match each real-time scene image against the landmark scene images in the landmark scene image set, and, if the matching succeeds, to obtain the direction information from the landmark scenery to the destination location according to the position information of the landmark scenery and the destination location;
an output module, configured to output a navigation direction according to the direction information from the landmark scenery to the destination location.
7. The navigation system based on photographed scenery according to claim 6, characterized in that the direction computing module comprises:
a destination direction acquisition module, configured to obtain, according to the destination location, the station exit information corresponding to the destination location and the direction information from that station exit to the destination location;
an exit direction acquisition module, configured to obtain, according to the position information of the landmark scenery and the station exit information, the direction information from the landmark scenery to the station exit.
8. The navigation system based on photographed scenery according to claim 7, characterized in that the destination direction acquisition module looks up a pre-stored exit / destination location lookup table according to the position information of the station exit to obtain the direction information from the station exit to the destination location.
9. The navigation system based on photographed scenery according to claim 7, characterized in that the exit direction acquisition module calculates the direction information from the landmark scenery to the station exit according to the position information of the landmark scenery and the position information of the station exit.
10. The navigation system based on photographed scenery according to any one of claims 6 to 9, characterized in that the output module generates a corresponding direction indicator according to the direction information from the landmark scenery to the destination location, and the direction indicator is displayed by the mobile terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410080837.1A CN104897165A (en) | 2014-03-06 | 2014-03-06 | Shot scenery-based navigation method and system thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410080837.1A CN104897165A (en) | 2014-03-06 | 2014-03-06 | Shot scenery-based navigation method and system thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104897165A true CN104897165A (en) | 2015-09-09 |
Family
ID=54029977
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410080837.1A Pending CN104897165A (en) | 2014-03-06 | 2014-03-06 | Shot scenery-based navigation method and system thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104897165A (en) |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105096220A (en) * | 2015-09-17 | 2015-11-25 | 成都千易信息技术有限公司 | Method for providing tourism interpretation through intelligent terminal |
CN105333878A (en) * | 2015-11-26 | 2016-02-17 | 深圳如果技术有限公司 | Road condition video navigation system and method |
US10823577B2 (en) | 2015-12-03 | 2020-11-03 | Huawei Technologies Co., Ltd. | Navigation method and navigation device |
WO2017092008A1 (en) * | 2015-12-03 | 2017-06-08 | 华为技术有限公司 | Navigation method and navigation device |
CN107289936A (en) * | 2016-03-30 | 2017-10-24 | 中国移动通信集团福建有限公司 | A kind of indoor navigation method, client and system |
CN107576332A (en) * | 2016-07-04 | 2018-01-12 | 百度在线网络技术(北京)有限公司 | A kind of method and apparatus of transfering navigation |
CN107576332B (en) * | 2016-07-04 | 2020-08-04 | 百度在线网络技术(北京)有限公司 | Transfer navigation method and device |
WO2018035848A1 (en) * | 2016-08-26 | 2018-03-01 | Nokia Technologies Oy | A method, apparatus and computer program product for assisting a user in locating a vehicle |
CN107036609A (en) * | 2016-10-18 | 2017-08-11 | 中建八局第建设有限公司 | Virtual reality air navigation aid, server, terminal and system based on BIM |
CN106595678A (en) * | 2016-11-10 | 2017-04-26 | 广州市沃希信息科技有限公司 | Passenger positioning method and system |
CN107067807A (en) * | 2017-04-12 | 2017-08-18 | 广西灵图信息科技有限公司 | A kind of method of indoor parking position guiding |
CN107728506A (en) * | 2017-08-28 | 2018-02-23 | 深圳市盛路物联通讯技术有限公司 | A kind of method for detecting position and controller based on Internet of Things |
CN107728506B (en) * | 2017-08-28 | 2020-10-20 | 深圳市盛路物联通讯技术有限公司 | Position detection method based on Internet of things and controller |
CN108334639A (en) * | 2018-03-20 | 2018-07-27 | 北京知道创宇信息技术有限公司 | Method for visualizing, device and the AR equipment presented based on AR visions |
CN110470296A (en) * | 2018-05-11 | 2019-11-19 | 珠海格力电器股份有限公司 | A kind of localization method, positioning robot and computer storage medium |
CN109029444B (en) * | 2018-06-12 | 2022-03-08 | 深圳职业技术学院 | Indoor navigation system and method based on image matching and space positioning |
CN109029444A (en) * | 2018-06-12 | 2018-12-18 | 深圳职业技术学院 | One kind is based on images match and sterically defined indoor navigation system and air navigation aid |
CN109579864A (en) * | 2018-12-30 | 2019-04-05 | 张鸿青 | Air navigation aid and device |
CN109579864B (en) * | 2018-12-30 | 2022-06-07 | 张鸿青 | Navigation method and device |
CN110146083A (en) * | 2019-05-14 | 2019-08-20 | 深圳信息职业技术学院 | A kind of crowded off-the-air picture identification cloud navigation system |
CN110864683A (en) * | 2019-11-27 | 2020-03-06 | 支付宝(杭州)信息技术有限公司 | Service handling guiding method and device based on augmented reality |
CN110864683B (en) * | 2019-11-27 | 2021-07-30 | 支付宝(杭州)信息技术有限公司 | Service handling guiding method and device based on augmented reality |
CN112953989A (en) * | 2019-12-10 | 2021-06-11 | 中移雄安信息通信科技有限公司 | Image sharing method, communication equipment and medium |
CN113029142A (en) * | 2020-05-26 | 2021-06-25 | 深圳市拓安信计控仪表有限公司 | Navigation method, navigation device and terminal equipment |
CN112378385A (en) * | 2020-07-31 | 2021-02-19 | 浙江宇视科技有限公司 | Method, device, medium and electronic equipment for determining position of attention information |
CN114646320A (en) * | 2022-02-09 | 2022-06-21 | 江苏泽景汽车电子股份有限公司 | Path guiding method and device, electronic equipment and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20150909 |
| WD01 | Invention patent application deemed withdrawn after publication | |