WO2014114244A1 - Method and terminal for discovering an augmented reality target - Google Patents
- Publication number
- WO2014114244A1 (PCT/CN2014/071169)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- terminal
- target
- state
- determined
- navigation route
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/024—Guidance services
Definitions
- The present invention belongs to the field of Augmented Reality (AR) technology, and in particular relates to a method and a terminal for discovering an AR target.
- Background Art
- AR is a technology that helps people acquire information about objects in the real world in a more intuitive and visual way: it enhances the user's perception of the real world with information supplied by computer systems. It is usually implemented on a mobile terminal or a Head Mounted Display (HMD). Such a device includes at least one camera and one display, and may also include a positioning system and various sensors, such as a Global Positioning System (GPS) receiver, gyroscopes, and light sensors.
- Existing solutions cannot accurately guide users to distinguish AR targets from non-AR targets; after reaching the vicinity of an AR target, the user must manually photograph the surrounding environment before the AR target can be identified.
- The purpose of the embodiments of the present invention is to provide an AR target discovery method and terminal, aiming to solve the problem that the prior art can only locate large AR targets and cannot accurately identify small AR targets.
- An AR target discovery method includes: determining the state of the terminal when it is determined, from a pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of a pre-selected AR target; starting the camera and acquiring a picture when the terminal is determined to be in a search state; and prompting the AR target when the acquired picture includes the AR target.
- Determining, from the pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of the pre-selected AR target includes: acquiring, according to the pre-generated navigation route, the distance between the location of the AR target and the starting position of the terminal; obtaining the average moving speed of a motion mode pre-selected by the user; calculating, from the distance and the average moving speed, an estimated time for the terminal to arrive near the AR target; and determining that the terminal has arrived near the AR target after it has moved for the estimated time from its starting position.
- The method further includes: when the acquired picture does not include the AR target, collecting the current position parameters of the terminal with the navigation and positioning system, and correcting the navigation route according to the position parameters.
- The position parameter includes at least one of the positioning parameter information of the terminal's current location and a prominent landmark of the terminal's current location.
- The method further includes: outputting a warning message when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
- Determining the state of the terminal includes: determining whether the terminal is in a search state according to the gyroscope and whether any preset function is in use.
- The warning information includes at least one of the following: vibration, ringing, or lighting.
- A terminal includes: a state determining unit, configured to determine the state of the terminal when it is determined, from a pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of a pre-selected AR target; a picture acquiring unit, configured to start the camera and acquire a picture when the state determining unit determines that the terminal is in a search state; and a prompting unit, configured to prompt the AR target when the picture acquired by the picture acquiring unit includes the AR target.
- The state determining unit includes: a distance acquiring subunit, configured to acquire, according to the pre-generated navigation route, the distance between the location of the AR target and the starting position of the terminal; a speed acquiring subunit, configured to obtain the average moving speed of a motion mode pre-selected by the user; an estimated-time acquiring subunit, configured to calculate, from the distance obtained by the distance acquiring subunit and the average moving speed obtained by the speed acquiring subunit, an estimated time for the terminal to arrive near the AR target; and a determining subunit, configured to determine that the terminal has arrived near the AR target after it has moved for the estimated time from its starting position.
- The terminal further includes: a parameter collecting unit, configured to collect the current position parameter of the terminal with the navigation and positioning system when the picture acquired by the picture acquiring unit does not include the AR target; and a correction unit, configured to correct the navigation route according to the position parameter collected by the parameter collecting unit.
- The position parameter includes at least one of the positioning parameter information of the terminal's current location and a prominent landmark of the terminal's current location.
- The terminal further includes: a warning unit, configured to output a warning message when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
- The state determining unit is specifically configured to determine, according to the gyroscope and whether any preset function is in use, whether the terminal is in a search state when it is determined, from the pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of the pre-selected AR target.
- The warning information includes at least one of the following: vibration, ringing, or lighting.
- When the terminal is determined to be in the search state, a picture of the surrounding environment is acquired automatically, so that one or more targets are extracted from the picture and matched against the AR target. The AR processing flow is thereby started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover AR targets is greatly reduced.
- FIG. 1 is a flowchart of an AR target discovery method according to an embodiment of the present invention;
- FIG. 2 is a flowchart of a specific implementation of step S101 of the AR target discovery method according to an embodiment of the present invention;
- FIG. 3 is a flowchart of determining whether the terminal is in a search state in the AR target discovery method according to an embodiment of the present invention;
- FIG. 4 is a flowchart of an AR target discovery method according to another embodiment of the present invention;
- FIG. 5 is a flowchart of an AR target discovery method according to another embodiment of the present invention;
- FIG. 6 is a structural block diagram of an AR target discovery apparatus according to an embodiment of the present invention;
- FIG. 7 is a block diagram of the hardware structure of an AR target discovery apparatus according to an embodiment of the present invention;
- FIG. 8 is a hardware implementation block diagram of an AR target discovery apparatus according to an embodiment of the present invention.
- Detailed Description
- Step S101: determining the state of the terminal when it is determined, from a pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of a pre-selected AR target.
- Determining the state of the terminal includes: determining whether the terminal is in a search state according to the gyroscope and whether any preset function is in use.
- The terminal may obtain an AR target pre-selected by the user by receiving an input instruction and determine the location of the AR target, and then generate a navigation route with the navigation and positioning system according to the current location of the terminal (i.e., the starting position) and the location of the AR target, to guide the user to the pre-selected AR target.
- The navigation and positioning system includes, but is not limited to, the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou system, and the like.
- The terminal may also obtain a location selected by the user on the map displayed by the terminal, perform an AR target search centered on that location, and display the searched AR targets on a radar map centered on the user's current location.
- The user can select one or more AR targets on the radar map, and a navigation route is generated to guide the user to the pre-selected AR target or targets.
- The user may also pre-select a mode of transportation to the destination, for example walking, cycling, or driving. Taking the terminal's current location as the starting point, a navigation route can be generated from the departure place, the destination, and the user's pre-selected motion mode, thereby guiding the user to the vicinity of the selected AR target.
- It should be noted that the navigation route generated by the navigation and positioning function can only guide the user to the vicinity of the AR target; it does not necessarily guide the user to the precise position of the AR target.
- To save energy, after the navigation route has been determined, the terminal may temporarily close devices or function modules that are not needed while the terminal is moving, such as the camera, gyroscope, accelerometer, electronic compass, and GPS, and re-open the temporarily closed devices or function modules once the terminal has moved to the vicinity of the AR target.
- Alternatively, the navigation and positioning system is not used for navigation; instead, low-precision navigation based on the GSM/3G signal and the base station is used to save energy on the terminal.
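The power-saving scheme above can be sketched as a simple module-toggling policy. This is a minimal illustration, not a real device API; the module names and the function are assumptions.

```python
# Power-hungry modules that are unnecessary while merely en route
# (illustrative names, not a real device API).
NONESSENTIAL_MODULES = {"camera", "gyroscope", "accelerometer",
                        "electronic_compass", "gps"}

def update_active_modules(active: set, near_ar_target: bool) -> set:
    """Return the module set after applying the power-saving policy:
    nonessential modules stay off en route and are re-enabled on arrival
    near the AR target."""
    if near_ar_target:
        return active | NONESSENTIAL_MODULES   # re-open the closed modules
    return active - NONESSENTIAL_MODULES       # keep them closed en route

# En route: only the display and cellular radio stay on.
en_route = update_active_modules({"display", "cellular", "camera"}, False)
# On arrival: everything needed for the AR search is switched back on.
arrived = update_active_modules(en_route, True)
```

The same policy covers both variants in the text: the low-precision GSM/3G navigation simply keeps `gps` in the nonessential set for longer.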
- Step S101 is specifically as follows:
- Step S201: acquiring, according to the pre-generated navigation route, the distance between the location of the AR target and the starting position of the terminal.
- That is, the terminal acquires the distance between the AR target's location and the terminal's starting location according to the generated navigation route.
- Step S202: obtaining the average moving speed of the motion mode pre-selected by the user.
- For example, if the user selects walking, the average moving speed of walking, e.g. 5 km/h, can be obtained from the relevant configuration parameters. The average moving speeds for the different motion modes can be preset in the system as estimated values.
- Alternatively, the terminal can record in advance the speeds at which the user walks, runs, rides, or otherwise moves, and estimate the user's average speed for each motion mode from the statistics.
- Step S203: calculating, from the distance and the average moving speed, the estimated time for the terminal to arrive near the AR target.
- Step S204: determining that the terminal has arrived near the AR target after it has moved for the estimated time from its starting position.
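Steps S201 to S204 reduce to a simple time-distance-speed computation. A minimal sketch, assuming the per-mode average speeds are configuration values as described above (the specific numbers are illustrative):

```python
def estimated_arrival_time(distance_km: float, avg_speed_kmh: float) -> float:
    """Estimated travel time in hours for reaching the vicinity of the
    AR target (step S203): distance divided by average speed."""
    if avg_speed_kmh <= 0:
        raise ValueError("average speed must be positive")
    return distance_km / avg_speed_kmh

# Illustrative preset average speeds per motion mode (step S202).
AVERAGE_SPEED_KMH = {"walking": 5.0, "cycling": 15.0, "driving": 40.0}

# A 2.5 km walk at 5 km/h gives an estimated 0.5 h (30 minutes).
eta_hours = estimated_arrival_time(2.5, AVERAGE_SPEED_KMH["walking"])
```

Once the terminal has been moving for `eta_hours` from its starting position, step S204 judges it to be near the AR target; the task alarm described below can be set to this value.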
- The terminal may set a task alarm according to the estimated time, so that when the terminal is estimated to be approaching the destination, the user is reminded in the form of a task alarm that the AR target is near, prompting the user to watch for the AR target.
- The estimated time may also be corrected according to the actual motion of the terminal.
- For example, a user who originally intended to carry the terminal and walk to the vicinity of the AR target may change his mind in the second half of the navigation route and decide to drive the remaining distance. The terminal then receives the motion-change instruction input by the user during the movement and changes the motion mode from "walking" to "driving".
- Step S102: when it is determined that the terminal is in a search state, starting the camera and acquiring a picture.
- After the terminal arrives near the AR target, sensors such as the gyroscope and light sensor that were temporarily closed may be re-opened to collect terminal state parameters and determine the terminal state.
- Whether the terminal state parameters satisfy preset conditions is used to judge the current state of the terminal, that is, whether the terminal is in a normal motion state or in a search state.
- If the terminal is in a search state, the user is presumed to be examining the surrounding environment through the terminal, looking for AR targets.
- In that case the camera is turned on automatically to take pictures, or a video stream is collected and image frames are extracted from it.
- The automatic photographing or video stream collection described above can be performed by a terminal such as a smartphone or a head-mounted device. Before targets are extracted from a photo or a video frame, the image can be optimized, for example by zooming in or adjusting brightness or contrast (the processing is not limited to these methods and can be any method that improves the viewing effect of the visual content and makes it more suitable for human eyes). An image recognition algorithm then extracts from the image N targets likely to interest the user, and it is determined whether the N targets include the AR target, where N is an integer greater than or equal to 1.
- To speed up image processing, the system can first process M of the targets in the picture with a fast, low-precision recognition algorithm (M is an integer greater than or equal to 1 and less than or equal to N), and identify the remaining N-M targets with an accurate recognition algorithm, to determine whether the AR target is among them.
- The fast, low-precision recognition algorithm may identify a non-AR target as an AR target, but it does not identify an AR target as a non-AR target. The precise recognition algorithm can accurately distinguish AR targets from non-AR targets, but it takes a long time. With the above processing, the total processing time can therefore be reduced while the recognition accuracy is preserved.
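One plausible reading of this two-stage scheme is a cascade: the fast pass screens every candidate, and only candidates it accepts are handed to the slow, precise pass. The function names and toy checks below are assumptions for illustration, not the patent's actual algorithms.

```python
def find_ar_targets(candidates, fast_check, precise_check):
    """Two-stage recognition cascade. Relies on the property stated
    above: fast_check may accept a non-AR target (false positive) but
    never rejects a real AR target, so anything it rejects can be
    discarded without running the expensive precise check."""
    confirmed = []
    for target in candidates:
        if not fast_check(target):      # cheap rejection of clear non-targets
            continue
        if precise_check(target):       # expensive, accurate confirmation
            confirmed.append(target)
    return confirmed

# Toy stand-ins: the fast pass over-accepts, the precise pass is exact.
fast = lambda t: t >= 0          # accepts every non-negative (over-broad)
precise = lambda t: t % 10 == 0  # "true" AR targets are multiples of 10
result = find_ar_targets([-3, 5, 10, 20, 7], fast, precise)  # [10, 20]
```

The total cost is one cheap check per candidate plus one expensive check only for the fast pass's survivors, which is where the time saving described above comes from.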
- When a head-mounted device is used as the AR target discovery device, it can detect changes in the user's viewing angle. If, when a picture is acquired, the user is found to switch away from the current perspective quickly, that is, the time the user spends in the attention state is below a threshold T, the current picture is not optimized but only saved. If the time the user spends in the attention state exceeds the threshold T, the user is considered to be interested in some target in the current field of view, that is, the currently acquired picture is likely to contain an AR target, and the picture is processed further.
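The dwell-time rule can be sketched as below. It assumes every frame is saved and only frames attended to for longer than T are processed further; the function names and the threshold value are illustrative.

```python
def handle_frame(frame, attention_seconds, threshold_s, save, process):
    """Apply the dwell-time rule above: the current picture is always
    saved, but only frames the user attended to for longer than the
    threshold T are optimized and matched against the AR target."""
    save(frame)
    if attention_seconds > threshold_s:
        process(frame)

saved, processed = [], []
handle_frame("frame-1", 0.4, 2.0, saved.append, processed.append)  # quick glance
handle_frame("frame-2", 3.1, 2.0, saved.append, processed.append)  # long dwell
# saved == ["frame-1", "frame-2"]; processed == ["frame-2"]
```

Skipping optimization and matching for quickly-abandoned views keeps the expensive recognition pipeline reserved for frames that plausibly contain an AR target.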
- Step S103: when the acquired picture includes the AR target, prompting the AR target.
- In this embodiment, when the terminal is determined to be in the search state, a picture of the surrounding environment is acquired automatically, so that one or more targets are extracted from the picture and matched against the AR target. The AR processing flow is thereby started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover AR targets is greatly reduced.
- It should be noted that the execution subject of steps S101 to S103 and steps S201 to S204 is the terminal.
- The terminal state parameters may include the average moving speed of the terminal from the departure point to the vicinity of the AR target, and the current moving speed of the terminal.
- In that case, determining whether the terminal is in the search state is specifically: determining whether the difference between the average moving speed and the current moving speed of the terminal is greater than a preset threshold.
- The average moving speed of the terminal may be calculated from the movement time and the accumulated actual distance before the terminal reaches the vicinity of the AR target, or the average moving speed of the motion mode selected by the user may be used directly. Whether the difference between the current moving speed and the average moving speed exceeds the preset threshold then indicates whether the terminal's speed has dropped significantly; if so, the terminal is considered to be in the search state.
- The terminal state parameters may further include the running state of the terminal's preset functions, since a significant drop in the terminal's moving speed may also be caused by temporarily answering a call, reading a short message, and so on. In that case, determining whether the terminal is in the search state is specifically:
- The preset functions include, but are not limited to, voice calls, short messages, multimedia messages, the alarm clock, calendar events, and the like. If one or more of the preset functions is determined to be in use, the process returns to monitoring changes in the terminal's moving speed; if the preset functions are determined to be stopped, the terminal can be judged to be in the search state.
- The terminal state parameters may further include the terminal's motion state.
- In that case, determining whether the terminal is in the search state is specifically: determining, with a sensor such as the gyroscope, whether the smartphone or head-mounted device is in a steady state.
- Determining whether the terminal's motion state is steady may specifically be: when the data collected by the gyroscope indicates that the terminal's motion state is steady, further determining whether the time for which the motion state has been steady exceeds a preset time threshold. If it is below the threshold, no judgment is made; only when the time for which the motion state has been steady exceeds the preset time threshold is the terminal determined to be in the search state.
- In addition, changes in pupil size can be used to identify whether the user is staring at a target: when the user focuses on something, the pupils dilate. The size of the user's pupils can be photographed with the terminal's front camera and used as an auxiliary reference to further improve the accuracy of the terminal-state judgment.
- FIG. 3 is a flowchart of determining whether the terminal is in a search state in the AR target discovery method according to an embodiment of the present invention.
- The embodiment shown in FIG. 3 combines, after determining that the terminal's moving speed has dropped significantly, all the preset conditions that the aforementioned terminal state parameters need to satisfy; it therefore maximizes the accuracy of the terminal-state judgment.
- The execution order of steps S302 and S303 in the embodiment shown in FIG. 3 is not limited, and only S302 or only S303 may be executed; this is not restricted here.
- Step S301: determining whether the terminal's current moving speed is below the average moving speed by more than a preset threshold. If yes, step S302 is performed; otherwise the terminal is determined not to be in the search state.
- Step S302: determining whether the terminal's preset functions are all stopped. If yes, step S303 is performed; otherwise the terminal is determined not to be in the search state.
- Step S303: determining whether the terminal's motion state is steady. If yes, step S304 is performed; otherwise the terminal is determined not to be in the search state.
- In step S303, after the terminal's motion state is determined to be steady, it may further be determined whether the time for which the motion state has been steady exceeds a preset time threshold; step S304 is performed only if it does.
- Step S304: determining that the terminal is in the search state.
- Since the steps after step S301 are all implemented by the terminal's sensors or related detection devices, the sensors or detection devices may preferably be kept closed or in standby until step S301 determines that the user's moving speed has changed significantly, and only then be turned on, to save energy on the terminal device. It should be noted that the execution subject of steps S301 to S304 is the terminal.
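The combined check of FIG. 3 can be condensed into a single predicate. A minimal sketch of steps S301 to S304; the parameter values and the representation of "steady" are assumptions for illustration.

```python
def is_search_state(current_speed, average_speed, speed_threshold,
                    preset_function_in_use, steady_seconds,
                    steady_threshold_s):
    """Condensed form of steps S301-S304 (speeds in any consistent unit)."""
    # S301: has the speed dropped below the average by more than the threshold?
    if average_speed - current_speed <= speed_threshold:
        return False
    # S302: is any preset function (call, SMS, alarm clock, ...) in use?
    if preset_function_in_use:
        return False
    # S303: has the motion been steady for longer than the time threshold?
    if steady_seconds <= steady_threshold_s:
        return False
    return True  # S304: the terminal is in the search state

# Walking average 5 km/h, now nearly stopped and steady for 3 s: searching.
searching = is_search_state(0.5, 5.0, 2.0, False, 3.0, 2.0)   # True
# Same speeds, but the user is on a call: not searching.
on_call = is_search_state(0.5, 5.0, 2.0, True, 3.0, 2.0)      # False
```

The early returns mirror the flowchart: each failed condition short-circuits to "not in the search state", so the later (sensor-based) checks run only when the cheaper speed check has already fired.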
- FIG. 4 shows the implementation flow of an AR target discovery method according to another embodiment of the present invention. Based on the embodiment shown in FIG. 1, this embodiment describes the case in which the one or more identified targets do not include the AR target, as follows:
- Step S104: when the acquired picture does not include the AR target, collecting the current position parameters of the terminal with the navigation and positioning system.
- Step S105: correcting the navigation route according to the position parameters.
- The position parameters include, but are not limited to, the GPS parameters of the terminal's current location, a prominent landmark of the terminal's current location, and the like. The GPS parameters can be used to determine whether the current location is near the AR target, and the prominent landmark of the current location can be matched against the built-in prominent landmarks near the AR target to determine whether the current position is near the AR target. The navigation route can thus be re-corrected when the position is wrong, and the user guided onward.
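The position check that decides whether a route correction is needed can be sketched as follows. Planar coordinates in metres stand in for real GPS fixes, and the landmark matching is reduced to set intersection; both simplifications, along with the function name, are assumptions.

```python
import math

def is_near_ar_target(position_m, target_m, gps_radius_m,
                      landmarks_here, landmarks_near_target):
    """The terminal is judged near the AR target if its positioning fix
    lies within a radius of the target, or if a prominent landmark seen
    at the current location matches one stored near the target."""
    dx = position_m[0] - target_m[0]
    dy = position_m[1] - target_m[1]
    within_radius = math.hypot(dx, dy) <= gps_radius_m
    landmark_match = bool(set(landmarks_here) & set(landmarks_near_target))
    return within_radius or landmark_match

# 50 m from the target with a 60 m radius: near by the GPS fix alone.
by_gps = is_near_ar_target((0, 0), (30, 40), 60.0, [], [])
# Far by GPS, but a matching landmark still confirms the position.
by_landmark = is_near_ar_target((0, 0), (300, 400), 60.0,
                                ["clock tower"],
                                ["clock tower", "fountain"])
```

When this check fails, the route is regenerated from the collected position parameters, which is step S105 above.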
- FIG. 5 shows the implementation flow of an AR target discovery method according to another embodiment of the present invention. Specifically, before step S101, after step S103, or after step S105, the method further includes: Step S106: outputting a warning message when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
- The warning information includes at least one of the following: vibration, ringing, or lighting. With the warning, the user can determine the specific orientation of the AR target in the current environment, which further guides the user to discover the AR target.
- The warning message may also be output when the AR task alarm expires, or shortly before or after it expires, to prompt the user to pay attention.
- The AR task alarm is used to prompt the user that the terminal has reached the vicinity of the AR target.
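One way to operationalize "irregular motion that ends up away from the target" is to watch the terminal's distance to the target over time: repeated direction reversals plus a net increase in distance trigger the warning. The patent leaves the exact definition open, so the heuristic and thresholds below are assumptions.

```python
def should_warn(distances_to_target):
    """Heuristic for step S106: warn when the recent motion keeps
    reversing direction relative to the AR target (wandering) and the
    terminal ends up farther from the target than it started."""
    if len(distances_to_target) < 3:
        return False  # too few samples to judge irregularity
    deltas = [b - a for a, b in
              zip(distances_to_target, distances_to_target[1:])]
    # A reversal is a sign change between consecutive distance deltas.
    reversals = sum(1 for d1, d2 in zip(deltas, deltas[1:]) if d1 * d2 < 0)
    moving_away = distances_to_target[-1] > distances_to_target[0]
    return reversals >= 2 and moving_away

# Wandering back and forth while drifting away: warn (vibrate/ring/light).
warn = should_warn([10.0, 8.0, 11.0, 9.0, 13.0])   # True
# Steadily approaching the target: no warning.
no_warn = should_warn([10.0, 8.0, 6.0, 4.0])       # False
```

On a `True` result the terminal would emit one of the warning forms listed above (vibration, ringing, or lighting).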
- In this embodiment, when the terminal is determined to be in the search state, a picture of the surrounding environment is acquired automatically, so that one or more targets are extracted from the picture and matched against the AR target. The AR processing flow is thereby started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover AR targets is greatly reduced.
- FIG. 6 is a structural block diagram of a terminal for discovering an AR target according to an embodiment of the present invention.
- The terminal may be an AR terminal such as a smart mobile terminal or a head-mounted device, and is used to run the methods described in the embodiments of FIGS. 1 to 5 of the present invention. For convenience of explanation, only the parts related to this embodiment are shown.
- The apparatus includes:
- a state determining unit 61, which determines the state of the terminal when it is determined, from the pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of the pre-selected AR target;
- a picture acquiring unit 62, which starts the camera and acquires a picture when the state determining unit 61 determines that the terminal is in the search state; and
- a prompting unit 63, which prompts the AR target when the picture acquired by the picture acquiring unit 62 includes the AR target.
- The state determining unit 61 includes:
- a distance acquiring subunit, which acquires, according to the pre-generated navigation route, the distance between the location of the AR target and the starting position of the terminal;
- a speed acquiring subunit, which obtains the average moving speed of the motion mode pre-selected by the user;
- an estimated-time acquiring subunit, which calculates, from the distance obtained by the distance acquiring subunit and the average moving speed obtained by the speed acquiring subunit, the estimated time for the terminal to arrive near the AR target; and
- a determining subunit, which determines that the terminal has arrived near the AR target after it has moved for the estimated time from its starting position.
- The state determining unit 61 is specifically configured to determine, according to the gyroscope and whether any preset function is in use, whether the terminal is in the search state when it is determined, from the pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of the pre-selected AR target.
- The terminal further includes:
- a parameter collecting unit, which collects the current position parameter of the terminal with the navigation and positioning system when the picture acquired by the picture acquiring unit 62 does not include the AR target.
- The position parameter includes at least one of the positioning parameter information of the terminal's current location and a prominent landmark of the terminal's current location.
- The terminal further includes:
- a warning unit, which outputs a warning message when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
- the warning information includes at least one of the following: vibration, ringing, or lighting.
- FIG. 7 is a block diagram showing the hardware structure of a terminal for discovering an AR target according to an embodiment of the present invention.
- The terminal may be a smart mobile terminal, a head-mounted device, or the like, and is used to run the methods described in the embodiments of FIGS. 1 to 5 of the present invention. For convenience of explanation, only the parts related to this embodiment are shown.
- The apparatus includes a processor 71, a memory 72, and a bus 73. The processor 71 and the memory 72 communicate with each other via the bus 73; the memory 72 stores a program, and the processor 71 executes the program stored in the memory 72. When executed, the program is used to: determine the state of the terminal when it is determined, from a pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of a pre-selected AR target; start the camera and acquire a picture when it is determined that the terminal is in a search state; and prompt the AR target when the acquired picture includes the AR target.
- Determining the state of the terminal includes: determining whether the terminal is in a search state according to the gyroscope and whether any preset function is in use.
- Determining, from the pre-generated navigation route and the moving speed of the terminal, that the terminal has reached the vicinity of the pre-selected AR target includes: acquiring, according to the pre-generated navigation route, the distance between the location of the AR target and the starting position of the terminal; obtaining the average moving speed of the motion mode pre-selected by the user; calculating the estimated time for the terminal to arrive near the AR target; and determining that the terminal has arrived near the AR target after it has moved for the estimated time from its starting position.
- the navigation location system is used to collect the current location parameter of the terminal;
- the navigation route is corrected based on the positional parameter.
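One way such a correction could work, sketched under the assumption that the route is a list of waypoints in a local metric frame and that the terminal re-anchors the route at its current position when it has strayed too far; the deviation threshold is hypothetical:

```python
import math

DEVIATION_THRESHOLD_M = 25.0  # hypothetical re-routing threshold

def _dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def correct_route(route, current_pos):
    """Correct the navigation route using the terminal's current
    location parameter.

    route: list of (x, y) waypoints; current_pos: (x, y) position.
    """
    if not route:
        return [current_pos]
    # Find the waypoint nearest to the current position.
    i = min(range(len(route)), key=lambda k: _dist(route[k], current_pos))
    if _dist(route[i], current_pos) > DEVIATION_THRESHOLD_M:
        # Off the planned route: a real implementation would re-plan
        # here; this sketch just re-anchors at the current position.
        return [current_pos] + route[i:]
    # On route: keep only the waypoints still ahead.
    return route[i:]
```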
- the location parameter includes at least one of positioning parameter information of a current location of the terminal and a prominent flag of a current location of the terminal.
- a warning message is output when it is determined that the terminal performs irregular motion near the AR target and eventually moves away from the AR target.
- the warning information includes at least one of the following: vibration, ringing, or lighting.
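The patent does not define "irregular motion", so the following sketch adopts one plausible reading: the terminal-to-target distance both shrinks and grows within a sampling window (wandering) and ends up larger than it started (moving away). The `near_radius_m` threshold and the warning channels list are assumptions:

```python
def should_warn(distances_m, near_radius_m=30.0):
    """Decide whether to warn: the terminal was near the AR target,
    moved irregularly, and ended up farther away than it started.

    distances_m: successive terminal-to-target distances in metres.
    """
    if len(distances_m) < 3:
        return False
    near = min(distances_m) <= near_radius_m
    deltas = [b - a for a, b in zip(distances_m, distances_m[1:])]
    # 'Irregular': the distance both shrinks and grows in the window.
    irregular = any(d > 0 for d in deltas) and any(d < 0 for d in deltas)
    moving_away = distances_m[-1] > distances_m[0]
    return near and irregular and moving_away

def emit_warning(channels=("vibration",)):
    """Stand-in for the warning output: vibration, ringing, or lighting."""
    for channel in channels:
        print(f"warning via {channel}")
```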
- FIG. 8, based on the embodiment shown in FIG. 6 of the present invention, shows an AR target discovery terminal provided by an embodiment of the present invention.
- the embodiment of the present invention automatically acquires a picture of the surrounding environment, extracts one or more targets from the picture, and matches them against the AR target, thereby realizing automatic start of the AR processing flow in a label-free environment and accurate discovery of small AR targets, greatly reducing the time needed to discover AR targets.
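The extraction-and-matching flow above can be sketched as follows. The patent does not name a matcher, so `similarity` here is a toy stand-in (Jaccard overlap of feature-token sets) for whatever descriptor matching the real system uses; the names and threshold are assumptions:

```python
def similarity(a, b):
    """Toy descriptor similarity: Jaccard overlap of feature-token sets."""
    a, b = set(a), set(b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def find_ar_target(extracted_targets, ar_signature, threshold=0.8):
    """Scan targets extracted from the acquired picture and return the
    name of the first one matching the AR target's signature, or None.

    extracted_targets: list of (name, feature_tokens) pairs.
    """
    for name, signature in extracted_targets:
        if similarity(signature, ar_signature) >= threshold:
            return name
    return None
```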
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Navigation (AREA)
- Telephone Function (AREA)
- Instructional Devices (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020157023037A KR101748226B1 (ko) | 2013-01-28 | 2014-01-23 | 증강 현실 객체를 발견하는 방법, 및 단말 |
JP2015554037A JP6123120B2 (ja) | 2013-01-28 | 2014-01-23 | 拡張現実オブジェクトを発見するための方法および端末 |
EP14743259.5A EP2865993B1 (en) | 2013-01-28 | 2014-01-23 | Augmented reality target discovery method and terminal |
US14/573,178 US9436874B2 (en) | 2013-01-28 | 2014-12-17 | Method for discovering augmented reality object, and terminal |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310031093.X | 2013-01-28 | ||
CN201310031093.XA CN103968824B (zh) | 2013-01-28 | 2013-01-28 | 一种发现增强现实目标的方法及终端 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/573,178 Continuation US9436874B2 (en) | 2013-01-28 | 2014-12-17 | Method for discovering augmented reality object, and terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014114244A1 true WO2014114244A1 (zh) | 2014-07-31 |
Family
ID=51226936
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/071169 WO2014114244A1 (zh) | 2013-01-28 | 2014-01-23 | 一种发现增强现实目标的方法及终端 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9436874B2 (zh) |
EP (1) | EP2865993B1 (zh) |
JP (1) | JP6123120B2 (zh) |
KR (1) | KR101748226B1 (zh) |
CN (1) | CN103968824B (zh) |
WO (1) | WO2014114244A1 (zh) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106062862B (zh) | 2014-10-24 | 2020-04-21 | 杭州凌感科技有限公司 | 用于沉浸式和交互式多媒体生成的系统和方法 |
US10256859B2 (en) | 2014-10-24 | 2019-04-09 | Usens, Inc. | System and method for immersive and interactive multimedia generation |
US10334085B2 (en) * | 2015-01-29 | 2019-06-25 | Splunk Inc. | Facilitating custom content extraction from network packets |
CN104834375A (zh) * | 2015-05-05 | 2015-08-12 | 常州恐龙园股份有限公司 | 基于增强现实的游乐园指南系统 |
CN104850229B (zh) * | 2015-05-18 | 2019-03-22 | 小米科技有限责任公司 | 识别物体的方法及装置 |
CN105824863B (zh) * | 2015-10-30 | 2021-12-28 | 维沃移动通信有限公司 | 一种桌面主题推荐方法及终端 |
CN106959111A (zh) * | 2016-01-08 | 2017-07-18 | 台湾国际物业管理顾问有限公司 | 建筑物空间连续定位资讯系统 |
CN105554392A (zh) * | 2016-01-20 | 2016-05-04 | 京东方科技集团股份有限公司 | 显示切换装置及其切换方法、穿戴显示装置及其显示方法 |
CN106713868A (zh) * | 2017-01-03 | 2017-05-24 | 捷开通讯(深圳)有限公司 | 一种监控随机目标的方法及系统 |
CN107229706A (zh) * | 2017-05-25 | 2017-10-03 | 广州市动景计算机科技有限公司 | 一种基于增强现实的信息获取方法及其装置 |
CN109151204B (zh) * | 2018-08-31 | 2021-04-23 | Oppo(重庆)智能科技有限公司 | 一种基于移动终端的导航方法、装置及移动终端 |
CN111198608B (zh) * | 2018-11-16 | 2021-06-22 | 广东虚拟现实科技有限公司 | 信息提示方法、装置、终端设备及计算机可读取存储介质 |
CN109324693A (zh) * | 2018-12-04 | 2019-02-12 | 塔普翊海(上海)智能科技有限公司 | Ar搜索装置、基于ar搜索装置的物品搜索系统及方法 |
CN111491293B (zh) * | 2019-01-25 | 2022-01-11 | 华为技术有限公司 | 一种运动状态的上报方法及装置 |
CN112733620A (zh) * | 2020-12-23 | 2021-04-30 | 深圳酷派技术有限公司 | 信息提示方法、装置、存储介质及电子设备 |
CN112880689A (zh) * | 2021-01-29 | 2021-06-01 | 北京百度网讯科技有限公司 | 一种领位方法、装置、电子设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101033978A (zh) * | 2007-01-30 | 2007-09-12 | 珠海市智汽电子科技有限公司 | Intelligent automobile assisted navigation and automatic/assisted driving system |
WO2008099482A1 (ja) * | 2007-02-16 | 2008-08-21 | Pioneer Corporation | Search device, search method, search program, and computer-readable recording medium |
CN101451852A (zh) * | 2008-12-19 | 2009-06-10 | 深圳华为通信技术有限公司 | Navigation device and navigation method |
CN102123194A (zh) * | 2010-10-15 | 2011-07-13 | 张哲颖 | Method for optimizing mobile navigation and human-machine interaction functions using augmented reality technology |
CN102867169A (zh) * | 2011-04-08 | 2013-01-09 | 索尼公司 | Image processing device, display control method, and program |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3442163B2 (ja) * | 1994-10-11 | 2003-09-02 | 富士通株式会社 | 位置合わせ方法および装置 |
US6456234B1 (en) | 2000-06-07 | 2002-09-24 | William J. Johnson | System and method for proactive content delivery by situation location |
JP2002267484A (ja) * | 2001-03-09 | 2002-09-18 | Ntt Docomo Inc | 案内システム及び被誘導端末装置 |
JP3787760B2 (ja) | 2001-07-31 | 2006-06-21 | 松下電器産業株式会社 | カメラ付き携帯電話装置 |
JP4260432B2 (ja) * | 2002-07-08 | 2009-04-30 | シャープ株式会社 | 情報提供方法、情報提供プログラム、情報提供プログラムを記録した記録媒体、および情報提供装置 |
JP4066421B2 (ja) * | 2003-01-14 | 2008-03-26 | 株式会社リコー | 携帯情報端末装置 |
JP4380550B2 (ja) * | 2004-03-31 | 2009-12-09 | 株式会社デンソー | 車載用撮影装置 |
JP2007041955A (ja) * | 2005-08-04 | 2007-02-15 | Matsushita Electric Ind Co Ltd | 端末装置 |
JP2007132870A (ja) * | 2005-11-11 | 2007-05-31 | Pioneer Electronic Corp | ナビゲーション装置、コンピュータプログラム、画面制御方法及び測定間隔制御方法 |
JP2007292713A (ja) * | 2006-03-30 | 2007-11-08 | Denso Corp | ナビゲーション装置 |
US7617042B2 (en) | 2006-06-30 | 2009-11-10 | Microsoft Corporation | Computing and harnessing inferences about the timing, duration, and nature of motion and cessation of motion with applications to mobile computing and communications |
JP2008146136A (ja) * | 2006-12-06 | 2008-06-26 | Seiko Epson Corp | 画像認識装置、画像認識システム、画像認識方法、および、制御プログラム |
JP2009267792A (ja) * | 2008-04-25 | 2009-11-12 | Panasonic Corp | 撮像装置 |
US8909466B2 (en) * | 2008-08-01 | 2014-12-09 | Environmental Systems Research Institute, Inc. | System and method for hybrid off-board navigation |
US8639440B2 (en) * | 2010-03-31 | 2014-01-28 | International Business Machines Corporation | Augmented reality shopper routing |
KR101667715B1 (ko) | 2010-06-08 | 2016-10-19 | 엘지전자 주식회사 | 증강현실을 이용한 경로 안내 방법 및 이를 이용하는 이동 단말기 |
US8762041B2 (en) * | 2010-06-21 | 2014-06-24 | Blackberry Limited | Method, device and system for presenting navigational information |
CN102054166B (zh) * | 2010-10-25 | 2016-04-27 | 北京理工大学 | 一种新的用于户外增强现实系统的场景识别方法 |
US20120249588A1 (en) * | 2011-03-22 | 2012-10-04 | Panduit Corp. | Augmented Reality Data Center Visualization |
GB201106555D0 (en) * | 2011-04-19 | 2011-06-01 | Tomtom Int Bv | Taxi dispatching system |
CN102519475A (zh) * | 2011-12-12 | 2012-06-27 | 杨志远 | 一种基于增强现实技术的智能导航方法和设备 |
KR102021050B1 (ko) * | 2012-06-06 | 2019-09-11 | 삼성전자주식회사 | 내비게이션 정보를 제공하는 방법, 기계로 읽을 수 있는 저장 매체, 이동 단말 및 서버 |
- 2013-01-28 CN CN201310031093.XA patent/CN103968824B/zh active Active
- 2014-01-23 KR KR1020157023037A patent/KR101748226B1/ko active IP Right Grant
- 2014-01-23 JP JP2015554037A patent/JP6123120B2/ja active Active
- 2014-01-23 EP EP14743259.5A patent/EP2865993B1/en active Active
- 2014-01-23 WO PCT/CN2014/071169 patent/WO2014114244A1/zh active Application Filing
- 2014-12-17 US US14/573,178 patent/US9436874B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2865993A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110347703A (zh) * | 2019-07-01 | 2019-10-18 | 华南理工大学 | 一种基于ARCore的用户行为分析方法及系统 |
CN110347703B (zh) * | 2019-07-01 | 2023-08-22 | 华南理工大学 | 一种基于ARCore的用户行为分析方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
JP6123120B2 (ja) | 2017-05-10 |
JP2016507746A (ja) | 2016-03-10 |
US9436874B2 (en) | 2016-09-06 |
EP2865993B1 (en) | 2019-04-03 |
KR101748226B1 (ko) | 2017-06-16 |
CN103968824B (zh) | 2018-04-10 |
CN103968824A (zh) | 2014-08-06 |
EP2865993A4 (en) | 2015-12-30 |
US20150104069A1 (en) | 2015-04-16 |
KR20150108925A (ko) | 2015-09-30 |
EP2865993A1 (en) | 2015-04-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014114244A1 (zh) | 一种发现增强现实目标的方法及终端 | |
US11035687B2 (en) | Virtual breadcrumbs for indoor location wayfinding | |
EP3014476B1 (en) | Using movement patterns to anticipate user expectations | |
US10909759B2 (en) | Information processing to notify potential source of interest to user | |
US9410810B2 (en) | Method and apparatus for providing service using a sensor and image recognition in a portable terminal | |
JP6025433B2 (ja) | 携帯ナビゲーション装置 | |
JP4380550B2 (ja) | 車載用撮影装置 | |
US9928710B2 (en) | Danger alerting method and device, portable electronic apparatus | |
US10373496B2 (en) | Parking management system and parking management method | |
KR20160147052A (ko) | 스트리트 뷰 목적지에 대한 안내를 제공하는 방법 및 장치 | |
JP2008227877A (ja) | 映像情報処理装置 | |
CN105509735B (zh) | 信息提示方法、装置及终端 | |
EP3287745B1 (en) | Information interaction method and device | |
CN117128959A (zh) | 寻车导航方法、电子设备、服务器及系统 | |
JP2012019374A (ja) | 電子アルバム作成サーバー、情報処理装置、電子アルバム作成システム、及び、電子アルバム作成サーバーの制御方法 | |
JP2014044166A (ja) | 電子機器、進行方向提示方法およびプログラム | |
CN111176338A (zh) | 导航方法、电子设备及存储介质 | |
JP2005017074A (ja) | 情報送受信装置、情報送受信用プログラム | |
KR20190010065A (ko) | 이동체의 위치인식 및 실시간 추적 가능한 자동보정 예측 알고리즘을 혼합한 선형 제어 방법 | |
JP5977697B2 (ja) | 電子機器、および電子機器を制御するための方法 | |
JP2011220899A (ja) | 情報提示システム | |
JP6976186B2 (ja) | 端末装置及びプログラム | |
KR20190010066A (ko) | 자가 자동 촬영 장치 | |
CN113179372A (zh) | 图像采集装置与人工智能的结合方法、装置及电子设备 | |
JP2012202782A (ja) | 携帯情報端末 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14743259 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2014743259 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2015554037 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 20157023037 Country of ref document: KR Kind code of ref document: A |