
WO2014114244A1 - A method and terminal for discovering an augmented reality target - Google Patents

A method and terminal for discovering an augmented reality target

Info

Publication number
WO2014114244A1
WO2014114244A1 (application PCT/CN2014/071169)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
target
state
determined
navigation route
Prior art date
Application number
PCT/CN2014/071169
Other languages
English (en)
French (fr)
Inventor
李国庆
金志皓
Original Assignee
华为终端有限公司 (Huawei Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为终端有限公司 (Huawei Device Co., Ltd.)
Priority to KR1020157023037A priority Critical patent/KR101748226B1/ko
Priority to JP2015554037A priority patent/JP6123120B2/ja
Priority to EP14743259.5A priority patent/EP2865993B1/en
Publication of WO2014114244A1 publication Critical patent/WO2014114244A1/zh
Priority to US14/573,178 priority patent/US9436874B2/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34: Route searching; Route guidance
    • G01C 21/3407: Route searching; Route guidance specially adapted for specific applications
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34: Route searching; Route guidance
    • G01C 21/36: Input/output arrangements for on-board computers
    • G01C 21/3626: Details of the output of route guidance instructions
    • G01C 21/3647: Guidance involving output of stored or live camera images or video streams
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02: Services making use of location information
    • H04W 4/024: Guidance services

Definitions

  • The present invention belongs to the field of Augmented Reality (AR) technology and relates in particular to a method and a terminal for discovering an AR target.
  • Background technique
  • AR is a technology that helps people acquire relevant information about objects in the real world in a more intuitive and visual way; it enhances the user's perception of the real world with information provided by computer systems. It is usually implemented by a mobile terminal or a Head Mounted Display (HMD); such a device includes at least one camera and one display in hardware, and may also include a positioning system and various sensors, such as a Global Positioning System (GPS) receiver, gyroscopes, light sensors, etc.
  • The existing solutions, however, cannot guide the user accurately enough to distinguish AR targets from non-AR targets; after reaching the vicinity of the AR target, the user must manually photograph the surrounding environment before the AR target can be identified.
  • The purpose of the embodiments of the present invention is to provide an AR target discovery method and system, aiming to solve the problem that the prior art can only locate large AR targets and cannot accurately identify small AR targets.
  • An AR target discovery method is provided, including: determining the state of the terminal when it is determined, from a pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near a pre-selected AR target; when it is determined that the terminal is in a search state, starting a camera and acquiring a picture; and when the acquired picture contains the AR target, prompting the AR target.
  • Determining arrival near the pre-selected AR target from the pre-generated navigation route and the moving speed of the terminal includes: acquiring, according to the pre-generated navigation route, the distance between the location of the AR target and the starting position of the terminal; acquiring, according to a motion mode pre-selected by the user, the average motion speed of that mode; calculating, from the distance and the average motion speed, an estimated time for the terminal to arrive near the AR target; and determining, after the terminal has started from its starting position and moved for the estimated time, that it has arrived near the AR target.
  • The method further includes: when the acquired picture does not contain the AR target, collecting the current position parameters of the terminal using the navigation and positioning system, and correcting the navigation route according to the position parameters.
  • The position parameter includes at least one of positioning parameter information of the terminal's current location and a prominent landmark of the terminal's current location.
  • The method further includes: outputting warning information when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
  • Determining the state of the terminal includes:
  • determining whether the terminal is in a search state by means of the gyroscope and of whether the preset functions are in use.
  • the warning information includes at least one of the following: vibration, ringing, or illuminating.
  • A terminal is provided, including: a state determining unit, configured to determine the state of the terminal when arrival near a pre-selected AR target is determined from a pre-generated navigation route and the moving speed of the terminal; a picture acquiring unit, configured to start a camera and acquire a picture when the state determining unit determines that the terminal is in a search state; and a prompting unit, configured to prompt the AR target when the picture acquired by the picture acquiring unit contains the AR target.
  • The state determining unit includes: a distance acquiring subunit, configured to acquire, according to the pre-generated navigation route, the distance between the location of the AR target and the starting position of the terminal; a speed acquiring subunit, configured to acquire, according to a motion mode pre-selected by the user, the average motion speed of that mode; an estimated-time acquiring subunit, configured to calculate, from the distance obtained by the distance acquiring subunit and the average motion speed obtained by the speed acquiring subunit, an estimated time for the terminal to arrive near the AR target; and a determining subunit, configured to determine, after the terminal has started from its starting position and moved for the estimated time, that it has arrived near the AR target.
  • The terminal further includes: a parameter collecting unit, configured to collect the current position parameters of the terminal using the navigation and positioning system when the picture acquired by the picture acquiring unit does not contain the AR target; and a correcting unit, configured to correct the navigation route according to the position parameters collected by the parameter collecting unit.
  • The position parameter includes at least one of positioning parameter information of the terminal's current location and a prominent landmark of the terminal's current location.
  • The terminal further includes: a warning unit, configured to output warning information when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
  • The state determining unit is specifically configured to determine, when arrival near the pre-selected AR target is determined from the pre-generated navigation route and the moving speed of the terminal, whether the terminal is in a search state by means of the gyroscope and of whether the preset functions are in use.
  • the warning information includes at least one of the following: vibration, ringing, or illuminating.
  • In the embodiments of the present invention, when the terminal is determined to be in the search state, a picture of the surrounding environment is acquired automatically, and one or more targets are extracted from the picture and matched against the AR target. The AR processing flow is thus started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover an AR target is greatly reduced.
  • FIG. 1 is a flowchart of an implementation of an AR target discovery method according to an embodiment of the present invention.
  • FIG. 2 is a flowchart of a specific implementation of step S101 of the AR target discovery method according to an embodiment of the present invention.
  • FIG. 3 is a flowchart of an implementation of the AR target discovery method for determining whether the terminal is in a search state according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of an implementation of an AR target discovery method according to another embodiment of the present invention.
  • FIG. 5 is a flowchart of an implementation of an AR target discovery method according to another embodiment of the present invention.
  • FIG. 6 is a structural block diagram of an AR target discovery apparatus according to an embodiment of the present invention.
  • FIG. 7 is a block diagram of the hardware structure of an AR target discovery apparatus according to an embodiment of the present invention.
  • FIG. 8 is a hardware implementation block diagram of an AR target discovery apparatus according to an embodiment of the present invention.
  • Detailed description
  • In step S101, when it is determined, from the pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near a pre-selected AR target, the state of the terminal is determined.
  • Determining the state of the terminal includes: determining whether the terminal is in a search state by means of the gyroscope and of whether the preset functions are in use.
  • In this embodiment, the terminal may obtain the AR target pre-selected by the user by receiving an input instruction and determine the location of the AR target; the navigation and positioning system then generates a navigation route from the terminal's current location (i.e. the starting position) to the location of the AR target, guiding the user to the pre-selected AR target.
  • The navigation and positioning system includes, but is not limited to, the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou system, etc.
  • Specifically, the terminal may perform an AR target search centred on a location the user selects on the map displayed by the terminal, and display the found AR targets on a radar map centred on the user's current location.
  • the user can select one or more AR targets through the radar map to generate a navigation route that directs the user to their pre-selected one or more AR targets.
  • The user may also pre-select a mode of transport to the destination, for example walking, cycling or driving; taking the terminal's location as the starting point, a navigation route can be generated from the departure place, the destination and the user's pre-selected motion mode, guiding the user to the vicinity of the selected AR target.
  • It should be noted that the navigation route generated by the navigation and positioning function can only guide the user to the vicinity of the AR target; it does not necessarily guide the user to the AR target's precise position.
  • Optionally, after the navigation route is determined, the terminal may temporarily switch off devices or function modules that are not needed while it moves, such as the camera, gyroscope, accelerometer, electronic compass and GPS, and switch them back on once it has moved to the vicinity of the AR target.
  • Alternatively, instead of using the navigation and positioning system for navigation, low-precision navigation based on the GSM/3G signal and base stations may be used, to save the terminal's energy.
  • Step S101 is specifically as follows:
  • Step S201: acquire, according to the pre-generated navigation route, the distance between the location of the AR target and the starting position of the terminal.
  • the terminal acquires the distance between the AR target location and the terminal start location according to the generated navigation route.
  • Step S202: acquire, according to the motion mode pre-selected by the user, the average moving speed of that motion mode.
  • If the user pre-selects walking, the average moving speed for walking can be obtained from the relevant configuration parameters, for example 5 km/h; the average moving speeds may be preset in the system for the different motion modes and may be estimated values.
  • Alternatively, the terminal can pre-record the speeds at which the user walks, runs, rides or otherwise moves, and estimate the average speed of each motion mode of the user from the statistics.
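  • The per-user speed statistics described above could be kept as a simple per-mode average with preset fall-backs; a minimal sketch (the class name, default values and API are illustrative assumptions, not from the patent):

```python
from statistics import mean

class MotionSpeedProfile:
    """Records observed speeds per motion mode and returns their average,
    falling back to a preset estimate when no history exists."""

    DEFAULTS_KMH = {"walking": 5.0, "running": 9.0, "cycling": 15.0}

    def __init__(self):
        self.history = {}  # mode -> list of observed speeds in km/h

    def record(self, mode: str, speed_kmh: float) -> None:
        self.history.setdefault(mode, []).append(speed_kmh)

    def average_speed(self, mode: str) -> float:
        samples = self.history.get(mode)
        return mean(samples) if samples else self.DEFAULTS_KMH[mode]
```

With no recorded trips the preset value (e.g. 5 km/h for walking, as in the example above) is used; once trips are recorded, the statistical average takes over.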
  • In step S203, an estimated time for the terminal to arrive near the AR target is calculated from the distance and the average moving speed.
  • In step S204, after the terminal has started from its starting position and moved for the estimated time, it is determined to have arrived near the AR target.
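  • The estimate in steps S201-S204 reduces to distance divided by average speed; a minimal sketch (function names and the speed table are illustrative assumptions, not from the patent):

```python
# Average speeds per motion mode, assumed preset as the description
# suggests (e.g. walking at 5 km/h).
AVERAGE_SPEED_KMH = {"walking": 5.0, "cycling": 15.0, "driving": 40.0}

def estimated_travel_time_hours(route_distance_km: float, motion_mode: str) -> float:
    """Step S203: estimated time (hours) to reach the vicinity of the AR target."""
    return route_distance_km / AVERAGE_SPEED_KMH[motion_mode]

def has_arrived(elapsed_hours: float, route_distance_km: float, motion_mode: str) -> bool:
    """Step S204: after moving for the estimated time, arrival is assumed."""
    return elapsed_hours >= estimated_travel_time_hours(route_distance_km, motion_mode)
```

A 2.5 km walking route at 5 km/h yields an estimate of half an hour, after which the terminal treats itself as near the target.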
  • Optionally, the terminal may set a task alarm based on the estimated time, so that when the terminal is estimated to be approaching the destination, the user is reminded in the form of a task alarm that the AR target is near and should watch for it to appear.
  • the estimated time may be corrected according to actual motion conditions of the terminal.
  • For example, a user who originally intended to carry the terminal and walk to the vicinity of the AR target may change their mind in the second half of the navigation route and decide to drive the remaining distance; the terminal then receives the motion-change instruction input by the user during the movement and switches the motion mode from "walking" to "driving".
  • In step S102, when it is determined that the terminal is in a search state, the camera is started and a picture is acquired.
  • At this point, sensors that were temporarily switched off, such as the gyroscope and the light sensor, may be re-enabled to collect the terminal state parameters and determine the terminal state.
  • Whether the terminal state parameters satisfy the preset conditions is used to judge the current state of the terminal, that is, whether the terminal is in a normal motion state or in a search state.
  • If the terminal is in a search state, the user is considered to be checking the surrounding environment through the terminal, looking for AR targets in it.
  • In that case, the camera is automatically turned on to take a picture, or a video stream is captured and image frames are extracted from it.
  • The automatic photographing or video capture above can be performed by a terminal such as a smartphone or a head-mounted device. Before targets are extracted from the photo or video frame, the image may be optimised, for example by zooming in or adjusting brightness or contrast (the processing is not limited to these methods; any processing that improves the visual quality of the content for the human eye may be used). An image recognition algorithm then extracts the N targets the user may be interested in, and it is determined whether these N targets include the AR target, where N is an integer greater than or equal to 1.
  • To reduce processing time, the system can first screen M of the attention targets in the picture (M is an integer with 1 ≤ M ≤ N) using a fast, low-precision recognition algorithm, and identify the remaining N-M targets using the precise recognition algorithm, to determine whether the AR target is among them.
  • The fast, low-precision recognition algorithm may mistake a non-AR target for an AR target, but never mistakes an AR target for a non-AR target; the precise recognition algorithm can reliably distinguish AR targets from non-AR targets, but takes a long time. With the above processing, the total processing time can therefore be reduced while the recognition accuracy is preserved.
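  • Given those error properties, the scheme amounts to a cheap high-recall filter followed by a precise check on the survivors; a sketch (the classifier functions are placeholders, and this filter-then-confirm reading is one interpretation of the description):

```python
def two_stage_recognition(candidates, fast_check, precise_check):
    """Run a cheap, high-recall filter first, then confirm the survivors
    with the slower, precise recognizer. fast_check may return false
    positives but, per the scheme's assumption, no false negatives,
    so anything it rejects can be discarded safely."""
    survivors = [c for c in candidates if fast_check(c)]
    return [c for c in survivors if precise_check(c)]
```

Because the fast filter discards most non-AR targets cheaply, the expensive precise check runs on far fewer candidates, reducing total time without losing accuracy.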
  • When a head-mounted device is used as the AR target discovery device, it can detect changes in the user's viewing angle. If, when a picture is acquired, the user is found to switch views quickly, that is, the time the user dwells on the current view is below a threshold T, the current picture is not optimised but only saved. If the dwell time exceeds the threshold T, the user is considered to be interested in some target in the current field of view, that is, the acquired picture is likely to contain an AR target, and the picture is processed further.
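  • The dwell-time gating just described can be sketched as follows (the threshold value and function names are illustrative assumptions):

```python
ATTENTION_THRESHOLD_S = 1.5  # illustrative value for the threshold T

def handle_frame(frame, attention_time_s, process, save):
    """Process a frame only if the user dwelt on the view longer than T;
    otherwise just save it without optimisation or recognition."""
    if attention_time_s > ATTENTION_THRESHOLD_S:
        return process(frame)   # view likely contains a target of interest
    save(frame)                 # quick glance: save only, skip processing
    return None
```

This keeps the expensive optimisation and recognition pipeline off frames the user merely glanced past.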
  • In step S103, when the acquired picture contains the AR target, the AR target is prompted.
  • In this embodiment, when the terminal is determined to be in the search state, a picture of the surrounding environment is acquired automatically, and one or more targets are extracted from the picture and matched against the AR target; the AR processing flow is thus started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover an AR target is greatly reduced.
  • It should be noted that the execution subject of steps S101 to S103 and of steps S201 to S204 is the terminal.
  • the terminal status parameter may include an average motion speed of the terminal from the departure point to the vicinity of the AR target, and the current motion speed of the terminal.
  • The step of determining whether the terminal is in the search state is specifically:
  • judging whether the terminal's current moving speed has fallen below the average moving speed by more than a preset threshold.
  • The terminal's average moving speed may be calculated from the travel time and the accumulated actual distance before the terminal reaches the vicinity of the AR target, or the average speed of the motion mode selected by the user may be used directly. Whether the difference between the current moving speed and the average moving speed exceeds the preset threshold then indicates whether the terminal has slowed down significantly; if it has, the terminal is considered to be in a search state.
  • Optionally, the terminal state parameters further include the running state of the terminal's preset functions, considering that a significant drop in the terminal's speed may also be caused by, for example, briefly answering a call or reading a short message. In that case, determining whether the terminal is in a search state is specifically:
  • The preset functions include, but are not limited to, voice calls, short messages, multimedia messages, alarm clocks, calendar events, etc. If it is determined that one or more of the preset functions is in use, monitoring of the change in the terminal's speed resumes; if it is determined that the preset functions have stopped, the terminal can be judged to be in a search state.
  • the terminal status parameter further includes a terminal motion state.
  • the step of determining whether the terminal is in the search state is specifically:
  • determining whether the smartphone or the head mounted device is in a steady state by a sensor such as a gyroscope.
  • determining whether the terminal motion state is in a steady state may be specifically:
  • Taking the gyroscope as an example: when the data collected by the gyroscope indicate that the terminal's motion is in a steady state, it can further be determined whether the time the terminal has been steady exceeds a preset time threshold. If it is below the threshold, no judgement is made until the steady time exceeds the threshold, after which the terminal is determined to be in the search state.
  • In addition, the change in pupil size can be used to identify whether the user is staring at a target: when a user pays attention to something, the pupils dilate. The size of the user's pupils can therefore be captured by the terminal's front camera and used as an auxiliary reference to further improve the accuracy of the terminal-state judgement.
  • FIG. 3 is a flowchart of an implementation of the AR target discovery method for determining whether a terminal is in a search state according to an embodiment of the present invention.
  • After determining that the terminal's moving speed has dropped significantly, the embodiment shown in FIG. 3 combines all the preset conditions on the terminal-state parameters described above; it can therefore maximise the accuracy of the terminal-state judgement.
  • The order of execution of steps S302 and S303 in the embodiment shown in FIG. 3 is not limited, and only S302 or only S303 may be executed; this is not restricted here.
  • In step S301, it is determined whether the terminal's current moving speed has fallen below the average moving speed by more than a preset threshold. If so, step S302 is performed; otherwise it is determined that the terminal is not in the search state.
  • In step S302, it is determined whether the preset functions of the terminal have stopped being used. If so, step S303 is performed; otherwise it is determined that the terminal is not in the search state.
  • In step S303, it is determined whether the terminal's motion is in a steady state. If so, step S304 is performed; otherwise it is determined that the terminal is not in the search state.
  • Optionally, after step S303 determines that the terminal's motion is in a steady state, it may further be determined whether the time the motion has been steady exceeds a preset time threshold; if it does, step S304 is performed.
  • In step S304, it is determined that the terminal is in the search state.
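  • Steps S301-S304 chain three checks: a significant speed drop, idle preset functions, and a sustained steady state. A combined sketch (parameter names and the threshold semantics are illustrative assumptions):

```python
def is_in_search_state(current_speed, average_speed, speed_drop_threshold,
                       preset_functions_in_use, steady_time_s, steady_threshold_s):
    """Return True only if all three conditions of S301-S303 hold."""
    # S301: the speed must have dropped significantly below the average.
    if average_speed - current_speed <= speed_drop_threshold:
        return False
    # S302: calls, SMS, alarms etc. must not explain the slowdown.
    if preset_functions_in_use:
        return False
    # S303: the gyroscope must report a steady state held long enough.
    if steady_time_s <= steady_threshold_s:
        return False
    return True  # S304: the terminal is in the search state
```

Each early return corresponds to the "not in the search state" branch of the flowchart in FIG. 3.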
  • Since the steps after step S301 all rely on the terminal's sensors or related detection devices, these may preferably remain off or on standby until step S301 determines that the user's moving speed has changed significantly, and only then be switched on, so as to save the energy of the terminal device. It should be noted that the execution subject of steps S301 to S304 is the terminal.
  • FIG. 4 shows the implementation flow of an AR target discovery method according to another embodiment of the present invention. Building on the embodiment shown in FIG. 1, this embodiment describes the case in which the one or more identified targets do not include the AR target, as follows:
  • In step S104, when the acquired picture does not contain the AR target, the current positioning parameters of the terminal are collected using the navigation and positioning system.
  • In step S105, the navigation route is corrected according to the position parameters.
  • The position parameters include, but are not limited to, the GPS parameters of the terminal's current location and a prominent landmark at the current location. The GPS parameters can be used to determine whether the current position is near the AR target, and the observed landmark can be matched against the built-in landmarks near the AR target for the same purpose; if the position turns out to be wrong, the navigation route is re-corrected and the user is guided onward.
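  • The two proximity checks behind the correction step can be sketched as follows (coordinates, radius and landmark sets are hypothetical; a real implementation would use geodesic distance rather than a flat plane):

```python
from math import hypot

def near_target(current_xy, target_xy, radius):
    """Positioning check: is the current fix within `radius` of the target?"""
    return hypot(current_xy[0] - target_xy[0],
                 current_xy[1] - target_xy[1]) <= radius

def confirm_position(current_xy, target_xy, radius, seen_landmark, target_landmarks):
    """Confirm proximity by the positioning fix or by matching a prominent
    landmark stored near the AR target; if neither holds, the navigation
    route needs correcting."""
    return near_target(current_xy, target_xy, radius) or seen_landmark in target_landmarks
```

When `confirm_position` is false, the terminal re-corrects the route and guides the user onward, as in step S105.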
  • FIG. 5 shows the implementation flow of an AR target discovery method according to another embodiment of the present invention. Specifically, before step S101, after step S103 or after step S105, the method further includes step S106: outputting warning information when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
  • The warning information includes at least one of the following: vibration, ringing or lighting. With it, the user can be made aware of the specific orientation of the AR target in the current environment and is further guided to discover the AR target.
  • Optionally, the warning information may also be output when the AR task alarm expires, or before or after it expires, to prompt the user to pay attention.
  • the AR task alarm is used to prompt the user to reach the vicinity of the AR target.
  • In this embodiment, when the terminal is determined to be in the search state, a picture of the surrounding environment is acquired automatically, and one or more targets are extracted from the picture and matched against the AR target; the AR processing flow is thus started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover an AR target is greatly reduced.
  • FIG. 6 is a structural block diagram of a terminal for discovering an AR target according to an embodiment of the present invention.
  • The terminal may be an AR terminal such as a smart mobile terminal or a head-mounted device, and is used to run the methods described in the embodiments of FIGS. 1 to 5 of the present invention. For convenience of explanation, only the parts related to this embodiment are shown.
  • the apparatus includes:
  • The state determining unit 61 determines the state of the terminal when it is determined, from the pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near the pre-selected AR target.
  • The picture obtaining unit 62 starts the camera and acquires a picture when the state determining unit 61 determines that the terminal is in the search state.
  • the prompting unit 63 prompts the AR target when the image acquired by the image obtaining unit 62 includes the AR target.
  • the state determining unit 61 includes:
  • the distance obtaining sub-unit acquires a distance between the location of the AR target and the start position of the terminal according to the navigation route generated in advance.
  • the speed acquisition subunit acquires the average motion speed of the motion mode according to a motion mode preselected by the user.
  • The time acquisition subunit calculates, from the distance obtained by the distance acquisition subunit and the average motion speed obtained by the speed acquisition subunit, an estimated time for the terminal to arrive near the AR target.
  • The determining subunit determines, after the terminal has started from its starting position and moved for the estimated time, that the terminal has arrived near the AR target.
  • The state determining unit 61 is specifically configured to determine, when arrival near the pre-selected AR target is determined from the pre-generated navigation route and the moving speed of the terminal, whether the terminal is in the search state by means of the gyroscope and of whether the preset functions are in use.
  • the terminal further includes:
  • The parameter collection unit collects the current position parameters of the terminal using the navigation and positioning system when the picture acquired by the picture obtaining unit 62 does not contain the AR target.
  • The position parameter includes at least one of positioning parameter information of the terminal's current location and a prominent landmark of the terminal's current location.
  • the terminal further includes:
  • The warning unit outputs warning information when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
  • the warning information includes at least one of the following: vibration, ringing, or lighting.
  • FIG. 7 is a block diagram showing the hardware structure of a terminal for discovering an AR target according to an embodiment of the present invention.
  • the terminal may be a terminal such as a smart mobile terminal or a head mounted device for operating the method described in the embodiments of FIG. 1 to FIG. 5 of the present invention. For the convenience of explanation, only the parts related to the present embodiment are shown.
  • The apparatus includes a processor 71, a memory 72 and a bus 73; the processor 71 and the memory 72 communicate with each other via the bus 73, the memory 72 stores a program, and the processor 71 executes the program stored in the memory 72. When executed, the program is used to:
  • determine the state of the terminal when it is determined, from the pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near a pre-selected AR target; when it is determined that the terminal is in a search state, start the camera and acquire a picture;
  • when the acquired picture contains the AR target, prompt the AR target.
  • Determining the state of the terminal includes:
  • determining whether the terminal is in a search state by means of the gyroscope and of whether the preset functions are in use.
  • Determining arrival near the pre-selected AR target from the pre-generated navigation route and the moving speed of the terminal includes:
  • After the terminal has started from its starting position and moved for the estimated time, it is determined to have arrived near the AR target.
  • When the acquired picture does not contain the AR target, the navigation and positioning system is used to collect the current position parameters of the terminal;
  • the navigation route is corrected based on the positional parameter.
  • The position parameter includes at least one of positioning parameter information of the terminal's current location and a prominent landmark of the terminal's current location.
  • a warning message is output when it is determined that the terminal performs irregular motion near the AR target and eventually moves away from the AR target.
  • the warning information includes at least one of the following: vibration, ringing, or lighting.
  • FIG. 8 provides, based on the embodiment shown in FIG. 6 of the present invention, a hardware implementation block diagram of the AR target discovery system according to an embodiment of the present invention.
  • the embodiment of the present invention automatically acquires a picture of the surrounding environment, extracts one or more targets from the picture, and matches them against the AR target, thereby automatically starting the AR processing flow in a markerless environment, accurately discovering small AR targets, and greatly reducing the time needed to discover AR targets.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Navigation (AREA)
  • Telephone Function (AREA)
  • Instructional Devices (AREA)

Abstract

The present invention is applicable to the field of AR technology, and provides an AR target discovery method and terminal, including: when it is determined, by using a pre-generated navigation route and the moving speed of a terminal, that the terminal has arrived near a pre-selected AR target, determining the state of the terminal; when it is determined that the terminal is in a search state, starting a camera and acquiring a picture; and when the acquired picture contains the AR target, prompting the AR target.

Description

Method and Terminal for Discovering an Augmented Reality Target

Technical Field

The present invention belongs to the technical field of augmented reality (AR), and in particular relates to a method and a terminal for discovering an AR target.

Background

AR is a technology that helps people obtain information about objects in the real world in a more intuitive and vivid manner; it enhances the user's perception of the real world with information provided by a computer system. AR is usually implemented on a mobile terminal or a head-mounted display (HMD). In hardware, such a device includes at least one camera and one display device, and may further include a positioning system and various sensors, such as a Global Positioning System (GPS) receiver, a gyroscope, and a light sensor.

In existing solutions for accurately discovering an AR target with the above devices, if the AR target is a large target, for example a landmark building, GPS alone can guide the user to the AR target. However, when the AR target is a small target, for example several mobile phone models on an indoor exhibition stand, or several cultural relics in a museum, the existing solutions cannot guide the user precisely enough to distinguish AR targets from non-AR targets; the user has to photograph the surroundings manually after arriving near the AR target before the AR target can be further identified.

Summary
An objective of the embodiments of the present invention is to provide an AR target discovery method and system, aiming to solve the prior-art problem that only large AR targets can be located while small AR targets cannot be precisely identified.
According to a first aspect, an AR target discovery method is provided, including: when it is determined, by using a pre-generated navigation route and the moving speed of a terminal, that the terminal has arrived near a pre-selected AR target, determining the state of the terminal; when it is determined that the terminal is in a search state, starting a camera and acquiring a picture; and when the acquired picture contains the AR target, prompting the AR target.
In a first possible implementation of the first aspect, the determining, by using the pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near the pre-selected AR target includes: obtaining, according to the pre-generated navigation route, the distance between the position of the AR target and the starting position of the terminal; obtaining, according to a movement mode pre-selected by the user, the average movement speed of the movement mode; calculating, according to the distance and the average movement speed, the estimated time for the terminal to arrive near the AR target; and after the terminal starts from the starting position and has moved for the estimated time, determining that the terminal has arrived near the AR target.
With reference to the first aspect or the first possible implementation of the first aspect, in a second possible implementation, the method further includes: when the acquired picture does not contain the AR target, collecting the current location parameter of the terminal by using a navigation positioning system; and correcting the navigation route according to the location parameter.
With reference to the second possible implementation of the first aspect, in a third possible implementation, the location parameter includes at least one of positioning parameter information of the current location of the terminal and a prominent landmark of the current location of the terminal.
With reference to the first aspect or any one of its possible implementations, in a fourth possible implementation, the method further includes: when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target, outputting warning information.
With reference to the first aspect or any one of its possible implementations, the determining the state of the terminal includes:

determining whether the terminal is in a search state by using a gyroscope and by whether preset functions are in use.
With reference to the fourth possible implementation of the first aspect, in a fifth possible implementation, the warning information includes at least one of the following: vibration, a ringtone, or light.
According to a second aspect, a terminal is provided, including: a state determining unit, configured to determine the state of the terminal when it is determined, by using a pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near a pre-selected AR target; a picture acquiring unit, configured to start a camera and acquire a picture when the state determining unit determines that the terminal is in a search state; and a prompting unit, configured to prompt the AR target when the picture acquired by the picture acquiring unit contains the AR target.
In a first possible implementation of the second aspect, the state determining unit includes: a distance obtaining subunit, configured to obtain, according to the pre-generated navigation route, the distance between the position of the AR target and the starting position of the terminal; a speed obtaining subunit, configured to obtain, according to a movement mode pre-selected by the user, the average movement speed of the movement mode; an estimated-time obtaining subunit, configured to calculate, according to the distance obtained by the distance obtaining subunit and the average movement speed obtained by the speed obtaining subunit, the estimated time for the terminal to arrive near the AR target; and a determining subunit, configured to determine, after the terminal starts from the starting position and has moved for the estimated time, that the terminal has arrived near the AR target.
With reference to the second aspect or the first possible implementation of the second aspect, in a second possible implementation, the terminal further includes: a parameter collecting unit, configured to collect the current location parameter of the terminal by using a navigation positioning system when the picture acquired by the picture acquiring unit does not contain the AR target; and a correcting unit, configured to correct the navigation route according to the location parameter collected by the parameter collecting unit.
With reference to the second possible implementation of the second aspect, in a third possible implementation, the location parameter includes at least one of positioning parameter information of the current location of the terminal and a prominent landmark of the current location of the terminal.
With reference to the second aspect or any one of its possible implementations, in a fourth possible implementation, the terminal further includes: a warning unit, configured to output warning information when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
With reference to the second aspect or any one of its possible implementations, the state determining unit is specifically configured to determine, when it is determined by using the pre-generated navigation route and the moving speed of the terminal that the terminal has arrived near the pre-selected AR target, whether the terminal is in a search state by using a gyroscope and by whether preset functions are in use.
With reference to the fourth possible implementation of the second aspect, in a fifth possible implementation, the warning information includes at least one of the following: vibration, a ringtone, or light.
In the embodiments of the present invention, when it is determined that the terminal is in a search state, a picture of the surrounding environment is acquired automatically, and one or more targets extracted from the picture are matched against the AR target, so that the AR processing flow is started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover an AR target is greatly reduced.

Brief Description of the Drawings
FIG. 1 is a flowchart of an AR target discovery method according to an embodiment of the present invention;

FIG. 2 is a flowchart of a specific implementation of step S101 of the AR target discovery method according to an embodiment of the present invention;

FIG. 3 is a flowchart of determining whether the terminal is in a search state in the AR target discovery method according to an embodiment of the present invention;

FIG. 4 is a flowchart of an AR target discovery method according to another embodiment of the present invention;

FIG. 5 is a flowchart of an AR target discovery method according to another embodiment of the present invention;

FIG. 6 is a structural block diagram of an AR target discovery apparatus according to an embodiment of the present invention;

FIG. 7 is a hardware structural block diagram of an AR target discovery apparatus according to an embodiment of the present invention;

FIG. 8 is a hardware implementation block diagram of an AR target discovery apparatus according to an embodiment of the present invention.

Detailed Description
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to explain the present invention and are not intended to limit the present invention.
FIG. 1 is a flowchart of an AR target discovery method according to an embodiment of the present invention, detailed as follows:

In step S101, when it is determined, by using a pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near the pre-selected AR target, the state of the terminal is determined.

The determining the state of the terminal includes: determining whether the terminal is in a search state by using a gyroscope and by whether preset functions are in use.
In this embodiment, the terminal may receive an input instruction to obtain the AR target pre-selected by the user and determine the position of the AR target, and then generate a navigation route by using a navigation positioning system according to the current position of the terminal (that is, the starting position) and the position of the AR target, so as to guide the user to the vicinity of the pre-selected AR target. The navigation positioning system includes, but is not limited to, the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou system, and so on.

Specifically, the terminal may obtain a position selected by the user on a map displayed on the terminal, search for AR targets centered on that position, and display the found AR targets on a radar chart centered on the user's current position. The user may select one or more AR targets on the radar chart, so that a navigation route is generated to guide the user to the vicinity of the one or more pre-selected AR targets. When the user has selected a means of travel to the destination, for example walking, cycling, or driving, the current position of the terminal may be taken as the departure point, and the navigation route is generated according to the departure point, the destination, and the movement mode pre-selected by the user, so as to guide the user to the area near the selected AR target. It should be noted that, because of the limited positioning accuracy of the navigation positioning system, when the AR target is a small target, the navigation route generated by the navigation function can only guide the user to the vicinity of the AR target and does not necessarily guide the user to the exact position of the AR target.
Preferably, after generating the navigation route and before completing it, the terminal may temporarily switch off components or function modules that are not needed while the terminal is moving, such as the camera, the gyroscope, the accelerometer, the electronic compass, and GPS, and switch them back on after the terminal determines that it has moved near the AR target. Meanwhile, while the terminal is moving, the navigation positioning system is not used for navigation; instead, low-accuracy navigation based on GSM/3G signals and base stations is used, so as to save the terminal's energy.
In this embodiment, after generating the navigation route by using the navigation positioning system, the terminal may determine, according to the generated navigation route and the moving speed of the terminal, whether it has arrived near the pre-selected AR target. Specifically, as shown in FIG. 2, step S101 is as follows:
Step S201: obtain, according to the pre-generated navigation route, the distance between the position of the AR target and the starting position of the terminal.

The terminal obtains, according to the generated navigation route, the distance between the position of the AR target and the starting position of the terminal.

In step S202, the average movement speed of the movement mode pre-selected by the user is obtained.

For example, when the user chooses walking, the average walking speed, for example 5 km/h, may be obtained from the relevant configuration parameters. The obtained average movement speed may be preset in the system for different movement modes and may be an estimated value. Likewise, the terminal may record in advance the user's speed when walking, running, cycling, or moving in other ways, and estimate the average speed of each of the user's movement modes from the statistics.
In step S203, the estimated time for the terminal to arrive near the AR target is calculated according to the distance and the average movement speed.

In step S204, after the terminal starts from the starting position and has moved for the estimated time, it is determined that the terminal has arrived near the AR target.
In this embodiment, the terminal may set a task alarm according to the estimated time, so as to remind the user, in the form of a task alarm, that the terminal is approaching the AR target at the estimated moment of approaching the destination, thereby prompting the user to watch for the AR target.
Preferably, the estimated time may be corrected according to the actual movement of the terminal. For example, if the user originally intended to walk to the vicinity of the AR target with the terminal but changes his or her mind in the second half of the route and decides to drive the rest of the way, then when the movement mode changes, the terminal receives a movement-mode change instruction input by the user while moving, the movement mode is changed from "walking" to "driving", and the estimated time is adjusted accordingly.
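The arrival-time estimate of steps S201 to S204, including the re-estimate after a mid-route change of movement mode, can be sketched as follows. This is a minimal illustration only; the mode names and the preset average speeds in `AVERAGE_SPEED_KMH` are assumptions, since the patent leaves the concrete presets to the implementation.

```python
from dataclasses import dataclass

# Assumed per-mode average speeds (km/h); the patent only says such
# presets exist and may be estimated from the user's own statistics.
AVERAGE_SPEED_KMH = {"walking": 5.0, "cycling": 15.0, "driving": 40.0}

@dataclass
class RouteEta:
    distance_km: float  # route distance from the starting position to the AR target
    mode: str           # movement mode pre-selected by the user

    def estimated_hours(self) -> float:
        """Steps S201-S203: estimated time = route distance / average speed."""
        return self.distance_km / AVERAGE_SPEED_KMH[self.mode]

    def reestimate_on_mode_change(self, travelled_km: float, new_mode: str) -> float:
        """On a movement-mode change instruction (e.g. walking -> driving),
        re-estimate the remaining time from the remaining distance."""
        remaining_km = self.distance_km - travelled_km
        return remaining_km / AVERAGE_SPEED_KMH[new_mode]

eta = RouteEta(distance_km=2.5, mode="walking")
print(round(eta.estimated_hours() * 60))  # 30 minutes of walking at 5 km/h
print(round(eta.reestimate_on_mode_change(1.5, "driving") * 60))  # remaining 1 km by car
```

A terminal would feed `estimated_hours()` into the task alarm described above and re-run the estimate whenever a movement-mode change instruction arrives.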
In step S102, when it is determined that the terminal is in a search state, the camera is started and a picture is acquired.

In this embodiment, after step S101 determines that the terminal has arrived near the AR target, the previously switched-off sensors such as the gyroscope and the light sensor may be switched back on to collect terminal state parameters, and whether the terminal state parameters satisfy preset conditions is judged in order to determine the current state of the terminal, that is, whether the terminal is in a normal movement state or in a search state. When it is determined that the terminal is in a search state, the user is considered to be possibly examining the surroundings through the terminal and looking for an AR target in the surrounding environment. The details of judging whether the terminal state parameters satisfy the preset conditions will be described in subsequent embodiments of the present invention and are not repeated here.
After the terminal determines that it is in a search state, it automatically starts the camera to take photos, or captures a video stream to extract image frames. Both the automatic photographing and the video-stream capture can be implemented by a terminal such as a smartphone or a head-mounted device. Before the photos or the image frames of the video stream are obtained, the photos or image frames may be optimized, for example enlarged, or adjusted in brightness or contrast (including but not limited to these methods; the processing here may be any method that improves the viewing quality of the visual content and makes it more suitable for human eyes). Then an image recognition algorithm extracts N targets of interest that the user may be paying attention to from the picture, so as to judge whether these N targets of interest include the AR target, where N is an integer greater than or equal to 1.
Specifically, when N targets of interest (N being an integer greater than or equal to 1) can be extracted from a picture, the system may first process M of the targets of interest (M being an integer greater than or equal to 1 and less than or equal to N) with a fast, low-precision recognition algorithm, and then recognize the remaining N-M targets of interest with a precise recognition algorithm, so as to judge whether the targets of interest include the AR target. In this embodiment, the fast, low-precision recognition algorithm may recognize a non-AR target as an AR target but does not recognize an AR target as a non-AR target, whereas the precise recognition algorithm distinguishes AR targets from non-AR targets accurately but takes longer. Therefore, the above processing reduces the total processing time while guaranteeing recognition accuracy.
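One plausible reading of this two-stage scheme can be sketched with toy matchers: the fast matcher (some false positives, no false negatives) pre-filters the targets of interest, and the slow precise matcher only confirms the surviving candidates. The matcher functions below are illustrative assumptions, not algorithms named by the patent.

```python
def discover_ar_targets(targets, fast_match, precise_match):
    """Two-stage recognition sketch: `fast_match` is cheap and may yield
    false positives but never false negatives; `precise_match` is exact
    but slow.  Running the precise matcher only on the fast matcher's
    candidates keeps accuracy while cutting total processing time."""
    candidates = [t for t in targets if fast_match(t)]   # cheap pre-filter
    return [t for t in candidates if precise_match(t)]   # exact confirmation

# Toy matchers: anything containing "ar" is a candidate; only "ar-model"
# is the actual AR target.
fast = lambda t: "ar" in t
precise = lambda t: t == "ar-model"
print(discover_ar_targets(["ar-model", "artifact", "statue"], fast, precise))
# ['ar-model']
```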
Preferably, when a head-mounted device is used as the AR target discovery device, since the head-mounted device can detect changes in the user's viewing angle, if it is found after acquiring a picture that the user switched the current viewing angle quickly, that is, the time the user stayed in the attention state is less than a threshold T, the current picture is not optimized but only saved; if the time the user stays in the attention state exceeds the threshold T, the user is considered to be very interested in some target in the current field of view, that is, the currently acquired picture is very likely to contain the AR target, and the picture is processed further.
In step S103, when the acquired picture contains the AR target, the AR target is prompted.

In the embodiments of the present invention, when it is determined that the terminal is in a search state, a picture of the surrounding environment is acquired automatically, and one or more targets extracted from the picture are matched against the AR target, so that the AR processing flow is started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover an AR target is greatly reduced.

It should be noted that steps S101 to S103 and steps S201 to S204 are performed by the terminal.
Next, specific embodiments of judging whether the terminal state parameters satisfy the preset conditions are described in detail.

As an embodiment of the present invention, the terminal state parameters may include the average movement speed of the terminal from the departure point to the vicinity of the AR target, as well as the current movement speed of the terminal. In this case, the step of judging whether the terminal is in a search state is specifically: judging whether the difference by which the terminal's current movement speed is lower than the average movement speed is greater than a preset threshold.

In this embodiment, the terminal speed is monitored, and whether the terminal is in a search state is judged from the change of the terminal speed. Specifically, the average movement speed of the terminal may be calculated from the movement time and the accumulated actual movement distance before the terminal arrives near the AR target, or the average movement speed of the movement mode pre-selected by the user may be obtained directly; and whether the terminal's movement speed has dropped noticeably is judged by whether the difference between the current movement speed and the average movement speed is greater than the preset threshold. If so, the terminal is considered to be in a search state.
Further, preferably, the terminal state parameters also include the running state of the terminal's preset functions. Considering that a noticeable drop in the terminal's movement speed may also be caused by, for example, temporarily answering a call or checking a text message, the step of judging whether the terminal is in a search state is specifically:

when the difference by which the terminal's current movement speed is lower than the average movement speed is greater than the preset threshold, judging whether the preset functions are all out of use.

In this embodiment, the preset functions include, but are not limited to, voice calls, SMS, MMS, alarms, calendar events, and the like. If it is judged that one or more of the preset functions are being used, it is judged that the terminal is not in a search state, and monitoring of the change of the terminal's movement speed is resumed; if it is judged that the preset functions are all out of use, it can be judged that the terminal is in a search state.
Further, preferably, the terminal state parameters also include the terminal's motion state. In this case, the step of judging whether the terminal is in a search state is specifically:

when the difference by which the terminal's current movement speed is lower than the average movement speed is greater than the preset threshold, judging whether the terminal's motion state is steady.

Specifically, whether the smartphone or head-mounted device is in a steady state may be judged by a sensor such as a gyroscope. When the terminal's motion state is steady, it is further determined that the user is paying attention to a certain target.

Further, judging whether the terminal's motion state is steady may be specifically: judging whether the time for which the terminal's motion state has been steady is greater than a preset time threshold.

In this embodiment, considering that the terminal's motion state can change abruptly, when the data collected by the gyroscope indicates that the terminal's motion state is steady, it may further be judged whether the time for which the motion state has been steady is greater than the preset time threshold. If it is less than the preset time threshold, no judgment is made until the time for which the terminal's motion state has been steady is greater than the preset time threshold, and only then is the terminal determined to be in a search state.

Preferably, when the terminal also has a pupil-monitoring function, changes in pupil size may be used to recognize whether the user is staring at a target. For example, when the user is paying attention to something, the pupils of the eyes dilate; the size of the user's pupils can then be captured by the terminal's front camera and used as an auxiliary reference to further improve the accuracy of judging the terminal's state.
FIG. 3 is a flowchart of judging whether the terminal is in a search state in the AR target discovery method according to an embodiment of the present invention. After judging that the terminal's movement speed has dropped significantly, the embodiment shown in FIG. 3 combines all of the above preset conditions that the terminal state parameters need to satisfy; through the embodiment shown in FIG. 3, the accuracy of judging the terminal's state can be maximized. Obviously, in an actual judgment process, the execution order of steps S302 and S303 in the embodiment shown in FIG. 3 is not limited, and it is also possible to perform only S302 or only S303, which is not limited here. Specifically, as shown in FIG. 3:

In step S301, it is judged whether the difference by which the terminal's current movement speed is lower than the average movement speed is greater than the preset threshold; if so, step S302 is performed; otherwise, it is determined that the terminal is not in a search state.

In step S302, it is judged whether the terminal's preset functions are all out of use; if so, step S303 is performed; otherwise, it is determined that the terminal is not in a search state.

In step S303, it is judged whether the terminal's motion state is steady; if so, step S304 is performed; otherwise, it is determined that the terminal is not in a search state.

Further, in step S303, after it is judged that the terminal's motion state is steady, it may also be judged whether the time for which the terminal's motion state has been steady is greater than the preset time threshold; if the time for which the motion state has been steady is greater than the preset time threshold, step S304 is performed.

In step S304, it is determined that the terminal is in a search state.
It should be noted that, in the embodiment shown in FIG. 3 of the present invention, since the steps after step S301 are implemented by the terminal's sensors or related detection apparatuses, preferably, before step S301 detects a significant change in the terminal's movement speed, the terminal's sensors or related detection apparatuses may be kept switched off or in standby, and are switched on for detection only after step S301 detects a significant change in the user's movement speed, so as to save the terminal device's energy. It should be noted that steps S301 to S304 are performed by the terminal.
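The combined judgment of FIG. 3 (steps S301 to S304) can be sketched as a single predicate. All thresholds and parameter names below are illustrative assumptions, not values given by the patent.

```python
def in_search_state(avg_speed, cur_speed, speed_drop_threshold,
                    preset_functions_idle, steady_seconds, steady_threshold_s):
    """Combined check from FIG. 3: the terminal is deemed to be searching when
    (1) its speed has dropped well below the route average (S301), (2) no
    preset function (call, SMS, alarm, ...) is in use (S302), and (3) the
    gyroscope reports a steady attitude for longer than a preset time (S303)."""
    if avg_speed - cur_speed <= speed_drop_threshold:   # S301: no clear slowdown
        return False
    if not preset_functions_idle:                       # S302: phone is busy
        return False
    return steady_seconds > steady_threshold_s          # S303 -> S304

print(in_search_state(5.0, 0.5, 3.0, True, 4.0, 2.0))   # True: slowed, idle, steady
print(in_search_state(5.0, 4.5, 3.0, True, 4.0, 2.0))   # False: speed barely dropped
```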
FIG. 4 shows the implementation flow of an AR target discovery method according to another embodiment of the present invention. Based on the embodiment shown in FIG. 1 of the present invention, this embodiment describes the case in which the one or more identified targets of interest do not include the AR target, detailed as follows:

In step S104, when the acquired picture does not contain the AR target, the current location parameter of the terminal is collected by using the navigation positioning system.

In step S105, the navigation route is corrected according to the location parameter.
In this embodiment, when the extracted targets of interest do not include the AR target, it is considered whether the terminal has actually been guided to the vicinity of the AR target; in that case, the navigation route is corrected by collecting the location parameter of the current location. Specifically, the location parameter includes, but is not limited to, parameter information such as the GPS parameter of the terminal's current location and a prominent landmark of the terminal's current location. The GPS parameter can be used to determine whether the current location is near the AR target, and the prominent landmark of the current location can be matched against the system's built-in prominent landmarks near the AR target to judge whether the current location is near the AR target, so that, if the location is wrong, the navigation route is corrected and the user is guided onward.
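A minimal sketch of this route check (steps S104 and S105): the current GPS fix and the visible prominent landmarks are tested against what is expected near the AR target, and the route is re-planned only when neither matches. `near_target` and `replan` stand in for the navigation system's own calls and are assumptions for illustration.

```python
def verify_and_correct_route(gps_fix, near_target, visible_landmarks,
                             expected_landmarks, replan):
    """When the captured picture contains no AR target, check whether the
    terminal is really near the target using the GPS fix and/or prominent
    landmarks of the current location; if not, re-plan the navigation route."""
    near_by_gps = near_target(gps_fix)
    near_by_landmark = bool(set(visible_landmarks) & set(expected_landmarks))
    if near_by_gps or near_by_landmark:
        return None                  # position is plausible: keep the route
    return replan(gps_fix)           # wrong place: correct the route

route = verify_and_correct_route(
    gps_fix=(22.54, 114.06),
    near_target=lambda fix: False,        # GPS says we are off-target
    visible_landmarks=["fountain"],
    expected_landmarks=["clock tower"],   # no landmark match either
    replan=lambda fix: f"new route from {fix}",
)
print(route)  # new route from (22.54, 114.06)
```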
Considering the case in which the user is near the AR target but cannot determine the exact direction of the AR target, FIG. 5 shows the implementation flow of an AR target discovery method according to another embodiment of the present invention. Specifically, before step S101, or after step S103, or after step S105, the method further includes:

Step S106: when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target, outputting warning information.
The warning information includes at least one of the following: vibration, a ringtone, or light. This enables the user to ascertain the exact direction of the AR target in the current environment, further guiding the user to discover the AR target.
Alternatively, in the case where the user is reminded by an AR task alarm that the terminal is arriving near the AR target, the warning information may also be output when the AR task alarm goes off, or before or after it goes off, to remind the user to watch for the AR target.
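One way to make "irregular motion near the AR target that finally moves away" concrete, purely as an illustrative assumption, is to watch the recent distance-to-target samples for repeated direction changes that end farther away than they began:

```python
def should_warn(distances_to_target, window=5, jitter_threshold=2):
    """Warning sketch for step S106: treat the terminal as moving irregularly
    near the AR target and finally moving away when, within a recent window,
    the distance to the target flips direction several times (jitter) and
    ends farther away than it started.  Thresholds are illustrative."""
    recent = distances_to_target[-window:]
    direction_changes = sum(
        1 for a, b, c in zip(recent, recent[1:], recent[2:])
        if (b - a) * (c - b) < 0          # sign flip = the user turned around
    )
    moving_away = recent[-1] > recent[0]
    return direction_changes >= jitter_threshold and moving_away

print(should_warn([12, 9, 11, 8, 10, 13]))  # True: wandering, then drifting away
print(should_warn([12, 11, 10, 9, 8]))      # False: steadily approaching
```

When `should_warn` returns True, the terminal would emit the vibration, ringtone, or light warning described in step S106.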
In the embodiments of the present invention, when it is determined that the terminal is in a search state, a picture of the surrounding environment is acquired automatically, and one or more targets extracted from the picture are matched against the AR target, so that the AR processing flow is started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover an AR target is greatly reduced.
FIG. 6 shows a structural block diagram of a terminal for discovering an AR target according to an embodiment of the present invention. The terminal may be located in an AR terminal such as a smart mobile terminal or a head-mounted device, and is configured to perform the method described in the embodiments of FIG. 1 to FIG. 5 of the present invention. For convenience of explanation, only the parts related to this embodiment are shown. Referring to FIG. 6, the apparatus includes:
a state determining unit 61, configured to determine the state of the terminal when it is determined, by using a pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near a pre-selected AR target;

a picture acquiring unit 62, configured to start a camera and acquire a picture when the state determining unit 61 determines that the terminal is in a search state; and

a prompting unit 63, configured to prompt the AR target when the picture acquired by the picture acquiring unit 62 contains the AR target.
Optionally, the state determining unit 61 includes:

a distance obtaining subunit, configured to obtain, according to the pre-generated navigation route, the distance between the position of the AR target and the starting position of the terminal;

a speed obtaining subunit, configured to obtain, according to a movement mode pre-selected by the user, the average movement speed of the movement mode;

an estimated-time obtaining subunit, configured to calculate, according to the distance obtained by the distance obtaining subunit and the average movement speed obtained by the speed obtaining subunit, the estimated time for the terminal to arrive near the AR target; and

a determining subunit, configured to determine, after the terminal starts from the starting position and has moved for the estimated time, that the terminal has arrived near the AR target.
Optionally, the state determining unit 61 is specifically configured to determine, when it is determined by using the pre-generated navigation route and the moving speed of the terminal that the terminal has arrived near the pre-selected AR target, whether the terminal is in a search state by using a gyroscope and by whether preset functions are in use.
Optionally, the terminal further includes:

a parameter collecting unit, configured to collect the current location parameter of the terminal by using a navigation positioning system when the picture acquired by the picture acquiring unit 62 does not contain the AR target; and

a correcting unit, configured to correct the navigation route according to the location parameter collected by the parameter collecting unit.

Optionally, the location parameter includes at least one of positioning parameter information of the current location of the terminal and a prominent landmark of the current location of the terminal.
Optionally, the terminal further includes:

a warning unit, configured to output warning information when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.

Further, the warning information includes at least one of the following: vibration, a ringtone, or light.
FIG. 7 shows a hardware structural block diagram of a terminal for discovering an AR target according to an embodiment of the present invention. The terminal may be a terminal such as a smart mobile terminal or a head-mounted device, and is configured to perform the method described in the embodiments of FIG. 1 to FIG. 5 of the present invention. For convenience of explanation, only the parts related to this embodiment are shown.
Referring to FIG. 7, the apparatus includes a processor 71, a memory 72, and a bus 73, where the processor 71 and the memory 72 communicate with each other through the bus 73, the memory 72 is configured to store a program, and the processor 71 is configured to execute the program stored in the memory 72. The program, when executed, is used to:
determine the state of the terminal when it is determined, by using a pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near a pre-selected AR target;

start a camera and acquire a picture when it is determined that the terminal is in a search state; and

prompt the AR target when the acquired picture contains the AR target.
Optionally, the determining the state of the terminal includes:

determining whether the terminal is in a search state by using a gyroscope and by whether preset functions are in use.
Optionally, the determining, by using the pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near the pre-selected AR target includes:

obtaining, according to the pre-generated navigation route, the distance between the position of the AR target and the starting position of the terminal;

obtaining, according to a movement mode pre-selected by the user, the average movement speed of the movement mode;

calculating, according to the distance and the average movement speed, the estimated time for the terminal to arrive near the AR target; and

after the terminal starts from the starting position and has moved for the estimated time, determining that the terminal has arrived near the AR target.
Optionally, the program, when executed, is further used to:

collect the current location parameter of the terminal by using a navigation positioning system when the acquired picture does not contain the AR target; and

correct the navigation route according to the location parameter.

Optionally, the location parameter includes at least one of positioning parameter information of the current location of the terminal and a prominent landmark of the current location of the terminal.
Optionally, the program, when executed, is further used to:

output warning information when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.

Optionally, the warning information includes at least one of the following: vibration, a ringtone, or light.
Based on the embodiment shown in FIG. 6 of the present invention, FIG. 8 provides a hardware implementation block diagram of the AR target discovery system according to an embodiment of the present invention. As shown in FIG. 8, the related functions of the apparatus described in the embodiment of FIG. 6 of the present invention can be implemented by deploying them in the related hardware function modules of FIG. 8, which is not described in detail here.
In the embodiments of the present invention, when it is determined that the terminal is in a search state, a picture of the surrounding environment is acquired automatically, and one or more targets extracted from the picture are matched against the AR target, so that the AR processing flow is started automatically in a markerless environment, small AR targets are discovered accurately, and the time needed to discover an AR target is greatly reduced.
The foregoing descriptions are merely preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims

1. A method for discovering an augmented reality (AR) target, comprising:

when it is determined, by using a pre-generated navigation route and the moving speed of a terminal, that the terminal has arrived near a pre-selected AR target, determining the state of the terminal;

when it is determined that the terminal is in a search state, starting a camera and acquiring a picture; and

when the acquired picture contains the AR target, prompting the AR target.
2. The method according to claim 1, wherein the determining, by using the pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near the pre-selected AR target comprises:

obtaining, according to the pre-generated navigation route, the distance between the position of the AR target and the starting position of the terminal;

obtaining, according to a movement mode pre-selected by the user, the average movement speed of the movement mode;

calculating, according to the distance and the average movement speed, the estimated time for the terminal to arrive near the AR target; and

after the terminal starts from the starting position and has moved for the estimated time, determining that the terminal has arrived near the AR target.
3. The method according to claim 1 or 2, further comprising:

when the acquired picture does not contain the AR target, collecting the current location parameter of the terminal by using a navigation positioning system; and

correcting the navigation route according to the location parameter.
4. The method according to claim 3, wherein the location parameter comprises at least one of positioning parameter information of the current location of the terminal and a prominent landmark of the current location of the terminal.
5. The method according to any one of claims 1 to 4, wherein the determining the state of the terminal comprises:

determining whether the terminal is in a search state by using a gyroscope and by whether preset functions are in use.
6. The method according to any one of claims 1 to 5, further comprising:

when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target, outputting warning information.
7. A terminal, comprising:

a state determining unit, configured to determine the state of the terminal when it is determined, by using a pre-generated navigation route and the moving speed of the terminal, that the terminal has arrived near a pre-selected AR target;

a picture acquiring unit, configured to start a camera and acquire a picture when the state determining unit determines that the terminal is in a search state; and

a prompting unit, configured to prompt the AR target when the picture acquired by the picture acquiring unit contains the AR target.
8. The terminal according to claim 7, wherein the state determining unit comprises:

a distance obtaining subunit, configured to obtain, according to the pre-generated navigation route, the distance between the position of the AR target and the starting position of the terminal;

a speed obtaining subunit, configured to obtain, according to a movement mode pre-selected by the user, the average movement speed of the movement mode;

an estimated-time obtaining subunit, configured to calculate, according to the distance obtained by the distance obtaining subunit and the average movement speed obtained by the speed obtaining subunit, the estimated time for the terminal to arrive near the AR target; and

a determining subunit, configured to determine, after the terminal starts from the starting position and has moved for the estimated time, that the terminal has arrived near the AR target.
9. The terminal according to claim 7 or 8, further comprising:

a parameter collecting unit, configured to collect the current location parameter of the terminal by using a navigation positioning system when the picture acquired by the picture acquiring unit does not contain the AR target; and

a correcting unit, configured to correct the navigation route according to the location parameter collected by the parameter collecting unit.
10. The terminal according to claim 9, wherein the location parameter comprises at least one of positioning parameter information of the current location of the terminal and a prominent landmark of the current location of the terminal.
11. The terminal according to any one of claims 7 to 10, wherein the state determining unit is specifically configured to determine, when it is determined by using the pre-generated navigation route and the moving speed of the terminal that the terminal has arrived near the pre-selected AR target, whether the terminal is in a search state by using a gyroscope and by whether preset functions are in use.
12. The terminal according to any one of claims 7 to 11, further comprising: a warning unit, configured to output warning information when it is determined that the terminal moves irregularly near the AR target and finally moves away from the AR target.
PCT/CN2014/071169 2013-01-28 2014-01-23 Method and terminal for discovering an augmented reality target WO2014114244A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020157023037A KR101748226B1 (ko) 2013-01-28 2014-01-23 Method for discovering an augmented reality object, and terminal
JP2015554037A JP6123120B2 (ja) 2013-01-28 2014-01-23 Method and terminal for discovering an augmented reality object
EP14743259.5A EP2865993B1 (en) 2013-01-28 2014-01-23 Augmented reality target discovery method and terminal
US14/573,178 US9436874B2 (en) 2013-01-28 2014-12-17 Method for discovering augmented reality object, and terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310031093.X 2013-01-28
CN201310031093.XA 2013-01-28 2013-01-28 Method and terminal for discovering an augmented reality target

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/573,178 Continuation US9436874B2 (en) 2013-01-28 2014-12-17 Method for discovering augmented reality object, and terminal

Publications (1)

Publication Number Publication Date
WO2014114244A1 true WO2014114244A1 (zh) 2014-07-31

Family

ID=51226936

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/071169 WO2014114244A1 (zh) 2013-01-28 2014-01-23 Method and terminal for discovering an augmented reality target

Country Status (6)

Country Link
US (1) US9436874B2 (zh)
EP (1) EP2865993B1 (zh)
JP (1) JP6123120B2 (zh)
KR (1) KR101748226B1 (zh)
CN (1) CN103968824B (zh)
WO (1) WO2014114244A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110347703A (zh) * 2019-07-01 2019-10-18 华南理工大学 ARCore-based user behavior analysis method and system

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106062862B (zh) 2014-10-24 2020-04-21 杭州凌感科技有限公司 System and method for immersive and interactive multimedia generation
US10256859B2 (en) 2014-10-24 2019-04-09 Usens, Inc. System and method for immersive and interactive multimedia generation
US10334085B2 (en) * 2015-01-29 2019-06-25 Splunk Inc. Facilitating custom content extraction from network packets
  • CN104834375A (zh) * 2015-05-05 2015-08-12 常州恐龙园股份有限公司 Augmented-reality-based amusement park guide system
  • CN104850229B (zh) * 2015-05-18 2019-03-22 小米科技有限责任公司 Method and device for recognizing an object
  • CN105824863B (zh) * 2015-10-30 2021-12-28 维沃移动通信有限公司 Desktop theme recommendation method and terminal
  • CN106959111A (zh) * 2016-01-08 2017-07-18 台湾国际物业管理顾问有限公司 Continuous indoor positioning information system for buildings
  • CN105554392A (zh) * 2016-01-20 2016-05-04 京东方科技集团股份有限公司 Display switching device and switching method thereof, and wearable display device and display method thereof
  • CN106713868A (zh) * 2017-01-03 2017-05-24 捷开通讯(深圳)有限公司 Method and system for monitoring a random target
  • CN107229706A (zh) * 2017-05-25 2017-10-03 广州市动景计算机科技有限公司 Augmented-reality-based information acquisition method and device
  • CN109151204B (zh) * 2018-08-31 2021-04-23 Oppo(重庆)智能科技有限公司 Mobile-terminal-based navigation method and device, and mobile terminal
  • CN111198608B (zh) * 2018-11-16 2021-06-22 广东虚拟现实科技有限公司 Information prompting method and device, terminal device, and computer-readable storage medium
  • CN109324693A (zh) * 2018-12-04 2019-02-12 塔普翊海(上海)智能科技有限公司 AR search device, and article search system and method based on the AR search device
  • CN111491293B (zh) * 2019-01-25 2022-01-11 华为技术有限公司 Method and device for reporting a motion state
  • CN112733620A (zh) * 2020-12-23 2021-04-30 深圳酷派技术有限公司 Information prompting method and device, storage medium, and electronic device
  • CN112880689A (zh) * 2021-01-29 2021-06-01 北京百度网讯科技有限公司 Seat-guiding method and device, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • CN101033978A (zh) * 2007-01-30 2007-09-12 珠海市智汽电子科技有限公司 Intelligent vehicle assisted navigation and automatic/assisted driving system
  • WO2008099482A1 (ja) * 2007-02-16 2008-08-21 Pioneer Corporation Search device, search method, search program, and computer-readable recording medium
  • CN101451852A (zh) * 2008-12-19 2009-06-10 深圳华为通信技术有限公司 Navigation device and navigation method
  • CN102123194A (zh) * 2010-10-15 2011-07-13 张哲颖 Method for optimizing mobile navigation and human-machine interaction by using augmented reality technology
  • CN102867169A (zh) * 2011-04-08 2013-01-09 索尼公司 Image processing device, display control method, and program

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3442163B2 (ja) * 1994-10-11 2003-09-02 富士通株式会社 Alignment method and apparatus
US6456234B1 (en) 2000-06-07 2002-09-24 William J. Johnson System and method for proactive content delivery by situation location
  • JP2002267484A (ja) * 2001-03-09 2002-09-18 Ntt Docomo Inc Guidance system and guided terminal device
  • JP3787760B2 (ja) 2001-07-31 2006-06-21 松下電器産業株式会社 Camera-equipped mobile phone device
  • JP4260432B2 (ja) * 2002-07-08 2009-04-30 シャープ株式会社 Information providing method, information providing program, recording medium recording the information providing program, and information providing device
  • JP4066421B2 (ja) * 2003-01-14 2008-03-26 株式会社リコー Portable information terminal device
  • JP4380550B2 (ja) * 2004-03-31 2009-12-09 株式会社デンソー In-vehicle photographing device
  • JP2007041955A (ja) * 2005-08-04 2007-02-15 Matsushita Electric Ind Co Ltd Terminal device
  • JP2007132870A (ja) * 2005-11-11 2007-05-31 Pioneer Electronic Corp Navigation device, computer program, screen control method, and measurement interval control method
  • JP2007292713A (ja) * 2006-03-30 2007-11-08 Denso Corp Navigation device
US7617042B2 (en) 2006-06-30 2009-11-10 Microsoft Corporation Computing and harnessing inferences about the timing, duration, and nature of motion and cessation of motion with applications to mobile computing and communications
  • JP2008146136A (ja) * 2006-12-06 2008-06-26 Seiko Epson Corp Image recognition device, image recognition system, image recognition method, and control program
  • JP2009267792A (ja) * 2008-04-25 2009-11-12 Panasonic Corp Imaging device
US8909466B2 (en) * 2008-08-01 2014-12-09 Environmental Systems Research Institute, Inc. System and method for hybrid off-board navigation
US8639440B2 (en) * 2010-03-31 2014-01-28 International Business Machines Corporation Augmented reality shopper routing
KR101667715B1 (ko) 2010-06-08 2016-10-19 엘지전자 주식회사 Route guidance method using augmented reality and mobile terminal using the same
US8762041B2 (en) * 2010-06-21 2014-06-24 Blackberry Limited Method, device and system for presenting navigational information
CN102054166B (zh) * 2010-10-25 2016-04-27 北京理工大学 New scene recognition method for outdoor augmented reality systems
US20120249588A1 (en) * 2011-03-22 2012-10-04 Panduit Corp. Augmented Reality Data Center Visualization
GB201106555D0 (en) * 2011-04-19 2011-06-01 Tomtom Int Bv Taxi dispatching system
  • CN102519475A (zh) * 2011-12-12 2012-06-27 杨志远 Intelligent navigation method and device based on augmented reality technology
  • KR102021050B1 (ko) * 2012-06-06 2019-09-11 삼성전자주식회사 Method for providing navigation information, machine-readable storage medium, mobile terminal, and server

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • CN101033978A (zh) * 2007-01-30 2007-09-12 珠海市智汽电子科技有限公司 Intelligent vehicle assisted navigation and automatic/assisted driving system
  • WO2008099482A1 (ja) * 2007-02-16 2008-08-21 Pioneer Corporation Search device, search method, search program, and computer-readable recording medium
  • CN101451852A (zh) * 2008-12-19 2009-06-10 深圳华为通信技术有限公司 Navigation device and navigation method
  • CN102123194A (zh) * 2010-10-15 2011-07-13 张哲颖 Method for optimizing mobile navigation and human-machine interaction by using augmented reality technology
  • CN102867169A (zh) * 2011-04-08 2013-01-09 索尼公司 Image processing device, display control method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2865993A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • CN110347703A (zh) * 2019-07-01 2019-10-18 华南理工大学 ARCore-based user behavior analysis method and system
  • CN110347703B (zh) * 2019-07-01 2023-08-22 华南理工大学 ARCore-based user behavior analysis method and system

Also Published As

Publication number Publication date
JP6123120B2 (ja) 2017-05-10
JP2016507746A (ja) 2016-03-10
US9436874B2 (en) 2016-09-06
EP2865993B1 (en) 2019-04-03
KR101748226B1 (ko) 2017-06-16
CN103968824B (zh) 2018-04-10
CN103968824A (zh) 2014-08-06
EP2865993A4 (en) 2015-12-30
US20150104069A1 (en) 2015-04-16
KR20150108925A (ko) 2015-09-30
EP2865993A1 (en) 2015-04-29

Similar Documents

Publication Publication Date Title
WO2014114244A1 (zh) Method and terminal for discovering an augmented reality target
US11035687B2 (en) Virtual breadcrumbs for indoor location wayfinding
EP3014476B1 (en) Using movement patterns to anticipate user expectations
US10909759B2 (en) Information processing to notify potential source of interest to user
US9410810B2 (en) Method and apparatus for providing service using a sensor and image recognition in a portable terminal
JP6025433B2 (ja) Portable navigation device
JP4380550B2 (ja) In-vehicle photographing device
US9928710B2 (en) Danger alerting method and device, portable electronic apparatus
US10373496B2 (en) Parking management system and parking management method
KR20160147052A (ko) 스트리트 뷰 목적지에 대한 안내를 제공하는 방법 및 장치
JP2008227877A (ja) Video information processing device
CN105509735B (zh) Information prompting method, device, and terminal
EP3287745B1 (en) Information interaction method and device
CN117128959A (zh) Vehicle-finding navigation method, electronic device, server, and system
JP2012019374A (ja) Electronic album creation server, information processing device, electronic album creation system, and control method of electronic album creation server
JP2014044166A (ja) Electronic device, traveling direction presentation method, and program
CN111176338A (zh) Navigation method, electronic device, and storage medium
JP2005017074A (ja) Information transmitting/receiving device and information transmitting/receiving program
KR20190010065A (ko) Linear control method combining an automatic-correction prediction algorithm capable of position recognition and real-time tracking of a moving object
JP5977697B2 (ja) Electronic device and method for controlling the electronic device
JP2011220899A (ja) Information presentation system
JP6976186B2 (ja) Terminal device and program
KR20190010066A (ko) Self automatic photographing device
CN113179372A (zh) Method and device for combining an image capture device with artificial intelligence, and electronic device
JP2012202782A (ja) Portable information terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14743259

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2014743259

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2015554037

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20157023037

Country of ref document: KR

Kind code of ref document: A