CN109345560B - Motion tracking precision testing method and device of augmented reality equipment - Google Patents
- Publication number
- CN109345560B (application CN201811103303.0A)
- Authority
- CN
- China
- Prior art keywords
- augmented reality
- real
- marker object
- test
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Multimedia (AREA)
- Processing Or Creating Images (AREA)
Abstract
The application provides a motion tracking precision testing method and device for augmented reality equipment, wherein the method comprises the following steps: acquiring an augmented reality video image; determining an augmented reality marker object in the video image; controlling the augmented reality device and a first real marker object to move relative to each other to obtain at least two test images; and determining the tracking precision of the augmented reality equipment for the first real marker object according to the size parameter of the test images and the relative position parameter of the first real marker object and the augmented reality marker object. The method and device solve the problem in the related art that AR motion tracking precision is evaluated through the subjective feeling of the user, and improve the accuracy of testing AR motion tracking precision.
Description
Technical Field
The application relates to the field of communication, in particular to a method and a device for testing motion tracking precision of augmented reality equipment.
Background
With the development of technology, Augmented Reality (AR for short) has become increasingly popular, allowing people to experience scenes that cannot be experienced in real life. At present, however, the motion tracking accuracy of AR is evaluated through the subjective feeling of the user, and an objective and accurate accuracy testing scheme is lacking.
In view of the above problems in the related art, no effective solution exists at present.
Disclosure of Invention
The embodiment of the application provides a method and a device for testing motion tracking precision of augmented reality equipment, and aims to at least solve the problem that AR motion tracking precision evaluation is determined through subjective feeling of a user in the related art.
According to an embodiment of the present application, there is provided a method for testing motion tracking accuracy of an augmented reality device, including: acquiring an augmented reality video image; determining an augmented reality marker object in the video image, wherein the position of the augmented reality marker object in the video image corresponds to a first real marker object preset in a real scene in the video image; controlling the augmented reality device and the first real marker object to move relatively to obtain at least two test images, wherein the at least two test images comprise the first real marker object; and determining the tracking precision of the augmented reality equipment to the first real marker object according to the size parameter of the test image and the relative position parameter of the first real marker object and the augmented reality marker object.
According to another embodiment of the present application, there is provided a motion tracking accuracy testing apparatus of an augmented reality device, including: the first acquisition module is used for acquiring an augmented reality video image; a first determining module, configured to determine an augmented reality marker object in the video image, where a position of the augmented reality marker object in the video image corresponds to a first real marker object preset in a real scene in the video image; the second obtaining module is used for controlling the augmented reality device and the first real marker object to move relatively so as to obtain at least two test images, wherein the at least two test images comprise the first real marker object; a second determining module, configured to determine, according to the size parameter of the test image and the relative position parameter of the first real marker object and the augmented reality marker object, the tracking accuracy of the augmented reality device on the first real marker object.
According to yet another embodiment of the present application, there is further provided a storage medium having a computer program stored therein, where the computer program is configured to execute the steps in the embodiment of the motion tracking accuracy testing method of the augmented reality device when running.
According to yet another embodiment of the present application, there is further provided an electronic apparatus, including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform the steps in the embodiment of the method for testing motion tracking accuracy of an augmented reality device.
According to the application, an augmented reality video image is acquired; an augmented reality marker object is determined in the video image, where the position of the augmented reality marker object in the video image corresponds to a first real marker object preset in the real scene; the augmented reality device and the first real marker object are controlled to move relative to each other to obtain at least two test images, each containing the first real marker object; and the tracking precision of the augmented reality device for the first real marker object is determined according to the size parameter of the test images and the relative position parameter of the first real marker object and the augmented reality marker object. In other words, the motion tracking precision is obtained by testing the AR device rather than by the user's subjective judgment, which solves the problem in the related art that AR motion tracking precision is evaluated through the subjective feeling of the user, and improves the accuracy of testing AR motion tracking precision.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a block diagram of a hardware structure of a mobile terminal of a motion tracking accuracy testing method according to an embodiment of the present application;
FIG. 2 is a flow chart of a method for testing motion tracking accuracy according to an embodiment of the present application;
FIG. 3 is a schematic diagram of marking a target mark object according to an embodiment of the present application;
FIG. 4 is a schematic illustration of maximum distances according to an embodiment of the present application;
FIG. 5 is a schematic distance diagram of a target mark object according to an embodiment of the present application;
FIG. 6 is a schematic diagram of distances in an augmented reality marker object absence image according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a test device for motion tracking accuracy according to an embodiment of the present application;
FIG. 8 is a first diagram illustrating an alternative configuration of a device for testing motion tracking accuracy according to an embodiment of the present application;
fig. 9 is a schematic diagram of an alternative structure of a test apparatus for motion tracking accuracy according to an embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Example 1
The method provided by the first embodiment of the present application may be executed on a mobile terminal, a computer terminal, or a similar computing device. Taking a mobile terminal as an example, fig. 1 is a hardware structure block diagram of a mobile terminal running the motion tracking accuracy testing method according to the embodiment of the present application. As shown in fig. 1, the mobile terminal 10 may include one or more processors 102 (only one is shown in fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data, and optionally a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration and does not limit the structure of the mobile terminal. For example, the mobile terminal 10 may include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of application software, such as a computer program corresponding to the processing method of motion tracking accuracy in the embodiment of the present application, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, so as to implement the above-mentioned method. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some instances, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal 10. In one example, the transmission device 106 includes a Network adapter (NIC), which can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
In this embodiment, a method for testing motion tracking accuracy of an augmented reality device with a mobile terminal as a carrier is provided, and fig. 2 is a flowchart of a method for testing motion tracking accuracy of an augmented reality device according to an embodiment of the present application, and as shown in fig. 2, the flow includes the following steps:
step S202, acquiring an augmented reality video image;
it should be noted that, before acquiring the augmented reality video image, it is necessary to determine that the natural condition of the environment to be tested is constant, and after the natural condition is constant, the augmented reality video image is acquired, where the video image at least includes: the size and position information of various surfaces in the environment to be tested and the illumination information of the environment to be tested.
Step S204, an augmented reality marker object is determined in the video image, wherein the position of the augmented reality marker object in the video image corresponds to a first real marker object preset in a real scene in the video image;
step S206, controlling the augmented reality device and the first real marker object to move relatively to obtain at least two test images, wherein the at least two test images comprise the first real marker object;
for step S206, in an optional implementation manner of this embodiment, the following may be performed: controlling the augmented reality equipment to obtain at least two test images through at least two test points; or controlling the augmented reality equipment to acquire at least two test images when the first real marker object is positioned at the at least two test points. It can be seen that, for obtaining the test images, the augmented reality device may be moved to obtain at least two test images at the at least two test points, or at least two test images of the first real marked object located at the at least two test points may also be obtained.
Step S208, determining the tracking accuracy of the augmented reality device on the first real marker object according to the size parameter of the test image and the relative position parameter of the first real marker object and the augmented reality marker object.
It should be noted that the size parameter of the test image involved in this embodiment is the maximum distance on the test image, i.e., the maximum value of the distance between any two points in the test image. Thus, if the test image is rectangular, the maximum distance is the length of its diagonal; if the test image is circular, the maximum distance is the diameter of the circle. This will be described below with reference to fig. 4.
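As a minimal sketch of the size parameter described above (the function names are illustrative, not from the patent), the maximum distance D of a rectangular image is its diagonal and that of a circular image is its diameter:

```python
import math

def max_distance(width: float, height: float) -> float:
    """Size parameter D of a rectangular test image: the diagonal length."""
    return math.hypot(width, height)

def max_distance_circle(diameter: float) -> float:
    """Size parameter D of a circular test image: the diameter."""
    return diameter

# A 1920x1080 rectangular frame: D is the diagonal, about 2202.9 pixels.
D_rect = max_distance(1920.0, 1080.0)
D_circle = max_distance_circle(500.0)
```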
Through steps S202 to S208, an augmented reality video image is acquired; an augmented reality marker object is determined in the video image; and the augmented reality device and the first real marker object are controlled to move relative to each other to obtain at least two test images, each containing the first real marker object. The tracking precision of the augmented reality device for the first real marker object is then determined according to the size parameter of the test images and the relative position parameter of the first real marker object and the augmented reality marker object. In other words, the motion tracking precision is obtained by testing the AR device rather than by the user's subjective judgment, which solves the problem in the related art that AR motion tracking precision is evaluated through the subjective feeling of the user, and improves the accuracy of testing AR motion tracking precision.
Alternatively, the execution subject of the above steps may be a terminal or the like, but is not limited thereto.
In an optional implementation manner of this embodiment, the relative position parameter of the augmented reality marker object may be determined by at least one of the following steps:
step S1, under the condition that the augmented reality marker object is not detected in the test image, the distance value between the first real marker object and the preset marker object in the test image is used as the relative position parameter of the first real marker object and the augmented reality marker object;
the distance value between the first real mark object and the preset mark object in the test image is the maximum distance between the first real mark object and the image edge.
Step S2, in a case where it is determined that the augmented reality marker object is detected in the test image, taking the distance value between the first real marker object and the augmented reality marker object as the relative position parameter of the first real marker object and the augmented reality marker object.
For the above steps S1 and S2, in a specific embodiment, the distance value may be determined based on the established three-dimensional coordinate system, specifically: establishing a three-dimensional orthogonal coordinate system in an image acquired through augmented reality equipment; based on this, the distance value of the first real mark object and the preset mark object involved in the present embodiment is determined by the three-dimensional coordinates of the first real mark object and the three-dimensional coordinates of the preset mark object; likewise, the distance value of the first real marker object and the augmented reality marker object is determined by the three-dimensional coordinates of the first real marker object and the three-dimensional coordinates of the augmented reality marker object. It should be noted that the preset mark object refers to a mark object at the edge of the image with the maximum distance M from the real mark object.
Based on the augmented reality video image acquired in step S202, which at least includes the size and position information of various surfaces in the environment to be tested and the illumination information of the environment to be tested, the distance values involved in this embodiment are described in detail below in conjunction with a specific application scenario. First, a three-dimensional orthogonal coordinate system is established in the environment to be tested in the video image acquired through augmented reality, and the coordinates of the M marker objects recorded in the test environment can be [(x_1, y_1, z_1), (x_2, y_2, z_2), ..., (x_m, y_m, z_m)]. Based on this, the three-dimensional coordinates of the first real marker object and of the preset marker object can be obtained from the three-dimensional orthogonal coordinate system, and the distance value between the two can be determined from their three-dimensional coordinates; likewise, the distance value between the first real marker object and the augmented reality marker object is determined from the three-dimensional coordinates of each, as exemplified in fig. 4 to 6 in the alternative embodiments below.
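The distance values above reduce to Euclidean distances between three-dimensional coordinates in the established coordinate system. A minimal sketch (the helper name `distance_3d` is an assumption for illustration):

```python
import math

def distance_3d(p, q):
    """Euclidean distance between two (x, y, z) coordinates in the
    three-dimensional orthogonal coordinate system."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Distance between a first real marker at (1, 2, 2) and an AR marker at the origin.
e = distance_3d((1.0, 2.0, 2.0), (0.0, 0.0, 0.0))  # 3.0
```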
In another optional implementation manner of this embodiment, the manner of determining the tracking accuracy of the augmented reality device for the first real marker object according to the size parameter of the test image and the relative position parameters of the real marker object and the augmented reality marker object, which is involved in step S208, may be implemented by:
step S208-1, determining the sum of a plurality of relative position parameters according to the number of the test images;
step S208-2, obtaining the product of the quantity value and the size parameter of the test image;
in step S208-3, the ratio result of the sum and the product result is determined as the tracking accuracy.
As can be seen from steps S208-1 to S208-3, the tracking accuracy is the ratio of the sum of the relative position parameters to the product of the number of test images and the size parameter of the test image. In a specific application scenario, the tracking accuracy in steps S208-1 to S208-3 can be obtained by the following formula (1):
A = (e_1 + e_2 + … + e_N) / (N × D)    (1)

wherein A is the motion tracking accuracy, D is the size parameter, N is the number of test images, and e_i is the relative position parameter obtained at the ith test point.
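Assuming the ratio described in steps S208-1 to S208-3 (sum of the relative position parameters over the product of their count and the size parameter), formula (1) can be sketched as follows; the function name is illustrative:

```python
def tracking_accuracy(errors, size_param):
    """Formula (1): A = (sum of e_i) / (N * D), where errors holds the
    relative position parameters e_1..e_N and size_param is D."""
    n = len(errors)
    return sum(errors) / (n * size_param)

# Three test images with marker offsets of 1, 2 and 3 pixels, D = 2 pixels.
A = tracking_accuracy([1.0, 2.0, 3.0], 2.0)  # 1.0
```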
In another optional implementation manner of this embodiment, the method steps of this embodiment may further include:
step S210, obtaining a distance value between a second real mark object and a first real mark object in a real scene in a video image;
step S212, determining the tracking precision of the augmented reality device on the second real marked object according to the distance value between the second real marked object and the first real marked object, the size parameter of the test image and the relative position parameter of the first real marked object and the augmented reality marked object.
Based on the above steps S210 and S212, the motion tracking accuracy of the other second real marker object is related to the distance value of the first real marker object, the size parameter of the test image, and the relative position parameter of the first real marker object and the augmented reality marker object, and specifically, the tracking accuracy of the second real marker object can be determined by the following formula (2):
A(x, y, z) = (Σ_i e_i / d_i) / (D × Σ_j 1 / d_j)    (2)

wherein (x, y, z) are the coordinates of the second real marker object in the three-dimensional orthogonal coordinate system; D and e_i in formula (2) are the same as D and e_i in formula (1); d_i is the distance from the second real marker object to the ith first real marker object, and, in the case of multiple first real marker objects, i and j index the ith and jth first real marker objects.
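The following sketch shows one plausible reading of formula (2) as an inverse-distance-weighted average of the per-point parameters e_i normalized by D; the patent's original formula image is not reproduced in this text, so both the exact weighting and the function name are assumptions:

```python
import math

def estimate_accuracy(point, test_points, errors, size_param):
    """Inverse-distance-weighted estimate of tracking accuracy at an
    untested point: A(x, y, z) = (sum_i e_i / d_i) / (D * sum_j 1 / d_j)."""
    ds = [math.dist(point, tp) for tp in test_points]
    numerator = sum(e / d for e, d in zip(errors, ds))
    denominator = size_param * sum(1.0 / d for d in ds)
    return numerator / denominator
```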
The present application will be illustrated with reference to alternative embodiments of the present embodiment; in this particular embodiment, a method for testing the accuracy of AR motion tracking is provided. This optional implementation provides a method for testing AR motion tracking accuracy, which includes the steps of:
s302, obtaining a video image of an environment to be tested by AR equipment through enhancement (corresponding to the step S202);
A fixed environment to be tested is determined, the natural conditions (such as illumination, temperature, air pressure, humidity, etc.) of the test environment are kept constant, and a fixed test target point is then selected in the test environment. The test environment may be a sphere of radius R (and its interior) centered on the test target point, or an irregular space in which the farthest distance from any point to the test target point is R. Information about the environment to be tested is acquired through the input devices of the augmented reality (AR) equipment (such as various cameras, sensors, and recording devices) to obtain the augmented reality video image, where the information includes, but is not limited to, the size and position of various surfaces in the test environment and the illumination conditions of the test environment.
S304, marking the target marker object (corresponding to step S204 above);
A point on a plane in the test environment is selected as the target marker object (corresponding to the first real marker object above). Fig. 3 is a schematic diagram of marking the target marker object according to an embodiment of the present application, e.g., the center of the black circle shown in fig. 3. A corresponding augmented reality marker object (corresponding to the augmented reality marker object above) is then placed on the corresponding plane in the AR, e.g., the center of the white circle under the robot shown in fig. 3. The marker object in the AR is adjusted so that it coincides with the target marker object, as shown in fig. 3, where the white circle coincides with the black circle.
S306, obtaining data at the test point;
Preferably, a three-dimensional orthogonal coordinate system is established in the test environment, and N positions are selected as test points (the N test points may be test points reached by moving the augmented reality device, test points at which the first real marker object is placed, or a combination of the two), and the coordinates of the N test points are recorded as [(x_1, y_1, z_1), (x_2, y_2, z_2), ..., (x_n, y_n, z_n)].
S308, analyzing the data acquired at the test point;
the method includes recording a test target marker object and an augmented reality marker object in an image displayed by the AR at N test points, where the image may be a video or a picture, and the video may take one frame after the AR output is stabilized as a result image, where a maximum distance between pixels in the image is D (corresponding to a size parameter of the test image referred to in this embodiment).
Fig. 4 is a schematic diagram of the maximum distance according to an embodiment of the present application; as shown in fig. 4, the maximum distance of a rectangle is the length of its diagonal, and the maximum distance of a circle is its diameter. The distance between the test target marker object and the augmented reality marker object is calculated at each test point to obtain e_1, e_2, ..., e_N (corresponding to the relative position parameter in step S2 above). Fig. 5 is a schematic diagram of the distance of the target marker object according to an embodiment of the present application; as shown in fig. 5, the line connecting the two points is the distance. Fig. 6 is a schematic diagram of the distance when the augmented reality marker object is absent from the image according to an embodiment of the present application; as shown in fig. 6, if the augmented reality marker object is absent from the image, the marker distance is taken as M, the farthest distance from the target point to any point in the image. That is, if the augmented reality marker object does not exist in the image, e_i is replaced with M (corresponding to the relative position parameter in step S1 above). The relative position parameters in steps S1 and S2 of this embodiment are thus exemplified with reference to figs. 4 to 6.
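The per-test-point relative position parameter, including the fallback to M when the augmented reality marker object is absent, might be sketched as follows (the 2D pixel coordinates and the function name are assumptions for illustration; M is taken as the distance to the farthest image corner):

```python
import math

def relative_position_parameter(real_xy, ar_xy, image_w, image_h):
    """e_i: pixel distance between the real marker and the AR marker in a
    test image. If the AR marker was not detected (ar_xy is None), fall
    back to M, the farthest distance from the real marker to any point in
    the image, i.e. the distance to the farthest corner."""
    if ar_xy is None:
        corners = [(0.0, 0.0), (image_w, 0.0), (0.0, image_h), (image_w, image_h)]
        return max(math.dist(real_xy, c) for c in corners)
    return math.dist(real_xy, ar_xy)
```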
S310, evaluating the motion tracking precision;
and calculating the motion tracking precision A of the test target point under the test environment by taking the data of all the test points according to the following formula.
It should be noted that the step S310 corresponds to the above step S208-1 to step S208-3, i.e. D is the dimension parameter mentioned in the above embodiment, and e isiAnd obtaining the relative position parameter for the ith position point.
S312, estimating the motion tracking precision of any point;
It should be noted that this step S312 corresponds to steps S210 and S212 in the above embodiment; that is, the tracking accuracy of the second real marker object can be determined in the following manner in an optional implementation of this embodiment, where the second real marker object is included among the arbitrary points described below.
The motion tracking accuracy A(x, y, z) of any point (with coordinates (x, y, z)) not tested in the test scene can be estimated by the following formulas. The distance d_i from the point to each test point x_i is:

d_i = √((x − x_i)² + (y − y_i)² + (z − z_i)²)

The estimated motion tracking accuracy of the point is then:

A(x, y, z) = (Σ_i e_i / d_i) / (D × Σ_j 1 / d_j)
it should be noted that, in the above steps S302 to S312, the motion accuracy evaluation is already completed through S302 to S310, and S312 provides a method capable of performing motion tracking accuracy estimation on untested points based on existing data.
In addition, if testing of multiple test target points is required, the steps of S302-S310 may be repeated.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
Example 2
In this embodiment, a motion tracking accuracy testing apparatus is further provided. The apparatus is used to implement the foregoing embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the embodiments below is preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 7 is a schematic structural diagram of a motion tracking accuracy testing apparatus of an augmented reality device according to an embodiment of the present application, and as shown in fig. 7, the apparatus includes: a first obtaining module 72, configured to obtain an augmented reality video image; a first determining module 74, coupled to the first obtaining module 72, for determining an augmented reality marker object in the video image, where a position of the augmented reality marker object in the video image corresponds to a first real marker object preset in a real scene in the video image; a second obtaining module 76, coupled to the first determining module 74, configured to control the augmented reality device to move relative to the first real marker object, so as to obtain at least two test images, where the at least two test images include the first real marker object; a second determining module 78, coupled to the second acquiring module 76, is configured to determine the tracking accuracy of the augmented reality device on the first real marker object according to the size parameter of the test image and the relative position parameter of the first real marker object and the augmented reality marker object.
It should be noted that, before acquiring the augmented reality video image, it is necessary to determine that a natural condition of an environment to be tested remains constant, and after the natural condition remains constant, the augmented reality video image is acquired, where the video image at least includes: the size and position information of various surfaces in the environment to be tested and the illumination information of the environment to be tested.
Optionally, in this embodiment, the second determining module 78 includes: a first determining unit, configured to determine the sum of the plurality of relative position parameters according to the number of the test images; an obtaining unit, coupled to the first determining unit and configured to obtain the product of the number of test images and the size parameter of the test image; and a second determining unit, coupled to the obtaining unit and configured to determine the ratio of the sum to the product as the tracking precision.
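The ratio computed by the second determining unit can be sketched as follows. This is an illustrative sketch of the embodiment's precision formula; the function and argument names are illustrative rather than taken from the application:

```python
def tracking_precision(relative_position_params, image_size_param):
    """Ratio of summed marker offsets to (image count x image size).

    relative_position_params: one offset (distance between the real
    marker object and the augmented reality marker object) per test image.
    image_size_param: the size parameter of the test image, e.g. its
    maximum (diagonal) distance, in the same units as the offsets.
    """
    n = len(relative_position_params)  # number of test images
    if n < 2:
        raise ValueError("at least two test images are required")
    total_offset = sum(relative_position_params)
    # Sum of the relative position parameters divided by the product of
    # the number of test images and the size parameter.
    return total_offset / (n * image_size_param)

# Smaller values indicate tighter tracking: the accumulated offset is
# normalized by how many test images were taken and how large each is.
print(tracking_precision([2.0, 4.0], 100.0))  # 6.0 / 200.0 = 0.03
```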
Optionally, in this embodiment, the second obtaining module 76 includes: a first obtaining unit, configured to control the augmented reality device to obtain the at least two test images from at least two test points; or a second obtaining unit, configured to control the augmented reality device to obtain the at least two test images when the first real marker object is located at the at least two test points.
Fig. 8 is a schematic diagram of an alternative structure of the motion tracking accuracy testing apparatus of an augmented reality device according to an embodiment of the present application. As shown in Fig. 8, the apparatus further includes: a third obtaining module 82, coupled to the second determining module 78 and configured to obtain the distance value between a second real marker object and the first real marker object in the real scene in the video image; and a third determining module 84, coupled to the third obtaining module 82 and configured to determine the tracking accuracy of the augmented reality device with respect to the second real marker object according to the distance value between the second real marker object and the first real marker object, the size parameter of the test image, and the relative position parameter between the first real marker object and the augmented reality marker object.
Fig. 9 is a schematic diagram of an alternative structure of the motion tracking accuracy testing apparatus of an augmented reality device according to an embodiment of the present application. As shown in Fig. 9, the apparatus further includes at least one of the following components: a first processing module 92, configured to, when it is determined that the augmented reality marker object is not detected in the test image, take the distance value between the first real marker object and a preset marker object in the test image as the relative position parameter of the first real marker object and the augmented reality marker object; and a second processing module 94, configured to, when it is determined that the augmented reality marker object is detected in the test image, take the distance value between the first real marker object and the augmented reality marker object as the relative position parameter of the first real marker object and the augmented reality marker object.
It should be noted that, in this embodiment, the relative position parameter may be determined on the basis of a three-dimensional orthogonal coordinate system, specifically as follows: a three-dimensional orthogonal coordinate system is established in an image acquired by the augmented reality device; on this basis, the distance value between the first real marker object and the preset marker object is determined from the three-dimensional coordinates of the first real marker object and the three-dimensional coordinates of the preset marker object; likewise, the distance value between the first real marker object and the augmented reality marker object is determined from the three-dimensional coordinates of the first real marker object and the three-dimensional coordinates of the augmented reality marker object.
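Under the three-dimensional orthogonal coordinate system described above, each distance value reduces to the Euclidean distance between two coordinate triples. A minimal sketch (the function and variable names are illustrative, not from the application):

```python
import math

def marker_distance(p, q):
    """Euclidean distance between two markers given their coordinates
    (x, y, z) in the three-dimensional orthogonal coordinate system
    established in the image acquired by the augmented reality device."""
    return math.dist(p, q)  # square root of summed squared coordinate differences

# Distance between a real marker at the origin and an AR marker that
# has drifted 3 units along x and 4 units along y:
print(marker_distance((0.0, 0.0, 0.0), (3.0, 4.0, 0.0)))  # 5.0
```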
It should be noted that the size parameter of the test image referred to in this embodiment is the maximum distance on the test image.
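The embodiment does not define the maximum distance further; for a rectangular test image, the largest distance between two points is the image diagonal. Under that assumption (mine, not stated in the application), the size parameter can be sketched as:

```python
import math

def image_size_param(width, height):
    """Size parameter of a test image, taken here as the image diagonal.
    The diagonal interpretation is an assumption; the embodiment only
    states that the parameter is the maximum distance on the test image."""
    return math.hypot(width, height)  # sqrt(width**2 + height**2)

print(image_size_param(1920, 1080))  # diagonal of a 1920x1080 image
```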
Embodiments of the present application further provide a storage medium having a computer program stored therein, wherein the computer program is configured to perform the steps in any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
S1, acquiring an augmented reality video image;
S2, determining an augmented reality marker object in the video image, wherein the position of the augmented reality marker object in the video image corresponds to a first real marker object preset in the real scene in the video image;
S3, controlling the augmented reality device to move relative to the first real marker object to obtain at least two test images, wherein the at least two test images include the first real marker object;
S4, determining the tracking precision of the augmented reality device with respect to the first real marker object according to the size parameter of the test image and the relative position parameter between the first real marker object and the augmented reality marker object.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
S1, when it is determined that the augmented reality marker object is not detected in the test image, taking the distance value between the first real marker object and a preset marker object in the test image as the relative position parameter between the first real marker object and the augmented reality marker object;
S2, when it is determined that the augmented reality marker object is detected in the test image, taking the distance value between the first real marker object and the augmented reality marker object as the relative position parameter between the first real marker object and the augmented reality marker object.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
S1, determining the sum of the plurality of relative position parameters according to the number of the test images;
S2, obtaining the product of the number of test images and the size parameter;
S3, determining the ratio of the sum to the product as the tracking precision.
Optionally, in this embodiment, the storage medium may include, but is not limited to, various media capable of storing a computer program, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present application further provide an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic device may further include a transmission device and an input/output device, wherein both the transmission device and the input/output device are connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
S1, acquiring an augmented reality video image;
S2, determining an augmented reality marker object in the video image, wherein the position of the augmented reality marker object in the video image corresponds to a first real marker object preset in the real scene in the video image;
S3, controlling the augmented reality device to move relative to the first real marker object to obtain at least two test images, wherein the at least two test images include the first real marker object;
S4, determining the tracking precision of the augmented reality device with respect to the first real marker object according to the size parameter of the test image and the relative position parameter between the first real marker object and the augmented reality marker object.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
S1, when it is determined that the augmented reality marker object is not detected in the test image, taking the distance value between the first real marker object and a preset marker object in the test image as the relative position parameter between the first real marker object and the augmented reality marker object;
S2, when it is determined that the augmented reality marker object is detected in the test image, taking the distance value between the first real marker object and the augmented reality marker object as the relative position parameter between the first real marker object and the augmented reality marker object.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
S1, determining the sum of the plurality of relative position parameters according to the number of the test images;
S2, obtaining the product of the number of test images and the size parameter;
S3, determining the ratio of the sum to the product as the tracking precision.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments and optional implementations, which are not repeated here.
It will be apparent to those skilled in the art that the modules or steps of the present application described above may be implemented by a general-purpose computing device. They may be centralized on a single computing device or distributed across a network of multiple computing devices. Optionally, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device; in some cases, the steps shown or described may be performed in an order different from that described herein. Alternatively, they may be separately fabricated as individual integrated circuit modules, or multiple ones of them may be fabricated as a single integrated circuit module. Thus, the present application is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the principle of the present application shall be included in the protection scope of the present application.
Claims (11)
1. A motion tracking precision testing method of augmented reality equipment comprises the following steps:
acquiring an augmented reality video image;
determining an augmented reality marker object in the video image, wherein the position of the augmented reality marker object in the video image corresponds to a first real marker object preset in a real scene in the video image;
controlling the augmented reality device to move relative to the first real marker object to obtain at least two test images, wherein the at least two test images include the first real marker object; and
determining the tracking precision of the augmented reality device with respect to the first real marker object according to the size parameter of the test image and the relative position parameter between the first real marker object and the augmented reality marker object.
2. A method according to claim 1, wherein the relative position parameters of the first real marker object and the augmented reality marker object are determined by at least one of:
in a case where it is determined that the augmented reality marker object is not detected in the test image, taking the distance value between the first real marker object and a preset marker object in the test image as the relative position parameter of the first real marker object and the augmented reality marker object;
in a case where it is determined that the augmented reality marker object is detected in the test image, taking the distance value between the first real marker object and the augmented reality marker object as the relative position parameter of the first real marker object and the augmented reality marker object.
3. The method of claim 2, further comprising: establishing a three-dimensional orthogonal coordinate system in an image acquired through augmented reality equipment;
wherein the distance value between the first real marker object and the preset marker object is determined from the three-dimensional coordinates of the first real marker object and the three-dimensional coordinates of the preset marker object; and
the distance value between the first real marker object and the augmented reality marker object is determined from the three-dimensional coordinates of the first real marker object and the three-dimensional coordinates of the augmented reality marker object.
4. A method according to claim 1 or 2, wherein the size parameter of the test image is the maximum distance on the test image.
5. The method of claim 1, wherein determining the tracking accuracy of the augmented reality device for the first real marker object based on the size parameter of the test image and the relative position parameters of the first real marker object and the augmented reality marker object comprises:
determining the sum of a plurality of the relative position parameters according to the number of the test images;
obtaining the product of the number of test images and the size parameter; and
determining the ratio of the sum to the product as the tracking accuracy.
6. The method of claim 5, further comprising:
acquiring a distance value between a second real marker object and the first real marker object in the real scene in the video image; and
determining the tracking precision of the augmented reality device with respect to the second real marker object according to the distance value between the second real marker object and the first real marker object, the size parameter of the test image, and the relative position parameter between the first real marker object and the augmented reality marker object.
7. The method according to claim 1, wherein the step of controlling the augmented reality device to move relative to the first real marker object to acquire at least two test images comprises at least one of:
controlling the augmented reality equipment to obtain the at least two test images through the at least two test points;
and controlling the augmented reality equipment to obtain the at least two test images when the first real mark object is positioned at the at least two test points.
8. The method of claim 1, wherein the augmented reality video image comprises at least one of: the size and position information of various surfaces in the environment to be tested and the illumination information of the environment to be tested.
9. A motion tracking precision testing device of an augmented reality device comprises:
a first obtaining module, configured to obtain an augmented reality video image;
a first determining module, configured to determine an augmented reality marker object in the video image, wherein the position of the augmented reality marker object in the video image corresponds to a first real marker object preset in the real scene in the video image;
a second obtaining module, configured to control the augmented reality device to move relative to the first real marker object to obtain at least two test images, wherein the at least two test images include the first real marker object; and
a second determining module, configured to determine the tracking accuracy of the augmented reality device with respect to the first real marker object according to the size parameter of the test image and the relative position parameter between the first real marker object and the augmented reality marker object.
10. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 8 when executed.
11. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811103303.0A | 2018-09-20 | 2018-09-20 | Motion tracking precision testing method and device of augmented reality equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109345560A CN109345560A (en) | 2019-02-15 |
CN109345560B (en) | 2021-02-19
Family
ID=65306053
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811103303.0A Active CN109345560B (en) | 2018-09-20 | 2018-09-20 | Motion tracking precision testing method and device of augmented reality equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109345560B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111708381A (en) * | 2020-06-29 | 2020-09-25 | 西安方元明科技股份有限公司 | Tracking precision testing method for double-shaft tracking rotary table |
CN112200827B (en) * | 2020-09-09 | 2023-06-09 | 天津津航技术物理研究所 | Far and near scene-based infrared image tracking algorithm evaluation method and platform |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103793936A (en) * | 2012-10-31 | 2014-05-14 | 波音公司 | Automated frame of reference calibration for augmented reality |
CN106952312A (en) * | 2017-03-10 | 2017-07-14 | 广东顺德中山大学卡内基梅隆大学国际联合研究院 | A Markerless Augmented Reality Registration Method Based on Line Feature Description |
CN108537889A (en) * | 2018-03-26 | 2018-09-14 | 广东欧珀移动通信有限公司 | Method of adjustment, device, storage medium and the electronic equipment of augmented reality model |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9132342B2 (en) * | 2012-10-31 | 2015-09-15 | Sulon Technologies Inc. | Dynamic environment and location based augmented reality (AR) systems |
Non-Patent Citations (1)
Title |
---|
Classification of surgical navigation systems and their application in maxillofacial surgery; Yang Yi et al.; Journal of Southeast University (Medical Science Edition); 2018-06-25; p. 535 *
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |