JPWO2021087065A5 - Google Patents
- Publication number
- JPWO2021087065A5 (application JP2022525302A)
- Authority
- JP
- Japan
- Prior art keywords
- coordinate frame
- portable device
- information
- persistent coordinate
- persistent
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Description
The foregoing description is provided by way of example and is not intended to be limiting.
The present invention provides, for example, the following.
(Item 1)
A method of operating an electronic system to render virtual content within a 3D environment comprising a portable device, the method comprising, with one or more processors:
locating the portable device within the 3D environment based at least in part on stored spatial information about the 3D environment;
obtaining persistent coordinate frame information from the stored spatial information about the 3D environment based at least in part on locating the portable device within the 3D environment, the obtained persistent coordinate frame information comprising a transformation between a coordinate frame local to the portable device and a persistent coordinate frame; and
calculating quality information for the persistent coordinate frame, the quality information indicating a positional uncertainty of virtual content rendered based at least in part on a position defined relative to the persistent coordinate frame, the transformation, and the coordinate frame local to the portable device.
(Item 2)
The method of item 1, further comprising rendering a virtual object on a display of the portable device at a location determined based at least in part on the persistent coordinate frame and the calculated quality information for the persistent coordinate frame.
(Item 3)
The method of item 1, wherein the calculated quality information for the persistent coordinate frame provided to the portable device comprises, in at least one degree of freedom, a probability that a deviation between
(i) a distance between a physical feature in the 3D environment associated with the persistent coordinate frame and a position of the virtual content rendered based on the transformation, wherein the position is defined relative to the persistent coordinate frame and the coordinate frame is local to the portable device, and
(ii) a distance between the physical feature in the 3D environment associated with the persistent coordinate frame, as represented in the persistent coordinate frame, and the position of the virtual content defined relative to the persistent coordinate frame,
is below a threshold amount.
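The probability in item 3 can be illustrated with a brief sketch. Assuming the deviation is modeled as a zero-mean Gaussian error (an assumption not stated in the patent), the probability that it stays below a threshold follows from the error function; the function name and numeric values below are hypothetical.

```python
import math

def prob_deviation_below(threshold: float, sigma: float) -> float:
    """Probability that |deviation| < threshold, assuming the deviation
    is a zero-mean Gaussian with standard deviation sigma (metres).
    P(|X| < t) = erf(t / (sigma * sqrt(2)))."""
    return math.erf(threshold / (sigma * math.sqrt(2.0)))

# Illustrative check: a threshold of 1.96 sigma corresponds to the
# familiar 95% confidence interval of a Gaussian.
p = prob_deviation_below(1.96 * 0.05, 0.05)
```

With these hypothetical numbers, `p` is approximately 0.95, i.e. a 95% chance that the rendered content's deviation is below the threshold.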
(Item 4)
The method of item 1, wherein the calculated quality information for the persistent coordinate frame as perceived by the portable device indicates an upper-bound position error of the virtual content when rendered by the portable device.
(Item 5)
The method of item 1, wherein the stored spatial information is a reference map stored remotely from the portable device.
(Item 6)
The method of item 1, wherein:
the portable device comprises a first portable device comprising a first processor of the one or more processors;
the electronic system further comprises a second portable device comprising a second processor of the one or more processors; and
the second processor of the second portable device performs:
obtaining the persistent coordinate frame;
obtaining quality information for the persistent coordinate frame within a view of the second portable device; and
rendering a virtual object on a display of the second portable device while simultaneously indicating changes to the virtual object made through the first portable device.
(Item 7)
The method of item 1, wherein:
the stored spatial information about the 3D environment comprises a shared map;
the persistent coordinate frame has a pose relative to the shared map; and
computing quality information for the persistent coordinate frame comprises:
determining an uncertainty in the pose of the persistent coordinate frame relative to the shared map; and
applying one or more error propagation transformations to the uncertainty in the pose to calculate a contribution to the positional uncertainty of the virtual content.
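The error-propagation step in item 7 can be sketched with standard first-order (linearized) propagation, which maps a pose covariance through the Jacobian of a transformation. The specific covariance values and the identity Jacobian below are hypothetical, not taken from the patent.

```python
import numpy as np

def propagate_covariance(jacobian: np.ndarray, cov: np.ndarray) -> np.ndarray:
    """First-order error propagation: for y = f(x) with Jacobian J,
    Cov(y) is approximated by J @ Cov(x) @ J.T."""
    return jacobian @ cov @ jacobian.T

# Hypothetical 3x3 position covariance of a persistent coordinate frame's
# pose in the shared map (metres squared, per axis).
pose_cov = np.diag([0.01, 0.01, 0.04])
# Jacobian of the map-to-device transformation evaluated at the content's
# position; the identity corresponds to a pure translation.
J = np.eye(3)
content_cov = propagate_covariance(J, pose_cov)
# Per-axis standard deviation of the rendered content's position.
sigma = np.sqrt(np.diag(content_cov))  # → [0.1, 0.1, 0.2]
```

Several such contributions (map pose, localization, tracking) could then be summed as covariances before being reported as the quality information.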
(Item 8)
The method of item 1, wherein:
the coordinate frame local to the portable device comprises a pose of the portable device within a tracking map of the portable device; and
computing quality information for the persistent coordinate frame comprises:
determining an uncertainty in the pose of the portable device in the tracking map; and
aggregating the uncertainty in the pose of the portable device in the tracking map with one or more other sources of positional uncertainty.
(Item 9)
The method of item 1, wherein:
the stored spatial information about the 3D environment comprises a shared map;
the persistent coordinate frame of the shared map comprises feature information representing features at a location within the 3D environment represented by the persistent coordinate frame;
the localization is based on at least one set of feature information representing features in the 3D environment of the portable device; and
computing quality information for the persistent coordinate frame comprises:
determining an uncertainty in matching the at least one set of feature information representing features in the 3D environment of the portable device to the feature information of the persistent coordinate frame; and
applying one or more error propagation transformations to that uncertainty to calculate a contribution to the positional uncertainty of the virtual content.
(Item 10)
A portable electronic system comprising:
one or more sensors configured to capture information about a three-dimensional (3D) environment, the captured information comprising a plurality of images; and
at least one processor configured to execute computer-executable instructions for rendering virtual content within the 3D environment, the computer-executable instructions comprising instructions for:
forming at least one set of feature information from the plurality of images;
obtaining persistent coordinate frame information, the persistent coordinate frame having associated feature information matching the at least one set of feature information, and a quality metric; and
selectively rendering, based at least in part on the quality metric and using the persistent coordinate frame information, a virtual object on a display of the portable device at a location defined relative to the persistent coordinate frame.
(Item 11)
The portable electronic system of item 10, wherein selectively rendering the virtual object on the display of the portable device, based at least in part on the quality metric and using the persistent coordinate frame information, comprises enlarging at least a portion of the virtual object.
(Item 12)
The portable electronic system of item 10, wherein selectively rendering the virtual object on the display of the portable device, based at least in part on the quality metric and using the persistent coordinate frame information, comprises reducing the resolution of at least a portion of the virtual object.
(Item 13)
The portable electronic system of item 10, wherein selectively rendering the virtual object on the display of the portable device, based at least in part on the quality metric and using the persistent coordinate frame information, comprises not rendering at least a portion of the virtual object.
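Items 11 through 13 describe alternative responses to a degraded quality metric. A minimal policy sketch of that selection, with entirely hypothetical threshold values:

```python
def rendering_action(quality: float) -> str:
    """Choose a rendering strategy from a quality metric in [0, 1].
    Thresholds are illustrative only, not specified by the patent."""
    if quality >= 0.8:
        return "render"              # full-fidelity rendering
    if quality >= 0.5:
        return "reduce_resolution"   # item 12: lower resolution
    if quality >= 0.2:
        return "enlarge"             # item 11: enlarge to mask position error
    return "skip"                    # item 13: do not render the portion
```

A runtime would re-evaluate this policy as the quality metric changes, for example as the device moves away from the persistent coordinate frame.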
(Item 14)
The portable electronic system of item 10, wherein selectively rendering the virtual object on the display of the portable device, based at least in part on the quality metric and using the persistent coordinate frame information, comprises preventing sharing of the virtual content with a second portable electronic device.
(Item 15)
The portable electronic system of item 10, wherein the quality metric is inversely related to a distance between the portable electronic system and the persistent coordinate frame.
(Item 16)
The portable electronic system of item 10, wherein the quality metric indicates a zero probability when a distance between the portable electronic system and the persistent coordinate frame exceeds a threshold distance.
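Items 15 and 16 together suggest a metric that decays with distance and clamps to zero past a threshold. One hypothetical functional form (the name, the decay shape, and the 5-metre threshold are all assumptions):

```python
def quality_metric(distance_m: float, threshold_m: float = 5.0) -> float:
    """Quality inversely related to the distance from the persistent
    coordinate frame (item 15), indicating zero probability beyond a
    threshold distance (item 16)."""
    if distance_m >= threshold_m:
        return 0.0
    return 1.0 / (1.0 + distance_m)
```

Any monotonically decreasing function clamped at the threshold would satisfy the two items equally well; the reciprocal form is chosen only for simplicity.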
(Item 17)
A computing environment for a cross-reality system, the computing environment comprising:
a database storing a plurality of maps, each map comprising information representing a region of a 3D environment; and
a non-transitory computer storage medium storing computer-executable instructions that, when executed by at least one processor within the computing environment, perform:
locating a portable device in a map within the database;
obtaining a persistent coordinate frame from the map based at least in part on locating the portable device within the map, the persistent coordinate frame comprising a first pose relative to a coordinate frame of the map; and
calculating quality information for a second pose based at least in part on the first pose and a transformation relating the coordinate frame of the map to a coordinate frame local to the portable device, the quality information indicating a probability that a position error in the second pose is less than a threshold.
(Item 18)
The computing environment of item 17, wherein quality information for the second pose is calculated based at least in part on quality information for the persistent coordinate frame in the map.
(Item 19)
The computing environment of item 17, wherein quality information for the second pose is calculated based at least in part on quality information for the localization of the portable device in the map.
(Item 20)
The computing environment of item 17, wherein quality information for the second pose is calculated based at least in part on quality information for a tracking map calculated by the portable device.
Claims (22)
A method of operating an electronic system to render virtual content within a 3D environment comprising a portable device, the method comprising, with one or more processors:
locating the portable device within the 3D environment based at least in part on stored spatial information about the 3D environment;
obtaining persistent coordinate frame information from the stored spatial information about the 3D environment based at least in part on locating the portable device within the 3D environment, the obtained persistent coordinate frame information comprising a transformation between a coordinate frame local to the portable device and a persistent coordinate frame; and
calculating quality information for the persistent coordinate frame, the quality information indicating a positional uncertainty of virtual content rendered based at least in part on a position defined relative to the persistent coordinate frame, the transformation, and the coordinate frame local to the portable device.
The method of claim 1, further comprising rendering a virtual object on a display of the portable device at a location determined based at least in part on the persistent coordinate frame and the calculated quality information for the persistent coordinate frame.
A computing environment for a cross-reality system, the computing environment comprising:
a database storing a plurality of maps, each map comprising information representing a region of a 3D environment; and
a non-transitory computer storage medium storing computer-executable instructions that, when executed by at least one processor within the computing environment, perform:
locating a portable device in a map within the database;
obtaining a persistent coordinate frame from the map based at least in part on locating the portable device within the map, the persistent coordinate frame comprising a first pose relative to a coordinate frame of the map and a second pose relative to a coordinate frame local to the portable device; and
calculating quality information for the second pose based at least in part on the first pose and a transformation relating the coordinate frame of the map to the coordinate frame local to the portable device, the quality information indicating a probability that a position error in the second pose is less than a threshold.
The method of claim 1 or the computing environment of claim 6, wherein the calculated quality information for the persistent coordinate frame provided to the portable device comprises, in at least one degree of freedom, a probability that a deviation between
(i) a distance between a physical feature in the 3D environment associated with the persistent coordinate frame and a position of the virtual content rendered based on the transformation, wherein the position is defined relative to the persistent coordinate frame and the coordinate frame is local to the portable device, and
(ii) a distance between the physical feature in the 3D environment associated with the persistent coordinate frame, as represented in the persistent coordinate frame, and the position of the virtual content defined relative to the persistent coordinate frame,
is below a threshold amount.
The method of claim 1 or the computing environment of claim 6, wherein the calculated quality information for the persistent coordinate frame as perceived by the portable device indicates an upper-bound position error of the virtual content when rendered by the portable device.
The method of claim 1, wherein:
the portable device comprises a first portable device comprising a first processor of the one or more processors;
the electronic system further comprises a second portable device comprising a second processor of the one or more processors; and
the second processor of the second portable device performs:
obtaining the persistent coordinate frame;
obtaining quality information for the persistent coordinate frame within a view of the second portable device; and
rendering a virtual object on a display of the second portable device while simultaneously indicating changes to the virtual object made through the first portable device.
The method of claim 1 or the computing environment of claim 6, wherein:
the stored spatial information about the 3D environment comprises a shared map;
the persistent coordinate frame has a pose relative to the shared map; and
computing quality information for the persistent coordinate frame comprises:
determining an uncertainty in the pose of the persistent coordinate frame relative to the shared map; and
applying one or more error propagation transformations to the uncertainty in the pose to calculate a contribution to the positional uncertainty of the virtual content.
The method of claim 1 or the computing environment of claim 6, wherein:
the coordinate frame local to the portable device comprises a pose of the portable device within a tracking map of the portable device; and
computing quality information for the persistent coordinate frame comprises:
determining an uncertainty in the pose of the portable device in the tracking map; and
aggregating the uncertainty in the pose of the portable device in the tracking map with one or more other sources of positional uncertainty.
The method of claim 1 or the computing environment of claim 6, wherein:
the stored spatial information about the 3D environment comprises a shared map;
the persistent coordinate frame of the shared map comprises feature information representing features at a location within the 3D environment represented by the persistent coordinate frame;
the localization is based on at least one set of feature information representing features in the 3D environment of the portable device; and
computing quality information for the persistent coordinate frame comprises:
determining an uncertainty in matching the at least one set of feature information representing features in the 3D environment of the portable device to the feature information of the persistent coordinate frame; and
applying one or more error propagation transformations to that uncertainty to calculate a contribution to the positional uncertainty of the virtual content.
A portable electronic system comprising:
one or more sensors configured to capture information about a three-dimensional (3D) environment, the captured information comprising a plurality of images; and
at least one processor configured to execute computer-executable instructions for rendering virtual content within the 3D environment, the computer-executable instructions comprising instructions for:
forming at least one set of feature information from the plurality of images;
obtaining persistent coordinate frame information, the persistent coordinate frame information having associated feature information matching the at least one set of feature information, and a quality metric; and
selectively rendering, based at least in part on the quality metric and using the persistent coordinate frame information, a virtual object on a display of the portable device at a location defined relative to the persistent coordinate frame.
The portable electronic system of claim 14, wherein selectively rendering the virtual object on the display of the portable device, based at least in part on the quality metric and using the persistent coordinate frame information, comprises enlarging at least a portion of the virtual object.
The portable electronic system of claim 14, wherein selectively rendering the virtual object on the display of the portable device, based at least in part on the quality metric and using the persistent coordinate frame information, comprises reducing the resolution of at least a portion of the virtual object.
The portable electronic system of claim 14, wherein selectively rendering the virtual object on the display of the portable device, based at least in part on the quality metric and using the persistent coordinate frame information, comprises not rendering at least a portion of the virtual object.
The portable electronic system of claim 14, wherein selectively rendering the virtual object on the display of the portable device, based at least in part on the quality metric and using the persistent coordinate frame information, comprises preventing sharing of the virtual content with a second portable electronic device.
The computing environment of claim 4, wherein quality information for the second pose is computed based at least in part on quality information for a tracking map computed by the portable device.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962928833P | 2019-10-31 | 2019-10-31 | |
US62/928,833 | 2019-10-31 | ||
PCT/US2020/057887 WO2021087065A1 (en) | 2019-10-31 | 2020-10-29 | Cross reality system with quality information about persistent coordinate frames |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2023501952A JP2023501952A (en) | 2023-01-20 |
JPWO2021087065A5 true JPWO2021087065A5 (en) | 2023-11-02 |
Family
ID=75689029
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2022525302A Pending JP2023501952A (en) | 2019-10-31 | 2020-10-29 | Cross-reality system with quality information about persistent coordinate frames |
Country Status (5)
Country | Link |
---|---|
US (1) | US12100108B2 (en) |
EP (1) | EP4052086A4 (en) |
JP (1) | JP2023501952A (en) |
CN (1) | CN114616509A (en) |
WO (1) | WO2021087065A1 (en) |
US20130162481A1 (en) | 2009-10-01 | 2013-06-27 | Parviz Parvizi | Systems and methods for calibration of indoor geolocation |
US9119027B2 (en) | 2009-10-06 | 2015-08-25 | Facebook, Inc. | Sharing of location-based content item in social networking service |
US8121618B2 (en) * | 2009-10-28 | 2012-02-21 | Digimarc Corporation | Intuitive computing methods and systems |
US8514491B2 (en) | 2009-11-20 | 2013-08-20 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US8185596B2 (en) | 2010-02-22 | 2012-05-22 | Samsung Electronics Co., Ltd. | Location-based communication method and system |
JP2013141049A (en) | 2010-03-24 | 2013-07-18 | Hitachi Ltd | Server and terminal utilizing world coordinate system database |
US8620914B1 (en) | 2010-05-18 | 2013-12-31 | Google Inc. | Ranking of digital goods in a marketplace |
KR101686171B1 (en) | 2010-06-08 | 2016-12-13 | 삼성전자주식회사 | Apparatus for recognizing location using image and range data and method thereof |
WO2012006578A2 (en) | 2010-07-08 | 2012-01-12 | The Regents Of The University Of California | End-to-end visual recognition system and methods |
WO2012126500A1 (en) | 2011-03-18 | 2012-09-27 | C3 Technologies Ab | 3d streets |
WO2012135554A1 (en) | 2011-03-29 | 2012-10-04 | Qualcomm Incorporated | System for the rendering of shared digital interfaces relative to each user's point of view |
US8526368B2 (en) | 2011-05-17 | 2013-09-03 | Qualcomm Incorporated | Wi-Fi access point characteristics database |
US20120314031A1 (en) | 2011-06-07 | 2012-12-13 | Microsoft Corporation | Invariant features for computer vision |
US9082214B2 (en) | 2011-07-01 | 2015-07-14 | Disney Enterprises, Inc. | 3D drawing system for providing a real time, personalized, and immersive artistic experience |
US10019962B2 (en) | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
CN103959308B (en) | 2011-08-31 | 2017-09-19 | Metaio有限公司 | The method that characteristics of image is matched with fixed reference feature |
US9268406B2 (en) | 2011-09-30 | 2016-02-23 | Microsoft Technology Licensing, Llc | Virtual spectator experience with a personal audio/visual apparatus |
US8243102B1 (en) | 2011-10-12 | 2012-08-14 | Google Inc. | Derivative-based selection of zones for banded map display |
US20130141419A1 (en) | 2011-12-01 | 2013-06-06 | Brian Mount | Augmented reality with realistic occlusion |
US8977021B2 (en) | 2011-12-30 | 2015-03-10 | Mako Surgical Corp. | Systems and methods for customizing interactive haptic boundaries |
US9530221B2 (en) | 2012-01-06 | 2016-12-27 | Pelco, Inc. | Context aware moving object detection |
US9041739B2 (en) | 2012-01-31 | 2015-05-26 | Microsoft Technology Licensing, Llc | Matching physical locations for shared virtual experience |
CN107320949B (en) | 2012-02-06 | 2021-02-02 | 索尼互动娱乐欧洲有限公司 | Book object for augmented reality |
CN103297677B (en) | 2012-02-24 | 2016-07-06 | 卡西欧计算机株式会社 | Generate video generation device and the image generating method of reconstruct image |
US9293118B2 (en) | 2012-03-30 | 2016-03-22 | Sony Corporation | Client device |
KR20130110907A (en) | 2012-03-30 | 2013-10-10 | 삼성전자주식회사 | Apparatus and method for remote controlling based on virtual reality and augmented reality |
US8965741B2 (en) | 2012-04-24 | 2015-02-24 | Microsoft Corporation | Context aware surface scanning and reconstruction |
CN102682091A (en) | 2012-04-25 | 2012-09-19 | 腾讯科技(深圳)有限公司 | Cloud-service-based visual search method and cloud-service-based visual search system |
KR101356192B1 (en) | 2012-04-26 | 2014-01-24 | 서울시립대학교 산학협력단 | Method and System for Determining Position and Attitude of Smartphone by Image Matching |
US9122321B2 (en) | 2012-05-04 | 2015-09-01 | Microsoft Technology Licensing, Llc | Collaboration environment using see through displays |
WO2013167864A1 (en) | 2012-05-11 | 2013-11-14 | Milan Momcilo Popovich | Apparatus for eye tracking |
US9743057B2 (en) | 2012-05-31 | 2017-08-22 | Apple Inc. | Systems and methods for lens shading correction |
US11089247B2 (en) | 2012-05-31 | 2021-08-10 | Apple Inc. | Systems and method for reducing fixed pattern noise in image data |
US9311750B2 (en) | 2012-06-05 | 2016-04-12 | Apple Inc. | Rotation operations in a mapping application |
CN104737061B (en) | 2012-06-11 | 2018-01-16 | 奇跃公司 | Use more depth plane three dimensional displays of the waveguided reflector arrays projector |
US9671566B2 (en) | 2012-06-11 | 2017-06-06 | Magic Leap, Inc. | Planar waveguide apparatus with diffraction element(s) and system employing same |
US9111135B2 (en) | 2012-06-25 | 2015-08-18 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera |
US8798357B2 (en) | 2012-07-09 | 2014-08-05 | Microsoft Corporation | Image-based localization |
EP2875471B1 (en) | 2012-07-23 | 2021-10-27 | Apple Inc. | Method of providing image feature descriptors |
WO2014016987A1 (en) | 2012-07-27 | 2014-01-30 | Necソフト株式会社 | Three-dimensional user-interface device, and three-dimensional operation method |
GB2506338A (en) | 2012-07-30 | 2014-04-02 | Sony Comp Entertainment Europe | A method of localisation and mapping |
US9088787B1 (en) | 2012-08-13 | 2015-07-21 | Lockheed Martin Corporation | System, method and computer software product for providing visual remote assistance through computing systems |
US8829409B2 (en) | 2012-10-10 | 2014-09-09 | Thermo Fisher Scientific Inc. | Ultra-high speed imaging array with orthogonal readout architecture |
CN102932582B (en) | 2012-10-26 | 2015-05-27 | 华为技术有限公司 | Method and device for realizing motion detection |
KR102052605B1 (en) | 2012-11-13 | 2019-12-06 | 네이버 주식회사 | Method and system of installing shortcut through mobile application |
US9596481B2 (en) | 2013-01-30 | 2017-03-14 | Ati Technologies Ulc | Apparatus and method for video data processing |
CA3157218A1 (en) | 2013-03-11 | 2014-10-09 | Magic Leap, Inc. | System and method for augmented and virtual reality |
US9349072B2 (en) | 2013-03-11 | 2016-05-24 | Microsoft Technology Licensing, Llc | Local feature based image compression |
US10025486B2 (en) | 2013-03-15 | 2018-07-17 | Elwha Llc | Cross-reality select, drag, and drop for augmented reality systems |
US20140267234A1 (en) | 2013-03-15 | 2014-09-18 | Anselm Hook | Generation and Sharing Coordinate System Between Users on Mobile |
KR101380854B1 (en) | 2013-03-21 | 2014-04-04 | 한국과학기술연구원 | Apparatus and method providing augmented reality contents based on web information structure |
US9269022B2 (en) | 2013-04-11 | 2016-02-23 | Digimarc Corporation | Methods for object recognition and related arrangements |
US9154919B2 (en) | 2013-04-22 | 2015-10-06 | Alcatel Lucent | Localization systems and methods |
EP2808842B1 (en) | 2013-05-31 | 2017-08-16 | Technische Universität München | An apparatus and method for tracking and reconstructing three-dimensional objects |
US10262462B2 (en) | 2014-04-18 | 2019-04-16 | Magic Leap, Inc. | Systems and methods for augmented and virtual reality |
US9874749B2 (en) | 2013-11-27 | 2018-01-23 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
US9406137B2 (en) | 2013-06-14 | 2016-08-02 | Qualcomm Incorporated | Robust tracking using point and line features |
US9329682B2 (en) | 2013-06-18 | 2016-05-03 | Microsoft Technology Licensing, Llc | Multi-step virtual object selection |
WO2014202258A1 (en) | 2013-06-21 | 2014-12-24 | National University Of Ireland, Maynooth | A method for mapping an environment |
US9779548B2 (en) | 2013-06-25 | 2017-10-03 | Jordan Kent Weisman | Multiuser augmented reality system |
US10408613B2 (en) | 2013-07-12 | 2019-09-10 | Magic Leap, Inc. | Method and system for rendering virtual content |
WO2015006784A2 (en) | 2013-07-12 | 2015-01-15 | Magic Leap, Inc. | Planar waveguide apparatus with diffraction element(s) and system employing same |
US9264702B2 (en) | 2013-08-19 | 2016-02-16 | Qualcomm Incorporated | Automatic calibration of scene camera for optical see-through head mounted display |
US9646384B2 (en) | 2013-09-11 | 2017-05-09 | Google Technology Holdings LLC | 3D feature descriptors with camera pose information |
AU2013237718A1 (en) | 2013-10-04 | 2015-04-23 | Canon Kabushiki Kaisha | Method, apparatus and system for selecting a frame |
US9791700B2 (en) | 2013-11-27 | 2017-10-17 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
US9877158B2 (en) | 2013-12-20 | 2018-01-23 | Intel Corporation | Wi-Fi scan scheduling and power adaptation for low-power indoor location |
US10586395B2 (en) | 2013-12-30 | 2020-03-10 | Daqri, Llc | Remote object detection and local tracking using visual odometry |
US10203762B2 (en) * | 2014-03-11 | 2019-02-12 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US9830679B2 (en) | 2014-03-25 | 2017-11-28 | Google Llc | Shared virtual reality |
US20150281869A1 (en) | 2014-03-31 | 2015-10-01 | Google Inc. | Native web-based application |
WO2015155628A1 (en) | 2014-04-07 | 2015-10-15 | Eyeways Systems Ltd. | Apparatus and method for image-based positioning, orientation and situational awareness |
US9754167B1 (en) | 2014-04-17 | 2017-09-05 | Leap Motion, Inc. | Safety for wearable virtual reality devices via object detection and tracking |
WO2015161307A1 (en) | 2014-04-18 | 2015-10-22 | Magic Leap, Inc. | Systems and methods for augmented and virtual reality |
US9728009B2 (en) | 2014-04-29 | 2017-08-08 | Alcatel Lucent | Augmented reality based management of a representation of a smart environment |
GB2526263B (en) | 2014-05-08 | 2019-02-06 | Sony Interactive Entertainment Europe Ltd | Image capture method and apparatus |
AU2015274283B2 (en) | 2014-06-14 | 2020-09-10 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US10068373B2 (en) | 2014-07-01 | 2018-09-04 | Samsung Electronics Co., Ltd. | Electronic device for providing map information |
CN104143212A (en) | 2014-07-02 | 2014-11-12 | 惠州Tcl移动通信有限公司 | Reality augmenting method and system based on wearable device |
US10198865B2 (en) | 2014-07-10 | 2019-02-05 | Seiko Epson Corporation | HMD calibration with direct geometric modeling |
US9596411B2 (en) | 2014-08-25 | 2017-03-14 | Apple Inc. | Combined optical and electronic image stabilization |
US10509865B2 (en) | 2014-09-18 | 2019-12-17 | Google Llc | Dress form for three-dimensional drawing inside virtual reality environment |
KR102276847B1 (en) | 2014-09-23 | 2021-07-14 | 삼성전자주식회사 | Method for providing a virtual object and electronic device thereof |
US20200252233A1 (en) | 2014-09-24 | 2020-08-06 | James Thomas O'Keeffe | System and method for user profile enabled smart building control |
US10719727B2 (en) | 2014-10-01 | 2020-07-21 | Apple Inc. | Method and system for determining at least one property related to at least part of a real environment |
CN106062862B (en) | 2014-10-24 | 2020-04-21 | 杭州凌感科技有限公司 | System and method for immersive and interactive multimedia generation |
US10055892B2 (en) | 2014-11-16 | 2018-08-21 | Eonite Perception Inc. | Active region determination for head mounted displays |
CN106663411A (en) | 2014-11-16 | 2017-05-10 | 易欧耐特感知公司 | Systems and methods for augmented reality preparation, processing, and application |
US20160147408A1 (en) | 2014-11-25 | 2016-05-26 | Johnathan Bevis | Virtual measurement tool for a wearable visualization device |
EP3227708A1 (en) | 2014-12-04 | 2017-10-11 | HERE Global B.V. | Supporting a collaborative collection of data |
US10185775B2 (en) | 2014-12-19 | 2019-01-22 | Qualcomm Technologies, Inc. | Scalable 3D mapping system |
US10335677B2 (en) | 2014-12-23 | 2019-07-02 | Matthew Daniel Fuchs | Augmented reality system with agent device for viewing persistent content and method of operation thereof |
US9685005B2 (en) | 2015-01-02 | 2017-06-20 | Eon Reality, Inc. | Virtual lasers for interacting with augmented reality environments |
US9852546B2 (en) | 2015-01-28 | 2017-12-26 | CCP hf. | Method and system for receiving gesture input via virtual control objects |
US9560737B2 (en) * | 2015-03-04 | 2017-01-31 | International Business Machines Corporation | Electronic package with heat transfer element(s) |
US9697796B2 (en) | 2015-03-26 | 2017-07-04 | Intel Corporation | Adaptive linear luma domain video pipeline architecture |
US20160300389A1 (en) | 2015-04-08 | 2016-10-13 | Exactigo, Inc. | Correlated immersive virtual simulation for indoor navigation |
US9467718B1 (en) | 2015-05-06 | 2016-10-11 | Echostar Broadcasting Corporation | Apparatus, systems and methods for a content commentary community |
US20160335275A1 (en) | 2015-05-11 | 2016-11-17 | Google Inc. | Privacy-sensitive query for localization area description file |
US10373366B2 (en) | 2015-05-14 | 2019-08-06 | Qualcomm Incorporated | Three-dimensional model generation |
KR101725478B1 (en) | 2015-05-21 | 2017-04-11 | 주식회사 맥스트 | Method for displaying augmented reality of based 3d point cloud cognition, apparatus and system for executing the method |
US9898864B2 (en) * | 2015-05-28 | 2018-02-20 | Microsoft Technology Licensing, Llc | Shared tactile interaction and user safety in shared space multi-person immersive virtual reality |
US20160358383A1 (en) | 2015-06-05 | 2016-12-08 | Steffen Gauglitz | Systems and methods for augmented reality-based remote collaboration |
US9977493B2 (en) | 2015-06-17 | 2018-05-22 | Microsoft Technology Licensing, Llc | Hybrid display system |
US20160381118A1 (en) | 2015-06-23 | 2016-12-29 | Microsoft Technology Licensing, Llc | Extracting and formatting content from web-resources |
US10492981B1 (en) | 2015-07-17 | 2019-12-03 | Bao Tran | Systems and methods for computer assisted operation |
US10335572B1 (en) | 2015-07-17 | 2019-07-02 | Naveen Kumar | Systems and methods for computer assisted operation |
US10169917B2 (en) | 2015-08-20 | 2019-01-01 | Microsoft Technology Licensing, Llc | Augmented reality |
US20170061696A1 (en) | 2015-08-31 | 2017-03-02 | Samsung Electronics Co., Ltd. | Virtual reality display apparatus and display method thereof |
EP3347677B1 (en) | 2015-09-10 | 2022-04-06 | Oriient New Media Ltd. | Navigate, track, and position mobile devices in gps-denied or gps-inaccurate areas with automatic map generation |
CA2941893C (en) | 2015-09-14 | 2020-02-25 | The Toronto-Dominion Bank | Connected device-based property evaluation |
IL291685B2 (en) | 2015-09-25 | 2023-09-01 | Magic Leap Inc | Methods and systems for detecting and combining structural features in 3d reconstruction |
US20170094227A1 (en) | 2015-09-25 | 2017-03-30 | Northrop Grumman Systems Corporation | Three-dimensional spatial-awareness vision system |
GB201517101D0 (en) | 2015-09-28 | 2015-11-11 | Univ Essex Entpr Ltd | Mixed-reality system |
US10471355B2 (en) | 2015-10-21 | 2019-11-12 | Sharp Kabushiki Kaisha | Display system, method of controlling display system, image generation control program, and computer-readable storage medium |
US9706366B2 (en) | 2015-11-06 | 2017-07-11 | International Business Machines Corporation | WiFi-fingerprint based indoor localization map |
US10254845B2 (en) | 2016-01-05 | 2019-04-09 | Intel Corporation | Hand gesture recognition for cursor control |
US10523865B2 (en) | 2016-01-06 | 2019-12-31 | Texas Instruments Incorporated | Three dimensional rendering for surround view using predetermined viewpoint lookup tables |
US10397320B2 (en) | 2016-01-20 | 2019-08-27 | International Business Machines Corporation | Location based synchronized augmented reality streaming |
CN114995647B (en) | 2016-02-05 | 2024-09-10 | 奇跃公司 | System and method for augmented reality |
WO2017143303A1 (en) | 2016-02-17 | 2017-08-24 | Meta Company | Apparatuses, methods and systems for sharing virtual elements |
US10373380B2 (en) | 2016-02-18 | 2019-08-06 | Intel Corporation | 3-dimensional scene analysis for augmented reality operations |
JP6776609B2 (en) | 2016-02-22 | 2020-10-28 | デクセリアルズ株式会社 | Anisotropic conductive film |
US20180122143A1 (en) | 2016-03-15 | 2018-05-03 | Sutherland Cook Ellwood, JR. | Hybrid photonic vr/ar systems |
US10115234B2 (en) | 2016-03-21 | 2018-10-30 | Accenture Global Solutions Limited | Multiplatform based experience generation |
US10802147B2 (en) | 2016-05-18 | 2020-10-13 | Google Llc | System and method for concurrent odometry and mapping |
US10586391B2 (en) | 2016-05-31 | 2020-03-10 | Accenture Global Solutions Limited | Interactive virtual reality platforms |
US10217231B2 (en) | 2016-05-31 | 2019-02-26 | Microsoft Technology Licensing, Llc | Systems and methods for utilizing anchor graphs in mixed reality environments |
US10607408B2 (en) | 2016-06-04 | 2020-03-31 | Shape Labs Inc. | Method for rendering 2D and 3D data within a 3D virtual environment |
US10338392B2 (en) | 2016-06-13 | 2019-07-02 | Microsoft Technology Licensing, Llc | Identification of augmented reality image display position |
US11201981B1 (en) | 2016-06-20 | 2021-12-14 | Pipbin, Inc. | System for notification of user accessibility of curated location-dependent content in an augmented estate |
US10504008B1 (en) | 2016-07-18 | 2019-12-10 | Occipital, Inc. | System and method for relocalization and scene recognition |
EP3497676B1 (en) | 2016-08-11 | 2024-07-17 | Magic Leap, Inc. | Automatic placement of a virtual object in a three-dimensional space |
KR102634343B1 (en) | 2016-08-22 | 2024-02-05 | 매직 립, 인코포레이티드 | Virtual, augmented, and mixed reality systems and methods |
US10162362B2 (en) | 2016-08-29 | 2018-12-25 | PerceptIn, Inc. | Fault tolerance to provide robust tracking for autonomous positional awareness |
US10007868B2 (en) | 2016-09-19 | 2018-06-26 | Adobe Systems Incorporated | Font replacement based on visual similarity |
RU2016138608A (en) | 2016-09-29 | 2018-03-30 | Мэджик Лип, Инк. | NEURAL NETWORK FOR SEGMENTING THE EYE IMAGE AND ASSESSING THE QUALITY OF THE IMAGE |
CN107024980A (en) | 2016-10-26 | 2017-08-08 | 阿里巴巴集团控股有限公司 | Customer location localization method and device based on augmented reality |
US10452133B2 (en) | 2016-12-12 | 2019-10-22 | Microsoft Technology Licensing, Llc | Interacting with an environment using a parent device and at least one companion device |
KR102630774B1 (en) | 2016-12-29 | 2024-01-30 | 매직 립, 인코포레이티드 | Automatic control of wearable display device based on external conditions |
US10621773B2 (en) | 2016-12-30 | 2020-04-14 | Google Llc | Rendering content in a 3D environment |
JP7268879B2 (en) | 2017-01-02 | 2023-05-08 | ガウス サージカル,インコーポレイテッド | Tracking Surgical Items Predicting Duplicate Imaging |
US10354129B2 (en) | 2017-01-03 | 2019-07-16 | Intel Corporation | Hand gesture recognition for virtual reality and augmented reality devices |
US10812936B2 (en) | 2017-01-23 | 2020-10-20 | Magic Leap, Inc. | Localization determination for mixed reality systems |
WO2018139773A1 (en) | 2017-01-25 | 2018-08-02 | 한국과학기술연구원 | Slam method and device robust to changes in wireless environment |
US10534964B2 (en) | 2017-01-30 | 2020-01-14 | Blackberry Limited | Persistent feature descriptors for video |
JP6426772B2 (en) | 2017-02-07 | 2018-11-21 | ファナック株式会社 | Coordinate information conversion apparatus and coordinate information conversion program |
CN110892408A (en) | 2017-02-07 | 2020-03-17 | 迈恩德玛泽控股股份有限公司 | Systems, methods, and apparatus for stereo vision and tracking |
US10664993B1 (en) | 2017-03-13 | 2020-05-26 | Occipital, Inc. | System for determining a pose of an object |
US10460489B2 (en) | 2017-03-15 | 2019-10-29 | Facebook, Inc. | Visual editor for designing augmented-reality effects and configuring scaling parameters |
AU2018234929B2 (en) | 2017-03-17 | 2022-06-30 | Magic Leap, Inc. | Technique for recording augmented reality data |
US10600252B2 (en) | 2017-03-30 | 2020-03-24 | Microsoft Technology Licensing, Llc | Coarse relocalization using signal fingerprints |
US10466953B2 (en) | 2017-03-30 | 2019-11-05 | Microsoft Technology Licensing, Llc | Sharing neighboring map data across devices |
KR102648256B1 (en) | 2017-03-30 | 2024-03-14 | 매직 립, 인코포레이티드 | Centralized Rendering |
US9754397B1 (en) | 2017-04-07 | 2017-09-05 | Mirage Worlds, Inc. | Systems and methods for contextual augmented reality sharing and performance |
CN113608617A (en) | 2017-04-19 | 2021-11-05 | 奇跃公司 | Multi-modal task execution and text editing for wearable systems |
CA3060209A1 (en) | 2017-05-01 | 2018-11-08 | Magic Leap, Inc. | Matching content to a spatial 3d environment |
US11417091B2 (en) | 2017-05-30 | 2022-08-16 | Ptc Inc. | Use of coordinated local user devices during a shared augmented reality session |
CN109145927A (en) | 2017-06-16 | 2019-01-04 | 杭州海康威视数字技术股份有限公司 | The target identification method and device of a kind of pair of strain image |
JP6585665B2 (en) | 2017-06-29 | 2019-10-02 | ファナック株式会社 | Virtual object display system |
US10929945B2 (en) | 2017-07-28 | 2021-02-23 | Google Llc | Image capture devices featuring intelligent use of lightweight hardware-generated statistics |
US10503955B2 (en) | 2017-08-29 | 2019-12-10 | Synaptics Incorporated | Device with improved circuit positioning |
WO2019046774A1 (en) | 2017-09-01 | 2019-03-07 | Memorial Sloan Kettering Cancer Center | Systems and methods for generating 3d medical images by scanning a whole tissue block |
US10546387B2 (en) | 2017-09-08 | 2020-01-28 | Qualcomm Incorporated | Pose determination with semantic segmentation |
US10685456B2 (en) * | 2017-10-12 | 2020-06-16 | Microsoft Technology Licensing, Llc | Peer to peer remote localization for devices |
US10612929B2 (en) | 2017-10-17 | 2020-04-07 | AI Incorporated | Discovering and plotting the boundary of an enclosure |
US10573089B2 (en) | 2017-11-09 | 2020-02-25 | The Boeing Company | Systems, methods, and tools for spatially-registering virtual content with physical environment in augmented reality platforms |
AU2018369757B2 (en) | 2017-11-14 | 2023-10-12 | Magic Leap, Inc. | Fully convolutional interest point detection and description via homographic adaptation |
EP3724713A4 (en) | 2017-12-15 | 2021-08-25 | Magic Leap, Inc. | Enhanced pose determination for display device |
US10630857B2 (en) | 2017-12-21 | 2020-04-21 | Ricoh Company, Ltd. | Electronic apparatus and method to update firmware of the electronic apparatus when adding a web application to the electronic apparatus |
CA3084149A1 (en) | 2017-12-22 | 2019-06-27 | Magic Leap, Inc. | Methods and system for managing and displaying virtual content in a mixed reality system |
US20190206258A1 (en) | 2018-01-04 | 2019-07-04 | nuTonomy Inc. | Augmented reality vehicle interfacing |
US11237004B2 (en) | 2018-03-27 | 2022-02-01 | Uatc, Llc | Log trajectory estimation for globally consistent maps |
CN110322300B (en) | 2018-03-28 | 2024-06-18 | 北京京东尚科信息技术有限公司 | Data processing method and device, electronic equipment and storage medium |
US10838574B2 (en) | 2018-04-09 | 2020-11-17 | Spatial Systems Inc. | Augmented reality computing environments—workspace save and load |
US11086474B2 (en) | 2018-04-09 | 2021-08-10 | Spatial Systems Inc. | Augmented reality computing environments—mobile device join and load |
WO2019210284A1 (en) | 2018-04-26 | 2019-10-31 | SCRRD, Inc. | Augmented reality platform and method for use of same |
CN108665508B (en) | 2018-04-26 | 2022-04-05 | 腾讯科技(深圳)有限公司 | Instant positioning and map construction method, device and storage medium |
US10803671B2 (en) | 2018-05-04 | 2020-10-13 | Microsoft Technology Licensing, Llc | Authoring content in three-dimensional environment |
US11321929B2 (en) | 2018-05-18 | 2022-05-03 | Purdue Research Foundation | System and method for spatially registering multiple augmented reality devices |
US10812711B2 (en) | 2018-05-18 | 2020-10-20 | Samsung Electronics Co., Ltd. | Semantic mapping for low-power augmented reality using dynamic vision sensor |
CN110515452B (en) | 2018-05-22 | 2022-02-22 | 腾讯科技(深圳)有限公司 | Image processing method, image processing device, storage medium and computer equipment |
US11010977B2 (en) * | 2018-05-31 | 2021-05-18 | Jido, Inc. | Method for establishing a common reference frame amongst devices for an augmented reality session |
US10706629B2 (en) | 2018-06-15 | 2020-07-07 | Dell Products, L.P. | Coordinate override in virtual, augmented, and mixed reality (xR) applications |
US10964053B2 (en) | 2018-07-02 | 2021-03-30 | Microsoft Technology Licensing, Llc | Device pose estimation using 3D line clouds |
WO2020023582A1 (en) * | 2018-07-24 | 2020-01-30 | Magic Leap, Inc. | Methods and apparatuses for determining and/or evaluating localizing maps of image display devices |
US10957112B2 (en) * | 2018-08-13 | 2021-03-23 | Magic Leap, Inc. | Cross reality system |
US11227435B2 (en) | 2018-08-13 | 2022-01-18 | Magic Leap, Inc. | Cross reality system |
KR101985703B1 (en) | 2018-08-24 | 2019-06-04 | 주식회사 버넥트 | Augmented reality service Software as a Service based Augmented Reality Operating System |
JP7361763B2 (en) | 2018-09-04 | 2023-10-16 | アクティーア・ソシエテ・アノニム | System for determining blood pressure of one or more users |
WO2020056692A1 (en) * | 2018-09-20 | 2020-03-26 | 太平洋未来科技(深圳)有限公司 | Information interaction method and apparatus, and electronic device |
EP3629290B1 (en) | 2018-09-26 | 2023-01-04 | Apple Inc. | Localization for mobile devices |
CN113196209A (en) | 2018-10-05 | 2021-07-30 | 奇跃公司 | Rendering location-specific virtual content at any location |
US11461979B2 (en) | 2018-10-21 | 2022-10-04 | Oracle International Corporation | Animation between visualization objects in a virtual dashboard |
US10937191B2 (en) | 2018-10-23 | 2021-03-02 | Dell Products, Lp | Predictive simultaneous localization and mapping system using prior user session positional information |
US10839556B2 (en) | 2018-10-23 | 2020-11-17 | Microsoft Technology Licensing, Llc | Camera pose estimation using obfuscated features |
US10854007B2 (en) | 2018-12-03 | 2020-12-01 | Microsoft Technology Licensing, Llc | Space models for mixed reality |
US20200211290A1 (en) | 2018-12-26 | 2020-07-02 | Lg Electronics Inc. | Xr device for providing ar mode and vr mode and method for controlling the same |
US11030812B2 (en) | 2019-01-02 | 2021-06-08 | The Boeing Company | Augmented reality system using enhanced models |
US10767997B1 (en) | 2019-02-25 | 2020-09-08 | Qualcomm Incorporated | Systems and methods for providing immersive extended reality experiences on moving platforms |
CN113412614B (en) | 2019-03-27 | 2023-02-14 | Oppo广东移动通信有限公司 | Three-dimensional localization using depth images |
US11074706B2 (en) | 2019-04-12 | 2021-07-27 | Intel Corporation | Accommodating depth noise in visual slam using map-point consensus |
CN111815755B (en) | 2019-04-12 | 2023-06-30 | Oppo广东移动通信有限公司 | Method and device for determining blocked area of virtual object and terminal equipment |
US11151792B2 (en) | 2019-04-26 | 2021-10-19 | Google Llc | System and method for creating persistent mappings in augmented reality |
US10748302B1 (en) | 2019-05-02 | 2020-08-18 | Apple Inc. | Multiple user simultaneous localization and mapping (SLAM) |
CN110221690B (en) * | 2019-05-13 | 2022-01-04 | Oppo广东移动通信有限公司 | Gesture interaction method and device based on AR scene, storage medium and communication terminal |
US11010921B2 (en) | 2019-05-16 | 2021-05-18 | Qualcomm Incorporated | Distributed pose estimation |
US20200364937A1 (en) | 2019-05-16 | 2020-11-19 | Subvrsive, Inc. | System-adaptive augmented reality |
US11145083B2 (en) | 2019-05-21 | 2021-10-12 | Microsoft Technology Licensing, Llc | Image-based localization |
US10854012B1 (en) | 2019-05-29 | 2020-12-01 | Dell Products, L.P. | Concealing loss of distributed simultaneous localization and mapping (SLAM) data in edge cloud architectures |
US20200380263A1 (en) | 2019-05-29 | 2020-12-03 | Gyrfalcon Technology Inc. | Detecting key frames in video compression in an artificial intelligence semiconductor solution |
US20200401617A1 (en) | 2019-06-21 | 2020-12-24 | White Raven Ltd | Visual positioning system |
US10852828B1 (en) | 2019-07-17 | 2020-12-01 | Dell Products, L.P. | Automatic peripheral pairing with hand assignments in virtual, augmented, and mixed reality (xR) applications |
US10936874B1 (en) | 2019-08-13 | 2021-03-02 | Dell Products, L.P. | Controller gestures in virtual, augmented, and mixed reality (xR) applications |
KR20190104928A (en) | 2019-08-22 | 2019-09-11 | 엘지전자 주식회사 | Extended reality device and method for controlling the extended reality device |
US11270515B2 (en) | 2019-09-04 | 2022-03-08 | Qualcomm Incorporated | Virtual keyboard |
CN114222960A (en) | 2019-09-09 | 2022-03-22 | 苹果公司 | Multimodal input for computer-generated reality |
US12079638B2 (en) | 2019-10-03 | 2024-09-03 | Magic Leap, Inc. | Management framework for mixed reality devices |
US11425220B2 (en) | 2019-10-08 | 2022-08-23 | Magic Leap, Inc. | Methods, systems, and computer program products for implementing cross-platform mixed-reality applications with a scripting framework |
JP2022551734A (en) * | 2019-10-15 | 2022-12-13 | マジック リープ, インコーポレイテッド | Cross-reality system that supports multiple device types |
US11632679B2 (en) * | 2019-10-15 | 2023-04-18 | Magic Leap, Inc. | Cross reality system with wireless fingerprints |
JP2022551733A (en) | 2019-10-15 | 2022-12-13 | マジック リープ, インコーポレイテッド | Cross-reality system with localization service |
US11494995B2 (en) | 2019-10-29 | 2022-11-08 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality |
CN114616509A (en) * | 2019-10-31 | 2022-06-10 | 奇跃公司 | Cross-reality system with quality information about persistent coordinate frames |
WO2021096931A1 (en) | 2019-11-12 | 2021-05-20 | Magic Leap, Inc. | Cross reality system with localization service and shared location-based content |
EP4073763A4 (en) | 2019-12-09 | 2023-12-27 | Magic Leap, Inc. | Cross reality system with simplified programming of virtual content |
EP4103910A4 (en) | 2020-02-13 | 2024-03-06 | Magic Leap, Inc. | Cross reality system with accurate shared maps |
EP4104144A4 (en) | 2020-02-13 | 2024-06-05 | Magic Leap, Inc. | Cross reality system for large scale environments |
JP2023514208A (en) | 2020-02-13 | 2023-04-05 | Magic Leap, Inc. | Cross-reality system with map processing using multi-resolution frame descriptors |
JP2023514207A (en) | 2020-02-13 | 2023-04-05 | Magic Leap, Inc. | Cross-reality system with prioritization of geolocation information for localization |
US11593951B2 (en) | 2020-02-25 | 2023-02-28 | Qualcomm Incorporated | Multi-device object tracking and localization |
CN115461787A (en) | 2020-02-26 | 2022-12-09 | Magic Leap, Inc. | Cross reality system with fast localization |
US11900322B2 (en) | 2020-03-20 | 2024-02-13 | Procore Technologies, Inc. | Presence and collaboration tools for building information models |
JP2023524446A (en) | 2020-04-29 | 2023-06-12 | Magic Leap, Inc. | Cross-reality system for large-scale environments |
- 2020-10-29 CN CN202080076525.7A patent/CN114616509A/en active Pending
- 2020-10-29 US US17/084,174 patent/US12100108B2/en active Active
- 2020-10-29 WO PCT/US2020/057887 patent/WO2021087065A1/en unknown
- 2020-10-29 EP EP20880832.9A patent/EP4052086A4/en active Pending
- 2020-10-29 JP JP2022525302A patent/JP2023501952A/en active Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JPWO2021087065A5 (en) | ||
US9435911B2 (en) | Visual-based obstacle detection method and apparatus for mobile robot | |
US8199977B2 (en) | System and method for extraction of features from a 3-D point cloud | |
US10086955B2 (en) | Pattern-based camera pose estimation system | |
US10636168B2 (en) | Image processing apparatus, method, and program | |
CN109472820B (en) | Monocular RGB-D camera real-time face reconstruction method and device | |
US10451403B2 (en) | Structure-based camera pose estimation system | |
US9773335B2 (en) | Display control device and method | |
JP2019534510A5 (en) | ||
JP6349418B2 (en) | Object positioning by high-precision monocular movement | |
JP2018526698A (en) | Privacy sensitive queries in localization area description files | |
US9858669B2 (en) | Optimized camera pose estimation system | |
CN109961523B (en) | Method, device, system, equipment and storage medium for updating virtual target | |
JP2005528707A5 (en) | ||
CN111459269B (en) | Augmented reality display method, system and computer readable storage medium | |
JP2020173795A5 (en) | ||
KR101851303B1 (en) | Apparatus and method for reconstructing 3d space | |
AU2017279679A1 (en) | Fast rendering of quadrics and marking of silhouettes thereof | |
CN111311681A (en) | Visual positioning method, device, robot and computer readable storage medium | |
JP2018181047A (en) | Three-dimensional shape model generating device, three-dimensional shape model generating method and program | |
JPWO2021163300A5 (en) | ||
CN113989376A (en) | Method and device for acquiring indoor depth information and readable storage medium | |
JPWO2020262392A5 (en) | ||
US20200191577A1 (en) | Method and system for road image reconstruction and vehicle positioning | |
US11758100B2 (en) | Portable projection mapping device and projection mapping system |