WO2019105002A1 - Systems and methods for creating virtual 3d environment - Google Patents
- Publication number: WO2019105002A1
- PCT application: PCT/CN2018/091412
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- place
- virtual
- environment
- map
- scanner
Classifications
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
- G06T2200/08—Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
- G06T2219/2004—Aligning objects, relative positioning of parts
Definitions
- the integration system 100 may allow a user 114 to view the overall 3D environment by using a mobile device 112 or any other device with a screen or browser, such as a computer or tablet.
- the user 114 may view the overall 3D environment of the place by using VR headsets, or AR technology based devices. The user 114 may not be required to physically visit the place or exhibition for viewing it. Further, the user 114 may view the overall 3D environment of any place virtually from anywhere and at anytime or in real-time.
- the processor 108 may be a software application, hardware, firmware, or a combination of these. In some embodiments, the processor 108 may be a software application installed on the mobile device 112.
- the mobile device 112 can be a computing device such as, but not limited to, a computer, a VR headset, an AR based device, a laptop, a tablet computer, a smart television, a smart phone, and so forth.
- the user 114 accesses the processor for viewing the 3D virtual environments of the place via a web browser.
- FIG. 2 illustrates a flowchart of a method 200 for generating an overall three dimensional (3D) environment of a place by using the integration system 100 of FIG. 1, in accordance with an embodiment of the present disclosure.
- the aerial scanner 102 scans an outside area of the place such as a concert hall, for creating a first virtual 3D map of the outside area of the place.
- the indoor scanner 104 scans an indoor or inside area of the place to create a second virtual 3D map of the inside area of the place.
- the object scanner 106 scans one or more objects or people placed within the place to create a third virtual 3D map of the one or more objects.
- the processor 108 integrates the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects for creating an overall 3D environment for the place.
- the overall 3D environment may be a virtual environment created for use with web based/AR or VR based systems or devices.
- the 3D environment may then be stored in a database for later use.
- the user 114 may view the overall 3D environment for the place via at least one of a web based/AR/VR based system.
- the present disclosure provides an augmented reality (AR) or virtual reality (VR) based immersive display system for generating an online or an offline realistic view of a location such as an exhibition.
- the user may view the exhibition from anywhere.
- the present disclosure provides a real feel of any place via the overall 3D environment for a user. While viewing the overall 3D environment, the user may feel like being part of the place. The user may select or choose a place whose virtual 3D environment the user wants to view. The user may view it on the mobile device from anywhere, irrespective of location. The user only requires a suitable network for accessing the integration system/processor.
- the network may be the Internet.
- the present disclosure provides users the flexibility to view a 3D view of any place or venue at a time and place of their convenience.
- the user may view the exhibition or location or any event at anytime from anywhere.
- the present disclosure provides the integration system for creating an immersive display by integrating the outputs of a plurality of scanners using known technologies.
- Embodiments of the disclosure are also described above with reference to flowchart illustrations and/or block diagrams of methods and systems. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, may be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the acts specified in the flowchart and/or block diagram block or blocks.
- These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the acts specified in the flowchart and/or block diagram block or blocks.
- the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the acts specified in the flowchart and/or block diagram block or blocks.
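The flow described in the items above — scan the outside area, scan the indoor area, scan the objects in the place, integrate the three maps, and store the result for viewing — can be sketched as a minimal pipeline. This is an illustrative assumption, not the disclosed implementation: every name here (`ScanOutput`, `create_environment`, the scanner stubs and their sample points) is hypothetical, and real scanners would return dense 3D maps rather than single points.

```python
from dataclasses import dataclass, field


@dataclass
class ScanOutput:
    """A virtual 3D map produced by one scanner, as (x, y, z) points."""
    source: str                               # "aerial", "indoor", or "object"
    points: list = field(default_factory=list)


def aerial_scan(place):
    # Stub: a real aerial scanner would return a 3D map of the outside area.
    return ScanOutput("aerial", [(0.0, 0.0, 10.0)])


def indoor_scan(place):
    # Stub: a real indoor scanner would return a 3D map of the inside area.
    return ScanOutput("indoor", [(1.0, 2.0, 0.0)])


def object_scan(place):
    # Stub: a real object scanner would map the objects placed within the place.
    return ScanOutput("object", [(1.5, 2.5, 0.5)])


def integrate(maps):
    """Merge the per-scanner maps into one overall 3D environment."""
    return {m.source: m.points for m in maps}


def create_environment(place, database):
    env = integrate([aerial_scan(place), indoor_scan(place), object_scan(place)])
    database[place] = env  # stored for later viewing via a web/AR/VR client
    return env


db = {}
env = create_environment("exhibition-hall", db)
```

A dict stands in for the database here; any persistent store with a place-keyed lookup would fit the same shape.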
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Geometry (AREA)
- Architecture (AREA)
- Processing Or Creating Images (AREA)
Abstract
An integration system for generating a three dimensional (3D) environment of a place. The integration system includes a number of 3D scanners for generating a plurality of 3D scanned images of the place from different views, wherein the plurality of 3D scanners comprises at least one of an aerial scanner (102), an indoor scanner (104), and an object scanner (106). The integration system further includes a processor (108) for integrating the plurality of 3D scanned images for creating an overall 3D environment for the place.
Description
The presently disclosed embodiments relate to the field of 3D imaging systems. More specifically, embodiments of the present disclosure relate to systems and methods for creating a virtual environment for a place by generating an integrated 3D output by integrating an output of a plurality of scanners.
Virtual reality technology may use multi-projected environments or headsets, together with physical props or environments, to create realistic images, sound, and sensation that simulate a person’s physical presence in a virtual environment. Augmented reality, on the other hand, may bring components of the digital world into a person's perceived real world. Augmented reality (AR) may be a live direct or indirect view of a physical, real-world environment whose elements are "augmented" by computer-generated or extracted real-world sensory input such as video, sound, graphics, or GPS data. Originally, virtual reality and augmented reality techniques were used in the entertainment and gaming businesses, but other industries are now also becoming interested in AR's possibilities, for example in knowledge sharing, education, managing the information flood, and organizing distant meetings. Augmented reality has considerable potential for gathering and sharing tacit knowledge. Augmented reality may enhance one’s current perception of reality, whereas virtual reality may replace the real world with a simulated one. Augmented reality is usually used to enhance experienced environments or situations and thereby offer an enriched experience to users. Further, existing augmentation techniques for generating immersive augmented videos or experiences are typically performed in real time and in semantic context with environmental elements, such as overlaying supplemental information like scores over a live video feed of a sporting event. Although AR and VR have great potential, hardly any techniques exist for integrating real environment views with virtual environments in AR/VR systems.
In light of the above, there exists a need for techniques for creating integrated immersive displays or environments for web/AR/VR systems.
SUMMARY
With the help of advanced AR technology, for example by adding computer vision and object recognition, the information about the user's surrounding real world may become interactive, and the user may be able to digitally manipulate the augmented reality environment. Information about the environment and its objects may be overlaid on the real world. This information can be virtual or real, e.g. seeing other real sensed or measured information, such as electromagnetic radio waves, overlaid in exact alignment with where they actually are in space. One example is an AR helmet for construction workers, which can display information about construction sites.
The present disclosure provides a system for integrating a plurality of outputs of a plurality of scanners for generating a virtual environment for a user.
Another objective of the present disclosure is to provide a system for generating an integrated immersive display for use with web based, augmented reality (AR) based, or virtual reality (VR) based systems. Augmented reality systems may generate a live direct or indirect view of a physical, real-world environment whose elements may be augmented by computer-generated or extracted real-world sensory inputs such as video, sound, haptics, graphics, images, or GPS data. Virtual reality (VR) systems may use multi-projected environments, sometimes in combination with props or a physical environment, to generate realistic images, sounds, and other sensations that may simulate a user's physical presence in a virtual or imaginary environment.
A further objective of the present disclosure is to provide a method for generating an integrated immersive display for use with web based systems.
An objective of the present disclosure is to provide a method for generating an integrated immersive display for use with augmented reality based or virtual reality based systems.
A further objective of the present disclosure is to provide a system to create an overall environment by integration of a plurality of scanned outputs received from 3D scanners comprising at least one of an aerial scanner, an indoor scanner, and an object scanner, for any place such as an exhibition centre/venue.
An objective of the present disclosure is to provide a three-dimensional based virtual environment based on an output from multiple scanners to the users.
Another objective of the present disclosure is to provide a system for enabling a user to view an integrated scanned output on a web/AR/VR based system.
Another objective of the present disclosure is to provide an augmented reality or virtual reality based immersive display of an area comprising an indoor area, an outdoor area, and so forth to users.
An embodiment of the present disclosure provides an integration system for generating a three dimensional (3D) environment of a place, comprising: a number of 3D scanners for generating a plurality of 3D scanned images of the place from different views, wherein the plurality of 3D scanners comprises at least one of an aerial scanner, an indoor scanner, and an object scanner; and a processor for integrating the plurality of 3D scanned images for creating an overall 3D environment for the place.
According to another aspect of the present disclosure, the aerial scanner is configured to scan an outside area of the place to create a first virtual 3D map of the outside area.
According to another aspect of the present disclosure, the indoor scanner is configured to scan an indoor area of the place to create a second virtual 3D map of the inside area.
According to another aspect of the present disclosure, the object scanner is configured to scan one or more objects placed within the place to create a third virtual 3D map of the objects.
According to another aspect of the present disclosure, the processor is configured to integrate the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects for creating the overall 3D environment for the place.
According to another aspect of the present disclosure, a user views the overall 3D environment for the place via at least one of a web based/AR/VR based system.
According to another aspect of the present disclosure, the one or more 3D scanners comprising at least one of an object scanner, an indoor scanner, and an outdoor scanner.
According to an aspect of the present disclosure, the disclosed system provides a virtual environment with a real feel/effect to the users. The users may feel as being a part of the virtual environment.
According to another aspect of the present disclosure, the user accesses the integration system via a web browser on a mobile device.
According to another aspect of the present disclosure, the integration system is installed on a mobile device of the user.
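The web-browser access described in the aspects above could, under stated assumptions, be sketched as a small request handler that resolves a place name to its stored overall 3D environment. The URL scheme, function name, and sample data below are illustrative only; the disclosure does not specify a web API.

```python
import json


def handle_request(path: str, db: dict):
    """Resolve '/environments/<place>' to a stored overall 3D environment.

    Returns an (HTTP status, JSON body) pair that any web framework could
    serve to a browser or an AR/VR client.
    """
    prefix = "/environments/"
    if not path.startswith(prefix):
        return 404, json.dumps({"error": "unknown route"})
    place = path[len(prefix):]
    env = db.get(place)
    if env is None:
        return 404, json.dumps({"error": "no such place"})
    return 200, json.dumps(env)


# Hypothetical stored environment keyed by place name.
db = {"exhibition": {"aerial": [[0, 0, 10]]}}
status, body = handle_request("/environments/exhibition", db)
```

Serving JSON keeps the handler framework-agnostic: the same pair could be returned from a WSGI app, an `http.server` handler, or any other web stack.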
Another embodiment of the present disclosure provides an integration system for generating a three dimensional (3D) environment of a place, comprising: an aerial scanner configured to scan an outside area of the place to create a first virtual 3D map of the outside area of the place; an indoor scanner configured to scan an indoor area of the place to create a second virtual 3D map of the inside area of the place; an object scanner configured to scan one or more objects placed within the place to create a third virtual 3D map of the one or more objects; a processor configured to integrate the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects for creating an overall 3D environment for the place; and a database configured to store the integrated overall 3D environment.
Another embodiment of the present disclosure provides a method for creating a three dimensional (3D) environment of a place, comprising: scanning, by an aerial scanner, an outside area of the place to create a first virtual 3D map of the outside area of the place; scanning, by an indoor scanner, an indoor area of the place to create a second virtual 3D map of the inside area of the place; scanning, by an object scanner, one or more objects placed within the place to create a third virtual 3D map of the one or more objects; integrating, by a processor, the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects for creating an overall 3D environment for the place; and storing, in a database, the integrated overall 3D environment.
A further embodiment of the present disclosure provides a method for generating a three dimensional (3D) environment of a place, comprising: generating, by a plurality of 3D scanners, a plurality of 3D scanned images of the place from different views, wherein the plurality of 3D scanners comprises at least one of an aerial scanner, an indoor scanner, and an object scanner; and integrating, by a processor, the plurality of 3D scanned images for creating an overall 3D environment for the place.
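The integrating step in the embodiments above must place the separate virtual 3D maps in a single coordinate frame before they can form one overall environment. The disclosure does not specify how alignment is done; one common sketch, assuming each scanner's pose in a shared world frame is known, is to apply a rigid 4x4 homogeneous transform per map and then combine the results:

```python
def translation(tx, ty, tz):
    """4x4 homogeneous transform that shifts points by (tx, ty, tz)."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]


def apply_transform(T, points):
    """Map (x, y, z) points from a scanner-local frame into the world frame."""
    out = []
    for x, y, z in points:
        v = (x, y, z, 1.0)
        out.append(tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3)))
    return out


# Each scanner reports points in its own frame; a per-scanner transform
# (here plain translations, assumed known from scanner placement) moves
# them into one shared world frame. All sample values are illustrative.
maps = {
    "aerial": ([(0.0, 0.0, 20.0)], translation(0.0, 0.0, 0.0)),
    "indoor": ([(1.0, 1.0, 0.0)], translation(10.0, 0.0, 0.0)),
    "object": ([(0.5, 0.5, 0.5)], translation(10.0, 5.0, 0.0)),
}
overall = {name: apply_transform(T, pts) for name, (pts, T) in maps.items()}
```

In practice the poses would come from registration (e.g. matching overlapping geometry between scans) rather than being hard-coded, and the transforms would include rotation as well as translation.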
Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following drawings. In the drawings, like reference numerals refer to like parts throughout the various figures unless otherwise specified.
For a better understanding of the present invention, reference will be made to the following Detailed Description, which is to be read in association with the accompanying drawings, wherein:
FIG. 1 illustrates an exemplary environment where various embodiments of the present disclosure may function; and
FIG. 2 illustrates a flowchart of a method for creating an overall 3D environment of a place by using the integration system 100 of FIG. 1, in accordance with an embodiment of the present disclosure.
The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word “may” is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures.
The presently disclosed subject matter is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or elements similar to the ones described in this document, in conjunction with other present or future technologies. Moreover, although the term “step” may be used herein to connote different aspects of methods employed, the term should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.
Reference throughout this specification to “a select embodiment”, “one embodiment”, or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosed subject matter. Thus, appearances of the phrases “a select embodiment”, “in one embodiment”, or “in an embodiment” in various places throughout this specification are not necessarily referring to the same embodiment.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided, to provide a thorough understanding of embodiments of the disclosed subject matter. One skilled in the relevant art will recognize, however, that the disclosed subject matter can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosed subject matter.
All numeric values are herein assumed to be modified by the term “about,” whether or not explicitly indicated. The term “about” generally refers to a range of numbers that one of skill in the art would consider equivalent to the recited value (i.e., having the same or substantially the same function or result). In many instances, the term “about” may include numbers that are rounded to the nearest significant figure. The recitation of numerical ranges by endpoints includes all numbers within that range (e.g., 1 to 5 includes 1, 1.5, 2, 2.75, 3, 3.80, 4, and 5).
As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include or otherwise refer to singular as well as plural referents, unless the content clearly dictates otherwise. As used in this specification and the appended claims, the term “or” is generally employed to include “and/or,” unless the content clearly dictates otherwise.
The following detailed description should be read with reference to the drawings, in which similar elements in different drawings are identified with the same reference numbers. The drawings, which are not necessarily to scale, depict illustrative embodiments and are not intended to limit the scope of the disclosure.
FIG. 1 is a block diagram illustrating an exemplary integration system 100 for creating a 3D environment for a place, according to an embodiment of the present disclosure. As shown, the integration system 100 primarily includes one or more three-dimensional (3D) scanners comprising at least one of an aerial scanner 102, an indoor scanner 104, and an object scanner 106. The 3D scanners may be configured to generate a plurality of 3D scanned images or a 3D model of an area. For example, the aerial scanner 102 may be configured to scan an outside area of the place to create a first virtual 3D map of the outside area of the place. The indoor scanner 104 is configured to scan an indoor area of the place to create a second virtual 3D map of the inside area of the place. The object scanner 106 is configured to scan one or more objects placed within the place to create a third virtual 3D map of the one or more objects.
The integration system 100 further includes a processor 108 configured to integrate the outputs of the scanners 102-106 to create an overall 3D environment of the place. In some embodiments, the processor 108 is configured to integrate the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects to create the overall 3D environment for the place. The place can be an event venue, an exhibition, a park, a house, an office, a concert hall, a hospital, or any other area. The processor 108 may store the scanned images, information about the scanners 102-106, and the overall 3D environment in a database 110. The database 110 may be built into the processor 108. In some embodiments, the database 110 may be located remotely and connected to the processor 108 via a network (not shown). The network may be, for example, a local area network, a wide area network, the Internet, and so forth.
The integration system 100 may allow a user 114 to view the overall 3D environment by using a mobile device 112 or any other device with a screen or browser, such as a computer or tablet. In some embodiments, the user 114 may view the overall 3D environment of the place by using VR headsets or AR-based devices. The user 114 is not required to physically visit the place or exhibition to view it. Further, the user 114 may view the overall 3D environment of any place virtually, from anywhere, at any time, or in real time. In some embodiments, the processor 108 may be implemented as a software application, hardware, firmware, or a combination of these. In some embodiments, the processor 108 may be a software application installed on the mobile device 112. The mobile device 112 can be a computing device such as, but not limited to, a computer, a VR headset, an AR-based device, a laptop, a tablet computer, a smart television, a smart phone, and so forth. In some embodiments, the user 114 accesses the processor via a web browser to view the 3D virtual environments of the place.
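The relationship between the scanners 102-106, the processor 108, and the database 110 described above can be sketched in code. The following is a minimal, illustrative Python model, not an implementation from the disclosure: the class and field names (`VirtualMap3D`, `IntegrationProcessor`, point-cloud layers) are hypothetical, and each scanner output is simplified to a labeled list of 3D points.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMap3D:
    """One scanner's output: a labeled virtual 3D map (hypothetical model)."""
    source: str   # e.g. "aerial", "indoor", "object"
    points: list  # (x, y, z) vertices of the scanned area

@dataclass
class IntegrationProcessor:
    """Stands in for processor 108; the dict stands in for database 110."""
    database: dict = field(default_factory=dict)

    def integrate(self, place: str, maps: list) -> dict:
        # Merge the outside, inside, and object maps into one
        # overall 3D environment, keyed by scanner source.
        environment = {
            "place": place,
            "layers": {m.source: m.points for m in maps},
        }
        # Persist the overall 3D environment for later web/AR/VR viewing.
        self.database[place] = environment
        return environment

# Dummy maps standing in for the three scanners' outputs.
outside = VirtualMap3D("aerial", [(0.0, 0.0, 10.0)])
inside = VirtualMap3D("indoor", [(1.0, 2.0, 0.0)])
objects = VirtualMap3D("object", [(1.5, 2.5, 0.5)])

processor = IntegrationProcessor()
env = processor.integrate("concert_hall", [outside, inside, objects])
print(sorted(env["layers"]))  # ['aerial', 'indoor', 'object']
```

In a real system each `points` list would be a dense point cloud or mesh, and integration would involve registering the three maps into a common coordinate frame rather than simple dictionary merging.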
FIG. 2 illustrates a flowchart of a method 200 for generating an overall three-dimensional (3D) environment of a place by using the integration system 100 of FIG. 1, in accordance with an embodiment of the present disclosure. At step 202, the aerial scanner 102 scans an outside area of the place, such as a concert hall, to create a first virtual 3D map of the outside area of the place.
Then, at step 204, the indoor scanner 104 scans an indoor or inside area of the place to create a second virtual 3D map of the inside area of the place. Then at step 206, the object scanner 106 scans one or more objects or people placed within the place to create a third virtual 3D map of the one or more objects.
Thereafter, at step 208, the processor 108 integrates the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects to create an overall 3D environment for the place. The overall 3D environment may be a virtual environment created for use with web-based, AR-based, or VR-based systems or devices. The 3D environment may then be stored in a database for later use. The user 114 may view the overall 3D environment for the place via at least one of a web-based, AR-based, or VR-based system.
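Steps 202-208 of method 200 can be walked through as a simple sequential pipeline. The sketch below is illustrative only: the disclosure does not specify a scanner API, so the scan functions are stand-ins that return dummy labeled points, and the function names are hypothetical.

```python
def scan_aerial(place):
    # Step 202: scan the outside area -> first virtual 3D map.
    return [("outside", (0.0, 0.0, 10.0))]

def scan_indoor(place):
    # Step 204: scan the inside area -> second virtual 3D map.
    return [("inside", (1.0, 2.0, 0.0))]

def scan_objects(place):
    # Step 206: scan objects within the place -> third virtual 3D map.
    return [("object", (1.5, 2.5, 0.5))]

def create_environment(place, database):
    # Step 208: integrate the three maps into one overall 3D environment,
    # here by concatenating their labeled points.
    overall = scan_aerial(place) + scan_indoor(place) + scan_objects(place)
    database[place] = overall  # store for later web/AR/VR viewing
    return overall

db = {}
env = create_environment("exhibition", db)
print(len(env))  # 3 labeled points, one per virtual 3D map
```

The order of the scanning steps is not essential; as the disclosure notes elsewhere, the acts may be performed in other sequences or combined.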
The present disclosure provides an augmented reality (AR) or virtual reality (VR) based immersive display system for generating an online or an offline realistic view of a location such as an exhibition. The user may view the exhibition from anywhere.
Further, the present disclosure provides the user with a realistic feel of any place via the overall 3D environment. While viewing the overall 3D environment, the user may feel as if he or she were part of the place. The user may select or choose the place whose virtual 3D environment he or she wants to view. The user may view it on the mobile device from anywhere, irrespective of location. The user only requires a suitable network, such as the Internet, for accessing the integration system and processor.
Further, the present disclosure provides users the flexibility to view a 3D view of any place or venue at a time and place of their convenience. The user may view the exhibition, location, or any event at any time from anywhere.
Further, the present disclosure provides the integration system for creating an immersive display by integrating the outputs of a plurality of scanners using known technologies.
Embodiments of the disclosure are also described above with reference to flowchart illustrations and/or block diagrams of methods and systems. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, may be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the acts specified in the flowchart and/or block diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the acts specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the acts specified in the flowchart and/or block diagram block or blocks.
In addition, methods and functions described herein are not limited to any particular sequence, and the acts or blocks relating thereto can be performed in other sequences that are appropriate. For example, described acts or blocks may be performed in an order other than that specifically disclosed, or multiple acts or blocks may be combined in a single act or block.
While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements.
Claims (11)
- An integration system for generating a three-dimensional (3D) environment of a place, comprising:
a plurality of 3D scanners for generating a plurality of 3D scanned images of the place from different views, wherein the plurality of 3D scanners comprises at least one of an aerial scanner, an indoor scanner, and an object scanner; and
a processor for integrating the plurality of 3D scanned images for creating an overall 3D environment for the place.
- The integration system of claim 1, wherein the aerial scanner is configured to scan an outside area of the place to create a first virtual 3D map of the outside area.
- The integration system of claim 2, wherein the indoor scanner is configured to scan an indoor area of the place to create a second virtual 3D map of the inside area.
- The integration system of claim 3, wherein the object scanner is configured to scan one or more objects placed within the place to create a third virtual 3D map of the objects.
- The integration system of claim 4, wherein the processor is configured to integrate the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects for creating the overall 3D environment for the place.
- The integration system of claim 4, wherein a user views the overall 3D environment for the place via at least one of a web-based, AR-based, or VR-based system.
- An integration system for generating a three-dimensional (3D) environment of a place, comprising:
an aerial scanner configured to scan an outside area of the place to create a first virtual 3D map of the outside area of the place;
an indoor scanner configured to scan an indoor area of the place to create a second virtual 3D map of the inside area of the place;
an object scanner configured to scan one or more objects placed within the place to create a third virtual 3D map of the one or more objects;
a processor configured to integrate the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects for creating an overall 3D environment for the place; and
a database configured to store the integrated overall 3D environment.
- The integration system of claim 7, wherein a user views the overall 3D environment for the place via at least one of a web-based, AR-based, or VR-based system.
- A method for generating a three-dimensional (3D) environment of a place, comprising:
generating, by a plurality of 3D scanners, a plurality of 3D scanned images of the place from different views, wherein the plurality of 3D scanners comprises at least one of an aerial scanner, an indoor scanner, and an object scanner; and
integrating, by a processor, the plurality of 3D scanned images for creating an overall 3D environment for the place.
- A method for creating a three-dimensional (3D) environment of a place, comprising:
scanning, by an aerial scanner, an outside area of the place to create a first virtual 3D map of the outside area of the place;
scanning, by an indoor scanner, an indoor area of the place to create a second virtual 3D map of the inside area of the place;
scanning, by an object scanner, one or more objects placed within the place to create a third virtual 3D map of the one or more objects;
integrating, by a processor, the first virtual 3D map of the outside area, the second virtual 3D map of the inside area, and the third virtual 3D map of the objects for creating an overall 3D environment for the place; and
storing, in a database, the integrated overall 3D environment.
- The method of claim 10, further comprising viewing, by a user, the overall 3D environment for the place via at least one of a web-based, AR-based, or VR-based system.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762591267P | 2017-11-28 | 2017-11-28 | |
US62/591,267 | 2017-11-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019105002A1 true WO2019105002A1 (en) | 2019-06-06 |
Family
ID=63239395
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/091412 WO2019105002A1 (en) | 2017-11-28 | 2018-06-15 | Systems and methods for creating virtual 3d environment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108460842A (en) |
WO (1) | WO2019105002A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115767168A (en) * | 2022-12-02 | 2023-03-07 | 抖音视界有限公司 | Live content display method, device, medium and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201780606U (en) * | 2010-06-08 | 2011-03-30 | 上海市刑事科学技术研究所 | Field three-dimensional reappearance device |
US20120162384A1 (en) * | 2010-12-22 | 2012-06-28 | Vesely Michael A | Three-Dimensional Collaboration |
US20140368504A1 (en) * | 2013-06-12 | 2014-12-18 | Microsoft Corporation | Scalable volumetric 3d reconstruction |
US9595127B2 (en) * | 2010-12-22 | 2017-03-14 | Zspace, Inc. | Three-dimensional collaboration |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101763607B (en) * | 2008-12-25 | 2013-04-24 | 上海杰图软件技术有限公司 | Online exhibition platform system constructed by using panoramic electronic map and construction method thereof |
CN105931288A (en) * | 2016-04-12 | 2016-09-07 | 广州凡拓数字创意科技股份有限公司 | Construction method and system of digital exhibition hall |
2018
- 2018-01-29 CN CN201810083124.9A patent/CN108460842A/en active Pending
- 2018-06-15 WO PCT/CN2018/091412 patent/WO2019105002A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN108460842A (en) | 2018-08-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10474336B2 (en) | Providing a user experience with virtual reality content and user-selected, real world objects | |
CN110300909B (en) | Systems, methods, and media for displaying an interactive augmented reality presentation | |
US20190019011A1 (en) | Systems and methods for identifying real objects in an area of interest for use in identifying virtual content a user is authorized to view using an augmented reality device | |
EP4024854A1 (en) | Virtual environments associated with processed video streams | |
US20180356885A1 (en) | Systems and methods for directing attention of a user to virtual content that is displayable on a user device operated by the user | |
US20220005283A1 (en) | R-snap for production of augmented realities | |
US12020667B2 (en) | Systems, methods, and media for displaying interactive augmented reality presentations | |
CN103914876B (en) | For showing the method and apparatus of video on 3D maps | |
US11232636B2 (en) | Methods, devices, and systems for producing augmented reality | |
WO2018102013A1 (en) | Methods, systems, and media for enhancing two-dimensional video content items with spherical video content | |
CN109427219B (en) | Disaster prevention learning method and device based on augmented reality education scene conversion model | |
US11758218B2 (en) | Integrating overlaid digital content into displayed data via graphics processing circuitry | |
WO2019105001A1 (en) | Immersive display systems and methods | |
US20220351425A1 (en) | Integrating overlaid digital content into data via processing circuitry using an audio buffer | |
US11532138B2 (en) | Augmented reality (AR) imprinting methods and systems | |
WO2019128138A1 (en) | Three-dimensional live streaming systems and methods | |
WO2019105002A1 (en) | Systems and methods for creating virtual 3d environment | |
JP2020162083A (en) | Content distribution system, content distribution method, and content distribution program | |
US20190012834A1 (en) | Augmented Content System and Method | |
WO2022231703A1 (en) | Integrating overlaid digital content into displayed data via processing circuitry using a computing memory and an operating system memory | |
JP7559026B2 (en) | VIRTUAL SPACE PROVIDING SYSTEM, VIRTUAL SPACE PROVIDING METHOD, AND PROGRAM | |
Pereira et al. | Hybrid Conference Experiences in the ARENA | |
Rowe | Archaeo-mented reality: a study of the use of augmented reality as a tool for archaeological interpretation | |
WO2023215637A1 (en) | Interactive reality computing experience using optical lenticular multi-perspective simulation | |
JP2020167654A (en) | Content distribution system, content distribution method, and content distribution program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18884668; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 18884668; Country of ref document: EP; Kind code of ref document: A1 |