CN112933606B - Game scene conversion method and device, storage medium and computer equipment
- Publication number
- CN112933606B (application CN202110280556.0A)
- Authority
- CN
- China
- Prior art keywords
- real
- virtual
- image
- time
- world
- Prior art date
- Legal status
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/69—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Graphics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The application discloses a game scene conversion method and device, a storage medium and computer equipment. The method comprises: in response to an AR mode interactive operation request, acquiring a first real-time virtual image corresponding to a first virtual camera in a three-dimensional game virtual world, creating a second virtual camera, and taking a real-world image acquired by a client camera as a first real-time real image corresponding to the second virtual camera; creating a first inlet material ball, and rendering a first real-time rendering image comprising a first transmission door through a game rendering engine so that the first real-time virtual image and the first real-time real image are respectively displayed inside and outside the first transmission door, wherein the first transmission door is obtained by rendering resources to be rendered on the first inlet material ball through the game rendering engine; and switching from the real world to the virtual world when the real-time distance between the second virtual camera and the first inlet material ball is smaller than a preset threshold value.
Description
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a game scene conversion method and apparatus, a storage medium, and a computer device.
Background
With the rapid development of computer technology, AR (Augmented Reality) technology has begun to be applied across industries, including the military, medical, video and game industries. In the AR gameplay of a game, AR technology can superimpose the real environment and the virtual environment on the same picture or in the same space, enriching the player's experience: a player can roam an entire home scene from a first-person perspective and switch between the real scene and the virtual scene.
In the prior art, the iOS system can realize basic AR functions using the built-in ARKit SDK. However, the ARKit SDK places high requirements on device hardware: the existing models that support the AR function almost all rely on ARKit, and some low-end iOS devices do not support it at all. The underlying Android system does not support the AR function itself, so a third-party plug-in such as the Google ARCore SDK must be integrated to realize it, and such plug-ins have poor security, stability and applicability. Moreover, when the Android system uses a third-party plug-in, an additional apk file must be installed, which makes it difficult for a channel package to pass the review of each application store.
Disclosure of Invention
In view of the foregoing, the present application provides a game scene conversion method and apparatus, a storage medium, and a computer device.
According to one aspect of the present application, there is provided a game scene transition method for a game client, including:
responding to an AR mode interactive operation request, acquiring a first real-time virtual image corresponding to a first virtual camera in a three-dimensional game virtual world, creating a second virtual camera and taking a real-world image acquired by a client camera as a first real-time real image corresponding to the second virtual camera;
creating a first inlet material ball, and rendering a first real-time rendering image comprising a first transmission door through a game rendering engine so that the first real-time virtual image and the first real-time real image are respectively displayed inside and outside the first transmission door, wherein the first transmission door is obtained by rendering resources to be rendered on the first inlet material ball through the game rendering engine;
and acquiring a real-time distance between the second virtual camera and the first inlet material ball, and converting from the real world to the virtual world when the real-time distance is smaller than a preset threshold value.
Optionally, after the creating of the second virtual camera, the method further comprises:
and generating a rocker control interface at the game client, and binding the rocker control with the second virtual camera so as to change the real-time distance between the second virtual camera and the first inlet material ball in the real world according to the rocker control operation data, wherein the first inlet material ball is in the real world.
Optionally, the virtual position of the first inlet material ball corresponding to the virtual world is determined according to a first initial display position corresponding to the virtual world or the position of the game character when the AR mode interactive operation request is generated; the virtual position of the first inlet material ball is a fixed position under a virtual world coordinate system.
Optionally, after the creating of the first virtual camera, the method further comprises:
binding a gyroscope of the game client with the first virtual camera so that an image capturing angle of the first virtual camera under a virtual world coordinate system changes according to detection data of the gyroscope, and the first real-time virtual image is related to the image capturing angle of the first virtual camera.
Optionally, the rendering, by the game rendering engine, of the first real-time rendered image including the first transfer gate includes:
storing the first real-time virtual image as a first map in a pre-allocated memory;
mapping the first mapping to the first transfer door according to first mapping information corresponding to the first inlet material ball to obtain a real-time virtual transfer door image, and rendering the real-time virtual transfer door image into the first real-time real image through 3D mapping according to a display position corresponding to the first inlet material ball to obtain the first real-time rendering image.
Optionally, the size of the first transfer gate in the first real-time rendered image is inversely related to the real-time distance.
Optionally, the transition from the real world to the virtual world specifically includes:
destroying the first inlet material ball, acquiring a second real-time virtual image corresponding to the first virtual camera, taking a real-world image acquired by the client camera as a second real-time real image corresponding to the second virtual camera, and storing the second real-time real image as a second map;
creating a second inlet material ball, wherein the corresponding creation position of the second inlet material ball in the virtual world is determined according to a second initial display position of the virtual world or according to the virtual position of the first inlet material ball, and the second inlet material ball is positioned at a fixed position under a virtual world coordinate system;
and rendering a second real-time rendering image comprising a second transmission door through the game rendering engine so that the second real-time real image and the second real-time virtual image are respectively displayed inside and outside the second transmission door, wherein the second transmission door is obtained by rendering resources to be rendered on the second inlet material ball through the game rendering engine.
Optionally, after the destroying of the first inlet material ball, the method further includes:
and binding the rocker control of the game client with the first virtual camera so that the position of the first virtual camera under the virtual world coordinate system changes according to the rocker control operation data, and the second real-time virtual image is related to the position of the second virtual camera.
Optionally, the rendering, by the game rendering engine, of the second real-time rendered image including the second transfer gate includes:
when the virtual world display area corresponding to the second real-time virtual image comprises a second transmission door, based on the position relation between the first virtual camera and the second transmission door and the image capturing angle of the first virtual camera, determining display form information of the second transmission door, mapping the second mapping to the second transmission door according to the display form information and second mapping information corresponding to the second inlet material ball to obtain a real-time real transmission door image, and rendering the real-time real transmission door image to a creation position corresponding to the second real-time virtual image through 3D mapping, so as to obtain a second real-time rendering image;
And displaying the second real-time virtual image under the condition that the virtual world display area corresponding to the second real-time virtual image does not contain the second transmission door.
Optionally, before the rendering, by the game rendering engine, of the second real-time rendered image including the second transfer gate, the method further includes:
and respectively fusing a plurality of images in front in the second real-time virtual image with a plurality of corresponding preset special effect images to obtain a plurality of transmission special effect fused images.
According to another aspect of the present application, there is provided a game scene conversion device for a game client, including:
the image acquisition module is used for responding to the AR mode interactive operation request, acquiring a first real-time virtual image corresponding to a first virtual camera in the three-dimensional game virtual world, creating a second virtual camera and taking a real-world image acquired by a client camera as a first real-time real image corresponding to the second virtual camera;
the rendering module is used for creating a first inlet material ball, and rendering a first real-time rendering image comprising a first transmission door through a game rendering engine so that the first real-time virtual image and the first real-time real image are respectively displayed inside and outside the first transmission door, wherein the first transmission door is obtained by rendering resources to be rendered on the first inlet material ball through the game rendering engine;
The conversion module is used for obtaining the real-time distance between the second virtual camera and the first inlet material ball, and converting the real world into the virtual world when the real-time distance is smaller than a preset threshold value.
Optionally, the apparatus further comprises:
and the rocker binding module is used for generating a rocker control interface at the game client after the first virtual camera is created, and binding the rocker control with the second virtual camera so that the real-time distance between the second virtual camera and the first inlet material ball in a real-world coordinate system changes according to the rocker control operation data, wherein the first inlet material ball is in the real world.
Optionally, the virtual position of the first inlet material ball corresponding to the virtual world is determined according to a first initial display position corresponding to the virtual world or the position of the game character when the AR mode interactive operation request is generated; the virtual position of the first inlet material ball is a fixed position under a virtual world coordinate system.
Optionally, the apparatus further comprises:
and the gyroscope binding module is used for binding the gyroscope of the game client with the first virtual camera after the first virtual camera is created, so that the image capturing angle of the first virtual camera under the virtual world coordinate system changes according to the detection data of the gyroscope, and the first real-time virtual image is related to the image capturing angle of the first virtual camera.
Optionally, the rendering module is specifically configured to:
storing the first real-time virtual image as a first map in a pre-allocated memory;
mapping the first mapping to the first transfer door according to first mapping information corresponding to the first inlet material ball to obtain a real-time virtual transfer door image, and rendering the real-time virtual transfer door image into the first real-time real image through 3D mapping according to a display position corresponding to the first inlet material ball to obtain the first real-time rendering image.
Optionally, the size of the first transfer gate in the first real-time rendered image is inversely related to the real-time distance.
Optionally, the conversion module is specifically configured to:
destroying the first inlet material ball, acquiring a second real-time virtual image corresponding to the first virtual camera, taking a real-world image acquired by the client camera as a second real-time real image corresponding to the second virtual camera, and storing the second real-time real image as a second map;
creating a second inlet material ball, wherein the corresponding creation position of the second inlet material ball in the virtual world is determined according to a second initial display position of the virtual world or according to the virtual position of the first inlet material ball, and the second inlet material ball is positioned at a fixed position under a virtual world coordinate system;
And rendering a second real-time rendering image comprising a second transmission door through the game rendering engine so that the second real-time real image and the second real-time virtual image are respectively displayed inside and outside the second transmission door, wherein the second transmission door is obtained by rendering resources to be rendered on the second inlet material ball through the game rendering engine.
Optionally, the apparatus further comprises:
and the control binding module is used for binding the rocker control of the game client with the first virtual camera after the first inlet material ball is destroyed, so that the position of the first virtual camera under the virtual world coordinate system changes according to the rocker control operation data, and the second real-time virtual image is related to the position of the second virtual camera.
Optionally, the conversion module is further configured to:
when the virtual world display area corresponding to the second real-time virtual image comprises a second transmission door, based on the position relation between the first virtual camera and the second transmission door and the image capturing angle of the first virtual camera, determining display form information of the second transmission door, mapping the second mapping to the second transmission door according to the display form information and second mapping information corresponding to the second inlet material ball to obtain a real-time real transmission door image, and rendering the real-time real transmission door image to a creation position corresponding to the second real-time virtual image through 3D mapping, so as to obtain a second real-time rendering image;
And displaying the second real-time virtual image under the condition that the virtual world display area corresponding to the second real-time virtual image does not contain the second transmission door.
Optionally, the conversion module is further configured to:
before the game rendering engine renders the second real-time rendering image comprising the second transmission door, respectively fusing a plurality of images in front of the second real-time virtual image with a plurality of corresponding preset special effect images to obtain a plurality of transmission special effect fused images.
According to still another aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described game scene transition method.
According to still another aspect of the present application, there is provided a computer apparatus including a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, the processor implementing the above game scene transition method when executing the program.
By means of the above technical scheme, the game scene conversion method and device, the storage medium and the computer equipment of the present application construct and display the transmission door by means of image rendering performed by the game application program itself, so the transmission door can be constructed under any condition. This solves the prior-art problems that the transmission door must be constructed on a planar environment such as a table or a floor and that the requirements on the construction environment are therefore high. The scheme does not rely on the ARKit SDK or on third-party plug-ins, places low requirements on device hardware, and improves the security and stability of the game. In addition, the change of the game world environment image is driven by means such as rocker control, which avoids the inaccurate display of the game environment image caused by inaccurate positioning in prior-art schemes that realize the AR function with tools and plug-ins, thereby improving game stability and user experience. As a result, Android users, as well as iOS users whose devices do not support the AR function, can also experience the AR playing method in the game, visit the AR virtual world, and feel the game experience of the game character switching between the real scene and the virtual scene.
The foregoing is only an overview of the technical solutions of the present application. In order that the technical means of the present application may be understood more clearly and implemented according to the content of the specification, and in order to make the above and other objects, features and advantages of the present application more apparent, the detailed description of the present application is given below.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
fig. 1 is a schematic flow chart of a game scene conversion method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of another game scene conversion method according to an embodiment of the present application;
fig. 3 shows a schematic structural diagram of a game scene conversion device according to an embodiment of the present application.
Detailed Description
The present application will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other.
In this embodiment, a game scene conversion method is provided, as shown in fig. 1, and the method includes: step 101, responding to an AR mode interactive operation request, acquiring a first real-time virtual image corresponding to a first virtual camera in a three-dimensional game virtual world, creating a second virtual camera and taking a real-world image acquired by a client camera as a first real-time real image corresponding to the second virtual camera; step 102, creating a first inlet material ball, and rendering a first real-time rendering image comprising a first transmission door through a game rendering engine so that the first real-time virtual image and the first real-time real image are respectively displayed inside and outside the first transmission door; and step 103, acquiring a real-time distance between the second virtual camera and the first inlet material ball, and converting from the real world to the virtual world when the real-time distance is smaller than a preset threshold value.
the game scene conversion method provided by the embodiment of the application can be applied to game application programs of game clients, the clients can comprise intelligent electronic devices such as smart phones and tablet computers, and cameras which can be used for realizing image acquisition are arranged in or out of the clients, for example, cameras in the smart phones. Thus, the client can respond to the AR mode interactive operation request to acquire and process the image through the client camera. The AR mode interactive operation request may be a transfer gate opening request to request opening of a virtual transfer gate transferred from the real world to the virtual world of the three-dimensional game, and the request may specifically be generated by a player triggering, for example, by clicking an AR mode trigger button on a specific page in the game, so that the player can implement input of the transfer gate opening request. The client starts a client camera in response to a transmission door opening request to run an AR mode, at the moment, the client camera enters a real-time image acquisition state, image acquisition is carried out on a real-time environment of the real world, an image of the real world is obtained through image acquisition, a second virtual camera in a game is created, the image of the real world is used as a first real-time real image corresponding to the second virtual camera, in addition, the created first virtual camera in the game running process is used for obtaining a first real-time virtual image in a three-dimensional game virtual world, the first virtual camera simulates a visual angle of a game character in the game world, and a scene image seen by the game character in the game world is the first real-time virtual image corresponding to the first virtual camera.
In the embodiment of the application, the first virtual camera can change the image capturing angle along with the operation of the game player on the client. Optionally, step 101 may further include: binding a gyroscope of the game client with the first virtual camera so that an image capturing angle of the first virtual camera under a virtual world coordinate system changes according to detection data of the gyroscope, and the first real-time virtual image is related to the image capturing angle of the first virtual camera.
The first virtual camera can change its image capturing angle based on the angle change of the client. Specifically, the angle change of the client can be derived from the detection data of a gyroscope in the client and mapped onto the angle change of the game character in the game world, i.e. onto the image capturing angle of the first virtual camera, so that the first real-time virtual image changes accordingly with that angle. By shaking the mobile phone, the player drives the image capturing angle of the first virtual camera to change, simulating the change of the game character's viewing angle and giving the player the fun of a game that combines the virtual and the real.
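As a rough sketch of the gyroscope binding, under the assumption that the gyroscope reports angular velocity per frame (the camera structure, axis convention and pitch clamp below are invented for illustration):

```python
import math
from dataclasses import dataclass


@dataclass
class FirstVirtualCamera:
    yaw: float = 0.0     # radians, rotation about the vertical axis of the virtual world
    pitch: float = 0.0   # radians, looking up/down
    roll: float = 0.0


def apply_gyroscope(camera: FirstVirtualCamera, angular_velocity: tuple, dt: float) -> None:
    """Update the image capturing angle from the gyroscope detection data.

    `angular_velocity` is (yaw_rate, pitch_rate, roll_rate) in rad/s for the current
    frame and `dt` is the frame time in seconds."""
    yaw_rate, pitch_rate, roll_rate = angular_velocity
    camera.yaw += yaw_rate * dt
    # Clamp pitch so shaking the phone cannot flip the in-game view upside down.
    camera.pitch = max(-math.pi / 2, min(math.pi / 2, camera.pitch + pitch_rate * dt))
    camera.roll += roll_rate * dt
```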
In the embodiment of the application, the real-time distance between the second virtual camera and the first inlet material ball can be changed through a user operation such as rocker control, clicking or screen dragging. Optionally, step 101 may further include: generating a rocker control interface at the game client, and binding the rocker control with the second virtual camera so as to change the real-time distance between the second virtual camera and the first inlet material ball in the real world according to the rocker control operation data, wherein the first inlet material ball is in the real world.
In the above embodiment, a rocker control interface is generated and displayed in the touch area of the client, so that the player can drive the second virtual camera to change its position in the real-world coordinate system through the rocker control, thereby changing the relative distance, namely the real-time distance, between the second virtual camera and the first inlet material ball. The relative distance between the first inlet material ball and the second virtual camera is a set value when the first inlet material ball is created, and can then be changed through rocker operation; when the real-time distance becomes smaller than the preset threshold value, the player enters the virtual world from the real world through the transmission door.
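Under the same illustrative assumptions, the rocker binding can be reduced to moving the second virtual camera on the real-world ground plane and recomputing the real-time distance every frame; the fixed ball position, speed and threshold values below are placeholders, not values taken from the patent:

```python
import math
from dataclasses import dataclass


@dataclass
class SecondVirtualCamera:
    x: float = 0.0
    z: float = 0.0                      # position on the real-world ground plane


FIRST_BALL_POSITION = (0.0, 3.0)        # fixed (x, z) of the first inlet material ball, assumed
MOVE_SPEED = 2.0                        # movement speed in units per second, assumed
PRESET_THRESHOLD = 0.5                  # distance below which the world switch triggers, assumed


def apply_rocker(camera: SecondVirtualCamera, stick_x: float, stick_y: float, dt: float) -> float:
    """Move the second virtual camera according to the rocker operation data and
    return the new real-time distance to the first inlet material ball."""
    camera.x += stick_x * MOVE_SPEED * dt
    camera.z += stick_y * MOVE_SPEED * dt
    dx = camera.x - FIRST_BALL_POSITION[0]
    dz = camera.z - FIRST_BALL_POSITION[1]
    return math.hypot(dx, dz)
```

When the returned distance drops below `PRESET_THRESHOLD`, the scene switch of step 103 would be triggered.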
In step 102, the first transmission door is a virtual door located in the real world; through it the player can not only enter the virtual world but also see the virtual world. In a specific scene, the client responds to the AR mode interactive operation request, acquires the first real-time virtual image through the first virtual camera, acquires the first real-time real image through the second virtual camera, creates the first inlet material ball, and performs image rendering through the game rendering engine, so that the rendered first real-time rendering image presents the first transmission door and the scene images inside and outside it, with the virtual world scene shown inside and the real world scene shown outside. The embodiment of the application can therefore construct the transmission door in any environment, with no special requirement on the construction environment, solving the problem that the prior art places high requirements on the construction environment.
Optionally, step 102 may specifically include: storing the first real-time virtual image as a first map in a pre-allocated memory; mapping the first mapping to the first transfer door according to first mapping information corresponding to the first inlet material ball to obtain a real-time virtual transfer door image, and rendering the real-time virtual transfer door image into the first real-time real image through 3D mapping according to a display position corresponding to the first inlet material ball to obtain the first real-time rendering image.
In this embodiment of the present application, after the first real-time virtual image is acquired, the client saves the first real-time virtual image as a first map for rendering the scene in the first transmission door. The first mapping can be stored in the equipment operation memory (RAM), the space of an equipment storage disk is not occupied, the repeated use of the equipment operation memory can be realized by continuously updating the first mapping in the equipment operation memory, the read-write efficiency of the first mapping is improved, and the space occupation of the equipment storage disk is reduced. Further, according to the first mapping, the first inlet material ball and the first real-time virtual image, the client can process and display the first mapping image by combining the first mapping information corresponding to the first inlet material ball, specifically, the first mapping is mapped to the first inlet material ball to obtain a real-time virtual transmission door image containing a virtual world scene in the transmission door, and then the real-time virtual transmission door image is rendered to a specific position of the first real-time real image, so that the virtual world environment picture, the first transmission door and the real world environment picture are displayed, the virtual world is arranged in the door, and the real world is arranged outside the door, thereby realizing the combination of the virtual scene and the real scene. It should be noted that, the first real-time real image may be rendered by using the first inlet material ball, and then the first inlet material ball may be rendered by using the first map, and the rendering order is not limited herein.
In step 103, when the real-time distance between the second virtual camera and the first transfer gate in the real world coordinate system is smaller than a certain threshold, a scene switching command may be triggered to achieve the effect of traversing from the real world to the virtual world. After the traversal to the virtual world, the images inside and outside the transfer gate are exchanged, i.e. the real world scene is displayed inside the gate and the virtual world scene is displayed outside it.
In this embodiment of the present invention, optionally, a virtual position of the first inlet material ball corresponding to the virtual world is determined according to a first initial display position corresponding to the virtual world, or a position of the game character when the AR mode interactive operation request is generated; the virtual position of the first inlet material ball is a fixed position under a virtual world coordinate system.
In this embodiment, the position of the first virtual camera in the virtual world coordinate system may be a preset first initial display position in the virtual world, or may be the position of the game character in the game world when the AR mode interactive operation request is triggered; the virtual position of the first inlet material ball is then determined from the preset distance between the first virtual camera and the first transmission door together with the first initial display position (or the position of the game character). The position of the first inlet material ball in the virtual world coordinate system is fixed, i.e. the position of the first transmission door in the virtual world coordinate system does not change, and the player can shake the mobile phone to change the game character's viewing angle on the game world, thereby driving the corresponding first real-time virtual image of the game world to change.
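For illustration, placing the first inlet material ball at its fixed virtual position from the chosen start position and a preset camera-to-door distance could look like the following; the axis convention and the default distance are assumptions:

```python
import math


def first_ball_virtual_position(start_position: tuple, start_yaw: float,
                                preset_distance: float = 2.0) -> tuple:
    """Fix the first inlet material ball `preset_distance` units in front of the first
    virtual camera's start position in the virtual world coordinate system (either the
    first initial display position or the game character's position when the AR mode
    interactive operation request was generated)."""
    x, y, z = start_position
    # Forward direction on the horizontal plane derived from the camera's yaw angle.
    return (x + math.sin(start_yaw) * preset_distance,
            y,
            z + math.cos(start_yaw) * preset_distance)
```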
Additionally, optionally, the size of the first transfer gate in the first real-time rendered image is inversely related to the real-time distance. When the game player drives the second virtual camera to change position through rocker operation, the size of the first transmission door changes with the real-time distance between the camera and the first transmission door: the smaller the real-time distance, the larger the first transmission door appears, and the larger the real-time distance, the smaller it appears. Thus, when the distance between the game character and the first transmission door changes, the change of the transmission door's apparent size as seen from the game character's viewing angle is simulated, creating a sense of realism.
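The inverse relation between the door's on-screen size and the real-time distance follows from ordinary perspective projection; a minimal sketch, with the door's world-space radius and the camera focal length as assumed constants:

```python
DOOR_RADIUS = 1.0          # world-space radius of the first transmission door, assumed
FOCAL_LENGTH_PX = 800.0    # pinhole-camera focal length in pixels, assumed


def apparent_door_radius_px(realtime_distance: float) -> float:
    """On-screen radius of the first transmission door: larger when the second virtual
    camera is close to it, smaller when it is far away (inverse relation)."""
    d = max(realtime_distance, 1e-3)    # avoid division by zero right at the door
    return FOCAL_LENGTH_PX * DOOR_RADIUS / d
```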
The embodiment of the present application provides another game scene transition method, as shown in fig. 2, optionally, the "transition from the real world to the virtual world" in step 103 may specifically include:
In the above embodiment, after the virtual world is entered from the real world, the first inlet material ball is destroyed, and the rocker control of the game client is bound with the first virtual camera, while the gyroscope remains bound with the first virtual camera throughout, so that the first virtual camera can change its image capturing position and image capturing angle according to the player's operation of the client. The player can drive the position of the first virtual camera to change through the rocker, simulating the movement of the game character in the game world (up, down, left and right), and can drive the image capturing angle of the first virtual camera to change by shaking the client, simulating the change of the game character's gaze, so that the game world view seen from the game character's viewing angle is displayed in real time. This brings the player a game experience that combines the virtual and the real and provides technical support for richer gameplay. Because these functions are realized by the game application program itself, there is no need to drive the game character's movement with displacement data generated by the player walking around while holding the client; instead, the change of the image captured by the first virtual camera is driven through rocker control. The player can therefore run the AR mode in any place, with no requirement on the size or layout of the site, and can roam the game world smoothly anywhere. This solves the prior-art problem that inaccurate positioning causes errors in the game character's movement, the problem that some iOS devices do not support the ARKit SDK and that the SDK places high requirements on the device, and the problem that Android devices suffer from lower security and stability because a third-party plug-in must be installed; it improves the security and stability of the game program and also reduces the system space occupied.
In addition, after the second real-time real image is obtained, it is stored in a pre-allocated memory as a second map, so it does not occupy the device's storage disk space; by continuously updating the second map in the running memory, the same memory is reused, which improves the read-write efficiency of the second map while reducing the occupation of the device's storage disk space.
In step 203, a second inlet material ball is created, where a corresponding creation position of the second inlet material ball in the virtual world is determined according to a second initial display position of the virtual world or according to a virtual position of the first inlet material ball, and the second inlet material ball is located at a fixed position under a virtual world coordinate system.
In this embodiment, the second inlet material ball represents a transfer gate generated inside the virtual world from which real-world real-time views can be seen. The position of the first virtual camera under the virtual world coordinate system may be a second initial display position in the preset virtual world, and the creation position of the second inlet material ball is further determined according to the preset distance between the first virtual camera and the second transfer door and the second initial display position, or the virtual position of the first inlet material ball under the virtual world coordinate system is used as the creation position of the second inlet material ball. In addition, the position of the second inlet material ball is fixed under the virtual world coordinate system, namely the position of the second transmission door in the virtual world is unchanged.
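A sketch of the switch itself (destroying the first inlet material ball, keeping the camera feed as the second map, rebinding the rocker and creating the second inlet material ball at a fixed virtual-world position); all state fields and hook names here are hypothetical, not the patent's API:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SwitchState:
    """Minimal illustrative state for the real-to-virtual scene switch."""
    first_ball_alive: bool = True
    second_ball_position: Optional[tuple] = None       # fixed virtual world coordinates
    rocker_target: str = "second_virtual_camera"
    second_map: bytes = b""


def switch_to_virtual_world(state: SwitchState,
                            second_realtime_real_image: bytes,
                            second_initial_display_position: tuple) -> None:
    """Destroy the first inlet material ball, store the real-world camera frame as the
    second map, rebind the rocker to the first virtual camera, and create the second
    inlet material ball at a fixed position in the virtual world coordinate system."""
    state.first_ball_alive = False                       # destroy the first inlet material ball
    state.second_map = second_realtime_real_image        # written into the pre-allocated memory
    state.rocker_target = "first_virtual_camera"         # the rocker now moves the game character
    # The creation position may equally be the first ball's former virtual position,
    # as the text above allows; the second initial display position is used here.
    state.second_ball_position = second_initial_display_position
```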
Step 204, respectively fusing the first several second real-time virtual images with a plurality of corresponding preset special effect images to obtain a plurality of transmission special effect fused images.
In step 204, in order to reduce the abrupt visual effect caused by the sudden image change during scene switching, special effects may be superimposed on the first several frames displayed when the virtual world is entered through the first transfer gate, or a short special-effect sequence may be played first and the image after entering the virtual world displayed afterwards. In this embodiment, image fusion is performed on the several second real-time virtual images that are initially acquired after entering the virtual world: a plurality of preset special effect images are fused and rendered in sequence with these second real-time virtual images to obtain a plurality of transmission special effect fused images. The rendering effect of the preset special effect images may specifically be a fog effect, so that the rendered special effect fused images present a fog that gradually disperses; this reduces the visual jump when the game player enters the virtual world from the real world, improves the visual effect, and improves the player's game experience.
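As an illustration of the transmission special effect fusion, the first few virtual-world frames can be alpha-blended with preset fog frames whose weight decays frame by frame, so the fog gradually disperses. The frame count, weights and blending formula are illustrative choices, not the patent's:

```python
import numpy as np


def fuse_transition_frames(virtual_frames: list, fog_frames: list) -> list:
    """Fuse the first several second real-time virtual images (uint8 arrays) with the
    corresponding preset special effect images; the fog weight decreases frame by frame
    so the rendered effect is a fog that gradually disperses."""
    fused = []
    n = min(len(virtual_frames), len(fog_frames))
    for i in range(n):
        alpha = 1.0 - (i + 1) / (n + 1)          # fog weight fades toward zero
        blend = (alpha * fog_frames[i].astype(np.float32)
                 + (1.0 - alpha) * virtual_frames[i].astype(np.float32))
        fused.append(blend.astype(np.uint8))
    return fused
```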
Step 205-1, when the virtual world display area corresponding to the second real-time virtual image includes a second transfer door, determining display form information of the second transfer door based on a positional relationship between the first virtual camera and the second transfer door and an image capturing angle of the first virtual camera, and mapping the second mapping to the second transfer door according to the display form information and second mapping information corresponding to the second inlet material ball, so as to obtain a real-time real transfer door image, and rendering the real-time real transfer door image to a creation position corresponding to the second real-time virtual image through 3D mapping, so as to obtain a second real-time rendering image;
and step 205-2, displaying the second real-time virtual image under the condition that the virtual world display area corresponding to the second real-time virtual image does not contain the second transmission gate.
In the above embodiment, based on the relative positions of the game character and the second transfer gate in the game world, the displayed game picture may either include the second transfer gate or not include it: in one case the second transfer gate is visible from the character's position and angle in the virtual world, and in the other it is not. The position and orientation of the second transfer gate in the virtual world scene are fixed. When the second transfer gate can be seen from the position and viewing angle of the game character in the virtual world, i.e. from the position and angle of the first virtual camera, the display form information of the second transfer gate as seen in the virtual world is determined according to the positional relationship between the first virtual camera and the second transfer gate and the image capturing angle of the first virtual camera. Since the position and orientation of the second transfer gate in the virtual scene are fixed, the display form information corresponding to the second transfer gate changes continuously as the position and angle of the first virtual camera in the virtual world scene change. For example, if the preset presentation form of the transfer gate is a circle, and the game character at its initial position sees the second transfer gate as a circle, then when the game character moves less than 90 degrees to the left or to the right around the transfer gate as the centre, the presentation form of the gate it sees becomes an ellipse. After the display form information corresponding to the second transfer gate is determined, the second map is mapped onto the second inlet material ball according to the display form information and the second mapping information corresponding to the second inlet material ball, so as to generate a real-time real transfer door image; the mapping information may include the size and shape of the map, and the second map is mapped onto the second inlet material ball according to the size and shape specified by the mapping information. Further, the real-time real transfer door image is rendered through 3D mapping to the creation position corresponding to the second transfer gate in the second real-time virtual image, so as to obtain the second real-time rendering image. At this time, the second real-time rendering image contains not only the virtual world environment picture and the second transfer door picture but also the real world picture inside the door. As the relative positions of the first virtual camera and the second transfer door change, the display form of the real-time real transfer door image within the second real-time rendering image changes accordingly, making the transfer door display effect more realistic. When the transfer gate cannot be seen from the position and angle of the game character in the virtual world, the second real-time virtual image is displayed on the game page.
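The visibility test and the display form determination can be illustrated with simple planar geometry: the second transfer gate is drawn only when it lies inside the first virtual camera's horizontal field of view, and its display form (how far the circle is squashed into an ellipse) follows from the angle between the gate's fixed facing direction and the direction toward the camera. The field-of-view value and the cosine-based squash factor are assumptions, not a formula taken from the patent:

```python
import math


def gate_in_view(camera_pos: tuple, camera_yaw: float, gate_pos: tuple,
                 horizontal_fov: float = math.radians(90)) -> bool:
    """True if the second transfer gate (at fixed (x, z) position `gate_pos`) falls inside
    the first virtual camera's horizontal field of view; height is ignored for simplicity."""
    dx, dz = gate_pos[0] - camera_pos[0], gate_pos[1] - camera_pos[1]
    angle_to_gate = math.atan2(dx, dz)
    diff = (angle_to_gate - camera_yaw + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= horizontal_fov / 2


def gate_squash_factor(camera_pos: tuple, gate_pos: tuple, gate_facing_yaw: float) -> float:
    """Display form of the gate as a factor in [0, 1]: 1.0 when viewed head-on (a full
    circle), shrinking toward 0 as the viewpoint moves around the gate and the circle is
    seen as an ever flatter ellipse."""
    dx, dz = camera_pos[0] - gate_pos[0], camera_pos[1] - gate_pos[1]
    angle_to_camera = math.atan2(dx, dz)
    diff = (angle_to_camera - gate_facing_yaw + math.pi) % (2 * math.pi) - math.pi
    return abs(math.cos(diff))
```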
Through the technical scheme of this embodiment, first, the transmission gate is created by means of image rendering performed by the game application program itself, so the transmission gate can be built under any condition; this solves the prior-art problems that the transmission gate must be built on a planar environment such as a table or a floor and that the requirements on the building environment are high. The scheme does not rely on the ARKit SDK or on third-party plug-ins, places low requirements on device hardware, and improves the security and stability of the game. Second, the change of the game world environment image is driven by means such as rocker control, which avoids the inaccurate display of the game environment image caused by inaccurate positioning in prior-art schemes that realize the AR function with tools and plug-ins, thereby improving game stability and user experience. As a result, Android users, as well as iOS users whose devices do not support the AR function, can also experience the AR playing method in the game, visit the AR virtual world, and feel the game experience of the game character switching between the real scene and the virtual scene.
Further, as a specific implementation of the method of fig. 1, an embodiment of the present application provides a game scene conversion device, as shown in fig. 3, including:
The image acquisition module is used for responding to the AR mode interactive operation request, acquiring a first real-time virtual image corresponding to a first virtual camera in the three-dimensional game virtual world, creating a second virtual camera and taking a real-world image acquired by a client camera as a first real-time real image corresponding to the second virtual camera;
the rendering module is used for creating a first inlet material ball, and rendering a first real-time rendering image comprising a first transmission door through a game rendering engine so that the first real-time virtual image and the first real-time real image are respectively displayed inside and outside the first transmission door, wherein the first transmission door is obtained by rendering resources to be rendered on the first inlet material ball through the game rendering engine;
the conversion module is used for obtaining the real-time distance between the second virtual camera and the first inlet material ball, and converting the real world into the virtual world when the real-time distance is smaller than a preset threshold value.
Optionally, the apparatus further comprises:
and the rocker binding module is used for generating a rocker control interface at the game client after the first virtual camera is created, and binding the rocker control with the second virtual camera so that the real-time distance between the second virtual camera and the first inlet material ball in a real-world coordinate system changes according to the rocker control operation data, wherein the first inlet material ball is in the real world.
Optionally, the virtual position of the first inlet material ball corresponding to the virtual world is determined according to a first initial display position corresponding to the virtual world or the position of the game character when the AR mode interactive operation request is generated; the virtual position of the first inlet material ball is a fixed position under a virtual world coordinate system.
Optionally, the apparatus further comprises:
and the gyroscope binding module is used for binding the gyroscope of the game client with the first virtual camera after the first virtual camera is created, so that the image capturing angle of the first virtual camera under the virtual world coordinate system changes according to the detection data of the gyroscope, and the first real-time virtual image is related to the image capturing angle of the first virtual camera.
Optionally, the rendering module is specifically configured to:
storing the first real-time virtual image as a first map in a pre-allocated memory;
mapping the first mapping to the first transfer door according to first mapping information corresponding to the first inlet material ball to obtain a real-time virtual transfer door image, and rendering the real-time virtual transfer door image into the first real-time real image through 3D mapping according to a display position corresponding to the first inlet material ball to obtain the first real-time rendering image.
Optionally, the size of the first transfer gate in the first real-time rendered image is inversely related to the real-time distance.
Optionally, the conversion module is specifically configured to:
destroying the first inlet material ball, acquiring a second real-time virtual image corresponding to the first virtual camera, taking a real-world image acquired by the client camera as a second real-time real image corresponding to the second virtual camera, and storing the second real-time real image as a second map;
creating a second inlet material ball, wherein the corresponding creation position of the second inlet material ball in the virtual world is determined according to a second initial display position of the virtual world or according to the virtual position of the first inlet material ball, and the second inlet material ball is positioned at a fixed position under a virtual world coordinate system;
and rendering a second real-time rendering image comprising a second transmission door through the game rendering engine so that the second real-time real image and the second real-time virtual image are respectively displayed inside and outside the second transmission door, wherein the second transmission door is obtained by rendering resources to be rendered on the second inlet material ball through the game rendering engine.
Optionally, the apparatus further comprises:
and the control binding module is used for binding the rocker control of the game client with the first virtual camera after the first inlet material ball is destroyed, so that the position of the first virtual camera under the virtual world coordinate system changes according to the rocker control operation data, and the second real-time virtual image is related to the position of the second virtual camera.
Optionally, the conversion module is further configured to:
when the virtual world display area corresponding to the second real-time virtual image comprises a second transmission door, based on the position relation between the first virtual camera and the second transmission door and the image capturing angle of the first virtual camera, determining display form information of the second transmission door, mapping the second mapping to the second transmission door according to the display form information and second mapping information corresponding to the second inlet material ball to obtain a real-time real transmission door image, and rendering the real-time real transmission door image to a creation position corresponding to the second real-time virtual image through 3D mapping, so as to obtain a second real-time rendering image;
And displaying the second real-time virtual image under the condition that the virtual world display area corresponding to the second real-time virtual image does not contain the second transmission door.
Optionally, the conversion module is further configured to:
before the game rendering engine renders the second real-time rendering image comprising the second transmission door, respectively fusing a plurality of images in front of the second real-time virtual image with a plurality of corresponding preset special effect images to obtain a plurality of transmission special effect fused images.
It should be noted that, for other corresponding descriptions of each functional unit related to the game scene conversion device provided in the embodiment of the present application, reference may be made to corresponding descriptions in the methods of fig. 1 to 2, and no further description is given here.
Based on the above-described method shown in fig. 1 to 2, correspondingly, the embodiment of the present application further provides a storage medium, on which a computer program is stored, which when executed by a processor, implements the above-described game scene conversion method shown in fig. 1 to 2.
Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.), and includes several instructions for causing a computer device (may be a personal computer, a server, or a network device, etc.) to perform the methods described in various implementation scenarios of the present application.
Based on the method shown in fig. 1 to fig. 2 and the virtual device embodiment shown in fig. 3, in order to achieve the above object, the embodiment of the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, or the like, where the computer device includes a storage medium and a processor; a storage medium storing a computer program; a processor for executing a computer program to implement the above-described game scene transition method as shown in fig. 1 to 2.
Optionally, the computer device may also include a user interface, a network interface, a camera, radio Frequency (RF) circuitry, sensors, audio circuitry, WI-FI modules, and the like. The user interface may include a Display screen (Display), an input unit such as a Keyboard (Keyboard), etc., and the optional user interface may also include a USB interface, a card reader interface, etc. The network interface may optionally include a standard wired interface, a wireless interface (e.g., bluetooth interface, WI-FI interface), etc.
It will be appreciated by those skilled in the art that the architecture of a computer device provided in the present embodiment is not limited to the computer device, and may include more or fewer components, or may combine certain components, or may be arranged in different components.
The storage medium may also include an operating system, a network communication module. An operating system is a program that manages and saves computer device hardware and software resources, supporting the execution of information handling programs and other software and/or programs. The network communication module is used for realizing communication among all components in the storage medium and communication with other hardware and software in the entity equipment.
Through the description of the above embodiments, those skilled in the art can clearly understand that the present application can be implemented by software plus a necessary general hardware platform, or by hardware. The construction and display of the transfer gate are realized by means of image rendering performed by the game application program, so the transfer gate can be constructed under any condition; this solves the prior-art problems that the transfer gate must be constructed on a planar environment such as a table or a floor and that the requirements on the construction environment are high. The scheme does not rely on the ARKit SDK or on third-party plug-ins, places low requirements on device hardware, and improves the security and stability of the game. At the same time, the change of the game world environment image is driven by means such as rocker control, which avoids the inaccurate display of the game environment image caused by inaccurate positioning in prior-art schemes that realize the AR function with tools and plug-ins, and improves game stability and user experience. As a result, Android users, as well as iOS users whose devices do not support the AR function, can also experience the AR playing method in the game, visit the AR virtual world, and feel the game experience of the game character switching between the real scene and the virtual scene.
Those skilled in the art will appreciate that the drawings are merely schematic illustrations of one preferred implementation scenario, and that the modules or flows in the drawings are not necessarily required to practice the present application. Those skilled in the art will appreciate that modules in an apparatus in an implementation scenario may be distributed in an apparatus in an implementation scenario according to an implementation scenario description, or that corresponding changes may be located in one or more apparatuses different from the implementation scenario. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The above embodiment serial numbers are merely for description and do not represent the superiority or inferiority of the implementation scenarios. The foregoing disclosure is merely a few specific implementation scenarios of the present application; however, the present application is not limited thereto, and any variation that can be conceived by a person skilled in the art shall fall within the protection scope of the present application.
Claims (10)
1. A game scene transition method, for a game client, comprising:
responding to an AR mode interactive operation request, acquiring a first real-time virtual image corresponding to a first virtual camera in a three-dimensional game virtual world, creating a second virtual camera and taking a real-world image acquired by a client camera as a first real-time real image corresponding to the second virtual camera;
Creating a first inlet material ball, and storing the first real-time virtual image as a first map in a pre-allocated memory; mapping the first mapping to a first transfer gate according to first mapping information corresponding to the first inlet material ball to obtain a real-time virtual transfer gate image, and rendering the real-time virtual transfer gate image to the first real-time real image through 3D mapping according to a display position corresponding to the first inlet material ball to obtain a first real-time rendering image, so that the first real-time virtual image and the first real-time real image are respectively displayed inside and outside the first transfer gate, wherein the first transfer gate is obtained by rendering resources to be rendered to the first inlet material ball through the game rendering engine, and the corresponding virtual position of the first inlet material ball in the virtual world is determined according to a first initial display position corresponding to the virtual world or the position of a game role when an AR mode interactive operation request is generated; the virtual position of the first inlet material ball is a fixed position under a virtual world coordinate system;
and acquiring a real-time distance between the second virtual camera and the first inlet material ball, and converting from the real world to the virtual world when the real-time distance is smaller than a preset threshold value.
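The pipeline of claim 1 can be sketched in a few lines of Python. Everything below is illustrative: the class names, the reading of the "inlet material ball" as an entry material sphere that holds the portal texture, and the 0.5 threshold are assumptions, and actual rendering is reduced to bookkeeping so the control flow stays visible.

```python
import math
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

    def distance_to(self, other: "Vec3") -> float:
        return math.dist((self.x, self.y, self.z), (other.x, other.y, other.z))

@dataclass
class EntryMaterialSphere:
    """Stand-in for the 'first inlet material ball': a fixed anchor in
    virtual-world coordinates that owns the portal mesh and its texture slot."""
    position: Vec3
    portal_texture: object = None

@dataclass
class ARPortalState:
    virtual_cam_view: object       # first real-time virtual image (first virtual camera)
    device_frame: object           # first real-time real image (client camera feed)
    real_cam_position: Vec3        # position of the second virtual camera
    sphere: EntryMaterialSphere
    switch_threshold: float = 0.5  # preset distance threshold (value is illustrative)
    in_virtual_world: bool = False

def render_portal_frame(state: ARPortalState) -> dict:
    """One frame of the claim-1 pipeline, with rendering reduced to bookkeeping."""
    # 1. Cache the virtual-world view as the "first map" on the sphere's texture slot.
    state.sphere.portal_texture = state.virtual_cam_view
    # 2. Composite: the virtual world shows inside the portal, the camera feed outside it.
    frame = {
        "outside_portal": state.device_frame,
        "inside_portal": state.sphere.portal_texture,
        "portal_position": state.sphere.position,
    }
    # 3. Switch from the real world to the virtual world once the player is close enough.
    if state.real_cam_position.distance_to(state.sphere.position) < state.switch_threshold:
        state.in_virtual_world = True
    return frame
```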
2. The method of claim 1, wherein after creating the second virtual camera, the method further comprises:
generating a rocker control interface at the game client, and binding the rocker control to the second virtual camera, so that the real-time distance in the real world between the second virtual camera and the first inlet material ball changes according to rocker control operation data, the first inlet material ball being located in the real world.
3. The method of claim 1, wherein after creating the first virtual camera, the method further comprises:
binding a gyroscope of the game client to the first virtual camera, so that an image capturing angle of the first virtual camera in the virtual world coordinate system changes according to detection data of the gyroscope, the first real-time virtual image being related to the image capturing angle of the first virtual camera.
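Claims 2 and 3 bind the two input sources to the two cameras: the rocker (joystick) control translates the second virtual camera in the real world, while the gyroscope rotates the first virtual camera in the virtual world. A minimal sketch, reusing the Vec3 helper above; the speed, frame interval, and axis mapping are arbitrary illustrative choices:

```python
def apply_rocker(real_cam_position: Vec3, stick_x: float, stick_y: float,
                 speed: float = 1.5, dt: float = 1 / 60) -> Vec3:
    """Claim 2: rocker (joystick) input translates the second virtual camera,
    which changes its real-time distance to the first inlet material ball."""
    return Vec3(real_cam_position.x + stick_x * speed * dt,
                real_cam_position.y,
                real_cam_position.z + stick_y * speed * dt)

def apply_gyroscope(capture_angles: tuple, gyro_rates: tuple,
                    dt: float = 1 / 60) -> tuple:
    """Claim 3: gyroscope rates (rad/s about x, y, z) steer the first virtual
    camera's image capturing angle in the virtual world coordinate system."""
    return tuple(a + r * dt for a, r in zip(capture_angles, gyro_rates))
```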
4. The method of claim 1, wherein a size of the first transfer gate in the first real-time rendered image is inversely related to the real-time distance.
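One plausible reading of the inverse relation in claim 4 is a simple perspective effect: the closer the second virtual camera comes to the first inlet material ball, the larger the first transfer gate appears on screen. The constant and clamp bounds below are illustrative only:

```python
def portal_screen_scale(distance: float, k: float = 1.0,
                        min_scale: float = 0.1, max_scale: float = 4.0) -> float:
    """Apparent portal size grows as the real-time distance shrinks (claim 4)."""
    return max(min_scale, min(max_scale, k / max(distance, 1e-6)))
```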
5. The method of claim 1, wherein the converting from the real world to the virtual world specifically comprises:
destroying the first inlet material ball, acquiring a second real-time virtual image corresponding to the first virtual camera, taking a real-world image acquired by the client camera as a second real-time real image corresponding to the second virtual camera, and storing the second real-time real image as a second map;
creating a second inlet material ball, wherein the creation position corresponding to the second inlet material ball in the virtual world is determined according to a second initial display position of the virtual world or according to the virtual position of the first inlet material ball, the second inlet material ball being located at a fixed position in the virtual world coordinate system;
in a case where a virtual world display area corresponding to the second real-time virtual image comprises a second transfer gate, determining display form information of the second transfer gate based on the positional relation between the first virtual camera and the second transfer gate and on the image capturing angle of the first virtual camera, mapping the second map onto the second transfer gate according to the display form information and second map information corresponding to the second inlet material ball to obtain a real-time real transfer gate image, and rendering the real-time real transfer gate image onto a creation position corresponding to the second real-time virtual image through 3D mapping to obtain a second real-time rendered image;
and in a case where the virtual world display area corresponding to the second real-time virtual image does not contain a second transfer gate, displaying the second real-time virtual image, so that the second real-time real image and the second real-time virtual image are displayed inside and outside the second transfer gate respectively, wherein the second transfer gate is obtained by rendering resources to be rendered onto the second inlet material ball through the game rendering engine.
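Claim 5 mirrors the entry flow for the return trip: the first inlet material ball is destroyed, a second one is created at a fixed virtual-world position, and the live camera feed becomes the texture shown inside the second transfer gate, but only when that gate falls inside the displayed virtual-world area. A minimal sketch reusing the helpers above; the visibility flag stands in for a proper view-frustum test:

```python
def render_return_frame(virtual_view, device_frame, first_cam_position: Vec3,
                        capture_angles: tuple, second_sphere: EntryMaterialSphere,
                        portal_in_view: bool) -> dict:
    """One frame of the claim-5 return path, composed inside the virtual world."""
    # The second real-time real image (camera feed) is cached as the "second map".
    second_sphere.portal_texture = device_frame
    if not portal_in_view:
        # Second transfer gate outside the displayed area: show only the virtual view.
        return {"view": virtual_view}
    # Display form depends on the camera-to-gate geometry and the capture angle.
    distance = first_cam_position.distance_to(second_sphere.position)
    return {
        "outside_portal": virtual_view,                 # virtual world around the gate
        "inside_portal": second_sphere.portal_texture,  # real world seen through it
        "portal_scale": portal_screen_scale(distance),
        "portal_orientation": capture_angles,
    }
```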
6. The method of claim 5, wherein after destroying the first inlet material ball, the method further comprises:
binding the rocker control of the game client to the first virtual camera, so that the position of the first virtual camera in the virtual world coordinate system changes according to the rocker control operation data, the second real-time virtual image being related to the position of the first virtual camera.
7. The method of claim 5, wherein prior to rendering, by the game rendering engine, a second real-time rendered image comprising a second transfer gate, the method further comprises:
fusing a plurality of leading images in the second real-time virtual image respectively with a plurality of corresponding preset special-effect images, to obtain a plurality of transfer special-effect fused images.
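Claim 7 softens the moment of transition by fusing the leading frames of the second real-time virtual image with preset special-effect images. A per-frame alpha blend is one plausible reading; the fade profile and the nested-list frame representation are assumptions made for a dependency-free sketch:

```python
def fuse_transition_frames(virtual_frames, effect_frames):
    """Blend the leading virtual-world frames with preset special-effect frames,
    fading the effect out over the sequence (claim 7). Frames are nested lists
    of numeric pixel values with matching shapes."""
    fused = []
    n = min(len(virtual_frames), len(effect_frames))
    for i in range(n):
        alpha = 1.0 - i / max(n - 1, 1)   # effect strongest on the first frame
        fused.append([
            [(1 - alpha) * v + alpha * e for v, e in zip(v_row, e_row)]
            for v_row, e_row in zip(virtual_frames[i], effect_frames[i])
        ])
    return fused
```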
8. A game scene transition device for a game client, comprising:
an image acquisition module, configured to: in response to an AR mode interactive operation request, acquire a first real-time virtual image corresponding to a first virtual camera in a three-dimensional game virtual world, create a second virtual camera, and take a real-world image acquired by a client camera as a first real-time real image corresponding to the second virtual camera;
a rendering module, configured to: create a first inlet material ball, and store the first real-time virtual image as a first map in a pre-allocated memory; map the first map onto a first transfer gate according to first map information corresponding to the first inlet material ball to obtain a real-time virtual transfer gate image, and render the real-time virtual transfer gate image onto the first real-time real image through 3D mapping according to a display position corresponding to the first inlet material ball to obtain a first real-time rendered image, so that the first real-time virtual image and the first real-time real image are displayed inside and outside the first transfer gate respectively, wherein the first transfer gate is obtained by rendering resources to be rendered onto the first inlet material ball through a game rendering engine, and the virtual position corresponding to the first inlet material ball in the virtual world is determined according to a first initial display position corresponding to the virtual world or according to the position of a game character when the AR mode interactive operation request is generated, the virtual position of the first inlet material ball being a fixed position in a virtual world coordinate system;
a conversion module, configured to acquire a real-time distance between the second virtual camera and the first inlet material ball, and convert from the real world to the virtual world when the real-time distance is smaller than a preset threshold value.
9. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any one of claims 1 to 7.
10. A computer device comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, wherein the processor implements the method of any one of claims 1 to 7 when executing the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110280556.0A CN112933606B (en) | 2021-03-16 | 2021-03-16 | Game scene conversion method and device, storage medium and computer equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112933606A CN112933606A (en) | 2021-06-11 |
CN112933606B true CN112933606B (en) | 2023-05-09 |
Family
ID=76230085
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110280556.0A Active CN112933606B (en) | 2021-03-16 | 2021-03-16 | Game scene conversion method and device, storage medium and computer equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112933606B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113730905A (en) * | 2021-09-03 | 2021-12-03 | 北京房江湖科技有限公司 | Method and device for realizing free migration in virtual space |
CN114265496A (en) * | 2021-11-30 | 2022-04-01 | 歌尔光学科技有限公司 | VR scene switching method and device, VR head-mounted equipment and storage medium |
CN114398132B (en) * | 2022-01-14 | 2024-05-10 | 北京字跳网络技术有限公司 | Scene data display method and device, computer equipment and storage medium |
CN114461064B (en) * | 2022-01-21 | 2023-09-15 | 北京字跳网络技术有限公司 | Virtual reality interaction method, device, equipment and storage medium |
CN118236700A (en) * | 2022-12-16 | 2024-06-25 | 腾讯科技(成都)有限公司 | Character interaction method, device, equipment and medium based on virtual world |
CN116243831B (en) * | 2023-05-12 | 2023-08-08 | 青岛道可云网络科技有限公司 | Virtual cloud exhibition hall interaction method and system |
CN116260956B (en) * | 2023-05-15 | 2023-07-18 | 四川中绳矩阵技术发展有限公司 | Virtual reality shooting method and system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108805989A (en) * | 2018-06-28 | 2018-11-13 | 百度在线网络技术(北京)有限公司 | Method, apparatus, storage medium and the terminal device that scene is passed through |
CN109829964A (en) * | 2019-02-11 | 2019-05-31 | 北京邮电大学 | The rendering method and device of Web augmented reality |
CN110515452A (en) * | 2018-05-22 | 2019-11-29 | 腾讯科技(深圳)有限公司 | Image processing method, device, storage medium and computer equipment |
CN111684393A (en) * | 2017-12-22 | 2020-09-18 | 奇跃公司 | Method and system for generating and displaying 3D video in virtual, augmented or mixed reality environment |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108159685B (en) * | 2017-12-08 | 2020-09-22 | 上海感悟通信科技有限公司 | Virtual rocker control method and system based on gyroscope, medium and equipment thereof |
CN107918949A (en) * | 2017-12-11 | 2018-04-17 | 网易(杭州)网络有限公司 | Rendering intent, storage medium, processor and the terminal of virtual resource object |
CN108665553B (en) * | 2018-04-28 | 2023-03-17 | 腾讯科技(深圳)有限公司 | Method and equipment for realizing virtual scene conversion |
WO2020201998A1 (en) * | 2019-04-03 | 2020-10-08 | Purple Tambourine Limited | Transitioning between an augmented reality scene and a virtual reality representation |
CN112192568A (en) * | 2020-09-30 | 2021-01-08 | 广东唯仁医疗科技有限公司 | Intelligent shopping robot control method and system based on 5G network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112933606B (en) | Game scene conversion method and device, storage medium and computer equipment | |
US12134037B2 (en) | Method and system for directing user attention to a location based game play companion application | |
JP6708689B2 (en) | 3D gameplay sharing | |
CN112862935B (en) | Game role movement processing method and device, storage medium and computer equipment | |
US9283473B2 (en) | Game providing device | |
US10573060B1 (en) | Controller binding in virtual domes | |
US9728011B2 (en) | System and method for implementing augmented reality via three-dimensional painting | |
WO2022083452A1 (en) | Two-dimensional image display method and apparatus for virtual object, and device and storage medium | |
JP7503122B2 (en) | Method and system for directing user attention to a location-based gameplay companion application | |
CN111694430A (en) | AR scene picture presentation method and device, electronic equipment and storage medium | |
CN112312111A (en) | Virtual image display method and device, electronic equipment and storage medium | |
WO2018000608A1 (en) | Method for sharing panoramic image in virtual reality system, and electronic device | |
KR20170105069A (en) | Method and terminal for implementing virtual character turning | |
CN109448050A (en) | A kind of method for determining position and terminal of target point | |
US10740957B1 (en) | Dynamic split screen | |
CN112891940B (en) | Image data processing method and device, storage medium and computer equipment | |
CN111973984A (en) | Coordinate control method and device for virtual scene, electronic equipment and storage medium | |
CN111897437A (en) | Cross-terminal interaction method and device, electronic equipment and storage medium | |
WO2023065949A1 (en) | Object control method and apparatus in virtual scene, terminal device, computer-readable storage medium, and computer program product | |
CN114632324A (en) | Immersive space virtual establishment system and method | |
CN114404993A (en) | Game data processing method and device, electronic equipment and storage medium | |
CN118161857A (en) | Task display method, device, storage medium and equipment | |
WO2023205145A1 (en) | Interactive reality computing experience using multi-layer projections to create an illusion of depth | |
WO2024039887A1 (en) | Interactive reality computing experience using optical lenticular multi-perspective simulation | |
WO2024039885A1 (en) | Interactive reality computing experience using optical lenticular multi-perspective simulation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |