CN116954824A - Runtime system supporting multi-process mixed operation of multiple extended reality (XR) technical specification application programs and 2D application programs, data interaction method, device and medium
- Publication number
- CN116954824A (application CN202310946710.2A)
- Authority
- CN
- China
- Prior art keywords
- application
- display
- standard
- module
- sdk
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/455—Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
- G06F9/45533—Hypervisors; Virtual machine monitors
- G06F9/45537—Provision of facilities of other operating environments, e.g. WINE
Abstract
The application provides a runtime system supporting multi-process mixed operation of applications built against multiple extended reality (XR) technical specifications together with 2D applications, as well as a data interaction method, a device and a medium. For XR application developers, the runtime system supports multiple XR standard technical specifications, offers good universality and compatibility, supports the WebGPU graphics API and mixed operation with 2D applications, and provides extension capability for extending XR standards. For XR device manufacturers, it effectively reduces the difficulty of supporting multiple XR standards on an XR device, provides an application ecosystem in which applications of multiple XR standards run together with 2D applications in multiple processes, and offers an efficient and convenient method for integrating a custom SystemUI.
Description
Technical Field
The application relates to the technical field of extended reality (XR), and in particular to a runtime system, a data interaction method, a device and a medium supporting multi-process mixed operation of applications of multiple extended reality (XR) technical specifications and 2D applications.
Background
Because there is no universal augmented reality operating system worldwide in the full sense at present, the VR, AR and MR (collectively XR) capabilities of current devices are not provided directly by the operating system or the GPU. The current solution of XR device manufacturers is to provide an XR SDK and an XR Driver; when the XR SDK includes an XR standard runtime, an application developer calls the XR SDK API or the XR standard runtime API to indirectly use the XR capabilities of the device. However, there are many XR standards in the global market: for example, the GSXR standard is mainly used in the Chinese market, while the OpenXR standard is mainly used in markets outside China, so it is very difficult for the XR SDK provided by an XR device manufacturer to support the runtimes of multiple XR standards at the same time. As a result, an application built as a GSXR version can only run on XR devices that support the GSXR runtime API, even though the application does not actually depend on which XR capabilities the device already provides, but only on whether an implementation of the GSXR runtime API is present on the device. The present application therefore provides a method for automatically converting the XR standard runtime API in the XR SDK currently provided by the device into the XR standard runtime API actually required by the target XR application, so that a GSXR application can run on an XR device supporting the OpenXR standard and, likewise, an OpenXR application can run on an XR device supporting the GSXR standard.
Most XR device manufacturers currently provide proprietary XR SDKs built on XR standards. For example, Meta's Quest devices provide the OpenXR Mobile SDK, Pico's devices provide an SDK also named OpenXR Mobile SDK, and Apple has just released the visionOS operating system and its XR SDK. Therefore, if the same XR application is to run on devices from different XR device manufacturers, it must not only target different operating systems or CPU architectures, but also be compiled into versions matching each manufacturer's XR SDK, which greatly increases the development cost and cycle for application developers.
At present, when XR applications are compiled, differences between GPU graphics libraries require separate executables for each graphics library version; for example, the same application may have both a Vulkan and an OpenGL version. If the XR SDK at runtime supports only one GPU graphics library, applications built for the other graphics library cannot run.
WebGPU is a new generation of Web-based 3D graphics API whose performance is greatly improved over WebGL. Besides being used in browsers, it can also be used to develop efficient cross-platform graphics applications natively, and it is expected to become one of the mainstream graphics APIs for developing XR applications. There is therefore an urgent need to provide an XR standard runtime based on the WebGPU graphics API on XR devices.
The number of XR applications on the market today is far smaller than in the mobile application market. If an XR device supports only a single XR standard runtime, few XR applications can run on it; only when an XR device manufacturer supports runtimes for more XR standards can users run more XR applications on that device.
There is no doubt that XR applications are now growing rapidly, but most mainstream applications on the market today are still 2D applications, and consumers also need to use these 2D applications directly on XR devices. An efficient method for displaying and operating 2D applications on XR devices is therefore strongly needed.
2D applications in PWA form are currently the most promising. Reference may be made to another patent by the present inventor, Zhang Yanghua: "Runtime library based on the browser PWA standard, data interaction method, device and medium", CN115729614A, application No. 202211534038.8.
Disclosure of Invention
The application aims to solve the technical problems that it is difficult to support multiple extended reality (XR) technical specifications on an XR device, that existing XR runtimes cannot support the next generation of Web-based 3D graphics APIs, and that 2D applications and extended reality (XR) applications are difficult to run together and interact with smoothly.
In view of this, the present application provides a runtime system, a data interaction method, a device and a medium supporting multi-process mixed operation of applications of multiple extended reality (XR) technical specifications and 2D applications. It aims to provide a runtime system with better universality and compatibility, addressing the strong demands created by the rapid development of the XR industry: frequently updated XR device capabilities, mixed operation of XR applications and 2D applications, XR applications developed against multiple XR standards and multiple graphics APIs, and XR applications using the new WebGPU graphics API, all of which require XR device manufacturers to provide technical solutions in which multiple XR standards and multiple graphics APIs run simultaneously and mix with 2D applications.
In order to achieve the above object, the present application adopts the following technical scheme.
In a first aspect, the present application provides a runtime system supporting multi-process mixed operation of applications of multiple extended reality (XR) technical specifications and 2D applications. The runtime system is deployed in a device that further comprises an operating system, device drivers and the like, and the runtime system comprises: an XR standard runtime API proxy module, an XR AppService loading module, an XR SDK Provider sandbox process module, an XR application management module, an XR input/output management module, an XR display management module, an XR space management module, an XR embedded management module, an XR application lifecycle management module, an XR standard transcoding module, an XR extension management module and the like.
XR standard runtime API proxy module: used to adapt to the runtime API requirements of the various XR standards; the XR standards widely used in the world today include OpenXR, GSXR and the like (a minimal forwarding sketch follows this module list).
XR AppService loading module: used to load onto the XR device the latest stable version of the XR SDK provided by the device manufacturer. The latest available XR SDK version is controlled by a remote server: if the remote version is consistent with the version on the current XR device, no update is needed; if the remote server provides a newer version, the user is prompted to download the update, and after the update completes the XR App process is automatically restarted. Because an XR SDK update may fail for unknown reasons, an XR SDK version rollback function is also provided.
XR SDK Provider sandbox process module: used to execute, in a sandboxed process, the XR functions provided by the XR SDK on the device, thereby providing performance isolation, security isolation, stability isolation and the like.
XR application management module: used to manage the running states of XR Apps, 2D applications and system functions such as SystemUI, Popup and Notify, and to provide programmable interfaces for opening and closing applications.
XR input/output management module: used to manage XR input and to provide programmable interfaces for adding/deleting XR input devices, filtering XR input, forwarding XR input and the like.
XR display management module: used to manage the display output of the composited rendering results of all running XR applications, and to provide programmable interfaces such as Overlay, Capture and frame-rate monitoring.
XR space management module: used to manage origin calculation during the running of all XR applications and the various spatial pose conversions between XR applications and the XR SDK Provider process.
XR embedded management module: used to manage the display content of all external 2D applications and embed it into the XR App space for display composition; in particular, it displays 2D applications within the XR App space and forwards received XR input in a format the 2D application can recognize.
XR application lifecycle management module: used to manage the running states of XR App processes on the device; these states are Ready, Synchronized, Visible, Focused, Stopping and the like.
XR standard transcoding module: used to parse the XR standard API calls used by an XR application into API calls of the internal modules of the runtime system and return the results to the XR application through the XR standard runtime API proxy module; it is also used to remotely invoke, with the corresponding parameters, the XR standard runtime API in the XR SDK Provider process.
XR extension management module: used to manage the running state of each extension function in the runtime system. Extension functions are implemented as dynamic libraries; when an XR application needs to load an extension function, the XR extension management module requests the dynamic library resource of that extension, and after the request succeeds, the lifecycle of the extension function is executed in an extension implementation thread within the XR application.
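For illustration only, the cooperation between the XR standard runtime API proxy module and the XR standard transcoding module described above can be pictured as an in-process stub that packs each XR standard API call into an IPC message, and a host-side dispatcher that resolves it into internal module calls. The sketch below is a minimal C++ approximation under that assumption; all type and function names (XrCallMessage, IpcChannel, Transcode and so on) are hypothetical and are not part of any real XR SDK or of the patented implementation.

```cpp
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct XrCallMessage {            // a serialized XR standard API call
    std::string api;              // e.g. "xrCreateInstance"
    std::vector<std::string> args;
};

// Stand-in for the IPC link between the XR App process and the XR AppService main
// process (a real system would use sockets, shared memory, Binder, ...).
class IpcChannel {
public:
    explicit IpcChannel(std::function<std::string(const XrCallMessage&)> server)
        : server_(std::move(server)) {}
    std::string Send(const XrCallMessage& msg) { return server_(msg); }
private:
    std::function<std::string(const XrCallMessage&)> server_;
};

// XR standard transcoding module (XR AppService main process side): resolves a
// standard API name into a call on an internal runtime-system module.
std::string Transcode(const XrCallMessage& msg) {
    static const std::map<std::string, std::string> route = {
        {"xrCreateInstance", "ApplicationManagement::CreateXrApp"},
        {"xrCreateSession",  "ApplicationManagement::CreateXrSession"},
        {"xrSyncActions",    "InputOutputManagement::SyncActions"},
    };
    auto it = route.find(msg.api);
    return it != route.end() ? "handled by " + it->second : "unknown API";
}

// XR standard runtime API proxy (XR App process side): forwards every call over IPC.
std::string ProxyCall(IpcChannel& ipc, const std::string& api,
                      std::vector<std::string> args = {}) {
    return ipc.Send({api, std::move(args)});
}

int main() {
    IpcChannel ipc(Transcode);  // the proxy's connection to the XR AppService main process
    std::cout << ProxyCall(ipc, "xrCreateInstance") << "\n";
    std::cout << ProxyCall(ipc, "xrSyncActions") << "\n";
}
```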
In a second aspect, the present application provides a data interaction method for a runtime system supporting multi-process mixed operation of applications of multiple extended reality (XR) technical specifications and 2D applications. The runtime system is deployed in a device, and the manifest file, the XR SDK Provider sandbox program file and the extension dynamic libraries may be deployed on a remote server or on the device. The method includes:
1. when an XR application on the device is started, it connects to the XR AppService main process through the XR standard API calls integrated by the XR standard proxy API module; if the XR AppService main process has not yet been started, the XR standard proxy API module starts it and establishes an IPC connection with it;
2. after the XR AppService main process is created, the XR AppService loading module requests the latest version of the manifest file from the server side on the remote server or on the device; the server side receives the HTTP/HTTPS/FILE request from the client-side XR loading module, and the request contains the client's configuration information, such as the runtime system version number, the operating system version/name, the device ID and the display resolution;
3. the remote server parses the key information in the XR AppService loading module request and returns the manifest file corresponding to the client XR device; the manifest file includes the information necessary to run the XR SDK Provider sandbox process on the client device, specifically the XR SDK Provider sandbox program file, the extension dynamic libraries and other information (a hypothetical manifest sketch follows this list);
4. after the XR AppService loading module in the runtime system parses the extension fields in the manifest file, it requests, through the extension management module, the extension implementation thread of the XR application to load the extension dynamic libraries on which the XR application depends;
5. the XR AppService loading module in the runtime system creates the XR SDK Provider sandbox process and establishes an IPC connection with it;
6. the XR AppService main process in the runtime system creates key resources such as an application lifecycle and a display compositor for the XR application;
7. according to the XR application lifecycle, the XR AppService main process in the runtime system composites, in a specific order, the rasterization results of the display frames of all XR applications, the SystemUI and Popup frames from the application management module, the Overlay frames from the display management module, and the 2D application frames from the XR embedded module into a final display frame, and submits it to the XR SDK Provider sandbox process for final display;
8. if an XR application uses the WebGPU graphics API, the runtime system provides the XR application with the ability to render using the WebGPU graphics API through a built-in WebGPU XR standard extension, and composites the XR application's rendered frames into the display frames of the XR SDK by integrating a WebGPU native server in the XR SDK Provider process;
9. if an XR application needs to extend the XR standard or redefine the implementation of an XR standard API, the extension management module in the runtime system provides a method for the XR application to implement extension functions;
10. if the graphics API used by an XR application is inconsistent with the graphics API used by the XR SDK Provider, the GPU texture data format rendered by the XR application is converted, through a ShareTexture module in the runtime system, into the GPU texture data format required by the XR SDK at runtime;
11. if the XR standard used by an XR application and the XR standard used by the XR SDK Provider process are inconsistent, the runtime system adapts and converts XR standard calls among the XR application, the XR AppService main process and the XR SDK Provider sandbox process through the XR standard transcoding module;
12. if the device manufacturer needs to customize the SystemUI function, the SystemUI module of the runtime system provides a method for customizing the SystemUI;
13. if an XR application needs to integrate the display content of a local or remote-server PWA 2D application into the XR display environment, the XR embedded module of the runtime system provides a means of embedding external 2D application display content into the XR application for display composition.
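For illustration only, the manifest content referred to in steps 2-4 above might contain fields such as a version, an update address, the XR SDK Provider sandbox program file and an extension list. The following C++ sketch models such a manifest after it has been parsed; all field names, the example URL and the file names are assumptions made for this sketch, not taken from the patent.

```cpp
#include <iostream>
#include <string>
#include <vector>

struct ExtensionEntry {
    std::string name;      // e.g. "XR_EXT_WebGPU_enable"
    std::string library;   // dynamic library loaded in the XR application's extension thread
    std::string version;
};

struct XrAppServiceManifest {
    std::string version;              // compared against the locally cached manifest
    std::string update_url;           // server used for later update checks
    std::string device_id;            // optional; otherwise derived on the device
    std::string xr_sdk_provider;      // XR SDK Provider sandbox program file
    std::vector<ExtensionEntry> extensions;
};

int main() {
    XrAppServiceManifest m{
        "1.2.0",
        "https://example.com/xr_appservice/manifest",  // placeholder address
        "",                                            // empty: let the device derive it
        "xr_sdk_provider.bin",
        {{"XR_EXT_WebGPU_enable", "libxr_ext_webgpu.so", "0.1"}}};
    std::cout << "SDK provider: " << m.xr_sdk_provider
              << ", extensions: " << m.extensions.size() << "\n";
}
```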
In a third aspect, an embodiment of the present application further provides an apparatus, including: a processor, a memory, and a communication unit;
the memory stores machine-readable instructions executable by the processor, and when the device is operating, the processor and the memory communicate via the communication unit;
wherein the processor executes the machine readable instructions to perform the method of the above aspects.
In a fourth aspect, embodiments of the present application also provide a computer-readable storage medium having a computer program stored thereon which, when executed by a processor, performs the method of the above aspects.
The beneficial effects of the application are as follows:
1. with the technical solution of the runtime system supporting multi-process mixed operation of applications of multiple extended reality (XR) technical specifications and 2D applications, an XR device manufacturer that originally only needed to support one XR standard can, by integrating the runtime system, additionally support applications of other XR standards running at the same time; this effectively saves the device manufacturer's development cost while adding support for more XR standards to the device. An XR application developer no longer needs to recompile or develop separate application versions for multiple XR device manufacturers: by using the XR standard runtime system provided by the present application, the same version of an XR application can run on devices from multiple manufacturers, so the runtime system has better universality and compatibility;
2. the runtime system provides XR applications with a technical solution for integrating advanced graphics APIs such as WebGPU under multiple extended reality (XR) technical specifications;
3. the runtime system of the present application provides a solution for XR applications to embed 2D applications in 3D space under multiple extended reality (XR) specifications;
4. the runtime system uses a multi-process architecture to provide data, performance and stability isolated operating environments for all XR applications;
5. through its runtime extension API mechanism, the runtime system provides device manufacturers with a scheme for extending the XR standard and XR system capabilities, which can effectively strengthen the system extension capability and performance optimization capability of XR applications on the device.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 shows a process architecture diagram of a runtime system of the present application.
FIG. 2 illustrates a block diagram of the architecture of a runtime system of the present application.
FIG. 3 shows a loading flow chart of the XR AppService main process in the runtime system of the present application.
FIG. 4 illustrates a flow chart of XR AppService main process operation and XR application lifecycle management in the runtime system of the present application.
FIG. 5 illustrates a flow chart of the XR display compositor compositing application-process rendered frames into a display frame of the XR SDK in the runtime system of the present application.
FIG. 6 illustrates a flow chart of a graphics API that extends the WebGPU as an XR standard available in the runtime system of the present application.
FIG. 7 illustrates a flow chart of converting the GPU texture data format rendered by an XR application into the GPU texture data format required by the XR SDK at runtime, in the runtime system of the present application.
FIG. 8 illustrates a flow chart of an XR standard transcoding module parsing a plurality of XR standard APIs in a runtime system in accordance with the present application.
FIG. 9 shows a flow chart of the SystemUI module implementation and integration extension SystemUI in a runtime system of the present application.
FIG. 10 illustrates a flow chart of embedding external 2D application display content into an XR application display composition in a runtime system of the present application.
FIG. 11 illustrates a flow chart of origin computation during XR application operation and the various spatial pose conversions between XR applications and the XR SDK Provider process, in the runtime system of the present application.
Description of the embodiments
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application.
In order to make the description of the technical architecture of the embodiments clearer, the XR SDK Provider sandbox program and the XR applications in the embodiments of the present application use the OpenXR standard as the extended reality (XR) technical specification. OpenXR is the most popular XR technical specification in the world, and the OpenXR standards body supports an open-source OpenXR runtime project named Monado, which is developing rapidly; many XR device manufacturers also plan to support and contribute code to the Monado project, so it can be expected that most XR device manufacturers will provide an XR SDK with an OpenXR-standard runtime. Using the OpenXR standard in the embodiments therefore helps technical developers understand more quickly. It must be pointed out, however, that the embodiments only show an implementation based on the OpenXR standard; if the XR SDK Provider sandbox program and the XR applications use other extended reality (XR) technical specifications, such as the GSXR standard used in China, the solution can be implemented on those standards in the same way.
The runtime system of the present application adopts a multi-process architecture, as shown in fig. 1, and includes three types of processes plus an extension implementation thread: the XR AppService main process, the XR SDK Provider sandbox process, the XR App process, and the extension implementation thread.
An XR App process is created by interactive requests from the user through operating system calls on the XR device.
The XR AppService main process is created proactively when the XR device starts, or created when an XR App process on the device calls an XR standard initialization API of the XR standard runtime API proxy module in the runtime system of the present application; it provides the following functions: the XR AppService loading module, the application management module, the input/output management module, the display composition module, the embedded management module, the extension management module, the SystemUI module and the like.
The XR SDK Provider sandbox process is created by an XR application management module in an XR AppService main process, the XR App process and the XR SDK Provider sandbox process are in a corresponding relation of many to one, and rendering results of the SystemUI module, the XR App process and the embedded 2D application are synthesized by an XR display synthesizer module in the XR AppService process and submitted to the XR SDK Provider sandbox process to finish final display.
The extension implementation thread exists in the XR App process; it can directly call, through IPC, extension functions provided by the XR AppService main process such as XR application management, input/output management and display management, and it can also rewrite the implementation of the XR standard interfaces requested by the current XR application through the APIs of the XR standard runtime API proxy module.
The device usually needs to support an ecosystem in which multiple XR applications and 2D applications run simultaneously. Suppose, for example, that the first, second and third applications are XR applications from different 3D application developers and the fourth application comes from a 2D application developer: the first XR application is a SystemUI-type application provided by the device vendor, the second XR application is a 3D game provided by application developer A and developed against XR standard X, and the third XR application is a 3D social application provided by another developer and developed against XR standard Y. The XR device is then required to provide not only the XR standard capabilities required by the three XR applications, but also the ability to mix the 2D application into the XR applications, and a display composition method that uniformly composites the rendering results of these 3D/2D applications into the final display frame.
To provide multi-process mixed operation of applications of multiple extended reality (XR) technical specifications and 2D applications on an XR device, together with multitasking of XR applications and 2D applications and the ability to extend XR standards, a first aspect of the present application provides a runtime system based on multi-process mixed operation of multiple XR technical specification applications and 2D applications. The runtime system resides in a device that further includes an operating system kernel, drivers and device built-in services. As shown in fig. 2, the runtime system comprises: an XR standard implementation module, an XR AppService loading module, an XR SDK runtime module, an XR input/output management module, an XR display management module, an XR space management module, an XR embedded management module, an XR display compositor module, an XR application management module, an XR application lifecycle module, an XR extension capability implementation module, an extension management module and a SystemUI module.
The XR standard realization module is used for adapting to the API requirements of various XR standards in running, and comprises two parts of XR standard agency realization and XR standard transcoding;
specifically, the XR standard agent implementation module 110 is deployed in a runtime library that invokes an XR standard API in an XRApp process;
specifically, the XR standard transcoding module 111 is deployed in the XR AppService main process, and is used for resolving the request of the XR standard API to function calls of each module of the runtime system according to the present application, and is also used for constructing the XR standard API call of the XR SDK Provider sandbox process.
The XR AppService loading module 112 is deployed in the XR AppService main process. It requests from the remote server, according to the key information of the XR device, the latest versions of file resources such as the XR SDK Provider sandbox program and the related extension dynamic libraries, and updates and installs them on the device to prepare the operating environment of the XR AppService main process; after updating, it creates and connects to the XR SDK Provider sandbox process to complete the startup of the XR AppService main process;
specifically, the content that needs to be loaded onto the device in advance includes: the method comprises the steps of a manifest file, an XR SDK Provider sandbox program described in the manifest file and an extended dynamic library file;
Specifically, the XR AppService application running environment to be prepared includes: generating device_id, app_service directory, downloading file resources such as XR SDK Provider sandbox process and related expansion dynamic library, and starting an XR AppService main process and an XR SDK Provider sandbox process;
the identification device_id calculation method in the embodiment of the application is as follows: if the device_id is specified in the manifest, the manifest provider should ensure that the provided device_id is unique, and if the device_id is not specified in the manifest, the device_id is calculated in the manner of device vendor website domain name + operating system name + device name.
The XR SDK Runtime module 113 is deployed in the XR SDK Provider sandbox process and provides core hardware capabilities of the XR device such as graphics, display and input/output; the XR SDK and the XR standard runtime provided by the XR device manufacturer need to be integrated here, and if the device manufacturer cannot provide an XR standard runtime, the open-source Monado service program, an OpenXR standard runtime, can be integrated to implement the XR standard runtime within the XR SDK Runtime module.
The XR input/output management module 114 is deployed in the XR AppService main process and manages all XR input of the entire XR device. It obtains all XR input events from the XR standard runtime module in the XR SDK Provider sandbox process and transfers the full XR input data to the XR input/output management module for processing via shared memory; when each application requests input events, the input data is forwarded and formatted into the input format required by each XR application and 2D application according to its current focus state and running state (see the routing sketch below).
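As a purely illustrative sketch of the focus-based routing just described, the fragment below reduces the shared-memory transport to a plain struct and shows how an input batch might be forwarded either as XR input or, for a focused 2D embedded application, converted into a 2D input format. All type names are hypothetical.

```cpp
#include <iostream>
#include <string>
#include <vector>

struct XrInputEvent {          // one event batch copied from the XR SDK Provider via shared memory
    std::string device;        // e.g. "right_controller"
    float ray_x, ray_y;        // simplified pointing data
    bool select;
};

struct Input2D { float x, y; bool click; };  // the format a 2D embedded application understands

struct AppEntry { std::string name; bool is_2d_embedded; bool focused; };

// Convert an XR input event into the 2D format expected by an embedded application.
Input2D ToInput2D(const XrInputEvent& e) { return {e.ray_x, e.ray_y, e.select}; }

// Forward the event only to the application that currently holds focus.
void ForwardInput(const std::vector<AppEntry>& apps, const XrInputEvent& e) {
    for (const auto& app : apps) {
        if (!app.focused) continue;
        if (app.is_2d_embedded) {
            Input2D in = ToInput2D(e);
            std::cout << app.name << " receives 2D input (" << in.x << ", " << in.y << ")\n";
        } else {
            std::cout << app.name << " receives XR input from " << e.device << "\n";
        }
    }
}

int main() {
    std::vector<AppEntry> apps = {{"SystemUI", false, false}, {"pwa_browser", true, true}};
    ForwardInput(apps, {"right_controller", 0.4f, 0.6f, true});
}
```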
The XR display management module 115 is disposed in the XR AppService host process, and is configured to manage the composite display output of all the XR application rendering results, and provide a programmable interface such as Overlay, capture, refresh rate and frame rate control.
The XR space management module 116 is disposed in the XR AppService main process, and is configured to manage origin calculation in all XR application running, various types of space pose conversion between the XR application and the XR SDK Provider process, and provide functions such as resetting a field of view.
The XR embedded management module 117 is disposed in the XR AppService host process, and is configured to manage the running states of all external 2D applications supporting embedded display, where the external 2D applications have 6 main running states: ready, visible, focused, unFocused, inVisible, stopping.
The XR display compositor module 118 is deployed in the XR AppService main process and synchronizes the rendering cycles of the multiple XR applications and the XR SDK Provider sandbox process, so that each XR application, 2D embedded application, SystemUI, Popup, Notify, Embed and the like completes its corresponding rendering; it then composites the rendering results and finally submits them to the XR SDK Provider sandbox process to complete the display. Rendering data such as SystemUI data generated during rendering is cached as files in the app_service directory to accelerate the display of XR applications and the SystemUI.
The XR application management module 119 is deployed in an XR AppService main process, and is used for managing running states of system functions such as an XR application, a 2D embedded application, systemUI, popup, notify and the like, and providing programmable interfaces for opening and closing the XR application and the like.
The XR application lifecycle module 120 is deployed in the XR AppService main process and provides lifecycle management for XR applications; an XR application has 5 main running states in the device: Ready, Visible, Synchronized, Focused and Stopping, and the XR application management module is notified of an event whenever the XR application state changes (an enum sketch follows these state definitions);
specifically, the Ready state of the XR application is a representation that the application is Ready to render the relevant resource;
specifically, the Synchronized state of the XR application is a state indicating that a frame buffer has been prepared for the application;
specifically, the Visible state of an XR application indicates that the application has prepared its display-composition-related resources;
specifically, the Focused state of the XR application is a state indicating that the application is ready to receive device input events;
specifically, the Stopping state of an XR application indicates that the application is about to close and release all related resources.
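A possible encoding of the five lifecycle states listed above, shown as a small C++ enum with the state-change notification to the application management module reduced to a print statement; the state names mirror the text, while the helper functions are assumptions of this sketch.

```cpp
#include <iostream>
#include <string>

enum class XrAppLifecycle { Ready, Synchronized, Visible, Focused, Stopping };

const char* ToString(XrAppLifecycle s) {
    switch (s) {
        case XrAppLifecycle::Ready:        return "Ready";         // rendering resources prepared
        case XrAppLifecycle::Synchronized: return "Synchronized";  // frame buffers prepared
        case XrAppLifecycle::Visible:      return "Visible";       // takes part in display composition
        case XrAppLifecycle::Focused:      return "Focused";       // receives device input events
        case XrAppLifecycle::Stopping:     return "Stopping";      // closing and releasing resources
    }
    return "Unknown";
}

// The lifecycle module notifies the application management module on every state change.
void NotifyStateChange(const std::string& app, XrAppLifecycle state) {
    std::cout << app << " -> " << ToString(state) << "\n";
}

int main() {
    NotifyStateChange("demo_app", XrAppLifecycle::Ready);
    NotifyStateChange("demo_app", XrAppLifecycle::Focused);
    NotifyStateChange("demo_app", XrAppLifecycle::Stopping);
}
```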
The XR extension capability implementation module 121 is configured to implement functions of extending XR applications and rewriting XR standard implementation by calling API interfaces of XR application management, input/output management, display management, systemUI and the like running in an XR App Service process through a cross-process API, where implementation of XR extension capability can support multiple programming languages, such as Java/c++/Javascript and the like;
Specifically, a thread is created for each extension implementation to manage the running state of the extension, the extension capability implementation thread runs in an XRApp process, 5 running states are respectively Available, starting, working, stopping, unavailable in the device, and the XR extension management module can manage the running state of the extension through an IPC interface.
The extension management module 122 resides in the XR AppService main process and manages the running states of the extensions developed by all device manufacturers and application developers. The extension dynamic libraries reside in the extensions directory of the XR AppService program; because of permission restrictions, when an XR App process loads an extension it must call the cross-process loading API provided by the extension management module to complete the loading.
The SystemUI module 123 is deployed in the XR AppService main process and provides system-level UI functions such as BasicUI, Popup, Notify and CrashUI, and it offers an extension API interface to XR application developers so that a developer can also customize the SystemUI. A customized SystemUI runs in an independent XR process like an ordinary XR application, and the developer can develop it using the extension API interfaces of modules such as XR application management.
In a second aspect, the present application provides a data interaction method for a runtime system supporting multi-process mixed operation of applications of multiple extended reality (XR) technical specifications and 2D applications. The runtime system resides in a device, and the manifest file and the XR SDK Provider sandbox program may be deployed on a remote server or on the device. The method includes: a loading method for the XR AppService main process; a method for running the XR AppService main process and managing XR application lifecycles; a method for the XR display compositor to composite the rendered frames of all application processes; a method for extending WebGPU as a graphics API usable by the XR standard; a method for converting the GPU texture format rendered by an XR application into the GPU texture format required by the XR SDK at runtime; a method for the XR standard transcoding module to parse multiple XR standard APIs; a method for the SystemUI module to implement and integrate an extended SystemUI; and a method for embedding external 2D application display content into an XR application for display composition.
As shown in fig. 3, the XR AppService main process loading method includes:
s301, starting an XR AppService loading module according to a specified address;
specifically, the specified address may be a remote server address or a local path on the device; the specified address should be a valid manifest file address or local path, and if it is not a valid manifest file, the flow falls back to the manifest file address last saved locally on the device;
S302, retrieving a manifest file;
specifically, if the specified manifest file address is a local path on the device, the update_url needs to be parsed; if the device is currently networked, S303 is executed to attempt to update the XR AppService running environment;
specifically, if the specified manifest file address is a remote server address, the communication module on the device sends a request to the remote server through the HTTP/HTTPS protocol; the request contains the client's configuration information, such as the operating system version/name, the device ID and the display resolution, which is used to retrieve the manifest file, and S303 is executed to attempt to update the XR AppService running environment;
s303, analyzing the retrieved manifest file;
s303-1, generating a catalog app_service on the device;
specifically, if a manifest file already exists in the local app_service directory, the version numbers of the two manifest files are compared and the update is performed according to which version is newer (see the sketch after these steps);
s303-2, analyzing an xr_ sdk _provider field and generating a catalog xr_ sdk on the device;
specifically, the version of the local XR SDK Provider sandbox program is compared; if an update is needed, the new XR SDK Provider sandbox program is downloaded to the local xr_sdk directory and the user is guided to perform the update operation;
S303-3, analyzing the extensions field and generating directory extensions on the device;
specifically, comparing the versions of the local extensions, and if the versions need to be updated, downloading the new extensions to the local catalog extensions;
s303-4, creating and connecting to an XR SDK Provider sandboxed process;
specifically, the XR SDK Provider sandbox process is created, an IPC connection is established, and the flow waits for the XR SDK Provider sandbox process to return an initialization-success message;
S304, saving the successfully running manifest file locally;
S305, saving the successfully running extension dynamic libraries and the XR SDK Provider program file locally;
s306, ending the XR AppService loading.
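The following sketch condenses S301-S306 into a single hypothetical function: fetch the manifest, compare versions, install the newer XR SDK Provider sandbox program, and keep the previous version if the update fails (the rollback behavior mentioned for the XR AppService loading module). The helper names, the string-based version comparison and the example URL are assumptions of this sketch.

```cpp
#include <iostream>
#include <string>

struct Manifest {
    std::string version;
    std::string xr_sdk_provider;   // sandbox program file named in the manifest
};

// Stand-in for S302: a real implementation would issue an HTTP/HTTPS request or read a local file.
Manifest FetchManifest(const std::string& /*address*/) { return {"1.1.0", "xr_sdk_provider.bin"}; }

// Stand-in for S303-2: download and install the XR SDK Provider sandbox program.
bool InstallSdkProvider(const Manifest& m) {
    std::cout << "installing " << m.xr_sdk_provider << " v" << m.version << "\n";
    return true;  // pretend the update succeeded
}

void LoadAppService(const std::string& address, Manifest local) {
    Manifest remote = FetchManifest(address);          // S302
    if (remote.version > local.version) {              // S303: a newer version is available
        if (InstallSdkProvider(remote)) {
            local = remote;                            // S304/S305: persist the working files
        } else {
            std::cout << "update failed, rolling back to v" << local.version << "\n";
        }
    }
    std::cout << "starting XR SDK Provider sandbox v" << local.version << "\n";  // S303-4
}

int main() { LoadAppService("https://example.com/manifest", {"1.0.0", "xr_sdk_provider.bin"}); }
```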
As shown in fig. 4, the XR application lifecycle operation management method includes:
s401, when an XR application in the device is started, the XR App process is connected to an XR AppService main process by calling an XR standard API proxy module provided by the runtime system;
specifically, the XR standard API proxy module is the complete implementation of the XR standard used by the XR application and forwards all XR standard API calls to the XR standard transcoding module in the XR AppService main process (the detailed process is shown in fig. 8); before forwarding, an IPC connection needs to be established with the XR AppService main process;
specifically, if the XR AppService main process has not yet been created, the XR application service needs to be started; if it has already been created successfully, the flow jumps directly to S404;
S402, starting the XR application service;
specifically, the XR AppService main process is created and the flow waits until the XR application service has finished initializing successfully;
S403, the XR AppService main process loads the manifest file, the XR SDK Provider sandbox program and the extension items;
specifically, the manifest file describing the XR application service running environment may be stored locally or on a remote server; after the manifest content is parsed, the app_service directory and the device_id are generated locally (the detailed process is shown in fig. 3, S301 to S306);
specifically, the XR SDK Provider sandbox process executes the XR SDK Loader to initialize the XR standard runtime supported on the device; if the device manufacturer does not provide support for the XR standard, the open-source OpenXR runtime Monado is installed by default; if the OpenXR standard is used, the built-in OpenXR Loader is executed to load the OpenXR runtime in the XR SDK, the XR input/output, XR display and other parameters of the device are obtained, and the parameters are returned through IPC to the corresponding management modules in the XR AppService main process;
S404, creating an XRAppHost; the application creates an XR application lifecycle through the XR application management module interface (see the sketch after these steps);
specifically, after the XR AppService main process starts successfully, the XR application management, XR extension, XR input/output management, XR display management and other modules are initialized; after each module is initialized successfully, a corresponding XRAppHost instance is created for the connected XR application in the application management module of the XR AppService main process, serving as the IPC connection server that manages each XR application;
s405, running an XR application life cycle, wherein the application supports running XR applications of various XR standards simultaneously, and the life cycle of each XR application is managed by an XR application life cycle module;
s405-1, creating an XR application life cycle;
specifically, creating an XR application lifecycle instance in an XR App Host instance and initializing a lifecycle running state to Ready;
s405-2, establishing an XR application IPC connection, and remotely loading and expanding the XR application;
specifically, an XR App Client instance is created in an XR application process, and IPC connection is established with an XR App Host instance in an XR App service process;
specifically, an extension item to be loaded by an XR application is processed in an XR application process, all appointed extensions are loaded remotely through an extension management module in the XR AppService process, and the life cycle running state of the extension is switched to Synchronized;
S405-3, configuring display and input/output equipment;
specifically, the service life running state of the XR application is switched to Visible, and the display of the XR application and the state of the input equipment are configured through an XR input module and an XR display module;
s405-4, processing XR application running state and input event circulation;
specifically, the lifecycle running state of the XR application is switched to Focused; only then can the XR application receive input events. The XR input module sends events to the XR application, which uses them to compute updates to its rendering result; meanwhile, the SystemUI module switches the XR application lifecycle to states such as Visible or Synchronized according to user interaction events;
s405-5, synthesizing XR application rendering content;
specifically, the service life cycle running state of the XR application is Focused or Visible, and the display synthesizer module synthesizes and displays rendering contents, systemUI, popup, notify and the like of the XR application in sequence;
s405-6, stopping XR application life cycle;
specifically, stopping an XR application can be initiated by the XR application itself or by a SystemUI request; at this point the lifecycle running state of the XR application is switched to Stopping, and the display compositor no longer composites the rendering content of an XR application in the Stopping state;
S405-7, disconnecting the XR application IPC connection;
specifically, the XRAppHost instance is destroyed and the related input, display, compositor and other resources are released;
s406, destroying the XR application;
specifically, when the XR application exits, the XR App Client, the extension capability implementation thread and other resources are destroyed;
s407, storing the XR application cache data to the local equipment so as to accelerate the next starting of the XR application.
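As an illustrative companion to S404-S407, the sketch below models the per-application XRAppHost bookkeeping in the application management module: a host entry is created when an XR application connects and removed when it exits, releasing its associated resources. The class and member names are hypothetical.

```cpp
#include <iostream>
#include <map>
#include <memory>
#include <string>

struct XrAppHost {                 // one IPC server endpoint per connected XR application
    std::string app_name;
    std::string lifecycle;         // e.g. "Ready", "Focused", "Stopping"
};

class XrApplicationManager {
public:
    // S404: an XRAppHost instance is created when an XR application connects.
    XrAppHost& OnAppConnected(const std::string& name) {
        hosts_[name] = std::make_unique<XrAppHost>(XrAppHost{name, "Ready"});
        return *hosts_[name];
    }
    // S405-7 / S406: the host is destroyed and its resources released on exit.
    void OnAppExit(const std::string& name) { hosts_.erase(name); }
    std::size_t Running() const { return hosts_.size(); }

private:
    std::map<std::string, std::unique_ptr<XrAppHost>> hosts_;
};

int main() {
    XrApplicationManager mgr;
    mgr.OnAppConnected("game_a").lifecycle = "Focused";
    mgr.OnAppConnected("social_b");
    mgr.OnAppExit("game_a");
    std::cout << "running XR apps: " << mgr.Running() << "\n";
}
```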
As shown in fig. 5, the method by which the XR display compositor composites display frames includes:
S501, the XR SDK Provider sandbox process, the XR AppService main process, the XR applications and the 2D embedded applications are already running and have successfully established inter-process communication and completed initialization;
specifically, the 2D embedded application, in this example a PWA application, is run by the PWA runtime library and rendered into a GPU Texture; it does not create a 2D native window, in order to save system resources;
specifically, so that technical developers can understand more quickly, fig. 5 of the present application uses the OpenXR standard to illustrate the technical solution, and the XR applications, the SystemUI module and the XR SDK Provider all use the OpenXR standard;
s502, creating xrSwapChain by a main thread in an XR SDK Provider process;
s503, executing rendering cycle by the main thread in the XR SDK Provider process to finish the display of each frame;
S503-1: XR SDK xrWaitFrame, calling the XR SDK standard runtime module in the XR SDK Provider process to synchronize the rendering cycle of the display device, and notifying the running XR applications of the current rendering cycle state through the XR standard transcoding module in the XR AppService;
S503-2: XR SDK xrBeginFrame, notifying the XR SDK Provider process to start constructing the display frame, and notifying the running XR applications of the current rendering cycle state through the XR standard transcoding module in the XR AppService;
S503-3: XR SDK xrSyncActions, calling the XR SDK standard runtime module to read all input states, and then transferring all of the input states to the input/output manager module in the XR AppService main process through IPC;
specifically, because the full volume of XR input data needs to be transferred every frame, shared-memory IPC is used;
s503-4, the input/output manager module forwards the XR input to the application currently acquiring the focus;
specifically, if a system key is defined, the system key will be forwarded to the SystemUI module, for example, if the user presses a system menu key, focus should be displayed and switched to the SystemUI;
specifically, if the application that currently has focus is a 2D embedded application, the input/output management module converts the XR input into 2D input and forwards it to the 2D embedded application; after receiving the 2D input, the display composition thread in the 2D PWA application in fig. 5 notifies the Web rendering pipeline to perform the corresponding update; when the Web rendering pipeline finishes executing, the new display frame is submitted to the PWA display composition thread through IPC, and finally the PWA display composition thread composites all PWA application display frames into GPU textures, which must be created using the cross-process shared-texture mechanism;
specifically, if the application that currently has focus is an XR application, the XR application can call the input/output manager module across processes through xrSyncActions to obtain XR input; if the XR application does not currently have focus, it cannot obtain XR input through xrSyncActions;
S503-5: XR SDK xrLocateViews, obtaining the pose and parameters of the current HMD in the base space and storing the result in the XR space management module; when an XR application whose running state is Focused or Visible requests them, they are converted through the XR space management module into pose information for that XR application;
S503-6: XR SDK xrLocateSpace, obtaining, for the specified time, the poses and other parameters in the base space of all reference spaces related to the XR applications whose running state is Focused or Visible, and storing the results in the XR space management module; when an XR application whose running state is Focused or Visible requests them, they are converted through the XR space management module into pose information and the like for that XR application;
s503-7, requesting an XR display composition manager to compose a display frame;
s503-8, synthesizing Composition Layers needed by the display frame;
s503-8-1, waiting for each XR application with a Visible or Focused running period to finish the current frame rendering;
Specifically, each XR application submits a rendering frame according to each XR standard, and converts the rendering frame into Composition Layers identifiable by an XR SDK through an XR standard transcoding module;
s503-8-2, taking out the latest 2D embedded application rendering frame from a display frame queue in the PWA display synthesis thread and copying the latest 2D embedded application rendering frame to the corresponding ShareSwapChainImage;
S503-8-3, calculating the Composition Layers of the XR SDK display frame: all XR applications whose running state is Visible are arranged in display order, with the XR application in the Focused state placed last (on top); Composition Layers are then generated from the xrEndFrame information submitted by each application according to the ordered XR application list, merged into a unified Layer array, and submitted to the XR SDK Provider process (see the ordering sketch after this flow);
s503-9: XR SDK xrAcquireSwapChainImage, obtaining SwapChainImage;
specifically, the SwapChainImage acquired by the XR SDK Provider process is used for display, the SwapChainImage acquired by an XR application process is used for cross-process texture sharing, and finally the SwapChainImage in the XR application is copied into the SwapChainImage used for display;
s503-10: XR SDK xrWaitSwapChainImage, confirming that SwapChainImage is currently writable;
S503-11: render xrSwapChainImage, sequentially updating the xrSwapChainImage according to the Layer array returned by the S503-8;
all ShareWapChain images in the Layer information need to be rendered into xrSwapChainimage, and ShareWapChain images are released after rendering is completed;
specifically, all ShareWapChainImage correlations in the Layer information need to be replaced by xrSwapChainImage;
s503-12: XR SDK xrReleaseSwapChainImage, release XR SDK xrSwapChainImage;
s503-13: XR SDK xrEndFrame, submitting the XR SDK display frame.
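The ordering rule of S503-8-3 can be sketched as a simple sort: all Visible applications keep their display order, and the Focused application is composited last so that it appears on top. The fragment below is an illustrative C++ approximation; the types are hypothetical and do not correspond to actual OpenXR structures.

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct AppFrame {
    std::string app;      // application that submitted the rendered frame
    int display_order;    // position assigned by the display management module
    bool focused;         // whether this application currently holds focus
};

// Visible applications keep their display order; the Focused one is moved to the end (on top).
std::vector<AppFrame> BuildLayerList(std::vector<AppFrame> visible_frames) {
    std::stable_sort(visible_frames.begin(), visible_frames.end(),
                     [](const AppFrame& a, const AppFrame& b) {
                         if (a.focused != b.focused) return !a.focused;  // focused sorts last
                         return a.display_order < b.display_order;
                     });
    return visible_frames;  // would be merged into one Layer array and submitted to the XR SDK Provider
}

int main() {
    auto layers = BuildLayerList({{"SystemUI", 0, false}, {"game_a", 1, true}, {"pwa_embed", 2, false}});
    for (const auto& l : layers) std::cout << l.app << "\n";  // SystemUI, pwa_embed, game_a
}
```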
As shown in fig. 6, taking a customized SystemUI application under the OpenXR standard as an embodiment, the method for extending WebGPU as a graphics API usable by the XR standard includes:
s601, when an XR AppService main process initializes an extension manager, an XR_EXT_WebGPU_enable extension is loaded;
s602, initializing a WebGPU Server by an XR SDK Provider sandbox process;
s603, an XR AppService main process is connected to an XR SDK Provider process and WebGPU Server information is obtained;
s604, the SystemUI application executes xrCreateInstance through the XR standard proxy module, configuring the XR_EXT_WebGPU_enable extension in the parameters;
s604-1, the XR standard proxy module parses that xrCreateInstance is configured with the XR_EXT_WebGPU_enable extension; it first requests WebGPU Server information from the XR application management module, then connects to the WebGPU Server using the WebGPU API and DawnWire (IPC), requests the Adapter, Device and the like to perform WebGPU initialization, and records the returned ReservedDevice and ReservedInstance;
s604-2, calling the OpenXR-standard xrCreateInstance of the XR standard transcoding module, and adding an OpenXR parameter of type XR_TYPE_WEBGPU_INSTANCE_CREATE_INFO to pass the WebGPU deviceID and instanceID;
s604-3, after parsing, the XR standard transcoding module calls the XR application management module to create an xrApp instance using the WebGPU deviceID and instanceID; the handle of the xrApp instance is used as the return value according to the OpenXR standard and is returned to the caller of xrCreateInstance in the SystemUI application through the XR standard proxy module;
s605, the SystemUI application executes xrCreateSession through the XR standard proxy module;
s605-1, the XR standard proxy module remotely calls the OpenXR-standard xrCreateSession of the XR standard transcoding module through IPC; after parsing, the XR standard transcoding module calls the XR application management module to create an xrApp session, takes the handle of the xrApp session as the return value according to the OpenXR standard, and returns it to the caller of xrCreateSession in the SystemUI application through the XR standard proxy module;
s606, the SystemUI application executes xrCreateSwapChain through the XR standard proxy module;
s606-1, a ReservedSwapChain is created using DawnWire (IPC), and its swapchainID is taken as the return value and returned to the caller of xrCreateSwapChain in the SystemUI application through the XR standard proxy module;
s607, the SystemUI application executes xrWaitFrame through the XR standard proxy module; the XR standard proxy module remotely calls the OpenXR-standard xrWaitFrame of the XR standard transcoding module through IPC; after parsing, the XR standard transcoding module calls the XR display synthesis module to synchronize the rendering cycles of the SystemUI application and the XR SDK Provider, and the result is returned to the caller of xrWaitFrame in the SystemUI application through the XR standard proxy module;
s608, the SystemUI application executes xrBeginFrame through the XR standard proxy module; the XR standard proxy module remotely calls the OpenXR-standard xrBeginFrame of the XR standard transcoding module through IPC; after parsing, the XR standard transcoding module calls the XR display synthesis module to update the display frame of xrBeginFrame, and the result is returned to the caller of xrBeginFrame in the SystemUI application through the XR standard proxy module;
s609, the SystemUI application executes xrSyncActions through the XR standard proxy module; the XR standard proxy module remotely calls the OpenXR-standard xrSyncActions of the XR standard transcoding module through IPC; after parsing, the XR standard transcoding module calls the XR input/output management module to synchronize the input state of the SystemUI application, and the result is returned to the caller of xrSyncActions in the SystemUI application through the XR standard proxy module;
s610, the SystemUI application executes xrAcquireSwapChainImage through the XR standard proxy module, and executes DeviceCreateTexture through wgpu and DawnWire (IPC);
s611, the SystemUI application performs rendering of the display frame (render frame) through the WebGPU API;
s611-1, all rendering operations are recorded into a WebGPU Command queue, and are rendered to textureView after being executed;
s612, the SystemUI application executes the xrEndFrame through the XR standard proxy module, the XR standard proxy module remotely calls the xrEndFrame of the OpenXR standard of the XR standard transcoding module through the IPC, calls the XR display synthesis module after analysis of the XR standard transcoding module to mark that the display frame update of the xrApp session is completed, and returns the display frame update to a caller of the xrEndFrame in the SystemUI application through the XR standard proxy module;
s613, the main thread of the XR display composition manager (display composition module) submits the final display frame information to the XR SDK Provider process after ensuring that the rendering frames of all xrAppSessions have been updated;
s614, in the XR SDK Provider process, updating all the xrSwapChainImages, updating relevant textureView in the WebGPU Server to the xrSwapChainImages in the XR SDK Provider, and submitting the xrSwapChainImages to the XR SDK Runtime for final display.
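By way of illustration only, the following C++ sketch shows how the XR standard proxy module of s604/s604-1 might handle an xrCreateInstance request that enables WebGPU. XR_EXT_WebGPU_enable and XR_TYPE_WEBGPU_INSTANCE_CREATE_INFO are the names used in this application; the concrete structure layout and every Query*/Reserve*/Transcode* helper below are hypothetical placeholders, not part of the OpenXR or Dawn APIs:

```cpp
// Sketch of the proxy-side handling of xrCreateInstance with XR_EXT_WebGPU_enable (s604-1..s604-3).
#include <cstdint>
#include <string>
#include <vector>

struct XrWebGPUInstanceCreateInfo {      // hypothetical chained structure
    int32_t     type;                    // would be XR_TYPE_WEBGPU_INSTANCE_CREATE_INFO
    const void* next;
    uint64_t    webgpuDeviceId;          // ReservedDevice id obtained over DawnWire
    uint64_t    webgpuInstanceId;        // ReservedInstance id obtained over DawnWire
};

struct WebGPUServerInfo { std::string endpoint; };               // hypothetical
extern WebGPUServerInfo QueryWebGPUServer();                     // hypothetical: XR application management module
extern uint64_t ReserveWebGPUDevice(const WebGPUServerInfo&);    // hypothetical: DawnWire (IPC) reservation
extern uint64_t ReserveWebGPUInstance(const WebGPUServerInfo&);  // hypothetical
extern uint64_t TranscodeCreateInstance(const XrWebGPUInstanceCreateInfo*);  // hypothetical IPC stub to the transcoding module

// Called when the SystemUI application requests xrCreateInstance through the proxy module.
uint64_t ProxyCreateInstanceWithWebGPU(const std::vector<std::string>& enabledExtensions) {
    bool wantsWebGPU = false;
    for (const auto& ext : enabledExtensions)
        if (ext == "XR_EXT_WebGPU_enable") wantsWebGPU = true;

    XrWebGPUInstanceCreateInfo webgpuInfo{};
    if (wantsWebGPU) {
        WebGPUServerInfo server = QueryWebGPUServer();           // s604-1
        webgpuInfo.webgpuDeviceId   = ReserveWebGPUDevice(server);
        webgpuInfo.webgpuInstanceId = ReserveWebGPUInstance(server);
    }
    // s604-2/s604-3: forward to the XR standard transcoding module over IPC, chaining the
    // WebGPU ids so the created xrApp instance records them; the returned handle goes back
    // to the caller of xrCreateInstance in the SystemUI application.
    return TranscodeCreateInstance(wantsWebGPU ? &webgpuInfo : nullptr);
}
```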
As shown in fig. 7, the method for converting the GPU texture format rendered by the XR application into the GPU texture format required by the XR SDK runtime includes:
s701, the XR AppService main process loading module starts the XR SDK Provider sandbox process;
s702, the XR SDK Loader is started in the XR SDK Provider sandbox process to load the XR SDK App;
Specifically, taking the OpenXR standard as an example of the XR SDK, xrEnumerateInstanceExtensionProperties is used to query which graphics APIs the XR device supports; if the XR device supports Vulkan, the XR SDK runtime thread preferentially initializes the rendering loop with Vulkan, because Vulkan currently has the best support across operating systems;
s703, starting an XR application, and calling xrCreateInstance through an XR standard proxy API module;
s703-1, after analysis by the XR standard conversion module, creating a life cycle of the XR application through the XR application management module;
specific: xrAppInstance, xrAppSession is created through an XR application lifecycle module, and xrAppSpace is created through a space management module;
s703-2, creating a rendering cycle in an XRApp process, wherein in the example, the XR application is an OpenXR application developed by a developer by using an OpenGLES graphics API, and initializing an OpenGLES Context;
s704, the XR application calls xrCreateSwapChain through the XR standard proxy API module;
s704-1: the XRApp process remotely calls the XR standard transcoding module through IPC, and after parsing, the XR standard transcoding module creates an xrAppSwapChain through the display synthesis management module;
Specifically, in order to optimize GPU resources, the xrAppSwapChain is created in the XR AppService main process using a cross-process mechanism, so that the XR AppService process, the XRApp process and the XR SDK Provider process can all access it; the xrAppSwapChain is created and released by the XR AppService main process and returned to the XRApp process through the XR standard transcoding module;
s705, the XR application calls xrAcquireSwapChainImage through the XR standard proxy API module;
s705-1, the XR application remotely calls the xrAcquireSwapChainImage of the XR standard transcoding module through IPC; the XR standard transcoding module then creates a SharedTexture through the display synthesis management module and returns an OpenGLES-version GPU texture to the XRApp process;
s706, the XR application finishes rendering onto the SharedTexture;
s707, the XR application executes xrReleaseSwapChainImage;
Specifically, the XR application notifies the XR AppService that writing to the SharedTexture is complete;
s708, the XR application executes an xrEndFrame;
Specifically, the display synthesizer module in the XR AppService main process is notified that the XR application has finished rendering;
Specifically, all pose information in the composition layer information of the XR application and the 2D application is converted, through the space management module and based on the xrAppSpace, into actual pose positions in the XR SDK Provider space;
s709, rendering a display frame by the XR SDK Provider sandbox process;
Specifically, all rendering information of the CompositionLayers is processed, including updating all information of the xrShareSwapChain;
s709-1, all GPU textures related to the xrShareSwapChain are converted into the Vulkan format and rendered onto the xrSwapChainImage of the XR SDK; finally, all xrShareSwapChain-related information in the Layer information is converted into xrSwapChain and xrSwapChainImage information;
s710, the XR SDK Provider process executes xrEndFrame to complete the display of the display frame.
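The following is an illustrative sketch, not a definitive implementation, of the texture hand-off in steps s706 through s709-1: the XR application writes a cross-process shared texture with OpenGL ES, and the XR SDK Provider sandbox imports it into Vulkan before blitting onto the xrSwapChainImage. Every type and helper function below is a hypothetical placeholder for the display synthesizer module described in the text (real implementations would use platform interop such as external-memory extensions):

```cpp
// Sketch of the GLES-to-Vulkan shared-texture flow (s706..s709-1); all symbols are hypothetical.
#include <cstdint>
#include <vector>

struct SharedTexture { uint64_t handle; };   // hypothetical cross-process GPU texture handle
struct VkImageRef    { uint64_t image; };    // hypothetical Vulkan-side view of the same memory

// XR application side (OpenGL ES): s706/s707.
extern void RenderAppFrameGLES(SharedTexture tex);       // hypothetical: the app draws with GLES
extern void ReleaseWriteConnection(SharedTexture tex);    // hypothetical: xrReleaseSwapChainImage path

// XR SDK Provider side (Vulkan): s709/s709-1.
extern VkImageRef ImportSharedTextureAsVulkan(SharedTexture tex);                 // hypothetical external-memory import
extern void BlitToXrSwapchainImage(VkImageRef src, uint32_t swapchainImageIndex); // hypothetical

void ComposeProviderFrame(const std::vector<SharedTexture>& appTextures, uint32_t xrImageIndex) {
    for (SharedTexture tex : appTextures) {
        // Convert each application texture into the format the XR SDK runtime needs
        // (Vulkan in this example, since the runtime initialized its render loop with Vulkan).
        VkImageRef vkImage = ImportSharedTextureAsVulkan(tex);
        BlitToXrSwapchainImage(vkImage, xrImageIndex);
    }
    // After this, the Layer information only references xrSwapChainImage data,
    // and the Provider can submit the frame with xrEndFrame (s710).
}
```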
As shown in fig. 8, the method for the XR standard transcoding module to parse multiple XR standard APIs includes:
s801, an XR standard transcoding module analyzes a method about an input/output state of an XR device;
s801-1, in each rendering cycle of an XR SDK runtime thread in an XR SDK Provider process, all the input and output states of all XR devices are stored and transmitted to an XR standard transcoding module of an XR AppService main process through a shared memory;
s801-2, the XR standard transcoding module converts the XR-standard-format input/output data in the shared memory into an intermediate-format data structure, and the XR input/output manager forwards the input/output state according to its configured priority and the focus and running state of each XR application and 2D application; for example, the input state of a system menu button must be forwarded directly to the application management module for processing;
s801-3, if the 2D application is required to be forwarded, converting the intermediate format data structure into a 2D input event and sending the 2D input event to a PWA rendering engine or a local window application;
s801-4, if the XR application is required to be forwarded, the intermediate format data structure is required to be converted into an XR standard format used by the XR application through an XR standard transcoding module and is transmitted to an XR App process through a shared memory;
s801-5, XR application obtains XR input through an XR standard proxy API module and directly reads the XR input from the shared memory, so that the latest input/output state in each display frame can be obtained;
s802, an XR standard transcoding module analyzes a method related to XR display configuration;
s802-1, the XR-SDK-runtime API is called in the XR SDK Provider process to obtain all display configuration data, and the XR standard conversion module is remotely called through IPC (inter-process communication) to convert all display configuration data of the XR device into the intermediate data structure of the XR display device module;
S802-2, the XR display manager makes display configuration for each XR application;
s802-3, an XRApp process calls an XR standard display configuration API, after analysis is performed by an XR standard transcoding module, display configuration data aiming at the XRApp process is obtained through an XR display module, and the obtained display configuration data is formatted into an XR standard format used by an XR application and returned;
s803, an XR standard transcoding module analyzes the running state of the XR application and the rendering cycle related call;
s803-1, an XR standard agent module of an XR application process remotely calls an XR standard API in an XR standard transcoding module through IPC, and the XR standard transcoding module analyzes the XR standard API into calls of modules such as an application manager, an input/output manager, a display composition manager and the like;
specifically, xrAppInstance, xrAppSession and the like are intermediate data structures generated by calling an application management module;
specifically, xrAppSpace, xrAppPose and the like are intermediate data structures generated by calling the space management module;
specifically, xrAppSwapChain, xrAppSwapChainImage and the like are intermediate data structures generated by calling the display composition manager module;
s803-2, calling an XR SDK Provider process XR standard runtime API;
Specifically, some intermediate data structures, such as xrAppSpace, also need to create xrSpace one-to-one in the XR SDK Provider process to use the space computation capability of the XR SDK, in this case, the intermediate data structure needs to be formatted into the XR standard format in the XR SDK Provider process, and then call the XR standard runtime API of the XR SDK Provider process through the XR standard transcoding module;
s803-3, the XR standard transcoding module formats the return value into the XR standard format used by the XR application.
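As an illustrative, non-limiting example of steps s801-1 through s801-4, the sketch below shows one possible intermediate input format produced by the XR standard transcoding module and the focus-based forwarding decision made by the XR input/output manager; all structure fields and functions are hypothetical placeholders for the modules described above:

```cpp
// Sketch of the intermediate input format and focus-based routing (s801-1..s801-4).
#include <cstdint>

struct IntermediateInputState {            // hypothetical XR-standard-agnostic format
    uint32_t deviceId;
    float    pose[7];                      // position xyz + orientation quaternion
    bool     selectPressed;
    bool     systemMenuPressed;
};

extern bool FocusedAppIsXr();                                            // hypothetical
extern void ForwardToApplicationManager(const IntermediateInputState&);  // hypothetical: system menu handling
extern void WriteToSharedMemoryForXrApp(const IntermediateInputState&);  // hypothetical: s801-4
extern void SendAs2DInputEvent(const IntermediateInputState&);           // hypothetical: s801-3, PWA/native window

void RouteInput(const IntermediateInputState& state) {
    // The system menu button is always handled by the application management module.
    if (state.systemMenuPressed) { ForwardToApplicationManager(state); return; }
    if (FocusedAppIsXr())
        WriteToSharedMemoryForXrApp(state);   // re-formatted to the app's XR standard by the transcoding module
    else
        SendAs2DInputEvent(state);            // converted to a 2D input event
}
```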
As shown in fig. 9, the operation flow of the SystemUI module and the method for providing integrated system-level XR UI controls include:
s901, initializing a SystemUI module and starting a BasicUI;
Specifically, the SystemUI module needs to provide a default-version Basic SystemUI whose default functions include system settings, starting an application, ending an application, displaying an application, and the like; the Basic SystemUI module runs in a UI thread of the XR AppService process;
s902, registering a SystemUI module as a system application;
s903, registering system actions through an XR input module, and ensuring that a user can open a System UI through the system actions, wherein the system actions can be system keys, or can be actions such as voice, gestures and the like;
s904, the XR equipment developer registers the custom SystemUI application through an XR application management module;
Specifically, the device manufacturer can also customize the SystemUI; it runs as an independent process and can be either a 2D application or an XR application; if it is an XR application, it can be developed using WebGPU or the OpenXR standard, and it remotely calls functions such as the XR application management module through IPC;
s905, preloading a custom SystemUI application;
s906, responding to system actions;
Specifically, the system Action format is a JSON string with a limited maximum length, and can be an XR-input-type Action or an Action such as opening a Popup;
s906-1, the system Action is first forwarded to the BasicUI module for processing; if it is not processed there, it is forwarded to the custom SystemUI module for further processing; if the custom SystemUI does not process it either, a prompt is displayed indicating that the system Action is not supported;
specifically, the SystemUI module provides a mechanism for customizing the actions of the SystemUI processing system;
s906-2, the custom SystemUI application can call all the functions provided by XR AppService, such as application management, extension management, input/output management, embed management and display management, and can also call functions provided by the operating system to develop custom UIs such as Launcher, CrashUI and Settings;
Specifically, the custom SystemUI application can determine, through an interface provided by the operating system, whether the current system is in a LowMemory/LowGPUMemory state, and then perform corresponding system optimization operations through the XR application management module;
Specifically, the custom SystemUI application can collect frame rate statistics through the display synthesizer management module, so that frame rate drops, application ANR and the like can be detected and corresponding system optimization operations performed;
specifically, the custom System UI application can process system actions such as Crash.
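By way of illustration, a minimal sketch of the system Action dispatch chain of s906/s906-1 follows; the maximum length bound and all handler functions are hypothetical placeholders (the text only states that the Action is a bounded-length JSON string handled first by the BasicUI and then by the custom SystemUI):

```cpp
// Sketch of system Action dispatch (s906/s906-1); all functions and the bound are hypothetical.
#include <cstddef>
#include <string>

constexpr std::size_t kMaxActionLength = 1024;               // assumed bound, not specified in the text

extern bool BasicUiHandleAction(const std::string& json);    // hypothetical: default BasicUI (Launcher/Popup/Notify/CrashUI)
extern bool CustomUiHandleAction(const std::string& json);   // hypothetical: vendor SystemUI, called over IPC
extern void ShowUnsupportedActionPrompt();                    // hypothetical

void DispatchSystemAction(const std::string& actionJson) {
    if (actionJson.size() > kMaxActionLength) return;         // enforce the limited maximum length
    if (BasicUiHandleAction(actionJson)) return;              // handled by the default BasicUI
    if (CustomUiHandleAction(actionJson)) return;             // handled by the device vendor's SystemUI
    ShowUnsupportedActionPrompt();                            // neither handled it
}
```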
As shown in fig. 10, the method of embedding external 2D application display content into an XR application display composition includes:
In the runtime system provided by the present application, a 2D application essentially does not need to create a native operating-system 2D window; it only needs to render and update its display content to GPU textures, which are then composed by the display synthesis manager according to the position of the 2D application in XR space and submitted to the XR SDK Provider process for display;
The method for projecting a 2D application into XR space can use Virtual Display on Android/Linux, while the Virtual Screen API is used on Windows; the names differ slightly but the functions are nearly the same, both providing a virtual display capability;
The virtual display function provided by the operating system has performance drawbacks, so in the embodiment of the present application the 2D application is developed using PWA; the browser engine can render the PWA content directly onto a GPU texture in off-screen mode, which performs better than the scheme of creating a native window, and the display composition manager module composes the GPU texture into the XR display after each update;
s1001, starting a PWA 2D application;
s1001-1, initializing the resolution of 2D application display, wherein the default can be according to the resolution of a common display in reality, such as 1920x1080;
s1001-2, running a 2D application life cycle;
s1001-3, creating Web rendering PipeLine;
s1001-4, generating a display frame;
s1002, the 2D application applies for a SharedTexture through the Embed manager;
Specifically, the 2D PWA application needs to register with the embed management module in the XR application process, and the embed management module applies for a GPU texture via SharedTexture;
s1003, the XR application embeds the PWA through the Embed manager;
Specifically, the XR application recognizes through the Embed extension that the life cycle of the 2D application is in the Ready state, and can place it in XR space according to the application's requirements;
s1004, after the XR application acquires the XR input, the XR input can be forwarded to the 2D application for processing by using a processing input interface of the Embed management module;
Specifically, when the input such as the XR ray points to the 2D application, the life cycle of the 2D application can be switched to a Focused state, and the XR input is required to be converted into the 2D input through the input-output manager and then forwarded to the Native application or the PWA application;
specifically, taking a 2D PWA application as an example, after receiving a 2D input, a display synthesis thread in the PWA application notifies a Web rendering pipeline to perform corresponding update, when the Web rendering pipeline finishes executing, a new display frame is submitted to the PWA display synthesis thread through IPC, and finally the PWA display synthesis thread synthesizes all the PWA application display frames to a GPU texture, wherein the GPU texture must be created by using a cross-process shared texture mechanism;
s1005, XR AppService synthesizes PWA display frames;
Specifically, when the XR application that embeds the 2D application renders a display frame, it can call getSurfaceTexture of the Embed management module to acquire the display content of the 2D application for secondary processing, or it can submit the display content directly to the display composition manager;
s1005-1, calculating rendering contents of the XR application;
specifically, the XR application needs to render in real time, the 2D application does not need to render in real time, and each frame directly takes the latest completed display frame from the display frame queue of the 2D application;
S1005-2, calculating the rendering content of the SystemUI module;
s1005-3, sequentially synthesizing the XR SDK display frames; the detailed process can refer to fig. 5, step S503;
s1006, submitting to the XR SDK Provider for display;
specifically, the display composition manager performs display composition on the 2D application and the XR application, forms a final display frame, and submits the final display frame to the XR SDK Provider process to complete final display.
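As a non-limiting sketch of steps s1002 through s1005-1, the example below shows how the Embed manager might hand the latest completed 2D PWA display frame to the XR application's composition pass; all types and functions are hypothetical placeholders for the embed management and display composition modules described above:

```cpp
// Sketch of embedding a 2D PWA frame into the XR composition (s1002..s1005-1); all symbols are hypothetical.
#include <cstdint>
#include <optional>

struct GpuTexture { uint64_t sharedHandle; };              // hypothetical cross-process shared texture
struct Pose       { float position[3]; float orientation[4]; };

// 2D PWA side: the PWA display composition thread pushes finished frames into a queue.
extern std::optional<GpuTexture> LatestCompleted2DFrame(uint32_t embedId);   // hypothetical (s1005-1)

// XR application side: place the 2D surface in XR space and forward input to it.
extern void Submit2DLayer(GpuTexture tex, const Pose& poseInAppSpace);       // hypothetical
extern void ForwardInputTo2DApp(uint32_t embedId, float hitU, float hitV);   // hypothetical (s1004)

void ComposeEmbedded2DApp(uint32_t embedId, const Pose& placement) {
    // The 2D application does not render per XR frame; only its newest completed frame is picked up.
    if (auto frame = LatestCompleted2DFrame(embedId))
        Submit2DLayer(*frame, placement);   // composed by the display composition manager (s1005-3)
}
```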
Specifically, as shown in fig. 11, the method for origin calculation during XR application running, and for the various types of spatial pose conversion between the XR application and the XR SDK Provider process, includes:
s1101, xrAppSpace update in each life cycle state in XR application;
s1101-1, recording an origin point when the XR application is started;
Specifically, taking the OpenXR standard as an example, when an XR application is started, xrLocateViews is called to acquire the pose information of the current Viewer from the XR SDK Provider sandbox process, and this pose information is used to create an xrReferenceSpace as the origin information of the XR application, namely the xrAppSpace;
s1101-2, when the XR application is switched between foreground and background, a VR-type XR application needs to keep the Viewer at its original position after the switch;
Specifically, taking the OpenXR standard as an example, when the XR application is switched between foreground and background, the pose information of the Viewer at that moment is recorded, the pose difference caused by the switch is calculated, and the xrReferenceSpace is updated with this pose difference as the origin information of the XR application, namely the xrAppSpace;
s1102, when the XR application performs pose calculations, conversion through the space management module is needed;
Specifically: various types of spatial pose computation in the XR application, including XR input, display synthesis and the like, require conversion relative to the xrAppSpace;
Specifically: taking the OpenXR standard as an example, calls by the XR application to xrLocateViews and xrLocateSpace need to be converted; the space referenced against the origin recorded in the xrAppSpace is converted into an xrSpace in the XR SDK Provider space, the converted xrSpace is then used to call the XR standard runtime API of the XR SDK, and the resulting pose information is converted back by the space management module into pose information with the xrAppSpace as the origin and returned to the XR application caller;
s1103, when the display synthesizer module synthesizes the CompositionLayers, the pose and position information related to the xrAppSpace is updated.
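The following is an illustrative sketch of steps s1101-1 and s1102 using the public OpenXR API: an application-origin space is created from the Viewer pose at startup, and later poses are located relative to it. Error handling and the bookkeeping of the space management module are assumed and omitted; this is not presented as the definitive implementation of the method:

```cpp
// Sketch of xrAppSpace creation and app-relative pose lookup (s1101-1/s1102); assumes OpenXR.
#include <openxr/openxr.h>

// s1101-1: record the XR application's origin when it starts.
XrSpace CreateAppOriginSpace(XrSession session, const XrPosef& viewerPoseAtLaunch) {
    XrReferenceSpaceCreateInfo info{XR_TYPE_REFERENCE_SPACE_CREATE_INFO};
    info.referenceSpaceType   = XR_REFERENCE_SPACE_TYPE_LOCAL;  // base space of the Provider
    info.poseInReferenceSpace = viewerPoseAtLaunch;             // offset = Viewer pose at application start
    XrSpace appSpace = XR_NULL_HANDLE;
    xrCreateReferenceSpace(session, &info, &appSpace);
    return appSpace;   // used as the xrAppSpace origin for this application
}

// s1102: poses requested by the XR application are computed against its own origin.
XrPosef LocateInAppSpace(XrSpace target, XrSpace appSpace, XrTime time) {
    XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
    xrLocateSpace(target, appSpace, time, &location);
    // The space management module would check location.locationFlags before use;
    // here the pose is simply returned expressed with xrAppSpace as the origin.
    return location.pose;
}
```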
It is to be understood that the above examples of the present invention are provided by way of illustration only and not by way of limitation of the embodiments of the present invention. Other variations or modifications of the above teachings will be apparent to those of ordinary skill in the art. It is neither necessary nor possible to exhaustively list all embodiments here. Any obvious variations or modifications that come within the spirit of the invention remain within the scope of the invention.
In a third aspect of the present application, an embodiment of the present application further provides an apparatus, including: a processor, a memory, and a communication unit;
the memory stores machine readable instructions executable by the processor, the processor and the memory in communication via the communication unit when the device is operating;
wherein the processor executes the machine readable instructions to perform the method of the above aspects.
The memory may be used to store the processor's execution instructions and may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk. When the instructions in the memory are executed by the processor, the apparatus is caused to perform some or all of the steps in the method embodiments described above.
The processor is the control center of the device; it connects the various parts of the entire electronic terminal using various interfaces and lines, and executes the various functions of the electronic terminal and/or processes data by running or executing software programs and/or modules stored in the memory and invoking data stored in the memory. The processor may consist of an integrated circuit (IC), for example a single packaged IC, or of a plurality of packaged ICs with the same or different functions connected together. For example, the processor may include only a central processing unit (CPU). In the embodiment of the present application, the CPU may have a single computing core or may comprise multiple computing cores.
The communication unit is used for establishing a communication channel so that the device can communicate with other terminals, receiving user data sent by other terminals or sending user data to other terminals.
In a fourth aspect, embodiments of the present application further provide a computer readable storage medium, where a computer program is stored, where the computer program is executed by a processor to perform the method according to the above aspects.
The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random-access memory (random access memory, RAM), or the like.
Claims (14)
1. The runtime system supports multi-process mixed operation of multiple kinds of extended reality (XR) technical specification application programs and 2D application programs, and is characterized in that the runtime system is deployed in an XR device, an XR SDK Provider sandbox program and an extended dynamic library which match the latest XR SDK capability of an XR device manufacturer can be deployed on a remote server or local device, the runtime system loads the XR SDK Provider sandbox program and the extended dynamic library deployed on a remote server or local device to the local device for operation through an XR application service loading module, meanwhile, the XR application management module of the runtime system provides the capability of multi-process operation and management of different XR standard applications and 2D application programs, and the application of each XR standard can realize the capability of extending XR standard through an XR extended interface provided by the runtime system, and the runtime system comprises:
The XR standard run-time API proxy module is used for adapting to various XR standard run-time API requirements and forwarding the XR standard API calls to an XR standard transcoding module of an XR AppService main process, and specific XR standard API requirements can be achieved by calling interfaces such as an XR application management module, an XR input/output management module, an XR display management module and the like through the XR standard transcoding module, or by completing a custom XR standard API interface through rewriting the XR standard API interface, or by directly calling system native capacity;
the XR standard transcoding module specifically comprises XR standard analysis and XR standard API and parameter construction thereof, wherein the XR standard analysis is used for analyzing various XR standards into internal data structures and function calls used by each module of the runtime system, and the XR standard API and parameter construction thereof is used for formatting the data structures used by each module of the runtime system into API remote calls of various XR standards and parameter formats required by the API remote calls;
the XR AppService loading module is used for providing the function of dynamically updating the XR AppService running environment on the equipment, particularly dynamically updating and loading the XR SDK Provider sandbox module and the extension customized by equipment manufacturers, the remote server judges whether the current running system needs to be updated by identifying key information reported by the XR equipment, if so, a user is guided to update the running system, after the updating is successful, the running environment module for initializing the XR AppService is executed, and an XR SDK Provider sandbox process is created;
The XR SDK Provider sandbox module specifically comprises an XR SDK Loader and an XR SDK App, wherein the XR SDK Loader is used for loading and initializing an XR SDK running environment on equipment, the XR SDK App is used for providing XR functions such as input and output, display configuration and space calculation of the XR equipment for an XR AppService process, and the like, and in a rendering cycle, display frame data generated by the XR display synthesizer module are converted into a format when the XR SDK runs and submitted and packaged into a sandbox program, so that safety isolation of data, performance, stability and the like can be provided for running the XR application;
the XR input/output management module is used for managing the input/output functions of the XR equipment and providing expansion API interfaces for a developer, such as opening, closing, installing and unloading the XR input/output equipment, filtering the XR input/output, forwarding the XR input/output, expanding the XR input/output equipment and the like;
the XR display management module is used for managing all XR application rendering results to synthesize display output, providing an extended API interface with Overlay, capture, refresh rate and frame rate control, display output configuration and other functions, wherein the Overlay function is used for overlaying specified display content on the XR application rendering content, and the Capture function is used for capturing display synthesized frame data of each rendering cycle;
The XR space management module is used for dynamically managing the origin calculation in the operation of all XR applications according to the operation period of the XR applications, and calculating various types of space gesture conversion between the XR applications and the XR SDK Provider processes;
the XR embedded management module is used for managing display contents of all external 2D applications and providing a display synthesis method for embedding the display contents of the external 2D applications into an XR application space, and specifically, the XR embedded management module is required to display the 2D applications in the XR application space and can format received XR input into input events which can be identified by the 2D applications to forward the input events to the 2D application for processing;
the XR application management module is used for managing the running states of system function applications such as an XR application, a 2D embedded application, a SystemUI and the like, wherein the life cycle of the XR application on equipment is as follows: ready, synchronized, visible, focused, stopping, etc., wherein the lifecycle of a 2D application on a device is: ready, visible, focused, unFocused, inVisible, stopping, and the like, and provides an extension API interface for opening, closing, displaying, hiding, querying the information of the currently running application, and the like;
the XR display synthesizer module is used for synchronizing rendering cycles of an XR application and an XR SDK Provider process, aggregating synthesis results of display frames of a plurality of XR applications, embedded 2D applications, system applications and the like on equipment, and finally submitting the synthesis results to the XR SDK Provider sandbox process to finish display;
The XR extension management module is used for managing the extended running states developed by all equipment manufacturers and application developers, and specifically comprises an extended capability running state management thread and an extended capability implementation thread, wherein each extended XR implementation creates an extended thread in an XR application process, and each extended thread is managed in the extended capability management module by using the following states: available, starting, working, stopping, unavailable;
the SystemUI module is used for providing a system-level UI function and a custom SystemUI interface, and specifically comprises a basic UI module and a custom SystemUI extension API interface, wherein the basic UI module comprises basic functions such as Launcher, Popup, Notify and CrashUI and is recommended to be realized by equipment manufacturers by integrating a lightweight WebGPU-based UI library, and wherein the SystemUI extension API interface is used for allowing a developer to develop a custom SystemUI XR application.
2. A data interaction method of a runtime system supporting multi-process mixed operation of multiple kinds of extended reality (XR) technical specification application programs and 2D application programs, wherein the runtime system is arranged in a device, and a manifest file, an XR SDK Provider service program file and an extended dynamic library can be deployed on a remote server or on the device, and the method comprises the following steps:
The XR application on the equipment calls an API of the runtime system to process a starting request of the XR application, if the XR application service main process is not currently running on the equipment, the runtime system creates and runs the XR application service main process, establishes an inter-process communication mechanism with the XR application service main process, and then initializes an XR application manager, an XR display synthesizer, an XR expansion manager, an XR Input manager, a display manager and the like in the XR application service main process;
after module initialization is completed, the XR loading module prepares an operating environment for the XR application by using the manifest file, the XR SDK Provider service process file and the extended dynamic library deployed locally on the device or on a remote server;
after the XR application operation environment is prepared, an XR SDK Provider sandbox process is created and operated, the runtime system supports simultaneous operation of a plurality of XR applications, and an XR application manager manages the operation states of all XR applications and XR SDK Provider sandboxes through an inter-process communication mechanism;
the XR application service host process uses an XR application lifecycle to manage the running state of the XR application;
the XR application service main process uses an XR display synthesizer to synthesize and calculate display frames for rendering contents and a SystemUI module of all XR applications;
The runtime system provides a method for realizing extension of a WebGPU as a graphic API available by an XR standard based on an XR standard runtime provided by an XR equipment manufacturer;
the runtime system provides a method for implementing extended functionality for XR applications;
the XR SDK Provider sandbox process provides a method for converting a GPU texture format rendered by an XR application into a GPU texture format required by XR SDK operation;
the XR application service main process XR standard transcoding module provides a method for analyzing various XR standard APIs;
the XR application service main process provides a method for integrating system-level XR UI controls;
the XR application service host process provides a method of embedding external 2D application display content into an XR application;
the XR application service host process provides methods for origin computation in XR application operation and various types of spatial gesture conversion in XR application.
3. The method for data interaction of a runtime system supporting multi-process hybrid operation of multiple extended reality (XR) technical specification applications and 2D applications according to claim 2, wherein the specific method by which the runtime system XR loading module parses and loads the manifest file and the extensions described by the manifest file to prepare the XR application operating environment comprises:
Starting an XR AppService loading module according to a designated address, wherein the XR AppService loading module requests the latest manifest version to a remote server according to key information such as a system version number, an operating system version/name, an equipment ID, hardware parameters and the like when running, if the manifest file needs to be updated, the file needs to be downloaded to the equipment according to a returned manifest file remote address, and in addition, equipment manufacturers need to provide a manifest file of a basic version and related files defined in the manifest file on the equipment in factory setting of the equipment, so that the equipment can be ensured not to be connected with the network and the XR application can be normally operated;
analyzing a manifest file, and firstly checking whether the manifest file is compatible with the current equipment;
analyzing an xr_ SDK _provider field in the manifest file, and downloading and installing an XR SDK Provider sandbox program if updating is required;
analyzing extensions fields in the manifest file, comparing with a locally extended version, and downloading an extended dynamic library if updating is needed;
starting an XR SDK Provider sandbox process appointed in the manifest file, and if the starting fails, considering that the XR AppService running environment fails to execute;
starting an extended implementation thread or service specified in the manifest file, and if the starting fails, considering that the XR AppService running environment fails to execute;
If the XR application running environment is successful in running, saving the successful manifest file and related files to the local, and recording the file as the last successful XR application running environment;
if the XR application running environment fails to run, the method moves back to the XR AppService running environment which can be successful last time.
4. The method for data interaction in a runtime system supporting multiple augmented reality (XR) specification applications and 2D application multi-process hybrid operations of claim 2, wherein the specific method for creating and executing an XR application lifecycle in the runtime system comprises:
in an XR application service main process, an application life cycle is created for each XR application to manage the running and display states of the XR application;
ready state, which indicates that rendering related resources have been prepared for XR applications, ensuring that input/output manager, display manager, composition frame buffer manager, etc. have been initialized successfully;
a Synchronized state, indicating that the frame buffer circulator is ready for the XR application;
a Visible state indicating that the display composition related resource is Ready for the XR application, ensuring that the XR display synthesizer and the like have Ready;
a Focused state, which indicates that the XR application is ready to receive XR input, an XR application service main process obtains an XR input event from an XR SDK Provider sandbox process, and sends the XR input event to the XR application where the current focus is located through an XR input/output manager, and the XR application calculates a rendering result according to the XR input update and submits synthetic layer information of a synthetic frame, and the like;
And (3) a stop state, namely destroying the XR application and releasing all relevant resources.
5. The method for data interaction in a runtime system supporting multiple mixed operation of an extended reality (XR) specification application and a 2D application, as set forth in claim 2, wherein the method for synthesizing the display frame by the XR display synthesizer module in the runtime system comprises:
in the XR display synthesizer, according to all XR application running states, the rasterization results of all XR application display frames, current Popup in the SystemUI, overlay in XR display management, XREmbed of 2D application content and other rendering frames are synthesized into a final display frame according to a specific sequence, and the final display frame is submitted to an XR SDK Provider sandbox process to finish final display.
6. The method of claim 2, wherein the runtime system provides an XR standard runtime API based on XR equipment vendor, and wherein the specific method of expanding WebGPU as a graphics API usable by XR standards comprises:
in the XR SDK Provider process, webGPU native server is integrated to synthesize an XR application rendering frame into an XR SDK display frame, and a WebGPU Client is integrated in an XR standard proxy module in the XR application process, and the support of the graphics API is expanded through the XR standard proxy module to increase the WebGPU.
7. The method for data interaction in a runtime system supporting multiple mixed operation of an extended reality (XR) technical specification application and a 2D application, as described in claim 2, wherein the specific method for converting a GPU texture format rendered by the XR application into a GPU texture format required by the XR SDK runtime, comprises:
a cross-process texture sharing mechanism is used among the XR application, the XR AppService process and the XR SDK Provider sandbox process; shared textures are managed by the display synthesizer module of the XR AppService process; the XR application applies for write access during rendering and releases the shared-texture write connection after writing is completed; the display synthesizer module completes the synthesis of the display frame and submits it to the XR SDK Provider process, which then applies for read access; during reading, the textures are converted through the graphics API into the format required by the XR SDK runtime, the read textures are rendered onto the xrSwapChainImage of the XR SDK, and the shared-texture read connection is released after completion.
8. The method for data interaction in a runtime system supporting multiple extended reality (XR) specification applications and 2D application multi-process hybrid operations of claim 2, wherein the specific method for implementing extended functions for the XR application by the runtime system comprises:
The application provides an extended development SDK header file, an application developer can compile and generate an extended dynamic library by using the SDK provided by the application, and the extension can remotely call the APIs of modules such as XR application management, XR input/output management, XR display, XR standard running API proxy and the like by IPC, and can also integrate other third-party libraries to complete the realization of the extension function.
9. The method for data interaction in a runtime system supporting multiple mixed operation of multiple extended reality (XR) technical specification applications and 2D applications according to claim 2, wherein the specific method for resolving multiple XR standard APIs provided by the XR standard transcoding module of the runtime system comprises:
the XR SDK Provider process calls the XR-standard runtime API provided by the XR SDK, and the returned data and other information are converted through the XR standard transcoding module into internal data structures recognizable by the XR input/output, XR display and XR display synthesizer modules;
the XR App process calls an XR standard API appointed by the XR application, converts the XR standard API into calls for modules such as XR input and output, XR display and XR display synthesizer through an XR standard transcoding module, formats an internal data structure returned by the calls into an XR standard data format appointed by the XR application through the XR standard transcoding module, and returns the XR standard data format to a caller in the XR application;
Because of large data volume, the XR input and output needs to use inter-process shared memory for data transmission, and the input and output states of all XR devices are all saved in each rendering cycle of the XR SDK runtime thread and are transmitted to an XR standard transcoding module of an XR AppService main process through the shared memory.
10. The method for data interaction in a runtime system supporting multiple mixed operation of an extended reality (XR) specification application and a 2D application according to claim 2, wherein the specific method for providing integrated system level XR UI controls in the runtime system comprises:
the system level UI controls include SystemUI, popup, notification, etc., and the display composition sequence of the system level UI controls is above all XR applications, and the display composition device of claim 5 performs the composition of display frames according to a specific sequence: rendering content of all XR applications, a system level UI control and an XR Overlay;
when the system level UI control is displayed, the system level UI control can acquire all XR input events, if the XR input events are not processed by the UI control, the system level UI control continues to forward to the XR application currently acquired in focus, and otherwise, the system level UI control does not continue to forward.
11. The method for data interaction in a runtime system supporting multiple augmented reality (XR) specification applications and 2D application multi-process hybrid operations of claim 2, wherein the specific method for embedding external 2D application display content in the XR application is provided in the runtime system, comprising:
a display synthesis method opens the external 2D application in a virtual space or off-screen mode and intercepts the display content of the external 2D application; when the display content is updated, it is converted into rendering data usable by the XR standard, which the XR application developer can obtain by calling the extension interface provided by the present application, and the XR-standard API is then used to place the 2D application into 3D space and provide functions such as dragging and zooming;
a spatial display computing method for receiving XR input and converting it into input events recognizable by a 2D application, thereby implementing the function of operating the 2D application in 3D space.
12. The method for data interaction in a runtime system supporting multiple augmented reality (XR) specification applications and 2D application multi-process hybrid operations of claim 2, wherein the specific method for providing origin computation in XR application operations and various types of spatial gesture transformations in XR applications and XR SDK Provider sandboxes comprises:
calculating an XR application space origin, and dynamically updating the XR application space origin according to the XR application running period change;
the XR application and the XR SDK Provider process are different in coordinate system and space origin, so that calculation of space position conversion is needed before data such as gestures of different spaces or coordinate systems are used, and then the calculation is transmitted to the XR application or the XR SDK Provider process through an XR standard conversion module.
13. An apparatus, comprising: a processor, a memory, and a communication unit;
the memory stores machine readable instructions executable by the processor, the processor and the memory in communication via the communication unit when the device is operating;
wherein the processor executes the machine readable instructions to perform the method of any of claims 3 to 12.
14. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the method of any of claims 3-12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310946710.2A CN116954824A (en) | 2023-07-28 | 2023-07-28 | Runtime system supporting multi-process mixed operation of multiple extended reality (XR) technical specification application programs and 2D application programs, data interaction method, device and medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116954824A (en) | 2023-10-27 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117596377A (en) * | 2024-01-18 | 2024-02-23 | 腾讯科技(深圳)有限公司 | Picture push method, device, electronic equipment, storage medium and program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |