CN115857698A - Mixed reality interactive display device and method - Google Patents
- Publication number
- CN115857698A (application CN202211679146.4A)
- Authority
- CN
- China
- Prior art keywords
- display
- internal
- module
- external
- display module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention belongs to the technical field of mixed reality and discloses a mixed reality interactive display device and method. The device comprises a display module and an external touch module; the display module comprises an internal display module and an external display module arranged back to back, with the external display module connected to the external touch module. The external display module acquires the display picture of the internal display module and presents it to the interactive user so that the interactive user can determine intention information; the external touch module acquires the interactive user's intention information and updates it onto the picture displayed by the external display module; the internal display module then acquires the picture carrying the intention information and presents it to the wearing user, who adjusts the executed action according to the intention information. The wearer can thus interact and exchange information with the outside world more directly, greatly improving the interaction effect and communication efficiency.
Description
Technical Field
The invention relates to the technical field of mixed reality, in particular to a mixed reality interactive display device and method.
Background
Mixed Reality (MR) technology is a further development of virtual reality. By introducing real-scene information into the virtual environment, it builds an interactive feedback loop among the virtual world, the real world, and the user, enhancing the user's sense of reality. Realizing MR requires an environment in which virtual and real-world objects can interact; the key points are interaction with the real world and timely acquisition of information. An existing MR headset does not let the wearer directly see the external environment, and an external user cannot perceive the wearer's viewing angle. Although the headset's output can be mirrored to a television, the external user then only sees the wearer's field of view and can respond only by describing what they see in spoken or written language; they cannot interact with the wearer more intuitively through touch or direct information transfer. This greatly weakens the MR interaction effect, and in a working scenario it also hinders problem communication and reduces working efficiency.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide a mixed reality interactive display device and method, so as to solve the technical problem in the prior art that interaction between an MR user and the outside is weak: an external user, even when shown the wearer's viewing angle, can feed information back to the wearer only through verbal description, which makes communication inefficient.
In order to achieve the above object, the present invention provides a mixed reality interactive display device, which includes a display module and an external touch module, wherein the display module includes an internal display module and an external display module arranged back to back, and the external display module is connected to the external touch module;
the external display module is used for acquiring the display picture of the internal display module and displaying the display picture of the internal display module to an interactive user so as to enable the interactive user to determine intention information;
the external touch module is used for acquiring intention information of the interactive user and updating the intention information on a display picture displayed by the external display module;
the internal display module is used for acquiring the display picture with the intention information and displaying it to a wearing user, so that the wearing user adjusts the executed action according to the intention information.
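The module roles above can be sketched as a minimal data-flow model. All class and method names here are illustrative assumptions; the patent describes behavior, not an API:

```python
# Minimal sketch of the claimed data flow between the modules.
# Names are illustrative, not taken from the patent.

class DisplayDevice:
    """Back-to-back internal/external displays sharing one picture."""

    def __init__(self):
        self.frame = None          # picture shown to the wearing user
        self.annotations = []      # intention info from the interactive user

    def show_internal(self, frame):
        # Internal display module: present the picture to the wearer.
        self.frame = frame

    def mirror_external(self):
        # External display module: show the wearer's picture outward.
        return self.frame

    def touch_annotate(self, intention):
        # External touch module: add intention info (mark/indication/drawing).
        self.annotations.append(intention)

    def annotated_frame(self):
        # Internal display module: picture plus intention info for the wearer.
        return (self.frame, tuple(self.annotations))

    def clear_annotations(self):
        # Internal control module: clear intention info after the wearer responds.
        self.annotations.clear()
```

One round of interaction then reads: the wearer's picture is mirrored outward, annotated by touch, and shown back to the wearer with the intention information attached.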
Optionally, the internal display module includes a lens and an internal display screen; the display surface of the internal display screen faces the back surface of the lens, with a preset distance between them;
the internal display screen is used for providing the display picture;
the lens is used for displaying the display picture of the inner display screen to the wearing user.
Optionally, the external display module includes an external display screen, and the external display screen is connected to the external touch module;
the external display screen is used for displaying a display picture of the internal display screen to the interactive user;
the external display screen is further used for displaying the intention information to the interactive user.
Optionally, the mixed reality interactive display device further comprises an internal control module, and the internal control module is connected with the internal display screen and the external display screen respectively;
and the internal control module is used for clearing the intention information and adjusting the display picture of the internal display screen according to the execution action.
Optionally, the internal control module includes a control unit and a backlight circuit connected to each other, the control unit is connected to the internal display screen and the external display screen, and the backlight circuit includes a first backlight circuit and a second backlight circuit;
the first backlight circuit is used for driving the inner display screen;
the second backlight circuit is used for driving the external display screen;
and the control unit is used for adjusting the display pictures of the internal display screen and the external display screen.
In addition, in order to achieve the above object, the present invention further provides a mixed reality interactive display method, including the following steps:
the external display module acquires a display picture of the internal display module and displays the display picture of the internal display module to an interactive user so that the interactive user can determine intention information;
the external touch module acquires intention information of the interactive user and updates the intention information on a display picture displayed by the external display module;
and the internal display module acquires the display picture with the intention information and displays it to a wearing user, so that the wearing user adjusts the executed action according to the intention information.
Optionally, after the internal display module acquires the display picture carrying the intention information and displays it to the wearing user so that the wearing user adjusts the executed action according to the intention information, the method further includes:
and the internal control module clears the intention information and adjusts the display picture of the internal display module according to the execution action.
Optionally, the intention information includes mark information, indication information, and drawing information, and the step in which the external display module acquires the display picture of the internal display module and displays it to the interactive user so that the interactive user determines intention information includes:
the external display module acquires a display picture of the internal display module and displays the display picture of the internal display module to an interactive user, so that the interactive user determines an interactive intention according to the display picture and generates intention information in the external touch module according to the interactive intention.
Optionally, the step in which the internal display module acquires the display picture carrying the intention information and displays it to the wearing user so that the wearing user adjusts the executed action according to the intention information includes:
the internal display module acquires the display picture with the intention information displayed by the external display module, and displays the display picture with the intention information to a wearing user, so that the wearing user determines the interaction intention of the interaction user according to the intention information, and adjusts the execution action according to the interaction intention.
Optionally, before the external display module obtains the display screen of the internal display module and displays the display screen of the internal display module to the interactive user, the method further includes:
the internal control module drives the internal display module and the external display module;
the intra-pair display module presents a display screen to the wearer.
The mixed reality interactive display device of the invention comprises a display module and an external touch module. The display module comprises an internal display module and an external display module arranged back to back, and the external display module is connected to the external touch module. The external display module acquires the display picture of the internal display module and presents it to the interactive user so that the interactive user can determine intention information; the external touch module acquires the interactive user's intention information and updates it onto the picture displayed by the external display module; and the internal display module acquires the picture carrying the intention information and presents it to the wearing user, who adjusts the executed action accordingly. Compared with simply mirroring the wearer's field of view to a television, where an external participant can only feed information back to the wearer through verbal description, an extremely inefficient mode of communication, the external participant here can interact with the wearer more intuitively through touch and direct information transfer. This greatly improves the interaction efficiency between the real space and the virtual space, improves the user experience, and enhances MR interaction performance.
Drawings
FIG. 1 is a schematic structural diagram of a mixed reality interactive display device according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a detailed structure of an embodiment of a mixed reality interactive display device according to the invention;
FIG. 3 is a schematic structural diagram of a mixed reality interactive display device according to a second embodiment of the present invention;
FIG. 4 is a schematic overall structure diagram of a mixed reality interactive display device according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a mixed reality interactive display method according to a first embodiment of the present invention;
fig. 6 is a schematic overall flowchart of a mixed reality interactive display method according to an embodiment of the present invention.
The reference numbers indicate:
| Reference numeral | Name | Reference numeral | Name |
| --- | --- | --- | --- |
| 10 | Display module | 302 | Backlight circuit |
| 20 | External touch module | 1011 | Lens |
| 30 | Internal control module | 1012 | Internal display screen |
| 101 | Internal display module | 1021 | External display screen |
| 102 | External display module | 3021 | First backlight circuit |
| 301 | Control unit | 3022 | Second backlight circuit |
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that all directional indicators (such as up, down, left, right, front, back, etc.) in the embodiments of the present invention are only used to explain the relative positional relationship between the components, the motion situation, etc. in a specific posture (as shown in the attached drawings); if the specific posture changes, the directional indicator changes accordingly.
In addition, the descriptions related to "first", "second", etc. in the present invention are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. Technical solutions between the various embodiments may be combined, provided such a combination can be realized by a person skilled in the art; when the technical solutions are contradictory or cannot be realized, the combination should be considered not to exist and falls outside the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a mixed reality interactive display device according to a first embodiment of the present invention.
As shown in fig. 1, the mixed reality interactive display device includes a display module 10 and an external touch module 20, where the display module 10 includes an internal display module 101 and an external display module 102 arranged back to back, and the external display module 102 is connected to the external touch module 20.
It should be noted that the display module 10 is used to display pictures to the wearer and the external interactor (i.e., the wearing user and the interactive user) so as to facilitate interaction between them. The display sides of the internal display module 101 and the external display module 102 are arranged back to back: the internal display module 101 displays the picture inward, through which the wearing user views the virtual scene, while the external display module 102 displays the picture outward, through which the interactive user sees the actual picture seen by the wearing user and interacts with them. The external touch module 20 is disposed on the display side of the external display module 102 and may be any suitable touch component, which this embodiment does not limit. Through the external touch module 20, the interactive user can express an idea on the external display module 102, which is then reflected on the internal display module 101, so that the wearing user can perceive the interactive user's ideas and intentions more intuitively, increasing interactivity.
In a specific implementation, the external display module 102 is configured to obtain a display screen of the internal display module 101 and display the display screen of the internal display module 101 to an interactive user, so that the interactive user determines intention information, the external touch module 20 is configured to obtain the intention information of the interactive user and update the intention information on the display screen displayed by the external display module 102, and the internal display module 101 is configured to obtain a display screen with the intention information and display the display screen with the intention information to a wearing user, so that the wearing user adjusts an execution action according to the intention information.
It is understood that the intention information is the expression of the interactive user's ideas and intentions on the display picture, and may take various forms, such as marks, indications, and drawings; other forms are possible, and this embodiment does not limit them.
It should be understood that the display picture of the internal display module 101 and that of the external display module 102 need to be consistent, so that the interactive user in the real environment can perceive the actual scene seen by the wearing user and respond to it, while the interactive user's intention can be fed back to the wearing user in time, thereby realizing information transfer and interaction.
Further, as shown in fig. 2, the internal display module 101 includes a lens 1011 and an internal display screen 1012; the display surface of the internal display screen 1012 faces the back surface of the lens 1011, with a preset distance between them. The external display module 102 includes an external display screen 1021, and the external display screen 1021 is connected to the external touch module 20.
It should be noted that the human eye has a near point: objects that are too close cannot be brought into focus, while objects farther than about 25 cm can be seen clearly. Since an MR display device is usually head-mounted, it is difficult to keep the screen 25 cm from the eye, so the lens 1011 is needed to let the wearing user see the picture on the internal display screen 1012 clearly. The preset distance between the internal display screen 1012 and the lens 1011 can be adjusted during actual production, which this embodiment does not limit.
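As a worked example of this preset distance, a thin-lens estimate places the virtual image of the screen at the eye's near point. The 5 cm focal length below is an assumed value for illustration; the patent leaves the exact distance to production tuning:

```python
# Thin-lens estimate of the screen-to-lens distance that places the
# virtual image at the eye's near point (25 cm). The focal length
# f = 5 cm is an assumed, illustrative value.
def screen_distance(f_cm, image_cm=25.0):
    """Object distance d_o giving a virtual image at image_cm:
    1/f = 1/d_o + 1/d_i with d_i = -image_cm (virtual image)."""
    return 1.0 / (1.0 / f_cm + 1.0 / image_cm)

d = screen_distance(5.0)  # ≈ 4.17 cm: screen sits just inside the focal length
```

The screen therefore ends up slightly closer to the lens than the focal length, which is what makes the lens act as a magnifier rather than a projector.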
It can be understood that the inner display 1012 and the outer display 1021 may be a single screen bent by 180 degrees, or may be two screens, i.e., a front screen and a back screen, which may be driven simultaneously.
In a specific implementation, the inner display screen 1012 may provide a display screen, the lens 1011 may present the display screen of the inner display screen 1012 to a wearing user, and the outer display screen 1021 may present the display screen of the inner display screen 1012 to an interactive user, and may also present intention information to the interactive user in the display screen.
In this embodiment, the mixed reality interactive display device includes a display module 10 and an external touch module 20. The display module 10 includes an internal display module 101 and an external display module 102 arranged back to back, and the external display module 102 is connected to the external touch module 20. The external display module 102 acquires the display picture of the internal display module 101 and presents it to the interactive user so that the interactive user can determine intention information; the external touch module 20 acquires that intention information and updates it onto the picture displayed by the external display module 102; and the internal display module 101 acquires the picture carrying the intention information and presents it to the wearing user, who adjusts the executed action accordingly. In this way, an external interactor can interact with the wearer more intuitively through touch and direct information transfer, greatly improving the interaction efficiency between the real space and the virtual space, the user experience, and MR interaction performance.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a mixed reality interactive display device according to a second embodiment of the present invention.
Based on the first embodiment, the mixed reality interactive display device provided in the embodiment of the present invention further includes an internal control module 30, and the internal control module 30 is respectively connected to the internal display screen 1012 and the external display screen 1021. The internal control module 30 is configured to clear the intention information and adjust a display screen of the internal display screen 1012 according to the execution action.
In a specific implementation, after the wearing user adjusts the executed action, the internal control module 30 may clear the intention information from the display pictures of the internal display screen 1012 and the external display screen 1021, adjust the picture of the internal display screen 1012 according to the new executed action, and let the external display screen 1021 continue to acquire the picture of the internal display screen 1012 for the next round of interaction.
Further, as shown in fig. 4, the internal control module 30 includes a control unit 301 and a backlight circuit 302 connected to each other; the control unit 301 is connected to the internal display screen 1012 and the external display screen 1021, and the backlight circuit 302 includes a first backlight circuit 3021 and a second backlight circuit 3022.
Note that the backlight circuit 302 is used to drive the inner display screen 1012 and the outer display screen 1021, the first backlight circuit 3021 is used to drive the inner display screen 1012, and the second backlight circuit 3022 is used to drive the outer display screen 1021.
It is understood that the control unit 301 is configured to adjust the display screens of the inner display screen 1012 and the outer display screen 1021. The control unit 301 needs to change the display screen of the inner display screen 1012 according to different execution actions of the wearer, and needs to clear the intention information marked in the inner display screen 1012 and the outer display screen 1021 in time after the wearer responds to the intention information.
It is to be understood that the control unit 301 includes an SOC (System on Chip), which can control the first backlight circuit 3021 to drive the internal display screen 1012 and the second backlight circuit 3022 to drive the external display screen 1021 through GPIO (General-Purpose Input/Output) control signals. The control unit 301 also has an FPGA (Field-Programmable Gate Array) that can adjust the display picture; the FPGA may be integrated into the SOC or provided separately, depending on the platform chosen during product design, which this embodiment does not limit.
It should be noted that if the internal display screen 1012 and the external display screen 1021 are a single screen bent by 180 degrees, pictures can be transferred between them directly without passing through the FPGA. If they are two back-to-back screens driven simultaneously, picture transfer between them passes through the FPGA to avoid time delay; in that case the FPGA connects to the internal display screen 1012 and the external display screen 1021 through two separate MIPI (Mobile Industry Processor Interface) links, since a single MIPI link cannot mount two loads at the same time.
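The driving scheme described above can be sketched as follows. GPIO pin numbers, class names, and the routing function are all hypothetical; they only illustrate the SOC-plus-two-backlights topology and the single-panel versus dual-MIPI frame paths:

```python
# Sketch of the SOC enabling both backlights over GPIO, and of the two
# panel topologies from the description. Pin numbers are hypothetical.

GPIO_BL_INTERNAL = 5   # enables the first backlight circuit (internal screen)
GPIO_BL_EXTERNAL = 6   # enables the second backlight circuit (external screen)

class Soc:
    def __init__(self):
        self.pins = {}

    def gpio_write(self, pin, level):
        # Stand-in for a real register or sysfs write.
        self.pins[pin] = level

    def enable_backlights(self):
        self.gpio_write(GPIO_BL_INTERNAL, 1)
        self.gpio_write(GPIO_BL_EXTERNAL, 1)

def route_frame(frame, bent_screen):
    """Single 180-degree-bent panel: direct transfer to both faces.
    Two back-to-back panels: the FPGA duplicates the frame onto two
    separate MIPI links so one link never carries both loads."""
    if bent_screen:
        return {"panel": frame}
    return {"mipi0_internal": frame, "mipi1_external": frame}
```

Either path ends with both faces showing the same picture, which is the consistency requirement stated earlier.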
In this embodiment, the mixed reality interactive display device further includes an internal control module 30 connected respectively to the internal display screen 1012 and the external display screen 1021, and the internal control module 30 is used to clear the intention information and adjust the display picture of the internal display screen 1012 according to the executed action. The display picture can thus be adjusted according to the wearing user's response, and the intention information left over from each completed interaction can be cleared as needed, ensuring that the next interaction proceeds normally and that the external interactor can continue to interact with the wearer intuitively through touch and direct information transfer, greatly improving the interaction efficiency between the real space and the virtual space.
An embodiment of the present invention provides a mixed reality interactive display method, and referring to fig. 5, fig. 5 is a schematic flow diagram of a first embodiment of a mixed reality interactive display method according to the present invention.
In this embodiment, the mixed reality interactive display method includes the following steps:
step S10: and the external display module acquires the display picture of the internal display module and displays the display picture of the internal display module to an interactive user so as to ensure that the interactive user determines intention information.
It should be noted that the execution subject of this embodiment is the above mixed reality interactive display device, the mixed reality interactive display device includes a display module, an external touch module and an internal control module, and the display module includes an internal display module and an external display module.
It will be appreciated that the wearing user is the user who is using the mixed reality interactive display device and the interactive user is the external user with whom the wearing user is interacting. The display frames of the internal display module and the external display module are required to be consistent, so that the interactive user can feel the actual scene seen by the wearing user to interact.
Further, the step S10 includes: the external display module acquires a display picture of the internal display module and displays the display picture of the internal display module to an interactive user, so that the interactive user determines an interactive intention according to the display picture and generates intention information in the external touch module according to the interactive intention.
It should be understood that the interactive intention refers to the intention or idea the interactive user forms from the display picture, for example wanting the wearing user to adjust or modify something. The intention information is the embodiment of that interactive intention on the display picture and includes mark information, indication information, and drawing information, i.e., the intention expressed as a mark, an indication, or a drawing; other forms are also possible.
In a specific implementation, in order to ensure that a picture seen by an interactive user is the same as an actual picture seen by a wearing user, the external display module displays a display picture of the internal display module to the interactive user, and after the interactive user understands the display picture, related ideas and intentions are generated to interact with the wearing user.
Further, before step S10, the method further includes: the internal control module drives the internal display module and the external display module, and the internal display module displays a display picture to the wearer.
In specific implementation, the internal control module drives the internal display module and the external display module, so that the internal display module can display a display picture to a wearer, and the external display module can acquire the display picture of the internal display module.
Step S20: and the external touch module acquires the intention information of the interactive user and updates the intention information on a display picture displayed by the external display module.
It should be noted that, the interactive user may embody the interactive intention as intention information through the external touch module, so as to increase the interactivity between the wearing user and the interactive user.
In a specific implementation, an interactive user converts an interactive intention into intention information through an external touch module, and the external touch module displays the intention information on a display picture displayed by an external display module.
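One detail the text leaves open is how a touch point on the outward-facing screen maps onto the wearer's picture. Because the two display sides face opposite directions, a plausible mapping mirrors the x coordinate; this mirroring is an assumption, not something the patent states:

```python
# Maps an external touch point onto the internal (wearer-facing) picture.
# The horizontal mirror is an assumption motivated by the back-to-back
# layout; the patent does not specify the coordinate transform.

def to_internal_coords(x, y, width):
    """Mirror the x coordinate across the screen's vertical midline."""
    return (width - 1 - x, y)

def annotate(frame_annotations, x, y, width, kind="mark"):
    """Record intention info (mark/indication/drawing) at the mirrored
    position so it lines up with what the wearer sees."""
    frame_annotations.append((kind, to_internal_coords(x, y, width)))
    return frame_annotations
```

With such a transform, a circle drawn by the interactive user lands on the same object in the wearer's view rather than on its mirror image.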
Step S30: and the internal display module acquires the display picture with the intention information and displays the display picture with the intention information to a wearing user so that the wearing user can adjust and execute actions according to the intention information.
It can be understood that, after the interactive user expresses the intention information, in order to ensure that the wearing user can obtain the intention information in time, the internal display module can display the display picture and the intention information of the external display module to the wearing user.
Further, the step S30 includes: the internal display module acquires the display picture with the intention information displayed by the external display module, and displays the display picture with the intention information to a wearing user, so that the wearing user determines the interaction intention of the interaction user according to the intention information, and adjusts the execution action according to the interaction intention.
It should be understood that the performing action refers to operations and actions performed by the wearing user.
In specific implementation, the internal display module displays a display screen containing intention information in the external display module to a wearing user, and after the wearing user understands the intention information of the interactive user, the operation/action of the wearing user can be adjusted according to the intention information, so that the interaction between the wearing user and the interactive user is realized.
Further, after the step S30, the method further includes: and the internal control module clears the intention information and adjusts the display picture of the internal display module according to the execution action.
It should be noted that residual intention information in the display picture may interfere with the next interaction. Therefore, after the wearing user has made the corresponding response, the intention information needs to be cleared, which prevents the wearing user from obtaining stale information in the next interaction and ensures that the next interaction proceeds smoothly.
As shown in the overall flowchart of fig. 6, the wearing user transmits the information to be shared to the outside through the external display module, and the external interactive user receives and understands the information. The interactive user then reflects his or her own idea onto the display picture through the external touch module; the wearing user receives and understands the idea of the interactive user and executes a corresponding modification action. Finally, the internal control module responds to the modification action of the wearing user (for example, by adjusting the display picture), and the next interaction continues.
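The round trip of fig. 6 can be summarized as a single function per interaction round. This is a behavioral sketch under the assumption that a display picture can be modeled as a list of drawable items; none of the names below come from the patent:

```python
# One interaction round, following the flow of fig. 6:
# mirror -> annotate -> show back -> respond -> clear.
# Names and the data representation are illustrative assumptions.

def interaction_round(internal_picture, intention_marks, adjust):
    """internal_picture : items currently shown to the wearing user
    intention_marks  : annotations added by the interactive user
    adjust           : callback modeling the wearer's modification action
    Returns the internal picture to show in the next round.
    """
    # 1. The external display module mirrors the internal display picture.
    external_picture = list(internal_picture)

    # 2. The external touch module updates the mirrored picture with the
    #    interactive user's intention information.
    external_picture += intention_marks

    # 3. The internal display module shows the annotated picture to the
    #    wearing user, who adjusts the execution action accordingly.
    next_picture = adjust(external_picture)

    # 4. The internal control module clears the intention information so
    #    it cannot be mistaken for current input in the next round.
    return [item for item in next_picture if item not in intention_marks]
```

For example, `interaction_round(["scene"], ["circle here"], lambda p: p + ["moved part"])` returns `["scene", "moved part"]`: the wearer's modification survives, while the interactive user's mark is cleared before the next round.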
In this embodiment, the external display module acquires the display picture of the internal display module and presents it to the interactive user so that the interactive user determines intention information; the external touch module acquires the intention information of the interactive user and updates it onto the display picture shown by the external display module; and the internal display module acquires the display picture carrying the intention information and presents it to the wearing user, so that the wearing user adjusts the execution action according to the intention information. In this way, an external interactive person can interact with the wearer more intuitively through touch control, information transmission and other means, which greatly improves the interaction efficiency between the real space and the virtual space, enhances the user experience, and improves MR interaction performance.
It should be understood that the above is only an example, and the technical solution of the present invention is not limited in any way, and in a specific application, a person skilled in the art may set the technical solution as needed, and the present invention is not limited thereto.
It should be noted that the above-described workflow is only exemplary and does not limit the scope of the present invention; in practical applications, a person skilled in the art may select some or all of the steps to achieve the purpose of the solution of this embodiment according to actual needs, and the present invention is not limited herein.
In addition, technical details that are not elaborated in this embodiment may refer to a mixed reality interactive display method provided in any embodiment of the present invention, and are not described herein again.
Further, it should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a(n) …" does not exclude the presence of other like elements in the process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g. a Read Only Memory (ROM)/RAM, a magnetic disk, and an optical disk), and includes several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (10)
1. A mixed reality interactive display device, characterized by comprising a display module and an external touch module, wherein the display module comprises an internal display module and an external display module, the internal display module and the external display module are arranged back to back, and the external display module is connected with the external touch module;
the external display module is used for acquiring the display picture of the internal display module and displaying the display picture of the internal display module to an interactive user so as to enable the interactive user to determine intention information;
the external touch module is used for acquiring intention information of the interactive user and updating the intention information on a display picture displayed by the external display module;
the internal display module is used for acquiring the display picture with the intention information and displaying the display picture with the intention information to a wearing user, so that the wearing user adjusts an execution action according to the intention information.
2. The mixed reality interactive display device of claim 1, wherein the internal display module comprises a lens and an internal display screen, a display surface of the internal display screen is disposed opposite to a reverse surface of the lens, and the display surface of the internal display screen is spaced from the reverse surface of the lens by a preset distance;
the internal display screen is used for providing the display picture;
the lens is used for displaying the display picture of the inner display screen to the wearing user.
3. The mixed reality interactive display device of claim 2, wherein the external display module comprises an external display screen, and the external display screen is connected with the external touch module;
the external display screen is used for displaying a display picture of the internal display screen to the interactive user;
the external display screen is further used for displaying the intention information to the interactive user.
4. The mixed reality interactive display device of claim 3, further comprising an internal control module, the internal control module being connected to the internal display screen and the external display screen, respectively;
and the internal control module is used for clearing the intention information and adjusting the display picture of the internal display screen according to the execution action.
5. The mixed reality interactive display device of claim 4, wherein the internal control module comprises a control unit and a backlight circuit connected to each other, the control unit is connected with the internal display screen and the external display screen, and the backlight circuit comprises a first backlight circuit and a second backlight circuit;
the first backlight circuit is used for driving the inner display screen;
the second backlight circuit is used for driving the external display screen;
and the control unit is used for adjusting the display pictures of the internal display screen and the external display screen.
6. A mixed reality interactive display method applied to the mixed reality interactive display device according to any one of claims 1 to 5, wherein the mixed reality interactive display method comprises the following steps:
the external display module acquires a display picture of the internal display module and displays the display picture of the internal display module to an interactive user so as to enable the interactive user to determine intention information;
the external touch module acquires intention information of the interactive user and updates the intention information on a display picture displayed by the external display module;
and the internal display module acquires a display picture with the intention information and displays the display picture with the intention information to a wearing user, so that the wearing user adjusts an execution action according to the intention information.
7. The method of claim 6, wherein after the internal display module acquires the display picture with the intention information and displays the display picture with the intention information to the wearing user so that the wearing user adjusts the execution action according to the intention information, the method further comprises:
and the internal control module clears the intention information and adjusts the display picture of the internal display module according to the execution action.
8. The method of claim 6, wherein the intention information includes mark information, indication information, and drawing information, and the step of the external display module acquiring a display picture of the internal display module and presenting the display picture of the internal display module to an interactive user so that the interactive user determines the intention information comprises:
the external display module acquires a display picture of the internal display module and displays the display picture of the internal display module to an interactive user, so that the interactive user determines an interactive intention according to the display picture and generates intention information in the external touch module according to the interactive intention.
9. The method of claim 8, wherein the step of the internal display module acquiring the display picture with the intention information and presenting the display picture with the intention information to a wearing user so that the wearing user adjusts an execution action according to the intention information comprises:
the internal display module acquires the display picture with the intention information displayed by the external display module, and displays the display picture with the intention information to a wearing user, so that the wearing user determines the interaction intention of the interaction user according to the intention information, and adjusts the execution action according to the interaction intention.
10. The method according to any one of claims 6 to 9, wherein before the external display module acquires the display screen of the internal display module and presents the display screen of the internal display module to an interactive user, the method further comprises:
the internal control module drives the internal display module and the external display module;
the intra-pair display module presents a display screen to the wearer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211679146.4A CN115857698A (en) | 2022-12-26 | 2022-12-26 | Mixed reality interactive display device and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211679146.4A CN115857698A (en) | 2022-12-26 | 2022-12-26 | Mixed reality interactive display device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115857698A true CN115857698A (en) | 2023-03-28 |
Family
ID=85654961
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211679146.4A Pending CN115857698A (en) | 2022-12-26 | 2022-12-26 | Mixed reality interactive display device and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115857698A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116204068A (en) * | 2023-05-06 | 2023-06-02 | 蔚来汽车科技(安徽)有限公司 | Augmented reality display device, display method thereof, vehicle, mobile terminal, and medium |
CN116204068B (en) * | 2023-05-06 | 2023-08-04 | 蔚来汽车科技(安徽)有限公司 | Augmented reality display device, display method thereof, vehicle, mobile terminal, and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109002243B (en) | Image parameter adjusting method and terminal equipment | |
US20080252604A1 (en) | OSD controlling system and operation method thereof | |
CN101276568A (en) | Control system and control method for display of display screen | |
CN104123110A (en) | Android double-screen extraordinary image display method | |
CN104395935A (en) | Method and apparatus for modifying the presentation of information based on the visual complexity of environment information | |
CN101566909A (en) | Multiwindow operation interface display method | |
WO2022194003A1 (en) | Screen capture method and apparatus, electronic device, and readable storage medium | |
CN112286612B (en) | Information display method and device and electronic equipment | |
US11183105B2 (en) | Display panel and device, image processing method and device, and virtual reality system | |
CN113918260A (en) | Application program display method and device and electronic equipment | |
CN115857698A (en) | Mixed reality interactive display device and method | |
CN109426476B (en) | Signal source scheduling system and signal scheduling method of signal source system | |
CN112783594A (en) | Message display method and device and electronic equipment | |
CN106775397B (en) | PCB and multi-point touch method using PCB | |
CN112905134A (en) | Method and device for refreshing display and electronic equipment | |
CN113590066A (en) | Full-automatic multi-screen splicing method, device, equipment and storage medium | |
CN113504870B (en) | Hypervisor intelligent cockpit input method sharing system and method | |
CN107391125B (en) | User interface design method for man-machine interaction under VxWorks system | |
US20210375061A1 (en) | Augmented Reality System Supporting Customized Multi-Channel Interaction | |
CN113312016A (en) | Method for double-screen different display in vehicle-mounted entertainment system | |
WO2018029827A1 (en) | Information processing device, information processing method, program and storage medium | |
JP2023548807A (en) | Information processing methods, devices and electronic equipment | |
CN112346687A (en) | Information interaction method and device, electronic equipment and readable storage medium | |
WO2021064823A1 (en) | Linked display system and head-mounted display | |
CN112367422B (en) | Interaction method and device of mobile terminal equipment and display system and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||