
CN112004073B - System and method for different-surface fusion image interaction based on window platform - Google Patents


Info

Publication number
CN112004073B
CN112004073B
Authority
CN
China
Prior art keywords
projector
unit
dimensional model
data
module
Prior art date
Legal status
Active
Application number
CN202010786072.9A
Other languages
Chinese (zh)
Other versions
CN112004073A (en)
Inventor
周安斌
邓建波
尚绪峰
Current Assignee
Shandong Jindong Digital Creative Co ltd
Original Assignee
Shandong Jindong Digital Creative Co ltd
Priority date
Filing date
Publication date
Application filed by Shandong Jindong Digital Creative Co., Ltd.
Priority to CN202010786072.9A
Publication of CN112004073A
Application granted
Publication of CN112004073B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141 Constructional details thereof
    • H04N 9/3147 Multi-projection systems
    • H04N 9/317 Convergence or focusing systems
    • H04N 9/3179 Video signal processing therefor
    • H04N 9/3191 Testing thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/04 Indexing scheme for image data processing or generation, in general, involving 3D image data

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A window-platform-based system and method for different-surface fusion image interaction, in the technical field of projection. The system comprises a projector simulation projection module, a fusion module and an image interaction module. Measured parameters of the projection area are input into the projector simulation projection module to obtain three-dimensional model data simulating the position and angle of each projector. The fusion module calibrates the three-dimensional model data so that the edges of the projected pictures fuse seamlessly and picture definition is optimal, yielding projector installation parameter data that is sent to the image interaction module. The image interaction module positions the projection units according to the installation parameter data, so no on-site debugging is needed, and interacts with the audience according to the content of the played video. This solves the problem that existing dynamic-cinema projectors must be debugged after installation, which consumes a great deal of time, manpower and material resources and results in low working efficiency.

Description

System and method for different-surface fusion image interaction based on window platform
Technical Field
The invention relates to the technical field of projection, in particular to a system and a method for different-surface fusion image interaction based on a window platform.
Background
Projection fusion overlaps the edges of the pictures projected by a group of projectors and blends them so that they display a single seamless picture that is brighter, larger and of higher resolution than any one projector could produce, yet looks as if it were projected by a single projector.
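The edge fusion described above is typically realized as an attenuation ramp across the overlap band of the two pictures, pre-compensated for display gamma so that the summed light output stays constant. The sketch below illustrates that idea only; the function names, band width and gamma value are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def blend_ramp(width_px, gamma=2.2):
    """Per-pixel attenuation for one projector across the overlap band.

    The left projector fades out (1 -> 0) while the mirrored ramp of the
    right projector fades in, so that after the display applies gamma the
    summed light output is constant across the band.
    """
    t = np.linspace(1.0, 0.0, width_px)   # linear fade in light space
    return t ** (1.0 / gamma)             # pre-compensate display gamma

left = blend_ramp(4)                      # left projector's band
right = blend_ramp(4)[::-1]               # right projector's mirrored band
# After display gamma, the light contributions sum to ~1 everywhere:
light = left ** 2.2 + right ** 2.2
```

In practice each projector applies its ramp only inside the shared band, leaving the rest of its picture at full intensity.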
Disclosure of Invention
The embodiments of the invention provide a window-platform-based system and method for different-surface fusion image interaction. Measured parameters of the projection area are input into a projector simulation projection module to obtain three-dimensional model data simulating the position and angle of each projector. The fusion module calibrates the three-dimensional model data so that the edges of the projected pictures fuse perfectly and picture definition is optimal, yielding projector installation parameter data that is sent to the image interaction module. The image interaction module positions the projection units according to the installation parameter data, so no on-site debugging is required, and interacts with the audience according to the content of the played video. This solves the problem that existing dynamic-cinema projectors must be debugged after installation, consuming a great deal of time, manpower and material resources and resulting in low working efficiency.
The window-platform-based different-surface fusion image interaction system includes: a projector simulation projection module, a fusion module and an image interaction module;
the projector simulation projection module is used for determining the installation position of the projector to obtain three-dimensional model data, the three-dimensional model data comprises the position data of the projector, and the three-dimensional model data is sent to the fusion module;
the projector simulation projection module comprises an input unit, a visual angle simulation unit, a preview unit and an output unit, wherein the input unit is used for inputting projection environment parameters and inputting the parameters into the visual angle simulation unit, the visual angle simulation unit is used for receiving the projection environment parameters input by the input unit, establishing a three-dimensional model according to the projection environment parameters to simulate the position and angle data of a projector to obtain three-dimensional model data, and sending the three-dimensional model data to the preview unit, and the preview unit reads the three-dimensional model data, previews the three-dimensional model data in real time and sends the three-dimensional model data to the fusion module;
the fusion module is used for receiving the three-dimensional model data sent by the projector simulation projection module and calibrating to fuse the edges of the projection picture;
the fusion module comprises a receiving unit and a parameter adjusting module, wherein the receiving unit is used for receiving three-dimensional model data sent by the projector simulation projection module and sending the three-dimensional model data to the parameter adjusting module, the parameter adjusting module is used for adjusting the edge of a projection picture to be fused to obtain projector installation parameter data, and the projector installation parameter data is sent to the image interaction module;
and the image interaction module is used to make the projected picture interact with the user by sending interaction data, so as to enhance the viewing experience.
The image interaction module comprises a projector unit, a storage unit, an interaction unit, a video playing unit and a dynamic seat unit. The projector unit is installed according to the projector installation parameter data; the storage unit stores video files and interaction data packets; the interaction unit reads the interaction data packets and sends them to the dynamic seat unit to interact with the user; and the video playing unit reads the video files and plays them through the projector unit.
Further, the projection environment parameters input by the input unit include the shape and size of the curtain and the size of the projection space.
Further, the three-dimensional model data includes position data of the projector, shape and size data of the curtain, and data of the projection space.
Further, the projector installation parameters include an installation position parameter and an installation angle parameter of the projector and a focal length parameter of the projector.
Further, the projector units are projectors, and there are two or more of them.
In a second aspect, an embodiment of the present invention provides a window-platform-based different-surface fusion image interaction method, comprising the following steps:
s1, simulating and positioning the projector, inputting measured projection environment parameters by a user through an input unit, inputting the parameters into a visual angle simulation unit, receiving the projection environment parameters input by the input unit by the visual angle simulation unit, establishing a three-dimensional model according to the projection environment parameters to simulate the position and angle data of the projector to obtain three-dimensional model data, sending the three-dimensional model data to a preview unit, reading the three-dimensional model data by the preview unit, previewing in real time, and sending the three-dimensional model data to a receiving unit;
s2, positioning the projector, receiving three-dimensional model data sent by a projector simulation projection module by a receiving unit, sending the three-dimensional model data to a parameter adjusting module, adjusting the edge of a projection picture by the parameter adjusting module for fusion to obtain projector installation parameter data, and sending the projector installation parameter data to the projector unit;
and S3, interacting: the projector unit is installed according to the projector installation parameter data; the storage unit stores a video file and an interaction data packet; the interaction unit reads the interaction data packet and sends it to the dynamic seat unit to interact with the user; and the video playing unit reads the video file and plays it through the projector unit.
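The three-step data flow can be sketched as a small Python pipeline. All type names, fields and the concrete numbers are illustrative assumptions; the patent specifies only which data each step consumes and produces.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentParams:      # measured before installation (input to S1)
    screen_shape: str         # e.g. "hemisphere"
    screen_size_m: float
    room_size_m: tuple

@dataclass
class InstallParams:          # calibration output (result of S2)
    position_m: tuple         # projector mounting position
    angle_deg: tuple          # mounting angles
    focal_length_mm: float

def s1_simulate(env: EnvironmentParams) -> dict:
    """Build a 3-D model of screen, room and candidate projector poses."""
    return {"env": env, "poses": [(0.0, 0.0, 2.5), (1.2, 0.0, 2.5)]}

def s2_calibrate(model: dict) -> InstallParams:
    """Adjust poses until simulated frame edges fuse; emit install data."""
    x, y, z = model["poses"][0]
    return InstallParams((x, y, z), (0.0, -12.0, 0.0), 18.0)

def s3_interact(install: InstallParams) -> str:
    """Mount the projector per the install data, then start playback."""
    return f"projector mounted at {install.position_m}, playing"

env = EnvironmentParams("hemisphere", 6.0, (10.0, 8.0, 4.0))
params = s2_calibrate(s1_simulate(env))
status = s3_interact(params)
```

The point of the pipeline is that S2's output fully determines the physical installation, which is why no on-site debugging is needed in S3.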
The technical scheme provided by the embodiment of the invention has the beneficial effects that at least:
the invention obtains three-dimensional model data simulating the position and the angle of a projector by measuring the parameters of a projection area and inputting the parameters into a projector simulation projection module, the fusion module calibrates the three-dimensional model data to ensure that the edges of a projection picture are perfectly fused and the definition of the picture is optimal, obtains projector installation parameter data and sends the projector installation parameter data to an image interaction module, and the image interaction module sets the position of a projection unit according to the installation parameter data without field debugging and generating interaction with audiences according to the content of a played video.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic structural diagram of a window platform-based out-of-plane fusion image interaction system according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a window platform-based out-of-plane fusion image interaction method disclosed in the embodiment of the present invention.
Reference numerals:
100-projector simulation projection module; 101-an input unit; 102-a view angle simulation unit; 103-preview unit; 104-an output unit; 200-a fusion module; 201-a receiving unit; 202-parameter adjustment module; 300-image interaction module; 301-a projector unit; 302-a storage unit; 303-an interaction unit; 304-a video playback unit; 305-dynamic seat unit.
Detailed Description of Embodiments
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Example one
As shown in fig. 1, an embodiment of the invention provides a window-platform-based different-surface fusion image interaction system, which includes: a projector simulation projection module 100, a fusion module 200 and an image interaction module 300;
the projector simulation projection module 100 is configured to determine an installation position of a projector, obtain three-dimensional model data, and send the three-dimensional model data to the fusion module 200, where the projector simulation projection module 100 includes an input unit 101, a viewing angle simulation unit 102, a preview unit 103, and an output unit 104, the input unit 101 is configured to input projection environment parameters, the projection environment parameters input by the input unit 101 include a shape and a size of a curtain and a size of a projection space, the parameters are input to the viewing angle simulation unit 102, the viewing angle simulation unit 102 is configured to receive the projection environment parameters input by the input unit 101, establish a three-dimensional model simulation projector position and angle data according to the projection environment parameters, obtain three-dimensional model data, the three-dimensional model data includes position data of the projector, shape and size data of the curtain, and data of the projection space, and send the three-dimensional model data to the preview unit 103, the preview unit 103 reads the three-dimensional model data and performs real-time preview, and sends the three-dimensional model data to the fusion module 200;
specifically, before installation, a projection area is measured to obtain parameters of the shape and size of a curtain and the space size of a projection space, the measured parameters are input into a viewing angle simulation unit 102 through an input unit 101, a three-dimensional model is established, an asymmetric effect is generated by modifying a bottom camera matrix to enable a camera axis to generate the oblique viewing angle of a projector, the mapping deformation of an image is more accurate, for example, the number of the projectors is a and b respectively, the curtain is hemispherical, the positions of the projectors are set to be a1 and b1 respectively in the three-dimensional model, the projections of the curtains in the three-dimensional model are not in contact with each other, the positions a1 and b1 of a and b are adjusted to enable the projected edges of the curtains in the three-dimensional model to be close, the asymmetric effect is generated by modifying the bottom camera matrix to enable the camera axis, the oblique viewing angle of the simulated projector is adjusted to enable the angles and the projected edges of the projections of a and b in the three-dimensional model to be coincident, the simulation is more accurate, the image deformation is smaller, the debugged three-dimensional model data is sent to the preview unit 103, the whole process preview unit 103 reads the three-dimensional model data and previews the three-dimensional model data in real time, and the three-dimensional model data is sent to the fusion module 200.
The fusion module 200 is configured to receive the three-dimensional model data sent by the projector simulation projection module 100, calibrate it, and precisely position the installation position of each projector so that the edges of the projected pictures fuse. The fusion module 200 includes a receiving unit 201 and a parameter adjusting module 202. The receiving unit 201 receives the three-dimensional model data and passes it to the parameter adjusting module 202, which adjusts the edges of the projected pictures for fusion, obtains projector installation parameter data, and sends it to the image interaction module 300. The projector installation parameters include the installation position, installation angle and focal length of each projector;
specifically, the receiving unit 201 receives three-dimensional model data sent by the projector simulation projection module 100, the parameter adjusting module 202 further adjusts the position and the focal length of the projector according to data in the three-dimensional model data, for example, the parameters of the projector are input to the parameter adjusting module 202, such as the adjustable focal length of the projector, through simulation of the projector, it is determined whether the picture projected by the coordinates of the projector in the current three-dimensional model into the curtain is clear, adjustment is performed according to the coordinates of the projector, so that the projected picture edges are fused further, projector installation parameter data are obtained, and the projector installation parameter data are sent to the image interaction module 300.
The image interaction module 300 is used to make the projected picture interact with the user by sending interaction data, enhancing the viewing experience. It comprises a projector unit 301, a storage unit 302, an interaction unit 303, a video playing unit 304 and a dynamic seat unit 305. The projector unit 301 is installed according to the projector installation parameter data; the storage unit 302 stores video files and interaction data packets; the interaction unit 303 reads the interaction data packets and sends them to the dynamic seat unit 305 to interact with the user; and the video playing unit 304 reads the video files and plays them through the projector unit 301;
specifically, projector unit 301 installs according to the projector installation parameter data that obtains, the position angle and the focus of fixed projector, avoided the problem that debugging consumes a large amount of time and manpower and materials after installing earlier, after the installation, video playback unit 304 and interactive unit 303 read video file and the interactive data package that corresponds respectively from memory cell 302, interact with the user, for example, send appointed interactive data package through the udp protocol and drive dynamic seat unit 305 and produce the gesture at appointed film broadcast time point and change, the protocol format: where the frame is up to 1000, then: dividing into: second: frame, 00:00:00:000, for example, a dynamic seat is set at 00:00: 05: when a left shift posture change instruction sent by the interaction unit 303 is received at the time point 128, the dynamic seat unit 305 immediately performs left shift, and a strong interaction effect is generated with a user through the combination of the movement of the dynamic seat unit 305 and the picture, so that the watching experience of the user is enhanced.
By measuring the parameters of the projection area and inputting them into the projector simulation projection module 100, the invention obtains three-dimensional model data simulating the position and angle of each projector. The fusion module 200 calibrates the three-dimensional model data so that the edges of the projected pictures fuse perfectly and picture definition is optimal, obtains projector installation parameter data, and sends it to the image interaction module 300. The image interaction module 300 positions the projection units according to the installation parameter data, so no on-site debugging is needed, and interacts with the audience according to the content of the played video. This solves the problem that existing dynamic-cinema projectors must be debugged after installation, consuming a great deal of time, manpower and material resources and resulting in low working efficiency.
Example two
The embodiment of the invention also discloses a window-platform-based different-surface fusion image interaction method which, as shown in fig. 2, comprises the following steps:
s1, simulating and positioning the projector position, inputting measured projection environment parameters by a user through the input unit 101, inputting the parameters into the visual angle simulation unit 102, receiving the projection environment parameters input by the input unit 101 by the visual angle simulation unit 102, establishing a three-dimensional model according to the projection environment parameters to simulate the projector position and angle data to obtain three-dimensional model data, sending the three-dimensional model data to the preview unit 103, reading the three-dimensional model data by the preview unit 103, previewing the three-dimensional model data in real time, and sending the three-dimensional model data to the receiving unit 201;
specifically, before installation, a projection area is measured to obtain parameters of the shape and size of a curtain and the space size of a projection space, the measured parameters are input into a viewing angle simulation unit 102 through an input unit 101, a three-dimensional model is established, an asymmetric effect is generated by modifying a bottom camera matrix to enable a camera axis to generate the oblique viewing angle of a projector, the mapping deformation of an image is more accurate, for example, the number of the projectors is a and b respectively, the curtain is hemispherical, the positions of the projectors are set to be a1 and b1 respectively in the three-dimensional model, the projections of the curtains in the three-dimensional model are not in contact with each other, the positions a1 and b1 of a and b are adjusted to enable the projected edges of the curtains in the three-dimensional model to be close, the asymmetric effect is generated by modifying the bottom camera matrix to enable the camera axis, the oblique viewing angle of the simulated projector is adjusted to enable the angles and the projected edges of the projections of a and b in the three-dimensional model to be coincident, the simulation is more accurate, the image deformation is smaller, the debugged three-dimensional model data is sent to the preview unit 103, the whole process preview unit 103 reads the three-dimensional model data and previews the three-dimensional model data in real time, and the three-dimensional model data is sent to the receiving unit 201.
S2, positioning the projector: the receiving unit 201 receives the three-dimensional model data sent by the projector simulation projection module 100 and passes it to the parameter adjusting module 202; the parameter adjusting module 202 adjusts the edges of the projected pictures so that they fuse, obtains projector installation parameter data, and sends it to the projector unit 301;
specifically, the receiving unit 201 receives three-dimensional model data sent by the projector simulation projection module 100, the parameter adjusting module 202 further adjusts the position and the focal length of the projector according to data in the three-dimensional model data, for example, the parameters of the projector are input to the parameter adjusting module 202, such as the adjustable focal length of the projector, through simulation of the projector, it is determined whether the current coordinate of the projector in the three-dimensional model is clear when the picture projected into the curtain is clear, adjustment is performed according to the coordinate of the projector, so that the projected picture edges are fused and further fused, projector installation parameter data are obtained, and the projector installation parameter data are sent to the projector unit 301.
And S3, interacting: the projector unit 301 is installed according to the projector installation parameter data; the storage unit 302 stores a video file and an interaction data packet; the interaction unit 303 reads the interaction data packet and sends it to the dynamic seat unit 305 to interact with the user; and the video playing unit 304 reads the video file and plays it through the projector unit 301.
Specifically, the projector unit 301 is installed according to the obtained projector installation parameter data, fixing the position, angle and focal length of each projector, which avoids the time, manpower and material cost of debugging after installation. Once installed, the video playing unit 304 and the interaction unit 303 read the corresponding video file and interaction data packet from the storage unit 302 and interact with the user. For example, at a specified film playback time point, a specified interaction data packet is sent over the UDP protocol to drive the dynamic seat unit 305 to change posture. The time code in the protocol has the format hour:minute:second:frame, e.g. 00:00:00:000, where the frame field runs up to 1000. For example, if a left-shift posture change instruction from the interaction unit 303 is bound to the time point 00:00:05:128, the dynamic seat unit 305 shifts left the moment it is received; the combination of seat movement and picture produces a strong interactive effect and enhances the user's viewing experience.
By measuring the parameters of the projection area and inputting them into the projector simulation projection module 100, the method obtains three-dimensional model data simulating the position and angle of each projector. The fusion module 200 calibrates the three-dimensional model data so that the edges of the projected pictures fuse perfectly and picture definition is optimal, obtains projector installation parameter data, and sends it to the image interaction module 300. The image interaction module 300 positions the projection units according to the installation parameter data, so no on-site debugging is needed, and interacts with the audience according to the content of the played video. This solves the problem that existing dynamic-cinema projectors must be debugged after installation, consuming a great deal of time, manpower and material resources and resulting in low working efficiency.
It should be understood that the specific order or hierarchy of steps in the processes disclosed is an example of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged without departing from the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not intended to be limited to the specific order or hierarchy presented.
In the foregoing detailed description, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the subject matter require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate preferred embodiment of the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. Of course, the processor and the storage medium may reside as discrete components in a user terminal.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in memory units and executed by processors. The memory unit may be implemented within the processor or external to the processor, in which case it can be communicatively coupled to the processor via various means as is known in the art.
What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the aforementioned embodiments, but one of ordinary skill in the art may recognize that many further combinations and permutations of various embodiments are possible. Accordingly, the embodiments described herein are intended to embrace all such alterations, modifications and variations that fall within the scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim. Furthermore, any use of the term "or" in either the specification or the claims is intended to mean a "non-exclusive or".

Claims (5)

1. A window platform-based out-of-plane fusion image interaction system, characterized by comprising: a projector simulation projection module, a fusion module and an image interaction module;
the projector simulation projection module is used for determining the installation position of the projector to obtain three-dimensional model data, the three-dimensional model data comprising the position data of the projector, and for sending the three-dimensional model data to the fusion module;
the projector simulation projection module comprises an input unit, a viewing-angle simulation unit, a preview unit and an output unit; the input unit is used for inputting projection environment parameters, the projection environment parameters comprising the shape and size of the curtain and the size of the projection space, and for passing these parameters to the viewing-angle simulation unit; the viewing-angle simulation unit is used for receiving the projection environment parameters input by the input unit, establishing a three-dimensional model according to the projection environment parameters to simulate the position and angle data of the projector, thereby obtaining three-dimensional model data, and sending the three-dimensional model data to the preview unit; the preview unit reads the three-dimensional model data, previews them in real time, and sends the three-dimensional model data to the fusion module;
the fusion module is used for receiving the three-dimensional model data sent by the projector simulation projection module and for performing calibration so that the edges of the projection picture are fused;
the fusion module comprises a receiving unit and a parameter adjusting module; the receiving unit is used for receiving the three-dimensional model data sent by the projector simulation projection module and sending them to the parameter adjusting module; the parameter adjusting module is used for adjusting the edges of the projection picture so that they are fused, thereby obtaining projector installation parameter data, and for sending the projector installation parameter data to the image interaction module;
the image interaction module is used for enhancing the interaction between the projected picture and the user by sending interaction data, so as to enhance the viewing experience;
the image interaction module comprises a projector unit, a storage unit, an interaction unit, a video playing unit and a dynamic seat unit; the projector unit is installed according to the projector installation parameter data; the storage unit is used for storing video files and interaction data packets; the interaction unit is used for reading the interaction data packets and sending them to the dynamic seat unit to realize interaction with the user; and the video playing unit is used for reading the video files and playing the videos through the projector unit.
2. The window platform-based out-of-plane fusion image interaction system of claim 1, wherein the three-dimensional model data comprise projector position data, curtain shape and size data, and projection space data.
3. The window platform-based out-of-plane fusion image interaction system of claim 1, wherein the projector installation parameters comprise an installation position parameter, an installation angle parameter and a focal length parameter of the projector.
4. The window platform-based out-of-plane fusion image interaction system of claim 1, wherein the projector unit comprises two or more projectors.
5. A window platform-based out-of-plane fusion image interaction method, applied to the window platform-based out-of-plane fusion image interaction system according to any one of claims 1 to 4, and comprising the following steps:
s1, simulating and positioning the projector, inputting measured projection environment parameters by a user through an input unit, inputting the parameters into a visual angle simulation unit, receiving the projection environment parameters input by the input unit by the visual angle simulation unit, establishing a three-dimensional model according to the projection environment parameters to simulate the position and angle data of the projector to obtain three-dimensional model data, sending the three-dimensional model data to a preview unit, reading the three-dimensional model data by the preview unit, previewing in real time, and sending the three-dimensional model data to a receiving unit;
s2, positioning the projector, receiving three-dimensional model data sent by a projector simulation projection module by a receiving unit, sending the three-dimensional model data to a parameter adjusting module, adjusting the edge of a projection picture by the parameter adjusting module for fusion to obtain projector installation parameter data, and sending the projector installation parameter data to the projector unit;
S3, interaction: the projector unit is installed according to the projector installation parameter data; the storage unit stores a video file and an interaction data packet; the interaction unit reads the interaction data packet and sends it to the dynamic seat unit to realize interaction with the user; and the video playing unit reads the video file and plays the video through the projector unit.
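The three-step flow of claim 5 (simulate, fuse, interact) can be sketched as a minimal data pipeline. The class and function names below are illustrative assumptions for exposition, not part of the patent; the placement and blending values are placeholders, since the patent does not specify the underlying algorithms:

```python
from dataclasses import dataclass

# Hypothetical data carriers for the S1-S3 flow (names are illustrative).
@dataclass
class ProjectionEnvironment:
    curtain_shape: str    # e.g. "cylindrical"
    curtain_size: tuple   # (width_m, height_m)
    space_size: tuple     # (width, height, depth) of the projection space

@dataclass
class ThreeDModelData:
    projector_position: tuple   # (x, y, z)
    projector_angle: tuple      # (pitch, yaw)
    environment: ProjectionEnvironment

def simulate_projector(env: ProjectionEnvironment) -> ThreeDModelData:
    """S1: the viewing-angle simulation unit builds a 3D model and places
    the projector. A real implementation would optimize position and angle
    against the curtain geometry; this stub centres the projector."""
    w, h, d = env.space_size
    return ThreeDModelData(projector_position=(w / 2, h / 2, d),
                           projector_angle=(0.0, 0.0),
                           environment=env)

def fuse_edges(model: ThreeDModelData) -> dict:
    """S2: the parameter adjusting module derives installation parameters,
    with an edge-blending overlap (stub value) for picture fusion."""
    return {"position": model.projector_position,
            "angle": model.projector_angle,
            "blend_overlap_px": 120}  # illustrative overlap region

def run_show(install_params: dict, video_file: str, interaction_packet: dict) -> list:
    """S3: install the projector unit, play the video, drive the dynamic seats."""
    return [f"install projectors at {install_params['position']}",
            f"play {video_file}",
            f"seat action: {interaction_packet.get('seat_action', 'none')}"]

env = ProjectionEnvironment("cylindrical", (12.0, 4.0), (15.0, 6.0, 10.0))
params = fuse_edges(simulate_projector(env))
log = run_show(params, "show.mp4", {"seat_action": "tilt"})
print(log)
```

The sketch only makes the data dependencies explicit: S2 consumes the three-dimensional model data produced in S1, and S3 consumes the installation parameters produced in S2, mirroring the unit-to-unit hand-offs recited in the claims.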
CN202010786072.9A 2020-08-07 2020-08-07 System and method for different-surface fusion image interaction based on window platform Active CN112004073B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010786072.9A CN112004073B (en) 2020-08-07 2020-08-07 System and method for different-surface fusion image interaction based on window platform

Publications (2)

Publication Number Publication Date
CN112004073A CN112004073A (en) 2020-11-27
CN112004073B true CN112004073B (en) 2021-11-02

Family

ID=73462765

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010786072.9A Active CN112004073B (en) 2020-08-07 2020-08-07 System and method for different-surface fusion image interaction based on window platform

Country Status (1)

Country Link
CN (1) CN112004073B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114299248A (en) * 2021-12-27 2022-04-08 上海风语筑文化科技股份有限公司 Super-large model projection interactive display system and method

Citations (2)

Publication number Priority date Publication date Assignee Title
CN103729888A (en) * 2013-12-31 2014-04-16 成都有尔科技有限公司 3D projecting device convenient to debug and debugging method thereof
CN104243975A (en) * 2014-09-19 2014-12-24 西安中科晶像光电科技有限公司 Automatic calibration system and method of 3D projector

Family Cites Families (15)

Publication number Priority date Publication date Assignee Title
CN102231099B (en) * 2011-07-06 2014-06-18 清华大学 Method for correcting per-pixel response brightness in multi-projector auto-stereoscopic display
US10282034B2 (en) * 2012-10-14 2019-05-07 Neonode Inc. Touch sensitive curved and flexible displays
CN103533318A (en) * 2013-10-21 2014-01-22 北京理工大学 Building outer surface projection method
US10089778B2 (en) * 2015-08-07 2018-10-02 Christie Digital Systems Usa, Inc. System and method for automatic alignment and projection mapping
JP6428694B2 (en) * 2016-03-29 2018-11-28 ブラザー工業株式会社 Projection control apparatus, program, and projection system
CN106067160B (en) * 2016-06-21 2018-12-11 江苏亿莱顿智能科技有限公司 Large screen merges projecting method
CN206402367U (en) * 2016-12-08 2017-08-11 南京信息工程大学 A kind of seamless optical projection system of three-dimensional scenic ring curtain based on Unity3D
CN107659801B (en) * 2017-05-12 2023-05-09 杭州一隅千象科技有限公司 Cross-arrangement multi-direction ring curtain full-coverage projection method, system and projector
CN107121888B (en) * 2017-07-13 2019-08-30 广西临届数字科技有限公司 The method of ball-screen projection projection
FR3069692B1 (en) * 2017-07-27 2019-08-23 Stephane Brard METHOD AND DEVICE FOR MANAGING THE DISPLAY OF VIRTUAL REALITY IMAGES
CN208849928U (en) * 2018-10-31 2019-05-10 山东金东数字创意股份有限公司 A kind of video image stereoscopic fusion playing device
KR102154684B1 (en) * 2019-09-24 2020-09-10 (주)코드쓰리 System for outputting of multi-projector and method thereof
CN111062869B (en) * 2019-12-09 2023-05-26 北京东方瑞丰航空技术有限公司 Multi-channel correction splicing method for curved curtain
CN111061421B (en) * 2019-12-19 2021-07-20 北京澜景科技有限公司 Picture projection method and device and computer storage medium
CN112203079A (en) * 2020-07-17 2021-01-08 中国科学院空天信息创新研究院 Three-dimensional sphere-oriented visualization system

Non-Patent Citations (1)

Title
Research and Application of Laser Projection Technology in the Installation of Aircraft Duct Brackets; Wang Sheng et al.; Navigation and Control; 2018-06-05 (No. 03); full text *

Also Published As

Publication number Publication date
CN112004073A (en) 2020-11-27

Similar Documents

Publication Publication Date Title
CN109272478B (en) Screen projection method and device and related equipment
CN111698390A (en) Virtual camera control method and device, and virtual studio implementation method and system
CN109005394B (en) A kind of bearing calibration of projected image and projector
CN113252309A (en) Testing method and testing device for near-to-eye display equipment and storage medium
CN105787920B (en) Ball curtain scaling method, calibration system and control device
CN110335307B (en) Calibration method, calibration device, computer storage medium and terminal equipment
CN102801952A (en) Method and device for adjusting video conference system
CN111062869B (en) Multi-channel correction splicing method for curved curtain
CN104574355B (en) Calibration system of stereo camera and calibration method of stereo camera
CN102004378A (en) Method for adjusting projection picture and projection device
CN112004073B (en) System and method for different-surface fusion image interaction based on window platform
CN104978077A (en) Interaction method and interaction system
CN112738491B (en) Correction method of projection reflection picture
CN104113692A (en) Image shooting method and device
CN114845147A (en) Screen rendering method, display picture synthesis method and device and intelligent terminal
CN108269288B (en) Intelligent special-shaped projection non-contact interaction system and method
US20120050276A1 (en) Signal processor, signal processing method, display device and program product
CN109801351A (en) Dynamic image generation method and processing equipment
CN108462860A (en) A kind of method and projection arrangement for realizing projection process
CN116309881A (en) Tripod head camera external parameter measuring and calculating method, device, equipment and medium
CN112019747B (en) Foreground tracking method based on holder sensor
CN115580691A (en) Image rendering and synthesizing system for virtual film production
WO2018195892A1 (en) Method and apparatus for adding three-dimensional stereoscopic watermark, and terminal
CN108346183A (en) A kind of method and system for AR origin reference locations
CN116453456B (en) LED screen calibration method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant