
CN112550778A - Deep space exploration visual imaging environment simulation device and method - Google Patents


Info

Publication number
CN112550778A
CN112550778A
Authority
CN
China
Prior art keywords
environment
subsystem
landform
deep space
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011249378.7A
Other languages
Chinese (zh)
Other versions
CN112550778B (en)
Inventor
孙运达
万雪
李盛阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Technology and Engineering Center for Space Utilization of CAS
Original Assignee
Technology and Engineering Center for Space Utilization of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Technology and Engineering Center for Space Utilization of CAS
Priority to CN202011249378.7A
Publication of CN112550778A
Application granted
Publication of CN112550778B
Current legal status: Active
Anticipated expiration


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G - COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G 7/00 - Simulating cosmonautic conditions, e.g. for conditioning crews

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The invention relates to a system and a method for simulating a deep space exploration visual imaging environment, comprising a space target motion simulation module, a deep space environment simulation projection module and an extraterrestrial planet landform simulation module. The deep space environment simulation projection module provides a simulated deep space environment in a preset environment; the extraterrestrial planet landform simulation module provides a simulated extraterrestrial planet landform environment in the preset environment; and the space target motion simulation module provides a moving space target for the other two modules. The invention establishes a simulation of the space environment in which a space target operates in orbit and of the extraterrestrial planet surface environment, including the ground texture, topographic relief, rocks and other elements of the extraterrestrial planet. It reduces experiment cost, keeps experiment risk under control, is simple and fast to operate, and its simulation environment is highly reusable, so corresponding experimental tasks can be carried out much faster, saving valuable time.

Description

Deep space exploration visual imaging environment simulation device and method
Technical Field
The invention relates to the technical field of environment simulation, in particular to a device and a method for simulating a deep space exploration visual imaging environment.
Background
Physical simulation is an important branch of simulation technology and touches a very wide range of fields, including mechatronics, control, image processing and communication technology. It is also widely applied in engineering: the environment and conditions required by a physical test are reproduced, and the designed physical object is controlled by a computer in a simulation loop, so that the test comes very close to the real situation.
Given the unknown nature of the deep space environment and of the extraterrestrial planet surface environment, the extraterrestrial planet surface and the deep space environment need to be simulated on the ground to ensure that an exploration task can be completed successfully.
Disclosure of Invention
The invention aims to solve the technical problem of the prior art and provides a deep space exploration visual imaging environment simulation device and method.
The technical scheme for solving the technical problems is as follows:
a deep space exploration visual imaging environment simulation system comprises a space target motion simulation module, a deep space environment simulation projection module and an extraterrestrial planet landform simulation module;
the simulated deep space environment projection module is used for providing a simulated deep space environment in a preset environment;
the extraterrestrial planet landform simulation module is used for providing a simulated extraterrestrial planet landform environment in the preset environment;
and the simulated space target motion module is used for providing a moving space target for the simulated deep space environment projection module and the extraterrestrial planet landform simulation module.
The invention has the beneficial effects that: a deep space exploration visual imaging environment simulation system is established, comprising a space target motion simulation module, a deep space environment simulation projection module and an extraterrestrial planet landform simulation module. A simulation of both the space environment in which a space target operates in orbit and the surface environment of an extraterrestrial planet is established, covering elements such as the ground texture, topographic relief and rocks of the extraterrestrial planet. The system reduces experiment cost, keeps experiment risk under control, is simple and fast to operate, and its simulation environment is highly reusable, so corresponding experimental tasks can be completed much faster, saving valuable time.
On the basis of the technical scheme, the invention can be further improved as follows.
Furthermore, the simulated space target motion module comprises a space target subsystem, a track and robot subsystem and a ground control subsystem;
the space target subsystem is used for setting a plurality of satellite models in the preset environment;
the track and robot subsystem is used for arranging an arc-shaped track in the preset environment, and the track robot moves in the arc-shaped track;
the ground control subsystem is used for dispatching the rail robots to prevent collision among the rail robots.
Further, the simulated deep space environment projection module comprises a projection subsystem, an image fusion subsystem and an image processing subsystem;
the projection subsystem is used for projecting pictures on the plurality of curtains;
the image fusion subsystem is used for adjusting the brightness of the images projected by the projection subsystem on the plurality of curtains so that the brightness of the whole image is consistent;
and the image processing subsystem is used for controlling the projection subsystem to project pictures on the plurality of curtains.
Further, the projection subsystem includes at least two projectors, and the at least two projectors project onto a same curtain, wherein at least one projector of the at least two projectors projects onto a first side of the same curtain, and the remaining projectors of the at least two projectors project onto a second side of the same curtain.
Further, the image fusion subsystem is an edge fuser;
the edge fusion device is specifically used for adjusting the projection brightness of the projectors of the same curtain, so that the brightness of the pictures displayed by the same curtain is basically consistent.
Another technical solution of the present invention for solving the above technical problems is as follows: a deep space exploration visual imaging environment simulation method comprises the following steps:
the simulated deep space environment projection module provides a simulated deep space environment in a preset environment;
the extraterrestrial planet landform simulation module provides a simulated extraterrestrial planet landform environment in the preset environment;
and the simulated space target motion module provides a moving space target for the simulated deep space environment projection module and the extraterrestrial planet landform simulation module.
The invention has the beneficial effects that: a deep space exploration visual imaging environment simulation method is established, using a space target motion simulation module, a deep space environment simulation projection module and an extraterrestrial planet landform simulation module. A simulation of both the space environment in which a space target operates in orbit and the surface environment of an extraterrestrial planet is established, covering elements such as the ground texture, topographic relief and rocks of the extraterrestrial planet. The method reduces experiment cost, keeps experiment risk under control, is simple and fast to operate, and its simulation environment is highly reusable, so corresponding experimental tasks can be completed much faster, saving valuable time.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, the simulated spatial target motion module comprises a spatial target subsystem, a track and robot subsystem and a ground control subsystem, and the method comprises the following steps:
the space target subsystem sets a plurality of satellite models in the preset environment;
the track and robot subsystem arranges an arc-shaped track in the preset environment, and the track robot moves in the arc-shaped track;
the ground control subsystem is used for dispatching the rail robots to prevent collision among the rail robots.
Further, the simulated deep space environment projection module comprises a projection subsystem, an image fusion subsystem and an image processing subsystem, and the method comprises the following steps:
the projection subsystem projects pictures on a plurality of curtains;
the image fusion subsystem adjusts the brightness of the images projected by the projection subsystem on the plurality of curtains, so that the brightness of the whole image is consistent;
the image processing subsystem controls the projection subsystem to project pictures on the plurality of curtains.
Further, the projection subsystem includes at least two projectors, and the at least two projectors project onto a same curtain, wherein at least one projector of the at least two projectors projects onto a first side of the same curtain, and the remaining projectors of the at least two projectors project onto a second side of the same curtain.
Further, the image fusion subsystem is an edge fuser, the method comprising:
and the edge fusion device adjusts the projection brightness of the projector of the same curtain, so that the brightness of the pictures displayed by the same curtain is basically consistent.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments of the present invention or in the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a deep space exploration visual imaging environment simulation system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a deep space exploration visual imaging environment simulation system according to another embodiment of the present invention;
FIG. 3 is a diagram illustrating an actual effect of a deep space exploration visual imaging environment simulation system according to another embodiment of the present invention;
fig. 4 is a diagram illustrating an actual effect of a deep space exploration visual imaging environment simulation system according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, shall fall within the scope of protection of the present invention.
Fig. 1 is a schematic structural diagram of a deep space exploration visual imaging environment simulation system according to an embodiment of the present invention, which includes a space target motion simulation module, a deep space environment simulation projection module, and an extraterrestrial planet landform simulation module;
the simulated deep space environment projection module is used for providing a simulated deep space environment in a preset environment;
the extraterrestrial planet landform simulation module is used for providing a simulated extraterrestrial planet landform environment in the preset environment;
and the simulated space target motion module is used for providing a moving space target for the simulated deep space environment projection module and the extraterrestrial planet landform simulation module.
The deep space exploration visual imaging environment simulation system of this embodiment comprises a space target motion simulation module, a deep space environment simulation projection module and an extraterrestrial planet landform simulation module. A simulation of both the space environment in which a space target operates in orbit and the surface environment of an extraterrestrial planet is established, covering elements such as the ground texture, topographic relief and rocks of the extraterrestrial planet. The system reduces experiment cost, keeps experiment risk under control, is simple and fast to operate, and its simulation environment is highly reusable, so corresponding experimental tasks can be completed much faster, saving valuable time.
Based on the above embodiment, further, the simulated spatial target motion module includes a spatial target subsystem, a track and robot subsystem, and a ground control subsystem;
the space target subsystem is used for setting a plurality of satellite models in the preset environment;
the track and robot subsystem is used for arranging an arc-shaped track in the preset environment, and the track robot moves in the arc-shaped track;
the ground control subsystem is used for dispatching the rail robots to prevent collision among the rail robots.
Further, the simulated deep space environment projection module comprises a projection subsystem, an image fusion subsystem and an image processing subsystem;
the projection subsystem is used for projecting pictures on the plurality of curtains;
the image fusion subsystem is used for adjusting the brightness of the images projected by the projection subsystem on the plurality of curtains so that the brightness of the whole image is consistent;
and the image processing subsystem is used for controlling the projection subsystem to project pictures on the plurality of curtains.
Further, the projection subsystem includes at least two projectors, and the at least two projectors project onto a same curtain, wherein at least one projector of the at least two projectors projects onto a first side of the same curtain, and the remaining projectors of the at least two projectors project onto a second side of the same curtain.
It should be understood that the projection subsystem further includes at least one curtain, and each curtain is projected by at least two projection devices, wherein one projector projects to the left of the curtain and the other projector projects to the right of the curtain.
Further, the image fusion subsystem is an edge fuser;
the edge fusion device is specifically used for adjusting the projection brightness of the projectors of the same curtain, so that the brightness of the pictures displayed by the same curtain is basically consistent.
It should be understood that at least two projectors project onto the same screen. For example, when two projectors project onto the same screen at the same time, one onto the left side of the screen and the other onto the right side, there is a region of the screen covered by both projectors, and that region is brighter than the regions covered by only one projector. By arranging the edge fusion device, the brightness of the left projector is linearly attenuated over the right-hand overlap region and the brightness of the right projector is linearly increased over the left-hand overlap region, so that the displayed picture has uniform brightness across the whole screen.
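As a minimal sketch of this kind of linear edge blending (an illustrative assumption, not the fusion equipment's actual implementation), suppose two projector images of equal height overlap by a known number of pixel columns; the left image is faded out and the right image faded in across that overlap:

```python
import numpy as np

def blend_overlap(left_img: np.ndarray, right_img: np.ndarray, overlap: int) -> np.ndarray:
    """Blend two horizontally adjacent projector images whose last/first
    `overlap` columns cover the same physical strip of the screen.

    A linear ramp attenuates the left image and raises the right image across
    the overlap, so the combined strip keeps roughly constant brightness.
    Images are float arrays of shape (H, W, 3) with values in [0, 1].
    """
    ramp = np.linspace(1.0, 0.0, overlap)[None, :, None]   # 1 -> 0 across the overlap

    blended_strip = left_img[:, -overlap:] * ramp + right_img[:, :overlap] * (1.0 - ramp)

    return np.concatenate(
        [left_img[:, :-overlap], blended_strip, right_img[:, overlap:]], axis=1
    )

# Example: two synthetic 1080 x 960 halves with a 120-column overlap.
left = np.full((1080, 960, 3), 0.8)
right = np.full((1080, 960, 3), 0.8)
panorama = blend_overlap(left, right, overlap=120)
print(panorama.shape)  # (1080, 1800, 3)
```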
It should be understood that the plurality of satellite models all move along the track with an orbital robot as the carrying platform, are precisely positioned by ultra-wideband (UWB) technology, and have their motion controlled by the ground control system. The plurality of satellite models are 3D-printed reproductions of satellite 3D models published by the National Aeronautics and Space Administration (NASA).
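To illustrate how such an ultra-wideband positioning step can be computed, the sketch below estimates a tag position from range measurements to fixed anchors by linearized least squares; it is only a generic illustration under assumed anchor positions, not the positioning algorithm used by the described system:

```python
import numpy as np

def uwb_trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a tag position from ranges to known UWB anchors.

    anchors: (N, 3) anchor coordinates, ranges: (N,) measured distances.
    The range equations are linearized against the first anchor and the
    resulting over-determined system is solved by least squares (N >= 4).
    """
    a0, d0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (d0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Four assumed anchors around a 10 m x 10 m x 3 m volume, tag at (4, 7, 1).
anchors = np.array([[0, 0, 0], [10, 0, 3], [0, 10, 3], [10, 10, 0]], dtype=float)
tag = np.array([4.0, 7.0, 1.0])
ranges = np.linalg.norm(anchors - tag, axis=1)
print(uwb_trilaterate(anchors, ranges))  # approximately [4. 7. 1.]
```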
The orbit and robot subsystem adopts an arc-shaped track to simulate the real motion trajectory of a satellite, and the guide rail is based on a trapezoidal guiding technology so that it can bear large-scale satellite models.
The ground control subsystem is a computer that controls the embedded motion control card of the orbital robot through software, setting the orbital robot's direction of motion, speed and rotation. The ground control subsystem is connected to the track robots by wireless communication.
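Because several track robots share the same arc, the ground control subsystem also has to dispatch them without collisions. The patent does not disclose the dispatching algorithm, so the sketch below only illustrates one simple policy under assumed names and values: each robot advances at its commanded speed unless doing so would bring it within a minimum angular separation of the robot ahead of it:

```python
from dataclasses import dataclass

MIN_SEPARATION_DEG = 15.0   # assumed safety margin between robots on the arc
TIME_STEP_S = 0.1           # assumed control period of the dispatcher

@dataclass
class TrackRobot:
    robot_id: int
    angle_deg: float      # current position along the arc track
    speed_deg_s: float    # commanded angular speed

def dispatch_step(robots: list[TrackRobot]) -> None:
    """Advance every robot by one control period, holding a robot in place
    whenever moving would leave it closer than MIN_SEPARATION_DEG to the
    robot ahead of it on the arc."""
    ordered = sorted(robots, key=lambda r: r.angle_deg, reverse=True)
    step = {r.robot_id: r.speed_deg_s * TIME_STEP_S for r in robots}
    for leader, follower in zip(ordered, ordered[1:]):
        gap_after = (leader.angle_deg + step[leader.robot_id]) - (
            follower.angle_deg + step[follower.robot_id])
        if gap_after < MIN_SEPARATION_DEG:
            step[follower.robot_id] = 0.0   # skip this robot's motion this period
    for robot in robots:
        robot.angle_deg += step[robot.robot_id]

# A fast robot catching up with a slower one ahead of it.
robots = [TrackRobot(1, 0.0, 5.0), TrackRobot(2, 16.0, 2.0)]
for _ in range(50):
    dispatch_step(robots)
print([round(r.angle_deg, 1) for r in robots])  # the gap never drops below ~15 degrees
```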
The projection system includes a plurality of projection devices and a plurality of motorized screens. The simulated deep space environment projection achieves seamless fusion of the multiple projections through the graphic processing terminal and the fusion equipment, realizing a bright, high-resolution image environment simulation. For example, the projection system uses six projectors whose physical resolution exceeds 1920 × 1080 and which support 3D display, providing higher picture quality for the image simulation. In addition, the projection system further comprises three electric soft curtains that support 4K high-definition display; the three curtains are placed against three sides of a square, providing a high-quality picture while ensuring a wide field of view.
The image processing terminal is a computer; through software it realizes high-performance display of the pictures on the several curtains and real-time roaming of large-scale scenes at a frame rate of no less than 30 FPS. The image processing terminal communicates with the image fusion system through a wired connection.
The extraterrestrial planet landform simulation system comprises ground equipment that simulates the extraterrestrial planet surface and a plurality of geological models reproduced by 3D printing, providing a more realistic geological and physical environment for the simulation.
As shown in fig. 2, which is a schematic structural diagram of a deep space exploration visual imaging environment simulation system according to another embodiment of the present invention, the simulated deep space environment projection system of this embodiment includes six projection devices 111 to 116, three electric curtains 119 to 121, a fusion machine 117 and a graphics workstation 118.
The projectors 113 and 114 correspond to the curtain 119, the projectors 111 and 112 correspond to the curtain 120, and the projectors 115 and 116 correspond to the curtain 121. The graphics workstation 118 and the fusion machine 117 are connected by wires for data interaction, and the fusion machine 117 and the projectors 111 to 116 are connected by wires for data transmission. The images or video required by the simulation are acquired by the graphics workstation and given the corresponding image preprocessing; the split-screen images are then transmitted by the graphics workstation 118 to the fusion machine 117, which runs a seamless soft-edge fusion algorithm on DSP hardware to fuse the split-screen output: the brightness of the left overlap of an image is linearly weakened and the brightness of the right overlap is linearly strengthened, so that the pictures projected on each curtain keep a consistent brightness and image overlap artifacts are alleviated.
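The split-screen step can be pictured with a small sketch (an assumed layout for illustration, not the workstation's actual software): a wide rendered frame is cut into per-curtain slices that share an overlap strip with their neighbours, and the linear ramps shown in the earlier blending sketch are then applied inside each shared strip by the fusion stage:

```python
import numpy as np

def split_for_projectors(frame: np.ndarray, n_segments: int, overlap: int) -> list[np.ndarray]:
    """Cut a wide frame into n_segments horizontal slices for the projectors.

    Neighbouring slices share `overlap` pixel columns; those shared strips are
    where the fusion stage applies its linear brightness ramps. Assumes the
    slice width works out to a whole number of columns.
    """
    height, width, _ = frame.shape
    core = (width + (n_segments - 1) * overlap) // n_segments  # columns per slice
    slices = []
    for i in range(n_segments):
        start = i * (core - overlap)
        slices.append(frame[:, start:start + core])
    return slices

# Example: a 1080 x 5520 panorama cut into three slices with a 120-column overlap.
panorama = np.zeros((1080, 5520, 3), dtype=np.float32)
parts = split_for_projectors(panorama, n_segments=3, overlap=120)
print([p.shape for p in parts])  # three slices of shape (1080, 1920, 3)
```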
The extraterrestrial planet landform simulation system of the present embodiment includes a plurality of planetary rock models 210 and an extraterrestrial planet surface image 211. This landform simulation system, consisting of the extraterrestrial planet surface image 211 and the geological models 210, is combined with the simulated deep space environment projection system shown in fig. 1, so that the extraterrestrial planet surface is simulated both in imagery and in geology.
The system for simulating the motion of the space object in the embodiment comprises an orbit robot system 310, a satellite model 320 and a ground control subsystem 330.
The orbital robot system 310 is powered through cables laid along the track, and the orbital robot can also spin about its own axis; with the satellite model 320 carried on the orbital robot system 310, a moving space target object can be simulated. The ground control subsystem 330 is wirelessly connected to the robot motion control cards through an IO interface and uses an optimized robot kinematics control algorithm based on the DH (Denavit-Hartenberg) convention to control the motion direction, speed, rotation direction and rotation speed of the several track robots, thereby realizing the orbital motion and rotation of the space targets.
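For readers unfamiliar with DH-based kinematics, the sketch below computes standard Denavit-Hartenberg forward kinematics for an assumed two-link planar arm; the actual DH parameters and control law of the track robots are not disclosed in the patent, so every value here is illustrative:

```python
import numpy as np

def dh_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Homogeneous transform of one joint from standard DH parameters."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows: list[tuple[float, float, float, float]]) -> np.ndarray:
    """Chain the per-joint transforms; returns the end-effector pose."""
    pose = np.eye(4)
    for theta, d, a, alpha in dh_rows:
        pose = pose @ dh_transform(theta, d, a, alpha)
    return pose

# Assumed 2-link planar arm: link lengths 0.5 m and 0.3 m, joints at 30 and 45 degrees.
dh_table = [(np.deg2rad(30), 0.0, 0.5, 0.0),
            (np.deg2rad(45), 0.0, 0.3, 0.0)]
print(forward_kinematics(dh_table)[:3, 3])  # end-effector position
```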
Another technical solution of the present invention for solving the above technical problems is as follows: a deep space exploration visual imaging environment simulation method comprises the following steps:
the simulated deep space environment projection module provides a simulated deep space environment in a preset environment;
the extraterrestrial planet landform simulation module provides a simulated extraterrestrial planet landform environment in the preset environment;
and the simulated space target motion module provides a moving space target for the simulated deep space environment projection module and the extraterrestrial planet landform simulation module.
The deep space exploration visual imaging environment simulation method of this embodiment uses a space target motion simulation module, a deep space environment simulation projection module and an extraterrestrial planet landform simulation module. A simulation of both the space environment in which a space target operates in orbit and the surface environment of an extraterrestrial planet is established, covering elements such as the ground texture, topographic relief and rocks of the extraterrestrial planet. The method reduces experiment cost, keeps experiment risk under control, is simple and fast to operate, and its simulation environment is highly reusable, so corresponding experimental tasks can be completed much faster, saving valuable time.
Based on the above embodiment, further, the simulated spatial target motion module includes a spatial target subsystem, a track and robot subsystem, and a ground control subsystem, and the method includes:
the space target subsystem sets a plurality of satellite models in the preset environment;
the track and robot subsystem arranges an arc-shaped track in the preset environment, and the track robot moves in the arc-shaped track;
the ground control subsystem is used for dispatching the rail robots to prevent collision among the rail robots.
Further, the simulated deep space environment projection module comprises a projection subsystem, an image fusion subsystem and an image processing subsystem, and the method comprises the following steps:
the projection subsystem projects pictures on a plurality of curtains;
the image fusion subsystem adjusts the brightness of the images projected by the projection subsystem on the plurality of curtains, so that the brightness of the whole image is consistent;
the image processing subsystem controls the projection subsystem to project pictures on the plurality of curtains.
The projection subsystem comprises at least two projectors, and the at least two projectors project to the same curtain, wherein at least one projector of the at least two projectors projects to a first edge of the same curtain, and the rest projectors of the at least two projectors project to a second edge of the same curtain.
Further, the image fusion subsystem is an edge fuser, the method comprising:
and the edge fusion device adjusts the projection brightness of the projector projecting to the same curtain, so that the brightness of the pictures displayed by the same curtain is basically consistent.
It should be understood that when two or more projectors are combined to project a picture, parts of the projected images overlap. The edge fusion device gradually reduces the light intensity of the two projectors in the overlap, so that for the same curtain the brightness of the left projector is linearly attenuated over the right-hand overlap and the brightness of the right projector is linearly increased over the left-hand overlap, and the displayed picture has uniform brightness across the whole screen. For example, as shown in the practical effect diagrams of the deep space exploration visual imaging environment simulation system in fig. 3 and fig. 4, the edge fusion device makes the brightness and color of the whole picture presented on the curtains consistent.
It should be understood that, for convenience and brevity of description, only the division of the above-mentioned functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules as required, that is, the internal structure of the apparatus may be divided into different functional units or modules to complete all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium.
Based on such understanding, all or part of the flow of the methods of the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments are implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be suitably increased or decreased as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they can still make modifications to the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some technical features, and these modifications or substitutions do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are included in the protection scope of the present invention.

Claims (10)

1. A deep space exploration visual imaging environment simulation system is characterized by comprising a space target motion simulation module, a deep space environment simulation projection module and an extraterrestrial planet landform simulation module;
the simulated deep space environment projection module is used for providing a simulated deep space environment in a preset environment;
the extraterrestrial planet landform simulation module is used for providing a simulated extraterrestrial planet landform environment in the preset environment;
and the simulated space target motion module is used for providing a moving space target for the simulated deep space environment projection module and the extraterrestrial planet landform simulation module.
2. The deep space exploration visual imaging environment simulation system according to claim 1, wherein said simulated spatial target motion module comprises a spatial target subsystem, a rail and robot subsystem, and a ground control subsystem;
the space target subsystem is used for setting a plurality of satellite models in the preset environment;
the track and robot subsystem is used for arranging an arc-shaped track in the preset environment, and the track robot moves in the arc-shaped track;
the ground control subsystem is used for dispatching the rail robots to prevent collision among the rail robots.
3. The deep space exploration visual imaging environment simulation system according to claim 1, wherein said simulated deep space environment projection module comprises a projection subsystem, an image fusion subsystem and an image processing subsystem;
the projection subsystem is used for projecting pictures on the plurality of curtains;
the image fusion subsystem is used for adjusting the brightness of the images projected by the projection subsystem on the plurality of curtains so that the brightness of the whole image is consistent;
and the image processing subsystem is used for controlling the projection subsystem to project pictures on the plurality of curtains.
4. The deep space exploration visual imaging environment simulation system according to claim 3, wherein the projection subsystem comprises at least two projectors, the at least two projectors project onto a same curtain, wherein at least one projector of the at least two projectors projects onto a first side of the same curtain, and the remaining projectors of the at least two projectors project onto a second side of the same curtain.
5. The deep space exploration visual imaging environment simulation system according to claim 4, wherein said image fusion subsystem is an edge fuser;
the edge fusion device is specifically configured to adjust the brightness of the picture of the same curtain, so that the brightness of the picture displayed by the same curtain is substantially the same.
6. A deep space exploration visual imaging environment simulation method based on the deep space exploration visual imaging environment simulation system of any one of claims 1 to 5, comprising:
the simulated deep space environment projection module provides a simulated deep space environment in a preset environment;
the extraterrestrial planet landform simulation module provides a simulated extraterrestrial planet landform environment in the preset environment;
and the simulated space target motion module provides a moving space target for the simulated deep space environment projection module and the extraterrestrial planet landform simulation module.
7. The deep space exploration visual imaging environment simulation method according to claim 6, wherein said simulated spatial target motion module comprises a spatial target subsystem, a rail and robot subsystem, and a ground control subsystem, the method comprising:
the space target subsystem sets a plurality of satellite models in the preset environment;
the track and the robot subsystem are provided with an arc-shaped track in the preset environment, and the track robot moves in the arc-shaped track;
the ground control subsystem is used for dispatching the rail robots to prevent collision among the rail robots.
8. The deep space exploration visual imaging environment simulation method according to claim 6, wherein the simulated deep space environment projection module comprises a projection subsystem, an image fusion subsystem and an image processing subsystem, the method comprising:
the projection subsystem projects pictures on a plurality of curtains;
the image fusion subsystem adjusts the brightness of the images projected by the projection subsystem on the plurality of curtains, so that the brightness of the whole image is consistent;
the image processing subsystem controls the projection subsystem to project pictures on the plurality of curtains.
9. The deep space exploration visual imaging environment simulation method according to claim 8, wherein the projection subsystem comprises at least two projectors, the at least two projectors project onto a same curtain, wherein at least one projector of the at least two projectors projects onto a first side of the same curtain, and the remaining projectors of the at least two projectors project onto a second side of the same curtain.
10. The deep space exploration visual imaging environment simulation method according to claim 9, wherein said image fusion subsystem is an edge fuser, said method comprising:
and the edge fusion device adjusts the projection brightness of the projector of the same curtain, so that the brightness of the pictures displayed by the same curtain is basically consistent.
CN202011249378.7A 2020-11-10 2020-11-10 Deep space exploration visual imaging environment simulation device and method Active CN112550778B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011249378.7A CN112550778B (en) 2020-11-10 2020-11-10 Deep space exploration visual imaging environment simulation device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011249378.7A CN112550778B (en) 2020-11-10 2020-11-10 Deep space exploration visual imaging environment simulation device and method

Publications (2)

Publication Number Publication Date
CN112550778A (en) 2021-03-26
CN112550778B (en) 2021-08-31

Family

ID=75042882

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011249378.7A Active CN112550778B (en) 2020-11-10 2020-11-10 Deep space exploration visual imaging environment simulation device and method

Country Status (1)

Country Link
CN (1) CN112550778B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1847791A (en) * 2006-05-12 2006-10-18 哈尔滨工业大学 Verification system for fast autonomous deep-space optical navigation control prototype
CN101452655A (en) * 2007-12-04 2009-06-10 北京卫星环境工程研究所 Synthesis simulation test field for lunar surface landform and environment
CN102879014A (en) * 2012-10-24 2013-01-16 北京控制工程研究所 Optical imaging autonomous navigation semi-physical simulation testing system for deep space exploration proximity process
US9194977B1 (en) * 2013-07-26 2015-11-24 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Active response gravity offload and method
CN105466477A (en) * 2015-12-07 2016-04-06 中国科学院光电研究院 A space-based observation simulation system and method targeted at satellite targets and fixed star targets
CN105628055A (en) * 2016-01-06 2016-06-01 北京工业大学 Autonomous optical navigation target imaging analog system for landing of deep space probe
CN205982972U (en) * 2016-06-27 2017-02-22 北京华航展览有限责任公司 Seamless integration projecting system
CN111453005A (en) * 2020-03-31 2020-07-28 上海卫星工程研究所 Reconfigurable small celestial body impact detection target characteristic ground simulation system
CN111537000A (en) * 2020-06-08 2020-08-14 中国科学院微小卫星创新研究院 Ground verification system and method for deep space small celestial body landing segment optical navigation algorithm
CN111637902A (en) * 2020-06-08 2020-09-08 中国科学院微小卫星创新研究院 Ground demonstration verification system and method for remote approach of small deep space celestial body

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114444304A (en) * 2022-01-24 2022-05-06 中国科学院空间应用工程与技术中心 Space task simulation method, system and simulation system
CN114444304B (en) * 2022-01-24 2023-04-07 中国科学院空间应用工程与技术中心 Space task simulation method, system and simulation system

Also Published As

Publication number Publication date
CN112550778B (en) 2021-08-31

Similar Documents

Publication Publication Date Title
US10564917B2 (en) Modular user-traversable display system
DE102016013766A1 (en) Image compensation for a masking, direct-view augmented reality system
JP2005534113A (en) Method and system enabling real-time mixing of composite and video images by a user
CN104298065B (en) 360-degree three-dimensional display device and method based on splicing of multiple high-speed projectors
CN105739934A (en) Multi-screen splicing display processing method and device
CN102156624A (en) Perceptually-based compensation of unintended light pollution of images for display systems
CN1477856A (en) True three-dimensional virtual studio system and its implement method
CN103037189A (en) Method to achieve integrate output of large-size screen video images through much projection
CN112550778B (en) Deep space exploration visual imaging environment simulation device and method
CN109901713A (en) Multi-person cooperative assembly system and method
CN206819048U (en) A kind of ball curtain projection system
CN105787920A (en) Dome screen demarcating method, demarcating system and control device
JP2006524457A (en) Using electronic paper-based screens to improve contrast
CN112462945A (en) Virtual reality-based logistics port collecting operation teaching method, system and medium
CN110636226A (en) Infrared dynamic scene driving control system and method
US20140300713A1 (en) Stereoscopic three dimensional projection and display
DE3306452A1 (en) View simulator
CN112926197A (en) Mine accident scene construction module and emergency rescue comprehensive training and drilling system
US20200265648A1 (en) Systems and methods for providing a virtual reality experience
Magnenat-Thalmann et al. Special cinematographic effects with virtual movie cameras
CN112396683B (en) Shadow rendering method, device, equipment and storage medium for virtual scene
CN115496884A (en) Virtual and real cabin fusion method based on SRWorks video perspective technology
CN103037190A (en) Distributed hardware system to achieve integrate output of large-size screen video images through much projection
Argelaguet et al. Automatic speed graph generation for predefined camera paths
CN107705253B (en) Method and device for generating video excitation source

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant