CN114520903A - Rendering display method, device, storage medium and computer program product - Google Patents
- Publication number
- CN114520903A (application CN202210148247.2A)
- Authority
- CN
- China
- Prior art keywords
- led screens
- display
- shooting
- led
- virtual shooting
- Prior art date: 2022-02-17
- Legal status
- Granted
Classifications
- H—Electricity › H04—Electric communication technique › H04N—Pictorial communication, e.g. television:
  - H04N9/3141—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]; constructional details thereof
  - H04N23/67—Control of cameras or camera modules; focus control based on electronic image sensor signals
  - H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
  - H04N5/2224—Studio circuitry; studio devices; studio equipment related to virtual studio applications
  - H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; cameras specially adapted for the electronic generation of special effects
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
An embodiment of the application provides a rendering display method, device, storage medium, and computer program product. The method includes: obtaining shooting environment information of at least two LED screens used for virtual shooting, the shooting environment information including the sizes and position information of the at least two LED screens; obtaining the display range in which the at least two LED screens are projected onto a target display screen according to the sizes of the at least two LED screens and the size of the target display screen; establishing, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens; and rendering and displaying the images of the LED screens and the virtually shot object in the at least two display areas of the target display screen.
Description
Technical Field
The embodiments of the present application relate to the field of electronic information technologies, and in particular, to a rendering display method, device, storage medium, and computer program product.
Background
With the development of film and television shooting technology, virtual shooting technology using LED screens has emerged: video images are projected on a built LED screen wall to serve as the set for film and television shooting. No on-location set is needed during shooting, which reduces the cost and production period of film and television shooting.
In current virtual shooting technology, the image displayed on the LED screen is output to the screen after processing by a real-time rendering engine, and the image must additionally be prepared with complicated post-production software. This makes virtual shooting for film and television complex to operate and lowers production efficiency.
Disclosure of Invention
In view of the above, embodiments of the present application provide a rendering display method, an apparatus, a storage medium, and a computer program product to at least partially solve the above problems.
According to a first aspect of the embodiments of the present application, there is provided a rendering and display method based on virtual shooting, the method including: acquiring shooting environment information of at least two LED screens used for virtual shooting, the shooting environment information including the sizes and position information of the at least two LED screens; obtaining the display range in which the at least two LED screens are projected onto a target display screen according to the sizes of the at least two LED screens and the size of the target display screen; establishing, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens; and rendering and displaying the images of the LED screens and the virtually shot object in the at least two display areas of the target display screen.
According to a second aspect of the embodiments of the present application, there is provided an electronic device including: a processor, a memory, a communication interface, and a communication bus, wherein the processor, the memory, and the communication interface communicate with one another through the communication bus; the memory is configured to store at least one executable instruction that causes the processor to perform the operations corresponding to the rendering and display method based on virtual shooting of the first aspect.
According to a third aspect of embodiments of the present application, there is provided a storage medium having stored thereon a computer program that, when executed by a processor, implements the virtual photography-based rendering display method according to the first aspect.
According to a fourth aspect of embodiments of the present application, there is provided a computer program product which, when executed by a processor, implements the rendering display method based on virtual photography according to the first aspect.
According to the rendering and display scheme based on virtual shooting provided by the embodiments of the application, the display range in which the at least two LED screens are projected onto the target display screen is obtained from the sizes of the at least two LED screens and the size of the target display screen, and at least two display areas corresponding to the at least two LED screens are then established in the target display screen according to the display range and the positions of the at least two LED screens. The images of the LED screens and the virtually shot object are rendered and displayed in the at least two display areas of the target display screen. Because the display state of the at least two LED screens can be restored directly in the display areas of the target screen, the target screen display does not need to be set manually, which makes film and television shooting with virtual shooting technology simpler and more convenient and improves its production efficiency.
Drawings
To illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them.
Fig. 1 is a schematic view of a rendering display scene based on virtual shooting according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a rendering and displaying method based on virtual shooting according to an embodiment of the present disclosure;
fig. 3 is a schematic view of a screen based on virtual photography according to an embodiment of the present application;
fig. 4 is a schematic view of another screen based on virtual photography according to an embodiment of the present application;
fig. 5 is a schematic page diagram of another rendering and displaying method based on virtual shooting according to an embodiment of the present application;
fig. 6 is a flowchart of another rendering and displaying method based on virtual shooting according to an embodiment of the present disclosure;
fig. 7 is a flowchart of another rendering and displaying method based on virtual shooting according to an embodiment of the present disclosure;
fig. 8 is a flowchart of another rendering and displaying method based on virtual shooting according to an embodiment of the present disclosure;
Fig. 9 is a schematic page diagram of another rendering and displaying method based on virtual shooting according to an embodiment of the present application;
fig. 10 is a flowchart of yet another rendering and displaying method based on virtual shooting according to an embodiment of the present application;
fig. 11 is a flowchart of another rendering and displaying method based on virtual shooting according to an embodiment of the present disclosure;
fig. 12 is a flowchart of a rendering and displaying method based on virtual shooting according to an embodiment of the present disclosure;
fig. 13 is a schematic page diagram of another rendering and displaying method based on virtual shooting according to an embodiment of the present application;
fig. 14 is a flowchart of a rendering and displaying method based on virtual shooting according to an embodiment of the present disclosure;
fig. 15 is a schematic page diagram of another rendering and displaying method based on virtual shooting according to an embodiment of the present application;
fig. 16 is a structural diagram of a rendering display apparatus based on virtual shooting according to an embodiment of the present application;
fig. 17 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To help those skilled in the art better understand the technical solutions in the embodiments of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application shall fall within the protection scope of the embodiments of the present application.
The following further describes specific implementations of embodiments of the present application with reference to the drawings of the embodiments of the present application.
To facilitate understanding, an application scenario of the rendering and display method based on virtual shooting provided by the embodiments of the present application is described first. Fig. 1 is a schematic view of such a scenario. The rendering and display method for virtual shooting shown in fig. 1 is executed on the electronic device 101, which may be a device for executing the rendering and display method based on virtual shooting provided by the embodiments of the present application. The virtually shot image is transmitted to the electronic device 101 in real time, and the electronic device 101 runs the rendering and display method for virtual shooting according to the embodiments of the present disclosure.
The electronic device 101 may be a terminal device such as a smart phone, a tablet computer, a notebook computer, or a vehicle-mounted terminal, or it may be a network device such as a server; this is merely exemplary and is not meant to limit the present application.
The electronic device 101 may access a network and connect to the cloud through the network for data interaction, or the electronic device 101 may itself be a device in the cloud. In the present application, the network includes a Local Area Network (LAN), a Wide Area Network (WAN), and mobile communication networks, such as the World Wide Web (WWW), Long Term Evolution (LTE) networks, 2G (2nd Generation), 3G (3rd Generation), and 5G (5th Generation) mobile networks. The cloud may include various devices connected over a network, such as servers, relay devices, and Device-to-Device (D2D) devices. Of course, this is merely an example and does not limit the present application.
The rendering and display method for virtual shooting provided by the embodiments of the present application is described in detail below with reference to the system shown in fig. 1. It should be noted that fig. 1 is only one application scenario of this method and does not imply that the method must be applied to the scenario shown in fig. 1.
Referring to fig. 2, an embodiment of the present application provides a rendering and display method based on virtual shooting. The method comprises the following steps:
Step 201, obtaining shooting environment information of at least two LED screens used for virtual shooting.
Specifically, the shooting environment information includes: the sizes and position information of the at least two LED screens.
The sizes of the at least two LED screens indicate how large the screens are, and the position information of the at least two LED screens includes the angles between the screens, whether the screens overlap, and other information representing the positional relationship of the at least two LED screens.
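As a concrete illustration, the shooting environment information described above might be organized as in the following minimal Python sketch; the field names (size, position, angle_to_neighbor, overlaps_neighbor) are assumptions for illustration and are not defined by the embodiment itself.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class LedScreen:
    # Physical size of the LED screen (width, height), e.g. in metres.
    size: Tuple[float, float]
    # Position of the screen in the stage's three-dimensional coordinate system.
    position: Tuple[float, float, float]
    # Angle (in degrees) between this screen and an adjacent screen, if known.
    angle_to_neighbor: Optional[float] = None
    # Whether this screen overlaps an adjacent screen.
    overlaps_neighbor: bool = False

@dataclass
class ShootingEnvironment:
    # At least two LED screens are used for virtual shooting.
    screens: List[LedScreen] = field(default_factory=list)
```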
In some implementations of the embodiment of the present application, referring to fig. 3, the at least two LED screens used for virtual shooting are three LED screens: the first LED screen 31 is disposed on the ground, and the second LED screen 32 and the third LED screen 33 are perpendicular to each other and stand on the first LED screen 31. The object of the virtual shooting is positioned inside the enclosure formed by the three LED screens, so that the video images projected by the three LED screens form a set for film and television shooting.
In other implementations of the embodiment of the present application, referring to fig. 4, the at least two LED screens used for virtual shooting are two LED screens: the fourth LED screen 41 is disposed on the ground, and the fifth LED screen 42 is a curved screen standing vertically on the fourth LED screen 41. The object of the virtual shooting is positioned inside the enclosure formed by the two LED screens, so that the video images projected by the two LED screens form a set for film and television shooting.
The embodiment of the present application may also adopt other arrangements of at least two LED screens surrounding the virtually shot object, which is not limited in the embodiment of the present application.
The embodiment of the application adopts at least two LED screens for virtual shooting, thereby forming a set that surrounds the virtually shot object.
Step 202, obtaining the display range in which the at least two LED screens are projected onto the target display screen according to the sizes of the at least two LED screens and the size of the target display screen.
In the embodiment of the present application, the display range in which the at least two LED screens are projected onto the target display screen is determined from the sizes of the at least two LED screens and the size of the target display screen, so that the display range of the at least two LED screens is determined in the target display screen without manual operation by the user.
Specifically, the display range in which the at least two LED screens are projected onto the target display screen is determined according to the proportional relationship between the sizes of the at least two LED screens and the size of the target display screen.
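One plausible reading of this proportional relationship is sketched below: the combined extent of the LED screens is scaled uniformly to fit the target display screen. The side-by-side layout assumption and the names (compute_display_range, screen_sizes, target_size) are hypothetical, not defined by the embodiment.

```python
def compute_display_range(screen_sizes, target_size):
    """Scale the combined extent of the LED screens to fit the target display.

    screen_sizes: list of (width, height) tuples, one per LED screen, in metres.
    target_size: (width_px, height_px) of the target display screen.
    Returns the metres-to-pixels scale factor and the display range in pixels.
    """
    # Combined extent of the LED wall under a simple side-by-side assumption.
    total_w = sum(w for w, _ in screen_sizes)
    total_h = max(h for _, h in screen_sizes)

    # Uniform scale so the whole wall fits inside the target display screen.
    scale = min(target_size[0] / total_w, target_size[1] / total_h)
    return scale, (total_w * scale, total_h * scale)
```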
Step 203, establishing, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens.
Within the display range, in order to restore the at least two LED screens in the target display screen, a plurality of first feature points are taken on the at least two LED screens, and these points are mapped according to their coordinate values in the three-dimensional coordinate system in which the at least two LED screens are located, yielding second feature points corresponding to the first feature points within the display range of the target display screen. The at least two display areas corresponding to the at least two LED screens are then obtained from the plurality of second feature points.
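A minimal sketch of such a feature-point mapping might look as follows, assuming a simple planar projection that drops the depth axis and reuses the scale factor from the previous sketch; the actual mapping would depend on each screen's orientation, and the names here are hypothetical.

```python
def map_feature_points(first_points, scale, origin_px=(0.0, 0.0)):
    """Map 3D feature points taken on the LED screens to 2D points inside
    the display range of the target display screen.

    first_points: list of (x, y, z) coordinates in the stage's 3D system.
    scale: metres-to-pixels factor, e.g. from compute_display_range().
    origin_px: pixel offset of the display range inside the target screen.
    """
    second_points = []
    for x, y, _z in first_points:
        # A simple planar projection: drop the depth axis and apply the scale.
        # A real pipeline would account for each screen's actual orientation.
        second_points.append((origin_px[0] + x * scale, origin_px[1] + y * scale))
    return second_points
```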
By using at least two display areas to restore the at least two LED screens, the embodiment of the application can faithfully reproduce the shooting environment of the virtual shooting, so that the shooting environment can be tested and adjusted.
Step 204, rendering and displaying the images of the LED screens and the virtually shot object in the at least two display areas of the target display screen.
Illustratively, referring to fig. 5, the embodiment of the present application displays, in the target display screen, three display areas corresponding to the three LED screens used for virtual shooting, and displays the images of the LED screens and the virtually shot object in the three display areas.
By rendering and displaying the images of the LED screens and the virtually shot object in the at least two display areas, the embodiment of the application can directly restore the display state of the at least two LED screens in the display areas of the target screen, so that the target screen display does not need to be set manually. This makes film and television shooting with virtual shooting technology simpler and more convenient and improves production efficiency.
Referring to fig. 6, in further implementations of the embodiments of the present application, the method further includes:
Step 205, respectively adjusting the light distribution states of the at least two display areas according to a received first user instruction, so as to simulate the light distribution effect of the virtual shooting.
Illustratively, referring to fig. 5, the embodiment of the present application provides a light distribution state control 51. The user can input the first user instruction through the light distribution state control 51 to adjust the light distribution state and simulate the lighting effect of the virtual shooting in advance.
The embodiment of the application simulates the light distribution effect of the virtual shooting in advance, so that the light distribution of the virtual shooting can be adjusted beforehand and a better virtual shooting effect can be achieved.
Referring to fig. 7, in further implementations of the embodiments of the present application, the method further includes:
Step 206, respectively adjusting the time axes of the at least two display areas according to a received second user instruction, so as to monitor the playing effect of the simulated shooting environment.
Illustratively, referring to fig. 5, the embodiment of the present application provides a time axis control 52 associated with each of the three display areas. The user can input the second user instruction through the three time axis controls 52 to monitor the playing effects of the at least two display areas separately, so that various adjustment operations can be performed on the virtually shot screen images.
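One way to picture an independent time axis per display area is sketched below; the class and method names are hypothetical and the sketch ignores actual video decoding.

```python
class DisplayAreaTimeline:
    """Independent time axis for one display area of the target screen."""

    def __init__(self, duration_s: float):
        self.duration_s = duration_s
        self.current_s = 0.0

    def seek(self, t: float) -> None:
        # Clamp the requested time to the clip's duration.
        self.current_s = max(0.0, min(t, self.duration_s))

# One timeline per display area, each adjusted separately by user instructions.
timelines = {
    "area_1": DisplayAreaTimeline(120.0),
    "area_2": DisplayAreaTimeline(120.0),
    "area_3": DisplayAreaTimeline(120.0),
}
timelines["area_2"].seek(42.5)  # scrub only the second display area
```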
By monitoring the playing effects of the at least two display areas, the embodiment of the application allows various adjustment operations to be performed on the virtual shooting, achieving a better virtual shooting effect.
Referring to fig. 8, in further implementations of the embodiments of the present application, the method further includes:
Step 207, cutting the display area into at least one sub-display area, and performing an image processing operation on each sub-display area.
During virtual shooting, the images may differ in depth of field, for example between foreground, middle ground, and background. Therefore, for each corresponding LED screen, the display area can be divided into more sub-display areas, on which image processing operations such as masking, ordering, scaling, and deforming are allowed, so as to construct more varied depth-of-field effects and support more film and television shooting scenes.
For example, referring to fig. 9, in the embodiment of the present application one display area is divided into three sub-display areas (sub-display area 1, sub-display area 2, and sub-display area 3), and the depth-of-field control 91 corresponding to each of the three sub-display areas can be adjusted, so that image processing operations such as masking, ordering, scaling, and deforming are performed on the sub-display areas to construct more varied depth-of-field effects.
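A sketch of how sub-display areas and their per-layer depth-of-field attributes could be represented is given below; the attribute set (draw order, scale, mask) mirrors the operations named above, but the structure and names are assumptions, not part of the embodiment.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SubDisplayArea:
    rect: Tuple[int, int, int, int]   # (x, y, width, height) inside the parent area
    depth_order: int                  # draw order: lower values are drawn further back
    scale: float = 1.0                # per-layer zoom used to exaggerate depth
    mask: Optional[object] = None     # optional mask image applied to this layer

def split_display_area(area_rect, n):
    """Cut one display area into n equal-width sub-display areas."""
    x, y, w, h = area_rect
    step = w // n
    return [SubDisplayArea(rect=(x + i * step, y, step, h), depth_order=i)
            for i in range(n)]

layers = split_display_area((0, 0, 1920, 1080), 3)  # background, middle ground, foreground
layers[2].scale = 1.1                               # slightly enlarge the foreground layer
```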
Referring to fig. 10, in further implementations of the embodiments of the present application, the method further includes:
Step 208, adding a virtual shooting element and displaying the attributes of the added virtual shooting element, where a virtual shooting element is an element used for the virtual shooting.
For example, referring to fig. 5, in the embodiment of the present application a user may add virtual shooting elements 53 such as cameras and lights, and the attributes 54 of a virtual shooting element are displayed after it is selected.
By adding virtual shooting elements and displaying the attributes of the added elements, the embodiment of the application allows the virtual shooting elements to be adjusted visually, achieving a better virtual shooting effect.
Referring to fig. 11, in further implementations of the embodiments of the present application, the method further includes:
Step 209, performing image brightness reduction processing on the parts of the display areas corresponding to the seams of the at least two LED screens.
By performing image brightness reduction processing on the parts corresponding to the seams of the at least two LED screens, the embodiment of the application weakens the unnatural appearance caused by the seams between the at least two LED screens.
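A minimal sketch of such a brightness reduction is shown below for a vertical seam, assuming the rendered target display content is available as a NumPy image array; the band width and gain values are illustrative, not values specified by the embodiment.

```python
import numpy as np

def soften_seam(image: np.ndarray, seam_x: int, band_px: int = 16,
                min_gain: float = 0.85) -> np.ndarray:
    """Reduce image brightness in a vertical band centred on a seam.

    image: H x W x 3 array holding the rendered target display content.
    seam_x: pixel column where two LED screen display areas meet.
    band_px: half-width of the affected band, in pixels.
    min_gain: brightness multiplier applied exactly at the seam.
    """
    out = image.astype(np.float32)
    h, w, _ = out.shape
    for x in range(max(0, seam_x - band_px), min(w, seam_x + band_px)):
        # The gain ramps from min_gain at the seam back to 1.0 at the band edge.
        t = abs(x - seam_x) / band_px
        out[:, x, :] *= min_gain + (1.0 - min_gain) * t
    return np.clip(out, 0, 255).astype(image.dtype)
```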
Referring to fig. 12, in further implementations of the embodiments of the present application, the method further includes:
Step 210, respectively adjusting parameters of the cameras used for the virtual shooting according to a received third user instruction.
Illustratively, referring to fig. 13, the user adjusts the focus control, aperture control, light-sensing control, and zoom control of the first camera through the third user instruction.
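The following sketch shows one way a third-user-instruction payload could be applied to a virtual camera's focus, aperture, light-sensing (ISO), and zoom parameters; the dataclass fields and the dictionary payload format are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    name: str
    focus_m: float = 3.0      # focus distance in metres
    aperture_f: float = 2.8   # aperture as an f-number
    iso: int = 800            # light-sensing (ISO) setting
    zoom_mm: float = 35.0     # focal length in millimetres

def apply_camera_instruction(camera: VirtualCamera, instruction: dict) -> None:
    """Apply a third-user-instruction payload to one virtual camera.

    Only the parameters present in the payload are updated, e.g.
    {"focus_m": 2.5, "aperture_f": 4.0}.
    """
    for key, value in instruction.items():
        if hasattr(camera, key):
            setattr(camera, key, value)

cam = VirtualCamera(name="first camera")
apply_camera_instruction(cam, {"focus_m": 2.5, "zoom_mm": 50.0})
```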
Because the parameters of the cameras used for the virtual shooting can be adjusted separately according to the third user instruction, the embodiment of the application allows the cameras to be adjusted in real time as the virtually shot object moves, achieving a better virtual shooting effect.
Referring to fig. 14, in further implementations of the embodiments of the present application, the method further includes:
Step 211, respectively adjusting parameters of the lights used for the virtual shooting according to a received fourth user instruction.
Because the parameters of the lights used for the virtual shooting can be adjusted separately according to the fourth user instruction, the embodiment of the application allows the lights to be adjusted in real time as the virtually shot object moves, achieving a better virtual shooting effect.
Illustratively, referring to fig. 15, the user adjusts the color control, brightness control, contrast control, and area control of the first light through the fourth user instruction.
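As a companion sketch for the light parameters, the snippet below models a light with colour, brightness, contrast, and an illuminated area, and re-centres that area as the virtually shot object moves; the structure and the tracking helper are hypothetical, not part of the embodiment.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class VirtualLight:
    color: Tuple[int, int, int] = (255, 255, 255)       # RGB colour of the light
    brightness: float = 1.0                             # relative intensity
    contrast: float = 1.0                               # contrast applied within the lit area
    area: Tuple[int, int, int, int] = (0, 0, 512, 512)  # lit region (x, y, width, height)

def follow_subject(light: VirtualLight, subject_xy: Tuple[int, int]) -> None:
    """Re-centre the light's area on the virtually shot object as it moves."""
    _, _, w, h = light.area
    light.area = (subject_xy[0] - w // 2, subject_xy[1] - h // 2, w, h)

rim_light = VirtualLight(color=(255, 230, 200), brightness=0.8)
follow_subject(rim_light, (960, 540))  # keep the light centred on the subject's new position
```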
To sum up, the embodiments of the present application can adjust the virtual screen images, the cameras, and the lights separately through the second, third, and fourth user instructions, so that the screen images, cameras, and lights in the virtual shooting change correspondingly as the virtually shot object moves, achieving a better tracking effect on the virtually shot object.
Based on the method described in the foregoing embodiment, referring to fig. 16, an embodiment of the present application further provides a rendering and displaying device based on virtual shooting, where the device includes:
An information obtaining module 161, configured to obtain shooting environment information of at least two LED screens used for virtual shooting, the shooting environment information including the sizes and position information of the at least two LED screens;
a range determining module 162, configured to obtain, according to the sizes of the at least two LED screens and the size of the target display screen, the display range in which the at least two LED screens are projected onto the target display screen;
an area determining module 163, configured to establish, in the target display screen, at least two display areas corresponding to the at least two LED screens according to the display range and the positions of the at least two LED screens; and
an image display module 164, configured to render and display the images of the LED screens and the virtually shot object in the at least two display areas of the target display screen.
According to the rendering and display device based on virtual shooting provided by the embodiments of the application, the display range in which the at least two LED screens are projected onto the target display screen is obtained from the sizes of the at least two LED screens and the size of the target display screen, and at least two display areas corresponding to the at least two LED screens are then established in the target display screen according to the display range and the positions of the at least two LED screens. The images of the LED screens and the virtually shot object are rendered and displayed in the at least two display areas of the target display screen. Because the display state of the at least two LED screens can be restored directly in the display areas of the target screen, the target screen display does not need to be set manually, which makes film and television shooting with virtual shooting technology simpler and more convenient and improves its production efficiency.
Based on the methods described in the foregoing embodiments, an embodiment of the present application provides an electronic device configured to execute those methods. Referring to fig. 17, a schematic structural diagram of an electronic device according to an embodiment of the present application is shown; the specific embodiments of the present application do not limit the concrete implementation of the electronic device.
As shown in fig. 17, the electronic device 170 may include: a processor 1702, a communication interface 1704, a memory 1706, and a communication bus 1708.
Wherein:
the processor 1702, communication interface 1704, and memory 1706 communicate with each other via a communication bus 1708.
A communication interface 1704 for communicating with other electronic devices or servers.
The processor 1702 is configured to execute the program 1710, and may specifically execute relevant steps in the embodiment of the rendering and displaying method based on virtual shooting.
In particular, the program 1710 may include program code that includes computer operating instructions.
The processor 1702 may be a central processing unit (CPU), an Application-Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application. The one or more processors included in the intelligent device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
The memory 1706 stores the program 1710. The memory 1706 may comprise high-speed RAM memory and may also include non-volatile memory (e.g., at least one disk memory).
The program 1710 may be specifically configured to be executed by the processor 1702 to implement the rendering and displaying method based on virtual photography described in the above embodiment. For specific implementation of each step in the program 1710, reference may be made to the corresponding steps and corresponding descriptions in the units in the foregoing embodiment of the rendering and displaying method based on virtual shooting, which are not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
Based on the methods described in the above embodiments, embodiments of the present application provide a computer storage medium, on which a computer program is stored, which when executed by a processor implements the methods described in the above embodiments.
Based on the methods described in the above embodiments, the embodiments of the present application provide a computer program product, which when executed by a processor implements the methods described in the above embodiments.
It should be noted that, according to implementation needs, each component/step described in the embodiment of the present application may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present application.
The above-described methods according to the embodiments of the present application may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, RAM, floppy disk, hard disk, or magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium, downloaded through a network, and stored in a local recording medium, so that the methods described herein can be processed by such software on a recording medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or FPGA. It can be understood that a computer, processor, microprocessor controller, or programmable hardware includes storage components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the methods described herein. Furthermore, when a general-purpose computer accesses code for implementing the methods shown herein, execution of the code transforms the general-purpose computer into a special-purpose computer for executing those methods.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The above embodiments are intended only to illustrate the embodiments of the present application, not to limit them. Those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present application, so all equivalent technical solutions also fall within the scope of the embodiments of the present application, and the patent protection scope of the embodiments of the present application shall be defined by the claims.
Claims (12)
1. A rendering and displaying method based on virtual shooting, the method comprising:
obtaining shooting environment information of at least two LED screens for virtual shooting, wherein the shooting environment information comprises: the size and position information of the at least two LED screens;
Obtaining the display range of the at least two LED screens projected to the target display screen according to the sizes of the at least two LED screens and the size of the target display screen;
establishing at least two display areas corresponding to the at least two LED screens in the target display screen according to the display range and the positions of the at least two LED screens;
rendering and displaying images of the LED screens and the virtually photographed object in the at least two display areas of the target display screen.
2. The method of claim 1, wherein the method further comprises:
and respectively adjusting the light distribution states of the at least two display areas according to the received first user instruction so as to simulate the light distribution effect of the virtual shooting.
3. The method of claim 1, wherein the method further comprises:
and respectively adjusting the time axes of the at least two display areas according to the received second user instruction so as to monitor the playing effect of the virtual shooting.
4. The method of claim 1, wherein the method further comprises:
and cutting the display area into at least one sub-display area, and respectively carrying out image processing operation on each sub-display area.
5. The method of claim 1, wherein the method further comprises:
adding a virtual shooting element, and displaying the attribute of the added virtual shooting element, wherein the virtual shooting element is an element adopted for virtual shooting.
6. The method of claim 1, wherein the method further comprises:
and carrying out image brightness reduction treatment on the part, corresponding to the joint of the at least two LED screens, in the display area.
7. The method of claim 1, wherein the method further comprises:
and respectively adjusting the parameters of the camera adopted by the virtual shooting according to the received third user instruction.
8. The method of claim 1, wherein the method further comprises:
and respectively adjusting parameters of the light adopted by the virtual shooting according to the received fourth user instruction.
9. A rendering display apparatus based on virtual photographing, the apparatus comprising:
the information acquisition module is used for acquiring shooting environment information of at least two LED screens for virtual shooting, and the shooting environment information comprises: the sizes and the position information of the at least two LED screens;
the range determining module is used for obtaining the display ranges of the at least two LED screens projected to the target display screen according to the sizes of the at least two LED screens and the size of the target display screen;
The area determining module is used for establishing at least two display areas corresponding to the at least two LED screens in the target display screen according to the display range and the positions of the at least two LED screens;
and the image display module is used for rendering and displaying the images of the LED screen and the virtual shot object in at least two display areas of the target display screen.
10. An electronic device, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface are communicated with each other through the communication bus;
the memory is used for storing at least one executable instruction which causes the processor to execute the operation corresponding to the method of any one of claims 1-8.
11. A storage medium having stored thereon a computer program which, when executed by a processor, carries out the method according to any one of claims 1-8.
12. A computer program product which, when executed by a processor, implements the method of any one of claims 1-8.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202210148247.2A (granted as CN114520903B) | 2022-02-17 | 2022-02-17 | Rendering display method, rendering display device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202210148247.2A (granted as CN114520903B) | 2022-02-17 | 2022-02-17 | Rendering display method, rendering display device, electronic equipment and storage medium
Publications (2)
Publication Number | Publication Date
---|---
CN114520903A | 2022-05-20
CN114520903B | 2023-08-08
Family
ID=81598817
Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202210148247.2A (granted as CN114520903B, active) | Rendering display method, rendering display device, electronic equipment and storage medium | 2022-02-17 | 2022-02-17

Country Status (1)

Country | Link
---|---
CN | CN114520903B (en)
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117440110A (en) * | 2023-10-23 | 2024-01-23 | 神力视界(深圳)文化科技有限公司 | Virtual shooting control method, medium and mobile terminal |
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180088889A1 (en) * | 2016-09-29 | 2018-03-29 | Jiang Chang | Three-dimensional image formation and color correction system and method |
CN107948466A (en) * | 2017-11-23 | 2018-04-20 | 北京德火新媒体技术有限公司 | A kind of three-dimensional scene construction method and system for video program production |
CN109961495A (en) * | 2019-04-11 | 2019-07-02 | 深圳迪乐普智能科技有限公司 | A kind of implementation method and VR editing machine of VR editing machine |
US11107195B1 (en) * | 2019-08-23 | 2021-08-31 | Lucasfilm Entertainment Company Ltd. | Motion blur and depth of field for immersive content production systems |
CN110806847A (en) * | 2019-10-30 | 2020-02-18 | 支付宝(杭州)信息技术有限公司 | Distributed multi-screen display method, device, equipment and system |
US20210342971A1 (en) * | 2020-04-29 | 2021-11-04 | Lucasfilm Entertainment Company Ltd. | Photogrammetric alignment for immersive content production |
CN111766951A (en) * | 2020-09-01 | 2020-10-13 | 北京七维视觉科技有限公司 | Image display method and apparatus, computer system, and computer-readable storage medium |
CN112040092A (en) * | 2020-09-08 | 2020-12-04 | 杭州时光坐标影视传媒股份有限公司 | Real-time virtual scene LED shooting system and method |
CN113129814A (en) * | 2021-04-23 | 2021-07-16 | 浙江博采传媒有限公司 | Color correction method and system applied to virtual production of LED (light-emitting diode) ring screen |
CN113556443A (en) * | 2021-07-20 | 2021-10-26 | 北京星光影视设备科技股份有限公司 | LED screen real-time correction method facing virtual playing environment and virtual playing system |
CN113810612A (en) * | 2021-09-17 | 2021-12-17 | 上海傲驰广告文化集团有限公司 | Analog live-action shooting method and system |
CN113905145A (en) * | 2021-10-11 | 2022-01-07 | 浙江博采传媒有限公司 | LED circular screen virtual-real camera focus matching method and system |
CN114051129A (en) * | 2021-11-09 | 2022-02-15 | 北京电影学院 | Film virtualization production system and method based on LED background wall |
Also Published As
Publication number | Publication date |
---|---|
CN114520903B (en) | 2023-08-08 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |
2023-08-04 | TA01 | Transfer of patent application right | Applicant after: Shenli Vision (Shenzhen) Cultural Technology Co., Ltd., Room 602, Building S1, Alibaba Cloud Building, No. 3239 Keyuan Road, Ulan Coast Community, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province, 518054. Applicant before: Alibaba (China) Co., Ltd., Room 508, 5/F, Building 4, No. 699 Wangshang Road, Changhe Street, Binjiang District, Hangzhou City, Zhejiang Province.