
CN112866573B - Display, fusion display system and image processing method - Google Patents

Display, fusion display system and image processing method

Info

Publication number: CN112866573B
Application number: CN202110043954.0A
Authority: CN (China)
Prior art keywords: image, display, pixel, display data, position information
Legal status: Active (status as listed; not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN112866573A
Inventors: 彭金豹, 苗京花, 王雪丰
Current assignee: BOE Technology Group Co Ltd; Beijing BOE Optoelectronics Technology Co Ltd
Original assignee: BOE Technology Group Co Ltd; Beijing BOE Optoelectronics Technology Co Ltd
Application filed by BOE Technology Group Co Ltd and Beijing BOE Optoelectronics Technology Co Ltd
Priority to application CN202110043954.0A
Publication of application CN112866573A
Application granted; publication of CN112866573B

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 - Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 - Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/472 - End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 - End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Liquid Crystal Display Device Control (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

An embodiment of the application provides a display, a fusion display system, and an image processing method. The fusion display system uses a superimposer to superimpose a first image and a second image, obtaining the display data of the processed first image; that data is transmitted to the backlit liquid crystal screen for display, while the display data of the second image is transmitted directly to the non-backlit liquid crystal screen. Because the area of the second image is usually smaller than that of the first image, its display data is smaller, so the transmitted data volume is reduced, lowering the bandwidth required for data transmission and shortening the transmission time. At the same time, image fusion only requires replacing the color information of the pixels at the corresponding positions, which reduces the difficulty of fusion and greatly shortens the time it takes, so the fused image can be displayed quickly and motion sickness for the user is avoided.

Description

Display, fusion display system and image processing method
Technical Field
The application relates to the field of display technology, and in particular to a display, a fusion display system, and an image processing method.
Background
Scene fusion display technology fuses a given picture into a display scene. In the current processing flow, an image captured by a camera is transmitted to a computer, the computer fuses the captured image with a computer-generated image, and the fused result is transmitted to a display for display.
In the above display scheme, the camera is connected to the computer, and the computer to the display. The transmitted data is complete image data, so transmitting a high-definition image requires a high transmission bandwidth. Because complete pictures are transmitted, both the transmission time and the image fusion time are long; the interval between two adjacent frames shown on the display therefore grows, easily causing motion sickness in the user.
Disclosure of Invention
To address the shortcomings of the existing approach, the application provides a display, a fusion display system, and an image processing method that reduce the transmission-bandwidth requirements of image fusion and fused-image display, increase transmission speed, shorten image fusion time, and avoid causing motion sickness to the user.
In a first aspect, an embodiment of the present application provides a display, including:
a backlit liquid crystal screen configured to display according to received display data of a processed first image, wherein the processed first image contains a replacement area identical to the area occupied by a second image, and the color information of the pixels in the replacement area is white;
and a non-backlit liquid crystal screen, located on the light-emitting side of the backlit liquid crystal screen, configured to display according to received display data of the second image, wherein the color information of the pixels in the display area of the non-backlit liquid crystal screen outside the area occupied by the second image is white.
Optionally, a superimposer is integrated in the display. The superimposer is electrically connected to the backlit liquid crystal screen and is configured to determine, from the position information of each pixel in the second image, the position information of the pixels to be replaced in the first image, and to replace the color information of those pixels with white to obtain the display data of the processed first image.
In a second aspect, an embodiment of the present application provides a scene fusion display system, including:
a first image device configured to acquire display data of a first image and to transmit it to a superimposer, the display data of the first image comprising the position information and color information of each pixel in the first image;
a second image device configured to acquire a second image, to send the position information of each pixel in the second image to the superimposer, and to send the display data of the second image to a display, the display data of the second image comprising the position information and color information of each pixel in the second image;
the superimposer, electrically connected to both the first image device and the second image device, configured to determine the position information of the pixels to be replaced in the first image from the position information of each pixel in the second image, and to replace the color information of those pixels with white to obtain the display data of the processed first image;
and the display, comprising a backlit liquid crystal screen and a non-backlit liquid crystal screen located on the light-emitting side of the backlit liquid crystal screen; the backlit liquid crystal screen is electrically connected to the superimposer and configured to display according to the received display data of the processed first image, and the non-backlit liquid crystal screen is electrically connected to the second image device and configured to display according to the received display data of the second image.
Optionally, the first image device is a camera or a computer, and the second image device is a camera or a computer.
Optionally, the superimposer is integrated within the display or the first image device.
Optionally, the superimposer comprises: a position information conversion module configured to receive the position information of each pixel in the second image and determine the coordinates of each pixel in the second image; and a superimposing module configured to replace with white the display information of the pixels in the first image whose coordinates match those of pixels in the second image, to obtain the display data of the processed first image.
Optionally, the superimposing module is specifically configured to replace with white, by a bitwise OR, the display information of the pixels in the first image whose coordinates match those of pixels in the second image.
Optionally, the second image device is further configured to transmit the position information of each pixel in the second image to the superimposer and to the non-backlit liquid crystal screen in the form of a sparse matrix.
In a third aspect, an embodiment of the present application provides an image processing method, including:
receiving display data of a first image and the position information of each pixel in a second image, the display data of the first image comprising the position information and color information of each pixel in the first image;
replacing with white the color information of each pixel in the first image whose position information matches that of a pixel in the second image, to obtain display data of the processed first image;
and transmitting the display data of the processed first image to a display for display.
Optionally, replacing with white the color information of each pixel in the first image whose position matches that of a pixel in the second image comprises: replacing with white, by a bitwise OR, the color information of the pixels in the first image whose coordinates match those of pixels in the second image.
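The replace-with-white step works by bitwise OR because OR-ing any 8-bit channel value with 255 yields 255. A minimal sketch of this operation — the function name and the dictionary-based image layout are illustrative assumptions, not the patent's implementation:

```python
# Illustrative sketch: replace-with-white via bitwise OR.
# first_image maps (x, y) -> (r, g, b); second_positions lists the
# pixel coordinates occupied by the second image.

def fuse(first_image, second_positions):
    """Return the processed first-image data: every pixel that shares a
    position with the second image is forced to white by OR-ing each
    channel with 0xFF (v | 0xFF == 255 for any 8-bit v)."""
    processed = dict(first_image)
    for pos in second_positions:
        if pos in processed:
            r, g, b = processed[pos]
            processed[pos] = (r | 0xFF, g | 0xFF, b | 0xFF)
    return processed

img = {(0, 0): (10, 20, 30), (0, 1): (40, 50, 60)}
out = fuse(img, [(0, 1)])
print(out[(0, 1)])  # (255, 255, 255)
print(out[(0, 0)])  # (10, 20, 30) -- untouched
```

Only the positions of the second image need travel to the superimposer for this step; the first image's color data never leaves the display path.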
The technical scheme provided by the embodiment of the application has the following beneficial technical effects:
according to the display, the fusion display system and the image processing method, the first image and the second image are superposed by the aid of the superimposer to obtain the display data of the processed first image, the display data of the processed first image are transmitted to the liquid crystal screen with the backlight to be displayed, the display data of the second image are directly transmitted to the liquid crystal screen without the backlight to be displayed, the area of the second image is usually smaller than that of the first image, and the display data are smaller, so that the transmitted data volume is reduced, the bandwidth requirement of data transmission is reduced, the data transmission time is shortened, meanwhile, the image fusion only needs to replace the color information of the pixels at the corresponding positions, the difficulty of image fusion is reduced, the time required by the image fusion is greatly reduced, the fused image can be displayed rapidly, and motion sickness of a user is avoided.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a diagram of a prior art display system;
fig. 2 is a schematic structural diagram of a display according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of another display according to an embodiment of the present disclosure;
fig. 4 is an architecture diagram of a fusion display system according to an embodiment of the present application;
fig. 5 is a specific architecture diagram of a fusion display system according to an embodiment of the present application;
fig. 6 is a schematic diagram of a frame structure of an overlay module in a fusion display system according to an embodiment of the present disclosure;
fig. 7 is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a first image according to an embodiment of the present application;
fig. 9 is a schematic diagram of the result of fusing the first image of fig. 8 with a second image.
Reference numerals:
1-a display; 11-backlit liquid crystal screen; 12-non-backlit liquid crystal screen;
2-a first image device;
3-a second image device;
4-a superimposer; 41-position information conversion module; 42-stacking module.
Detailed Description
The present application is described in detail below and examples of embodiments of the present application are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements with the same or similar functionality throughout. In addition, if a detailed description of the known art is unnecessary for the features of the present application shown, it is omitted. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As used herein, the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As shown in fig. 1, in the current scene fusion display technology, image data must be transmitted from the camera to a computer and then from the computer to the display. The transmitted data is complete image data, so a high-definition image requires a high transmission bandwidth. Because complete pictures are transmitted, both the transmission time and the image fusion time are long, so the interval between two adjacent frames shown on the display grows, easily causing motion sickness in the user.
The application provides a display, a scene fusion display system, and an image processing method to solve these technical problems in the prior art.
The following describes the technical solutions of the present application, and how they solve the above technical problems, with specific embodiments.
As shown in fig. 2, the display 1 includes a backlight liquid crystal panel 11 and a non-backlight liquid crystal panel 12 disposed at a side of a light emitting surface of the backlight liquid crystal panel 11.
The backlit liquid crystal panel 11 is configured to display according to the received display data of the processed first image P1', and the non-backlit liquid crystal panel 12 is configured to display according to the received display data of the second image P2.
The display data of the first image P1 includes the position information and color information of each pixel in the first image P1, and the display data of the second image P2 likewise includes the position information and color information of each pixel in the second image P2. The display data of the processed first image P1' is the display data of the first image P1 in which the color information of the pixels sharing a position with the second image P2 has been replaced with white.
Specifically, the default color information of each pixel in the non-backlit liquid crystal panel 12 may be set to white; in that case, only the pixels in the area where the second image P2 is located display according to the color information of the second image.
Specifically, the position information may be expressed as coordinates. For example, with the first pixel of the first row taken as the origin, each pixel in the first display screen has unique coordinates; a pixel with position information (x, y) is located in row x+1 and column y+1. The color information may be represented in RGB format; for example, if the color information of a pixel is (a, b, c), then a is the gray-scale value of the red sub-pixel, b that of the green sub-pixel, and c that of the blue sub-pixel.
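The position/color encoding just described can be modeled with simple tuples. This sketch is purely illustrative (the helper name and dictionary layout are assumptions, not the patent's data format):

```python
# Hypothetical sketch of the pixel encoding described above:
# position (x, y) denotes row x+1, column y+1 (origin at the first
# pixel of the first row); color (a, b, c) holds the 8-bit gray
# levels of the red, green and blue sub-pixels.

def make_pixel(x, y, r, g, b):
    """Bundle one pixel's position and color information."""
    assert 0 <= r <= 255 and 0 <= g <= 255 and 0 <= b <= 255
    return {"position": (x, y), "color": (r, g, b)}

origin = make_pixel(0, 0, 255, 255, 255)  # top-left pixel, white
print(origin["position"])  # (0, 0)
print(origin["color"])     # (255, 255, 255)
```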
Note that the first image P1 is usually a background image, and the second image P2 is another image fused onto it with the first image P1 as background. For example, as shown in fig. 5, the first image P1 is a grassland and the second image P2 is a cat, and the intended fusion effect is the cat in the grassland. Equally, the first image P1 may be generated by a computer and the second image P2 captured by a camera, for example to blend a captured portrait into a virtual background.
In the display 1 provided by the embodiment of the application, the backlit liquid crystal panel 11 displays the processed first image P1' and the non-backlit liquid crystal panel 12 displays the second image P2. The color information of the pixels in the area of the processed first image P1' that coincides with the second image P2 is white, so more of the backlight passes through the replacement area and serves as the backlight of the second image P2; the color information of the pixels of the non-backlit liquid crystal panel 12 outside the area of the second image P2 is white, so the light displaying the processed first image P1' passes through the non-backlit liquid crystal panel 12. Together these achieve the display effect of blending the first image P1 and the second image P2. Achieving scene fusion with the display 1 does not require transmitting the first image P1 to a computer; only the position information of the second image P2 is needed, and the color information of the corresponding pixels in the first image P1 is replaced according to the position information of each pixel in the second image P2. This shortens the image transmission process, reduces the required transmission bandwidth, makes the fused display smoother, and avoids motion sickness for the user.
Optionally, as shown in fig. 3, the display 1 provided in this embodiment further integrates a superimposer, which is electrically connected to the backlit liquid crystal screen 11, and is configured to determine the position information of the pixel to be replaced in the first image P1 according to the position information of each pixel in the second image P2, and replace the color information of the pixel to be replaced in the first image P1 with white to obtain the display data of the processed first image P1'.
The camera and the computer are constrained by their system architectures, so their transmission protocols are difficult to modify; this is why the fused display system in the prior art has long transmission times and high bandwidth requirements. Between the camera and the display 1, however, high-speed transmission hardware and protocols can be adopted, for example a Mobile Industry Processor Interface (MIPI), which greatly increases transmission speed. Because the display data of the second image is usually small, the time required to transmit it from the computer to the superimposer integrated in the display 1 is short, further reducing the data transmission time.
Specifically, the display 1 further includes a driving chip, a power supply, and the like. The power supply supplies power to the backlight liquid crystal panel 11 and the non-backlight liquid crystal panel 12, and the driving chips may include a first driving chip and a second driving chip. The first driving chip is electrically connected with the backlight liquid crystal screen 11, and the second driving chip is electrically connected with the non-backlight liquid crystal screen 12.
In addition, the display 1 further includes a housing for protecting the backlit liquid crystal panel 11, the non-backlit liquid crystal panel 12, the driving chips, and so on, and a signal interface for signal transmission between the display 1 and other electronic devices (e.g., a computer or a video camera).
Based on the same inventive concept, an embodiment of the present application further provides a scene fusion display system, as shown in fig. 4, the scene fusion display system includes:
a first image device 2 configured to acquire display data of the first image P1, the display data of the first image P1 including position information of each pixel and color information of each pixel in the first image P1, and to transmit the display data of the first image to the superimposer 4. In particular, the first image device 2 may be a camera, or may be other equipment for capturing images or generating background images, such as a computer.
A second image device 3 configured to acquire display data of the second image P2, to send the position information of each pixel in the second image to the superimposer 4, and to send the display data of the second image to the display 1; the display data of the second image P2 includes the position information and color information of each pixel in the second image P2. Specifically, the second image device 3 may be an electronic apparatus such as a computer that generates an image or a camera that captures one.
A superimposer 4 configured to determine positional information of a pixel to be replaced in the first image P1 from the positional information of each pixel in the second image P2, and to replace color information of the pixel to be replaced in the first image P1 with white to obtain display data of the processed first image P1'.
The display 1 comprises a backlight liquid crystal screen 11 and a backlight-free liquid crystal screen 12 positioned on one side of a light-emitting surface of the backlight liquid crystal screen 11, wherein the backlight liquid crystal screen 11 is electrically connected with the superimposer 4 and is configured to display according to the received display data of the processed first image P1', and the backlight-free liquid crystal screen 12 is electrically connected with the second image device 3 and is configured to display according to the received display data of the second image P2.
Specifically, the default color information of each pixel in the non-backlit liquid crystal panel 12 may be set to white, and thus, only the color information of the pixel in the area where the second image P2 is located may be displayed according to the color information of the second image.
In the scene fusion display system provided by this embodiment, the superimposer 4 superimposes the first image P1 and the second image P2 to obtain the display data of the processed first image P1'. That data is transmitted to the backlit liquid crystal screen 11 for display, while the display data of the second image P2 is transmitted directly to the non-backlit liquid crystal screen 12. Because the area of the second image P2 is usually smaller than that of the first image P1, its display data is smaller, so the transmitted data volume is reduced; this lowers the bandwidth requirement of data transmission and shortens the transmission time. Because image fusion only requires replacing the color information of the pixels at the corresponding positions, both the difficulty of fusion and the time it requires are greatly reduced, so the fused image can be displayed quickly and motion sickness for the user is avoided.
Optionally, in the scene fusion display system provided in this embodiment, the superimposer 4 is integrated in the first image device 2 or in the display 1. The transmission protocol between the camera and the computer is difficult to modify because of their system architectures, whereas high-speed transmission hardware and protocols, for example a Mobile Industry Processor Interface (MIPI), can be used between the camera and the display 1, greatly increasing transmission speed. Because the display data of the second image is usually small, the time required for the computer to transmit it to the superimposer 4 (integrated in the camera or in the display 1) is short, which also reduces the data transmission time.
As shown in fig. 5, this embodiment provides a fusion display system in which a camera serves as the first image device and a computer as the second image device. Specifically, the camera acquires a first image P1, a grassland; the computer generates a second image P2, a cat. The superimposer 4 receives the display data of the first image P1 and the position information of each pixel in the second image P2, and replaces with white the color information of the pixels in the first image P1 whose positions match those of pixels in the second image P2, obtaining the display data of the processed first image P1', which it sends to the backlit liquid crystal screen 11; the display data of the second image P2 is sent to the non-backlit liquid crystal screen 12. The display effect realized by the display 1 is then the cat in the grassland.
Optionally, as shown in fig. 6, in the scene fusion display system provided in this embodiment, the superimposer 4 includes a position information conversion module 41 and a superimposing module 42. The position information conversion module 41 is configured to receive the position information of each pixel in the second image and determine the coordinates of each pixel in the second image. In particular, the superimposing module 42 is configured to replace the display information of pixels in the first image P1 having the same coordinates as the pixels in the second image P2 with white to obtain the processed display data of the first image P1'.
Further, the superimposing module 42 is specifically configured to replace with white, by a bitwise OR, the display information of the pixels in the first image P1 whose coordinates match those of pixels in the second image P2. Specifically, the superimposing module 42 buffers the display data of the first image P1 and the position information of each pixel in the second image P2, then performs a bitwise OR on the color information of the pixels at the corresponding positions in the first image P1 according to the buffered position information, yielding the display data of the processed first image P1'.
Further, in the scene fusion display system provided by this embodiment, the second image device 3 is configured to transmit the position information of each pixel in the second image to the superimposer 4 and to the non-backlit liquid crystal screen 12 in the form of a sparse matrix. That is, the position information of each pixel in the second image is transmitted to the superimposer 4 as a sparse matrix, and the display data of the second image is transmitted to the non-backlit liquid crystal screen 12 as a sparse matrix (position information) plus color information, so that the non-backlit liquid crystal screen 12 replaces the color information at the corresponding positions. Transmitting data as a sparse matrix, optionally with a compression algorithm, saves transmission bandwidth and preserves the image frame rate.
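The sparse representation pays off because the second image usually occupies only a small fraction of the frame, so only the occupied coordinates (plus color, for the screen) need to be sent. A hedged sketch in coordinate-list (COO-style) form; the encoding shown here is an assumption for illustration, not the patent's actual wire format:

```python
# Illustrative COO-style sparse encoding of the second image.
# Only pixels belonging to the second image are listed; every other
# pixel of the non-backlit screen defaults to white.

def encode_positions(second_image_pixels):
    """Sparse position list for the superimposer: coordinates only."""
    return [pos for pos, _color in second_image_pixels.items()]

def encode_display_data(second_image_pixels):
    """Sparse (position, color) pairs for the non-backlit screen."""
    return [(pos, color) for pos, color in second_image_pixels.items()]

cat = {(0, 1): (120, 90, 30), (1, 1): (130, 95, 35)}
print(len(encode_positions(cat)))      # 2 entries instead of a full frame
print(encode_display_data(cat)[0][1])  # color travels only with its position
```

For a full-HD first image, a second image of a few thousand pixels thus costs a few thousand entries instead of roughly two million.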
Based on the same inventive concept, an embodiment of the present application further provides an image processing method, as shown in fig. 7, the image processing method includes:
s1: display data of a first image and position information of each pixel in a second image are received, the display data of the first image including the position information and color information of each pixel in the first image.
S2: replacing with white the color information of the pixels in the first image whose position information matches that of a pixel in the second image, to obtain the display data of the processed first image.
S3: transmitting the display data of the processed first image to a display for display. Specifically, the display data of the processed first image is transmitted to the backlit liquid crystal screen.
The image processing method provided by this embodiment achieves scene fusion display simply by replacing the color information of the pixels in the first image that share a position with the second image, in cooperation with the display 1 of the embodiments above. The amount of data to be processed for image fusion is small and the time required is short, so the transmission-bandwidth requirement is reduced, the fused display is smoother, and motion sickness for the user is avoided.
Further, in the image processing method of this embodiment, step S2 includes replacing the display information of pixels in the first image that have the same coordinates as pixels in the second image with white by a bitwise OR operation.
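The bitwise OR works because white is (255, 255, 255) and each channel is 8 bits: `x | 0xFF == 0xFF` for any 8-bit value `x`, so OR-ing an all-ones mask into a pixel forces it to white without any per-pixel comparison. A small sketch (names are illustrative):

```python
# Replacing a pixel with white via bitwise OR: with 8-bit channels,
# OR-ing with 0xFF saturates every bit to 1.

def or_replace(color, mask=(0xFF, 0xFF, 0xFF)):
    return tuple(c | m for c, m in zip(color, mask))

# Any input color ORed with the all-ones mask becomes white:
assert or_replace((17, 200, 3)) == (255, 255, 255)
# Pixels outside the second image are ORed with an all-zero mask and unchanged:
assert or_replace((17, 200, 3), mask=(0, 0, 0)) == (17, 200, 3)
```

This is why the patent can describe the fusion time itself as negligible: the whole operation is a branch-free OR at a handful of positions.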
In a specific embodiment, as shown in figs. 8 and 9, the map of fig. 8 serves as the first image, and fig. 9 is the fused image obtained by adding a "cross" mark (the second image) to the upper-left corner of that map. For ease of explanation, assume the "cross" spans only 3 pixels in each of the horizontal and vertical directions, i.e., the "cross" consists of 5 pixels, and that it sits at the extreme upper left of the image. Taking the coordinates of the upper-left-most pixel (the first pixel of the first row) as (0, 0), the position information of the 5 pixels constituting the "cross" is (0, 1), (1, 0), (1, 1), (1, 2), and (2, 1), respectively.
During image processing, i.e., during fusion of the first image and the second image, the component that performs this operation (the superimposer 4 in the above embodiment) first receives the display data of the map of fig. 8 (the display data of the first image P1) and the position information of the "cross" (the position information of each pixel in the second image P2). The color information of the pixels on the map whose positions match those of the cross is then replaced with white; for example, with 8-bit RGB channels, the RGB values of those pixels are set to (255, 255, 255), completing the replacement. The display data of the fused image, i.e., the processed first image P1', is then sent to the backlight liquid crystal screen 11, which displays the replaced map: the pixels in the region of the cross are all displayed in white, while the regions outside the cross display their original colors.
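A worked sketch of the fusion in this paragraph, using the five "cross" positions from the text (the map's pixel values and the tiny frame size are made-up stand-ins for fig. 8):

```python
# The five cross positions from the text are replaced with (255, 255, 255);
# every other map pixel keeps its original color.

CROSS = [(0, 1), (1, 0), (1, 1), (1, 2), (2, 1)]
W, H = 4, 4                                   # tiny stand-in for the 800x600 map
frame = {(r, c): (100, 150, 200) for r in range(H) for c in range(W)}

for pos in CROSS:
    frame[pos] = (255, 255, 255)              # replacement completed

# frame[(1, 1)] is now white; frame[(0, 0)] keeps its original color
```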
This embodiment greatly reduces the data transmission amount and transmission time, again taking figs. 8 and 9 as an example and assuming their resolution is 800 × 600. In the prior art, the overall data transmission mainly comprises the following parts: the image is transmitted from the camera to a computer (requiring 800 × 600 × m), the computer renders the image (generating the second image, i.e., the "cross", requiring t1), the images are fused (requiring t2), and the fused image is transmitted to the display 1 (requiring 800 × 600 × m), where m is the transmission time required for the display data of one pixel. The total transmission time required in the prior art is therefore 800 × 600 × m + t1 + t2 + 800 × 600 × m = 960000m + t1 + t2.
Taking the case in which the superimposer 4 is integrated in the display 1 as an example, the data transmission in this embodiment is mainly divided into the following parts: the first image is transmitted from the camera to the display 1 (specifically, to the superimposer 4, which can be directly connected to the backlight liquid crystal screen 11, so the transmission time between the superimposer 4 and the backlight liquid crystal screen 11 is negligible; this step requires 800 × 600 × m); the computer renders the image (generating the second image, i.e., the "cross", requiring t1); the display data of the second image is transmitted to the non-backlight liquid crystal screen 12 of the display 1 (requiring 5m for its 5 pixels); the position information alone is transmitted to the superimposer 4 (requiring less than 5m, but taken as 5m for convenience of description); and the images are fused (the time required by the bitwise OR operation is negligible). The total transmission time required by this application is therefore 800 × 600 × m + t1 + 5m + 5m = 480010m + t1.
As the above calculation shows, the total transmission time required by the image processing method of this embodiment is 479990m + t2 less than the total transmission time in the prior art, a substantial reduction.
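A quick check of the arithmetic in the two preceding paragraphs, carried out symbolically in units of m (t1 and t2 are kept as separate terms, since they cancel or carry through unchanged):

```python
# Verify the transmission-time bookkeeping from the text.

W, H, CROSS_PIXELS = 800, 600, 5

prior_art_m = W * H + W * H                       # camera->computer + fused->display
assert prior_art_m == 960000                      # total: 960000m + t1 + t2

proposed_m = W * H + CROSS_PIXELS + CROSS_PIXELS  # camera->display + data + positions
assert proposed_m == 480010                       # total: 480010m + t1

saving_m = prior_art_m - proposed_m
assert saving_m == 479990                         # saving: 479990m + t2
```

The m-coefficient of the saving is 960000 − 480010 = 479990; the t2 term is saved outright because the bitwise OR fusion time is negligible.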
By applying the embodiment of the application, at least the following beneficial effects can be realized:
According to the display module, the display, the fusion display system and the image processing method of the present application, the superimposer superimposes the first image and the second image to obtain the processed display data of the first image, which is transmitted to the backlight liquid crystal screen for display, while the display data of the second image is transmitted directly to the non-backlight liquid crystal screen for display. Since the area of the second image is usually smaller than that of the first image, its display data is smaller, which reduces the data transmission amount, the bandwidth requirement of the transmission, and the transmission time. Moreover, because image fusion only requires replacing the color information of the pixels at the corresponding positions, the difficulty of image fusion is reduced and the time it requires is greatly shortened, so the fused image can be displayed quickly and motion sickness for the user is avoided.
Those of skill in the art will appreciate that the various operations, methods, and steps in the processes, acts, or solutions discussed in this application can be interchanged, modified, combined, or deleted. Likewise, other steps, measures, or schemes in the various operations, methods, or flows discussed in this application, including those known in the prior art, can be alternated, altered, rearranged, decomposed, combined, or deleted.
In the description of the present application, it is to be understood that the terms "center", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience in describing the present application and simplifying the description; they do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and thus should not be construed as limiting the present application.
The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "a plurality" means two or more unless otherwise specified.
In the description of the present application, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; a direct connection, an indirect connection through an intervening medium, or internal communication between two elements. The specific meanings of the above terms in this application can be understood by those of ordinary skill in the art according to the specific circumstances.
The particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, their execution order is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages, which need not be performed at the same moment but may be performed at different times, and whose execution order need not be sequential: they may be performed in turn or alternately with other steps, or with at least part of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principles of the present application, and such modifications and improvements should also be regarded as falling within the protection scope of the present application.

Claims (10)

1. A display, comprising:
a backlight liquid crystal screen configured to display according to the received display data of the processed first image;
a backlight-free liquid crystal screen positioned on a side of the light-emergent surface of the backlight liquid crystal screen and configured to display according to received display data of a second image;
wherein the display data of the first image comprises position information and color information of each pixel in the first image, the display data of the second image comprises position information and color information of each pixel in the second image, and the processed display data of the first image is display data obtained by replacing, in the display data of the first image, the color information of the pixels whose positions are the same as those of pixels in the second image with white.
2. The display according to claim 1, wherein a superimposer is integrated in the display, the superimposer is electrically connected to the backlit liquid crystal screen, and is configured to determine the position information of the pixel to be replaced in the first image according to the position information of each pixel in the second image, and replace the color information of the pixel to be replaced in the first image with white to obtain the display data of the processed first image.
3. A scene fusion display system, comprising:
a first image device configured to acquire display data of a first image, the display data of the first image including position information of each pixel and color information of each pixel in the first image, and to transmit the display data of the first image to a superimposer;
a second image device configured to acquire a second image, and send position information of each pixel in the second image to the superimposer and send display data of the second image to a display, the display data of the second image including the position information and color information of each pixel in the second image;
the superimposer is respectively electrically connected with the first image device and the second image device, and is configured to determine the position information of the pixel to be replaced in the first image according to the position information of each pixel in the second image, and replace the color information of the pixel to be replaced in the first image with white to obtain the display data of the processed first image;
the display comprises a backlight liquid crystal screen and a backlight-free liquid crystal screen positioned on one side of the light-emitting surface of the backlight liquid crystal screen, the backlight liquid crystal screen is electrically connected with the superimposer and is configured to display according to the received display data of the processed first image, and the backlight-free liquid crystal screen is electrically connected with the second image device and is configured to display according to the received display data of the second image.
4. The scene fusion display system of claim 3, wherein the first image device is a camera or a computer and the second image device is a camera or a computer.
5. The scene fusion display system of claim 4, wherein the superimposer is integrated within the display or the first image device.
6. The scene fusion display system of claim 3, wherein the superimposer includes:
a position information conversion module configured to receive position information of each pixel in the second image and determine coordinates of each pixel in the second image;
a superimposing module configured to replace color information of pixels in the first image having the same coordinates as pixels in the second image with white to obtain display data of the processed first image.
7. The scene fusion display system of claim 6, wherein the superimposition module is specifically configured to replace color information of pixels in the first image having the same coordinates as pixels in the second image with white by bitwise OR.
8. The scene fusion display system according to any of claims 3-7, wherein the second image device is further configured to transmit the position information of each pixel in the second image to the superimposer and the non-backlit liquid crystal screen in the form of a sparse matrix.
9. An image processing method for the scene fusion display system according to any one of claims 3 to 8, comprising:
receiving display data of a first image and position information of each pixel in a second image, wherein the display data of the first image comprises the position information and color information of each pixel in the first image;
replacing the color information of the pixels in the first image whose position information is the same as that of pixels in the second image with white, to obtain the processed display data of the first image;
and transmitting the processed display data of the first image to a display for displaying.
10. The method according to claim 9, wherein replacing the color information of the pixels in the first image whose position information is the same as that of pixels in the second image with white comprises:
replacing, by a bitwise OR operation, the color information of the pixels in the first image having the same coordinates as pixels in the second image with white.
CN202110043954.0A 2021-01-13 2021-01-13 Display, fusion display system and image processing method Active CN112866573B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110043954.0A CN112866573B (en) 2021-01-13 2021-01-13 Display, fusion display system and image processing method

Publications (2)

Publication Number Publication Date
CN112866573A CN112866573A (en) 2021-05-28
CN112866573B true CN112866573B (en) 2022-11-04

Family

ID=76003449


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005191985A (en) * 2003-12-26 2005-07-14 Kyocera Corp Digital camera
US9514381B1 (en) * 2013-03-15 2016-12-06 Pandoodle Corporation Method of identifying and replacing an object or area in a digital image with another object or area
CN106534211A (en) * 2016-12-29 2017-03-22 四川九洲电器集团有限责任公司 Data transmission method and electronic equipment
CN107025457A (en) * 2017-03-29 2017-08-08 腾讯科技(深圳)有限公司 A kind of image processing method and device
CN110310229A (en) * 2019-06-28 2019-10-08 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, terminal device and readable storage medium storing program for executing
CN110717919A (en) * 2019-10-15 2020-01-21 阿里巴巴(中国)有限公司 Image processing method, device, medium and computing equipment
CN111357295A (en) * 2017-06-27 2020-06-30 皮克索洛特公司 Method and system for fusing user-specific content into video production
CN111768479A (en) * 2020-07-29 2020-10-13 腾讯科技(深圳)有限公司 Image processing method, image processing apparatus, computer device, and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10055643B2 (en) * 2014-09-19 2018-08-21 Bendix Commercial Vehicle Systems Llc Advanced blending of stitched images for 3D object reproduction
JP6765956B2 (en) * 2016-12-27 2020-10-07 キヤノン株式会社 Imaging control device and its control method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant