Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to one skilled in the art that the present disclosure may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present disclosure with unnecessary detail.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of an application scenario of an embodiment of the present disclosure. The application scenario may include a server 1, a network 2, and a terminal device 3.
The terminal device 3 may be hardware or software. When the terminal device 3 is hardware, it may be any of various electronic devices having an LED display screen and supporting communication with the server 1, including but not limited to a smart phone, a tablet computer, a laptop computer, a desktop computer, and the like; when the terminal device 3 is software, it may be installed in any of the electronic devices described above. The terminal device 3 may be implemented as a plurality of pieces of software or software modules, or as a single piece of software or software module, which is not limited in the embodiment of the present disclosure. Further, various applications, such as a data processing application, an instant messaging tool, social platform software, a search application, a shopping application, and the like, may be installed on the terminal device 3.
The server 1 may be a server providing various services, for example, a backend server receiving a request sent by a terminal device establishing a communication connection with the server, and the backend server may receive and analyze the request sent by the terminal device and generate a processing result. The server 1 may be one server, may also be a server cluster composed of a plurality of servers, or may also be a cloud computing service center, which is not limited in this disclosure.
The server 1 may be hardware or software. When the server 1 is hardware, it may be various electronic devices that provide various services to the terminal device 3. When the server 1 is software, it may be implemented as a plurality of software or software modules that provide various services for the terminal device 3, or may be implemented as a single software or software module that provides various services for the terminal device 3, which is not limited in this disclosure.
The network 2 may be a wired network connected by coaxial cable, twisted pair, or optical fiber, or may be a wireless network that interconnects communication devices without wiring, for example, Bluetooth, Near Field Communication (NFC), infrared, and the like, which is not limited in the embodiment of the present disclosure.
A user can establish a communication connection with the server 1 via the network 2 through the terminal device 3 to receive or transmit information and the like. Specifically, first, the server 1 may acquire original image data, a correction coefficient, and an average coefficient. Next, the server 1 may generate corrected image data based on a correction processing strategy, preset standard luminance data, the original image data, and the correction coefficient. Then, the server 1 may generate average image data based on an average processing strategy, the standard luminance data, the original image data, and the average coefficient. Finally, the server 1 may generate target image data based on the original image data, an acquired first threshold, the corrected image data, and the average image data.
It should be noted that the specific types, numbers, and combinations of the server 1, the network 2, and the terminal device 3 may be adjusted according to the actual requirements of the application scenario, which is not limited in the embodiment of the present disclosure.
Fig. 2 is a flowchart of an image processing method for an LED display screen according to an embodiment of the present disclosure. The image processing method for the LED display screen of fig. 2 may be performed by the server 1 of fig. 1. As shown in fig. 2, the image processing method for the LED display screen includes:
S201, acquiring original image data, a correction coefficient, and an average coefficient.
The original image data may be a data set composed of all pixel point data in a specific region. As an example, the original image data may be all pixel point data of the entire LED display screen, or the pixel point data of an M × N rectangular region, where M and N are positive integers greater than 2; the region may be selected as required. The pixel point data may refer to the data associated with each LED lamp of the LED display screen, and may include red pixel data, green pixel data, and blue pixel data. Because the display light of each color of an LED lamp can be composed of red, green, and blue light in different proportions, each piece of pixel point data can be composed of basic red pixel data, green pixel data, and blue pixel data.
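As an illustrative sketch (not part of the disclosure), the pixel point data of such an M × N region can be held in a three-channel array; the names, sizes, and 16-bit depth below are assumptions for illustration only:

```python
import numpy as np

# Hypothetical region size; the disclosure requires M and N greater than 2.
M, N = 4, 8
# One (R, G, B) triple per LED lamp; 16-bit values are an assumption here.
raw_image = np.zeros((M, N, 3), dtype=np.uint16)

# Each piece of pixel point data is the (R, G, B) triple of one LED lamp.
pixel = raw_image[0, 0]
```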
The correction coefficient may refer to a coefficient obtained by measuring each pixel with a camera or a dedicated correction device.
In some embodiments, the correction coefficients may include a red main coefficient, a green main coefficient, a blue main coefficient, a green-to-red auxiliary coefficient, a blue-to-red auxiliary coefficient, a red-to-green auxiliary coefficient, a blue-to-green auxiliary coefficient, a red-to-blue auxiliary coefficient, and a green-to-blue auxiliary coefficient.
In some embodiments, the red, green, and blue main coefficients may be data of 16-bit precision. The green-to-red, blue-to-red, red-to-green, blue-to-green, red-to-blue, and green-to-blue auxiliary coefficients may be data of 8-bit or 16-bit precision, set as needed and not particularly limited herein.
The average coefficient may refer to data obtained by an arithmetic mean calculation over the correction coefficients. In some embodiments, when calculating the average coefficient, the correction coefficients may first be smoothed, the high-fluctuation data then removed, and the arithmetic mean finally taken. Each piece of pixel point data may have a different correction coefficient, but the average coefficient of all pixels in a given calculation region is the same. It should be noted that the average coefficient may also be calculated by other averaging methods, all of which fall within the protection scope of the present embodiment; the method is set as required and is not particularly limited herein. In addition, the data structure of the average coefficient is the same as that of the correction coefficient, and is not described herein again.
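The averaging described above might be sketched as follows; the moving-average smoothing, the k-standard-deviation outlier cut, and all names are illustrative assumptions, since the disclosure leaves the exact smoothing and removal methods open:

```python
import numpy as np

def average_coefficient(corr, window=3, k=2.0):
    """Illustrative sketch: smooth the per-pixel correction coefficients,
    drop high-fluctuation samples, then take the arithmetic mean over
    the calculation region."""
    corr = np.asarray(corr, dtype=float)
    # 1) Smooth with a simple moving average (one possible smoothing choice).
    kernel = np.ones(window) / window
    smooth = np.convolve(corr.ravel(), kernel, mode="same")
    # 2) Remove high-fluctuation data: keep samples within k standard deviations.
    keep = np.abs(smooth - smooth.mean()) <= k * smooth.std()
    # 3) The arithmetic mean of the remaining data is the region's average coefficient.
    return float(smooth[keep].mean())
```

The same average coefficient would then be shared by every pixel in the calculation region.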
S202, generating corrected image data based on a correction processing strategy, preset standard brightness data, the original image data, and the correction coefficient.
The standard luminance data may refer to an intermediate parameter for processing the original image data, and may include red standard luminance data, green standard luminance data, and blue standard luminance data. The correction processing strategy may refer to a step or method of generating corrected image data based on the standard luminance data, the original image data, and the correction coefficient, and may include data processing such as calculation and screening, set as needed and not particularly limited herein. The corrected image data may refer to the data obtained by applying the correction processing strategy to the original image data.
S203, generating average image data based on the average processing strategy, the standard brightness data, the original image data and the average coefficient.
The average processing strategy may refer to a step or method of generating average image data based on the standard luminance data, the original image data, and the average coefficient, and may include data processing such as calculation and screening, set as needed and not particularly limited herein. The average image data may refer to the data obtained by applying the average processing strategy to the original image data.
S204, target image data is generated based on the original image data, the acquired first threshold value, the corrected image data and the average image data.
To determine the first threshold, the LED display screen is lit in the correction state, the gray value is increased continuously from 0, and the formed image is observed and compared against preset abnormalities such as low-gray mottling, color cast, and screen streaking. When the generated image meets the requirement, that gray value is the first threshold. It should be noted that the first threshold may be determined by a human operator or by a specific program or device, set as needed and not particularly limited herein. The first threshold may include a red threshold, a green threshold, and a blue threshold. When generating the target image data, each piece of pixel point data of the original image data may be compared with the first threshold, and the data corresponding to that pixel point may be selected either from the corrected image data or from the average image data as the target pixel point data. The manner of comparing each piece of pixel point data with the first threshold is set according to the situation and is not limited herein. By selecting each pixel point in this way, the problems of color cast, screen streaking, and poor color consistency of the LED display screen during low-gray-scale display can be alleviated.
In some embodiments, obtaining raw image data, correction coefficients, and average coefficients comprises: acquiring image data to be decoded and decoding the image data to generate decoded image data; carrying out gamma conversion on the decoded image data to generate original image data; acquiring a correction coefficient; an average coefficient is generated based on the correction coefficient.
The data to be decoded may refer to directly acquired transmission data. The data to be decoded is generally encoded according to an image transmission protocol (e.g., HDMI, High-Definition Multimedia Interface) and cannot be operated on directly, so it needs to be decoded to obtain the decoded image data for subsequent processing. Before the decoded image data is processed, it needs to undergo gamma conversion, which changes its gray values so that the image data better matches the actual display requirements. It should be noted that the gamma curve used in the gamma conversion is set as needed and is not particularly limited. The original image data is obtained after the gamma conversion is performed on the decoded image data.
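The gamma conversion step might be sketched as below, assuming a 16-bit data range and a gamma of 2.2; both are illustrative assumptions, since the disclosure does not fix the gamma curve:

```python
import numpy as np

def gamma_convert(decoded, gamma=2.2, max_val=65535):
    """Illustrative sketch: apply a power-law gamma curve to decoded
    image data so the gray levels better match perceived brightness."""
    x = np.asarray(decoded, dtype=float) / max_val  # normalise to [0, 1]
    y = np.power(x, gamma)                          # apply the gamma curve
    return np.round(y * max_val).astype(np.uint16)  # back to integer gray values

raw_image = gamma_convert(np.array([0, 32768, 65535]))
```

The endpoints of the range are preserved, while intermediate gray values are redistributed along the chosen curve.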
The correction coefficient may refer to a relevant parameter for correcting each pixel point data in the original image data. The average coefficient may refer to a coefficient generated by performing an average calculation based on a correction coefficient corresponding to each pixel point data in the original image data.
In some embodiments, generating the corrected image data based on the correction processing strategy, the preset standard brightness data, the original image data, and the correction coefficient includes: acquiring the red pixel data, green pixel data, and blue pixel data of each piece of pixel point data in the original image data; acquiring the red standard brightness data, green standard brightness data, and blue standard brightness data of the standard brightness data; and generating the corrected image data based on the correction calculation formula, the red pixel data, the green pixel data, the blue pixel data, the red standard luminance data, the green standard luminance data, the blue standard luminance data, and the correction coefficients.
The correction calculation formula may refer to a preset mathematical formula for generating the correction image data. The mathematical expression may be set differently according to different situations, and is not limited specifically herein.
In some embodiments, the correction calculation formula may be:
[Corr_R2R  Corr_G2R  Corr_B2R]   [Br_R * Gamma_Rdata]   [Corr_Rdata]
[Corr_R2G  Corr_G2G  Corr_B2G] * [Br_G * Gamma_Gdata] = [Corr_Gdata]
[Corr_R2B  Corr_G2B  Corr_B2B]   [Br_B * Gamma_Bdata]   [Corr_Bdata]
where Br_R, Br_G, and Br_B may be the red, green, and blue standard luminance data of the standard luminance data; Gamma_Rdata, Gamma_Gdata, and Gamma_Bdata may be the red, green, and blue pixel data of one pixel point in the original image data; Corr_R2R, Corr_G2G, and Corr_B2B may be the red, green, and blue main coefficients of the correction coefficients; Corr_G2R, Corr_B2R, Corr_R2G, Corr_B2G, Corr_R2B, and Corr_G2B may be the green-to-red, blue-to-red, red-to-green, blue-to-green, red-to-blue, and green-to-blue auxiliary coefficients of the correction coefficients; and Corr_Rdata, Corr_Gdata, and Corr_Bdata may be the red, green, and blue correction data of the corrected image data. With this coefficient setting, the corrected image data of each pixel point can be calculated more accurately. It should be noted that the above auxiliary coefficients are all coefficients commonly used in the art, and are not described herein again.
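The correction calculation for a single pixel point might be sketched as below; the function and argument names are illustrative assumptions, and the coefficient values would come from the measurement process described earlier:

```python
import numpy as np

def correct_pixel(gamma_rgb, br_rgb, coeff):
    """Illustrative sketch: apply the 3x3 correction coefficient matrix
    to the brightness-scaled RGB vector of one pixel point.

    gamma_rgb: (Gamma_Rdata, Gamma_Gdata, Gamma_Bdata)
    br_rgb:    (Br_R, Br_G, Br_B) standard luminance data
    coeff:     [[Corr_R2R, Corr_G2R, Corr_B2R],
                [Corr_R2G, Corr_G2G, Corr_B2G],
                [Corr_R2B, Corr_G2B, Corr_B2B]]
    Returns (Corr_Rdata, Corr_Gdata, Corr_Bdata).
    """
    v = np.asarray(br_rgb, dtype=float) * np.asarray(gamma_rgb, dtype=float)
    return np.asarray(coeff, dtype=float) @ v  # matrix-vector product
```

The same function with the average coefficients in place of the per-pixel correction coefficients would yield the average image data for that pixel.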
In some embodiments, generating the average image data based on the average processing strategy, the standard luminance data, the original image data, and the average coefficient includes: acquiring the red pixel data, green pixel data, and blue pixel data of each piece of pixel point data in the original image data; acquiring the red standard brightness data, green standard brightness data, and blue standard brightness data of the standard brightness data; and generating the average image data based on the average calculation formula, the red pixel data, the green pixel data, the blue pixel data, the red standard luminance data, the green standard luminance data, the blue standard luminance data, and the average coefficients.
The average calculation formula may refer to a preset mathematical formula for generating the average image data. The mathematical expression may be set differently according to different situations, and is not limited specifically herein.
In some embodiments, the average calculation formula may be:
[Corr_R2R_Avr  Corr_G2R_Avr  Corr_B2R_Avr]   [Br_R * Gamma_Rdata]   [Corr_Rdata_Avr]
[Corr_R2G_Avr  Corr_G2G_Avr  Corr_B2G_Avr] * [Br_G * Gamma_Gdata] = [Corr_Gdata_Avr]
[Corr_R2B_Avr  Corr_G2B_Avr  Corr_B2B_Avr]   [Br_B * Gamma_Bdata]   [Corr_Bdata_Avr]
where Br_R, Br_G, and Br_B may be the red, green, and blue standard luminance data of the standard luminance data; Gamma_Rdata, Gamma_Gdata, and Gamma_Bdata may be the red, green, and blue pixel data of one pixel point in the original image data; Corr_R2R_Avr, Corr_G2G_Avr, and Corr_B2B_Avr may be the red, green, and blue main coefficients of the average coefficients; Corr_G2R_Avr, Corr_B2R_Avr, Corr_R2G_Avr, Corr_B2G_Avr, Corr_R2B_Avr, and Corr_G2B_Avr may be the green-to-red, blue-to-red, red-to-green, blue-to-green, red-to-blue, and green-to-blue auxiliary coefficients of the average coefficients; and Corr_Rdata_Avr, Corr_Gdata_Avr, and Corr_Bdata_Avr may be the red, green, and blue average data of the average image data. With this coefficient setting, the average image data of each pixel point can be calculated more accurately.
In some embodiments, generating the target image data based on the original image data, the acquired first threshold, the corrected image data, and the average image data includes the following steps. Step one: obtaining a piece of pixel point data in the original image data that has not yet been compared, and comparing it with the first threshold. Step two: when the pixel point data is not less than the first threshold, determining the data corresponding to that pixel point in the corrected image data as the target pixel point data. Step three: when the pixel point data is less than the first threshold, determining the data corresponding to that pixel point in the average image data as the target pixel point data. Steps one to three are repeated until every piece of pixel point data in the original image data has been compared, yielding the target image data.
In some embodiments, obtaining a piece of pixel point data in the original image data that has not yet been compared, and comparing it with the first threshold, includes: acquiring one piece of uncompared pixel point data in the original image data as intermediate pixel point data; and comparing the intermediate red pixel data, intermediate green pixel data, and intermediate blue pixel data of the intermediate pixel point data with the red threshold, green threshold, and blue threshold of the first threshold, respectively. When the red pixel data is less than the red threshold, the green pixel data is less than the green threshold, and the blue pixel data is less than the blue threshold, the intermediate pixel point data is less than the first threshold; when the red pixel data is not less than the red threshold, or the green pixel data is not less than the green threshold, or the blue pixel data is not less than the blue threshold, the intermediate pixel point data is not less than the first threshold. When all three intermediate channel values are below their corresponding thresholds, the pixel point data is typically data prone to problems, that is, data points that can cause screen streaking or color cast; by screening such pixel data in the above manner, the corrected image data or the average image data can be selected more accurately as the target data.
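The per-channel screening above might be sketched as follows, applied to whole arrays at once; all names are illustrative assumptions:

```python
import numpy as np

def select_target(raw, corrected, averaged, threshold_rgb):
    """Illustrative sketch: a pixel counts as below the first threshold
    only when all three of its channels are below their respective
    thresholds; such pixels take the average image data, all other
    pixels take the corrected image data."""
    raw = np.asarray(raw)
    # True where R < red threshold AND G < green threshold AND B < blue threshold.
    below = np.all(raw < np.asarray(threshold_rgb), axis=-1, keepdims=True)
    return np.where(below, averaged, corrected)
```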
In some embodiments, the method further includes: transmitting the target image data to a target device.
The target device may refer to an LED display screen or an intermediate apparatus for adjusting an LED display screen; the device is selected as needed and is not particularly limited herein.
Fig. 3 is a flowchart of an image processing method for an LED display screen according to an embodiment of the present disclosure. The image processing method for the LED display screen of fig. 3 may be performed by the server 1 of fig. 1. As shown in fig. 3, the image processing method for the LED display screen includes:
S301, acquiring image data to be decoded, decoding the image data to be decoded, and generating decoded image data.
S302, gamma-converting the decoded image data to generate original image data.
S303, acquiring a correction coefficient and an average coefficient.
S304, red pixel data, green pixel data and blue pixel data of each pixel point data in the original image data are obtained.
S305, red standard brightness data, green standard brightness data and blue standard brightness data of preset standard brightness data are obtained.
S306, generating corrected image data based on the correction calculation formula, the red pixel data, the green pixel data, the blue pixel data, the red standard luminance data, the green standard luminance data, the blue standard luminance data, and the correction coefficients.
S307, red pixel data, green pixel data and blue pixel data of each pixel point data in the original image data are obtained.
S308, red standard luminance data, green standard luminance data, and blue standard luminance data of the standard luminance data are acquired.
S309, generating average image data based on the average calculation formula, the red pixel data, the green pixel data, the blue pixel data, the red standard luminance data, the green standard luminance data, the blue standard luminance data, and the average coefficients.
S310, target image data is generated based on the original image data, the acquired first threshold, the corrected image data, and the average image data.
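Steps S301-S310 might be sketched end to end as follows, assuming normalised floating-point decoded data and illustrative stub coefficients; real decoding, gamma curves, and coefficient values come from the hardware and measurement process and are only assumed here:

```python
import numpy as np

def process(decoded, br_rgb, corr, corr_avg, threshold_rgb, gamma=2.2):
    """Illustrative end-to-end sketch of S302-S310 for an array of pixels.

    decoded: (..., 3) array of decoded RGB data, normalised to [0, 1]
    br_rgb:  (Br_R, Br_G, Br_B) standard luminance data
    corr:    3x3 correction coefficient matrix
    corr_avg: 3x3 average coefficient matrix (region-wide)
    threshold_rgb: first threshold (red, green, blue)
    """
    # S302: gamma conversion of the decoded data gives the original image data.
    raw = np.power(np.clip(decoded, 0.0, 1.0), gamma)
    # S304-S306: corrected image data, per pixel: corr @ (Br_X * Gamma_Xdata).
    scaled = raw * np.asarray(br_rgb)
    corrected = np.einsum("ij,...j->...i", corr, scaled)
    # S307-S309: average image data using the region-wide average coefficients.
    averaged = np.einsum("ij,...j->...i", corr_avg, scaled)
    # S310: pixels with all channels below the first threshold take the
    # average image data; all others take the corrected image data.
    below = np.all(raw < np.asarray(threshold_rgb), axis=-1, keepdims=True)
    return np.where(below, averaged, corrected)
```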
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 4 is a block diagram of an image processing apparatus for an LED display screen provided in an embodiment of the present disclosure. As shown in fig. 4, the image processing apparatus for an LED display screen includes:
the acquisition module 401 of the image processing apparatus for the LED display screen is configured to acquire raw image data, correction coefficients, and average coefficients.
A corrected image data generating module 402 of the image processing apparatus for the LED display screen is configured to generate corrected image data based on a correction processing policy, preset standard brightness data, the raw image data, and the correction coefficient.
An average image data generating module 403 of the image processing apparatus for the LED display screen is configured to generate average image data based on an average processing strategy, the standard brightness data, the raw image data, and the average coefficient.
A target image generation module 404 of the image processing apparatus for the LED display screen is configured to generate target image data based on the raw image data, the acquired first threshold, the corrected image data, and the average image data.
Target image data is generated based on the corrected image data and the generated average image data, which can alleviate the problems of color cast, screen streaking, and poor color consistency of the LED display screen during low-gray-scale display.
In some embodiments, the obtaining module 401 of the image processing apparatus for the LED display screen further includes: an acquisition decoding module configured to acquire and decode the image data to be decoded to generate decoded image data; a gamma conversion module configured to perform gamma conversion on the decoded image data to generate the original image data; a correction coefficient acquisition module configured to acquire the correction coefficient; and an average coefficient acquisition module configured to generate the average coefficient based on the correction coefficient.
In some embodiments, the corrected image data generation module 402 of the image processing apparatus for the LED display screen is further configured to: acquire the red pixel data, green pixel data, and blue pixel data of each piece of pixel point data in the original image data; acquire the red standard brightness data, green standard brightness data, and blue standard brightness data of the standard brightness data; and generate the corrected image data based on the correction calculation formula, the red pixel data, the green pixel data, the blue pixel data, the red standard luminance data, the green standard luminance data, the blue standard luminance data, and the correction coefficients.
In some embodiments, the correction calculation is:
[Corr_R2R  Corr_G2R  Corr_B2R]   [Br_R * Gamma_Rdata]   [Corr_Rdata]
[Corr_R2G  Corr_G2G  Corr_B2G] * [Br_G * Gamma_Gdata] = [Corr_Gdata]
[Corr_R2B  Corr_G2B  Corr_B2B]   [Br_B * Gamma_Bdata]   [Corr_Bdata]
where Br_R, Br_G, and Br_B are the red, green, and blue standard luminance data of the standard luminance data; Gamma_Rdata, Gamma_Gdata, and Gamma_Bdata are the red, green, and blue pixel data of one pixel point in the original image data; Corr_R2R, Corr_G2G, and Corr_B2B are the red, green, and blue main coefficients of the correction coefficients; Corr_G2R, Corr_B2R, Corr_R2G, Corr_B2G, Corr_R2B, and Corr_G2B are the green-to-red, blue-to-red, red-to-green, blue-to-green, red-to-blue, and green-to-blue auxiliary coefficients of the correction coefficients; and Corr_Rdata, Corr_Gdata, and Corr_Bdata are the red, green, and blue correction data of the corrected image data.
In some embodiments, the average image data generation module 403 of the image processing apparatus for the LED display screen is further configured to: acquire the red pixel data, green pixel data, and blue pixel data of each piece of pixel point data in the original image data; acquire the red standard brightness data, green standard brightness data, and blue standard brightness data of the standard brightness data; and generate the average image data based on the average calculation formula, the red pixel data, the green pixel data, the blue pixel data, the red standard luminance data, the green standard luminance data, the blue standard luminance data, and the average coefficients.
In some embodiments, the average calculation formula is:
[Corr_R2R_Avr  Corr_G2R_Avr  Corr_B2R_Avr]   [Br_R * Gamma_Rdata]   [Corr_Rdata_Avr]
[Corr_R2G_Avr  Corr_G2G_Avr  Corr_B2G_Avr] * [Br_G * Gamma_Gdata] = [Corr_Gdata_Avr]
[Corr_R2B_Avr  Corr_G2B_Avr  Corr_B2B_Avr]   [Br_B * Gamma_Bdata]   [Corr_Bdata_Avr]
wherein Br_R, Br_G, and Br_B are the red, green, and blue standard luminance data of the standard luminance data; Gamma_Rdata, Gamma_Gdata, and Gamma_Bdata are the red, green, and blue pixel data of one pixel point in the original image data; Corr_R2R_Avr, Corr_G2G_Avr, and Corr_B2B_Avr are the red, green, and blue main coefficients of the average coefficients; Corr_G2R_Avr, Corr_B2R_Avr, Corr_R2G_Avr, Corr_B2G_Avr, Corr_R2B_Avr, and Corr_G2B_Avr are the green-to-red, blue-to-red, red-to-green, blue-to-green, red-to-blue, and green-to-blue auxiliary coefficients of the average coefficients; and Corr_Rdata_Avr, Corr_Gdata_Avr, and Corr_Bdata_Avr are the red, green, and blue average data of the average image data.
In some embodiments, the target image generation module 404 of the image processing apparatus for the LED display screen is further configured to perform the following steps. Step one: obtaining a piece of pixel point data in the original image data that has not yet been compared, and comparing it with the first threshold. Step two: when the pixel point data is not less than the first threshold, determining the data corresponding to that pixel point in the corrected image data as the target pixel point data. Step three: when the pixel point data is less than the first threshold, determining the data corresponding to that pixel point in the average image data as the target pixel point data. Steps one to three are repeated until every piece of pixel point data in the original image data has been compared, yielding the target image data.
In some embodiments, obtaining one of the pixel data in the original image data that is not compared with the first threshold includes: acquiring one piece of pixel point data which is not compared in original image data to obtain intermediate pixel point data; comparing the intermediate red pixel data, the intermediate green pixel data and the intermediate blue pixel data of the intermediate pixel point data with a red threshold, a green threshold and a blue threshold of a first threshold respectively; when the red pixel data is smaller than a red threshold, the green pixel data is smaller than a green threshold, and the blue pixel data is smaller than a blue threshold, the intermediate pixel data is smaller than a first threshold; when the red pixel data is not less than the red threshold, or the green pixel data is not less than the green threshold, or the blue pixel data is not less than the blue threshold, the intermediate pixel point data is not less than the first threshold.
In some embodiments, the image processing apparatus for an LED display screen further comprises: a transmission module configured to transmit the target image data to a target device.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.
Referring to fig. 5, the present disclosure further provides an LED display screen control card. The LED display screen control card comprises an image data input interface 501, a programmable logic device 502, a flat cable socket 503, a volatile memory 504 and a non-volatile memory 505. The programmable logic device 502 is connected to the image data input interface 501, the flat cable socket 503, the volatile memory 504, and the non-volatile memory 505, respectively.
Wherein the programmable logic device 502 may implement the steps in the various method embodiments described above. Alternatively, the programmable logic device 502 may implement the functions of the modules/units in the apparatus embodiments described above.
Illustratively, the LED display screen control card may be a control card for correcting the display of the LED display screen. The image data input interface 501 may be an LVDS interface, the programmable logic device 502 may be an FPGA, the volatile memory 504 may be an SDRAM, and the non-volatile memory 505 may be a Flash memory. It should be noted that each component of the LED display screen control card may be replaced by other commonly used components; for example, the image data input interface 501 may also be replaced by a VbyOne decoding chipset or a network-port PHY chipset, and the like, which are selected as required and are not specifically limited herein.
When the LED display screen control card is powered on, the programmable logic device 502 may read a pre-stored program from the non-volatile memory 505; after the programmable logic device 502 finishes loading the program, the LED display screen control card enters a normal operation mode. During operation, the programmable logic device 502 may buffer the input data into the volatile memory 504, and after the processing is completed, the programmable logic device 502 may output the target image data to the target display unit via the flat cable socket 503.
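The power-on and per-frame data flow described above can be summarized in a short sketch. All class and method names here (`ControlCard`, `power_on`, `process_frame`) are hypothetical illustrations of the order of operations, not names taken from the disclosure, and the "program" is modeled as a simple per-pixel function.

```python
# Hypothetical sketch of the control card's operating sequence; every
# name below is illustrative, not taken from the disclosure.

class ControlCard:
    def __init__(self, flash):
        self.flash = flash   # stands in for the non-volatile memory 505
        self.sdram = []      # stands in for the volatile memory 504
        self.program = None

    def power_on(self):
        # The programmable logic device reads the pre-stored program from
        # non-volatile memory; only after loading completes does the card
        # enter its normal operation mode.
        self.program = self.flash["program"]

    def process_frame(self, input_data):
        if self.program is None:
            raise RuntimeError("program not loaded; card not in normal operation")
        # Input data is first buffered into volatile memory ...
        self.sdram.append(input_data)
        # ... then processed, and the target image data is output toward
        # the flat cable socket (modeled here as the return value).
        buffered = self.sdram.pop()
        return [self.program(px) for px in buffered]

card = ControlCard(flash={"program": lambda px: min(px, 255)})
card.power_on()
target = card.process_frame([100, 300, 50])
# target == [100, 255, 50]
```

The sketch only captures the ordering constraint stated in the text: program loading from non-volatile memory precedes normal operation, and buffering into volatile memory precedes processing and output.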
The present disclosure also provides an LED display screen control system, which includes a data reading device, a display screen, and an LED display screen control card as shown in Fig. 5.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus/control card and method may be implemented in other ways. For example, the apparatus/control card embodiments described above are merely illustrative: the division of the modules or units is merely a logical function division, and other division manners may be used in an actual implementation; multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in an electrical, mechanical, or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, the present disclosure implements all or part of the flow of the methods in the above embodiments, which may also be accomplished by instructing related hardware through a computer program. The computer program may be stored in a computer readable storage medium, and when executed by a processor, the computer program may implement the steps of the above method embodiments. The computer program may comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice within the jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, the computer readable medium does not include electrical carrier signals or telecommunications signals.
The above embodiments are only intended to illustrate the technical solutions of the present disclosure, not to limit them. Although the present disclosure has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some technical features thereof may be equivalently replaced; as long as such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present disclosure, they are intended to be included within the scope of the present disclosure.