CN113076159B - Image display method and device, storage medium and electronic equipment
- Publication number: CN113076159B (application CN202110327067.6A)
- Authority: CN (China)
- Prior art keywords: image, target, desktop, parameter, sub
- Legal status: Active
Classifications
- G06F9/452: Remote windowing, e.g. X-Window System, desktop virtualisation (under G06F9/451, Execution arrangements for user interfaces)
- G06F9/505: Allocation of resources, e.g. of the central processing unit [CPU], to service a request, the resource being a machine (e.g. CPUs, servers, terminals), considering the load
- G06T1/20: Processor architectures; processor configuration, e.g. pipelining
- Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses an image display method and device, a storage medium and electronic equipment. The method comprises the following steps: acquiring an image load parameter of a target source end, wherein the image load parameter is the display load occupied by the current desktop image displayed by the target source end; comparing the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image is a desktop image that was displayed in the operating system of the target source end within a target time period before the current moment, and the image variable parameter indicates the degree of image change between the current desktop image and the reference desktop image; determining, according to the image load parameter and the image variable parameter, the scene type of the display scene corresponding to the current desktop image displayed at the target source end; and instructing the target receiving end to display the current desktop image in the image display mode corresponding to the scene type. The invention solves the technical problem of unsmooth image display caused by a mismatch between the image display mode and the application scene.
Description
Technical Field
The present invention relates to the field of computers, and in particular, to an image display method and apparatus, a storage medium, and an electronic device.
Background
A cloud desktop system generally comprises a cloud server and a receiving end, wherein a source end corresponding to the receiving end is arranged in the cloud server, and the source end generally exists in the form of a virtual machine (VM). The user operates, through the receiving end, a virtual machine VM running in the cloud server, and the virtual machine VM sends the desktop display image to the receiving end to be displayed to the user.
With the increasingly wide application of cloud desktop systems, they can be roughly divided, based on their application scenes, into cloud office systems and cloud game systems. In general, the display performance required by a cloud office system differs from that required by a cloud game system, and correspondingly the display modes of the desktop image at the receiving end differ. If a game is run under the cloud office scene, stuttering and other unsmooth behaviour can occur because the display mode does not correspond to the scene. Therefore, the cloud desktop system needs to be adjusted according to whether the application scene is a game scene or an office scene, so that the cloud desktop image is displayed more smoothly.
In the prior art, switching between the cloud office system and the cloud game system requires the user to select and switch the scene through the receiving end; that is, the performance of the cloud desktop system is switched by means of a scene switching instruction received by the receiving end, and the scene cannot be adapted to automatically. As a result, when no switching instruction is received but the application scene changes, for example from an office scene to a game scene, the cloud desktop system suffers from unsmooth image display.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the invention provides an image display method and device, a storage medium and electronic equipment, which are used for at least solving the technical problem of unsmooth image display caused by mismatching of image display and application scenes.
According to an aspect of an embodiment of the present invention, there is provided an image display method including: acquiring an image load parameter of a target source end, wherein the image load parameter is a display load occupied by a current desktop image displayed by the target source end, and the current desktop image is a desktop image displayed in an operating system of the target source end at the current moment; comparing the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image is a desktop image displayed in an operating system in a target time period before the current time, and the image variable parameter is used for indicating the image change degree between the current desktop image and the reference desktop image; determining a scene type of a display scene corresponding to the current desktop image displayed in the target source end according to the image load parameter and the image variable parameter; and indicating the target receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
According to another aspect of the embodiment of the present invention, there is also provided an image display apparatus including: an acquisition module, configured to acquire an image load parameter of a target source end, wherein the image load parameter is the display load occupied by the current desktop image displayed by the target source end, and the current desktop image is the desktop picture displayed in the operating system of the target source end at the current moment; a comparison module, configured to compare the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image is a desktop image displayed in the operating system within a target time period before the current moment of the target source end, and the image variable parameter is used for indicating the image change degree between the current desktop image and the reference desktop image; a determining module, configured to determine, according to the image load parameter and the image variable parameter, the scene type of the display scene corresponding to the current desktop image displayed in the target source end; and a display module, configured to instruct the target receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
According to still another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to execute the above-described image display method when run.
According to still another aspect of the embodiments of the present invention, there is also provided an electronic apparatus including a memory in which a computer program is stored, and a processor configured to execute the above-described image display method by the computer program.
In the embodiment of the invention, the image load parameter is acquired and the image variable parameter is obtained, and the scene type corresponding to the current desktop image is determined according to the image load parameter and the image variable parameter, so that the target receiving end displays the desktop image according to the display mode corresponding to that scene type. In this way, the scene type of the desktop image is determined from the image load parameter and the image variable parameter of the target source end, and the desktop image is displayed in a display mode matched with the scene type, thereby achieving the technical effect of matching the desktop image display with its application scene, and further solving the technical problem of unsmooth image display caused by a mismatch between the image display and the application scene.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a schematic illustration of an application environment of an alternative image display method according to an embodiment of the present invention;
FIG. 2 is a flow chart of an alternative image display method according to an embodiment of the invention;
FIG. 3 is a flow chart of an alternative image display method according to an embodiment of the invention;
FIG. 4 is a schematic diagram of an alternative parameter according to an embodiment of the invention;
FIG. 5 is a flow chart of an alternative image display method according to an embodiment of the invention;
FIG. 6 is a flow chart of an alternative image display method according to an embodiment of the invention;
FIG. 7 is a flow chart of an alternative image display method according to an embodiment of the invention;
fig. 8 is a schematic structural view of an alternative image display device according to an embodiment of the present invention;
fig. 9 is a schematic structural view of an alternative electronic device according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without making any inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an aspect of an embodiment of the present invention, there is provided an image display method. Optionally, the image display method may be applied to, but is not limited to, the environment shown in fig. 1. The source end 102 interacts with the receiving end 122 via the server 110. The source end 102 receives, through the server 110, the control instruction acquired by the receiving end 122; the source end 102 responds to the control instruction by displaying the corresponding desktop picture on the desktop of the operating system, encodes the desktop picture according to the corresponding display scene type, and sends the encoded desktop image data to the receiving end 122 so that the receiving end 122 displays the desktop picture on a corresponding display.
Optionally, a database 104 and a processing engine 106 run in the source end 102. The database 104 is used for storing the desktop images displayed in the operating system running at the source end and the corresponding desktop image data. The processing engine 106 is used for analyzing the display content and the display mode contained in the desktop picture. Specifically, the processing engine 106 sequentially executes S102 to S108: acquiring an image load parameter, wherein the image load parameter is the display load occupied by the current desktop image displayed by the source end, and the current desktop image is the desktop image displayed in the operating system of the source end at the current moment; comparing the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image may be a desktop image displayed in the operating system within a target time period before the current moment of the source end, and the image variable parameter indicates the image change degree between the current desktop image and the reference desktop image; determining, according to the image load parameter and the image variable parameter, the scene type of the display scene corresponding to the current desktop image displayed in the target source end; and, once the scene type is determined, instructing the receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
Alternatively, in this embodiment, the source terminal 102 and the receiving terminal 122 may be terminal devices, including but not limited to at least one of the following: cell phones (e.g., android cell phones, IOS cell phones, etc.), notebook computers, tablet computers, palm computers, MIDs (Mobile Internet Devices ), PADs, desktop computers, smart televisions, etc. The server 110 may be a single server, a server cluster composed of a plurality of servers, or a cloud server. The above is merely an example, and is not limited in any way in the present embodiment. The server 110 performs data interaction between the source 102 and the sink 122, and may be, but is not limited to, network-based. The network may include, but is not limited to: a wired network, a wireless network, wherein the wired network comprises: local area networks, metropolitan area networks, and wide area networks, the wireless network comprising: bluetooth, WIFI, and other networks that enable wireless communications.
As an alternative embodiment, as shown in fig. 2, the image display method includes:
s202, acquiring an image load parameter of a target source end, wherein the image load parameter is a display load occupied by a current desktop image displayed by the target source end, and the current desktop image is a desktop image displayed in an operating system of the target source end at the current moment;
S204, comparing the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image is a desktop image displayed in an operating system in a target time period before the current moment of time, and the image variable parameter is used for indicating the image change degree between the current desktop image and the reference desktop image;
s206, determining the scene type of the display scene corresponding to the current desktop image displayed in the target source end according to the image load parameter and the image variable parameter;
s208, indicating the target receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
Optionally, the current desktop image is the desktop image of the display desktop in the operating system running at the target source end at the current moment; the desktop image includes the image pictures, with static and dynamic elements, that the application programs running in the operating system display on the display desktop at the current moment. The current desktop image may represent the image content at the current moment, taking a frame as the time reference.
Optionally, the image load parameter is used to represent the image display load occupied by the target source end in displaying the current desktop image, and the image display load can be obtained from the load information of the graphics processor (Graphics Processing Unit, GPU). The utilization rate of graphics card resources contained in the GPU load information, namely the image computation utilization corresponding to the display at the target source end, is taken as the image load parameter. When the image load parameter is the image computation utilization contained in the GPU load information, it can be obtained by querying the performance parameters of the operating system.
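By way of illustration only, the following is a minimal sketch of reading such a GPU-based image load parameter on a source end equipped with an NVIDIA graphics card; it assumes the nvidia-smi utility is available, and the function name get_image_load_parameter is hypothetical, since the patent does not prescribe any particular query mechanism.

```python
import subprocess

def get_image_load_parameter() -> float:
    """Return the utilization (0-100) of the first GPU as the image load parameter.

    Assumption: an NVIDIA GPU with nvidia-smi installed; other vendors would
    need a different query (vendor API, /sys counters, OS performance counters).
    """
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    # nvidia-smi prints one line per GPU; take the first as the display adapter.
    return float(out.stdout.strip().splitlines()[0])
```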
Optionally, the reference desktop image is a desktop screen corresponding to one or more historical moments displayed in an operating system of the target source end in the target time period. Under the condition that the reference desktop image comprises a plurality of desktop images, the current desktop image and the plurality of reference desktop images are compared one by one to obtain image variable parameters.
Optionally, the target time period is a preset reference time threshold, and the reference desktop image is a desktop image corresponding to a historical moment extracted from the historical target time period adjacent to the current moment. When the reference desktop image includes a plurality of desktop images, that is, when a plurality of historical moments are extracted, the plurality of historical moments may be consecutive moments or scattered moments, and the present invention is not limited thereto.
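One possible way to keep the reference desktop images described above is a bounded history of recently captured frames from which one or more historical moments are sampled. The sketch below is only an assumed bookkeeping structure; the class name FrameHistory, the two-second window and the sampling strategy are illustrative choices, not requirements of the patent.

```python
import time
from collections import deque

class FrameHistory:
    """Keep desktop frames captured within the last window_s seconds (the target time period)."""

    def __init__(self, window_s: float = 2.0):
        self.window_s = window_s
        self._frames = deque()  # (timestamp, frame) pairs, oldest first

    def push(self, frame, now=None):
        """Record the current desktop image and discard frames outside the time window."""
        now = time.monotonic() if now is None else now
        self._frames.append((now, frame))
        while self._frames and now - self._frames[0][0] > self.window_s:
            self._frames.popleft()

    def sample_references(self, k: int = 3):
        """Return up to k reference frames; the moments may be consecutive or scattered."""
        frames = [f for _, f in self._frames]
        if len(frames) <= k:
            return frames
        step = len(frames) // k
        return frames[::step][:k]
```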
Optionally, the image variable parameter is used to indicate the image change degree between the current desktop image and the reference desktop image, which may be the change degree of the elements contained in the image, of the pixels contained in the image, of the sub-images obtained by dividing the image according to a division manner, or of the region positions obtained by dividing the image according to a region division manner. The degree of change may include, but is not limited to, changes in the number, position, and size of the changed content.
Optionally, the scene type of the display scene corresponding to the current desktop image displayed in the target source end may be, but not limited to, the scene type of the application scene, the scene type of the running scene, for example: office scenes, game scenes, standby scenes, and running scenes.
Optionally, the step of indicating the target receiving end to display the current desktop image according to the image display mode corresponding to the scene type may be to perform image encoding on the current desktop image according to the encoding mode corresponding to the scene type, and send the image encoding data to the target receiving end, so as to enable the target receiving end to decode the image encoding data and display the current desktop image in a display of the target receiving end. The coding mode corresponding to the scene type may be, but is not limited to, a corresponding coding frame rate, a corresponding coding structure, a corresponding coding format, a corresponding coding type, and the like.
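As a sketch of the idea that the scene type selects the encoding mode, the mapping below pairs each scene type with an assumed set of encoder settings and tags the frame before it is sent to the target receiving end. The concrete frame rates, codec and rate-control choices, and the helper names (EncodeConfig, encode_frame, instruct_receiver) are illustrative assumptions; the patent only requires that the encoding mode correspond to the scene type.

```python
from dataclasses import dataclass

@dataclass
class EncodeConfig:
    frame_rate: int     # coding frame rate
    codec: str          # coding format
    rate_control: str   # coding type / bitrate strategy

# Illustrative settings only; the patent does not fix concrete values.
SCENE_ENCODING = {
    "game":   EncodeConfig(frame_rate=60, codec="h264", rate_control="cbr"),
    "office": EncodeConfig(frame_rate=30, codec="h264", rate_control="vbr"),
}

def encode_frame(frame_bytes: bytes, cfg: EncodeConfig) -> bytes:
    # Placeholder: a real source end would feed the frame into a hardware or
    # software encoder configured with cfg; here the raw bytes are only tagged.
    header = f"{cfg.codec}/{cfg.frame_rate}fps/{cfg.rate_control}".encode()
    return header + b"|" + frame_bytes

def instruct_receiver(frame_bytes: bytes, scene_type: str, send) -> None:
    """Encode the current desktop image per the scene type and hand it to a transport callback."""
    cfg = SCENE_ENCODING[scene_type]
    send(encode_frame(frame_bytes, cfg))
```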
In the embodiment of the application, the image load parameter is acquired and the image variable parameter is obtained, and the scene type corresponding to the current desktop image is determined according to the image load parameter and the image variable parameter, so that the target receiving end displays the desktop image according to the display mode corresponding to that scene type. In this way, the scene type of the desktop image is determined from the image load parameter and the image variable parameter of the target source end, the purpose of displaying the desktop image in a display mode matched with the scene type is achieved, the technical effect of matching the desktop image display with its application scene is achieved, and the technical problem of unsmooth image display caused by a mismatch between the image display and the application scene is solved.
As an optional implementation manner, as shown in fig. 3, after the image load parameter of the target source end is acquired, the method further includes:
s302, determining configuration information of a target source end, wherein the configuration information comprises image processing performance parameters and image display performance parameters of the target source end;
s304, inquiring a scene load threshold corresponding to the target source end according to the configuration information in a load threshold database;
s306, comparing the image load parameter with a scene load threshold.
Alternatively, the image processing performance parameter of the target source end may be determined by the CPU performance of the target source end. The CPU performance includes, but is not limited to: number of cores, base frequency, frequency multiplier, first-level cache, second-level cache, third-level cache, hyper-threading (HT), and thermal design power (TDP). The CPU performance can be obtained from the CPU model; the CPU model may be a product series model used for classification and management, and is an identification of the CPU performance. Examples of the performance corresponding to different CPU series models are shown in list (1) of fig. 4.
Optionally, the image display performance parameter of the target source end may be determined by the graphics card performance of the target source end. The graphics card performance includes, but is not limited to: core frequency and boost frequency. The graphics card performance can be obtained from the graphics card model; the graphics card model may be a product series model used for classification and management, and is an identification of the graphics card performance. Examples of the performance corresponding to different graphics card series models are shown in list (2) of fig. 4.
Optionally, the CPU models corresponding to the image processing performance parameters and the graphics card models corresponding to the image display performance parameters are recorded in the load threshold database, and the scene load threshold corresponding to the target source end is looked up according to the CPU model and the graphics card model of the target source end. The scene load threshold data recorded in the load threshold database may be represented as the list shown in (3) of fig. 4. The higher the graphics card model configuration, the better the corresponding display performance parameter; the higher the CPU model configuration, the better the corresponding processing performance parameter. Therefore, when the same desktop image is processed and displayed, a target source end with better performance parameters carries a lower image load, and the value of its corresponding scene load threshold is smaller. Thus, for the thresholds described in (3) of fig. 4, with the graphics card models ordered from high to low configuration as graphics card model E, D, C, B and A, and the CPU models ordered from high to low configuration as CPU model 7, 6, 5, 4, 3, 2 and 1, the threshold values in each row and each column decrease as the configuration increases: the value corresponding to threshold E7 is the smallest, and the value corresponding to threshold A1 is the largest.
Optionally, when no CPU model consistent with the configuration information of the target source end can be found in the load threshold database, a CPU model with the same image processing performance parameter may be matched, among the CPU models already in the load threshold database, as the CPU model of the target source end, according to the image processing performance parameter corresponding to the CPU model.
Optionally, a substitute CPU model is set for the load threshold database, and the substitute CPU model is used as the CPU model of the target source end when neither a CPU model consistent with the configuration information of the target source end nor a CPU model with the same image processing performance parameter as the target source end can be found in the load threshold database.
Optionally, when no graphics card model consistent with the configuration information of the target source end can be found in the load threshold database, a graphics card model with the same image display performance parameter as the target source end may be matched, among the graphics card models already in the load threshold database, according to the image display performance parameter corresponding to the graphics card model.
Optionally, a substitute graphics card model is set for the load threshold database, and the substitute graphics card model is used as the graphics card model of the target source end when neither a graphics card model consistent with the configuration information of the target source end nor a graphics card model with the same image display performance parameter as the target source end can be found in the load threshold database.
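The threshold lookup with its substitute-model fallback could be organized as below; the table contents and model names only mirror the illustrative labels of fig. 4 (CPU model 1-7, graphics card model A-E), and the numerical thresholds are placeholders rather than values disclosed in the patent.

```python
# Illustrative load threshold database keyed by (CPU model, graphics card model).
LOAD_THRESHOLDS = {
    ("cpu_model_1", "gpu_model_A"): 0.80,  # lowest configuration -> largest threshold (A1)
    ("cpu_model_1", "gpu_model_E"): 0.60,
    ("cpu_model_7", "gpu_model_A"): 0.50,
    ("cpu_model_7", "gpu_model_E"): 0.30,  # highest configuration -> smallest threshold (E7)
}
SUBSTITUTE_CPU = "cpu_model_1"
SUBSTITUTE_GPU = "gpu_model_A"

def query_scene_load_threshold(cpu_model: str, gpu_model: str) -> float:
    """Look up the scene load threshold for the target source end's configuration.

    Unknown models fall back to the substitute models; a fuller implementation
    would first try to match a model with identical performance parameters, as
    described in the embodiment above.
    """
    known_cpus = {cpu for cpu, _ in LOAD_THRESHOLDS}
    known_gpus = {gpu for _, gpu in LOAD_THRESHOLDS}
    cpu = cpu_model if cpu_model in known_cpus else SUBSTITUTE_CPU
    gpu = gpu_model if gpu_model in known_gpus else SUBSTITUTE_GPU
    return LOAD_THRESHOLDS[(cpu, gpu)]
```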
In the embodiment of the application, the image processing performance and the image display performance are determined according to the hardware configuration of the target source end, so that the scene load threshold of the target source end is determined, and the scene type is judged through comparison of the image load parameter and the scene load threshold.
As an alternative embodiment, as shown in fig. 5, the comparing the current desktop image with the reference desktop image to obtain the image variable parameter includes:
s502, dividing a current desktop image into a plurality of current sub-images according to a preset desktop image dividing mode, and dividing a reference desktop image into a plurality of reference sub-images according to the dividing mode;
s504, comparing the plurality of current sub-images with the plurality of reference sub-images to determine a target sub-image, wherein the target sub-image is used for representing the current sub-image inconsistent with the picture content of the reference sub-image;
s506, acquiring image quantity parameters according to the quantity of target sub-images, and acquiring image area parameters according to the image areas of the target sub-images, wherein the image areas of the target sub-images represent the area positions of the target sub-images in the current desktop image according to the preset area division;
s508, taking the image quantity parameter and the image area parameter as image variable parameters.
Alternatively, the preset division manner of the desktop image may be to divide the desktop image into a plurality of sub-images. The segmentation modes of the current desktop image and the historical desktop image are kept consistent, namely, the current sub-image formed by segmenting the current desktop image according to the segmentation modes and the reference sub-image formed by segmenting the reference desktop image according to the segmentation modes are completely consistent in segmentation positions, image sizes and image numbers.
Optionally, in a case where the reference desktop image includes a plurality of desktop images, a plurality of reference sub-images formed by each reference desktop image in a divided manner are used as a set of reference sub-images, and a plurality of sets of reference sub-images corresponding to the plurality of reference desktop images are generated.
Alternatively, the comparing the plurality of current sub-images with the plurality of reference sub-images may be comparing the plurality of current sub-images with the plurality of reference sub-images included in the current group of reference sub-images, and sequentially comparing the current sub-images with the plurality of groups of reference sub-images.
Optionally, in the case that there are multiple groups of reference sub-images, the target sub-image contains a current sub-image whose picture content is inconsistent with that determined by comparison with any one group of reference sub-images.
Alternatively, the region positions divided according to the preset region may be a plurality of image regions formed by dividing the desktop image according to the preset region division manner. The image areas formed by dividing the current desktop image according to the preset area are completely consistent with the image areas formed by dividing the reference desktop image according to the preset area in dividing positions and area numbers.
Optionally, each image region contains at least two sub-images formed by segmentation. The region sizes of different image regions may not be uniform, and the number of sub-images contained in the image regions may not be the same.
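As an illustration of the division described above, the sketch below cuts a desktop frame into fixed-size sub-images and maps each sub-image to a preset region; the 64-pixel tile size and the uniform region grid are simplifying assumptions (the patent allows regions of different sizes containing different numbers of sub-images), and the function names are hypothetical.

```python
import numpy as np

def split_into_tiles(frame: np.ndarray, tile: int = 64) -> dict:
    """Split an H x W x C desktop image into sub-images, keyed by grid position.

    The current and reference desktop images must be split in exactly the same
    way so that sub-images at the same key can be compared pairwise.
    """
    h, w = frame.shape[:2]
    return {
        (r // tile, c // tile): frame[r:r + tile, c:c + tile]
        for r in range(0, h, tile)
        for c in range(0, w, tile)
    }

def region_of(grid_pos, grid_shape=(17, 30), regions_per_axis: int = 4):
    """Map a sub-image's grid position to one of a preset grid of image regions.

    The default grid_shape corresponds to a 1920x1080 frame split into 64 px tiles.
    """
    rows, cols = grid_shape
    r, c = grid_pos
    return (r * regions_per_axis // rows, c * regions_per_axis // cols)
```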
In the embodiment of the application, the current sub-images formed by dividing the current desktop image are compared with the reference sub-images formed by dividing the reference desktop image, and the current sub-images whose displayed desktop content is inconsistent are taken as target sub-images; the image variable parameters are then determined according to the number of target sub-images and their image areas. Representing the change of the desktop image by sub-images refines the way image change is determined, so that, with the number change and the area change taken as the variable parameters, the scene type can be determined more accurately.
As an optional implementation manner, the comparing the plurality of current sub-images with the plurality of reference sub-images, and determining the target sub-image includes:
according to the position mapping relation corresponding to the segmentation mode, comparing the pixel points contained in the current sub-image with the pixel points contained in the reference sub-image one by one;
and taking the current sub-image which is inconsistent with the picture content displayed by the pixel points contained in the reference sub-image as a target sub-image.
Optionally, the position mapping relationship corresponding to the division manner may be the mapping relationship between sub-images located at the same division position: the current sub-image and the reference sub-image located at the same division position are used as a pair of comparison objects. Each sub-image comprises a plurality of pixel points, and the numbers of pixel points are consistent; the pixel point of the current sub-image and the pixel point of the reference sub-image located at the same position are used as a group of comparison pixel points, and pixel points in the same group are compared with each other.
Optionally, in the case that all the groups of comparison pixels are completely consistent, it is determined that the picture content displayed by the pixels of the current sub-image is consistent with the picture content displayed by the pixels included in the reference sub-image. And under the condition that at least one group of comparison pixels are inconsistent, determining that the picture content displayed by the pixels of the current sub-image is inconsistent with the picture content displayed by the pixels contained in the reference sub-image.
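Building on the tiling sketch above, the pairwise pixel comparison could look like the following; np.array_equal stands in for the group-by-group pixel comparison, and the name find_target_tiles is hypothetical.

```python
import numpy as np

def find_target_tiles(current_tiles: dict, reference_tiles: dict) -> list:
    """Return grid positions of current sub-images whose pixels differ from the
    reference sub-image at the same division position (the target sub-images)."""
    targets = []
    for pos, cur in current_tiles.items():
        ref = reference_tiles[pos]        # same split, so the same keys exist in both
        if not np.array_equal(cur, ref):  # at least one differing pixel group -> changed
            targets.append(pos)
    return targets
```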
In the embodiment of the application, the current sub-images and the reference sub-images are compared one by one according to the position mapping relation; in the comparison of each pair of comparison objects, the pixel points contained in the sub-images are compared one by one, and the target sub-images whose picture content has changed are determined from the current sub-images based on the picture content displayed by the pixel points.
As an optional implementation manner, the acquiring the image number parameter according to the number of the target sub-images includes:
and calculating the number ratio of the target sub-image to the current sub-image according to the number of the target sub-images and the number of the current sub-image, and taking the number ratio as an image number parameter.
Optionally, the number of current sub-images is the number of sub-images formed by dividing the desktop image, and is also the number of reference sub-images contained in each group of reference sub-images.
Optionally, the number ratio is used to indicate the proportion of sub-images whose desktop content has changed among the current sub-images.
In the embodiment of the application, the ratio of the number of the target sub-images to the number of the current sub-images is used as the image number parameter, and the scene type is judged from the number of image changes so as to realize automatic recognition of the scene type, so that the desktop image is displayed according to the display mode corresponding to the scene type.
As an alternative embodiment, as shown in fig. 6, the acquiring the image area parameter according to the image position of the target sub-image includes:
s602, determining a target region of a target sub-image in a current desktop image, wherein the target region is contained in a plurality of region positions of the current desktop image divided according to a preset region;
S604, acquiring a history target sub-image corresponding to the history moment in a target time period, and determining a history target area of the history target sub-image in a history desktop image;
s606, when the target area and the history target area are the same area position, the same area position is used as an image area parameter;
s608, when the target area and the history target area are not the same area position, the different area position is used as the image area parameter.
Alternatively, the historical target sub-image corresponding to the historical time in the target period may be a historical target sub-image corresponding to one or more historical time in the target period. When the history target sub-image is a plurality of history target sub-images corresponding to a plurality of history moments, a plurality of corresponding history target areas are respectively determined according to the plurality of history target sub-images.
Alternatively, the historical target sub-image corresponding to a historical moment may be determined by obtaining the historical reference desktop image, that is, the reference desktop image for that historical moment within the target time period, and comparing the historical sub-images corresponding to the historical moments with the historical reference sub-images corresponding to the historical reference desktop image, so as to determine the historical target sub-image.
Alternatively, in the case where the reference desktop image of the current desktop image includes a plurality of history desktop images within the target period, the history time corresponding to the plurality of history desktop images may be used as the history time corresponding to the history target sub-image, and the history target sub-image may be determined according to the reference sub-image.
Optionally, in the case that the historical target area comprises a plurality of area positions, if the target area and all of the plurality of historical target areas are at the same area position, the target area and the historical target area are determined to be at the same area position; if there exists a historical target area that is not at the same area position as the target area, the target area and the historical target area are determined not to be at the same area position.
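Under a simplified reading of the comparison just described, the image area parameter could be derived as below: the regions touched by the current target sub-images are compared with the regions touched by each group of historical target sub-images. The function name and the 'same'/'different' return values are assumptions of this sketch.

```python
def image_area_parameter(target_positions, historical_target_positions_list, region_of):
    """Return 'same' when the changed sub-images stay in the same region positions
    as every historical group of target sub-images, otherwise 'different'."""
    current_regions = {region_of(pos) for pos in target_positions}
    for historical_positions in historical_target_positions_list:
        historical_regions = {region_of(pos) for pos in historical_positions}
        if historical_regions != current_regions:
            return "different"
    return "same"
```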
In the embodiment of the application, the scene type is judged by comparing the region position of the target region where the target sub-image is located with the region position of the history target region of the history target sub-image and by utilizing the region position where the changed sub-image is located, so that the automatic identification of the scene type is realized, and the desktop image is displayed according to the display mode corresponding to the scene type.
As an optional implementation manner, the determining, according to the image load parameter and the image variable parameter, the scene type of the display scene corresponding to the current desktop image displayed in the target source end includes:
Determining that the scene type of the display scene corresponding to the current desktop image displayed by the target source end is an office scene under the condition that the image load parameter is smaller than the scene load threshold, the image quantity parameter is smaller than the scene quantity threshold and the image area parameter is at different area positions;
and determining the scene type of the display scene corresponding to the current desktop image displayed by the target source end as the game scene under the condition that the image load parameter is greater than or equal to the scene load threshold or the image quantity parameter is greater than or equal to the scene quantity threshold or the image area parameter is at the same area position.
Alternatively, determining the scene type according to the image load parameter and the image variable parameter may be determining the scene type according to a combination of the image load parameter, the image quantity parameter, and the image area parameter. The image load parameter, the image number parameter, and the image area parameter may be determined sequentially according to a preset parameter determination sequence.
Alternatively, the preset parameter judgment order may be: the image load parameter first, then the image variable parameter; within the image variable parameter, the image number parameter first, then the image area parameter. The parameter judgments may be performed sequentially according to the preset order until the judgments are completed, and the scene type is determined according to the results of the parameter judgments.
In the embodiment of the application, the scene type is determined according to the combination of the image load parameter, the image quantity parameter and the image area parameter, and the scene type is determined to be an office scene only when the image load parameter is smaller than the scene load threshold, the image quantity parameter is smaller than the scene quantity threshold, and the image area parameter indicates different area positions. This improves the accuracy of identifying the game scene, which has higher display-mode requirements, improves the accuracy of automatic scene type recognition, and makes the display of the desktop image match the scene so that it is displayed more smoothly.
Alternatively, in the case where the current parameter judgment flow can determine the scene type, the parameter judgment flow is stopped. And under the condition that the scene type can be determined according to the image load parameters, judging the image variable parameters is not performed any more, and the scene type is directly determined. And in the case that the scene type cannot be determined by the image load parameter, judging the scene type again according to the image variable parameter. Under the condition that the scene type can be determined by executing the judgment of the image quantity parameters, namely combining the image load parameters and the image quantity parameters, the judgment of the image area parameters is not carried out any more, and the scene type is directly determined. In the case where the scene type cannot be determined by performing the image number parameter judgment, the image area parameter judgment is performed to determine the scene type.
Alternatively, the application scene type of the display scene corresponding to the current desktop image displayed in the target source end may be determined, but is not limited to, as shown in fig. 7. Taking the case where the image load parameter is the GPU load as an example, S702 is executed to obtain the GPU load and the load threshold when the desktop picture at the current moment is acquired. When the GPU load and the load threshold have been determined, S704 is executed to judge whether the GPU load is less than the load threshold. If the judgment at S704 is no, that is, if the GPU load is greater than or equal to the load threshold, S716 is executed to determine that the display scene of the desktop image is a game scene.
If the judgment at S704 is yes, that is, if the GPU load is less than the load threshold, S706 is executed to acquire the image number parameter. Specifically, the current desktop image and the reference desktop image are divided to obtain the current sub-images and the reference sub-images, and the current sub-images are compared with the reference sub-images to determine the target sub-images. The ratio of the number of target sub-images to the number of current sub-images is taken as the image number parameter. Once the image number parameter is acquired, S708 is executed to judge whether the image number parameter is smaller than the number threshold. If the judgment at S708 is no, that is, if the image number parameter is not less than the number threshold, S716 is executed to determine that the display scene of the desktop image is a game scene.
If the judgment at S708 is yes, that is, if the image number parameter is smaller than the number threshold, S710 is executed to acquire the image area parameter. Specifically, the historical target sub-images are acquired, and the target area of the target sub-image is compared with the historical target area of the historical target sub-image: when the target area and the historical target area are at the same area position, the image area parameter is determined to be the same area position; when the target area and the historical target area are not at the same area position, the image area parameter is determined to be a different area position. Once the image area parameter is determined, S712 is executed to judge whether the image area parameter is a different area position. If the judgment at S712 is yes, that is, if the image area parameter is a different area position, S714 is executed to determine that the display scene of the desktop image is an office scene. If the judgment at S712 is no, that is, if the image area parameter is the same area position, S716 is executed to determine that the display scene of the desktop image is a game scene.
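Putting these judgments together, the decision flow S702 to S716 of fig. 7 can be read as the following function; the threshold values are supplied by the caller (for example from the load threshold database sketched earlier), and the string labels are assumptions of this sketch rather than terms fixed by the patent.

```python
def classify_scene(gpu_load: float, load_threshold: float,
                   change_ratio: float, number_threshold: float,
                   area_parameter: str) -> str:
    """One possible reading of the decision flow of fig. 7 (S702-S716).

    gpu_load / load_threshold : image load parameter and scene load threshold
    change_ratio              : number of target sub-images / number of current sub-images
    area_parameter            : 'same' or 'different' region position (image area parameter)
    """
    if gpu_load >= load_threshold:        # S704 "no"  -> S716
        return "game"
    if change_ratio >= number_threshold:  # S708 "no"  -> S716
        return "game"
    if area_parameter == "same":          # S712 "no"  -> S716
        return "game"
    return "office"                       # S712 "yes" -> S714
```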
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present invention is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present invention. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present invention.
According to another aspect of the embodiment of the present invention, there is also provided an image display apparatus for implementing the above-described image display method. As shown in fig. 8, the apparatus includes:
the obtaining module 802 is configured to obtain an image load parameter of the target source, where the image load parameter is a display load occupied by a current desktop image displayed by the target source, and the current desktop image is a desktop image displayed in an operating system of the target source at a current moment;
the comparison module 804 is configured to compare a current desktop image with a reference desktop image to obtain an image variable parameter, where the reference desktop image is a desktop image that has been displayed in an operating system in a target time period before a current time, and the image variable parameter is used to indicate an image change degree between the current desktop image and the reference desktop image;
a determining module 806, configured to determine, according to the image load parameter and the image variable parameter, a scene type of a display scene corresponding to the current desktop image displayed in the target source end;
and the display module 808 is configured to instruct the target receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
Optionally, the image display device further includes:
The configuration module is used for determining configuration information of the target source end after the image load parameters of the target source end are acquired, wherein the configuration information comprises image processing performance parameters and image display performance parameters of the target source end;
the inquiring module is used for inquiring the scene load threshold corresponding to the target source end according to the configuration information in the load threshold database;
and the comparison module is used for comparing the image load parameter with a scene load threshold value.
Optionally, the comparing module 804 further includes:
the segmentation unit is used for dividing the current desktop image into a plurality of current sub-images according to a preset segmentation mode of the desktop image and dividing the reference desktop image into a plurality of reference sub-images according to the segmentation mode;
the comparison unit is used for comparing the plurality of current sub-images with the plurality of reference sub-images to determine a target sub-image, wherein the target sub-image is used for representing the current sub-image inconsistent with the picture content of the reference sub-image;
the acquisition unit is used for acquiring image quantity parameters according to the quantity of the target sub-images and acquiring image area parameters according to the image positions of the target sub-images, wherein the image positions of the target sub-images comprise the area positions of the target sub-images in the current desktop image according to the preset area division;
And the combining unit is used for taking the image quantity parameter and the image area parameter as image variable parameters.
Optionally, the above-mentioned comparison unit further includes:
the comparison unit is used for comparing the pixel points contained in the current sub-image with the pixel points contained in the reference sub-image one by one according to the position mapping relation corresponding to the segmentation mode;
and a result unit for taking the current sub-image which is inconsistent with the picture content displayed by the pixel points included in the reference sub-image as a target sub-image.
Optionally, the above-mentioned acquisition unit further includes:
and the first parameter unit is used for calculating the number ratio of the target sub-image to the current sub-image according to the number of the target sub-images and the number of the current sub-image, and taking the number ratio as an image number parameter.
Optionally, the above-mentioned acquisition unit further includes:
the area unit is used for determining a target area of the target sub-image in the current desktop image, wherein the target area is contained in a plurality of area positions of the current desktop image divided according to a preset area;
the history unit is used for acquiring a plurality of history target sub-images corresponding to a plurality of history moments in a target time period and determining a history target area of the history target sub-images in a history desktop image;
A second parameter unit configured to, when the target area and the plurality of history target areas are the same area position, take the same area position as an image area parameter;
and the third parameter unit is used for taking different area positions as image area parameters when the target area and the plurality of historical target areas are not the same area positions.
The determining module 806 further includes:
the first determining unit is used for determining that the scene type of the display scene corresponding to the current desktop image displayed by the target source end is an office scene under the condition that the image load parameter is smaller than the scene load threshold, the image quantity parameter is smaller than the scene quantity threshold and the image area parameter is at different area positions;
the second determining unit is used for determining that the scene type of the display scene corresponding to the current desktop image displayed by the target source end is a game scene under the condition that the image load parameter is greater than or equal to the scene load threshold value or the image quantity parameter is greater than or equal to the scene quantity threshold value or the image area parameter is at the same area position.
In the embodiment of the application, the image load parameter and the image variable parameter are acquired, and the scene type corresponding to the current desktop image is determined according to the image load parameter and the image variable parameter, so that the target receiving end displays the desktop image according to the display mode corresponding to that scene type. In this way, the scene type of the desktop image is determined from the image load parameter and the image variable parameter of the target source end, the purposes of determining the scene type of the desktop image and displaying the desktop image in a display mode matched with the scene type are achieved, the technical effect of matching the desktop image display with its application scene is achieved, and the technical problem of unsmooth image display caused by a mismatch between the image display and the application scene is solved.
According to still another aspect of the embodiment of the present invention, there is also provided an electronic device for implementing the above image display method, which may be a terminal device (source end) or a server shown in fig. 1. The present embodiment is described taking the electronic device as a terminal device as an example. As shown in fig. 9, the electronic device comprises a memory 902 and a processor 904, the memory 902 having stored therein a computer program, the processor 904 being arranged to perform the steps of any of the method embodiments described above by means of the computer program.
Alternatively, in this embodiment, the electronic device may be located in at least one network device of a plurality of network devices of the computer network.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
s1, acquiring an image load parameter of a target source end, wherein the image load parameter is a display load occupied by a current desktop image displayed by the target source end, and the current desktop image is a desktop image displayed in an operating system of the target source end at the current moment;
s2, comparing the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image is a desktop image displayed in an operating system in a target time period before the current moment of a target source end, and the image variable parameter is used for indicating the image change degree between the current desktop image and the reference desktop image;
S3, determining the scene type of the display scene corresponding to the current desktop image displayed in the target source end according to the image load parameter and the image variable parameter;
and S4, indicating the target receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
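A minimal sketch of how steps S1-S4 might be chained together is given below; the `source` and `receiver` objects and their method names are placeholders assumed for illustration, and `classify_scene` refers to the sketch above.

```python
from dataclasses import dataclass

@dataclass
class ImageVariableParameter:
    count_ratio: float   # image quantity parameter: share of sub-images that changed
    same_region: bool    # image area parameter: True if changes stay in the same region

def display_desktop_image(source, receiver, target_period_s=2.0):
    """One pass through steps S1-S4; `source` and `receiver` are placeholder
    objects whose method names are assumptions, not APIs from the patent."""
    # S1: display load occupied by the current desktop image at the source end
    image_load = source.get_image_load_parameter()
    current = source.capture_current_desktop_image()

    # S2: compare with the reference desktop image from the target time period
    reference = source.get_reference_desktop_image(seconds_before=target_period_s)
    variable = source.compare_desktop_images(current, reference)  # ImageVariableParameter

    # S3: map the two parameters to the scene type (reusing classify_scene above)
    scene_type = classify_scene(image_load, variable.count_ratio, variable.same_region)

    # S4: instruct the target receiving end to use the matching display mode
    receiver.display(current, mode=scene_type)
```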
Alternatively, it will be understood by those skilled in the art that the structure shown in Fig. 9 is only schematic, and the electronic device may also be a terminal device such as a smart phone (e.g. an Android phone, an iOS phone, etc.), a tablet computer, a palmtop computer, a mobile internet device (Mobile Internet Devices, MID), a PAD, etc. Fig. 9 does not limit the structure of the electronic device described above. For example, the electronic device may also include more or fewer components (e.g. network interfaces, etc.) than shown in Fig. 9, or have a configuration different from that shown in Fig. 9.
The memory 902 may be used to store software programs and modules, such as the program instructions/modules corresponding to the image display method and apparatus in the embodiment of the present invention, and the processor 904 executes the software programs and modules stored in the memory 902, thereby performing various functional applications and data processing, that is, implementing the image display method described above. The memory 902 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 902 may further include memory remotely located relative to the processor 904, which may be connected to the terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. In particular, the memory 902 may be used for storing, but is not limited to, information such as desktop images, image load parameters, image variable parameters, and scene types. As an example, as shown in Fig. 9, the memory 902 may include, but is not limited to, the acquisition module 802, the comparison module 804, the determination module 806, and the display module 808 of the image display apparatus. Other module units of the image display apparatus may also be included, and are not described in detail in this example.
Optionally, the transmission device 906 is used to receive or transmit data via a network. Specific examples of the network described above may include wired networks and wireless networks. In one example, the transmission device 906 includes a network adapter (Network Interface Controller, NIC) that can be connected to other network devices and routers via a network cable so as to communicate with the internet or a local area network. In another example, the transmission device 906 is a Radio Frequency (RF) module, which is used to communicate with the internet wirelessly.
In addition, the electronic device further includes: a display 908 for displaying the desktop image; and a connection bus 910 for connecting the respective module parts in the above-described electronic device.
In other embodiments, the terminal device or the server may be a node in a distributed system, where the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by a plurality of nodes connected through network communication. The nodes may form a Peer-to-Peer (P2P) network, and any type of computing device, such as a server or a terminal, may become a node in the blockchain system by joining the Peer-to-Peer network.
According to one aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the methods provided in the various alternative implementations of the image display aspects described above. The computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may be configured to store a computer program for executing the steps of:
S1, acquiring an image load parameter of a target source end, wherein the image load parameter is a display load occupied by a current desktop image displayed by the target source end, and the current desktop image is a desktop image displayed in an operating system of the target source end at the current moment;
S2, comparing the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image is a desktop image displayed in the operating system in a target time period before the current moment of the target source end, and the image variable parameter is used for indicating the image change degree between the current desktop image and the reference desktop image;
S3, determining the scene type of the display scene corresponding to the current desktop image displayed in the target source end according to the image load parameter and the image variable parameter;
and S4, indicating the target receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
Alternatively, in this embodiment, it will be understood by those skilled in the art that all or part of the steps in the methods of the above embodiments may be performed by a program instructing a terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
The integrated units in the above embodiments may be stored in the above-described computer-readable storage medium if implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to perform all or part of the steps of the method described in the embodiments of the present invention.
In the foregoing embodiments of the present invention, the description of each embodiment has its own emphasis; for a portion not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described apparatus embodiments are merely exemplary; the division of the units is merely a logical function division, and there may be other divisions in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed between components may be through some interfaces, units or modules, and may be in electrical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that modifications and adaptations may be made by those skilled in the art without departing from the principles of the present invention; such modifications and adaptations are intended to fall within the scope of the present invention.
Claims (10)
1. An image display method, comprising:
acquiring an image load parameter of a target source end, wherein the image load parameter is a display load occupied by a current desktop image displayed by the target source end, and the current desktop image is a desktop image displayed in an operating system of the target source end at the current moment;
comparing the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image is a desktop image displayed in an operating system in a target time period before the current moment of the target source end, and the image variable parameter is used for indicating the image change degree between the current desktop image and the reference desktop image;
determining a scene type of a display scene corresponding to the current desktop image displayed in the target source end according to the image load parameter and the image variable parameter;
and indicating a target receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
2. The method of claim 1, wherein after the obtaining the image load parameter of the target source, the method further comprises:
determining configuration information of the target source end, wherein the configuration information comprises image processing performance parameters and image display performance parameters of the target source end;
inquiring a scene load threshold corresponding to the target source end according to the configuration information in a load threshold database;
comparing the image load parameter to the scene load threshold.
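As an illustrative sketch (not part of the claims), the threshold query of claim 2 could be implemented as a lookup keyed by the source end's configuration information; the tier labels and threshold values below are assumptions introduced for illustration.

```python
# (image processing performance tier, image display performance tier) -> threshold
LOAD_THRESHOLD_DATABASE = {
    ("low_gpu", "1080p"): 0.50,
    ("low_gpu", "4k"): 0.40,
    ("high_gpu", "1080p"): 0.75,
    ("high_gpu", "4k"): 0.65,
}

def compare_with_scene_load_threshold(image_load, processing_tier, display_tier):
    """Query the threshold for this configuration and compare the image load against it."""
    threshold = LOAD_THRESHOLD_DATABASE[(processing_tier, display_tier)]
    return image_load >= threshold, threshold
```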
3. The method of claim 2, wherein comparing the current desktop image to the reference desktop image to obtain the image variable parameter comprises:
dividing the current desktop image into a plurality of current sub-images according to a preset desktop image dividing mode, and dividing the reference desktop image into a plurality of reference sub-images according to the dividing mode;
comparing the plurality of current sub-images with the plurality of reference sub-images to determine a target sub-image, wherein the target sub-image is used for representing a current sub-image whose picture content is inconsistent with that of the corresponding reference sub-image;
acquiring image quantity parameters according to the quantity of the target sub-images, and acquiring image area parameters according to the image areas of the target sub-images, wherein the image areas of the target sub-images represent the area positions of the target sub-images in the current desktop image according to preset area division;
and taking the image quantity parameter and the image area parameter as the image variable parameters.
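A minimal sketch of the segmentation step in claim 3, assuming the desktop image is available as a numpy array and an 8 x 8 grid is used as the preset dividing mode (both are assumptions made for illustration):

```python
import numpy as np

def split_into_subimages(image, grid=(8, 8)):
    """Divide a desktop frame (an H x W x C numpy array) into grid cells.

    Returns a dict mapping (row, col) positions to sub-images; the position key
    is the mapping later used to pair current and reference sub-images. The
    8 x 8 grid is an illustrative choice, not a segmentation mode from the patent.
    """
    h, w = image.shape[:2]
    rows, cols = grid
    return {
        (r, c): image[r * h // rows:(r + 1) * h // rows,
                      c * w // cols:(c + 1) * w // cols]
        for r in range(rows)
        for c in range(cols)
    }
```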
4. A method according to claim 3, wherein said comparing the plurality of current sub-images with the plurality of reference sub-images to determine a target sub-image comprises:
according to the position mapping relation corresponding to the segmentation mode, comparing the pixel points contained in the current sub-image with the pixel points contained in the corresponding reference sub-image one by one;
and taking the current sub-image which is inconsistent with the picture content displayed by the pixel points contained in the reference sub-image as the target sub-image.
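One possible reading of claim 4, again for illustration only: compare the sub-images paired by their grid positions and collect the positions whose pixel content differs. Exact pixel equality is used here for simplicity; a real implementation might tolerate small differences.

```python
import numpy as np

def find_target_subimages(current_subs, reference_subs):
    """Pair sub-images via their (row, col) position mapping and return the
    positions whose pixel content is inconsistent with the reference."""
    return [pos for pos, cur in current_subs.items()
            if not np.array_equal(cur, reference_subs[pos])]
```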
5. A method according to claim 3, wherein said acquiring the image quantity parameter according to the quantity of the target sub-images comprises:
calculating the quantity ratio of the target sub-images to the current sub-images according to the quantity of the target sub-images and the quantity of the current sub-images, and taking the quantity ratio as the image quantity parameter.
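For instance, if the segmentation yields 64 current sub-images and 12 of them are identified as target sub-images, the image quantity parameter would be 12/64 ≈ 0.19; the figures are purely illustrative and do not appear in the specification.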
6. A method according to claim 3, wherein said acquiring the image area parameter according to the image area of the target sub-image comprises:
determining a target area of the target sub-image in the current desktop image, wherein the target area is contained in a plurality of area positions of the current desktop image divided according to a preset area;
acquiring a historical target sub-image corresponding to a historical moment within the target time period, and determining a historical target area of the historical target sub-image in a historical desktop image;
when the target area and the historical target area are at the same area position, taking the same area position as the image area parameter;
and taking the different area positions as the image area parameter in the case that the target area and the historical target area are not at the same area position.
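A sketch of claim 6 under the same illustrative assumptions: map each changed sub-image position to a preset region (screen quadrants here) and compare the regions touched now with those touched at the historical moment. The quadrant split and the helper names are assumptions, not part of the claims.

```python
def region_of(pos, grid=(8, 8), regions=(2, 2)):
    """Map a (row, col) sub-image position to a coarser preset region
    (screen quadrants here; the 2 x 2 region split is an assumed choice)."""
    r, c = pos
    return (r * regions[0] // grid[0], c * regions[1] // grid[1])

def image_area_parameter(target_positions, history_positions):
    """Compare the regions touched by the current target sub-images with the
    regions touched at the historical moment; return the image area parameter."""
    current_regions = {region_of(p) for p in target_positions}
    history_regions = {region_of(p) for p in history_positions}
    return "same_region" if current_regions == history_regions else "different_region"
```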
7. The method according to claim 3, wherein determining a scene type of a display scene corresponding to the current desktop image displayed in the target source according to the image load parameter and the image variable parameter includes:
determining that the scene type of the display scene corresponding to the current desktop image displayed by the target source end is an office scene under the condition that the image load parameter is smaller than the scene load threshold, the image quantity parameter is smaller than the scene quantity threshold, and the image area parameter indicates different area positions;
and determining that the scene type of the display scene corresponding to the current desktop image displayed by the target source end is a game scene under the condition that the image load parameter is greater than or equal to the scene load threshold, or the image quantity parameter is greater than or equal to the scene quantity threshold, or the image area parameter indicates the same area position.
8. An image display device, comprising:
the acquisition module is used for acquiring an image load parameter of a target source end, wherein the image load parameter is a display load occupied by a current desktop image displayed by the target source end, and the current desktop image is a desktop image displayed in an operating system of the target source end at the current moment;
the comparison module is used for comparing the current desktop image with a reference desktop image to obtain an image variable parameter, wherein the reference desktop image is a desktop image displayed in an operating system in a target time period before the current moment of the target source end, and the image variable parameter is used for indicating the image change degree between the current desktop image and the reference desktop image;
the determining module is used for determining the scene type of the display scene corresponding to the current desktop image displayed in the target source end according to the image load parameter and the image variable parameter;
and the display module is used for indicating the target receiving end to display the current desktop image according to the image display mode corresponding to the scene type.
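For illustration, the module layout of claim 8 could be mirrored by a small class that wires the four modules together; the injected callables are placeholders, not an implementation taken from the patent.

```python
class ImageDisplayDevice:
    """Mirrors the module layout of claim 8 with placeholder callables."""

    def __init__(self, acquire, compare, determine, display):
        self.acquire = acquire      # acquisition module: image load parameter
        self.compare = compare      # comparison module: image variable parameter
        self.determine = determine  # determination module: scene type
        self.display = display      # display module: instruct the target receiving end

    def refresh(self, current_image, reference_image):
        load = self.acquire(current_image)
        variable = self.compare(current_image, reference_image)
        scene_type = self.determine(load, variable)
        self.display(current_image, scene_type)
```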
9. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program which, when run, performs the method of any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to execute the method according to any of the claims 1 to 7 by means of the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110327067.6A CN113076159B (en) | 2021-03-26 | 2021-03-26 | Image display method and device, storage medium and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110327067.6A CN113076159B (en) | 2021-03-26 | 2021-03-26 | Image display method and device, storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113076159A CN113076159A (en) | 2021-07-06 |
CN113076159B true CN113076159B (en) | 2024-02-27 |
Family
ID=76610526
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110327067.6A Active CN113076159B (en) | 2021-03-26 | 2021-03-26 | Image display method and device, storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113076159B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113254123A (en) * | 2021-05-11 | 2021-08-13 | 西安万像电子科技有限公司 | Cloud desktop scene identification method and device, storage medium and electronic device |
CN115002097B (en) * | 2022-04-25 | 2024-07-19 | 青岛海尔科技有限公司 | Application image display method and device, storage medium and electronic device |
CN115861030B (en) * | 2023-01-31 | 2023-07-25 | 南京砺算科技有限公司 | Graphics processor, system variable generation method thereof and medium |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109168068A (en) * | 2018-08-23 | 2019-01-08 | Oppo广东移动通信有限公司 | Method for processing video frequency, device, electronic equipment and computer-readable medium |
CN109218802A (en) * | 2018-08-23 | 2019-01-15 | Oppo广东移动通信有限公司 | Method for processing video frequency, device, electronic equipment and computer-readable medium |
CN109240576A (en) * | 2018-09-03 | 2019-01-18 | 网易(杭州)网络有限公司 | Image processing method and device, electronic equipment, storage medium in game |
WO2020073505A1 (en) * | 2018-10-11 | 2020-04-16 | 平安科技(深圳)有限公司 | Image processing method, apparatus and device based on image recognition, and storage medium |
CN111726533A (en) * | 2020-06-30 | 2020-09-29 | RealMe重庆移动通信有限公司 | Image processing method, image processing device, mobile terminal and computer readable storage medium |
Non-Patent Citations (1)
Title |
---|
Design and Analysis of Block Image Cache Optimization Based on the Spice Protocol; Deng Liping; Journal of Fujian Institute of Education (Issue 04); full text *
Also Published As
Publication number | Publication date |
---|---|
CN113076159A (en) | 2021-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113076159B (en) | Image display method and device, storage medium and electronic equipment | |
CN111681167B (en) | Image quality adjusting method and device, storage medium and electronic equipment | |
CN107645561B (en) | Picture preview method of cloud mobile phone | |
CN110852938B (en) | Display picture generation method, device and storage medium | |
CN108900843B (en) | Monochrome image compression method, apparatus, medium, and electronic device | |
CN113098946B (en) | Cloud desktop scene identification method and device, storage medium and electronic device | |
CN111506434B (en) | Task processing method and device and computer readable storage medium | |
US20190114989A1 (en) | Systems and methods for image optimization | |
CN109660508A (en) | Data visualization method, electronic device, computer equipment and storage medium | |
CN116384109A (en) | Novel power distribution network-oriented digital twin model automatic reconstruction method and device | |
WO2022121701A1 (en) | Image processing method and apparatus, electronic device, and storage medium | |
CN113064689A (en) | Scene recognition method and device, storage medium and electronic equipment | |
CN116168045B (en) | Method and system for dividing sweeping lens, storage medium and electronic equipment | |
JP2011249947A (en) | Image color subtraction apparatus, method and program | |
CN111683280A (en) | Video processing method and device and electronic equipment | |
CN117915088A (en) | Video processing method, video processing device, electronic equipment and computer readable storage medium | |
CN113254123A (en) | Cloud desktop scene identification method and device, storage medium and electronic device | |
CN112969027B (en) | Focusing method and device of electric lens, storage medium and electronic equipment | |
CN115550645A (en) | Method and device for determining intra-frame prediction mode, storage medium and electronic equipment | |
KR101526490B1 (en) | Visual data processing apparatus and method for Efficient resource management in Cloud Computing | |
CN114501060A (en) | Live broadcast background switching method and device, storage medium and electronic equipment | |
CN114630114A (en) | Intra-frame prediction method and device for video coding, storage medium and electronic equipment | |
CN113435515A (en) | Picture identification method and device, storage medium and electronic equipment | |
CN110232393B (en) | Data processing method and device, storage medium and electronic device | |
CN114168766A (en) | Data processing method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||