
CN111104193B - Application program active interface display method and device, storage medium and terminal equipment - Google Patents


Info

Publication number
CN111104193B
CN111104193B (application CN201811251734.1A)
Authority
CN
China
Prior art keywords
color
color feature
interface
target
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811251734.1A
Other languages
Chinese (zh)
Other versions
CN111104193A (en)
Inventor
陈杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Qihoo Technology Co Ltd
Original Assignee
Beijing Qihoo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Qihoo Technology Co Ltd
Priority to CN201811251734.1A
Publication of CN111104193A
Application granted
Publication of CN111104193B
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/04Context-preserving transformations, e.g. by using an importance map
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Image Processing (AREA)

Abstract

The application provides a method for displaying an application program active interface, which comprises the following steps: acquiring a target image and extracting reference color features in the target image, wherein the reference color features comprise colors and hues; fusing the reference color features based on a preset color feature fusion rule to obtain a target color; and acquiring at least one active interface on the display interface of the application program and replacing the color features displayed in the active interface with the target color. Because the colors and hues in the image are fused, the colors and hues perceived by the user can be applied to the display interface of the application program, making the display interface more personalized and improving the user's experience of using the application program.

Description

Application program active interface display method and device, storage medium and terminal equipment
Technical Field
The application relates to the technical field of software and image processing, in particular to a method and a device for displaying an application program active interface, a storage medium and terminal equipment.
Background
The development of computer technology has brought users ever-increasing convenience, and while enjoying this convenience, many users have gradually come to prefer customizing personalized interfaces according to their own preferences. For example, a user of a desktop operating system may use a favorite photo as the main tone of an application's display interface. In addition, more and more applications offer a variety of "skins" for the user to choose from, and many applications also provide skin-making tools so that the user can select a preferred picture to generate a skin. However, when the user changes the main display tone of an application's active interface, the tone and color of the application's whole display interface can only be replaced with the picture the user selected; in particular, when several active interfaces exist, their tones are all uniformly replaced with that picture. As a result, the personalized characteristics of a user-customized display interface cannot be reflected, and the user experience is poor.
Disclosure of Invention
In order to solve the problems, the application provides a method and a device for displaying an application program active interface, a storage medium and a terminal device, which can fuse colors and hues in images, so that the application program active interface is more personalized, and the experience of a user for using an application program is improved.
In order to achieve the above object, the present application adopts the following technical scheme:
The application relates to a method for displaying an application program active interface, which comprises the following steps of
Acquiring a target image, and extracting reference color features in the target image, wherein the reference color features comprise colors and hues;
fusing the reference color features based on a preset color feature fusion rule to obtain a target color;
and acquiring at least one active interface on the display interface of the application program, and replacing the color features displayed in the active interface with the target colors.
Optionally, the extracting the reference color feature in the target image includes:
traversing each pixel point in the target image, and comparing the traversed current pixel point with pixels of adjacent pixel points of the current pixel point to obtain a pixel difference value;
judging whether the pixel difference value is within a preset pixel difference value threshold value or not;
when the pixel difference value is within a preset pixel difference value threshold value, classifying the pixels of the adjacent pixel points into a color feature set where the current pixel point is located;
When the pixel difference value is not within a preset pixel difference value threshold value, classifying the pixels of the adjacent pixel points into other color feature sets;
All of the pixels in the set of color features together comprise the color feature.
Optionally, after classifying the pixel of the pixel point adjacent to the current pixel point into the color feature set where the current pixel point is not located when the pixel difference value is not within a preset pixel difference value threshold, the method includes:
Respectively counting the number of pixels in each color feature set, and comparing the number of pixels in each color feature set;
and taking the pixel in the color feature set with the largest pixel number as a main color feature, and taking the pixels in the rest color feature sets as secondary color features.
Optionally, the fusing the color features based on a preset color feature fusion rule to obtain the target color includes:
and fusing the primary color feature and the secondary color feature according to the pixel quantity proportion of the primary color feature and the secondary color feature to obtain the target color.
Optionally, when the application display interface includes more than two active interfaces, the acquiring at least one active interface on the application display interface, and replacing the color feature displayed in the active interface with the target color includes:
Acquiring all active interfaces on the application display interface;
dividing all the active interfaces into a first type interface and a second type interface;
And replacing the color features displayed in the first type interface and the second type interface with any one of the target color, the main color feature and the secondary color feature respectively, wherein the colors of the first type interface and the second type interface after replacement are different.
Optionally, the fusing the reference color features based on a preset color feature fusion rule to obtain a target color includes:
fusing the reference color features based on a preset color feature fusion rule to obtain a reference color;
acquiring a corrected image of the application program in the cloud according to the reference color, and extracting color features in the corrected image;
And correcting the reference color according to the color characteristics to obtain the target color.
Optionally, the acquiring the target image includes:
Acquiring a user historical image browsing record, and determining a historical image with the largest user browsing frequency according to the historical image browsing record;
the history image is determined as the target image.
The embodiment of the application also provides an application program active interface display device, which comprises:
The reference color feature extraction module is used for acquiring a target image and extracting reference color features in the target image, wherein the reference color features comprise colors and hues;
The target color obtaining module is used for fusing the reference color characteristics based on a preset color characteristic fusion rule to obtain target colors;
and the replacing module is used for acquiring at least one active interface on the display interface of the application program and replacing the color characteristics displayed in the active interface with the target colors.
Optionally, the reference color feature acquisition module includes:
The pixel difference value obtaining unit is used for traversing each pixel point in the target image, and comparing the traversed current pixel point with pixels of adjacent pixel points of the current pixel point to obtain a pixel difference value;
the judging unit is used for judging whether the pixel difference value is within a preset pixel difference value threshold value or not;
the first color feature classification unit is used for classifying pixels of the adjacent pixel points into a color feature set where the current pixel point is located when the pixel difference value is within a preset pixel difference value threshold value;
The second color feature classification unit is used for classifying pixels of the adjacent pixel points into other color feature sets when the pixel difference value is not within a preset pixel difference value threshold;
All of the pixels in the set of color features together comprise the color feature.
Optionally, the method further comprises:
The pixel number statistics unit is used for respectively counting the number of pixels in each color feature set and comparing the number of pixels in each color feature set;
A color feature determining unit, configured to take a color feature in the color feature set with the largest number of pixels as a primary color feature, and take a color feature in the remaining color feature set as a secondary color feature;
Optionally, the target color obtaining module includes:
And the fusion unit is used for fusing the main color feature and the secondary color feature according to the pixel quantity proportion of the main color feature and the secondary color feature to obtain the target color.
Optionally, when the application display interface includes more than two active interfaces, the replacing module includes:
the active interface acquisition unit is used for acquiring all active interfaces on the application display interface;
the dividing unit is used for dividing all the active interfaces into a first type interface and a second type interface;
The replacing unit is used for respectively replacing the color features displayed in the first type of interface and the second type of interface with any one of the target color, the main color feature and the secondary color feature, and the colors of the first type of interface and the second type of interface after replacement are different.
Optionally, the target color obtaining module includes:
The reference color obtaining unit is used for fusing the reference color features based on a preset color feature fusion rule to obtain reference colors;
The color feature extraction unit is used for acquiring a corrected image of the cloud according to the reference color and extracting color features in the corrected image;
and the correction unit is used for correcting the reference color according to the color characteristics to obtain the target color.
Optionally, the reference color feature extraction module includes:
The history image acquisition unit is used for acquiring a history image browsing record of a user, and determining a history image with the largest browsing frequency of the user according to the history image browsing record;
And a target image determining unit configured to determine the history image as the target image.
The embodiment of the application also provides a computer readable storage medium, wherein the computer readable storage medium is stored with a computer program, and the program realizes the display method of the application program active interface according to any technical scheme when being executed by a processor.
The embodiment of the application also provides a terminal device, which comprises:
one or more processors;
A memory;
One or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications configured to perform the steps of the application activity interface display method according to any of the claims.
Compared with the prior art, the application has the following beneficial effects:
1. The application provides a method for displaying an application program active interface, which comprises the following steps: acquiring a target image and extracting reference color features in the target image, wherein the reference color features comprise colors and hues; fusing the reference color features based on a preset color feature fusion rule to obtain a target color; and acquiring at least one active interface on the display interface of the application program and replacing the color features displayed in the active interface with the target color. According to the application, the color features of the target image are extracted and its colors and hues are fused, so that the obtained target color is consistent with the style of the target image rather than being a simple copy of the image, which better reflects the user's personalized settings. The user can thus apply the perceived color and hue to the display interface of the application program, making the display interface more personalized and improving the user's experience of using the application program.
2. The application provides a method for displaying an application program active interface in which, when the application display interface includes more than two active interfaces, acquiring at least one active interface on the application display interface and replacing the color features displayed in the active interface with the target color includes: acquiring all active interfaces on the application display interface; dividing all the active interfaces into a first type interface and a second type interface; and replacing the color features displayed in the first type interface and the second type interface with any one of the target color, the primary color feature and the secondary color feature respectively, wherein the colors of the first type interface and the second type interface after replacement are different. Through this process the colors on the application become closer to the colors of the target image: the active interfaces are divided into a first part and a second part whose color features do not repeat. For example, if the color replaced in the first part is the target color, the second part can be replaced with either the primary color feature or the secondary color feature, so that the two parts are visually distinct and can be clearly told apart, while the active interface remains perceptually close to the target image, making the active interface of the application display interface more eye-catching.
Drawings
FIG. 1 is a flowchart of an application program activity interface display method according to an embodiment of the application;
FIG. 2 is a schematic diagram of an application active interface display device according to another embodiment of the application;
fig. 3 is a schematic diagram of a basic structure of a terminal device according to another embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as will be understood by those skilled in the art. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs unless defined otherwise. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
According to the application program active interface display method, the color features in the target image are extracted and fused, and the color features obtained through fusion are replaced on the active interface of the application program, so that personalized customization of the color tone and the color of the display interface of the application program is improved, the target image is prevented from being directly replaced on the display interface of the application program, the color tone and the color of the display interface of the application program are richer, the participation in personalized customization of the display color and the color tone of the display interface of the application program of a user is improved, and the experience of the user using the application program is further improved.
The following embodiment discloses a method for displaying an application program active interface, as shown in fig. 1, including:
s100: and acquiring a target image, and extracting reference color features in the target image, wherein the reference color features comprise colors and hues.
In an embodiment of the present application, the target image may be a user-selected image or a history image. When the target image is a history image, optionally, the acquiring the target image includes:
Acquiring a user historical image browsing record, and determining a historical image with the largest user browsing frequency according to the historical image browsing record;
the history image is determined as the target image.
For the history image, the number of browsing records or clicks for each image can be counted from the user's records of browsing images, such as the user's image browsing history within the application or the images the user has clicked in the application, and the history image with the largest count is taken as the user's target image.
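As a minimal sketch of this selection step (the record format and the function name below are illustrative assumptions, not taken from the patent), the most frequently browsed image can be picked with a simple counter:

```python
from collections import Counter

def pick_target_image(browse_records):
    # browse_records: iterable of image identifiers, one entry per browse or click event
    counts = Counter(browse_records)
    image_id, _ = counts.most_common(1)[0]
    return image_id
```

For example, pick_target_image(["img_01", "img_02", "img_01"]) returns "img_01", the image with the most browsing records.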
A typical image contains various colors, and each color has a different hue, that is, a different degree of brightness. After the target image is determined, in order to reflect the user's personalized customization of the application display interface while avoiding placing the target image directly onto that interface, the color features in the target image are extracted. These color features include hues and colors, which makes it convenient to fuse them later; furthermore, hue and color correspond to and are associated with each other, and one color may have multiple hues.
Optionally, the extracting the reference color feature in the target image includes:
traversing each pixel point in the target image, and comparing the traversed current pixel point with pixels of adjacent pixel points of the current pixel point to obtain a pixel difference value;
judging whether the pixel difference value is within a preset pixel difference value threshold value or not;
when the pixel difference value is within a preset pixel difference value threshold value, classifying the pixels of the adjacent pixel points into a color feature set where the current pixel point is located;
When the pixel difference value is not within a preset pixel difference value threshold value, classifying the pixels of the adjacent pixel points into other color feature sets;
All of the pixels in the set of color features together comprise the color feature.
In combination with the foregoing, in the process of extracting the reference color features, after the user uploads an image, each pixel point in the image is traversed, the pixel value of the traversed current pixel point is compared with the pixel value of its adjacent pixel point to determine a pixel difference value, and it is judged whether this difference is too large. When the pixel difference value is within the pixel difference threshold, the difference is small and the two pixels can be classified as the same type of color feature, so the adjacent pixel is classified into the color feature set where the current pixel point is located. If the pixel difference value is not within the threshold, the difference between the current pixel and the adjacent pixel is larger, possibly with a larger color difference and a larger hue difference (the degree of brightness of the same primary color), so the adjacent pixel is classified into another color feature set. Therefore, in a specific implementation, a plurality of color feature sets can be obtained based on the colors and hues in the target image, which makes it convenient to perform the subsequent fusion on the pixels in these sets, so that the color features of the application program active interface are consistent with the style of the target image rather than directly copying the colors of the target image.
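The following Python sketch illustrates one simplified, single-pass variant of this neighbour-comparison grouping. The use of Pillow and NumPy, the sum-of-channel-differences metric and the threshold value are assumptions for illustration only; the patent does not fix a concrete difference metric or threshold.

```python
from PIL import Image
import numpy as np

def extract_color_feature_sets(image_path, threshold=30):
    pixels = np.asarray(Image.open(image_path).convert("RGB"), dtype=int)
    h, w, _ = pixels.shape
    labels = -np.ones((h, w), dtype=int)   # -1: not yet assigned to any colour feature set
    sets = []                              # sets[k] collects the pixel values of set k

    for y in range(h):
        for x in range(w):
            if labels[y, x] == -1:         # start a new colour feature set
                labels[y, x] = len(sets)
                sets.append([pixels[y, x]])
            current = labels[y, x]
            # compare the current pixel with its right and lower neighbours
            for ny, nx in ((y, x + 1), (y + 1, x)):
                if ny < h and nx < w and labels[ny, nx] == -1:
                    diff = int(np.abs(pixels[y, x] - pixels[ny, nx]).sum())
                    if diff <= threshold:  # within the preset pixel difference threshold
                        labels[ny, nx] = current
                        sets[current].append(pixels[ny, nx])
                    # otherwise the neighbour will seed or join a different set later
    return sets
```

Each returned set collects the raw pixel values of one colour feature, so the later steps can work directly with pixel counts and mean colours per set.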
Optionally, after classifying the pixel of the pixel point adjacent to the current pixel point into the color feature set where the current pixel point is not located when the pixel difference value is not within a preset pixel difference value threshold, the method includes:
Respectively counting the number of pixels in each color feature set, and comparing the number of pixels in each color feature set;
and taking the color feature in the color feature set with the largest pixel number as a main color feature, and taking the color feature in the rest color feature set as a secondary color feature.
In view of the above, the pixel values within one color feature set are relatively close to each other. With the above classification method, if the target image has only one color feature this step can be omitted, but if the target image has multiple color features, multiple color feature sets are obtained and the primary color feature needs to be distinguished. Specifically, the number of pixel points in each color feature set is counted and compared; when a certain color feature set has the largest number of pixels, its color feature corresponds to the main color and tone of the target image. To make the color and tone displayed on the application's active interface closer to the target image, the pixels in the color feature set with the largest pixel count are taken as the primary color feature, and the pixels in the remaining color feature sets as the secondary color features. The secondary color feature sets may further be sorted by pixel count (e.g., from more to fewer), so that the proportion of each color feature in the target image can be determined from the pixel counts and used during fusion. Of course, the proportions may also be preset by a developer.
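Continuing the sketch above (the function and field names are illustrative assumptions), the primary and secondary colour features and their pixel-count proportions can be derived as follows:

```python
import numpy as np

def split_primary_secondary(color_feature_sets):
    # color_feature_sets: list of lists of RGB pixel values, as produced by the
    # extraction sketch above
    sizes = [len(s) for s in color_feature_sets]
    total = sum(sizes)
    order = sorted(range(len(color_feature_sets)), key=lambda k: sizes[k], reverse=True)
    primary = {"color": np.mean(color_feature_sets[order[0]], axis=0),
               "ratio": sizes[order[0]] / total}
    secondary = [{"color": np.mean(color_feature_sets[k], axis=0),
                  "ratio": sizes[k] / total}
                 for k in order[1:]]
    return primary, secondary
```

Representing each set by its mean colour is one convenient choice; the patent itself only requires that the set with the most pixels supplies the primary colour feature.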
S200: and fusing the reference color features based on a preset color feature fusion rule to obtain a target color.
Optionally, the fusing the color features based on a preset color feature fusion rule to obtain the target color includes:
and fusing the primary color feature and the secondary color feature according to the pixel quantity proportion of the primary color feature and the secondary color feature to obtain the target color.
In order to obtain a better color result, the reference color features are fused according to a preset color feature fusion rule to obtain the target color. As described above, the primary and secondary color features have already been determined during color feature extraction, so when determining the target color the primary color feature forms the main component of the fusion, and the proportion of each color feature in the fusion can be determined from the numbers of primary-feature and secondary-feature pixels; during fusion, blending is performed based on the relationship between hue and color in the original image. The same color can have various hues, that is, various degrees of brightness and darkness. When multiple colors are present in an image, the different colors affect the perceived brightness of the whole image, so during fusion the brightness of the primary color feature can be adjusted based on the hue and color of the secondary color features. For example, suppose the target image contains goose yellow and green, where the green, a mint green, is the primary color feature. Mint green is not a particularly bright color, yet against the goose yellow the tone of the whole image appears brighter. Therefore, after the primary color feature (green) is extracted, to bring the target color closer to the target image, the green is adjusted based on the goose-yellow tone in the target image, that is, its hue is brightened, so that the resulting target color is closer to the target image.
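A minimal sketch of the proportional part of such a fusion rule is shown below. The linear RGB blend weighted by pixel-count ratios is an assumption for illustration; the hue and brightness adjustment described above would be layered on top of it.

```python
import numpy as np

def fuse_target_color(primary, secondary):
    # Blend the primary and secondary colours weighted by their pixel-count ratios.
    colors = [primary["color"]] + [s["color"] for s in secondary]
    weights = np.array([primary["ratio"]] + [s["ratio"] for s in secondary])
    weights = weights / weights.sum()
    target = np.zeros(3)
    for w, c in zip(weights, colors):
        target += w * np.asarray(c, dtype=float)
    return tuple(int(round(v)) for v in target)
```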
S300: and acquiring at least one active interface on the display interface of the application program, and replacing the color features displayed in the active interface with the target colors.
The target color is obtained through the above process. To make the active interface on the application display interface show the target color, at least one active interface on the application display interface is acquired and the color features displayed in it are replaced with the target color; if there is only one active interface, the color of that interface is replaced with the target color.
Optionally, when the application display interface includes more than two active interfaces, the acquiring at least one active interface on the application display interface, and replacing the color feature displayed in the active interface with the target color includes:
Acquiring all active interfaces on the application display interface;
dividing all the active interfaces into a first type interface and a second type interface;
And replacing the color features displayed in the first type interface and the second type interface with any one of the target color, the main color feature and the secondary color feature respectively, wherein the colors of the first type interface and the second type interface after replacement are different.
In order to make the colors on the application closer to those of the target image, the active interfaces are divided into a first part and a second part. The first part may be the reference active interfaces of the application, such as the navigation bar, the menu bar and the page background; the second part may be the classification icons and similar elements on the application display interface. The color features of the two parts are each replaced with any one of the target color, the primary color feature and the secondary color feature, and the color features of the two parts do not repeat. For example, when the color feature replaced in the first part is the target color, the second part may be replaced with either the primary color feature or the secondary color feature, so that the difference between the first part and the second part is visible and the active interface of the application display interface is more eye-catching.
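A minimal sketch of this assignment step is given below. The dictionary-based representation of interfaces and the "first"/"second" labels are illustrative stand-ins for actual UI views, which the patent does not specify.

```python
def recolor_active_interfaces(interfaces, target_color, primary, secondary):
    # interfaces: dict mapping an interface name to its group, e.g.
    #   {"navigation_bar": "first", "menu_bar": "first", "category_icon": "second"}
    # This dict is an illustrative stand-in for actual UI views.
    alt = secondary[0]["color"] if secondary else primary["color"]
    alt_color = tuple(int(round(v)) for v in alt)
    assigned = {}
    for name, group in interfaces.items():
        # first-type interfaces (navigation bar, menu bar, page background) receive
        # the fused target colour; second-type interfaces (category icons, etc.)
        # receive a different colour so the two groups never repeat
        assigned[name] = target_color if group == "first" else alt_color
    return assigned
```

For example, recolor_active_interfaces({"navigation_bar": "first", "category_icon": "second"}, target, primary, secondary) would give the navigation bar the fused target colour and the icon a different, non-repeating colour.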
Optionally, the fusing the reference color features based on a preset color feature fusion rule to obtain a target color includes:
fusing the reference color features based on a preset color feature fusion rule to obtain a reference color;
acquiring a corrected image of the application program in the cloud according to the reference color, and extracting color features in the corrected image;
And correcting the reference color according to the color characteristics to obtain the target color.
On the basis of the above, the degree to which color and tone are fused in the target color obtained from the target image alone may be insufficient, so the target color may not look ideal. To fuse the color and tone in the target color more fully, the reference color features in the target image are first fused as described above to obtain a reference color; a corrected image close to that reference color is then obtained by the application from the cloud, the color features in the corrected image are extracted, and the reference color is corrected based on those color features, for example by increasing or reducing the brightness of some colors. In this way the color and tone in the target color are blended more thoroughly, and when the corrected target color is applied to the active interfaces of the application display interface, the colors of the various parts look better, the personalized characteristics of the user are reflected, and the user experience is improved. It should be noted that the hue adjustment during color feature extraction is as described above and is not repeated here.
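A sketch of this correction variant, reusing the helper sketches above, might look as follows. The fetch_corrected_image callable stands in for the cloud lookup and the blending weight alpha is an illustrative assumption; neither is specified by the patent.

```python
import numpy as np

def correct_with_cloud_image(reference_color, fetch_corrected_image, alpha=0.3):
    # fetch_corrected_image: hypothetical callable that asks the cloud for an image
    # close to the reference colour and returns a local file path (an assumption).
    corrected_path = fetch_corrected_image(reference_color)
    sets = extract_color_feature_sets(corrected_path)    # helper sketched earlier
    primary, _ = split_primary_secondary(sets)           # helper sketched earlier
    ref = np.asarray(reference_color, dtype=float)
    corr = np.asarray(primary["color"], dtype=float)
    # nudge the reference colour toward the corrected image's dominant colour,
    # e.g. to brighten or darken it; alpha is an illustrative blending weight
    target = (1 - alpha) * ref + alpha * corr
    return tuple(int(round(v)) for v in target)
```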
The embodiment of the application also provides an application program active interface display device, in one implementation manner, as shown in fig. 2, the device comprises:
The reference color feature extraction module 100 is configured to obtain a target image, and extract reference color features in the target image, where the reference color features include color and hue.
In an embodiment of the present application, the target image may be a user-selected image or a history image. When the target image is a history image, optionally, the reference color feature extraction module includes:
The history image acquisition unit is used for acquiring a history image browsing record of a user, and determining a history image with the largest browsing frequency of the user according to the history image browsing record;
And a target image determining unit configured to determine the history image as the target image.
For the history image, the number of browsing records or clicks for each image can be counted from the user's records of browsing images, such as the user's image browsing history within the application or the images the user has clicked in the application, and the history image with the largest count is taken as the user's target image.
A typical image contains various colors, and each color has a different hue, that is, a different degree of brightness. After the target image is determined, in order to reflect the user's personalized customization of the application display interface while avoiding placing the target image directly onto that interface, the color features in the target image are extracted. These color features include hues and colors, which makes it convenient to fuse them later; furthermore, hue and color correspond to and are associated with each other.
Optionally, the reference color feature acquisition module includes:
The pixel difference value obtaining unit is used for traversing each pixel point in the target image, and comparing the traversed current pixel point with pixels of adjacent pixel points of the current pixel point to obtain a pixel difference value;
the judging unit is used for judging whether the pixel difference value is within a preset pixel difference value threshold value or not;
the first color feature classification unit is used for classifying pixels of the adjacent pixel points into a color feature set where the current pixel point is located when the pixel difference value is within a preset pixel difference value threshold value;
The second color feature classification unit is used for classifying pixels of the adjacent pixel points into other color feature sets when the pixel difference value is not within a preset pixel difference value threshold;
All of the pixels in the set of color features together comprise the color feature.
In combination with the foregoing, in the process of extracting the reference color features, after the user uploads an image, each pixel point in the image is traversed, the pixel value of the traversed current pixel point is compared with the pixel value of its adjacent pixel point to determine a pixel difference value, and it is judged whether this difference is too large. When the pixel difference value is within the pixel difference threshold, the difference is small and the two pixels can be classified as the same type of color feature, so the adjacent pixel is classified into the color feature set where the current pixel point is located. If the pixel difference value is not within the threshold, the difference between the current pixel and the adjacent pixel is larger, possibly with a larger color difference and a larger hue difference (the degree of brightness of the same primary color), so the adjacent pixel is classified into another color feature set. Therefore, in a specific implementation, a plurality of color feature sets can be obtained based on the colors and hues in the target image, which makes it convenient to perform the subsequent fusion on the pixels in these sets, so that the color features of the application program active interface are consistent with the style of the target image rather than directly copying the colors of the target image.
Optionally, the method further comprises:
The pixel number statistics unit is used for respectively counting the number of the pixel points corresponding to the first color feature and the number of the pixel points corresponding to the second color feature;
And the first color feature determining unit is used for taking the color feature in the color feature set with the largest pixel number as a main color feature and taking the color feature in the rest color feature sets as a secondary color feature.
In view of the above, the pixel values within one color feature set are relatively close to each other. With the above classification method, if the target image has only one color feature this step can be omitted, but if the target image has multiple color features, multiple color feature sets are obtained and the primary color feature needs to be distinguished. Specifically, the number of pixel points in each color feature set is counted and compared; when a certain color feature set has the largest number of pixels, its color feature corresponds to the main color and tone of the target image. To make the color and tone displayed on the application's active interface closer to the target image, the pixels in the color feature set with the largest pixel count are taken as the primary color feature, and the pixels in the remaining color feature sets as the secondary color features. The secondary color feature sets may further be sorted by pixel count (e.g., from more to fewer), so that the proportion of each color feature in the target image can be determined from the pixel counts and used during fusion. Of course, the proportions may also be preset by a developer.
The target color obtaining module 200 is configured to fuse the reference color features based on a preset color feature fusion rule to obtain a target color;
Optionally, the target color obtaining module includes:
And the fusion unit is used for fusing the main color feature and the secondary color feature according to the pixel quantity proportion of the main color feature and the secondary color feature to obtain the target color.
In order to obtain a better color result, the reference color features are fused according to a preset color feature fusion rule to obtain the target color. As described above, the primary and secondary color features have already been determined during color feature extraction, so when determining the target color the primary color feature forms the main component of the fusion, and the proportion of each color feature in the fusion can be determined from the numbers of primary-feature and secondary-feature pixels; during fusion, blending is performed based on the relationship between hue and color in the original image. The same color can have various hues, that is, various degrees of brightness and darkness. When multiple colors are present in an image, the different colors affect the perceived brightness of the whole image, so during fusion the brightness of the primary color feature can be adjusted based on the hue and color of the secondary color features. For example, suppose the target image contains goose yellow and green, where the green, a mint green, is the primary color feature. Mint green is not a particularly bright color, yet against the goose yellow the tone of the whole image appears brighter. Therefore, after the primary color feature (green) is extracted, to bring the target color closer to the target image, the green is adjusted based on the goose-yellow tone in the target image, that is, its hue is brightened, so that the resulting target color is closer to the target image.
And the replacing module 300 is configured to obtain at least one active interface on the application display interface, and replace the color feature displayed in the active interface with the target color.
The target color is obtained through the above process. To make the active interface on the application display interface show the target color, at least one active interface on the application display interface is acquired and the color features displayed in it are replaced with the target color; if there is only one active interface, the color of that interface is replaced with the target color.
Optionally, when the application display interface includes more than two active interfaces, the replacing module includes:
the active interface acquisition unit is used for acquiring all active interfaces on the application display interface;
the dividing unit is used for dividing all the active interfaces into a first type interface and a second type interface;
The replacing unit is used for respectively replacing the color features displayed in the first type of interface and the second type of interface with any one of the target color, the main color feature and the secondary color feature, and the colors of the first type of interface and the second type of interface after replacement are different.
In order to make the colors on the application closer to those of the target image, the active interfaces are divided into a first part and a second part. The first part may be the reference active interfaces of the application, such as the navigation bar, the menu bar and the page background; the second part may be the classification icons and similar elements on the application display interface. The color features of the two parts are each replaced with any one of the target color, the primary color feature and the secondary color feature, and the color features of the two parts do not repeat. For example, when the color feature replaced in the first part is the target color, the second part may be replaced with either the primary color feature or the secondary color feature, thereby making the difference between the first part and the second part visible.
Optionally, the target color obtaining module includes:
A reference color obtaining unit for obtaining a reference color by fusing the reference color features based on a preset color feature fusion rule;
the color feature extraction unit is used for acquiring a corrected image of the application program in the cloud according to the reference color and extracting color features in the corrected image;
and the correction unit is used for correcting the reference color according to the color characteristics to obtain the target color.
On the basis of the above, the degree to which color and tone are fused in the target color obtained from the target image alone may be insufficient, so the target color may not look ideal. To fuse the color and tone in the target color more fully, the reference color features in the target image are first fused as described above to obtain a reference color; a corrected image close to that reference color is then obtained by the application from the cloud, the color features in the corrected image are extracted, and the reference color is corrected based on those color features, for example by increasing or reducing the brightness of some colors. In this way the color and tone in the target color are blended more thoroughly, and when the corrected target color is applied to the active interfaces of the application display interface, the colors of the various parts look better, the personalized characteristics of the user are reflected, and the user experience is improved. It should be noted that the hue adjustment during color feature extraction is as described above and is not repeated here.
The present invention also provides a terminal device, as shown in fig. 3. For convenience of explanation, only the parts relevant to the embodiments of the present invention are shown; for specific technical details that are not disclosed, please refer to the method parts of the embodiments of the present invention. The terminal may be any terminal device, including a desktop computer, a tablet computer, a PDA (Personal Digital Assistant), a mobile phone, a POS (Point of Sale) terminal, a vehicle-mounted computer and the like. Taking a mobile phone as an example of the terminal:
Fig. 3 is a block diagram showing part of the structure of a mobile phone related to the terminal provided by an embodiment of the present invention. Referring to fig. 3, the mobile phone includes: Radio Frequency (RF) circuitry 1510, memory 1520, input unit 1530, display unit 1540, sensor 1550, audio circuitry 1560, Wireless Fidelity (Wi-Fi) module 1570, processor 1580, power supply 1590, and the like. Those skilled in the art will appreciate that the handset configuration shown in fig. 3 is not limiting; a handset may include more or fewer components than shown, combine certain components, or arrange the components differently.
The following describes the components of the mobile phone in detail with reference to fig. 3:
The RF circuit 1510 may be used for receiving and transmitting signals during messaging or a call. In particular, downlink information received from a base station is passed to the processor 1580 for processing, and uplink data to be sent is transmitted to the base station. Generally, RF circuitry 1510 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 1510 may also communicate with networks and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 1520 may be used to store software programs and modules, and the processor 1580 performs various functional applications and data processing of the cellular phone by executing the software programs and modules stored in the memory 1520. The memory 1520 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a voiceprint play function, an image play function, etc.), and the like; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, memory 1520 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 1530 may be used to receive input numerical or character information and generate key signal inputs related to user settings and function control of the handset. In particular, the input unit 1530 may include a touch panel 1531 and other input devices 1532. The touch panel 1531, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on the touch panel 1531 or thereabout by using any suitable object or accessory such as a finger, a stylus, etc.), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch panel 1531 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, and sends the touch point coordinates to the processor 1580, and can receive and execute commands sent from the processor 1580. In addition, the touch panel 1531 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 1530 may include other input devices 1532 in addition to the touch panel 1531. In particular, other input devices 1532 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, mouse, joystick, etc.
The display unit 1540 may be used to display information input by or provided to the user, as well as the application program interface. The display unit 1540 may include a display panel 1541, which may optionally be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, the touch panel 1531 may cover the display panel 1541; when the touch panel 1531 detects a touch operation on or near it, the operation is transferred to the processor 1580 to determine the type of touch event, and the processor 1580 then provides a corresponding visual output on the display panel 1541 according to the type of touch event. Although in fig. 3 the touch panel 1531 and the display panel 1541 are two separate components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 1531 may be integrated with the display panel 1541 to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1550, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 1541 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1541 and/or the backlight when the phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and direction when stationary, and can be used for applications of recognizing the gesture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and knocking), and the like; other sensors such as gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc. that may also be configured with the handset are not described in detail herein.
Audio circuitry 1560, a speaker 1561, and a microphone 1562 may provide an audio interface between a user and a cell phone. The audio circuit 1560 may transmit the received electrical signal converted from audio data to the speaker 1561, and be converted into a voiceprint signal by the speaker 1561 for output; on the other hand, the microphone 1562 converts the collected voiceprint signals into electrical signals, which are received by the audio circuit 1560 and converted into audio data, which are then processed by the audio data output processor 1580 for transmission, for example, to another cell phone via the RF circuit 1510 or for output to the memory 1520 for further processing.
Wi-Fi belongs to a short-distance wireless transmission technology, and a mobile phone can help a user to send and receive e-mails, browse webpages, access streaming media and the like through a Wi-Fi module 1570, so that wireless broadband Internet access is provided for the user. While fig. 3 shows Wi-Fi module 1570, it is to be understood that it is not an essential component of a cell phone and may be omitted entirely as desired without changing the essence of the invention.
The processor 1580 is a control center of the mobile phone, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions and processes data of the mobile phone by running or executing software programs and/or modules stored in the memory 1520 and calling data stored in the memory 1520, thereby performing overall monitoring of the mobile phone. In the alternative, processor 1580 may include one or more processing units; preferably, the processor 1580 can integrate an application processor and a modem processor, wherein the application processor primarily processes operating systems, user interfaces, application programs, and the like, and the modem processor primarily processes wireless communications. It is to be appreciated that the modem processor described above may not be integrated into the processor 1580.
The handset further includes a power supply 1590 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 1580 via a power management system so as to provide for the management of charging, discharging, and power consumption by the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which will not be described herein.
In the embodiment of the present invention, the processor 1580 included in the terminal device further has the following functions: acquiring a target image, and extracting reference color features in the target image, wherein the reference color features comprise colors and hues; fusing the reference color features based on a preset color feature fusion rule to obtain a target color; and acquiring at least one active interface on the display interface of the application program, and replacing the color features displayed in the active interface with the target colors. That is, the processor 1580 has the function of executing the method for displaying an application program active interface in any of the above embodiments, which is not described herein.
The invention also provides a computer readable storage medium, wherein the computer readable storage medium stores a computer program, and the program realizes the display method of the application program active interface when being executed by a processor.
Those skilled in the art will appreciate that the present invention includes apparatuses for performing one or more of the operations described herein. These apparatuses may be specially designed and constructed for the required purposes, or may comprise known devices in general-purpose computers that are selectively activated or reconfigured by computer programs stored in them. Such a computer program may be stored in a device-readable (e.g., computer-readable) medium, or in any type of medium suitable for storing electronic instructions and coupled to a bus, including, but not limited to, any type of disk (such as floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks), ROMs, RAMs (Random Access Memories), EPROMs (Erasable Programmable Read-Only Memories), EEPROMs (Electrically Erasable Programmable Read-Only Memories), flash memory, magnetic cards, or optical cards. That is, a readable medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer).
It should be understood that, although the steps in the flowcharts of the figures are shown sequentially as indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages that are not necessarily completed at the same moment but may be executed at different times, and whose execution order is not necessarily sequential; they may be performed in turn or alternately with other steps, or with at least a portion of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present invention. It should be noted that those skilled in the art can make various modifications and adaptations without departing from the principles of the present invention, and such modifications and adaptations are also intended to fall within the scope of the present invention.

Claims (14)

1. An application program active interface display method, comprising:
acquiring a target image, and extracting reference color features from the target image, wherein the reference color features comprise colors and hues, and the reference color features comprise a primary color feature and a secondary color feature;
fusing the reference color features based on a preset color feature fusion rule to obtain a target color;
acquiring at least one active interface on an application display interface, and replacing color features displayed in the active interface with the target color;
wherein the fusing the reference color features based on the preset color feature fusion rule to obtain the target color comprises:
fusing the primary color feature and the secondary color feature according to the proportion of the numbers of pixels of the primary color feature and the secondary color feature, to obtain the target color.
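To make the proportion rule concrete, the hypothetical snippet below blends a primary and a secondary color channel by channel, weighted by their pixel counts; the colors, counts, and function name are invented for the example and are not taken from the patent.

```python
def fuse_by_pixel_proportion(primary, secondary, n_primary, n_secondary):
    """Blend two RGB colors in proportion to the number of pixels each one covers."""
    total = n_primary + n_secondary
    return tuple(
        round((p * n_primary + s * n_secondary) / total)
        for p, s in zip(primary, secondary)
    )


# A primary color covering 6000 pixels and a secondary color covering 2000 pixels:
# the result lies three quarters of the way toward the primary color.
print(fuse_by_pixel_proportion((200, 40, 40), (40, 40, 200), 6000, 2000))  # (160, 40, 80)
```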
2. The application program active interface display method according to claim 1, wherein the extracting the reference color features from the target image comprises:
traversing each pixel point in the target image, and comparing the traversed current pixel point with pixels of adjacent pixel points of the current pixel point to obtain a pixel difference value;
judging whether the pixel difference value is within a preset pixel difference value threshold value or not;
when the pixel difference value is within a preset pixel difference value threshold value, classifying the pixels of the adjacent pixel points into a color feature set where the current pixel point is located;
when the pixel difference value is not within the preset pixel difference value threshold, classifying the pixels of the adjacent pixel points into other color feature sets;
wherein all of the pixels in a color feature set together constitute a color feature.
3. The application program active interface display method according to claim 2, wherein after classifying the pixels of the adjacent pixel points into the other color feature sets when the pixel difference value is not within the preset pixel difference value threshold, the method comprises:
respectively counting the number of pixels in each color feature set, and comparing the numbers of pixels of the color feature sets;
and taking the color feature in the color feature set with the largest number of pixels as the primary color feature, and taking the color features in the remaining color feature sets as secondary color features.
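Read together, claims 2 and 3 describe grouping pixels whose mutual difference stays within a threshold into color feature sets and then ranking the sets by size. The sketch below is one simplified reading of that procedure, using an invented per-channel difference metric and a one-dimensional scan of a single pixel row for brevity; it is not the patented algorithm itself.

```python
def pixel_difference(a, b):
    # Invented metric: the largest per-channel difference between two RGB pixels.
    return max(abs(x - y) for x, y in zip(a, b))


def group_color_features(pixels, threshold=24):
    """Group a row of RGB pixels into color feature sets and rank them by size.

    A pixel joins the previous pixel's set when their difference is within the
    threshold; otherwise it opens a new set (claim 2). The largest set then
    supplies the primary color feature, the rest supply secondary ones (claim 3).
    """
    sets = [[pixels[0]]]
    for prev, cur in zip(pixels, pixels[1:]):
        if pixel_difference(prev, cur) <= threshold:
            sets[-1].append(cur)
        else:
            sets.append([cur])
    sets.sort(key=len, reverse=True)
    primary = sets[0][0]                  # representative pixel of the largest set
    secondary = [s[0] for s in sets[1:]]  # representatives of the remaining sets
    return primary, secondary


row = [(10, 10, 10)] * 5 + [(200, 200, 200)] * 3 + [(12, 14, 11)] * 2
print(group_color_features(row))  # primary (10, 10, 10); the other sets become secondary
```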
4. The application program active interface display method according to claim 3, wherein, when two or more active interfaces are included on the application display interface, the acquiring at least one active interface on the application display interface and replacing the color features displayed in the active interface with the target color comprises:
acquiring all active interfaces on the application display interface;
dividing all the active interfaces into a first type interface and a second type interface;
and replacing the color features displayed in the first type interface and in the second type interface each with any one of the target color, the primary color feature, and the secondary color feature, wherein the first type interface and the second type interface have different colors after replacement.
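One possible reading of this arrangement is sketched below; the interface names and the pairing of colors to interface types are invented for illustration, since the claim leaves both open.

```python
def recolor_interfaces(interfaces, first_color, second_color):
    """Give first-type and second-type interfaces different replacement colors.

    `interfaces` maps an interface name to its type, either "first" or "second".
    """
    if first_color == second_color:
        raise ValueError("the two interface types must receive different colors")
    return {name: first_color if kind == "first" else second_color
            for name, kind in interfaces.items()}


# Hypothetical pairing: the fused target color for first-type interfaces and the
# primary color feature for second-type interfaces.
print(recolor_interfaces({"toolbar": "first", "banner": "second"},
                         first_color=(160, 40, 80), second_color=(200, 40, 40)))
```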
5. The application program active interface display method according to claim 1, wherein the fusing the reference color features based on the preset color feature fusion rule to obtain the target color comprises:
fusing the reference color features based on a preset color feature fusion rule to obtain a reference color;
acquiring, from the cloud, a corrected image of the application program according to the reference color, and extracting color features from the corrected image;
and correcting the reference color according to the color features to obtain the target color.
6. The application program active interface display method according to claim 1, wherein the acquiring the target image comprises:
acquiring a user's historical image browsing record, and determining, according to the historical image browsing record, the historical image browsed most frequently by the user;
and determining the historical image as the target image.
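In other words, target-image selection here is a frequency lookup over the browsing history. A minimal sketch with an invented record format (one entry per view) follows:

```python
from collections import Counter

# Hypothetical browsing record: one entry each time the user viewed an image.
history = ["beach.jpg", "cat.jpg", "beach.jpg", "city.jpg", "beach.jpg"]

target_image, views = Counter(history).most_common(1)[0]
print(target_image, views)  # "beach.jpg" was browsed most often, so it becomes the target image
```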
7. An application program active interface display device, comprising:
the reference color feature extraction module is used for acquiring a target image and extracting reference color features from the target image, wherein the reference color features comprise colors and hues, and the reference color features comprise a primary color feature and a secondary color feature;
the target color obtaining module is used for fusing the reference color features based on a preset color feature fusion rule to obtain a target color, wherein the fusing the reference color features based on the preset color feature fusion rule to obtain the target color comprises: fusing the primary color feature and the secondary color feature according to the proportion of the numbers of pixels of the primary color feature and the secondary color feature, to obtain the target color;
and the replacing module is used for acquiring at least one active interface on the display interface of the application program, and replacing the color features displayed in the active interface with the target color.
8. The application program active interface display device of claim 7, wherein the reference color feature extraction module comprises:
the pixel difference value obtaining unit is used for traversing each pixel point in the target image, and comparing the traversed current pixel point with pixels of adjacent pixel points of the current pixel point to obtain a pixel difference value;
the judging unit is used for judging whether the pixel difference value is within a preset pixel difference value threshold value or not;
the first color feature classification unit is used for classifying pixels of the adjacent pixel points into a color feature set where the current pixel point is located when the pixel difference value is within a preset pixel difference value threshold value;
the second color feature classification unit is used for classifying the pixels of the adjacent pixel points into other color feature sets when the pixel difference value is not within the preset pixel difference value threshold;
wherein all of the pixels in a color feature set together constitute a color feature.
9. The application program active interface display device of claim 8, further comprising:
the pixel number statistics unit is used for respectively counting the number of pixels in each color feature set, and comparing the numbers of pixels of the color feature sets;
and the color feature determining unit is used for taking the color feature in the color feature set with the largest number of pixels as the primary color feature, and taking the color features in the remaining color feature sets as secondary color features.
10. The application program active interface display device according to claim 9, wherein, when two or more active interfaces are included on the application display interface, the replacing module comprises:
the active interface acquisition unit is used for acquiring all active interfaces on the application display interface;
the dividing unit is used for dividing all the active interfaces into a first type interface and a second type interface;
the replacing unit is used for replacing the color features displayed in the first type interface and in the second type interface each with any one of the target color, the primary color feature, and the secondary color feature, wherein the first type interface and the second type interface have different colors after replacement.
11. The application program active interface display device of claim 7, wherein the target color obtaining module comprises:
the reference color obtaining unit is used for fusing the reference color features based on a preset color feature fusion rule to obtain a reference color;
the color feature extraction unit is used for acquiring, from the cloud, a corrected image of the application program according to the reference color, and extracting color features from the corrected image;
and the correction unit is used for correcting the reference color according to the color features to obtain the target color.
12. The application program active interface display device of claim 7, wherein the reference color feature extraction module comprises:
the historical image acquisition unit is used for acquiring a user's historical image browsing record, and determining, according to the historical image browsing record, the historical image browsed most frequently by the user;
and the target image determining unit is used for determining the historical image as the target image.
13. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the program, when executed by a processor, implements the application program active interface display method of any one of claims 1 to 6.
14. A terminal device, comprising:
one or more processors;
a memory; and
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the steps of the application program active interface display method of any one of claims 1 to 6.
CN201811251734.1A 2018-10-25 2018-10-25 Application program active interface display method and device, storage medium and terminal equipment Active CN111104193B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811251734.1A CN111104193B (en) 2018-10-25 2018-10-25 Application program active interface display method and device, storage medium and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811251734.1A CN111104193B (en) 2018-10-25 2018-10-25 Application program active interface display method and device, storage medium and terminal equipment

Publications (2)

Publication Number Publication Date
CN111104193A CN111104193A (en) 2020-05-05
CN111104193B true CN111104193B (en) 2024-08-20

Family

ID=70417999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811251734.1A Active CN111104193B (en) 2018-10-25 2018-10-25 Application program active interface display method and device, storage medium and terminal equipment

Country Status (1)

Country Link
CN (1) CN111104193B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114691252B (en) * 2020-12-28 2023-05-30 中国联合网络通信集团有限公司 Screen display method and device
CN117376718B (en) * 2023-12-08 2024-03-26 深圳市尊正数字视频有限公司 Real-time color adjustment method and system based on camera output signals

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657037A (en) * 2015-01-29 2015-05-27 深圳市金立通信设备有限公司 Terminal interface processing method
CN104899853A (en) * 2014-03-04 2015-09-09 腾讯科技(深圳)有限公司 Image region dividing method and device
CN105867730A (en) * 2016-03-24 2016-08-17 广东欧珀移动通信有限公司 Display method and device for status bar

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8379972B1 (en) * 2009-12-01 2013-02-19 Adobe Systems Incorporated Color decontamination for image compositing
CN102592272B (en) * 2011-01-12 2017-01-25 深圳市世纪光速信息技术有限公司 Extracting method and device of picture dominant tone
CN102722880B (en) * 2011-03-29 2014-12-10 阿里巴巴集团控股有限公司 Image main color identification method and apparatus thereof, image matching method and server
US8897552B2 (en) * 2012-08-01 2014-11-25 Microsoft Corporation Setting an operating-system color using a photograph
US8970591B2 (en) * 2012-11-28 2015-03-03 Imagination Technologies, Limited Memory efficient progressive refinement in ray tracing
CN104239076A (en) * 2013-06-08 2014-12-24 中兴通讯股份有限公司 Interface color adjusting method and device
US9711553B2 (en) * 2014-04-28 2017-07-18 Samsung Electronics Co., Ltd. Image sensor including a pixel having photoelectric conversion elements and image processing device having the image sensor
CN104063562B (en) * 2014-07-14 2017-03-22 南京大学 Method used for generating bottom embroidery draft of disordered needlework and based on color clustering
CN104268904A (en) * 2014-09-10 2015-01-07 珠海市君天电子科技有限公司 Method and device for extracting main color of image
CN105468158B (en) * 2015-12-22 2020-06-05 亳州鼎源科技信息有限公司 Color adjustment method and mobile terminal
CN105812668A (en) * 2016-04-19 2016-07-27 奇酷互联网络科技(深圳)有限公司 Image processing method, processing device and photographing apparatus
CN106201212B (en) * 2016-07-19 2019-05-03 厦门美图移动科技有限公司 A kind of generation method of application icon, device and mobile terminal
CN106502510B (en) * 2016-10-27 2019-08-20 北京安云世纪科技有限公司 Icon and the adaptation processing method at interface, device
CN107908383B (en) * 2017-11-14 2020-07-07 维沃移动通信有限公司 Screen color adjusting method and device and mobile terminal
CN108389176A (en) * 2018-03-09 2018-08-10 网易(杭州)网络有限公司 Method, apparatus, storage medium, processor and the terminal of image blend processing

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104899853A (en) * 2014-03-04 2015-09-09 腾讯科技(深圳)有限公司 Image region dividing method and device
CN104657037A (en) * 2015-01-29 2015-05-27 深圳市金立通信设备有限公司 Terminal interface processing method
CN105867730A (en) * 2016-03-24 2016-08-17 广东欧珀移动通信有限公司 Display method and device for status bar

Also Published As

Publication number Publication date
CN111104193A (en) 2020-05-05

Similar Documents

Publication Publication Date Title
CN110417986B (en) Screen-off display method and electronic equipment
CN106055364B (en) Application starting method and terminal equipment
US9697622B2 (en) Interface adjustment method, apparatus, and terminal
CN108460769B (en) image processing method and terminal equipment
CN106856543B (en) Picture display method and device and mobile terminal
US20190257887A1 (en) Method and device for ascertaining required charging time
CN108172200B (en) Method for displaying text in application and terminal equipment
US20220319378A1 (en) Screen color temperature control method, apparatus, storage medium, and mobile terminal
CN110209245A (en) Face identification method and Related product
CN106445970B (en) Loading processing method and device for placeholder map
CN105930077A (en) Method and device for adjusting size of objects displayed by screens
CN111970404B (en) Display adjusting method of terminal, mobile terminal and storage medium
CN107864299B (en) Picture display method and related product
CN107256142B (en) Page display method and terminal
CN111104193B (en) Application program active interface display method and device, storage medium and terminal equipment
CN107292833B (en) Image processing method and device and mobile terminal
CN110767950B (en) Charging method, charging device, terminal equipment and computer readable storage medium
CN107046595A (en) Announcement information processing method, device and mobile terminal
CN107632985B (en) Webpage preloading method and device
CN107797723B (en) Display style switching method and terminal
CN108877733B (en) Color temperature adjusting method and related device
CN108628534B (en) Character display method and mobile terminal
CN106688305B (en) Intelligent matching method and terminal of filter
CN106293407B (en) Picture display method and terminal equipment
CN110336917B (en) Picture display method and device, storage medium and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant