
CN111885298B - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN111885298B
CN111885298B CN202010568923.2A
Authority
CN
China
Prior art keywords
image
input
target
storage space
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010568923.2A
Other languages
Chinese (zh)
Other versions
CN111885298A (en)
Inventor
张梦月
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202010568923.2A
Publication of CN111885298A
Application granted
Publication of CN111885298B

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/62 - Control of parameters via user interfaces
    • H04N23/63 - Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 - Graphical user interfaces [GUI] specially adapted for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/80 - Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses an image processing method and device, belongs to the technical field of communication, and can solve the problem of complex operation when electronic equipment adds filter effects to a photo and simultaneously stores multiple filter effect images. The method comprises the following steps: under the condition that a first image and a target control are displayed on a target interface, receiving first input of a user to a target area of the target control; the target control comprises N first areas; each first area corresponds to a filter, and each first area corresponds to a storage space; responding to the first input, and storing a second image in a first storage space corresponding to the target area, wherein the second image is an image generated after the first image is subjected to filter processing corresponding to the target area; n is a positive integer; receiving a second input; and responding to the second input, and outputting the target image stored in a second storage space corresponding to the second input, wherein the first storage space comprises the second storage space. The embodiment of the application is applied to an image processing scene.

Description

Image processing method and device
Technical Field
The embodiment of the application relates to the technical field of communication, in particular to an image processing method and device.
Background
When a user uses an electronic device to process a taken photograph, in order to make the photograph present display effects of different styles, the user may use the electronic device to add various filters to the photograph, for example, filters that add a natural style, a sweet style, and so on.
In the related art, when a user uses an electronic device to add a filter to a certain photo, the user selects a favorite filter on an image processing interface and adds a filter effect to the photo, and then the photo with the filter effect added can be saved in an album or shared with other users.
However, when a user adds a filter to the same image on the image processing interface, only one filter effect graph can be stored each time, and if a plurality of different filter effect graphs are to be stored, the storage operation needs to be repeatedly executed, which is tedious in operation process.
Disclosure of Invention
The embodiment of the application aims to provide an image processing method and device, and the problem that when electronic equipment adds a filter effect to a photo and simultaneously stores multiple filter effect images, the operation is complex can be solved.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an image processing method, including: under the condition that a first image and a target control are displayed on a target interface, receiving first input of a user to a target area of the target control; the target control comprises N first areas; each first area corresponds to a filter, and each first area corresponds to a storage space; the target area is at least one of the N first areas; responding to the first input, and storing a second image in a first storage space corresponding to the target area, wherein the second image is an image generated after the first image is subjected to filter processing corresponding to the target area; n is a positive integer; receiving a second input; responding to the second input, and outputting a target image stored in a second storage space corresponding to the second input, wherein the first storage space comprises the second storage space; wherein the second input comprises: and a second region in the target control is input by the user, the second storage space is a storage space corresponding to the second region, and the second region is at least one region in the target region.
In a second aspect, an embodiment of the present application further provides an image processing apparatus, which includes a receiving module, a storage module, and an output module; the receiving module is used for receiving first input of a user to a target area of a target control under the condition that a first image and the target control are displayed on a target interface; the target control comprises N first areas; each first area corresponds to a filter, and each first area corresponds to a storage space; the target area is at least one of the N first areas; the storage module is used for responding to the first input received by the receiving module and storing a second image in a first storage space corresponding to the target area, wherein the second image is an image generated after the first image is subjected to filter processing corresponding to the target area; n is a positive integer; the receiving module is also used for receiving a second input; the output module is used for responding to the second input received by the receiving module and outputting the target image stored in a second storage space corresponding to the second input, and the first storage space comprises the second storage space; wherein the second input comprises: and a second region in the target control is input by the user, the second storage space is a storage space corresponding to the second region, and the second region is at least one region in the target region.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the image processing method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, when a user uses the electronic device to add a filter effect to an image, the electronic device receives a first input of the user to a target area of the target control while a target interface displays a first image and the target control. The electronic device can then store a second image processed by a filter into a first storage space corresponding to the target area. After receiving a second input, the electronic device can output the target image stored in at least one second storage space corresponding to the second input into an album. In this way, when the user uses the electronic device to add filter effects to an image, the user can output a plurality of filter-processed images at the same time.
Drawings
Fig. 1 is a schematic diagram of an interface applied by a related image processing method provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the present application;
fig. 3 is a schematic diagram of an interface applied by an image processing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that the data so used may be interchanged under appropriate circumstances, so that embodiments of the application may be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second", and the like do not limit the number of objects; for example, the first object can be one object or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates that the former and latter related objects are in an "or" relationship.
The image processing method provided by the embodiment of the application can be applied to scenes of beautifying images by using electronic equipment.
For example, consider a scene in which a user adds a filter effect to a shot photo using an electronic device. In the related art, after the user opens a shot photo in an image editing APP or calls the camera function to take a new photo, the user may click various filter controls in the APP interface to add various filter effects to the photo, and a corresponding filter effect diagram is displayed in the interface. For example, fig. 1 shows an interface for adding a filter effect to a shot photo. The image editing interface includes an image preview area 11, a filter control area 12 (including five filter controls A, B, C, D, and E, each corresponding to one filter), and a "save/share" button 13. The user may click any filter control in the area 12 to add the filter effect corresponding to that control to the photo, and a filter effect map is displayed in the area 11. Then, when the user clicks the button 13, the electronic device saves the filter effect map displayed in the area 11 to an album or shares it with other users.
However, only one filter effect graph can be displayed in the interface each time, which results in that the user cannot intuitively compare the filter effect graphs of various filters, and the user can only store the filter effect graph displayed in the current interface each time.
To solve the problem, in the technical solution provided in the embodiment of the present application, a target control is displayed in a target interface of an electronic device, the target control includes N first regions, each first region corresponds to one type of filter, and a user can add a filter effect of a filter corresponding to the first region to a first image by clicking the first region, and display a second image with the filter effect added in the target interface. Then, the electronic device may store the filter effect map in a first storage space corresponding to the first area. The user can click on the plurality of first areas, so that the electronic equipment stores a plurality of filter effect graphs. The user can click a specific area of the target control, a plurality of filter effect graphs are displayed in the target interface, the user can select one or more satisfactory filter effect graphs according to the preference, and then the electronic equipment stores the one or more filter effect graphs selected by the user into an album or shares the one or more filter effect graphs with other users after receiving second input of the user.
The image processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
As shown in fig. 2, an image processing method provided in an embodiment of the present application may include the following steps 201 to 204:
step 201, under the condition that a target interface displays a first image and a target control, an image processing device receives a first input of a user to a target area of the target control.
The target control comprises N first areas; each first area corresponds to a filter and each first area corresponds to a storage space. The target region is at least one of N first regions, and N is a positive integer.
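The mapping described above, in which each first area is bound to one filter and one storage space, can be sketched as follows. This is a minimal illustrative model, not code from the patent; the names (`FirstArea`, `make_target_control`) and the byte-string stand-in for image data are hypothetical:

```python
# Sketch of the target control's data model: N first areas, each bound to
# one filter and one (initially empty) storage slot. Names are illustrative.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class FirstArea:
    name: str                             # e.g. "a", "b", ...
    filter_fn: Callable[[bytes], bytes]   # filter applied to the first image
    slot: Optional[bytes] = None          # first storage space (None = empty)


def make_target_control(filters: dict) -> dict:
    """Build the N first areas from a name -> filter mapping."""
    return {name: FirstArea(name, fn) for name, fn in filters.items()}


# Example: two toy "filters" acting on raw bytes.
control = make_target_control({
    "a": lambda img: img + b"|natural",
    "b": lambda img: img + b"|sweet",
})
```

Clicking a first area would then apply its `filter_fn` to the first image and fill its `slot`, which models "each first area corresponds to a filter, and each first area corresponds to a storage space".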
Illustratively, when a user uses the electronic device to add a filter effect to an image, the image to be processed needs to be opened in an image processing application and displayed in an image processing interface. Therefore, the target interface may be an image processing interface of an image processing application installed in the electronic device, the first image is a photograph that a user needs to beautify (add a filter effect), and the target control is a control including N first areas, where each first area corresponds to one filter. After the first image is displayed in the target interface of the electronic device, the user can add a filter effect to the first image by operating the target control in the target interface.
For example, referring to fig. 1, as shown in fig. 3, an image (i.e., the first image) that the user needs to edit is displayed in area 11 of interface 10 (i.e., the target interface). The user can click any one of the 8 areas a, b, c, d, e, f, g, and h in control 12, each of which corresponds to one type of filter, and the filter effect corresponding to that area is added to the image displayed in area 11.
Illustratively, the filter may be a style filter such as natural, sweet, mint, or summer, or a color filter such as red, orange, yellow, green, cyan, blue, or purple, and is used to add a special image effect to the first image.
For example, the first input may be: a touch input on the target control by the user, a voice instruction input by the user, or a specific gesture input by the user, which may be determined according to actual use requirements and is not limited in this embodiment of the present application. For example, the touch input may be a click input of the user on a first region of the target control.
For example, taking the above-mentioned first input as an input for the user to click the first region in the target control as an example, the user may click one first region in the target control at a time, or may click multiple first regions in the target control at the same time.
For example, each first region in the target control further corresponds to a first storage space, and the first storage space may be used to store the filter effect map processed by the filter corresponding to the first region. For example, when a user clicks a first area, the electronic device processes a first image using a filter corresponding to the first area and generates a filter effect map, and then stores the filter effect map in a first storage space corresponding to the first area.
For example, the first storage space may be a certain space in a memory space of the electronic device, or may be a certain space in a flash memory of the electronic device.
Example 1, the first storage space is taken as a certain space in a memory space of the electronic device. After a user clicks a first region of a target control, the electronic device stores a filter effect graph of a first image processed by a filter corresponding to the first region in a memory space, a variable records a memory address of the memory space of the filter effect graph, and the electronic device can acquire the filter effect graph through the variable. The first storage space may be a memory space corresponding to the variable.
Specifically, the electronic device may acquire the filter effect graph through the variable as follows: starting from the memory address recorded in the variable, the electronic device continuously reads the data stored in the memory until an end symbol (usually "\0") is read, at which point the filter effect graph has been obtained from the memory.
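The terminator-based read in Example 1 can be modeled with a short sketch. The function name and the byte-string model of the memory space are assumptions for illustration only:

```python
def read_until_terminator(memory: bytes, start: int, terminator: bytes = b"\x00") -> bytes:
    """Read bytes from the recorded address `start` until the end symbol,
    as in Example 1 (memory modeled as a flat byte string)."""
    end = memory.find(terminator, start)
    if end == -1:
        raise ValueError("no terminator found after the given address")
    return memory[start:end]
```

Here the variable's recorded memory address corresponds to `start`, and the returned slice stands for the filter effect graph.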
Example 2, the first storage space is taken as a certain space in a flash memory of the electronic device. After the electronic device generates the filter effect map of the first image, the filter effect map may be temporarily stored in a target directory of a flash memory, for example, a/cache/directory in an android system. The electronic device may store the filter effect map in the directory, and record an address of the filter effect map, and the electronic device may obtain the filter effect map through the address.
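The temporary storage of Example 2 can be sketched as follows, using an ordinary directory to stand in for the Android /cache/ directory; the function name and file-naming scheme are hypothetical:

```python
import os


def cache_filter_effect(cache_dir: str, area_name: str, image_bytes: bytes) -> str:
    """Store a filter effect map under the cache directory and return its
    recorded address (here, a file path), as in Example 2."""
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, f"effect_{area_name}.dat")
    with open(path, "wb") as f:
        f.write(image_bytes)
    return path
```

The returned path plays the role of the recorded address through which the electronic device later obtains the filter effect map.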
It should be noted that the number of the first areas in the target control may be preset, or may be adjusted according to an actual situation, which is not limited in this embodiment of the application. For example, a plurality of filters may be preset in the target control, or the target control may be downloaded from a server in a form of networking or the like and added to the target control by a user during use.
Step 202, in response to the first input, the image processing apparatus stores a second image in a first storage space corresponding to the target area.
The second image is an image generated after the first image is processed by a filter corresponding to the target area; n is a positive integer.
For example, when the user clicks the first area, the electronic device generates a filter effect map (i.e., the second image) processed by the filter corresponding to the first area, and stores the second image in the first storage space corresponding to the first area.
Optionally, in this embodiment of the application, the electronic device may store the second image in the first storage space corresponding to the first area immediately after receiving a click input of the user on the first area, or may store the second image in the first storage space only after receiving a further confirmation from the user.
Illustratively, the first input may further include a first sub-input, and after receiving the first sub-input, the image processing apparatus stores the second image into the first storage space.
Therefore, the electronic equipment does not need to store the second image frequently, and the storage space is saved.
Step 203, the image processing apparatus receives a second input.
And step 204, the image processing device responds to the second input and outputs the target image stored in a second storage space corresponding to the second input, wherein the first storage space comprises the second storage space.
Wherein the second input may include: and a second input of the user to a second area in the target control, wherein the second storage space is a storage space corresponding to the second area, and the second area is at least one area in the target area.
Illustratively, the second input may be an input by the user clicking on a "save/share" control 13 in fig. 1. And after receiving a second input of the user, the electronic device outputs a second image (i.e., the target image) stored in each second storage space in the target control.
For example, based on the above examples 1 and 2, the electronic device may obtain the second image stored in the memory space corresponding to each first region in the target control and store it in the flash memory, for example, in the system album directory. Alternatively, the electronic device may move or copy all the second images stored in the /cache/ directory to the system album directory.
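Flushing the selected effect maps from the cache directory to the album directory, as described above, might look like the minimal sketch below; the directory layout and function name are assumptions:

```python
import os
import shutil


def flush_cache_to_album(cache_dir: str, album_dir: str, selected: list) -> list:
    """Move the selected cached filter effect maps into the album directory,
    modeling the output step of the second input."""
    os.makedirs(album_dir, exist_ok=True)
    moved = []
    for name in selected:
        src = os.path.join(cache_dir, name)
        dst = os.path.join(album_dir, name)
        shutil.move(src, dst)  # move (or copy) from /cache/ to the album
        moved.append(dst)
    return moved
```

Passing only a subset of file names models the case where the second storage space is a user-selected part of the first storage space.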
For example, the second storage space may be a part or all of the first storage space screened out by the electronic device after receiving the second input from the user.
For example, the second input may further include an input for selecting the first area by the user, and the electronic device generates a plurality of first storage spaces after receiving the first input by the user, but not all of the filter effect maps stored in each of the first storage spaces are preferred by the user, so that the user may selectively enable the electronic device to output the preferred filter effect maps.
In this way, when a user uses the electronic device to add a filter effect to an image, the first input of the user to the target area of the target control is received under the condition that the first image and the target control are displayed on the target interface, so that the electronic device can store the second image subjected to filter processing into the first storage space corresponding to the target area, and then after the second input is received, the electronic device can output the target image stored in the second storage space corresponding to the second input into the album, so that the user can simultaneously output a plurality of images subjected to filter processing when the electronic device is used to add the filter effect to the image.
Optionally, in this embodiment of the application, in order to facilitate a user to visually observe the filter effect graph stored in the first storage space corresponding to each first region in the target control, or whether the filter effect graph is stored in the first storage space corresponding to the first region, the electronic device may display the display image on the first region.
Illustratively, M display images are displayed on the target area, each first area corresponds to one display image, and any display image is processed by the filter of the corresponding first area, so that the filter effect is achieved.
For example, as shown in fig. 3, the user clicks an area b and an area c in the control 12, the electronic device stores the filter effect maps processed by the filters corresponding to the area b and the area c into the storage spaces corresponding to the area b and the area c, respectively, and displays thumbnails of the filter effects on the area b and the area c, and then, after receiving the click input of the user on the control 13, the electronic device stores the filter effect maps stored in the storage spaces corresponding to the area b and the area c into an album of the electronic device, respectively, or shares them with other users.
Therefore, the user can visually see what filter effect graph processed by the filter is stored in the target control. The user can conveniently and selectively output the favorite filter effect graph in the subsequent process.
Optionally, in this embodiment of the application, when the user clicks the first area, the electronic device needs to occupy system resources to generate a filter effect graph, and in order to prevent the electronic device from repeatedly performing the operation, the electronic device may determine whether the storage space corresponding to the first area stores the filter effect graph.
Illustratively, the step 202 may further include the following step 202 a:
in step 202a, the image processing apparatus responds to the first input, and stores the second image in the first storage space if the image is not stored in the first storage space corresponding to the target area.
For example, with reference to the above example 1 and example 2, the electronic device may determine whether the variable corresponding to the first region clicked by the first input points to a specific address in the memory space or to a specific file in the /cache/ directory of the flash memory, that is, determine whether the variable is a null value; if the variable is null, no filter effect map is stored in the storage space corresponding to the first region.
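The null-value check of step 202a can be sketched as follows, with a plain dictionary standing in for the per-region storage spaces; all names are illustrative:

```python
def store_if_absent(slots: dict, region: str, first_image: bytes, apply_filter) -> bool:
    """Fill the region's storage slot only when it is still empty (a "null"
    variable), mirroring step 202a: if an effect map already exists, skip
    re-processing to save system resources."""
    if slots.get(region) is None:
        slots[region] = apply_filter(first_image)
        return True   # filter processing was performed
    return False      # already stored; nothing to do
```

A second click on the same region then returns without recomputing the filter effect map.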
Therefore, if the first storage space stores the filter effect graph, the electronic equipment does not need to execute the image processing operation again, and the consumption of system resources is reduced.
Optionally, in this embodiment of the application, the electronic device may further adjust a filter effect parameter of the second image to adjust a filter effect degree of the second image.
For example, after the step 202, the image processing method provided in the embodiment of the present application may further include the following steps 202b1 to 202b 3:
step 202b1, the image processing device responds to the first input and displays a second image;
step 202b2, the image processing apparatus receives a third input from the user on the target interface.
Step 202b3, the image processing device responds to the third input, adjusts the filter effect parameter of the second image, generates a third image, and replaces the second image stored in the first storage space with the third image.
For example, the filter effect parameter is used to indicate the degree of the filter effect of the second image, which can be understood as the degree of difference between the filtered image and the original first image. For example, if the user uses the electronic device to add a whitening filter to the first image, the whiter the image, the stronger the filter effect, and the closer the image is to the original, the lighter the filter effect.
For example, the third input may be a sliding input of the user in a horizontal or vertical direction of the target interface, and the user may adjust the degree of the filter effect of the second image through the sliding input.
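Adjusting the filter effect degree can be understood as interpolating between the original image and the fully filtered image. The sketch below illustrates this with flat lists of pixel values; the linear-blend model is an assumption, since the patent does not specify how the parameter is applied:

```python
def blend_effect(original: list, filtered: list, strength: float) -> list:
    """Interpolate pixel-wise between the original and fully filtered image.
    strength = 0.0 -> original image; strength = 1.0 -> full filter effect."""
    if not 0.0 <= strength <= 1.0:
        raise ValueError("strength must be in [0, 1]")
    return [(1 - strength) * o + strength * f for o, f in zip(original, filtered)]
```

The sliding input would map the slide distance to `strength`, and the result (the third image) replaces the second image in the first storage space.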
Therefore, the electronic equipment can display the second image, meanwhile, the user can conveniently adjust the filter effect degree of the second image, and the second image stored before is replaced after the user adjusts the filter effect degree to a favorite degree.
Optionally, in this embodiment of the application, after the electronic device receives the first input of the user and generates at least one second image, the user may further have all the second images displayed through a specific operation, making it convenient to select favorite second images.
Illustratively, the target region includes K first regions, and after the step 204, the image processing method provided in this embodiment of the application may further include the following steps 204a1 and 204a 2:
step 204a1, the image processing apparatus receives a fourth input of the target control by the user.
In step 204a2, the image processing apparatus displays a second image corresponding to each of K first regions in response to the fourth input, where K is a positive integer.
For example, the target control may include a ring including N first regions, and a center region, the fourth input may be a touch input of a user clicking the center region of the target control, and after the electronic device receives a click input of the user on the center region of the target control, a filter effect map stored in a first storage space corresponding to each of the K first regions is displayed on the target interface.
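Gathering the effect maps to display in step 204a2 amounts to collecting the K regions whose storage space is non-empty; a one-function sketch (names illustrative):

```python
def collect_effect_maps(slots: dict) -> dict:
    """Return only the regions whose storage space holds a filter effect map,
    i.e. the K first regions displayed after the fourth input."""
    return {region: img for region, img in slots.items() if img is not None}
```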
For example, the second input may also be a selection input of the K filter effect maps displayed on the target interface by the user.
In this way, after receiving the input that the user clicks the center region of the target control, the electronic device displays the filter effect graph generated after the electronic device receives the first input, so that the user can select the favorite filter effect graph.
Optionally, when the target control is a control including a ring having N first regions, the ring region of the target control can rotate about the center of the ring as the rotation point.
For example, the target control may further include a target identifier (which may be a fixed pointer or arrow), where the target identifier points to one of the N first areas, and the electronic device displays, on the target interface, a second image stored in the first storage space corresponding to the first area pointed to by the target identifier.
For example, after receiving a fifth input from the user, the image processing apparatus displays the second image stored in the first storage space corresponding to the first area indicated by the target identifier.
For example, the target control may include a normal display state and a zoom-out display state, and the electronic device may switch the display state of the target control by receiving a touch input to the target control from a user. Further, the touch input may be a long press input of the target control by the user. When the target control is in a normal state, the electronic device may receive a second input and a third input of the target control by a user, and perform a corresponding operation in response to the second input and the third input. When the target control is in a zoom-out state, the user can drag the target control to move the display position of the target control in the target interface.
In this way, the user can change which first area the pointer points to by rotating the ring region of the target control, thereby changing the image displayed on the target interface.
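The rotatable-ring behavior described above can be sketched as follows. This is a minimal model under stated assumptions: the class and method names (`Ring`, `rotate`, `pointed_region`) are illustrative and not from the patent; the pointer stays fixed while the ring of N first regions rotates beneath it, so rotation changes which region is selected.

```python
# Hypothetical sketch of the rotatable ring control: the target identifier
# (pointer/arrow) is fixed, the ring of N first regions rotates, and the
# region under the pointer determines which stored image is displayed.

class Ring:
    def __init__(self, n_regions):
        self.n = n_regions
        self.offset = 0  # how many region-widths the ring has been rotated

    def rotate(self, steps):
        # Rotation wraps around the ring, as on a circular control.
        self.offset = (self.offset + steps) % self.n

    def pointed_region(self):
        # Index of the first region currently under the fixed pointer.
        return self.offset

ring = Ring(6)
ring.rotate(2)
assert ring.pointed_region() == 2
ring.rotate(5)   # 2 + 5 wraps past region 5 back to region 1
assert ring.pointed_region() == 1
```

Rotating by five more steps from region 2 wraps around the six-region ring to region 1, mirroring how turning the on-screen ring cycles the displayed filter result.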
According to the image processing method provided in the embodiments of the present application, when a user uses the electronic device to add a filter effect to an image, the electronic device receives, while the first image and the target control are displayed on the target interface, a first input from the user on a target area of the target control, and stores the filter-processed second image in the first storage space corresponding to the target area. After receiving a third input from the user, the electronic device can adjust the filter effect parameters of the second image, and after receiving a fourth input, it displays the filter effect maps stored in all the first storage spaces, making it easy for the user to pick out preferred images. After receiving the second input, the electronic device may output the target image stored in the second storage space corresponding to the second input to the album, so that the user can output a plurality of filter-processed images at once when adding filter effects to an image.
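The store/adjust/output flow summarized above can be sketched as a small data model. All names here (`FilterControl`, `apply_filter`, the input-named methods) are illustrative assumptions, not the patent's implementation: each first region maps to one filter and one storage space; the first input stores a filtered second image, the third input regenerates it with an adjusted effect parameter, and the second input outputs the selected storage spaces.

```python
# Hypothetical sketch of the method: one filter and one storage space per
# first region of the target control.

def apply_filter(image, filter_name, strength=1.0):
    # Stand-in for real filter processing: tag the image with the effect.
    return f"{image}+{filter_name}@{strength}"

class FilterControl:
    def __init__(self, filters):
        self.filters = list(filters)          # one filter per first region
        self.storage = [None] * len(filters)  # one storage space per region

    def first_input(self, region, image):
        # Store the filtered second image in the region's storage space,
        # skipping the filter work if a result is already stored there.
        if self.storage[region] is None:
            self.storage[region] = apply_filter(image, self.filters[region])
        return self.storage[region]

    def third_input(self, region, image, strength):
        # Adjust the filter effect parameter and replace the stored image.
        self.storage[region] = apply_filter(image, self.filters[region], strength)
        return self.storage[region]

    def second_input(self, regions):
        # Output the target images stored in the selected storage spaces.
        return [self.storage[r] for r in regions if self.storage[r] is not None]

control = FilterControl(["sepia", "mono", "vivid"])
control.first_input(0, "IMG_001")            # store the second image
control.third_input(0, "IMG_001", 0.5)       # adjust and replace it
album = control.second_input([0])            # output to the album
```

After these three inputs, `album` holds the single adjusted image for region 0; selecting several regions in `second_input` would output several filter-processed images at once, as the method intends.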
It should be noted that, in the image processing method provided in the embodiments of the present application, the execution body may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method. In the embodiments of the present application, the image processing apparatus is described by taking, as an example, the case in which the image processing apparatus executes the image processing method.
In the embodiments of the present application, the above-described methods are illustrated in the drawings, and the image processing method is described by way of example with reference to one of those drawings. In specific implementations, the image processing methods shown in the above method drawings may also be implemented in combination with any other combinable drawings illustrated in the above embodiments, and details are not repeated here.
Fig. 4 is a schematic diagram of a possible structure of an image processing apparatus for implementing an embodiment of the present application. As shown in fig. 4, the image processing apparatus 600 includes a receiving module 601, a storage module 602, and an output module 603. The receiving module 601 is configured to receive a first input of a user on a target area of a target control while a first image and the target control are displayed on a target interface; the target control includes N first areas; each first area corresponds to a filter, and each first area corresponds to a storage space; the target area is at least one of the N first areas, and N is a positive integer. The storage module 602 is configured to store, in response to the first input received by the receiving module 601, a second image in a first storage space corresponding to the target area, where the second image is an image generated after the first image is subjected to the filter processing corresponding to the target area. The receiving module 601 is further configured to receive a second input. The output module 603 is configured to output, in response to the second input received by the receiving module 601, the target image stored in a second storage space corresponding to the second input, where the first storage space includes the second storage space. The second input includes a second input of the user on a second area in the target control, where the second storage space is the storage space corresponding to the second area, and the second area is at least one area in the target area.
Optionally, M display images are displayed on the target area, each first area in the target area corresponds to one display image, each display image has been subjected to the filter processing of its corresponding first area and therefore exhibits that filter effect, and M is a positive integer.
Optionally, the storage module 602 is specifically configured to, in response to the first input, store the second image in the first storage space if the image is not stored in the first storage space corresponding to the target area.
Optionally, as shown in fig. 4, the image processing apparatus further includes a display module 604 and a generation module 605. The display module 604 is configured to display the second image in response to the first input received by the receiving module 601; the receiving module 601 is further configured to receive a third input of the user on the target interface; the generation module 605 is configured to adjust a filter effect parameter of the second image in response to the third input received by the receiving module 601 and generate a third image; and the storage module 602 is further configured to replace the second image stored in the first storage space with the third image generated by the generation module 605.
Optionally, the target area comprises K first areas; the receiving module 601 is further configured to receive a fourth input of the target control from the user; the display module 604 is further configured to display, in response to the fourth input received by the receiving module 601, a second image corresponding to each of the K first regions, where K is a positive integer.
It should be noted that, as shown in fig. 4, the modules that are necessarily included in the image processing apparatus 600 are illustrated by solid line boxes, such as a receiving module 601, a storage module 602, and an output module 603; modules that may be included in the image processing apparatus 600 are illustrated with dashed boxes, such as a display module 604 and a generation module 605.
The image processing apparatus in the embodiments of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, or a self-service machine, which is not specifically limited in the embodiments of the present application.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The image processing apparatus provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 2 and fig. 3, and is not described herein again to avoid repetition.
With the image processing apparatus provided in the embodiments of the present application, when a user uses an electronic device to add a filter effect to an image, the apparatus receives, while the first image and the target control are displayed on the target interface, a first input from the user on a target area of the target control, so that the electronic device can store the filter-processed second image in the first storage space corresponding to the target area. The electronic device can also adjust the filter effect parameters of the second image after receiving a third input from the user, and after receiving a fourth input, display the filter effect maps stored in all the first storage spaces, making it easy for the user to pick out preferred images. After receiving the second input, the electronic device may output the target image stored in the second storage space corresponding to the second input to the album, so that the user can output a plurality of filter-processed images at once when adding filter effects to an image.
Optionally, as shown in fig. 5, an embodiment of the present application further provides an electronic device M00, which includes a processor M01, a memory M02, and a program or instruction stored in the memory M02 and executable on the processor M01. The program or instruction, when executed by the processor M01, implements the processes of the foregoing image processing method embodiment and can achieve the same technical effects; details are not repeated here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 110 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently, and the description is omitted here.
The user input unit 107 is configured to receive a first input of a user on a target area of a target control while a first image and the target control are displayed on the target interface. The memory 109 is configured to store the second image in the first storage space corresponding to the target area in response to the first input received by the user input unit 107. The user input unit 107 is further configured to receive a second input. The processor 110 is configured to output, in response to the second input received by the user input unit 107, the target image stored in the second storage space corresponding to the second input.
In this way, when a user uses the electronic device to add a filter effect to an image, the device receives, while the first image and the target control are displayed on the target interface, a first input from the user on the target area of the target control, so that the electronic device can store the filter-processed second image in the first storage space corresponding to the target area. Then, after receiving the second input, the electronic device can output the target image stored in the second storage space corresponding to the second input to the album, so that the user can output a plurality of filter-processed images at once when adding filter effects to an image.
Optionally, the memory 109 is specifically configured to, in response to a first input received by the user input unit 107, store the second image in the first storage space if the image is not stored in the first storage space corresponding to the target area.
Therefore, if the first storage space already stores a filter effect map, the electronic device does not need to perform the image processing operation again, which reduces the consumption of system resources.
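This "skip reprocessing when a result is already stored" behavior is essentially memoization keyed by the first storage space. A minimal sketch, assuming hypothetical names (`expensive_filter`, `store_second_image`) that are not from the patent:

```python
# Hypothetical sketch: only run the (expensive) filter when the region's
# storage space is empty, saving system resources on repeated first inputs.

calls = []

def expensive_filter(image, name):
    calls.append(name)            # track how often real filter work happens
    return f"{image}+{name}"

storage = {}                      # first storage spaces, keyed by region

def store_second_image(region, image, name):
    if region not in storage:     # store only if no image is stored yet
        storage[region] = expensive_filter(image, name)
    return storage[region]

store_second_image(0, "IMG", "sepia")
store_second_image(0, "IMG", "sepia")   # already stored; no second filter pass
assert calls == ["sepia"]
```

The second call returns the stored filter effect map directly; the filter runs only once per storage space, which is the resource saving the passage describes.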
Optionally, the display unit 106 is configured to display the second image in response to the first input received by the user input unit 107; the user input unit 107 is further configured to receive a third input of the user on the target interface; the processor 110 is configured to adjust the degree of the filter effect of the second image in response to the third input received by the user input unit 107 and generate a third image; and the memory 109 is further configured to replace the second image stored in the first storage space with the third image generated by the processor 110.
In this way, the electronic device can display the second image while making it convenient for the user to adjust the degree of its filter effect; once the user has adjusted the effect to a preferred degree, the previously stored second image is replaced.
Optionally, the user input unit 107 is further configured to receive a fourth input; the display unit 106 is further configured to display a second image corresponding to each of the K first areas in response to a fourth input received by the user input unit 107.
In this way, after receiving the input in which the user clicks the center region of the target control, the electronic device displays the filter effect maps generated after it received the first input, so that the user can select a preferred filter effect map.
With the electronic device provided in the embodiments of the present application, when a user uses the electronic device to add a filter effect to an image, the device receives, while the first image and the target control are displayed on the target interface, a first input from the user on the target area of the target control, and stores the filter-processed second image in the first storage space corresponding to the target area. The electronic device can also adjust the filter effect parameters of the second image after receiving a third input from the user, and after receiving a fourth input, display the filter effect maps stored in all the first storage spaces, making it easy for the user to pick out preferred images. After receiving the second input, the electronic device may output the target image stored in the second storage space corresponding to the second input to the album, so that a plurality of filter-processed images can be output at once when the user adds filter effects to an image.
It should be understood that, in the embodiments of the present application, the input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processing unit 1041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen and may include two parts: a touch detection device and a touch controller. The other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 109 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 110 may integrate an application processor, which primarily handles the operating system, user interfaces, applications, and the like, and a modem processor, which primarily handles wireless communication. It will be appreciated that the modem processor may not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image processing method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (8)

1. An image processing method, characterized in that the method comprises:
under the condition that a first image and a target control are displayed on a target interface, receiving first input of a user to a target area of the target control; the target control comprises N first areas; each first area corresponds to a filter, and each first area corresponds to a storage space; the target area is at least one of the N first areas;
responding to the first input, and storing a second image in a first storage space corresponding to the target area, wherein the second image is an image generated after the first image is subjected to filter processing corresponding to the target area; n is a positive integer;
receiving a second input;
responding to the second input, and outputting a target image stored in a second storage space corresponding to the second input, wherein the first storage space comprises the second storage space;
wherein the second input comprises: a second input of the user to a second region in the target control, where the second storage space is a storage space corresponding to the second region, and the second region is at least one region in the target region;
after receiving the first input of the user to the target area of the target control, the method further comprises:
displaying the second image in response to the first input;
receiving a third input of a user on the target interface;
and responding to the third input, adjusting the filter effect parameters of the second image, generating a third image, and replacing the second image stored in the first storage space with the third image.
2. The method according to claim 1, wherein M display images are displayed on the target area, each first area in the target area corresponds to one display image, any display image has a filter effect after being subjected to filter processing of the corresponding first area, and M is a positive integer.
3. The method of claim 1, wherein the storing a second image in a first storage space corresponding to the target region in response to the first input comprises:
responding to the first input, and if the image is not stored in the first storage space corresponding to the target area, storing the second image in the first storage space.
4. The method according to claim 1, wherein the target area comprises K first areas, and after the second image is stored in the first storage space corresponding to the target area, the method further comprises:
receiving a fourth input of the target control by the user;
and responding to the fourth input, and displaying a second image corresponding to each first area in the K first areas, wherein K is a positive integer.
5. An image processing apparatus, characterized in that the image processing apparatus comprises a receiving module, a storage module, an output module, a display module and a generation module;
the receiving module is used for receiving a first input of a user to a target area of a target control under the condition that a first image and the target control are displayed on a target interface; the target control comprises N first areas; each first area corresponds to a filter, and each first area corresponds to a storage space; the target area is at least one of the N first areas;
the storage module is configured to store a second image in a first storage space corresponding to the target area in response to the first input received by the receiving module, where the second image is an image generated after the first image is processed by a filter corresponding to the target area; n is a positive integer;
the receiving module is further used for receiving a second input;
the output module is used for responding to a second input received by the receiving module and outputting a target image stored in a second storage space corresponding to the second input, wherein the first storage space comprises the second storage space;
wherein the second input comprises: a second input of the user to a second region in the target control, where the second storage space is a storage space corresponding to the second region, and the second region is at least one region in the target region;
the display module is used for responding to the first input received by the receiving module and displaying the second image;
the receiving module is further used for receiving a third input of the user on the target interface;
the generating module is used for responding to a third input received by the receiving module, adjusting the filter effect parameter of the second image and generating a third image;
the storage module is further configured to replace the second image stored in the first storage space with a third image generated by the generation module.
6. The apparatus according to claim 5, wherein M display images are displayed on the target area, each first area in the target area corresponds to one display image, any display image is subjected to filter processing of the corresponding first area, and has a filter effect, and M is a positive integer.
7. The apparatus of claim 5,
the storage module is specifically configured to, in response to the first input, store the second image in a first storage space corresponding to the target area if the image is not stored in the first storage space.
8. The apparatus of claim 5, wherein the target region comprises K first regions; the device further comprises: a display module;
the receiving module is further configured to receive a fourth input of the target control from the user;
the display module is further configured to display, in response to a fourth input received by the receiving module, a second image corresponding to each of the K first regions, where K is a positive integer.
CN202010568923.2A 2020-06-19 2020-06-19 Image processing method and device Active CN111885298B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010568923.2A CN111885298B (en) 2020-06-19 2020-06-19 Image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010568923.2A CN111885298B (en) 2020-06-19 2020-06-19 Image processing method and device

Publications (2)

Publication Number Publication Date
CN111885298A CN111885298A (en) 2020-11-03
CN111885298B true CN111885298B (en) 2022-05-17

Family

ID=73156506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010568923.2A Active CN111885298B (en) 2020-06-19 2020-06-19 Image processing method and device

Country Status (1)

Country Link
CN (1) CN111885298B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113747240B (en) 2021-09-10 2023-04-07 荣耀终端有限公司 Video processing method, apparatus and storage medium
CN114327166A (en) * 2021-12-29 2022-04-12 维沃移动通信有限公司 Image processing method and device, electronic equipment and readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017017609A (en) * 2015-07-03 2017-01-19 株式会社リコー Image processing device
CN106095278B (en) * 2016-06-22 2020-02-11 维沃移动通信有限公司 Photographing method and mobile terminal
CN106530222A (en) * 2016-11-25 2017-03-22 维沃移动通信有限公司 Picture saving method and mobile terminal
CN110598027B (en) * 2019-09-10 2022-09-02 Oppo广东移动通信有限公司 Image processing effect display method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111885298A (en) 2020-11-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant