CN112860163A - Image editing method and device
- Publication number
- CN112860163A (application CN202110082957.5A)
- Authority
- CN
- China
- Prior art keywords
- image
- layer
- input
- target
- target image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an image editing method and device, belonging to the technical field of image processing. The method is applied to an electronic device and includes: receiving a first input from a user on a target image while the target image is displayed, wherein the target image comprises at least one layer; in response to the first input, displaying an image editing interface and displaying the layers of the target image in a first area of the image editing interface; receiving a second input from the user on a layer of the target image; and in response to the second input, editing the layer of the target image. With the image editing method and device, a layer of the target image can be edited through simple operations without regenerating the image from the original images, which improves image editing efficiency.
Description
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image editing method and device.
Background
At present, many images encountered in daily life are formed by combining a plurality of images as different layers, such as movie posters, food collages, or advertisements.
Generally, a plurality of images are combined into one image through a layer template. Once the image is generated, if a sub-image needs to be modified, the image can only be regenerated from the original images, which is cumbersome, time-consuming, and inefficient.
Disclosure of Invention
Embodiments of the present application aim to provide an image editing method and device that can improve image editing efficiency.
In order to solve the technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides an image editing method, applied to an electronic device, including:
receiving a first input from a user on a target image while the target image is displayed, wherein the target image comprises at least one layer;
in response to the first input, displaying an image editing interface and displaying the layers of the target image in a first area of the image editing interface;
receiving a second input from the user on a layer of the target image; and
in response to the second input, editing the layer of the target image.
In a second aspect, an embodiment of the present application provides an image editing apparatus, applied to an electronic device, including:
a receiving module, configured to receive a first input from a user on a target image while the target image is displayed, wherein the target image comprises at least one layer;
a display module, configured to display an image editing interface in response to the first input and to display the layers of the target image in a first area of the image editing interface;
the receiving module being further configured to receive a second input from the user on a layer of the target image; and
an editing module, configured to edit the layer of the target image in response to the second input.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the present application, when a target image is displayed, the electronic device receives a first input from a user on the target image, displays an image editing interface in response to the first input, displays the layers of the target image in a first area of the image editing interface, receives a second input from the user on a layer of the target image, and edits that layer in response to the second input. In this way, a layer of the target image can be edited through simple operations without regenerating the image from the original images, which improves image editing efficiency.
Drawings
Fig. 1 is a schematic flowchart of an image editing method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a target image provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a target image display provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a horizontal display image editing interface provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a vertical display image editing interface provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of another horizontal display image editing interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of another vertical display image editing interface provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of an image editing apparatus according to an embodiment of the present application;
fig. 9 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
fig. 10 is a hardware configuration diagram of another electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that embodiments of the application may be practiced in sequences other than those illustrated or described herein, and that the terms "first," "second," and the like are generally used herein in a generic sense and do not limit the number of terms, e.g., the first term can be one or more than one. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
At present, for a synthesized multi-layer image, if a layer needs to be edited, the image can only be regenerated from the original images; the layer cannot be edited directly, which is time-consuming and inefficient.
In order to solve the problems in the related art, embodiments of the present application provide an image editing method and apparatus. When a target image is displayed, the electronic device receives a first input from a user on the target image, displays an image editing interface in response to the first input, displays the layers of the target image in a first area of the image editing interface, then receives a second input from the user on a layer of the target image, and edits that layer in response to the second input. In this way, a layer of the target image can be edited through simple operations without regenerating the image from the original images, which improves image editing efficiency.
The following describes in detail an image editing method and apparatus provided in the embodiments of the present application with reference to the accompanying drawings.
As an example, the image editing method provided in the embodiments of the present application may be applied to scenarios such as movie posters, fun collages, or advertisements, making it convenient for a user to directly edit the layers of an image and improving image editing efficiency.
Fig. 1 is a schematic flowchart of an image editing method provided in an embodiment of the present application, and as shown in fig. 1, the image editing method applied to an electronic device includes the following steps:
s110, receiving a first input of a user to the target image under the condition of displaying the target image.
The target image comprises at least one layer, for example a movie poster composed of a plurality of images, a fun collage, or a promotional advertisement.
As one example, when a screen of the electronic device displays the target image, the electronic device may receive a first input from the user on the target image. Optionally, the first input may be a double-click on the target image, a long press on the target image, or drawing a closed curve on the target image, among others, without limitation.
S120, in response to the first input, displaying an image editing interface and displaying the layers of the target image in a first area of the image editing interface.
As one example, the electronic device may display the image editing interface horizontally or vertically.
Optionally, in response to the first input, the electronic device may display all layers of the target image in the first area, i.e., enter a global editing mode. For example, if the first input is a double-click on the target image, the electronic device displays all layers of the target image in the first area in response to the first input.
Optionally, in response to the first input, the electronic device may display only some layers of the target image in the first area, i.e., enter a local editing mode. For example, if the first input is a long press on the target image, the electronic device may display, in the first area, the layers that include the pressed position; there may be a single such layer, e.g., the background layer, or two or more layers. Alternatively, if the first input is drawing a closed curve on the target image, the electronic device may display the layers within the curve in the first area. In this way, only the layers the user is interested in are displayed, which facilitates subsequent editing of those layers.
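By way of illustration only, and not as part of the original disclosure, the layer selection in the local editing mode can be modeled as a simple hit test against each layer's bounds. The following Python sketch uses hypothetical layer names, rectangular bounds, and helper names; an actual implementation may select layers in any other suitable way.

```python
# Illustrative sketch (not from the patent): pick the layers to show in the
# first area when entering the local editing mode. Names and bounds are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Layer:
    name: str
    x: int       # top-left corner of the layer, in target-image coordinates
    y: int
    width: int
    height: int

    def contains(self, px: int, py: int) -> bool:
        """True if the point (px, py) falls inside this layer's bounds."""
        return self.x <= px < self.x + self.width and self.y <= py < self.y + self.height

def layers_at_point(layers: List[Layer], point: Tuple[int, int]) -> List[Layer]:
    """Layers that include the long-pressed position (there may be one or several)."""
    px, py = point
    return [layer for layer in layers if layer.contains(px, py)]

def layers_in_region(layers: List[Layer], region: Tuple[int, int, int, int]) -> List[Layer]:
    """Layers lying entirely inside the rectangle (x, y, w, h) bounding a drawn closed curve."""
    rx, ry, rw, rh = region
    return [l for l in layers
            if rx <= l.x and ry <= l.y
            and l.x + l.width <= rx + rw and l.y + l.height <= ry + rh]

stack = [Layer("background", 0, 0, 1080, 1920), Layer("poster_title", 100, 120, 500, 200)]
print([l.name for l in layers_at_point(stack, (150, 150))])          # both layers contain the press
print([l.name for l in layers_in_region(stack, (50, 50, 700, 400))]) # only the title layer fits
```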
S130, receiving a second input from the user on a layer of the target image.
The second input may be, without limitation, a long press on a layer, dragging a layer, or double-clicking a layer.
S140, in response to the second input, editing the layer of the target image.
Optionally, editing the layer of the target image may include any one of the following operations: adjusting the level of a target layer, deleting the target layer, or adding a layer among the layers of the target image. That is, in response to the second input, the electronic device may directly adjust the hierarchy of the target layer, delete the target layer, or add a new layer. In this way, the layers in the target image can be adjusted quickly, improving image editing efficiency.
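As a minimal, non-limiting sketch of these three edits (not the patent's implementation), the layer stack can be modeled as an ordered Python list in which index 0 is the bottom-most layer; the layer names below are hypothetical.

```python
# Illustrative sketch only: adjust the level of a target layer, delete it,
# or add a new layer to the stack. Index 0 is the bottom-most layer.
from typing import List, Optional

def adjust_level(layers: List[str], name: str, new_index: int) -> None:
    """Move the target layer to a new position in the stacking order."""
    layers.remove(name)
    layers.insert(new_index, name)

def delete_layer(layers: List[str], name: str) -> None:
    """Remove the target layer from the target image."""
    layers.remove(name)

def add_layer(layers: List[str], name: str, index: Optional[int] = None) -> None:
    """Insert a new layer, placed on top of the stack by default."""
    layers.insert(len(layers) if index is None else index, name)

layers = ["layer001", "layer002", "layer003"]
adjust_level(layers, "layer003", 0)   # send layer003 to the bottom
delete_layer(layers, "layer002")      # remove layer002 entirely
add_layer(layers, "layer008")         # add a new top layer
print(layers)                         # ['layer003', 'layer001', 'layer008']
```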
Optionally, in response to the second input, the electronic device may display the image of the target layer in a second area of the image editing interface, where the second area does not overlap the first area. A third input from the user on the image of the target layer, such as a tap on the image, is then received, and the image of the target layer is edited in response to the third input. Editing the image of the target layer may include rotating the image, cropping the image, adding a filter to the image, and the like. In this way, the image in the target layer can be edited quickly without affecting other layers, improving the editing effect.
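One possible realization of the per-layer edit and recomposition, given only as a sketch, assumes each layer is stored as an RGBA image of the same size and uses the third-party Pillow library; the operation names and parameter values are illustrative, not taken from the patent.

```python
# Illustrative sketch: edit only the target layer (rotate, crop, or filter),
# then recomposite the target image without touching the other layers.
# Assumes same-size RGBA layer images; uses the Pillow library.
from typing import List
from PIL import Image, ImageFilter

def edit_target_layer(layer: Image.Image, operation: str) -> Image.Image:
    """Apply one of the edits mentioned above to the target layer only."""
    if operation == "rotate":
        return layer.rotate(90, expand=False)          # rotate in place, keep canvas size
    if operation == "crop":
        w, h = layer.size
        cropped = layer.crop((0, 0, w // 2, h // 2))   # keep the top-left quarter
        canvas = Image.new("RGBA", (w, h), (0, 0, 0, 0))
        canvas.paste(cropped, (0, 0))                  # pad back to the original canvas size
        return canvas
    if operation == "filter":
        return layer.filter(ImageFilter.GaussianBlur(radius=2))
    return layer

def composite(layers: List[Image.Image]) -> Image.Image:
    """Flatten the layer stack (bottom layer first) into the displayed target image."""
    result = layers[0].copy()
    for layer in layers[1:]:
        result = Image.alpha_composite(result, layer)
    return result
```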
For example, the electronic device may receive a fifth input from the user on the boundary line between the first area and the second area, such as dragging the boundary line, and in response to the fifth input, adjust the proportion of the first area or the second area in the image editing interface.
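A minimal sketch of this ratio adjustment follows, assuming a layout in which the first area sits on one side of a draggable boundary line; the clamping limits are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch: recompute the share of the image editing interface
# given to the first area when the user drags the boundary line.
def split_ratio(drag_position: float, interface_extent: float,
                min_ratio: float = 0.2, max_ratio: float = 0.8) -> float:
    """Fraction of the interface occupied by the first area, clamped so that
    neither the first area nor the second area collapses completely."""
    ratio = drag_position / interface_extent
    return max(min_ratio, min(max_ratio, ratio))

print(split_ratio(drag_position=900, interface_extent=1080))  # -> 0.8 (clamped)
```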
In the embodiments of the present application, when a target image is displayed, the electronic device receives a first input from a user on the target image, displays an image editing interface in response to the first input, displays the layers of the target image in a first area of the image editing interface, receives a second input from the user on a layer of the target image, and edits that layer in response to the second input. In this way, a layer of the target image can be edited through simple operations without regenerating the image from the original images, which improves image editing efficiency.
In some embodiments, after the layer of the target image is edited, the method may further include: receiving a fourth input from the user on the image editing interface, such as a long press on the image editing interface, and displaying the edited target image in response to the fourth input, so that the user can conveniently view the editing effect.
As one example, in a case where the electronic device includes a first screen and a second screen, in response to the first input, the image editing interface may be displayed on the first screen and the target image may be displayed on the second screen. After the layer of the target image is edited, the method may further include: displaying the image editing interface on the first screen and displaying the edited target image on the second screen, so that the user can view the editing effect of the target image in real time, which improves the user experience. It is understood that the display contents of the first screen and the second screen may be interchanged.
Fig. 2 is a schematic diagram of a target image provided in an embodiment of the present application. As shown in fig. 2, the target image includes 7 layers: layer 001, layer 002, layer 003, layer 004, layer 005, layer 006, and layer 007. Taking the editing of the target image shown in fig. 2 as an example, the image editing method provided in the embodiments of the present application may include the following steps:
Step 1. As shown in fig. 3, the electronic device displays the complete target image on a screen for the user to operate on, and receives a first input from the user.
Step 2. In response to the first input, the electronic device displays an image editing interface and displays the layers of the target image in a first area of the image editing interface.
Optionally, as shown in fig. 4, in response to the first input, the electronic device may display the image editing interface 10 horizontally and display the layers of the target image in the sub-area 111 of the first area 11; as shown in fig. 5, in response to the first input, the electronic device may display the image editing interface 10 vertically and display the layers of the target image in the sub-area 111 of the first area 11.
Referring to fig. 4 and 5, the image editing interface 10 includes a first area 11 and a second area 12; the layers may be displayed in the first area 11, and the second area 12 may display the image of a target layer once the target layer is selected. It can be understood that the layers displayed in the first area 11 may be slid left and right, and the layer focus in the first area 11 is updated accordingly during sliding.
Alternatively, in a case where the electronic device includes a first screen 1 and a second screen 2, in response to the first input, the electronic device may display the image editing interface 10 on the first screen 1 and display a real-time editing view of the target image on the second screen 2. As shown in fig. 6 and 7, the display on the first screen 1 is similar to the single-screen case shown in fig. 4 and 5, except that a real-time editing view of the target image can be displayed directly on the second screen 2.
Step 3. The electronic device receives a second input from the user on a layer of the target image.
Step 4. In response to the second input, the electronic device edits the layer of the target image.
In response to the second input, the electronic device may directly adjust the hierarchy of the target layer, delete the target layer, or add a layer among the layers of the target image. Then, a sixth input on the target layer may be received, for example a long press on the target layer, and in response to the sixth input, the image of the target layer is displayed in the second area 12 of the image editing interface 10. A third input from the user on the image of the target layer is then received, and the image of the target layer is edited in response to the third input.
Step 5. A seventh input from the user on the image editing interface 10 is received, for example sliding on the image editing interface 10; in response to the seventh input, the edited target image is saved and the image editing interface 10 is exited.
In the image editing method provided in the embodiments of the present application, the execution subject may be an image editing apparatus applied to an electronic device, or a control module in the image editing apparatus for executing the image editing method. The image editing apparatus provided in the embodiments of the present application is described below by taking an image editing apparatus applied to an electronic device executing the image editing method as an example.
Fig. 8 is a schematic structural diagram of an image editing apparatus according to an embodiment of the present application, and as shown in fig. 8, an image editing apparatus 800 applied to an electronic device includes:
the receiving module 810 is configured to receive a first input of a target image from a user in a case that the target image is displayed, where the target image includes at least one layer.
The display module 820 is configured to display the image editing interface in response to the first input, and display a layer of the target image in a first area of the image editing interface.
The receiving module 810 is further configured to receive a second input of the layer of the target image by the user.
And the editing module 830 is configured to edit the layer of the target image in response to the second input.
In the embodiments of the present application, when a target image is displayed, the electronic device receives a first input from a user on the target image, displays an image editing interface in response to the first input, displays the layers of the target image in a first area of the image editing interface, receives a second input from the user on a layer of the target image, and edits that layer in response to the second input. In this way, a layer of the target image can be edited through simple operations without regenerating the image from the original images, which improves image editing efficiency.
In some embodiments, the display module 820 includes a first display unit, configured to display some of the layers of the target image in the first area. In this way, only the layers the user is interested in are displayed, which facilitates subsequent editing of those layers.
In one embodiment, editing the layer of the target image includes any one of the following operations:
adjusting the level of a target layer, deleting the target layer, or adding a layer among the layers of the target image. In this way, the layers in the target image can be adjusted quickly, improving image editing efficiency.
In one embodiment, the editing module 830 includes: a second display unit, configured to display the image of the target layer in a second area of the image editing interface, where the second area does not overlap the first area; a first receiving unit, configured to receive a third input from the user on the image of the target layer; and an editing unit, configured to edit the image of the target layer in response to the third input. In this way, the image in the target layer can be edited quickly without affecting other layers, improving the editing effect.
In an embodiment, after the layer of the target image is edited, the receiving module 810 is further configured to receive a fourth input from the user on the image editing interface, and the display module 820 is further configured to display the edited target image in response to the fourth input, so that the user can conveniently view the editing effect.
In one embodiment, in a case where the electronic device includes a first screen and a second screen, the display module 820 includes a third display unit, configured to display the image editing interface on the first screen and display the target image on the second screen in response to the first input.
After the layer of the target image is edited, the third display unit is further configured to display the image editing interface on the first screen and display the edited target image on the second screen, so that the user can view the editing effect of the target image in real time, which improves the user experience.
The image editing apparatus 800 in the embodiments of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), an automated teller machine, a self-service machine, or the like; the embodiments of the present application are not specifically limited thereto.
The image editing apparatus 800 in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The image editing apparatus 800 provided in this embodiment of the application can implement each process implemented in the method embodiment of fig. 1, and is not described here again to avoid repetition.
As shown in fig. 9, an electronic device 900 is further provided in this embodiment of the present application, and includes a processor 901, a memory 902, and a program or an instruction stored in the memory 902 and executable on the processor 901, where the program or the instruction is executed by the processor 901 to implement each process of the above-mentioned embodiment of the image editing method, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 10 is a hardware configuration diagram of another electronic device provided in an embodiment of the present application. The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further comprise a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1010 through a power management system, so as to manage charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 10 does not constitute a limitation on the electronic device, which may include more or fewer components than those shown, combine some components, or arrange the components differently; this is not described in detail here.
The user input unit 1007 is configured to receive a first input from a user on a target image while the target image is displayed, where the target image includes at least one layer.
The display unit 1006 is configured to display the image editing interface in response to the first input, and display a layer of the target image in a first area of the image editing interface.
The user input unit 1007 is further configured to receive a second input by the user for the layer of the target image.
A processor 1010 for editing layers of the target image in response to a second input.
In the embodiments of the present application, when a target image is displayed, the electronic device receives a first input from a user on the target image, displays an image editing interface in response to the first input, displays the layers of the target image in a first area of the image editing interface, receives a second input from the user on a layer of the target image, and edits that layer in response to the second input. In this way, a layer of the target image can be edited through simple operations without regenerating the image from the original images, which improves image editing efficiency.
In some embodiments, the display unit 1006 is specifically configured to display some of the layers of the target image in the first area. In this way, only the layers the user is interested in are displayed, which facilitates subsequent editing of those layers.
In some embodiments, editing the layer of the target image comprises any one of:
adjusting the level of a target layer, deleting the target layer, or adding a layer among the layers of the target image. In this way, the layers in the target image can be adjusted quickly, improving image editing efficiency.
In some embodiments, the display unit 1006 is further configured to display the image of the target layer in a second area of the image editing interface, where the second area does not overlap the first area. The user input unit 1007 is further configured to receive a third input from the user on the image of the target layer. The processor 1010 is specifically configured to edit the image of the target layer in response to the third input. In this way, the image in the target layer can be edited quickly without affecting other layers, improving the editing effect.
In some embodiments, after the layer of the target image is edited, the user input unit 1007 is further configured to receive a fourth input from the user on the image editing interface, and the display unit 1006 is further configured to display the edited target image in response to the fourth input, so that the user can conveniently view the editing effect.
In some embodiments, where the electronic device includes a first screen and a second screen, the display unit 1006 is specifically configured to: in response to the first input, an image editing interface is displayed on the first screen and a target image is displayed on the second screen.
After the layer of the target image is edited, the display unit 1006 is further configured to display the image editing interface on the first screen and display the edited target image on the second screen, so that the user can view the editing effect of the target image in real time, which improves the user experience.
It should be understood that, in the embodiments of the present application, the input unit 1004 may include a graphics processing unit (GPU) and a microphone; the graphics processing unit processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 1006 may include a display panel, which may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like. The user input unit 1007 includes a touch panel, also known as a touch screen, and other input devices. The touch panel may include two parts: a touch detection device and a touch controller. Other input devices may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1009 may be used to store software programs and various data, including but not limited to application programs and an operating system. The processor 1010 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may not be integrated into the processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image editing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device in the above embodiment. Readable storage media, including computer-readable storage media, such as Read-Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, etc.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image editing method, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (10)
1. An image editing method, applied to an electronic device, includes:
receiving a first input of a user to a target image under the condition that the target image is displayed, wherein the target image comprises at least one image layer;
responding to the first input, displaying an image editing interface, and displaying a layer of the target image in a first area of the image editing interface;
receiving a second input of the user to the layer of the target image;
and responding to the second input, editing the layer of the target image.
2. The method of claim 1, wherein displaying the layer of the target image in the first area of the image editing interface comprises:
and displaying a part of image layers of the target image in the first area.
3. The method according to claim 1, wherein the editing the layer of the target image comprises any one of the following operations:
adjusting the level of a target layer, deleting the target layer, and adding the layer in the layer of the target image.
4. The method of claim 1, wherein said editing the layer of the target image comprises:
displaying an image of a target layer in a second area of the image editing interface, wherein the second area is not overlapped with the first area;
receiving a third input of the user to the image of the target layer;
and responding to the third input, editing the image of the target layer.
5. The method according to any of claims 1-4, wherein after said editing the layer of the target image, the method further comprises:
receiving a fourth input of the image editing interface by the user;
displaying the edited target image in response to the fourth input.
6. An image editing apparatus, applied to an electronic device, includes:
the device comprises a receiving module, a display module and a display module, wherein the receiving module is used for receiving a first input of a user to a target image under the condition that the target image is displayed, and the target image comprises at least one image layer;
the display module is used for responding to the first input, displaying an image editing interface and displaying the layer of the target image in a first area of the image editing interface;
the receiving module is further configured to receive a second input of the user to the layer of the target image;
and the editing module is used for responding to the second input and editing the layer of the target image.
7. The apparatus of claim 6, wherein the display module comprises:
and the first display unit is used for displaying a part of image layers of the target image in the first area.
8. The apparatus according to claim 6, wherein the editing the layer of the target image comprises any one of:
adjusting the level of a target layer, deleting the target layer, and adding the layer in the layer of the target image.
9. The apparatus of claim 6, wherein the editing module comprises:
the second display unit is used for displaying the image of the target layer in a second area of the image editing interface, wherein the second area is not overlapped with the first area;
the first receiving unit is used for receiving a third input of the user to the image of the target layer;
and the editing unit is used for responding to the third input and editing the image of the target layer.
10. The apparatus according to any one of claims 6 to 9, wherein after the editing of the layer of the target image, the receiving module is further configured to receive a fourth input of the user to the image editing interface;
the display module is further configured to display the edited target image in response to the fourth input.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110082957.5A CN112860163B (en) | 2021-01-21 | 2021-01-21 | Image editing method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110082957.5A CN112860163B (en) | 2021-01-21 | 2021-01-21 | Image editing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112860163A true CN112860163A (en) | 2021-05-28 |
CN112860163B CN112860163B (en) | 2022-11-11 |
Family
ID=76008925
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110082957.5A Active CN112860163B (en) | 2021-01-21 | 2021-01-21 | Image editing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112860163B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113362426A (en) * | 2021-06-21 | 2021-09-07 | 维沃移动通信(杭州)有限公司 | Image editing method and image editing device |
CN113593677A (en) * | 2021-07-21 | 2021-11-02 | 上海商汤智能科技有限公司 | Image processing method, device, equipment and computer readable storage medium |
CN113891127A (en) * | 2021-08-31 | 2022-01-04 | 维沃移动通信有限公司 | Video editing method and device and electronic equipment |
CN113888676A (en) * | 2021-10-19 | 2022-01-04 | 乐美科技股份私人有限公司 | Picture editing method and device and readable storage medium |
CN114359094A (en) * | 2021-12-30 | 2022-04-15 | 网易(杭州)网络有限公司 | Image processing method, device, equipment and storage medium |
CN114500833A (en) * | 2022-01-13 | 2022-05-13 | 西安维沃软件技术有限公司 | Shooting method and device and electronic equipment |
CN114979482A (en) * | 2022-05-23 | 2022-08-30 | 维沃移动通信有限公司 | Shooting method, shooting device, electronic equipment and medium |
CN114968040A (en) * | 2022-05-19 | 2022-08-30 | 北京市商汤科技开发有限公司 | Image processing method and device, electronic device and storage medium |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102681837A (en) * | 2011-01-14 | 2012-09-19 | 奥多比公司 | Systems and methods providing user interface features for editing multi-layer images |
JP2017512335A (en) * | 2014-02-19 | 2017-05-18 | クアルコム,インコーポレイテッド | Image editing techniques for devices |
CN108037872A (en) * | 2017-11-29 | 2018-05-15 | 上海爱优威软件开发有限公司 | A kind of photo editing method and terminal device |
CN109388301A (en) * | 2018-09-14 | 2019-02-26 | Oppo(重庆)智能科技有限公司 | Screenshot method and relevant apparatus |
CN109859211A (en) * | 2018-12-28 | 2019-06-07 | 努比亚技术有限公司 | A kind of image processing method, mobile terminal and computer readable storage medium |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113362426A (en) * | 2021-06-21 | 2021-09-07 | 维沃移动通信(杭州)有限公司 | Image editing method and image editing device |
CN113362426B (en) * | 2021-06-21 | 2023-03-31 | 维沃移动通信(杭州)有限公司 | Image editing method and image editing device |
CN113593677A (en) * | 2021-07-21 | 2021-11-02 | 上海商汤智能科技有限公司 | Image processing method, device, equipment and computer readable storage medium |
CN113891127A (en) * | 2021-08-31 | 2022-01-04 | 维沃移动通信有限公司 | Video editing method and device and electronic equipment |
CN113888676A (en) * | 2021-10-19 | 2022-01-04 | 乐美科技股份私人有限公司 | Picture editing method and device and readable storage medium |
CN114359094A (en) * | 2021-12-30 | 2022-04-15 | 网易(杭州)网络有限公司 | Image processing method, device, equipment and storage medium |
CN114500833A (en) * | 2022-01-13 | 2022-05-13 | 西安维沃软件技术有限公司 | Shooting method and device and electronic equipment |
CN114500833B (en) * | 2022-01-13 | 2024-02-02 | 西安维沃软件技术有限公司 | Shooting method and device and electronic equipment |
CN114968040A (en) * | 2022-05-19 | 2022-08-30 | 北京市商汤科技开发有限公司 | Image processing method and device, electronic device and storage medium |
CN114979482A (en) * | 2022-05-23 | 2022-08-30 | 维沃移动通信有限公司 | Shooting method, shooting device, electronic equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN112860163B (en) | 2022-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112860163B (en) | Image editing method and device | |
CN112887794B (en) | Video editing method and device | |
CN112836086B (en) | Video processing method and device and electronic equipment | |
CN111857460A (en) | Split screen processing method, split screen processing device, electronic equipment and readable storage medium | |
CN112099714B (en) | Screenshot method and device, electronic equipment and readable storage medium | |
CN112911147B (en) | Display control method, display control device and electronic equipment | |
CN112148167A (en) | Control setting method and device and electronic equipment | |
WO2022068721A1 (en) | Screen capture method and apparatus, and electronic device | |
CN114116098A (en) | Application icon management method and device, electronic equipment and storage medium | |
CN115866314B (en) | Video playing method and device | |
CN112954484A (en) | Bullet screen information display method and device | |
CN113726953B (en) | Display content acquisition method and device | |
CN112698771B (en) | Display control method, device, electronic equipment and storage medium | |
CN113810538B (en) | Video editing method and video editing device | |
WO2022183967A1 (en) | Video picture display method and apparatus, and device, medium and program product | |
CN115718581A (en) | Information display method and device, electronic equipment and storage medium | |
CN115550741A (en) | Video management method and device, electronic equipment and readable storage medium | |
CN114625296A (en) | Application processing method and device | |
CN112732958A (en) | Image display method and device and electronic equipment | |
CN113393372A (en) | Desktop wallpaper setting method and device | |
CN112418942A (en) | Advertisement display method and device and electronic equipment | |
CN112463090B (en) | Resolution adjustment method, device, equipment and medium | |
CN116108299A (en) | Display method, display device, electronic equipment and readable storage medium | |
CN115328367A (en) | Screen capturing method and device, electronic equipment and storage medium | |
CN117319549A (en) | Multimedia data selection method and device |
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant