
CN107682631B - Image processing method and mobile terminal - Google Patents


Info

Publication number
CN107682631B
CN107682631B CN201710953465.2A
Authority
CN
China
Prior art keywords
channel data
target
pixel point
channel
concentration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710953465.2A
Other languages
Chinese (zh)
Other versions
CN107682631A (en)
Inventor
陈伟星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201710953465.2A priority Critical patent/CN107682631B/en
Publication of CN107682631A publication Critical patent/CN107682631A/en
Application granted granted Critical
Publication of CN107682631B publication Critical patent/CN107682631B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image processing method and a mobile terminal, relating to the field of communications technology. The method comprises the following steps: acquiring a three-primary-color light mode image; normalizing the three-primary-color light mode image to obtain brightness channel data, chrominance channel data and concentration channel data; and performing parallel calculation on the brightness channel data, the chrominance channel data and the concentration channel data respectively, and synthesizing the calculation results to obtain an output image. The invention improves the performance and effect of image processing, increases the processing speed of the mobile terminal, and reduces its power consumption.

Description

Image processing method and mobile terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an image processing method and a mobile terminal.
Background
With the development of technology, users place ever higher demands on the camera and video images of smartphones and expect smartphones to process larger and clearer pictures. Therefore, increasingly complex image processing methods are being applied to mobile devices such as mobile phones, which in turn places higher requirements on the hardware processing capability of those phones.
Existing image processing usually relies on a color model conversion method whose amount of computation is very large, which greatly affects the performance and effect of image processing and, in turn, the processing speed of the mobile terminal.
Disclosure of Invention
The embodiment of the invention provides an image processing method and a mobile terminal, and aims to solve the problem that the large amount of computation required by conventional image processing methods limits the processing speed of the mobile terminal.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an image processing method applied to a mobile terminal, including:
acquiring a three-primary-color light mode image;
normalizing the three primary color light mode image to obtain brightness channel data, chrominance channel data and concentration channel data;
and respectively carrying out parallel calculation on the brightness channel data, the chrominance channel data and the concentration channel data, and synthesizing calculation results to obtain an output image.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including:
the acquisition module is used for acquiring a three-primary-color light mode image;
the processing module is used for carrying out normalization processing on the three primary color light mode image to obtain brightness channel data, chrominance channel data and concentration channel data;
and the synthesis module is used for respectively carrying out parallel computation on the brightness channel data, the chrominance channel data and the concentration channel data, and synthesizing the computation results to obtain an output image.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the image processing method according to the first aspect.
In a fourth aspect, the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps in the image processing method according to the first aspect.
In this way, in the embodiment of the present invention, a three-primary-color light mode image is obtained, normalization processing is performed on the three-primary-color light mode image, brightness channel data, chromaticity channel data, and concentration channel data are obtained, parallel calculation is performed on the brightness channel data, the chromaticity channel data, and the concentration channel data, respectively, and calculation results are synthesized, so that an output image is obtained. Because the embodiment of the invention performs parallel computation on the data of each channel, compared with the prior art, the embodiment of the invention has higher processing speed, thereby improving the performance and effect of image processing, further improving the processing speed of the mobile terminal and reducing the power consumption of the mobile terminal.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
FIG. 1 is a flow chart of an image processing method provided by an embodiment of the invention;
FIG. 2 is a schematic diagram of image effects using a prior art method;
FIG. 3 is a schematic diagram of image effects using a method of an embodiment of the invention;
fig. 4 is a block diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 5 is a block diagram of a processing module provided by an embodiment of the present invention;
FIG. 6 is a block diagram of a composition processing module provided by an embodiment of the present invention;
FIG. 7 is a block diagram of a zoom processing sub-module of an embodiment of the present invention;
fig. 8 is a block diagram of another mobile terminal according to an embodiment of the present invention;
fig. 9 is a block diagram of another mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As described in the background art, the image processing method in the prior art requires a large amount of computation, which greatly affects the performance of the mobile terminal; when a large image is processed on a mobile device such as a mobile phone, the computation time often becomes a performance bottleneck. Meanwhile, image information is lost after the conversion, the edges are not very smooth, and the effect is not ideal.
Referring to fig. 1, fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention, as shown in fig. 1, including the following steps:
step 101, obtaining a three primary color light mode image.
The three primary color light mode image includes, but is not limited to, an RGB image.
And 102, carrying out normalization processing on the three primary color light mode image to obtain brightness channel data, chrominance channel data and concentration channel data.
In the embodiment of the present invention, for a target pixel point in the three primary color light mode image, a Graphics Processing Unit (GPU) is used to compare the value of each channel of the target pixel point with 0 and with 255. If the value of a target channel of the target pixel point is larger than 255, the value of the target channel is normalized to 255; if the value of the target channel is less than 0, the value of the target channel is normalized to 0. In this way, the brightness channel (Y channel) data, the chrominance channel (U channel) data, and the concentration channel (V channel) data are obtained.
And the target pixel point is any pixel point in the three primary color light mode image.
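To make the normalization step concrete, the following short sketch (NumPy code written for this description as an illustrative assumption, not code from the patent) performs the same per-channel clamping for every pixel at once, which is what the GPU does in parallel:

    import numpy as np

    def normalize_rgb(rgb):
        """Clamp every channel of an H x W x 3 image into the range [0, 255]."""
        # Values greater than 255 are normalized to 255; values less than 0 to 0.
        return np.clip(rgb, 0, 255)

    # A hypothetical pixel whose channels fall outside the valid range:
    pixel = np.array([[[300.0, -12.0, 128.0]]])
    print(normalize_rgb(pixel))   # -> [[[255. 0. 128.]]]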
And 103, respectively carrying out parallel calculation on the brightness channel data, the chrominance channel data and the concentration channel data, and synthesizing calculation results to obtain an output image.
In this step, a graphics processor is used to convolve and multiply the brightness channel data, the chrominance channel data and the concentration channel data, respectively, to obtain target brightness channel data, target chrominance channel data and target concentration channel data; linear scaling is performed on the target chrominance channel data and the target concentration channel data to obtain linearly scaled target chrominance channel data and target concentration channel data; and the target brightness channel data, the linearly scaled target chrominance channel data and the linearly scaled target concentration channel data are synthesized to obtain an output image.
Specifically, in this step, the graphics processor convolves and multiplies the normalized data of each channel to obtain the Y-channel data, U-channel data, and V-channel data.
When the linear scaling is carried out, for a target pixel point in the three primary color light mode image, the target chrominance channel data of the four pixel points adjacent to the target pixel point are acquired with the target pixel point as the center, and the average value of the target chrominance channel data of the four adjacent pixel points is taken as the target chrominance channel data of the target pixel point. Likewise, with the target pixel point as the center, the target concentration channel data of the four adjacent pixel points are acquired, and their average value is taken as the target concentration channel data of the target pixel point. The target pixel point is any pixel point in the three primary color light mode image.
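A compact sketch of this 1/4 scaling follows (my own NumPy interpretation of "averaging the four adjacent pixel points", implemented here as averaging each 2x2 neighbourhood so that the resulting plane has half the width and half the height, matching the quarter-size U and V planes discussed below); the same operation is applied to the concentration (V) plane:

    import numpy as np

    def quarter_scale(plane):
        """Average each 2x2 group of neighbouring pixels of an H x W plane (H, W even)."""
        h, w = plane.shape
        return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    u_plane = np.arange(16, dtype=np.float32).reshape(4, 4)  # hypothetical full-size U plane
    print(quarter_scale(u_plane))                            # 2 x 2 plane of averaged values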
Taking the three primary color light mode image as an RGB image as an example: since the normalization of the image is completed by GPU hardware, the scheme of the embodiment of the present invention introduces essentially no performance overhead for that step, and the performance cost is concentrated in the computation of I420. Therefore, the I420 computation is performed independently here. This differs from a conventional GPU-based I420 calculation, in which the final image data is arranged in memory in an interleaved YUVYUV... format, which requires a rearrangement operation before output to the target image and occupies (image resolution x 4) bytes of memory.
In the embodiment of the present invention, the I420 conversion is performed according to the following formulas:
Y=A[i,1]*B[1,1]+A[i,1]*B[1,2]+A[i,1]*B[1,3];
U=A[i,2]*B[1,1]+A[i,2]*B[1,2]+A[i,2]*B[1,3];
V=A[i,3]*B[1,1]+A[i,3]*B[1,2]+A[i,3]*B[1,3];
Y, U and V respectively represent the Y-channel data, the U-channel data and the V-channel data; A[i, 1] represents the Y-channel normalized data, A[i, 2] represents the U-channel normalized data, and A[i, 3] represents the V-channel normalized data; matrix B represents the convolution sum.
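Transcribed literally into vectorized form, the three formulas read as follows (a sketch only: indices are shifted to 0-based, and the numeric values in A and B are placeholder assumptions, since the patent does not list concrete coefficients):

    import numpy as np

    A = np.array([[0.50, 0.20, 0.70],      # A[i, 1..3]: normalized data, one row per pixel
                  [0.10, 0.90, 0.30]])
    B = np.array([[0.299, 0.587, 0.114]])  # B[1, 1..3]: assumed example coefficients

    Y = A[:, 0] * B[0, 0] + A[:, 0] * B[0, 1] + A[:, 0] * B[0, 2]
    U = A[:, 1] * B[0, 0] + A[:, 1] * B[0, 1] + A[:, 1] * B[0, 2]
    V = A[:, 2] * B[0, 0] + A[:, 2] * B[0, 1] + A[:, 2] * B[0, 2]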
Therefore, the Y-channel data, U-channel data and V-channel data calculated in the embodiment of the invention can be stored in three separate blocks of memory, one per channel, and these blocks are finally combined to obtain the target image. Meanwhile, since the U and V channel data each occupy only 1/4 of the Y channel in actual memory, the memory usage of the embodiment of the present invention is reduced by about 2/3 compared with the existing method; and because parallel processing is adopted, for a 1920x1080 image the time consumed by the method of the embodiment of the present invention is about 1/10 of that of the existing method.
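A small worked example of this memory accounting (my own arithmetic, assuming one byte per sample) shows why three separate planes with quarter-size U and V reduce memory usage by roughly 2/3 relative to a conventional (resolution x 4) buffer:

    width, height = 1920, 1080

    y_bytes = width * height                # full-resolution Y plane
    u_bytes = (width // 2) * (height // 2)  # quarter-size U plane
    v_bytes = (width // 2) * (height // 2)  # quarter-size V plane

    planar_total = y_bytes + u_bytes + v_bytes   # 1.5 bytes per pixel
    packed_total = width * height * 4            # conventional (resolution x 4) buffer

    print(planar_total / packed_total)           # 0.375, i.e. roughly 2/3 less memory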
In the embodiment of the invention, because the 1/4 scaling of the U and V channels is handled by GPU hardware, which derives the quarter-resolution U and V channel values of a target pixel point by averaging the channel values of the 4 surrounding points, the resulting image is smoother and more natural than an image obtained with the method in the prior art.
Fig. 2 shows the effect of an image obtained according to the prior-art scheme, and fig. 3 shows the effect of an image obtained using the method of an embodiment of the invention. As can be seen from fig. 3, the image obtained with the method of the embodiment of the present invention is smoother.
In the embodiment of the invention, a three-primary-color light mode image is obtained, normalization processing is carried out on the three-primary-color light mode image to obtain brightness channel data, chroma channel data and concentration channel data, parallel calculation is respectively carried out on the brightness channel data, the chroma channel data and the concentration channel data, and calculation results are synthesized to obtain an output image. Because the embodiment of the invention performs parallel computation on the data of each channel, compared with the prior art, the embodiment of the invention has higher processing speed, thereby improving the performance and effect of image processing, further improving the processing speed of the mobile terminal and reducing the power consumption of the mobile terminal.
Tests show that, for a 1920x1080 RGB image, the processing time is reduced from 20 ms to about 2 ms by using the method of the embodiment of the invention.
In the embodiment of the present invention, the method may be applied to a mobile terminal, for example: a mobile phone, a tablet computer (Tablet Personal Computer), a laptop computer, a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a wearable device, or the like.
Referring to fig. 4, fig. 4 is a structural diagram of a mobile terminal according to an embodiment of the present invention, and as shown in fig. 4, the mobile terminal 400 includes:
an obtaining module 401, configured to obtain a three-primary-color light mode image; a processing module 402, configured to perform normalization processing on the three primary color light mode image to obtain brightness channel data, chrominance channel data, and concentration channel data; a synthesizing module 403, configured to perform parallel computation on the brightness channel data, the chrominance channel data, and the concentration channel data, respectively, and synthesize the computation results to obtain an output image.
The three primary color light mode image includes, but is not limited to, an RGB image.
Optionally, as shown in fig. 5, the processing module 402 includes:
the comparison submodule 4021 is configured to compare, for a target pixel in the three primary color light mode image, a value of each channel of the target pixel with 0, and compare a value of each channel with 255; the processing submodule 4022 is configured to normalize the value of the target channel to 255 if the value of the target channel of the target pixel is greater than 255; and if the value of the target channel is less than 0, normalizing the value of the target channel to be 0 so as to obtain brightness channel data, chrominance channel data and concentration channel data.
Optionally, as shown in fig. 6, the synthesis module 403 includes:
a first processing sub-module 4031, configured to convolve and multiply the brightness channel data, the chrominance channel data, and the concentration channel data, respectively, by using a graphics processor, to obtain target brightness channel data, target chrominance channel data, and target concentration channel data;
a scaling sub-module 4032, configured to perform linear scaling on the target chrominance channel data and the target concentration channel data, to obtain target chrominance channel data and target concentration channel data after the linear scaling;
and a synthesis sub-module 4033, configured to synthesize the target brightness channel data, the target chromaticity channel data after the linear scaling processing, and the target concentration channel data, so as to obtain an output image.
As shown in fig. 7, the scaling sub-module 4032 includes:
a first scaling unit 40321, configured to, for a target pixel point in the three-primary-color-light-mode image, obtain target chrominance channel data of four adjacent pixel points of the target pixel point with the target pixel point as a center, and use an average value of the target chrominance channel data of the four adjacent pixel points as the target chrominance channel data of the target pixel point;
a second scaling unit 40322, configured to obtain target concentration channel data of four adjacent pixels of the target pixel with the target pixel as a center, and use an average value of the target concentration channel data of the four adjacent pixels as the target concentration channel data of the target pixel;
and the target pixel point is any pixel point in the three primary color light mode image.
Optionally, as shown in fig. 8, the mobile terminal may further include: a display module 404, configured to display the output image.
The mobile terminal 400 can implement each process implemented by the mobile terminal in the method embodiment of fig. 1, and is not described herein again to avoid repetition.
In the embodiment of the invention, a three-primary-color light mode image is obtained, normalization processing is carried out on the three-primary-color light mode image to obtain brightness channel data, chroma channel data and concentration channel data, parallel calculation is respectively carried out on the brightness channel data, the chroma channel data and the concentration channel data, and calculation results are synthesized to obtain an output image. Because the embodiment of the invention performs parallel computation on the data of each channel, compared with the prior art, the embodiment of the invention has higher processing speed, thereby improving the performance and effect of image processing, further improving the processing speed of the mobile terminal and reducing the power consumption of the mobile terminal.
Fig. 9 is a schematic diagram of a hardware configuration of a mobile terminal implementing various embodiments of the present invention.
the mobile terminal 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 980, and a power supply 911. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 9 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 980 is configured to obtain three primary color light mode images; normalizing the three primary color light mode image by using a graphic processor to obtain normalized image data; respectively processing the normalized image data by using the graphic processor to obtain Y channel data, U channel data and V channel data; respectively carrying out linear scaling on the U channel data and the V channel data by using the graphics processor to obtain U channel data subjected to linear scaling and V channel data subjected to linear scaling; and synthesizing the Y-channel data, the U-channel data after the linear scaling and the V-channel data after the linear scaling to obtain an output image.
The processor 980 is configured to, for a target pixel point in the three primary color light mode image, compare a value of each channel of the target pixel point with 0, and compare a value of each channel with 255; if the value of the target channel of the target pixel point is larger than 255, normalizing the value of the target channel to be 255; and if the value of the target channel is less than 0, normalizing the value of the target channel to be 0.
The processor 980 is configured to, by using the graphics processor, perform convolution and multiplication on the normalized data of each channel in the normalized data to obtain Y-channel data, U-channel data, and V-channel data.
The processor 980 is configured to, for a target pixel point in the three primary color light mode image, obtain U channel values of four adjacent pixel points of the target pixel point with the target pixel point as a center, and use an average value of the U channel values of the four adjacent pixel points as the U channel value of the target pixel point; taking the target pixel point as a center, acquiring V channel values of four adjacent pixel points of the target pixel point, and taking an average value of the V channel values of the four adjacent pixel points as the V channel value of the target pixel point; and the target pixel point is any pixel point in the three primary color light mode image.
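Putting the steps the processor 980 performs together, the following end-to-end sketch (function names, shapes, and weights are my own illustrative assumptions, not the patent's code) clamps the RGB input, derives the Y, U, and V planes, and scales U and V to quarter size, keeping the three planes in separate memory blocks for the final synthesis:

    import numpy as np

    def rgb_to_i420_planes(rgb, weights=(0.299, 0.587, 0.114)):
        """Return (Y, U_quarter, V_quarter) planes for an H x W x 3 image (H, W even)."""
        rgb = np.clip(rgb, 0, 255).astype(np.float32)   # normalization (clamping) step
        w_sum = float(sum(weights))
        y = rgb[..., 0] * w_sum                         # per the formulas quoted above (illustrative)
        u = rgb[..., 1] * w_sum
        v = rgb[..., 2] * w_sum

        def quarter(plane):                             # 1/4 scaling by averaging 2x2 neighbours
            h, w = plane.shape
            return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

        return y, quarter(u), quarter(v)

    y, u, v = rgb_to_i420_planes(np.random.randint(0, 256, (4, 4, 3)))
    print(y.shape, u.shape, v.shape)                    # (4, 4) (2, 2) (2, 2)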
The display unit 906 is configured to display the output image.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 901 may be used for receiving and sending signals during a message transmission and reception process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 980; in addition, the uplink data is transmitted to the base station. Generally, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access via the network module 902, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output as sound. Also, the audio output unit 903 may also provide audio output related to a specific function performed by the mobile terminal 900 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive audio or video signals. The input unit 904 may include a Graphics Processing Unit (GPU) 9041 and a microphone 9042, and the graphics processor 9041 processes image data of a still picture or video obtained by an image capturing device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphics processor 9041 may be stored in the memory 909 (or other storage medium) or transmitted via the radio frequency unit 901 or the network module 902. The microphone 9042 can receive sounds and process them into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 901.
The mobile terminal 900 also includes at least one sensor 905, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 9061 and/or backlight when the mobile terminal 900 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 905 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail herein.
The display unit 906 is used to display information input by the user or information provided to the user. The Display unit 906 may include a Display panel 9061, and the Display panel 9061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 907 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 9071 (e.g., operations by a user on or near the touch panel 9071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 9071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 980, and receives and executes commands sent by the processor 980. In addition, the touch panel 9071 may be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 907 may include other input devices 9072 in addition to the touch panel 9071. Specifically, the other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key, a switch key, and the like), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 9071 may be overlaid on the display panel 9061, and when the touch panel 9071 detects a touch operation on or near it, the touch operation is transmitted to the processor 980 to determine the type of the touch event, after which the processor 980 provides a corresponding visual output on the display panel 9061 according to the type of the touch event. Although the touch panel 9071 and the display panel 9061 are shown in fig. 9 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 9071 and the display panel 9061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 908 is an interface through which an external device is connected to the mobile terminal 900. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 908 may be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within the mobile terminal 900 or may be used to transmit data between the mobile terminal 900 and external devices.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 909 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 980 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 909 and calling data stored in the memory 909, thereby integrally monitoring the mobile terminal. Processor 980 may include one or more processing units; preferably, the processor 980 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 980.
The mobile terminal 900 may also include a power supply 911 (e.g., a battery) for powering the various components, and preferably, the power supply 911 is logically connected to the processor 980 via a power management system that provides power management functions to manage charging, discharging, and power consumption.
In addition, the mobile terminal 900 includes some functional modules that are not shown, and thus will not be described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements each process of the above-mentioned embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. An image processing method applied to a mobile terminal is characterized by comprising the following steps:
acquiring a three-primary-color light mode image;
normalizing the three primary color light mode image to obtain brightness channel data, chrominance channel data and concentration channel data;
respectively carrying out parallel calculation on the brightness channel data, the chrominance channel data and the concentration channel data, and synthesizing calculation results to obtain an output image;
wherein, the step of respectively performing parallel computation on the brightness channel data, the chrominance channel data and the density channel data, and synthesizing the computation results to obtain an output image comprises:
respectively convolving and multiplying the brightness channel data, the chrominance channel data and the concentration channel data by using a graphic processor to obtain target brightness channel data, target chrominance channel data and target concentration channel data;
performing linear scaling on the target chrominance channel data and the target concentration channel data to obtain target chrominance channel data and target concentration channel data which are subjected to linear scaling processing;
synthesizing the target brightness channel data, the target chrominance channel data subjected to linear scaling and the target concentration channel data to obtain an output image;
the step of performing linear scaling on the target chrominance channel data and the target concentration channel data to obtain the target chrominance channel data and the target concentration channel data after the linear scaling processing includes:
for a target pixel point in the three primary color light mode image, taking the target pixel point as a center, acquiring target chrominance channel data of four adjacent pixel points of the target pixel point by using a graphic processor, and taking an average value of the target chrominance channel data of the four adjacent pixel points as the target chrominance channel data of the target pixel point;
taking the target pixel point as a center, acquiring target concentration channel data of four adjacent pixel points of the target pixel point by using a graphic processor, and taking an average value of the target concentration channel data of the four adjacent pixel points as the target concentration channel data of the target pixel point;
and the target pixel point is any pixel point in the three primary color light mode image.
2. The method according to claim 1, wherein the step of normalizing the three primary color light mode image to obtain brightness channel data, chrominance channel data, and concentration channel data includes:
for a target pixel point in the three primary color light mode image, comparing the value of each channel of the target pixel point with 0, and comparing the value of each channel with 255;
if the value of the target channel of the target pixel point is larger than 255, normalizing the value of the target channel to be 255; and if the value of the target channel is less than 0, normalizing the value of the target channel to be 0 so as to obtain brightness channel data, chrominance channel data and concentration channel data.
3. The method of claim 1, further comprising:
and displaying the output image.
4. A mobile terminal, comprising:
the acquisition module is used for acquiring a three-primary-color light mode image;
the processing module is used for carrying out normalization processing on the three primary color light mode image to obtain brightness channel data, chrominance channel data and concentration channel data;
the synthesis module is used for respectively carrying out parallel calculation on the brightness channel data, the chrominance channel data and the concentration channel data, and synthesizing calculation results to obtain an output image;
wherein the synthesis module comprises:
the first processing submodule is used for respectively convolving and multiplying the brightness channel data, the chrominance channel data and the concentration channel data by using a graphic processor to obtain target brightness channel data, target chrominance channel data and target concentration channel data;
the scaling processing submodule is used for carrying out linear scaling on the target chrominance channel data and the target concentration channel data to obtain the target chrominance channel data and the target concentration channel data which are subjected to linear scaling processing;
the synthesis submodule is used for synthesizing the target brightness channel data, the target chrominance channel data after linear scaling processing and the target concentration channel data to obtain an output image;
the scaling processing sub-module comprises:
a first scaling processing unit, configured to, for a target pixel point in the three-primary-color-light-mode image, use the target pixel point as a center, obtain, by using a graphics processor, target chrominance channel data of four adjacent pixel points of the target pixel point, and use an average value of the target chrominance channel data of the four adjacent pixel points as the target chrominance channel data of the target pixel point;
a second scaling processing unit, configured to use a graphics processor to obtain target concentration channel data of four adjacent pixels of the target pixel with the target pixel as a center, and use an average value of the target concentration channel data of the four adjacent pixels as the target concentration channel data of the target pixel;
and the target pixel point is any pixel point in the three primary color light mode image.
5. The mobile terminal of claim 4, wherein the processing module comprises:
the comparison submodule is used for comparing the value of each channel of the target pixel point with 0 and comparing the value of each channel with 255 for the target pixel point in the three-primary-color light mode image;
the processing submodule is used for normalizing the value of the target channel to 255 if the value of the target channel of the target pixel point is larger than 255; and if the value of the target channel is less than 0, normalizing the value of the target channel to be 0 so as to obtain brightness channel data, chrominance channel data and concentration channel data.
6. The mobile terminal of claim 4, further comprising:
and the display module is used for displaying the output image.
7. A mobile terminal, comprising: memory, processor and computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, carries out the steps of the image processing method according to any one of claims 1 to 3.
8. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the steps in the image processing method according to any one of claims 1 to 3.
CN201710953465.2A 2017-10-13 2017-10-13 Image processing method and mobile terminal Active CN107682631B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710953465.2A CN107682631B (en) 2017-10-13 2017-10-13 Image processing method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710953465.2A CN107682631B (en) 2017-10-13 2017-10-13 Image processing method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107682631A CN107682631A (en) 2018-02-09
CN107682631B true CN107682631B (en) 2020-09-01

Family

ID=61141212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710953465.2A Active CN107682631B (en) 2017-10-13 2017-10-13 Image processing method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107682631B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1636386A (en) * 2002-02-22 2005-07-06 索尼英国有限公司 Image processing apparatus and method
CN102158714A (en) * 2010-02-11 2011-08-17 昆山锐芯微电子有限公司 Method and device for enhancing image edge based on RGB (Red, Green and Blue) format of Bayer domain
CN105447830A (en) * 2015-11-27 2016-03-30 合一网络技术(北京)有限公司 Method and apparatus for strengthening clarity of dynamic video image
CN105611386A (en) * 2015-12-23 2016-05-25 小米科技有限责任公司 Video picture processing method and device
CN105681800A (en) * 2016-01-27 2016-06-15 桂林长海发展有限责任公司 Device and method for quickly converting YUV420 into RGB format
CN106952245A (en) * 2017-03-07 2017-07-14 深圳职业技术学院 A kind of processing method and system for visible images of taking photo by plane
CN107205143A (en) * 2016-03-17 2017-09-26 深圳超多维光电子有限公司 Method and device for adjusting stereo image

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101951524B (en) * 2009-07-10 2013-06-19 比亚迪股份有限公司 JPEG (Joint Photographic Experts Group) compression method and device of color digital image
CN103544678A (en) * 2012-07-13 2014-01-29 浙江大华技术股份有限公司 Video image processing device and video image processing method
CN106469439A (en) * 2016-08-30 2017-03-01 乐视控股(北京)有限公司 The processing method and processing device of picture in a kind of Night

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1636386A (en) * 2002-02-22 2005-07-06 索尼英国有限公司 Image processing apparatus and method
CN102158714A (en) * 2010-02-11 2011-08-17 昆山锐芯微电子有限公司 Method and device for enhancing image edge based on RGB (Red, Green and Blue) format of Bayer domain
CN105447830A (en) * 2015-11-27 2016-03-30 合一网络技术(北京)有限公司 Method and apparatus for strengthening clarity of dynamic video image
CN105611386A (en) * 2015-12-23 2016-05-25 小米科技有限责任公司 Video picture processing method and device
CN105681800A (en) * 2016-01-27 2016-06-15 桂林长海发展有限责任公司 Device and method for quickly converting YUV420 into RGB format
CN107205143A (en) * 2016-03-17 2017-09-26 深圳超多维光电子有限公司 Method and device for adjusting stereo image
CN106952245A (en) * 2017-03-07 2017-07-14 深圳职业技术学院 A kind of processing method and system for visible images of taking photo by plane

Also Published As

Publication number Publication date
CN107682631A (en) 2018-02-09

Similar Documents

Publication Publication Date Title
CN107957839B (en) Display control method and mobile terminal
CN110109593B (en) Screen capturing method and terminal equipment
CN108038825B (en) Image processing method and mobile terminal
CN108449541B (en) Panoramic image shooting method and mobile terminal
CN107730460B (en) Image processing method and mobile terminal
CN111459233B (en) Display method, electronic device and storage medium
CN111401463B (en) Method for outputting detection result, electronic equipment and medium
CN107734172B (en) Information display method and mobile terminal
CN111031178A (en) Video stream clipping method and electronic equipment
CN109727212B (en) Image processing method and mobile terminal
CN107153500B (en) Method and equipment for realizing image display
CN109005314B (en) Image processing method and terminal
CN110290263B (en) Image display method and mobile terminal
CN109639981B (en) Image shooting method and mobile terminal
CN108259808B (en) Video frame compression method and mobile terminal
CN109348212B (en) Image noise determination method and terminal equipment
CN109462732B (en) Image processing method, device and computer readable storage medium
CN111028161A (en) Image correction method and electronic equipment
CN107977947B (en) Image processing method and mobile terminal
CN111010514B (en) Image processing method and electronic equipment
CN111031265B (en) FSR (frequency selective response) determining method and electronic equipment
CN109189517B (en) Display switching method and mobile terminal
CN107566738A (en) A kind of panorama shooting method, mobile terminal and computer-readable recording medium
CN108965701B (en) Jitter correction method and terminal equipment
CN109062483B (en) Image processing method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant