
CN110493510B - Image processing method and terminal equipment - Google Patents

Image processing method and terminal equipment

Info

Publication number
CN110493510B
CN110493510B (Application No. CN201910615789.4A)
Authority
CN
China
Prior art keywords
parameter
image
color
terminal device
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910615789.4A
Other languages
Chinese (zh)
Other versions
CN110493510A (en)
Inventor
朱泽基
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201910615789.4A priority Critical patent/CN110493510B/en
Publication of CN110493510A publication Critical patent/CN110493510A/en
Application granted granted Critical
Publication of CN110493510B publication Critical patent/CN110493510B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/88 Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

An embodiment of the invention provides an image processing method and a terminal device, applied to the field of communications, for solving the problem of color cast in captured pictures. The terminal device comprises a first camera component and a second camera component, wherein a color card is arranged in the first camera component, and the method comprises the following steps: capturing a first image with the first camera assembly; calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color card; capturing a second image with the second camera assembly; calculating a fourth parameter according to the calibration parameter and a third parameter of the second image; calibrating the second image into a target image according to the fourth parameter; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image. The method is applied to scenarios in which the terminal device shoots images.

Description

Image processing method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an image processing method and terminal equipment.
Background
When a user takes pictures with a terminal device, the terminal device is often required to shoot in various light environments. However, because the color temperature of the light source differs between light environments, the pictures shot by the terminal device exhibit different degrees of color cast. For example, when the terminal device photographs a sheet of white paper in sunlight, the captured picture appears yellowish, i.e., the color of the white paper cannot be accurately restored. Thus, the color cast of captured pictures is an urgent problem to be solved.
Disclosure of Invention
The embodiment of the invention provides an image processing method and terminal equipment, and aims to solve the problem that a color cast phenomenon exists in a shot picture.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides an image processing method applied to a terminal device, where the terminal device includes a first camera component and a second camera component, the first camera component includes a color chip, and the color chip is transparent to light of all colors, and the method includes: capturing a first image with a first camera assembly; calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color chart; capturing a second image by a second camera assembly; calculating a fourth parameter according to the calibration parameter and the third parameter of the second image; calibrating the second image into a target image according to the fourth parameter; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image.
In a second aspect, an embodiment of the present invention further provides a terminal device, where the terminal device includes a first camera component and a second camera component, the first camera component includes a color chip, and the color chip is transparent to light of all colors, and the terminal device further includes: the device comprises an acquisition module, a calculation module and a calibration module; an acquisition module for acquiring a first image by a first camera assembly; the calculation module is used for calculating calibration parameters according to the first parameters of the first image and the second parameters of the color chart, which are acquired by the acquisition module; the acquisition module is also used for acquiring a second image through the second camera assembly; the calculation module is also used for calculating a fourth parameter according to the calibration parameter calculated by the calculation module and the third parameter of the second image acquired by the acquisition module; the calibration module is used for calibrating the second image acquired by the acquisition module into a target image according to the fourth parameter calculated by the calculation module; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image.
Optionally, the color chip is shaped like a hemisphere, and light enters the color chip from the outer surface of the color chip.
Optionally, the color chip is a gray and translucent component.
Optionally, the first camera assembly and the second camera assembly are located at different positions on the same exterior surface of the terminal device.
In a third aspect, an embodiment of the present invention provides a terminal device, which includes a processor, a memory, and a computer program stored on the memory and operable on the processor, and when executed by the processor, the computer program implements the steps of the image processing method according to the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the image processing method according to the first aspect.
In an embodiment of the present invention, a first image may be acquired by a first camera assembly; calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color card; acquiring a second image by using a second camera component; thereby, a fourth parameter is calculated according to the calibration parameter and the third parameter of the second image; calibrating the second image into a target image according to the fourth parameter; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image. Based on the scheme, the relative position of the color card and the lens in the first camera assembly is fixed, so that the terminal equipment can acquire the first image of the color card only by acquiring and shooting the image through the first camera assembly. Therefore, the method is beneficial to quickly and accurately acquiring the photo of the color card when the terminal equipment calibrates the photo. Furthermore, the method is beneficial to the terminal equipment to calibrate the photos taken by the user conveniently and quickly according to the calibration parameters, and the problem that the taken photos have color cast is solved.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
fig. 3 is a schematic diagram illustrating a position of a camera assembly on a terminal device according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a camera assembly according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a possible terminal device according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that "/" in this context means "or", for example, A/B may mean A or B; "and/or" herein is merely an association describing an associated object, and means that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. "plurality" means two or more than two.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as being preferred or more advantageous than other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first image and the second image, etc. are for distinguishing different images, rather than for describing a particular order of the images.
According to the image processing method and the terminal device provided by the embodiment of the invention, the terminal device comprises a first camera component and a second camera component, wherein the first camera component comprises a color chip, and the color chip can penetrate through light rays of all colors. Specifically, a first image may be acquired by a first camera assembly; calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color card; acquiring a second image by using a second camera component; thereby, a fourth parameter is calculated according to the calibration parameter and the third parameter of the second image; calibrating the second image into a target image according to the fourth parameter; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image. Based on the scheme, the relative position of the color card and the lens in the first camera assembly is fixed, so that the terminal equipment can acquire the first image of the color card only by acquiring and shooting the image through the first camera assembly. Therefore, the method is beneficial to quickly and accurately acquiring the photo of the color card when the terminal equipment calibrates the photo. Furthermore, the method is beneficial to the terminal equipment to calibrate the photos taken by the user conveniently and quickly according to the calibration parameters, and the problem that the taken photos have color cast is solved.
The terminal device in the embodiment of the invention can be a mobile terminal device and can also be a non-mobile terminal device. The mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), etc.; the non-mobile terminal device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present invention are not particularly limited.
It should be noted that, in the image processing method provided in the embodiment of the present invention, the execution subject may be the terminal device, a Central Processing Unit (CPU) of the terminal device, or a control module in the terminal device for executing the image processing method. In the embodiment of the present invention, the image processing method being executed by the terminal device is taken as an example to describe the image processing method provided in the embodiment of the present invention.
The terminal device in the embodiment of the present invention may be a terminal device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the image processing method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is the framework of applications; developers can develop applications based on the application framework layer while complying with its development principles. Examples include system applications such as a system settings application, a system chat application, and a system camera application, as well as third-party applications such as a third-party settings application, a third-party camera application, and a third-party chat application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image processing method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the image processing method may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can implement the image processing method provided by the embodiment of the invention by running the software program in the android operating system.
The following describes the image processing method provided by the embodiment of the present invention in detail with reference to the flowchart of the image processing method shown in fig. 2. Wherein, although the logical order of the image processing methods provided by embodiments of the present invention is shown in a method flow diagram, in some cases, the steps shown or described may be performed in an order different than here. For example, the image processing method illustrated in fig. 2 may include S201 to S205:
S201, the terminal device collects a first image through a first camera component.
The terminal device provided by the embodiment of the invention comprises a first camera component and a second camera component, wherein the first camera component comprises a color chip, and the color chip can transmit light rays of all colors.
It can be understood that, in the embodiment of the present invention, the color chart is fixedly disposed in the first camera assembly, that is, the relative position of the color chart and the lens in the first camera assembly is fixed; accordingly, the relative position of the color chart and the terminal device is also fixed.
It should be noted that, lenses are included in both the first camera assembly and the second camera assembly to support the terminal device to capture images. For example, the lens in the second camera assembly is the main lens, and the lens in the first camera assembly is the auxiliary lens.
Specifically, the terminal device may capture an image (also referred to as a photograph or a picture) of the color chart through the first camera component, and capture an image of the actual needs of the user, such as an image containing some people or still things, through the second camera component.
It is emphasized that the terminal device can capture images through the first camera assembly and the second camera assembly respectively in the same light environment.
It can be understood that parameters such as the position and color temperature of the light source are different in different light environments. Thus, with the change of the light environment of the object, the color of the object, that is, the color of the image of the object, changes. For example, the higher the color temperature of a light source in a light environment, the more yellow the color of an image of an object in the light environment; the lower the color temperature of the light source in a light environment, the more blue the color of the image of the object in the light environment.
Optionally, the first camera assembly and the second camera assembly are located at different positions on the same exterior surface of the terminal device. Therefore, the view finding direction of the lens of the first camera component is consistent with that of the second camera component, and the terminal device can shoot images in the same light environment through the first camera component and the second camera component respectively.
For example, the first camera component and the second camera component may be disposed on a back side of the terminal device.
In addition, a person skilled in the relevant art may set the first camera component and the second camera component at two positions on the first outer surface of the terminal device, which are closer to each other, so as to capture images in the same light environment through the first camera component and the second camera component. For example, the second camera component is arranged at the upper left of the rear face of the terminal device, and the first camera component is arranged below the second camera component on the rear face.
Exemplarily, as shown in fig. 3, a schematic diagram of a position of a camera assembly on a terminal device is provided for an embodiment of the present invention. Wherein, the view shown in (a) in fig. 3 is a rear view of the terminal device, the view shown in (b) in fig. 3 is a side view (for example, a left view) of the terminal device, and the view of the front side where the screen of the corresponding terminal device is located is a front view of the terminal device. Specifically, the second camera assembly 31 is disposed at the upper left on the rear face of the terminal device, and the first camera assembly 32 is disposed below the second camera assembly 31 on the rear face.
Specifically, in the embodiment of the present invention, after the terminal device acquires the first image through the first camera component, the first image may be saved (i.e., the first image is captured and saved), for example, the first image is saved in a gallery application program of the terminal device.
It should be noted that the terminal device may perform the step of capturing the first image by the first camera assembly in the background. At this time, a shooting preview interface such as a view frame of the image captured by the first camera assembly may not be displayed on the screen of the terminal device, and the first image does not need to be displayed in a preview manner. It is understood that the terminal device has a camera application installed therein, and the user can trigger the terminal device to start and run the camera application, such as capturing an image or displaying a preview image on a screen. Specifically, when the terminal device runs the camera application, the terminal device may preview and display the image captured by the second camera component on the screen in real time, without previewing and displaying the image captured by the first camera component (i.e., the image of the color chip, such as the first image) in real time.
Optionally, the terminal device may start to enable the first camera component and the second camera component when the camera application is started, so that the terminal device starts to capture the first image through the first camera component and starts to capture the image through the second camera component.
Optionally, the first camera assembly further includes a lens and a shielding component, and the lens, the shielding component and the color card are sequentially arranged along the first direction; the first direction is perpendicular to the first outer surface of the terminal device and far away from the first outer surface of the terminal device, and light entering the color card enters the lens through the shielding component. Wherein, the lens, the shielding component and the color card can be connected in sequence, such as adhesion.
Specifically, the light enters the lens of the first camera assembly through the color chip of the first camera assembly.
The color card is arranged in front of the lens in the first camera component, so that light rays entering the color card can enter the lens through the shielding component. Therefore, light can better enter the lens, excessive loss on a path from the color card to the lens can be avoided, and the image quality of the first image acquired by the first camera component is better.
Optionally, the second surface of the shielding member is opaque, and the second surface is not in contact with the lens and the color card in the first camera assembly. That is, the second surface is a surface of the blocking member other than the surface in contact with the lens (i.e., the auxiliary lens) and the color chart. Specifically, the shielding member may be a hollow cylindrical structure, and the second surface of the shielding member may be an outer surface and/or an inner surface of the cylindrical structure of the shielding member.
Specifically, when the second surface of the shielding member is the cylindrical outer surface, the shielding member forms a sealed region between the color card and the auxiliary lens, so that light is prevented from entering the auxiliary lens from any area other than the color card, that is, light cannot enter the auxiliary lens through the shielding member.
The shielding member may be a black opaque material. The specific material of the opaque material may be determined according to actual use requirements, and this is not particularly limited in the embodiment of the present invention.
Optionally, the color chip is shaped like a hemisphere, and light enters the color chip from its outer surface, i.e., light enters the auxiliary lens of the first camera component through the hemispherical outer surface of the color chip. The hemispherical shape allows more light to enter the color chip, and therefore more light to reach the auxiliary lens, which improves the quality of the first image subsequently collected by the first camera component of the terminal device.
Illustratively, as shown in fig. 3 (b), the exterior of the first camera assembly 32 is a hemispherical color chip. Specifically, referring to fig. 3, as shown in fig. 4, a structural composition diagram of a camera module is shown. Fig. 4 shows the structure of the first camera assembly 32, and the first camera assembly 32 includes an auxiliary lens 41, a blocking member 42, and a color chart 43, and it is apparent that the color chart 43 has a hemispherical structure.
S202, the terminal device calculates calibration parameters according to the first parameters of the first image and the second parameters of the color chart.
The first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color chip, and the calibration parameter is used for calibrating the image acquired by the terminal device through the second camera assembly.
In embodiments of the present invention, the color chip may be a translucent component to support light entering (i.e., passing through) the color chip. Therefore, the terminal equipment can shoot the first image through the first camera component and know the color of the first image, namely, the first parameter of the first image is obtained.
Optionally, the material of the color chart may be glass, plastic, paper, a color filter, or the like, which may be determined according to actual use requirements, and this is not specifically limited in the embodiment of the present invention. For example, the color chip may be implemented by a translucent glass lens.
Optionally, the color chip is a gray and translucent component. That is, the color chart in the embodiment of the present invention may be a gray standard card.
It is to be understood that the color of the color chip is known and predetermined by the skilled artisan. I.e. the second parameter of the color chip is known.
A gray, translucent color chip allows ambient light to enter the color chip, since gray is transparent to all colors, i.e., gray transmits all projected colors uniformly.
It should be noted that a color has three characteristics, namely hue, lightness, and saturation. The three characteristics of color and their interrelation can be illustrated, for example, by a three-dimensional color solid.
The terminal device can record various colors through a color mode, namely, the various colors are expressed into a model in a digital form. Specifically, the Color mode may be classified into a red-green-blue (RGB) mode, a cyan-magenta-yellow-black (CMYK) mode, a hue-saturation-brightness (HSB) mode, a Lab Color mode, a Bitmap (Bitmap) mode, a gray scale (gray) mode, an index Color (index Color) mode, a two-tone (Duotone) mode, a multi-channel (Multichannel) mode, and the like.
It will be appreciated that the colors of an image are characterized by different components in different color modes.
Specifically, in the embodiment of the present invention, the parameter indicating the color of an image may be a component of the color of the image in a certain color mode. E.g. the first parameter of the first image, is a component of the color of the first image in a certain color mode.
For example, in HSB mode, the color of an image is characterized by an H (hue) component representing hue, an S (saturation) component representing saturation, and a B (brightness) component representing brightness. In the HSB mode, the higher the values of the S component and the B component in the first parameter, the higher the saturation and brightness of the first image. For example, the first parameter of the first image includes the H component, S component, and B component of the first image.
For example, in RGB mode, the color of an image is characterized by an R (RED) component, a G (GREEN) component, and a B (BLUE) component. For example, the first parameter of the first image includes an R component, a G component, and a B component of the first image.
For example, in CMYK mode, the color of the image is characterized by a C (CYAN) component, an M (MAGENTA) component, a Y (yellow) component, and a K (BLACK) component. For example, the first parameter of the first image includes a C component, an M component, a Y component, and a K component of the first image.
In the embodiment of the present invention, which color mode is used for recording the color of the image may be determined according to actual use requirements, and this is not specifically limited in the embodiment of the present invention.
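As a purely illustrative sketch that is not part of the claimed method, the following Python code shows one possible way to express the color of an image as a parameter in RGB mode by taking the per-channel mean; the function name and the use of NumPy arrays are assumptions made for this example only.

    import numpy as np

    def color_parameter_rgb(image):
        # `image` is assumed to be an H x W x 3 array of RGB values; the per-channel
        # mean is only one possible way to characterize the color of the image.
        image = np.asarray(image, dtype=np.float64)
        return image.reshape(-1, 3).mean(axis=0)  # mean R, G and B components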
And S203, the terminal equipment acquires a second image through a second camera component.
The first image and the second image are images collected under the same light environment.
Optionally, the second image is an image currently acquired and shot by the terminal device through the second camera assembly, or a current frame image acquired and previewed on a screen through the second camera assembly, or an image acquired in the background through the second camera assembly.
It should be noted that the order in which the terminal device executes S201 and S203 is not limited; for example, the terminal device may execute S201 first and then execute S203. However, the times at which the terminal device executes S201 and S203 should differ only slightly, so as to ensure that the first image and the second image are captured in the same light environment.
Specifically, when the user actively uses the camera of the terminal device to take an image, the main lens of the camera is triggered to take a picture; if the camera is in preview mode, the terminal device captures the current frame acquired by the main lens.
It can be understood that, because there may be light sources with different color temperatures in the captured scene, there may be different color shifts in the captured picture by the main lens of the camera under the illumination of the light sources with different color temperatures, such as the content of the picture generally turns yellow under the illumination of the light source with a high color temperature, and the content of the picture generally turns blue under the illumination of the light source with a low color temperature. Therefore, in general, photos taken by ordinary users have a certain color cast phenomenon.
It can be understood that the second parameter of the color chart is the true color of the color chart, and the first parameter of the first image is the image of the color chart with color cast under the current light environment. Therefore, the degree of color cast of the image under the current light environment can be determined through the first parameter and the second parameter, and the calibration parameter can be obtained.
Specifically, since the gray calibration card and the scene shot by the main lens are in a light source environment with the same color temperature, the color shift of the gray calibration card and the color shift of the shot scene under the same light source are consistent. The calibration parameters calculated from the gray calibration card can therefore correct the image shot by the main lens to a true picture.
And S204, the terminal equipment calculates a fourth parameter according to the calibration parameter and the third parameter of the second image.
Wherein the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image.
It is understood that the second image with the third parameter is an image collected by the terminal device through the second camera assembly under the current light environment, and color cast may occur. The first image and the second image are images under the same light environment, so that the color cast degree of the first image and the color cast degree of the second image are the same, the calibration parameter can adjust the third parameter of the second image, and the adjusted fourth parameter is obtained.
And S205, the terminal equipment calibrates the second image into a target image according to the fourth parameter.
It is understood that the target image having the fourth parameter is an image of a true color.
Specifically, after the terminal device acquires the image with color shift from the main lens, the image captured by the main lens can be corrected by the calculated calibration parameters.
For example, if the second image is captured under a light source with a high color temperature and its picture content is generally yellowish, a target image with better picture quality can be obtained by using the calibration parameter to correct the degree of yellowing. In this way, the terminal device can restore the color of the collected image.
It should be noted that, through the above S205, the terminal device completes one calibration of the image captured by the main lens. When the user controls the terminal device to shoot the image, the color-calibrated image is stored in the terminal device. If the main lens is in preview mode, the next frame of image continues to be calibrated according to the above process.
Optionally, the terminal device may store the current calibration parameters, subsequently recalculate them at intervals, and update the calibration parameters stored in the terminal device.
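One way such periodic recalculation and storage could be organized is sketched below in Python; the refresh interval and the helper callable are assumptions made for illustration and are not specified by the patent.

    import time

    class CalibrationCache:
        # Keeps the most recent calibration parameter and refreshes it at intervals.
        def __init__(self, refresh_interval_s=5.0):
            self.refresh_interval_s = refresh_interval_s  # assumed refresh period
            self.p_lambda = None
            self.last_update = 0.0

        def get(self, compute_fn):
            # compute_fn: callable that captures a fresh first image and returns
            # a newly calculated calibration parameter.
            now = time.monotonic()
            if self.p_lambda is None or now - self.last_update > self.refresh_interval_s:
                self.p_lambda = compute_fn()
                self.last_update = now
            return self.p_lambda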
It can be understood that, in a scene in which a user controls the terminal device to capture an image, the terminal device may automatically capture a first image to automatically acquire calibration parameters in a current light environment; and calibrating the second image acquired in real time through the calibration parameter to obtain a calibrated target image. In this way, the user can manually control the terminal device to capture the second image, and does not need to manually control the terminal device to capture the first image, but can trigger the terminal device to capture the first image in the background. Therefore, the operation of the user in the process of calibrating the photo by the terminal equipment is simplified, and the man-machine interaction performance is improved.
It should be noted that, in the image processing method provided by the embodiment of the present invention, the first image may be acquired by the first camera component; calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color card; acquiring a second image by using a second camera component; thereby, a fourth parameter is calculated according to the calibration parameter and the third parameter of the second image; calibrating the second image into a target image according to the fourth parameter; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image. Based on the scheme, the relative position of the color card and the lens in the first camera assembly is fixed, so that the terminal equipment can acquire the first image of the color card only by acquiring and shooting the image through the first camera assembly. Therefore, the method is beneficial to quickly and accurately acquiring the photo of the color card when the terminal equipment calibrates the photo. Furthermore, the method is beneficial to the terminal equipment to calibrate the photos taken by the user conveniently and quickly according to the calibration parameters, and the problem that the taken photos have color cast is solved.
In a possible implementation manner, in the image processing method provided by the embodiment of the present invention, the above S202 may be implemented by S202a and S202b:
S202a, the terminal device calculates the ratio of the first parameter to the second parameter to obtain a first ratio.
S202b, the terminal device calculates a product of the first ratio and the first sensitivity parameter to obtain a calibration parameter.
The first sensitivity parameter is used for representing the light sensitivity of a lens in the first camera assembly.
Specifically, the terminal device calculates the calibration parameter using p(λ) = O1(λ)/R1 × c1.
Where λ denotes the light environment in which the terminal device is located, p(λ) denotes the calibration parameter, O1(λ) denotes the first parameter, R1 denotes the second parameter, and c1 denotes the light sensitivity of the lens in the first camera assembly. O1(λ)/R1 is the first ratio, and c1 is the first sensitivity parameter.
Specifically, O1(λ) represents the color of the first image of the color chart captured by the terminal device through the auxiliary lens in the light environment λ, R1 represents the true color of the color chart, and p(λ) represents the calibration parameter in the light environment λ.
The value of λ changes with the current light environment; accordingly, as λ changes, p(λ) and O1(λ) change. The sensitivity of a given lens is consistent across different light environments, so the first sensitivity parameter c1 of the lens in the first camera assembly is a fixed value.
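For illustration only, S202a and S202b might look as follows in Python, under the assumption that the first and second parameters are per-channel RGB values held in NumPy arrays; the names are not taken from the patent.

    import numpy as np

    def calibration_parameter(o1, r1, c1):
        # p(lambda) = O1(lambda) / R1 * c1, computed per color channel.
        # o1: first parameter, color of the first image of the color chip (e.g. mean R, G, B).
        # r1: second parameter, the known true color of the color chip.
        # c1: first sensitivity parameter of the lens in the first camera assembly.
        o1 = np.asarray(o1, dtype=np.float64)
        r1 = np.asarray(r1, dtype=np.float64)
        first_ratio = o1 / r1    # S202a: ratio of the first parameter to the second parameter
        return first_ratio * c1  # S202b: product of the first ratio and the first sensitivity parameter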
Accordingly, the above S204 can be implemented by S204a and S204b:
S204a, the terminal device calculates the ratio of the third parameter to the calibration parameter to obtain a second ratio.
And S204b, the terminal equipment calculates the product of the second ratio and the second sensitivity parameter to obtain a fourth parameter.
The second sensitivity parameter is used for representing the light sensitivity of a lens in the second camera assembly.
Specifically, the terminal device may calculate the fourth parameter using R2 = O2(λ)/p(λ) × c2.
Where O2(λ) denotes the third parameter, R2 denotes the fourth parameter, and c2 denotes the light sensitivity of the lens in the second camera assembly. Specifically, O2(λ)/p(λ) is the second ratio, and c2 is the second sensitivity parameter.
It will be appreciated that typically the same brand or model of lens will have the same photosensitivity, e.g. c1 is the same as c2 described above.
Specifically, O2(λ) represents the color of the second image captured by the terminal device through the main lens in the light environment λ, and R2 represents the true color of the second image captured by the terminal device through the main lens.
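Continuing the same illustrative sketch, S204a, S204b and S205 could be combined as follows; applying the per-channel factor R2/O2(λ) to every pixel of the second image is one possible reading of calibrating the second image into the target image and is an assumption of this example, not a statement of the patented implementation.

    import numpy as np

    def calibrate_second_image(second_image, o2, p_lambda, c2):
        # second_image: H x W x 3 RGB array captured by the main lens.
        # o2: third parameter, color of the second image (e.g. its mean R, G, B).
        # p_lambda: calibration parameter obtained from the first image and the color chip.
        # c2: second sensitivity parameter of the lens in the second camera assembly.
        o2 = np.asarray(o2, dtype=np.float64)
        second_ratio = o2 / np.asarray(p_lambda, dtype=np.float64)   # S204a: second ratio
        r2 = second_ratio * c2                                       # S204b: fourth parameter
        gain = r2 / o2                                               # assumed per-channel correction factor
        target = np.asarray(second_image, dtype=np.float64) * gain   # S205: calibrated target image
        return np.clip(target, 0, 255).astype(np.uint8), r2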
It should be noted that, in the embodiment of the present invention, the terminal device may determine, in real time, the calibration parameter of the image acquired through the main lens of the second camera assembly in the current light environment, so that the image is calibrated through the calibration parameter, and an image with better picture quality is obtained.
Fig. 5 is a schematic structural diagram of a terminal device according to an embodiment of the present invention. Fig. 5 shows a terminal device 50 comprising a first camera component 51 and a second camera component 52, wherein the first camera component 51 comprises a color chip 511, and the color chip 511 is transparent for all colors of light, and the terminal device further comprises: an acquisition module 53, a calculation module 54 and a calibration module 55; an acquisition module 53, configured to acquire a first image through the first camera assembly 51; a calculating module 54, configured to calculate a calibration parameter according to the first parameter of the first image acquired by the acquiring module 53 and the second parameter of the color chart 511; an acquisition module 53, further configured to acquire a second image through the second camera component 52; the calculating module 54 is further configured to calculate a fourth parameter according to the calibration parameter calculated by the calculating module 54 and the third parameter of the second image acquired by the acquiring module 53; a calibration module 55, configured to calibrate the second image acquired by the acquisition module 53 into a target image according to the fourth parameter calculated by the calculation module 54; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color chart 511, the first image and the second image are images acquired under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image.
Optionally, the calculating module 54 is specifically configured to calculate a ratio of the first parameter to the second parameter, so as to obtain a first ratio; calculating the product of the first ratio and the first sensitivity parameter to obtain a calibration parameter; wherein the first sensitivity parameter is used to represent the light sensitivity of the lens in the first camera assembly 51.
Optionally, the calculating module 54 is specifically configured to calculate a ratio of the third parameter to the calibration parameter, so as to obtain a second ratio; calculating the product of the second ratio and the second sensitivity parameter to obtain a fourth parameter; wherein the second sensitivity parameter is used to represent the photosensitivity of the lens in the second camera assembly 52.
Optionally, the first camera assembly 51 further includes a lens and a shielding component, and the lens, the shielding component and the color card 511 are sequentially disposed along the first direction; the first direction is perpendicular to and away from the first outer surface of the terminal device, and the light entering the color card 511 enters the lens through the shielding component.
Optionally, the second surface of the shielding member is opaque, and the second surface is not in contact with the lens or the color card 511.
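As a non-authoritative illustration of how the acquisition, calculation and calibration modules might map onto software, the sketch below reuses the color_parameter_rgb, calibration_parameter and calibrate_second_image helpers sketched earlier in this description; the camera objects and their capture() method are assumptions made for the example.

    class ImageCalibrator:
        # Illustrative counterpart of the acquisition module, calculation module and calibration module.
        def __init__(self, first_camera, second_camera, chip_true_color, c1, c2):
            self.first_camera = first_camera        # assumed to expose capture() for the auxiliary lens
            self.second_camera = second_camera      # assumed to expose capture() for the main lens
            self.chip_true_color = chip_true_color  # second parameter: known color of the color chip
            self.c1, self.c2 = c1, c2               # sensitivity parameters of the two lenses

        def process(self):
            first_image = self.first_camera.capture()                                  # S201
            o1 = color_parameter_rgb(first_image)                                      # first parameter
            p_lambda = calibration_parameter(o1, self.chip_true_color, self.c1)        # S202
            second_image = self.second_camera.capture()                                # S203
            o2 = color_parameter_rgb(second_image)                                     # third parameter
            target, _r2 = calibrate_second_image(second_image, o2, p_lambda, self.c2)  # S204-S205
            return target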
The terminal device 50 provided in the embodiment of the present invention can implement each process implemented by the terminal device in the foregoing method embodiments, and for avoiding repetition, details are not described here again.
According to the terminal device provided by the embodiment of the invention, the first image can be acquired through the first camera component; calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color card; acquiring a second image by using a second camera component; thereby, a fourth parameter is calculated according to the calibration parameter and the third parameter of the second image; calibrating the second image into a target image according to the fourth parameter; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image. Based on the scheme, the relative position of the color card and the lens in the first camera assembly is fixed, so that the terminal equipment can acquire the first image of the color card only by acquiring and shooting the image through the first camera assembly. Therefore, the method is beneficial to quickly and accurately acquiring the photo of the color card when the terminal equipment calibrates the photo. Furthermore, the method is beneficial to the terminal equipment to calibrate the photos taken by the user conveniently and quickly according to the calibration parameters, and the problem that the taken photos have color cast is solved.
Fig. 6 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention, where the terminal device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 6 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
Wherein, the processor 110 is configured to acquire a first image through the first camera assembly; the calibration parameter is calculated according to the first parameter of the first image and the second parameter of the color card; capturing a second image by a second camera assembly; calculating a fourth parameter according to the calibration parameter and the third parameter of the second image; calibrating the second image into a target image according to the fourth parameter; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image.
According to the terminal device provided by the embodiment of the invention, the first image can be acquired through the first camera component; calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color card; acquiring a second image by using a second camera component; thereby, a fourth parameter is calculated according to the calibration parameter and the third parameter of the second image; calibrating the second image into a target image according to the fourth parameter; the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image. Based on the scheme, the relative position of the color card and the lens in the first camera assembly is fixed, so that the terminal equipment can acquire the first image of the color card only by acquiring and shooting the image through the first camera assembly. Therefore, the method is beneficial to quickly and accurately acquiring the photo of the color card when the terminal equipment calibrates the photo. Furthermore, the method is beneficial to the terminal equipment to calibrate the photos taken by the user conveniently and quickly according to the calibration parameters, and the problem that the taken photos have color cast is solved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing such sound into audio data. In the case of a telephone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and output.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 6, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transmit data between the terminal device 100 and an external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook), and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It connects the various parts of the entire terminal device by using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and by calling the data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to the components. Preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
Preferably, an embodiment of the present invention further provides a terminal device, including a processor 110, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When executed by the processor 110, the computer program implements each process of the foregoing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the foregoing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
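To make the data flow of the foregoing method embodiment concrete, the following is a minimal, non-authoritative Python sketch. It assumes that the first, second, third, and fourth parameters are per-channel RGB values (channel means), that images are floating-point arrays in [0, 1], and that the sensitivity parameters are scalar gains; none of these representations is fixed by the embodiments, which define the relationships only as ratios and products.

import numpy as np

def color_parameter(image):
    # Per-channel mean; stands in for the parameter "used for indicating the color" of an image.
    return image.reshape(-1, image.shape[-1]).mean(axis=0)

def calibration_parameter(first_image, card_color, first_sensitivity=1.0):
    # Calibration parameter = (first parameter / second parameter) * first sensitivity parameter.
    first_param = color_parameter(first_image)      # first parameter: color of the first image
    return first_param / card_color * first_sensitivity

def calibrate(second_image, calib_param, second_sensitivity=1.0):
    # Fourth parameter = (third parameter / calibration parameter) * second sensitivity parameter;
    # the second image is then scaled so that its color matches the fourth parameter.
    third_param = color_parameter(second_image)     # third parameter: color of the second image
    fourth_param = third_param / calib_param * second_sensitivity
    gain = fourth_param / third_param               # per-channel correction applied to the image
    target_image = np.clip(second_image * gain, 0.0, 1.0)
    return target_image, fourth_param

With both sensitivity parameters equal to 1, the correction reduces to dividing out the color cast estimated through the color card, which is what removes the color cast from the second image.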
It should be noted that, in this document, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
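Continuing the sketch above, a hypothetical usage run of such a software implementation; the reference color of the color card, the simulated color cast, and all numeric values are chosen purely for illustration:

rng = np.random.default_rng(0)
card_color = np.array([0.5, 0.5, 0.5])   # assumed second parameter: reference color of the color card
cast = np.array([1.10, 1.00, 0.85])      # simulated color cast of the shared light environment

# First image: the color card as seen by the first camera assembly under the cast.
first_image = np.clip(card_color * cast + rng.normal(0.0, 0.01, (64, 64, 3)), 0.0, 1.0)
# Second image: an arbitrary scene captured by the second camera assembly under the same cast.
scene = rng.uniform(0.2, 0.8, (64, 64, 3))
second_image = np.clip(scene * cast, 0.0, 1.0)

calib = calibration_parameter(first_image, card_color)
target_image, fourth_param = calibrate(second_image, calib)
# target_image now approximates 'scene': the cast estimated via the color card has been removed.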
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to these embodiments, which are illustrative rather than restrictive; it will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An image processing method applied to a terminal device, the terminal device comprising a first camera assembly and a second camera assembly, the first camera assembly comprising a color card, the color card being fixedly disposed in front of a lens in the first camera assembly, and the color card being transparent to light of all colors, the method comprising:
capturing a first image by the first camera assembly;
calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color card;
capturing a second image by the second camera assembly;
calculating a fourth parameter according to the calibration parameter and the third parameter of the second image;
calibrating the second image into a target image according to the fourth parameter;
wherein the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image.
2. The method of claim 1, wherein the calculating a calibration parameter according to a first parameter of the first image and a second parameter of the color card comprises:
calculating the ratio of the first parameter to the second parameter to obtain a first ratio;
calculating the product of the first ratio and a first sensitivity parameter to obtain the calibration parameter;
wherein the first sensitivity parameter is used for representing the light sensitivity of a lens in the first camera assembly.
3. The method of claim 2, wherein the calculating a fourth parameter according to the calibration parameter and the third parameter of the second image comprises:
calculating the ratio of the third parameter to the calibration parameter to obtain a second ratio;
calculating the product of the second ratio and a second sensitivity parameter to obtain the fourth parameter;
wherein the second sensitivity parameter is used for representing the light sensitivity of a lens in the second camera assembly.
4. A terminal device, the terminal device comprising a first camera assembly and a second camera assembly, the first camera assembly comprising a color card, the color card being fixedly disposed in front of a lens in the first camera assembly, and the color card being transparent to light of all colors, the terminal device further comprising an acquisition module, a calculation module, and a calibration module;
the acquisition module is used for acquiring a first image through the first camera assembly;
the calculation module is used for calculating a calibration parameter according to a first parameter of the first image acquired by the acquisition module and a second parameter of the color card;
the acquisition module is further used for acquiring a second image through the second camera assembly;
the calculation module is further used for calculating a fourth parameter according to the calibration parameter calculated by the calculation module and the third parameter of the second image acquired by the acquisition module;
the calibration module is used for calibrating the second image acquired by the acquisition module into a target image according to the fourth parameter calculated by the calculation module;
wherein the first parameter is used for indicating the color of the first image, the second parameter is used for indicating the color of the color card, the first image and the second image are images collected under the same light environment, the third parameter is used for indicating the color of the second image, and the fourth parameter is used for indicating the color of the target image.
5. The terminal device according to claim 4, wherein the calculation module is specifically configured to: calculate the ratio of the first parameter to the second parameter to obtain a first ratio; and calculate the product of the first ratio and a first sensitivity parameter to obtain the calibration parameter;
wherein the first sensitivity parameter is used for representing the light sensitivity of a lens in the first camera assembly.
6. The terminal device according to claim 5, wherein the calculation module is specifically configured to: calculate the ratio of the third parameter to the calibration parameter to obtain a second ratio; and calculate the product of the second ratio and a second sensitivity parameter to obtain the fourth parameter;
wherein the second sensitivity parameter is used for representing the light sensitivity of a lens in the second camera assembly.
7. The terminal device according to any one of claims 4 to 6, wherein the first camera assembly further comprises a lens and a shielding component, and the lens, the shielding component and the color card are arranged in sequence along a first direction;
the first direction is perpendicular to the first outer surface of the terminal device and far away from the first outer surface of the terminal device, and light entering the color card enters the lens through the shielding component.
8. The terminal device of claim 7, wherein a second surface of the shielding component is opaque to light, and the second surface is not in contact with the lens or the color card.
9. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 3.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 3.
CN201910615789.4A 2019-07-09 2019-07-09 Image processing method and terminal equipment Active CN110493510B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910615789.4A CN110493510B (en) 2019-07-09 2019-07-09 Image processing method and terminal equipment

Publications (2)

Publication Number Publication Date
CN110493510A (en) 2019-11-22
CN110493510B (en) 2021-01-26

Family

ID=68546878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910615789.4A Active CN110493510B (en) 2019-07-09 2019-07-09 Image processing method and terminal equipment

Country Status (1)

Country Link
CN (1) CN110493510B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112858268B (en) * 2020-09-18 2022-06-14 武汉大学 Soil water and solute migration university measuring method based on chemical imaging-correction analysis
CN115243033A (en) * 2022-06-30 2022-10-25 东莞华贝电子科技有限公司 Color cast detection method and device, electronic equipment and storage medium
CN115460391B (en) * 2022-09-13 2024-04-16 浙江大华技术股份有限公司 Image simulation method and device, storage medium and electronic device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109286808A (en) * 2017-07-21 2019-01-29 北京小米移动软件有限公司 White balance calibration method and device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103402102B (en) * 2013-07-17 2015-12-09 广东欧珀移动通信有限公司 The method and apparatus of dual camera camera system and white balance adjusting thereof
CN104661003B (en) * 2015-02-04 2016-09-07 苏州佳世达电通有限公司 Electronic installation and color correcting method thereof
US9948857B2 (en) * 2015-10-22 2018-04-17 Samsung Electronics Co., Ltd. Method and device for generating images
CN105678710B (en) * 2016-01-14 2018-10-26 佛山华芯微特科技有限公司 Color correction also original system and color correction restoring method
US10331968B2 (en) * 2017-03-24 2019-06-25 Disney Enterprises, Inc. One shot color calibrated metric learning for object re-identification
CN208820948U (en) * 2018-05-09 2019-05-03 清华大学 Image sensing apparatus
CN108712639A (en) * 2018-05-29 2018-10-26 凌云光技术集团有限责任公司 Image color correction method, apparatus and system
CN108469302B (en) * 2018-06-13 2024-01-16 上海安翰医疗技术有限公司 Color correction and test device
CN109525749B (en) * 2018-11-07 2020-02-21 厦门亿联网络技术股份有限公司 Method and device for correcting image color deviation
CN109218561A (en) * 2018-11-30 2019-01-15 豪威科技(上海)有限公司 The synchronous method and device of multi-cam
CN109636728A (en) * 2018-12-24 2019-04-16 天津天地伟业信息系统集成有限公司 A kind of color calibration method of image mosaic


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant