CN109729264B - Image acquisition method and mobile terminal - Google Patents
Abstract
An embodiment of the invention provides an image acquisition method and a mobile terminal, which belong to the technical field of application terminals and are intended to solve the problem of the poor quality of color images acquired by existing mobile terminals in a motion state. The photosensitive sensor of the mobile terminal comprises a plurality of pixel units, each pixel unit comprising N color units, where N is a positive integer greater than 1. The method comprises the following steps: when the mobile terminal receives a photographing input, if the mobile terminal is detected to be in a motion state, moving a camera unit to expose the photographed object M times according to the motion parameters of the mobile terminal and the arrangement order of the N color units in the pixel unit, obtaining one image from each exposure; and synthesizing a color image of the photographed object based on the M images. Here M is greater than or equal to N, and the camera unit includes: the lens and/or the photosensitive sensor.
Description
Technical Field
Embodiments of the invention relate to the technical field of terminals, and in particular to an image acquisition method and a mobile terminal.
Background
With the continuous development of mobile communication and the continuous improvement of living standards, mobile terminals of all kinds (such as mobile phones and tablet computers) have become increasingly widespread, and the mobile phone has become an indispensable communication tool in daily life.
In the prior art, when a user takes a picture with a mobile terminal, slight shaking is usually introduced by factors such as hand tremor, so the acquired image is "blurred", that is, its definition is poor. Moreover, because a monochromatic light filter is arranged on the photosensitive sensor of existing mobile terminals, each pixel can collect only one of the primary colors red R, green G, and blue B at any given moment; in other words, the photosensitive sensor receives only one third of the incident light. As a result, the color image finally obtained by the mobile terminal is of poor quality: both its brightness and its definition suffer.
Therefore, how to improve the image quality of the color image acquired by the mobile terminal when the mobile terminal is in a motion state is an urgent problem to be solved at present.
Disclosure of Invention
Embodiments of the invention provide an image acquisition method and a mobile terminal, aiming to solve the problem that the quality of a color image acquired by an existing mobile terminal in a motion state is poor.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides an image obtaining method applied to a mobile terminal, where a photosensor of the mobile terminal includes a plurality of pixel units, each pixel unit includes N color units, where N is a positive integer greater than 1, and the method includes:
receiving a photographing input of a user;
in response to the photographing input, if the mobile terminal is detected to be in a motion state, moving a camera unit to expose a photographed object M times according to the motion parameters of the mobile terminal and the arrangement order of the N color units in the pixel unit, and obtaining an image from each exposure, where M is greater than or equal to N;
synthesizing a color image of the photographic subject based on the M images;
wherein the camera unit includes: the lens and/or the photosensitive sensor.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, where a photosensor of the mobile terminal includes a plurality of pixel units, each pixel unit includes N color units, where N is a positive integer greater than 1, and the mobile terminal further includes:
the receiving module is used for receiving photographing input of a user;
the execution module is used for responding to the photographing input received by the receiving module and, if the mobile terminal is detected to be in a motion state, moving the camera unit to expose a photographed object M times according to the motion parameters of the mobile terminal and the arrangement order of the N color units in the pixel unit, obtaining an image from each exposure, where M is greater than or equal to N;
a synthesis module for synthesizing a color image of the photographic subject based on the M images obtained by the execution module;
wherein the camera unit includes: the lens and/or the photosensitive sensor.
In a third aspect, an embodiment of the present invention provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the image acquisition method according to the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the image acquisition method according to the first aspect.
In the embodiment of the invention, when the mobile terminal receives a photographing input and is detected to be in a motion state, the mobile terminal can move the photosensitive sensor and/or the lens to expose the photographed object multiple times according to the motion parameters of the mobile terminal and the arrangement order of the N color units in the pixel units of the photosensitive sensor. In this way, the shake displacement of the mobile terminal is compensated while the photosensitive sensor completely acquires the light information of the photographed object, which improves the imaging quality.
Drawings
FIG. 1 is a schematic diagram of an arrangement of RGB Photo Diodes (PDs) of a photo sensor;
fig. 2 is a schematic structural diagram of an android operating system according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of an image obtaining method according to an embodiment of the present invention;
fig. 4 is a schematic diagram illustrating an arrangement of red R, green G, and blue B in a pixel unit according to an embodiment of the invention;
fig. 5 is a second schematic flow chart of an image acquisition method according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 7 is a second schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The photosensitive sensor in an existing mobile terminal is composed of a number of photodiodes (PDs). For example, as shown in fig. 1, with the PDs arranged in the Bayer pattern, each PD (except at the edges) is adjacent to 8 other PDs, and each PD can sense only one color component because of the monochromatic light filter arranged on the photosensitive sensor. Therefore, at any given moment each pixel point can collect only one color component; the other colors cannot be collected directly, so the true color of the incident light at that pixel cannot be restored.
To address this, the existing solution is to estimate the other color components of a PD from the average of the corresponding components sensed by its neighboring PDs. For example, a pixel that senses only the R component obtains its G component as the average of the 4 surrounding G components and its B component as the average of the 4 surrounding B components; a pixel that senses only the B component obtains its R component as the average of the 4 surrounding R components and its G component as the average of the 4 surrounding G components; and a pixel that senses only the G component obtains its R component as the average of the 2 R components above and below it and its B component as the average of the 2 B components to its left and right. Clearly, such processing brings the result closer to the true color of the incident light, but a large distortion remains. Moreover, when a user takes a picture with the mobile terminal, slight shaking caused by factors such as hand tremor may further intensify the distortion of the image.
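The neighbor-averaging interpolation described above can be sketched as follows. This is an illustrative Python sketch, not the patent's claimed method (which avoids interpolation entirely); the pattern layout and intensity values are assumptions for the example.

```python
def demosaic_at(mosaic, pattern, r, c):
    """Estimate the full (R, G, B) triple at pixel (r, c) of a Bayer
    mosaic by averaging the color components sensed by neighboring PDs,
    i.e. the prior-art interpolation described above.  `mosaic` is a
    2-D list of sensed intensities; `pattern[r % 2][c % 2]` names the
    color letter at each site."""
    h, w = len(mosaic), len(mosaic[0])

    def avg(offsets):
        # Average the in-bounds neighbors at the given (dr, dc) offsets.
        vals = [mosaic[r + dr][c + dc] for dr, dc in offsets
                if 0 <= r + dr < h and 0 <= c + dc < w]
        return sum(vals) / len(vals)

    here = pattern[r % 2][c % 2]
    out = {here: float(mosaic[r][c])}
    if here in ('R', 'B'):
        # R/B sites: the 4 orthogonal neighbors are G; the 4 diagonal
        # neighbors carry the remaining primary color.
        out['G'] = avg([(-1, 0), (1, 0), (0, -1), (0, 1)])
        out['B' if here == 'R' else 'R'] = avg([(-1, -1), (-1, 1),
                                                (1, -1), (1, 1)])
    else:
        # G sites: one missing color sits above/below, the other
        # to the left/right, as the description states.
        out[pattern[(r + 1) % 2][c % 2]] = avg([(-1, 0), (1, 0)])
        out[pattern[r % 2][(c + 1) % 2]] = avg([(0, -1), (0, 1)])
    return out['R'], out['G'], out['B']
```

On a uniformly colored patch the interpolation recovers the true triple exactly; the distortion noted above appears only where neighboring pixels differ.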
In order to solve the problem, an embodiment of the present invention provides a new image obtaining method, where when a mobile terminal receives a photographing input, and when it is detected that the mobile terminal is in a motion state, the mobile terminal moves a photosensitive sensor and/or a lens to expose a photographic object for multiple times according to a motion parameter of the mobile terminal and an arrangement sequence of N color units in a pixel unit in the photosensitive sensor, so that a shake displacement of the mobile terminal is compensated, and the photosensitive sensor can completely obtain light information of the photographic object, thereby improving imaging quality.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that "/" in this context means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association between objects and indicates that three relationships are possible; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. "Plurality" means two or more.
The terms "first," "second," and the like in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first image, the second image, etc. are for distinguishing different images, rather than for describing a particular order of the images.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, such words are intended to present the related concepts in a concrete fashion.
In the embodiments of the present invention, "of" and "corresponding" may sometimes be used interchangeably; it should be noted that their intended meanings are consistent when the difference is not emphasized. The meaning of "a plurality" in the embodiments of the present invention is two or more.
The mobile terminal in the embodiment of the present invention may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted mobile terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and the like, and the embodiment of the present invention is not particularly limited.
Exemplarily, the camera of the mobile terminal in the embodiment of the present invention includes a lens and a photosensitive sensor, and the photosensitive sensor comprises a plurality of pixel units. The plurality of pixel units form the filter of the photosensitive sensor; specifically, the filter is composed of a plurality of pixel units arranged periodically and repeatedly.
Each pixel unit comprises N color units, where N is a positive integer greater than 1. In one example, the N color units are: the three color units red R, green G, and blue B; or the four color units R, G, G, B; or the four color units R, G, B, W.
The execution subject of the image acquisition method provided by the embodiment of the present invention may be the mobile terminal, or a functional module and/or functional entity in the mobile terminal capable of implementing the method; this may be determined according to actual use requirements and is not limited by the embodiment of the present invention. The following exemplarily describes the image acquisition method provided by the embodiment of the present invention, taking a mobile terminal as the example.
The terminal in the embodiment of the present invention may be a terminal having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the image acquisition method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 2 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 2, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image acquisition method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 2, so that the image acquisition method may operate based on the android operating system shown in fig. 2. Namely, the processor or the mobile terminal can realize the image acquisition method provided by the embodiment of the invention by running the software program in the android operating system.
The following describes an image acquisition method according to an embodiment of the present invention with reference to a flowchart of the image acquisition method shown in fig. 3, where fig. 3 is a schematic flowchart of an image acquisition method according to an embodiment of the present invention, and includes steps 201 to 203:
step 201: the mobile terminal receives a photographing input of a user.
In an embodiment of the present invention, the photographing input specifically includes: the operation of pressing the shutter by the user, or the click operation of the user on the photographing key in the photographing interface, or the pressing operation of the user on the photographing key of the mobile terminal, or other feasible operations of the user on the terminal device may be specifically determined according to actual use requirements, and the embodiment of the present invention is not limited.
For example, the click input may be a single click input, a double click input, or any number of click inputs; the specific gesture may be any one of a single-click gesture, a sliding gesture, a pressure recognition gesture, a long-press gesture, a double-click gesture, simultaneous sliding down of multiple fingers, and simultaneous sliding up of multiple fingers.
Step 202: in response to the photographing input, if the mobile terminal is detected to be in a motion state, the mobile terminal moves the camera unit to expose the photographed object M times according to the motion parameters of the mobile terminal and the arrangement order of the N color units in the pixel unit, obtaining an image from each exposure.
Here M is greater than or equal to N, and the camera unit includes the lens and/or the photosensitive sensor; that is, the mobile terminal may move the lens to perform the M exposures of the photographed object, may move the photosensitive sensor, or may move both in combination, which the present invention does not limit. Illustratively, the mobile terminal drives the lens or the photosensitive sensor with a motor to produce the movement.
In the embodiment of the present invention, the number of exposures may equal the number of color-unit types in the pixel unit; that is, each exposure obtains, for every pixel point, the color information corresponding to one color unit, i.e., one primary-color value (a monochromatic light-intensity value).
In the embodiment of the present invention, the motion parameters of the mobile terminal include, but are not limited to: acceleration, angular velocity, direction of motion, etc. of the mobile terminal.
For example, a motion sensor (e.g., an accelerometer or a gyroscope) is disposed in the mobile terminal of the embodiment of the present invention. When a user holds the mobile terminal to take a picture, an involuntary hand tremor inevitably occurs (for instance, due to the pulsation of the blood in the human body), and this tremor is clearly reflected in the data collected by the motion sensor. The mobile terminal of the embodiment therefore collects its motion parameters through the motion sensor to determine its motion state (i.e., moving or stationary). Specifically, whether the mobile terminal is in the motion state may be determined by setting thresholds (e.g., an acceleration threshold and an angular velocity threshold): when the absolute value of a motion parameter is greater than its threshold, the mobile terminal is considered to be in the motion state; when it is less than the threshold, the mobile terminal is considered stationary.
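The threshold check above can be sketched as follows. The threshold values and the sensor-reading format are assumptions for illustration; the embodiment only says that "some thresholds" such as an acceleration threshold and an angular velocity threshold are set, without giving numbers.

```python
import math

# Hypothetical threshold values (the embodiment does not specify any).
ACCEL_THRESHOLD = 0.05  # m/s^2, residual acceleration after removing gravity
GYRO_THRESHOLD = 0.02   # rad/s, angular velocity magnitude

def is_in_motion(accel, gyro):
    """Classify the terminal as being in a motion state when the
    magnitude of either motion-sensor reading exceeds its threshold;
    otherwise treat it as stationary.  `accel` and `gyro` are 3-axis
    (x, y, z) readings."""
    accel_mag = math.sqrt(sum(a * a for a in accel))
    gyro_mag = math.sqrt(sum(g * g for g in gyro))
    return accel_mag > ACCEL_THRESHOLD or gyro_mag > GYRO_THRESHOLD
```

In practice the thresholds would be tuned so that the pulsation-induced hand tremor described above reliably lands on the "moving" side of the decision.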
Step 203: the mobile terminal synthesizes a color image of the photographic subject based on the M images.
In the embodiment of the present invention, since the M images are obtained by the mobile terminal for the color components corresponding to the N color units, the purpose of combining the M images is to superimpose different color components of the same incident light beam, so as to restore the real color of the shooting object.
In the embodiment of the present invention, the mobile terminal may directly synthesize the M images obtained by the multiple exposures into a color image of the object, or may synthesize them sequentially. For example, if the mobile terminal performs three exposures and obtains three images, the image from the first exposure and the image from the second exposure may be combined into a composite image, and the composite image may then be combined with the image from the third exposure to obtain the final color image.
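The sequential synthesis just described can be sketched as a pairwise fold over the M exposures. The superposition operator here (a per-channel sum, with each exposure carrying only the channel it sensed) is an assumption for illustration; the patent does not fix the exact operator.

```python
from functools import reduce

def combine(img_a, img_b):
    """Superimpose the color components of two aligned exposures of the
    same scene.  Each image is a 2-D grid of (R, G, B) triples in which
    only the channel sensed during that exposure is non-zero, so a
    per-channel sum stacks the different color components of the same
    incident light (assumed operator; see lead-in)."""
    return [[tuple(a + b for a, b in zip(pa, pb))
             for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

def synthesize(images):
    """Sequential synthesis: combine exposure 1 with exposure 2, then
    the composite with exposure 3, and so on through all M images."""
    return reduce(combine, images)
```

Because the assumed operator is associative, folding sequentially gives the same result as combining all M images at once, matching the two alternatives the text allows.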
Optionally, in this embodiment of the present invention, in a case that a pixel unit in a photosensor of the mobile terminal includes three color units, namely, red R, green G, and blue B, the step 202 specifically includes the following steps:
step 202a 1: when the mobile terminal carries out first exposure on a shooting object, a first target offset is determined according to the current motion parameters of the mobile terminal and the arrangement sequence of R, G, B color cells in the pixel cells, and the shooting unit is moved according to the first target offset to carry out first exposure on the shooting object, so that a first image is obtained.
Step 202b 1: and when the mobile terminal carries out second exposure on the shot object, determining a second target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of R, G, B color cells in the pixel cell, and moving the camera unit according to the second target offset to carry out second exposure on the shot object to obtain a second image.
Step 202c 1: and when the mobile terminal performs the third exposure on the shot object, determining a third target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of R, G, B color cells in the pixel cell, and moving the camera unit according to the third target offset to perform the third exposure on the shot object to obtain a third image.
For example, in the case where a pixel unit in the photosensitive sensor of the mobile terminal includes the R, G, B color units, and referring to the PD ordering shown in fig. 1, the arrangement order of the R, G, B color units in the pixel unit includes at least the 6 orders shown in fig. 4. It should be noted that the 6 orders shown in fig. 4 are only examples; other arrangements are possible in practice, and the present invention does not limit them.
It should be noted that, for the first exposure, the camera unit may be shifted by zero color units with respect to the arrangement order of the R, G, B color units in the pixel unit; that is, the offset contributed by the multiple-exposure mechanism may be disregarded, and only the shake displacement of the mobile terminal itself needs to be compensated.
Further, the step 202a1 specifically includes the following steps:
step A1: the mobile terminal determines a first offset according to the current motion parameter of the mobile terminal, determines a second offset according to the arrangement sequence of R, G, B color cells in the pixel cell, and adds the first offset and the second offset to obtain a first target offset.
Here the second offset is: zero, or one color unit in a first preset direction.
Further, the step 202b1 specifically includes the following steps:
step B1: and the mobile terminal determines a third offset according to the current motion parameter of the mobile terminal, determines a fourth offset according to the arrangement sequence of R, G, B color cells in the pixel cell, and adds the third offset and the fourth offset to obtain a second target offset.
Here the fourth offset is one color unit in a second preset direction.
Further, the step 202c1 specifically includes the following steps:
step C1: and the mobile terminal determines a fifth offset according to the current motion parameter of the mobile terminal, determines a sixth offset according to the arrangement sequence of R, G, B color cells in the pixel cell, and adds the fifth offset and the sixth offset to obtain a third target offset.
Here the sixth offset is one color unit in a third preset direction.
In the embodiment of the invention, when the mobile terminal determines its offset from its current motion parameters, it may activate the optical image stabilization (OIS) function and calculate the OIS offset from the current motion parameters. Since OIS is known in the prior art, the corresponding calculation process is not described in detail here.
It should be noted that, since the interval between successive exposures is short, the motion state of the mobile terminal can be considered unchanged; that is, the current motion parameters of the mobile terminal in steps A1, B1, and C1 above are the same, and correspondingly the first, third, and fifth offsets are the same.
Example 1: it is assumed that the arrangement order of R, G, B color cells in the pixel cell is as shown in (b) in fig. 4.
Specifically, when the mobile terminal turns on the camera and the motion sensor detects that the terminal is in a motion state, then at the moment the first shutter is executed the mobile terminal calculates the offset of the lens (in units of pixels, i.e., in units of color units) as (X1, Y1) from the data detected by the motion sensor in real time. Suppose that a positive X1 means the lens must move right and a negative X1 left, that a positive Y1 means the lens must move up and a negative Y1 down, and that the motion state of the mobile terminal is unchanged during the three movements. The offsets of the corresponding three translations are then as follows:
For the first translation, the mobile terminal needs to add the offset required by the multiple-exposure mechanism to (X1, Y1); since that offset can be disregarded for the first exposure, the actual offset is (X1, Y1).
For the second translation, on the basis of the previous one, the mobile terminal again adds the multiple-exposure offset, here 1 color unit, to (X1, Y1); the actual offset is therefore (X1, Y1-1).
For the third translation, on the basis of the previous one, the mobile terminal again adds the multiple-exposure offset, here 1 color unit, to (X1, Y1); the actual offset is therefore (X1+1, Y1).
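The arithmetic of Example 1 can be sketched as follows. The per-exposure color-unit steps are taken from the text above; treating them as a list passed in is an assumption of this sketch, as is the constant OIS offset across the burst (justified by the short exposure interval noted earlier).

```python
def target_offsets(ois_offset, pattern_steps):
    """Target offset for each exposure: the shake-compensation (OIS)
    offset (X1, Y1), assumed constant across the short burst, plus the
    color-unit step that the multiple-exposure mechanism requires for
    that exposure.  For Example 1 (arrangement (b) in fig. 4) the steps
    are [(0, 0), (0, -1), (1, 0)]: no step for the first exposure, one
    color unit down, then one color unit right."""
    x1, y1 = ois_offset
    return [(x1 + dx, y1 + dy) for dx, dy in pattern_steps]
```

With (X1, Y1) = (2, 3), this reproduces the three actual offsets of Example 1: (X1, Y1), (X1, Y1-1), and (X1+1, Y1).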
Example 2: it is assumed that the arrangement order of R, G, B color cells in the pixel cell is as shown in (e) in fig. 4.
Specifically, when the mobile terminal turns on the camera and the motion sensor detects that the terminal is in a motion state, then at the moment the first shutter is executed the mobile terminal calculates the offset of the lens (in units of pixels, i.e., in units of color units) as (X1, Y1) from the data detected by the motion sensor in real time. Suppose, as before, that a positive X1 means the lens must move right and a negative X1 left, that a positive Y1 means up and a negative Y1 down, and that the motion state of the mobile terminal is unchanged during the three movements. The offsets of the corresponding three translations are then as follows:
For the first translation, the mobile terminal adds the multiple-exposure offset, here 1 color unit in each direction, to (X1, Y1); the actual offset is therefore (X1+1, Y1-1).
For the second translation, on the basis of the previous one, the mobile terminal adds the multiple-exposure offset, here 1 color unit, to (X1, Y1); the actual offset is therefore (X1, Y1-1).
For the third translation, on the basis of the previous one, the mobile terminal adds the multiple-exposure offset, here 1 color unit, to (X1, Y1); the actual offset is therefore (X1-1, Y1).
Finally, with reference to example 1 or example 2, and as shown in fig. 5, the mobile terminal may align the upper-left corners of the images obtained by the three exposures and combine them, obtaining a full-color image containing the three primary-color components red, green, and blue. It should be noted that the synthesized image may show imaging blur (the shaded portion in fig. 5) within a border of 1-2 PD widths, because the objects in these areas may differ between exposures; however, the pixels involved are very few (typically about 1‰ of the image or even fewer), so the influence on the whole image is small. Furthermore, the blurred border (the shaded portion in fig. 5) may be cropped out, yielding a cleaner final image.
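The border crop just described can be sketched in one line. Representing the image as a 2-D list of pixels and fixing the margin at 2 PD widths are assumptions of this sketch; the text only bounds the blurred region at 1-2 PD widths.

```python
def crop_blurred_border(image, margin=2):
    """Crop `margin` pixel widths (1-2 PD widths, per the description)
    from every side of the synthesized image to remove the possibly
    blurred border left by the per-exposure shifts."""
    return [row[margin:-margin] for row in image[margin:-margin]]
```

For a real sensor the margin would be chosen to match the largest per-exposure shift actually applied.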
Optionally, in this embodiment of the present invention, when the pixel unit in the photosensor of the mobile terminal includes R, G, G, B four color units, step 202 described above specifically includes the following steps:
step 202a 2: and when the mobile terminal performs first exposure on the shot object, determining a fourth target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of R, G, G, B color cells in the pixel cell, and moving the camera unit according to the fourth target offset to perform first exposure on the shot object to obtain a fourth image.
Step 202b 2: and when the mobile terminal carries out second exposure on the shot object, determining a fifth target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of R, G, G, B color cells in the pixel cell, and moving the camera unit according to the fifth target offset to carry out second exposure on the shot object to obtain a fifth image.
Step 202c 2: and when the mobile terminal performs the third exposure on the shot object, determining a sixth target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of R, G, G, B color cells in the pixel cell, and moving the camera unit according to the sixth target offset to perform the third exposure on the shot object to obtain a sixth image.
Step 202d 2: and when the mobile terminal carries out fourth exposure on the shot object, determining a seventh target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of R, G, G, B color cells in the pixel cell, and moving the camera unit according to the seventh target offset to carry out fourth exposure on the shot object to obtain a seventh image.
It should be noted that, since the implementation principle of the scene in which the pixel unit includes R, G, G, B four color cells is the same as that of the scene in which the pixel unit includes R, G, B three color cells, the specific implementation of the above step 202a2, step 202b2, step 202c2, and step 202d2 may refer to the above description of the scene in which the pixel unit includes R, G, B three color cells, and will not be described herein again.
According to the image acquisition method provided by the embodiment of the invention, under the condition that the mobile terminal receives the photographing input, if the mobile terminal is detected to be in a motion state, the mobile terminal can move the photosensitive sensor and/or the lens to expose the photographed object for multiple times according to the motion parameters of the mobile terminal and the arrangement sequence of the N color units in the pixel units in the photosensitive sensor, so that the photosensitive sensor can completely acquire the light information of the photographed object while the shake displacement of the mobile terminal is compensated, and the imaging quality is improved.
Fig. 6 is a schematic diagram of a possible structure of a mobile terminal for implementing the embodiment of the present invention, where a photosensor (not shown in the figure) of the mobile terminal includes a plurality of pixel units, each pixel unit includes N color units, where N is a positive integer greater than 1, and as shown in fig. 6, the mobile terminal 300 includes a receiving module 301, an executing module 302, and a synthesizing module 303, where:
the receiving module 301 is configured to receive a photographing input of a user.
An executing module 302, configured to, in response to the photographing input received by the receiving module 301, move the camera unit to perform M exposures on the photographic subject according to the motion parameters of the mobile terminal and the arrangement sequence of the N color units in the pixel unit if the mobile terminal is detected to be in a motion state, so as to obtain an image from each exposure, where M is greater than or equal to N.
And a synthesizing module 303, configured to synthesize a color image of the photographic subject based on the M images obtained by the executing module 302.
Wherein the image pickup unit includes: a lens and/or a photosensor.
Optionally, in a case that the pixel unit includes three color units of red R, green G, and blue B, the executing module 302 is specifically configured to: when the photographic subject is exposed for the first time, determine a first target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and move the camera unit according to the first target offset to expose the photographic subject for the first time, obtaining a first image; when the photographic subject is exposed for the second time, determine a second target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and move the camera unit according to the second target offset to expose the photographic subject for the second time, obtaining a second image; and when the photographic subject is exposed for the third time, determine a third target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and move the camera unit according to the third target offset to expose the photographic subject for the third time, obtaining a third image.
Optionally, the executing module 302 is specifically configured to: determining a first offset according to the current motion parameter of the mobile terminal, and determining a second offset according to the arrangement sequence of R, G, B color units, wherein the second offset is as follows: 0 or a color unit in a first preset direction; and adding the first offset and the second offset to obtain a first target offset.
Optionally, the executing module 302 is specifically configured to: determining a third offset according to the current motion parameter of the mobile terminal, and determining a fourth offset according to the arrangement sequence of R, G, B color units, wherein the fourth offset is a color unit in a second preset direction; and adding the third offset and the fourth offset to obtain a second target offset.
Optionally, the executing module 302 is specifically configured to: determining a fifth offset according to the current motion parameter of the mobile terminal, and determining a sixth offset according to the arrangement sequence of R, G, B color units, wherein the sixth offset is a color unit in a third preset direction; and adding the fifth offset and the sixth offset to obtain a third target offset.
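The add-the-two-offsets rule used by the executing module is plain vector addition. A minimal sketch, in which the preset directions are an assumption (one color cell to the right per step) since the embodiment leaves them unspecified:

```python
# Sketch of the executing module's offset arithmetic for an R, G, B
# pixel unit: target offset = (shake-compensation offset derived from
# motion parameters) + (pattern offset derived from the R, G, B
# arrangement sequence). The preset directions are assumed values.

CELL = 1  # one color-cell width, in sensor-shift units

# Assumed pattern offsets for the three exposures: 0, then one cell in
# a preset direction, then one cell in a further preset direction.
PATTERN = [(0, 0), (CELL, 0), (2 * CELL, 0)]

def add_offsets(motion_off, pattern_off):
    """Target offset = motion-derived offset + arrangement-derived
    offset, component-wise."""
    return (motion_off[0] + pattern_off[0],
            motion_off[1] + pattern_off[1])
```

For example, a motion-derived offset of (2, -1) combined with the second pattern step yields a target offset of (3, -1).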
According to the mobile terminal provided by the embodiment of the invention, under the condition that the mobile terminal receives the photographing input, if the mobile terminal is detected to be in a motion state, the mobile terminal can move the photosensitive sensor and/or the lens to expose the photographed object for multiple times according to the motion parameters of the mobile terminal and the arrangement sequence of the N color units in the pixel units in the photosensitive sensor, so that the light information of the photographed object can be completely acquired by the photosensitive sensor while the shake displacement of the mobile terminal is compensated, and the imaging quality is improved.
The mobile terminal provided by the embodiment of the present invention can implement each process implemented by the mobile terminal in the above method embodiments, and is not described herein again to avoid repetition.
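The synthesizing module 303 merges the M exposures into one color image. The patent does not specify the synthesis algorithm, so the following is a simplified sketch under the assumption that M = 3 exposures were taken with the sensor shifted by exactly one color cell each, so that each exposure supplies one color channel at every pixel position:

```python
# Simplified sketch of synthesizing a color image from M = 3 shifted
# exposures of an R, G, B pixel unit. Assumes exposure i was taken with
# the sensor shifted so that channel i's color cells cover every pixel
# position; a real implementation would also need sub-pixel
# registration between exposures.

def synthesize_color(exposures):
    """exposures: list of 3 grayscale frames (2D lists of equal size),
    ordered R, G, B. Returns a frame of (r, g, b) tuples."""
    r, g, b = exposures
    h, w = len(r), len(r[0])
    return [[(r[y][x], g[y][x], b[y][x]) for x in range(w)]
            for y in range(h)]
```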
Fig. 7 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, where the mobile terminal 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the configuration of the mobile terminal 100 shown in fig. 7 does not constitute a limitation of the mobile terminal, and that the mobile terminal 100 may include more or fewer components than those shown, or some components may be combined, or the components may be arranged differently. In the embodiment of the present invention, the mobile terminal 100 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 107 is configured to receive a photographing input of a user. The processor 110 is configured to, in response to the photographing input received by the user input unit 107, move the camera unit to expose the photographic subject M times according to the motion parameters of the mobile terminal and the arrangement sequence of the N color units in the pixel unit if the mobile terminal is detected to be in a motion state, so as to obtain an image from each exposure, where M is greater than or equal to N, and to synthesize a color image of the photographic subject based on the M images, where the camera unit includes: a lens and/or a photosensitive sensor.
According to the mobile terminal provided by the embodiment of the invention, under the condition that the mobile terminal receives the photographing input, if the mobile terminal is detected to be in a motion state, the mobile terminal can move the photosensitive sensor and/or the lens to expose the photographed object for multiple times according to the motion parameters of the mobile terminal and the arrangement sequence of the N color units in the pixel units in the photosensitive sensor, so that the light information of the photographed object can be completely acquired by the photosensitive sensor while the shake displacement of the mobile terminal is compensated, and the imaging quality is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used to receive and send signals during message transmission or a call. Specifically, after downlink data is received from a base station, it is forwarded to the processor 110 for processing, and uplink data is transmitted to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal 100 provides the user with wireless broadband internet access via the network module 102, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 and output.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
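Detecting the motion state referred to throughout the embodiments is commonly done by thresholding accelerometer readings such as those provided by the sensor 105. The embodiment does not specify the detection rule, so the following is a hypothetical sketch with an assumed threshold:

```python
import math

# Hypothetical motion-state check: the terminal is treated as being in
# a motion state when the measured acceleration magnitude deviates from
# gravity (~9.8 m/s^2) by more than a threshold. Both the gravity
# constant and the threshold are illustrative assumptions.
GRAVITY = 9.8      # m/s^2
THRESHOLD = 0.5    # m/s^2

def is_in_motion(ax, ay, az):
    """ax, ay, az: accelerometer readings along the three axes."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return abs(magnitude - GRAVITY) > THRESHOLD
```

A terminal at rest reads roughly (0, 0, 9.8) and is classified as stationary; a sideways jolt pushes the magnitude past the threshold.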
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal 100. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, a switch key, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 7, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the mobile terminal 100, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal 100, and is not limited herein.
The interface unit 108 is an interface through which an external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the mobile terminal 100; it connects the various parts of the entire mobile terminal 100 using various interfaces and lines, and performs the various functions of the mobile terminal 100 and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal 100. The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the mobile terminal 100 includes some functional modules that are not shown, and thus, the detailed description thereof is omitted.
Optionally, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements each process of the above embodiment of the image acquisition method and can achieve the same technical effect; details are not repeated here to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored. When executed by a processor, the computer program implements each process of the above embodiment of the image acquisition method and can achieve the same technical effect; details are not repeated here to avoid repetition. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a mobile terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (8)
1. An image acquisition method is applied to a mobile terminal, a photosensitive sensor of the mobile terminal comprises a plurality of pixel units, each pixel unit comprises N color units, N is a positive integer greater than 1, and the method comprises the following steps:
receiving a photographing input of a user;
responding to the photographing input, if the mobile terminal is detected to be in a motion state, moving a camera shooting unit to expose a photographing object for M times according to the motion parameters of the mobile terminal and the arrangement sequence of N color units in the pixel unit to obtain an image obtained by each exposure, wherein M is greater than or equal to N;
synthesizing a color image of the photographic subject based on the M images obtained by the M exposures;
wherein the image pickup unit includes: a lens and/or the photosensitive sensor;
under the condition that the pixel unit at least comprises three color units of red R, green G and blue B, moving the camera unit to perform exposure on a shooting object for M times according to the motion parameters of the mobile terminal and the arrangement sequence of the N color units in the pixel unit to obtain an image obtained by each exposure, comprising:
when the shooting object is exposed for the first time, determining a first target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and moving the shooting unit to expose the shooting object for the first time according to the first target offset to obtain a first image;
when the shot object is exposed for the second time, determining a second target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and moving the camera unit according to the second target offset to expose the shot object for the second time to obtain a second image;
and when the shot object is exposed for the third time, determining a third target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and moving the camera unit according to the third target offset to expose the shot object for the third time to obtain a third image.
2. The method according to claim 1, wherein said determining a first target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color cells comprises:
determining a first offset according to the current motion parameter of the mobile terminal, and determining a second offset according to the arrangement sequence of the R, G, B color units, wherein the second offset is a color unit in a first preset direction;
and adding the first offset and the second offset to obtain a first target offset.
3. The method according to claim 1, wherein said determining a second target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color cells comprises:
determining a third offset according to the current motion parameter of the mobile terminal, and determining a fourth offset according to the arrangement sequence of the R, G, B color units, where the fourth offset is: 0 or a color unit in a second preset direction;
and adding the third offset and the fourth offset to obtain a second target offset.
4. The method according to claim 1, wherein said determining a third target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color cells comprises:
determining a fifth offset according to the current motion parameter of the mobile terminal, and determining a sixth offset according to the arrangement sequence of the R, G, B color units, wherein the sixth offset is a color unit in a third preset direction;
and adding the fifth offset and the sixth offset to obtain a third target offset.
5. A mobile terminal, wherein a photosensitive sensor of the mobile terminal comprises a plurality of pixel units, each pixel unit comprises N color units, and N is a positive integer greater than 1, the mobile terminal further comprising:
the receiving module is used for receiving photographing input of a user;
an execution module, configured to, in response to the photographing input received by the receiving module, move the camera unit to perform M exposures on the photographed object according to the motion parameters of the mobile terminal and the arrangement sequence of the N color units in the pixel unit if the mobile terminal is detected to be in a motion state, so as to obtain an image from each exposure, wherein M is greater than or equal to N;
a synthesis module, configured to synthesize a color image of the photographic subject based on the M images obtained by the exposure for M times obtained by the execution module;
wherein the image pickup unit includes: a lens and/or the photosensitive sensor;
in a case that the pixel unit includes three color units of red R, green G, and blue B, the executing module is specifically configured to:
when the shooting object is exposed for the first time, determining a first target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and moving the shooting unit to expose the shooting object for the first time according to the first target offset to obtain a first image;
when the shot object is exposed for the second time, determining a second target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and moving the camera unit according to the second target offset to expose the shot object for the second time to obtain a second image;
and when the shot object is exposed for the third time, determining a third target offset according to the current motion parameter of the mobile terminal and the arrangement sequence of the R, G, B color units, and moving the camera unit according to the third target offset to expose the shot object for the third time to obtain a third image.
6. The mobile terminal of claim 5, wherein the execution module is specifically configured to:
determining a first offset according to the current motion parameter of the mobile terminal, and determining a second offset according to the arrangement sequence of the R, G, B color units, wherein the second offset is as follows: 0 or a color unit in a first preset direction;
and adding the first offset and the second offset to obtain a first target offset.
7. The mobile terminal of claim 5, wherein the execution module is specifically configured to:
determining a third offset according to the current motion parameter of the mobile terminal, and determining a fourth offset according to the arrangement sequence of the R, G, B color units, wherein the fourth offset is a color unit in a second preset direction;
and adding the third offset and the fourth offset to obtain a second target offset.
8. The mobile terminal of claim 5, wherein the execution module is specifically configured to:
determining a fifth offset according to the current motion parameter of the mobile terminal, and determining a sixth offset according to the arrangement sequence of the R, G, B color units, wherein the sixth offset is a color unit in a third preset direction;
and adding the fifth offset and the sixth offset to obtain a third target offset.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811511256.3A CN109729264B (en) | 2018-12-11 | 2018-12-11 | Image acquisition method and mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109729264A CN109729264A (en) | 2019-05-07 |
CN109729264B true CN109729264B (en) | 2021-01-08 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112957062B (en) * | 2021-05-18 | 2021-07-16 | 雅安市人民医院 | Vehicle-mounted CT imaging system and imaging method based on 5G transmission |
CN113873233A (en) * | 2021-10-14 | 2021-12-31 | 维沃移动通信有限公司 | Lens module detection method and device and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102326381A (en) * | 2009-02-27 | 2012-01-18 | 索尼公司 | Imaging device and imaging method |
CN105430265A (en) * | 2015-11-27 | 2016-03-23 | 努比亚技术有限公司 | Method and device for increasing imaging range of camera |
CN107203966A (en) * | 2017-05-08 | 2017-09-26 | 珠海市魅族科技有限公司 | A kind of coloured image synthetic method and device |
WO2018072353A1 (en) * | 2016-10-17 | 2018-04-26 | 华为技术有限公司 | Image acquiring method and terminal device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180070264A (en) * | 2016-12-16 | 2018-06-26 | 삼성전자주식회사 | Obtaining method for Panning shot image and Electronic device supporting the same |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |