
CN112866551B - Focusing method and device, electronic equipment and computer readable storage medium - Google Patents

Focusing method and device, electronic equipment and computer readable storage medium Download PDF

Info

Publication number
CN112866551B
CN112866551B (application CN201911102669.0A)
Authority
CN
China
Prior art keywords
phase difference
image
difference value
target
focusing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911102669.0A
Other languages
Chinese (zh)
Other versions
CN112866551A (en)
Inventor
颜光宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911102669.0A priority Critical patent/CN112866551B/en
Publication of CN112866551A publication Critical patent/CN112866551A/en
Application granted granted Critical
Publication of CN112866551B publication Critical patent/CN112866551B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Focusing (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to a focusing method and apparatus, an electronic device, and a computer-readable storage medium. A first phase difference value and a second phase difference value of a detection image corresponding to a first shooting posture are acquired, and the focusing mode of the electronic device is determined according to the confidence degrees corresponding to the two phase differences; when the focusing mode is phase detection autofocus, a second shooting posture of the electronic device and deflection data of the electronic device relative to the first shooting posture are detected; a target direction and a corresponding target phase difference are determined according to the deflection data, the first phase difference value, and the second phase difference value; and the electronic device is controlled to focus according to the target phase difference. In this way, the target direction and the corresponding target phase difference value are determined according to the deflection data of the electronic device and the confidences of the two phase differences, so the phase difference values in the two directions need not be recalculated after the electronic device is angularly deflected, which reduces the computational complexity of the focusing algorithm and improves the focusing speed of the electronic device in a deflected posture.

Description

Focusing method and device, electronic equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of images, and in particular, to a focusing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
When an image is captured, in order to ensure that the image is captured clearly, focusing is generally required on the electronic device, and the focusing refers to a process of adjusting the distance between a lens and an image sensor. Currently, a common focusing method includes Phase Detection Auto Focus (PDAF).
In the related art, to perform phase detection autofocus, phase detection pixel points (also referred to as shielded pixel points) may be arranged in pairs among the pixel points of the image sensor. In each pair, one phase detection pixel point is shielded on the left side and the other on the right side, so that the imaging light beam directed at each pair is separated into a left part and a right part. A phase difference can be obtained by comparing the images formed by the left and right parts of the imaging light beam, and focusing can then be performed according to that phase difference, where the phase difference refers to the difference in the positions at which imaging light rays entering from different directions form images.
However, when the shooting angle of the electronic device is deflected, the phase differences in the two directions must be obtained again using the phase detection pixel points, which slows focusing.
Disclosure of Invention
The embodiments of the present application provide a focusing method and apparatus, an electronic device, and a computer-readable storage medium, which can improve the speed of focusing when the shooting posture of the device changes.
A focusing method is applied to electronic equipment, the electronic equipment comprises an image sensor, the image sensor comprises a plurality of pixel point groups arranged in an array, and each pixel point group comprises M x N pixel points arranged in an array; each pixel point corresponds to a photosensitive unit, wherein M and N are both natural numbers greater than or equal to 2, and the method comprises the following steps:
obtaining a phase difference value of a detection image corresponding to a first shooting posture, wherein the phase difference value comprises a first phase difference value in a first direction and a second phase difference value in a second direction, and a preset included angle is formed between the first direction and the second direction;
determining a focusing mode of the electronic equipment according to a first confidence coefficient corresponding to the first phase difference value and a second confidence coefficient corresponding to the second phase difference value;
when the focusing mode is phase detection autofocus, detecting a second shooting posture of the electronic device, and acquiring deflection data of the second shooting posture relative to the first shooting posture;
determining a target direction according to the deflection data, the first phase difference value and the second phase difference value, and acquiring a target phase difference corresponding to the target direction in a target image corresponding to the second shooting posture;
and controlling the electronic equipment to focus according to the target phase difference.
The focusing apparatus is applied to an electronic device, wherein the electronic device comprises an image sensor, the image sensor comprises a plurality of pixel point groups arranged in an array, and each pixel point group comprises M x N pixel points arranged in an array; each pixel point corresponds to a photosensitive unit, where M and N are both natural numbers greater than or equal to 2, and the focusing apparatus comprises:
the first acquisition module is used for acquiring a phase difference value of a detection image corresponding to a first shooting posture, wherein the phase difference value comprises a first phase difference value in a first direction and a second phase difference value in a second direction, and a preset included angle is formed between the first direction and the second direction;
the first determining module is used for determining a focusing mode of the electronic equipment according to a first confidence coefficient corresponding to the first phase difference value and a second confidence coefficient corresponding to the second phase difference value;
the second obtaining module is used for detecting a second shooting posture of the electronic device when the focusing mode is phase detection autofocus, and acquiring deflection data of the second shooting posture relative to the first shooting posture;
the second determining module is used for determining a target direction according to the deflection data, the first phase difference value and the second phase difference value and acquiring a target phase difference corresponding to the target direction in a target image corresponding to the second shooting posture;
and the focusing module is used for controlling the electronic equipment to focus according to the target phase difference.
An electronic device comprising a memory and a processor, the memory having stored therein a computer program, which, when executed by the processor, causes the processor to perform the steps of the focusing method as described.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the focusing method as described.
According to the focusing method and apparatus, the electronic device, and the computer-readable storage medium, the phase difference value of the detection image corresponding to the first shooting posture is obtained, the phase difference value including a first phase difference value in a first direction and a second phase difference value in a second direction, and the focusing mode of the electronic device is determined according to a first confidence corresponding to the first phase difference value and a second confidence corresponding to the second phase difference value. When the focusing mode is phase detection autofocus, a second shooting posture of the electronic device is detected and deflection data of the second shooting posture relative to the first shooting posture is acquired. A target direction is determined according to the deflection data, the first phase difference value, and the second phase difference value, and the target phase difference corresponding to the target direction in the target image corresponding to the second shooting posture is acquired. The electronic device is then controlled to focus according to the target phase difference. With this scheme, the target direction is determined from the deflection data of the electronic device and the confidences of the two phase differences, and the target phase difference corresponding to the target direction is read from the target image of the second shooting posture, so the phase difference values in the two directions need not be recalculated after the electronic device is angularly deflected; this reduces the computational complexity of the focusing algorithm and improves the focusing speed of the electronic device during angular deflection.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of phase detection autofocus;
fig. 2 is a schematic diagram of arranging phase detection pixels in pairs among pixels included in an image sensor;
FIG. 3 is a schematic diagram showing a partial structure of an image sensor according to an embodiment;
FIG. 4 is a schematic diagram of an embodiment of a pixel structure;
FIG. 5 is a schematic configuration diagram of an image forming apparatus in one embodiment;
FIG. 6 is a diagram illustrating an embodiment of a filter disposed on a pixel group;
FIG. 7 is a flow chart of a focusing method in one embodiment;
FIG. 8 is a flowchart illustrating the steps of determining a target direction from the first direction and the second direction according to the deflection angle, a first confidence corresponding to the first phase difference value, and a second confidence corresponding to the second phase difference value, according to an embodiment;
FIG. 9 is a flowchart illustrating the steps of acquiring a first confidence and a second confidence according to scene information of the detection image, in one embodiment;
FIG. 10 is a flowchart illustrating the steps of acquiring a phase difference value of the detection image, in one embodiment;
FIG. 11 is a flowchart illustrating the steps of obtaining a first phase difference value according to the phase relationship corresponding to the first segmented image and the second segmented image, and obtaining a second phase difference value according to the phase relationship corresponding to the third segmented image and the fourth segmented image, in one embodiment;
FIG. 12 is a block diagram showing the structure of a focusing device in one embodiment;
FIG. 13 is a block diagram of an electronic device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first direction may be referred to as a second direction, and similarly, a second direction may be referred to as a first direction, without departing from the scope of the present application. The first direction and the second direction are both directions, but they are not the same direction.
When an image is shot, in order to ensure that a moving object is imaged clearly, the electronic device generally needs to keep the subject in focus: after the camera first focuses on the subject, focus on that subject is maintained during subsequent shooting. For example, while the electronic device previews a shot, once the subject is in focus, subsequently acquired preview images keep the subject in focus and the subject remains sharply imaged. "Focusing" means the process of adjusting the distance between the lens of the electronic device and the image sensor so that the image sensor images sharply. Among focusing techniques, Phase Detection Auto Focus (PDAF) is a common autofocus technology.
Hereinafter, the embodiment of the present application will briefly explain the principle of the PDAF technique.
Fig. 1 is a schematic diagram of a Phase Detection Auto Focus (PDAF) principle. As shown in fig. 1, M1 is a position where the image sensor is located when the electronic device is in a focusing state, where the focusing state refers to a successfully focused state. When the image sensor is located at the position M1, the imaging light rays g reflected by the object W in different directions toward the Lens converge on the image sensor, that is, the imaging light rays g reflected by the object W in different directions toward the Lens are imaged at the same position on the image sensor, and at this time, the image sensor is imaged clearly.
M2 and M3 are positions where the image sensor may be located when the electronic device is not in the in-focus state, and as shown in fig. 1, when the image sensor is located at the M2 position or the M3 position, the imaging light rays g reflected by the object W to the Lens in different directions will be imaged at different positions. Referring to fig. 1, when the image sensor is located at the position M2, the imaging light rays g reflected by the object W in different directions toward the Lens are imaged at the position A and the position B, respectively, and when the image sensor is located at the position M3, the imaging light rays g reflected by the object W in different directions toward the Lens are imaged at the position C and the position D, respectively; at this time, the image formed on the image sensor is not clear.
In the PDAF technique, the difference in the positions of the images formed on the image sensor by imaging light rays entering the lens from different directions can be obtained; for example, as shown in fig. 1, the difference between position A and position B, or between position C and position D. From this positional difference and the geometric relationship between the lens and the image sensor in the camera, the defocus distance can be obtained, where the defocus distance refers to the distance between the current position of the image sensor and the position it should occupy in the in-focus state. The electronic device can then focus according to the obtained defocus distance.
From this it can be seen that the calculated PD value is 0 when in focus; the larger the calculated value, the farther the lens is from the in-focus position, and the smaller the value, the closer it is. When PDAF focusing is adopted, the PD value is calculated, the correspondence between the PD value and the defocus distance is obtained from calibration, the defocus distance is derived, and the lens is then controlled to move to the in-focus point according to the defocus distance, thereby achieving focusing.
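As a rough illustration of this calibration step, the sketch below (with hypothetical helper names; the real calibrated PD-to-defocus relationship need not be linear) converts a PD value into a lens correction:

```python
# Sketch only: mapping a phase-difference (PD) value to a defocus distance
# via an assumed linear calibration, then applying one lens correction.
# `calib_slope` stands in for the calibrated PD-to-defocus relationship.

def defocus_from_pd(pd_value: float, calib_slope: float) -> float:
    """Defocus distance in lens-actuator steps; a PD of 0 means in focus."""
    return calib_slope * pd_value

def focus_step(current_pos: int, pd_value: float, calib_slope: float) -> int:
    """Return the new lens position after one PDAF correction."""
    return current_pos + round(defocus_from_pd(pd_value, calib_slope))

# In focus: PD = 0, so the lens stays put; a larger |PD| gives a larger
# correction, and the sign of the PD gives the direction of movement.
```

In a real camera pipeline this correction would be iterated, since each lens move changes the measured PD.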
In the related art, some phase detection pixel points may be provided in pairs among the pixel points included in the image sensor, and as shown in fig. 2, a phase detection pixel point pair (hereinafter, referred to as a pixel point pair) A, a pixel point pair B, and a pixel point pair C may be provided in the image sensor. In each pixel point pair, one phase detection pixel point is shielded on the left side and the other is shielded on the right side.
For a phase detection pixel point shielded on the left, only the right-hand part of the imaging beam directed at it can form an image on its photosensitive (unshielded) part; for a phase detection pixel point shielded on the right, only the left-hand part can. The imaging beam is thus divided into a left part and a right part, and the phase difference can be obtained by comparing the images formed by the two parts.
However, since the phase detection pixel points arranged in the image sensor are generally sparse, only a horizontal phase difference can be obtained through them. For a scene containing only horizontal textures, the phase difference cannot be calculated, or the calculated PD values are confused and yield an incorrect result. For example, when the photographed scene is a horizontal line, two left and right images are obtained according to the PD characteristics, but no PD value can be calculated from them.
In order to solve the problem that the phase detection autofocus cannot calculate a PD value to implement focusing for some horizontal texture scenes, an embodiment of the present application provides an imaging component, which may be configured to detect and output a first phase difference value and a second phase difference value, and implement focusing for a horizontal texture scene by using the second phase difference value.
In one embodiment, the present application provides an imaging assembly. The imaging assembly includes an image sensor. The image sensor may be a Complementary Metal-Oxide-Semiconductor (CMOS) image sensor, a Charge-Coupled Device (CCD), a quantum thin film sensor, an organic sensor, or the like.
Fig. 3 is a schematic structural diagram of a part of an image sensor in one embodiment. The image sensor 300 includes a plurality of pixel point groups Z arranged in an array, each pixel point group Z includes a plurality of pixel points D arranged in an array, and each pixel point D corresponds to one photosensitive unit. Each pixel point D comprises a plurality of sub-pixel points d arranged in an array. That is, each photosensitive unit may be composed of a plurality of photosensitive elements arranged in an array. A photosensitive element is an element capable of converting an optical signal into an electrical signal; in one embodiment, it may be a photodiode. In this embodiment, each pixel point group Z includes 4 pixel points D arranged in a 2 x 2 array, and each pixel point may include 4 sub-pixel points d arranged in a 2 x 2 array. Each pixel point group thus forms a 2 x 2 PD array that can directly receive optical signals, perform photoelectric conversion, and simultaneously output left-right and up-down signals. Each color channel may consist of 4 sub-pixel points.
As shown in fig. 4, taking each pixel point including a sub-pixel point 1, a sub-pixel point 2, a sub-pixel point 3, and a sub-pixel point 4 as an example, the sub-pixel point 1 and the sub-pixel point 2 may be synthesized, the sub-pixel point 3 and the sub-pixel point 4 are synthesized to form a PD pixel pair in the up-down direction, and a horizontal edge is detected to obtain a second phase difference value, i.e., a PD value in the vertical direction; the sub-pixel point 1 and the sub-pixel point 3 are synthesized, the sub-pixel point 2 and the sub-pixel point 4 are synthesized to form a PD pixel pair in the left and right directions, a vertical edge can be detected, and a first phase difference value, namely a PD value in the horizontal direction, is obtained.
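The sub-pixel combinations described above can be sketched as follows, assuming each pixel point is represented as a 2 x 2 NumPy array `[[s1, s2], [s3, s4]]` of sub-pixel values (the function name is illustrative):

```python
import numpy as np

# Sketch of the sub-pixel synthesis described above for one pixel point.
# Sub-pixels 1+2 and 3+4 form the up-down PD pair (detects horizontal edges,
# gives the vertical PD); sub-pixels 1+3 and 2+4 form the left-right PD pair
# (detects vertical edges, gives the horizontal PD).

def pd_pairs(pixel: np.ndarray):
    """Return (left, right, top, bottom) half-pixel signals for a 2x2 pixel."""
    s1, s2 = pixel[0, 0], pixel[0, 1]
    s3, s4 = pixel[1, 0], pixel[1, 1]
    top = s1 + s2      # upper half of the up-down PD pair
    bottom = s3 + s4   # lower half of the up-down PD pair
    left = s1 + s3     # left half of the left-right PD pair
    right = s2 + s4    # right half of the left-right PD pair
    return left, right, top, bottom
```

Collecting these half-pixel signals across the whole sensor yields the left/right and top/bottom images from which the two phase differences are computed.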
Fig. 5 is a schematic structural diagram of an electronic device in one embodiment. As shown in fig. 5, the electronic device includes a microlens 50, a filter 52, and an imaging component 54. The microlens 50, the filter 52 and the imaging component 54 are sequentially located on the incident light path, i.e. the microlens 50 is disposed on the filter 52, and the filter 52 is disposed on the imaging component 54.
The filter 52 may be one of three types, red, green, and blue, which transmit only light at wavelengths corresponding to red, green, and blue, respectively. One filter 52 is disposed on each pixel point.
The imaging assembly 54 includes the image sensor of fig. 3.
The microlens 50 is used to receive incident light and transmit it to the filter 52. The filter 52 filters the incident light, and the filtered light is then incident on the imaging component 54 on a per-pixel basis.
The light sensing unit in the image sensor converts light incident from the optical filter 52 into a charge signal by a photoelectric effect, and generates a pixel signal in accordance with the charge signal. The charge signal corresponds to the received light intensity.
Fig. 6 is a schematic diagram illustrating a filter disposed on a pixel group according to an embodiment. The pixel point group Z comprises 4 pixel points D arranged in an array arrangement manner of two rows and two columns, wherein color channels of the pixel points in the first row and the first column are green, that is, the optical filters arranged on the pixel points in the first row and the first column are green optical filters; the color channel of the pixel points in the first row and the second column is red, that is, the optical filter arranged on the pixel points in the first row and the second column is a red optical filter; the color channel of the pixel points in the second row and the first column is blue, that is, the optical filter arranged on the pixel points in the second row and the first column is a blue optical filter; the color channel of the pixel points in the second row and the second column is green, that is, the optical filter arranged on the pixel points in the second row and the second column is a green optical filter.
FIG. 7 is a flowchart of a focusing method in one embodiment. As shown in fig. 7, the focusing method includes steps 702 to 708.
Step 702, obtaining a phase difference value of the detection image corresponding to the first shooting posture, where the phase difference value includes a first phase difference value in a first direction and a second phase difference value in a second direction, and a preset included angle is formed between the first direction and the second direction.
The first shooting posture may serve as a reference posture. The first direction and the second direction form a preset included angle, such as 90°, 60°, or 45°; for example, the first direction and the second direction may be the vertical and horizontal directions, or inclined directions. The detection image is an image of a target detection area in the shooting scene, acquired by the electronic device in the first shooting posture; it corresponds to the region of interest, and may include most of the region of interest or cover it entirely. The region of interest may be a region containing a subject, such as a person, a flower, a cat, a dog, a cow, a blue sky, a white cloud, or a background. The phase difference value includes a first phase difference value and a second phase difference value; the first phase difference value corresponds to the first direction, and the second phase difference value corresponds to the second direction. When the first phase difference value is the phase difference in the horizontal direction, the second phase difference value is the phase difference in the vertical direction.
Specifically, the phase difference value of the detection image is obtained as follows: the electronic device splits the detection image into an upper image and a lower image according to the first direction, and into a left image and a right image according to the second direction; it then calculates the first phase difference value from the positional difference of matched pixels in the upper image and the lower image, and obtains the second phase difference value from the positional difference of matched pixels in the left image and the right image.
In the embodiment of the present application, "the positional difference of pixels matched with each other" means: a difference in the position of a pixel located in the left image and the position of a pixel located in the right image among the pixels matched with each other; or the difference in the position of a pixel located in the upper diagram and the position of a pixel located in the lower diagram among pixels that match each other. The pixels matched with each other respectively correspond to different images formed in the image sensor by imaging light rays entering the lens from different directions. For example, pixel a in the left image and pixel B in the right image match each other, where pixel a may correspond to the image formed at the a position in fig. 1 and pixel B may correspond to the image formed at the B position in fig. 1. Since the matched pixels respectively correspond to different images formed by imaging light rays entering the lens from different directions in the image sensor, the phase difference of the matched pixels can be determined according to the position difference of the matched pixels.
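As an illustrative sketch of matching pixels between two such split images, the following finds the integer shift that best aligns a left image with a right image using a simple sum-of-absolute-differences search; real PDAF pipelines typically match per block and interpolate to sub-pixel precision:

```python
import numpy as np

# Illustrative sketch only: estimate a phase difference as the horizontal
# shift minimising the mean absolute difference between the two images.

def phase_difference(left: np.ndarray, right: np.ndarray,
                     max_shift: int = 8) -> int:
    """Return the integer pixel shift that best aligns `right` to `left`."""
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(right, s, axis=1)  # candidate alignment
        cost = np.abs(left - shifted).mean()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift
```

The same search applied to the upper and lower images (shifting along axis 0) yields the phase difference in the other direction.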
Step 704, determining a focusing mode of the electronic device according to the first confidence degree corresponding to the first phase difference value and the second confidence degree corresponding to the second phase difference value.
The first confidence is used to represent the accuracy of the first phase difference value, and the second confidence is used to represent the accuracy of the second phase difference value. Focusing modes include Phase Detection Auto Focus (PDAF), contrast autofocus, and the like.
Specifically, the focusing mode of the electronic device is determined from the first confidence and the second confidence as follows: when the first confidence and the second confidence meet a confidence condition, the focusing mode of the electronic device is determined to be PDAF; when they do not meet the confidence condition, the focusing mode is determined to be contrast autofocus. The confidence condition is set by an engineer according to actual requirements and may be, for example: the first confidence is greater than a first threshold and/or the second confidence is greater than a second threshold; the larger of the two confidences is greater than a third threshold; the smaller of the two confidences is greater than a fourth threshold; or the difference between the first confidence and the second confidence is greater than a fifth threshold, without limitation here. The first to fifth thresholds are only illustrative; they may take the same or different values and may be set arbitrarily, which is not described again here.
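A minimal sketch of this mode decision, using one of the listed options ("either confidence exceeds its threshold"); the threshold values are illustrative assumptions:

```python
# Sketch only: choose the focusing mode from the two phase-difference
# confidences. The confidence condition used here ("either confidence
# exceeds its threshold") is one of several options the text lists.

def choose_focus_mode(conf_first: float, conf_second: float,
                      first_threshold: float = 0.5,
                      second_threshold: float = 0.5) -> str:
    if conf_first > first_threshold or conf_second > second_threshold:
        return "PDAF"      # phase detection autofocus
    return "contrast"      # fall back to contrast autofocus
```

Swapping in a different listed condition (larger/smaller value, or the difference between the two) only changes the boolean test.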
Step 706, when the focusing mode is phase detection autofocus, detecting a second shooting posture of the electronic device, and acquiring deflection data of the second shooting posture relative to the first shooting posture.
Specifically, the second shooting posture may differ from the first shooting posture by some degree of angular, height, or depth deflection. A gyroscope can be used to acquire the data corresponding to the first shooting posture and the data corresponding to the second shooting posture, and the deflection data is obtained by comparing the two. The deflection data may include angular deflection, height offset, and/or depth offset, among others.
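A hypothetical sketch of deriving deflection data by differencing two attitude samples, one per shooting posture (the field names and units are illustrative assumptions):

```python
from dataclasses import dataclass

# Sketch only: deflection data as the componentwise difference between
# gyroscope/attitude samples taken in the two shooting postures.

@dataclass
class Posture:
    angle: float   # rotation about the optical axis, in degrees
    height: float  # device height, in metres
    depth: float   # distance along the optical axis, in metres

def deflection(first: Posture, second: Posture) -> Posture:
    """Deflection of the second shooting posture relative to the first."""
    return Posture(second.angle - first.angle,
                   second.height - first.height,
                   second.depth - first.depth)
```

For example, rotating the phone from landscape to portrait while keeping it in place would yield a deflection with a ~90° angle component and near-zero height and depth offsets.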
Step 708, determining a target direction according to the deflection data, the first phase difference value and the second phase difference value, and obtaining a target phase difference corresponding to the target direction in the target image corresponding to the second shooting attitude.
Specifically, the process of determining the target direction according to the deflection data, the first phase difference value and the second phase difference value is as follows: when the deflection data is smaller than or equal to a deviation threshold, the phase difference value with the higher confidence among the first phase difference value and the second phase difference value is determined as the target phase difference. When the deflection data is larger than the deviation threshold, the target direction and the target phase difference corresponding to the target direction are obtained according to the deflection data. The target phase difference corresponding to the target direction refers to the phase difference value of the target direction in the target image; the target image is the image acquired by the electronic device in the second shooting attitude, and it may include most of the region of interest or cover the region of interest entirely. The process of obtaining the target direction and the corresponding target phase difference according to the deflection data is as follows: judge whether the deflection data satisfies a preset deflection condition; when it does, determine one of the first direction and the second direction as the target direction, then acquire the target image in the second shooting attitude and obtain the target phase difference of the target direction in the target image; when the deflection data does not satisfy the preset deflection condition, acquire the target image in the second shooting attitude, obtain the phase differences and confidences of the upper and lower images of the target image, and select the phase difference value corresponding to the larger confidence as the target phase difference.
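The branch structure of step 708 can be sketched as follows; all names, the dict-based interface, and the callable standing in for the preset deflection condition are assumptions for illustration:

```python
def target_phase(deflection, dev_threshold,
                 detect_pd, detect_conf, target_pd,
                 direction_from_deflection):
    """Pick the target direction and target phase difference.

    detect_pd / detect_conf map each direction to the detection image's
    phase difference and its confidence; target_pd holds the target
    image's phase differences; direction_from_deflection encodes the
    preset deflection rule."""
    if deflection <= dev_threshold:
        # small deflection: keep the higher-confidence phase difference
        best = max(detect_conf, key=detect_conf.get)
        return best, detect_pd[best]
    # large deflection: choose the direction from the deflection rule and
    # read that direction's phase difference from the target image
    d = direction_from_deflection(deflection)
    return d, target_pd[d]
```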
The preset deflection condition relates to quantities such as angle and height; it may be a range threshold for the angle deviation or a threshold for the height, and the specific values may be set by an engineer according to actual requirements, which is not limited here.
And step 710, controlling the electronic equipment to focus according to the target phase difference.
Specifically, since the correspondence between the phase difference value and the defocus distance value can be obtained by calibration, the target defocus distance can be derived from the target phase difference value through this calibrated relationship. The lens is then controlled to focus on the target subject according to the target defocus distance value.
For a shooting scene with only horizontal texture, the phase difference value in the horizontal direction cannot be obtained from the horizontally arranged PD pixel pairs, but the phase difference value in the vertical direction can still be calculated; the defocus distance value is then calculated from the vertical phase difference value, and the lens is controlled to move according to that defocus distance value to achieve focusing.
For a shooting scene with only vertical texture, the phase difference value in the vertical direction cannot be obtained from the vertically arranged PD pixel pairs, but the phase difference value in the horizontal direction can still be calculated; the defocus distance value is then calculated from the horizontal phase difference value, and the lens is controlled to move according to that defocus distance value to achieve focusing.
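The texture argument can be verified numerically: a scene made only of horizontal stripes has no variation along the row direction (so horizontal phase matching finds nothing to lock onto) but strong variation along the column direction. A small NumPy illustration:

```python
import numpy as np

# Synthetic horizontal-texture scene: every row is constant, rows differ.
img = np.tile(np.arange(8.0).reshape(8, 1), (1, 8))

# Variation along the rows (horizontal direction) is identically zero,
# so a horizontal phase difference cannot be measured; variation along
# the columns (vertical direction) is non-zero everywhere.
h_signal = np.abs(np.diff(img, axis=1)).sum()  # horizontal variation
v_signal = np.abs(np.diff(img, axis=0)).sum()  # vertical variation
```

Swapping the tiling axes reproduces the vertical-texture case, where only the horizontal phase difference is usable.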
The focusing method obtains the phase difference values of the detection image corresponding to the first shooting attitude, the phase difference values including a first phase difference value in the first direction and a second phase difference value in the second direction, and determines the focusing mode of the electronic device according to the first confidence corresponding to the first phase difference value and the second confidence corresponding to the second phase difference value; when the focusing mode is phase detection auto focus, it detects the second shooting attitude of the electronic device and acquires the deflection data of the second shooting attitude relative to the first shooting attitude; it determines the target direction according to the deflection data, the first phase difference value and the second phase difference value, and obtains the target phase difference corresponding to the target direction in the target image corresponding to the second shooting attitude; and it controls the electronic device to focus according to the target phase difference. In this scheme, the target direction is determined from the deflection data of the electronic device and the confidences of the two phase differences, and the target phase difference value corresponding to the target direction is read from the target image corresponding to the second shooting attitude, so the phase difference values in both directions do not need to be recalculated after the electronic device undergoes angular deflection; this reduces the computational complexity of the focusing algorithm and improves the focusing speed of the electronic device during angular deflection.
In one embodiment, the deflection data comprises at least a deflection angle, and determining the target direction from the deflection data, the first phase difference value, and the second phase difference value comprises: and determining the target direction from the first direction and the second direction according to the deflection angle, the first confidence coefficient corresponding to the first phase difference value and the second confidence coefficient corresponding to the second phase difference value.
Specifically, the target direction is determined according to the deflection angle, the first phase difference value and the second phase difference value as follows: when the deviation between the angle corresponding to the second shooting attitude and the angle corresponding to the first shooting attitude is larger than an angle deviation threshold, the deflection angle of the second shooting attitude relative to the first shooting attitude is acquired, and the target direction and the target phase difference corresponding to the target direction are obtained according to the deflection angle. It is judged whether the deflection angle satisfies a preset deflection condition; when it does, one of the first direction and the second direction is determined as the target direction, then the target image in the second shooting attitude is acquired and the target phase difference of the target direction in the target image is obtained. When the deflection angle does not satisfy the preset deflection condition, the target image in the second shooting attitude is acquired, the phase differences and confidences of the upper and lower images of the target image are obtained, and the phase difference value corresponding to the larger confidence is selected as the target phase difference. The preset condition here is an angle condition, which may be a range threshold for the angle deviation and can be set according to actual requirements.
In one embodiment, as shown in fig. 8, the step of determining the target direction from the first direction and the second direction according to the deflection angle, the first confidence corresponding to the first phase difference value, and the second confidence corresponding to the second phase difference value includes: step 802 and step 804.
And step 802, when the deflection angle is in the first angle range, selecting the direction of the phase difference value corresponding to the larger confidence coefficient from the first confidence coefficient and the second confidence coefficient as a target direction.
Specifically, the first angle range may be [0°−x, 0°+x] or [180°−x, 180°+x], where x is any value within 0-5°, 0-8°, or 0-10°. When the first confidence corresponding to the phase difference in the first direction is greater than the second confidence corresponding to the phase difference in the second direction, the first direction is the target direction; the target image corresponding to the current shooting angle is acquired, the phase difference value corresponding to the first direction in the target image is obtained, and focusing is performed according to that phase difference value. When the first confidence corresponding to the phase difference in the first direction is smaller than the second confidence corresponding to the phase difference in the second direction, the second direction is the target direction; the target image corresponding to the current shooting angle is acquired, the phase difference value corresponding to the second direction in the target image is obtained, and focusing is performed according to that phase difference value.
And 804, when the deflection angle is in the second angle range, selecting the direction of the phase difference value corresponding to the smaller confidence coefficient from the first confidence coefficient and the second confidence coefficient as the target direction.
Specifically, the second angle range may be [90°−y, 90°+y] or [270°−z, 270°+z], where y and z may be any value within 0-5°, 0-8°, or 0-10°. When the first confidence corresponding to the phase difference in the first direction is greater than the second confidence corresponding to the phase difference in the second direction, the second direction is determined as the target direction; the target image corresponding to the current shooting angle is acquired, the phase difference value corresponding to the second direction in the target image is obtained, and focusing is performed according to that phase difference value. When the first confidence corresponding to the phase difference in the first direction is smaller than the second confidence corresponding to the phase difference in the second direction, the first direction is determined as the target direction; the target image corresponding to the current shooting angle is acquired, the phase difference value corresponding to the first direction in the target image is obtained, and focusing is performed according to that phase difference value.
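Steps 802 and 804 can be sketched as a single selection function; the margin x, the string direction labels, and the None fallback are assumptions, and the comment about sensor orientation is an interpretation rather than the patent's wording:

```python
def direction_for_angle(deflection_angle, conf1, conf2, x=5.0):
    """Choose the target direction from the deflection angle.

    Near 0/180 degrees (first angle range): keep the direction whose
    phase difference had the higher confidence. Near 90/270 degrees
    (second angle range): take the other direction, since the PD
    orientations have effectively swapped."""
    a = deflection_angle % 360
    in_range = lambda center: abs(a - center) <= x
    if in_range(0) or in_range(180) or in_range(360):
        return "first" if conf1 >= conf2 else "second"
    if in_range(90) or in_range(270):
        return "second" if conf1 >= conf2 else "first"
    return None  # outside both ranges; re-measure instead
```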
In one embodiment, before determining the focusing mode of the electronic device according to the first confidence corresponding to the first phase difference value and the second confidence corresponding to the second phase difference value, the method further includes: and acquiring a first confidence coefficient and a second confidence coefficient according to the scene information of the detected image.
Specifically, the scene information may include information such as image brightness and image edges. When the image brightness of the detected image corresponding to the scene information satisfies the brightness threshold, the first confidence level and the second confidence level may be obtained by using an image edge, such as a vertical edge and/or a horizontal edge, of the detected image.
In one embodiment, as shown in fig. 9, the step of obtaining the first confidence level and the second confidence level according to the scene information of the detected image includes: step 902 and step 904. Wherein the scene information includes: an image edge, the image edge comprising: horizontal edges and vertical edges. The first direction corresponds to the horizontal direction and the second direction corresponds to the vertical direction.
Step 902, performing horizontal edge detection on the detected image to obtain the number of pixel points of the obtained horizontal edge, and determining a second confidence according to the number of pixel points of the horizontal edge.
Specifically, horizontal edge detection is performed through a horizontal edge detection operator of the sobel operator, a second confidence coefficient corresponding to the vertical direction is defined according to the number of the acquired pixel points of the horizontal edge, and the second confidence coefficient is larger when the number of the pixel points of the horizontal edge is larger.
And 904, performing vertical edge detection on the detected image to obtain the number of pixel points of the obtained vertical edge, and determining a first confidence coefficient according to the number of the pixel points of the vertical edge.
Specifically, vertical edge detection is performed through a vertical edge detection operator of the sobel operator, a first confidence coefficient corresponding to the horizontal direction is defined according to the number of the acquired pixel points of the vertical edge, and the greater the number of the pixel points of the vertical edge, the greater the first confidence coefficient.
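Steps 902 and 904 can be sketched with hand-rolled 3×3 Sobel convolutions; the edge threshold and the normalization of the counts into [0, 1] confidences are illustrative choices not fixed by the text:

```python
import numpy as np

def edge_confidences(image, edge_threshold=50):
    """Count Sobel edge pixels and turn them into two confidences.

    More vertical-edge pixels -> higher first confidence (horizontal
    phase difference); more horizontal-edge pixels -> higher second
    confidence (vertical phase difference)."""
    a = image.astype(float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])  # vertical edges
    ky = kx.T                                            # horizontal edges

    def conv_valid(img, k):
        # 3x3 'valid' convolution to avoid spurious border responses
        h, w = img.shape
        out = np.zeros((h - 2, w - 2))
        for i in range(3):
            for j in range(3):
                out += k[i, j] * img[i:i + h - 2, j:j + w - 2]
        return out

    n_vert = int((np.abs(conv_valid(a, kx)) > edge_threshold).sum())
    n_horz = int((np.abs(conv_valid(a, ky)) > edge_threshold).sum())
    total = n_vert + n_horz
    if total == 0:
        return 0.0, 0.0
    return n_vert / total, n_horz / total  # (first, second) confidence
```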
In one embodiment, determining the focusing mode of the electronic device according to the first confidence corresponding to the first phase difference value and the second confidence corresponding to the second phase difference value includes: when the larger of the first confidence and the second confidence is greater than a confidence threshold, determining that the focusing mode is phase detection auto focus.
Specifically, the confidence threshold may be set by an engineer according to actual requirements, for example, the confidence threshold is set to a value such as 0.3, 0.5, 0.6, or 0.8, and is not limited herein. And when the larger value of the first confidence coefficient and the second confidence coefficient is the first confidence coefficient, if the first confidence coefficient is larger than the confidence coefficient threshold value, determining that the focusing mode of the electronic equipment is phase detection automatic focusing. And when the larger value of the first confidence coefficient and the second confidence coefficient is the second confidence coefficient, if the second confidence coefficient is larger than the confidence coefficient threshold value, determining that the focusing mode of the electronic equipment is phase detection automatic focusing.
In one embodiment, the method further comprises: and when the first confidence coefficient and the second confidence coefficient are smaller than or equal to the confidence coefficient threshold value, determining that the focusing mode is contrast focusing, and controlling the electronic equipment to perform the contrast focusing.
Specifically, when the first confidence coefficient and the second confidence coefficient are smaller than or equal to the confidence coefficient threshold, identifying that the larger value of the first confidence coefficient and the second confidence coefficient is smaller than or equal to the confidence coefficient threshold, determining that the focusing mode of the electronic equipment is contrast focusing, and controlling the electronic equipment to perform contrast focusing.
In one embodiment, the step of controlling the electronic device to focus according to the target phase difference includes: and acquiring a target out-of-focus distance according to the target phase difference value, and controlling the lens of the electronic equipment to move according to the target out-of-focus distance so as to focus.
Specifically, the target image includes a first phase difference value and a second phase difference value; the target phase difference value is determined according to the target direction, and the target defocus distance can be obtained from the target phase difference value. For example, when the target direction is the first direction, the first phase difference value of the target image is taken as the target phase difference value, and the corresponding target defocus distance value is then obtained from the mapping relationship between phase difference values and defocus distance values. The target defocus distance refers to the distance between the current position of the image sensor and the position where the image sensor should be in the in-focus state; the electronic device can control the lens to move to the in-focus position for focusing according to the obtained target defocus distance.
In one embodiment, the step of obtaining the defocus distance of the target according to the target phase difference value includes: and calculating the target defocus distance according to the calibrated defocus function and the target phase difference value, wherein the calibrated defocus function is used for representing the relation between the target phase difference value and the target defocus distance.
Specifically, the correspondence between the target defocus distance value and the target phase difference value can be expressed as defocus = DCC × PD, where the defocus conversion coefficient (DCC) is obtained by calibration and PD is the target phase difference value. The target defocus distance is calculated from the calibrated defocus function and the target phase difference value, and the lens of the electronic device is controlled to move according to the target defocus distance so as to focus. The calibration process for the correspondence between the phase difference value and the defocus distance value is as follows: divide the effective focusing stroke of the camera module into N (N ≥ 3) equal parts, i.e. with a step of (near-focus DAC − far-focus DAC)/N, so as to cover the focusing range of the motor; focus at each focusing DAC position (the DAC may range from 0 to 1023) and record the phase difference at the current focusing DAC position; after the motor focusing stroke is finished, pair the group of N focusing DACs with the obtained PD values; the N (DAC, PD) pairs yield approximately proportional ratios, and fitting the two-dimensional data consisting of DAC and PD gives a straight line with slope K.
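Under the linear model implied above, the calibration reduces to fitting a line through the recorded (PD, DAC) pairs; its slope plays the role of the conversion coefficient. A sketch with a plain least-squares slope (function names and sample values are illustrative):

```python
def calibrate_dcc(pd_values, dac_positions):
    """Least-squares slope of DAC motion versus phase difference.

    The slope stands in for the defocus conversion coefficient (DCC)
    obtained from the N calibration points."""
    n = len(pd_values)
    mean_pd = sum(pd_values) / n
    mean_dac = sum(dac_positions) / n
    num = sum((p - mean_pd) * (d - mean_dac)
              for p, d in zip(pd_values, dac_positions))
    den = sum((p - mean_pd) ** 2 for p in pd_values)
    return num / den

def defocus_from_pd(pd, dcc):
    """Target defocus (in DAC steps) from the calibrated linear relation."""
    return dcc * pd
```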
In one embodiment, as shown in fig. 10, the step of obtaining the phase difference value of the target detection area image includes: step 1002 and step 1004.
Step 1002, the detection image is segmented into a first segmentation image and a second segmentation image according to a first direction. And acquiring a first phase difference value according to the corresponding phase relation of the first segmentation image and the second segmentation image.
Specifically, the electronic device may perform segmentation processing on the target image in the row direction (x-axis direction in the image coordinate system), and each segmentation line of the segmentation processing is perpendicular to the row direction during the segmentation processing of the target image in the row direction. The first segmentation image and the second segmentation image obtained by performing segmentation processing on the target image along the row direction may be referred to as a left image and a right image, respectively. And acquiring a first phase difference value according to the phase difference of the 'matched pixels' in the left image and the right image.
And 1004, segmenting the detection image into a third segmented image and a fourth segmented image according to the second direction. And obtaining a second phase difference value according to the corresponding phase relation of the third segmentation image and the fourth segmentation image.
Specifically, the electronic device may perform segmentation processing on the target image in the column direction (the y-axis direction in the image coordinate system); during this segmentation, each segmentation line is perpendicular to the column direction. The third sliced image and the fourth sliced image obtained by slicing the target image in the column direction may be referred to as an upper image and a lower image, respectively.
In one embodiment, the first direction is the row direction and the second direction is the column direction. Segmenting the detection image into a first segmentation image and a second segmentation image according to the first direction includes: performing segmentation processing on the detection image according to the first direction to obtain a plurality of image areas, wherein each image area includes one row of pixels in the detection image; obtaining a plurality of first segmentation image areas and a plurality of second segmentation image areas from the plurality of image areas, the first segmentation image areas including the pixels of the even rows in the detection image and the second segmentation image areas including the pixels of the odd rows in the detection image; and splicing the plurality of first segmentation image areas into the first segmentation image, and forming the second segmentation image from the plurality of second segmentation image areas.
Specifically, the first direction is the row direction, and the detection image is segmented according to the first direction to obtain a plurality of image areas, wherein each image area includes one row of pixels in the detection image. A plurality of first segmentation image areas and a plurality of second segmentation image areas are obtained from the plurality of image areas; the first segmentation image areas consist of the pixels of the even rows in the detection image, and the second segmentation image areas consist of the pixels of the odd rows in the detection image. The plurality of first segmentation image areas are spliced in sequence according to their positions in the detection image to obtain the first segmentation image, and the plurality of second segmentation image areas are spliced in sequence according to their positions in the detection image to obtain the second segmentation image.
The step of segmenting the detection image into a third segmentation image and a fourth segmentation image according to a second direction comprises the following steps: and performing segmentation processing on the detection image according to a second direction to obtain a plurality of image areas, wherein each image area comprises a column of pixels in the detection image. And acquiring a plurality of third segmentation image areas and a plurality of fourth segmentation image areas from the plurality of image areas, wherein the third segmentation image areas comprise pixels of even columns in the detection image, and the fourth segmentation image areas comprise pixels of odd columns in the detection image. And splicing the plurality of third segmentation image regions into a third segmentation image, and forming a fourth segmentation image by using the plurality of fourth segmentation image regions.
Specifically, the second direction is the column direction, and the detection image is segmented according to the second direction to obtain a plurality of image areas, wherein each image area includes one column of pixels in the detection image. A plurality of third segmentation image areas and a plurality of fourth segmentation image areas are obtained from the plurality of image areas; the third segmentation image areas consist of the pixels of the even columns in the detection image, and the fourth segmentation image areas consist of the pixels of the odd columns in the detection image. The plurality of third segmentation image areas are spliced in sequence according to their positions in the detection image to obtain the third segmentation image, and the plurality of fourth segmentation image areas are spliced in sequence according to their positions in the detection image to obtain the fourth segmentation image.
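The interleaved slicing in both directions can be expressed in a few lines of NumPy (zero-based indexing is used here, so "even rows" means rows 0, 2, 4, …):

```python
import numpy as np

def slice_rows(detection_image):
    """First direction: even-row image and odd-row image."""
    return detection_image[0::2, :], detection_image[1::2, :]

def slice_cols(detection_image):
    """Second direction: even-column image and odd-column image."""
    return detection_image[:, 0::2], detection_image[:, 1::2]
```

Because NumPy slicing preserves row and column order, the stitched sub-images automatically keep the relative positions they had in the detection image.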
In one embodiment, as shown in fig. 11, the step of obtaining a first phase difference value according to a phase relationship corresponding to the first sliced image and the second sliced image and obtaining a second phase difference value according to a phase relationship corresponding to the third sliced image and the fourth sliced image includes: step 1102 and step 1104.
Step 1102, determining a phase difference value of the mutually matched pixels according to the position difference of the mutually matched pixels in the first segmentation image and the second segmentation image. A first phase difference value is determined from the phase difference values of the mutually matched pixels.
Specifically, when the first sliced image includes pixels in even-numbered lines, the second sliced image includes pixels in odd-numbered lines, and the pixel a in the first sliced image and the pixel b in the second sliced image are matched with each other, the first phase difference value may be determined according to the phase difference between the matched pixel a and pixel b.
And 1104, determining phase difference values of the mutually matched pixels according to the position difference of the mutually matched pixels in the third segmentation image and the fourth segmentation image. And determining a second phase difference value according to the phase difference values of the matched pixels.
Specifically, when the third sliced image includes the pixels of the even columns and the fourth sliced image includes the pixels of the odd columns, and pixel a in the third sliced image and pixel b in the fourth sliced image match each other, the second phase difference value can be determined according to the phase difference between the matched pixel a and pixel b.
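One minimal way to realize "the position difference of mutually matched pixels" is to search for the integer shift that best aligns the two sliced images; the SAD cost, the shift range, and the sign convention are assumptions of this sketch:

```python
import numpy as np

def phase_difference(slice_a, slice_b, max_shift=4):
    """Integer shift of slice_b (along its second axis) that best
    matches slice_a, found by minimizing the mean absolute difference;
    this stands in for the matched-pixel position difference."""
    a = slice_a.astype(float)
    b = slice_b.astype(float)
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        cost = np.abs(a - np.roll(b, s, axis=1)).mean()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift
```

For the row-sliced (upper/lower) pair the search would run along axis 0 instead; sub-pixel refinement around the best integer shift is a common further step.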
It should be understood that although the various steps in the flow charts of fig. 7-11 are shown in sequence as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated herein, the steps are not strictly limited to that order and may be performed in other orders. Moreover, at least some of the steps in fig. 7-11 may include multiple sub-steps or multiple stages that are not necessarily completed at the same moment but may be performed at different moments, and these sub-steps or stages are not necessarily performed in sequence but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
An embodiment of the present application provides a focusing apparatus, which is applied to an electronic device, as shown in fig. 12, the focusing apparatus includes: a first acquisition module 1202, a first determination module 1204, a second acquisition module 1206, a second determination module 1208, and a focus module 1210.
The first obtaining module 1202 is configured to obtain a phase difference value of the detected image corresponding to the first shooting posture, where the phase difference value includes a first phase difference value in a first direction and a second phase difference value in a second direction, and a preset included angle is formed between the first direction and the second direction.
The first shooting attitude may be a reference attitude. The first direction and the second direction form a preset included angle, such as 90°, 60°, 45°, or any other angle; the first direction and the second direction may each be, for example, a vertical, horizontal, or inclined direction. The detection image is an image of a target detection area in the shooting scene obtained by the electronic device in the first shooting attitude; it is the image corresponding to the region of interest, and it may include most of the region of interest or cover the region of interest entirely. The region of interest may be a region including a subject, such as a person, a flower, a cat, a dog, a cow, a blue sky, a white cloud, a background, and the like. The phase difference values include a first phase difference value and a second phase difference value; the first phase difference value refers to the phase difference value corresponding to the first direction, and the second phase difference value refers to the phase difference value corresponding to the second direction. For example, when the first phase difference value is the phase difference value in the horizontal direction, the second phase difference value is the phase difference value in the vertical direction.
Specifically, the process by which the first obtaining module 1202 obtains the phase difference values of the detection image is as follows: the electronic device segments the detection image in the first direction into an upper image and a lower image, and segments it in the second direction into a left image and a right image; the electronic device then calculates the first phase difference value according to the position difference of mutually matched pixels in the upper image and the lower image, and obtains the second phase difference value according to the position difference of mutually matched pixels in the left image and the right image. In the embodiments of the present application, "the position difference of mutually matched pixels" means: the difference between the position of the pixel located in the left image and the position of the matching pixel located in the right image; or the difference between the position of the pixel located in the upper image and the position of the matching pixel located in the lower image. The mutually matched pixels correspond to different images formed on the image sensor by imaging light entering the lens from different directions. For example, pixel a in the left image and pixel B in the right image match each other, where pixel a may correspond to the image formed at the a position in fig. 1 and pixel B may correspond to the image formed at the B position in fig. 1. Since the matched pixels correspond to different images formed by light entering the lens from different directions, the phase difference of the matched pixels can be determined from their position difference.
The first determining module 1204 is configured to determine a focusing manner of the electronic device according to a first confidence degree corresponding to the first phase difference value and a second confidence degree corresponding to the second phase difference value.
The first confidence coefficient is used for representing the accuracy of the first phase difference value, and the second confidence coefficient is used for representing the accuracy of the second phase difference value. The focusing modes include Phase Detection Auto Focus (PDAF), contrast auto focus, and laser focus.
Specifically, the process by which the first determining module 1204 determines the focusing mode of the electronic device according to the first confidence and the second confidence is as follows: when the first confidence and the second confidence satisfy a confidence condition, the focusing mode of the electronic device is determined to be PDAF; when they do not satisfy the confidence condition, the focusing mode of the electronic device is determined to be contrast auto focus or laser focus. The confidence condition is set by an engineer according to actual requirements and may be, for example: the first confidence is greater than a first threshold and/or the second confidence is greater than a second threshold; the larger of the first confidence and the second confidence is greater than a third threshold; the smaller of the first confidence and the second confidence is greater than a fourth threshold; or the difference between the first confidence and the second confidence is greater than a fifth threshold; the condition is not limited herein. The first to fifth thresholds are used for illustration only; they may take the same value or different values and may be set arbitrarily, which is not described again herein.
The second obtaining module 1206 is configured to, when the focusing mode is phase detection auto focus, detect a second shooting attitude of the electronic device and obtain deflection data of the second shooting attitude relative to the first shooting attitude.
Specifically, the second shooting attitude may differ from the first shooting attitude by some degree of angular, height, or depth deflection. The second obtaining module 1206 may use a gyroscope to obtain data corresponding to the first shooting attitude and data corresponding to the second shooting attitude, respectively, and obtain the deflection data by comparing the two. The deflection data may include angular deflection, height offset, and/or depth offset, among others.
The second determining module 1208 is configured to determine the target direction according to the deflection data, the first phase difference value, and the second phase difference value, and obtain a target phase difference corresponding to the target direction in the target image corresponding to the second shooting attitude.
Specifically, the second determining module 1208 determines the target direction according to the deflection data, the first phase difference value, and the second phase difference value as follows: when the deflection data is smaller than or equal to a deviation threshold, the phase difference with the higher confidence among the first phase difference value and the second phase difference value is taken as the target phase difference; when the deflection data is larger than the deviation threshold, the target direction and the target phase difference corresponding to the target direction are obtained according to the deflection data. The target phase difference corresponding to the target direction is the phase difference of the target direction in the target image, where the target image is the image acquired while the electronic device is in the second shooting attitude; the target image may include most of the region of interest or may cover the entire region of interest. The target direction and its corresponding target phase difference are obtained from the deflection data as follows: judge whether the deflection data satisfies a preset deflection condition; when it does, determine one of the first direction and the second direction as the target direction, then acquire the target image in the second shooting attitude and obtain the target phase difference of the target direction in the target image; when the deflection data does not satisfy the preset deflection condition, acquire the target image in the second shooting attitude, obtain the phase differences and confidences of the target image in both directions, and select the phase difference value corresponding to the larger confidence as the target phase difference.
The preset deflection condition may be a threshold or range on the angular deviation, the height offset, and the like; the specific values may be set by engineers according to actual requirements and are not limited herein.
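Combining the deviation-threshold check above with the angle-range rule described in the later embodiments, the selection of the target phase difference might be sketched like this. All names, threshold values, and angle ranges are hypothetical assumptions; the patent fixes only the structure of the decision, not the numbers.

```python
DEVIATION_THRESHOLD = 5.0        # hypothetical deviation threshold (degrees)
FIRST_ANGLE_RANGE = (5.0, 45.0)  # hypothetical first angle range; larger
                                 # deflections fall in the second angle range

def select_target_phase_difference(deflection_angle, pd_first, conf_first,
                                   pd_second, conf_second):
    """Pick the target phase difference from the first-direction and
    second-direction phase differences, per the decision described above."""
    if deflection_angle <= DEVIATION_THRESHOLD:
        # Small deflection: keep the more trustworthy phase difference.
        return pd_first if conf_first >= conf_second else pd_second
    if FIRST_ANGLE_RANGE[0] < deflection_angle <= FIRST_ANGLE_RANGE[1]:
        # First angle range: direction of the larger confidence.
        return pd_first if conf_first >= conf_second else pd_second
    # Second angle range: direction of the smaller confidence.
    return pd_second if conf_first >= conf_second else pd_first
```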
And a focusing module 1210 for controlling the electronic device to focus according to the target phase difference.
Specifically, since the correspondence between the target phase difference value and the target defocus distance value can be obtained by calibration, the target defocus distance can be derived from the target phase difference through this calibrated relationship. The focusing module 1210 then controls the lens to focus on the target subject according to the target defocus distance value. For a shooting scene with horizontal texture, the PD pixel pairs in the horizontal direction cannot yield a reliable phase difference value, but the phase difference value in the vertical direction can still be calculated; the defocus distance value is computed from the vertical phase difference value, and the lens is then moved according to that defocus distance value to achieve focusing. Likewise, for a shooting scene with vertical texture, the PD pixel pairs in the vertical direction cannot yield a reliable phase difference value, but the phase difference value in the horizontal direction can still be calculated; the defocus distance value is computed from the horizontal phase difference value, and the lens is then moved according to that defocus distance value to achieve focusing.
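The calibrated relationship between phase difference and defocus distance is often modelled as approximately linear, though the patent requires only that some calibrated correspondence exist. A sketch under that linear assumption (all names and values are illustrative; slope and intercept would come from per-module calibration):

```python
def target_defocus_distance(target_phase_difference: float,
                            slope: float, intercept: float = 0.0) -> float:
    """Calibrated defocus function: map the target phase difference to a
    defocus distance. A linear model is assumed here for illustration."""
    return slope * target_phase_difference + intercept

def lens_target_position(current_position: float, defocus: float) -> float:
    """Move the lens by the defocus distance to focus (the sign
    convention is itself a calibration choice)."""
    return current_position + defocus
```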
The focusing apparatus obtains the phase difference values of a detection image corresponding to a first shooting attitude, where the phase difference values include a first phase difference value in a first direction and a second phase difference value in a second direction, and determines the focusing mode of the electronic device according to a first confidence corresponding to the first phase difference value and a second confidence corresponding to the second phase difference value. When the focusing mode is phase detection auto focus, it detects a second shooting attitude of the electronic device and obtains deflection data of the second shooting attitude relative to the first shooting attitude. It then determines a target direction according to the deflection data, the first phase difference value, and the second phase difference value, obtains the target phase difference corresponding to the target direction in the target image corresponding to the second shooting attitude, and controls the electronic device to focus according to the target phase difference. Because the target direction is determined from the deflection data of the electronic device and the confidences of the two phase differences, and the target phase difference of that direction is taken from the target image of the second shooting attitude, the phase difference values in both directions do not need to be recalculated after the electronic device is deflected. This reduces the computational complexity of the focusing algorithm and improves the focusing speed of the electronic device during angular deflection.
In one embodiment, the deflection data at least includes a deflection angle, and the second determining module is configured to determine the target direction from the first direction and the second direction according to the deflection angle, a first confidence degree corresponding to the first phase difference value, and a second confidence degree corresponding to the second phase difference value.
In one embodiment, the second determining module is configured to select, as the target direction, a direction of a phase difference value corresponding to a greater confidence degree from the first confidence degree and the second confidence degree when the deflection angle is within a first angle range; and when the deflection angle is in a second angle range, selecting the direction of the phase difference value corresponding to the smaller confidence coefficient from the first confidence coefficient and the second confidence coefficient as the target direction.
In one embodiment, the focusing apparatus includes a confidence obtaining module configured to obtain the first confidence and the second confidence according to scene information of the detected image.
In one embodiment, the scene information includes: an image edge, the image edge comprising: horizontal edges and vertical edges; the first direction is a row direction and the second direction is a column direction. The confidence coefficient acquisition module is used for carrying out horizontal edge detection on the detection image to acquire the number of pixel points of the horizontal edge, and determining the second confidence coefficient according to the number of the pixel points of the horizontal edge; and carrying out vertical edge detection on the detection image to obtain the number of pixel points of the vertical edge, and determining the first confidence according to the number of the pixel points of the vertical edge.
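A toy version of this edge-count-based confidence estimate, using a simple neighbour-difference test in place of a real horizontal/vertical edge detector (e.g. Sobel); the threshold value and the normalisation by image size are illustrative assumptions:

```python
def edge_confidences(image, threshold=30):
    """image: 2-D list of grayscale values. Returns (first_confidence,
    second_confidence). Per the embodiment above, the vertical-edge pixel
    count drives the first (row-direction) confidence and the
    horizontal-edge pixel count drives the second (column-direction) one."""
    rows, cols = len(image), len(image[0])
    # A horizontal edge appears as a large change between vertically
    # adjacent pixels; a vertical edge as a change between horizontal
    # neighbours.
    horizontal_edge_pixels = sum(
        1 for r in range(rows - 1) for c in range(cols)
        if abs(image[r + 1][c] - image[r][c]) > threshold)
    vertical_edge_pixels = sum(
        1 for r in range(rows) for c in range(cols - 1)
        if abs(image[r][c + 1] - image[r][c]) > threshold)
    total = rows * cols
    return vertical_edge_pixels / total, horizontal_edge_pixels / total
```

For an image containing only a vertical boundary, the first confidence comes out high and the second low, matching the intuition that row-direction phase detection works well on vertical texture.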
In one embodiment, the first determining module is configured to determine that the focusing manner is phase detection auto-focusing when a larger value of the first confidence level and the second confidence level is greater than a confidence threshold.
In one embodiment, the first determining module is further configured to, when neither the first confidence nor the second confidence is greater than the confidence threshold, determine that the focusing mode is contrast focusing and control the electronic device to perform contrast focusing.
In one embodiment, the focusing module is configured to obtain a target defocus distance according to the target phase difference value; and controlling the lens of the electronic equipment to move according to the target defocusing distance so as to focus.
In one embodiment, the focusing module is configured to calculate the target defocus distance according to a calibrated defocus function and the target phase difference value, where the calibrated defocus function is used to characterize a relationship between the target phase difference value and the target defocus distance.
In one embodiment, the first obtaining module is configured to segment the detection image into a first segmented image and a second segmented image according to the first direction; acquiring the first phase difference value according to the phase relation corresponding to the first segmentation image and the second segmentation image; segmenting the detection image into a third segmentation image and a fourth segmentation image according to the second direction; and obtaining the second phase difference value according to the corresponding phase relation of the third segmentation image and the fourth segmentation image.
In one embodiment, the first obtaining module is configured to perform segmentation processing on the detection image according to the first direction to obtain a plurality of image regions, where each image region includes a row of pixels in the detection image; obtaining a plurality of first split image areas and a plurality of second split image areas from the plurality of image areas, wherein the first split image areas comprise pixels of even lines in the detection image, and the second split image areas comprise pixels of odd lines in the detection image; splicing the plurality of first split image areas into the first split image, and forming the second split image by using the plurality of second split image areas; performing segmentation processing on the detection image according to the second direction to obtain a plurality of image areas, wherein each image area comprises a column of pixels in the detection image; acquiring a plurality of third split image areas and a plurality of fourth split image areas from the plurality of image areas, wherein the third split image areas comprise pixels of even columns in the detection image, and the fourth split image areas comprise pixels of odd columns in the detection image; and splicing the plurality of third segmentation image regions into a third segmentation image, and forming a fourth segmentation image by using the plurality of fourth segmentation image regions.
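The even/odd-row split for the first direction can be sketched as below. Whether "even" rows are counted from 0 or 1 is an assumption here, since the patent does not fix the indexing; the same routine applied to the transposed image yields the column-direction third and fourth slice images.

```python
def split_rows(detection_image):
    """Split the detection image into the first slice image (even rows)
    and the second slice image (odd rows); rows are 0-indexed here."""
    first_slice = [row for i, row in enumerate(detection_image) if i % 2 == 0]
    second_slice = [row for i, row in enumerate(detection_image) if i % 2 == 1]
    return first_slice, second_slice

def split_columns(detection_image):
    """Column-direction split: transpose, split rows, transpose back,
    yielding the third and fourth slice images."""
    transposed = [list(col) for col in zip(*detection_image)]
    third_t, fourth_t = split_rows(transposed)
    third = [list(row) for row in zip(*third_t)]
    fourth = [list(row) for row in zip(*fourth_t)]
    return third, fourth
```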
In one embodiment, the first obtaining module is configured to determine a phase difference value of mutually matched pixels according to a position difference of the mutually matched pixels in the first segmentation image and the second segmentation image; determining a first phase difference value according to the phase difference values of the mutually matched pixels; determining the phase difference value of the mutually matched pixels according to the position difference of the mutually matched pixels in the third segmentation image and the fourth segmentation image; and determining a second phase difference value according to the phase difference values of the matched pixels.
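Finding the "position difference of mutually matched pixels" is essentially a one-dimensional alignment search between corresponding lines of the two slice images. A minimal sum-of-absolute-differences sketch follows; the search range and the matching cost are illustrative assumptions, not the patent's prescribed method.

```python
def best_shift(line_a, line_b, max_shift=4):
    """Return the integer shift that best aligns line_b with line_a --
    i.e. the position difference of mutually matched pixels -- by
    minimising the mean absolute difference over the overlap."""
    def mean_abs_diff(shift):
        pairs = [(line_a[i], line_b[i + shift])
                 for i in range(len(line_a))
                 if 0 <= i + shift < len(line_b)]
        return sum(abs(a - b) for a, b in pairs) / max(len(pairs), 1)
    return min(range(-max_shift, max_shift + 1), key=mean_abs_diff)
```

The first phase difference value would then be derived from the shifts measured across many matched line pairs of the first and second slice images (for example, by averaging), and likewise the second phase difference value from the third and fourth slice images.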
The division of the modules in the focusing device is only used for illustration, and in other embodiments, the focusing device may be divided into different modules as needed to complete all or part of the functions of the focusing device.
For the specific definition of the focusing apparatus, reference may be made to the definition of the focusing method above, which is not repeated here. The modules in the focusing apparatus may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in or independent of a processor in the electronic device in hardware form, or stored in a memory of the electronic device in software form, so that the processor can invoke and execute the operations corresponding to each module.
An electronic device includes a memory and a processor. The memory stores a computer program that, when executed by the processor, causes the processor to perform the steps of the focusing method.
In one embodiment of the present application, an electronic device is provided, which may be an electronic device having a digital image capturing function, for example, a smart phone, a tablet computer, a camera, a video camera, or the like. The internal structure thereof may be as shown in fig. 13. The electronic device includes a processor and a memory connected by a system bus. Wherein the processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic equipment comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium may store an operating system and a computer program. The internal memory provides an environment for the operating system and the computer program to run on the non-volatile storage medium. The computer program is executed by a processor to implement a focusing method provided by the embodiments of the present application.
In addition, although not shown in fig. 13, the electronic device may further include a lens and an image sensor. The lens may be composed of a group of lens elements, and the image sensor may be a complementary metal-oxide-semiconductor (CMOS) image sensor, a charge-coupled device (CCD), a quantum thin-film sensor, an organic sensor, or the like. The image sensor may be connected to the processor through a bus, and the processor may implement a focusing method provided by the embodiments of the present application according to the signals output to it by the image sensor.
Those skilled in the art will appreciate that the structure shown in fig. 13 is a block diagram of only a portion of the structure relevant to the present application and does not limit the electronic device to which the present application is applied; a particular electronic device may include more or fewer components than shown in the drawings, combine certain components, or have a different arrangement of components.
In one embodiment of the present application, an electronic device is provided, which includes a memory and a processor. The memory stores a computer program, and the processor implements the following steps when executing the computer program: obtaining the phase difference values of a detection image corresponding to a first shooting attitude, where the phase difference values include a first phase difference value in a first direction and a second phase difference value in a second direction, and a preset included angle is formed between the first direction and the second direction; determining the focusing mode of the electronic device according to a first confidence corresponding to the first phase difference value and a second confidence corresponding to the second phase difference value; when the focusing mode is phase detection auto focus, detecting a second shooting attitude of the electronic device and obtaining deflection data of the second shooting attitude relative to the first shooting attitude; determining a target direction according to the deflection data, the first phase difference value, and the second phase difference value, and obtaining the target phase difference corresponding to the target direction in the target image corresponding to the second shooting attitude; and controlling the electronic device to focus according to the target phase difference.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the steps of the focusing method described above are implemented. The implementation principle and technical effect of the computer-readable storage medium provided by this embodiment are similar to those of the method embodiment described above and are not repeated here.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above can be implemented by a computer program instructing the relevant hardware. The program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus DRAM (RDRAM), and direct Rambus DRAM (DRDRAM).
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and improvements can be made without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (15)

1. A focusing method applied to an electronic device, wherein the electronic device comprises an image sensor, the image sensor comprises a plurality of pixel groups arranged in an array, and each pixel group comprises M x N pixels arranged in an array; each pixel corresponds to a photosensitive unit, wherein M and N are both natural numbers greater than or equal to 2, and the method comprises:
obtaining a phase difference value of a detection image corresponding to a first shooting attitude, wherein the phase difference value comprises a first phase difference value in a first direction and a second phase difference value in a second direction, and a preset included angle is formed between the first direction and the second direction;
determining a focusing mode of the electronic equipment according to a first confidence coefficient corresponding to the first phase difference value and a second confidence coefficient corresponding to the second phase difference value;
when the focusing mode is phase detection automatic focusing, detecting a second shooting gesture of the electronic equipment, and acquiring deflection data of the second shooting gesture relative to the first shooting gesture;
determining a target direction to which the phase difference value belongs according to the deflection data, the first phase difference value and the second phase difference value, and acquiring a target phase difference corresponding to the target direction in a target image corresponding to the second shooting attitude;
and controlling the electronic equipment to focus according to the target phase difference.
2. The method of claim 1, wherein the deflection data comprises at least a deflection angle, and wherein determining a target direction from the deflection data, the first phase difference value, and the second phase difference value comprises:
and determining the target direction from the first direction and the second direction according to the deflection angle, a first confidence degree corresponding to the first phase difference value and a second confidence degree corresponding to the second phase difference value.
3. The method of claim 2, wherein determining the target direction from the first direction and the second direction based on the deflection angle, a first confidence level corresponding to the first phase difference value, and a second confidence level corresponding to the second phase difference value comprises:
when the deflection angle is within a first angle range, selecting a direction of a phase difference value corresponding to a larger confidence coefficient from the first confidence coefficient and the second confidence coefficient as the target direction;
and when the deflection angle is in a second angle range, selecting the direction of the phase difference value corresponding to the smaller confidence coefficient from the first confidence coefficient and the second confidence coefficient as the target direction.
4. The method of claim 1, wherein before determining the focusing mode of the electronic device according to the first confidence level corresponding to the first phase difference value and the second confidence level corresponding to the second phase difference value, the method further comprises: and acquiring the first confidence coefficient and the second confidence coefficient according to scene information of the detection image.
5. The method of claim 4, wherein the context information comprises: an image edge, the image edge comprising: horizontal edges and vertical edges; the obtaining the first confidence degree and the second confidence degree according to the scene information of the detection image includes:
performing horizontal edge detection on the detection image to obtain the number of pixel points of the horizontal edge, and determining the second confidence according to the number of the pixel points of the horizontal edge;
and carrying out vertical edge detection on the detection image to obtain the number of pixel points of the vertical edge, and determining the first confidence according to the number of the pixel points of the vertical edge.
6. The method of claim 1, wherein determining the focusing mode of the electronic device according to a first confidence corresponding to the first phase difference value and a second confidence corresponding to the second phase difference value comprises:
when the larger value of the first confidence coefficient and the second confidence coefficient is larger than a confidence coefficient threshold value, determining that the focusing mode is phase detection automatic focusing.
7. The method of claim 6, further comprising: when neither the first confidence nor the second confidence is greater than the confidence threshold, determining that the focusing mode is contrast focusing, and controlling the electronic device to perform contrast focusing.
8. The method of claim 1, wherein the controlling the electronic device to focus according to the target phase difference comprises:
acquiring a target defocusing distance according to the target phase difference value;
and controlling the lens of the electronic equipment to move according to the target defocus distance so as to focus.
9. The method of claim 8, wherein obtaining the target defocus distance according to the target phase difference value comprises:
and calculating the target defocus distance according to the calibrated defocus function and the target phase difference value, wherein the calibrated defocus function is used for representing the relation between the target phase difference value and the target defocus distance.
10. The method according to claim 1, wherein the obtaining a phase difference value of the detection image corresponding to the first shooting posture comprises:
segmenting the detection image into a first segmentation image and a second segmentation image according to the first direction; acquiring the first phase difference value according to the corresponding phase relation of the first segmentation image and the second segmentation image;
segmenting the detection image into a third segmentation image and a fourth segmentation image according to the second direction; and obtaining the second phase difference value according to the corresponding phase relation of the third segmentation image and the fourth segmentation image.
11. The method of claim 10, wherein the first direction is a row direction and the second direction is a column direction;
the segmenting the detection image into a first segmentation image and a second segmentation image according to the first direction includes:
segmenting the detection image according to the first direction to obtain a plurality of image areas, wherein each image area comprises a line of pixels in the detection image; obtaining a plurality of first split image areas and a plurality of second split image areas from the plurality of image areas, wherein the first split image areas comprise pixels of even lines in the detection image, and the second split image areas comprise pixels of odd lines in the detection image; splicing the plurality of first segmentation image regions into the first segmentation image, and forming the second segmentation image by using the plurality of second segmentation image regions;
the segmenting the detection image into a third segmented image and a fourth segmented image according to the second direction includes:
performing segmentation processing on the detection image according to the second direction to obtain a plurality of image areas, wherein each image area comprises a column of pixels in the detection image; obtaining a plurality of third split image areas and a plurality of fourth split image areas from the plurality of image areas, wherein the third split image areas comprise pixels in even columns of the detection image, and the fourth split image areas comprise pixels in odd columns of the detection image; and splicing the plurality of third segmentation image regions into a third segmentation image, and forming a fourth segmentation image by using the plurality of fourth segmentation image regions.
12. The method of claim 11, wherein obtaining the first phase difference value according to the corresponding phase relationship between the first slice image and the second slice image comprises:
determining the phase difference value of the mutually matched pixels according to the position difference of the mutually matched pixels in the first segmentation image and the second segmentation image; determining a first phase difference value according to the phase difference values of the mutually matched pixels;
the obtaining the second phase difference value according to the phase relationship corresponding to the third segmentation image and the fourth segmentation image includes:
determining the phase difference value of the mutually matched pixels according to the position difference of the mutually matched pixels in the third segmentation image and the fourth segmentation image; and determining a second phase difference value according to the phase difference values of the mutually matched pixels.
13. A focusing apparatus applied to an electronic device, wherein the electronic device comprises an image sensor, the image sensor comprises a plurality of pixel groups arranged in an array, and each pixel group comprises M x N pixels arranged in an array; each pixel corresponds to a photosensitive unit, wherein M and N are both natural numbers greater than or equal to 2, and the focusing apparatus comprises:
the first acquisition module is used for acquiring a phase difference value of a detection image corresponding to a first shooting posture, wherein the phase difference value comprises a first phase difference value in a first direction and a second phase difference value in a second direction, and a preset included angle is formed between the first direction and the second direction;
the first determining module is configured to determine a focusing mode of the electronic device according to a first confidence coefficient corresponding to the first phase difference value and a second confidence coefficient corresponding to the second phase difference value;
the second acquisition module is used for detecting a second shooting gesture of the electronic equipment and acquiring deflection data of the second shooting gesture relative to the first shooting gesture when the focusing mode is phase detection automatic focusing;
the second determining module is used for determining a target direction to which the phase difference value belongs according to the deflection data, the first phase difference value and the second phase difference value, and acquiring a target phase difference corresponding to the target direction in a target image corresponding to the second shooting attitude;
and the focusing module is used for controlling the electronic equipment to focus according to the target phase difference.
14. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the focusing method as claimed in any one of claims 1 to 12.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 12.
CN201911102669.0A 2019-11-12 2019-11-12 Focusing method and device, electronic equipment and computer readable storage medium Active CN112866551B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911102669.0A CN112866551B (en) 2019-11-12 2019-11-12 Focusing method and device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911102669.0A CN112866551B (en) 2019-11-12 2019-11-12 Focusing method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN112866551A CN112866551A (en) 2021-05-28
CN112866551B true CN112866551B (en) 2022-06-14

Family

ID=75984493

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911102669.0A Active CN112866551B (en) 2019-11-12 2019-11-12 Focusing method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112866551B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109981965A (en) * 2017-12-27 2019-07-05 华为技术有限公司 The method and electronic equipment of focusing

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5387856B2 (en) * 2010-02-16 2014-01-15 ソニー株式会社 Image processing apparatus, image processing method, image processing program, and imaging apparatus
JP5740054B2 (en) * 2012-07-06 2015-06-24 富士フイルム株式会社 Imaging apparatus and image processing method
CN105007425B (en) * 2015-07-23 2018-07-06 广东欧珀移动通信有限公司 A kind of contrast formula focusing method and mobile terminal
CN109477948B (en) * 2016-07-06 2021-02-19 富士胶片株式会社 Focus control device, focus control method, focus control program, lens device, and imaging device
JP2018097253A (en) * 2016-12-15 2018-06-21 オリンパス株式会社 Focus adjustment unit and focus adjustment method
CN106973206B (en) * 2017-04-28 2020-06-05 Oppo广东移动通信有限公司 Camera shooting module group camera shooting processing method and device and terminal equipment
US20190033555A1 (en) * 2017-07-28 2019-01-31 Qualcomm Incorporated Phase detection autofocus with diagonal line detection
KR20190042353A (en) * 2017-10-16 2019-04-24 엘지전자 주식회사 Mobile terminal and method for controlling the same

Also Published As

Publication number Publication date
CN112866551A (en) 2021-05-28

Similar Documents

Publication Publication Date Title
US10043290B2 (en) Image processing to enhance distance calculation accuracy
US10015472B2 (en) Image processing using distance information
JP5825817B2 (en) Solid-state imaging device and imaging apparatus
US10659766B2 (en) Confidence generation apparatus, confidence generation method, and imaging apparatus
CN112866542B (en) Focus tracking method and apparatus, electronic device, and computer-readable storage medium
CN112866549B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112866675B (en) Depth map generation method and device, electronic equipment and computer-readable storage medium
CN114424516A (en) Image processing apparatus, image processing method, imaging apparatus, and program
CN112866655B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112866511B (en) Imaging assembly, focusing method and device and electronic equipment
JP6353233B2 (en) Image processing apparatus, imaging apparatus, and image processing method
JP2020021126A (en) Image processing device and control method thereof, distance detection device, imaging device, program
CN112866510B (en) Focusing method and device, electronic equipment and computer readable storage medium
US9791599B2 (en) Image processing method and imaging device
CN112866551B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866548B (en) Phase difference acquisition method and device and electronic equipment
CN112866546B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112862880B (en) Depth information acquisition method, device, electronic equipment and storage medium
CN112866545B (en) Focusing control method and device, electronic equipment and computer readable storage medium
CN112866552B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866547B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866554B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866543B (en) Focusing control method and device, electronic equipment and computer readable storage medium
CN112866544B (en) Phase difference acquisition method, device, equipment and storage medium
CN112866674B (en) Depth map acquisition method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant