CN106570028B - Mobile terminal and method and device for deleting blurred image - Google Patents
- Publication number
- CN106570028B (application number CN201510655461.7A)
- Authority
- CN
- China
- Prior art keywords
- image
- images
- blurred
- deleting
- definition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/10—File systems; File servers
- G06F16/17—Details of further file system functions
- G06F16/174—Redundancy elimination performed by the file system
Abstract
The invention discloses a method and a device for deleting a blurred image. The method comprises the following steps: acquiring a plurality of original images; converting the plurality of original images into a plurality of grayscale images; performing sharpness calculation on the grayscale images respectively to obtain a plurality of sharpness values corresponding to the original images; and acquiring the blurred images among the plurality of original images according to the sharpness values and deleting the blurred images. The method of the embodiment of the invention achieves automatic and rapid deletion of blurred images and simplifies user operation. Moreover, the sharpness evaluation algorithm makes the degree of blurring of the captured images easy to distinguish, which helps the user quickly pick the higher-quality images out of a group of similar images and improves the user experience. The invention also discloses a mobile terminal.
Description
Technical Field
The invention relates to the technical field of mobile terminals, in particular to a mobile terminal and a method and a device for deleting a blurred image.
Background
With the rapid development of mobile terminal technology and the diversity of applications in mobile terminals, mobile terminals have become essential tools in people's lives. Owing to their portability, more and more users use the camera function of a mobile terminal to take self-portraits or continuous shots. After a user completes a group of continuous shots with a mobile terminal (e.g., a smartphone), some of the captured images may be blurred for various reasons, such as hand trembling.
In the related art, for blurred images in a group of continuously shot images, the user generally views the captured images through the camera of the mobile terminal or the entry of a gallery application, judges by eye whether each image is blurred, and then either deletes the blurred images one by one, or selects all of the blurred images one by one and deletes them together.
However, this approach has problems: the user must check, delete, or select the images one by one, which requires many manual operations and is both troublesome and time-consuming. In particular, for a user who finds it hard to choose among similar options, accepting or rejecting images within a group of images of the same type is difficult and painful, so the user experience is poor.
Disclosure of Invention
The present invention has been made to solve, at least to some extent, one of the technical problems of the related art. To this end, a first object of the present invention is to propose a method for deleting a blurred image. The method achieves automatic and rapid deletion of blurred images and simplifies user operation; through a sharpness evaluation algorithm, the degree of blurring of the captured images can be easily distinguished, which helps the user quickly pick the higher-quality images out of a group of similar images and improves the user experience.
A second object of the present invention is to provide a blurred image deleting apparatus.
A third object of the present invention is to provide a mobile terminal.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides a method for deleting a blurred image, including the following steps: acquiring a plurality of original images, and converting the plurality of original images into a plurality of gray level images; respectively carrying out definition calculation on the gray level images to obtain a plurality of definition values corresponding to the original images; and acquiring blurred images in the plurality of original images according to the plurality of definition values, and deleting the blurred images.
According to the method for deleting a blurred image of the embodiment of the invention, a plurality of original images are first acquired and converted into grayscale images; sharpness calculation is then performed on each grayscale image to obtain the corresponding sharpness values; finally, the blurred images among the original images are identified according to the sharpness values and deleted. In other words, the sharpness of the images is judged automatically by a sharpness evaluation algorithm, the images meeting the blur criterion are identified from the sharpness value of each image, and those images are then deleted automatically. On the one hand, this achieves automatic and rapid deletion of blurred images and simplifies user operation; on the other hand, the sharpness evaluation algorithm makes the degree of blurring of the captured images easy to distinguish, which helps the user quickly pick the higher-quality images out of a group of similar images and improves the user experience.
In order to achieve the above object, a second embodiment of the present invention provides a blurred image deleting device, including: the original image acquisition module is used for acquiring a plurality of original images; the gray level image conversion module is used for converting the original images into gray level images; the definition calculating module is used for respectively calculating the definitions of the gray images to obtain a plurality of definition values corresponding to the original images; the blurred image acquisition module is used for acquiring blurred images in the plurality of original images according to the plurality of definition values; and a deleting module for deleting the blurred image.
According to the device for deleting a blurred image of the embodiment of the invention, the original image acquisition module acquires a plurality of original images, the grayscale image conversion module converts the original images into grayscale images, the sharpness calculation module performs sharpness calculation on each grayscale image to obtain the corresponding sharpness values, the blurred image acquisition module identifies the blurred images among the original images according to the sharpness values, and the deletion module deletes the blurred images. In other words, the sharpness of the images is judged automatically by a sharpness evaluation algorithm, the images meeting the blur criterion are identified from the sharpness value of each image, and those images are then deleted automatically. On the one hand, this achieves automatic and rapid deletion of blurred images and simplifies user operation; on the other hand, the sharpness evaluation algorithm makes the degree of blurring of the captured images easy to distinguish, which helps the user quickly pick the higher-quality images out of a group of similar images and improves the user experience.
In order to achieve the above object, a third embodiment of the present invention provides a mobile terminal, including the blurred image deleting apparatus according to the second embodiment of the present invention.
According to the mobile terminal of the embodiment of the invention, the original image acquisition module in the deleting apparatus acquires a plurality of original images, the grayscale image conversion module converts the original images into grayscale images, the sharpness calculation module performs sharpness calculation on each grayscale image to obtain the corresponding sharpness values, the blurred image acquisition module identifies the blurred images among the original images according to the sharpness values, and the deletion module deletes the blurred images. In other words, the sharpness of the images is judged automatically by a sharpness evaluation algorithm, the images meeting the blur criterion are identified from the sharpness value of each image, and those images are then deleted automatically. On the one hand, this achieves automatic and rapid deletion of blurred images and simplifies user operation; on the other hand, the sharpness evaluation algorithm makes the degree of blurring of the captured images easy to distinguish, which helps the user quickly pick the higher-quality images out of a group of similar images and improves the user experience.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which,
FIG. 1 is a flow diagram of a method of deletion of a blurred image according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an edge region of a picture selected by an edge-based gradient evaluation algorithm according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a gradient evaluation algorithm selecting a lattice of edge regions in accordance with one embodiment of the present invention;
FIG. 4 is a flow diagram of a method of blurred image deletion according to another embodiment of the present invention;
FIG. 5 is a schematic diagram of a set entry setup path of a delete blurred image switch, according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a set entry setting path of a delete blurred image switch according to another embodiment of the present invention;
FIG. 7 is a diagram of an upper interface of a mobile terminal prompting a user to set a query dialog, in accordance with one embodiment of the present invention;
fig. 8 is a block diagram of a structure of a blurred image deletion apparatus according to an embodiment of the present invention;
fig. 9 is a block diagram showing the construction of a blurred image deletion apparatus according to another embodiment of the present invention;
fig. 10 is a block diagram of a structure of a blurred image deletion apparatus according to still another embodiment of the present invention;
FIG. 11 is a general flowchart of an implementation of an auto-delete image function of a mobile terminal in accordance with a specific embodiment of the present invention; and
fig. 12 is a diagram of a waiting prompt box shown on the upper-layer interface of a mobile terminal while smart selection is in progress, according to an embodiment of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements, or to elements having the same or similar functions, throughout. The embodiments described below with reference to the drawings are exemplary, are intended to explain the invention, and are not to be construed as limiting the invention.
A mobile terminal and a blurred image deletion method and apparatus according to an embodiment of the present invention are described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a method of deleting a blurred image according to an embodiment of the present invention.
As shown in fig. 1, the method for deleting the blurred image includes:
s101, acquiring a plurality of original images, and converting the plurality of original images into a plurality of gray level images.
In the embodiment of the present invention, the plurality of original images may be a group of images captured by a user using a mobile terminal such as a smart phone through continuous shooting or self-shooting, for example, a group of images for the same scene, object, or person. It is understood that the image photographed by the mobile terminal is generally a color image.
Specifically, after a group of shot images is completed through a continuous shooting function or a self-shooting function provided by the mobile terminal, or when it is detected that a user opens and views a group of continuous shooting images or a group of self-shooting images through a gallery application program in the mobile terminal, original images of the images can be acquired, and then the original images can be converted into corresponding grayscale images through grayscale conversion.
It is to be appreciated that, in the embodiments of the present invention, converting the original images into grayscale images facilitates the subsequent sharpness calculation, since the various sharpness evaluation functions all operate on the gray values of the image. In the embodiment of the present invention, for the RGB model, the original image can be converted into the corresponding grayscale image by the following formula (1):
where H is the gray value of the grayscale image, and r (red), g (green), and b (blue) respectively represent the color values of the red, green, and blue channels in the original image.
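The conversion step can be sketched as follows. The text of formula (1) is not reproduced in this document (it appears only as an image in the source), so this sketch assumes the standard ITU-R BT.601 luminance weights (0.299, 0.587, 0.114); the function name and the nested-list image representation are illustrative, and the actual coefficients used by the patent may differ.

```python
def to_grayscale(pixels):
    """Convert an RGB image, given as a list of rows of (r, g, b) tuples,
    into a grayscale image (list of rows of integer gray values H).

    Assumes the standard BT.601 luma weights; the patent's formula (1)
    is not reproduced in the text, so the exact coefficients may differ."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in pixels]
```

For example, a pure-white pixel maps to gray value 255 and a pure-black pixel to 0, so subsequent sharpness functions only ever see a single channel per pixel.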
S102, respectively carrying out definition calculation on the multiple gray level images to obtain multiple definition values corresponding to the multiple original images.
Specifically, the sharpness calculation may be performed on each gray image by a sharpness evaluation algorithm to obtain a corresponding sharpness value. In the embodiment of the present invention, the sharpness evaluation algorithm may include, but is not limited to, a contrast method (i.e., a gray difference method), an edge-based gradient evaluation algorithm, an image gray entropy function, and the like.
The following describes in detail the calculation process of the image sharpness in the present embodiment by taking a contrast method (i.e., a grayscale difference method) and an edge-based gradient evaluation algorithm as examples.
For example, taking a contrast method (i.e., a grayscale difference method) as an example, in the embodiment of the present invention, the specific implementation process of respectively performing sharpness calculation on a plurality of grayscale images to obtain sharpness values corresponding to a plurality of original images may include the following steps: for each gray image, the gray value of each pixel point in each gray image can be acquired, the contrast evaluation value corresponding to each gray image is calculated according to the preset contrast evaluation model and the gray value of each pixel point, and the contrast evaluation value is used as the definition value of each original image. In an embodiment of the present invention, the preset contrast evaluation model (i.e., the contrast evaluation function) may be represented by the following formula (2):
where E is the contrast evaluation value, m is the total number of pixels of the grayscale image, H(n+1) is the gray value of the (n+1)-th pixel point, and H(n-1) is the gray value of the (n-1)-th pixel point.
That is, the gray value of each pixel in each gray image may be obtained first, then the gray values may be substituted into the above equation (2), and the contrast evaluation value corresponding to each gray image may be calculated, and finally the contrast evaluation value may be used as the sharpness value corresponding to the original image.
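The contrast (gray-difference) evaluation can be sketched as below. Formula (2) itself appears only as an image in the source, so this sketch assumes a common gray-difference variant that sums |H(n+1) − H(n−1)| over the pixels taken in raster-scan order; the exact form the patent uses may differ.

```python
def contrast_evaluation(gray):
    """Gray-difference contrast score of a grayscale image (list of rows).

    Assumed variant of formula (2): flatten the image to raster-scan order
    and sum |H(n+1) - H(n-1)| over all interior positions n."""
    h = [v for row in gray for v in row]  # pixels in raster-scan order
    return sum(abs(h[n + 1] - h[n - 1]) for n in range(1, len(h) - 1))
```

A perfectly flat image scores 0, while strong local gray-value changes (strong contrast, i.e. good focus) drive the score up, which matches the in-focus behavior described below.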
It can be understood that an image captured in correct focus is sharp, while an image captured out of focus is blurred. By analyzing correctly focused (sharp) images and out-of-focus (blurred) images, the following rule can be observed: when the image is correctly focused, the contrast of the captured image is strongest, and the further the capture deviates from the correct focus position, the lower the contrast of the captured image. Based on this rule, it can be seen from formula (2) that the contrast evaluation value E is largest at the in-focus position, decreases as the defocus amount increases on either side of that position, and tends to 0 when the defocus offset is very large.
For another example, taking an edge-based gradient evaluation algorithm as an example, in the embodiment of the present invention, a specific implementation process of respectively performing sharpness calculation on a plurality of gray-scale images to obtain sharpness values corresponding to a plurality of original images may include the following steps: selecting a plurality of edge regions in each gray level image based on a gradient evaluation algorithm of edges aiming at each gray level image; and selecting a pixel lattice in each edge region, calculating the pixel lattice according to a Laplace operator to obtain a definition value of each edge region, and calculating the definition value of each original image according to the definition value of each edge region.
Wherein the Laplace operator can be understood as a second-order derivative operator. For example, for a continuous function f(x, y), the Laplace value Δ²f at coordinates (x, y) can be defined as shown in the following formula (3):
Δ²f = ∂²f/∂x² + ∂²f/∂y² (3)
to be more suitable for digital image processing, the laplacian may also be expressed in the form of a template, as shown in table 1 below. The basic requirement of the template may be that the coefficients corresponding to the central pixel should be positive, and the coefficients corresponding to the neighboring pixels of the central pixel should be negative, and the sum of these coefficients should be zero.
TABLE 1 Laplace operator template
-1 | -1 | -1
-1 |  8 | -1
-1 | -1 | -1
By combining the above laplacian template with the discrete form of laplacian, the formula for calculating laplacian can be transformed as shown in the following formula (4):
E' = 8H(x,y) - H(x-1,y-1) - H(x-1,y) - H(x-1,y+1) - H(x,y-1) - H(x,y+1) - H(x+1,y-1) - H(x+1,y) - H(x+1,y+1) (4)
wherein E' is the Laplace value, which can also be called the evaluation factor of the image area, and H(x, y) is the gray value of pixel point (x, y). Formula (4) can be understood as follows: if a bright point appears in a darker area of the image, a nine-square grid (a 3×3 neighborhood) can be drawn with that point as its center, the gray values of the nine pixel points in the grid can be read, and the evaluation factor of formula (4) is then obtained by weighting those gray values with the template coefficients shown in Table 1. It can be understood that, for a blurred image, the gray values change little near each pixel and the evaluation factor E' is small; for a sharp image, the gray values change abruptly at edges and the evaluation factor E' is large.
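Formula (4) maps directly onto code. The sketch below applies the Table 1 template to the nine-square grid centered on an interior pixel of a grayscale image; the function name is illustrative and border handling is omitted for brevity.

```python
def evaluation_factor(gray, x, y):
    """Evaluation factor E' of formula (4): the 3x3 Laplace template of
    Table 1 applied to the nine-square grid centered on pixel (x, y).
    gray is a list of rows of gray values; (x, y) must be an interior pixel."""
    return (8 * gray[x][y]
            - gray[x - 1][y - 1] - gray[x - 1][y] - gray[x - 1][y + 1]
            - gray[x][y - 1] - gray[x][y + 1]
            - gray[x + 1][y - 1] - gray[x + 1][y] - gray[x + 1][y + 1])
```

On a uniform region the factor is 0 (the template coefficients sum to zero), while an isolated bright point of gray value 10 on a black background yields 80, illustrating how local gray-value change drives the factor up.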
Further, the sharpness value of the entire image can be calculated by the following equation (5):
F = (1/(M×N)) · Σ(x=1..M) Σ(y=1..N) (E'(x,y))² (5)
where F is the sharpness value of the image, M and N are the maximum values of the abscissa and the ordinate of the image pixels respectively, M×N is the total number of pixel points, and E'(x,y) is the evaluation factor at pixel point (x, y).
For example, for each grayscale image, a plurality of edge regions may be selected based on the edge gradient evaluation algorithm. As shown in fig. 2, 4 edge regions A1, A2, A3, and A4 may be selected in each grayscale image, and a 50×50 pixel lattice may then be selected in each edge region (as shown in fig. 3). The pixel lattice is processed according to the Laplace template and formula (4) to obtain the evaluation factors of the nine-square grids within the lattice, and the evaluation factors are then squared and averaged by formula (5) to obtain the sharpness value of the 50×50 pixel lattice. Through this calculation, the sharpness values F1, F2, F3, and F4 of the 4 edge regions of each grayscale image are obtained, and the sharpness values of the 4 edge regions of each image are then summed to obtain the total sharpness value of that image.
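The edge-region scoring described above can be sketched end to end as follows. The lattice size and region coordinates are parameters here (the text uses 50×50 lattices in 4 regions); the helper names are illustrative, and lattice-border pixels are skipped so that every nine-square grid fits inside the lattice.

```python
def region_sharpness(gray, top, left, size):
    """Sharpness F of one size x size pixel lattice: the mean of the squared
    Laplace evaluation factors E' (formula (4)) over the lattice's interior
    pixels, as in formula (5). gray is a list of rows of gray values."""
    total, count = 0, 0
    for x in range(top + 1, top + size - 1):
        for y in range(left + 1, left + size - 1):
            e = (8 * gray[x][y]
                 - gray[x-1][y-1] - gray[x-1][y] - gray[x-1][y+1]
                 - gray[x][y-1] - gray[x][y+1]
                 - gray[x+1][y-1] - gray[x+1][y] - gray[x+1][y+1])
            total += e * e
            count += 1
    return total / count

def image_sharpness(gray, regions, size):
    """Total sharpness of an image: sum of the F values of the selected
    edge regions, each given as the (top, left) corner of its lattice."""
    return sum(region_sharpness(gray, t, l, size) for (t, l) in regions)
```

A flat lattice scores 0, while a lattice containing a hard vertical edge scores high, so summing region scores favors images whose edges survived the capture unblurred.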
S103, acquiring blurred images in the multiple original images according to the multiple definition values, and deleting the blurred images.
Specifically, in an embodiment of the present invention, acquiring the blurred images among the plurality of original images according to the plurality of sharpness values may be implemented as follows: the calculated sharpness values are compared with one another, the original image with the largest sharpness value is taken as the preferred image, and the remaining original images are taken as blurred images. That is, the image with the largest sharpness value among the original images is selected as the preferred image, and the rest are treated as blurred images.
In another embodiment of the present invention, the specific implementation process of obtaining the blurred image in the plurality of original images according to the plurality of sharpness values may be as follows: and judging whether the plurality of definition values are smaller than a preset threshold value one by one, and if so, selecting the original image with the definition value smaller than the preset threshold value as a blurred image. Preferably, in an embodiment of the present invention, if the sharpness value is greater than or equal to the preset threshold, an original image having a sharpness value greater than or equal to the preset threshold is selected as the preferred image. That is, the sharpness value corresponding to each original image may be compared with a preset threshold, and the original image with the sharpness value smaller than the preset threshold may be selected as the blurred image, otherwise, the original image is the preferred image. In the embodiment of the present invention, the preset threshold may be an empirical value obtained through a large number of experiments in advance, or may be customized by the user according to the needs of the user.
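Both selection strategies from the two embodiments above can be sketched in one helper: keep only the sharpest image, or, when a preset threshold is supplied, keep every image whose sharpness value is greater than or equal to it. The function name and the returned index-list convention are illustrative, and the threshold itself is an empirical value as the text notes.

```python
def split_blurred(sharpness, threshold=None):
    """Split image indices into (preferred, blurred) lists.

    With no threshold, only the image with the largest sharpness value is
    preferred (first embodiment); with a threshold, every image at or above
    it is preferred and the rest are blurred (second embodiment)."""
    if threshold is None:
        best = max(range(len(sharpness)), key=lambda i: sharpness[i])
        return [best], [i for i in range(len(sharpness)) if i != best]
    preferred = [i for i, s in enumerate(sharpness) if s >= threshold]
    blurred = [i for i, s in enumerate(sharpness) if s < threshold]
    return preferred, blurred
```

The blurred index list is what the deletion step then acts on, freeing the corresponding storage.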
After the original images are subjected to sharpness evaluation to obtain blurred images, the blurred images are deleted. It can be understood that, at the same time of deleting the blurred images, the space for storing the blurred images can be released. Thus, the purpose of automatically deleting the blurred image can be realized.
According to the method for deleting a blurred image of the embodiment of the invention, a plurality of original images are first acquired and converted into grayscale images; sharpness calculation is then performed on each grayscale image to obtain the corresponding sharpness values; finally, the blurred images among the original images are identified according to the sharpness values and deleted. In other words, the sharpness of the images is judged automatically by a sharpness evaluation algorithm, the images meeting the blur criterion are identified from the sharpness value of each image, and those images are then deleted automatically. On the one hand, this achieves automatic and rapid deletion of blurred images and simplifies user operation; on the other hand, the sharpness evaluation algorithm makes the degree of blurring of the captured images easy to distinguish, which helps the user quickly pick the higher-quality images out of a group of similar images and improves the user experience.
Preferably, in an embodiment of the present invention, as shown in fig. 4, on the basis shown in fig. 1, that is, after deleting the blurred image, the method for deleting the blurred image may further include:
s404, the preferred image is provided to the user.
For example, after deleting the blurred image, the remaining preferred images in the original image may be aggregated in a presentation interface and presented to the user for viewing by the user.
Therefore, the image with the optimal definition in the same group of images can be directly obtained without user selection or screening, the user can more intuitively know the optimal image, and the visual experience of the user is improved.
In order to improve the user experience, the user can determine whether the function of automatically deleting the blurred image is needed according to the requirement of the user. Optionally, in an embodiment of the present invention, before acquiring a plurality of original images, the method for deleting the blurred image may further include: and providing a setting entrance of a switch for deleting the blurred image, wherein the setting entrance can be used for receiving the operation input by the user for the switch for deleting the blurred image.
For example, assuming that the method for deleting blurred images according to the embodiment of the present invention is applied to a mobile terminal, a setting entry for the delete-blurred-image switch may be provided on the viewfinder interface of the camera application in the mobile terminal. As shown in fig. 5, the user may tap the setting button icon to enter the setting entry of the delete-blurred-image switch, and may toggle the state of the switch by sliding, tapping, or the like.
For another example, assuming that the method for deleting blurred images according to the embodiment of the present invention is applied to a mobile terminal, as shown in fig. 6, a setting entry for the delete-blurred-image switch may be provided in a submenu under the camera application in the "Settings" of the mobile terminal, and the user may toggle the state of the switch by sliding, tapping, or the like.
Since the user may not have turned on the function switch for deleting blurred images, whether the switch is turned on may be detected before blurred images are deleted automatically. Optionally, in an embodiment of the present invention, before acquiring the plurality of original images, the method may further include: detecting whether the delete-blurred-image switch is in the on state; if so, acquiring the plurality of original images; if not, prompting the user to switch it to the on state. That is, it can be detected whether the user has turned on the delete-blurred-image switch. If it is turned on, the original images to be processed can be acquired; if not, a prompt inquiry box can pop up, as shown in fig. 7, to ask the user whether to turn the switch on. If the user selects "yes", the setting entry shown in fig. 6 can be invoked for the user to make specific settings; if the user selects "no", the manual deletion mode is entered.
In order to implement the above embodiments, the present invention further provides a device for deleting a blurred image.
Fig. 8 is a block diagram of a structure of a blurred image deletion apparatus according to an embodiment of the present invention.
As shown in fig. 8, the blurred image deletion apparatus includes: an original image acquisition module 10, a grayscale image conversion module 20, a sharpness calculation module 30, a blurred image acquisition module 40, and a deletion module 50.
Specifically, the original image acquiring module 10 is configured to acquire a plurality of original images. The grayscale image conversion module 20 is used to convert a plurality of original images into a plurality of grayscale images.
In the embodiment of the present invention, the plurality of original images may be a group of images captured by a user using a mobile terminal such as a smart phone through continuous shooting or self-shooting, for example, a group of images for the same scene, object, or person. It is understood that the image photographed by the mobile terminal is generally a color image.
More specifically, after completing a set of shot images through a continuous shooting function or a self-timer shooting function provided by the mobile terminal, or detecting that a user opens and views a set of continuous shooting images or a set of self-timer shooting images through a gallery application in the mobile terminal, the original image acquisition module 10 may acquire original images of the images, and the grayscale image conversion module 20 may convert the original images into corresponding grayscale images.
It is understood that, in the embodiment of the present invention, the grayscale image conversion module 20 converts the original images into grayscale images to facilitate the subsequent sharpness calculation, because the various sharpness evaluation functions operate on the gray values of the image. In an embodiment of the present invention, for the RGB model, the grayscale image conversion module 20 may convert an original image into a corresponding grayscale image according to the following formula (1):

H = 0.30r + 0.59g + 0.11b (1)

wherein H is the gray value of the grayscale image, and r (red), g (green), and b (blue) respectively represent the color values of the red, green, and blue channels in the original image.
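As a rough illustration of this conversion step, the following Python sketch applies the standard luminance weighting (0.30, 0.59, 0.11) to each pixel. The coefficients are an assumption, since formula (1) is not reproduced in the text, and the nested-list image format is purely illustrative.

```python
# Hedged sketch of formula (1): weighted average of the R, G, B
# channel values. The 0.30/0.59/0.11 weights are the conventional
# luminance coefficients, assumed here since the patent's formula
# image is not reproduced.

def to_grayscale(rgb_image):
    """Convert a nested list of (r, g, b) pixels to gray values H."""
    return [
        [round(0.30 * r + 0.59 * g + 0.11 * b) for (r, g, b) in row]
        for row in rgb_image
    ]

# Pure-red, pure-green, and pure-blue pixels in one row:
gray = to_grayscale([[(255, 0, 0), (0, 255, 0), (0, 0, 255)]])[0]
```

Because the weights sum to 1, a neutral pixel such as (100, 100, 100) maps to the gray value 100, as expected of a luminance formula.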
The sharpness calculation module 30 is configured to perform sharpness calculation on the plurality of grayscale images, respectively, to obtain a plurality of sharpness values corresponding to the plurality of original images.
More specifically, the sharpness calculation module 30 may perform sharpness calculation on each gray image through a sharpness evaluation algorithm to obtain a corresponding sharpness value. In the embodiment of the present invention, the sharpness evaluation algorithm may include, but is not limited to, a contrast method (i.e., a gray difference method), an edge-based gradient evaluation algorithm, an image gray entropy function, and the like.
The following describes in detail the calculation process of the image sharpness in the present embodiment by taking a contrast method (i.e., a grayscale difference method) and an edge-based gradient evaluation algorithm as examples.
For example, taking the contrast method (i.e., the gray-difference method) as an example, in the embodiment of the present invention, the sharpness calculation module 30 may calculate the sharpness values of the grayscale images as follows: for each grayscale image, the gray value of each pixel point is acquired, the contrast evaluation value of the image is calculated from the preset contrast evaluation model and those gray values, and the contrast evaluation value is used as the sharpness value of the corresponding original image. In an embodiment of the present invention, the preset contrast evaluation model (i.e., the contrast evaluation function) may be represented by the following formula (2):

E = Σ |H_{n+1} − H_{n−1}|, summed over n = 2, …, m − 1 (2)

where E is the contrast evaluation value, m is the total number of pixels of the grayscale image, H_{n+1} is the gray value of the (n+1)-th pixel point, and H_{n−1} is the gray value of the (n−1)-th pixel point.
That is, the sharpness calculation module 30 may first obtain the gray scale value of each pixel point in each gray scale image, then substitute the gray scale values into the above equation (2), calculate the contrast evaluation value corresponding to each gray scale image, and finally use the contrast evaluation value as the sharpness value corresponding to the original image.
It can be understood that an image captured at correct focus is sharp, while an image captured out of focus is blurred. Analyzing correctly focused (sharp) images and out-of-focus (blurred) images yields the following rule: when the image is correctly focused, the contrast of the captured image is strongest, and the further the lens deviates from the correct focus position, the lower the contrast of the captured image. Therefore, based on this rule, it can be seen from formula (2) that the contrast evaluation value E is largest at the in-focus position, decreases as the defocus amount increases on either side of that position, and tends to 0 when the defocus offset is particularly large.
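The contrast (gray-difference) evaluation can be sketched as follows. Because the exact form of formula (2) is not reproduced in the text, the sketch assumes that E sums the absolute differences of gray values two positions apart over the image treated as a flat sequence of m gray values; this is an assumption, not the patent's definitive formula.

```python
# Hedged sketch of the contrast evaluation: sum |H[n+1] - H[n-1]|
# over all interior positions of a flat gray-value sequence.

def contrast_evaluation(gray_values):
    """Return the contrast evaluation value E for a list of gray values."""
    return sum(
        abs(gray_values[n + 1] - gray_values[n - 1])
        for n in range(1, len(gray_values) - 1)
    )

sharp = [0, 0, 0, 255, 255, 255]          # abrupt edge: strong contrast
blurred = [120, 122, 124, 126, 128, 130]  # gentle ramp: weak contrast
```

As the text's rule predicts, the sequence with the abrupt gray-level transition scores far higher than the nearly uniform one.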
For another example, taking the edge-based gradient evaluation algorithm as an example, in the embodiment of the present invention, the sharpness calculation module 30 performs sharpness calculation on the multiple gray-scale images, respectively, and a specific implementation process of obtaining sharpness values corresponding to the multiple original images may be as follows: selecting a plurality of edge regions in each gray level image based on a gradient evaluation algorithm of edges aiming at each gray level image; and selecting a pixel lattice in each edge region, calculating the pixel lattice according to a Laplace operator to obtain a definition value of each edge region, and calculating the definition value of each original image according to the definition value of each edge region.
The Laplacian can be understood as a second-derivative operator. For example, for a continuous function f(x, y), its Laplacian value ∇²f at coordinates (x, y) can be defined as shown in the following formula (3):

∇²f = ∂²f/∂x² + ∂²f/∂y² (3)
To be more suitable for digital image processing, the Laplacian may also be expressed in the form of a template, as shown in table 1 below. The basic requirements of the template are that the coefficient corresponding to the central pixel be positive, the coefficients corresponding to its neighboring pixels be negative, and the sum of all coefficients be zero.
TABLE 1 Laplace operator template
-1 | -1 | -1 |
-1 | 8 | -1 |
-1 | -1 | -1 |
By combining the above laplacian template with the discrete form of laplacian, the formula for calculating laplacian can be transformed as shown in the following formula (4):
E' = 8H(x,y) − H(x−1,y−1) − H(x−1,y) − H(x−1,y+1) − H(x,y−1) − H(x,y+1) − H(x+1,y−1) − H(x+1,y) − H(x+1,y+1) (4)
wherein E' is the Laplacian value, which may also be called the evaluation factor of the image area, and H(x, y) is the gray value of the pixel point (x, y). Formula (4) can be understood as follows: if a bright point appears in a darker area of the image, a nine-square grid can be drawn with that bright point at its center, the gray values of the nine pixel points in the grid can be obtained, and the evaluation factor of formula (4) can then be calculated from those gray values and the template coefficients shown in table 1. It can be understood that for a blurred image, the gray values near each pixel vary little and the evaluation factor E' is small, while for a sharp image the variation is large and the evaluation factor E' reaches its maximum.
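The nine-square-grid computation of the evaluation factor can be illustrated with a minimal sketch that applies the Table 1 template at an interior pixel; the tiny 3 × 3 patches are illustrative only.

```python
# Formula (4) as given in the text: 8 times the centre gray value
# minus each of the eight neighbouring gray values (Table 1 template).

LAPLACE_TEMPLATE = [
    [-1, -1, -1],
    [-1,  8, -1],
    [-1, -1, -1],
]

def evaluation_factor(gray, x, y):
    """E'(x, y) for a 2-D list of gray values; (x, y) must be interior."""
    return sum(
        LAPLACE_TEMPLATE[i + 1][j + 1] * gray[x + i][y + j]
        for i in (-1, 0, 1)
        for j in (-1, 0, 1)
    )

# A bright point in a dark area, as in the text's example...
bright_dot = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]
# ...versus a uniform (blurred-looking) patch.
flat_patch = [[50, 50, 50], [50, 50, 50], [50, 50, 50]]
```

The bright point yields a large evaluation factor (8·200 − 8·10 = 1520), while the uniform patch yields zero, matching the behaviour described for blurred versus sharp regions.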
Further, for the entire image, the sharpness value can be calculated by the following formula (5):

F = (1 / (M × N)) Σ_{x=1}^{M} Σ_{y=1}^{N} (E'_{xy})² (5)

where F is the sharpness value of the image, M and N are the maximum values of the abscissa and ordinate of the image pixels, M × N is the total number of pixel points, and E'_{xy} is the evaluation factor at pixel point (x, y).
For example, the sharpness calculation module 30 may select a plurality of edge regions in each grayscale image based on the edge gradient evaluation algorithm. As shown in fig. 2, 4 edge regions A1, A2, A3, and A4 may be selected in each grayscale image, and a 50 × 50 pixel lattice (as shown in fig. 3) may be selected in each edge region. The pixel lattice is then processed with the Laplacian template and formula (4), yielding an evaluation factor for each nine-square neighborhood in the lattice; the evaluation factors are squared and averaged by formula (5) to obtain the sharpness value of the 50 × 50 lattice. Through this process, the sharpness values F1, F2, F3, and F4 of the 4 edge regions of each grayscale image are obtained, and the 4 values may then be summed to obtain the total sharpness value of each image.
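The whole edge-based computation can be condensed into one sketch. Two assumptions are made explicit: formula (5) squares and averages the evaluation factors over a lattice (as the text states), and the per-region values F1..F4 are summed into a total. The tiny 3 × 3 regions below are illustrative stand-ins for the 50 × 50 lattices of figs. 2 and 3.

```python
# End-to-end sketch: evaluation factor per pixel (formula (4)),
# squared-and-averaged per region (formula (5)), summed per image.

def laplace_factor(gray, x, y):
    """Evaluation factor of formula (4) at an interior pixel (x, y)."""
    neighbours = sum(
        gray[x + i][y + j]
        for i in (-1, 0, 1)
        for j in (-1, 0, 1)
        if (i, j) != (0, 0)
    )
    return 8 * gray[x][y] - neighbours

def region_sharpness(region):
    """F for one edge region: mean of squared evaluation factors."""
    h, w = len(region), len(region[0])
    factors = [
        laplace_factor(region, x, y)
        for x in range(1, h - 1)
        for y in range(1, w - 1)
    ]
    return sum(f * f for f in factors) / len(factors)

def image_sharpness(edge_regions):
    """Total sharpness value: sum of the per-region values."""
    return sum(region_sharpness(r) for r in edge_regions)

# One region containing a strong vertical edge, and one flat region:
edgy = [[0, 0, 255], [0, 0, 255], [0, 0, 255]]
flat = [[40, 40, 40], [40, 40, 40], [40, 40, 40]]
total = image_sharpness([edgy, flat])
```

The flat region contributes zero, so the total sharpness is driven entirely by the region containing the edge, which is exactly why the method samples regions at the image edges.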
The blurred image obtaining module 40 is configured to obtain blurred images of the plurality of original images according to the plurality of sharpness values.
Specifically, in an embodiment of the present invention, the blurred image obtaining module 40 may obtain the blurred images among the plurality of original images according to the plurality of sharpness values as follows: the calculated sharpness values are compared, the original image with the largest sharpness value is taken as the preferred image, and the remaining original images are taken as the blurred images. That is, the blurred image acquisition module 40 may select the image with the largest sharpness value among the original images as the preferred image and treat the rest as blurred images.
In another embodiment of the present invention, the blurred image obtaining module 40 may obtain the blurred images among the plurality of original images according to the plurality of sharpness values as follows: it is judged, one by one, whether each sharpness value is smaller than a preset threshold, and if so, the original image with that sharpness value is selected as a blurred image. Preferably, in an embodiment of the present invention, if a sharpness value is greater than or equal to the preset threshold, the corresponding original image is selected as a preferred image. That is, the blurred image obtaining module 40 may compare the sharpness value of each original image with the preset threshold and select the images whose values fall below the threshold as blurred images; the others are preferred images. In the embodiment of the present invention, the preset threshold may be an empirical value obtained in advance through a large number of experiments, or may be customized by the user according to the user's own needs.
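The two selection strategies described for the blurred image acquisition module 40 (keeping only the sharpest image, or flagging every image whose value falls below a preset threshold) can be sketched as follows; the sharpness values and the threshold are hypothetical.

```python
# Sketch of the two blur-selection strategies over a list of
# sharpness values; indices stand in for the images themselves.

def split_by_max(sharpness):
    """Return (preferred_index, blurred_indices): the max value wins."""
    best = max(range(len(sharpness)), key=lambda i: sharpness[i])
    return best, [i for i in range(len(sharpness)) if i != best]

def split_by_threshold(sharpness, threshold):
    """Return indices of images whose sharpness is below the threshold."""
    return [i for i, f in enumerate(sharpness) if f < threshold]

values = [820.0, 145.5, 990.2, 60.1]        # hypothetical sharpness values
best, rest = split_by_max(values)            # image 2 is the preferred one
blurred = split_by_threshold(values, 200.0)  # images 1 and 3 fall below
```

Note the two strategies can disagree: the max-based rule always deletes all but one image, while the threshold rule may keep several images (here, images 0 and 2) when more than one is sharp enough.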
The deletion module 50 is used to delete the blurred images. More specifically, after the blurred image acquisition module 40 performs the sharpness evaluation on the original images to obtain the blurred images, the deletion module 50 may delete them. It is understood that the deletion module 50 may also release the storage space occupied by the blurred images when deleting them. In this way, the purpose of automatically deleting blurred images can be achieved.
According to the device for deleting blurred images of the embodiment of the present invention, the original image acquisition module acquires a plurality of original images, the grayscale image conversion module converts them into grayscale images, the sharpness calculation module calculates a sharpness value for each grayscale image, the blurred image acquisition module identifies the blurred images among the original images according to those sharpness values, and the deletion module deletes them. In other words, the sharpness of the images is judged automatically by a sharpness evaluation algorithm, the images meeting the blur criterion are identified from their sharpness values, and those images are then deleted automatically. On the one hand, this achieves the goal of automatically and quickly deleting blurred images and simplifies user operation; on the other hand, because the sharpness evaluation algorithm easily distinguishes the degree of blur of the captured images, it helps the user quickly pick out the higher-quality pictures in a group of similar images, improving the user experience.
Preferably, in an embodiment of the present invention, as shown in fig. 9, the blurred image deletion apparatus may further include a preferred image providing module 60, which may provide the preferred image to the user after the deletion module 50 deletes the blurred images. For example, after the deletion module 50 deletes the blurred images, the preferred image providing module 60 may collect the remaining preferred images in a presentation interface and present them to the user for viewing.
Therefore, the image with the optimal definition in the same group of images can be directly obtained without user selection or screening, the user can more intuitively know the optimal image, and the visual experience of the user is improved.
In order to improve the user experience, the user can decide, according to his or her own needs, whether the function of automatically deleting blurred images is required. Optionally, in an embodiment of the present invention, as shown in fig. 10, the device for deleting blurred images may further include: a setting entry providing module 70.
Specifically, the setting entry providing module 70 is configured to provide a setting entry of a delete blurred image switch, where the setting entry is configured to receive an operation input by a user for the delete blurred image switch.
For example, assuming that the device for deleting blurred images according to the embodiment of the present invention is applied to a mobile terminal, the setting entry providing module 70 may provide a setting entry of the blurred image deletion switch on the viewfinder interface of the camera application in the mobile terminal. As shown in fig. 5, the user may tap the setting button icon to enter the setting entry of the switch and may toggle the state of the switch by sliding or tapping.
For another example, assuming that the apparatus for deleting blurred images according to the embodiment of the present invention is applied to a mobile terminal, as shown in fig. 6, the setting entry providing module 70 may provide a setting entry of the blurred image deletion switch in a submenu under the camera application in the "Settings" of the mobile terminal, and the user may toggle the state of the switch by sliding or tapping.
Since the user may not have turned on the function switch for deleting blurred images, whether the switch is on may be detected before deleting automatically. Optionally, in an embodiment of the present invention, as shown in fig. 10, the device for deleting blurred images may further include: a detection module 80 and a state replacement module 90.
Specifically, the detection module 80 may be configured to detect whether the blurred image deletion switch is in the on state before the original image acquisition module 10 acquires the plurality of original images. The state replacement module 90 may be configured to switch the blurred image deletion switch to the on state when the detection module 80 detects that it is not on. In the embodiment of the present invention, the original image acquisition module 10 may be further configured to acquire the plurality of original images when the blurred image deletion switch is in the on state.
That is, the detection module 80 may detect whether the user has turned on the blurred image deletion switch. If so, the original image acquisition module 10 may acquire the original images to be processed; if not, the state replacement module 90 may pop up a prompt query box, as shown in fig. 7, asking the user whether to turn on the switch. If the user selects "yes", the setting entry shown in fig. 6 may be invoked for the user to make specific settings; if the user selects "no", the device enters a manual deletion mode.
In order to implement the above embodiments, the present invention further provides a mobile terminal, which may include the blurred image deleting apparatus according to any of the above embodiments.
According to the mobile terminal of the embodiment of the present invention, the original image acquisition module in the deletion apparatus acquires a plurality of original images, the grayscale image conversion module converts them into grayscale images, the sharpness calculation module calculates a sharpness value for each grayscale image, the blurred image acquisition module identifies the blurred images among the original images according to those sharpness values, and the deletion module deletes them. In other words, the sharpness of the images is judged automatically by a sharpness evaluation algorithm, the images meeting the blur criterion are identified from their sharpness values, and those images are then deleted automatically. On the one hand, this achieves the goal of automatically and quickly deleting blurred images and simplifies user operation; on the other hand, because the sharpness evaluation algorithm easily distinguishes the degree of blur of the captured images, it helps the user quickly pick out the higher-quality pictures in a group of similar images, improving the user experience.
In order to facilitate understanding of the implementation of the method for deleting blurred images according to the embodiment of the present invention, the operation steps for implementing the blurred image deletion function are described below. Assuming that the method is applied to a mobile terminal, as shown in fig. 11, the operation steps may include:
s1101, the user opens and views a group of continuous shooting images by using the mobile terminal.
S1102, when an event of opening the images is detected, whether the current setting of the "switch for automatically deleting a blurred image" is on or off may be detected.
S1103, if the switch is on, a waiting prompt box indicating that intelligent optimization is in progress is displayed on the application-layer interface of the mobile terminal, as shown in fig. 12; while the prompt box is displayed, the image sharpness calculation of steps S1104 to S1107 is completed.
S1104, the original images are converted to grayscale, and a sharpness algorithm function, such as the contrast method or the edge-based gradient evaluation algorithm, is called for calculation.
S1105, the sharpness values of the group of continuously shot images are calculated respectively.
S1106, the images are sorted according to the calculated sharpness values, and the image with the maximum sharpness value is taken as the preferred image.
S1107, the preferred image is retained, and the remaining images of the same group are automatically deleted.
And S1108, displaying the preferred image on the display interface of the mobile terminal.
S1109, if the switch is not on, a prompt query box pops up on the mobile terminal, as shown in fig. 7, to remind the user whether to turn on the "switch for automatically deleting the blurred image".
S1110, if it is detected that the user selects "yes", the setting entry providing module of fig. 5 or fig. 6 is called to make a specific setting.
S1111, control proceeds to step S1104.
S1112, if it is detected that the user selects "no", the mobile terminal enters a manual deletion operation mode.
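Steps S1101 to S1112 can be condensed into a control-flow sketch. The function names and the callables standing in for the sharpness, prompt, and settings modules are hypothetical; the sketch only mirrors the branching described above.

```python
# Condensed sketch of the flow of fig. 11: when a burst of images is
# opened, either run the automatic pipeline (switch on) or prompt the
# user (switch off, steps S1109-S1112).

def on_burst_opened(images, switch_on, sharpness_fn, ask_user_fn):
    """Return (kept_images, deleted_images, mode)."""
    if not switch_on:
        if not ask_user_fn():                 # S1109/S1112: user declined
            return images, [], "manual"
        # S1110: user enabled the switch via the settings entry
    scores = [sharpness_fn(img) for img in images]   # S1104-S1105
    best = scores.index(max(scores))                 # S1106: pick sharpest
    kept = [images[best]]                            # S1107: keep preferred
    deleted = [img for i, img in enumerate(images) if i != best]
    return kept, deleted, "auto"

kept, deleted, mode = on_burst_opened(
    ["a.jpg", "b.jpg", "c.jpg"],
    switch_on=True,
    sharpness_fn=lambda n: {"a.jpg": 310.0, "b.jpg": 905.5, "c.jpg": 120.0}[n],
    ask_user_fn=lambda: False,
)
```

With the switch on, the prompt callback is never consulted; only the sharpest image of the burst survives, matching steps S1104 to S1107.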
In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (13)
1. A method for deleting a blurred image is characterized by comprising the following steps:
acquiring a plurality of original images, and converting the plurality of original images into a plurality of gray level images;
respectively carrying out definition calculation on the gray level images to obtain a plurality of definition values corresponding to the original images; wherein, the respectively performing definition calculation on the plurality of gray level images to obtain a plurality of definition values corresponding to the plurality of original images comprises:
selecting a plurality of edge regions in each gray level image based on a gradient evaluation algorithm of edges aiming at each gray level image; selecting a pixel lattice in each edge region, and calculating the pixel lattice according to a Laplace operator to obtain a definition value of each edge region; summing the definition values of each edge area, and determining the obtained sum value as the definition value of each original image;
acquiring blurred images in the multiple original images according to the multiple definition values, and deleting the blurred images;
obtaining blurred images in the multiple original images according to the multiple sharpness values specifically includes:
judging whether the plurality of definition values are smaller than a preset threshold value one by one; wherein the preset threshold value is customized by the user according to the user's own needs;
if a definition value is smaller than the preset threshold value, the original image with that definition value is the blurred image;
wherein the plurality of edge regions are all located at the edge of the corresponding gray scale image.
2. The method for deleting the blurred image according to claim 1, wherein the obtaining the blurred images in the plurality of original images according to the plurality of sharpness values specifically includes:
and comparing the plurality of definition values, wherein the original image with the largest definition value is the preferred image, and the rest original images are the blurred images.
3. The blurred image deletion method as set forth in claim 1, further comprising:
if the definition value is larger than or equal to the preset threshold value, the original image with the definition value larger than or equal to the preset threshold value is a preferred image.
4. A blurred image deletion method as claimed in claim 2 or 3, wherein after the blurred image is deleted, the method further comprises:
the preferred image is provided to the user.
5. The method for deleting a blurred image according to claim 1, wherein before acquiring a plurality of original images, the method further comprises:
and providing a setting inlet of a fuzzy image deleting switch, wherein the setting inlet is used for receiving the operation input by a user aiming at the fuzzy image deleting switch.
6. The method for deleting a blurred image according to claim 5, wherein before acquiring a plurality of original images, the method further comprises:
detecting whether the state of the fuzzy image deleting switch is the on state;
if not, switching the state of the fuzzy image deleting switch to the on state;
and if so, acquiring the plurality of original images.
7. A blurred image deletion apparatus, comprising:
the original image acquisition module is used for acquiring a plurality of original images;
the gray level image conversion module is used for converting the original images into gray level images;
the definition calculating module is used for respectively calculating the definitions of the gray images to obtain a plurality of definition values corresponding to the original images; wherein the sharpness calculation module is specifically configured to:
selecting a plurality of edge regions in each gray level image based on a gradient evaluation algorithm of edges aiming at each gray level image; selecting a pixel lattice in each edge region, and calculating the pixel lattice according to a Laplace operator to obtain a definition value of each edge region; summing the definition values of each edge area, and determining the obtained sum value as the definition value of each original image;
the blurred image acquisition module is used for acquiring blurred images in the plurality of original images according to the plurality of definition values; the blurred image acquisition module is specifically configured to: judging whether the plurality of definition values are smaller than a preset threshold value one by one; if the sharpness value is smaller than the preset threshold value, the original image with the sharpness value smaller than the preset threshold value is the blurred image; the preset threshold value is self-defined by a user according to the self requirement;
a deleting module for deleting the blurred image;
wherein the plurality of edge regions are all located at the edge of the corresponding gray scale image.
8. The apparatus for deleting the blurred image according to claim 7, wherein the blurred image acquiring module is specifically configured to:
and comparing the plurality of definition values, wherein the original image with the largest definition value is the preferred image, and the rest original images are the blurred images.
9. The blurred image deletion apparatus according to claim 7, wherein the blurred image acquisition module is further configured to: and when the definition value is greater than or equal to the preset threshold value, the original image with the definition value greater than or equal to the preset threshold value is a preferred image.
10. The blurred image deletion apparatus as set forth in claim 8, further comprising:
a preferred image providing module for providing the preferred image to a user after the deleting module deletes the blurred image.
11. The blurred image deletion apparatus as set forth in claim 7, further comprising:
and the setting entrance providing module is used for providing a setting entrance of a switch for deleting the blurred image, and the setting entrance is used for receiving the operation input by the user aiming at the switch for deleting the blurred image.
12. The blurred image deletion apparatus as set forth in claim 11, further comprising:
the detection module is used for detecting whether the state of the fuzzy image deleting switch is in an opening state or not before the original image acquisition module acquires a plurality of original images;
the state replacement module is used for replacing the state of the fuzzy image deleting switch with an opening state when the detection module detects that the state of the fuzzy image deleting switch is not in the opening state; wherein,
the original image acquisition module is further used for acquiring a plurality of original images when the state of the fuzzy image deleting switch is in the starting state.
13. A mobile terminal, comprising the apparatus for deleting a blurred image according to any one of claims 7 to 12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510655461.7A CN106570028B (en) | 2015-10-10 | 2015-10-10 | Mobile terminal and method and device for deleting blurred image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106570028A CN106570028A (en) | 2017-04-19 |
CN106570028B true CN106570028B (en) | 2020-12-25 |
Family
ID=58507930
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510655461.7A Active CN106570028B (en) | 2015-10-10 | 2015-10-10 | Mobile terminal and method and device for deleting blurred image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106570028B (en) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107240078A (en) * | 2017-06-06 | 2017-10-10 | 广州优创电子有限公司 | Lens articulation Method for Checking, device and electronic equipment |
CN108921806B (en) * | 2018-08-07 | 2020-08-07 | Oppo广东移动通信有限公司 | Image processing method, image processing device and terminal equipment |
CN111131688B (en) * | 2018-10-31 | 2021-04-23 | Tcl科技集团股份有限公司 | Image processing method and device and mobile terminal |
CN109598706A (en) * | 2018-11-26 | 2019-04-09 | 安徽嘉拓信息科技有限公司 | A kind of camera lens occlusion detection method and system |
CN110704380A (en) * | 2019-08-27 | 2020-01-17 | 努比亚技术有限公司 | Automatic picture deleting method, terminal and computer readable storage medium |
CN112580400B (en) * | 2019-09-29 | 2022-08-05 | 荣耀终端有限公司 | Image optimization method and electronic equipment |
CN110838099A (en) * | 2019-10-10 | 2020-02-25 | 深圳市燕麦科技股份有限公司 | Foreign matter detection method, device and system and terminal equipment |
CN112714246A (en) * | 2019-10-25 | 2021-04-27 | Tcl集团股份有限公司 | Continuous shooting photo obtaining method, intelligent terminal and storage medium |
CN110795579B (en) * | 2019-10-29 | 2022-11-18 | Oppo广东移动通信有限公司 | Picture cleaning method and device, terminal and storage medium |
CN111556249A (en) * | 2020-05-19 | 2020-08-18 | 青岛海信移动通信技术股份有限公司 | Image processing method based on ink screen, terminal and storage medium |
CN112135048B (en) * | 2020-09-23 | 2022-02-15 | 创新奇智(西安)科技有限公司 | Automatic focusing method and device for target object |
CN113763311B (en) * | 2021-01-05 | 2024-07-23 | 北京京东乾石科技有限公司 | Image recognition method and device and automatic sorting robot |
CN112784741A (en) * | 2021-01-21 | 2021-05-11 | 宠爱王国(北京)网络科技有限公司 | Pet identity recognition method and device and nonvolatile storage medium |
CN113823400A (en) * | 2021-11-22 | 2021-12-21 | 武汉楚精灵医疗科技有限公司 | Method and device for monitoring speed of endoscope withdrawal of intestinal tract and computer readable storage medium |
CN115714894A (en) * | 2022-11-15 | 2023-02-24 | 安徽宝信信息科技有限公司 | Automatic image tracking all-in-one of discernment position |
CN116311243B (en) * | 2023-03-22 | 2023-10-24 | 生态环境部长江流域生态环境监督管理局生态环境监测与科学研究中心 | Algae detection method and system based on microscope image |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2003009579A2 (en) * | 2001-07-17 | 2003-01-30 | Amnis Corporation | Computational methods for the segmentation of images of objects from background in a flow imaging instrument |
CN103093419A (en) * | 2011-10-28 | 2013-05-08 | 浙江大华技术股份有限公司 | Method and device for detecting image definition |
CN104036016A (en) * | 2014-06-25 | 2014-09-10 | 珠海全志科技股份有限公司 | Picture screening method and picture screening device |
CN104185981A (en) * | 2013-10-23 | 2014-12-03 | 华为终端有限公司 | Method and terminal selecting image from continuous captured image |
Also Published As
Publication number | Publication date |
---|---|
CN106570028A (en) | 2017-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106570028B (en) | Mobile terminal and method and device for deleting blurred image | |
US10341574B2 (en) | Operating a device to capture high dynamic range images | |
CN104486552B (en) | A kind of method and electronic equipment obtaining image | |
JP4840426B2 (en) | Electronic device, blurred image selection method and program | |
EP3196758B1 (en) | Image classification method and image classification apparatus | |
KR102356372B1 (en) | White balance processing method, electronic device and computer readable storage medium | |
JP6720881B2 (en) | Image processing apparatus and image processing method | |
CN107968919A (en) | Method and apparatus for inverse tone mapping | |
JP2014077994A (en) | Image display device, control method and control program for the same, and imaging device | |
JP6632724B2 (en) | Device and method for reducing exposure time set for high dynamic range video / imaging | |
US20150312487A1 (en) | Image processor and method for controlling the same | |
CN111669492A (en) | Method for processing shot digital image by terminal and terminal | |
CN103607523B (en) | Method and terminal for adjusting colors of images | |
US20090161961A1 (en) | Apparatus and method for trimming | |
CN110992283A (en) | Image processing method, image processing apparatus, electronic device, and readable storage medium | |
CN113810673B (en) | Projector uniformity testing method and device and computer readable storage medium | |
US20090185050A1 (en) | Apparatus and method for acquiring image based on expertise | |
JP2017130106A (en) | Data processing apparatus, imaging apparatus and data processing method | |
CN111275045A (en) | Method and device for identifying image subject, electronic equipment and medium | |
US11494884B2 (en) | Method and system for evaluating image sharpness | |
JP2017182668A (en) | Data processor, imaging device, and data processing method | |
CN110222207B (en) | Picture sorting method and device and intelligent terminal | |
CN115375796A (en) | Image processing | |
CN112422825A (en) | Intelligent photographing method, device, equipment and computer readable medium | |
JP2007195097A (en) | Imaging apparatus, image processing method, and image processing program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||