
KR101744163B1 - An application for managing images and method of managing images - Google Patents

Info

Publication number
KR101744163B1
KR101744163B1 (application KR1020160013385A)
Authority
KR
South Korea
Prior art keywords
image
cluster
images
clusters
feature
Prior art date
Application number
KR1020160013385A
Other languages
Korean (ko)
Inventor
서해종
김현철
이재일
Original Assignee
김현철
서해종
이재일
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 김현철, 서해종, 이재일
Priority to KR1020160013385A
Application granted
Publication of KR101744163B1

Classifications

    • G06F17/30256
    • G06F17/3028
    • G06F17/3071
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

An application for image management and a method therefor are provided.
An image management application stored in a medium coupled with hardware generates, for each of a plurality of images within a predetermined period, a feature vector composed of feature values that include at least the code stream of a header related to color information and time information. If the distance between the feature vectors of adjacent images is less than or equal to a threshold, the images are classified into the same cluster; if the distance between the feature vector of a subsequent image and a cluster representative value is less than or equal to the threshold, the subsequent image is classified into that cluster; and when the plurality of images have been classified into a plurality of clusters and the distance between the cluster representative values of two clusters is less than or equal to the threshold, the two clusters are merged into a single cluster.


Description

[0001] The present invention relates to an image management application and an image management method.

The present invention relates to an image management application and an image management method, and more particularly, to an image management application and an image management method capable of easily grouping a large number of images on a device with limited performance so as to closely match the classification intention of the user, and of realizing personalized classification criteria and image quality evaluation.

Photos taken by smart devices such as tablet computers, smart phones, and digital cameras are stored in separate storage folders for each device. In order to classify the photographs according to the user's criteria, it takes a lot of time and effort to check the pictures one by one.

Even when photographs stored in a photographing apparatus are transmitted to other apparatuses, the user must still check every photograph in order to classify them according to his or her own criteria, which is inconvenient.

An object of the present invention is to provide an image management application and an image management method that easily group a large number of images so as to match the classification intention of the user on a device with limited performance, and that realize personalized classification criteria and image quality evaluation according to user feedback.

The objects of the present invention are not limited to the above-mentioned objects, and other objects not mentioned can be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided an image management application, combined with hardware, which performs the steps of: generating, for each of a plurality of images within a predetermined period, a feature vector composed of feature values including at least the code stream of a header related to color information and time information; classifying a first image and an adjacent image into the same cluster if the distance between their feature vectors is less than or equal to a threshold; classifying a subsequent image into the same cluster if the distance between the feature vector of the subsequent image and a cluster representative value is less than or equal to the threshold, and updating the cluster representative value accordingly; and, when the plurality of images have been classified into a plurality of clusters, merging two of the plurality of clusters into a single cluster if the distance between their cluster representative values is less than or equal to the threshold, and maintaining the plurality of clusters if the threshold is exceeded.

In another embodiment, prior to the step of generating the feature vector, the threshold may be estimated based on thresholds used in pre-classified clusters.

In another embodiment, in calculating the distance between feature vectors and the distance between the feature vector of the subsequent image and the cluster representative value, a clustering weight may be given to each feature value constituting the feature vector.

In another embodiment, after the step of merging into a single cluster when the distance is less than or equal to the threshold and maintaining the plurality of clusters when the threshold is exceeded, the method may further include: updating the cluster representative value when a plurality of clusters are selected by the user and merged into one cluster; and estimating the threshold based on the updated cluster representative value.

Further, the estimated threshold value may be used to classify the clusters for a plurality of images in the subsequent period.

In addition, when a clustering weight is assigned to each feature value constituting the feature vector in the calculation of the distance between feature vectors and the distance between the feature vector of the subsequent image and the cluster representative value, the method may further include updating the clustering weight based on the estimated threshold, and the estimated threshold and the updated clustering weight may be used to classify clusters for a plurality of images in a subsequent period.

In another embodiment, the image is stored in digital form. If the format of the image is JPEG (Joint Photographic Experts Group), the code stream of the header related to the color information may be the DC code and the AC code stored in a Huffman table; if the format of the image is PNG (Portable Network Graphics), the code stream of the header related to the color information may be the color-data-related code stored in a Huffman table; and if the format of the image is GIF (Graphics Interchange Format), the code stream of the header related to the color information may be the color-data-related code stored as an LZW (Lempel-Ziv-Welch) code.

In another embodiment, the cluster representative value may be composed of an average value for each feature value constituting a feature vector of images classified into the same cluster.

In yet another embodiment, the medium may be any of a mobile terminal, a computer, a cloud server, and a social network server.

In another embodiment, after the step of merging into a single cluster when the distance is less than or equal to the threshold and maintaining the plurality of clusters when the threshold is exceeded, the method may further include: calculating a plurality of quality feature values for each of the images classified into each cluster; ranking the images in the cluster according to the result of evaluating the images based on the quality feature values to which quality weights are assigned; and selecting a representative image in the cluster.

The method may further include, after ranking the images in the cluster and selecting the representative image: rearranging the ranking and obtaining a new representative image in response to an instruction of the user to directly reorder the images in the cluster; calculating the quality weights based on the rearranged ranking; and evaluating the quality feature values of images in other clusters by applying the calculated quality weights.

In addition, when there is an indirect activity of the user with respect to the images in the cluster, the method may further include, after ranking the images in the cluster and selecting the representative image: rearranging the ranking of the images in the cluster based on the degree of intervention of the activity and obtaining a representative image according to the new ranking; calculating the quality weights based on the degree of intervention of the activity; and evaluating the quality feature values of images in other clusters by applying the calculated quality weights.

In this case, the degree of intervention of the activity may be calculated to include at least one of the number of times the image is shared on a social network service, the number of times the image is modified, the number of times the image is browsed, and the viewing time of the image.

The quality feature values may include at least one of the presence or absence of a face in the image, the sharpness in a predetermined area of the image around the face, the relative area occupied by the face in the image, the openness of the eyes, the signal-to-noise ratio, face distortion due to HDR (High Dynamic Range) processing or lens distortion, and the distance from the center of the image to the center of the face.

The method may further include, after ranking the images in the cluster and selecting the representative image, storing only images of a predetermined upper rank on the medium or uploading them to another external medium.

The method may further include, after ranking the images in the cluster and selecting the representative image: converting images of a predetermined lower rank into images having a lower compression ratio than the original images; and storing the converted images on the medium or uploading them to another external medium.

The details of other embodiments are included in the detailed description and drawings.

According to the present invention, images that lie in a continuous time interval but have different compositions can be precisely grouped into different clusters without excessive computational complexity or complex image processing, even on a mobile terminal or in a cloud environment with limited resources, so that clustering that meets the user's classification intent can be realized. In addition, by merging similar clusters after clustering, it is possible to prevent over-clustering, in which similar images are classified into different clusters because of a disturbing image within a run of successive similar images.

In addition, when the user merges or creates clusters, the threshold and the clustering weights, which are the criteria for distinguishing clusters, are updated, so that a personalized classification criterion is generated for the clustering of subsequent images and classification corresponding to the user's intention is implemented.

In addition, the user can directly change the ranking of the images in a cluster produced by the quality evaluation, or the quality weights assigned to the evaluation elements can be recalculated from activities such as use of a social network service, so that the user's personal preferences are reflected.

FIG. 1 is a schematic view of an entire network including a mobile terminal in which an image management application according to an embodiment of the present invention is implemented.
FIG. 2 is a schematic configuration diagram of a mobile terminal.
FIG. 3 is a diagram showing the functions of the image management application by module.
FIG. 4 is a view showing information of images stored in the database unit.
FIG. 5 is a diagram illustrating the format of an image file.
FIGS. 6A and 6B are flowcharts of the clustering process of images.
FIG. 7 is a diagram showing the merging of clusters to prevent over-clustering.
FIGS. 8A to 8D illustrate a user interface for the clustering process of images.
FIG. 9 is a flowchart for the case where a cluster is edited by a user's operation.
FIGS. 10A and 10B are diagrams illustrating a user interface when a cluster is edited by the user.
FIG. 11 is a flowchart for evaluating the quality of images and changing the image ranking by direct manipulation of the user.
FIG. 12 is a flowchart for evaluating the quality of images and changing the image ranking by the degree of intervention of an activity.
FIG. 13 is a diagram illustrating a user interface for adjusting parameters used for classifying clusters and evaluating the quality of images.
FIG. 14 is a diagram illustrating a user interface for determining the selection and storage form of images to be stored on a medium.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings and the following description. However, the present invention is not limited to the embodiments described herein and may be embodied in other forms. Rather, the embodiments disclosed herein are provided so that this disclosure will be thorough and complete and will fully convey the concept of the invention to those skilled in the art. Like reference numerals designate like elements throughout the specification. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. In the present specification, the singular form includes the plural form unless otherwise specified. The terms "comprises" and/or "comprising", as used herein, do not preclude the presence or addition of elements, steps, or operations other than those recited.

Also, the terms "part" and "module" refer generally to components such as logically separable software (computer programs) and hardware. Therefore, a module in the present embodiment indicates not only a module in a computer program but also a module in a hardware configuration; the embodiment thus also serves as a description of the computer programs that cause a computer to function as such modules (a program that causes a computer to execute each step, or a program that causes a computer to function as each means), and of the corresponding systems and methods. For convenience of description, the word "store" and its equivalents are used; these words mean storing in a storage device, or controlling so as to store in a storage device. Modules may correspond one-to-one to functions, but in implementation one module may be constituted by one program, a plurality of modules may be constituted by one program, or one module may be constituted by a plurality of programs. A plurality of modules may be executed by one computer, and one module may be executed by a plurality of computers in a distributed or parallel environment. One module may include another module. Hereinafter, the term "connection" is used not only for physical connection but also for logical connection (transfer of data, instructions, reference relationships between data, and the like). The term "predetermined" means determined before the processing in question; it includes not only what is determined before the processing according to the present embodiment starts, but also, if determined before the processing in question, what is determined according to the situation or state at that time or up to that time.

The terms "system" and "device" include not only configurations in which a plurality of computers, hardware, devices, and the like are connected by communication means such as a network (including one-to-one communication connections), but also cases realized by a single computer, hardware, or device.

In addition, when each module performs its processing, or when a plurality of processes are performed within a module, the target information is read from a storage device (memory), processed, and the processing result is written back to the storage device. Descriptions of reading from the storage device before processing and of writing to the storage device after processing may therefore be omitted. The storage device here may include a hard disk, a RAM (Random Access Memory), an external storage medium, a storage device accessed via a communication line, a register in a CPU (Central Processing Unit), and the like.

Hereinafter, an image management application according to an embodiment of the present invention will be described in detail with reference to FIGS. 1 to 5. FIG. 1 is a schematic diagram of an entire network including a mobile terminal in which an image management application according to an embodiment of the present invention is implemented, FIG. 2 is a schematic configuration diagram of the mobile terminal, FIG. 3 is a diagram showing the functions of the image management application by module, FIG. 4 is a view showing information of images stored in the database unit, and FIG. 5 is a view illustrating the format of an image file.

Hereinafter, the image management application is described as embedded in a mobile terminal for convenience of description. However, the image management application may be embedded in a cloud server, a social network server, or the like, and may be provided as a service to the mobile terminal through a program for the mobile terminal associated with the application. Even when embedded in a cloud server or a social network server, the image management application can be implemented substantially as described below. The image management application can likewise simply be installed on a personal desktop computer and implemented as described below.

Referring to FIG. 1, a plurality of mobile terminals 100, a cloud server 140, a social network server 300, and a network 400 are provided for uploading images that have been grouped into clusters and ranked by the image management application. These components are connected via the network 400 by wired and wireless connections, and data is exchanged between them through the network 400.

According to the present embodiment, the mobile terminal 100 can embed an image management application. The mobile terminal 100 is a terminal capable of communicating with the outside, and can photograph a subject. The mobile terminal 100 can be, for example, a cellular phone, a smart phone, a tablet computer, a laptop computer, or a digital camera.

The cloud server 140 may communicate with the mobile terminal 100 to store data held by the mobile terminal 100, such as documents, images, video, and audio, and may transmit the stored data to the mobile terminal 100 or the social network server 300.

The social network server 300 may implement digital activities performed between the mobile terminals 100, for example, message exchange and the sharing of text, images, video, audio, and various other data.

The mobile terminal 100 may specifically include an operation unit 102, an image input unit 104, an audio input unit 106, and an external connection unit 124.

The operation unit 102, an interface for the user's input, may be a soft key, such as a touch pad provided on the display unit 112, and/or a hard key. The image input unit 104 is a part into which a still image or video of a subject is input, and may be constituted by, for example, a CCD or CMOS image sensor. The audio input unit 106 is a part for receiving voice around the mobile terminal 100, and may be a microphone. The external connection unit 124 is a member connected to an external electronic device such as a computing device or an external memory, and can perform data communication between the mobile terminal 100 and the external device.

The mobile terminal 100 may include an audio processing unit 108, an image processing unit 110, a display unit 112, and an audio output unit 126.

The audio processing unit 108 encodes the audio input from the audio input unit 106 into an audio file in digital compression format and decodes the audio file stored in the memory unit 118 and reproduces the audio file through the audio output unit 126.

The image processing unit 110 encodes a subject image or video input from the image input unit 104 into a digitally compressed image or video, and decodes the image or video stored in the memory unit 118 so that it can be displayed through the display unit 112. The format of the compressed image may be JPEG (Joint Photographic Experts Group), PNG (Portable Network Graphics), GIF (Graphics Interchange Format), TIFF (Tag Image File Format), or the like.

The mobile terminal 100 may include a communication unit 114, a position sensor unit 116, a memory unit 118, and a controller 120 for controlling functions of the respective units.

The communication unit 114 may use, for example, LTE, WCDMA, Bluetooth, Wi-Fi, WiBro, or Ethernet, but is not limited thereto.

The position sensor unit 116 is a member that recognizes position information on the current position of the mobile terminal 100, and may be, for example, a GPS sensor. The position sensor unit 116 may acquire positional information about a place where the image is captured and provide the stored positional information as metadata of the stored image. Although only the position sensor unit 116 is shown in FIG. 2, a plurality of sensors for sensing various situations around the mobile terminal 100 may be provided.

The memory unit 118 may store various multimedia data including image files, audio files, and the like compressed by the image processing unit 110, operating system related data, and various types of data.

The application unit 122 may store, apart from the program memory of the memory unit 118, the applications that drive the various functions used by the user, including the image management application 200; alternatively, it may be incorporated into the memory unit 118. Such an application may be initially embedded in the mobile terminal 100 or may be downloaded and installed from an external app market.

The image management application 200 implements the functions of clustering the compressed images stored in the memory unit 118 into groups of similar images and of ranking the images within each cluster.

Specifically, the image management application 200 may include a database unit 202, a feature value extracting unit 204, an image clustering unit 206, an image evaluating unit 218, a class presenting unit 224, and an uploading unit 236.

As shown in FIG. 4, the database unit 202 stores, for each image, the clustering and quality evaluation state, cluster information, the scores of each of a plurality of quality feature values, the feature values constituting the feature vector for clustering, the clustering weight given to each feature value, the threshold for clustering, the quality weight given to each quality feature value, and the ranking according to the quality evaluation.

The database unit 202 groups images within periods set initially or by the user. Specifically, since clustering and quality evaluation are completed period by period, the database unit 202 separately stores an image group 502 for which clustering has been completed and an image group 504 for which it is incomplete. Accordingly, the image management application 200 can perform processing on the images belonging to the incomplete image group 504.

When the mobile terminal 100 is stopped while executing the image management application 200 and fails to process all the images belonging to the set period, the database unit 202 may store only the images that have already been grouped in the completed image group 502, leaving the remaining images of the period ungrouped.

Even when the image management application 200 is embedded in the cloud server 140 and the cloud server 140 cannot receive all the images within a predetermined period from the mobile terminal 100 because of a network failure while the application is running, the handling is substantially the same as above.

The feature value extracting unit 204 analyzes the compressed image format of each of the plurality of images of the incomplete image group 504 in the database unit 202 to extract feature values including at least the code stream of the header related to color information and the time information. The feature value extracting unit 204 generates, for each image, a feature vector composed of the extracted feature values.

The compressed image may be composed of a header including compression parameters and metadata generated in the image compression process, and a main body including the compressed pixel data of the image. The compression parameters may include color information, brightness information, coefficients related to sampling, and tables generated in the process of encoding the compressed data, and the metadata may include information such as the generation time of the image and the photographing position.

Among the feature values, the code stream of the header related to color information means the coefficients and tables relating to color information included in the compression parameters.

For example, when the image is a JPEG, the code stream of the header related to color information is the code stored in a Huffman table or a quantization table; when the image is a GIF, it is the LZW (Lempel-Ziv-Welch) code. If the image is a PNG or TIFF, the code stream may be a code in a Huffman table, similarly to JPEG.

Referring to FIG. 5, the format of a JPEG image, which is typically used as an image standard, is composed of a header, JPEG compressed data 616 containing the data generated by sampling the color information of the image pixels, performing a DCT (Discrete Cosine Transform), quantization, and entropy coding, and an EOI (End Of Image) marker 618 indicating the end of the image file.

The header includes an SOI (Start Of Image) marker 602 indicating the start of the image file, an APP segment 604, an application area in which metadata is recorded, a DQT (Define Quantization Tables) segment 606 holding the quantization tables, a DRI (Define Restart Interval) segment 610 defining the restart interval used to recover from corruption of the JPEG compressed data 616, a DHT (Define Huffman Tables) segment 608 holding the Huffman tables, an SOF (Start Of Frame) segment 612 storing the size and sampling information of each component, and an SOS (Start Of Scan) segment 614 identifying the Huffman table to be used by each component.

The feature value extracted from the JPEG image may be the quantization table stored in the DQT 606, or the code information associated with the color information in the Huffman table recorded in the DHT 608.

In the present embodiment, the DC code and the AC code of the Huffman table are employed as the code stream related to color information. In this case, for each image, the feature values such as the DC code, the AC code, and the time information may constitute a feature vector F_i as shown in [Formula 1] below (i denoting the i-th image).

$F_i = (DC_i,\; AC_i,\; Time_i)$
[Formula 1]

(where $DC_i$ is the DC code of the i-th image, $AC_i$ is the AC code of the i-th image, and $Time_i$ is the time information of the i-th image)
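As an illustration of [Formula 1], the sketch below pulls the DHT (Huffman table) payloads and a timestamp out of a JPEG file and packs them into a feature vector. It is a minimal sketch rather than the patented implementation: the byte-level parsing follows the standard JPEG marker layout described above, while the reduction of each table to a mean byte value and the use of the file modification time as $Time_i$ are illustrative assumptions.

```python
import struct
import os

def jpeg_huffman_tables(path):
    """Return raw DHT payloads, split into DC tables (class 0) and AC tables (class 1)."""
    dc, ac = [], []
    with open(path, "rb") as f:
        data = f.read()
    i = 2  # skip the SOI marker (0xFFD8)
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break
        marker = data[i + 1]
        length = struct.unpack(">H", data[i + 2:i + 4])[0]
        payload = data[i + 4:i + 2 + length]
        if marker == 0xC4:                      # DHT segment
            table_class = payload[0] >> 4       # upper nibble: 0 = DC, 1 = AC
            (dc if table_class == 0 else ac).append(payload)
        if marker == 0xDA:                      # SOS: compressed scan data follows
            break
        i += 2 + length
    return dc, ac

def feature_vector(path):
    """F_i = (DC_i, AC_i, Time_i); each table set is summarized as a mean byte value."""
    dc, ac = jpeg_huffman_tables(path)
    summarize = lambda tabs: sum(sum(t) for t in tabs) / max(1, sum(len(t) for t in tabs))
    return (summarize(dc), summarize(ac), os.path.getmtime(path))
```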

In the above description, only the time information is used as metadata for the feature values, but position information related to the place where the image was captured may also be acquired from the position sensor unit 116, attached to the metadata of the image, and used as a feature value.

In this embodiment, by using as the feature values for determining similarity the color-related code stream and the time information already stored in the header of the image format, the accuracy of the similarity determination between images can be improved.

In addition, if the compressed image were decoded and processed again to determine similarity, a large amount of computation would be required to recalculate the parameters of the image or to perform repetitive processing, so such a method is not suitable for a mobile terminal 100 with limited performance. Since the present embodiment uses the information in the header without additional processing of the image file, an efficient similarity determination can be performed with a small amount of computation.

The image clustering unit 206 determines the degree of similarity between images by comparing the distance between their feature vectors with a threshold, and clusters the images according to the result. In addition, to prevent over-clustering, the image clustering unit 206 determines the similarity between clusters by comparing the distance between cluster representative values with the threshold. The image clustering unit 206 may also calculate a personalized threshold and clustering weights from the user's editing of clusters, and may reflect them in the clustering of a subsequent period or the reclassification of completed clusters.

Specifically, the image clustering unit 206 may include an image determining unit 208, a cluster classifying unit 210, a cluster determining unit 212, a threshold estimating unit 214, and a clustering weight calculating unit 216.

The image determining unit 208 calculates the distance between the feature vectors of the first image and the next image in the image group 504 for which clustering and quality evaluation are not completed. If the distance is less than or equal to the threshold, it determines that both images are to be classified into the same cluster; if the distance exceeds the threshold, it determines that they are to be classified into different clusters.

The threshold may be initially designated in the image management application 200, or may be estimated based on the threshold used in the pre-classified clusters of the image group 502 for which clustering has been completed. When pre-classified clusters exist, a cluster representative value, which is the average of the feature vectors of the images in the cluster, is obtained for each cluster, and the threshold can be estimated from the difference between cluster representative values. The threshold could be estimated as the difference between the cluster representative values itself; however, by dividing the difference by a threshold definition coefficient, for example 2, as in [Formula 2], only images with high similarity are classified into the same cluster. Accordingly, images of a single event such as a birthday party or a graduation ceremony, or images that are still more similar, such as those of the same people against the same background, can be classified into one cluster. Images with high similarity are typically obtained by successively photographing the same people at the same event and background, and the image determining unit 208 may use a threshold such that only such consecutively captured images are classified into the same cluster. The threshold definition coefficient is not limited to 2: it may be made larger when the similarity between images is to be judged more strictly, and smaller when images of the same event and background are simply to be classified into one cluster.

$T_0 = \dfrac{\lVert R_m - R_n \rVert}{2}$
[Formula 2]

(where $T_0$ is the estimated threshold, and $R_m$, $R_n$ are the cluster representative values of clusters m, n belonging to the completed image group 502)
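A minimal sketch of the [Formula 2] estimate, assuming the feature vectors of each pre-classified cluster are held as numpy arrays:

```python
# Estimate the clustering threshold T0 from two pre-classified clusters,
# per [Formula 2]; the Euclidean norm and the default coefficient of 2
# follow the description above.
import numpy as np

def estimate_threshold(cluster_a, cluster_b, coeff=2.0):
    """cluster_a, cluster_b: lists of feature vectors of two completed clusters."""
    r_m = np.mean(cluster_a, axis=0)   # cluster representative value R_m
    r_n = np.mean(cluster_b, axis=0)   # cluster representative value R_n
    return np.linalg.norm(r_m - r_n) / coeff
```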

The distance between feature vectors can be calculated by a linear decision model based on the Euclidean or kernel distance between the feature values shown in [Formula 1], or by a nonlinear decision model.

When the distance between feature vectors is computed by the linear decision model, a clustering weight can be given to each feature value; that is, a clustering weight vector over the elements constituting the feature vector may be used.

Like the threshold, the clustering weights may be initially designated, or may be estimated based on the cluster representative values and the threshold used in the pre-classified clusters of the image group 502 for which clustering has been completed. The meaning of the cluster representative value can be understood from the description below.

When the clustering weights are obtained from the pre-classified clusters, they can be computed, for example, by [Formula 3] and [Formula 4] below. Since the result of existing clustering with which the user was satisfied is thereby reflected, the images can be clustered so as to better match the classification intention of the user.

$R_p \cdot W = T_0$
[Formula 3]

(where $R_p$ is the cluster representative value of cluster p, $W$ is the clustering weight vector, and $T_0$ is the threshold)

Expressing the p equations of [Formula 3] in matrix form as $\mathbf{R}\,W = \mathbf{T}$, the clustering weight vector $W$ can be calculated by [Formula 4].

$W = \mathbf{R}^{+}\,\mathbf{T}$
[Formula 4]

(where $W$ is the clustering weight vector, $\mathbf{R}^{+}$ is the pseudo-inverse of the matrix $\mathbf{R}$ whose rows are the cluster representative values, and $\mathbf{T}$ is the vector each of whose entries is the threshold $T_0$)
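Read this way, [Formula 3] and [Formula 4] amount to an ordinary linear solve for the weight vector. The sketch below uses a least-squares solution, which coincides with the pseudo-inverse reading of [Formula 4]; treating the system as a least-squares problem is an assumption made for illustration.

```python
# Solve the stacked system R W = T of [Formula 3] for the clustering weight
# vector W by least squares (W = R^+ T, as in [Formula 4]).
import numpy as np

def clustering_weights(representatives, t0):
    """representatives: (p, d) matrix whose rows are cluster representative values R_p."""
    R = np.asarray(representatives, dtype=float)
    T = np.full(R.shape[0], t0)                 # each equation equals the threshold T0
    W, *_ = np.linalg.lstsq(R, T, rcond=None)   # minimizes ||R W - T||
    return W
```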

When it is determined that the distance between the feature vectors of the first image and the next image is less than or equal to the threshold, the cluster classifying unit 210 classifies the compared images into the same cluster and records this in the database unit 202; if the threshold is exceeded, it classifies the images into different clusters and records this in the database unit 202.

If the first and next images are classified into the same cluster, the image determining unit 208 calculates the cluster representative value based on the feature vectors of the two images; the cluster representative value may be obtained as the average of the feature vectors. If the next image is classified into a different cluster, the cluster representative value for the first image may simply be its feature vector.

The image determining unit 208 then calculates the distance between the cluster representative value and the feature vector of the subsequent image adjacent to the images for which the similarity determination is complete. If the distance is less than or equal to the threshold, it determines that the subsequent image belongs to the same cluster; otherwise, it determines that the subsequent image is to be classified into a different cluster.

When it is determined that the distance between the feature vector of the subsequent image and the cluster representative value is less than or equal to the threshold, the cluster classifying unit 210 classifies the subsequent image into the same cluster and records this in the database unit 202; if the threshold is exceeded, the subsequent image is classified into a different cluster and recorded in the database unit 202.

The image determining unit 208 and the cluster classifying unit 210 repeat this process, classifying similar images into the same cluster until a subsequent image is judged dissimilar to the images already in the cluster, at which point a new cluster is allocated.

When a plurality of clusters have been created by this primary clustering of the images within the predetermined period, the cluster determining unit 212 selects two clusters from among them and determines whether the distance between their cluster representative values is less than or equal to the threshold. This comparison can be performed for every pair of clusters.

If the distance is less than or equal to the threshold, the cluster determining unit 212 merges the compared clusters into a single cluster and notifies the database unit 202 of the merge; if the threshold is exceeded, the primary clusters are maintained. In the case of a merge, the database unit 202 can assign the same cluster number to the merged clusters. This prevents over-clustering, in which similar images are classified into different clusters because of a disturbing image within a run of successive similar images.
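The primary clustering and the secondary merge pass described above can be condensed into the following sketch. The weighted Euclidean distance is the linear decision model mentioned earlier; the function names and data layout are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def weighted_distance(a, b, w):
    """Linear decision model: Euclidean distance with per-feature clustering weights."""
    return np.linalg.norm(np.asarray(w) * (np.asarray(a, dtype=float) - np.asarray(b, dtype=float)))

def cluster_images(vectors, threshold, weights):
    """Incrementally cluster time-ordered feature vectors, then merge close clusters."""
    clusters = [[vectors[0]]]                      # new cluster C_m for the first image
    reps = [np.asarray(vectors[0], dtype=float)]   # cluster representative values
    for v in vectors[1:]:
        if weighted_distance(v, reps[-1], weights) <= threshold:
            clusters[-1].append(v)                 # same cluster: update the representative
            reps[-1] = np.mean(clusters[-1], axis=0)
        else:
            clusters.append([v])                   # dissimilar: allocate a new cluster
            reps.append(np.asarray(v, dtype=float))
    merged = True                                  # secondary pass against over-clustering
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                if weighted_distance(reps[i], reps[j], weights) <= threshold:
                    clusters[i] += clusters.pop(j) # merge the pair into a single cluster
                    reps.pop(j)
                    reps[i] = np.mean(clusters[i], axis=0)
                    merged = True
                    break
            if merged:
                break
    return clusters
```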

The threshold estimating unit 214 may provide the image determining unit 208 with the threshold initially designated or estimated by [Formula 2], and may also re-estimate the threshold when the user edits the clusters. Editing the clusters means merging clusters whose classification has been completed, or dividing the images of one cluster into a plurality of clusters. When such editing has occurred, the threshold can be re-estimated by [Formula 5] below, in a manner similar to [Formula 2]. From the editing it is possible to estimate accurately the object composition and situation that the user wants classified as one cluster, and this personalized classification criterion is reflected in the clustering of images of a subsequent period or in the reclassification of already completed clusters. The division by 2, an example of the threshold definition coefficient in [Formula 5], was explained in connection with [Formula 2] and is not described again. The threshold definition coefficient may also be set by user adjustment, such as the threshold control 702 shown in the user interface of FIG. 13.

$T_1 = \dfrac{\lVert R'_m - R'_n \rVert}{2}$
[Formula 5]

(where $T_1$ is the re-estimated threshold, and $R'_m$, $R'_n$ are the updated cluster representative values of the clusters m, n involved in the editing)

When the clusters have been edited as described above, the clustering weight calculating unit 216 can recalculate the clustering weights based on the updated cluster representative values and the re-estimated threshold. For example, the clustering weight vector can be updated through [Formula 6] and [Formula 7] below, in a manner similar to [Formula 3] and [Formula 4]. In this way the object composition and situation that the user wants classified as one cluster can be estimated accurately, and the personalized classification criterion is reflected in the clustering.

$R'_{p'} \cdot W' = T_1$
[Formula 6]

(where $R'_{p'}$ is the cluster representative value of cluster p' after the editing, $W'$ is the clustering weight vector to be updated, and $T_1$ is the re-estimated threshold; the p' equations are expressed in matrix form as $\mathbf{R}'\,W' = \mathbf{T}'$)

$W' = \mathbf{R}'^{+}\,\mathbf{T}'$
[Formula 7]

(where $W'$ is the updated clustering weight vector and $\mathbf{T}'$ is the vector each of whose entries is $T_1$)

Meanwhile, the image evaluation unit 218 may include a ranking determination unit 220 and a quality weight calculation unit 222.

The ranking determining unit 220 calculates a plurality of quality feature values for each of the images classified into each cluster, evaluates the quality of the images based on the quality feature values to which the quality weights are assigned, ranks the images according to the evaluation result, and selects a representative image in the cluster.

The quality feature values may include the presence or absence of a face in the image, the sharpness in a predetermined area of the image around the face, the relative area occupied by the face in the image, the openness of the eyes, the signal-to-noise ratio, face distortion due to HDR (High Dynamic Range) processing or lens distortion, and the distance from the center of the image to the center of the face, and are not limited to these items.

The quality evaluation of images can use a weighted linear combination method, which requires little computation, or a neural network or SVM (Support Vector Machine) technique for more accurate evaluation.

In the case of the weighted linear combination method, the quality evaluation score for the j-th image in the image series of the i-th cluster can be calculated by [Formula 8] below.

$FS_i(j) = a_1\,Feat_1(j) + a_2\,Feat_2(j) + \cdots + a_N\,Feat_N(j)$
[Formula 8]

(where $Feat_1(j), Feat_2(j), \ldots, Feat_N(j)$ are the quality feature values of the j-th image, and $a_1, a_2, \ldots, a_N$ are the quality weights)

In addition, if there are j images in total in the cluster, the quality evaluation of all the images can be expressed by [Formula 9] below.

$\mathbf{FS} = \mathbf{F}\,\mathbf{A}$
[Formula 9]

(where $\mathbf{FS}$ is the vector of the quality scores of the j images in the cluster, $\mathbf{F}$ is the $j \times N$ matrix whose rows express [Formula 8] for each image, i.e. the quality feature values $(Feat_1, \ldots, Feat_N)$ of each image, and $\mathbf{A} = (a_1, \ldots, a_N)^{\mathsf{T}}$ is the vector of quality weights)
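A minimal sketch of the weighted linear combination of [Formula 8] and [Formula 9], assuming the quality feature values have already been computed into a j x N matrix:

```python
# Rank the images of one cluster by their quality scores FS = F A.
import numpy as np

def rank_cluster(features, quality_weights):
    """features: (j, N) matrix F; quality_weights: length-N vector A.
    Returns image indices ordered best-first; index 0 is the representative image."""
    FS = np.asarray(features, dtype=float) @ np.asarray(quality_weights, dtype=float)
    return list(np.argsort(-FS))   # descending quality score
```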

The quality weight calculating unit 222 initially supplies the designated values to the ranking determining unit 220, and when there is a direct ranking change instruction or an indirect activity of the user, it can recalculate and update the quality weights based on the rearranged ranking or the degree of intervention of the activity. When images in other clusters are evaluated, the updated quality weights are applied to their quality feature values, so that the images are ordered to match the user's personalized intentions.

An indirect activity, as distinguished from a direct ranking change operation, does not rearrange the ranked images displayed on the display unit 112, but is an action that reveals interest in a specific image. For example, the degree of intervention associated with indirect activities may include the number of times an image is shared on a social network service, whether an image is sent to another external medium, the number of times the image is modified (for example, the number of times a filter is applied), the number of times the image is browsed, and the browsing time, and may include any action consistent with the meaning described above.

 In the case of a direct ranking change or activity, the ranking determining unit 220 may re-evaluate the ranking to reselect the representative image, and the display unit 112 may present the re-ranked images in the cluster.

In the case of a direct ranking change, the ranking determining unit 220 readjusts the quality scores of the images of the corresponding cluster (see [Formula 9]) according to a predetermined rule, and the quality weight calculating unit 222 can calculate the vector of quality weights by substituting the readjusted quality scores into [Formula 9].

In the case of an indirect activity, the ranking determining unit 220 recalculates the quality scores of the cluster (see [Formula 9]) based on the accumulated activity, and the quality weight calculating unit 222 can calculate the vector of quality weights from the recalculated quality scores.
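Recovering the weight vector from readjusted or recalculated scores by way of [Formula 9] is again a least-squares solve; a sketch under the same assumptions as above:

```python
# Update the quality weight vector A so that F A best matches the user-adjusted
# scores FS', mirroring the inversion of [Formula 9] described above.
import numpy as np

def update_quality_weights(features, adjusted_scores):
    """features: (j, N) matrix F; adjusted_scores: length-j vector FS'."""
    F = np.asarray(features, dtype=float)
    FS = np.asarray(adjusted_scores, dtype=float)
    A, *_ = np.linalg.lstsq(F, FS, rcond=None)
    return A
```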

The class presenting unit 224 may include a class providing unit 226, a cluster changing unit 228, a ranking changing unit 230, a parameter changing unit 232, and an image editing unit 234.

The class providing unit 226 refers to the database unit 202, which records the clustering and ranking of the images performed by the image clustering unit 206 and the image evaluating unit 218, to display the images stored in the memory unit 118 grouped by cluster, with a thumbnail of the representative image displayed as an intuitive symbol of each cluster. In addition, when the user selects a cluster displayed on the display unit 112, the class providing unit 226 may expand the cluster and display its images in sorted order.

The cluster changing unit 228 receives the user's editing instruction, such as merging classified clusters or dividing a single cluster, and transmits the resulting cluster change information to the cluster classifying unit 210, the image determining unit 208, the threshold estimating unit 214, and the clustering weight calculating unit 216.

The ranking changing unit 230 may receive information related to the user's ranking changes or accumulated indirect activity, and may feed this information back to the ranking determining unit 220.

The parameter changing unit 232 receives values input through the user interface shown in FIG. 13 to adjust the degree of clustering (for example, the threshold and the clustering weights) and the degree to which the quality evaluation is reflected (the quality weights), and transmits them to the threshold estimating unit 214, the clustering weight calculating unit 216, and the quality weight calculating unit 222.

The image editing unit 234 may receive the number of images to be stored per cluster and their storage format, and display them on the display unit 112. The number of images may be set to a predetermined upper rank desired by the user, as shown in FIG. 14, and when an instruction regarding the storage format of the images shown in FIG. 14 is received, the images can be processed and stored at the corresponding compression ratio.

When a cluster is uploaded to another external medium such as the cloud server 140 or the social network server 300, the image editing unit 234 may likewise store only images of a predetermined upper rank, and a corresponding instruction can be received through the user interface shown in FIG. 14. Of course, the image editing unit 234 may also compress images of a predetermined upper rank before storing them.

The uploading unit 236 may upload the clustered images to another external medium, such as the cloud server 140 or the social network server 300, at the user's request or by synchronization. When the image management application 200 is embedded in the cloud server 140, the clustered images stored in the cloud server 140 may be transmitted to the mobile terminal 100 and stored there. When transmitting images of a predetermined upper or lower rank to the social network server 300, the uploading unit 236 may transmit the images without compression processing, or, to conform to the compression size required by the social network or the like, may transmit images compressed by the image editing unit 234.

The social network can receive only images of a predetermined rank for each cluster from the uploading unit 236; the ranking information may also be provided by the uploading unit 236 so that the images in each cluster can be sorted and presented. Accordingly, other users accessing the social network service can browse the good-quality images among the clustered images, enhancing convenience.

Hereinafter, a method for image management, which is an embodiment of the present invention performed in the image management application 200, will be described with reference to FIGS. 1 to 8D.

FIGS. 6A and 6B are flowcharts of the clustering process of images, FIG. 7 is a view showing the merging of clusters to prevent over-clustering, and FIGS. 8A to 8D show a user interface for the clustering process.

First, as shown in FIG. 8A, when the user activates clustering through the operation unit 102 while a plurality of images for a predetermined period are displayed on the display unit 112, the image management application 200 is run. The progress of the clustering and the quality evaluation can be checked on the status bar displayed on the display unit 112.

When the image management application 200 runs, the database unit 202 shown in FIG. 4 is searched to check whether there is an image group 504 for which clustering and quality evaluation are incomplete. If such an incomplete image group 504 for a predetermined period exists, the feature value extracting unit 204 analyzes the compressed image format of each of the corresponding images stored in the memory unit 118 and extracts feature values including at least the code stream of the header related to color information and the time information. The feature value extracting unit 204 generates feature vectors F_i to F_n composed of the extracted feature values for each image (S402).

Hereinafter, the JPEG image format is described as an example for convenience of explanation, but the code stream of the header related to color information and the time information can be extracted from the other image formats as described above.

Specifically, the feature value extracting unit 204 extracts, as the code stream of the header related to color information, the DC code and the AC code included in the Huffman table recorded in the DHT 608 shown in FIG. 5, together with the time information stored in the APP 604. The feature value extracting unit 204 then generates, for each of the images, a feature vector F_i to F_n of the DC code, the AC code, and the time information, as in [Formula 1].

In the above description, only the time information is used as metadata for the feature values, but position information related to the place where the image was captured may also be acquired from the position sensor unit 116, attached to the metadata of the image, and used as a feature value.

Next, the threshold estimating unit 214 provides, as the threshold for the similarity determination, either an initially designated value or a value estimated based on the threshold used in the pre-classified clusters of the image group 502 for which clustering has been completed (S404). When pre-classified clusters exist, as shown in FIG. 4, a cluster representative value, which is the average of the feature vectors of the images in the cluster, is obtained for each cluster, and the threshold is estimated through [Formula 2] from the difference between the cluster representative values. Accordingly, images can be clustered to match the image composition and situation desired by the user.

The cluster classifying unit 210 generates a new cluster C_m for the first image, associated with the feature vector F_i, and the image determining unit 208 denotes the feature vector of F_i associated with the cluster C_m as F(C_m, i) (S406). The cluster classifying unit 210 records in the database unit 202 that the first image is classified into the cluster C_m, and the image determining unit 208 sets the feature vector F(C_m, i) as the cluster representative value.

Then, the image determining unit 208 determines whether the distance between the feature vector F(C_m, i) and the feature vector F_{i+1} of the next image, with the clustering weight applied to each feature value, is less than or equal to the threshold (S408).

Here, when the distance between feature vectors is calculated based on the Euclidean distance, a kind of linear decision model, a clustering weight may be given to each feature value as a clustering weight vector.

The clustering weights, like the threshold, may be initially designated or estimated based on the cluster representative values and the threshold used in the pre-classified clusters of the image group 502 for which clustering has been completed, as shown in FIG. 4. When the clustering weights are obtained from the pre-classified clusters, they can be computed, for example, by [Formula 3] and [Formula 4] above. Since the result of existing clustering with which the user was satisfied is thereby reflected, the images can be clustered so as to better match the classification intention of the user.

If the distance is less than or equal to the threshold, the image determining unit 208 determines that the image related to the feature vector F_{i+1} belongs to the same cluster C_m, and denotes the feature vector of F_{i+1} associated with the cluster C_m as F(C_m, i+1) (S410). The image determining unit 208 then calculates the average of the feature vectors F(C_m, i) and F(C_m, i+1) to update the cluster representative value.

If the distance exceeds the threshold, the image determining unit 208 determines that the first image and the next image, related to the feature vectors F_i and F_{i+1}, are to be classified into different clusters C_m and C_{m+1}, respectively (S412). The cluster classifying unit 210 records in the database unit 202 that the first and next images belong to different clusters, and the image determining unit 208 denotes the feature vector of F_{i+1} associated with C_{m+1} as F(C_{m+1}, i+1), and sets the feature vectors F(C_m, i) and F(C_{m+1}, i+1) as the representative values of their respective clusters. If there are further subsequent images after step S412, the process is repeated from step S408.

Next, the image determining unit 208 determines whether the distance between the feature vector F_{i+2} of the subsequent image and the cluster representative value based on the feature vectors F(C_m, i) and F(C_m, i+1) is less than or equal to the threshold (S414). The clustering weight used in step S408 may equally be applied to the feature vector F_{i+2} and the cluster representative value.

If the distance is less than or equal to the threshold, the image determining unit 208 determines that the subsequent image related to the feature vector F_{i+2} belongs to the same cluster C_m, and denotes the feature vector of F_{i+2} associated with the cluster C_m as F(C_m, i+2) (S416). Further, the image determining unit 208 calculates the average of the previous representative value and the feature vector F(C_m, i+2) to update the cluster representative value.

If the distance exceeds the threshold, the image determining unit 208 determines that the images related to the feature vectors F(C_m, i) and F(C_m, i+1) remain classified in the cluster C_m, and that the image related to F_{i+2} is to be classified into C_{m+1} (S418). The cluster classifying unit 210 records in the database unit 202 that the subsequent image belongs to a different cluster, and the image determining unit 208 denotes the feature vector of F_{i+2} associated with C_{m+1} as F(C_{m+1}, i+2), and sets the average of the feature vectors F(C_m, i) and F(C_m, i+1), and F(C_{m+1}, i+2), as the representative values of the respective clusters. If there are further subsequent images after step S418, the process is repeated from step S408.

If there are more images (S420), the process returns to step S414 to repeat the comparison between the threshold and the distance from the cluster representative value of the images so far to the next image.

The above similarity determination based on the distance between the cluster representative value and the feature vector of the subsequent image clusters images of similar composition and situation more reliably than a determination based only on the distance between the feature vectors of adjacent images.

If there are no more images, the cluster classifying unit 210 completes the primary classification of the plurality of images into clusters and stores the clusters in the database unit 202 (S424).

As a result of searching the database unit 202, the cluster determination unit 212 determines whether a plurality of clusters exist for the clustering-completed images (S426).

If there are a plurality of clusters, the cluster determining unit 212 selects two clusters C_m and C_{m+1} among them and determines whether the distance between their cluster representative values is less than or equal to the threshold (S428). This comparison can be performed for every pair of clusters.

If the distance is less than or equal to the threshold, the cluster determining unit 212 merges the compared clusters C_m and C_{m+1} into a single cluster and notifies the database unit 202 of the merge (S430). If the threshold is exceeded, the cluster determining unit 212 maintains the primarily completed clusters as they are (S432).

An example in which the cluster determining unit 212 merges clusters C_m and C_{m+1} whose distance is below the threshold into a single cluster is shown in FIG. 7. This prevents over-clustering, in which similar images are classified into different clusters because of a disturbing image within a run of successive similar images.

As shown in FIG. 8B, the class providing unit 226 refers to the cluster information and the like of the database unit 202 and displays the images stored in the memory unit 118 on the display unit 112 grouped by cluster, with a thumbnail of the representative image displayed as an intuitive symbol of each cluster. Further, when the user selects a cluster displayed on the display unit 112, the class providing unit 226 can display the images in the cluster in sorted order, as in FIG. 8C. When the user selects one of the sorted images, the original image can be browsed, as shown in FIG. 8D.

With reference to FIGS. 9 to 10B, the editing of a cluster, which is a further embodiment of the image management method, and the recalculation of the threshold value and the clustering weight according to the editing will be described. FIG. 9 is a flowchart of the case where a cluster is edited by a user's operation, and FIGS. 10A and 10B are diagrams illustrating the user interface when a cluster is edited by the user.

As shown in FIG. 10A, the user selects a specific cluster C_m (referred to as C_1 in the following description) to browse the images belonging to the cluster C_1, and then selects at least one of the images (S902).

When an image is selected, the cluster changing unit 228 receives an instruction to divide the cluster C_1 and forwards it to the cluster classification unit 210, the image determining unit 208, the threshold value estimating unit 214 and the clustering weight calculating unit 216.

The cluster classification unit 210 classifies the selected images 2 and 3 into a new cluster C_6, classifies the unselected images 1 to 4 into the existing cluster C_1, and records the result in the database unit 202 (S904).

The class providing unit 226 refers to the changed cluster information and the like of the database unit 202 and causes the display unit 112 to display the existing and new clusters C_1 and C_6 separately, as shown in FIG.

Next, the image determining unit 208 recalculates the representative values of the existing and new clusters C_1 and C_6, and the threshold value estimating unit 214 re-estimates and updates the threshold value based on the recalculated cluster representative values (S906). The re-estimation of the threshold value can be performed by [Equation 5].

Then, the clustering weight calculation unit 216 calculates and updates the clustering weight based on the re-estimated threshold value and the representative values of all the clusters, including the changed clusters (S908). The clustering weight can be calculated by [Equation 6] and [Equation 7].
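[Equation 5] through [Equation 7] are not reproduced here, so the sketch below only illustrates the data flow of S906 and S908: the representatives of the edited clusters are recomputed, and the threshold value is re-estimated from the distances between cluster representatives. The specific estimator, half the minimum inter-representative distance so that the clusters the user has just separated would not be re-merged, is an assumption for illustration, not the patent's formula.

```python
def reestimate_threshold(clusters):
    """Re-estimate the clustering threshold after a user edit (cf. S906).

    ASSUMPTION: the patent computes this by [Equation 5], which is not
    shown; half the minimum distance between cluster representatives is
    used here so the user-separated clusters stay apart.
    """
    reps = [c["rep"] for c in clusters]
    dists = [np.linalg.norm(reps[a] - reps[b])
             for a in range(len(reps)) for b in range(a + 1, len(reps))]
    return 0.5 * min(dists) if dists else None
```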

According to this embodiment, the object composition and situation that the user wishes to classify into one cluster can be estimated more accurately, and this personalized classification criterion can be reflected in the clustering of images of subsequent periods.

In the above-described embodiment, editing in the form of cluster division has been described; however, when the user merges the clusters C_2 and C_3 shown in FIG. 10A, the processes of S906 and S908 are likewise performed, whereby the threshold value and the clustering weight are recalculated and updated.

With reference to FIGS. 8A to 8D and 11, the quality evaluation of images, the direct change of image ranking, and the updating of quality weights, which are another further embodiment of the image management method, will be described. FIG. 11 is a flowchart of evaluating the quality of images and changing the image ranking by direct manipulation of the user.

The ranking determining unit 220 calculates a plurality of quality feature values for each of the images classified into the cluster C_m (S1102).

The quality feature values include the presence or absence of a face in the image, the sharpness in a predetermined area of the image based on the face, the relative area occupied by the face in the image, the openness of the eyes, the signal-to-noise ratio, the presence or absence of face distortion due to HDR (High Dynamic Range) processing, and the distance from the center of the image to the center of the face, and are not limited to the above-described items.

The ranking determining unit 220 evaluates the quality of the images based on the quality feature values to which the quality weights are assigned, ranks the images according to the evaluation results, and selects a representative image in the cluster (S1104).

Here, the quality evaluation of images is performed by a weighted linear combination method as an example, and according to this method it can proceed by [Equation 8] and [Equation 9].
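[Equation 8] and [Equation 9] are not reproduced here, but the weighted linear combination itself is straightforward: the quality score of an image is the dot product of its quality feature values with the quality weight vector, and the images of a cluster are ranked by score. A minimal sketch follows; the fixed feature order in the comment is an assumed convention.

```python
import numpy as np

# Assumed fixed order of quality feature values per image, e.g.:
# [face_present, sharpness, face_area_ratio, eye_openness, snr,
#  hdr_distortion_free, face_centeredness]

def quality_score(q, w):
    """Weighted linear combination of quality feature values (cf. [Equation 8])."""
    return float(np.dot(w, q))

def rank_cluster(feature_rows, w):
    """Rank one cluster's images by score; the first index is the representative."""
    scores = [quality_score(q, w) for q in feature_rows]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
```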

The quality weight calculation unit 222 may provide the ranking determining unit 220 with quality weights that are initially specified, or with the values used for the clustering-completed image group 502 shown in FIG.

After the quality evaluation of the images of the cluster C_m is completed, if the next cluster C_{m+1} exists (S1106), steps S1102 and S1104 are repeated.

After the above process is completed for all the clusters, the ranking determining unit 220 provides the ranking of the images for each cluster to the database unit 202, and the class providing unit 226 refers to the cluster information of the database unit 202 and can display the plurality of images on the display unit 112 grouped by cluster, as shown in FIG. 8B. As shown in FIG. 8C, when the user selects the cluster C_1 displayed on the display unit 112, the class providing unit 226 can display the images in the cluster C_1 arranged in order of ranking.

Next, the ranking changing unit 230 detects whether the user directly changes the ranking of a specific image in the cluster C_m (S1108).

A user operation for changing the order of the images may be, for example, moving a particular image in the cluster C_1 around within the screen area of the cluster C_1 shown in FIG. 8B. If it is determined that the ranking has been directly changed, the ranking changing unit 230 rearranges the images in the cluster C_m in the new order and acquires a representative image (S1110). The ranking changing unit 230 provides the ranking change information to the database unit 202, so that when the user browses the images of the cluster C_m, the images are sorted according to the changed ranking.

The ranking determining unit 220 readjusts the quality scores of the images whose ranking was changed in the cluster C_m according to a predetermined rule, with reference to the rearranged ranking, and the quality weight calculation unit 222 calculates a vector of quality weights from the readjusted quality scores by [Equation 9] (S1112).
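Since [Equation 9] is not shown, the sketch below substitutes one conventional choice for recovering a quality-weight vector from the readjusted scores, a least-squares fit; this is an illustrative assumption, not the patent's formula.

```python
def refit_quality_weights(feature_rows, target_scores):
    """Fit a weight vector w so that feature_rows @ w approximates the
    quality scores implied by the user's rearranged ranking (cf. S1112).

    ASSUMPTION: stands in for [Equation 9], which is not reproduced.
    """
    X = np.asarray(feature_rows, dtype=float)   # one row of features per image
    y = np.asarray(target_scores, dtype=float)  # readjusted quality scores
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w
```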

With reference to FIGS. 8A to 8D and 12, the quality evaluation of images, the ranking change according to indirect activity, and the updating of quality weights, which are still another embodiment of the image management method, will be described. FIG. 12 is a flowchart of evaluating the quality of images and changing the image ranking by the degree of intervention of user activity.

The calculation of quality feature values (S1202), the evaluation and ranking of images (S1204), and the progress of the quality evaluation over all clusters (S1206) are substantially the same as S1102, S1104 and S1106 described above.

Next, the ranking changing unit 230 detects whether there is an indirect activity of the user on at least one of the ranked images in the cluster C_m (S1208).

Depending on the type of activity, the ranking changing unit 230 can determine that an activity exists even if it has accumulated only once, or, depending on the user's designation, can determine that an activity exists only when it has occurred a certain number of times or more.

Next, the ranking determining unit 220 recalculates the quality scores of the images in the cluster based on the degree of intervention of the activity to rearrange the ranking of the images, and acquires, as a representative image, the image ranked first in the cluster C_m according to the rearranged ranking (S1210).

The degree of intervention of the indirect activity may be calculated including, for example, at least one of the number of times the image is shared on the social network service, whether or not the image is transmitted to another external medium, the number of times the specific image is modified, the number of times the image is viewed, and the viewing time of the image, and can include any action that is consistent with the meaning of activity.
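As a concrete illustration, the degree of intervention for one image could be aggregated from the listed activities as follows; the activity keys and linear coefficients are assumptions, since the patent leaves the exact aggregation open.

```python
def intervention_degree(activity, coeffs=None):
    """Aggregate a user's indirect activity on one image into a single degree.

    activity: dict of counts such as shares, external_sends, edits, views,
    and view_seconds. The coefficients are illustrative placeholders.
    """
    coeffs = coeffs or {"shares": 3.0, "external_sends": 2.0,
                        "edits": 2.0, "views": 0.5, "view_seconds": 0.01}
    return sum(c * activity.get(k, 0) for k, c in coeffs.items())
```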

The quality weight calculation unit 222 calculates the quality weights based on the recalculated quality scores (S1212). The quality weights are calculated by [Equation 9].

When the updated quality weights are reflected in the quality evaluation of subsequent images in accordance with the change in the ranking of the images, the user can cause images having the desired composition and color to be prioritized. In addition, since the quality of images can be evaluated according to the degree of intervention of activities related to the user's interest in the images, the user's preference, which changes from time to time, can be actively reflected without direct ranking operations.

FIG. 13 is a diagram illustrating a user interface for adjusting the parameters used for cluster classification and image quality evaluation.

The user interface can accept input values so that the user can directly adjust the threshold value 702, the clustering weight 704 and the quality feature values 706, such as the threshold definition coefficient, which affect cluster classification and quality evaluation. These values may be input to the parameter changing unit 232 and transmitted to the threshold value estimating unit 214, the clustering weight calculating unit 216 and the quality weight calculation unit 222.

With reference to FIG. 14, the determination of the selection and storage form of images stored in or uploaded to the medium, which is another embodiment of the image management method, will be described.

FIG. 14 is a diagram illustrating a user interface for determining the selection and storage form of images to be stored in a medium.

The user can access the user interface shown in FIG. 14 through the environment settings. In this user interface, the item 802 related to the mobile terminal 100 can receive settings such as the ratio of images per cluster to be stored in the mobile terminal 100 and the ratio of images to be processed at a compression ratio lower than that of the original image. The items 804 associated with the cloud server 140 or the social network server 300 are also substantially the same as the item 802 of the mobile terminal 100.

When the image editing unit 234 receives input through the items 802 and 804, it can receive the number and storage form of the images to be stored per cluster, store the number of images of a predetermined higher rank desired by the user, and store images of a predetermined lower rank at a compression ratio lower than that of the corresponding original images. As a result, the efficiency of image management can be maximized.
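A sketch of how the image editing unit 234 might realize this policy, assuming the Pillow library and JPEG output; keeping the top-ranked images untouched and re-encoding the remainder at a reduced JPEG quality is one way to store lower-ranked images in a more compact form.

```python
from PIL import Image  # Pillow, assumed available

def apply_storage_policy(ranked_paths, keep_top, low_quality=40):
    """Keep top-ranked images as-is; re-encode the rest more compactly.

    ranked_paths: image file paths sorted by cluster ranking (best first).
    keep_top: how many higher-ranked images to store in original form.
    low_quality: JPEG quality used for the remaining lower-ranked images.
    """
    for rank, path in enumerate(ranked_paths):
        if rank < keep_top:
            continue  # higher-ranked images are stored unmodified
        out = path.rsplit(".", 1)[0] + "_small.jpg"
        Image.open(path).convert("RGB").save(out, "JPEG", quality=low_quality)
```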

When clusters are uploaded to another external medium such as the cloud server 140 or the social network server 300, the image editing unit 234 can receive a corresponding instruction through the user interface shown in FIG. 14, so that only images of a predetermined higher rank are stored in their original form or in a further compressed form, or images of a predetermined lower rank are processed and stored at a low compression ratio.

When the clustered images are uploaded to the social network server 300 or the like by the upload unit 236 at the user's request or by synchronization, images of a predetermined higher or lower rank can be transmitted without compression processing, or compressed images may be transmitted so as to conform to a required compression size or the like and to reduce the transmission load.

The social network can receive only images of a predetermined rank for each cluster from the upload unit 236; in addition, the ranking information may be provided from the upload unit 236 so that the images in each cluster are sorted and presented accordingly. Thus, other users accessing the social network service can browse good-quality images among the clustered images, which increases convenience for users.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention. Therefore, the scope of the present invention should not be limited to the above-described embodiments, but should be determined by the appended claims and all changes or modifications derived from their equivalents.

100: mobile terminal 140: cloud server
200: image management application 202: database unit
204: feature value extraction unit 206: image clustering unit
218: image evaluation unit 224: class providing unit
236: upload unit 300: social network server

Claims (17)

1. An image management application stored in a medium in combination with hardware to execute the steps of:
Generating a feature vector composed of feature values including at least a code stream of a header related to color information and time information, for each of a plurality of images within a predetermined period;
Classifying a first image and an adjacent image into the same cluster and calculating a cluster representative value based on the feature vectors of the images, when the distance between the feature vectors of the first image and the adjacent image is less than or equal to a threshold value;
Classifying a subsequent image into the same cluster and updating the cluster representative value based on the feature vector of the subsequent image and the cluster representative value, when the distance between the feature vector of the subsequent image and the cluster representative value is less than or equal to the threshold value; and
Merging two clusters into a single cluster when the plurality of images are classified into a plurality of clusters and the distance between the cluster representative values of two of the plurality of clusters is less than or equal to the threshold value, and maintaining the plurality of clusters when the threshold value is exceeded.
2. The image management application according to claim 1,
wherein the threshold value is estimated based on threshold values used in pre-classified clusters before the step of generating the feature vector.
3. The image management application according to claim 1,
wherein a clustering weight is given according to the feature values constituting the feature vector when calculating the distance between the feature vectors and the distance between the feature vector of the subsequent image and the cluster representative value.
4. The image management application according to claim 1,
further comprising, after the step of merging into the single cluster when less than or equal to the threshold value and maintaining the plurality of clusters when the threshold value is exceeded:
Updating a cluster representative value when a user creates a new cluster for at least one image among the images classified into the same cluster, or merges a plurality of clusters into one cluster; and
Estimating the threshold value based on the updated cluster representative value.
5. The image management application according to claim 4,
wherein clusters for a plurality of images of a subsequent period are classified using the estimated threshold value.
6. The image management application according to claim 4,
further comprising, when a clustering weight given according to the feature values constituting the feature vector is set for calculating the distance between the feature vectors and the distance between the feature vector of the subsequent image and the cluster representative value, updating the clustering weight based on the estimated threshold value,
wherein clusters for a plurality of images of a subsequent period are classified by the estimated threshold value and the updated clustering weight.
7. The image management application according to claim 1,
wherein, when the image is stored in digital form and the format of the image is JPEG (Joint Photographic Experts Group), the code stream of the header related to the color information is a DC code and an AC code stored in a Huffman table; when the format of the image is PNG (Portable Network Graphics), the code stream of the header related to the color information is a color-data-related code stored in a Huffman table; and when the format of the image is GIF (Graphics Interchange Format), the code stream of the header related to the color information is a color-data-related code stored as an LZW (Lempel-Ziv-Welch) code.
8. The image management application according to claim 1,
wherein the cluster representative value comprises an average value for each feature value constituting the feature vectors of the images classified into the same cluster.
9. The image management application according to claim 1,
wherein the medium is any one of a mobile terminal, a computer, a cloud server, and a social network server.
10. The image management application according to claim 1,
further comprising, after the step of merging into the single cluster when less than or equal to the threshold value and maintaining the plurality of clusters when the threshold value is exceeded:
Calculating a plurality of quality feature values for each of the images classified into each cluster; and
Ranking the images in the cluster according to the result of evaluating the images based on the quality feature values to which quality weights are assigned, and selecting a representative image in the cluster.
11. The image management application according to claim 10,
further comprising, after ranking the images in the cluster and selecting the representative image:
Acquiring a representative image in response to an instruction of a user to directly rearrange the ranking of the images in the cluster;
Calculating the quality weights based on the rearranged ranking; and
Applying the calculated quality weights to the quality feature values when evaluating images of other clusters.
12. The image management application according to claim 10,
further comprising, after ranking the images in the cluster and selecting the representative image:
Rearranging, when there is an indirect activity of the user on the images in the cluster, the ranking of the images in the cluster based on a degree of intervention of the activity, and acquiring a representative image;
Calculating the quality weights based on the degree of intervention of the activity; and
Applying the calculated quality weights to the quality feature values when evaluating images of other clusters.
13. The image management application according to claim 12,
wherein the degree of intervention of the activity is calculated including at least one of the number of times the image is shared on a social network service, the number of times the image is modified, the number of times the image is viewed, and the viewing time of the image.
14. The image management application according to claim 10,
wherein the quality feature values include at least one of the presence or absence of a face in the image, the sharpness in a predetermined region of the image based on the face, the relative area occupied by the face in the image, the openness of the eyes, the signal-to-noise ratio, the presence or absence of face distortion due to HDR (High Dynamic Range) processing, and the distance from the center of the image to the center of the face.
15. The image management application according to claim 10,
further comprising, after ranking the images in the cluster and selecting the representative image in the cluster:
Storing only images of a predetermined higher rank on the medium, or uploading them to another external medium.
16. The image management application according to claim 10,
further comprising, after ranking the images in the cluster and selecting the representative image in the cluster:
Converting images of a predetermined lower rank into images having a compression ratio lower than that of the original images; and
Storing the images having the low compression ratio on the medium, or uploading them to another external medium.
17. A method of managing images, the method comprising the steps of:
Generating a feature vector composed of feature values including at least a code stream of a header related to color information and time information, for each of a plurality of images within a predetermined period;
Classifying a first image and an adjacent image into the same cluster and calculating a cluster representative value based on the feature vectors of the images, when the distance between the feature vectors of the first image and the adjacent image is less than or equal to a threshold value;
Classifying a subsequent image into the same cluster and updating the cluster representative value based on the feature vector of the subsequent image and the cluster representative value, when the distance between the feature vector of the subsequent image and the cluster representative value is less than or equal to the threshold value; and
Merging two clusters into a single cluster when the plurality of images are classified into a plurality of clusters and the distance between the cluster representative values of two of the plurality of clusters is less than or equal to the threshold value, and maintaining the plurality of clusters when the threshold value is exceeded.