CN106919326A - Image search method and device - Google Patents
Image search method and device
- Publication number
- CN106919326A CN106919326A CN201510997392.8A CN201510997392A CN106919326A CN 106919326 A CN106919326 A CN 106919326A CN 201510997392 A CN201510997392 A CN 201510997392A CN 106919326 A CN106919326 A CN 106919326A
- Authority
- CN
- China
- Prior art keywords
- picture
- search
- dimension
- searching
- display area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/53—Querying
- G06F16/532—Query formulation, e.g. graphical querying
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
This application discloses a picture search method and device. After a first operation directed at a first picture is detected, the search dimension corresponding to the first picture is determined, and a second picture that matches the first picture in that search dimension is found accordingly. A search for similar pictures is thus carried out automatically, triggered by the first operation. Compared with searching for pictures manually, the user's operations are simpler and take less time, and power consumption is correspondingly lower.
Description
Technical Field
The present application relates to the field of communications technologies, and in particular, to a method and an apparatus for searching for a picture.
Background
With the rapid development of mobile communication technology and the arrival of the mobile multimedia era, the mobile phone has evolved from a simple communication tool into an intelligent device that serves as a mobile platform for collecting and processing information.
A mobile phone can now store a large number of pictures, for example pictures taken with the phone's camera or obtained in other ways.
When a user wants to find pictures that share a certain characteristic, the user has to inspect and compare the stored pictures one by one. This is laborious and time-consuming, and correspondingly increases the phone's power consumption. As phone storage capacities grow, picture collections become larger and the problem becomes more pronounced.
Summary
To solve the above problem, embodiments of the present application provide a picture search method and device.
The picture search method provided by the embodiments of the application comprises the following steps:
acquiring a picture search instruction, where the picture search instruction is generated according to a detected first operation directed at a first picture;
determining a search dimension corresponding to the first picture;
searching, according to the search dimension corresponding to the first picture, for a second picture that matches the first picture in that search dimension;
and outputting a search result.
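The four steps above can be sketched end to end. This is a minimal Python illustration, not the patent's implementation: `Picture`, `search_pictures`, and the metadata keys are hypothetical names, and simple metadata equality stands in for the patent's unspecified matching logic.

```python
from dataclasses import dataclass, field

@dataclass
class Picture:
    name: str
    metadata: dict = field(default_factory=dict)

def determine_search_dimensions(first_picture, default=("object",)):
    # Step 2: use the user's selection if present, otherwise a default set.
    return list(first_picture.metadata.get("selected_dimensions", default))

def matches(first, candidate, dimension):
    # Step 3 helper: match when both pictures agree on the dimension's metadata.
    return first.metadata.get(dimension) == candidate.metadata.get(dimension)

def search_pictures(first_picture, library):
    """Steps 1-4: instruction received -> dimensions -> search -> results."""
    dims = determine_search_dimensions(first_picture)
    return [p for p in library
            if p is not first_picture
            and all(matches(first_picture, p, d) for d in dims)]
```

A real implementation would replace the equality test with per-dimension comparison logic; the pipeline shape stays the same.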
Preferably, the first operation on the first picture comprises:
performing a first screen touch operation in a screen display area of the first picture; or,
obtaining the first picture through scanning; or,
obtaining the first picture by shooting; or,
pasting the first picture in a search box; or,
selecting an option for triggering a picture search in a picture operation option menu, where the picture operation option menu pops up after a second screen touch operation performed in the screen display area of the first picture is detected; or,
selecting, in the screen display area of the first picture or in its preview window, a UI component for triggering a picture search.
The first screen touch operation performed in the screen display area of the first picture includes: a multi-finger zoom sliding operation in the screen display area of the first picture; or a single-finger sliding operation in a set direction in the screen display area of the first picture.
Preferably, determining the search dimension corresponding to the first picture includes one of the following:
acquiring a default search dimension and determining it as the search dimension corresponding to the first picture, where the default search dimension comprises one or more search dimensions; or,
determining a selected search dimension as the search dimension corresponding to the first picture when an option for that search dimension is chosen from a pop-up menu for the first picture; the pop-up menu appears after a third screen touch operation performed in the screen display area of the first picture is detected, and each search-dimension option in the menu corresponds to one or more search dimensions; or,
determining the search dimension corresponding to the detected first operation according to the type of the first operation and a preset correspondence between first-operation types and search dimensions, where a first operation corresponds to one or more search dimensions.
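The third alternative, a preset correspondence between first-operation types and search dimensions, amounts to a lookup table. A sketch under assumed operation names and dimension assignments; none of these specific pairings are given by the patent:

```python
# Hypothetical mapping; the patent only requires that some preset
# correspondence between operation types and search dimensions exists.
OPERATION_TO_DIMENSIONS = {
    "pinch_slide": ["object"],
    "single_finger_down": ["shooting_time", "shooting_location"],
    "paste_in_search_box": ["object", "hue"],
}

def dimensions_for_operation(operation_type, default=("object",)):
    """Look up the preset operation -> dimension correspondence,
    falling back to a default dimension set for unknown operations."""
    return OPERATION_TO_DIMENSIONS.get(operation_type, list(default))
```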
Preferably, the search dimension comprises one or a combination of the following dimensions:
a search dimension based on objects in the picture;
a search dimension based on the hue of the picture;
a search dimension based on the shooting time of the picture;
a search dimension based on the shooting location of the picture;
a search dimension based on the capture device or source of the picture.
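The metadata-based dimensions in this list (shooting time, capture device or source) reduce to simple predicates over picture metadata. An illustrative sketch only; the 24-hour window and the field names are assumptions, not values from the patent:

```python
from datetime import datetime, timedelta

def same_shooting_session(meta_a, meta_b, window_hours=24):
    """Shooting-time dimension: match when timestamps fall within a window.
    The 24-hour window is an illustrative threshold."""
    t_a = datetime.fromisoformat(meta_a["shot_at"])
    t_b = datetime.fromisoformat(meta_b["shot_at"])
    return abs(t_a - t_b) <= timedelta(hours=window_hours)

def same_capture_device(meta_a, meta_b):
    """Capture-device/source dimension: exact match on the device field."""
    return meta_a.get("device") == meta_b.get("device")
```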
When the search dimension includes a dimension based on objects in the picture, searching for a second picture that matches the first picture in that dimension includes:
extracting feature information of the first picture;
and searching for a second picture according to the extracted feature information, where the feature information of the second picture matches the feature information of the first picture.
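The patent leaves the feature extraction unspecified; a coarse per-channel color histogram is one classic stand-in, used here purely for illustration:

```python
def color_histogram(pixels, bins=4):
    """Reduce a list of RGB pixels to a coarse per-channel histogram.
    A stand-in for the unspecified feature extraction in the patent."""
    hist = [[0] * bins for _ in range(3)]
    for pixel in pixels:
        for channel, value in enumerate(pixel):
            hist[channel][min(value * bins // 256, bins - 1)] += 1
    return hist

def histogram_distance(h1, h2):
    """L1 distance between two histograms; smaller means more similar."""
    return sum(abs(a - b) for c1, c2 in zip(h1, h2) for a, b in zip(c1, c2))
```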
When the first operation directed at the first picture is a first screen touch operation performed in the screen display area of the first picture, extracting the feature information of the first picture according to its pixels includes:
acquiring the action region of the first screen touch operation from the touch operation performed in the screen display area of the first picture;
and extracting the feature information of the first picture within that action region.
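Restricting extraction to the action region means cropping the first picture to the area the touch operation covered. A sketch, where the bounding-box padding of 10 pixels and the row-major image layout are illustrative choices:

```python
def action_region(touch_points, pad=10):
    """Bounding box of the touch trajectory, padded on all sides.
    touch_points: list of (x, y) screen coordinates."""
    xs = [x for x, _ in touch_points]
    ys = [y for _, y in touch_points]
    return (min(xs) - pad, min(ys) - pad, max(xs) + pad, max(ys) + pad)

def crop(image, box):
    """image: 2D list of pixels indexed [row][col]; box: (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = box
    return [row[max(x0, 0):x1 + 1] for row in image[max(y0, 0):y1 + 1]]
```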
Preferably, finding a second picture that matches the first picture in the search dimension comprises: searching a locally stored picture library and/or a picture library stored on the network side for a second picture that matches the first picture in the search dimension.
Preferably, outputting the search result includes: displaying the found second pictures as picture previews, ordered from the highest matching degree to the lowest, where the number of second pictures is one or more.
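Ordering results from the highest matching degree to the lowest is a plain descending sort on the match score. A minimal sketch with hypothetical (name, score) pairs:

```python
def ranked_results(scored_pictures):
    """Order search results from highest to lowest matching degree,
    as the output step requires; ties keep their original order."""
    return [name for name, score in
            sorted(scored_pictures, key=lambda item: item[1], reverse=True)]
```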
Preferably, outputting the search result includes: displaying the found second pictures in N sub-windows, where each sub-window displays the second pictures found based on one search dimension, or preview views of those pictures; each sub-window can be switched into view by a sliding touch operation; and N is an integer greater than or equal to 1.
The picture searching device provided by the embodiment of the application comprises:
the acquisition module is used for acquiring a picture search instruction, where the picture search instruction is generated according to a detected first operation directed at a first picture;
the determining module is used for determining the searching dimension corresponding to the first picture;
the searching module is used for searching a second picture matched with the first picture in the searching dimension according to the searching dimension corresponding to the first picture;
and the output module is used for outputting the search result.
Preferably, the first operation on the first picture comprises:
performing a first screen touch operation in a screen display area of the first picture; or,
obtaining the first picture through scanning; or,
obtaining the first picture by shooting; or,
pasting the first picture in a search box; or,
selecting an option for triggering picture search in a picture operation option menu, wherein the picture operation option menu pops up after a second screen touch operation performed in a screen display area of the first picture is detected; or,
selecting, in the screen display area of the first picture or in its preview window, a UI component for triggering a picture search.
The first screen touch operation performed in the screen display area of the first picture includes: a multi-finger zoom sliding operation in the screen display area of the first picture; or a single-finger sliding operation in a set direction in the screen display area of the first picture.
Preferably, the determining module is specifically configured to:
acquiring a default search dimension, and determining the default search dimension as a search dimension corresponding to the first picture, wherein the default search dimension comprises one or more search dimensions; or,
determining a selected search dimension as the search dimension corresponding to the first picture when an option for that search dimension is chosen from a pop-up menu for the first picture; the pop-up menu appears after a third screen touch operation performed in the screen display area of the first picture is detected, and each search-dimension option in the menu corresponds to one or more search dimensions; or,
determining a search dimension corresponding to the detected first operation according to the type of the detected first operation and a preset corresponding relation between the type of the first operation and the search dimension, wherein the first operation corresponds to one or more search dimensions.
Preferably, the search dimension comprises one or a combination of the following dimensions:
a search dimension based on objects in the picture;
a search dimension based on the hue of the picture;
a search dimension based on the shooting time of the picture;
a search dimension based on the shooting location of the picture;
a search dimension based on the capture device or source of the picture.
Wherein the search module is specifically configured to: and when the search dimension comprises a search dimension based on an object in a picture, extracting the feature information of the first picture, and searching a second picture according to the extracted feature information, wherein the feature information of the second picture is matched with the feature information of the first picture.
Wherein the search module is specifically configured to: when the first operation aiming at the first picture is a first screen touch operation performed in the screen display area of the first picture, acquiring an action area of the first screen touch operation according to the first screen touch operation performed in the screen display area of the first picture, and extracting feature information of the first picture in the action area.
Preferably, the search module is specifically configured to: and searching a second picture matched with the first picture in the search dimension from a picture library stored locally and/or a picture library stored at a network side.
Preferably, the output module is specifically configured to: display the found second pictures as picture previews, ordered from the highest matching degree to the lowest, where the number of second pictures is one or more.
Preferably, the output module is specifically configured to: display the found second pictures in N sub-windows, where each sub-window displays the second pictures found based on one search dimension, or preview views of those pictures; each sub-window can be switched into view by a sliding touch operation; and N is an integer greater than or equal to 1.
The terminal provided by the embodiment of the application comprises:
a display;
a memory for storing computer program instructions;
a processor, coupled to the memory, for reading computer program instructions stored by the memory and, in response, performing the following:
acquiring a picture searching instruction, wherein the picture searching instruction is generated according to a detected first operation aiming at a first picture;
determining a search dimension corresponding to the first picture;
searching a second picture matched with the first picture in the searching dimension according to the searching dimension corresponding to the first picture;
and outputting a search result.
Preferably, the first operation on the first picture comprises:
performing a first screen touch operation in a screen display area of the first picture; or,
obtaining the first picture through scanning; or,
obtaining the first picture by shooting; or,
pasting the first picture in a search box; or,
selecting an option for triggering picture search in a picture operation option menu, wherein the picture operation option menu pops up after a second screen touch operation performed in a screen display area of the first picture is detected; or,
selecting, in the screen display area of the first picture or in its preview window, a UI component for triggering a picture search.
Wherein the first screen touch operation performed in the screen display area of the first picture includes:
performing multi-finger zooming sliding operation in a screen display area of the first picture; or,
and performing sliding operation of a single finger in a set direction in the screen display area of the first picture.
Preferably, the processor is specifically configured to:
acquiring a default search dimension, and determining the default search dimension as a search dimension corresponding to the first picture, wherein the default search dimension comprises one or more search dimensions; or,
determining a selected search dimension as the search dimension corresponding to the first picture when an option for that search dimension is chosen from a pop-up menu for the first picture; the pop-up menu appears after a third screen touch operation performed in the screen display area of the first picture is detected, and each search-dimension option in the menu corresponds to one or more search dimensions; or,
determining a search dimension corresponding to the detected first operation according to the type of the detected first operation and a preset corresponding relation between the type of the first operation and the search dimension, wherein the first operation corresponds to one or more search dimensions.
Preferably, the search dimension comprises one or a combination of the following dimensions:
a search dimension based on objects in the picture;
a search dimension based on the hue of the picture;
a search dimension based on the shooting time of the picture;
a search dimension based on the shooting location of the picture;
a search dimension based on the capture device or source of the picture.
Wherein the processor is specifically configured to: and when the search dimension comprises a search dimension based on an object in a picture, extracting the feature information of the first picture, and searching a second picture according to the extracted feature information, wherein the feature information of the second picture is matched with the feature information of the first picture.
Wherein the processor is specifically configured to: when the first operation aiming at the first picture is a first screen touch operation performed in the screen display area of the first picture, acquiring an action area of the first screen touch operation according to the first screen touch operation performed in the screen display area of the first picture, and extracting feature information of the first picture in the action area.
Preferably, the processor is specifically configured to: and searching a second picture matched with the first picture in the search dimension from a picture library stored locally and/or a picture library stored at a network side.
Preferably, the processor is specifically configured to: instruct the display to show the found second pictures as picture previews, ordered from the highest matching degree to the lowest, where the number of second pictures is one or more.
Preferably, the processor is specifically configured to: instruct the display to show the found second pictures in N sub-windows, where each sub-window displays the second pictures found based on one search dimension, or preview views of those pictures; each sub-window can be switched into view by a sliding touch operation; and N is an integer greater than or equal to 1.
In the embodiments of the application, after a picture search instruction generated from a detected first operation directed at a first picture is acquired, the search dimension corresponding to the first picture is determined, and a second picture that matches the first picture in that search dimension is found accordingly. A search for similar pictures is thus carried out automatically, triggered by the first operation.
Drawings
FIG. 1 is a schematic diagram of a network architecture suitable for use in embodiments of the present application;
fig. 2 is a schematic diagram of a picture search process provided in an embodiment of the present application;
fig. 3, fig. 4 and fig. 5 are schematic diagrams of screen touch operations for triggering a similar picture searching process in an embodiment of the present application, respectively;
FIG. 6 is a search result display diagram in an embodiment of the present application;
fig. 7 is a schematic structural diagram of an image search apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The embodiments of the application provide an automatic picture search scheme implemented on a terminal. The scheme can be applied to devices such as mobile terminals, personal computers (PCs) and other smart devices; a mobile terminal may be a mobile phone, a personal digital assistant (PDA) or a smart wearable device. The scheme is particularly suitable for mobile smart devices with a touch screen.
The embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Fig. 1 schematically illustrates a network architecture to which the embodiments of the present application are applicable. As shown in fig. 1, the network architecture may include a terminal and a server on the network side, and the terminal and the server communicate with each other through a network.
The server in the above architecture may be used to provide storage or access services to a user's picture library. The user picture library is used for storing pictures of the user, for example, the user can take pictures through a mobile phone and upload the pictures to the user picture library for storage. The server can also provide data synchronization service, and through data synchronization, pictures stored on the terminal can be synchronously stored or updated into a user picture library.
The terminal in the above architecture can implement the picture searching method provided by the embodiment of the present application. Further, in the process of searching pictures, pictures in a local picture library of the terminal may be searched, pictures of a user of the terminal in a picture library of the user may be searched through the server, or both the local picture library of the terminal and the picture library of the network side may be searched.
In the network architecture, the terminal and the server may perform information interaction through a network, where the network may be a wide area network, a local area network, or the internet using a mobile communication technology. The terminal can access the internet in a wireless mode, and the server is usually connected with the internet in a wired mode.
Preferably, the terminal and the server may employ cloud computing technology for information processing. Both may run an operating system built on cloud computing technology, such as YunOS, so that resources and services of the cloud and the terminal can be integrated.
Referring to fig. 2, a schematic diagram of an image search process provided in the embodiment of the present application is shown, where the process may be implemented on a terminal side. As shown, the process may include the following steps:
step 201: acquiring a picture search instruction for a first picture, wherein the picture search instruction is generated according to a detected first operation for the first picture. That is, when the first operation for the first picture is detected, the following steps 202 to 204 may be triggered.
The "first picture" is not specific to a certain type of picture, but is an expression for convenience of description.
In this step, the first operation on the first picture may include one of the following operations:
operation 1: and performing a first screen touch operation in the screen display area of the first picture. As an example, when a first picture is displayed in a picture display interface, such as in a picture browsing window, the first picture is displayed in a normal size (or highlighted), instead of only displaying a preview view thereof, in this case, if a user performs a screen touch operation in a display area of the first picture, a similar picture search process with the first picture as a reference is triggered.
The first screen touch operation performed in the screen display area of the first picture may be a single-finger touch operation (also referred to as a single-point touch operation) or a multi-finger touch operation (also referred to as a multi-point touch operation). Preferably, to distinguish it from screen touch operations commonly applied to pictures (such as a single-finger horizontal slide that switches between pictures), the embodiments of the application choose a specific screen touch operation to trigger the similar-picture search. For example, one of the following screen touch operations may be chosen:
- A multi-finger zoom sliding operation within the screen display area of the first picture. Preferably, the multi-finger zoom operation may be a two-finger pinch sliding operation, although the embodiments of the application are not limited to this. As an example, as shown in fig. 3, when only a first picture is displayed in the picture browse window 301, if the user performs a two-finger pinch sliding operation 303 in the display area 302 of the first picture, detecting the operation triggers a search for pictures similar to the first picture.
- A single-finger sliding operation in a set direction within the screen display area of the first picture; preferably, the set direction may be downward. As an example, as shown in fig. 4, when only a first picture is displayed in the picture browse window 401, if the user performs a single-finger downward sliding operation 403 in the display area 402 of the first picture, detecting the operation triggers a search for pictures similar to the first picture.
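The two triggering gestures above can be told apart from raw touch tracks. A minimal sketch; the (start, end) track representation and the 50-pixel threshold are illustrative assumptions, not details from the patent:

```python
def classify_gesture(tracks):
    """tracks: one ((sx, sy), (ex, ey)) start/end point pair per finger.
    Distinguishes a two-finger pinch from a single-finger downward slide."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    if len(tracks) == 2:
        (s1, e1), (s2, e2) = tracks
        # Fingers ending closer together than they started => pinch.
        return "pinch" if dist(e1, e2) < dist(s1, s2) else "spread"
    if len(tracks) == 1:
        (sx, sy), (ex, ey) = tracks[0]
        # Mostly-vertical downward movement past a threshold.
        if ey - sy > 50 and abs(ex - sx) < abs(ey - sy):
            return "single_finger_down"
    return "other"
```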
Operation 2: the operation of obtaining the first picture through scanning is suitable for the terminal provided with the picture scanning device. As an example, after the terminal scans a first picture through the scanning device provided thereon, a similar picture searching process with the first picture as a reference may be automatically triggered according to the obtained first picture scanning information.
Operation 3: the operation of obtaining the first picture through shooting is suitable for the terminal provided with the camera device such as a camera. As an example, after the terminal takes a picture through a camera arranged thereon, a similar picture searching process based on the first picture taken can be automatically triggered.
Operation 4: pasting the first picture in the search box. As an example, when a user opens an application program for picture search and pastes a selected first picture into the search box of the application's search interface, a search for pictures similar to the first picture is triggered.
Operation 5: and selecting an option for triggering picture search in the pop-up menu for the first picture. As an example, when it is detected that a second screen touch operation is performed in the screen display area of the first picture, a picture operation option menu may pop up, where the menu may include one or more picture operation options, including at least an option for triggering a picture search, for example, the menu may include the following options: "search similar pictures", "copy pictures", "share pictures"; when the option for triggering picture search (e.g., "search for similar pictures") is selected, a similar picture search process with respect to the first picture is triggered. The second screen touch operation may be an operation of long pressing or clicking in the first picture display area, which is not limited in the embodiment of the present application.
Operation 6: an operation in which a User Interface (UI) component for triggering picture search is selected, in the screen display area of the first picture or in a preview view window. The UI component may be a function button, a text box, or another type of UI component, which is not limited in the embodiments of the present application. As an example, in the picture preview interface, a UI component such as a button named "similar picture search" may be disposed in the preview view window of each picture; when the user operates this button in the preview view window of a picture, a similar picture search process with that picture as a reference may be triggered. For another example, as shown in fig. 5, if it is detected that the user performs a single-finger pressing operation (i.e., a long-press operation) 502 in the screen display area 501 of the first picture for 3 seconds, a UI component (e.g., a button named "find similar picture" in the figure) 503 is generated and displayed in the interface; if this UI component is triggered (e.g., clicked), a similar picture search process with the first picture as a reference is triggered.
Operation 7: an operation of a set function key performed while the first picture is displayed on the screen. As an example, when the first picture is displayed at normal size (or highlighted) in the picture browsing window, rather than only as a preview view, if it is detected that the user presses a specific key of the terminal, a similar picture search process with the first picture as a reference is triggered.
Operation 8: a set gesture operation performed while the first picture is displayed on the screen. As an example, when the first picture is displayed at normal size in the picture browsing window, rather than only as a preview view, if it is detected that the user performs a specific gesture operation, for example an arm movement, a similar picture search process with the first picture as a reference is triggered.
In addition to the above, other operations may also be used; for example, in a case where the first picture is displayed at normal size in the picture browsing window, rather than only as a preview view, the similar picture search process with the first picture as a reference may be triggered by a voice control instruction. Such operations are not enumerated one by one in the embodiments of the present application.
Step 202: determining the search dimension corresponding to the first picture.
The method and apparatus provided in the embodiments of the present application can support picture searching in multiple dimensions, so as to meet different requirements of users. In the similar picture search process, the search may be performed based on one dimension or on multiple dimensions. For example, the picture search dimensions provided by the embodiments of the present application may include one or a combination of the following dimensions:
- a search dimension based on objects in the picture. For example, for a person, building, or other object in the first picture, other pictures containing that object are searched for. If this dimension is used for the picture search, feature information of the first picture and feature information of the other pictures in the picture library may be extracted; the feature information of the first picture is then compared with that of the other pictures, and the similarity between pictures is calculated using a similarity measurement function, where the similarity can generally be represented by a Euclidean distance, a cosine distance, a Mahalanobis distance, or the like. If the similarity between a second picture and the first picture is higher than a set threshold, the second picture is taken as a found picture similar to the first picture. Feature extraction is the process of converting an image into a vector, called an image feature; it can be performed by a feature extraction model whose input is a picture (the value of each pixel in the picture) and whose output is the image feature. There are various feature extraction methods in the field of image analysis, such as color histograms, LBP, Gabor filters, and deep convolutional networks, and the embodiments of the present application may use any of these methods for feature extraction.
- a search dimension based on the hue of the picture. For example, according to the hue of the first picture, other pictures whose hue is the same as or close to it are searched for. The hue of a picture can be calculated using an image analysis algorithm.
- a search dimension based on the shooting time of the picture. For example, according to the shooting time of the first picture, other pictures with the same or a close shooting time are searched for. The shooting time of a picture is typically saved in the picture library as metadata or attribute information of the picture. Two pictures are considered close in shooting time if the difference between their shooting times is within a set time range.
- a search dimension based on the shooting location of the picture. For example, according to the shooting location of the first picture, other pictures with the same or a nearby shooting location are searched for. The shooting location information of a picture may be stored in the picture library as metadata or attribute information of the picture. For example, after a photo is taken by a mobile phone, the current location of the phone can be obtained, and information about that location (such as longitude and latitude) can be stored in the picture library as metadata or attribute information of the photo. Two pictures are considered close in shooting location if the distance between their shooting locations is within a set distance range.
- a search dimension based on the capturing device or source of the picture. For example, according to the capturing device used for the first picture, other pictures captured with the same device are searched for. Information about the capturing device of a picture, such as the device type and model, may be stored in the picture library as metadata or attribute information of the picture. For example, after a photo is taken by a mobile phone, information about the phone (such as its type) can be saved in the picture library as metadata or attribute information of the photo; when a photo is imported from the memory card of a camera into the picture library of a mobile phone, information about the camera (such as its type and model) can be saved in the picture library as metadata or attribute information of the photo; and if a picture comes from an application such as WeChat (a social application), information about that application can be saved in the picture library as metadata or attribute information of the picture when it is saved.
The picture search dimensions provided by the embodiments of the present application are only shown above by way of example, and the embodiments are not limited to these dimensions; for example, pictures of the same or a similar size may also be searched for based on the size of the picture, which is not enumerated here.
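The matching rules behind the dimensions listed above can be sketched as simple predicates. The following is a minimal illustration only, not the patented implementation: the cosine-based similarity, the 24-hour time window, the 1 km distance threshold, and the latitude/longitude tuples are all assumptions made for the example.

```python
import math
from datetime import datetime

def cosine_similarity(a, b):
    """Object-based dimension: similarity of two image-feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def close_in_time(t1, t2, max_hours=24):
    """Shooting-time dimension: times differ by no more than a set range."""
    return abs((t1 - t2).total_seconds()) <= max_hours * 3600

def close_in_place(p1, p2, max_km=1.0):
    """Shooting-location dimension: haversine distance between two
    (latitude, longitude) points is within a set distance range."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h)) <= max_km
```

A second picture would count as a match in a given dimension when the corresponding predicate holds (or, for the object dimension, when the similarity exceeds the set threshold).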
As described above, one or more search dimensions may be used when conducting a picture search. The embodiments of the present application preferably provide the following ways (any one of ways 1 to 3) to determine the search dimension to be used in the picture search, which may specifically include:
Way 1: acquiring a default search dimension and determining it as the search dimension corresponding to the first picture, where the default search dimension may include one or more search dimensions.
Way 2: determining the selected search dimension as the search dimension corresponding to the first picture, according to an operation in which an option for selecting a search dimension in a pop-up menu for the first picture is selected. The pop-up menu pops up after a third screen touch operation performed in the screen display area of the first picture is detected, and one search dimension option in the menu corresponds to one or more search dimensions. The third screen touch operation may be a touch operation of pressing with a single finger in the screen display area of the first picture. As an example, after a user presses the screen with a single finger for 3 seconds in the display area of the first picture shown in the picture browsing window, a menu pops up that includes the options "search for the same person", "search for the same hue", and "search for the same source". If the user selects the "search for the same person" option, a similar picture search process based on feature extraction is triggered; if the user selects the "search for the same hue" option, a search process is triggered that looks for pictures with the same hue as the first picture.
Way 3: determining the search dimension corresponding to the detected first operation, according to the type of the detected first operation and a preset correspondence between first operation types and search dimensions, where one first operation corresponds to one or more search dimensions. As an example, the correspondence between screen touch operation types and search dimensions is preset; for instance, a single-finger upward sliding operation corresponds to the "search dimension based on objects in the picture", and a single-finger downward sliding operation corresponds to the "search dimension based on the hue of the picture". In this way, when a single-finger upward sliding operation is detected in the display area of the first picture, it is determined that the picture search uses the "search dimension based on objects in the picture"; when a single-finger downward sliding operation is detected in the display area of the first picture, it is determined that the picture search uses the "search dimension based on the hue of the picture".
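Way 3 amounts to a lookup table from operation types to search dimensions, with a fallback to a default dimension (way 1) when no mapping exists. The operation names and dimension identifiers below are illustrative assumptions, not names defined by this application:

```python
# Hypothetical preset correspondence between detected touch-operation
# types and search dimensions; one operation may map to several dimensions.
OPERATION_TO_DIMENSIONS = {
    "single_finger_slide_up": ["object"],
    "single_finger_slide_down": ["hue"],
    "multi_finger_zoom": ["shooting_time", "shooting_location"],
}

def dimensions_for_operation(operation_type, default=("object",)):
    """Return the search dimensions configured for a detected operation,
    falling back to a default dimension when none is mapped."""
    return list(OPERATION_TO_DIMENSIONS.get(operation_type, default))
```

The search step would then run once per dimension returned by the lookup.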
Step 203: searching for a second picture that matches the first picture in the search dimension, according to the search dimension corresponding to the first picture.
Here, a second picture is a picture that matches or is similar to the first picture in one or more search dimensions, and there may be one or more such second pictures.
In specific implementation, the second picture that matches the first picture in the corresponding search dimension may be searched for in a picture library stored locally on the terminal and/or a picture library stored on the network side.
Preferably, to further improve flexibility and meet user requirements, the embodiments of the present application allow the user to select a local region in the first picture, extract features based on that local region, and then search for a second picture similar to the features of the local region. Specifically, the action region of a first screen touch operation may be obtained from the first screen touch operation performed in the screen display area of the first picture; feature information of the first picture within the action region is then extracted, and feature extraction is also performed on the other pictures. The similarity between pictures is calculated from the extracted feature information, and a second picture similar to the first picture is determined accordingly. When extracting features from the other pictures, feature information of regions of the same size as the action region may be extracted; for any one of the other pictures, feature information of multiple regions of that size may be extracted at different positions, and when comparing feature information or similarity, the feature information in that picture most similar to the feature information extracted from the first picture may be retained as the feature information of that picture, so as to measure its similarity to the first picture.
For example, if the first picture contains multiple persons, the user can select the head of one person through a single-finger click operation, thereby triggering feature extraction on the head region of that person. For another picture to be compared in the picture library, such as a second picture, feature extraction is performed at different positions in the second picture (preferably including the same position as the selected head in the first picture) using the size of the selected region; the similarity between each extracted piece of feature information and the feature information of the first picture is calculated, the maximum similarity is retained as the similarity of the second picture, and if this similarity is greater than the threshold, the second picture is output as a found similar picture. In this way, for any given object, other pictures containing that object can be searched for.
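The local-region comparison described above can be sketched as a sliding-window search: a feature is extracted from the selected action region, a window of the same size is slid over each candidate picture, and the best-matching position determines that picture's similarity. In this simplified sketch the "feature" is just the flattened grayscale pixels and the similarity is cosine-based — both simplifying assumptions; a real system would use the feature extraction models mentioned above (LBP, Gabor, deep networks, etc.).

```python
import math

def _cosine(a, b):
    """Cosine similarity of two flat feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    n = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / n if n else 0.0

def region_feature(image, top, left, h, w):
    """Feature of an h-by-w region of a 2-D grayscale image: here simply
    the flattened pixel values."""
    return [image[r][c] for r in range(top, top + h)
                        for c in range(left, left + w)]

def best_region_similarity(query_feat, candidate, h, w, step=1):
    """Slide an h-by-w window over the candidate picture and keep the
    highest similarity to the query region's feature, as the retained
    similarity of the candidate picture."""
    rows, cols = len(candidate), len(candidate[0])
    best = 0.0
    for top in range(0, rows - h + 1, step):
        for left in range(0, cols - w + 1, step):
            feat = region_feature(candidate, top, left, h, w)
            best = max(best, _cosine(query_feat, feat))
    return best
```

A candidate whose best similarity exceeds the set threshold would be output as a found similar picture.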
Step 204: outputting the search result.
In this step, the search result may be output to a display device for display. Accordingly, if a second picture matching the first picture is found, the found picture can be displayed; otherwise, a prompt indicating that no matching picture was found can be given.
Preferably, when the found second pictures are displayed, they may be displayed in a picture preview manner in order from high matching degree to low. For the "search dimension based on objects in the picture", the matching degree is the similarity between pictures determined by the feature extraction method; for the aforementioned "search dimension based on the hue of the picture", it is the closeness of the hue values; for the aforementioned "search dimension based on the shooting time of the picture", it is the closeness of the shooting times; and for the aforementioned "search dimension based on the shooting location of the picture", it is the closeness of the shooting locations. If multiple dimensions are used in the picture search, the matching degrees of the dimensions can be combined (for example, by a weighted summation algorithm) into a total matching degree, and the search result is displayed according to the total matching degree.
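The weighted summation mentioned above can be sketched as follows. The per-dimension scores are assumed to be normalized to [0, 1], and the dimension names and weight values are illustrative assumptions:

```python
def total_matching_degree(scores, weights):
    """Combine per-dimension matching degrees (each in [0, 1]) by weighted
    summation; weights are normalized so the total is also in [0, 1]."""
    total_w = sum(weights[d] for d in scores)
    if not total_w:
        return 0.0
    return sum(scores[d] * weights[d] for d in scores) / total_w

def rank_results(candidates, weights):
    """Sort candidate pictures from high to low total matching degree,
    the order in which search results are displayed."""
    return sorted(candidates,
                  key=lambda c: total_matching_degree(c["scores"], weights),
                  reverse=True)
```

For instance, with weights {"object": 0.6, "hue": 0.4}, a picture scoring 1.0 on the object dimension and 0.5 on hue receives a total matching degree of 0.8.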
In specific implementation, to facilitate viewing of the search result, N sub-windows may be provided to display the found second pictures, where each sub-window displays the second pictures found based on one search dimension, or preview views thereof; each sub-window is allowed to switch, according to a sliding touch operation, among the second pictures (or their preview views) found based on its corresponding search dimension, where N is an integer greater than or equal to 1. Fig. 6 shows an example of a search result page with 3 sub-windows: region 601 displays the first picture, sub-window 602 displays preview views of pictures found in the "search dimension based on objects in the picture", sub-window 603 displays preview views of pictures found in the "search dimension based on the hue of the picture", and sub-window 604 displays preview views of pictures found in the "search dimension based on the shooting time of the picture". The preview views in sub-window 602 can be browsed through a single-finger rightward sliding operation 605, and the preview views in sub-window 603 through a single-finger leftward sliding operation 606.
Further, the matching degree of each picture can also be displayed, for example below the found picture, so that the user can conveniently browse and select pictures.
As can be seen from the above description, in the embodiments of the present application, after a picture search instruction generated according to a detected first operation on a first picture is acquired, the search dimension corresponding to the first picture is determined, and a second picture that matches the first picture in that search dimension is found according to the determined dimension, thereby implementing an automatic similar picture search process triggered by the first operation.
Based on the same technical concept, the embodiment of the application also provides an image searching device, and the device can realize the image searching process. The apparatus may be a terminal or a component integrated within a terminal. The apparatus may be implemented by software or by a combination of software and hardware.
Referring to fig. 7, which is a schematic structural diagram of an image search apparatus provided in the embodiment of the present application, the image search apparatus may include: an obtaining module 701, a determining module 702, a searching module 703 and an outputting module 704, wherein:
an obtaining module 701, configured to obtain a picture search instruction, where the picture search instruction is generated according to a detected first operation on a first picture;
a determining module 702, configured to determine a search dimension corresponding to the first picture;
a searching module 703, configured to search, according to a search dimension corresponding to the first picture, for a second picture that matches the first picture in the search dimension;
and an output module 704, configured to output the search result.
Preferably, the first operation on the first picture comprises:
performing a first screen touch operation in a screen display area of the first picture; or,
obtaining the first picture through scanning; or,
obtaining the first picture by shooting; or,
pasting the first picture in a search box; or,
selecting an option for triggering picture search in a picture operation option menu, wherein the picture operation option menu pops up after a second screen touch operation performed in a screen display area of the first picture is detected; or in the screen display area of the first picture or in the preview view window, the UI component for triggering picture search is selected.
Wherein the first screen touch operation performed in the screen display area of the first picture includes: performing multi-finger zooming sliding operation in a screen display area of the first picture; or, performing a sliding operation of a single finger in a set direction in the screen display area of the first picture.
Preferably, the determining module 702 may be specifically configured to:
acquiring a default search dimension, and determining the default search dimension as a search dimension corresponding to the first picture, wherein the default search dimension comprises one or more search dimensions; or,
determining the selected search dimension as the search dimension corresponding to the first picture according to the operation that an option for selecting the search dimension in the pop-up menu of the first picture is selected, wherein the pop-up menu pops up after detecting a third screen touch operation performed in a screen display area of the first picture, and one search dimension option in the pop-up menu corresponds to one or more search dimensions; or,
determining a search dimension corresponding to the detected first operation according to the type of the detected first operation and a preset corresponding relation between the type of the first operation and the search dimension, wherein the first operation corresponds to one or more search dimensions.
Preferably, the search dimension comprises one or more combinations of the following dimensions:
based on the search dimensions of the objects in the picture;
a search dimension based on a hue of the picture;
a search dimension based on a shooting time of the picture;
searching dimension based on shooting location of the picture;
picture-based capture device or source search dimensions.
The search module 703 may be specifically configured to: and when the search dimension comprises a search dimension based on an object in a picture, extracting the feature information of the first picture, and searching a second picture according to the extracted feature information, wherein the feature information of the second picture is matched with the feature information of the first picture.
The search module 703 may be specifically configured to: when the first operation aiming at the first picture is a first screen touch operation performed in the screen display area of the first picture, acquiring an action area of the first screen touch operation according to the first screen touch operation performed in the screen display area of the first picture, and extracting feature information of the first picture in the action area.
Preferably, the searching module 703 is specifically configured to: and searching a second picture matched with the first picture in the search dimension from a picture library stored locally and/or a picture library stored at a network side.
Preferably, the output module 704 is specifically configured to: and displaying the searched second pictures in a picture preview mode according to the sequence of the matching degree from high to low, wherein the number of the second pictures is one or more.
Preferably, the output module 704 may be specifically configured to: and displaying the searched second picture by using N sub-windows, wherein each sub-window is used for displaying the second picture searched based on one search dimension or a preview view of the second picture, each sub-window is allowed to display the second picture searched based on the corresponding search dimension or the preview view of the second picture according to sliding touch operation switching, and N is an integer greater than or equal to 1.
Based on the same technical concept, the embodiment of the application also provides a terminal, and the terminal can realize the image searching process.
Referring to fig. 8, a schematic structural diagram of a terminal provided in the embodiment of the present application is shown. The terminal may generally include: a processor 801, a memory 802, and a display 803.
The processor 801 may specifically be a general-purpose processor (such as a microprocessor or any conventional processor), a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The memory 802 may specifically include an internal memory and/or an external memory, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, a register, or another storage medium well known in the art. The display 803 may include touch screen control circuitry.
The processor 801 is in data communication with the other modules; for example, data communication may be performed based on a bus architecture. The bus architecture may include any number of interconnected buses and bridges, linking together one or more processors, represented by the processor 801, and various memory circuits, represented by the memory 802. The bus architecture may also link together various other circuits, such as peripherals, voltage regulators, and power management circuits, which are well known in the art and therefore not described further herein. The bus interface provides an interface. The processor 801 is responsible for managing the bus architecture and general processing, and the memory 802 may store data used by the processor 801 in performing operations.
The picture search process disclosed in the embodiments of the present application may be applied to, or implemented by, the processor 801. In implementation, the steps of the picture search flow may be completed by integrated logic circuits of hardware in the processor 801 or by instructions in the form of software. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed accordingly. The steps of the methods disclosed in connection with the embodiments of the present application may be directly embodied as being performed by a hardware processor, or performed by a combination of hardware and software modules in the processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM or EPROM, or a register.
Specifically, the processor 801, coupled to the memory 802, is configured to read the computer program instructions stored by the memory 802 and, in response, perform the following operations:
acquiring a picture searching instruction, wherein the picture searching instruction is generated according to a detected first operation aiming at a first picture;
determining a search dimension corresponding to the first picture;
searching a second picture matched with the first picture in the searching dimension according to the searching dimension corresponding to the first picture;
and outputting a search result.
Preferably, the first operation on the first picture comprises:
performing a first screen touch operation in a screen display area of the first picture; or,
obtaining the first picture through scanning; or,
obtaining the first picture by shooting; or,
pasting the first picture in a search box; or,
selecting an option for triggering picture search in a picture operation option menu, wherein the picture operation option menu pops up after a second screen touch operation performed in a screen display area of the first picture is detected; or,
and in the screen display area of the first picture or in the preview view window, the UI component for triggering picture search is selected.
Wherein the first screen touch operation performed in the screen display area of the first picture includes:
performing multi-finger zooming sliding operation in a screen display area of the first picture; or,
and performing sliding operation of a single finger in a set direction in the screen display area of the first picture.
Preferably, the processor 801 is specifically configured to:
acquiring a default search dimension, and determining the default search dimension as a search dimension corresponding to the first picture, wherein the default search dimension comprises one or more search dimensions; or,
determining the selected search dimension as the search dimension corresponding to the first picture according to the operation that an option for selecting the search dimension in the pop-up menu of the first picture is selected, wherein the pop-up menu pops up after detecting a third screen touch operation performed in a screen display area of the first picture, and one search dimension option in the pop-up menu corresponds to one or more search dimensions; or,
determining a search dimension corresponding to the detected first operation according to the type of the detected first operation and a preset corresponding relation between the type of the first operation and the search dimension, wherein the first operation corresponds to one or more search dimensions.
Preferably, the search dimension comprises one or more combinations of the following dimensions:
based on the search dimensions of the objects in the picture;
a search dimension based on a hue of the picture;
a search dimension based on a shooting time of the picture;
searching dimension based on shooting location of the picture;
picture-based capture device or source search dimensions.
Among other things, the processor 801 may be specifically configured to: and when the search dimension comprises a search dimension based on an object in a picture, extracting the feature information of the first picture, and searching a second picture according to the extracted feature information, wherein the feature information of the second picture is matched with the feature information of the first picture.
Among other things, the processor 801 may be specifically configured to: when the first operation aiming at the first picture is a first screen touch operation performed in the screen display area of the first picture, acquiring an action area of the first screen touch operation according to the first screen touch operation performed in the screen display area of the first picture, and extracting feature information of the first picture in the action area.
Preferably, the processor 801 is specifically configured to: and searching a second picture matched with the first picture in the search dimension from a picture library stored locally and/or a picture library stored at a network side.
Preferably, the processor 801 is specifically configured to: and according to the sequence of the matching degrees from high to low, indicating the display to display the searched second pictures in a picture preview mode, wherein the number of the second pictures is one or more.
Preferably, the processor 801 is specifically configured to: and instructing the display to display the found second picture by using N sub-windows, wherein each sub-window is used for displaying the found second picture or the preview view of the second picture based on one search dimension, each sub-window is allowed to display the found second picture or the preview view of the second picture based on the corresponding search dimension in a switching manner according to the sliding touch operation, and N is an integer greater than or equal to 1.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, those skilled in the art may make additional variations and modifications to these embodiments once they learn of the basic inventive concept. Therefore, the appended claims are intended to be interpreted as covering the preferred embodiments and all variations and modifications that fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
Claims (21)
1. An image searching method, comprising:
acquiring a picture search instruction, wherein the picture search instruction is generated according to a detected first operation on a first picture;
determining a search dimension corresponding to the first picture;
searching, according to the search dimension corresponding to the first picture, for a second picture that matches the first picture in the search dimension; and
outputting a search result.
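For orientation, the four claimed steps can be read as a simple search pipeline. The sketch below is an editor's illustration only, not an implementation disclosed by the application: pictures are modeled as dicts of metadata, and the matcher is a stand-in that counts shared values in the chosen dimension.

```python
# Illustrative sketch of the claimed method: acquire a search instruction,
# determine the search dimension, find matching "second pictures", and
# output results ordered by matching degree. All names are hypothetical.

def search_pictures(first_picture, library, dimension="object"):
    """Find pictures in `library` matching `first_picture` in the given
    search dimension, best matches first."""
    def match_degree(candidate):
        # Stand-in matcher: count shared values in the chosen dimension.
        a = set(first_picture.get(dimension, []))
        b = set(candidate.get(dimension, []))
        return len(a & b)

    matches = [p for p in library if match_degree(p) > 0]
    return sorted(matches, key=match_degree, reverse=True)

query = {"object": ["cat", "tree"]}
library = [
    {"name": "a.jpg", "object": ["cat"]},
    {"name": "b.jpg", "object": ["dog"]},
    {"name": "c.jpg", "object": ["cat", "tree"]},
]
print([p["name"] for p in search_pictures(query, library)])  # → ['c.jpg', 'a.jpg']
```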
2. The method of claim 1, wherein the first operation on the first picture comprises:
a first screen touch operation performed in a screen display area of the first picture; or
obtaining the first picture by scanning; or
obtaining the first picture by shooting; or
pasting the first picture into a search box; or
selecting, in a picture operation option menu, an option for triggering a picture search, wherein the picture operation option menu pops up after a second screen touch operation performed in the screen display area of the first picture is detected; or
an operation of selecting a UI component for picture search in the screen display area of the first picture or in a preview view window.
3. The method of claim 2, wherein the first screen touch operation performed in the screen display area of the first picture comprises:
a multi-finger zoom sliding operation in the screen display area of the first picture; or
a single-finger sliding operation in a set direction in the screen display area of the first picture.
4. The method of claim 1, wherein the determining the search dimension corresponding to the first picture comprises:
acquiring a default search dimension and determining the default search dimension as the search dimension corresponding to the first picture, wherein the default search dimension comprises one or more search dimensions; or
determining a selected search dimension as the search dimension corresponding to the first picture according to an operation of selecting a search-dimension option in a pop-up menu of the first picture, wherein the pop-up menu pops up after a third screen touch operation performed in the screen display area of the first picture is detected, and one search-dimension option in the pop-up menu corresponds to one or more search dimensions; or
determining the search dimension corresponding to the detected first operation according to the type of the detected first operation and a preset correspondence between types of the first operation and search dimensions, wherein the first operation corresponds to one or more search dimensions.
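The third alternative above, a preset correspondence between operation types and search dimensions, amounts to a lookup table. The operation names and mappings in this editor's sketch are hypothetical and serve only to illustrate the idea:

```python
# Hypothetical mapping from detected first-operation type to search
# dimension(s), illustrating the "preset corresponding relation" above.
OPERATION_TO_DIMENSIONS = {
    "multi_finger_zoom": ["object"],
    "single_finger_slide": ["hue", "shooting_time"],
    "long_press": ["shooting_location"],
}

def dimensions_for(operation_type, default=("object",)):
    """Return the search dimension(s) for a detected first operation,
    falling back to a default search dimension."""
    return OPERATION_TO_DIMENSIONS.get(operation_type, list(default))

print(dimensions_for("single_finger_slide"))  # → ['hue', 'shooting_time']
print(dimensions_for("unknown_gesture"))      # → ['object']
```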
5. The method of claim 1 or 4, wherein the search dimension comprises one or a combination of the following dimensions:
a search dimension based on objects in the picture;
a search dimension based on the hue of the picture;
a search dimension based on the shooting time of the picture;
a search dimension based on the shooting location of the picture; and
a search dimension based on the capture device or source of the picture.
6. The method of claim 5, wherein, when the search dimension comprises a search dimension based on an object in the picture, the searching, according to the search dimension corresponding to the first picture, for a second picture that matches the first picture in the search dimension comprises:
extracting feature information of the first picture; and
searching for a second picture according to the extracted feature information, wherein the feature information of the second picture matches the feature information of the first picture.
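One of many possible realizations of this extract-then-match step is a coarse color-histogram feature. The claim does not mandate any particular feature representation; this editor's sketch only illustrates "extract feature information, then match it":

```python
# Sketch of feature-based matching using a coarse color histogram.
# A real implementation would use richer features; this illustrates the
# two claimed sub-steps: feature extraction and feature matching.

def color_histogram(pixels, bins=4):
    """Quantize (r, g, b) pixels into a small normalized histogram."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def similarity(feat_a, feat_b):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    return sum(min(a, b) for a, b in zip(feat_a, feat_b))

red = [(250, 10, 10)] * 8
reddish = [(250, 10, 10)] * 6 + [(10, 250, 10)] * 2
green = [(10, 250, 10)] * 8

q = color_histogram(red)
# A mostly-red picture matches the red query better than a green one.
print(similarity(q, color_histogram(reddish)) > similarity(q, color_histogram(green)))  # → True
```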
7. The method of claim 6, wherein, when the first operation on the first picture is a first screen touch operation performed in the screen display area of the first picture, the extracting feature information of the first picture according to pixels of the first picture comprises:
acquiring an action area of the first screen touch operation according to the first screen touch operation performed in the screen display area of the first picture; and
extracting the feature information of the first picture within the action area.
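Restricting extraction to the touch "action area" can be pictured as cropping the pixel grid before the feature step. This editor's sketch assumes a rectangular action area and a row-major pixel grid, neither of which is specified by the claim:

```python
# Sketch of claim 7: limit feature extraction to the touch action area,
# modeled here as a rectangular crop of a 2-D pixel grid (hypothetical).

def crop(grid, top, left, bottom, right):
    """Return the sub-grid covered by the action area
    (top/left inclusive, bottom/right exclusive)."""
    return [row[left:right] for row in grid[top:bottom]]

# 4x4 grid whose entries record their own (x, y) coordinates.
image = [[(x, y) for x in range(4)] for y in range(4)]
region = crop(image, 1, 1, 3, 3)
print(len(region), len(region[0]))  # → 2 2
print(region[0][0])                 # → (1, 1)
```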
8. The method of claim 1, wherein the searching for a second picture that matches the first picture in the search dimension comprises:
searching, in a locally stored picture library and/or a picture library stored on a network side, for a second picture that matches the first picture in the search dimension.
9. The method of claim 1, wherein the outputting the search result comprises:
displaying the found second pictures in a picture preview mode in descending order of matching degree, wherein the number of second pictures is one or more.
10. The method of claim 1, wherein the outputting the search result comprises:
displaying the found second picture in N sub-windows, wherein each sub-window displays the second picture, or a preview view of the second picture, found based on one search dimension, each sub-window can switch, according to a sliding touch operation, to displaying the second picture or the preview view found based on the corresponding search dimension, and N is an integer greater than or equal to 1.
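The N-sub-window output amounts to grouping results by the search dimension that produced them, one group per sub-window. The following editor's sketch assumes the search step tags each result with its dimension; the data shapes are hypothetical:

```python
# Sketch of the N-sub-window output: group found pictures by the search
# dimension that produced them; each group backs one sub-window (N groups).
from collections import defaultdict

def group_by_dimension(results):
    """results: iterable of (dimension, picture) pairs from the search step.
    Returns {dimension: [pictures]}; N = number of keys."""
    windows = defaultdict(list)
    for dimension, picture in results:
        windows[dimension].append(picture)
    return dict(windows)

found = [("hue", "a.jpg"), ("object", "b.jpg"), ("hue", "c.jpg")]
windows = group_by_dimension(found)
print(len(windows))    # → 2  (N sub-windows)
print(windows["hue"])  # → ['a.jpg', 'c.jpg']
```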
11. An image searching apparatus, comprising:
an acquisition module, configured to acquire a picture search instruction, wherein the picture search instruction is generated according to a detected first operation on a first picture;
a determination module, configured to determine a search dimension corresponding to the first picture;
a search module, configured to search, according to the search dimension corresponding to the first picture, for a second picture that matches the first picture in the search dimension; and
an output module, configured to output a search result.
12. The apparatus of claim 11, wherein the first operation on the first picture comprises:
a first screen touch operation performed in a screen display area of the first picture; or
obtaining the first picture by scanning; or
obtaining the first picture by shooting; or
pasting the first picture into a search box; or
selecting, in a picture operation option menu, an option for triggering a picture search, wherein the picture operation option menu pops up after a second screen touch operation performed in the screen display area of the first picture is detected; or
an operation of selecting a UI component for picture search in the screen display area of the first picture or in a preview view window.
13. The apparatus of claim 12, wherein the first screen touch operation performed in the screen display area of the first picture comprises:
a multi-finger zoom sliding operation in the screen display area of the first picture; or
a single-finger sliding operation in a set direction in the screen display area of the first picture.
14. The apparatus of claim 11, wherein the determination module is specifically configured to:
acquire a default search dimension and determine the default search dimension as the search dimension corresponding to the first picture, wherein the default search dimension comprises one or more search dimensions; or
determine a selected search dimension as the search dimension corresponding to the first picture according to an operation of selecting a search-dimension option in a pop-up menu of the first picture, wherein the pop-up menu pops up after a third screen touch operation performed in the screen display area of the first picture is detected, and one search-dimension option in the pop-up menu corresponds to one or more search dimensions; or
determine the search dimension corresponding to the detected first operation according to the type of the detected first operation and a preset correspondence between types of the first operation and search dimensions, wherein the first operation corresponds to one or more search dimensions.
15. The apparatus of claim 11 or 14, wherein the search dimension comprises one or a combination of the following dimensions:
a search dimension based on objects in the picture;
a search dimension based on the hue of the picture;
a search dimension based on the shooting time of the picture;
a search dimension based on the shooting location of the picture; and
a search dimension based on the capture device or source of the picture.
16. The apparatus of claim 15, wherein the search module is specifically configured to:
when the search dimension comprises a search dimension based on an object in the picture, extract feature information of the first picture and search for a second picture according to the extracted feature information, wherein the feature information of the second picture matches the feature information of the first picture.
17. The apparatus of claim 16, wherein the search module is specifically configured to:
when the first operation on the first picture is a first screen touch operation performed in the screen display area of the first picture, acquire an action area of the first screen touch operation according to the first screen touch operation performed in the screen display area of the first picture, and extract the feature information of the first picture within the action area.
18. The apparatus of claim 11, wherein the search module is specifically configured to:
search, in a locally stored picture library and/or a picture library stored on a network side, for a second picture that matches the first picture in the search dimension.
19. The apparatus of claim 11, wherein the output module is specifically configured to:
display the found second pictures in a picture preview mode in descending order of matching degree, wherein the number of second pictures is one or more.
20. The apparatus of claim 11, wherein the output module is specifically configured to:
display the found second picture in N sub-windows, wherein each sub-window displays the second picture, or a preview view of the second picture, found based on one search dimension, each sub-window can switch, according to a sliding touch operation, to displaying the second picture or the preview view found based on the corresponding search dimension, and N is an integer greater than or equal to 1.
21. A terminal, comprising:
a display;
a memory for storing computer program instructions;
a processor, coupled to the memory and configured to read the computer program instructions stored in the memory and, in response, perform the following:
acquiring a picture search instruction, wherein the picture search instruction is generated according to a detected first operation on a first picture;
determining a search dimension corresponding to the first picture;
searching, according to the search dimension corresponding to the first picture, for a second picture that matches the first picture in the search dimension; and
outputting a search result.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510997392.8A CN106919326A (en) | 2015-12-25 | 2015-12-25 | A kind of image searching method and device |
PCT/CN2016/110193 WO2017107855A1 (en) | 2015-12-25 | 2016-12-15 | Picture searching method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106919326A true CN106919326A (en) | 2017-07-04 |
Family
ID=59089154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510997392.8A Pending CN106919326A (en) | 2015-12-25 | 2015-12-25 | A kind of image searching method and device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106919326A (en) |
WO (1) | WO2017107855A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107632759A (en) * | 2017-08-21 | 2018-01-26 | 珠海市魅族科技有限公司 | Picture display method, device, computer installation and computer-readable recording medium |
CN108874262A (en) * | 2018-06-07 | 2018-11-23 | 北京小度信息科技有限公司 | Information interacting method, device, electronic equipment and computer readable storage medium |
CN109840128A (en) * | 2019-01-23 | 2019-06-04 | 努比亚技术有限公司 | A kind of double screen wallpaper treatment method, server and mobile terminal |
CN110245247A (en) * | 2019-05-24 | 2019-09-17 | 浙江大华技术股份有限公司 | Method, electronic equipment and the computer storage medium of picture searching |
CN110598037B (en) * | 2019-09-23 | 2022-01-04 | 腾讯科技(深圳)有限公司 | Image searching method, device and storage medium |
CN114422636A (en) * | 2021-12-16 | 2022-04-29 | 巴可(苏州)医疗科技有限公司 | Android-based method for generating test image to detect display |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101216830A (en) * | 2007-12-28 | 2008-07-09 | 腾讯科技(深圳)有限公司 | Method and system for search commercial articles according to colors |
CN101825989A (en) * | 2010-04-23 | 2010-09-08 | 杭州图酷科技有限公司 | Drag and drop image search input method |
CN104111989A (en) * | 2014-07-02 | 2014-10-22 | 百度在线网络技术(北京)有限公司 | Providing method and device of search results |
CN104462325A (en) * | 2014-12-02 | 2015-03-25 | 百度在线网络技术(北京)有限公司 | Search recommendation method and device |
CN104951537A (en) * | 2015-06-18 | 2015-09-30 | 百度在线网络技术(北京)有限公司 | Image searching method and device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103020184B (en) * | 2012-11-29 | 2016-05-25 | 北京百度网讯科技有限公司 | Use photographic images to obtain the method and system of Search Results |
CN103559248A (en) * | 2013-10-30 | 2014-02-05 | 广东明创软件科技有限公司 | Searching method based on photo album picture of mobile terminal and mobile terminal |
KR20150105749A (en) * | 2014-03-10 | 2015-09-18 | 삼성전자주식회사 | Apparatus and method for display image |
CN104881457B (en) * | 2015-05-21 | 2018-11-16 | 百度在线网络技术(北京)有限公司 | Image searching method and device |
CN105138681A (en) * | 2015-09-15 | 2015-12-09 | 成都品果科技有限公司 | Picture search method and system based on space coordinate positioning |
- 2015-12-25: CN application CN201510997392.8A filed; published as CN106919326A (status: Pending)
- 2016-12-15: PCT application PCT/CN2016/110193 filed; published as WO2017107855A1 (Application Filing)
Also Published As
Publication number | Publication date |
---|---|
WO2017107855A1 (en) | 2017-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5947131B2 (en) | Search input method and system by region selection method | |
CN109284729B (en) | Method, device and medium for acquiring face recognition model training data based on video | |
CN112954210B (en) | Photographing method and device, electronic equipment and medium | |
EP3125135B1 (en) | Picture processing method and device | |
WO2017107855A1 (en) | Picture searching method and device | |
CN111866392B (en) | Shooting prompting method and device, storage medium and electronic equipment | |
US10893137B2 (en) | Photography guiding method, device, and system | |
US20210281744A1 (en) | Action recognition method and device for target object, and electronic apparatus | |
EP3518522B1 (en) | Image capturing method and device | |
CN108924440A (en) | Paster display methods, device, terminal and computer readable storage medium | |
CN107515950A (en) | A kind of image processing method, device, terminal and computer-readable recording medium | |
CN107168536A (en) | Test question searching method, test question searching device and electronic terminal | |
JP6720353B2 (en) | Processing method and terminal | |
CN113271378B (en) | Image processing method and device and electronic equipment | |
CN111800574B (en) | Imaging method and device and electronic equipment | |
CN113194256A (en) | Shooting method, shooting device, electronic equipment and storage medium | |
US9411825B2 (en) | Computer implemented system for handling text distracters in a visual search | |
CN106557766B (en) | Fuzzy character processing method and system and electronic equipment | |
TW201828118A (en) | Picture search method and device searching in a search dimension for a second picture matching the first picture according to the search dimension corresponding to the first picture | |
CN113271379B (en) | Image processing method and device and electronic equipment | |
US20200236295A1 (en) | Imaging apparatus | |
WO2019075644A1 (en) | Portrait photograph searching method, and terminal | |
CN104866163B (en) | Image display method, device and electronic equipment | |
CN113873168A (en) | Shooting method, shooting device, electronic equipment and medium | |
CN113794943A (en) | Video cover setting method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 1240355 |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170704 |
| REG | Reference to a national code | Ref country code: HK; Ref legal event code: WD; Ref document number: 1240355 |