

Systems and methods for selecting content items to store and present locally on a user device

Info

Publication number
US20210117469A1
US20210117469A1 (application US17/138,379)
Authority
US
United States
Prior art keywords
user
image
client device
images
virtual space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/138,379
Inventor
Anthony Grue
Andrew Haven
Andrew Scheff
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dropbox Inc
Original Assignee
Dropbox Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dropbox Inc filed Critical Dropbox Inc
Priority to US17/138,379
Assigned to DROPBOX, INC. reassignment DROPBOX, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HAVEN, ANDREW, GRUE, ANTHONY, SCHEFF, ANDREW
Assigned to JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT reassignment JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: DROPBOX, INC.
Publication of US20210117469A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/5866: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/24: Querying
    • G06F 16/245: Query processing
    • G06F 16/2457: Query processing with adaptation to user needs
    • G06F 16/24578: Query processing with adaptation to user needs using ranking
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/95: Retrieval from the web
    • G06F 16/953: Querying, e.g. by the use of web search engines
    • G06F 16/9535: Search customisation based on user profiles and personalisation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/95: Retrieval from the web
    • G06F 16/953: Querying, e.g. by the use of web search engines
    • G06F 16/9538: Presentation of query results
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/01: Social networking

Definitions

  • Various embodiments generally relate to selecting content items to store and present locally on a user device.
  • The server typically does not account for user preferences and/or device conditions when choosing or selecting which images to send. This may be a significant disadvantage because a user trying to view or find a specific image or content item within their own or another individual's account may be forced to wait until that image or content item loads on their device.
  • Systems, methods, and non-transitory computer readable media for dynamically selecting and sending images to a user device based on various network and device conditions may be provided.
  • Such systems may include one or more processors, a display screen, communications circuitry, and memory including instructions.
  • Such methods may include generating a listing of images stored within a user account on a content management system. For example, all images and/or videos that a user may have stored within their user account may be listed.
  • a selection probability value may be assigned to each image from the listing based on one or more factors. The selection probability value may correspond to a likelihood of use by the user of one or more of the images from the listing. For example, images that have been recently viewed by the user may be assigned a higher selection probability value than images that have not been viewed in a long period of time (e.g., over one year).
  • an available capacity level of a user device may be determined.
  • Images may be dynamically selected from the listing to be sent to the user device based on each image's assigned selection probability value as well as the determined available capacity level.
  • the dynamically selected images may then be ranked in order of priority for sending, and sent to the user device based on the order until the available capacity level has been met.
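  • As an illustrative, non-limiting sketch of the selection described above (the class, function names, recency thresholds, and capacity handling below are assumptions, not taken from the specification), a routine might assign a selection probability value to each image, rank the listing, and pick images until the available capacity level is met:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional


@dataclass
class Image:
    name: str
    size_bytes: int
    last_viewed: Optional[datetime]  # None if the image has never been viewed


def selection_probability(image: Image, now: datetime) -> float:
    """Toy recency heuristic: recently viewed images receive higher values."""
    if image.last_viewed is None:
        return 0.05
    age = now - image.last_viewed
    if age < timedelta(days=7):
        return 0.90
    if age < timedelta(days=365):
        return 0.50
    return 0.10


def select_for_device(listing: List[Image], capacity_bytes: int) -> List[Image]:
    """Rank images by probability value and select them in order until capacity is met."""
    now = datetime.now()
    ranked = sorted(listing, key=lambda img: selection_probability(img, now), reverse=True)
    selected, used = [], 0
    for img in ranked:
        if used + img.size_bytes > capacity_bytes:
            break  # stop once the available capacity level has been met
        selected.append(img)
        used += img.size_bytes
    return selected
```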
  • The one or more factors on which the selection probability values are based may correspond to any images that have been recently viewed or added to a same collection as any images from the listing, any images added to the user account, a shared virtual space created by the user or to which the user belongs, and/or a notification sent to the user. For example, images recently added to a shared virtual space, or room, of which the user is a member may receive a higher selection probability value than images that were not added to the shared virtual space.
  • The shared virtual space may be any communications space created by the user and/or visited by the user, where one or more images may be shared between two or more users.
  • the aforementioned generating, assigning, determining, selecting, ranking, and sending may be repeated periodically at defined time intervals. For example, the steps may be repeated every few minutes, hours, days, upon launch of an application on the user device, and/or manually by the user. In some embodiments, the steps may be repeated in response to an action occurring. For example, the steps may be repeated in response to one or more images being added to the user account, a selection by the user to view one or more images within the user account, a selection to add one or more images to the user account, and/or an addition of one or more images to a shared virtual space.
  • Such methods may also include calculating a number of images stored within a user account on a content management system. For example, a user may have one thousand (1,000) images stored within their user account. For each image, a frequency of use may be determined. For example, the amount of times an image has been shared may correspond to the image's frequency of use.
  • a dynamic priority list for sending images to a user device from the content management system may be created. In some embodiments, the dynamic priority list may rank each image by an image's assigned selection probability value. In some embodiments, the dynamic priority list may queue the content management system to send images to the user device in order of each image's rank.
  • the frequency of use may correspond to an amount of times an image has been shared or requested to be shared, the amount of times an image has been viewed, and/or a number of contacts of the user account having a particular image stored within their own user account. For example, if multiple contacts of the user all have the same photograph stored within their user accounts, that image may receive a higher frequency of use than an image that may only be stored within one contact's user account.
  • an amount of available storage space on the user device may be calculated.
  • A first amount of images, each having a first resolution, may be sent to the user device based on the calculated amount of available storage space and the images' rank within the dynamic priority list.
  • a level of connectivity between the user device and the content management system may be determined along with calculating the available storage space.
  • Such methods may still further include determining an amount of available storage space on a user device and sending data to a content management system indicating the determined amount of available storage space.
  • a first dynamic priority list may be received from the content management system and may include a first plurality of images ranked in a first order based on a first probability value assigned to each image from the first plurality of images.
  • One or more inputs may be detected that correspond to one or more images from the first plurality of images, and data corresponding to the detected one or more images may be sent from the user device to the content management system.
  • a second dynamic priority list may be downloaded from the content management system to the user device.
  • the second dynamic priority list may include a second plurality of images ranked in a second order based on a second probability value assigned to each image from the second plurality of images.
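  • The client-side exchange described above might look roughly like the following sketch, in which ContentManagementClient is a hypothetical stand-in for whatever transport the user device and content management system actually use:

```python
from typing import List


class ContentManagementClient:
    """Hypothetical transport stub; a real client would talk to the content management system."""

    def report_available_storage(self, free_bytes: int) -> None:
        pass  # send the determined amount of available storage space to the server

    def fetch_priority_list(self) -> List[str]:
        return []  # image identifiers ranked by their assigned probability values

    def report_interactions(self, image_ids: List[str]) -> None:
        pass  # tell the server which images the user interacted with


def sync_once(client: ContentManagementClient, free_bytes: int, viewed: List[str]) -> List[str]:
    client.report_available_storage(free_bytes)  # 1. report available storage space
    first_list = client.fetch_priority_list()    # 2. receive the first dynamic priority list
    client.report_interactions(viewed)           # 3. send data about detected inputs
    second_list = client.fetch_priority_list()   # 4. download the second, re-ranked list
    return second_list
```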
  • FIG. 1 shows an exemplary system in accordance with various embodiments
  • FIG. 2 shows an illustrative diagram of a listing of images in accordance with various embodiments
  • FIG. 3 shows another illustrative diagram of a listing of images in accordance with various embodiments
  • FIG. 4 shows an illustrative diagram of a user interface in accordance with various embodiments
  • FIG. 5 shows another illustrative diagram of a user interface in accordance with various embodiments
  • FIG. 6A shows an illustrative diagram of a user interface displaying a user's storage settings in accordance with various embodiments
  • FIG. 6B shows another illustrative diagram of a user interface displaying a user's storage settings in accordance with various embodiments
  • FIG. 7 shows another illustrative diagram of a listing of images in accordance with various embodiments.
  • FIG. 8 shows an illustrative graph of a frequency of use for each image within a listing of images in accordance with various embodiments
  • FIG. 9 shows an illustrative flowchart of a process for sending dynamically selected images to a user device in accordance with various embodiments
  • FIG. 10 shows an illustrative flowchart of a process for creating a dynamic priority list in accordance with various embodiments
  • FIG. 11 shows an illustrative flowchart of a process for obtaining images from a content management system in accordance with various embodiments.
  • FIG. 12 shows another exemplary system in accordance with various embodiments.
  • Methods, systems, and computer readable media for dynamically selecting and sending images to a user device based on various network and device conditions are provided.
  • the user may be able to view, interact, share, and/or store images that are most relevant, important, and/or related to the user and/or the user's current activities.
  • the terms “device” and “content management system” are used herein to refer broadly to a wide variety of storage providers and data management service providers, electronic devices and user devices.
  • The term “content item” is used herein to refer broadly to a wide variety of digital data, documents, text content items, audio content items, video content items, portions of content items, and/or other types of data. Content items may also include files, folders or other mechanisms of grouping content items together with different behaviors, such as collections of content items, playlists, albums, etc.
  • the term “user” is also used herein broadly, and may correspond to a single user, multiple users, authorized accounts, an application or program operating automatically on behalf of, or at the behest of a person, or any other user type, or any combination thereof.
  • The terms “gesture” and “gestures” are also used herein broadly, and may correspond to one or more motions, movements, hoverings, inferences, signs, or any other such physical interactions with one or more sensors, or any combination thereof, including vocal commands or interpretations of eye movements based on retinal tracking.
  • The term “continuous real-time image” is also used herein broadly, and may correspond to live images captured via one or more image capturing components, continuous images captured, recorded images, or any other type of image that may be captured via an image capturing component, or any combination thereof.
  • the present invention may take form in various components and arrangements of components, and in various techniques, methods, or procedures and arrangements of steps.
  • the referenced drawings are only for the purpose of illustrating embodiments, and are not to be construed as limiting the present invention.
  • Various inventive features are described below that can each be used independently of one another or in combination with other features.
  • FIG. 1 shows an exemplary system in accordance with various embodiments.
  • System 100 may include user devices 102 a and 102 b , which may communicate with content management system 104 across network 106 .
  • Persons of ordinary skill in the art will recognize that although only two user devices are shown within system 100 , any number of user devices may interact with content management system 104 and/or network 106 , and the aforementioned illustration is merely exemplary.
  • Network 106 may support any number of protocols, including, but not limited to, Transmission Control Protocol/Internet Protocol (“TCP/IP”), Hypertext Transfer Protocol (“HTTP”), and/or Wireless Application Protocol (“WAP”).
  • user device 102 a and user device 102 b may communicate with content management system 104 using TCP/IP, and, at a higher level, use a web browser to communicate with a web server at content management system 104 using HTTP.
  • a variety of user devices 102 may communicate with content management system 104 , including, but not limited to, desktop computers, mobile computers, mobile communication devices (e.g., mobile phones, smart phones, tablets), televisions, set-top boxes, and/or any other network enabled device.
  • user devices may include, but are not limited to, smart phones, mobile phones, tablet computers, personal digital assistants (PDAs), laptop computers, digital music players, and/or any other type of user device capable of including a touch-sensing display interface.
  • touch-sensing display interfaces may include, but are not limited to, liquid crystal displays (LCD), monochrome displays, color graphics adapter (CGA) displays, enhanced graphics adapter (EGA) displays, variable-graphics array (VGA) displays, or any other display, or any combination thereof.
  • the touch-sensing display interface may include a multi-touch panel coupled to one or more processors to receive and detect gestures.
  • Multi-touch panels may include capacitive sensing mediums having one or more row traces and/or driving line traces, and one or more column traces and/or sensing lines.
  • user devices may, in some embodiments, include one or more image capturing components.
  • user devices 102 may include a front-facing camera and/or a rear facing camera.
  • Content management system 104 may allow a user with an authenticated account to store content, as well as perform management tasks, such as retrieve, modify, browse, synchronize, and/or share content with other accounts.
  • A counter-part user interface (e.g., stand-alone application, client application, etc.) may include a content management interface module to allow a user to perform functions offered by modules of content management system 104 .
  • FIG. 2 shows an illustrative diagram of a listing of images in accordance with various embodiments.
  • Listing 200 may correspond to a listing of images stored within a user account on a content management system (e.g., content management system 104 of FIG. 1 ).
  • Listing 200 may also correspond to a listing of images stored on a user device (e.g., client device 102 ), a user's account on a social media network, or a contact of the user.
  • Listing 200 may display images 202 , 204 , 206 , 208 , and 210 . Although only five (5) images are included within the presented listing, persons of ordinary skill in the art will recognize that any number of images may be included within the listing, and the use of five images is merely exemplary.
  • each image may include a title or identification tag.
  • image 202 may have title “IMAGE #1”
  • image 204 may have title “IMAGE #2”
  • image 206 may have title “IMAGE #3”
  • image 208 may have title “IMAGE #4”
  • image 210 may have title “IMAGE #5.”
  • the use of a title or an identification tag may depend on where listing 200 may be presented.
  • each image may include its associated title. This may aid the user in recognizing and/or viewing certain images within the listing.
  • listing 200 may correspond to a listing which may not be displayed to a user and/or may not be presented within a user interface.
  • each image may have an identification tag presented or linked to that image instead of, or in addition to, the title.
  • The identification tag may be any 8-, 16-, or 64-bit alphanumeric string that may describe various features of the respective image associated therewith.
  • each image from within listing 200 may have a selection probability value associated with that image.
  • the selection probability value may highlight a particular image's likelihood of being viewed, selected, shared, requested to be shared, interacted with, or to have any other action taken to it, or any combination thereof. For example, images that a user has recently viewed may have a higher selection probability value than images that have not been viewed in a long time (e.g., multiple years).
  • any action or combination of actions performed to an image, performed with an image, taken upon an image, and/or associated with an image may be used in any suitable manner to determine an image's selection probability value.
  • each image's selection probability value may be dynamic.
  • An image that has not been shared in over a year may have a lower selection probability value than an image that was viewed less than one week earlier. However, if that older image is then shared again, it may be assigned a higher selection probability value than the image viewed earlier that week.
  • each image may have their respective selection probability value stored and/or presented in selection probability value column 220 .
  • image 202 may have a selection probability value of 90%
  • images 204 , 206 , 208 , and 210 may each have a selection probability value of 60%, 70%, 20%, and 5%, respectively.
  • Persons of ordinary skill in the art will recognize that any image may be assigned any selection probability value, and the aforementioned selection probability values are merely exemplary.
  • column 220 may store and/or present the selection probability values in terms of percentage, rank, frequency, collection, or using any other value, or any other suitable recognition means capable of signifying an image's probability of being selected.
  • an image's rank may correspond to an order that the image may be sent, downloaded, and/or uploaded.
  • image 202 may have a selection probability value of 90% and therefore may be assigned rank “A”.
  • An image having a rank of “A” may be sent, uploaded, and/or downloaded prior to other images because of its higher selection probability value.
  • image 210 may have a selection probability value of 5% and may be assigned rank “E”.
  • An image having a rank of “E” may be sent, downloaded, and/or uploaded later than images having a rank of A, B, C, or D because those images may have a higher selection probability value.
  • the rank of an image may correspond to a collection of images that may be sent, downloaded, and/or uploaded.
  • Images 202 and 206 , which respectively have selection probability values of 90% and 70%, may, in some embodiments, be sent in a same collection and therefore may both have a rank of “A”.
  • the rankings presented here are merely illustrative, and persons of ordinary skill in the art will recognize that any ranking order may be used, and each rank may include any number of images.
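  • One assumed way to map selection probability values onto the letter ranks discussed above, grouping images whose values fall in the same band into a single collection (the band boundaries are illustrative only, not taken from the specification):

```python
def rank_for(probability: float) -> str:
    """Map a selection probability value in [0, 1] to a letter rank.

    Images whose values fall in the same band share a rank and could be
    sent together as a single collection; the band edges are illustrative.
    """
    bands = [(0.80, "A"), (0.60, "B"), (0.40, "C"), (0.20, "D")]
    for threshold, rank in bands:
        if probability >= threshold:
            return rank
    return "E"


print(rank_for(0.90))  # "A"
print(rank_for(0.05))  # "E"
```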
  • FIG. 3 shows another illustrative diagram of a listing of images in accordance with various embodiments.
  • listing 300 may be substantially similar to listing 200 of FIG. 2 , and the previous description of the latter may apply to the former.
  • listing 300 may include additional images, such as images 312 and 314 .
  • images 302 , 304 , 306 , 308 , and 310 may be substantially similar to images 202 , 204 , 206 , 208 , and 210 of FIG. 2 , and the previous description of the latter may also apply.
  • Listing 300 may correspond to a new version of listing 200 in response to one or more user actions being performed. For example, a user may add, or attempt to add, images 312 and 314 to a user account on a content management system. In response to adding the images, each image's assigned probability value and/or rank may change. Thus, listing 300 may be dynamic, and each image's selection probability value and/or rank may dynamically change depending on whether or not an action has occurred, and/or what type of action has occurred. For example, image 202 may not have been viewed in a week, and thus, in response to adding images 312 and 314 , image 202 may have its selection probability value decrease slightly due to the lack of recent viewing.
  • each image from listing 300 may have a selection probability value and a rank assigned to it.
  • images 302 , 304 , 306 , 308 , 310 , 312 , and 314 may respectively have selection probability values of 80%, 50%, 85%, 5%, 20%, 10%, and 90%, which may be listed in column 320 .
  • column 330 may list each image's corresponding rank.
  • image 302 may have a selection probability value of 80%, which may correspond to rank “C”.
  • Image 302 may be substantially similar to image 202 of FIG. 2 ; however, image 302 may now have a lower selection probability value (e.g., 80% from 90%) and a lower ranking order (e.g., rank “C” from rank “A”).
  • the changed probability value and rank may be due to an action being performed to one or more images within listing 300 .
  • image 302 and/or any other image within listing 300 may be assigned different selection probability values and/or ranks.
  • each of images 302 , 304 , 306 , 308 , and 310 may have their selection probability values change.
  • images 302 , 304 , 306 , 308 , and 310 may have been assigned selection probability values similar to that of images 202 , 204 , 206 , 208 , and 210 .
  • However, in response to images 312 and 314 being added, their values may be modified. This may correspond to the fact that, within the time period leading up to images 312 and 314 being added, the probability of a user viewing, interacting with, and/or performing any action on the images may have changed, and the new selection probability values may encompass this change.
  • each image's selection probability value from listing 300 may also change.
  • Image 302 may now have a selection probability value of 80%, which may correspond to rank “C”. This may be due to the fact that images 306 and 314 each have a higher selection probability value than image 302 (e.g., 85% and 90%, respectively). Images 306 and 314 may then have ranks of “B” and “A” respectively, as they have higher selection probability values and therefore may be capable of being sent prior to image 302 .
  • Persons of ordinary skill in the art will recognize that any selection probability value and any ranking order may be used and/or assigned to any image included within listing 300 , and the aforementioned values are merely exemplary.
  • FIG. 4 shows an illustrative diagram of a user interface in accordance with various embodiments.
  • User interface 400 may be any user interface displayed within a display screen on a user device (e.g., device 102 ).
  • user interface 400 may display image 402 in response to an attempt to share the image with one or more contacts, one or more social media networks, and/or a content management system.
  • user interface 400 may correspond to a shared virtual space that the user belongs to. For example, the user may belong to an image sharing stream or room with one or more contacts, and may attempt to share image 402 within that shared space.
  • User interface 400 may, in some embodiments, include user icon 404 .
  • User icon 404 may display a picture, icon, moving image, static image, or any other representation of a user, which may be used as a form of visual identification for a user within user interface 400 .
  • user interface 400 may correspond to a shared virtual space with a large number of users, and user icon 404 may help in letting other users recognize which user may be adding, or attempting to add, an image, such as image 402 to the shared virtual space.
  • the user icon may be assigned to the user by other users.
  • user icon 404 may correspond to an image stored on a user device corresponding to a particular contact.
  • the user icon may be self-assigned (e.g., the contact may decide their icon), and/or the user icon may be randomly or selectively assigned by a content management system. In some embodiments, no user icon may be present, or a static image may be displayed until a user icon has been selected/assigned.
  • user interface 400 may include action 404 a , time of action 404 b , and/or comment 404 c .
  • Action 404 a may describe to anyone viewing user interface 400 what the intended action of the user corresponding to user icon 404 may be.
  • action 404 a may display “Kayleigh shared a photo,” indicating that the user may have shared image 402 within user interface 400 .
  • Time of action 404 b may, in some embodiments, correspond to a general time when action 404 a occurred.
  • time of action 404 b may indicate that action 404 a occurred at a specific time (e.g., hour, minute), a specific date (e.g., day/month/year), and/or a general time period (e.g., yesterday, last week, a month ago, etc.).
  • Comment 404 c may, in some embodiments, correspond to a comment that the user may have provided along with the image.
  • comment 404 c may be created by the user and may be linked to image 402 . Anytime a user views image 402 within the shared virtual space, that user may be presented with comment 404 c as well.
  • any one of action 404 a , time of action 404 b , and/or comment 404 c may be removed or omitted from user interface 400 , and/or any other feature or icon may be included, and the aforementioned is merely exemplary.
  • user interface 400 may be presented on a user device in response to a contact sharing image 402 with the user.
  • a contact corresponding to user icon 404 may share image 402 with the user of device 102 .
  • the user may be able to download image 402 to their user device by selecting button 406 .
  • Button 406 may be any button operable to either save the shared image and/or re-share image 402 with another contact, another social media network, or a content management system.
  • a user may receive image 402 from a contact and then re-share image 402 .
  • user interface 400 may correspond to a shared virtual space that the user belongs to.
  • the user may be in a shared virtual space with one or more contacts who may share an image, such as image 402 , with the user.
  • the user may be able to add one or more photos to the shared virtual space by selecting “Add Photos” button 408 .
  • the user may be presented with a listing of images stored on their user device and/or within their user account on the content management system to share within the shared virtual space.
  • the user may select button 408 and reply by sending another image (e.g., an image of another dog) to that contact who shared image 402 .
  • user interface 400 may also include “Send Message” button 410 .
  • Button 410 may allow the user to send a textual message, an instant message, an email, and/or a chat message to any contact already within a shared virtual space with the user. For example, in response to receiving image 402 , the user may select button 410 and send a message back to the contact who sent image 402 , commenting on the shared image. In some embodiments, the message may be displayed with or substantially near comment 404 c , thus creating a virtual chat or textual stream regarding image 402 .
  • FIG. 5 shows another illustrative diagram of a user interface in accordance with various embodiments.
  • User interface 500 may include multiple images shared within a shared virtual space. Images 502 a , 502 b , and 502 c may, for example be displayed within user interface 500 . In some embodiments, one or more of images 502 a , 502 b , and 502 c may be moveable about the user interface. For example, a user may provide one or more gestures to augment a location of an image. In some embodiments, images displayed within user interface 500 may be manipulated, modified, and/or resized. For example, a user may choose to view each image displayed within user interface 500 , or the user may decide to zoom in and display only a single image.
  • user interface 500 may be capable of displaying more images and/or different images than images 502 a - c .
  • the user may share thirteen (13) images within a shared virtual space.
  • Only images 502 a - c may initially be displayed; however, the user may be presented with one or more additional images by pressing “See All Photos” button 508 .
  • Selecting button 508 may present the user with any number of images which may have previously been shared within the shared virtual space.
  • pressing button 508 may provide the user with one or more images stored within their user account and/or resident on their local device.
  • A user may select button 508 and be presented with their user device's camera roll. The user may then be able to select one or more images from within their camera roll to share with one or more contacts, one or more social media networks, one or more shared virtual spaces, and/or a content management system.
  • user interface 500 may include contact name 504 a and contact icon 504 b .
  • User interface 500 may display images 502 a - c , which may be presented within a shared virtual space between a user and a contact of the user.
  • the contact's name may be displayed in the user interface as contact name 504 a .
  • Contact name 504 a may display multiple names.
  • contact icon 504 b may correspond to a picture or image of the contact from the shared virtual space.
  • Icon 504 b may correspond to any image and may be chosen by the contact, designated by the user, and/or assigned by a content management system hosting the shared virtual space.
  • contact name 504 a and contact icon 504 b may be representative of the user instead of a contact of the user.
  • user interface 500 may also include user icon 506 corresponding to the user being presented with user interface 500 .
  • User icon 506 may correspond to any photograph and/or image chosen by the user to represent the user within the shared virtual space.
  • User interface 500 may also include action 506 a , time of action 506 b , and/or comment 506 c , which may, in some embodiments, be substantially similar to action 404 a , time of action 404 b , and/or comment 404 c of FIG. 4 with the exception that the latter may apply to a contact of the user whereas the former may apply to the user.
  • user icon 506 and action 506 a , time of action 506 b , and comment 506 c may, in some embodiments, apply to a contact of the user, and contact name and icon 504 a and 504 b may then correspond to the user, and the use of one collection for either the user and/or the contact is merely exemplary.
  • FIG. 6A shows an illustrative diagram of a user interface displaying a user's storage settings in accordance with various embodiments.
  • User interface 600 may include network indicator 602 , wireless indicator 604 , and battery indicator 606 .
  • Network indicator 602 may indicate the strength of the available network connection.
  • the user may have a cellular data carrier providing cellular network service.
  • Network indicator 602 may indicate how strong or powerful the cellular service may be in a particular location at a particular time.
  • Wireless indicator 604 may behave similarly to network indicator 602 with the exception that wireless indicator 604 may indicate the strength of any available Wi-Fi signal.
  • battery indicator 606 may indicate an amount of available battery charge on the user device.
  • indicator 606 may display “50%” battery charge.
  • indicator 606 may indicate a percentage of battery charge remaining, an amount of battery time remaining, an amount of battery used, or any other charge indicator, or any combination thereof.
  • Although user interface 600 displays indicators 602 , 604 , and 606 , one or more of these indicators may not be displayed and/or one or more additional indicators may be displayed.
  • user interface 600 may indicate an amount of storage space occupied on the user device by various files or content items.
  • user interface 600 may correspond to a settings interface, indicating the various storage levels on the device.
  • user interface 600 may include photo storage bar 610 and video storage bar 620 .
  • Photo storage bar 610 may display an amount of storage space occupied on the user device by photographs.
  • the user may have 4 GB of photographs stored on their user device.
  • Current photo storage value 612 may indicate the amount of storage space occupied by photos on the user device (e.g., 4 GB).
  • Total photo storage value 614 may indicate a total amount of storage space available on the user device for storing photographs (e.g., 16 GB).
  • Video storage bar 620 may display an amount of storage space occupied on the user's user device by videos. For example, the user may have 8 GB of videos stored on their user device.
  • Current video storage value 622 may indicate the current amount of storage space occupied on the user device (e.g., 8 GB) by videos.
  • Total video storage value 624 may indicate a total amount of storage space available on the user device for storing videos (e.g., 16 GB). Persons of ordinary skill in the art will recognize that any amount of storage space may be allotted for any content item, and any amount of storage space may be occupied.
  • the total amount of storage space available for all content items may be the same.
  • the user device may have a total amount of available storage of 16 GB. In this scenario, if the device has 4 GB of photographs and 8 GB of videos, then 12 GB of storage space may be taken up. In some embodiments, there may only be one storage bar displaying a total amount of storage available and/or taken up on the device.
  • Each of storage bars 610 and 620 may represent an individual storage bar displaying a total amount of storage space available for a single media type. For example, the user device may have 16 GB of storage space available for photographs and 16 GB of storage space available for videos.
  • the amount of storage space available on the user device may be determined using one or more system performance algorithms.
  • The user device's operating system may include one or more options allowing the user to be presented with a screen substantially similar to user interface 600 that indicates the storage levels of the user device.
  • the performance algorithm may be performed upon a user's request and/or automatically at predefined time intervals (e.g., daily, monthly, annually, etc.).
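  • The storage check described above might be sketched as follows; the per-media-type quotas are illustrative assumptions, and a real implementation would query the operating system (for example via Python's shutil.disk_usage) rather than hard-code totals:

```python
import shutil


def remaining_quota_gb(total_quota_gb: float, used_gb: float) -> float:
    """Remaining space in an assumed per-media-type quota (cf. values 612/614/616)."""
    return max(total_quota_gb - used_gb, 0.0)


def device_free_bytes(path: str = "/") -> int:
    """Whole-device free space, one possible input when determining the capacity level."""
    return shutil.disk_usage(path).free


print(remaining_quota_gb(16.0, 4.0))  # 12.0 GB of photo storage remaining, as in FIG. 6A
print(remaining_quota_gb(16.0, 8.0))  # 8.0 GB of video storage remaining
```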
  • FIG. 6B shows another illustrative diagram of a user interface displaying a user's storage settings in accordance with various embodiments.
  • User interface 600 of FIG. 6B may, in some embodiments, be substantially similar to user interface 600 of FIG. 6A , with the exception that the former may display various amounts of data available for download on the user device.
  • a user may continually download content until the amount of available storage space has been filled. For example, in order to continually maintain a local user experience on a user device while offline, a large amount of content may be downloaded in advance to the user device. The content may continually download and/or may be dynamically changing.
  • Photo storage bar 610 may also include available storage value 616 .
  • the user device may be capable of storing 16 GB of photographs, as indicated by total photo storage value 614 .
  • Current photo storage value 612 may indicate only 4 GB of the available 16 GB may be occupied by photographs and current available photo storage value 616 may indicate that a remaining 12 GB may be available.
  • Available video storage value 626 may indicate that there may be 8 GB of available storage on the user device for videos because there are already 8 GB of videos occupying the 16 GB of total storage space for videos.
  • values 616 and 626 may indicate an amount of storage space occupied by content items dynamically sent to the user device from a content management system.
  • A user may have a large quantity of images stored within their user account on a content management system (e.g., content management system 104 ).
  • Each image stored therein may have a selection probability value assigned to that image based on the probability that the user may view, or attempt to view, that image. For example, images that have recently been added, shared, and/or viewed may be assigned a higher selection probability value than images that have not been viewed in a long period of time (e.g., one or more years).
  • The content management system may continually send images from the user account to the user device based on the selection probability values of each image. For example, images having higher selection probability values may be sent prior to images having lower selection probability values. The images may continue to be sent to the user device until the total photo storage value 614 has been met. In some embodiments, the images sent to the user device may continually change based on any action performed on the images. For example, if one or more images are shared with a contact of the user, those images may be assigned a higher selection probability value and therefore the order of sending the images to the user device may change.
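  • A brief sketch (with hypothetical identifiers and boost amount) of how an action such as sharing an image could raise its selection probability value and change the order in which images are sent, as described above:

```python
def on_image_shared(probabilities: dict, image_id: str, boost: float = 0.3) -> list:
    """Raise the shared image's selection probability value and return the new send order."""
    probabilities[image_id] = min(probabilities[image_id] + boost, 1.0)
    # Images with higher probability values are sent to the user device first.
    return sorted(probabilities, key=probabilities.get, reverse=True)


order = on_image_shared({"img_1": 0.5, "img_2": 0.7, "img_3": 0.1}, "img_1")
print(order)  # ['img_1', 'img_2', 'img_3'] once the share boosts img_1 to 0.8
```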
  • FIG. 7 shows another illustrative diagram of a listing of images in accordance with various embodiments.
  • Listing of images 700 may, in some embodiments, be substantially similar to listing 200 of FIG. 2 , with the exception that listing 700 may indicate various parameters associated with each image.
  • The parameters displayed may include, but are not limited to, the number of times an image has been viewed, the number of times an image has been shared, and/or the number of times an image has been requested to be shared. Each of these parameters may have a corresponding numerical value associated with each image and listed within columns 720 , 730 , and 740 , respectively.
  • each image within listing 700 may have a corresponding value listed within column 720 corresponding to a number of times that image has been viewed.
  • image 702 may have been viewed one thousand (1,000) times, whereas image 714 may have been viewed three (3) times.
  • image 704 may have been viewed five thousand (5,000) times, whereas image 708 may have been viewed three hundred (300) times.
  • the amount of times an image has been viewed may correspond to any viewing of the image by the user, a contact of the user, a contact of the contact of the user, and/or any other individual or collection of individuals who may access and/or obtain the image and view it on their own device.
  • each image within listing 700 may have a corresponding value listed within column 730 corresponding to a number of times that image has been shared. For example, image 706 may have been shared by the user twenty (20) times, whereas image 710 may have been shared three hundred (300) times.
  • the number of times an image has been shared may correspond to any sharing of the image by the user.
  • A user may share any image stored within their user account on a content management system with one or more contacts, one or more social media networks, and/or one or more shared virtual spaces to which the user belongs.
  • An image may be stored on the user device and may be shared with any number of contacts by uploading the image to a content management system. For example, a user may capture an image with their user device and upload that image to their user account on a content management system.
  • each image within listing 700 may have a corresponding value listed within column 740 corresponding to an amount of times that particular image has been requested to be shared. For example, image 712 may have been requested to be shared five (5) times, whereas image 716 may have been requested to be shared eighty (80) times.
  • a request to share an image may be received by a user from a contact of the user. For example, a user may have an image of their new child stored within their user account. The image may be requested to be shared with the user's friends and family so that they can each view and/or download a version of the stored image.
  • the user may receive the request to share an image via an email, a text message, and/or a pop-up notification, or using any other suitable notification, or any combination thereof.
  • FIG. 8 shows an illustrative graph of a frequency of use for each image within a listing of images in accordance with various embodiments.
  • Graph 800 may include data points corresponding to listing of images 700 of FIG. 7 .
  • diamond pointed data markers 802 may correspond to column 720
  • square data markers 804 and triangle data markers 806 may correspond to columns 730 and 740 , respectively.
  • Each data marker may correspond to a particular image from listing 700 , and may represent the amount of times that image was viewed (e.g., markers 802 ), shared (e.g., markers 804 ), and requested to be shared (e.g., markers 806 ).
  • Graph 800 may also include first axis 810 corresponding to an image number from listing of images 700 and a second axis 820 corresponding to values of columns 720 , 730 , and 740 .
  • first axis 810 may be referred to as the x-axis
  • second axis 820 may be referred to as the y-axis.
  • first axis 810 and second axis 820 may correspond to any axes and may be associated with any coordinate system, and the use of the x- and y-axes is merely exemplary.
  • second axis 820 may be in units referred to as “SAMPLE INDEX”.
  • axes 810 and/or 820 may be in logarithmic form, thus allowing large numbers and smaller numbers to both be viewed on a same graph with greater ease.
  • Graph 800 may plot the data markers for each image from listing 700 , thus allowing the frequency of use for each image to be viewed in a single graph.
  • For example, image 706 of listing 700 , which may be the third image of the listing (or “IMAGE #3”), may have been viewed 25 times, shared 20 times, and requested to be shared 10 times.
  • the frequency of use may be used to determine a selection probability value for an image.
  • Data for the amount of times an image is viewed, shared, and/or requested to be shared may all be factors used to determine that particular image's selection probability value.
  • the selection probability value may be calculated in any suitable manner.
  • the selection probability value may be the aggregate of the number of times an image is viewed, shared, and requested to be shared.
  • image 714 may have an aggregate from columns 720 , 730 , and 740 of six (6) whereas image 710 may have an aggregate of ten thousand five hundred fifty (10,550).
  • image 710 may have a higher selection probability value than image 714 because image 710 may have an aggregate value which is much higher.
  • The selection probability value may be calculated based on one or more of the number of times an image is viewed, shared, and/or requested to be shared being weighted. For example, the number of times an image has been requested to be shared may be weighted twice that of the number of times an image has been shared. As an illustrative example, image 704 may have been requested to be shared thirty-five (35) times, whereas it may have also been shared fifty (50) times. If the number of times an image has been requested to be shared is weighted, for example, twice that of the number of times that the image has been shared, image 704 may now have a value in column 740 of seventy (70), and therefore may obtain a different selection probability value than if it was not weighted double.
  • any combination, permutation, and/or weighting may be used to determine the selection probability value for an image based on any frequency of use characteristics, and the aforementioned examples are merely illustrative.
  • one or more algorithms running on a user's user device and/or on a content management system may be used to calculate the selection probability values.
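  • The weighting described above could be implemented along these lines; the weights and the saturating normalization are illustrative assumptions rather than values from the specification:

```python
def probability_from_usage(views: int, shares: int, share_requests: int,
                           w_views: float = 1.0, w_shares: float = 1.0,
                           w_requests: float = 2.0) -> float:
    """Weighted aggregate of frequency-of-use counts, squashed into [0, 1)."""
    score = w_views * views + w_shares * shares + w_requests * share_requests
    return score / (score + 100.0)  # simple saturating normalization


# Share requests weighted twice as heavily as shares, as in the example above.
print(probability_from_usage(views=25, shares=20, share_requests=10))  # ~0.39
print(probability_from_usage(views=3, shares=2, share_requests=1))     # ~0.07
```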
  • FIG. 9 shows an illustrative flowchart of a process for sending dynamically selected images to a user device in accordance with various embodiments.
  • Process 900 may begin at step 902 .
  • a listing of images may be generated.
  • the listing may include images stored within a user account on a content management system.
  • listing 200 of FIG. 2 may correspond to a listing of some or all images stored within a user account on content management system 100 .
  • the listing may include every image stored within the user account. For example, if the user has one thousand (1,000) images stored within their user account, then each of those one thousand images may be included within the generated listing.
  • the listing of images may be generated in response to a user input being detected by the content management system. For example, the user may perform an action on an image located within their user account, and in response the listing may be generated.
  • A selection probability value may be assigned to each image within the listing.
  • the selection probability value may be based on at least one factor, such as the amount of times an image has been viewed or shared.
  • the selection probability value may correspond to a likelihood of use of the at least one image. For example, image 710 of FIG. 7 may have been viewed ten thousand (10,000) times, whereas image 714 may have been viewed three (3) times. In this scenario, image 710 may have a selection probability value assigned to it that may be greater than a selection probability value assigned to image 714 based on the number of times that each image has been viewed.
  • the at least one factor used to determine the selection probability value of an image may correspond to the number of times an image has been shared and/or the number of times an image has been requested to be shared.
  • any factor may be used to assign a selection probability value to a particular image, and the use of the number of times viewed, shared, and/or requested to share, are merely exemplary.
  • multiple factors may be used to determine the selection probability value. For example, the number of times an image has been viewed and shared may both be used to determine the image's selection probability value. In some embodiments, one or more factors may be weighted greater or less than other factors. For example, the amount of times an image has been viewed may be weighted higher than the amount of times an image has been shared. In some embodiments, a recent viewing of a first image from the listing of images and/or a recent viewing of one or more additional images included in a same collection as one of the images from the listing may be used, at least in part, to determine and/or assign the selection probability value to a particular image. For example, a first image may be included in a collection of images from a user's vacation.
  • Within the vacation photos may be a second image that the user may view on their user device.
  • the first image may receive a higher selection probability value because the second image has recently been viewed.
  • The addition of one or more new images to the user account on the content management system, to a shared virtual space, and/or to a shared virtual space to which the user belongs may also be factors on which the selection probability value may be based.
  • The selection probability value of each image may dynamically update in response to any user interaction with a selected image from the listing of images, the content management system, and/or an additional image from within the listing. For example, the user may decide to share image 206 of FIG. 2 , and in response to the detected interaction with the content management system to share the image, each image from within listing 200 may have its selection probability value updated.
  • an available capacity level on the user device may be determined.
  • the user may be presented with user interface 600 indicating current storage levels on their user device.
  • storage bar 610 may indicate to the user that there may be 12 GB of storage space available for storing photos and 4 GB of photos may already be stored on their user device.
  • user interface 600 may indicate a total amount of storage space occupied and/or available on the user device. For example, only one storage indicator may be present on user interface 600 indicating all available storage levels of the user device.
  • determining the available capacity level of the user device may include determining a level of connectivity between the user device and the content management system. For example, if the level of connectivity between the user device and the content management system is very low, the transmission of one or more images to the user device may not be possible or may take a substantially long time. As another example, if the level of connectivity is determined to be across a cellular network instead of over a Wi-Fi connection then the transmission may be paused so that cellular data usage is not taken up by the transmission of images. This may be especially useful if a large amount of images are set to be sent to the user device and the user only has a limited cellular data plan.
  • determining the available capacity level on the user device may also include detecting whether the user device has an available battery charge exceeding a threshold for transmission. For example, if the user device has a very low battery charge (e.g., 5% charge remaining), the transmission may be paused until the battery charge exceeds a predefined battery threshold needed for transmission (e.g., 20% charge). This may ensure that the user device does not completely drain out its battery charge downloading images so that no charge may be available for the device in case of an emergency.
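  • The transmission gating described above might be sketched as follows; the 20% battery threshold and the cellular check mirror the examples in the text, while the function and parameter names are assumptions:

```python
def may_transmit(battery_percent: float, on_wifi: bool,
                 battery_threshold: float = 20.0, allow_cellular: bool = False) -> bool:
    """Pause transfers on low battery or when only a cellular connection is available."""
    if battery_percent < battery_threshold:
        return False  # wait until the charge exceeds the predefined battery threshold
    if not on_wifi and not allow_cellular:
        return False  # avoid consuming a limited cellular data plan
    return True


print(may_transmit(battery_percent=5.0, on_wifi=True))    # False: below the 20% threshold
print(may_transmit(battery_percent=80.0, on_wifi=False))  # False: cellular connection only
print(may_transmit(battery_percent=80.0, on_wifi=True))   # True
```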
  • one or more images may be selected from the listing of images.
  • The selected images may, in some embodiments, be selected for transmission to the user device, and the selection may be based on each image's selection probability value and/or the available capacity level of the user device. For example, image 202 may have a selection probability value of 90%, whereas image 210 may have a selection probability value of 5%. If the available capacity level on the user device is such that only one image may be capable of being downloaded to the device, then because image 202 has a higher selection probability value, it may be sent to the user device instead of image 210 . However, if the user device has enough storage space available that all five images from listing 200 may be downloaded, then both images 202 and 210 may be downloaded to the user device along with images 204 , 206 , and 208 .
  • selecting images from within the listing to be sent to the user device may also include determining whether any of the images within the listing are already stored locally on the user device. For example, image 204 of listing 200 may already be stored on the user device. In this scenario, sending image 204 as one of the selected images may be repetitive, and therefore that image may not be sent.
  • one or more systems checks may be performed on the user's user device prior to sending any of the images to determine which images and/or content may already be stored thereon. For example, the systems check may determine the photos stored on the user device as well as the amount of available storage space on the user device.
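  • A minimal sketch of the systems check described above, removing images that are already resident on the user device before transmission (the identifiers are hypothetical):

```python
def filter_already_stored(selected_ids: list, local_ids: set) -> list:
    """Drop images the user device already holds so they are not sent twice."""
    return [image_id for image_id in selected_ids if image_id not in local_ids]


queue = filter_already_stored(["img_202", "img_204", "img_206"], local_ids={"img_204"})
print(queue)  # ['img_202', 'img_206']: img_204 is skipped because it is already local
```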
  • the dynamically selected images may be ranked in order of priority for transmission to the user device.
  • column 230 of FIG. 2 may present a rank for each image from listing 200 .
  • the rank may correspond to each image's selection probability value.
  • image 202 of listing 200 may have the highest selection probability value (e.g., 90%), whereas image 210 may have the lowest selection probability value (e.g., 5%).
  • image 202 may have a rank of “A” indicating that it may be first to be sent, whereas image 210 may have a rank of “E” indicating that it may be the last image from listing 200 sent.
  • The ranking order for transmission may correspond to ordering the images from the image having the highest selection probability value to the image having the lowest selection probability value.
  • a threshold probability value may be used, such that any image having a selection probability value below the threshold may not be sent.
  • the ranking may be in order of the image or images having the lowest selection probability value to the image(s) having the highest selection probability value.
  • the dynamically selected images may be sent to the user device.
  • The transmission may be performed in ranking order. For example, an image having the ranking of “A” may be sent before an image having the ranking of “B” or “C”.
  • the images may be sent to storage on the user device.
  • Transmission of the images may continue to occur until the available capacity level of the user device has been met. For example, if only 12 GB of storage space is available on the user device, images may continue to be sent to the user device until the 12 GB of storage space has been occupied.
  • the various steps of process 900 may be repeated periodically at defined time intervals.
  • the steps may repeat hourly, daily, monthly, annually, or at any other time interval.
  • the user may define the time intervals.
  • the user may have process 900 repeat every six (6) hours.
  • process 900 may repeat continually. For example, upon completion of the transmission of the selected images to the user device, a new listing of images within the user account may be generated.
  • the various steps of process 900 may repeat in response to an addition of an image to the user account. For example, upon images 312 and 314 being added to the user account, a new listing of images (e.g., listing 300) may be generated and the probability values, and thus rank and order of transmission of the images, may change. In some embodiments, the various steps of process 900 may repeat in response to one or more images being selected to be added. For example, images 312 and 314 may be selected to be added to the user account and process 900 may repeat. In some embodiments, the various steps of process 900 may repeat in response to one or more images being added to a shared virtual space to which the user belongs. For example, the user may add images 502 a - c of FIG. 5 to a shared virtual space. By adding these images to the shared virtual space, a new listing of images may be generated, and new selection probability values, and thus a new ranking order for transmission, may be assigned.
  • FIG. 10 shows an illustrative flowchart of a process for creating a dynamic priority list in accordance with various embodiments.
  • Process 1000 may begin at step 1002.
  • a number of images stored within a user account on a content management system may be calculated.
  • a user may store one or more images within their user account on a content management system.
  • the content management system may, in some embodiments, perform a status report to calculate the number of images that the user has stored within their user account.
  • the user may be capable of storing any number of images within their user account.
  • the user may store other content items within their user account including, but not limited to, videos, audio files (e.g., mp3s), textual documents, or any other file type, or any combination thereof.
  • a frequency of use may be determined for each image calculated to be stored within the user account. Any frequency of use parameter may be determined, such as the number of times the image has been viewed. For example, image 702 of FIG. 7 may have been viewed one thousand (1,000) times.
  • one or more additional parameters may be determined corresponding to an image's frequency of use. For example, a number of times an image has been shared and/or a number of times an image has been requested to be shared may be determined.
  • the frequency of use may correspond to a determined number of contacts who also have a same image stored within their user account.
  • a selection probability value may be assigned to each image.
  • the selection probability value may be based on the determined frequency of use of each image. For example, an image that has been shared numerous times may be assigned a higher selection probability value than an image that has never been shared.
  • the selection probability value assigned to an image may be based on any determined frequency of use. For example, a number of times an image has been viewed, shared, and/or requested to be shared may all contribute to determining the frequency of use of an image.
  • step 1006 of FIG. 10 may be substantially similar to step 904 of FIG. 9 , and the previous description of the latter may apply to the former.
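As a rough illustration of the frequency-of-use and probability-assignment steps of process 1000, the sketch below folds several usage signals into a selection probability. The weights and the normalization against the most-used image are assumptions made for the example; the disclosure does not prescribe a particular formula.

```python
def frequency_of_use(views, shares, share_requests, contacts_with_copy):
    """Combine usage signals (views, shares, share requests, and the number of
    contacts holding the same image) into one frequency-of-use score.
    The weights here are illustrative only."""
    return 1.0 * views + 5.0 * shares + 3.0 * share_requests + 2.0 * contacts_with_copy

def selection_probabilities(images_usage):
    """Map each image's frequency-of-use score to a 0-1 selection probability
    by normalizing against the most-used image in the account."""
    scores = {image_id: frequency_of_use(**usage)
              for image_id, usage in images_usage.items()}
    top = max(scores.values()) or 1.0
    return {image_id: score / top for image_id, score in scores.items()}
```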
  • a dynamic priority list may be created for sending images to a user device from the content management system.
  • the dynamic priority list may rank each image by its assigned selection probability value. For example, listing 200 may have the images included therein ranked based on each image's assigned selection probability value.
  • the dynamic priority list may also queue the content management system to send images to the user device in order of each image's rank. For example, image 202 of FIG. 2 may have a rank of “A”, whereas image 208 may have a rank of “D”. Thus, in this scenario, image 202 may be sent prior to image 208 because it has a “higher” rank.
  • Persons of ordinary skill in the art will recognize that any image may be ranked in any order using any parameter, and the use of sending an image with a rank of “A” before an image with a rank of “B” is merely exemplary.
  • an amount of available storage space on the user device may be calculated. For example, a user device may have 16 GB of total storage space where 4 GB of storage space may already be occupied. In this scenario, there may be 12 GB of storage space calculated to be available on the user device.
  • a first amount of images may be sent to the user device.
  • the first amount of images may be of a first resolution. For example, each image within the first amount of images may correspond to a low-resolution image (e.g., 75×75 pixels).
  • the first amount of images, and their resolution, may be based on the amount of available storage space and each image's rank.
  • an image having a rank of “A” may be the only image capable of being sent to the user device.
  • those images may be sent to the user device.
  • a first number of images from the dynamic priority list may be determined to be sendable to the user device.
  • the first amount of images may be in a first resolution, which may be based on the calculated available storage space.
  • a second number of images may be determined to be sendable to the user device in a second resolution based on the available amount of storage space. For example, if there is 2 GB of available storage space on the device, two images having a combined size of 1 GB may be sent in a high resolution, and ten images having a combined size of 1 GB may also be sent in a lower resolution.
  • the aforementioned example is merely one illustration, and any number of images having any resolution may be determined to be sendable to the user device.
  • the first number of images in the first resolution may correspond to a first subset of images stored on the content management system having a high-quality resolution.
  • one or more images stored within the user account on the content management system may be of a high-quality resolution (e.g., 1080 pixels), and these images may be sendable to the user device within the first number of images.
  • the second number of images in the second resolution may correspond to a second subset of images stored on the content management system having a low-quality resolution (e.g., 256 pixels).
  • the first subset of images may include fewer images than the second subset of images; however, the two subsets may each occupy an equal amount of storage. For example, within 1 GB of storage space, ten (10) high quality images may be sendable, whereas one hundred (100) low quality images may be sendable within the same 1 GB.
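The split between a small high-resolution subset and a larger low-resolution subset might be planned as in the following sketch. The byte sizes and the even budget split are illustrative assumptions, not values taken from the disclosure.

```python
def plan_transmission(priority_list, available_bytes,
                      high_res_bytes=2_000_000, low_res_bytes=40_000,
                      high_res_fraction=0.5):
    """Split the available storage between a small set of high-resolution images
    and a larger set of low-resolution copies, taken in priority order."""
    high_budget = int(available_bytes * high_res_fraction)
    low_budget = available_bytes - high_budget

    n_high = min(len(priority_list), high_budget // high_res_bytes)
    n_low = min(len(priority_list), low_budget // low_res_bytes)

    return {
        "high_resolution": priority_list[:n_high],  # fewer, larger images
        "low_resolution": priority_list[:n_low],    # more, smaller images
    }
```

The same priority order feeds both subsets, so the highest-ranked images appear in both resolutions while lower-ranked images may arrive only as low-resolution copies.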
  • FIG. 11 shows an illustrative flowchart of a process for obtaining images from a content management system in accordance with various embodiments.
  • Process 1100 may begin at step 1102 .
  • an amount of available storage space on a user device may be determined.
  • user interface 600 of FIG. 6B may display storage bar 610 indicating available storage 616 of 12 GB on a user device.
  • step 1102 of FIG. 11 may be substantially similar to step 906 of FIG. 9 , and the previous description of the latter may apply to the former.
  • data may be sent to the content management system indicating the determined amount of available storage space on the user device.
  • device 102 may send a system status report to a content management system indicating the current storage levels of the user device.
  • the system status report may be run at specific time intervals and/or upon user request, for example.
  • the system status report may detail the amount of storage space currently occupied on the user device by various content items.
  • the status report may indicate that the user device may have 4 GB of storage space occupied by photographs out of total storage space 614 , and therefore may have 12 GB of storage space available for additional photographs.
  • a first dynamic priority list may be received from the content management system.
  • the first dynamic priority list may include a first plurality of images that may be ranked in a first order based on a first selection probability value assigned to each image within the list.
  • listing 200 may include images 202 , 204 , 206 , 208 , and 210 , each having selection probability values assigned to them listed within column 220 .
  • the assigned selection probability values may be used to determine a rank of each image to be sent to the user device from the user account on the content management system. For example, image 202 may have a selection probability value of 90%, whereas image 206 may have a selection probability value of 70%.
  • image 202 may have a rank of “A”, whereas image 206 may have a rank of “B”.
  • images may be downloaded or attempted to be downloaded to the user device in ranking order.
  • image 202 may be ranked higher than image 206 , and therefore may be downloaded first.
  • any ranking order may be assigned to images within the listing of images, and the use of ranking images based on each image's selection probability value is merely exemplary.
  • one or more inputs may be detected using the user device.
  • the one or more inputs may be performed on one or more images from the first plurality of images.
  • a user may be presented with listing 200 within a user interface on their user device.
  • the user may perform a gesture or an action on an image in an attempt to share that image. Any gesture may be performed including, but not limited to, taps, swipes, hovers, audio commands, visual commands, motions, or any other gesture, or any combination thereof.
  • data corresponding to the detected one or more inputs may be sent to the content management system.
  • the user device may detect that the user may be attempting to share image 202 via one or more gestures performed on the user interface of the user device.
  • the user device may store the data corresponding to the gestures in one or more data files or gesture detection files, and send those files to the content management system.
  • a second dynamic priority list may be downloaded in response to the data corresponding to the detected input being sent to the content management system.
  • the second dynamic priority list may include a second plurality of images ranked in a second order based on a second selection probability value being assigned to each image.
  • the second selection probability value may correspond to the change in the likelihood of a particular image being viewed after the detected gesture.
  • Listing 300 may include, for example, image 308 , which may be substantially similar to image 208 from listing 200 . Image 208 may have initially had a selection probability value of 20% and a transmission ranking of “D”.
  • the selection probability value of image 308 may now be lower than before due to a lower likelihood of viewing this particular image.
  • image 308 may be assigned a selection probability value of 5% corresponding to a transmission ranking of “G”.
  • the detected input or inputs may cause the first probability value assigned to each image within the listing to change to a second probability value. For example, if the user attempts to share image 206, the selection probability value for that image may change in response to the detected input. In this scenario, image 206, which may have initially had a selection probability value of 70%, may change to now correspond to image 306 of listing 300, having a selection probability value of 85%.
  • sending the detected input data may be repeated periodically at specific time intervals. For example, the input data may be sent to the content management system from the user device every hour, or every day.
  • the first and second dynamic priority list may each be modified based on the transmission of the detected input data at each time interval. For example, each time the input data may be sent to the content management system, the dynamic priority list may update.
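On the client side, the reporting loop of process 1100 might look like the sketch below. The `api` object is a placeholder for whatever client library talks to the content management system; its method names (`send_status`, `send_inputs`, `get_priority_list`, `download_image`) are invented for illustration and are not part of any real API.

```python
import shutil
import time

def report_and_refresh(api, device_id, gesture_log, interval_seconds=3600):
    """Periodically report available storage and detected inputs to the content
    management system, then download the refreshed dynamic priority list."""
    while True:
        # Determine the amount of available storage space on the user device.
        free_bytes = shutil.disk_usage("/").free

        # Send a system status report plus any detected gesture/input data.
        api.send_status(device_id, free_bytes=free_bytes)
        if gesture_log:
            api.send_inputs(device_id, gesture_log)
            gesture_log.clear()

        # Receive the updated dynamic priority list and download in rank order.
        priority_list = api.get_priority_list(device_id)
        for entry in priority_list:
            api.download_image(device_id, entry["image_id"])

        time.sleep(interval_seconds)
```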
  • At least two sets of images, selected using the first dynamic priority list, may be received on the user device.
  • the two sets of images may each be included within the listing of images used to create the first dynamic priority list.
  • the at least two sets of images may include a first set of images and a second set of images.
  • the first set of images may include a first number of images each having a first resolution.
  • the first set of images may be a first set of high-definition images.
  • the second set of images may include a second number of images each having a second resolution.
  • the second set of images may be a set of thumbnail resolution images.
  • both the first set of images and the second set of images may include the same number of images.
  • the first set of images may include ten (10) images
  • the second set of images may also include ten (10) images.
  • the first resolution may correspond to a higher resolution than the second resolution.
  • the first resolution may correspond to high definition images (e.g., 1080 pixels) whereas the second resolution may correspond to low definition images (e.g., 256 pixels).
  • FIG. 12 shows an exemplary system in accordance with various embodiments.
  • system 1200 of FIG. 12 may be substantially similar to system 100 of FIG. 1 , with the exception that the former may present elements of system 100 at a more granular level (e.g., modules, applications, etc.).
  • user devices 102 may be used to create, access, modify, and manage content items, such as content items 110 a and 110 b (collectively 110 ), stored locally within content item system 108 a and 108 b (collectively systems 108 ) on user device 102 and/or stored remotely on content management system 104 (e.g., within data store 118 ).
  • user device 102 a may access content items 110 b stored remotely with data store 118 of content management system 104 and may, or may not, store content item 110 b locally within content item system 108 a on user device 102 a .
  • user device 102 a may temporarily store content item 110 b within a cache locally on user device 102 a , make revisions to content item 110 b , and the revisions to content item 110 b may be communicated and stored in data store 118 of content management system 104 .
  • data store 118 may include one or more collections 132 of content items.
  • collections 132 may include one or more content items having similar properties (e.g., metadata) and/or including similar content.
  • user devices 102 may include camera 138 (e.g., 138 a and 138 b ) to capture and record digital images and/or videos.
  • User devices 102 may capture, record, and/or store content items, such as images, using camera 138 .
  • camera 138 may capture and record images and store metadata with the images.
  • Metadata may include, but is not limited to, the following: creation time timestamp, geolocation, orientation, rotation, title, and/or any other attributes or data relevant to the captured image.
  • Metadata values may be stored as attribute 112 name-value pairs, tag-value pairs, and/or any other method, or any combination thereof, to associate the metadata with the content item and easily identify the type of metadata.
  • attributes 112 may be tag-value pairs defined by a particular standard, including, but not limited to, Exchangeable Image File Format (“Exif”), JPEG File Interchange Format (Jfif), and/or any other standard.
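For instance, assuming the Pillow imaging library is available, Exif tag-value pairs of the kind described above could be read back into named attributes roughly as follows; this is only a sketch of reading standard Exif metadata, not the implementation of attributes 112 or camera 138.

```python
from PIL import Image, ExifTags  # Pillow

def read_image_attributes(path):
    """Return metadata stored with a captured image (e.g., creation timestamp,
    orientation) as name-value pairs translated from its Exif tags."""
    with Image.open(path) as img:
        exif = img.getexif()
    # Map numeric Exif tag ids to human-readable attribute names.
    return {ExifTags.TAGS.get(tag_id, str(tag_id)): value
            for tag_id, value in exif.items()}
```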
  • user devices 102 may include time normalization module 146
  • content management system 104 may include time normalization module 148 .
  • Time normalization module 146 (e.g., 146 a and 146 b ) may be used to normalize dates and times stored with a content item.
  • Time normalization module 146 , counterpart time normalization module 148 , and/or any combination thereof, may be used to normalize dates and times stored for content items.
  • the normalized times and dates may be used to sort, group, perform comparisons, perform basic math, and/or cluster content items.
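A minimal sketch of such time normalization, assuming the device's time zone is known, converts device-local Exif-style timestamps to UTC so they can be sorted, compared, and clustered across devices:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def normalize_timestamp(local_string, device_timezone, fmt="%Y:%m:%d %H:%M:%S"):
    """Normalize a device-local capture time (Exif-style string) to UTC."""
    local = datetime.strptime(local_string, fmt).replace(tzinfo=ZoneInfo(device_timezone))
    return local.astimezone(ZoneInfo("UTC"))

# Example: a capture time recorded in Pacific time normalizes to UTC,
# so it can be compared directly with timestamps from other devices.
utc_time = normalize_timestamp("2014:02:27 18:30:00", "America/Los_Angeles")
```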
  • user devices 102 may include organization module 136
  • content management system 104 may include organization module 140 .
  • Organization module 136 (e.g., 136 a and 136 b ) may be used to organize content items into clusters or collections of content items, organize content items to provide samplings of content items for display within user interfaces, and/or retrieve organized content items for presentation.
  • Organization module 136 may utilize any clustering algorithm.
  • Organization module 136 may be used to identify similar content items for clusters in order to organize content items for presentation within user interfaces on user devices 102 and content management system 104. Similarity rules may be defined to create one or more numeric representations embodying information on similarities between the content items.
  • Organization module 136 may use numeric representations as a reference for similarity between content items in order to cluster the content items.
  • content items may be organized into clusters to aid with retrieval of similar content items in response to search requests.
  • organization module 136 a may identify that two images are similar and may group the images together in a cluster.
  • Organization module 136 a may process content items to determine clusters independently and/or in conjunction with counterpart organization module (e.g., 140 and/or 136 b ).
  • organization module 136 a may only provide clusters identified with counterpart organization modules (e.g., 140 and/or 136 b ) for presentation.
  • processing of content items to determine clusters may be an iterative process that may be executed upon receipt of new content items and/or new similarity rules.
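One very simple similarity rule uses normalized capture times themselves as the numeric representation: items taken close together in time fall into the same cluster. The sketch below is a generic illustration of such time-gap clustering; the one-hour threshold is an arbitrary assumption.

```python
def cluster_by_time(content_items, max_gap_seconds=3600):
    """Group (item_id, utc_timestamp) pairs into clusters: an item joins the
    current cluster if it was captured within max_gap_seconds of the previous
    item, otherwise it starts a new cluster."""
    clusters = []
    for item_id, ts in sorted(content_items, key=lambda pair: pair[1]):
        if clusters and (ts - clusters[-1][-1][1]).total_seconds() <= max_gap_seconds:
            clusters[-1].append((item_id, ts))
        else:
            clusters.append([(item_id, ts)])
    return clusters
```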
  • user device 102 a may include classification module 150 a
  • user device 102 b may include classification module 150 b (collectively 150), which may be used independently, in combination with classification module 152 included on content management system 104, and/or any combination thereof to classify content items, rectify content items, and/or classify images.
  • the classification modules 150 and/or 152 may be used to determine if an image includes a document, and if so, determine the type of document stored therein.
  • Content item rectification may be performed to correct, perform further transformations, and/or crop content items to improve the display of the content items (e.g., correct the display of a document within an image).
  • user device 102 a may include search module 142 a
  • user device 102 b may include search module 142 b , which collectively may be referred to as search modules 142 .
  • Content management system 104 may also be provided with counterpart search module 144 .
  • Each of search modules 142 and 144 may be capable of supporting searches for content items located on both user devices 102 and/or content management system 104 .
  • a search request may be received by search module 142 and/or 144 that requests one or more content items.
  • the search may be handled by searching metadata and/or attributes assigned to content items during the provision of management services.
  • cluster markers stored with content items may be used to find content items by date.
  • cluster markers may indicate an approximate time, or average time, for the content items stored with the cluster marker, and the marker may be used to speed the search and/or return the search results with the contents of the cluster with particular cluster markers.
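A date search that exploits cluster markers might skip whole clusters whose marker time is far from the requested date, along the lines of this sketch; the cluster representation and one-day window are assumptions introduced for illustration.

```python
from datetime import timedelta

def search_by_date(clusters, target_date, window_days=1):
    """Use each cluster's marker (its approximate or average capture time) to
    prune the search, returning items whose cluster marker is near the date."""
    window = timedelta(days=window_days)
    results = []
    for marker_time, item_ids in clusters:
        if abs(marker_time - target_date) <= window:
            results.extend(item_ids)
    return results
```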
  • Content items 110 managed by content management system 104 may be stored locally within content item system 108 of respective user devices 102 and/or stored remotely within data store 118 of content management system 104 (e.g., content items 134 in data store 118 ). Content management system 104 may provide synchronization of content items managed thereon. Attributes 112 a and 112 b (collectively 112 ) or other metadata may also be stored with content items 110 . For example, a particular attribute may be stored with a content item to track content items locally stored on user devices 102 that are managed and/or synchronized by content management system 104 .
  • attributes 112 may be implemented using extended attributes, resource forks, or any other implementation that allows for storing metadata with a content item that is not interpreted by a content item system, such as content item system 108 .
  • attributes 112 a and 112 b may be content identifiers for content items.
  • the content identifier may be a unique or nearly unique identifier (e.g., number or string) that identifies the content item.
  • the content item may be tracked. For example, if a user moves the content item to another location within the content item system 108 hierarchy and/or modifies the content item, then the content item may still be identified within content item system 108 of user device 102. Any changes or modifications to the content item identified with the content identifier may be uploaded or provided for synchronization and/or version control services provided by content management system 104.
  • a stand-alone content management application 114 a and 114 b may be implemented on user devices 102 a and 102 b , respectively, to provide a user interface to a user for interacting with content management system 104 .
  • Content management application 114 may expose the functionality provided with content management interface module 154 and accessible modules for user device 102 .
  • Web browser 116 a and 116 b may be used to display a web page front end for a client application that may provide content management system 104 functionality exposed/provided with content management interface module 154.
  • Content management system 104 may allow a user with an authenticated account to store content, as well as perform management tasks, such as retrieve, modify, browse, synchronize, and/or share content with other accounts.
  • Various embodiments of content management system 104 may have elements including, but not limited to, content management interface module 154 , account management module 120 , synchronization module 122 , collections module 124 , sharing module 126 , file system abstraction 128 , data store 118 , and organization module 140 .
  • Content management interface module 154 may expose the server-side or back end functionality/capabilities of content management system 104 .
  • a counter-part user interface (e.g., stand-alone application, client application, etc.) on user devices 102 may be implemented using content management interface module 154 to allow a user to perform functions offered by modules of content management system 104.
  • the user interface displayed on user device 102 may be used to create an account for a user and/or authenticate the user to use the account using account management module 120 .
  • Account management module 120 may provide the functionality for authenticating use of an account by a user and/or user device 102 with username/password, device identifiers, and/or any other authentication method.
  • Account information 130 may be maintained in data store 118 for accounts.
  • Account information may include, but is not limited to, personal information (e.g., an email address or username), account management information (e.g., account type, such as “free” or “paid”), usage information, (e.g., content item edit history), maximum storage space authorized, storage space used, content storage locations, security settings, personal configuration settings, content sharing data, etc.
  • An amount of storage space on content management system 104 may be reserved, allotted, allocated, stored, and/or may be accessed with an authenticated account.
  • the account may be used to access content items 134 and/or content items 110 within data store 118 for the account, and/or content items 134 and/or content items 110 made accessible to the account that are shared from another account.
  • account management module 120 may interact with any number of other modules of content management system 104 .
  • An account on content management system 104 may, in some embodiments, be used to store content such as documents, text items, audio items, video items, etc., from one or more user devices 102 authorized by the account.
  • the content may also include collections of various types of content with different behaviors, or other mechanisms of grouping content items together.
  • an account may include a public collection that may be accessible to any user.
  • the public collection may be assigned a web-accessible address. A link to the web-accessible address may be used to access the contents of the public folder.
  • an account may include a photos collection that may store photos and/or videos, and may provide specific attributes and actions tailored for photos and/or videos.
  • the account may also include an audio collection that provides the ability to play back audio items and perform other audio related actions.
  • the account may still further include a special purpose collection.
  • An account may also include shared collections or group collections that may be linked with and available to multiple user accounts. In some embodiments, access to a shared collection may differ for different users that may be capable of accessing the shared collection.
  • Content items 110 and/or content items 134 may be stored in data store 118 .
  • Data store 118 may, in some embodiments, be a storage device, multiple storage devices, or a server. Alternatively, data store 118 may be a cloud storage provider or network storage accessible via one or more communications networks.
  • Content management system 104 may hide the complexity and details from user devices 102 by using content item system abstraction 128 (e.g., a content item system database abstraction layer) so that user devices 102 do not need to know exactly where the content items are being stored by content management system 104 .
  • Embodiments may store the content items in the same collections hierarchy as they appear on user device 102 .
  • content management system 104 may store the content items in various orders, arrangements, and/or hierarchies.
  • Content management system 104 may store the content items in network accessible storage, such as a storage area network (SAN) device, in a redundant array of inexpensive disks (RAID), etc.
  • Content management system 104 may store content items using one or more partition types, such as FAT, FAT32, NTFS, EXT2, EXT3, EXT4, ReiserFS, BTRFS, and so forth.
  • Data store 118 may also store metadata describing content items, content item types, and the relationship of content items to various accounts, folders, collections, or groups.
  • the metadata for a content item may be stored as part of the content item and/or may be stored separately.
  • Metadata may be stored in an object-oriented database, a relational database, a content item system, or any other collection of data.
  • each content item stored in data store 118 may be assigned a system-wide unique identifier.
  • Data store 118 may, in some embodiments, decrease the amount of storage space required by identifying duplicate content items or duplicate chunks of content items. Instead of storing multiple copies, data store 118 may store a single copy of content item 134 and then use a pointer or other mechanism to link the duplicates to the single copy. Similarly, data store 118 may store content items 134 more efficiently, as well as provide the ability to undo operations, by using a content item version control that tracks changes to content items, different versions of content items (including diverging version trees), and a change history. The change history may include a set of changes that, when applied to the original content item version, produce the changed content item version.
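A content-addressed chunk store is one common way to realize the single-copy-plus-pointer behavior described above. The sketch below is a generic illustration under that assumption, not the actual storage layer of the described system.

```python
import hashlib

class DeduplicatingStore:
    """Store content items as content-addressed chunks: identical chunks are
    kept once, and each item is an ordered list of pointers (chunk hashes)."""

    def __init__(self, chunk_size=4 * 1024 * 1024):
        self.chunk_size = chunk_size
        self.chunks = {}   # hash -> bytes (single stored copy per unique chunk)
        self.items = {}    # content_id -> ordered list of chunk hashes

    def put(self, content_id, data: bytes):
        pointers = []
        for start in range(0, len(data), self.chunk_size):
            chunk = data[start:start + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)   # duplicates share one copy
            pointers.append(digest)
        self.items[content_id] = pointers

    def get(self, content_id) -> bytes:
        return b"".join(self.chunks[d] for d in self.items[content_id])
```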
  • Content management system 104 may be configured to support automatic synchronization of content from one or more user devices 102 .
  • the synchronization may be platform independent. That is, the content may be synchronized across multiple user devices 102 of varying type, capabilities, operating systems, etc.
  • user device 102 a may include client software, which synchronizes, via synchronization module 122 at content management system 104 , content in content item system 108 of user devices 102 with the content in an associated user account.
  • the client software may synchronize any changes to content in a designated collection and its sub-collection, such as new, deleted, modified, copied, or moved content items or folders.
  • a user may manipulate content directly in a local folder, while a background process monitors the local content item for changes and synchronizes those changes to content management system 104 .
  • a background process may identify content that has been updated at content management system 104 and synchronize those changes to the local collection.
  • the client software may provide notifications of synchronization operations, and may provide indications of content statuses directly within the content management application.
  • user device 102 may not have a network connection available. In this scenario, the client software may monitor the linked collection for content item changes and queue those changes for later synchronization to content management system 104 when a network connection is available. Similarly, a user may manually stop or pause synchronization with content management system 104 .
  • a user may also view or manipulate content via a web interface generated and served by content management interface module 154 .
  • the user may navigate in a web browser to a web address provided by content management system 104 .
  • Changes or updates to content in data store 118 made through the web interface, such as uploading a new version of a content item, may be propagated back to other user devices 102 associated with the user's account.
  • multiple user devices 102 each with their own client software, may be associated with a single account, and content items in the account may be synchronized between each of user devices 102 .
  • Content management system 104 may include sharing module 126 for managing sharing content and/or collections of content publicly or privately.
  • Sharing module 126 may manage sharing independently or in conjunction with counterpart sharing module 152 a , located on user device 102 a , and sharing module 152 b located on user device 102 b (collectively sharing modules 152 ).
  • Sharing content publicly may include making the content item and/or the collection accessible from any device in network communication with content management system 104 .
  • Sharing content privately may include linking a content item and/or a collection in data store 118 with two or more user accounts so that each user account has access to the content item.
  • the sharing may be performed in a platform independent manner. That is, the content may be shared across multiple user devices 102 of varying type, capabilities, operating systems, etc.
  • one or more share links may be provided to a user, or a contact of a user, to access a shared content item.
  • the content may also be shared across varying types of user accounts.
  • the sharing module 126 may be used with collections module 124 to allow sharing of a virtual collection with another user or user account.
  • a virtual collection may be a collection of content identifiers that may be stored in various locations within content item systems 108 of user device 102 and/or stored remotely at content management system 104 .
  • the virtual collection for an account with a content management system may correspond to a collection of one or more identifiers for content items (e.g., identifying content items in storage).
  • the virtual collection is created with collections module 124 by selecting from existing content items stored and/or managed by the content management system and associating the existing content items within data storage (e.g., associating storage locations, content identifiers, or addresses of stored content items) with the virtual collection.
  • a content item may be designated as part of the virtual collection without having to store (e.g., copy and paste the content item to a directory) the content item in another location within data storage in order to place the content item in the collection.
  • content management system 104 may be configured to maintain a content directory or a database table/entity for content items where each entry or row identifies the location of each content item in data store 118 .
  • a unique or a nearly unique content identifier may be stored for each content item stored in data store 118 .
  • Metadata may be stored for each content item.
  • metadata may include a content path that may be used to identify the content item.
  • the content path may include the name of the content item and a content item hierarchy associated with the content item (e.g., the path for storage locally within a user device 102 ).
  • Content management system 104 may use the content path to present the content items in the appropriate content item hierarchy in a user interface with a traditional hierarchy view.
  • a content pointer that identifies the location of the content item in data store 118 may also be stored with the content identifier.
  • the content pointer may include the exact storage address of the content item in memory.
  • the content pointer may point to multiple locations, each of which contains a portion of the content item.
  • a content item entry/database table row in a content item database entity may also include a user account identifier that identifies the user account that has access to the content item.
  • multiple user account identifiers may be associated with a single content entry, indicating that the content item has shared access by the multiple user accounts.
  • sharing module 126 may be configured to add a user account identifier to the content entry or database table row associated with the content item, thus granting the added user account access to the content item. Sharing module 126 may also be configured to remove user account identifiers from a content entry or database table rows to restrict a user account's access to the content item. The sharing module 126 may also be used to add and remove user account identifiers to a database table for virtual collections.
  • sharing module 126 may be configured to generate a custom network address, such as a uniform resource locator (“URL”), which allows any web browser to access the content in content management system 104 without any authentication.
  • sharing module 126 may be configured to include content identification data in the generated URL, which may later be used to properly identify and return the requested content item.
  • sharing module 126 may be configured to include the user account identifier and the content path in the generated URL.
  • the content identification data included in the URL may be sent to content management system 104 which may use the received content identification data to identify the appropriate content entry and return the content item associated with the content entry.
  • sharing module 126 may be configured to generate a custom network address, such as a uniform resource locator (URL), which allows any web browser to access the content in content management system 104 without any authentication.
  • sharing module 126 may be configured to include collection identification data in the generated URL, which may later be used to properly identify and return the requested content item.
  • sharing module 126 may be configured to include the user account identifier and the collection identifier in the generated URL.
  • the content identification data included in the URL may be sent to content management system 104 which may use the received content identification data to identify the appropriate content entry or database row and return the content item associated with the content entry or database row.
  • sharing module 126 may also be configured to record that a URL to the content item has been created.
  • the content entry associated with a content item may include a URL flag indicating whether a URL to the content item has been created.
  • the URL flag may be a Boolean value initially set to 0 or “false” to indicate that a URL to the content item has not been created. Sharing module 126 may be configured to change the value of the flag to 1 or “true” after generating a URL to the content item.
  • sharing module 126 may also be configured to deactivate a generated URL.
  • each content entry may also include a URL active flag indicating whether the content should be returned in response to a request from the generated URL.
  • sharing module 126 may be configured to only return a content item requested by a generated link if the URL active flag is set to 1 or true. Changing the value of the URL active flag or Boolean value may easily restrict access to a content item or a collection for which a URL has been generated. This may allow a user to restrict access to the shared content item without having to move the content item or delete the generated URL.
  • sharing module 126 may reactivate the URL by again changing the value of the URL active flag to 1 or true. A user may thus easily restore access to the content item without the need to generate a new URL.
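Taken together, the URL-generation and flag behavior described in the preceding bullets can be sketched as follows. The class name, token format, and base address are illustrative assumptions rather than details of sharing module 126.

```python
import secrets

class SharingSketch:
    """Track generated share URLs per content entry with a 'URL created' flag
    and a 'URL active' flag, so access can be revoked and restored without
    moving or deleting the content item."""

    def __init__(self, base="https://cms.example.com/s/"):
        self.base = base
        self.entries = {}   # content_id -> {"token", "url_flag", "url_active"}

    def generate_url(self, content_id):
        token = secrets.token_urlsafe(16)
        self.entries[content_id] = {"token": token, "url_flag": True, "url_active": True}
        return self.base + token

    def deactivate(self, content_id):
        self.entries[content_id]["url_active"] = False   # requests now refused

    def reactivate(self, content_id):
        self.entries[content_id]["url_active"] = True    # same URL works again

    def resolve(self, token):
        """Return the content id for an active share token, else None."""
        for content_id, entry in self.entries.items():
            if entry["token"] == token and entry["url_active"]:
                return content_id
        return None
```

Deactivating and later reactivating a link flips only the URL active flag, so the same generated URL resumes working without moving the content item or creating a new URL.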
  • any suitable programming language may be used to implement the routines of particular embodiments including C, C++, Java, JavaScript, Python, Ruby, CoffeeScript, assembly language, etc.
  • Different programming techniques may be employed such as procedural or object oriented.
  • the routines may execute on a single processing device or multiple processors.
  • Particular embodiments may be implemented in a computer-readable storage device or non-transitory computer readable medium for use by or in connection with the instruction execution system, apparatus, system, or device.
  • Particular embodiments may be implemented in the form of control logic in software or hardware or a combination of both.
  • the control logic when executed by one or more processors, may be operable to perform that which is described in particular embodiments.
  • Particular embodiments may be implemented by using a programmed general purpose digital computer, application specific integrated circuits, programmable logic devices, field programmable gate arrays, or optical, chemical, biological, quantum or nanoengineered systems; other components and mechanisms may also be used.
  • the functions of particular embodiments may be achieved by any means as is known in the art.
  • Distributed, networked systems, components, and/or circuits may be used.
  • Communication, or transfer, of data may be wired, wireless, or by any other means.

Abstract

Systems, methods, and non-transitory computer readable media for dynamically selecting and sending content items to a user device based on various network and device conditions are provided. In some embodiments, a listing of images stored within a user account on a content management system may be generated, and a selection probability value may be assigned to each image within the listing. An available capacity level on the user device may be determined, and images may be dynamically selected to be sent to the user device based on the determined available capacity level and each image's selection probability value. In some embodiments, each image's selection probability value may be based on one or more factors. For example, images that have been recently viewed, recently added to a collection of images, and/or shared within a shared virtual space created by the user, may receive higher selection probability values than other images. In some embodiments, the selected images may be ranked in order of priority for transmission to the user device and sent to the user device in order of rank.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 14/247,468, filed on Apr. 8, 2014, which claims the benefit of and priority to provisional application No. 61/945,808 filed Feb. 27, 2014. Each of the aforementioned applications is hereby incorporated by reference in its entirety.
  • FIELD OF THE INVENTION
  • Various embodiments generally relate to selecting content items to store and present locally on a user device.
  • BACKGROUND
  • Social media networks and content management systems that run resident applications on user devices are increasingly becoming programs that individuals interface with daily and habitually. Users of these applications may interface with the plethora of content items, such as photographs, that may be stored thereon. However, the typical social media application, for instance, has numerous drawbacks and limitations that the user typically experiences, but nonetheless may be obligated to deal with.
  • While the number of content items capable of being stored on social media networks and/or content management systems may be large, the amount of content items that may be viewed upon request may be limited. These limitations may be based on the network conditions between the user's user device and the server hosting the social media network and/or content management system, the local storage space on the user device and/or user account, and/or various other network or device conditions. For example, a user may access their user account on a content management system, and may only be able to view certain images that may have successfully been sent to their device while other images may be “grayed-out” or blank. This may be because the ordering in which images are sent may be random or substantially random. The server, typically, does not account for user preferences and/or device conditions when choosing or selecting which images to send. This may be a tremendous disadvantage felt by almost all individuals at some point because they may be trying to view or find a specific image or content item within their own or another individual's account, but be forced to wait to view the image or content item until it loads on their device.
  • Thus, it would be beneficial for there to be systems and methods that allow images and/or content items to be dynamically selected and sent to a user's user device based on various network and device conditions.
  • SUMMARY
  • Systems, methods, and non-transitory computer readable media for dynamically selecting and sending images to a user device based on various network and device conditions may be provided. Such systems may include one or more processors, a display screen, communications circuitry, and memory including instructions.
  • Such methods may include generating a listing of images stored within a user account on a content management system. For example, all images and/or videos that a user may have stored within their user account may be listed. A selection probability value may be assigned to each image from the listing based on one or more factors. The selection probability value may correspond to a likelihood of use by the user of one or more of the images from the listing. For example, images that have been recently viewed by the user may be assigned a higher selection probability value than images that have not been viewed in a long period of time (e.g., over one year). In some embodiments, an available capacity level of a user device may be determined. Images may be dynamically selected from the listing to be sent to the user device based on each image's assigned selection probability value as well as the determined available capacity level. The dynamically selected images may then be ranked in order of priority for sending, and sent to the user device based on the order until the available capacity level has been met.
  • In some embodiments, the one or more factors for which the selection probability values are assigned may correspond to any images that have been recently viewed or added to a same collection as any images from the listing, any images added to the user account, a shared virtual space created by the user or to which the user belongs, and/or a notification sent to the user. For example, images recently added to a shared virtual space, or room, of which the user may be a member may receive a higher selection probability value than an image which was not added to the shared virtual space. In some embodiments, the shared virtual space may be any communications space created by the user and/or visited by the user, where one or more images may be shared between at least two users.
  • In some embodiments, the aforementioned generating, assigning, determining, selecting, ranking, and sending may be repeated periodically at defined time intervals. For example, the steps may be repeated every few minutes, hours, days, upon launch of an application on the user device, and/or manually by the user. In some embodiments, the steps may be repeated in response to an action occurring. For example, the steps may be repeated in response to one or more images being added to the user account, a selection by the user to view one or more images within the user account, a selection to add one or more images to the user account, and/or an addition of one or more images to a shared virtual space.
  • Such methods may also include calculating a number of images stored within a user account on a content management system. For example, a user may have one thousand (1,000) images stored within their user account. For each image, a frequency of use may be determined. For example, the amount of times an image has been shared may correspond to the image's frequency of use. A dynamic priority list for sending images to a user device from the content management system may be created. In some embodiments, the dynamic priority list may rank each image by an image's assigned selection probability value. In some embodiments, the dynamic priority list may queue the content management system to send images to the user device in order of each image's rank.
  • In some embodiments, the frequency of use may correspond to an amount of times an image has been shared or requested to be shared, the amount of times an image has been viewed, and/or a number of contacts of the user account having a particular image stored within their own user account. For example, if multiple contacts of the user all have the same photograph stored within their user accounts, that image may receive a higher frequency of use than an image that may only be stored within one contact's user account. In some embodiments, an amount of available storage space on the user device may be calculated. A first amount of images, each having a first resolution, may be sent to the user device based on the calculated amount of available storage space and the images' rank within the dynamic priority list. In some embodiments, a level of connectivity between the user device and the content management system may be determined along with calculating the available storage space.
  • Such methods may still further include determining an amount of available storage space on a user device and sending data to a content management system indicating the determined amount of available storage space. In some embodiments, a first dynamic priority list may be received from the content management system and may include a first plurality of images ranked in a first order based on a first probability value assigned to each image from the first plurality of images. In some embodiments, one or more inputs may be detected that correspond to one or more images from the first plurality of images, and data corresponding to the detected one or more inputs may be sent from the user device to the content management system. In response to sending the data, a second dynamic priority list may be downloaded from the content management system to the user device. The second dynamic priority list may include a second plurality of images ranked in a second order based on a second probability value assigned to each image from the second plurality of images.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects and advantages of the invention will become more apparent upon consideration of the following detailed description, taken in conjunction with accompanying drawings, in which like referenced characters refer to like parts throughout, and in which:
  • FIG. 1 shows an exemplary system in accordance with various embodiments;
  • FIG. 2 shows an illustrative diagram of a listing of images in accordance with various embodiments;
  • FIG. 3 shows another illustrative diagram of a listing of images in accordance with various embodiments;
  • FIG. 4 shows an illustrative diagram of a user interface in accordance with various embodiments;
  • FIG. 5 shows another illustrative diagram of a user interface in accordance with various embodiments;
  • FIG. 6A shows an illustrative diagram of a user interface displaying a user's storage settings in accordance with various embodiments;
  • FIG. 6B shows another illustrative diagram of a user interface displaying a user's storage settings in accordance with various embodiments;
  • FIG. 7 shows another illustrative diagram of a listing of images in accordance with various embodiments;
  • FIG. 8 shows an illustrative graph of a frequency of use for each image within a listing of images in accordance with various embodiments;
  • FIG. 9 shows an illustrative flowchart of a process for sending dynamically selected images to a user device in accordance with various embodiments;
  • FIG. 10 shows an illustrative flowchart of a process for creating a dynamic priority list in accordance with various embodiments;
  • FIG. 11 shows an illustrative flowchart of a process for obtaining images from a content management system in accordance with various embodiments; and
  • FIG. 12 shows another exemplary system in accordance with various embodiments.
  • DETAILED DESCRIPTION OF THE DISCLOSURE
  • Methods, systems, and computer readable media for dynamically selecting and sending images to a user device based on various network and device conditions are provided. By dynamically selecting and sending images to the user device, the user may be able to view, interact, share, and/or store images that are most relevant, important, and/or related to the user and/or the user's current activities.
  • For purposes of description and simplicity, methods, systems and computer readable media will be described for dynamically selecting and sending images to a user device based on various network and device conditions.
  • It is noted that the terms “device” and “content management system” are used herein to refer broadly to a wide variety of storage providers and data management service providers, electronic devices and user devices. It is also noted that the term “content item” is used herein to refer broadly to a wide variety of digital data, documents, text content items, audio content items, video content items, portions of content items, and/or other types of data. Content items may also include files, folders or other mechanisms of grouping content items together with different behaviors, such as collections of content items, playlists, albums, etc. The term “user” is also used herein broadly, and may correspond to a single user, multiple users, authorized accounts, an application or program operating automatically on behalf of, or at the behest of a person, or any other user type, or any combination thereof. The terms “gesture” and “gestures” are also used herein broadly, and may correspond to one or more motions, movements, hoverings, inferences, signs, or any other such physical interactions with one or more sensors, or any combination thereof, including vocal commands or interpretations of eye movements based on retinal tracking. The term “continuous real-time image” is also used herein broadly, and may correspond to live images captured via one or more image capturing components, continuous images captured, recorded images, or any other type of image that may be captured via an image capturing component, or any combination thereof.
  • The present invention may take form in various components and arrangements of components, and in various techniques, methods, or procedures and arrangements of steps. The referenced drawings are only for the purpose of illustrating embodiments, and are not to be construed as limiting the present invention. Various inventive features are described below that can each be used independently of one another or in combination with other features.
  • FIG. 1 shows an exemplary system in accordance with various embodiments. System 100 may include user devices 102 a and 102 b, which may communicate with content management system 104 across network 106. Persons of ordinary skill in the art will recognize that although only two user devices are shown within system 100, any number of user devices may interact with content management system 104 and/or network 106, and the aforementioned illustration is merely exemplary.
  • Network 106 may support any number of protocols, including, but not limited to, Transfer Control Protocol and Internet Protocol (“TCP/IP”), Hypertext Transfer Protocol (“HTTP”), and/or wireless application protocol (“WAP”). For example, user device 102 a and user device 102 b (collectively 102) may communicate with content management system 104 using TCP/IP, and, at a higher level, use a web browser to communicate with a web server at content management system 104 using HTTP.
  • A variety of user devices 102 may communicate with content management system 104, including, but not limited to, desktop computers, mobile computers, mobile communication devices (e.g., mobile phones, smart phones, tablets), televisions, set-top boxes, and/or any other network enabled device. Various types of user devices may include, but are not limited to, smart phones, mobile phones, tablet computers, personal digital assistants (PDAs), laptop computers, digital music players, and/or any other type of user device capable of including a touch-sensing display interface. Various touch-sensing display interfaces may include, but are not limited to, liquid crystal displays (LCD), monochrome displays, color graphics adapter (CGA) displays, enhanced graphics adapter (EGA) displays, variable-graphics array (VGA) displays, or any other display, or any combination thereof. In some embodiments, the touch-sensing display interface may include a multi-touch panel coupled to one or more processors to receive and detect gestures. Multi-touch panels, for example, may include capacitive sensing mediums having one or more row traces and/or driving line traces, and one or more column traces and/or sensing lines. Although multi-touch panels are described herein as one example of a touch-sensing display interface, persons of ordinary skill in the art will recognize that any touch-sensing display interface may be used. Furthermore, various types of user devices may, in some embodiments, include one or more image capturing components. For example, user devices 102 may include a front-facing camera and/or a rear-facing camera.
  • Content management system 104 may allow a user with an authenticated account to store content, as well as perform management tasks, such as retrieve, modify, browse, synchronize, and/or share content with other accounts. In some embodiments, a counter-part user interface (e.g., stand-alone application, client application, etc.) on user devices 102 may be implemented using a content management interface module to allow a user to perform functions offered by modules of content management system 104. A more detailed description of system 100 is presented below, with reference to FIG. 12.
  • FIG. 2 shows an illustrative diagram of a listing of images in accordance with various embodiments. In some embodiments, listing 200 may correspond to a listing of images stored within a user account on a content management system (e.g., content management system 104 of FIG. 1). Listing 200 may also correspond to a listing of images stored on a user device (e.g., client device 102), within a user's account on a social media network, or by a contact of the user.
  • Listing 200 may display images 202, 204, 206, 208, and 210. Although only five (5) images are included within the presented listing, persons of ordinary skill in the art will recognize that any number of images may be included within the listing, and the use of five images is merely exemplary. In some embodiments, each image may include a title or identification tag. For example, image 202 may have title “IMAGE #1,” image 204 may have title “IMAGE #2,” image 206 may have title “IMAGE #3,” image 208 may have title “IMAGE #4,” and image 210 may have title “IMAGE #5.” In some embodiments, the use of a title or an identification tag may depend on where listing 200 may be presented. For example, if listing 200 is displayed within a user interface, each image may include its associated title. This may aid the user in recognizing and/or viewing certain images within the listing. In some embodiments, listing 200 may correspond to a listing which may not be displayed to a user and/or may not be presented within a user interface. In this scenario, each image may have an identification tag presented or linked to that image instead of, or in addition to, the title. For example, the identification tag may be any 8, 16, or 64 bit alphanumeric string that may describe various features of the respective image associated therewith.
  • In some embodiments, each image from within listing 200 may have a selection probability value associated with that image. The selection probability value may indicate a particular image's likelihood of being viewed, selected, shared, requested to be shared, interacted with, or having any other action taken upon it, or any combination thereof. For example, images that a user has recently viewed may have a higher selection probability value than images that have not been viewed in a long time (e.g., multiple years). Persons of ordinary skill in the art will recognize that any action or combination of actions performed on an image, performed with an image, taken upon an image, and/or associated with an image may be used in any suitable manner to determine an image's selection probability value. Furthermore, in some embodiments, each image's selection probability value may be dynamic. For example, an image that has not been shared in over a year may have a lower selection probability value than an image that was viewed less than one week earlier. In response to the user viewing the image last viewed a year ago, that image may now be assigned a higher selection probability value than the image viewed earlier that week.
  • In some embodiments, each image may have its respective selection probability value stored and/or presented in selection probability value column 220. For example, image 202 may have a selection probability value of 90%, whereas images 204, 206, 208, and 210 may have selection probability values of 60%, 70%, 20%, and 5%, respectively. Persons of ordinary skill in the art will recognize that any image may be assigned any selection probability value, and the aforementioned selection probability values are merely exemplary. In some embodiments, column 220 may store and/or present the selection probability values in terms of percentage, rank, frequency, collection, or using any other value, or any other suitable recognition means capable of signifying an image's probability of being selected.
  • Depending on a particular image's selection probability value, a different ordering or rank may also be assigned to that image, and may be stored and/or presented in column 230. In some embodiments, an image's rank may correspond to an order that the image may be sent, downloaded, and/or uploaded. For example, image 202 may have a selection probability value of 90% and therefore may be assigned rank “A”. An image having a rank of “A” may be sent, uploaded, and/or downloaded prior to other images because of its higher selection probability value. As another example, image 210 may have a selection probability value of 5% and may be assigned rank “E”. An image having a rank of “E” may be sent, downloaded, and/or uploaded later than images having a rank of A, B, C, or D because those images may have a higher selection probability value. In some embodiments, the rank of an image may correspond to a collection of images that may be sent, downloaded, and/or uploaded. For example, images 202 and 206, which respectively have selection probability values of 90% and 70%, may, in some embodiments, be sent in a same collection and therefore may both have a rank of “A”. Thus, the rankings presented here are merely illustrative, and persons of ordinary skill in the art will recognize that any ranking order may be used, and each rank may include any number of images.
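  • As a rough, hedged illustration of how selection probability values might map to transmission ranks as described above, the following Python sketch sorts a listing by probability and assigns letter ranks in descending order. The field names and the one-rank-per-image scheme are assumptions for illustration, not the claimed implementation (which may, for example, group several images under one rank).

```python
# Hypothetical sketch: assign transmission ranks from selection probability values.
from string import ascii_uppercase

listing = [
    {"title": "IMAGE #1", "probability": 0.90},
    {"title": "IMAGE #2", "probability": 0.60},
    {"title": "IMAGE #3", "probability": 0.70},
    {"title": "IMAGE #4", "probability": 0.20},
    {"title": "IMAGE #5", "probability": 0.05},
]

# Sort images so the image most likely to be selected is transmitted first.
ordered = sorted(listing, key=lambda img: img["probability"], reverse=True)
for rank, image in zip(ascii_uppercase, ordered):
    image["rank"] = rank

for image in ordered:
    print(image["title"], image["rank"])   # IMAGE #1 A, IMAGE #3 B, IMAGE #2 C, ...
```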
  • FIG. 3 shows another illustrative diagram of a listing of images in accordance with various embodiments. In some embodiments, listing 300 may be substantially similar to listing 200 of FIG. 2, and the previous description of the latter may apply to the former. In some embodiments, listing 300 may include additional images, such as images 312 and 314. In some embodiments, images 302, 304, 306, 308, and 310 may be substantially similar to images 202, 204, 206, 208, and 210 of FIG. 2, and the previous description of the latter may also apply.
  • In some embodiments, listing 300 may correspond to a new version of listing 200 in response to one or more user actions being performed. For example, a user may add, or attempt to add, images 312 and 314 to a user account on a content management system. In response to adding the images, each image's assigned probability value and/or rank may change. Thus, listing 300 may be dynamic, and each image's selection probability value and/or rank may dynamically change depending on whether or not an action has occurred, and/or what type of action has occurred. For example, image 202 may not have been viewed in a week, and thus, in response to adding images 312 and 314, image 202 may have its selection probability value decrease slightly due to the lack of recent viewing.
  • As previously mentioned, each image from listing 300 may have a selection probability value and a rank assigned to it. For example, images 302, 304, 306, 308, 310, 312, and 314 may respectively have selection probability values of 80%, 50%, 85%, 5%, 20%, 10%, and 90%, which may be listed in column 320. Based on these values, column 330 may list each image's corresponding rank. For example, image 302 may have a selection probability value of 80%, which may correspond to rank “C”. In some embodiments, image 302 may be substantially similar to image 202 of FIG. 2, with the exception that image 302 may now have a lower selection probability value (e.g., 80% from 90%) and a lower ranking order (e.g., rank “C” from rank “A”). The changed probability value and rank may be due to an action being performed on one or more images within listing 300. For example, in response to the addition of images 312 and 314, image 302, and/or any other image within listing 300, may be assigned a different selection probability value and/or rank.
  • As an illustrative example, in response to images 312 and 314 being added to listing 300, each of images 302, 304, 306, 308, and 310 may have its selection probability value change. Initially, images 302, 304, 306, 308, and 310 may have been assigned selection probability values similar to those of images 202, 204, 206, 208, and 210. However, in response to the addition of images 312 and 314, their values may be modified. This may reflect the fact that, during the time period in which images 312 and 314 were added, the probability of a user viewing, interacting with, and/or performing any other action on the images may have changed, and the new selection probability values may capture this change.
  • In some embodiments, in response to the changes to each image's selection probability value from listing 300, their ranking order may also change. For example, image 302 may now have a selection probability value of 80%, which may correspond to rank “C”. This may be due to the fact that images 306 and 314 each have a higher selection probability value than image 302 (e.g., 85% and 90%, respectively). Images 306 and 314 may then have ranks of “B” and “A”, respectively, as they have higher selection probability values and therefore may be sent prior to image 302. Persons of ordinary skill in the art will recognize that any selection probability value and any ranking order may be used and/or assigned to any image included within listing 300, and the aforementioned values are merely exemplary.
  • FIG. 4 shows an illustrative diagram of a user interface in accordance with various embodiments. User interface 400 may be any user interface displayed within a display screen on a user device (e.g., device 102). In some embodiments, user interface 400 may display image 402 in response to an attempt to share the image with one or more contacts, one or more social media networks, and/or a content management system. In some embodiments, user interface 400 may correspond to a shared virtual space that the user belongs to. For example, the user may belong to an image sharing stream or room with one or more contacts, and may attempt to share image 402 within that shared space.
  • User interface 400 may, in some embodiments, include user icon 404. User icon 404 may display a picture, icon, moving image, static image, or any other representation of a user, which may be used as a form of visual identification for a user within user interface 400. For example, user interface 400 may correspond to a shared virtual space with a large number of users, and user icon 404 may help in letting other users recognize which user may be adding, or attempting to add, an image, such as image 402 to the shared virtual space. In some embodiments, the user icon may be assigned to the user by other users. For example, user icon 404 may correspond to an image stored on a user device corresponding to a particular contact. In some embodiments, the user icon may be self-assigned (e.g., the contact may decide their icon), and/or the user icon may be randomly or selectively assigned by a content management system. In some embodiments, no user icon may be present, or a static image may be displayed until a user icon has been selected/assigned.
  • In some embodiments, user interface 400 may include action 404 a, time of action 404 b, and/or comment 404 c. Action 404 a may describe to anyone viewing user interface 400 what the intended action of the user corresponding to user icon 404 may be. For example, action 404 a may display “Kayleigh shared a photo,” indicating that the user may have shared image 402 within user interface 400. Time of action 404 b may, in some embodiments, correspond to a general time when action 404 a occurred. For example, time of action 404 b may indicate that action 404 a occurred at a specific time (e.g., hour, minute), a specific date (e.g., day/month/year), and/or a general time period (e.g., yesterday, last week, a month ago, etc.). Comment 404 c may, in some embodiments, correspond to a comment that the user may have provided along with the image. For example, comment 404 c may be created by the user and may be linked to image 402. Anytime a user views image 402 within the shared virtual space, that user may be presented with comment 404 c as well. However, persons of ordinary skill in the art will recognize that any one of action 404 a, time of action 404 b, and/or comment 404 c may be removed or omitted from user interface 400, and/or any other feature or icon may be included, and the aforementioned is merely exemplary.
  • In some embodiments, user interface 400 may be presented on a user device in response to a contact sharing image 402 with the user. For example, a contact corresponding to user icon 404 may share image 402 with the user of device 102. In some embodiments, the user may be able to download image 402 to their user device by selecting button 406. Button 406 may be any button operable to either save the shared image and/or re-share image 402 with another contact, another social media network, or a content management system. For example, a user may receive image 402 from a contact and then re-share image 402.
  • In some embodiments, user interface 400 may correspond to a shared virtual space that the user belongs to. For example, the user may be in a shared virtual space with one or more contacts who may share an image, such as image 402, with the user. In some embodiments, the user may be able to add one or more photos to the shared virtual space by selecting “Add Photos” button 408. In response to selecting button 408, the user may be presented with a listing of images stored on their user device and/or within their user account on the content management system to share within the shared virtual space. For example, in response to receiving image 402, the user may select button 408 and reply by sending another image (e.g., an image of another dog) to that contact who shared image 402.
  • In some embodiments, user interface 400 may also include “Send Message” button 410. Button 410 may allow the user to send a textual message, an instant message, an email, and/or a chat message to any contact already within a shared virtual space with the user. For example, in response to receiving image 402, the user may select button 410 and send a message back to the contact who sent image 402, commenting on the shared image. In some embodiments, the message may be displayed with or substantially near comment 404 c, thus creating a virtual chat or textual stream regarding image 402.
  • FIG. 5 shows another illustrative diagram of a user interface in accordance with various embodiments. User interface 500 may include multiple images shared within a shared virtual space. Images 502 a, 502 b, and 502 c may, for example, be displayed within user interface 500. In some embodiments, one or more of images 502 a, 502 b, and 502 c may be moveable about the user interface. For example, a user may provide one or more gestures to adjust a location of an image. In some embodiments, images displayed within user interface 500 may be manipulated, modified, and/or resized. For example, a user may choose to view each image displayed within user interface 500, or the user may decide to zoom in and display only a single image.
  • In some embodiments, user interface 500 may be capable of displaying more images and/or different images than images 502 a-c. For example, the user may share thirteen (13) images within a shared virtual space. In some embodiments, only images 502 a-c may initially be displayed; however, the user may be presented with one or more additional images by pressing “See All Photos” button 508. Selecting button 508 may present the user with any number of images which may have previously been shared within the shared virtual space. In some embodiments, pressing button 508 may provide the user with one or more images stored within their user account and/or resident on their local device. For example, a user may select button 508 and be presented with their user device's camera roll. The user may then be able to select one or more images from within their camera roll to share with one or more contacts, one or more social media networks, one or more shared virtual spaces, and/or a content management system.
  • In some embodiments, user interface 500 may include contact name 504 a and contact icon 504 b. For example, user interface 500 may display images 502 a-c, which may be presented within a shared virtual space between a user and a contact of the user. The contact's name may be displayed in the user interface as contact name 504 a. In some embodiments, if the shared virtual space where images 502 a-c are displayed corresponds to more than one contact, contact name 504 a may display multiple names. In some embodiments, contact icon 504 b may correspond to a picture or image of the contact from the shared virtual space. Icon 504 b may correspond to any image and may be chosen by the contact, designated by the user, and/or assigned by a content management system hosting the shared virtual space. In some embodiments, contact name 504 a and contact icon 504 b may be representative of the user instead of a contact of the user.
  • In some embodiments, user interface 500 may also include user icon 506 corresponding to the user being presented with user interface 500. User icon 506 may correspond to any photograph and/or image chosen by the user to represent the user within the shared virtual space. User interface 500 may also include action 506 a, time of action 506 b, and/or comment 506 c, which may, in some embodiments, be substantially similar to action 404 a, time of action 404 b, and/or comment 404 c of FIG. 4 with the exception that the latter may apply to a contact of the user whereas the former may apply to the user. Persons of ordinary skill in the art will recognize that user icon 506 and action 506 a, time of action 506 b, and comment 506 c may, in some embodiments, apply to a contact of the user, and contact name and icon 504 a and 504 b may then correspond to the user, and the use of one collection for either the user and/or the contact is merely exemplary.
  • FIG. 6A shows an illustrative diagram of a user interface displaying a user's storage settings in accordance with various embodiments. User interface 600 may include network indicator 602, wireless indicator 604, and battery indicator 606. Network indicator 602 may indicate an amount of available network connection. For example, the user may have a cellular data carrier providing cellular network service. Network indicator 602 may indicate how strong or powerful the cellular service may be in a particular location at a particular time. Wireless indicator 604 may behave similarly to network indicator 602, with the exception that wireless indicator 604 may indicate an amount of available Wi-Fi signal. In some embodiments, battery indicator 606 may indicate an amount of available battery charge on the user device. For example, the user device may only have 50% charge remaining, and therefore indicator 606 may display “50%” battery charge. In some embodiments, indicator 606 may indicate a percentage of battery charge remaining, an amount of battery time remaining, an amount of battery used, or any other charge indicator, or any combination thereof. Persons of ordinary skill in the art will recognize that although user interface 600 displays indicators 602, 604, and 606, one or more of these indicators may not be displayed and/or one or more additional indicators may be displayed.
  • In some embodiments, user interface 600 may indicate an amount of storage space occupied on the user device by various files or content items. For example, user interface 600 may correspond to a settings interface, indicating the various storage levels on the device. In some embodiments, user interface 600 may include photo storage bar 610 and video storage bar 620. Photo storage bar 610 may display an amount of storage space occupied on the user device by photographs. For example, the user may have 4 GB of photographs stored on their user device. Current photo storage value 612 may indicate the amount of storage space occupied by photos on the user device (e.g., 4 GB). Total photo storage value 614 may indicate a total amount of storage space available on the user device for storing photographs (e.g., 16 GB). Video storage bar 620 may display an amount of storage space occupied on the user's user device by videos. For example, the user may have 8 GB of videos stored on their user device. Current video storage value 622 may indicate the current amount of storage space occupied on the user device (e.g., 8 GB) by videos. Total video storage value 624 may indicate a total amount of storage space available on the user device for storing videos (e.g., 16 GB). Persons of ordinary skill in the art will recognize that any amount of storage space may be allotted for any content item, and any amount of storage space may be occupied.
  • In some embodiments, the total amount of storage space available for all content items may be the same. For example, the user device may have a total amount of available storage of 16 GB. In this scenario, if the device has 4 GB of photographs and 8 GB of videos, then 12 GB of storage space may be taken up. In some embodiments, there may only be one storage bar displaying a total amount of storage available and/or taken up on the device. In some embodiments, each of storage bars 610 and 620 may represent an individual storage bar displaying a total amount of storage space available for a single media type. For example, the user device may have 16 GB of storage space available for photographs and 16 GB of storage space available for videos.
  • In some embodiments, the amount of storage space available on the user device may be determined using one or more system performance algorithms. For example, the user device's operating system may include one or more options allowing the user to be presented with a screen substantially similar to user interface 600 that indicates the storage levels of the user device. In some embodiments, the performance algorithm may be performed upon a user's request and/or automatically at predefined time intervals (e.g., daily, monthly, annually, etc.).
  • FIG. 6B shows another illustrative diagram of a user interface displaying a user's storage settings in accordance with various embodiments. User interface 600 of FIG. 6B may, in some embodiments, be substantially similar to user interface 600 of FIG. 6A, with the exception that the former may display various amounts of data available for download on the user device. In some embodiments, a user may continually download content until the amount of available storage space has been filled. For example, in order to continually maintain a local user experience on a user device while offline, a large amount of content may be downloaded in advance to the user device. The content may continually download and/or may be dynamically changing. For example, based on a probability that an image will be viewed by a user, specific images may be downloaded to the user device based on each image's selection probability value. Thus, a user may continually have the images they are most likely to view stored on their user device, so that, if connectivity becomes unavailable, the user does not have to wait until connectivity returns to view their images.
  • Photo storage bar 610 may also include available photo storage value 616. For example, the user device may be capable of storing 16 GB of photographs, as indicated by total photo storage value 614. Current photo storage value 612 may indicate that only 4 GB of the available 16 GB may be occupied by photographs, and available photo storage value 616 may indicate that a remaining 12 GB may be available. Similarly, available video storage value 626 may indicate that there may be 8 GB of available storage on the user device for videos because there are already 8 GB of videos occupying the 16 GB of total storage space for videos.
  • In some embodiments, values 616 and 626 may indicate an amount of storage space that may be occupied by content items dynamically sent to the user device from a content management system. For example, a user may have a large quantity of images stored within their user account on a content management system (e.g., content management system 104). Each image stored therein may have a selection probability value assigned to that image based on the probability that the user may view, or attempt to view, that image. For example, images that have recently been added, shared, and/or viewed may be assigned a higher selection probability value than images that have not been viewed in a long period of time (e.g., one or more years).
  • The user account may continually send images to the user device based on the selection probability values of each image. For example, images having higher selection probability values may be sent prior to images having lower selection probability values. Images may continue to be sent to the user device until total photo storage value 614 has been met. In some embodiments, the images sent to the user device may continually change based on any action performed on the images. For example, if one or more images are shared with a contact of the user, those images may be assigned a higher selection probability value, and therefore the order of sending the images to the user device may change.
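  • One way to picture the continuous transfer described above is a loop that walks the ranked listing and stops once the device's photo storage budget is exhausted. The sketch below is a hedged illustration only; the per-image sizes, identifiers, and capacity figure are assumptions, not values from the disclosure.

```python
# Hypothetical sketch: send ranked images until the device's available capacity is met.
def select_for_transfer(ranked_images, available_bytes):
    """ranked_images: list of (image_id, size_bytes) tuples ordered by descending
    selection probability. Returns the prefix of image ids that fits the budget."""
    queued, used = [], 0
    for image_id, size in ranked_images:
        if used + size > available_bytes:
            break                      # capacity level reached; stop queuing further images
        queued.append(image_id)
        used += size
    return queued

ranked = [("img_202", 3_000_000), ("img_206", 2_500_000), ("img_204", 4_000_000)]
print(select_for_transfer(ranked, available_bytes=6_000_000))   # ['img_202', 'img_206']
```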
  • FIG. 7 shows another illustrative diagram of a listing of images in accordance with various embodiments. Listing of images 700 may, in some embodiments, be substantially similar to listing 200 of FIG. 2, with the exception that listing 700 may indicate various parameters associated with each image. In some embodiments, the parameters displayed may include, but are not limited to, the number of times an image has been viewed, the number of times an image has been shared, and/or the number of times an image has been requested to be shared. Each of these parameters may have a corresponding numerical value associated with each image and listed within columns 720, 730, and 740, respectively.
  • In some embodiments, each image within listing 700 may have a corresponding value listed within column 720 corresponding to a number of times that image has been viewed. For example, image 702 may have been viewed one thousand (1,000) times, whereas image 714 may have been viewed three (3) times. As another example, image 704 may have been viewed five thousand (5,000) times, whereas image 708 may have been viewed three hundred (300) times. The amount of times an image has been viewed may correspond to any viewing of the image by the user, a contact of the user, a contact of the contact of the user, and/or any other individual or collection of individuals who may access and/or obtain the image and view it on their own device.
  • In some embodiments, each image within listing 700 may have a corresponding value listed within column 730 corresponding to a number of times that image has been shared. For example, image 706 may have been shared by the user twenty (20) times, whereas image 710 may have been shared three hundred (300) times. In some embodiments, the number of times an image has been shared may correspond to any sharing of the image by the user. For example, a user may share any image stored within their user account on a content management system with one or more contacts, one or more social media networks, and/or one or more shared virtual spaces to which the user belongs. In some embodiments, an image may be stored on the user device and may be shared with any number of contacts by uploading the image to a content management system. For example, a user may capture an image with their user device and upload that image to their user account on a content management system.
  • In some embodiments, each image within listing 700 may have a corresponding value listed within column 740 corresponding to an amount of times that particular image has been requested to be shared. For example, image 712 may have been requested to be shared five (5) times, whereas image 716 may have been requested to be shared eighty (80) times. In some embodiments, a request to share an image may be received by a user from a contact of the user. For example, a user may have an image of their new child stored within their user account. The image may be requested to be shared with the user's friends and family so that they can each view and/or download a version of the stored image. In some embodiments, the user may receive the request to share an image via an email, a text message, and/or a pop-up notification, or using any other suitable notification, or any combination thereof.
  • FIG. 8 shows an illustrative graph of a frequency of use for each image within a listing of images in accordance with various embodiments. Graph 800 may include data points corresponding to listing of images 700 of FIG. 7. For example, diamond pointed data markers 802 may correspond to column 720, whereas square data markers 804 and triangle data markers 806 may correspond to columns 730 and 740, respectively. Each data marker may correspond to a particular image from listing 700, and may represent the amount of times that image was viewed (e.g., markers 802), shared (e.g., markers 804), and requested to be shared (e.g., markers 806).
  • Graph 800 may also include first axis 810 corresponding to an image number from listing of images 700 and a second axis 820 corresponding to values of columns 720, 730, and 740. In some embodiments, first axis 810 may be referred to as the x-axis, whereas second axis 820 may be referred to as the y-axis. Persons of ordinary skill in the art will recognize that first axis 810 and second axis 820 may correspond to any axes and may be associated with any coordinate system, and the use of the x- and y-axes is merely exemplary. In some embodiments, second axis 820 may be in units referred to as “SAMPLE INDEX”. The use of a sample index as units is merely exemplary, and may indicate that the values used on that particular axis may be arbitrary, or substantially arbitrary. In some embodiments, axes 810 and/or 820 may be in logarithmic form, thus allowing large numbers and smaller numbers to both be viewed on the same graph with greater ease.
  • Graph 800 may plot the data markers for each image from listing 700, thus allowing the frequency of use for each image to be viewed in a single graph. For example, image 706 of listing 700 may have been viewed 25 times, shared 20 times, and requested to be shared 10 times. In this scenario, image 706, which may be the third image of the listing (or “IMAGE #3”), may have data markers 802, 804, and 806 located at position 3 on axis 810 and respectively at values 25, 20, and 10 on axis 820.
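  • A plot in the spirit of graph 800 could be produced with a standard plotting library. The sketch below is a hedged illustration using matplotlib: the marker shapes follow the diamond/square/triangle convention described above, the y-axis is log-scaled, and the counts are illustrative assumptions (only a few of them appear in the FIG. 7 description).

```python
# Hypothetical sketch of a graph like FIG. 8 using matplotlib (illustrative counts only).
import matplotlib.pyplot as plt

image_numbers = [1, 2, 3, 4, 5, 6, 7, 8]
views = [1000, 5000, 25, 300, 10000, 50, 3, 40]
shares = [200, 50, 20, 60, 300, 10, 2, 30]
share_requests = [150, 35, 10, 25, 250, 5, 1, 80]

plt.plot(image_numbers, views, "D-", label="Times viewed")              # diamond markers
plt.plot(image_numbers, shares, "s-", label="Times shared")             # square markers
plt.plot(image_numbers, share_requests, "^-", label="Share requests")   # triangle markers
plt.yscale("log")               # log scale lets large and small counts share one graph
plt.xlabel("Image number")
plt.ylabel("Count (sample index)")
plt.legend()
plt.show()
```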
  • In some embodiments, the frequency of use may be used to determine a selection probability value for an image. Data for the number of times an image is viewed, shared, and/or requested to be shared may all be factors used to determine that particular image's selection probability value. The selection probability value may be calculated in any suitable manner. For example, the selection probability value may be the aggregate of the number of times an image is viewed, shared, and requested to be shared. As an illustrative example, image 714 may have an aggregate from columns 720, 730, and 740 of six (6), whereas image 710 may have an aggregate of ten thousand five hundred fifty (10,550). In this particular scenario, image 710 may have a higher selection probability value than image 714 because image 710 may have a much higher aggregate value.
  • In some embodiments, the selection probability value may be calculated based on one or more of the number of times an image is viewed, shared, and/or requested to be shared being weighted. For example, the number of times an image has been requested to be shared may be weighted twice that of the number of times an image has been shared. As an illustrative example, image 704 may have been requested to be shared thirty-five (35) times, whereas it may have been shared fifty (50) times. If the number of times an image has been requested to be shared is weighted, for example, twice that of the number of times that the image has been shared, image 704 may now have an effective value in column 740 of seventy (70), and therefore may obtain a different selection probability value than if it were not weighted double. Persons of ordinary skill in the art will recognize that any combination, permutation, and/or weighting may be used to determine the selection probability value for an image based on any frequency of use characteristics, and the aforementioned examples are merely illustrative. Furthermore, one or more algorithms running on a user's user device and/or on a content management system may be used to calculate the selection probability values.
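  • As one hedged illustration of the weighting just discussed, a score could be computed as a weighted sum of the three frequency-of-use counts and then normalized across the listing into probability-like values. The weights (with share requests weighted twice as heavily as shares) follow the example above, but the normalization, function names, and counts are assumptions rather than the claimed method.

```python
# Hypothetical sketch: weighted frequency-of-use score for each image in a listing.
def selection_score(views, shares, share_requests,
                    w_views=1.0, w_shares=1.0, w_requests=2.0):
    """Share requests are weighted twice as heavily as shares, as in the example above."""
    return w_views * views + w_shares * shares + w_requests * share_requests

def selection_probabilities(images):
    """Normalize raw scores across the listing into values that sum to 1.0."""
    scores = {name: selection_score(*counts) for name, counts in images.items()}
    total = sum(scores.values()) or 1.0
    return {name: score / total for name, score in scores.items()}

# counts are (views, shares, share requests) -- illustrative only
images = {"IMAGE #2": (5000, 50, 35), "IMAGE #7": (3, 2, 1)}
print(selection_probabilities(images))
```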
  • FIG. 9 shows an illustrative flowchart of a process for sending dynamically selected images to a user device in accordance with various embodiments. Process 900 may begin at step 902. At step 902, a listing of images may be generated. The listing may include images stored within a user account on a content management system. For example, listing 200 of FIG. 2 may correspond to a listing of some or all images stored within a user account on content management system 100. In some embodiments, the listing may include every image stored within the user account. For example, if the user has one thousand (1,000) images stored within their user account, then each of those one thousand images may be included within the generated listing. In some embodiments, the listing of images may be generated in response to a user input being detected by the content management system. For example, the user may perform an action on an image located within their user account, and in response the listing may be generated.
  • At step 904, a selection probability value may be assigned to each image within the listing. The selection probability value may be based on at least one factor, such as the number of times an image has been viewed or shared. In some embodiments, the selection probability value may correspond to a likelihood of use of the at least one image. For example, image 710 of FIG. 7 may have been viewed ten thousand (10,000) times, whereas image 714 may have been viewed three (3) times. In this scenario, image 710 may have a selection probability value assigned to it that may be greater than a selection probability value assigned to image 714 based on the number of times that each image has been viewed. In some embodiments, the at least one factor used to determine the selection probability value of an image may correspond to the number of times an image has been shared and/or the number of times an image has been requested to be shared. However, persons of ordinary skill in the art will recognize that any factor may be used to assign a selection probability value to a particular image, and the use of the number of times viewed, shared, and/or requested to be shared is merely exemplary.
  • In some embodiments, multiple factors may be used to determine the selection probability value. For example, the number of times an image has been viewed and shared may both be used to determine the image's selection probability value. In some embodiments, one or more factors may be weighted greater or less than other factors. For example, the number of times an image has been viewed may be weighted higher than the number of times an image has been shared. In some embodiments, a recent viewing of a first image from the listing of images and/or a recent viewing of one or more additional images included in a same collection as one of the images from the listing may be used, at least in part, to determine and/or assign the selection probability value to a particular image. For example, a first image may be included in a collection of images from a user's vacation. Within the vacation photos may be a second image that the user may view on their user device. The first image may receive a higher selection probability value because the second image has recently been viewed. In some embodiments, the addition of one or more new images to the user account on the content management system and/or to a shared virtual space to which the user belongs may also be a factor on which the selection probability value is based. In some embodiments, the selection probability value of each image may dynamically update in response to any user interaction with a selected image from the listing of images, the content management system, and/or an additional image from within the listing. For example, the user may decide to share image 206 of FIG. 2, and in response to the detected interaction with the content management system to share the image, each image from within listing 200 may have its selection probability value updated.
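  • The recency and same-collection effects described above could be pictured as multiplicative adjustments applied on top of a frequency-based base score. The following sketch is an illustrative assumption only: the half-life, the collection boost, and the decay floor are invented parameters, not values from the disclosure.

```python
# Hypothetical sketch: adjust a base score for recency and shared-collection viewing.
import math

def adjusted_score(base_score, days_since_last_view, collection_recently_viewed,
                   half_life_days=30.0, collection_boost=1.25):
    """Decay the score with time since last view; boost it if a sibling image
    in the same collection was recently viewed."""
    recency_factor = math.exp(-math.log(2) * days_since_last_view / half_life_days)
    score = base_score * (0.5 + 0.5 * recency_factor)   # never decay below half the base score
    if collection_recently_viewed:
        score *= collection_boost
    return score

# A vacation photo not viewed for a week, but a sibling photo was just viewed.
print(adjusted_score(100.0, days_since_last_view=7, collection_recently_viewed=True))
```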
  • At step 906, an available capacity level on the user device may be determined. In some embodiments, the user may be presented with user interface 600 indicating current storage levels on their user device. For example, storage bar 610 may indicate to the user that there may be 12 GB of storage space available for storing photos and 4 GB of photos may already be stored on their user device. In some embodiments, user interface 600 may indicate a total amount of storage space occupied and/or available on the user device. For example, only one storage indicator may be present on user interface 600 indicating all available storage levels of the user device.
  • In some embodiments, determining the available capacity level of the user device may include determining a level of connectivity between the user device and the content management system. For example, if the level of connectivity between the user device and the content management system is very low, the transmission of one or more images to the user device may not be possible or may take a substantially long time. As another example, if the connection is determined to be across a cellular network instead of over a Wi-Fi connection, then the transmission may be paused so that cellular data usage is not taken up by the transmission of images. This may be especially useful if a large number of images are set to be sent to the user device and the user only has a limited cellular data plan. In some embodiments, determining the available capacity level on the user device may also include detecting whether the user device has an available battery charge exceeding a threshold for transmission. For example, if the user device has a very low battery charge (e.g., 5% charge remaining), the transmission may be paused until the battery charge exceeds a predefined battery threshold needed for transmission (e.g., 20% charge). This may ensure that the user device does not completely drain its battery charge downloading images, which could otherwise leave no charge available for the device in case of an emergency.
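  • The connectivity and battery checks described in this step could be expressed as a simple gate that the transfer loop consults before sending each batch. The thresholds, field names, and data structure below are illustrative assumptions rather than the claimed implementation.

```python
# Hypothetical sketch: decide whether image transmission should proceed right now.
from dataclasses import dataclass

@dataclass
class DeviceStatus:
    free_bytes: int          # available storage for photos on the user device
    on_wifi: bool            # True if connected over Wi-Fi rather than cellular
    battery_percent: float   # remaining battery charge

def may_transmit(status: DeviceStatus,
                 min_free_bytes: int = 50_000_000,
                 min_battery_percent: float = 20.0) -> bool:
    """Pause transfers on cellular connections, low battery, or a nearly full device."""
    if not status.on_wifi:
        return False                     # avoid consuming the user's cellular data plan
    if status.battery_percent < min_battery_percent:
        return False                     # wait until charge exceeds the transmission threshold
    return status.free_bytes > min_free_bytes

print(may_transmit(DeviceStatus(free_bytes=12_000_000_000, on_wifi=True, battery_percent=55.0)))
```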
  • At step 908, one or more images may be selected from the listing of images. The selected images may, in some embodiments, be selected for transmission to the user device, and the selection may be based on each image's selection probability value and/or the available capacity level of the user device. For example, image 202 may have a selection probability value of 90%, whereas image 210 may have a selection probability value of 5%. If the available capacity level on the user device is such that only one image may be capable of being downloaded to the device, then, because image 202 has a higher selection probability value, it may be sent to the user device instead of image 210. However, if the user device has enough storage space available that all five images from listing 200 may be downloaded, then both images 202 and 210 may be downloaded to the user device along with images 204, 206, and 208.
  • In some embodiments, selecting images from within the listing to be sent to the user device may also include determining whether any of the images within the listing are already stored locally on the user device. For example, image 204 of listing 200 may already be stored on the user device. In this scenario, sending image 204 as one of the selected images may be repetitive, and therefore that image may not be sent. In some embodiments, one or more system checks may be performed on the user's user device prior to sending any of the images to determine which images and/or content may already be stored thereon. For example, the system check may determine the photos stored on the user device as well as the amount of available storage space on the user device.
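  • A hedged sketch of the duplicate check described above might simply filter the ranked listing against the identifiers reported by the device's system check; the identifiers and function name are assumptions for illustration.

```python
# Hypothetical sketch: drop images that are already stored on the user device.
def filter_already_local(ranked_image_ids, local_image_ids):
    """Avoid re-sending content the device already holds, preserving rank order."""
    local = set(local_image_ids)
    return [image_id for image_id in ranked_image_ids if image_id not in local]

print(filter_already_local(["img_202", "img_204", "img_206"], ["img_204"]))
# ['img_202', 'img_206']
```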
  • At step 910, the dynamically selected images may be ranked in order of priority for transmission to the user device. For example, column 230 of FIG. 2 may present a rank for each image from listing 200. The rank may correspond to each image's selection probability value. For example, image 202 of listing 200 may have the highest selection probability value (e.g., 90%), whereas image 210 may have the lowest selection probability value (e.g., 5%). In this scenario, image 202 may have a rank of “A”, indicating that it may be first to be sent, whereas image 210 may have a rank of “E”, indicating that it may be the last image from listing 200 sent. In some embodiments, the ranking order for transmission may run from the image having the highest selection probability value to the image having the lowest selection probability value. However, this is merely exemplary, and any ranking order may be used. For example, a threshold probability value may be used, such that any image having a selection probability value below the threshold may not be sent. In some embodiments, the ranking may be in order of the image or images having the lowest selection probability value to the image(s) having the highest selection probability value.
  • At step 912, the dynamically selected images may be sent to the user device. In some embodiments, the transmission may be performed in ranking order. For example, an image having the ranking of “A” may be sent before an image having the ranking of “B” or “C”. In some embodiments, the images may be sent to storage on the user device. In some embodiments, transmission of the images may continue to occur until the available capacity level of the user device has been met. For example, if only 12 GB of storage space is available on the user device, images may continue to be sent to the user device until the 12 GB of storage space has been occupied.
  • In some embodiments, the various steps of process 900 may be repeated periodically at defined time intervals. For example, the steps may repeat hourly, daily, monthly, annually, or at any other time interval. In some embodiments, the user may define the time intervals. For example, the user may have process 900 repeat every six (6) hours. In some embodiments, process 900 may repeat continually. For example, upon transmission of the selected images to the user device completing, a new listing of images within the user account may be generated.
  • In some embodiments, the various steps of process 900 may repeat in response to an addition of an image to the user account. For example, upon images 312 and 314 being added to the user account, a new listing of images (e.g., listing 300) may be generated, and the probability values, and thus the rank and order of transmission of the images, may change. In some embodiments, the various steps of process 900 may repeat in response to one or more images being selected to be added. For example, images 312 and 314 may be selected to be added to the user account and process 900 may repeat. In some embodiments, the various steps of process 900 may repeat in response to one or more images being added to a shared virtual space to which the user belongs. For example, the user may add images 502 a-c of FIG. 5 to a shared virtual space. By adding these images to the shared virtual space, a new listing of images may be generated, and new selection probability values, and thus a new ranking order for transmission, may be assigned.
  • FIG. 10 shows an illustrative flowchart of a process for creating a dynamic priority list in accordance with various embodiments. Process 1000 may begin at step 1002. At step 1002, a number of images stored within a user account on a content management system may be calculated. For example, a user may store one or more images within their user account on a content management system. The content management system may, in some embodiments, perform a status report to calculate the number of images that the user has stored within their user account. The user may be capable of storing any number of images within their user account. In some embodiments, the user may store other content items within their user account including, but not limited to, videos, audio files (e.g., mp3s), textual documents, or any other file type, or any combination thereof.
  • At step 1004, a frequency of use for each image calculated to be stored within the user account may be determined. Any frequency of use parameter may be determined, for example, a number of times the image has been viewed. For example, image 702 of FIG. 7 may have been viewed one thousand (1,000) times. In some embodiments, one or more additional parameters may be determined corresponding to an image's frequency of use. For example, a number of times an image has been shared and/or a number of times an image has been requested to be shared may be determined. As another example, the frequency of use may correspond to a determined number of contacts who also have a same image stored within their user account.
  • At step 1006, a selection probability value may be assigned to each image. In some embodiments, the selection probability value may be based on the determined frequency of use of each image. For example, an image that has been shared numerous times may be assigned a higher selection probability value than an image that has never been shared. The selection probability value assigned to an image may be based on any determined frequency of use. For example, a number of times an image has been viewed, shared, and/or requested to be shared may all contribute to determining the frequency of use of an image. In some embodiments, step 1006 of FIG. 10 may be substantially similar to step 904 of FIG. 9, and the previous description of the latter may apply to the former.
  • At step 1008, a dynamic priority list may be created for sending images to a user device from the content management system. In some embodiments, the dynamic priority list may rank each image by its assigned selection probability value. For example, listing 200 may have the images included therein ranked based on each image's assigned selection probability value. In some embodiments, the dynamic priority list may also queue the content management system to send images to the user device in order of each image's rank. For example, image 202 of FIG. 2 may have a rank of “A”, whereas image 208 may have a rank of “D”. Thus, in this scenario, image 202 may be sent prior to image 208 because it has a “higher” rank. Persons of ordinary skill in the art will recognize that any image may be ranked in any order using any parameter, and the use of sending an image with a rank of “A” before an image with a rank of “B” is merely exemplary.
  • In some embodiments, an amount of available storage space on the user device may be calculated. For example, a user device may have 16 GB of total storage space where 4 GB of storage space may already be occupied. In this scenario, there may be 12 GB of storage space calculated to be available on the user device. In some embodiments, in response to calculating the amount of available storage space on the user device, a first amount of images may be sent to the user device. The first amount of images may be of a first resolution. For example, each image within the first amount of images may correspond to a low resolution image (e.g., 75×75 pixels). In some embodiments, the first amount of images, and their resolution, may be based on the amount of available storage space and each image's rank. For example, if there is only enough storage space to store one high-definition image (e.g., 1080 pixels), then an image having a rank of “A” may be the only image capable of being sent to the user device. However, in this scenario, if there is more than one image having a rank of “A”, and/or if there are images having multiple rankings that combined would occupy the same amount of storage space as the one high-definition image, then those images may be sent to the user device.
  • In some embodiments, in response to calculating an amount of available storage space on the user device, a first number of images from the dynamic priority list may be determined to be sendable to the user device. The first amount of images may be in a first resolution, which may be based on the calculated available storage space. In some embodiments, a second number of images may be determined to be sendable to the user device in a second resolution based on the available amount of storage space. For example, if there is 2 GB of available storage space on the device, two images having a combined storage space of 1 GB may be sent in a high resolution and 10 images having a combined storage space of 1 GB may also be sent in a lower resolution as well. The aforementioned example is merely one illustration, and any number of images having any resolution may be determined to be sendable to the user device.
  • In some embodiments, the first number of images in the first resolution may correspond to a first subset of images stored on the content management system having a high-quality resolution. For example, one or more images stored within the user account on the content management system may be of a high-quality resolution (e.g., 1080 pixels), and these images may be sendable to the user device within the first number of images. In some embodiments, the second number of images in the second resolution may correspond to a second subset of images stored on the content management system having a low-quality resolution (e.g., 256 pixels). In some embodiments, the first subset of images may include fewer images than the second subset of images; however, the two subsets may each occupy an equal amount of storage space. For example, within 1 GB of storage space, ten (10) high quality images may be sendable, whereas one hundred (100) low quality images may be sendable within the same 1 GB.
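  • To make the resolution trade-off concrete, the following sketch splits an illustrative storage budget between a few high-resolution copies for the top-ranked images and thumbnails for the rest. The byte sizes, the fixed high-resolution count, and the stop-when-full behavior are assumptions, not figures or logic from the disclosure.

```python
# Hypothetical sketch: plan high-resolution copies for top-ranked images, thumbnails after.
HIGH_RES_BYTES = 100_000_000   # ~0.1 GB per high-quality image (illustrative)
THUMB_BYTES = 10_000_000       # ~0.01 GB per low-quality image (illustrative)

def plan_resolutions(ranked_image_ids, budget_bytes, high_res_count):
    """Return a mapping of image id -> chosen resolution that fits the budget."""
    plan, used = {}, 0
    for index, image_id in enumerate(ranked_image_ids):
        size = HIGH_RES_BYTES if index < high_res_count else THUMB_BYTES
        if used + size > budget_bytes:
            break                          # budget exhausted; remaining images are skipped
        plan[image_id] = "high" if index < high_res_count else "thumbnail"
        used += size
    return plan

ranked = [f"img_{n}" for n in range(1, 31)]
print(plan_resolutions(ranked, budget_bytes=1_000_000_000, high_res_count=5))
```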
  • FIG. 11 shows an illustrative flowchart of a process for obtaining images from a content management system in accordance with various embodiments. Process 1100 may begin at step 1102. At step 1102, an amount of available storage space on a user device may be determined. For example, user interface 600 of FIG. 6B may display storage bar 610 indicating available storage 616 of 12 GB on a user device. In some embodiments, step 1102 of FIG. 11 may be substantially similar to step 906 of FIG. 9, and the previous description of the latter may apply to the former.
  • At step 1104, data may be sent to the content management system indicating the determined amount of available storage space on the user device. For example, device 102 may send a system status report to a content management system indicating the current storage levels of the user device. The system status report may be run at specific time intervals and/or upon user request, for example. In some embodiments, the system status report may detail the amount of storage space currently occupied on the user device by various content items. For example, the status report may indicate that the user device may have 4 GB of storage space occupied by photographs out of total storage space 614, and therefore may have 12 GB of storage space available for additional photographs.
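  • The data sent at this step might take the form of a small status payload. The field names, JSON encoding, and inclusion of locally stored image identifiers below are assumptions for illustration, not a format specified by the disclosure.

```python
# Hypothetical sketch: a device status report describing available photo storage.
import json

def build_status_report(device_id, photo_bytes_used, photo_bytes_total, local_image_ids):
    return json.dumps({
        "device_id": device_id,
        "photo_storage": {
            "used_bytes": photo_bytes_used,
            "total_bytes": photo_bytes_total,
            "available_bytes": photo_bytes_total - photo_bytes_used,
        },
        # Listing local image ids lets the server skip images already on the device.
        "local_image_ids": local_image_ids,
    })

print(build_status_report("device-102a", 4 * 2**30, 16 * 2**30, ["img_204"]))
```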
  • At step 1106, a first dynamic priority list may be received from the content management system. The first dynamic priority list may include a first plurality of images that may be ranked in a first order based on a first selection probability value assigned to each image within the list. For example, listing 200 may include images 202, 204, 206, 208, and 210, each having selection probability values assigned to them listed within column 220. In some embodiments, the assigned selection probability values may be used to determine a rank of each image to be sent to the user device from the user account on the content management system. For example, image 202 may have a selection probability value of 90%, whereas image 206 may have a selection probability value of 70%. In this scenario, image 202 may have a rank of “A”, whereas image 206 may have a rank of “B”. In some embodiments, images may be downloaded or attempted to be downloaded to the user device in ranking order. For example, image 202 may be ranked higher than image 206, and therefore may be downloaded first. Persons of ordinary skill in the art will recognize that any ranking order may be assigned to images within the listing of images, and the use of ranking images based on each image's selection probability value is merely exemplary.
  • At step 1108, one or more inputs may be detected using the user device. The one or more inputs may be performed on one or more images from the first plurality of images. For example, a user may be presented with listing 200 within a user interface on their user device. In some embodiments, the user may perform a gesture or an action on an image in an attempt to share that image. Any gesture may be performed including, but not limited to, taps, swipes, hovers, audio commands, visual commands, motions, or any other gesture, or any combination thereof.
  • At step 1110, data corresponding to the detected one or more inputs may be sent to the content management system. For example, the user device may detect that the user may be attempting to share image 202 via one or more gestures performed on the user interface of the user device. In some embodiments, the user device may store the data corresponding to the gestures in one or more data files or gesture detection files, and send those files to the content management system.
  • At step 1112, a second dynamic priority list may be downloaded in response to the data corresponding to the detected input being sent to the content management system. In some embodiments, the second dynamic priority list may include a second plurality of images ranked in a second order based on a second selection probability value being assigned to each image. For example, the second selection probability value may correspond to the change in the likelihood of a particular image being viewed after the detected gesture. Listing 300 may include, for example, image 308, which may be substantially similar to image 208 from listing 200. Image 208 may have initially had a selection probability value of 20% and a transmission ranking of “D”. In response to the detected input adding images 312 and 314 to the listing, the selection probability value of image 308 may now be lower than before due to a lower likelihood of viewing this particular image. For example, image 308 may be assigned a selection probability value of 5% corresponding to a transmission ranking of “G”.
  • In some embodiments, the detected input or inputs may cause the first probability value assigned to each image within the listing to change to a second probability value. For example, if the user attempts to share image 206, the selection probability value for that image may change in response to the detected input. In this scenario, image 206, which may have initially had a selection probability value of 70%, may change to now correspond to image 306 of listing 300, having a selection probability value of 85%. In some embodiments, sending the detected input data may be repeated periodically at specific time intervals. For example, the input data may be sent to the content management system from the user device every hour, or every day. In some embodiments, the first and second dynamic priority lists may each be modified based on the transmission of the detected input data at each time interval. For example, each time the input data is sent to the content management system, the dynamic priority list may update.
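  • One way to picture how a detected interaction could shift the priority list is a boost applied to the interacted image followed by a re-ranking of the listing. The boost factor, cap at 1.0, and identifiers below are illustrative assumptions rather than the claimed update rule.

```python
# Hypothetical sketch: boost an image's selection probability after a detected share gesture.
def apply_interaction(probabilities, image_id, boost=1.5):
    """probabilities: dict of image_id -> selection probability. Returns a re-ranked id list."""
    updated = dict(probabilities)
    updated[image_id] = min(1.0, updated[image_id] * boost)
    # Re-rank: highest probability first becomes the new transmission order.
    return sorted(updated, key=updated.get, reverse=True)

first_list = {"img_202": 0.90, "img_206": 0.70, "img_208": 0.20}
print(apply_interaction(first_list, "img_206"))   # img_206 may now outrank img_202
```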
  • In some embodiments, at least two sets of images may be received on the user device that may be selected as the first dynamic priority list. For example, the two sets of images may each be included within the listing of images used to create the first dynamic priority list. In some embodiments, the at least two sets of images may include a first set of images and a second set of images. The first set of images may include a first number of images each having a first resolution. For example, the first set of images may be a first set of high-definition images. The second set of images may include a second number of images each having a second resolution. For example, the second set of images may be a set of thumbnail resolution images. In some embodiments, both the first set of images and the second set of images may include the same number of images. For example, the first set of images may include ten (10) images, and the second set of images may also include ten (10) images. In some embodiments, the first resolution may correspond to a higher resolution than the second resolution. For example, the first resolution may correspond to high definition images (e.g., 1080 pixels) whereas the second resolution may correspond to low definition images (e.g., 256 pixels).
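  • The paired image sets can be pictured as follows; the 1080-pixel and 256-pixel values are the illustrative resolutions from this paragraph, and the helper name is invented for the sketch.

```python
def build_image_sets(image_ids, full_px=1080, thumb_px=256):
    """Return two equally sized sets for the same images: a high-resolution
    set and a thumbnail-resolution set."""
    full_set = [{"image_id": i, "resolution": full_px} for i in image_ids]
    thumb_set = [{"image_id": i, "resolution": thumb_px} for i in image_ids]
    return full_set, thumb_set

full_set, thumb_set = build_image_sets([f"img_{n}" for n in range(10)])
assert len(full_set) == len(thumb_set) == 10
```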
  • FIG. 12 shows an exemplary system in accordance with various embodiments. In some embodiments, system 1200 of FIG. 12 may be substantially similar to system 100 of FIG. 1, with the exception that the former may present elements of system 100 at a more granular level (e.g., modules, applications, etc.).
  • In some embodiments, user devices 102 may be used to create, access, modify, and manage content items, such as content items 110 a and 110 b (collectively 110), stored locally within content item system 108 a and 108 b (collectively systems 108) on user device 102 and/or stored remotely on content management system 104 (e.g., within data store 118). For example, user device 102 a may access content items 110 b stored remotely with data store 118 of content management system 104 and may, or may not, store content item 110 b locally within content item system 108 a on user device 102 a. Continuing with the example, user device 102 a may temporarily store content item 110 b within a cache locally on user device 102 a, make revisions to content item 110 b, and the revisions to content item 110 b may be communicated and stored in data store 118 of content management system 104. Optionally, a local copy of content item 110 a may be stored on user device 102 a. In some embodiments, data store 118 may include one or more collections 132 of content items. For example, collections 132 may include one or more content items having similar properties (e.g., metadata) and/or including similar content.
  • In some embodiments, user devices 102 may include camera 138 (e.g., 138 a and 138 b) to capture and record digital images and/or videos. User devices 102 may capture, record, and/or store content items, such as images, using camera 138. For example, camera 138 may capture and record images and store metadata with the images. Metadata may include, but is not limited to, the following: creation timestamp, geolocation, orientation, rotation, title, and/or any other attributes or data relevant to the captured image. Metadata values may be stored as attribute 112 name-value pairs, tag-value pairs, and/or any other method, or any combination thereof, to associate the metadata with the content item and easily identify the type of metadata. In some embodiments, attributes 112 may be tag-value pairs defined by a particular standard, including, but not limited to, Exchangeable Image File Format (“Exif”), JPEG File Interchange Format (“JFIF”), and/or any other standard.
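  • As a rough illustration of storing capture metadata as tag-value pairs, the sketch below uses human-readable tag names rather than the numeric tag identifiers an actual Exif writer would record.

```python
import json
from datetime import datetime, timezone

def build_capture_metadata(file_name, latitude, longitude, orientation):
    """Assemble Exif-style tag-value pairs recorded alongside a captured image."""
    return {
        "DateTimeOriginal": datetime.now(timezone.utc).isoformat(),
        "GPSLatitude": latitude,
        "GPSLongitude": longitude,
        "Orientation": orientation,
        "SourceFile": file_name,
    }

print(json.dumps(build_capture_metadata("IMG_0001.JPG", 37.7749, -122.4194, 1), indent=2))
```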
  • In some embodiments, user devices 102 may include time normalization module 146, and content management system 104 may include time normalization module 148. Time normalization module 146 (e.g., 146 a and 146 b), counterpart time normalization module 148, and/or any combination thereof, may be used to normalize dates and times stored with content items. The normalized dates and times may be used to sort, group, perform comparisons, perform basic math, and/or cluster content items.
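  • A minimal sketch of the normalization idea, assuming the capture time is stored as an Exif-style local timestamp together with a known UTC offset:

```python
from datetime import datetime, timezone, timedelta

def normalize_capture_time(raw_timestamp, utc_offset_hours):
    """Normalize a device-local capture time to UTC so content items from
    different devices can be sorted, grouped, and compared consistently."""
    local = datetime.strptime(raw_timestamp, "%Y:%m:%d %H:%M:%S")
    tz = timezone(timedelta(hours=utc_offset_hours))
    return local.replace(tzinfo=tz).astimezone(timezone.utc)

print(normalize_capture_time("2014:02:27 09:15:00", utc_offset_hours=-8))
```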
  • In some embodiments, user devices 102 may include organization module 136, and content management system 104 may include organization module 140. Organization module 136 (e.g., 136 a and 136 b) may be used to organize content items into clusters or collections of content items, organize content items to provide samplings of content items for display within user interfaces, and/or retrieve organized content items for presentation. Organization module 136 may utilize any clustering algorithm. Organization module 136 may be used to identify similar content items for clusters in order to organize content items for presentation within user interfaces on user devices 102 and content management system 104. Similarity rules may be defined to create one or more numeric representations embodying information on similarities between each of the content items in accordance with various similarity rules. Organization module 136 may use numeric representations as a reference for similarity between content items in order to cluster the content items.
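  • The sketch below applies one very simple similarity rule, grouping items whose capture times fall within an hour of one another; an actual organization module could combine many such rules and richer numeric representations.

```python
def cluster_by_capture_time(items, max_gap_seconds=3600):
    """Group content items into clusters using a single similarity rule:
    items captured within max_gap_seconds of the previous item share a cluster."""
    ordered = sorted(items, key=lambda i: i["captured_at"])
    clusters, current = [], []
    for item in ordered:
        if current and item["captured_at"] - current[-1]["captured_at"] > max_gap_seconds:
            clusters.append(current)
            current = []
        current.append(item)
    if current:
        clusters.append(current)
    return clusters

photos = [{"id": "a", "captured_at": 0},
          {"id": "b", "captured_at": 600},
          {"id": "c", "captured_at": 90000}]
print(len(cluster_by_capture_time(photos)))   # 2 clusters
```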
  • In some embodiments, content items may be organized into clusters to aid with retrieval of similar content items in response to search requests. For example, organization module 136 a may identify that two images are similar and may group the images together in a cluster. Organization module 136 a may process content items to determine clusters independently and/or in conjunction with counterpart organization module (e.g., 140 and/or 136 b). In other embodiments, organization module 136 a may only provide clusters identified with counterpart organization modules (e.g., 140 and/or 136 b) for presentation. Continuing with this example, processing of content items to determine clusters may be an iterative process that may be executed upon receipt of new content items and/or new similarity rules.
  • In some embodiments, user device 102 a may include classification module 150 a, while user device 102 b may include classification module 150 b (collectively 150), which may be used independently, in combination with classification module 152 included on content management system 104, and/or any combination thereof to classify content items, rectify content items, and/or classify images. For example, the classification modules 150 and/or 152 may be used to determine if an image includes a document, and if so, determine a type of document stored therein. Content item rectification may be performed to correct, perform further transformations, and/or crop content items to improve the display of the content items (e.g., correct the display of a document within an image).
  • In some embodiments, user device 102 a may include search module 142 a, while user device 102 b may include search module 142 b, which collectively may be referred to as search modules 142. Content management system 104 may also be provided with counterpart search module 144. Each of search modules 142 and 144 may be capable of supporting searches for content items located on both user devices 102 and/or content management system 104. A search request may be received by search module 142 and/or 144 that requests one or more content items. In some embodiments, the search may be handled by searching metadata and/or attributes assigned to content items during the provision of management services. For example, cluster markers stored with content items may be used to find content items by date. In this particular scenario, cluster markers may indicate an approximate time, or average time, for the content items stored with the cluster marker, and the marker may be used to speed the search and/or return the search results as the contents of clusters having particular cluster markers.
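  • Building on the clustering sketch above, a cluster marker holding the average capture time can narrow a date search before any individual content item is examined; the choice of marker here is an assumption made for illustration.

```python
def search_clusters_by_date(clusters, target_ts, tolerance_seconds=86400):
    """Use per-cluster markers (here, the average capture time) to narrow a
    date search to a few clusters before touching individual content items."""
    def marker(cluster):
        times = [item["captured_at"] for item in cluster]
        return sum(times) / len(times)

    return [c for c in clusters if abs(marker(c) - target_ts) <= tolerance_seconds]

clusters = [
    [{"id": "a", "captured_at": 1000}, {"id": "b", "captured_at": 2000}],
    [{"id": "c", "captured_at": 900000}],
]
print(len(search_clusters_by_date(clusters, target_ts=1500)))   # 1 matching cluster
```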
  • Content items 110 managed by content management system 104 may be stored locally within content item system 108 of respective user devices 102 and/or stored remotely within data store 118 of content management system 104 (e.g., content items 134 in data store 118). Content management system 104 may provide synchronization of content items managed thereon. Attributes 112 a and 112 b (collectively 112) or other metadata may also be stored with content items 110. For example, a particular attribute may be stored with a content item to track content items locally stored on user devices 102 that are managed and/or synchronized by content management system 104. In some embodiments, attributes 112 may be implemented using extended attributes, resource forks, or any other implementation that allows for storing metadata with a content item that is not interpreted by a content item system, such as content item system 108. In particular, attributes 112 a and 112 b may be content identifiers for content items. For example, the content identifier may be a unique or nearly unique identifier (e.g., number or string) that identifies the content item. By storing a content identifier with the content item, the content item may be tracked. For example, if a user moves the content item to another location within content item system 108 hierarchy and/or modifies the content item, then the content item may still be identified within content item system 108 of user device 102. Any changes or modifications to the content item identified with the content identifier may be uploaded or provided for synchronization and/or version control services provided by content management system 104.
  • A stand-alone content management application 114 a and 114 b (collectively 114), client application, and/or third-party application may be implemented on user devices 102 a and 102 b, respectively, to provide a user interface to a user for interacting with content management system 104. Content management application 114 may expose the functionality provided with content management interface module 154 and accessible modules for user device 102. Web browser 116 a and 116 b (collectively 116) may be used to display a web page front end for a client application that may provide content management system 104 functionality exposed/provided with content management interface module 154.
  • Content management system 104 may allow a user with an authenticated account to store content, as well as perform management tasks, such as retrieve, modify, browse, synchronize, and/or share content with other accounts. Various embodiments of content management system 104 may have elements including, but not limited to, content management interface module 154, account management module 120, synchronization module 122, collections module 124, sharing module 126, file system abstraction 128, data store 118, and organization module 140. Content management interface module 154 may expose the server-side or back end functionality/capabilities of content management system 104. For example, a counter-part user interface (e.g., stand-alone application, client application, etc.) on user devices 102 may be implemented using content management interface module 154 to allow a user to perform functions offered by modules of content management system 104.
  • The user interface displayed on user device 102 may be used to create an account for a user and/or authenticate the user to use the account using account management module 120. Account management module 120 may provide the functionality for authenticating use of an account by a user and/or user device 102 with username/password, device identifiers, and/or any other authentication method. Account information 130 may be maintained in data store 118 for accounts. Account information may include, but is not limited to, personal information (e.g., an email address or username), account management information (e.g., account type, such as “free” or “paid”), usage information (e.g., content item edit history), maximum storage space authorized, storage space used, content storage locations, security settings, personal configuration settings, content sharing data, etc. An amount of storage space on content management system 104 may be reserved, allotted, allocated, stored, and/or may be accessed with an authenticated account. The account may be used to access content items 134 and/or content items 110 within data store 118 for the account, and/or content items 134 and/or content items 110 made accessible to the account that are shared from another account. In some embodiments, account management module 120 may interact with any number of other modules of content management system 104.
  • An account on content management system 104 may, in some embodiments, be used to store content such as documents, text items, audio items, video items, etc., from one or more user devices 102 authorized by the account. The content may also include collections of various types of content with different behaviors, or other mechanisms of grouping content items together. For example, an account may include a public collection that may be accessible to any user. In some embodiments, the public collection may be assigned a web-accessible address. A link to the web-accessible address may be used to access the contents of the public folder. In another example, an account may include a photos collection that may store photos and/or videos, and may provide specific attributes and actions tailored for photos and/or videos. The account may also include an audio collection that provides the ability to play back audio items and perform other audio related actions. The account may still further include a special purpose collection. An account may also include shared collections or group collections that may be linked with and available to multiple user accounts. In some embodiments, access to a shared collection may differ for different users that may be capable of accessing the shared collection.
  • Content items 110 and/or content items 134 may be stored in data store 118. Data store 118 may, in some embodiments, be a storage device, multiple storage devices, or a server. Alternatively, data store 118 may be a cloud storage provider or network storage accessible via one or more communications networks. Content management system 104 may hide the complexity and details from user devices 102 by using content item system abstraction 128 (e.g., a content item system database abstraction layer) so that user devices 102 do not need to know exactly where the content items are being stored by content management system 104. Embodiments may store the content items in the same collections hierarchy as they appear on user device 102. Alternatively, content management system 104 may store the content items in various orders, arrangements, and/or hierarchies. Content management system 104 may store the content items in a storage area network (SAN) device, in a redundant array of inexpensive disks (RAID), etc. Content management system 104 may store content items using one or more partition types, such as FAT, FAT32, NTFS, EXT2, EXT3, EXT4, ReiserFS, BTRFS, and so forth.
  • Data store 118 may also store metadata describing content items, content item types, and the relationship of content items to various accounts, folders, collections, or groups. The metadata for a content item may be stored as part of the content item and/or may be stored separately. Metadata may be stored in an object-oriented database, a relational database, a content item system, or any other collection of data. In some embodiments, each content item stored in data store 118 may be assigned a system-wide unique identifier.
  • Data store 118 may, in some embodiments, decrease the amount of storage space required by identifying duplicate content items or duplicate chunks of content items. Instead of storing multiple copies, data store 118 may store a single copy of content item 134 and then use a pointer or other mechanism to link the duplicates to the single copy. Similarly, data store 118 may store content items 134 more efficiently, as well as provide the ability to undo operations, by using a content item version control that tracks changes to content items, different versions of content items (including diverging version trees), and a change history. The change history may include a set of changes that, when applied to the original content item version, produce the changed content item version.
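  • A content-hash scheme is one common way to realize the single-copy storage described here; the sketch below is an assumption about mechanism rather than a description of data store 118 itself.

```python
import hashlib

class DeduplicatingStore:
    """Store each unique chunk of content once; duplicates become pointers
    (here, shared hash keys) to the single stored copy."""

    def __init__(self):
        self._blobs = {}          # sha256 hex digest -> bytes
        self._items = {}          # item id -> digest

    def put(self, item_id, data: bytes):
        digest = hashlib.sha256(data).hexdigest()
        self._blobs.setdefault(digest, data)   # stored once, reused by duplicates
        self._items[item_id] = digest
        return digest

store = DeduplicatingStore()
a = store.put("photo_a", b"same pixels")
b = store.put("photo_b", b"same pixels")
assert a == b and len(store._blobs) == 1
```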
  • Content management system 104 may be configured to support automatic synchronization of content from one or more user devices 102. The synchronization may be platform independent. That is, the content may be synchronized across multiple user devices 102 of varying type, capabilities, operating systems, etc. For example, user device 102 a may include client software, which synchronizes, via synchronization module 122 at content management system 104, content in content item system 108 of user devices 102 with the content in an associated user account. In some cases, the client software may synchronize any changes to content in a designated collection and its sub-collection, such as new, deleted, modified, copied, or moved content items or folders. In one example of client software that integrates with an existing content management application, a user may manipulate content directly in a local folder, while a background process monitors the local content item for changes and synchronizes those changes to content management system 104. In some embodiments, a background process may identify content that has been updated at content management system 104 and synchronize those changes to the local collection. The client software may provide notifications of synchronization operations, and may provide indications of content statuses directly within the content management application. In some embodiments, user device 102 may not have a network connection available. In this scenario, the client software may monitor the linked collection for content item changes and queue those changes for later synchronization to content management system 104 when a network connection is available. Similarly, a user may manually stop or pause synchronization with content management system 104.
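  • The offline queueing behavior can be sketched as follows; the class and callback names are invented for illustration.

```python
from collections import deque

class SyncClient:
    """Queue local content item changes while offline and flush them to the
    content management system once a network connection returns."""

    def __init__(self, upload_fn):
        self._upload = upload_fn        # callable that talks to the server
        self._pending = deque()
        self.online = False

    def record_change(self, change):
        self._pending.append(change)
        if self.online:
            self.flush()

    def flush(self):
        while self._pending:
            self._upload(self._pending.popleft())

client = SyncClient(upload_fn=lambda change: print("synced:", change))
client.record_change({"op": "modify", "path": "/Photos/IMG_0001.JPG"})  # queued offline
client.online = True
client.flush()                                                          # sent on reconnect
```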
  • A user may also view or manipulate content via a web interface generated and served by content management interface module 154. For example, the user may navigate in a web browser to a web address provided by content management system 104. Changes or updates to content in data store 118 made through the web interface, such as uploading a new version of a content item, may be propagated back to other user devices 102 associated with the user's account. For example, multiple user devices 102, each with their own client software, may be associated with a single account, and content items in the account may be synchronized between each of user devices 102.
  • Content management system 104 may include sharing module 126 for managing sharing content and/or collections of content publicly or privately. Sharing module 126 may manage sharing independently or in conjunction with counterpart sharing module 152 a, located on user device 102 a, and sharing module 152 b located on user device 102 b (collectively sharing modules 152). Sharing content publicly may include making the content item and/or the collection accessible from any device in network communication with content management system 104. Sharing content privately may include linking a content item and/or a collection in data store 118 with two or more user accounts so that each user account has access to the content item. The sharing may be performed in a platform independent manner. That is, the content may be shared across multiple user devices 102 of varying type, capabilities, operating systems, etc. For example, one or more share links may be provided to a user, or a contact of a user, to access a shared content item. The content may also be shared across varying types of user accounts. In particular, the sharing module 126 may be used with collections module 124 to allow sharing of a virtual collection with another user or user account. A virtual collection may be a collection of content identifiers that may be stored in various locations within content item systems 108 of user device 102 and/or stored remotely at content management system 104.
  • In some embodiments, the virtual collection for an account with a content management system may correspond to a collection of one or more identifiers for content items (e.g., identifying content items in storage). The virtual collection may be created with collections module 124 by selecting from existing content items stored and/or managed by the content management system and associating the existing content items within data storage (e.g., associating storage locations, content identifiers, or addresses of stored content items) with the virtual collection. By associating existing content items with the virtual collection, a content item may be designated as part of the virtual collection without having to store (e.g., copy and paste the content item to a directory) the content item in another location within data storage in order to place the content item in the collection.
  • In some embodiments, content management system 104 may be configured to maintain a content directory or a database table/entity for content items where each entry or row identifies the location of each content item in data store 118. In some embodiments, a unique or a nearly unique content identifier may be stored for each content item stored in data store 118.
  • In some embodiments, metadata may be stored for each content item. For example, metadata may include a content path that may be used to identify the content item. The content path may include the name of the content item and a content item hierarchy associated with the content item (e.g., the path for storage locally within a user device 102). Content management system 104 may use the content path to present the content items in the appropriate content item hierarchy in a user interface with a traditional hierarchy view. A content pointer that identifies the location of the content item in data store 118 may also be stored with the content identifier. For example, the content pointer may include the exact storage address of the content item in memory. In some embodiments, the content pointer may point to multiple locations, each of which contains a portion of the content item.
  • In addition to a content path and content pointer, a content item entry/database table row in a content item database entity may also include a user account identifier that identifies the user account that has access to the content item. In some embodiments, multiple user account identifiers may be associated with a single content entry indicating that the content item has shared access by the multiple user accounts.
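  • Pulling together the fields described in the last few paragraphs, a content directory row might be modeled roughly as below; the field names and example values are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContentEntry:
    """One row of the content directory described above (illustrative schema)."""
    content_id: str                 # unique or nearly unique identifier
    content_path: str               # name plus hierarchy, e.g. "/Photos/IMG_0001.JPG"
    content_pointers: List[str]     # storage locations; may span several chunks
    account_ids: List[str] = field(default_factory=list)   # accounts with access
    url_flag: bool = False          # has a share URL ever been generated?
    url_active: bool = False        # should that URL currently resolve?

entry = ContentEntry(
    content_id="c0ffee01",
    content_path="/Photos/IMG_0001.JPG",
    content_pointers=["block-store://shard-7/0xA21F"],
    account_ids=["account-123"],
)
```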
  • To share a content item privately, sharing module 126 may be configured to add a user account identifier to the content entry or database table row associated with the content item, thus granting the added user account access to the content item. Sharing module 126 may also be configured to remove user account identifiers from a content entry or database table rows to restrict a user account's access to the content item. The sharing module 126 may also be used to add and remove user account identifiers to a database table for virtual collections.
  • To share content publicly, sharing module 126 may be configured to generate a custom network address, such as a uniform resource locator (“URL”), which allows any web browser to access the content in content management system 104 without any authentication. To accomplish this, sharing module 126 may be configured to include content identification data in the generated URL, which may later be used to properly identify and return the requested content item. For example, sharing module 126 may be configured to include the user account identifier and the content path in the generated URL. Upon selection of the URL, the content identification data included in the URL may be sent to content management system 104 which may use the received content identification data to identify the appropriate content entry and return the content item associated with the content entry.
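  • A bare-bones sketch of embedding content identification data in a share URL follows; the host name is a placeholder, and a production system would typically sign or otherwise protect the token rather than rely on plain base64.

```python
import base64
import json

BASE_SHARE_URL = "https://cms.example.com/s/"   # hypothetical host

def generate_share_url(account_id, content_path):
    """Embed content identification data (account id + content path) in a
    share URL so the server can resolve the content entry without requiring
    the visitor to authenticate."""
    payload = json.dumps({"account": account_id, "path": content_path})
    token = base64.urlsafe_b64encode(payload.encode()).decode().rstrip("=")
    return BASE_SHARE_URL + token

print(generate_share_url("account-123", "/Photos/IMG_0001.JPG"))
```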
  • To share a virtual collection publicly, sharing module 126 may be configured to generate a custom network address, such as a uniform resource locator (URL), which allows any web browser to access the content in content management system 104 without any authentication. To accomplish this, sharing module 126 may be configured to include collection identification data in the generated URL, which may later be used to properly identify and return the requested content item. For example, sharing module 126 may be configured to include the user account identifier and the collection identifier in the generated URL. Upon selection of the URL, the content identification data included in the URL may be sent to content management system 104 which may use the received content identification data to identify the appropriate content entry or database row and return the content item associated with the content entry or database row.
  • In addition to generating the URL, sharing module 126 may also be configured to record that a URL to the content item has been created. In some embodiments, the content entry associated with a content item may include a URL flag indicating whether a URL to the content item has been created. For example, the URL flag may be a Boolean value initially set to 0 or “false” to indicate that a URL to the content item has not been created. Sharing module 126 may be configured to change the value of the flag to 1 or “true” after generating a URL to the content item.
  • In some embodiments, sharing module 126 may also be configured to deactivate a generated URL. For example, each content entry may also include a URL active flag indicating whether the content should be returned in response to a request from the generated URL. For example, sharing module 126 may be configured to only return a content item requested by a generated link if the URL active flag is set to 1 or true. Changing the value of the URL active flag or Boolean value may easily restrict access to a content item or a collection for which a URL has been generated. This may allow a user to restrict access to the shared content item without having to move the content item or delete the generated URL. Likewise, sharing module 126 may reactivate the URL by again changing the value of the URL active flag to 1 or true. A user may thus easily restore access to the content item without the need to generate a new URL.
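  • The flag handling described above reduces to a pair of toggles plus a check at request time, roughly as follows (the dictionary keys are illustrative):

```python
def deactivate_share_url(content_entry):
    """Stop a generated URL from resolving without deleting it; flipping the
    flag back restores access, as described above."""
    content_entry["url_active"] = False

def reactivate_share_url(content_entry):
    content_entry["url_active"] = True

def resolve_share_request(content_entry):
    # Only return the content item when both flags permit it.
    if content_entry.get("url_flag") and content_entry.get("url_active"):
        return content_entry["content_id"]
    return None   # treated as "not found" by the web front end

entry = {"content_id": "c0ffee01", "url_flag": True, "url_active": True}
deactivate_share_url(entry)
assert resolve_share_request(entry) is None
```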
  • EXEMPLARY SYSTEMS
  • In exemplary embodiments of the present invention, any suitable programming language may be used to implement the routines of particular embodiments including C, C++, Java, JavaScript, Python, Ruby, CoffeeScript, assembly language, etc. Different programming techniques may be employed, such as procedural or object-oriented techniques. The routines may execute on a single processing device or multiple processors. Although the steps, operations, or computations may be presented in a specific order, this order may be changed in different particular embodiments. In some particular embodiments, multiple steps shown as sequential in this specification may be performed at the same time.
  • Particular embodiments may be implemented in a computer-readable storage device or non-transitory computer readable medium for use by or in connection with the instruction execution system, apparatus, system, or device. Particular embodiments may be implemented in the form of control logic in software or hardware or a combination of both. The control logic, when executed by one or more processors, may be operable to perform that which is described in particular embodiments.
  • Particular embodiments may be implemented by using a programmed general purpose digital computer, application specific integrated circuits, programmable logic devices, field programmable gate arrays, or optical, chemical, biological, quantum, or nanoengineered systems, components, and mechanisms. In general, the functions of particular embodiments may be achieved by any means as is known in the art. Distributed, networked systems, components, and/or circuits may be used. Communication, or transfer, of data may be wired, wireless, or by any other means.
  • It will also be appreciated that one or more of the elements depicted in the drawings/figures may also be implemented in a more separated or integrated manner, or even removed or rendered as inoperable in certain cases, as is useful in accordance with a particular application. It is also within the spirit and scope to implement a program or code that may be stored in a machine-readable medium, such as a storage device, to permit a computer to perform any of the methods described above.
  • As used in the description herein and throughout the claims that follow, “a”, “an”, and “the” includes plural references unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
  • While there have been described methods for dynamically selecting and sending content items to a user device based on various network and device conditions, it is to be understood that many changes may be made therein without departing from the spirit and scope of the invention. Insubstantial changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalently within the scope of the claims. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements. The described embodiments of the invention are presented for the purpose of illustration and not of limitation.

Claims (20)

What is claimed is:
1. A method, comprising:
receiving, from a first client device associated with a first user and at a content management system, a request to create a shared virtual space, the request comprising an image, a user-supplied comment, and a user identifier associated with a second user;
providing access to the shared virtual space to the first client device and a second client device associated with the second user;
based on granting the access to the second client device, providing, to the second client device, a graphical user interface for interacting with the shared virtual space, the graphical user interface comprising:
an image display area for displaying the image;
a comment display area for displaying the user-supplied comment; and
a send message element to send new comments;
receiving, from the second client device, a new comment related to the shared virtual space based on the second user interacting with the send message element within the graphical user interface; and
associating the new comment with the shared virtual space.
2. The method of claim 1, further comprising providing the new comment to the first client device based on the first client device accessing the shared virtual space.
3. The method of claim 1, further comprising:
providing an add images element within the graphical user interface on the second client device;
receiving, from the second client device, one or more images based on the second user interacting with the add images element within the graphical user interface; and
associating the one or more images with the shared virtual space.
4. The method of claim 3, further comprising providing the one or more images to the first client device based on the first client device accessing the shared virtual space.
5. The method of claim 1, further comprising:
creating a virtual chat stream that comprises the user-supplied comment and the new comment; and
associating the virtual chat stream with the shared virtual space to allow the first user and the second user to communicate via text comments from within the virtual space.
6. The method of claim 1, further comprising:
providing a share image button within the graphical user interface on the second client device;
receiving, from the second client device, a user identifier associated with a third user of the content management system based on the second user interacting with the share image button within the graphical user interface; and
providing, to a third client device associated with the third user, access to the image.
7. The method of claim 6, wherein providing the third client device access to the image comprises providing the third client device access to the shared virtual space.
8. A non-transitory computer readable medium comprising instructions that, when executed by at least one processor, cause a computing device to:
receive, from a first client device associated with a first user and at a content management system, a request to create a shared virtual space, the request comprising a user identifier associated with a second user, an image, and a user-supplied comment;
provide, to a second client device associated with the second user, a graphical user interface for interacting with the shared virtual space, the graphical user interface comprising:
an image display area for displaying the image;
a comment display area for displaying the user-supplied comment; and
a send message element to send new comments;
receive, from the second client device, a new comment related to the shared virtual space based on the second user interacting with the send message element within the graphical user interface; and
associate the new comment with the shared virtual space.
9. The non-transitory computer readable medium of claim 8, further comprising instructions that, when executed by the at least one processor, cause the computing device to provide the new comment to the first client device based on the first client device accessing the shared virtual space.
10. The non-transitory computer readable medium of claim 8, further comprising instructions that, when executed by the at least one processor, cause the computing device to provide an add images element within the graphical user interface on the second client device, wherein a user interaction with the add images element causes the second client device to display a list of images stored on the second client device.
11. The non-transitory computer readable medium of claim 8, further comprising instructions that, when executed by the at least one processor, cause the computing device to:
provide an add images element within the graphical user interface on the second client device;
receive, from the second client device, one or more images based on the second user interacting with the add images element within the graphical user interface; and
provide the one or more images to the first client device based on the first client device accessing the shared virtual space.
12. The non-transitory computer readable medium of claim 11, further comprising instructions that, when executed by the at least one processor, cause the computing device to:
create a virtual chat stream that comprises the user-supplied comment and the new comment; and
provide access to the virtual chat stream to the first client device and the second client device from within the virtual space.
13. The non-transitory computer readable medium of claim 8, further comprising instructions that, when executed by the at least one processor, cause the computing device to:
provide a share image button within the graphical user interface on the second client device;
receive, from the second client device, a user identifier associated with a third user of the content management system based on the second user interacting with the share image button within the graphical user interface; and
provide, to a third client device associated with the third user, access to the shared virtual space.
14. The non-transitory computer readable medium of claim 8, further comprising instructions that, when executed by the at least one processor, cause the computing device to provide, within the graphical user interface, a user icon in association with the image within the image display area, wherein the user icon corresponds to the first user.
15. A system, comprising:
at least one processor; and
a non-transitory computer readable medium comprising instructions that, when executed by the at least one processor, cause the system to:
receive, from a first client device associated with a first user and at a content management system, a request to create a shared virtual space, the request comprising a user identifier associated with a second user, an image, and a user-supplied comment;
provide, to a second client device associated with the second user, a graphical user interface for interacting with the shared virtual space, the graphical user interface comprising:
an image display area for displaying the image;
a comment display area for displaying the user-supplied comment; and
a send message element to send new comments;
receive, from the second client device, a new comment related to the shared virtual space based on the second user interacting with the send message element within the graphical user interface; and
associate the new comment with the shared virtual space.
16. The system of claim 15, further comprising instructions that, when executed by the at least one processor, cause the system to provide the new comment to the first client device based on the first client device accessing the shared virtual space.
17. The system of claim 15, further comprising instructions that, when executed by the at least one processor, cause the system to:
create a virtual chat stream that comprises the user-supplied comment and the new comment; and
provide access to the virtual chat stream to the first client device and the second client device from within the virtual space.
18. The system of claim 15, further comprising instructions that, when executed by the at least one processor, cause the system to:
provide an add images element within the graphical user interface on the second client device;
receive, from the second client device, one or more images based on the second user interacting with the add images element within the graphical user interface; and
provide the one or more images to the first client device based on the first client device accessing the shared virtual space.
19. The system of claim 15, further comprising instructions that, when executed by the at least one processor, cause the system to:
provide a share image button within the graphical user interface on the second client device;
receive, from the second client device, a user identifier associated with a third user of the content management system based on the second user interacting with the share image button within the graphical user interface; and
provide, to a third client device associated with the third user, access to the shared virtual space.
20. The system of claim 15, further comprising instructions that, when executed by the at least one processor, cause the system to provide, within the graphical user interface, a user icon in association with the image within the image display area of the graphical user interface, wherein the user icon corresponds to the first user.
US17/138,379 (US20210117469A1, en), Abandoned. Priority date 2014-02-27; filed 2020-12-30. Systems and methods for selecting content items to store and present locally on a user device.

Priority Applications (1)

US17/138,379, priority date 2014-02-27, filed 2020-12-30: Systems and methods for selecting content items to store and present locally on a user device

Applications Claiming Priority (3)

US201461945808P, priority date 2014-02-27, filed 2014-02-27
US14/247,468 (US10885104B2), priority date 2014-02-27, filed 2014-04-08: Systems and methods for selecting content items to store and present locally on a user device
US17/138,379 (US20210117469A1), priority date 2014-02-27, filed 2020-12-30: Systems and methods for selecting content items to store and present locally on a user device

Related Parent Applications (1)

US14/247,468 (US10885104B2, continuation), priority date 2014-02-27, filed 2014-04-08: Systems and methods for selecting content items to store and present locally on a user device

Publications (1)

US20210117469A1, published 2021-04-22

Family

ID=53882408

Family Applications (3)

US14/247,468 (US10885104B2), Active 2036-08-11, priority date 2014-02-27, filed 2014-04-08: Systems and methods for selecting content items to store and present locally on a user device
US17/138,354 (US20210117468A1), Pending, priority date 2014-02-27, filed 2020-12-30: Systems and methods for selecting content items to store and present locally on a user device
US17/138,379 (US20210117469A1), Abandoned, priority date 2014-02-27, filed 2020-12-30: Systems and methods for selecting content items to store and present locally on a user device

Country Status (1)

US (3): US10885104B2 (en)
Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5859771B2 (en) * 2011-08-22 2016-02-16 ソニー株式会社 Information processing apparatus, information processing system information processing method, and program
KR102213490B1 (en) 2013-09-03 2021-02-08 삼성전자주식회사 Electronic device and method for controlling screen
US20150286897A1 (en) * 2014-04-03 2015-10-08 Microsoft Corporation Automated techniques for photo upload and selection
US20160124994A1 (en) * 2014-10-31 2016-05-05 Pixcon Systems and methods for uploading and ranking photographs
EP3274878A1 (en) * 2015-09-28 2018-01-31 Google LLC Sharing images and image albums over a communication network
WO2017064891A1 (en) * 2015-10-13 2017-04-20 ソニー株式会社 Information processing system, information processing method, and storage medium
US20170351419A1 (en) * 2016-06-07 2017-12-07 Bounce Exchange, Inc. Systems and methods of dynamically providing information at detection of scrolling operations
KR102479360B1 (en) 2016-06-15 2022-12-20 삼성전자 주식회사 Method and apparatus for providing augmented reality service
CN110770717B (en) 2017-05-17 2024-04-16 谷歌有限责任公司 Automatic image sharing with designated users over a communication network
US11488164B2 (en) * 2017-12-13 2022-11-01 Mastercard International Incorporated Computerized methods and computer systems for verification of transactions
US11356605B1 (en) * 2021-05-10 2022-06-07 Samsara Inc. Dual-stream video management

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080052945A1 (en) * 2006-09-06 2008-03-06 Michael Matas Portable Electronic Device for Photo Management
US7369164B2 (en) * 2003-04-11 2008-05-06 Eastman Kodak Company Using favorite digital images to organize and identify electronic albums
US20100150407A1 (en) * 2008-12-12 2010-06-17 At&T Intellectual Property I, L.P. System and method for matching faces
US20120124517A1 (en) * 2010-11-15 2012-05-17 Landry Lawrence B Image display device providing improved media selection
US20140157138A1 (en) * 2012-11-30 2014-06-05 Google Inc. People as applications
US20150013016A1 (en) * 2013-07-03 2015-01-08 Facebook, Inc. Privacy model for shared collections of content on a social networking system
US20160334967A1 (en) * 2012-06-29 2016-11-17 Qualcomm Incorporated Sharing of user interface objects via a shared space
US9773228B2 (en) * 2012-11-02 2017-09-26 Facebook, Inc. Systems and methods for sharing images in a social network
US10289661B2 (en) * 2012-09-12 2019-05-14 Flipboard, Inc. Generating a cover for a section of a digital magazine

Family Cites Families (154)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4475172A (en) * 1978-05-30 1984-10-02 Bally Manufacturing Corporation Audio/visual home computer and game apparatus
US5321520A (en) 1992-07-20 1994-06-14 Automated Medical Access Corporation Automated high definition/resolution image storage, retrieval and transmission system
US6122411A (en) * 1994-02-16 2000-09-19 Apple Computer, Inc. Method and apparatus for storing high and low resolution images in an imaging device
GB2295936B (en) 1994-12-05 1997-02-05 Microsoft Corp Progressive image transmission using discrete wavelet transforms
US5920327A (en) 1995-06-06 1999-07-06 Microsoft Corporation Multiple resolution data display
US6545687B2 (en) 1997-01-09 2003-04-08 Canon Kabushiki Kaisha Thumbnail manipulation using fast and aspect ratio zooming, compressing and scaling
US6182127B1 (en) 1997-02-12 2001-01-30 Digital Paper, Llc Network image view server using efficent client-server tilting and caching architecture
US5983263A (en) * 1998-01-02 1999-11-09 Intel Corporation Method and apparatus for transmitting images during a multimedia teleconference
JPH11203359A (en) 1998-01-14 1999-07-30 Fuji Photo Film Co Ltd Network photo service system
US6144996A (en) 1998-05-13 2000-11-07 Compaq Computer Corporation Method and apparatus for providing a guaranteed minimum level of performance for content delivery over a network
US6496206B1 (en) 1998-06-29 2002-12-17 Scansoft, Inc. Displaying thumbnail images of document pages in an electronic folder
US6281874B1 (en) 1998-08-27 2001-08-28 International Business Machines Corporation Method and system for downloading graphic images on the internet
US6563517B1 (en) 1998-10-02 2003-05-13 International Business Machines Corp. Automatic data quality adjustment to reduce response time in browsing
US7042583B1 (en) * 1999-04-12 2006-05-09 Eastman Kodak Company Techniques for acquiring a parent multimedia asset (digital negative) from any of a plurality of multiply modified child multimedia assets
US6633608B1 (en) * 1999-05-27 2003-10-14 Sarnoff Corporation Method and apparatus for adapting memory resource utilization in an information stream decoder
US6738155B1 (en) * 1999-07-30 2004-05-18 Banta Corporation System and method of providing publishing and printing services via a communications network
US7536561B2 (en) * 1999-10-15 2009-05-19 Ebrary, Inc. Method and apparatus for improved information transactions
US6549674B1 (en) * 2000-10-12 2003-04-15 Picsurf, Inc. Image compression based on tiled wavelet-like transform using edge and non-edge filters
WO2001069585A1 (en) 2000-03-15 2001-09-20 Picsurf, Inc. System and method for efficient transmission and display of image details by re-usage of compressed data
US20020051065A1 (en) 2000-04-26 2002-05-02 Nikon Corporation Recording medium for data file management, apparatus for data file management, handling apparatus for image data, and image capturing system
US7076097B2 (en) 2000-05-29 2006-07-11 Sony Corporation Image processing apparatus and method, communication apparatus, communication system and method, and recorded medium
US7032179B2 (en) * 2000-06-12 2006-04-18 Peer Image, Inc. System for creating on a computer display screen composite images from diverse sources
US7653706B2 (en) * 2000-07-19 2010-01-26 Akamai Technologies, Inc. Dynamic image delivery system
US20040133924A1 (en) * 2000-12-15 2004-07-08 Wilkins David C. Techniques for syncronizing any of a plurality of associated multimedia assets in a distributed system
EP1354275A4 (en) 2001-01-26 2011-04-27 Eastman Kodak Co Method and apparatus for dynamic optimization and network delivery of multimedia content
US7154621B2 (en) 2001-03-20 2006-12-26 Lightsurf Technologies, Inc. Internet delivery of digitized photographs
US7000188B1 (en) * 2001-03-29 2006-02-14 Hewlett-Packard Development Company, L.P. System and method for intelligently selecting media through a simplified user interface
WO2002082799A2 (en) 2001-04-03 2002-10-17 Emblaze Systems, Ltd. A method for selective image acquisition and transmission
JP2005501310A (en) 2001-05-02 2005-01-13 ビットストリーム インコーポレーティッド Scaling method and / or method and system for displaying an information medium in a specific direction
US20030020749A1 (en) 2001-07-10 2003-01-30 Suhayya Abu-Hakima Concept-based message/document viewer for electronic communications and internet searching
US8549434B2 (en) 2001-10-18 2013-10-01 Microsoft Corporation Method for graphical representation of a content collection
US8417599B2 (en) 2001-12-06 2013-04-09 John Wirth, Jr. Method and system for browsing and ordering from catalogs via the internet
KR100465289B1 (en) 2002-04-22 2005-01-13 삼성전자주식회사 Device and method for displaying thumbnail screen in wireless terminal having a camera
JP2003331047A (en) 2002-05-16 2003-11-21 Canon Inc System, apparatus and method for processing information, program for executing the method by computer, and storage medium stored with the program computer- readably
US6898601B2 (en) * 2002-05-23 2005-05-24 Phochron, Inc. System and method for digital content processing and distribution
US7764308B2 (en) 2002-05-27 2010-07-27 Nikon Corporation Image transmission system, image relay apparatus, and electronic image device
US7574653B2 (en) 2002-10-11 2009-08-11 Microsoft Corporation Adaptive image formatting control
US20040175764A1 (en) 2003-01-06 2004-09-09 Hiroto Nishiyama Image processing apparatus, image processing program, recording medium, and image processing method
US7930434B2 (en) 2003-03-05 2011-04-19 Microsoft Corporation System and method for managing communication and/or storage of image data
JP4398669B2 (en) 2003-05-08 2010-01-13 シャープ株式会社 Mobile phone equipment
CN1860788B (en) 2003-05-16 2010-06-02 谷歌公司 Methods and systems for image sharing over a network
GB2404546B (en) * 2003-07-25 2005-12-14 Purple Interactive Ltd A method of organising and displaying material content on a display to a viewer
JP4239090B2 (en) * 2004-01-08 2009-03-18 富士フイルム株式会社 File management program
US7614075B2 (en) * 2004-08-13 2009-11-03 Microsoft Corporation Dynamically generating video streams for user interfaces
JP2006085259A (en) 2004-09-14 2006-03-30 Konica Minolta Photo Imaging Inc Image processor, image processing method, and program
US20060114820A1 (en) 2004-11-26 2006-06-01 Agfa Inc. System and method for caching and fetching data
US20060173974A1 (en) 2005-02-02 2006-08-03 Victor Tang System and method for providing mobile access to personal media
JP2006244221A (en) 2005-03-04 2006-09-14 Konica Minolta Photo Imaging Inc Image processing method, image processor and image processing program
US20060242163A1 (en) 2005-04-22 2006-10-26 Microsoft Corporation Local thumbnail cache
US9942511B2 (en) * 2005-10-31 2018-04-10 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US9621749B2 (en) * 2005-06-02 2017-04-11 Invention Science Fund I, Llc Capturing selected image objects
GB2426136B (en) 2005-05-11 2008-10-01 Idan Zuta Messaging system and method
US20060282788A1 (en) * 2005-06-10 2006-12-14 Nokia Corporation System and method for creating and utilizing context-sensitive popularity data
US8139642B2 (en) * 2005-08-29 2012-03-20 Stmicroelectronics S.R.L. Method for encoding signals, related systems and program product therefor
US8775662B2 (en) 2005-09-27 2014-07-08 Blackberry Limited System and method for progressive delivery of multimedia objects
US8185591B1 (en) * 2005-09-30 2012-05-22 Google Inc. Systems and methods for viewing email attachments as thumbnails
EP1793581A1 (en) 2005-12-05 2007-06-06 Microsoft Corporation Automatic selection of images for transfer depending on connection characteristics
KR100782827B1 (en) * 2005-12-12 2007-12-06 삼성전자주식회사 Image forming method and apparatus of printing image using data input from image acquisition device, computer readable recording medium storing program for executing the method
US9367832B2 (en) * 2006-01-04 2016-06-14 Yahoo! Inc. Synchronizing image data among applications and devices
WO2007084867A2 (en) * 2006-01-13 2007-07-26 Yahoo! Inc. Method and system for online remixing of digital multimedia
GB0608071D0 (en) 2006-04-24 2006-05-31 Pandora Int Ltd Method of and apparatus for image serving
WO2007139905A2 (en) * 2006-05-25 2007-12-06 Panraven, Inc. Life event recording system
US7797722B2 (en) 2006-05-26 2010-09-14 Sony Corporation System and method for content delivery
KR101228080B1 (en) * 2006-07-25 2013-02-01 삼성전자주식회사 Image forming apparatus capable of resizing image and method for resizing of image thereof
EP2090969A4 (en) * 2006-11-30 2013-01-09 Nec Corp Information selection support device, information selection support method, and program
JP4662071B2 (en) 2006-12-27 2011-03-30 富士フイルム株式会社 Image playback method
US20120331371A1 (en) 2007-01-23 2012-12-27 Larson Timothy M System and Method for Yearbook Creation
WO2008119129A1 (en) 2007-04-02 2008-10-09 Pip Technology Pty Ltd Systems methods and apparatuses for providing applications style functionality to a user
US7889949B2 (en) * 2007-04-30 2011-02-15 Microsoft Corporation Joint bilateral upsampling
US7778973B2 (en) * 2007-05-18 2010-08-17 Tat Kuen Choi System, method, and program for sharing photos via the internet
US8234558B2 (en) * 2007-06-22 2012-07-31 Apple Inc. Adaptive artwork for bandwidth- and/or memory-limited devices
US8161179B2 (en) * 2007-06-28 2012-04-17 Apple Inc. Generating low resolution user interfaces for transmission to mobile devices
US9131078B2 (en) * 2007-07-27 2015-09-08 Lagavulin Limited Apparatuses, methods, and systems for a portable, image-processing transmitter
US10318110B2 (en) 2007-08-13 2019-06-11 Oath Inc. Location-based visualization of geo-referenced context
US20090089448A1 (en) 2007-09-28 2009-04-02 David Sze Mobile browser with zoom operations using progressive image download
US7818355B2 (en) 2007-11-07 2010-10-19 Mills Brendon W System and method for managing content
US20090183060A1 (en) 2008-01-12 2009-07-16 David Heller Unified Media Presentation for Media Devices
JP2009175227A (en) 2008-01-22 2009-08-06 Toshiba Corp Portable terminal device
FR2929687B1 (en) 2008-04-03 2010-12-31 Air Liquide COMBUSTION TOOL COMPRISING A CLOSURE BLOCK AND AN INJECTOR, ITS ASSEMBLY AND OVEN EQUIPPED WITH SAID TOOL
US8208006B2 (en) 2008-04-10 2012-06-26 Research In Motion Limited Method and system for progressive delivery and synchronization of discrete content in rich media services
EP2283472B1 (en) * 2008-05-05 2021-03-03 Iomniscient Pty Ltd A system and method for electronic surveillance
US8078230B2 (en) 2008-06-27 2011-12-13 Motorola Mobility, Inc. Method and apparatus for providing a prioritized list of display devices for display of a media file
CN101667185B (en) 2008-09-05 2012-10-17 深圳富泰宏精密工业有限公司 Mobile device and fast display method of image thereof
US8478876B2 (en) 2008-09-29 2013-07-02 Infosys Technologies Limited System and method for dynamic management and distribution of data in a data network
EP2180664A1 (en) 2008-10-22 2010-04-28 Vivendi Mobile Entertainment System and method for accessing multi-media content via a mobile terminal
US20110211040A1 (en) * 2008-11-05 2011-09-01 Pierre-Alain Lindemann System and method for creating interactive panoramic walk-through applications
US8605348B2 (en) * 2008-12-30 2013-12-10 Konica Minolta Laboratory U.S.A., Inc. Systems and methods for rendering resizable image including resizable mask
US20110010629A1 (en) * 2009-07-09 2011-01-13 Ibm Corporation Selectively distributing updates of changing images to client devices
JP5504722B2 (en) 2009-07-15 2014-05-28 富士ゼロックス株式会社 Information processing apparatus and computer program
US20110125755A1 (en) 2009-11-23 2011-05-26 Ashish Kaila Systems and methods for thumbnail management
WO2011080497A1 (en) * 2009-12-31 2011-07-07 Vuture Ltd Systems and methods for processing online and print material
US20110197123A1 (en) * 2010-02-10 2011-08-11 Holden Caine System and Method for Linking Images Between Websites to Provide High-Resolution Images From Low-Resolution Websites
JP5366855B2 (en) * 2010-02-16 2013-12-11 Fujifilm Corporation Image processing method and apparatus, and program
EP2618562B1 (en) * 2010-02-19 2017-11-15 EchoStar Technologies L.L.C. System for searching for online versions of broadcast items
US8714243B2 (en) 2010-03-15 2014-05-06 Weatherford/Lamb, Inc. Methods and apparatus relating to expansion tools for tubular strings
US8515255B2 (en) 2010-04-24 2013-08-20 Walker Digital, LLC Systems and methods for enhancing media with supplemental content
WO2011152149A1 (en) * 2010-06-03 2011-12-08 NEC Corporation Region recommendation device, region recommendation method, and recording medium
JP2012008643A (en) 2010-06-22 2012-01-12 Fuji Xerox Co Ltd Document processing device, reduced image generation device, reduced image display device and program
US8514246B2 (en) * 2010-06-30 2013-08-20 Eastman Kodak Company Method for image rendering in a computer network
US8296359B2 (en) 2010-07-12 2012-10-23 Opus Medicus, Inc. Systems and methods for networked, in-context, high resolution image viewing
US20120011568A1 (en) 2010-07-12 2012-01-12 CME Advantage, Inc. Systems and methods for collaborative, networked, in-context, high resolution image viewing
US9189070B2 (en) 2010-09-24 2015-11-17 Sharp Kabushiki Kaisha Content display device, content display method, portable terminal, program, and recording medium
JP2012069009A (en) 2010-09-24 2012-04-05 Sharp Corp Content display device, display method, program, and recording medium
US20120102431A1 (en) * 2010-10-26 2012-04-26 Marc Krolczyk Digital media frame providing customized content
US8566370B2 (en) 2010-12-15 2013-10-22 Verizon Patent And Licensing Inc. Ranking media content for cloud-based searches
US9251855B2 (en) * 2011-01-28 2016-02-02 Apple Inc. Efficient media processing
US8811701B2 (en) * 2011-02-23 2014-08-19 Siemens Aktiengesellschaft Systems and method for automatic prostate localization in MR images using random walker segmentation initialized via boosted classifiers
WO2012154670A1 (en) * 2011-05-09 2012-11-15 Google Inc. Unified file display
US9201895B2 (en) 2011-06-03 2015-12-01 Apple Inc. Management of downloads from a network-based digital data repository based on network performance
ES2953667T3 (en) * 2011-06-16 2023-11-15 GE Video Compression LLC Entropy coding of motion vector differences
US20120324397A1 (en) * 2011-06-20 2012-12-20 Tabb Alan Patz System and method for wireless interaction with medical image data
CN102855260B (en) 2011-07-01 2017-06-06 Tencent Technology (Shenzhen) Co., Ltd. Method and system for processing pictures
US8934015B1 (en) * 2011-07-20 2015-01-13 Google Inc. Experience sharing
US10706096B2 (en) * 2011-08-18 2020-07-07 Apple Inc. Management of local and remote media items
US8903909B1 (en) * 2011-09-15 2014-12-02 Google Inc. Detecting and extending engagement with stream content
US8949739B2 (en) 2011-10-28 2015-02-03 Microsoft Technology Licensing, LLC Creating and maintaining images of browsed documents
US8856141B1 (en) * 2011-12-15 2014-10-07 Google Inc. Providing posts from an extended network
US8773463B2 (en) 2012-01-20 2014-07-08 Nephosity, Inc. Systems and methods for image data management
US9330106B2 (en) 2012-02-15 2016-05-03 Citrix Systems, Inc. Selective synchronization of remotely stored content
US8898743B1 (en) * 2012-02-27 2014-11-25 Google Inc. Personal content control on media device using mobile user device
US20130239003A1 (en) * 2012-03-06 2013-09-12 Touchalbums Llc Digital album production and distribution architecture
US8594292B1 (en) * 2012-04-20 2013-11-26 West Corporation Conference call information sharing via interaction with social networking data
US20130290862A1 (en) * 2012-04-27 2013-10-31 Microsoft Corporation User-tailored search engine homepage experiences
WO2013177769A1 (en) 2012-05-30 2013-12-05 Google Inc. Browser-driven retrieval of content for webpages
US9304584B2 (en) * 2012-05-31 2016-04-05 Ca, Inc. System, apparatus, and method for identifying related content based on eye movements
CN104364754A (en) * 2012-06-12 2015-02-18 Sony Corporation Information processing device, information processing method, and program
US9152313B2 (en) * 2012-08-06 2015-10-06 Shutterfly, Inc. Unified picture access across devices
US20140068689A1 (en) 2012-08-17 2014-03-06 Flextronics AP, LLC Systems and methods for providing social media with an intelligent television
CN103748871A (en) 2012-08-17 2014-04-23 Flextronics AP, LLC Interactive channel navigation and switching
US9405500B1 (en) * 2012-09-12 2016-08-02 Marisa Cox Photo album device and system
US9274678B2 (en) * 2012-09-13 2016-03-01 Google Inc. Identifying a thumbnail image to represent a video
EP2904562A4 (en) * 2012-10-08 2016-05-25 Nokia Technologies Oy Method and apparatus for social networking service strategy based on spread simulation
US9710944B2 (en) * 2012-10-22 2017-07-18 Apple Inc. Electronic document thinning
US9367641B2 (en) * 2012-12-27 2016-06-14 Qualcomm Innovation Center, Inc. Predictive web page rendering using a scroll vector
US10649607B2 (en) * 2012-12-28 2020-05-12 Facebook, Inc. Re-ranking story content
US20140282099A1 (en) * 2013-03-14 2014-09-18 Apple Inc. Retrieval, identification, and presentation of media
US9479576B2 (en) * 2013-06-03 2016-10-25 Adobe Systems Incorporated Image session ranking
US9836472B2 (en) * 2013-06-04 2017-12-05 Apple Inc. Tagged management of stored items
JP6201438B2 (en) 2013-06-06 2017-09-27 Fujitsu Limited Content distribution method, content distribution server, and thumbnail collection program
US20150039593A1 (en) * 2013-07-31 2015-02-05 Opanga Networks, Inc. Pre-delivery of content to a user device
US20150169207A1 (en) * 2013-12-18 2015-06-18 Dropbox, Inc. Systems and methods for generating personalized account reconfiguration interfaces
US9535897B2 (en) * 2013-12-20 2017-01-03 Google Inc. Content recommendation system using a neural network language model
US20150201001A1 (en) 2014-01-13 2015-07-16 Facebook, Inc. Systems and methods for priority-based digital content downloading
US9787799B2 (en) 2014-02-27 2017-10-10 Dropbox, Inc. Systems and methods for managing content items having multiple resolutions
US9462054B2 (en) 2014-02-27 2016-10-04 Dropbox, Inc. Systems and methods for providing a user with a set of interactivity features locally on a user device
US9141709B1 (en) * 2014-11-20 2015-09-22 Microsoft Technology Licensing, LLC Relevant file identification using automated queries to disparate data storage locations
EP3073449A1 (en) * 2015-03-23 2016-09-28 Continental Automotive GmbH Apparatus and method for recording data associated with a vehicle
US10198355B2 (en) 2015-10-29 2019-02-05 Dropbox, Inc. Proving a dynamic digital content cache
US10423527B2 (en) * 2015-11-12 2019-09-24 Facebook, Inc. Memory management and image display for mobile devices
US10785286B2 (en) * 2016-05-05 2020-09-22 International Business Machines Corporation Proactive content push for enhancing real-time service delivery via cloud
US10051142B1 (en) * 2016-05-20 2018-08-14 Google LLC Adaptive display of image capacity for a camera
US10491963B1 (en) * 2016-06-28 2019-11-26 Amazon Technologies, Inc. Use video codecs to deliver images
KR20180013169A (en) 2016-07-28 2018-02-07 Samsung Electronics Co., Ltd. Method for displaying content and electronic device using the same
US20190005048A1 (en) * 2017-06-29 2019-01-03 Uber Technologies, Inc. Optimizing media presentation within an application

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7369164B2 (en) * 2003-04-11 2008-05-06 Eastman Kodak Company Using favorite digital images to organize and identify electronic albums
US20080052945A1 (en) * 2006-09-06 2008-03-06 Michael Matas Portable Electronic Device for Photo Management
US20100150407A1 (en) * 2008-12-12 2010-06-17 At&T Intellectual Property I, L.P. System and method for matching faces
US20120124517A1 (en) * 2010-11-15 2012-05-17 Landry Lawrence B Image display device providing improved media selection
US20160334967A1 (en) * 2012-06-29 2016-11-17 Qualcomm Incorporated Sharing of user interface objects via a shared space
US10289661B2 (en) * 2012-09-12 2019-05-14 Flipboard, Inc. Generating a cover for a section of a digital magazine
US9773228B2 (en) * 2012-11-02 2017-09-26 Facebook, Inc. Systems and methods for sharing images in a social network
US20140157138A1 (en) * 2012-11-30 2014-06-05 Google Inc. People as applications
US20150013016A1 (en) * 2013-07-03 2015-01-08 Facebook, Inc. Privacy model for shared collections of content on a social networking system

Also Published As

Publication number Publication date
US10885104B2 (en) 2021-01-05
US20210117468A1 (en) 2021-04-22
US20150242443A1 (en) 2015-08-27

Similar Documents

Publication Publication Date Title
US11943320B2 (en) Systems and methods for managing content items having multiple resolutions
US20210117469A1 (en) Systems and methods for selecting content items to store and present locally on a user device
US10235444B2 (en) Systems and methods for providing a user with a set of interactivity features locally on a user device
US11003327B2 (en) Systems and methods for displaying an image capturing mode and a content viewing mode
US10282056B2 (en) Sharing content items from a collection
US9942121B2 (en) Systems and methods for ephemeral eventing

Legal Events

Date Code Title Description
AS Assignment

Owner name: DROPBOX, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GRUE, ANTHONY;HAVEN, ANDREW;SCHEFF, ANDREW;SIGNING DATES FROM 20140401 TO 20140430;REEL/FRAME:054780/0297

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, NEW YORK

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:DROPBOX, INC.;REEL/FRAME:055670/0219

Effective date: 20210305

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION