
US20200388374A1 - Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results - Google Patents


Info

Publication number
US20200388374A1
Authority
US
United States
Prior art keywords
user
product
activity
personal care
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/897,316
Inventor
Melissa Ann Kreuzer
Faiz Feisal Sherman
Justin Gregory PARKER
Jonathan Michael Martin
Jonathan Livingston Joyce
Paris Nicolle Jackson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Procter and Gamble Co
Original Assignee
Procter and Gamble Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Procter and Gamble Co
Priority to US16/897,316
Assigned to THE PROCTER & GAMBLE COMPANY. Assignors: KREUZER, MELISSA ANN; JACKSON, PARIS NICOLLE; JOYCE, JONATHAN LIVINGSTON; MARTIN, JONATHAN MICHAEL; PARKER, JUSTIN GREGORY; SHERMAN, FAIZ FEISAL
Publication of US20200388374A1
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/06: Buying, selling or leasing transactions
    • G06Q30/0601: Electronic shopping [e-shopping]
    • G06Q30/0623: Item investigation
    • A: HUMAN NECESSITIES
    • A45: HAND OR TRAVELLING ARTICLES
    • A45D: HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00: Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D44/005: Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00: Measuring or testing not otherwise provided for
    • G01D21/02: Measuring two or more variables by means not covered by a single other subclass
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00: Machine learning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201: Market modelling; Market analysis; Collecting market data
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241: Advertisements
    • G06Q30/0251: Targeted advertisements
    • G06Q30/0255: Targeted advertisements based on user history
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241: Advertisements
    • G06Q30/0251: Targeted advertisements
    • G06Q30/0269: Targeted advertisements based on user profile or attribute
    • G06Q30/0271: Personalized advertisement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0282: Rating or review of business operators or products
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/06: Buying, selling or leasing transactions
    • G06Q30/0601: Electronic shopping [e-shopping]
    • G06Q30/0631: Item recommendations
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00: Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48: Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51: Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/2866: Architectures; Arrangements
    • H04L67/30: Profiles
    • H04L67/306: User profiles

Definitions

  • the present disclosure generally relates to personal care systems, and, more particularly, to a personal care assistant for identifying instances of use of personal care products and providing feedback to a user to enhance the user's experience with the personal care products.
  • home assistant devices or other computing devices collect data from network-enabled devices to enhance the users' experiences with the network-enabled devices.
  • a home assistant device may learn a user's habits based on the user's interactions with other network-enabled devices, such as smart lights, a smart TV, a smart heating and air conditioning system, etc. The home assistant device may then automatically control the network-enabled devices according to the learned habits.
  • a smart TV may provide indications of the user's watching habits to a remote server that provides recommendations on similar TV shows and movies to those the user is currently watching.
  • Such devices do not have similar ways of learning habits based on user interactions with devices which are not network-enabled, such as personal care products. While users interact with personal care products, such as makeup, shampoo, conditioner, moisturizer, hand cream, face cream, toothbrushes, mouthwash, facial cleansers, etc., on a daily basis, computing devices do not collect usage data based on users' interactions with these products to enhance the user experience. Accordingly, users do not know if they are using the products correctly and at the appropriate rate or for the appropriate amount of time.
  • a personal care system includes a personal care computing device that obtains indications of personal care products being used by a user.
  • the personal care computing device identifies a personal care product based on an obtained indication and provides user feedback to assist the user in using the personal care product.
  • the personal care computing device may also determine product use event data based on the user's interaction with the personal care product.
  • the product use event data may include identification information for the personal care product such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product is used, other personal care products used in the same time frame as the personal care product, etc.
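As a minimal sketch only, the product use event data described above might be represented as a simple record; the field names and types here are illustrative assumptions, since the description does not prescribe a schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ProductUseEvent:
    """One product use event, mirroring the fields described above.

    Field names are illustrative assumptions, not a prescribed format.
    """
    product_name: str                                      # identification information
    used_at: datetime                                      # date and/or time of the use
    duration_s: float                                      # duration of the use, in seconds
    manner: str = ""                                       # manner in which the product was used
    co_used_products: list = field(default_factory=list)   # products used in the same time frame

# Example event for a hypothetical facial cleanser use:
event = ProductUseEvent(
    product_name="facial cleanser",
    used_at=datetime(2020, 6, 9, 7, 30),
    duration_s=45.0,
    manner="applied with fingertips",
    co_used_products=["moisturizer"],
)
```

A record like this could be serialized and sent to the server computing device for storage in the user profile.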
  • the personal care computing device may provide the product use event data to a server computing device which stores historical product use event data for the user in a user profile.
  • the personal care computing device may also provide user profile data for the user to the server computing device, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product.
  • the server computing device may analyze the product use event data and the historical product use event data at several instances in time along with the user profile data to generate the user feedback. For example, the server computing device may determine that the user is using a skin care product once a week based on the product use event data.
  • the server computing device may also determine the user's age according to the user profile data, and may determine that people in the user's age group should be using the skin care product more often. Accordingly, the server computing device may generate user feedback indicating that the user should use the skin care product at least twice per week. In some scenarios, the user feedback may also include recommendations to purchase other related personal care products.
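The skin-care example above can be sketched as a frequency comparison against an age-group baseline. The baseline table, thresholds, and message wording below are all hypothetical; the description only states that the server compares observed usage to what is expected for the user's age group.

```python
from typing import Optional

# Hypothetical baselines: recommended uses per week by age range.
AGE_GROUP_BASELINE = {
    (18, 29): 2,
    (30, 49): 2,
    (50, 120): 3,
}

def uses_per_week(event_count: int, days_observed: int) -> float:
    """Average weekly usage over the observation window."""
    return event_count / (days_observed / 7)

def generate_feedback(age: int, event_count: int, days_observed: int) -> Optional[str]:
    """Compare observed frequency to the user's age-group baseline."""
    observed = uses_per_week(event_count, days_observed)
    for (lo, hi), target in AGE_GROUP_BASELINE.items():
        if lo <= age <= hi:
            if observed < target:
                return (f"Consider using this product at least {target} times "
                        f"per week (you averaged {observed:.1f}).")
            return None  # usage already meets the baseline
    return None

# A 35-year-old who used the product 4 times in 28 days (once a week):
print(generate_feedback(35, 4, 28))
```

The same comparison could feed the purchase recommendations mentioned above, e.g. by attaching related product suggestions to the feedback message.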
  • the personal care computing device may present the user feedback via a user interface on the personal care computing device or as audio feedback via a speaker.
  • the personal care computing device may provide the user feedback to the user's mobile device which may be presented via a personal care application on the mobile device.
  • the server computing device may provide the user feedback to the user's mobile device via a short message service (SMS) message, email, or push notification.
  • the personal care system collects and analyzes user data from personal care products which do not include a sensor, do not connect to the Internet, and/or do not include computing devices. Accordingly, the personal care system may digitize data from analog products.
  • a computing device for providing feedback regarding consumer habits includes a user interface, an environmental sensor, a communication interface, one or more processors, and a non-transitory computer-readable memory coupled to the one or more processors, the environmental sensor, the user interface, and the communication interface, and storing instructions thereon.
  • the instructions, when executed by the one or more processors, cause the computing device to identify, via the environmental sensor, an activity by a user within the user's dwelling related to a product, and obtain at least one of: (i) activity data for the user, the activity data related to a frequency or duration of the activity performed by the user over time or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time.
  • the instructions further cause the computing device to generate user feedback information associated with the product or related products based on at least one of: the activity data or the product use event data, and provide the user feedback information via the user interface or the communication interface to a mobile device of the user.
  • a server device for providing feedback regarding consumer habits includes one or more processors, and a non-transitory computer-readable memory coupled to the one or more processors and storing instructions thereon.
  • the instructions, when executed by the one or more processors, cause the server device to receive, at one or more time intervals, at least one of: (i) activity data for an activity performed by a user within the user's dwelling related to a product, the activity data related to a frequency or duration of the activity performed by the user over time, or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time.
  • the instructions further cause the server device to store the activity data and the product use event data in a user profile of the user and analyze at least one of: the activity data or the product use event data at the one or more time intervals to generate user feedback information associated with the product or related personal care products. Moreover, the instructions cause the server device to provide the user feedback information to a client device for presenting the user feedback information to the user.
  • a method for providing feedback regarding consumer habits includes identifying, via an environmental sensor communicatively coupled to a computing device, an activity by a user within the user's dwelling related to a product, and obtaining, by the computing device, at least one of: (i) activity data for the user, the activity data related to a frequency or duration of the activity performed by the user over time or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time.
  • the method further includes generating, by the computing device, user feedback information associated with the product or related products based on at least one of: the activity data or the product use event data.
  • the method includes providing, by the computing device, the user feedback information via a user interface or a communication interface to a mobile device of the user.
  • FIG. 1 illustrates an example personal care computing device and a personal care product
  • FIG. 2 illustrates a block diagram of an example communication system in which the personal care computing device can operate
  • FIG. 3 illustrates an example data table including user profile data
  • FIG. 4 illustrates another example data table including product use event data
  • FIG. 5 illustrates example user feedback which may be provided by the personal care system to the user
  • FIG. 6 illustrates a flow diagram of an example method for providing feedback regarding personal care products, which can be implemented in the personal care computing device.
  • FIG. 7 illustrates a flow diagram of an example method for generating the feedback regarding personal care products, which can be implemented in a server device.
  • personal care products may be used to refer to consumer products which are typically used in a bathroom, laundry room, or kitchen.
  • personal care products may include tooth care products (e.g., toothbrushes, mouthwash, dental floss, etc.), skin care products (e.g., hand cream, face cream, facial cleansers, moisturizer, etc.), cosmetic products (e.g., face makeup, eye makeup, lipstick, makeup brushes, makeup kits, makeup mirrors, etc.), hair care products (e.g., shampoo, conditioner, hair dryers, straighteners, brushes, combs, curlers, spray gels, etc.), other grooming products (e.g., razors, hair removal products, etc.), toilet paper, cleaning products (e.g., bleach, window cleaner, all-purpose cleaner, soap, toilet bowl cleaner, etc.), laundry room products (e.g., laundry detergent, stain removal products, etc.), kitchen products (e.g., plates, bowls, forks, spoons, knives, measuring cups, etc.).
  • consumer habits may refer to usage of consumer products by a user, a hygiene regimen by the user including an order in which a set of consumer products were used when the set of consumer products were used in the same time frame, an amount in which the user complies with product instructions, grooming patterns for the user, etc.
  • a personal care computing device identifies a personal care product which is being used by a user, and determines product use event data for the personal care product, such as identification information for the personal care product such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product is used, other personal care products used in the same time frame as the personal care product, etc.
  • the personal care computing device may then provide identification information for the user (e.g., a user ID, user login credentials, etc.) and the product use event data to a server device.
  • the server device may then retrieve a user profile for the user based on the identification information and update the user profile to include the product use event data.
  • the personal care computing device may also provide user profile data to the server device, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. Accordingly, the server device may update the user profile with the user profile data.
  • the user profile may include product use event data for the user at several time intervals, and the server device may analyze the product use event data over time and/or the user profile data for the user to generate user feedback information. Then the server device provides the user feedback information to the personal care computing device which presents audio feedback via a speaker or visual feedback via a user interface. In other implementations, the personal care computing device forwards the user feedback information to a client computing device of the user for presentation on the client computing device, or the server device provides the user feedback information directly to the client computing device, for example via an SMS message, email, a push notification, etc.
  • FIG. 1 illustrates various aspects of an exemplary environment implementing a personal care system 100 .
  • the personal care system 100 includes a personal care computing device 102 which may be placed in a bathroom, such as on a bathroom sink.
  • the personal care system 100 also includes one or several personal care products 104 .
  • the personal care computing device 102 includes a voice assistant having one or several microphones 106 , such as an array of microphones 106 and one or several speakers 108 , such as an array of speakers 108 .
  • the voice assistant may also include processors and a memory storing instructions for receiving and analyzing voice input and providing voice output.
  • the voice assistant included in the personal care computing device 102 may include the hardware and software components of the voice controlled assistant described in U.S. Pat. No. 9,304,736 filed on Apr. 18, 2013, incorporated by reference herein.
  • the personal care computing device 102 includes a user interface 110 for displaying information related to the personal care products, such as user feedback information regarding personal care products.
  • the user interface 110 may also present user controls for the user to provide information about herself, such as identification information (e.g., user login credentials, a user ID, biographical information, user preferences or goals regarding skin care, etc.).
  • the user interface 110 may include user controls for the user to provide information regarding the personal care products she uses, such as the names of the personal care products, how often she uses the personal care products, the manner in which she uses each personal care product, the duration of each use, etc.
  • the personal care computing device 102 may include a camera 112 for capturing video and/or images of the area within the field of view of the camera 112 . In this manner, the personal care computing device 102 may identify personal care products 104 within an image or video frame to determine that a personal care product 104 is currently in use, determine the duration of the use, etc.
  • the personal care computing device 102 may also include a communication interface (not shown) for connecting to a long-range communication network such as the Internet and for transmitting/receiving radio signals over a short-range communication network, such as NFC, Bluetooth, RFID, Wi-Fi, etc.
  • the personal care computing device 102 may include an RFID reader or an NFC reader to receive radio signals from RFID tags, NFC tags, Bluetooth Low Energy (BLE) tags, etc.
  • the personal care product 104 includes a radio identification tag (not shown), such as an RFID tag, NFC tag, BLE tag, etc., which transmits identification information for the personal care products to the RFID reader in the personal care computing device 102 .
  • the personal care computing device 102 may identify a personal care product within a communication range of the personal care computing device 102 based on the radio identification tag and may determine that the identified personal care product is being used by the user.
  • the radio identification tag may be a passive radio identification tag, such that the radio identification tag does not include an internal power source such as a battery.
  • the RFID or NFC reader within the communication range of the radio identification tag provides electromagnetic signals that energize the radio identification tag so that the radio identification tag can transmit a radio signal to the RFID or NFC reader which includes identification information for the personal care product 104 .
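Once the energized tag responds, the reader's job is to turn the raw payload into product identification data. The sketch below uses a made-up `"PCP|<sku>|<name>"` payload layout for illustration; real RFID/NFC payload formats (e.g. NDEF records or EPC codes) are structured differently.

```python
def decode_tag_payload(payload: bytes) -> dict:
    """Decode a hypothetical tag payload of the form b"PCP|<sku>|<name>".

    The "PCP" prefix, delimiter, and field layout are assumptions made
    for this sketch; they only illustrate turning a passive-tag read
    into product identification information.
    """
    prefix, sku, name = payload.decode("utf-8").split("|", 2)
    if prefix != "PCP":
        raise ValueError("not a personal care product tag")
    return {"sku": sku, "name": name}

# Example read from a hypothetical toothpaste tag:
info = decode_tag_payload(b"PCP|0042|whitening toothpaste")
```

The decoded identification information would then be attached to the product use event data described earlier.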
  • the personal care product 104 does not include a radio identification tag or any other transceiver.
  • the personal care computing device 102 identifies the personal care product 104 in other ways, such as by identifying visual features within the personal care product 104 from images or video collected by the camera 112 which can be used to identify the personal care product 104 , identifying labels, barcodes, or other text placed on the personal care product from the images or video, or obtaining an indication that the user is using the personal care product 104 via user controls on the user interface 110 or via the user's mobile device.
  • the personal care computing device 102 includes an environmental sensor for capturing environmental characteristics in the area surrounding the personal care computing device 102 , such as the bathroom, the kitchen, the laundry room, the living room, etc. of the user's dwelling.
  • the environmental sensor may be a temperature sensor, a humidity sensor, an acoustic sensor, an ultrasonic sensor, a radio antenna for example for receiving Wi-Fi or Bluetooth™ signals, a weighing scale, a wearable sensor, an air quality sensor such as a volatile organic compounds (VOC) sensor, or a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a light detection and ranging (LiDAR) sensor or an infrared (IR) sensor, each of which may be used in combination with the camera 112 to generate the 3D point cloud.
  • the acoustic sensor may include the one or several microphones 106 , such as an array of microphones 106 for detecting audio characteristics, such as the volume of sounds within the area, the frequency of the sounds within the area, the tone of the sounds within the area, and/or the directions in which the sounds came from within the area.
  • the personal care computing device 102 may identify activities being performed by the user based on the environmental sensor.
  • the personal care computing device 102 may identify activities being performed by the user based on sounds within the area, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc.
  • the activities may be related to products.
  • the shower running may be related to hair care or skin care products.
  • the washing machine running may be related to laundry room products, such as laundry detergent, stain removal products, etc.
  • Gargling may be related to tooth care products (e.g., toothbrushes, mouthwash, dental floss, etc.).
  • the personal care computing device 102 may identify activity data for each activity, such as the type of activity (e.g., shaving), the duration of the activity, the date and/or time of the activity, the frequency in which the user performs the activity over a time period (e.g., day, a week, a month), etc.
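The activity data fields above can be sketched as a record plus a simple frequency roll-up. As with the other examples, the field names and the choice of ISO weeks as the aggregation period are assumptions for illustration.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ActivityEvent:
    """One detected activity (field names are illustrative)."""
    activity_type: str      # e.g. "shaving", "shower running"
    started_at: datetime    # date and/or time of the activity
    duration_s: float       # duration of the activity, in seconds

def weekly_frequency(events, activity_type: str) -> Counter:
    """Count occurrences of one activity type per ISO (year, week)."""
    weeks = Counter()
    for e in events:
        if e.activity_type == activity_type:
            year, week, _ = e.started_at.isocalendar()
            weeks[(year, week)] += 1
    return weeks

events = [
    ActivityEvent("shaving", datetime(2020, 6, 8, 7, 0), 240.0),
    ActivityEvent("shaving", datetime(2020, 6, 10, 7, 5), 230.0),
    ActivityEvent("shower running", datetime(2020, 6, 8, 7, 10), 600.0),
]
freq = weekly_frequency(events, "shaving")  # shaving count per week
```

Aggregates like this are what the server device could analyze over time to generate user feedback information.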
  • the personal care computing device 102 may identify personal care products based on any suitable combination of:
    • visual features within the personal care products from images or video collected by the camera 112 ;
    • labels, barcodes, or other text placed on the personal care products, read from the images or video;
    • an indication that the user is using the personal care products, provided via user controls on the user interface 110 or via the user's mobile device;
    • a radio identification tag, such as an RFID tag, NFC tag, BLE tag, etc., which transmits identification information for the personal care products; and/or
    • environmental characteristics in the area surrounding the personal care computing device 102 which may be used to identify activities performed by the user that are related to the personal care products.
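One way to combine these identification signals is a simple priority order. The description lists the signals but not how they are fused, so the ordering below (explicit user input first, then tag, barcode, and visual match) is purely an assumption.

```python
def identify_product(tag_id=None, barcode=None, visual_match=None, user_input=None):
    """Pick a product identification from whichever signals are available.

    The priority order is an assumption for this sketch; a real system
    might instead weigh or cross-check the signals.
    """
    for source, value in (("user_input", user_input), ("tag", tag_id),
                          ("barcode", barcode), ("visual", visual_match)):
        if value is not None:
            return {"source": source, "product_id": value}
    return None  # no signal available; product remains unidentified

# Only a barcode was read in this example:
result = identify_product(barcode="012345678905")
```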
  • FIG. 2 illustrates an example communication system in which the personal care computing device 102 and the personal care product 104 can operate to enhance the user's experience with personal care products.
  • the personal care computing device 102 has access to a wide area communication network 200 such as the Internet via a long-range wireless communication link (e.g., a cellular link).
  • the personal care computing device 102 communicates with a server device 202 that generates user feedback information to provide to the user based on the user's interactions with her personal care products 104 .
  • the personal care computing device 102 can communicate with any number of suitable servers.
  • the personal care computing device 102 can also use a variety of arrangements, singly or in combination, to communicate with the user's personal care products 104 .
  • the personal care computing device 102 obtains identification information from the user's personal care products 104 via a short-range communication link, such as short-range radio frequency links including Bluetooth™, RFID, NFC, etc.
  • Some personal care products 104 may include a communication component 130 , such as an RFID tag, NFC tag, BLE tag, etc. Other personal care products 104 may not include the communication component 130 .
  • the personal care computing device 102 may also communicate with a client computing device 222 of the user such as a mobile device including a tablet or smartphone over a short-range communication link, such as short-range radio frequency links including Bluetooth™, WiFi (802.11 based or the like) or another type of radio frequency link, such as wireless USB.
  • the client computing device 222 may be a mobile device such as a tablet computer, a cell phone, a personal digital assistant (PDA), a smartphone, a laptop computer, a portable media player, a home phone, a pager, a wearable computing device, smart glasses, a smart watch or bracelet, a phablet, another smart device, etc.
  • the client computing device 222 may also be a desktop computer.
  • the client computing device 222 may include one or more processors 226 , a memory 228 , a communication unit (not shown) to transmit and receive data via long-range and short-range communication networks, and a user interface 232 for presenting data to the user.
  • the memory 228 may store, for example, instructions for a personal care application 230 that includes user controls for providing information regarding the user's personal care products, such as the names of the user's personal care products, the frequency, duration, and/or manner in which the user uses each personal care product, etc.
  • the personal care application 230 may also include user controls for providing user profile data such as user login credentials, a user ID, the user's name or other biographical information, an image of the user such as a before and after picture, etc.
  • the personal care application 230 may receive user feedback information to present on the user interface 232 or as voice output via a speaker.
  • the user feedback information may be received from the personal care computing device 102 via a short-range communication link, such as Bluetooth™, or from the server device 202 via a long-range communication link, such as the Internet or a cellular network.
  • the personal care computing device 102 may include one or more speakers 108, such as an array of speakers; an environmental sensor, which may include any one of, or any suitable combination of, a temperature sensor, a humidity sensor, an ultrasonic sensor, a radio antenna (for example, for receiving Wi-Fi or Bluetooth signals), a weighing scale, a wearable sensor, an air quality sensor such as a VOC sensor, and a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a LiDAR sensor or an IR sensor, each of which may be used in combination with the camera 112 to generate the 3D point cloud; and/or one or more microphones 106, such as an array of microphones.
  • the personal care computing device 102 may also include a user interface 110, a camera 112, one or more processors 114, a communication unit 116 to transmit and receive data over long-range and short-range communication networks, and a memory 118.
  • the memory 118 can store instructions of an operating system 120 and a personal care assistant application 122 .
  • the personal care assistant application 122 may obtain an indication of a personal care product 104 being used, identify the personal care product 104 based on the indication, and generate and present user feedback information to the user to assist the user with the personal care product or related personal care products via a product identification module 124 , a recommendation determination module 126 , and a control module 128 .
  • the personal care computing device 102 may obtain an indication of a personal care product 104 being used and the product identification module 124 may identify the personal care product 104 based on the obtained indication.
  • the indication of the personal care product 104 may be provided with manual input via user controls on the user interface 110 of the personal care computing device 102 .
  • the user may select the personal care product 104 from a list of personal care products included in a drop-down menu on the user interface 110 .
  • the product identification module 124 may then identify the selected personal care product 104 via the user controls.
  • the indication of the personal care product 104 may also be provided automatically, such as via a radio signal from the personal care product 104, an image or video of the personal care product 104, or environmental characteristics indicative of an activity performed by the user which is related to the personal care product 104, as described below. More specifically, the indication of the personal care product 104 may be identification information from a radio identification tag provided by the personal care product 104 to the personal care computing device 102. The product identification module 124 may then determine the personal care product 104 transmitting the radio signal based on the identification information. For example, the identification information may indicate that the personal care product 104 transmitting the radio signal is L'Oréal Paris™ Colour Riche Monos Eyeshadow.
  • the indication of the personal care product 104 may be an image or video of the area within the field of view of the camera 112 .
  • the camera 112 may periodically capture images or capture continuous video of the area in front of the camera 112 , which may include a bathroom counter or an area where a user may sit in front of a bathroom mirror.
  • the product identification module 124 may identify an object and determine a personal care product which corresponds to the object based on visual descriptors and semantic cues for the object. At least some of the visual descriptors and semantic cues for the object may be based on a product tag, a product label, a product color, a product shape, a product size, or a product logo.
  • an image or video frame may include multiple objects and the product identification module 124 may determine personal care products which correspond to each object.
  • the product identification module 124 may segment boundaries for the objects using edge detection, pixel entropy, or other image processing techniques. For example, when adjacent pixels in an image differ in intensity by more than a threshold amount, the product identification module 124 may identify the intersection between the adjacent pixels as a boundary of an object. In another example, when a cluster of pixels in the image differs in intensity by more than a threshold amount from an adjacent cluster of pixels, the product identification module 124 may identify the intersection between the adjacent clusters as a boundary of an object. In addition to performing the edge detection techniques described above to identify the boundaries of an object, the product identification module 124 may use an active contour model to refine the locations of the boundaries and further remove noise.
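The intensity-difference boundary test described above can be sketched as follows. This is a minimal illustration, not the application's implementation; the image grid, threshold value, and function names are all assumptions.

```python
# Hypothetical sketch of the edge-detection rule described above: mark a
# boundary wherever horizontally or vertically adjacent pixels differ in
# intensity by more than a threshold amount.

def boundary_pixels(image, threshold=50):
    """Return pairs of adjacent pixel coordinates that straddle a boundary."""
    rows, cols = len(image), len(image[0])
    boundaries = set()
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbours
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols:
                    if abs(image[r][c] - image[nr][nc]) > threshold:
                        boundaries.add(((r, c), (nr, nc)))
    return boundaries

# A dark 4x4 region with a bright 2x2 block in its lower-right corner.
img = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [10, 10, 200, 200],
    [10, 10, 200, 200],
]
edges = boundary_pixels(img, threshold=50)
```

In a real pipeline the detected boundary pairs would then be refined, for example with the active contour model mentioned above.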
  • the product identification module 124 may identify each of the objects in the image. For each identified object, the product identification module 124 may determine a size and shape of the object according to its boundaries. The product identification module 124 may also identify visual features within the object along with the corresponding locations of the visual features within the object. For example, a first visual feature may be located in the upper right corner of the object, a second visual feature may be located in the center of the object, etc.
  • a visual feature may include a keypoint which is a stable region within the object that is detectable regardless of blur, motion, distortion, orientation, illumination, scaling, and/or other changes in camera perspective.
  • the stable regions may be extracted from the object using a scale-invariant feature transform (SIFT), speeded up robust features (SURF), fast retina keypoint (FREAK), binary robust invariant scalable keypoints (BRISK), or any other suitable computer vision techniques.
  • keypoints may be located at high-contrast regions of the object, such as edges within the object.
  • a bounding box may be formed around a keypoint and the portion of the object created by the bounding box may be a visual feature.
  • each visual feature is encoded as a vector which may include attributes of the visual feature, such as RGB pixel values, the location of the visual feature within the object, etc.
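The vector encoding described above can be illustrated with a small sketch. The choice of attributes (mean RGB values plus the feature's location) and all names are assumptions for illustration; an actual descriptor such as SIFT would be far higher-dimensional.

```python
# Illustrative encoding of a visual feature as a flat numeric vector:
# mean RGB pixel values plus the feature's (x, y) location within the
# object. The attribute set is an assumption, not the patent's scheme.

def encode_feature(rgb_pixels, location):
    """rgb_pixels: list of (r, g, b) tuples inside the feature's
    bounding box; location: (x, y) of the feature within the object."""
    n = len(rgb_pixels)
    mean_r = sum(p[0] for p in rgb_pixels) / n
    mean_g = sum(p[1] for p in rgb_pixels) / n
    mean_b = sum(p[2] for p in rgb_pixels) / n
    return [mean_r, mean_g, mean_b, location[0], location[1]]

vec = encode_feature([(255, 0, 0), (245, 10, 0)], location=(12, 30))
```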
  • the product identification module 124 may identify semantic cues for the object, such as text displayed on the object (e.g., a product label), a tag on or adjacent to the object, a pattern or symbol on the object (e.g., a product logo), etc.
  • the product identification module 124 may apply a stroke width transform (SWT).
  • the SWT is used to find a portion of an image which includes text and filter out the remaining portions of the image which do not include text. In this manner, the text portion of the image may be converted to a text string.
  • the SWT technique may be based on an assumption that all text characters in an image have the same stroke width.
  • the pixel width of the horizontal line in the letter ‘T’ may be the same as the pixel width for the vertical line in the letter ‘T’ within the image. This width may also be the same for all other lines or curves that make up text characters within the image.
  • the product identification module 124 may identify text characters within an image by identifying several lines or curves having a same or similar width (e.g., within a threshold variance of each other). More specifically, the product identification module 124 may perform edge detection techniques within one of the objects, such as the edge detection techniques described above for boundary segmentation, to identify boundaries for lines and curves within the object. The product identification module 124 may then calculate pixel widths for each of these lines and curves based on the positions of their respective boundaries. When the pixel widths for several lines and/or curves are the same or are within a threshold variance of each other, the product identification module 124 may identify the lines and/or curves as text, and may filter out the remaining portions of the object.
  • Additional filtering steps may also be applied to identify the text characters within the image.
  • text characters may have minimum and maximum aspect ratios, such that the length of a text character does not exceed the width of the text character by more than a threshold amount. Accordingly, the identified lines and/or curves may be compared to minimum and maximum aspect ratios. If the length to width ratio of a candidate text character is outside the minimum or maximum aspect ratios, the candidate text character may be filtered out as a portion of the image which does not include text.
  • a threshold ratio between the diameter of a text character and the text character's average stroke width may also be used to filter out portions of the image which do not include text. For example, if the product identification module 124 identifies a portion of an image which resembles the letter ‘O’, the product identification module 124 may calculate the ratio of the diameter for the candidate text character to the average stroke width. When the ratio is less than the threshold ratio by more than a threshold variance (e.g., the candidate text character is donut-shaped) or the ratio is more than the threshold ratio by more than the threshold variance, the candidate text character may be filtered out as a portion of the image which does not include text.
  • the product identification module 124 may filter out candidate text characters having less than a minimum threshold size or greater than a maximum threshold size (e.g., a minimum height of 8 pixels and a maximum height of 300 pixels).
  • other filtering steps may also be applied such as filtering overlapping bounding boxes, or any other suitable filtering steps.
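The stroke-width consistency test and the size filter described above can be sketched together. The threshold values and data are illustrative assumptions, and a real SWT pipeline would operate on pixel rays rather than pre-measured candidates.

```python
# Hypothetical sketch of two of the filters described above: keep
# candidate text characters whose stroke widths agree within a
# threshold variance of the median width, and whose heights fall
# inside an allowed range (e.g., 8-300 pixels).

def filter_candidates(candidates, variance=1.0,
                      min_height=8, max_height=300):
    """candidates: list of (stroke_width_px, height_px) pairs."""
    widths = sorted(w for w, _ in candidates)
    median = widths[len(widths) // 2]
    return [(w, h) for w, h in candidates
            if abs(w - median) <= variance
            and min_height <= h <= max_height]

# Width-12 stroke and 4-pixel-tall blob are rejected as non-text.
cands = [(3.0, 20), (3.2, 21), (12.0, 20), (3.1, 4)]
kept = filter_candidates(cands)
```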
  • the product identification module 124 may also use the SWT to identify words. For example, all text characters in a word may have the same color, may be spaced apart evenly, may be within a threshold distance from each other, and may be the same height or have height differences which are less than a threshold amount. Accordingly, the product identification module 124 may identify words by grouping identified text characters having the same color, that are within a threshold height difference of each other, that are within a threshold distance of each other, and/or that are spaced apart by the same distance.
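The word-grouping cues listed above (same colour, similar height, small horizontal gaps) can be sketched as follows; the character representation and thresholds are assumptions for illustration.

```python
# Illustrative grouping of identified text characters into words using
# the cues described above. Each character is a hypothetical
# (x_position, height, colour) tuple.

def group_words(chars, max_gap=10, max_height_diff=2):
    chars = sorted(chars, key=lambda c: c[0])
    words, current = [], [chars[0]]
    for prev, ch in zip(chars, chars[1:]):
        same_colour = ch[2] == prev[2]
        close = ch[0] - prev[0] <= max_gap
        similar = abs(ch[1] - prev[1]) <= max_height_diff
        if same_colour and close and similar:
            current.append(ch)
        else:
            words.append(current)
            current = [ch]
    words.append(current)
    return words

# Three closely spaced characters form one word; the large gap before
# the last character starts a new word.
chars = [(0, 12, "black"), (8, 12, "black"), (16, 13, "black"),
         (40, 12, "black")]
words = group_words(chars)
```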
  • the product identification module 124 may use Maximally Stable Extremal Regions (MSER) techniques to identify text within an object or may use a combination of SWT and MSER to identify the text.
  • the portion of the object containing text may be provided to an optical character recognition (OCR) engine which may convert an image (e.g., the portion of the object containing text) to a text string.
  • the product identification module 124 may identify a barcode or QR code within the identified object and may decode the barcode or QR code converting the barcode or QR code to a text string or other data stream which may be used as a semantic cue.
  • the product identification module 124 may compare each of the visual features, semantic cues, and/or other visual characteristics for the object to visual descriptors, semantic cues, and/or other visual characteristics for templates of personal care products to determine a likelihood that the object corresponds to one of the personal care products.
  • the personal care computing device 102 may store the templates of personal care products in a database. Each template may include the visual features, semantic cues, and/or other visual characteristics for the template personal care product.
  • each identified text string for an object may be compared to text strings in the templates of personal care products to determine likelihoods that the object corresponds to each template personal care product.
  • the personal care product having the highest likelihood for the object or having a likelihood that exceeds a likelihood threshold may be identified as the personal care product corresponding to the object.
  • the product identification module 124 may generate a machine learning model for identifying personal care products based on visual features and semantic cues using image classification and/or machine learning techniques.
  • the machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the visual features and semantic cues for the object to the machine learning model to identify the personal care product corresponding to the object.
  • the template features and template semantic cues may be compared to the features and semantic cues for an object using a nearest neighbors algorithm.
  • the nearest neighbors algorithm may identify template features and template semantic cues which are the closest to the features of the object by creating numerical representations of the features and semantic cues to generate feature vectors, such as a pixel width and height of a personal care product, and RGB pixel values for the personal care product, for example.
  • the numerical representations of the features or feature vectors of the object may be compared to the feature vectors of template personal care products to determine a vector distance between the features of the object and each template personal care product.
  • a semantic cue for the object such as text may be compared to text in the template personal care products to identify the amount of matching text characters, words, or symbols to determine a vector distance between the semantic cues of the object and each template personal care product.
  • the product identification module 124 may generate vector distances for each vector (e.g., each visual feature and semantic cue) and combine the individual vector distances to generate an overall vector distance between the object and a particular template personal care product.
  • the product identification module 124 may then identify the personal care product which corresponds to the object based on the amount of similarity, or the vector distance in the nearest neighbors algorithm, between the visual features and semantic cues for the object and the visual features and semantic cues for the template personal care products.
  • the product identification module 124 may identify the template personal care product having the smallest overall vector distance between the object and the template personal care product as the template personal care product corresponding to the object.
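The nearest-neighbours matching described above (per-feature vector distances summed into an overall distance per template, smallest distance wins) can be sketched as follows. The template names and feature values are invented for illustration, not real product data.

```python
# Sketch of the template-matching step described above: a Euclidean
# distance per feature vector, summed into an overall distance per
# template product; the template with the smallest overall distance
# is identified as the match.
import math

def vector_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_product(object_features, templates):
    """object_features: list of feature vectors; templates: dict of
    product name -> list of feature vectors in the same order."""
    best, best_dist = None, float("inf")
    for name, tmpl_features in templates.items():
        total = sum(vector_distance(f, t)
                    for f, t in zip(object_features, tmpl_features))
        if total < best_dist:
            best, best_dist = name, total
    return best

templates = {
    "mouthwash":   [[0.0, 1.0], [10.0, 10.0]],
    "moisturizer": [[5.0, 5.0], [0.0, 0.0]],
}
product = match_product([[0.1, 1.1], [9.0, 10.0]], templates)
```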
  • the product identification module 124 may provide images or video of the area within the field of view of the camera 112 to the server device 202 which may identify an object and determine a personal care product which corresponds to the object using similar techniques as described above. Then the server device 202 may provide the identified personal care products to the product identification module 124 .
  • the product identification module 124 may identify consumer habits, such as product use event data for the personal care product 104 .
  • the product use event data may include identification information for the personal care product 104 such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104 , etc.
  • the product identification module 124 may determine the date and/or time of the use based on the date and/or time when the product identification module 124 identifies the personal care product 104 . For example, when the personal care computing device 102 receives identification information from a radio identification tag provided by the personal care product 104 , the product identification module 124 may record the date and/or time in which the identification information is received.
  • the product identification module 124 may determine the duration of the use by determining when the personal care product 104 can no longer be identified. For example, the product identification module 124 may record the amount of time until the personal care computing device 102 stops receiving a radio signal from the personal care product 104 , until the personal care product 104 is no longer within the field of view of the camera 112 , etc.
  • the product identification module 124 may identify other personal care products used in the same time frame as the personal care product 104 by identifying the other personal care products in a similar manner as described above, and comparing identification times for each of the other personal care products to the identification time for the personal care product. If another personal care product is identified within a threshold time period (e.g., 2 minutes, 5 minutes, 10 minutes, etc.) of the personal care product, the product identification module 124 may determine that the other personal care product was used in the same time frame as the personal care product 104 . For example, the product identification module 124 may determine that 5 personal care products were used within a ten minute time period, and thus may determine that each of the 5 personal care products was used in the same time frame. The product identification module 124 may also generate an order in which a set of personal care products were used when the set of personal care products were used in the same time frame.
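The time-frame grouping described above can be sketched with a simple threshold on identification times. The product names, timestamps, and helper names are hypothetical.

```python
# Illustrative grouping of product-use events into the same time frame:
# a product joins the current group if it was identified within the
# threshold of the group's first product, mirroring the description
# above. Timestamps are minutes since an arbitrary start.

def same_time_frame(events, threshold_minutes=10):
    """events: list of (product, minute) pairs; returns groups of
    product names used in the same time frame."""
    events = sorted(events, key=lambda e: e[1])
    groups, current, start = [], [], None
    for product, minute in events:
        if start is None or minute - start <= threshold_minutes:
            if start is None:
                start = minute
            current.append(product)
        else:
            groups.append(current)
            current, start = [product], minute
    groups.append(current)
    return groups

# Five products within a ten-minute period form one time frame.
events = [("cleanser", 0), ("toner", 3), ("moisturizer", 7),
          ("sunscreen", 9), ("mascara", 10), ("shampoo", 45)]
groups = same_time_frame(events, threshold_minutes=10)
```

The sorted order within each group also yields the order of use mentioned above.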
  • a threshold time period e.g. 2 minutes, 5 minutes, 10 minutes, etc.
  • the personal care computing device 102 may present questions on the user interface 110 or via the speaker 108 which are related to the use of the identified personal care product 104 . Accordingly, the user may respond to the questions with voice responses which are received via the microphone or via user controls on the user interface 110 , such as drop-down menus, text fields, etc. For example, when the personal care product 104 is eye makeup, the personal care computing device 102 may ask which color is being used, where the eye makeup is being applied around the eye, etc. In some implementations, the product identification module 124 may determine the manner in which the personal care product 104 is being used based on the user's responses to the questions.
  • the product identification module 124 may determine the manner in which the personal care product 104 is being used by analyzing the images or video from the camera 112 using computer vision techniques. For example, the product identification module 124 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may determine where the user is applying makeup, lipstick, moisturizer, etc., on her face.
  • the personal care computing device 102 may obtain an indication of an activity being performed by the user.
  • the indication of the activity may be obtained automatically, for example via the environmental sensor.
  • the indication of the activity may be environmental characteristics within an area surrounding the personal care computing device 102, such as audio characteristics, temperature characteristics, visual characteristics, weight characteristics, air quality characteristics, or humidity characteristics.
  • the environmental sensor may periodically capture sensor data within the area surrounding the personal care computing device 102 (e.g., the living room, the kitchen, the bathroom, etc.), such as audio data, temperature data, humidity data, a 3D point cloud, air quality data, weight data, data received via a short-range communication link, etc. Then the personal care computing device 102 may identify an activity based on sensor data characteristics from any one or any suitable combination of sensors.
  • the environmental sensor may periodically capture audio data for a sound within the area.
  • the personal care computing device 102 may then compare the audio data for the sound to acoustic signatures for various types of activities, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc.
  • Each acoustic signature may include a set of audio characteristics for a particular activity and/or sound, such as the volume of the sound, the frequency of the sound, the tone of the sound, the direction of the sound, etc.
  • the personal care computing device 102 may identify the type of activity by comparing the audio data to each acoustic signature for each type of activity to determine a likelihood that the sound corresponds to one of the activities.
  • the type of activity having the highest likelihood for the sound or having a likelihood that exceeds a likelihood threshold may be identified as the type of activity corresponding to the sound.
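The acoustic-signature comparison described above can be sketched as follows. The signature values, the distance-to-likelihood mapping, and all names are assumptions; real acoustic signatures would involve far richer audio characteristics.

```python
# Hypothetical acoustic-signature matching: each signature is a vector
# of audio characteristics (volume dB, dominant frequency Hz, tone),
# a likelihood is derived from the distance to the observed sound, and
# the highest-likelihood activity is selected, as described above.
import math

SIGNATURES = {
    "shower running":  [70.0, 500.0, 0.3],
    "toilet flushing": [75.0, 300.0, 0.5],
    "brushing teeth":  [55.0, 900.0, 0.7],
}

def identify_activity(sound, likelihood_threshold=0.0):
    scores = {}
    for activity, sig in SIGNATURES.items():
        dist = math.sqrt(sum((s - o) ** 2 for s, o in zip(sig, sound)))
        scores[activity] = 1.0 / (1.0 + dist)  # closer -> more likely
    best = max(scores, key=scores.get)
    return best if scores[best] >= likelihood_threshold else None

activity = identify_activity([71.0, 510.0, 0.35])
```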
  • the environmental sensor may periodically capture audio data for a sound within the area and may periodically capture temperature data within the area.
  • the personal care computing device 102 may then compare the audio data for the sound to acoustic signatures for various types of activities, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc., and may compare the temperature data to heat signatures for the various types of activities.
  • Each acoustic signature may include a set of audio characteristics for a particular activity and/or sound, such as the volume of the sound, the frequency of the sound, the tone of the sound, the direction of the sound, etc.
  • Each heat signature may include a set of temperature characteristics for a particular activity.
  • the personal care computing device 102 may identify the type of activity by comparing the audio data to each acoustic signature for each type of activity and the temperature data to each heat signature for each type of activity to determine a likelihood that the sound/temperatures correspond to one of the activities.
  • the type of activity having the highest likelihood for the sound/temperatures or having a likelihood that exceeds a likelihood threshold may be identified as the type of activity corresponding to the sound/temperatures.
  • the personal care computing device 102 may obtain an indication of a type of area in which the personal care computing device 102 is located, such as the bathroom, the kitchen, the laundry room, etc.
  • the indication may be obtained from the user via user controls at the personal care computing device 102.
  • the personal care computing device 102 may adjust the likelihoods that a sound corresponds to one of several different activities based on the type of area in which the personal care computing device 102 is located. For example, if the personal care computing device 102 is in the laundry room, it may be more likely that a detected sound or set of environmental characteristics corresponds to the washing machine running than to the dishwasher running. Conversely, if the personal care computing device 102 is in the kitchen, it may be more likely that a detected sound or set of environmental characteristics corresponds to the dishwasher running than to the washing machine running.
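The room-based adjustment described above can be sketched as a prior applied to each activity's likelihood, followed by renormalisation. The prior values are assumptions chosen purely for illustration.

```python
# Hypothetical adjustment of activity likelihoods based on the type of
# area where the device is located, as described above: multiply each
# likelihood by a room-dependent prior, then renormalise to sum to 1.

ROOM_PRIORS = {
    "laundry room": {"washing machine running": 3.0,
                     "dishwasher running": 0.2},
    "kitchen":      {"washing machine running": 0.2,
                     "dishwasher running": 3.0},
}

def adjust_likelihoods(likelihoods, room):
    priors = ROOM_PRIORS.get(room, {})
    adjusted = {a: p * priors.get(a, 1.0)
                for a, p in likelihoods.items()}
    total = sum(adjusted.values())
    return {a: v / total for a, v in adjusted.items()}

# An ambiguous sound becomes much more likely to be the washing
# machine when the device reports it is in the laundry room.
raw = {"washing machine running": 0.5, "dishwasher running": 0.5}
in_laundry = adjust_likelihoods(raw, "laundry room")
```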
  • the product identification module 124 may generate a machine learning model for identifying an activity based on sensor data captured by the environmental sensor using machine learning techniques.
  • the machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the audio characteristics for the sound, the type of area where the personal care computing device 102 is located, and/or other environmental characteristics detected within the area to the machine learning model to identify the activity corresponding to the sound and/or other environmental characteristics.
  • the audio signatures may be compared to the audio characteristics for a sound using a nearest neighbors algorithm.
  • the nearest neighbors algorithm may identify audio signatures which are the closest to the audio characteristics of the sound by creating numerical representations of the audio characteristics to generate feature vectors, such as a volume, frequency, tone, and direction, for example.
  • the numerical representations of the features or feature vectors of the sound may be compared to the feature vectors of audio signatures of various types of activities to determine a vector distance between the features of the sound and each audio signature.
  • the product identification module 124 may generate vector distances for each vector and combine the individual vector distances to generate an overall vector distance between the sound and an audio signature for a particular type of activity.
  • the product identification module 124 may then identify the activity which corresponds to the sound based on the amount of similarity, or the vector distance in the nearest neighbors algorithm, between the features for the sound and the features of the audio signatures for the activities.
  • the product identification module 124 may identify the audio signature for the type of activity having the smallest overall vector distance between the sound and the audio signature as the audio signature for the type of activity corresponding to the sound.
  • the product identification module 124 may identify additional activity data, such as the date and/or time of the activity, the duration of the activity, etc.
  • the product identification module 124 may also identify activity data based on previous activities performed by the user, such as the frequency of the activity over a particular time period.
  • the product identification module 124 may also identify products related to the activity. For example, when the activity is running a washing machine or a dryer, the product identification module 124 may identify one or more laundry room products. When the activity is showering, the product identification module 124 may identify one or more hair care or skin care products. When the activity is the toilet flushing, the product identification module 124 may identify one or more bathroom products, such as toilet paper or cleaning products. When the activity is the dishwasher running, the product identification module 124 may identify one or more kitchen products, such as plates, bowls, forks, spoons, knives, dishwasher detergent, etc.
  • the product identification module 124 may use the identified activity to identify the personal care product 104 being used by a user. More specifically, the product identification module 124 may generate the machine learning model for identifying personal care products based on visual features, semantic cues, and the type of activity being performed by the user. For example, two personal care products (e.g., mouthwash and moisturizer) may have similar likelihoods for corresponding to the object. When the product identification module 124 identifies gargling as the activity, the product identification module 124 may determine that the personal care product corresponding to the object is mouthwash.
  • the product identification module 124 may identify the user.
  • the product identification module 124 may obtain an indication of the identity of the user from manual input via user controls on the user interface 110 of the personal care computing device 102 .
  • the user may login to a user profile using user login credentials, may enter the user's first and last name, may select a user profile from a set of user profiles, or may provide any other suitable identification information.
  • the product identification module 124 may obtain the indication of the identity of the user automatically from environmental sensor data, such as an image or video of the user, audio data indicative of the user's voice, etc.
  • the personal care computing device 102 may store template images of each of the users who utilize the personal care computing device 102 .
  • the personal care computing device 102 may also store voice recordings/audio signatures from each of the users and/or other biographical data.
  • the product identification module 124 may compare the environmental sensor data to the stored images, voice recordings, and/or other biographical data to identify the user.
  • the product identification module 124 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may compare the facial features to the facial features from the stored template images of each of the users.
  • the product identification module 124 may also identify the user's voice from audio data and compare the voice data to the stored voice recordings.
  • the product identification module 124 may compare the environmental sensor data to the stored images, voice recordings, and/or other biographical data using machine learning techniques.
  • the machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the facial features and/or voice features for the user to the machine learning model to identify the user.
  • the template facial features and/or template voice features may be compared to the facial features and/or voice features for a user whose identity is unknown using a nearest neighbors algorithm.
  • the nearest neighbors algorithm may identify template facial features and/or template voice features which are the closest to the facial features and/or voice features for a user whose identity is unknown by creating numerical representations of the facial features and/or voice features to generate feature vectors.
  • the numerical representations of the features or feature vectors of the user whose identity is unknown may be compared to the feature vectors of template users to determine a vector distance between the features of the user whose identity is unknown and each template user.
  • the product identification module 124 may generate vector distances for each vector (e.g., each facial feature and/or voice feature) and combine the individual vector distances to generate an overall vector distance between the user whose identity is unknown and a particular template user.
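The combination of per-feature distances into an overall distance between the unknown user and each template user, as described above, can be sketched as follows. The face and voice feature vectors are invented numbers, and the feature names are assumptions.

```python
# Hypothetical user identification: compute a vector distance per
# feature type (face, voice), sum them into an overall distance per
# template user, and pick the template user with the smallest overall
# distance, mirroring the description above.
import math

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify_user(unknown, templates):
    """unknown: dict of feature name -> vector; templates: dict of
    user ID -> same-shaped dict of template feature vectors."""
    best, best_dist = None, float("inf")
    for user_id, feats in templates.items():
        overall = sum(distance(unknown[name], vec)
                      for name, vec in feats.items())
        if overall < best_dist:
            best, best_dist = user_id, overall
    return best

templates = {
    "user-1": {"face": [0.2, 0.8], "voice": [120.0, 0.4]},
    "user-2": {"face": [0.9, 0.1], "voice": [220.0, 0.7]},
}
who = identify_user({"face": [0.25, 0.75], "voice": [125.0, 0.45]},
                    templates)
```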
  • the product identification module 124 may obtain an identifier for the identified user, such as a user ID.
  • the recommendation determination module 126 may then provide the product use event data for the identified personal care product 104 and/or the activity data for the activity as well as identification information for the user (e.g., user login credentials, a user ID, etc.) to the server device 202 .
  • the server device 202 may store the activity data and/or the product use event data in a user profile for the user which includes historical product use event data for the identified personal care product 104 and for other personal care products and/or historical activity data for the identified activities and for other activities.
  • the user profile may also include user profile data for the user, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product.
  • the personal care computing device 102 or the user's client computing device 222 obtains user profile data from the user and provides the user profile data to the server device 202 .
  • the user's client computing device 222 may provide location data indicating the user's current location (e.g., obtained via a positioning sensor such as a GPS module or via an IP address) to the server device 202.
  • the server device 202 may then store the user profile data in the user profile for the user.
  • Example data tables 300, 400 containing user profile data and product use event data are illustrated in FIGS. 3 and 4, respectively.
  • user profile data in a user profile may include a user ID 302, a name of the user 304, an address of the user 306, a date of birth of the user 308, personal care goals provided by the user 310, reported cosmetic issues provided by the user 312, rewards points for the user 314, or any other suitable information about the user.
  • the data table 300 may also include images of the user (not shown), user performance metrics related to product usage (not shown), etc.
  • product use event data in a user profile may include a user ID 402 which may be the same user ID as in the user profile data for associating the product use event data with the user.
  • the product use event data may also include the name of the personal care product 404 , the date and/or time of the use 406 , the duration of the use 408 , and the manner of use 410 describing how the personal care product was used.
  • Jane Smith (User ID 2) applied Olay™ Total Effects Whip Face Moisturizer on Jul. 26, 2019 at 9:14 a.m. for 1 minute. She rubbed the moisturizer unevenly on parts of her face.
  • Jane Smith applied the Olay™ Total Effects Whip Face Moisturizer at 7:15 p.m. for 30 seconds. This time she rubbed the moisturizer evenly on her entire face. Additionally, on Jul. 22, Jane Smith used an SK-II™ Facial Treatment Mask at 9:37 a.m. for 7 minutes. She placed the mask on her face, left it there for 7 minutes, and rinsed it off.
  • the server device 202 may also store a data table (not shown) which includes activity data.
  • Activity data in a user profile may include a user ID which may be the same user ID as in the user profile data for associating the activity data with the user.
  • the activity data may also include the type of the activity, the date and/or time of the activity, and the duration of the activity.
  • the activity data may include the frequency of the activity over a particular time period (e.g., a day, a week, a month) based on the dates and/or times of the activity and/or other metrics based on the activity data.
  • the server device 202 may analyze the product use event data for a particular personal care product, the activity data for a particular type of activity, and/or the user profile data for the user to generate user feedback information to assist the user in using the personal care product or related personal care products. This may enhance the user's experience with the personal care products and provide improved results from using the personal care products.
  • the server device 202 may include one or more processors 204, a communication unit (not shown) to transmit and receive data over long-range and short-range communication networks, and a memory 206.
  • the memory 206 can store instructions of an operating system (not shown) and a personal care recommendation generator 208.
  • the server device 202 may also be communicatively coupled to a database 210 that stores user profiles for several users, where each user profile includes user profile data and product use event data as described above.
  • the database 210 may also store templates of personal care products including visual features, semantic cues, and/or other visual characteristics for the template personal care products.
  • the database 210 may store audio signatures for various activities each including a set of audio characteristics which correspond to the activity. Additionally, the database 210 may store machine learning models generated based on the visual features, semantic cues, and/or other visual characteristics of the template personal care products and/or based on the audio signatures for the various activities.
  • the database 210 may store a set of rules regarding the appropriate frequency, duration, and manner of use for the personal care product.
  • the rules may differ depending on the demographics of a particular user. For example, the rules may indicate that users in a first age group should moisturize more often than users in a second age group.
  • the database 210 may store machine learning models for determining the appropriate frequency, duration, and manner of use for the personal care product that is specific to a particular user based on the user's previous patterns of use and/or the results experienced by the user.
  • a user-specific machine learning model may be adjusted such that the appropriate moisturizing frequency for the user is weekly.
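A minimal sketch of such demographic rules with a learned user-specific override — the age groups, frequencies, and units (uses per week) are illustrative assumptions, not values from the specification:

```python
# Hypothetical rule table: recommended moisturizing frequency (uses per week)
# by age group. Both the age boundaries and frequencies are illustrative.
AGE_GROUP_RULES = [
    (range(0, 30), 3),    # first age group: three times per week
    (range(30, 200), 7),  # second age group: daily
]

def recommended_frequency(age, user_override=None):
    """Return a user-specific learned frequency if one exists, else the
    demographic rule for the user's age group."""
    if user_override is not None:
        return user_override
    for ages, freq in AGE_GROUP_RULES:
        if age in ages:
            return freq
    return None
```

A user whose previous patterns of use suggest weekly moisturizing would carry `user_override=1`, taking precedence over the demographic default.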
  • the database 210 may store a set of rules regarding the appropriate frequency, duration, and manner of use for a particular activity.
  • the set of rules may also include an estimated total number of times the activity may be performed and/or an estimated total duration over multiple instances of performing the activity before products related to the activity need to be replenished, such as the number of showers before the user needs to replace the soap and shampoo.
  • the database 210 may store machine learning models for determining the appropriate frequency, duration, and manner of use for the particular activity that is specific to a particular user based on the user's previous patterns of use and/or the results experienced by the user.
  • the personal care recommendation generator 208 may analyze the activity data and/or the product use event data at several instances in time for the identified personal care product 104 (e.g., from the user profile in the database 210) to generate the user feedback information. For example, the personal care recommendation generator 208 may analyze the activity data and/or the product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.). Then the personal care recommendation generator 208 may determine product use metrics for the personal care product 104, such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc. For example, for the Olay™ Total Effects Whip Face Moisturizer described in FIG. 4, the personal care recommendation generator 208 may determine that the user applied the moisturizer about once a week.
  • the personal care recommendation generator 208 may also determine activity metrics for the activity such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc.
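The product use metrics described above (frequency over a window, average duration) could be computed from timestamped event data roughly as follows; the tuple-based event format is a simplifying assumption:

```python
from datetime import datetime, timedelta

def product_use_metrics(events, window_days=30, now=None):
    """Compute frequency and average duration for uses inside the time window.

    `events` is a list of (timestamp, duration_minutes) tuples -- a simplified
    stand-in for the product use event data in the user profile.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    recent = [(t, d) for t, d in events if t >= cutoff]
    if not recent:
        return {"uses_per_week": 0.0, "avg_duration_min": 0.0}
    weeks = window_days / 7
    return {
        "uses_per_week": len(recent) / weeks,
        "avg_duration_min": sum(d for _, d in recent) / len(recent),
    }
```

The same windowed aggregation would apply to activity metrics (frequency, average duration, time of day of the activity).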
  • the personal care recommendation generator 208 may then compare the product use metrics and/or the product use event data to the set of rules for the identified personal care product 104 (for example, from the database 210) to generate the user feedback information.
  • the personal care recommendation generator 208 may also compare the activity metrics and/or activity data to the set of rules for the identified activity (for example, from the database 210) to generate the user feedback information.
  • the activity metrics, activity data, product use metrics, and/or the product use event data may be compared to the set of rules in view of the user profile data for the user, such as demographics, or the user's personal care goals and reported issues.
  • the personal care recommendation generator 208 may apply the activity metrics, activity data, product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users. For example, the personal care recommendation generator 208 may train the machine learning model using a first set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a first set of users who improved their cosmetic deficiencies with the personal care product and a second set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a second set of users who did not improve their cosmetic deficiencies with the personal care product.
  • the personal care recommendation generator 208 may train the machine learning model using a first set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a first set of users who received the type of user feedback information for the personal care product and a second set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a second set of users who did not receive the type of user feedback information for the personal care product.
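The specification leaves the model type open; as one hedged illustration of training on the two cohorts, a nearest-centroid classifier over user metric vectors could look like:

```python
import math

def centroid(rows):
    """Element-wise mean of a cohort's equal-length metric vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def train(improved, not_improved):
    """'Train' by storing one centroid per cohort of user metric vectors."""
    return {"improved": centroid(improved), "not_improved": centroid(not_improved)}

def predict(model, features):
    """Label a new user's metrics by the nearer cohort centroid."""
    return min(model, key=lambda label: math.dist(features, model[label]))
```

The feature vectors here are whatever numeric metrics the system tracks (e.g., uses per week, average duration); the patent does not fix a representation.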
  • the personal care recommendation generator 208 may generate user feedback information using the set of rules and/or the machine learning models.
  • the user feedback information may include a recommendation to replenish the personal care product.
  • the personal care recommendation generator 208 may recommend replenishing the personal care product after a threshold number of uses of the personal care product (which in some instances may be determined via the activity data), or when the personal care product exceeds a threshold age, according to the set of rules and/or the machine learning models.
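A sketch of the replenishment rule, assuming illustrative thresholds of 5 uses or 90 days (the specification does not fix these values):

```python
from datetime import date

def should_replenish(use_count, opened_on, max_uses=5, max_age_days=90, today=None):
    """Recommend replenishment after a threshold number of uses or once the
    product exceeds a threshold age. Thresholds are illustrative assumptions."""
    today = today or date.today()
    return use_count >= max_uses or (today - opened_on).days > max_age_days
```

For a disposable razor, `max_uses=5` reproduces the "replace after 5 uses" example given later in the specification.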
  • the user feedback information may also include advice on how to use the personal care product or a recommendation on how to improve the use of the personal care product.
  • the server device 202 may store a set of instructions on using the personal care product for example, in the database 210 .
  • the personal care recommendation generator 208 may generate advice on how to use the personal care product. For example, as described above with reference to FIG. 4, the user placed an SK-II™ Facial Treatment Mask on her face, left it there for 7 minutes, and rinsed it off.
  • the set of rules for the SK-II™ Facial Treatment Mask may indicate that the user should not rinse off the mask and instead rub it in.
  • the personal care recommendation generator 208 may generate advice indicating that next time the user should rub in the mask without rinsing it off.
  • the advice may also include the frequency and duration in which to use the personal care product and/or a description of the frequency and duration in which the user is using the personal care product.
  • the personal care recommendation generator 208 may generate advice indicating the frequency and/or duration when the user first uses the personal care product according to the product use event data or when the user is using the personal care product too frequently, not frequently enough, for too long, or for not long enough according to the product use event data and the set of instructions on using the personal care product.
  • the advice on how to use the product may be based on the consumer habits for the user. For example, if the user's habits deviate from the set of rules for using a particular product, the personal care recommendation generator 208 may generate advice indicating how to use the product. Still further, the user feedback information may include opportunities for optimizing a particular hygiene regimen based on the user's habits. More specifically, the user feedback information may include a particular order in which the user should use a set of products for a particular hygiene regimen, such as when the user's habits indicate that the user does not follow the particular order. For example, when the user's habits indicate that the user applies concealer before putting on foundation, the user feedback information may include a recommendation to apply the concealer after putting on foundation. Additionally, the user feedback information may include recommendations for additional or alternative products to use during the particular hygiene regimen along with the products the user is currently using in the regimen.
  • the advice on how to use the product may be based on user profile data such as the weather conditions at the user's location, the time of year, or the time of day. If it is a hot, humid day or it is raining, the personal care recommendation generator 208 may recommend different types of use of hair care products than on a sunny day with low humidity. Also, if it is the daytime during the summer, the personal care recommendation generator 208 may recommend purchasing a daytime moisturizer with sunscreen to go along with the user's nighttime moisturizer. In the winter, the personal care recommendation generator 208 may recommend that the user apply the same moisturizer during the day and at night.
  • the user feedback information may include recommendations to purchase related personal care products.
  • the server device 202 may store lists of personal care products which work well together according to their ingredients or the effects of the personal care products on other users.
  • the personal care recommendation generator 208 may recommend a particular type of conditioner that complements the shampoo.
  • the user profile may indicate that the user in the past used a particular personal care product within the same time frame as another personal care product. The personal care recommendation generator 208 may recommend that the user once again purchase the particular personal care product to use with the other personal care product.
  • the user feedback information may also include a user performance metric such as a score based on the duration and/or frequency in which the user uses a particular personal care product.
  • the user performance metric may be a score from 0-100 which increases each time the user uses conditioner. If the user does not use conditioner for a threshold time period, the score may decrease or reset to 0.
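The conditioner score described above might be maintained with logic like the following; the step size and lapse threshold are assumptions for illustration:

```python
def update_score(score, used_conditioner, days_since_last_use,
                 step=5, lapse_days=14):
    """Maintain a 0-100 performance score: increase it on each use of the
    product, and reset it after a lapse. `step` and `lapse_days` are
    illustrative assumptions, not values from the specification."""
    if used_conditioner:
        return min(100, score + step)  # cap the score at 100
    if days_since_last_use > lapse_days:
        return 0  # score resets after the threshold time period without use
    return score
```

A variant could decay the score gradually instead of resetting it to 0, which the specification also contemplates.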
  • the personal care recommendation generator 208 generates the user performance metric using a machine learning model, such as a regression model.
  • the machine learning model may be trained using a first set of activity metrics, activity data, product use metrics and/or product use event data for a first set of users who improved their cosmetic deficiencies with the personal care product and a second set of activity metrics, activity data, product use metrics and/or product use event data for a second set of users who did not improve their cosmetic deficiencies with the personal care product. Then the personal care recommendation generator 208 may apply the user's activity metrics, activity data, product use metrics, and/or product use event data to the machine learning model to generate the user performance metric.
  • the user feedback information may also include rewards which may be provided when a user performance metric exceeds a threshold value, when the user uses more than a threshold number of different personal care products, when the user follows recommendations or advice provided by the personal care computing device, etc.
  • the user performance metric may also be a comparison to the performances of other users.
  • the personal care recommendation generator 208 may compare the user's performance to the performances of other users in the same demographic (e.g., age group).
  • the user may have a raw user performance metric for eye makeup of 65, but this may be in the 75th percentile of raw user performance metrics compared to other users in the same age group, same geographic area, etc.
  • the user feedback information may provide a raw user performance metric, a percentile or ranking of the raw user performance metric relative to other users, an adjusted user performance metric factoring in the user's performance relative to other users, or any other suitable relative user performance metric.
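The percentile comparison could be computed as the share of peer scores at or below the user's raw metric — a simplifying definition of percentile rank:

```python
def percentile_rank(raw_score, peer_scores):
    """Percent of peers whose raw metric falls at or below the user's score.
    `peer_scores` would come from users in the same demographic group."""
    at_or_below = sum(1 for s in peer_scores if s <= raw_score)
    return 100 * at_or_below / len(peer_scores)
```

With this definition, a raw score of 65 can land in the 75th percentile when most peers score lower, matching the eye-makeup example above.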
  • the user feedback information may also include recommendations on how to improve a user performance metric, or encouragement to continue using the personal care product to reach a high score and receive rewards points or other incentives for maintaining consistent use of the personal care product.
  • Example user feedback information is illustrated in the data table 500 of FIG. 5 .
  • the personal care recommendation generator 208 may recommend that the user “Use leave-on frizz control to fight the humidity.”
  • the personal care recommendation generator 208 may advise, “In the past month, you have been moisturizing about twice a week. Make sure you moisturize everyday.”
  • the personal care recommendation generator 208 may advise, “Don't forget to replace your disposable razor after 5 uses.”
  • the personal care recommendation generator 208 may recommend that the user “Buy Jane's Conditioner to use along with your shampoo.
  • Another example of user feedback information may be, “You have earned 200 rewards points for maintaining proper skin care habits.”
  • the database 210 may also store previous user feedback information provided to the user, so that the personal care computing device 102 does not repeatedly provide the user with the same user feedback information. Based on the user's response to various user feedback information the personal care recommendation generator 208 may learn which types of user feedback information improve the user's performance. For example, the personal care recommendation generator 208 may learn that the user does not purchase recommended related products, and thus may stop providing related products recommendations.
  • the personal care recommendation generator 208 may provide the user feedback information to the personal care computing device 102 or the client computing device 222 via an SMS message, email, push notification, etc.
  • the recommendation determination module 126 in the personal care computing device 102 may analyze the product use event data for the user to generate the user feedback information without sending the product use event data to the server device 202 .
  • the control module 128 may control operation of the personal care computing device 102 by, for example, presenting a display which includes the user feedback information via the user interface 110, presenting audio output which includes the user feedback information via the speaker 108, providing haptic feedback indicative of the user feedback information via a vibration motor, or transmitting the user feedback information to the client computing device 222 via the communication unit 116.
  • FIG. 6 illustrates a flow diagram representing an example method 600 for providing feedback regarding personal care products.
  • the method 600 may be performed by the personal care assistant application 122 and executed on the personal care computing device 102 .
  • the method 600 may be implemented in a set of instructions stored on a non-transitory computer-readable memory and executable on one or more processors of the personal care computing device 102 .
  • the method 600 may be at least partially performed by the product identification module 124, the recommendation determination module 126, and the control module 128, as shown in FIG. 2.
  • an indication of a personal care product 104 being used by a user is obtained.
  • the indication of the personal care product 104 may be provided with manual input via user controls on the user interface 110 of the personal care computing device 102 or the client computing device 222 .
  • the user may select the personal care product 104 from a list of personal care products included in a drop-down menu on the user interface 110 .
  • the indication of the personal care product 104 may also be provided automatically, such as via a radio signal from the personal care product 104 , or an image or video of the personal care product 104 .
  • an indication of an activity may be obtained.
  • the indication of the activity may be provided automatically, such as via environmental characteristics for the area surrounding the personal care computing device 102 detected by an environmental sensor, which may be any one of, any two of, or any suitable combination of an audio sensor such as a microphone or an array of microphones, a temperature sensor, an ultrasonic sensor, a radio antenna for example for receiving Wi-Fi or Bluetooth signals, a weighing scale, a wearable sensor, an air quality sensor such as a VOC sensor, a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a LiDAR sensor or an IR sensor, and/or a humidity sensor.
  • the personal care product 104 is identified based on the obtained indication.
  • the personal care computing device 102 may identify the selected personal care product 104 via the user controls.
  • the personal care computing device 102 may identify the personal care product 104 transmitting the radio signal based on the identification information included in the radio signal.
  • when the indication of the personal care product 104 is an image or video, the personal care computing device 102 may identify the personal care product 104 by analyzing the images or video frames using the computer vision techniques described above to identify an object within the images or video frames and identify visual features, semantic cues, and/or other visual characteristics for the object.
  • the personal care computing device 102 may compare the visual features, semantic cues, and/or other visual characteristics to visual features, semantic cues, and/or other visual characteristics for templates of personal care products to determine a likelihood that the object corresponds to one of the personal care products.
  • the personal care computing device 102 or the server device 202 may generate a machine learning model for identifying personal care products based on visual features and semantic cues using image classification and/or machine learning techniques.
  • the personal care computing device 102 may apply the visual features and semantic cues for the object to the machine learning model to identify the personal care product corresponding to the object.
  • the activity may be identified based on the indication of the activity.
  • the environmental sensor may periodically capture audio data for a sound within the area.
  • the personal care computing device 102 may then compare the audio data for the sound to acoustic signatures for various types of activities, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc.
  • Each acoustic signature may include a set of audio characteristics for a particular activity and/or sound, such as the volume of the sound, the frequency of the sound, the tone of the sound, the direction of the sound, etc.
  • the personal care computing device 102 may identify the type of activity by comparing the audio data to the acoustic signature for each type of activity to determine a likelihood that the sound corresponds to one of the activities. The type of activity having the highest likelihood for the sound, or having a likelihood that exceeds a likelihood threshold, may be identified as the type of activity corresponding to the sound.
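One hedged sketch of the signature-matching step: each hypothetical signature stores ranges for a few audio characteristics, and the likelihood is the fraction of characteristics that fall in range (the real characteristics and thresholds are not specified in the patent):

```python
# Hypothetical acoustic signatures: per-activity ranges for simple audio
# characteristics (volume in dB, dominant frequency in Hz). Illustrative only.
SIGNATURES = {
    "shower_running": {"volume": (55, 75), "frequency": (500, 2000)},
    "toilet_flushing": {"volume": (70, 90), "frequency": (100, 600)},
    "brushing_teeth": {"volume": (40, 60), "frequency": (2000, 8000)},
}

def activity_likelihood(sample, signature):
    """Fraction of characteristics whose measured value falls in the
    signature's range -- a crude stand-in for a real likelihood."""
    hits = sum(lo <= sample[k] <= hi for k, (lo, hi) in signature.items())
    return hits / len(signature)

def identify_activity(sample, threshold=0.5):
    """Return the activity with the highest likelihood above the threshold,
    or None when no signature matches well enough."""
    best = max(SIGNATURES, key=lambda a: activity_likelihood(sample, SIGNATURES[a]))
    if activity_likelihood(sample, SIGNATURES[best]) >= threshold:
        return best
    return None
```

A deployed system would use richer features (tone, direction, spectral shape) and likely the machine learning model described next.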
  • the personal care computing device 102 or the server device 202 may generate a machine learning model for identifying an activity based on sensor data captured by the environmental sensor using machine learning techniques. Then the personal care computing device 102 may apply the audio characteristics for the sound, the type of area where the personal care computing device 102 is located, and/or other environmental characteristics detected within the area to the machine learning model to identify the activity corresponding to the sound and/or other environmental characteristics.
  • the product identification module 124 may also identify products related to the activity. For example, when the activity is running a washing machine or a dryer, the product identification module 124 may identify one or more laundry room products. When the activity is showering, the product identification module 124 may identify one or more hair care or skin care products. When the activity is the toilet flushing, the product identification module 124 may identify one or more bathroom products, such as toilet paper or cleaning products. When the activity is the dishwasher running, the product identification module 124 may identify one or more kitchen products, such as plates, bowls, forks, spoons, knives, dishwasher detergent, etc.
  • the product identification module 124 may use the identified activity to identify the personal care product 104 being used by a user. More specifically, the product identification module 124 may generate the machine learning model for identifying personal care products based on visual features, semantic cues, and the type of activity being performed by the user.
  • the personal care computing device 102 provides the obtained indication of the personal care product 104 and/or the obtained indication of the activity to the server device 202 to identify the personal care product 104 corresponding to the indication. Then the server device 202 provides the identified personal care product 104 and/or the identified activity to the personal care computing device 102 .
  • the personal care computing device 102 may identify product use event data for the personal care product 104 based on the user's interaction with the personal care product 104 (block 606).
  • the product use event data may include identification information for the personal care product 104 such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104 , etc.
  • the personal care computing device 102 may record the date and/or time in which the identification information is received. Additionally, the personal care computing device 102 may determine the duration of the use by determining when the personal care product 104 can no longer be identified. Furthermore, the personal care computing device 102 may identify other personal care products used in the same time frame as the personal care product 104 by identifying the other personal care products in a similar manner as described above, and comparing identification times for each of the other personal care products to the identification time for the personal care product.
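The duration and same-time-frame determinations could be sketched as follows, under the simplifying assumption that identification times are already available as timestamps:

```python
from datetime import datetime

def use_duration(identification_times):
    """Approximate the duration of use as the span from when the product was
    first identified until it could last be identified."""
    times = sorted(identification_times)
    return (times[-1] - times[0]).total_seconds()

def used_together(time_a, time_b, window_seconds=600):
    """True when two products' identification times fall within a shared
    window (the 10-minute window is an illustrative assumption)."""
    return abs((time_a - time_b).total_seconds()) <= window_seconds
```

Products that satisfy `used_together` would be recorded as used in the same time frame, feeding the related-product recommendations described earlier.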
  • the personal care computing device 102 may identify activity data such as the type of activity, the date and/or time of the activity, the duration of the activity, etc.
  • the personal care computing device 102 may present questions on the user interface 110 or via the speaker 108 which are related to the use of the identified personal care product 104 . Accordingly, the user may respond to the questions with voice responses which are received via the microphone or via user controls on the user interface 110 , such as drop-down menus, text fields, etc. In other implementations, the personal care computing device 102 may determine the manner in which the personal care product 104 is being used by analyzing the images or video from the camera 112 using computer vision techniques.
  • the personal care computing device 102 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may determine where the user is applying makeup, lipstick, moisturizer, etc., on her face.
  • the personal care computing device 102 may determine the manner in which the personal care product 104 is being used based on the activity data. More specifically, the activity data may indicate the type of activity the user performed while using the personal care product 104 .
  • the personal care computing device 102 or the client computing device 222 also obtains user profile data for the user, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. Then the personal care computing device 102 or the client computing device 222 provides the user profile data to the server device 202 .
  • the personal care computing device 102 provides the activity data, the product use event data for the identified personal care product 104 , and/or identification information for the user (e.g., user login credentials, a user ID, etc.) to the server device 202 .
  • the server device 202 may analyze the activity data for the identified activity, the product use event data for the identified personal care product 104 , and/or the user profile data for the user to generate user feedback information to assist the user in using the personal care product or related personal care products. More specifically, the server device 202 may analyze the activity data and/or the product use event data at several instances in time for the identified personal care product 104 and/or identified activity to generate the user feedback information.
  • the server device 202 may analyze the activity data and/or the product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.). Then the server device 202 may determine product use metrics for the personal care product 104 such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc. The server device 202 may also identify activity metrics based on the activity data, such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc.
  • the server device 202 may then compare the activity data, the activity metrics, the product use metrics and/or the product use event data to a set of rules for the identified personal care product 104 , for example from the database 210 to generate the user feedback information.
  • the server device 202 may apply the activity data, the activity metrics, the product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users.
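The time-window analysis described in the items above can be sketched as follows. This is an illustrative Python sketch only: the record fields (`timestamp`, `duration_s`) and the 30-day default window are assumptions for the example, not details taken from the disclosure.

```python
from datetime import datetime, timedelta
from statistics import mean

def product_use_metrics(events, window_days=30, now=None):
    """Summarize use events for one personal care product over a
    trailing time window. Each event is a dict with a 'timestamp'
    (datetime) and a 'duration_s' (seconds of use)."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    recent = [e for e in events if e["timestamp"] >= cutoff]
    if not recent:
        return {"uses": 0, "avg_duration_s": 0.0, "uses_per_week": 0.0}
    return {
        "uses": len(recent),                                   # frequency of use
        "avg_duration_s": mean(e["duration_s"] for e in recent),
        "uses_per_week": len(recent) / (window_days / 7),      # normalized rate
    }
```

The same shape of computation applies to activity metrics (frequency, average duration, time of day of the activity) by substituting activity records for product use records.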
  • the user feedback information may include: a recommendation to replenish the personal care product; advice on how to use the personal care product or a recommendation on how to improve its use; the frequency and duration in which to use the personal care product and/or a description of the frequency and duration in which the user is using it; recommendations to purchase related personal care products; a user performance metric indicating how effectively the user is using the personal care product; rewards or recommendations on how to improve a user performance metric; and encouragement to continue using the personal care product to reach a high score and receive rewards points or other incentives for maintaining consistent use of the personal care product.
  • the server device 202 may generate several types of user feedback information.
  • the database 210 may also store previous user feedback information provided to the user, and the server device 202 may provide types of user feedback information to the personal care computing device 102 or the client computing device 222 which have not been presented to the user within a threshold time period. In other implementations, some types of user feedback information may be provided more often than others, such as user performance metrics.
  • the server device 202 may provide an updated user performance metric to the user each time the user performance metric changes. On the other hand, the server device 202 may only provide recommendations on how to use the personal care product once a week or once a month, for example.
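The per-type delivery cadence described above (performance metrics on every change, recommendations at most weekly or monthly, and suppression of feedback already shown within a threshold period) can be sketched as follows. The type names and intervals are illustrative assumptions for the example.

```python
from datetime import datetime, timedelta

# Minimum interval between deliveries of each feedback type
# (illustrative values; the disclosure leaves the cadence open).
FEEDBACK_INTERVALS = {
    "performance_metric": timedelta(0),          # send whenever it changes
    "usage_recommendation": timedelta(weeks=1),  # at most once a week
    "related_product": timedelta(days=30),       # at most once a month
}

def due_feedback(last_sent, now):
    """Return the feedback types eligible to be sent, given a map of
    feedback type -> datetime of the most recent delivery to this user
    (missing or None if never sent)."""
    due = []
    for ftype, interval in FEEDBACK_INTERVALS.items():
        prev = last_sent.get(ftype)
        if prev is None or now - prev >= interval:
            due.append(ftype)
    return due
```

The `last_sent` map corresponds to the previously provided feedback stored in the database 210.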
  • the personal care computing device 102 obtains the user feedback information from the server device 202 .
  • the personal care computing device 102 generates the user feedback information based on the activity data, the product use event data for the personal care product, and/or the user profile data. In any event, the personal care computing device 102 presents the user feedback information to the user (block 614 ).
  • the personal care computing device 102 may present a display on the user interface 110 that includes the user feedback information, may provide haptic feedback via a vibration motor indicative of the user feedback information, may turn a set of light emitting diodes (LEDs) on or off based on the user feedback information, may present voice output which includes the user feedback information via the speaker 108 , or may transmit the user feedback information for display on the client computing device 222 via the communication unit 116 .
  • FIG. 7 illustrates a flow diagram representing an example method 700 for generating the feedback regarding personal care products.
  • the method 700 may be performed by the server device 202 .
  • the method 700 may be implemented in a set of instructions stored on a non-transitory computer-readable memory and executable on one or more processors of the server device 202 .
  • the method 700 may be at least partially performed by the personal care recommendation generator 208 , as shown in FIG. 2 .
  • the server device 202 receives user profile data for a user.
  • the server device 202 may receive the user profile data from the user's personal care computing device 102 or client computing device 222 .
  • the user profile data may include biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product.
  • the server device 202 stores a user profile for the user in a database 210 which includes at least some of the user profile data. The server device 202 may then update the user profile with user profile data received from the personal care computing device 102 or client computing device 222 .
  • the server device 202 also receives product use event data indicative of the user's interaction with a personal care product 104 (block 704 ). For example, each time the personal care computing device 102 or the client computing device 222 identifies that the user is interacting with a personal care product 104 , the personal care computing device 102 or the client computing device 222 may generate a record of the use and provide the generated record to the server device 202 . This may include identification information for the personal care product 104 such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104 , etc.
  • the server device 202 may receive activity data indicative of an activity performed by the user. For example, each time the personal care computing device 102 identifies an activity, the personal care computing device 102 may generate a record of the activity and provide the generated record to the server device 202 . This may include activity data, such as the type of activity, the duration of the activity, the date and/or time of the activity, one or more personal care products related to the activity, etc.
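The product use event record described above can be sketched as a simple data structure. The field names below are illustrative assumptions; the disclosure only enumerates the kinds of information such a record may carry.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime
from typing import List, Optional

@dataclass
class ProductUseEvent:
    """One record generated per detected interaction, serialized and
    provided to the server together with the user's identification."""
    product_name: str                      # name of the personal care product
    used_at: datetime                      # date and/or time of the use
    duration_s: float                      # duration of the use, in seconds
    manner: Optional[str] = None           # manner in which the product is used
    co_used_products: List[str] = field(default_factory=list)  # same-time-frame products

record = ProductUseEvent("shampoo", datetime(2020, 6, 9, 7, 30), 45.0,
                         manner="applied to damp hair",
                         co_used_products=["conditioner"])
payload = asdict(record)  # dict form, ready for transmission
```

An activity record would carry the analogous fields (activity type, date/time, duration, related products).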
  • the server device 202 may store the activity data, the product use event data, and/or the user profile data in the user profile for the user, for example in the database 210 (block 706 ).
  • each time the server device 202 receives a new instance of activity data and/or product use event data, the server device 202 analyzes the new instance together with previously stored instances of activity data and/or product use event data for the activity or personal care product to generate user feedback information (block 708 ).
  • the server device 202 may analyze the activity data and/or product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.), which may include several instances of activity data and/or product use event data at different time intervals for the same activity and/or personal care product 104 .
  • the server device 202 may determine product use metrics for the personal care product 104 such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc.
  • the server device 202 may also determine activity metrics for the activity such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc.
  • the server device 202 may then compare the activity data, activity metrics, product use metrics, and/or the product use event data to a set of rules for the identified personal care product 104 and/or the identified activity, for example from the database 210 , to generate the user feedback information.
  • the server device 202 may apply the activity data, activity metrics, product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users.
  • the user feedback information may include: a recommendation to replenish the personal care product; advice on how to use the personal care product or a recommendation on how to improve its use; the frequency and duration in which to use the personal care product and/or a description of the frequency and duration in which the user is using it; recommendations to purchase related personal care products; a user performance metric indicating how effectively the user is using the personal care product; rewards or recommendations on how to improve a user performance metric; and encouragement to continue using the personal care product to reach a high score and receive rewards points or other incentives for maintaining consistent use of the personal care product.
  • the user feedback information may also include rewards which may be provided when a user performance metric exceeds a threshold value, when the user uses more than a threshold number of different personal care products, when the user follows recommendations or advice provided by the personal care computing device, etc.
  • the user performance metric may be a personal care product-specific user performance metric, such that the server device 202 generates a different user performance metric for each personal care product 104 or each type of personal care product (e.g., hair care, eye care, etc.).
  • each user performance metric may be a score such as from 0-100 which increases or decreases based on the duration and/or frequency in which the user uses a particular personal care product.
  • Each user performance metric may also be a comparison to the performances of other users.
  • the server device 202 may compare the user's performance to the performances of other users in the same demographic (e.g., age group).
  • the user may have a raw user performance metric for eye makeup of 65, but this may be in the 75th percentile of raw user performance metrics compared to other users in the same age group, same geographic area, etc.
  • the server device 202 may generate a raw user performance metric, an adjusted user performance metric factoring in the user's performance relative to other users, and/or a percentile or ranking of the raw user performance metric relative to other users for the same personal care product.
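The percentile ranking described above can be sketched using one common definition (the share of peers in the same cohort whose raw score falls below the user's). Whether a raw score of 65 lands in the 75th percentile depends entirely on the cohort's score distribution; the figures below are illustrative.

```python
from bisect import bisect_left

def percentile_rank(user_score, peer_scores):
    """Percentage of peers (same age group, geographic area, etc.)
    whose raw user performance metric is below the user's score."""
    ranked = sorted(peer_scores)
    below = bisect_left(ranked, user_score)  # count of strictly lower scores
    return 100.0 * below / len(ranked)
```

The server could report the raw metric, this percentile, and/or an adjusted metric derived from it, per personal care product.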
  • the database 210 may also store previous user feedback information provided to the user, so that the personal care computing device 102 does not repeatedly provide the user with the same user feedback information. Based on the user's responses to various user feedback information, the server device 202 may learn which types of user feedback information improve the user's performance. For example, the server device 202 may learn that the user does not purchase recommended related products, and thus may stop providing related product recommendations.
  • the server device 202 may provide the user feedback information to a client device, such as the personal care computing device 102 or the client computing device 222 , via an SMS message, email, push notification, etc.
  • routines, subroutines, applications, or instructions may constitute either software (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware.
  • routines, etc. are tangible units capable of performing certain operations and may be configured or arranged in a certain manner.
  • one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
  • a hardware module may be implemented mechanically or electronically.
  • a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
  • a hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
  • in embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time.
  • where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times.
  • Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
  • Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions.
  • the modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
  • the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
  • the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
  • any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
  • the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • some embodiments may be described using the terms “coupled” and “connected” along with their derivatives.
  • some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact.
  • the term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
  • the embodiments are not limited in this context.
  • the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
  • a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
  • as used herein, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).


Abstract

A personal care computing device may include a user interface, a camera, a speaker, a communication interface, and a personal care assistant application. The personal care assistant application may identify personal care products being used by a user from manual input (e.g., via user controls on the user interface), or automatically (e.g., via the camera or the communication interface). The personal care assistant application may identify product use event data describing the use of a personal care product by the user, such as the date and time of the use, the duration and frequency of the use, the manner in which the personal care product is being used, etc. Then the personal care assistant application may generate user feedback information for the personal care product based on the product use event data, previous instances of product use event data, and/or user profile data for the user.

Description

    TECHNICAL FIELD
  • The present disclosure generally relates to personal care systems, and, more particularly, to a personal care assistant for identifying instances of use of personal care products and providing feedback to a user to enhance the user's experience with the personal care products.
  • BACKGROUND
  • Typically, home assistant devices or other computing devices collect data from network-enabled devices to enhance the users' experiences with the network-enabled devices. For example, a home assistant device may learn a user's habits based on the user's interactions with other network-enabled devices, such as smart lights, a smart TV, a smart heating and air conditioning system, etc. The home assistant device may then automatically control the network-enabled devices according to the learned habits. In another example, a smart TV may provide indications of the user's watching habits to a remote server that provides recommendations on similar TV shows and movies to those the user is currently watching.
  • However, such devices do not have similar ways of learning habits based on user interactions with devices which are not network-enabled, such as personal care products. While users interact with personal care products, such as makeup, shampoo, conditioner, moisturizer, hand cream, face cream, toothbrushes, mouthwash, facial cleansers, etc., on a daily basis, computing devices do not collect usage data based on users' interactions with these products to enhance the user experience. Accordingly, users do not know if they are using the products correctly and at the appropriate rate or for the appropriate amount of time.
  • SUMMARY
  • To enhance a user's experience with personal care products, a personal care system includes a personal care computing device that obtains indications of personal care products being used by a user. The personal care computing device identifies a personal care product based on an obtained indication and provides user feedback to assist the user in using the personal care product. The personal care computing device may also determine product use event data based on the user's interaction with the personal care product. For example, the product use event data may include identification information for the personal care product such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product is used, other personal care products used in the same time frame as the personal care product, etc.
  • The personal care computing device may provide the product use event data to a server computing device which stores historical product use event data for the user in a user profile. The personal care computing device may also provide user profile data for the user to the server computing device, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. Then the server computing device may analyze the product use event data and the historical product use event data at several instances in time along with the user profile data to generate the user feedback. For example, the server computing device may determine that the user is using a skin care product once a week based on the product use event data. The server computing device may also determine the user's age according to the user profile data, and may determine that people in the user's age group should be using the skin care product more often. Accordingly, the server computing device may generate user feedback indicating that the user should use the skin care product at least twice per week. In some scenarios, the user feedback may also include recommendations to purchase other related personal care products.
  • The personal care computing device may present the user feedback via a user interface on the personal care computing device or as audio feedback via a speaker. In other implementations, the personal care computing device may provide the user feedback to the user's mobile device which may be presented via a personal care application on the mobile device. In yet other implementations, the server computing device may provide the user feedback to the user's mobile device via a short message service (SMS) message, email, or push notification.
  • In this manner, the personal care system collects and analyzes user data from personal care products which do not include a sensor, do not connect to the Internet, and/or do not include computing devices. Accordingly, the personal care system may digitize data from analog products.
  • In one embodiment, a computing device for providing feedback regarding consumer habits includes a user interface, an environmental sensor, a communication interface, one or more processors, and a non-transitory computer-readable memory coupled to the one or more processors, the environmental sensor, the user interface, and the communication interface, and storing instructions thereon. The instructions, when executed by the one or more processors, cause the computing device to identify, via the environmental sensor, an activity by a user within the user's dwelling related to a product, and obtain at least one of: (i) activity data for the user, the activity data related to a frequency or duration of the activity performed by the user over time or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time. The instructions further cause the computing device to generate user feedback information associated with the product or related products based on at least one of: the activity data or the product use event data, and provide the user feedback information via the user interface or the communication interface to a mobile device of the user.
  • In another embodiment, a server device for providing feedback regarding consumer habits includes one or more processors, and a non-transitory computer-readable memory coupled to the one or more processors and storing instructions thereon. The instructions, when executed by the one or more processors, cause the server device to receive, at one or more time intervals, at least one of: (i) activity data for an activity performed by a user within the user's dwelling related to a product, the activity data related to a frequency or duration of the activity performed by the user over time, or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time. The instructions further cause the server device to store the activity data and the product use event data in a user profile of the user and analyze at least one of: the activity data or the product use event data at the one or more time intervals to generate user feedback information associated with the product or related personal care products. Moreover, the instructions cause the server device to provide the user feedback information to a client device for presenting the user feedback information to the user.
  • In yet another embodiment, a method for providing feedback regarding consumer habits includes identifying, via an environmental sensor communicatively coupled to a computing device, an activity by a user within the user's dwelling related to a product, and obtaining, by the computing device, at least one of: (i) activity data for the user, the activity data related to a frequency or duration of the activity performed by the user over time or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time. The method further includes generating, by the computing device, user feedback information associated with the product or related products based on at least one of: the activity data or the product use event data. Furthermore, the method includes providing, by the computing device, the user feedback information via a user interface or a communication interface to a mobile device of the user.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The figures described below depict various aspects of the system and methods disclosed herein. It should be understood that each figure depicts an embodiment of a particular aspect of the disclosed system and methods, and that each of the figures is intended to accord with a possible embodiment thereof. Further, wherever possible, the following description refers to the reference numerals included in the following figures, in which features depicted in multiple figures are designated with consistent reference numerals.
  • FIG. 1 illustrates an example personal care computing device and a personal care product;
  • FIG. 2 illustrates a block diagram of an example communication system in which the personal care computing device can operate;
  • FIG. 3 illustrates an example data table including user profile data;
  • FIG. 4 illustrates another example data table including product use event data;
  • FIG. 5 illustrates example user feedback which may be provided by the personal care system to the user;
  • FIG. 6 illustrates a flow diagram of an example method for providing feedback regarding personal care products, which can be implemented in the personal care computing device; and
  • FIG. 7 illustrates a flow diagram of an example method for generating the feedback regarding personal care products, which can be implemented in a server device.
  • DETAILED DESCRIPTION
  • Although the following text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this patent and equivalents. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.
  • It should also be understood that, unless a term is expressly defined in this patent using the sentence “As used herein, the term ‘______’ is hereby defined to mean . . . ” or a similar sentence, there is no intent to limit the meaning of that term, either expressly or by implication, beyond its plain or ordinary meaning, and such term should not be interpreted to be limited in scope based on any statement made in any section of this patent (other than the language of the claims). To the extent that any term recited in the claims at the end of this patent is referred to in this patent in a manner consistent with a single meaning, that is done for sake of clarity only so as to not confuse the reader, and it is not intended that such claim term be limited, by implication or otherwise, to that single meaning. Finally, unless a claim element is defined by reciting the word “means” and a function without the recital of any structure, it is not intended that the scope of any claim element be interpreted based on the application of 35 U.S.C. § 112(f).
  • Accordingly, the term “personal care products” or “products” may be used to refer to consumer products which are typically used in a bathroom, laundry room, or kitchen. For example, personal care products may include tooth care products (e.g., toothbrushes, mouthwash, dental floss, etc.), skin care products (e.g., hand cream, face cream, facial cleansers, moisturizer, etc.), cosmetic products (e.g., face makeup, eye makeup, lipstick, makeup brushes, makeup kits, makeup mirrors, etc.), hair care products (e.g., shampoo, conditioner, hair dryers, straighteners, brushes, combs, curlers, spray gels, etc.), other grooming products (e.g., razors, hair removal products, etc.), toilet paper, cleaning products (e.g., bleach, window cleaner, all-purpose cleaner, soap, toilet bowl cleaner, etc.), laundry room products (e.g., laundry detergent, stain removal products, etc.), kitchen products (e.g., plates, bowls, forks, spoons, knives, measuring cups, pots, pans, spatulas, cutting boards, etc.), or any other suitable bathroom, laundry room, or kitchen products.
  • The term “consumer habits” as used herein may refer to usage of consumer products by a user, a hygiene regimen followed by the user, including an order in which a set of consumer products were used when the set of consumer products were used in the same time frame, the extent to which the user complies with product instructions, grooming patterns for the user, etc.
  • Generally speaking, techniques for providing feedback regarding personal care products may be implemented in one or more personal care products, a personal care computing device, one or more other client computing devices, one or more network servers, and/or a system that includes several of these devices. However, for clarity, the examples below focus primarily on an embodiment in which a personal care computing device identifies a personal care product which is being used by a user, and determines product use event data for the personal care product, such as identification information for the personal care product (e.g., the name of the personal care product), the date and/or time of the use, the duration of the use, the manner in which the personal care product is used, other personal care products used in the same time frame as the personal care product, etc. The personal care computing device may then provide identification information for the user (e.g., a user ID, user login credentials, etc.) and the product use event data to a server device. The server device may then retrieve a user profile for the user based on the identification information and update the user profile to include the product use event data. In some scenarios, the personal care computing device may also provide user profile data to the server device, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. Accordingly, the server device may update the user profile with the user profile data.
  • The user profile may include product use event data for the user at several time intervals, and the server device may analyze the product use event data over time and/or the user profile data for the user to generate user feedback information. Then the server device provides the user feedback information to the personal care computing device which presents audio feedback via a speaker or visual feedback via a user interface. In other implementations, the personal care computing device forwards the user feedback information to a client computing device of the user for presentation on the client computing device, or the server device provides the user feedback information directly to the client computing device, for example via an SMS message, email, a push notification, etc.
  • FIG. 1 illustrates various aspects of an exemplary environment implementing a personal care system 100. The personal care system 100 includes a personal care computing device 102 which may be placed in a bathroom, such as on a bathroom sink. The personal care system 100 also includes one or several personal care products 104. The personal care computing device 102, described in more detail below, includes a voice assistant having one or several microphones 106, such as an array of microphones 106, and one or several speakers 108, such as an array of speakers 108. The voice assistant may also include processors and a memory storing instructions for receiving and analyzing voice input and providing voice output. The voice assistant included in the personal care computing device 102 may include the hardware and software components of the voice controlled assistant described in U.S. Pat. No. 9,304,736 filed on Apr. 18, 2013, incorporated by reference herein.
  • Additionally, the personal care computing device 102 includes a user interface 110 for displaying information related to the personal care products, such as user feedback information regarding personal care products. The user interface 110 may also present user controls for the user to provide information about herself, such as identification information (e.g., user login credentials, a user ID, biographical information, user preferences or goals regarding skin care, etc.). Moreover, the user interface 110 may include user controls for the user to provide information regarding the personal care products she uses, such as the names of the personal care products, how often she uses the personal care products, the manner in which she uses each personal care product, the duration of each use, etc.
  • Furthermore, the personal care computing device 102 may include a camera 112 for capturing video and/or images of the area within the field of view of the camera 112. In this manner, the personal care computing device 102 may identify personal care products 104 within an image or video frame to determine that a personal care product 104 is currently in use, determine the duration of the use, etc. The personal care computing device 102 may also include a communication interface (not shown) for connecting to a long-range communication network such as the Internet and for transmitting/receiving radio signals over a short-range communication network, such as NFC, Bluetooth, RFID, Wi-Fi, etc. For example, the personal care computing device 102 may include an RFID reader or an NFC reader to receive radio signals from RFID tags, NFC tags, Bluetooth Low Energy (BLE) tags, etc.
  • In some implementations, the personal care product 104 includes a radio identification tag (not shown), such as an RFID tag, NFC tag, BLE tag, etc., which transmits identification information for the personal care products to the RFID reader in the personal care computing device 102. In this manner, the personal care computing device 102 may identify a personal care product within a communication range of the personal care computing device 102 based on the radio identification tag and may determine that the identified personal care product is being used by the user. The radio identification tag may be a passive radio identification tag, such that the radio identification tag does not include an internal power source such as a battery. Instead, the RFID or NFC reader within the communication range of the radio identification tag provides electromagnetic signals that energize the radio identification tag so that the radio identification tag can transmit a radio signal to the RFID or NFC reader which includes identification information for the personal care product 104. In other implementations, the personal care product 104 does not include a radio identification tag or any other transceiver. Accordingly, the personal care computing device 102 identifies the personal care product 104 in other ways, such as by identifying visual features within the personal care product 104 from images or video collected by the camera 112 which can be used to identify the personal care product 104, identifying labels, barcodes, or other text placed on the personal care product from the images or video, or obtaining an indication that the user is using the personal care product 104 via user controls on the user interface 110 or via the user's mobile device.
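  • The tag-based identification described above can be sketched as a simple lookup of the tag's payload against a catalog of known products. This is a minimal illustrative sketch only; the tag payload format and the catalog entries below are hypothetical and not part of the disclosure.

```python
from typing import Optional

# Hypothetical catalog mapping radio identification tag payloads to product
# names; in practice this mapping would be provisioned per product.
PRODUCT_CATALOG = {
    "tag-0001": "toothbrush",
    "tag-0002": "mouthwash",
    "tag-0003": "hand cream",
}


def identify_product(tag_payload: str) -> Optional[str]:
    """Return the product name for a received tag payload, or None if the
    tag is not recognized."""
    return PRODUCT_CATALOG.get(tag_payload)
```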
  • In some implementations, the personal care computing device 102 includes an environmental sensor for capturing environmental characteristics in the area surrounding the personal care computing device 102, such as the bathroom, the kitchen, the laundry room, the living room, etc. of the user's dwelling. The environmental sensor may be a temperature sensor, a humidity sensor, an acoustic sensor, an ultrasonic sensor, a radio antenna for example for receiving Wi-Fi or Bluetooth™ signals, a weighing scale, a wearable sensor, an air quality sensor such as a volatile organic compounds (VOC) sensor, or a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a light detection and ranging (LiDAR) sensor or an infrared (IR) sensor, each of which may be used in combination with the camera 112 to generate the 3D point cloud.
  • The acoustic sensor may include the one or several microphones 106, such as an array of microphones 106 for detecting audio characteristics, such as the volume of sounds within the area, the frequency of the sounds within the area, the tone of the sounds within the area, and/or the directions from which the sounds came within the area. In this manner, the personal care computing device 102 may identify activities being performed by the user based on the environmental sensor.
  • More specifically, the personal care computing device 102 may identify activities being performed by the user based on sounds within the area, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc. The activities may be related to products. For example, the shower running may be related to hair care or skin care products. The washing machine running may be related to laundry room products, such as laundry detergent, stain removal products, etc. Gargling may be related to tooth care products (e.g., toothbrushes, mouthwash, dental floss, etc.). In addition to identifying activities, the personal care computing device 102 may identify activity data for each activity, such as the type of activity (e.g., shaving), the duration of the activity, the date and/or time of the activity, the frequency with which the user performs the activity over a time period (e.g., a day, a week, a month), etc.
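  • The activity data described above can be sketched as a simple record together with a frequency query over a time period. The field names and example values are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ActivityEvent:
    """One detected activity: its type, when it started, and how long it ran."""
    activity: str       # e.g., "shaving", "shower running"
    start: datetime
    duration_s: float


def activity_frequency(events, activity, since):
    """Count how often a given activity occurred on or after `since`."""
    return sum(1 for e in events if e.activity == activity and e.start >= since)
```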
  • The personal care computing device 102 may identify personal care products based on any suitable combination of visual features within the personal care products from images or video collected by the camera 112 which can be used to identify the personal care products, labels, barcodes, or other text placed on the personal care products from the images or video, an indication that the user is using the personal care products via user controls on the user interface 110 or via the user's mobile device, a radio identification tag such as an RFID tag, NFC tag, BLE tag, etc., which transmits identification information for the personal care products, and/or environmental characteristics in the area surrounding the personal care computing device 102 which may be used to identify activities performed by the user that are related to the personal care products.
  • FIG. 2 illustrates an example communication system in which the personal care computing device 102 and the personal care product 104 can operate to enhance the user's experience with personal care products. The personal care computing device 102 has access to a wide area communication network 200 such as the Internet via a long-range wireless communication link (e.g., a cellular link). In the example configuration of FIG. 2, the personal care computing device 102 communicates with a server device 202 that generates user feedback information to provide to the user based on the user's interactions with her personal care products 104. More generally, the personal care computing device 102 can communicate with any number of suitable servers.
  • As described above, the personal care computing device 102 can also use a variety of arrangements, singly or in combination, to communicate with the user's personal care products 104. In some implementations, the personal care computing device 102 obtains identification information from the user's personal care products 104 via a short-range communication link, such as short-range radio frequency links including Bluetooth™, RFID, NFC, etc. Some personal care products 104 may include a communication component 130, such as an RFID tag, NFC tag, BLE tag, etc. Other personal care products 104 may not include the communication component 130. The personal care computing device 102 may also communicate with a client computing device 222 of the user such as a mobile device including a tablet or smartphone over a short-range communication link, such as short-range radio frequency links including Bluetooth™, WiFi (802.11 based or the like) or another type of radio frequency link, such as wireless USB.
  • The client computing device 222 may be a mobile device such as a tablet computer, a cell phone, a personal digital assistant (PDA), a smartphone, a laptop computer, a portable media player, a home phone, a pager, a wearable computing device, smart glasses, a smart watch or bracelet, a phablet, another smart device, etc. The client computing device 222 may also be a desktop computer. The client computing device 222 may include one or more processors 226, a memory 228, a communication unit (not shown) to transmit and receive data via long-range and short-range communication networks, and a user interface 232 for presenting data to the user. The memory 228 may store, for example, instructions for a personal care application 230 that includes user controls for providing information regarding the user's personal care products, such as the names of the user's personal care products, the frequency, duration, and/or manner in which the user uses each personal care product, etc. The personal care application 230 may also include user controls for providing user profile data such as user login credentials, a user ID, the user's name or other biographical information, an image of the user such as a before and after picture, etc. Additionally, the personal care application 230 may receive user feedback information to present on the user interface 232 or as voice output via a speaker. The user feedback information may be received from the personal care computing device 102 via a short-range communication link, such as Bluetooth™, or from the server device 202 via a long-range communication link, such as the Internet or a cellular network.
  • As shown in FIG. 2, the personal care computing device 102 may include one or more speakers 108 such as an array of speakers, an environmental sensor, which may include any one of or any suitable combination of a temperature sensor, a humidity sensor, an ultrasonic sensor, a radio antenna for example for receiving Wi-Fi or Bluetooth signals, a weighing scale, a wearable sensor, an air quality sensor such as a VOC sensor, a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a LiDAR sensor or an IR sensor, each of which may be used in combination with the camera 112 to generate the 3D point cloud, and/or one or more microphones 106 such as an array of microphones. The personal care computing device 102 may also include a user interface 110, a camera 112, one or more processors 114, a communication unit 116 to transmit and receive data over long-range and short-range communication networks, and a memory 118.
  • The memory 118 can store instructions of an operating system 120 and a personal care assistant application 122. The personal care assistant application 122 may obtain an indication of a personal care product 104 being used, identify the personal care product 104 based on the indication, and generate and present user feedback information to the user to assist the user with the personal care product or related personal care products via a product identification module 124, a recommendation determination module 126, and a control module 128.
  • More specifically, the personal care computing device 102 may obtain an indication of a personal care product 104 being used and the product identification module 124 may identify the personal care product 104 based on the obtained indication. The indication of the personal care product 104 may be provided with manual input via user controls on the user interface 110 of the personal care computing device 102. For example, the user may select the personal care product 104 from a list of personal care products included in a drop-down menu on the user interface 110. The product identification module 124 may then identify the selected personal care product 104 via the user controls.
  • The indication of the personal care product 104 may also be provided automatically, such as via a radio signal from the personal care product 104, an image or video of the personal care product 104, or environmental characteristics indicative of an activity performed by the user which is related to the personal care product 104, as described below. More specifically, the indication of the personal care product 104 may be identification information from a radio identification tag provided by the personal care product 104 to the personal care computing device 102. The product identification module 124 may then determine the personal care product 104 transmitting the radio signal based on the identification information. For example, the identification information may indicate that the personal care product 104 transmitting the radio signal is L′Oreal Paris™ Colour Riche Monos Eyeshadow.
  • Additionally, the indication of the personal care product 104 may be an image or video of the area within the field of view of the camera 112. The camera 112 may periodically capture images or capture continuous video of the area in front of the camera 112, which may include a bathroom counter or an area where a user may sit in front of a bathroom mirror. Then within each image or video frame, the product identification module 124 may identify an object and determine a personal care product which corresponds to the object based on visual descriptors and semantic cues for the object. At least some of the visual descriptors and semantic cues for the object may be based on a product tag, a product label, a product color, a product shape, a product size, or a product logo. In some scenarios, an image or video frame may include multiple objects and the product identification module 124 may determine personal care products which correspond to each object. To identify objects within the image or video frame, the product identification module 124 may segment boundaries for the objects using edge detection, pixel entropy, or other image processing techniques. For example, when adjacent pixels in an image differ in intensity by more than a threshold amount, the product identification module 124 may identify the intersection between the adjacent pixels as a boundary of an object. In another example, when a cluster of pixels in the image differs in intensity by more than a threshold amount from an adjacent cluster of pixels, the product identification module 124 may identify the intersection between the adjacent pixels as a boundary of an object. In addition to performing the edge detection techniques described above to identify the boundaries of an object, the product identification module 124 may use an active contour model to refine the locations of the boundaries and further remove noise.
  • Based on the boundary segmentation, the product identification module 124 may identify each of the objects in the image. For each identified object, the product identification module 124 may determine a size and shape of the object according to its boundaries. The product identification module 124 may also identify visual features within the object along with the corresponding locations of the visual features within the object. For example, a first visual feature may be located in the upper right corner of the object, a second visual feature may be located in the center of the object, etc.
  • A visual feature may include a keypoint which is a stable region within the object that is detectable regardless of blur, motion, distortion, orientation, illumination, scaling, and/or other changes in camera perspective. The stable regions may be extracted from the object using a scale-invariant feature transform (SIFT), speeded up robust features (SURF), fast retina keypoint (FREAK), binary robust invariant scalable keypoints (BRISK), or any other suitable computer vision techniques. In some embodiments, keypoints may be located at high-contrast regions of the object, such as edges within the object. A bounding box may be formed around a keypoint and the portion of the object created by the bounding box may be a visual feature. In some embodiments, each visual feature is encoded as a vector which may include attributes of the visual feature, such as RGB pixel values, the location of the visual feature within the object, etc.
  • Additionally, for each identified object, the product identification module 124 may identify semantic cues for the object, such as text displayed on the object (e.g., a product label), a tag on or adjacent to the object, a pattern or symbol on the object (e.g., a product logo), etc. To identify text within an object, the product identification module 124 may apply a stroke width transform (SWT). The SWT is used to find a portion of an image which includes text and filter out the remaining portions of the image which do not include text. In this manner, the text portion of the image may be converted to a text string. The SWT technique may be based on an assumption that all text characters in an image have the same stroke width. For example, when the letter ‘T’ is placed within an image, the pixel width of the horizontal line in the letter ‘T’ may be the same as the pixel width for the vertical line in the letter ‘T’ within the image. This width may also be the same for all other lines or curves that make up text characters within the image.
  • Based on this assumption, the product identification module 124 may identify text characters within an image by identifying several lines or curves having a same or similar width (e.g., within a threshold variance of each other). More specifically, the product identification module 124 may perform edge detection techniques within one of the objects, such as the edge detection techniques described above for boundary segmentation, to identify boundaries for lines and curves within the object. The product identification module 124 may then calculate pixel widths for each of these lines and curves based on the positions of their respective boundaries. When the pixel widths for several lines and/or curves are the same or are within a threshold variance of each other, the product identification module 124 may identify the lines and/or curves as text, and may filter out the remaining portions of the object.
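  • The width-grouping step described above can be sketched as keeping candidate strokes whose pixel widths fall within a tolerance of the median width, per the SWT assumption that text characters share a common stroke width. The tolerance value is illustrative.

```python
def text_like_strokes(widths, tolerance):
    """widths: measured pixel widths of candidate strokes (lines/curves).
    Returns the widths within `tolerance` of the median, treating those as
    likely text strokes and filtering out the rest."""
    if not widths:
        return []
    ordered = sorted(widths)
    median = ordered[len(ordered) // 2]
    return [w for w in widths if abs(w - median) <= tolerance]
```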
  • Additional filtering steps may also be applied to identify the text characters within the image. For example, text characters may have minimum and maximum aspect ratios, such that the length of a text character does not exceed the width of the text character by more than a threshold amount. Accordingly, the identified lines and/or curves may be compared to minimum and maximum aspect ratios. If the length to width ratio of a candidate text character is outside the minimum or maximum aspect ratios, the candidate text character may be filtered out as a portion of the image which does not include text.
  • A threshold ratio between the diameter of a text character and the text character's average stroke width may also be used to filter out portions of the image which do not include text. For example, if the product identification module 124 identifies a portion of an image which resembles the letter ‘O’, the product identification module 124 may calculate the ratio of the diameter for the candidate text character to the average stroke width. When the ratio is less than the threshold ratio by more than a threshold variance (e.g., the candidate text character is donut-shaped) or the ratio is more than the threshold ratio by more than the threshold variance, the candidate text character may be filtered out as a portion of the image which does not include text. Moreover, the product identification module 124 may filter out candidate text characters having less than a minimum threshold size or greater than a maximum threshold size (e.g., a minimum height of 8 pixels and a maximum height of 300 pixels). In some embodiments, other filtering steps may also be applied such as filtering overlapping bounding boxes, or any other suitable filtering steps.
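  • The geometric filters described above can be combined into a single sketch: a candidate character survives only if its aspect ratio, its diameter-to-average-stroke-width ratio, and its height all fall inside the stated bounds. All thresholds below are illustrative, and the bounding-box diagonal is used as a simple stand-in for the character's diameter.

```python
import math


def passes_filters(height, width, avg_stroke_width,
                   min_aspect=0.1, max_aspect=10.0,
                   max_diameter_ratio=10.0,
                   min_height=8, max_height=300):
    """Return True if a candidate text character passes the aspect-ratio,
    diameter/stroke-width, and size filters; otherwise it is discarded as a
    non-text region."""
    aspect = height / width
    diameter = math.hypot(height, width)  # diagonal as a diameter proxy
    return (min_aspect <= aspect <= max_aspect
            and diameter / avg_stroke_width <= max_diameter_ratio
            and min_height <= height <= max_height)
```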
  • In addition to identifying text characters, the product identification module 124 may also use the SWT to identify words. For example, all text characters in a word may have the same color, may be spaced apart evenly, may be within a threshold distance from each other, and may be the same height or have height differences which are less than a threshold amount. Accordingly, the product identification module 124 may identify words by grouping identified text characters having the same color, that are within a threshold height difference of each other, that are within a threshold distance of each other, and/or that are spaced apart by the same distance.
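  • The word-grouping heuristics above can be sketched as a left-to-right scan in which consecutive characters join the current word when they share a color, have similar heights, and sit within a maximum gap of each other. Representing each character as an (x, height, color) tuple is a hypothetical simplification for illustration.

```python
def group_words(chars, max_gap=5, max_height_diff=2):
    """chars: iterable of (x_position, height, color) tuples for identified
    text characters. Returns a list of words, each a list of characters."""
    words, current = [], []
    for ch in sorted(chars, key=lambda c: c[0]):
        if current:
            prev = current[-1]
            same_word = (ch[2] == prev[2]                      # same color
                         and abs(ch[1] - prev[1]) <= max_height_diff
                         and ch[0] - prev[0] <= max_gap)       # close enough
            if same_word:
                current.append(ch)
                continue
            words.append(current)
            current = []
        current.append(ch)
    return words + [current] if current else words
```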
  • In some embodiments, the product identification module 124 may use Maximally Stable Extremal Regions (MSER) techniques to identify text within an object or may use a combination of SWT and MSER to identify the text. Once text is identified within an object, the portion of the object containing text may be provided to an optical character recognition (OCR) engine which may convert an image (e.g., the portion of the object containing text) to a text string.
  • Also in some embodiments, for each identified object, the product identification module 124 may identify a barcode or QR code within the identified object and may decode the barcode or QR code, converting it to a text string or other data stream which may be used as a semantic cue.
  • In any event, to identify a personal care product which corresponds to an object, the product identification module 124 may compare each of the visual features, semantic cues, and/or other visual characteristics for the object to visual descriptors, semantic cues, and/or other visual characteristics for templates of personal care products to determine a likelihood that the object corresponds to one of the personal care products. The personal care computing device 102 may store the templates of personal care products in a database. Each template may include the visual features, semantic cues, and/or other visual characteristics for the template personal care product. For example, each identified text string for an object may be compared to text strings in the templates of personal care products to determine likelihoods that the object corresponds to each template personal care product. The personal care product having the highest likelihood for the object or having a likelihood that exceeds a likelihood threshold may be identified as the personal care product corresponding to the object.
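  • The text-based portion of the template comparison above can be sketched with a generic string-similarity score: each template product is scored against the text recognized on the object, and the best match is returned when it clears a likelihood threshold. Here `difflib.SequenceMatcher` is used as a stand-in similarity measure; the template strings and threshold are hypothetical.

```python
import difflib

# Hypothetical templates: product name -> text expected on its label.
TEMPLATES = {
    "whitening toothpaste": "brite white toothpaste fluoride",
    "daily moisturizer": "daily face moisturizer spf 15",
}


def best_template(ocr_text, threshold=0.5):
    """Score each template by string similarity to the recognized text and
    return the best-scoring product name, or None below the threshold."""
    scores = {name: difflib.SequenceMatcher(None, ocr_text, text).ratio()
              for name, text in TEMPLATES.items()}
    name = max(scores, key=scores.get)
    return name if scores[name] >= threshold else None
```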
  • In some embodiments, the product identification module 124 may generate a machine learning model for identifying personal care products based on visual features and semantic cues using image classification and/or machine learning techniques. The machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the visual features and semantic cues for the object to the machine learning model to identify the personal care product corresponding to the object.
  • In some embodiments, the template features and template semantic cues may be compared to the features and semantic cues for an object using a nearest neighbors algorithm. The nearest neighbors algorithm may identify template features and template semantic cues which are the closest to the features of the object by creating numerical representations of the features and semantic cues to generate feature vectors, such as a pixel width and height of a personal care product, and RGB pixel values for the personal care product, for example. The numerical representations of the features or feature vectors of the object may be compared to the feature vectors of template personal care products to determine a vector distance between the features of the object and each template personal care product. Additionally, a semantic cue for the object such as text may be compared to text in the template personal care products to identify the amount of matching text characters, words, or symbols to determine a vector distance between the semantic cues of the object and each template personal care product. The product identification module 124 may generate vector distances for each vector (e.g., each visual feature and semantic cue) and combine the individual vector distances to generate an overall vector distance between the object and a particular template personal care product.
  • The product identification module 124 may then identify the personal care product which corresponds to the object based on the amount of similarity, or the vector distance in the nearest neighbors algorithm, between the visual features and semantic cues for the object and the visual features and semantic cues for the template personal care products. The product identification module 124 may identify the template personal care product having the smallest overall vector distance between the object and the template personal care product as the template personal care product corresponding to the object.
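  • The distance combination described above can be sketched as follows: compute a Euclidean distance per feature vector, sum them into an overall distance per template, and pick the template with the smallest total. The feature vectors and template names are illustrative.

```python
import math


def vector_distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def nearest_template(object_features, templates):
    """object_features: list of feature vectors for the object.
    templates: template name -> list of feature vectors (same ordering).
    Returns the template name with the smallest overall (summed) distance."""
    def overall(template_features):
        return sum(vector_distance(f, t)
                   for f, t in zip(object_features, template_features))
    return min(templates, key=lambda name: overall(templates[name]))
```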
  • In other embodiments, the product identification module 124 may provide images or video of the area within the field of view of the camera 112 to the server device 202 which may identify an object and determine a personal care product which corresponds to the object using similar techniques as described above. Then the server device 202 may provide the identified personal care products to the product identification module 124.
  • In any event, in addition to identifying a personal care product 104 being used by a user, the product identification module 124 may identify consumer habits, such as product use event data for the personal care product 104. The product use event data may include identification information for the personal care product 104 (e.g., the name of the personal care product), the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104, etc.
  • The product identification module 124 may determine the date and/or time of the use based on the date and/or time when the product identification module 124 identifies the personal care product 104. For example, when the personal care computing device 102 receives identification information from a radio identification tag provided by the personal care product 104, the product identification module 124 may record the date and/or time in which the identification information is received.
  • Additionally, the product identification module 124 may determine the duration of the use by determining when the personal care product 104 can no longer be identified. For example, the product identification module 124 may record the amount of time until the personal care computing device 102 stops receiving a radio signal from the personal care product 104, until the personal care product 104 is no longer within the field of view of the camera 112, etc.
  • Furthermore, the product identification module 124 may identify other personal care products used in the same time frame as the personal care product 104 by identifying the other personal care products in a similar manner as described above, and comparing identification times for each of the other personal care products to the identification time for the personal care product. If another personal care product is identified within a threshold time period (e.g., 2 minutes, 5 minutes, 10 minutes, etc.) of the personal care product, the product identification module 124 may determine that the other personal care product was used in the same time frame as the personal care product 104. For example, the product identification module 124 may determine that 5 personal care products were used within a ten-minute time period, and thus may determine that each of the 5 personal care products was used in the same time frame. The product identification module 124 may also generate an order in which a set of personal care products were used when the set of personal care products were used in the same time frame.
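  • The same-time-frame test above can be sketched as comparing each other product's identification time against the reference product's identification time within a threshold window. The product names, times, and default window are illustrative.

```python
from datetime import datetime, timedelta


def same_time_frame(reference_time, other_times,
                    threshold=timedelta(minutes=10)):
    """other_times: product name -> identification time. Returns the set of
    products identified within `threshold` of reference_time, i.e., those
    treated as used in the same time frame as the reference product."""
    limit = threshold.total_seconds()
    return {name for name, t in other_times.items()
            if abs((t - reference_time).total_seconds()) <= limit}
```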
  • Moreover, to determine the manner in which the personal care product 104 is used, the personal care computing device 102 may present questions on the user interface 110 or via the speaker 108 which are related to the use of the identified personal care product 104. Accordingly, the user may respond to the questions with voice responses which are received via the microphone or via user controls on the user interface 110, such as drop-down menus, text fields, etc. For example, when the personal care product 104 is eye makeup, the personal care computing device 102 may ask which color is being used, where the eye makeup is being applied around the eye, etc. In some implementations, the product identification module 124 may determine the manner in which the personal care product 104 is being used based on the user's responses to the questions. In other implementations, the product identification module 124 may determine the manner in which the personal care product 104 is being used by analyzing the images or video from the camera 112 using computer vision techniques. For example, the product identification module 124 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may determine where the user is applying makeup, lipstick, moisturizer, etc., on her face.
  • Furthermore, the personal care computing device 102 may obtain an indication of an activity being performed by the user. The indication of the activity may be obtained automatically, for example via the environmental sensor. The indication of the activity may be environmental characteristics within an area surrounding the personal computing device 102, such as audio characteristics, temperature characteristics, visual characteristics, weight characteristics, air quality characteristics, or humidity characteristics. The environmental sensor may periodically capture sensor data within the area surrounding the personal computing device 102 (e.g., the living room, the kitchen, the bathroom, etc.), such as audio data, temperature data, humidity data, a 3D point cloud, air quality data, weight data, data received via a short-range communication link, etc. Then the personal computing device 102 may identify an activity based on sensor data characteristics from any one or any suitable combination of sensors. For example, the environmental sensor may periodically capture audio data for a sound within the area. The personal computing device 102 may then compare the audio data for the sound to acoustic signatures for various types of activities, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc. Each acoustic signature may include a set of audio characteristics for a particular activity and/or sound, such as the volume of the sound, the frequency of the sound, the tone of the sound, the direction of the sound, etc. The personal computing device 102 may identify the type of activity by comparing the audio data to each acoustic signature for each type of activity to determine a likelihood that the sound corresponds to one of the activities. 
The type of activity having the highest likelihood for the sound, or having a likelihood that exceeds a likelihood threshold, may be identified as the type of activity corresponding to the sound. In another example, the environmental sensor may periodically capture both audio data for a sound within the area and temperature data within the area. The personal computing device 102 may then compare the audio data for the sound to the acoustic signatures for the various types of activities as described above, and may compare the temperature data to heat signatures for the various types of activities. Each heat signature may include a set of temperature characteristics for a particular activity. The personal computing device 102 may identify the type of activity by comparing the audio data to each acoustic signature and the temperature data to each heat signature for each type of activity to determine a likelihood that the sound and temperatures correspond to one of the activities. The type of activity having the highest likelihood, or having a likelihood that exceeds a likelihood threshold, may be identified as the type of activity corresponding to the sound and temperatures.
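One minimal way to sketch the signature-matching step is to score each activity's stored signature against the observed audio and temperature features and select the highest-likelihood activity. The normalized feature values and the similarity formula below are assumptions for illustration:

```python
def likelihoods(observed, signatures):
    """Score each activity signature against observed sensor features.
    Features are normalized to [0, 1]; similarity is 1 minus the mean
    absolute difference across the features both share."""
    scores = {}
    for activity, sig in signatures.items():
        shared = observed.keys() & sig.keys()
        diffs = [abs(observed[f] - sig[f]) for f in shared]
        scores[activity] = 1 - sum(diffs) / len(diffs)
    return scores

# Hypothetical acoustic + heat signatures for two activities
signatures = {
    "shower running":  {"volume": 0.7, "frequency": 0.4, "temperature": 0.8},
    "toilet flushing": {"volume": 0.9, "frequency": 0.6, "temperature": 0.3},
}
observed = {"volume": 0.72, "frequency": 0.42, "temperature": 0.78}
scores = likelihoods(observed, signatures)
best = max(scores, key=scores.get)  # highest-likelihood activity
```

A likelihood threshold could be applied to `scores[best]` before accepting the match, as the description above suggests.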
  • Furthermore, to identify the type of activity, the personal computing device 102 may obtain an indication of a type of area in which the personal computing device 102 is located, such as the bathroom, the kitchen, the laundry room, etc. The indication may be obtained from the user via user controls at the personal computing device 102. The personal computing device 102 may adjust the likelihoods that a sound corresponds to one of several different activities based on the type of area in which the personal computing device 102 is located. For example, if the personal computing device 102 is in the laundry room, it may be more likely that a detected sound or set of environmental characteristics corresponds to the washing machine running than to the dishwasher running. Conversely, if the personal computing device 102 is in the kitchen, it may be more likely that a detected sound or set of environmental characteristics corresponds to the dishwasher running than to the washing machine running.
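The room-type adjustment can be sketched as a per-room prior that scales the raw likelihoods; the prior weights below are invented for illustration:

```python
def adjust_for_room(raw_likelihoods, room, room_priors):
    """Scale raw activity likelihoods by a per-room prior weight.
    Activities with no entry for the room keep their raw likelihood."""
    priors = room_priors.get(room, {})
    return {a: p * priors.get(a, 1.0) for a, p in raw_likelihoods.items()}

# Hypothetical priors: a washing machine is more plausible in the laundry
# room, a dishwasher more plausible in the kitchen.
room_priors = {
    "laundry room": {"washing machine running": 1.5, "dishwasher running": 0.5},
    "kitchen":      {"washing machine running": 0.5, "dishwasher running": 1.5},
}
raw = {"washing machine running": 0.6, "dishwasher running": 0.6}
adjusted = adjust_for_room(raw, "laundry room", room_priors)
```

With equal raw likelihoods, the room prior alone breaks the tie in favor of the washing machine.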
  • In some embodiments, the product identification module 124 may generate a machine learning model for identifying an activity based on sensor data captured by the environmental sensor using machine learning techniques. The machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the audio characteristics for the sound, the type of area where the personal computing device 102 is located, and/or other environmental characteristics detected within the area to the machine learning model to identify the activity corresponding to the sound and/or other environmental characteristics.
  • In some embodiments, the audio signatures may be compared to the audio characteristics for a sound using a nearest neighbors algorithm. The nearest neighbors algorithm may identify audio signatures which are the closest to the audio characteristics of the sound by creating numerical representations of the audio characteristics to generate feature vectors, such as a volume, frequency, tone, and direction, for example. The numerical representations of the features or feature vectors of the sound may be compared to the feature vectors of audio signatures of various types of activities to determine a vector distance between the features of the sound and each audio signature. The product identification module 124 may generate vector distances for each vector and combine the individual vector distances to generate an overall vector distance between the sound and an audio signature for a particular type of activity.
  • The product identification module 124 may then identify the activity which corresponds to the sound based on the amount of similarity, or the vector distance in the nearest neighbors algorithm, between the features for the sound and the features of the audio signatures for the activities. The product identification module 124 may identify the audio signature for the type of activity having the smallest overall vector distance between the sound and the audio signature as the audio signature for the type of activity corresponding to the sound.
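The nearest-neighbors comparison above can be sketched by combining the per-feature distances into a single Euclidean distance and selecting the closest signature; the feature values are hypothetical:

```python
import math

def overall_distance(sound, signature):
    """Combine per-feature differences into one Euclidean vector distance."""
    return math.sqrt(sum((sound[f] - signature[f]) ** 2 for f in sound))

def nearest_activity(sound, signatures):
    """Return the activity whose audio signature is closest to the sound."""
    return min(signatures, key=lambda a: overall_distance(sound, signatures[a]))

# Hypothetical normalized audio signatures (volume, frequency, tone, direction)
signatures = {
    "sink running": {"volume": 0.5, "frequency": 0.6, "tone": 0.4, "direction": 0.2},
    "shaving":      {"volume": 0.3, "frequency": 0.8, "tone": 0.7, "direction": 0.5},
}
sound = {"volume": 0.48, "frequency": 0.62, "tone": 0.41, "direction": 0.25}
best = nearest_activity(sound, signatures)
```

The smallest overall distance plays the role of the "smallest overall vector distance" described above.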
  • In addition to identifying the type of activity, the product identification module 124 may identify additional activity data, such as the date and/or time of the activity, the duration of the activity, etc. The product identification module 124 may also identify activity data based on previous activities performed by the user, such as the frequency of the activity over a particular time period.
  • The product identification module 124 may also identify products related to the activity. For example, when the activity is running a washing machine or a dryer, the product identification module 124 may identify one or more laundry room products. When the activity is showering, the product identification module 124 may identify one or more hair care or skin care products. When the activity is the toilet flushing, the product identification module 124 may identify one or more bathroom products, such as toilet paper or cleaning products. When the activity is the dishwasher running, the product identification module 124 may identify one or more kitchen products, such as plates, bowls, forks, spoons, knives, dishwasher detergent, etc.
  • In some embodiments, the product identification module 124 may use the identified activity to identify the personal care product 104 being used by a user. More specifically, the product identification module 124 may generate the machine learning model for identifying personal care products based on visual features, semantic cues, and the type of activity being performed by the user. For example, two personal care products (e.g., mouthwash and moisturizer) may have similar likelihoods for corresponding to the object. When the product identification module 124 identifies gargling as the activity, the product identification module 124 may determine that the personal care product corresponding to the object is mouthwash.
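The activity-based disambiguation might look like the following sketch, where an assumed activity/product co-occurrence table re-weights two visually similar candidates:

```python
def disambiguate(product_scores, activity, cooccurrence):
    """Re-weight visually ambiguous product scores by how often each
    product co-occurs with the detected activity, then pick the best."""
    weights = cooccurrence.get(activity, {})
    return max(product_scores,
               key=lambda p: product_scores[p] * weights.get(p, 1.0))

# Hypothetical co-occurrence weights: gargling strongly implies mouthwash
cooccurrence = {"gargling": {"mouthwash": 2.0, "moisturizer": 0.1}}

# Visually, the two candidates are nearly indistinguishable
scores = {"mouthwash": 0.48, "moisturizer": 0.52}
best = disambiguate(scores, "gargling", cooccurrence)
```

In a trained model, these weights would be learned rather than hand-set, but the effect is the same: the activity context resolves the visual tie.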
  • Still further, the product identification module 124 may identify the user. In some embodiments, the product identification module 124 may obtain an indication of the identity of the user from manual input via user controls on the user interface 110 of the personal care computing device 102. For example, the user may log in to a user profile using user login credentials, may enter the user's first and last name, may select a user profile from a set of user profiles, or may provide any other suitable identification information. In other embodiments, the product identification module 124 may obtain the indication of the identity of the user automatically from environmental sensor data, such as an image or video of the user, audio data indicative of the user's voice, etc.
  • More specifically, the personal care computing device 102 may store template images of each of the users who utilize the personal care computing device 102. The personal care computing device 102 may also store voice recordings/audio signatures from each of the users and/or other biographical data. The product identification module 124 may compare the environmental sensor data to the stored images, voice recordings, and/or other biographical data to identify the user. For example, the product identification module 124 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may compare the facial features to the facial features from the stored template images of each of the users. The product identification module 124 may also identify the user's voice from audio data and compare the voice data to the stored voice recordings. In some embodiments, the product identification module 124 may compare the environmental sensor data to the stored images, voice recordings, and/or other biographical data using machine learning techniques. The machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the facial features and/or voice features for the user to the machine learning model to identify the user.
  • In some embodiments, the template facial features and/or template voice features may be compared to the facial features and/or voice features for a user whose identity is unknown using a nearest neighbors algorithm. The nearest neighbors algorithm may identify template facial features and/or template voice features which are the closest to the facial features and/or voice features for a user whose identity is unknown by creating numerical representations of the facial features and/or voice features to generate feature vectors. The numerical representations of the features or feature vectors of the user whose identity is unknown may be compared to the feature vectors of template users to determine a vector distance between the features of the user whose identity is unknown and each template user. The product identification module 124 may generate vector distances for each vector (e.g., each facial feature and/or voice feature) and combine the individual vector distances to generate an overall vector distance between the user whose identity is unknown and a particular template user. The product identification module 124 may obtain an identifier for the identified user, such as a user ID.
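A hedged sketch of the template comparison, combining face and voice feature-vector distances into one overall distance per template user (the feature vectors and modality weights are invented):

```python
import math

def modality_distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify_user(face, voice, templates, w_face=0.6, w_voice=0.4):
    """Pick the template user ID with the smallest weighted combination
    of face and voice feature-vector distances."""
    def dist(uid):
        t = templates[uid]
        return (w_face * modality_distance(face, t["face"])
                + w_voice * modality_distance(voice, t["voice"]))
    return min(templates, key=dist)

# Hypothetical per-user template feature vectors
templates = {
    1: {"face": [0.2, 0.7, 0.4], "voice": [0.5, 0.3]},
    2: {"face": [0.8, 0.1, 0.6], "voice": [0.2, 0.9]},
}
who = identify_user([0.22, 0.68, 0.41], [0.52, 0.31], templates)
```

The returned ID would serve as the user identifier (e.g., a user ID) mentioned above.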
  • In any event, the recommendation determination module 126 may then provide the product use event data for the identified personal care product 104 and/or the activity data for the activity as well as identification information for the user (e.g., user login credentials, a user ID, etc.) to the server device 202. The server device 202 may store the activity data and/or the product use event data in a user profile for the user which includes historical product use event data for the identified personal care product 104 and for other personal care products and/or historical activity data for the identified activities and for other activities. The user profile may also include user profile data for the user, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. In some implementations, the personal care computing device 102 or the user's client computing device 222 obtains user profile data from the user and provides the user profile data to the server device 202. For example, the user's client computing device 222 may provide location data (e.g., obtained via a positioning sensor such as a GPS module or via an IP address) indicating the user's current location to the server device 202. The server device 202 may then store the user profile data in the user profile for the user.
  • Example data tables 300, 400 illustrating user profile data and product use event data are illustrated in FIGS. 3 and 4, respectively. As shown in the data table 300 of FIG. 3, user profile data in a user profile may include a user ID 302, a name of the user 304, an address of the user 306, a date of birth of the user 308, personal care goals provided by the user 310, reported cosmetic issues provided by the user 312, rewards points for the user 314, or any other suitable information about the user. The data table 300 may also include images of the user (not shown), user performance metrics related to product usage (not shown), etc. As shown in the data table 400 of FIG. 4, product use event data in a user profile may include a user ID 402 which may be the same user ID as in the user profile data for associating the product use event data with the user. The product use event data may also include the name of the personal care product 404, the date and/or time of the use 406, the duration of the use 408, and the manner of use 410 describing how the personal care product was used. For example, as indicated in the data table 400, Jane Smith (User ID 2) applied Olay™ Total Effects Whip Face Moisturizer on Jul. 26, 2019 at 9:14 a.m. for 1 minute. She rubbed the moisturizer unevenly on parts of her face. Back on July 14, Jane Smith applied the Olay™ Total Effects Whip Face Moisturizer at 7:15 p.m. for 30 seconds. That time she rubbed the moisturizer evenly on her entire face. Additionally, on July 22, Jane Smith used a SK-II™ Facial Treatment Mask at 9:37 a.m. for 7 minutes. She placed the mask on her face, left it there for 7 minutes, and rinsed it off.
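The FIG. 4 columns could be represented as simple records keyed by user ID, with the join back to the user profile done on that shared ID; the field names below are assumptions:

```python
from datetime import datetime

# Hypothetical records mirroring the FIG. 4 columns
product_use_events = [
    {"user_id": 2, "product": "Olay Total Effects Whip Face Moisturizer",
     "used_at": datetime(2019, 7, 26, 9, 14), "duration_s": 60,
     "manner": "rubbed unevenly on parts of face"},
    {"user_id": 2, "product": "SK-II Facial Treatment Mask",
     "used_at": datetime(2019, 7, 22, 9, 37), "duration_s": 420,
     "manner": "placed on face, left 7 minutes, rinsed off"},
]

def events_for_user(events, user_id):
    """Associate product use events with a user profile via the shared ID."""
    return [e for e in events if e["user_id"] == user_id]

jane_events = events_for_user(product_use_events, 2)
```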
  • The server device 202 may also store a data table (not shown) which includes activity data. Activity data in a user profile may include a user ID which may be the same user ID as in the user profile data for associating the activity data with the user. The activity data may also include the type of the activity, the date and/or time of the activity, and the duration of the activity. Furthermore, the activity data may include the frequency of the activity over a particular time period (e.g., a day, a week, a month) based on the dates and/or times of the activity and/or other metrics based on the activity data.
  • The server device 202 may analyze the product use event data for a particular personal care product, the activity data for a particular type of activity, and/or the user profile data for the user to generate user feedback information to assist the user in using the personal care product or related personal care products. This may enhance the user's experience with the personal care products and provide improved results from using the personal care products.
  • The server device 202 may include one or more processors 204, a communication unit (not shown) to transmit and receive data over long-range and short-range communication networks, and a memory 206.
  • The memory 206 can store instructions of an operating system (not shown) and a personal care recommendation generator 208. The server device 202 may also be communicatively coupled to a database 210 that stores user profiles for several users, where each user profile includes user profile data and product use event data as described above. The database 210 may also store templates of personal care products including visual features, semantic cues, and/or other visual characteristics for the template personal care products. Moreover, the database 210 may store audio signatures for various activities each including a set of audio characteristics which correspond to the activity. Additionally, the database 210 may store machine learning models generated based on the visual features, semantic cues, and/or other visual characteristics of the template personal care products and/or based on the audio signatures for the various activities. Furthermore, for each personal care product or for each type of personal care product, the database 210 may store a set of rules regarding the appropriate frequency, duration, and manner of use for the personal care product. The rules may differ depending on the demographics of a particular user. For example, the rules may indicate that users in a first age group should moisturize more often than users in a second age group. In addition to sets of rules, the database 210 may store machine learning models for determining the appropriate frequency, duration, and manner of use for the personal care product that is specific to a particular user based on the user's previous patterns of use and/or the results experienced by the user. 
For example, if a general rule is to moisturize daily, but the user's product use event data indicates that the user has been moisturizing weekly and recent images of the user indicate that her skin texture has greatly improved, a user-specific machine learning model may be adjusted such that the appropriate moisturizing frequency for the user is weekly.
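That user-specific adjustment could be sketched as follows, where an observed use interval replaces the general rule only when the user's outcome improved at that interval (a deliberate simplification of the machine learning model described above):

```python
def adjusted_frequency_days(general_rule_days, observed_interval_days,
                            outcome_improved):
    """Relax a general frequency rule toward the user's observed interval
    when the user's results improved at that interval; otherwise keep
    the general rule."""
    if outcome_improved and observed_interval_days > general_rule_days:
        return observed_interval_days
    return general_rule_days

# General rule: moisturize daily (every 1 day). The user moisturizes
# weekly and her skin texture improved, so the personalized rule
# becomes weekly (every 7 days).
freq = adjusted_frequency_days(1, 7, outcome_improved=True)
```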
  • Still further, the database 210 may store a set of rules regarding the appropriate frequency, duration, and manner of use for a particular activity. The set of rules may also include an estimated total number of times the activity may be performed and/or an estimated total duration over multiple instances of performing the activity before products related to the activity need to be replenished, such as the number of showers before the user needs to replace the soap and shampoo. In addition to sets of rules, the database 210 may store machine learning models for determining the appropriate frequency, duration, and manner of use for the particular activity that is specific to a particular user based on the user's previous patterns of use and/or the results experienced by the user.
  • In any event, the personal care recommendation generator 208 may analyze the activity data and/or the product use event data at several instances in time for the identified personal care product 104 (e.g., from the user profile in the database 210) to generate the user feedback information. For example, the personal care recommendation generator 208 may analyze the activity data and/or the product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.). Then the personal care recommendation generator 208 may determine product use metrics for the personal care product 104 such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc. For example, for the Olay™ Total Effects Whip Face Moisturizer described in FIG. 4, the personal care recommendation generator 208 may determine that the user applied the moisturizer about once a week. The personal care recommendation generator 208 may also determine activity metrics for the activity such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc.
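The product use metrics described above (frequency over a time window, average duration) can be computed directly from timestamped events; the window length and event values below are illustrative:

```python
from datetime import datetime, timedelta

def use_metrics(events, window_days=30, now=None):
    """Compute uses-per-week and average duration over a trailing window.
    `events` is a list of (timestamp, duration_seconds) pairs."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    recent = [(t, d) for t, d in events if t >= cutoff]
    if not recent:
        return {"uses_per_week": 0.0, "avg_duration_s": 0.0}
    return {
        "uses_per_week": len(recent) / (window_days / 7),
        "avg_duration_s": sum(d for _, d in recent) / len(recent),
    }

# Hypothetical moisturizer events: four uses in the trailing four weeks
now = datetime(2019, 7, 28)
events = [(datetime(2019, 7, 26, 9, 14), 60),
          (datetime(2019, 7, 14, 19, 15), 30),
          (datetime(2019, 7, 7, 19, 0), 45),
          (datetime(2019, 6, 30, 19, 0), 30)]
metrics = use_metrics(events, window_days=28, now=now)
```

Here the metrics work out to about once a week, matching the kind of conclusion drawn for the moisturizer example above.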
  • The personal care recommendation generator 208 may then compare the product use metrics and/or the product use event data to the set of rules for the identified personal care product 104, for example, from the database 210, to generate the user feedback information. The personal care recommendation generator 208 may also compare the activity metrics and/or activity data to the set of rules for the identified activity, for example, from the database 210, to generate the user feedback information. The activity metrics, activity data, product use metrics, and/or the product use event data may be compared to the set of rules in view of the user profile data for the user, such as demographics, or the user's personal care goals and reported issues.
  • In other implementations, the personal care recommendation generator 208 may apply the activity metrics, activity data, product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users. For example, the personal care recommendation generator 208 may train the machine learning model using a first set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a first set of users who improved their cosmetic deficiencies with the personal care product and a second set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a second set of users who did not improve their cosmetic deficiencies with the personal care product. For each type of user feedback information (e.g., rewards, recommendations, advice, etc.), the personal care recommendation generator 208 may train the machine learning model using a first set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a first set of users who received the type of user feedback information for the personal care product and a second set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a second set of users who did not receive the type of user feedback information for the personal care product.
  • In any event, the personal care recommendation generator 208 may generate user feedback information using the set of rules and/or the machine learning models. The user feedback information may include a recommendation to replenish the personal care product. The personal care recommendation generator 208 may recommend replenishing the personal care product after a threshold number of uses of the personal care product, which in some instances may be determined via the activity data, or when the personal care product exceeds a threshold age according to the set of rules and/or the machine learning models.
  • The user feedback information may also include advice on how to use the personal care product or a recommendation on how to improve the use of the personal care product. For each personal care product, the server device 202 may store a set of instructions on using the personal care product, for example, in the database 210. When the user is using the personal care product incorrectly based on product use event data, the personal care recommendation generator 208 may generate advice on how to use the personal care product. For example, as described above with reference to FIG. 4, the user placed a SK-II™ Facial Treatment Mask on her face, left it there for 7 minutes, and rinsed it off. The set of rules for the SK-II™ Facial Treatment Mask may indicate that the user should not rinse off the mask and instead rub it in. Accordingly, the personal care recommendation generator 208 may generate advice indicating that next time the user should rub in the mask without rinsing it off. The advice may also include the frequency and duration with which to use the personal care product and/or a description of the frequency and duration with which the user is using the personal care product. The personal care recommendation generator 208 may generate advice indicating the frequency and/or duration when the user first uses the personal care product according to the product use event data or when the user is using the personal care product too frequently, not frequently enough, for too long, or for not long enough according to the product use event data and the set of instructions on using the personal care product.
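A rule-comparison step like the one described might be sketched as follows, with hypothetical rule fields for the expected manner and minimum duration of use:

```python
def usage_advice(event, rules):
    """Compare a product use event to the product's stored rules and
    emit one piece of advice per deviation."""
    rule = rules.get(event["product"], {})
    advice = []
    if "manner" in rule and event["manner"] != rule["manner"]:
        advice.append(f"Next time, {rule['manner']} when using "
                      f"{event['product']}.")
    if "min_duration_s" in rule and event["duration_s"] < rule["min_duration_s"]:
        advice.append(f"Use {event['product']} for at least "
                      f"{rule['min_duration_s'] // 60} minutes.")
    return advice

# Hypothetical rule: rub in the mask without rinsing, for 10+ minutes
rules = {"Facial Treatment Mask": {"manner": "rub in without rinsing",
                                   "min_duration_s": 600}}
event = {"product": "Facial Treatment Mask",
         "manner": "rinsed off", "duration_s": 420}
tips = usage_advice(event, rules)
```

Each deviation yields its own advice string, which could then be presented via the user interface 110 or the speaker 108.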
  • The advice on how to use the product may be based on the consumer habits for the user. For example, if the user's habits deviate from the set of rules for using a particular product, the personal care recommendation generator 208 may generate advice indicating how to use the product. Still further, the user feedback information may include opportunities for optimizing a particular hygiene regimen based on the user's habits. More specifically, the user feedback information may include a particular order in which the user should use a set of products for a particular hygiene regimen, such as when the user's habits indicate that the user does not follow the particular order. For example, when the user's habits indicate that the user applies concealer before putting on foundation, the user feedback information may include a recommendation to apply the concealer after putting on foundation. Additionally, the user feedback information may include recommendations for additional or alternative products to use during the particular hygiene regimen along with the products the user is currently using in the regimen.
  • In some embodiments, the advice on how to use the product may be based on user profile data such as the weather conditions at the user's location, the time of year, or the time of day. If it is a hot, humid day or it is raining, the personal care recommendation generator 208 may recommend different types of use of hair care products than on a sunny day with low humidity. Also, if it is the daytime during the summer, the personal care recommendation generator 208 may recommend purchasing a daytime moisturizer with sunscreen to go along with the user's nighttime moisturizer. In the winter, the personal care recommendation generator 208 may recommend that the user apply the same moisturizer during the day and at night.
  • Furthermore, the user feedback information may include recommendations to purchase related personal care products. For example, the server device 202 may store lists of personal care products which work well together according to their ingredients or the effects of the personal care products on other users. When the user is using a particular type of shampoo, the personal care recommendation generator 208 may recommend a particular type of conditioner that complements the shampoo. Additionally, the user profile may indicate that the user in the past used a particular personal care product within the same time frame as another personal care product. The personal care recommendation generator 208 may recommend that the user once again purchase the particular personal care product to use with the other personal care product.
  • Moreover, the user feedback information may also include a user performance metric such as a score based on the duration and/or frequency in which the user uses a particular personal care product. For example, the user performance metric may be a score from 0-100 which increases each time the user uses conditioner. If the user does not use conditioner for a threshold time period, the score may decrease or reset to 0. In some implementations, the personal care recommendation generator 208 generates the user performance metric using a machine learning model, such as a regression model. As described above, the machine learning model may be trained using a first set of activity metrics, activity data, product use metrics and/or product use event data for a first set of users who improved their cosmetic deficiencies with the personal care product and a second set of activity metrics, activity data, product use metrics and/or product use event data for a second set of users who did not improve their cosmetic deficiencies with the personal care product. Then the personal care recommendation generator 208 may apply the user's activity metric, activity data, product use metric, and/or product use event data to the machine learning model to generate the user performance metric.
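The 0-100 score with an inactivity reset can be sketched like this; the point value, cap, and reset window are invented parameters:

```python
from datetime import datetime, timedelta

def performance_score(use_dates, inactivity_reset=timedelta(days=14),
                      points_per_use=10, cap=100):
    """Score rises with each use, is capped at `cap`, and resets to 0
    after a gap longer than `inactivity_reset` between consecutive uses."""
    score, last = 0, None
    for t in sorted(use_dates):
        if last is not None and t - last > inactivity_reset:
            score = 0  # threshold period of inactivity elapsed
        score = min(cap, score + points_per_use)
        last = t
    return score

# Four conditioner uses at regular intervals build the score up...
dates = [datetime(2019, 7, d) for d in (1, 5, 9, 13)]
score = performance_score(dates)
# ...while a long lapse resets it before the next use is counted.
lapsed = dates + [datetime(2019, 8, 20)]
```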
  • In some scenarios, the user feedback information may also include rewards which may be provided when a user performance metric exceeds a threshold value, when the user uses more than a threshold number of different personal care products, when the user follows recommendations or advice provided by the personal care computing device, etc.
  • The user performance metric may also be a comparison to the performances of other users. In some embodiments, the personal care recommendation generator 208 may compare the user's performance to the performances of other users in the same demographic (e.g., age group). For example, the user may have a raw user performance metric for eye makeup of 65, but this may be in the 75th percentile of raw user performance metrics compared to other users in the same age group, same geographic area, etc. Accordingly, the user feedback information may provide a raw user performance metric, a percentile or ranking of the raw user performance metric relative to other users, an adjusted user performance metric factoring in the user's performance relative to other users, or any other suitable relative user performance metric.
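The percentile comparison can be sketched as the share of cohort scores at or below the user's raw score; the cohort values are invented so that a raw score of 65 lands at the 75th percentile, matching the example above:

```python
def percentile(raw_score, cohort_scores):
    """Percentage of cohort scores at or below the user's raw score."""
    at_or_below = sum(1 for s in cohort_scores if s <= raw_score)
    return 100 * at_or_below / len(cohort_scores)

# Hypothetical raw scores for other users in the same demographic
cohort = [40, 50, 55, 60, 62, 64, 70, 80]
pct = percentile(65, cohort)  # 6 of 8 cohort scores are at or below 65
```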
  • The user feedback information may also include recommendations on how to improve a user performance metric, encouragement to continue using the personal care product to reach a high score and receive rewards points, or other incentives for maintaining consistent use of the personal care product.
  • Example user feedback information is illustrated in the data table 500 of FIG. 5. For example, for a hair care product, the personal care recommendation generator 208 may recommend that the user "Use leave-on frizz control to fight the humidity." For moisturizer, the personal care recommendation generator 208 may advise, "In the past month, you have been moisturizing about twice a week. Make sure you moisturize every day." In another example, for a razor, the personal care recommendation generator 208 may advise, "Don't forget to replace your disposable razor after 5 uses." In yet another example, for a shampoo, the personal care recommendation generator 208 may recommend that the user "Buy Jane's Conditioner to use along with your shampoo." Another example of user feedback information may be, "You have earned 200 rewards points for maintaining proper skin care habits."
  • The database 210 may also store previous user feedback information provided to the user, so that the personal care computing device 102 does not repeatedly provide the user with the same user feedback information. Based on the user's response to various user feedback information, the personal care recommendation generator 208 may learn which types of user feedback information improve the user's performance. For example, the personal care recommendation generator 208 may learn that the user does not purchase recommended related products, and thus may stop providing related product recommendations.
  • Then the personal care recommendation generator 208 may provide the user feedback information to the personal care computing device 102 or the client computing device 222 via an SMS message, email, push notification, etc. In other implementations, the recommendation determination module 126 in the personal care computing device 102 may analyze the product use event data for the user to generate the user feedback information without sending the product use event data to the server device 202.
  • The control module 128 may control operation of the personal care computing device 102 by, for example, presenting a display which includes the user feedback information via the user interface 110, presenting audio output which includes the user feedback information via the speaker 108, providing haptic feedback indicative of the user feedback information via a vibration motor, or transmitting the user feedback information to the client computing device 222 via the communication unit 116.
  • FIG. 6 illustrates a flow diagram representing an example method 600 for providing feedback regarding personal care products. The method 600 may be performed by the personal care assistant application 122 executing on the personal care computing device 102. In some embodiments, the method 600 may be implemented in a set of instructions stored on a non-transitory computer-readable memory and executable on one or more processors of the personal care computing device 102. For example, the method 600 may be at least partially performed by the product identification module 124, the recommendation determination module 126, and the control module 128, as shown in FIG. 2.
  • At block 602, an indication of a personal care product 104 being used by a user is obtained. The indication of the personal care product 104 may be provided with manual input via user controls on the user interface 110 of the personal care computing device 102 or the client computing device 222. For example, the user may select the personal care product 104 from a list of personal care products included in a drop-down menu on the user interface 110. The indication of the personal care product 104 may also be provided automatically, such as via a radio signal from the personal care product 104, or an image or video of the personal care product 104.
  • In some embodiments, an indication of an activity may be obtained. The indication of the activity may be provided automatically, such as via environmental characteristics for the area surrounding the personal care computing device 102 detected by an environmental sensor, which may be any one of, any two of, or any suitable combination of an audio sensor such as a microphone or an array of microphones, a temperature sensor, an ultrasonic sensor, a radio antenna for example for receiving Wi-Fi or Bluetooth signals, a weighing scale, a wearable sensor, an air quality sensor such as a VOC sensor, a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a LiDAR sensor or an IR sensor, and/or a humidity sensor.
  • Then at block 604, the personal care product 104 is identified based on the obtained indication. In the case of manual input, the personal care computing device 102 may identify the selected personal care product 104 via the user controls. When the indication of the personal care product 104 is a radio signal, the personal care computing device 102 may identify the personal care product 104 transmitting the radio signal based on the identification information included in the radio signal. Furthermore, when the indication of the personal care product 104 is an image or video, the personal care computing device 102 may identify the personal care product 104 by analyzing images or video frames using the computer vision techniques described above to identify an object within the images or video frames and identify visual features, semantic cues, and/or other visual characteristics for the object. Then the personal care computing device 102 may compare the visual features, semantic cues, and/or other visual characteristics to visual features, semantic cues, and/or other visual characteristics for templates of personal care products to determine a likelihood that the object corresponds to one of the personal care products. In other implementations, the personal care computing device 102 or the server device 202 may generate a machine learning model for identifying personal care products based on visual features and semantic cues using image classification and/or machine learning techniques. Then the personal care computing device 102 may apply the visual features and semantic cues for the object to the machine learning model to identify the personal care product corresponding to the object.
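The template-comparison step described above can be sketched as follows. This is an illustrative, non-limiting example only: the feature vocabulary, the stored templates, and the Jaccard-similarity scoring rule are assumptions for illustration, not the actual visual features or models used by the personal care computing device 102.

```python
# Hypothetical sketch of matching an object's extracted visual
# features against stored product templates. Feature names, template
# contents, and the scoring rule are illustrative assumptions.
PRODUCT_TEMPLATES = {
    "shampoo_bottle": {"bottle", "flip_cap", "shampoo_label"},
    "disposable_razor": {"handle", "blade_cartridge", "guard"},
}

def identify_product(object_features, templates=PRODUCT_TEMPLATES,
                     threshold=0.5):
    """Score the object's features against each product template
    (Jaccard similarity) and return the best match above threshold."""
    best, best_score = None, 0.0
    for product, template in templates.items():
        union = object_features | template
        score = len(object_features & template) / len(union) if union else 0.0
        if score > best_score:
            best, best_score = product, score
    return best if best_score >= threshold else None
```

In practice the comparison would operate on learned feature embeddings rather than literal string sets, but the likelihood-and-threshold structure is the same.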
  • In some implementations, the activity may be identified based on the indication of the activity. For example, the environmental sensor may periodically capture audio data for a sound within the area. The personal care computing device 102 may then compare the audio data for the sound to acoustic signatures for various types of activities, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc. Each acoustic signature may include a set of audio characteristics for a particular activity and/or sound, such as the volume of the sound, the frequency of the sound, the tone of the sound, the direction of the sound, etc. The personal care computing device 102 may identify the type of activity by comparing the audio data to each acoustic signature for each type of activity to determine a likelihood that the sound corresponds to one of the activities. The type of activity having the highest likelihood for the sound or having a likelihood that exceeds a likelihood threshold may be identified as the type of activity corresponding to the sound. In other implementations, the personal care computing device 102 or the server device 202 may generate a machine learning model for identifying an activity based on sensor data captured by the environmental sensor using machine learning techniques. Then the personal care computing device 102 may apply the audio characteristics for the sound, the type of area where the personal care computing device 102 is located, and/or other environmental characteristics detected within the area to the machine learning model to identify the activity corresponding to the sound and/or other environmental characteristics.
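The acoustic-signature comparison described above can be sketched as follows. This is an illustrative, non-limiting example: the feature names (volume, frequency, tone), the numeric signature values, and the similarity rule (one minus mean absolute feature distance) are assumptions for illustration, not actual acoustic signatures.

```python
# Hypothetical acoustic signatures, each a set of normalized audio
# characteristics for one activity. Values are illustrative only.
ACOUSTIC_SIGNATURES = {
    "shower_running":  {"volume": 0.7, "frequency": 0.3, "tone": 0.5},
    "toilet_flushing": {"volume": 0.9, "frequency": 0.6, "tone": 0.4},
    "brushing_teeth":  {"volume": 0.4, "frequency": 0.8, "tone": 0.7},
}

def identify_activity(audio, signatures=ACOUSTIC_SIGNATURES,
                      threshold=0.5):
    """Compare captured audio characteristics to each signature and
    return the highest-likelihood activity, or None if no likelihood
    exceeds the threshold."""
    best_activity, best_likelihood = None, 0.0
    for activity, sig in signatures.items():
        # Likelihood here is 1 minus the mean absolute feature distance.
        distance = sum(abs(audio[k] - sig[k]) for k in sig) / len(sig)
        likelihood = 1.0 - distance
        if likelihood > best_likelihood:
            best_activity, best_likelihood = activity, likelihood
    return best_activity if best_likelihood >= threshold else None
```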
  • The product identification module 124 may also identify products related to the activity. For example, when the activity is running a washing machine or a dryer, the product identification module 124 may identify one or more laundry room products. When the activity is showering, the product identification module 124 may identify one or more hair care or skin care products. When the activity is the toilet flushing, the product identification module 124 may identify one or more bathroom products, such as toilet paper or cleaning products. When the activity is the dishwasher running, the product identification module 124 may identify one or more kitchen products, such as plates, bowls, forks, spoons, knives, dishwasher detergent, etc.
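The activity-to-product associations above amount to a lookup from an identified activity to a set of related products, which can be sketched as follows. The table contents and activity names are illustrative assumptions drawn loosely from the examples in the preceding paragraph.

```python
# Hypothetical activity-to-related-products lookup table; the
# entries are illustrative assumptions, not an exhaustive mapping.
ACTIVITY_RELATED_PRODUCTS = {
    "washing_machine_running": ["laundry detergent", "fabric softener"],
    "showering": ["shampoo", "conditioner", "body wash"],
    "toilet_flushing": ["toilet paper", "bathroom cleaner"],
    "dishwasher_running": ["dishwasher detergent", "rinse aid"],
}

def products_related_to(activity):
    """Return products associated with an identified activity, or an
    empty list for an unrecognized activity."""
    return ACTIVITY_RELATED_PRODUCTS.get(activity, [])
```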
  • In some embodiments, the product identification module 124 may use the identified activity to identify the personal care product 104 being used by a user. More specifically, the product identification module 124 may generate the machine learning model for identifying personal care products based on visual features, semantic cues, and the type of activity being performed by the user.
  • In other embodiments, the personal care computing device 102 provides the obtained indication of the personal care product 104 and/or the obtained indication of the activity to the server device 202 to identify the personal care product 104 corresponding to the indication. Then the server device 202 provides the identified personal care product 104 and/or the identified activity to the personal care computing device 102.
  • In addition to identifying the personal care product 104, the personal care computing device 102 may identify product use event data for the personal care product 104 based on the user's interaction with the personal care product 104 (block 606). The product use event data may include identification information for the personal care product 104, such as the name of the personal care product, as well as the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104, etc.
  • For example, when the personal care computing device 102 receives identification information from a radio identification tag provided by the personal care product 104, the personal care computing device 102 may record the date and/or time in which the identification information is received. Additionally, the personal care computing device 102 may determine the duration of the use by determining when the personal care product 104 can no longer be identified. Furthermore, the personal care computing device 102 may identify other personal care products used in the same time frame as the personal care product 104 by identifying the other personal care products in a similar manner as described above, and comparing identification times for each of the other personal care products to the identification time for the personal care product.
  • Furthermore, in addition to identifying the activity, the personal care computing device 102 may identify activity data such as the type of activity, the date and/or time of the activity, the duration of the activity, etc.
  • Moreover, to determine the manner in which the personal care product 104 is used, the personal care computing device 102 may present questions on the user interface 110 or via the speaker 108 which are related to the use of the identified personal care product 104. Accordingly, the user may respond to the questions with voice responses which are received via the microphone or via user controls on the user interface 110, such as drop-down menus, text fields, etc. In other implementations, the personal care computing device 102 may determine the manner in which the personal care product 104 is being used by analyzing the images or video from the camera 112 using computer vision techniques. For example, the personal care computing device 102 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may determine where the user is applying makeup, lipstick, moisturizer, etc., on her face. In yet other implementations, the personal care computing device 102 may determine the manner in which the personal care product 104 is being used based on the activity data. More specifically, the activity data may indicate the type of activity the user performed while using the personal care product 104.
  • In some embodiments, the personal care computing device 102 or the client computing device 222 also obtains user profile data for the user, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. Then the personal care computing device 102 or the client computing device 222 provides the user profile data to the server device 202.
  • At block 608, the personal care computing device 102 provides the activity data, the product use event data for the identified personal care product 104, and/or identification information for the user (e.g., user login credentials, a user ID, etc.) to the server device 202. The server device 202 may analyze the activity data for the identified activity, the product use event data for the identified personal care product 104, and/or the user profile data for the user to generate user feedback information to assist the user in using the personal care product or related personal care products. More specifically, the server device 202 may analyze the activity data and/or the product use event data at several instances in time for the identified personal care product 104 and/or identified activity to generate the user feedback information. For example, the server device 202 may analyze the activity data and/or the product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.). Then the server device 202 may determine product use metrics for the personal care product 104 such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc. The server device 202 may also identify activity metrics based on the activity data, such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc. The server device 202 may then compare the activity data, the activity metrics, the product use metrics and/or the product use event data to a set of rules for the identified personal care product 104, for example from the database 210, to generate the user feedback information.
In other implementations, the server device 202 may apply the activity data, the activity metrics, the product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users.
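The product use metrics described above (frequency of use and average duration over a trailing time window) can be sketched as follows. The record structure and field names are illustrative assumptions, not the actual data format exchanged with the server device 202.

```python
from datetime import datetime, timedelta

def product_use_metrics(use_events, window_days=30, now=None):
    """Compute frequency of use and average duration (in minutes)
    for a product over a trailing time window. Each event is assumed
    to be a dict with 'timestamp' and 'duration_min' fields."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    # Keep only events inside the trailing window.
    recent = [e for e in use_events if e["timestamp"] >= cutoff]
    if not recent:
        return {"uses": 0, "avg_duration_min": 0.0}
    avg = sum(e["duration_min"] for e in recent) / len(recent)
    return {"uses": len(recent), "avg_duration_min": avg}
```

The resulting metrics could then be compared against per-product rules (e.g., "moisturize daily") or fed to a model trained on other users' data, as described above.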
  • The user feedback information may include a recommendation to replenish the personal care product, advice on how to use the personal care product or a recommendation on how to improve the use of the personal care product, the frequency and duration in which to use the personal care product and/or a description of the frequency and duration in which the user is using the personal care product, recommendations to purchase related personal care products, a user performance metric indicating how effectively the user is using the personal care product, rewards, recommendations on how to improve a user performance metric, or encouragement to continue using the personal care product to reach a high score and receive rewards points or other incentives for maintaining consistent use of the personal care product.
  • In some implementations, the server device 202 may generate several types of user feedback information. The database 210 may also store previous user feedback information provided to the user, and the server device 202 may provide types of user feedback information to the personal care computing device 102 or the client computing device 222 which have not been presented to the user within a threshold time period. In other implementations, some types of user feedback information may be provided more often than others, such as user performance metrics. The server device 202 may provide an updated user performance metric to the user each time the user performance metric changes. On the other hand, the server device 202 may only provide recommendations on how to use the personal care product once a week or once a month, for example.
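The selection logic above, where some feedback types are throttled by a cooldown period while others (such as an updated performance metric) are always eligible, can be sketched as follows. The type names, the 7-day cooldown, and the history format are all illustrative assumptions.

```python
from datetime import datetime, timedelta

def select_feedback(candidates, last_sent, now=None, cooldown_days=7,
                    always_fresh=("performance_metric",)):
    """Pick feedback types eligible to present. Types in always_fresh
    are never throttled; any other type is skipped if it was presented
    within the cooldown period. Type names are assumptions."""
    now = now or datetime.now()
    eligible = []
    for ftype in candidates:
        previous = last_sent.get(ftype)
        if (ftype in always_fresh or previous is None
                or now - previous >= timedelta(days=cooldown_days)):
            eligible.append(ftype)
    return eligible
```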
  • At block 612, the personal care computing device 102 obtains the user feedback information from the server device 202. In other implementations, the personal care computing device 102 generates the user feedback information based on the activity data, the product use event data for the personal care product, and/or the user profile data. In any event, the personal care computing device 102 presents the user feedback information to the user (block 614). The personal care computing device 102 may present a display on the user interface 110 that includes the user feedback information, may provide haptic feedback via a vibration motor indicative of the user feedback information, may turn a set of light emitting diodes (LEDs) on or off based on the user feedback information, may present voice output which includes the user feedback information via the speaker 108, or may transmit the user feedback information for display on the client computing device 222 via the communication unit 116.
  • FIG. 7 illustrates a flow diagram representing an example method 700 for generating the feedback regarding personal care products. The method 700 may be performed by the server device 202. In some embodiments, the method 700 may be implemented in a set of instructions stored on a non-transitory computer-readable memory and executable on one or more processors of the server device 202. For example, the method 700 may be at least partially performed by the personal care recommendation generator 208, as shown in FIG. 2.
  • At block 702, the server device 202 receives user profile data for a user. The server device 202 may receive the user profile data from the user's personal care computing device 102 or client computing device 222. The user profile data may include biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. In some implementations, the server device 202 stores a user profile for the user in a database 210 which includes at least some of the user profile data. The server device 202 may then update the user profile with user profile data received from the personal care computing device 102 or client computing device 222.
  • The server device 202 also receives product use event data indicative of the user's interaction with a personal care product 104 (block 704). For example, each time the personal care computing device 102 or the client computing device 222 identifies that the user is interacting with a personal care product 104, the personal care computing device 102 or the client computing device 222 may generate a record of the use and provide the generated record to the server device 202. This may include identification information for the personal care product 104, such as the name of the personal care product, as well as the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104, etc.
  • Additionally, the server device 202 may receive activity data indicative of an activity performed by the user. For example, each time the personal care computing device 102 identifies an activity, the personal care computing device 102 may generate a record of the activity and provide the generated record to the server device 202. This may include activity data, such as the type of activity, the duration of the activity, the date and/or time of the activity, one or more personal care products related to the activity, etc.
  • Then the server device 202 may store the activity data, the product use event data, and/or the user profile data in the user profile for the user, for example in the database 210 (block 706). In some embodiments, each time the server device 202 receives a new instance of activity data and/or product use event data, the server device 202 analyzes the new instance of activity data and/or product use event data and previously stored instances of activity data and/or product use event data for the activity/personal care product to generate user feedback information (block 708). For example, the server device 202 may analyze the activity data and/or product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.), which may include several instances of activity data and/or product use event data at different time intervals for the same activity and/or personal care product 104.
  • Then the server device 202 may determine product use metrics for the personal care product 104 such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc. The server device 202 may also determine activity metrics for the activity such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc. The server device 202 may then compare the activity data, activity metrics, product use metrics, and/or the product use event data to a set of rules for the identified personal care product 104 and/or the identified activity, for example from the database 210, to generate the user feedback information. In other implementations, the server device 202 may apply the activity data, activity metrics, product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users.
  • The user feedback information may include a recommendation to replenish the personal care product, advice on how to use the personal care product or a recommendation on how to improve the use of the personal care product, the frequency and duration in which to use the personal care product and/or a description of the frequency and duration in which the user is using the personal care product, recommendations to purchase related personal care products, a user performance metric indicating how effectively the user is using the personal care product, rewards, recommendations on how to improve a user performance metric, or encouragement to continue using the personal care product to reach a high score and receive rewards points or other incentives for maintaining consistent use of the personal care product.
  • In some scenarios, the user feedback information may also include rewards which may be provided when a user performance metric exceeds a threshold value, when the user uses more than a threshold number of different personal care products, when the user follows recommendations or advice provided by the personal care computing device, etc.
  • The user performance metric may be a personal care product-specific user performance metric, such that the server device 202 generates a different user performance metric for each personal care product 104 or each type of personal care product (e.g., hair care, eye care, etc.). In some implementations, each user performance metric may be a score such as from 0-100 which increases or decreases based on the duration and/or frequency in which the user uses a particular personal care product. Each user performance metric may also be a comparison to the performances of other users. In some embodiments, the server device 202 may compare the user's performance to the performances of other users in the same demographic (e.g., age group). For example, the user may have a raw user performance metric for eye makeup of 65, but this may be in the 75th percentile of raw user performance metrics compared to other users in the same age group, same geographic area, etc. Accordingly, the server device 202 may generate a raw user performance metric, an adjusted user performance metric factoring in the user's performance relative to other users, and/or a percentile or ranking of the raw user performance metric relative to other users for the same personal care product.
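The percentile ranking described above (e.g., a raw score of 65 falling in the 75th percentile of a peer group) can be computed as follows. This is a minimal sketch: the definition used here (percentage of peer scores strictly below the user's raw score) is one common convention, assumed for illustration.

```python
def percentile_rank(raw_score, peer_scores):
    """Return the percentage of peer raw scores strictly below the
    user's raw score, or None when no peer scores are available."""
    if not peer_scores:
        return None
    below = sum(1 for s in peer_scores if s < raw_score)
    return 100.0 * below / len(peer_scores)
```

For instance, a raw eye-makeup score of 65 against peer scores of 10, 20, 30, and 90 yields the 75th percentile, matching the example in the paragraph above.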
  • The database 210 may also store previous user feedback information provided to the user, so that the personal care computing device 102 does not repeatedly provide the user with the same user feedback information. Based on the user's response to various user feedback information, the server device 202 may learn which types of user feedback information improve the user's performance. For example, the server device 202 may learn that the user does not purchase recommended related products, and thus may stop providing related product recommendations.
  • At block 710, the server device 202 may provide the user feedback information to a client device, such as the personal care computing device 102 or the client computing device 222, via an SMS message, email, push notification, etc.
  • Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
  • Additionally, certain embodiments are described herein as including logic or a number of routines, subroutines, applications, or instructions. These may constitute either software (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware. In hardware, the routines, etc., are tangible units capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
  • In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
  • Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
  • Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
  • Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.
  • As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.
  • As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
  • In addition, the words “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the description. This description, and the claims that follow, should be read to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
  • This detailed description is to be construed as exemplary only and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. One could implement numerous alternate embodiments, using either current technology or technology developed after the filing date of this application.

Claims (20)

What is claimed:
1. A computing device for providing feedback regarding consumer habits, the computing device comprising:
a user interface;
an environmental sensor;
a communication interface;
one or more processors; and
a non-transitory computer-readable memory coupled to the one or more processors, the user interface, the environmental sensor, and the communication interface, and storing thereon instructions that, when executed by the one or more processors, cause the computing device to:
identify, via the environmental sensor, an activity by a user within the user's dwelling related to a product;
obtain at least one of: (i) activity data for the user, the activity data related to a frequency or duration of the activity performed by the user over time, (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time, or (iii) habits by the user;
generate user feedback information associated with the product or related products based on at least one of: the activity data, the product use event data, or the habits by the user; and
provide the user feedback information via the user interface or the communication interface to a mobile device of the user.
2. The computing device of claim 1, wherein the instructions further cause the computing device to:
determine an identity of the user via the environmental sensor,
wherein the activity data, product use event data, or habits are obtained for the identified user.
3. The computing device of claim 1, wherein the instructions further cause the computing device to:
obtain user profile data for a user profile of the user,
wherein the user feedback information is further generated based on the user profile data for the user.
4. The computing device of claim 1, wherein to identify the activity by the user, the instructions further cause the computing device to:
identify audio characteristics within an area that includes the computing device; and
identify the activity by the user based on the audio characteristics within the area.
5. The computing device of claim 4, wherein the instructions further cause the computing device to:
obtain an indication of a type of area in which the computing device is located,
wherein the activity is further identified based on the type of area in which the computing device is located.
6. The computing device of claim 4, wherein to identify the activity by the user based on the audio characteristics within the area, the instructions further cause the computing device to:
obtain a plurality of acoustic signatures each corresponding to a different activity; and
compare the audio characteristics within the area to the plurality of acoustic signatures to identify the activity by the user.
7. The computing device of claim 4, wherein to identify the activity by the user based on the audio characteristics within the area, the instructions further cause the computing device to:
train a machine learning model for identifying the activity by the user using (i) a plurality of sets of audio characteristics, and (ii) indications of activities for each of the plurality of sets of audio characteristics; and
apply the audio characteristics within the area to the machine learning model to identify the activity by the user.
8. The computing device of claim 4, wherein the audio characteristics include at least one of:
a volume of a sound within the area,
a frequency of the sound within the area,
a tone of the sound within the area, or
a direction of the sound within the area.
9. The computing device of claim 1, wherein the environmental sensor includes at least one of:
an audio sensor,
a temperature sensor,
a humidity sensor,
an ultrasonic sensor,
a radio antenna,
a weighing scale,
a wearable sensor,
an air quality sensor, or
a depth sensor.
10. The computing device of claim 9, wherein the activity is identified via two or more of: the audio sensor, the temperature sensor, the humidity sensor, the ultrasonic sensor, the radio antenna, the weighing scale, the wearable sensor, the air quality sensor, or the depth sensor.
11. The computing device of claim 1, wherein:
the activity data includes a type of activity, a time in which the user performed the activity, a date in which the user performed the activity, a duration of the activity, or a frequency in which the user performed the activity over a particular time period; and
the product use event data includes a time in which the user used the product, a date in which the user used the product, an order in which the user used the product relative to other products, indications of the other products used with the product, a duration in which the user used the product, or a manner in which the user used the product.
12. The computing device of claim 1, wherein to generate user feedback information, the instructions cause the computing device to:
provide the activity data or the product use event data for the user to a server device; and
receive the user feedback information from the server device, wherein the server device generates the user feedback information based on the frequency or duration of the activity performed by the user over time or the product usage of the product over time for the user.
13. The computing device of claim 1, further comprising a speaker, wherein the instructions cause the computing device to provide the user feedback information to the user via the speaker.
14. A server device for providing feedback regarding consumer habits, the server device comprising:
one or more processors; and
a non-transitory computer-readable memory coupled to the one or more processors, and storing thereon instructions that, when executed by the one or more processors, cause the server device to:
receive, at one or more time intervals, at least one of: (i) activity data for an activity performed by a user within the user's dwelling related to a product, the activity data related to a frequency or duration of the activity performed by the user over time, (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time, or (iii) habits by the user;
store the activity data or the product use event data in a user profile of the user;
analyze at least one of: the activity data, the product use event data, or the habits by the user at the one or more time intervals to generate user feedback information associated with the product or related products; and
provide the user feedback information to a client device for presenting the user feedback information to the user.
15. The server device of claim 14, wherein the instructions cause the server device to:
receive, at one or more time intervals, user profile data for the user;
store the user profile data in the user profile of the user,
wherein the user feedback information is further generated based on the user profile data for the user.
16. The server device of claim 14, wherein:
the activity data includes a type of activity, a time in which the user performed the activity, a date in which the user performed the activity, a duration of the activity, or a frequency in which the user performed the activity over a particular time period; and
the product use event data includes a time in which the user used the product, a date in which the user used the product, an order in which the user used the product relative to other products, indications of the other products used with the product, a duration in which the user used the product, or a manner in which the user used the product.
17. The server device of claim 14, wherein to receive activity data for an activity, the instructions cause the server device to:
receive audio characteristics for an area within the user's dwelling; and
determine the activity by the user based on the audio characteristics within the area.
18. The server device of claim 17, wherein to determine the activity by the user based on the audio characteristics within the area, the instructions cause the server device to:
obtain a plurality of acoustic signatures each corresponding to a different activity; and
compare the audio characteristics within the area to the plurality of acoustic signatures to identify the activity by the user.
19. A method for providing feedback regarding consumer habits, the method comprising:
identifying, via an environmental sensor communicatively coupled to a computing device, an activity by a user within the user's dwelling related to a product;
obtaining, by the computing device, at least one of: (i) activity data for the user, the activity data related to a frequency or duration of the activity performed by the user over time, (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time, or (iii) habits by the user;
generating, by the computing device, user feedback information associated with the product or related products based on at least one of: the activity data, the product use event data, or the habits by the user; and
providing, by the computing device, the user feedback information via a user interface or a communication interface to a mobile device of the user.
20. The method of claim 19, wherein identifying the activity by the user includes:
identifying, by the computing device, audio characteristics within an area that includes the computing device; and
identifying, by the computing device, the activity by the user based on the audio characteristics within the area.
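The acoustic-signature matching recited in claims 6 and 18 — comparing observed audio characteristics against a plurality of stored signatures, each corresponding to a different activity — can be sketched as a nearest-match lookup. The feature choices (volume, dominant frequency, tonal measure, per claim 8), the signature values, and the Euclidean nearest-neighbor rule below are illustrative assumptions, not details taken from the specification.

```python
# Illustrative sketch: identify an in-home activity by comparing
# observed audio characteristics to stored acoustic signatures.
# Signature vectors and activity labels are hypothetical.
from math import sqrt

# Each signature maps an activity label to a feature vector:
# [volume (0-1), dominant frequency (Hz), tonal measure (0-1)].
ACOUSTIC_SIGNATURES = {
    "brushing_teeth": [0.6, 260.0, 0.2],
    "shaving":        [0.5, 6000.0, 0.4],
    "hair_drying":    [0.9, 400.0, 0.1],
}

def identify_activity(audio_features):
    """Return the activity whose stored signature is closest
    (Euclidean distance) to the observed audio characteristics."""
    best_activity, best_distance = None, float("inf")
    for activity, signature in ACOUSTIC_SIGNATURES.items():
        distance = sqrt(sum((a - b) ** 2
                            for a, b in zip(audio_features, signature)))
        if distance < best_distance:
            best_activity, best_distance = activity, distance
    return best_activity
```

In practice the comparison could instead use a trained classifier, as claim 7 contemplates; the table lookup above simply makes the claim 6/18 signature-comparison step concrete.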
US16/897,316 2019-06-10 2020-06-10 Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results Abandoned US20200388374A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/897,316 US20200388374A1 (en) 2019-06-10 2020-06-10 Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962859427P 2019-06-10 2019-06-10
US16/897,316 US20200388374A1 (en) 2019-06-10 2020-06-10 Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results

Publications (1)

Publication Number Publication Date
US20200388374A1 true US20200388374A1 (en) 2020-12-10

Family

ID=71950854

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/897,316 Abandoned US20200388374A1 (en) 2019-06-10 2020-06-10 Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results
US16/897,793 Active US11544764B2 (en) 2019-06-10 2020-06-10 Method of generating user feedback information to enhance product use results

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/897,793 Active US11544764B2 (en) 2019-06-10 2020-06-10 Method of generating user feedback information to enhance product use results

Country Status (5)

Country Link
US (2) US20200388374A1 (en)
EP (1) EP3980962A1 (en)
JP (1) JP7319393B2 (en)
CN (1) CN113939840A (en)
WO (1) WO2020252498A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115687974B (en) * 2022-10-27 2023-06-09 深圳市黑金工业制造有限公司 Intelligent interactive blackboard application evaluation system and method based on big data

Family Cites Families (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5858333A (en) 1998-08-07 1999-01-12 Enamelon, Inc. Two-part oral products and methods of using same to remineralize teeth
US6509007B2 (en) 2001-03-19 2003-01-21 The Procter & Gamble Company Oral care kits and compositions
US7437344B2 (en) 2001-10-01 2008-10-14 L'oreal S.A. Use of artificial intelligence in providing beauty advice
US8071076B2 (en) 2002-05-28 2011-12-06 Oral Health Clinical Services Llc Oral lubricating and stain retarding compositions
US8557224B2 (en) 2003-07-15 2013-10-15 Kao Corporation Oral cavity composition
JP2006106951A (en) 2004-10-01 2006-04-20 Dainippon Printing Co Ltd Cosmetic management system
AT503625B1 (en) 2006-04-28 2013-10-15 Chemiefaser Lenzing Ag WATER-IRRADIZED PRODUCT CONTAINING CELLULASIC FIBERS
CA2655818A1 (en) 2006-06-16 2007-12-27 Tate & Lyle Ingredients Americas, Inc. Pullulan films and their use in edible packaging
WO2008080146A1 (en) 2006-12-26 2008-07-03 Discus Dental, Llc Disposable tongue scraper
US8728446B2 (en) 2008-06-03 2014-05-20 I Did It, Inc. Oral hygiene tablets and capsules for direct oral delivery of active ingredients
TWI404544B (en) 2008-08-11 2013-08-11 Colgate Palmolive Co Oral care compositions containing beads
JP5379499B2 (en) 2009-01-29 2013-12-25 リンテック株式会社 Swallow package and edible film assembly
KR101074271B1 (en) 2009-06-25 2011-10-17 (주)차바이오앤디오스텍 Fast dissolving oral dosage form containing steviosides as a taste masking agent
US9750669B2 (en) 2009-07-08 2017-09-05 Wayne R Solan Toothpaste droplets
CA2769636A1 (en) 2009-07-30 2011-02-03 The Procter & Gamble Company Oral care articles and methods
CN102958567B (en) 2010-06-30 2015-09-02 高露洁-棕榄公司 For sending the multilayer film of spice
MX2012015072A (en) 2010-07-02 2013-02-07 Procter & Gamble Dissolvable fibrous web structure article comprising active agents.
RU2613316C1 (en) 2010-07-02 2017-03-15 Дзе Проктер Энд Гэмбл Компани Methods of medical active agent delivery by injection of individual health articles containing filament
EP2588655B1 (en) 2010-07-02 2017-11-15 The Procter and Gamble Company Method for delivering an active agent
US20180163325A1 (en) 2016-12-09 2018-06-14 Robert Wayne Glenn, Jr. Dissolvable fibrous web structure article comprising active agents
CN104040061B (en) 2012-01-04 2019-11-08 宝洁公司 Fibre structure and its manufacturing method comprising particle
US9304736B1 (en) 2013-04-18 2016-04-05 Amazon Technologies, Inc. Voice controlled assistant with non-verbal code entry
US9656102B2 (en) 2013-04-23 2017-05-23 Rita Vaccaro Thin film toothpaste strip
RU2642781C2 (en) 2013-09-06 2018-01-26 Дзе Проктер Энд Гэмбл Компани Capsules, containing water-soluble fibre walls materials and methods of its manufacture
EP3041922A1 (en) 2013-09-06 2016-07-13 The Procter & Gamble Company Pouches comprising apertured film wall materials and methods for making same
CN105893721A (en) 2014-05-13 2016-08-24 陈威宇 Adaptive skin care information prompt system and adaptive skin care prompt method
FR3023110B1 (en) 2014-06-30 2017-10-13 Oreal METHOD FOR ANALYZING USER COSMETIC ROUTINES AND ASSOCIATED SYSTEM
EP3192022B1 (en) 2014-08-04 2019-04-17 Sarubbo, Davide A system for checking a correct oral hygiene procedure
DE112015004624T5 (en) 2014-10-10 2017-06-29 The Procter & Gamble Company Apertured fiber structures and methods of making the same
JP2018531437A (en) * 2015-06-15 2018-10-25 アミール,ハイム System and method for adaptive skin treatment
US20170024589A1 (en) 2015-07-22 2017-01-26 Robert Schumacher Smart Beauty Delivery System Linking Smart Products
US20190080385A1 (en) * 2015-12-28 2019-03-14 Koninklijke Philips N.V. System and method for providing a user with recommendations indicating a fitness level of one of more topical skin products with a personal care device
JP6710095B2 (en) 2016-02-15 2020-06-17 日本電信電話株式会社 Technical support device, method, program and system
CA3015492C (en) * 2016-03-21 2021-11-23 The Procter & Gamble Company Systems and methods for providing customized product recommendations
TWI585711B (en) * 2016-05-24 2017-06-01 泰金寶電通股份有限公司 Method for obtaining care information, method for sharing care information, and electronic apparatus therefor
JP7027314B2 (en) 2016-07-14 2022-03-01 株式会社 資生堂 A recording medium on which the advice information provision system and the advice information provision program are recorded.
JP6882496B2 (en) 2017-02-06 2021-06-02 ザ プロクター アンド ギャンブル カンパニーThe Procter & Gamble Company Laundry detergent sheet with microcapsules
EP3624765A1 (en) 2017-05-16 2020-03-25 The Procter and Gamble Company Conditioning hair care compositions in the form of dissolvable solid structures
US20190233974A1 (en) 2018-01-26 2019-08-01 The Procter & Gamble Company Process for Making an Article of Manufacture
US20190233970A1 (en) 2018-01-26 2019-08-01 The Procter & Gamble Company Process for Making an Article of Manufacture
EP3747017A1 (en) * 2018-01-29 2020-12-09 Atolla Skin Health, Inc. Systems and methods for formulating personalized skincare products
US20190043064A1 (en) 2018-03-29 2019-02-07 Intel Corporation Real-time qualitative analysis
US10095688B1 (en) 2018-04-02 2018-10-09 Josh Schilling Adaptive network querying system
CA3099960A1 (en) 2018-05-14 2019-11-21 The Procter & Gamble Company Dentifrice dispenser
US10835455B2 (en) 2018-05-14 2020-11-17 The Procter & Gamble Company Oral care compositions comprising metal ions
US20200143655A1 (en) 2018-11-06 2020-05-07 iEldra Inc. Smart activities monitoring (sam) processing of data
US20200388374A1 (en) 2019-06-10 2020-12-10 The Procter & Gamble Company Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results
AU2020290584B2 (en) 2019-06-13 2023-11-23 The Procter & Gamble Company Process for making a fibrous structure
CA3139979A1 (en) 2019-06-13 2020-12-17 The Procter & Gamble Company Pouches comprising oral care active agents
CA3139976A1 (en) 2019-06-13 2020-12-17 The Procter & Gamble Company Kits comprising unit-dose oral care compositions

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD947845S1 (en) * 2019-01-03 2022-04-05 The Procter & Gamble Company Smart hub for a beauty regimen
USD960155S1 (en) 2019-01-03 2022-08-09 The Procter & Gamble Company Smart hub for a beauty regimen
US11544764B2 (en) 2019-06-10 2023-01-03 The Procter & Gamble Company Method of generating user feedback information to enhance product use results
US20220160485A1 (en) * 2020-11-23 2022-05-26 Colgate-Palmolive Company Personal Care System, Device, and Method Thereof

Also Published As

Publication number Publication date
JP2022535823A (en) 2022-08-10
US11544764B2 (en) 2023-01-03
US20200387942A1 (en) 2020-12-10
WO2020252498A1 (en) 2020-12-17
CN113939840A (en) 2022-01-14
EP3980962A1 (en) 2022-04-13
JP7319393B2 (en) 2023-08-01

Similar Documents

Publication Publication Date Title
US20200388374A1 (en) Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results
US20170011258A1 (en) Image analysis in support of robotic manipulation
KR102619221B1 (en) Machine-implemented facial health and beauty aids
JP6956389B2 (en) Makeup support device and makeup support method
CN108153169A (en) Guide to visitors mode switching method, system and guide to visitors robot
EP3579176A1 (en) Makeup evaluation system and operation method thereof
US11354882B2 (en) Image alignment method and device therefor
EP2915101A1 (en) Method and system for predicting personality traits, capabilities and suggested interactions from images of a person
WO2015122195A1 (en) Impression analysis device, game device, health management device, advertising support device, impression analysis system, impression analysis method, program, and program recording medium
GB2530515A (en) Apparatus and method of user interaction
CN117242482A (en) Digital imaging and learning system and method for analyzing pixel data of scalp region of user's scalp to generate one or more user-specific scalp classifications
CN116547721A (en) Digital imaging and learning system and method for analyzing pixel data of an image of a hair region of a user's head to generate one or more user-specific recommendations
KR20220126909A (en) Cosmetic recommendation system based on artificial intelligence-based customized personal color checkup
WO2018029963A1 (en) Make-up assistance apparatus and make-up assistance method
KR102271063B1 (en) Method for performing virtual fitting, apparatus and system thereof
Sethukkarasi et al. Interactive mirror for smart home
CN111064766A (en) Information pushing method and device based on Internet of things operating system and storage medium
KR20230044583A (en) Recording medium on which hair style simulation program is recorded
KR20220099491A (en) Customized cosmetic providing server and method using machine learning
KR20240011324A (en) Customized Makeup Techniques Recommended Display System for Individuals' Daily Emotional Information and Facial Skin Conditions
KR20230044587A (en) Apparatus for providing virtual hair style experience service according to hair style search terms and method of operation thereof
KR20230045635A (en) An apparatus for providing a hair style recommendation service in consideration of fashion elements worn by a user and an operating method thereof
KR20230045632A (en) A method for the operation of an artificial intelligence model-based realistic hair style simulation device
KR20230045633A (en) A computer program for simulating hairstyles
KR20230044584A (en) An apparatus for generating a bald head transformation for hair simulation

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE PROCTER & GAMBLE COMPANY, OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KREUZER, MELISSA ANN;SHERMAN, FAIZ FEISAL;PARKER, JUSTIN GREGORY;AND OTHERS;SIGNING DATES FROM 20200610 TO 20200612;REEL/FRAME:052924/0187

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION