US20200388374A1 - Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results
- Publication number
- US20200388374A1 (Application No. US16/897,316)
- Authority
- US
- United States
- Prior art keywords
- user
- product
- activity
- personal care
- computing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0623—Item investigation
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
- A45D44/005—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0255—Targeted advertisements based on user history
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0269—Targeted advertisements based on user profile or attribute
- G06Q30/0271—Personalized advertisement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0282—Rating or review of business operators or products
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0631—Item recommendations
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/70—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/2866—Architectures; Arrangements
- H04L67/30—Profiles
- H04L67/306—User profiles
Definitions
- the present disclosure generally relates to personal care systems, and, more particularly, to a personal care assistant for identifying instances of use of personal care products and providing feedback to a user to enhance the user's experience with the personal care products.
- home assistant devices or other computing devices collect data from network-enabled devices to enhance the users' experiences with the network-enabled devices.
- a home assistant device may learn a user's habits based on the user's interactions with other network-enabled devices, such as smart lights, a smart TV, a smart heating and air conditioning system, etc. The home assistant device may then automatically control the network-enabled devices according to the learned habits.
- a smart TV may provide indications of the user's watching habits to a remote server that provides recommendations on similar TV shows and movies to those the user is currently watching.
- Such devices do not have similar ways of learning habits based on user interactions with devices which are not network-enabled, such as personal care products. While users interact with personal care products, such as makeup, shampoo, conditioner, moisturizer, hand cream, face cream, toothbrushes, mouthwash, facial cleansers, etc., on a daily basis, computing devices do not collect usage data based on users' interactions with these products to enhance the user experience. Accordingly, users do not know if they are using the products correctly and at the appropriate rate or for the appropriate amount of time.
- a personal care system includes a personal care computing device that obtains indications of personal care products being used by a user.
- the personal care computing device identifies a personal care product based on an obtained indication and provides user feedback to assist the user in using the personal care product.
- the personal care computing device may also determine product use event data based on the user's interaction with the personal care product.
- the product use event data may include identification information for the personal care product such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product is used, other personal care products used in the same time frame as the personal care product, etc.
- the personal care computing device may provide the product use event data to a server computing device which stores historical product use event data for the user in a user profile.
- the personal care computing device may also provide user profile data for the user to the server computing device, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product.
- the server computing device may analyze the product use event data and the historical product use event data at several instances in time along with the user profile data to generate the user feedback. For example, the server computing device may determine that the user is using a skin care product once a week based on the product use event data.
- the server computing device may also determine the user's age according to the user profile data, and may determine that people in the user's age group should be using the skin care product more often. Accordingly, the server computing device may generate user feedback indicating that the user should use the skin care product at least twice per week. In some scenarios, the user feedback may also include recommendations to purchase other related personal care products.
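- As an illustration of this kind of rule, the following sketch compares a user's observed usage frequency against an assumed age-group recommendation and emits a feedback string; the helper names and recommendation values are hypothetical, not taken from the disclosure.

```python
from datetime import datetime, timedelta

# Hypothetical age-group recommendations (uses per week); illustrative values only.
RECOMMENDED_USES_PER_WEEK = {(18, 29): 2, (30, 49): 3, (50, 120): 4}

def weekly_frequency(use_timestamps, weeks=4):
    """Estimate uses per week from historical product use event timestamps."""
    cutoff = datetime.now() - timedelta(weeks=weeks)
    recent = [t for t in use_timestamps if t >= cutoff]
    return len(recent) / weeks

def generate_feedback(product_name, use_timestamps, user_age):
    """Generate user feedback by comparing observed frequency to an age-group recommendation."""
    observed = weekly_frequency(use_timestamps)
    for (low, high), recommended in RECOMMENDED_USES_PER_WEEK.items():
        if low <= user_age <= high and observed < recommended:
            return (f"You used {product_name} about {observed:.1f} times per week; "
                    f"people in your age group typically benefit from using it "
                    f"at least {recommended} times per week.")
    return f"Your current use of {product_name} looks on track."
```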
- the personal care computing device may present the user feedback via a user interface on the personal care computing device or as audio feedback via a speaker.
- the personal care computing device may provide the user feedback to the user's mobile device which may be presented via a personal care application on the mobile device.
- the server computing device may provide the user feedback to the user's mobile device via a short message service (SMS) message, email, or push notification.
- the personal care system collects and analyzes user data from personal care products which do not include a sensor, do not connect to the Internet, and/or do not include computing devices. Accordingly, the personal care system may digitize data from analog products.
- a computing device for providing feedback regarding consumer habits includes a user interface, an environmental sensor, a communication interface, one or more processors, and a non-transitory computer-readable memory coupled to the one or more processors, the environmental sensor, the user interface, and the communication interface, and storing instructions thereon.
- the instructions when executed by the one or more processors, cause the computing device to identify, via the environmental sensor, an activity by a user within the user's dwelling related to a product, and obtain at least one of: (i) activity data for the user, the activity data related to a frequency or duration of the activity performed by the user over time or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time.
- the instructions further cause the computing device to generate user feedback information associated with the product or related products based on at least one of: the activity data or the product use event data, and provide the user feedback information via the user interface or the communication interface to a mobile device of the user.
- a server device for providing feedback regarding consumer habits includes one or more processors, and a non-transitory computer-readable memory coupled to the one or more processors and storing instructions thereon.
- the instructions when executed by the one or more processors, cause the server device to receive, at one or more time intervals, at least one of: (i) activity data for an activity performed by a user within the user's dwelling related to a product, the activity data related to a frequency or duration of the activity performed by the user over time, or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time.
- the instructions further cause the server device to store the activity data and the product use event data in a user profile of the user and analyze at least one of: the activity data or the product use event data at the one or more time intervals to generate user feedback information associated with the product or related personal care products. Moreover, the instructions cause the server device to provide the user feedback information to a client device for presenting the user feedback information to the user.
- a method for providing feedback regarding consumer habits includes identifying, via an environmental sensor communicatively coupled to a computing device, an activity by a user within the user's dwelling related to a product, and obtaining, by the computing device, at least one of: (i) activity data for the user, the activity data related to a frequency or duration of the activity performed by the user over time or (ii) product use event data associated with the user for the product, the product use event data related to product usage of the product over time.
- the method further includes generating, by the computing device, user feedback information associated with the product or related products based on at least one of: the activity data or the product use event data.
- the method includes providing, by the computing device, the user feedback information via a user interface or a communication interface to a mobile device of the user.
- FIG. 1 illustrates an example personal care computing device and a personal care product
- FIG. 2 illustrates a block diagram of an example communication system in which the personal care computing device can operate
- FIG. 3 illustrates an example data table including user profile data
- FIG. 4 illustrates another example data table including product use event data
- FIG. 5 illustrates example user feedback which may be provided by the personal care system to the user
- FIG. 6 illustrates a flow diagram of an example method for providing feedback regarding personal care products, which can be implemented in the personal care computing device.
- FIG. 7 illustrates a flow diagram of an example method for generating the feedback regarding personal care products, which can be implemented in a server device.
- personal care products may be used to refer to consumer products which are typically used in a bathroom, laundry room, or kitchen.
- personal care products may include tooth care products (e.g., toothbrushes, mouthwash, dental floss, etc.), skin care products (e.g., hand cream, face cream, facial cleansers, moisturizer, etc.), cosmetic products (e.g., face makeup, eye makeup, lipstick, makeup brushes, makeup kits, makeup mirrors, etc.), hair care products (e.g., shampoo, conditioner, hair dryers, straighteners, brushes, combs, curlers, spray gels, etc.), other grooming products (e.g., razors, hair removal products, etc.), toilet paper, cleaning products (e.g., bleach, window cleaner, all-purpose cleaner, soap, toilet bowl cleaner, etc.), laundry room products (e.g., laundry detergent, stain removal products, etc.), kitchen products (e.g., plates, bowls, forks, spoons, knives, measuring cups, etc.), etc.
- consumer habits may refer to usage of consumer products by a user, a hygiene regimen by the user including an order in which a set of consumer products were used when the set of consumer products were used in the same time frame, an amount in which the user complies with product instructions, grooming patterns for the user, etc.
- a personal care computing device identifies a personal care product which is being used by a user, and determines product use event data for the personal care product, such as identification information for the personal care product such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product is used, other personal care products used in the same time frame as the personal care product, etc.
- the personal care computing device may then provide identification information for the user (e.g., a user ID, user login credentials, etc.) and the product use event data to a server device.
- the server device may then retrieve a user profile for the user based on the identification information and update the user profile to include the product use event data.
- the personal care computing device may also provide user profile data to the server device, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. Accordingly, the server device may update the user profile with the user profile data.
- the user profile may include product use event data for the user at several time intervals, and the server device may analyze the product use event data over time and/or the user profile data for the user to generate user feedback information. Then the server device provides the user feedback information to the personal care computing device which presents audio feedback via a speaker or visual feedback via a user interface. In other implementations, the personal care computing device forwards the user feedback information to a client computing device of the user for presentation on the client computing device, or the server device provides the user feedback information directly to the client computing device, for example via an SMS message, email, a push notification, etc.
- FIG. 1 illustrates various aspects of an exemplary environment implementing a personal care system 100 .
- the personal care system 100 includes a personal care computing device 102 which may be placed in a bathroom, such as on a bathroom sink.
- the personal care system 100 also includes one or several personal care products 104 .
- the personal care computing device 102 includes a voice assistant having one or several microphones 106 , such as an array of microphones 106 and one or several speakers 108 , such as an array of speakers 108 .
- the voice assistant may also include processors and a memory storing instructions for receiving and analyzing voice input and providing voice output.
- the voice assistant included in the personal care computing device 102 may include the hardware and software components of the voice controlled assistant described in U.S. Pat. No. 9,304,736 filed on Apr. 18, 2013, incorporated by reference herein.
- the personal care computing device 102 includes a user interface 110 for displaying information related to the personal care products, such as user feedback information regarding personal care products.
- the user interface 110 may also present user controls for the user to provide information about herself, such as identification information (e.g., user login credentials, a user ID, biographical information, user preferences or goals regarding skin care, etc.).
- the user interface 110 may include user controls for the user to provide information regarding the personal care products she uses, such as the names of the personal care products, how often she uses the personal care products, the manner in which she uses each personal care product, the duration of each use, etc.
- the personal care computing device 102 may include a camera 112 for capturing video and/or images of the area within the field of view of the camera 112 . In this manner, the personal care computing device 102 may identify personal care products 104 within an image or video frame to determine that a personal care product 104 is currently in use, determine the duration of the use, etc.
- the personal care computing device 102 may also include a communication interface (not shown) for connecting to a long-range communication network such as the Internet and for transmitting/receiving radio signals over a short-range communication network, such as NFC, Bluetooth, RFID, Wi-Fi, etc.
- the personal care computing device 102 may include an RFID reader or an NFC reader to receive radio signals from RFID tags, NFC tags, Bluetooth Low Energy (BLE) tags, etc.
- the personal care product 104 includes a radio identification tag (not shown), such as an RFID tag, NFC tag, BLE tag, etc., which transmits identification information for the personal care products to the RFID reader in the personal care computing device 102 .
- the personal care computing device 102 may identify a personal care product within a communication range of the personal care computing device 102 based on the radio identification tag and may determine that the identified personal care product is being used by the user.
- the radio identification tag may be a passive radio identification tag, such that the radio identification tag does not include an internal power source such as a battery.
- the RFID or NFC reader within the communication range of the radio identification tag provides electromagnetic signals that energize the radio identification tag so that the radio identification tag can transmit a radio signal to the RFID or NFC reader which includes identification information for the personal care product 104 .
- the personal care product 104 does not include a radio identification tag or any other transceiver.
- the personal care computing device 102 identifies the personal care product 104 in other ways, such as by identifying visual features within the personal care product 104 from images or video collected by the camera 112 which can be used to identify the personal care product 104 , identifying labels, barcodes, or other text placed on the personal care product from the images or video, or obtaining an indication that the user is using the personal care product 104 via user controls on the user interface 110 or via the user's mobile device.
- the personal care computing device 102 includes an environmental sensor for capturing environmental characteristics in the area surrounding the personal care computing device 102 , such as the bathroom, the kitchen, the laundry room, the living room, etc. of the user's dwelling.
- the environmental sensor may be a temperature sensor, a humidity sensor, an acoustic sensor, an ultrasonic sensor, a radio antenna for example for receiving Wi-Fi or BluetoothTM signals, a weighing scale, a wearable sensor, an air quality sensor such as a volatile organic compounds (VOC) sensor, or a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a light detection and ranging (LiDAR) sensor or an infrared (IR) sensor, each of which may be used in combination with the camera 112 to generate the 3D point cloud.
- the acoustic sensor may include the one or several microphones 106 , such as an array of microphones 106 for detecting audio characteristics, such as the volume of sounds within the area, the frequency of the sounds within the area, the tone of the sounds within the area, and/or the directions in which the sounds came from within the area.
- the personal care computing device 102 may identify activities being performed by the user based on the environmental sensor.
- the personal care computing device 102 may identify activities being performed by the user based on sounds within the area, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc.
- the activities may be related to products.
- the shower running may be related to hair care or skin care products.
- the washing machine running may be related to laundry room products, such as laundry detergent, stain removal products, etc.
- Gargling may be related to tooth care products (e.g., toothbrushes, mouthwash, dental floss, etc.).
- the personal care computing device 102 may identify activity data for each activity, such as the type of activity (e.g., shaving), the duration of the activity, the date and/or time of the activity, the frequency in which the user performs the activity over a time period (e.g., day, a week, a month), etc.
- the personal care computing device 102 may identify personal care products based on any suitable combination of visual features within the personal care products from images or video collected by the camera 112 which can be used to identify the personal care products, labels, barcodes, or other text placed on the personal care products from the images or video, an indication that the user is using the personal care products via user controls on the user interface 110 or via the user's mobile device, a radio identification tag such as an RFID tag, NFC tag, BLE tag, etc., which transmits identification information for the personal care products, and/or environmental characteristics in the area surrounding the personal care computing device 102 which may be used to identify activities performed by the user that are related to the personal care products.
- FIG. 2 illustrates an example communication system in which the personal care computing device 102 and the personal care product 104 can operate to enhance the user's experience with personal care products.
- the personal care computing device 102 has access to a wide area communication network 200 such as the Internet via a long-range wireless communication link (e.g., a cellular link).
- the personal care computing device 102 communicates with a server device 202 that generates user feedback information to provide to the user based on the user's interactions with her personal care products 104 .
- the personal care computing device 102 can communicate with any number of suitable servers.
- the personal care computing device 102 can also use a variety of arrangements, singly or in combination, to communicate with the user's personal care products 104 .
- the personal care computing device 102 obtains identification information from the user's personal care products 104 via a short-range communication link, such as short-range radio frequency links including Bluetooth™, RFID, NFC, etc.
- Some personal care products 104 may include a communication component 130 , such as an RFID tag, NFC tag, BLE tag, etc. Other personal care products 104 may not include the communication component 130 .
- the personal care computing device 102 may also communicate with a client computing device 222 of the user such as a mobile device including a tablet or smartphone over a short-range communication link, such as short-range radio frequency links including Bluetooth™, WiFi (802.11 based or the like) or another type of radio frequency link, such as wireless USB.
- the client computing device 222 may be a mobile device such as a tablet computer, a cell phone, a personal digital assistant (PDA), a smartphone, a laptop computer, a portable media player, a home phone, a pager, a wearable computing device, smart glasses, a smart watch or bracelet, a phablet, another smart device, etc.
- the client computing device 222 may also be a desktop computer.
- the client computing device 222 may include one or more processors 226 , a memory 228 , a communication unit (not shown) to transmit and receive data via long-range and short-range communication networks, and a user interface 232 for presenting data to the user.
- the memory 228 may store, for example, instructions for a personal care application 230 that includes user controls for providing information regarding the user's personal care products, such as the names of the user's personal care products, the frequency, duration, and/or manner in which the user uses each personal care product, etc.
- the personal care application 230 may also include user controls for providing user profile data such as user login credentials, a user ID, the user's name or other biographical information, an image of the user such as a before and after picture, etc.
- the personal care application 230 may receive user feedback information to present on the user interface 232 or as voice output via a speaker.
- the user feedback information may be received from the personal care computing device 102 via a short-range communication link, such as Bluetooth™, or from the server device 202 via a long-range communication link, such as the Internet or a cellular network.
- the personal care computing device 102 may include one or more speakers 108 such as an array of speakers, an environmental sensor, which may include any one of or any suitable combination of a temperature sensor, a humidity sensor, an ultrasonic sensor, a radio antenna for example for receiving Wi-Fi or Bluetooth signals, a weighing scale, a wearable sensor, an air quality sensor such as a VOC sensor, a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a LiDAR sensor or an IR sensor, each of which may be used in combination with the camera 112 to generate the 3D point cloud, and/or one or more microphones 106 such as an array of microphones.
- the personal care computing device 102 may also include a user interface 110 , a camera 112 , one or more processors 114 , a communication unit 116 to transmit and receive data over long-range and short-range communication networks, and a memory 118 .
- the memory 118 can store instructions of an operating system 120 and a personal care assistant application 122 .
- the personal care assistant application 122 may obtain an indication of a personal care product 104 being used, identify the personal care product 104 based on the indication, and generate and present user feedback information to the user to assist the user with the personal care product or related personal care products via a product identification module 124 , a recommendation determination module 126 , and a control module 128 .
- the personal care computing device 102 may obtain an indication of a personal care product 104 being used and the product identification module 124 may identify the personal care product 104 based on the obtained indication.
- the indication of the personal care product 104 may be provided with manual input via user controls on the user interface 110 of the personal care computing device 102 .
- the user may select the personal care product 104 from a list of personal care products included in a drop-down menu on the user interface 110 .
- the product identification module 124 may then identify the selected personal care product 104 via the user controls.
- the indication of the personal care product 104 may also be provided automatically, such as via a radio signal from the personal care product 104 , an image or video of the personal care product 104 , or environmental characteristics indicative of an activity performed by the user which is related to the personal care product 104 , as described below. More specifically, the indication of the personal care product 104 may be identification information from a radio identification tag provided by the personal care product 104 to the personal care computing device 102 . The product identification module 124 may then determine the personal care product 104 transmitting the radio signal based on the identification information. For example, the identification information may indicate that the personal care product 104 transmitting the radio signal is L'Oréal Paris™ Colour Riche Monos Eyeshadow.
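- A minimal sketch of this tag-based lookup, assuming the reader exposes the tag's identifier as a string and that a local table maps tag IDs to product names (both are assumptions of this sketch; the disclosure does not specify a data format):

```python
from typing import Optional

# Hypothetical mapping from radio identification tag IDs to product names.
TAG_TO_PRODUCT = {
    "04:A2:3B:19": "L'Oréal Paris Colour Riche Monos Eyeshadow",
    "04:7F:C0:55": "Example moisturizer",
}

def identify_product_from_tag(tag_id: str) -> Optional[str]:
    """Return the product name for a tag ID read by the RFID/NFC reader, if known."""
    return TAG_TO_PRODUCT.get(tag_id)
```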
- the indication of the personal care product 104 may be an image or video of the area within the field of view of the camera 112 .
- the camera 112 may periodically capture images or capture continuous video of the area in front of the camera 112 , which may include a bathroom counter or an area where a user may sit in front of a bathroom mirror.
- the product identification module 124 may identify an object and determine a personal care product which corresponds to the object based on visual descriptors and semantic cues for the object. At least some of the visual descriptors and semantic cues for the object may be based on a product tag, a product label, a product color, a product shape, a product size, or a product logo.
- an image or video frame may include multiple objects and the product identification module 124 may determine personal care products which correspond to each object.
- the product identification module 124 may segment boundaries for the objects using edge detection, pixel entropy, or other image processing techniques. For example, when adjacent pixels in an image differ in intensity by more than a threshold amount, the product identification module 124 may identify the intersection between the adjacent pixels as a boundary of an object. In another example, when a cluster of pixels in the image differs in intensity by more than a threshold amount from an adjacent cluster of pixels, the product identification module 124 may identify the intersection between the adjacent clusters as a boundary of an object. In addition to performing the edge detection techniques described above to identify the boundaries of an object, the product identification module 124 may use an active contour model to refine the locations of the boundaries and further remove noise.
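- A compact sketch of this boundary segmentation step using off-the-shelf OpenCV routines (Canny edge detection followed by contour extraction); the specific thresholds and minimum area are placeholders rather than values from the disclosure.

```python
import cv2

def segment_object_boundaries(image_bgr, low_thresh=50, high_thresh=150):
    """Find candidate object boundaries in a camera frame via edge detection."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress noise before edge detection
    edges = cv2.Canny(blurred, low_thresh, high_thresh)  # mark pixels where intensity changes sharply
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep reasonably large contours as candidate object boundaries.
    return [c for c in contours if cv2.contourArea(c) > 500]
```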
- the product identification module 124 may identify each of the objects in the image. For each identified object, the product identification module 124 may determine a size and shape of the object according to its boundaries. The product identification module 124 may also identify visual features within the object along with the corresponding locations of the visual features within the object. For example, a first visual feature may be located in the upper right corner of the object, a second visual feature may be located in the center of the object, etc.
- a visual feature may include a keypoint which is a stable region within the object that is detectable regardless of blur, motion, distortion, orientation, illumination, scaling, and/or other changes in camera perspective.
- the stable regions may be extracted from the object using a scale-invariant feature transform (SIFT), speeded up robust features (SURF), fast retina keypoint (FREAK), binary robust invariant scalable keypoints (BRISK), or any other suitable computer vision techniques.
- keypoints may be located at high-contrast regions of the object, such as edges within the object.
- a bounding box may be formed around a keypoint and the portion of the object created by the bounding box may be a visual feature.
- each visual feature is encoded as a vector which may include attributes of the visual feature, such as RGB pixel values, the location of the visual feature within the object, etc.
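- One way to realize the keypoint extraction and vector encoding described above is sketched below with ORB features from OpenCV (SIFT, SURF, FREAK, or BRISK could be substituted); packing the normalized keypoint location together with the descriptor into one vector is an illustrative choice, not the disclosed encoding.

```python
import cv2
import numpy as np

def extract_feature_vectors(object_image_bgr, max_keypoints=200):
    """Detect keypoints in a cropped object image and encode each as a vector."""
    gray = cv2.cvtColor(object_image_bgr, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=max_keypoints)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:
        return []
    h, w = gray.shape
    vectors = []
    for kp, desc in zip(keypoints, descriptors):
        # Normalized location of the visual feature within the object plus its descriptor values.
        location = np.array([kp.pt[0] / w, kp.pt[1] / h])
        vectors.append(np.concatenate([location, desc.astype(np.float32)]))
    return vectors
```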
- the product identification module 124 may identify semantic cues for the object, such as text displayed on the object (e.g., a product label), a tag on or adjacent to the object, a pattern or symbol on the object (e.g., a product logo), etc.
- the product identification module 124 may apply a stroke width transform (SWT).
- the SWT is used to find a portion of an image which includes text and filter out the remaining portions of the image which do not include text. In this manner, the text portion of the image may be converted to a text string.
- the SWT technique may be based on an assumption that all text characters in an image have the same stroke width.
- the pixel width of the horizontal line in the letter ‘T’ may be the same as the pixel width for the vertical line in the letter ‘T’ within the image. This width may also be the same for all other lines or curves that make up text characters within the image.
- the product identification module 124 may identify text characters within an image by identifying several lines or curves having a same or similar width (e.g., within a threshold variance of each other). More specifically, the product identification module 124 may perform edge detection techniques within one of the objects, such as the edge detection techniques described above for boundary segmentation, to identify boundaries for lines and curves within the object. The product identification module 124 may then calculate pixel widths for each of these lines and curves based on the positions of their respective boundaries. When the pixel widths for several lines and/or curves are the same or are within a threshold variance of each other, the product identification module 124 may identify the lines and/or curves as text, and may filter out the remaining portions of the object.
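- The width-consistency test described above can be sketched as a simple filter over candidate strokes whose pixel widths have already been measured from their boundaries (the measurement itself is elided, and the field names and tolerance are assumptions of this sketch):

```python
import statistics

def filter_text_candidates(candidate_strokes, max_relative_variance=0.25):
    """Keep candidate lines/curves whose pixel widths agree within a tolerance.

    candidate_strokes: list of dicts like {"id": ..., "width_px": float}
    """
    widths = [s["width_px"] for s in candidate_strokes]
    if len(widths) < 2:
        return []
    median_width = statistics.median(widths)
    tolerance = max_relative_variance * median_width
    # Strokes far from the common width are unlikely to be text characters.
    return [s for s in candidate_strokes
            if abs(s["width_px"] - median_width) <= tolerance]
```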
- Additional filtering steps may also be applied to identify the text characters within the image.
- text characters may have minimum and maximum aspect ratios, such that the length of a text character does not exceed the width of the text character by more than a threshold amount. Accordingly, the identified lines and/or curves may be compared to minimum and maximum aspect ratios. If the length to width ratio of a candidate text character is outside the minimum or maximum aspect ratios, the candidate text character may be filtered out as a portion of the image which does not include text.
- a threshold ratio between the diameter of a text character and the text character's average stroke width may also be used to filter out portions of the image which do not include text. For example, if the product identification module 124 identifies a portion of an image which resembles the letter ‘O’, the product identification module 124 may calculate the ratio of the diameter for the candidate text character to the average stroke width. When the ratio is less than the threshold ratio by more than a threshold variance (e.g., the candidate text character is donut-shaped) or the ratio is more than the threshold ratio by more than the threshold variance, the candidate text character may be filtered out as a portion of the image which does not include text.
- the product identification module 124 may filter out candidate text characters having less than a minimum threshold size or greater than a maximum threshold size (e.g., a minimum height of 8 pixels and a maximum height of 300 pixels).
- other filtering steps may also be applied such as filtering overlapping bounding boxes, or any other suitable filtering steps.
- the product identification module 124 may also use the SWT to identify words. For example, all text characters in a word may have the same color, may be spaced apart evenly, may be within a threshold distance from each other, and may be the same height or have height differences which are less than a threshold amount. Accordingly, the product identification module 124 may identify words by grouping identified text characters having the same color, that are within a threshold height difference of each other, that are within a threshold distance of each other, and/or that are spaced apart by the same distance.
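- A sketch of the word-grouping heuristic, assuming each detected character carries its color, height, horizontal position, and recognized glyph (the field names and pixel thresholds are illustrative, not taken from the disclosure):

```python
def group_characters_into_words(chars, max_gap_px=15, max_height_diff_px=4):
    """Group detected text characters into words by proximity, height, and color.

    chars: list of dicts like {"x": int, "height": int, "color": tuple, "text": str}
    """
    chars = sorted(chars, key=lambda c: c["x"])
    words, current = [], []
    for ch in chars:
        # Start a new word when spacing, height, or color breaks the grouping rules.
        if current and (ch["x"] - current[-1]["x"] > max_gap_px
                        or abs(ch["height"] - current[-1]["height"]) > max_height_diff_px
                        or ch["color"] != current[-1]["color"]):
            words.append(current)
            current = []
        current.append(ch)
    if current:
        words.append(current)
    return ["".join(c["text"] for c in w) for w in words]
```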
- the product identification module 124 may use Maximally Stable Extremal Regions (MSER) techniques to identify text within an object or may use a combination of SWT and MSER to identify the text.
- the portion of the object containing text may be provided to an optical character recognition (OCR) engine which may convert an image (e.g., the portion of the object containing text) to a text string.
- the product identification module 124 may identify a barcode or QR code within the identified object and may decode the barcode or QR code converting the barcode or QR code to a text string or other data stream which may be used as a semantic cue.
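- For the OCR and barcode steps, widely used open-source libraries can stand in for the unnamed engines; pytesseract and pyzbar are this sketch's choices and are not named in the disclosure.

```python
import cv2
import pytesseract
from pyzbar import pyzbar

def read_semantic_cues(object_image_bgr):
    """Extract text strings and decoded barcode/QR payloads from a cropped object image."""
    gray = cv2.cvtColor(object_image_bgr, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray).strip()                   # OCR over the text-bearing region
    barcodes = [b.data.decode("utf-8") for b in pyzbar.decode(gray)]   # barcode / QR payloads as strings
    return {"text": text, "barcodes": barcodes}
```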
- the product identification module 124 may compare each of the visual features, semantic cues, and/or other visual characteristics for the object to visual descriptors, semantic cues, and/or other visual characteristics for templates of personal care products to determine a likelihood that the object corresponds to one of the personal care products.
- the personal care computing device 102 may store the templates of personal care products in a database. Each template may include the visual features, semantic cues, and/or other visual characteristics for the template personal care product.
- each identified text string for an object may be compared to text strings in the templates of personal care products to determine likelihoods that the object corresponds to each template personal care product.
- the personal care product having the highest likelihood for the object or having a likelihood that exceeds a likelihood threshold may be identified as the personal care product corresponding to the object.
- the product identification module 124 may generate a machine learning model for identifying personal care products based on visual features and semantic cues using image classification and/or machine learning techniques.
- the machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the visual features and semantic cues for the object to the machine learning model to identify the personal care product corresponding to the object.
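- One hedged realization of such a model, using a random-forest classifier from scikit-learn over fixed-length feature vectors; aggregating per-keypoint vectors into a single fixed-length input per sample is an assumption of this sketch, and the disclosure does not prescribe this particular library or estimator.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_product_classifier(feature_matrix, product_labels):
    """Train a classifier mapping aggregated visual/semantic feature vectors to product names.

    feature_matrix: (n_samples, n_features) array of fixed-length vectors
    product_labels: list of product name strings, one per sample
    """
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(np.asarray(feature_matrix), product_labels)
    return model

def identify_product(model, feature_vector):
    """Return the most likely product name and its estimated probability."""
    probs = model.predict_proba([feature_vector])[0]
    best = int(np.argmax(probs))
    return model.classes_[best], float(probs[best])
```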
- the template features and template semantic cues may be compared to the features and semantic cues for an object using a nearest neighbors algorithm.
- the nearest neighbors algorithm may identify template features and template semantic cues which are the closest to the features of the object by creating numerical representations of the features and semantic cues to generate feature vectors, such as a pixel width and height of a personal care product, and RGB pixel values for the personal care product, for example.
- the numerical representations of the features or feature vectors of the object may be compared to the feature vectors of template personal care products to determine a vector distance between the features of the object and each template personal care product.
- a semantic cue for the object such as text may be compared to text in the template personal care products to identify the amount of matching text characters, words, or symbols to determine a vector distance between the semantic cues of the object and each template personal care product.
- the product identification module 124 may generate vector distances for each vector (e.g., each visual feature and semantic cue) and combine the individual vector distances to generate an overall vector distance between the object and a particular template personal care product.
- the product identification module 124 may then identify the personal care product which corresponds to the object based on the amount of similarity, or the vector distance in the nearest neighbors algorithm, between the visual features and semantic cues for the object and the visual features and semantic cues for the template personal care products.
- the product identification module 124 may identify the template personal care product having the smallest overall vector distance between the object and the template personal care product as the template personal care product corresponding to the object.
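- The per-feature distances and their combination into an overall object-to-template distance can be sketched as follows; equal weighting of the individual distances is an assumption, since the disclosure leaves the combination unspecified.

```python
import numpy as np

def overall_vector_distance(object_vectors, template_vectors):
    """Combine per-feature nearest-neighbor distances into one object-to-template distance."""
    distances = []
    for vec in object_vectors:
        # Distance from this object feature to its closest template feature.
        per_template = [np.linalg.norm(vec - t) for t in template_vectors]
        distances.append(min(per_template))
    return float(np.mean(distances))  # equal weighting across features (assumption)

def best_matching_template(object_vectors, templates):
    """templates: dict mapping product name -> list of template feature vectors."""
    scored = {name: overall_vector_distance(object_vectors, vecs)
              for name, vecs in templates.items()}
    # The template with the smallest overall distance is the identified product.
    return min(scored, key=scored.get)
```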
- the product identification module 124 may provide images or video of the area within the field of view of the camera 112 to the server device 202 which may identify an object and determine a personal care product which corresponds to the object using similar techniques as described above. Then the server device 202 may provide the identified personal care products to the product identification module 124 .
- the product identification module 124 may identify consumer habits, such as product use event data for the personal care product 104 .
- the product use event data may include identification information for the personal care product 104 such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104 , etc.
- the product identification module 124 may determine the date and/or time of the use based on the date and/or time when the product identification module 124 identifies the personal care product 104 . For example, when the personal care computing device 102 receives identification information from a radio identification tag provided by the personal care product 104 , the product identification module 124 may record the date and/or time in which the identification information is received.
- the product identification module 124 may determine the duration of the use by determining when the personal care product 104 can no longer be identified. For example, the product identification module 124 may record the amount of time until the personal care computing device 102 stops receiving a radio signal from the personal care product 104 , until the personal care product 104 is no longer within the field of view of the camera 112 , etc.
- the product identification module 124 may identify other personal care products used in the same time frame as the personal care product 104 by identifying the other personal care products in a similar manner as described above, and comparing identification times for each of the other personal care products to the identification time for the personal care product. If another personal care product is identified within a threshold time period (e.g., 2 minutes, 5 minutes, 10 minutes, etc.) of the personal care product, the product identification module 124 may determine that the other personal care product was used in the same time frame as the personal care product 104 . For example, the product identification module 124 may determine that 5 personal care products were used within a ten minute time period, and thus may determine that each of the 5 personal care products was used in the same time frame. The product identification module 124 may also generate an order in which a set of personal care products were used when the set of personal care products were used in the same time frame.
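- A sketch of how product use events might be recorded and grouped into a shared time frame; the data class, field names, and 10-minute window mirror the example above but are illustrative choices.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ProductUseEvent:
    product_name: str
    start: datetime
    duration_s: float  # time until the product can no longer be identified

def group_same_time_frame(events, window=timedelta(minutes=10)):
    """Group use events whose start times fall within the same time window."""
    events = sorted(events, key=lambda e: e.start)
    groups, current = [], []
    for event in events:
        if current and event.start - current[0].start > window:
            groups.append(current)
            current = []
        current.append(event)
    if current:
        groups.append(current)
    return groups  # each group preserves the order in which the products were used
```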
- the personal care computing device 102 may present questions on the user interface 110 or via the speaker 108 which are related to the use of the identified personal care product 104 . Accordingly, the user may respond to the questions with voice responses which are received via the microphone or via user controls on the user interface 110 , such as drop-down menus, text fields, etc. For example, when the personal care product 104 is eye makeup, the personal care computing device 102 may ask which color is being used, where the eye makeup is being applied around the eye, etc. In some implementations, the product identification module 124 may determine the manner in which the personal care product 104 is being used based on the user's responses to the questions.
- the product identification module 124 may determine the manner in which the personal care product 104 is being used by analyzing the images or video from the camera 112 using computer vision techniques. For example, the product identification module 124 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may determine where the user is applying makeup, lipstick, moisturizer, etc., on her face.
- the personal care computing device 102 may obtain an indication of an activity being performed by the user.
- the indication of the activity may be obtained automatically, for example via the environmental sensor.
- the indication of the activity may be environmental characteristics within an area surrounding the personal computing device 102 , such as audio characteristics, temperature characteristics, visual characteristics, weight characteristics, air quality characteristics, or humidity characteristics.
- the environmental sensor may periodically capture sensor data within the area surrounding the personal computing device 102 (e.g., the living room, the kitchen, the bathroom, etc.), such as audio data, temperature data, humidity data, a 3D point cloud, air quality data, weight data, data received via a short-range communication link, etc. Then the personal computing device 102 may identify an activity based on sensor data characteristics from any one or any suitable combination of sensors.
- the environmental sensor may periodically capture audio data for a sound within the area.
- the personal computing device 102 may then compare the audio data for the sound to acoustic signatures for various types of activities, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc.
- Each acoustic signature may include a set of audio characteristics for a particular activity and/or sound, such as the volume of the sound, the frequency of the sound, the tone of the sound, the direction of the sound, etc.
- the personal computing device 102 may identify the type of activity by comparing the audio data to each acoustic signature for each type of activity to determine a likelihood that the sound corresponds to one of the activities.
- the type of activity having the highest likelihood for the sound or having a likelihood that exceeds a likelihood threshold may be identified as the type of activity corresponding to the sound.
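- The acoustic-signature comparison can be sketched as a distance between a measured audio feature vector and stored per-activity signatures; the four features used here (volume, frequency, tonality, direction) follow the description, while the signature values, normalization, and threshold are assumptions of this sketch.

```python
import numpy as np

# Hypothetical acoustic signatures: [volume_db, dominant_freq_hz, tonality, direction_deg]
ACOUSTIC_SIGNATURES = {
    "shower_running":  np.array([65.0, 500.0, 0.2, 90.0]),
    "toilet_flushing": np.array([70.0, 300.0, 0.3, 180.0]),
    "brushing_teeth":  np.array([55.0, 2000.0, 0.5, 45.0]),
}

def classify_sound(audio_features, min_likelihood=0.5):
    """Return the most likely activity for an observed audio feature vector, or None."""
    scores = {}
    for activity, signature in ACOUSTIC_SIGNATURES.items():
        # Normalized distance between the observed features and this activity's signature.
        distance = np.linalg.norm((audio_features - signature) / (signature + 1e-6))
        scores[activity] = 1.0 / (1.0 + distance)  # convert distance to a likelihood-like score
    best = max(scores, key=scores.get)
    return (best, scores[best]) if scores[best] >= min_likelihood else (None, scores[best])
```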
- the environmental sensor may periodically capture audio data for a sound within the area and may periodically capture temperature data within the area.
- the personal computing device 102 may then compare the audio data for the sound to acoustic signatures for various types of activities, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc., and may compare the temperature data to heat signatures for the various types of activities.
- Each acoustic signature may include a set of audio characteristics for a particular activity and/or sound, such as the volume of the sound, the frequency of the sound, the tone of the sound, the direction of the sound, etc.
- Each heat signature may include a set of temperature characteristics for a particular activity.
- the personal computing device 102 may identify the type of activity by comparing the audio data to each acoustic signature for each type of activity and the temperature data to each heat signature for each type of activity to determine a likelihood that the sound/temperatures correspond to one of the activities.
- the type of activity having the highest likelihood for the sound/temperatures or having a likelihood that exceeds a likelihood threshold may be identified as the type of activity corresponding to the sound/temperatures.
- the personal computing device 102 may obtain an indication of a type of area in which the personal computing device 102 is located, such as the bathroom, the kitchen, the laundry room, etc.
- the indication may be obtained from the user via user controls at the personal computing device 102 .
- the personal computing device 102 may adjust the likelihoods that a sound corresponds to one of several different activities based on the type of area in which the personal computing device 102 is located. For example, if the personal computing device 102 is in the laundry room, it may be more likely that a detected sound or set of environmental characteristics corresponds to the washing machine running than to the dishwasher running. Conversely, if the personal computing device 102 is in the kitchen, it may be more likely that a detected sound or set of environmental characteristics corresponds to the dishwasher running than to the washing machine running.
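- A minimal sketch of this room-based adjustment is shown below. The prior weights for each room are made-up values; the disclosure only states that likelihoods may be adjusted based on where the device is located.

```python
# Illustrative sketch of re-weighting activity likelihoods by room type.
ROOM_PRIORS = {
    "laundry_room": {"washing_machine_running": 3.0, "dishwasher_running": 0.3},
    "kitchen":      {"washing_machine_running": 0.3, "dishwasher_running": 3.0},
}

def adjust_for_room(likelihoods: dict, room: str) -> dict:
    priors = ROOM_PRIORS.get(room, {})
    # Multiply each activity likelihood by the prior for this room (default 1.0),
    # then renormalize so the adjusted values still sum to 1.
    adjusted = {a: p * priors.get(a, 1.0) for a, p in likelihoods.items()}
    total = sum(adjusted.values()) or 1.0
    return {a: p / total for a, p in adjusted.items()}

print(adjust_for_room({"washing_machine_running": 0.4, "dishwasher_running": 0.6},
                      "laundry_room"))
# -> washing machine now the more likely activity
```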
- the product identification module 124 may generate a machine learning model for identifying an activity based on sensor data captured by the environmental sensor using machine learning techniques.
- the machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the audio characteristics for the sound, the type of area where the personal computing device 102 is located, and/or other environmental characteristics detected within the area to the machine learning model to identify the activity corresponding to the sound and/or other environmental characteristics.
- the audio signatures may be compared to the audio characteristics for a sound using a nearest neighbors algorithm.
- the nearest neighbors algorithm may identify audio signatures which are the closest to the audio characteristics of the sound by creating numerical representations of the audio characteristics to generate feature vectors, such as a volume, frequency, tone, and direction, for example.
- the numerical representations of the features or feature vectors of the sound may be compared to the feature vectors of audio signatures of various types of activities to determine a vector distance between the features of the sound and each audio signature.
- the product identification module 124 may generate vector distances for each vector and combine the individual vector distances to generate an overall vector distance between the sound and an audio signature for a particular type of activity.
- the product identification module 124 may then identify the activity which corresponds to the sound based on the amount of similarity, or the vector distance in the nearest neighbors algorithm, between the features for the sound and the features of the audio signatures for the activities.
- the product identification module 124 may identify the audio signature for the type of activity having the smallest overall vector distance between the sound and the audio signature as the audio signature for the type of activity corresponding to the sound.
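- The nearest-neighbor comparison described above might be sketched as follows. The feature vectors (volume, frequency, tone, direction), their values, and the Euclidean combination of per-feature distances are assumptions for illustration; in practice the features would typically be normalized to comparable scales.

```python
# Sketch: each sound and each stored audio signature becomes a numeric feature
# vector; per-feature distances are combined into an overall distance and the
# closest signature wins.
import math

SIGNATURE_VECTORS = {
    "shower_running": [65.0, 500.0, 0.2, 180.0],
    "sink_running":   [55.0, 800.0, 0.3, 90.0],
    "gargling":       [50.0, 1200.0, 0.5, 45.0],
}

def overall_distance(a, b) -> float:
    # Combine the individual per-feature distances into one overall distance;
    # a Euclidean combination is assumed here.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_activity(sound_vector) -> str:
    return min(SIGNATURE_VECTORS,
               key=lambda name: overall_distance(sound_vector, SIGNATURE_VECTORS[name]))

print(nearest_activity([52.0, 1100.0, 0.45, 50.0]))  # -> "gargling"
```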
- the product identification module 124 may identify additional activity data, such as the date and/or time of the activity, the duration of the activity, etc.
- the product identification module 124 may also identify activity data based on previous activities performed by the user, such as the frequency of the activity over a particular time period.
- the product identification module 124 may also identify products related to the activity. For example, when the activity is running a washing machine or a dryer, the product identification module 124 may identify one or more laundry room products. When the activity is showering, the product identification module 124 may identify one or more hair care or skin care products. When the activity is the toilet flushing, the product identification module 124 may identify one or more bathroom products, such as toilet paper or cleaning products. When the activity is the dishwasher running, the product identification module 124 may identify one or more kitchen products, such as plates, bowls, forks, spoons, knives, dishwasher detergent, etc.
- the product identification module 124 may use the identified activity to identify the personal care product 104 being used by a user. More specifically, the product identification module 124 may generate the machine learning model for identifying personal care products based on visual features, semantic cues, and the type of activity being performed by the user. For example, two personal care products (e.g., mouthwash and moisturizer) may have similar likelihoods for corresponding to the object. When the product identification module 124 identifies gargling as the activity, the product identification module 124 may determine that the personal care product corresponding to the object is mouthwash.
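- A short sketch of this activity-based tie-breaking follows. The activity-to-product association table and the boost factor are assumptions for illustration only.

```python
# Sketch of using the identified activity to break a tie between visually
# similar products (e.g., mouthwash vs. moisturizer).
ACTIVITY_PRODUCT_HINTS = {
    "gargling": {"mouthwash"},
    "showering": {"shampoo", "conditioner", "body_wash"},
}

def pick_product(visual_likelihoods: dict, activity) -> str:
    hinted = ACTIVITY_PRODUCT_HINTS.get(activity, set())
    # Boost products that are consistent with the detected activity.
    adjusted = {p: (lh * 2.0 if p in hinted else lh)
                for p, lh in visual_likelihoods.items()}
    return max(adjusted, key=adjusted.get)

# Mouthwash and moisturizer look equally likely from the image alone,
# but the gargling activity tips the decision toward mouthwash.
print(pick_product({"mouthwash": 0.45, "moisturizer": 0.45, "shampoo": 0.10},
                   "gargling"))
```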
- the product identification module 124 may identify the user.
- the product identification module 124 may obtain an indication of the identity of the user from manual input via user controls on the user interface 110 of the personal care computing device 102 .
- the user may login to a user profile using user login credentials, may enter the user's first and last name, may select a user profile from a set of user profiles, or may provide any other suitable identification information.
- the product identification module 124 may obtain the indication of the identity of the user automatically from environmental sensor data, such as an image or video of the user, audio data indicative of the user's voice, etc.
- the personal care computing device 102 may store template images of each of the users who utilize the personal care computing device 102 .
- the personal care computing device 102 may also store voice recordings/audio signatures from each of the users and/or other biographical data.
- the product identification module 124 may compare the environmental sensor data to the stored images, voice recordings, and/or other biographical data to identify the user.
- the product identification module 124 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may compare the facial features to the facial features from the stored template images of each of the users.
- the product identification module 124 may also identify the user's voice from audio data and compare the voice data to the stored voice recordings.
- the product identification module 124 may compare the environmental sensor data to the stored images, voice recordings, and/or other biographical data using machine learning techniques.
- the machine learning techniques may include linear regression, polynomial regression, logistic regression, random forests, boosting, nearest neighbors, Bayesian networks, neural networks, deep learning, support vector machines, or any other suitable machine learning technique. Then the product identification module 124 may apply the facial features and/or voice features for the user to the machine learning model to identify the user.
- the template facial features and/or template voice features may be compared to the facial features and/or voice features for a user whose identity is unknown using a nearest neighbors algorithm.
- the nearest neighbors algorithm may identify template facial features and/or template voice features which are the closest to the facial features and/or voice features for a user whose identity is unknown by creating numerical representations of the facial features and/or voice features to generate feature vectors.
- the numerical representations of the features or feature vectors of the user whose identity is unknown may be compared to the feature vectors of template users to determine a vector distance between the features of the user whose identity is unknown and each template user.
- the product identification module 124 may generate vector distances for each vector (e.g., each facial feature and/or voice feature) and combine the individual vector distances to generate an overall vector distance between the user whose identity is unknown and a particular template user.
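- A minimal sketch of combining per-modality distances for user identification is shown below. The template values, vector layouts, and the simple summation of facial and voice distances are illustrative assumptions.

```python
# Sketch: combine the facial-feature distance and the voice-feature distance
# to each stored template user into one overall distance, and pick the nearest.
import math

TEMPLATE_USERS = {
    "user_1": {"face": [0.1, 0.8, 0.3], "voice": [120.0, 0.6]},
    "user_2": {"face": [0.7, 0.2, 0.9], "voice": [210.0, 0.3]},
}

def distance(a, b) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify_user(face_vec, voice_vec) -> str:
    def overall(template: dict) -> float:
        # Combine the individual facial and voice distances; a plain sum is assumed.
        return distance(face_vec, template["face"]) + distance(voice_vec, template["voice"])
    return min(TEMPLATE_USERS, key=lambda uid: overall(TEMPLATE_USERS[uid]))

print(identify_user([0.15, 0.75, 0.35], [125.0, 0.55]))  # -> "user_1"
```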
- the product identification module 124 may obtain an identifier for the identified user, such as a user ID.
- the recommendation determination module 126 may then provide the product use event data for the identified personal care product 104 and/or the activity data for the activity as well as identification information for the user (e.g., user login credentials, a user ID, etc.) to the server device 202 .
- the server device 202 may store the activity data and/or the product use event data in a user profile for the user which includes historical product use event data for the identified personal care product 104 and for other personal care products and/or historical activity data for the identified activities and for other activities.
- the user profile may also include user profile data for the user, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product.
- the personal care computing device 102 or the user's client computing device 222 obtains user profile data from the user and provides the user profile data to the server device 202 .
- the user's client computing device 222 may provide location data (e.g., obtained via a positioning sensor such as a GPS module or via an IP address), to the server device 202 which may be the user's current location.
- the server device 202 may then store the user profile data in the user profile for the user.
- Example data tables 300 , 400 illustrating user profile data and product use event data are illustrated in FIGS. 3 and 4 , respectively.
- user profile data in a user profile may include a user ID 302 , a name of the user 304 , an address of the user 306 , a date of birth of the user 308 , personal care goals provided by the user 310 , reported cosmetic issues provided by the user 312 , rewards points for the user 314 , or any other suitable information about the user.
- the data table 300 may also include images of the user (not shown), user performance metrics related to product usage (not shown), etc.
- product use event data in a user profile may include a user ID 402 which may be the same user ID as in the user profile data for associating the product use event data with the user.
- the product use event data may also include the name of the personal care product 404 , the date and/or time of the use 406 , the duration of the use 408 , and the manner of use 410 describing how the personal care product was used.
- Jane Smith (User ID 2) applied Olay™ Total Effects Whip Face Moisturizer on Jul. 26, 2019 at 9:14 a.m. for 1 minute. She rubbed the moisturizer unevenly on parts of her face.
- Jane Smith applied the Olay™ Total Effects Whip Face Moisturizer at 7:15 p.m. for 30 seconds. That time she rubbed the moisturizer evenly on her entire face. Additionally, on July 22, Jane Smith used an SK-II™ Facial Treatment Mask at 9:37 a.m. for 7 minutes. She placed the mask on her face, left it there for 7 minutes, and rinsed it off.
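- The records illustrated in FIGS. 3 and 4 might be represented as sketched below. Field names follow the tables described above; the types and dataclass layout are assumptions for illustration.

```python
# Sketch of user profile and product use event records.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class UserProfile:
    user_id: int
    name: str
    address: str
    date_of_birth: str
    personal_care_goals: str
    reported_cosmetic_issues: str
    rewards_points: int

@dataclass
class ProductUseEvent:
    user_id: int
    product_name: str
    used_at: datetime
    duration_seconds: int
    manner_of_use: str

event = ProductUseEvent(
    user_id=2,
    product_name="Olay Total Effects Whip Face Moisturizer",
    used_at=datetime(2019, 7, 26, 9, 14),
    duration_seconds=60,
    manner_of_use="rubbed unevenly on parts of the face",
)
print(event)
```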
- the server device 202 may also store a data table (not shown) which includes activity data.
- Activity data in a user profile may include a user ID which may be the same user ID as in the user profile data for associating the activity data with the user.
- the activity data may also include the type of the activity, the date and/or time of the activity, and the duration of the activity.
- the activity data may include the frequency of the activity over a particular time period (e.g., a day, a week, a month) based on the dates and/or times of the activity and/or other metrics based on the activity data.
- the server device 202 may analyze the product use event data for a particular personal care product, the activity data for a particular type of activity, and/or the user profile data for the user to generate user feedback information to assist the user in using the personal care product or related personal care products. This may enhance the user's experience with the personal care products and provide improved results from using the personal care products.
- the server device 202 may include one or more processors 204 , a communication unit (not shown) to transmit and receive data over long-range and short-range communication networks, and a memory 206 .
- the memory 206 can store instructions of an operating system (not shown) and a personal care recommendation generator 208 .
- the server device 202 may also be communicatively coupled to a database 210 that stores user profiles for several users, where each user profile includes user profile data and product use event data as described above.
- the database 210 may also store templates of personal care products including visual features, semantic cues, and/or other visual characteristics for the template personal care products.
- the database 210 may store audio signatures for various activities each including a set of audio characteristics which correspond to the activity. Additionally, the database 210 may store machine learning models generated based on the visual features, semantic cues, and/or other visual characteristics of the template personal care products and/or based on the audio signatures for the various activities.
- the database 210 may store a set of rules regarding the appropriate frequency, duration, and manner of use for the personal care product.
- the rules may differ depending on the demographics of a particular user. For example, the rules may indicate that users in a first age group should moisturize more often than users in a second age group.
- the database 210 may store machine learning models for determining the appropriate frequency, duration, and manner of use for the personal care product that is specific to a particular user based on the user's previous patterns of use and/or the results experienced by the user.
- a user-specific machine learning model may be adjusted such that the appropriate moisturizing frequency for the user is weekly.
- the database 210 may store a set of rules regarding the appropriate frequency, duration, and manner of use for a particular activity.
- the set of rules may also include an estimated total number of times the activity may be performed and/or an estimated total duration over multiple instances of performing the activity before products related to the activity need to be replenished, such as the number of showers before the user needs to replace the soap and shampoo.
- the database 210 may store machine learning models for determining the appropriate frequency, duration, and manner of use for the particular activity that is specific to a particular user based on the user's previous patterns of use and/or the results experienced by the user.
- the personal care recommendation generator 208 may analyze the activity data and/or the product use event data at several instances in time for the identified personal care product 104 (e.g., from the user profile in the database 210) to generate the user feedback information. For example, the personal care recommendation generator 208 may analyze the activity data and/or the product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.). Then the personal care recommendation generator 208 may determine product use metrics for the personal care product 104 such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc. For example, for the Olay™ Total Effects Whip Face Moisturizer described in FIG. 4, the personal care recommendation generator 208 may determine that the user applied the moisturizer about once a week.
- the personal care recommendation generator 208 may also determine activity metrics for the activity such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc.
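- Computing such metrics from stored events might look like the sketch below. The record shape follows the earlier sketch; the 30-day window and the per-week arithmetic are illustrative assumptions.

```python
# Sketch of deriving product use metrics (frequency over a window, average
# duration) from stored product use events.
from datetime import datetime, timedelta

events = [
    {"used_at": datetime(2019, 7, 5, 9, 0),   "duration_seconds": 60},
    {"used_at": datetime(2019, 7, 12, 9, 10), "duration_seconds": 45},
    {"used_at": datetime(2019, 7, 26, 9, 14), "duration_seconds": 60},
]

def product_use_metrics(events, now, window_days=30):
    start = now - timedelta(days=window_days)
    recent = [e for e in events if e["used_at"] >= start]
    uses_per_week = len(recent) / (window_days / 7)
    avg_duration = sum(e["duration_seconds"] for e in recent) / max(len(recent), 1)
    return {"uses_per_week": round(uses_per_week, 2),
            "avg_duration_seconds": round(avg_duration, 1)}

print(product_use_metrics(events, now=datetime(2019, 7, 27)))
# -> {'uses_per_week': 0.7, 'avg_duration_seconds': 55.0}, i.e. "about once a week"
```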
- the personal care recommendation generator 208 may then compare the product use metrics and/or the product use event data to the set of rules for the identified personal care product 104 , for example from the database 210 to generate the user feedback information.
- the personal care recommendation generator 208 may also compare the activity metrics and/or activity data to the set of rules for the identified activity, for example from the database 210 to generate the user feedback information.
- the activity metrics, activity data, product use metrics, and/or the product use event data may be compared to the set of rules in view of the user profile data for the user, such as demographics, or the user's personal care goals and reported issues.
- the personal care recommendation generator 208 may apply the activity metrics, activity data, product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users. For example, the personal care recommendation generator 208 may train the machine learning model using a first set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a first set of users who improved their cosmetic deficiencies with the personal care product and a second set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a second set of users who did not improve their cosmetic deficiencies with the personal care product.
- the personal care recommendation generator 208 may train the machine learning model using a first set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a first set of users who received the type of user feedback information for the personal care product and a second set of activity metrics, activity data, product use metrics, product use event data, and/or user profile data for a second set of users who did not receive the type of user feedback information for the personal care product.
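- A minimal sketch of training on the two user cohorts described above is shown below. The feature choice, the example values, and the use of scikit-learn's logistic regression are assumptions for illustration; the disclosure permits any suitable machine learning technique.

```python
# Sketch: train a classifier on users whose results improved with the product
# (label 1) vs. users whose results did not (label 0). Requires scikit-learn.
from sklearn.linear_model import LogisticRegression

# Each row: [uses_per_week, avg_duration_seconds].
features = [
    [7, 60], [6, 55], [5, 65], [7, 70],   # first set: users who improved
    [1, 20], [2, 15], [1, 30], [0, 0],    # second set: users who did not
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]

model = LogisticRegression().fit(features, labels)

# Probability that a user moisturizing twice a week for 45 seconds will improve.
print(model.predict_proba([[2, 45]])[0][1])
```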
- the personal care recommendation generator 208 may generate user feedback information using the set of rules and/or the machine learning models.
- the user feedback information may include a recommendation to replenish the personal care product.
- the personal care recommendation generator 208 may recommend replenishing the personal care product after a threshold number of uses of the personal care product which in some instances may be determined via the activity data, or when the personal care product exceeds a threshold age according to the set of rules and/or the machine learning models.
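- This replenishment rule might be sketched as follows; the use-count and age thresholds are made-up values, and per the description above they could instead come from the set of rules or a learned model.

```python
# Sketch: recommend a refill once a use-count threshold or a product-age
# threshold is exceeded.
from datetime import date

def needs_replenishment(use_count: int, purchased_on: date, today: date,
                        max_uses: int = 30, max_age_days: int = 90) -> bool:
    too_many_uses = use_count >= max_uses
    too_old = (today - purchased_on).days >= max_age_days
    return too_many_uses or too_old

print(needs_replenishment(31, date(2019, 5, 1), date(2019, 7, 1)))  # True (uses)
print(needs_replenishment(5, date(2019, 3, 1), date(2019, 7, 1)))   # True (age)
```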
- the user feedback information may also include advice on how to use the personal care product or a recommendation on how to improve the use of the personal care product.
- the server device 202 may store a set of instructions on using the personal care product for example, in the database 210 .
- the personal care recommendation generator 208 may generate advice on how to use the personal care product. For example, as described above with reference to FIG. 4, the user placed an SK-II™ Facial Treatment Mask on her face, left it there for 7 minutes, and rinsed it off.
- the set of rules for the SK-II™ Facial Treatment Mask may indicate that the user should not rinse off the mask and instead rub it in.
- the personal care recommendation generator 208 may generate advice indicating that next time the user should rub in the mask without rinsing it off.
- the advice may also include the frequency and duration in which to use the personal care product and/or a description of the frequency and duration in which the user is using the personal care product.
- the personal care recommendation generator 208 may generate advice indicating the frequency and/or duration when the user first uses the personal care product according to the product use event data or when the user is using the personal care product too frequently, not frequently enough, for too long, or for not long enough according to the product use event data and the set of instructions on using the personal care product.
- the advice on how to use the product may be based on the consumer habits for the user. For example, if the user's habits deviate from the set of rules for using a particular product, the personal care recommendation generator 208 may generate advice indicating how to use the product. Still further, the user feedback information may include opportunities for optimizing a particular hygiene regimen based on the user's habits. More specifically, the user feedback information may include a particular order in which the user should use a set of products for a particular hygiene regimen, such as when the user's habits indicate that the user does not follow the particular order. For example, when the user's habits indicate that the user applies concealer before putting on foundation, the user feedback information may include a recommendation to apply the concealer after putting on foundation. Additionally, the user feedback information may include recommendations for additional or alternative products to use during the particular hygiene regimen along with the products the user is currently using in the regimen.
- the advice on how to use the product may be based on user profile data such as the weather conditions at the user's location, the time of year, or the time of day. If it is a hot, humid day or it is raining, the personal care recommendation generator 208 may recommend different types of use of hair care products than on a sunny day with low humidity. Also, if it is the daytime during the summer, the personal care recommendation generator 208 may recommend purchasing a daytime moisturizer with sunscreen to go along with the user's nighttime moisturizer. In the winter, the personal care recommendation generator 208 may recommend that the user apply the same moisturizer during the day and at night.
- the user feedback information may include recommendations to purchase related personal care products.
- the server device 202 may store lists of personal care products which work well together according to their ingredients or the effects of the personal care products on other users.
- the personal care recommendation generator 208 may recommend a particular type of conditioner that complements the shampoo.
- the user profile may indicate that the user in the past used a particular personal care product within the same time frame as another personal care product. The personal care recommendation generator 208 may recommend that the user once again purchase the particular personal care product to use with the other personal care product.
- the user feedback information may also include a user performance metric such as a score based on the duration and/or frequency in which the user uses a particular personal care product.
- the user performance metric may be a score from 0-100 which increases each time the user uses conditioner. If the user does not use conditioner for a threshold time period, the score may decrease or reset to 0.
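- The 0-100 score described above might be maintained as sketched below. The increment per use and the reset window are illustrative assumptions not specified in the description.

```python
# Sketch: the score rises with each use of the product and resets if the
# product goes unused for too long.
from datetime import datetime, timedelta

class PerformanceScore:
    def __init__(self, increment: int = 5, reset_after_days: int = 14):
        self.score = 0
        self.increment = increment
        self.reset_after = timedelta(days=reset_after_days)
        self.last_use = None

    def record_use(self, when: datetime) -> int:
        # Reset the score if the product has not been used within the window.
        if self.last_use is not None and when - self.last_use > self.reset_after:
            self.score = 0
        self.score = min(100, self.score + self.increment)
        self.last_use = when
        return self.score

score = PerformanceScore()
print(score.record_use(datetime(2019, 7, 1)))   # 5
print(score.record_use(datetime(2019, 7, 3)))   # 10
print(score.record_use(datetime(2019, 8, 1)))   # reset to 0, then 5
```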
- the personal care recommendation generator 208 generates the user performance metric using a machine learning model, such as a regression model.
- the machine learning model may be trained using a first set of activity metrics, activity data, product use metrics and/or product use event data for a first set of users who improved their cosmetic deficiencies with the personal care product and a second set of activity metrics, activity data, product use metrics and/or product use event data for a second set of users who did not improve their cosmetic deficiencies with the personal care product. Then the personal care recommendation generator 208 may apply the user's activity metric, activity data, product use metric, and/or product use event data to the machine learning model to generate the user performance metric.
- the user feedback information may also include rewards which may be provided when a user performance metric exceeds a threshold value, when the user uses more than a threshold number of different personal care products, when the user follows recommendations or advice provided by the personal care computing device, etc.
- the user performance metric may also be a comparison to the performances of other users.
- the personal care recommendation generator 208 may compare the user's performance to the performances of other users in the same demographic (e.g., age group).
- the user may have a raw user performance metric for eye makeup of 65 but this may be in the 75th percentile of raw user performance metrics compared to other users in the same age group, same geographic area, etc.
- the user feedback information may provide a raw user performance metric, a percentile or ranking of the raw user performance metric relative to other users, an adjusted user performance metric factoring in the user's performance relative to other users, or any other suitable relative user performance metric.
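- Expressing a raw metric as a percentile relative to peers, as in the example above, might be sketched as follows; the peer scores are made-up values.

```python
# Sketch: percentile rank of a raw user performance metric among peer users
# in the same demographic.
def percentile_rank(raw_score: float, peer_scores: list) -> float:
    # Fraction of peers scoring at or below this user's raw score.
    at_or_below = sum(1 for s in peer_scores if s <= raw_score)
    return 100.0 * at_or_below / len(peer_scores)

peers = [40, 45, 48, 50, 52, 55, 58, 60, 62, 70, 80, 90]
print(percentile_rank(65, peers))  # 75.0 -> a raw score of 65 is the 75th percentile
```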
- the user feedback information may also include recommendations on how to improve a user performance metric, encouragement to continue using the personal care product to reach a high score and receive rewards points or other incentives for maintaining consistent use of the personal care product.
- Example user feedback information is illustrated in the data table 500 of FIG. 5 .
- the personal care recommendation generator 208 may recommend that the user “Use leave-on frizz control to fight the humidity.”
- the personal care recommendation generator 208 may advise, "In the past month, you have been moisturizing about twice a week. Make sure you moisturize every day."
- the personal care recommendation generator 208 may advise, “Don't forget to replace your disposable razor after 5 uses.”
- the personal care recommendation generator 208 may recommend that the user "Buy Jane's Conditioner to use along with your shampoo."
- Another example of user feedback information may be, “You have earned 200 rewards points for maintaining proper skin care habits.”
- the database 210 may also store previous user feedback information provided to the user, so that the personal care computing device 102 does not repeatedly provide the user with the same user feedback information. Based on the user's response to various user feedback information the personal care recommendation generator 208 may learn which types of user feedback information improve the user's performance. For example, the personal care recommendation generator 208 may learn that the user does not purchase recommended related products, and thus may stop providing related products recommendations.
- the personal care recommendation generator 208 may provide the user feedback information to the personal care computing device 102 , or the client computing device 222 via an SMS message, email, push notification, etc.
- the recommendation determination module 126 in the personal care computing device 102 may analyze the product use event data for the user to generate the user feedback information without sending the product use event data to the server device 202 .
- the control module 128 may control operation of the personal care computing device 102 by, for example, presenting a display which includes the user feedback information via the user interface 110, presenting audio output which includes the user feedback information via the speaker 108, providing haptic feedback indicative of the user feedback information via a vibration motor, or transmitting the user feedback information to the client computing device 222 via the communication unit 116.
- FIG. 6 illustrates a flow diagram representing an example method 600 for providing feedback regarding personal care products.
- the method 600 may be performed by the personal care assistant application 122 and executed on the personal care computing device 102 .
- the method 600 may be implemented in a set of instructions stored on a non-transitory computer-readable memory and executable on one or more processors of the personal care computing device 102 .
- the method 600 may be at least partially performed by the product identification module 124 , the recommendation determination module 126 , and the control module 128 , as shown in FIG. 2 .
- an indication of a personal care product 104 being used by a user is obtained.
- the indication of the personal care product 104 may be provided with manual input via user controls on the user interface 110 of the personal care computing device 102 or the client computing device 222 .
- the user may select the personal care product 104 from a list of personal care products included in a drop-down menu on the user interface 110 .
- the indication of the personal care product 104 may also be provided automatically, such as via a radio signal from the personal care product 104 , or an image or video of the personal care product 104 .
- an indication of an activity may be obtained.
- the indication of the activity may be provided automatically, such as via environmental characteristics for the area surrounding the personal care computing device 102 detected by an environmental sensor, which may be any one of, any two of, or any suitable combination of an audio sensor such as a microphone or an array of microphones, a temperature sensor, an ultrasonic sensor, a radio antenna for example for receiving Wi-Fi or Bluetooth signals, a weighing scale, a wearable sensor, an air quality sensor such as a VOC sensor, a depth sensor for generating a 3D point cloud of the area surrounding the environmental sensor, such as a LiDAR sensor or an IR sensor, and/or a humidity sensor.
- the personal care product 104 is identified based on the obtained indication.
- the personal care computing device 102 may identify the selected personal care product 104 via the user controls.
- the personal care computing device 102 may identify the personal care product 104 transmitting the radio signal based on the identification information included in the radio signal.
- when the indication of the personal care product 104 is an image or video, the personal care computing device 102 may identify the personal care product 104 by analyzing images or video frames using the computer vision techniques described above to identify an object within the images or video frames and to identify visual features, semantic cues, and/or other visual characteristics for the object.
- the personal care computing device 102 may compare the visual features, semantic cues, and/or other visual characteristics to visual features, semantic cues, and/or other visual characteristics for templates of personal care products to determine a likelihood that the object corresponds to one of the personal care products.
- the personal care computing device 102 or the server device 202 may generate a machine learning model for identifying personal care products based on visual features and semantic cues using image classification and/or machine learning techniques.
- the personal care computing device 102 may apply the visual features and semantic cues for the object to the machine learning model to identify the personal care product corresponding to the object.
- the activity may be identified based on the indication of the activity.
- the environmental sensor may periodically capture audio data for a sound within the area.
- the personal computing device 102 may then compare the audio data for the sound to acoustic signatures for various types of activities, such as the shower running, the sink running, a toilet flushing, gargling, a dishwasher running, a washing machine running, a dryer running, shaving, brushing teeth, etc.
- Each acoustic signature may include a set of audio characteristics for a particular activity and/or sound, such as the volume of the sound, the frequency of the sound, the tone of the sound, the direction of the sound, etc.
- the personal computing device 102 may identify the type of activity by comparing the audio data to each acoustic signature for each type of activity to determine a likelihood that the sound corresponds to one of the activities. The type of activity having the highest likelihood for the sound or having a likelihood that exceeds a likelihood threshold may be identified as the type of activity corresponding to the sound.
- the personal care computing device 102 or the server device 202 may generate a machine learning model for identifying an activity based on sensor data captured by the environmental sensor using machine learning techniques. Then the personal care computing device 102 may apply the audio characteristics for the sound, the type of area where the personal computing device 102 is located, and/or other environmental characteristics detected within the area to the machine learning model to identify the activity corresponding to the sound and/or other environmental characteristics.
- the product identification module 124 may also identify products related to the activity. For example, when the activity is running a washing machine or a dryer, the product identification module 124 may identify one or more laundry room products. When the activity is showering, the product identification module 124 may identify one or more hair care or skin care products. When the activity is the toilet flushing, the product identification module 124 may identify one or more bathroom products, such as toilet paper or cleaning products. When the activity is the dishwasher running, the product identification module 124 may identify one or more kitchen products, such as plates, bowls, forks, spoons, knives, dishwasher detergent, etc.
- the product identification module 124 may use the identified activity to identify the personal care product 104 being used by a user. More specifically, the product identification module 124 may generate the machine learning model for identifying personal care products based on visual features, semantic cues, and the type of activity being performed by the user.
- the personal care computing device 102 provides the obtained indication of the personal care product 104 and/or the obtained indication of the activity to the server device 202 to identify the personal care product 104 corresponding to the indication. Then the server device 202 provides the identified personal care product 104 and/or the identified activity to the personal care computing device 102 .
- the personal care computing device 102 may identify product use event data for the personal care product 104 based on the user's interaction with the personal care product 104 (block 606 ).
- the product use event data may include identification information for the personal care product 104 such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104 , etc.
- the personal care computing device 102 may record the date and/or time in which the identification information is received. Additionally, the personal care computing device 102 may determine the duration of the use by determining when the personal care product 104 can no longer be identified. Furthermore, the personal care computing device 102 may identify other personal care products used in the same time frame as the personal care product 104 by identifying the other personal care products in a similar manner as described above, and comparing identification times for each of the other personal care products to the identification time for the personal care product.
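- Inferring use duration and co-used products from identification timestamps, as described above, might be sketched as follows. The sighting data and the 15-minute co-use window are illustrative assumptions.

```python
# Sketch: duration runs from first to last sighting of a product, and two
# products are treated as used together when their first sightings fall
# within a short window of one another.
from datetime import datetime, timedelta

sightings = {
    "shampoo":     [datetime(2019, 7, 26, 7, 0), datetime(2019, 7, 26, 7, 3)],
    "conditioner": [datetime(2019, 7, 26, 7, 4), datetime(2019, 7, 26, 7, 6)],
    "moisturizer": [datetime(2019, 7, 26, 21, 0), datetime(2019, 7, 26, 21, 1)],
}

def use_duration(times) -> timedelta:
    return max(times) - min(times)

def used_together(product_a: str, product_b: str, window_minutes: int = 15) -> bool:
    gap = abs(min(sightings[product_a]) - min(sightings[product_b]))
    return gap <= timedelta(minutes=window_minutes)

print(use_duration(sightings["shampoo"]))        # 0:03:00
print(used_together("shampoo", "conditioner"))   # True
print(used_together("shampoo", "moisturizer"))   # False
```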
- the personal care computing device 102 may identify activity data such as the type of activity, the date and/or time of the activity, the duration of the activity, etc.
- the personal care computing device 102 may present questions on the user interface 110 or via the speaker 108 which are related to the use of the identified personal care product 104 . Accordingly, the user may respond to the questions with voice responses which are received via the microphone or via user controls on the user interface 110 , such as drop-down menus, text fields, etc. In other implementations, the personal care computing device 102 may determine the manner in which the personal care product 104 is being used by analyzing the images or video from the camera 112 using computer vision techniques.
- the personal care computing device 102 may identify the user's face and facial features from the images such as the user's eyes, lips, and nose, and may determine where the user is applying makeup, lipstick, moisturizer, etc., on her face.
- the personal care computing device 102 may determine the manner in which the personal care product 104 is being used based on the activity data. More specifically, the activity data may indicate the type of activity the user performed while using the personal care product 104 .
- the personal care computing device 102 or the client computing device 222 also obtains user profile data for the user, such as biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product. Then the personal care computing device 102 or the client computing device 222 provides the user profile data to the server device 202 .
- the personal care computing device 102 provides the activity data, the product use event data for the identified personal care product 104 , and/or identification information for the user (e.g., user login credentials, a user ID, etc.) to the server device 202 .
- the server device 202 may analyze the activity data for the identified activity, the product use event data for the identified personal care product 104 , and/or the user profile data for the user to generate user feedback information to assist the user in using the personal care product or related personal care products. More specifically, the server device 202 may analyze the activity data and/or the product use event data at several instances in time for the identified personal care product 104 and/or identified activity to generate the user feedback information.
- the server device 202 may analyze the activity data and/or the product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.). Then the server device 202 may determine product use metrics for the personal care product 104 such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc. The server device 202 may also identify activity metrics based on the activity data, such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc.
- the server device 202 may then compare the activity data, the activity metrics, the product use metrics and/or the product use event data to a set of rules for the identified personal care product 104 , for example from the database 210 to generate the user feedback information.
- the server device 202 may apply the activity data, the activity metrics, the product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users.
- the user feedback information may include a recommendation to replenish the personal care product, advice on how to use the personal care product or a recommendation on how to improve the use of the personal care product, the frequency and duration in which to use the personal care product and/or a description of the frequency and duration in which the user is using the personal care product, recommendations to purchase related personal care products, a user performance metric indicating how effectively the user is using the personal care product, rewards, or recommendations on how to improve a user performance metric, encouragement to continue using the personal care product to reach a high score and receive rewards points or other incentives for maintaining consistent use of the personal care product.
- the server device 202 may generate several types of user feedback information.
- the database 210 may also store previous user feedback information provided to the user, and the server device 202 may provide types of user feedback information to the personal care computing device 102 or the client computing device 222 which have not been presented to the user within a threshold time period. In other implementations, some types of user feedback information may be provided more often than others, such as user performance metrics.
- the server device 202 may provide an updated user performance metric to the user each time the user performance metric changes. On the other hand, the server device 202 may only provide recommendations on how to use the personal care product once a week or once a month, for example.
- the personal care computing device 102 obtains the user feedback information from the server device 202 .
- the personal care computing device 102 generates the user feedback information based on the activity data, the product use event data for the personal care product, and/or the user profile data. In any event, the personal care computing device 102 presents the user feedback information to the user (block 614 ).
- the personal care computing device 102 may present a display on the user interface 110 that includes the user feedback information, may provide haptic feedback via a vibration motor indicative of the user feedback information, may turn a set of light emitting diodes (LEDs) on or off based on the user feedback information, may present voice output which includes the user feedback information via the speaker 108 , or may transmit the user feedback information for display on the client computing device 222 via the communication unit 116 .
- FIG. 7 illustrates a flow diagram representing an example method 700 for generating the feedback regarding personal care products.
- the method 700 may be performed by the server device 202 .
- the method 700 may be implemented in a set of instructions stored on a non-transitory computer-readable memory and executable on one or more processors of the server device 202 .
- the method 700 may be at least partially performed by the personal care recommendation generator 208 , as shown in FIG. 2 .
- the server device 202 receives user profile data for a user.
- the server device 202 may receive the user profile data from the user's personal care computing device 102 or client computing device 222 .
- the user profile data may include biographical information regarding the user, a current location of the user, an image of the user, or user preferences or goals regarding the personal care product.
- the server device 202 stores a user profile for the user in a database 210 which includes at least some of the user profile data. The server device 202 may then update the user profile with user profile data received from the personal care computing device 102 or client computing device 222 .
- the server device 202 also receives product use event data indicative of the user's interaction with a personal care product 104 (block 704 ). For example, each time the personal care computing device 102 or the client computing device 222 identifies that the user is interacting with a personal care product 104 , the personal care computing device 102 or the client computing device 222 may generate a record of the use and provide the generated record to the server device 202 . This may include identification information for the personal care product 104 such as the name of the personal care product, the date and/or time of the use, the duration of the use, the manner in which the personal care product 104 is used, other personal care products used in the same time frame as the personal care product 104 , etc.
- the server device 202 may receive activity data indicative of an activity performed by the user. For example, each time the personal care computing device 102 identifies an activity, the personal care computing device 102 may generate a record of the activity and provide the generated record to the server device 202 . This may include activity data, such as the type of activity, the duration of the activity, the date and/or time of the activity, one or more personal care products related to the activity, etc.
- the server device 202 may store the activity data, the product use event data, and/or the user profile data in the user profile for the user, for example in the database 210 (block 706 ).
- each time the server device 202 receives a new instance of activity data and/or product use event data, the server device 202 analyzes the new instance of activity data and/or product use event data and previously stored instances of activity data and/or product use event data for the activity/personal care product to generate user feedback information (block 708).
- the server device 202 may analyze the activity data and/or product use event data over a particular time window (e.g., the previous year, the previous month, the previous week, etc.), which may include several instances of activity data and/or product use event data at different time intervals for the same activity and/or personal care product 104 .
- the server device 202 may determine product use metrics for the personal care product 104 such as a frequency of use over the particular time window, an average duration of use, the time of day of the use, etc.
- the server device 202 may also determine activity metrics for the activity such as a frequency of the activity over the particular time window, an average duration of the activity, the time of day of the activity, etc.
- the server device 202 may then compare the activity data, activity metrics, product use metrics, and/or the product use event data to a set of rules for the identified personal care product 104 and/or the identified activity, for example from the database 210 to generate the user feedback information.
- the server device 202 may apply the activity data, activity metrics, product use metrics, product use event data, and/or user profile data to a machine learning model generated based on the performances of other users.
- the user feedback information may include a recommendation to replenish the personal care product, advice on how to use the personal care product or a recommendation on how to improve the use of the personal care product, the frequency and duration in which to use the personal care product and/or a description of the frequency and duration in which the user is using the personal care product, recommendations to purchase related personal care products, a user performance metric indicating how effectively the user is using the personal care product, rewards, or recommendations on how to improve a user performance metric, encouragement to continue using the personal care product to reach a high score and receive rewards points or other incentives for maintaining consistent use of the personal care product.
- the user feedback information may also include rewards which may be provided when a user performance metric exceeds a threshold value, when the user uses more than a threshold number of different personal care products, when the user follows recommendations or advice provided by the personal care computing device, etc.
- the user performance metric may be a personal care product-specific user performance metric, such that the server device 202 generates a different user performance metric for each personal care product 104 or each type of personal care product (e.g., hair care, eye care, etc.).
- each user performance metric may be a score such as from 0-100 which increases or decreases based on the duration and/or frequency in which the user uses a particular personal care product.
- Each user performance metric may also be a comparison to the performances of other users.
- the server device 202 may compare the user's performance to the performances of other users in the same demographic (e.g., age group).
- the user may have a raw user performance metric for eye makeup of 65 but this may be in the 75th percentile of raw user performance metrics compared to other users in the same age group, same geographic area, etc.
- the server device 202 may generate a raw user performance metric, an adjusted user performance metric factoring in the user's performance relative to other users, and/or a percentile or ranking of the raw user performance metric relative to other users for the same personal care product.
- the database 210 may also store previous user feedback information provided to the user, so that the personal care computing device 102 does not repeatedly provide the user with the same user feedback information. Based on the user's response to various user feedback information the server device 202 may learn which types of user feedback information improve the user's performance. For example, the server device 202 may learn that the user does not purchase recommended related products, and thus may stop providing related products recommendations.
- the server device 202 may provide the user feedback information to a client device, such as the personal care computing device 102 or the client computing device 222, via an SMS message, email, push notification, etc.
- routines, subroutines, applications, or instructions may constitute either software (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware.
- routines, etc. are tangible units capable of performing certain operations and may be configured or arranged in a certain manner.
- In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- a hardware module may be implemented mechanically or electronically.
- a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
- a hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
- Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times.
- Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
- Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
- processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions.
- the modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
- the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
- the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
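- The distribution of operations among processors described above can be pictured with the following minimal sketch; it is not part of this specification, and it uses Python's standard process pool as a stand-in for processors that may reside within a single machine or be deployed across a number of machines. The scoring operation and its inputs are hypothetical.
```python
# Minimal sketch (hypothetical workload): one operation of a method performed
# for several inputs, its execution distributed among multiple processors.
from concurrent.futures import ProcessPoolExecutor


def score_activity(duration_seconds):
    """Placeholder per-item operation, e.g., scoring one product use event."""
    return min(duration_seconds / 120.0, 1.0)


if __name__ == "__main__":
    durations = [45, 90, 150, 30]
    # The pool's worker processes play the role of the "one or more processors";
    # the same map could equally be served by workers on other machines.
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(score_activity, durations))
    print(scores)  # [0.375, 0.75, 1.0, 0.25]
```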
- any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
- the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some embodiments may be described using the terms "coupled" and "connected" along with their derivatives.
- some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact.
- the term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
- the embodiments are not limited in this context.
- the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
- a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Accounting & Taxation (AREA)
- Finance (AREA)
- Development Economics (AREA)
- Strategic Management (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Marketing (AREA)
- Economics (AREA)
- General Business, Economics & Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Game Theory and Decision Science (AREA)
- Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Data Mining & Analysis (AREA)
- Medical Informatics (AREA)
- Signal Processing (AREA)
- Computational Linguistics (AREA)
- Computing Systems (AREA)
- Multimedia (AREA)
- Acoustics & Sound (AREA)
- Human Computer Interaction (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Child & Adolescent Psychology (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Public Health (AREA)
- Developmental Disabilities (AREA)
- Hospice & Palliative Care (AREA)
- Primary Health Care (AREA)
- Psychology (AREA)
- General Health & Medical Sciences (AREA)
- Epidemiology (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/897,316 US20200388374A1 (en) | 2019-06-10 | 2020-06-10 | Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962859427P | 2019-06-10 | 2019-06-10 | |
US16/897,316 US20200388374A1 (en) | 2019-06-10 | 2020-06-10 | Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200388374A1 (en) | 2020-12-10 |
Family
ID=71950854
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/897,316 Abandoned US20200388374A1 (en) | 2019-06-10 | 2020-06-10 | Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results |
US16/897,793 Active US11544764B2 (en) | 2019-06-10 | 2020-06-10 | Method of generating user feedback information to enhance product use results |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/897,793 Active US11544764B2 (en) | 2019-06-10 | 2020-06-10 | Method of generating user feedback information to enhance product use results |
Country Status (5)
Country | Link |
---|---|
US (2) | US20200388374A1 (ja) |
EP (1) | EP3980962A1 (ja) |
JP (1) | JP7319393B2 (ja) |
CN (1) | CN113939840A (ja) |
WO (1) | WO2020252498A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD947845S1 (en) * | 2019-01-03 | 2022-04-05 | The Procter & Gamble Company | Smart hub for a beauty regimen |
US20220160485A1 (en) * | 2020-11-23 | 2022-05-26 | Colgate-Palmolive Company | Personal Care System, Device, and Method Thereof |
US11544764B2 (en) | 2019-06-10 | 2023-01-03 | The Procter & Gamble Company | Method of generating user feedback information to enhance product use results |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115687974B (zh) * | 2022-10-27 | 2023-06-09 | 深圳市黑金工业制造有限公司 | Big-data-based smart interactive blackboard application evaluation system and method |
Family Cites Families (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5858333A (en) | 1998-08-07 | 1999-01-12 | Enamelon, Inc. | Two-part oral products and methods of using same to remineralize teeth |
US6509007B2 (en) | 2001-03-19 | 2003-01-21 | The Procter & Gamble Company | Oral care kits and compositions |
US7437344B2 (en) * | 2001-10-01 | 2008-10-14 | L'oreal S.A. | Use of artificial intelligence in providing beauty advice |
US8071076B2 (en) | 2002-05-28 | 2011-12-06 | Oral Health Clinical Services Llc | Oral lubricating and stain retarding compositions |
US8557224B2 (en) | 2003-07-15 | 2013-10-15 | Kao Corporation | Oral cavity composition |
JP2006106951A (ja) * | 2004-10-01 | 2006-04-20 | Dainippon Printing Co Ltd | Cosmetics management system |
AT503625B1 (de) | 2006-04-28 | 2013-10-15 | Chemiefaser Lenzing Ag | Hydroentangled product containing cellulosic fibers |
JP2009539719A (ja) | 2006-06-16 | 2009-11-19 | テイト アンド ライル イングレディエンツ アメリカス インコーポレイテッド | Pullulan films and their use in edible packaging |
WO2008080146A1 (en) | 2006-12-26 | 2008-07-03 | Discus Dental, Llc | Disposable tongue scraper |
US8728446B2 (en) | 2008-06-03 | 2014-05-20 | I Did It, Inc. | Oral hygiene tablets and capsules for direct oral delivery of active ingredients |
TWI404544B (zh) | 2008-08-11 | 2013-08-11 | Colgate Palmolive Co | Oral care compositions containing beads |
JP5379499B2 (ja) | 2009-01-29 | 2013-12-25 | リンテック株式会社 | Swallowable package and edible film laminate |
KR101074271B1 (ko) | 2009-06-25 | 2011-10-17 | (주)차바이오앤디오스텍 | Fast-dissolving oral film that effectively masks unpleasant taste |
US9750669B2 (en) | 2009-07-08 | 2017-09-05 | Wayne R Solan | Toothpaste droplets |
WO2011014401A2 (en) | 2009-07-30 | 2011-02-03 | The Procter & Gamble Company | Oral care articles and methods |
US9161890B2 (en) | 2010-06-30 | 2015-10-20 | Colgate-Palmolive Company | Multilayer films for delivery of flavor |
RU2555042C2 (ru) | 2010-07-02 | 2015-07-10 | Дзе Проктер Энд Гэмбл Компани | Method for delivering an active agent |
RU2535040C2 (ru) | 2010-07-02 | 2014-12-10 | Дзе Проктер Энд Гэмбл Компани | Methods of delivering a medical active agent by administering individual medical articles comprising a filament |
US20180163325A1 (en) | 2016-12-09 | 2018-06-14 | Robert Wayne Glenn, Jr. | Dissolvable fibrous web structure article comprising active agents |
MX2012015072A (es) | 2010-07-02 | 2013-02-07 | Procter & Gamble | Article with a dissolvable fibrous web structure comprising active agents |
MX366484B (es) | 2012-01-04 | 2019-07-10 | Procter & Gamble | Fibrous structures comprising particles and methods for making them |
US9304736B1 (en) | 2013-04-18 | 2016-04-05 | Amazon Technologies, Inc. | Voice controlled assistant with non-verbal code entry |
US9656102B2 (en) | 2013-04-23 | 2017-05-23 | Rita Vaccaro | Thin film toothpaste strip |
RU2742913C1 (ru) | 2013-09-06 | 2021-02-11 | Дзе Проктер Энд Гэмбл Компани | Capsules comprising water-soluble fibrous wall materials and methods for making them |
JP6705747B2 (ja) | 2013-09-06 | 2020-06-03 | ザ プロクター アンド ギャンブル カンパニーThe Procter & Gamble Company | Pouches comprising apertured film wall materials and methods of making the same |
CN105893721A (zh) * | 2014-05-13 | 2016-08-24 | 陈威宇 | Adaptive skin care information prompting system and adaptive skin care prompting method thereof |
FR3023110B1 (fr) * | 2014-06-30 | 2017-10-13 | Oreal | Method for analyzing users' cosmetic routines and associated system |
EP3192022B1 (en) * | 2014-08-04 | 2019-04-17 | Sarubbo, Davide | A system for checking a correct oral hygiene procedure |
EP3204539B1 (en) | 2014-10-10 | 2020-11-25 | The Procter and Gamble Company | Apertured soluble fibrous structures |
JP2018531437A (ja) * | 2015-06-15 | 2018-10-25 | アミール,ハイム | Systems and methods for adaptive skin treatment |
US20170024589A1 (en) * | 2015-07-22 | 2017-01-26 | Robert Schumacher | Smart Beauty Delivery System Linking Smart Products |
WO2017115211A1 (en) * | 2015-12-28 | 2017-07-06 | Koninklijke Philips N.V. | System and method for providing a user with recommendations indicating a fitness level of one or more topical skin products with a personal care device |
JP6710095B2 (ja) * | 2016-02-15 | 2020-06-17 | 日本電信電話株式会社 | Technique support apparatus, method, program, and system |
JP6730443B2 (ja) * | 2016-03-21 | 2020-07-29 | ザ プロクター アンド ギャンブル カンパニーThe Procter & Gamble Company | Systems and methods for providing customized product recommendations |
TWI585711B (zh) * | 2016-05-24 | 2017-06-01 | 泰金寶電通股份有限公司 | Method for obtaining care information, method for sharing care information, and electronic device thereof |
JP7027314B2 (ja) * | 2016-07-14 | 2022-03-01 | 株式会社 資生堂 | Advice information providing system and recording medium storing an advice information providing program |
MX2019009276A (es) | 2017-02-06 | 2019-09-19 | Procter & Gamble | Laundry detergent sheet with microcapsules |
MX2019013048A (es) | 2017-05-16 | 2019-12-11 | Procter & Gamble | Hair care conditioning compositions in the form of dissolvable solid structures |
US20190233974A1 (en) | 2018-01-26 | 2019-08-01 | The Procter & Gamble Company | Process for Making an Article of Manufacture |
US20190233970A1 (en) | 2018-01-26 | 2019-08-01 | The Procter & Gamble Company | Process for Making an Article of Manufacture |
US10546658B2 (en) * | 2018-01-29 | 2020-01-28 | Atolla Skin Health, Inc. | Systems and methods for formulating personalized skincare products |
US20190043064A1 (en) | 2018-03-29 | 2019-02-07 | Intel Corporation | Real-time qualitative analysis |
US10095688B1 (en) * | 2018-04-02 | 2018-10-09 | Josh Schilling | Adaptive network querying system |
JP7114742B2 (ja) | 2018-05-14 | 2022-08-08 | ザ プロクター アンド ギャンブル カンパニーThe Procter & Gamble Company | Oral care compositions comprising metal ions |
CN112135783A (zh) | 2018-05-14 | 2020-12-25 | 宝洁公司 | Dentifrice dispenser |
US20200143655A1 (en) | 2018-11-06 | 2020-05-07 | iEldra Inc. | Smart activities monitoring (sam) processing of data |
CN113939840A (zh) | 2019-06-10 | 2022-01-14 | 宝洁公司 | Method of generating user feedback information to enhance product use results |
CN113950314A (zh) | 2019-06-13 | 2022-01-18 | 宝洁公司 | Method for making fibrous structures |
MX2021014994A (es) | 2019-06-13 | 2022-01-24 | Procter & Gamble | Kits comprising oral care unit dose compositions |
CA3139979A1 (en) | 2019-06-13 | 2020-12-17 | The Procter & Gamble Company | Pouches comprising oral care active agents |
2020
- 2020-06-10 CN CN202080040925.2A patent/CN113939840A/zh active Pending
- 2020-06-10 US US16/897,316 patent/US20200388374A1/en not_active Abandoned
- 2020-06-10 WO PCT/US2020/070131 patent/WO2020252498A1/en unknown
- 2020-06-10 EP EP20751853.1A patent/EP3980962A1/en not_active Withdrawn
- 2020-06-10 US US16/897,793 patent/US11544764B2/en active Active
- 2020-06-10 JP JP2021571697A patent/JP7319393B2/ja active Active
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD947845S1 (en) * | 2019-01-03 | 2022-04-05 | The Procter & Gamble Company | Smart hub for a beauty regimen |
USD960155S1 (en) | 2019-01-03 | 2022-08-09 | The Procter & Gamble Company | Smart hub for a beauty regimen |
US11544764B2 (en) | 2019-06-10 | 2023-01-03 | The Procter & Gamble Company | Method of generating user feedback information to enhance product use results |
US20220160485A1 (en) * | 2020-11-23 | 2022-05-26 | Colgate-Palmolive Company | Personal Care System, Device, and Method Thereof |
Also Published As
Publication number | Publication date |
---|---|
US20200387942A1 (en) | 2020-12-10 |
EP3980962A1 (en) | 2022-04-13 |
CN113939840A (zh) | 2022-01-14 |
WO2020252498A1 (en) | 2020-12-17 |
US11544764B2 (en) | 2023-01-03 |
JP7319393B2 (ja) | 2023-08-01 |
JP2022535823A (ja) | 2022-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200388374A1 (en) | Method for Generating User Feedback Information From a Product Use Event and User Profile Data to Enhance Product Use Results | |
US20170011258A1 (en) | Image analysis in support of robotic manipulation | |
KR102619221B1 (ko) | Machine-implemented facial health and beauty assistant | |
US10799010B2 (en) | Makeup application assist device and makeup application assist method | |
CN108153169A (zh) | Tour guide mode switching method and system, and tour guide robot | |
EP3579176A1 (en) | Makeup evaluation system and operation method thereof | |
US11354882B2 (en) | Image alignment method and device therefor | |
EP2915101A1 (en) | Method and system for predicting personality traits, capabilities and suggested interactions from images of a person | |
WO2015122195A1 (ja) | Impression analysis device, game device, health management device, advertising support device, impression analysis system, impression analysis method, program, and program recording medium | |
GB2530515A (en) | Apparatus and method of user interaction | |
CN117242482A (zh) | Digital imaging and learning systems and methods for analyzing pixel data of a scalp region of a user's scalp to generate one or more user-specific scalp classifications | |
CN116547721A (zh) | Digital imaging and learning systems and methods for analyzing pixel data of an image of a hair region of a user's head to generate one or more user-specific recommendations | |
KR20220126909A (ko) | Cosmetics recommendation system based on artificial-intelligence-driven customized personal color diagnosis | |
WO2018029963A1 (ja) | Makeup support device and makeup support method | |
Sethukkarasi et al. | Interactive mirror for smart home | |
CN111064766A (zh) | Information pushing method and apparatus based on an Internet-of-Things operating system, and storage medium | |
CN118401971A (zh) | Digital imaging systems and methods for analyzing pixel data of an image of a user's skin region to determine skin pore size | |
KR20230044583A (ko) | Recording medium storing a hair style simulation program | |
KR20220099491A (ko) | Server and method for providing customized cosmetics using machine learning | |
KR20240011324A (ko) | Display system for recommending customized makeup techniques according to an individual's daily emotion information and facial skin condition | |
KR20230044587A (ko) | Apparatus for providing a virtual hair style experience service according to a hair style search term, and operating method thereof | |
KR20230045635A (ko) | Hair style recommendation service apparatus that considers fashion elements worn by the user, and operating method thereof | |
KR20230045632A (ko) | Operating method of an artificial-intelligence-model-based realistic hair style simulation apparatus | |
KR20230045633A (ko) | Computer program for hair style simulation | |
KR20230044584A (ko) | Bald-head-converted head model generation apparatus for hair simulation | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: THE PROCTER & GAMBLE COMPANY, OHIO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KREUZER, MELISSA ANN;SHERMAN, FAIZ FEISAL;PARKER, JUSTIN GREGORY;AND OTHERS;SIGNING DATES FROM 20200610 TO 20200612;REEL/FRAME:052924/0187 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |