WO2016094327A1 - Intelligent shaving system having sensors - Google Patents


Info

Publication number
WO2016094327A1
Authority
WO
WIPO (PCT)
Prior art keywords
blade
shaving system
external device
microcontroller
skin
Prior art date
Application number
PCT/US2015/064339
Other languages
French (fr)
Inventor
Haggai Goldfarb
Simon OREN
Original Assignee
Haggai Goldfarb
Oren Simon
Application filed by Haggai Goldfarb, Oren Simon
Publication of WO2016094327A1

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B26 — HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26B — HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
    • B26B21/00 — Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor
    • B26B21/40 — Details or accessories
    • B26B21/405 — Electric features; Charging; Computing devices
    • B26B21/4056 — Sensors or controlling means
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B26 — HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26B — HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
    • B26B21/00 — Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor
    • B26B21/08 — Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor involving changeable blades
    • B26B21/14 — Safety razors with one or more blades arranged transversely to the handle
    • B26B21/28 — Safety razors with one or more blades arranged transversely to the handle of the drawing cut type, i.e. with the cutting edge of the blade arranged obliquely or curved to the handle
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B26 — HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26B — HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
    • B26B21/00 — Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor
    • B26B21/40 — Details or accessories
    • B26B21/4081 — Shaving methods; Usage or wear indication; Testing methods
    • B26B21/4087 — Usage or wear indication
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B26 — HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26B — HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
    • B26B21/00 — Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor
    • B26B21/40 — Details or accessories
    • B26B21/52 — Handles, e.g. tiltable, flexible
    • B26B21/526 — Electric features
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B26 — HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26B — HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
    • B26B21/00 — Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor
    • B26B21/54 — Razor-blades
    • B26B21/56 — Razor-blades characterised by the shape
    • B26B21/565 — Bent razor blades; Razor blades with bent carriers
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B26 — HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26B — HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
    • B26B21/00 — Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor
    • B26B21/54 — Razor-blades
    • B26B21/58 — Razor-blades characterised by the material
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/0002 — Inspection of images, e.g. flaw detection
    • G06T7/0004 — Industrial image inspection
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/40 — Analysis of texture
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10024 — Color image
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/30 — Subject of image; Context of image processing
    • G06T2207/30004 — Biomedical image processing
    • G06T2207/30088 — Skin; Dermal
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/30 — Subject of image; Context of image processing
    • G06T2207/30196 — Human being; Person
    • G06T2207/30201 — Face

Definitions

  • a shaving system includes a handle; at least one blade connected to the handle; a microcontroller attached to the handle; and one or more sensors adjacent the at least one blade.
  • the one or more sensors are configured to transmit sensory data to the microcontroller, and one of the one or more sensors is a proximity sensor.
  • a shaving system includes a handle; at least one blade connected to the handle; a microcontroller attached to the handle; and one or more sensors adjacent the at least one blade.
  • the one or more sensors are configured to send sensory data to the microcontroller, and one of the one or more sensors is a camera having an image sensor configured to capture video and/or still images.
  • a blade includes a front leading edge of the blade; a spine of the blade; and a nanolattice that connects the front leading edge to the spine.
  • the mountable electrical device further includes a memory electrically connected to the microcontroller.
  • the memory is configured to store data from the microcontroller.
  • the mountable electrical device further includes one or more sensors attached to the precision hand tool. The one or more sensors are configured to provide sensory data to the microcontroller.
  • FIG. 1C illustrates a side view of a shaving system with a force sensor and an image camera according to an embodiment of the present invention.
  • FIG. 8 illustrates a shaving system with image camera that streams video via a wireless communication unit to an external wristwatch according to an embodiment of the present invention.
  • FIG. 10B illustrates various images of a shaved area of skin according to an embodiment of the present invention.
  • FIG. 11 is a flow diagram for gauging blade attrition according to an embodiment of the present invention.
  • FIGS. 13A and 13B illustrate a front view and an exploded view, respectively, of a blade cartridge 150 with blades that are slightly curved according to an embodiment of the present invention.
  • FIGS. 14A and 14B illustrate an ISO view and a plan view of a nanolattice blade with an octet-truss structure according to an embodiment of the present invention.
  • proximity sensor refers to a sensor that may be configured to detect how close blade 151 is to the skin.
  • Proximity sensors may include physical contact sensors that are configured to detect the force applied between blade 151 and the skin as well as sensors that do not have a physical contact between blade 151 and the skin.
  • Proximity sensors include, but are not limited to, IR sensors, ultrasonic rangefinders, and accelerometers.
  • FIG. 12 illustrates electronic components and modules of shaving system 100 in relation to external device 505 and cloud server 545 in accordance with some embodiments of the present disclosure. It should be understood that although shaving system 100, external device 505, and cloud server 545 are shown, the embodiments described herein with respect to FIG. 12 are not limited to shaving system 100, external device 505, or cloud server 545.
  • As depicted in FIG. 12, the components included in shaving system 100 are encased within handle body 520, and as depicted in FIG. 5 - FIG. 7, handle body 520 has an ergonomic shape that conforms to a user's grip. In some embodiments, one or more components of the shaving system 100 are incorporated within handle body 520 and one or more components are configured to conform to handle body 520.
  • speaker 164, microphone 165, and/or indicator display 510 are located externally on handle 140 of shaving system 100.
  • handle body 520 is configured to conform around USB connector 111
  • shaving system 100 includes within handle body 520 a microcontroller 160, which is an integrated circuit that embeds a processor core 169, cache memory 168, and programmable input/output peripherals 167 on a single chip, as illustrated in FIG. 12.
  • Microcontroller 160 may include additional embedded components to facilitate aspects of intelligent shaving system 100, such as portions of wireless communication unit 110, an audio/video (AV) wireless module 117, a video transmitter/broadcaster, a video encoder/decoder (e.g., video compressor), an audio encoder/decoder (e.g., audio compressor), an encryption unit, a timer, and the like.
  • microcontroller 160 is configured to electrically interface with sensors, specifically, camera sensor 163, force sensor 120, and microphone 165. Microcontroller 160 is also configured to facilitate interaction with a user by providing audio and/or visual feedback to the user during a shave session.
  • shaving system 100 includes on handle body 520, speaker 164 and indicator display 510.
  • shaving system 100 includes on handle body 520, user interaction switches 515 (e.g., power switch, selection switch) to select various features on shaving system 100.
  • Shaving system 100 includes first memory 161 electrically connected to microcontroller 160.
  • the first memory 161 is configured to store data associated with at least one blade 151.
  • first memory 161 is configured to store data and/or information to facilitate the interaction between microcontroller 160 and electrically connected sensors (e.g., camera sensor 163, force sensor 120).
  • first memory 161 is non-volatile memory and/or is configured to buffer sensory data between one or more sensors and wireless communication unit 110.
  • microcontroller 160 may offload sensory data to external device 505. Accordingly, in some examples, microcontroller 160 is configured to transmit sensory data via wireless communication unit 110 to wireless module 555 on external device 505. As such, external device 505 includes sensor analysis module 550 and image analysis module 560 to determine one or more quantitative results. External device 505 includes one or more processors 575 as well as secondary memory 570 that may be volatile or non-volatile. In some embodiments, external device 505 may display on display 565 streamed image frames and/or quantitative indicators. In some instances, display 565 is a touch screen configured to interface with a user with selectable software buttons or switches.

1. Shaving system 100 with proximity sensor
  • Communication unit 110 includes both Bluetooth and WiFi protocols and either may be configured to stream video data from camera 163 and/or audio data from microphone 165.
  • In some embodiments, WiFi module 119 of wireless communication unit 110 is configured to use IEEE 802.11 protocols for implementing wireless local area network (WLAN) computer communication in the 2.4, 3.6, 5, and 60 GHz frequency bands.
  • Bluetooth module 118 of wireless communication unit 110 is configured in accordance with IEEE 802.15 protocols.
  • external device 505 includes a wireless module 555 (e.g., FIG. 12).
  • wireless communication unit 110 may be separated across multiple locations and/or multiple printed circuit boards (PCBs).
  • WiFi module 119 and WiFi antenna 162 are disposed close to microcontroller 160 on camera PCB 166 and audio/video PCB 167 rather than on communication PCB 114.
  • wireless communication unit 110 is embedded in microcontroller 160.
  • Bluetooth module 118 and Bluetooth antenna 113 are integrated with wireless communication unit 110 on communication PCB 114.
  • the Bluetooth module 118 is configured to transmit media information (e.g., streamed captured image frames) from video camera 163.
  • shaving system 100 includes wireless communication unit 110, which is attached to handle 140 and is electrically connected to microcontroller 160.
  • communication unit 110 is configured to transmit and receive data between microcontroller 160 and wireless module 555 on external device 505 (e.g., FIG. 5 and FIG. 12).
  • shaving system 100 includes force sensor 120 (e.g., force cell, load cell) coupled to lever assembly 130.
  • Lever assembly 130 hinges blade cartridge 150 around first fulcrum 131 and second fulcrum 132 to depress plunger 124 over a distance S_0.
  • Various techniques may be used to determine plunger depression distance S_0.
  • plunger 124 is connected to a terminal of a slider potentiometer or a variable resistor and configured to provide a resistance or voltage proportional to plunger depression distance S_0.
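
The slider-potentiometer reading described above can be turned into the depression distance S_0 and, via spring 123, an approximate contact force. Below is a minimal Python sketch under assumed constants (ADC resolution, plunger travel, spring stiffness); none of these values come from the patent.

```python
# Minimal sketch: converting a slider-potentiometer reading into the plunger
# depression distance S_0 and a spring force. All constants are illustrative
# assumptions, not values from the patent.

ADC_MAX = 4095           # 12-bit ADC full-scale reading (assumed)
FULL_TRAVEL_MM = 3.0     # assumed full mechanical travel of plunger 124 (S_0 range)
SPRING_K_N_PER_MM = 0.8  # assumed stiffness of spring 123

def plunger_depression_mm(adc_reading: int) -> float:
    """Map a raw ADC count from the slider potentiometer to S_0 in millimetres."""
    fraction = adc_reading / ADC_MAX
    return fraction * FULL_TRAVEL_MM

def spring_force_newtons(s0_mm: float) -> float:
    """Approximate the compressive force on force sensor 120 as F = k * S_0."""
    return SPRING_K_N_PER_MM * s0_mm

if __name__ == "__main__":
    reading = 1365                      # example raw ADC count
    s0 = plunger_depression_mm(reading)
    print(f"S_0 = {s0:.2f} mm, F = {spring_force_newtons(s0):.2f} N")
```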
  • the displacement at the tip of input arm 138 (e.g., input displacement distance) is proportional to the applied normal force, F_N. That is, the displacement distance S_i of input arm 138 is zero without any applied normal force, F_N, as illustrated in FIG. 2A.
  • FIG. 2B and FIG. 2C illustrate increasing displacement distance S_i of input arm 138 with the application of normal forces, 1/2 F_N and F_N, respectively.
  • the forward kinematics of the displacement distance S_i translates to a counter-clockwise rotational motion of input arm 138 about second fulcrum 132 that displaces coupling 137 a distance, S_m.
  • the displacement of coupling 137, S_m, translates to a clockwise rotational motion of output arm 134 about first fulcrum 131 that displaces plunger 124 a distance, S_0.
  • the precise displacement distance of coupling 137, S_m, with respect to the input displacement, S_i, is based on a ratio of the distance from the tip of input arm 138 to second fulcrum 132, L_1, and the distance from second fulcrum 132 to the center of coupling 137, L_2, or S_m = S_i × (L_2 / L_1).
  • the displacement distance of plunger 124 (e.g., output displacement), S_0, with respect to the displacement distance of coupling 137, S_m, is based on a ratio of the distance from the center of coupling 137 to first fulcrum 131, L_3, and the distance from first fulcrum 131 to plunger 124, L_4, or S_0 = S_m × (L_4 / L_3).
  • the lever assembly 130 of shaving system 100 can tune the transference ratio between the input displacement and the output displacement.
  • lever assembly 130 is configured to displace plunger 124 (e.g., output displacement), S_0, less than the displacement distance of the tip of input arm 138 (e.g., input displacement distance), S_i, which results in a transference ratio greater than one.
  • lever assembly 130 is configured to displace plunger 124 (e.g., output displacement), S_0, more than the displacement distance of the tip of input arm 138 (e.g., input displacement distance), S_i, which results in a transference ratio less than one.
  • One benefit of a transference ratio larger than one is that the displacement distance of plunger 124 (e.g., output displacement), S_0, is larger than the displacement at the tip of input arm 138 (e.g., input displacement distance), S_i, which results in a force sensor 120 with a higher resolution.
  • the sensing force, F_S, with respect to normal force, F_N, is based on the distance from the tip of input arm 138 to second fulcrum 132, L_1, times the distance from the center of coupling 137 to first fulcrum 131, L_3, divided by the distance from second fulcrum 132 to the center of coupling 137, L_2, and divided by the distance from first fulcrum 131 to plunger 124, L_4, or F_S = F_N × (L_1 × L_3) / (L_2 × L_4).
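
The lever ratios above fully determine how an input displacement or normal force at the blade maps to the plunger and force sensor 120. A short sketch of that forward kinematics, using illustrative (not patent-specified) arm lengths L_1 through L_4:

```python
# Sketch of the lever kinematics described above: input displacement S_i at the
# tip of input arm 138 is scaled by L_2/L_1 about second fulcrum 132, then by
# L_4/L_3 about first fulcrum 131; forces scale by the reciprocal ratios.
# The lengths below are illustrative assumptions, not patent values.

L1, L2 = 20.0, 10.0   # mm: tip of input arm 138 to fulcrum 132, fulcrum 132 to coupling 137
L3, L4 = 8.0, 24.0    # mm: coupling 137 to fulcrum 131, fulcrum 131 to plunger 124

def plunger_displacement(s_i: float) -> float:
    """S_0 = S_i * (L_2 / L_1) * (L_4 / L_3)."""
    s_m = s_i * (L2 / L1)        # displacement of coupling 137
    return s_m * (L4 / L3)       # displacement of plunger 124

def sensing_force(f_n: float) -> float:
    """F_S = F_N * (L_1 * L_3) / (L_2 * L_4)."""
    return f_n * (L1 * L3) / (L2 * L4)

print(plunger_displacement(1.0))  # 1.5 mm of plunger travel for 1 mm of input displacement
print(sensing_force(2.0))         # ~1.33 N at force sensor 120 for a 2 N normal force
```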
  • Tangential force, F_T, is part of composite force, F, that refers to the force a user applies to blade cartridge 150 to cut hair across the surface of the skin, and is based, at least in part, on friction due to the blade 151 dragging on the surface of the skin.
  • lever assembly 130 is configured to translate (e.g., transfer) tangential force, F_T, to depress plunger 124.
  • second fulcrum 132 is coupled to second slide bearing 133, which is configured to move along an inclined plane at an angle θ with respect to the gripping portion of handle 140.
  • Applying tangential force, F_T, to the tip of input arm 138 slides second fulcrum 132 up the inclined plane at angle θ to reposition coupling 137.
  • coupling 137 readjusts the position of output arm 134 along a channel within output arm 134 and first slide bearing 139 while coupling 137 pivots around first fulcrum 131 to depress plunger 124.
  • the position of second fulcrum 132 remains the same with respect to input arm 138, whereas the position of first fulcrum 131 is adjusted based on applied tangential force, F_T.
  • the distance from the center of coupling 137 to first fulcrum 131, L_3, and the distance from first fulcrum 131 to plunger 124, L_4, vary over the distance of the channel within output arm 134.
  • This variance in the distance from the center of coupling 137 to first fulcrum 131, L_3, and the distance from first fulcrum 131 to plunger 124, L_4, varies the transference ratio.
  • position sensor 136 may include other sensors such as a capacitive transducer, a capacitive displacement sensor, an eddy-current sensor, an ultrasonic sensor, a grating sensor, a Hall effect sensor, an inductive non-contact sensor, an optical sensor (e.g., laser Doppler vibrometer), a linear variable differential transformer (LVDT), a multi-axis displacement transducer, a photodiode array, a piezo-electric transducer, a rotary encoder, or the like.
  • tangential force, F_T, is proportional to a combination of the offset sliding distance, S_off, and the displacement distance of plunger 124, S_0.
  • microcontroller 160 is configured to determine the applied tangential force, F_T, based on both the displacement distance of plunger 124, S_0, and the offset of position sensor 136.
  • lever assembly 130 and force sensor 120 are configured to combine normal force, F_N, and tangential force, F_T, into single quantitative indicator 510 that is associated with the total force applied to the skin.
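
The text does not specify how F_N and F_T are combined into indicator 510; one plausible reading is a vector magnitude scaled to the indicator range. A hedged sketch of that assumption:

```python
import math

# Hedged sketch: one possible way to fold normal force F_N and tangential force
# F_T into a single quantitative indicator 510. The patent does not specify the
# combination; a vector magnitude normalized to an assumed maximum is used here.

MAX_EXPECTED_FORCE_N = 5.0   # assumed full-scale total force for the indicator

def total_force_indicator(f_n: float, f_t: float) -> int:
    """Return an indicator value in 0..100 proportional to the combined force."""
    magnitude = math.hypot(f_n, f_t)                   # |F| = sqrt(F_N^2 + F_T^2)
    level = min(magnitude / MAX_EXPECTED_FORCE_N, 1.0)
    return round(level * 100)

print(total_force_indicator(1.2, 0.8))  # ~29 on a 0-100 scale
```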
  • lever assembly 130 is configured to transfer both normal force, F_N, and tangential force, F_T, from blade 151 in contact with the skin to the compressive force at the proximity sensor.
  • lever assembly 130 and spring 123 cushion and absorb sudden movements. This provides for blade 151 to follow along the surface contour of the skin and conform across imperfections (e.g., micro bumps) for a closer, more comfortable shave.
  • lever assembly 130 and force sensor 120 include a dashpot configured to reduce vibrations in the spring 123 as well as slow the travel of lever assembly 130 to the initial position depicted in FIG. 2A and FIG. 3 A.
  • the dashpot includes pneumatics.
  • shaving system 100 is not limited to lever assembly 130 or force sensor 120 to detect one or both of normal force, F_N, or tangential force, F_T.
  • one or more strain sensors (e.g., piezo-electric sensors) may be configured to sense normal force, F_N, and/or tangential force, F_T, that can be combined into single quantitative indicator 510.
  • Some embodiments of shaving system 100 display quantitative force indicator 510 on handle 140 of shaving system 100 or alternatively on external device 505 (e.g., smartphone 525, tablet 535, laptop, or desktop 540) via wireless communication unit 110 to wireless module 555.
  • microcontroller 160 stores to first memory 161 data indicative of the force applied (e.g., the force over a shave session) prior to blade cartridge 150 replacement. This provides a reference for a 'dull' blade 151 and provides another indicator to facilitate predicting blade attrition and end of life of blade cartridges 150.
  • microcontroller 160 is configured to store in first memory 161 the data indicative of the force applied between a new blade cartridge 150 and the skin during the first shaving session. This beneficially can be used as a baseline for a 'sharp blade' for subsequent shaving sessions.
  • Tracking the force applied in this manner provides a metric to gauge blade attrition (e.g., dulling of blade 151).
  • the force a user applies using a new 'sharp' blade 151 may be equal to 1/2 F_N, which displaces lever assembly 130 as depicted in FIG. 2B.
  • the average force a user applies using an older 'dull' blade 151 may be equal to F_N, which displaces lever assembly 130 twice as far, as depicted in FIG. 2C.
  • the additional force against the skin a user applies to compensate for the additional friction and inefficiencies of 'dull' blade 151 is twice that applied with 'sharp' blade 151.
  • microcontroller 160 or the external device is configured to count the number of shaving strokes, which in this instance is the number of times in a shaving session that an applied force exceeds the calculated average force applied over several shaving sessions. Contrasting the number of shaving strokes provides another metric to gauge blade attrition (e.g., dulling of blade 151). For example, the number of shaving strokes for new 'sharp' blade 151 is often significantly less than the number of shaving strokes for older 'dull' blade 151, because a user will drag 'dull' blade 151 across the skin more times to account for less efficient cutting. As such, the number of shaving strokes increases as the blade dulls, which provides a metric to gauge blade attrition.
  • microcontroller 160 or the external device incorporates machine learning (e.g., heuristics) to determine blade attrition based on the number of strokes.
  • shaving system 100 may include a threshold associated with a number of shaving strokes for 'dull' blade 151.
  • Microcontroller 160 or the external device adjusts the threshold associated with a number of shaving strokes for 'dull' blade 151 each time a user replaces blade 151. Over time, the threshold associated with a number of shaving strokes for 'dull' blade 151 converges on an accurate value that is based on a user's comfort level for blade cartridge 150 replacements.
  • microcontroller 160 is configured to prompt the user when the number of shaving strokes for 'dull' blade 151 approaches the adjusted threshold level.
  • external device 505 may be configured to prompt the user once the number of shaving strokes exceeds 90% of the threshold associated with number of shaving strokes for 'dull' blade 151.
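
A possible implementation of this adaptive threshold is an exponential adjustment toward the stroke count observed at each blade replacement, with a prompt once 90% of the current threshold is reached. The starting threshold and adjustment rate below are illustrative assumptions, not values from the patent:

```python
# Hedged sketch of the adaptive 'dull blade' threshold described above: each time
# the user replaces blade 151, the stroke-count threshold is nudged toward the
# total number of strokes actually taken on the replaced blade, and the user is
# prompted once the running count passes 90% of the threshold.

class DullBladeThreshold:
    def __init__(self, initial_threshold: int = 4000, alpha: float = 0.3):
        self.threshold = float(initial_threshold)  # strokes until 'dull' (assumed start)
        self.alpha = alpha                          # adjustment rate per replacement (assumed)

    def on_blade_replaced(self, strokes_on_old_blade: int) -> None:
        """Converge the threshold toward the user's actual replacement behavior."""
        self.threshold += self.alpha * (strokes_on_old_blade - self.threshold)

    def should_prompt(self, strokes_so_far: int) -> bool:
        """Prompt (e.g., on external device 505) past 90% of the threshold."""
        return strokes_so_far >= 0.9 * self.threshold

tracker = DullBladeThreshold()
tracker.on_blade_replaced(3500)          # user replaced the blade after 3500 strokes
print(tracker.threshold)                 # 3850.0: threshold moves toward 3500
print(tracker.should_prompt(3500))       # True: 3500 >= 90% of 3850
```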
  • a pop-up is displayed that enables the user to order a new replacement blade online.
  • replacement blades are automatically ordered for a user.
  • the force sensor is configured as a proximity sensor that detects contact between each blade 151 and the skin.
  • force sensor 120 is configured to indicate contact between blade cartridge 150 and the skin for any depression distance S_0 greater than zero (e.g., S_0 > 0).
  • force sensor 120 may be configured to indicate contact based on changes in force, F, over a time differential, Δt, which can provide feedback to a user (e.g., audible sound, light, or message displayed on an external device) to assist in proper shaving techniques.
  • lever assembly 130 includes a stopper configured to reduce the travel distance of lever assembly 130.
  • the stopper may be set at various positions of known deflection that are used to calibrate force sensor 120.
  • a stopper is set in a position that indicates a force threshold of 'dull' blade 151.
  • proximity sensor is a touch based sensor (e.g., piezoelectric sensor, capacitive sensor) attached to each blade 151 on blade cartridge 150 configured to detect contact of each blade 151 with the skin.
  • blade 151 is in contact with the skin and the proximity sensor is configured to detect a compressive force.
  • proximity sensor is attached to the front of blade cartridge 150 adjacent to blades 151 that are configured to detect contact between blade cartridge 150 and the skin.
  • shaving system 100 is not meant to be limited to force sensor 120.
  • lever assembly 130 may hinge blade cartridge 150 around fulcrum 131 to extend plunger 124 over a negative distance, -S_0.
  • spring 123 of force sensor 120 is configured to detect a tensile force rather than a compressive force.
  • blade 151 is in contact with the skin and the proximity sensor is configured to detect a tensile force.
  • lever assembly 130 is configured to transfer both normal force, F_N, and tangential force, F_T, from blade 151 in contact with the skin to the tensile force at the proximity sensor.
  • Other contact based proximity sensors configured to detect the force blade 151 exerts on the skin include piezoelectric sensors, capacitive sensors, and micro-electrical mechanical system (MEMS) sensors.
  • the proximity sensor is an ultrasonic rangefinder.
  • this includes a distance ranging mechanism such as an ultrasonic pulse rangefinder configured to determine the distance from blade 151 to the skin.
  • the proximity sensor is an infrared (IR) sensor or any electronic sensor configured to detect an electromagnetic field or a beam of electromagnetic radiation (e.g., infrared, laser).
  • the proximity sensors include optical or infrared imaging.
  • video camera 163 may be configured to detect proximity based on the incident light disparity such as detecting a dim, low intensity light when close to the skin and a brighter intense light away from the skin.
  • infrared sensors are configured to capture images that distinguish a slightly heated region caused by the friction of dragging blades 151 across the skin.
  • shaving system 100 may be configured to capture a profile of the slightly heated region and analyze the captured profile for uneven wear (e.g., imbalances in blade attrition).
  • proximity sensor is an accelerometer, which can detect the stroke count as well as the hand motion acceleration, which might assist in indicating dullness based on excess force applied by the user.
  • the proximity sensor is a mechanical friction sensor that detects mechanical deflections in a region where blades 151 contact the skin. Often, the mechanical deflections facilitate a mechanical friction sensor to detect both compressive forces (e.g., FIG. 2A-2C) and tensile forces (e.g., FIG. 3A-3C) in a region where blades 151 contact the skin.
  • the proximity sensor is a mechanical friction sensor that uses a piezoelectric film. In some instances, the mechanical friction sensor is attached to the front of blade cartridge 150 adjacent to blades 151 in a region that contacts the skin.
  • the mechanical friction sensor uses a piezoelectric film that attaches to the front of blade cartridge 150 and adjacent to blades 151 to detect contact between blade cartridge 150 and the skin. In some instances, the mechanical friction sensor is attached between at least one blade 151 and handle 140.
  • the proximity sensor is a piezoelectric friction sensor that attaches between blades 151 and the body of blade cartridge 150.
  • the proximity or the contact sensor is a piezoelectric sensor that attaches to one or more blades 151 to detect the deflection of each blade 151.
  • lever assembly 130 of shaving system 100 is configured to sense both a normal force, F_N (e.g., FIG. 2A-2C), and a tangential force, F_T (FIG. 3A-3C), in a region where blades 151 contact the skin.
  • lever assembly 130 includes second slide bearing 133 that moves second fulcrum 132 in a direction of applied tangential force (e.g., friction force). That is, the applied tangential force (e.g., friction force) adjusts the position of second fulcrum 132 along input arm 138 that in turn adjusts the position of first fulcrum 131 to pivot along output arms 134. This adjustment transmits the applied tangential force (e.g., friction force) on blade 151 to output arm 134 to depress force sensor 120.
  • microcontroller 160 is configured to collect and store in first memory 161, data associated with the forces applied to force sensor 120 for a portion of a shaving session.
  • microcontroller 160 is configured with a timer that measures the period of time that the proximity sensor detects contact between blade 151 and the skin. For this technique, the contact duration is compared to a contact duration threshold to determine a completed shaving stroke.
  • the proximity sensor is configured to detect when at least one blade 151 contacts the skin.
  • microcontroller 160 may not accurately interpret the occurrence of a shaving stroke when the contact duration is too short or too long. As such, the contact duration threshold may be adjusted by the user (e.g., using external device 505 via wireless communication unit 110 to wireless module 555).
  • microcontroller 160 or the external device 505 is configured to automatically and incrementally adjust a threshold value (e.g., contact duration threshold) representative of the period of time that the proximity sensor detects contact between blade 151 and the skin, for instance, based on the user's behavior.
  • microcontroller 160 is configured to provide instructions to an external device 505 to incrementally adjust a threshold value (e.g., contact duration threshold ) representative of the period of time that the proximity sensor detects contact between blade 151 and the skin based on the user's behavior. For example, a woman shaving her legs may have long contact shaving strokes, whereas a man shaving his face may have short contact shaving strokes.
  • microcontroller 160 is configured to adaptively adjust (e.g., using heuristic learning) the contact duration threshold to calculate a more accurate metric for the total accumulated time that the blade 151 made contact with the skin.
  • microcontroller 160 facilitates a more accurate estimate for predicting the blade attrition.
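
One way to realize the duration-based stroke counting and adaptive contact-duration threshold described above is sketched below; the initial threshold, adaptation rate, and the rule of adapting toward half the observed stroke duration are assumptions, not taken from the patent:

```python
# Minimal sketch of duration-based stroke counting: a shaving stroke is counted
# when the proximity sensor reports skin contact for longer than a contact
# duration threshold, and the threshold adapts toward the user's typical stroke
# length (e.g., longer for legs, shorter for a face).

class StrokeCounter:
    def __init__(self, duration_threshold_s: float = 0.3, alpha: float = 0.1):
        self.duration_threshold_s = duration_threshold_s  # assumed starting threshold
        self.alpha = alpha                                 # assumed adaptation rate
        self.stroke_count = 0
        self.total_contact_time_s = 0.0

    def on_contact_ended(self, contact_duration_s: float) -> None:
        """Call when the proximity sensor reports blade 151 leaving the skin."""
        self.total_contact_time_s += contact_duration_s
        if contact_duration_s >= self.duration_threshold_s:
            self.stroke_count += 1
            # Adapt the threshold toward a fraction of the observed stroke length.
            target = 0.5 * contact_duration_s
            self.duration_threshold_s += self.alpha * (target - self.duration_threshold_s)

counter = StrokeCounter()
for duration in (0.1, 0.8, 0.6, 0.05, 0.7):   # seconds of contact per touch
    counter.on_contact_ended(duration)
print(counter.stroke_count, round(counter.total_contact_time_s, 2))  # 3 strokes, 2.25 s of contact
```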
  • Shaving system 100 may also provide a quantitative comparison based on manufacturer's data. For example, a manufacturer may report that a particular blade cartridge 150 lasts up to five weeks. Based on an average of 150 shaving strokes per session determined for a user, microcontroller 160 would determine an expected lifetime of 5,250 shaving strokes (e.g., 150 × 5 × 7). In some embodiments, microcontroller 160 is configured to provide instructions to external device 505 to determine a total number of occurrences detected by the proximity sensor in second memory 570 and display on a display a quantitative comparison between the total number of shaving strokes and a number of shaving strokes expected over the lifetime of blade 151.
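
The worked example above (150 strokes per session, 7 sessions per week, 5 manufacturer-reported weeks) can be expressed directly; the dullness-percentage mapping to indicator display 510 is an illustrative choice rather than the patent's stated formula:

```python
# Sketch of the lifetime comparison worked above: 150 strokes per session,
# 7 sessions per week, 5 weeks reported by the manufacturer gives an expected
# lifetime of 5,250 strokes; the dullness indicator is the fraction of that
# budget already used. Numbers are taken from the example in the text.

AVG_STROKES_PER_SESSION = 150
SESSIONS_PER_WEEK = 7
MANUFACTURER_WEEKS = 5

expected_lifetime_strokes = AVG_STROKES_PER_SESSION * MANUFACTURER_WEEKS * SESSIONS_PER_WEEK
print(expected_lifetime_strokes)  # 5250

def dullness_percent(total_strokes_so_far: int) -> float:
    """Quantitative comparison between strokes taken and strokes expected."""
    return min(100.0, 100.0 * total_strokes_so_far / expected_lifetime_strokes)

print(dullness_percent(3150))  # 60.0 -> could drive, e.g., indicator display 510
```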
  • shaving system 100 includes indicator display 510 disposed on handle 140.
  • microcontroller 160 is configured to receive the quantitative comparison from external device 505 via wireless communication unit 110 and display on the display 510 a dullness indicator representative of the quantitative comparison.
  • second memory 570 is electrically connected to external device 505. (FIG. 12) In some instances, second memory 570 is configured to store data associated with the at least one blade 151.
  • microcontroller 160 is configured to provide instructions to wireless communication unit 110 to transmit a quantitative comparison of the total number of shaving strokes stored in the memory and the number of shaving strokes expected over the lifetime of at least one blade 151 to be provided for display on external device 505.
  • the quantitative comparison is a display bar, color LEDs, or a small LCD displayed on handle 140 of shaving system 100 akin to dullness indicator 510 represented as a display bar as depicted on handle 140 in FIG. 5.
  • shaving system 100 includes a server-based or cloud-based 545 user subscription account that is configured to retrieve and store the relevant information from shaving system 100 for blade cartridge 151, such as the manufacturer, model number, number of completed shaving strokes, anticipated number of days remaining on blade cartridge 151, and the life expectancy of each blade.
  • the subscription account is configured to notify the user (e.g., via email, pop-up message) that a replacement blade cartridge should be ordered when the anticipated number of days remaining in the life of blade cartridge 151 drops below a certain threshold.
  • the subscription account is configured to automatically order or purchase a replacement cartridge once the anticipated number of days remaining in the life of blade cartridge 151 drops below a certain threshold.
  • the server-based or cloud-based 545 user subscription account is accessible through the external device 505. Accordingly, the external device 505 may be configured to provide access to the server-based or cloud-based user subscription account.
  • the server-based user subscription account is configured to order replacements for the at least one blade based on data or instructions received from the microcontroller 160 in some examples.
  • the server-based or cloud-based 545 user subscription is configured to retrieve the quantitative comparison between the total number of shaving strokes from the memory via wireless module 555 and order replacements for the at least one blade when the total number of shaving strokes reaches a threshold value proportional to the quantitative comparison.
  • In some embodiments, microcontroller 160 or external device 505 applies filtering techniques (e.g., low-pass filters to remove flicker noise) and statistical analysis (e.g., standard deviation, expected value) to the sensory data.
  • microcontroller 160 may be configured to provide sensory data to external device 505. As such, microcontroller 160 is configured to transmit sensory data via wireless communication unit 110 to wireless module 555 on external device 505. It should be understood that many of the computations performed by microcontroller 160 may be performed on external device 505 and transmitted and/or stored to first memory 161 on shaving system 100. This beneficially conserves power on shaving system 100 and in some instances may reduce the total processing time. Likewise, the quantitative comparison and other parameters may be displayed on external device 505.
  • shaving system 100 includes handle 140, at least one blade 151 connected to handle 140, microcontroller 160 attached to handle 140, and one or more sensors adjacent at least one blade 151.
  • the one or more sensors are configured to send sensory data to microcontroller 160.
  • one of the one or more sensors is camera 163 having an image sensor configured to capture video and/or still images. In some instances, camera 163 is configured to capture both frames and video.
  • microcontroller 160 is configured to stream video data from camera 163 and/or audio data from microphone 165 via wireless communication unit 110 to be displayed on external device 505 (e.g., smartphone, tablet, laptop, desktop).
  • shaving system 100 includes wireless communication unit 110 attached to handle 140 and electrically connected to microcontroller 160.
  • wireless communication unit 110 is configured to transmit and receive data from microcontroller 160 to external device 505 (e.g., FIG. 5 and FIG. 12).
  • memory 161 is electrically connected to microcontroller 160.
  • memory 161 is configured to store data associated with at least one blade 151.
  • microcontroller 160 is configured to instruct camera 163 to capture frames of the images and instruct wireless communication unit 110 to stream the frames to be processed, analyzed, or displayed on external device 505.
  • the frames are stored in first memory 161 or in an external storage (e.g., second memory 570) on external device 505.
  • external device 505 is a wearable computing device, such as a wristwatch as depicted in FIG. 5, FIG. 6, and FIG. 8.
  • external device 505 is a hand-held phone 525 (e.g., mobile phone) as depicted in FIG. 7 or tablet that is held or mounted nearby similar to a portable hand mirror.
  • external device 505 uses a media player embedded in a user interface (UI) that is configured to play the video and/or audio captured in real time.
  • the media player may include other features, such as zoom (e.g., manual zoom or automatic zoom) and/or correction functionality that conditions the streamed media (e.g., image sharpness, contrast, color balance, filtering techniques).
  • One benefit of using camera sensor 163 is to provide a shaving view to the user on external device 505 without the need for a mirror, as well as viewing regions difficult to view with a single mirror (e.g., back of the neck). Further, having video streamed from the camera 163 offers a close-up look of the shaving regions to ensure a proper shaving technique and to better check the quality of the shave.
  • external device 505 can provide feedback to a user in real time or in near real time.
  • external device 505 is configured to analyze the frame images to determine a blade attrition comparison based on the analyzed frame images and present for display on display 565 on external device 505 the blade attrition comparison represented as a compass-like arrow that updates in near real time.
  • Microcontroller 160 may, in some examples, be configured to offload other tasks in order to save on power and provide more efficient utilization of computational resources, particularly during computationally intensive operations.
  • microcontroller 160 is configured to instruct wireless communication unit 110 to transmit frames to external device 505.
  • external device 505 may include image analysis module 560 (FIG. 12) to differentiate a color variation between adjacent pixels in the captured frame and store in first memory 161 a quantitative comparison for the remaining hair.
  • the color between adjacent pixels may vary from a pinkish hue of bare skin that has been fully shaven to dark black that is unshaven.
  • external device 505 (e.g., via image analysis module 560) is configured to determine a quantitative comparison for the remaining hair based on the captured frames.
  • the amount of hair remaining is provided as a percentage of remaining hair that ranges from 100% (e.g., thick beard) to 0% (e.g., bare skin).
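
A simple way to obtain such a percentage from the pixel color variation is to threshold a luminance image and report the fraction of dark ('hair') pixels. This is only one possible realization; the threshold and grayscale conversion below are assumptions:

```python
import numpy as np

# Hedged sketch of the pixel-difference idea above: dark pixels (stubble) are
# separated from pink/bare skin with a simple intensity threshold, and the
# fraction of "hair" pixels in the frame is reported as a percentage of
# remaining hair. The patent does not prescribe a specific algorithm.

def remaining_hair_percent(frame_rgb: np.ndarray, threshold: int = 90) -> float:
    """frame_rgb: HxWx3 uint8 image of the shaved/unshaved region."""
    gray = frame_rgb.astype(np.float32).mean(axis=2)   # crude luminance
    hair_pixels = gray < threshold                      # dark pixels ~ remaining hair
    return 100.0 * hair_pixels.mean()                   # 0% (bare skin) .. 100% (thick beard)

# Example with a synthetic frame: a dark patch on light skin.
frame = np.full((100, 100, 3), 200, dtype=np.uint8)
frame[40:60, 40:60] = 30
print(round(remaining_hair_percent(frame), 1))  # 4.0 -> 4% of the region still has hair
```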
  • microcontroller 160 is configured to provide a real-time quantitative comparison, such as a variable pitch sound or a recorded voice from speaker 164, a visual indicator 510, and the like, on shaving system 100.
  • microcontroller 160 is configured to provide an audio signal to instruct speaker 164 (e.g., electrical audio device) to emit a sound corresponding to the quantitative comparison for the remaining hair.
  • the sound is a variable pitched sound or a recorded voice.
  • microcontroller 160 transmits data in real time via wireless communication unit 110 and wireless module 555 on external device 505 that is displayed on display 565 on external device 505.
  • external device 505 is configured to present the quantitative comparison for the remaining hair for display on external device 505 (e.g. display 565).
  • microcontroller 160 is configured to capture a frame via miniature camera 163 to determine the amount of hair remaining over a certain area.
  • microcontroller 160 captures a frame and compares the color difference between adjacent pixels to estimate the total amount of hair remaining over a specific area.
  • Microcontroller 160 stores to first memory 161 the total amount of hair remaining over a certain area as a quantitative comparison for the remaining hair. As depicted in FIG. 7, microcontroller 160 is configured to transmit via wireless communication unit 110 the quantitative comparison for the remaining hair to wireless module 555 on external device 505 (e.g., a wristwatch) that displays the frame of the specific area along with the quantitative comparison for the remaining hair.
  • External device 505 is configured to analyze a frame to determine the general growth direction of the remaining hair. For example, one approach to determine the general direction of hair growth is to filter the frame image using an edge detection filter, which contrasts the edges of hairs on the face as depicted in FIG. 10A - FIG. 10B.
  • microcontroller 160 is configured to capture and transmit the frame to external device 505, and external device 505 implements an edge detection filter to distinguish the hairs.
  • the edge detection filter is a Sobel filter or a Canny filter.
  • external device 505 is configured to determine a general direction of the remaining hair based on the captured frames, and provide for display on external device 505 a directional indicator representative of a general direction of the remaining hair that corresponds to the best direction to drag the at least one blade over the skin. In some embodiments, external device 505 is configured to provide the filtered frame images for display on external device 505. In some embodiments, external device 505 is configured to overlay filtered frame images with the streamed video frame image.
  • various filter techniques may be implemented to distinguish the hair.
  • For each frame image: i) a filter is applied that iteratively increases the contrast of the hair edges with respect to the background; and ii) a "least square analysis" or "regression analysis" is applied to the contrasted remaining hair to calculate a general direction of the remaining hair.
  • external device 505 is configured to implement a "least square analysis” or “regression analysis” to the remaining hair to calculate the general direction of the remaining hair.
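
One concrete way to perform such an analysis is to estimate an orientation at each strong edge pixel from image gradients and average those orientations on a doubled-angle circle; this stands in for the regression step and is not the patent's prescribed algorithm:

```python
import numpy as np

# Hedged sketch of estimating a general hair direction from a filtered frame:
# image gradients give an orientation at each strong edge pixel, and a
# doubled-angle average yields a single dominant direction (hairs have no
# head/tail, so angles are treated modulo 180 degrees).

def general_hair_direction_deg(gray: np.ndarray, edge_quantile: float = 0.9) -> float:
    """gray: HxW float image. Returns dominant hair direction in degrees [0, 180)."""
    gy, gx = np.gradient(gray.astype(np.float64))
    magnitude = np.hypot(gx, gy)
    strong = magnitude >= np.quantile(magnitude, edge_quantile)   # keep strongest edges
    # Edge (hair) direction is perpendicular to the gradient direction.
    theta = np.arctan2(gy[strong], gx[strong]) + np.pi / 2
    # Average on the doubled circle so 10 deg and 190 deg count as the same direction.
    mean_angle = 0.5 * np.arctan2(np.sin(2 * theta).mean(), np.cos(2 * theta).mean())
    return float(np.degrees(mean_angle) % 180.0)

# Synthetic example: horizontal stripes (hairs running left-right) -> ~0 degrees.
y = np.arange(64)
stripes = np.tile(np.sin(y / 2.0)[:, None], (1, 64))
print(round(general_hair_direction_deg(stripes), 1))
```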
  • external device 505 is configured to provide a quantitative value representative of the general direction of hair growth, store the general direction of the remaining hair to memory, and provide it for display on external device 505.
  • This approach provides a directional indicator that corresponds to the best direction in which to drag at least one blade 151 over the skin. Determining the general direction of hair growth also allows the user to orient shaving system 100 according to the best direction to drag blade 151 over the skin.
  • the directional indicator is displayed on the external device as a circular bar graph that is updated and/or filled up in near real time. In some embodiments, the directional indicator is displayed on the external device 505 as a compass-like arrow that updates in near real time.
  • the external device 505 determines the directional indicator based on the received frame images from microcontroller 160 (e.g., via wireless communication unit 110 and wireless module 555).
  • Another approach to determine the general direction of hair growth is to encode the angle value of each pixel as a color in HSV color space, which is representative of the hair directions.
  • external device 505 or microcontroller 160 is configured to filter the frame images using a median filter to reduce high-frequency noise prior to applying an edge detection filter.
  • external device 505 or microcontroller 160 is configured to apply a Canny edge detection filter to frame images to detect edges. Often, the resultant filtered image has thick line edges.
  • microcontroller 160 or external device 505 is configured to apply a line-thinning filter to reduce line thicknesses on frame images. Once the line thicknesses are reduced, microcontroller 160 or external device 505 is configured to encode the angle value of each pixel as a color in HSV color space. The angle value is representative of the line directions (e.g., hair).
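
A sketch of that angle-to-color mapping, where the local gradient angle at each edge pixel is folded into a half-turn and used as the hue in HSV space; the exact HSV parameters are assumptions, not values given in the text:

```python
import colorsys
import numpy as np

# Hedged sketch of the HSV idea above: after edge detection, the local gradient
# angle at each edge pixel (perpendicular to the hair edge) is mapped to a hue,
# so hairs growing the same way share a color. The mapping (angle -> hue, full
# saturation/value on edge pixels) is an illustrative choice.

def angle_to_rgb(gray: np.ndarray, edge_threshold: float = 1.0) -> np.ndarray:
    """Return an HxWx3 float RGB image whose hue encodes the local edge angle."""
    gy, gx = np.gradient(gray.astype(np.float64))
    angle = (np.arctan2(gy, gx) % np.pi) / np.pi        # gradient angle folded into [0, 1)
    magnitude = np.hypot(gx, gy)
    out = np.zeros(gray.shape + (3,))
    for (r, c) in zip(*np.nonzero(magnitude >= edge_threshold)):
        # hue = normalized angle; saturation and value fixed at 1 on edge pixels
        out[r, c] = colorsys.hsv_to_rgb(angle[r, c], 1.0, 1.0)
    return out

gray = np.zeros((32, 32))
gray[:, 16:] = 10.0                      # a single vertical edge
rgb = angle_to_rgb(gray)
print(rgb.shape, rgb[16, 16])            # edge pixels are colored according to their angle
```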
  • Shaving system 100 can also assist in shaping regions of established hair.
  • frame images may include established hair growth regions such as a sideburn, muttonchops, mustache, goatee, and the like, where the image shows longer hair growth adjacent to short hair growth.
  • microcontroller 160 is configured to provide instructions to external device 505 to determine a boundary indicator associated with established hair growth based on the filtered frame images and provide the boundary indicator for display on external device 505.
  • external device 505 may overlay the boundary indicator with a frame.
  • external device 505 is configured to overlay the boundary indicator with streamed video frame images.
  • the streamed video would show the boundary indicator at the boundary between established hair growth region and stubble region to be shaved.
  • the boundary indicator is a line (e.g., a curved line or a straight line) that overlays a streamed video or frame. As such, the boundary indicator assists the user to balance the symmetry of unshaven regions as well as facilitate shaving near the contour of a beard or mustache.
  • external device 505 is configured to adjust the boundary indicator according to predefined features selected by a user. For example, a user may adjust a goatee style and select within external device 505 to overlay the goatee style with streamed video as a guide for regions to shave. In some instances, the boundary that represents sideburns is extended to incorporate a larger short hair region when the user desires muttonchops. In these instances, microcontroller 160 or external device 505 is configured to extend or reduce the boundary indicator and display an alternate quantitative boundary indicator on external device 505 representative of the predefined feature.
  • In some embodiments, external device 505 is configured to overlay the boundary indicator with streamed video frame images.
  • the boundary indicator is displayed as a line.
  • the alternate quantitative boundary indicator overlays a streamed video or frame to guide the user in trimming and forming a desired look.
  • One technique for detecting blade attrition includes capturing a first image (e.g., frame) of a region of skin with hair using camera 163.
  • camera 163 is disposed below handle 140 and configured to view the region before blade 151 is dragged across the skin (i.e., prior to shaving), as depicted in FIG. 7. This configuration facilitates determining a quantitative comparison for the number of hairs in the region of skin.
  • camera 163 is disposed above handle 140 and configured to view the region after blade 151 is dragged across the skin (i.e., after shaving), as depicted in FIG. 1G. This configuration facilitates determining a quantitative comparison of the number of hairs in the region of skin before and after a shaving stroke.
  • Some embodiments include first camera 163 disposed below handle 140 and second camera 163 above handle 140. This configuration facilitates capturing a first image (e.g. frame) of a region of skin with hair in front of blade 151 and capturing a second image (e.g. frame) of a region of skin with hair behind blade 151.
  • one or more processors use the captured first and second images in determining a first and second quantitative comparison and providing an attrition comparison based on the difference between the second quantitative comparison and the first quantitative comparison to an electrical device.
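
Using the example hair counts reported later for FIG. 10A and FIG. 10B, the attrition comparison can be expressed as the fraction of hairs remaining after a stroke; the specific ratio used here is an illustrative choice rather than the patent's stated formula:

```python
# Hedged sketch of the before/after comparison described above: a hair-count
# metric is computed for the frame captured ahead of blade 151 and for the frame
# captured behind it, and the attrition comparison is the fraction of hairs the
# stroke failed to remove (a dull blade leaves relatively more hair behind).

def attrition_comparison(hairs_before: int, hairs_after: int) -> float:
    """Return the fraction of hairs remaining after the stroke, 0.0 .. 1.0."""
    if hairs_before == 0:
        return 0.0
    return hairs_after / hairs_before

# Example using the counts reported for FIG. 10A and FIG. 10B:
print(round(attrition_comparison(3267, 2231), 2))   # 0.68 -> ~68% of hairs remain
```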
  • microcontroller 160 may be configured to provide raw sensory data to external device 505. As such, microcontroller 160 is configured to transmit raw sensory data via wireless communication unit 110 to wireless module 555 on external device 505. It should be appreciated that many of the computations performed by microcontroller 160 may be performed on external device 505 and transmitted and/or stored to first memory 161 on shaving system 100. This beneficially conserves power on shaving system 100 and in some instances may reduce the total processing time. Likewise, the quantitative comparison and other parameters may be displayed on external device 505.
  • a mountable electrical device includes a fixture configured to fasten to a precision hand tool, microcontroller 160 attached to the fixture, and wireless communication unit 110 attached to the fixture and electrically connected to microcontroller 160, wherein wireless communication unit 110 is configured to transmit and receive data from microcontroller 160 to external device 505, first memory 161 electrically connected to microcontroller 160, wherein memory 161 is configured to store data from microcontroller 160, and one or more sensors attached to the precision hand tool, wherein the one or more sensors are configured to provide sensory data to microcontroller 160.
  • one of the one or more sensors is a proximity sensor.
  • one of the one or more sensors is image camera 163 configured to provide frames of images to microcontroller 160.
  • the independent mountable electrical device described above may be attached to high-precision hand tools to provide and/or improve real-time information that facilitates specific procedures.
  • the mountable electrical device may be small, lightweight, and wireless to provide untethered freedom of motion for many applications.
  • Various applications that would benefit from a mountable device are electrical tools, automotive tools, carpentry tools, surgical tools, and the like. It should be recognized that the above mountable electrical device may be incorporated in any tool that would benefit from real time information to facilitate specific procedures.
  • FIG. 10A illustrates unfiltered and filtered images of an unshaven area of skin on a face.
  • microcontroller 160 captured the image frame via camera 163 and transmitted (e.g., streamed) the image frame via wireless communication unit 110 to wireless module 555 on external device 505.
  • Wireless module 555 forwards the image frame to image analysis module 560 on external device 505 for further processing.
  • Image analysis module 560 uses processors 575 on external device 505 to filter and analyze the image frame to determine a first quantitative comparison for a hair characteristic. For example, as depicted in FIG. 10A, analysis module 560 calculates three hair characteristics that may be used as a first quantitative comparison for a hair characteristic, specifically, the hair count (e.g., Hair count: 3267), the average length of the hair (e.g., Avg. length: 32.3), and average density (e.g., Avg. intensity: 7.91%).
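
These three characteristics can be approximated from a binarized edge image with connected-component labeling, as sketched below; the thresholding and the use of blob size as a length proxy are assumptions rather than the module's actual algorithm:

```python
import numpy as np
from scipy import ndimage

# Hedged sketch of the three hair characteristics reported above (hair count,
# average length, average intensity/density), computed from an edge-filtered,
# binarized frame. Connected-component labeling stands in for whatever grouping
# the analysis module uses; the binarization threshold is an assumption.

def hair_metrics(edge_image: np.ndarray, threshold: float = 0.5):
    """edge_image: HxW array from an edge filter. Returns (count, avg_length, density_%)."""
    binary = edge_image > threshold
    labels, count = ndimage.label(binary)                  # each connected blob ~ one hair
    sizes = ndimage.sum(binary, labels, range(1, count + 1))
    avg_length = float(np.mean(sizes)) if count else 0.0   # pixels per hair as a length proxy
    density_percent = 100.0 * binary.mean()                # fraction of 'hair' pixels
    return count, avg_length, density_percent

# Synthetic frame with two short "hairs".
frame = np.zeros((50, 50))
frame[10, 5:20] = 1.0
frame[30, 10:30] = 1.0
print(hair_metrics(frame))   # (2, 17.5, ~1.4)
```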
  • display 565 on external device 505 is a touch screen that includes selectable software buttons or switches that facilitate selecting between the original frame image (e.g., original), edge-filtered frame image (e.g., mono), edge-filtered image with the color inverted (e.g., color), and an overlay of the original and edge-filtered image with the color inverted (e.g., overlay).
  • display 565 on external device 505 includes selectable software buttons to select between streamed video (e.g., wireless communication unit 110 to wireless module 555) from camera 163 (e.g., camera), frame images before blade 151 is dragged across the skin (e.g., Before), and frame images after blade 151 is dragged across the skin (e.g., After).
  • FIG. 10B illustrates unfiltered and filtered images of an area of skin on a face where "dull" blade 151 is dragged once across the surface of the skin.
  • microcontroller 160 captured the image frame via camera 163 and transmitted (e.g., streamed) the image frame via wireless communication unit 110 to wireless module 555 on external device 505.
  • Wireless module 555 forwards the image frame to image analysis module 560 on external device 505 for further processing.
  • Image analysis module 560 uses processors 575 on external device 505 to filter and analyze the image frame to determine a second quantitative comparison for a hair characteristic.
  • analysis module 560 calculates three hair characteristics that may be used as a second quantitative comparison for a hair characteristic, specifically, the hair count (e.g., Hair count: 2231), the average length of the hair (e.g., Avg. length: 27.4), and average density (e.g., Avg. intensity: 4.59%).
  • FIG. 11 is a flow diagram illustrating method 1100 for gauging blade attrition (e.g., determining the dullness of blade 151).
  • method 1100 may be performed at microcontroller 160 as part of shaving system 100.
  • method 1100 may be performed at external device 505 to conserve power and save on resources on shaving system 100. Some operations in method 1100 may be combined, the order of some operations may be changed, and some operations may be omitted.
  • method 1100 may filter, using one or more processors (e.g., processor cores 169, processors 575), a first image of a region of skin with hair.
  • microcontroller 160 may be configured to execute one or more modules or components to filter, using one or more processors (e.g., processor cores 169, processors 575), the first image of a region of skin with hair that was captured using camera 163.
  • filtering the first image of a region of skin with hair uses an edge detection filter.
  • the edge detection filter is a Sobel filter or a Canny filter.
  • method 1100 may determine, using one or more processors (e.g., processor cores 169, processors 575), a first quantitative comparison for a hair characteristic in a region of skin based on the first filtered image.
  • microcontroller 160 may be configured to execute one or more modules or components to determine, using one or more processors (e.g., processor cores 169, processors 575), a first quantitative comparison for a hair characteristic in a region of skin based on the first filtered image.
  • the hair characteristic is the quantity of hair.
  • the hair characteristic is the density of hair.
  • the hair characteristic is the average length of hair.
  • method 1100 may include shaving the region of skin with blade 151 before a second image is captured.
  • method 1100 may filter, using one or more processors (e.g., processor cores 169, processors 575), a second image of a region of skin with hair.
  • microcontroller 160 may be configured to execute one or more modules or components to filter, using one or more processors (e.g., processor cores 169, processors 575), the second image of the region of skin with hair that was captured using camera 163.
  • filtering the second image of a region of skin with hair includes using an edge-detection filter.
  • the edge-detection filter is a Sobel filter or a Canny filter.
  • method 1100 may determine, using one or more processors, a second quantitative comparison for the hair characteristic in the region of skin based on the second filtered image.
  • microcontroller 160 may be configured to execute one or more modules or components to determine, using one or more processors (e.g., processor cores 169, processors 575), a second quantitative comparison for the hair characteristic in the region of skin based on the second filtered image.
  • determining the first or second quantitative comparison for the hair characteristic in the region of skin includes differentiating a color variation between adjacent pixels in the captured image.
  • method 1100 may include sending an audio signal to an electrical audio unit configured to emit sound.
  • the electrical audio unit emits a sound associated with either the blade attrition comparison or the first or second quantitative comparison for the hair characteristic in the region of skin.
  • determining the first or second quantitative comparison for the hair characteristic in the region of skin further includes determining a quantitative boundary indicator that distinguishes a boundary between an established hair growth region and a stubble region to be shaved based on the first or second filtered image.
  • method 1100 may determine a quantitative boundary indicator that distinguishes a boundary between an established hair growth region and a stubble region to be shaved based on the first or second filtered image.
  • microcontroller 160 or processors 575 may be configured to execute one or more modules or components to determine a quantitative boundary indicator that distinguishes a boundary between an established hair growth region and a stubble region to be shaved based on the first or second filtered image.
  • method 1100 may determine a general direction of the remaining hair based on the first or second filtered image.
  • processors 575 may be configured to execute one or more modules or components to determine a general direction of the remaining hair based on the first or second filtered image.
  • determining the general direction of the remaining hair includes a "least squares analysis" or "regression analysis" (illustrated in the sketch following this list).
  • method 1100 may provide for display, a general direction of the remaining hair, wherein the general direction is associated with the best direction to drag the blade over the region of skin.
  • microcontroller 160 or processors 575 may be configured to execute one or more modules or components to provide for display, a general direction of the remaining hair, wherein the general direction is associated with the best direction to drag the blade over the region of skin.
  • method 1100 may provide for display, a blade attrition comparison based on the difference between the second quantitative comparison and the first quantitative comparison.
  • microcontroller 160 may be configured to execute one or more modules or components to provide for display, a blade attrition comparison based on the difference between the second quantitative comparison and the first quantitative comparison.
  • the blade attrition indicator may comprise a life remaining indicator and/or a dullness indicator for at least one blade.
  • FIG. 13A and FIG. 13B illustrate a front view and an exploded view, respectively, of a blade cartridge 150 with blades that are slightly curved.
  • the views illustrate blade cartridge 150 that includes a fixture configured to fasten to a razor (e.g., safety razor, disposable razor, cartridge razor) and at least one blade 151 connected to the fixture, wherein the at least one blade 151 is curved.
  • blades 151 of a blade cartridge 150 are slightly curved in order to reduce the cutting resistance of the hair during impact with the blade.
  • curved (e.g., sickle-like) blades 151 reduce impact resistance along the direction of the motion of blade 151, which results in a more efficient cut that is smoother to the skin.
  • blade 151 is slightly curved (e.g., sickle-like), lying on a two-dimensional plane at an approximately normal angle to the surface of the skin. In this manner, the plane of the curve may follow the tangent of the skin.
  • the at least one blade 151 is slightly curved in a sickle-like fashion.
  • the back edge of blade 151 is convex and the sharp front edge is concave.
  • blade 151 curves inward along a cutting edge.
  • an enclosed arrangement of two or more blades 151 adjacent to each other can be applied to distribute the applied force among blades 151 as each blade 151 contacts the skin.
  • at least one blade 151 includes a plurality of blades 151, wherein each of the plurality of blades 151 is parallel to each adjacent blade 151.
  • One advantage of this configuration is that it can help to prevent wrongful cutting of the skin when a sideways motion of blade 151 is applied.
  • Curved blade 151 may include steel, ceramics (e.g., zirconia, alumina), or nanolattice.
  • curved blade 151 is made of carbon steel (e.g., austenitic, martensitic, stainless steel).
  • curved blade 151 is made of ceramics.
  • a ceramic blade 151 may be made through a dry-pressing and sintering process, with the edge subsequently sharpened by a diamond grinder.
  • the ceramic powder is placed on a rotating drum to first create a full ring, taking into account the inner diameter (ID) and outer diameter (OD), and sub-sections the width of blade 151 are then cut off prior to cooling.
  • One advantage of ceramic over steel for blades 151 is that ceramics are harder than carbon steel, which results in an edge more resilient to dulling.
  • a nanolattice is a truss structure with connecting truss members implemented at a nanoscale. These structures can be made on a length scale spanning multiple orders of magnitude, for instance, from tens of nanometers to hundreds of microns.
  • the nano-sized connecting truss members, in some examples with tube walls of less than 100 nanometers, exhibit properties different from those of their denser counterparts.
  • certain ceramics exhibit a higher hardness than metals but are brittle and tend to chip or fracture under certain loads.
  • nanolattices with nano-sized structures and comprising single crystal materials, such as ceramics (e.g., materials having approximately 20 to 60 nanometer wall thickness) do not exhibit elastic instability and have been shown to fully recover at approximately 20
  • Nanolattices maintain high strength yet have been found to be remarkably resilient and less brittle.
  • the advantage of forming blade 151 using a nanolattice is that leading edge 1505 of blade 151 would be much less susceptible to dulling.
  • a micro-scaffold structure may be formed (e.g., fabricated) through a process of two-photon lithography (e.g., microscopic 3D printing) to create the truss structure based on a polymer model.
  • this technique includes two laser beams that crosslink and harden a polymer at the point of focus in 3D space. That is, the parts of the polymer exposed to the lasers remain intact while the material that is not exposed dissolves away.
  • this technique includes atomic layer deposition (ALD) or sputtering to deposit material (e.g., carbon steel, ceramic) on the truss structure.
  • This technique coats the connecting truss members with a deposited material (e.g., carbon steel, ceramic).
  • ALD is based on one or more sequential exposures to a gas that chemically reacts with the surface of the target material (e.g., carbon steel, ceramic) to slowly form a thin film.
  • the resultant film coats the polymer and forms a rigid shell.
  • one end of the truss structure is cut to expose the internal polymer.
  • the exposed polymer truss is removed using an oxygen (e.g., O2) plasma etch.
  • the remaining structure is a nanolattice with hollow connecting truss members. That is, the nanolattices use less material than their dense counterparts.
  • one advantage of nanolattices is that the reduction of material reduces the weight of blade 151 without compromising the strength.
  • the nanolattice reduces brittleness (e.g., alumina, ceramics).
  • FIG. 14A - FIG. 15B illustrate blade 151 (e.g., nanoblade) that includes front leading edge 1505 of blade 151, spine 1530 of the blade 151, and a nanolattice that connects the front leading edge of blade 151 to spine 1530 of blade 151.
  • one or more connecting truss members of the nanolattice has a curvilinear geometric shape.
  • a shape of one or more connecting truss members may be a cylinder, an elliptical tube, or a closed-profile elongated tube.
  • one or more of the tubes of the nanolattice is hollow. Other embodiments may include rectangular tubes, I-beams, C-beams, and the like.
  • these tubes are conical cylinders or tapered cylinders. As illustrated in FIG. 14A - FIG. 15B, the tubes connecting to leading edge 1505 are tapered to conform to leading edge 1505. In this instance, connecting truss members taper along the edge of blade 151 and diverge (e.g., spread out) from leading edge 1505 of blade 151 toward spine 1530 (e.g., back) of blade 151. In some embodiments, one or more connecting truss members of the nanolattice is a tube that tapers towards leading edge 1505. In some embodiments, one or more connecting truss members tapers toward the leading edge 1505 and forms ribs along the leading edge for reinforcement and reduced friction.
  • FIG. 14A - FIG. 14B depict various views of blade 151 with a nanolattice that forms an octet-truss structure.
  • An octet-truss is a lightweight structure that distributes dominant forces among diagonal connecting truss members 1520. This means that portions of dominant compressive forces are converted to tensile forces across the octet-truss. This makes the structure much less susceptible to failure because it can rebalance compressive loads into tensile loads.
  • blade 151 is made of a metal.
  • blade 151 is made of a ceramic.
  • ceramics have been found to be remarkably less brittle and much stronger in tension. This means that hard ceramics such as alumina (e.g., corundum, sapphire) and zirconia may be manufactured into blade 151 with a nanolattice (e.g., nanoblade) having an octet- truss that resists the impacts of cutting hair longer without fracturing or chipping when dropped.
  • the ceramic is zirconia or alumina.
  • the octet-truss structure includes cross members that extend from spine 1530 to the leading edge and cross members that are parallel to the leading edge.
  • the cross members are of a tetrahedral shape and are added to the octahedral shape of the octet-truss. As a result, this structure includes more material per unit volume than the octet-truss of FIG. 14A - 14B, which makes it denser and heavier.
  • the truss is more rigid because a smaller portion of the compressive force is rebalanced into tensile forces.
  • This reinforcement may provide better support to the nanolattice structure, further allowing hard ceramics such as alumina (e.g., corundum, sapphire) and zirconia to be manufactured into blade 151 with a nanolattice (e.g., nano-blade) having the structure depicted in FIG. 15A - 15B, which is less susceptible to fracturing or chipping when compared to the nanolattice structure depicted in FIG. 14A - 14B.
  • Embodiments may be combined and aspects described in connection with an embodiment may stand alone.
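The image-based attrition analysis outlined above (edge filtering, hair count, average length, average intensity, direction estimation, and the before/after comparison of method 1100) can be summarized in code. The following is a minimal illustrative sketch and is not part of the patent disclosure: the function names, the 0.2 edge threshold, and the use of SciPy's Sobel filter and connected-component labeling are assumptions chosen for illustration; a Canny filter or other segmentation could be substituted.

```python
# Illustrative sketch only; thresholds and names are assumptions, not taken from the patent.
import numpy as np
from scipy import ndimage


def hair_metrics(gray_image, edge_threshold=0.2):
    """Return (hair_count, avg_length, avg_intensity) for a grayscale skin image.

    Edges are found with a Sobel filter (a Canny filter could be used instead); each
    connected group of edge pixels is treated as one hair, mirroring the Hair count /
    Avg. length / Avg. intensity indicators shown on external device 505.
    """
    img = gray_image.astype(float) / max(float(gray_image.max()), 1.0)
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    edges = np.hypot(gx, gy) > edge_threshold
    labels, hair_count = ndimage.label(edges)
    if hair_count == 0:
        return 0, 0.0, 0.0
    sizes = np.bincount(labels.ravel())[1:]          # pixels per connected component
    avg_length = float(sizes.mean())                 # proxy for average hair length, in pixels
    avg_intensity = 100.0 * float(edges.mean())      # percent of edge pixels ("Avg. intensity")
    return int(hair_count), avg_length, avg_intensity


def estimate_hair_direction(edges):
    """Least-squares (regression) estimate of the general hair direction, in degrees."""
    ys, xs = np.nonzero(edges)
    if xs.size < 2:
        return 0.0
    slope, _ = np.polyfit(xs, ys, 1)                 # simple linear regression
    return float(np.degrees(np.arctan(slope)))


def blade_attrition(before_image, after_image):
    """Fraction of hairs removed by a stroke; values near 0.0 suggest a dull blade 151."""
    count_before, _, _ = hair_metrics(before_image)
    count_after, _, _ = hair_metrics(after_image)
    if count_before == 0:
        return 0.0
    return 1.0 - count_after / count_before
```

In this sketch, a blade_attrition value near zero after a stroke indicates that the stroke removed little hair, which is one possible indicator of a dull blade 151.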

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Forests & Forestry (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Cosmetics (AREA)
  • Dry Shavers And Clippers (AREA)

Abstract

Methods and apparatuses for an intelligent shaving system are disclosed herein. An example intelligent shaving system includes a handle, at least one blade connected to the handle, a microcontroller attached to the handle, a wireless communication unit configured to send and receive data from the microcontroller to an external device, a memory configured to store data applicable to the at least one blade, and one or more sensors configured to send sensory data from the one or more sensors to the microcontroller. One of the one or more sensors is a proximity sensor or a camera having an image sensor configured to capture video and/or still images. The shaving system assists in determining blade attrition and provides indicators to assist in shaving techniques. The shaving system further may include at least one blade slightly curved to follow a tangent of the skin. The at least one blade may have a nanolattice structure.

Description

INTELLIGENT SHAVING SYSTEM HAVING SENSORS
CLAIM OF PRIORITY
[0001] The present application claims priority to Provisional Application No. 62/090,335, entitled "INTELLIGENT SHAVING SYSTEM HAVING SENSORS," filed
December 10, 2014, which is hereby incorporated by reference in its entirety.
BACKGROUND
1. Field
[0002] The present disclosure generally relates to the field of Internet of Things (IoT) and wirelessly connected intelligent devices and high precision hand tools, and, in particular, a shaving system to improve the shaving experience and quality of shave by providing the user with key information related to the blade and shaving in near real-time.
2. Description of Related Art
[0003] Proper shaving techniques facilitate a close and comfortable shave that avoids razor burn, razor bumps, and irritation. One approach to assist in shaving is to determine the correct positioning of a razor while shaving. This is often challenging because many users are not able to clearly see the shaving region and must rely only on "feel" to determine the shave quality. In turn, this often leads to over-shaving, shaving "against the grain," or missed spots with patchy results. Likewise, these improper shaving techniques can lead to premature blade dulling and increased cost. Few razors have been developed to assist in proper shaving techniques. To date, the focus has been on razor designs that minimize the impact of poor shaving techniques.
BRIEF SUMMARY
[0004] The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later. [0005] In some embodiments, a shaving system includes a handle; at least one blade connected to the handle; a microcontroller attached to the handle; and one or more sensors adjacent the at least one blade. The one or more sensors are configured to transmit sensory data to the microcontroller, and one of the one or more sensors is a proximity sensor.
[0006] In some embodiments, a shaving system includes a handle; at least one blade connected to the handle; a microcontroller attached to the handle; and one or more sensors adjacent the at least one blade. The one or more sensors are configured to send sensory data to the microcontroller, and one of the one or more sensors is a camera having an image sensor configured to capture video and/or still images.
[0007] In some embodiments, a razor cartridge includes a fixture configured to fasten to a razor; and at least one blade connected to the fixture. The at least one blade is curved.
[0008] In some embodiments, a blade includes a front leading edge of the blade; a spine of the blade; and a nanolattice that connects the front leading edge to the spine.
[0009] In some embodiments, a mountable electrical device includes a fixture configured to fasten to a precision hand tool; a microcontroller attached to the fixture; and a wireless communication unit attached to the fixture and electrically connected to the microcontroller. The wireless communication unit is configured to send and receive data from the
microcontroller to an external device. The mountable electrical device further includes a memory electrically connected to the microcontroller. The memory is configured to store data from the microcontroller. The mountable electrical device further includes one or more sensors attached to the precision hand tool. The one or more sensors are configured to provide sensory data to the microcontroller.
[0010] In some embodiments, a method for determining blade attrition includes filtering, using an image device, a first image of a region of skin with hair; determining, using one or more processors, a first quantitative comparison for a hair characteristic in a region of skin based on the first filtered image; after the region of skin has been shaved, filtering, using one or more processors, a second image of the region of skin; determining, using one or more processors, a second quantitative comparison for the hair characteristic in the region of skin based on the second filtered image; and providing for display, a blade attrition comparison based on the difference between the second quantitative comparison and the first quantitative comparison. [0011] The foregoing has outlined rather broadly the features and technical advantages of examples according to the disclosure in order that the detailed description that follows may be better understood. Additional features and advantages will be described hereinafter. The conception and specific examples disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Such equivalent constructions do not depart from the scope of the appended claims. Characteristics of the concepts disclosed herein, both their organization and method of operation, together with associated advantages will be better understood from the following description when considered in connection with the accompanying figures. Each of the figures is provided for the purpose of illustration and description and not as a definition of the limits of the claims.
DESCRIPTION OF THE FIGURES
[0012] For a better understanding of the various described embodiments, reference should be made to the description below, in conjunction with the following figures in which like reference numerals refer to corresponding parts throughout the figures.
[0013] FIG. 1A illustrates a back ISO view of shaving system with a force sensor and an image camera according to an embodiment of the present invention.
[0014] FIG. 1B illustrates a front ISO view of shaving system with a force sensor and an image camera according to an embodiment of the present invention.
[0015] FIG. 1C illustrates a side view of a shaving system with a force sensor and an image camera according to an embodiment of the present invention.
[0016] FIG. 1D illustrates a bottom view of a shaving system with a force sensor and an image camera according to an embodiment of the present invention.
[0017] FIG. 1E illustrates a top view of shaving system with a force sensor and an image camera according to an embodiment of the present invention.
[0018] FIG. 1F illustrates an ISO view of a force sensor according to an embodiment of the present invention.
[0019] FIG. 1G illustrates a front view of shaving system with a force sensor and an image camera according to an embodiment of the present invention. [0020] FIG. 1H illustrates a back view of shaving system with a force sensor and an image camera 163 according to an embodiment of the present invention.
[0021] FIGS. 2A-2C illustrate mechanical positions of a shaving system with a force sensor at zero insertion force, mid-range insertion force, and maximum insertion force applied normal to the skin, respectively, according to an embodiment of the present invention.
[0022] FIGS. 3A-3C illustrate positions of a shaving system with a force sensor at zero insertion force, mid-range insertion force, and maximum insertion force applied tangent to the skin, respectively, according to an embodiment of the present invention.
[0023] FIG. 4 illustrates the motion of a lever assembly from an initial position to a second position according to an embodiment of the present invention.
[0024] FIG. 5 illustrates connectivity between shaving system and external devices according to an embodiment of the present invention.
[0025] FIG. 6 illustrates a shaving system with an image camera that streams video via a wireless communication unit to an external wristwatch according to an embodiment of the present invention.
[0026] FIG. 7 illustrates a shaving system with an image camera that streams video via a wireless communication unit to a hand-held mobile phone or tablet according to an embodiment of the present invention.
[0027] FIG. 8 illustrates a shaving system with image camera that streams video via a wireless communication unit to an external wristwatch according to an embodiment of the present invention.
[0028] FIG. 9 illustrates images of hair at monotonically increasing contrast according to an embodiment of the present invention.
[0029] FIG. 10A illustrates various images of an unshaven area of skin according to an embodiment of the present invention
[0030] FIG. 10B illustrates various images of a shaved area of skin according to an embodiment of the present invention. [0031] FIG. 11 is a flow diagram for gauging blade attrition according to an embodiment of the present invention.
[0032] FIG. 12 illustrates electronic components and modules of shaving system in relation to an external device and a cloud server according to an embodiment of the present invention.
[0033] FIGS. 13A and 13B illustrate a front view and an exploded view, respectively, of a blade cartridge 150 with blades that are slightly curved according to an embodiment of the present invention.
[0034] FIGS. 14A and 14B illustrate an ISO view and a plan view of a nanolattice blade with an octet-truss structure according to an embodiment of the present invention.
[0035] FIGS. 15A and 15B illustrate an ISO view and a plan view of a nanolattice blade with a rigidly reinforced truss structure according to an embodiment of the present invention.
DETAILED DESCRIPTION
[0036] The following description is presented to enable a person of ordinary skill in the art to make and use the various embodiments. Descriptions of specific devices, techniques, and applications are provided only as examples. Various modifications to the examples described herein will be readily apparent to those of ordinary skill in the art, and the general principles defined herein may be applied to other examples and applications without departing from the spirit and scope of the various embodiments. Thus, the various embodiments are not intended to be limited to the examples described herein and shown but are to be accorded the scope consistent with the claims.
[0037] As used herein, proximity sensor refers to a sensor that may be configured to detect how close blade 151 is to the skin. Proximity sensors may include physical contact sensors that are configured to detect the force applied between blade 151 and the skin as well as sensors that do not have a physical contact between blade 151 and the skin. Proximity sensors include, but are not limited to, IR sensors, ultrasonic rangefinders, and accelerometers.
[0038] Various embodiments are described below, relating to intelligent shaving system 100 that communicates (e.g., wirelessly communicates) with external device 505. FIG. 12 illustrates electronic components and modules of shaving system 100 in relation to external device 505 and cloud server 545 in accordance with some embodiments of the present disclosure. It should be understood that although shaving system 100, external device 505, and cloud server 545 are shown, the embodiments described herein with respect to FIG. 12 are not limited to shaving system 100, external device 505, or cloud server 545.
[0039] As depicted in FIG. 12, the components included in shaving system 100 are encased within handle body 520, and as depicted in FIG. 5 - FIG. 7, handle body 520 has an ergonomic shape that conforms according to a user's grip. In some embodiments, one or more
components of the shaving system 100 are incorporated within handle body 520 and one or more components are configured to conform to handle body 520. For instance, speaker 164, microphone 165, and/or indicator display 510 (FIG. 5) are located externally on handle 140 of shaving system 100. In some examples, handle body 520 is configured to conform around USB connector 111
(FIG. 1A - FIG. 1H) to facilitate access for a mateable connector which provides power to charge battery 112 (FIG. 12) and/or access to media files (e.g., frame images, video) stored in first memory 161. It should be appreciated that shaving system 100 depicted in FIG. 1A - FIG. 4 may be adapted to conform to any known ergonomic form. In particular, the height of force sensor 120 (e.g., force cell, load cell) and lever assembly 130 may be reduced to accommodate a lower profile. In some examples, force sensor 120 may be implemented using a compression sensor.
[0040] As illustrated in FIG. 12, shaving system 100 includes within handle body 520 a microcontroller 160, which is an integrated circuit that embeds a processor core 169, cache memory 168, and programmable input/output peripherals 167. Microcontroller 160 may include additional embedded components to facilitate aspects of intelligent shaving system 100, such as portions of wireless communication unit 110, an audio/video (AV) wireless module 117, a video transmitter/broadcaster, a video encoder/decoder (e.g., video compressor), an audio encoder/decoder (e.g., audio compressor), an encryption unit, a timer, and the like.
[0041] In general, microcontroller 160 is configured to electrically interface with sensors, specifically, camera sensor 163, force sensor 120, and microphone 165. Microcontroller 160 is also configured to facilitate interaction with a user by providing audio and/or visual feedback to the user during a shave session. In particular, shaving system 100 includes on handle body 520, speaker 164 and indicator display 510. In some embodiments, shaving system 100 includes on handle body 520, user interaction switches 515 (e.g., power switch, selection switch) to select various features on shaving system 100.
[0042] Shaving system 100 includes first memory 161 electrically connected to microcontroller 160. In some embodiments, the first memory 161 is configured to store data associated with at least one blade 151. In particular, first memory 161 is configured to store data and/or information to facilitate the interaction between microcontroller 160 and electrically connected sensors (e.g., camera sensor 163, force sensor 120). In some embodiments, first memory 161 is non-volatile memory and/or is configured to buffer sensory data between one or more sensors and wireless communication unit 110.
[0043] Shaving system 100 includes wireless communication unit 110 that is configured to communicate with external devices 505. Wireless communication unit 110 includes WiFi module 119 and Bluetooth module 118. In some embodiments, wireless communication unit 110 includes an audio / video wireless module 117 that is configured to facilitate transmitting audio / video data between shaving system 100 and one or more external devices. In some instances, wireless communication unit 110 interfaces with cloud server 545 via a router or an internet gateway.
[0044] As illustrated in FIG. 12, external device 505 includes wireless modules 555 to interface to wireless communication unit 110 of shaving system 100. Wireless module 555 includes WiFi modules 556 and Bluetooth module 557. It will be appreciated that external device 505 and shaving system 100 are not limited to WiFi protocols or Bluetooth protocols and may operate in accordance with one or more other wireless protocols.
[0045] To conserve resources, microcontroller 160 may offload sensory data to external device 505. Accordingly, in some examples, microcontroller 160 is configured to transmit sensory data via wireless communication unit 110 to wireless module 555 on external device 505. As such, external device 505 includes sensor analysis module 550 and image analysis module 560 to determine one or more quantitative results. External device 505 includes one or more processors 575 as well as secondary memory 570 that may be volatile or non-volatile. In some embodiments, external device 505 may display on display 565 streamed image frames and/or quantitative indicators. In some instances, display 565 is a touch screen configured to interface with a user with selectable software buttons or switches.
1. Shaving system 100 with proximity sensor
[0046] Shaving system 100 includes a cartridge-razor body style with blade cartridge 150 and handle 140, that is equipped with one or more sensors configured to capture sensory data (e.g., force, proximity or contact, image, friction, temperature, motion) and send the sensory data to one or more onboard microcontrollers 160. In general, microcontroller 160 is configured to receive, process, and/or store the sensory data (e.g., force, proximity or contact, image, friction, temperature, motion) to first memory 161. In some instances, the
microcontroller 160 is configured to transmit sensory data (e.g., force, proximity or contact, image, friction, temperature, motion) or processed data (e.g., video stream, sensory data) to external device 505 associated with a user.
[0047] Proximity sensors, as described herein, may be configured to detect the nearness of a target from the sensor. As used herein, proximity sensors include not only sensors used to detect how close a blade 151 is to the skin, but also sensors such as physical contact sensors configured to detect the force applied between blade 151 and the skin and sensors that do not require physical contact between blade 151 and skin, such as accelerometers.
[0048] As depicted in FIG. 5, shaving system 100 may include wireless communication unit 110 configured to interface with external device 505 to provide useful shaving information and improve the shaving experience. In some embodiments, an external device 505 is a wearable computing device (e.g., watch 530). In some embodiments, an external device 505 is a hand-held phone 525, tablet 535, laptop, or desktop 540. In general, wireless communication unit 110 is configured to consume low power and is configured for full-duplex operation for transmitting (TX) and receiving (RX) simultaneously.
[0049] Communication unit 110 includes both Bluetooth and WiFi protocols and either may be configured to stream video data from camera 163 and/or audio data from microphone 165. For WiFi 119, wireless communication unit 110 is configured to use IEEE 802.11 protocols for implementing wireless local area network (WLAN) computer communication in the 2.4, 3.6, 5, and 60 GHz frequencies. For Bluetooth 118, wireless communication unit 110 is configured in accordance with IEEE 802.15 protocols. In some instances, external device 505 includes a
built-in WiFi module 556 or Bluetooth module 557 (FIG. 12) that connects to wireless module 555 to facilitate the wireless interface. It should be understood that, although wireless communication unit 110 and wireless module 555 (FIG. 12) include WiFi and Bluetooth protocols (e.g., IEEE 802.15), the embodiments described herein with respect to the figures are not limited to wireless communication unit 110 and/or any protocols or frequencies described herein.
[0050] Aspects of wireless communication unit 110 may be separated across multiple locations and/or multiple printed circuit boards (PCBs). For example, as depicted in FIG. 1E, WiFi module 119 and WiFi antenna 162 are disposed close to microcontroller 160 on camera PCB 166 and audio/video PCB 167 rather than on communication PCB 114. This
configuration facilitates low voltage operation, which helps to reduce power consumption. In some instances, wireless communication unit 110 is embedded in microcontroller 160. In contrast, as depicted in FIG. 1A and FIG. 1E, Bluetooth module 118 and Bluetooth antenna 113 are integrated with wireless communication unit 110 on communication PCB 114. In some instances, the Bluetooth module 118 is configured to transmit media information (e.g., streamed captured image frames) from video camera 163. In some embodiments, shaving system 100 includes wireless communication unit 110, which is attached to handle 140 and is electrically connected to microcontroller 160. In some instances, wireless
communication unit 110 is configured to transmit and receive data between microcontroller 160 to wireless module 555 on external device 505 (e.g., FIG. 5 and FIG. 12).
[0051] As depicted in FIG. 1A - FIG. 4, shaving system 100 includes force sensor 120 (e.g., force cell, load cell) coupled to lever assembly 130. Lever assembly 130 hinges blade cartridge 150 around first fulcrum 131 and second fulcrum 132 to depress plunger 124 over a distance S0. In some instances, spring 123 is used to determine sensor force, Fs, at plunger 124 by multiplying plunger depression distance S0 with the stiffness, k, of spring 123 (e.g., Fs = k·S0). Various techniques may be used to determine plunger depression distance S0. For instance, in some embodiments, plunger 124 is connected to a terminal of a slider
potentiometer or a variable resistor and configured to provide a resistance or voltage proportional to plunger depression distance S0.
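As an illustration of the Fs = k·S0 relation described above, the following minimal sketch converts a slider-potentiometer reading into a sensor force. The ADC resolution, full plunger travel, and spring stiffness values are assumptions chosen only for illustration and are not specified in the patent.

```python
# Assumed, illustrative constants -- not values from the patent.
SPRING_STIFFNESS_N_PER_MM = 0.8   # stiffness k of spring 123 (assumed)
FULL_TRAVEL_MM = 4.0              # plunger travel mapped to the full-scale ADC reading (assumed)
ADC_FULL_SCALE = 1023             # 10-bit ADC (assumed)


def plunger_depression_mm(adc_reading):
    """Convert the potentiometer ADC reading to plunger depression distance S0."""
    return (adc_reading / ADC_FULL_SCALE) * FULL_TRAVEL_MM


def sensor_force_newtons(adc_reading):
    """Sensor force Fs = k * S0, as described for force sensor 120."""
    return SPRING_STIFFNESS_N_PER_MM * plunger_depression_mm(adc_reading)


# Example: a mid-scale reading corresponds to roughly half of full travel (~1.6 N here).
print(sensor_force_newtons(512))
```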
[0052] In some embodiments, force sensor 120 (e.g., force cell, load cell) includes a capacitor plate configured to provide a capacitance proportional to plunger depression distance S0. In some embodiments, force sensor 120 is a load-cell that includes micro-machined silicon piezo-resistive strain gauges fused with high temperature glass to a high performance stainless steel substrate. It should be appreciated that shaving system 100 is not limited to force sensor 120 and may include, for example, an accelerometer configured to calculate a number of shave strokes and their intensity, a piezoelectric material (e.g., quartz) sensor, or other capacitive-based sensor configured to provide an electric charge proportional to the force, Fs, at plunger 124.
[0053] Force sensor 120 may be configured to sense composite force, F, that includes both normal force, FN, and tangential force, FT. Normal force, FN, refers to the force a user applies to press blade cartridge 150 against the surface of the skin. As illustrated in FIGS. 2A-2C, shaving system 100 includes lever assembly 130 to detect normal force, FN. In this instance, lever assembly 130 is configured to translate (e.g., transfer) normal force, FN, to depress plunger 124 of force sensor 120. That is, applying normal force, FN, to the tip of input arm 138 pivots coupling 137 around second fulcrum 132, which in turn pivots output arm 134 around first fulcrum 131 to depress plunger 124. In some embodiments, the positions of first fulcrum 131 and second fulcrum 132 remain fixed and do not readjust with the application of a normal force, FN. For instance, first fulcrum 131 and second fulcrum 132 depicted in FIG. 2B and FIG. 2C maintain the same initial position from FIG. 2A with the application of a normal force, 1/2 FN and FN, respectively. In some embodiments, the positions of one or both of first fulcrum 131 or second fulcrum 132 are adjusted with the application of a normal force, FN.
[0054] The displacement at the tip of input arm 138 (e.g., input displacement distance) is proportional to the applied normal force, FN. That is, the displacement distance Si of input arm 138 is zero without any applied normal force, FN, as illustrated in FIG. 2A. FIG. 2B and FIG. 2C illustrate increasing displacement distance Si of input arm 138 with the application of normal forces, 1/2 FN and FN, respectively.
[0055] As illustrated in FIG. 2A - FIG. 4, the forward kinematics of the displacement distance, Si, translates to a counter-clockwise rotational motion of input arm 138 about second fulcrum 132 that displaces coupling 137 a distance, Sm. The displacement of coupling 137, Sm, translates to a clockwise rotational motion of output arm 134 about first fulcrum 131 that displaces plunger 124 a distance, S0.
[0056] As illustrated in FIG. 4, the precise displacement distance of coupling 137, Sm, with respect to the input displacement, Si, is based on a ratio of the distance from the tip of input arm 138 to second fulcrum 132, L1, and the distance from second fulcrum 132 to the center of coupling 137, L2, or
Sm = Si · (L2 / L1)   (1)
[0057] Similarly, the displacement distance of plunger 124 (e.g., output displacement), S0, with respect to the displacement distance of coupling 137, Sm, is based on a ratio of the distance from the center of coupling 137 to first fulcrum 131, L3, and the distance from first fulcrum 131 to plunger 124, L4, or
S0 = Sm · (L4 / L3)   (2)
[0058] The overall displacement ratio of the displacement at the tip of input arm 138 (e.g., input displacement distance), Si, with respect to the displacement distance of plunger 124 (e.g., output displacement), S0, is based on the distance from the tip of input arm 138 to second fulcrum 132, L1, times the distance from the center of coupling 137 to first fulcrum 131, L3, divided by the distance from second fulcrum 132 to the center of coupling 137, L2, and divided by the distance from first fulcrum 131 to plunger 124, L4, or
Si / S0 = (L1 · L3) / (L2 · L4)   (3)
[0059] Accordingly, the lever assembly 130 of shaving system 100 can tune the transference ratio based on the distance from the tip of input arm 138 to second fulcrum 132, L1, the distance from the center of coupling 137 to first fulcrum 131, L3, the distance from second fulcrum 132 to the center of coupling 137, L2, and the distance from first fulcrum 131 to plunger 124, L4. Tuning the transference ratio provides a sensing range that is conducive to the force sensor 120 operating range.
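The following is a small numerical sketch of the transference relations in Equations (1)-(4); the example distances are arbitrary placeholders, not dimensions of lever assembly 130.

```python
# Illustrative sketch of the lever transference relations; L1-L4 values are placeholders.
def transference_ratio(l1, l2, l3, l4):
    """Transference ratio (L1 * L3) / (L2 * L4) of lever assembly 130."""
    return (l1 * l3) / (l2 * l4)


def sensor_force(normal_force, l1, l2, l3, l4):
    """Fs = FN multiplied by the transference ratio (Equation (4))."""
    return normal_force * transference_ratio(l1, l2, l3, l4)


def plunger_displacement(input_displacement, l1, l2, l3, l4):
    """S0 = Si * (L2 * L4) / (L1 * L3), the inverse of the transference ratio (Equation (3))."""
    return input_displacement / transference_ratio(l1, l2, l3, l4)


# With L1=20, L2=10, L3=12, L4=8 (arbitrary units) the ratio is 3.0, so a 1.0 N normal force
# is sensed as 3.0 N and an input displacement of 6.0 mm depresses the plunger 2.0 mm.
print(transference_ratio(20, 10, 12, 8),
      sensor_force(1.0, 20, 10, 12, 8),
      plunger_displacement(6.0, 20, 10, 12, 8))
```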
[0060] In some embodiments, lever assembly 130 is configured to displace plunger 124 (e.g., output displacement), S0, the same distance as the tip of input arm 138 (e.g., input displacement distance), Si, which results in a one-to-one transference ratio (e.g., FS = FN, Si = S0). In some embodiments, lever assembly 130 is configured to displace plunger 124 (e.g., output displacement), S0, less than the displacement distance of the tip of input arm 138 (e.g., input displacement distance), Si, which results in a transference ratio greater than one (e.g., FS > FN, Si > S0). In some embodiments, lever assembly 130 is configured to displace plunger 124 (e.g., output displacement), S0, more than the displacement distance of the tip of input arm 138 (e.g., input displacement distance), Si, which results in a transference ratio less than one (e.g., FS < FN, Si < S0).
[0061] One benefit of a transference ratio larger than one (e.g., FS > FN, Si > S0) is that the force sensed at plunger 124, FS, is larger than the applied normal force, FN, which results in a force sensor 120 with a higher resolution.
[0062] The overall displacement ratio of the displacement at the tip of input arm 138 (e.g., input displacement distance), Si, with respect to the displacement distance of plunger 124 (e.g., output displacement), S0, is proportional to the sensing force, FS, with respect to the normal force, FN. In view of Equation (3) above, sensing force, FS, with respect to normal force, FN, is based on the distance from the tip of input arm 138 to second fulcrum 132, L1, times the distance from the center of coupling 137 to first fulcrum 131, L3, divided by the distance from second fulcrum 132 to the center of coupling 137, L2, and divided by the distance from first fulcrum 131 to plunger 124, L4, or
FS = FN · (L1 · L3) / (L2 · L4)   (4)
[0063] That is, normal force, FN, is multiplied by the transference ratio to calculate sensing force, FS. Likewise, displacement distance of plunger 124, S0, is multiplied by the transference ratio to calculate the displacement at the tip of input arm 138, Si.
[0064] Tangential force, FT, is part of composite force, F, that refers to the force a user applies to blade cartridge 150 to cut hair across the surface of the skin, and is based, at least in part, on friction due to the blade 151 dragging on the surface of the skin. In general, lever assembly 130 is configured to translate (e.g., transfer) tangential force, FT, to depress plunger
124 of force sensor 120. In this instance, second fulcrum 132 is coupled to second slide bearing 133, which is configured to move along an inclined plane at angle Θ, with respect to the gripping portion of handle 140. Applying tangential force, FT, to the tip of input arm 138 slides second fulcrum 132 up the inclined plane at angle Θ to reposition coupling 137. In turn, coupling 137 readjusts the position of output arm 134 along a channel within output arm 134 and first slide bearing 139 while coupling 137 pivots around first fulcrum 131 to depress plunger 124.
[0065] As illustrated in FIG. 3A-3C, the position of second fulcrum 132 remains the same with respect to input arm 138, whereas the position of first fulcrum 131 is adjusted based on applied tangential force, FT. As such, the distance from the center of coupling 137 to first fulcrum 131, L3, and the distance from first fulcrum 131 to plunger 124, L4, varies over the distance of the channel within output arm 134. This variance in the distance from the center of coupling 137 to first fulcrum 131, L3, and the distance from first fulcrum 131 to plunger 124, L4, varies the transference ratio.
[0066] To compensate for this variance, position sensor 136, which in some examples includes a slide bearing, is placed along the channel within output arm 134 to provide an offset from the initial position depicted in FIG. 3A. In this instance, position sensor 136 is a variable sliding resistor configured to provide a resistance or voltage proportional to the offset sliding distance, Soff. In some embodiments, position sensor 136 may include other sensors such as a capacitive transducer, a capacitive displacement sensor, an eddy-current sensor, an ultrasonic sensor, a grating sensor, a Hall effect sensor, an inductive non-contact sensor, an optical sensor (e.g., laser Doppler vibrometer), a linear variable differential transformer (LVDT), a multi-axis displacement transducer, a photodiode array, a piezo-electric transducer, a rotary encoder, or the like.
[0067] As illustrated in FIG. 3A, the sliding motion of input arm 138 is zero without any applied tangential force, FT, whereas, as illustrated in FIG. 3B and FIG. 3C, the sliding motion of plunger 124 increases with the offset sliding distance, Soff, of position sensor 136 with the application of tangential forces, 1/2 FT and FT, respectively. The inverse kinematics translates the sliding motion of position sensor 136 at second fulcrum 132 along an inclined plane to a clockwise rotational motion of output arm 134 about first fulcrum 131 to displace plunger 124 a distance S0. That is, tangential force, FT, is proportional to a combination of the offset sliding distance, Soff, and the displacement distance of plunger 124, S0. In some examples, microcontroller 160 is configured to determine the applied tangential force, FT, based on both the displacement distance of plunger 124, S0, and the offset of position sensor 136.
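Because the text states only that tangential force FT is proportional to a combination of Soff and S0, any concrete formula is an assumption. The sketch below uses a simple linear calibration model to illustrate how microcontroller 160 might combine the two readings; the coefficients are illustrative placeholders.

```python
# Hedged sketch: an assumed linear calibration, not a relation given in the patent.
CAL_A = 0.5   # N per mm of offset sliding distance Soff (assumed)
CAL_B = 0.3   # N per mm of plunger depression S0 (assumed)


def tangential_force_estimate(s_off_mm, s0_mm):
    """Estimate FT from the position sensor 136 offset and the plunger depression."""
    return CAL_A * s_off_mm + CAL_B * s0_mm


# Example: Soff = 2.0 mm and S0 = 1.5 mm give an estimate of 1.45 N with these coefficients.
print(tangential_force_estimate(2.0, 1.5))
```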
[0068] To facilitate the slide motion along the inclined plane, slide bearing 139, slide bearing 133, slide bearing/position sensor 136, and vertical slide bearings 135 mounted over plunger 124 are configured to have mechanical properties of near zero friction (e.g., frictionless). In some instances, slide bearing 139 and second slide bearing 133 include ball bearings. In some instances, slide bearing 139 and second slide bearing 133 include linear bearings. In some instances, slide bearing 139 and second slide bearing 133 include both ball bearings and linear bearings.
[0069] As illustrated in FIG. 4, lever assembly 130 and force sensor 120 are configured to combine normal force, FN, and tangential force, FT, into single quantitative indicator 510 that is associated with the total force applied to the skin. In some embodiments, lever assembly 130 is configured to transfer both normal force, FN, and tangential force, FT, from blade 151 in contact with the skin to the compressive force at the proximity sensor.
[0070] By having lever assembly 130 and force sensor 120 (e.g., force cell, load cell) configured to combine normal force, FN, and tangential force, FT, into single quantitative indicator 510, lever assembly 130 and spring 123 cushion and absorb sudden movements. This provides for blade 151 to follow along the surface contour of the skin and conform across imperfections (e.g., micro bumps) for a closer, more comfortable shave. In some
embodiments, lever assembly 130 and force sensor 120 (e.g., force cell, load cell) include a dashpot configured to reduce vibrations in the spring 123 as well as slow the travel of lever assembly 130 to the initial position depicted in FIG. 2A and FIG. 3A. In some instances, the dashpot includes pneumatics.
[0071] Further, by having lever assembly 130 and force sensor 120 (e.g., force cell, load cell) configured to combine normal force, FN, and tangential force, FT, into single quantitative indicator 510, lever assembly 130 and spring 123 can compensate for rough motions of the user's arm or hand thereby minimizing the pressure of blade 151 against the skin.
[0072] It should be appreciated that shaving system 100 is not limited to lever assembly 130 or force sensor 120 to detect one or both of normal force, FN, or tangential force, FT. For example, strain sensors (e.g., piezo-electric sensors) may be disposed between blade 151 and the body of blade cartridge 150. In this instance, one or more strain sensors (e.g., piezoelectric sensors) may be configured to sense normal force, FN, and/or tangential force, FT, that can be combined into single quantitative indicator 510.
[0073] Some embodiments of shaving system 100 display quantitative force indicator 510 on handle 140 of shaving system 100 or alternatively on external device 505 (e.g., smartphone 525, tablet 535, laptop, or desktop 540) via wireless communication unit 110 to wireless module 555. In some instances, microcontroller 160 stores to first memory 161 data indicative of the force applied (e.g., the force over a shave session) prior to blade cartridge 150 replacement. This provides a reference for a 'dull' blade 151 and provides another indicator to facilitate predicting blade attrition and end of life of blade cartridges 150.
[0074] In some embodiments, microcontroller 160 is configured to store in first memory 161 the data indicative of the force applied between a new blade cartridge 150 and the skin during the first shaving session. This beneficially can be used as a baseline for a 'sharp blade' for subsequent shaving sessions.
[0075] In some embodiments, microcontroller 160 or the external device is configured to calculate a force applied over several shaving sessions (e.g., 'habitual' average force).
Tracking the force applied in this manner provides a metric to gauge blade attrition (e.g., dulling of blade 151). For example, the force a user applies using a new 'sharp' blade 151 may be equal to 1/2 FN, which displaces lever assembly 130 as depicted in FIG. 2B. In contrast, the average force a user applies using an older 'dull' blade 151 may be equal to FN, which displaces lever assembly 130 twice as far, as depicted in FIG. 2C. In this instance, the additional force against the skin a user applies to compensate for the additional friction and inefficiencies of 'dull' blade 151 is twice as much as for 'sharp' blade 151.
[0076] In some instances, microcontroller 160 or the external device is configured to count the number of shaving strokes, which in this instance is the number of times in a shaving session that an applied force exceeds the calculated average force applied over several shaving sessions. Contrasting the number of shaving strokes provides another metric to gauge blade attrition (e.g., dulling of blade 151). For example, the number of shaving strokes for new 'sharp' blade 151 is often significantly less than the number of shaving strokes for older 'dull' blade 151, because a user will drag 'dull' blade 151 across the skin more times to account for less efficient cutting. As such, the number of shaving strokes increases as the blade dulls, which provides a metric to gauge blade attrition.
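A minimal sketch of the stroke-counting heuristic described in the two preceding paragraphs follows: the habitual average is taken over prior sessions, and a stroke is counted at each rising crossing of that average. The sample data and function names are assumptions for illustration only.

```python
# Illustrative sketch only; sample forces and helper names are assumptions.
def habitual_average(session_forces):
    """Average force across several shaving sessions (each a list of force samples)."""
    samples = [f for session in session_forces for f in session]
    return sum(samples) / len(samples)


def count_strokes(force_samples, threshold):
    """Count rising crossings of the threshold, one per shaving stroke."""
    strokes = 0
    above = False
    for f in force_samples:
        if f > threshold and not above:
            strokes += 1
            above = True
        elif f <= threshold:
            above = False
    return strokes


past_sessions = [[0.2, 0.9, 0.3], [0.1, 1.0, 0.4]]
threshold = habitual_average(past_sessions)                       # ~0.48 N here
print(count_strokes([0.1, 0.9, 0.2, 1.1, 1.2, 0.1], threshold))   # 2 strokes in this sample
```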
[0077] In some instances, microcontroller 160 or the external device incorporates machine learning (e.g., heuristics) to determine blade attrition based on the number of strokes. For example, shaving system 100 may include a threshold associated with a number of shaving strokes for 'dull' blade 151. Microcontroller 160 or the external device adjusts the threshold associated with a number of shaving strokes for 'dull' blade 151 each time a user replaces blade 151. Over time, the threshold associated with a number of shaving strokes for 'dull' blade 151 converges on an accurate value that is based on a user's comfort level for blade cartridge 150 replacements. In some instances, microcontroller 160 is configured to prompt the user when the number of shaving strokes for 'dull' blade 151 approaches the adjusted threshold level. For example, external device 505 may be configured to prompt the user once the number of shaving strokes exceeds 90% of the threshold associated with the number of shaving strokes for 'dull' blade 151. In some instances, a pop-up is displayed that allows the user to order a new replacement blade online. In some instances, replacement blades are automatically ordered for a user.
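The adaptive 'dull blade' threshold described above could be approximated as follows; the exponential smoothing factor and initial threshold are assumptions, and the 90% prompt level follows the example in the preceding paragraph.

```python
# Hedged sketch of a heuristic threshold adjustment; constants are assumptions.
class DullBladeEstimator:
    def __init__(self, initial_threshold=5000, smoothing=0.3):
        self.threshold = initial_threshold
        self.smoothing = smoothing

    def on_blade_replaced(self, strokes_at_replacement):
        """Move the threshold toward the user's observed replacement point (heuristic learning)."""
        self.threshold += self.smoothing * (strokes_at_replacement - self.threshold)

    def should_prompt(self, accumulated_strokes):
        """Prompt the user to order a replacement once 90% of the learned threshold is reached."""
        return accumulated_strokes >= 0.9 * self.threshold


est = DullBladeEstimator()
est.on_blade_replaced(4200)      # threshold moves from 5000 toward 4200 -> 4760
print(est.should_prompt(4300))   # True: 4300 >= 0.9 * 4760 = 4284
```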
[0078] As depicted in FIG. 1A - FIG. 4, force sensor 120 is configured as a proximity sensor that detects contact between each blade 151 and the skin. In some instances, force sensor 120 is configured to indicate contact between blade cartridge 150 and the skin for any depression distance S0 greater than zero (e.g., S0 > 0). Likewise, in some instances, force sensor 120 may be configured to indicate contact based on changes in force, F, over a time differential, Δt, which can provide feedback to a user (e.g., audible sound, light or message displayed on an external device) to assist in proper shaving techniques.
[0079] In some embodiments, lever assembly 130 includes a stopper configured to reduce the travel distance of lever assembly 130. The stopper may be set at various positions of known deflection that are used to calibrate force sensor 120. In some instances, a stopper is set in a position that indicates a force threshold of 'dull' blade 151.
[0080] In some embodiments, the proximity sensor is a touch-based sensor (e.g., piezoelectric sensor, capacitive sensor) attached to each blade 151 on blade cartridge 150 and configured to detect contact of each blade 151 with the skin. In some instances, blade 151 is in contact with the skin and the proximity sensor is configured to detect a compressive force. In some instances, the proximity sensor is attached to the front of blade cartridge 150, adjacent to blades 151, and is configured to detect contact between blade cartridge 150 and the skin.
[0081] It will be appreciated that shaving system 100 is not meant to be limited to force sensor 120. For instance, conceivable modifications to lever assembly 130 may hinge blade cartridge 150 around fulcrum 131 to extend plunger 124 over a negative distance, -S0. In this instance, spring 123 of force sensor 120 is configured to detect a tensile force rather than a compressive force. For example, in some embodiments, blade 151 is in contact with the skin and the proximity sensor is configured to detect a tensile force. In some embodiments, lever assembly 130 is configured to transfer both normal force, FN, and tangential force, FT, from blade 151 in contact with the skin to the tensile force at the proximity sensor. Other contact-based proximity sensors configured to detect the force blade 151 exerts on the skin include piezoelectric sensors, capacitive sensors,
micro-electrical mechanical system (MEMS) based sensors, and the like.
[0082] In some embodiments, the proximity sensor is an ultrasonic rangefinder. For example, in some instances, this includes a distance ranging mechanism such as an ultrasonic pulse rangefinder configured to determine the distance from blade 151 to the skin. In some embodiments, the proximity sensor is an infrared (IR) sensor or any electronic sensor configured to detect an electromagnetic field or a beam of electromagnetic radiation (e.g., infrared, laser).
[0083] In some embodiments, the proximity sensors include optical or infrared imaging. For example, video camera 163 may be configured to detect proximity based on the incident light disparity such as detecting a dim, low intensity light when close to the skin and a brighter intense light away from the skin. In some embodiments, infrared sensors are configured to capture images that distinguish a slightly heated region caused by the friction of dragging blades 151 across the skin. In addition, shaving system 100 may be configured to capture a profile of the slightly heated region and analyze the captured profile for uneven wear (e.g., imbalances in blade attrition).
[0084] In some embodiments, the proximity sensor is an accelerometer, which can detect the stroke count as well as the hand motion acceleration, which might assist in indicating dullness based on excess force applied by the user.
[0085] In accordance with some embodiments, the proximity sensor is a mechanical friction sensor that detects mechanical deflections in a region where blades 151 contact the skin. Often, the mechanical deflections enable a mechanical friction sensor to detect both compressive forces (e.g., FIG. 2A-2C) and tensile forces (e.g., FIG. 3A-3C) in a region where blades 151 contact the skin. In some embodiments, the proximity sensor is a mechanical friction sensor that uses a piezoelectric film. In some instances, the mechanical friction sensor is attached to the front of blade cartridge 150 adjacent to blades 151 in a region that contacts the skin. In some instances, the mechanical friction sensor uses a piezoelectric film that attaches to the front of blade cartridge 150 and adjacent to blades 151 to detect contact between blade cartridge 150 and the skin. In some instances, the mechanical friction sensor is attached between at least one blade 151 and handle 140.
[0086] In some embodiments, the proximity sensor is a piezoelectric friction sensor that attaches between blades 151 and the body of blade cartridge 150. In some embodiments, the proximity or contact sensor is a piezoelectric sensor that attaches to one or more blades 151 to detect the deflection of each blade 151.
[0087] As depicted in FIG. 4, lever assembly 130 of shaving system 100 is configured to sense both a normal force, FN (e.g., FIG. 2A-2C), and a tangential force, FT (FIG. 3A-3C), in a region where blades 151 contact the skin. In particular, lever assembly 130 includes second slide bearing 133, which moves second fulcrum 132 in a direction of applied tangential force (e.g., friction force). That is, the applied tangential force (e.g., friction force) adjusts the position of second fulcrum 132 along input arm 138, which in turn adjusts the position of first fulcrum 131 to pivot along output arms 134. This adjustment transmits the applied tangential force (e.g., friction force) on blade 151 to output arm 134 to depress force sensor 120.
[0088] One advantage of sensing both a normal force, FN, and a tangential force, FT, is that the combination provides a force-based profile of each shaving stroke, which facilitates distinguishing a shaving stroke performed using a worn blade from a shaving stroke performed using a fresh blade with respect to each of performance, quality of shave, and shave stroke count. In some instances, microcontroller 160 is configured to collect and store in first memory 161, data associated with the forces applied to force sensor 120 for a portion of a shaving session.
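By way of illustration only, the following minimal Python sketch shows one way such a force-based stroke profile could be derived from sampled force readings; the sampling format, threshold value, and function names are assumptions for this example and are not taken from the specification.

```python
# Minimal sketch: segmenting shaving strokes from sampled force data and
# summarizing each stroke as a simple force-based profile.
# Assumptions (not from the specification): forces arrive as (normal, tangential)
# samples in newtons at a fixed rate, and a stroke is any contiguous run where
# the normal force exceeds a threshold.

def detect_strokes(samples, normal_threshold=0.5):
    """Return a list of per-stroke profiles derived from (FN, FT) samples."""
    strokes, current = [], []
    for fn, ft in samples:
        if fn > normal_threshold:
            current.append((fn, ft))
        elif current:
            strokes.append(_summarize(current))
            current = []
    if current:
        strokes.append(_summarize(current))
    return strokes

def _summarize(stroke):
    peak_fn = max(fn for fn, _ in stroke)
    peak_ft = max(ft for _, ft in stroke)
    mean_ratio = sum(ft / fn for fn, ft in stroke) / len(stroke)
    return {"samples": len(stroke), "peak_FN": peak_fn,
            "peak_FT": peak_ft, "mean_FT_over_FN": mean_ratio}

if __name__ == "__main__":
    # Two synthetic strokes separated by a lift-off (FN below threshold).
    data = [(0.1, 0.0), (1.2, 0.6), (1.5, 0.9), (0.2, 0.1),
            (1.1, 0.8), (1.4, 1.1), (0.1, 0.0)]
    for profile in detect_strokes(data):
        print(profile)
```

A higher mean tangential-to-normal ratio across otherwise similar strokes is the kind of signature that could help distinguish a worn blade from a fresh one, consistent with the comparison described above.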
[0089] Another parameter that can be used to determine shaving stroke performance and count is the duration blade 151 is in contact with the skin. In this approach, microcontroller 160 is configured with a timer that measures the period of time that the proximity sensor detects contact between blade 151 and the skin. For this technique, the contact duration is compared to a contact duration threshold to determine a completed shaving stroke. In some embodiments, the proximity sensor is configured to detect when at least one blade 151 contacts the skin. In some instances, microcontroller 160 may not accurately interpret the occurrence of a shaving stroke when the contact duration is too short or too long. As such, the contact duration threshold may be adjusted by the user (e.g., using external device 505 via wireless communication unit 110 to wireless module 555).
[0090] In some embodiments, microcontroller 160 or the external device 505 is configured to automatically and incrementally adjust a threshold value (e.g., the contact duration threshold) representative of the period of time that the proximity sensor detects contact between blade 151 and the skin, for instance, based on the user's behavior. In some embodiments, microcontroller 160 is configured to provide instructions to an external device 505 to incrementally adjust a threshold value (e.g., the contact duration threshold) representative of the period of time that the proximity sensor detects contact between blade 151 and the skin based on the user's behavior. For example, a woman shaving her legs may have long contact shaving strokes, whereas a man shaving his face may have short contact shaving strokes. In these instances, microcontroller 160 is configured to adaptively adjust (e.g., using heuristic learning) the contact duration threshold to calculate a more accurate metric for the total accumulated time that blade 151 made contact with the skin. In conjunction with counting the total number of shaving strokes in a shave session, adaptive learning (e.g., heuristic learning) facilitates a more accurate estimate for predicting blade attrition.
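By way of illustration only, the following Python sketch shows one possible adaptive adjustment of the contact duration threshold using a running average of accepted stroke durations; the smoothing factor and the 0.5 scaling are illustrative assumptions, not values from the specification.

```python
# Minimal sketch of an adaptive contact-duration threshold, assuming contact
# durations (seconds per detected contact event) are reported one at a time.

class AdaptiveStrokeThreshold:
    def __init__(self, initial_threshold=0.3, alpha=0.1):
        self.threshold = initial_threshold   # seconds of contact required to count a stroke
        self.avg_duration = None             # running average of accepted stroke durations
        self.alpha = alpha                   # smoothing factor for the running average

    def observe(self, contact_duration):
        """Return True if the contact counts as a shaving stroke, and adapt the threshold."""
        is_stroke = contact_duration >= self.threshold
        if is_stroke:
            if self.avg_duration is None:
                self.avg_duration = contact_duration
            else:
                self.avg_duration = ((1 - self.alpha) * self.avg_duration
                                     + self.alpha * contact_duration)
            # Track the user's typical stroke: accept contacts at least half the running average.
            self.threshold = 0.5 * self.avg_duration
        return is_stroke

if __name__ == "__main__":
    tracker = AdaptiveStrokeThreshold()
    for d in [0.4, 1.8, 2.0, 0.1, 1.9]:   # long strokes, e.g. shaving legs
        print(d, tracker.observe(d), round(tracker.threshold, 2))
```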
[0091] Shaving system 100 may also provide a quantitative comparison based on manufacturer data. For example, a manufacturer may report that a particular blade cartridge 150 lasts up to five weeks. If the average number of shaving strokes per day determined for a user is 150, microcontroller 160 would determine an expected lifetime of 5,250 strokes (e.g., 150 strokes/day x 7 days/week x 5 weeks). In some embodiments, microcontroller 160 is configured to provide instructions to external device 505 to determine a total number of occurrences detected by the proximity sensor in second memory 570 and display on a display a quantitative comparison between the total number of shaving strokes and a number of shaving strokes expected over the lifetime of blade 151.
[0092] In some embodiments, shaving system 100 includes indicator display 510 disposed on handle 140. In some instances, microcontroller 160 is configured to receive the quantitative comparison from external device 505 via wireless communication unit 110 and display on display 510 a dullness indicator representative of the quantitative comparison.

[0093] In some embodiments, second memory 570 is electrically connected to external device 505 (FIG. 12). In some instances, second memory 570 is configured to store data associated with the at least one blade 151.
[0094] In some embodiments, microcontroller 160 is configured to provide instructions to wireless communication unit 110 to transmit a quantitative comparison of the total number of shaving strokes stored in the memory and the number of shaving strokes expected over the lifetime of at least one blade 151 to be provided for display on external device 505.
[0095] In general, the quantitative comparison may be represented as an anticipated percentage of remaining use until replacement, an anticipated number of days remaining, an anticipated number of shaving strokes remaining, or the like. For instance, if the device recorded 4,500 shaving strokes on day 30, the user may be notified that the blade is approaching the end of its lifespan, with a total of 750 shaving strokes or 5 days of shaving remaining. In some embodiments, the quantitative comparison is presented as a display bar, color LEDs, or a small LCD on handle 140 of shaving system 100, akin to dullness indicator 510 represented as a display bar on handle 140 in FIG. 5.
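By way of illustration only, the following Python sketch reproduces the arithmetic of this quantitative comparison using the example figures above (150 strokes per day, a five-week rated lifetime, and 4,500 recorded strokes); the function and key names are illustrative, not part of the specification.

```python
# Worked sketch of the quantitative comparison: strokes, days, and percentage remaining.

def blade_life_summary(strokes_recorded, avg_strokes_per_day=150, rated_weeks=5):
    expected_lifetime = avg_strokes_per_day * rated_weeks * 7   # 150 * 7 * 5 = 5,250 strokes
    strokes_left = max(expected_lifetime - strokes_recorded, 0)
    return {
        "percent_remaining": 100.0 * strokes_left / expected_lifetime,
        "strokes_remaining": strokes_left,
        "days_remaining": strokes_left / avg_strokes_per_day,
    }

print(blade_life_summary(4500))
# -> about 14.3% remaining, 750 strokes remaining, 5.0 days remaining
```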
[0096] In some embodiments, shaving system 100 includes a server-based or cloud-based 545 user subscription account that is configured to retrieve and store the relevant information from shaving system 100 for blade cartridge 150, such as the manufacturer, model number, number of completed shaving strokes, anticipated number of days remaining on blade cartridge 150, and the life expectancy of each blade. In some embodiments, the subscription account is configured to notify the user (e.g., via email or pop-up message) that a replacement blade cartridge should be ordered when the anticipated number of days remaining in the life of blade cartridge 150 drops below a certain threshold. In some embodiments, the subscription account is configured to automatically order or purchase a replacement cartridge once the anticipated number of days remaining in the life of blade cartridge 150 drops below a certain threshold.
[0097] In some embodiments, the server-based or cloud-based 545 user subscription account is accessible through the external device 505. Accordingly, the external device 505 may be configured to provide access to the server-based or cloud-based user subscription account. The server-based user subscription account is configured to order replacements for the at least one blade based on data or instructions received from the microcontroller 160 in some examples. By way of example, the server-based or cloud-based 545 user subscription account is configured to retrieve the quantitative comparison based on the total number of shaving strokes from the memory via wireless module 555 and order replacements for the at least one blade when the total number of shaving strokes reaches a threshold value proportional to the quantitative comparison.
[0098] It should be appreciated that additional techniques may be implemented to assist in providing an accurate stroke count and quantitative comparison, such as filtering techniques (e.g., low-pass filters to remove flicker noise) and statistical analysis (e.g., standard deviation, expected value).
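By way of illustration only, the following Python sketch shows a simple moving-average low-pass filter and basic statistics of the kind mentioned above; the window size and sample values are illustrative assumptions.

```python
# Minimal sketch: smoothing per-session readings and summarizing stroke counts.

from statistics import mean, stdev

def moving_average(values, window=5):
    """Low-pass filter a sequence of readings with a causal sliding window."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)
        out.append(sum(values[lo:i + 1]) / (i + 1 - lo))
    return out

session_strokes = [148, 152, 210, 149, 151, 150]   # 210 might be an outlier session
print(f"mean={mean(session_strokes):.1f}, stdev={stdev(session_strokes):.1f}")
print("smoothed:", [round(v, 1) for v in moving_average(session_strokes, window=3)])
```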
[0099] To conserve resources, microcontroller 160 may be configured to provide sensory data to external device 505. As such, microcontroller 160 is configured to transmit sensory data via wireless communication unit 110 to wireless module 555 on external device 505. It should be understood that many of the computations performed by microcontroller 160 may be performed on external device 505 and transmitted and/or stored to first memory 161 on shaving system 100. This beneficially conserves power on shaving system 100 and in some instances may reduce the total processing time. Likewise, the quantitative comparison and other parameters may be displayed on external device 505.
2. Shaving system 100 with image camera 163
[00100] As depicted in FIG. 1A - FIG. 1H, shaving system 100 includes handle 140, at least one blade 151 connected to handle 140, microcontroller 160 attached to handle 140, and one or more sensors adjacent at least one blade 151. In some embodiments, the one or more sensors are configured to send sensory data to microcontroller 160. In some embodiments, one of the one or more sensors is camera 163 having an image sensor configured to capture video and/or still images. In some instances, camera 163 is configured to capture both frames and video. In some embodiments, microcontroller 160 is configured to stream video data from camera 163 and/or audio data from microphone 165 via wireless communication unit 110 to be displayed on external device 505 (e.g., smartphone, tablet, laptop, desktop).
[00101] In some embodiments, shaving system 100 includes wireless communication unit 110 attached to handle 140 and electrically connected to microcontroller 160. In some embodiments, wireless communication unit 110 is configured to transmit and receive data from microcontroller 160 to external device 505 (e.g., FIG. 5 and FIG. 12).

[00102] As depicted in FIG. 12, memory 161 is electrically connected to microcontroller 160. In some embodiments, memory 161 is configured to store data associated with at least one blade 151.
[00103] In some embodiments, microcontroller 160 is configured to instruct camera 163 to capture frames of images and instruct wireless communication unit 110 to stream the frames to be processed, analyzed, or displayed on the external device 505. In some instances, the frames are stored in first memory 161 or in an external storage (e.g., second memory 570) on external device 505.
[00104] In some embodiments, external device 505 is a wearable computing device, such as a wristwatch as depicted in FIG. 5, FIG. 6, and FIG. 8. In some embodiments, external device 505 is a hand-held phone 525 (e.g., mobile phone) as depicted in FIG. 7 or a tablet that is held or mounted nearby, similar to a portable hand mirror. In some embodiments, external device 505 uses a media player embedded in a user interface (UI) that is configured to play the video and/or audio captured in real time. In some instances, the media player may include other features, such as zoom (e.g., manual zoom or automatic zoom) and/or correction functionality that conditions the streamed media (e.g., image sharpness, contrast, color balance, filtering techniques).
[00105] One benefit of using camera sensor 163 is to provide a shaving view to the user on external device 505 without the need for a mirror, as well as to view regions that are difficult to see with a single mirror (e.g., the back of the neck). Further, having video streamed from camera 163 offers a close-up look at the shaving regions to ensure a proper shaving technique and to better check the quality of the shave.
[00106] An advantage of streaming the video is that external device 505 can provide feedback to a user in real time or in near real time. For example, in some embodiments, external device 505 is configured to analyze the frame images to determine a blade attrition comparison based on the analyzed frame images and present for display on display 565 on external device 505 the blade attrition comparison represented as a compass-like arrow that updates in near real time.
[00107] Microcontroller 160 may, in some examples, be configured to offload other tasks in order to save on power and provide more efficient utilization of computational resources, particularly during computationally intensive operations. For example, microcontroller 160 is configured to instruct wireless communication unit 110 to transmit frames to external device
505. In turn, external device 505 may include image analysis module 560 (FIG. 12) configured to differentiate a color variation between adjacent pixels in the captured frame and store in first memory 161 a quantitative comparison for the remaining hair. In some instances, the color between adjacent pixels may vary from a pinkish hue of bare skin that has been fully shaven to dark black that is unshaven. From the frame, external device 505 (e.g., via image analysis module 560) is configured to determine the amount of hair remaining. In some embodiments, external device 505 is configured to determine a quantitative comparison for the remaining hair based on the captured frames. In some embodiments, the amount of hair remaining is provided as a percentage of remaining hair that ranges from 100% (e.g., thick beard) to 0% (e.g., bare skin).
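By way of illustration only, the following Python sketch shows one simple way a remaining-hair percentage could be estimated from color variation between adjacent pixels; the grayscale input format and the difference threshold are illustrative assumptions, not the specification's method.

```python
# Minimal sketch: the fraction of pixels whose value differs sharply from a
# horizontal neighbor is used as a rough proxy for remaining hair (stubble
# produces many dark/light transitions, bare skin few).
# Assumption: the frame is a 2-D list of grayscale values 0-255.

def remaining_hair_percent(frame, diff_threshold=40):
    rows, cols = len(frame), len(frame[0])
    hairlike = 0
    for r in range(rows):
        for c in range(cols - 1):
            if abs(frame[r][c] - frame[r][c + 1]) > diff_threshold:
                hairlike += 1
    return 100.0 * hairlike / (rows * (cols - 1))

shaved_patch  = [[200, 198, 201, 199]] * 4                       # nearly uniform skin tone
stubble_patch = [[200, 40, 205, 35], [45, 210, 30, 200]] * 2     # alternating hair/skin
print(remaining_hair_percent(shaved_patch))    # ~0% -> close to bare skin
print(remaining_hair_percent(stubble_patch))   # high value -> plenty of hair left
```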
[00108] At times, external device 505 and wireless communication unit 110 may exchange data back and forth in real time. This is particularly useful for providing a user with feedback while shaving. For example, in some embodiments, microcontroller 160 is configured to provide a real-time quantitative comparison, such as a variable pitch sound or a recorded voice from speaker 164, a visual indicator 510, and the like, on shaving system 100. In some embodiments, microcontroller 160 is configured to provide an audio signal to instruct speaker 164 (e.g., electrical audio device) to emit a sound corresponding to the quantitative comparison for the remaining hair. In some examples, the sound is a variable pitched sound or a recorded voice.
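By way of illustration only, the following Python sketch maps a remaining-hair percentage onto a variable tone pitch of the kind described above; the frequency range is an illustrative assumption.

```python
# Minimal sketch: pitch falls as the area becomes fully shaved.

def feedback_pitch_hz(remaining_hair_percent, low_hz=220.0, high_hz=880.0):
    """0% remaining hair -> low pitch, 100% remaining hair -> high pitch."""
    fraction = max(0.0, min(remaining_hair_percent, 100.0)) / 100.0
    return low_hz + fraction * (high_hz - low_hz)

for pct in (100, 50, 0):
    print(pct, "% remaining ->", feedback_pitch_hz(pct), "Hz")
```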
[00109] In some instances, it is beneficial to offload data from shaving system 100 to external device 505. For example, display 565 on external device 505 may be larger or easier to manipulate (e.g., a touch screen). In these instances, microcontroller 160 transmits data in real time via wireless communication unit 110 and wireless module 555 on external device 505 that is displayed on display 565 on external device 505. In some embodiments, external device 505 is configured to present the quantitative comparison for the remaining hair for display on external device 505 (e.g. display 565).
[00110] As depicted in FIG. 9, microcontroller 160 is configured to capture a frame via miniature camera 163 to determine the amount of hair remaining over a certain area. In particular, microcontroller 160 captures a frame and compares the color difference between adjacent pixels to estimate the total amount of hair remaining over a specific area.
Microcontroller 160 stores to first memory 161 the total amount of hair remaining over a certain area as a quantitative comparison for the remaining hair. As depicted in FIG. 7, microcontroller 160 is configured to transmit via wireless communication unit 110 the quantitative comparison for the remaining hair to wireless module 555 on external device 505 (e.g., a wristwatch) that displays the frame of the specific area along with the quantitative comparison for the remaining hair.
[00111] External device 505 is configured to analyze a frame to determine the general growth direction of the remaining hair. For example, one approach to determine the general direction of hair growth is to filter the frame image using an edge detection filter, which contrasts the edges of hairs on the face as depicted in FIG. 10A - FIG. 10B. In this instance, microcontroller 160 is configured to capture and transmit the frame to external device 505, and external device 505 implements an edge detection filter to distinguish the hairs. In some embodiments, the edge detection filter is a Sobel filter or a Canny filter.
[00112] In some embodiments, external device 505 is configured to determine a general direction of the remaining hair based on the captured frames, and provide for display on external device 505 a directional indicator representative of a general direction of the remaining hair that corresponds to the best direction to drag the at least one blade over the skin. In some embodiments, external device 505 is configured to provide for display on the external device 505 the filtered frame images. In some embodiments, external device 505 is configured to overlay the filtered frame images on the streamed video frame image.
[00113] As illustrated in FIG. 9, various filter techniques may be implemented to distinguish the hair. In this instance, starting from frame image (i), a filter is applied that iteratively increases the contrast of the hair edges with respect to the background. After a few iterations (iii), as depicted in the zoomed-in region of FIG. 9, the hairs are contrasted and a "least square analysis" or "regression analysis" is applied to the remaining hair to calculate a general direction of the remaining hair. In some embodiments, external device 505 is configured to implement a "least square analysis" or "regression analysis" on the remaining hair to calculate the general direction of the remaining hair. In some instances, external device 505 is configured to determine a quantitative value representative of the general direction of hair growth, store it to memory, and provide the general direction of the remaining hair for display on the external device 505.
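By way of illustration only, the following Python sketch shows a least-squares estimate of the general hair direction from edge-pixel coordinates; the input format and function name are illustrative assumptions.

```python
# Minimal sketch: fit a least-squares line through hair-edge pixel coordinates
# and report its slope as an angle, taken here as the general hair direction.

import math

def general_hair_direction(edge_points):
    """Return the least-squares direction of (x, y) edge pixels, in degrees from the x-axis."""
    n = len(edge_points)
    mean_x = sum(x for x, _ in edge_points) / n
    mean_y = sum(y for _, y in edge_points) / n
    sxx = sum((x - mean_x) ** 2 for x, _ in edge_points)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in edge_points)
    if sxx == 0:
        return 90.0                      # perfectly vertical hairs
    slope = sxy / sxx                    # ordinary least-squares slope
    return math.degrees(math.atan(slope))

# Edge pixels roughly along a diagonal (hair growing at about 45 degrees).
points = [(0, 0), (1, 1.1), (2, 1.9), (3, 3.05), (4, 4.0)]
print(round(general_hair_direction(points), 1))   # close to 45 degrees
```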
[00114] This approach provides a directional indicator that corresponds to the best direction in which to drag at least one blade 151 over the skin. Determining the general direction of hair growth also allows the user to orient shaving system 100 according to the best direction to drag blade 151 over the skin. In some embodiments, the directional indicator is displayed on the external device as a circular bar graph that is updated and/or filled up in near real time. In some embodiments, the directional indicator is displayed on the external device 505 as a compass-like arrow that updates in near real time.
[00115] In some instances, the external device 505 determines the directional indicator based on the received frame images from microcontroller 160 (e.g., via wireless
communication unit 110 and wireless module 555).
[00116] Another approach to determining the general direction of hair growth is to encode the angle value of each pixel as a color based on the HSV color space, where the angle value is representative of the hair directions. In this technique, external device 505 or microcontroller 160 is configured to filter the frame images using a noise-reduction filter (e.g., a median filter) to reduce high-frequency noise prior to applying an edge detection filter. Next, external device 505 or microcontroller 160 is configured to apply a Canny edge detection filter to the frame images to detect edges. Often, the resultant filtered image has thick line edges. As such, microcontroller 160 or external device 505 is configured to apply a line-thinning filter to reduce line thicknesses on the frame images. Once the line thicknesses are reduced, microcontroller 160 or external device 505 is configured to determine the angle value of each pixel and encode it as a color based on the HSV color space. The angle value is representative of the line directions (e.g., of the hairs).
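By way of illustration only, the following Python sketch (assuming OpenCV is available) approximates this pipeline by smoothing the frame, detecting edges, and coloring each edge pixel by its local gradient angle mapped onto the HSV hue channel; a dedicated line-thinning step is omitted and all parameters are illustrative assumptions.

```python
# Minimal sketch: angle-as-color map of hair edges.

import cv2
import numpy as np

def hair_direction_map(gray):
    """gray: single-channel uint8 image. Returns a BGR image where hue encodes edge angle."""
    smoothed = cv2.medianBlur(gray, 5)                       # suppress high-frequency noise
    edges = cv2.Canny(smoothed, 50, 150)                     # detect hair edges
    gx = cv2.Sobel(smoothed, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(smoothed, cv2.CV_32F, 0, 1, ksize=3)
    _, angle = cv2.cartToPolar(gx, gy, angleInDegrees=True)  # 0-360 degree gradient angle
    hsv = np.zeros((*gray.shape, 3), dtype=np.uint8)
    hsv[..., 0] = (angle / 2).astype(np.uint8)               # OpenCV hue range is 0-179
    hsv[..., 1] = 255
    hsv[..., 2] = edges                                      # only edge pixels are visible
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

if __name__ == "__main__":
    frame = np.zeros((64, 64), dtype=np.uint8)
    cv2.line(frame, (5, 60), (60, 5), 255, 1)                # synthetic diagonal "hair"
    cv2.imwrite("direction_map.png", hair_direction_map(frame))
```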
[00117] Shaving system 100 can also assist in shaping regions of established hair. For example, frame images may include established hair growth regions such as a sideburn, muttonchops, mustache, goatee, and the like, where the image shows longer hair growth adjacent to short hair growth. In these instances, microcontroller 160 is configured to provide instructions to external device 505 to determine a boundary indicator associated with established hair growth based on the filtered frame images and provide the boundary indicator for display on external device 505. For example, external device 505 may overlay the boundary indicator with a frame. In some instances, external device 505 is configured to overlay the boundary indicator with streamed video frame images.
[00118] As viewed from external device 505, the streamed video would show the boundary indicator at the boundary between established hair growth region and stubble region to be shaved. In some instances, the boundary indicator is a line (e.g., a curved line or a straight line) that overlays a streamed video or frame. As such, the boundary indicator assists the user to balance the symmetry of unshaven regions as well as facilitate shaving near the contour of a beard or mustache.
[00119] In some embodiments, external device 505 is configured to adjust the boundary indicator according to predefined features selected by a user. For example, a user may adjust a goatee style and select within external device 505 to overlay the goatee style with streamed video as a guide for regions to shave. In some instances, the boundary that represents sideburns is extended to incorporate a larger short-hair region when the user desires muttonchops. In these instances, microcontroller 160 or external device 505 is configured to extend or reduce the boundary indicator and display an alternate quantitative boundary indicator on external device 505 representative of the predefined feature. In some embodiments, external device 505 is configured to overlay the boundary indicator with streamed video frame images. In some instances, the boundary indicator is displayed as a line. In some embodiments, the alternate quantitative boundary indicator overlays a streamed video or frame to guide the user in trimming and forming a desired look.
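By way of illustration only, the following Python sketch (assuming OpenCV is available) overlays a boundary indicator, supplied as a polyline of pixel coordinates, on a video frame; the contour points and color are illustrative assumptions.

```python
# Minimal sketch: draw a boundary indicator as a line over a copy of the frame.

import cv2
import numpy as np

def overlay_boundary(frame_bgr, boundary_points, color=(0, 255, 0)):
    """Return a copy of the frame with the boundary indicator drawn as a polyline."""
    pts = np.array(boundary_points, dtype=np.int32).reshape(-1, 1, 2)
    out = frame_bgr.copy()
    cv2.polylines(out, [pts], isClosed=False, color=color, thickness=2)
    return out

frame = np.full((120, 160, 3), 180, dtype=np.uint8)          # stand-in for a video frame
goatee_contour = [(60, 40), (55, 70), (80, 95), (105, 70), (100, 40)]   # illustrative points
cv2.imwrite("boundary_overlay.png", overlay_boundary(frame, goatee_contour))
```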
[00120] Monitoring hair characteristics is one approach to improve the quality of the shave. One technique for detecting blade attrition includes capturing a first image (e.g., frame) of a region of skin with hair using camera 163. In some instances, camera 163 is disposed below handle 140 and configured to view the region before blade 151 is dragged across the skin (i.e., prior to shaving), as depicted in FIG. 7. This configuration facilitates determining a quantitative comparison for the number of hairs in the region of skin. In some instances, camera 163 is disposed above handle 140 and configured to view the region after blade 151 is dragged across the skin (i.e., after shaving), as depicted in FIG. 1G. This configuration facilitates determining a quantitative comparison of the number of hairs in the region of skin before and after a shaving stroke.
[00121] Some embodiments include first camera 163 disposed below handle 140 and second camera 163 above handle 140. This configuration facilitates capturing a first image (e.g. frame) of a region of skin with hair in front of blade 151 and capturing a second image (e.g. frame) of a region of skin with hair behind blade 151. In some embodiments, one or more processors use the captured first and second images in determining a first and second quantitative comparison and providing an attrition comparison based on the difference between the second quantitative comparison and the first quantitative comparison to an electrical device.
[00122] To conserve power and/or save on resources, microcontroller 160 may be configured to provide raw sensory data to external device 505. As such, microcontroller 160 is configured to transmit raw sensory data via wireless communication unit 110 to wireless module 555 on external device 505. It should be appreciated that many of the computations performed by microcontroller 160 may be performed on external device 505 and transmitted and/or stored to first memory 161 on shaving system 100. This beneficially conserves power on shaving system 100 and in some instances may reduce the total processing time. Likewise, the quantitative comparison and other parameters may be displayed on external device 505.
[00123] Shaving system 100 is not meant to be limited to a cartridge-razor body style and may have other body styles conducive to disposable razors, safety razors, electric razors, straight razors, and the like. For example, in some embodiments, shaving system 100 may be an independent mountable electrical device that can be attached or clipped on to any hand-held razor. In these instances, users can purchase their preferred brand of razor and attach the mountable electrical device to the hand-held razor. One advantage of attaching the mountable electrical device to a hand-held razor is that the user can evaluate and compare different razors and select which razor best accommodates their shaving technique.
[00124] In some examples, a mountable electrical device includes a fixture configured to fasten to a precision hand tool, microcontroller 160 attached to the fixture, and wireless communication unit 110 attached to the fixture and electrically connected to microcontroller 160, wherein wireless communication unit 110 is configured to transmit and receive data from microcontroller 160 to external device 505, first memory 161 electrically connected to microcontroller 160, wherein memory 161 is configured to store data from microcontroller 160, and one or more sensors attached to the precision hand tool, wherein the one or more sensors are configured to provide sensory data to microcontroller 160. In some instances, one of the one or more sensors is a proximity sensor. In some instances, one of the one or more sensors is image camera 163 configured to provide frames of images to microcontroller 160.
[00125] In addition, various components of shaving system 100 should not be limited to razors but may be applicable to other applications. For example, the independent mountable electrical device described above may be attached to high-precision hand tools to provide and/or improve real-time information that facilitates specific procedures. Further, the mountable electrical device may be small, lightweight, and wireless to provide untethered freedom of motion for many applications. Various applications that would benefit from a mountable device include electrical tools, automotive tools, carpentry tools, surgical tools, and the like. It should be recognized that the above mountable electrical device may be incorporated into any tool that would benefit from real-time information to facilitate specific procedures.
3. Optical technique for determining blade attrition
[00126] FIG. 10A illustrates unfiltered and filtered images of an unshaven area of skin on a face. In this instance, microcontroller 160 captured the image frame via camera 163 and transmitted (e.g., streamed) the image frame via wireless communication unit 110 to wireless module 555 on external device 505. Wireless module 555 forwards the image frame to image analysis module 560 on external device 505 for further processing. Image analysis module 560 uses processors 575 on external device 505 to filter and analyze the image frame to determine a first quantitative comparison for a hair characteristic. For example, as depicted in FIG. 10A, analysis module 560 calculates three hair characteristics that may be used as a first quantitative comparison for a hair characteristic, specifically, the hair count (e.g., Hair count: 3267), the average length of the hair (e.g., Avg. length: 32.3), and average density (e.g., avg. intensity: 7.91%).
[00127] As depicted in FIG. 10A and FIG. 10B, display 565 on external device 505 is a touch screen that includes selectable software buttons or switches that facilitate selecting between the original frame image (e.g., original), edge-filtered frame image (e.g., mono), edge- filtered image with the color inverted (e.g., color), and an overlay of the original and edge- filtered image with the color inverted (e.g., overlay). In addition, display 565 on external device 505 includes selectable software buttons to select between stream video (e.g., wireless communication unit 110 to wireless module 555) from camera 163 (e.g., camera), frame images before blade 151 is dragged across the skin (e.g., Before), and frame images after blade 151 is dragged across the skin (e.g., After).
[00128] FIG. 10B illustrates unfiltered and filtered images of an area of skin on a face where "dull" blade 151 is dragged once across the surface of the skin. In this instance, microcontroller 160 captured the image frame via camera 163 and transmitted (e.g., streamed) the image frame via wireless communication unit 110 to wireless module 555 on external device 505. Wireless module 555 forwards the image frame to image analysis module 560 on external device 505 for further processing. Image analysis module 560 uses processors 575 on external device 505 to filter and analyze the image frame to determine a second quantitative comparison for a hair characteristic. In this instance, analysis module 560 calculates three hair characteristics that may be used as a second quantitative comparison for a hair characteristic, specifically, the hair count (e.g., Hair count: 2231), the average length of the hair (e.g., Avg. length: 27.4), and average density (e.g., Avg. intensity: 4.59%).
[00129] Comparing the first quantitative comparison of the "before" images of FIG. 10A with the second quantitative comparison of the "after" images of FIG. 10B indicates that a single pass of blade 151 over a region of skin removed about 31% of the hair at the skin and shortened the overall length of the hair by about 15%, which decreased the average density by about 42%. A visual inspection of the "after" images of FIG. 10B confirms that blade 151 did not cut the majority of the hair down to the skin in the shaved region, which is an indication that blade 151 is dulling.
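By way of illustration only, the following Python sketch reproduces this before/after comparison using the values reported for FIG. 10A and FIG. 10B; the dictionary keys and percentage formula are illustrative, not part of the specification.

```python
# Worked sketch: percentage reduction in each hair characteristic between the
# "before" (FIG. 10A) and "after" (FIG. 10B) measurements cited above.

before = {"hair_count": 3267, "avg_length": 32.3, "avg_intensity": 7.91}
after  = {"hair_count": 2231, "avg_length": 27.4, "avg_intensity": 4.59}

def percent_drop(b, a):
    return 100.0 * (b - a) / b

for key in before:
    print(f"{key}: {percent_drop(before[key], after[key]):.1f}% reduction")
# prints roughly 31.7%, 15.2%, and 42.0%, matching the figures cited in the text
```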
[00130] FIG. 11 is a flow diagram illustrating method 1100 for gauging blade attrition (e.g., determining the dullness of blade 151). In some embodiments, method 1100 may be performed at microcontroller 160 as part of shaving system 100. In some embodiments, method 1100 may be performed at external device 505 to conserve power and save on resources on shaving system 100. Some operations in method 1100 may be combined, the order of some operations may be changed, and some operations may be omitted.
[00131] At block 1105, method 1100 may filter, using one or more processors (e.g., processor cores 169, processors 575), a first image of a region of skin with hair. For example, microcontroller 160 may be configured to execute one or more modules or components to filter, using one or more processors (e.g., processor cores 169, processors 575), the first image of a region of skin with hair that was captured using camera 163. In some embodiments, filtering the first image of a region of skin with hair uses an edge detection filter. In some embodiments, the edge detection filter is a Sobel filter or a Canny filter.
[00132] At block 1110, method 1100 may determine, using one or more processors (e.g., processor cores 169, processors 575), a first quantitative comparison for a hair characteristic in a region of skin based on the first filtered image. For example, microcontroller 160 may be configured to execute one or more modules or components to determine, using one or more processors (e.g., processor cores 169, processors 575), a first quantitative comparison for a hair characteristic in a region of skin based on the first filtered image. In some embodiments, the hair characteristic is the quantity of hair. In some embodiments, the hair characteristic is the density of hair. In some embodiments, the hair characteristic is the average length of hair.
[00133] At block 1115, method 1100 may shave the region of skin with blade 151. For example, microcontroller 160 may be configured to execute one or more modules or components to shave the region of skin with blade 151.
[00134] After the region of skin has been shaved, at block 1120, method 1100 may filter, using one or more processors (e.g., processor cores 169, processors 575), a second image of a region of skin with hair. For example, microcontroller 160 may be configured to execute one or more modules or components to filter, using one or more processors (e.g., processor cores 169, processors 575), the second image of the region of skin with hair that was captured using camera 163. In some embodiments, filtering the second image of a region of skin with hair includes using an edge-detection filter. In some embodiments, the edge-detection filter is a Sobel filter or a Canny filter.
[00135] At block 1125, method 1100 may determine, using one or more processors, a second quantitative comparison for the hair characteristic in the region of skin based on the second filtered image. For example, microcontroller 160 may be configured to execute one or more modules or components to determine, using one or more processors (e.g., processor cores 169, processors 575), a second quantitative comparison for the hair characteristic in the region of skin based on the second filtered image.
[00136] In some embodiments, determining the first or second quantitative comparison for the hair characteristic in the region of skin includes differentiating a color variation between adjacent pixels in the captured image.
[00137] In some embodiments, method 1100 may include sending an audio-signal to an electrical audio unit configured to emit sound. The electrical audio unit emits a sound associated with either the blade attrition comparison or the first or second quantitative comparison for the hair characteristic in the region of skin.
[00138] In some embodiments, determining the first or second quantitative comparison for the hair characteristic in the region of skin further includes determining a quantitative boundary indicator that distinguishes a boundary between an established hair growth region and a stubble region to be shaved based on the first or second filtered image.
[00139] At block 1130, method 1100 may determine a quantitative boundary indicator that distinguishes a boundary between an established hair growth region and a stubble region to be shaved based on the first or second filtered image. For example, one or more processors (e.g., processor cores 169, processors 575) may be configured to execute one or more modules or components to determine a quantitative boundary indicator that distinguishes a boundary between an established hair growth region and a stubble region to be shaved based on the first or second filtered image.
[00140] At block 1135, method 1100 may determine a general direction of the remaining hair based on the first or second filtered image. For example, one or more processors (e.g., processor cores 169, processors 575) may be configured to execute one or more modules or components to determine a general direction of the remaining hair based on the first or second filtered image. As depicted in FIG. 9 (iv), in some embodiments, determining the general direction of the remaining hair includes a "least square analysis" or "regression analysis".
[00141] At block 1140, method 1100 may provide for display, a general direction of the remaining hair, wherein the general direction is associated with the best direction to drag the blade over the region of skin. For example, one or more processors (e.g., processor cores 169, processors 575) may be configured to execute one or more modules or components to provide for display, a general direction of the remaining hair, wherein the general direction is associated with the best direction to drag the blade over the region of skin.
[00142] At block 1145, method 1100 may provide for display, a blade attrition comparison based on the difference between the second quantitative comparison and the first quantitative comparison. For example, microcontroller 160 may be configured to execute one or more modules or components to provide for display, a blade attrition comparison based on the difference between the second quantitative comparison and the first quantitative comparison. The blade attrition indicator may comprise a life remaining indicator and/or a dullness indicator for at least one blade.

4. Blade cartridge 150 with curved blades 151
[00143] FIG. 13A and FIG. 13B illustrate a front view and an exploded view, respectively, of a blade cartridge 150 with blades that are slightly curved. The views illustrate blade cartridge 150 that includes a fixture configured to fasten to a razor (e.g., safety razor, disposable razor, cartridge razor) and at least one blade 151 connected to the fixture, wherein the at least one blade 151 is curved. As illustrated, blades 151 of blade cartridge 150 are slightly curved in order to reduce the cutting resistance of the hair during impact with the blade. In some instances, curved (e.g., sickle-like) blades 151 reduce impact resistance along the direction of the motion of blade 151, which results in a more efficient cut that is smoother to the skin. In some embodiments, blade 151 is slightly curved (e.g., sickle-like), lying in a two-dimensional plane at an approximately normal angle to the surface of the skin. In this manner, the plane of the curve may follow the tangent of the skin. In some embodiments, the at least one blade 151 is slightly curved in a sickle-like fashion. In some embodiments, the back edge of blade 151 is convex and the sharp front edge is concave. In some embodiments, blade 151 curves inward along a cutting edge.
[00144] Similar to straight blades in a parallel configuration, an enclosed arrangement of two or more blades 151 adjacent to each other can be applied to distribute the applied force among blades 151 as each blade 151 contacts the skin. In this instance, the at least one blade 151 includes a plurality of blades 151, wherein each of the plurality of blades 151 is parallel to each adjacent blade 151. One advantage of this configuration is that it can help to prevent unintended cutting of the skin when a sideways motion of blade 151 is applied.
[00145] Curved blade 151 may include steel, ceramics (e.g., zirconia, alumina), or a nanolattice. In some embodiments, curved blade 151 is made of carbon steel (e.g., austenitic, martensitic, stainless steel). One advantage of using steel blades 151 is that they are easily shaped and formed using machining techniques.
[00146] In some embodiments, curved blade 151 is made of ceramics. A ceramic blade 151 may be made through a dry-pressing and sintering process, with the edge subsequently sharpened using a diamond grinder. In some instances, the ceramic powder is placed on a rotating drum to first create a full ring, taking into account the inner diameter (id) and outer diameter (od), and is then cut into sub-sections that are the width of blade 151 prior to cooling. One advantage of ceramic over steel for blades 151 is that ceramics are harder than carbon steel, which results in an edge more resilient to dulling.
5. Blade 151 with a nanolattice
[00147] A nanolattice is a truss structure with connecting truss members implemented at a nanoscale. These structures can be made on a length scale spanning multiple orders of magnitude, for instance, from tens of nanometers to hundreds of microns. The nano-sized connecting truss members, in some examples with tube walls of less than 100 nanometers, give rise to properties different from those of denser counterparts. Notably, certain ceramics exhibit a higher hardness than metals but are brittle and tend to chip or fracture under certain loads. In contrast, nanolattices with nano-sized structures comprising single-crystal materials, such as ceramics (e.g., materials having approximately 20 to 60 nanometer wall thickness), do not exhibit elastic instability and have been shown to fully recover at wall thicknesses of approximately 20 nanometers. Nanolattices maintain high strength yet have been found to be remarkably resilient and less brittle. The advantage of forming blade 151 using a nanolattice (e.g., a nanoblade) is that leading edge 1505 of blade 151 would be much less susceptible to dulling.
[00148] To form the nanolattice structure, a micro-scaffold structure may be formed (e.g., fabricated) through a process of two-photon lithography (e.g., a form of microscopic 3D printing) to create the truss structure based on a polymer model. In some instances, this technique includes two laser beams that crosslink and harden a polymer at the point of focus in 3D space. That is, the parts of the polymer exposed to the lasers remain intact while the material that is not exposed dissolves away. In some instances, this technique includes atomic layer deposition (ALD) or sputtering to deposit material (e.g., carbon steel, ceramic) on the truss structure. This technique coats the connecting truss members with a deposited material (e.g., carbon steel, ceramic). In some instances, ALD is based on one or more sequential exposures to a gas that chemically reacts with the surface of the target material (e.g., carbon steel, ceramic) to slowly form a thin film.
[00149] The resultant film coats the polymer and forms a rigid shell. After the coated film forms a rigid shell, one end of the truss structure is cut to expose the internal polymer. The exposed polymer truss is removed using an oxygen (e.g., O2) plasma etch. In this instance, the remaining structure is a nanolattice with hollow connecting truss members. That is, the nanolattices use less material than their dense counterparts. As such, one advantage of nanolattices is that the reduction of material reduces the weight of blade 151 without compromising the strength. In some instances, the nanolattice reduces brittleness (e.g., of alumina or other ceramics).
[00150] FIG. 14A - FIG. 15B illustrate blade 151 (e.g., nanoblade) that includes front leading edge 1505 of blade 151, spine 1530 of blade 151, and a nanolattice that connects the front leading edge of blade 151 to spine 1530 of blade 151. In some embodiments, one or more connecting truss members of the nanolattice has a curvilinear geometric shape. By way of example, a shape of one or more connecting truss members may be a cylinder, an elliptical tube, or a closed-profile elongated tube. In some embodiments, one or more of the tubes of the nanolattice is hollow. Other embodiments may include rectangular tubes, I-beams, C-beams, and the like.
[00151] In some instances, these tubes are conical cylinders or tapered cylinders. As illustrated in FIG. 14A - FIG. 15B, the tubes connecting to leading edge 1505 are tapered to conform to leading edge 1505. In this instance, connecting truss members taper along the edge of blade 151 and diverge (e.g., spread out) from leading edge 1505 of blade 151 toward spine 1530 (e.g., back) of blade 151. In some embodiments, one or more connecting truss members of the nanolattice is a tube that tapers towards leading edge 1505. In some embodiments, one or more connecting truss members tapers toward the leading edge 1505 and forms ribs along the leading edge for reinforcement and reduced friction.
[00152] FIG. 14A - FIG. 14B depict various views of blade 151 with a nanolattice that forms an octet-truss structure. An octet-truss is a lightweight structure that distributes dominant forces among diagonal connecting truss members 1520. This means that portions of dominant compressive forces are converted to tensile forces across the octet-truss. This makes the structure much less susceptible to failure because it can re-balance compressive loads into tensile loads. In some embodiments, blade 151 is made of a metal.
[00153] In some embodiments, blade 151 is made of a ceramic. On the nano-scale, ceramics have been found to be remarkably less brittle and much stronger in tension. This means that hard ceramics such as alumina (e.g., corundum, sapphire) and zirconia may be manufactured into blade 151 with a nanolattice (e.g., nanoblade) having an octet-truss that resists the impacts of cutting hair longer without fracturing or chipping when dropped. In some instances, the ceramic is zirconia or alumina.

[00154] FIG. 15A - FIG. 15B depict various views of blade 151 with a nanolattice that adds connecting truss members to an octet-truss structure. In this instance, the octet-truss structure includes cross members that extend from spine 1530 to the leading edge and cross members that are parallel to the leading edge. In some instances, the cross members are of a tetrahedral shape and are added to the octahedral shape of the octet-truss. As a result, this structure includes more material per unit volume than the octet-truss of FIG. 14A - 14B, which makes it denser and heavier. In addition, the truss is more rigid because smaller portions of the compressive force are rebalanced into tensile forces. This reinforcement may provide better support to the nanolattice structure, further allowing hard ceramics such as alumina (e.g., corundum, sapphire) and zirconia to be manufactured into blade 151 with a nanolattice (e.g., nano-blade) having the structure depicted in FIG. 15A - 15B, which is less resistant to fracturing or chipping when compared to the nanolattice structure depicted in FIG. 14A - 14B.
[00155] Although the techniques have been described in conjunction with particular embodiments, it should be appreciated that various modifications and alterations may be made by those skilled in the art without departing from the spirit and scope of the invention.
Embodiments may be combined and aspects described in connection with an embodiment may stand alone.

Claims

What is claimed is:
1. A shaving system comprising:
a handle;
at least one blade connected to the handle;
a microcontroller attached to the handle; and
one or more sensors adjacent the at least one blade, wherein the one or more sensors are configured to transmit sensory data to the microcontroller, wherein one of the one or more sensors is a proximity sensor.
2. The shaving system of claim 1, wherein
the proximity sensor is an IR sensor.
3. The shaving system of any of claims 1-2, wherein
the proximity sensor is an ultrasonic rangefinder.
4. The shaving system of any of claims 1-3, wherein
the proximity sensor is an accelerometer.
5. The shaving system of any of claims 1-4, wherein
the proximity sensor is configured to detect a compressive force, when the at least one blade is in contact with skin.
6. The shaving system of claim 5, further comprising:
a lever assembly configured to transfer both a normal force and a tangential force at the at least one blade to the compressive force at the proximity sensor, when the at least one blade is in contact with the skin.
7. The shaving system of any of claims 1-6, wherein
the proximity sensor is configured to detect a tensile force, when the at least one blade is in contact with the skin.
8. The shaving system of claim 7, further comprising:
a lever assembly configured to transfer both a normal force and a tangential force at the at least one blade to the tensile force at the proximity sensor, when the at least one blade is in contact with the skin.
9. The shaving system of any of claims 1-8, wherein
the proximity sensor is a mechanical friction sensor.
10. The shaving system of claim 9, wherein
the mechanical friction sensor uses piezoelectric film.
11. The shaving system of any of claims 9-10, wherein
the mechanical friction sensor is attached to the front of a blade cartridge adjacent to blades in a region that contacts the skin.
12. The shaving system of any of claims 9-11, wherein
the mechanical friction sensor is attached between the at least one blade and the handle.
13. The shaving system of any of claims 1-12, wherein
the proximity sensor is configured to detect when the at least one blade contacts the skin.
14. The shaving system of any of claims 1-13, further comprising:
a wireless communication unit attached to the handle and electrically connected to the microcontroller, wherein the wireless communication unit is configured to transmit and receive data or instructions between the microcontroller and an external device.
15. The shaving system of claim 14, wherein
the microcontroller is configured with a timer to measure a period of time that the proximity sensor detects contact between the at least one blade and the skin.
16. The shaving system of claim 15, wherein
the microcontroller is configured to provide data or instructions to the external device to incrementally adjust a threshold value representative of the period of time that the proximity sensor detects contact between the at least one blade and the skin.
17. The shaving system of any of claims 14-16, wherein
the microcontroller is configured to provide data or instructions to the external device to: determine a total number of occurrences detected by the proximity sensor; and display a quantitative comparison between a total number of shaving strokes and a number of shaving strokes expected over a lifetime of the at least one blade.
18. The shaving system of any of claims 14-17, wherein
the microcontroller is configured to provide data or instructions to the external device to determine and display a quantitative comparison that is a dullness indicator of the at least one blade.
19. The shaving system of any of claims 14-18, wherein
the external device is configured to provide access to a server-based or cloud-based user subscription account, wherein the user subscription account is configured to order replacements for the at least one blade based on data or instructions received from the microcontroller.
20. The shaving system of any of claims 14-19, wherein
the external device is a wearable computing device.
21. The shaving system of any of claims 14-19, wherein
the external device is a hand-held phone, tablet, laptop, or desktop.
22. The shaving system of any of claims 12-21, further comprising:
a first memory electrically connected to the microcontroller, wherein the first memory is configured to store data associated with the at least one blade.
23. The shaving system of any of claims 1-22, wherein
the at least one blade is slightly curved such that a plane of the curve follows a tangent of skin.
24. The shaving system of any of claims 1-23, wherein
the at least one blade is slightly curved in a sickle-like fashion.
25. The shaving system of any of claims 1-24, wherein the at least one blade further comprises:
a front leading edge of the at least one blade;
a spine of the at least one blade; and
a nanolattice that connects the front leading edge to the spine.
26. A shaving system comprising:
a handle;
at least one blade connected to the handle;
a microcontroller attached to the handle; and
one or more sensors adjacent the at least one blade, wherein the one or more sensors are configured to send sensory data to the microcontroller, wherein one of the one or more sensors is a camera having an image sensor configured to capture video and/or still images.
27. The shaving system of claim 26, further comprising:
a wireless communication unit attached to the handle and electrically connected to the microcontroller, wherein the wireless communication unit is configured to transmit and receive data from the microcontroller to an external device.
28. The shaving system of claim 27, wherein
the microcontroller is configured to instruct the camera to capture frames of images and instruct the wireless communication unit to transmit the frames to be processed, analyzed, or displayed on the external device.
29. The shaving system of claim 28, wherein
the external device is configured to analyze the frames to determine a blade attrition comparison based on the analyzed frames, and display, on the external device or a display on the handle, the blade attrition comparison.
30. The shaving system of claim 29, wherein
the blade attrition comparison comprises a life remaining indicator of the at least one blade.
31. The shaving system of any of claims 29-30, wherein
the blade attrition comparison comprises a dullness indicator of the at least one blade.
32. The shaving system of any of claims 28-31, wherein
the external device is configured to determine a quantitative comparison for remaining hair based on the captured frames.
33. The shaving system of claim 32, further comprising:
an electrical audio device, wherein at least one of the microcontroller or the external device is configured to provide an audio signal to instruct the electrical audio device to emit a sound corresponding to the quantitative comparison for the remaining hair.
34. The shaving system of claim 33, wherein
the sound is a variable pitched sound or a recorded voice.
35. The shaving system of any of claims 32-34, wherein
the external device is configured to display, on the external device or a display on the handle, the quantitative comparison for the remaining hair.
36. The shaving system of any of claims 28-35, wherein
the external device is configured to determine a general direction of the remaining hair based on the captured frames, and provide for display, on the external device or a display on the handle, a directional indicator representative of a general direction of the remaining hair that corresponds to a best direction to drag the at least one blade over the skin.
37. The shaving system of claim 36, wherein
the directional indicator is displayed, on the external device or the display on the handle, as a compass-like arrow that updates in near real-time.
38. The shaving system of claim 36, wherein
the directional indicator is displayed, on the external device or a display on the handle, and wherein the directional indicator is updated in near real time.
39. The shaving system of any of claims 28-38, wherein
the external device is configured to determine a boundary indicator associated with established hair growth region based on the captured frames and provide the boundary indicator for display on the external device or the display on the handle.
40. The shaving system of claim 39, wherein
the external device is configured to adjust the boundary indicator according to predefined features selected by a user.
41. The shaving system of claim 39, wherein
the external device is configured to overlay the boundary indicator over streamed video frame images, wherein the boundary indicator is displayed as a line.
42. The shaving system of any of claims 28-41, wherein
the external device is configured to detect hair based on the captured frames.
43. The shaving system of any of claims 28-41, wherein
the external device is configured to filter the captured frames using an edge-detection filter.
44. The shaving system of claim 43, wherein
the edge detection filter is a Sobel filter or a Canny filter.
45. The shaving system of any of claims 43-44, wherein
the external device is configured to overlay the streamed video frame image with filtered frames.
46. The shaving system of any of claims 28-45, further comprising:
a memory electrically connected to the microcontroller, wherein the memory is configured to store data associated with the at least one blade.
47. The shaving system of any of claims 28-46, wherein
the external device is configured to implement a least square or regression analysis of remaining hair.
48. A razor cartridge comprising:
a fixture configured to fasten to a razor; and
at least one blade connected to the fixture, wherein the at least one blade is curved.
49. The razor cartridge of claim 48, wherein
the at least one blade curves inward along a cutting edge.
50. The razor cartridge of any of claims 48-49, wherein
the at least one blade includes a plurality of blades, wherein each of the plurality of blades are parallel to each adjacent blade.
51. The razor cartridge of any of claims 48-50, wherein
the razor cartridge is slightly curved.
52. A blade comprising:
a front leading edge of the blade;
a spine of the blade; and
a nanolattice that connects the front leading edge of the blade to the spine of the blade.
53. The blade of claim 52, wherein
the blade is made of a ceramic.
54. The blade of claim 53, wherein
the ceramic is zirconia or alumina.
55. The blade of any of claims 52-54, wherein
the blade is made of a metal.
56. The blade of any of claims 52-55, wherein
one or more connecting members of the nanolattice has a curvilinear geometric shape.
57. The blade of any of claims 52-56, wherein
one or more connecting members of the nanolattice is hollow.
58. The blade of any of claims 52-57, wherein
one or more connecting members of the nanolattice is a tube that tapers towards the leading edge.
59. The blade of claim 58, wherein
one or more connecting members of the nanolattice tapers towards the leading edge and forms ribs along the leading edge.
60. The blade of any of claims 52-59, wherein
the nanolattice comprises an octet-truss structure or structures.
61. The blade of claim 60, wherein
the octet-truss structure has added cross members that extend from the spine to the leading edge and cross members that are parallel to the leading edge.
62. The blade of claim 61, wherein
the cross members are of a tetrahedral shape.
63. The blade of any of claims 60-62, wherein
micro-scaffold structures are fabricated to form the nanolattice.
64. A mountable electrical device comprising:
a fixture configured to fasten to a precision hand tool;
a microcontroller attached to the fixture;
a wireless communication unit attached to the fixture and electrically connected to the microcontroller, wherein the wireless communication unit is configured to transmit and receive data from the microcontroller to an external device;
a memory electrically connected to the microcontroller, wherein the memory is configured to store data from the microcontroller; and
one or more sensors attached to the precision hand tool, wherein the one or more sensors are configured to provide sensory data to the microcontroller.
65. The mountable electrical device of claim 64, wherein
at least one of the one or more sensors is a proximity sensor.
66. The mountable electrical device of any of claims 64-65, wherein
at least one of the one or more sensors is an image camera configured to provide frames of images to the microcontroller.
67. A method for determining blade attrition, comprising:
filtering, using an imaging device, a first image of a region of skin with hair;
determining, using one or more processors, a first quantitative comparison for a hair characteristic in the region of skin based on the first filtered image;
after the region of skin has been shaved, filtering, using one or more processors, a second image of the region of skin;
determining, using one or more processors, a second quantitative comparison for the hair characteristic in the region of skin based on the second filtered image; and
providing for display, a blade attrition comparison based on a difference between the second quantitative comparison and the first quantitative comparison.
68. The method of claim 67, wherein the hair characteristic is a quantity of hair.
69. The method of any of claims 67-68, wherein the hair characteristic is a density of hair.
70. The method of any of claims 67-69, wherein the hair characteristic is an average length of hair.
71. The method of any of claims 67-70, wherein determining the first or second quantitative comparison for the hair in the region of skin includes differentiating a color variation between adjacent pixels in a captured image.
72. The method of any of claims 67-71, further comprising:
sending an audio signal to an electrical audio unit configured to emit sound, wherein the electrical audio unit emits a sound associated with either the blade attrition comparison or the first or second quantitative comparison for the hair characteristic in the region of skin.
73. The method of any of claims 67-72, wherein filtering the first or second image of a region of skin with hair uses an edge-detection filter.
74. The method of claim 73, wherein the edge-detection filter is a Sobel filter or a Canny filter.
75. The method of any of claims 67-74, further comprising:
determining a quantitative boundary indicator that distinguishes a boundary between an established hair growth region and a stubble region to be shaved based on the first or second filtered image.
76. The method of any of claims 67-75, further comprising:
determining a general direction of remaining hair based on the first or second filtered image; and
providing for display, a general direction of the remaining hair, wherein the general direction is associated with the best direction to drag the blade over the region of skin.
77. The method of claim 76, wherein determining the general direction of the remaining hair includes a least-squares analysis or regression analysis.
PCT/US2015/064339 2014-12-10 2015-12-07 Intelligent shaving system having sensors WO2016094327A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462090335P 2014-12-10 2014-12-10
US62/090,335 2014-12-10

Publications (1)

Publication Number Publication Date
WO2016094327A1 true WO2016094327A1 (en) 2016-06-16

Family

ID=56108026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/064339 WO2016094327A1 (en) 2014-12-10 2015-12-07 Intelligent shaving system having sensors

Country Status (3)

Country Link
US (1) US20160167241A1 (en)
TW (1) TW201637802A (en)
WO (1) WO2016094327A1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018005398A1 (en) * 2016-06-28 2018-01-04 The Gillette Company Llc Polymeric cutting edge structures and method of manufacturing polymeric cutting edge structures
WO2018005399A1 (en) * 2016-06-28 2018-01-04 The Gillette Company Llc Cutting edge structures and method of manufacturing polymeric cutting edge structures
US20180354147A1 (en) * 2014-12-10 2018-12-13 Haggai Goldfarb Intelligent shaving system having sensors
WO2019001894A1 (en) * 2017-06-29 2019-01-03 Bic Violex S.A. Shaver and methods for detecting shaving characteristics
CN109562525A (en) * 2016-08-18 2019-04-02 丽努·潘达卡索莱尔·库瑞克苏 Intelligent facial hair carding apparatus
EP3513923A1 (en) * 2018-01-19 2019-07-24 The Gillette Company LLC Method for generating user feedback information from a shave event
EP3513924A1 (en) * 2018-01-19 2019-07-24 The Gillette Company LLC Method for generating user feedback information from a shave event
EP3513927A1 (en) * 2018-01-19 2019-07-24 The Gillette Company LLC Shaving appliance including a notification circuit for communicating cumulative shave event information
WO2019143922A1 (en) * 2018-01-19 2019-07-25 The Gillette Company Llc Personal appliance
EP3546148A1 (en) 2018-03-27 2019-10-02 Braun GmbH Personal care device
WO2019234144A1 (en) * 2018-06-08 2019-12-12 Bic Violex S.A Smart shaving accessory
EP3513925B1 (en) 2018-01-19 2020-09-16 The Gillette Company LLC Networked shaving appliance system
EP3546153B1 (en) 2018-03-27 2021-05-12 Braun GmbH Personal care device
US11027442B2 (en) 2018-03-27 2021-06-08 Braun Gmbh Personal care device
EP3513926B1 (en) 2018-01-19 2021-06-23 The Gillette Company LLC Shaving appliance including a notification circuit for communicating shave stroke direction information
CN113646143A (en) * 2019-03-26 2021-11-12 皇家飞利浦有限公司 Computer-implemented method for providing visual feedback to a user of a rotary shaver, and apparatus and computer program product implementing the method
US11298839B2 (en) 2018-03-27 2022-04-12 Braun Gmbh Hair removal device
US11548177B2 (en) 2018-03-27 2023-01-10 Braun Gmbh Personal care device

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9862107B2 (en) 2013-05-17 2018-01-09 Hybrid Razor Ltd Shaving apparatus
CN105283276B (en) * 2013-05-30 2018-08-07 皇家飞利浦有限公司 Equipment and system for nursing hair and/or skin
EP3107699A2 (en) * 2014-02-18 2016-12-28 Hybrid Razor Ltd Shaving apparatus
US9505140B1 (en) * 2015-06-02 2016-11-29 Irobot Corporation Contact sensors for a mobile robot
CA2989864A1 (en) 2015-06-30 2017-01-05 The Gillette Company Llc Polymeric cutting edge structures and method of manufacturing thereof
US20170129676A1 (en) * 2015-11-05 2017-05-11 The Gillette Company Method for the selection of a shaving product
US20170232624A1 (en) * 2016-02-12 2017-08-17 The King Of Shaves Company Limited Shaving System
US11045963B2 (en) * 2016-06-17 2021-06-29 Thomas Jay LANDWEHR Knife with pressure indication
US10482522B2 (en) 2016-08-08 2019-11-19 The Gillette Company Llc Method for providing a customized product recommendation
US10621647B2 (en) 2016-08-08 2020-04-14 The Gillette Company Llc Method for providing a customized product recommendation
BR112019011037A2 (en) * 2016-12-01 2019-10-15 Koninklijke Philips Nv hair or hair clippers, and method for indicating a haircutting process
US11014254B2 (en) * 2016-12-01 2021-05-25 Koninklijke Philips N.V. Hair cutting apparatus comprising a current detector
JP7132250B2 (en) * 2017-06-29 2022-09-06 ビック・バイオレクス・エス・エー Method and Apparatus for Detecting Hair Characteristics in a Shaving Device
KR102478189B1 (en) 2017-06-29 2022-12-15 빅 비올렉스 싱글 멤버 에스.아. Smart dispenser system and how to use it
WO2019011523A1 (en) * 2017-07-14 2019-01-17 Bic Violex S.A. Apparatuses and methods for measuring skin characteristics and enhancing shaving experiences
WO2019015837A1 (en) * 2017-07-20 2019-01-24 Bic Violex S.A. System and method for sensing debris accumulation in shaving razor cartridge
JP7463629B2 (en) * 2017-07-20 2024-04-09 ビック バイオレクス シングル メンバー エス.エー Shaver handle and how to use it
US10970773B2 (en) * 2017-07-25 2021-04-06 Dollar Shave Club, Inc. Smart cap and/or handle
US10909611B2 (en) 2017-07-25 2021-02-02 Dollar Shave Club, Inc. Smart cap product reordering
CN107718059B * 2017-10-31 2019-11-15 北京小米移动软件有限公司 Control method and device for a hair trimming device, and hair trimming device
US11117276B2 (en) * 2018-01-19 2021-09-14 The Gillette Company Llc Method for generating user feedback information from a shave event
CN111757689B (en) 2018-03-26 2022-10-21 博朗有限公司 Interface for attaching a brush to a skin treatment device
KR20210013023A (en) * 2018-05-21 2021-02-03 빅-비올렉스 에스아 System and method for providing speech recognition orders for replacement shaving cartridges
US11685068B2 (en) 2018-05-21 2023-06-27 BIC Violex Single Member S.A. Smart shaving system with a 3D camera
US11858155B1 (en) * 2018-07-30 2024-01-02 AI Incorporated Electronic razor with suction
WO2020081619A1 (en) * 2018-10-19 2020-04-23 The Gillette Company Llc Grooming device
EP3899974A1 (en) * 2018-12-21 2021-10-27 The Procter & Gamble Company Apparatus and method for operating a personal grooming appliance or household cleaning appliance
CN109940665B (en) * 2019-04-22 2024-08-16 王斌 Intelligent stab-resistant knife
US11219978B2 (en) * 2020-02-26 2022-01-11 Ritesafety Products Int'l, Llc Utility knife with a replacement blade and a system and method for determining the end of life of the blade
WO2021262955A1 (en) * 2020-06-24 2021-12-30 Edgewell Personal Care Brands, Llc Machine learning for a personal care device
US11890764B2 (en) 2020-07-02 2024-02-06 The Gillette Company Llc Digital imaging systems and methods of analyzing pixel data of an image of a user's body for determining a hair density value of a user's hair
US11801610B2 (en) * 2020-07-02 2023-10-31 The Gillette Company Llc Digital imaging systems and methods of analyzing pixel data of an image of a user's body for determining a hair growth direction value of the user's hair
JP2022124522A (en) * 2021-02-16 2022-08-26 パナソニックIpマネジメント株式会社 Electric shaver
US20220324126A1 (en) * 2021-04-07 2022-10-13 The Gillette Company Llc Personal care appliance
US12005596B2 (en) * 2021-04-07 2024-06-11 The Gillette Company Llc Personal care appliance and a method of assembling
EP4094908A1 (en) * 2021-05-28 2022-11-30 BIC Violex Single Member S.A. Shavers and methods

Family Cites Families (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1485093A (en) * 1922-04-20 1924-02-26 Speirs Stuart Safety razor
US1821825A (en) * 1931-01-10 1931-09-01 James M Zumwalt Safety razor
US2108267A (en) * 1935-02-07 1938-02-15 American Safety Razor Corp Blade for safety razors
US2069465A (en) * 1935-08-06 1937-02-02 William G Wallenbeck Safety razor
US2421205A (en) * 1945-06-01 1947-05-27 Errol F Kingsley Razor
US3331129A (en) * 1965-10-23 1967-07-18 John S Goudie Curved edge safety razor having notches to seat the ends of the blade
US3444499A (en) * 1967-01-16 1969-05-13 Endevco Corp Strain gauge
US3597841A (en) * 1969-06-11 1971-08-10 Benjamin Miller Safety razor
US3654701A (en) * 1970-06-29 1972-04-11 Donald M Hastings Sr Safety razor blade
ZA729149B (en) * 1972-02-12 1973-10-31 S Mandilaras Self operated hair cutting machine
US3777396A (en) * 1972-06-01 1973-12-11 Warner Lambert Co Cartridges having tandemly mounted cutting edges on two sides thereof
US4208791A (en) * 1979-02-01 1980-06-24 Cleve Barbara J Van Arcuate razor head
US4459744A (en) * 1982-02-04 1984-07-17 Alan K. Roberts Razor blade apparatus and method
US4605919A (en) * 1982-10-04 1986-08-12 Becton, Dickinson And Company Piezoresistive transducer
US4516320A (en) * 1983-04-28 1985-05-14 Warner-Lambert Company Dynamic razor
GB8405044D0 (en) * 1984-02-27 1984-04-04 Gillette Co Safety razors
US4599793A (en) * 1984-05-23 1986-07-15 American Safety Razor Company Razor connector
US4901437A (en) * 1984-05-25 1990-02-20 American Safety Razor Company Razor head and method of manufacture
US4782590A (en) * 1985-07-15 1988-11-08 Pope H Maie Personal grooming device
US4720917A (en) * 1985-09-13 1988-01-26 Solow Terry S Flexible blade contour razor
US4893641A (en) * 1988-03-23 1990-01-16 Edward Strickland Flexible razor, method of use
US4980974A (en) * 1989-05-11 1991-01-01 Radcliffe Allan F Contoured shaving blades
US4961262A (en) * 1989-06-05 1990-10-09 Lawrence Virginia M Eyebrow shaving apparatus
EP0436693B1 (en) * 1989-07-24 1994-02-16 The Gillette Company Safety razors
US4993154A (en) * 1990-03-01 1991-02-19 Allan Radcliffe Shaving apparatus
US5199173A (en) * 1991-10-17 1993-04-06 Hegemann Research Corporation Concave, convex safety razor
US5208982A (en) * 1992-02-19 1993-05-11 Sferruzza Jr Gerald A Device to shave concave areas
US5240107A (en) * 1992-12-02 1993-08-31 James Casale Razor holder with shave counter
US5979056A (en) * 1995-06-07 1999-11-09 Andrews; Edward A. Body shaving device with curved razor blade strip
US5347715A (en) * 1993-09-14 1994-09-20 Friedland Donald H Blade shave counter
US5604983A (en) * 1994-04-14 1997-02-25 The Gillette Company Razor system
US5704127A (en) * 1996-03-04 1998-01-06 Cordio; Caroline Concave, convex safety razors
US6009623A (en) * 1997-10-02 2000-01-04 Warner-Lambert Company Razor with in situ sensor
US6460251B1 (en) * 1998-03-25 2002-10-08 Pfizer Inc. Razor system with worn blade indicator
US6494882B1 (en) * 2000-07-25 2002-12-17 Verimetra, Inc. Cutting instrument having integrated sensors
US7654003B2 (en) * 2003-02-19 2010-02-02 The Gillette Company Safety razors with charge indicator and power switch
US20060026841A1 (en) * 2004-08-09 2006-02-09 Dirk Freund Razors
US7100283B1 (en) * 2004-10-18 2006-09-05 Greg Grdodian Shaving system
US20060277760A1 (en) * 2005-06-14 2006-12-14 Sangyong Lee Razor blades and assemblies therefor
US20080052911A1 (en) * 2006-09-01 2008-03-06 Eveready Battery Company, Inc. Shaving implement and method for using same
US20080052912A1 (en) * 2006-09-01 2008-03-06 Eveready Battery Company. Inc. Integrated shave counter and base
US7730619B2 (en) * 2006-09-26 2010-06-08 Debra Lynn Ozenick Ergonomically arcuate multi-blade razor
US8317098B2 (en) * 2006-12-01 2012-11-27 Ncr Corporation Item having a data tag
US20080172882A1 (en) * 2007-01-23 2008-07-24 Eger Noah M Shaving device
US8579856B2 (en) * 2007-05-16 2013-11-12 Mystic Pharmaceuticals, Inc. Unit dose drug delivery platform
US20090056141A1 (en) * 2007-08-28 2009-03-05 Eveready Battery Company, Inc. Shaving implement and method for using same
US8230600B2 (en) * 2007-09-17 2012-07-31 The Gillette Company Cartridge detachment sensor
US8122606B2 (en) * 2007-09-17 2012-02-28 The Gillette Company Cartridge life indicator
RU2008150012A (en) * 2008-12-10 2010-06-20 Александр Тарасович Володин (RU) SAFETY RAZOR BLADE BLOCK
KR101055684B1 (en) * 2009-02-11 2011-08-09 주식회사 도루코 Integrated razor blades and razor cartridges using the same
EP2218559B1 (en) * 2009-02-13 2012-08-15 Trisa Holding AG Body care device
US9227331B2 (en) * 2012-11-06 2016-01-05 The Gillette Company Razor blade unit
US20150051845A1 (en) * 2013-08-15 2015-02-19 Kaled Al-Hasan Smart razor handle
US20170305023A9 (en) * 2013-11-27 2017-10-26 Lamar Ball Shaving systems with razor blade usage tracking
WO2016032014A1 (en) * 2014-08-25 2016-03-03 주식회사 도루코 Razor blade and razor cartridge using same
US10298471B2 (en) * 2015-10-05 2019-05-21 The Gillette Company Llc Systems and methods for providing device usage data

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6634104B2 (en) * 2000-10-13 2003-10-21 Sarcos Investments Lc Intelligent shaver
US20040098862A1 (en) * 2002-08-21 2004-05-27 Eveready Battery Company, Inc. Razor system having razor sensors
US20080189953A1 (en) * 2007-02-14 2008-08-14 The Gillette Company Safety razor
US20100186234A1 (en) * 2009-01-28 2010-07-29 Yehuda Binder Electric shaver with imaging capability
US20130067986A1 (en) * 2011-09-19 2013-03-21 Claire Elizabeth Girdler Shaving measurement method and apparatus
US20140137883A1 (en) * 2012-11-21 2014-05-22 Reagan Inventions, Llc Razor including an imaging device

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180354147A1 (en) * 2014-12-10 2018-12-13 Haggai Goldfarb Intelligent shaving system having sensors
US11007659B2 (en) * 2014-12-10 2021-05-18 Haggai Goldfarb Intelligent shaving system having sensors
WO2018005398A1 (en) * 2016-06-28 2018-01-04 The Gillette Company Llc Polymeric cutting edge structures and method of manufacturing polymeric cutting edge structures
WO2018005399A1 (en) * 2016-06-28 2018-01-04 The Gillette Company Llc Cutting edge structures and method of manufacturing polymeric cutting edge structures
JP2019520235A (en) * 2016-06-28 2019-07-18 ザ ジレット カンパニー リミテッド ライアビリティ カンパニーThe Gillette Company Llc Polymer cut edge structure and method of manufacturing polymer cut edge structure
CN109562525A (en) * 2016-08-18 2019-04-02 丽努·潘达卡索莱尔·库瑞克苏 Intelligent facial hair carding apparatus
WO2019001894A1 (en) * 2017-06-29 2019-01-03 Bic Violex S.A. Shaver and methods for detecting shaving characteristics
US11504866B2 (en) 2017-06-29 2022-11-22 BIC Violex Single Member S.A. Shaver and methods for detecting shaving characteristics
EP3513927B1 (en) 2018-01-19 2020-09-09 The Gillette Company LLC Shaving appliance including a notification circuit for communicating cumulative shave event information
EP3513926B1 (en) 2018-01-19 2021-06-23 The Gillette Company LLC Shaving appliance including a notification circuit for communicating shave stroke direction information
EP3513927B2 (en) 2018-01-19 2024-01-10 The Gillette Company LLC Shaving appliance including a notification circuit for communicating cumulative shave event information
EP3513923A1 (en) * 2018-01-19 2019-07-24 The Gillette Company LLC Method for generating user feedback information from a shave event
EP3513927A1 (en) * 2018-01-19 2019-07-24 The Gillette Company LLC Shaving appliance including a notification circuit for communicating cumulative shave event information
EP3513925B1 (en) 2018-01-19 2020-09-16 The Gillette Company LLC Networked shaving appliance system
WO2019143922A1 (en) * 2018-01-19 2019-07-25 The Gillette Company Llc Personal appliance
EP3513924A1 (en) * 2018-01-19 2019-07-24 The Gillette Company LLC Method for generating user feedback information from a shave event
US11027442B2 (en) 2018-03-27 2021-06-08 Braun Gmbh Personal care device
EP3546153B1 (en) 2018-03-27 2021-05-12 Braun GmbH Personal care device
US11097437B2 (en) 2018-03-27 2021-08-24 Braun Gmbh Personal care device
US11298839B2 (en) 2018-03-27 2022-04-12 Braun Gmbh Hair removal device
US11548177B2 (en) 2018-03-27 2023-01-10 Braun Gmbh Personal care device
EP3546148A1 (en) 2018-03-27 2019-10-02 Braun GmbH Personal care device
WO2019234144A1 (en) * 2018-06-08 2019-12-12 Bic Violex S.A Smart shaving accessory
US11529745B2 (en) 2018-06-08 2022-12-20 BIC Violex Single Member S.A. Smart shaving accessory
CN113646143A (en) * 2019-03-26 2021-11-12 皇家飞利浦有限公司 Computer-implemented method for providing visual feedback to a user of a rotary shaver, and apparatus and computer program product implementing the method
CN113646143B (en) * 2019-03-26 2023-09-12 皇家飞利浦有限公司 Computer implemented method for providing visual feedback to a user of a rotary shaver, apparatus and computer program product implementing the method

Also Published As

Publication number Publication date
US20160167241A1 (en) 2016-06-16
TW201637802A (en) 2016-11-01

Similar Documents

Publication Publication Date Title
US11964405B2 (en) Intelligent shaving system having sensors
US20160167241A1 (en) Intelligent shaving system having sensors
EP3546150B1 (en) Personal care device
US20200368926A1 (en) Programmable hair trimming system
CN105899336B (en) System and method for handling body part
JP6885979B2 (en) Personal care equipment and its operation method
CN110662637B (en) Shaver and method for detecting shaving characteristics
EP3546152B1 (en) Hair removal apparatus
RU189106U1 (en) PROGRAMMABLE SYSTEM FOR HAIR CUT
US11548177B2 (en) Personal care device
US10668636B2 (en) Cutter head for automated hair cutting system
US11097437B2 (en) Personal care device
CN103294180B (en) Man-machine interaction control method and electronic terminal
CN105745052B (en) System and method for handling physical feeling
CN105899337A (en) A system and method for treating a part of a body
US9993058B2 (en) Positioning system and methods for use with automated hair cutting systems
CA2955728A1 (en) A razor handle comprising an insert within a hole and razor comprising such a razor handle
US20200085164A1 (en) Apparatus with imaging functionality
JP2022504699A (en) Grooming device
JP2013543437A (en) Mechanical pipette
EP3586733B1 (en) Information processing method, information processing device, and program
JP2018105974A (en) Surgical loupe
KR101733553B1 (en) Apparatus and method for caring eye
EP3169068B1 (en) Portable device that controls photography mode, and control method therefor
EP3544463A1 (en) Apparatus with imaging functionality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15867713

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15867713

Country of ref document: EP

Kind code of ref document: A1