US11147509B2 - Method for customizing a mounted sensing device - Google Patents
Method for customizing a mounted sensing device
- Publication number
- US11147509B2 US11147509B2 US15/557,321 US201615557321A US11147509B2 US 11147509 B2 US11147509 B2 US 11147509B2 US 201615557321 A US201615557321 A US 201615557321A US 11147509 B2 US11147509 B2 US 11147509B2
- Authority
- US
- United States
- Prior art keywords
- wearer
- sensors
- sensing device
- sensor
- parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Links
- 238000000034 method Methods 0.000 title claims abstract description 37
- 238000005259 measurement Methods 0.000 claims description 31
- 210000003128 head Anatomy 0.000 claims description 15
- 210000000707 wrist Anatomy 0.000 claims description 12
- 238000004891 communication Methods 0.000 claims description 10
- 238000004590 computer program Methods 0.000 claims description 4
- 210000000744 eyelid Anatomy 0.000 claims description 3
- 230000010344 pupil dilation Effects 0.000 claims description 2
- 238000012546 transfer Methods 0.000 claims description 2
- 230000015654 memory Effects 0.000 description 9
- 238000005286 illumination Methods 0.000 description 5
- 230000003287 optical effect Effects 0.000 description 4
- 238000012545 processing Methods 0.000 description 4
- 238000004159 blood analysis Methods 0.000 description 2
- 230000036772 blood pressure Effects 0.000 description 2
- 210000001747 pupil Anatomy 0.000 description 2
- 238000012360 testing method Methods 0.000 description 2
- 229910000980 Aluminium gallium arsenide Inorganic materials 0.000 description 1
- 208000029091 Refraction disease Diseases 0.000 description 1
- 230000004430 ametropia Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 230000003203 everyday effect Effects 0.000 description 1
- 238000009532 heart rate measurement Methods 0.000 description 1
- 210000000554 iris Anatomy 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000002496 oximetry Methods 0.000 description 1
- 208000014733 refractive error Diseases 0.000 description 1
- 238000009666 routine test Methods 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 238000004088 simulation Methods 0.000 description 1
- 230000003595 spectral effect Effects 0.000 description 1
- 238000007619 statistical method Methods 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C11/00—Non-optical adjuncts; Attachment thereof
- G02C11/10—Electronic devices other than hearing aids
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/325—Power saving in peripheral device
- G06F1/3265—Power saving in display device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
Definitions
- the invention relates to a method for customizing a mounted sensing device comprising a first set of similar or identical sensors having different location and/or orientation, each sensor being adapted to measure at least one first parameter relating to the wearer of the mounted sensing device.
- the invention further relates to a mounted sensing device.
- More and more mounted devices comprise sensors adapted to measure parameters relating to the wearer or the environment of the wearer.
- Such measured parameters may be used to adapt the mounted device.
- a way of increasing the accuracy of the measurements is to improve the accuracy of the sensors themselves.
- a specificity of having sensors mounted on a mounted sensing device is that the sensing conditions are unstable.
- the mounted sensing device may move relative to the user and the environment may change quickly and be very diverse.
- the overall measurements may not be very accurate, in particular because of the specific measurement conditions linked to the fact that the sensors are carried by a device mounted on a wearer, for example on the head or wrist of the wearer. Therefore, there is a need for a method that improves the accuracy of the measurements provided by such a mounted sensing device.
- One object of the present invention is to provide such a method.
- the invention proposes a method, for example implemented by computer means, for customizing a mounted sensing device comprising a first set of similar or identical sensors having different location and/or orientation, each sensor being adapted to measure at least one first parameter relating to the wearer of the mounted sensing device.
- the method according to the invention comprises a sensor selecting step during which at least one of the sensors is selected to be used to measure the at least one first parameter, wherein the at least one selected sensor is selected among the first set of similar or identical sensors based on wearer data indicative of the morphology of the wearer, and based on their respective location and/or orientation, such that said selected at least one sensor is the most appropriate for the wearer and/or for measuring the at least one first parameter.
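- As a purely illustrative sketch of such a sensor selecting step (not the patented implementation), the following Python snippet scores each sensor of the first set against hypothetical wearer morphology data and keeps the best-placed one; the `Sensor`, `WearerData`, `suitability` and `select_sensor` names are assumptions made for illustration, not terms of the patent:

```python
from dataclasses import dataclass
import math

@dataclass
class Sensor:
    sensor_id: str
    position: tuple[float, float, float]     # location on the frame (mm)
    orientation: tuple[float, float, float]  # unit vector of the sensing axis

@dataclass
class WearerData:
    # Hypothetical morphology data: where the measured feature (e.g. the pupil)
    # sits relative to the frame, derived for example from a fit or a 3D scan.
    target_position: tuple[float, float, float]

def suitability(sensor: Sensor, wearer: WearerData) -> float:
    """Heuristic score: prefer sensors close to and pointing at the target."""
    to_target = tuple(t - p for t, p in zip(wearer.target_position, sensor.position))
    distance = math.sqrt(sum(c * c for c in to_target)) or 1e-9
    direction = tuple(c / distance for c in to_target)
    alignment = sum(o * d for o, d in zip(sensor.orientation, direction))
    return alignment / distance  # well-aligned and nearby sensors score highest

def select_sensor(sensors: list[Sensor], wearer: WearerData) -> Sensor:
    """Sensor selecting step: keep the most appropriate sensor of the first set."""
    return max(sensors, key=lambda s: suitability(s, wearer))
```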
- providing a mounted sensing device with a plurality of similar or identical sensors having different locations and/or orientations, and selecting the most appropriate sensor based on the location and/or orientation of the sensors, increases the accuracy of the measured parameter.
- the inventors have observed that, depending on the parameter to be measured and on the wearer, the position and/or orientation of the sensor may greatly influence the accuracy of the measured parameter.
- the invention proposes to provide a mounted sensing device, for example a head mounted sensing device, with a set of similar sensors and to customize such mounted sensing device by selecting the most appropriate sensor(s).
- the method according to the invention allows adjusting in real time the most appropriate sensor(s) among the set of sensors.
- the invention relates to a mounted sensing device comprising:
- the wearer data may comprise at least morphology data relating to the morphology of the contact areas between the wearer, for example the wearer's head or the wearer's wrist, and the mounted sensing device.
- the mounted sensing device may further comprise a communication unit configured to receive application data indicative of the application of the measured parameter, and wherein the processor is further configured to select the most appropriate sensor based on the application data.
- the mounted sensing device may be a head mounted device.
- the sensors may be integrated into the frame of the head mounted device.
- the mounted sensing device may be a wrist sensing device.
- the invention further relates to a computer program product comprising one or more stored sequences of instructions that are accessible to a processor and which, when executed by the processor, cause the processor to carry out the steps of the method according to the invention.
- the invention also relates to a computer-readable storage medium having a program recorded thereon, where the program makes the computer execute the method of the invention.
- the invention further relates to a device comprising a processor adapted to store one or more sequences of instructions and to carry out at least one of the steps of the method according to the invention.
- Embodiments of the present invention may include apparatuses for performing the operations herein.
- This apparatus may be specially constructed for the desired purposes, or it may comprise a general purpose computer or Digital Signal Processor (“DSP”) selectively activated or reconfigured by a computer program stored in the computer.
- Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions and capable of being coupled to a computer system bus.
- FIG. 1 is a schematic representation of a head mounted device according to an embodiment of the invention.
- FIG. 2 represents a networked data-processing device according to the invention.
- FIG. 3 is a schematic representation of a head mounted device according to a further embodiment of the invention.
- a “mounted sensing device” corresponds to any device arranged to be mounted on a wearer, the device comprising a set of sensors.
- the mounted sensing device is not limited to such a head mounted device and can be a wrist mounted device, a hand mounted device, a foot mounted device or any device comprising sensors and arranged to be mounted on a wearer.
- FIG. 1 represents an example of a head-mounted device 10 comprising a first set of similar sensors 20 , 22 , 24 having different locations and/or orientations.
- the term “similar sensors” refers to sensors which sense the same type of parameter, and more precisely analogous parameters. For instance, a CCD sensor and a CMOS sensor are considered as “similar sensors” since both sensors may acquire images.
- the head mounted device 10 comprises a spectacle frame 12 and the sensors are mounted on the spectacle frame 12 .
- although the method of the invention is not limited to this type of head-mounted device, it is particularly advantageous for head mounted devices comprising a spectacle frame.
- the method according to the invention increases the accuracy of the sensors of the head mounted device, in particular of eye tracking devices arranged in spectacle frames.
- the head mounted device 10 represented on FIG. 1 comprises a spectacle frame 12 with three sensors, for example three cameras 20 , 22 , 24 , directed at the left eye (not shown) of the wearer.
- the cameras 20 , 22 , 24 are arranged to be directed toward the head in order to track the locations of the eyes of the wearer and/or the structures of the eyes of the wearer, for example the pupils, eyelids, irises, glints, and/or other reference points in the region of the eye(s).
- the cameras 20 , 22 , 24 may include charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS), or other photodetectors that include an active area, e.g., including a rectangular or linear or other array of pixels, for capturing images and/or generating video signals representing the images.
- the active area of each of the cameras 20 , 22 , 24 may have any desired shape, e.g., a square or rectangular shape, circular, and the like.
- the surface of the active area of one or more cameras may also be curved, if desired, e.g., to compensate during image acquisition for the nearby three-dimensional curvature of the eye and surrounding structures being imaged.
- the three cameras have different orientations and positions on the spectacle frame.
- the head mounted device 10 may further comprise three illumination sources 30 , 32 , 34 arranged so as to illuminate the left eye of the wearer when wearing the spectacle frame 12 .
- illumination sources 30 , 32 , 34 are fixed to the spectacle frame 12 .
- illumination sources 30 , 32 , 34 may include light-emitting diodes (LEDs), organic LEDs (OLEDs), laser diodes, or other devices that convert electrical energy into photons.
- Each illumination source 30 , 32 , 34 may be used to illuminate the eye to acquire images using any of the cameras 20 , 22 , 24 and/or to produce reference glints for measurement purposes to improve gaze-tracking accuracy.
- each light source 30 , 32 , 34 may be configured for emitting a relatively narrow or wide bandwidth of the light, for example infrared light at one or more wavelengths between about 700-1000 nanometers.
- AlGaAs LEDs provide an emission peak at 850 nm and are widely used and affordable, while commodity CMOS cameras used in mobile phones and webcams show a good sensitivity at this wavelength.
- the head mounted device 10 may further comprise a processing unit 14 arranged to receive the parameter sensed by the sensors 20 , 22 , 24 , for example images collected by the cameras 20 , 22 , 24 .
- the processing unit may be arranged in one of the sides of the spectacle frame.
- the head mounted device further comprises a non-transitory computer-readable medium, at least one processor, and program instructions stored on the non-transitory computer-readable medium and executable by the at least one processor to select at least one of the sensors having the location and/or orientation the most appropriate of the plurality of sensors for the wearer and/or for measuring the at least one parameter.
- the head mounted device communicates with a distant entity.
- the head mounted device may further comprise a communication unit configured to communicate with a distant entity either to store the measured parameter in a memory MEM or to select at least one of the sensors having the location and/or orientation the most appropriate of the plurality of sensors for the wearer and/or for measuring the at least one parameter.
- the distant entity comprises a communication unit COM configured to communicate at least with the head mounted device, a memory MEM, at least one processor PROC and program instructions stored on the non-transitory computer-readable medium and executable by the at least one processor to select at least one of the sensors having the location and/or orientation the most appropriate of the plurality of sensors for the wearer and/or for measuring the at least one parameter.
- the distant entity can include different computing objects such as personal digital assistants, audio/video devices, mobile phones, MPEG-1 Audio Layer 3 (MP3) players, personal computers, laptops, tablets, Bluetooth headsets, watches, wristbands, etc.
- Each computing object and the head mounted device can communicate with one or more others by way of a communication network, either directly or indirectly.
- the network can include other computing objects and computing devices that provide services to the system of FIG. 2 , and/or can represent multiple interconnected networks, which are not shown.
- the computing objects can be Web servers, file servers, media servers, etc. with which the client computing objects or devices communicate via any of a number of known protocols, such as the hypertext transfer protocol (HTTP).
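- By way of example only, a measured parameter could be transferred to such a distant entity over HTTP as in the following sketch; the endpoint URL and the JSON payload layout are assumptions made for illustration and are not defined by the patent:

```python
import json
import urllib.request

# Hypothetical endpoint of the distant entity; the patent only states that
# communication may use a known protocol such as HTTP, not a particular API.
DISTANT_ENTITY_URL = "http://example.com/api/measurements"

def upload_measurement(sensor_id: str, parameter: str, value: float) -> int:
    """Send one measured parameter to the distant entity for storage in its memory MEM."""
    payload = json.dumps({"sensor": sensor_id, "parameter": parameter, "value": value}).encode()
    request = urllib.request.Request(
        DISTANT_ENTITY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # e.g. 200 when the measurement was stored
```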
- the head mounted device may further comprise a power source, for example a battery and/or other electronics.
- the power source and/or other electronics may be arranged in the side of the spectacle frame opposite to the one containing the processing unit 14 .
- with such a head mounted device comprising sensors included in a spectacle frame, the wearer may use the head mounted device over long periods without being hindered, for example on an everyday basis.
- the head mounted device may very well also comprise sensors directed at the right eye and/or arranged on the right side of the spectacle frame.
- having sensors on both sides of the spectacle frame allows providing accurate information, for example on the gazing direction and distance of the wearer.
- the sensors of the head mounted device according to the invention may comprise other sensors arranged to measure the gazing direction of the wearer.
- the sensors may be configured to measure the contact area or the distance between the head mounted device and the wearer's head, for example the sensors may be capacitive sensors.
- the frame of the head mounted device may comprise a plurality of capacitive sensors at different positions and orientations.
- the method according to the invention may be used to select the sensors that are to be active based for example on the position of the head mounted device relative to the head of the wearer.
- the method of the invention allows reducing the battery consumption of the head mounted device since only the most appropriate sensors are activated.
- the sensors may be configured to measure biological and/or physiological parameters of the wearer, for example the sensors are resistive sensors.
- the sensors are configured to sense parameters relating to the ametropia of the wearer and/or features of the eyes of the wearer such as pupil diameter, and/or optical blood pressure or blood analysis, for example heart beat or oximetry measurement.
- the sensors are configured to measure a parameter relating to the environment of the wearer of the head mounted device.
- the sensors are configured to measure spectral features and intensity of the light received by the wearer.
- At least one, for example a plurality, of the sensors is selected to measure the at least one first parameter.
- the at least one selected sensor is selected among the first set of similar or identical sensors based on their respective location and/or orientation such that said selected at least one sensor is the most appropriate for the wearer and/or for measuring the at least one first parameter.
- At least one sensor among the first set of similar or identical sensors is not selected as being the most appropriate sensor for the wearer and/or for measuring the at least one first parameter. In other words, not all the sensors of the first set of similar or identical sensors may be selected during the sensor selecting step.
- the most appropriate sensor of the first set of similar or identical sensors for the wearer and/or for measuring the at least one first parameter may be determined with tests.
- the most appropriate sensor for the wearer and/or for measuring a first parameter may be determined by comparison of the measurement of each sensor of the first set of similar or identical sensors with a predetermined value.
- a predetermined value may be a standard value of the first parameter.
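- As a minimal sketch of this comparison with a predetermined value (illustrative only, with hypothetical sensor identifiers), the sensor whose measurement lies closest to the standard value could be retained as follows:

```python
def select_by_reference(measurements: dict[str, float], reference: float) -> str:
    """Return the sensor whose measurement is closest to the predetermined standard value."""
    return min(measurements, key=lambda sensor_id: abs(measurements[sensor_id] - reference))

# e.g. select_by_reference({"cam_nasal": 0.92, "cam_temporal": 0.71}, reference=1.0) -> "cam_nasal"
```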
- a mounted sensing device comprises a plurality of LEDs having different location and/or orientation.
- the LEDs are successively turned on for determining which LED is the most appropriate for the wearer and/or for measuring the reflection of the light on the eye of the wearer.
- the most appropriate LEDs may be determined by comparison of the measurement of each LED with a standard value of the reflection of the light on the eye of the wearer.
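- The successive turn-on of the LEDs could be sketched as follows; `led.turn_on()`, `led.turn_off()` and `measure_reflection()` are hypothetical interfaces used only to illustrate the comparison with a standard reflection value:

```python
def select_led(leds, measure_reflection, standard_reflection: float):
    """Turn each LED on in turn and keep the one whose measured reflection on the
    eye of the wearer is closest to the expected standard value."""
    best_led, best_error = None, float("inf")
    for led in leds:
        led.turn_on()
        error = abs(measure_reflection() - standard_reflection)
        led.turn_off()
        if error < best_error:
            best_led, best_error = led, error
    return best_led
```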
- the most appropriate sensor may be determined based on the signal-to-noise ratio. More precisely, each sensor of the set of similar sensors may provide a measurement with a given signal-to-noise ratio. The most appropriate sensor among the set of similar sensors may be selected as being the sensor giving the best signal-to-noise ratio.
- the most appropriate sensor may be determined based on a signal after reflection. More precisely, the sensors of the set of similar sensors may each comprise a pair of an emitter and a receptor, and several possible emitter-receptor combinations may be tested. The most appropriate sensor among the set of similar sensors may be selected as being the sensor giving the highest signal after reflection.
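- Both selection criteria can be sketched in a few lines; the `measure_signal` callable standing for a reflection measurement obtained with a given emitter-receptor pair is an assumption made for illustration:

```python
import itertools

def select_best_snr(snr_by_sensor: dict[str, float]) -> str:
    """Keep the sensor of the set giving the best signal-to-noise ratio."""
    return max(snr_by_sensor, key=snr_by_sensor.get)

def select_emitter_receptor(emitters, receptors, measure_signal):
    """Test every emitter/receptor combination and keep the pair returning the
    highest signal after reflection."""
    return max(itertools.product(emitters, receptors),
               key=lambda pair: measure_signal(pair[0], pair[1]))
```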
- the most appropriate sensor for the wearer and/or for measuring a first parameter may also be determined by comparison of the measurement of each sensor of the first set of similar or identical sensors with a range of values. Indeed, since the mounted sensing device may move relative to the wearer, only a few sensors of the first set of similar or identical sensors may be well placed for measuring the first parameter.
- a plurality of similar or identical pulse sensors may have different locations around the wrist.
- the most appropriate pulse sensor for measuring the pulse may be determined by comparison of the pulse measurement of each pulse sensor with a range of pulse values.
- the range of pulse values may be, for example, between 50 and 80 pulses per minute. Indeed, since the wristwatch may move around the wrist of the wearer, only a few pulse sensors among the first set of similar or identical pulse sensors may be well placed to provide a measurement of the pulse of the wearer.
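- A minimal sketch of this range-based filtering for the wrist example (the sensor identifiers and the exact bounds are illustrative assumptions):

```python
PULSE_RANGE = (50.0, 80.0)  # plausible resting pulse, in beats per minute

def well_placed_pulse_sensors(readings: dict[str, float],
                              valid_range: tuple[float, float] = PULSE_RANGE) -> list[str]:
    """Keep only the pulse sensors whose reading falls inside the plausible range,
    i.e. the sensors that are well placed around the wrist."""
    low, high = valid_range
    return [sensor_id for sensor_id, bpm in readings.items() if low <= bpm <= high]

# e.g. well_placed_pulse_sensors({"s1": 4.0, "s2": 63.0, "s3": 210.0}) -> ["s2"]
```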
- the at least one selected sensor may be selected further based on application data indicative of the application of the measured parameter. Indeed, a given parameter may be used for different applications that may require using a different sensor among the first set of sensors.
- for example, depending on the application, it may be more relevant to use an eye tracker close to the center of the head mounted device or to use a more off-centered eye tracker.
- the at least one sensor is selected by measuring the at least one parameter with each of the plurality of sensors and by selecting the sensors providing the most accurate measurement.
- the measurements of the at least one parameter with each of the plurality of sensors may be repeated, at least once, in order to determine the sensors providing the most accurate measurement.
- the at least one sensor providing the most accurate measurement may be the sensor with the lowest difference between the two measurements.
- the at least one sensor providing the most accurate measurement may be the sensor with the lowest standard deviation, the first measurement being selected as the reference for calculating the standard deviation.
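- The repeatability criterion can be sketched as below, assuming each sensor has produced a list of repeated readings; with exactly two readings per sensor this reduces to the lowest absolute difference between the two measurements:

```python
import math

def select_most_repeatable(repeated: dict[str, list[float]]) -> str:
    """Keep the sensor whose repeated measurements deviate the least from its
    first measurement, used here as the reference value."""
    def spread(values: list[float]) -> float:
        reference = values[0]
        return math.sqrt(sum((v - reference) ** 2 for v in values) / len(values))
    return min(repeated, key=lambda sensor_id: spread(repeated[sensor_id]))
```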
- the at least one sensor is selected based on wearer data indicative of the wearer, for example data indicative of the morphology of the wearer.
- the wearer data comprise at least morphology data relating to the morphology of the contact areas of the wearer's head and the head mounted device.
- the selection step may be carried out as part of an initialization process when the wearer tries on a new head mounted device.
- a head mounted device comprising a plurality of similar or identical sensors having different location and/or orientation, each sensor being adapted to measure at least one first parameter relating to the wearer of the head mounted device, is provided to the wearer.
- At least one of the sensors is selected to be used to measure a parameter of the wearer based for example on the morphology of the wearer.
- a routine test may be carried out to determine the most appropriate, for example the most accurate, sensor when the head mounted device is worn by the wearer.
- the most appropriate sensor among the set of similar sensors may be determined based on direct tests carried out by the similar sensors, or based on a simulation of the measurement with a three-dimensional scanned value of the morphology of the wearer. In other words, the most appropriate sensor may be selected based on direct measurements or on simulated measurements relative to the morphology of the wearer.
- the selection of the at least one sensor may be carried out regularly, for example each time the wearer starts the sensors or each time the wearer puts the head mounted device on.
- the selection of the at least one sensor may be based on position data indicative of the position of the head mounted device relative to the wearer, for example the head of the wearer.
- when the position of the head mounted device relative to the wearer changes, the most appropriate sensor, for example the most accurate sensor, may change, and the method of the invention may be implemented to determine the most appropriate sensor based on the new position of the head mounted device.
- the position of the head mounted device relative to the wearer may also be used to select the most appropriate sensor, for example the most accurate, during an initializing process.
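- For illustration, and reusing the select_sensor sketch given earlier, the re-selection triggered by position data could look like the following; the millimetre threshold is an arbitrary assumption:

```python
def reselect_if_moved(current_selection, sensors, wearer,
                      displacement_mm: float, threshold_mm: float = 2.0):
    """Keep the current selection while the device stays roughly in place; re-run
    the sensor selecting step when position data show it has shifted on the wearer."""
    if current_selection is None or displacement_mm > threshold_mm:
        return select_sensor(sensors, wearer)  # sketch defined earlier
    return current_selection
```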
- At least one sensor is selected among the first set of similar or identical sensors based on their respective location and/or orientation.
- a plurality of sensors may be selected.
- a specific spatial distribution of the selected sensors may be determined based on the wearer data and/or application data and/or the specific parameter to be measured.
- the method may further comprise a distribution frequency determining step during which a distribution of frequencies of the measurements implemented by each sensor of the plurality of selected sensors is determined based at least on the at least one parameter intended to be measured.
- for example, when two sensors are selected, one of the two sensors may need to be activated at a lower frequency than the other.
- Determining the most appropriate distribution of frequencies of the measurements implemented by each sensor of the plurality of selected sensors provides a good compromise between the accuracy of the measurements and the energy required to activate the sensors.
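- A simple way to picture such a distribution of measurement frequencies is a time-ordered schedule in which each selected sensor fires at its own rate; the sensor names and frequencies below are illustrative assumptions:

```python
def measurement_schedule(frequencies_hz: dict[str, float], duration_s: float) -> list[tuple[float, str]]:
    """Build a time-ordered measurement schedule where each selected sensor is
    triggered at its own frequency, trading measurement accuracy against the
    energy needed to activate the sensors."""
    schedule = []
    for sensor_id, hz in frequencies_hz.items():
        period = 1.0 / hz
        t = 0.0
        while t < duration_s:
            schedule.append((t, sensor_id))
            t += period
    return sorted(schedule)

# e.g. measurement_schedule({"eye_cam": 30.0, "contact": 1.0}, duration_s=1.0)
```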
- the head mounted device may further comprise a second set of similar sensors having different location and/or orientation and being adapted to measure at least one second parameter relating to the wearer of the head mounted device, the second parameter being different than the first parameter.
- At least one sensor from the second set of sensors is selected so that the location and/or orientation of the selected at least one sensor is the most appropriate of the second set of sensors for the wearer and/or for measuring the at least one second parameter.
- the head mounted device may comprise a virtual image display device 50 , preferably allowing the wearer to see both the virtual image and the real world through it.
- the virtual image display device is able to display graphical images, and an electronic driving system (memory + processor) sends the image to be displayed to the virtual image display device. Preferably, it is able to display images in different viewing directions.
- the mounted sensing device is not limited to a head mounted device.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Life Sciences & Earth Sciences (AREA)
- Optics & Photonics (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Computer Hardware Design (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Ophthalmology & Optometry (AREA)
- Otolaryngology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Acoustics & Sound (AREA)
- Neurology (AREA)
- Neurosurgery (AREA)
- Dermatology (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
-
- the mounted sensing device is a head mounted device and/or a wrist mounted device; and/or
- during the selecting step the at least one selected sensor is selected based on application data indicative of the application of the measured parameter; and/or
- during the selecting step the at least one sensor is selected by measuring the at least one parameter with each of the plurality of sensors and by selecting the sensors providing the most accurate measurement; and/or
- the wearer data comprise at least morphology data relating to the morphology of the contact areas of the wearer, for example of the contact area of the wearer's head or of the wearer's wrist, and the mounted sensing device; and/or
- during the selecting step the at least one sensor is selected based on position data indicative of the position of the mounted sensing device relative to the wearer, for example the head of the wearer; and/or
- during the selecting step a specific spatial distribution of sensors is selected; and/or
- during the selecting step a plurality of sensors are selected and wherein the method further comprises a distribution frequency determining step during which a distribution of frequencies of the measurements implemented by each sensor of the plurality of selected sensors is determined based at least on the at least one parameter intended to be measured; and/or
- the mounted sensing device further comprises a second set of similar sensors having different location and/or orientation and being adapted to measure at least one second parameter relating to the wearer of the mounted sensing device, the second parameter being different than the first parameter, and during the selecting step at least one sensor from the second set of sensors is selected so that the location and/or orientation of the selected at least one sensor is the most appropriate of the second set of sensors for the wearer and/or for measuring the at least one second parameter; and/or
- the sensors comprise sensors, for example eye tracking sensors, arranged to measure the gazing direction of the wearer and/or the eyelid beat of the wearer and/or the pupil dilation of the wearer; and/or
- the sensors comprise sensors, for example capacitive sensors, configured to measure the contact area or the distance between the mounted sensing device and the wearer, for example the wearer's head; and/or
- the sensors comprise sensors, for example resistive sensors, configured to measure biological and/or physiological parameters of the wearer; and/or
- the sensors comprise an optical blood pressure or blood analysis sensor, for example a heart beat or oximetry measurement sensor, and/or a pressure sensor.
-
- a plurality of similar or identical sensors having different location and/or orientation, each sensor being adapted to measure at least one parameter relating to the wearer of the mounted sensing device,
- a non-transitory computer-readable medium, and
- program instructions stored on the non-transitory computer-readable medium and executable by at least one processor to select at least one of the sensors having the location and/or orientation the most appropriate of the plurality of sensors for the wearer and/or for measuring the at least one parameter based on wearer data indicative of the morphology of the wearer.
Claims (14)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15305377 | 2015-03-12 | ||
EP15305377 | 2015-03-12 | ||
EP15305377.2 | 2015-03-12 | ||
PCT/EP2016/055021 WO2016142423A1 (en) | 2015-03-12 | 2016-03-09 | A method for customizing a mounted sensing device |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180049697A1 US20180049697A1 (en) | 2018-02-22 |
US11147509B2 true US11147509B2 (en) | 2021-10-19 |
Family
ID=52807754
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/557,321 Active 2036-11-24 US11147509B2 (en) | 2015-03-12 | 2016-03-09 | Method for customizing a mounted sensing device |
Country Status (4)
Country | Link |
---|---|
US (1) | US11147509B2 (en) |
EP (1) | EP3268799A1 (en) |
CN (1) | CN107430273B (en) |
WO (1) | WO2016142423A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3919969A1 (en) * | 2016-09-22 | 2021-12-08 | Essilor International | Health monitoring device and wearing detection module for spectacles frame |
KR20190013224A (en) * | 2017-08-01 | 2019-02-11 | 엘지전자 주식회사 | Mobile terminal |
CN115886728A (en) * | 2022-10-24 | 2023-04-04 | 珠海格力电器股份有限公司 | Sleep detection method and system, electronic device, storage medium and intelligent bed |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6198394B1 (en) * | 1996-12-05 | 2001-03-06 | Stephen C. Jacobsen | System for remote monitoring of personnel |
US20090326406A1 (en) * | 2008-06-26 | 2009-12-31 | Microsoft Corporation | Wearable electromyography-based controllers for human-computer interface |
US20100004977A1 (en) * | 2006-09-05 | 2010-01-07 | Innerscope Research Llc | Method and System For Measuring User Experience For Interactive Activities |
US20100110368A1 (en) | 2008-11-02 | 2010-05-06 | David Chaum | System and apparatus for eyeglass appliance platform |
US20110254760A1 (en) * | 2010-04-20 | 2011-10-20 | Invensense, Inc. | Wireless Motion Processing Sensor Systems Suitable for Mobile and Battery Operation |
US20140118243A1 (en) | 2012-10-25 | 2014-05-01 | University Of Seoul Industry Cooperation Foundation | Display section determination |
US20140145914A1 (en) | 2012-11-29 | 2014-05-29 | Stephen Latta | Head-mounted display resource management |
US20140160424A1 (en) | 2012-12-06 | 2014-06-12 | Microsoft Corporation | Multi-touch interactions on eyewear |
US8862715B1 (en) | 2011-04-06 | 2014-10-14 | Google Inc. | Context-based sensor selection |
US20150029088A1 (en) | 2013-07-25 | 2015-01-29 | Lg Electronics Inc. | Head mounted display and method of controlling therefor |
US20150062022A1 (en) | 2013-09-04 | 2015-03-05 | Qualcomm Incorporated | Wearable display device use-based data processing control |
US20150282768A1 (en) * | 2012-09-29 | 2015-10-08 | Aliphcom | Physiological signal determination of bioimpedance signals |
US20160162022A1 (en) * | 2014-12-08 | 2016-06-09 | Rohit Seth | Wearable wireless hmi device |
US20170112393A1 (en) * | 2014-06-17 | 2017-04-27 | Kyocera Corporation | Measurement apparatus and measurement method |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7787969B2 (en) * | 2007-06-15 | 2010-08-31 | Caterpillar Inc | Virtual sensor system and method |
US20090300525A1 (en) * | 2008-05-27 | 2009-12-03 | Jolliff Maria Elena Romera | Method and system for automatically updating avatar to indicate user's status |
US9128522B2 (en) * | 2012-04-02 | 2015-09-08 | Google Inc. | Wink gesture input for a head-mountable device |
US9219901B2 (en) * | 2012-06-19 | 2015-12-22 | Qualcomm Incorporated | Reactive user interface for head-mounted display |
-
2016
- 2016-03-09 US US15/557,321 patent/US11147509B2/en active Active
- 2016-03-09 WO PCT/EP2016/055021 patent/WO2016142423A1/en active Application Filing
- 2016-03-09 CN CN201680015244.4A patent/CN107430273B/en active Active
- 2016-03-09 EP EP16709038.0A patent/EP3268799A1/en active Pending
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6198394B1 (en) * | 1996-12-05 | 2001-03-06 | Stephen C. Jacobsen | System for remote monitoring of personnel |
US20100004977A1 (en) * | 2006-09-05 | 2010-01-07 | Innerscope Research Llc | Method and System For Measuring User Experience For Interactive Activities |
US20090326406A1 (en) * | 2008-06-26 | 2009-12-31 | Microsoft Corporation | Wearable electromyography-based controllers for human-computer interface |
US20100110368A1 (en) | 2008-11-02 | 2010-05-06 | David Chaum | System and apparatus for eyeglass appliance platform |
US20110254760A1 (en) * | 2010-04-20 | 2011-10-20 | Invensense, Inc. | Wireless Motion Processing Sensor Systems Suitable for Mobile and Battery Operation |
US8862715B1 (en) | 2011-04-06 | 2014-10-14 | Google Inc. | Context-based sensor selection |
US20150282768A1 (en) * | 2012-09-29 | 2015-10-08 | Aliphcom | Physiological signal determination of bioimpedance signals |
US20140118243A1 (en) | 2012-10-25 | 2014-05-01 | University Of Seoul Industry Cooperation Foundation | Display section determination |
US20140145914A1 (en) | 2012-11-29 | 2014-05-29 | Stephen Latta | Head-mounted display resource management |
US20140160424A1 (en) | 2012-12-06 | 2014-06-12 | Microsoft Corporation | Multi-touch interactions on eyewear |
US20150029088A1 (en) | 2013-07-25 | 2015-01-29 | Lg Electronics Inc. | Head mounted display and method of controlling therefor |
US20150062022A1 (en) | 2013-09-04 | 2015-03-05 | Qualcomm Incorporated | Wearable display device use-based data processing control |
US20160025975A1 (en) | 2013-09-04 | 2016-01-28 | Qualcomm Incorporated | Wearable display device use-based data processing control |
US20170112393A1 (en) * | 2014-06-17 | 2017-04-27 | Kyocera Corporation | Measurement apparatus and measurement method |
US20160162022A1 (en) * | 2014-12-08 | 2016-06-09 | Rohit Seth | Wearable wireless hmi device |
Non-Patent Citations (1)
Title |
---|
International Search Report dated Apr. 20, 2016 in PCT/EP2016/055021, filed on Mar. 9, 2016. |
Also Published As
Publication number | Publication date |
---|---|
US20180049697A1 (en) | 2018-02-22 |
WO2016142423A1 (en) | 2016-09-15 |
EP3268799A1 (en) | 2018-01-17 |
CN107430273A (en) | 2017-12-01 |
CN107430273B (en) | 2022-11-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6650974B2 (en) | Electronic device and method for acquiring biological information | |
US10307053B2 (en) | Method for calibrating a head-mounted eye tracking device | |
CN106413529B (en) | Optical pressure sensor | |
US20160070122A1 (en) | Computerized replacement temple for standard eyewear | |
US20190101984A1 (en) | Heartrate monitor for ar wearables | |
JP7106569B2 (en) | A system that evaluates the user's health | |
Topal et al. | A low-computational approach on gaze estimation with eye touch system | |
US10909164B2 (en) | Method for updating an index of a person | |
US11147509B2 (en) | Method for customizing a mounted sensing device | |
CN108968972B (en) | Flexible fatigue detection device and information processing method and device | |
US10163014B2 (en) | Method for monitoring the visual behavior of a person | |
US20220293241A1 (en) | Systems and methods for signaling cognitive-state transitions | |
US10624570B2 (en) | User fatigue level analysis component | |
US11406330B1 (en) | System to optically determine blood pressure | |
Andrushevich et al. | Open smart glasses development platform for AAL applications | |
US20230119697A1 (en) | Method of evaluating quality of bio-signal and apparatus for estimating bio-information | |
US20180303431A1 (en) | User migraine analysis component | |
Spinsante et al. | The role of mobile apps in heart rate measurement with consumer devices | |
KR20200094344A (en) | Method for calculating recovery index based on rem sleep stage and electonic device therof | |
KR20230032697A (en) | Electronic device and method for detecting tremor of user in the electronic device | |
EP4305511A1 (en) | Systems and methods for signaling cognitive-state transitions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: ESSILOR INTERNATIONAL (COMPAGNIE GENERALE D'OPTIQUE), FRANCE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROUSSEAU, DENIS;LORE, MARIE;SIGNING DATES FROM 20171031 TO 20171208;REEL/FRAME:044743/0867 Owner name: ESSILOR INTERNATIONAL (COMPAGNIE GENERALE D'OPTIQU Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROUSSEAU, DENIS;LORE, MARIE;SIGNING DATES FROM 20171031 TO 20171208;REEL/FRAME:044743/0867 |
|
AS | Assignment |
Owner name: ESSILOR INTERNATIONAL, FRANCE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ESSILOR INTERNATIONAL (COMPAGNIE GENERALE D'OPTIQUE);REEL/FRAME:045853/0275 Effective date: 20171101 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |