WO2018000257A1 - Method and device for disambiguating which hand user involves in handling electronic device - Google Patents

Method and device for disambiguating which hand user involves in handling electronic device Download PDF

Info

Publication number
WO2018000257A1
WO2018000257A1 · PCT/CN2016/087735
Authority
WO
WIPO (PCT)
Prior art keywords: inferring, touch, accelerometer, disambiguating, user
Prior art date
Application number
PCT/CN2016/087735
Other languages
French (fr)
Inventor
Nan Ye
Jie Wan
Zhihong Guo
Original Assignee
Orange
Priority date
Filing date
Publication date
Application filed by Orange filed Critical Orange
Priority to PCT/CN2016/087735 priority Critical patent/WO2018000257A1/en
Publication of WO2018000257A1 publication Critical patent/WO2018000257A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer



Abstract

This method for disambiguating which hand a user involves in handling an electronic device (10), the electronic device comprising a touch interface (16) and an accelerometer (GYR), comprises the steps of: capturing (E20) the accelerometer output for touch inputs received at a discrete number of predefined distinct zones (Zi) of the touch interface; and inferring (E40) which hand was used for the user touch inputs based on the captured accelerometer outputs.

Description

[Title established by the ISA under Rule 37.2] METHOD AND DEVICE FOR DISAMBIGUATING WHICH HAND USER INVOLVES IN HANDLING ELECTRONIC DEVICE

Background of the invention
The invention relates to a method for disambiguating which hand a user involves in handling an electronic device, in particular a mobile phone.
As a preliminary remark, it has been determined that simply asking whether a user is left-handed or right-handed is not appropriate, because some right-handed users handle an electronic device with their left hand, and vice versa.
Document EP2440986 discloses a device including a sensor for detecting a finger, a camera for capturing an image of the finger and a processor for determining whether the detected finger belongs to a right hand or a left hand based on the image.
This method has a disadvantage of requiring a camera and heavy processing.
Object and summary of the invention
According to a first aspect, the invention concerns a method for disambiguating which hand a user involves in handling an electronic device, the electronic device comprising a touch interface and an accelerometer, the method comprising the steps of:
- capturing the accelerometer output for touch inputs received at a discrete number of predefined distinct zones of the touch interface;
- inferring which hand was used for the user touch inputs based on the captured accelerometer outputs.
Correspondingly, according to a second aspect, the invention concerns an electronic device comprising a touch interface and an accelerometer, the device comprising:
- a control unit for capturing and storing in a memory the accelerometer output for touch inputs received at a discrete number of predefined distinct zones of the touch interface;
- an inferring unit for inferring which hand was used for the user touch inputs based on the captured accelerometer outputs.
The invention therefore advantageously proposes to use the accelerometer of a device to determine whether a user handles the device with the right hand or with the left hand, the inventors having in particular measured that left-handed users and right-handed users rotate a device differently when touching specific zones of a touch screen.
In one embodiment of the invention, the predefined zones are at locations that the user naturally reaches using a thumb. For example, using zones at a right corner and at a left corner of the screen makes it possible to discriminate efficiently between left-handed and right-handed users, because:
- a left-handed user naturally rotates the phone in a first direction to touch the right (top or bottom) corner with his thumb, while
- a right-handed user naturally rotates the phone in the opposite direction to touch the left (top or bottom) corner with his thumb.
In one embodiment, the predefined distinct zones may include distinct quadrants of the touch interface.
In one embodiment, the method comprises a preliminary step of displaying on the touch interface a graphical element for each distinct zone. These graphical elements may for example be displayed during a dedicated test for disambiguating which hand the user involves in handling the device, for example, when the user uses the device for the first time.
In one embodiment, the graphical elements may be displayed in a specific order, for example in clockwise order, the inferring being carried out once the touch inputs have been received on all graphical elements.
The graphical elements may for example be displayed left-top, right-top, right-bottom and left-bottom.
In one embodiment, the inferring step comprises:
- determining an average value of the detected rotations of the device for touch inputs received in at least one of said zones;
- comparing the average value with a threshold. The threshold may be defined according to the shape of the device.
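As a minimal sketch, the inferring step described above — averaging the detected rotation values and comparing the average with a threshold — could look as follows. The function name and the symmetric use of the threshold for both hands are illustrative assumptions, not part of the claimed method:

```python
def infer_hand(rotations, threshold):
    """Hypothetical sketch of the inferring step: average the rotation
    values detected for touches in the predefined zones and compare the
    average with a threshold (symmetric +/- convention assumed)."""
    avg = sum(rotations) / len(rotations)
    if avg > threshold:
        return "right"       # positive average rotation: right hand inferred
    if avg < -threshold:
        return "left"        # negative average rotation: left hand inferred
    return "undetermined"    # average within the threshold band: no inference
```

As the text notes, the threshold value itself would be chosen according to the shape of the device.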
In another embodiment of the invention, the method is not carried out during a specific test. The device may monitor touch input locations permanently, or through interactions with determined applications, and execute the inferring step once the user has interacted with the predefined zones.
In one embodiment, the method comprises a step of configuring an application according to the result of the inferring step. This application may be an application running on the device.
The device may also send the result of the inferring step to another device.
In a particular implementation, the various processes of the disambiguating method are determined by computer program instructions.
Consequently, also provided is a computer program on a data medium, the program being suitable for being performed in an electronic device, the program including instructions adapted to perform a disambiguating method as described above.
This program may use any programming language, and may be in the form of source codes, object codes, or code intermediate between source codes and object codes, such as in a partially compiled form, or in any other desirable form.
Also provided is a computer readable data medium that includes instructions of a computer program as mentioned above.
The data medium may be any entity or device capable of storing the program. For example, the medium may comprise storage means such as a read-only memory (ROM), e.g. a compact disk (CD) ROM or a microelectronic circuit ROM, or indeed magnetic recording means, such as a floppy disk or a hard disk.
Furthermore, the data medium may be a transmissible medium such as an electrical or optical signal, suitable for being conveyed via an electrical or optical cable, by radio, or by other means. In some embodiments, the program may in particular be downloaded from an Internet type network.
Alternatively, the data medium may be an integrated circuit in which the program is incorporated, the circuit being adapted to execute or to be used in the execution of the method in question.
Brief description of the drawings
Particular advantages and characteristics of some embodiments of the present invention appear from the following detailed description of the figures, in which:
- figures 1 and 2 show electronic devices in accordance with a particular embodiment of the invention; and
- figure 3 is a flow chart showing the main steps of a method in accordance with a specific implementation of the invention.
Detailed description of embodiments of the invention
Figure 1 shows an electronic device according to the invention, comprising a touch interface TI, an accelerometer GYR, a memory MEM, a control unit CU and an inferring unit IU.
The control unit CU is designed for capturing and storing in the memory MEM the accelerometer GYR output for touch inputs received at a discrete number of predefined distinct zones of the touch interface TI.
The inferring unit IU is designed for inferring which hand was used for the user touch inputs based on the captured accelerometer outputs.
Figure 2 represents an electronic device 10 according to one embodiment of the invention, for example a smartphone.
The device 10 comprises a processor 11, a volatile memory 12, a non-volatile memory 13, a flash memory MEM, an accelerometer or gyroscope GYR, and a touch screen 15.
The non-volatile memory 13 comprises a computer program PG according to an embodiment of the invention. When the instructions of the computer program PG are executed by the processor 11, the processor 11 executes the steps of a method for disambiguating according to the invention which will be described with reference to Figure 3. In particular:
- processor 11 captures the output of accelerometer GYR and stores same in flash memory MEM; processor 11 therefore constitutes a control unit in the sense of the invention; and
- processor 11 infers which hand was used for the user touch inputs based on the captured accelerometer outputs; processor 11 therefore constitutes an inferring unit in the sense of the invention.
Four distinct zones are represented on touch screen 15: Z1 top-left corner, Z2 top-right corner, Z3 bottom-right corner and Z4 bottom-left corner.
These zones Z1 to Z4 are at locations that the user naturally reaches using a thumb, more particularly at distinct quadrants of the touch interface.
In this specific embodiment, the accelerometer GYR is a 3-axis accelerometer. In particular,
- when a left-handed user naturally rotates the phone to touch the Z2 or Z3 zones with his thumb, the X sensor value is negative, and
- when a right-handed user naturally rotates the phone to touch the Z1 or Z4 zones with his thumb, the X sensor value is positive.
With reference to Figure 3, we now describe the main steps of a method for disambiguating which hand a user involves in handling an electronic device, according to an embodiment of the invention.
In this embodiment, the method comprises a step E5 of determining whether the device is in a test mode.
When the device is in a test mode, the method comprises a loop E10-E30 executed for zones Zi in the clockwise order Z1, Z2, Z3, Z4.
At step E10, a graphical element is displayed in zone Zi.
At step E20, a touch is detected on zone Zi and the X accelerometer output is stored in memory MEM at step E30.
The loop can be executed one or a plurality of times. At the end of the loop, once the touch inputs have been received at least once on all graphical elements, an inferring step is executed (step E40) .
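The test-mode loop E10-E30 described above can be sketched as follows; the callables `wait_for_touch_in` and `read_accelerometer_x` are hypothetical stand-ins for the platform's display, touch and sensor APIs:

```python
# Clockwise zone order used by the test-mode loop:
# top-left, top-right, bottom-right, bottom-left.
ZONES = ["Z1", "Z2", "Z3", "Z4"]

def run_capture_loop(wait_for_touch_in, read_accelerometer_x, rounds=1):
    """Hypothetical sketch of steps E10-E30: for each zone in clockwise
    order, display a graphical element, wait for a touch on that zone,
    and store the X-axis accelerometer output at that moment. The loop
    may be executed one or several times (rounds)."""
    samples = {zone: [] for zone in ZONES}
    for _ in range(rounds):
        for zone in ZONES:
            wait_for_touch_in(zone)                       # E10/E20: show element, detect touch
            samples[zone].append(read_accelerometer_x())  # E30: store X output in memory
    return samples
```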
When the device is not in a test mode, the method comprises a first process P1 and a second process P2. These processes can be executed at any time.
The first process P1 comprises a step E21 (similar to step E20) of detecting a touch, and a step E50 of determining the zone Zi of the touch. At step E31 (similar to step E30) , the X accelerometer output is stored in memory MEM.
The second process P2 comprises a step E60 of verifying that at least one touch input has been detected for each zone Z1 to Z4.
When this is the case, the inferring step E40 is executed.
In this embodiment, the inferring step E40 comprises:
- determining an average value of the detected rotations of the device about the X axis for touch inputs received in zones Z1 to Z4; and
- comparing the average value with a threshold.
In this specific embodiment, an average value AVi is computed for each zone Zi, and an average AVG of average values AVi is calculated.
In this embodiment:
- for left-handed users, average values AV2 and AV3 will make AVG negative, even if AV1 and AV4 in Z1 and Z4 are positive; and
- for right-handed users, average values AV1 and AV4 will make AVG positive, even if AV2 and AV3 in Z2 and Z3 are negative.
Therefore, in this embodiment:
- when the average value AVG is more than 1.5, it is considered that the user handles the device with the right hand; and
- when the average value AVG is less than -1.5, it is considered that the user handles the device with the left hand.
The ±1.5 threshold is determined to be an acceptable threshold but other thresholds may be determined according to the shape of the device.
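Putting the embodiment together, step E40 can be sketched as follows. Names are illustrative; the ±1.5 value is the example threshold given above, and `samples` maps each zone to its stored X-axis outputs:

```python
def infer_hand_from_zones(samples, threshold=1.5):
    """Hypothetical sketch of step E40: compute a per-zone average AVi
    of the stored X-axis accelerometer outputs, average those into AVG,
    and compare AVG with the +/-1.5 example threshold."""
    av = {zone: sum(vals) / len(vals) for zone, vals in samples.items()}  # AVi per zone
    avg = sum(av.values()) / len(av)                                      # AVG of the AVi
    if avg > threshold:
        return "right"       # AVG above +1.5: right hand inferred
    if avg < -threshold:
        return "left"        # AVG below -1.5: left hand inferred
    return "undetermined"
```

The result could then feed the configuration step E70, for example to adapt the screen layout to the inferred hand.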
In this specific embodiment, step E40 is followed by a configuration step E70 for configuring the screen of the device according to the result of the inferring step.

Claims (10)

  1. A method for disambiguating which hand a user involves in handling an electronic device (10) , the electronic device comprising a touch interface (16) and an accelerometer (GYR) , the method comprising the steps of:
    - capturing (E20, E21) the accelerometer output for touch inputs received at a discrete number of predefined distinct zones (Zi) of the touch interface;
    - inferring (E40) which hand was used for the user touch inputs based on the captured accelerometer outputs.
  2. A method according to Claim 1, where the predefined zones are at locations that the user naturally reaches using a thumb.
  3. A method according to Claim 1 or 2, where the predefined distinct zones include distinct quadrants of the touch interface.
  4. A method according to any one of Claims 1 to 3, the method comprising a preliminary step of displaying (E10) on the touch interface a graphical element for each distinct zone.
  5. A method according to Claim 4, where the graphical elements are displayed in a specific order, for example in clockwise order, the inferring being carried out once the touch inputs have been received on all graphical elements.
  6. A method according to any one of Claims 1 to 5, where the inferring step comprises:
    - determining an average value of the detected rotations of the device for touch inputs received in at least one of said zones;
    - comparing the average value with a threshold.
  7. A method according to any one of Claims 1 to 6, the method comprising a step of configuring (E70) an application according to the result of the inferring step.
  8. An electronic device (10) comprising a touch interface (TI) and an accelerometer (GYR), the device comprising:
    - a control unit (CU) for capturing and storing in a memory the accelerometer output for touch inputs received at a discrete number of predefined distinct zones of the touch interface;
    - an inferring unit (IU) for inferring which hand was used for the user touch inputs based on the captured accelerometer outputs.
  9. A computer program including instructions for executing the disambiguating method according to any one of claims 1 to 7, when said program is executed by an electronic device.
  10. A non-transitory computer readable data medium having stored thereon a computer program including instructions for executing the disambiguating method according to any one of claims 1 to 7.
PCT/CN2016/087735 2016-06-29 2016-06-29 Method and device for disambiguating which hand user involves in handling electronic device WO2018000257A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/087735 WO2018000257A1 (en) 2016-06-29 2016-06-29 Method and device for disambiguating which hand user involves in handling electronic device


Publications (1)

Publication Number Publication Date
WO2018000257A1 2018-01-04

Family

ID=60785871

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/087735 WO2018000257A1 (en) 2016-06-29 2016-06-29 Method and device for disambiguating which hand user involves in handling electronic device

Country Status (1)

Country Link
WO (1) WO2018000257A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102830935A (en) * 2012-08-22 2012-12-19 上海华勤通讯技术有限公司 Touch terminal and operation interface adjusting method
US20130300668A1 (en) * 2012-01-17 2013-11-14 Microsoft Corporation Grip-Based Device Adaptations
CN104182126A (en) * 2014-08-27 2014-12-03 北京数字天域科技股份有限公司 Method and device for dialing numbers via mobile terminals
CN104798030A (en) * 2012-12-28 2015-07-22 英特尔公司 Adapting user interface based on handedness of use of mobile computing device


Similar Documents

Publication Publication Date Title
JP6677425B2 (en) Intelligent terminal control method
US9519401B2 (en) Providing context menu based on predicted commands
WO2017054309A1 (en) Interactive control method and device for voice and video communications
CN105814524B (en) Object detection in optical sensor system
US10685256B2 (en) Object recognition state indicators
US10127246B2 (en) Automatic grouping based handling of similar photos
KR20110053820A (en) Method and apparatus for processing image
KR20140091555A (en) Measuring web page rendering time
AU2021200971B2 (en) Systems and methods of enabling fast user access to remote desktops
TWI665600B (en) Electronic device and touch method
WO2017032020A1 (en) Image processing method and electronic terminal
US11710111B2 (en) Methods and systems for collecting and releasing virtual objects between disparate augmented reality environments
US20190251342A1 (en) Collaboration event content sharing
US20130182005A1 (en) Virtual fashion mirror system
US20150309681A1 (en) Depth-based mode switching for touchless gestural interfaces
WO2018000257A1 (en) Method and device for disambiguating which hand user involves in handling electronic device
US11755193B2 (en) Method and system for receiving feedback from a user
WO2012104312A1 (en) Method and apparatus for gesture authentication
CN105631850B (en) Aligned multi-view scanning
CN111898529B (en) Face detection method and device, electronic equipment and computer readable medium
CN109657440A (en) Based on the biological information treating method and apparatus of block chain, terminal device
CN104679428A (en) Method for judging photograph rotation direction according to single finger gestures
CN105808051B (en) Image processing method and electronic equipment
CN109949407B (en) Head portrait generation method and device and electronic equipment
CN107168519B (en) Control method and device of intelligent wearable equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16906658

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16906658

Country of ref document: EP

Kind code of ref document: A1