WO2020055263A1 - Proximity Detection - Google Patents

Proximity Detection

Info

Publication number
WO2020055263A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
electronic device
category
touch event
event
Application number
PCT/NO2019/050178
Other languages
English (en)
Inventor
Guenael Thomas Strutt
Espen Klovning
Original Assignee
Elliptic Laboratories As
Priority claimed from NO20181316A (published as NO346144B1)
Application filed by Elliptic Laboratories AS
Priority to CN201980056290.2A (published as CN112639673A)
Priority to US17/261,844 (published as US 2021/0303099 A1)
Publication of WO2020055263A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 - Control or interface arrangements specially adapted for digitisers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1626 - Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1684 - Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/26 - Power supply means, e.g. regulation thereof
    • G06F 1/32 - Means for saving power
    • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3206 - Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F 1/3215 - Monitoring of peripheral devices
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/26 - Power supply means, e.g. regulation thereof
    • G06F 1/32 - Means for saving power
    • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3206 - Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F 1/3231 - Monitoring the presence, absence or movement of users
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/26 - Power supply means, e.g. regulation thereof
    • G06F 1/32 - Means for saving power
    • G06F 1/3203 - Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3234 - Power saving characterised by the action undertaken
    • G06F 1/325 - Power saving in peripheral device
    • G06F 1/3265 - Power saving in display device
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 52/00 - Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W 52/02 - Power saving arrangements
    • H04W 52/0209 - Power saving arrangements in terminal devices
    • H04W 52/0251 - Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity
    • H04W 52/0254 - Power saving arrangements in terminal devices detecting a user operation or a tactile contact or a motion of the device
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 52/00 - Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W 52/02 - Power saving arrangements
    • H04W 52/0209 - Power saving arrangements in terminal devices
    • H04W 52/0251 - Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity
    • H04W 52/0258 - Power saving arrangements in terminal devices controlling an operation mode according to history or models of usage information, e.g. activity schedule or time of day
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 - Reducing energy consumption in communication networks
    • Y02D 30/70 - Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • The present teachings relate to proximity sensing for an electronic device.
  • Most mobile devices, especially mobile phones or smartphones, have a proximity sensor that is typically placed near the top of the screen of the device, or close to the earpiece.
  • The main function of such a proximity sensor is to detect when the user has positioned the device close to the ear during an ongoing phone call, in which case the touchscreen of the mobile device is disabled or switched off to prevent false touch events due to contact of the ear or another body part of the user with the screen of the mobile device. Since the touchscreen is not normally used while the user has placed a call and has positioned the device close to their head or next to their ear, the touchscreen controller can either be switched off or may enter a low-power mode to save power. Additionally, the screen lighting of the device is also normally switched off to save power.
  • The present teachings provide a method for controlling an electronic device such that false positive detections of a proximity event on the electronic device are reduced.
  • A method for controlling an electronic device comprising a touch-sensitive surface and a proximity sensor, the proximity sensor having a field-of-view, the method comprising the steps of:
  • changing the device from a screen-on state to a near state in response to a near event being detected by the proximity sensor, wherein the screen-on state is a state in which user inputs are accepted through the touch-sensitive surface; and the near event occurs when an input object is determined to be in proximity of the electronic device by the proximity sensor;
  • By saying that the input object is determined to be in proximity of the electronic device, it is meant that the input object is detected by the proximity sensor to be present within the field-of-view (“FoV”) of the proximity sensor.
  • The input object is determined to be in proximity of the electronic device when the input object arrives within a first predetermined distance of the proximity sensor.
  • The method further includes the steps of: waiting for a predetermined time-period in the near state, within which period, if no first category touch event is detected, changing the device to a screen-off state, wherein the screen-off state is a state in which the touch-sensitive surface is either disabled or any inputs received through it are ignored; and
  • By saying that the input object is determined to no longer be in proximity of the electronic device, it is meant that the input object is detected by the proximity sensor to be absent from the FoV of the proximity sensor.
  • The input object is determined to no longer be in proximity of the electronic device when the input object moves beyond a second predetermined distance from the proximity sensor. The first predetermined distance and the second predetermined distance can be equal values, or alternatively they can be different values, for example for implementing hysteresis in changing the states in response to the input object entering and departing the FoV of the proximity sensor.
  • The arriving/departing of the input object in/from the FoV is meant in a relative sense to the electronic device; i.e., saying that the input object arrives within a first predetermined distance of the proximity sensor covers all scenarios, whether the input object moves towards a stationary electronic device, the electronic device is brought towards a stationary input object, or both the electronic device and the input object move towards each other.
  • The magnitude of the first predetermined distance may be at least substantially equal to the magnitude of the second predetermined distance, or alternatively the magnitude of the first predetermined distance may be different from the magnitude of the second predetermined distance.
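The hysteresis behavior described above, with an arrival threshold smaller than the departure threshold, can be sketched as follows. The distance values and class name are illustrative assumptions, not values from the patent:

```python
# Hypothetical hysteresis between the first (near) and second (far)
# predetermined distances: the departure threshold is larger than the
# arrival threshold, so jitter around a single boundary does not toggle
# the state back and forth.
NEAR_DISTANCE_MM = 50.0  # assumed first predetermined distance
FAR_DISTANCE_MM = 80.0   # assumed second predetermined distance

class ProximityStateTracker:
    def __init__(self):
        self.near = False  # no input object in proximity initially

    def update(self, distance_mm):
        """Return 'near', 'far', or None when no event is generated."""
        if not self.near and distance_mm <= NEAR_DISTANCE_MM:
            self.near = True
            return "near"   # input object arrived within the FoV
        if self.near and distance_mm > FAR_DISTANCE_MM:
            self.near = False
            return "far"    # input object departed from the FoV
        return None         # inside the hysteresis band: no state change
```

A distance sequence such as 100, 60, 40, 70, 90 mm then produces exactly one near event (at 40 mm) and one far event (at 90 mm); the intermediate 60 and 70 mm samples generate nothing.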
  • An electronic device comprising a touch-sensitive surface and a proximity sensor, the electronic device also comprising a processing unit configured to:
  • enter a near state in response to a near event being detected by the proximity sensor, wherein the screen-on state is a state in which user inputs are accepted through the touch-sensitive surface and processed by the processing unit, and the near event occurs when an input object is determined to be in proximity of the electronic device by the proximity sensor;
  • The processing unit is further configured to: process a first category touch event and, based on the processing of the first category touch event, perform at least one function on the electronic device, wherein the first category touch events include touch inputs occurring within the first portion of the touch-sensitive surface and/or the first category touch events are a first type of touch events; and ignore a second category touch event, wherein the second category touch event includes a touch input occurring in the second portion of the touch-sensitive surface and/or the second category touch events are a second type of touch events, wherein the first portion and the second portion are non-overlapping portions of the touch-sensitive surface.
  • The processing unit is configured to ignore the second category touch event either by disabling the second portion of the touch-sensitive surface, or alternatively by disregarding touch inputs received through the second portion of the touch-sensitive surface.
  • The first type of touch events may include any one or any combination of tap, double-tap, hold or swipe events.
  • The second type of touch events may include any one or any combination of tap, double-tap, hold or swipe events.
  • The second type of touch events is distinct from the first type of touch events, so that the first type of touch events is processed, while the second is ignored.
  • A touch event of type ‘swipe’ cannot be used as both a first type and a second type; it can only be used either as a first type or as a second type.
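A minimal sketch of the two-category scheme above, assuming (purely for illustration) that the first portion is a strip along the top edge of the screen and that swipes are the first event type while taps, double-taps and holds are the second:

```python
# Hypothetical categorization of touch events while in the near state.
# Screen dimensions, portion boundary, and event-type sets are assumptions.
FIRST_PORTION_MAX_Y = 200  # assumed first portion: top strip, in pixels

FIRST_TYPE_EVENTS = {"swipe"}                       # processed
SECOND_TYPE_EVENTS = {"tap", "double-tap", "hold"}  # ignored

def categorize_touch(event_type, y):
    """Return 1 for a first category touch event, 2 for a second category one.

    A touch is first category when it occurs within the first portion of the
    touch-sensitive surface AND is of a first event type; since the portions
    are non-overlapping, everything else is second category and ignored.
    """
    in_first_portion = 0 <= y <= FIRST_PORTION_MAX_Y
    if in_first_portion and event_type in FIRST_TYPE_EVENTS:
        return 1
    return 2
```

With these assumptions, a swipe starting 150 px from the top is category 1 and would be processed, while a tap at the same position, or a swipe in the lower part of the screen, falls into category 2 and is ignored.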
  • The electronic device may also comprise a display.
  • The display may be used for user interaction with the device, and for displaying characters, pictures, or videos, etc. It will be understood that in certain cases, such as for a TFT LCD display, the display also comprises a backlight, while other kinds of display may be self-illuminated.
  • The display is turned off or switched off when the electronic device enters the near state, while the touchscreen sensor or touch-sensitive surface is still active for detecting first category touch events.
  • Alternatively, the display is kept on for a predetermined time-period after the electronic device changes to or enters the near state. After the expiry of the time-period, if the electronic device is still in the near state, the display is switched off, while the touchscreen sensor is still kept active for detecting first category touch events. It will be appreciated that, based on the use case of the electronic device, this can save power whilst allowing for detection of touch events for a longer period of time.
  • The proximity sensor may be infrared (“IR”) detection based, or it may be acoustic detection based.
  • The teachings can also apply to other kinds of proximity detection arrangements, such as those based on electric field, light, magnetic field, or capacitive field.
  • The proximity detection system may be based on the transmission and receiving of acoustic signals.
  • The proximity detection mode comprises the transmission, reflection, and detection of acoustic, particularly ultrasonic, signals.
  • The proximity sensor may include a plurality of sensors.
  • The proximity sensor may even include a combination of various kinds of proximity detection arrangements, for example a combination of an IR sensor and an acoustic sensor.
  • An IR-sensor based proximity sensing arrangement can be replaced by ultrasound sensing alone or in combination with capacitive sensing. In devices where ultrasound sensing is already implemented for gesture recognition, by using ultrasound sensing in combination with touch sensing, the requirement for a separate IR proximity sensor may be removed. Manufacturing costs may thus be reduced.
  • An acoustic proximity detection arrangement, or more specifically an ultrasound-sensor based proximity detection arrangement, comprises at least one transmitter and at least one receiver. The transmitter is used for transmitting an ultrasound signal, and the receiver is used for receiving at least some portion of the ultrasound signal being reflected by an object.
  • The object may be almost any animate or inanimate object.
  • A body part of a user, such as a hand, may be considered an object.
  • One or more fingers may be considered an object.
  • Inanimate things, such as a stylus or a pen, may be considered an object.
  • In general, an input object can be anything that can trigger the proximity detection system.
  • The head of the user, or an ear, or even the hair of the user may be considered input objects.
  • The transmitter and receiver may either be different components, or alternatively they can be the same transducer that is used in a transmit mode for transmitting the ultrasound signal and then in a receive mode for receiving the reflected ultrasound signal. If the transmitter and receiver are different components, they may be placed in the same location, or they may be installed at different locations on the electronic device.
  • The electronic device may comprise a plurality of transmitters and/or a plurality of receivers. Multiple transmitter-receiver combinations may be used to extract spatial information related to the object and/or the surroundings.
  • The method may further comprise computing a distance value by the processing of the measured signal; said distance value can be relative to the distance between the input object and the electronic device, or more specifically the proximity sensor.
  • The processing of the signal, or of a plurality of signals received by the proximity sensor, can be done by a processing unit such as a computer processor.
  • The processing unit may either be the same processor that is used for processing signals received by the touch-sensitive surface, or it may be a separate processor.
  • A usage of the term processing unit in this disclosure thus includes both alternatives, i.e., separate processors and the same processor.
  • The processing unit can be any type of computer processor, such as a DSP, an FPGA, or an ASIC.
  • The range and/or sensitivity, and thus the field-of-view, of the proximity sensing arrangement may either be limited according to component specifications, or it may be statically or dynamically adapted by the processing unit to a certain value according to processing requirements and/or the use case of the electronic device.
  • The field-of-view may at least partially encompass the first portion of the touch-sensitive surface.
  • The method may also comprise transmitting data related to the input object to another electronic module of the electronic device.
  • The input object related data may include one or more of: input object position, distance, speed, estimated trajectory, and projected trajectory.
  • The other electronic module may be a hardware or software module, and may include any one or more of an application programming interface (“API”) and a sensor fusion module. For example, data related to any one of distance, speed of movement, position, and gesture type may be transmitted to and used by the processing unit to estimate the use case of the electronic device.
  • The method also comprises receiving data from at least one of the other sensors or modules in the electronic device for improving the robustness of the control of the electronic device.
  • The other sensors or modules may include any one or more of an accelerometer, an inertial sensor, an IR sensor, or any other sensor or module related to a sensor fusion module in the electronic device.
  • Proximity detection can be performed not only on the screen side of the device, but also for an input object that is located at or towards another side of the electronic device. This can for example be used to determine when the electronic device has been placed on a surface such as a table.
  • The electronic device may thus recognize an on-table use case by processing the signal from the proximity sensor and/or other sensors in the electronic device. It may thus be recognized that the electronic device has been placed on such a surface.
  • The processing of the echo signals may be based on time of flight (“TOF”) measurements between the transmitted ultrasound signal and the measured signal.
  • The processing of the echo signals may also be based on the amplitude of the measured signal, or the phase difference between the transmitted signal and the measured signal, or the frequency difference between the transmitted signal and the measured signal, or a combination thereof.
  • The transmitted ultrasound signal may comprise either a single frequency or a plurality of frequencies. In another embodiment, the transmitted ultrasound signal may comprise chirps.
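The TOF relation mentioned above reduces to a one-line computation: the signal travels to the object and back, so the one-way distance is half the round-trip time multiplied by the speed of sound. A minimal sketch (the speed-of-sound constant assumes air at roughly room temperature):

```python
# Sketch of time-of-flight distance estimation for an ultrasound echo.
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at ~20 degrees C (assumption)

def tof_distance_m(round_trip_time_s):
    """Distance to the reflecting object from a round-trip TOF measurement.

    The echo covers the sensor-to-object path twice, hence the division by 2.
    """
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0
```

For example, a round-trip time of 1 ms corresponds to an object roughly 17 cm from the sensor, which is on the scale relevant for detecting a head or hand near a phone.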
  • The method steps are preferably implemented using a computing unit such as a computer or a data processor.
  • The present teachings can also provide a computer software product for implementing any method steps disclosed herein. Accordingly, the present teachings also relate to a computer readable program code having specific capabilities for executing any method steps herein disclosed.
  • The term electronic device includes any device, mobile or stationary.
  • Preferably, the electronic device is a mobile phone or a smartphone.
  • The electronic device can execute any of the method steps disclosed herein. Accordingly, any aspects discussed in the context of the method or process also apply to the product aspects in the present teachings.
  • The present teachings can allow original equipment manufacturers (“OEMs”) to save effort spent on perfecting the proximity detection system. More specifically, the necessity to have a restricted field-of-view or detection distance can be alleviated.
  • FIG. 1 shows a perspective front view of an electronic device with a proximity detection system.
  • FIG. 2 shows a perspective side view of an electronic device with a proximity detection system.
  • FIG. 3 shows a state-diagram related to the electronic device.
  • FIG. 4 shows a modified state-diagram related to the electronic device.
  • Detailed description
  • FIG. 1 shows a perspective front view of an electronic device 100, which is shown as a mobile phone.
  • The mobile phone 100 has a screen 101 for displaying and interacting with the device 100.
  • An earpiece 120 and a proximity sensor 105 are placed above the top-edge 110 of the screen 101.
  • The earpiece 120 comprises a speaker that is used for outputting acoustic signals such as the audio of the caller. In certain phones, the same speaker may also be used for outputting ultrasonic signals, for example for ultrasound-based user interaction.
  • The screen 101 comprises not only a display for displaying pictures and videos, but also a touchscreen sensor for touch based user interaction.
  • The proximity sensor 105 is usually infrared (“IR”) detection based, but it can also be an acoustic-detection based sensor, or another type of sensor suitable for proximity detection.
  • FIG. 1 also shows a finger 180 of a user interacting with the device 100. The tip 108 of the finger 180 is close to the top-edge 110 of the screen 101. One of the possible interactions close to the top-edge 110 of the screen 101 is a pull-down swipe, which is shown with an arrow 130 indicating the motion of the fingertip 108 while it is in contact with the screen 101.
  • For performing a pull-down swipe, the user will typically place his/her fingertip 108 substantially close to the top-edge 110 and move the fingertip 108 in the direction of the arrow 130 whilst the fingertip remains in contact with the surface of the screen 101.
  • When the touchscreen controller, connected to the touchscreen sensor, detects a pull-down event resulting from a pull-down swipe 130, it usually sends a signal to a digital processor that performs a certain task. This task can for example be pulling down or displaying the notification menu on the screen 101.
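The pull-down swipe described above amounts to a simple test over the contact trajectory: the touch must start substantially close to the top edge and travel downward while staying on the screen. A sketch, with both pixel thresholds being illustrative assumptions rather than values from the patent:

```python
# Hypothetical pull-down swipe recognition (cf. arrow 130 in FIG. 1).
TOP_EDGE_ZONE_PX = 50       # "substantially close to the top edge" (assumed)
MIN_PULL_DISTANCE_PX = 200  # minimum downward travel to count (assumed)

def is_pull_down_swipe(touch_points):
    """touch_points: list of (x, y) samples of one continuous contact,
    with y increasing downward from the top edge of the screen.
    """
    if len(touch_points) < 2:
        return False  # a single sample cannot define a swipe
    start_y = touch_points[0][1]
    end_y = touch_points[-1][1]
    # Must begin in the top-edge zone and travel far enough downward.
    return start_y <= TOP_EDGE_ZONE_PX and (end_y - start_y) >= MIN_PULL_DISTANCE_PX
```

A real touchscreen controller would additionally debounce the samples and check that the contact was continuous; this sketch keeps only the geometric criterion.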
  • A near event corresponds to a condition when an object comes within a certain predetermined distance of the proximity sensor 105, within its field of view (“FoV”).
  • The FoV of a proximity sensor 105 is a three-dimensional envelope or space around the sensor 105 within which the sensor 105 can reliably detect a proximity event. Detection of a near event is required, for example, to be able to switch off the touchscreen and display (or screen 101) of the device 100 such that undesired touchscreen operation may be prevented.
  • Such undesired touchscreen operation could otherwise occur when the user has placed the earpiece 120 close to his/her ear, in which condition false touch events could result if the touchscreen were not disabled and the ear of the user touched the screen 101. Detection of a near event by using the proximity sensor 105 causes the touchscreen to be disabled such that undesired touchscreen operation is prevented.
  • Any detection of a near event can cause the screen 101 to be switched off.
  • By in-call it is meant that the phone 100 is in an ongoing call, incoming or outgoing. While the phone 100 is in an in-call state, and the user tries to interact with the device 100 such that a part of his/her hand or finger 108 comes within the FoV of the proximity sensor 105, a near event may be detected even though it is not desired, or even required. In certain devices, performing a pull-down swipe, for example to view notifications, may be almost impossible to execute while the device is in a certain state, such as an in-call state.
  • For example, in FIG. 1, the user could be hindered from properly interacting with the device if, during the interaction, the fingertip 108 came within the FoV of the proximity sensor 105.
  • Such interaction could be a pull-down swipe as previously explained, i.e., swiping the fingertip 108 on the screen 101 from the top-edge 110 in the direction 130.
  • A part of the finger 180 may trigger a detection of a near event by the proximity sensor 105. The detection of the near event will then result in the screen 101 being switched off. In such a case, the screen turns off not because the phone is being held near the ear or the user’s head, but because there is a finger nearby.
  • Thus, a pull-down swipe from the top edge of the screen during an ongoing phone call can be very hard to perform, because when attempting the swipe, the presence of the user’s finger or hand is detected as a near event, which switches off the touchscreen and/or the display.
  • The detection of a near event resulting in the screen 101 being switched off can also occur in a hands-free mode.
  • By hands-free mode it will be understood that the user is not resting the earpiece 120 against or near his/her ear.
  • In a hands-free call, either the same speaker as the one inside the earpiece 120 is driven at a larger volume, or the device may have another dedicated hands-free speaker.
  • Whether the phone is in-call in hands-free mode or otherwise, and whether the call is placed through a mobile network, such as GSM or CDMA, or is a VOIP call, the problem of an undesired screen-off may exist in any of such modes or use cases. Accordingly, the present teachings are not limited to any specific use case or configuration.
  • By saying that the proximity sensor 105 is active, it is meant that the presence of an object within a certain region around the sensor 105 results in a detection of a certain event in the proximity sensing system.
  • By saying that the proximity sensor is inactive, it is meant that the presence of an object within a certain region around the sensor 105 does not result in a detection of the certain event in the proximity sensing system, irrespective of whether the sensor 105 itself is disabled or powered off, or just the output of the sensor is being ignored by a processing system.
  • The proximity sensing system may also detect the absence or removal of an object which was previously detected within a certain region around the sensor 105.
  • The detection of absence or removal of the object may generate another event in the proximity sensing system.
  • In the inactive case, the absence of the object does not result in a detection of this other event in the proximity sensing system, irrespective of whether the sensor 105 itself is disabled or powered off, or just the output of the sensor is being ignored by the processing system.
  • The events resulting in the proximity sensing system from the presence and absence of one or more objects can be called near and far events, respectively. Near and far events can generate a near detection signal and a far detection signal, respectively.
  • The proximity sensing system may detect movement of the object approaching towards the sensor 105 to issue the near detection signal.
  • Similarly, the proximity sensing system may detect movement of the object departing away from the sensor 105 to issue the far detection signal.
  • FIG. 2 shows a perspective side-view of the phone 100.
  • The FoV 205 of the proximity sensing system is shown extending in a divergent manner from the proximity sensor 105 along an axis 206, such that the cross-sectional area of the FoV 205 in a plane normal to the axis 206 increases with distance from the proximity sensor 105 along the axis 206.
  • The FoV 205 will extend to a certain distance 250 from the sensor 105. Accordingly, the FoV 205 is the region or 3D space within which the proximity sensing system can reliably detect the proximity of an object.
  • The FoV 205 is shown as a conical shape with its vertex at the location of the proximity sensor 105 and the base 207 of the cone representing the limit within which reliable sensing is possible.
  • Alternatively, the base 207 of the cone could represent the limit within which proximity sensing is desired.
  • The conical shape of the FoV 205 is shown just as an example.
  • The FoV 205 may be asymmetrical in any or all directions and may have another shape depending upon the sensor used.
  • For example, an ultrasound-based proximity sensor usually has a wider FoV than an IR based proximity sensor.
  • A certain shape of the FoV is not limiting to the generality of the present teachings.
  • FIG. 2 further shows the user interacting with the device 100 using his/her fingertip 108.
  • A part of the finger 108 is lying within the FoV 205 such that a near event will be detected in certain situations, such as the phone 100 being in an in-call condition. Accordingly, detection of a near event may prevent the execution of a pull-down swipe 130 during an in-call condition.
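For illustration, testing whether a point lies inside a conical FoV like the FoV 205 of FIG. 2 can be sketched as below. The apex position, axis direction, half-angle, and maximum range are hypothetical parameters chosen for the example, not values from the text:

```python
import math

def in_conical_fov(point, apex=(0.0, 0.0, 0.0), axis=(0.0, 0.0, 1.0),
                   half_angle_deg=30.0, max_range=0.5):
    """Return True if `point` lies inside a cone with its vertex at the
    sensor (`apex`), opening half_angle_deg around `axis`, and truncated
    at `max_range` (the reliable-sensing limit, base 207 in FIG. 2).
    All default values are illustrative assumptions."""
    v = tuple(p - a for p, a in zip(point, apex))
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0.0:
        return True   # the apex itself is trivially inside
    if dist > max_range:
        return False  # beyond the reliable-sensing limit
    # angle between the object direction and the cone axis
    dot = sum(vc * ac for vc, ac in zip(v, axis))
    cos_angle = dot / (dist * math.sqrt(sum(a * a for a in axis)))
    angle = math.acos(max(-1.0, min(1.0, cos_angle)))
    return angle <= math.radians(half_angle_deg)
```

As the text notes, a real FoV may be asymmetrical or differently shaped (for example wider for an ultrasound sensor), so this symmetric cone is only one possible model.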
  • FIG. 3 shows a state-diagram 300 illustrating a conventional method for control of screen of an electronic device such as a smartphone.
  • the electronic device can be the device 100 that is shown in the previous figures.
  • The state-diagram 300 shows the behavior of the electronic device in conditions where proximity sensing is enabled.
  • An example of such condition is an in-call situation.
  • the term screen may include the touchscreen and/or the backlight for the screen. In the state 301 at least the touchscreen is on.
  • When a near event 310 is detected by the proximity sensing system, the device enters a near state 302. While the device is in the near state 302, any touch event 311 that occurs in this state 302 does not result in a change of state. In other words, any touch event 311 occurring while the device is in the near state 302 is ignored by the device.
  • The touch event 311 can be any interaction, intentional or unintentional, that occurs through the touchscreen of the device. An unintentional interaction can, for example, be a contact between the ear of the user and the screen.
  • An intentional interaction can, for example, be a pull-down swipe or any other purposeful touch or tap on the screen of the device.
  • the pull-down swipe will also be ignored if the device is in a near state.
  • the near state 302 can be triggered by an object being in the FoV 205 of the proximity sensing system of the device.
  • a timeout timer can be triggered that waits for a time-period of predetermined length, upon expiry of which period the screen is switched off.
  • the time-period can be counted up or down, accordingly the timer can be a count-up or count-down timer respectively.
  • the expiry of the time-period can be termed a timeout event 320 or more simply timeout.
  • When the timeout event 320 occurs, the screen is switched off.
  • By saying that the screen is switched off, it is meant here that at least the touchscreen sensing, or the touchscreen itself, is disabled or switched off.
  • the screen backlight and the screen driver can also be disabled.
  • the screen backlight and the screen driver may also be disabled while the device is in the near state 302 while the touchscreen sensing is still kept active. This can allow touch events to be detected for a longer period of time, while still saving power.
  • By using the term disabled, various alternative conditions, such as switched off, powered off, entering a low-power state, or sensor outputs being disabled, are all included as possible additions or alternatives for switching off the screen of the device. Consequently, upon timeout 320, the device enters a screen switched-off state 303.
  • the touchscreen and preferably also the display of the device are disabled such that no touch-based interaction is possible with the device.
  • the device can be woken up by a detection 330 of a far event.
  • The far event 330 can simply be a condition equivalent to the removal of the near event 310 that caused the screen to be switched off. Alternatively, there may be a hysteresis between the distance at which an approaching object triggers a near event (when the distance to the proximity sensor becomes lower than a certain value) and the distance at which the same object causes the far event (when it moves beyond a certain, larger distance from the proximity sensor).
  • When the far event 330 is detected, the device goes back to the screen-on state 301.
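The conventional behavior of FIG. 3 can be summarized as a small state machine. The Python sketch below uses the event numbering from the text (near 310, touch 311, timeout 320, far 330); it is an illustration of the described transitions, not the actual device implementation:

```python
SCREEN_ON, NEAR, SCREEN_OFF = "screen_on", "near", "screen_off"

def step(state, event):
    """One transition of the conventional state machine of FIG. 3.
    In the near state every touch event (311) is ignored; only the
    timeout (320) and, from the switched-off state, the far event (330)
    change the state."""
    if state == SCREEN_ON and event == "near":
        return NEAR            # near event 310
    if state == NEAR and event == "timeout":
        return SCREEN_OFF      # timeout 320
    if state == SCREEN_OFF and event == "far":
        return SCREEN_ON       # far event 330 wakes the device
    return state               # touches and all other events are ignored
```

A touch while near therefore has no effect, which is exactly the limitation the modified method of FIG. 4 addresses.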
  • FIG. 4 shows a modified state-diagram 400 showing a method according to the present teachings.
  • the device changes from the screen on state 301 to the near state 302 when a near event 310 is detected.
  • When timeout 320 occurs, the device changes to a screen-off state 303.
  • The occurrence of a far event 330 results in the device returning to the screen-on state 301.
  • These states and events were discussed in detail in the context of FIG. 3 and apply to FIG. 4 as well.
  • The touch events are divided into at least two categories. In FIG. 4, touch events 411 and 412 belonging to two such categories are shown. The touch events may be divided into categories, for example, based upon the location where they occur on the screen. Alternatively, or in addition, the touch events may be divided into categories based upon a type of the touch event.
  • The touch events occurring on the bottom half of the screen can lie in a second category 412 of touch events, whereas the touch events occurring on the top half of the screen can lie in a first category 411 of the touch events.
  • The top half will be understood as the portion of the screen area spanning from the top edge 110 down to a given distance, essentially in the middle of the screen 101.
  • the bottom half is the remaining portion of the screen 101.
  • top and bottom are used in a relative sense and do not limit the present teachings to a given division of the screen area, symmetrical or asymmetrical.
  • the categories of touch events may be more than two.
  • Instead of the first half and second half, the screen may be divided into a first portion and a second portion, respectively. The first portion may be smaller than the second portion, or vice versa.
  • The touch events may be divided into categories based on the type of the touch event. Accordingly, a touch event may be evaluated to determine whether it is of a first type, such as a tap, hold, or swipe, and deemed to be a first category touch event. Since such types of events are unlikely to be caused unintentionally, i.e., by contact of the phone with the user's ear or cheek, such events may be accepted rather than ignored. It will be appreciated that other types of touch events, or second type touch events that cannot be distinguished from an undesired touch event such as one from unintentional contact with the user's ear or cheek, can be ignored.
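The two categorization criteria just described, location on the screen and type of touch event, can be sketched as below. The half-screen split, the coordinate convention (y = 0 at the top edge), and the exact set of first-type gestures are illustrative assumptions, not requirements stated in the text:

```python
# Gesture types treated as first-type (unlikely to be unintentional);
# this particular set is an assumption for the sketch.
FIRST_TYPE_EVENTS = {"tap", "double_tap", "hold", "swipe"}

def categorize_touch(y, screen_height, touch_type=None):
    """Return 'first' for a first category touch event (411) or 'second'
    for a second category touch event (412).

    Criterion 1: location -- top half of the screen is first category,
    bottom half is second category (y grows downward from the top edge).
    Criterion 2 (optional): a recognized first-type gesture is accepted
    regardless of where it occurs on the screen."""
    if touch_type in FIRST_TYPE_EVENTS:
        return "first"
    return "first" if y < screen_height / 2 else "second"
```

With more than two categories, or an asymmetric split of the screen, only the boundary test would change; the dispatch structure stays the same.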
  • While the device is in the near state 302, any second category touch event 412, i.e., a touch event occurring at the bottom portion of the screen and/or a touch event of the second type, does not result in a change of state.
  • any second category touch event 412 occurring while the device is in the near state 302 is ignored by the device.
  • the second category touch event 412 can be any interaction, intentional or unintentional, that occurs through the bottom portion of the touchscreen.
  • the second category touch event 412 is a specific type of touch event occurring on the screen, or the second type of touch event as discussed previously.
  • Any first category touch event 411, i.e., a touch event occurring at the top portion of the screen and/or a touch event of the first type, results in the device changing to the screen-on state 301.
  • Any first category touch event 411 occurring while the device is in the near state 302 will switch on the screen of the device.
  • The first category touch event 411 can be any interaction, intentional or unintentional, that occurs through the top portion of the touchscreen.
  • Alternatively, the first category touch event 411 can be the first type of touch event occurring anywhere on the screen.
  • Before changing from the near state 302 to the screen-on state 301 after detecting the first category touch event 411, the method comprises a step of verifying that the first category touch event 411 is an intentional touch event.
  • The first category touch event 411 can be identified as an intentional event, for example, by checking whether the occurred first category touch event 411 is a predetermined type of touch event, such as a tap, double tap, hold, or a swipe. More specifically, the swipe can be a pull-down swipe. If the first category touch event 411 cannot be recognized as an intentional touch event, the system enters the screen-off state 303 at timeout 320.
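Putting the pieces together, the modified state machine of FIG. 4 differs from FIG. 3 only in the near state: a first category touch event 411, once verified as intentional, wakes the screen, while a second category touch event 412 is still ignored. The following Python sketch illustrates this; event names and the `intentional` flag are naming choices for the example:

```python
def step_modified(state, event, intentional=False):
    """One transition of the modified state machine (sketch of FIG. 4).

    Events: 'near' (310), 'touch1' (first category, 411),
            'touch2' (second category, 412), 'timeout' (320), 'far' (330).
    """
    if state == "screen_on" and event == "near":
        return "near"              # near event 310
    if state == "near":
        if event == "touch1" and intentional:
            return "screen_on"     # verified first category touch 411 wakes screen
        if event == "timeout":
            return "screen_off"    # timeout 320
    if state == "screen_off" and event == "far":
        return "screen_on"         # far event 330
    return state                   # touch2 (412) and unverified touches: ignored
```

An unverified first category touch thus behaves like a second category touch, matching the fallback to the screen-off state 303 at timeout 320 described above.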
  • the present teachings can provide an increased flexibility in setting the timeout 320 time-duration as touch events can now be filtered firstly based on the location where they occur on the screen, and optionally secondly, by estimating whether the touch event is intentional or unintentional.
  • the timeout duration may be increased to allow more time for the user to interact with the device.
  • the user may decide to rest the phone on a table or such surface such that he is not required to hold the phone while being in the call. In such a case, the chances of false touch events can be significantly low as compared to a call where the user is resting the earpiece close to or against his/her ear.
  • a larger timeout duration may be allowed if certain conditions are detected, such as a movement to place the device on a table or such object.
  • the modified method may adapt the timeout duration such that the user experience is improved.
  • The method may utilize machine learning and artificial intelligence ("AI") for recognizing and further improving the
  • Before changing from the near state 302 to the screen-on state 301 after detecting the first category touch event 411, the method also comprises a step of checking that active notifications exist.
  • By active notifications it is meant that the user has not cleared the notification area, such that at least one notification exists. Accordingly, if no notification exists, or the notification area is empty, the first category touch event 411 does not change the device to a screen-on state 301. In such a case, both touch events 411 and 412 are ignored. Similarly, if at least one notification exists in the notification area, or the notification area is not empty, the occurrence of the first category touch event 411 changes the device to the screen-on state 301.
  • the notifications area may be defined as a set of coordinates e.g., a dedicated area on the screen, a type of gesture such as a swipe, or a combination of both.
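The notification check described above can be sketched as a simple gate on the wake-up transition; the function name and state strings are illustrative choices for the example:

```python
def handle_first_category_touch(state, active_notifications):
    """Return the next state for a first category touch event (411),
    gated on active notifications: the touch only wakes the screen
    from the near state when the notification area is non-empty."""
    if state != "near":
        return state  # the gate only applies in the near state
    return "screen_on" if active_notifications else "near"
```

With an empty notification area, both touch event categories are thus ignored, exactly as described above.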
  • A method for controlling an electronic device comprising a touch-sensitive surface and a proximity sensor, the method comprising the steps of:
  • changing the device from a screen-on state to a near state in response to a near event being detected by the proximity sensor, wherein the screen-on state is a state in which user inputs are accepted through the touch-sensitive surface; and the near event occurs when an input object is determined to be in proximity of the electronic device by the proximity sensor;
  • first category touch event includes touch input occurring within a first portion of the touch-sensitive surface and the second category touch event includes touch input occurring in a second portion of the touch-sensitive surface, the first portion and the second portion being non-overlapping portions of the touch-sensitive surface.
  • first category touch event is a first type of touch event and the second category touch event is a second type of touch event distinct from the first type of touch event.
  • the electronic device comprises a display.
  • Clause 7. The method according to clause 6, wherein the display remains on for a predetermined time-period after the electronic device changes to the near state.
  • Clause 8. … the proximity sensor is based on infrared ("IR") detection.
  • An electronic device comprising a touch-sensitive surface and a proximity sensor, the electronic device also comprising a processing unit configured to: process user input received through the touch-sensitive surface;
  • the electronic device is configured to enter a near state in response to a near event being detected by the proximity sensor; the screen-on state being a state in which user inputs are accepted through the touch-sensitive surface and processed by the processing unit; and the near event occurs when an input object is determined to be in proximity of the electronic device by the proximity sensor;
  • Clause 18. The electronic device according to clause 17, wherein the first category touch event includes touch input occurring within a first portion of the touch-sensitive surface and the second category touch event includes touch input occurring in a second portion of the touch-sensitive surface, the first portion and the second portion being non-overlapping portions of the touch-sensitive surface.
  • Clause 19. The electronic device according to any of the above clauses 17 - 18, wherein the first category touch event is a first type of touch event and the second category touch event is a second type of touch event distinct from the first type of touch event.
  • Clause 20. The electronic device according to clause 19, wherein the first type touch event is any one or more of a tap, hold, or swipe.
  • processing unit is configured to ignore the second category touch event by disregarding touch inputs received through the second portion of the touch-sensitive surface.
  • the electronic device according to any of the clauses 17 - 22, wherein the proximity sensor is based on Infrared (“IR”) detection.
  • Clause 24. The electronic device according to any of the clauses 17 - 22, wherein the proximity sensor is based on acoustic detection.
  • An electronic device configured to execute the steps of any of the clauses 1 - 16.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention concerns a method for controlling an electronic device comprising a touch-sensitive surface and a proximity sensor, the method comprising: changing the device from a screen-on state to a near state in response to the detection of a near event by the proximity sensor, the near event occurring when an input object is determined by the proximity sensor to be in proximity of the electronic device; while the device is in the near state, processing a first category touch event and, in response to processing the first category touch event, performing at least one function on the electronic device; and, while the device is in the near state, ignoring a second category touch event, the second category touch event being distinct from the first category touch event.
PCT/NO2019/050178 2018-09-12 2019-09-10 Proximity sensing WO2020055263A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980056290.2A CN112639673A (zh) 2018-09-12 2019-09-10 接近感测
US17/261,844 US20210303099A1 (en) 2018-09-12 2019-09-10 Proximity sensing

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201862730200P 2018-09-12 2018-09-12
US62/730,200 2018-09-12
NO20181316 2018-10-12
NO20181316A NO346144B1 (en) 2018-09-12 2018-10-12 Proximity sensing

Publications (1)

Publication Number Publication Date
WO2020055263A1 true WO2020055263A1 (fr) 2020-03-19

Family

ID=69777749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NO2019/050178 WO2020055263A1 (fr) 2018-09-12 2019-09-10 Proximity sensing

Country Status (1)

Country Link
WO (1) WO2020055263A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210179223A1 (en) * 2019-12-11 2021-06-17 Spin Electric I.K.E. User interaction and visual feedback system for bikes

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012112181A1 (fr) * 2011-02-16 2012-08-23 Google Inc. Mobile device display management
US20160246449A1 (en) * 2015-02-25 2016-08-25 Microsoft Technology Licensing, Llc Ultrasound sensing of proximity and touch
US20160345113A1 (en) * 2015-05-22 2016-11-24 Samsung Electronics Co., Ltd. Method of recognizing surrounding environment and electronic device for the same

Similar Documents

Publication Publication Date Title
US9261990B2 (en) Hybrid touch screen device and method for operating the same
US9176615B2 (en) Method and apparatus for activating a function of an electronic device
US11112872B2 (en) Method, apparatus and computer program for user control of a state of an apparatus
US20200158556A1 (en) Power management
KR102090750B1 (ko) 지문 인식을 위한 전자 장치 및 방법
KR101572071B1 (ko) 휴대단말 표시부의 온/오프 제어 방법 및 장치
US20060012577A1 (en) Active keypad lock for devices equipped with touch screen
WO2013030441A1 (fr) Procédé et appareil pour empêcher des opérations associées à des entrées tactiles accidentelles
KR20120085392A (ko) 터치 스크린을 구비한 단말기 및 그 단말기에서 터치 이벤트 확인 방법
KR20140005764A (ko) 입력 감지 방법 및 그 방법을 처리하는 전자 장치
WO2014013249A1 (fr) Commande de dispositifs électroniques
US20190391723A1 (en) Touch recognition method, touch device
JP2017510879A (ja) 端末デバイスを処理するための方法及び端末デバイス
US20150077351A1 (en) Method and system for detecting touch on user terminal
US20210303099A1 (en) Proximity sensing
JP2014123327A (ja) 携帯情報端末
EP3283941B1 (fr) Évitement d'un déplacement de curseur accidentel lors d'un contact avec une surface d'un pavé tactile
WO2020055263A1 (fr) Proximity sensing
JP6295363B1 (ja) 電子機器、プログラムおよび制御方法
US20120313891A1 (en) Distance sensing circuit and touch-control electronic apparatus
NO20181651A1 (en) Power management
CN112416155B (zh) 报点输出控制方法、装置及存储介质
TWI434205B (zh) 電子裝置及其相關控制方法
JP2014119875A (ja) 電子機器、操作入力装置、操作入力処理方法およびプログラム
US20170160811A1 (en) Electronic device, control method, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19860977

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19860977

Country of ref document: EP

Kind code of ref document: A1