CN114449069A - Electronic device, method, and storage medium - Google Patents

Electronic device, method, and storage medium Download PDF

Info

Publication number
CN114449069A
Authority
CN
China
Prior art keywords
user
handheld device
distance data
electronic device
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011216016.8A
Other languages
Chinese (zh)
Inventor
闫冬生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Semiconductor Solutions Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp filed Critical Sony Semiconductor Solutions Corp
Priority to CN202011216016.8A priority Critical patent/CN114449069A/en
Priority to PCT/CN2021/128556 priority patent/WO2022095915A1/en
Priority to CN202180073970.2A priority patent/CN116391163A/en
Publication of CN114449069A publication Critical patent/CN114449069A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725Cordless telephones

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Environmental & Geological Engineering (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to electronic devices, methods, and storage media. An electronic device for a handheld device comprises processing circuitry configured to: obtaining distance data associated with one or more physical features of a user of the handheld device; determining which hand the user is currently using to operate the handheld device by determining the direction of the handheld device relative to the user based on the distance data; and presenting a User Interface (UI) suited to operation by that hand according to the determination result.

Description

Electronic device, method, and storage medium
Technical Field
The present disclosure relates generally to handheld user devices such as smartphones and, more particularly, to methods of determining which hand a user is using to operate a handheld device.
Background
In recent years, handheld user devices such as smartphones have become increasingly popular. Such handheld devices are often equipped with a touch screen which, in addition to serving visual presentation, provides interaction with the user by means of a UI containing virtual operating elements (such as virtual keys) displayed on the screen, thereby reducing the need for physical keys. However, to provide a better operating and visual experience, the screen size of handheld devices keeps increasing, which inevitably makes it harder for a user to operate the device while holding it in one hand. Although an optimized UI design can allow the holding hand to conveniently complete each operation on the screen, this requires determining whether the user is currently operating the device with the left or the right hand so that the UI can be presented in a form suited to that hand.
Various methods of determining the user's operating hand have been proposed. For example, patent document 1 (CN105468269A) discloses using proximity sensors 201-209 and 201'-209' mounted on a cellular phone to detect which hand of the user is holding the phone, as shown in FIG. 1A. Patent document 2 (JP2012023554A) discloses using a sensor such as a gyroscope or an accelerometer to determine which hand is holding the phone, as shown in FIG. 1B. However, a gyroscope or accelerometer is influenced by the motion state or by gravity (the vertical direction), so this method performs poorly when the user is walking or lying down, which limits its application scenarios. Further, patent document 3 (CN108958603A) discloses capturing the user's face with an ordinary RGB camera and determining the operating hand from the different visible areas of the left and right halves of the face. However, RGB sensors are limited by lighting conditions, and the method is inaccurate because the area ratio is ambiguous under different head poses.
Therefore, there is a need for a method of determining which hand the user is operating the device with that offers a wider range of application scenarios and high accuracy.
Disclosure of Invention
A brief summary of the disclosure is provided in this section to provide a basic understanding of some aspects of the disclosure. However, it should be understood that this summary is not an exhaustive overview of the disclosure. It is not intended to identify key or critical elements of the disclosure or to delineate the scope of the disclosure. Its sole purpose is to present some concepts of the disclosure in a simplified form as a prelude to the more detailed description that is presented later.
According to an aspect of the present disclosure, there is provided an electronic device for a handheld device, comprising processing circuitry configured to: obtaining distance data associated with one or more physical features of a user of the handheld device; determining a direction of the handheld device relative to the user based on the distance data; determining, based on the determined direction, which hand the user is currently using to operate the handheld device; and presenting a User Interface (UI) suited to operation by that hand according to the determination result.
According to another aspect of the present disclosure, there is provided a handheld device comprising: a depth camera to obtain a depth image relating to a user of the handheld device; and processing circuitry configured to: obtaining, based on the depth image, distance data associated with one or more physical features of the user; determining a direction of the handheld device relative to the user based on the distance data; determining, based on the determined direction, which hand the user is currently using to operate the handheld device; and presenting a User Interface (UI) suited to operation by that hand according to the determination result.
According to another aspect of the present disclosure, there is provided a method for a handheld device, comprising: obtaining distance data associated with one or more physical features of a user of the handheld device; determining a direction of the handheld device relative to the user based on the distance data; determining, based on the determined direction, which hand the user is currently using to operate the handheld device; and presenting a User Interface (UI) suited to operation by that hand according to the determination result.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium having stored thereon executable instructions that, when executed, implement the method described above.
By applying one or more aspects of the present disclosure, the hand with which a user operates a handheld device can be determined conveniently and accurately, and a UI suited to the user's operation can be presented.
Drawings
The disclosure may be better understood by reference to the following detailed description taken in conjunction with the accompanying drawings, in which like or similar reference numerals are used throughout the figures to designate like or similar elements. The accompanying drawings, which are incorporated in and form a part of the specification, further illustrate the embodiments of the present disclosure and explain the principles and advantages of the disclosure. Wherein:
FIGS. 1A-1B show schematic diagrams of determining the operating hand in the prior art;
FIG. 2 shows a hardware configuration of a smartphone as an example of a handheld device;
FIG. 3A shows a block diagram of an electronic device according to the present disclosure;
FIG. 3B shows a flow chart of a method according to the present disclosure;
FIGS. 4A-4B illustrate an example of determining the orientation of a handheld device relative to a user;
FIGS. 5A-5B illustrate another example of determining the orientation of a handheld device relative to a user;
FIGS. 6A-6B illustrate another example of determining the orientation of a handheld device relative to a user.
Detailed Description
Various exemplary embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. In the interest of clarity and conciseness, not all implementations of embodiments have been described in this specification. It should be noted, however, that many implementation-specific settings may be made in implementing embodiments of the present disclosure according to particular needs.
Further, it should be noted that only process steps and/or equipment structures germane to the technical solutions of the present disclosure are illustrated in the drawings in order to avoid obscuring the present disclosure with unnecessary detail. The following description of exemplary embodiments is merely illustrative and is not intended to serve as any limitation on the present disclosure and its application.
Fig. 2 is a block diagram showing a hardware configuration of a smartphone 1600 as an example of a handheld device of the present disclosure. It should be understood that although the present disclosure is described with a smart phone as an example, a handheld device to which the technical contents of the present disclosure can be applied is not limited to the smart phone, but can be implemented as various types of mobile terminals, such as a tablet computer, a Personal Computer (PC), a palmtop computer, a smart assistant, a portable game terminal, a portable digital camera, and the like.
The smartphone 1600 includes a processor 1601, memory 1602, storage 1603, external connection interface 1604, RGB camera 1605, depth camera 1606, sensors 1607, microphone 1608, input device 1609, display 1610, speaker 1611, wireless communication subsystem 1612, bus 1617, battery 1618, etc., which are connected to each other by bus 1617. Battery 1618 provides power to the various components of smartphone 1600 via feed lines (not shown in fig. 2).
Processor 1601 may be implemented as, for example, a CPU or a system on a chip (SoC), and generally controls the functions of smartphone 1600. Processor 1601 may include various implementations of digital circuitry, analog circuitry, or mixed-signal (a combination of analog and digital) circuitry that performs a function in a computing system, such as Integrated Circuits (ICs), Application Specific Integrated Circuits (ASICs), portions or circuits of an individual processor core, an entire processor core, an individual processor, a programmable hardware device such as a Field Programmable Gate Array (FPGA), and/or a system that includes multiple processors.
The memory 1602 is used to store data and programs executed by the processor 1601 and may be, for example, volatile memory and/or nonvolatile memory including, but not limited to, Random Access Memory (RAM), Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM), Read Only Memory (ROM), flash memory, and the like. The storage device 1603, in addition to the memory 1602, may include a storage medium such as a semiconductor memory and a hard disk. The external connection interface 1604 is an interface for connecting external devices, such as a memory card and a Universal Serial Bus (USB) device, to the smartphone 1600.
The input device 1609 includes, for example, a keyboard, a keypad, keys, or switches for receiving operations or information input from a user. The display device 1610 includes a screen, such as a Liquid Crystal Display (LCD) or an Organic Light Emitting Diode (OLED) display, and displays output images of the smartphone 1600, such as a UI including virtual operation components. Typically, the display device 1610 may be implemented as a touch screen including a touch sensor configured to detect a touch on the screen of the display device 1610, so that the touch screen serves both as a display device and as an input device.
The microphone 1608 converts sound input to the smartphone 1600 into an audio signal. With voice recognition technology, instructions from a user can be input to the smartphone 1600 through the microphone 1608 in the form of voice, whereby the microphone 1608 can also serve as an input device. The speaker 1611 converts an audio signal output from the smartphone 1600 into sound.
The wireless communication subsystem 1612 is for performing wireless communication. The wireless communication subsystem 1612 may support any cellular communication scheme, such as 4G LTE or 5G NR, etc., or another type of wireless communication scheme, such as a short-range wireless communication scheme, a near-field communication scheme, and a wireless Local Area Network (LAN) scheme. The wireless communication subsystem 1612 may include, for example, a BB processor and RF circuitry, antenna switches, antennas, and so forth.
The smartphone 1600 also includes an RGB camera 1605 for capturing ambient images. The RGB camera 1605 includes optics and an image sensor, such as a Charge Coupled Device (CCD) sensor and a Complementary Metal Oxide Semiconductor (CMOS) sensor. The light passing through the optical components is captured and photoelectrically converted by an image sensor and processed by, for example, a processor 1601 to generate an image. The smartphone 1600 may include more than one RGB camera 1605, e.g., multiple cameras in front or rear. The RGB camera 1605 may be composed of a wide-angle camera module, a super wide-angle camera module, a telephoto camera module, etc. to provide excellent photographing effects.
In addition to the conventional RGB camera 1605, the smartphone 1600 may also include a depth camera 1606 for obtaining depth information. Depth cameras, sometimes also referred to as 3D cameras, can detect the depth-of-field distance of the captured scene and are increasingly popular in application scenarios such as object recognition, behavior recognition, and scene modeling. Compared with a conventional camera, a depth camera adds depth measurement to its functions, so the surrounding environment and its changes can be sensed more conveniently and accurately. Depending on the technology used, the depth camera 1606 may include, but is not limited to, a Time of Flight (TOF) camera, a structured light camera, a binocular stereo vision camera, and the like. These depth cameras are briefly described below.
A TOF camera measures distance in an active detection mode by measuring the time of flight of light. Specifically, a light emitter (e.g., an LED or laser diode) continuously emits modulated light pulses, typically invisible infrared light, toward a target; the light pulses are reflected when they encounter an object, and the reflected light is received by a light receiver (e.g., a specially designed CMOS sensor). Because light travels so fast, measuring its time of flight directly is impractical; instead, the phase shift of the modulated light wave is typically detected. According to the modulation method, TOF techniques can generally be classified into two types: Pulse Modulation (Pulsed Modulation) and Continuous Wave Modulation. Since the speed of light and the wavelength of the modulated light are known, an arithmetic unit can quickly and accurately calculate the depth distance to the object from the phase shift between the transmitted and the received light. A TOF camera obtains depth information for the whole image at once, i.e., two-dimensional depth point cloud information, in which the value of each point represents the distance between the camera and the object (i.e., a depth value) rather than a light intensity value as in an RGB camera. The main advantages of TOF cameras are: 1) a long detection distance, which can reach dozens of meters given sufficient laser energy; and 2) relatively little interference from ambient light. However, TOF technology also has some obvious drawbacks, such as demanding hardware requirements, in particular for the time measurement module; high computational resource consumption, since detecting the phase shift requires multiple sampling integrations; and, constrained by resource consumption and filtering, neither the frame rate nor the resolution can be made very high.
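By way of illustration only (this sketch is not part of the patent; the function name and example values are assumptions), the continuous-wave relationship between the measured phase shift and depth can be written in Python as:

import math

C = 299_792_458.0  # speed of light in m/s

def cw_tof_depth(phase_shift_rad, modulation_freq_hz):
    # Round-trip time is phase_shift / (2 * pi * f), so the one-way
    # distance is c * phase_shift / (4 * pi * f).
    return C * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

# Example: a phase shift of pi/2 at a 20 MHz modulation frequency
# corresponds to about 1.87 m.
print(cw_tof_depth(math.pi / 2, 20e6))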
A structured light camera also uses an active detection mode. Its principle is that light with a known structure (such as a coded pattern or pseudo-random speckle spots) is projected onto the object being captured by a light emitter (such as a near-infrared laser), and the projected structured light pattern is captured by a dedicated light receiver. Because different regions of the object lie at different depths, the captured pattern is deformed and yields different image phase information; an arithmetic unit then converts this change of the structure into depth information according to the triangulation principle. Compared with a TOF camera, a structured light camera requires less computation, consumes less power, and is more accurate at close range, which gives it great advantages in face recognition and gesture recognition. However, it is easily disturbed by ambient light, performs poorly outdoors, and its accuracy degrades as the detection distance increases.
A binocular stereo vision camera uses a passive detection mode. Binocular stereo vision is an important form of machine vision: two images of the object to be measured are acquired from different positions by imaging devices (such as conventional RGB cameras), and the three-dimensional geometric information of the object is obtained by calculating the positional disparity between corresponding points of the two images based on the parallax principle. A binocular stereo vision camera has low hardware requirements, since ordinary RGB cameras can be used, and it is suitable for indoor and outdoor use as long as the lighting is adequate. Its disadvantages are also obvious, however: it is very sensitive to ambient lighting, unsuitable for monotonous scenes lacking texture, computationally complex, and its measurement range is limited by the baseline.
The smartphone 1600 may also include other sensors 1607, such as light sensors, gravity sensors, proximity sensors, fingerprint sensors, and so forth. In addition, the smartphone 1600 is generally equipped with a gyro sensor, an acceleration sensor, and the like so as to detect the motion state or posture thereof.
It should be understood that the above schematically describes a smartphone and its representative components as an example of a handheld device, and that these components are not all required by a handheld device according to the present disclosure.
According to embodiments of the present disclosure, a handheld device may adaptively present a UI according to which hand the user is operating the device. An electronic device for adaptive UI presentation and a method performed by the same according to the present disclosure will be described below with reference to fig. 3A and 3B.
Fig. 3A shows a block diagram of an electronic device 1000. The electronic device 1000 comprises processing circuitry 1001 and potentially other circuitry. The processing circuit 1001 may be implemented, for example, as a processor such as the processor 1601 described above. As shown in fig. 3A, the processing circuit 1001 includes an acquisition unit 1002, a determination unit 1003, and a presentation unit 1004, and may be configured to perform the method shown in fig. 3B.
The obtaining unit 1002 of the processing circuit 1001 is configured to obtain distance data associated with one or more physical features of a user of the handheld device (i.e. to perform step S1001 in fig. 3B).
According to embodiments of the present disclosure, the physical features of the user may include various facial features or other physical features. In one example, the physical features of the user include the forehead, the nose tip, the mouth, the chin, and the like. In another example, the user's physical features include a pair of left-right symmetric features, such as the eyes or the shoulders. Which body features the acquisition unit 1002 acquires distance data for may depend on the specific determination method, which will be described in detail below.
Distance data associated with a physical feature of the user may be acquired from a depth image of the user taken by a depth camera. The depth image takes as pixel values the distance (depth) from the light receiver (e.g., a TOF sensor) of the depth camera to the user's body. That is, the depth image contains distance information of the captured physical features of the user. Typically, the depth image may be represented as a two-dimensional point cloud. If a coordinate system is established by taking the two directions of the sensor plane of the depth camera as the X and Y directions (for example, the horizontal direction as the X direction and the vertical direction as the Y direction) and the normal direction of the sensor plane (the depth direction) as the Z direction, a pixel value of the depth image may be represented as (x_i, y_i, z_i), where x_i, y_i, and z_i are respectively the distances along the three dimensions.
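For concreteness, the following is a minimal Python sketch of how a raw depth map might be converted into the (x_i, y_i, z_i) representation described above, assuming a simple pinhole camera model; the intrinsic values and the function name are hypothetical and not taken from the patent:

import numpy as np

def depth_pixel_to_xyz(u, v, depth_m, fx, fy, cx, cy):
    # Back-project pixel (u, v) with depth value depth_m (metres) into
    # camera coordinates (x, y, z) using pinhole intrinsics.
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Example with assumed intrinsics of a VGA-resolution depth sensor.
fx = fy = 570.0
cx, cy = 320.0, 240.0
print(depth_pixel_to_xyz(400, 250, 0.45, fx, fy, cx, cy))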
In one example, the processing circuitry 1001 may perform image recognition processing on the depth image obtained by the depth camera to identify desired body features. Many such image recognition processes exist in the prior art, such as various classification methods, artificial intelligence, etc., and are not described here in detail. Then, the acquisition unit 1002 may obtain the corresponding pixel value from the depth image as distance data associated with the body feature.
In another example, the handheld device may use an RGB camera and a depth camera to capture images of the user simultaneously, and the RGB image obtained by the RGB camera and the depth image obtained by the depth camera may be aligned with each other. Some devices even integrate the RGB camera and the depth camera, so that such a camera can simultaneously obtain the RGB values and the distance of the user as pixel values. The processing circuit 1001 may perform image recognition processing on the RGB image to identify the desired physical feature. Then, based on the correspondence between the pixel points of the depth image and the RGB image, the obtaining unit 1002 may obtain the corresponding pixel value from the depth image as distance data associated with the body feature.
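As a sketch of this lookup (the landmark detector named below is hypothetical; any off-the-shelf face-landmark detector could play that role):

import numpy as np

def feature_distance(depth_image, feature_uv):
    # Read the distance value for a body feature located at pixel (u, v)
    # of an RGB image that is aligned pixel-for-pixel with the depth image.
    u, v = feature_uv
    return float(depth_image[v, u])

# nose_uv = detect_nose_tip(rgb_image)      # hypothetical detector
# nose_distance = feature_distance(depth_image, nose_uv)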
The determination unit 1003 may be configured to determine, based on the distance data acquired by the acquisition unit 1002, which hand the user is currently using to operate the handheld device (i.e., to perform step S1002 in FIG. 3B). In particular, based on the distance data associated with one or more physical features of the user, the determination unit 1003 may first determine the orientation of the handheld device relative to the user. Several examples of determining the orientation of a handheld device relative to a user are presented below.
In one example, the determination unit 1003 may calculate an azimuth angle of a body feature relative to the handheld device based on the distance data associated with that feature. FIGS. 4A and 4B show, in front and top views, the orientation of the nose tip relative to the handheld device when the user holds the handheld device in the right hand and in the left hand, respectively. It should be understood that although FIGS. 4A and 4B take the nose tip as an example of a physical feature, the disclosure is not so limited, and the physical feature may also be the forehead, the mouth, the chin, etc.
As shown in the figures, if a coordinate system is established with the sensor of the depth camera as its center, the azimuth angle of the tip of the user's nose may be calculated as

θ = arctan(x / z)

where x is the distance of the nose tip in the X direction (e.g., the horizontal direction) and z is the distance of the nose tip in the Z direction (e.g., the normal direction of the sensor plane, i.e., the depth direction). As shown in FIG. 4A, when the handheld device is located to the right of the user, x is a positive value and thus θ is greater than 0; and when the handheld device is located to the left of the user, as shown in FIG. 4B, x is a negative value and thus θ is less than 0. Therefore, by comparing the calculated azimuth angle with a predetermined threshold (e.g., 0), the orientation of the handheld device relative to the user can be determined.
It should be understood that the predetermined threshold here is not limited to 0, but may be set appropriately taking tolerances into account, such as determining that the handheld device is to the right of the user if the calculated azimuth angle θ is greater than 5°, 10°, or another threshold, and determining that the handheld device is to the left of the user if the calculated azimuth angle θ is less than -5°, -10°, or another threshold.
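A minimal Python sketch of this single-feature azimuth test, following the sign convention and thresholds described above (the names and example values are assumptions, not code from the patent):

import math

def device_side_from_feature(x, z, threshold_deg=5.0):
    # Positive azimuth: device to the user's right; negative: to the left.
    theta = math.degrees(math.atan2(x, z))  # azimuth = arctan(x / z)
    if theta > threshold_deg:
        return "right"
    if theta < -threshold_deg:
        return "left"
    return "undetermined"

# Example: nose tip 6 cm to the side and 30 cm away -> about 11.3 degrees.
print(device_side_from_feature(0.06, 0.30))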
In another example, the determination unit 1003 may consider not just one physical feature but two or even more physical features, such as the eyes or the shoulders. FIGS. 5A and 5B show examples of calculating azimuth angles for the two eyes. As shown in FIG. 5A, when the handheld device is located to the right of the user, the azimuth angle θR of the right eye should be less than the azimuth angle θL of the left eye; and when the handheld device is located to the left of the user, the azimuth angle θR of the right eye should be greater than the azimuth angle θL of the left eye. Thus, by comparing the azimuth angles of the two eyes, it can be determined whether the handheld device is located to the left or to the right of the user.
Indeed, as can be seen from FIGS. 5A and 5B, the side on which the handheld device is located relative to the user always corresponds to the physical feature (e.g., the eye) having the smaller azimuth angle. For example, in FIG. 5A the handheld device is located to the right of the user, which corresponds to the right eye having the smaller azimuth angle, while in FIG. 5B the handheld device is located to the left of the user, which corresponds to the left eye having the smaller azimuth angle. In other words, the operating hand is on the same side as the physical feature having the smaller azimuth angle.
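The two-feature comparison can be sketched the same way in Python (azimuth magnitudes are compared here; the names and values are assumptions):

import math

def device_side_from_eye_azimuths(left_eye_xz, right_eye_xz):
    # The device lies on the same side as the eye with the smaller azimuth.
    theta_left = abs(math.atan2(left_eye_xz[0], left_eye_xz[1]))
    theta_right = abs(math.atan2(right_eye_xz[0], right_eye_xz[1]))
    return "right" if theta_right < theta_left else "left"

# Example: right eye almost on-axis, left eye well off-axis -> "right".
print(device_side_from_eye_azimuths((-0.12, 0.30), (-0.03, 0.28)))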
In yet another example, the determination unit 1003 may calculate the distances between the physical features and the handheld device based on distance data associated with a pair of physical features. FIGS. 6A and 6B show examples of calculating these distances for the eyes. It should be understood that the physical features considered here are not limited to the eyes, but may be any pair of left-right symmetric physical features, such as the shoulders. With the coordinate system established as above, the distance between an eye and the handheld device can be calculated as

d = √(x² + z²)

where x is the distance of the eye in the X direction (e.g., the horizontal direction) and z is the distance of the eye in the Z direction (e.g., the normal direction of the sensor plane, i.e., the depth direction).
As shown in FIGS. 6A and 6B, the distance dR for the right eye should be less than the distance dL for the left eye when the handheld device is to the right of the user, and dR should be greater than dL when the handheld device is to the left of the user. Thus, by comparing the distances of the two eyes, it can be determined whether the handheld device is located to the left or to the right of the user. Likewise, it can be seen that the side on which the handheld device is located relative to the user always corresponds to the eye having the smaller distance. In other words, the operating hand is on the same side as the physical feature having the smaller distance.
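And a matching sketch for the distance-based comparison (same hypothetical naming conventions as above):

import math

def device_side_from_eye_distances(left_eye_xz, right_eye_xz):
    # The device lies on the same side as the eye with the smaller
    # camera-to-eye distance d = sqrt(x^2 + z^2).
    d_left = math.hypot(*left_eye_xz)
    d_right = math.hypot(*right_eye_xz)
    return "right" if d_right < d_left else "left"

# Example: the right eye is slightly closer to the camera -> "right".
print(device_side_from_eye_distances((-0.12, 0.32), (-0.03, 0.29)))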
Although several exemplary methods of determining the orientation of a handheld device relative to a user based on depth data associated with body features are described above, the disclosure is not so limited; any method may be used as long as the same goal can be achieved. For example, a machine learning approach may be adopted: a model, such as a neural network model, may be trained using depth images or distance data as the input training set and the orientation of the handheld device with respect to the user as the output training set. In use, the determination unit 1003 may feed the corresponding depth images or distance data acquired by the acquisition unit 1002 into the model to obtain its predicted output.
After determining the orientation of the handheld device relative to the user, the determination unit 1003 may determine which hand the user is holding the handheld device with. Typically, when a user holds the handheld device in the left hand, the device is located toward the front-left of the user, and when the user holds it in the right hand, the device is located toward the front-right of the user. Thus, the determination unit 1003 can determine the hand with which the user operates the handheld device according to the direction of the handheld device relative to the user.
There may be situations where the user holds the handheld device in the left hand but the handheld device is located to the user's right or vice versa. In this case, the determination unit 1003 may obtain an erroneous determination result. According to an embodiment of the present disclosure, the processing circuit 1001 may also correct the determination result of the determination unit 1003 using sensing data from an additional sensor other than the depth camera.
In one example, the processing circuit 1001 may detect the motion trajectory of the handheld device using sensing data from the acceleration sensor. If, for instance, the handheld device has moved from left to right across the front of the user and now lies to the user's right, the processing circuit 1001 may combine this detected trajectory with the direction-based result and correct the determination of the determination unit 1003 to the left hand, and vice versa.
In another example, the processing circuit 1001 may detect the posture of the handheld device using sensing data from the gyro sensor and combine it with ergonomics and the face position to comprehensively judge the user's operating hand.
In still another example, the processing circuit 1001 may control the display of the determination result of the determination unit 1003 and prompt the user for confirmation. For example, a prompt such as "The current operating hand is detected as the left hand, is this correct?" may be displayed, and the user may confirm by tapping "yes" or "no".
Returning to FIGS. 3A and 3B, the presentation unit 1004 of the processing circuit 1001 may present a UI suited to operation by the determined hand (i.e., perform step S1003 in FIG. 3B). For example, when it is determined that the user is holding the handheld device with the left hand, the presentation unit 1004 may present the UI components that require user operation close to the left side of the screen, within reach of the fingers of the user's left hand. Conversely, when it is determined that the user is holding the handheld device with the right hand, the presentation unit 1004 may present the UI components that require user operation close to the right side of the screen, within reach of the fingers of the user's right hand.
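A rough illustration of such adaptive placement (purely hypothetical layout logic, not an API described in the patent):

def place_action_button(screen_width_px, button_width_px, usage_hand, margin_px=24):
    # Put a frequently used button near the thumb of the detected hand.
    if usage_hand == "left":
        return margin_px
    return screen_width_px - button_width_px - margin_px

# Example: a 1080 px wide screen, 160 px button, right-handed operation.
print(place_action_button(1080, 160, "right"))  # -> 896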
The operating-hand determination method of the present disclosure can infer which hand a user is holding and operating a handheld device with, using only a depth camera commonly provided on such devices, without requiring a dedicated sensor. In addition, the method is less affected by illumination conditions, so wide application scenarios and high accuracy can be ensured.
The above-described method may be performed in many situations to make it more convenient for a user to operate the UI.
For example, the technique of the present disclosure may be performed when the user unlocks the handheld device: the depth camera may be started to capture a depth image while the user performs facial recognition or presses an unlocking key, and the operating hand may be determined based on the captured depth image, so that a UI suited to the user's operation is presented directly on the unlocked screen.
Alternatively, execution of the method of the present disclosure may be triggered when a change in the posture or usage scenario of the handheld device is detected, for example by a gyroscope sensor or an acceleration sensor, such as the user changing from standing to lying down, the user picking up the phone, or the user switching between the left and right hands.
Further, the operating-hand determination of the present disclosure may be initiated when a reversal in the orientation of the handheld device relative to the user is detected by a camera or other sensor, e.g., the device moving from the user's left to the right or from the user's right to the left. The handheld device may then present a query on the screen asking the user whether the operating hand has changed; if a response confirming the change is received from the user, a changed UI is subsequently presented, whereas if the user responds that the operating hand has not changed, the UI need not be changed.
Various implementations of implementing the concepts of the present disclosure are contemplated in accordance with embodiments of the present disclosure, including but not limited to:
1) An electronic device for a handheld device, comprising: processing circuitry configured to: obtaining distance data associated with one or more physical features of a user of the handheld device; determining a direction of the handheld device relative to the user based on the distance data; determining, based on the determined direction, which hand the user is currently using to operate the handheld device; and presenting a User Interface (UI) suited to operation by that hand according to the determination result.
2) The electronic device of 1), wherein the processing circuit is further configured to: the one or more body features are identified based on a depth image about the user obtained by a depth camera, and the distance data is acquired.
3) The electronic device of 1), wherein the processing circuit is further configured to: identifying the one or more physical features based on RGB images obtained by an RGB camera with respect to the user; and obtaining distance data associated with the identified one or more body features from a depth image obtained by the depth camera.
4) The electronic device of 1), wherein the processing circuit is further configured to: based on the distance data, an azimuth angle of one or more physical features of the user relative to the handheld device is calculated, and a direction of the handheld device relative to the user is determined based on the azimuth angle.
5) The electronic device of 1), wherein the processing circuit is further configured to: based on the distance data, calculating and comparing distances between a pair of bilaterally symmetric body features of the user and the handheld device, and determining an orientation of the handheld device relative to the user based on the distances.
6) The electronic device of 1), wherein the physical characteristic comprises at least one of: nose tip, chin, forehead, mouth, eyes, shoulders.
7) The electronic device of 1), wherein the processing circuit is further configured to: presenting a query to the user as to whether to change the usage hand upon a reverse change in orientation of the handheld device relative to the user; receiving a response to the query from the user; determining whether to change the User Interface (UI) based on the received response.
8) The electronic device of 1), wherein the depth camera is one of: time of flight (TOF) cameras, structured light cameras, binocular stereo vision cameras.
9) A handheld device, comprising: a depth camera to obtain a depth image relating to a user of the handheld device; and processing circuitry configured to: obtaining, based on the depth image, distance data associated with one or more physical features of the user; determining a direction of the handheld device relative to the user based on the distance data; determining, based on the determined direction, which hand the user is currently using to operate the handheld device; and presenting a User Interface (UI) suited to operation by that hand according to the determination result.
10) The handheld device of 9), wherein the processing circuit is further configured to: based on the depth image, the one or more body features are identified and the distance data is acquired.
11) The handheld device of 9), further comprising an RGB camera for capturing RGB images, wherein the processing circuit is further configured to: identifying the one or more physical features based on RGB images captured by an RGB camera with respect to the user; and obtaining distance data associated with the identified one or more body features from a depth image obtained by the depth camera.
12) The handheld device of 9), wherein the processing circuit is further configured to: based on the distance data, an azimuth angle of one or more physical features of the user relative to the handheld device is calculated, and a direction of the handheld device relative to the user is determined based on the azimuth angle.
13) The handheld device of 9), wherein the processing circuit is further configured to: based on the distance data, calculating and comparing distances between a pair of bilaterally symmetric body features of the user and the handheld device, and determining an orientation of the handheld device relative to the user based on the distances.
14) The handheld device of 9), wherein the depth camera is one of: time of flight (TOF) cameras, structured light cameras, binocular stereo vision cameras.
15) A method for a handheld device, comprising: obtaining distance data associated with one or more physical features of a user of the handheld device; determining a direction of the handheld device relative to the user based on the distance data; determining, based on the determined direction, which hand the user is currently using to operate the handheld device; and presenting a User Interface (UI) suited to operation by that hand according to the determination result.
16) A non-transitory computer readable storage medium storing executable instructions that when executed perform the method according to 15).
The exemplary embodiments of the present disclosure are described above with reference to the drawings, but the present disclosure is of course not limited to the above examples. Various changes and modifications within the scope of the appended claims may be made by those skilled in the art, and it should be understood that these changes and modifications naturally will fall within the technical scope of the present disclosure.
For example, a plurality of functions included in one unit may be implemented by separate devices in the above embodiments. Alternatively, a plurality of functions implemented by a plurality of units in the above embodiments may be implemented by separate devices, respectively. In addition, one of the above functions may be implemented by a plurality of units. Needless to say, such a configuration is included in the technical scope of the present disclosure.
In this specification, the steps described in the flowcharts include not only the processing performed in time series in the described order but also the processing performed in parallel or individually without necessarily being performed in time series. Further, even in the steps processed in time series, needless to say, the order can be changed as appropriate.
Although the present disclosure and its advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the disclosure as defined by the appended claims. Also, the terms "comprises," "comprising," or any other variation thereof, of the embodiments of the present disclosure are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. An electronic device for a handheld device, comprising:
processing circuitry configured to:
obtaining distance data associated with one or more physical features of a user of the handheld device;
determining a direction of the handheld device relative to the user based on the distance data;
determining, based on the determined direction, a current usage hand of the user operating the handheld device; and
presenting a User Interface (UI) suited to operation by that hand according to the determination result.
2. The electronic device of claim 1, wherein the processing circuit is further configured to:
the one or more body features are identified based on a depth image about the user obtained by a depth camera, and the distance data is acquired.
3. The electronic device of claim 1, wherein the processing circuit is further configured to:
identifying the one or more physical features based on RGB images obtained by an RGB camera with respect to the user; and
distance data associated with the identified one or more body features is acquired from a depth image obtained by a depth camera.
4. The electronic device of claim 1, wherein the processing circuit is further configured to:
based on the distance data, an azimuth angle of one or more physical features of the user relative to the handheld device is calculated, and a direction of the handheld device relative to the user is determined based on the azimuth angle.
5. The electronic device of claim 1, wherein the processing circuit is further configured to:
based on the distance data, calculating and comparing distances between a pair of bilaterally symmetric body features of the user and the handheld device, and determining an orientation of the handheld device relative to the user based on the distances.
6. The electronic device of claim 1, wherein the physical characteristic comprises at least one of: nose tip, chin, forehead, mouth, eyes, shoulders.
7. The electronic device of claim 1, wherein the processing circuit is further configured to:
presenting a query to the user as to whether to change the usage hand upon a reverse change in orientation of the handheld device relative to the user;
receiving a response to the query from the user;
determining whether to change the User Interface (UI) based on the received response.
8. The electronic device of claim 1, wherein the depth camera is one of: time of flight (TOF) cameras, structured light cameras, binocular stereo vision cameras.
9. A handheld device, comprising:
a depth camera to obtain a depth image in relation to a user of the handheld device; and
a processing circuit configured to:
based on the depth image, obtaining distance data associated with one or more physical features of the user;
determining a direction of the handheld device relative to the user based on the distance data;
determining, based on the determined direction, a use hand of the user currently operating the handheld device;
and presenting a User Interface (UI) suited to operation by that hand according to the determination result.
10. The handheld device of claim 9, wherein the processing circuit is further configured to:
based on the depth image, the one or more body features are identified and the distance data is acquired.
CN202011216016.8A 2020-11-04 2020-11-04 Electronic device, method, and storage medium Pending CN114449069A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202011216016.8A CN114449069A (en) 2020-11-04 2020-11-04 Electronic device, method, and storage medium
PCT/CN2021/128556 WO2022095915A1 (en) 2020-11-04 2021-11-04 Electronic device, method and storage medium
CN202180073970.2A CN116391163A (en) 2020-11-04 2021-11-04 Electronic device, method, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011216016.8A CN114449069A (en) 2020-11-04 2020-11-04 Electronic device, method, and storage medium

Publications (1)

Publication Number Publication Date
CN114449069A true CN114449069A (en) 2022-05-06

Family

ID=81362133

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011216016.8A Pending CN114449069A (en) 2020-11-04 2020-11-04 Electronic device, method, and storage medium
CN202180073970.2A Pending CN116391163A (en) 2020-11-04 2021-11-04 Electronic device, method, and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202180073970.2A Pending CN116391163A (en) 2020-11-04 2021-11-04 Electronic device, method, and storage medium

Country Status (2)

Country Link
CN (2) CN114449069A (en)
WO (1) WO2022095915A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110289455A1 (en) * 2010-05-18 2011-11-24 Microsoft Corporation Gestures And Gesture Recognition For Manipulating A User-Interface
CN105302448A (en) * 2014-06-18 2016-02-03 中兴通讯股份有限公司 Method and apparatus for adjusting interface of mobile terminal and terminal
US20160224123A1 (en) * 2015-02-02 2016-08-04 Augumenta Ltd Method and system to control electronic devices through gestures
CN106527850A (en) * 2016-10-31 2017-03-22 维沃移动通信有限公司 Message display method and mobile terminal
EP3373114B1 (en) * 2017-03-10 2019-10-09 Alterface Projects Tracking system and method using a depth aware camera
JP2019219904A (en) * 2018-06-20 2019-12-26 ソニー株式会社 Program, recognition apparatus, and recognition method
CN111125659A (en) * 2018-10-31 2020-05-08 北京小米移动软件有限公司 Input component, unlocking method, electronic device and machine-readable storage medium
CN109656359A (en) * 2018-11-26 2019-04-19 深圳奥比中光科技有限公司 3D body feeling interaction adaptation method, system, terminal device and readable storage medium storing program for executing

Also Published As

Publication number Publication date
WO2022095915A1 (en) 2022-05-12
CN116391163A (en) 2023-07-04

Similar Documents

Publication Publication Date Title
US12117284B2 (en) Method and apparatus for measuring geometric parameter of object, and terminal
JP6858650B2 (en) Image registration method and system
EP2984541B1 (en) Near-plane segmentation using pulsed light source
US20190025544A1 (en) Imaging apparatus and focus control method
US20140037135A1 (en) Context-driven adjustment of camera parameters
US9465444B1 (en) Object recognition for gesture tracking
US9560273B2 (en) Wearable information system having at least one camera
CN112005548B (en) Method of generating depth information and electronic device supporting the same
KR101985674B1 (en) Method of recognizing contactless user interface motion and System there-of
US20150009119A1 (en) Built-in design of camera system for imaging and gesture processing applications
US11048923B2 (en) Electronic device and gesture recognition method thereof
US10607069B2 (en) Determining a pointing vector for gestures performed before a depth camera
JP2020525958A (en) Image processing system and image processing method
CN113260951A (en) Fade-in user interface display based on finger distance or hand proximity
CN115526983B (en) Three-dimensional reconstruction method and related equipment
CN118317069A (en) Multidimensional rendering
WO2022161011A1 (en) Method for generating image and electronic device
CN111127541A (en) Vehicle size determination method and device and storage medium
WO2022095915A1 (en) Electronic device, method and storage medium
CN113033590B (en) Image feature matching method, device, image processing equipment and storage medium
CN115066882A (en) Electronic device and method for performing auto-focusing
CN115499576A (en) Light source estimation method, device and system
CN117726926B (en) Training data processing method, electronic device, and computer-readable storage medium
US20240054694A1 (en) Electronic device for placing object according to space in augmented reality and operation method of electronic device
WO2021253308A1 (en) Image acquisition apparatus

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20220506