WO2022131659A1 - Electronic apparatus and control method therefor - Google Patents

Electronic apparatus and control method therefor

Info

Publication number
WO2022131659A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
screen
magnetic field
processor
data
Prior art date
Application number
PCT/KR2021/018490
Other languages
English (en)
Korean (ko)
Inventor
디팩 아가왈데쉬
타얄로힛
아가왈안킷
오베로이하르쉬
라투르선일
콘하리초이스
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020210119262A (published as KR20220084996A)
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2022131659A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/046Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning

Definitions

  • the present invention relates to an electronic device that can be controlled using an object without a touch, and a method for controlling the same.
  • BACKGROUND: Electronic devices such as e-book readers, smartphones, and tablet computers include touch sensor panels and are becoming increasingly popular due to the ease and versatility of operation on a touch screen.
  • an active digitizer sheet may be integrated with a touch sensor panel to sense a touch input of a stylus.
  • FIG. 1 illustrates an electronic device 11 with a stylus 13 .
  • the user may provide the touch input 12 to the electronic device 11 by holding the stylus 13 like a pen or pencil.
  • the active digitizer sheet consumes power of the electronic device 11, so the battery of the electronic device 11 drains quickly.
  • the stylus 13 may be inserted into the cavity of the electronic device 11 .
  • the weight of the electronic device 11 increases, and the manufacturing cost of the electronic device 11 also increases due to additional hardware such as the stylus 13 .
  • Disclosed are an electronic device that can be controlled using an object, such as a ferromagnetic object, without touching the electronic device, and a control method thereof.
  • An electronic device includes a screen; a magnetometer sensor for acquiring magnetic field data in front of the screen; and a processor, wherein, when detecting an object hovering over the screen based on the magnetic field data, the processor determines a distance threshold value corresponding to the object based on the size and type of the object, and, if the distance between the object and the screen is less than the distance threshold value, performs an operation corresponding to the position of the object.
  • the processor may be configured to detect that the object is hovering over the screen when determining that a rate of change of the magnetic field in front of the screen exceeds a magnetic field threshold based on the magnetic field data.
  • the electronic device may include a speaker and a microphone, wherein the processor controls the speaker to output a sound signal toward the object when detecting the object, and determines the size of the object based on data of the sound signal that is output from the speaker, reflected from the object, and received by the microphone.
  • the processor may determine the type of the object based on an output of a machine learning (ML) model to which the magnetic field data is input.
  • the processor may determine the position of the object based on the speed of the object, the magnetic field data, and the direction of the electronic device.
  • the electronic device may include an accelerometer sensor; and a gyro sensor, wherein the processor determines the speed of the object based on accelerometer data output from the accelerometer sensor, and determines the direction of the electronic device based on direction data output from the gyro sensor.
  • the processor may be configured to determine a position vector of the object with respect to a three-dimensional Cartesian coordinate system based on the speed of the object and the magnetic field data, and to determine the position of the object relative to the screen based on the position vector of the object and the direction of the electronic device.
  • the processor may determine a point indicated by the object on the screen based on the position of the object, determine a user interface data item including the point, and perform an operation corresponding to the user interface data item.
  • the processor may determine a movement form of the object on the screen based on the position of the object, and control the screen to display a character corresponding to the movement form of the object.
  • the processor may change the user profile of the electronic device from a first user profile corresponding to a first portion of the screen to a second user profile corresponding to a second portion of the screen when the position of the object moves from the first portion to the second portion.
  • A method of controlling an electronic device including a screen and a magnetometer sensor for acquiring magnetic field data in front of the screen includes: when detecting an object hovering over the screen based on the magnetic field data, determining a distance threshold value corresponding to the object based on the size and type of the object; and performing an operation corresponding to the position of the object when the distance between the object and the screen is less than the distance threshold value.
  • Detecting the object may include detecting that the object hovers over the screen when it is determined based on the magnetic field data that the rate of change of the magnetic field in front of the screen exceeds a magnetic field threshold.
  • the electronic device may include a speaker and a microphone, and the method of controlling the electronic device may further include: when detecting the object, controlling the speaker to output a sound signal toward the object; and determining the size of the object based on data of the sound signal that is output from the speaker, reflected from the object, and received by the microphone.
  • the control method of the electronic device may further include determining the type of the object based on an output of a machine learning (ML) model to which the magnetic field data is input.
  • the method of controlling the electronic device may further include determining the position of the object based on the speed of the object, the magnetic field data, and the direction of the electronic device.
  • By providing touchless control of an electronic device using an object, such as a ferromagnetic object, based on the output of pre-installed sensors, a stylus and an active digitizer sheet can be omitted.
  • Accordingly, the size, weight, and manufacturing cost of the electronic device can be reduced, and the power consumption of the electronic device can be significantly reduced.
  • FIG. 1 is a diagram illustrating an electronic device having a stylus according to the related art.
  • FIG. 2A is a control block diagram of an electronic device according to an exemplary embodiment.
  • FIG. 2B is a block diagram of an operation controller for performing an operation of an electronic device according to an exemplary embodiment.
  • FIGS. 3 and 4 are flowcharts illustrating a method of controlling an operation of an electronic device according to an exemplary embodiment.
  • FIG. 5A is a diagram illustrating a signal received by a microphone of an electronic device according to an exemplary embodiment.
  • FIG. 5B is a diagram illustrating a case in which an electronic device determines a size of an object hovered over a screen according to an exemplary embodiment.
  • FIGS. 6A and 6B are diagrams illustrating a recurrent neural network (RNN) used by an electronic device to identify a type of an object hovered over a screen, according to an exemplary embodiment.
  • FIG. 7 is a diagram illustrating a plot of a volume of an object with respect to a height of the object from a screen of an electronic device according to an exemplary embodiment.
  • FIG. 8 is a diagram illustrating a case in which an electronic device determines a speed of an object hovered over a screen according to an exemplary embodiment.
  • FIG. 9 is a diagram illustrating a case in which an electronic device calculates a direction according to an exemplary embodiment.
  • FIG. 10 is a diagram illustrating a position vector of an object in a 3D Cartesian coordinate system and a position of the object with respect to a screen in an electronic device according to an exemplary embodiment.
  • FIG. 11 is a diagram illustrating an exemplary scenario in which an electronic device receives an alphabet based on a position of an object, according to an embodiment.
  • FIG. 12 is a diagram illustrating an exemplary scenario in which an electronic device switches a user profile based on a location of an object, according to an embodiment.
  • FIG. 13 is a diagram illustrating an exemplary scenario in which an electronic device instructs to wash a user's hand based on a location of an object, according to an embodiment.
  • a first component may be referred to as a second component, and similarly a second component may also be referred to as a first component.
  • the term "part" (unit) may mean a unit for processing at least one function or operation.
  • the terms may refer to at least one hardware component such as a field-programmable gate array (FPGA) or application-specific integrated circuit (ASIC), at least one software component stored in a memory, or at least one process executed by a processor.
  • An embodiment of the disclosed invention provides a method for controlling an operation of an electronic device using an object without a touch.
  • the method includes detecting, by the electronic device, an object hovered over a screen of the electronic device.
  • the method includes obtaining, by the electronic device, data from a first set of sensors.
  • the method includes determining, by the electronic device, a threshold value of a distance of the object to a screen of the electronic device based on data received from a first set of sensors.
  • the method includes obtaining, by the electronic device, data from a second set of sensors.
  • the method includes dynamically determining, by the electronic device, a position of the object relative to a screen of the electronic device based on data received from a second set of sensors.
  • the method includes determining, by the electronic device, a distance between the object and a screen of the electronic device based on the position of the object.
  • the method includes determining, by the electronic device, whether a distance between the object and a screen of the electronic device meets a distance threshold value.
  • the method may include performing, by the electronic device, an operation based on the position of the object when it is determined that the distance between the object and the screen of the electronic device satisfies the distance threshold value.
  • An embodiment of the disclosed invention provides an electronic device for controlling an operation using an object.
  • the electronic device includes a memory, a processor, a plurality of sensors, a screen, and an operation control unit, and the operation control unit is connected to the memory and the processor.
  • the operation controller may be configured to detect an object hovered over a screen of the electronic device.
  • the operation controller may be configured to obtain data from a first set of sensors among the plurality of sensors.
  • the operation controller may be configured to determine a distance threshold value of the object to the screen of the electronic device based on data received from the first sensor set.
  • the operation controller may be configured to obtain data from a second set of sensors among the plurality of sensors.
  • the operation controller may be configured to dynamically determine the position of the object with respect to the screen of the electronic device based on data received from the second sensor set.
  • the operation controller may be configured to determine a distance between the object and the screen of the electronic device based on the position of the object.
  • the operation controller may be configured to determine whether the distance between the object and the screen satisfies the distance threshold value. When it is determined that the distance between the object and the screen of the electronic device satisfies the distance threshold value, the operation controller may be configured to perform an operation according to the position of the object.
  • the proposed method can be used to control the operation of an electronic device using an object such as a ferromagnetic tool (e.g., a screwdriver, a spanner, a nail cutter, a metal paper clip, etc.).
  • the electronic device may use a plurality of existing sensors of the electronic device to detect a user input performed through a screen of the electronic device using an object.
  • this method can be used in electronic devices to incorporate the functionality of a stylus as in traditional touch-sensitive devices. Since the stylus is not included in the design of the electronic device, the size, weight, and manufacturing cost of the electronic device can be greatly reduced.
  • the electronic device is effective in accurately determining the position of an object hovered over a screen of the electronic device using a plurality of existing sensors.
  • the electronic device may consider “hovering on the screen” as a user input such as a gesture, alphabet input, or command for performing an operation.
  • Since the electronic device does not include an active digitizer sheet, the power consumption of the electronic device can be greatly reduced using the proposed method.
  • the proposed method enables a user to sign on the screen of the electronic device using an object for authentication, thereby eliminating the need for a fingerprint sensor in the electronic device. Since the electronic device does not include a fingerprint sensor, the power consumption of the electronic device can be greatly reduced using the proposed method, and the manufacturing cost can be greatly reduced.
  • FIG. 2A is a control block diagram of an electronic device according to an exemplary embodiment.
  • the electronic device 100 may include an operation controller 110, a memory 120, a processor 130, a sensor 140, a screen 150, a speaker 160, and a communication unit 170.
  • the operation control unit 110 may be connected to the memory 120 and the processor 130 , or may be provided integrally with the memory 120 and the processor 130 .
  • the operation control unit 110 may be provided integrally with the processor 130 , and an operation of the operation control unit 110 to be described later may be performed by the processor 130 .
  • the operation control unit 110 may be physically implemented by analog or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, or hardwired circuits, and may optionally be driven by firmware.
  • the circuit may be implemented with one or more semiconductor chips or may be implemented on a substrate support such as a printed circuit board.
  • the sensor 140 may be provided in plurality, and may include, for example, an accelerometer sensor 141 , a gyro sensor 142 , a microphone 143 , and a magnetometer sensor 144 .
  • the screen 150 may be an electronic display device such as a liquid crystal display (LCD) screen.
  • the type of display panel constituting the screen 150 is not limited.
  • the speaker 160 may be an electronic device that generates an audible signal using power.
  • Examples of the operation of the electronic device 100 to be described below include, but are not limited to, alphabet input, user signature input, drawing generation, user profile switching, user command identification, and gesture identification.
  • shapes such as alphabet characters, smileys, numbers, and symbols may be formed by moving the object.
  • the electronic device 100 may be a user equipment (UE), a smartphone, a tablet computer, a personal digital assistant (PDA), a desktop computer, an Internet of Things (IoT) device, etc., and the type is not limited.
  • the object may be a device made of a ferromagnetic material such as iron, cobalt, nickel, or an alloy thereof.
  • objects include, but are not limited to, screwdrivers, nail cutters, metal paper clips, and pens.
  • the operation controller 110 may be configured to detect an object hovered over the screen 150 of the electronic device 100 .
  • the operation controller 110 may be configured to acquire magnetic field data from the magnetometer sensor 144 among the plurality of sensors 140 of the electronic device 100 .
  • the magnetometer sensor 144 may acquire magnetic field data in front of the screen 150 .
  • the operation controller 110 may detect an object hovering over the screen 150 based on the magnetic field data.
  • the operation controller 110 may detect that the object hovers over the screen 150 when it is determined that the rate of change of the magnetic field in front of the screen 150 exceeds a magnetic field threshold based on the magnetic field data.
  • the magnetic field data is a value of the magnetic field (B) in the (X, Y, Z) directions of a three-dimensional Cartesian coordinate system (i.e., the real world).
  • Example magnetic field data (0.00 µT, 5.90 µT, -48.40 µT) indicates the magnetic field in the (X, Y, Z) directions, respectively.
  • the operation control unit 110 may be configured to determine whether the magnetic field data meets a magnetic field threshold value.
  • the magnetic field threshold value for each object may be stored in advance in the memory 120 .
  • the magnetic field threshold is a rate of change of the magnetic field. For example, the threshold range may be greater than 0.05 µT/s and less than 0.98 µT/s.
  • the rate of change of the magnetic field (i.e., ΔB/Δt) according to an embodiment is compared with the magnetic field threshold to determine whether the magnetic field data meets the magnetic field threshold.
  • the rate of change of the magnetic field is the change in the magnetic field with time.
  • An example rate of change of the magnetic field is 14 microtesla per second (i.e., 14 µT/s).
  • the operation control unit 110 may be configured to detect an object hovered over the screen 150. For example, when the rate of change of the magnetic field (i.e., ΔB/Δt) is greater than the magnetic field threshold, that is, when the magnetic field data meets the magnetic field threshold, the operation control unit 110 may be configured to determine that the object is hovering over the screen 150. When the rate of change (i.e., ΔB/Δt) is less than or equal to the magnetic field threshold, that is, when the magnetic field data does not meet the magnetic field threshold, the operation control unit 110 may be configured to determine that the object does not hover over the screen.
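The hover test above can be sketched as follows. Treating the text's example rate of 14 µT/s as the stored threshold is an assumption; the actual threshold is described as being stored per object.

```python
def is_hovering(b_prev, b_curr, dt, threshold_ut_per_s=14.0):
    """Detect hovering from two successive magnetometer readings.

    b_prev, b_curr: (Bx, By, Bz) tuples in microtesla; dt in seconds.
    The object is considered hovering when the per-axis rate of change
    of the magnetic field exceeds the threshold. The 14 uT/s default is
    borrowed from the text's example rate, not a stated stored value.
    """
    rate = max(abs(c - p) for p, c in zip(b_prev, b_curr)) / dt
    return rate > threshold_ut_per_s
```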
  • the operation control unit 110 may be configured to obtain data from the first set 143 , 144 of the plurality of sensors 140 .
  • a first set 143 , 144 of sensors 140 includes a microphone 143 and a magnetometer sensor 144 .
  • the operation controller 110 may determine a distance threshold value (e.g., 62 mm) of the object to the screen 150 of the electronic device 100 based on data received from the first set 143 and 144 of the sensor 140.
  • the distance threshold value means the minimum distance between the object and the screen 150 for the electronic device 100 to detect the object.
  • the operation controller 110 may determine a distance threshold value corresponding to the object based on the size of the object and the type of the object.
  • the operation control unit 110 may determine the size (e.g., length, width, height, radius, etc.) of the object based on data received from the microphone 143 of the first set 143 and 144.
  • the operation control unit 110 may control the speaker 160 to output a sound signal toward the object when the object is detected, and may determine the size of the object based on data of the sound signal that is output from the speaker 160, reflected from the object, and received by the microphone 143.
  • the operation controller 110 may be configured to transmit a first signal (e.g., an audio signal) from the speaker 160 of the electronic device 100 toward the object. Also, the operation controller 110 may be configured to receive a second signal (e.g., an audio signal) reflected from the object using the microphone 143 in response to the first signal. Also, the operation controller 110 may be configured to determine the size of the object based on the second signal reflected from the object. The operation control unit 110 according to an embodiment may be configured to determine whether a temporal displacement of the reflected signal (i.e., the second signal) is greater than a displacement threshold value. The displacement threshold is pre-stored in the memory 120.
  • the temporal displacement of the reflected signal is the time it takes to detect the reflected signal after the first signal is emitted, and is compared against the pre-stored threshold (i.e., the displacement threshold).
  • the pre-stored threshold is, for example, 1450 µs.
  • the operation control unit 110 may be configured to determine the size of the object when the temporal displacement of the second signal is greater than the displacement threshold value.
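A minimal sketch of the echo measurement above, assuming the standard speed of sound in air and the 1450 µs displacement threshold from the text. The text does not give the mapping from echo data to object size, so the sketch only converts the delay into a round-trip distance estimate:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature (assumed)

def echo_detected(delay_s, displacement_threshold_s=1450e-6):
    """True when the temporal displacement of the reflected signal
    exceeds the pre-stored displacement threshold (1450 us in the text)."""
    return delay_s > displacement_threshold_s

def echo_round_trip_distance(delay_s):
    """Distance to the reflecting object from the echo's round-trip delay.
    Sizing the object from the echo envelope is not specified in the text."""
    return SPEED_OF_SOUND * delay_s / 2.0
```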
  • the operation controller 110 may be configured to recognize the type of an object by applying a machine learning (ML) model 119 (see FIG. 2B) to data received from the magnetometer sensor 144.
  • the ML model 119 may be a recurrent neural network (RNN).
  • the type of object may represent the ferromagnetic material used to make the object.
  • the operation controller 110 may determine the type of the object based on the output of the machine learning model to which the magnetic field data is input.
  • the operation controller 110 may be configured to determine a threshold value of the distance of the object to the screen 150 of the electronic device 100 based on the size of the object and the type of the object. Specifically, the operation controller 110 may be configured to determine the volume of the object by using the size of the object. Also, the operation controller 110 may be configured to determine a threshold value of the distance of the object to the screen 150 of the electronic device 100 based on the volume of the object and the type of the object.
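The text does not give the function that maps the object's volume and material type to a distance threshold, so the table and linear form below are purely illustrative assumptions; only the 62 mm example value appears in the source:

```python
# Hypothetical per-material base thresholds in mm; these coefficients do
# not appear in the source and are only for illustration.
BASE_THRESHOLD_MM = {"iron": 50.0, "cobalt": 45.0, "nickel": 40.0}

def distance_threshold_mm(material, volume_cm3, per_cm3_mm=2.0):
    """Assumed rule: larger or more strongly ferromagnetic objects perturb
    the field from farther away, so they get a larger distance threshold."""
    return BASE_THRESHOLD_MM[material] + per_cm3_mm * volume_cm3
```

With these illustrative coefficients, a 6 cm³ iron object yields the 62 mm example threshold mentioned earlier in the text.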
  • the operation controller 110 may perform an operation corresponding to the position of the object.
  • the operation controller 110 may determine the position of the object based on the object's speed, magnetic field data, and the direction of the electronic device.
  • the operation control unit 110 may be configured to obtain data from the second set 141 , 142 , 144 of the plurality of sensors 140 .
  • the second set 141, 142, 144 of the sensors 140 may include an accelerometer sensor 141, a gyro sensor 142, and a magnetometer sensor 144.
  • the operation control unit 110 may be configured to dynamically determine the position of the object with respect to the screen 150 of the electronic device 100 based on data received from the second set 141 , 142 , 144 of the sensor 140 .
  • the operation control unit 110 may be configured to determine the velocity of the object based on accelerometer data received from the accelerometer sensor 141 in the second set 141, 142, 144 of the sensor 140.
  • the accelerometer data may include acceleration values of the electronic device 100 in the (X, Y, Z) directions of the 3D Cartesian coordinate system. Example accelerometer data (0.00 m/s², 9.81 m/s², 0.00 m/s²) indicates the acceleration of the electronic device 100 in the (X, Y, Z) directions, respectively. An example object velocity (-0.2 m/s, 0.4 m/s, -0.14 m/s) indicates the velocity of the object in the (X, Y, Z) directions, respectively.
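The source only states that velocity is determined from accelerometer data; simple Euler integration of the samples is one plausible, assumed implementation (a real device would also subtract gravity and filter sensor noise):

```python
def integrate_velocity(v0, accel_samples, dt):
    """Integrate per-axis accelerometer samples (m/s^2) over fixed time
    step dt (seconds) to track velocity, starting from v0 = (vx, vy, vz).
    Plain Euler integration; the integration scheme is an assumption."""
    vx, vy, vz = v0
    for ax, ay, az in accel_samples:
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
    return (vx, vy, vz)
```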
  • the operation controller 110 may be configured to determine the speed of the object based on a change in magnetic field data generated while the object hovers over the screen 150 (refer to FIG. 8 ).
  • the operation controller 110 may be configured to identify a direction of the electronic device 100 based on direction data received from the gyro sensor 142 in the second set 141 , 142 , and 144 of the sensor 140 .
  • the direction of the electronic device 100 may be identified based on azimuth (ψ), pitch (θ), and roll (φ) information of the direction data. Examples of azimuth, pitch, and roll are -55°, -28°, and -64°, respectively.
  • the operation control unit 110 is configured to continuously determine the position of the object with respect to the screen 150 based on the direction of the electronic device 100 , the speed of the object, and magnetometer data (magnetic field data) received from the magnetometer sensor 144 .
  • the operation control unit 110 may be configured to determine a position vector of the object with respect to the 3D Cartesian coordinate system based on the object's velocity and the magnetometer data.
  • An example of an object's position vector with respect to the (X, Y, Z) direction of a three-dimensional Cartesian coordinate system is (-1.3, 6.7, 4.1), respectively.
  • <Equation 1> for determining the position vector of the object with respect to the three-dimensional Cartesian coordinate system (i.e., the world plane), that is, r(World Plane), is as follows.
  • Bx, By, and Bz mean the magnetic field values in the X, Y, and Z directions of the Cartesian coordinate system, respectively.
  • ΔBx, ΔBy, and ΔBz mean the change in the magnetic field in the X, Y, and Z directions of the Cartesian coordinate system, respectively.
  • vx, vy, and vz mean the velocity of the object in the X, Y, and Z directions of the Cartesian coordinate system, respectively.
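Equation 1 itself is not reproduced in this extract. Under the additional assumption that the object behaves as a magnetic point dipole (B ∝ 1/r³, hence ΔB/Δt ≈ −3(B/r)·v along each axis), a per-axis position estimate from the symbols defined above could look like the following; the functional form is an assumption, not the patent's actual equation:

```python
def world_position(b, db_dt, v):
    """Per-axis distance estimate |r_i| = 3 * |B_i * v_i / (dB_i/dt)|.

    b: (Bx, By, Bz) field values; db_dt: (dBx/dt, dBy/dt, dBz/dt);
    v: (vx, vy, vz) object velocity. The point-dipole falloff from which
    this follows is only implied by the source, not stated."""
    return tuple(abs(3.0 * bi * vi / dbi) for bi, dbi, vi in zip(b, db_dt, v))
```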
  • the operation controller 110 may be configured to dynamically determine the position of the object with respect to the screen 150 based on the position vector of the object and the direction of the electronic device 100 .
  • <Equation 2> for dynamically determining the position of the object with respect to the screen, that is, r(Device Plane), is as follows.
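Equation 2 is likewise not reproduced in this extract, but transforming r(World Plane) into the device (screen) frame amounts to applying a rotation built from the gyro's azimuth, pitch, and roll angles. The Z-Y-X (yaw-pitch-roll) rotation convention below is an assumption:

```python
import math

def to_device_plane(r_world, azimuth_deg, pitch_deg, roll_deg):
    """Rotate a world-frame position vector into the device (screen) frame.
    Uses the standard aerospace world-to-body DCM R = Rx(roll) @ Ry(pitch)
    @ Rz(azimuth); the convention of the patent's Equation 2 is unknown."""
    psi, theta, phi = (math.radians(a) for a in (azimuth_deg, pitch_deg, roll_deg))
    cps, sps = math.cos(psi), math.sin(psi)
    cth, sth = math.cos(theta), math.sin(theta)
    cph, sph = math.cos(phi), math.sin(phi)
    rot = [
        [cth * cps, cth * sps, -sth],
        [sph * sth * cps - cph * sps, sph * sth * sps + cph * cps, sph * cth],
        [cph * sth * cps + sph * sps, cph * sth * sps - sph * cps, cph * cth],
    ]
    return tuple(sum(rot[i][j] * r_world[j] for j in range(3)) for i in range(3))
```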
  • the operation controller 110 may be configured to determine a distance between the object and the screen 150 of the electronic device 100 based on the position of the object.
  • the operation controller 110 may be configured to determine whether the distance between the object and the screen 150 satisfies the distance threshold value.
  • when the distance between the object and the screen 150 is less than the distance threshold value, the operation controller 110 may determine that the distance satisfies the distance threshold value.
  • when the distance between the object and the screen 150 is greater than or equal to the distance threshold value, the operation controller 110 may determine that the distance does not satisfy the distance threshold value.
  • the operation controller 110 may perform an operation according to the position of the object.
  • the operation controller 110 may be configured to determine a part (point) of the screen 150 of the electronic device 100 pointed to by the object based on the position of the object.
  • the operation controller 110 may be configured to detect a user interface (UI) data item including a portion of the screen 150 of the electronic device 100 . Examples of UI data items include, but are not limited to, icons, widgets, videos, images, and UIs of applications.
  • the operation control unit 110 may be configured to determine an operation to be performed based on the UI data item.
  • the operation control unit 110 may be configured to perform an operation.
  • the operation controller 110 may determine a movement form of the object on the screen 150 based on the position of the object and control the screen 150 to display a character corresponding to the movement form of the object.
  • the operation controller 110 may change the user profile of the electronic device 100 to correspond to the first part.
  • the first user profile may be changed to a second user profile corresponding to the second part.
  • the memory 120 may store data from the plurality of sensors 140 .
  • the memory 120 may store a displacement threshold and a magnetic field threshold. Also, the memory 120 may provide the displacement threshold and the magnetic field threshold to the operation controller 110 in response to receiving a request from the operation controller 110 .
  • Memory 120 may include a non-volatile storage element. Examples of such non-volatile storage elements may include magnetic hard disks, optical disks, floppy disks, flash memory, or any form of electrically programmable memory (EPROM) or electrically erasable and programmable memory (EEPROM). Also, memory 120 may be considered a non-transitory storage medium in some examples. The term "non-transitory" may indicate that the storage medium is not embodied as a carrier wave or a propagated signal. However, the term "non-transitory" should not be interpreted as meaning that the memory 120 is immovable. In some examples, memory 120 may be configured to store larger amounts of information. In certain examples, the non-transitory storage medium may store data that may change over time (e.g., random access memory (RAM) or cache).
  • the processor 130 is configured to execute instructions stored in the memory 120 .
  • the processor 130 may be a general-purpose processor such as a central processing unit (CPU) or an application processor (AP), or a graphics-only processor such as a graphics processing unit (GPU) or a visual processing unit (VPU).
  • Processor 130 may include multiple cores to execute instructions.
  • the communication unit 170 may be configured to internally communicate between hardware components of the electronic device 100 . Also, the communication unit 170 may be configured to facilitate communication between the electronic device 100 and other devices.
  • the communication unit 170 may include an electronic circuit specialized for a standard that enables wired or wireless communication.
  • although FIG. 2A illustrates various hardware components of the electronic device 100, it should be understood that other embodiments are not limited thereto.
  • the electronic device 100 may include a smaller number of components.
  • labels or names of components are used for illustrative purposes only and do not limit the scope of the present invention.
  • FIG. 2B is a block diagram of the operation controller 110 for performing an operation of the electronic device 100 according to an exemplary embodiment.
  • the motion control unit 110 includes an object detector 111 , a size estimation engine 112 , an object classifier 113 , a distance threshold value estimator 114 , a speed determination engine 115 , a direction estimator 116 , a position estimation engine 117 , an action executor 118 , and an ML model 119 .
  • the object detector 111 , the size estimation engine 112 , the object classifier 113 , the distance threshold value estimator 114 , the velocity determination engine 115 , the direction estimator 116 , the position estimation engine 117 , the action executor 118 , and the ML model 119 according to an embodiment may be physically implemented as analog or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits, and the like, and may optionally be driven by firmware.
  • the circuit may be implemented with one or more semiconductor chips or may be implemented on a substrate support such as a printed circuit board.
  • the object detector 111 may detect an object hovered over the screen 150 of the electronic device 100 .
  • the object detector 111 may acquire magnetic field data from the magnetometer sensor 144 .
  • the object detector 111 may determine whether the magnetic field data meets a magnetic field threshold. Further, the object detector 111 may detect an object hovered over the screen 150 in response to determining that the magnetic field data meets a magnetic field threshold.
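The hover-detection test above can be sketched as a simple rate-of-change check. This is a minimal illustration, not the patent's implementation; the function name, sampling interval, and the field magnitudes and threshold (here in microtesla) are all hypothetical.

```python
def object_detected(b_prev, b_curr, dt, field_threshold):
    # Rate of change of the magnetic field magnitude, dB/dt; an object
    # hovering over the screen is detected when it exceeds the threshold.
    rate = abs(b_curr - b_prev) / dt
    return rate > field_threshold

# A ferromagnetic object approaching the magnetometer perturbs the field;
# the sample values and threshold below are illustrative only.
detected = object_detected(46.0, 52.3, 1.0, 5.0)
```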
  • the distance threshold value estimator 114 may obtain data from the first set 143 , 144 of the sensors 140 .
  • the distance threshold value estimator 114 may determine a distance threshold value of the object to the screen 150 based on data received from the first set 143 , 144 of the sensors 140 .
  • the size estimation engine 112 determines the size of the object based on data received from the microphone 143 .
  • the size estimation engine 112 transmits a first signal from the speaker 160 of the electronic device 100 toward the object.
  • the size estimation engine 112 receives a second signal reflected from the object using the microphone 143 in response to the first signal.
  • the size estimation engine 112 may determine the size of the object based on the second signal reflected from the object.
  • the size estimation engine 112 according to an embodiment may determine whether the temporal displacement of the second signal is greater than a displacement threshold value.
  • the size estimation engine 112 may determine the size of the object when the temporal displacement of the second signal is greater than the displacement threshold value.
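The displacement gate described above can be sketched as follows. The function name, the representation of travel times as a list of echo timings, and the numeric threshold are assumptions for illustration; the patent only specifies that size estimation proceeds when the temporal displacement exceeds a threshold.

```python
def should_estimate_size(echo_travel_times_s, displacement_threshold_s):
    # Temporal displacement of the reflected (second) signal across
    # successive echoes; the size estimation runs only when it exceeds
    # the displacement threshold.
    displacement = max(echo_travel_times_s) - min(echo_travel_times_s)
    return displacement > displacement_threshold_s

# Hypothetical travel times (seconds): a moving object shifts the echo.
gate_open = should_estimate_size([0.0014, 0.0019], 0.0002)
```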
  • the object classifier 113 may recognize the type of object by applying the ML model 119 to the data received from the magnetometer sensor 144 .
  • the distance threshold value estimator 114 may determine the distance threshold value of the object to the screen 150 according to the size of the object and the type of the object.
  • the distance threshold value estimator 114 may determine the volume of the object by using the size of the object.
  • the distance threshold value estimator 114 may determine a distance threshold value of the object to the screen 150 according to the volume of the object and the type of the object.
  • the position estimation engine 117 dynamically determines the position of the object with respect to the screen 150 of the electronic device 100 based on data received from the second set 141 , 142 , 144 of the sensor 140 .
  • the speed determination engine 115 determines the speed of the object based on a change in accelerometer data or magnetic field data.
  • the direction estimator 116 may identify the direction of the electronic device 100 based on direction data received from the gyro sensor 142 .
  • the position estimation engine 117 may continuously determine the position of the object with respect to the screen 150 based on the direction of the electronic device 100 , the velocity of the object, and magnetometer data received from the magnetometer sensor 144 .
  • the position estimation engine 117 according to an embodiment may determine the position vector of the object with respect to the 3D Cartesian coordinate system based on the velocity and magnetic field data of the object.
  • the position estimation engine 117 may dynamically determine the position of the object with respect to the screen 150 based on the position vector of the object and the direction of the electronic device 100 .
  • the location estimation engine 117 may determine a distance between the object and the screen 150 based on the location of the object. The location estimation engine 117 may determine whether a distance between the object of the electronic device 100 and the screen 150 satisfies a distance threshold value.
  • the action executor 118 determines that the distance between the object of the electronic device 100 and the screen 150 satisfies the distance threshold, the action executor 118 performs an action according to the location of the object.
  • the action executor 118 determines the portion of the screen 150 of the electronic device 100 pointed to by the object based on the position of the object.
  • the action executor 118 detects a UI data item that includes a portion of the screen 150 of the electronic device 100 .
  • the action executor 118 determines the action to perform based on the UI data item and performs the action.
  • although FIG. 2B shows hardware components of the operation control unit 110, it should be understood that other embodiments are not limited thereto.
  • the operation control unit 110 may include a smaller number of components.
  • labels or names of components are used for illustrative purposes only and do not limit the scope of the present invention.
  • One or more components may be coupled together to perform the same or substantially similar function for performing the operation of the electronic device 100 based on the position of the object.
  • FIGS. 3 and 4 are flowcharts illustrating a method of controlling an operation of the electronic device 100 according to an exemplary embodiment.
  • the electronic device 100 may detect an object hovered over the screen 150 .
  • the object detector 111 may detect an object hovered over the screen 150 .
  • the electronic device 100 may obtain data from the first set 143 and 144 of the sensors 140 .
  • the distance threshold value estimator 114 may obtain data from the first set 143 , 144 of the sensors 140 .
  • the electronic device 100 may determine a distance threshold value of the object to the screen 150 based on data received from the first set 143 and 144 of the sensor 140 .
  • the distance threshold value estimator 114 may determine a distance threshold value of the object to the screen 150 based on data received from the first set 143 , 144 of the sensors 140 .
  • the electronic device 100 may obtain data from the second set 141 , 142 , and 144 of the sensor 140 .
  • the distance threshold value estimator 114 , the speed determination engine 115 , and the direction estimator 116 may obtain data from the second set 141 , 142 , 144 of the sensors 140 .
  • the electronic device 100 may dynamically determine the position of the object with respect to the screen 150 based on data received from the second set 141 , 142 , and 144 of the sensor 140 .
  • the position estimation engine 117 may dynamically determine the position of the object relative to the screen 150 based on data received from the second set 141 , 142 , 144 of the sensor 140 .
  • the electronic device 100 may determine a distance between the object and the screen 150 based on the position of the object.
  • the position estimation engine 117 may determine a distance between the object and the screen 150 based on the position of the object.
  • the electronic device 100 may determine whether the distance between the object and the screen 150 satisfies a distance threshold. Specifically, the location estimation engine 117 may determine whether the distance between the object and the screen 150 meets a distance threshold value.
  • in operation 316, when the electronic device 100 determines that the distance between the object and the screen 150 satisfies the distance threshold, the electronic device 100 performs an operation based on the position of the object. Specifically, when the action executor 118 determines that the distance between the object and the screen 150 meets the distance threshold, it causes the operation to be performed based on the position of the object.
  • the electronic device 100 monitors magnetic field data received from the magnetometer sensor 144 .
  • the electronic device 100 determines whether the rate of change of the magnetic field (i.e., ΔB/Δt) is greater than a magnetic field threshold. If the electronic device 100 determines that the rate of change of the magnetic field is not greater than the magnetic field threshold (No in 402), the electronic device 100 continues to monitor the magnetic field data (401).
  • in step 403, when it is determined that the rate of change of the magnetic field (i.e., ΔB/Δt) is greater than the magnetic field threshold (Yes in 402), the electronic device 100 detects an object proximate to the electronic device 100 and may use the speaker 160 to transmit a first signal at a frequency of 15 kHz toward the object at time t1.
  • the electronic device 100 receives the second signal at t2 using the microphone 143 .
  • the second signal is the first signal that has struck the object and been reflected from it.
  • the electronic device 100 may determine whether the temporal displacement of the second signal is greater than a displacement threshold value. When determining that the time displacement of the second signal is not greater than the displacement threshold (No in 405 ), the electronic device 100 may continue to monitor the magnetic field data ( 401 ).
  • when determining that the temporal displacement of the second signal is greater than the displacement threshold, the electronic device 100 determines the size of the object in step 406 and identifies the type of the object in step 407 based on the magnetometer data.
  • the electronic device 100 determines a distance threshold value based on the size of the object and the type of the object.
  • in step 409, if it is determined that the rate of change of the magnetic field (i.e., ΔB/Δt) is greater than the magnetic field threshold (Yes in 402), the electronic device 100 determines the speed of the object using the accelerometer data received from the accelerometer sensor 141.
  • the electronic device 100 determines a position vector of the object with respect to the 3D Cartesian coordinate system based on the velocity of the object.
  • the electronic device 100 identifies the direction of the electronic device 100 based on the direction data received from the gyro sensor 142 .
  • the electronic device 100 dynamically determines the position of the object with respect to the screen 150 based on the position vector of the object and the direction of the electronic device 100 .
  • the electronic device 100 may estimate the distance between the object and the screen 150 based on the position of the object with respect to the screen 150 .
  • the electronic device 100 may determine whether the distance between the object and the screen 150 is less than a distance threshold value.
  • when the distance is not less than the distance threshold value, the electronic device 100 may continue to monitor the magnetic field data (401).
  • the electronic device 100 may perform an operation based on the position of the object.
  • FIG. 5A is a diagram illustrating a signal received by the microphone 143 of the electronic device 100 according to an exemplary embodiment.
  • the amplitude 501 of the second signal received at the microphone 143 at various times 502 is shown in FIG. 5A .
  • the second signal is the first signal output from the speaker 160 that has struck the object and been reflected from it.
  • the electronic device 100 does not initially know which second signal should be processed in order to estimate the distance between the object and the electronic device 100 .
  • in order to determine an appropriate second signal, the user must move the object back and forth in a direction parallel to the screen 150 of the electronic device 100 .
  • the electronic device 100 may detect the relevant travel-time displacement of the appropriate second signal based on the movement of the object, while an unrelated second signal remains close to a constant travel time.
  • the relevant travel-time displacement is a displacement indicating whether the object moves away from or approaches the electronic device 100 .
  • the travel time is the time it takes for the signal output from the speaker 160 to reach the microphone 143 after being reflected from the object.
  • the electronic device 100 may accumulate a plurality of detections of second signals and determine the dispersion, that is, the standard deviation, of the travel time of each second signal.
  • the electronic device 100 may select a second signal whose dispersion exceeds the dispersion threshold as the appropriate second signal.
  • the dispersion threshold ranges from 1350 to 1550 μs.
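The dispersion-based selection above can be sketched with the standard deviation of accumulated travel times. The data layout, function name, and the exact default threshold (the midpoint of the 1350-1550 μs range given in the text) are assumptions.

```python
import statistics

def appropriate_echoes(travel_times_by_echo, dispersion_threshold_us=1450.0):
    # An echo reflected from the moving object shows a large standard
    # deviation of travel time; static reflections stay near-constant.
    # travel_times_by_echo: {echo_id: [travel times in microseconds]}.
    return [echo_id
            for echo_id, times_us in travel_times_by_echo.items()
            if statistics.pstdev(times_us) > dispersion_threshold_us]

moving = [1000.0, 3000.0, 5000.0]   # object moved back and forth
static = [2000.0, 2010.0, 1990.0]   # reflection from a fixed surface
selected = appropriate_echoes({"echo_a": moving, "echo_b": static})
```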
  • FIG. 5B is a diagram illustrating a case in which the electronic device 100 determines the size of an object hovered over the screen 150 according to an exemplary embodiment.
  • a side view of the object 510 hovered over the electronic device 100 is illustrated in FIG. 5B .
  • the microphone 143 and the speaker 160 are disposed in the electronic device 100 at a distance 511 from each other.
  • the first signal 512A is transmitted from the speaker 160 and collides with the object 510 .
  • the first signal 512A reflected from the object 510 is the second signal 512B.
  • the microphone 143 may receive the second signal 512B.
  • the first signal 513A is transmitted from the speaker 160 and strikes the object 510 .
  • the first signal 513A reflected from the object 510 is the second signal 513B.
  • the microphone 143 may receive the second signal 513B.
  • the second signal 513B is bent due to the structural characteristics of the object 510 .
  • the electronic device 100 measures the distance 'x' 511 between the microphone 143 and the speaker 160 using <Equation 3>.
  • the movement time of the signal may be determined by the electronic device 100 .
  • the speed of sound, v sound is 34300 cm/sec.
  • the electronic device 100 may determine the size of the object by applying the distance 511 between the microphone 143 and the speaker 160 to the Fresnel-Kirchhoff diffraction equation.
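The echo geometry above can be sketched numerically. The total speaker→object→microphone path length follows from the travel time and the stated speed of sound; the height estimate below then assumes the object sits midway above the microphone-speaker baseline, a simplifying geometric assumption standing in for the Fresnel-Kirchhoff treatment in the text. All names and sample values are hypothetical.

```python
import math

V_SOUND_CM_PER_S = 34300.0  # speed of sound, as given in the text

def echo_path_length_cm(travel_time_s):
    # Total speaker -> object -> microphone path length.
    return V_SOUND_CM_PER_S * travel_time_s

def object_height_cm(travel_time_s, mic_speaker_distance_cm):
    # Assume the object is midway above the baseline, so each leg of
    # the echo path is the hypotenuse of a right triangle with base
    # x/2 and height h: h = sqrt(leg^2 - (x/2)^2).
    leg = echo_path_length_cm(travel_time_s) / 2.0
    half_base = mic_speaker_distance_cm / 2.0
    return math.sqrt(max(leg * leg - half_base * half_base, 0.0))

height = object_height_cm(0.001, 10.0)  # 1 ms echo, 10 cm baseline
```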
  • FIGS. 6A and 6B are diagrams illustrating a recurrent neural network (RNN) used by an electronic device to identify a type of an object hovered over a screen, according to an exemplary embodiment.
  • Table 1 shows absolute values of the magnetic field B measured every second by hovering each type of object on the electronic device 100 according to an embodiment.
  • the change rate (ΔB/Δt) of the magnetic field according to an embodiment is shown in <Table 2>.
  • the electronic device 100 determines the type of object by learning from the rate of change of the magnetic field. Hundreds of values of the rate of change of the magnetic field are sufficient to train the electronic device 100 . Also, the electronic device 100 identifies the type of object based on the rate of change of the magnetic field using the learned rates of change. As shown in FIG. 6A , the electronic device 100 applies the rate of change of the magnetic field to the RNN 600 to identify the type of object.
  • X 1 , X 2 , and X 3 are inputs of the RNN 600 . X 1 is ΔB/Δt for the first instant. X 2 is ΔB/Δt for the second instant. X 3 is ΔB/Δt for the third instant.
  • f w (602), f w (604), and f w (606) denote the function f w with parameter 'W'.
  • h 0 (601), h 1 (603), h 2 (605), h 3 (607), ... h t (609) may be defined as in <Equation 4>.
  • <Equation 4> can also be expressed as follows.
  • W(608) means a weight matrix (W hh ) used in every instantaneous function f w .
  • the weight matrix may be obtained by training using a backpropagation method.
  • W y (610) means the weight matrix used by h t (609) to generate the output Y (611).
  • Y(611) denotes the output of the RNN 600 given in <Equation 5>, indicating the type of object for the given inputs X 1 , X 2 , and X 3 .
  • X 1 , X 2 , and X 3 are 0.70, 0.55, and 0.54, respectively.
  • the electronic device 100 may identify iron as the type of object.
  • a flowchart of a method of identifying a type of an object using the RNN 600 is shown in FIG. 6B .
  • the input of the RNN 600 is X (612), where X (612) means the rate of change of the magnetic field at each instant.
  • the output of the RNN 600 is Y 611 , where Y 611 means the type of object such as iron, cobalt, nickel, or the like.
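The recurrence above can be sketched in scalar form, since each input is a single value ΔB/Δt. The sketch assumes the common vanilla-RNN expansion h_t = tanh(W_hh · h_{t-1} + W_xh · x_t) for <Equation 4> and y = W_hy · h_t for <Equation 5>; the weights below are illustrative, not the trained values that would classify iron, cobalt, or nickel.

```python
import math

def rnn_output(inputs, w_xh, w_hh, w_hy, h0=0.0):
    # Scalar vanilla RNN: h_t = tanh(w_hh * h_{t-1} + w_xh * x_t),
    # then y = w_hy * h_t on the final hidden state.
    h = h0
    for x in inputs:
        h = math.tanh(w_hh * h + w_xh * x)
    return w_hy * h

# The per-instant field changes X1, X2, X3 from the iron example;
# the weight values are hypothetical.
y = rnn_output([0.70, 0.55, 0.54], w_xh=1.0, w_hh=0.5, w_hy=1.0)
```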
  • FIG. 7 is a diagram illustrating a plot of a volume of an object with respect to a height of the object from the screen 150 of the electronic device 100 according to an exemplary embodiment.
  • the electronic device 100 may determine the volume of the object based on the size of the object. Also, the electronic device 100 may determine the distance threshold value of the object by performing regression analysis on the volume of the object. The electronic device 100 uses regression analysis to estimate the value of the dependent variable based on the value of at least one independent variable, where the dependent variable is the height of the object from the screen 150 and the independent variable is the volume of the object.
  • the volume is plotted on the graph along the X direction and the height of the object from the screen 150 is plotted along the Y direction, with the relationship given in Equation (6).
  • the relationship between volume and height can be described by a linear function 707 . That is, it is assumed that the change in height is due to the change in volume.
  • 705 denotes a point plotted on the graph according to the relationship between volume and height.
  • Y i (702) is the predicted height of the object from the screen 150 for the object's volume 'X i ' (706), while 701 is the actual height of the object from the screen 150 for the volume X i (706).
  • β 0 (703) and β 1 (708) represent constants determined according to the type of object.
  • β 0 and β 1 are trainable parameters whose values are determined using <Equation 8> and <Equation 9>.
  • the values of β 0 (703) and β 1 (708) are real numbers.
  • ε i (704) represents a random error in the range between 10 -2 and 10 -4 .
  • the random error can be calculated by minimizing <Equation 6> using gradient descent. Also, <Equation 6> may be modified to <Equation 7> according to the random error calculation.
  • after determining the volume, the electronic device 100 determines the height at which the object is to be sensed. X is the volume and Y is the height. The electronic device 100 uses <Equation 8> and <Equation 9> to calculate b 0 and b 1 , minimizing the loss between the predicted value and the actual value using gradient descent.
  • distance thresholds for different objects and different volumes are as provided in Table 3.
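The regression step above fits a line of height against volume. The sketch below uses the closed-form ordinary least-squares coefficients, which is assumed (but not confirmed by this excerpt) to match <Equation 8> and <Equation 9>; the calibration points are hypothetical.

```python
def fit_line(xs, ys):
    # Closed-form ordinary least squares for y = b0 + b1 * x:
    #   b1 = sum((x - mx) * (y - my)) / sum((x - mx)^2)
    #   b0 = my - b1 * mx
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

# Hypothetical calibration points: threshold height grows with volume.
b0, b1 = fit_line([1.0, 2.0, 3.0], [3.0, 5.0, 7.0])
```

The same fit applies unchanged to the later velocity-vs-field-change regression, with X as the change in the magnetic field and Y as the velocity.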
  • FIG. 8 is a diagram illustrating a case in which an electronic device determines a speed of an object hovered over a screen according to an exemplary embodiment.
  • the electronic device 100 may learn to determine the speed of the object 510 , for example, a screwdriver. During the learning phase, the object 510 should be placed on the screen 150 of the electronic device 100 . In addition, the electronic device 100 must be moved in the horizontal and vertical directions 801 - 804 while the position of the object 510 is fixed.
  • the electronic device 100 may monitor magnetic field data and accelerometer data at every instant while moving the electronic device 100 .
  • the electronic device 100 may determine the magnetic field at every moment by using the magnetometer data.
  • the electronic device 100 may determine the speed of the electronic device 100 at every moment based on the accelerometer data. Examples of magnetic field and velocity values are shown in <Table 4>.
  • B t-1 is the magnetic field at time 0 second and B t represents the magnetic field at time 1 second. Accordingly, the electronic device 100 may learn the relationship between the speed of the electronic device 100 and the magnetic field.
  • the electronic device 100 determines the speed of the object 510 by using the learned knowledge about the relationship between the speed of the electronic device 100 and the magnetic field. .
  • the electronic device 100 determines the speed of the object 510 by performing regression analysis on the magnetic field.
  • the electronic device 100 may use regression analysis to estimate the value of the dependent variable based on the value of the at least one independent variable.
  • the dependent variable is the speed of the object 510
  • the independent variable is a change in the magnetic field due to the movement of the object 510 .
  • the relationship between the velocity of the object 510 and the change in the magnetic field may be described as a linear function given in Equation (6).
  • Y i is the estimated speed of the object 510 according to the change of the magnetic field 'X i '.
  • β 0 and β 1 are constants determined according to the type of the object 510 .
  • β 0 and β 1 are trainable parameters whose values are determined using <Equation 8> and <Equation 9>.
  • the values of β 0 and β 1 are real numbers.
  • ε i is a random error in the range between 10 -2 and 10 -4 .
  • the random error can be calculated by minimizing <Equation 6> using the gradient descent method.
  • the electronic device 100 may determine the speed.
  • X is the change in the magnetic field and Y is the velocity.
  • the electronic device 100 may use <Equation 8> and <Equation 9> to calculate b 0 and b 1 by minimizing the loss between the predicted value and the actual value using gradient descent.
  • the electronic device 100 determines the position vector of the object 510 based on the speed of the object 510 .
  • Magnetic field 'B t-1 ' at time 0 seconds, magnetic field 'B t ' at time 1 second, change of magnetic field ' ⁇ B', corresponding velocity of object 510 and corresponding position vector of object 510 'r (World Plane) ' is shown in Table 5.
  • FIG. 9 is a diagram illustrating a method of calculating the direction of the electronic device 100 according to an embodiment.
  • FIG. 10 is a diagram illustrating the position vector of an object 510 in a 3D Cartesian coordinate system and the position of the object 510 with respect to the screen 150 in the electronic device 100 according to an embodiment.
  • X ( 901 ), Y ( 902 ), and Z ( 903 ) are axes of a three-dimensional Cartesian coordinate system in a positive direction.
  • the electronic device 100 is arranged such that the screen 150 faces the positive Z-axis 903 , the electronic device 100 is disposed at the center of the X-axis 901 , the Y-axis 902 , and the Z-axis 903 , and the upper end of the electronic device 100 may be disposed in a direction toward the positive Y-axis 902 .
  • the azimuth 906 is the angle between the positive Y-axis 902 and magnetic north 907 , and the azimuth 906 ranges from 0 degrees to 360 degrees.
  • a positive roll 905 is defined as the case in which the electronic device 100 starts lying flat on a horizontal plane and the positive Z-axis 903 begins to tilt toward the positive X-axis 901 .
  • a positive pitch 904 is defined as the case in which the electronic device 100 starts lying flat on a horizontal plane and the positive Z-axis 903 begins to tilt toward the positive Y-axis 902 .
  • the object 510 may be positioned on the electronic device 100 with position coordinates (x3, y3, z3) 1005 and a height 'h' 1004 above the screen 150 .
  • the height 'h' 1004 at which the object 510 is positioned from the screen 150 may be lower than the distance threshold value 805 .
  • the position coordinates of the magnetometer sensor 144 of the electronic device 100 may be (x1, y1) (1002).
  • the electronic device 100 may determine the position vector of the object 510 as r (1001). Also, the electronic device 100 may use the position vector of the object 510 and the direction of the electronic device 100 to determine the position of the object 510 , that is, the position coordinates (x2, y2) 1003 , with respect to the screen 150 or an arbitrary plane of the electronic device 100 .
  • FIG. 11 is a diagram illustrating an exemplary scenario in which the electronic device 100 receives text input based on the position of the object 510 according to an embodiment.
  • a user may hover a screwdriver 510, i.e., an object 510, over the screen 150 of the electronic device 100 in the shape of an 'S' 1101, below the distance threshold value 805.
  • the electronic device 100 may determine the position of the screwdriver 510 at every moment the screwdriver 510 hovers over the screen 150. Also, based on the position of the screwdriver 510, the electronic device 100 may sense the movement pattern of the object 510 over the screen 150 at every moment. The electronic device 100 may identify the movement pattern of the object 510 as an 'S' and take the input as the alphabet 'S'. Also, as shown in 1100B, the electronic device 100 may display the alphabet 'S' 1102 to the user.
  • the proposed method can be used in the electronic device 100 to integrate the functionality of a stylus as in a traditional touch-sensing device.
  • the size, weight, and manufacturing cost of the electronic device 100 may be greatly reduced.
  • since the electronic device 100 does not include an active digitizer sheet, the power consumption of the electronic device 100 can be greatly improved by using the proposed method.
  • the proposed method allows a user to be authenticated by signing on the screen 150 of the electronic device 100 using an object, so the electronic device 100 does not need a fingerprint sensor for authentication. By not including a fingerprint sensor in the electronic device 100, the power consumption of the electronic device 100 can be greatly improved and the manufacturing cost can be greatly reduced using the proposed method.
  • FIG. 12 is a diagram illustrating an exemplary scenario in which the electronic device 100 switches a user profile based on a location of an object 510 according to an embodiment.
  • the electronic device 100 may set a boundary 1201 between a first portion 1202 and a second portion 1203 to divide the UI of the electronic device 100 into the first portion 1202 and the second portion 1203.
  • the electronic device 100 may allocate the first portion 1202 to a first user profile and allocate the second portion 1203 to a second user profile.
  • the electronic device 100 may activate the first user profile when the user hovers the object over the first portion 1202 of the screen 150 within the distance threshold value range. Also, the electronic device 100 may activate the second user profile when the user hovers the object over the second portion 1203 of the screen 150 within the distance threshold value range.
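The profile-switching rule above can be sketched as a lookup from the hover position. The geometry of the boundary 1201 is not specified in the embodiment, so modeling it as a vertical line splitting the screen in half is an assumption:

```python
def active_profile(x2, y2, h, screen_width, distance_threshold):
    """Select a user profile from the hover position (x2, y2) and hover
    height h. The boundary 1201 is modeled as a vertical line splitting
    the screen in half; the split geometry is an assumption."""
    if not (0.0 <= h < distance_threshold):
        return None                      # object not hovering close enough
    if x2 < screen_width / 2:
        return "first user profile"      # first portion 1202
    return "second user profile"         # second portion 1203
```

Hovering in the left half activates the first profile, the right half the second, and an object beyond the distance threshold changes nothing.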
  • this method enables the electronic device 100 to switch user profiles based on the portion over which the user hovers an object.
  • FIG. 13 is a diagram illustrating an exemplary scenario in which the electronic device 100 instructs the user to wash the user's hand based on the position of the object, according to an embodiment.
  • a user may wear a metal ring 1304, which is an object. Also, as shown in 1300, the user may wash their hands 1303 using water 1302 flowing from the faucet 1301.
  • the electronic device 100 may monitor the position of the metal ring 1304 while the hands 1303 are washed, when the position of the metal ring 1304 is within the distance threshold value range. That is, the electronic device 100 may track the movement of the metal ring 1304 while the user washes the hands 1303.
  • the electronic device 100 may start a timer and track it, as shown at 1305. Also, based on the movement of the metal ring 1304, the electronic device 100 may track whether the user washes their hands for the full 20 seconds. As illustrated in 1306, the electronic device 100 may instruct the user to continue hand washing when the movement of the metal ring 1304 is no longer detected before 20 seconds have elapsed since washing started. The timer expires after 20 seconds, as shown in 1307, and the electronic device 100 may instruct the user to stop washing the hands 1303. Accordingly, the electronic device 100 of the present invention can intelligently track the tasks performed by the user to improve the user experience and provide the best recommendations to the user.
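The timer logic above can be sketched from the timestamps at which ring movement is detected. The 20-second goal follows FIG. 13; the 2-second idle gap used to decide that movement has stopped is an assumption:

```python
def handwash_feedback(movement_times, now, goal=20.0, idle_gap=2.0):
    """Decide what to show the user from the timestamps (in seconds) at
    which movement of the metal ring 1304 was detected. The 20-second
    goal follows FIG. 13; the 2-second idle gap is an assumption."""
    if not movement_times:
        return "start washing"           # no movement observed yet
    elapsed = now - movement_times[0]
    if elapsed >= goal:
        return "stop washing"            # timer expired (1307)
    if now - movement_times[-1] > idle_gap:
        return "continue washing"        # movement stopped early (1306)
    return "timer running"               # still tracking (1305)
```

Called once per sensor sample, this yields the three user prompts of FIG. 13: keep counting while movement continues, prompt the user if movement stops early, and tell the user to stop once 20 seconds have elapsed.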
  • the disclosed embodiments may be implemented in the form of a recording medium storing instructions executable by a computer. Instructions may be stored in the form of program code, and when executed by a processor, may create a program module to perform the operations of the disclosed embodiments.
  • the recording medium may be implemented as a computer-readable recording medium.
  • the computer-readable recording medium includes any type of recording medium in which computer-readable instructions are stored. Examples include read-only memory (ROM), random access memory (RAM), magnetic tape, magnetic disk, flash memory, and optical data storage.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to an embodiment, an electronic apparatus includes: a screen; a magnetometer sensor that acquires magnetic field data in front of the screen; and a processor. When an object hovering above the screen is detected based on the magnetic field data, the processor determines a distance threshold value corresponding to the object based on the size and type of the object, and when the distance between the object and the screen is less than the distance threshold value, the processor performs an operation corresponding to the position of the object.
PCT/KR2021/018490 2020-12-14 2021-12-07 Electronic apparatus and control method therefor WO2022131659A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IN202041054372 2020-12-14
IN202041054372 2020-12-14
KR10-2021-0119262 2021-09-07
KR1020210119262A KR20220084996A (ko) Electronic device and control method therefor

Publications (1)

Publication Number Publication Date
WO2022131659A1 true WO2022131659A1 (fr) 2022-06-23

Family

ID=82059678

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/018490 WO2022131659A1 (fr) 2020-12-14 2021-12-07 Electronic apparatus and control method therefor

Country Status (1)

Country Link
WO (1) WO2022131659A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090121566A * 2008-05-22 2009-11-26 최형락 Mobile communication terminal having a subject size measurement function and subject size measurement method using the same
JP5263776B2 * 2009-01-28 2013-08-14 National Institute of Advanced Industrial Science and Technology Method for identifying non-magnetic metals
KR101412879B1 * 2010-08-27 2014-06-26 Apple Inc. Concurrent signal detection for touch and hover sensing
KR101501949B1 * 2008-04-04 2015-03-11 LG Electronics Inc. Portable terminal capable of motion control using a proximity sensor and control method thereof
KR101549553B1 * 2008-10-20 2015-09-03 LG Electronics Inc. Mobile terminal and control method thereof
KR101553952B1 * 2009-04-24 2015-09-17 LG Electronics Inc. Control method of mobile terminal and apparatus therefor
KR20160089717A * 2015-01-20 2016-07-28 Trace Co., Ltd. Digitizer system capable of 3D hovering recognition using pen tilt information

Similar Documents

Publication Publication Date Title
WO2016114571A1 (fr) Dispositif flexible, et procédé de commande correspondant
WO2015023136A1 (fr) Procédé et appareil de reconnaissance d'état de préhension dans un dispositif électronique
WO2020013528A1 (fr) Affichage souple et dispositif électronique le comportant
WO2020251288A1 (fr) Dispositif tactile et procédé de détection tactile associé
WO2013051752A1 (fr) Appareil et procédé permettant de détecter un contact
WO2015002440A1 (fr) Procédé de changement de mode d'un numériseur
WO2019146918A1 (fr) Procédé de reconnaissance d'empreinte digitale, et dispositif électronique et support de stockage associés
WO2014088253A1 (fr) Procédé et système de fourniture d'informations sur la base d'un contexte et support d'enregistrement lisible par ordinateur correspondant
WO2020159106A1 (fr) Dispositif tactile
WO2015149588A1 (fr) Procédé de reconnaissance d'un mode d'exploitation d'un utilisateur sur un dispositif portatif, et dispositif portatif
WO2016129923A1 (fr) Dispositif d'affichage, procédé d'affichage et support d'enregistrement lisible par ordinateur
WO2016089074A1 (fr) Dispositif et procédé de réception d'entrée de caractères par l'intermédiaire de ce dernier
WO2020209639A1 (fr) Dispositif tactile et procédé de détection tactile associé
WO2014035113A1 (fr) Procédé de commande d'une fonction de toucher et dispositif électronique associé
WO2020085643A1 (fr) Dispositif électronique et procédé de commande associé
WO2021080360A1 (fr) Dispositif électronique et son procédé de commande de fonctionnement du dispositif d'affichage
WO2020153657A1 (fr) Dispositif tactile et procédé de détection tactile associé
WO2016195197A1 (fr) Terminal à stylet et procédé de commande associé
WO2019199086A1 (fr) Dispositif électronique et procédé de commande pour dispositif électronique
WO2022158692A1 (fr) Dispositif électronique permettant d'identifier une force tactile et son procédé de fonctionnement
WO2020060121A1 (fr) Procédé de correction pour saisie manuscrite, et dispositif électronique et support d'informations associés
WO2022131659A1 (fr) Appareil électronique et son procédé de commande
EP3350677A1 (fr) Appareil de mesure de coordonnées et procédé de commande correspondant
WO2019022336A1 (fr) Dispositif d'affichage et son procédé de commande
WO2018208041A1 (fr) Procédé et appareil de réalisation d'au moins une opération d'après un contexte de dispositifs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21906963

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21906963

Country of ref document: EP

Kind code of ref document: A1