WO2020050636A1 - User intention-based gesture recognition method and apparatus - Google Patents

User intention-based gesture recognition method and apparatus

Info

Publication number
WO2020050636A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
user
area
processor
electronic device
Prior art date
Application number
PCT/KR2019/011442
Other languages
English (en)
Korean (ko)
Inventor
허창룡
이기혁
조치현
Original Assignee
삼성전자 주식회사
Priority date
Filing date
Publication date
Application filed by 삼성전자 주식회사
Priority to US17/250,813 (US11416080B2)
Publication of WO2020050636A1

Classifications

    • G - PHYSICS
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06F - ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                • G06F 3/012 - Head tracking input arrangements
              • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
              • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
                • G06F 3/0304 - Detection arrangements using opto-electronic means
              • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
                • G06F 3/0481 - Interaction based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
                • G06F 3/0484 - Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                  • G06F 3/04842 - Selection of displayed objects or displayed text elements
        • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V 10/00 - Arrangements for image or video recognition or understanding
            • G06V 10/40 - Extraction of image or video features
              • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
            • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
              • G06V 10/764 - Classification, e.g. of video objects
          • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
              • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
                • G06V 40/172 - Classification, e.g. identification
            • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
              • G06V 40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language

Definitions

  • Various embodiments disclosed herein relate to a user intention-based gesture recognition method and apparatus, for example, to a user intention-based gesture recognition method and apparatus capable of recognizing a user's valid gesture.
  • To manipulate a user interface of an electronic device, an input device such as a keyboard or a mouse may be used, or the electronic device may recognize the user's gesture and manipulate the user interface accordingly.
  • Gestures may be recognized by having the user carry or wear a device including a motion sensor, or by analyzing an image acquired by a camera included in the electronic device.
  • a gesture irrelevant to the user's intention may also be recognized as a command to perform an operation that the user does not want.
  • An electronic device according to various embodiments includes one or more sensors, a memory, and a processor. The processor may be configured to check a distance between the electronic device and a user using the one or more sensors, to operate in a first gesture mode to detect the user's gesture using the one or more sensors based on the identified distance satisfying a specified range, to operate in a second gesture mode to detect the user's gesture based on the identified distance being outside the specified range, and to perform an operation corresponding to a valid gesture based on the detected gesture being recognized as a valid gesture.
  • An operating method of an electronic device according to various embodiments includes: checking a distance between the electronic device and a user using one or more sensors; operating in a first gesture mode to detect the user's gesture, using the one or more sensors, based on the identified distance satisfying a specified range, and operating in a second gesture mode to detect the user's gesture based on the identified distance being outside the specified range; and performing an operation corresponding to a valid gesture based on the sensed gesture being recognized as a valid gesture.
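  • The flow above can be pictured with a minimal Python sketch, assuming a fixed specified range and a caller-supplied validity check; the names and values below are illustrative only, not those of the disclosure:

```python
from enum import Enum

class GestureMode(Enum):
    FIRST = 1   # near-field mode (e.g., touch / 2D gestures)
    SECOND = 2  # far-field mode (e.g., 3D gestures)

# Assumed near-field range in metres; the text only speaks of a "specified range".
SPECIFIED_RANGE_M = (0.0, 1.0)

def select_gesture_mode(distance_m: float) -> GestureMode:
    low, high = SPECIFIED_RANGE_M
    return GestureMode.FIRST if low <= distance_m <= high else GestureMode.SECOND

def handle_frame(distance_m: float, gesture, is_valid) -> str:
    """Detect in the mode chosen by distance; act only on valid gestures."""
    mode = select_gesture_mode(distance_m)
    if gesture is not None and is_valid(gesture, mode):
        return f"perform operation for {gesture!r} in {mode.name} mode"
    return "no operation"  # invalid or absent gestures trigger nothing

print(handle_frame(0.4, "swipe_right", lambda g, m: True))  # near user -> FIRST mode
print(handle_frame(2.5, "swipe_right", lambda g, m: True))  # far user  -> SECOND mode
```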
  • the electronic device may recognize only a gesture intended by a user as a command by determining whether the gesture is valid based on a region in which the gesture is detected or a characteristic of the detected gesture.
  • An electronic device may set a gesture detection area suitable for each user by setting a gesture detection area based on a distance between the user and the electronic device and characteristics of the user.
  • The electronic device may continuously check the user's intention in order to set a gesture detection area suitable for the user.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments of the present disclosure.
  • FIG. 2 is a software block diagram, according to various embodiments of the present disclosure.
  • FIG. 3 is an operation flowchart illustrating a method of determining a gesture recognition mode according to a distance between an electronic device and a user according to various embodiments of the present disclosure.
  • FIG. 5 is a diagram illustrating a gesture recognition situation between an electronic device and a user according to various embodiments of the present disclosure.
  • FIG. 6 is a diagram briefly illustrating a gesture region determined by an electronic device according to various embodiments of the present disclosure.
  • FIG. 7 is a diagram illustrating a gesture region determined by an electronic device according to various embodiments of the present disclosure.
  • FIG. 8A is a diagram simply illustrating a process of determining a gesture area suitable for a user according to various embodiments of the present disclosure.
  • FIG. 8B is a diagram illustrating a gesture region appropriately determined for each user according to various embodiments of the present disclosure.
  • FIG. 9 is a diagram illustrating a gesture region determined according to a user's physical characteristics according to various embodiments of the present disclosure.
  • FIG. 10 is a diagram illustrating an operation between an electronic device, a user, and an external device according to various embodiments of the present disclosure.
  • FIG. 11 is an operation flowchart illustrating a method of operating an electronic device in conjunction with an external device according to various embodiments of the present disclosure.
  • FIG. 12 is a diagram illustrating an operation of controlling an external device through an electronic device according to various embodiments of the present disclosure.
  • FIGS. 13A and 13B are diagrams illustrating a method of distinguishing a gesture by an electronic device according to various embodiments of the present disclosure.
  • FIG. 14 is an operation flowchart illustrating a method of operating an electronic device according to various embodiments of the present disclosure.
  • FIG. 15 is a diagram briefly illustrating a method of recognizing a gesture based on a speed of a gesture according to various embodiments of the present disclosure.
  • FIG. 16 is a diagram illustrating a case in which gestures are input from a plurality of users according to various embodiments of the present disclosure.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • The electronic device 101 may communicate with the electronic device 102 through the first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through the second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input device 150, an audio output device 155, a display device 160, an action module 163, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • at least one of the components may be omitted or one or more other components may be added to the electronic device 101.
  • some of these components may be implemented as one integrated circuit.
  • For example, the sensor module 176 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented as embedded in the display device 160 (e.g., a display).
  • The processor 120 may execute, for example, software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to an embodiment, as at least part of the data processing or computation, the processor 120 may load commands or data received from another component (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor. Additionally or alternatively, the auxiliary processor 123 may be configured to use less power than the main processor 121 or to be specialized for a designated function. The auxiliary processor 123 may be implemented separately from, or as part of, the main processor 121.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190), in place of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176).
  • the data may include, for example, software (eg, the program 140) and input data or output data for commands related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
  • the input device 150 may receive a command or data to be used for a component (for example, the processor 120) of the electronic device 101 from the outside (for example, a user) of the electronic device 101.
  • the input device 150 may include, for example, a microphone, mouse, or keyboard.
  • the audio output device 155 may output an audio signal to the outside of the electronic device 101.
  • the audio output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive an incoming call.
  • the receiver may be implemented separately from, or as part of, a speaker.
  • the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • The display device 160 may include touch circuitry configured to sense a touch, or a sensor circuit (e.g., a pressure sensor) configured to measure the strength of the force generated by the touch.
  • The action module 163 may perform facial expression changes, posture expression, or driving.
  • the action module 163 may include a facial expression motor, a posture expression motor, or a driving unit.
  • the facial expression motor may visually provide the state of the electronic device 101 through, for example, the display device 160.
  • the driving unit may be used, for example, to mechanically change the movement of the electronic device 101 and other components.
  • the driving unit may be configured to rotate up / down, left / right, or clockwise / counterclockwise around at least one axis.
  • The driving unit may be implemented, for example, with one or more driving motors (e.g., wheel-type wheels, sphere-type wheels, continuous tracks, or propellers) used in combination or controlled independently.
  • The audio module 170 may convert sound into an electrical signal, or vice versa. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or may output sound through the sound output device 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 detects an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and generates an electrical signal or data value corresponding to the detected state can do.
  • the sensor module 176 includes, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biological sensor, It may include a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that can be used for the electronic device 101 to be directly or wirelessly connected to an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert electrical signals into mechanical stimuli (eg, vibration or movement) or electrical stimuli that the user can perceive through tactile or motor sensations.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and videos. According to one embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes. According to an embodiment, the camera module 180 may include a 2D camera 182 or an infrared-based depth camera 184.
  • the power management module 188 may manage power supplied to the electronic device 101.
  • The power management module 188 may be implemented, for example, as at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and performing communication through the established channel.
  • the communication module 190 may operate independently of the processor 120 (eg, an application processor) and include one or more communication processors supporting direct (eg, wired) or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding one of these communication modules may communicate with external electronic devices through the first network 198 (e.g., a short-range communication network such as Bluetooth, Wi-Fi Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
  • The wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit a signal or power to the outside (eg, an external electronic device) or receive it from the outside.
  • the antenna module may be formed of a conductor or a conductive pattern according to one embodiment, and according to some embodiments, may further include other components (eg, RFIC) in addition to the conductor or conductive pattern.
  • According to an embodiment, the antenna module 197 may include one or more antennas, from which at least one antenna suitable for a communication scheme used in a communication network, such as the first network 198 or the second network 199, may be selected, for example, by the communication module 190.
  • the signal or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • At least some of the above components may be connected to each other through a communication scheme between peripheral devices (for example, a bus, general purpose input and output (GPIO), a serial peripheral interface (SPI), or a mobile industry processor interface (MIPI)) and exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the electronic devices 102 and 104 may be the same or a different type of device from the electronic device 101.
  • all or some of the operations performed on the electronic device 101 may be performed on one or more external devices of the external electronic devices 102, 104, or 108.
  • For example, when the electronic device 101 needs to perform a function or service automatically or in response to a request from a user or another device, the electronic device 101 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least part of the function or service.
  • the one or more external electronic devices receiving the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and deliver the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as it is or additionally and provide it as at least part of the response to the request.
  • For this purpose, for example, cloud computing, distributed computing, or client-server computing technology may be used.
  • FIG. 2 is a software block diagram, according to various embodiments of the present disclosure.
  • software of an electronic device may include an operating system (OS), middleware, intelligent framework, or internal storage for controlling one or more resources of the electronic device.
  • The operating system may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™.
  • At least some of the software programs may, for example, be preloaded onto the electronic device at the time of manufacture, or may be downloaded from an external electronic device (e.g., the electronic device 102 or 103) or the server 108, and updated, when used by a user.
  • the operating system may control management (eg, allocation or retrieval) of one or more system resources (eg, process, memory, or power) of the electronic device.
  • The operating system 142 may additionally or alternatively include one or more driver programs for driving other hardware devices of the electronic device 101, for example, the input device 150, the sound output device 155, the display device 160, the action module 163, the audio module 170, the sensor module 176, the interface 177, the haptic module 179, the camera module 180, the power management module 188, the battery 189, the communication module 190, the subscriber identification module 196, or the antenna module 197.
  • Middleware may detect and track a user's face location using signal-processed data or perform authentication through face recognition.
  • The middleware may also recognize a user's 3D gesture, track the direction of arrival (DOA) of audio signals, perform voice recognition, and process signals from various sensor data.
  • the middleware may include, for example, a gesture recognition manager, a face detection / tracking / recognition manager, a sensor information processing manager, a conversation engine manager, a speech synthesis manager, a sound source tracking manager, or a speech recognition manager.
  • the intelligent framework may include, for example, a multimodal convergence block, a user pattern learning block, or a behavior controller block.
  • the multi-modal convergence block may serve to collect and manage various information processed in the middleware.
  • the user pattern learning block may extract and learn meaningful information such as a user's life pattern and preference using information of a multi-modal fusion block, for example.
  • the behavior control block may express information to be fed back to the user by motion, graphics (UI / UX), light, voice response, or sound.
  • the internal storage may include, for example, a user model DB, a behavior model DB, or a voice model DB.
  • the user model DB may store information learned by the intelligent framework for each user.
  • the behavior model DB may store information for controlling behavior of the electronic device, for example.
  • Information stored in each DB may be stored or shared in, for example, a wireless network DB (eg, cloud).
  • FIG. 3 is an operation flowchart illustrating a method of determining a gesture mode according to a distance between an electronic device and a user according to various embodiments of the present disclosure.
  • the processor may check a distance between the electronic device 101 and a user using one or more sensors.
  • The one or more sensors may include, for example, a gesture sensor, a gyro sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a 3D depth sensor, a microphone, a camera sensor, an ultrasonic sensor, or a radio frequency sensor.
  • The sensing values of these sensors are provided to the processor, and the processor may use them, for example, to check the position of an object (e.g., the user) and the distance to the object.
  • the processor may detect a user's gesture using, for example, the one or more sensors.
  • the camera sensor may be at least one of a dual camera, a 360 degree camera, a 2D image capture camera 182, or an infrared-based depth camera.
  • the camera sensor can be, for example, a vision camera.
  • the 3D depth sensor may be an infrared-based depth camera.
  • the 3D depth sensor may be a depth camera 184.
  • the 3D depth sensor and the camera sensor may be the camera module 180.
  • the processor may perform a user's face recognition using, for example, a camera sensor.
  • The 3D depth sensor may be composed of, for example, an infrared emitter and a depth image CMOS sensor, and may measure the distance between the electronic device 101 and an object using the time difference between the infrared signal being transmitted by the infrared emitter and returning after being reflected by the object.
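  • The time-of-flight relationship described above reduces to distance = (speed of light × round-trip time) / 2; a minimal sketch with illustrative numbers:

```python
# The emitted IR pulse travels to the object and back, so the one-way distance
# is half of (speed of light x round-trip time).
SPEED_OF_LIGHT_M_S = 299_792_458

def tof_distance_m(round_trip_time_s: float) -> float:
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A round trip of about 13.3 ns corresponds to an object roughly 2 m away.
print(round(tof_distance_m(13.3e-9), 2))
```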
  • the processor may determine whether the distance from the user identified in operation 301 is within or outside the specified range.
  • If the distance is within the specified range, the process moves to operation 305: the processor controls the electronic device 101 to operate in the first gesture mode, and the electronic device 101 may detect the user's gesture through the one or more sensors in the first gesture mode.
  • If the distance is outside the specified range, the processor controls the electronic device 101 to operate in the second gesture mode, and the electronic device 101 may detect the user's gesture through the one or more sensors in the second gesture mode.
  • the processor may change the gesture mode based on the distance between the electronic device 101 and the user.
  • the processor may operate in the first gesture mode when it is determined that the user is located within the predetermined range, and in the second gesture mode when it is determined that the user is out of the designated range.
  • the processor may operate by dividing the user's gesture into a first gesture mode performed at a short distance or a second gesture mode performed at a long distance, for example.
  • When the processor operates, for example, in the first gesture mode 420 performed at a short distance, the user can provide direct input to the electronic device 101, such as a touch, so the processor may operate in a 2D gesture mode.
  • When the processor operates, for example, in the second gesture mode 430 performed at a long distance, the user is located in an area where touch is not possible, so the processor may operate in a 3D gesture mode.
  • the designated range for determining the gesture mode may be a value preset in the electronic device 101, or may be determined adaptively in consideration of the user's physical characteristics.
  • the processor may detect a user's gesture using one or more sensors.
  • the one or more sensors may include, for example, a 2D imaging camera sensor or an infrared-based depth camera.
  • the electronic device 101 may detect a user's gesture in the gesture area.
  • the electronic device 101 may store a gesture recognition algorithm in a memory.
  • the electronic device 101 may process the user's gesture detected in the gesture area according to the gesture recognition algorithm.
  • the gesture region may mean, for example, an entire region capable of detecting a user's gesture.
  • the processor may recognize a user's skeleton, for example, recognize movement of a joint, and use the corresponding gesture as an input command.
  • For example, the processor may detect the user pointing with a hand, moving a hand forward, backward, left, or right, or drawing a triangle or circle with a hand.
  • The processor may perform a designated operation corresponding to a valid gesture based on the sensed gesture being recognized as a valid gesture. For example, when the detected gesture is recognized as a valid gesture, the processor may proceed to operation 309 to perform the designated operation corresponding to the valid gesture, and when the detected gesture is recognized as an invalid gesture, the designated operation may not be performed.
  • FIG. 4 is an operation flowchart illustrating a method for an electronic device to recognize a valid gesture according to various embodiments of the present disclosure.
  • the processor may determine the first area and the second area based on the location of the user.
  • the processor may detect the user's location using one or more sensors.
  • the processor may divide a gesture region into a first region and a second region.
  • For example, the processor may set the user's position as the user's central axis, determine as the first area a region including the peripheral space within the specified first distance (R1), and determine as the second area a region including the peripheral space within a designated second distance (R2), farther than R1, and excluding the first area.
  • the gesture area may mean, for example, the entire area within the second distance R2, and the gesture area may be divided into a first area and a second area.
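  • A minimal sketch of this two-region test, assuming a cylindrical model around the user's central axis and the example policy that only second-region gestures are valid; the R1 and R2 values are illustrative:

```python
import math

R1_M = 0.3  # assumed radius of the first (inner) region
R2_M = 0.8  # assumed outer radius of the whole gesture area

def classify_region(gesture_xy, user_axis_xy):
    """Return which region a gesture point falls into, by radial distance
    from the user's central axis (a simple cylindrical model)."""
    r = math.hypot(gesture_xy[0] - user_axis_xy[0], gesture_xy[1] - user_axis_xy[1])
    if r <= R1_M:
        return "first"
    if r <= R2_M:
        return "second"
    return "outside"

def is_valid(gesture_xy, user_axis_xy) -> bool:
    # Example policy from the description: only second-region gestures are valid.
    return classify_region(gesture_xy, user_axis_xy) == "second"

print(is_valid((0.1, 0.1), (0.0, 0.0)))  # False: inside the first region
print(is_valid((0.5, 0.2), (0.0, 0.0)))  # True: inside the second region
```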
  • the processor detects a gesture and determines an area in which the gesture is detected.
  • the processor may determine whether a gesture is detected in the first area or a gesture in the second area, for example.
  • the processor may determine whether the detected gesture is valid based on the region in which the gesture is detected.
  • the processor may designate one of the first area and the second area to recognize only the gesture detected in the designated area as a valid gesture.
  • the electronic device 101 may recognize, for example, a gesture detected in the first area as an invalid gesture, and a gesture detected in the second area as a valid gesture.
  • When the processor recognizes the detected gesture as a valid gesture, it may perform an operation corresponding to the detected gesture. When the processor recognizes the detected gesture as an invalid gesture, it may not perform an operation corresponding to the detected gesture.
  • the operation proceeds to operation 309 of FIG. 3, and the processor may perform the operation based on whether the gesture is valid.
  • the processor may recognize a gesture generated in the first area as an invalid gesture, and recognize a gesture generated in the second area as a valid gesture to perform a corresponding operation.
  • FIG. 5 is a diagram illustrating a gesture recognition situation between the electronic device 101 and a user according to various embodiments of the present disclosure.
  • the electronic device 101 can detect a user's gesture and perform a command based on the detected gesture.
  • The electronic device 101 may change the gesture detection mode according to the distance from the user. For example, if it determines that the user is located within a specified range, the electronic device 101 operates in the first gesture mode to detect the user's gesture, and if it determines that the user is located outside the specified range, it operates in the second gesture mode to detect the user's gesture.
  • For example, the electronic device 101 may determine the first area and the second area based on the distance from the user's central axis 520, and may determine whether a detected gesture is valid based on the determined gesture areas.
  • the electronic device 101 can confirm the user's location using at least one sensor.
  • the processor may determine the first area and the second area based on the location of the user.
  • For example, the processor may set the user's position as the user's central axis 520. In this case, the region 570 within the first distance R1 from the user's central axis 520 may be determined as the first region, and the peripheral space within the second distance R2, farther than the first distance R1, and excluding the first region 570 may be determined as the second region 560.
  • R1 and R2 may be values stored in advance in the electronic device 101, or may be determined based on characteristics of the user recognized by the electronic device 101 (e.g., the user's height, the length of body structures such as the arms, etc.).
  • The processor may determine whether the detected gesture is valid according to the region in which the gesture is detected. For example, if the processor is set to recognize only gestures detected in the second area as valid, a user's gesture detected in the first area 570 may be recognized as an invalid gesture, and a user's gesture detected in the second area 560 may be recognized as a valid gesture.
  • When the processor recognizes the detected gesture as a valid gesture, it may perform an operation corresponding to the detected gesture. When the processor recognizes the detected gesture as an invalid gesture, it may not perform an operation corresponding to the detected gesture.
  • R1 may be defined, for example, as the radius of the area in which invalid gestures occur among the user's gestures, and R2 may be defined as the maximum radius within which the user can perform a gesture.
  • In this case, the region in which a detected gesture is recognized as a valid gesture may be the space between the first boundary 550, with radius R1, and the second boundary 540, with radius R2.
  • the gesture region may mean, for example, the entire region within the maximum radius range R2 where the user can perform the gesture, and the gesture region may be divided into a first region and a second region.
  • In the drawing, the first region 570 and the second region 560 are shown as circular or cylindrical, but they may be determined in various forms, such as a sphere, an ellipse, or a semicircle. For example, when the first area 570 and the second area 560 are determined as spheres, they may be determined around a center point (e.g., the user's head) rather than the user's central axis 520.
  • the processor may operate by dividing the user's gesture into a first gesture mode performed at a short range 581 or a second gesture mode performed at a long distance 583.
  • When the processor operates, for example, in the first gesture mode performed at a short distance 581, the user can provide direct input to the electronic device 101, such as a touch, so the processor may operate in a touch mode or a 2D gesture mode.
  • When the processor operates, for example, in the second gesture mode performed at a long distance 583, the user is located in an area where touch is not possible, so the processor may operate in a 3D gesture mode.
  • the gesture mode may be changed according to whether the user is located within a specified range from the electronic device.
  • the designated range for determining the gesture mode may be determined as a preset value, or may be determined adaptively in consideration of a user's physical characteristics.
  • the electronic device 101 may check the distances 530 and D between the electronic device 101 and the user using at least one sensor.
  • the distance 530 between the electronic device 101 and the user may mean, for example, a straight line distance from the central axis 510 of the electronic device 101 to the user's central axis 520.
  • the electronic device 101 may determine the first area and the second area based on the user's location.
  • For example, the processor may determine the gesture area by setting R2 as the maximum radius within which the user can perform a gesture, and R1 as the radius of the area in which invalid gestures occur among the user's various gestures.
  • the electronic device may adaptively determine a designated range, which is a criterion for determining a gesture mode, based on a second distance R2 that is a second area radius.
  • the processor may operate in the first gesture mode.
  • In the first gesture mode, for example, only the user's touch input may be received, or a 2D gesture area may be determined and the validity of a gesture may be recognized for each 2D gesture area.
  • the processor may operate in the second gesture mode.
  • In the second gesture mode, for example, a 3D gesture area may be determined, and whether a detected gesture is valid may be determined for each 3D gesture area.
  • a conversion point for distinguishing the first gesture mode from the second gesture mode may be defined as (D-R2).
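  • One possible reading of the (D - R2) conversion point, sketched with illustrative values (D is the device-to-user distance, R2 the outer gesture-area radius; the threshold interpretation is an assumption):

```python
def gesture_mode(d_m: float, r2_m: float) -> str:
    """d_m: distance D between the device and the user's central axis;
    r2_m: outer radius R2 of the user's gesture area."""
    switch_point_m = d_m - r2_m  # the (D - R2) conversion point from the text
    # If the reachable gesture area extends to the device (D - R2 <= 0),
    # treat the user as in the near-field (first) mode; otherwise far-field.
    return "first" if switch_point_m <= 0 else "second"

print(gesture_mode(d_m=0.6, r2_m=0.8))  # first: user can touch the device
print(gesture_mode(d_m=2.5, r2_m=0.8))  # second: 3D gesture mode
```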
  • FIG. 6 is a diagram briefly illustrating a gesture region determined by an electronic device according to various embodiments of the present disclosure.
  • Using the user's location as the reference point 610, a user's gesture generated in the first region 620, which includes the surrounding space within the first distance R1 from the reference point, may be recognized by the processor as an invalid gesture.
  • For example, a user's chin-resting gesture 620a, head-scratching gesture 620b, or a gesture of moving the fingers 620c occurring in the first area 620 are merely natural, everyday gestures and should not be judged as valid gestures intended to control the electronic device.
  • Accordingly, in the case of a gesture occurring in the first region 620, the processor may determine that it is invalid and filter it out.
  • the user's gesture generated in the second area 630 including the surrounding space within the second distance R2 and not including the first area 620 may be recognized by the processor as a valid gesture.
  • Gestures such as a gesture 630b in which the user sweeps a hand past, and a gesture 630c in which the user performs control using both hands, are intended to control the electronic device and should be determined as valid gesture commands.
  • the processor may determine, for example, a valid gesture in the case of a gesture occurring in the second region 630 and perform an operation corresponding thereto.
  • The processor may change the second distance R2 according to the user's state. For example, if the processor recognizes that the user is holding an additional object (e.g., a baton) in their hand, the processor may enlarge R2 in consideration of the object's length, extending the second region 630 to the hatched region 640.
  • FIG. 7 is a diagram illustrating a gesture region determined by an electronic device according to various embodiments of the present disclosure.
  • the processor may recognize, for example, a gesture recognized in the first area 720 including a peripheral space within a first distance R1 from the reference point 710 as a valid gesture.
  • A gesture recognized in the second region 730, which includes the surrounding space within the second distance R2 and does not include the first region 720, may be recognized as an invalid gesture.
  • the processor may set the location of the user identified using one or more sensors as the reference point 710.
  • the reference point 710 may be, for example, a central axis of the user, or a body part of the user (eg, the user's head).
  • the processor may set an external object (eg, sofa) as a reference point.
  • the processor may determine, for example, a gesture area based on the external object.
  • For example, the processor may recognize a gesture detected in the first region 720 as a valid gesture and perform the corresponding operation, while recognizing a gesture detected in the second region 730 as an invalid gesture.
  • FIG. 8A is a diagram simply illustrating a process of determining an effective gesture area from a reference point 810 (e.g., a user) according to various embodiments of the present disclosure.
  • FIG. 8B is a diagram illustrating a gesture region learned for each user according to various embodiments of the present disclosure.
  • the processor may determine the first region and the second region by learning a gesture generation pattern continuously performed by the user.
  • the electronic device may perform a machine learning process to determine a gesture area for each user.
  • the processor may, for example, perform an operation of confirming the user's intention for all gestures detected by the user within the entire gesture area.
  • the processor may receive user input, for example, to determine whether the gesture is intended by the user or gesture independent of the user's intention for all gestures.
  • the processor may change the first area 820 to recognize the detected gesture as an invalid gesture and the second area 830 to recognize the detected gesture as a valid gesture based on the received user input. Therefore, sizes and shapes of the first region 820 and the second region 830 may be determined differently for each user.
  • the processor may check whether the gesture 821 and the gesture 823 are intended gestures.
  • When the electronic device receives a user input through the input device confirming that the gesture 821 and the gesture 823 are gestures unrelated to the user's intention, the first area and the second area may be changed so that the places where the gestures 821 and 823 occur are included in the first area.
  • the processor may check whether the gesture 831 is an intended gesture.
  • When the electronic device receives a user input through the input device confirming that the gesture 831 is a gesture intended by the user, it may change the first area and the second area such that the place where the gesture 831 occurs is included in the second area.
  • The processor may determine the area within the maximum radius R2 in which the user can perform a gesture as the entire gesture area, and confirm the user's intention for all user gestures detected within the entire gesture area.
  • the processor may, for example, receive a user input confirming the user's intention.
  • the processor may learn the user's gesture generation pattern based on the user input, and determine the first area 820 and the second area 830 based on the user's gesture generation pattern.
  • The processor may first determine the first area 820 and the second area 830 according to specified distances based on the user's location, and then confirm the user's intention with respect to the user's gestures detected in the first area 820 and the second area 830.
  • the processor may learn the gesture pattern of the user and change the first area 820 and the second area 830 based on this to determine again.
  • the processor may variously determine the size and shape of the first region 820 and the second region 830 in the gesture region for each user.
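  • A minimal sketch of such per-user adaptation, assuming a simple update rule driven by the user's intent confirmations; the rule and the starting radii are illustrative, not the disclosed algorithm:

```python
class UserGestureArea:
    """Per-user R1/R2 radii adjusted from the user's own intent confirmations."""

    def __init__(self, r1_m: float = 0.3, r2_m: float = 0.8):
        self.r1_m = r1_m  # inner radius: gestures here are filtered as unintended
        self.r2_m = r2_m  # outer radius: limit of the valid (second) region

    def update(self, gesture_radius_m: float, intended: bool) -> None:
        if intended:
            # Confirmed intended gesture: keep its location inside the valid region.
            self.r1_m = min(self.r1_m, gesture_radius_m * 0.95)
            self.r2_m = max(self.r2_m, gesture_radius_m * 1.05)
        else:
            # Confirmed unintended gesture: grow the filtered inner region over it.
            self.r1_m = max(self.r1_m, gesture_radius_m)

area = UserGestureArea()
area.update(0.35, intended=False)  # e.g., chin-resting motion -> inner region grows
area.update(0.90, intended=True)   # e.g., deliberate far swipe -> outer region grows
print(area.r1_m, area.r2_m)
```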
  • FIG. 9 is a diagram illustrating a gesture region determined according to a user's physical characteristics according to various embodiments of the present disclosure.
  • the processor may recognize a user's physical characteristics (eg, height, length of body structure, age, presence or absence of physical disability, etc.) and apply a gesture area suitable for the user's physical characteristics.
  • the user's physical characteristics may be input directly to the electronic device by the user, or may be determined by the processor based on the recognized information.
  • the processor can set, for example, the location and size of the gesture region.
  • The processor may set the gesture region to the entire body 911, to only the upper body 921 or the lower body 923, to only the head 931, the arms 933 and 935, or the legs 937 and 939, or to both hands 941 and 943. In this case, the position and size of the gesture recognition area may be set differently even within one application.
  • FIG. 10 is a diagram illustrating an operation between an electronic device, a user, and an external device according to various embodiments of the present disclosure.
  • the user 1020 may control the external device 1030 located around the electronic device 1010.
  • The electronic device 1010 may be paired with, for example, an external device 1030 located nearby, and may recognize a gesture of the user 1020 and transmit a specific command to the external device 1030.
  • The electronic device 1010 may check, for example, the field of view of the user 1020 and select the external device 1030 located within the viewing angle as the target to which commands are transmitted.
  • the electronic device 1010 may determine the gesture area 1040 based on the location and / or body characteristics of the user 1020.
  • FIG. 11 is an operation flowchart illustrating a method of operating an electronic device according to various embodiments of the present disclosure.
  • the processor may detect a user's gesture for controlling an external device through at least one sensor.
  • the processor may sense a gesture of a user controlling an external device, for example.
  • The processor may transmit, through a communication circuit, a signal for establishing a communication connection with the external device.
  • The electronic device may thereby be communicatively connected with the external device.
  • the processor may receive information regarding content that is being executed from an external device.
  • the content may include, for example, a menu mode for displaying a menu, a play mode for playing a game, etc., a setting mode for setting a user, or an advertisement mode for displaying an advertisement.
  • the processor may set a gesture area based on the information on the received content.
  • the processor may determine, for example, different regions for detecting gestures for each content.
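  • A minimal sketch of such per-content area selection, following the example content modes described in this section; the body-part names and the mapping itself are illustrative assumptions:

```python
# Mapping from the content mode reported by the external device to the body
# area(s) in which gestures are detected.
CONTENT_GESTURE_AREAS = {
    "menu": ["right_hand"],
    "play": ["upper_body"],
    "setting": ["right_hand"],
    "advertisement": ["face", "right_hand"],  # two areas mapped to two commands
}

def detection_areas(content_mode: str) -> list:
    # Fall back to a broad area if the content mode is unknown.
    return CONTENT_GESTURE_AREAS.get(content_mode, ["upper_body"])

print(detection_areas("menu"))           # ['right_hand']
print(detection_areas("advertisement"))  # ['face', 'right_hand']
```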
  • FIG. 12 is a diagram illustrating an operation of controlling an external device through an electronic device according to various embodiments of the present disclosure.
  • the processor may differently determine an area for detecting a gesture according to content provided by the external device 1030.
  • the processor may determine an area for detecting a gesture based on the information on the running content received from the external device 1030. In this case, an area for detecting a gesture may be determined in consideration of a user's physical characteristics.
  • the upper side of FIG. 12 shows an example of a screen according to a change of a content mode with time.
  • the lower side of FIG. 12 displays an area for detecting a gesture for each content running in the external device 1030.
  • the external device 1030 may enter the menu mode.
  • the processor can determine a region for detecting a gesture only around the user's right hand 1211. In this case, the processor may recognize only the movement of the user's right hand as a valid gesture, or only the movement of the finger of the right hand as a valid gesture.
  • the processor may be configured to recognize only a gesture detected from the right side of the user's center axis as a valid gesture, or may recognize only a movement of the right hand detected in a region more than a predetermined distance from the user's center point as a valid gesture.
  • the game is executed on the external device 1030 to enter the play mode.
  • the processor may determine the entire upper body 1221 of the user as an area in which the gesture is detected.
  • the processor may determine, for example, an upper portion of the waist as an area in which a gesture is detected, using the waist portion of the user as a central axis.
  • the external device 1030 may enter a setting mode for user setting.
  • the processor may determine only the right hand area 1231 as an area in which the gesture is detected.
  • the external device 1030 may enter the advertisement mode.
  • In the advertisement mode, the gesture area may be divided into a first gesture area 1241 centered on the user's face and a second gesture area 1243 centered on the right hand, and the processor may operate accordingly.
  • For example, the processor may recognize a gesture detected in the first gesture area around the user's face as a first gesture (e.g., a request to provide an additional function), and a gesture detected in the second gesture area around the user's right hand as a second gesture (e.g., a channel change request).
  • FIGS. 13A and 13B are diagrams illustrating a method for an electronic device to recognize a gesture according to various embodiments of the present disclosure.
  • the processor may set the reference axis 1310 based on the user's location.
  • the processor may detect a gesture using at least one sensor, and may check a direction of the gesture with respect to the reference axis 1310.
  • the processor may determine whether the detected gesture is valid based on, for example, an area in which the gesture is detected and a direction of the gesture.
  • the processor may determine whether the detected gesture is valid by comparing a direction in which the region where the gesture is detected is located from the reference axis and the direction of the detected gesture.
  • a user may repeatedly perform a plurality of gestures in the same direction to the left or right using the hand.
  • the processor may recognize the gesture as a valid gesture only for a gesture in a specific direction for each gesture detection area.
  • the processor may set the reference axis 1310 based on the user's location.
  • the reference axis 1310 may be, for example, a vertical axis centered on a user.
  • the processor may set a gesture area around the reference axis 1310.
  • For example, the processor may divide the gesture area into a third region located in a first direction 1320, perpendicular to the reference axis 1310, from the reference axis 1310, and a fourth region located in a second direction 1330, opposite to the first direction 1320, from the reference axis 1310.
  • The processor may detect a user gesture 1321 repeated three times in the first direction 1320 and the second direction 1330 within the third area located in the first direction 1320 from the reference axis 1310.
  • the second direction may be a direction opposite to the first direction.
  • the processor may recognize, for example, only the gesture 1323 detected in the first direction within the third area as a valid gesture.
  • the processor may detect a user gesture 1333 repeated three times in the first direction 1320 and the second direction 1330 within the fourth area located in the second direction 1330 from the reference axis 1310.
  • the processor may recognize only the gesture 1333 detected in the second direction 1330 in the fourth area as a valid gesture.
  • the processor may set the reference axis 1350 based on the user's location, and the reference axis 1350 may be, for example, a horizontal axis centered on the user.
  • for example, when a plurality of direction-repeating gestures 1136 are detected in an area located in the third direction 1360 from the reference axis 1350, the processor may recognize only the gesture 1403 generated in the third direction 1360 as a valid gesture.
  • for example, when the user's repeated gesture 1371 is detected in an area located in the fourth direction 1370 from the reference axis 1350, the processor may recognize only the gesture 1373 generated in the fourth direction 1370 as a valid gesture. A minimal sketch of this direction check is shown below.
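As a rough illustration of the direction check described for FIGS. 13A and 13B, the following sketch compares the direction in which a region lies from the reference axis with the dominant direction of a detected gesture; the vector representation, the cosine threshold, and the function name are assumptions, not part of the disclosure.

```python
# Minimal sketch (illustrative only): a gesture counts as valid only when it
# points roughly the same way its detection region lies from the reference axis.
import numpy as np


def is_gesture_valid(region_offset: np.ndarray, gesture_direction: np.ndarray,
                     min_cosine: float = 0.7) -> bool:
    """region_offset: vector from the reference axis to the gesture region.
    gesture_direction: dominant movement direction of the detected gesture."""
    r = region_offset / np.linalg.norm(region_offset)
    g = gesture_direction / np.linalg.norm(gesture_direction)
    return float(np.dot(r, g)) >= min_cosine


if __name__ == "__main__":
    # Region to the right of the vertical reference axis (first direction).
    region = np.array([1.0, 0.0])
    print(is_gesture_valid(region, np.array([0.9, 0.1])))   # rightward -> valid
    print(is_gesture_valid(region, np.array([-0.9, 0.1])))  # leftward -> invalid
```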
  • FIG. 14 is an operation flowchart illustrating a method of operating an electronic device according to various embodiments of the present disclosure.
  • the processor may determine the first area and the second area based on the location of the user.
  • the processor may determine, for example, a first area that includes the peripheral space within a specified first distance, using the user's head as a reference point, and a second area that includes the peripheral space within a specified second distance farther than the first distance but does not include the first area.
  • the processor may detect a user's gesture using one or more sensors.
  • the processor may check a region in which the gesture is detected and the speed of the detected gesture.
  • the processor may determine whether the detected gesture is valid based on the area in which the gesture is detected and the speed of the gesture.
  • the processor may designate, for example, for each gesture area, the speed range within which a detected gesture is recognized as valid.
  • for example, a gesture may be recognized as valid when its speed is a first speed that exceeds a set reference value, and as invalid when its speed is a second speed that is less than or equal to the set reference value.
  • alternatively, a gesture may be recognized as valid when its speed is the second speed that is less than or equal to the set reference value, and as invalid when its speed is the first speed.
  • in the second region, a gesture of either the first speed or the second speed may be recognized as valid.
  • the processor may perform an operation based on whether the gesture is valid.
  • the processor may perform the operation corresponding to the detected gesture, for example, when the gesture is recognized as valid; a minimal sketch of this flow follows below.
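A minimal sketch of the flow of FIG. 14 might look as follows, assuming two areas defined by radial distance from the user's head and a single speed reference value; the distances, the threshold, and every name are illustrative assumptions.

```python
# Minimal sketch (illustrative only): determine two areas by distance from the
# user's head, detect a gesture, decide validity from its area and speed,
# then perform the mapped operation.
import math

FIRST_DISTANCE = 0.4   # m, assumed radius of the first area around the head
SECOND_DISTANCE = 0.9  # m, assumed outer radius of the second area
SPEED_THRESHOLD = 1.0  # m/s, assumed reference value


def area_of(gesture_pos, head_pos):
    """Return 'first', 'second', or None depending on distance from the head."""
    d = math.dist(gesture_pos, head_pos)
    if d <= FIRST_DISTANCE:
        return "first"
    if d <= SECOND_DISTANCE:
        return "second"
    return None


def is_valid(area, speed):
    if area == "first":
        return speed > SPEED_THRESHOLD  # only fast gestures near the head
    if area == "second":
        return True                     # both speeds accepted farther out
    return False


def handle_gesture(gesture_pos, head_pos, speed, action):
    if is_valid(area_of(gesture_pos, head_pos), speed):
        action()  # perform the operation mapped to the recognized gesture


if __name__ == "__main__":
    handle_gesture((0.2, 0.1, 0.0), (0.0, 0.0, 0.0), 1.5,
                   lambda: print("valid gesture -> perform operation"))
```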
  • FIG. 15 is a diagram briefly illustrating a method of recognizing a gesture based on the speed of the gesture according to various embodiments of the present disclosure.
  • the processor may determine the validity of a gesture based on the region in which the gesture is detected and the speed of the gesture.
  • the gesture area is divided into a first area, a second area, and a third area. Description overlapping with FIG. 14 is omitted. The contents are separate from the first, second, and third regions described above, and are merely classified to help understand the present embodiment.
  • the third area may be determined as an area that includes the surrounding space within the designated third distance 1540 that is farther than the designated second distance 1530 and does not include the first area and the second area.
  • the first area is the distance from the head to the shoulder (1520)
  • the second area is the distance from the shoulder to the elbow (1530)
  • the third area is from the elbow to the fingertip.
  • the graph on the right side of FIG. 15 shows the speed of the gesture over time, calculated by tracking the detected gesture.
  • the electronic device can track the speed of the gesture in real time.
  • a gesture occurring in the first area may be recognized as valid, for example, when its speed exceeds the second reference value 1560 (first speed 1521).
  • when the speed of the gesture is less than or equal to the second reference value 1560, it may be recognized as an invalid gesture.
  • that is, in the first area, a gesture may be recognized as valid only when its speed exceeds the second reference value 1560.
  • a gesture occurring in the second area may be recognized as valid only when its speed exceeds the first reference value 1550 and is less than or equal to the second reference value 1560 (second speed 1531); when its speed exceeds the second reference value 1560 or is less than or equal to the first reference value 1550, it may be recognized as an invalid gesture.
  • that is, in the second area, a gesture may be recognized as valid only when its speed exceeds the first reference value 1550 and is less than or equal to the second reference value 1560.
  • a gesture occurring in the third area may be recognized as valid, for example, only when its speed is equal to or less than the first reference value 1550 (third speed 1541), and may be recognized as invalid when its speed exceeds the first reference value 1550.
  • that is, in the third area, a gesture may be recognized as valid only when its speed is equal to or less than the first reference value 1550.
  • a different operation may be performed for each gesture area. For example, in the first area, where only a relatively fast gesture is recognized as valid, a zoom-in/zoom-out operation may be performed in response to recognizing a valid gesture; in the second area, a movement operation in units of five lines may be performed; and in the third area, where only a relatively slow gesture is recognized as valid, a page-by-page scroll movement operation may be performed in response to recognizing a valid gesture.
  • the processor may recognize a gesture as valid only when the speed of the gesture increases with the distance from the reference point. For example, when a third speed (equal to or less than the first reference value 1550) is detected in the first area, a second speed (exceeding the first reference value 1550 and not exceeding the second reference value 1560) is detected in the second area, and a first speed (exceeding the second reference value 1560) is detected in the third area, the speed increases as the distance from the reference point increases, and the gesture can therefore be recognized as valid.
  • an input may also be performed based on a difference in the speeds of gestures detected in the respective regions. For example, when scrolling the screen, if the difference between the speed of the gesture detected in the first area and the speed of the gesture detected in the second area is 0.1 m/s, a one-page scroll may be provided, and if the difference is 0.5 m/s, a 10-page scroll may be provided. A minimal sketch of the per-area speed windows follows below.
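The per-area speed windows and the example operations of FIG. 15 could be sketched roughly as below; the numeric reference values, the operation names, and the helper function are assumptions made only for illustration.

```python
# Minimal sketch (illustrative only): per-area speed windows for deciding
# gesture validity, loosely following the first/second reference values.
from typing import Optional

FIRST_REFERENCE = 0.5   # m/s, assumed value for reference value 1550
SECOND_REFERENCE = 1.5  # m/s, assumed value for reference value 1560

# (min_exclusive, max_inclusive) speed window per area; None = unbounded.
SPEED_WINDOWS = {
    "first_area": (SECOND_REFERENCE, None),              # only fast gestures
    "second_area": (FIRST_REFERENCE, SECOND_REFERENCE),  # medium gestures
    "third_area": (None, FIRST_REFERENCE),               # only slow gestures
}

OPERATIONS = {
    "first_area": "zoom_in_out",
    "second_area": "move_five_lines",
    "third_area": "scroll_one_page",
}


def classify(area: str, speed: float) -> Optional[str]:
    """Return the operation to perform, or None if the gesture is invalid."""
    low, high = SPEED_WINDOWS[area]
    if low is not None and speed <= low:
        return None
    if high is not None and speed > high:
        return None
    return OPERATIONS[area]


if __name__ == "__main__":
    print(classify("first_area", 2.0))   # zoom_in_out
    print(classify("second_area", 1.0))  # move_five_lines
    print(classify("third_area", 1.0))   # None (too fast for the third area)
```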
  • FIG. 16 is a diagram illustrating a case in which gestures are input from a plurality of users according to various embodiments of the present disclosure.
  • the processor may determine whether the gesture is valid by assigning a weight based on the area where each user is located.
  • the processor may determine a gesture area by setting a reference point 1601 based on the location of the first user 1612. For example, the processor may determine the fifth area, the sixth area, and the seventh area according to the distance, using the head of the first user 1612 as the reference point. According to various embodiments, the gesture area may be divided into two regions according to distance, or into three or more regions.
  • the fifth, sixth, and seventh areas are separate contents from those described in the other drawings, and are merely classified to help understanding in the present embodiment.
  • the method of determining the area is the same as described above.
  • the processor may recognize, for example, only gestures having a high priority when each gesture is detected from a plurality of users, or may recognize each as a different gesture. Priority may be determined for each area.
  • the processor may determine a first user based on one of a user authentication method, a distance between the electronic device and a plurality of users, and a field of view (FOV) of the electronic device.
  • the processor may perform user authentication, for example, through face recognition.
  • the electronic device may store face recognition information of a user in the memory, or may store face recognition information of a plurality of users together with priorities set for them.
  • when the faces of a plurality of users are recognized, the processor may determine, for example, the user to whom the highest priority has been assigned as the first user.
  • the processor may determine the first user based on the distances between the electronic device and the plurality of users. For example, the processor may check the respective distances between the electronic device and the plurality of users, and may determine the user closest to the electronic device as the first user.
  • the processor may determine the first user based on the viewing angle of the electronic device.
  • for example, the person closest to the center of the viewing angle of the electronic device may be determined as the first user; a minimal sketch of these selection strategies follows below.
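A possible sketch of selecting the first user among several detected people, by stored face-recognition priority, by distance to the electronic device, or by position within the field of view, is shown below; the data structures, priority table, and fallback order are assumptions, not taken from the disclosure.

```python
# Minimal sketch (illustrative only): pick the "first user" among detected people.
from dataclasses import dataclass
from typing import Optional


@dataclass
class DetectedPerson:
    face_id: Optional[str]   # matched face, if authentication succeeded
    distance_m: float        # distance from the electronic device
    fov_offset_deg: float    # angular offset from the center of the FOV


# Smaller number = higher priority; populated from stored user settings.
FACE_PRIORITY = {"owner": 0, "family_member": 1}


def pick_first_user(people: list[DetectedPerson],
                    strategy: str = "priority") -> DetectedPerson:
    if strategy == "priority":
        known = [p for p in people if p.face_id in FACE_PRIORITY]
        if known:
            return min(known, key=lambda p: FACE_PRIORITY[p.face_id])
        strategy = "distance"  # fall back if nobody is recognized
    if strategy == "distance":
        return min(people, key=lambda p: p.distance_m)
    # "fov": the person closest to the center of the viewing angle
    return min(people, key=lambda p: abs(p.fov_offset_deg))


if __name__ == "__main__":
    people = [DetectedPerson("family_member", 2.0, 10.0),
              DetectedPerson(None, 1.0, 2.0)]
    print(pick_first_user(people))              # recognized user wins by priority
    print(pick_first_user(people, "distance"))  # closer, unrecognized person wins
```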
  • the processor may determine the gesture area based on the location of an external object (eg, sofa), not the user.
  • the processor may detect, for example, an external object within a viewing angle of the electronic device, and check the location of the detected external object.
  • the processor may determine the gesture region based on the location of the external object.
  • the processor may determine the fifth area and the sixth area according to the distance using the external object as a reference point.
  • the fifth and sixth areas are separate from the contents described in the other drawings, and are merely classified to help understanding in the present embodiment.
  • the processor may determine whether the detected gesture is valid, for example, according to an area in which the gesture is detected. For example, the processor may recognize, as a valid gesture, only a gesture detected in one designated area among the fifth area and the sixth area. For example, the processor may recognize the gesture sensed in the fifth area as a valid gesture, and recognize the gesture sensed in the sixth area as an invalid gesture.
  • gestures may be classified and recognized using a plurality of reference objects.
  • the processor may set an external object (eg, sofa) as a first reference object and classify the gesture area according to the distance based on the reference object.
  • the processor may set the user's face as a second reference object, for example. In this case, the processor may classify the gesture according to the region where the second reference object is located, regardless of the location where the actual gesture is input.
  • for example, when the face of the first user 1612 is located in the fifth area, the processor may classify a gesture detected from the first user 1612 into a first classification; when the face of the second user 1622 is located in the sixth area, a gesture detected from the second user 1622 into a second classification; and when the face of the third user 1632 is located in the seventh area, a gesture detected from the third user 1632 into a third classification.
  • the processor may assign a weight to the gesture detected for each classification and preferentially perform the gesture having the higher weight. For example, the processor may process the gesture sensed from the first user in the fifth area before the gesture sensed from the second user in the sixth area.
  • alternatively, the plurality of gestures input in the respective regions may each be recognized as independent gestures.
  • the first reference object may be designated in advance through image recognition and a depth map, or may be designated by recognizing its location through communication with the electronic device. A minimal sketch of this classification scheme follows below.
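The classification of gestures from multiple users by the area in which each user's face is located, and the weight-based ordering, might be sketched as follows; the area boundaries, weights, and names are illustrative assumptions rather than values from the disclosure.

```python
# Minimal sketch (illustrative only): classify gestures from several users by
# the area (relative to a reference object such as a sofa) containing each
# user's face, then process higher-weight classifications first.
from dataclasses import dataclass


@dataclass
class UserGesture:
    user_id: str
    face_distance_m: float  # distance of the user's face from the reference object
    gesture: str


# (low, high, area name, weight); nearer areas get larger weights.
AREA_BOUNDS = [(0.0, 1.0, "fifth_area", 3),
               (1.0, 2.0, "sixth_area", 2),
               (2.0, 3.0, "seventh_area", 1)]


def classify_area(distance: float):
    for low, high, name, weight in AREA_BOUNDS:
        if low <= distance < high:
            return name, weight
    return None, 0  # outside every gesture area -> ignored


def order_gestures(gestures: list[UserGesture]) -> list[tuple[str, str, str]]:
    """Return (user_id, area, gesture) sorted by descending weight."""
    classified = []
    for g in gestures:
        area, weight = classify_area(g.face_distance_m)
        if area is not None:
            classified.append((weight, g.user_id, area, g.gesture))
    classified.sort(reverse=True)
    return [(u, a, ge) for _, u, a, ge in classified]


if __name__ == "__main__":
    print(order_gestures([
        UserGesture("user_1", 0.5, "swipe_left"),
        UserGesture("user_2", 1.5, "swipe_right"),
    ]))
```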
  • the electronic device may be various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, an intelligent device (eg, a robot), or a home appliance device.
  • the electronic device is not limited to the above-described devices.
  • when any (eg, first) component is referred to as being "coupled" or "connected" to another (eg, second) component, with or without the term "functionally" or "communicatively", it means that the component may be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term "module" may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • the module may be an integrally configured component or a minimum unit of the component or a part thereof performing one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of the present disclosure may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium (eg, the internal memory 136 or the external memory 138) readable by a machine (eg, the electronic device 101).
  • for example, a processor (eg, the processor 120) of the machine (eg, the electronic device 101) may invoke at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the storage medium readable by the device may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored therein.
  • a method according to various embodiments disclosed herein may be provided as included in a computer program product.
  • Computer program products are commodities that can be traded between sellers and buyers.
  • the computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) directly or online through an application store (eg, Play Store™) or between two user devices (eg, smartphones).
  • in the case of online distribution, at least part of the computer program product may be temporarily stored in, or temporarily generated by, a device-readable storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (eg, module or program) of the above-described components may include a singular or plural entity.
  • one or more of the above-described components or operations may be omitted, or one or more other components or operations may be added.
  • alternatively or additionally, a plurality of components (eg, modules or programs) may be integrated into a single component.
  • in such a case, the integrated component may perform one or more functions of each of the plurality of components in the same or a similar manner as they were performed by the corresponding component among the plurality of components prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a user intention-based gesture recognition method and apparatus, and an electronic device including at least one sensor, a memory, and a processor, wherein the processor may be configured to: check a distance between the electronic device and a user by using the at least one sensor; detect a user gesture by operating in a first gesture mode when the checked distance falls within a designated range, using the at least one sensor; detect the user gesture by operating in a second gesture mode when the checked distance falls outside the designated range; and perform an operation corresponding to a valid gesture when the detected gesture is determined to be valid. Various other embodiments are also included.
PCT/KR2019/011442 2018-09-07 2019-09-05 Procédé et appareil de reconnaissance de gestes basée sur l'intention de l'utilisateur WO2020050636A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/250,813 US11416080B2 (en) 2018-09-07 2019-09-05 User intention-based gesture recognition method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0107348 2018-09-07
KR1020180107348A KR102582863B1 (ko) 2018-09-07 2018-09-07 사용자 의도 기반 제스처 인식 방법 및 장치

Publications (1)

Publication Number Publication Date
WO2020050636A1 true WO2020050636A1 (fr) 2020-03-12

Family

ID=69723087

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/011442 WO2020050636A1 (fr) 2018-09-07 2019-09-05 Procédé et appareil de reconnaissance de gestes basée sur l'intention de l'utilisateur

Country Status (3)

Country Link
US (1) US11416080B2 (fr)
KR (1) KR102582863B1 (fr)
WO (1) WO2020050636A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4118517A4 (fr) * 2020-03-20 2023-08-23 Huawei Technologies Co., Ltd. Procédés et systèmes de commande de dispositif à l'aide de gestes de la main dans un environnement multi-utilisateur

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110908504B (zh) * 2019-10-10 2021-03-23 浙江大学 一种增强现实博物馆协作交互方法与系统
KR20210138923A (ko) * 2020-05-13 2021-11-22 삼성전자주식회사 증강 현실 서비스를 제공하기 위한 전자 장치 및 그의 동작 방법
CN111736693B (zh) * 2020-06-09 2024-03-22 海尔优家智能科技(北京)有限公司 智能设备的手势控制方法及装置
KR20220067964A (ko) * 2020-11-18 2022-05-25 삼성전자주식회사 카메라 시야(fov) 가장자리에서 움직임을 인식하여 전자 장치를 제어하는 방법 및 그 전자 장치
WO2023085847A1 (fr) * 2021-11-15 2023-05-19 삼성전자 주식회사 Dispositif habitronique communiquant avec au moins un dispositif homologue en réponse à un événement déclencheur, et son procédé de commande
CN115309297A (zh) * 2022-08-11 2022-11-08 天津速越科技有限公司 一种用于燃气表的手势感应切换显示界面的方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140017829A (ko) * 2012-08-01 2014-02-12 삼성전자주식회사 제스처의 방향에 기초하여 상기 제스처를 인식하는 제스처 인식 장치와 제스처 인식 방법
KR20140144412A (ko) * 2013-06-11 2014-12-19 삼성전자주식회사 제스처 기반 통신 서비스 수행 방법 및 장치
KR20150130808A (ko) * 2014-05-14 2015-11-24 삼성전자주식회사 사용자의 공간 제스처를 식별하는 방법 및 장치
US20180095524A1 (en) * 2016-09-30 2018-04-05 Intel Corporation Interaction mode selection based on detected distance between user and machine interface
JP2018109874A (ja) * 2017-01-04 2018-07-12 京セラ株式会社 電子機器、プログラムおよび制御方法

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7227526B2 (en) * 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
JP2004299025A (ja) * 2003-04-01 2004-10-28 Honda Motor Co Ltd 移動ロボット制御装置、移動ロボット制御方法及び移動ロボット制御プログラム
US8195220B2 (en) * 2008-02-01 2012-06-05 Lg Electronics Inc. User interface for mobile devices
WO2010103482A2 (fr) 2009-03-13 2010-09-16 Primesense Ltd. Interfaçage 3d amélioré pour dispositifs distants
JP5527423B2 (ja) * 2010-11-10 2014-06-18 日本電気株式会社 画像処理システム、画像処理方法、及び画像処理プログラムを記憶した記憶媒体
KR20120119440A (ko) * 2011-04-21 2012-10-31 삼성전자주식회사 전자기기에서 사용자의 제스처를 인식하는 방법
US20120316680A1 (en) * 2011-06-13 2012-12-13 Microsoft Corporation Tracking and following of moving objects by a mobile robot
WO2013081594A1 (fr) * 2011-11-30 2013-06-06 Hewlett-Packard Development Company, L.P. Mode d'entrée basé sur un emplacement d'un geste de main
JP5957893B2 (ja) * 2012-01-13 2016-07-27 ソニー株式会社 情報処理装置及び情報処理方法、並びにコンピューター・プログラム
US9423877B2 (en) * 2012-02-24 2016-08-23 Amazon Technologies, Inc. Navigation approaches for multi-dimensional input
CN103455137B (zh) * 2012-06-04 2017-04-12 原相科技股份有限公司 位移感测方法与位移感测装置
JP2014048936A (ja) * 2012-08-31 2014-03-17 Omron Corp ジェスチャ認識装置、その制御方法、表示機器、および制御プログラム
US9274608B2 (en) * 2012-12-13 2016-03-01 Eyesight Mobile Technologies Ltd. Systems and methods for triggering actions based on touch-free gesture detection
US20140173524A1 (en) * 2012-12-14 2014-06-19 Microsoft Corporation Target and press natural user input
KR101956073B1 (ko) * 2012-12-20 2019-03-08 삼성전자주식회사 시각적 인디케이터를 이용하여 사용자 인터페이스를 제공하는 3차원 입체 영상 표시 장치 및 그 장치를 이용한 방법
US9747696B2 (en) * 2013-05-17 2017-08-29 Leap Motion, Inc. Systems and methods for providing normalized parameters of motions of objects in three-dimensional space
US20150277570A1 (en) * 2014-03-31 2015-10-01 Google Inc. Providing Onscreen Visualizations of Gesture Movements
JP6053845B2 (ja) * 2015-03-09 2016-12-27 株式会社ソニー・インタラクティブエンタテインメント ジェスチャ操作入力処理装置、3次元ディスプレイ装置およびジェスチャ操作入力処理方法
KR102298232B1 (ko) 2015-06-16 2021-09-06 엘지디스플레이 주식회사 공간 터치 기능을 갖는 입체 영상 표시 장치
CN105468144B (zh) 2015-11-17 2019-02-12 小米科技有限责任公司 智能设备控制方法及装置
US10503968B2 (en) * 2016-03-22 2019-12-10 Intel Corporation Identifying a local coordinate system for gesture recognition
JP2017211960A (ja) 2016-05-27 2017-11-30 株式会社エクスビジョン ユーザインターフェース装置およびユーザインターフェースプログラム
EP3267289B1 (fr) * 2016-07-05 2019-02-27 Ricoh Company, Ltd. Appareil de traitement d'informations, procédé de génération d'informations de position et système de traitement d'informations
US20180188817A1 (en) 2017-01-04 2018-07-05 Kyocera Corporation Electronic device, computer-readable non-transitory recording medium, and control method
US10515117B2 (en) * 2017-02-14 2019-12-24 Cisco Technology, Inc. Generating and reviewing motion metadata
US20200012350A1 (en) * 2018-07-08 2020-01-09 Youspace, Inc. Systems and methods for refined gesture recognition
EP3691277A1 (fr) * 2019-01-30 2020-08-05 Ubimax GmbH Procédé et système mis en place par un ordinateur pour augmenter un flux vidéo d'un environnement

Also Published As

Publication number Publication date
KR20200028771A (ko) 2020-03-17
US11416080B2 (en) 2022-08-16
US20210191526A1 (en) 2021-06-24
KR102582863B1 (ko) 2023-09-27

Similar Documents

Publication Publication Date Title
WO2020050636A1 (fr) Procédé et appareil de reconnaissance de gestes basée sur l'intention de l'utilisateur
WO2020085789A1 (fr) Dispositif électronique pliable pour commander une interface utilisateur et son procédé de fonctionnement
WO2020130691A1 (fr) Dispositif électronique et procédé pour fournir des informations sur celui-ci
WO2021020814A1 (fr) Dispositif électronique de mise en place d'avatar et son procédé d'exploitation
WO2020022780A1 (fr) Procédé et appareil permettant d'établir une connexion de dispositif
EP3808097A1 (fr) Procédé et appareil permettant d'établir une connexion de dispositif
WO2019164092A1 (fr) Dispositif électronique de fourniture d'un second contenu pour un premier contenu affiché sur un dispositif d'affichage selon le mouvement d'un objet externe, et son procédé de fonctionnement
WO2019156480A1 (fr) Procédé de détection d'une région d'intérêt sur la base de la direction du regard et dispositif électronique associé
WO2020085628A1 (fr) Procédé d'affichage d'objets et dispositif électronique d'utilisation associé
WO2022080869A1 (fr) Procédé de mise à jour d'une carte tridimensionnelle au moyen d'une image et dispositif électronique prenant en charge ledit procédé
WO2021080171A1 (fr) Procédé et dispositif permettant la détection d'un port à l'aide d'un capteur inertiel
WO2021080360A1 (fr) Dispositif électronique et son procédé de commande de fonctionnement du dispositif d'affichage
WO2021066564A1 (fr) Dispositif électronique à changement de forme comprenant une pluralité de caméras, et procédé de capture d'image
WO2020171342A1 (fr) Dispositif électronique permettant de fournir un service d'intelligence artificielle visualisé sur la base d'informations concernant un objet externe, et procédé de fonctionnement pour dispositif électronique
AU2018321518B2 (en) Method for determining input detection region corresponding to user interface and electronic device thereof
WO2019143199A1 (fr) Dispositif électronique permettant de déterminer un procédé de traitement d'empreintes digitales en fonction du degré de pression d'une entrée d'empreintes digitales, et procédé associé
WO2019221518A1 (fr) Dispositif électronique destiné à régler une position d'un contenu affiché sur un dispositif d'affichage sur la base d'un éclairement ambiant et son procédé de fonctionnement
WO2019160325A1 (fr) Dispositif électronique et son procédé de commande
WO2021230568A1 (fr) Dispositif électronique permettant de fournir un service de réalité augmentée et son procédé de fonctionnement
WO2021060721A1 (fr) Dispositif électronique et procédé de traitement d'image mis en oeuvre par celui-ci
WO2019066323A1 (fr) Dispositif électronique et procédé d'exécution de contenu utilisant des informations de ligne de vue de celui-ci
WO2020032512A1 (fr) Dispositif électronique et procédé d'affichage d'une mise à disposition pour fournir une charge de batterie de dispositif externe par l'intermédiaire d'un dispositif d'affichage
WO2022014836A1 (fr) Procédé et appareil d'affichage d'objets virtuels dans différentes luminosités
WO2022085940A1 (fr) Procédé et appareil de commande d'affichage d'une pluralité d'objets sur un dispositif électronique
WO2021149938A1 (fr) Dispositif électronique et procédé de commande de robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19858535

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19858535

Country of ref document: EP

Kind code of ref document: A1