US10289901B2 - Gesture control device and method - Google Patents
- Publication number
- US10289901B2 (Application US15/604,666)
- Authority
- US
- United States
- Prior art keywords
- gesture
- coordinate
- image
- control device
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G06K9/00335—
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06K9/00201—
- G06K9/52—
- G06K9/78—
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
- G06V40/113—Recognition of static hand signs
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Definitions
- the subject matter herein generally relates to interface devices, and particularly to a gesture control device and method capable of determining an object to be controlled by gesture, among a plurality of electronic devices.
- Electronic devices can be controlled by gestures.
- a gesture command usually controls one electronic device.
- a number of electronic devices may be close together, and it is difficult to determine which one of the electronic devices should be controlled by the gesture.
- FIG. 1 is a block diagram illustrating an exemplary embodiment of an operating environment of a device with a system for control by gesture.
- FIG. 2 is a block diagram illustrating an exemplary embodiment of a gesture control system running in the device of FIG. 1 .
- FIG. 3 is a schematic diagram illustrating an exemplary embodiment of a working process of the device of FIG. 1 .
- FIG. 4 is a flowchart illustrating an exemplary embodiment of a gesture control method.
- FIG. 1 illustrates an exemplary embodiment of an operating environment of a gesture control device 100 .
- the gesture control device 100 can communicate with a number of electronic devices.
- the gesture control device 100 can determine which one of the electronic devices should be controlled by a gesture.
- the electronic devices can be, but are not limited to, televisions, air conditioners, fridges, multimedia players, monitors, computers, and the like.
- the gesture control device 100 can communicate with the electronic devices wirelessly, for example by using WIFI, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband CDMA (W-CDMA), CDMA2000, IMT Single Carrier, Enhanced Data Rates for GSM Evolution (EDGE), Long-Term Evolution (LTE), Time-Division LTE (TD-LTE), High Performance Radio Local Area Network (HiperLAN), High Performance Radio Wide Area Network (HiperWAN), Local Multipoint Distribution Service (LMDS), Worldwide Interoperability for Microwave Access (WiMAX), ZIGBEE, BLUETOOTH, Flash Orthogonal Frequency-Division Multiplexing (Flash-OFDM), High Capacity Spatial Division Multiple Access (HC-SDMA), Universal Mobile Telecommunications System (UMTS), UMTS Time-Division Duplexing (UMTS-TDD), Evolved High Speed Packet Access (HSPA+), or Time Division Synchronous Code Division Multiple Access (TD-SCDMA).
- the gesture control device 100 can be, but is not limited to, a server, a communication device such as a Set Top Box, or an integrated chip or programming modules embedded in the first electronic device 200 or in the second electronic device 300 .
- the gesture control device 100 can include a storage device 11 and a processor 12 . The storage device 11 can store a gesture control system 10 .
- the gesture control system 10 can include a number of modules, which are collections of software instructions stored in the storage device 11 and executed by the processor 12 .
- the gesture control system 10 can include an acquiring module 101 , an establishing module 102 , a calculating module 103 , and a determining module 104 .
- the establishing module 102 establishes a three dimensional coordinate system for the gesture image, and determines a coordinate of a central point of each of the electronic devices (the first electronic device 200 and the second electronic device 300 ).
- plane coordinates of the gesture image are determined as an X axis and a Y axis of the coordinate system, and the depth direction of the gesture image is determined as a Z axis of the coordinate system.
- the coordinates of the center points of the electronic devices are predetermined according to a position of the image capturing device.
- the coordinate of the center point of the first electronic device 200 will be predetermined as the coordinate of the center point of the screen of the first electronic device 200 .
- the acquiring module 101 further determines a coordinate of a most extreme left-side horizontal position (left end) of the gesture at different depths and a coordinate of a most extreme right-side horizontal position (right end) of the gesture at different depths when the gesture has ended, for example as shown in FIG. 3 .
- the calculating module 103 calculates a coordinate of a center point between the left end and the right end of the gesture at each different depth. For example, a coordinate of a center point C1 in the depth Z1 is ((x′1+x″1)/2, (y′1+y″1)/2, z1).
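The center-point step above amounts to taking the midpoint of the two ends at each depth. A minimal Python sketch (the function name and data layout are assumptions for illustration, not code from the patent):

```python
# Hypothetical sketch: compute the center point C_i between the left end
# A_i = (x'_i, y'_i, z_i) and the right end B_i = (x''_i, y''_i, z_i)
# of the gesture at each depth z_i.
def center_points(left_ends, right_ends):
    centers = []
    for (xl, yl, zl), (xr, yr, zr) in zip(left_ends, right_ends):
        # Both ends of one horizontal slice lie at the same depth.
        assert zl == zr
        centers.append(((xl + xr) / 2.0, (yl + yr) / 2.0, zl))
    return centers
```

For example, `center_points([(0, 0, 1)], [(4, 2, 1)])` yields `[(2.0, 1.0, 1)]`.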
- the calculating module 103 calculates a regression plane equation according to the coordinates of the center points of the gesture image in different depths.
- the calculating module 103 calculates the regression plane equation by using a regression analysis method. By standardizing the coordinates of the center points, the calculating module 103 obtains the formulas set out in the Description.
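A regression plane of the form z = ax + by + c can be fitted to the center points by ordinary least squares. The sketch below uses NumPy's least-squares solver as one plausible realization; the function name and the use of NumPy are assumptions, not the patent's implementation:

```python
import numpy as np

def fit_regression_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c through 3-D points.

    points: iterable of (x, y, z) center points; returns (a, b, c).
    """
    pts = np.asarray(points, dtype=float)
    # Design matrix [x, y, 1] for the linear model z = a*x + b*y + c.
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, _residuals, _rank, _sv = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    a, b, c = coeffs
    return a, b, c
```

For center points lying exactly on a plane, the fit recovers its coefficients; with noisy points it minimizes the sum of squared residuals, matching the criterion described in the Description.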
- the determining module 104 determines which one of the electronic devices (for example the first electronic device 200 or the second electronic device 300 ) is intended as the target of the gesture by determining which one of the distances between the regression plane and the center points of the electronic devices is less than a preset value. If the determining module 104 determines that a distance between the regression plane and the center point of one electronic device is less than the preset value, the determining module determines that such electronic device is the object to be controlled by the gesture. If the determining module 104 determines that a distance between the regression plane and the center point of the electronic device is equal to or greater than the preset value, the determining module determines that such electronic device is not the intended target object to be controlled.
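Writing the fitted plane z = ax + by + c as ax + by − z + c = 0 gives a closed-form point-to-plane distance, so the target test reduces to a single threshold comparison. A hedged sketch (function and device names are invented for illustration):

```python
import math

def plane_point_distance(a, b, c, point):
    # Distance from (x0, y0, z0) to the plane a*x + b*y - z + c = 0,
    # i.e. the plane z = a*x + b*y + c.
    x0, y0, z0 = point
    return abs(a * x0 + b * y0 - z0 + c) / math.sqrt(a * a + b * b + 1.0)

def select_target(plane, device_centers, preset):
    # Return the first device whose center point lies closer to the
    # regression plane than the preset value, or None if none qualifies.
    a, b, c = plane
    for name, center in device_centers.items():
        if plane_point_distance(a, b, c, center) < preset:
            return name
    return None
```

With the plane z = 0 (a = b = c = 0), a device centered at (1, 1, 0.1) is at distance 0.1 and would be selected under a preset value of 0.5.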
- A method for determining a target electronic device controlled by a gesture is illustrated in FIG. 4 .
- the method is provided by way of example, as there are a variety of ways to carry out the method.
- Each block shown in FIG. 4 represents one or more processes, methods, or subroutines carried out in the example method. Additionally, the illustrated order of the blocks is by example only, and the order of the blocks can be changed.
- the example method can begin at block S 40 .
- an acquiring module of a gesture control device acquires an image of a gesture from an image capturing device of each electronic device in communication with the gesture control device, and acquires an orientation of the gesture in the image.
- the gesture image can include depth information.
- the gesture image can include a number of pictures; the orientation and motion of the gesture are acquired according to the position of the gesture in the different pictures.
- the orientation of the gesture can indicate that the gesture of a user has directionality. The gesture is determined to have ended if it is stopped for a preset time interval.
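The end-of-gesture condition above (no movement for a preset time interval) can be sketched as a debounce over timestamped position samples; the data layout and function name are assumptions for illustration, not the patent's implementation:

```python
def gesture_ended(samples, preset_interval, eps=1e-3):
    """samples: list of (timestamp, (x, y)) position samples, oldest first.

    Returns True once the tracked position has stayed within `eps`
    of its latest value for at least `preset_interval` seconds.
    """
    if not samples:
        return False
    t_last, p_last = samples[-1]
    still_since = t_last
    for t, p in reversed(samples):
        if max(abs(u - v) for u, v in zip(p, p_last)) > eps:
            break  # the hand was still moving at time t
        still_since = t
    return t_last - still_since >= preset_interval
```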
- an establishing module establishes a three-dimensional coordinate system for the gesture image, and determines a coordinate of a central point of each of the electronic devices in communication with the gesture control device.
- the coordinates of the center points of the electronic devices are predetermined according to a position of the image capturing device.
- the acquiring module determines a coordinate of a most extreme left-side horizontal position (left end) of the gesture at different depths and a coordinate of a most extreme right-side horizontal position (right end) of the gesture at different depths when the gesture has ended. For example, as shown in FIG. 3 :
- a coordinate of a left end A1 of the gesture in a depth Z1 is (x′1, y′1, z1)
- a coordinate of a right end B1 of the gesture in the depth Z1 is (x″1, y″1, z1)
- a coordinate of the left end A2 of the gesture in a depth Z2 is (x′2, y′2, z2)
- a coordinate of the right end B2 of the gesture in the depth Z2 is (x″2, y″2, z2)
- a coordinate of the left end An of the gesture in a depth Zn is (x′n, y′n, zn)
- a coordinate of the right end Bn of the gesture in the depth Zn is (x″n, y″n, zn)
- the calculating module calculates a coordinate of a center point between the left end and the right end of the gesture in each different depth.
- the calculating module calculates a regression plane equation according to the coordinate of the center points of the gesture image in different depths. The method for calculating the regression plane equation is described previously.
- the calculating module calculates a distance between the regression plane and the center points of each of the electronic devices. The method for calculating the distance is described previously.
- the determining module determines whether the distance between the regression plane and the center point of one of the electronic devices is less than a preset value. If the determining module determines that the distance is less than the preset value, the procedure goes to block S47. Otherwise, the procedure ends.
- the determining module determines that the electronic device is a target controlled by the gesture if the distance between the regression plane and the center point of this electronic device is less than the preset value.
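The flowchart blocks above (center points, regression plane, distances, threshold test) can be combined into a single end-to-end sketch; all names are invented, and NumPy's least-squares solver stands in for whatever fitting routine the device actually uses:

```python
import math
import numpy as np

def find_gesture_target(left_ends, right_ends, device_centers, preset):
    """left_ends/right_ends: per-depth gesture end coordinates (x, y, z);
    device_centers: {device name: center point}; preset: distance threshold."""
    # Center point of the gesture at each depth.
    centers = np.array([((xl + xr) / 2.0, (yl + yr) / 2.0, zl)
                        for (xl, yl, zl), (xr, yr, _zr)
                        in zip(left_ends, right_ends)])
    # Least-squares regression plane z = a*x + b*y + c through the centers.
    A = np.column_stack([centers[:, 0], centers[:, 1], np.ones(len(centers))])
    (a, b, c), *_ = np.linalg.lstsq(A, centers[:, 2], rcond=None)
    norm = math.sqrt(a * a + b * b + 1.0)
    # Distance from each device center to the plane; first one under the
    # preset value is the target of the gesture.
    for name, (x0, y0, z0) in device_centers.items():
        if abs(a * x0 + b * y0 - z0 + c) / norm < preset:
            return name
    return None
```

A device is returned only if its center point lies within the preset distance of the regression plane; otherwise the function returns None, mirroring the branch in which the procedure ends without a target.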
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
a coordinate of a center point C2 in the depth Z2 is ((x′2+x″2)/2, (y′2+y″2)/2, z2), and a coordinate of a center point Cn in the depth Zn is ((x′n+x″n)/2, (y′n+y″n)/2, zn).
Setting a standardized plane equation as z′ = ax′ + by′ + c, the residual error is ei = z′i − ẑ′i, where ẑ′i is the value predicted by the plane. The calculating module 103 determines the coefficients a, b, and c such that the sum of squared residuals Σ ei² is minimal, which reduces to minimizing a quadratic function of the form ƒ(a,b) = Aa² + 2Bab + Cb² + Da + Eb + F.
Claims (8)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW105116709A | 2016-05-27 | ||
| TW105116709 | 2016-05-27 | ||
| TW105116709A TWI597656B (en) | 2016-05-27 | 2016-05-27 | Gesture control system and method |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20170344813A1 (en) | 2017-11-30 |
| US10289901B2 (en) | 2019-05-14 |
Family
ID=60418069
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/604,666 US10289901B2 (en), Active 2037-06-10 | Gesture control device and method | 2016-05-27 | 2017-05-25 |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US10289901B2 (en) |
| TW (1) | TWI597656B (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104866083B (en) * | 2014-02-25 | 2020-03-17 | 中兴通讯股份有限公司 | Gesture recognition method, device and system |
| CN108960109B (en) * | 2018-06-26 | 2020-01-21 | 哈尔滨拓博科技有限公司 | Space gesture positioning device and method based on two monocular cameras |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070259716A1 (en) * | 2004-06-18 | 2007-11-08 | Igt | Control of wager-based game using gesture recognition |
| US20120113241A1 (en) | 2010-11-09 | 2012-05-10 | Qualcomm Incorporated | Fingertip tracking for touchless user interface |
| US20150199122A1 (en) * | 2012-06-29 | 2015-07-16 | Spotify Ab | Systems and methods for multi-context media control and playback |
| US20160224036A1 (en) * | 2015-01-30 | 2016-08-04 | Lutron Electronics Co., Inc. | Gesture-based load control via wearable devices |
| US9477302B2 (en) * | 2012-08-10 | 2016-10-25 | Google Inc. | System and method for programing devices within world space volumes |
| US20160321838A1 (en) * | 2015-04-29 | 2016-11-03 | Stmicroelectronics S.R.L. | System for processing a three-dimensional (3d) image and related methods using an icp algorithm |
| US20170277267A1 (en) * | 2014-02-25 | 2017-09-28 | Zte Corporation | Hand gesture recognition method, device, system, and computer storage medium |
| US9784554B2 (en) * | 2012-03-20 | 2017-10-10 | Hurco Companies, Inc. | Method for measuring a rotary axis of a machine tool system |
- 2016-05-27 TW TW105116709A patent/TWI597656B/en active
- 2017-05-25 US US15/604,666 patent/US10289901B2/en active Active
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070259716A1 (en) * | 2004-06-18 | 2007-11-08 | Igt | Control of wager-based game using gesture recognition |
| US20120113241A1 (en) | 2010-11-09 | 2012-05-10 | Qualcomm Incorporated | Fingertip tracking for touchless user interface |
| US9784554B2 (en) * | 2012-03-20 | 2017-10-10 | Hurco Companies, Inc. | Method for measuring a rotary axis of a machine tool system |
| US20150199122A1 (en) * | 2012-06-29 | 2015-07-16 | Spotify Ab | Systems and methods for multi-context media control and playback |
| US9477302B2 (en) * | 2012-08-10 | 2016-10-25 | Google Inc. | System and method for programing devices within world space volumes |
| US20170277267A1 (en) * | 2014-02-25 | 2017-09-28 | Zte Corporation | Hand gesture recognition method, device, system, and computer storage medium |
| US20160224036A1 (en) * | 2015-01-30 | 2016-08-04 | Lutron Electronics Co., Inc. | Gesture-based load control via wearable devices |
| US20160321838A1 (en) * | 2015-04-29 | 2016-11-03 | Stmicroelectronics S.R.L. | System for processing a three-dimensional (3d) image and related methods using an icp algorithm |
Non-Patent Citations (2)
| Title |
|---|
| Caon, Maurizio, Yong Yue, Julien Tscherrig, Elena Mugellini, and O. Abou Khaled. "Context-aware 3d gesture interaction based on multiple kinects." In Proceedings of the first international conference on ambient computing, applications, services and technologies , AMBIENT, pp. 7-12. 2011. (Year: 2011). * |
| Zhen-Zhang Li, Yuan-Xiang Zhang, Zhi-Heng Li; A Fingertip Detection and Interaction System Based on Stereo Vision; http://cvl.ice.cycu.edu.tw/publications/Li2011.pdf; 2011; pp. 2-9, Form 1; Taiwan. |
Also Published As
| Publication number | Publication date |
|---|---|
| US20170344813A1 (en) | 2017-11-30 |
| TW201741856A (en) | 2017-12-01 |
| TWI597656B (en) | 2017-09-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11189037B2 (en) | Repositioning method and apparatus in camera pose tracking process, device, and storage medium | |
| US10600252B2 (en) | Coarse relocalization using signal fingerprints | |
| JP7507964B2 (en) | Method and apparatus for adjusting shelf position and orientation by a mobile robot | |
| US10531065B2 (en) | Coarse relocalization using signal fingerprints | |
| CN108898171B (en) | Image recognition processing method, system and computer readable storage medium | |
| CN111357034A (en) | Point cloud generation method, system and computer storage medium | |
| CN105554367A (en) | Movement photographing method and mobile terminal | |
| US20190005678A1 (en) | Pose estimation using multiple cameras | |
| US20160112701A1 (en) | Video processing method, device and system | |
| CN114663618A (en) | Three-dimensional reconstruction and correction method, device, equipment and storage medium | |
| TWI748439B (en) | Positioning method and device based on shared map, electronic equipment and computer readable storage medium | |
| US9946957B2 (en) | Method, apparatus, computer program and system for image analysis | |
| US11670200B2 (en) | Orientated display method and apparatus for audio device, and audio device | |
| US10289901B2 (en) | Gesture control device and method | |
| CN105701762B (en) | Picture processing method and electronic equipment | |
| JP6378453B2 (en) | Feature extraction method and apparatus | |
| US10331946B2 (en) | Gesture control device and method | |
| KR20190035414A (en) | Wireless device and operating method thereof | |
| CN104038798A (en) | Image processing method, device and system | |
| US10360943B2 (en) | Method, device and system for editing video | |
| US20150051724A1 (en) | Computing device and simulation method for generating a double contour of an object | |
| US20160182817A1 (en) | Visualization for Viewing-Guidance during Dataset-Generation | |
| CN111639634B (en) | OCR (optical character recognition) method and electronic equipment | |
| US9904355B2 (en) | Display method, image capturing method and electronic device | |
| KR20230168929A (en) | Method for detecting object and electronic device for supporting the same |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: HON HAI PRECISION INDUSTRY CO., LTD., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LU, CHIH-TE;KUO, CHIN-PIN;TSAI, TUNG-TSO;AND OTHERS;REEL/FRAME:042512/0514 Effective date: 20170517 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |