US10289901B2 - Gesture control device and method

Gesture control device and method

Info

Publication number
US10289901B2
Authority
US
United States
Prior art keywords
gesture
coordinate
image
control device
electronic device
Prior art date
Legal status
Active, expires
Application number
US15/604,666
Other versions
US20170344813A1 (en)
Inventor
Chih-Te Lu
Chin-Pin Kuo
Tung-Tso Tsai
Jung-Hao Yang
Chih-Yuan Chuang
Tsung-Yuan Tu
Current Assignee
Hon Hai Precision Industry Co Ltd
Original Assignee
Hon Hai Precision Industry Co Ltd
Priority date
Filing date
Publication date
Application filed by Hon Hai Precision Industry Co Ltd filed Critical Hon Hai Precision Industry Co Ltd
Assigned to HON HAI PRECISION INDUSTRY CO., LTD. reassignment HON HAI PRECISION INDUSTRY CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHUANG, CHIH-YUAN, KUO, CHIN-PIN, LU, CHIH-TE, TSAI, TUNG-TSO, TU, TSUNG-YUAN, YANG, JUNG-HAO
Publication of US20170344813A1 publication Critical patent/US20170344813A1/en
Application granted granted Critical
Publication of US10289901B2 publication Critical patent/US10289901B2/en

Classifications

    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06V10/42 Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
    • G06V20/64 Three-dimensional objects
    • G06V40/107 Static hand or arm
    • G06V40/113 Recognition of static hand signs
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06K9/00335
    • G06K9/00201
    • G06K9/52
    • G06K9/78



Abstract

A device for recognizing control gestures and determining which one of a plurality of devices is the target of control acquires images of a gesture from each electronic device. A three dimensional coordinate system for each image is established, and the coordinate of a central point of each electronic device is determined. The extent of the gesture to the left and to the right at different depths is determined, and a regression plane equation is calculated. A distance between the regression plane and the center point of each electronic device is determined, and the electronic device with the closest center point (the shortest distance) is determined as the target device of the control gesture. A gesture control method is also provided.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Taiwan Patent Application No. 105116709 filed on May 27, 2016.
FIELD
The subject matter herein generally relates to interface devices, and particularly to a gesture control device and method capable of determining an object to be controlled by gesture, among a plurality of electronic devices.
BACKGROUND
Electronic devices can be controlled by gestures. A gesture command usually controls one electronic device. However, a number of electronic devices may be close together, and it is difficult to determine which one of the electronic devices should be controlled by the gesture.
BRIEF DESCRIPTION OF THE DRAWINGS
Implementations of the present technology will now be described, by way of example only, with reference to the attached figures.
FIG. 1 is a block diagram illustrating an exemplary embodiment of an operating environment of a device with system for control by gesture.
FIG. 2 is a block diagram illustrating an exemplary embodiment of a gesture control system running in the device of FIG. 1.
FIG. 3 is a schematic diagram illustrating an exemplary embodiment of a working process of the device of FIG. 1.
FIG. 4 is a flowchart illustrating an exemplary embodiment of a gesture control method.
DETAILED DESCRIPTION
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the related relevant feature being described. The drawings are not necessarily to scale and the proportions of certain parts may be exaggerated to better illustrate details and features. The description is not to be considered as limiting the scope of the embodiments described herein.
The term “comprising” means “including, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in the so-described combination, group, series, and the like.
FIG. 1 illustrates an exemplary embodiment of an operating environment of a gesture control device 100. The gesture control device 100 can communicate with a number of electronic devices and can determine which one of the electronic devices should be controlled by a gesture. In the exemplary embodiment, the electronic devices can be, but are not limited to, televisions, air conditioners, fridges, multimedia players, monitors, computers, and the like. The gesture control device 100 can communicate with the electronic devices wirelessly, for example by using WIFI, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband CDMA (W-CDMA), CDMA2000, IMT Single Carrier, Enhanced Data Rates for GSM Evolution (EDGE), Long-Term Evolution (LTE), Time-Division LTE (TD-LTE), High Performance Radio Local Area Network (HiperLAN), High Performance Radio Wide Area Network (HiperWAN), Local Multipoint Distribution Service (LMDS), Worldwide Interoperability for Microwave Access (WiMAX), ZIGBEE, BLUETOOTH, Flash Orthogonal Frequency-Division Multiplexing (Flash-OFDM), High Capacity Spatial Division Multiple Access (HC-SDMA), Universal Mobile Telecommunications System (UMTS), UMTS Time-Division Duplexing (UMTS-TDD), Evolved High Speed Packet Access (HSPA+), Time Division Synchronous Code Division Multiple Access (TD-SCDMA), Evolution-Data Optimized (EV-DO), Digital Enhanced Cordless Telecommunications (DECT), or the like. The gesture control device 100 can also communicate with the electronic devices by wires.
In the exemplary embodiment, a first electronic device 200 and a second electronic device 300 are taken as examples of devices communicating with the gesture control device 100. The first electronic device 200 includes a first image capturing device 20, and the second electronic device 300 includes a second image capturing device 30. The first image capturing device 20 is a depth camera configured to capture images of gestures within a first effective range R1, and the second image capturing device 30 is a depth camera configured to capture gesture images within a second effective range R2. The first electronic device 200 and the second electronic device 300 can execute functions according to gestures captured by the image capturing devices 20 and/or 30.
In the exemplary embodiment, the gesture control device 100 can be, but is not limited to, a server, a communication device such as a Set Top Box, or an integrated chip or programming modules embedded in the first electronic device 200 or in the second electronic device 300.
Referring to FIG. 2, in the exemplary embodiment, the gesture control device 100 can include, but is not limited to, a storage device 11 and a processor 12. The storage device 11 can be, but is not limited to, an internal storage system, such as a flash memory, a random access memory (RAM) for temporary storage of information, and/or a read-only memory (ROM) for permanent storage of information. The storage device 11 can also be a storage system, such as a hard disk, a storage card, or a data storage medium. The processor 12 can be, but is not limited to, a central processing unit, a digital signal processor, or a single chip, for example.
In the exemplary embodiment, the storage device 11 can store a gesture control system 10. The gesture control system 10 can include a number of modules, which are collections of software instructions stored in the storage device 11 and executed by the processor 12. In the exemplary embodiment, the gesture control system 10 can include an acquiring module 101, an establishing module 102, a calculating module 103, and a determining module 104.
The acquiring module 101 acquires an image of a gesture from each of the first image capturing device 20 and the second image capturing device 30, and acquires an orientation of the gesture in the gesture image. In the exemplary embodiment, the gesture image can include depth information about objects in the image. The gesture image can include a number of pictures, so the orientation and motion of the gesture can be acquired according to the position of the gesture in the different pictures. The orientation of the gesture indicates that the user's gesture has a direction. The gesture is determined to have ended if it stops for a preset time interval. In the exemplary embodiment, the first image capturing device 20 and the second image capturing device 30 capture the gesture image when a gesture is detected within their respective effective ranges.
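The end-of-gesture test described above can be sketched in a few lines. The following Python fragment is illustrative only and is not part of the patent; the function name, the pixel threshold motion_eps, and the sampling scheme are assumptions made for the sketch.

```python
import numpy as np

def gesture_ended(positions, timestamps, preset_interval=1.0, motion_eps=5.0):
    """Hypothetical check: treat the gesture as ended when the tracked
    hand stays within motion_eps for at least preset_interval seconds."""
    end_time = timestamps[-1]
    # Keep only the samples inside the trailing preset interval.
    recent = np.asarray([p for p, t in zip(positions, timestamps)
                         if end_time - t <= preset_interval], dtype=float)
    if len(recent) < 2:
        return False  # not enough history to decide
    # "Stopped" means all recent positions cluster within motion_eps.
    return float(np.ptp(recent, axis=0).max()) <= motion_eps
```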
The establishing module 102 establishes a three dimensional coordinate system for the gesture image, and determines the coordinate of a central point of each of the electronic devices (the first electronic device 200 and the second electronic device 300). In the exemplary embodiment, the plane coordinates of the gesture image are taken as the X axis and the Y axis of the coordinate system, and the depth direction of the gesture image is taken as the Z axis. The coordinates of the center points of the electronic devices are predetermined according to the position of each image capturing device. For example, if the first electronic device 200 is a computer and the image capturing device 20 is located in the middle of the upper edge of the screen of the first electronic device 200, the coordinate of the center point of the first electronic device 200 is predetermined as the coordinate of the center point of the screen.
The acquiring module 101 further determines the coordinate of the extreme left-side horizontal position (left end) of the gesture and the coordinate of the extreme right-side horizontal position (right end) of the gesture at different depths when the gesture has ended. For example, as shown in FIG. 3, the coordinate of the left end $A_1$ of the gesture at depth $Z_1$ is $(x'_1, y'_1, z_1)$, the coordinate of the right end $B_1$ at depth $Z_1$ is $(x''_1, y''_1, z_1)$, the coordinate of the left end $A_2$ at depth $Z_2$ is $(x'_2, y'_2, z_2)$, and the coordinate of the right end $B_2$ at depth $Z_2$ is $(x''_2, y''_2, z_2)$. In general, the coordinate of the left end $A_n$ at depth $Z_n$ is $(x'_n, y'_n, z_n)$, and the coordinate of the right end $B_n$ at depth $Z_n$ is $(x''_n, y''_n, z_n)$.
The calculating module 103 calculates the coordinate of a center point between the left end and the right end of the gesture at each different depth. For example, the coordinate of the center point $C_1$ at depth $Z_1$ is

$$C_1 = \left(\frac{x'_1 + x''_1}{2}, \frac{y'_1 + y''_1}{2}, z_1\right),$$

the coordinate of the center point $C_2$ at depth $Z_2$ is

$$C_2 = \left(\frac{x'_2 + x''_2}{2}, \frac{y'_2 + y''_2}{2}, z_2\right),$$

and the coordinate of the center point $C_n$ at depth $Z_n$ is

$$C_n = \left(\frac{x'_n + x''_n}{2}, \frac{y'_n + y''_n}{2}, z_n\right).$$
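As a minimal illustration of this midpoint step (not from the patent; the function name and array layout are assumptions), the center points can be computed as:

```python
import numpy as np

def center_points(left_ends, right_ends):
    """Midpoints between the gesture's left and right ends, per depth.

    left_ends and right_ends are (n, 3) arrays holding the coordinates
    (x', y', z) and (x'', y'', z) for depths Z1..Zn.
    """
    left = np.asarray(left_ends, dtype=float)
    right = np.asarray(right_ends, dtype=float)
    return (left + right) / 2.0  # rows are the center points C1..Cn
```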
The calculating module 103 calculates a regression plane equation according to the coordinates of the center points of the gesture image at different depths.
In the exemplary embodiment, the calculating module 103 calculates the regression plane equation by using a regression analysis method. Standardizing the coordinates of the center points gives:

$$x'_k = \frac{x_k - \bar{x}}{S_x}, \qquad y'_k = \frac{y_k - \bar{y}}{S_y}, \qquad z'_k = \frac{z_k - \bar{z}}{S_z},$$

wherein

$$\bar{x} = \frac{x_1 + x_2 + \cdots + x_n}{n}, \qquad S_x = \sqrt{\frac{\sum_{i=1}^{n}(x_i - \bar{x})^2}{n - 1}},$$

and $\bar{y}$, $S_y$, $\bar{z}$, $S_z$ are defined in the same way.

Setting the standardized plane equation as $z' = ax' + by' + c$, the residual error is $e_i = z'_i - \hat{z}'_i$. The calculating module 103 calculates the values of $a$, $b$, and $c$ for which $\sum_{i=1}^{n} e_i^2$ is minimal. Because the standardized coordinates satisfy $\sum_{i=1}^{n} x'_i = \sum_{i=1}^{n} y'_i = \sum_{i=1}^{n} z'_i = 0$, expanding the sum gives

$$\sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n}(ax'_i + by'_i + c - z'_i)^2 = \sum_{i=1}^{n}(ax'_i + by'_i - z'_i)^2 + nc^2,$$

so the sum is minimized with $c = 0$, and the formula becomes:

$$\sum_{i=1}^{n}(ax'_i + by'_i - z'_i)^2 = \Big(\sum_{i=1}^{n} x_i'^2\Big)a^2 + \Big(2\sum_{i=1}^{n} x'_i y'_i\Big)ab + \Big(\sum_{i=1}^{n} y_i'^2\Big)b^2 - 2\Big(\sum_{i=1}^{n} x'_i z'_i\Big)a - 2\Big(\sum_{i=1}^{n} y'_i z'_i\Big)b + \sum_{i=1}^{n} z_i'^2.$$

This is a quadratic function $f(a,b) = Aa^2 + 2Bab + Cb^2 + Da + Eb + F$, wherein

$$A = \sum_{i=1}^{n} x_i'^2, \quad B = \sum_{i=1}^{n} x'_i y'_i, \quad C = \sum_{i=1}^{n} y_i'^2, \quad D = -2\sum_{i=1}^{n} x'_i z'_i, \quad E = -2\sum_{i=1}^{n} y'_i z'_i, \quad F = \sum_{i=1}^{n} z_i'^2.$$

By the Cauchy inequality,

$$\Big(\sum_{i=1}^{n} x_i'^2\Big)\Big(\sum_{i=1}^{n} y_i'^2\Big) \geq \Big(\sum_{i=1}^{n} x'_i y'_i\Big)^2 \implies B^2 - AC \leq 0,$$

and, generally speaking, $B^2 - AC \neq 0$.

For a quadratic function $f(x,y) = ax^2 + 2bxy + cy^2 + dx + ey + f$ with $a > 0$ and $b^2 - ac < 0$, $f(x,y)$ is minimal at $(x,y) = (h,k)$, wherein

$$h = \frac{be - cd}{2(ac - b^2)}, \qquad k = \frac{bd - ae}{2(ac - b^2)}.$$

Applying this to $f(a,b) = Aa^2 + 2Bab + Cb^2 + Da + Eb + F$ gives

$$a = \frac{BE - CD}{2(AC - B^2)}, \qquad b = \frac{BD - AE}{2(AC - B^2)}.$$

Writing $r_{xx} = \sum_{i=1}^{n} x_i'^2$, $r_{xy} = \sum_{i=1}^{n} x'_i y'_i$, $r_{yy} = \sum_{i=1}^{n} y_i'^2$, $r_{xz} = \sum_{i=1}^{n} x'_i z'_i$, and $r_{yz} = \sum_{i=1}^{n} y'_i z'_i$, the parameters $a$ and $b$ reduce to

$$a = \frac{r_{xz} r_{yy} - r_{xy} r_{yz}}{r_{xx} r_{yy} - r_{xy}^2}, \qquad b = \frac{r_{xx} r_{yz} - r_{xy} r_{xz}}{r_{xx} r_{yy} - r_{xy}^2}.$$

The calculated regression plane equation, in the original coordinates, is:

$$\frac{z - \bar{z}}{S_z} = a\left(\frac{x - \bar{x}}{S_x}\right) + b\left(\frac{y - \bar{y}}{S_y}\right).$$
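A compact way to reproduce this fit numerically is ordinary least squares on the plane z = ax + by + c, which minimizes the same sum of squared residuals as the closed-form derivation above. The sketch below is an illustration under that assumption (numpy's lstsq stands in for the standardization-based algebra), not the patent's literal procedure.

```python
import numpy as np

def fit_regression_plane(centers):
    """Fit z = a*x + b*y + c to the gesture center points by least
    squares, then return (p, q, r, s) with p*x + q*y + r*z + s = 0."""
    centers = np.asarray(centers, dtype=float)
    x, y, z = centers[:, 0], centers[:, 1], centers[:, 2]
    # Design matrix [x, y, 1]; lstsq minimizes sum((a*x + b*y + c - z)^2),
    # the same criterion minimized in the derivation above.
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
    # Rearrange z = a*x + b*y + c into a*x + b*y - z + c = 0.
    return a, b, -1.0, c
```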
The calculating module 103 further calculates the distance between the regression plane and the center point of each of the electronic devices (the first electronic device 200 and the second electronic device 300). For example, if the coordinate of the center point of the first electronic device 200 is $(x_0, y_0, z_0)$, and the regression plane equation is expanded to the form $px + qy + rz + s = 0$, the calculating module 103 calculates the distance between the center point of the first electronic device 200 and the regression plane according to the formula:

$$\frac{|px_0 + qy_0 + rz_0 + s|}{\sqrt{p^2 + q^2 + r^2}}.$$
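The point-to-plane distance formula translates directly into code; this helper is a sketch with an assumed signature:

```python
def plane_point_distance(plane, point):
    """Distance from point (x0, y0, z0) to the plane p*x + q*y + r*z + s = 0."""
    p, q, r, s = plane
    x0, y0, z0 = point
    return abs(p * x0 + q * y0 + r * z0 + s) / (p**2 + q**2 + r**2) ** 0.5
```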
The determining module 104 determines which one of the electronic devices (for example, the first electronic device 200 or the second electronic device 300) is intended as the target of the gesture by determining which of the distances between the regression plane and the center points of the electronic devices is less than a preset value. If the determining module 104 determines that the distance between the regression plane and the center point of an electronic device is less than the preset value, it determines that that electronic device is the object to be controlled by the gesture. If the distance is equal to or greater than the preset value, it determines that the electronic device is not the intended target object to be controlled.
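Combining the pieces, the target-selection rule might look like the sketch below. The dictionary of device centers and the tie-breaking by shortest distance are assumptions made for the example; the patent only requires the distance to be under the preset value.

```python
def select_target(plane, device_centers, preset_value):
    """Return the device whose center point lies within preset_value of
    the regression plane; ties go to the shortest distance."""
    best = None
    for name, center in device_centers.items():
        d = plane_point_distance(plane, center)
        if d < preset_value and (best is None or d < best[1]):
            best = (name, d)
    return best[0] if best else None

# Example: two devices whose centers are predetermined from their
# camera positions (coordinates and threshold are placeholders).
# target = select_target((p, q, r, s),
#                        {"device_200": (x0, y0, z0),
#                         "device_300": (x1, y1, z1)},
#                        preset_value=0.2)
```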
A method for determining a target electronic device controlled by a gesture is illustrated in FIG. 4. The method is provided by way of example, as there are a variety of ways to carry out the method. Each block shown in FIG. 4 represents one or more processes, methods, or subroutines carried out in the example method. Additionally, the illustrated order of blocks is by example only, and the order of the blocks can be changed. The example method can begin at block S40.
At block S40, an acquiring module of a gesture control device acquires an image of a gesture from the image capturing device of each electronic device in communication with the gesture control device, and acquires an orientation of the gesture in the gesture image. In the exemplary embodiment, the gesture image can include depth information. The gesture image can include a number of pictures, and the orientation and motion of the gesture are acquired according to the position of the gesture in the different pictures. The orientation of the gesture indicates that the user's gesture has a direction. The gesture is determined to have ended if it stops for a preset time interval.
At block S41, an establishing module establishes a three dimensional coordinate system for the gesture image, and determines the coordinate of a central point of each of the electronic devices in communication with the gesture control device. In the exemplary embodiment, the coordinates of the center points of the electronic devices are predetermined according to the position of each image capturing device.
At block S42, the acquiring module determines the coordinate of the extreme left-side horizontal position (left end) of the gesture and the coordinate of the extreme right-side horizontal position (right end) of the gesture at different depths when the gesture has ended. For example, as shown in FIG. 3, the coordinate of the left end $A_1$ of the gesture at depth $Z_1$ is $(x'_1, y'_1, z_1)$, the coordinate of the right end $B_1$ at depth $Z_1$ is $(x''_1, y''_1, z_1)$, the coordinate of the left end $A_2$ at depth $Z_2$ is $(x'_2, y'_2, z_2)$, and the coordinate of the right end $B_2$ at depth $Z_2$ is $(x''_2, y''_2, z_2)$. Thus, the coordinate of the left end $A_n$ at depth $Z_n$ is $(x'_n, y'_n, z_n)$, and the coordinate of the right end $B_n$ at depth $Z_n$ is $(x''_n, y''_n, z_n)$.
At block S43, the calculating module calculates the coordinate of a center point between the left end and the right end of the gesture at each different depth.
At block S44, the calculating module calculates a regression plane equation according to the coordinate of the center points of the gesture image in different depths. The method for calculating the regression plane equation is described previously.
At block S45, the calculating module calculates a distance between the regression plane and the center points of each of the electronic devices. The method for calculating the distance is described previously.
At block S46, the determining module determines whether the distance between the regression plane and the center point of one of the electronic devices is less than a preset value. If the determining module determines that the distance between the regression plane and the center point of one of the electronic devices is less than the preset value, the procedure goes to block S47. Otherwise, the procedure ends.
At block S47, the determining module determines that the electronic device is a target controlled by the gesture if the distance between the regression plane and the center point of this electronic device is less than the preset value.
It is believed that the present embodiments and their advantages will be understood from the foregoing description, and it will be apparent that various changes may be made thereto without departing from the spirit and scope of the disclosure or sacrificing all of its material advantages; the examples hereinbefore described are merely exemplary embodiments of the present disclosure.

Claims (8)

What is claimed is:
1. A gesture control device to communicate with at least two electronic devices, each of the electronic devices comprising an image capturing device, the gesture control device comprising:
at least one processor; and
at least one storage device storing one or more programs, when executed by the at least one processor, the one or more programs cause the at least one processor to:
acquire an image of a gesture from the image capturing device of each electronic device communicated with the gesture control device and acquire an orientation of a gesture in the image of the gesture;
establish a three dimensional coordinate system for the gesture image, and determine a coordinate of a central point of each of the electronic devices communicated with the gesture control device;
determine a coordinate of a left end of the gesture in different depths and a coordinate of a right end of the gesture in different depths when the gesture is ended;
calculate a coordinate of a center point between the left end and the right end of the gesture at each different depth;
calculate a regression plane equation according to the coordinate of the center points of the gesture image in different depths;
calculate a distance between the regression plane and the central points of each of the electronic devices; and
determine which one of the electronic devices is intended as a target controlled by the gesture by determining which one of the distances between the regression plane and the central point of the electronic device is less than a preset value.
2. The gesture control device of claim 1, wherein the gesture image comprises a plurality of pictures, the orientation of the gesture is acquired according to a position of the gesture in different pictures.
3. The gesture control device of claim 1, wherein the orientation is determined as ended if the gesture is stopped for a preset time interval.
4. The gesture control device of claim 1, wherein a formula for calculating the distance between the regression plane and the center point of the electronic device is:
$$\frac{|px_0 + qy_0 + rz_0 + s|}{\sqrt{p^2 + q^2 + r^2}},$$
wherein $(x_0, y_0, z_0)$ is the coordinate of the center point of the electronic device, and the regression plane equation is $px + qy + rz + s = 0$.
5. A gesture control method applied in a gesture control device, the gesture control device configured to communicate with at least two electronic devices, each of the electronic devices comprising an image capturing device, the gesture control method comprising:
acquiring a gesture image from the image capturing device of each electronic device communicated with the gesture control device and acquiring an orientation of a gesture in the gesture image;
establishing a three dimensional coordinate system for the gesture image, and determining a coordinate of a central point of each of the electronic devices communicated with the gesture control device;
determining a coordinate of a left end of the gesture in different depths and a coordinate of a right end of the gesture in different depths when the gesture is ended;
calculating a regression plane equation according to the coordinate of the center points of the gesture image in different depths;
calculating a distance between the regression plane and the central points of each of the electronic devices; and
determining which one of the electronic devices is intended as a target controlled by the gesture by determining which one of the distances between the regression plane and the central point of the electronic device is less than a preset value.
6. The gesture control method of claim 5, wherein the gesture image comprises a plurality of pictures, the orientation of the gesture is acquired according to a position of the gesture in different pictures.
7. The gesture control method of claim 5, wherein the orientation is determined as ended if the gesture is stopped for a preset time interval.
8. The gesture control method of claim 5, wherein a formula for calculating the distance between the regression plane and the center point of the electronic device is:
$$\frac{|px_0 + qy_0 + rz_0 + s|}{\sqrt{p^2 + q^2 + r^2}},$$
wherein $(x_0, y_0, z_0)$ is the coordinate of the center point of the electronic device, and the regression plane equation is $px + qy + rz + s = 0$.
US15/604,666 2016-05-27 2017-05-25 Gesture control device and method Active 2037-06-10 US10289901B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
TW105116709A 2016-05-27
TW105116709 2016-05-27
TW105116709A TWI597656B (en) 2016-05-27 2016-05-27 Gesture control system and method

Publications (2)

Publication Number Publication Date
US20170344813A1 US20170344813A1 (en) 2017-11-30
US10289901B2 true US10289901B2 (en) 2019-05-14

Family

Family ID: 60418069

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/604,666 Active 2037-06-10 US10289901B2 (en) 2016-05-27 2017-05-25 Gesture control device and method

Country Status (2)

Country Link
US (1) US10289901B2 (en)
TW (1) TWI597656B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104866083B (en) * 2014-02-25 2020-03-17 中兴通讯股份有限公司 Gesture recognition method, device and system
CN108960109B (en) * 2018-06-26 2020-01-21 哈尔滨拓博科技有限公司 Space gesture positioning device and method based on two monocular cameras


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070259716A1 (en) * 2004-06-18 2007-11-08 Igt Control of wager-based game using gesture recognition
US20120113241A1 (en) 2010-11-09 2012-05-10 Qualcomm Incorporated Fingertip tracking for touchless user interface
US9784554B2 (en) * 2012-03-20 2017-10-10 Hurco Companies, Inc. Method for measuring a rotary axis of a machine tool system
US20150199122A1 (en) * 2012-06-29 2015-07-16 Spotify Ab Systems and methods for multi-context media control and playback
US9477302B2 (en) * 2012-08-10 2016-10-25 Google Inc. System and method for programing devices within world space volumes
US20170277267A1 (en) * 2014-02-25 2017-09-28 Zte Corporation Hand gesture recognition method, device, system, and computer storage medium
US20160224036A1 (en) * 2015-01-30 2016-08-04 Lutron Electronics Co., Inc. Gesture-based load control via wearable devices
US20160321838A1 (en) * 2015-04-29 2016-11-03 Stmicroelectronics S.R.L. System for processing a three-dimensional (3d) image and related methods using an icp algorithm

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Caon, Maurizio, Yong Yue, Julien Tscherrig, Elena Mugellini, and O. Abou Khaled. "Context-aware 3d gesture interaction based on multiple kinects." In Proceedings of the first international conference on ambient computing, applications, services and technologies , AMBIENT, pp. 7-12. 2011. (Year: 2011). *
Zhen-Zhang Li, Yuan-Xiang Zhang, Zhi-Heng Li; A Fingertip Detection and Interaction System Based on Stereo Vision; http://cvl.ice.cycu.edu.tw/publications/Li2011.pdf; 2011; pp. 2-9, Form 1; Taiwan.

Also Published As

Publication number Publication date
US20170344813A1 (en) 2017-11-30
TW201741856A (en) 2017-12-01
TWI597656B (en) 2017-09-01


Legal Events

Date Code Title Description
AS Assignment

Owner name: HON HAI PRECISION INDUSTRY CO., LTD., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LU, CHIH-TE;KUO, CHIN-PIN;TSAI, TUNG-TSO;AND OTHERS;REEL/FRAME:042512/0514

Effective date: 20170517

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4