CN103034333A - Gesture recognition device and gesture recognition method - Google Patents

Gesture recognition device and gesture recognition method

Info

Publication number
CN103034333A
Authority
CN
China
Prior art keywords
image
colour
hand
fingertip
judge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012105506792A
Other languages
Chinese (zh)
Inventor
姜智尹
黄子轩
张哲维
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Huaying Display Technology Co Ltd
Chunghwa Picture Tubes Ltd
Original Assignee
Fujian Huaying Display Technology Co Ltd
Chunghwa Picture Tubes Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Huaying Display Technology Co Ltd, Chunghwa Picture Tubes Ltd filed Critical Fujian Huaying Display Technology Co Ltd
Priority to CN2012105506792A priority Critical patent/CN103034333A/en
Publication of CN103034333A publication Critical patent/CN103034333A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture recognition device comprising an image processing module. The image processing module processes an image and comprises a skin color detection unit, a feature detection unit, and an edge detection unit. The skin color detection unit judges whether the skin color area in the image is larger than a threshold value; the feature detection unit is electrically connected to the skin color detection unit and recognizes a hand image in the image; and the edge detection unit is electrically connected to the feature detection unit and determines the centroid coordinate, the number of fingertips, and the fingertip coordinates of the hand image.

Description

Gesture recognition device and gesture recognition method
Technical field
The present invention relates to a gesture recognition device and method, and in particular to a device that performs gesture recognition through image processing, and a gesture recognition method using this device.
Background art
Users place ever-higher demands on the operation of human-computer interface (man-machine interface) systems, hoping that operating procedures can be simplified and interface operation made more intuitive. Human-computer interface systems are mostly operated through four mechanisms: keyboard, mouse, touch, and remote control. Keyboard operation suits text input, but most display interfaces today are graphical, so keyboards are inconvenient in use. Although mouse or remote-control operation offers good convenience, the user must rely on an external device, and the operating distance is limited. Touch operation restricts the user to operating the human-machine interface with a finger or stylus within the reachable range of the screen.
At present, another operating mechanism for human-computer interface systems models the hand as a mouse. For example, the Kinect human-computer interface system first tracks the hand to obtain the hand coordinates, then links the system coordinates with the hand coordinates, so that the hand can be used as a mouse. If the user pushes the hand forward (toward the image sensor), a corresponding command such as a mouse click can be issued. However, the Kinect hardware architecture covers a matrix infrared emitter, an infrared camera, a visible-light camera, a microphone array, a motor, and so on, which makes the hardware cost high. Although the Kinect hardware architecture can accurately obtain the hand's Z-axis coordinate value, in practical applications it suffices to know the relative front-back position of the hand to determine the corresponding command.
Therefore, there is a need to provide a gesture recognition device and method that combine freedom of operating space with hand operation, so as to solve the aforementioned problems.
Summary of the invention
An object of the present invention is to overcome the insufficient spatial freedom of existing operating mechanisms and to provide a gesture recognition method and device that combine freedom of operating space with hand operation.
To achieve the above object, the present invention provides a gesture recognition method comprising: providing an image; converting the RGB (three-primary-color) map of the image to a grayscale map; identifying a hand image in the image; and determining at least one of the centroid coordinate, fingertip count, and fingertip coordinates of the hand image.
To achieve the above object, the present invention further provides a gesture recognition device comprising an image processing module. The image processing module processes an image and comprises: a skin color detection unit for judging whether the skin color area in the image is greater than a threshold; a feature detection unit, electrically connected to the skin color detection unit, for recognizing a hand image in the image; and an edge detection unit, electrically connected to the feature detection unit, for determining at least one of the centroid coordinate, fingertip count, and fingertip coordinates of the hand image.
The gesture recognition method and device of the present invention use the skin color detection unit to find the skin color area, then use the feature detection unit to locate the hand image in the image, and use the edge detection unit to determine the centroid coordinate, fingertip count, and fingertip coordinates of the hand image. Subsequent tracking of changes in the hand's spatial position, in the number of fingertips, and in finger bending no longer requires scanning and recognizing the entire frame. The image data per frame are therefore small, which speeds up hand image recognition, and the control module then performs the corresponding action according to the detected changes. In use, the user is not confined to a fixed space and can operate and control more freely.
Description of drawings
Fig. 1 is an architecture diagram of a human-computer interface system with the gesture recognition device of the present invention.
Fig. 2 is a flowchart of the gesture recognition method of the present invention.
Fig. 3 is a flowchart of the skin color detection method of the present invention.
Fig. 4a is a schematic diagram of hand image recognition of the present invention, shown as a grayscale map.
Fig. 4b is a schematic diagram of hand image recognition of the present invention, shown as a grayscale map with the hand image selected.
Fig. 4c is a schematic diagram of hand image recognition of the present invention, shown as a grayscale map marking the convex points, concave points, and centroid coordinate.
Fig. 5 is a schematic diagram of a user operating the human-computer interface system of the present invention.
[Description of main element symbols]
1 human-computer interface system  10 gesture recognition device
100 image capture unit  20 display unit
200 image processing module  210 skin color detection unit
220 feature detection unit  230 edge detection unit
240 database  250 control module
251 touch point function  252 gesture arbitration function
300 user interface  310 graphical user interface
320 user-centered interface  410 grayscale map
420 first hand image  450 convex point
460 concave point  470 centroid coordinate
510 web camera  520 image processing module
530 computer screen  540 user
Steps S100 ~ S112  Steps S1021 ~ S1023
Embodiment
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below through specific embodiments and the accompanying drawings.
Referring to Fig. 1, which is an architecture diagram of a human-computer interface system with a gesture recognition device according to an embodiment of the present invention: the human-computer interface system 1 comprises a gesture recognition device 10 and a display unit 20. The gesture recognition device 10 comprises an image capture unit 100, an image processing module 200, and a user interface 300. The image processing module 200 comprises a skin color detection unit 210, a feature detection unit 220, an edge detection unit 230, a database 240, and a control module 250. The image processing module 200 is electrically connected to the image capture unit 100. The user interface 300 is electrically connected to the image processing module 200.
Fig. 2 is a flowchart of the gesture recognition method of an embodiment of the present invention; please refer to Fig. 1 at the same time. The gesture recognition method comprises the following steps:
In step S100, a first image is provided. In this step, the image capture unit 100 captures the first image and, being electrically connected to the skin color detection unit 210, passes the first image to the skin color detection unit 210. The image capture unit 100 may be a video camera or an image sensor.
In step S102, the skin color detection unit 210 performs a skin color detection step, converting the RGB map of the first image to a grayscale (gray-level) map. Referring to Fig. 3, the skin color detection step comprises the following steps:
In step S1021, the RGB color model of the first image is converted to the hue/saturation/value (HSV) color model. In this step, the skin color detection unit 210 receives the first image as a frame from the image capture unit 100. The first image is originally represented in the RGB color model, but to judge the skin color it is converted to the HSV color model for convenient subsequent processing.
In step S1022, the value (brightness) parameter of the first image is removed, and skin color tracking is performed with the hue and saturation parameters to determine the skin color area of the first image. In this step, the skin color detection unit 210 first removes the brightness parameter of the first image to reduce the influence of ambient light. Exploiting the fact that palm skin generates no melanin, a range of hue and saturation is defined, the parts of the image falling outside this range are filtered out, and the first image is rendered in grayscale to form grayscale map 410 (shown in Fig. 4a). The area of the image falling within this range is then calculated as the skin color area.
In step S1023, the skin color detection unit 210 judges whether the skin color area of the first image is greater than a threshold. The threshold requires the skin color area to occupy at least a predetermined ratio of the whole image range. When the skin color area is less than the threshold, the flow returns to step S100; that is, the skin color detection unit 210 ends the detection flow, returns to its initial state, and waits to repeat with the next image. When the skin color area is greater than the threshold, the skin color detection unit 210 passes the grayscale map of the first image to the feature detection unit 220. Suppose the image range is 640×480; then the skin color area in the first image must be at least 300×200, where 300×200 is the aforementioned threshold.
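For illustration only (the patent specifies no implementation), the skin color detection of steps S1021 to S1023 can be sketched in Python with OpenCV. The hue and saturation bounds below are assumed palm-like values, and the area threshold reuses the 300×200-in-640×480 ratio from the example above:

```python
import cv2
import numpy as np

def detect_skin(frame_bgr, min_ratio=(300 * 200) / (640 * 480)):
    """Steps S1021-S1023 as a sketch: convert RGB to HSV, ignore the
    V (brightness) channel, keep hue/saturation in a palm-like range,
    and test the skin area against a threshold ratio of the frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Illustrative hue/saturation bounds; V spans the full 0-255 range,
    # i.e. the brightness parameter is effectively removed.
    lower = np.array([0, 40, 0], dtype=np.uint8)
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Skin color area = number of pixels that fall inside the range.
    skin_area = cv2.countNonZero(mask)
    h, w = mask.shape
    large_enough = skin_area > min_ratio * h * w  # step S1023
    # Render the frame in grayscale with non-skin regions filtered out,
    # mirroring grayscale map 410 of Fig. 4a.
    gray_full = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.bitwise_and(gray_full, gray_full, mask=mask)
    return gray, large_enough
```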
In step S104, the feature detection unit 220 performs a feature detection step to identify the first hand image in the first image. In this step, the feature detection unit 220 is electrically connected to the skin color detection unit 210; upon receiving the grayscale map of the first image from the skin color detection unit 210, it uses the Haar algorithm to recognize the first hand image in the first image. According to the Haar algorithm, multiple vectors can be assembled to build a hand feature parameter model, from which the individually corresponding sample feature parameter values are obtained. During hand recognition, the feature detection unit 220 captures the features of each hand region to calculate the corresponding region parameter feature value of each hand region. Next, the region parameter feature value of each hand region is compared with the sample feature parameter values to obtain the similarity between the hand region and the sample; as long as the similarity is greater than a threshold (for example, a similarity threshold of 95%), the region is judged to be a hand image and is selected (shown in Fig. 4b). When the feature detection unit 220 recognizes a hand image in the image, it passes the hand image to the edge detection unit 230. If multiple hand images are recognized, only the one with the largest area, namely the first hand image 420, is transmitted.
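As a further non-authoritative sketch, OpenCV's Haar cascade detector mirrors the feature detection step described here; note that hand_cascade.xml is a hypothetical, separately trained hand model (OpenCV ships no hand cascade), and the detector parameters are assumptions:

```python
import cv2

# Hypothetical trained hand model; stands in for the patent's
# hand feature parameter model built from Haar feature vectors.
hand_cascade = cv2.CascadeClassifier("hand_cascade.xml")

def find_largest_hand(gray):
    """Step S104 as a sketch: run the Haar detector over the grayscale
    map; if several regions pass the similarity test, keep only the
    largest one, as the description prescribes."""
    hands = hand_cascade.detectMultiScale(gray, scaleFactor=1.1,
                                          minNeighbors=5)
    if len(hands) == 0:
        return None
    x, y, w, h = max(hands, key=lambda r: r[2] * r[3])  # max-area hand
    return gray[y:y + h, x:x + w]
```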
In step S106, the edge detection unit 230 performs an edge detection step to determine the centroid coordinate, fingertip count, and fingertip coordinates of the first hand image.
In this step, please also refer to Fig. 4c. The edge detection unit 230 is electrically connected to the feature detection unit 220 and receives the first hand image from it. The edge detection unit 230 takes the round dots of the maximum convex polygon of the first hand image as convex points 450 and the square dots as concave points 460, and calculates the gap between two concave points 460 and the convex point 450 between them, from which it can judge whether a fingertip is extended or folded and thus obtain the fingertip count and fingertip coordinates. Alternatively, it calculates the distance between a fingertip convex point 450 and a concave point 460 between two fingers, for example the distance from the index fingertip to the valley between the index finger and the middle finger. The edge detection unit 230 passes the fingertip count and fingertip coordinates of the first hand image 420 to the database 240.
In this step, the edge detection unit 230 also determines the maximum convex polygon of the first hand image to calculate the area of the first hand image, and thereby obtains the triangle dot as the centroid coordinate 470. The edge detection unit 230 passes the centroid coordinate 470 of the first hand image to the database 240.
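The convex points, concave points, and centroid of step S106 map naturally onto convex hulls, convexity defects, and image moments in standard computer vision. The sketch below assumes OpenCV 4; the defect-depth threshold that separates extended from folded fingers is an illustrative value, not taken from the patent:

```python
import cv2

def analyze_hand(hand_gray, depth_thresh=10000):
    """Step S106 as a sketch: estimate the centroid coordinate, the
    fingertip count, and the fingertip coordinates from the largest
    contour of the hand image. Returns (centroid, count, fingertips)
    or None if no usable contour is found."""
    _, binary = cv2.threshold(hand_gray, 1, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)

    # Centroid coordinate 470 from the image moments of the region.
    m = cv2.moments(contour)
    if m["m00"] == 0:
        return None
    centroid = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

    # Hull vertices play the role of convex points 450 (fingertip
    # candidates); convexity defects give the concave points 460
    # (valleys between fingers).
    hull_idx = cv2.convexHull(contour, returnPoints=False)
    defects = cv2.convexityDefects(contour, hull_idx)
    fingertips = []
    if defects is not None:
        for start, end, far, depth in defects[:, 0]:
            # A deep valley means the finger beside it is extended, not
            # folded; depth is OpenCV's fixed-point distance (value x256).
            if depth > depth_thresh:
                fingertips.append(tuple(contour[start][0]))
    return centroid, len(fingertips), fingertips
```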
In step S108, an n-th image is provided, and the n-th hand image and the centroid coordinate, fingertip count, and fingertip coordinates of the n-th hand image are determined. In this step, n is an integer of 2 or more. The image capture unit 100 captures the n-th image and passes it to the skin color detection unit 210, as in step S100. The n-th image undergoes the skin color detection step of step S102: once its skin color area is judged greater than the threshold, its grayscale map is delivered to the feature detection unit 220. As in step S104, the feature detection unit uses the Haar algorithm to recognize the n-th hand image in the n-th image and delivers it to the edge detection unit 230. As in step S106, the edge detection unit 230 determines the centroid coordinate, fingertip count, and fingertip coordinates of the n-th hand image and passes them to the database 240.
In step S110, the differences between the centroid coordinates, fingertip counts, and fingertip coordinates of the first hand image and the n-th hand image are judged, and a corresponding action is performed. In this step, the control module 250 is electrically connected to the database 240 and performs the corresponding action according to the signals from the database 240.
For example, in the first operating mode, the control module 250 judges from the difference between the centroid coordinates of the first hand image and the n-th hand image that the hand image has moved in space, and performs the action of the touch point function 251.
In the second operating mode, the control module 250 judges the change of the fingers from the fingertip counts of the first hand image and the n-th hand image, and performs the action of the gesture arbitration function 252.
In the third operating mode, the control module 250 judges the degree of finger bending of the hand image from the difference between the fingertip coordinates of the first hand image and the n-th hand image, and performs the action of the gesture arbitration function 252.
Among the first, second, and third operating modes above, the control module 250 may select any one of them, or use all three operating modes simultaneously in alternation.
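Putting the three operating modes together, the frame-to-frame comparison of step S110 might look like the sketch below; the movement and bending thresholds and the returned action labels are invented for illustration:

```python
def dispatch(first, nth, move_thresh=15, bend_thresh=10):
    """Step S110 as a sketch: compare the first frame's hand data
    against the n-th frame's and pick the corresponding action. Each
    argument is a (centroid, fingertip_count, fingertips) triple as
    returned by analyze_hand above."""
    c1, count1, tips1 = first
    cn, countn, tipsn = nth
    dx, dy = cn[0] - c1[0], cn[1] - c1[1]
    if abs(dx) > move_thresh or abs(dy) > move_thresh:
        # First operating mode: centroid moved -> touch point function.
        return ("touch_point", dx, dy)
    if countn != count1:
        # Second operating mode: fingertip count changed.
        return ("gesture", "finger_count_changed")
    if count1 == countn and tips1 and tipsn:
        # Third operating mode: same count but coordinates shifted
        # beyond a tolerance suggests the finger bending changed.
        moved = any(abs(a[0] - b[0]) + abs(a[1] - b[1]) > bend_thresh
                    for a, b in zip(tips1, tipsn))
        if moved:
            return ("gesture", "finger_bend_changed")
    return ("idle",)
```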
In step S112, the display unit 20 displays, through the user interface 300, the result after the control module 250 performs the action. In this step, the user interface 300 comprises a user-centered interface 320 and a graphical user interface 310, and is electrically connected to the control module 250 and the display unit 20. The user-centered interface 320 is the output interface for the touch point function 251, and the graphical user interface 310 is the output interface for the gesture arbitration function 252. Via the user-centered interface 320 and the graphical user interface 310, the display unit 20 can display the result after the control module 250 performs the action.
For instance, as shown in Fig. 5, the gesture recognition device of the present invention can replace the actions of today's mouse, where the image capture unit may be a web camera 510; the image processing module 520 of the present invention may be composed of a chipset, a processor (such as a CPU or MPU), a control circuit, other auxiliary circuits, operation software, firmware, or related modules, elements, and software; and the display unit may be an ordinary computer screen 530.
When the user 540 is in front of the web camera 510 and moves a hand to the left, the arrow on the computer screen 530 can be seen to move to the left. When the user 540 bends a finger downward, after the signal is processed by the image processing module 520, the item selected by the arrow on the computer screen 530 is clicked.
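Tying the sketches above into the web camera scenario of Fig. 5: device index 0, the Esc key, and printing the action in place of driving the on-screen arrow are all assumptions.

```python
import cv2

# Reuses detect_skin, find_largest_hand, analyze_hand, and dispatch
# from the sketches above.
cap = cv2.VideoCapture(0)  # web camera 510
first = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray, enough_skin = detect_skin(frame)
    if not enough_skin:
        continue                 # back to capture, as in step S1023
    hand = find_largest_hand(gray)
    if hand is None:
        continue
    result = analyze_hand(hand)
    if result is None:
        continue
    if first is None:
        first = result           # first hand image (steps S100-S106)
    else:
        print(dispatch(first, result))  # n-th frame vs. first (S110)
    if cv2.waitKey(1) == 27:     # Esc to quit
        break
cap.release()
```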
The gesture recognition method and device of the present invention use the skin color detection unit to find the skin color area, then use the feature detection unit to locate the hand image in the image, and use the edge detection unit to determine the centroid coordinate, fingertip count, and fingertip coordinates of the hand image. Subsequent tracking of changes in the hand's spatial position, in the number of fingertips, and in finger bending no longer requires scanning and recognizing the entire frame. The image data per frame are therefore small, which speeds up hand image recognition, and the control module then performs the corresponding action according to the detected changes. In use, the user is not confined to a fixed space and can operate and control more freely.
In summary, the above are merely embodiments or implementation examples of the technical means adopted by the present invention to solve its problems, and do not limit the scope of patent implementation of the present invention. All variations and equivalent modifications that are consistent with the scope of the claims of the present application, or made according to the claims of the present invention, are covered by the claims of the present invention.

Claims (13)

1. A gesture recognition method, characterized by comprising the following steps:
providing a first image;
converting the RGB map of the first image to a first grayscale map;
identifying a first hand image in the first image; and
determining at least one of the centroid coordinate, fingertip count, and fingertip coordinates of the first hand image.
2. The gesture recognition method according to claim 1, characterized in that the step of converting the RGB map of the first image to the first grayscale map comprises the following steps:
converting the RGB color model of the first image to the hue/saturation/value color model;
removing the brightness parameter of the first image, then performing skin color tracking with the hue and saturation parameters to determine the skin color area of the first image, and rendering the first image in grayscale to form the first grayscale map; and
judging whether the skin color area of the first image is greater than a threshold.
3. The gesture recognition method according to claim 2, characterized in that the threshold is a predetermined ratio of the whole image range that the skin color area in the first image must occupy.
4. The gesture recognition method according to claim 1, characterized by further comprising the following steps:
providing a second image;
converting the RGB map of the second image to a second grayscale map;
identifying a second hand image in the second image; and
determining at least one of the centroid coordinate, fingertip count, and fingertip coordinates of the second hand image.
5. The gesture recognition method according to claim 4, characterized in that the step of converting the RGB map of the second image to the second grayscale map further comprises the following steps:
converting the RGB color model of the second image to the hue/saturation/value color model;
removing the brightness parameter of the second image, then performing skin color tracking with the hue and saturation parameters to determine the skin color area of the second image, and rendering the second image in grayscale to form the second grayscale map; and
judging whether the skin color area of the second image is greater than a threshold.
6. The gesture recognition method according to claim 4, characterized by further comprising the following step: judging the centroid coordinates of the first hand image and the second hand image, and performing a corresponding action.
7. The gesture recognition method according to claim 4, characterized by further comprising the following step: judging the fingertip counts of the first hand image or the second hand image, and performing a corresponding action.
8. The gesture recognition method according to claim 4, characterized by further comprising the following step: judging the fingertip coordinates of the first hand image and the second hand image, and performing a corresponding action.
9. A gesture recognition device, characterized by comprising:
an image processing module for processing an image, comprising:
a skin color detection unit for judging whether the skin color area in the image is greater than a threshold;
a feature detection unit, electrically connected to the skin color detection unit, for recognizing a hand image in the image; and
an edge detection unit, electrically connected to the feature detection unit, for determining at least one of the centroid coordinate, fingertip count, and fingertip coordinates of the hand image.
10. The gesture recognition device according to claim 9, characterized by further comprising a database, electrically connected to the edge detection unit, for storing at least one of the centroid coordinate, fingertip count, and fingertip coordinates of the hand image.
11. The gesture recognition device according to claim 10, characterized by further comprising a control module, electrically connected to the database, for judging the movement of the hand image in space according to the difference in centroid coordinates.
12. The gesture recognition device according to claim 10, characterized by further comprising a control module, electrically connected to the database, for judging the change of the fingers according to the fingertip count.
13. The gesture recognition device according to claim 10, characterized by further comprising a control module, electrically connected to the database, for judging the degree of finger bending of the hand image according to the difference in fingertip coordinates.
CN2012105506792A 2012-12-18 2012-12-18 Gesture recognition device and gesture recognition method Pending CN103034333A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012105506792A CN103034333A (en) 2012-12-18 2012-12-18 Gesture recognition device and gesture recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2012105506792A CN103034333A (en) 2012-12-18 2012-12-18 Gesture recognition device and gesture recognition method

Publications (1)

Publication Number Publication Date
CN103034333A true CN103034333A (en) 2013-04-10

Family

ID=48021295

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012105506792A Pending CN103034333A (en) 2012-12-18 2012-12-18 Gesture recognition device and gesture recognition method

Country Status (1)

Country Link
CN (1) CN103034333A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105335711A (en) * 2015-10-22 2016-02-17 华南理工大学 Fingertip detection method in complex environment
CN107357428A (en) * 2017-07-07 2017-11-17 京东方科技集团股份有限公司 Man-machine interaction method and device based on gesture identification, system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853071A (en) * 2010-05-13 2010-10-06 重庆大学 Gesture identification method and system based on visual sense
CN102063618A (en) * 2011-01-13 2011-05-18 中科芯集成电路股份有限公司 Dynamic gesture identification method in interactive system
CN102096471A (en) * 2011-02-18 2011-06-15 广东威创视讯科技股份有限公司 Human-computer interaction method based on machine vision
CN102142084A (en) * 2011-05-06 2011-08-03 北京网尚数字电影院线有限公司 Method for gesture recognition
US20110267258A1 (en) * 2010-04-29 2011-11-03 Acer Incorporated Image based motion gesture recognition method and system thereof
CN102368290A (en) * 2011-09-02 2012-03-07 华南理工大学 Hand gesture identification method based on finger advanced characteristic
CN102509088A (en) * 2011-11-28 2012-06-20 Tcl集团股份有限公司 Hand motion detecting method, hand motion detecting device and human-computer interaction system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110267258A1 (en) * 2010-04-29 2011-11-03 Acer Incorporated Image based motion gesture recognition method and system thereof
CN101853071A (en) * 2010-05-13 2010-10-06 重庆大学 Gesture identification method and system based on visual sense
CN102063618A (en) * 2011-01-13 2011-05-18 中科芯集成电路股份有限公司 Dynamic gesture identification method in interactive system
CN102096471A (en) * 2011-02-18 2011-06-15 广东威创视讯科技股份有限公司 Human-computer interaction method based on machine vision
CN102142084A (en) * 2011-05-06 2011-08-03 北京网尚数字电影院线有限公司 Method for gesture recognition
CN102368290A (en) * 2011-09-02 2012-03-07 华南理工大学 Hand gesture identification method based on finger advanced characteristic
CN102509088A (en) * 2011-11-28 2012-06-20 Tcl集团股份有限公司 Hand motion detecting method, hand motion detecting device and human-computer interaction system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Wensheng et al., "Dynamic gesture learning and recognition based on Hermite neural network", Computer Engineering & Science *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105335711A (en) * 2015-10-22 2016-02-17 华南理工大学 Fingertip detection method in complex environment
CN105335711B (en) * 2015-10-22 2019-01-15 华南理工大学 Fingertip Detection under a kind of complex environment
CN107357428A (en) * 2017-07-07 2017-11-17 京东方科技集团股份有限公司 Man-machine interaction method and device based on gesture identification, system

Similar Documents

Publication Publication Date Title
US11775076B2 (en) Motion detecting system having multiple sensors
TWI471815B (en) Gesture recognition device and method
KR100943792B1 (en) A device and a method for identifying movement pattenrs
US9389779B2 (en) Depth-based user interface gesture control
KR101184460B1 (en) Device and method for controlling a mouse pointer
KR102496531B1 (en) Method for providing fingerprint recognition, electronic apparatus and storage medium
US20150070301A1 (en) Methods for controlling a hand-held electronic device and hand-held electronic device utilizing the same
US20160179210A1 (en) Input supporting method and input supporting device
US20150153832A1 (en) Visual feedback by identifying anatomical features of a hand
CN101248410A (en) Device and method for recognizing movement mode
US9218060B2 (en) Virtual mouse driving apparatus and virtual mouse simulation method
CN113253908B (en) Key function execution method, device, equipment and storage medium
CN106569716B (en) Single-hand control method and control system
CN109947243B (en) Intelligent electronic equipment gesture capturing and recognizing technology based on touch hand detection
CN101853076A (en) Method for acquiring input information by input equipment
CN103034333A (en) Gesture recognition device and gesture recognition method
CN104169858A (en) Method and device of using terminal device to identify user gestures
KR101216833B1 (en) system for controling non-contact screen and method for controling non-contact screen in the system
CN109993059B (en) Binocular vision and object recognition technology based on single camera on intelligent electronic equipment
KR102569170B1 (en) Electronic device and method for processing user input based on time of maintaining user input
US11287897B2 (en) Motion detecting system having multiple sensors
Zhang et al. DynaKey: Dynamic Keystroke Tracking Using a Head-Mounted Camera Device
Sathya et al. Systematic Review on on-Air Hand Doodle System for the Purpose of Authentication
Pullan et al. High Resolution Touch Screen Module
Dube et al. Embedded user interface for smart camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20130410