CN102331883B - Identification method for three-dimensional control end point and computer readable medium adopting same - Google Patents


Info

Publication number
CN102331883B
CN102331883B · CN201010225545A
Authority
CN
China
Prior art date: 2010-07-14
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN 201010225545
Other languages
Chinese (zh)
Other versions
CN102331883A (en)
Inventor
廖志彬
黄捷
蔡曜阳
王科翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2010-07-14
Filing date: 2010-07-14
Publication date: 2013-11-06
Application filed by Industrial Technology Research Institute ITRI
Priority to CN 201010225545
Publication of CN102331883A
Application granted
Publication of CN102331883B
Legal status: Active (current)

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an identification method for a three-dimensional control endpoint and a computer-readable medium adopting the identification method. The identification method comprises the following steps: receiving depth information related to an image captured by an image capturing device; generating three-dimensional block information related to three-dimensional blocks according to the depth information; generating a reference plane according to the depth information; generating a connection group according to the three-dimensional block information and the reference plane; and selecting the three-dimensional block in the connection group that is closest to the image capturing device as the control endpoint.

Description

Identification method for a three-dimensional control endpoint and computer-readable medium using the same
Technical field
The present invention relates to an identification method for a control endpoint and a device using the method, and more particularly to an identification method for a three-dimensional control endpoint and a computer-readable medium using the same.
Background
Multi-point touch is a highly convenient function of touch-screen interfaces; its spirit is to exploit natural human motor habits to operate a system, drawing people and computers closer together. A first known technique first defines object features, such as the color, shape, or texture features of a fist, then captures an image and compares the object features against feature blocks in the image to find the control endpoint. A second known technique further uses depth features to filter the background, preventing misjudgments caused by complex backgrounds. A third known technique finds a three-dimensional control region and, within that region, uses depth information and hand features to find the control endpoint nearest the camera.
Summary of the invention
The present invention relates to an identification method for a three-dimensional control endpoint and a computer-readable medium using the same.
According to one aspect of the present invention, an identification method for a three-dimensional control endpoint is provided. The method comprises: receiving depth information related to an image captured by an image capturing device; generating three-dimensional block information related to three-dimensional blocks according to the depth information; generating a reference plane according to the depth information; generating a connection group according to the three-dimensional block information and the reference plane; and selecting the three-dimensional block in the connection group closest to the image capturing device as the control endpoint.
According to another aspect of the present invention, a computer-readable medium is provided. The computer-readable medium stores several program instructions for executing an identification method for a three-dimensional control endpoint, the method comprising: receiving depth information related to an image captured by an image capturing device; generating three-dimensional block information related to three-dimensional blocks according to the depth information; generating a reference plane according to the depth information; generating a connection group according to the three-dimensional block information and the reference plane; and selecting the three-dimensional block in the connection group closest to the image capturing device as the control endpoint.
In order that the foregoing may be better understood, a preferred embodiment is described in detail below with reference to the accompanying drawings:
Description of the drawings
Fig. 1 shows a three-dimensional control endpoint identification system.
Fig. 2 shows a flowchart of an identification method for a three-dimensional control endpoint.
Fig. 3 shows a detailed flowchart of generating the three-dimensional block information.
Fig. 4 shows a schematic diagram of a salient point.
Fig. 5 shows a schematic diagram before noise-block filtering.
Fig. 6 shows a schematic diagram after noise-block filtering.
Fig. 7 shows a schematic diagram of all salient points in an image.
Fig. 8 shows a schematic diagram of all three-dimensional blocks in an image.
Fig. 9 shows a detailed flowchart of generating the reference plane.
Fig. 10 shows a schematic diagram of the spatial distribution statistic.
Fig. 11 shows a detailed flowchart of generating the connection groups.
Fig. 12 shows a first schematic diagram of a three-dimensional block connection result.
Fig. 13 shows a second schematic diagram of a three-dimensional block connection result.
Fig. 14 shows a third schematic diagram of a three-dimensional block connection result.
Fig. 15 shows a schematic diagram of finding the reference point on the link between two three-dimensional blocks.
Fig. 16 shows a schematic diagram of a first type of spatial block configuration.
Fig. 17 shows a schematic diagram of a second type of spatial block configuration.
Fig. 18 shows a schematic diagram of a third type of spatial block configuration.
Fig. 19 shows a schematic diagram of the connection groups.
[Description of main element symbols]
10: three-dimensional control endpoint identification system
21~25, 221~225, 231~233, 241~243: steps
410, 410(1)~410(4): salient points
50: reference plane
60(1), 60(2): connection groups
110: computer
120: computer-readable medium
420, 420(a), 420(b), 420(1)~420(9): three-dimensional blocks
430: reference point
Embodiment
In order to identify the control endpoint correctly, the following embodiment provides an identification method for a three-dimensional control endpoint and a computer-readable medium using the same. The computer-readable medium stores several program instructions for executing the identification method, which comprises: receiving depth information related to an image captured by an image capturing device; generating three-dimensional block information related to three-dimensional blocks according to the depth information; generating a reference plane according to the depth information; generating a connection group according to the three-dimensional block information and the reference plane; and selecting the three-dimensional block in the connection group closest to the image capturing device as the control endpoint.
Identification method for a three-dimensional control endpoint and computer-readable medium
Referring to Fig. 1 and Fig. 2 together, Fig. 1 shows a three-dimensional control endpoint identification system and Fig. 2 shows a flowchart of the identification method for a three-dimensional control endpoint. The three-dimensional control endpoint identification system 10 comprises a computer 110 and a computer-readable medium 120. The computer-readable medium 120 stores several program instructions, which the computer 110 loads in order to execute the identification method. The computer-readable medium 120 is, for example, a floppy disk, an optical disc, a magnetic tape, or a hard disk drive. As shown in step 21, the identification method begins by receiving depth information. The depth information is related to an image captured by an image capturing device, which is, for example, an infrared camera or a pair of cameras.
Then, as shown in step 22, three-dimensional block information related to the three-dimensional blocks is generated according to the depth information. Next, as shown in step 23, a reference plane is generated according to the depth information. Then, as shown in step 24, connection groups are generated according to the three-dimensional block information and the reference plane. Finally, as shown in step 25, the three-dimensional block closest to the image capturing device is selected from each connection group as the control endpoint. The steps of generating the three-dimensional block information, the reference plane, and the connection groups are each described in detail below; a sketch of the overall flow follows this paragraph.
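The following Python sketch summarizes steps 21 to 25 under stated assumptions: steps 22 to 24 are only indicated as comments (their per-step sketches appear in the later subsections), and each group is assumed to map block identifiers to the depth of their representative points. All names and the example values are illustrative, not part of the patent.

```python
# Sketch of the overall flow; steps 22-24 are sketched in later subsections:
#   blocks    = build_3d_blocks(depth_map)          # step 22
#   ref_plane = build_reference_plane(depth_map)    # step 23
#   groups    = build_connection_groups(...)        # step 24

def select_control_endpoints(groups, block_depth):
    """Step 25: given the connection groups and a mapping from block id to the
    depth of its representative point, pick per group the block nearest the
    capture device, i.e. the one with the smallest depth."""
    return [min(group, key=lambda b: block_depth[b]) for group in groups]

# Example: two groups as in Fig. 19; blocks 1 and 4 are nearest the device.
groups = [{1, 2, 3}, {4, 5, 6}]
block_depth = {1: 80, 2: 120, 3: 160, 4: 90, 5: 130, 6: 170}
print(select_control_endpoints(groups, block_depth))   # -> [1, 4]
```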
Generating the three-dimensional block information
Referring to Fig. 3 through Fig. 8 together: Fig. 3 shows a detailed flowchart of generating the three-dimensional block information; Fig. 4 shows a schematic diagram of a salient point; Fig. 5 and Fig. 6 show schematic diagrams before and after noise-block filtering; Fig. 7 shows all salient points in an image; and Fig. 8 shows all three-dimensional blocks in an image. The aforementioned step 22 further comprises steps 221 to 225.
First, as shown in step 221, all salient points 410 in the image captured by the image capturing device are detected along the horizontal and vertical directions according to the depth information. A salient point 410 is a feature pixel that protrudes by at least a certain height relative to its surroundings (as shown in Fig. 4). A detection sketch follows.
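As an illustration, step 221 can be realized as a local-extremum scan over the depth map. The sketch below assumes that a protruding pixel is one nearer to the camera (smaller depth) than all four of its horizontal and vertical neighbours by a preset height; the parameter names and the 4-neighbourhood test are assumptions.

```python
import numpy as np

def detect_salient_points(depth_map, protrusion=30):
    """Return (row, col) pixels that protrude toward the camera by at least
    `protrusion` depth units relative to their 4-neighbourhood (step 221)."""
    h, w = depth_map.shape
    points = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            d = int(depth_map[r, c])
            neighbours = (depth_map[r - 1, c], depth_map[r + 1, c],
                          depth_map[r, c - 1], depth_map[r, c + 1])
            # Keep the pixel if every neighbour is deeper by the preset height.
            if all(int(n) - d >= protrusion for n in neighbours):
                points.append((r, c))
    return points
```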
Then, as shown in step 222, the salient points 410 shown in Fig. 7 are expanded into the three-dimensional blocks 420 shown in Fig. 8 according to the depth differences between each salient point 410 and its neighboring pixels. Note that step 222 checks the depth information of all pixels surrounding a salient point 410: every surrounding pixel whose depth difference falls within a preset range is included in the expansion range, so that the salient point 410 expands into a three-dimensional block 420.
Next, as shown in step 223, noise blocks among the three-dimensional blocks 420 are filtered out according to depth variation. Note that step 223 may be performed concurrently with step 222. For example, when the salient point 410(1) shown in Fig. 5 expands, the expansion reaches another salient point 410(2). At this moment step 223 compares the depth information of salient point 410(2) with that of salient point 410(1). If the depth difference of salient point 410(2) is within the preset range, salient point 410(2) is merged into the expansion range of salient point 410(1) and stripped of its own right to expand (as shown in Fig. 6), so that salient point 410(1) expands into three-dimensional block 420(a). Similarly, if the depth difference of salient point 410(4) is within the preset range, salient point 410(4) is merged into the expansion range of salient point 410(3) and stripped of its right to expand, so that salient point 410(3) expands into three-dimensional block 420(b). This reduces the number of operations in step 222 and accelerates the computation without loss of accuracy.
Then, as shown in step 224, it is determined whether all salient points 410 in the image have been checked; if not, steps 222 and 223 are repeated. Salient points stripped of their right to expand in step 223, such as salient points 410(2) and 410(4) shown in Fig. 5 and Fig. 6, are not processed again. Step 224 repeats the check until every salient point 410 detected in step 221 has been examined. A region-growing sketch of steps 222 to 224 follows.
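The sketch below realizes steps 222 to 224 as a flood fill from each salient point, merging any salient point swallowed along the way, as described above. The tolerance value and the data layout are assumptions.

```python
from collections import deque
import numpy as np

def grow_blocks(depth_map, salient_points, tolerance=15):
    """Expand each salient point into a three-dimensional block by flood fill
    over pixels whose depth difference stays within `tolerance` (steps 222-224)."""
    h, w = depth_map.shape
    salient_set = set(salient_points)
    visited = np.zeros((h, w), dtype=bool)
    absorbed = set()                      # salient points stripped of expansion rights
    blocks = []
    for seed in salient_points:
        if seed in absorbed:
            continue                      # step 223: already merged into an earlier block
        queue, block = deque([seed]), []
        visited[seed] = True
        while queue:
            r, c = queue.popleft()
            block.append((r, c))
            if (r, c) != seed and (r, c) in salient_set:
                absorbed.add((r, c))      # a swallowed point loses its right to expand
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < h and 0 <= nc < w and not visited[nr, nc]
                        and abs(int(depth_map[nr, nc]) - int(depth_map[r, c])) <= tolerance):
                    visited[nr, nc] = True
                    queue.append((nr, nc))
        blocks.append(block)              # step 224 loops until every seed is handled
    return blocks
```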
Then, as shown in step 225, the three-dimensional block information related to the three-dimensional blocks 420 is generated. More specifically, after step 224 has found all three-dimensional blocks 420, step 225 first finds a representative point for each three-dimensional block 420, for example its center of gravity. Once the representative points are determined, the three-dimensional block information related to the representative points is generated.
Generating the reference plane
Referring to Fig. 9 and Fig. 10 together, Fig. 9 shows a detailed flowchart of generating the reference plane, and Fig. 10 shows a schematic diagram of the spatial distribution statistic. The aforementioned step 23 further comprises steps 231 to 233. First, as shown in step 231, a spatial distribution statistic is computed according to the depth information. More specifically, step 231 evaluates the three-dimensional position corresponding to each pixel in the depth information and thereby reconstructs a three-dimensional scene, from which the spatial distribution statistic shown in Fig. 10 is obtained. In Fig. 10, the origin represents the position of the image capturing device, the x axis represents the distance from the image capturing device, and the y axis represents the number of pixels. Since the x axis represents distance from the device, the statistic of Fig. 10 is simply the number of pixels at each depth; a histogram sketch follows.
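The statistic of Fig. 10 reduces to a histogram of pixel counts over depth. The sketch below assumes a fixed bin size and that a zero value marks an invalid depth reading; both are assumptions.

```python
import numpy as np

def depth_histogram(depth_map, bin_size=10):
    """Return (bin_centres, counts): the pixel count in each depth interval,
    with distance from the capture device on the x axis as in Fig. 10 (step 231)."""
    depths = depth_map.ravel().astype(float)
    depths = depths[depths > 0]                  # drop invalid or empty pixels
    edges = np.arange(0, depths.max() + bin_size, bin_size)
    counts, edges = np.histogram(depths, bins=edges)
    centres = (edges[:-1] + edges[1:]) / 2
    return centres, counts
```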
Then, as shown in step 232, the spatial distribution statistic of Fig. 10 is adjusted by interval smoothing and global weighting. More specifically, to remove the influence of small-detail noise in the statistic, step 232 blurs and smooths the original statistic. For example, step 232 may select a depth interval of a certain size, average the pixel counts within it to obtain a representative value for that interval, and then slide the interval and repeat the operation, so that the noise no longer affects subsequent computations. The whole statistic is then re-weighted according to the distance from the camera, achieving the global weight adjustment.
Next, as shown in step 233, noise planes are filtered out to produce the reference plane. More specifically, to find qualified reference planes in the statistic of Fig. 10, step 233 first finds the largest peak y_max in the statistic and derives an evaluation criterion from it. For example, with 30% of y_max as the criterion, a peak whose pixel count does not exceed this reference value cannot be chosen as a suitable reference plane and is filtered out once judged a noise plane. After the filtering operation of step 233, the peak y_1 in Fig. 10 is found to qualify. Step 233 then takes the peaks y_max and y_1 as references and analyzes the spatial pixel map in the x-axis interval between the two peaks, segmenting reference planes for different users at similar depths. A sketch of steps 232 and 233 follows.
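The sketch below combines steps 232 and 233 under stated assumptions: interval smoothing is realized as a moving average, the global weighting is assumed to scale each count by its distance (the text only says the whole statistic is re-weighted by distance from the camera), and peaks below 30% of the largest peak are discarded as noise planes.

```python
import numpy as np

def find_reference_planes(centres, counts, window=5, threshold_ratio=0.3):
    """Return the depths of candidate reference planes (steps 232-233)."""
    kernel = np.ones(window) / window
    smooth = np.convolve(counts, kernel, mode="same")   # interval smoothing (step 232)
    weighted = smooth * centres                         # assumed distance weighting
    peak_value = weighted.max()                         # largest peak y_max (step 233)
    planes = []
    for i in range(1, len(weighted) - 1):
        is_peak = weighted[i] >= weighted[i - 1] and weighted[i] >= weighted[i + 1]
        if is_peak and weighted[i] >= threshold_ratio * peak_value:
            planes.append(centres[i])   # qualifying peak, e.g. y_1 in Fig. 10
    return planes                       # peaks below the criterion are noise planes
```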
Generating the connection groups
Referring to Fig. 11 through Fig. 19 together: Fig. 11 shows a detailed flowchart of generating the connection groups; Fig. 12 through Fig. 14 show first, second, and third schematic diagrams of three-dimensional block connection results; Fig. 15 shows finding the reference point on the link between two three-dimensional blocks; Fig. 16 through Fig. 18 show first, second, and third types of spatial block configurations; and Fig. 19 shows the connection groups. After the three-dimensional block information and the reference plane have been generated, step 24 analyzes the connectivity between the three-dimensional blocks according to the block information and the reference plane, and generates the connection groups according to that connectivity. Step 24 further comprises steps 241 to 243.
First, as shown in step 241, three-dimensional blocks that are close to each other are connected according to the distances between them; the distance between blocks can be computed as the Euclidean distance. In general, the correct connections of three-dimensional blocks 420(1) to 420(3) should run toward the reference plane 50, as shown in Fig. 12, and so should the correct connections of blocks 420(4) to 420(6). However, if whether two blocks are connected is judged purely by Euclidean distance, connection errors such as those shown in Fig. 13 or Fig. 14 can occur. To avoid incorrect links, after step 241 uses the Euclidean distance to find the closest blocks, the subsequent step 242 must check whether each connection between blocks is correct. If a connection fails the check of step 242, the method returns to step 241, where the next-closest block is found and checked again by step 242.
Then, as shown in step 242, the lowest (deepest) point on the link between two connected three-dimensional blocks is found and taken as the reference point 430, and the reference point 430 is used to judge whether the link satisfies a preset connection condition; if it does not, step 241 is repeated. More specifically, as shown in Fig. 15, the depths of all points on the line between three-dimensional blocks 420(7) and 420(8) are first computed, the deepest of these points is found, and that point is defined as the reference point 430. When the link between blocks 420(7) and 420(8) is as shown in Fig. 16, there is no comparatively low reference point 430 on the link. Such a link runs along the hand toward the reference plane, so the link in Fig. 16 satisfies the preset connection condition.
Moreover, when the link is as shown in Fig. 17, the reference point 430 is lower than blocks 420(7) and 420(8), but its distance from both blocks remains within a certain range. Considering the link as a whole, it still runs toward the reference plane, so the link between blocks 420(7) and 420(8) in Fig. 17 satisfies the preset connection condition.
In addition, when the link is as shown in Fig. 18, the reference point 430 lies far below blocks 420(7) and 420(8). This occurs, for example, when blocks 420(7) and 420(8) are the front ends of two different hands and the reference point 430 lies on the reference plane. In this case, no matter how close blocks 420(7) and 420(8) are to each other, the connection between them does not satisfy the preset connection condition, so they are not connected. The method returns to step 241 to find the next-closest block.
Likewise, no matter how close two people stand, as long as a sunken reference point 430 exists on the line between the three-dimensional blocks of their hand endpoints, the blocks are not connected. This overcomes the problem of control points interfering with one another when several people operate at once, and ensures that the connections between three-dimensional blocks satisfy the preset connection condition. A sketch of the link check follows.
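The sketch below illustrates the check of step 242, assuming depth increases away from the camera so that the reference point of Fig. 15 is the maximum depth along the line between the two representative pixels. The `sink_limit` threshold separating the acceptable cases of Figs. 16 and 17 from the rejected case of Fig. 18 is an assumption.

```python
import numpy as np

def link_is_valid(depth_map, a, b, sink_limit=40):
    """a, b: (row, col) representative pixels of two connected blocks (step 242)."""
    # Sample every pixel on the straight line between a and b.
    n = max(abs(b[0] - a[0]), abs(b[1] - a[1])) + 1
    rows = np.linspace(a[0], b[0], n).round().astype(int)
    cols = np.linspace(a[1], b[1], n).round().astype(int)
    line_depths = depth_map[rows, cols].astype(float)
    reference_depth = line_depths.max()              # reference point 430: deepest point
    block_depth = max(float(depth_map[a]), float(depth_map[b]))
    # Figs. 16-17: the reference point stays within range of the blocks -> accept.
    # Fig. 18: the reference point sinks far below both blocks -> reject.
    return reference_depth - block_depth <= sink_limit
```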
Next, as shown in step 243, the three-dimensional blocks that are connected to each other and linked back to the reference plane are gathered to produce the connection groups. More specifically, as shown in Fig. 19, if three-dimensional blocks 420(1) and 420(4) can be correctly linked back to the reference plane 50, they are reasonable control endpoints. Conversely, if three-dimensional block 420(9) cannot be correctly linked back to the reference plane 50, it cannot become a reasonable control endpoint and is merely noise in the space. Accordingly, step 243 gathers the interconnected three-dimensional blocks 420(1) to 420(3) that link back to the reference plane into connection group 60(1), and gathers the interconnected three-dimensional blocks 420(4) to 420(6) that link back to the reference plane into connection group 60(2). Afterwards, step 25 selects from connection groups 60(1) and 60(2) the three-dimensional blocks 420(1) and 420(4) closest to the image capturing device as the control endpoints. A grouping sketch follows.
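The sketch below realizes step 243 under stated assumptions: validated links are treated as edges of a graph, connected components are collected by depth-first search, and only components containing a block linked to the reference plane survive as connection groups. The data structures are assumptions.

```python
def build_connection_groups(block_ids, valid_links, plane_linked):
    """block_ids: block identifiers; valid_links: (i, j) pairs that passed the
    step-242 check; plane_linked: ids of blocks directly linked to the
    reference plane. Returns the connection groups of step 243."""
    adjacency = {b: set() for b in block_ids}
    for i, j in valid_links:
        adjacency[i].add(j)
        adjacency[j].add(i)
    groups, seen = [], set()
    for b in block_ids:
        if b in seen:
            continue
        stack, component = [b], set()            # depth-first component walk
        while stack:
            node = stack.pop()
            if node not in component:
                component.add(node)
                stack.extend(adjacency[node] - component)
        seen |= component
        if component & set(plane_linked):        # must be linked back to the plane
            groups.append(component)             # e.g. groups 60(1) and 60(2)
        # Otherwise the component, like block 420(9), is noise in the space.
    return groups
```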
The identification method for a three-dimensional control endpoint and the computer-readable medium disclosed in the above embodiment of the present invention have numerous advantages, some of which are described below:
1. The control endpoint can be detected correctly even against a complex background.
2. The user's operating position and distance need not be restricted.
3. The most plausible control endpoints can be found from the overall shape of the human body, reducing to a minimum the mutual interference that arises when several people operate at once.
In summary, although the present invention has been disclosed above by way of a preferred embodiment, the embodiment is not intended to limit the invention. Those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention. The scope of protection of the present invention is therefore defined by the appended claims.

Claims (10)

1. An identification method for a three-dimensional control endpoint, comprising:
receiving depth information, the depth information being related to an image captured by an image capturing device;
generating a plurality of pieces of three-dimensional block information related to a plurality of three-dimensional blocks according to the depth information;
generating at least one reference plane according to the depth information;
generating at least one connection group according to the pieces of three-dimensional block information and the reference plane; and
selecting the three-dimensional block in the connection group closest to the image capturing device as a control endpoint.
2. The identification method for a three-dimensional control endpoint as claimed in claim 1, wherein the step of generating the pieces of three-dimensional block information comprises:
detecting a plurality of salient points according to the depth information;
expanding the salient points into the three-dimensional blocks according to depth differences between the salient points and neighboring pixels of the salient points; and
generating the pieces of three-dimensional block information related to the three-dimensional blocks.
3. The identification method for a three-dimensional control endpoint as claimed in claim 2, wherein the step of generating the pieces of three-dimensional block information further comprises:
filtering noise blocks out of the three-dimensional blocks according to depth variation.
4. The identification method for a three-dimensional control endpoint as claimed in claim 2, wherein the step of generating the pieces of three-dimensional block information further comprises:
determining whether all the salient points have been checked, and if not, repeating the step of expanding the salient points into the three-dimensional blocks.
5. The identification method for a three-dimensional control endpoint as claimed in claim 2, wherein the step of generating the pieces of three-dimensional block information further comprises:
finding a plurality of representative points respectively in the three-dimensional blocks; and
generating the pieces of three-dimensional block information related to the representative points.
6. The identification method for a three-dimensional control endpoint as claimed in claim 5, wherein the representative points are respectively the centers of gravity of the three-dimensional blocks.
7. The identification method for a three-dimensional control endpoint as claimed in claim 1, wherein the step of generating the at least one reference plane comprises:
computing a spatial distribution statistic according to the depth information;
adjusting the spatial distribution statistic by interval smoothing and global weighting; and
filtering out a noise plane to produce the reference plane.
8. The identification method for a three-dimensional control endpoint as claimed in claim 7, wherein the spatial distribution statistic is the number of pixels corresponding to each depth.
9. The identification method for a three-dimensional control endpoint as claimed in claim 7, wherein in the step of producing the reference plane, the noise plane is filtered out according to the largest peak of the spatial distribution statistic.
10. The identification method for a three-dimensional control endpoint as claimed in claim 1, wherein the step of generating the at least one connection group comprises:
connecting three-dimensional blocks that are close to each other according to the distances between the three-dimensional blocks;
finding the deepest point on the link between two of the connected three-dimensional blocks as a reference point, and judging according to the reference point whether the connection satisfies a preset connection condition, the connecting step being repeated if the connection does not satisfy the preset connection condition; and
gathering the three-dimensional blocks that are connected to each other and linked back to the reference plane to produce the connection group,
wherein the connecting step first connects the two closest three-dimensional blocks, and if the connection of the two closest three-dimensional blocks does not satisfy the preset connection condition, selects the next-closest three-dimensional blocks.
CN102331883B (en): Application CN 201010225545, priority date 2010-07-14, filing date 2010-07-14, status Active. Title: Identification method for three-dimensional control end point and computer readable medium adopting same.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN 201010225545 (CN102331883B, en) | 2010-07-14 | 2010-07-14 | Identification method for three-dimensional control end point and computer readable medium adopting same

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN 201010225545 (CN102331883B, en) | 2010-07-14 | 2010-07-14 | Identification method for three-dimensional control end point and computer readable medium adopting same

Publications (2)

Publication Number | Publication Date
CN102331883A (en) | 2012-01-25
CN102331883B | 2013-11-06

Family

ID=45483680

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN 201010225545 (CN102331883B, en, Active) | Identification method for three-dimensional control end point and computer readable medium adopting same | 2010-07-14 | 2010-07-14

Country Status (1)

Country Link
CN (1) CN102331883B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN112000824A * | 2019-05-27 | 2020-11-27 | Inventec Technology Co., Ltd. | Object identification system and method thereof
CN111127633A * | 2019-12-20 | 2020-05-08 | Alipay (Hangzhou) Information Technology Co., Ltd. | Three-dimensional reconstruction method, apparatus, and computer-readable medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5734743A * | 1994-07-12 | 1998-03-31 | Canon Kabushiki Kaisha | Image processing method and apparatus for block-based corresponding point extraction
CN101689299A * | 2007-06-20 | 2010-03-31 | Thomson Licensing | System and method for stereo matching of images

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
EP2041983B1 * | 2006-07-17 | 2010-12-15 | Thomson Licensing | Method and apparatus for encoding video color enhancement data, and method and apparatus for decoding video color enhancement data

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5734743A * | 1994-07-12 | 1998-03-31 | Canon Kabushiki Kaisha | Image processing method and apparatus for block-based corresponding point extraction
CN101689299A * | 2007-06-20 | 2010-03-31 | Thomson Licensing | System and method for stereo matching of images

Also Published As

Publication number Publication date
CN102331883A (en) 2012-01-25


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant