CN102331883A - Identification method for three-dimensional control end point and computer readable medium adopting same - Google Patents


Info

Publication number
CN102331883A
Authority
CN
China
Prior art keywords
dimensional
control end
block
end points
produces
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2010102255454A
Other languages
Chinese (zh)
Other versions
CN102331883B (en)
Inventor
廖志彬
黄捷
蔡曜阳
王科翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Priority to CN 201010225545 priority Critical patent/CN102331883B/en
Publication of CN102331883A publication Critical patent/CN102331883A/en
Application granted granted Critical
Publication of CN102331883B publication Critical patent/CN102331883B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention relates to an identification method for a three-dimensional control endpoint and a computer-readable medium adopting the identification method. The identification method comprises the following steps: receiving depth information related to an image captured by an image capturing device; generating three-dimensional block information related to three-dimensional blocks according to the depth information; generating a reference plane according to the depth information; generating a connection group according to the three-dimensional block information and the reference plane; and selecting the three-dimensional block in the connection group that is closest to the image capturing device as the control endpoint.

Description

Identification method for a three-dimensional control endpoint and computer-readable medium using the same
Technical field
The present invention relates to a method for identifying a control endpoint and a device using the method, and more particularly to a method for identifying a three-dimensional control endpoint and a computer-readable medium using the same.
Background technology
Multi-touch is a very convenient function of touch-screen interfaces; its spirit is to operate a system through natural human interaction habits, drawing people and computers closer together. A first known technique first defines object features, such as the color, shape, or texture features of a fist, then captures an image, and finally compares the feature blocks in the image against the object features to find the control endpoint. A second known technique further uses depth features to filter out the background, preventing misjudgments caused by a complex background. A third known technique finds a three-dimensional control region and, within that region, uses depth information and hand features to find the control endpoint closest to the camera.
Summary of the invention
The present invention relates to a method for identifying a three-dimensional control endpoint and a computer-readable medium using the same.
According to an aspect of the present invention, a method for identifying a three-dimensional control endpoint is provided. The method comprises: receiving depth information related to an image captured by an image capturing device; generating three-dimensional block information related to three-dimensional blocks according to the depth information; generating a reference plane according to the depth information; generating a connection group according to the three-dimensional block information and the reference plane; and selecting the three-dimensional block in the connection group that is closest to the image capturing device as the control endpoint.
According to a further aspect of the invention, a computer-readable medium is provided. The computer-readable medium stores several program instructions for performing a method for identifying a three-dimensional control endpoint, the method comprising: receiving depth information related to an image captured by an image capturing device; generating three-dimensional block information related to three-dimensional blocks according to the depth information; generating a reference plane according to the depth information; generating a connection group according to the three-dimensional block information and the reference plane; and selecting the three-dimensional block in the connection group that is closest to the image capturing device as the control endpoint.
To make the above content of the present invention more readily understandable, a preferred embodiment is described in detail below in conjunction with the accompanying drawings.
Description of drawings
Fig. 1 illustrates an identification system for a three-dimensional control endpoint.
Fig. 2 illustrates a flowchart of a method for identifying a three-dimensional control endpoint.
Fig. 3 illustrates a detailed flowchart of generating the three-dimensional block information.
Fig. 4 illustrates a schematic diagram of a salient point.
Fig. 5 illustrates a schematic diagram before noise blocks are filtered.
Fig. 6 illustrates a schematic diagram after noise blocks are filtered.
Fig. 7 illustrates a schematic diagram of all salient points in an image.
Fig. 8 illustrates a schematic diagram of all three-dimensional blocks in an image.
Fig. 9 illustrates a detailed flowchart of generating the reference plane.
Figure 10 illustrates a schematic diagram of the spatial distribution statistics.
Figure 11 illustrates a detailed flowchart of generating the connection groups.
Figure 12 illustrates a schematic diagram of a first kind of three-dimensional block connection result.
Figure 13 illustrates a schematic diagram of a second kind of three-dimensional block connection result.
Figure 14 illustrates a schematic diagram of a third kind of three-dimensional block connection result.
Figure 15 illustrates a schematic diagram of finding the reference point on the link between two three-dimensional blocks.
Figure 16 illustrates a schematic diagram of a first kind of spatial block type.
Figure 17 illustrates a schematic diagram of a second kind of spatial block type.
Figure 18 illustrates a schematic diagram of a third kind of spatial block type.
Figure 19 illustrates a schematic diagram of the connection groups.
[Description of main element symbols]
10: identification system for a three-dimensional control endpoint
21~25, 221~225, 231~233, 241~243: steps
410, 410 (1)~410 (4): salient points
50: reference plane
60 (1), 60 (2): connection groups
110: computer
120: computer-readable medium
420, 420 (a), 420 (b), 420 (1)~420 (9): three-dimensional blocks
430: reference point
Embodiment
In order to correctly identify the control endpoint, the following embodiment provides a method for identifying a three-dimensional control endpoint and a computer-readable medium using the same. The computer-readable medium stores several program instructions for performing the method, which comprises: receiving depth information related to an image captured by an image capturing device; generating three-dimensional block information related to three-dimensional blocks according to the depth information; generating a reference plane according to the depth information; generating a connection group according to the three-dimensional block information and the reference plane; and selecting the three-dimensional block in the connection group that is closest to the image capturing device as the control endpoint.
Identification method for a three-dimensional control endpoint and computer-readable medium
Please refer to Fig. 1 and Fig. 2 together. Fig. 1 illustrates an identification system for a three-dimensional control endpoint, and Fig. 2 illustrates a flowchart of a method for identifying a three-dimensional control endpoint. The identification system 10 comprises a computer 110 and a computer-readable medium 120. The computer-readable medium 120 stores several program instructions to be loaded by the computer 110 to perform the identification method; it is, for example, a floppy disk, an optical disc, a magnetic tape, or a hard disk drive. The identification method comprises the following steps. First, as shown in step 21, depth information is received. The depth information is related to an image captured by an image capturing device, which is, for example, an infrared camera or a pair of cameras.
Then, as shown in step 22, three-dimensional block information related to three-dimensional blocks is generated according to the depth information. Next, as shown in step 23, a reference plane is generated according to the depth information. Then, as shown in step 24, connection groups are generated according to the three-dimensional block information and the reference plane. Finally, as shown in step 25, the three-dimensional block closest to the image capturing device is selected from a connection group as the control endpoint. The steps of generating the three-dimensional block information, the reference plane, and the connection groups are further described below.
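As a rough illustration of steps 21 to 25, the overall flow can be sketched in Python. This is a minimal sketch, not the patent's actual implementation; the helper callables and the block representation as `(x, y, depth)` tuples are assumptions for illustration.

```python
# Hypothetical sketch of the identification flow (steps 21-25).
def identify_control_endpoints(depth_map, find_blocks, find_plane, build_groups):
    """find_blocks(depth_map)       -> list of (x, y, depth) block representatives
    find_plane(depth_map)           -> reference plane descriptor
    build_groups(blocks, plane)     -> list of connection groups (lists of blocks)
    """
    blocks = find_blocks(depth_map)       # step 22: 3D block information
    plane = find_plane(depth_map)         # step 23: reference plane
    groups = build_groups(blocks, plane)  # step 24: connection groups
    # Step 25: in each group, the block with the smallest depth
    # (nearest the image capturing device) is the control endpoint.
    return [min(g, key=lambda b: b[2]) for g in groups]
```

With stub helpers, the function simply picks the nearest block of each group.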
Generating the three-dimensional block information
Please refer to Fig. 3 to Fig. 8 together. Fig. 3 illustrates a detailed flowchart of generating the three-dimensional block information; Fig. 4 illustrates a schematic diagram of a salient point; Fig. 5 and Fig. 6 illustrate schematic diagrams before and after noise blocks are filtered; Fig. 7 illustrates a schematic diagram of all salient points in an image; and Fig. 8 illustrates a schematic diagram of all three-dimensional blocks in an image. The aforementioned step 22 further comprises steps 221 to 225.
First, as shown in step 221, all salient points 410 in the image captured by the image capturing device are detected along the horizontal and vertical directions according to the depth information. A salient point 410 is a feature pixel in the image that protrudes by more than a certain height relative to its surroundings (as illustrated in Fig. 4).
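A minimal sketch of step 221, assuming a 2D depth map where a smaller depth value means closer to the camera, and an illustrative `protrusion` threshold; the patent does not fix these details.

```python
def detect_salient_points(depth, protrusion=10):
    """Find pixels that protrude toward the camera relative to their
    horizontal and vertical neighbours (smaller depth = closer).
    `depth` is a 2D list; `protrusion` is an assumed height threshold."""
    h, w = len(depth), len(depth[0])
    points = []
    for y in range(h):
        for x in range(w):
            d = depth[y][x]
            nbrs = []
            if x > 0:
                nbrs.append(depth[y][x - 1])
            if x < w - 1:
                nbrs.append(depth[y][x + 1])
            if y > 0:
                nbrs.append(depth[y - 1][x])
            if y < h - 1:
                nbrs.append(depth[y + 1][x])
            # Salient: closer to the camera than every scanned neighbour.
            if all(n - d >= protrusion for n in nbrs):
                points.append((x, y))
    return points
```

For a fingertip in front of a flat background, only the protruding pixel qualifies.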
Then, as shown in step 222, the salient points 410 illustrated in Fig. 7 are expanded into the three-dimensional blocks 420 illustrated in Fig. 8 according to the depth differences between each salient point 410 and its neighboring pixels. Specifically, step 222 inspects the depth information of all pixels surrounding a salient point 410. When the depth difference of a surrounding pixel is within a preset range, that pixel is included in the expansion range, so that the salient point 410 expands into a three-dimensional block 420.
Next, as shown in step 223, noise blocks among the three-dimensional blocks 420 are filtered out according to depth variation. Note that step 223 can also be performed simultaneously with step 222. For example, when the salient point 410 (1) illustrated in Fig. 5 expands, the expansion touches another salient point 410 (2). At this moment, step 223 compares the depth information of salient point 410 (2) with that of salient point 410 (1). When the depth difference of salient point 410 (2) is within a preset range, salient point 410 (2) is merged into the expansion range of salient point 410 (1) and loses its own right to expand, and salient point 410 (1) expands into the three-dimensional block 420 (a) illustrated in Fig. 6. Similarly, when the depth difference of salient point 410 (4) is within a preset range, salient point 410 (4) is merged into the expansion range of salient point 410 (3) and loses its right to expand, and salient point 410 (3) expands into the three-dimensional block 420 (b). In this way, the number of operations in step 222 is reduced, achieving acceleration while keeping accuracy unchanged.
Then, as shown in step 224, it is determined whether all salient points 410 in the image have been checked; if not, steps 222 and 223 are repeated. A salient point that was stripped of its right to expand in step 223, such as salient points 410 (2) and 410 (4) illustrated in Fig. 5 and Fig. 6, is not included in further computation. Step 224 repeats the check until all salient points 410 detected in step 221 have been examined.
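Steps 222 to 224 can be sketched as a flood fill from each salient point. This is an illustrative region-growing interpretation under an assumed `max_diff` depth tolerance; absorbing a touched seed and revoking its right to expand models step 223.

```python
from collections import deque

def grow_blocks(depth, seeds, max_diff=5):
    """Expand each salient point into a 3D block (steps 222-224 sketch).
    A neighbour joins the block when its depth differs from the current
    pixel by at most `max_diff`; a seed absorbed this way loses its own
    right to expand. Returns a list of blocks (lists of (x, y) pixels)."""
    h, w = len(depth), len(depth[0])
    seed_set = set(seeds)
    absorbed = set()
    owner = {}  # pixel -> block index, prevents double inclusion
    blocks = []
    for seed in seeds:
        if seed in absorbed or seed in owner:
            continue
        block = []
        q = deque([seed])
        owner[seed] = len(blocks)
        while q:
            x, y = q.popleft()
            block.append((x, y))
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                if 0 <= nx < w and 0 <= ny < h and (nx, ny) not in owner:
                    if abs(depth[ny][nx] - depth[y][x]) <= max_diff:
                        if (nx, ny) in seed_set:
                            absorbed.add((nx, ny))  # step 223: revoke expansion
                        owner[(nx, ny)] = len(blocks)
                        q.append((nx, ny))
        blocks.append(block)
    return blocks
```

Two seeds on the same flat region merge into a single block, so the second seed is never expanded on its own.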
Then, as shown in step 225, the three-dimensional block information related to the three-dimensional blocks 420 is generated. Specifically, after all three-dimensional blocks 420 have been found in step 224, step 225 first finds a representative point for each three-dimensional block 420, for example its center of gravity. Once the representative point of each three-dimensional block 420 has been determined, the three-dimensional block information related to the representative points is generated.
Generating the reference plane
Please refer to Fig. 9 and Figure 10 together. Fig. 9 illustrates a detailed flowchart of generating the reference plane, and Figure 10 illustrates a schematic diagram of the spatial distribution statistics. The aforementioned step 23 further comprises steps 231 to 233. First, as shown in step 231, spatial distribution statistics are computed according to the depth information. Specifically, step 231 evaluates the three-dimensional position corresponding to each pixel in the depth information and thereby reconstructs a three-dimensional scene, from which the spatial distribution statistics illustrated in Figure 10 are obtained. In Figure 10, the origin represents the position of the image capturing device, the x axis represents the distance from the image capturing device, and the y axis represents the number of pixels. Since the x axis represents distance from the image capturing device, the spatial distribution statistics of Figure 10 represent the number of pixels at each depth.
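The spatial distribution statistic of step 231 is essentially a depth histogram. A minimal sketch, with an assumed `bin_size` for the depth intervals:

```python
def depth_histogram(depth, bin_size=10):
    """Count pixels per depth interval (step 231 sketch): the x axis is
    distance from the image capturing device, the y axis a pixel count."""
    hist = {}
    for row in depth:
        for d in row:
            b = d // bin_size
            hist[b] = hist.get(b, 0) + 1
    return hist
```

Peaks of this histogram correspond to large surfaces (such as a user's body) at a common distance from the camera.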
Then, as shown in step 232, the spatial distribution statistics illustrated in Figure 10 are smoothed over intervals and adjusted by global weighting. Specifically, to remove the influence of small details and noise in the statistics, step 232 applies a blurring technique that smooths the original distribution. For example, step 232 can select a depth section of a certain range, average the pixel counts corresponding to that section, and take the average as the representative value of the section. The depth section is then shifted progressively and the computation repeated, so that small noise does not affect subsequent computation. Next, the entire spatial distribution statistics are weighted according to the distance from the camera, achieving the global weight adjustment.
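The interval smoothing of step 232 can be read as a moving average over the histogram counts. A minimal sketch with an assumed window width (the patent leaves the section size open):

```python
def smooth(counts, window=3):
    """Moving-average smoothing of the depth distribution (step 232 sketch):
    average each depth section with its neighbours to suppress small noise."""
    n = len(counts)
    half = window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(counts[lo:hi]) / (hi - lo))
    return out
```

Isolated one-bin spikes are spread out, so they no longer dominate the later peak analysis.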
Next, as shown in step 233, noise planes are filtered out to produce the reference plane. Specifically, in order to find a qualified reference plane in the spatial distribution statistics illustrated in Figure 10, step 233 first finds the largest peak y_max in the statistics and derives an evaluation criterion from it. For example, with 30% of y_max as the threshold, any peak whose pixel count does not exceed this reference value is not chosen as a suitable reference plane, and is judged to be a noise plane and filtered out. After the filtering operation of step 233, the peak y_1 in the spatial distribution statistics of Figure 10 is found to qualify. Step 233 then takes the peaks y_max and y_1 as references and analyzes the spatial pixel map along the x axis between the peak regions, segmenting out reference planes for different users at similar depths.
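The peak filtering of step 233 can be sketched as follows, using the 30% threshold given as an example in the text; the simple local-maximum test is an assumption for illustration.

```python
def candidate_planes(counts, ratio=0.3):
    """Keep local peaks whose pixel count reaches `ratio` of the largest
    peak y_max (step 233 sketch); weaker peaks are noise planes."""
    peaks = [i for i in range(1, len(counts) - 1)
             if counts[i] >= counts[i - 1] and counts[i] >= counts[i + 1]
             and counts[i] > 0]
    if not peaks:
        return []
    y_max = max(counts[i] for i in peaks)
    return [i for i in peaks if counts[i] >= ratio * y_max]
```

A small peak between two strong ones (e.g. scattered noise pixels between two users) falls below the threshold and is discarded.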
Generating the connection groups
Please refer to Figure 11 to Figure 19 together. Figure 11 illustrates a detailed flowchart of generating the connection groups; Figures 12 to 14 illustrate schematic diagrams of a first, second, and third kind of three-dimensional block connection result; Figure 15 illustrates a schematic diagram of finding the reference point on the link between two three-dimensional blocks; Figures 16 to 18 illustrate schematic diagrams of a first, second, and third kind of spatial block type; and Figure 19 illustrates a schematic diagram of the connection groups. After the three-dimensional block information and the reference plane have been generated, step 24 analyzes the connectivity between the three-dimensional blocks according to the block information and the reference plane, and generates the connection groups according to that connectivity. Step 24 further comprises steps 241 to 243.
First, as shown in step 241, three-dimensional blocks that are close to each other are connected according to the distances between them; the distance between two blocks can be computed as the Euclidean distance. In general, the correct connection for three-dimensional blocks 420 (1) to 420 (3) is toward the reference plane 50 as illustrated in Figure 12, and likewise for three-dimensional blocks 420 (4) to 420 (6). However, if whether two blocks should be connected is judged purely by Euclidean distance, the connection errors illustrated in Figure 13 or Figure 14 may occur. To avoid incorrect links, after step 241 finds the closest three-dimensional block by Euclidean distance, the subsequent step 242 must check whether the connection between the blocks is correct. If the connection fails the check of step 242, the method returns to step 241, finds the next-closest three-dimensional block, and checks again through step 242.
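The Euclidean-distance selection of step 241 is a nearest-neighbour search over block representative points. A minimal sketch, assuming blocks are represented by `(x, y, depth)` tuples:

```python
import math

def nearest_block(block, others):
    """Step 241 sketch: pick the candidate block closest to `block` by
    Euclidean distance between block representative points (x, y, depth)."""
    return min(others, key=lambda o: math.dist(block, o))
```

If the nearest candidate later fails the step 242 check, the caller removes it from `others` and calls `nearest_block` again to get the next-closest block.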
Then, as shown in step 242, the deepest point on the link between the two connected three-dimensional blocks is found and taken as the reference point 430, and whether the link meets a preset connection condition is determined according to the reference point 430; if the link does not meet the condition, step 241 is repeated. Specifically, as illustrated in Figure 15, the depths of all points on the line between three-dimensional blocks 420 (7) and 420 (8) are computed first, the deepest point among them is found, and that point is defined as the reference point 430. When the relation between the deepest point and the reference point 430 is as illustrated in Figure 16, there is no relatively low reference point 430 on the link between blocks 420 (7) and 420 (8). Such a connection runs along the hand toward the reference plane, so the link between blocks 420 (7) and 420 (8) in Figure 16 meets the preset connection condition.
In addition, when the relation between the deepest point and the reference point 430 is as illustrated in Figure 17, the reference point 430 is lower than three-dimensional blocks 420 (7) and 420 (8), but its distance from the two blocks is within a certain range. Viewed as a whole, the link illustrated in Figure 17 still runs toward the reference plane, so the link between blocks 420 (7) and 420 (8) in Figure 17 meets the preset connection condition.
In contrast, when the relation between the deepest point and the reference point 430 is as illustrated in Figure 18, the reference point 430 is far below three-dimensional blocks 420 (7) and 420 (8). This occurs, for example, when blocks 420 (7) and 420 (8) are the front ends of two hands and the reference point 430 lies on the reference plane. In this case, no matter how close blocks 420 (7) and 420 (8) are to each other, the connection between them does not meet the preset connection condition, so they cannot be connected. The method returns to step 241 to find the next-closest three-dimensional block.
By the same reasoning, no matter how close two people stand, as long as there is a sunken reference point 430 on the line between the three-dimensional blocks of their hand endpoints, the blocks are not connected. This overcomes the problem of multiple points interfering with each other when several people operate at the same time, and ensures that the connections between blocks meet the preset connection condition.
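The reference-point test of step 242 can be sketched as follows. Larger depth values are taken to mean farther from the camera (closer to the reference plane), and `tolerance` is an assumed threshold for the "certain range" of Figure 17.

```python
def link_is_valid(depths_on_line, end_a, end_b, tolerance=20):
    """Step 242 sketch: the reference point is the deepest point on the
    line between two blocks. If it lies far below both endpoints (the
    line dips toward the reference plane, as in Figure 18), the link is
    rejected -- e.g. the fingertips of two different users."""
    ref = max(depths_on_line)  # deepest point on the connecting line
    return ref - max(end_a, end_b) <= tolerance
```

A link along one hand (shallow dip) passes; a link whose midpoint falls onto the reference plane fails.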
Next, as shown in step 243, the three-dimensional blocks that are connected to each other and linked back to the reference plane are gathered to produce the connection groups. Specifically, as illustrated in Figure 19, if three-dimensional blocks 420 (1) and 420 (4) can be correctly linked back to the reference plane 50, they are reasonable control endpoints. Conversely, if three-dimensional block 420 (9) cannot be correctly linked back to the reference plane 50, it cannot become a reasonable control endpoint and is noise in the space. Accordingly, step 243 gathers the mutually connected blocks 420 (1) to 420 (3) that link back to the reference plane into connection group 60 (1), and the mutually connected blocks 420 (4) to 420 (6) into connection group 60 (2). Afterwards, step 25 selects the three-dimensional blocks 420 (1) and 420 (4), which are closest to the image capturing device in connection groups 60 (1) and 60 (2), as the control endpoints.
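The gathering of step 243 amounts to merging validated links into connected components and keeping only components that reach the reference plane. A minimal union-find sketch; block ids and the `reaches_plane` set are illustrative assumptions:

```python
def build_groups(links, reaches_plane):
    """Step 243 sketch: merge mutually linked blocks into connection
    groups and keep only groups linked back to the reference plane.
    `links` is a list of (block_id, block_id) pairs that passed step 242;
    `reaches_plane` is the set of block ids touching the reference plane."""
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for a, b in links:
        parent[find(a)] = find(b)
    groups = {}
    for node in parent:
        groups.setdefault(find(node), []).append(node)
    # A group is valid only if some member reaches the reference plane;
    # blocks that never link back (like 420 (9)) are spatial noise.
    return [sorted(g) for g in groups.values()
            if any(n in reaches_plane for n in g)]
```

A chain of hand blocks ending at the reference plane forms one group; an isolated noise block is dropped.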
The method for identifying a three-dimensional control endpoint and the computer-readable medium disclosed in the above embodiment of the present invention have multiple advantages, some of which are listed below:
1. The control endpoint can be correctly detected even against a complex background.
2. The user's operating position and distance need not be restricted.
3. The most appropriate control endpoint can be found following the shape of the human body, and the mutual interference caused when several people operate at the same time is minimized.
In summary, although the present invention has been disclosed above by way of a preferred embodiment, the embodiment is not intended to limit the invention. Those of ordinary skill in the art may make various changes and modifications without departing from the spirit and scope of the present invention. Therefore, the scope of protection of the present invention shall be defined by the appended claims.

Claims (22)

1. A method for identifying a three-dimensional control endpoint, comprising:
receiving depth information, the depth information being related to an image captured by an image capturing device;
generating a plurality of pieces of three-dimensional block information related to a plurality of three-dimensional blocks according to the depth information;
generating at least one reference plane according to the depth information;
generating at least one connection group according to the three-dimensional block information and the reference plane; and
selecting the three-dimensional block in the connection group closest to the image capturing device as a control endpoint.
2. The method for identifying a three-dimensional control endpoint according to claim 1, wherein the step of generating the plurality of pieces of three-dimensional block information comprises:
detecting a plurality of salient points according to the depth information;
expanding the salient points into the three-dimensional blocks according to depth differences between the salient points and neighboring pixels of the salient points; and
generating the three-dimensional block information related to the three-dimensional blocks.
3. The method for identifying a three-dimensional control endpoint according to claim 2, wherein the step of generating the plurality of pieces of three-dimensional block information further comprises:
filtering out noise blocks among the three-dimensional blocks according to depth variation.
4. The method for identifying a three-dimensional control endpoint according to claim 2, wherein the step of generating the plurality of pieces of three-dimensional block information further comprises:
determining whether all the salient points have been checked, and if not, repeating the step of expanding the salient points into the three-dimensional blocks.
5. The method for identifying a three-dimensional control endpoint according to claim 2, wherein the step of generating the plurality of pieces of three-dimensional block information further comprises:
finding a plurality of representative points in the three-dimensional blocks, respectively; and
generating the three-dimensional block information related to the representative points.
6. The method for identifying a three-dimensional control endpoint according to claim 5, wherein the representative points are respectively the centers of gravity of the three-dimensional blocks.
7. The method for identifying a three-dimensional control endpoint according to claim 1, wherein the step of generating the at least one reference plane comprises:
computing spatial distribution statistics according to the depth information;
smoothing the spatial distribution statistics over intervals and adjusting the statistics by global weighting; and
filtering out a noise plane to produce the reference plane.
8. The method for identifying a three-dimensional control endpoint according to claim 7, wherein the spatial distribution statistics are the numbers of pixels corresponding to different depths.
9. The method for identifying a three-dimensional control endpoint according to claim 7, wherein in the step of producing the reference plane, the noise plane is filtered out according to the largest peak of the spatial distribution statistics to produce the reference plane.
10. The method for identifying a three-dimensional control endpoint according to claim 1, wherein the step of generating the at least one connection group comprises:
connecting three-dimensional blocks that are close in distance according to distances between the three-dimensional blocks;
finding the deepest point on the link between two connected three-dimensional blocks as a reference point, determining according to the reference point whether the link meets a preset connection condition, and, if the link does not meet the preset connection condition, repeating the connecting step; and
gathering the three-dimensional blocks that are connected to each other and linked back to the reference plane to produce the connection group.
11. The method for identifying a three-dimensional control endpoint according to claim 10, wherein in the connecting step the two three-dimensional blocks closest in distance are connected first, and if the link between the two closest three-dimensional blocks does not meet the preset connection condition, the two next-closest three-dimensional blocks are selected.
12. a computer-readable medium has a plurality of programmed instruction to carry out the discrimination method of a three-dimensional control end points, the discrimination method of this three-dimensional control end points comprises:
Receive a depth information, this depth information is relevant to the image that an image capturing device is captured;
Produce a plurality of spaces block message that is relevant to a plurality of three-dimensional blocks according to this depth information;
Produce at least one reference planes according to this depth information;
According to these space block messages and at least one connection of this reference planes generation group; And
Select to control end points near the three-dimensional block of this image capturing device as one in this connection group.
13. computer-readable medium as claimed in claim 12, this step that wherein produces a plurality of spaces block message comprises:
Detect a plurality of salient points according to this depth information;
Depth difference according to the neighboring pixel point of these salient points and these salient points expands to these three-dimensional blocks with these salient points; And
Generation is relevant to these space block messages of these three-dimensional blocks.
14. computer-readable medium as claimed in claim 13, this step that wherein produces a plurality of spaces block message also comprises:
Filter the noise block in these three-dimensional blocks according to change in depth.
15. computer-readable medium as claimed in claim 13, this step that wherein produces a plurality of spaces block message also comprises:
Judging whether these salient points are checked finishes, if otherwise repeat these salient points are expanded to this steps of these three-dimensional blocks.
16. computer-readable medium as claimed in claim 13, this step that wherein produces a plurality of spaces block message also comprises:
In these three-dimensional blocks, find out a plurality of representative points respectively; And
Generation is relevant to these space block messages of these representative points.
17. The computer-readable medium as claimed in claim 16, wherein the representative points are respectively the centers of gravity of the three-dimensional blocks.
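The representative point of claim 17, the center of gravity of a block, is simply the mean of the block's points. A minimal sketch over (x, y, z) tuples:

```python
def center_of_gravity(block_points):
    """Representative point of a block: the componentwise mean of
    its (x, y, z) points."""
    n = len(block_points)
    return tuple(sum(p[i] for p in block_points) / n for i in range(3))

block = [(0.0, 0.0, 1.0), (2.0, 0.0, 1.0), (1.0, 3.0, 1.6)]
print(center_of_gravity(block))  # (1.0, 1.0, 1.2)
```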
18. The computer-readable medium as claimed in claim 12, wherein the step of generating the at least one reference plane comprises:
calculating a spatial distribution statistic according to the depth information;
performing interval smoothing and overall weight adjustment on the spatial distribution statistic; and
filtering out a noise plane to produce the reference plane.
19. The computer-readable medium as claimed in claim 18, wherein the spatial distribution statistic is the number of pixels corresponding to each depth.
20. The computer-readable medium as claimed in claim 18, wherein in the step of producing the reference plane, a noise plane is filtered out and the reference plane is produced according to the maximum peak of the spatial distribution statistic.
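Claims 18 through 20 can be sketched together: histogram the pixels per depth interval (the spatial distribution statistic of claim 19), smooth the intervals, and take the depth of the maximum peak as the reference plane (claim 20). This sketch uses a simple moving average for the interval smoothing and omits the overall weight adjustment; the bin size is an assumption.

```python
def reference_plane_depth(depth_pixels, bin_size=0.5, window=3):
    """Count pixels per depth bin, smooth bin counts with a moving
    average, and return the depth of the maximum peak, taken here as
    the reference plane (e.g. a floor or wall)."""
    counts = {}
    for d in depth_pixels:
        b = int(d / bin_size)
        counts[b] = counts.get(b, 0) + 1
    half = window // 2
    smoothed = {
        b: sum(counts.get(b + k, 0) for k in range(-half, half + 1)) / window
        for b in counts
    }
    return max(smoothed, key=smoothed.get) * bin_size

# Most pixels lie near depth 2.1, so that bin wins as the plane.
pixels = [0.9] * 5 + [2.1] * 60 + [3.6] * 20
print(reference_plane_depth(pixels))  # 2.0
```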
21. The computer-readable medium as claimed in claim 12, wherein the step of generating the at least one connection group comprises:
connecting three-dimensional blocks that are close in distance according to the distances between the three-dimensional blocks;
finding, on the link between the two connected three-dimensional blocks, the point with the minimum depth as a reference point, and determining according to the reference point whether the link satisfies a preset connection condition; if the link does not satisfy the preset connection condition, repeating the connecting step; and
gathering the three-dimensional blocks that are connected to one another and are linked back to the reference plane to produce the connection group.
22. The computer-readable medium as claimed in claim 21, wherein in the connecting step the two closest three-dimensional blocks are connected first, and if the link between the two closest three-dimensional blocks does not satisfy the preset connection condition, the two next-closest three-dimensional blocks are selected.
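The connecting step of claims 21 and 22 can be sketched as examining block pairs in order of increasing distance between representative points and keeping each link that satisfies the preset connection condition; a failed pair is skipped and the next-closest pair is tried. The condition is abstracted here as a predicate `link_ok` (the patent instead tests a minimum-depth reference point on the link), and the depth-gap rule in the example is purely illustrative.

```python
import itertools
import math

def connect_blocks(points, link_ok):
    """Try block pairs from closest to farthest and keep the links
    that satisfy the (abstracted) preset connection condition."""
    pairs = sorted(
        itertools.combinations(range(len(points)), 2),
        key=lambda p: math.dist(points[p[0]], points[p[1]]),
    )
    return [(i, j) for i, j in pairs if link_ok(points[i], points[j])]

# Three representative points (x, y, depth); as an illustrative
# condition, reject links whose depth gap is 1.0 or more.
points = [(0, 0, 1.0), (1, 0, 1.1), (5, 0, 3.0)]
print(connect_blocks(points, lambda a, b: abs(a[2] - b[2]) < 1.0))  # [(0, 1)]
```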
CN 201010225545 2010-07-14 2010-07-14 Identification method for three-dimensional control end point and computer readable medium adopting same Active CN102331883B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010225545 CN102331883B (en) 2010-07-14 2010-07-14 Identification method for three-dimensional control end point and computer readable medium adopting same


Publications (2)

Publication Number Publication Date
CN102331883A true CN102331883A (en) 2012-01-25
CN102331883B CN102331883B (en) 2013-11-06

Family

ID=45483680

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010225545 Active CN102331883B (en) 2010-07-14 2010-07-14 Identification method for three-dimensional control end point and computer readable medium adopting same

Country Status (1)

Country Link
CN (1) CN102331883B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111127633A (en) * 2019-12-20 2020-05-08 支付宝(杭州)信息技术有限公司 Three-dimensional reconstruction method, apparatus, and computer-readable medium
CN112000824A (en) * 2019-05-27 2020-11-27 英业达科技有限公司 Object identification system and method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734743A (en) * 1994-07-12 1998-03-31 Canon Kabushiki Kaisha Image processing method and apparatus for block-based corresponding point extraction
US20090285283A1 (en) * 2006-07-17 2009-11-19 Yong Ying Gao Method and apparatus for encoding video color enhancement data, and method and apparatus for decoding video color enhancement data
CN101689299A (en) * 2007-06-20 2010-03-31 Thomson Licensing System and method for stereo matching of images



Also Published As

Publication number Publication date
CN102331883B (en) 2013-11-06

Similar Documents

Publication Publication Date Title
US11360571B2 (en) Information processing device and method, program and recording medium for identifying a gesture of a person from captured image data
KR102465532B1 (en) Method for recognizing an object and apparatus thereof
KR101064573B1 (en) System for tracking a moving object, by using particle filtering
JP4855556B1 (en) Moving object detection apparatus, moving object detection method, moving object detection program, moving object tracking apparatus, moving object tracking method, and moving object tracking program
KR101551576B1 (en) Robot cleaner, apparatus and method for recognizing gesture
CN105187785A (en) Cross-checkpost pedestrian identification system and method based on dynamic obvious feature selection
JPWO2010140578A1 (en) Image processing apparatus, image processing method, and image processing program
CN106056089A (en) Three-dimensional posture recognition method and system
CN104850219A (en) Equipment and method for estimating posture of human body attached with object
WO2021114896A1 (en) Computer vision-based anomaly detection method and apparatus, and electronic device
JP2018028784A (en) Movable body group detection program, movable body group detection device, and movable body group detection method
JP6255944B2 (en) Image analysis apparatus, image analysis method, and image analysis program
JP2010057105A (en) Three-dimensional object tracking method and system
CN108932465B (en) Method and device for reducing false detection rate of face detection and electronic equipment
CN102331883B (en) Identification method for three-dimensional control end point and computer readable medium adopting same
CN106023262A (en) Crowd flowing main direction estimating method and device
CN111753587A (en) Method and device for detecting falling to ground
JP6568772B2 (en) Image processing apparatus, image processing system, operation method of image processing apparatus, and control program
JP2019175107A (en) Recognition device, recognition method, program, and data generation device
KR102161212B1 (en) System and method for motion detecting
CN106303203A (en) A kind of information processing method and electronic equipment
Denman et al. Multi-modal object tracking using dynamic performance metrics
WO2021260934A1 (en) Information processing device, information processing method, and program storage medium
CN102945103B (en) A kind of touch object identification method of optical sensor
TWI431512B (en) Method for recognizing three-dimensional control point and computer readable medium using the same thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant