CN103279225B - Projection type man-machine interactive system and touch control identification method - Google Patents

Projection type man-machine interactive system and touch control identification method

Info

Publication number
CN103279225B
CN103279225B (application CN201310210406.8A; publication CN103279225A)
Authority
CN
China
Prior art keywords
button
edge
image
projection
offset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310210406.8A
Other languages
Chinese (zh)
Other versions
CN103279225A (en)
Inventor
胡军
李昂
韩衍隽
李国林
谢翔
吕众
宋玮
任力飞
郑毅
王志华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN201310210406.8A priority Critical patent/CN103279225B/en
Publication of CN103279225A publication Critical patent/CN103279225A/en
Application granted granted Critical
Publication of CN103279225B publication Critical patent/CN103279225B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to the field of human-computer interaction technology, and in particular to a projection-based human-machine interactive system and a touch recognition method. The touch recognition method comprises the steps of: homography and region-of-interest extraction: mapping the image captured by the camera onto the projector image plane and extracting the button edge region; button deformation detection: detecting the button edge within the region of interest and computing the edge offset H_offset at the fingertip; and touch decision: judging from the detected button edge offset whether the fingertip has touched the projection plane. The present invention offers fast recognition speed, high precision and high robustness.

Description

Projection type man-machine interactive system and touch control identification method
Technical field
The present invention relates to the field of human-computer interaction technology, and in particular to a projection-based human-machine interactive system and a touch recognition method.
Background technology
Most current portable electronic devices (such as mobile phones and tablet computers) use a liquid crystal display (LCD) as the primary output device of the human-machine interface. Constrained by the volume of the device itself, the display screen is made rather small, which is inconvenient for human-computer interaction. For example, the virtual keyboard buttons on smartphones such as the iPhone are small and tightly packed; users must type carefully and still frequently hit the wrong button, which greatly reduces the speed and comfort of interaction. A large display, by contrast, yields a high-quality image and makes interaction more comfortable and convenient, which is why large-screen phones are more popular on the market; but enlarging the display worsens the portability of the device. A projection-based human-machine interactive system combines a small device volume with a large display, so portable electronic devices with a projection function (such as mobile phones) are very attractive to consumers: in practice, the user projects the phone interface onto the surface of an object and obtains a much larger interactive interface there. Current projection phones, such as the Samsung i8530, do not support human-computer interaction (such as gesture recognition or touch recognition) on the projected interface. The present invention uses the single camera already on the phone to perform touch recognition and requires no additional embedded sensor, making it a low-cost solution.
The core idea of traditional touch recognition is: segment the foreground, extract the fingertip, estimate the distance between the fingertip and the projection surface, and then judge whether the fingertip has touched the interactive interface. Doing this with a single camera is difficult: the fingertip must not only be detected in the 2D image captured by the camera, its 3D coordinates must also be measured. Fingertip extraction on a projection screen from a 2D image is hard because: 1) the projected image changes the apparent color of the hand, so methods based on skin-color features fail; 2) the projected image lowers the contrast between the hand and the background and blurs the hand contour, so methods based on hand-contour features also perform poorly; 3) the projected image changes over time, so background subtraction cannot be used to extract the foreground. For the 3D coordinates of the fingertip, structured light could in principle be used, but projecting structured light degrades the displayed image quality, and the complexity of 3D reconstruction is too high for a real-time human-machine interaction system.
Summary of the invention
(1) Technical problem to be solved
The technical problem to be solved by the present invention is how to realize accurate, real-time, fast and highly robust touch detection in a projection-based human-machine interactive system that uses a single camera.
(2) technical scheme
To solve the above problem, the invention provides a projection-based human-machine interactive system, comprising: a projection module, for projecting the human-computer interaction interface onto a projection plane; an image capture module, for monitoring the interactive actions in the projection region and transmitting the captured images to the control and processing module; and a control and processing module, for configuring the parameters of the projection module, generating the human-computer interaction interface and transmitting it to the projection module, and for configuring the parameters of the image capture module, receiving and processing the captured images, and extracting human-machine interaction information from them.
Preferably, the system also comprises: a posture perception module, for detecting attitude changes of the system during interaction.
Preferably, when the attitude of the system changes, the homography matrix from the camera image plane to the projector image plane, defined with respect to the projection plane, is re-calibrated.
Preferably, the control and processing module is also used to control the posture perception module, to receive and analyze the posture perception data, and to re-calibrate the homography matrix when the attitude of the system changes.
Preferably, the posture perception module adopts one of the following attitude sensors: an accelerometer, a magnetometer or a gyroscope.
The present invention also provides a touch recognition method based on any of the above projection-based human-machine interactive systems, comprising the steps of: S1: homography and region-of-interest extraction: mapping the image captured by the camera onto the projector image plane, and extracting the button edge region; S2: button deformation detection: detecting the button edge within the region of interest, and computing the edge offset H_offset at the fingertip; S3: touch decision: judging from the detected button edge offset whether the fingertip has touched the projection plane.
Preferably, the button deformation detection in step S2 is realized by judging, from the combination of the gradient magnitude and the gradient direction of the image, whether a detected edge point belongs to the edge of a button.
Preferably, the estimation method of the button edge offset in step S2 is: using the radial direction at each original button edge point as the search path, take the first edge point detected out from the original edge point as the matching shifted edge point, and compute the mean offset of the button edge by the following formula:

$$H_{offset} = \frac{\sum_i \Delta b_i}{K}, \quad \Delta b_i > b_{min}$$

where Δb_i is the offset of an edge pixel, b_min is a positive threshold, and K is the number of edge pixels whose offset is greater than b_min.
Preferably, the touch decision criteria in step S3 are:
1) if H_offset < h_min, a touch is not judged to have occurred;
2) if h_min < H_offset < h_max, a touch is judged to have occurred;
3) if H_offset > h_max, the finger is judged to be hovering;
where h_min and h_max are the touch decision thresholds.
(3) beneficial effect
In the single-camera projection-based human-machine interactive system and touch recognition method of the present invention, the deformation of a button edge is used to judge whether a fingertip is above the button, and the offset of the button edge is further used to estimate the distance of the fingertip from the projection plane, thereby performing touch recognition. This is a new touch recognition method with wide applicability.
In the recognition method, the camera image is mapped onto the projector image plane, which improves the accuracy of button edge detection. At the same time, a button edge detection method is proposed that combines the gradient magnitude and the gradient direction of the image; it reduces the influence of noise, finger edges and finger shadows, and detects the edge of a graphical button accurately and quickly. In addition, all image processing is restricted to a region of interest, so the computational complexity is low and the detection precision is high. The touch recognition method is therefore fast, precise and highly robust.
Accompanying drawing explanation
Fig. 1 is a block diagram of the single-camera projection-based human-machine interactive system of one embodiment of the present invention;
Fig. 2 is a block diagram of the single-camera projection-based human-machine interactive system of another embodiment of the present invention;
Fig. 3 is the workflow diagram of the projection-based human-machine interactive system of one embodiment of the present invention;
Fig. 4 is the workflow diagram of the projection-based human-machine interactive system of another embodiment of the present invention;
Fig. 5 is a schematic diagram of a projected pixel being reflected by a fingertip in the "projector-camera" system of an embodiment of the present invention;
Fig. 6 is a schematic diagram of the button edge offset caused by a fingertip in an embodiment of the present invention;
Fig. 7 is a block diagram of touch recognition detection in an embodiment of the present invention;
Fig. 8 is an edge detection schematic diagram of an embodiment of the present invention;
Fig. 9 is a button edge matching schematic diagram of an embodiment of the present invention;
Figure 10 is a schematic diagram of the minimum and maximum fingertip heights corresponding to the touch decision of an embodiment of the present invention.
Embodiment
The specific embodiments of the present invention are described in further detail below in conjunction with the drawings and examples. The following examples illustrate the present invention but are not intended to limit its scope.
The embodiment of the invention discloses a single-camera projection-based human-machine interactive system. As shown in Fig. 1, the system consists of a projection module 103, an image capture module 104, and a control and processing module 102.
The projection module 103, as the output display unit of the interactive system, projects the human-machine interface onto a physical plane in the everyday environment (such as a desktop, a wall or the floor). The projection module can be a projector based on DLP, LCoS, LCD, laser or similar technology.
The image capture module 104 monitors interactive actions in the projection region in real time and transmits the captured images to the control and processing module 102. To reduce system cost, the image capture module 104 in this embodiment is an ordinary RGB camera or a monochrome camera.
The control and processing module 102 has two main functions: 1) configuring the parameters of the projection module 103 and generating the projected image (the human-machine interface), which is transmitted to the projection module 103 over a data path; 2) configuring the parameters of the image capture module 104, receiving and processing the images captured by the camera, and extracting human-machine interaction information from them.
Preferably, as shown in Fig. 2, the projection-based human-machine interactive system may also comprise a posture perception module 205.
The posture perception module 205 uses attitude sensors such as an accelerometer, a magnetometer or a gyroscope to detect attitude changes of the system in real time during interaction. When the attitude of the system changes, the homography matrix H_cp from the camera image plane to the projector image plane, defined with respect to the projection plane, is re-calibrated.
The control and processing module 202 can control the posture perception module 205, receive and analyze the posture perception data, and re-calibrate the homography matrix H_cp when the posture changes.
When human-computer interaction is performed, the workflow 300 of the system, shown in Fig. 3, comprises:
Step 301: calibrate the homography matrix H_cp. The pixel mapping from the camera image plane to the projector image plane can be described by a homography matrix.
Preferably, a checkerboard method can be used to calibrate the homography matrix. The calibration process is: 1) project a checkerboard onto the physical plane; the checkerboard image in the camera plane is P_c; 2) detect the checkerboard corners in P_c, obtaining the corner correspondences between the projector image plane and the camera image plane; 3) compute the homography matrix H_cp.
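By way of illustration, the following is a minimal sketch of this checkerboard calibration using OpenCV in Python; the function name and parameters are our own assumptions, and proj_corners is assumed to hold the same corners in projector-image coordinates (known because the projector drew the pattern), in the same order as detected:

```python
import cv2
import numpy as np

def calibrate_homography(camera_img, proj_corners, pattern_size=(9, 6)):
    """Estimate H_cp, the camera-image-plane -> projector-image-plane map.

    camera_img:   photo P_c of the projected checkerboard (step 1)
    proj_corners: (N, 2) corner positions in projector coordinates
    pattern_size: inner-corner grid of the checkerboard
    """
    gray = cv2.cvtColor(camera_img, cv2.COLOR_BGR2GRAY)
    found, cam_corners = cv2.findChessboardCorners(gray, pattern_size)  # step 2
    if not found:
        raise RuntimeError("checkerboard not detected")
    cam_corners = cv2.cornerSubPix(
        gray, cam_corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01))
    # step 3: homography from the corner correspondences
    H_cp, _ = cv2.findHomography(cam_corners.reshape(-1, 2),
                                 np.asarray(proj_corners, dtype=np.float32),
                                 cv2.RANSAC)
    return H_cp
```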
Step 302: the image capture module captures the human-computer interaction image in the projection region.
Step 303: perform touch recognition.
Step 304: respond to the detected touch operation and update the interactive interface.
Preferably, if the projection-based human-machine interactive system comprises a posture perception module, the workflow of the system, shown in Fig. 4, also comprises step 405: use the data of the attitude sensing system to judge whether the posture has changed. If the posture has changed, the homography matrix must be re-calibrated.
The embodiment of the present invention also discloses a touch recognition method for the single-camera projection-based human-machine interactive system.
In the human-machine interface, the edge of a button that is projected onto a finger shifts. The present invention judges from the deformation of the graphical button edge whether a fingertip is above the button, and further estimates the distance of the fingertip from the projection plane from the offset of the button edge, thereby performing touch recognition.
First, the influence of the finger on projected pixels (buttons) is illustrated with a model. As shown in Fig. 5, the distance Δd between the upper surface of the fingertip and the projection plane is defined as the height of the fingertip. According to the "projector-camera" system model, a point projected onto the projection plane satisfies formula (1), and a pixel m_p projected onto the upper surface of the fingertip satisfies formula (2):

$$m_c = \gamma K_c \left( R - \frac{t n^T}{d} \right) K_p^{-1} m_p \qquad (1)$$

$$m_c' = \gamma K_c \left( R - \frac{t n^T}{d - \Delta d} \right) K_p^{-1} m_p \qquad (2)$$

where R and t are the rotation matrix and the translation vector from the projector coordinate system to the camera coordinate system (also called the extrinsic parameters of the projector-camera system), K_c and K_p are the intrinsic matrices of the camera and the projector respectively, n is the normalized normal vector of the projection plane, and d is the distance from the origin of the projector coordinate system to the projection plane; m_p = (x_p, y_p, 1)^T are the homogeneous coordinates of an image point on the projector image plane, m_c = (x_c, y_c, 1)^T are the homogeneous coordinates in the camera image plane of the projector pixel m_p after reflection from the projection plane, and m_c' = (x_c', y_c', 1)^T are the homogeneous coordinates in the camera image plane of the projector pixel m_p after reflection from the upper surface of the finger; γ is a non-zero normalization factor. In addition, the homography matrix can be expressed as:

$$H_{pc} = K_c \left( R - \frac{t n^T}{d} \right) K_p^{-1} \qquad (3)$$
In the camera image plane, if a pixel of the projected image lands on the fingertip, its position shifts from m_c to m_c'. Mapping the image to the projector image plane via the homography matrix H_cp reduces the difficulty of matching:

$$m_p' = H_{pc}^{-1} m_c' = \gamma K_p \left( R - \frac{t n^T}{d} \right)^{-1} \left( R - \frac{t n^T}{d - \Delta d} \right) K_p^{-1} m_p \qquad (4)$$
On the projector image plane, the offset of the pixel is expressed as:

$$\Delta m_p' = m_p' - m_p = \gamma K_p \left( R - \frac{t n^T}{d} \right)^{-1} \left( R - \frac{t n^T}{d - \Delta d} \right) K_p^{-1} m_p - m_p \qquad (5)$$
In the vast majority of application environments, Δd is very small compared to d, so formula (5) can be simplified to:

$$\Delta m_p' \approx K_p \left( R - \frac{t n^T}{d} \right)^{-1} t n^T K_p^{-1} m_p \cdot \frac{\Delta d}{d^2} \qquad (6)$$
From formula (6), the offset of a projected pixel (hereafter "pixel offset") has the following properties:
1. in the projector image plane, the pixel offset is approximately linear in the height of the fingertip;
2. the pixel offset depends on the pixel coordinate m_p in the projected image;
3. the pixel offset is inversely proportional to the square of the distance from the projector to the projection plane.
Formula (6) also tells us that the height of the fingertip can be inferred by detecting the offset (in pixels) of the projected pixels on the fingertip, and that the fingertip height and the pixel offset are approximately linearly related. Formula (6) can be used to compute the fingertip height, but this is often unnecessary, because the pixel offset itself can serve as a proxy for the fingertip height, that is:

$$\Delta d \approx \chi \cdot \Delta m_p' \qquad (7)$$

where χ is a coefficient, determined by the system parameters and the projection-plane parameters, that converts the pixel offset into a fingertip height.
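For concreteness, taking norms on both sides of formula (6) suggests one explicit (per-pixel) expression for χ; this rearrangement is our own reading and does not appear in the original text:

```latex
\chi \;\approx\; \frac{d^{2}}
{\left\| K_p \left( R - \tfrac{t\,n^{T}}{d} \right)^{-1} t\, n^{T} K_p^{-1} m_p \right\|},
\qquad\text{so that}\qquad
\Delta d \;\approx\; \chi \cdot \left\| \Delta m_p' \right\|.
```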
Preferably, the button can be square or circular. Fig. 6(a) shows an original graphical button as photographed by the camera. As shown in Fig. 6(b), when a finger taps the button, the portion of the button edge projected onto the finger shifts. The present invention exploits this feature and performs touch recognition by detecting the offset of the button edge. However, when the finger hovers, the button edge projected onto the finger also shifts, as shown in Fig. 6(c). The method must therefore decide whether the deformation of the button is caused by a hovering fingertip or by a fingertip tapping the button.
The embodiment of the present invention also discloses the recognition flow of the above touch recognition method; the flow is described below with a button as the interactive object.
The touch recognition method is divided into three phases: 1) homography and region-of-interest (ROI) extraction; 2) button deformation detection; 3) touch decision. As shown in Fig. 7, homography and ROI extraction map the image captured by the camera onto the projector image plane and extract the button edge region; button deformation detection detects the button edge within the ROI and computes the button edge offset on the fingertip; finally, the touch decision judges from the detected button edge offset whether the fingertip has touched the projection surface. The realization of each phase is introduced below.
1. Homography and ROI extraction
First, the camera captures an image containing the projection region and the interactive action, and the RGB image is converted to a gray-scale image. Then, using the homography matrix H_cp measured during calibration, the image is mapped from the camera image plane to the projector image plane. Next, the region of interest (ROI) is extracted from the mapped image; the ROI is defined as the neighborhood (a band of ±ΔL) centered on the original button edge. Because of the influence of the finger, the button edge can depart from its original position, so ΔL must be chosen larger than the button edge offset. Compared to processing the entire image, detecting the button edge in the ROI has the following advantages: 1) the computational complexity is low; 2) the detection precision is higher; 3) edges detected in the entire image must afterwards be matched to buttons to decide which button each edge belongs to, and this matching can fail, whereas an edge detected in an ROI has a definite button associated with it and no matching is needed.
Preferably, the region of interest (ROI) can also be the entire image, so as to obtain the maximum amount of information from the image.
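As an illustration of the mapping and ROI extraction just described, a short sketch assuming OpenCV/NumPy; the function and parameter names are hypothetical, and the delta_L default is an arbitrary illustrative value:

```python
import cv2
import numpy as np

def extract_button_roi(frame_bgr, H_cp, proj_size, button_rect, delta_L=12):
    """Map the camera frame to the projector image plane and build a
    mask of the +/- delta_L band around one button's original edges.

    proj_size:   (width, height) of the projector image plane
    button_rect: (x, y, w, h) of the button in projector coordinates
    delta_L:     half-width of the edge band; must exceed the largest
                 expected button edge offset
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    mapped = cv2.warpPerspective(gray, H_cp, proj_size)  # camera -> projector plane
    x, y, w, h = button_rect
    # band around the edges: expanded rectangle minus shrunk rectangle
    band = np.zeros_like(mapped, dtype=np.uint8)
    cv2.rectangle(band, (x - delta_L, y - delta_L),
                  (x + w + delta_L, y + h + delta_L), 255, -1)
    cv2.rectangle(band, (x + delta_L, y + delta_L),
                  (x + w - delta_L, y + h - delta_L), 0, -1)
    return mapped, band  # edge detection runs only where band == 255
```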
2. Button deformation detection
Following the idea of edge detection, the button contour can be detected and the deformation of the button estimated from it. However, because of 1) camera noise and environmental noise, and 2) the influence of the finger edge and the finger shadow, classical edge extraction algorithms (such as the Canny algorithm or Laplacian-based edge detection) are not suitable for detecting the button edge. Fig. 8(c) shows the detection result of the Canny operator: the edges of the fingertip and of its shadow are detected as well. The embodiment of the present invention discloses a Canny-like detection algorithm divided into three steps:
1) Compute the gradient of the ROI image f(x, y) at each position (x, y). The gradient magnitude is:

$$g(x, y) = |G_x| + |G_y| = \left| \frac{\partial f}{\partial x} \right| + \left| \frac{\partial f}{\partial y} \right| \qquad (8)$$
The gradient direction is:

$$\alpha(x, y) = \arctan(G_x / G_y) \qquad (9)$$
The gradient magnitude and direction can be computed with traditional edge detection operators (such as Sobel, Prewitt or Roberts).
2) Connect the button edge points. If a pixel in the image satisfies the inequalities:

$$g(x, y) > E, \quad \text{and} \quad |\alpha(x, y) - \alpha_0(x, y)| < A \qquad (10)$$

then it is a pixel on the edge of a button, where E and A are non-negative thresholds and α₀(x, y) is the direction perpendicular to the original button edge at (x, y) (i.e., the expected gradient direction). Because of noise and the influence of the finger edge, some button edge points may be missed; these missed pixels are filled in at the original position of the button edge, and all button edge points are then connected. As can be seen from Fig. 8(b), the finger edge and the finger shadow also satisfy the first inequality in formula (10) (their gradient magnitude is large enough), so the gradient direction must be used to judge whether a point belongs to the button edge. Fig. 8(d) shows the button edge detected with the combined gradient-magnitude and gradient-direction constraints; this result largely suppresses the influence of the finger edge and the finger shadow.
3) Estimate the degree of deformation of the button from the detected button edge. The deformation is described by the button edge offset H_offset, which is the mean offset of the button edge relative to its original position.
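Before turning to the offset computation, the following is a minimal sketch of steps 1) and 2) above, assuming OpenCV/NumPy; the names and thresholds are illustrative, and atan2 is used for a full-quadrant angle (equivalent to formula (9) up to convention):

```python
import cv2
import numpy as np

def button_edge_points(roi, alpha0, E=40.0, A=np.deg2rad(20)):
    """Keep pixels whose gradient magnitude exceeds E and whose gradient
    direction lies within A of the expected direction alpha0
    (perpendicular to the original button edge), cf. formula (10).

    roi:    gray-scale ROI image f(x, y)
    alpha0: per-pixel expected gradient direction (same shape as roi)
    """
    gx = cv2.Sobel(roi, cv2.CV_32F, 1, 0)      # G_x
    gy = cv2.Sobel(roi, cv2.CV_32F, 0, 1)      # G_y
    mag = np.abs(gx) + np.abs(gy)              # formula (8)
    ang = np.arctan2(gy, gx)                   # gradient direction
    diff = np.abs(np.angle(np.exp(1j * (ang - alpha0))))  # wrapped angle difference
    edge = (mag > E) & (diff < A)              # formula (10)
    return np.argwhere(edge)                   # (row, col) of button edge pixels
```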
Preferably, the deformation of the button interactive object is described quantitatively. A button is made up of many pixels, and these pixels obey the pixel-offset properties 1, 2 and 3 above. Definition: the mean pixel offset of the button edge points detected in the projector image plane, relative to the original button edge points, is called the button edge offset. The button edge offset accurately describes the size of the deformation of the button edge on the fingertip. The mathematical description of the button edge offset, and how it is computed from the original button edge and the deformed button edge, are introduced below.
Suppose that in the projector image plane the set of original button edge points is B, i.e.

$$B = \{b_1, b_2, \ldots, b_i, \ldots, b_N\}, \quad b_i = (x_i, y_i) \qquad (11)$$

where the edge points are quantized in units of pixels, i.e. x_i and y_i are integers.
When the button is deformed by the finger or by other factors, the original edge points must be matched to the deformed edge points in order to quantify the pixel offset of the button edge. The matching method proposed by the present invention is: using the radial direction at each original button edge point as the search path, detect the matching shifted edge point. Fig. 9 uses a square button and a circular button to illustrate how the original button edge points are matched to the deformed button edge points.
Following this matching method, the set B' of deformed button edge points corresponding to the original set B can be found; B' is expressed as:

$$B' = \{b_1', b_2', \ldots, b_i', \ldots, b_N'\}, \quad b_i' = (x_i', y_i') \qquad (12)$$
In Euclidean space, the offset Δb_i of each edge pixel can be computed as

$$\Delta b_i = \| b_i' - b_i \|_2 \qquad (13)$$
The button edge offset can then be expressed as

$$H_{offset} = \frac{\sum_i \Delta b_i}{K}, \quad \Delta b_i > b_{min} \qquad (14)$$

where Δb_i is the offset of an edge pixel, b_min is a positive threshold, and K is the number of edge pixels whose offset is greater than b_min.
What button edge offset described is that this parameter can be described in the mean deviation amount at the button edge on finger tip well at the mean value of edge pixel point skew.Button edge offset is brought into formula (6), the average height of the edge pixel point be projected on hand can be calculated.
3. Touch decision
Finally, two thresholds h_max and h_min on the button edge offset decide whether the fingertip is in contact with the projection surface. The present application uses a threshold range to define a touch. As shown in Figure 10, when the system assumes that the user's fingertip is in contact with the projection surface, the fingertip height lies between Δd_min and Δd_max. If the fingertip height is greater than Δd_max, the fingertip is judged to be hovering; if the fingertip height is greater than Δd_min and less than Δd_max, the fingertip is judged to be in contact with the projection plane. Formula (6) describes the relation between the fingertip height and the button edge offset on the fingertip; using it, the thresholds h_min and h_max corresponding to Δd_min and Δd_max can be computed. After the button edge offset H_offset is detected, the touch is judged against the thresholds h_max and h_min, with the following criteria:
1) if H_offset < h_min, a touch is not judged to have occurred (although the button edge has shifted, the offset is too small and may be caused by factors such as noise rather than by a finger touch);
2) if h_min < H_offset < h_max, a touch is judged to have occurred;
3) if H_offset > h_max, the finger is judged to be hovering.
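A sketch of this decision rule; the default thresholds are illustrative placeholders that would in practice be derived from Δd_min and Δd_max via formula (6):

```python
def classify_touch(H_offset, h_min=2.0, h_max=8.0):
    """Apply the three decision criteria above to a measured H_offset."""
    if H_offset < h_min:
        return "no-touch"   # offset too small: likely noise, not a touch
    if H_offset > h_max:
        return "hover"      # finger above the plane, not touching
    return "touch"          # h_min < H_offset < h_max
```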
The above is only the preferred embodiment of the present invention. It should be pointed out that those skilled in the art can make improvements and substitutions without departing from the technical principle of the present invention, and such improvements and substitutions should also be considered to fall within the scope of protection of the present invention.

Claims (8)

1. A touch recognition method based on a single camera, characterized in that it comprises the steps of:
S1: homography and region-of-interest extraction: mapping the image captured by the camera onto the projector image plane, and extracting the button edge region;
S2: button deformation detection: detecting the button edge within the region of interest, and computing the mean offset H_offset of the button edge;
S3: touch decision: judging from the detected mean offset of the button edge whether the fingertip has touched the projection plane;
wherein the estimation method of the mean offset of the button edge in step S2 is: using the radial direction at each original button edge point as the search path, take the first edge pixel detected out from the original button edge point as the matching shifted edge pixel, and compute the mean offset of the button edge by the following formula:

$$H_{offset} = \frac{\sum_i \Delta b_i}{K}, \quad \Delta b_i > b_{min}$$

where Δb_i is the offset of an edge pixel, b_min is a positive threshold, and K is the number of edge pixels whose offset Δb_i is greater than b_min.
2. The touch recognition method according to claim 1, characterized in that the button deformation detection in step S2 is realized by judging, from the combination of the gradient magnitude and the gradient direction of the image, whether a detected edge point belongs to the edge of a button.
3. The touch recognition method according to claim 1, characterized in that the touch decision criteria in step S3 are:
1) if H_offset < h_min, a touch is not judged to have occurred;
2) if h_min < H_offset < h_max, a touch is judged to have occurred;
3) if H_offset > h_max, the finger is judged to be hovering;
where h_min and h_max are the touch decision thresholds.
4. A projection-based human-machine interactive system adopting the touch recognition method according to any one of claims 1-3, characterized in that it comprises:
a projection module, for projecting the human-computer interaction interface onto a projection plane;
an image capture module, for monitoring the interactive actions in the projection region and transmitting the captured images to the control and processing module;
a control and processing module, for configuring the parameters of the projection module, generating the human-computer interaction interface and transmitting it to the projection module, and for configuring the parameters of the image capture module, receiving and processing the captured images, and extracting human-machine interaction information from them.
5. The projection-based human-machine interactive system according to claim 4, characterized in that it also comprises:
a posture perception module, for detecting attitude changes of the system during interaction.
6. The projection-based human-machine interactive system according to claim 5, characterized in that when the attitude of the system changes, the homography matrix from the camera image plane to the projector image plane, defined with respect to the projection plane, is re-calibrated.
7. The projection-based human-machine interactive system according to claim 6, characterized in that the control and processing module is also used to control the posture perception module, to receive and analyze the posture perception data, and to re-calibrate the homography matrix when the attitude of the system changes.
8. The projection-based human-machine interactive system according to claim 5, characterized in that the posture perception module adopts one of the following attitude sensors: an accelerometer, a magnetometer or a gyroscope.
CN201310210406.8A 2013-05-30 2013-05-30 Projection type man-machine interactive system and touch control identification method Active CN103279225B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310210406.8A CN103279225B (en) 2013-05-30 2013-05-30 Projection type man-machine interactive system and touch control identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310210406.8A CN103279225B (en) 2013-05-30 2013-05-30 Projection type man-machine interactive system and touch control identification method

Publications (2)

Publication Number Publication Date
CN103279225A CN103279225A (en) 2013-09-04
CN103279225B true CN103279225B (en) 2016-02-24

Family

ID=49061778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310210406.8A Active CN103279225B (en) 2013-05-30 2013-05-30 Projection type man-machine interactive system and touch control identification method

Country Status (1)

Country Link
CN (1) CN103279225B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103956036B (en) * 2013-10-14 2016-12-07 天津锋时互动科技有限公司 A kind of non-touching formula remote controller being applied to household electrical appliances
CN103558914A (en) * 2013-10-31 2014-02-05 中山大学 Single-camera virtual keyboard based on geometric correction and optimization
CN103559809B (en) * 2013-11-06 2017-02-08 常州文武信息科技有限公司 Computer-based on-site interaction demonstration system
CN103809880B (en) * 2014-02-24 2017-02-08 清华大学 Man-machine interaction system and method
CN103955316B (en) * 2014-04-28 2016-09-21 清华大学 A kind of finger tip touching detecting system and method
CN104298355A (en) * 2014-10-16 2015-01-21 广东科学技术职业学院 Quick input system and method of mobile terminal device
CN106155423A (en) * 2015-04-28 2016-11-23 长城汽车股份有限公司 Vehicle-mounted laser projection key system and vehicle
JP2017146927A (en) * 2016-02-19 2017-08-24 ソニーモバイルコミュニケーションズ株式会社 Control device, control method, and program
CN106204604B (en) * 2016-04-29 2019-04-02 北京仁光科技有限公司 Project touch control display apparatus and its exchange method
CN106114519A (en) * 2016-08-05 2016-11-16 威马中德汽车科技成都有限公司 A kind of device and method vehicle being controlled by operation virtual push button
CN110909729B (en) * 2019-12-09 2022-10-18 广东小天才科技有限公司 Click-to-read content identification method and device and terminal equipment
CN114820791B (en) * 2022-04-26 2023-05-02 极米科技股份有限公司 Obstacle detection method, device, system and nonvolatile storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1701351A (en) * 2000-09-07 2005-11-23 卡尼斯塔公司 Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
CN101901051A (en) * 2009-05-26 2010-12-01 美国智能科技有限公司 Data entry device and device based on the input object of distinguishing
CN102508574A (en) * 2011-11-09 2012-06-20 清华大学 Projection-screen-based multi-touch detection method and multi-touch system
CN102821323A (en) * 2012-08-01 2012-12-12 成都理想境界科技有限公司 Video playing method, video playing system and mobile terminal based on augmented reality technique

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8490002B2 (en) * 2010-02-11 2013-07-16 Apple Inc. Projected display shared workspaces

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1701351A (en) * 2000-09-07 2005-11-23 卡尼斯塔公司 Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
CN101901051A (en) * 2009-05-26 2010-12-01 美国智能科技有限公司 Data entry device and device based on the input object of distinguishing
CN102508574A (en) * 2011-11-09 2012-06-20 清华大学 Projection-screen-based multi-touch detection method and multi-touch system
CN102821323A (en) * 2012-08-01 2012-12-12 成都理想境界科技有限公司 Video playing method, video playing system and mobile terminal based on augmented reality technique

Also Published As

Publication number Publication date
CN103279225A (en) 2013-09-04

Similar Documents

Publication Publication Date Title
CN103279225B (en) Projection type man-machine interactive system and touch control identification method
CN102799318B (en) A kind of man-machine interaction method based on binocular stereo vision and system
CN102508574B (en) Projection-screen-based multi-touch detection method and multi-touch system
US9405182B2 (en) Image processing device and image processing method
US9288373B2 (en) System and method for human computer interaction
US6512838B1 (en) Methods for enhancing performance and data acquired from three-dimensional image systems
US20150213647A1 (en) Dimensioning system calibration systems and methods
CN110473293B (en) Virtual object processing method and device, storage medium and electronic equipment
CN103955316B (en) A kind of finger tip touching detecting system and method
US20170106540A1 (en) Information processing apparatus, information processing method, and program
CN102591533B (en) Multipoint touch screen system realizing method and device based on computer vision technology
KR102057531B1 (en) Mobile devices of transmitting and receiving data using gesture
Caputo et al. 3D Hand Gesture Recognition Based on Sensor Fusion of Commodity Hardware.
CN105723300A (en) Determining a segmentation boundary based on images representing an object
US9582127B2 (en) Large feature biometrics using capacitive touchscreens
CN103488356B (en) A kind of touch identification method based on infrared camera three-dimensional imaging
US9727776B2 (en) Object orientation estimation
CN103713738A (en) Man-machine interaction method based on visual tracking and gesture recognition
CN104461176A (en) Information processor, processing method, and projection system
KR20170081808A (en) System and method for detecting object in depth image
CN104899361A (en) Remote control method and apparatus
CN103761011B (en) A kind of method of virtual touch screen, system and the equipment of calculating
KR20130050672A (en) Method of virtual touch using 3d camera and apparatus thereof
US20150268793A1 (en) Low ground mass artifact management
CN107113417A (en) Project image onto on object

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant