CN103955316A - Fingertip touch detection system and method - Google Patents

Publication number
CN103955316A
Authority
CN
China
Prior art keywords: fingertip, image sensing, information, human-computer interaction
Prior art date
Legal status: Granted (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number
CN201410175698.0A
Other languages
Chinese (zh)
Other versions
CN103955316B (en)
Inventor
谢翔
李国林
蔡西蕾
宋玮
郑毅
吕众
任力飞
王志华
Current assignee (listed assignees may be inaccurate; Google has not performed a legal analysis): Tsinghua University
Original assignee: Tsinghua University
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN201410175698.0A (patent CN103955316B)
Publication of CN103955316A
Application granted
Publication of CN103955316B
Legal status: Active
Anticipated expiration

Abstract

The invention provides a fingertip touch detection system and method. The system comprises: a projection interface processing unit that receives externally input human-computer interaction interface information and outputs it to a projection unit; the projection unit, which projects the human-computer interaction interface information onto a projection plane; an image sensing unit that captures an image of the projection area and sends it to an image processing unit; the image processing unit, which judges whether the projection-area image contains a fingertip and, if so, computes the distance from the fingertip to the human-computer interaction interface and the fingertip's position within that interface; and a touch detection unit comprising a judgment execution module, an optical-center imaging position computation module on the imaging plane, a fingertip lowest-point acquisition module, a shadow front-endpoint acquisition module, and a fingertip touch judgment module, which together judge touches and output touch information. The disclosed system and method achieve touch detection with low cost, low power consumption, and high precision.

Description

Fingertip touch detection system and method
Technical field
The present invention relates to the field of computer technology, and in particular to a fingertip touch detection system and method.
Background technology
In recent years, with the advent of smartphones, tablet computers, and Google Glass, communication between people and the digital world has become ever closer, which has further driven the miniaturization of smart devices. Constrained by the volume of conventional display hardware, smart devices that rely on traditional displays cannot be miniaturized much further, so attention has turned to projection display devices.
To detect touch actions on a projected screen, many new human-computer interaction systems and methods have appeared in recent years. They fall mainly into two classes. The first embeds implicit striped structured light in the projected image and judges whether a touch event has occurred from the offset of the stripes. Its main drawback is that it requires a projection device with a high frame rate to embed the implicit striped structured light. The second projects the human-computer interaction interface image and uses a depth camera to acquire spatial information about the projection surface and the touching object; after the distance between the projection surface and the touching object is computed, a distance threshold decides whether a touch event has occurred. Its drawback is that it relies on a depth sensor to sample depth information, and the resolution of current depth sensors is low, so accuracy is limited; moreover, a depth sensor makes the system more expensive, bulkier, and more power-hungry.
Summary of the invention
To address these deficiencies of the prior art, the invention provides a fingertip touch detection system and method that achieve low-cost, low-power, high-precision touch detection.
To achieve the above object, the invention adopts the following technical solutions:
A fingertip touch detection system, comprising:
a projection interface processing unit for receiving externally input human-computer interaction interface information and outputting the received information to a projection unit;
a projection unit for projecting the human-computer interaction interface information onto a projection plane;
an image sensing unit for capturing an image of the projection area on the projection plane and sending the captured image to an image processing unit;
an image processing unit for judging whether the projection-area image contains a fingertip and, when it does, computing the distance from the fingertip to the human-computer interaction interface and the fingertip's position in the interface; if that distance is less than or equal to a first threshold, the unit outputs touch information to a touch detection unit; otherwise it outputs the projection-area image containing the fingertip and the fingertip's position in the interface to the touch detection unit;
the image processing unit also obtains the system intrinsic parameters, the fixed system extrinsic parameters, the variable system extrinsic parameters, and the homography matrix between the projection unit and the image sensing unit, and outputs all of them to the touch detection unit;
here the system intrinsic parameters are the optical-center position parameters of the projection unit and the image sensing unit; the fixed system extrinsic parameters are the relative spatial position and attitude between the projection unit and the image sensing unit; the variable system extrinsic parameters are the relative spatial position and attitude between the image sensing unit / projection unit and the projection plane;
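The homography named above maps pixel coordinates between the projector and camera image planes through the projection plane. As a minimal sketch of how such a matrix is applied (the 3×3 matrix `H` below is a toy value, not one calibrated from real system parameters):

```python
import numpy as np

def map_through_homography(H, pt):
    """Map a 2-D point through a 3x3 homography (homogeneous normalization)."""
    x, y = pt
    u, v, w = H @ np.array([x, y, 1.0])
    return (float(u / w), float(v / w))

# Toy homography: a pure translation by (5, -2) between the two image planes.
H = np.array([[1.0, 0.0,  5.0],
              [0.0, 1.0, -2.0],
              [0.0, 0.0,  1.0]])
print(map_through_homography(H, (10.0, 10.0)))  # → (15.0, 8.0)
```

In a real system `H` would be estimated once from the fixed and variable extrinsic parameters (or from point correspondences on the projection plane) and then reused for every frame.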
a touch detection unit comprising a judgment execution module, an on-imaging-plane optical-center imaging position computation module, a fingertip lowest-point acquisition module, a shadow front-endpoint acquisition module, and a fingertip touch judgment module, wherein:
the judgment execution module directly outputs touch information when the information received from the image processing unit is touch information; when the received information is a projection-area image containing a fingertip together with the fingertip's position in the interface, it invokes the on-imaging-plane optical-center imaging position computation module;
the on-imaging-plane optical-center imaging position computation module receives the fixed system extrinsic parameters, the variable system extrinsic parameters, the system intrinsic parameters, and the projector-camera homography output by the image processing unit; from these it computes geometrically, on the image sensing unit's imaging plane, the position of intersection point b, where the line through the image sensing unit's optical center perpendicular to the projection plane meets the projection plane, and of intersection point a, where the line through the projection unit's optical center and the image sensing unit's optical center meets the projection plane; it outputs both intersection points to the fingertip lowest-point acquisition module and the shadow front-endpoint acquisition module;
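Both intersection points a and b are defined by line-plane intersections in 3-D before being mapped onto the imaging plane. A sketch of that intersection step, using an invented plane z = 2 and toy optical-center positions (the final mapping into pixel coordinates is omitted):

```python
import numpy as np

def line_plane_intersection(p0, d, n, q):
    """Intersect the line p0 + t*d with the plane n . (x - q) = 0."""
    p0, d, n, q = (np.asarray(v, dtype=float) for v in (p0, d, n, q))
    t = n @ (q - p0) / (n @ d)
    return p0 + t * d

# Projection plane z = 2: normal n, point q on the plane.
n, q = (0, 0, 1), (0, 0, 2)
# Camera optical center at the origin, projector optical center at (1, 0, 1).
b = line_plane_intersection((0, 0, 0), n, n, q)          # perpendicular from camera
a = line_plane_intersection((0, 0, 0), (1, 0, 1), n, q)  # line through both centers
```

With these toy values, b lands at (0, 0, 2) directly below the camera and a at (2, 0, 2) along the baseline direction.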
the fingertip lowest-point acquisition module receives the projection-area image containing the fingertip and the fingertip's position in the interface from the image processing unit, and the position of intersection point b from the on-imaging-plane optical-center imaging position computation module; using the region of the interface where a touch may occur and the color information of the fingertip, it locates the fingertip, performs edge detection to obtain the fingertip edge, fits a circle to the edge to obtain center o, and connects o to b; the intersection of line ob with the fingertip edge is defined as the fingertip lowest point f, whose coordinates on the image sensing unit's imaging plane are denoted (x_f^c, y_f^c); the position of f is output to the shadow front-endpoint acquisition module;
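The circle-fit-and-intersect construction of the lowest point f can be sketched as follows. The patent does not fix a fitting method, so this uses the standard least-squares (Kåsa) algebraic circle fit as one possible realization, with toy edge points:

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit.

    Solves x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F), then recovers
    center (a, b) = (-D/2, -E/2) and radius r = sqrt(a^2 + b^2 - F)."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    D, E, F = np.linalg.lstsq(A, -(x**2 + y**2), rcond=None)[0]
    a, b = -D / 2.0, -E / 2.0
    return (a, b), np.sqrt(a * a + b * b - F)

def lowest_point(center, r, b_pt):
    """Fingertip lowest point f: where the ray from circle center o toward
    intersection point b leaves the fitted circle."""
    c = np.asarray(center, dtype=float)
    d = np.asarray(b_pt, dtype=float) - c
    return c + r * d / np.linalg.norm(d)

# Toy edge points sampled from a circle centered at (3, 4) with radius 2.
theta = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
edge = np.column_stack([3 + 2 * np.cos(theta), 4 + 2 * np.sin(theta)])
center, radius = fit_circle(edge)
f = lowest_point(center, radius, b_pt=(3.0, 0.0))  # b assumed straight below o
```

With b directly below the fitted center, f comes out at (3, 2), the bottom of the circle, matching the intended geometry.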
the shadow front-endpoint acquisition module receives the projection-area image containing the fingertip and the fingertip's position in the interface from the image processing unit, the position of intersection point a from the on-imaging-plane optical-center imaging position computation module, and the position of the fingertip lowest point f from the fingertip lowest-point acquisition module; using the characteristics of shadows in HSV color space, it obtains the fingertip shadow region; connecting intersection point a to the fingertip lowest point f, the intersection of that line with the shadow edge is the shadow front endpoint s, whose coordinates on the image sensing unit's imaging plane are denoted (x_s^c, y_s^c); the positions of f and s are output to the fingertip touch judgment module;
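As an illustration of the HSV shadow cue: a shadow pixel keeps roughly the hue of the lit projection surface while losing brightness (value). The thresholds below are invented for illustration, not taken from the patent:

```python
import colorsys

def is_shadow(rgb, bg_rgb, v_drop=0.3):
    """Heuristic HSV shadow test: shadow keeps the background hue (circular
    hue distance small) but its value drops by more than v_drop."""
    h, s, v = colorsys.rgb_to_hsv(*[c / 255.0 for c in rgb])
    hb, sb, vb = colorsys.rgb_to_hsv(*[c / 255.0 for c in bg_rgb])
    hue_close = min(abs(h - hb), 1.0 - abs(h - hb)) < 0.1
    return hue_close and (vb - v) > v_drop

bg = (200, 180, 160)            # lit projection surface
print(is_shadow((90, 80, 72), bg))    # same hue, much darker → True
print(is_shadow((210, 190, 170), bg)) # brighter than background → False
```

A per-pixel pass of this test over the neighborhood of the fingertip would yield the shadow region whose edge is intersected with line af.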
the fingertip touch judgment module receives the projector-camera homography output by the image processing unit, and the fingertip lowest point and shadow front endpoint output by the shadow front-endpoint acquisition module; through the homography it computes the position (x_s^p, y_s^p) of the shadow front endpoint on the projection unit's imaging plane; from the fixed intrinsic and extrinsic system parameters it obtains the rotation matrix R and translation matrix T between the image sensing unit and the projection unit, and expresses both optical centers in the same coordinate system. Unified in the image sensing unit's optical-center coordinate system, the camera optical center is (0, 0, 0), the projector optical center is T_{c→p}, the fingertip lowest point is (x_f^{c′}, y_f^{c′}, f_c), and the shadow front endpoint is (x_s^{p′}, y_s^{p′}, f_p), where R_{p→c} is the 3×3 rotation matrix from the projector coordinate system to the camera coordinate system, T_{c→p} is the 1×3 translation matrix from the camera coordinate system to the projector coordinate system, f_c and f_p are the camera and projector focal lengths, and (x_f^{c′}, y_f^{c′}) and (x_s^{p′}, y_s^{p′}) are the coordinates corrected by the camera and projector intrinsic parameters, respectively. From the fingertip lowest point on the camera imaging plane, the shadow front endpoint on the projector imaging plane, and the two optical-center positions, the line in space through the projector optical center and the shadow front endpoint on the projector plane is [x_c, y_c, z_c] = λ₁v₁ + T_{c→p}, and the line through the camera optical center and the fingertip lowest point on the camera plane is [x_c, y_c, z_c] = λ₂v₂; their (least-squares) intersection has spatial coordinate λ₂′v₂, which is taken as the fingertip position, where v₁ = (x_s^{p′}, y_s^{p′}, f_p), v₂ = (x_f^{c′}, y_f^{c′}, f_c), λ₁ and λ₂ are scale factors, and λ₁′, λ₂′ are the definite values satisfying

(λ₁′, λ₂′)ᵀ = [ ‖v₁‖²  −v₁ᵀv₂ ; −v₂ᵀv₁  ‖v₂‖² ]⁻¹ (−v₁ᵀT_{c→p}, v₂ᵀT_{c→p})ᵀ,

with v₁ᵀ and v₂ᵀ the transposes of v₁ and v₂. Finally, the projection-plane equation, represented by a normal vector n_c and a point p_c on the plane, is computed from the variable system extrinsic parameters, and the distance between the above intersection point and the projection plane is obtained; when this distance is less than a second threshold, a touch operation is judged to have occurred. When a touch operation occurs, the fingertip touch judgment module also uses the received camera-projector homography to compute the coordinates of the touch point in the projected human-computer interaction interface and outputs fingertip touch information, which at least includes the coordinate position in the originally projected interface touched by the fingertip;
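The least-squares intersection of the projector ray and the camera ray reduces to the 2×2 linear system for λ₁′ and λ₂′ given above. A sketch, with toy rays chosen so that they intersect exactly at (1, 1, 2):

```python
import numpy as np

def ray_intersection(v1, T, v2):
    """Least-squares closest point between
    line 1: P = l1*v1 + T   (projector ray, offset by T = T_cp)
    line 2: P = l2*v2       (camera ray through the origin).
    Returns l2'*v2, the point on the camera ray."""
    v1, v2, T = (np.asarray(v, dtype=float) for v in (v1, v2, T))
    A = np.array([[v1 @ v1, -(v1 @ v2)],
                  [-(v2 @ v1), v2 @ v2]])
    rhs = np.array([-(v1 @ T), v2 @ T])
    l1, l2 = np.linalg.solve(A, rhs)
    return l2 * v2

# Projector ray from T = (1, 0, 0) with direction (0, 1, 2); camera ray
# through the origin with direction (0.5, 0.5, 1). Both pass through (1, 1, 2).
p = ray_intersection((0.0, 1.0, 2.0), (1.0, 0.0, 0.0), (0.5, 0.5, 1.0))
```

Because the two normal equations come from minimizing ‖λ₁v₁ + T − λ₂v₂‖², the same code also returns a sensible midpoint estimate when measurement noise keeps the rays from meeting exactly.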
a control unit for controlling and coordinating all other units of the system; the control unit can also switch the system among a fixed intrinsic/extrinsic parameter acquisition state, a variable parameter acquisition state, and a fingertip touch detection operating state.
Optionally, when the externally input human-computer interaction interface contains points, lines, or surfaces with distinct features, the projection interface processing unit also extracts first feature information from the input interface information and outputs it to the image processing unit, the first feature information comprising feature points, lines, and surfaces together with their coordinate positions in the interface;
the image processing unit extracts second feature information, comprising feature points, lines, and surfaces, from the projection-area image output by the image sensing unit;
the image processing unit compares the second feature information with the first; from whether the second feature information is deformed relative to the feature points, lines, or surfaces of the first, it judges whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and the fingertip's position in the projection area.
Optionally, when the externally input human-computer interaction interface contains no points, lines, or surfaces with distinct features, the projection interface processing unit instead extracts third feature information from the input interface information and outputs it to the image processing unit, the third feature information comprising the boundary of the projected interface;
the image processing unit extracts fourth feature information, comprising the boundary of the interface, from the projection-area image output by the image sensing unit;
the image processing unit compares the third and fourth feature information; from whether the fourth feature information is deformed relative to the third, it judges whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and the fingertip's position in the interface.
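A minimal sketch of the deformation comparison: projected feature points are compared with their observed counterparts, and any displacement beyond a threshold flags a possible fingertip. The point values and the 3-pixel threshold are invented for illustration:

```python
def contains_fingertip(expected_pts, observed_pts, threshold=3.0):
    """Return (found, deformed) where deformed lists (index, displacement)
    for every feature point displaced by more than `threshold` pixels."""
    deformed = []
    for i, ((ex, ey), (ox, oy)) in enumerate(zip(expected_pts, observed_pts)):
        disp = ((ox - ex) ** 2 + (oy - ey) ** 2) ** 0.5
        if disp > threshold:
            deformed.append((i, disp))
    return (len(deformed) > 0, deformed)

expected = [(10, 10), (50, 10), (50, 50)]   # positions predicted via the homography
observed = [(10, 10), (50, 10), (58, 55)]   # last point displaced by a finger
found, where = contains_fingertip(expected, observed)
print(found)  # → True
```

The displacement magnitude at the deformation would then feed the distance estimate between the fingertip and the interface.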
Optionally, a projection interface module is provided between the projection interface processing unit and the projection unit to apply shape pre-distortion correction to the received human-computer interaction interface information.
Optionally, an image sensing interface module is provided between the image sensing unit and the image processing unit to apply optical distortion correction to the captured image.
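One common form of optical distortion correction — the patent does not specify which model it uses — is a single-coefficient radial model inverted by fixed-point iteration, sketched here in normalized image coordinates:

```python
def undistort_point(xd, yd, k1, cx=0.0, cy=0.0, iters=5):
    """Invert the one-coefficient radial model x_d = x_u * (1 + k1 * r_u^2)
    by fixed-point iteration, with principal point (cx, cy)."""
    xd, yd = xd - cx, yd - cy
    xu, yu = xd, yd                      # initial guess: the distorted point
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        xu, yu = xd / (1.0 + k1 * r2), yd / (1.0 + k1 * r2)
    return xu + cx, yu + cy
```

With k1 = 0 the point is returned unchanged; for small k1 the iteration converges in a few steps, which is why a short fixed loop suffices.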
A fingertip touch detection method based on the above fingertip touch detection system, the method comprising:
Step S1: the projection interface processing unit receives externally input human-computer interaction interface information and outputs the received information to the projection unit;
Step S2: the projection unit projects the human-computer interaction interface information onto the projection plane;
Step S3: the image sensing unit captures an image of the projection area on the projection plane and sends that image to the image processing unit;
Step S4: the image processing unit obtains the system intrinsic parameters, the fixed system extrinsic parameters, and the variable system extrinsic parameters, and uses the fixed and variable extrinsic parameters to compute the homography matrix between the projection unit and the image sensing unit;
here the system intrinsic parameters are the optical-center position parameters of the projection unit and the image sensing unit; the fixed system extrinsic parameters are the relative spatial position and attitude between the projection unit and the image sensing unit; the variable system extrinsic parameters are the relative spatial position and attitude between the image sensing unit / projection unit and the projection plane;
Step S5: the image processing unit judges whether the projection-area image contains a fingertip and, when it does, computes the distance from the fingertip to the human-computer interaction interface; if that distance is less than or equal to a first threshold, it outputs touch information to the touch detection unit; otherwise it outputs the projection-area image containing the fingertip and the fingertip's position in the interface to the touch detection unit;
Step S6: when the information the touch detection unit receives from the image processing unit is touch information, it outputs the touch information directly; when it receives a projection-area image containing a fingertip together with the fingertip's position in the interface, it obtains the fingertip position, judges whether the fingertip touches, and outputs touch information if it does. Specifically: it receives the fixed system extrinsic parameters, the variable system extrinsic parameters, the system intrinsic parameters, and the projector-camera homography output by the image processing unit; from these it computes geometrically, on the image sensing unit's imaging plane, the position of intersection point b, where the line through the image sensing unit's optical center perpendicular to the projection plane meets the projection plane, and of intersection point a, where the line through the projection unit's optical center and the image sensing unit's optical center meets the projection plane. Using the region of the interface where a touch may occur and the color information of the fingertip, it locates the fingertip, performs edge detection to obtain the fingertip edge, fits a circle to the edge to obtain center o, and connects o to b; the intersection of line ob with the fingertip edge is defined as the fingertip lowest point f, whose coordinates on the camera imaging plane are denoted (x_f^c, y_f^c). Using the characteristics of shadows in HSV color space, it obtains the fingertip shadow region; connecting intersection point a to the fingertip lowest point f, the intersection of that line with the shadow edge is the shadow front endpoint s, whose coordinates on the camera imaging plane are denoted (x_s^c, y_s^c). Through the projector-camera homography it computes the position (x_s^p, y_s^p) of the shadow front endpoint on the projection unit's imaging plane; from the fixed intrinsic and extrinsic system parameters it obtains the rotation matrix R and translation matrix T between the image sensing unit and the projection unit, and expresses both optical centers in the same coordinate system. Unified in the image sensing unit's optical-center coordinate system, the camera optical center is (0, 0, 0), the projector optical center is T_{c→p}, the fingertip lowest point is (x_f^{c′}, y_f^{c′}, f_c), and the shadow front endpoint is (x_s^{p′}, y_s^{p′}, f_p), where R_{p→c} is the 3×3 rotation matrix from the projector coordinate system to the camera coordinate system, T_{c→p} is the 1×3 translation matrix from the camera coordinate system to the projector coordinate system, f_c and f_p are the camera and projector focal lengths, and (x_f^{c′}, y_f^{c′}) and (x_s^{p′}, y_s^{p′}) are the coordinates corrected by the camera and projector intrinsic parameters, respectively. The line in space through the projector optical center and the shadow front endpoint on the projector plane is [x_c, y_c, z_c] = λ₁v₁ + T_{c→p}, and the line through the camera optical center and the fingertip lowest point on the camera plane is [x_c, y_c, z_c] = λ₂v₂; their (least-squares) intersection has spatial coordinate λ₂′v₂, which is taken as the fingertip position, where v₁ = (x_s^{p′}, y_s^{p′}, f_p), v₂ = (x_f^{c′}, y_f^{c′}, f_c), λ₁ and λ₂ are scale factors, and λ₁′, λ₂′ are the definite values satisfying

(λ₁′, λ₂′)ᵀ = [ ‖v₁‖²  −v₁ᵀv₂ ; −v₂ᵀv₁  ‖v₂‖² ]⁻¹ (−v₁ᵀT_{c→p}, v₂ᵀT_{c→p})ᵀ,

with v₁ᵀ and v₂ᵀ the transposes of v₁ and v₂. Finally, the projection-plane equation, represented by a normal vector n_c and a point p_c on the plane, is computed from the variable system extrinsic parameters, and the distance between the above intersection point and the projection plane is obtained; when this distance is less than a second threshold, a touch operation is judged to have occurred. When a touch operation occurs, the fingertip touch judgment module also uses the received camera-projector homography to compute the coordinates of the touch point in the projected human-computer interaction interface and outputs fingertip touch information, which at least includes the coordinate position in the originally projected interface touched by the fingertip.
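The final point-to-plane test can be sketched as follows, using the plane representation (normal n_c, point p_c) named above; the second-threshold value here is invented:

```python
import numpy as np

def touch_distance(p, n_c, p_c):
    """Distance from the triangulated fingertip p to the projection plane
    given by normal vector n_c and a point p_c on the plane."""
    n = np.asarray(n_c, dtype=float)
    diff = np.asarray(p, dtype=float) - np.asarray(p_c, dtype=float)
    return abs(n @ diff) / np.linalg.norm(n)

def is_touch(p, n_c, p_c, threshold=0.005):
    """Touch is declared when the fingertip-plane distance falls below
    the (assumed) second threshold."""
    return touch_distance(p, n_c, p_c) < threshold

# Plane z = 2; fingertip triangulated 1 mm above it (units arbitrary).
print(touch_distance((1.0, 1.0, 2.001), (0, 0, 1), (0, 0, 2)))  # ≈ 0.001
```

In practice the threshold would be tuned to the system's triangulation noise, since the patent notes that the method is insensitive to fingertip thickness but not to calibration error.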
In step S5, the image processing unit's judgment of whether the projection-area image contains a fingertip may comprise:
when the externally input human-computer interaction interface contains points, lines, or surfaces with distinct features, the projection interface processing unit also extracts first feature information from the input interface information and outputs it to the image processing unit, the first feature information comprising feature points, lines, and surfaces together with their coordinate positions in the interface;
the image processing unit extracts second feature information, comprising feature points, lines, and surfaces, from the projection-area image output by the image sensing unit;
the image processing unit compares the second feature information with the first; from whether the second feature information is deformed relative to the feature points, lines, or surfaces of the first, it judges whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and the fingertip's position in the projection area.
Alternatively, in step S5, the image processing unit's judgment of whether the projection-area image contains a fingertip may comprise:
when the externally input human-computer interaction interface contains no points, lines, or surfaces with distinct features, the projection interface processing unit instead extracts third feature information from the input interface information and outputs it to the image processing unit, the third feature information comprising the boundary of the projected interface;
the image processing unit extracts fourth feature information, comprising the boundary of the interface, from the projection-area image output by the image sensing unit;
the image processing unit compares the third and fourth feature information; from whether the fourth feature information is deformed relative to the third, it judges whether the projection area contains a fingertip, and computes the distance of the fingertip at the deformation from the human-computer interaction interface and the fingertip's position in the projection area.
Optionally, before the projection interface processing unit outputs the received human-computer interaction interface information to the projection unit, shape pre-distortion correction is applied to the received information.
Optionally, before the image sensing unit sends the captured image to the image processing unit, optical distortion correction is applied to the captured image.
The present invention has at least the following beneficial effects:
1. The invention needs only an ordinary projection unit and an ordinary image sensing unit to judge whether a fingertip touches the projection plane, achieving low-cost, low-power, high-precision touch detection; moreover, since no depth sensor is required, the system is also smaller.
2. The fingertip touch detection algorithm of the invention is not affected by the feature information in the human-computer interaction interface; even if the interface contains no obvious points, lines, or other regular figures or images as basic feature elements, the algorithm still works well.
3. The fingertip touch detection algorithm of the invention does not depend on image feature information, and the projector need not project explicit or implicit structured light; the demands on the acquisition frame rates of the image sensing unit and projection unit are therefore modest, making the algorithm more universal.
4. The fingertip touch detection algorithm of the invention uses the computed distance between the fingertip and the projection plane as the basis for the touch judgment, so it is little affected by the thickness of the fingertip itself, and its touch detection accuracy is high.
Of course, implementing any method or product of the invention does not necessarily achieve all of the above advantages simultaneously.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a structural schematic of the fingertip touch detection system of an embodiment of the present invention;
Fig. 2 shows the deformation of a feature stripe with and without a touching object, in an embodiment of the present invention;
Fig. 3 shows the positions of the projection unit's optical center and the image sensing unit's optical center relative to the projection plane, in an embodiment of the present invention;
Fig. 4 shows the positions of the fingertip lowest point and the shadow front endpoint, in an embodiment of the present invention;
Fig. 5 is a schematic of the touch judgment method of an embodiment of the present invention;
Fig. 6 is a flowchart of the fingertip touch detection method of an embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the invention without creative effort fall within the scope of protection of the invention.
An embodiment of the present invention proposes a fingertip touch detection system which, referring to Fig. 1, comprises:
Projection interface processing unit 101, for receiving the human-computer interaction interface information of outside input, and exports to projecting cell by the human-computer interaction interface information of reception;
Projecting cell 102, for by human-computer interaction interface information projection to projection plane;
Image sensing cell 103, for the view field's image in acquired projections plane, Bing Jianggai view field image sends to graphics processing unit;
Graphics processing unit 104 is for judging whether contain finger tip in view field's image, and when containing finger tip, calculates distance and the positional information of finger tip in human-computer interaction interface of finger tip and human-computer interaction interface; If the distance of described finger tip and human-computer interaction interface is less than or equal to the first threshold value, output touching information is to touching detecting unit; Otherwise output containing view field's image of finger tip and the positional information of finger tip in human-computer interaction interface to touching detecting unit;
Image processing unit 104 is further configured to obtain the system intrinsic parameters, the system fixed extrinsic parameters, the system variable extrinsic parameters and the homography matrix between the projection unit and the image sensing unit, and to output them to the touch detection unit;
The system intrinsic parameters are the optical center position parameters of the projection unit and the image sensing unit; the system fixed extrinsic parameters are the relative spatial position and attitude parameters between the projection unit and the image sensing unit; the system variable extrinsic parameters are the relative spatial position and attitude parameters between the image sensing unit and projection unit on one hand and the projection plane on the other;
Touch detection unit 105, comprising a judgment execution module, an optical-center imaging position calculation module for the imaging plane, a fingertip lowest point acquisition module, a shadow front endpoint acquisition module and a fingertip touch judgment module, wherein:
The judgment execution module is configured to output touch information directly when the information received from the image processing unit is touch information, and to invoke the optical-center imaging position calculation module when the information received from the image processing unit is a projection-region image containing a fingertip together with the position of the fingertip in the human-computer interaction interface;
The optical-center imaging position calculation module is configured to receive the system fixed extrinsic parameters, the system variable extrinsic parameters, the system intrinsic parameters and the homography matrix between the projection unit and the image sensing unit output by the image processing unit, and to calculate from them, by geometry, the position of intersection point b, where the line through the image sensing unit optical center perpendicular to the projection plane meets the projection plane, and the position of intersection point a, where the line through the projection unit optical center and the image sensing unit optical center meets the projection plane, as shown in Fig. 3; the two intersection points are output to the fingertip lowest point acquisition module and the shadow front endpoint acquisition module;
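As a concrete illustration of the geometry this module computes, the following minimal numpy sketch intersects the two defining lines with the projection plane to obtain points b and a and projects them into the camera image with an ideal pinhole model; the baseline, plane pose and focal length below are made-up values for illustration, not parameters from the patent:

```python
import numpy as np

def ray_plane_intersection(origin, direction, n, p):
    # Intersect the line origin + t*direction with the plane n.(x - p) = 0.
    t = n.dot(p - origin) / n.dot(direction)
    return origin + t * direction

def project_pinhole(X, f):
    # Ideal pinhole projection of a 3D point given in camera coordinates.
    return np.array([f * X[0] / X[2], f * X[1] / X[2]])

# Assumed system parameters (illustrative only): camera optical center at the
# origin of its own frame, projector optical center T_cp, plane (n_c, p_c).
T_cp = np.array([0.2, 0.0, 0.1])   # projector optical center, camera frame
n_c  = np.array([0.0, 0.0, -1.0])  # projection-plane normal
p_c  = np.array([0.0, 0.0, 1.0])   # a point on the projection plane
f_c  = 1.0                         # camera focal length

# Point b: line through the camera optical center perpendicular to the plane.
B = ray_plane_intersection(np.zeros(3), n_c, n_c, p_c)
# Point a: line through both optical centers.
A = ray_plane_intersection(np.zeros(3), T_cp, n_c, p_c)

b_img = project_pinhole(B, f_c)  # image of b on the camera imaging plane
a_img = project_pinhole(A, f_c)  # image of a on the camera imaging plane
```

Both intersections are then handed to the two downstream modules as the anchor points of the lines ob and af described next.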
The fingertip lowest point acquisition module is configured to receive the projection-region image containing the fingertip and the position of the fingertip in the human-computer interaction interface output by the image processing unit, together with the position of intersection point b output by the optical-center imaging position calculation module. Using the region of the human-computer interaction interface where a fingertip touch may occur and the color information of the fingertip, it locates the fingertip, performs edge detection to obtain the fingertip edge, fits a circle to the fingertip edge to obtain the circle center o, and connects center o with intersection point b; the intersection of line ob with the fingertip edge is defined as the fingertip lowest point f, whose coordinates on the image sensing unit imaging plane are denoted (x_f^c, y_f^c). The position of the fingertip lowest point f is output to the shadow front endpoint acquisition module;
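The circle-fitting step of this module can be sketched as follows. This is a hedged illustration: the algebraic Kåsa least-squares fit is one standard way to fit the circle, not necessarily the fit used in the patent, and the edge points and point b are synthetic:

```python
import numpy as np

def fit_circle(points):
    # Algebraic (Kasa) least-squares circle fit: solve
    # x^2 + y^2 + D*x + E*y + F = 0 for D, E, F.
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    center = np.array([-D / 2.0, -E / 2.0])
    radius = np.sqrt(center @ center - F)
    return center, radius

def lowest_point(center, radius, b_point):
    # Intersection of the ray from circle center o toward point b with the
    # fitted circle: o + r * (b - o) / ||b - o||.
    d = b_point - center
    return center + radius * d / np.linalg.norm(d)

# Synthetic fingertip edge: points on a half-circle of radius 2 around (5, 5).
theta = np.linspace(0.0, np.pi, 50)
edge = np.column_stack([5 + 2 * np.cos(theta), 5 + 2 * np.sin(theta)])
o, r = fit_circle(edge)
f_lowest = lowest_point(o, r, b_point=np.array([5.0, 100.0]))
```

With point b far below the fingertip along the image's vertical axis, the returned point is the edge point nearest the projection surface, matching the definition of f.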
The shadow front endpoint acquisition module is configured to receive the projection-region image containing the fingertip and the position of the fingertip in the human-computer interaction interface output by the image processing unit, the position of intersection point a output by the optical-center imaging position calculation module, and the position of the fingertip lowest point f output by the fingertip lowest point acquisition module. As shown in Fig. 4, the fingertip shadow region is obtained by exploiting the characteristics of shadows in the HSV color space; connecting intersection point a with the fingertip lowest point f, the intersection of this line with the shadow edge is the shadow front endpoint s, whose coordinates on the image sensing unit imaging plane are denoted (x_s^c, y_s^c). The positions of the fingertip lowest point f and the shadow front endpoint s are output to the fingertip touch judgment module;
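A minimal sketch of this step follows; the thresholds and the synthetic mask are assumptions (real shadow segmentation would tune them to the scene). Shadow pixels on the projection surface show low value and low saturation in HSV, and the front endpoint s is approximated by walking along the line between f and a until the shadow is met:

```python
import numpy as np

def shadow_mask(hsv):
    # Shadow pixels on the projection surface: low value (dark) and low
    # saturation; the thresholds 0.35 / 0.4 are illustrative assumptions.
    s, v = hsv[..., 1], hsv[..., 2]
    return (v <= 0.35) & (s <= 0.4)

def shadow_front_endpoint(mask, f, a, steps=200):
    # Walk from the fingertip lowest point f toward point a; the first
    # shadow pixel met approximates the shadow front endpoint s.
    f, a = np.asarray(f, float), np.asarray(a, float)
    for t in np.linspace(0.0, 1.0, steps):
        x, y = np.round(f + t * (a - f)).astype(int)
        if mask[y, x]:
            return x, y
    return None

# Synthetic scene: a 20x20 HSV image, dark low-saturation shadow on row 10.
hsv = np.zeros((20, 20, 3))
hsv[..., 2] = 0.9                  # bright projection surface
hsv[10, 0:10, 2] = 0.1             # shadow: dark...
hsv[10, 0:10, 1] = 0.1             # ...and unsaturated
mask = shadow_mask(hsv)
s = shadow_front_endpoint(mask, f=(19, 10), a=(0, 10))
```

In a real image the walk would stop at the shadow edge adjacent to the fingertip, which is exactly the point s the module outputs.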
The fingertip touch judgment module is configured to receive the homography matrix between the projection unit and the image sensing unit output by the image processing unit, together with the fingertip lowest point position and the shadow front endpoint position output by the shadow front endpoint acquisition module. Through the homography matrix between the projection unit and the image sensing unit, the position (x_s^p, y_s^p) of the shadow front endpoint in the projection unit plane is calculated. Using the system fixed intrinsic and extrinsic parameters output by the image processing unit, the rotation matrix R and translation matrix T between the image sensing unit and the projection unit are obtained, and the optical center positions of the projection unit and the image sensing unit are computed in the same coordinate system. Unified under the image sensing unit optical center coordinate system, the image sensing unit optical center is (0, 0, 0) and the projection unit optical center is T_{c→p}, where R_{p→c} is the 3×3 rotation matrix from the projection unit coordinate system to the image sensing unit coordinate system, T_{c→p} is the 3×1 translation vector from the image sensing unit coordinate system to the projection unit coordinate system, f_c and f_p are the focal lengths of the image sensing unit and the projection unit respectively, and (x_f^{c'}, y_f^{c'}) and (x_s^{p'}, y_s^{p'}) are the coordinates of the fingertip lowest point and the shadow front endpoint after correction with the intrinsic parameters of the image sensing unit and the projection unit respectively. From the fingertip lowest point position on the image sensing unit imaging plane, the shadow front endpoint position on the projection unit imaging plane and the two optical center positions, the spatial line through the projection unit optical center and the shadow front endpoint in the projection unit plane is [x_c, y_c, z_c]^T = λ_1 v_1 + T_{c→p}, and the line through the image sensing unit optical center and the fingertip lowest point in the image sensing unit plane is [x_c, y_c, z_c]^T = λ_2 v_2. The spatial coordinate of their intersection is λ_2' v_2, and this intersection is taken as the fingertip position, where v_1 = R_{p→c} (x_s^{p'}, y_s^{p'}, f_p)^T, v_2 = (x_f^{c'}, y_f^{c'}, f_c)^T, λ_1 and λ_2 are scale factors, and λ_1', λ_2' are the fixed values satisfying

[λ_1' ; λ_2'] = [ ‖v_1‖²  −v_1^T v_2 ; −v_2^T v_1  ‖v_2‖² ]^{−1} [ −v_1^T T_{c→p} ; v_2^T T_{c→p} ],

where v_1^T and v_2^T are the transposes of v_1 and v_2. Finally, the projection plane equation is calculated from the system variable extrinsic parameters and represented by a normal vector n_c and a point p_c; the distance d = |n_c · (λ_2' v_2 − p_c)| / ‖n_c‖ between the above intersection and the projection plane is obtained, and when this distance is less than a second threshold a touch operation is judged to have occurred. When a touch operation occurs, the fingertip touch judgment module also calculates, according to the received homography matrix between the image sensing unit and the projection unit, the coordinate position of the touch point in the projected human-computer interaction interface, and outputs fingertip touch information, which at least comprises the coordinate position of the fingertip touch in the originally projected human-computer interaction interface.
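The least-squares ray intersection and plane-distance test performed by this module can be sketched as follows; this is a minimal numpy illustration in which the rays, translation, plane and threshold are synthetic values chosen so the two lines intersect exactly:

```python
import numpy as np

def triangulate(v1, v2, T_cp):
    # Least-squares intersection of the projector ray x = l1*v1 + T_cp and
    # the camera ray x = l2*v2, both in the camera frame, via the normal
    # equations of min ||l1*v1 + T_cp - l2*v2||^2.
    M = np.array([[v1 @ v1, -(v1 @ v2)],
                  [-(v2 @ v1), v2 @ v2]])
    rhs = np.array([-(v1 @ T_cp), v2 @ T_cp])
    l1, l2 = np.linalg.solve(M, rhs)
    return l2 * v2  # fingertip position on the camera ray

def plane_distance(x, n_c, p_c):
    # Point-plane distance for the plane n_c.(x - p_c) = 0.
    return abs(n_c @ (x - p_c)) / np.linalg.norm(n_c)

# Synthetic configuration: both rays pass exactly through X = (0.5, 0, 1).
T_cp = np.array([1.0, 0.0, 0.0])      # projector optical center
v2 = np.array([0.5, 0.0, 1.0])        # camera ray through the fingertip
v1 = np.array([-0.5, 0.0, 1.0])       # projector ray through the shadow tip
X = triangulate(v1, v2, T_cp)

d = plane_distance(X, n_c=np.array([0.0, 0.0, 1.0]),
                   p_c=np.array([0.0, 0.0, 2.0]))
touching = d < 0.02                   # second threshold: an assumed 2 cm
```

Here the reconstructed fingertip sits one unit above the plane, so no touch is reported; when the two rays come from a finger resting on the surface, d falls below the threshold and the touch branch fires.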
When points, lines or surfaces with obvious features exist in the externally input human-computer interaction interface, the projection interface processing unit 101 is further configured to extract first feature information from the externally input human-computer interaction interface information and output it to the image processing unit, the first feature information comprising feature points, lines and surfaces and their coordinate positions within the interface;
The image processing unit 104 is configured to extract second feature information, comprising feature points, lines and surfaces, from the projection-region image output by the image sensing unit 103;
The image processing unit 104 compares the second feature information with the first feature information and, from the deformation of the second feature information relative to the feature points, lines or surfaces of the first feature information, calculates the distance from the fingertip at the deformation to the human-computer interaction interface and the position of the fingertip within the interface.
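To make the deformation-to-distance step concrete, here is a much-simplified sketch. It is an assumption-laden model, not the patent's full calibration: a fronto-parallel plane at distance z0, camera-projector baseline b, and focal length f in pixels, with the finger's height above the plane recovered from the observed stripe shift by similar triangles:

```python
def height_from_stripe_shift(shift_px, f_px, baseline, z0):
    # A point at height h above the plane shifts the observed stripe by
    #   shift = f * b * (1/(z0 - h) - 1/z0)
    # inverting for h gives:
    return z0 * z0 * shift_px / (f_px * baseline + z0 * shift_px)

# Example: f = 1000 px, 10 cm baseline, plane 1 m away, finger 2 cm above it.
shift = 1000 * 0.1 * (1 / (1.0 - 0.02) - 1 / 1.0)  # forward model
h = height_from_stripe_shift(shift, 1000, 0.1, 1.0)
```

Comparing h against the first threshold then decides whether the region is reported as a direct touch or merely as a candidate touch region.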
When no points, lines or surfaces with obvious features exist in the externally input human-computer interaction interface, the projection interface processing unit 101 is further configured to extract third feature information, comprising the boundary of the projected human-computer interaction interface, from the externally input human-computer interaction interface information and output it to the image processing unit 104;
The image processing unit 104 is configured to extract fourth feature information, comprising the boundary of the human-computer interaction interface, from the projection-region image output by the image sensing unit;
The image processing unit 104 compares the third feature information with the fourth feature information, judges from whether the fourth feature information is deformed relative to the third whether the projection region contains a fingertip, and calculates the distance from the fingertip at the deformation to the human-computer interaction interface and the position of the fingertip within the interface.
A projection interface module, belonging to interface unit 106, is provided between the projection interface processing unit 101 and the projection unit 102 to apply shape pre-distortion correction to the received human-computer interaction interface information. The projection interface module receives the projection data output by the projection interface processing unit 101 and, based on the optical distortion parameters of the projection unit among the system fixed intrinsic parameters, applies optical pre-distortion correction to the interface image to be projected, so as to eliminate the optical distortion introduced by the optical components of the projection unit; the corrected interface image is output to the projection unit.
An image sensing interface module, also belonging to interface unit 106, is provided between the image sensing unit 103 and the image processing unit 104 to apply optical distortion correction to the captured image.
The image sensing interface module receives the optical distortion parameters of the image sensing unit among the system fixed intrinsic parameters output by the image processing unit 104; it also receives the image data of the image sensing unit 103 and, based on those parameters, corrects the optical distortion introduced by the optical components of the image sensing unit 103, then outputs the corrected image to the image processing unit 104.
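The optical distortion correction mentioned here can be sketched with a standard radial-distortion model; the actual model and coefficients used by the patent's units are not specified, so the k1 and k2 below are illustrative assumptions:

```python
def undistort_point(xd, yd, k1, k2, iters=20):
    # Invert the radial model (xd, yd) = (xu, yu) * (1 + k1*r^2 + k2*r^4)
    # on normalized image coordinates by fixed-point iteration.
    xu, yu = xd, yd
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        xu, yu = xd / scale, yd / scale
    return xu, yu

# Distort a known point with the forward model, then recover it.
k1, k2 = 0.1, 0.01
xu0, yu0 = 0.3, 0.2
r2 = xu0 * xu0 + yu0 * yu0
s = 1.0 + k1 * r2 + k2 * r2 * r2
xd, yd = xu0 * s, yu0 * s
xu, yu = undistort_point(xd, yd, k1, k2)
```

For projector pre-distortion the same model is applied in the opposite direction: the interface image is warped with the inverse mapping so that the projection optics cancel the warp.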
Control unit 107 is configured to control and coordinate all other units of the system; the control unit can also switch the system among the fixed intrinsic/extrinsic parameter acquisition state, the variable extrinsic parameter acquisition state and the fingertip touch detection operating state.
The image processing unit receives the obvious feature information (points, lines or surfaces) output by the projection interface processing unit and, using the system intrinsic parameters, fixed extrinsic parameters and variable extrinsic parameters, computes the homography matrix between the projection unit plane and the image sensing unit plane (also referred to as the homography matrix between the projection unit and the image sensing unit). From this it computes the two-dimensional coordinates of these features when imaged on the image sensing unit plane, i.e. the coordinates at which the features of the projected human-computer interaction interface appear in the captured image when no hand or other touch object is present in the projection region; this is called the first feature information. The unit also receives the image data output by the image sensing interface module and extracts the obvious feature points, lines or surfaces in that image, called the second feature information. Using the obvious features present in the projected interface, it computes how much the second feature information is deformed, relative to the first feature information, by the presence of an operating object (a hand or other touch object) in the projection region, and from this deformation computes the position of the fingertip relative to the projection plane (Fig. 2 shows the projected human-computer interaction interface with and without a hand or touch object).
When no obvious feature points, lines or surfaces exist in the human-computer interaction interface (as shown in Fig. 2a), the boundary of the projected interface serves as the available feature information. Using the structured-light principle, the distance from the finger at the deformation to the projection plane is measured from the deformation of the boundary stripes. If this measured distance is less than or equal to the first threshold, a fingertip touch in this region can be judged directly, the touch position coordinates in the human-computer interaction interface are calculated, and touch information is output to the touch detection unit; if the measured distance is greater than the first threshold, the whole projection interface is treated as a region where a touch may occur and this information is output to the touch detection unit. When points, lines or surfaces with obvious features do exist in the human-computer interaction interface (as shown in Fig. 2b), the same principle applies: the distance from the finger at the stripe deformation to the projection plane is calculated from the stripe offset; when the measured distance is less than or equal to the first threshold, touch information is output to the touch detection unit, and when it is greater than the first threshold, the region of the projected interface where a finger touch may occur (i.e. the fingertip position information, which becomes more accurate as the stripe information increases) is output, according to the feature stripe information, to the touch detection unit.
Under the coordination of the control unit, the image processing unit also calculates the system fixed intrinsic parameters (the intrinsic parameters of the image sensing unit and the projection unit, mainly the optical distortion parameters introduced by their respective optical components), stores them in nonvolatile memory and outputs them to the interface unit. The image processing unit further extracts the corresponding feature information from the image data received from the image sensing unit, matches it with the feature information output by the projection interface processing unit, and computes the position and attitude relation between the image sensing unit and the projection unit (called the system fixed extrinsic parameters) as well as the position and attitude relation between the human-computer interaction system and the projection plane (called the system variable extrinsic parameters). The image processing unit also outputs the position and attitude parameters between the projection plane and the projection unit, among the obtained system variable extrinsic parameters, to the projection interface processing unit.
The system receives human-computer interaction interface information through the projection interface processing unit; after processing, this interface information is delivered to the projection unit and projected onto an ordinary everyday surface such as a desktop, wall, paper, palm or arm, accurately displaying the human-computer interface. The user can then operate with bare hands on this projected interface as on a touch screen. The image sensing unit captures images of the projection region to recognize the interaction and output the corresponding interaction information; the key tasks are to recognize whether a hand touches the projected interface and to locate the touch point within the projected interface. The control unit coordinates the work of all other units in the system.
The present embodiment relies on the fact that the distance from the image sensing unit and the projection unit to the finger is far greater than the finger thickness, so the fingertip point that blocks the projection ray and produces the shadow front end and the fingertip lowest point photographed by the image sensing unit can be approximated as the same point; in Fig. 5, points A and B can be approximated as the same point. Based on this principle, the present invention takes the intersection of lines PA and CB as the fingertip position. If the fingertip is approximated as a quarter sphere, the fingertip lowest point lies in the plane that passes through the image sensing unit optical center and the fingertip sphere center and is perpendicular to the projection plane; in the image captured by the image sensing unit, this plane projects onto the line through point b (point b being the intersection of the projection plane with the line through the image sensing unit optical center perpendicular to the projection plane). As shown in Fig. 4, the fingertip lowest point acquisition module takes the intersection of line ob, through the fitted circle center o and point b, with the fingertip edge as the fingertip lowest point. To compute the intersection of PA and CB, PA and CB must be coplanar; hence PM and CB are coplanar and point M lies in plane PCB, which in the captured image corresponds to the line through intersection point a (of line PC with the projection plane) and the fingertip lowest point f, i.e. to the intersection s of line af with the shadow edge. From the coordinates of a, f and s, the intersection of PM and CB is calculated using the binocular (stereo) principle. This algorithm therefore computes the fingertip position well and, being little affected by fingertip thickness, achieves high accuracy.
More generally, even if f is not the fingertip lowest point, the intersection s of line af with the shadow edge still lies in the plane formed by that fingertip point and line PC, and the intersection of PM and CB can still characterize the fingertip position.
During touch detection, the control unit places the whole system in the touch acquisition state. The fingertip acquisition module in the image processing unit receives the projection-region image captured by the image sensing unit, obtains the fingertip position in the image from the feature information, and outputs the result to the touch detection unit. The touch detection unit receives the fingertip position from the image processing unit, calculates the fingertip lowest point and its corresponding shadow front endpoint, and also receives the system intrinsic and extrinsic parameters from the image processing unit. Using the binocular ranging principle, it calculates the position of the fingertip lowest point relative to the image sensing unit and finally, using the position and attitude parameters of the image sensing unit relative to the projection plane among the system variable extrinsic parameters, calculates the distance from the fingertip to the projection plane; if the distance is within a threshold, a touch operation is judged. When a touch operation occurs, the position of the touch point in the projected interface must also be calculated; the present embodiment preferably takes the fingertip lowest point as the touch point. When a touch event occurs, the homography matrix between the image sensing unit and the projection unit obtained from the image processing unit is used to calculate the coordinates of the fingertip lowest point in the original projected interface, and the touch detection unit outputs the touch operation together with the coordinates of the corresponding touch point in the human-computer interaction interface.
When the projection plane is not fixed, it may rotate or translate relative to the human-computer interaction system. In that case the image processing unit can obtain the system variable extrinsic parameters in real time, yielding the relative position and attitude of the projection plane; when a change in relative position or attitude is detected, the changed system variable extrinsic parameters are re-acquired and output to the projection interface processing module, which updates these parameters and performs shape pre-distortion based on the new parameters, thereby tracking the projection plane in real time.
It should be noted that in all embodiments of the present invention, besides a hand touching the projection plane, other touch objects are equally applicable; the principle is identical and is not repeated here.
The embodiment of the present invention completes the judgment of whether a fingertip touches the projection plane using only an ordinary projection device and an ordinary image sensing device, achieving low-cost, low-power, high-accuracy touch detection; moreover, since the present invention requires no depth sensor, the system volume is also small.
The fingertip touch detection algorithm of the embodiment of the present invention is not affected by the feature information in the human-computer interaction interface; even if the interface contains no obvious points, lines or other regular figures or images as basic feature elements, the algorithm still performs well.
The fingertip touch detection algorithm of the embodiment of the present invention does not rely on image feature information, and the projector need not project visible or invisible structured light; the requirements on the capture frame rates of the image sensing unit and the projection unit are therefore low, giving the algorithm greater universality.
The fingertip touch detection algorithm of the embodiment of the present invention uses the calculated distance between the fingertip and the projection plane as the basis of the touch judgment, so it is little affected by the thickness of the fingertip itself and achieves high touch detection accuracy.
Another embodiment of the present invention further proposes a fingertip touch detection method; referring to Fig. 6, the method comprises:
Step 601: the projection interface processing unit receives externally input human-computer interaction interface information and outputs it to the projection unit.
In this step, before outputting the received human-computer interaction interface information to the projection unit, the projection interface processing unit applies shape pre-distortion correction to it.
Step 602: the projection unit projects the human-computer interaction interface information onto the projection plane.
Step 603: the image sensing unit captures the projection-region image on the projection plane and sends it to the image processing unit.
In this step, before sending the captured image to the image processing unit, the image sensing unit applies optical distortion correction to it.
Step 604: the image processing unit obtains the system intrinsic parameters, the system fixed extrinsic parameters and the system variable extrinsic parameters, and uses the system fixed and variable extrinsic parameters to calculate the homography matrix between the projection unit and the image sensing unit.
The system intrinsic parameters are the optical center position parameters of the projection unit and the image sensing unit; the system fixed extrinsic parameters are the relative spatial position and attitude parameters between the projection unit and the image sensing unit; the system variable extrinsic parameters are the relative spatial position and attitude parameters between the image sensing unit and projection unit on one hand and the projection plane on the other.
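A hedged sketch of how the homography between the two device planes follows from the extrinsic parameters: this uses the standard plane-induced homography with conventions chosen here (plane n·X = d in the camera frame, projector coordinates X_p = R X + t), not necessarily the exact formulation of the patent, and all numbers are illustrative:

```python
import numpy as np

def plane_homography(K_c, K_p, R, t, n, d):
    # For points X on the plane n.X = d (camera frame), X_p = R X + t equals
    # (R + t n^T / d) X, so camera pixels map to projector pixels by:
    return K_p @ (R + np.outer(t, n) / d) @ np.linalg.inv(K_c)

def apply_homography(H, pt):
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[:2] / v[2]

# Illustrative calibration: identical intrinsics, pure 20 cm x-translation,
# projection plane z = 1 in the camera frame.
K = np.array([[100.0,   0.0, 50.0],
              [  0.0, 100.0, 50.0],
              [  0.0,   0.0,  1.0]])
H = plane_homography(K, K, np.eye(3), t=np.array([0.2, 0.0, 0.0]),
                     n=np.array([0.0, 0.0, 1.0]), d=1.0)

# A plane point at X = (0.1, 0, 1): camera pixel (60, 50); in the projector
# it appears at X + t = (0.3, 0, 1), i.e. pixel (80, 50).
proj_pt = apply_homography(H, (60.0, 50.0))
```

Because the homography depends on the plane pose (n, d), it must be recomputed whenever the system variable extrinsic parameters change, which is why Step 604 derives it from those parameters.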
Step 605: the image processing unit judges whether the projection-region image contains a fingertip and, when it does, calculates the distance from the fingertip to the human-computer interaction interface and the position of the fingertip within the interface; if the distance from the fingertip to the human-computer interaction interface is less than or equal to the first threshold, touch information is output to the touch detection unit; otherwise, the projection-region image containing the fingertip and the position of the fingertip in the human-computer interaction interface are output to the touch detection unit.
In this step, when points, lines or surfaces with obvious features exist in the externally input human-computer interaction interface, the projection interface processing unit extracts first feature information, comprising feature points, lines and surfaces and their coordinate positions within the interface, from the externally input human-computer interaction interface information and outputs it to the image processing unit.
The image processing unit extracts second feature information, comprising feature points, lines and surfaces, from the projection-region image output by the image sensing unit.
The image processing unit compares the second feature information with the first feature information and, from the deformation of the second feature information relative to the feature points, lines or surfaces of the first feature information, calculates the distance from the fingertip at the deformation to the human-computer interaction interface and the position of the fingertip within the interface.
When no points, lines or surfaces with obvious features exist in the externally input human-computer interaction interface, the projection interface processing unit extracts third feature information, comprising the boundary of the projected human-computer interaction interface, from the externally input human-computer interaction interface information and outputs it to the image processing unit.
The image processing unit extracts fourth feature information, comprising the boundary of the human-computer interaction interface, from the projection-region image output by the image sensing unit.
The image processing unit compares the third feature information with the fourth feature information, judges from whether the fourth feature information is deformed relative to the third whether the projection region contains a fingertip, and calculates the distance from the fingertip at the deformation to the human-computer interaction interface and the position of the fingertip within the interface.
Step 606: When the output the touch detection unit receives from the image processing unit is touch information, the touch detection unit outputs the touch information directly. If instead it receives a projection-area image containing a fingertip together with the fingertip's position in the human-computer interaction interface, it obtains the fingertip position, judges whether the fingertip touches the interface, and outputs touch information if it does. Specifically:
Receive the system's fixed extrinsic parameters, variable extrinsic parameters and fixed intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit, output by the image processing unit; from these parameters compute by geometry (i) the position of intersection point b, where the line through the image sensing unit's optical centre perpendicular to the projection plane meets the projection plane, expressed on the image sensing unit's imaging plane, and (ii) the position of intersection point a, where the line through the projection unit's optical centre and the image sensing unit's optical centre meets the projection plane.
Using the region of the human-computer interaction interface where a touch may occur and the colour of the fingertip, locate the fingertip and apply edge detection to obtain the fingertip edge; fit a circle to the fingertip edge to obtain its centre o; connect centre o with intersection point b, and define the intersection of line ob with the fingertip edge as the fingertip lowest point f, with coordinates (x_f^c, y_f^c) on the image sensing unit's imaging plane. Using the characteristics of shadow in HSV colour space, obtain the fingertip shadow region; connect intersection point a with fingertip lowest point f, and take the intersection of that line with the shadow edge as the shadow front point s, with coordinates (x_s^c, y_s^c) on the image sensing unit's imaging plane. Through the homography matrix between the projection unit and the image sensing unit, compute the shadow front point's position (x_s^p, y_s^p) on the projection unit's plane.
From the system's fixed intrinsic and extrinsic parameters, obtain the rotation matrix R and translation matrix T between the image sensing unit and the projection unit, and express the two optical centres in one coordinate system. Unified in the image sensing unit's optical-centre frame, the image sensing unit's optical centre is (0, 0, 0), the projection unit's optical centre is T_{c→p}, the fingertip lowest point is (x'_f^c, y'_f^c, f_c), and the shadow front point is R_{p→c}·(x'_s^p, y'_s^p, f_p)^T + T_{c→p}, where R_{p→c} is the 3x3 rotation matrix from the projection unit's coordinate system to the image sensing unit's coordinate system, T_{c→p} is the 1x3 translation from the image sensing unit's coordinate system to the projection unit's coordinate system, f_c and f_p are the focal lengths of the image sensing unit and the projection unit, and (x'_f^c, y'_f^c) and (x'_s^p, y'_s^p) are the corresponding coordinates after intrinsic-parameter correction.
In space, the line through the projection unit's optical centre and the shadow front point on the projection unit's plane is [x_c, y_c, z_c] = λ1·v1 + T_{c→p}, and the line through the image sensing unit's optical centre and the fingertip lowest point on the image sensing unit's plane is [x_c, y_c, z_c] = λ2·v2. The spatial coordinate of their intersection is λ'2·v2, which is taken as the fingertip position, where v1 = R_{p→c}·(x'_s^p, y'_s^p, f_p)^T, v2 = (x'_f^c, y'_f^c, f_c)^T, λ1 and λ2 are scale factors, and λ'1, λ'2 are the fixed values satisfying
[λ'1; λ'2] = [ ‖v1‖², −v1^T·v2 ; −v2^T·v1, ‖v2‖² ]^(−1) · [ −v1^T·T_{c→p} ; v2^T·T_{c→p} ].
Finally, compute the projection plane equation from the system's variable extrinsic parameters, represented by a normal vector n_c and a point p_c, and obtain the distance d = |n_c^T·(λ'2·v2 − p_c)| / ‖n_c‖ between the intersection point above and the projection plane; when this distance is less than a second threshold, a touch operation is judged to have occurred. When a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to compute the coordinates of the touch point in the projected human-computer interaction interface, and outputs fingertip touch information, which at least includes the coordinates of the touch point in the originally projected human-computer interaction interface.
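The intersection of the two spatial lines above, followed by the point-to-plane distance test, can be sketched numerically as follows. This is a minimal illustration, not the patent's code: it assumes v1 and v2 are direction vectors already expressed in the image sensing unit's (camera's) coordinate frame, and all function names are ours:

```python
import numpy as np

def fingertip_position(v1, v2, T_cp):
    """Closest point between line 1: x = l1*v1 + T_cp (through the projector
    optical centre T_cp) and line 2: x = l2*v2 (through the camera optical
    centre at the origin).  Solves the 2x2 normal equations
        [[|v1|^2, -v1.v2], [-v2.v1, |v2|^2]] @ [l1, l2] = [-v1.T, v2.T]
    (obtained by minimising |l1*v1 + T_cp - l2*v2|^2) and returns l2*v2,
    taken as the fingertip position in the camera frame."""
    v1, v2, T_cp = (np.asarray(a, float) for a in (v1, v2, T_cp))
    A = np.array([[v1 @ v1, -(v1 @ v2)],
                  [-(v2 @ v1), v2 @ v2]])
    b = np.array([-(v1 @ T_cp), v2 @ T_cp])
    l1, l2 = np.linalg.solve(A, b)
    return l2 * v2

def plane_distance(point, n_c, p_c):
    """Unsigned distance from a point to the projection plane
    n_c . (x - p_c) = 0; a touch is judged when this is below a threshold."""
    n_c = np.asarray(n_c, float)
    return abs(n_c @ (np.asarray(point, float) - np.asarray(p_c, float))) / np.linalg.norm(n_c)
```

When the two lines genuinely intersect, the least-squares solution recovers the intersection exactly; when they are skew (as measurements always make them), it returns the point on line 2 closest to line 1, which is the robust form of the "binocular" intersection.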
In the present embodiment, because the distances from the image sensing unit and the projection unit to the finger are far greater than the finger's thickness, the fingertip point that blocks the projection ray (producing the shadow front) and the fingertip lowest point photographed by the image sensing unit can be approximated as the same point; in Fig. 5, points A and B can likewise be treated as one point. Based on this principle, the present invention takes the intersection of lines PA and CB as the fingertip position. If the fingertip is approximated as a quarter-sphere, the fingertip lowest point lies in the plane through the image sensing unit's optical centre and the fingertip's sphere centre perpendicular to the projection plane; in the image gathered by the image sensing unit, this plane projects onto the line through point b (point b being the intersection of the projection plane with the line through the image sensing unit's optical centre perpendicular to the projection plane). As shown in Fig. 4, the fingertip lowest point acquisition module therefore takes as the fingertip lowest point the intersection of the fingertip edge with the line ob through the fitted-circle centre o and point b. To compute the intersection of PA and CB, PA and CB must be coplanar, hence PM and CB are coplanar with M lying in the plane PCB; on the image gathered by the image sensing unit, this plane corresponds to the line through intersection point a (of line PC with the projection plane) and the fingertip lowest point f, so the shadow front point is the intersection s of line af with the shadow edge. From the coordinates of a, f and s, the binocular principle yields the intersection of PM and CB. The algorithm thus computes the fingertip position well, and because it is little affected by fingertip thickness its precision is high.
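The fingertip-lowest-point construction above (fit a circle to the fingertip edge, then intersect line ob with the edge) might be realised as below. The algebraic least-squares circle fit and the ray-circle intersection are our assumptions about one reasonable implementation, not the patent's code:

```python
import numpy as np

def fit_circle(edge_pts):
    """Algebraic least-squares circle fit: solve x^2 + y^2 + D*x + E*y + F = 0
    for (D, E, F) over the fingertip edge points, then recover centre/radius."""
    pts = np.asarray(edge_pts, float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    centre = np.array([-D / 2.0, -E / 2.0])
    radius = np.sqrt(centre @ centre - F)
    return centre, radius

def lowest_point(centre, radius, b_point):
    """Fingertip lowest point f: intersection of ray o -> b with the fitted
    circle, i.e. the point of the fitted edge nearest to intersection point b."""
    d = np.asarray(b_point, float) - centre
    return centre + radius * d / np.linalg.norm(d)
```

In practice f would be snapped to the nearest actual edge pixel along the ray; the fitted circle is only the model used to choose the direction.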
More generally, even if f is not the fingertip lowest point, the intersection s of line af with the shadow edge still lies in the plane formed by that fingertip point and line PC, and the intersection of PM and CB still characterises the fingertip position.
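The shadow edge used in this construction comes from the fingertip shadow region, which the patent specifies only as obtained from "the feature of shade in HSV colour space". A minimal sketch under the common assumption that shadow pixels have low value (V) and modest saturation — the thresholds are illustrative, not the patent's:

```python
import numpy as np
import colorsys

def shadow_mask(rgb_image, v_max=0.35, s_max=0.6):
    """Return a boolean mask of candidate shadow pixels in an RGB image
    (H x W x 3 array of floats in [0, 1]).  Shadow candidates are pixels
    that are dark (low V) without being strongly coloured (low-ish S)."""
    h, w, _ = rgb_image.shape
    mask = np.zeros((h, w), dtype=bool)
    for i in range(h):
        for j in range(w):
            r, g, b = rgb_image[i, j]
            _, s, v = colorsys.rgb_to_hsv(r, g, b)  # returns (h, s, v)
            mask[i, j] = (v <= v_max) and (s <= s_max)
    return mask
```

The shadow edge is then the boundary of this mask, and the shadow front point s is its intersection with line af.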
In the present embodiment, during touch-operation acquisition the control unit places the whole system in the touch-operation acquisition state. The fingertip acquisition module in the image processing unit receives the projection-area image gathered by the image sensing unit, obtains the position of the fingertip in the image according to the characteristic information, and exports the result to the touch detection unit. The touch detection unit receives the fingertip position from the image processing unit and computes the fingertip lowest point and its corresponding shadow front point; it also receives the system's fixed intrinsic and extrinsic parameters from the image processing unit and, using the binocular ranging principle, computes the position of the fingertip lowest point relative to the image sensing unit. Finally, using the position and attitude of the image sensing unit relative to the projection plane contained in the variable extrinsic parameters, it computes the distance of the fingertip from the projection plane; if this distance falls within a threshold, a touch operation is judged. When a touch operation occurs, the position of the touch point in the projected interface must also be computed; this embodiment preferably takes the fingertip lowest point as the touch position. On a touch event, the homography matrix between the image sensing unit and the projection unit, obtained from the image processing unit, is used to compute the coordinates of the fingertip lowest point in the original projected interface, and the touch detection unit outputs the touch operation together with the touch point's coordinates in the human-computer interaction interface.
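Mapping the touch point back into the original projected interface uses the homography between the image sensing unit and the projection unit. Applying a 3x3 homography to a pixel is a short homogeneous-coordinates computation; the matrix below is purely illustrative:

```python
import numpy as np

def apply_homography(H, pt):
    """Map a 2-D point through a 3x3 homography: lift to homogeneous
    coordinates, multiply, then divide by the third component."""
    x, y, w = np.asarray(H, float) @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])
```

With the fingertip lowest point f in camera pixels and H the camera-to-projector homography, `apply_homography(H, f)` gives the touch coordinates in the projected human-computer interaction interface.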
When the projection plane is not a fixed plane, it may rotate or translate relative to the human-computer interaction system. In that case the image processing unit can obtain the system's variable extrinsic parameters in real time to track the relative position and attitude of the projection plane; when a change in position or attitude is detected, the changed variable extrinsic parameters are obtained anew and exported to the projection interface processing module, which updates these parameters and performs shape pre-distortion based on them, thereby tracking the projection plane in real time.
The embodiment of the present invention needs only an ordinary projection device and an ordinary image sensing device to judge whether a fingertip touches the projection plane, achieving low-cost, low-power, high-precision touch detection; moreover, because the present invention needs no depth sensor, the system volume is also small.
The fingertip touch detection algorithm of the embodiment of the present invention is not affected by the characteristic information in the human-computer interaction interface: even if the interface contains no obvious points, lines, or other regular figures or images as basic feature units, the algorithm still works well.
The fingertip touch detection algorithm of the embodiment of the present invention does not rely on image feature information, and the projector need not project overt or covert structured light; the demands on the acquisition frame rates of the image sensing unit and the projection unit are therefore modest, making the algorithm more universal.
The fingertip touch detection algorithm of the embodiment of the present invention uses the computed distance between the fingertip and the projection plane as the basis for the touch judgment, so it is little affected by the thickness of the fingertip itself and its touch detection accuracy is high.
It should be noted that in all embodiments of the present invention, besides the case of a hand touching the projection plane, other touch objects are handled in exactly the same way; the principle is identical and is not repeated here.
The above embodiments only illustrate the technical scheme of the present invention and are not intended to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical schemes recorded in the foregoing embodiments may still be modified, or some of their technical features replaced by equivalents, and that such modifications or replacements do not make the essence of the corresponding technical scheme depart from the spirit and scope of the technical schemes of the embodiments of the present invention.

Claims (10)

1. A fingertip touch detection system, characterized in that the system comprises:
A projection interface processing unit, for receiving human-computer interaction interface information input from outside and exporting the received human-computer interaction interface information to the projection unit;
A projection unit, for projecting the human-computer interaction interface information onto a projection plane;
An image sensing unit, for gathering the projection-area image on the projection plane and sending the gathered projection-area image to the image processing unit;
An image processing unit, for judging whether the projection-area image contains a fingertip and, when it does, calculating the distance between the fingertip and the human-computer interaction interface and the fingertip's position in the interface; if the distance between the fingertip and the human-computer interaction interface is less than or equal to a first threshold, outputting touch information to the touch detection unit; otherwise outputting the projection-area image containing the fingertip and the fingertip's position in the human-computer interaction interface to the touch detection unit;
The image processing unit is also for obtaining the system's intrinsic parameters, the system's fixed extrinsic parameters, the system's variable extrinsic parameters, and the homography matrix between the projection unit and the image sensing unit, and exporting them to the touch detection unit;
Wherein the system's intrinsic parameters are the optical-centre position parameters of the projection unit and the image sensing unit; the system's fixed extrinsic parameters are the relative spatial position and attitude parameters between the projection unit and the image sensing unit; and the system's variable extrinsic parameters are the relative spatial position and attitude parameters between the image sensing unit and projection unit on the one hand and the projection plane on the other;
A touch detection unit, comprising a judgment execution module, an optical-centre imaging position calculation module on the imaging plane, a fingertip lowest point acquisition module, a shadow front point acquisition module and a fingertip touch judgment module, wherein,
The judgment execution module is for directly outputting touch information when the output received from the image processing unit is touch information, and for invoking the optical-centre imaging position calculation module when the output received from the image processing unit is a projection-area image containing a fingertip together with the fingertip's position in the human-computer interaction interface;
The optical-centre imaging position calculation module on the imaging plane is for receiving the system's fixed extrinsic parameters, variable extrinsic parameters and fixed intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit, output by the image processing unit; from these parameters it computes by geometry (i) the position of intersection point b, where the line through the image sensing unit's optical centre perpendicular to the projection plane meets the projection plane, expressed on the image sensing unit's imaging plane, and (ii) the position of intersection point a, where the line through the projection unit's optical centre and the image sensing unit's optical centre meets the projection plane, and exports both intersection points to the fingertip lowest point acquisition module and the shadow front point acquisition module;
The fingertip lowest point acquisition module is for receiving the projection-area image containing the fingertip and the fingertip's position in the human-computer interaction interface output by the image processing unit, and the position of intersection point b output by the optical-centre imaging position calculation module; using the region of the interface where a touch may occur and the colour of the fingertip, it locates the fingertip and applies edge detection to obtain the fingertip edge, fits a circle to the fingertip edge to obtain its centre o, connects centre o with intersection point b, defines the intersection of line ob with the fingertip edge as the fingertip lowest point f, with coordinates (x_f^c, y_f^c) on the image sensing unit's imaging plane, and exports the position of the fingertip lowest point f to the shadow front point acquisition module;
The shadow front point acquisition module is for receiving the projection-area image containing the fingertip and the fingertip's position in the human-computer interaction interface output by the image processing unit, the position of intersection point a output by the optical-centre imaging position calculation module, and the position of the fingertip lowest point f output by the fingertip lowest point acquisition module; using the characteristics of shadow in HSV colour space, it obtains the fingertip shadow region, connects intersection point a with fingertip lowest point f, takes the intersection of that line with the shadow edge as the shadow front point s, with coordinates (x_s^c, y_s^c) on the image sensing unit's imaging plane, and exports the positions of the fingertip lowest point f and the shadow front point s to the fingertip touch judgment module;
The fingertip touch judgment module is for receiving the homography matrix between the projection unit and the image sensing unit output by the image processing unit, and the fingertip lowest point and shadow front point positions output by the shadow front point acquisition module. Through the homography matrix between the projection unit and the image sensing unit it computes the shadow front point's position (x_s^p, y_s^p) on the projection unit's plane. From the system's fixed intrinsic and extrinsic parameters output by the image processing unit it obtains the rotation matrix R and translation matrix T between the image sensing unit and the projection unit, and expresses the two optical centres in one coordinate system: unified in the image sensing unit's optical-centre frame, the image sensing unit's optical centre is (0, 0, 0), the projection unit's optical centre is T_{c→p}, the fingertip lowest point is (x'_f^c, y'_f^c, f_c), and the shadow front point is R_{p→c}·(x'_s^p, y'_s^p, f_p)^T + T_{c→p}, where R_{p→c} is the 3x3 rotation matrix from the projection unit's coordinate system to the image sensing unit's coordinate system, T_{c→p} is the 1x3 translation from the image sensing unit's coordinate system to the projection unit's coordinate system, f_c and f_p are the focal lengths of the image sensing unit and the projection unit, and (x'_f^c, y'_f^c) and (x'_s^p, y'_s^p) are the corresponding coordinates after intrinsic-parameter correction. From the fingertip lowest point on the image sensing unit's imaging plane, the shadow front point on the projection unit's imaging plane, and the two optical-centre positions, the line in space through the projection unit's optical centre and the shadow front point on the projection unit's plane is [x_c, y_c, z_c] = λ1·v1 + T_{c→p}, and the line through the image sensing unit's optical centre and the fingertip lowest point on the image sensing unit's plane is [x_c, y_c, z_c] = λ2·v2; the spatial coordinate of their intersection is λ'2·v2, which is taken as the fingertip position, where v1 = R_{p→c}·(x'_s^p, y'_s^p, f_p)^T, v2 = (x'_f^c, y'_f^c, f_c)^T, λ1 and λ2 are scale factors, and λ'1, λ'2 are the fixed values satisfying
[λ'1; λ'2] = [ ‖v1‖², −v1^T·v2 ; −v2^T·v1, ‖v2‖² ]^(−1) · [ −v1^T·T_{c→p} ; v2^T·T_{c→p} ].
Finally, the projection plane equation is computed from the system's variable extrinsic parameters, represented by a normal vector n_c and a point p_c, and the distance d = |n_c^T·(λ'2·v2 − p_c)| / ‖n_c‖ between the intersection point above and the projection plane is obtained; when this distance is less than a second threshold, a touch operation is judged. When a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to compute the coordinates of the touch point in the projected human-computer interaction interface, and outputs fingertip touch information, which at least includes the coordinates of the touch point in the originally projected human-computer interaction interface;
A control unit, for controlling all other units of the system and coordinating their work; the control unit can also switch the system among the fixed intrinsic and extrinsic parameter acquisition state, the variable parameter acquisition state, and the fingertip acquisition and touch operating state.
2. The system according to claim 1, characterized in that, when points, lines or surfaces with obvious features exist in the externally input human-computer interaction interface, the projection interface processing unit is also for extracting first characteristic information from the externally input human-computer interaction interface information and exporting the first characteristic information to the image processing unit, the first characteristic information comprising feature points, lines and surfaces together with their coordinates in the interface;
The image processing unit is for extracting second characteristic information, comprising feature points, lines and surfaces, from the projection-area image output by the image sensing unit;
The image processing unit compares the second characteristic information with the first characteristic information and, according to whether the second characteristic information is deformed relative to the feature points, lines or surfaces of the first characteristic information, judges whether the projection area contains a fingertip, and calculates the distance of the fingertip at the deformation from the human-computer interaction interface and its position in the projection area.
3. The system according to claim 1, characterized in that, when no points, lines or surfaces with obvious features exist in the externally input human-computer interaction interface, the projection interface processing unit is also for extracting third characteristic information, comprising the border of the projected human-computer interaction interface, from the externally input human-computer interaction interface information, and exporting the third characteristic information to the image processing unit;
The image processing unit is for extracting fourth characteristic information, comprising the border of the human-computer interaction interface, from the projection-area image output by the image sensing unit;
The image processing unit compares the third characteristic information with the fourth characteristic information and, according to whether deformation has occurred between them, judges whether the projection area contains a fingertip, and calculates the distance of the fingertip at the deformation from the human-computer interaction interface and the fingertip's position in the human-computer interaction interface.
4. The system according to claim 1, characterized in that a projection interface module is arranged between the projection interface processing unit and the projection unit, for applying shape pre-distortion correction to the received human-computer interaction interface information.
5. The system according to claim 1, characterized in that an image sensing interface module is arranged between the image sensing unit and the image processing unit, for applying optical distortion correction to the gathered image.
6. A fingertip touch detection method based on the fingertip touch detection system of claim 1, characterized in that the method comprises:
Step S1: the projection interface processing unit receives human-computer interaction interface information input from outside and exports the received human-computer interaction interface information to the projection unit;
Step S2: the projection unit projects the human-computer interaction interface information onto a projection plane;
Step S3: the image sensing unit gathers the projection-area image on the projection plane and sends the projection-area image to the image processing unit;
Step S4: the image processing unit obtains the system's intrinsic parameters, fixed extrinsic parameters and variable extrinsic parameters, and uses the fixed and variable extrinsic parameters to compute the homography matrix between the projection unit and the image sensing unit;
Wherein the system's intrinsic parameters are the optical-centre position parameters of the projection unit and the image sensing unit; the system's fixed extrinsic parameters are the relative spatial position and attitude parameters between the projection unit and the image sensing unit; and the system's variable extrinsic parameters are the relative spatial position and attitude parameters between the image sensing unit and projection unit on the one hand and the projection plane on the other;
Step S5: the image processing unit judges whether the projection-area image contains a fingertip and, when it does, calculates the distance between the fingertip and the human-computer interaction interface; if this distance is less than or equal to a first threshold, it outputs touch information to the touch detection unit; otherwise it outputs the projection-area image containing the fingertip and the fingertip's position in the human-computer interaction interface to the touch detection unit;
Step S6: when the output the touch detection unit receives from the image processing unit is touch information, it outputs the touch information directly; if it receives a projection-area image containing a fingertip together with the fingertip's position in the human-computer interaction interface, it obtains the fingertip position, judges whether the fingertip touches the interface, and outputs touch information if it does. Specifically: receive the system's fixed extrinsic parameters, variable extrinsic parameters and fixed intrinsic parameters, and the homography matrix between the projection unit and the image sensing unit, output by the image processing unit; from these parameters compute by geometry (i) the position of intersection point b, where the line through the image sensing unit's optical centre perpendicular to the projection plane meets the projection plane, expressed on the image sensing unit's imaging plane, and (ii) the position of intersection point a, where the line through the projection unit's optical centre and the image sensing unit's optical centre meets the projection plane; using the region of the human-computer interaction interface where a touch may occur and the colour of the fingertip, locate the fingertip and apply edge detection to obtain the fingertip edge; fit a circle to the fingertip edge to obtain its centre o, connect centre o with intersection point b, and define the intersection of line ob with the fingertip edge as the fingertip lowest point f, with coordinates (x_f^c, y_f^c) on the image sensing unit's imaging plane; using the characteristics of shadow in HSV colour space, obtain the fingertip shadow region, connect intersection point a with fingertip lowest point f, and take the intersection of that line with the shadow edge as the shadow front point s, with coordinates (x_s^c, y_s^c) on the image sensing unit's imaging plane; through the homography matrix between the projection unit and the image sensing unit, compute the shadow front point's position (x_s^p, y_s^p) on the projection unit's plane; from the system's fixed intrinsic and extrinsic parameters output by the image processing unit, obtain the rotation matrix R and translation matrix T between the image sensing unit and the projection unit, and express the two optical centres in one coordinate system: unified in the image sensing unit's optical-centre frame, the image sensing unit's optical centre is (0, 0, 0), the projection unit's optical centre is T_{c→p}, the fingertip lowest point is (x'_f^c, y'_f^c, f_c), and the shadow front point is R_{p→c}·(x'_s^p, y'_s^p, f_p)^T + T_{c→p}, where R_{p→c} is the 3x3 rotation matrix from the projection unit's coordinate system to the image sensing unit's coordinate system, T_{c→p} is the 1x3 translation from the image sensing unit's coordinate system to the projection unit's coordinate system, f_c and f_p are the focal lengths of the image sensing unit and the projection unit, and (x'_f^c, y'_f^c) and (x'_s^p, y'_s^p) are the corresponding coordinates after intrinsic-parameter correction; from the fingertip lowest point on the image sensing unit's imaging plane, the shadow front point on the projection unit's imaging plane, and the two optical-centre positions, the line in space through the projection unit's optical centre and the shadow front point on the projection unit's plane is [x_c, y_c, z_c] = λ1·v1 + T_{c→p}, and the line through the image sensing unit's optical centre and the fingertip lowest point on the image sensing unit's plane is [x_c, y_c, z_c] = λ2·v2; the spatial coordinate of their intersection is λ'2·v2, which is taken as the fingertip position, where v1 = R_{p→c}·(x'_s^p, y'_s^p, f_p)^T, v2 = (x'_f^c, y'_f^c, f_c)^T, λ1 and λ2 are scale factors, and λ'1, λ'2 are the fixed values satisfying
[λ'1; λ'2] = [ ‖v1‖², −v1^T·v2 ; −v2^T·v1, ‖v2‖² ]^(−1) · [ −v1^T·T_{c→p} ; v2^T·T_{c→p} ];
finally, compute the projection plane equation from the system's variable extrinsic parameters, represented by a normal vector n_c and a point p_c, and obtain the distance d = |n_c^T·(λ'2·v2 − p_c)| / ‖n_c‖ between the intersection point above and the projection plane; when this distance is less than a second threshold, a touch operation is judged. When a touch operation occurs, the fingertip touch judgment module further uses the received homography matrix between the image sensing unit and the projection unit to compute the coordinates of the touch point in the projected human-computer interaction interface, and outputs fingertip touch information, which at least includes the coordinates of the touch point in the originally projected human-computer interaction interface.
7. The method according to claim 6, characterized in that the image processing unit's judgment in step S4 of whether the projection-area image contains a fingertip comprises:
When points, lines or surfaces with obvious features exist in the externally input human-computer interaction interface, the projection interface processing unit also extracts first characteristic information from the externally input human-computer interaction interface information and exports the first characteristic information to the image processing unit, the first characteristic information comprising feature points, lines and surfaces together with their coordinates in the interface;
The image processing unit extracts second characteristic information, comprising feature points, lines and surfaces, from the projection-area image output by the image sensing unit;
The image processing unit compares the second characteristic information with the first characteristic information and, according to whether the second characteristic information is deformed relative to the feature points, lines or surfaces of the first characteristic information, judges whether the projection area contains a fingertip, and calculates the distance of the fingertip at the deformation from the human-computer interaction interface and its position in the projection area.
8. The method according to claim 6, characterized in that the image processing unit's judgment in step S4 of whether the projection-area image contains a fingertip comprises:
When no points, lines or surfaces with obvious features exist in the externally input human-computer interaction interface, the projection interface processing unit also extracts third characteristic information, comprising the border of the projected human-computer interaction interface, from the externally input human-computer interaction interface information, and exports the third characteristic information to the image processing unit;
The image processing unit extracts fourth characteristic information, comprising the border of the human-computer interaction interface, from the projection-area image output by the image sensing unit;
The image processing unit compares the third characteristic information with the fourth characteristic information and, according to whether deformation has occurred between them, judges whether the projection area contains a fingertip, and calculates the distance of the fingertip at the deformation from the human-computer interaction interface and its position in the human-computer interaction interface.
9. The method according to claim 6, characterized in that the projection interface processing unit performs shape pre-distortion correction on the received human-computer interaction interface information before outputting it to the projection unit.
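Shape pre-distortion of this kind is commonly realized with a planar homography: the interface image is pre-warped so that, after oblique projection, it appears undistorted on the surface. The sketch below solves the 3x3 homography from four corner correspondences with the direct linear transform; the corner coordinates are made-up example values, and the claim does not specify this particular construction.

```python
import numpy as np

def homography(src, dst):
    """Solve H (3x3, up to scale) with H @ [x, y, 1]^T ~ [u, v, 1]^T
    from four point correspondences, via the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)     # null-space vector = homography entries

def warp(H, pt):
    """Apply homography H to a 2-D point (homogeneous divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Pre-warp a unit-square interface into a trapezoid that compensates
# an oblique projection (corner values chosen for illustration).
H = homography([(0, 0), (1, 0), (1, 1), (0, 1)],
               [(0, 0), (1, 0), (0.8, 1), (0.2, 1)])
print(np.round(warp(H, (1.0, 1.0)), 3))  # top-right corner lands near [0.8, 1.0]
```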
10. The method according to claim 6, characterized in that the image sensing unit performs optical distortion correction on the captured image before sending it to the graphics processing unit.
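Optical distortion correction is typically modeled with a radial lens-distortion polynomial. As an illustrative sketch (the claim does not name a specific model), the code below inverts a one-parameter radial model x_d = x_u * (1 + k1 * r_u^2) by fixed-point iteration, with coordinates normalized so the optical centre is at (0, 0); the parameter value and function name are assumptions.

```python
def undistort(pt, k1, iterations=10):
    """Recover undistorted coordinates from a distorted point under the
    one-parameter radial model x_d = x_u * (1 + k1 * r_u^2), by iterating
    x_u <- x_d / (1 + k1 * r_u^2) from the no-distortion initial guess."""
    xd, yd = pt
    xu, yu = xd, yd                      # initial guess: no distortion
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        factor = 1.0 + k1 * r2
        xu, yu = xd / factor, yd / factor
    return xu, yu

# Round trip: distort a known point forward, then undo the distortion.
k1 = 0.1
xd = 0.3 * (1 + k1 * (0.3**2 + 0.4**2))  # = 0.3075
yd = 0.4 * (1 + k1 * 0.25)               # = 0.41
print(undistort((xd, yd), k1))           # recovers approximately (0.3, 0.4)
```

The fixed-point iteration converges quickly for the mild distortion typical of projector-camera setups; stronger distortion would call for a higher-order model or Newton iteration.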
CN201410175698.0A 2014-04-28 2014-04-28 A kind of finger tip touching detecting system and method Active CN103955316B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410175698.0A CN103955316B (en) 2014-04-28 2014-04-28 A kind of finger tip touching detecting system and method


Publications (2)

Publication Number Publication Date
CN103955316A true CN103955316A (en) 2014-07-30
CN103955316B CN103955316B (en) 2016-09-21

Family

ID=51332597

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410175698.0A Active CN103955316B (en) 2014-04-28 2014-04-28 A kind of finger tip touching detecting system and method

Country Status (1)

Country Link
CN (1) CN103955316B (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI470478B (en) * 2008-12-26 2015-01-21 Inventec Appliances Corp Virtual keyboard of an electronic device and a data inputting method therefor
CN102508574B (en) * 2011-11-09 2014-06-04 清华大学 Projection-screen-based multi-touch detection method and multi-touch system
CN103279225B (en) * 2013-05-30 2016-02-24 清华大学 Projection type man-machine interactive system and touch control identification method
CN103336634B (en) * 2013-07-24 2016-04-20 清华大学 Based on touching detection system and the method for adaptive layered structured light

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106033286A (en) * 2015-03-08 2016-10-19 青岛通产软件科技有限公司 A projection display-based virtual touch control interaction method and device and a robot
CN106648263A (en) * 2016-11-11 2017-05-10 珠海格力电器股份有限公司 Terminal equipment and control system, control method and device thereof
CN106648263B (en) * 2016-11-11 2022-01-04 珠海格力电器股份有限公司 Terminal equipment and control system, control method and device thereof
CN107092350A (en) * 2017-03-22 2017-08-25 深圳大学 A kind of remote computer based system and method
CN110691548A (en) * 2017-07-28 2020-01-14 谷歌有限责任公司 System and method for predicting and summarizing medical events from electronic health records
CN110691548B (en) * 2017-07-28 2023-05-12 谷歌有限责任公司 System and method for predicting and summarizing medical events from electronic health records
CN107943351A (en) * 2017-11-22 2018-04-20 苏州佳世达光电有限公司 Touch identifying system and method in perspective plane
CN107943351B (en) * 2017-11-22 2021-01-05 苏州佳世达光电有限公司 Projection surface touch identification system and method
CN108363484B (en) * 2018-01-24 2021-04-09 广州杰赛科技股份有限公司 Control method, device and system of non-touch display screen equipment and computer equipment
CN108363484A (en) * 2018-01-24 2018-08-03 广州杰赛科技股份有限公司 Control method, device, system and the computer equipment of non-tactile display device
CN108363485A (en) * 2018-01-25 2018-08-03 广州杰赛科技股份有限公司 Control method, device, system and the computer equipment of non-touch screen display terminal
CN112930523A (en) * 2018-09-10 2021-06-08 阿韦瓦软件有限责任公司 Edge HMI module server system and method
CN112930523B (en) * 2018-09-10 2024-01-23 阿韦瓦软件有限责任公司 Edge HMI module server system and method

Also Published As

Publication number Publication date
CN103955316B (en) 2016-09-21

Similar Documents

Publication Publication Date Title
CN103955316A (en) Fingertip touch detection system and method
CN103809880B (en) Man-machine interaction system and method
US11842438B2 (en) Method and terminal device for determining occluded area of virtual object
CN107016697B (en) A kind of height measurement method and device
CN102508574B (en) Projection-screen-based multi-touch detection method and multi-touch system
CN104102343A (en) Interactive Input System And Method
CN102163108B (en) Method and device for identifying multiple touch points
CN103279225B (en) Projection type man-machine interactive system and touch control identification method
CN103336634B (en) Touch detection system and method based on adaptive hierarchical structured light
CN110276774B (en) Object drawing method, device, terminal and computer-readable storage medium
CN102722254B (en) Method and system for location interaction
CN102508575B (en) Screen writing device, screen writing system and realization method thereof
CN104423578A (en) Interactive Input System And Method
CN103488356B (en) A kind of touch identification method based on infrared camera three-dimensional imaging
WO2011146070A1 (en) System and method for reporting data in a computer vision system
WO2018018624A1 (en) Gesture input method for wearable device, and wearable device
CN109544628A (en) Accurate reading recognition system and method for pointer instruments
CN105513128A (en) Kinect-based three-dimensional data fusion processing method
CN107346175A (en) Gesture position correction method and augmented reality display device
CN104714646A (en) 3D virtual touch control man-machine interaction method based on stereoscopic vision
CN103176606B (en) Plane interaction system and method based on binocular vision recognition
CN107527353B (en) Projection picture outer frame detection method based on visual processing
CN106415460B (en) Wearable device with intelligent subscriber input interface
CN203386146U (en) Infrared video positioning-based man-machine interactive device
CN102023759B (en) Writing and locating method of active pen

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant