CN100585329C - Location system of video finger and location method based on finger tip marking - Google Patents

Location system of video finger and location method based on finger tip marking

Info

Publication number
CN100585329C
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN200710021403A
Other languages
Chinese (zh)
Other versions
CN101033963A (en)
Inventor
顾宏斌
朱为珏
孙瑾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN200710021403A
Publication of CN101033963A
Application granted
Publication of CN100585329C
Legal status: Expired - Fee Related


Abstract

This invention relates to a video-based finger positioning system and positioning method based on fingertip markers, belonging to the human-computer interaction technology of virtual reality systems. The system comprises cameras, an image acquisition device and a computer. The positioning method comprises an initialization stage and a positioning stage. In the initialization stage, the intrinsic parameters of each camera are determined by a calibration method based on a two-dimensional planar target, and a global camera calibration determines the translation and rotation of each camera coordinate system relative to a unified world coordinate system. The positioning stage adapts to the shooting distance and realizes finger positioning by detecting the graphical marker points in the images.

Description

Video finger positioning system based on fingertip markers and localization method thereof
(1) technical field
The video finger positioning system and localization method based on fingertip markers of the present invention belong to video-based data processing apparatus and techniques, and are applicable to the human-computer interaction link of a virtual reality system.
(2) background technology
Interaction technology is one of the key technologies of virtual reality systems; it realizes the interaction between people and computers, and between the real world and the virtual world. To realize interaction, the real-time position and orientation of the user, in particular of the user's limbs (hands), must be obtained continuously in some temporal reference frame.
In the field of human-computer interaction, hand tracking is mainly divided into two approaches: data-glove tracking and video tracking.
See "Shi Jiaoying, Fundamentals and Practical Algorithms of Virtual Reality, Science Press, 2002", which comprehensively describes data gloves: a data glove converts the various postures of the fingers and palm, when stretched or bent, into digital signals sent to the computer for recognition and execution, thereby realizing interaction. In the standard configuration each finger carries two sensors, namely two fiber-optic loops or other measuring elements mounted on the back of the finger, used to measure the bending angles of the main finger joints. As an option, the data glove also provides sensors that measure the closing/opening angle and the upward/downward tilt angle of the thumb.
Although the combined use of a tracking device and a data glove can accurately obtain the position and attitude of the fingers, it also has shortcomings: 1. the equipment itself is complex, so the user's finger movements become inflexible after wearing it; 2. to reach high precision, precise sensing elements are adopted, making the whole device expensive and difficult to maintain; 3. the data glove uses inflation, vibration or electrical stimulation to press on the skin in order to provide tactile feedback, and can only simulate the most basic touch sensations.
Vision-based hand tracking and positioning falls into two classes: methods based on monocular vision and methods based on multi-camera (binocular) vision. See "Chang Hong, Wang Yongtian, Hua Hong, et al., Hand shape and hand position tracking based on computer vision techniques, Journal of Beijing Institute of Technology, 1999, 19(6): 739-743", which proposes a hand-tracking method based on monocular vision. See "Jianchao Zeng, Yue Wang, Turner R., et al., Vision-based finger tracking of breast palpation for improving breast self-examination, 18th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Amsterdam 1996, Vol. 1, pp. 148-149", which proposes a binocular vision tracking technique based on color detection. However, whether binocular or monocular, current video tracking techniques are mainly used to determine the relative positions of and between fingers, that is, the hand-shape recognition problem of fingers relative to the palm.
(3) summary of the invention
The purpose of this invention is to provide a finger positioning system that is based on computer vision principles, is light, practical and inexpensive, does not interfere with the hand's sense of touch, and realizes the recognition and positioning function of a data glove; in particular, using video processing techniques, it provides finger positioning when simulating operations on switches, buttons, handles and the like (hereinafter collectively referred to as switches). The key problem is to determine the position of the finger in absolute space with high precision.
The technical scheme of the present invention is as follows:
(1) system features
The video finger positioning system based on fingertip markers is characterized in that it comprises a computer, an image acquisition device and several cameras, with graphical marker points attached to the five fingertips and the back of the hand. Each camera performing three-dimensional detection of the hand is connected to the computer through the image acquisition device.
The above video finger positioning system based on fingertip markers is characterized in that the graphical marker points are attached to the fingernails and are specially designed: the marker graphics of different fingertips use different colors, suited to long-distance positioning; the marker graphics have a direction feature (for example, the marker graphic is designed as an arrow pointing toward the fingertip), suited to identifying the fingertip direction during medium-distance positioning; a calibration pattern is drawn inside each marker graphic, suited to close-range positioning. The markers are thus compatible over near and far distances and adapt to sharp changes in shooting distance.
(2) method feature
The localization method of the video finger positioning system based on fingertip markers is characterized by comprising the following steps:
Initial stage: by a calibration method based on a two-dimensional planar target, using a calibration reference object, determine the intrinsic parameters of each camera; by performing a global calibration of all cameras, determine the translation and rotation of each camera coordinate system relative to a unified world coordinate system, completing system initialization;
Positioning stage: adapt to the shooting distance and realize finger positioning by detecting the graphical marker points.
The localization method of the above video finger positioning system based on fingertip markers is characterized in that the adaptation to shooting distance in the positioning stage works as follows: at long distance, the multi-camera vision principle is adopted and positioning is based on the marker colors; at medium distance, the multi-camera vision principle is adopted, positioning is based on the marker colors, and the fingertip direction is determined from the marker graphic; at close range, the monocular vision principle is adopted, a single fingertip is positioned from its calibration pattern, and the information from the cameras detecting different fingertips is fused to determine the overall hand shape and position.
At medium and long distance, occlusion of the fingertip markers may occur. When the marker coordinate information is insufficient, the overall hand shape is determined by a method based on skin-color segmentation, and the overall positioning is realized by combining the hand shape with the coordinates of the visible markers.
At close range, the specific localization method depends on the calibration pattern:
1. When the calibration pattern is a right triangle and a straight line, the correspondence between the calibration pattern and its image is used, and the position and angle of the fingertip are determined through a homography matrix, where the homography matrix is the matrix expressing the relation between three-dimensional space points and the corresponding two-dimensional image points;
2. When the calibration pattern is a square, the finger is positioned from the projected images of the four vertices of the square, using the unit orthogonality of the rotation matrix, where the unit orthogonality of the rotation matrix means
RR^T = R^T R = I, where
R = \begin{pmatrix} \cos\psi\cos\phi & \sin\theta\sin\psi\cos\phi - \cos\theta\sin\phi & \cos\theta\sin\psi\cos\phi + \sin\theta\sin\phi \\ \cos\psi\sin\phi & \sin\theta\sin\psi\sin\phi + \cos\theta\cos\phi & \cos\theta\sin\psi\sin\phi - \sin\theta\cos\phi \\ -\sin\psi & \sin\theta\cos\psi & \cos\theta\cos\psi \end{pmatrix},
R^T = \begin{pmatrix} \cos\psi\cos\phi & \cos\psi\sin\phi & -\sin\psi \\ \sin\theta\sin\psi\cos\phi - \cos\theta\sin\phi & \sin\theta\sin\psi\sin\phi + \cos\theta\cos\phi & \sin\theta\cos\psi \\ \cos\theta\sin\psi\cos\phi + \sin\theta\sin\phi & \cos\theta\sin\psi\sin\phi - \sin\theta\cos\phi & \cos\theta\cos\psi \end{pmatrix},
θ is the rotation angle about the X axis, ψ is the rotation angle about the Y axis, φ is the rotation angle about the Z axis, and I = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix};
3. When the calibration pattern is a pair of opposite sides of a square together with a diagonal, the position and direction of the finger are determined from the vanishing points of the two directions, where a vanishing point is the intersection point on the imaging plane of the images of spatially parallel straight lines that are not parallel to the camera imaging plane (see the worked relation after this list);
4. When the calibration pattern is a circle, the fact that a circle becomes an ellipse under perspective projection is used, and the positioning of the finger is completed using triangle geometry.
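For method 3, the following is a brief sketch of the standard vanishing-point relation it relies on, stated under the usual pinhole-camera assumptions (the symbols d_1 and d_2 for the two pattern directions are introduced here only for illustration). A world direction d_i is imaged at the vanishing point

v_i \simeq K R d_i ,

so each pattern direction can be recovered in the camera frame as

R d_i = \frac{K^{-1} v_i}{\lVert K^{-1} v_i \rVert} .

Since d_1 and d_2 (the side direction and the diagonal direction) are known in the pattern plane, the rotation R follows, and the translation T then follows from the known physical size of the pattern and a single point correspondence.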
The localization method of the above video finger positioning system based on fingertip markers is characterized in that the graphical marker detection of the positioning stage incorporates parameters such as the switch type and the spatial position of the switch and their relation to the operating hand shape, predicts the fingertip movement trajectory, narrows the marker detection search space, and improves the positioning speed.
The beneficial effects of the present invention are:
Compared with current data gloves, it has two notable advantages: 1. it is contact-free: since no data glove is worn, there is no interference with the hand's sense of touch and movement; the user does not need to wear cumbersome tracking devices, the positioning and recognition function of a data glove is accomplished by setting up several cameras, and the whole input device is simplified; 2. it is low-cost: at present, the purchase price and maintenance cost of various data gloves are extremely high, whereas ordinary cameras are cheap and maintenance-free, which greatly reduces the overall cost of the interactive system and makes it suitable for users at all levels.
Compared with current video tracking techniques, it has three distinctive features: 1. the hand can be located particularly during switch operations: the cameras are embedded near the switch, and the closer the finger gets to the switch, the higher the image resolution and therefore the higher the positioning accuracy; 2. the markers are attached to the fingernails, eliminating errors caused by the markers moving relative to the fingers; 3. the markers are specially designed to be compatible over near and far distances, and are therefore suitable for motion over a large range.
(4) description of drawings
Fig. 1 is a block diagram of the composition of the video finger positioning system based on fingertip markers.
Fig. 2 is a block diagram of the working principle of the video finger positioning system based on fingertip markers.
Fig. 3 is the positioning flow chart of the video finger positioning system based on fingertip markers:
1 - positioning flow for long-distance shooting
2 - positioning flow for medium-distance shooting
3 - positioning flow for close-range shooting
Fig. 4 shows examples of calibration patterns in the fingertip marker points:
4 - marker composed of a right triangle and a straight line; 5 - square marker
6 - marker composed of a pair of opposite sides of a square and a diagonal; 7 - circular marker
(5) embodiment
First, the system is laid out according to the composition block diagram of Fig. 1, and several cameras are used to monitor the hand. Then, according to the working-principle block diagram of Fig. 2, the analog video input is converted into digital image signals by the image acquisition device, and the computer completes the recognition and positioning of the fingertips.
The detailed positioning process is shown in the flow chart of Fig. 3:
(1) Initial stage
By a calibration method based on a two-dimensional planar target, using an ordinary rectangle as the calibration reference object, the five parameters f_u, f_v, u_0, v_0 and s of each camera are determined (where (u_0, v_0) are the principal point coordinates, f_u is the scale factor along the image u axis, f_v is the scale factor along the image v axis, and s is the skew (distortion) factor), giving the intrinsic parameter matrix K:
K = \begin{pmatrix} f_u & s & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{pmatrix}
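As an illustrative aid, the following is a minimal sketch of such a planar-target intrinsic calibration using OpenCV in Python (an assumed toolchain; the patent does not prescribe any particular library, and the checkerboard size, square size and file names below are hypothetical). Note that OpenCV's camera model assumes zero skew, so only f_u, f_v, u_0 and v_0 are recovered here.

import glob
import cv2
import numpy as np

# Hypothetical planar target: a 9x6 inner-corner checkerboard with 25 mm squares.
PATTERN = (9, 6)
SQUARE_MM = 25.0

# World coordinates of the target corners (planar target, so Z = 0).
obj = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
obj[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in glob.glob("calib_cam0_*.png"):          # hypothetical image file names
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(obj)
        img_points.append(corners)

# K holds f_u, f_v, u_0, v_0 in the layout of the intrinsic matrix above.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("intrinsic matrix K:\n", K)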
The coordinate systems of the several cameras are transformed into a unified world coordinate system by global calibration: the translation and rotation of each camera coordinate system relative to the unified world coordinate system are determined and combined into the extrinsic parameter matrix W:
W = \begin{pmatrix} R & T \\ 0 & 1 \end{pmatrix}
where the rotation matrix
R = \begin{pmatrix} \cos\psi\cos\phi & \sin\theta\sin\psi\cos\phi - \cos\theta\sin\phi & \cos\theta\sin\psi\cos\phi + \sin\theta\sin\phi \\ \cos\psi\sin\phi & \sin\theta\sin\psi\sin\phi + \cos\theta\cos\phi & \cos\theta\sin\psi\sin\phi - \sin\theta\cos\phi \\ -\sin\psi & \sin\theta\cos\psi & \cos\theta\cos\psi \end{pmatrix}
expresses the angular relation among the three axes of the world coordinate system, with θ the rotation angle about the X axis, ψ the rotation angle about the Y axis, and φ the rotation angle about the Z axis; T = (t_x, t_y, t_z)^T is the translation vector along the three axes of the world coordinate system, with t_x, t_y and t_z the translations along the X, Y and Z axes respectively.
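A minimal numerical sketch of assembling this extrinsic matrix from the Euler angles and translation defined above (the function and variable names are illustrative, not taken from the patent):

import numpy as np

def rotation_zyx(theta, psi, phi):
    """R = Rz(phi) @ Ry(psi) @ Rx(theta): theta about X, psi about Y, phi about Z."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(theta), -np.sin(theta)],
                   [0, np.sin(theta),  np.cos(theta)]])
    Ry = np.array([[ np.cos(psi), 0, np.sin(psi)],
                   [ 0,           1, 0          ],
                   [-np.sin(psi), 0, np.cos(psi)]])
    Rz = np.array([[np.cos(phi), -np.sin(phi), 0],
                   [np.sin(phi),  np.cos(phi), 0],
                   [0,            0,           1]])
    return Rz @ Ry @ Rx

def extrinsic_matrix(theta, psi, phi, t):
    """4x4 matrix W = [[R, T], [0, 1]] taking world coordinates into the camera frame."""
    W = np.eye(4)
    W[:3, :3] = rotation_zyx(theta, psi, phi)
    W[:3, 3] = t
    return W

# Example with made-up values: 10 degrees about Y, 0.5 m translation along X.
W = extrinsic_matrix(0.0, np.radians(10), 0.0, [0.5, 0.0, 0.0])
R = W[:3, :3]
assert np.allclose(R @ R.T, np.eye(3))   # unit orthogonality RR^T = I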
(2) Positioning stage
When shooting at medium and long distance, the multi-camera vision principle is adopted:
First, the two-dimensional image coordinates of the marker points are determined. In the first step, the moving regions in the image are detected by an inter-frame difference algorithm; when there is no motion, no further processing is done, which saves system resources. In the second step, the moving image is filtered with the marker color thresholds to remove background interference and determine the position of the marker graphic in the image; the centroid of the graphic is then computed, giving the two-dimensional image coordinates of the marker point. In the third step, using the correspondence between the switch parameters (switch type, switch spatial position, etc.) and the operating hand shape, the region where the marker will appear at the next instant is predicted from the current fingertip marker coordinates, and the marker detection window is further narrowed within the moving region, realizing window tracking.
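A minimal sketch of the first two steps (inter-frame differencing followed by color thresholding and centroid extraction), assuming OpenCV and an HSV range chosen for one hypothetical marker color; the threshold values are illustrative:

import cv2
import numpy as np

# Hypothetical HSV range for one marker color (e.g. a red fingertip marker).
LO, HI = np.array([0, 120, 80]), np.array([10, 255, 255])
MOTION_THRESH = 25          # inter-frame difference threshold (assumed)

def marker_centroid(prev_bgr, curr_bgr):
    """Return the (u, v) image coordinates of the marker centroid, or None."""
    # Step 1: inter-frame difference -- skip frames without motion.
    diff = cv2.absdiff(cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY))
    motion = cv2.threshold(diff, MOTION_THRESH, 255, cv2.THRESH_BINARY)[1]
    if cv2.countNonZero(motion) == 0:
        return None
    # Step 2: color threshold restricted to the moving region, then centroid.
    hsv = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.bitwise_and(cv2.inRange(hsv, LO, HI), motion)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

The third step (window tracking driven by the switch parameters) would then restrict marker_centroid to the predicted sub-window of the frame.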
At medium distance in particular, because the fingertip marker graphic is clearer, the direction feature of the graphic is used to determine the fingertip orientation, which improves the accuracy of predicting the marker's region at the next instant.
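One possible way (an assumption, since the patent leaves the direction-feature detection unspecified) to estimate the orientation of an arrow-shaped marker from its binary mask is via second-order image moments; note that this yields the axis only modulo 180 degrees, and resolving which end is the arrow head needs an extra cue such as the sign of the third-order moment along that axis:

import cv2
import numpy as np

def marker_orientation(mask):
    """Orientation (radians) of the marker graphic's principal axis in the image,
    computed from the second-order central moments of the binary marker mask."""
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None
    # 0.5 * atan2(2*mu11, mu20 - mu02) is the principal-axis angle.
    return 0.5 * np.arctan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])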
The two-dimensional image coordinates of the fingertip markers are then restored to three-dimensional world coordinates. The image coordinates of the same marker point captured from different directions by several cameras are used in a reconstruction calculation to locate its three-dimensional coordinates accurately. Taking the coordinates of one camera as an example:
Let P be a marker point, let p(x, y) be the image coordinates of the marker point and P(X, Y, Z) its spatial coordinates. Using homogeneous coordinates, the following relation holds between them (where K is the intrinsic parameter matrix obtained in the initial stage, and R and T are the rotation and translation obtained in the initial stage, written as the 3 × 4 matrix (R T)):
\lambda \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = K \begin{pmatrix} R & T \end{pmatrix} \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}
where λ is a non-zero scale factor. Since K(R T) is a 3 × 4 matrix and therefore not invertible, when K(R T) and (x, y) are known, the three equations given by the formula above yield only two independent linear equations in X, Y and Z. The system formed by these two linear equations describes a projection ray l: every spatial point whose image projection is the marker point p(x, y) lies on this ray.
For the several cameras, the N projection rays l_1, l_2, ..., l_N through the same marker point can be obtained. Intersecting these N rays pairwise gives N(N-1)/2 spatial points P_1, P_2, ..., P_{N(N-1)/2}, and averaging these N(N-1)/2 coordinate points gives the final spatial coordinates of the marker point.
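A minimal sketch of this pairwise ray-intersection scheme, taking the midpoint of the common perpendicular of each ray pair as their "intersection" (a common choice, since rays from real cameras rarely intersect exactly; the function names are illustrative):

from itertools import combinations
import numpy as np

def ray_midpoint(o1, d1, o2, d2):
    """Midpoint of the shortest segment between the rays o1 + s*d1 and o2 + t*d2."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                       # zero only for parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

def marker_position(rays):
    """rays: list of (origin, direction) pairs, one projection ray per camera."""
    points = [ray_midpoint(o1, d1, o2, d2)
              for (o1, d1), (o2, d2) in combinations(rays, 2)]
    return np.mean(points, axis=0)              # average of the N(N-1)/2 points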
When the fingertip markers are occluded, a method based on skin-color segmentation is adopted: the hand region is extracted and the hand shape determined; the overall positioning is then realized from the hand-shape information and biological constraints on the hand shape, combined with the coordinates of the visible markers.
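A minimal sketch of skin-color segmentation in the YCrCb color space (one common choice; the patent does not specify a particular color model, and the bounds below are illustrative):

import cv2
import numpy as np

# Illustrative Cr/Cb bounds often used for skin detection in YCrCb space.
SKIN_LO = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HI = np.array([255, 173, 127], dtype=np.uint8)

def hand_region(frame_bgr):
    """Return a binary mask of the largest skin-colored region (taken to be the hand)."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LO, SKIN_HI)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask
    hand = max(contours, key=cv2.contourArea)   # keep only the largest blob
    out = np.zeros_like(mask)
    cv2.drawContours(out, [hand], -1, 255, thickness=cv2.FILLED)
    return out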
When the shooting distance is short, the monocular vision principle is adopted:
Because the shooting distance is very short, the calibration pattern inside the graphical marker occupies the main part of the image.
According to the camera imaging principle, the extrinsic parameters of the camera comprise R and T, where R is the rotation matrix described above, expressing the angular relation among the three axes of the world coordinate system, and T is the translation vector described above, expressing the translation along the three axes of the world coordinate system. The relation between the camera coordinate system and the world coordinate system can therefore be described by the rotation matrix R and the translation vector T. If a designated point of the calibration pattern in the fingertip marker (the right-angle vertex of the triangle shown at label 4 of Fig. 4, the top-left vertex of the square shown at labels 5 and 6 of Fig. 4, or the circle center shown at label 7 of Fig. 4) is taken as the origin of the world coordinate system, then the relative position of the fingertip marker and the camera can likewise be described by R and T. Since the camera is fixed and its position relative to the switch is known, a further transformation yields the position of the fingertip marker relative to the switch, and thus the absolute spatial coordinates of the fingertip.
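The "further transformation" is the composition of two rigid transforms; a minimal sketch (with illustrative names, and assuming the camera-to-switch transform has been measured beforehand):

import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 rigid transform."""
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, np.ravel(t)
    return M

def marker_in_switch_frame(R_marker_cam, t_marker_cam, R_cam_switch, t_cam_switch):
    """Compose the marker->camera and camera->switch transforms to obtain the
    fingertip marker pose relative to the switch."""
    M = to_homogeneous(R_cam_switch, t_cam_switch) @ to_homogeneous(R_marker_cam, t_marker_cam)
    return M[:3, :3], M[:3, 3]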
As stated above, the key to monocular positioning is obtaining the rotation matrix R and the translation vector T, and the solving method differs with the calibration pattern. For the calibration pattern shown at label 4 of Fig. 4, R and T are solved by using the correspondence between the calibration pattern and its image and determining the position and angle of the fingertip through the homography matrix, where the homography matrix is the matrix relating points of the calibration pattern to the corresponding two-dimensional image points. For the calibration pattern shown at label 5 of Fig. 4, R and T are solved from the projected images of the four vertices of the square, using the unit orthogonality of the rotation matrix R, where, as described above, unit orthogonality means RR^T = R^T R = I. For the calibration pattern shown at label 6 of Fig. 4, R and T are solved by determining the position and direction of the finger from the vanishing points of the two directions, where a vanishing point is the intersection point on the imaging plane of the images of spatially parallel straight lines that are not parallel to the camera imaging plane. For the calibration pattern shown at label 7 of Fig. 4, R and T are solved by using the fact that a circle becomes an ellipse under perspective projection and completing the positioning of the finger using triangle geometry.
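As one possible implementation of the close-range case for the square pattern (label 5), the following sketch recovers R and T from the four detected corner points with OpenCV's planar pose solver; the physical side length, corner ordering and use of solvePnP are assumptions, not prescriptions of the patent:

import cv2
import numpy as np

SIDE_MM = 8.0   # assumed physical side length of the square calibration pattern

# World coordinates of the square's corners, origin at the top-left vertex, Z = 0,
# listed clockwise: top-left, top-right, bottom-right, bottom-left.
SQUARE_3D = np.array([[0, 0, 0],
                      [SIDE_MM, 0, 0],
                      [SIDE_MM, SIDE_MM, 0],
                      [0, SIDE_MM, 0]], dtype=np.float64)

def fingertip_pose(corners_2d, K, dist_coeffs=None):
    """corners_2d: 4x2 detected image corners, ordered like SQUARE_3D.
    Returns (R, T): rotation matrix and translation of the marker in the camera frame."""
    ok, rvec, tvec = cv2.solvePnP(SQUARE_3D, np.asarray(corners_2d, np.float64),
                                  K, dist_coeffs, flags=cv2.SOLVEPNP_IPPE)
    if not ok:
        raise RuntimeError("pose could not be recovered")
    R, _ = cv2.Rodrigues(rvec)      # axis-angle vector -> 3x3 rotation matrix
    return R, tvec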

Claims (1)

1. A video finger localization method based on fingertip markers, characterized by comprising the following steps:
(1) Initial stage: determine the intrinsic parameters of N cameras by a calibration method based on a two-dimensional planar target; by performing a global calibration of the N cameras, determine the translation and rotation of each of the N camera coordinate systems relative to a unified world coordinate system, completing system initialization, where N is a natural number greater than 1;
(2) Positioning stage: adapt to the shooting distance and realize finger positioning by detecting the near-and-far-compatible graphical marker points attached to the five fingertips and the back of the hand; the near-and-far-compatible graphical marker points mean that the marker graphics of different fingertips use different colors, the marker graphics have a direction feature, and a calibration pattern is drawn inside each marker graphic; the adaptive shooting distance covers long distance, medium distance and close range: at long distance, the multi-camera vision principle is adopted and positioning is based on the different marker colors; at medium distance, the multi-camera vision principle is adopted, positioning is based on the different marker colors, and the fingertip direction is determined from the direction feature of the marker graphic; at close range, the monocular vision principle is adopted and positioning is based on the calibration pattern; the detection of the graphical marker points incorporates the relation between the switch type, the switch spatial position and the corresponding parameters of the operating hand shape, predicts the fingertip movement trajectory, narrows the marker detection search space, and improves the positioning speed; at medium distance, when the marker coordinate information is insufficient, the overall hand shape is determined by a method based on skin-color segmentation, and the overall positioning is realized by combining the hand shape with the coordinates of the visible markers; at close range, depending on the calibration pattern, different localization methods are adopted, namely:
1. When the calibration pattern is a right triangle and a straight line, the correspondence between the calibration pattern and its image is used, and the position and angle of the fingertip are determined through a homography matrix, where the homography matrix is the matrix expressing the relation between three-dimensional space points and the corresponding two-dimensional image points;
2. When the calibration pattern is a square, the finger is positioned from the projected images of the four vertices of the square, using the unit orthogonality of the rotation matrix, the unit orthogonality of said rotation matrix being
RR^T = R^T R = I, where
R = \begin{pmatrix} \cos\psi\cos\phi & \sin\theta\sin\psi\cos\phi - \cos\theta\sin\phi & \cos\theta\sin\psi\cos\phi + \sin\theta\sin\phi \\ \cos\psi\sin\phi & \sin\theta\sin\psi\sin\phi + \cos\theta\cos\phi & \cos\theta\sin\psi\sin\phi - \sin\theta\cos\phi \\ -\sin\psi & \sin\theta\cos\psi & \cos\theta\cos\psi \end{pmatrix},
R^T = \begin{pmatrix} \cos\psi\cos\phi & \cos\psi\sin\phi & -\sin\psi \\ \sin\theta\sin\psi\cos\phi - \cos\theta\sin\phi & \sin\theta\sin\psi\sin\phi + \cos\theta\cos\phi & \sin\theta\cos\psi \\ \cos\theta\sin\psi\cos\phi + \sin\theta\sin\phi & \cos\theta\sin\psi\sin\phi - \sin\theta\cos\phi & \cos\theta\cos\psi \end{pmatrix},
where θ is the rotation angle about the X axis of the unified world coordinate system, ψ is the rotation angle about the Y axis of the unified world coordinate system, φ is the rotation angle about the Z axis of the unified world coordinate system, and I = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix};
3. When the calibration pattern is a pair of opposite sides of a square together with a diagonal, the position and direction of the finger are determined from the vanishing points of the two directions, where a vanishing point is the intersection point on the imaging plane of the images of spatially parallel straight lines that are not parallel to the camera imaging plane;
4. When the calibration pattern is a circle, the fact that a circle becomes an ellipse under perspective projection is used, and the positioning of the finger is completed using triangle geometry.
CN200710021403A 2007-04-10 2007-04-10 Location system of video finger and location method based on finger tip marking Expired - Fee Related CN100585329C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200710021403A CN100585329C (en) 2007-04-10 2007-04-10 Location system of video finger and location method based on finger tip marking


Publications (2)

Publication Number Publication Date
CN101033963A CN101033963A (en) 2007-09-12
CN100585329C true CN100585329C (en) 2010-01-27

Family

ID=38730623

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200710021403A Expired - Fee Related CN100585329C (en) 2007-04-10 2007-04-10 Location system of video finger and location method based on finger tip marking

Country Status (1)

Country Link
CN (1) CN100585329C (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102455827A (en) * 2010-10-26 2012-05-16 昕兴时乐股份有限公司 Object sensing device

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101324922B (en) * 2008-07-30 2012-04-18 北京中星微电子有限公司 Method and apparatus for acquiring fingertip track
CN101777182B (en) * 2010-01-28 2012-02-29 南京航空航天大学 Video positioning method of coordinate cycling approximation type orthogonal camera system and system thereof
HK1147905A2 (en) 2010-06-30 2011-08-19 Chi Ching Lee System and method for virtual touch sensing
CN101901339B (en) * 2010-07-30 2012-11-14 华南理工大学 Hand movement detecting method
CN101975588B (en) 2010-08-20 2012-07-11 北京航空航天大学 Global calibration method and device of rigid rod of multisensor vision measurement system
CN102012769B (en) * 2010-11-18 2013-03-27 无锡中星微电子有限公司 Method and device for performing multi-point control on contents on screen by using camera
CN102164269A (en) * 2011-01-21 2011-08-24 北京中星微电子有限公司 Method and device for monitoring panoramic view
JP2016033759A (en) * 2014-07-31 2016-03-10 セイコーエプソン株式会社 Display device, method for controlling display device, and program
CN106683137B (en) * 2017-01-11 2019-12-31 中国矿业大学 Artificial mark based monocular and multiobjective identification and positioning method
CN107424192A (en) * 2017-03-10 2017-12-01 北京小鸟看看科技有限公司 A kind of image processing method, device and virtual reality device for photosphere positioning
CN107160364B (en) * 2017-06-07 2021-02-19 华南理工大学 Industrial robot teaching system and method based on machine vision
CN108151731B (en) * 2017-12-22 2019-02-19 北京轻威科技有限责任公司 A kind of novel fast vision alignment sensor
CN110113560B (en) * 2018-02-01 2021-06-04 中兴飞流信息科技有限公司 Intelligent video linkage method and server
CN110826385A (en) * 2018-06-07 2020-02-21 皇家飞利浦有限公司 Rehabilitation device and method
CN109376612B (en) * 2018-09-27 2022-04-22 广东小天才科技有限公司 Method and system for assisting positioning learning based on gestures
CN109887031A (en) * 2019-01-30 2019-06-14 国网湖南省电力有限公司 Position and posture detection method, system, medium and the equipment of earthing knife-switch in a kind of switchgear
TWI720447B (en) * 2019-03-28 2021-03-01 財團法人工業技術研究院 Image positioning method and system thereof
CN110633666A (en) * 2019-09-10 2019-12-31 江南大学 Gesture track recognition method based on finger color patches
CN110974241A (en) * 2019-12-18 2020-04-10 上海理工大学 Vision-based movement track measuring device for flexible exoskeleton finger joints
CN111665883B (en) * 2020-05-20 2021-05-07 浙江旅游职业学院 Intelligent safety monitoring system and method for sterile workshop


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1533683A2 (en) * 2003-11-21 2005-05-25 Seat, S.A. Mixed reality simulation system
CN1912816A (en) * 2005-08-08 2007-02-14 北京理工大学 Virtus touch screen system based on camera head

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
全红艳, 张田文. A new method for camera self-calibration using a template. Acta Electronica Sinica, Vol. 33, No. 11, 2005. *
孟晓桥, 胡占义. A new camera self-calibration method based on circular points. Journal of Software, Vol. 13, No. 5, 2002. *
常红, 王涌天, 华宏, 徐彤, 周雅, 程雪岷. Hand shape and hand position tracking based on computer vision techniques. Journal of Beijing Institute of Technology, Vol. 19, No. 6, 1999. *
吴福朝, 王光辉, 胡占义. A linear method for determining camera intrinsic parameters and position from a rectangle. Journal of Software, Vol. 14, No. 3, 2003. *


Also Published As

Publication number Publication date
CN101033963A (en) 2007-09-12

Similar Documents

Publication Publication Date Title
CN100585329C (en) Location system of video finger and location method based on finger tip marking
Sato et al. Fast tracking of hands and fingertips in infrared images for augmented desk interface
US10261595B1 (en) High resolution tracking and response to hand gestures through three dimensions
CN101536494B (en) System and method for genture based control system
CN108140360B (en) System and method for manipulating a virtual environment
KR101652535B1 (en) Gesture-based control system for vehicle interfaces
WO2022002133A1 (en) Gesture tracking method and apparatus
Lee et al. Handy AR: Markerless inspection of augmented reality objects using fingertip tracking
CN1774690B (en) Implement for optically inferring information from a planar jotting surface
CN101799717A (en) Man-machine interaction method based on hand action catch
CN108256504A (en) A kind of Three-Dimensional Dynamic gesture identification method based on deep learning
JP2004157850A (en) Motion detector
CN202662011U (en) Physical education teaching auxiliary system based on motion identification technology
Premaratne et al. Historical development of hand gesture recognition
CN103930944A (en) Adaptive tracking system for spatial input devices
CN108022264A (en) Camera pose determines method and apparatus
Gratal et al. Visual servoing on unknown objects
CN107357426A (en) A kind of motion sensing control method for virtual reality device
CN100523727C (en) Finger ring type video measuring finger location system and location method
Lin et al. The manipulation of real-time Kinect-based robotic arm using double-hand gestures
CN106991398B (en) Gesture recognition method based on image recognition and matched with graphical gloves
KR101406855B1 (en) Computer system using Multi-dimensional input device
Hanaoka et al. Development of 3D printed structure that visualizes bending and compression deformations for soft-bodied robots
Athar et al. Vistac towards a unified multi-modal sensing finger for robotic manipulation
JP5788853B2 (en) System and method for a gesture-based control system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100127

Termination date: 20160410