CN101673161A - Visual, operable and non-solid touch screen system - Google Patents

Visual, operable and non-solid touch screen system

Info

Publication number
CN101673161A
CN101673161A (application CN200910197205A)
Authority
CN
China
Prior art keywords
touch screen
virtual touch
image
virtual
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN200910197205A
Other languages
Chinese (zh)
Other versions
CN101673161B (en)
Inventor
许景禧
沈一帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fudan University
Original Assignee
Fudan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fudan University filed Critical Fudan University
Priority to CN2009101972052A priority Critical patent/CN101673161B/en
Publication of CN101673161A publication Critical patent/CN101673161A/en
Application granted granted Critical
Publication of CN101673161B publication Critical patent/CN101673161B/en
Status: Expired - Fee Related


Abstract

The invention belongs to the technical field of human-computer interaction and is a visual, operable, non-solid touch screen system comprising a computer, two webcams, a head-mounted display, and a calibration reference object, where the reference object calibrates the position of a virtual touch screen. Through the head-mounted display the user observes a real-world view containing the virtual touch screen and operates this non-solid touch screen directly by hand. The two cameras collect images in real time; from their input the computer performs spatial calibration, recognizes fingertip positions and actions, and synthesizes a corresponding image. The head-mounted display outputs the image containing the virtual touch screen, so the user directly observes the screen's feedback. The invention can be used in the field of human-computer interaction and is particularly suitable as an interaction device for portable computers or for multimedia interactive experience systems.

Description

A visual, operable and non-solid touch screen system
Technical field
The invention belongs to the technical field of human-computer interaction and specifically relates to a visual, operable touch-screen system.
Background technology
In many virtual reality systems, users operate with data gloves, styluses, or 3D mice instead of a mouse and keyboard. These devices meet the spatial-positioning requirements of virtual reality and broaden the forms of operation, but they share a common shortcoming: they are unnatural to use. Some require the user to wear heavy equipment; others tether the user with data cables that restrict the range of movement.
The personalization of computer systems, represented by handheld computers and smartphones, and the miniaturization, portability, and embedding of computers, represented by virtual reality, are two important trends in computing today, and GUI technology built around the mouse and keyboard is a bottleneck for their development. Input methods that dispense with instruments and rely directly on human senses and actions are therefore attracting growing attention.
On the other hand, human-computer interaction in virtual reality requires not only natural input but also natural output. Users want the virtual world they observe to surround them rather than be confined to a screen, so that they can look at objects in their accustomed way and obtain the information they need from whatever angle they prefer. Researchers have begun to abandon the traditional fixed display, proposing output via projection and head-mounted displays, or using 3D holographic display devices to present virtual objects.
Patent publication CN1912816A discloses a virtual touch screen system based on multiple cameras. That system uses two or more fixed cameras to capture finger movement and click operations on a virtual touch screen, extracts the hand contour with an HSV color-space background-segmentation method, and locates the fingertip with particle-filter tracking. It solves the problem of direct input with the bare hand, but it does not consider how the user observes the virtual output: it merely uses an ordinary display screen, so virtuality is not achieved on the output side.
Summary of the invention
The object of the present invention is to provide a touch-screen system without a physical screen that the user can observe and operate at the same time. The system combines computer-vision fingertip detection with augmented reality technology: the finger position is mapped into the augmented-reality world, so the virtual touch screen that the user sees can be touched and operated at the correct position, achieving virtuality on the output side as well.
The touch-screen system of the present invention comprises a computer, two webcams, a head-mounted display, and a calibration reference object; the reference object calibrates a virtual touch screen for the operator's hand to touch. The two webcams are connected to the computer, and the computer is in turn connected to the head-mounted display. The head-mounted display is worn in front of the user's (the operator's) eyes, and the two webcams are mounted above its left and right sides to capture the user's hand and the reference object in real time as video. The rays along the two cameras' shooting directions intersect at an angle of 30 to 90 degrees in the same horizontal plane; the system places no strict requirement on this angle. The captured video frames are sent to the computer, which runs a software system that computes the current camera pose from the image of the reference object in each frame, thereby determining the position of the virtual touch screen; from the image of the hand in the frame it computes the hand position and decides whether the hand is in contact with the virtual touch screen. Using this information, the software then synthesizes an image, which is output through the head-mounted display, so the user directly observes the feedback of the virtual touch screen.
Various piece of the present invention further describes as follows:
1. The virtual touch screen and the user's operation of it
The virtual touch screen in the present invention is in fact a rectangular region of real space, calibrated with a reference object. Within this region the user can perform the operations of a conventional touch screen with a finger (any finger extended from a fist), such as clicking, dragging, and multi-touch. The invention places no touch-sensing hardware or display device in this region; instead, hand touch is sensed visually through the two webcams, and output is shown by superimposing the virtual touch-screen image on the real-world image in the head-mounted display. To an onlooker, the user appears to be operating an invisible object.
2. Two cameras collect video images in real time
The present invention uses two webcams to capture the user's hand and the calibration reference object simultaneously.
At initialization the system performs the preparatory work for image capture: it reads in the lens data and sets the cameras' optical characteristics and picture size. It also reads in the shape data of the reference object for later matching.
The system captures an image from each camera at nearly the same moment and passes them to the later processing steps. It also keeps a backup of each captured image to use as the background image during later synthesis.
The captured picture format is RGBA (R red, G green, B blue, A transparency). The alpha channel is retained because later image synthesis uses it for transparency tests, which determine how the virtual and real images are combined.
The system requires the capture moments of the left and right cameras to be as close as possible, because whether the fingertip is touching the plane is judged from the fingertip positions observed by both cameras. If the capture times differ too much, the two images no longer correspond to each other.
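The timing requirement above can be illustrated with a small pairing step. The sketch below (the function name and the 20 ms tolerance are our own illustration, not values from the patent) matches each left-camera frame to the nearest-in-time right-camera frame and drops pairs whose capture times differ too much to correspond to the same instant:

```python
# Sketch: pair frames from two cameras by closest capture timestamp.
# Frame streams are lists of (timestamp_seconds, frame) tuples.

def pair_frames(left, right, tolerance=0.02):
    """Greedily match each left frame to the nearest-in-time right frame.

    Pairs whose timestamps differ by more than `tolerance` seconds are
    dropped, since fingertip positions from the two views would no longer
    correspond to the same moment.
    """
    pairs = []
    j = 0
    for t_l, f_l in left:
        # advance j while the next right frame is closer in time
        while j + 1 < len(right) and abs(right[j + 1][0] - t_l) < abs(right[j][0] - t_l):
            j += 1
        t_r, f_r = right[j]
        if abs(t_r - t_l) <= tolerance:
            pairs.append((f_l, f_r))
    return pairs
```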
3. The computer processes the input images
The core of the present invention is the computer software module. It processes the images collected by the cameras, extracts the necessary human-machine interaction information, and then synthesizes the virtual touch-screen image. It comprises the following modules:
1) Virtual touch screen locating module
This module determines the spatial position of the virtual touch screen relative to the camera from the transformation between the reference object's coordinate system and the camera's coordinate system. Once the virtual touch screen is located, the system knows the correspondence between points on the virtual touch screen and points in the camera image, and thus the position and angle at which the virtual touch-screen image should be composited into the real-world image.
ARToolKit is an open-source augmented reality library originally released by Hirokazu Kato and colleagues. The system uses ARToolKit for the augmented-reality computations; any similar open-source library could be substituted. Given the size and orientation of the reference object in the image, the ARToolKit library function arGetTransMat yields the transformation matrix Tm from the marker coordinate system to the camera coordinate system, and the function gluProject of the companion graphics library OpenGL then computes the points in the camera image corresponding to the four corners of the rectangular virtual touch screen.
Having obtained the position of the virtual touch-screen plane, the system knows where the screen lies in the captured image. This region is cut out as the sensitive region (the region that concerns the fingertip location and motion detection module) and processed further for fingertip recognition, while the rest of the image is discarded.
2) Fingertip location and motion detection module
This module first performs skin detection and segments the hand region, then obtains the position of a fingertip (e.g. the index finger) in the camera image using the morphological methods of computer image processing. Next, from the position of the virtual touch screen, it computes the finger position on the screen and stabilizes it with a motion-tracking algorithm. Finally, it uses a projective-geometry method to judge whether the fingertip is in contact with the virtual touch screen. The module divides into the following submodules:
i) Skin detection submodule
This system uses a background-segmentation method based on the skin detection algorithm proposed by Wu Yueming et al. The algorithm rests on a statistical observation: when the distribution of hand colors in the YCbCr color space (Y luminance, Cb blue chroma, Cr red chroma) is projected onto the Cb-Cr plane, it clusters approximately within an elliptical region. Colors inside this elliptical region can therefore be judged to be skin color, yielding the skin-colored region, i.e. the region where the hand is.
Before this segmentation method is used, it must be tuned to the specific camera; once tuned, the data can be reused.
The R, G, B and Y, Cb, Cr values mentioned below are all normalized to the range 0-255.
First, each collected RGBA picture must be converted to the YCbCr color space. The conversion is given by formula 1:

$$\begin{bmatrix} Y \\ C_b \\ C_r \end{bmatrix} = \begin{bmatrix} 16 \\ 128 \\ 128 \end{bmatrix} + \begin{bmatrix} 0.263 & 0.5021 & 0.0975 \\ -0.1476 & -0.29 & 0.4375 \\ 0.4375 & -0.3663 & -0.0712 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix} \quad \text{(formula 1)}$$

Formula 1 is the normalized form of the conversion from the RGBA color space to the YCbCr color space.
Using the specific camera, pictures are collected, the hand regions in them are extracted, and their colors are accumulated to obtain the color distribution on the Cb-Cr plane. This distribution is enclosed approximately by an elliptical region, giving the semi-major axis length a, the semi-minor axis length b, the center coordinates (x_c, y_c), and the angle θ between the major axis and the x-axis, measured counter-clockwise.
This ellipse is normalized to a standard position; the corresponding transformations are shown in formulas 2 and 3:

$$M = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ -x_c\cos\theta - y_c\sin\theta & x_c\sin\theta - y_c\cos\theta & 1 \end{bmatrix} \quad \text{(formula 2)}$$

$$[x',\, y',\, 1] = [x,\, y,\, 1]\, M \quad \text{(formula 3)}$$

where M is the transformation matrix, (x, y) is a point in the original ellipse, and (x', y') is the normalized point.
A point (x', y') is judged to represent skin color when it lies inside the standard ellipse, i.e. when formula 4 holds:
$$\frac{x'^2}{a^2} + \frac{y'^2}{b^2} \le 1 \quad \text{(formula 4)}$$
After background segmentation, all regions close to skin color, i.e. the candidate hand regions, are obtained.
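A minimal sketch of the skin test of formulas 1-4: formula 1's matrix converts RGB to YCbCr, and the Cb-Cr point is then rotated into the ellipse frame and tested for membership. The default ellipse parameters used here are the ones quoted in embodiment 1; in practice they would be re-fitted for each camera:

```python
# Sketch of the elliptical Cb-Cr skin model (formulas 1-4). Matrix rows follow
# formula 1; the ellipse centre, semi-axes, and rotation defaults are the
# embodiment-1 values and are illustrative, not universal constants.

def rgb_to_ycbcr(r, g, b):
    """Formula 1: RGB (0-255) to YCbCr (0-255)."""
    y  = 16  + 0.263  * r + 0.5021 * g + 0.0975 * b
    cb = 128 - 0.1476 * r - 0.29   * g + 0.4375 * b
    cr = 128 + 0.4375 * r - 0.3663 * g - 0.0712 * b
    return y, cb, cr

def is_skin(cb, cr, xc=110.0, yc=120.5, a=23.478, b=11.434,
            cos_t=0.716, sin_t=0.699):
    """Formulas 2-4: translate/rotate (cb, cr) into the standard-ellipse
    frame and test membership."""
    dcb, dcr = cb - xc, cr - yc
    u = cos_t * dcb - sin_t * dcr   # normalized major-axis coordinate
    v = sin_t * dcb + cos_t * dcr   # normalized minor-axis coordinate
    return (u * u) / (a * a) + (v * v) / (b * b) <= 1.0
```

A gray input maps to neutral chroma (Cb ≈ Cr ≈ 128), which is why the second and third matrix rows each sum to approximately zero.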
ii) Submodule for detecting the fingertip position in the camera image
In this module the system obtains the fingertip's position in the camera image.
First, a connected-component algorithm is applied to the skin-colored area, and the largest connected component in the picture is judged to be the hand region.
After denoising this component, the finger contour is extracted and the finger area is filled in completely.
Then the fingertip location is determined from it, using a Takao-style algorithm: first compute the tilt angle of the hand component (the direction along which the skin pixels are distributed), then decide the finger base from where the hand approaches the border of the sensitive region, and finally find the point farthest from the finger base along the hand's tilt direction. That point is the fingertip.
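The farthest-point rule can be sketched as follows; the pixel set, base point, and tilt direction would come from the connected-component and border analysis described above (the function signature is our own illustration):

```python
# Sketch of the fingertip rule above: given the hand blob's pixels and an
# estimated finger-base point, take the blob pixel farthest from the base
# along the hand's tilt direction.

def fingertip(pixels, base, direction):
    """pixels: iterable of (x, y); base: (x, y); direction: unit (dx, dy).

    Returns the pixel with the maximum projection onto `direction`
    measured from `base`, i.e. the farthest point along the tilt.
    """
    bx, by = base
    dx, dy = direction
    return max(pixels, key=lambda p: (p[0] - bx) * dx + (p[1] - by) * dy)
```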
Note that for multi-touch, two fingers touching simultaneously are supported. During recognition the largest and second-largest connected components are taken as hand regions, and the fingertip on the left of the plane is taken as the left-hand fingertip, the one on the right as the right-hand fingertip.
A backup of the hand image is kept for use in later synthesis.
Once this stage is finished, the camera can be allowed to grab the next frame, making full use of the time.
iii) Submodule for computing the fingertip position on the virtual touch screen
Earlier, the virtual touch screen locating module obtained the four points in the camera image corresponding to the four corners of the virtual touch screen. In this module the system finds the fingertip's position on the virtual touch screen by an approximate projective transformation.
The transformation must map points on an irregular quadrilateral onto a rectangle.
A two-dimensional projective transformation is used. Suppose a point (X, Y) on the camera imaging plane corresponds to the point (x, y) on the virtual touch-screen plane; then (x, y) can be obtained from (X, Y) by the projective transformation. Here the virtual touch-screen region on the camera imaging plane is an irregular quadrilateral, while the touch-screen region on the virtual touch-screen plane is a rectangle.
Formula 5 below is the system of eight linear equations in eight unknowns, where (X1, Y1), (X2, Y2), (X3, Y3), (X4, Y4) are the four corresponding corner points in the camera image and (x1, y1), (x2, y2), (x3, y3), (x4, y4) are the four corner points of the virtual touch screen.
$$\begin{bmatrix}
X_1 & Y_1 & 1 & 0 & 0 & 0 & -x_1 Y_1 & -x_1 \\
X_2 & Y_2 & 1 & 0 & 0 & 0 & -x_2 Y_2 & -x_2 \\
X_3 & Y_3 & 1 & 0 & 0 & 0 & -x_3 Y_3 & -x_3 \\
X_4 & Y_4 & 1 & 0 & 0 & 0 & -x_4 Y_4 & -x_4 \\
0 & 0 & 0 & X_1 & Y_1 & 1 & -y_1 Y_1 & -y_1 \\
0 & 0 & 0 & X_2 & Y_2 & 1 & -y_2 Y_2 & -y_2 \\
0 & 0 & 0 & X_3 & Y_3 & 1 & -y_3 Y_3 & -y_3 \\
0 & 0 & 0 & X_4 & Y_4 & 1 & -y_4 Y_4 & -y_4
\end{bmatrix}
\begin{bmatrix} a_1 \\ a_2 \\ a_3 \\ b_1 \\ b_2 \\ b_3 \\ c_2 \\ c_3 \end{bmatrix}
=
\begin{bmatrix} x_1 X_1 \\ x_2 X_2 \\ x_3 X_3 \\ x_4 X_4 \\ y_1 X_1 \\ y_2 X_2 \\ y_3 X_3 \\ y_4 X_4 \end{bmatrix}
\quad \text{(formula 5)}$$
Substituting the four pairs of corresponding points, Gaussian elimination solves for the eight parameters a1, a2, a3, b1, b2, b3, c2, c3.
Afterwards, formulas 6 and 7 give the point on the rectangle corresponding to any point on the irregular quadrilateral.
$$x = \frac{a_1 X + a_2 Y + a_3}{X + c_2 Y + c_3} \quad \text{(formula 6)}$$

$$y = \frac{b_1 X + b_2 Y + b_3}{X + c_2 Y + c_3} \quad \text{(formula 7)}$$
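Formulas 5-7 can be exercised end to end in a short sketch: build the 8x8 system from the four corner correspondences, solve it by Gaussian elimination as the text describes, and map any image point onto the touch-screen rectangle. This is a plain-Python illustration under the patent's parameterization (coefficient of X in the denominator fixed to 1), not the patent's implementation:

```python
# Sketch of formulas 5-7: assemble the 8x8 linear system from four corner
# correspondences, solve it, then map image points to touch-screen points.

def solve_projective(quad, rect):
    """quad: four (X, Y) image corners; rect: the four matching (x, y)
    touch-screen corners. Returns (a1, a2, a3, b1, b2, b3, c2, c3)."""
    A, rhs = [], []
    for (X, Y), (x, y) in zip(quad, rect):       # x-equations (rows 1-4)
        A.append([X, Y, 1, 0, 0, 0, -x * Y, -x]); rhs.append(x * X)
    for (X, Y), (x, y) in zip(quad, rect):       # y-equations (rows 5-8)
        A.append([0, 0, 0, X, Y, 1, -y * Y, -y]); rhs.append(y * X)
    return gauss(A, rhs)

def gauss(A, b):
    """Gaussian elimination with partial pivoting (the solver for formula 5)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def project(params, X, Y):
    """Formulas 6 and 7: map an image point onto the touch-screen plane."""
    a1, a2, a3, b1, b2, b3, c2, c3 = params
    d = X + c2 * Y + c3
    return (a1 * X + a2 * Y + a3) / d, (b1 * X + b2 * Y + b3) / d
```

Solving the interpolation conditions exactly means the recovered parameters reproduce the four corner correspondences, and any interior fingertip position can then be mapped with `project`.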
Once the fingertip location (x, y) on the virtual touch-screen plane is obtained, the Condensation algorithm is used to track it, with the probability of each hypothesis determined by a normal distribution over the distance between the measured and estimated points.
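A minimal Condensation-style cycle, as a sketch of the tracking step just described (the patent uses OpenCV's implementation; the drift and sigma values here are illustrative, not values from the patent):

```python
import math
import random

# Minimal Condensation-style tracker for the fingertip position: diffuse
# the particles (motion model), weight each by a normal density of its
# distance to the measurement, resample, and average for the estimate.

def condensation_step(particles, measurement, drift_std=2.0, sigma=5.0,
                      rng=random):
    mx, my = measurement
    # predict: diffuse each particle with Gaussian noise
    moved = [(x + rng.gauss(0, drift_std), y + rng.gauss(0, drift_std))
             for x, y in particles]
    # weight: normal density of the particle-to-measurement distance
    weights = [math.exp(-((x - mx) ** 2 + (y - my) ** 2) / (2 * sigma ** 2))
               for x, y in moved]
    # resample proportionally to weight
    resampled = rng.choices(moved, weights=weights, k=len(moved))
    # state estimate: mean of the resampled set
    ex = sum(x for x, _ in resampled) / len(resampled)
    ey = sum(y for _, y in resampled) / len(resampled)
    return resampled, (ex, ey)
```

With a population of 500 (the size used in embodiment 1), repeated cycles keep the estimate concentrated near the measured fingertip while smoothing out per-frame noise.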
iv) Fingertip-virtual touch screen contact detection submodule
This module judges whether the fingertip is in contact with the virtual touch screen. The information it produces lets the system decide whether the fingertip is clicking, dragging, and so on.
Projective geometry is used to judge whether the fingertip touches the virtual screen plane. When the fingertip does not lie on the virtual plane, its projections onto that plane, as observed through the two imaging planes, do not coincide.
When the fingertip lies on the virtual plane, its projections onto the plane coincide.
By setting a distance threshold for the two projections and treating any distance below it as the user touching the plane, the system can tell whether the user's finger is clicking or dragging. The size of this threshold can be adjusted in practice according to the user's needs.
Because the two cameras' positions are not fixed, the distance between the two projections is not necessarily the same for the same finger-to-plane distance. The camera threshold is therefore exposed as a parameter the user can adjust, controlling how strict or lenient press detection is.
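The decision rule reduces to a distance test between the two projections; a sketch (the default threshold echoes the 10-pixel value used in embodiment 1, and is the user-tunable parameter described above):

```python
# Sketch of the contact rule: the fingertip is projected onto the virtual
# plane once per camera; if the two projections lie within a distance
# threshold, the finger is taken to be touching the plane.

def is_touching(proj_left, proj_right, threshold=10.0):
    """proj_left / proj_right: (x, y) fingertip projections on the virtual
    plane from the left and right cameras, in pixels."""
    dx = proj_left[0] - proj_right[0]
    dy = proj_left[1] - proj_right[1]
    return (dx * dx + dy * dy) ** 0.5 < threshold
```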
3) Virtual touch-screen video image synthesis module
This module synthesizes the corresponding virtual image from the previously obtained position of the virtual touch screen and the position and action information of the fingertip, providing appropriate operational feedback.
The camera's position and direction relative to the calibration reference object were obtained earlier. The ARToolKit library function argConvGlpara converts the camera lens parameters into a form handed to OpenGL for setting the camera position, which yields a correctly rendered 3D view.
From the fingertip information obtained earlier, the corresponding feedback image can be drawn.
Compositing the virtual and real-world images is a crucial step in augmented reality: only when the composition is correct will the user believe the virtual object really exists in the real world.
The composition is done in two parts:
i) Superimposing the virtual object on the background image
From the touch-operation information gathered above, the system draws the corresponding virtual touch-screen feedback and superimposes it on the real-world image captured by the camera.
ii) Superimposing the hand on top of the image
A major criterion by which the user judges whether the virtual touch screen is real is whether the positional relationship between the hand and the screen is represented correctly. The user's hand operates above the virtual touch screen, so the hand image saved earlier is superimposed on the very top of the picture.
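The two-step composition above can be sketched as ordinary RGBA "over" blending: virtual screen over camera background, then the saved hand layer over everything (the pixel format and helper names are our own illustration):

```python
# Sketch of the layered composition: alpha-blend the rendered touch-screen
# layer over the camera background, then blend the saved hand layer on top.
# Pixels are (r, g, b, a) tuples with 0-255 channels.

def over(src, dst):
    """Standard 'source over destination' for one RGBA pixel."""
    sa = src[3] / 255.0
    return tuple(int(src[i] * sa + dst[i] * (1 - sa)) for i in range(3)) + (255,)

def compose(background, screen_layer, hand_layer):
    """Per-pixel: background first, then virtual screen, then hand on top."""
    return [over(h, over(s, b))
            for b, s, h in zip(background, screen_layer, hand_layer)]
```

Keeping the hand layer last is what preserves the occlusion cue the text describes: wherever the hand pixel is opaque, it hides the virtual screen beneath it.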
4. Synthesizing the virtual object image and outputting it to the head-mounted display
The synthesized image is output to the head-mounted display, where the user can observe the virtual touch screen.
As the system runs continuously, the user watches a continuous stream of images; when the camera frame rate is high enough, this is indistinguishable from observing the real world.
Compared with the prior art, the beneficial effects of the present invention are:
1. It adopts a background-segmentation technique suited to augmented-reality environments, combined with a method that uses two cameras to map the finger position onto a plane in the augmented-reality world and sense its touch actions. This provides high adaptability while realizing, with low-cost equipment, the functions of an expensive multi-touch screen.
2. Because the finger position is mapped into the augmented-reality world, the virtual touch screen seen through the head-mounted display can be touched at the correct position, dispensing with a physical display and increasing the equipment's portability.
3. The viewing position and the touch position of the virtual touch screen are fixed relative to the reference object, so its physical location is invariant in space, which strengthens the sense of reality.
Description of drawings
Fig. 1 is a top view of the system architecture.
Fig. 2 is the overall flow chart of the system.
Reference numbers in the figures: 1 is the computer, 2 the head-mounted display, 3 the webcams, 4 the calibration reference object, and 5 the virtual touch screen.
Embodiment
The invention is further described below in conjunction with the accompanying drawings:
Embodiment 1: operating a Windows program on the virtual touch screen.
This embodiment shows how the present invention substitutes for an ordinary touch screen in operating a computer.
In use, the user wears the head-mounted display with the two webcams fixed on it, and the calibration reference object is set out. After the system on the computer is started, the user sees through the head-mounted display that a virtual touch screen showing the interface of a Windows program has appeared beside the reference object. When the user clicks this virtual touch screen with an index finger, the touch screen responds accordingly. For example, pressing an icon with the index finger and lifting it again completes a single click, selecting the icon; pressing an icon and moving the finger while keeping the fingertip on the plane of the virtual touch screen drags the icon to the corresponding position, and lifting the finger ends the drag.
Realizing this embodiment requires the following configuration:
1. Hardware
This embodiment uses two Logitech S5500 webcams (Fig. 1, item 3) for video input, an i-glasses head-mounted display (Fig. 1, item 2) for output, and a computer (Fig. 1, item 1) with an Intel Core 2 6300 1.86 GHz dual-core processor, 2 GB of memory, and an NVIDIA GeForce 7300GT graphics card for processing.
2. Initialization
Before concrete use, the ARToolKit program calib_camera2 is run to obtain the cameras' distortion parameters, which are then loaded directly at initialization. The data of the calibration reference object are read in; this embodiment uses ARToolKit's patt.sample2. The reference object is an 80 mm square paperboard sheet.
3. Real-time video collection with the cameras
The two cameras are 15 cm apart, with an angle of 40 degrees between their shooting directions.
4. Skin color recognition
For skin color recognition, formula 2 uses the following parameters:

$$x_c = 110.0, \quad y_c = 120.5 \quad \text{(formula 8)}$$
With the ellipse normalized, the corresponding simplified transformations are formulas 9 and 10:

$$c_b' = 0.716\,c_b - 0.699\,c_r + (-0.716\,x_c + 0.699\,y_c) \quad \text{(formula 9)}$$

$$c_r' = 0.699\,c_b + 0.716\,c_r - (0.699\,x_c + 0.716\,y_c) \quad \text{(formula 10)}$$

where (c_b, c_r) are the concrete values of (x, y) in formulas 3 and 4 above.
A point (c_b', c_r') is judged to represent skin color when it lies inside the standard ellipse with semi-major axis 23.478 and semi-minor axis 11.434, see formula 11:

$$\frac{c_b'^2}{23.478^2} + \frac{c_r'^2}{11.434^2} \le 1 \quad \text{(formula 11)}$$
In addition, this embodiment also limits the brightness of the skin-colored region: colors of too low brightness are not considered skin. Tuned by testing, a threshold of 30 is set for the Y value of the YCbCr color space; below this value the color is not considered skin. It can be adjusted in actual use according to the camera.
5. Fingertip location and contact detection
The hand contour processing steps are all implemented with library functions of the computer-vision library OpenCV.
When OpenCV's Condensation functions are used for tracking, the particle population is 500, and each measurement makes only one prediction.
Tuned by testing, the distance threshold for judging whether the fingertip has clicked the virtual touch screen is set to 10 pixels in this embodiment. A distance below 10 pixels is treated as the left mouse button being pressed; above 10 pixels, as the left mouse button being released. By overriding the Windows system's mouse events, operations on the virtual touch screen are mapped onto Windows mouse actions, letting the user operate Windows programs directly on the virtual touch screen.
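The press/release mapping can be sketched as a small state machine over the per-frame projection distances; the event names below are placeholders for whatever mouse-event API the host system exposes:

```python
# Sketch of the click mapping in embodiment 1: projection distance dropping
# below 10 px -> left-button press; rising back above 10 px -> release.

def mouse_events(distances, threshold=10.0):
    """Turn a sequence of per-frame projection distances into press/release
    events, emitting an event only when the touching state changes."""
    events, touching = [], False
    for i, d in enumerate(distances):
        if not touching and d < threshold:
            events.append((i, "left_down"))
            touching = True
        elif touching and d >= threshold:
            events.append((i, "left_up"))
            touching = False
    return events
```

A press immediately followed by a release maps to a click; a press held across frames while the fingertip position moves maps to a drag.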
6. Synthesizing the virtual touch-screen video image
When the virtual touch-screen video image is synthesized, the Windows window contents are captured and projected onto the virtual touch screen, so the user directly sees the Windows program's operational feedback on the virtual touch screen.
7. Output to the head-mounted display
Embodiment 2: multi-touch photo browsing on the virtual touch screen
This embodiment shows the multi-touch function of the present invention.
After the user puts on the equipment and starts the computer system, multiple pictures are shown on the virtual touch screen. When the user drags left or right across the center of a photo with the right index finger, the photos switch; when the user simultaneously drags the upper-left and lower-right corners of a photo with the left and right index fingers, the photo zooms according to the positions of the two fingers.
To implement this embodiment, during fingertip recognition the largest and second-largest connected components are taken as hand regions, the fingertip on the left of the plane being the left-hand fingertip and the one on the right the right-hand fingertip.
For each photo, a square region at its center is set as the drag region; when the user drags left or right in this region, the previous or next photo is switched to and drawn anew on the virtual touch screen. The upper-left and lower-right corners of the photo are set as zoom control regions; when the user's left and right index fingers drag the two corners simultaneously, the photo is redrawn on the virtual touch screen, giving the user real-time feedback.
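The corner-drag zoom reduces to the ratio of fingertip separations; a sketch with illustrative names (the patent describes the gesture but not this exact formula):

```python
# Sketch of the two-finger zoom in embodiment 2: take the photo's scale
# factor as the ratio of the current to the initial distance between the
# two fingertips holding its opposite corners.

def zoom_factor(corner_a0, corner_b0, corner_a1, corner_b1):
    """Initial and current (x, y) positions of the two dragged corners
    -> scale factor to apply when redrawing the photo."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    return dist(corner_a1, corner_b1) / dist(corner_a0, corner_b0)
```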

Claims (2)

1. A visual, operable and non-solid touch screen system, characterized by comprising a computer, two webcams, a head-mounted display, and a calibration reference object, the reference object calibrating a virtual touch screen for the user's hand to touch; the two webcams are connected to the computer, and the computer is in turn connected to the head-mounted display; the head-mounted display is worn in front of the user's eyes, and the two webcams are mounted above its left and right sides to capture the user's hand and the reference object in real time as video; the rays along the two cameras' shooting directions intersect at an angle of 30 to 90 degrees in the same horizontal plane; the captured video frames are sent to the computer; the computer is provided with a software system that computes the current camera pose from the image of the reference object in each frame, thereby determining the position of the virtual touch screen, and computes the hand position from the image of the hand in the frame, deciding whether the hand is in contact with the virtual touch screen; using this information, the software system then synthesizes an image; the synthesized image is output through the head-mounted display, and the user directly observes the feedback of the virtual touch screen through it.
2. The touch-screen system according to claim 1, characterized in that the software system consists of the following software modules:
1) a virtual touch screen locating module
which determines the spatial position of the virtual touch screen relative to the camera from the transformation between the reference object's coordinate system and the camera's coordinate system; once the virtual touch screen is located, the system obtains the correspondence between points on the virtual touch screen and points in the camera image, including the four points in the camera image corresponding to the four corners of the virtual touch screen, and thus the position and angle at which the virtual touch-screen image is composited into the real-world image;
2) a fingertip location and motion detection module
which first performs skin detection and segments the hand region, then obtains the fingertip's position in the camera image with the morphological methods of computer image processing; next, from the position of the virtual touch screen, computes the finger position on it and stabilizes it with a motion-tracking algorithm; and finally uses a projective-geometry method to judge whether the fingertip is in contact with the virtual touch screen; this module divides into the following submodules:
i) a skin detection submodule
which uses a background-segmentation method based on a skin detection algorithm resting on the statistical observation that when the distribution of hand colors in the YCbCr color space is projected onto the Cb-Cr plane it clusters approximately within an elliptical region; colors inside this elliptical region are thus judged to be skin color, yielding the skin-colored region, i.e. the region where the hand is;
ii) Fingertip position detection submodule (camera image)
In this submodule, the system locates the fingertip in the camera image.
First, a connected-component algorithm is applied to the skin-colored area, and the largest connected component in the frame is taken to be the hand region.
After denoising this component, the finger contour is extracted and the finger region is filled in completely.
The fingertip is then located with an improved Takao algorithm: first compute the inclination angle of the hand component; then determine the base of the finger from how closely the hand approaches the border of the sensitive region; finally, search from the finger base along the hand's inclination direction for the farthest point. That farthest point is the fingertip.
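The two geometric steps above — largest connected component, then farthest point along the inclination direction — can be sketched as follows. This is a pure-Python illustration, not the patent's implementation: the denoising and contour-filling steps are omitted, and the finger base and inclination direction are assumed to be already computed.

```python
from collections import deque

def largest_component(mask):
    """Find the largest 4-connected component of a binary mask
    (a list of rows of 0/1); this is taken to be the hand region."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not seen[sy][sx]:
                comp, q = [], deque([(sy, sx)])
                seen[sy][sx] = True
                while q:                      # breadth-first flood fill
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    return best

def fingertip(component, base, direction):
    """Fingertip = the component pixel farthest from the finger base
    along the hand's inclination direction (a unit-ish vector)."""
    return max(component,
               key=lambda p: (p[0] - base[0]) * direction[0]
                           + (p[1] - base[1]) * direction[1])
```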
iii) Submodule for computing the fingertip position on the virtual touch screen
In this submodule, the system finds the fingertip's position on the virtual touch screen by an approximate projective transformation.
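Applying such a projective transformation amounts to multiplying by a 3x3 homography and dehomogenizing. In practice the matrix `H` would be fitted from the four corner correspondences established by the locating module (e.g., OpenCV's `getPerspectiveTransform` does exactly this); the identity matrix below is only a placeholder for the test.

```python
def map_to_screen(H, u, v):
    """Map an image point (u, v) to virtual-screen coordinates via the
    3x3 homography H (the approximate projective transformation)."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)  # dehomogenize
```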
iv) Fingertip/virtual-touch-screen contact detection submodule
This submodule judges whether the fingertip is in contact with the virtual touch screen; the resulting information lets the system recognize operations such as clicking and dragging.
A projective-geometry method is used to decide whether the fingertip touches the virtual screen plane: when the fingertip does not lie on the virtual plane, its projections onto that plane, as observed through the two imaging planes, do not coincide; when the fingertip lies on the virtual plane, the projections coincide.
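The coincidence test above can be sketched by intersecting each camera's viewing ray with the virtual plane and comparing the two intersection points. This assumes the two camera centers and the viewing directions toward the observed fingertip are known from calibration; the tolerance value is illustrative.

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect the viewing ray origin + t*direction with the
    virtual screen plane given by a point and a normal."""
    denom = sum(direction[i] * plane_normal[i] for i in range(3))
    t = sum((plane_point[i] - origin[i]) * plane_normal[i]
            for i in range(3)) / denom
    return tuple(origin[i] + t * direction[i] for i in range(3))

def touching(cam1, dir1, cam2, dir2, plane_point, plane_normal, tol=0.005):
    """The fingertip is judged to touch the virtual plane when its
    projections seen from the two cameras land on (nearly) the same
    point of the plane."""
    p1 = ray_plane_intersection(cam1, dir1, plane_point, plane_normal)
    p2 = ray_plane_intersection(cam2, dir2, plane_point, plane_normal)
    dist = sum((a - b) ** 2 for a, b in zip(p1, p2)) ** 0.5
    return dist < tol
```

A fingertip lying on the plane yields coincident projections from both cameras; a fingertip hovering in front of it yields two displaced projections, so the test returns False.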
3) Virtual touch screen video image synthesis module
Based on the virtual touch screen position obtained above and the fingertip position and action information, this module synthesizes the corresponding virtual image to provide appropriate operational feedback. It consists of two parts:
i) Superimposing the virtual object on the background image
Based on the touch-operation information gathered above, the system renders the corresponding virtual touch screen feedback and superimposes it on the real-world image captured by the camera.
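The superimposition step can be sketched as a per-pixel alpha blend of the rendered feedback over the camera frame. The patent does not specify a blending method, so the alpha blend below is an assumption; frames are modeled as nested lists of RGB tuples for illustration.

```python
def composite(background, overlay, alpha_mask):
    """Blend the rendered virtual-screen feedback over the camera frame:
    out = alpha*overlay + (1-alpha)*background, per pixel per channel."""
    out = []
    for b_row, o_row, a_row in zip(background, overlay, alpha_mask):
        out.append([tuple(int(a * o + (1 - a) * b)
                          for b, o in zip(bp, op))
                    for bp, op, a in zip(b_row, o_row, a_row)])
    return out
```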
ii) Superimposing the hand on the topmost layer of the image
Finally, the synthesized image is output to the head-mounted display, through which the user observes the virtual touch screen.
CN2009101972052A 2009-10-15 2009-10-15 Visual, operable and non-solid touch screen system Expired - Fee Related CN101673161B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009101972052A CN101673161B (en) 2009-10-15 2009-10-15 Visual, operable and non-solid touch screen system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009101972052A CN101673161B (en) 2009-10-15 2009-10-15 Visual, operable and non-solid touch screen system

Publications (2)

Publication Number Publication Date
CN101673161A true CN101673161A (en) 2010-03-17
CN101673161B CN101673161B (en) 2011-12-07

Family

ID=42020403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009101972052A Expired - Fee Related CN101673161B (en) 2009-10-15 2009-10-15 Visual, operable and non-solid touch screen system

Country Status (1)

Country Link
CN (1) CN101673161B (en)

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101924767A (en) * 2010-08-23 2010-12-22 北京启动在线文化娱乐有限公司 Human-computer interaction-based network group system
CN101950322A (en) * 2010-08-23 2011-01-19 北京启动在线文化娱乐有限公司 Man-machine interaction network system for realizing point-to-point connection
CN102306053A (en) * 2011-08-29 2012-01-04 Tcl集团股份有限公司 Virtual touch screen-based man-machine interaction method and device and electronic equipment
CN102521819A (en) * 2011-12-06 2012-06-27 无锡海森诺科技有限公司 Optical touch image processing method
CN102591449A (en) * 2010-10-27 2012-07-18 微软公司 Low-latency fusing of virtual and real content
CN102722309A (en) * 2011-03-30 2012-10-10 中国科学院软件研究所 Method for identifying touch-control information of touch gestures in multi-point touch interaction system
CN103019377A (en) * 2012-12-04 2013-04-03 天津大学 Head-mounted visual display equipment-based input method and device
CN103270456A (en) * 2011-10-20 2013-08-28 松下电器产业株式会社 Display device and display system
CN103309226A (en) * 2013-06-09 2013-09-18 深圳先进技术研究院 Smart watch assorted with smart glasses
CN103336575A (en) * 2013-06-27 2013-10-02 深圳先进技术研究院 Man-machine interaction intelligent glasses system and interaction method
WO2013143409A1 (en) * 2012-03-29 2013-10-03 华为终端有限公司 Three-dimensional display-based curser operation method and mobile terminal
CN103365408A (en) * 2012-04-10 2013-10-23 深圳泰山在线科技有限公司 Feedback method and feedback system of virtual screen
CN103389793A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Human-computer interaction method and human-computer interaction system
CN103472919A (en) * 2013-09-12 2013-12-25 深圳先进技术研究院 Intelligent glasses system for image display and use method
CN103534665A (en) * 2011-04-04 2014-01-22 英特尔公司 Keyboard avatar for heads up display (hud)
CN103530061A (en) * 2013-10-31 2014-01-22 京东方科技集团股份有限公司 Display device, control method, gesture recognition method and head-mounted display device
CN103559809A (en) * 2013-11-06 2014-02-05 常州文武信息科技有限公司 Computer-based on-site interaction demonstration system
CN103713737A (en) * 2013-12-12 2014-04-09 中国科学院深圳先进技术研究院 Virtual keyboard system used for Google glasses
CN103914128A (en) * 2012-12-31 2014-07-09 联想(北京)有限公司 Head mounted electronic device and input method
CN104063092A (en) * 2014-06-16 2014-09-24 青岛歌尔声学科技有限公司 Method and device for controlling touch screen
CN104238665A (en) * 2013-06-24 2014-12-24 佳能株式会社 Image processing apparatus and image processing method
CN104714646A (en) * 2015-03-25 2015-06-17 中山大学 3D virtual touch control man-machine interaction method based on stereoscopic vision
CN104820492A (en) * 2015-04-23 2015-08-05 济南大学 Three-dimensional haptic system
US9122053B2 (en) 2010-10-15 2015-09-01 Microsoft Technology Licensing, Llc Realistic occlusion for a head mounted augmented reality display
CN104951059A (en) * 2014-03-31 2015-09-30 联想(北京)有限公司 Data processing method and device and electronic equipment
CN105867611A (en) * 2015-12-29 2016-08-17 乐视致新电子科技(天津)有限公司 Space positioning method, device and system in virtual reality system
CN106055108A (en) * 2016-06-10 2016-10-26 北京行云时空科技有限公司 Method and system for operating and controlling virtual touch screen
CN106502420A (en) * 2016-11-14 2017-03-15 北京视据科技有限公司 Based on the virtual key triggering method that image aberration is recognized
CN106708256A (en) * 2016-11-14 2017-05-24 北京视据科技有限公司 Opencv and easyar based virtual key trigger method
CN106873768A (en) * 2016-12-30 2017-06-20 中兴通讯股份有限公司 A kind of augmented reality method, apparatus and system
CN106951087A (en) * 2017-03-27 2017-07-14 联想(北京)有限公司 A kind of exchange method and device based on virtual interacting plane
CN107710132A (en) * 2015-05-15 2018-02-16 阿斯尔公司 It is used for the method and apparatus of the free space input of surface restraint control for applying
CN107743604A (en) * 2015-09-16 2018-02-27 谷歌有限责任公司 Touch-screen hovering detection in enhancing and/or reality environment
CN107943351A (en) * 2017-11-22 2018-04-20 苏州佳世达光电有限公司 Touch identifying system and method in perspective plane
CN109085931A (en) * 2018-07-25 2018-12-25 南京禹步信息科技有限公司 A kind of interactive input method, device and storage medium that actual situation combines
CN110291576A (en) * 2016-12-23 2019-09-27 瑞欧威尔股份有限公司 The hands-free navigation of operating system based on touch
CN110428468A (en) * 2019-08-12 2019-11-08 北京字节跳动网络技术有限公司 A kind of the position coordinates generation system and method for wearable display equipment
CN110944139A (en) * 2019-11-29 2020-03-31 维沃移动通信有限公司 Display control method and electronic equipment
CN112306353A (en) * 2020-10-27 2021-02-02 北京京东方光电科技有限公司 Augmented reality device and interaction method thereof
CN112840379A (en) * 2018-10-15 2021-05-25 索尼公司 Information processing apparatus, information processing method, and program
CN112866672A (en) * 2020-12-30 2021-05-28 深圳卡乐星球数字娱乐有限公司 Augmented reality system and method for immersive cultural entertainment
US11340465B2 (en) 2016-12-23 2022-05-24 Realwear, Inc. Head-mounted display with modular components
CN114581535A (en) * 2022-03-03 2022-06-03 北京深光科技有限公司 Method, device, storage medium and equipment for marking key points of user bones in image
US11507216B2 (en) 2016-12-23 2022-11-22 Realwear, Inc. Customizing user interfaces of binary applications

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104866103B (en) * 2015-06-01 2019-12-24 联想(北京)有限公司 Relative position determining method, wearable electronic device and terminal device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1912816A * 2005-08-08 2007-02-14 北京理工大学 Virtual touch screen system based on camera
CN101295442A (en) * 2008-06-17 2008-10-29 上海沪江虚拟制造技术有限公司 Non-contact stereo display virtual teaching system

Cited By (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101950322A (en) * 2010-08-23 2011-01-19 北京启动在线文化娱乐有限公司 Man-machine interaction network system for realizing point-to-point connection
CN101924767A (en) * 2010-08-23 2010-12-22 北京启动在线文化娱乐有限公司 Human-computer interaction-based network group system
US9122053B2 (en) 2010-10-15 2015-09-01 Microsoft Technology Licensing, Llc Realistic occlusion for a head mounted augmented reality display
US9348141B2 (en) 2010-10-27 2016-05-24 Microsoft Technology Licensing, Llc Low-latency fusing of virtual and real content
CN102591449A (en) * 2010-10-27 2012-07-18 微软公司 Low-latency fusing of virtual and real content
US9710973B2 (en) 2010-10-27 2017-07-18 Microsoft Technology Licensing, Llc Low-latency fusing of virtual and real content
CN102722309A (en) * 2011-03-30 2012-10-10 中国科学院软件研究所 Method for identifying touch-control information of touch gestures in multi-point touch interaction system
CN102722309B (en) * 2011-03-30 2014-09-24 中国科学院软件研究所 Method for identifying touch-control information of touch gestures in multi-point touch interaction system
CN103534665A (en) * 2011-04-04 2014-01-22 英特尔公司 Keyboard avatar for heads up display (hud)
CN102306053A (en) * 2011-08-29 2012-01-04 Tcl集团股份有限公司 Virtual touch screen-based man-machine interaction method and device and electronic equipment
CN102306053B (en) * 2011-08-29 2014-09-10 Tcl集团股份有限公司 Virtual touch screen-based man-machine interaction method and device and electronic equipment
US9389421B2 (en) 2011-10-20 2016-07-12 Panasonic Intellectual Property Management Co., Ltd. Display device and display system
CN103270456A (en) * 2011-10-20 2013-08-28 松下电器产业株式会社 Display device and display system
CN102521819A (en) * 2011-12-06 2012-06-27 无锡海森诺科技有限公司 Optical touch image processing method
WO2013143409A1 (en) * 2012-03-29 2013-10-03 华为终端有限公司 Three-dimensional display-based curser operation method and mobile terminal
CN103365408A (en) * 2012-04-10 2013-10-23 深圳泰山在线科技有限公司 Feedback method and feedback system of virtual screen
CN103389793A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Human-computer interaction method and human-computer interaction system
CN103019377A (en) * 2012-12-04 2013-04-03 天津大学 Head-mounted visual display equipment-based input method and device
CN103914128A (en) * 2012-12-31 2014-07-09 联想(北京)有限公司 Head mounted electronic device and input method
CN103914128B (en) * 2012-12-31 2017-12-29 联想(北京)有限公司 Wear-type electronic equipment and input method
CN103309226B (en) * 2013-06-09 2016-05-11 深圳先进技术研究院 The intelligent watch that coordinates intelligent glasses to use
CN103309226A (en) * 2013-06-09 2013-09-18 深圳先进技术研究院 Smart watch assorted with smart glasses
CN104238665B (en) * 2013-06-24 2017-12-22 佳能株式会社 Image processing apparatus and image processing method
US9684169B2 (en) 2013-06-24 2017-06-20 Canon Kabushiki Kaisha Image processing apparatus and image processing method for viewpoint determination
CN104238665A (en) * 2013-06-24 2014-12-24 佳能株式会社 Image processing apparatus and image processing method
CN103336575A (en) * 2013-06-27 2013-10-02 深圳先进技术研究院 Man-machine interaction intelligent glasses system and interaction method
CN103336575B (en) * 2013-06-27 2016-06-29 深圳先进技术研究院 The intelligent glasses system of a kind of man-machine interaction and exchange method
CN103472919B (en) * 2013-09-12 2017-01-25 深圳先进技术研究院 Intelligent glasses system for image display and use method
CN103472919A (en) * 2013-09-12 2013-12-25 深圳先进技术研究院 Intelligent glasses system for image display and use method
CN103530061A (en) * 2013-10-31 2014-01-22 京东方科技集团股份有限公司 Display device, control method, gesture recognition method and head-mounted display device
US10203760B2 (en) 2013-10-31 2019-02-12 Boe Technology Group Co., Ltd. Display device and control method thereof, gesture recognition method, and head-mounted display device
WO2015062247A1 (en) * 2013-10-31 2015-05-07 京东方科技集团股份有限公司 Display device and control method therefor, gesture recognition method and head-mounted display device
CN103559809A (en) * 2013-11-06 2014-02-05 常州文武信息科技有限公司 Computer-based on-site interaction demonstration system
CN103559809B (en) * 2013-11-06 2017-02-08 常州文武信息科技有限公司 Computer-based on-site interaction demonstration system
CN103713737B (en) * 2013-12-12 2017-01-11 中国科学院深圳先进技术研究院 Virtual keyboard system used for Google glasses
CN103713737A (en) * 2013-12-12 2014-04-09 中国科学院深圳先进技术研究院 Virtual keyboard system used for Google glasses
CN104951059B (en) * 2014-03-31 2018-08-10 联想(北京)有限公司 A kind of data processing method, device and a kind of electronic equipment
CN104951059A (en) * 2014-03-31 2015-09-30 联想(北京)有限公司 Data processing method and device and electronic equipment
CN104063092B (en) * 2014-06-16 2016-12-07 青岛歌尔声学科技有限公司 A kind of touch screen control method and device
WO2015192763A1 (en) * 2014-06-16 2015-12-23 青岛歌尔声学科技有限公司 Touch screen control method and device
CN104063092A (en) * 2014-06-16 2014-09-24 青岛歌尔声学科技有限公司 Method and device for controlling touch screen
US9823779B2 (en) 2014-06-16 2017-11-21 Qingdao Goertek Technology Co., Ltd. Method and device for controlling a head-mounted display by a terminal device
CN104714646A (en) * 2015-03-25 2015-06-17 中山大学 3D virtual touch control man-machine interaction method based on stereoscopic vision
CN104820492A (en) * 2015-04-23 2015-08-05 济南大学 Three-dimensional haptic system
CN107710132B (en) * 2015-05-15 2021-11-02 阿斯尔公司 Method and apparatus for applying free space input for surface constrained control
CN107710132A (en) * 2015-05-15 2018-02-16 阿斯尔公司 It is used for the method and apparatus of the free space input of surface restraint control for applying
CN107743604A (en) * 2015-09-16 2018-02-27 谷歌有限责任公司 Touch-screen hovering detection in enhancing and/or reality environment
CN105867611A (en) * 2015-12-29 2016-08-17 乐视致新电子科技(天津)有限公司 Space positioning method, device and system in virtual reality system
CN106055108A (en) * 2016-06-10 2016-10-26 北京行云时空科技有限公司 Method and system for operating and controlling virtual touch screen
CN106502420A (en) * 2016-11-14 2017-03-15 北京视据科技有限公司 Based on the virtual key triggering method that image aberration is recognized
CN106708256B (en) * 2016-11-14 2018-05-25 北京视据科技有限公司 virtual key triggering method based on opencv and easyar
CN106708256A (en) * 2016-11-14 2017-05-24 北京视据科技有限公司 Opencv and easyar based virtual key trigger method
CN110291576A (en) * 2016-12-23 2019-09-27 瑞欧威尔股份有限公司 The hands-free navigation of operating system based on touch
US11947752B2 (en) 2016-12-23 2024-04-02 Realwear, Inc. Customizing user interfaces of binary applications
US11507216B2 (en) 2016-12-23 2022-11-22 Realwear, Inc. Customizing user interfaces of binary applications
US11409497B2 (en) 2016-12-23 2022-08-09 Realwear, Inc. Hands-free navigation of touch-based operating systems
US11340465B2 (en) 2016-12-23 2022-05-24 Realwear, Inc. Head-mounted display with modular components
CN106873768A (en) * 2016-12-30 2017-06-20 中兴通讯股份有限公司 A kind of augmented reality method, apparatus and system
CN106873768B (en) * 2016-12-30 2020-05-05 中兴通讯股份有限公司 Augmented reality method, device and system
CN106951087A (en) * 2017-03-27 2017-07-14 联想(北京)有限公司 A kind of exchange method and device based on virtual interacting plane
CN106951087B (en) * 2017-03-27 2020-02-21 联想(北京)有限公司 Interaction method and device based on virtual interaction plane
CN107943351A (en) * 2017-11-22 2018-04-20 苏州佳世达光电有限公司 Touch identifying system and method in perspective plane
CN109085931A (en) * 2018-07-25 2018-12-25 南京禹步信息科技有限公司 A kind of interactive input method, device and storage medium that actual situation combines
CN112840379A (en) * 2018-10-15 2021-05-25 索尼公司 Information processing apparatus, information processing method, and program
CN110428468A (en) * 2019-08-12 2019-11-08 北京字节跳动网络技术有限公司 A kind of the position coordinates generation system and method for wearable display equipment
CN110944139A (en) * 2019-11-29 2020-03-31 维沃移动通信有限公司 Display control method and electronic equipment
CN112306353B (en) * 2020-10-27 2022-06-24 北京京东方光电科技有限公司 Augmented reality device and interaction method thereof
CN112306353A (en) * 2020-10-27 2021-02-02 北京京东方光电科技有限公司 Augmented reality device and interaction method thereof
CN112866672A (en) * 2020-12-30 2021-05-28 深圳卡乐星球数字娱乐有限公司 Augmented reality system and method for immersive cultural entertainment
CN114581535A (en) * 2022-03-03 2022-06-03 北京深光科技有限公司 Method, device, storage medium and equipment for marking key points of user bones in image
CN114581535B (en) * 2022-03-03 2023-04-18 北京深光科技有限公司 Method, device, storage medium and equipment for marking key points of user bones in image

Also Published As

Publication number Publication date
CN101673161B (en) 2011-12-07

Similar Documents

Publication Publication Date Title
CN101673161B (en) Visual, operable and non-solid touch screen system
US9651782B2 (en) Wearable tracking device
CN102662577B (en) A kind of cursor operating method based on three dimensional display and mobile terminal
US8643598B2 (en) Image processing apparatus and method, and program therefor
CN104199550B (en) Virtual keyboard operation device, system and method
US9904372B2 (en) Method by which eyeglass-type display device recognizes and inputs movement
CN105487673A (en) Man-machine interactive system, method and device
WO2021213067A1 (en) Object display method and apparatus, device and storage medium
US9442571B2 (en) Control method for generating control instruction based on motion parameter of hand and electronic device using the control method
CN104813258A (en) Data input device
WO1999040562A1 (en) Video camera computer touch screen system
CN108027656B (en) Input device, input method, and program
WO2021097600A1 (en) Inter-air interaction method and apparatus, and device
WO2011152634A2 (en) Monitor-based augmented reality system
TW202025719A (en) Method, apparatus and electronic device for image processing and storage medium thereof
CN104460951A (en) Human-computer interaction method
CN108027655A (en) Information processing system, information processing equipment, control method and program
CN104199548B (en) A kind of three-dimensional man-machine interactive operation device, system and method
CN112657176A (en) Binocular projection man-machine interaction method combined with portrait behavior information
EP3617851B1 (en) Information processing device, information processing method, and recording medium
JP2014029656A (en) Image processor and image processing method
CN109947243A (en) Based on the capture of intelligent electronic device gesture and identification technology for touching hand detection
WO2024012268A1 (en) Virtual operation method and apparatus, electronic device, and readable storage medium
JP3860560B2 (en) Display interface method and apparatus
JP6971788B2 (en) Screen display control method and screen display control system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20111207

Termination date: 20141015

EXPY Termination of patent right or utility model