Summary of the invention
The technical problem to be solved by this invention is to remedy the above deficiencies in the prior art by proposing a positioning interaction method and system that realize bare-hand interaction on an ordinary projection screen, without relying on a dedicated interactive device or a dedicated electronic whiteboard.
The technical problem of the present invention is solved by the following technical scheme:
A positioning interaction method comprises the following steps: 1) forming a projection scene by projection from a projection system, and capturing, with a camera system, an image of an interaction object and of the interaction object's shadow in the projection scene; 2) processing the image of the interaction object and its shadow to obtain the position information of the tip of the interaction object, according to the following steps: 2-1) detecting the distance in the image between the tip of the interaction object and the shadow of the tip; 2-2) judging whether the distance is greater than a set value: if it is greater than the set value, calculating the three-dimensional position information of the tip; if it is less than the set value, calculating the two-dimensional position information of the tip; 3) converting the position information obtained in step 2) into projection information and projecting it onto the screen area.
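As an illustrative sketch only (not part of the claimed method), the touch/hover decision of steps 2-1) and 2-2) can be expressed in Python. The function name, the coordinate convention, and the constants are assumptions; the z computation uses the similar-triangle relation described later in the embodiment.

```python
def locate_tip(tip_xy, shadow_xy, threshold, L, w):
    """Decide touch vs hover from the image-plane distance between the
    interaction-object tip and its shadow (steps 2-1 and 2-2).

    tip_xy, shadow_xy: (x, y) image coordinates of tip and tip shadow.
    threshold: the empirically chosen set value.
    L: vertical distance from the projection system to the screen.
    w: distance between the projection system and the camera system.
    """
    y, y1 = tip_xy[1], shadow_xy[1]
    dist = abs(y1 - y)                 # tip-to-shadow offset along y
    if dist > threshold:
        # Tip is off the screen: compute a 3D position; z from the
        # similar-triangle geometry of the embodiment.
        z = L * dist / (w + dist)
        return (tip_xy[0], y, z)
    # Tip touches the screen: 2D position suffices (z = 0).
    return (tip_xy[0], y, 0.0)
```

For instance, a tip-to-shadow offset well above the threshold yields a positive z (hovering), while an offset below it is treated as a touch at z = 0.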
The technical problem of the present invention is further solved by the following technical scheme:
A positioning interaction system comprises a projection system (100), a camera system (200) and a processing system (300). The projection system (100) projects projection information onto a screen area to form a projection scene. The camera system (200) captures an image of the interaction object and of the interaction object's shadow in the projection scene. The processing system (300) receives the image of the interaction object and its shadow captured by the camera system (200), obtains the position information of the tip of the interaction object by processing the image according to step 2) described above, converts the obtained position information into projection information, and outputs it to the projection system.
Compared with the prior art, the beneficial effects of the present invention are:
In the positioning interaction method and system of the present invention, the camera system captures an image of the interaction object and its shadow in the projection scene, and the processing system processes that image to obtain the position information of the tip; the tip position is then converted into projection information and output to the projection screen area, realizing the interaction. Throughout the positioning interaction process the user does not need to hold a dedicated interactive device: a finger, an ordinary thin rod, a pen and the like can all realize the positioning interaction. At the same time, the projection screen is an ordinary projection screen on which the above positioning interaction can be realized, so a dedicated interactive electronic whiteboard is no longer needed.
Embodiment
The present invention is described in further detail below with reference to the embodiments and the accompanying drawings.
Figure 1 is the flow chart of the positioning interaction method of this embodiment, which comprises the following steps:
P1) Form a projection scene by projection from the projection system, and capture an image of the interaction object and of the interaction object's shadow in the projection scene with the camera system.
The interaction object may be a person's finger, a thin hand-held rod, an ordinary writing pen, and the like. In this embodiment the interaction object is directly a person's finger, and the index finger, which is usually chosen for pointing, is used for illustration. In this step, when the hand points at content on the projection screen area within the projection scene, the light projected by the projection system is blocked by the hand and a shadow is formed on the projected screen area; the camera system then captures the image of the finger and of the finger's shadow in the projection scene. When the index finger indicates content, it may touch the projection screen area to point, or it may point remotely within the projection scene without touching the screen area.
P2) Process the image of the interaction object and its shadow to obtain the position information of the tip of the interaction object. Specifically, the position information is obtained by the steps shown in Fig. 2: P21) detect the distance in the image between the tip of the interaction object and the shadow of the tip; P22) judge whether the distance is greater than a set value: if it is greater, proceed to step P231) and calculate the three-dimensional position information of the tip; if it is less, proceed to step P232) and calculate the two-dimensional position information of the tip.
In this embodiment, after the image of the hand and its shadow is obtained in step P1), the image is processed in this step to obtain the spatial position of the index-finger tip. The distance between the fingertip and the fingertip's shadow is judged from the image: if the distance is greater than the set value, the index finger has not touched the projection screen area, and its position in three-dimensional space must be calculated; if the distance is less than the set value, the index finger has touched the projection screen area, and its two-dimensional position is calculated. Through this step it can thus be judged whether the index finger is touching the projection screen. Preferably, when the fingertip-shadow position is obtained from the image of the finger and its shadow, the points on the finger shadow can be curve-fitted to obtain the fingertip-shadow image. In this way, after the fingertip image information and the fingertip-shadow image information are extracted from the image obtained in step P1), steps P21) to P23) can be carried out to obtain the position information of the fingertip. It should be noted that, besides the above curve-fitting method, the fingertip-shadow image information can also be obtained by other image-processing operations such as shadow-detection algorithms.
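The curve-fitting extraction of the fingertip-shadow point mentioned above can be sketched as follows, assuming the shadow contour points have already been segmented from the image. Fitting a parabola with NumPy's `polyfit` and taking its extremum is one illustrative choice, not the only possible fit:

```python
import numpy as np

def fit_shadow_tip(contour_pts):
    """Estimate the fingertip-shadow position by fitting a parabola
    y = a*x**2 + b*x + c to shadow contour points and taking its extremum.

    contour_pts: sequence of (x, y) image coordinates near the shadow tip
    (segmentation of these points is assumed done beforehand).
    """
    pts = np.asarray(contour_pts, dtype=float)
    a, b, c = np.polyfit(pts[:, 0], pts[:, 1], 2)  # quadratic fit
    x_tip = -b / (2.0 * a)                         # extremum of the parabola
    y_tip = a * x_tip**2 + b * x_tip + c
    return x_tip, y_tip
```

A contour sampled from a parabola-shaped shadow outline recovers the vertex as the tip estimate.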
The three-dimensional x, y and z axes mentioned above are defined as follows: the z-axis is the direction perpendicular to the plane of the projection screen region; the plane of the projection screen region is the x-y plane, in which the y-axis runs along the direction of the line connecting the projection system and the camera system, and the x-axis runs perpendicular to that line. The two-dimensional position information is the coordinate information in the x-y plane, i.e. the case z = 0 in the above three-dimensional space. The set value used to judge touching can be set empirically by the user of the positioning interaction method: for example, measure the tip-shadow distance in the image when the finger actually touches the projection screen, measure the distance when the finger appears to touch the screen but does not actually touch it, and take an intermediate value between these two values as the set value.
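The empirical choice of the set value described above, an intermediate value between the two measured distances, amounts to a one-line computation; the function name and the midpoint choice are illustrative:

```python
def touch_threshold(dist_when_touching, dist_when_near):
    """Set value for the touch test: the midpoint between the tip-shadow
    distance measured while actually touching the screen and the distance
    measured while hovering just short of touching."""
    return (dist_when_touching + dist_when_near) / 2.0
```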
P3) Convert the position information obtained in step P2) into projection information and project it onto the screen area, thereby realizing the interactive input after positioning.
This embodiment also provides a positioning interaction system. Figure 3 is a schematic diagram of the composition of the positioning interaction system, which comprises a projection system 100, a camera system 200 and a processing system 300.
The projection system 100 projects the projection information onto the screen area 4 to form the projection scene.
The camera system 200 captures an image of the interaction object and of the interaction object's shadow in the projection scene of the projection system 100. The interaction object may be a person's finger, a thin hand-held rod, an ordinary writing pen, and the like. In this embodiment the interaction object is directly a person's finger, and the index finger, which is usually chosen for pointing, is used for illustration. The captured image thus contains the hand 5 and its shadow 6; the information to be used is that of the index-finger tip of the hand 5 and the corresponding tip shadow of the hand shadow 6 in the image.
The processing system 300 receives the image of the interaction object and its shadow captured by the camera system 200, obtains the position information of the tip of the interaction object by processing the image according to step P2) of the aforementioned positioning interaction method, converts the obtained position information into projection information, and outputs it to the projection system 100.
In operation, when the hand 5, as the interaction object, points at content on the projection screen area within the projection scene, the light projected by the projection system 100 is blocked by the hand 5 and a hand shadow 6 is formed on the projected screen area 4; the camera system 200 then captures an image of the projection scene containing the hand 5 and the hand shadow 6. After receiving the image transmitted by the camera system 200, the processing system 300 processes it to obtain the position information of the index-finger tip of the hand 5, combines this position information with time information, converts the result into projection information, and outputs it to the projection system 100, which projects it onto the screen area 4, thereby realizing the interactive input after positioning. For example, if the hand 5 draws a straight line within the projection scene, the positioning interaction system of this embodiment displays a straight line at the corresponding position of the screen area.
In the positioning interaction method and system of this embodiment, after the image of the interaction object and its shadow is obtained, image processing is used to obtain the position information, thereby realizing the positioning interaction. Throughout the process the user does not need to hold a dedicated interactive device: a finger, an ordinary thin rod, a pen and the like can all realize the positioning interaction, so the user need not carry any interactive device and is free and unencumbered. Since the interaction can be realized with a bare finger, no dedicated hand-held device is required, so the problem of the user's hand blocking a signal does not arise: the received data are complete, and the interaction can proceed accurately in real time. At the same time, the projection screen is an ordinary projection screen on which the above positioning interaction can be realized, so a dedicated interactive electronic whiteboard is no longer needed and the cost of the whole system is reduced.
Preferably, the line connecting the position of the projection system 100 and the position of the camera system 200 is parallel to the screen area 4. In that case, when the processing system 300 processes the image to calculate the position information, the z-axis coordinate in the three-dimensional position information of the tip of the interaction object is obtained by the following formula:
z = L(y1 - y) / (w + (y1 - y))    (1)
Here the z-axis is the direction perpendicular to the plane of the projection screen area 4; in that plane, the y-axis runs along the direction of the line connecting the projection system 100 and the camera system 200, and the x-axis runs perpendicular to that line; y1 denotes the y coordinate of the shadow of the tip of the interaction object, y denotes the y coordinate of the tip of the interaction object, L denotes the vertical distance between the projection system 100 and the screen area, and w denotes the distance between the projection system 100 and the camera system 200.
Figure 4 is a schematic top view of the positioning interaction system. P is the position of the projection system 100, C is the position of the camera system 200, and A is the tip of the index finger. PP1 is a projection ray emitted by the projection system 100, which passes the tip A and forms the tip of the shadow, P1, on the screen; P1 is the fingertip shadow corresponding to the fingertip A. C1C is a reflected ray that passes through A and enters the camera system 200; C1 is the point on the screen corresponding to the image of the fingertip A in the camera system 200. The distance PC is w, the vertical distance between P and the screen area 4 is L, the vertical distance between the tip A and the screen area 4 is z, and the distance P1C1 is s. Since the line connecting the projection system 100 and the camera system 200 is parallel to the screen area 4, i.e. the line PC is parallel to the screen, △PAC ∽ △P1AC1 in Fig. 4, and from the similarity of the triangles we obtain:
s / w = z / (L - z)    (2)
After simplification this gives z = sL / (w + s). Replacing s with the relative distance along the y-axis between the fingertip A and the tip shadow P1, namely s = y1 - y, yields formula (1), from which the vertical distance of the fingertip A from the screen area 4, i.e. the z-axis coordinate of the three-dimensional position information, is obtained. The x and y coordinates of the fingertip A can be obtained by image processing of the captured two-dimensional image, which is not described in detail here. Through the above calculation the three-dimensional coordinates of the fingertip A are obtained, so that its trajectory can be tracked for writing operations and the interaction is realized.
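Under assumed numeric values for L and w, the similar-triangle derivation above can be checked numerically; the helper below is an illustrative sketch, not the claimed implementation:

```python
def z_from_shadow(y_tip, y_shadow, L, w):
    """z-axis coordinate of the tip A from the similar-triangle geometry:
    s = y_shadow - y_tip, and z = L * s / (w + s),
    valid when the projector-camera baseline PC is parallel to the screen."""
    s = y_shadow - y_tip          # tip-to-shadow offset along the y-axis
    return L * s / (w + s)

# When the tip touches the screen, the shadow coincides with it (s = 0),
# so z = 0; a larger s means the tip is farther from the screen.
```

A quick check: the computed z satisfies the proportion s / w = z / (L - z) from which it was derived.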
The above is a preferred scheme for calculating the z-axis coordinate. The algorithm involved is simple and the amount of data is small, so the interaction can be carried out in real time. Moreover, the calculation does not suffer from data loss caused by the hand blocking a signal, so the result is comparatively accurate, and the interaction can be located with high precision and tracked quickly in real time, realizing precise and smooth interactive operation.
It should be noted that, in the above preferred scheme for calculating the z-axis coordinate, the system needs to be calibrated and corrected before starting. First, ensure a reasonable placement of the projection system 100 and the camera system 200, with the line connecting them parallel to the screen area 4. Measure the shooting range of the camera system 200 to ensure that the whole projection scene is captured. Then calibrate the positions of the system: preferably, with the centre of the screen area 4 as the origin of the three-dimensional coordinate system, collect a number of evenly distributed calibration points (for example 25 calibration points) for later use in building an error model. The error model analyses the influence on the error of the optical characteristics (including optical distortion and astigmatism) of the projection lens of the projection system 100 and of the camera lens of the camera system 200, as well as the influence of the overall geometric structure of the system, covering the various error variables that may affect the interaction accuracy of the system. According to the error between the calculated position and the actual position of the index-finger tip, each error variable is changed in turn to find the largest source of error, and software compensation and optimization are carried out to reduce the error.
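The evenly distributed calibration points mentioned above (for example 25 of them) can be generated, under the assumption of a rectangular screen with the coordinate origin at its centre, as a simple n x n grid; the grid layout and dimensions here are illustrative:

```python
def calibration_grid(width, height, n=5):
    """Evenly spaced n x n calibration points over a width x height
    screen area, with the origin at the centre of the screen
    (25 points for the default n = 5, as in the embodiment)."""
    xs = [(i - (n - 1) / 2) * width / n for i in range(n)]
    ys = [(j - (n - 1) / 2) * height / n for j in range(n)]
    return [(x, y) for y in ys for x in xs]
```

Each grid point would be touched in turn so that the computed and actual tip positions can be compared when building the error model.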
Another preferred scheme is to add an infrared projector: the projection system 100 further comprises an infrared light emitter for emitting infrared light, and the camera system 200 is fitted with an infrared filter. Because the camera system 200 is fitted with an infrared filter, visible light can be filtered out, so the captured image has a simple background, a single tone and obvious contrast, and the image of the object can be obtained very directly. With this added infrared projector, the processing system 300 processes only the infrared light, and the noise of visible light can be effectively removed.
The above content is a further detailed description of the present invention in conjunction with specific preferred embodiments, and it cannot be asserted that the specific implementation of the present invention is limited to these descriptions. For a person of ordinary skill in the technical field of the present invention, a number of substitutions or obvious modifications made without departing from the concept of the invention, with identical performance or use, should all be regarded as falling within the protection scope of the present invention.