CN102929434B - Optical projection system and its image processing method - Google Patents

Optical projection system and its image processing method

Info

Publication number
CN102929434B
CN102929434B (application CN201110230838.6A)
Authority
CN
China
Prior art keywords
image
coordinate
optical projection
point
indication point
Prior art date: 2011-08-12
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201110230838.6A
Other languages
Chinese (zh)
Other versions
CN102929434A
Inventor
洪永庆
廖孟秋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aptos Technology Inc
Original Assignee
Aptos Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority date: 2011-08-12
Filing date: 2011-08-12
Publication date: 2015-11-11
Application filed by Aptos Technology Inc
Priority to CN201110230838.6A
Publication of CN102929434A (2013-02-13)
Application granted
Publication of CN102929434B (2015-11-11)
Legal status: Active


Abstract

The present invention discloses an optical projection system and its image processing method. The optical projection system comprises a projection module, a camera module, and a processing module. The projection module projects a first image onto an object surface. The camera module captures the object surface to obtain a second image containing the first image and an indication point, wherein the indication point is formed by light projected onto the object surface by an external device. The processing module analyzes the second image to obtain the coordinates of the indication point in the second image, performs a straight-line approximation method to obtain the coordinates of feature points of the first image in the second image, and uses a two-dimensional coordinate conversion formula to convert the coordinates of the indication point in the second image into the coordinates of the indication point in the first image.

Description

Optical projection system and its image processing method
Technical field
The present invention relates to a display system, and more particularly to an optical projection system and an image processing method thereof.
Background technology
Generally, when giving an electronic presentation, the presenter stores the prepared presentation content on a computer, connects the computer to a projector, and projects the presentation content onto a screen. During the presentation, the presenter usually controls the flow and content with a presentation remote or a mouse; when a position needs to be indicated, the presenter typically uses a laser pointer or the mouse cursor to point at the paragraph being explained.
However, when the presenter wants to annotate the presentation content while explaining it, the built-in functions of the presentation software must be used; for example, in Microsoft PowerPoint (Microsoft Corp.), this means moving the mouse cursor while holding down the left mouse button. Using the mouse and its left button during a presentation is inconvenient, since the presenter has to stay beside a flat desktop or near the computer. Even when a presentation remote is used to steer the cursor, the operation is still not as convenient as pointing directly with a laser pointer.
Summary of the invention
The object of the present invention is to provide an optical projection system and an image processing method thereof, which use a camera module to capture the projected image and identify the positions of feature points and an indication point in the captured image.
To achieve the above object, the present invention proposes an optical projection system comprising a projection module, a camera module, and a processing module. The projection module projects a first image onto an object surface, wherein the first image comprises a plurality of feature points. The camera module captures the object surface to obtain a second image containing the first image and an indication point, wherein the indication point is formed by light projected onto the object surface by an external device. The processing module analyzes the second image to obtain the coordinates of the indication point in the second image, performs a straight-line approximation method to obtain the coordinates of the feature points in the second image, and uses a two-dimensional coordinate conversion formula to convert the coordinates of the indication point in the second image into the coordinates of the indication point in the first image.
The present invention also proposes an image processing method of an optical projection system, comprising the following steps: projecting a first image onto an object surface, wherein the first image comprises a plurality of feature points; capturing the object surface to obtain a second image containing the first image and an indication point, wherein the indication point is formed by light projected onto the object surface by an external device; analyzing the second image to obtain the coordinates of the indication point in the second image; performing a straight-line approximation method to obtain the coordinates of the feature points in the second image; and using a two-dimensional coordinate conversion formula to convert the coordinates of the indication point in the second image into the coordinates of the indication point in the first image.
Based on the above, the present invention provides an optical projection system and an image processing method suitable for it: the projected image is captured by the camera module so that the positions of the feature points and the indication point in the captured image can be obtained, and the corresponding position of the indication point on the projected image is obtained by coordinate conversion.
To make the above features and advantages of the present invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is a block diagram of an optical projection system according to an exemplary embodiment of the present invention;
Fig. 2 is a schematic diagram of the first image and the second image in an exemplary embodiment of the present invention;
Fig. 3A, Fig. 3B and Fig. 3C are schematic diagrams of the straight-line approximation method in an exemplary embodiment of the present invention;
Fig. 4 is a schematic diagram of image coordinate conversion in an exemplary embodiment of the present invention;
Fig. 5 is a schematic diagram of the bisection approximation method in an exemplary embodiment of the present invention;
Fig. 6 is a flowchart of the image processing method of the optical projection system in an exemplary embodiment of the present invention;
Fig. 7 shows an implementation of the optical projection system in an exemplary embodiment of the present invention.
Description of main element symbols
10: optical projection system
101: projection module
102: camera module
103: processing module
104: object surface
201: first image
210: second image
201′: projected first image
212, 712: indication points
310: third image
311: bright gray-value region
312: dark gray-value region
301–304: line equations
320, 321, 322: pixels
700: screen
701: projector
702: video camera
703: personal computer
704: color filter
P01–P04, P01′–P04′: feature points
P1–P3, P1′–P3′: calculation points
S601–S605: steps
Embodiments
Fig. 1 is a block diagram of an optical projection system according to an embodiment of the present invention. Referring to Fig. 1, in this embodiment the optical projection system 10 comprises a projection module 101, a camera module 102, and a processing module 103. The projection module 101 is connected to the processing module 103 and projects a first image 201 onto an object surface 104, wherein the object surface 104 can be a screen, a white wall, or any other object on which the first image 201 appears clearly, but is not limited thereto. The camera module 102 is connected to the processing module 103 and captures the object surface 104 to obtain a second image 210, wherein the second image 210 comprises the first image 201 and an indication point. The indication point is formed by light projected onto the object surface 104 by an external device, which can be a laser pointer, a presentation pointer, or another beam projector, though the present invention is not limited thereto.
Fig. 2 is a schematic diagram of the first image and the second image in an exemplary embodiment of the present invention. Referring to Fig. 1 and Fig. 2, in this embodiment, after the first image 201 is projected onto the object surface 104 by the projection module 101, the camera module 102 captures the object surface 104 to obtain the second image 210, wherein the second image 210 comprises the projected first image 201′ and the indication point 212 formed on the object surface 104 by light projected from the external device (such as a laser pointer). The first image 201 comprises a plurality of feature points P01–P04, which, after being captured by the camera module 102, appear as the projected feature points P01′–P04′, as shown in Fig. 2.
Because of the angle between the projection direction of the projection module 101 and the normal direction of the object surface 104, the shooting angle of the camera module 102, and/or other factors, the projected first image 201′ may differ from the original first image 201 in deformation and size. Therefore, to obtain the position of the indication point 212 in the original first image 201, the conversion relationship between the projected first image 201′ and the original first image 201 must first be determined; that is, the conversion relationship between the second image 210 and the first image 201 must first be found.
The processing module 103 receives the second image 210 from the camera module 102 and obtains the coordinates of the indication point 212 in the second image 210 by analyzing the second image 210. The feature points P01–P04 are used here to derive the conversion relationship between the first image 201′ in the second image and the original first image 201. In this embodiment, the feature points P01–P04 are set as the four vertices of the first image 201. The processing module 103 then uses the straight-line approximation method to obtain the coordinates of the feature points P01–P04 in the second image 210. The straight-line approximation method comprises the following actions: the processing module 103 performs threshold binarization on the second image 210 to obtain a third image, and the processing module 103 moves one or more line equations across the third image to find the coordinates of the feature points P01–P04 in the second image 210. These steps are described in detail below with reference to the drawings.
In the straight-line approximation method, the step of performing threshold binarization on the second image 210 comprises: the processing module 103 defines a first threshold; the processing module 103 modifies the pixel data in the second image 210 that are greater than the first threshold to a bright gray value (for example, the maximum gray value, or white); and the processing module 103 modifies the pixel data in the second image 210 that are less than the first threshold to a dark gray value (for example, the minimum gray value, or black). After these steps are completed, the third image is obtained. The purpose of this threshold binarization of the second image 210 by the processing module 103 is to distinguish the region of the projected first image 201′ in the second image 210 from the region without projection.
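For illustration, the binarization step can be sketched as follows in Python with NumPy; the function and variable names (binarize, first_threshold) are assumptions for this sketch, not names used by the patent.

```python
import numpy as np

BRIGHT = 255  # bright gray value, e.g. the maximum gray value (white)
DARK = 0      # dark gray value, e.g. the minimum gray value (black)

def binarize(second_image: np.ndarray, first_threshold: int) -> np.ndarray:
    """Produce the third image: pixels whose gray value exceeds the first
    threshold become BRIGHT and all others become DARK, separating the
    projected region from the region without projection."""
    return np.where(second_image > first_threshold, BRIGHT, DARK).astype(np.uint8)
```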
Fig. 3A is a schematic diagram of the third image 310 obtained after threshold binarization of the second image 210 in an exemplary embodiment of the present invention. Referring to Fig. 2 and Fig. 3A, the bright gray-value region 311 formed by bright pixels corresponds to the region of the projected first image 201′ in the second image 210, and the dark gray-value region 312 corresponds to the region of the second image 210 without a projected image.
Fig. 3B is a schematic diagram of the action, in the straight-line approximation method of an exemplary embodiment of the present invention, of moving one or more line equations across the third image to find the coordinates of the feature points in the second image. Referring to Fig. 3B, in this embodiment the feature points P01′–P04′ to be found are the four vertices of the binarized first image 311, so four line equations 301, 302, 303, and 304 are moved across the third image 310. In this embodiment, the four line equations 301, 302, 303, and 304 are respectively:
Line equation 301: y = -x + c
Line equation 302: y = x - sizex + c
Line equation 303: y = x + sizey - c
Line equation 304: y = -x + sizex + sizey - c
Here, the parameter sizex is the width of the third image 310, that is, the size of the third image 310 in the X-axis (horizontal) direction, and the parameter sizey is the height of the third image 310, that is, the size of the third image 310 in the Y-axis (vertical) direction. The parameter c is the intercept parameter that changes as the line equation 301, 302, 303, or 304 moves across the third image 310, as shown in Fig. 3B.
Referring again to Fig. 2 and Fig. 3B, when the parameter c is 0, the line equations 301, 302, 303, and 304 each pass through one of the four corners of the third image 310. As the parameter c increases, each of the line equations 301, 302, 303, and 304 starts from its corner of the third image 310 and moves toward the center of the third image 310. During the movement, the gray-scale intensities at multiple coordinate positions on the line equations 301, 302, 303, and 304 are checked; on each line, the coordinate position where the gray-scale intensity is first found to be greater than a second threshold is regarded as the coordinate of a feature point. Since the second image 210 and the third image 310 have the same size and corresponding content, the coordinates of the four feature points found in the third image 310 by the above steps also equal the coordinates of these four feature points in the second image.
Fig. 3C is a close-up schematic view of the third image 310 when the coordinate of the feature point P04′ on the third image 310 is determined in the straight-line approximation method, according to an exemplary embodiment of the present invention. Fig. 3C takes the line equation 303 as an illustrative example; the other line equations 301, 302, and 304 follow the same description. Referring to Fig. 3C, each time the parameter c increases by one step, the line equation 303 moves one step toward the center of the third image 310. After each step, the processing module 103 checks the gray-scale intensities at multiple coordinate positions on the line equation 303. Because the third image 310 is binarized, before the line equation 303 touches the binarized first image 311, the gray value of every pixel on the line equation 303 is the dark gray value. As the line equation 303 moves toward the center, when a bright segment first appears among the pixels on the line equation 303, the line equation 303 has touched the feature point P04′. For example, when the line equation 303 moves to the position shown in Fig. 3C, if the processing module 103 finds that the gray values of the pixels 320, 321, and 322 on the line equation 303 are respectively the dark gray value, the bright gray value, and the dark gray value, then the bright pixel 321 can be regarded as the feature point P04′, and the coordinate position of the pixel 321 can be regarded as the coordinate of the feature point P04′ in the second image 210. It should be noted that the classification of bright and dark pixels here is in every case the result of comparison against the second threshold.
Although the above takes as an example the case where three consecutive pixels have the dark, bright, and dark gray values, practice is not limited to this situation. In other embodiments, when the line equation 303 moving toward the center first touches the binarized first image 311, the bright segment on the line equation 303 may contain multiple pixels. Depending on the amount of movement per step of the line equation 303 (for example, a distance of five pixels per step), when a bright segment first appears among the pixels on the line equation 303, the line equation 303 may not pass exactly through the feature point P04′. Therefore, the processing module 103 can check whether the number of pixels in the first bright segment on the line equation 303 is less than a preset value. If the number of pixels in the bright segment is less than the preset value, the distance between the line equation 303 and the feature point P04′ is within a tolerable error range, so the processing module 103 regards the coordinate position of the center pixel of the bright segment as the coordinate of the feature point P04′ in the second image 210. If the number of pixels in the bright segment is greater than the preset value, the line equation 303 retreats (moves away from the center of the third image 310) with a smaller step per move (for example, a distance of one pixel) until the number of pixels in the bright segment is less than the preset value.
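The corner search can be sketched as follows for the line family of equation 303 (y = x + sizey - c). This is a minimal sketch: the coarse step of five pixels, the fine step of one pixel, and the segment-length limit are illustrative values, and accepting the segment centre once the step has shrunk to one pixel is a simplifying assumption rather than the patent's exact procedure.

```python
import numpy as np

def pixels_on_line_303(third_image: np.ndarray, c: int):
    """Integer pixel coordinates of the third image lying on y = x + sizey - c."""
    sizey, sizex = third_image.shape
    xs = np.arange(sizex)
    ys = xs + sizey - c
    inside = (ys >= 0) & (ys < sizey)
    return xs[inside], ys[inside]

def find_feature_point(third_image: np.ndarray, bright=255, coarse_step=5, max_run=3):
    """Advance line 303 from its corner toward the centre until it first meets a
    bright segment; accept the segment's centre pixel if the segment is short
    enough, otherwise retreat and re-scan with single-pixel steps."""
    sizey, sizex = third_image.shape
    c, step = 0, coarse_step
    while c <= sizex + sizey:
        xs, ys = pixels_on_line_303(third_image, c)
        run = np.flatnonzero(third_image[ys, xs] == bright)
        if run.size == 0:
            c += step                        # no contact yet: keep advancing
        elif run.size <= max_run or step == 1:
            mid = run[run.size // 2]         # centre pixel of the bright segment
            return int(xs[mid]), int(ys[mid])
        else:
            c -= step - 1                    # overshot: back up, then move finely
            step = 1
    return None                              # the projected image was never touched
```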
Referring again to Fig. 1 and Fig. 2, after the coordinates of the feature points on the second image 210 are known, the processing module 103 then locates the coordinate of the indication point 212 on the second image 210. In an embodiment of the present invention, a color filter is placed in the photographic light path of the camera module 102; for example, if the indication point is formed by light of a particular color (such as red), a red filter is placed in the photographic light path of the camera module 102 to enhance the contrast of the indication point 212 in the second image 210. In other embodiments, the projection module 101 can place a high-brightness red dot at the position of each feature point when projecting the first image 201, so that in the second image 210 taken through the red filter the feature points P01′–P04′ are highlighted as well. Similarly to this use of a color filter to highlight the indication point and the feature points, the suppression of the other colors can also be achieved by software in the processing module 103 to highlight the indication point and the feature points, but the present invention is not limited to the above.
Referring again to Fig. 1 and Fig. 2, in another embodiment of the present invention, the processing module 103 converts the pixel data of the second image 210 into brightness data; for example, the data are converted from the RGB (Red Green Blue) color space into the YUV (luminance and chrominance) color space, and the luminance component is taken as the brightness data. Since the brightness of the indication point 212 should be greater than the brightness of any other pixel in the second image 210, the processing module 103 can set a third threshold and compare it with all the brightness data of the second image 210. The processing module 103 regards the coordinate position in the second image 210 whose brightness data is greater than the third threshold as the coordinate of the indication point 212 in the second image 210; however, in the present invention, the implementation of locating the indication point 212 is not limited to the above.
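A minimal sketch of this brightness-based localization follows, assuming the second image is an RGB array; the Rec. 601 luma weights stand in for the RGB-to-YUV conversion mentioned above, and picking the single brightest above-threshold pixel is an assumed tie-breaking rule.

```python
import numpy as np

def locate_indication_point(second_image_rgb: np.ndarray, third_threshold: float):
    """Return the (x, y) coordinate of the brightest pixel whose luminance
    exceeds the third threshold, taken as the indication point in the second
    image, or None when no pixel qualifies."""
    rgb = second_image_rgb.astype(float)
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]  # Y of YUV
    candidates = np.argwhere(luma > third_threshold)        # (row, col) pairs
    if candidates.size == 0:
        return None
    row, col = candidates[luma[candidates[:, 0], candidates[:, 1]].argmax()]
    return int(col), int(row)
```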
Fig. 4 is a schematic diagram of image coordinate conversion in an exemplary embodiment of the present invention. Referring to Fig. 1 and Fig. 4, the coordinates of the feature points P01, P02, P03, and P04 in the first image 201 are (x₁, y₁), (x₂, y₂), (x₃, y₃), and (x₄, y₄), respectively. The feature points P01, P02, P03, and P04 correspond to the feature points P01′, P02′, P03′, and P04′ in the second image 210, whose coordinates in the second image 210 are (x′₁, y′₁), (x′₂, y′₂), (x′₃, y′₃), and (x′₄, y′₄), respectively. Here the width of the first image 201 (its size in the X-axis direction) is defined as sx, and the height of the first image 201 (its size in the Y-axis direction) is defined as sy. If the coordinate (x₁, y₁) of the feature point P01 in the first image 201 is (0, 0), then the coordinates of the feature points P01, P02, P03, and P04 in the first image 201 are respectively:
P01: (x₁, y₁) = (0, 0)
P02: (x₂, y₂) = (sx, 0)
P03: (x₃, y₃) = (sx, sy)
P04: (x₄, y₄) = (0, sy)
After the processing module 103 obtains the coordinates of the indication point 212 and of the feature points P01′, P02′, P03′, and P04′ in the second image 210, the two-dimensional coordinate conversion formulas (1) and (2) can be used to convert between the first image 201 and the second image 210:
$x' = \sum_{j=0}^{m} \sum_{k=0}^{m} a_{jk} x^j y^k$    (1)
$y' = \sum_{j=0}^{m} \sum_{k=0}^{m} b_{jk} x^j y^k$    (2)
where x and y are respectively the X-axis and Y-axis position coordinates of a pixel in the first image 201, and x′ and y′ are respectively the X-axis and Y-axis position coordinates of the pixel in the second image 210. The coefficients $a_{jk}$ and $b_{jk}$ are real numbers, and the order m is an integer. The coefficients $a_{jk}$ and $b_{jk}$ are obtained from corresponding point pairs of the first image 201 and the second image 210. For example, if m = 0, the coefficients $a_{00}$ and $b_{00}$ are obtained from the coordinate (x₁, y₁) of the feature point P01 in the first image 201 and the coordinate (x′₁, y′₁) of the feature point P01′ in the second image 210.
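In code, formulas (1) and (2) amount to evaluating a two-variable polynomial; the sketch below assumes the coefficients are stored as (m+1) × (m+1) NumPy arrays, which is an illustrative layout rather than anything prescribed by the patent.

```python
import numpy as np

def polynomial_transform(x: float, y: float, a: np.ndarray, b: np.ndarray):
    """Evaluate formulas (1) and (2): x' = sum over j,k of a[j,k] * x**j * y**k
    and y' = sum over j,k of b[j,k] * x**j * y**k."""
    m = a.shape[0] - 1
    xp = np.array([x ** j for j in range(m + 1)])  # powers of x
    yp = np.array([y ** k for k in range(m + 1)])  # powers of y
    return float(xp @ a @ yp), float(xp @ b @ yp)
```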
A higher value of m can handle more severe geometric distortion. For example, m = 1 is used in this embodiment, but the invention is not limited to m = 1. Substituting m = 1 into formulas (1) and (2) yields formulas (3) and (4):
x′ = a×x + b×y + c×x×y + d    (3)
y′ = e×x + f×y + g×x×y + h    (4)
where (x, y) is the position coordinate of a pixel in the first image 201 and (x′, y′) is the position coordinate of the pixel in the second image 210. Substituting the coordinates (x₁, y₁), (x₂, y₂), (x₃, y₃), and (x₄, y₄) of the feature points P01, P02, P03, and P04 in the first image 201 and the coordinates (x′₁, y′₁), (x′₂, y′₂), (x′₃, y′₃), and (x′₄, y′₄) of the feature points P01′, P02′, P03′, and P04′ in the second image 210 into formulas (3) and (4) yields the coefficients a, b, c, d, e, f, g, and h. The parameters of the two-dimensional coordinate conversion formulas (3) and (4) are respectively:
d = x′₁
h = y′₁
a = (x′₂ - d) ÷ sx
e = (y′₂ - h) ÷ sy
c = (x′₃ - a×sx - b×sy - d) ÷ (sx×sy)
g = (y′₃ - e×sx - f×sy - h) ÷ (sx×sy)
The remaining coefficients b and f are obtained analogously from the coordinate (x′₄, y′₄) of the fourth feature point P04′ in the second image.
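For illustration, the eight coefficients of formulas (3) and (4) can equally be obtained by solving two 4 × 4 linear systems built from the four feature-point correspondences; this sketch assumes that equivalent formulation rather than the closed-form expressions above, and its function names are illustrative.

```python
import numpy as np

def solve_bilinear(first_pts, second_pts):
    """first_pts and second_pts each hold four (x, y) pairs: P01..P04 in the
    first image and P01'..P04' in the second image.  Returns the coefficient
    vectors (a, b, c, d) of formula (3) and (e, f, g, h) of formula (4)."""
    A = np.array([[x, y, x * y, 1.0] for x, y in first_pts])
    xs2 = np.array([p[0] for p in second_pts], dtype=float)
    ys2 = np.array([p[1] for p in second_pts], dtype=float)
    abcd = np.linalg.solve(A, xs2)   # x' = a*x + b*y + c*x*y + d
    efgh = np.linalg.solve(A, ys2)   # y' = e*x + f*y + g*x*y + h
    return abcd, efgh

def to_second_image(point, abcd, efgh):
    """Map a first-image coordinate into the second image with formulas (3)-(4)."""
    x, y = point
    basis = np.array([x, y, x * y, 1.0])
    return float(abcd @ basis), float(efgh @ basis)
```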
Because the above bilinear interpolation formulas (3) and (4) are well suited to converting a coordinate (x, y) of the first image 201 into a coordinate (x′, y′) of the second image 210, whereas converting a coordinate (x′, y′) of the second image 210 back into a coordinate (x, y) of the first image 201 is somewhat more complicated, in an embodiment of the present invention the processing module 103 can use a bisection approximation method to find the corresponding coordinate of the indication point 212 in the first image 201.
Fig. 5 is a schematic diagram of the bisection approximation method in an exemplary embodiment of the present invention. Referring to Fig. 1 and Fig. 5, the processing module 103 first averages the coordinate positions of the feature points P01, P02, P03, and P04 in the first image 201 to obtain the coordinate of a calculation point P1, and then converts the coordinate of the calculation point P1 into the corresponding coordinate P1′ on the second image 210 by the aforementioned two-dimensional coordinate conversion formulas (3) and (4). Next, the position of the coordinate of the calculation point P1′ is compared with that of the indication point 212 in the second image 210 to obtain the quadrant of the indication point 212 relative to the calculation point P1′ on the second image 210 (namely, the indication point 212 is at the lower left of the calculation point P1′), so the possible range of the indication point 212 is reduced to one quarter.
As shown in Fig. 5, since the indication point 212 falls in the third quadrant relative to the calculation point P1′, the coordinate of the calculation point P1 and the coordinate of the feature point P01 in the first image 201 are averaged to obtain the next calculation point P2. The processing module 103 then again uses the two-dimensional coordinate conversion formulas (3) and (4) to convert the coordinate of the calculation point P2 in the first image 201 into the coordinate of the calculation point P2′ in the second image 210. After comparing the coordinate of the calculation point P2′ with the coordinate of the indication point 212 in the second image 210, the processing module 103 learns that the indication point 212 falls in the first quadrant relative to the calculation point P2′.
By analogy, the processing module 103 averages the coordinate of the calculation point P2 and the coordinate of the calculation point P1 in the first image 201 to obtain the next calculation point P3, and again uses the two-dimensional coordinate conversion formulas (3) and (4) to convert the coordinate of the calculation point P3 in the first image 201 into the coordinate of the calculation point P3′ in the second image 210. After comparing the coordinate of the calculation point P3′ with the coordinate of the indication point 212 in the second image 210, the processing module 103 learns that the indication point 212 falls in the second quadrant relative to the calculation point P3′. The processing module 103 keeps moving the calculation point by the bisection approximation method in this manner until the distance in the second image 210 between a calculation point Pn (not illustrated) and the indication point 212 is less than a fourth threshold, whereupon the processing module 103 takes the coordinate of the last calculation point Pn in the first image 201 as the coordinate of the indication point 212 in the first image 201.
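A sketch of the bisection approximation follows, reusing to_second_image from the previous sketch. Starting at the average of the four feature points and halving the step each iteration follows the description above; steering by the sign of the coordinate differences in the second image additionally assumes the mapping preserves left/right and up/down ordering, which is an assumption of this sketch.

```python
import numpy as np

def bisect_indication_point(indication_2nd, first_pts, abcd, efgh,
                            fourth_threshold=0.5, max_iter=64):
    """Return the coordinate of the indication point in the first image."""
    p = np.mean(np.asarray(first_pts, dtype=float), axis=0)  # calculation point P1
    half = p.copy()                                          # half-range per axis
    for _ in range(max_iter):
        px, py = to_second_image(p, abcd, efgh)              # P1', P2', P3', ...
        dx = indication_2nd[0] - px
        dy = indication_2nd[1] - py
        if (dx * dx + dy * dy) ** 0.5 < fourth_threshold:    # within 4th threshold
            break
        half /= 2.0                          # halve the search range on each axis
        p[0] += half[0] * np.sign(dx)        # step toward the quadrant holding
        p[1] += half[1] * np.sign(dy)        #   the indication point
    return float(p[0]), float(p[1])
```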
Referring again to Fig. 1 and Fig. 5, in an embodiment of the present invention, after the processing module 103 obtains the coordinate of the indication point 212 in the first image 201, if the processing module 103 judges that the indication point 212 is located in a preset area of the first image, the processing module 103 triggers the preset function corresponding to that preset area. For example, the first image 201 may comprise a virtual key, and when the indication point 212 points at the area of the virtual key, a function such as page turning or zooming can be triggered; alternatively, the position of the indication point 212 can directly trigger the same function that a mouse click on that area would have. However, the present invention is not limited to the above.
When the user moves the indication point 212, the processing module 103 keeps detecting the coordinate of the indication point 212 in the first image 201 to obtain multiple track points (not illustrated). The processing module 103 connects these track points in time order into the motion track of the indication point 212. In addition, the processing module 103 sets a time threshold and counts the interval between two track points adjacent in time order; when the interval between two time-adjacent track points is greater than the time threshold, the two adjacent track points are not connected.
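The linking of track points can be sketched as below, assuming each detected track point carries a timestamp; the (t, x, y) tuple format is an assumption of this sketch.

```python
def link_track_points(track_points, time_threshold):
    """track_points: list of (t, x, y) in time order.  Returns a list of
    strokes; consecutive points are joined into one stroke unless the gap
    between their timestamps exceeds the time threshold."""
    strokes = []
    last_t = None
    for t, x, y in track_points:
        if last_t is None or t - last_t > time_threshold:
            strokes.append([])               # gap too long: start a new stroke
        strokes[-1].append((x, y))
        last_t = t
    return strokes
```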
In addition, while the processing module 103 keeps detecting the motion track, it also draws the motion track on a transparent layer. The processing module 103 then superimposes the transparent layer on the first image 201, and the projection module 101 projects the superimposed transparent layer together with the first image 201 onto the object surface 104. In other words, the optical projection system 10 can display the motion track of the indication point 212 superimposed on the first image 201. Furthermore, in an embodiment of the present invention, the processing module 103 projects the first image 201 onto the object surface via the projection module 101 according to a first presentation file, and after obtaining the motion track of the indication point 212, the processing module 103 stores the transparent layer containing the motion track in a second presentation file. It is worth mentioning that this embodiment can detect one or more indication points: the motion track of each detected indication point is recorded, drawn on the transparent layer, and embedded together in the second presentation file.
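Superimposing the transparent layer on the first image before projection can be sketched as a standard alpha blend; the patent does not specify the compositing formula, so the RGBA layout and the blend below are assumptions for illustration.

```python
import numpy as np

def composite(first_image: np.ndarray, layer_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend the annotation layer (H x W x 4) over the first image
    (H x W x 3) to form the frame handed to the projection module."""
    alpha = layer_rgba[..., 3:4].astype(float) / 255.0
    blended = (1.0 - alpha) * first_image.astype(float) \
              + alpha * layer_rgba[..., :3].astype(float)
    return blended.astype(np.uint8)
```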
Fig. 6 is a flowchart of the image processing method of the optical projection system in an exemplary embodiment of the present invention. Referring to Fig. 6, the method comprises the following steps. Project a first image onto an object surface (S601), wherein the first image comprises a plurality of feature points. Capture the object surface to obtain a second image containing the first image and an indication point (S602), wherein the indication point is formed by light projected onto the object surface by an external device. Analyze the second image to obtain the coordinates of the indication point in the second image (S603). Perform the straight-line approximation method to obtain the coordinates of the feature points in the second image (S604). Use the two-dimensional coordinate conversion formula to convert the coordinates of the indication point in the second image into the coordinates of the indication point in the first image (S605). The implementation details of these steps are described in the foregoing embodiments and are not repeated here.
Fig. 7 shows an implementation of the optical projection system in an exemplary embodiment of the present invention; please refer to Fig. 1 and Fig. 7. In this embodiment, the object surface 104 is realized by a screen 700, the projection module 101 by a projector 701, the camera module 102 by a video camera 702, and the processing module 103 by a personal computer 703. The projector 701 is connected to the personal computer 703 and projects the image data from the personal computer 703 (namely the first image 201) onto the screen 700. The video camera 702, in front of whose lens a color filter 704 can be placed, is connected to the personal computer 703 and captures and records the content on the screen 700, namely the aforementioned second image 210. In addition, a laser pointer 705 projects light onto the screen 700 to produce an indication point 712. In other embodiments, the projection module 101, the camera module 102, and/or the processing module 103 can be integrated into the same device; for example, the camera module 102 and the processing module 103 can be integrated into a notebook computer or a smartphone with a camera function. As another example, the optical projection system 10 can be a smartphone with an embedded miniature projector (the projection module 101) and a camera (the camera module 102). In any case, the present invention is not limited to the above implementations.
In summary, the present invention provides an optical projection system and an image processing method suitable for it, which improve the convenience of presenting with projection equipment without requiring extra special devices. The present invention can directly detect the position of the indication point projected by an external device and record its track, which amounts to annotating the presentation content directly with the indication point and storing the annotations in a file. In addition, the present invention can use the position of the indication point to directly trigger preset functions, for example replacing the click function of a mouse, so that the user can explain and operate more intuitively during a presentation.
Although the present invention has been disclosed by way of the above embodiments, they are not intended to limit it. Anyone with ordinary knowledge in the relevant art may make some changes and refinements without departing from the spirit and scope of the present invention, so the protection scope of the present invention shall be defined by the appended claims.

Claims (28)

1. An optical projection system, comprising:
a projection module, projecting a first image onto an object surface, wherein the first image comprises a plurality of feature points;
a camera module, capturing the object surface to obtain a second image containing the first image and an indication point, wherein the indication point is formed by light projected onto the object surface by an external device; and
a processing module, analyzing the second image to obtain coordinates of the indication point in the second image, performing a straight-line approximation method to obtain coordinates of the feature points in the second image, and using a two-dimensional coordinate conversion formula to convert the coordinates of the indication point in the second image into coordinates of the indication point in the first image, wherein the straight-line approximation method comprises:
performing threshold binarization on the second image to obtain a third image; and
moving at least one line equation across the third image to find the coordinates of the feature points in the second image.
2. The optical projection system as claimed in claim 1, wherein the step of performing threshold binarization on the second image comprises: defining a first threshold; modifying a plurality of pixel data greater than the first threshold in the second image to a bright gray value; and modifying a plurality of pixel data less than the first threshold in the second image to a dark gray value.
3. The optical projection system as claimed in claim 1, wherein the at least one line equation initially moves from a corner of the third image toward the center of the third image; during the movement, gray-scale intensities at a plurality of coordinate positions on the at least one line equation are checked; and during the movement, the coordinate position among the plurality of coordinate positions on the at least one line equation at which a gray-scale intensity greater than a second threshold is first found is regarded as the coordinate of one of the feature points in the second image.
4. The optical projection system as claimed in claim 1, wherein the straight-line approximation method further comprises: making the projection module increase the brightness of the feature points.
5. The optical projection system as claimed in claim 1, wherein a color filter is placed in a photographic light path of the camera module to highlight the indication point and the feature points.
6. The optical projection system as claimed in claim 1, wherein the processing module converts a plurality of pixel data of the second image into a plurality of brightness data, and regards the coordinate position whose brightness data is greater than a third threshold as the coordinate of the indication point in the second image.
7. The optical projection system as claimed in claim 1, wherein the two-dimensional coordinate conversion formulas are $x' = \sum_{j=0}^{m} \sum_{k=0}^{m} a_{jk} x^j y^k$ and $y' = \sum_{j=0}^{m} \sum_{k=0}^{m} b_{jk} x^j y^k$, wherein x and y are coordinates in the first image, x′ and y′ are coordinates in the second image, the coefficients $a_{jk}$ and $b_{jk}$ are real numbers, and the coefficient m is an integer.
8. The optical projection system as claimed in claim 1, wherein the two-dimensional coordinate conversion formulas are x′ = a×x + b×y + c×x×y + d and y′ = e×x + f×y + g×x×y + h, wherein x and y are coordinates in the first image, x′ and y′ are coordinates in the second image, d = x′₁, h = y′₁, a = (x′₂ - d) ÷ sx, e = (y′₂ - h) ÷ sy, c = (x′₃ - a×sx - b×sy - d) ÷ (sx×sy), g = (y′₃ - e×sx - f×sy - h) ÷ (sx×sy), x′₁ and y′₁ are the coordinates of a first feature point among the feature points in the second image, x′₂ and y′₂ are the coordinates of a second feature point among the feature points in the second image, x′₃ and y′₃ are the coordinates of a third feature point among the feature points in the second image, sx is the distance from the first feature point to the second feature point in the first image, and sy is the distance from the second feature point to the third feature point in the first image.
9. The optical projection system as claimed in claim 8, wherein the processing module uses the two-dimensional coordinate conversion formulas to convert the coordinate of a calculation point in the first image into the coordinate of the calculation point in the second image, compares the coordinate of the calculation point in the second image with the coordinate of the indication point in the second image, moves the calculation point by a bisection approximation method so that the calculation point approaches the indication point, and takes the coordinate of the calculation point in the first image as the coordinate of the indication point in the first image.
10. The optical projection system as claimed in claim 1, wherein the processing module keeps detecting the coordinate of the indication point in the first image to obtain a plurality of track points, and connects the track points in time order.
11. The optical projection system as claimed in claim 10, wherein the processing module counts an interval between two track points adjacent in time order, and if the interval is greater than a time threshold, the two adjacent track points are not connected.
12. The optical projection system as claimed in claim 1, wherein when the processing module judges that the indication point is located in a preset area of the first image, the processing module triggers a preset function corresponding to the preset area.
13. The optical projection system as claimed in claim 1, wherein the processing module keeps detecting a motion track of the indication point in the first image, draws the motion track on a transparent layer, and projects the transparent layer onto the object surface via the projection module, wherein the transparent layer is superimposed on the first image.
14. The optical projection system as claimed in claim 13, wherein the processing module projects the first image onto the object surface via the projection module according to a first presentation file, and stores the transparent layer containing the motion track in a second presentation file.
15. An image processing method of an optical projection system, comprising:
projecting a first image onto an object surface, wherein the first image comprises a plurality of feature points;
capturing the object surface to obtain a second image containing the first image and an indication point, wherein the indication point is formed by light projected onto the object surface by an external device;
analyzing the second image to obtain coordinates of the indication point in the second image;
performing a straight-line approximation method to obtain coordinates of the feature points in the second image, wherein the straight-line approximation method comprises:
performing threshold binarization on the second image to obtain a third image; and
moving at least one line equation across the third image to find the coordinates of the feature points in the second image; and
using a two-dimensional coordinate conversion formula to convert the coordinates of the indication point in the second image into coordinates of the indication point in the first image.
16. The image processing method of the optical projection system as claimed in claim 15, wherein the step of performing threshold binarization on the second image comprises:
defining a first threshold;
modifying a plurality of pixel data greater than the first threshold in the second image to a bright gray value; and
modifying a plurality of pixel data less than the first threshold in the second image to a dark gray value.
17. The image processing method of the optical projection system as claimed in claim 15, wherein the step of finding the coordinates of the feature points in the second image comprises:
initially moving the at least one line equation from a corner of the third image toward the center of the third image;
during the movement, checking gray-scale intensities at a plurality of coordinate positions on the at least one line equation; and
during the movement, regarding the coordinate position among the plurality of coordinate positions on the at least one line equation at which a gray-scale intensity greater than a second threshold is first found as the coordinate of one of the feature points in the second image.
18. The image processing method of the optical projection system as claimed in claim 15, wherein the straight-line approximation method further comprises:
making the projection module increase the brightness of the feature points.
19. The image processing method of the optical projection system as claimed in claim 15, further comprising:
placing a color filter in a photographic light path of the camera module to highlight the indication point and the feature points.
20. The image processing method of the optical projection system as claimed in claim 15, further comprising:
converting a plurality of pixel data of the second image into a plurality of brightness data; and
regarding the coordinate position whose brightness data is greater than a third threshold as the coordinate of the indication point in the second image.
21. The image processing method of the optical projection system as claimed in claim 15, wherein the two-dimensional coordinate conversion formulas are $x' = \sum_{j=0}^{m} \sum_{k=0}^{m} a_{jk} x^j y^k$ and $y' = \sum_{j=0}^{m} \sum_{k=0}^{m} b_{jk} x^j y^k$, wherein x and y are coordinates in the first image, x′ and y′ are coordinates in the second image, the coefficients $a_{jk}$ and $b_{jk}$ are real numbers, and the coefficient m is an integer.
22. The image processing method of the optical projection system as claimed in claim 15, wherein the two-dimensional coordinate conversion formulas are x′ = a×x + b×y + c×x×y + d and y′ = e×x + f×y + g×x×y + h, wherein x and y are coordinates in the first image, x′ and y′ are coordinates in the second image, d = x′₁, h = y′₁, a = (x′₂ - d) ÷ sx, e = (y′₂ - h) ÷ sy, c = (x′₃ - a×sx - b×sy - d) ÷ (sx×sy), g = (y′₃ - e×sx - f×sy - h) ÷ (sx×sy), x′₁ and y′₁ are the coordinates of a first feature point among the feature points in the second image, x′₂ and y′₂ are the coordinates of a second feature point among the feature points in the second image, x′₃ and y′₃ are the coordinates of a third feature point among the feature points in the second image, sx is the distance from the first feature point to the second feature point in the first image, and sy is the distance from the second feature point to the third feature point in the first image.
23. The image processing method of the optical projection system as claimed in claim 22, wherein the step of using the two-dimensional coordinate conversion formula comprises:
using the two-dimensional coordinate conversion formulas to convert the coordinate of a calculation point in the first image into the coordinate of the calculation point in the second image;
comparing the coordinate of the calculation point in the second image with the coordinate of the indication point in the second image;
moving the calculation point by a bisection approximation method so that the calculation point approaches the indication point; and
taking the coordinate of the calculation point in the first image as the coordinate of the indication point in the first image.
24. The image processing method of the optical projection system as claimed in claim 15, further comprising:
keeping detecting the coordinate of the indication point in the first image to obtain a plurality of track points; and
connecting the track points in time order.
25. The image processing method of the optical projection system as claimed in claim 24, further comprising:
counting an interval between two track points adjacent in time order; and
if the interval is greater than a time threshold, not connecting the two adjacent track points.
26. The image processing method of the optical projection system as claimed in claim 15, further comprising:
when the indication point is located in a preset area of the first image, triggering a preset function corresponding to the preset area.
27. The image processing method of the optical projection system as claimed in claim 15, further comprising:
keeping detecting a motion track of the indication point in the first image;
drawing the motion track on a transparent layer; and
projecting the transparent layer onto the object surface, wherein the transparent layer is superimposed on the first image.
28. The image processing method of the optical projection system as claimed in claim 27, wherein the first image is the content of a first presentation file, and the image processing method further comprises:
storing the transparent layer containing the motion track in a second presentation file.
CN201110230838.6A 2011-08-12 2011-08-12 Optical projection system and its image processing method (Active) CN102929434B

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110230838.6A CN102929434B 2011-08-12 2011-08-12 Optical projection system and its image processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110230838.6A CN102929434B 2011-08-12 2011-08-12 Optical projection system and its image processing method

Publications (2)

Publication Number Publication Date
CN102929434A CN102929434A (en) 2013-02-13
CN102929434B true CN102929434B (en) 2015-11-11

Family

ID=47644259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110230838.6A Active CN102929434B Optical projection system and its image processing method

Country Status (1)

Country Link
CN (1) CN102929434B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201528049A (en) * 2014-01-03 2015-07-16 Utechzone Co Ltd Correction method based on pattern and electronic apparatus

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008152622A (en) * 2006-12-19 2008-07-03 Mitsubishi Electric Corp Pointing device
CN101526848A (en) * 2008-03-05 2009-09-09 广达电脑股份有限公司 Coordinate judging system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3951984B2 (en) * 2003-08-22 2007-08-01 日本電気株式会社 Image projection method and image projection apparatus


Also Published As

Publication number Publication date
CN102929434A (en) 2013-02-13


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant