CN202815785U - Optical touch screen - Google Patents
Optical touch screen
- Publication number
- CN202815785U, CN201220385921U, CN 201220385921
- Authority
- CN
- China
- Prior art keywords
- input device
- image input
- image
- touch screen
- blank
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Image Processing (AREA)
Abstract
The utility model relates to an optical touch screen comprising an image scanning device, border strips, and an ordinary writing whiteboard. Two or more image sensors installed at the corners of the whiteboard scan the border strips opposite them. A grayscale reference touch plane is obtained by decorating the inner sides of the border strips. When an object enters the touch area, the grayscale edges within the border strips are disrupted; the angle subtended by the object at each of the two cameras is obtained from the disruption points, and the position of the object on the whiteboard is then computed by triangulation. The optical touch screen uses the whiteboard's own border for image detection and achieves interactive electronic-whiteboard functionality without adding electronic devices to the board. In addition, the system introduces motion detection and color discrimination: motion detection improves detection accuracy, and color discrimination provides an electronic intelligent color-pen function.
Description
Technical field
The utility model relates to the electronic whiteboard field.
Background technology
Electronic whiteboards can be divided into two classes by touch technology: active touch and passive touch. An active touch system requires a dedicated device to contact the touch surface; the most common interactive device is a battery-powered interaction pen. A passive touch system requires no special device to activate the touch surface; the user can interact entirely with a finger or any other object.
Judging from the development of electronic whiteboards as a whole, passive touch whiteboards have become mainstream. Camera-scanning detection techniques fall into the following categories:
First, a camera installed outside the whiteboard captures the entire board. When a light pen writes on the board, the camera tracks the movement of the light spot to obtain the pen position. The drawback is that the camera must be mounted away from the board, so the light spot is easily blocked when a person writes in front of it.
Second, several linear CCDs mounted on the whiteboard detect the light pen, and the position of the light spot on each linear CCD is used to compute its location. The drawback is that a light pen is required.
Third, several cameras mounted on the whiteboard scan for the touching object, and the object's feature values in each camera are used for coordinate location. This approach is more advanced, but the processing is complex and the camera installation requirements are demanding.
Summary of the invention
The utility model overcomes the shortcomings of the prior art described above by providing an optical touch screen that obtains touch information indirectly by detecting the image of an auxiliary whiteboard border. It allows a finger or any object to touch the whiteboard directly, without a marker pen.
The technical scheme adopted by the utility model to solve this technical problem is as follows: the electronic whiteboard system based on image detection mainly comprises a whiteboard. A first image input device and a second image input device are placed at the left and right corner positions of the whiteboard. The first image input device acquires images of the second border strip and the third border strip, and the second image input device acquires images of the second border strip and the first border strip. The first border strip, second border strip and third border strip carry a black-and-white alternating border on the side facing the interior of the whiteboard. The central optical axes of the first and second image input devices are each angled with respect to the two adjacent border strips, and the height of the central optical axis is aligned with the black-and-white alternating border. The first and second image input devices are each connected to a main processor.
The first image input device and the second image input device each consist of a camera and an image processing unit, and the central optical axis of each camera forms a 45-degree angle with its two adjacent border strips.
The camera consists of a spherical lens with a horizontal field of view greater than 90 degrees and a high-resolution CMOS color image sensor.
The main processor is connected to a computer through a USB device.
With the above technical scheme, the utility model uses the whiteboard's own border for image detection and realizes interactive electronic-whiteboard functionality without adding electronic devices. By taking the whiteboard border as the image detection target and judging feature changes of this reference target, it obtains the touch information of an object on the electronic whiteboard indirectly. The system turns an ordinary writing board into an interactive whiteboard in a simple way and has high practical value.
Description of drawings
Fig. 1 is a schematic diagram of the system composition of the utility model;
Fig. 2 is a schematic diagram of the border-strip decoration of the utility model;
Fig. 3 is a schematic diagram of the composition of the image input device of the utility model;
Fig. 4 is a schematic diagram of the principle of the image processing unit of the utility model;
Fig. 5 is a schematic diagram of the principle of the main processor unit of the utility model;
Fig. 6 is a schematic diagram of the coordinate calculation of the utility model;
Fig. 7 is an installation schematic of the auxiliary light source of the utility model;
Description of reference numerals: first image input device 1, second image input device 2, first border strip 3, second border strip 4, third border strip 5, whiteboard 6, main processor 7, black-and-white alternating border 11, spherical lens 20, CMOS color image sensor 21, programmable logic device 22, digital signal processor 23, camera 26, image processing unit 27, image data buffer 51, image noise filtering module 52, edge extraction module 53, motion detection module 54, color averaging module 55, digital signal module 56, luminance data 60, chrominance data 61, tracking analysis unit 62, difference analysis unit 63, color analysis unit 64, single-chip microcomputer 70, EPROM 71, USB module 80, detected object 100, infrared LED 101, PCB board 110.
Embodiment
The utility model is described in further detail below with reference to the drawings and embodiments:
As shown in Fig. 1, this electronic whiteboard system based on image detection mainly comprises a whiteboard 6. A first image input device 1 and a second image input device 2 (there may also be more than two) are placed at the left and right corner positions of the whiteboard 6. The first image input device 1 and the second image input device 2 each consist of a camera 26 and an image processing unit 27 (see Fig. 3). The camera 26 consists of a spherical lens 20 with a horizontal field of view greater than 90 degrees and a high-resolution CMOS color image sensor 21; after installation, the central optical axis of each camera forms a 45-degree angle with its two adjacent border strips. The first image input device 1 acquires images of the second border strip 4 and the third border strip 5, and the second image input device 2 acquires images of the second border strip 4 and the first border strip 3. The first border strip 3, the second border strip 4 and the third border strip 5 carry a black-and-white alternating border 11 on the side facing the interior of the whiteboard 6 (see Fig. 2; the inner face of each strip is decorated with alternating gray levels). The mounting height of the cameras above the plane of the whiteboard 6 is adjusted so that their central optical axes face the black-and-white alternating border of the strips. If the writing board is light-colored and the frame is dark, the boundary formed where the two meet can also be used directly for image detection.
The steps for using the utility model are as follows:
1. Place the first image input device 1 and the second image input device 2 at the left and right corner positions of the whiteboard 6. The first border strip 3, the second border strip 4 and the third border strip 5 carry a black-and-white alternating border 11 on the side facing the interior of the whiteboard 6. The central optical axes of the first image input device 1 and the second image input device 2 are each angled with respect to the two adjacent border strips; adjust the height of each central optical axis so that it faces the black-and-white alternating border 11.
2. The first image input device 1 acquires images of the second border strip 4 and the third border strip 5, and the second image input device 2 acquires images of the second border strip 4 and the first border strip 3; that is, two image sensors installed at the corners of an ordinary whiteboard scan the border strips opposite them.
3. The black-and-white alternating border 11 provides a grayscale reference touch plane. When an object enters the touch area, the grayscale edge within the border strip is disrupted; the angle subtended by the object at each of the two cameras is derived from the disruption points, and the position of the object on the whiteboard is obtained by triangulation. The image processing unit performs edge extraction on the grayscale image of the inner border. When no object is touching, the extracted grayscale edge profile is continuous and straight; when an object touches the whiteboard, the border image is occluded by the object and its edge profile is distorted or broken. Knowing the distortion points produced by the object in two or more cameras is enough to calculate the object's position on the whiteboard. (When an object or finger enters the touch area it disrupts the acquired border image, and the touch information is obtained indirectly from that disruption.)
4. The first image input device 1 and the second image input device 2 perform motion detection on the images of the whiteboard border, and perform color identification on the object moving within the whiteboard. In addition to the edge-profile detection described above, the utility model also uses motion detection as an auxiliary discriminator: the first and second image input devices perform motion detection on the border image in order to improve detection accuracy. The image processing unit analyzes the difference between two successive border-image frames to judge whether an object is moving near the touch area.
In the system of the utility model, the first and second image input devices perform color identification on the object moving within the whiteboard. By analyzing the color components of the object on the color image sensor, the color of the object is determined and the writing information reported to the PC is colored accordingly. The benefit is that whatever color of object the user writes with, the writing appears in that color.
The utility model uses the whiteboard's own border for image detection and realizes interactive electronic-whiteboard functionality without adding electronic devices. By taking the whiteboard border as the image detection target and judging feature changes of this reference target, it obtains the touch information of an object on the electronic whiteboard indirectly; the system turns an ordinary writing board into an interactive whiteboard in a simple way and has high practical value. The system additionally introduces motion detection and color identification: the former improves detection accuracy, and the latter provides an electronic intelligent color-pen function.
The image processing unit 27 consists of an FPGA programmable logic device 22 and a DSP digital signal processor 23; the FPGA is responsible for pre-processing of the image data, and the DSP is responsible for post-processing of the image data.
Implementation of the image processing unit:
The capture window of the image sensor 21 is cropped so that it scans the decorated band of the inner border, and its output image data is fed into the image processing unit 27 for processing, as shown in Fig. 4. The image data buffer 51 in the image processing unit 27 separates the image data from the image sensor 21 into luminance data 60 and chrominance data 61. The luminance data 60 is sent to the image noise filtering module 52 for noise filtering; the noise-filtered luminance data is then sent to the edge extraction module 53 and the motion detection module 54 for image pre-processing, while the buffered chrominance data 61 is sent to the color averaging module 55 for color averaging.
The implementation of the above image processing modules is analyzed below. Let the length direction of the image strip be x and the width direction be y:
1) Noise filtering module 52
In our implementation we use 2D median filtering, namely
g(x,y)=MED{p(x-i,y-j)},(i,j)∈S (1)
where MED{} takes the median of the elements in the set, p(x, y) is the pixel gray value, g(x, y) is the pixel gray value after processing, and S is the template window, chosen as 3 × 3.
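As an illustration of formula (1), the following Python sketch applies a 3 × 3 median window to a grayscale strip image. The function name, the edge-replication padding and the test data are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def median_filter_3x3(gray):
    """Formula (1): g(x, y) = MED{p(x - i, y - j)}, (i, j) in a 3x3 window S."""
    h, w = gray.shape
    padded = np.pad(gray, 1, mode="edge")      # replicate edges so every pixel has a full window
    out = np.empty_like(gray)
    for y in range(h):
        for x in range(w):
            window = padded[y:y + 3, x:x + 3]  # 3x3 neighbourhood centred on (x, y)
            out[y, x] = np.median(window)      # take the median value
    return out

# Example: a noisy strip image of the border band
strip = np.random.randint(0, 256, size=(16, 640)).astype(np.uint8)
filtered = median_filter_3x3(strip)
```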
2) Edge extraction module 53
In our implementation we use horizontal edge extraction (formula (2)), where g(x, y) is the pixel gray value and d(x, y) is the gray value after edge extraction.
We then binarize and thin d(x, y) to obtain the boundary curve y = f(x).
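A minimal sketch of this step is given below. The exact edge operator of formula (2) is not reproduced in the text, so the sketch assumes a simple gray-level difference along y; the threshold value and the "first edge per column" thinning rule are likewise illustrative assumptions.

```python
import numpy as np

def extract_boundary_curve(gray, threshold=30):
    """Sketch of module 53: horizontal edge extraction, binarization and thinning.

    Assumption: d(x, y) = |g(x, y) - g(x, y-1)|, since formula (2) is not
    reproduced in the text. The threshold is illustrative.
    """
    g = gray.astype(np.int16)
    d = np.abs(g[1:, :] - g[:-1, :])           # edge strength as a difference along y
    binary = d > threshold                     # binarization
    # "Thinning": keep, for every column x, the first row y where an edge was found
    f = np.full(gray.shape[1], -1, dtype=np.int32)
    for x in range(gray.shape[1]):
        ys = np.nonzero(binary[:, x])[0]
        if ys.size:
            f[x] = ys[0]                       # boundary curve y = f(x)
    return f
```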
3) Motion detection module 54
In our implementation we subtract the initial reference frame captured at power-on from the current frame, namely
m(x,y)=|g1(x,y)-g0(x,y)| (3)
where g1(x, y) is the gray data of the current frame, g0(x, y) is the gray data of the initial reference frame, and m(x, y) is the output comparison frame.
We then binarize the comparison frame m(x, y) to obtain the binary image frame n(x, y).
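The following sketch implements formula (3) followed by binarization; the threshold value is an illustrative assumption, since the patent does not specify one.

```python
import numpy as np

def motion_mask(current, reference, threshold=25):
    """Formula (3): m(x, y) = |g1(x, y) - g0(x, y)|, then binarize to n(x, y)."""
    m = np.abs(current.astype(np.int16) - reference.astype(np.int16))
    n = (m > threshold).astype(np.uint8)   # n(x, y) = 1 where the border image changed, else 0
    return n

# Example usage: compare the current border frame against the power-on reference frame
# n = motion_mask(current_frame, reference_frame)
```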
4) Color averaging module 55
In our implementation we average the chrominance data near the initially extracted boundary curve (formula (4)), where f0(x) is the curve function from the initial edge extraction, Cr(x, y) is the pixel value of the Cr color-difference component in the image, Cb(x, y) is the pixel value of the Cb color-difference component, K is the chroma sampling window, chosen as 2, and Cr(x), Cb(x) are the mean values of the corresponding color-difference components.
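Formula (4) itself is not reproduced in the text, so the sketch below assumes that Cr(x) and Cb(x) are obtained by averaging the chroma pixels within ±K rows of the boundary curve f0(x) for each column x, with K = 2 as stated. Function and variable names are illustrative.

```python
import numpy as np

def chroma_average_along_curve(cr, cb, f0, K=2):
    """Sketch of module 55: average Cr/Cb near the initial boundary curve f0(x).

    Assumption: a window of +/-K rows around f0(x) per column (formula (4) is
    not reproduced in the text).
    """
    h, w = cr.shape
    cr_mean = np.zeros(w)
    cb_mean = np.zeros(w)
    for x in range(w):
        if f0[x] < 0:                          # no boundary found in this column
            continue
        y0, y1 = max(0, f0[x] - K), min(h, f0[x] + K + 1)
        cr_mean[x] = cr[y0:y1, x].mean()       # Cr(x): mean Cr near the curve
        cb_mean[x] = cb[y0:y1, x].mean()       # Cb(x): mean Cb near the curve
    return cr_mean, cb_mean
```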
After passing through the edge extraction module 53, the noise-filtered image data is sent to the tracking analysis unit 62 in the digital signal module 56 for curve fitting. The fitting compares the edge extraction curve captured at power-on with the current edge extraction curve, and records the coordinate value x whenever the fitting criterion is met. The fitting criterion is defined as follows:
|f1(x)-f0(x)| > K (5)
where f0(x) is the initially extracted boundary curve function, f1(x) is the currently extracted boundary curve, and K is the fitting threshold, chosen as 2.
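A short sketch of the tracking analysis of formula (5): record the columns x where the current boundary curve deviates from the power-on curve by more than the threshold K = 2. The function name is illustrative.

```python
import numpy as np

def occluded_columns_by_fit(f0, f1, K=2):
    """Formula (5): record x where |f1(x) - f0(x)| > K (fitting threshold K = 2)."""
    f0 = np.asarray(f0, dtype=np.int32)
    f1 = np.asarray(f1, dtype=np.int32)
    return set(np.nonzero(np.abs(f1 - f0) > K)[0].tolist())   # coordinate set Xa
```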
After passing through the motion detection module 54, the noise-filtered image data is sent to the difference analysis unit 63 in the digital signal module 56 for difference discrimination. The template window used for discrimination slides along the initially extracted boundary curve, and the coordinate value x is recorded whenever the difference criterion is met. Let n(x, y) be 1 where an image difference occurs and 0 where no image difference occurs; the difference criterion is given by formula (6),
where ADD{} sums the elements in the set, n(x, y) is the binary image data obtained from formula (3), N is the number of pixels in the template window, S is the template window, chosen as 5 × 5, (x0, y0) is the template center coordinate, and f0(x) is the curve function from the initial edge extraction.
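A sketch of the difference analysis is given below. Formula (6) itself is not reproduced in the text, so the criterion used here, that the count of changed pixels ADD{n(x, y)} within the 5 × 5 window must exceed half of N, is an assumption for illustration; the function and parameter names are also illustrative.

```python
import numpy as np

def occluded_columns_by_difference(n, f0, win=5, ratio=0.5):
    """Sketch of difference analysis unit 63.

    Slide a win x win template window S along the initial boundary curve f0(x)
    and record x where ADD{n(x, y)} exceeds ratio * N. The ratio threshold is an
    assumption; formula (6) is not reproduced in the text.
    """
    h, w = n.shape
    half = win // 2
    N = win * win                               # number of pixels in the template window
    hits = set()
    for x0 in range(half, w - half):
        y0 = int(f0[x0])
        if y0 < half or y0 >= h - half:
            continue                            # window would fall outside the image
        window = n[y0 - half:y0 + half + 1, x0 - half:x0 + half + 1]
        if window.sum() > ratio * N:            # difference criterion met
            hits.add(x0)                        # coordinate set Xb
    return hits
```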
The digital signal processing module 56 then comprehensively analyzes the two sets of records. Suppose the coordinate set recorded by the tracking analysis unit 62 is Xa and the coordinate set recorded by the difference analysis unit 63 is Xb; the actual coordinate set X is
X = Xa ∩ Xb (7)
Taking the midpoint of the extent of the coordinate set X gives the actual position of the object on the image sensor 21. After the digital signal processing module 56 obtains the object position, it extracts the color average at that position from the color averaging module 55, and the color analysis unit 64 converts it into a color value output according to predefined color-difference component intervals.
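The sketch below combines formula (7) with the midpoint step, and additionally shows one way a sensor column could be turned into a viewing angle for the triangulation that follows. The linear column-to-angle mapping is purely an illustrative assumption; the patent does not specify how the angle is derived from the sensor position.

```python
def object_column(xa, xb):
    """Formula (7): X = Xa ∩ Xb; the object position is the midpoint of X's extent."""
    x = xa & xb
    if not x:
        return None                              # no touch detected
    return (min(x) + max(x)) / 2.0

def column_to_angle(col, image_width, fov_deg=90.0):
    """Illustrative only: map a sensor column to a viewing angle, assuming the
    columns span the field of view linearly (not specified in the patent)."""
    return col / image_width * fov_deg
```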
At this point we have obtained the position information and the color information of the object at a single camera. The information from the two image input devices is then passed to the main processor 7 over a serial interface. The final color value of the object is the average of the two color values, and its planar coordinates are computed by triangulation; the calculation diagram is shown in Fig. 6, and the specific calculation is as follows:
Let α be the angle subtended by the detected object 100 at the first image input device 1, β the angle measured by the second image input device 2, and L the width of the whiteboard. Taking the upper-left corner of the whiteboard as the origin, with y downwards and x to the right, the object coordinates are:
x=L×tanβ/(tanα+tanβ), (8)
y=L×tanα×tanβ/(tanα+tanβ)
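Formula (8) in code form: given the two camera angles and the whiteboard width, compute (x, y) with the origin at the upper-left corner, x to the right and y downward. The function name and the example values are illustrative.

```python
import math

def locate(alpha_deg, beta_deg, L):
    """Formula (8): triangulate the object position from the two camera angles.

    alpha_deg: angle measured by the first image input device (upper-left corner)
    beta_deg:  angle measured by the second image input device (upper-right corner)
    L:         width of the whiteboard (same unit as the returned coordinates)
    """
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    x = L * tb / (ta + tb)
    y = L * ta * tb / (ta + tb)
    return x, y

# Example: alpha = 30 degrees, beta = 45 degrees, board width 1200 mm
print(locate(30.0, 45.0, 1200.0))
```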
To support image detection of the inner border under any lighting environment, auxiliary lighting is also installed on the camera heads, as shown in Fig. 7: infrared LEDs 101 mounted on the PCB board 110 provide fill light for the detected inner border.
In addition to the embodiments described above, the utility model can have other embodiments. All technical schemes formed by equivalent substitution or equivalent transformation fall within the scope of protection claimed by the utility model.
Claims (4)
1. An optical touch screen, mainly comprising a whiteboard, characterized in that: a first image input device and a second image input device are placed at the left and right corner positions of the whiteboard respectively, wherein the first image input device acquires images of the second border strip and the third border strip, and the second image input device acquires images of the second border strip and the first border strip; the first border strip, the second border strip and the third border strip carry a black-and-white alternating border on the side facing the interior of the whiteboard; the central optical axes of the first image input device and the second image input device are each angled with respect to the two adjacent border strips, and the height of the central optical axis faces the black-and-white alternating border; the first image input device and the second image input device are each connected to a main processor.
2. The optical touch screen according to claim 1, characterized in that: the first image input device and the second image input device each consist of a camera and an image processing unit, and the central optical axis of each of the two cameras forms a 45-degree angle with its two adjacent border strips.
3. The optical touch screen according to claim 2, characterized in that: the camera consists of a spherical lens with a horizontal field of view greater than 90 degrees and a high-resolution CMOS color image sensor.
4. The optical touch screen according to claim 1, characterized in that: the main processor is connected to a computer through a USB device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN 201220385921 CN202815785U (en) | 2012-08-06 | 2012-08-06 | Optical touch screen |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN 201220385921 CN202815785U (en) | 2012-08-06 | 2012-08-06 | Optical touch screen |
Publications (1)
Publication Number | Publication Date |
---|---|
CN202815785U (en) | 2013-03-20 |
Family
ID=47874634
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN 201220385921 Expired - Fee Related CN202815785U (en) | Optical touch screen | 2012-08-06 | 2012-08-06 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN202815785U (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104635998A (en) * | 2013-11-15 | 2015-05-20 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN104635998B (en) * | 2013-11-15 | 2018-03-23 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
CN104007883A (en) * | 2014-06-19 | 2014-08-27 | 安徽春勤教育装备有限公司 | Locating-improved optical touch electronic whiteboard and locating method thereof |
CN104007883B (en) * | 2014-06-19 | 2017-02-15 | 安徽春勤教育装备有限公司 | Locating-improved optical touch electronic whiteboard and locating method thereof |
CN106095202A (en) * | 2016-08-23 | 2016-11-09 | 苏州优函信息科技有限公司 | Utilize the cloud plate of two Linear Array Realtime imageing sensor pen with means for holding it in right position tongue marks |
Similar Documents
Publication | Title |
---|---|
CN101464751B (en) | Electronic white board system based on image detection and detection method thereof | |
CN105469113B (en) | A kind of skeleton point tracking method and system in two-dimensional video stream | |
CN103150549B (en) | A kind of road tunnel fire detection method based on the early stage motion feature of smog | |
CN102122390B (en) | Method for detecting human body based on range image | |
CN102930334B (en) | Video recognition counter for body silhouette | |
CN104360501A (en) | Visual detection method and device for defects of liquid crystal display screen | |
CN104616275A (en) | Defect detecting method and defect detecting device | |
Song et al. | Design of control system based on hand gesture recognition | |
CN102880865A (en) | Dynamic gesture recognition method based on complexion and morphological characteristics | |
CN105427320A (en) | Image segmentation and extraction method | |
CN103218605A (en) | Quick eye locating method based on integral projection and edge detection | |
CN102708383A (en) | System and method for detecting living face with multi-mode contrast function | |
CN103424404A (en) | Material quality detection method and system | |
CN104484645A (en) | Human-computer interaction-oriented '1' gesture-recognition method and system | |
CN105138990A (en) | Single-camera-based gesture convex hull detection and palm positioning method | |
CN106682665A (en) | Digital recognition method for seven-segment digital indicator | |
CN112329646A (en) | Hand gesture motion direction identification method based on mass center coordinates of hand | |
CN202815785U (en) | Optical touch screen | |
CN106228541A (en) | Screen positioning method and device in visual inspection | |
CN106327464A (en) | Edge detection method | |
CN103886319A (en) | Intelligent held board recognizing method based on machine vision | |
CN104662560A (en) | Method and system for processing video image | |
CN102521567A (en) | Human-computer interaction fingertip detection method, device and television | |
CN102610104A (en) | Onboard front vehicle detection method | |
CN104008396A (en) | In and out people flow statistical method based on people head color and shape features |
Legal Events
Code | Title | Description
---|---|---
C14 | Grant of patent or utility model |
GR01 | Patent grant |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20130320; Termination date: 20180806