CN201853210U - Planar multi-point touch control device based on optical sensing

Planar multi-point touch control device based on optical sensing

Info

Publication number
CN201853210U
Authority
CN
China
Prior art keywords
optical sensing
receiver
control device
point
device based
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2010202738281U
Other languages
Chinese (zh)
Inventor
娄思华
张江山
汤俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WUHAN TRANSVALUE IMAGING CONTROL CO Ltd
Original Assignee
WUHAN TRANSVALUE IMAGING CONTROL CO Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WUHAN TRANSVALUE IMAGING CONTROL CO Ltd
Priority to CN2010202738281U
Application granted
Publication of CN201853210U
Anticipated expiration
Legal status: Expired - Fee Related (Current)


Abstract

The utility model relates to a planar multi-point touch control device based on optical sensing, which comprises three or more optical sensing receivers and a bracket. The three or more optical sensing receivers are each mounted on the bracket, and the bracket fitted with the optical sensing receivers is then installed in front of or behind a display screen; the optical sensing receivers are connected to a computer in a wired or wireless manner. The device has the following advantages: 1. multi-point recognition and processing are more accurate than in other multi-point touch implementations; 2. false points and ghost points in multi-point processing are effectively prevented.

Description

Planar multi-point touch control device based on optical sensing
Technical field
The utility model relates to a planar multi-point touch control device based on optical sensing.
Background art
With the development of multi-point touch in recent years, multi-point touch implementations have improved considerably. However, user requirements for multi-point touch keep rising. To meet this demand, the utility model proposes another multi-point touch implementation, built on continued exploration of the existing implementations.
Summary of the invention
The purpose of the utility model is to provide a planar touch control device based on optical sensing that further improves on existing multi-point touch implementations. The utility model uses binocular (stereo) systems to perform three-dimensional reconstruction, achieving accurate planar positioning and accurate identification of multiple touch points.
The technical solution of the utility model is:
A planar multi-point touch control device based on optical sensing comprises three or more optical sensing receivers and a bracket. It is characterized in that the three or more optical sensing receivers are each mounted on the bracket, the bracket fitted with the optical sensing receivers is then installed in front of or behind the display screen, and the optical sensing receivers are connected to a computer in a wired or wireless manner.
The optical sensing receiver is a visible-light receiver, such as a video camera, a still camera or a laser receiver, or a non-visible-light receiver, such as an infrared thermal imager, an infrared receiver or an ultrasonic receiver.
The utility model has the following advantages:
1. Multi-point recognition and processing are more accurate than in other multi-point touch implementations;
2. False points and ghost points in multi-point processing are effectively prevented.
Brief description of the drawings
Fig. 1 is a schematic structural diagram of one embodiment of the utility model.
Fig. 2 is a schematic structural diagram of another embodiment of the utility model.
Fig. 3 is a schematic diagram of the principle of one embodiment of the utility model.
Fig. 4 is a schematic diagram of the principle of another embodiment of the utility model.
Fig. 5 is a schematic diagram of the solution to the multi-point problem.
Fig. 6 is a schematic diagram of the principle of a general binocular system.
Embodiments
The utility model is described in detail below with reference to the accompanying drawings:
As shown in Figures 1 and 2, the utility model mounts three optical sensing receivers on a bracket; the bracket fitted with the optical sensing receivers is then installed in front of or behind the display screen, and the optical sensing receivers are connected to a computer in a wired or wireless manner. The optical sensing receiver is a visible-light receiver, such as a video camera, a still camera or a laser receiver, or a non-visible-light receiver, such as an infrared thermal imager, an infrared receiver or an ultrasonic receiver.
As shown in Fig. 6, the coordinate derivation for a binocular system is as follows (this example uses video cameras as the optical sensing receivers, but it also applies to other types of optical sensors):
After calibration, the intrinsic and extrinsic parameters of the left and right cameras are obtained, as follows:
Intrinsic parameters of the left camera: focal length $f_l$ and image center coordinates $(u_{l0}, v_{l0})$.
Extrinsic parameters of the left camera: rotation matrix $R_l$, translation vector $T_l$.
Intrinsic parameters of the right camera: focal length $f_r$ and image center coordinates $(u_{r0}, v_{r0})$.
Extrinsic parameters of the right camera: rotation matrix $R_r$, translation vector $T_r$.
Note that the extrinsic parameters of the left and right cameras obtained at this point are each defined with respect to the world coordinate system set up during their own calibration. To determine the three-dimensional coordinates of a spatial point in the binocular field of view, the left and right cameras must, in keeping with human visual habits, be unified under the same world coordinate system.
Suppose now that the optical center of the left camera is the origin of the world coordinate system, i.e. the observer is located at the optical center of the left camera. The right camera coordinate system $o_r\text{-}x_r y_r z_r$ and the world coordinate system $o\text{-}xyz$ are then related by the spatial transformation matrix $[R\ T]$:
$$\begin{pmatrix} x_r \\ y_r \\ z_r \end{pmatrix} = [R\ T]\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix} = \begin{pmatrix} r_1 & r_2 & r_3 & t_x \\ r_4 & r_5 & r_6 & t_y \\ r_7 & r_8 & r_9 & t_z \end{pmatrix}\begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix} \qquad (1.1)$$
where $R$ and $T$ are the rotation matrix and translation vector relating the two cameras under the same world coordinate system.
Since $x_l = R_l x_w + T_l$ and $x_r = R_r x_w + T_r$, eliminating $x_w$ gives $x_r = R_r R_l^{-1} x_l + T_r - R_r R_l^{-1} T_l$. The relation matrix $[R\ T]$ between the left and right cameras is thus obtained, with $R = R_r R_l^{-1}$ and $T = T_r - R_r R_l^{-1} T_l$.
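As a minimal sketch of this step (assuming NumPy; the matrices follow the symbols in the text, while the helper name is illustrative and not from the original):

```python
import numpy as np

def stereo_extrinsics(R_l, T_l, R_r, T_r):
    """Combine per-camera extrinsics into the left-to-right transform [R|T].

    Implements R = Rr * Rl^-1 and T = Tr - Rr * Rl^-1 * Tl, with the
    optical center of the left camera taken as the world origin.
    """
    R = R_r @ np.linalg.inv(R_l)
    T = T_r - R @ T_l
    return R, T
```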
From equation (1.1), the relation between the image coordinates $(u, v)$ of a spatial point $P$ and its world coordinates $(X_w, Y_w, Z_w)$ can be obtained, and hence a formula for extracting the three-dimensional coordinates of the spatial point under binocular vision. The derivation is as follows:
1) Substitute the intrinsic parameters of the left and right cameras obtained from calibration into the matrices
$$A_l = \begin{pmatrix} f_l/dx & 0 & u_{l0} \\ 0 & f_l/dy & v_{l0} \\ 0 & 0 & 1 \end{pmatrix}, \qquad A_r = \begin{pmatrix} f_r/dx & 0 & u_{r0} \\ 0 & f_r/dy & v_{r0} \\ 0 & 0 & 1 \end{pmatrix} \qquad (1.2)$$
where $dx$ and $dy$ are the physical sizes of one pixel in the $x$ and $y$ directions of the image.
For example, if the CCD imaging area is 4.8 mm x 3.6 mm and the image resolution is 768 x 576 pixels, then $dx = 4.8/768 = 0.00625$ mm and $dy = 3.6/576 = 0.00625$ mm.
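A short sketch of how the intrinsic matrix of (1.2) can be assembled from these values (assuming NumPy; the function and parameter names are illustrative, not from the original):

```python
import numpy as np

def intrinsic_matrix(f_mm, u0, v0, sensor_mm=(4.8, 3.6), resolution=(768, 576)):
    """Build the 3x3 intrinsic matrix A from focal length and pixel size.

    dx, dy are the physical pixel sizes derived from the sensor area and
    resolution, as in the CCD example above.
    """
    dx = sensor_mm[0] / resolution[0]   # 4.8 / 768 = 0.00625 mm
    dy = sensor_mm[1] / resolution[1]   # 3.6 / 576 = 0.00625 mm
    return np.array([[f_mm / dx, 0.0,       u0],
                     [0.0,       f_mm / dy, v0],
                     [0.0,       0.0,       1.0]])
```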
2) Let the world coordinates of spatial point $P$ be $(X_w, Y_w, Z_w)$ (the origin of the world coordinate system is the optical center of the left camera), and let its projections on the left and right camera images be $(u_l, v_l)$ and $(u_r, v_r)$ respectively. This yields the following two matrix equations:
$$z_l \begin{pmatrix} u_l \\ v_l \\ 1 \end{pmatrix} = H_l \begin{pmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{pmatrix}, \qquad z_r \begin{pmatrix} u_r \\ v_r \\ 1 \end{pmatrix} = H_r \begin{pmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{pmatrix} \qquad (1.3)$$
where
$$H_l = A_l \times E = \begin{pmatrix} hl_{11} & hl_{12} & hl_{13} & hl_{14} \\ hl_{21} & hl_{22} & hl_{23} & hl_{24} \\ hl_{31} & hl_{32} & hl_{33} & hl_{34} \end{pmatrix}, \qquad E = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix} \qquad (1.4)$$
$$H_r = A_r \times [R\ T] = \begin{pmatrix} hr_{11} & hr_{12} & hr_{13} & hr_{14} \\ hr_{21} & hr_{22} & hr_{23} & hr_{24} \\ hr_{31} & hr_{32} & hr_{33} & hr_{34} \end{pmatrix} \qquad (1.5)$$
Here all elements of $H_l$ and $H_r$ are known. Expanding equation (1.3) and eliminating $z_l$ and $z_r$ gives (writing $u_1 = u_l$, $v_1 = v_l$, $u_2 = u_r$, $v_2 = v_r$):
$$\begin{aligned}
(u_1 hl_{31}-hl_{11})X_w+(u_1 hl_{32}-hl_{12})Y_w+(u_1 hl_{33}-hl_{13})Z_w &= hl_{14}-u_1 hl_{34}\\
(v_1 hl_{31}-hl_{21})X_w+(v_1 hl_{32}-hl_{22})Y_w+(v_1 hl_{33}-hl_{23})Z_w &= hl_{24}-v_1 hl_{34}\\
(u_2 hr_{31}-hr_{11})X_w+(u_2 hr_{32}-hr_{12})Y_w+(u_2 hr_{33}-hr_{13})Z_w &= hr_{14}-u_2 hr_{34}\\
(v_2 hr_{31}-hr_{21})X_w+(v_2 hr_{32}-hr_{22})Y_w+(v_2 hr_{33}-hr_{23})Z_w &= hr_{24}-v_2 hr_{34}
\end{aligned} \qquad (1.6)$$
Let
$$M = \begin{pmatrix} u_1 hl_{31}-hl_{11} & u_1 hl_{32}-hl_{12} & u_1 hl_{33}-hl_{13} \\ v_1 hl_{31}-hl_{21} & v_1 hl_{32}-hl_{22} & v_1 hl_{33}-hl_{23} \\ u_2 hr_{31}-hr_{11} & u_2 hr_{32}-hr_{12} & u_2 hr_{33}-hr_{13} \\ v_2 hr_{31}-hr_{21} & v_2 hr_{32}-hr_{22} & v_2 hr_{33}-hr_{23} \end{pmatrix}, \qquad U = \begin{pmatrix} hl_{14}-u_1 hl_{34} \\ hl_{24}-v_1 hl_{34} \\ hr_{14}-u_2 hr_{34} \\ hr_{24}-v_2 hr_{34} \end{pmatrix}$$
so that $M\,(X_w\ Y_w\ Z_w)^T = U$. Using the least-squares method, the three-dimensional coordinates of the spatial target point $P$ in the world coordinate system whose origin is the optical center of the left camera are:
$$\begin{pmatrix} X_w \\ Y_w \\ Z_w \end{pmatrix} = (M^T M)^{-1} M^T U \qquad (1.7)$$
Since only in-plane coordinate information is required, i.e. the position on our platform, it is sufficient to know the coordinates $X_w$ and $Z_w$; the coordinate $Y_w$, which has the largest error, is discarded.
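A minimal sketch of this least-squares extraction (assuming NumPy; $H_l$ and $H_r$ are the 3x4 projection matrices of (1.4) and (1.5), and the function name is illustrative):

```python
import numpy as np

def triangulate_planar(H_l, H_r, pt_l, pt_r):
    """Recover (Xw, Zw) of a touch point from its two image projections.

    H_l, H_r : 3x4 projection matrices H = A [R|T]; the left camera
               optical center is the world origin, so H_l = A_l [I|0].
    pt_l, pt_r : (u, v) image coordinates in the left and right views.
    Builds the system M [Xw Yw Zw]^T = U of (1.6) and solves it by
    least squares; Yw is discarded as described in the text.
    """
    rows, rhs = [], []
    for (u, v), H in ((pt_l, H_l), (pt_r, H_r)):
        rows.append(u * H[2, :3] - H[0, :3]); rhs.append(H[0, 3] - u * H[2, 3])
        rows.append(v * H[2, :3] - H[1, :3]); rhs.append(H[1, 3] - v * H[2, 3])
    M = np.array(rows)
    U = np.array(rhs)
    Xw, Yw, Zw = np.linalg.lstsq(M, U, rcond=None)[0]
    return Xw, Zw  # planar coordinates on the touch surface
```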
The utility model is further described below with reference to the accompanying drawings.
As shown in Fig. 3, the utility model consists of three optical sensing receivers and a bracket whose length and width differ. The three optical sensing receivers are installed at three corners of the rectangular bracket. In use, the bracket fitted with the optical sensing receivers is installed in front of the display screen, and the optical sensing receivers are connected to a computer in a wired or wireless manner. Each optical sensing receiver may be any one of a video camera, a still camera, a laser receiver, an infrared thermal imager, an infrared receiver or an ultrasonic receiver.
In this embodiment, for the acquired image information to cover the entire touch screen surface, the imaging angle of each optical sensing receiver must be greater than 90 degrees. Optical sensing receivers 1 and 2 form binocular system I, and optical sensing receivers 2 and 3 form binocular system II.
Through calculation, binocular system I yields four candidate point coordinates ①②③④, and binocular system II yields four candidate point coordinates ①②⑤⑥. By comparing the values, equal coordinates are retained and unequal ones are rejected, so the two true point coordinates ① and ② are obtained.
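A minimal sketch of this cross-check between the two binocular systems (the tolerance value and function name are illustrative assumptions, not from the original):

```python
def true_points(candidates_sys1, candidates_sys2, tol=5.0):
    """Keep only candidate coordinates reported by both binocular systems.

    candidates_sys1, candidates_sys2 : lists of (x, z) planar coordinates
    computed independently by binocular systems I and II.
    tol : matching tolerance in the same units as the coordinates.
    Ghost points appear in only one system and are therefore rejected.
    """
    confirmed = []
    for x1, z1 in candidates_sys1:
        for x2, z2 in candidates_sys2:
            if abs(x1 - x2) <= tol and abs(z1 - z2) <= tol:
                confirmed.append(((x1 + x2) / 2.0, (z1 + z2) / 2.0))
                break
    return confirmed
```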
As shown in Fig. 4, the utility model consists of three optical sensing receivers and a bracket, with the three optical sensing receivers placed as shown in the figure. In use, the bracket fitted with the optical sensing receivers is installed in front of the display screen, and the optical sensing receivers are connected to a computer in a wired or wireless manner. Each optical sensing receiver may be any one of a video camera, a still camera, a laser receiver, an infrared thermal imager, an infrared receiver or an ultrasonic receiver.
In this embodiment, for the acquired image information to cover the entire touch screen surface, the imaging angles of optical sensing receivers 1 and 3 must be 90 degrees. Optical sensing receiver 2 uses a wide-angle camera with an imaging angle of 180 degrees. The spatial coordinate calculation for the operating object is the same as in Embodiment 1.
As shown in Fig. 5, after two points have been determined, a newly added point differs from them in its time of appearance. It is therefore only necessary to find the image of the newly added point in each optical sensing receiver and compute its coordinates according to the binocular principle. Likewise, for four points, five points or more, temporal differences are used for matching, and the coordinates of all the points are obtained.
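A minimal sketch of this temporal matching of newly appearing points (the tracking structure, pairing by appearance order, and function names are illustrative assumptions):

```python
def update_touch_points(tracked, new_detections_l, new_detections_r, triangulate):
    """Associate newly appearing image detections by their time of appearance.

    tracked          : list of already confirmed (x, z) touch points.
    new_detections_l : (u, v) detections in the left view not yet associated
                       with any tracked point (the newly added points).
    new_detections_r : the corresponding unassociated detections in the
                       right view, in the same order of appearance.
    triangulate      : function mapping a (left, right) detection pair to
                       planar coordinates, e.g. triangulate_planar above
                       with H_l and H_r fixed.
    Because points are added at different times, the unmatched detections
    in both views at a given moment belong to the same new touch and can
    be paired directly.
    """
    for pt_l, pt_r in zip(new_detections_l, new_detections_r):
        tracked.append(triangulate(pt_l, pt_r))
    return tracked
```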

Claims (3)

1. A planar multi-point touch control device based on optical sensing, comprising three or more optical sensing receivers and a bracket, characterized in that: the three or more optical sensing receivers are each mounted on the bracket, the bracket fitted with the optical sensing receivers is then installed in front of or behind the display screen, and the optical sensing receivers are connected to a computer in a wired or wireless manner.
2. The planar multi-point touch control device based on optical sensing according to claim 1, characterized in that: the optical sensing receiver is a visible-light receiver or a non-visible-light receiver.
3. The planar multi-point touch control device based on optical sensing according to claim 2, characterized in that: the optical sensing receiver is a video camera, a still camera, a laser receiver, an infrared thermal imager, an infrared receiver or an ultrasonic receiver.
CN2010202738281U 2010-07-28 2010-07-28 Planar multi-point touch control device based on optical sensing Expired - Fee Related CN201853210U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010202738281U CN201853210U (en) 2010-07-28 2010-07-28 Planar multi-point touch control device based on optical sensing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010202738281U CN201853210U (en) 2010-07-28 2010-07-28 Planar multi-point touch control device based on optical sensing

Publications (1)

Publication Number Publication Date
CN201853210U true CN201853210U (en) 2011-06-01

Family

ID=44095597

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010202738281U Expired - Fee Related CN201853210U (en) 2010-07-28 2010-07-28 Planar multi-point touch control device based on optical sensing

Country Status (1)

Country Link
CN (1) CN201853210U (en)


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102221941A (en) * 2011-06-24 2011-10-19 武汉传威光控科技有限公司 Frame-type multi-point position and touch monitoring and controlling system and method based on optical sensing
CN102306067A (en) * 2011-08-25 2012-01-04 武汉传威光控科技有限公司 Multi-point measurement and control system and method based on optical sensing distributed multi-system parallel detection
CN102306067B (en) * 2011-08-25 2014-09-24 武汉传威光控科技有限公司 Multi-point measurement and control system and method based on optical sensing distributed multi-system parallel detection
CN102622140A (en) * 2012-03-05 2012-08-01 安徽大学 Image pick-up multi-point touch system
CN102622140B (en) * 2012-03-05 2015-05-13 安徽大学 Image pick-up multi-point touch system
CN107765928A (en) * 2017-04-21 2018-03-06 青岛陶知电子科技有限公司 A kind of multi-touch display system based on graphene optical sensing technology

Similar Documents

Publication Publication Date Title
JP6522630B2 (en) Method and apparatus for displaying the periphery of a vehicle, and driver assistant system
US10194135B2 (en) Three-dimensional depth perception apparatus and method
JP6091586B1 (en) VEHICLE IMAGE PROCESSING DEVICE AND VEHICLE IMAGE PROCESSING SYSTEM
TWI531495B (en) Automatic Calibration Method and System for Vehicle Display System
CN109089074A (en) For looking around the camera angle estimation method of monitoring system
US20160121794A1 (en) In-vehicle display apparatus and program product
KR102057021B1 (en) Panel transformation
CN201853210U (en) Planar multi-point touch control device based on optical sensing
WO2013086249A3 (en) Vehicle vision system with customized display
CN103852060B (en) A kind of based on single visual visible images distance-finding method felt
JP2015114757A (en) Information processing apparatus, information processing method, and program
US9162621B2 (en) Parking support apparatus
KR102124298B1 (en) Rear Cross Traffic-Quick Look
KR20150125767A (en) Method for generating calibration indicator of camera for vehicle
US20110063436A1 (en) Distance estimating apparatus
US11729367B2 (en) Wide viewing angle stereo camera apparatus and depth image processing method using the same
WO1997018523B1 (en) Computer stereo vision system and method
US20160037154A1 (en) Image processing system and method
CN204856623U (en) Scaling board composite set
JP2013124972A (en) Position estimation device and method and television receiver
CN111694374B (en) Flight control method, device, control terminal, flight system and processor
CN101887330B (en) Electronic equipment as well as single-camera object-positioning device and method thereof
TW201537510A (en) 3D AVM (Around View Monitoring) image system based on probabilistic approach and acquisition method thereof
WO2017163648A1 (en) Head-mounted device
CN210277081U (en) Floor sweeping robot

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of utility model: Planar multi-point touch control device based on optical sensing

Effective date of registration: 20141127

Granted publication date: 20110601

Pledgee: Wuhan rural commercial bank Limited by Share Ltd Optics Valley branch

Pledgor: Wuhan Transvalue Imaging Control Co.,Ltd.

Registration number: 2014990001001

PLDC Enforcement, change and cancellation of contracts on pledge of patent right or utility model
PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20150624

Granted publication date: 20110601

Pledgee: Wuhan rural commercial bank Limited by Share Ltd Optics Valley branch

Pledgor: Wuhan Transvalue Imaging Control Co.,Ltd.

Registration number: 2014990001001

PLDC Enforcement, change and cancellation of contracts on pledge of patent right or utility model
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of utility model: Planar multi-point touch control device based on optical sensing

Effective date of registration: 20150630

Granted publication date: 20110601

Pledgee: Wuhan rural commercial bank Limited by Share Ltd Optics Valley branch

Pledgor: Wuhan Transvalue Imaging Control Co.,Ltd.

Registration number: 2015990000519

PLDC Enforcement, change and cancellation of contracts on pledge of patent right or utility model
PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20160607

Granted publication date: 20110601

Pledgee: Wuhan rural commercial bank Limited by Share Ltd Optics Valley branch

Pledgor: Wuhan Transvalue Imaging Control Co.,Ltd.

Registration number: 2015990000519

PLDC Enforcement, change and cancellation of contracts on pledge of patent right or utility model
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of utility model: Planar multi-point touch control device based on optical sensing

Effective date of registration: 20160608

Granted publication date: 20110601

Pledgee: Wuhan rural commercial bank Limited by Share Ltd Optics Valley branch

Pledgor: Wuhan Transvalue Imaging Control Co.,Ltd.

Registration number: 2016420000019

PLDC Enforcement, change and cancellation of contracts on pledge of patent right or utility model
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110601

Termination date: 20170728

CF01 Termination of patent right due to non-payment of annual fee