CN103838438B - Infrared multipoint positioning method - Google Patents

Infrared multipoint positioning method

Info

Publication number
CN103838438B
CN103838438B CN201410114588.3A
Authority
CN
China
Prior art keywords
point
viewing angle
distance
true
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410114588.3A
Other languages
Chinese (zh)
Other versions
CN103838438A (en)
Inventor
张自能
杨运
肖时航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Huayuan Shengya Science And Technology Co ltd
Original Assignee
QLTOUCH TECH Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by QLTOUCH TECH Co Ltd filed Critical QLTOUCH TECH Co Ltd
Priority to CN201410114588.3A priority Critical patent/CN103838438B/en
Publication of CN103838438A publication Critical patent/CN103838438A/en
Application granted granted Critical
Publication of CN103838438B publication Critical patent/CN103838438B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention relates to an infrared multipoint positioning method comprising the following steps: preset a window distance threshold t1; traverse each viewing angle in the horizontal direction and count the blocked-region information of each viewing angle, obtaining region counts K1, K2, ..., KN; traverse each viewing angle in the vertical direction and count the blocked-region information of each viewing angle, obtaining region counts P1, P2, ..., PM; calculate the standard score of each point in the candidate point set; accumulate the true score of each point in the candidate point set; and judge whether each point is true or false. The benefits of the invention are: the concept of multiple viewing angles is introduced; the two viewing angles with the most blocked regions are used as the reference viewing angles for calculating the candidate touch points; the minimum distance from a blocked region to a point is used as the reference against which the other distances are compared; and the auxiliary viewing angles are used for scoring. These four points are the most important.

Description

Infrared multipoint positioning method
Technical field
The present invention relates to an infrared multipoint positioning method.
Background technology
Existing infrared multipoint recognition technologies fall into two kinds, logical judgment methods and image processing methods, both based on one-to-many hardware scanning. They derive obstacle boundaries from the light-path data, derive a standard (candidate) touch point set from those boundaries, and set a true-touch-point threshold; each candidate touch point is then traversed and the credibility that the current point is a true touch point is determined. Credibility is judged as follows: for the region formed by the boundary, enumerate the light paths passing through this region and reason backwards to count how many of these light paths are actually blocked. The actual blocked count is divided by the theoretical count; if the ratio exceeds the threshold, the point is a true point, otherwise it is a false point and is removed. For tracking and output, the touch-point centroid and the exact boundary can each be used as the decision region, which works well when the touch points are far apart. When the touch points are close together, however, mutual occlusion causes errors. In addition, when touch-point boundaries interact, the boundaries change, so the derived touch-point region (decision region) is wrong and the true/false judgment of the touch points fails.
The basic idea of the image processing method is that a true touch point is clearly larger than the grid size, so the erosion (filtering) strength is tuned to remove grid noise while keeping the true touch points. Consequently the image method cannot identify an actual touch object whose size is the same as or close to the grid noise, even though blocked light paths clearly exist; in addition, the time difference during high-speed motion can cause points to be lost after erosion. The inventors found that the above multipoint schemes have significant deficiencies: the logical method only uses the local characteristics of the light paths and cannot take into account the overall characteristics of the light paths and the mutual influence of true touch points, while the image method only uses global characteristics, cannot identify touch objects comparable in size to the noise, requires heavy computation and memory, is very sensitive to delay, and easily loses true touch points. The logical method has the advantages of simple computation and a small processing load. The present invention proposes an improved logical method that correctly handles delayed data, reduces or eliminates the mutual occlusion of touch points in multi-touch situations, identifies all true touch points more correctly, and better rejects ghost points.
Summary of the invention
In view of the shortcomings of the problems described above, the present invention provides an infrared multipoint positioning method.
To achieve the above object, the present invention provides an infrared multipoint positioning method, the positioning method comprising the following steps:
Step 1: preset a window distance threshold t1, a difference distance threshold t2, and a true-point decision threshold t3;
Step 2: determine the reference viewing angles in the horizontal and vertical directions respectively;
Step 3: calculate the standard score of each point in the candidate point set derived from the horizontal and vertical reference viewing angles;
Step 4: calculate the true score of each point in the candidate point set derived from the horizontal and vertical reference viewing angles;
Step 5: judge whether each point is true or false.
Further, in step 1, a blocked region contributes a score only when its distance to the point is less than the threshold t1; otherwise no score is given, which prevents misjudgments caused by jumper wires and unstable light paths. Taking the minimum distance between a point and the blocked region as the reference, if the difference between the distance of another point to this blocked region and the minimum distance is less than the threshold t2, that point is also scored. The threshold t3 is a decimal between 0 and 1, i.e. a percentage.
Further, in step 2, traverse each viewing angle in the horizontal direction and count the blocked-region information of each viewing angle; the obtained region counts are K1, K2, ..., KN. Traverse each viewing angle in the vertical direction and count the blocked-region information of each viewing angle; the obtained region counts are P1, P2, ..., PM. In the horizontal direction, the i-th viewing angle (1 ≤ i ≤ N), the one with the largest region count Ki, is taken as the horizontal reference viewing angle; in the vertical direction, the j-th viewing angle (1 ≤ j ≤ M), the one with the largest region count Pj, is taken as the vertical reference viewing angle.
Further, in step 3, compute the intersections of the two reference viewing angles to obtain the reference scene, i.e. the intersection set. The intersections of the center lines serve as the intersections of the regions (a point here refers to a touch area of the reference scene obtained jointly from the horizontal and vertical reference viewing angles); these intersections form the candidate point set, and the intersections of the region boundaries determine the size of each contact. From the obtained point positions and point sizes (again, a point being a touch area of the reference scene obtained jointly from the two reference viewing angles), and the slope of each light-path viewing angle (the viewing angle of a light path refers to the direct line and the oblique lines through the blocked point: the direct line corresponds to a reference viewing angle, the oblique lines to auxiliary viewing angles), reason backwards to count the number of viewing angles passing through a given point (a given point means a touch point, i.e. a blocked point). This count is the standard score of the point; if the computed value is 0, it is forced to 1.
Further, in step 4, excluding the reference viewing angles in the horizontal and vertical directions, treat the other viewing angles as auxiliary viewing angles, and compute the distance between each blocked region in the auxiliary viewing angles (the reference scene obtained jointly from the horizontal and vertical reference viewing angles, i.e. the blocked regions) and each point in the candidate point set. For each blocked region, sort its distances to all candidate points from smallest to largest and take the minimum distance as the reference distance. If the reference distance d < t1, the point corresponding to d scores 1; if another distance dOther satisfies dOff < t2, where dOff = dOther - d, the point corresponding to that distance also scores 1; otherwise no score is given.
Further, in step 5, for each point in the candidate point set, compute the ratio r of its true score to its standard score. If r > t3 the point is true, otherwise it is false, and smooth output is performed according to the true points and the historical record.
The benefits of the invention are: the concept of multiple viewing angles is introduced; the two viewing angles with the most blocked regions are used as the reference viewing angles (they have the highest resolution) and are used to calculate the candidate touch points; the minimum distance from a blocked region to a point is used as the reference distance against which the other distances are compared (this eliminates the error introduced by delay, for example the inconsistent on/off timing of the infrared tubes and rapid contact movement); and the auxiliary viewing angles (the other viewing angles) are used for scoring. These four points are the most important. After the touch-point information has been extracted, remaining ghost points can be further removed according to the mutual constraints among the touch points; this step is not mandatory, but it makes the algorithm more complete. Many steps in the scheme could be replaced by other algorithms, so it is the overall roadmap and the underlying principle that should be protected.
Description of the drawings
Fig. 1 is a viewing-angle definition diagram of the infrared multipoint positioning method of the present invention;
Fig. 2 is a schematic diagram of the blocked regions of the infrared multipoint positioning method of the present invention;
Fig. 3 is a schematic diagram of the distance between a blocked region and a point in the infrared multipoint positioning method of the present invention.
Detailed description of the invention
As shown in Figs. 1-3, the infrared multipoint positioning method described in the embodiment of the present invention comprises the following steps:
Step 1: preset a window distance threshold t1, a difference distance threshold t2, and a true-point decision threshold t3;
Step 2: determine the reference viewing angles in the horizontal and vertical directions respectively;
Step 3: calculate the standard score of each point in the candidate point set derived from the horizontal and vertical reference viewing angles;
Step 4: calculate the true score of each point in the candidate point set derived from the horizontal and vertical reference viewing angles;
Step 5: judge whether each point is true or false.
Further, in step 1, a blocked region contributes a score only when its distance to the point is less than the threshold t1; otherwise no score is given, which prevents misjudgments caused by jumper wires and unstable light paths. Taking the minimum distance between a point and the blocked region as the reference, if the difference between the distance of another point to this blocked region and the minimum distance is less than the threshold t2, that point is also scored. The threshold t3 is a decimal between 0 and 1, i.e. a percentage.
Further, in step 2, traverse each viewing angle in the horizontal direction and count the blocked-region information of each viewing angle; the obtained region counts are K1, K2, ..., KN. Traverse each viewing angle in the vertical direction and count the blocked-region information of each viewing angle; the obtained region counts are P1, P2, ..., PM. In the horizontal direction, the i-th viewing angle (1 ≤ i ≤ N), the one with the largest region count Ki, is taken as the horizontal reference viewing angle; in the vertical direction, the j-th viewing angle (1 ≤ j ≤ M), the one with the largest region count Pj, is taken as the vertical reference viewing angle.
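For illustration only (an editorial sketch, not part of the original disclosure), the selection of the reference viewing angles can be expressed in a few lines, assuming the blocked regions seen from each viewing angle are available as per-angle lists:

```python
# Sketch: pick the viewing angle that sees the most blocked regions.
# `regions_per_angle[i]` is assumed to be the list of blocked regions
# observed from the i-th viewing angle of one direction.

def pick_reference_angle(regions_per_angle):
    """Return the index of the viewing angle with the largest region count."""
    counts = [len(regions) for regions in regions_per_angle]   # K1..KN or P1..PM
    return max(range(len(counts)), key=counts.__getitem__)

# i = pick_reference_angle(horizontal_regions)   # horizontal reference viewing angle
# j = pick_reference_angle(vertical_regions)     # vertical reference viewing angle
```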
In step 3, compute the intersections of the two reference viewing angles to obtain the reference scene, i.e. the intersection set. The intersections of the center lines serve as the intersections of the regions; these intersections form the candidate point set, and the intersections of the region boundaries determine the size of each contact. From the obtained point positions and point sizes, and the slopes of the light-path viewing angles, reason backwards to count the number of viewing angles passing through a given point. This count is the standard score of the point; if the computed value is 0, it is forced to 1.
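As an illustrative sketch only, and under the simplifying assumption that the horizontal reference viewing angle yields blocked intervals along the x axis and the vertical reference viewing angle yields blocked intervals along the y axis (the real geometry depends on the frame layout and the light-path slopes; the standard-score counting from the slopes is not shown), the candidate point set could be formed as follows:

```python
# Sketch: candidate points as intersections of the center lines of the blocked
# regions seen by the two reference viewing angles; the boundary intersections
# give an estimate of the contact size.

def candidate_points(h_regions, v_regions):
    """h_regions, v_regions: lists of (start, end) blocked intervals."""
    candidates = []
    for hx0, hx1 in h_regions:
        for vy0, vy1 in v_regions:
            center = ((hx0 + hx1) / 2.0, (vy0 + vy1) / 2.0)  # center-line intersection
            size = (hx1 - hx0, vy1 - vy0)                    # boundary intersection -> contact size
            candidates.append({"pos": center, "size": size})
    return candidates
```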
Further, in step 4, excluding the reference viewing angles in the horizontal and vertical directions, treat the other viewing angles as auxiliary viewing angles, and compute the distance between each blocked region in the auxiliary viewing angles and each point in the candidate point set. For each blocked region, sort its distances to all candidate points from smallest to largest and take the minimum distance as the reference distance. If the reference distance d < t1, the point corresponding to d scores 1; if another distance dOther satisfies dOff < t2, where dOff = dOther - d, the point corresponding to that distance also scores 1; otherwise no score is given.
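The windowed scoring rule of step 4 can be sketched as below; the `distance` helper is an assumption (for example, the point-to-line distance given by the formula further down, applied to the center line of a blocked region), and this is not the authoritative implementation:

```python
# Sketch: accumulate true scores over the blocked regions of one auxiliary
# viewing angle. `aux_regions` is that angle's list of blocked regions and
# `candidates` is the candidate point set from the reference viewing angles.

def accumulate_true_scores(aux_regions, candidates, t1, t2, distance):
    true_score = [0] * len(candidates)
    for region in aux_regions:
        dists = sorted(
            (distance(region, c["pos"]), idx) for idx, c in enumerate(candidates)
        )
        d, nearest = dists[0]                    # minimum distance = reference distance
        if d < t1:
            true_score[nearest] += 1             # the nearest candidate scores 1
            for d_other, idx in dists[1:]:
                if d_other - d < t2:             # dOff = dOther - d
                    true_score[idx] += 1         # near-tied candidates also score 1
                else:
                    break                        # list is sorted, the rest are farther
    return true_score
```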
Further, in step 5, for each point in the candidate point set, compute the ratio r of its true score to its standard score. If r > t3 the point is true, otherwise it is false, and smooth output is performed according to the true points and the historical record.
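A minimal sketch of this decision rule (smoothing against historical records omitted), assuming the score lists produced as above:

```python
# Sketch: keep a candidate as a true touch point when its true score reaches a
# large enough fraction (threshold t3, a decimal in 0-1) of its standard score.

def judge_points(candidates, standard_score, true_score, t3):
    true_points = []
    for c, s, t in zip(candidates, standard_score, true_score):
        s = max(s, 1)                    # the standard score is forced to at least 1
        r = t / float(s)                 # ratio of true score to standard score
        if r > t3:
            true_points.append(c)        # true point; others are rejected as false
    return true_points
```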
The computational formula used in this application:
d = \frac{|Ax_0 + By_0 + C|}{\sqrt{A^2 + B^2}}
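This is the standard distance from a point (x0, y0) to the line Ax + By + C = 0. For illustration, a small helper computing it, which could serve as the `distance` function assumed in the step-4 sketch above (with a blocked region reduced to its center line):

```python
import math

# Point-to-line distance matching the formula above: a light path is the line
# A*x + B*y + C = 0 and (x0, y0) is a candidate point.

def point_line_distance(a, b, c, x0, y0):
    return abs(a * x0 + b * y0 + c) / math.sqrt(a * a + b * b)
```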
Data:
Algorithm input: A/D acquisition data, for example as follows:
Algorithm output:
The calculated x-y coordinates; the data type is float.
The foregoing is only a preferred embodiment of the present invention. Those skilled in the art, after understanding the technical means of the present invention, can naturally make changes according to actual needs under the teaching of the present invention. All equivalent changes and modifications made within the scope of the present patent shall therefore remain within the scope of the patent.

Claims (2)

1. An infrared multipoint positioning method, characterized in that the positioning method comprises the following steps:
Step 1: preset a window distance threshold t1, a difference distance threshold t2, and a true-point decision threshold t3;
A blocked region contributes a score only when its distance to the point is less than the threshold t1; otherwise no score is given, which prevents misjudgments caused by jumper wires and unstable light paths. Taking the minimum distance between a point and the blocked region as the reference, if the difference between the distance of another point to this blocked region and the minimum distance is less than the threshold t2, that point is also scored. The threshold t3 is a decimal between 0 and 1, i.e. a percentage;
Step 2: determine the reference viewing angles in the horizontal and vertical directions respectively;
Traverse each viewing angle in the horizontal direction and count the blocked-region information of each viewing angle; the obtained region counts are K1, K2, ..., KN. Traverse each viewing angle in the vertical direction and count the blocked-region information of each viewing angle; the obtained region counts are P1, P2, ..., PM. In the horizontal direction, the i-th viewing angle (1 ≤ i ≤ N), the one with the largest region count Ki, is taken as the horizontal reference viewing angle; in the vertical direction, the j-th viewing angle (1 ≤ j ≤ M), the one with the largest region count Pj, is taken as the vertical reference viewing angle;
Step 3: calculate the standard score of each point in the candidate point set derived from the horizontal and vertical reference viewing angles;
Compute the intersections of the two reference viewing angles to obtain the reference scene, i.e. the intersection set; the intersections of the center lines serve as the intersections of the regions, these intersections form the candidate point set, and the intersections of the region boundaries determine the size of each contact. From the obtained point positions and point sizes, and the slopes of the light-path viewing angles, reason backwards to count the number of viewing angles passing through a given point; this count is the standard score of the point, and if the computed value is 0 it is forced to 1;
Step 4: calculate the true score of each point in the candidate point set derived from the horizontal and vertical reference viewing angles;
Excluding the reference viewing angles in the horizontal and vertical directions, treat the other viewing angles as auxiliary viewing angles, and compute the distance between each blocked region in the auxiliary viewing angles and each point in the candidate point set. For each blocked region, sort its distances to all candidate points from smallest to largest and take the minimum distance as the reference distance; if the reference distance d < t1, the point corresponding to d scores 1; if another distance dOther satisfies dOff < t2, where dOff = dOther - d, the point corresponding to that distance also scores 1; otherwise no score is given;
Step 5: judge whether each point is true or false.
2. The infrared multipoint positioning method according to claim 1, characterized in that: in step 5, the ratio r of the true score to the standard score of each point in the candidate point set is computed; if r > t3 the point is true, otherwise it is false, and smooth output is performed according to the true points and the historical record.
CN201410114588.3A 2014-03-25 2014-03-25 A kind of infrared multipoint positioning method Expired - Fee Related CN103838438B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410114588.3A CN103838438B (en) 2014-03-25 2014-03-25 A kind of infrared multipoint positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410114588.3A CN103838438B (en) 2014-03-25 2014-03-25 A kind of infrared multipoint positioning method

Publications (2)

Publication Number Publication Date
CN103838438A CN103838438A (en) 2014-06-04
CN103838438B true CN103838438B (en) 2016-12-07

Family

ID=50802014

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410114588.3A Expired - Fee Related CN103838438B (en) 2014-03-25 2014-03-25 A kind of infrared multipoint positioning method

Country Status (1)

Country Link
CN (1) CN103838438B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105335020B (en) * 2014-08-15 2018-09-25 青岛海信电器股份有限公司 A kind of touch point recognition methods and device
CN105373262B (en) * 2014-09-02 2018-09-25 青岛海信电器股份有限公司 A kind of method and device of the identification invalid light path of infrared touch panel
TWI529583B (en) * 2014-12-02 2016-04-11 友達光電股份有限公司 Touch system and touch detection method
CN105404433B (en) * 2015-12-04 2019-06-07 青岛海信电器股份有限公司 A kind of touch control identification method and display device based on infrared touch panel

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101387931A (en) * 2008-10-14 2009-03-18 贺伟 Infrared touch screen multi-point recognizing method
CN102270063A (en) * 2010-06-03 2011-12-07 上海优熠电子科技有限公司 Infrared true multi-point touch screen
CN102339170A (en) * 2011-05-31 2012-02-01 广州视睿电子科技有限公司 Signal scanning calculation method and system of infrared touch system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101697133B1 (en) * 2009-09-30 2017-02-01 베이징 아이어터치 시스템 코퍼레이션 리미티드 Touch screen, touch system and method for positioning a touch object in a touch system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101387931A (en) * 2008-10-14 2009-03-18 贺伟 Infrared touch screen multi-point recognizing method
CN102270063A (en) * 2010-06-03 2011-12-07 上海优熠电子科技有限公司 Infrared true multi-point touch screen
CN102339170A (en) * 2011-05-31 2012-02-01 广州视睿电子科技有限公司 Signal scanning calculation method and system of infrared touch system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"基于MCU的红外多点触摸屏设计";李均,谷灵康;《电脑知识与技术》;20120731;第8卷(第19期);第4701-4704页 *

Also Published As

Publication number Publication date
CN103838438A (en) 2014-06-04

Similar Documents

Publication Publication Date Title
CN103838438B (en) A kind of infrared multipoint positioning method
CN103699908B (en) Video multi-target tracking based on associating reasoning
Dubois et al. Human activities recognition with RGB-Depth camera using HMM
CN102510506B (en) Virtual and real occlusion handling method based on binocular image and range information
CN106780557A (en) A kind of motion target tracking method based on optical flow method and crucial point feature
CN103164711A (en) Regional people stream density estimation method based on pixels and support vector machine (SVM)
CN103489175A (en) Road surface detecting method and device
CN105005999A (en) Obstacle detection method for blind guiding instrument based on computer stereo vision
Kuk et al. Fast lane detection & tracking based on Hough transform with reduced memory requirement
CN101763636A (en) Method for tracing position and pose of 3D human face in video sequence
CN105426858A (en) Vision and vibration information fusion based ground type identification method
US20190362163A1 (en) Method for validation of obstacle candidate
CN104183142A (en) Traffic flow statistics method based on image visual processing technology
CN104392239A (en) License plate identification method and system
Li et al. SVM-based information fusion for weld deviation extraction and weld groove state identification in rotating arc narrow gap MAG welding
CN104537342B (en) A kind of express lane line detecting method of combination ridge border detection and Hough transformation
CN103135136A (en) Automatic fault interpretation device for three-dimensional seismic data body
CN105844328A (en) Method applied to automatic commissioning personnel counting system and automatic commissioning personnel counting system
CN104574441B (en) A kind of tumble real-time detection method based on GMM and temporal model
CN112085675B (en) Depth image denoising method, foreground segmentation method and human motion monitoring method
CN109584265A (en) A kind of method for tracking target and device
CN104751146A (en) Indoor human body detection method based on 3D (three-dimensional) point cloud image
CN102750522B (en) A kind of method of target following
CN105096292A (en) Object quantity estimation method and device
CN105741326B (en) A kind of method for tracking target of the video sequence based on Cluster-Fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20171213

Address after: 100085 Beijing city Haidian District on the 28 Street Hospital No. 2 Building 2 layer 205

Patentee after: BEIJING HUAYUAN SHENGYA SCIENCE AND TECHNOLOGY CO.,LTD.

Address before: 100176 days in Beijing City, park of Daxing District economic and Technological Development Zone in the two district two building 19 layer F1B9

Patentee before: QLTOUCH TECH Co.,Ltd.

TR01 Transfer of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20161207

CF01 Termination of patent right due to non-payment of annual fee