CN112306305A - Three-dimensional touch device - Google Patents

Three-dimensional touch device

Info

Publication number
CN112306305A
CN112306305A (application CN202011168533.2A)
Authority
CN
China
Prior art keywords
camera
dimensional touch
cube
touch area
cameras
Prior art date
Legal status
Granted
Application number
CN202011168533.2A
Other languages
Chinese (zh)
Other versions
CN112306305B
Inventor
黄奎云 (Huang Kuiyun)
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN202011168533.2A
Publication of CN112306305A
Application granted
Publication of CN112306305B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers characterised by opto-electronic transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 Control or interface arrangements for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F 3/04186 Touch location disambiguation
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/56 Cameras or camera modules provided with illuminating means
    • H04N 5/00 Details of television systems
    • H04N 5/30 Transforming light or analogous information into electric information
    • H04N 5/33 Transforming infrared radiation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The three-dimensional touch device has at least one camera set arranged outside at least one of the top, left, and right faces of a three-dimensional touch-area cube. Each camera set comprises two cameras mounted symmetrically on a plane parallel to the front face of the cube, positioned on the 45-degree bisectors of two adjacent corners of the cube; the axis of each camera coincides with its corresponding bisector, and the acquisition direction of both cameras faces the cube. The planes of the camera sets are parallel and equally spaced; the distance from a camera set near an edge of the cube to that edge is at most half the spacing between adjacent camera sets, and together the camera sets on any one face of the cube cover the entire area of that face. The device offers a small blind area and compact volume, more accurate touch detection, ease of implementation and standardization, and adaptability to the Z-axis position and size of different 3D applications, improving the universality of three-dimensional touch.

Description

Three-dimensional touch device
Technical Field
The invention relates to a touch screen, in particular to a three-dimensional touch screen, and belongs to the technical field of information.
Background
With the development of touch technology, touch screens are increasingly widely used, but they are mainly two-dimensional planar or curved touch screens. Current touch screens are chiefly of the surface-acoustic-wave, infrared, resistive, and capacitive types, and all share the characteristic of being two-dimensional planar or curved screens. Some designs (e.g. electromagnetic, capacitive, and infrared touch screens) add a pressure sensor to the stylus where one is required, or place a pressure sensor on the screen body, to obtain a pressure-sensitive touch screen; but none of these are three-dimensional touch screens in the spatial sense.
Display technologies naturally associated with touch applications are advancing rapidly: 3D displays are maturing, and applications such as games, education, and training increasingly demand three-dimensional touch with true spatial meaning.
Existing three-dimensional touch schemes based on ultrasound (e.g. patent publication CN106843502A) struggle to eliminate complex interfering media, such as the human body, inside the touch area and are therefore difficult to position accurately.
Existing camera-based three-dimensional touch schemes (e.g. patent publications CN103914152A, CN104199550A, and CN103488356A) suffer from dead zones or excessive device volume, low touch precision, poor realizability, inconvenient standardization, or inaccurate touch.
Disclosure of Invention
The invention aims to overcome the problems in the prior art by providing a three-dimensional touch device that realizes three-dimensional touch for different 3D display applications. It offers a small blind area and compact volume, ease of implementation, high touch accuracy, ease of standardization, adaptability to the Z-axis position and size of different 3D applications, more accurate touch, and size identification of the touch body, improving the universality of three-dimensional touch.
To achieve this purpose, the invention adopts the following technical scheme:
The three-dimensional touch device has at least one camera set arranged outside at least one of the top, left, and right faces of the three-dimensional touch-area cube. Each camera set comprises two cameras mounted symmetrically on a plane parallel to the cube's front face, on the 45-degree bisectors of two adjacent corners of the cube; each camera's axis coincides with its corresponding bisector, the acquisition direction of both cameras faces the cube, and the row or column scan lines of the two cameras are parallel to the cube's front face. The planes of adjacent camera sets along the cube's width direction are parallel and equally spaced; the distance from a camera set nearest the front or back face to that face is at most half the spacing between adjacent camera sets; together, the camera sets on any one of the top, left, and right faces cover the entire area of that face. All cameras share the same specification, with an acquisition angle of at least 90 degrees. The cameras connect to the image processing module, which in turn connects to the computer host.
The working method is as follows: first compute the three-dimensional touch X and Y coordinates from the parameters of the camera set's two cameras, the distance along each camera's axis to the nearest corner vertex of the touch-area cube, the cube's length and height, and the cameras' row and column change data; then compute the Z coordinate from the same camera and distance parameters, the cube's length, height, and width, the already-determined X and Y coordinates, and the cameras' row or column change data. When several camera sets are arranged outside one face of the cube, each set determines touch coordinates only within half the spacing between adjacent parallel sets along the cube's width direction.
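The geometry above can be made concrete with a short sketch. Nothing below comes from the patent itself: the pinhole-camera model, the function names, and the convention that the camera pair sits at the two top corners of an L x H face are all illustrative assumptions.

```python
import math

def pixel_to_angle(col, n_cols, fov_deg):
    """Assumed pinhole model: map a column index to a ray angle,
    measured from the cube edge, for a camera whose axis lies on the
    45-degree corner bisector."""
    f = (n_cols / 2) / math.tan(math.radians(fov_deg) / 2)  # focal length in pixels
    return math.radians(45) + math.atan((col - n_cols / 2) / f)

def touch_xy(theta1, theta2, L, H):
    """Intersect the two rays.  Camera 1 is assumed at the top-left
    corner and camera 2 at the top-right corner of an L x H face;
    theta1/theta2 are ray angles measured down from the top edge.
    The origin is the bottom-left corner of the face."""
    t1, t2 = math.tan(theta1), math.tan(theta2)
    x = L * t2 / (t1 + t2)   # rays meet where x*t1 == (L - x)*t2
    y = H - x * t1
    return x, y

def touch_z(x, y, cam_xy, row_angle, z_plane):
    """Once X and Y are known, the row deviation of the touch blob from
    the camera's scan-line center gives the depth offset from the
    camera set's plane at z_plane."""
    d = math.hypot(x - cam_xy[0], y - cam_xy[1])  # in-plane distance to touch
    return z_plane + d * math.tan(row_angle)
```

For a touch at the center of a 1 x 1 face, both cameras see the point on their own axes (45 degrees), and the triangulation returns (0.5, 0.5).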
In the three-dimensional touch device, camera sets may be arranged symmetrically on the left and right faces of the touch-area cube: the left-side sets compute touch coordinates in the right half of the cube, and the right-side sets compute those in the left half.
The distance between planes of adjacent camera sets in the width direction of the three-dimensional touch area cube is in direct proportion to the distance between the camera on the plane and the vertex of the nearest corner of the corresponding three-dimensional touch area cube.
A threshold is applied to the cameras' row and column change data to filter out large-object touches, and the change position closest to the back face of the touch-area cube is taken as the effective touch position.
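One way to realize this filtering is a per-column change test against a background frame, rejecting changed runs wider than a fingertip. This is a sketch only: the array layout, thresholds, and run-width heuristic are illustrative assumptions, not values from the patent.

```python
def detect_touch_columns(frame, background, diff_thresh=30, max_width=40):
    """Find candidate touch columns in one camera image (rows x cols,
    plain nested lists of pixel values).  A run of changed columns
    wider than max_width is rejected as a large object (arm, fist)
    rather than a fingertip."""
    n_cols = len(frame[0])
    # strongest per-column change against the background frame
    col_energy = [max(abs(frame[r][c] - background[r][c])
                      for r in range(len(frame)))
                  for c in range(n_cols)]
    active = [e > diff_thresh for e in col_energy]
    # group consecutive active columns into runs
    runs, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, n_cols - 1))
    # keep only finger-sized runs
    return [(s, e) for s, e in runs if e - s + 1 <= max_width]
```

Among the surviving runs, the text takes the one whose position maps closest to the cube's back face as the effective touch.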
The camera comprises an image sensor, a lens, a processor and a communication module.
The camera is an infrared camera; the camera set comprises an infrared light source covering a three-dimensional touch area.
The camera is connected to the image processing module through an Ethernet, USB, I2C, or SPI interface.
The camera or the infrared light source is arranged on the bracket.
The image processing module is a computer host.
The invention has the following advantages:
1. At least one camera set is arranged outside at least one of the top, left, and right faces of the three-dimensional touch-area cube. Each set comprises two cameras mounted symmetrically on a plane parallel to the cube's front face, on the 45-degree bisectors of two adjacent corners; each camera's axis coincides with its bisector, both cameras face the cube, and their row or column scan lines are parallel to the cube's front face. The planes of adjacent sets along the width direction are parallel and equally spaced; a set nearest the front or back face lies within half the set spacing of that face; the sets on any one face together cover that entire face. The cameras share one specification with an acquisition angle of at least 90 degrees, and connect through the image processing module to the computer host. X and Y are computed from the camera parameters, the axis distance to the nearest corner vertex, the cube's length and height, and the row and column change data; Z is then computed from those same parameters plus the cube's width and the determined X and Y. With this device and method, the lenses need no ultra-wide angle, so imaging distorts less, touch is more accurate, and lenses are easy to source. Because the cameras sit on the 45-degree corner bisectors by rule, the touch algorithm needs no extensive correction as the touch-space size changes, which aids standardization. Multiple parallel camera sets readily accommodate the different Z-axis positions and extents of different 3D display applications. The touch algorithm is simple and light on processor resources, helping control cost. And restricting each set to half the spacing between adjacent sets simplifies both computation and standardization.
2. Camera sets are arranged symmetrically on the left and right faces of the touch-area cube: the left-side sets compute the right-half touch coordinates and the right-side sets the left-half, which overcomes inaccurate positioning in a camera's near field (the region of the touch point close to the camera); and because both sides detect a touch jointly, the approximate size of the touch body can be judged, distinguishing human-body interference from valid touches by a finger or other touch body.
3. The spacing between planes of adjacent camera sets along the cube's width direction is proportional to the distance from a camera on that plane to the nearest corner vertex of the cube, making it easy to balance device volume against the touch Z-axis extent, or to suit different physical installation sites.
4. Thresholding the cameras' row and column change data filters out large-object touches, overcoming interference from a human body entering the touch area, for instance masking a falling fist when finger touch is intended; taking the change position closest to the cube's back face as the effective touch position removes body interference still more reliably.
5. The camera comprises an image sensor, lens, processor, and communication module, so different cameras can be configured to meet different touch-resolution or speed requirements.
6. The cameras are infrared cameras and each camera set includes an infrared light source covering the three-dimensional touch area, reducing the influence of ambient light on touch reliability.
7. The camera connects to the image processing module through an Ethernet, USB, I2C, or SPI interface, increasing flexibility in camera selection.
8. The camera or infrared light source is mounted on a bracket, which eases installation.
9. The image processing module is the computer host, which helps reduce cost.
Drawings
FIG. 1 is a first schematic view of a typical spatial structure of the device;
FIG. 2 is a second schematic view of a typical spatial structure of the device;
FIG. 3 is a third schematic view of a typical spatial structure of the device;
FIG. 4 is a block diagram of the present apparatus;
labeled as: 1. image processing module,
2. computer host,
101. 1st camera of the 1st camera group,
102. 2nd camera of the 1st camera group,
103. axis of camera 101, i.e. the 45-degree bisector of the corresponding corner of the touch-area cube in the plane of the 1st camera group,
104. axis of camera 102, i.e. the 45-degree bisector of its corresponding corner,
105. parallel plane in which the 1st camera group lies,
111. 1st camera of the 2nd camera group,
112. 2nd camera of the 2nd camera group,
113. axis of camera 111, i.e. the 45-degree bisector of its corresponding corner,
114. axis of camera 112, i.e. the 45-degree bisector of its corresponding corner,
115. parallel plane in which the 2nd camera group lies,
121. 1st camera of the 1st camera group on the left side,
122. 2nd camera of the 1st camera group on the left side,
123. axis of camera 121, i.e. the 45-degree bisector of its corresponding corner,
124. axis of camera 122, i.e. the 45-degree bisector of its corresponding corner,
125. parallel plane shared by cameras 121/122 and 131/132,
131. 1st camera of the 1st camera group on the right side,
132. 2nd camera of the 1st camera group on the right side,
133. axis of camera 131, i.e. the 45-degree bisector of its corresponding corner,
134. axis of camera 132, i.e. the 45-degree bisector of its corresponding corner,
141. 1st camera of the 2nd camera group on the left side,
142. 2nd camera of the 2nd camera group on the left side,
143. axis of camera 141, i.e. the 45-degree bisector of its corresponding corner,
144. axis of camera 142, i.e. the 45-degree bisector of its corresponding corner,
145. parallel plane shared by cameras 141/142 and 151/152,
151. 1st camera of the 2nd camera group on the right side,
152. 2nd camera of the 2nd camera group on the right side,
153. axis of camera 151, i.e. the 45-degree bisector of its corresponding corner,
154. axis of camera 152, i.e. the 45-degree bisector of its corresponding corner.
Detailed Description
The three-dimensional touch device has at least one camera set arranged outside at least one of the top, left, and right faces of the three-dimensional touch-area cube. Each camera set comprises two cameras mounted symmetrically on a plane parallel to the cube's front face, on the 45-degree bisectors of two adjacent corners of the cube; each camera's axis coincides with its corresponding bisector, the acquisition direction of both cameras faces the cube, and the row or column scan lines of the two cameras are parallel to the cube's front face. The planes of adjacent camera sets along the cube's width direction are parallel and equally spaced; the distance from a camera set nearest the front or back face to that face is at most half the spacing between adjacent camera sets; together, the camera sets on any one of the top, left, and right faces cover the entire area of that face. All cameras share the same specification, with an acquisition angle of at least 90 degrees. The cameras connect to the image processing module, which in turn connects to the computer host.
The working method is as follows: first compute the three-dimensional touch X and Y coordinates from the parameters of the camera set's two cameras, the distance along each camera's axis to the nearest corner vertex of the touch-area cube, the cube's length and height, and the cameras' row and column change data; then compute the Z coordinate from the same camera and distance parameters, the cube's length, height, and width, the already-determined X and Y coordinates, and the cameras' row or column change data. When several camera sets are arranged outside one face of the cube, each set determines touch coordinates only within half the spacing between adjacent parallel sets along the cube's width direction.
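The half-spacing rule above is equivalent to assigning each touch to the nearest camera-set plane. A minimal sketch; the function name and the use of explicit plane positions are assumptions, not from the patent:

```python
def responsible_group(z, plane_zs):
    """Pick the camera set whose parallel plane is nearest to the touch
    depth z.  Each set then resolves touches only within half the
    spacing between adjacent planes, as the text requires."""
    return min(range(len(plane_zs)), key=lambda i: abs(z - plane_zs[i]))
```

For two evenly spaced planes at Z = 0.25 and 0.75 (cube width 1), a touch at Z = 0.1 belongs to the first set and a touch at Z = 0.6 to the second.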
Still preferably, the camera groups are symmetrically arranged on the left side and the right side of the three-dimensional touch area cube, the left side camera group calculates the touch coordinates of the right half of the three-dimensional touch area cube, and the right side camera group calculates the touch coordinates of the left half of the three-dimensional touch area cube.
Still preferably, the distance between the planes of adjacent camera groups in the width direction of the three-dimensional touch area cube is proportional to the distance between the camera on the plane and the vertex of the nearest corner of the corresponding three-dimensional touch area cube.
Still further preferably, the camera row and column variation data is thresholded to filter out large object touches; and taking the position of the camera row and column change data closest to the back of the cube of the three-dimensional touch area as the effective touch position.
Still further preferably, the camera includes an image sensor, a lens, a processor, and a communication module.
Still further preferably, the cameras are infrared cameras, and the camera set includes an infrared light source covering the three-dimensional touch area.
Still further preferably, the camera is connected to the image processing module via an Ethernet, USB, I2C, or SPI interface.
Still further preferably, the camera or the infrared light source is mounted on a bracket.
Still further preferably, the image processing module is the computer host.
The camera axis refers to the line connecting the center of the camera's sensor target surface with the center point of the imaged object plane acquired by the camera.
The three-dimensional touch-area cube is a cuboid placed directly in front of a display panel (LCD, LED, etc.) or another display interface such as a projection; its back face is parallel to the display plane, and the touch area corresponds to the three-dimensional display area. The height of the cube's front face is the cube height H, and its width is the cube length L; the length (L) direction is the X axis, the width (W) direction is the Z axis, and the height (H) direction is the Y axis. The distance between the cube's back face and the display plane is determined by the display panel's spatial three-dimensional display area.
The camera is a CCD camera or a CMOS camera, and the image sensor is a CCD or CMOS sensor.
The image processing module comprises a processor such as an STM32F427 and may also include a HUB chip such as a G850G; alternatively, it can be implemented directly inside the computer host.
The camera sets lie in their parallel planes within the region between the front and back faces of the touch-area cube. If there is an angular error between a camera's axis and its corresponding corner bisector, a parallelism error between the camera's row or column scan lines and the cube's front face, or an error in the spacing between adjacent camera sets, each can be corrected by calibration.
Example 1
As shown in fig. 2
The three-dimensional touch device is characterized in that 1 camera group is arranged outside the top surface of a three-dimensional touch area cube, each camera group comprises cameras 101 and 102, the cameras 101 and 102 are symmetrically arranged on a parallel surface 105 of the front surface of the three-dimensional touch area cube and are positioned on 45-degree angle distribution lines 103 and 104 of two adjacent angles of the three-dimensional touch area cube, 2 camera axes 103 and 104 are respectively identical to the corresponding 45-degree angle distribution lines 103 and 104 and face the three-dimensional touch area cube, and row scanning lines of the 2 cameras 103 and 104 are parallel to the front surface of the three-dimensional touch area cube; the 2 cameras 101 and 102 collect and cover the whole area of the top surface of the three-dimensional touch area cube; the specifications of the 2 cameras 101 and 102 are the same, and the acquisition angle is 100 degrees; the 2 cameras 101 and 102 are respectively connected to the image processing module 1 and the computer main unit 2 by USB.
The working method comprises the following steps: first calculate the three-dimensional touch X and Y coordinates from the parameters of the 2 cameras 101 and 102 of the camera group, the distance from each camera's axis to the nearest corner vertex of the three-dimensional touch area cube, the length L and height H of the cube, and the row and column change data of cameras 101 and 102; then calculate the three-dimensional touch Z coordinate from the same camera parameters and axis-to-vertex distances, the length L, height H, and width W of the cube, the determined X and Y coordinates, and the row or column change data of cameras 101 and 102.
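The XY-then-Z computation above amounts to classical two-camera optical-touch triangulation. The sketch below is an idealized illustration, not the patent's actual algorithm: it assumes pinhole cameras placed exactly at the two corner vertices (i.e. the patent's axis-to-vertex distance parameters are taken as zero) and expresses each camera's row offset as an elevation angle:

```python
import math

def triangulate_xy(alpha, beta, edge):
    """Intersect the rays from two cameras at adjacent corners of an
    edge of length `edge`; alpha and beta are each ray's angle from
    that edge (radians)."""
    ta, tb = math.tan(alpha), math.tan(beta)
    x = edge * tb / (ta + tb)   # ray intersection along the edge
    y = x * ta                  # distance from the edge
    return x, y

def touch_z(x, y, cam_xy, elevation):
    """Once XY is known, the Z coordinate follows from one camera's
    vertical (row) offset, expressed here as an elevation angle."""
    horizontal = math.hypot(x - cam_xy[0], y - cam_xy[1])
    return horizontal * math.tan(elevation)
```

A touch seen at 45 degrees by both cameras of a group lies at the center of the face, (edge/2, edge/2); its Z follows from either camera's elevation angle and the now-known horizontal range.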
Example 2
As shown in figure 1
In this three-dimensional touch device, 2 camera groups are arranged outside the top face of the three-dimensional touch area cube. The 1st camera group comprises cameras 101 and 102, mounted symmetrically in a plane 105 parallel to the front face of the cube and located on the 45-degree bisector lines 103 and 104 of two adjacent corners of the cube; the two camera axes coincide with the corresponding bisector lines 103 and 104 and point toward the cube, and the row scan lines of cameras 101 and 102 are parallel to the front face of the cube. The 2nd camera group comprises 2 cameras 111 and 112, mounted symmetrically in a plane 115 parallel to the front face of the cube and located on the 45-degree bisector lines 113 and 114 of two adjacent corners of the cube; the two camera axes coincide with the corresponding bisector lines 113 and 114 and point toward the cube, and the row scan lines of cameras 111 and 112 are parallel to the front face of the cube. Camera 101 is at a distance of L/2 from camera 111 and at a distance of L/4 from the back face of the cube.
The 4 cameras 101, 102, 111 and 112 together collect and cover the whole area of the top face of the three-dimensional touch area cube; they have identical specifications with an acquisition angle of 100 degrees. The 4 cameras are each connected by USB to the image processing module 1, and the image processing module 1 is connected by USB to the computer host 2.
Each of the 2 camera groups determines touch coordinates only within half the spacing (l/2) between adjacent parallel camera groups.
The working method comprises the following steps:
First detect the row and column data of cameras 101 and 102 of the 1st camera group and judge whether they have changed. If they have not changed, detect the row and column data of cameras 111 and 112 of the 2nd camera group. If they have changed, judge whether the change region lies within the distance range l/2: if so, calculate the touch coordinates with the 1st camera group and finish; otherwise, detect the row and column data of cameras 111 and 112 of the 2nd camera group.
Then detect the row and column data of cameras 111 and 112 of the 2nd camera group and judge whether they have changed. If they have not changed, finish. If they have changed, judge whether the change region lies within the distance range l/2: if so, calculate the touch coordinates with the 2nd camera group and finish; otherwise, finish directly.
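Read as a dispatch loop, the two detection steps above might be sketched as follows. The CameraGroup stub, its method names, and the reduction of a "change region" to a single position along the cube width are all invented for illustration:

```python
class CameraGroup:
    """Illustrative stub: one pair of cameras sharing a plane.

    plane: position of the group's plane along the cube width;
    change: position of a detected row/column change, or None;
    coords: the (x, y, z) this group would compute for that change.
    """
    def __init__(self, plane, change=None, coords=None):
        self.plane = plane
        self._change = change
        self._coords = coords

    def detect_change(self):
        return self._change

    def distance_to_plane(self, change):
        return abs(change - self.plane)

    def compute_coordinates(self, change):
        return self._coords

def locate_touch(groups, half_spacing):
    """Scan the groups in order; each group owns only touches within
    half the inter-group spacing (l/2) of its own plane."""
    for group in groups:
        change = group.detect_change()
        if change is None:
            continue  # no row/column change seen by this group
        if group.distance_to_plane(change) <= half_spacing:
            return group.compute_coordinates(change)
    return None  # no touch, or touch outside every group's range
```

The loop preserves the patent's ordering: a change seen by group 1 but outside its l/2 range falls through to group 2 rather than ending the scan.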
Calculating the touch coordinates of the 1st camera group: first calculate the three-dimensional touch X and Y coordinates from the parameters of the 2 cameras 101 and 102 of the camera group, the distance from each camera's axis to the nearest corner vertex of the three-dimensional touch area cube, the length L and height H of the cube, and the row or column change data of cameras 101 and 102; then calculate the three-dimensional touch Z coordinate from the same camera parameters and axis-to-vertex distances, the length L, height H, and width W of the cube, the determined X and Y coordinates, and the column or row change data of cameras 101 and 102.
Calculating the touch coordinates of the 2nd camera group: first calculate the three-dimensional touch X and Y coordinates from the parameters of the 2 cameras 111 and 112 of the camera group, the distance from each camera's axis to the nearest corner vertex of the three-dimensional touch area cube, the length L and height H of the cube, and the row or column change data of cameras 111 and 112; then calculate the three-dimensional touch Z coordinate from the same camera parameters and axis-to-vertex distances, the length L, height H, and width W of the cube, the determined X and Y coordinates, and the column or row change data of cameras 111 and 112.
Example 3
As shown in fig. 3
In this three-dimensional touch device, 4 camera groups are arranged symmetrically outside the left and right side faces of the three-dimensional touch area cube:
The 1st camera group on the left side comprises cameras 121 and 122, mounted symmetrically in a plane 125 parallel to the front face of the three-dimensional touch area cube and located on the 45-degree bisector lines 123 and 124 of two adjacent corners of the cube; the two camera axes coincide with the corresponding bisector lines 123 and 124 and point toward the cube, and the row scan lines of cameras 121 and 122 are parallel to the front face of the cube. The 1st camera group on the right side comprises 2 cameras 131 and 132, mounted symmetrically in the parallel plane 125 and located on the 45-degree bisector lines 133 and 134 of two adjacent corners of the cube; the two camera axes coincide with the corresponding bisector lines 133 and 134 and point toward the cube, and the row scan lines of cameras 131 and 132 are parallel to the front face of the cube.
The 2nd camera group on the left side comprises cameras 141 and 142, mounted symmetrically in a plane 145 parallel to the front face of the three-dimensional touch area cube and located on the 45-degree bisector lines 143 and 144 of two adjacent corners of the cube; the two camera axes coincide with the corresponding bisector lines 143 and 144 and point toward the cube, and the row scan lines of cameras 141 and 142 are parallel to the front face of the cube. The 2nd camera group on the right side comprises 2 cameras 151 and 152, mounted symmetrically in the parallel plane 145 and located on the 45-degree bisector lines 153 and 154 of two adjacent corners of the cube; the two camera axes coincide with the corresponding bisector lines 153 and 154 and point toward the cube, and the row scan lines of cameras 151 and 152 are parallel to the front face of the cube.
The 4 cameras 121, 122, 141 and 142 together collect and cover the whole area of the left side face of the three-dimensional touch area cube, and the 4 cameras 131, 132, 151 and 152 together collect and cover the whole area of the right side face. The 8 cameras 121, 122, 131, 132, 141, 142, 151 and 152 have identical specifications with an acquisition angle of 100 degrees; each is connected by USB to the image processing module 1, and the image processing module 1 is connected by USB to the computer host 2.
Cameras 121 and 122 of the 1st camera group on the left side determine touch coordinates only for the right half of the three-dimensional touch area cube, within the distance range l/2 of the adjacent parallel 2nd camera group; a touch judged to be in the left half is ignored. Cameras 131 and 132 of the 1st camera group on the right side determine touch coordinates only for the left half of the cube, within the distance range l/2 of the adjacent parallel 2nd camera group; a touch judged to be in the right half is ignored. Cameras 141 and 142 of the 2nd camera group on the left side determine touch coordinates only for the right half of the cube, within the distance range l/2 of the adjacent parallel 1st camera group; a touch judged to be in the left half is ignored. Cameras 151 and 152 of the 2nd camera group on the right side determine touch coordinates only for the left half of the cube, within the distance range l/2 of the adjacent parallel 1st camera group; a touch judged to be in the right half is ignored. In this way every touch coordinate in the cube is computed from far-field image data of the cameras, which keeps the result accurate at all positions; a further advantage is that the Z-axis dimension can be adapted to different 3D applications.
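The left/right division of responsibility described above can be expressed as a simple filter. The sketch below is illustrative only; the side labels and the use of the touch X coordinate against the cube length are assumed conventions:

```python
def accept_touch(side, x, length):
    """A left-side camera group reports only touches in the right half
    of the cube (its far field), and a right-side group only the left
    half; near-field touches are left to the opposite side's groups."""
    if side == "left":
        return x >= length / 2
    return x < length / 2
```

Applying this filter on both sides means every reported coordinate was measured from the far field, where the cameras' angular resolution covers the touch region best.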
The embodiments of the invention include, but are not limited to, those described above; many further embodiments can readily be derived from the technical scheme of the invention.

Claims (9)

1. A three-dimensional touch device, characterized in that: at least 1 camera group is arranged outside at least one of the top face, the left side face and the right side face of a three-dimensional touch area cube; each camera group comprises 2 cameras mounted symmetrically in a plane parallel to the front face of the three-dimensional touch area cube and located on the 45-degree bisector lines of two adjacent corners of the cube; the axes of the 2 cameras coincide with the corresponding 45-degree bisector lines, the acquisition directions of the 2 cameras face the cube, and the row or column scan lines of the 2 cameras are parallel to the front face of the cube; the planes of adjacent camera groups along the width direction of the cube are parallel and equally spaced, and the distance from a camera group nearest the front or back face of the cube to that face is less than or equal to half the spacing between camera groups adjacent along the width direction; the cameras of all the camera groups on any one of the top, left, or right faces together collect and cover the whole area of that face; the cameras have identical specifications with an acquisition angle greater than or equal to 90 degrees; and the cameras are respectively connected to the image processing module (1).
The working method comprises the following steps: first calculate the three-dimensional touch X and Y coordinates from the 2 camera parameters of the camera group, the distance from each camera's axis to the nearest corner vertex of the three-dimensional touch area cube, the length and height of the cube, and the row and column change data of the cameras; then calculate the three-dimensional touch Z coordinate from the same camera parameters and axis-to-vertex distances, the length, height, and width of the cube, the determined X and Y coordinates, and the column or row change data of the cameras. When a plurality of camera groups are arranged outside a given face of the three-dimensional touch area cube, each camera group determines touch coordinates only within half the spacing between adjacent parallel camera groups along the width direction of the cube.
2. The three-dimensional touch apparatus according to claim 1, wherein: the left side face and the right side face of the three-dimensional touch area cube are symmetrically provided with camera sets, the left side camera set calculates the right half touch coordinate of the three-dimensional touch area cube, and the right side camera set calculates the left half touch coordinate of the three-dimensional touch area cube.
3. The three-dimensional touch apparatus according to claim 1 or 2, wherein: the distance between planes of adjacent camera sets in the width direction of the three-dimensional touch area cube is in direct proportion to the distance between the camera on the plane and the vertex of the nearest corner of the corresponding three-dimensional touch area cube.
4. The three-dimensional touch apparatus according to claim 1 or 2, wherein: a threshold is set on the camera row and column change data to filter out large-object touches.
5. The three-dimensional touch apparatus according to claim 1 or 2, wherein: the camera comprises an image sensor, a lens, a processor and a communication module.
6. The three-dimensional touch apparatus according to claim 1 or 2, wherein: the camera is an infrared camera; the camera set comprises an infrared light source covering a three-dimensional touch area.
7. The three-dimensional touch apparatus according to claim 1 or 2, wherein: the cameras are connected to the image processing module (1) through an Ethernet, USB, I2C, or SPI interface.
8. The three-dimensional touch apparatus according to claim 1 or 2, wherein: the camera or the infrared light source is arranged on the bracket.
9. The three-dimensional touch apparatus according to claim 1, wherein: the image processing module (1) is a computer host (2).
CN202011168533.2A 2020-10-28 2020-10-28 Three-dimensional touch device Active CN112306305B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011168533.2A CN112306305B (en) 2020-10-28 2020-10-28 Three-dimensional touch device

Publications (2)

Publication Number Publication Date
CN112306305A true CN112306305A (en) 2021-02-02
CN112306305B CN112306305B (en) 2021-08-31

Family

ID=74331219

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011168533.2A Active CN112306305B (en) 2020-10-28 2020-10-28 Three-dimensional touch device

Country Status (1)

Country Link
CN (1) CN112306305B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113873156A (en) * 2021-09-27 2021-12-31 北京有竹居网络技术有限公司 Image processing method and device and electronic equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020118286A1 (en) * 2001-02-12 2002-08-29 Takeo Kanade System and method for servoing on a moving fixation point within a dynamic scene
US20130127705A1 (en) * 2011-11-18 2013-05-23 Korea Electronics Technology Institute Apparatus for touching projection of 3d images on infrared screen using single-infrared camera
CN103905808A (en) * 2012-12-27 2014-07-02 北京三星通信技术研究有限公司 Device and method used for three-dimension display and interaction.
CN104199547A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN106919294A (en) * 2017-03-10 2017-07-04 京东方科技集团股份有限公司 A kind of 3D touch-controls interactive device, its touch-control exchange method and display device
US9734419B1 (en) * 2008-12-30 2017-08-15 Cognex Corporation System and method for validating camera calibration in a vision system
CN108052205A (en) * 2017-12-28 2018-05-18 郑州巍瀚信息科技有限公司 Virtual reality system based on 3D interactions
US20190004609A1 (en) * 2017-06-30 2019-01-03 Intel Corporation Adaptive cursor technology
CN110865704A (en) * 2019-10-21 2020-03-06 浙江大学 Gesture interaction device and method for 360-degree suspended light field three-dimensional display system


Also Published As

Publication number Publication date
CN112306305B (en) 2021-08-31

Similar Documents

Publication Publication Date Title
CA2521418C (en) Auto-aligning touch system and method
CA2670357C (en) Interactive input system and method
CN201378310Y (en) Touch-sensitive display screen frame and system based on infrared videography
CN103797446A (en) Method for detecting motion of input body and input device using same
TWI461975B (en) Electronic device and method for correcting touch position
US20110261016A1 (en) Optical touch screen system and method for recognizing a relative distance of objects
CN105373266A (en) Novel binocular vision based interaction method and electronic whiteboard system
US20150009119A1 (en) Built-in design of camera system for imaging and gesture processing applications
WO2014003803A1 (en) Touch orientation calculation
CN102163108A (en) Method and device for identifying multiple touch points
CN112306305B (en) Three-dimensional touch device
CN102841711B (en) Multi-dimensional image detection device
CN102622140B (en) Image pick-up multi-point touch system
CN1674042A (en) Contact screen information inputting positioning method based on image recognition
CN101149653B (en) Device for distinguishing image position
CN216927587U (en) Three-dimensional touch device
CN102798382B (en) Embedded vision positioning system
CN202443449U (en) Photographic multi-point touch system
CN112631463A (en) Three-dimensional touch device
CN201084128Y (en) Image positioning touching device
CN103543884B (en) Optical touch system and touch object distinguishing method thereof
CN108984039B (en) Electronic whiteboard device and display method thereof
CN102364418B (en) Optical touch-control positioning system and method
TWI543047B (en) Optical touch display
CN116974400B (en) Screen touch recognition method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant