CN212781988U - High-precision touch interaction system for multi-surface splicing projection - Google Patents

High-precision touch interaction system for multi-surface splicing projection

Info

Publication number
CN212781988U
CN212781988U (application CN202021956000.6U)
Authority
CN
China
Prior art keywords
infrared
infrared camera
screen
projection
laser device
Prior art date
Legal status
Active
Application number
CN202021956000.6U
Other languages
Chinese (zh)
Inventor
李辉熠
贺耿
Current Assignee
Hunan Guanchao Intelligent Technology Co ltd
Original Assignee
Hunan Guanchao Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hunan Guanchao Intelligent Technology Co ltd filed Critical Hunan Guanchao Intelligent Technology Co ltd
Priority to CN202021956000.6U priority Critical patent/CN212781988U/en
Application granted granted Critical
Publication of CN212781988U publication Critical patent/CN212781988U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Projection Apparatus (AREA)
  • Position Input By Displaying (AREA)

Abstract

A high-precision touch interaction system for multi-surface splicing projection comprises at least a first screen and a second screen, which are touch-enabled and arranged at an included angle. The first screen is provided with a first infrared laser device and a first infrared camera device; the second screen is provided with a second infrared laser device and a second infrared camera device; and the first and second infrared laser devices have different wavelengths. The utility model realizes a low-latency, high-precision system, free of touch blind areas, for controlling a computer on the projection screens of a multi-surface splicing projection.

Description

High-precision touch interaction system for multi-surface splicing projection
Technical Field
The utility model relates to multi-surface projection systems, and in particular to a high-precision touch interaction system for multi-surface splicing projection.
Background
In a multi-surface projection system, a plurality of projection surfaces are arranged on two or more non-parallel planes, and the images on these surfaces are stitched together to create an immersive display effect. Multi-surface projection systems are disclosed in, for example, patents CN103995429A, 201510209626.8, and CN202837801U. These schemes achieve a good immersive display effect, but they cannot support interaction.
Traditional interactive projection technology captures images of targets (such as participants) with a sensor; an image analysis system then analyzes these images to derive the motions of the captured objects, and the motion data is combined with a real-time image interaction system to produce an interactive effect that tightly couples participants and screen. This sensor-based capture approach is widely used, but its precision is very low and its latency severe, so a good interactive effect is difficult to achieve.
A better approach places an infrared line-laser emitter in front of the display device, preferably a projection screen; the emitter projects a laser curtain parallel and close to the screen. Because the infrared camera is fitted with an optical filter matched to the laser emitter, it sees only the light reflected when the laser strikes an object. When a finger or other object touches a point in front of the projection screen, it intersects the laser curtain and produces a reflection. Since the infrared camera sees only this reflected light and no other spectrum, the touched position can be computed accurately. The technical solutions of application numbers 200810024947.0 and 200910058018.6 disclose this method. The principle is as follows: an infrared laser emitter mounted at the edge of the projection screen forms a laser curtain close and parallel to the screen; when a finger or object touches the screen, the infrared laser is reflected at the contact point; the infrared camera efficiently and accurately captures that position; and an interaction host computes the corresponding position in the computer coordinate system, so that the computer can be precisely controlled and interacted with on the projection screen.
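The detection step described above can be sketched in a few lines: with the matched narrow-band filter in place, a touch appears as a bright blob on an otherwise dark frame, and the blob's centroid gives the touch position. The following is a minimal illustrative sketch, not the patent's implementation; the function name, the frame representation as a list of pixel rows, and the brightness threshold are assumptions.

```python
def find_touch_points(frame, threshold=200):
    """Return centroids (x, y) of bright blobs in a grayscale frame."""
    rows, cols = len(frame), len(frame[0])
    visited = [[False] * cols for _ in range(rows)]
    points = []
    for y in range(rows):
        for x in range(cols):
            if frame[y][x] >= threshold and not visited[y][x]:
                # flood-fill the connected bright region (one laser reflection)
                stack, pixels = [(y, x)], []
                visited[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not visited[ny][nx]
                                and frame[ny][nx] >= threshold):
                            visited[ny][nx] = True
                            stack.append((ny, nx))
                # centroid of the blob = reported touch position
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                points.append((cx, cy))
    return points
```

A real system would run this per frame on the camera image after the calibration masks described later have been applied.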
High-precision interaction for a multi-surface projection system means that the projection on each surface can be controlled like a computer touch screen directly on the projection screens. At present, no system or method exists for high-precision interaction with a multi-surface splicing projection system. The utility model introduces line lasers into a multi-surface projection system and sets out how to construct such a high-precision interactive system for multi-surface splicing projection.
The utility model also considers the following: when a multi-surface splicing projection is arranged at an included angle (taking two screens as an example), the angle between the two screens is usually less than 180 degrees; in particular, the two screens are often perpendicular, placed at 90 degrees. Normally, the No. 1 laser curtain on the No. 1 projection screen is parallel to that screen, so no laser falls within the screen itself. In practice, even when the screen is not being operated, the infrared camera may still see laser light reflected where the curtain intersects other objects outside the screen; the system can handle such off-screen reflections without interference. If there were no included angle, the No. 1 laser curtain would strike neither the No. 1 nor the No. 2 projection screen, remaining parallel to the screens, and the whole screen could be operated. But when the two screens form an included angle, although the No. 1 laser curtain remains parallel to the No. 1 projection screen, the No. 2 laser can strike the No. 1 projection screen; this interferes with the high-precision interaction of the multi-surface projection, and high-precision touch interaction across the multi-surface projection screens cannot be carried out correctly.
SUMMARY OF THE UTILITY MODEL
The utility model aims to provide an efficient, easy-to-use, feasible, stable, commercially viable, low-latency, high-precision touch interaction system for multi-surface splicing projection that has no touch blind areas.
The technical scheme of the utility model is as follows: a high-precision touch interaction system for multi-surface splicing projection comprises at least a first screen and a second screen, which are touch-enabled and arranged at an included angle. The first screen is provided with a first infrared laser device and a first infrared camera device; the second screen is provided with a second infrared laser device and a second infrared camera device; and the first and second infrared laser devices have different wavelengths.
Further, the wavelength of the first infrared camera device is the same as that of the first infrared laser device, and the wavelength of the second infrared camera device is the same as that of the second infrared laser device.
Further, the wavelength of the first infrared laser device is within the wavelength range of the first infrared camera device, and the wavelength of the second infrared laser device is within the wavelength range of the second infrared camera device.
Further, the system comprises a computer connected to a fusion device; the fusion device is connected to a first projector and a second projector, whose pictures are seamlessly spliced onto the first screen and the second screen. Each infrared laser device emits a laser curtain, and the laser curtain does not intersect the corresponding screen.
Further, each infrared camera device can see the whole picture of its corresponding screen. If the portion of the first screen visible in the picture of the first infrared camera device is denoted V11, and the whole picture captured by the first infrared camera device is V1, then

V11 ⊆ V1

Similarly, if the portion of the second screen visible in the picture of the second infrared camera device is V21, and the whole picture captured by the second infrared camera device is V2, then

V21 ⊆ V2
Further, the computer comprises a processing unit, and the processing unit is used for processing and analyzing each frame of image shot by the infrared camera device so as to identify and track the motion condition and the position of each group of contact points.
Further, the first infrared laser device and the second infrared laser device each include one or more infrared lasers.
Further, the first infrared camera device and the second infrared camera device both comprise one or more infrared cameras with infrared filter switching functions.
The utility model has the following advantages: by providing infrared laser devices of different wavelengths, even when a projection screen is illuminated by the other screen's infrared laser device, touches are recognized and captured only by the infrared camera device of the matching wavelength; the other infrared camera device, whose sensing range does not cover that wavelength, is not triggered, so the other screen is not disturbed. High-precision interaction can therefore be achieved for multi-surface wall projection, forming an immersive interactive experience and greatly improving the effect of multi-surface projection.
Drawings
Fig. 1 is a general structure diagram of a high-precision touch interactive system of multi-surface splicing projection according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an interactive projection screen L1 according to an embodiment of the present invention.
Description of the labeling:
a computer for generating images for presentation by the projectors P1, P2, running the processing unit;
the processing unit processes the images of the infrared cameras C1 and C2 and controls the computer;
the fusion device is used for fusing and splicing the projectors P1 and P2 into a large non-overlapping screen;
P1, P2 - projectors for displaying the images in the computer;
L1, L2 - touchable interactive projection screens, each comprising a projection screen L11/L21 and a laser curtain L12/L22;
G1, G2 - infrared lasers for emitting the laser curtains in front of projection screens L11/L21; the infrared wavelengths of G1 and G2 are different;
C1, C2 - infrared cameras with switchable infrared narrow-band filters, used to capture images of the corresponding projection screens L11 and L21; together with the laser curtains emitted by G1 and G2, they implement the touch functions of interactive projection screens L1 and L2. The filter wavelengths of C1 and C2 differ: the filter of C1 passes the wavelength of the laser emitted by G1, and the filter of C2 passes the wavelength of the laser emitted by G2.
Detailed Description
The present invention will be described in further detail with reference to specific embodiments.
A high-precision touch interaction system for a multi-surface projection system comprises at least two projection screens, a first screen and a second screen. The first screen is provided with a first infrared laser device and a first infrared camera device; the second screen is provided with a second infrared laser device and a second infrared camera device; and the first and second infrared laser devices have different wavelengths.
In this embodiment, because infrared laser devices of different wavelengths are used, even when the laser of the first infrared laser device falls within the second projection screen, the picture captured by the second infrared camera device is not disturbed: the second infrared camera device cannot see the laser spectrum emitted by the first infrared laser device on the second screen, so no interference arises.
In this embodiment, the wavelength of the first infrared camera device matches that of the first infrared laser device, and the wavelength of the second infrared camera device matches that of the second infrared laser device; alternatively, the wavelength of the first infrared laser device lies within the sensing range of the first infrared camera device, and the wavelength of the second infrared laser device lies within the sensing range of the second infrared camera device. Thus, when the first projection screen is illuminated by the infrared laser of another screen, the first infrared camera device does not see that laser on the first screen because its wavelength range differs; the interactive effect of the first screen is therefore unaffected, and the multi-surface splicing projection can deliver a complete interactive experience.
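The wavelength-separation rule of this embodiment can be illustrated with a toy check: a camera responds to a laser only if the laser's wavelength lies inside the camera's filter passband. The 850 nm / 940 nm center wavelengths follow the example given later in this embodiment; the passband widths and all names below are illustrative assumptions.

```python
# Assumed narrow-band passbands (nm) for the two cameras' filters.
FILTERS = {"C1": (845, 855), "C2": (935, 945)}
# Laser wavelengths (nm) from the embodiment's example.
LASERS = {"G1": 850, "G2": 940}

def camera_sees(camera, laser):
    """True if the laser's wavelength falls inside the camera's passband."""
    lo, hi = FILTERS[camera]
    return lo <= LASERS[laser] <= hi
```

Under this rule, C1 responds only to G1's curtain and C2 only to G2's, which is exactly why the curtains can cross onto each other's screens without causing false touches.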
In this embodiment, multi-screen splicing is realized with a plurality of projectors, so the computer can output an image of higher resolution.
In this embodiment, the first and second infrared camera devices are both connected to a computer; the computer is connected to the projectors through a fusion device (software-based or hardware-based), and the images of the projectors are partially overlapped and blended to form one ultra-large picture. The computer receives the images captured by the first or second infrared camera device and processes and analyzes each frame so as to identify and track the motion and position of each group of contact points.
In this embodiment, the first and second infrared laser devices each comprise one or more infrared lasers, and the first and second infrared camera devices each comprise one or more infrared cameras. That is: one infrared camera may cover one or more screens, or several infrared cameras may cover one screen; likewise, one screen may be matched with one or more infrared lasers. The infrared cameras of this embodiment have switchable infrared narrow-band filters: with the filter open the device works as an ordinary camera, and with the filter closed it works as an infrared camera.
In this embodiment, a user may perform a touch operation on a screen by using a finger or a stylus.
The following is a preferred embodiment of the present invention:
as shown in fig. 1: a high-precision touch interaction system for a multi-surface projection system comprises two interactive projection screens L1 and L2, infrared lasers G1 and G2, infrared cameras C1 and C2, a computer, and projectors P1 and P2. Each interactive projection screen includes two layers, as shown in fig. 2: interactive projection screen L1 comprises a projection screen L11 and a laser curtain L12, with L12 in front of L11; L12 does not intersect L11 and has a thickness of no more than 5 mm. Screens L1 and L2 are arranged at an included angle; for example, interactive projection screen L1 is mounted on one wall and interactive projection screen L2 on another wall perpendicular to it, so that L1 and L2 are arranged at 90°. Infrared lasers G1 and G2 emit the laser curtains L12 and L22, parallel to projection screens L11 and L21 respectively; G1 is mounted above L11, and G2 above L21. G1 and G2 differ in wavelength: for example, G1 emits at 850 nm and G2 at 940 nm. Infrared cameras C1 and C2 capture each frame of the user's touches (such as finger touches) on the interactive projection screens; L1 and L2 lie within the viewing angles of C1 and C2 respectively, the infrared wavelength of C1 is matched to that of laser G1, and the infrared wavelength of C2 is matched to that of laser G2.
For example, infrared camera C1 uses an 850 nm narrow-band filter, and infrared camera C2 uses a 940 nm narrow-band filter.
Preferably, C1 and C2 are black-and-white cameras.
A high-precision touch interaction method of a multi-surface projection system comprises the following steps:
the first step is as follows: connecting all products, connecting a computer with a fusion device, connecting the fusion device with projectors P1 and P2, and seamlessly splicing the pictures of the projectors into two projection screens L11 and L21; adjusting the infrared camera C1 so that the infrared camera C1 can see the whole picture of the projection screen L11, if the content of L11 is seen in the back of the picture of C1, the whole picture is represented by V11, and if the whole picture seen by C1 is V1, the whole picture is represented by V1
Figure DEST_PATH_GDA0002888651310000051
Similarly, the range of the L21 screen in the C2 screen is V21, and the entire C2 screen is V2, so that
Figure DEST_PATH_GDA0002888651310000052
The infrared laser G1 emits a laser curtain L12, and the infrared laser G1 is adjusted so that the laser curtain L12 does not intersect the projection screen L11, and similarly the laser curtain L22 emitted by G2 does not intersect the projection screen L21.
The second step: calibration.
(1) Setting the effective area of the infrared camera C1
Compute the effective area of infrared camera C1. The processing unit on the computer closes the filter of infrared camera C1 so that the narrow-band filter takes effect. Because the distinction between the black background and any laser-illuminated area is then very clear, a specific working threshold T_W can be obtained; subtracting a small value P gives the binarization threshold of the effective area, i.e. the effective-area threshold T_E = T_W − P. Binarization is then performed. Let f(x, y) denote the gray value of the pixel at row x, column y of V1; then for every pixel of V1:

g(x, y) = 255 if f(x, y) ≥ T_E
g(x, y) = 0 if f(x, y) < T_E

The set V1E of pixels where g(x, y) = 0 is the effective area of infrared camera C1. For camera C1, the system processes only signals within V1E; in a real working environment, camera C1 cannot be required to see no laser at all from infrared laser G1. Using the effective area V1E is what makes the system practical.
(2) The work area of C1 is calculated.
The processing unit on the computer opens the filter of infrared camera C1, so the camera now sees an unfiltered image. The processing unit sets the displayed image so that interactive projection screen L1 is all white and interactive projection screen L2 is all black. The picture V1 of infrared camera C1 then contains part or all of L1 and L2 plus a background portion. A suitable threshold t is selected and V1 is binarized; the image portion V11 containing L1 is the working area of infrared camera C1. Again let f(x, y) denote the gray value of the pixel at row x, column y of V1; then for every pixel of V1:

g(x, y) = 255 if f(x, y) ≥ t
g(x, y) = 0 if f(x, y) < t

The set V1W of pixels where g(x, y) = 255 is the working area of infrared camera C1.
To select a suitable threshold, the maximum between-class variance method (Otsu) is preferably used. The basic idea of the Otsu algorithm is to split the gray levels of an image into two groups at an assumed gray value t; the value of t at which the between-class variance of the two groups is maximal is the optimal threshold for binarizing the image. The threshold can, of course, also be specified manually.
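A minimal pure-Python sketch of the Otsu threshold selection described above, operating on a flat list of 8-bit gray values (the function name is an assumption; production code would typically use a library routine):

```python
def otsu_threshold(pixels):
    """Return the gray level t that maximizes between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * hist[i] for i in range(256))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(256):
        w0 += hist[t]              # class 0: pixels with gray value <= t
        if w0 == 0:
            continue
        w1 = total - w0            # class 1: pixels with gray value > t
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

For the calibration image described here (white screen L1 against a black background), the histogram is strongly bimodal, which is exactly the case Otsu handles well.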
(3) The intersection V1EW of V1E and V1W is the effective working area of C1, namely:
V1EW = V1E ∩ V1W
The processing unit processes only the points of the C1 picture that belong to V1EW. Interfering light sources outside the V11 picture therefore cannot affect the system, and neither can interfering light sources fixed within V11, giving the system good robustness.
The effective working area of C2 is obtained in the same way.
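The effective area, working area, and their intersection described above can be sketched as set operations on pixel coordinates. This is an illustrative reading of the calibration steps, with frames represented as lists of pixel rows and the thresholds T_E and t passed in; all names are assumptions.

```python
def effective_area(dark_frame, t_e):
    """V1E: pixels below T_E with the filter closed (no fixed stray laser)."""
    return {(x, y)
            for y, row in enumerate(dark_frame)
            for x, v in enumerate(row) if v < t_e}

def working_area(bright_frame, t):
    """V1W: pixels at or above t with the filter open (screen L1 shown white)."""
    return {(x, y)
            for y, row in enumerate(bright_frame)
            for x, v in enumerate(row) if v >= t}

def effective_working_area(dark_frame, t_e, bright_frame, t):
    """V1EW = V1E ∩ V1W: the only region the processing unit examines."""
    return effective_area(dark_frame, t_e) & working_area(bright_frame, t)
```

Restricting processing to this intersection is what excludes both off-screen light sources and fixed interference inside the screen region.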
(4) Automatic calibration.
With the filter of infrared camera C1 open, the processing unit makes the computer project dots, white on a black background, onto projection screen L11, and the coordinates of the center of each white dot are computed. Preferably, 63 coordinate points are used, in 7 rows and 9 columns, partitioning the screen into quadrilateral regions. C1 captures the 63 dot centers on L11 and computes their coordinate positions in its own picture, and the 63 points divide the picture into the corresponding quadrilateral regions. For each quadrilateral, a perspective transformation is computed between its world-coordinate position in C1 and the computer coordinate system that generated the points, yielding a perspective transformation matrix per quadrilateral and thereby a one-to-one mapping between the world coordinate system within V11 and the computer coordinate system.
Similarly, calibration for L21 is performed via C2.
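The per-quadrilateral perspective transformation can be computed from four point correspondences. Below is a self-contained sketch under the usual homography model (pure Python, Gaussian elimination; the function names are assumptions, and a real implementation would more likely call a library routine such as OpenCV's getPerspectiveTransform):

```python
def solve_homography(src, dst):
    """Solve the 8 parameters h of the perspective map
    (x, y) -> ((h0*x + h1*y + h2) / w, (h3*x + h4*y + h5) / w),
    w = h6*x + h7*y + 1, from four (x, y) -> (x', y') correspondences."""
    a, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        a.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        a.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    # Gaussian elimination with partial pivoting on the 8x8 system.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (b[r] - sum(a[r][c] * h[c] for c in range(r + 1, n))) / a[r][r]
    return h

def apply_homography(h, x, y):
    """Map a camera-frame point into the computer coordinate system."""
    w = h[6] * x + h[7] * y + 1
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)
```

During operation, a detected touch centroid is located in its quadrilateral and mapped through that quadrilateral's matrix to a computer coordinate.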
After calibration, for C1, the system processes only signals within the effective working area V1EW. When a reflective object such as a finger or a stick comes close to the screen, it intersects laser curtain L12 and forms a reflection.
Since the wavelength of infrared laser G2 and the filter of infrared camera C2 differ from the wavelength of infrared laser G1 and the filter of infrared camera C1, G2 does not interfere with C1 and G1 does not interfere with C2; by the same principle, infrared camera C2 allows the computer to be operated on L2.
Preferably, the embodiment is described with a two-surface splicing projection at 1920 × 1080 per surface. The computer coordinate system corresponding to projection screen L11 has its upper-left corner at (0, 0) and lower-right corner at (1920 − 1, 1080 − 1); the computer coordinate system of projection screen L21 has its upper-left corner at (1920, 0) and lower-right corner at (3840 − 1, 1080 − 1). When a user's finger operates on L1, the corresponding position in L11's computer coordinate system is obtained through the perspective transformation matrix; similarly, operating with an object or finger on L2 controls the computer coordinate system corresponding to L21. High-precision touch interaction for multi-surface splicing projection is thus fully realized.
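The final mapping from a screen-local coordinate to the spliced desktop can be illustrated as a simple offset, following the example coordinates above; the function and table names are assumptions.

```python
# Desktop origins of each screen in the spliced 3840x1080 coordinate system,
# per the embodiment's example: L11 covers x in [0, 1919], L21 x in [1920, 3839].
SCREEN_OFFSETS = {"L11": (0, 0), "L21": (1920, 0)}

def to_desktop(screen, local_x, local_y):
    """Translate a point in a screen's own 1920x1080 space to desktop coords."""
    ox, oy = SCREEN_OFFSETS[screen]
    return ox + local_x, oy + local_y
```

A touch at, say, local (100, 50) on L21 lands 1920 pixels to the right in the combined desktop, so both screens drive one continuous computer coordinate system.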
By contrast, if lasers and filters of the same wavelength were used for the multi-surface splicing projection, infrared laser G2 would shine into projection screen L11, and infrared laser G1 into projection screen L21. In that case, to preserve the stability, robustness, and low latency described above, the system would have to mark the part of L11 illuminated by G2 as a blind area: with the filter of infrared camera C1 open, part of projection screen L11 would be a blind area in which no interaction is possible.
In summary, the utility model performs high-precision touch interaction on multi-surface splicing projection, with good real-time performance and stability as well as industrial practicability.

Claims (8)

1. A high-precision touch interaction system for multi-surface splicing projection, characterized by comprising at least a first screen and a second screen which are touch-enabled and arranged at an included angle, wherein the first screen is provided with a first infrared laser device and a first infrared camera device; the second screen is provided with a second infrared laser device and a second infrared camera device; and the first infrared laser device and the second infrared laser device have different wavelengths.
2. The high-precision touch interaction system for multi-surface splicing projection according to claim 1, wherein the first infrared camera device has the same wavelength as the first infrared laser device, and the second infrared camera device has the same wavelength as the second infrared laser device.
3. The high-precision touch interaction system for multi-surface splicing projection according to claim 1, wherein the wavelength of the first infrared laser device is within the wavelength range of the first infrared camera device, and the wavelength of the second infrared laser device is within the wavelength range of the second infrared camera device.
4. The high-precision touch interaction system for multi-surface splicing projection according to claim 1, 2 or 3, further comprising a computer, wherein the computer is connected with a fusion device; the fusion device is connected with the first projector and the second projector and seamlessly splices the pictures of the projectors onto the first screen and the second screen; each infrared laser device emits a laser curtain, and the laser curtain does not intersect the corresponding screen.
5. The high-precision touch interaction system for multi-surface splicing projection according to claim 1, 2 or 3, wherein each infrared camera device can see the whole picture of its corresponding screen; if the portion of the first screen visible in the picture of the first infrared camera device is denoted V11, and the whole picture captured by the first infrared camera device is V1, then

V11 ⊆ V1

similarly, if the portion of the second screen visible in the picture of the second infrared camera device is V21, and the whole picture captured by the second infrared camera device is V2, then

V21 ⊆ V2
6. The high-precision touch interaction system for multi-surface splicing projection according to claim 4, wherein the computer comprises a processing unit for processing and analyzing each frame of image captured by the infrared camera devices so as to identify and track the motion and position of each group of contact points.
7. The high-precision touch interaction system for multi-surface splicing projection according to claim 1, 2 or 3, wherein the first infrared laser device and the second infrared laser device each comprise one or more infrared lasers.
8. The high-precision touch interaction system for multi-surface splicing projection according to claim 1, 2 or 3, wherein the first infrared camera device and the second infrared camera device each comprise one or more infrared cameras with an infrared filter switching function.
CN202021956000.6U 2020-09-09 2020-09-09 High-precision touch interaction system for multi-surface splicing projection Active CN212781988U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202021956000.6U CN212781988U (en) 2020-09-09 2020-09-09 High-precision touch interaction system for multi-surface splicing projection


Publications (1)

Publication Number Publication Date
CN212781988U true CN212781988U (en) 2021-03-23

Family

ID=75065148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202021956000.6U Active CN212781988U (en) 2020-09-09 2020-09-09 High-precision touch interaction system for multi-surface splicing projection

Country Status (1)

Country Link
CN (1) CN212781988U (en)


Legal Events

Date Code Title Description
GR01 Patent grant