CN108269288B - Intelligent special-shaped projection non-contact interaction system and method - Google Patents

Intelligent special-shaped projection non-contact interaction system and method

Info

Publication number
CN108269288B
Authority
CN
China
Prior art keywords
camera
projection
projector
image
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711460835.5A
Other languages
Chinese (zh)
Other versions
CN108269288A (en)
Inventor
王波
于海涛
朱晓阳
蒋永实
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongke Qichuang Tianjin Technology Co ltd
Institute of Automation of Chinese Academy of Science
Original Assignee
Zhongke Qichuang Tianjin Technology Co ltd
Institute of Automation of Chinese Academy of Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongke Qichuang Tianjin Technology Co ltd, Institute of Automation of Chinese Academy of Science filed Critical Zhongke Qichuang Tianjin Technology Co ltd
Priority to CN201711460835.5A priority Critical patent/CN108269288B/en
Publication of CN108269288A publication Critical patent/CN108269288A/en
Application granted granted Critical
Publication of CN108269288B publication Critical patent/CN108269288B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/13 - Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/187 - Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 - Details of colour television systems
    • H04N 9/12 - Picture reproducers
    • H04N 9/31 - Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 - Video signal processing therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10004 - Still image; Photographic image
    • G06T 2207/10012 - Stereo images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20112 - Image segmentation details
    • G06T 2207/20164 - Salient point detection; Corner detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention belongs to the field of special-shaped projection and particularly relates to an intelligent special-shaped projection non-contact interaction system and method, aiming at solving the problem of non-contact interaction for projection onto special-shaped surfaces. The system comprises a projection module, an image acquisition module, a special-shaped projection calibration module, an image making module and an interaction module. The projection module performs video projection and generates an interaction area index map when the interactive projection is initialized; the image acquisition module acquires images of the interaction area in real time; the special-shaped projection calibration module performs camera calibration, projector calibration and simultaneous calibration of the camera and the projector; the image making module makes the video to be projected by each projector and the corresponding interaction area map for the special-shaped surface; and the interaction module detects whether the acquired images of the interaction area contain touch spots and, according to the touch spots, selects the corresponding projection image to send to the projector for projection playback. The invention uses machine-vision principles to realize the non-contact interactive function and enables long-distance interactive projection playback for large-scale special-shaped projection.

Description

Intelligent special-shaped projection non-contact interaction system and method
Technical Field
The invention belongs to the field of special-shaped projection, and particularly relates to an intelligent special-shaped projection non-contact interaction system and method.
Background
With the development of science and technology, projection technologies of various kinds have advanced greatly. Among them, large-scale special-shaped projection makes the projected picture fit the special-shaped surface perfectly, giving the visual impression that the surface itself emits light; it can display striking effects on special-shaped objects and enhances the viewer's sense of immersion, and has therefore been widely developed. Different from ordinary planar projection, special-shaped-surface projection can project onto a variety of special-shaped surfaces, such as automobiles, mannequins, artistic walls and stereoscopic sand tables, and is widely applied in education, entertainment, virtual reality, commercial promotion, military sand tables and other fields. Projection onto a special-shaped surface must take the structural characteristics of the surface into account; otherwise the projected image may not be aligned with the surface, phenomena such as light leakage and dislocation occur, and the expected effect cannot be achieved. A commonly used solution first generates a grid over the projected special-shaped surface and then manually drags the grid on the surface to align the image, but this method lacks automation and consumes considerable manpower, especially when projecting onto a large special-shaped surface such as a large stereoscopic sand table.
When a large irregular surface is projected, several projectors must project in a coordinated manner. If the projectors are spliced directly using hard splicing, i.e. the projection range of each projector is adjacent to but does not overlap the ranges of the other projectors, projection gaps appear on the projection surface and arranging the projector positions is difficult; if the projection areas of the projectors overlap, the overlapping areas exhibit highlight and ghosting phenomena. The usual solution is to blend the brightness of the projectors in the overlapping highlight areas, but this method is complicated to operate.
In some specific applications it is also desirable to provide a certain interaction capability for the user. At present, the interaction technologies combined with projection are generally touch-based, such as infrared touch interaction, which requires a planar interaction area of limited size. When projection interaction is performed on a large irregular surface, touch interaction cannot meet the requirement. Therefore, breaking through the limitations of contact-based interaction in large irregular-surface projection and realizing non-contact interaction has become an urgent problem.
Disclosure of Invention
In order to solve the above problems in the prior art, that is, to solve the problem of non-contact interaction of projection of a special-shaped surface, the invention provides an intelligent special-shaped projection non-contact interaction system, which comprises a projection module, an image acquisition module, an image making module, a special-shaped projection calibration module and an interaction module; the projection module comprises at least one projector; the image acquisition module comprises at least one camera;
the special-shaped projection calibration module comprises camera calibration, projector calibration and simultaneous calibration of the camera and the projector; the camera calibration is used for obtaining the internal parameters of the camera; the projector calibration is used for obtaining the internal parameters of the projector; the simultaneous calibration of the camera and the projector is used for obtaining the external parameters of the camera and the projector and the position of the projector in the camera coordinate system;
the image making module is used for making the video to be projected by each projector and the corresponding interaction area map for the special-shaped surface according to the position of the projector in the camera coordinate system; in the image making module, the parameters and position of the virtual camera used in the image making process are configured according to the internal parameters, external parameters and position of the projector returned by the special-shaped projection calibration module;
the projection module is used for carrying out video projection and generating an interaction area index map according to the interaction area map when the interaction projection is initialized;
the image acquisition module is used for acquiring images of the interaction area in real time;
the interactive module is used for detecting whether the acquired image of the interactive area has touch spots or not, and selecting a corresponding projection image according to the touch spots to send to the projector for projection playing.
Further, the camera calibration method comprises the following steps:
fixing the checkerboard on a calibration plate, changing the position or angle of the calibration plate within the field of view of a camera, and acquiring a calibration image by the camera when each angular point can be completely and correctly detected after the position of the calibration plate is changed and stabilized;
acquiring angular points detected in each calibration image and the physical positions of the angular points;
and calibrating the internal reference of the camera according to the angular points detected in the calibration images and the physical positions of the angular points.
Further, the projector calibration method comprises the following steps:
projecting checkerboard on a calibration plate adopted by camera calibration through a projector; and acquiring the physical three-dimensional coordinates of the corner points in the projected checkerboard through the camera, and sending the coordinates to the projector for internal reference calibration.
Further, the camera and the projector are calibrated simultaneously, and the method comprises the following steps:
using the calibration plate adopted for camera calibration, fixing the position and the angle of the calibration plate, executing the projector calibration once, obtaining the rotation matrices of the camera and the projector, Rc and Rp respectively, and the translation vectors of the camera and the projector, Tc and Tp respectively, and calculating the position of the projector in the camera coordinate system
(the formula is given as an image in the original publication)
Further, the checkerboard is projected on a calibration board adopted by camera calibration through a projector; acquiring the physical three-dimensional coordinates of the corner points in the projected checkerboard through a camera, and sending the coordinates to the projector for internal reference calibration, wherein the method comprises the following steps:
step A1, projecting a white image on a calibration board in the field of view of a camera by a projector, and when the checkerboard corner points on the calibration board can be completely and correctly detected, acquiring an image B by the camera, wherein the image B comprises the checkerboard when the camera is calibrated, detecting the corner points in the image B, and calibrating to obtain the external parameters of the camera;
step A2, keeping the angle and position of the calibration board unchanged, projecting a checkerboard image on the calibration board in the field of view of the camera by the projector, acquiring an image I by the camera, and obtaining the foreground image F = I - B, which contains only the projected checkerboard;
step A3, when the checkerboard corner points in the image F can be detected completely and correctly, obtaining the physical three-dimensional coordinates of each corner point by using the internal reference and the external reference of the camera, and storing the physical three-dimensional coordinates into a corner point physical position information set;
and A4, changing the angle and/or position of the calibration board, repeating the steps A1-A3, reaching the set times, and calibrating the internal parameters of the projector by using the corner physical three-dimensional coordinates stored in the corner physical position information set and the image information corresponding to the corners in the projection image.
Further, the method for making the interactive area map corresponding to the video to be projected by the projector and the special-shaped surface includes:
generating an interaction area image for each projector, wherein the pixel values are all 0;
setting the position of each virtual camera according to the position of each projector in the camera coordinate system; each camera shoots an image of a special-shaped surface in a view field of the camera, and a corresponding position on the special-shaped surface in the image, which needs to be projected by a projector, is converted into a virtual view field of the projector for rendering;
the scene images to be rendered in the virtual camera comprise the playback image and the interaction area map; the playback images are played on the special-shaped surface, and in an area where several projectors overlap only one projector is selected to render the content while the other projectors project black in that area; the interaction area map is used for interaction initialization, the interaction areas in it are rendered white, and the pixel values of the remaining areas stay at 0.
Further, the method of generating the interaction area index map according to the interaction area map during the initialization of the interaction projection includes:
step B1, defining the default of the index map of the interaction area as a full black image, and the pixel value is 0;
step B2, coordinating projection among projectors, projecting the interactive area map onto a special-shaped surface in the camera field in sequence according to the principle of projecting a large area first and then projecting a small area, and shooting the special-shaped surface image by the camera;
b3, sequentially carrying out graying and binarization operation on the image acquired by the camera and searching a maximum connected domain;
and step B4, with Max denoting the maximum image pixel value, N the number of interaction areas on the special-shaped surface, and n the index of the current area, updating the pixel values of the positions corresponding to the current maximum connected domain in the interaction index map to AreaIndex = n·Max/N.
Further, the method of detecting whether the acquired image of the interaction area has a touch spot and selecting a corresponding projection image to send to a projector for projection playing based on the touch spot includes:
the camera monitors the image of the special-shaped surface within its field of view in real time; each frame is grayed and binarized and the maximum connected domain is found, and if the maximum connected domain is larger than a set threshold, it is judged that a spot produced by the user's interaction falls within that connected domain;
according to the position of the maximum connected domain, the pixel value AreaIndex at that position in the interaction index map is read; if AreaIndex equals n·Max/N for some interaction area n, AreaIndex is stored in order as the index of the current frame in the interaction area index queue;
and the interaction area index occurring most frequently in the current index queue is counted, and that index number is sent to the corresponding projector in the projection module for projection playing of the interactive image.
Further, the length of the interaction area index queue is set according to a human-computer interaction interface window.
On the other hand, the invention provides an intelligent special-shaped projection non-contact interaction method based on the above intelligent special-shaped projection non-contact interaction system, comprising the following steps:
step S1, initializing an interaction area index map;
step S2, reading the image of the interactive area acquired by the camera in real time;
step S3, detecting whether the image of the collected interaction area has a touch spot, if not, continuing to play the current video, and if so, executing step S4;
step S4, selecting a corresponding projection image to be sent to the projector for projection and playback according to the detected touch spot.
The invention has the following beneficial effects:
the invention uses machine-vision principles to realize the non-contact interactive function and enables long-distance interactive projection playback for large-scale special-shaped projection; it can be applied to education, science popularization, large military sand tables and similar scenarios, where a user can interactively display a specific area at will;
the invention calibrates the camera and the projector simultaneously, and the position returned for the projector is used as the position of the virtual camera during image production, so that projection onto a large special-shaped surface no longer requires aligning the image with the special-shaped object by grid-mapping-style manual dragging, and the degree of automation is higher;
the invention provides the specific position of the virtual camera for each projector in the image production stage, so that brightness fusion of the overlapping parts of multiple projectors is not needed.
Drawings
FIG. 1 is a schematic structural diagram of an intelligent irregular projection non-contact interaction system according to an embodiment of the invention;
FIG. 2(a) is a schematic diagram of the projector projecting a full white image onto the checkerboard calibration board during projector calibration;
FIG. 2(b) is a schematic diagram of the projector projecting a checkerboard image onto the calibration board during projector calibration;
FIG. 2(c) is a schematic checkerboard diagram for projector calibration obtained by camera background modeling;
FIG. 3 is a schematic diagram of the relationship between the camera and the projector during calibration;
FIG. 4 illustrates the mapping from the image captured by the camera, through the virtual-camera rendering, to the image projected by the projector;
FIGS. 5(a), 5(b), and 5(c) are diagrams illustrating the interaction regions produced by the production module, respectively;
fig. 6 is an interaction area index map obtained by acquiring initialization with a camera after the projector projects the interaction area map;
FIG. 7 is a flowchart illustrating an exemplary process implementation of the present invention.
Description of reference numerals: checkerboard 101 pasted on the calibration board; full white image 102 projected onto the calibration board by the projector; checkerboard image 103 projected onto the calibration board by the projector; background image 104 obtained by camera modeling; checkerboard 105 on the background modeling image; camera 106; projector 107.
Detailed Description
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention, and are not intended to limit the scope of the present invention.
The intelligent special-shaped projection non-contact interaction system provided by the invention provides a more intelligent interaction function for users, and can realize linkage projection of large special-shaped surfaces by using a plurality of projectors, thereby fundamentally solving the problem of fusion between the projectors.
The intelligent special-shaped projection non-contact interaction system disclosed by the embodiment of the invention comprises a projection module, an image acquisition module, an image making module, a special-shaped projection calibration module and an interaction module, as shown in fig. 1.
1. Projection module
And the projection module is used for carrying out video projection and generating an interaction area index map according to the interaction area map when the interaction projection is initialized.
The projection module comprises one or more projectors; when a plurality of projectors are used, each projector shares with the other projectors a projection overlap area of not less than 10% of its projection area.
In this embodiment, when the interactive projection is initialized, an interactive region index map is generated according to the interactive region map, that is, the interactive region index map is initialized, and the method includes:
step B1, defining the default of the index map of the interaction area as a full black image, and the pixel value is 0;
step B2, coordinating projection among projectors, projecting the interactive area map onto a special-shaped surface in the camera field in sequence according to the principle of projecting a large area first and then projecting a small area, and shooting the special-shaped surface image by the camera;
b3, sequentially carrying out graying and binarization operation on the image acquired by the camera and searching a maximum connected domain;
and step B4, with Max denoting the maximum image pixel value, N the number of interaction areas on the special-shaped surface, and n the index of the current area, updating the pixel values of the positions corresponding to the current maximum connected domain in the interaction index map to AreaIndex = n·Max/N.
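A minimal sketch of the index-map initialization above (steps B1-B4), assuming the OpenCV Python API and BGR camera frames; the function name init_interaction_index_map, the one-frame-per-area input and the Otsu threshold are illustrative choices, not taken from the patent. It assumes the camera has captured one image per projected interaction-area map, with the areas already ordered from large to small.

```python
import numpy as np
import cv2

def init_interaction_index_map(camera_frames, num_areas, max_val=255):
    """camera_frames: one camera image per interaction-area projection (areas ordered large to small)."""
    h, w = camera_frames[0].shape[:2]
    index_map = np.zeros((h, w), dtype=np.uint8)               # B1: all-black default, pixel value 0
    for n, frame in enumerate(camera_frames, start=1):          # B2: areas projected one by one
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)           # B3: graying
        _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # B3: binarization
        num_labels, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
        if num_labels < 2:
            continue                                             # nothing bright was detected
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))  # B3: maximum connected domain
        area_index = n * max_val // num_areas                    # B4: AreaIndex = n*Max/N
        index_map[labels == largest] = area_index
    return index_map
```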
2. Image acquisition module
And the image acquisition module is used for acquiring the image of the interactive area in real time.
The image acquisition module comprises one or more cameras; if a plurality of cameras are used, their fields of view overlap, each camera's field of view covers the projection area of the projector(s) projecting within it, and the projection interaction area lies within the cameras' fields of view.
3. Special-shaped projection calibration module
The special-shaped projection calibration module comprises camera calibration, projector calibration and simultaneous calibration of the camera and the projector; the camera calibration is used for obtaining camera internal parameters; the projector calibration is used for obtaining internal parameters of the projector; and the camera and the projector are simultaneously calibrated to obtain external parameters of the camera and the projector and the position of the projector in a camera coordinate system.
(1) The camera calibration method comprises the following steps:
fixing the checkerboard on a calibration plate, changing the position or angle of the calibration plate within the field of view of the camera, and acquiring a calibration image with the camera once the calibration plate has been repositioned, is stable, and every corner point can be completely and correctly detected; the number of calibration images collected is set according to user requirements and is generally not less than 3;
acquiring angular points detected in each calibration image and the physical positions of the angular points;
calibrating the internal parameters of the camera according to the corner points detected in each calibration image and their physical positions; this method is prior art and is not repeated here, and reference can be made to: Zhengyou Zhang, A flexible new technique for camera calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 22, Issue 11, Nov. 2000.
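For reference, a minimal OpenCV sketch of the checkerboard calibration described above (Zhang's method, as cited); the board size, square size and helper name are example values and are not specified by the patent.

```python
import numpy as np
import cv2

def calibrate_camera(calib_images, board_size=(9, 6), square_mm=25.0):
    # Physical positions of the printed corners on the board plane (Z = 0).
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_mm
    obj_points, img_points = [], []
    image_size = calib_images[0].shape[1::-1]              # (width, height)
    for img in calib_images:                               # board shown at several positions/angles
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if not found:                                      # keep only images where all corners are detected
            continue
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    return K, dist                                         # camera intrinsics and distortion coefficients
```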
(2) The projector calibration can be regarded as an inverse process of camera calibration, and the method comprises the following steps:
the projector projects checkerboard on a calibration plate adopted when the camera is calibrated; and acquiring the physical three-dimensional coordinates of the corner points in the projected checkerboard through the camera, and sending the coordinates to the projector for internal reference calibration.
In this embodiment, the projector calibration may be divided into the following steps:
step A1, projecting a white image on a calibration board in the field of view of a camera by a projector (as shown in fig. 2 (a)), and when the angular points of the checkerboard on the calibration board can be completely and correctly detected, acquiring an image B by the camera, wherein the image B comprises the checkerboard when the camera is calibrated, detecting the angular points in the image B, and calibrating to obtain the external parameters of the camera;
step A2, keeping the angle and position of the calibration board unchanged, projecting a checkerboard image on the calibration board in the field of view of the camera by the projector (as shown in fig. 2(b)), acquiring an image I by the camera, and obtaining the foreground image F = I - B, which contains only the projected checkerboard (as shown in fig. 2(c));
step A3, when the checkerboard corner points in the image F can be detected completely and correctly, obtaining the physical three-dimensional coordinates of each corner point by using the internal reference and the external reference of the camera, and storing the physical three-dimensional coordinates into a corner point physical position information set;
and step A4, changing the angle and/or position of the calibration plate and repeating steps A1-A3 a set number of times, generally not less than 3; the internal parameters of the projector are then calibrated using the corner physical three-dimensional coordinates stored in the corner physical position information set and the corresponding image information of the corners in the projected image. This method is prior art and is not repeated here; reference can be made to: Zhengyou Zhang, A flexible new technique for camera calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 22, Issue 11, Nov. 2000.
A schematic diagram of the relationship of the camera calibration projector is shown in fig. 3.
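A sketch of the key geometry of steps A2-A3: the 3D positions of the projected corners are recovered by intersecting camera rays with the calibration-board plane. It assumes an undistorted pinhole camera and the OpenCV API; all names are illustrative, and the patent does not prescribe this particular implementation.

```python
import numpy as np
import cv2

def projected_corners_3d(img_white, img_checker, K_cam, board_size, objp_board):
    """img_white: image B (board under all-white projection); img_checker: image I.
    objp_board: board-frame 3D coordinates of the printed checkerboard corners."""
    gray_b = cv2.cvtColor(img_white, cv2.COLOR_BGR2GRAY)
    ok, printed = cv2.findChessboardCorners(gray_b, board_size)
    assert ok, "printed checkerboard must be fully detected (step A1)"
    # Camera extrinsics of the board for this pose (step A1).
    _, rvec, tvec = cv2.solvePnP(objp_board, printed, K_cam, None)
    R, _ = cv2.Rodrigues(rvec)
    # Foreground containing only the projected checkerboard (step A2): F = I - B.
    F = cv2.subtract(cv2.cvtColor(img_checker, cv2.COLOR_BGR2GRAY), gray_b)
    ok, proj = cv2.findChessboardCorners(F, board_size)
    assert ok, "projected checkerboard must be fully detected (step A3)"
    # Board plane in camera coordinates: point p0 = tvec, normal = third column of R.
    n, p0 = R[:, 2], tvec.reshape(3)
    K_inv = np.linalg.inv(K_cam)
    pts3d = []
    for (u, v) in proj.reshape(-1, 2):
        ray = K_inv @ np.array([u, v, 1.0])        # camera ray through the corner pixel
        s = (n @ p0) / (n @ ray)                   # ray-plane intersection scale
        pts3d.append(s * ray)                      # 3D corner in camera coordinates
    return np.array(pts3d, dtype=np.float32)
```

Repeating this over several board poses and feeding the accumulated 3D corner positions, together with the known corner coordinates in the projected checkerboard pattern, to cv2.calibrateCamera then yields the projector internal parameters (step A4).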
(3) The camera and the projector are calibrated simultaneously, the process of calibrating the projector and the camera simultaneously is similar to the calibration of the projector, the difference is that the angle and the position of the calibration plate are unchanged in the calibration process, and the specific method comprises the following steps:
using the calibration plate adopted for camera calibration, fixing the position and the angle of the calibration plate, and executing the projector calibration once to obtain the rotation matrices of the camera and the projector, Rc and Rp, and the translation vectors of the camera and the projector, Tc and Tp; the position of the projector in the camera coordinate system is then calculated, which is prior art and is not described again here; reference may be made to: Richard Hartley, Andrew Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, 2004.
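The relative-pose computation can be sketched as follows, assuming the usual convention X_view = R·X_world + T for both devices (as in Hartley & Zisserman); the exact formula in the patent's omitted equation image may use a different convention, so this is an assumption rather than a restatement of the patent.

```python
import numpy as np

def projector_position_in_camera(R_c, T_c, R_p, T_p):
    # Projector centre in world coordinates: C_p = -R_p^T @ T_p.
    C_p_world = -R_p.T @ T_p
    # Express that point in the camera frame: X_cam = R_c @ X_world + T_c.
    return R_c @ C_p_world + T_c
```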
4. Image making module
The image making module is used for making a corresponding interactive area map on a video and a special-shaped surface which are required to be projected by the projector according to the position of the projector in the camera coordinate system; in the image making module, parameters and positions of the virtual camera in the image making process are configured according to the internal parameters, the external parameters and the positions of the projector returned by the special-shaped projection calibration module.
In this embodiment, the method for making the interactive area map corresponding to the video to be projected by the projector and the corresponding irregular surface includes:
generating an interaction area image for each projector, wherein the pixel values are all 0;
setting the position of each virtual camera to the position of the corresponding projector in the camera coordinate system; each camera captures an image of the special-shaped surface within its field of view, and the content on the special-shaped surface that needs to be projected by a projector is transformed into the field of view of the corresponding virtual camera for rendering;
the scene images needing to be rendered in the virtual camera comprise playing images and an interactive area graph:
(1) the playback images, which are played on the special-shaped surface; in an area where several projectors overlap, only one projector is selected to render the content, and the other projectors project black in that overlapping area;
(2) the interaction area map, which is used for interaction initialization; the interaction areas in the map are rendered white, and the pixel values of the remaining areas stay at 0.
Fig. 4 shows an example of the mapping from an image captured by the camera, through the virtual-camera rendering, to the image projected by the projector.
Fig. 5(a), 5(b), and 5(c) are examples of interaction region diagrams produced by the production module.
Fig. 6 is an interaction area index map obtained by acquiring initialization by a camera after the projector projects the interaction area map.
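A simplified sketch of the two rendering rules above: each overlapping pixel is kept by exactly one projector while the others render black there, and the interaction area map is a zero image with the interaction regions filled white. Coverage masks, priorities and region polygons are assumed inputs; none of these helper names come from the patent.

```python
import numpy as np
import cv2

def resolve_overlaps(coverage_masks):
    """coverage_masks: list of HxW boolean arrays (one per projector), ordered by priority.
    Returns one 'keep' mask per projector so that each covered pixel is owned by exactly one."""
    owner = -np.ones(coverage_masks[0].shape, dtype=int)
    for idx, mask in enumerate(coverage_masks):
        owner[np.logical_and(mask, owner < 0)] = idx   # first projector to cover a pixel keeps it
    return [owner == idx for idx in range(len(coverage_masks))]

def apply_keep_mask(play_image, keep_mask):
    out = play_image.copy()
    out[~keep_mask] = 0                                # the other projectors show black here
    return out

def make_interaction_area_map(shape, region_polygons):
    area_map = np.zeros(shape, dtype=np.uint8)         # pixel values all 0 by default
    for poly in region_polygons:                       # interaction regions rendered white
        cv2.fillPoly(area_map, [np.asarray(poly, dtype=np.int32)], 255)
    return area_map
```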
5. Interaction module
The interaction module is used for detecting whether the acquired images of the interaction area have touch spots or not, and selecting the corresponding projection images according to the touch spots to send to the projector for projection playing.
In the intelligent irregular projection non-contact interaction system of this embodiment, the projection playing modes comprise a playback mode and an interaction mode; the system defaults to the playback mode, in which the projectors project in a coordinated manner to play specific image data on the special-shaped surface; in the interaction mode, the system provides interactive functions and allows a user to point at a specific interaction area with a laser pointer or a similar device.
In the interactive mode, the judging and playing control method in the interactive module is as follows:
(1) the camera monitors the image of the special-shaped surface within its field of view in real time; each frame is grayed and binarized and the maximum connected domain is found, and if the maximum connected domain is larger than a set threshold, it is judged that a spot produced by the user's interaction falls within that connected domain;
(2) according to the position of the maximum connected domain, the pixel value AreaIndex at that position in the interaction index map is read; if AreaIndex equals n·Max/N for some interaction area n, i.e. the spot falls inside an interaction area, AreaIndex is stored in order as the index of the current frame in the interaction area index queue;
(3) the interaction area index occurring most frequently in the current index queue is counted, and that index number is sent to the corresponding projector in the projection module, which projects and plays the corresponding interactive image.
The length of the interactive area index queue of the embodiment of the invention can be set from a human-computer interaction interface window according to requirements.
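A per-frame sketch of steps (1)-(3) above using OpenCV connected-component analysis; the brightness threshold, minimum spot area, queue length and number of areas are example values, the n·Max/N encoding follows the description, and the helper names are illustrative rather than taken from the patent.

```python
from collections import Counter, deque
import numpy as np
import cv2

MAX_VAL, NUM_AREAS, MIN_SPOT_AREA, QUEUE_LEN = 255, 4, 50, 15
valid_indices = {n * MAX_VAL // NUM_AREAS for n in range(1, NUM_AREAS + 1)}
index_queue = deque(maxlen=QUEUE_LEN)                               # interaction area index queue

def process_frame(frame, index_map):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)     # bright laser spot
    num_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    if num_labels < 2:
        return None
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    if stats[largest, cv2.CC_STAT_AREA] <= MIN_SPOT_AREA:            # (1) spot too small: ignore
        return None
    cx, cy = centroids[largest].astype(int)
    area_index = int(index_map[cy, cx])                              # (2) look up the index map
    if area_index not in valid_indices:
        return None
    index_queue.append(area_index)
    most_common, _ = Counter(index_queue).most_common(1)[0]          # (3) majority vote over the queue
    return most_common                                               # index to send to the projector
```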
The intelligent special-shaped projection non-contact interaction method based on the intelligent special-shaped projection non-contact interaction system disclosed by the embodiment of the invention is shown in fig. 7 and comprises the following steps:
step S1, initializing an interaction area index map;
step S2, reading the image of the interactive area acquired by the camera in real time;
step S3, detecting whether the image of the collected interaction area has a touch spot, if not, continuing to play the current video, and if so, executing step S4;
step S4, selecting a corresponding projection image to be sent to the projector for projection and playback according to the detected touch spot.
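A minimal end-to-end loop for steps S1-S4, assuming OpenCV video capture; the projector-control functions are stubs because the patent does not specify a projector API, and detect_touch can be any spot detector such as the process_frame sketch above.

```python
import cv2

def play_on_projectors(video_name):
    print("projecting:", video_name)                   # stub: a real system drives the projectors here

def video_for_area(area_index):
    return f"interactive_clip_{area_index}"            # stub: map an area index to its projection content

def run_interaction_loop(index_map, detect_touch, camera_id=0, default_video="default_show"):
    """detect_touch(frame, index_map) -> interaction area index or None."""
    cap = cv2.VideoCapture(camera_id)                  # S1 (index-map initialization) is done beforehand
    current = default_video
    while True:
        ok, frame = cap.read()                         # S2: read the interaction-area image in real time
        if not ok:
            break
        area = detect_touch(frame, index_map)          # S3: check the frame for a touch spot
        if area is not None:
            current = video_for_area(area)             # S4: select the corresponding projection image
        play_on_projectors(current)
    cap.release()
```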
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process and related descriptions of the method described above may refer to the corresponding contents in the foregoing system embodiments, and are not described herein again.
Those of skill in the art will appreciate that the various illustrative modules, and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate the interchangeability of electronic hardware and software. Whether such functionality is implemented as electronic hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The terms "comprises," "comprising," or any other similar term are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, or apparatus.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.

Claims (9)

1. An intelligent special-shaped projection non-contact interaction system is characterized by comprising a projection module, an image acquisition module, an image making module, a special-shaped projection calibration module and an interaction module; the projection module comprises at least one projector; the image acquisition module comprises at least one camera;
the special-shaped projection calibration module comprises camera calibration, projector calibration and simultaneous calibration of the camera and the projector; the camera calibration is used for obtaining the internal parameters of the camera; the projector calibration is used for obtaining the internal parameters of the projector; the simultaneous calibration of the camera and the projector is used for obtaining the external parameters of the camera and the projector and the position of the projector in the camera coordinate system;
the image making module is used for making the video to be projected by each projector and the corresponding interaction area map for the special-shaped surface according to the position of the projector in the camera coordinate system; in the image making module, the parameters and position of the virtual camera used in the image making process are configured according to the internal parameters, external parameters and position of the projector returned by the special-shaped projection calibration module;
the projection module is used for carrying out video projection and generating an interaction area index map according to the interaction area map when the interaction projection is initialized; the image acquisition module is used for acquiring images of the interaction area in real time; and
the interaction module is used for detecting whether the acquired images of the interaction area have touch spots or not, and selecting a corresponding projection image according to the touch spots to send to a projector for projection playing;
wherein, the method for generating the interactive area index map according to the interactive area map during the initialization of the interactive projection comprises the following steps:
step B1, defining the default of the index map of the interaction area as a full black image, and the pixel value is 0;
step B2, coordinating projection among projectors, projecting the interactive area map onto a special-shaped surface in the camera field in sequence according to the principle of projecting a large area first and then projecting a small area, and shooting the special-shaped surface image by the camera;
b3, sequentially carrying out graying and binarization operation on the image acquired by the camera and searching a maximum connected domain; and
and step B4, with Max denoting the maximum image pixel value, N the number of interaction areas on the special-shaped surface, and n the index of the current area, updating the pixel values of the positions corresponding to the current maximum connected domain in the interaction index map to AreaIndex = n·Max/N.
2. The intelligent, profiled-projection, non-contact interactive system as recited in claim 1, wherein the camera is calibrated by:
fixing the checkerboard on a calibration plate, changing the position or angle of the calibration plate within the field of view of a camera, and acquiring a calibration image by the camera when each angular point can be completely and correctly detected after the position of the calibration plate is changed and stabilized;
acquiring angular points detected in each calibration image and the physical positions of the angular points;
and calibrating the internal reference of the camera according to the angular points detected in the calibration images and the physical positions of the angular points.
3. An intelligent, profiled-projection, non-contact, interactive system as claimed in claim 2, wherein the projector is calibrated by:
projecting checkerboard on a calibration plate adopted by camera calibration through a projector; and acquiring the physical three-dimensional coordinates of the corner points in the projected checkerboard through the camera, and sending the coordinates to the projector for internal reference calibration.
4. An intelligent, profiled-projection, non-contact, interactive system as recited in claim 3, wherein the camera and projector are calibrated simultaneously by:
using the calibration plate adopted for camera calibration, fixing the position and the angle of the calibration plate, executing the projector calibration once, obtaining the rotation matrices of the camera and the projector, Rc and Rp respectively, and the translation vectors of the camera and the projector, Tc and Tp respectively, and calculating the position of the projector in the camera coordinate system
(the formula is given as an image in the original publication)
5. An intelligent, special-shaped projection, non-contact interaction system according to claim 3, wherein the checkerboard is projected by the projector on a calibration board used for camera calibration; acquiring the physical three-dimensional coordinates of the corner points in the projected checkerboard through a camera, and sending the coordinates to the projector for internal reference calibration, wherein the method comprises the following steps:
step A1, projecting a white image on a calibration board in the field of view of a camera by a projector, and when the checkerboard corner points on the calibration board can be completely and correctly detected, acquiring an image B by the camera, wherein the image B comprises the checkerboard when the camera is calibrated, detecting the corner points in the image B, and calibrating to obtain the external parameters of the camera;
step A2, keeping the angle and position of the calibration board unchanged, projecting a checkerboard image on the calibration board in the field of view of the camera by the projector, acquiring an image I by the camera, and obtaining the foreground image F = I - B, which contains only the projected checkerboard;
step A3, when the checkerboard corner points in the image F can be detected completely and correctly, obtaining the physical three-dimensional coordinates of each corner point by using the internal reference and the external reference of the camera, and storing the physical three-dimensional coordinates into a corner point physical position information set;
and A4, changing the angle and/or position of the calibration board, repeating the steps A1-A3, reaching the set times, and calibrating the internal parameters of the projector by using the corner physical three-dimensional coordinates stored in the corner physical position information set and the image information corresponding to the corners in the projection image.
6. The intelligent irregular projection non-contact interaction system as claimed in claim 1, wherein the method for making the interaction area map corresponding to the video to be projected by the projector and the irregular surface is as follows:
generating an interaction area image for each projector, wherein the pixel values are all 0;
setting the position of each virtual camera according to the position of each projector in the camera coordinate system; each camera shoots an image of a special-shaped surface in a view field of the camera, and a corresponding position on the special-shaped surface in the image, which needs to be projected by a projector, is converted into a virtual view field of the projector for rendering;
the scene images to be rendered in the virtual camera comprise the playback image and the interaction area map; the playback images are played on the special-shaped surface, and in an area where several projectors overlap only one projector is selected to render the content while the other projectors project black in that area; the interaction area map is used for interaction initialization, the interaction areas in it are rendered white, and the pixel values of the remaining areas stay at 0.
7. The intelligent irregular projection non-contact interaction system as claimed in claim 1, wherein the method of detecting whether the acquired image of the interaction area has touch spots and selecting a corresponding projection image according to the touch spots to send to a projector for projection playing comprises the following steps:
the camera monitors the image of the special-shaped surface within its field of view in real time; each frame is grayed and binarized and the maximum connected domain is found, and if the maximum connected domain is larger than a set threshold, it is judged that a spot produced by the user's interaction falls within that connected domain;
according to the position of the maximum connected domain, the pixel value AreaIndex at that position in the interaction index map is read; if AreaIndex equals n·Max/N for some interaction area n, AreaIndex is stored in order as the index of the current frame in the interaction area index queue;
and the interaction area index occurring most frequently in the current index queue is counted, and that index number is sent to the corresponding projector in the projection module for projection playing of the interactive image.
8. The intelligent irregular projection non-contact interaction system as claimed in claim 7, wherein the length of the interaction area index queue is set according to a human-computer interaction interface window.
9. An intelligent irregular projection non-contact interaction method, based on the intelligent irregular projection non-contact interaction system of any one of claims 1-8, comprising the following steps:
step S1, initializing an interaction area index map;
step S2, reading the image of the interactive area acquired by the camera in real time;
step S3, detecting whether the image of the collected interaction area has a touch spot, if not, continuing to play the current video, and if so, executing step S4;
step S4, selecting a corresponding projection image to be sent to the projector for projection and playback according to the detected touch spot.
CN201711460835.5A 2017-12-28 2017-12-28 Intelligent special-shaped projection non-contact interaction system and method Active CN108269288B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711460835.5A CN108269288B (en) 2017-12-28 2017-12-28 Intelligent special-shaped projection non-contact interaction system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711460835.5A CN108269288B (en) 2017-12-28 2017-12-28 Intelligent special-shaped projection non-contact interaction system and method

Publications (2)

Publication Number Publication Date
CN108269288A CN108269288A (en) 2018-07-10
CN108269288B (en) 2020-07-24

Family

ID=62772663

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711460835.5A Active CN108269288B (en) 2017-12-28 2017-12-28 Intelligent special-shaped projection non-contact interaction system and method

Country Status (1)

Country Link
CN (1) CN108269288B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109062484B (en) * 2018-07-30 2022-08-02 安徽慧视金瞳科技有限公司 Manual mask image acquisition method for interactive classroom teaching system
CN111093066A (en) * 2019-12-03 2020-05-01 耀灵人工智能(浙江)有限公司 Dynamic plane projection method and system
CN110933391B (en) * 2019-12-20 2021-11-09 成都极米科技股份有限公司 Calibration parameter compensation method and device for projection system and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101321303A (en) * 2008-07-17 2008-12-10 上海交通大学 Geometric and optical correction method for non-plane multi-projection display
CN101561622A (en) * 2008-04-17 2009-10-21 上海仕联文化传播有限公司 Mirror surface image interactive system
CN103500438A (en) * 2013-10-21 2014-01-08 北京理工大学 Interactive building surface projection method
CN103729888A (en) * 2013-12-31 2014-04-16 成都有尔科技有限公司 3D projecting device convenient to debug and debugging method thereof
CN105278662A (en) * 2014-07-14 2016-01-27 腾讯科技(深圳)有限公司 Interactive control method and apparatus for electronic whiteboard system and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101818024B1 (en) * 2011-03-29 2018-01-12 퀄컴 인코포레이티드 System for the rendering of shared digital interfaces relative to each user's point of view

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101561622A (en) * 2008-04-17 2009-10-21 上海仕联文化传播有限公司 Mirror surface image interactive system
CN101321303A (en) * 2008-07-17 2008-12-10 上海交通大学 Geometric and optical correction method for non-plane multi-projection display
CN103500438A (en) * 2013-10-21 2014-01-08 北京理工大学 Interactive building surface projection method
CN103729888A (en) * 2013-12-31 2014-04-16 成都有尔科技有限公司 3D projecting device convenient to debug and debugging method thereof
CN105278662A (en) * 2014-07-14 2016-01-27 腾讯科技(深圳)有限公司 Interactive control method and apparatus for electronic whiteboard system and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A flexible new technique for camera calibration; Zhengyou Zhang; IEEE Transactions on Pattern Analysis and Machine Intelligence; 2000-11-30; Vol. 22, No. 11; pp. 1330-1334 *
Research on an intelligent projection system based on an adaptive projected-image correction method; Zhu Bo; China Doctoral Dissertations Full-text Database; 2014-06-15; main text, pp. 34-48 *

Also Published As

Publication number Publication date
CN108269288A (en) 2018-07-10

Similar Documents

Publication Publication Date Title
EP3614340B1 (en) Methods and devices for acquiring 3d face, and computer readable storage media
CN114245905A (en) Depth aware photo editing
US10701334B2 (en) Virtual reality parallax correction
US8947422B2 (en) Gradient modeling toolkit for sculpting stereoscopic depth models for converting 2-D images into stereoscopic 3-D images
US20070291035A1 (en) Horizontal Perspective Representation
US20140168367A1 (en) Calibrating visual sensors using homography operators
CN104349155B (en) Method and equipment for displaying simulated three-dimensional image
CN108269288B (en) Intelligent special-shaped projection non-contact interaction system and method
US20110273731A1 (en) Printer with attention based image customization
US10893259B2 (en) Apparatus and method for generating a tiled three-dimensional image representation of a scene
US11477432B2 (en) Information processing apparatus, information processing method and storage medium
CN105611267B (en) Merging of real world and virtual world images based on depth and chrominance information
CN109688343A (en) The implementation method and device of augmented reality studio
CN109906600A (en) Simulate the depth of field
US20230328400A1 (en) Auxiliary focusing method, apparatus, and system
CN111866523B (en) Panoramic video synthesis method and device, electronic equipment and computer storage medium
CN115439616A (en) Heterogeneous object characterization method based on multi-object image alpha superposition
JP4554231B2 (en) Distortion parameter generation method, video generation method, distortion parameter generation apparatus, and video generation apparatus
JP7387029B2 (en) Single-image 3D photography technology using soft layering and depth-aware inpainting
CN114860184A (en) Processing device, system and method for blackboard writing display
CN114332356A (en) Virtual and real picture combining method and device
US10902669B2 (en) Method for estimating light for augmented reality and electronic device thereof
JP2002135807A (en) Method and device for calibration for three-dimensional entry
KR20190056694A (en) Virtual exhibition space providing method using 2.5 dimensional object tagged with digital drawing element
CN117152244A (en) Inter-screen relationship determination method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant