WO2022121686A1 - Projection fusion method, projection fusion system and computer-readable storage medium - Google Patents

Projection fusion method, projection fusion system and computer-readable storage medium

Info

Publication number
WO2022121686A1
WO2022121686A1 (PCT/CN2021/132870)
Authority
WO
WIPO (PCT)
Prior art keywords
projection
test pattern
pattern
components
positional relationship
Prior art date
Application number
PCT/CN2021/132870
Other languages
English (en)
French (fr)
Inventor
余新
吴超
赵鹏
李屹
Original Assignee
深圳光峰科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳光峰科技股份有限公司 filed Critical 深圳光峰科技股份有限公司
Publication of WO2022121686A1 publication Critical patent/WO2022121686A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4038Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Definitions

  • the present application relates to the field of projection technology, and in particular, to a projection fusion method, a projection fusion system, and a computer-readable storage medium.
  • the splicing and fusion of multiple projectors merges the images of multiple projectors so as to achieve a larger display area or to achieve projection on a complex curved surface.
  • the inventors of the present application found in the long-term research and development process that current 3D mapping and projector edge fusion both rely on the calibration function provided by the fusion system: during on-site installation, the projector model parameters, the projectors' multi-point geometric correction parameters and the edge fusion parameters are adjusted manually, so automatic fusion of multiple projectors has not been realized.
  • the main technical problem solved by the present application is to provide a projection fusion method, a projection fusion system and a computer-readable storage medium, so as to realize automatic splicing and fusion of multiple projection components, and realize projection fusion and 3D mapping calibration without human intervention.
  • the projection fusion method is used in a projection fusion system, the projection fusion system includes a plurality of projection components, and the projection fusion method includes: configuring the information of the plurality of projection components; controlling each projection component to project a test pattern; when each projection component projects the test pattern, controlling the plurality of projection components to respectively collect the test pattern to obtain a plurality of sampling patterns, and using the test pattern and the sampling patterns to obtain the relative positional relationship between the projection component that projects the test pattern and the other projection components; and splitting the target projection pattern by using the relative positional relationship, and splicing and fusing it through the plurality of projection components.
  • the above-mentioned target projection pattern includes a plurality of sub-patterns, and the above-mentioned step of splitting the target projection pattern by using the relative positional relationship and splicing and fusing it through the plurality of projection components includes: using the relative positional relationship to generate the absolute positional relationship of each projection component in the projection fusion system; using the absolute positional relationship of each projection component to calculate the splitting and distortion mapping of the target projection pattern; and determining the sub-pattern that each projection component needs to project, and controlling each projection component to synchronously project the corresponding sub-pattern.
  • the above-mentioned plurality of projection components includes a first projection component and a second projection component; the first projection component is used to project the test pattern, and the second projection component is another projection component in the projection fusion system; the first projection component includes a first projection lens and a first camera, and the second projection component includes a second projection lens and a second camera.
  • the above-mentioned step of using the test pattern and the sampling pattern to obtain the relative positional relationship between the projection assembly that projects the test pattern and the other projection assemblies includes: acquiring the position information of the feature points of the test pattern; obtaining a first external parameter of the first camera by using the position information; and obtaining the relative positional relationship between the first projection component and the second projection component by using the first external parameter and the position information.
  • the location information includes coordinate information
  • the first external parameter includes spatial coordinate information
  • the above step of acquiring the position information of the feature points of the test pattern includes: acquiring first coordinate information of the feature points of the test pattern in the test pattern; and acquiring second coordinate information of the feature points in the sampling pattern collected by the first camera.
  • the above-mentioned step of obtaining the first external parameter of the first camera by using the position information includes: obtaining the relative extrinsic parameters of the first camera with respect to the first projection lens; converting the first coordinate information to the world coordinate system to obtain a first direction vector of the feature point, and converting the second coordinate information to the world coordinate system to obtain a second direction vector of the feature point; and calculating the first direction vector, the second direction vector and the relative extrinsic parameters by the triangulation ranging method to obtain the spatial position information of the feature point in the world coordinate system.
  • the above step of acquiring the position information of the feature points of the test pattern further includes: acquiring third coordinate information of the feature points in the collection pattern collected by the second camera; and the above-mentioned step of obtaining the relative positional relationship between the first projection component and the second projection component by using the first external parameter and the position information includes: converting the third coordinate information to the world coordinate system to obtain a third direction vector of the feature point; and calculating the third direction vector and the spatial position information by the triangulation ranging method to obtain the relative positional relationship of the feature point with respect to the first projection lens in the world coordinate system.
  • before the above step of acquiring the position information of the feature points of the test pattern, the above-mentioned projection fusion method further includes: in response to the sampling pattern including the image information of the test pattern, executing the step of acquiring the position information of the feature points of the test pattern.
  • before the above step of splitting the target projection pattern by using the relative positional relationship, the above-mentioned projection fusion method further includes: accumulating a first number of projection components that have projected the test pattern; in response to the first number being greater than or equal to the total number of the plurality of projection components, executing the above-mentioned step of splicing and fusing the target projection pattern by using the relative positional relationship; and in response to the first number being less than the total number of the plurality of projection components, executing the step of sequentially controlling each projection component to project the test pattern.
  • the above-mentioned step of controlling the plurality of projection components to respectively collect the test pattern to obtain a plurality of sampling patterns, and using the test pattern and the sampling patterns to obtain the relative positional relationship between the projection component that projects the test pattern and the other projection components, includes cyclically executing the following sub-steps: controlling one projection component to collect the test pattern to obtain one sampling pattern; and using the test pattern and the one sampling pattern to obtain the relative positional relationship between the projection component that projects the test pattern and the other projection component that collects the sampling pattern. The step further includes: accumulating a second number of projection components that have collected sampling patterns; in response to the second number being less than the total number of the plurality of projection components, executing the above sub-step of controlling one projection component to collect the test pattern to obtain one sampling pattern; and in response to the second number being greater than or equal to the total number of the plurality of projection components, executing the above-mentioned step of sequentially controlling each projection component to project the test pattern.
  • the projection fusion system includes: a plurality of projection components; and a processor respectively connected with the plurality of projection components, the processor being used for configuring the information of the plurality of projection components; the processor is used for controlling each projection component to project a test pattern and, when each projection component projects the test pattern, controlling the plurality of projection components to respectively collect the test pattern to obtain a plurality of sampling patterns, and using the test pattern and the sampling patterns to obtain the relative positional relationship between the projection component that projects the test pattern and the other projection components; the processor is further configured to split the target projection pattern by using the relative positional relationship, and to splice and fuse it through the plurality of projection components.
  • a technical solution adopted in the present application is to provide a computer-readable storage medium.
  • Program instructions are stored on the computer-readable storage medium, and when the program instructions are executed, the above-mentioned projection fusion method is implemented.
  • the projection fusion system controls, through a processor, a plurality of projection assemblies to respectively project a test pattern and, when each projection assembly projects the test pattern, controls the plurality of projection assemblies to respectively collect the projected test pattern to obtain multiple sampling patterns.
  • the relative positional relationship between the projection assembly that projects the test pattern and the other projection assemblies can be calculated by using the test pattern and the sampling patterns, and the relative positional relationship between any two of the projection assemblies can be obtained by a similar method.
  • using these relative positional relationships, automatic splicing and fusion of the multiple projection assemblies can be realized, the modeling and parameter extraction of the projection assemblies can be solved at the same time, and projection fusion and 3D mapping calibration without human intervention can be achieved.
  • FIG. 1 is a schematic structural diagram of an embodiment of a projection fusion system of the present application.
  • FIG. 2 is a schematic flowchart of an embodiment of a projection fusion method of the present application;
  • FIG. 3 is a schematic structural diagram of a projection lens and camera model in the projection fusion system of the present application;
  • FIG. 4 is a schematic diagram of polar coordinates in the world coordinate system;
  • Fig. 5 is a specific flowchart of step S202 in the projection fusion method of the embodiment of Fig. 2;
  • Fig. 6 is a specific flowchart of step S501 in the embodiment of Fig. 5;
  • Fig. 7 is a specific flowchart of step S502 in the embodiment of Fig. 5;
  • FIG. 8 is a schematic diagram of triangulation ranging in the projection fusion method of the present application.
  • Fig. 9 is a specific flow chart of step S503 in the embodiment of Fig. 5;
  • Fig. 10 is a specific flowchart of step S203 in the projection fusion method of the embodiment of Fig. 2;
  • FIG. 11 is a schematic flowchart of an embodiment of the projection fusion method of the present application.
  • FIG. 12 is a schematic structural diagram of an embodiment of a computer-readable storage medium of the present application.
  • the terms "first" and "second" in this application are used for descriptive purposes only, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • "a plurality of" means at least two, such as two, three, etc., unless otherwise expressly and specifically defined.
  • the terms “comprising” and “having” and any variations thereof are intended to cover non-exclusive inclusion.
  • a process, method, system, product or device comprising a series of steps or units is not limited to the listed steps or units, but optionally also includes steps or units that are not listed, or optionally also includes other steps or units inherent to these processes, methods, products or devices.
  • Multi-machine splicing and fusion is an important function in projection applications, especially in engineering applications. By merging the images of multiple projectors, a larger area of display can be realized, or projection on a complex curved surface can be realized.
  • at present, the splicing and fusion of projectors is completed by manually adjusting a fusion device.
  • during on-site installation, an experienced operator adjusts the configuration of each projector and the fusion band of the fusion device, which requires a great deal of work.
  • since projector images drift over time and with environmental changes, a multi-projector fusion system installed for a long time often exhibits fusion-band ghosting caused by drift, which degrades the viewing experience.
  • the current multi-projector implementation of fusion and 3D mapping requires modeling of the projection object and calibration of the model parameters of the projection system.
  • the projection fusion system and method proposed in the present application can simultaneously solve the modeling of the projection object and the complete parameter extraction of the projection system, and realize projector fusion and 3D mapping calibration without human intervention.
  • the present application first proposes a projection fusion system, as shown in FIG. 1 , which is a schematic structural diagram of an embodiment of the projection fusion system of the present application.
  • the projection fusion system 10 of this embodiment includes a plurality of projection assemblies 110 and a processor 120, wherein the processor 120 is respectively connected with the plurality of projection assemblies 110 and is used to configure the information of the plurality of projection assemblies 110; the processor 120 is configured to control each projection assembly 110 to project a test pattern and, when each projection assembly 110 projects the test pattern, to control the plurality of projection assemblies 110 to respectively collect the test pattern to obtain a plurality of sampling patterns; the processor 120 further uses the test pattern and the sampling patterns to obtain the relative positional relationship between the projection assembly 110 that projects the test pattern and the other projection assemblies 110, splits the target projection pattern using that relative positional relationship, and splices and fuses it through the multiple projection assemblies 110.
  • the projection fusion system 10 of this embodiment includes n (n is a natural number greater than or equal to 2) projection assemblies 110 (i.e., P1-Pn); the processor 120 communicates with each projection assembly 110 through the communication network 150 to obtain the information of the projection assemblies 110. Specifically, the processor 120 is configured to acquire the network access information, video capture information and optical parameter information of the projection assemblies 110, and to control and configure the projection assemblies 110. It should be noted that the way in which the processor 120 obtains the projection assembly information can be active or passive: active acquisition can establish the list and data set of the projection assemblies through user configuration, a preset configuration file, a cloud database query, etc.; passive acquisition receives parameter information that is automatically reported by the projection assemblies.
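  • As an illustration of the per-component record that such configuration might produce, the following Python sketch is hypothetical: the field names (e.g., throw_ratio, baseline_mm) and the load_components helper are assumptions for illustration, not structures defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class ProjectionComponentInfo:
    """Hypothetical record for one projection component (P1..Pn)."""
    component_id: int
    network_address: str            # network access information
    model: str                      # used to query optical parameters from a database
    throw_ratio: float              # optical parameter: projection ratio
    lens_height_mm: float           # height of the projection lens
    baseline_mm: float              # distance between projection lens and camera
    capture_resolution: tuple = (1920, 1080)  # video capture information

def load_components(configured: list[dict]) -> dict[int, ProjectionComponentInfo]:
    """Active acquisition: build the component list/data set from a user
    configuration or preset configuration file (cloud queries omitted)."""
    return {c["component_id"]: ProjectionComponentInfo(**c) for c in configured}
```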
  • the communication network 150 can be any type of communication network, as long as it ensures that the processor 120 can communicate bidirectionally with the projection assemblies 110, and that the projection assemblies 110 can communicate bidirectionally with one another, so as to realize cluster control and video synchronization of the projection assemblies 110.
  • the network 150 may be a cellular network (e.g., 4G, 5G), a wireless local area network (WiFi), wired Ethernet (LAN), the Internet, an HDMI video channel, a DP video channel, and the like.
  • the processor 120 may also be referred to as a CPU (Central Processing Unit, central processing unit).
  • the processor 120 may be an integrated circuit chip with signal processing capability.
  • the processor 120 may also be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
  • the general purpose processor may be a microprocessor, or the processor 120 may be any conventional processor or the like.
  • the processor 120 may be an independent processor, or may be a processor integrated in the projection assembly 110 .
  • a plurality of projection components 110 can form a master-slave relationship topology, and can also form a peer-to-peer (P2P) relationship topology.
  • a certain projection component 110 becomes a master node through a competition mechanism or configuration, and other projection components 110 are slave nodes, receiving instructions from the master node to project and collect test patterns.
  • when the P2P topology is adopted, each node publishes a test-pattern projection message on the communication network 150 in a bus-competition manner, and the other projection components 110 collect the test patterns and use them for positioning; in this mode, each projection component 110 computes and stores the relative position information of all nodes.
  • the projection fusion system 10 of the present embodiment controls, through the processor 120, the plurality of projection components 110 to respectively project a test pattern and, when each projection component 110 projects the test pattern, controls the plurality of projection components 110 to respectively collect the projected test pattern to obtain a plurality of sampling patterns.
  • the relative positional relationship between the projection component 110 that projects the test pattern and the other projection components 110 can be calculated by using the test pattern and the sampling patterns, and a similar method can be used to obtain the relative positional relationship between any two of the plurality of projection components 110.
  • using these relative positional relationships, automatic splicing and fusion of the multiple projection components 110 can be realized, the modeling and parameter extraction of the projection components 110 can be solved at the same time, and projection fusion and 3D mapping calibration without human intervention can be achieved.
  • the processor 120 in this embodiment sequentially controls the plurality of projection assemblies 110 to project the test patterns, that is, only one projection assembly 110 projects a test pattern at a time, which reduces the interference between test patterns and reduces the computational complexity of the processor 120.
  • the processor 120 can simultaneously control two or more projection assemblies 110 to project the test pattern respectively, which can reduce the time for the projection fusion system 10 to project the test pattern.
  • for example, the processor 120 controls the two projection assemblies 110 farthest apart, such as the leftmost and rightmost projection assemblies 110, to project test patterns respectively, which can not only reduce the interference between the test patterns but also reduce the time for the projection fusion system 10 to project the test patterns.
  • each projection assembly 110 includes a projection lens 130 and a camera 140, wherein the projection lens 130 is connected to the processor 120 and is used for projecting the coded test pattern under the control of the processor 120, and the camera 140 is connected to the processor 120 and is used for collecting the coded test pattern under the control of the processor 120 to obtain the sampling pattern.
  • other devices may also be used to replace the projection lens, and other image acquisition devices may also be used to replace the camera.
  • the relative positional relationship between the camera 140 and the projection lens 130 in each projection assembly 110 is fixed and known, while the positional relationship between the projection assembly 110 and other projection assemblies 110 is unknown.
  • the processor 120 first obtains the model of the projection assembly 110, and then obtains the relative positional relationship between the projection lens 130 and the camera 140 of the projection assembly 110 from the database corresponding to that model; if the models of all the projection assemblies 110 are the same, it is sufficient to obtain the relative positional relationship between the projection lens 130 and the camera 140 on one projection assembly 110.
  • the processor 120 controls each projection lens 130 to project the test pattern, and controls each camera 140 to sample the test pattern to obtain the sample pattern.
  • the test patterns projected by the projection lenses 130 differ from one another, so that the projection assembly 110 projecting a given test pattern can be distinguished; that is, different projection lenses project test patterns with specific codes.
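  • The patent does not fix a particular coding scheme for the test patterns. One way to make each pattern uniquely attributable and robust to perspective distortion is a grid of fiducial markers; the sketch below uses OpenCV's ArUco module (an assumed implementation choice; generateImageMarker requires OpenCV 4.7+) with a disjoint marker-ID range per projector.

```python
import cv2
import numpy as np

def make_coded_test_pattern(projector_index, size=(1920, 1080), grid=(6, 4)):
    """Render a test pattern whose markers encode the projecting component:
    each projector draws markers from its own ID range, so a captured image
    can be attributed to the component that projected it."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_250)
    w, h = size
    cols, rows = grid
    pattern = np.full((h, w), 255, dtype=np.uint8)  # white background
    marker_px = min(w // (2 * cols), h // (2 * rows))
    for r in range(rows):
        for c in range(cols):
            marker_id = projector_index * cols * rows + r * cols + c
            marker = cv2.aruco.generateImageMarker(dictionary, marker_id, marker_px)
            y = r * (h // rows) + (h // rows - marker_px) // 2
            x = c * (w // cols) + (w // cols - marker_px) // 2
            pattern[y:y + marker_px, x:x + marker_px] = marker
    return pattern
```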
  • the projection fusion system 10 of this embodiment is also used to implement the following projection fusion method.
  • the present application further proposes a projection fusion method, as shown in FIG. 2, which is a schematic flowchart of an embodiment of the projection fusion method of the present application.
  • the projection fusion method of this embodiment can be used in the above-mentioned projection fusion system 10, and the projection fusion method of this embodiment includes the following steps:
  • Step S201 configure the information of the plurality of projection components 110 .
  • the processor 120 verifies and logs in to each projection assembly 110 using its network information, and configures basic information such as optical parameter collection and database queries.
  • the processor 120 must first obtain the model of each projection assembly 110, and collect and query the optical parameters of each projection assembly 110 according to that model, such as the throw ratio, the height of the projection lens 130, the distance between the projection lens 130 and the camera 140, and other necessary information.
  • the processor 120 loads the information of all the projection components 110 in the projection fusion system 10, namely of the n projection components 110; the information includes the network access information of the projection components 110 as well as their image acquisition information and optical parameter information.
  • the methods of acquiring the information of each projection component 110 can be divided into two types: active acquisition and passive acquisition.
  • Active acquisition obtains the information automatically, for example over the Internet once the system is connected; it can create the list and data set of the projection assemblies 110 through user configuration, preset configuration files, and cloud database queries.
  • Passive acquisition obtains the information once the processor is connected to the projection assembly 110: the projection assembly 110 actively packages and reports its information, or the information is obtained by broadcast in the network.
  • Step S202 Control each projection component 110 to project the test pattern.
  • after the processor 120 loads the information of all the projection assemblies 110, it sends instructions through the communication protocol with the projection assemblies 110 to control the projection lens 130 of each projection assembly 110 to project its specifically coded test pattern in turn, so as to project the test pattern onto a target surface 160, which may be a curved or flat surface of any topography.
  • the processor 120 needs to ensure that only one projection component 110 is controlled to project the test pattern at the same time, so as to reduce the interference between the test patterns and reduce the computational complexity of the processor 120 .
  • the processor can simultaneously control two or more projection components to project the test pattern respectively, which can reduce the time for the projection fusion system to project the test pattern.
  • Step S203 When each projection component 110 projects the test pattern, control the plurality of projection components 110 to respectively collect the test pattern to obtain a plurality of sampling patterns, and use the test pattern and the sampling patterns to obtain the relative positional relationship between the projection component 110 that projects the test pattern and the other projection components 110.
  • the processor 120 needs to ensure that only one projection assembly 110 is controlled to project the test pattern at the same time.
  • while one projection assembly 110 projects the test pattern, the processor 120 controls the cameras 140 of the n projection assemblies 110 to collect the test pattern, obtaining n sampling patterns; using the one test pattern and the n sampling patterns, the processor 120 can obtain the relative positional relationship between the projection assembly 110 that projects the test pattern and the other (n-1) projection assemblies 110.
  • if a camera cannot capture the coded information of the projection assembly 110 that projects the test pattern, the processor cannot decode the position of the projecting assembly from that sample; in that case, the projection assembly 110 corresponding to the camera used for collecting the test pattern does not participate in the decoded position calculation.
  • the models of the projection lens and the camera are introduced below, so as to then introduce the specific implementation of step S202.
  • FIG. 3 is a schematic structural diagram of a projection lens and a camera model in the projection fusion system of the present application; and FIG. 4 is a schematic diagram of polar coordinates in the world coordinate system.
  • the projection lens 130 and the camera 140 in this embodiment can be represented by the model shown in FIG. 3, wherein:
  • 301 is the imaging plane of the camera 140 or the image plane of the projection lens 130; 302 is the equivalent optical center; 303 is the equivalent main optical axis; and 304-1, 304-2, 304-3 and 304-4 are the boundaries of the image acquisition area (or projection imaging area). The image acquisition space of the camera 140 or the projection range of the projection lens 130 is equivalent to a cone with the optical center 302 as its apex, bounded by 304-1 to 304-4. A spherical coordinate system is defined with the equivalent optical center 302 as the origin and the equivalent main optical axis 303 as the principal axis; direction 305 is the upward direction and direction 306 is the positive direction of horizontal rotation. 305, 303 and 306 are mutually perpendicular, 305 is perpendicular to the broad side of the imaging plane, and 303, 305 and 306 form a right-handed coordinate system.
  • any point in the image acquisition area or the projection imaging area can be represented as (θ, φ, ρ), where θ is the azimuth angle of the vector from the optical center to that point in the spherical coordinate system, φ is the inclination angle of that vector, and ρ is the length of the vector from the optical center to that point.
  • any point 307 on the image plane, with normalized coordinates (x, y) in the image coordinate system, can be uniquely mapped to an azimuth and inclination (θi, φi) in the spherical coordinate system; the mapping relationship can be expressed as (θi, φi) = T(D(x, y)), where:
  • D is the distortion transformation, which is determined by the characteristics of the optical system of the camera 140
  • T is the perspective mapping transformation, and these two transformations are determined by the internal parameters of the camera 140 or the projection lens 130 .
  • the image plane is generally a pixelated imaging device (such as a photosensitive chip such as CCD, CMOS, or a light modulation device such as DMD, LCoS, and LCD).
  • the relationship between the normalized coordinates (x, y) and the pixels is determined by the resolution of the imaging device and the pixel index, where:
  • [I, J] is the resolution of the imaging device
  • [i, j] is the index number of the pixel, indicating the pixel in the jth row and the ith column.
  • the perspective mapping transformation T(x, y) maps normalized image coordinates to the azimuth and inclination of the corresponding ray; its closed form is determined by the internal parameters of the camera 140 or the projection lens 130.
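  • Since this extract does not reproduce the patent's formulas for the pixel normalization or for T and D, the sketch below only shows the shape of the mapping chain under common assumptions: a centered pixel normalization, an identity distortion D, and a pinhole perspective mapping T.

```python
import numpy as np

def pixel_to_normalized(i, j, I, J):
    """Map pixel index [i, j] on an I x J imaging device to normalized image
    coordinates (x, y); a centered, resolution-normalized convention is
    assumed here, not taken from the patent."""
    x = (i - (I - 1) / 2.0) / I
    y = (j - (J - 1) / 2.0) / J
    return x, y

def normalized_to_direction(x, y, focal_norm, distortion=lambda p: p):
    """Compose D (distortion; identity by default) with a pinhole perspective
    mapping T to get the azimuth/inclination (theta, phi) of the ray through
    (x, y), expressed in the lens's spherical coordinate system.
    focal_norm is the focal length in the same normalized units."""
    xd, yd = distortion((x, y))
    theta = np.arctan2(xd, focal_norm)              # azimuth
    phi = np.arctan2(yd, np.hypot(xd, focal_norm))  # inclination
    return theta, phi
```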
  • the pose of the projection component 110 in the global coordinate system is determined by six parameters: a position vector and a direction vector. The position vector determines the position of the optical center of the projection component 110 in the global coordinate system, and the direction vector determines the orientation of the camera 140 or the projection lens 130 and the rotation about the main optical axis.
  • Figure 4 shows a schematic diagram of polar coordinates under the world coordinate system, wherein the North direction is the y direction of the Cartesian coordinate system under the world coordinate system, and the Up direction is the z direction of the Cartesian coordinate system.
  • the transformation of the Cartesian coordinate system and the polar coordinate system is the basic coordinate system transformation, which is described in any book on three-dimensional analytic geometry, and will not be expanded here.
  • any point in space can thus be uniquely mapped to the normalized coordinates of the camera 140: the point is first transformed from the world coordinate system into the local coordinate system of the camera 140, and then mapped through the transformations above.
  • Equations (4) and (6) define the spatial mapping relationship from any point in space to the pixels of the camera 140, and Equations (3) and (4) define the mapping relationship between the pixels of the camera 140 and the spatial direction.
  • any point on the imaging chip corresponds to a ray in a determined direction from the optical center of the projection assembly 110, and the imaging position of a projected pixel is the intersection point of that ray with the first surface it meets.
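  • A minimal sketch of the world-to-camera step, assuming the usual rigid transform (equations (3)-(6) themselves are not reproduced in this extract):

```python
import numpy as np

def world_point_to_camera_ray(p_world, cam_pos, R_cam):
    """Transform a world-space point into a camera's local frame and return
    the unit ray direction from the optical center toward the point.
    cam_pos is the optical-center position; R_cam is the 3x3 world-to-camera
    rotation built from the component's direction vector."""
    p_local = R_cam @ (np.asarray(p_world, float) - np.asarray(cam_pos, float))
    return p_local / np.linalg.norm(p_local)
```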
  • n sampling patterns are acquired when one test pattern is projected.
  • the method shown in FIG. 5 can be used to obtain the relative positional relationship between the projection component 110 that projects the test pattern and the other (n-1) projection components 110.
  • this embodiment divides the n projection components 110 in the projection fusion system 10 into a first projection component and (n-1) second projection components, wherein the first projection component is the projection component 110 that projects the test pattern and the second projection components are the other projection components 110 in the projection fusion system 10; the projection lens 130 in the first projection assembly is referred to as the first projection lens, and the camera 140 in the first projection assembly is referred to as the first camera.
  • the projection lens 130 in the second projection assembly is referred to as the second projection lens, and the camera 140 in the second projection assembly is referred to as the second camera.
  • as the projection component 110 that projects the test pattern changes, the projection component 110 specifically referred to as the first projection component also changes, and so do the projection components 110 referred to as the second projection components.
  • the first projection component is the i-th projection component 110 in the projection fusion system 10 and the second projection component is the j-th projection component 110 in the projection fusion system 10 as an example for description.
  • the method of this embodiment specifically includes steps S501 to S504.
  • Step S501 Obtain position information of feature points of the test pattern.
  • the location information in this embodiment includes coordinate information
  • step S501 may be implemented by the method shown in FIG. 6 .
  • the method of this embodiment specifically includes step S601 and step S602.
  • Step S601 Obtain first coordinate information of the feature points of the test pattern in the test pattern.
  • the test pattern has a special distribution of feature points, and the feature description operator of the feature points is invariant to the geometric and perspective transformations of the image.
  • the relative positional relationship between the projection component 110 that projects the test pattern and the projection components 110 that collect the sampling patterns can therefore be acquired from the coordinate information of these special feature points on the test pattern together with their coordinate information on the sampling pattern.
  • the processor 120 obtains, from the storage medium, the content of the test pattern projected by the first projection lens and the feature operator of each feature point, and thereby obtains the first coordinate information of each feature point in the test pattern.
  • Step S602 Acquire second coordinate information of the feature point in the sampling pattern collected by the first camera.
  • the processor 120 extracts the second coordinate information of the feature points from the sampling pattern collected by the first camera of the first projection component.
  • the processor 120 also acquires the third coordinate information of the feature points in the collection pattern collected by the second camera.
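  • The patent leaves the feature description operator unspecified. As one example of a descriptor that is largely invariant to geometric and perspective transformations, the sketch below matches SIFT features between the known test pattern and a sampled image; the 0.75 ratio threshold is a conventional choice, not a value from the patent.

```python
import cv2

def match_feature_points(test_pattern, sampled_image):
    """Return matched (test_xy, sample_xy) coordinate pairs between the
    projected test pattern and an image captured by one of the cameras."""
    sift = cv2.SIFT_create()
    kp_t, des_t = sift.detectAndCompute(test_pattern, None)
    kp_s, des_s = sift.detectAndCompute(sampled_image, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des_t, des_s, k=2)
    return [(kp_t[m.queryIdx].pt, kp_s[m.trainIdx].pt)
            for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe ratio test
```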
  • Step S502 Obtain the first external parameter of the first camera by using the position information.
  • the first external parameter in this embodiment includes spatial coordinate information
  • step S502 may be implemented by the method shown in FIG. 7 .
  • the method of this embodiment specifically includes steps S701 to S703.
  • Step S701 Obtain a relative extrinsic parameter of the first camera to the first projection lens.
  • the processor 120 obtains the relative extrinsic parameters of the first camera with respect to the optical center of the first projection lens, which respectively represent the relative position and direction of the first camera in the coordinate system of the first projection lens.
  • Step S702 Convert the first coordinate information to the world coordinate system to obtain the first direction vector of the feature point, and convert the second coordinate information to the world coordinate system to obtain the second direction vector of the feature point.
  • the second direction vector corresponding to each feature point is obtained from the second coordinate information by applying the rotation transformation T⁻¹.
  • Step S703 Calculate the first direction vector and the second direction vector by using the triangulation method to obtain spatial position information of the feature point in the world coordinate system.
  • the processor 120 adopts the triangulation ranging method: the ray from the optical center of the first projection lens along the first direction vector and the ray from the optical center of the first camera along the second direction vector are intersected to find the spatial location coordinates of the feature point, where l0 and l1 are respectively the distances from the feature point to the first camera and to the first projection lens.
  • multiple feature points in the test pattern can be extracted, and the spatial position information of these feature points can be calculated.
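  • A minimal sketch of the two-ray triangulation (cf. FIG. 8): the two rays are intersected in the least-squares sense, yielding the feature point's position together with the distances l0 and l1. Function and variable names are illustrative.

```python
import numpy as np

def triangulate(origin0, dir0, origin1, dir1):
    """Least-squares intersection of the rays p = o0 + l0*d0 and
    p = o1 + l1*d1 (e.g., first projection lens and first camera).
    Returns the midpoint of closest approach and the distances l0, l1."""
    o0, o1 = np.asarray(origin0, float), np.asarray(origin1, float)
    d0 = np.asarray(dir0, float); d0 = d0 / np.linalg.norm(d0)
    d1 = np.asarray(dir1, float); d1 = d1 / np.linalg.norm(d1)
    A = np.stack([d0, -d1], axis=1)        # solve [d0 -d1][l0 l1]^T = o1 - o0
    (l0, l1), *_ = np.linalg.lstsq(A, o1 - o0, rcond=None)
    return (o0 + l0 * d0 + o1 + l1 * d1) / 2.0, l0, l1
```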
  • Step S503 Obtain the relative positional relationship between the first projection component and the second projection component by using the first external parameter and the position information.
  • step S503 may be implemented by the method shown in FIG. 9 .
  • the method of this embodiment specifically includes step S901 and step S902.
  • Step S901 Convert the third coordinate information to the world coordinate system to obtain the third direction vector of the feature point.
  • the processor 120 obtains the third coordinate information of each feature point in the collection pattern collected by the second camera, and converts it into the world coordinate system to obtain the third direction vector.
  • Step S902 using the triangulation ranging method to calculate the third direction vector and the spatial position information, so as to obtain the relative positional relationship of the feature point relative to the first projection lens in the world coordinate system.
  • the processor 120 uses the triangulation ranging method to generate the triangulation ranging equation for the k-th feature point.
  • Two or more feature points can be used to generate a triangular ranging equation system.
  • the relative positional relationship between the second camera in the second projection assembly and the first projection lens in the first projection assembly can then be solved; that is, the relative positional relationship of the camera in the j-th projection assembly 110 with respect to the projection lens in the i-th projection assembly 110 is obtained.
  • the processor 120 may obtain the relative positional relationship between the i-th projection component 110 and the other (n-1) projection components 110 through the above method.
  • the processor 120 can obtain the relative positional relationship between any two projection assemblies 110 among the n projection assemblies 110 .
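  • Solving the triangulation ranging equation system over two or more feature points amounts to estimating a camera pose from known 3D points and their 2D observations. The sketch below substitutes OpenCV's solvePnPRansac as one standard solver with the same inputs and outputs; the patent itself does not name this routine.

```python
import cv2
import numpy as np

def estimate_relative_pose(points_3d, points_2d, K, dist=None):
    """Estimate the second camera's pose relative to the first projection
    lens's frame, given feature points triangulated in that frame (points_3d)
    and their pixel observations in the second camera (points_2d).
    K is the second camera's 3x3 intrinsic matrix."""
    ok, rvec, tvec, _ = cv2.solvePnPRansac(
        np.asarray(points_3d, dtype=np.float64),
        np.asarray(points_2d, dtype=np.float64), K, dist)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)
    cam_pos = (-R.T @ tvec).ravel()  # camera position in the lens's frame
    return R, cam_pos
```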
  • before step S501, that is, before using the test pattern and the sampling pattern to obtain the relative positional relationship between the projection component that projects the test pattern and the other projection components, the processor 120 may further match the sampling pattern against the test pattern to determine whether the sampling pattern includes part or all of the image information of the test pattern; in response to the sampling pattern including the image information of the test pattern, step S501 is executed; in response to the sampling pattern not including the image information of the test pattern, the next projection assembly 110 continues to sample the test pattern to obtain the next sampling pattern.
  • Step S204 splitting the target projection pattern by using the relative positional relationship, and splicing and merging the target projection pattern through a plurality of projection components 110.
  • the target projection pattern includes a plurality of sub-patterns.
  • step S204 may be implemented by the method shown in FIG. 10 .
  • the method of this embodiment includes steps S1001 to S1003.
  • Step S1001 using the relative positional relationship to generate the absolute positional relationship of each projection component 110 in the projection fusion system 10 .
  • the processor 120 generates the absolute positional relationship of each projection component 110 in the projection fusion system 10 by using the relative positional relationship, and stores the absolute positional relationship in the fusion and mapping relationship table.
  • Step S1002 using the absolute positional relationship of each projection component 110 to calculate the split and distortion map of the target projection pattern.
  • the processor 120 can perform the functions of segmentation, mapping and fusion of the target projection pattern by using the fusion and mapping relation table.
  • Step S1003 Determine the sub-pattern to be projected by each projection component 110, and control each projection component 110 to project the corresponding sub-pattern synchronously.
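  • As a sketch of what producing one projector's sub-pattern could look like once the fusion-and-mapping relation table has been reduced to per-pixel warp maps and an edge-blend mask (how the maps are computed from the absolute positional relationships is system-specific and not shown here):

```python
import cv2
import numpy as np

def render_sub_pattern(target_image, map_x, map_y, blend_mask):
    """Warp the target projection pattern into one projector's frame and
    feather the overlap region so adjacent projections fuse seamlessly.
    map_x/map_y: float32 per-pixel lookup maps; blend_mask: float32 in [0, 1],
    ramping down across the fusion band. target_image is assumed 3-channel."""
    warped = cv2.remap(target_image, map_x, map_y, cv2.INTER_LINEAR)
    return (warped.astype(np.float32) * blend_mask[..., None]).astype(np.uint8)
```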
  • the projection fusion system 10 of the present embodiment controls, through the processor 120, the plurality of projection components 110 to respectively project a test pattern and, when each projection component 110 projects the test pattern, controls the plurality of projection components 110 to respectively collect the projected test pattern.
  • the relative positional relationship between the projection component 110 that projects the test pattern and the other projection components 110 can be calculated by using the test pattern and the sampling patterns, and a similar method can be used to obtain the relative positional relationship between any two of the plurality of projection components 110.
  • with these relative positional relationships, automatic splicing and fusion of the multiple projection assemblies 110 can be realized, the modeling and parameter extraction of the projection assemblies 110 can be solved simultaneously, and projection fusion and 3D mapping calibration without human intervention can be achieved.
  • the present application further proposes a projection fusion method of another embodiment.
  • the projection fusion method of this embodiment includes the following steps:
  • Step S1101 configure the information of the plurality of projection components 110 .
  • Step S1101 is similar to step S201 and will not be repeated here.
  • Step S1102 Control each projection component 110 to project the test pattern.
  • Step S1102 is similar to step S202 and will not be repeated here.
  • Step S1103 When each projection component 110 projects the test pattern, control a projection component 110 to collect the test pattern to obtain a sampling pattern.
  • Step S1104 Using the test pattern and a sampling pattern to obtain the relative positional relationship between the projection component 110 that projects the test pattern and the other projection components 110 that collect the sampling pattern.
  • when a certain projection component 110 projects a test pattern, the processor 120 sequentially controls the other projection components 110 to collect sampling patterns, and sequentially acquires the relative positional relationship between the projection component 110 that projects the test pattern and each projection component 110 that collects a sampling pattern; the computing power required of the processor 120 is therefore low.
  • Step S1105 Accumulate the second number of projection components 110 that have collected sampling patterns.
  • Step S1106 In response to the second quantity being less than the total number of the plurality of projection components 110, perform step S1103.
  • the processor 120 determines that the second number of projection assemblies 110 that have collected sampling patterns is less than the total number of projection assemblies 110, and considers that there are still projection assemblies 110 that have not collected sampling patterns, and the processor 120 controls the next projection assembly 110 to collect sampling patterns .
  • Step S1107 In response to the second quantity being greater than or equal to the total number of the plurality of projection components 110, perform step S1108.
  • the processor 120 determines that the second number of projection assemblies 110 that have collected sampling patterns is greater than or equal to the total amount of the plurality of projection assemblies 110, it is considered that all projection assemblies 110 have collected sampling patterns.
  • Step S1108 Accumulate the first number of the projection components 110 that have projected the test pattern.
  • Step S1109 In response to the first quantity being greater than or equal to the total amount of the plurality of projection assemblies 110, step S1111 is executed.
  • the processor 120 determines that the first number of projection assemblies 110 that have projected the test pattern is greater than or equal to the total number of projection assemblies 110 , it is considered that all the projection assemblies 110 have projected the test pattern, and the processor 120 executes step S1111 .
  • Step S1110 In response to the first quantity being less than the total amount of the plurality of projection assemblies 110, perform step S1102.
  • the processor 120 determines that the first number of projection assemblies 110 that have projected the test pattern is less than the total number of the plurality of projection assemblies 110; it is then considered that there are still projection assemblies 110 that have not projected the test pattern, and the processor 120 controls the next projection assembly 110 to project the test pattern.
  • Step S1111 splitting the target projection pattern using the relative positional relationship, and splicing and merging through multiple projection components 110 .
  • Step S1111 is similar to step S204 and will not be repeated here.
  • in other embodiments, several other projection components may also be controlled to collect the test pattern at the same time so as to obtain multiple sampling patterns, and/or calculations may be performed on multiple sampling patterns at the same time so as to obtain multiple relative positional relationships simultaneously, which can improve the processing efficiency of the processor 120.
  • the present embodiment further performs determination processing on the number of projection components 110 that have projected the test pattern and on the number of projection components 110 that have collected sampling patterns, so as to prevent some projection components 110 from failing to project the test pattern or to collect a sampling pattern, thereby avoiding the problem that the position information of some projection components 110 is not obtained.
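  • The control flow of steps S1101-S1111 can be summarized by the sketch below; the processor methods (configure, project_test_pattern, collect, ...) are hypothetical names standing in for the operations described above.

```python
def run_calibration(components, processor):
    """Every component projects in turn; for each projection, every component
    samples in turn. The counts guard against components being skipped."""
    processor.configure(components)                       # S1101
    first_count = 0
    for projector in components:                          # S1102
        processor.project_test_pattern(projector)
        second_count = 0
        for sampler in components:                        # S1103
            pattern = processor.collect(sampler)
            processor.update_relative_position(projector, sampler, pattern)  # S1104
            second_count += 1                             # S1105
        assert second_count >= len(components)            # S1106/S1107
        first_count += 1                                  # S1108
    assert first_count >= len(components)                 # S1109/S1110
    processor.split_and_fuse_target()                     # S1111
```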
  • the present application further proposes a computer-readable storage medium, as shown in FIG. 12 , which is a schematic structural diagram of an embodiment of the computer-readable storage medium of the present application.
  • the computer-readable storage medium 90 stores program instructions 91 thereon, and the program instructions 91 implement the above-mentioned projection fusion method when executed by a processor (not shown).
  • the computer-readable storage medium 90 in this embodiment may be, but is not limited to, a USB flash drive, an SD card, a PD optical drive, a removable hard disk, a large-capacity floppy drive, a flash memory, a multimedia memory card, a server, and the like.
  • the projection fusion method of the embodiments of the present application is used in a projection fusion system that includes a plurality of projection components, and includes: configuring the information of the plurality of projection components; controlling each projection component to project a test pattern; when each projection component projects the test pattern, controlling the plurality of projection components to respectively collect the test pattern to obtain a plurality of sampling patterns, and using the test pattern and the sampling patterns to obtain the relative positional relationship between the projection component that projects the test pattern and the other projection components; and splitting the target projection pattern by using the relative positional relationship and splicing and fusing it through the plurality of projection components.
  • the projection fusion system controls, through a processor, the plurality of projection components to respectively project a test pattern and, when each projection component projects the test pattern, controls the plurality of projection components to respectively collect the projected test pattern to obtain a plurality of sampling patterns; the relative positional relationship between the projection component that projects the test pattern and the other projection components can be calculated using the test pattern and the sampling patterns, and a similar method yields the relative positional relationship between any two of the plurality of projection components; these relative positional relationships enable automatic splicing and fusion of the multiple projection components, solve the modeling and parameter extraction of the projection components at the same time, and realize projection fusion and 3D mapping calibration without human intervention.
  • the projection assembly of the embodiment of the present application can calculate and generate a 3D point cloud of the projection target curved surface by controlling the content of the test pattern, such as using a traditional structured light method, and thereby generate a 3D geometric model of the projection target curved surface.
  • the projection component of the embodiments of the present application calculates the splitting and mapping transformation of the target image according to the complete projection component model of the embodiments of the present application and the three-dimensional geometric model of the projection target surface, generates the projection content for each projection component, and sends it through the communication channel to each projection component for display.
  • if the above functions are implemented in the form of software functions and sold or used as independent products, they can be stored in a storage medium readable by a mobile terminal; that is, the present application also provides a storage device storing program data, and the program data can be executed to implement the methods of the above embodiments. The storage device can be, for example, a USB flash drive, an optical disc, a server, or the like. That is to say, the present application can be embodied in the form of a software product, which includes several instructions to make an intelligent terminal execute all or part of the steps of the methods described in the various embodiments.
  • references to the terms "one embodiment," "some embodiments," "example," "specific example," or "some examples," etc., mean that the specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application.
  • schematic representations of the above terms are not necessarily directed to the same embodiment or example.
  • the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
  • those skilled in the art may combine the different embodiments or examples described in this specification, as well as the features of the different embodiments or examples, provided they do not conflict with each other.
  • "first" and "second" are used for descriptive purposes only and should not be construed as indicating or implying relative importance or the number of indicated technical features; thus, a feature delimited with "first" or "second" may expressly or implicitly include at least one such feature.
  • "plurality" means at least two, such as two, three, etc., unless expressly and specifically defined otherwise.
  • any description of a process or method in the flowcharts or otherwise described herein may be understood to represent a module, segment or portion of code comprising one or more executable instructions for implementing a specified logical function or step of the process, and the scope of the preferred embodiments of the present application includes alternative implementations in which the functions may be performed out of the order shown or discussed, including substantially concurrently or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.
  • a "computer-readable medium” can be any device that can contain, store, communicate, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or apparatus.
  • more specific examples (a non-exhaustive list) of computer readable media include the following: an electrical connection with one or more wires (an electronic device), a portable computer disk cartridge (a magnetic device), random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and portable compact disc read-only memory (CD-ROM).
  • the computer readable medium may even be paper or another suitable medium on which the program is printed, as the paper or other medium may be optically scanned, and then edited, interpreted, or otherwise processed as necessary to obtain the program electronically, which is then stored in a computer memory.

Abstract

The present application discloses a projection fusion method, a projection fusion system, and a computer-readable storage medium. The projection fusion method is used in a projection fusion system that includes a plurality of projection components, and includes: configuring the information of the plurality of projection components; controlling each projection component to project a test pattern; when each projection component projects the test pattern, controlling the plurality of projection components to respectively collect the test pattern to obtain a plurality of sampling patterns, and using the test pattern and the sampling patterns to obtain the relative positional relationship between the projection component that projects the test pattern and the other projection components; and splitting the target projection pattern by using the relative positional relationship, and splicing and fusing it through the plurality of projection components. In this way, automatic splicing and fusion of multiple projection components can be realized, and projection fusion and 3D mapping calibration without human intervention can be achieved.

Description

Projection fusion method, projection fusion system and computer-readable storage medium — Technical Field
The present application relates to the field of projection technology, and in particular to a projection fusion method, a projection fusion system and a computer-readable storage medium.
Background Art
The splicing and fusion of multiple projectors merges the images of multiple projectors so as to achieve a larger display area or to achieve projection on a complex curved surface.
The inventors of the present application found in the long-term research and development process that current 3D mapping and projector edge fusion both rely on the calibration function provided by the fusion system: during on-site installation, the projector model parameters, the projectors' multi-point geometric correction parameters and the edge fusion parameters in the fusion system are manually modified for adjustment, and automatic fusion of multiple projectors has not been realized.
Summary of the Invention
The main technical problem solved by the present application is to provide a projection fusion method, a projection fusion system and a computer-readable storage medium, so as to realize automatic splicing and fusion of multiple projection components and to realize projection fusion and 3D mapping calibration without human intervention.
为解决上述技术问题,本申请采用的一个技术方案是:提供一种投影融合方法。该投影融合方法用于投影融合系统,该投影融合系统包括多个投影组件,该投影融合方法包括:对多个投影组件的信息进行配置;控制每个投影组件投射测试图案;在每个投影组件投射测试图案时,控制多个投影组件分别对测试图案进行采集,以获得多个采样图案,并利用测试图案和采样图案获取投射测试图案的投影组件与其它投影组件之间的相对位置关系;利用相对位置关系对目标投影图案进行拆分,并通过多个投影组件拼接融合。
In a specific implementation, the target projection pattern includes a plurality of sub-patterns, and the step of splitting the target projection pattern by means of the relative positional relationships and stitching and fusing it through the plurality of projection components includes: generating, from the relative positional relationships, the absolute positional relationship of each projection component within the projection fusion system; computing the splitting and distortion mapping of the target projection pattern from the absolute positional relationship of each projection component; and determining the sub-pattern to be projected by each projection component and controlling each projection component to project its corresponding sub-pattern synchronously.
In a specific implementation, the plurality of projection components include a first projection component and second projection components; the first projection component projects the test pattern, and the second projection components are the other projection components in the projection fusion system. The first projection component includes a first projection lens and a first camera, and each second projection component includes a second projection lens and a second camera. The step of using the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection components includes: obtaining position information of feature points of the test pattern; obtaining first external parameters of the first camera from the position information; and obtaining the relative positional relationship between the first projection component and the second projection component from the first external parameters and the position information.
In a specific implementation, the position information includes coordinate information and the first external parameters include spatial coordinate information. The step of obtaining the position information of the feature points of the test pattern includes: obtaining first coordinate information of a feature point of the test pattern within the test pattern; and obtaining second coordinate information of the feature point within the sampled pattern captured by the first camera. The step of obtaining the first external parameters of the first camera from the position information includes: obtaining the relative extrinsic parameters of the first camera with respect to the first projection lens; converting the first coordinate information into the world coordinate system to obtain a first direction vector of the feature point, and converting the second coordinate information into the world coordinate system to obtain a second direction vector of the feature point; and computing, by triangulation, the first direction vector, the second direction vector, and the relative extrinsic parameters to obtain the spatial position information of the feature point in the world coordinate system.
In a specific implementation, the step of obtaining the position information of the feature points of the test pattern further includes: obtaining third coordinate information of the feature point within the captured pattern acquired by the second camera. The step of obtaining the relative positional relationship between the first projection component and the second projection component from the first external parameters and the position information includes: converting the third coordinate information into the world coordinate system to obtain a third direction vector of the feature point; and computing, by triangulation, the third direction vector and the spatial position information to obtain the relative positional relationship of the feature point with respect to the first projection lens in the world coordinate system.
In a specific implementation, before the step of obtaining the position information of the feature points of the test pattern, the projection fusion method further includes: in response to the sampled pattern containing image information of the test pattern, executing the step of obtaining the position information of the feature points of the test pattern.
In a specific implementation, before the step of splitting the target projection pattern by means of the relative positional relationships and stitching and fusing it through the plurality of projection components, the projection fusion method further includes: accumulating a first count of the projection components that have projected the test pattern; in response to the first count being greater than or equal to the total number of the plurality of projection components, executing the step of stitching and fusing the target projection pattern by means of the relative positional relationships; and in response to the first count being less than the total number of the plurality of projection components, executing the step of controlling each projection component in turn to project the test pattern.
In a specific implementation, the step of controlling the plurality of projection components to capture the test pattern separately so as to obtain a plurality of sampled patterns, and using the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection components, includes executing the following sub-steps in a loop: controlling one projection component to capture the test pattern so as to obtain one sampled pattern; and using the test pattern and that sampled pattern to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection component capturing the sampled pattern. The step further includes: accumulating a second count of the projection components that have captured a sampled pattern; in response to the second count being less than the total number of the plurality of projection components, executing the sub-step of controlling one projection component to capture the test pattern so as to obtain one sampled pattern; and in response to the second count being greater than or equal to the total number of the plurality of projection components, executing the step of controlling each projection component in turn to project the test pattern.
To solve the above technical problem, another technical solution adopted by the present application is to provide a projection fusion system. The projection fusion system includes: a plurality of projection components; and a processor connected to each of the plurality of projection components. The processor is configured to configure information of the plurality of projection components; to control each projection component to project a test pattern and, while each projection component projects the test pattern, to control the plurality of projection components to capture the test pattern separately so as to obtain a plurality of sampled patterns, and to use the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection components; and further to split a target projection pattern by means of the relative positional relationships and to stitch and fuse it through the plurality of projection components.
To solve the above technical problem, another technical solution adopted by the present application is to provide a computer-readable storage medium. The computer-readable storage medium stores program instructions which, when executed, implement the projection fusion method described above.
The beneficial effects of the present application are as follows. In contrast to the prior art, the projection fusion system of the embodiments of the present application uses a processor to control a plurality of projection components to project test patterns in turn and, while each projection component projects its test pattern, controls the plurality of projection components to capture the projected test pattern separately so as to obtain a plurality of sampled patterns. The test pattern and the sampled patterns can then be used to compute the relative positional relationship between the projection component projecting the test pattern and the other projection components; by applying the same method, the relative positional relationship between any two of the plurality of projection components can be obtained. With these relative positional relationships, automatic stitching and fusion of the plurality of projection components can be achieved, the modeling of the projection components and the extraction of their parameters can be solved simultaneously, and projection fusion and 3D-mapping calibration without human intervention can be realized.
Brief description of the drawings
To illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings required by the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a schematic structural diagram of an embodiment of the projection fusion system of the present application;
Fig. 2 is a schematic flowchart of an embodiment of the projection fusion method of the present application;
Fig. 3 is a schematic structural diagram of the projection lens and camera model in the projection fusion system of the present application;
Fig. 4 is a schematic diagram of polar coordinates in the world coordinate system;
Fig. 5 is a detailed flowchart of step S203 in the projection fusion method of the embodiment of Fig. 2;
Fig. 6 is a detailed flowchart of step S501 in the embodiment of Fig. 5;
Fig. 7 is a detailed flowchart of step S502 in the embodiment of Fig. 5;
Fig. 8 is a schematic diagram of triangulation in the projection fusion method of the present application;
Fig. 9 is a detailed flowchart of step S503 in the embodiment of Fig. 5;
Fig. 10 is a detailed flowchart of step S204 in the projection fusion method of the embodiment of Fig. 2;
Fig. 11 is a schematic flowchart of another embodiment of the projection fusion method of the present application;
Fig. 12 is a schematic structural diagram of an embodiment of the computer-readable storage medium of the present application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present application.
The terms "first" and "second" in the present application are used for descriptive purposes only and should not be construed as indicating or implying relative importance or the number of indicated technical features. In the description of the present application, "plurality" means at least two, for example two or three, unless expressly and specifically defined otherwise. Furthermore, the terms "including" and "having," and any variations thereof, are intended to cover non-exclusive inclusion: a process, method, system, product, or device that includes a series of steps or units is not limited to the listed steps or units, but optionally also includes steps or units that are not listed, or other steps or units inherent to the process, method, product, or device.
Stitching and fusion of multiple projectors is an important function in projection applications, especially in engineering applications. By merging the pictures of several projectors, a larger display area or projection onto complex curved surfaces can be achieved. At present, projector stitching and fusion is completed by manually adjusting a fusion device: during on-site installation, an experienced operator must individually adjust every projector and the blend-zone configuration of the fusion device, which requires a large amount of work. Moreover, because the picture of each projector used for fusion drifts over time and with environmental changes, multi-projector fusion systems installed for long periods often exhibit ghosting in the blend zones caused by such drift, degrading the viewing experience. The prior art usually adopts hardware fusion, implementing image splitting and geometric transformation in a dedicated ASIC chip, which keeps costs low. However, hardware fusion is not very flexible: it is generally difficult to handle fusion for projectors of arbitrary placement and number; the projection surface and the projectors are modeled manually, and the position and orientation information of the projectors is obtained through on-site parameter tuning, so installers must install and debug on site to achieve the desired stitching, fusion, and three-dimensional mapping. For multi-projector fusion and three-dimensional mapping in complex projection environments, professional designers are needed for modeling and content design, which also limits the application of projection display in environmental decoration.
In summary, current multi-projector fusion and 3D mapping both require modeling of the projection target and calibration of the projection-system model parameters. A survey of the literature and patents found no method that can satisfy both target modeling and projection-system parameter calibration at the same time, nor any method capable of obtaining the complete set of projection-system parameters. The projection fusion system and method proposed in the present application can solve the modeling of the projection target and the complete parameter extraction of the projection system simultaneously, realizing projector fusion and 3D-mapping calibration without human intervention.
The present application first proposes a projection fusion system, as shown in Fig. 1, which is a schematic structural diagram of an embodiment of the projection fusion system of the present application. The projection fusion system 10 of this embodiment includes a plurality of projection components 110 and a processor 120. The processor 120 is connected to each of the plurality of projection components 110 and is configured to configure information of the plurality of projection components 110; the processor 120 is configured to control each projection component 110 to project a test pattern and, while each projection component 110 projects the test pattern, to control the plurality of projection components 110 to capture the test pattern separately so as to obtain a plurality of sampled patterns; the processor 120 is further configured to use the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component 110 projecting the test pattern and the other projection components 110, to split a target projection pattern by means of the relative positional relationships, and to stitch and fuse it through the plurality of projection components 110.
The projection fusion system 10 of this embodiment includes n projection components 110 (P1 to Pn), where n is a natural number greater than or equal to 2. The processor 120 communicates with each projection component 110 through a communication network 150 to obtain the information of the projection components 110; specifically, the processor 120 obtains the network access information, video capture information, and optical parameter information of each projection component 110, and controls and configures the projection components 110. It should be noted that the processor 120 may obtain the component information either actively or passively: active acquisition builds the list and data set of projection components through user configuration, preset configuration files, cloud database queries, and the like; passive acquisition relies on the projection components automatically reporting their parameter information.
The communication network 150 may be any type of communication network, as long as it supports bidirectional communication between the processor 120 and the projection components 110, as well as bidirectional communication among the projection components 110 for cluster control and video synchronization of the projection components 110. The communication network 150 may be a cellular network (such as 4G or 5G), a wireless local area network (WiFi), wired Ethernet (LAN), the Internet, an HDMI video channel, a DP video channel, or the like.
The processor 120 may also be called a CPU (Central Processing Unit). The processor 120 may be an integrated circuit chip with signal processing capability. The processor 120 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The general-purpose processor may be a microprocessor, or the processor 120 may be any conventional processor.
The processor 120 may be an independent processor or a processor integrated in a projection component 110. The plurality of projection components 110 may form a master-slave topology or a peer-to-peer (P2P) topology. In the master-slave topology, one projection component 110 becomes the master node through a contention mechanism or through configuration, and the other projection components 110 act as slave nodes that project and capture test patterns according to the master node's instructions. When the P2P topology is adopted, each node publishes test-pattern projection messages on the communication network 150 by bus contention, and the other projection components 110 capture the test pattern and use it for positioning. In this mode, every projection component 110 computes and stores the relative position information of all nodes.
In contrast to the prior art, the projection fusion system 10 of this embodiment uses the processor 120 to control the plurality of projection components 110 to project test patterns in turn and, while each projection component 110 projects its test pattern, controls the plurality of projection components 110 to capture the projected test pattern separately so as to obtain a plurality of sampled patterns. The test pattern and the sampled patterns can be used to compute the relative positional relationship between the projection component 110 projecting the test pattern and the other projection components 110; by applying the same method, the relative positional relationship between any two of the plurality of projection components 110 can be obtained. With these relative positional relationships, automatic stitching and fusion of the plurality of projection components 110 can be achieved, the modeling of the projection components 110 and the extraction of their parameters can be solved simultaneously, and projection fusion and 3D-mapping calibration without human intervention can be realized.
In this embodiment, the processor 120 controls the plurality of projection components 110 to project test patterns in turn, i.e., exactly one projection component 110 projects a test pattern at any one time, which reduces interference between test patterns and lowers the computational complexity for the processor 120.
Of course, in other embodiments, the processor 120 may control two or more projection components 110 to project test patterns at the same time, which reduces the time the projection fusion system 10 spends projecting test patterns.
In one application scenario, when the imaging surface is large, the processor 120 controls the two projection components 110 farthest apart, for example the leftmost and rightmost projection components 110, to project test patterns separately; this not only reduces interference between test patterns, but also reduces the time the projection fusion system 10 spends projecting test patterns.
Each projection component 110 includes a projection lens 130 and a camera 140. The projection lens 130 is connected to the processor 120 and projects an encoded test pattern under the control of the processor 120; the camera 140 is connected to the processor 120 and captures the encoded test pattern under the control of the processor 120 to obtain a sampled pattern.
In other embodiments, other devices may be used instead of the projection lens, and other image capture devices may be used instead of the camera.
Within each projection component 110, the relative positional relationship between the camera 140 and the projection lens 130 is fixed and known, while the positional relationship between one projection component 110 and another is unknown.
The processor 120 first obtains the model number of a projection component 110 and then obtains the relative positional relationship between the projection lens 130 and the camera 140 of that projection component 110 from the database corresponding to that model; if all projection components 110 are of the same model, it is sufficient to obtain the relative positional relationship between the projection lens 130 and the camera 140 of a single projection component 110.
The processor 120 controls each projection lens 130 to project the test pattern while controlling the cameras 140 to sample the test pattern and obtain sampled patterns.
It should be noted that each projection lens 130 projects a different test pattern so that the projection component 110 projecting the test pattern can be identified; in other words, different projection lenses project test patterns with specific encodings.
The projection fusion system 10 of this embodiment is also used to implement the projection fusion method described below.
The present application further proposes a projection fusion method, as shown in Fig. 2, which is a schematic flowchart of an embodiment of the projection fusion method of the present application. The projection fusion method of this embodiment can be used in the projection fusion system 10 described above and includes the following steps:
Step S201: configure the information of the plurality of projection components 110.
The processor 120 verifies and logs in to the network information of each projection component 110, and configures basic information such as optical-parameter acquisition and database queries. Specifically, the processor 120 first obtains the model number of each projection component 110 and, according to the model, collects and queries the optical parameters of each projection component 110, such as the throw ratio, the height of the projection lens 130, the distance between the projection lens 130 and the camera 140, and other necessary information.
The processor 120 loads the information of all projection components 110 in the projection fusion system 10, i.e., the information of the n projection components 110, including the network access information of the projection components 110 as well as their image capture information and optical parameter information.
The methods for obtaining the information of each projection component 110 fall into two kinds, active and passive. Active acquisition retrieves the information automatically through the Internet and other channels once connected, building the list and data set of projection components 110 through user configuration, preset configuration files, cloud database queries, and so on; with passive acquisition, when the CPU connects to a projection component 110, the projection component 110 actively packages and reports its information, or broadcasts it on the network. A configuration sketch is given below.
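The patent does not prescribe a data model for this configuration step, so the following is a minimal sketch under assumed field names (ProjectorInfo and its attributes are illustrative, not part of the original disclosure). It shows the two acquisition paths feeding one registry:

```python
from dataclasses import dataclass

@dataclass
class ProjectorInfo:
    """Configuration record for one projection component (field names are illustrative)."""
    component_id: int
    network_address: str       # network access information
    resolution: tuple          # [I, J] of the imaging device
    throw_ratio: float         # optical parameter queried from the model database
    lens_camera_offset: tuple  # known, fixed lens-to-camera extrinsics

registry = {}

def register_active(component_id, model, database):
    """Active acquisition: look the model up in a preset/cloud parameter database."""
    registry[component_id] = ProjectorInfo(component_id, **database[model])

def register_passive(report):
    """Passive acquisition: accept a parameter packet reported by the component itself."""
    registry[report["component_id"]] = ProjectorInfo(**report)
```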
Step S202: control each projection component 110 to project a test pattern.
After the processor 120 has loaded the information of all projection components 110, it issues commands through the communication protocol with the projection components 110 to control the projection lens 130 of each projection component 110 to project a test pattern with a specific encoding in turn, so that the test pattern is projected onto a target surface 160, where the target surface 160 may be a curved or flat surface of any shape.
In this embodiment, the processor 120 must ensure that exactly one projection component 110 projects a test pattern at any one time, to reduce interference between test patterns and lower the computational complexity for the processor 120.
Of course, in other embodiments, the processor may control two or more projection components to project test patterns at the same time, which reduces the time the projection fusion system spends projecting test patterns.
Step S203: while each projection component 110 projects the test pattern, control the plurality of projection components 110 to capture the test pattern separately so as to obtain a plurality of sampled patterns, and use the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component 110 projecting the test pattern and the other projection components 110.
As analyzed above, the processor 120 must ensure that exactly one projection component 110 projects a test pattern at a time. When any one of the n projection lenses 130 projects the test pattern, the processor 120 controls the cameras 140 of the n projection components 110 to capture the test pattern, obtaining n sampled patterns; from one test pattern and the n sampled patterns, the processor 120 can obtain the relative positional relationships between the projection component 110 projecting the test pattern and the other (n-1) projection components 110. It should be noted that if the camera 140 of some projection component 110 fails to capture the test-pattern information, the processor cannot obtain the encoded information of the projecting projection component 110 and therefore cannot decode its position information; in that case, the projection component 110 whose camera performed that capture does not participate in the decoded-position computation.
The projection lens and camera model is introduced below, to prepare for the description of the detailed implementation of step S203.
Please refer to Figs. 1 to 4 together. Fig. 3 is a schematic structural diagram of the projection lens and camera model in the projection fusion system of the present application; Fig. 4 is a schematic diagram of polar coordinates in the world coordinate system. The projection lens 130 and camera 140 of this embodiment can be represented by the model shown in Fig. 3, where 301 is the imaging surface of the camera 140 or the image plane of the projection lens 130, 302 is the equivalent optical center, 303 is the equivalent principal optical axis, and 304-1, 304-2, 304-3, and 304-4 are the boundaries of the image capture region (or projection imaging region). The image capture space of the camera 140 or the projection range of the projection lens 130 is the cone whose apex is the equivalent optical center 302 and whose boundary is 304-1 to 304-4. A spherical coordinate system $(\varphi,\theta,\rho)$ is defined with the equivalent optical center 302 as the origin and the equivalent principal optical axis 303 as the principal axis; the direction of 305 is the up direction, and the direction of 306 is the positive direction of horizontal rotation. 303, 305, and 306 are mutually perpendicular, 305 is perpendicular to the wide edge of the imaging surface, and 303, 305, and 306 form a right-handed coordinate system.

In this embodiment, any point in the image capture region or projection imaging region can be expressed as $(\varphi,\theta,\rho)$, where $\varphi$ is the azimuth of the vector from the optical center to that point in the $(\varphi,\theta,\rho)$ coordinate system, $\theta$ is the inclination of that vector, and $\rho$ is its length. Thus any point 307 on the image plane, with normalized coordinates $(x,y)$ in the image coordinate system, maps uniquely to an azimuth and inclination $(\varphi_i,\theta_i)$ in the spherical coordinate system. The mapping can be expressed as:

$$(\varphi_i,\theta_i) = T\big(D(x,y)\big) \tag{1}$$

where $D$ is the distortion transform, determined by the characteristics of the optical system of the camera 140, and $T$ is the perspective mapping transform; these two transforms are determined by the intrinsic parameters of the camera 140 or projection lens 130. The image plane is generally a pixelated imaging device (a photosensitive chip such as a CCD or CMOS, or a light-modulation device such as a DMD, LCoS, or LCD). The relationship between the normalized coordinates $(x,y)$ and the pixels can be expressed as:

$$(x,y) = \left(\frac{i}{I}-\frac{1}{2},\ \frac{j}{J}-\frac{1}{2}\right) \tag{2}$$

where $[I,J]$ is the resolution of the imaging device and $[i,j]$ is the pixel index, denoting the pixel in row $j$ and column $i$. For example, for a 1080p imaging device, $[I,J]=[1920,1080]$. This definition of normalized coordinates places the origin of the coordinates at the center of the imaging device.
For the camera 140 and the projection lens 130, the principal optical axis 303 is perpendicular to the imaging surface 301. The perspective mapping transform of the projection component 110 is then determined by the following parameters: the normalized focal length $F = f/W$, where $f$ is the focal length of the lens in units of length and $W$ is the width of the imaging chip in the same units; the aspect ratio of the imaging chip $a = H/W$, where $H$ is the height of the imaging chip in the same units as the lens focal length; and the normalized lens offset $\mathrm{off} = (x_0, y_0)$, the displacement of the equivalent principal optical axis from the center of the imaging chip divided by the chip width $W$. The perspective mapping transform $T(x,y)$ can then be expressed as:

$$T(x,y):\quad \varphi = \arctan\frac{x - x_0}{F},\qquad \theta = \arctan\frac{a\,(y - y_0)}{\sqrt{F^2 + (x - x_0)^2}} \tag{3}$$

In the global coordinate system, a projection component 110 is determined by six parameters: the position $\vec{O} = (X, Y, Z)$ and the direction vector $\vec{\Omega} = (\alpha, \beta, \gamma)$. Here $\vec{O}$ determines the position of the optical center of the projection component 110 in the global coordinate system, and the direction vector $\vec{\Omega}$ determines the orientation of the camera 140 and the rotation of the projection lens 130 about the principal optical axis.
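The pixel normalization of equation (2) and the perspective mapping of equation (3) take only a few lines. The sketch below is a minimal illustration under the reconstruction given above, with a pass-through placeholder standing in for the distortion transform $D$:

```python
import math

def pixel_to_normalized(i, j, I, J):
    """Equation (2): pixel index -> normalized coordinates, origin at chip center."""
    return i / I - 0.5, j / J - 0.5

def distortion(x, y):
    """Placeholder for D: a real lens model (e.g. a radial polynomial) would go here."""
    return x, y

def perspective_T(x, y, F, a, x0, y0):
    """Equation (3): normalized coordinates -> azimuth/inclination, in radians."""
    phi = math.atan2(x - x0, F)
    theta = math.atan2(a * (y - y0), math.hypot(F, x - x0))
    return phi, theta

# Equation (1) end to end: pixel -> viewing direction for a 1080p device, F = f/W.
phi, theta = perspective_T(*distortion(*pixel_to_normalized(960, 540, 1920, 1080)),
                           F=1.2, a=1080 / 1920, x0=0.0, y0=0.0)
```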
Fig. 4 is a schematic diagram of polar coordinates in the world coordinate system, where the North direction is the y direction of the Cartesian coordinate system in the world coordinate system, and the Up direction is its z direction. The transformation between Cartesian and polar coordinates is a basic coordinate transformation described in any book on three-dimensional analytic geometry and is not expanded on here.

Any point $\vec{p}$ in space can be mapped uniquely to the normalized coordinates of the camera 140. The point $\vec{p}$ is first transformed from the world coordinate system into the local coordinate system of the camera 140:

$$\vec{p}\,' = \mathbf{R}\,(\vec{p} - \vec{O}) \tag{4}$$

where $\mathbf{R}$ is the rotation transform matrix determined by the direction vector $\vec{\Omega} = (\alpha,\beta,\gamma)$:

$$\mathbf{R} = R_z(\gamma)\,R_x(\beta)\,R_z(\alpha) \tag{5}$$

The normalized coordinates of any point in space with respect to the camera 140 can then be expressed as:

$$(x,y) = (T \circ D)^{-1}\big(\varphi',\theta'\big),\qquad \varphi' = \operatorname{atan2}(p'_x, p'_z),\quad \theta' = \arctan\frac{p'_y}{\sqrt{p_x'^2 + p_z'^2}} \tag{6}$$

Equations (4) and (6) define the spatial mapping from any point in space to a pixel of the camera 140, while equations (3) and (4) define the mapping from a pixel of the camera 140 to a direction in space. In the projection component 110, every point on the imaging chip corresponds to a ray starting from the optical center of the projection component 110 in a determined direction, and the imaging position of a projected pixel is the intersection of that ray with the first intersected surface.
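Equations (4) through (6) chain into a single world-to-direction projection. The following sketch composes them with the helpers defined above; the Euler-angle convention used for equation (5) is an assumption of this reconstruction, not specified by the original text:

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Equation (5): rotation built as Rz(gamma) @ Rx(beta) @ Rz(alpha) (assumed convention)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz_a = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Rx_b = np.array([[1, 0, 0], [0, cb, -sb], [0, sb, cb]])
    Rz_g = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return Rz_g @ Rx_b @ Rz_a

def world_to_angles(p_world, origin, R):
    """Equations (4) and (6): world point -> local azimuth/inclination."""
    px, py, pz = R @ (np.asarray(p_world) - np.asarray(origin))  # eq. (4)
    phi = np.arctan2(px, pz)
    theta = np.arctan2(py, np.hypot(px, pz))
    return phi, theta
```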
As analyzed above, n sampled patterns are captured each time a test pattern is projected. This embodiment can use the method shown in Fig. 5 to obtain the relative positional relationships between the projection component 110 projecting the test pattern and the other (n-1) projection components 110.
For ease of analysis, this embodiment divides the n projection components 110 of the projection fusion system 10 into one first projection component and (n-1) second projection components, where the first projection component is the projection component 110 projecting the test pattern and the second projection components are the other projection components 110 in the projection fusion system 10. The projection lens 130 of the first projection component is called the first projection lens, the camera 140 of the first projection component is called the first camera, the projection lens 130 of a second projection component is called the second projection lens, and the camera 140 of a second projection component is called the second camera.
It should be noted that a different projection component 110 projects the test pattern each time, so the specific projection component 110 referred to as the first projection component differs, and the projection components 110 referred to as the second projection components change accordingly.
This embodiment takes as an example the case where the first projection component is the i-th projection component 110 of the projection fusion system 10 and the second projection component is the j-th projection component 110.
Specifically, as shown in Fig. 5, the method of this embodiment includes steps S501 to S504.
Step S501: obtain the position information of the feature points of the test pattern.

Specifically, the position information of this embodiment includes coordinate information, and step S501 can be implemented by the method shown in Fig. 6, which includes steps S601 and S602.

Step S601: obtain first coordinate information of a feature point of the test pattern within the test pattern.

The test pattern has a special distribution of feature points, whose feature descriptors are invariant to geometric and perspective transformations of the image. By placing such special feature points, this embodiment can use the coordinate information of these feature points in the test pattern, together with their coordinate information in a sampled pattern, to obtain the relative positional relationship between the projection component 110 projecting the test pattern and the projection component 110 capturing the sampled pattern.

The processor 120 obtains from the storage medium the content of the test pattern projected by the first projection lens and the feature descriptor of each feature point, and obtains the first coordinate information of the k-th feature point in the test pattern as $(x_k^{p}, y_k^{p})$.

Step S602: obtain second coordinate information of the feature point in the sampled pattern captured by the first camera.

The processor 120 extracts, from the sampled pattern captured by the first camera of the first projection component, the second coordinate information of the feature point in that sampled pattern, $(x_k^{c_0}, y_k^{c_0})$. Further, the processor 120 also obtains the third coordinate information $(x_k^{c_1}, y_k^{c_1})$ of the feature point in the captured pattern acquired by the second camera.
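The patent does not name a particular descriptor; any detector whose descriptors survive perspective distortion will do. As a minimal sketch, ORB from OpenCV (an assumption standing in for the pattern's special feature points, not the patent's prescribed method) can recover the pattern-to-sample correspondences of steps S601 and S602:

```python
import cv2

def match_feature_points(test_pattern, sampled_pattern, max_matches=50):
    """Return corresponding (x_p, y_p) <-> (x_c, y_c) pixel pairs between the
    projected test pattern and one camera's sampled pattern."""
    orb = cv2.ORB_create()
    kp_p, des_p = orb.detectAndCompute(test_pattern, None)
    kp_c, des_c = orb.detectAndCompute(sampled_pattern, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_p, des_c), key=lambda m: m.distance)
    return [(kp_p[m.queryIdx].pt, kp_c[m.trainIdx].pt)
            for m in matches[:max_matches]]
```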
Step S502: obtain the first external parameters of the first camera from the position information.

Specifically, the first external parameters of this embodiment include spatial coordinate information, and step S502 can be implemented by the method shown in Fig. 7, which includes steps S701 to S703.

Step S701: obtain the relative extrinsic parameters of the first camera with respect to the first projection lens.

The processor 120 obtains the relative extrinsic parameters of the first camera with respect to the optical center of the first projection lens as $(\vec{t}_{c_0}, \vec{\Omega}_{c_0})$, which represent the relative position and orientation, respectively, of the first camera in the coordinate system of the first projection lens.
Step S702: convert the first coordinate information into the world coordinate system to obtain a first direction vector of the feature point, and convert the second coordinate information into the world coordinate system to obtain a second direction vector of the feature point.

For the first camera, the second direction vector $\vec{v}_k^{c_0}$ corresponding to the feature point $(x_k^{c_0}, y_k^{c_0})$ is:

$$\vec{v}_k^{c_0} = \mathbf{R}^{-1}\,\hat{d}\!\left(T\big(D(x_k^{c_0}, y_k^{c_0})\big)\right) \tag{7}$$

where $\mathbf{R}^{-1}$ is the inverse of the rotation transform defined by $\vec{\Omega}_{c_0}$, and $\hat{d}(\varphi,\theta)$ denotes the unit vector with azimuth $\varphi$ and inclination $\theta$. By the same method, the first direction vector $\vec{v}_k^{p}$ corresponding to the feature point $(x_k^{p}, y_k^{p})$ is obtained as:

$$\vec{v}_k^{p} = \hat{d}\!\left(T\big(D(x_k^{p}, y_k^{p})\big)\right) \tag{8}$$
Step S703: compute the first direction vector, the second direction vector, and the relative extrinsic parameters by triangulation, to obtain the spatial position information of the feature point in the world coordinate system.

As shown in Fig. 8, the processor 120 uses triangulation, i.e., the following equation, on the first direction vector $\vec{v}_k^{p}$ and the second direction vector $\vec{v}_k^{c_0}$ to obtain the spatial position coordinates $\vec{P}_k$ of the feature point:

$$\vec{P}_k = l_1\,\vec{v}_k^{p} = \vec{t}_{c_0} + l_0\,\vec{v}_k^{c_0} \tag{9}$$

where $l_0$ and $l_1$ are the distances from $\vec{P}_k$ to the first camera and to the first projection lens, respectively.

Using this method, multiple feature points of the test pattern can be extracted and their spatial position information computed.
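In practice the two rays of equation (9) only nearly intersect, and a least-squares midpoint is the usual numerical treatment. The following is a minimal sketch under the reconstruction above, with the optical center of the first projection lens taken as the origin:

```python
import numpy as np

def triangulate(v_p, v_c, t_c):
    """Solve eq. (9) in the least-squares sense: find l1, l0 minimizing
    || l1*v_p - (t_c + l0*v_c) ||, then return the midpoint of the two rays."""
    v_p, v_c, t_c = (np.asarray(a, dtype=float) for a in (v_p, v_c, t_c))
    A = np.column_stack([v_p, -v_c])          # 3x2 system: A @ [l1, l0] = t_c
    (l1, l0), *_ = np.linalg.lstsq(A, t_c, rcond=None)
    return 0.5 * (l1 * v_p + (t_c + l0 * v_c))
```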
Step S503: obtain the relative positional relationship between the first projection component and the second projection component from the first external parameters and the position information.

Specifically, this embodiment can implement step S503 by the method shown in Fig. 9, which includes steps S901 and S902.

Step S901: convert the third coordinate information into the world coordinate system to obtain a third direction vector of the feature point.

As analyzed above, the processor 120 obtains the third coordinate information of the feature point in the captured pattern acquired by the second camera as $(x_k^{c_1}, y_k^{c_1})$. Assuming the relative extrinsic parameters of the second camera of the second projection component with respect to the first projection lens are $(\vec{t}_{c_1}, \vec{\Omega}_{c_1})$, the third direction vector corresponding to the k-th feature point is:

$$\vec{v}_k^{c_1} = \mathbf{R}_{c_1}^{-1}\,\hat{d}\!\left(T\big(D(x_k^{c_1}, y_k^{c_1})\big)\right) \tag{10}$$
Step S902: compute the third direction vector and the spatial position information by triangulation, to obtain the relative positional relationship of the feature points with respect to the first projection lens in the world coordinate system.

The processor 120 uses triangulation to generate the triangulation equation of the k-th feature point:

$$\vec{P}_k = \vec{t}_{c_1} + l_k\,\vec{v}_k^{c_1} \tag{11}$$

where $\vec{P}_k$ is the spatial position information of the k-th feature point. To solve for the relative extrinsic parameters $(\vec{t}_{c_1}, \vec{\Omega}_{c_1})$ of the second camera with respect to the first projection lens, a system of triangulation equations can be built from two or more feature points. From this system, the relative positional relationship of the second camera of the second projection component with respect to the first projection lens of the first projection component can be solved, i.e., the relative positional relationship of the camera of the j-th projection component 110 with respect to the projection lens of the i-th projection component 110 is obtained.

Because the positional relationship between the camera 140 and the projection lens 130 within the same projection component 110 is a known, fixed value, the relative positional relationship between the projection lens 130 of the j-th projection component 110 and the projection lens 130 of the i-th projection component 110, as well as between the camera 140 of the j-th projection component 110 and the camera 140 of the i-th projection component 110, can then be obtained.

Further, the processor 120 can use the above method to obtain the relative positional relationships between the i-th projection component 110 and the other (n-1) projection components 110. Therefore, through the above method, the processor 120 can obtain the relative positional relationship between any two of the n projection components 110.
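Solving equation (11) over several feature points for the six unknowns in $(\vec{t}_{c_1}, \vec{\Omega}_{c_1})$ is a small nonlinear least-squares problem. Below is a sketch under the reconstruction above, reusing rotation_matrix from the earlier sketch and SciPy's generic solver; the residual simply measures how far each known point $\vec{P}_k$ lies off the ray cast by the second camera:

```python
import numpy as np
from scipy.optimize import least_squares

def solve_second_camera_extrinsics(P_world, v_local, x0=np.zeros(6)):
    """P_world: (K,3) feature positions from step S703; v_local: (K,3) unit
    direction vectors of the same features in the second camera's own frame.
    Returns (t_c1, angles) minimizing the point-to-ray distances of eq. (11)."""
    def residuals(params):
        t, angles = params[:3], params[3:]
        R_inv = rotation_matrix(*angles).T           # camera frame -> world frame
        res = []
        for P, v in zip(P_world, v_local):
            v_w = R_inv @ v                           # eq. (10)
            d = P - t
            res.extend(d - (d @ v_w) * v_w)           # component of d off the ray
        return res
    sol = least_squares(residuals, x0)
    return sol.x[:3], sol.x[3:]
```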
In other embodiments, to reduce computational overhead, before step S501 — i.e., before using the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection components — the processor 120 may also match the sampled pattern against the test pattern to determine whether the sampled pattern contains part or all of the image information of the test pattern. In response to the sampled pattern containing image information of the test pattern, step S501 is executed; in response to the sampled pattern not containing image information of the test pattern, the next projection component 110 is used to sample the test pattern so as to obtain the next sampled pattern.
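This containment check can reuse the matcher sketched earlier: if too few descriptor matches survive, the sampled pattern is treated as not containing the test pattern. A one-function sketch, with the threshold an arbitrary assumption:

```python
def contains_test_pattern(test_pattern, sampled_pattern, min_matches=12):
    """Skip the pose computation when the camera saw none of the pattern."""
    return len(match_feature_points(test_pattern, sampled_pattern)) >= min_matches
```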
Step S204: split the target projection pattern by means of the relative positional relationships, and stitch and fuse it through the plurality of projection components 110.

The target projection pattern includes a plurality of sub-patterns. This embodiment can implement step S204 by the method shown in Fig. 10, which includes steps S1001 to S1003.

Step S1001: generate, from the relative positional relationships, the absolute positional relationship of each projection component 110 within the projection fusion system 10.

The processor 120 uses the relative positional relationships to generate the absolute positional relationship of each projection component 110 within the projection fusion system 10, and stores it in a fusion and mapping relationship table.

Step S1002: compute the splitting and distortion mapping of the target projection pattern from the absolute positional relationship of each projection component 110.

Using the fusion and mapping relationship table, the processor 120 can complete the segmentation, mapping, and fusion of the target projection pattern.

Step S1003: determine the sub-pattern to be projected by each projection component 110, and control each projection component 110 to project its corresponding sub-pattern synchronously.

In contrast to the prior art, the projection fusion system 10 of this embodiment uses the processor 120 to control the plurality of projection components 110 to project test patterns in turn and, while each projection component 110 projects its test pattern, controls the plurality of projection components 110 to capture the projected test pattern separately so as to obtain a plurality of sampled patterns. The test pattern and the sampled patterns can be used to compute the relative positional relationship between the projection component 110 projecting the test pattern and the other projection components 110; by the same method, the relative positional relationship between any two of the plurality of projection components 110 can be obtained. With these relative positional relationships, automatic stitching and fusion of the plurality of projection components 110 can be achieved, the modeling of the projection components 110 and the extraction of their parameters can be solved simultaneously, and projection fusion and 3D-mapping calibration without human intervention can be realized.
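For a flat target surface, steps S1001 to S1003 reduce to warping each projector's slice of the target pattern and feathering the overlap. The sketch below is a deliberately simplified stand-in for the fusion and mapping relationship table, assuming each component's mapping has already been condensed into a 3x3 homography (an assumption; the patent's table also covers curved surfaces):

```python
import cv2
import numpy as np

def render_sub_pattern(target, H, out_size, blend_mask):
    """Warp one projector's view of the target pattern (step S1002) and apply
    its edge-blend ramp (step S1003). H: 3x3 homography from the relationship
    table; blend_mask: per-pixel weight in [0, 1] over the overlap region."""
    sub = cv2.warpPerspective(target, H, out_size)
    return (sub.astype(np.float32) * blend_mask[..., None]).astype(np.uint8)

def linear_ramp(width, height, overlap_px):
    """A simple left-edge feathering ramp for a projector on the right side."""
    ramp = np.ones((height, width), np.float32)
    ramp[:, :overlap_px] = np.linspace(0.0, 1.0, overlap_px, dtype=np.float32)
    return ramp
```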
The present application further proposes a projection fusion method of another embodiment, as shown in Fig. 11. The projection fusion method of this embodiment includes the following steps:
Step S1101: configure the information of the plurality of projection components 110.
Step S1101 is similar to step S201 and is not repeated here.
Step S1102: control each projection component 110 to project a test pattern.
Step S1102 is similar to step S202 and is not repeated here.
Step S1103: while each projection component 110 projects the test pattern, control one projection component 110 to capture the test pattern so as to obtain one sampled pattern.
Step S1104: use the test pattern and the sampled pattern to obtain the relative positional relationship between the projection component 110 projecting the test pattern and the other projection component 110 capturing the sampled pattern.
While one projection component 110 projects the test pattern, the other projection components 110 are controlled in turn to capture sampled patterns, and the relative positional relationship between the projecting projection component 110 and each capturing projection component 110 is obtained in turn; this places low demands on the computing power of the processor 120.
Step S1105: accumulate a second count of the projection components 110 that have captured a sampled pattern.
Step S1106: in response to the second count being less than the total number of the plurality of projection components 110, execute step S1103.
If the processor 120 determines that the second count of projection components 110 that have captured a sampled pattern is less than the total number of the plurality of projection components 110, it concludes that some projection components 110 have not yet captured a sampled pattern, and controls the next projection component 110 to capture one.
Step S1107: in response to the second count being greater than or equal to the total number of the plurality of projection components 110, proceed to step S1108.
If the processor 120 determines that the second count of projection components 110 that have captured a sampled pattern is greater than or equal to the total number of the plurality of projection components 110, it concludes that all projection components 110 have captured sampled patterns.
Step S1108: accumulate a first count of the projection components 110 that have projected the test pattern.
Step S1109: in response to the first count being greater than or equal to the total number of the plurality of projection components 110, execute step S1111.
If the processor 120 determines that the first count of projection components 110 that have projected the test pattern is greater than or equal to the total number of the plurality of projection components 110, it concludes that all projection components 110 have projected the test pattern, and executes step S1111.
Step S1110: in response to the first count being less than the total number of the plurality of projection components 110, execute step S1102.
If the processor 120 determines that the first count of projection components 110 that have projected the test pattern is less than the total number of the plurality of projection components 110, it concludes that some projection components 110 have not yet projected the test pattern, and controls the next projection component 110 to project it.
Step S1111: split the target projection pattern by means of the relative positional relationships, and stitch and fuse it through the plurality of projection components 110.
Step S1111 is similar to step S204 and is not repeated here.
In other embodiments, while one projection component projects the test pattern, the other projection components may also be controlled to capture the test pattern simultaneously so as to obtain multiple sampled patterns, and/or multiple sampled patterns may be computed simultaneously so as to obtain multiple relative positional relationships at the same time, which improves the processing efficiency of the processor 120.
On the basis of the above embodiment, this embodiment further checks the count of projection components 110 that have projected the test pattern and the count of projection components 110 that have captured a sampled pattern, which prevents some projection components 110 from failing to project the test pattern or failing to capture a sampled pattern, thereby avoiding the problem of the position information of some projection components 110 not being obtained. A control-loop sketch follows.
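The double counting loop of steps S1101 to S1111 can be written down directly. The sketch below illustrates the control flow only; get_relative_pose and split_and_fuse are hypothetical names standing in for the triangulation pipeline of steps S501-S503 and the fusion of step S204:

```python
def run_fusion_calibration(components):
    """Steps S1101-S1111: every component projects once; for each projection,
    every component samples once; then the target pattern is split and fused."""
    poses = {}
    first_count = 0                              # components that have projected
    for projector in components:                 # S1102
        projector.project_test_pattern()
        second_count = 0                         # components that have sampled
        for sampler in components:               # S1103, repeated via S1106
            sample = sampler.capture()
            poses[(projector.id, sampler.id)] = get_relative_pose(
                projector.test_pattern, sample)  # S1104
            second_count += 1                    # S1105; S1107 exits this loop
        first_count += 1                         # S1108
    # S1109: all components have projected, so proceed to S1111.
    split_and_fuse(poses, components)
```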
The present application further proposes a computer-readable storage medium, as shown in Fig. 12, which is a schematic structural diagram of an embodiment of the computer-readable storage medium of the present application. The computer-readable storage medium 90 stores program instructions 91 which, when executed by a processor (not shown), implement the projection fusion method described above.
The computer-readable storage medium 90 of this embodiment may be, but is not limited to, a USB flash drive, an SD card, a PD optical drive, a removable hard disk, a high-capacity floppy drive, flash memory, a multimedia memory card, a server, and the like.
In contrast to the prior art, the projection fusion method of the embodiments of the present application is used in a projection fusion system that includes a plurality of projection components, and includes: configuring the information of the plurality of projection components; controlling each projection component to project a test pattern; while each projection component projects the test pattern, controlling the plurality of projection components to capture the test pattern separately so as to obtain a plurality of sampled patterns, and using the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection components; and splitting a target projection pattern by means of the relative positional relationships and stitching and fusing it through the plurality of projection components. The projection fusion system of the embodiments of the present application uses a processor to control the plurality of projection components to project test patterns in turn and, while each projection component projects its test pattern, controls the plurality of projection components to capture the projected test pattern separately so as to obtain a plurality of sampled patterns; the test pattern and the sampled patterns can be used to compute the relative positional relationship between the projecting projection component and the other projection components, and by the same method the relative positional relationship between any two of the plurality of projection components can be obtained. With these relative positional relationships, automatic stitching and fusion of the plurality of projection components can be achieved, the modeling of the projection components and the extraction of their parameters can be solved simultaneously, and projection fusion and 3D-mapping calibration without human intervention can be realized.
By controlling the content of the test pattern, for example by using conventional structured-light methods, the projection components of the embodiments of the present application can compute and generate a three-dimensional point cloud of the projection target surface, and from it a three-dimensional geometric model of the projection target surface.
From the complete projection-component model and the three-dimensional geometric model of the projection target surface, the projection components of the embodiments of the present application compute the splitting and mapping transformation of the target image, generate the projection content of each projection component, and deliver it through the communication channel to each projection component for display.
The above is only an implementation of the present application and does not thereby limit the patent scope of the present application. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present application.

Claims (10)

  1. A projection fusion method, characterized in that it is used in a projection fusion system, the projection fusion system comprising a plurality of projection components, and the projection fusion method comprising:
    configuring information of the plurality of projection components;
    controlling each of the projection components to project a test pattern;
    while each of the projection components projects the test pattern, controlling the plurality of projection components to capture the test pattern separately so as to obtain a plurality of sampled patterns, and using the test pattern and the sampled patterns to obtain a relative positional relationship between the projection component projecting the test pattern and the other projection components;
    splitting a target projection pattern by means of the relative positional relationship, and stitching and fusing the target projection pattern through the plurality of projection components.
  2. The projection fusion method according to claim 1, characterized in that the target projection pattern comprises a plurality of sub-patterns, and the step of splitting the target projection pattern by means of the relative positional relationship and stitching and fusing it through the plurality of projection components comprises:
    generating, from the relative positional relationship, an absolute positional relationship of each of the projection components within the projection fusion system;
    computing the splitting and distortion mapping of the target projection pattern from the absolute positional relationship of each of the projection components;
    determining the sub-pattern to be projected by each of the projection components, and controlling each of the projection components to project the corresponding sub-pattern synchronously.
  3. The projection fusion method according to claim 1, characterized in that the plurality of projection components comprise a first projection component and a second projection component, the first projection component being configured to project the test pattern and the second projection component being another of the projection components in the projection fusion system, the first projection component comprising a first projection lens and a first camera, and the second projection component comprising a second projection lens and a second camera, wherein the step of using the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection components comprises:
    obtaining position information of feature points of the test pattern;
    obtaining first external parameters of the first camera by using the position information;
    obtaining the relative positional relationship between the first projection component and the second projection component by using the first external parameters and the position information.
  4. The projection fusion method according to claim 3, characterized in that the position information comprises coordinate information and the first external parameters comprise spatial coordinate information, and the step of obtaining the position information of the feature points of the test pattern comprises:
    obtaining first coordinate information of a feature point of the test pattern within the test pattern;
    obtaining second coordinate information of the feature point within the sampled pattern captured by the first camera;
    and the step of obtaining the first external parameters of the first camera by using the position information comprises:
    obtaining relative extrinsic parameters of the first camera with respect to the first projection lens;
    converting the first coordinate information into a world coordinate system to obtain a first direction vector of the feature point, and converting the second coordinate information into the world coordinate system to obtain a second direction vector of the feature point;
    computing the first direction vector, the second direction vector, and the relative extrinsic parameters by triangulation, to obtain spatial position information of the feature point in the world coordinate system.
  5. The projection fusion method according to claim 4, characterized in that the step of obtaining the position information of the feature points of the test pattern further comprises:
    obtaining third coordinate information of the feature point within the captured pattern acquired by the second camera;
    and the step of obtaining the relative positional relationship between the first projection component and the second projection component by using the first external parameters and the position information comprises:
    converting the third coordinate information into the world coordinate system to obtain a third direction vector of the feature point;
    computing the third direction vector and the spatial position information by triangulation, to obtain the relative positional relationship of the feature point with respect to the first projection lens in the world coordinate system.
  6. The projection fusion method according to claim 3, characterized in that, before the step of obtaining the position information of the feature points of the test pattern, the projection fusion method further comprises:
    in response to the sampled pattern containing image information of the test pattern, executing the step of obtaining the position information of the feature points of the test pattern.
  7. The projection fusion method according to claim 1, characterized in that, before the step of splitting the target projection pattern by means of the relative positional relationship and stitching and fusing it through the plurality of projection components, the projection fusion method further comprises:
    accumulating a first count of the projection components that have projected the test pattern;
    in response to the first count being greater than or equal to the total number of the plurality of projection components, executing the step of stitching and fusing the target projection pattern by means of the relative positional relationship;
    in response to the first count being less than the total number of the plurality of projection components, executing the step of controlling each of the projection components in turn to project the test pattern.
  8. The projection fusion method according to claim 1, characterized in that the step of controlling the plurality of projection components to capture the test pattern separately so as to obtain a plurality of sampled patterns, and using the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection components, comprises executing the following sub-steps in a loop:
    controlling one of the projection components to capture the test pattern so as to obtain one sampled pattern;
    using the test pattern and the one sampled pattern to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection component capturing the sampled pattern;
    and the step of controlling the plurality of projection components to capture the test pattern separately so as to obtain a plurality of sampled patterns, and using the test pattern and the sampled patterns to obtain the relative positional relationship between the projection component projecting the test pattern and the other projection components, further comprises:
    accumulating a second count of the projection components that have captured the sampled pattern;
    in response to the second count being less than the total number of the plurality of projection components, executing the sub-step of controlling one of the projection components to capture the test pattern so as to obtain one sampled pattern;
    in response to the second count being greater than or equal to the total number of the plurality of projection components, executing the step of controlling each of the projection components in turn to project the test pattern.
  9. A projection fusion system, characterized in that the projection fusion system comprises:
    a plurality of projection components;
    a processor connected to each of the plurality of projection components, the processor being configured to configure information of the plurality of projection components; the processor being configured to control each of the projection components to project a test pattern and, while each of the projection components projects the test pattern, to control the plurality of projection components to capture the test pattern separately so as to obtain a plurality of sampled patterns, and to use the test pattern and the sampled patterns to obtain a relative positional relationship between the projection component projecting the test pattern and the other projection components; the processor being further configured to split a target projection pattern by means of the relative positional relationship and to stitch and fuse it through the plurality of projection components.
  10. A computer-readable storage medium, characterized in that program data is stored thereon, and the program data, when executed, implements the projection fusion method according to any one of claims 1 to 8.
PCT/CN2021/132870 2020-12-11 2021-11-24 Projection fusion method, projection fusion system, and computer-readable storage medium WO2022121686A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011459435.4 2020-12-11
CN202011459435.4A CN114630087A (zh) 2020-12-11 2020-12-11 Projection fusion method, projection fusion system, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2022121686A1 (zh)

Family

ID=81895819

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/132870 WO2022121686A1 (zh) 2020-12-11 2021-11-24 Projection fusion method, projection fusion system, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN114630087A (zh)
WO (1) WO2022121686A1 (zh)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040085256A1 (en) * 2002-10-30 2004-05-06 The University Of Chicago Methods and measurement engine for aligning multi-projector display systems
CN102288131A (zh) * 2011-05-12 2011-12-21 上海大学 物体360°轮廓误差的自适应条纹测量装置和方法
US20130027332A1 (en) * 2011-07-26 2013-01-31 Jar-Ferr Yang Projection system with touch control
CN102881006A (zh) * 2012-08-03 2013-01-16 吉林禹硕动漫游戏科技股份有限公司 多投影显示系统中的图像拼接与融合方法
CN106060493A (zh) * 2016-07-07 2016-10-26 广东技术师范学院 多源投影无缝边缘拼接方法及系统
CN110191326A (zh) * 2019-05-29 2019-08-30 北京小鸟听听科技有限公司 一种投影系统分辨率扩展方法、装置和投影系统
CN111918045A (zh) * 2020-08-05 2020-11-10 华强方特(深圳)软件有限公司 用于多个投影机进行投影拼接校正的网格数据生成方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115314690A (zh) * 2022-08-09 2022-11-08 北京淳中科技股份有限公司 一种图像融合带处理方法、装置、电子设备及存储介质
CN115314690B (zh) * 2022-08-09 2023-09-26 北京淳中科技股份有限公司 一种图像融合带处理方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN114630087A (zh) 2022-06-14


Legal Events

Date Code Title Description
121 — Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21902393; Country of ref document: EP; Kind code of ref document: A1)
NENP — Non-entry into the national phase (Ref country code: DE)
122 — Ep: pct application non-entry in european phase (Ref document number: 21902393; Country of ref document: EP; Kind code of ref document: A1)