CN112614190B - Method and device for projection mapping


Info

Publication number: CN112614190B
Application number: CN202011481546.5A
Other versions: CN112614190A (application publication)
Other languages: Chinese (zh)
Authority: CN (China)
Inventor: 程星凯
Assignee (original and current): Beijing Tricolor Technology Co., Ltd.
Prior art keywords: image, projector, mapping table, coordinate mapping, camera
Application filed by Beijing Tricolor Technology Co., Ltd.; published as CN112614190A, granted and published as CN112614190B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T17/00 - Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T5/77
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application provides a method and a device for projection mapping. The method is applied to a control device in a projection-mapping control system that further includes a camera and a projector. The method comprises the following steps: acquiring a grating image, i.e. an image, captured by the camera, that contains a grating pattern projected by the projector onto an object to be projected; obtaining a parallax image of the projector based on the grating image; acquiring a discrete color image, i.e. an image obtained by transforming the grating image, which contains a hole area; repairing the hole area based on the parallax image to obtain a first repair image; generating a projection image based on the first repair image and the parallax image; and controlling the projector to project the projection image onto the object to be projected. The method simplifies the operation flow of projection mapping and improves practicability.

Description

Method and device for projection mapping
Technical Field
The application relates to the technical field of three-dimensional reconstruction, and in particular to a method and a device for projection mapping.
Background
In existing camera-projector three-dimensional reconstruction techniques, the poses of the camera and the projector generally must be strictly calibrated. Under the traditional triangulation principle, the relative pose must remain fixed after each calibration; once the relative position changes, pose calibration has to be performed again.
In the more recent phase-height mapping approach, several grating images with a certain phase difference are projected onto a reference plane and a measurement plane respectively, and the relative depth information from the measurement plane to the reference plane is then obtained through the phase-height mapping relation. This simplifies the operation flow compared with the traditional approach, but in usage scenes without a reference plane an accurate phase difference cannot be obtained, making the depth information hard to recover. Moreover, although the flow is simplified to some extent, it remains complex.
Existing camera-projector three-dimensional reconstruction techniques therefore suffer from complex operation flows and poor practicability.
Disclosure of Invention
An objective of the embodiments of the present application is to provide a method and an apparatus for projection mapping that simplify the operation flow of projection mapping and improve practicability.
In a first aspect, an embodiment of the present application provides a method for projection mapping, applied to a control device in a projection-mapping control system that further includes a camera and a projector. The method comprises the following steps: acquiring a grating image, i.e. an image, captured by the camera, that contains a grating pattern projected by the projector onto an object to be projected; obtaining a parallax image of the projector based on the grating image; acquiring a discrete color image, i.e. an image obtained by transforming the grating image, which contains a hole area; repairing the hole area based on the parallax image to obtain a first repair image; generating a projection image based on the first repair image and the parallax image; and controlling the projector to project the projection image onto the object to be projected.
In the embodiments of the application, compared with the prior art, on the one hand the projector projects the grating pattern, the camera captures the grating image containing that pattern, and the control device performs the subsequent image-processing flow based on the grating image. The camera and the projector can thus be assembled flexibly for each use, and a change in their relative pose does not require re-calibration, which is convenient and simplifies the operation flow to a certain extent. On the other hand, a parallax image of the projector is determined from the grating image, and the projector parallax reflects the depth gradient of the scene (i.e. depth information); based on the parallax image, depth-information-based projection mapping can be realized and three-dimensional reconstruction completed, improving practicability.
As a possible implementation, the obtaining of a parallax image of the projector based on the grating image includes: decoding the grating information in the grating image to obtain a first coordinate mapping table, which contains the correspondence between camera pixel coordinates and projector pixel coordinates; removing outliers from the first coordinate mapping table to obtain a second coordinate mapping table; obtaining a third coordinate mapping table based on the second coordinate mapping table, in which the coordinates are integers and one camera pixel coordinate corresponds to one projector pixel coordinate; obtaining a local perspective transformation matrix of the camera-projector based on the second coordinate mapping table; perspective-transforming the projector pixel coordinates in the second coordinate mapping table into the camera field of view based on the local perspective transformation matrix to obtain a fourth coordinate mapping table; differencing the matched coordinate pairs in the fourth coordinate mapping table along a preset direction to obtain a first difference image; performing difference forward normalization on the first difference image to obtain a second difference image; reverse-mapping the second difference image into the projector field of view based on the third coordinate mapping table to obtain a third difference image; and repairing the hole areas in the third difference image to obtain the parallax image of the projector.
In the embodiments of the application, a series of transformations and calculations are performed on the grating image, so the parallax image of the projector can be obtained accurately, and during its computation the gradual change of the parallax values closely follows the actual trend of the depth information.
As a possible implementation, the removing of the outliers in the first coordinate mapping table to obtain a second coordinate mapping table includes: calculating a fundamental matrix between the camera and the projector based on the RANSAC algorithm; and eliminating outliers in the first coordinate mapping table based on the fundamental matrix to obtain the second coordinate mapping table.
In the embodiments of the application, the fundamental matrix is computed with the RANSAC algorithm, and based on it the outliers in the first coordinate mapping table can be removed effectively.
As a possible implementation, the obtaining of a third coordinate mapping table based on the second coordinate mapping table includes: rounding the floating-point projector pixel coordinates in the second coordinate mapping table to integers, after which some pixel coordinates correspond to a plurality of pixel coordinates; and extracting the optimal coordinate from the plurality of pixel coordinates and determining it as the pixel coordinate corresponding to the relevant pixel coordinate.
In the embodiments of the application, combining the rounding operation with optimal-coordinate extraction maintains a one-to-one coordinate correspondence while guaranteeing integer coordinates.
As a possible implementation, the obtaining of a local perspective transformation matrix of the camera-projector based on the second coordinate mapping table includes: collecting a plurality of pixel coordinates by sliding a preset rectangular area over the grating image; and computing a local perspective transformation matrix of the camera-projector from the pixel coordinates using the RANSAC algorithm.
In this approach, a plurality of pixel coordinates are gathered by sliding the preset rectangular area, and the local perspective transformation matrix can then be computed quickly with the RANSAC algorithm.
As a possible implementation manner, before the repairing the hole area in the third difference image to obtain the parallax image of the projector, the method further includes: performing weighted median filtering on the third difference image to obtain a third difference image with noise removed; normalizing the third difference image with noise removed to obtain a normalized third difference image; smoothing and filtering the normalized third difference image to obtain a smoothed third difference image; correspondingly, the repairing the hole area in the third difference image to obtain the parallax image of the projector includes: repairing the hole area in the smoothed third difference image to obtain the parallax image of the projector.
In the embodiment of the application, based on the third difference image, a more accurate difference image is obtained through a series of image processing, and the parallax image determined based on the more accurate difference image is also more accurate.
As a possible implementation, the acquiring of a discrete color image includes: filling, for each entry of the third coordinate mapping table, the RGB value at the camera pixel coordinate into the position of the corresponding projector pixel coordinate, so as to obtain the discrete color image.
In the embodiment of the application, the color image is acquired rapidly and accurately through filling of RGB values.
As a possible implementation, the repairing of the hole area based on the parallax image to obtain a first repair image includes: determining, based on the third coordinate mapping table, a hole image corresponding to the discrete color image, in which hole-area pixels have value 255; determining a shadow-area image corresponding to the hole image, whose holes are caused by shadows; and filling the hole area in the discrete color image based on the shadow-area image and the parallax image to obtain the first repair image.
In the embodiments of the application, repairing the hole areas caused by shadows helps the projection image ultimately generated from the repaired image achieve a better projection effect.
As a possible implementation, before the generating of a projection image based on the first repair image and the parallax image, the method further includes: performing a second repair on the first repair image using the Navier-Stokes equation to obtain a twice-repaired first repair image. Correspondingly, the generating of a projection image based on the first repair image and the parallax image includes: generating the projection image based on the twice-repaired first repair image and the parallax image.
In the embodiments of the application, the first repair image is repaired a second time via the Navier-Stokes equation, so the finally generated projection image projects better.
In a second aspect, an embodiment of the present application provides an apparatus for projecting a map, which is applied to a control device in a control system for projecting a map, where the control system further includes a camera and a projector; the apparatus comprises functional modules for implementing the method of the first aspect and any one of the possible implementations of the first aspect.
In a third aspect, an embodiment of the present application provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the projection mapping method of the first aspect and any one of the possible implementations of the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a computer program which, when executed by a computer, performs a method of projecting a map as described in the first aspect and any one of the possible implementations of the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should not be considered limiting of the scope; a person skilled in the art may obtain other related drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a first implementation of a control system provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a second implementation of a control system provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of an electronic device provided in an embodiment of the present application;
FIG. 4 is a flowchart of a method of projection mapping provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a first coordinate mapping table provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a second coordinate mapping table provided in an embodiment of the present application;
FIG. 7 is a functional block diagram of an apparatus for projection mapping provided in an embodiment of the present application.
Reference numerals: 100 - control system; 110 - upper computer; 120 - projector; 130 - camera; 140 - peripheral device; 150 - data processing terminal; 200 - electronic device; 210 - memory; 220 - communication module; 230 - bus; 240 - processor; 400 - apparatus for projection mapping; 410 - acquisition module; 420 - first processing module; 430 - second processing module; 440 - third processing module; 450 - control module.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. The specific method of operation in the method embodiment may also be applied to the device embodiment or the system embodiment. In the description of the present application, unless otherwise indicated, "at least one" includes one or more. "plurality" means two or more. For example, at least one of A, B and C, includes: a alone, B alone, a and B together, a and C together, B and C together, and A, B and C together. In the present application, "/" means or, for example, A/B may represent A or B; "and/or" herein is merely an association relationship describing an association object, and means that three relationships may exist, for example, a and/or B may mean: a exists alone, A and B exist together, and B exists alone.
Before describing the method for projecting the map provided in the embodiment of the present application, a description is given of a hardware environment to which the method for projecting the map is applied.
Referring to fig. 1, which is a schematic diagram of a first implementation of a control system 100 according to an embodiment of the present application, the control system 100 includes: a host computer 110, a projector 120, and a camera 130. The upper computer 110 is connected to the projector 120, and the upper computer 110 is connected to the camera 130.
The upper computer 110 may be a conventional desktop computer, a notebook computer, an industrial personal computer, etc., and may be directly connected to the projector 120 and the camera 130. The projector 120 may be connected via HDMI (High-Definition Multimedia Interface). The camera 130 may be connected via USB (Universal Serial Bus), Ethernet, MIPI (Mobile Industry Processor Interface), or the like, and may be a 4K camera.
In the embodiment of the present application, the upper computer 110 may be used as a control device in the control system 100, and on one hand, the upper computer 110 may control the image projected by the projector 120 and control the camera 130 to collect the image. On the other hand, the method for projecting the map provided by the embodiment of the application can also be performed.
In fig. 1, an object to be projected whose position is related to the positions of the projector 120 and the camera 130 is also included. In the embodiment of the present application, the projector 120 is used to project an image onto an object to be projected, and thus, the object to be projected may be within the projection field of view of the projector 120. The camera 130 is used for capturing an image corresponding to the object to be projected, and therefore, the object to be projected is also required to be within the image capturing range of the camera 130.
The object to be projected refers to the object to be mapped and also the object to be sampled.
Referring to fig. 2, which is a schematic diagram of a second implementation of the control system 100 according to the embodiment of the present application, the control system 100 includes: projector 120, camera 130, peripheral device 140, data processing terminal 150. Wherein the peripheral device 140 is connected to a data processing terminal 150, the data processing terminal 150 is connected to the projector 120 and the camera 130, and the control devices in the system comprise the peripheral device 140 and the data processing terminal 150.
The peripheral device 140 and the data processing terminal 150 may be a conventional desktop, notebook, industrial computer, etc., and the data processing terminal 150 is directly connected to the projector 120 and the camera 130 in the same manner as in the first embodiment.
This embodiment differs from the first in that the upper computer 110 is replaced by the peripheral device 140 and the data processing terminal 150, so the method performed by the upper computer 110 in the first embodiment is carried out jointly by the peripheral device 140 and the data processing terminal 150. Specifically, the data processing terminal 150 directly controls the projector 120 and the camera 130 and acquires their data, operating according to instructions issued by the peripheral device 140. That is, in the second embodiment, the peripheral device 140 issues the corresponding instructions, and the data processing terminal 150 performs the image processing or control.
Based on the two implementations shown in fig. 1 and fig. 2, the projection mapping method of the embodiments of the present application is applied to the control device of the control system 100: it may run on the upper computer 110, or on the peripheral device 140 together with the data processing terminal 150.
Referring to fig. 3, the electronic device 200 includes: memory 210, communication module 220, bus 230, and processor 240. Wherein the processor 240, the communication module 220 and the memory 210 are connected by a bus 230.
In the embodiment of the present application, the electronic device 200 may be the host computer 110, the peripheral device 140, or the data processing terminal 150.
In the embodiment of the present application, the memory 210 stores a program required for implementing the method for projecting a map provided in the embodiment of the present application.
Memory 210 may include, but is not limited to, RAM (random-access memory), ROM (read-only memory), PROM (programmable ROM), EPROM (erasable programmable ROM), EEPROM (electrically erasable programmable ROM), etc.
Bus 230 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. Buses may be divided into address buses, data buses, control buses, etc. For ease of illustration, only one bi-directional arrow is shown in FIG. 3, but this does not mean there is only one bus or one type of bus.
The processor 240 is configured to execute executable modules, such as computer programs, stored in the memory 210. The methods and steps disclosed in the embodiments of the present application may be applied to, or implemented by, the processor 240. After the processor 240 receives an execution instruction and invokes the program stored in the memory 210 through the bus 230, it controls the communication module 220 through the bus 230 to run the method flow of projection mapping.
Processor 240 may be an integrated circuit chip with signal-processing capability. It may be a general-purpose processor, including a CPU (central processing unit), an NP (network processor), etc.; it may also be a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, capable of implementing or executing the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or any conventional processor.
The components and structures of the electronic device 200 shown in fig. 3 are exemplary only and not limiting, as the electronic device 200 may have other components and structures as desired.
Based on the above description of the hardware running environment and application scenario, please refer to fig. 4, a flowchart of the method of projection mapping. The method includes: step 310, step 320, step 330, step 340, step 350, and step 360.
Step 310: and acquiring a grating image. The raster image is an image acquired by the camera 130 and including a raster pattern projected by the projector 120 on an object to be projected.
Step 320: based on the raster image, a parallax image of the projector 120 is obtained.
Step 330: a discrete color image is acquired. The discrete color image is an image obtained by transforming the grating image, and the discrete color image comprises a hole area.
Step 340: and repairing the hole area based on the parallax image to obtain a first repairing image.
Step 350: a projection image is generated based on the first repair image and the parallax image.
Step 360: the projector 120 is controlled to project a projection image onto an object to be projected.
In the embodiments of the present application, compared with the prior art, on the one hand, the projector 120 projects the grating pattern, the camera 130 captures the grating image containing the grating pattern, and the control device performs the subsequent image-processing flow based on the grating image. The camera 130 and the projector 120 can be assembled flexibly at each use, and pose calibration does not need to be redone after the relative pose changes, which is convenient and simplifies the operation flow to a certain extent. On the other hand, the parallax image of the projector 120 is determined from the grating image, and the projector 120 parallax can reflect the scene depth gradient (i.e. depth information); based on the parallax image, depth-information-based projection mapping can be realized, three-dimensional reconstruction completed, and practicability improved.
If the control system 100 adopts the first implementation, steps 310-360 are all executed by the upper computer 110. If it adopts the second implementation, steps 310-350 may be performed by the data processing terminal 150, and step 360 by the peripheral device 140 or by the data processing terminal 150. For ease of understanding, the following description takes the control device as the execution body.
In step 310, the raster image is an image that includes a raster pattern projected by projector 120 on the object to be projected. As an alternative embodiment, the control device controls the projector 120 to project the grating pattern onto the object to be projected, and then controls the camera 130 to collect an image of the object to be projected, where the image includes the projected grating pattern.
The projected grating pattern may be a pattern corresponding to a phase-shift plus Gray-code grating, a type of grating well known in the art that is not described in detail here.
In the embodiments of the present application, before the camera 130 is used (i.e. before images are captured), the intrinsic parameters of the camera 130 may be calibrated so as to correct its lens distortion.
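The patent does not spell out the calibration procedure; the following is a minimal sketch of the usual checkerboard-based intrinsic calibration with OpenCV, in which the board size, file names, and variable names are illustrative assumptions:

    import cv2
    import numpy as np

    # Hypothetical checkerboard with 9x6 inner corners; units are arbitrary.
    pattern = (9, 6)
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_pts, img_pts = [], []
    for path in ["calib_00.png", "calib_01.png", "calib_02.png"]:  # illustrative names
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)

    # Intrinsic matrix K and distortion coefficients, then undistort later captures.
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
    corrected = cv2.undistort(cv2.imread("grating.png"), K, dist)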
In step 320, the parallax image of the projector 120 can be understood as information characterizing the depth information (depth gradient) of the scene from the projector's viewpoint.
As an alternative embodiment, step 320 includes: decoding the grating information in the grating image to obtain a first coordinate mapping table, which contains the correspondence between the camera 130 pixel coordinates and the projector 120 pixel coordinates; removing the outliers in the first coordinate mapping table to obtain a second coordinate mapping table; obtaining a third coordinate mapping table based on the second coordinate mapping table, in which the coordinates are integers and one camera 130 pixel coordinate corresponds to one projector 120 pixel coordinate; obtaining a local perspective transformation matrix of the camera 130-projector 120 based on the second coordinate mapping table; perspective-transforming the projector 120 pixel coordinates in the second coordinate mapping table into the camera 130 field of view based on the local perspective transformation matrix to obtain a fourth coordinate mapping table; differencing the matched coordinate pairs in the fourth coordinate mapping table along a preset direction to obtain a first difference image; performing difference forward normalization on the first difference image to obtain a second difference image; reverse-mapping the second difference image into the projector 120 field of view based on the third coordinate mapping table to obtain a third difference image; and repairing the hole areas in the third difference image to obtain the parallax image of the projector 120.
The steps in this embodiment will be described in order.
Decoding the grating information in the grating image can be understood as decoding the grating image to obtain the camera 130 pixel coordinates and the projector 120 pixel coordinates, together with their correspondence, i.e. the coordinate mapping between the camera 130 and the projector 120. For ease of understanding, please refer to fig. 5, a schematic diagram of the first coordinate mapping table; in fig. 5, P1-P6 are projector 120 coordinates and C1-C6 are camera 130 coordinates. The coordinates in the first coordinate mapping table are the initial coordinates; because the camera 130 and the projector 120 differ in resolution and the decoded phase converts directly into floating-point coordinates, the first coordinate mapping table is a sub-pixel-level mapping table.
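For intuition, the sketch below decodes the projector x coordinate with a standard 4-step phase shift unwrapped by a Gray-code period index; the function name and argument layout are assumptions, not the patent's exact decoding scheme:

    import numpy as np

    def decode_projector_x(I, period_index, n_periods, proj_width):
        # I: four camera images of sinusoidal patterns shifted by pi/2 each,
        # shape (4, H, W), float. period_index: per-pixel fringe period number
        # recovered from the Gray-code images, shape (H, W), int.
        wrapped = np.mod(np.arctan2(I[3] - I[1], I[0] - I[2]), 2 * np.pi)
        unwrapped = period_index * 2 * np.pi + wrapped
        # Scale the absolute phase to a floating-point (sub-pixel) projector x.
        return unwrapped / (n_periods * 2 * np.pi) * proj_width

Running this per axis gives, for every camera pixel, a floating-point projector coordinate, i.e. one entry of the first coordinate mapping table.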
The outliers therefore need to be removed from the first coordinate mapping table. As an alternative embodiment, the removal process includes: calculating a fundamental matrix between the camera 130 and the projector 120 based on the RANSAC (random sample consensus) algorithm; and eliminating the outliers in the first coordinate mapping table based on the fundamental matrix to obtain the second coordinate mapping table.
The RANSAC algorithm is an algorithm well known in the art, and therefore, the application and specific implementation of the algorithm will not be described in detail herein.
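In OpenCV, this step can be sketched as follows, where cam_pts and proj_pts are illustrative names for the matched point arrays of the first coordinate mapping table and the RANSAC threshold is an assumed value:

    import cv2
    import numpy as np

    # cam_pts, proj_pts: Nx2 float32 arrays of matched pixel coordinates.
    F, mask = cv2.findFundamentalMat(cam_pts, proj_pts, cv2.FM_RANSAC,
                                     ransacReprojThreshold=1.0, confidence=0.99)
    inliers = mask.ravel() == 1
    # Keeping only the inliers yields the second coordinate mapping table.
    cam_pts2, proj_pts2 = cam_pts[inliers], proj_pts[inliers]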
As shown in fig. 6, relative to the first coordinate mapping table of fig. 5, the coordinate pairs P2-C2 and P5-C5 have been eliminated; fig. 6 is thus a schematic diagram of the second coordinate mapping table.
On the basis of the second coordinate mapping table, obtaining the third coordinate mapping table, as an optional implementation, includes: rounding the floating-point projector 120 pixel coordinates in the second coordinate mapping table to integers, after which some pixel coordinates correspond to a plurality of pixel coordinates; and extracting the optimal coordinate from the plurality of pixel coordinates and determining it as the pixel coordinate corresponding to the relevant pixel coordinate.
In this embodiment, the rounding operation converts the projector 120 pixel coordinates from floating point to integer according to the second coordinate mapping table. After rounding, some pixel coordinates may correspond to several pixel coordinates; for these, the optimal coordinate can be extracted using the RGB (red/green/blue) color values and the phase Euclidean distance, e.g. determining the coordinate whose color value and phase Euclidean distance meet preset conditions as the optimal coordinate. After extraction, the optimal coordinate is taken as the pixel coordinate corresponding to the relevant pixel coordinate, restoring a one-to-one correspondence among the rounded coordinates and yielding the third coordinate mapping table.
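A minimal sketch of the rounding and collision resolution, where the cost combining the RGB and phase Euclidean distances is passed in precomputed (its exact weighting is not specified in the patent and is an assumption here):

    import numpy as np

    def build_third_table(proj_xy, cam_xy, cost):
        # proj_xy: Nx2 floating-point projector coords; cam_xy: Nx2 camera
        # coords; cost: per-match score, lower is better (illustrative).
        best = {}
        for p, c, s in zip(np.rint(proj_xy).astype(int), cam_xy, cost):
            key = (int(p[0]), int(p[1]))
            if key not in best or s < best[key][1]:
                best[key] = (tuple(c), s)  # keep only the optimal candidate
        # One camera pixel per integer projector pixel: the third mapping table.
        return {k: v[0] for k, v in best.items()}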
On the basis of the second coordinate mapping table, a fourth coordinate mapping table can be obtained in addition to the third coordinate mapping table; the coordinates in the fourth coordinate mapping table are the coordinates after perspective transformation. As an alternative embodiment, the process of obtaining the local perspective transformation matrix includes: collecting a plurality of pixel coordinates by sliding a preset rectangular area over the grating image; and obtaining a local perspective transformation matrix of the camera-projector from the plurality of pixel coordinates using the RANSAC algorithm.
The preset rectangular area (which may be referred to as a kernel window) is a rectangle of preset size whose interior consists of the image pixels it covers; for example, a 3×3 kernel window contains 9 pixels. The window slides over the whole image area with a certain step length, covering different pixels at each step.
In practical application, a kernel window of a certain size is set first, and the pixel coordinates of a single sub-region of the grating image are extracted through the sliding window. Then the optimal perspective transformation matrix of each kernel window is computed iteratively with a RANSAC random 8-point method, and the optimal local perspective transformation matrices of the individual kernel windows are stored in a list for later use.
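A sketch of the per-window estimation; the window size, step, and RANSAC threshold are assumed values, and cv2.findHomography stands in for the iterative optimal-matrix search described above:

    import cv2
    import numpy as np

    def local_homographies(cam_pts, proj_pts, img_h, img_w, win=64, step=32):
        # For each kernel window over the camera image, fit one local
        # projector-to-camera perspective matrix from the matches inside it.
        H_list = []
        for y in range(0, img_h - win + 1, step):
            for x in range(0, img_w - win + 1, step):
                inside = ((cam_pts[:, 0] >= x) & (cam_pts[:, 0] < x + win) &
                          (cam_pts[:, 1] >= y) & (cam_pts[:, 1] < y + win))
                if inside.sum() >= 8:  # enough points for RANSAC sampling
                    H, _ = cv2.findHomography(proj_pts[inside], cam_pts[inside],
                                              cv2.RANSAC, 2.0)
                    H_list.append(((x, y), H))  # stored in a list for later calls
        return H_list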
Correspondingly, the projector 120 pixel coordinates are perspective-transformed into the camera 130 field of view using the local perspective transformation matrices together with the second coordinate mapping table, so as to obtain the fourth coordinate mapping table. As an alternative embodiment, the transformation relation may be:
    [x', y', z']ᵀ = A(i, j) · [u, v, w]ᵀ

where [x', y', z'] are the new projector 120 coordinates transformed into the camera 130 field of view, [u, v, w] are the original projector 120 field-of-view coordinates (homogeneous form), and A(i, j) denotes the local perspective transformation matrix.
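Applying one such local matrix to a batch of projector coordinates can be sketched with cv2.perspectiveTransform, which performs exactly this homogeneous multiply-and-divide (selection of the correct local matrix per point is omitted):

    import cv2
    import numpy as np

    # proj_pts: Nx2 projector coordinates covered by the local 3x3 matrix H.
    pts = proj_pts.reshape(-1, 1, 2).astype(np.float64)
    mapped = cv2.perspectiveTransform(pts, H).reshape(-1, 2)
    # Pairing mapped with the camera coordinates gives the fourth mapping table.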
On the basis of the fourth coordinate mapping table, the matched coordinate pairs are differenced along a preset direction, which may be the x direction, giving the corresponding first difference image. For example, for the matched coordinate pairs {P(x_i, y_i) → C(x_i, y_i)} shown in fig. 6, the difference is taken along the x direction.
Difference forward normalization is then performed on the basis of the first difference image, giving the second difference image. As an alternative embodiment, the process includes: dividing the pixel values of the first difference image into two groups by sign, using a clustering algorithm to find the upper and lower thresholds of the positive and negative regions, TH_hi (upper threshold) and TH_low (lower threshold), eliminating abnormal data in the first difference image with these thresholds, and normalizing to a positive interval to obtain the second difference image (which can be understood as a forward difference image). The difference forward normalization can be written as: D1(i, j) = D0(i, j) - TH_low, for TH_low ≤ D0(i, j) ≤ TH_hi, where D1(i, j) denotes the second difference image and D0(i, j) the first difference image.
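A sketch of the forward normalization; simple percentiles stand in here for the patent's clustering-based thresholds TH_hi and TH_low:

    import numpy as np

    def forward_normalize(D0):
        pos, neg = D0[D0 > 0], D0[D0 < 0]
        th_hi = np.percentile(pos, 99) if pos.size else 0.0   # stand-in for TH_hi
        th_low = np.percentile(neg, 1) if neg.size else 0.0   # stand-in for TH_low
        # Clipping stands in for eliminating abnormal data; then shift the
        # values into the positive interval: D1 = D0 - TH_low.
        return np.clip(D0, th_low, th_hi) - th_low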
The second difference image is then reverse-mapped to the projector 120 field of view on the basis of the third coordinate mapping table, giving the third difference image. The reverse mapping can be written as: D2(i, j) = D1(m, n), {M2[C(m, n)] → M2[P(i, j)]}, where D2(i, j) denotes the third difference image in the projector 120 field of view, D1(m, n) the second difference image in the camera 130 field of view, M2[C(m, n)] the camera 130 field-of-view coordinates in the third coordinate mapping table, and M2[P(i, j)] the corresponding transformed projector 120 field-of-view coordinates.
On the basis of the third difference image, an algorithm of curvature diffusion and edge reconstruction can be adopted to repair a hole area in the third difference image, so as to obtain a parallax image of the projector 120.
As an alternative embodiment, further processing may be performed on the basis of the third difference image, where the further processing includes: carrying out weighted median filtering on the third difference image to obtain a third difference image with noise removed; normalizing the third difference image with noise removed to obtain a normalized third difference image; and carrying out smoothing filtering on the normalized third difference image to obtain a smoothed third difference image.
Here, normalization means that each pixel value in the third difference image is normalized to the interval 0-255. After smoothing filtering, the depth-data surface of the image tends to be smooth.
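These three steps can be sketched as follows; a plain median blur stands in for the weighted median filter (cv2.ximgproc.weightedMedianFilter from opencv-contrib is the closer match), and the kernel sizes are assumptions:

    import cv2
    import numpy as np

    den = cv2.medianBlur(D2.astype(np.float32), 5)               # noise removal
    norm = cv2.normalize(den, None, 0, 255, cv2.NORM_MINMAX)     # to 0-255
    smooth = cv2.GaussianBlur(norm.astype(np.uint8), (5, 5), 0)  # smooth depth surface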
Correspondingly, during repair, hole repair can be performed on the smoothed third difference image, and the repaired image is the final parallax image.
After the parallax image is obtained in step 320, step 330 is performed to acquire the discrete color image. In connection with the embodiment of step 320, acquisition of the discrete color image can be completed during the processing performed in step 320. As an alternative embodiment, step 330 includes: filling, for each entry of the third coordinate mapping table, the RGB value at the camera 130 pixel coordinate into the position of the corresponding projector 120 pixel coordinate, so as to obtain the discrete color image. The image obtained this way is filled with RGB values but still contains hole areas, so the hole areas are repaired next, i.e. step 340 is performed.
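A sketch of the fill, with table being the third coordinate mapping table represented as a dict from integer projector pixels to camera pixels (an assumed representation):

    import numpy as np

    color = np.zeros((proj_h, proj_w, 3), np.uint8)  # projector-resolution canvas
    for (xp, yp), (xc, yc) in table.items():
        color[yp, xp] = cam_rgb[yc, xc]              # scatter camera RGB values
    # Pixels never written remain zero; they form the hole areas of the
    # discrete color image.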
As an alternative embodiment, step 340 includes: determining a hole image corresponding to the discrete color image based on the third coordinate mapping table; in the hole image, the pixel value of the hole area is 255; determining a shadow area image corresponding to the hole image; holes in the shadow area image are caused by shadows; and filling the hole areas in the discrete color images based on the shadow area images and the parallax images to obtain a first repair image.
In practical application, the hole-area coordinate image of the discrete color image (call it H0) is first extracted according to the third coordinate mapping table; hole-area pixels in H0 have value 255 and valid-area pixels 0. The hole image H0 is then partitioned by connected domains, connected-domain thresholds (area and distribution density of coordinate points) are set, and the hole domains H1 caused by shadows are extracted according to the defined thresholds. Finally, based on the shadow-area image H1 and the parallax image obtained in step 320, the pixel points of the parallax image are traversed, the horizontal/vertical boundary parallax values of each connected region are determined, the filling direction is selected according to the result, and neighboring pixel values are filled into the hole region. The decision process is as follows:
    • If D_h = |V_left - V_right| ≥ TH_d, D_v = |V_up - V_down| ≥ TH_d and D_h ≥ D_v: take the pixel value at the middle position on the smaller-valued side of V_left and V_right as the filling value.
    • If D_h ≥ TH_d, D_v ≥ TH_d and D_h < D_v: take the pixel value at the middle position on the smaller-valued side of V_up and V_down as the filling value.
    • If D_h ≥ TH_d and D_v < TH_d: take the pixel value at the middle position on the smaller-valued side of V_left and V_right as the filling value.
    • If D_h < TH_d and D_v ≥ TH_d: take the pixel value at the middle position on the smaller-valued side of V_up and V_down as the filling value.
    • If D_h < TH_d and D_v < TH_d: leave the point unprocessed and keep it as a hole point, obtaining the hole image H2.

where D_h and D_v are the absolute values of the parallax depth differences of the boundary points in the horizontal and vertical directions respectively, V_left, V_right, V_up and V_down are the parallax values at the boundary, and TH_d is the directional parallax depth difference threshold.
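The decision logic for a single hole region can be sketched as below; interpreting "the pixel value at the middle position on the smaller-valued side" simply as the smaller boundary parallax is a simplifying assumption:

    def fill_value(v_left, v_right, v_up, v_down, th_d):
        d_h, d_v = abs(v_left - v_right), abs(v_up - v_down)
        if d_h < th_d and d_v < th_d:
            return None                    # keep as a hole point (image H2)
        if d_v >= th_d and (d_h < th_d or d_h < d_v):
            return min(v_up, v_down)       # fill from the vertical direction
        return min(v_left, v_right)        # fill from the horizontal direction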
On the basis of the hole image H2, the first repair image can also be repaired again; the process may include: performing a second repair on the first repair image using the Navier-Stokes equation, obtaining the twice-repaired first repair image. The Navier-Stokes equation and the corresponding repair method are well known in the art and are not described in detail here.
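OpenCV exposes Navier-Stokes-based inpainting directly; a minimal sketch, where hole_mask (nonzero at the remaining holes, e.g. derived from H2) and the radius are assumptions:

    import cv2

    # repaired1: the first repair image; hole_mask: 8-bit single-channel mask.
    repaired2 = cv2.inpaint(repaired1, hole_mask, 3, cv2.INPAINT_NS)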
Correspondingly, in step 350, the projection image can be generated based on the twice-repaired first repair image (i.e. the mapped color image in the projector 120 field of view) and the parallax image.
When generating the projection image, rendering and editing can be completed based on the mapped color image and the parallax image, e.g. special-effect filling (overlay), so as to generate the projection image.
In step 360, the control device controls the projector 120 to project a projection image onto the object to be projected, completing the three-dimensional projection map.
With the projection mapping method provided in the embodiments of the application, the camera 130 and the projector 120 can be assembled flexibly at each use, and pose calibration does not need to be redone after the relative pose changes, which is convenient. The three-dimensional projection mapping effect can be achieved well (i.e. the projection effect improved) from the obtained virtual parallax image, and during its computation the gradual change of the parallax values follows the actual trend of the depth information.
Based on the same inventive concept, please refer to fig. 7, an apparatus 400 for projecting a map is further provided in an embodiment of the present application, including: the acquisition module 410, the first processing module 420, the second processing module 430, the third processing module 440, and the control module 450. The apparatus 400 for projecting a map corresponds to the method for projecting a map described in the foregoing embodiment, and is applied to a control device, for example: applied to the upper computer 110.
An acquisition module 410, configured to acquire a raster image; the grating image is an image acquired by the camera 130 and comprising a grating pattern projected by the projector 120 on an object to be projected; a first processing module 420 for obtaining a parallax image of the projector 120 based on the raster image; an acquisition module 410, further configured to acquire a discrete color image; the discrete color image is an image obtained by transforming the grating image, and comprises a hole area; the second processing module 430 is configured to repair the hole area based on the parallax image, to obtain a first repair image; a third processing module 440 for generating a projection image based on the first repair image and the parallax image; a control module 450, configured to control the projector 120 to project the projection image onto the object to be projected.
In the embodiment of the present application, the first processing module 420 is specifically configured to: decoding grating information in the grating image to obtain a first coordinate mapping table; the first coordinate mapping table comprises a corresponding relation between camera pixel coordinates and projector pixel coordinates; removing outliers in the first coordinate mapping table to obtain a second coordinate mapping table; acquiring a third coordinate mapping table based on the second coordinate mapping table; the coordinates in the third coordinate mapping table are integers, and one camera pixel coordinate corresponds to one projector pixel coordinate; acquiring a local perspective transformation matrix of the camera-projector based on the second coordinate mapping table; based on the local perspective transformation matrix, perspective transforming the projector pixel coordinates in the second coordinate mapping table to a camera view to obtain a fourth coordinate mapping table; performing difference on the matched coordinate pairs in the fourth coordinate mapping table in a preset direction to obtain a first difference image; performing differential forward normalization processing on the first differential image to obtain a second differential image; reversely mapping the second difference image to a projector field of view based on the third coordinate mapping table to obtain a third difference image; and repairing the hole area in the third difference image to obtain a parallax image of the projector 120.
In the embodiments of the present application, the first processing module 420 is specifically further configured to: calculate a fundamental matrix between the camera 130 and the projector 120 based on the RANSAC algorithm; and eliminate outliers in the first coordinate mapping table based on the fundamental matrix to obtain the second coordinate mapping table.
In the embodiments of the present application, the first processing module 420 is specifically further configured to: round the floating-point projector 120 pixel coordinates in the second coordinate mapping table to integers, after which some pixel coordinates correspond to a plurality of pixel coordinates; and extract the optimal coordinate from the plurality of pixel coordinates and determine it as the pixel coordinate corresponding to the relevant pixel coordinate.
In the embodiments of the present application, the first processing module 420 is specifically further configured to: collect a plurality of pixel coordinates by sliding a preset rectangular area over the grating image; and determine a local perspective transformation matrix of the camera 130-projector 120 from the plurality of pixel coordinates using the RANSAC algorithm.
In the embodiment of the present application, the first processing module 420 is further configured to: performing weighted median filtering on the third difference image to obtain a third difference image with noise removed; normalizing the third difference image with noise removed to obtain a normalized third difference image; smoothing and filtering the normalized third difference image to obtain a smoothed third difference image; the method is particularly used for: and repairing the hole area in the smoothed third difference image to obtain a parallax image of the projector 120.
In the embodiments of the present application, the acquisition module 410 is specifically configured to: fill, for each entry of the third coordinate mapping table, the RGB value at the camera 130 pixel coordinate into the position of the corresponding projector 120 pixel coordinate, so as to obtain the discrete color image.
In the embodiment of the present application, the second processing module 430 is specifically configured to: determining a hole image corresponding to the discrete color image based on the third coordinate mapping table; in the hole image, the pixel value of the hole area is 255; determining a shadow area image corresponding to the hole image; holes in the shadow area image are caused by shadows; and filling a hole area in the discrete color image based on the shadow area image and the parallax image to obtain a first repair image.
In the embodiments of the present application, the second processing module 430 is further configured to: perform a second repair on the first repair image using the Navier-Stokes equation to obtain a twice-repaired first repair image; and is specifically configured to: generate the projection image based on the twice-repaired first repair image and the parallax image.
The embodiment of the application also provides a storage medium, on which one or more programs are stored, where the one or more programs may be executed by one or more processors to implement the method for projecting a map in the embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
Further, the units described as separate units may or may not be physically separate, and units displayed as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
Furthermore, functional modules in various embodiments of the present application may be integrated together to form a single portion, or each module may exist alone, or two or more modules may be integrated to form a single portion.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application, and various modifications and variations may be suggested to one skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application.

Claims (8)

1. A method of projection mapping, characterized in that it is applied to a control device in a projection-mapping control system, the control system further comprising a camera and a projector; the method comprises the following steps:
acquiring a grating image; the grating image is an image which is acquired by the camera and contains a grating pattern projected by the projector on an object to be projected;
obtaining a parallax image of the projector based on the grating image;
acquiring a discrete color image; the discrete color image is an image obtained by transforming the grating image, and comprises a hole area;
repairing the hole area based on the parallax image to obtain a first repair image;
generating a projection image based on the first repair image and the parallax image;
controlling the projector to project the projection image on the object to be projected;
the obtaining a parallax image of the projector based on the grating image includes:
decoding grating information in the grating image to obtain a first coordinate mapping table; the first coordinate mapping table comprises a corresponding relation between camera pixel coordinates and projector pixel coordinates;
removing outliers in the first coordinate mapping table to obtain a second coordinate mapping table;
acquiring a third coordinate mapping table based on the second coordinate mapping table; the coordinates in the third coordinate mapping table are integers, and one camera pixel coordinate corresponds to one projector pixel coordinate;
acquiring a local perspective transformation matrix of the camera-projector based on the second coordinate mapping table;
based on the local perspective transformation matrix, perspective transforming the projector pixel coordinates in the second coordinate mapping table to a camera view to obtain a fourth coordinate mapping table;
performing difference on the matched coordinate pairs in the fourth coordinate mapping table in a preset direction to obtain a first difference image;
performing difference forward normalization processing on the first difference image to obtain a second difference image;
reversely mapping the second difference image to a projector field of view based on the third coordinate mapping table to obtain a third difference image;
repairing the hole area in the third difference image to obtain a parallax image of the projector;
the removing the outliers in the first coordinate mapping table to obtain a second coordinate mapping table includes:
calculating a fundamental matrix between the camera and the projector based on a RANSAC algorithm;
and eliminating outliers in the first coordinate mapping table based on the fundamental matrix to obtain the second coordinate mapping table.
2. The method of claim 1, wherein the obtaining a third coordinate mapping table based on the second coordinate mapping table comprises:
rounding the floating-point projector pixel coordinates in the second coordinate mapping table to integers; after the rounding operation, some pixel coordinates correspond to a plurality of pixel coordinates;
and extracting optimal coordinates from the plurality of pixel coordinates, and determining the optimal coordinates as pixel coordinates corresponding to the corresponding pixel coordinates.
3. The method of claim 1, wherein the obtaining a local perspective transformation matrix of a camera-projector based on the second coordinate mapping table comprises:
collecting a plurality of pixel coordinates by sliding a preset rectangular area over the grating image;
based on the pixel coordinates, a RANSAC algorithm is adopted to calculate a local perspective transformation matrix of the camera-projector.
4. The method of claim 1, wherein before the repairing of the hole area in the third difference image to obtain the parallax image of the projector, the method further comprises:
performing weighted median filtering on the third difference image to obtain a denoised third difference image;
normalizing the denoised third difference image to obtain a normalized third difference image; and
smoothing the normalized third difference image to obtain a smoothed third difference image;
and correspondingly, the repairing of the hole area in the third difference image to obtain the parallax image of the projector comprises:
repairing the hole area in the smoothed third difference image to obtain the parallax image of the projector.
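A possible realization of claim 4's pre-filtering chain uses OpenCV's ximgproc module (shipped in opencv-contrib-python). The filter radius, normalization range, and kernel size below are assumptions, as the patent gives no parameter values, and the guidance image is likewise an assumed input:

```python
import numpy as np
import cv2

def clean_difference_image(diff: np.ndarray, guide: np.ndarray) -> np.ndarray:
    """Denoise, normalize, and smooth the third difference image.

    diff:  single-channel float32 difference image in the projector view.
    guide: 8-bit guidance image (e.g. a captured grating image) that
           steers the edge-aware weighted median filter.
    """
    # Weighted median filtering removes noise while respecting edges.
    denoised = cv2.ximgproc.weightedMedianFilter(guide, diff, r=7)
    # Rescale to the [0, 1] range.
    normalized = cv2.normalize(denoised, None, 0.0, 1.0, cv2.NORM_MINMAX)
    # A light Gaussian pass smooths residual artifacts.
    return cv2.GaussianBlur(normalized, (5, 5), 0)
```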
5. The method of claim 1, wherein the acquiring of the discrete color image comprises:
filling the RGB values at the camera pixel coordinate positions in the third coordinate mapping table into the corresponding projector pixel coordinate positions to obtain the discrete color image.
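Claim 5 amounts to splatting camera colors into the projector frame. A minimal sketch, reusing the hypothetical projector-to-camera table from the quantization example above:

```python
import numpy as np

def build_discrete_color_image(mapping: dict, camera_image: np.ndarray,
                               proj_shape: tuple) -> np.ndarray:
    """Fill camera colors into the projector frame.

    mapping: projector (x, y) -> camera (x, y), integer coordinates
             (the third coordinate mapping table).
    Projector pixels with no correspondence stay zero: these zero-valued
    pixels form the hole areas that later repair steps fill in.
    """
    h, w = proj_shape
    color = np.zeros((h, w, 3), dtype=camera_image.dtype)
    for (px, py), (cx, cy) in mapping.items():
        color[py, px] = camera_image[cy, cx]
    return color
```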
6. The method of claim 1, wherein the repairing of the hole area based on the parallax image to obtain the first repair image comprises:
determining a hole image corresponding to the discrete color image based on the third coordinate mapping table, wherein in the hole image the pixel value of the hole area is 255;
determining a shadow area image corresponding to the hole image, wherein the holes in the shadow area image are caused by shadows; and
filling the hole area in the discrete color image based on the shadow area image and the parallax image to obtain the first repair image.
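The hole image of claim 6 follows directly from the mapping table; the sketch below constructs it under the same assumed table layout. The subsequent fill rule (combining the shadow area image and the parallax image) is not fully specified by the claim, so only the mask construction is shown:

```python
import numpy as np

def hole_mask(mapping: dict, proj_shape: tuple) -> np.ndarray:
    """Build the hole image: 255 where no camera pixel maps to the
    projector pixel, 0 elsewhere."""
    mask = np.full(proj_shape, 255, dtype=np.uint8)
    for (px, py) in mapping:
        mask[py, px] = 0
    return mask
```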
7. The method of claim 1, wherein before the generating of the projection image based on the first repair image and the parallax image, the method further comprises:
performing a secondary repair of the first repair image using the Navier-Stokes equations to obtain a secondarily repaired first repair image;
and correspondingly, the generating of the projection image based on the first repair image and the parallax image comprises:
generating the projection image based on the secondarily repaired first repair image and the parallax image.
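OpenCV exposes Navier-Stokes-based inpainting directly, so a minimal sketch of claim 7's secondary repair is short; the radius value is an assumption:

```python
import numpy as np
import cv2

def secondary_repair(repair_image: np.ndarray, hole_mask: np.ndarray,
                     radius: int = 3) -> np.ndarray:
    """Second-pass restoration via Navier-Stokes inpainting.

    repair_image: 8-bit color image produced by the first repair.
    hole_mask:    8-bit mask, 255 where pixels still need filling.
    """
    return cv2.inpaint(repair_image, hole_mask, radius, cv2.INPAINT_NS)
```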
8. An apparatus for projection mapping, applied to a control device in a control system for projection mapping, the control system further comprising a camera and a projector, the apparatus comprising:
an acquisition module for acquiring a grating image, wherein the grating image is an image, captured by the camera, that contains a grating pattern projected by the projector onto an object to be projected;
a first processing module for obtaining a parallax image of the projector based on the grating image;
the acquisition module being further configured to acquire a discrete color image, wherein the discrete color image is obtained by transforming the grating image and comprises a hole area;
a second processing module for repairing the hole area based on the parallax image to obtain a first repair image;
a third processing module for generating a projection image based on the first repair image and the parallax image; and
a control module for controlling the projector to project the projection image onto the object to be projected;
wherein the first processing module is specifically configured to: decode the grating information in the grating image to obtain a first coordinate mapping table, the first coordinate mapping table comprising correspondences between camera pixel coordinates and projector pixel coordinates; remove outliers from the first coordinate mapping table to obtain a second coordinate mapping table; obtain a third coordinate mapping table based on the second coordinate mapping table, the coordinates in the third coordinate mapping table being integers, with each camera pixel coordinate corresponding to a single projector pixel coordinate; obtain a local camera-projector perspective transformation matrix based on the second coordinate mapping table; perspective-transform the projector pixel coordinates in the second coordinate mapping table into the camera view based on the local perspective transformation matrix to obtain a fourth coordinate mapping table; difference the matched coordinate pairs in the fourth coordinate mapping table along a preset direction to obtain a first difference image; apply forward normalization to the first difference image to obtain a second difference image; inversely map the second difference image into the projector field of view based on the third coordinate mapping table to obtain a third difference image; and repair the hole area in the third difference image to obtain the parallax image of the projector;
and wherein the first processing module is further configured to compute a fundamental matrix between the camera and the projector using a RANSAC algorithm, and to remove outliers from the first coordinate mapping table based on the fundamental matrix to obtain the second coordinate mapping table.
CN202011481546.5A 2020-12-14 2020-12-14 Method and device for projecting mapping Active CN112614190B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011481546.5A CN112614190B (en) 2020-12-14 2020-12-14 Method and device for projecting mapping

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011481546.5A CN112614190B (en) 2020-12-14 2020-12-14 Method and device for projecting mapping

Publications (2)

Publication Number Publication Date
CN112614190A CN112614190A (en) 2021-04-06
CN112614190B true CN112614190B (en) 2023-06-06

Family

ID=75239391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011481546.5A Active CN112614190B (en) 2020-12-14 2020-12-14 Method and device for projecting mapping

Country Status (1)

Country Link
CN (1) CN112614190B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113393480B (en) * 2021-06-09 2023-01-06 华南理工大学 Method for projecting notes in real time based on book positions

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130095920A1 (en) * 2011-10-13 2013-04-18 Microsoft Corporation Generating free viewpoint video using stereo imaging

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915559A (en) * 2012-08-22 2013-02-06 北京航空航天大学 Real-time transparent object GPU (graphic processing unit) parallel generating method based on three-dimensional point cloud
CN103292733A (en) * 2013-05-27 2013-09-11 华中科技大学 Corresponding point searching method based on phase shift and trifocal tensor
CN106705855A (en) * 2017-03-10 2017-05-24 东南大学 High-dynamic performance three-dimensional measurement method based on adaptive grating projection
CN110111262A (en) * 2019-03-29 2019-08-09 北京小鸟听听科技有限公司 A kind of projector distortion correction method, device and projector

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Digital Protection System for Intangible Cultural Heritage Based on MAR; Hou Shouming et al.; Journal of System Simulation; Vol. 33, No. 6; pp. 1334-1341 *

Also Published As

Publication number Publication date
CN112614190A (en) 2021-04-06

Similar Documents

Publication Publication Date Title
CN109840881B (en) 3D special effect image generation method, device and equipment
US8902229B2 (en) Method and system for rendering three dimensional views of a scene
CN111311482B (en) Background blurring method and device, terminal equipment and storage medium
US9589359B2 (en) Structured stereo
CN108694719B (en) Image output method and device
US11776202B2 (en) Image processing method and apparatus, computer storage medium, and electronic device
CN111583381B (en) Game resource map rendering method and device and electronic equipment
JP2019204193A (en) Image processing device, image processing method, and program
CN111275824A (en) Surface reconstruction for interactive augmented reality
CN109214996A (en) A kind of image processing method and device
CN112614190B (en) Method and device for projecting mapping
CN113506305B (en) Image enhancement method, semantic segmentation method and device for three-dimensional point cloud data
CN107357422B (en) Camera-projection interactive touch control method, device and computer readable storage medium
CN113379815A (en) Three-dimensional reconstruction method and device based on RGB camera and laser sensor and server
TWI595446B (en) Method for improving occluded edge quality in augmented reality based on depth camera
JP6579659B2 (en) Light source estimation apparatus and program
Suk et al. Fixed homography–based real‐time sw/hw image stitching engine for motor vehicles
JP2021174406A (en) Depth map super-resolution device, depth map super-resolution method, and depth map super-resolution program
CN111553969A (en) Texture mapping method, medium, terminal and device based on gradient domain
CN116778095B (en) Three-dimensional reconstruction method based on artificial intelligence
CN111899181A (en) Method and device for removing shadow in image
CN115330803B (en) Surface defect data enhancement method and device, electronic equipment and storage medium
JP7218445B2 (en) Upscaling device, upscaling method, and upscaling program
CN113570518B (en) Image correction method, system, computer equipment and storage medium
Gao et al. A Low-Complexity End-to-End Stereo Matching Pipeline From Raw Bayer Pattern Images to Disparity Maps

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant