CN111583117A - Rapid panoramic stitching method and device suitable for space complex environment - Google Patents

Rapid panoramic stitching method and device suitable for space complex environment

Info

Publication number
CN111583117A
CN111583117A
Authority
CN
China
Prior art keywords
cameras
image
calibration
camera
panoramic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010390004.0A
Other languages
Chinese (zh)
Inventor
唐明乐
袁杰
刘柯健
徐起
方彩婷
赵静
伍伟
王琰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Spaceflight Institute of TT&C and Telecommunication
Original Assignee
Shanghai Spaceflight Institute of TT&C and Telecommunication
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Spaceflight Institute of TT&C and Telecommunication filed Critical Shanghai Spaceflight Institute of TT&C and Telecommunication
Priority to CN202010390004.0A priority Critical patent/CN111583117A/en
Publication of CN111583117A publication Critical patent/CN111583117A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G06T3/12
    • G06T3/153
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/60 Rotation of a whole image or part thereof
    • G06T3/604 Rotation of a whole image or part thereof using a CORDIC [COordinate Rotation DIgital Computer] device
    • G06T5/70
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/32 Indexing scheme for image data processing or generation, in general involving image mosaicing

Abstract

The application provides a rapid panoramic stitching technique suitable for complex space environments. The method comprises at least the following steps: S101, performing a camera calibration test and extracting corner points from calibration images captured within the effective field of view; S102, obtaining the intrinsic parameters of each camera by monocular calibration; S103, obtaining the extrinsic parameters between adjacent cameras by binocular calibration; S104, performing spherical inverse projection to form a panoramic plane image with a 360°×180° field of view, selecting one camera as the base reference coordinate system, converting the image coordinates of the other three cameras into it, and indexing the image pixels in this common coordinate system; S105, eliminating stitching seams with a fade-in/fade-out algorithm and smoothing large-parallax pixel regions; and S106, packing the pixel indexes of the four images corresponding to the panoramic image and the fusion coefficients of the seam regions into a lookup table (LUT) for use by the FPGA.

Description

Rapid panoramic stitching method and device suitable for space complex environment
Technical Field
The invention relates to the technical field of video image synthesis, in particular to a rapid panoramic stitching method and device suitable for a space complex environment.
Background
Panoramic vision refers to acquiring, in a single shot, all visual information in a three-dimensional space covering at least a hemispherical field of view (360°×180°). Obtaining panoramic vision requires a dedicated vision sensor system. Panoramic vision is the most direct way to acquire three-dimensional spatial information and is widely used in many fields; it is particularly important in fields and industries where decisions are made from visual information, such as civil, military and aerospace applications.
Current methods for acquiring panoramic visual images include: (1) an ordinary vision sensor on a rotating pan-tilt head, in which the limited field of view of the sensor is extended by rotating the head; (2) compound-eye technology plus image stitching, in which multiple vision sensors simultaneously acquire images from different viewing angles and the images are then stitched seamlessly; (3) fisheye imaging, in which a fisheye lens observes a nearly hemispherical field of view in one shot; such a lens is purpose-built from several lens groups according to the fisheye imaging principle, so its imaging model is complex and its price relatively high; and (4) a convex mirror combined with an ordinary vision sensor.
The prior art suffers from excessive cost, unstable imaging in complex space environments, and unsatisfactory stitching results.
Disclosure of Invention
To address the defects of the prior art, the embodiments of the present application provide a rapid panoramic stitching method suitable for complex space environments. The technical scheme is as follows:
A rapid panoramic stitching method suitable for complex space environments comprises at least the following steps:
S101, performing a camera calibration test and extracting corner points from calibration images captured within the effective field of view;
S102, obtaining the intrinsic parameters of each camera by monocular calibration;
S103, obtaining the extrinsic parameters between adjacent cameras by binocular calibration;
S104, performing spherical inverse projection to form a panoramic plane image with a 360°×180° field of view, selecting one camera as the base reference coordinate system, converting the image coordinates of the other cameras into it, and indexing the image pixels in this common coordinate system;
S105, eliminating stitching seams with a fade-in/fade-out algorithm and smoothing large-parallax pixel regions;
and S106, packing the pixel indexes of the four images corresponding to the panoramic image and the fusion coefficients of the seam regions into a lookup table (LUT) for use by a field-programmable gate array (FPGA).
In one possible implementation, step S102 comprises:
extracting the coordinates of the corner points on a black-and-white checkerboard calibration board by template matching, and computing the intrinsic parameters and distortion coefficients of the fisheye camera.
In one possible implementation, step S103 comprises:
performing binocular calibration with each pair of adjacent fisheye cameras to obtain the extrinsic camera parameters, namely a rotation matrix and a translation matrix, thereby obtaining the rotation matrix between every two adjacent cameras.
In one possible implementation, step S104 comprises:
arranging the cameras horizontally at equal intervals around the device, with the optical centers of all cameras at the center of the circle.
In one possible implementation, step S105 comprises:
eliminating the stitching seams with a fade-in/fade-out fusion algorithm, and smoothing nearby large-parallax pixels.
In one possible implementation, the pixel index of the panoramic image is obtained by spherical equidistant inverse projection, giving the coordinates of the corresponding pixel in the captured image; the fusion coefficients are computed by the fade-in/fade-out fusion algorithm.
The invention also provides a rapid panoramic stitching device suitable for complex space environments, comprising:
a calibration module, which performs the camera calibration test and extracts corner points from calibration images captured within the effective field of view;
a first parameter acquisition module, which obtains the intrinsic parameters of each camera by monocular calibration;
a second parameter acquisition module, which obtains the extrinsic parameters between adjacent cameras by binocular calibration;
a coordinate conversion module, which performs spherical inverse projection to form a panoramic plane image with a 360°×180° field of view, selects one camera as the base reference coordinate system, converts the image coordinates of the other cameras into it, and indexes the image pixels in this common coordinate system;
a pixel stitching module, which eliminates stitching seams by fade-in/fade-out fusion and smooths large-parallax pixel regions;
and an LUT generation module, which packs the pixel indexes of the four images corresponding to the panoramic image and the fusion coefficients of the seam regions into an LUT for use by the FPGA.
In one possible implementation, the first parameter acquisition module is configured to: extract the coordinates of the corner points on a black-and-white checkerboard calibration board by template matching, and compute the intrinsic parameters and distortion coefficients of the fisheye camera.
In one possible implementation, the second parameter acquisition module is configured to:
perform binocular calibration with each pair of adjacent fisheye cameras to obtain the extrinsic camera parameters, namely a rotation matrix and a translation matrix, thereby obtaining the rotation matrix between every two adjacent cameras.
In one possible implementation, the plurality of cameras horizontally surround the device at equal intervals, with the optical centers of all cameras at the center of the circle.
The beneficial effects of the technical scheme provided by the embodiments of the present application include at least the following:
1. The invention realizes a panoramic camera composed of multiple fisheye cameras, with the panoramic stitching algorithm implemented on FPGA hardware; the overall structure is simple and the engineering cost is low.
2. The invention achieves real-time, high-resolution 360° panoramic imaging with small stitching error and stable imaging, and is suitable for complex space environments.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below represent only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic view of a panoramic image stitching process of a multi-channel fisheye camera according to an exemplary embodiment of the present application;
fig. 2 is a view of a panoramic camera provided in an exemplary embodiment of the present application.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will help those skilled in the art to further understand the invention, but they do not limit the invention in any way. It should be noted that persons skilled in the art can make variations and modifications without departing from the spirit of the invention, and all such variations fall within the scope of the present invention.
With reference to Fig. 1, the invention provides a rapid panoramic stitching method suitable for complex space environments, comprising at least the following steps: S101, performing a camera calibration test and extracting corner points from calibration images captured within the effective field of view; S102, obtaining the intrinsic parameters of each camera by monocular calibration; S103, obtaining the extrinsic parameters between adjacent cameras by binocular calibration; S104, performing spherical inverse projection to form a panoramic plane image with a 360°×180° field of view, selecting one camera as the base reference coordinate system, converting the image coordinates of the other cameras into it, and indexing the image pixels in this common coordinate system; S105, eliminating stitching seams by fade-in/fade-out fusion and smoothing large-parallax pixel regions; and S106, packing the pixel indexes of the four images corresponding to the panoramic image and the fusion coefficients of the seam regions into an LUT for use by the FPGA.
Illustratively, to acquire high-resolution images with a large field of view and large depth of field, multiple fisheye cameras are used to rapidly stitch panoramic images covering a 360°×180° field of view. The horizontal and vertical viewing angles of a single fisheye camera are equal, both exceeding 120°, and its imaging characteristic is spherical. Because the imaging background in the space environment is featureless, stitching by feature matching is error-prone and unstable; the method therefore adopts field-of-view stitching based on a 360° surround arrangement of multiple fisheye cameras, with real-time panoramic stitching implemented on FPGA hardware. The whole stitching process can be divided into five parts:
1) Monocular calibration
The corner coordinates on the black-and-white checkerboard calibration board are extracted by template matching to sub-pixel accuracy, and the intrinsic parameters and distortion coefficients of the fisheye camera are computed according to Zhang Zhengyou's calibration method.
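The template-matching step can be illustrated with a toy example. The following is a minimal sketch under stated assumptions: a synthetic checkerboard and a hand-made 2×2 corner template, with all names and sizes hypothetical. A real implementation would additionally refine the result to the sub-pixel accuracy the patent requires (for example by fitting a surface around the integer-pixel optimum).

```python
# Toy illustration of corner localisation by template matching.
# Everything here is a hypothetical sketch, not the patent's implementation.

def checkerboard(n, cell):
    """Synthetic black/white checkerboard: (n*cell) x (n*cell) pixels, 0 or 255."""
    size = n * cell
    return [[255 * (((x // cell) + (y // cell)) % 2) for x in range(size)]
            for y in range(size)]

def match_score(img, tpl, top, left):
    """Sum of squared differences between the template and an image patch."""
    s = 0
    for ty, row in enumerate(tpl):
        for tx, v in enumerate(row):
            d = img[top + ty][left + tx] - v
            s += d * d
    return s

def find_corner(img, tpl):
    """Integer-pixel position (row, col) minimising the match score."""
    th, tw = len(tpl), len(tpl[0])
    positions = [(top, left)
                 for top in range(len(img) - th + 1)
                 for left in range(len(img[0]) - tw + 1)]
    return min(positions, key=lambda p: match_score(img, tpl, p[0], p[1]))

img = checkerboard(2, 4)        # one interior corner where the four cells meet
tpl = [[0, 255], [255, 0]]      # ideal 2x2 neighbourhood of that corner
corner = find_corner(img, tpl)  # top-left of the best-matching 2x2 window
```

On this 8×8 board the unique zero-error match is the 2×2 window whose top-left pixel is (3, 3), i.e. the window straddling the cell junction.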
2) Binocular calibration
Binocular calibration is performed with each pair of adjacent fisheye cameras, yielding the extrinsic camera parameters, namely a rotation matrix and a translation matrix. For cameras distributed horizontally at equal intervals, the translation can be neglected, so the rotation matrix between every two adjacent cameras can be solved.
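For the ideal geometry of evenly spaced cameras, the pairwise rotations compose around the ring, which gives a simple sanity check on the calibration result. A minimal numeric sketch; the 90° spacing and the choice of the vertical axis are assumptions for illustration, since the actual rotations come from binocular calibration:

```python
import math

def rot_z(deg):
    """3x3 rotation matrix about the vertical axis by `deg` degrees."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

R_adj = rot_z(90.0)    # ideal rotation between one pair of adjacent cameras
R_loop = R_adj
for _ in range(3):     # chain all four pairwise rotations around the ring
    R_loop = matmul(R_loop, R_adj)
# R_loop should be numerically the identity: four 90-degree steps close the loop
```

In practice the calibrated pairwise rotations will not compose exactly to the identity; the residual of this loop-closure product is one way to gauge calibration quality.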
3) Spherical equidistant inverse projection
The panoramic stitching technique designed by the invention is based on a spherical imaging model: multiple cameras surround the device horizontally at equal intervals over 360°, with their optical centers required to coincide at the center of the circle. In this way, the 360°×180° hemispherical field of view is unfolded onto a plane; that is, the unfolded image is a panoramic image in which the (for example 90°) field-of-view portions of the images captured by the cameras are arranged side by side and joined.
The spherical equidistant inverse projection that reverse-indexes pixels on the fisheye images proceeds as follows:
a. the spherical image is unfolded onto a plane; the panoramic image must cover a 360° horizontal and 180° vertical viewing angle in equal proportion, so its aspect ratio is 2:1;
b. the world coordinate system is converted into the ideal camera plane coordinate system;
c. fisheye distortion is added, converting the ideal imaging plane coordinate system into the actual imaging plane coordinate system;
d. the actual imaging plane is converted into the fisheye projection, i.e. the pixel plane coordinate system.
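Steps a to d can be sketched numerically for the ideal, distortion-free case (so step c is omitted). This is a hedged sketch assuming a pure equidistant model r = f·θ with a hypothetical focal length and principal point; the patent's actual model also applies fisheye distortion terms between the ideal and actual imaging planes:

```python
import math

def panorama_to_ray(u, v, pano_w, pano_h):
    """Equirectangular pixel -> unit viewing ray. The panorama spans 360 deg
    horizontally and 180 deg vertically, hence the 2:1 aspect ratio."""
    lon = (u / pano_w) * 2.0 * math.pi - math.pi    # longitude in [-pi, pi)
    lat = math.pi / 2.0 - (v / pano_h) * math.pi    # latitude in [-pi/2, pi/2]
    return (math.cos(lat) * math.sin(lon),          # x
            math.sin(lat),                          # y (up)
            math.cos(lat) * math.cos(lon))          # z (optical axis)

def ray_to_fisheye(x, y, z, f, cx, cy):
    """Ideal equidistant fisheye model r = f * theta, where theta is the
    angle between the ray and the optical axis (+z)."""
    theta = math.acos(max(-1.0, min(1.0, z)))
    phi = math.atan2(y, x)
    r = f * theta
    return cx + r * math.cos(phi), cy + r * math.sin(phi)

# The centre of the panorama looks straight down the reference camera's
# optical axis, so it must map to the principal point (cx, cy).
px, py = ray_to_fisheye(*panorama_to_ray(1000, 500, 2000, 1000),
                        300.0, 640.0, 640.0)
```

Evaluating this chain once per panorama pixel is exactly the reverse-indexing computation whose results are later frozen into the LUT.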
Because the panoramic image is assembled from images captured by multiple fisheye cameras, after spherical inverse projection the images are converted into the same coordinate system according to their adjacency, i.e. using the rotation obtained by binocular calibration, which completes the preliminary stitching of the panoramic image.
4) Stitching-seam fusion
After the processing in part 3), the adjacent fisheye images are placed side by side and 360° stitching of the images is achieved. In practice, however, a distinct seam remains between adjacent images, so the last step of panoramic image generation, image fusion at the stitching seam, must be performed. Because the implementation is FPGA-based, fade-in/fade-out fusion is chosen as the most efficient method.
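The fade-in/fade-out fusion amounts to a linear cross-fade of weights across the overlap region. A minimal one-row sketch; the overlap width and gray levels are made up for illustration:

```python
def feather_blend(left_row, right_row):
    """Fade-in/fade-out fusion across an overlap: the left image's weight
    falls linearly from 1 to 0 while the right image's rises from 0 to 1."""
    n = len(left_row)
    out = []
    for i in range(n):
        w = 1.0 - i / (n - 1)                       # left-image weight
        out.append(w * left_row[i] + (1.0 - w) * right_row[i])
    return out

# A 5-pixel overlap between constant gray levels 100 and 200 blends into a
# smooth ramp instead of a hard seam.
blended = feather_blend([100] * 5, [200] * 5)
```

The per-column weights w and 1 - w are exactly the fusion coefficients that later go into the LUT, which is why this scheme is cheap on an FPGA: each output pixel needs only two multiplies and an add.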
5) Generating the LUT
The LUT consists of the panoramic image pixel index (i.e. the position coordinates of the pixels) and the fusion coefficients at the stitching seams. The pixel index of the panoramic image gives, via spherical equidistant inverse projection, the coordinates of the corresponding pixel in the captured image. The fusion coefficients are computed by the fade-in/fade-out fusion algorithm.
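The LUT described here can be modelled as one record per panorama pixel. A hypothetical sketch: the field names and the sector-based camera assignment below are illustrative only, since the real entries come from the calibration and spherical inverse projection the patent describes:

```python
from typing import NamedTuple

class LutEntry(NamedTuple):
    cam: int        # which of the four cameras supplies this pixel
    src_x: int      # column in that camera's image
    src_y: int      # row in that camera's image
    weight: float   # fusion coefficient (1.0 outside seam regions)

def build_lut_row(pano_w, num_cams=4):
    """Toy LUT for one panorama row: each camera owns one equal sector."""
    sector = pano_w // num_cams
    return [LutEntry(u // sector, u % sector, 0, 1.0) for u in range(pano_w)]

row = build_lut_row(8)
# The FPGA then renders a panorama pixel with a single table read:
#   out[u] = weight * camera_image[cam][src_y][src_x]   (plus the seam blend)
```

Precomputing everything offline means the FPGA does no trigonometry at runtime, only memory lookups and fixed-point multiply-adds, which is what makes real-time 360° stitching feasible.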
In summary, the LUT required for panoramic stitching is generated offline by calibrating the cameras and computing the stitching and fusion of the images; the multiple camera streams are then processed in parallel on FPGA hardware to generate the 360° panoramic image in real time.
With this method, a panoramic camera composed of multiple fisheye cameras is realized, the panoramic stitching algorithm is completed by FPGA hardware, the overall structure is simple, and the engineering cost is low.
With reference to Fig. 2, this embodiment is a panoramic camera of a certain type mounted outside the cabin of a space station. The panoramic camera is a high-definition imaging device with on-board compression, used to capture clear images outside the cabin in real time.
As shown in Fig. 2, the panoramic camera consists of four fisheye cameras arranged around 360°, with the optical axes of adjacent cameras 90° apart and a horizontal effective field of view of 120° each, so the overlap between adjacent cameras satisfies the image stitching condition. Monocular calibration is performed for each of the four cameras, and binocular calibration between adjacent cameras, yielding the intrinsic and extrinsic camera parameters. During spherical equidistant inverse projection, the four camera images to be stitched are cropped about the optical center to obtain four 90° field-of-view images; one camera is selected as the base reference coordinate system and the other three images are converted into it, realizing 360° panoramic stitching. Finally the stitching seams are fused by fade-in/fade-out and eliminated, giving the final panoramic image.
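The embodiment's numbers can be checked directly: with four cameras spaced 90° apart and a 120° effective horizontal field of view each, every adjacent pair shares a 30° overlap for the seam. A trivial sketch of that check:

```python
def seam_overlap_deg(num_cams, fov_deg):
    """Angular overlap between adjacent cameras spaced uniformly on a circle."""
    spacing = 360.0 / num_cams
    return fov_deg - spacing

overlap = seam_overlap_deg(4, 120.0)   # degrees shared by each adjacent pair
# A positive overlap is the "image stitching condition" the embodiment cites:
# with no overlap there would be nothing to feather across at the seams.
```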
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
It should be understood that "a plurality" herein means two or more. "And/or" describes an association between objects and indicates three possible relationships; for example, "A and/or B" can mean: A alone, both A and B, or B alone. The character "/" generally indicates an "or" relationship between the preceding and following objects.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A rapid panoramic stitching method suitable for complex space environments, characterized in that it comprises at least the following steps:
S101, performing a camera calibration test and extracting corner points from calibration images captured within the effective field of view;
S102, obtaining the intrinsic parameters of each camera by monocular calibration;
S103, obtaining the extrinsic parameters between adjacent cameras by binocular calibration;
S104, performing spherical inverse projection to form a panoramic plane image with a 360°×180° field of view, selecting one camera as the base reference coordinate system, converting the image coordinates of the other cameras into it, and indexing the image pixels in this common coordinate system;
S105, eliminating stitching seams with a fade-in/fade-out algorithm and smoothing large-parallax pixel regions;
and S106, packing the pixel indexes of the four images corresponding to the panoramic image and the fusion coefficients of the seam regions into a lookup table (LUT) for use by the FPGA.
2. The method according to claim 1, wherein step S105 comprises:
eliminating the stitching seams with a fade-in/fade-out fusion algorithm, and smoothing nearby large-parallax pixels.
3. The method according to claim 1, wherein step S102 comprises:
extracting the coordinates of the corner points on a black-and-white checkerboard calibration board by template matching, and computing the intrinsic parameters and distortion coefficients of the fisheye camera.
4. The method according to claim 1, wherein step S103 comprises:
performing binocular calibration with each pair of adjacent fisheye cameras to obtain the extrinsic camera parameters, namely a rotation matrix and a translation matrix, thereby obtaining the rotation matrix between every two adjacent cameras.
5. The method according to claim 1, wherein step S104 comprises:
arranging the cameras horizontally at equal intervals around the device, with the optical centers of all cameras at the center of the circle.
6. The method according to claim 1, wherein the pixel index of the panoramic image is obtained by spherical equidistant inverse projection, giving the coordinates of the corresponding pixel in the captured image, and the fusion coefficients are computed by the fade-in/fade-out fusion algorithm.
7. A rapid panoramic stitching device suitable for complex space environments, characterized in that the device comprises:
a calibration module, which performs the camera calibration test and extracts corner points from calibration images captured within the effective field of view;
a first parameter acquisition module, which obtains the intrinsic parameters of each camera by monocular calibration;
a second parameter acquisition module, which obtains the extrinsic parameters between adjacent cameras by binocular calibration;
a coordinate conversion module, which performs spherical inverse projection to form a panoramic plane image with a 360°×180° field of view, selects one camera as the base reference coordinate system, converts the image coordinates of the other cameras into it, and indexes the image pixels in this common coordinate system;
a pixel stitching module, which eliminates stitching seams by fade-in/fade-out fusion and smooths large-parallax pixel regions;
and an LUT generation module, which packs the pixel indexes of the four images corresponding to the panoramic image and the fusion coefficients of the seam regions into an LUT for use by the FPGA.
8. The device according to claim 7, wherein the first parameter acquisition module is configured to: extract the coordinates of the corner points on a black-and-white checkerboard calibration board by template matching, and compute the intrinsic parameters and distortion coefficients of the fisheye camera.
9. The device according to claim 7, wherein the second parameter acquisition module is configured to:
perform binocular calibration with each pair of adjacent fisheye cameras to obtain the extrinsic camera parameters, namely a rotation matrix and a translation matrix, thereby obtaining the rotation matrix between every two adjacent cameras.
10. The device according to claim 7, wherein the plurality of cameras horizontally surround the device at equal intervals, with the optical centers of all cameras at the center of the circle.
CN202010390004.0A 2020-05-09 2020-05-09 Rapid panoramic stitching method and device suitable for space complex environment Pending CN111583117A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010390004.0A CN111583117A (en) 2020-05-09 2020-05-09 Rapid panoramic stitching method and device suitable for space complex environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010390004.0A CN111583117A (en) 2020-05-09 2020-05-09 Rapid panoramic stitching method and device suitable for space complex environment

Publications (1)

Publication Number Publication Date
CN111583117A true CN111583117A (en) 2020-08-25

Family

ID=72126455

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010390004.0A Pending CN111583117A (en) 2020-05-09 2020-05-09 Rapid panoramic stitching method and device suitable for space complex environment

Country Status (1)

Country Link
CN (1) CN111583117A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114115365A (en) * 2021-10-29 2022-03-01 中国科学院合肥物质科学研究院 Sun tracking system and method based on mobile unstable platform
CN116681732A (en) * 2023-08-03 2023-09-01 南昌工程学院 Target motion recognition method and system based on compound eye morphological vision

Citations (3)

Publication number Priority date Publication date Assignee Title
CN106875339A (en) * 2017-02-22 2017-06-20 长沙全度影像科技有限公司 A kind of fish eye images joining method based on strip scaling board
CN108040205A (en) * 2017-12-08 2018-05-15 中国科学院长春光学精密机械与物理研究所 A kind of satellite VR panoramic imaging devices for space imaging
CN108171759A (en) * 2018-01-26 2018-06-15 上海小蚁科技有限公司 The scaling method of double fish eye lens panorama cameras and device, storage medium, terminal

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN106875339A (en) * 2017-02-22 2017-06-20 长沙全度影像科技有限公司 A kind of fish eye images joining method based on strip scaling board
CN108040205A (en) * 2017-12-08 2018-05-15 中国科学院长春光学精密机械与物理研究所 A kind of satellite VR panoramic imaging devices for space imaging
CN108171759A (en) * 2018-01-26 2018-06-15 上海小蚁科技有限公司 The scaling method of double fish eye lens panorama cameras and device, storage medium, terminal

Non-Patent Citations (3)

Title
Zhu Xiuchang, "Digital Image Processing and Image Communication", China Railway Publishing House, 31 December 2011, pages 38-39 *
Lu Xiaoyan, "Research on a Parking Assistance System Based on Panoramic Stitching", no. 01, pages 035-502 *
Ma Yongfeng, "Virtual Reality Technology and Applications", China Railway Publishing House, pages 38-39 *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN114115365A (en) * 2021-10-29 2022-03-01 中国科学院合肥物质科学研究院 Sun tracking system and method based on mobile unstable platform
CN116681732A (en) * 2023-08-03 2023-09-01 南昌工程学院 Target motion recognition method and system based on compound eye morphological vision
CN116681732B (en) * 2023-08-03 2023-10-20 南昌工程学院 Target motion recognition method and system based on compound eye morphological vision

Similar Documents

Publication Publication Date Title
CN111062873B (en) Parallax image splicing and visualization method based on multiple pairs of binocular cameras
TWI555378B (en) An image calibration, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN110782394A (en) Panoramic video rapid splicing method and system
US7429997B2 (en) System and method for spherical stereoscopic photographing
US5594845A (en) Method and device for processing an image in order to construct a target image from a plurality of contiguous source images
CN111028155B (en) Parallax image splicing method based on multiple pairs of binocular cameras
JP4825971B2 (en) Distance calculation device, distance calculation method, structure analysis device, and structure analysis method.
CN103971375B (en) A kind of panorama based on image mosaic stares camera space scaling method
CN108122191A (en) Fish eye images are spliced into the method and device of panoramic picture and panoramic video
KR20020056895A (en) Fast digital pan tilt zoom video
CN109325981B (en) Geometric parameter calibration method for micro-lens array type optical field camera based on focusing image points
JP2014520337A (en) 3D image synthesizing apparatus and method for visualizing vehicle periphery
CN101938599A (en) Method for generating interactive dynamic panoramic image
JP2007192832A (en) Calibrating method of fish eye camera
US10057487B1 (en) Panoramic imaging systems based on normal-lens cameras
WO2020235110A1 (en) Calibration device, chart for calibration, and calibration method
EP1702475A2 (en) Multi-dimensional imaging apparatus, systems, and methods
Jung et al. Flexibly connectable light field system for free view exploration
CN111583117A (en) Rapid panoramic stitching method and device suitable for space complex environment
CN111009030A (en) Multi-view high-resolution texture image and binocular three-dimensional point cloud mapping method
CN106023073A (en) Image splicing system
CN104113747A (en) Image acquisition and pseudo 3D display system based on binocular vision
CN111064945B (en) Naked eye 3D image acquisition and generation method
KR20200013800A (en) A multicamera imaging system
CN112040140A (en) Wide-view-field high-resolution hybrid imaging device based on light field

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination