WO2024002023A1 - Method and apparatus for generating a panoramic stereoscopic image, and electronic device - Google Patents


Info

Publication number
WO2024002023A1
WO2024002023A1 (PCT/CN2023/102498)
Authority
WO
WIPO (PCT)
Prior art keywords
image
eye
longitude
latitude
splicing
Prior art date
Application number
PCT/CN2023/102498
Other languages
English (en)
Chinese (zh)
Inventor
王果
姜文杰
蔡锦霖
Original Assignee
影石创新科技股份有限公司
Priority date
Filing date
Publication date
Application filed by 影石创新科技股份有限公司
Publication of WO2024002023A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H04N 13/156: Mixing image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/265: Mixing

Definitions

  • Embodiments of the present invention relate to the field of image processing, and in particular, to a method, device and electronic device for generating a panoramic stereoscopic image.
  • Panoramic stereoscopic images can record 360° scenes, giving users a strong sense of realism and three-dimensionality; they therefore have wide application prospects in many industries such as travel, live broadcasting, film and television, and real estate.
  • embodiments of the present invention provide a method, device and electronic device for generating a panoramic stereoscopic image, which are used to eliminate faults or ghosting phenomena in the splicing area of the panoramic stereoscopic image, so as to reduce the picture distortion of the panoramic stereoscopic image.
  • embodiments of the present invention provide a method for generating a panoramic stereoscopic image.
  • the method includes: generating a left eye longitude and latitude image and a right eye longitude and latitude image corresponding to each fisheye image based on multiple acquired fisheye images; extracting and fusing the first splicing areas between the multiple left eye longitude and latitude images to generate a left eye panoramic image; extracting and fusing the second splicing areas between the multiple right eye longitude and latitude images to generate a right eye panoramic image; and
  • generating a panoramic stereoscopic image based on the left eye panoramic image and the right eye panoramic image.
  • generating a left eye longitude and latitude image and a right eye longitude and latitude image corresponding to each fisheye image based on the acquired multiple fisheye images includes:
  • a left eye view splicing block area is divided on the set left eye blank longitude and latitude map;
  • in the left eye view splicing block area, the fisheye image is expanded into a longitude and latitude map according to the projection relationship between the fisheye image and the longitude and latitude map, generating the left eye longitude and latitude image corresponding to the fisheye image;
  • a right eye view splicing block area is divided on the set right eye blank longitude and latitude map;
  • in the right eye view splicing block area, the fisheye image is expanded into a longitude and latitude map according to the projection relationship between the fisheye image and the longitude and latitude map, generating the right eye longitude and latitude image corresponding to the fisheye image.
  • the left-eye view splicing block area is divided on the set left-eye blank longitude and latitude map, including:
  • the center point of the fisheye image is projected onto the left eye blank longitude and latitude map to form a projection of the center point, and the meridian where the projection of the center point lies is defined as the center line of the left eye blank longitude and latitude map;
  • the meridian a set angle value to the left of the center line of the left eye blank longitude and latitude map is used as the left eye viewing angle center line;
  • in the left eye blank longitude and latitude map, taking the left eye viewing angle center line as the center line of the field of view, the image area with the set field of view angle is taken as the left eye view splicing block area.
  • the right eye perspective splicing block area is divided on the set right eye blank longitude and latitude map, including:
  • the meridian a set angle value to the right of the center line of the right eye blank longitude and latitude map is used as the right eye viewing angle center line;
  • in the right eye blank longitude and latitude map, taking the right eye viewing angle center line as the center line of the field of view, the image area with the set field of view angle is taken as the right eye view splicing block area.
  • performing image fusion on each of the first splicing areas to generate a left-eye perspective panoramic image includes:
  • the images of each first splicing area are fused to generate the left eye panoramic image.
  • performing image fusion on each of the second splicing areas to generate a right-eye perspective panoramic image includes:
  • the images of each second splicing area are fused to generate the right eye panoramic image.
  • the left-eye viewable picture and the right-eye viewable picture are synchronized to generate the panoramic stereoscopic image.
  • embodiments of the present invention provide a device for generating a panoramic stereoscopic image, including:
  • a fisheye image acquisition module, used to acquire multiple fisheye images;
  • the longitude and latitude image generation module is used to generate the left eye longitude and latitude image and the right eye longitude and latitude image corresponding to each fisheye image based on the multiple acquired fisheye images;
  • embodiments of the present invention provide an electronic device.
  • the electronic device includes a plurality of fisheye lenses for acquiring fisheye images.
  • the electronic device further includes a memory and a processor; the memory is used to store information including program instructions, and the processor is used to control the execution of the program instructions; when the program instructions are loaded and executed by the processor, the steps of the panoramic stereoscopic image generation method of the first aspect, or of any possible implementation of the first aspect, are implemented.
  • the left eye longitude and latitude image and the right eye longitude and latitude image corresponding to each fisheye image are generated; the first splicing areas between the multiple left eye longitude and latitude images are extracted and fused into a left eye panoramic image, and the second splicing areas between the multiple right eye longitude and latitude images are extracted and fused into a right eye panoramic image;
  • the left eye panoramic image and the right eye panoramic image are then used to generate a panoramic stereoscopic image. This can eliminate the faults or ghosting phenomena in the splicing area of the panoramic stereoscopic image and reduce its picture distortion, thus improving the realism of the panoramic stereoscopic image.
  • Figures 3a to 3b are schematic diagrams of generating a left eye latitude and longitude image according to an embodiment of the present invention
  • Figure 6 is a schematic diagram of a presentation method of a panoramic stereoscopic image generated in an embodiment of the present invention.
  • the number of fisheye lenses in the electronic device can be set according to product design needs, as long as the combined field of view of all fisheye lenses covers 360° (that is, n*FOV exceeds 360°, where n is the number of lenses and FOV is the field of view of each lens).
  • for example, the number of fisheye lenses can be 6 or 8.
  • electronic equipment using 6 or 8 fisheye lenses makes the difference between n*FOV and 360° larger, which is conducive to seamless splicing during the generation of panoramic stereoscopic images.
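A quick arithmetic sketch of the lens-count trade-off above (the helper name is hypothetical, not from the patent): with n evenly spaced lenses, adjacent splicing-block centerlines are 360/n degrees apart, so a block FOV of block_fov overlaps its neighbor by block_fov - 360/n degrees.

```python
def overlap_fov(n_lenses: int, block_fov: float) -> float:
    """Overlap (degrees) between adjacent splicing blocks.

    With n evenly spaced lenses the block centerlines sit 360/n degrees
    apart, so each block shares block_fov - 360/n degrees with its neighbor.
    """
    return block_fov - 360.0 / n_lenses

# Matches the examples later in the text: with 6 lenses, 80-degree blocks
# give 20 degrees of overlap and 100-degree blocks give 40 degrees.
print(overlap_fov(6, 80.0))   # 20.0
print(overlap_fov(6, 100.0))  # 40.0
```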
  • the electronic device includes but is not limited to a panoramic camera, and the electronic device can be applied to robots, vehicles, drones, etc.
  • Figure 1 is a flow chart of a method for generating a panoramic stereoscopic image provided by an embodiment of the present invention. As shown in Figure 1, the method includes:
  • Step 10 Based on the acquired multiple fisheye images, generate a left eye longitude and latitude image and a right eye longitude and latitude image corresponding to each fisheye image.
  • Each fisheye lens in the electronic device can capture one fisheye image, so the electronic device can acquire multiple fisheye images through the multiple fisheye lenses it is equipped with.
  • the 6 fisheye lenses are fisheye lens No. 1 to fisheye lens No. 6 respectively.
  • the multiple fisheye images acquired include 6 fisheye images; this step then needs to generate the left eye longitude and latitude image and the right eye longitude and latitude image corresponding to each of the 6 fisheye images.
  • the projection relationship between the fisheye image and the latitude and longitude map can be established based on the calibration parameters of the fisheye lens, where the calibration parameters of the fisheye lens can include internal parameters and external parameters of the camera.
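The patent establishes the fisheye-to-longitude/latitude projection from the lens calibration parameters but does not spell out the lens model. The sketch below assumes an ideal equidistant fisheye (r = f * theta); f, cx and cy stand in for real calibration parameters and are hypothetical.

```python
import numpy as np

def equirect_to_fisheye(lon, lat, f, cx, cy):
    """Map a (lon, lat) direction in radians to fisheye pixel coordinates,
    assuming an ideal equidistant lens (r = f * theta) looking along +z."""
    # Direction vector on the unit sphere.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    theta = np.arccos(np.clip(z, -1.0, 1.0))  # angle from the optical axis
    phi = np.arctan2(y, x)                    # angle around the optical axis
    r = f * theta                             # equidistant projection
    return cx + r * np.cos(phi), cy + r * np.sin(phi)
```

A real implementation would replace the ideal model with the calibrated intrinsic and extrinsic parameters mentioned in the text.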
  • step 10 may specifically include: using the calibration parameters of the fisheye lenses, generating a left eye longitude and latitude image and a right eye longitude and latitude image corresponding to each fisheye image based on the acquired multiple fisheye images.
  • Figure 2 is a flow chart of a method for generating a left eye longitude and latitude image and a right eye longitude and latitude image provided by an embodiment of the present invention.
  • step 10 may specifically include:
  • a corresponding left eye blank longitude and latitude map is set for each fisheye image.
  • the six set left eye blank longitude and latitude maps are respectively the left eye blank longitude and latitude map I1 to I6.
  • Figures 3a to 3b are schematic diagrams of generating a left eye longitude and latitude image according to an embodiment of the present invention.
  • in step S1, dividing the left eye perspective splicing block area on the set left eye blank longitude and latitude map may specifically include:
  • Step S11 According to the projection relationship between the fisheye image and the longitude and latitude map, project the center point of the fisheye image onto the left eye blank longitude and latitude map to form a projection of the center point, and define the meridian where the projection of the center point lies as the center line of the left eye blank longitude and latitude map.
  • the center point of the fisheye image is projected onto the left eye blank longitude and latitude maps I1 to I6 respectively, and the center line of the left eye blank longitude and latitude map of each left eye blank longitude and latitude map I1 to I6 is obtained.
  • Step S12 Use the meridian with a set angle value to the left of the center line of the left-eye blank longitude and latitude map as the left-eye viewing angle center line.
  • Figures 3a and 3b take the left eye blank longitude and latitude map I1 and the left eye blank longitude and latitude map I2 as examples for description.
  • the meridian 30° to the left of the center line of the left-eye blank longitude and latitude map I1 is used as the left-eye viewing angle center line.
  • the meridian 30° to the left of the center line of the left-eye blank longitude and latitude map I2 is used as the left-eye viewing angle center line.
  • the meridian 30° to the left of the center line of the left eye blank longitude and latitude map I6 is used as the left eye viewing angle center line.
  • Step S13 In the left-eye blank longitude and latitude map, use the center line of the left eye viewing angle as the center line of the viewing angle and take the image area with the set viewing angle as the left eye viewing angle splicing block area.
  • the field of view angle θ is set to be greater than 60°.
  • preferably, the field of view angle θ is set to be greater than or equal to 80° and less than or equal to 100°.
  • the seamless splicing method requires adjacent splicing block areas to share a certain overlapping FOV so that image matching and alignment can be performed during splicing; the FOV must therefore be greater than 60°, since only then are adjacent splicing block areas guaranteed to overlap. For example, when the FOV of a splicing block area is 80°, adjacent splicing block areas share a 20° overlapping FOV; when it is 100°, they share a 40° overlapping FOV. The larger the overlapping FOV, the more conducive it is to seamless splicing. Figures 3a and 3b take the field of view angle θ as 100° as an example.
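The block division of steps S11 to S13 can be sketched as index arithmetic on the equirectangular canvas. This is an illustrative helper (the function name and pixel convention are assumptions, not from the patent): the centerline is shifted left by the set angle and a band of columns spanning the set FOV is cut out, wrapping because longitude is periodic.

```python
import numpy as np

def view_block_columns(center_col, offset_deg, fov_deg, width):
    """Column indices of a view splicing block on a `width`-pixel-wide
    equirectangular map, centered `offset_deg` degrees to the left of the
    projected lens centerline and spanning `fov_deg` degrees."""
    deg_per_px = 360.0 / width
    view_center = center_col - offset_deg / deg_per_px  # shift centerline left
    half = fov_deg / (2 * deg_per_px)
    start = round(view_center - half)
    stop = round(view_center + half)
    return (np.arange(start, stop) % width).astype(int)  # wrap longitude

cols = view_block_columns(center_col=960, offset_deg=30, fov_deg=100, width=1920)
# 100 degrees of a 1920-px-wide map spans about 1920 * 100 / 360 ~ 533 columns.
```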
  • the shape of the left-eye view splicing block area is a rectangle.
  • the shape of the left-eye view splicing block area can also adopt other shapes, as long as the splicing block area can cover the required FOV.
  • the boundary of the left-eye view splicing block area can be a curve.
  • the center line of the left eye perspective is the center line of the field of view, and an image with a set field of view angle of 100° is taken as the left eye view splicing block area L1.
  • the center line of the left-eye viewing angle is the center line of the viewing angle, and the image with the set viewing angle of 100° is used as the left-eye viewing angle splicing block area L2.
  • taking the center line of the left eye view angle as the center line of the field of view, the image area with a set field of view angle of 100° is used as the left eye view splicing block area L6.
  • the fisheye image is expanded into a latitude and longitude map in the left eye view splicing block area L1 to generate a left eye longitude and latitude image A1 corresponding to the fisheye image.
  • the longitude and latitude map of the fish-eye image is expanded in the left-eye view splicing block area L2, and a left-eye longitude and latitude image A2 corresponding to the fish-eye image is generated.
  • the longitude and latitude map of the fisheye image is expanded in the left eye view splicing block area L6 to generate the left eye longitude and latitude image A6 corresponding to the fisheye image.
  • Step S2 Divide the right eye perspective splicing block area on the set right eye blank longitude and latitude map; in the right eye perspective splicing block area, expand the fisheye image into a longitude and latitude map according to the projection relationship between the fisheye image and the longitude and latitude map, generating the right eye longitude and latitude image corresponding to the fisheye image.
  • a corresponding right eye blank longitude and latitude map is set for each fisheye image.
  • the set six right eye blank longitude and latitude maps are respectively the right eye blank longitude and latitude map M1 to M6.
  • FIGS 3c to 3d are schematic diagrams of generating a right eye longitude and latitude image according to an embodiment of the present invention.
  • in step S2, dividing the right eye perspective splicing block area on the set right eye blank longitude and latitude map may specifically include:
  • Step S21 According to the projection relationship between the fisheye image and the longitude and latitude map, project the center point of the fisheye image onto the right eye blank longitude and latitude map to form a projection of the center point, and define the meridian where the projection of the center point lies as the center line of the right eye blank longitude and latitude map.
  • Step S22 Use the meridian with a set angle value to the right of the center line of the blank latitude and longitude map of the right eye as the center line of the right eye viewing angle.
  • Figures 3c and 3d take the right eye blank longitude and latitude map M1 and the right eye blank longitude and latitude map M2 as examples for description.
  • the meridian 30° to the right of the center line of the right eye blank longitude and latitude map M1 is used as the right eye viewing angle center line.
  • the meridian 30° to the right of the center line of the right eye blank longitude and latitude map M2 is used as the right eye viewing angle center line.
  • the meridian 30° to the right of the center line of the right eye blank longitude and latitude map M6 is used as the right eye viewing angle center line.
  • Step S23 In the right-eye blank longitude and latitude map, use the center line of the right eye viewing angle as the center line of the viewing angle and take the image area with the set viewing angle as the right eye viewing angle splicing block area.
  • the field of view angle θ is set to be greater than 60°.
  • preferably, the field of view angle θ is set to be greater than or equal to 80° and less than or equal to 100°.
  • the seamless splicing method requires adjacent splicing block areas to share a certain overlapping FOV so that image matching and alignment can be performed during splicing; the FOV must therefore be greater than 60°, since only then are adjacent splicing block areas guaranteed to overlap. For example, when the FOV of a splicing block area is 80°, adjacent splicing block areas share a 20° overlapping FOV; when it is 100°, they share a 40° overlapping FOV. The larger the overlapping FOV, the more conducive it is to seamless splicing. Figures 3c and 3d take the field of view angle θ as 100° as an example.
  • the shape of the right-eye view splicing block area is a rectangle.
  • the shape of the right-eye view splicing block area can also adopt other shapes, as long as the splicing block area can cover the required FOV.
  • the boundary of the right-eye view splicing block area can be a curve.
  • the center line of the right eye viewing angle is the center line of the field of view, and an image with a set field of view angle of 100° is used as the right eye viewing angle splicing block area R1.
  • the center line of the right eye viewing angle is the center line of the field of view, and an image with a set field of view angle of 100° is used as the right eye viewing angle splicing block area R2.
  • the center line of the right eye viewing angle is the center line of the field of view, and an image with a set field of view angle of 100° is taken as the right eye viewing angle splicing block area R6.
  • the fisheye image is expanded into a longitude and latitude map in the right eye view splicing block area R1 to generate a right eye longitude and latitude image B1 corresponding to the fisheye image.
  • the fisheye image is expanded into a longitude and latitude map in the right eye view splicing block area R2 to generate the right eye longitude and latitude image B2 corresponding to the fisheye image.
  • the fisheye image is expanded into a longitude and latitude map in the right eye view splicing block area R6 to generate the right eye longitude and latitude image B6 corresponding to the fisheye image.
  • Step 12 Extract first splicing areas between multiple left-eye longitude and latitude images to obtain multiple first splicing areas, and perform image fusion on each first splicing area to generate a left-eye panoramic image.
  • step 12 may specifically include:
  • Step 122 Extract first splicing areas between multiple left-eye longitude and latitude images to obtain multiple first splicing areas.
  • the first splicing areas between the n left eye longitude and latitude images A1-An are respectively extracted to obtain n first splicing areas S1-Sn.
  • Step 124 Perform image fusion on each first splicing area to generate a left eye panoramic image.
  • step 124 may specifically include:
  • FIG. 4a to 4b are schematic diagrams of projecting the first splicing area onto the unit sphere in the embodiment of the present invention, as shown in FIG. 4b.
  • FIG. 4b shows the first splicing area Sn.
  • the first splicing area Sn is projected onto the unit sphere to obtain the first area to be spliced Jn on the unit sphere.
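Projecting a splicing area from the longitude and latitude map onto the unit sphere amounts to turning each pixel's (longitude, latitude) angles into a 3D direction. A minimal sketch, with an assumed pixel-to-angle convention that the patent does not specify:

```python
import numpy as np

def pixel_to_sphere(u, v, width, height):
    """Unit-sphere direction for pixel (u, v) on a width x height
    equirectangular map; u spans longitude, v spans latitude."""
    lon = (u / width) * 2.0 * np.pi - np.pi   # longitude in [-pi, pi)
    lat = np.pi / 2.0 - (v / height) * np.pi  # latitude from +pi/2 down to -pi/2
    return np.array([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)])
```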
  • Step 1244 Map the plurality of first areas to be spliced on the unit sphere to the cylinder surfaces corresponding to the unit sphere respectively, to obtain cylinder areas corresponding to the plurality of first areas to be spliced.
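The patent maps each spherical region to a corresponding cylinder before computing optical flow, but does not give the mapping formula. A common choice, assumed here, is the central cylindrical projection onto a cylinder tangent at the equator: longitude passes through unchanged and latitude maps to height via the tangent.

```python
import numpy as np

def sphere_to_cylinder(lon, lat):
    """Central cylindrical projection: (lon, lat) -> (x, h) on a unit
    cylinder tangent to the sphere at the equator."""
    return lon, np.tan(lat)

def cylinder_to_sphere(x, h):
    """Inverse mapping, as used when back-projecting flow or fused pixels."""
    return x, np.arctan(h)
```

The round trip is exact away from the poles, which suits splicing blocks centered on the equatorial band.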
  • Step 1246 Calculate the optical flow field between the left image and the right image of the cylindrical area corresponding to the plurality of first regions to be spliced.
  • the optical flow fields between the left images and the right images of the n cylindrical areas T1-Tn can be calculated respectively.
  • calculating the optical flow field between the left image and the right image of the N cylindrical regions T1-Tn may specifically include:
  • for the left image of a cylindrical area, the optical flow should be smaller the closer a pixel is to the left boundary of the cylindrical area; for the right image, the optical flow should be smaller the closer a pixel is to the right boundary.
  • therefore, two weight values α and β are set, where α ∈ [0, 1] and β ∈ [0, 1].
  • in Formula 1, the α value gradually increases from left to right across the left image; in Formula 2, the β value gradually decreases from left to right across the right image.
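Per the description, the flow weight α grows from left to right across the overlap while β shrinks. A linear ramp is assumed below (the patent only fixes the range [0, 1] and the direction of change):

```python
import numpy as np

def flow_weights(width):
    """Per-column flow weights across an overlap region of `width` pixels:
    alpha rises 0 -> 1 left to right (applied to the left image), beta
    falls 1 -> 0 (applied to the right image), suppressing flow near each
    image's own boundary."""
    alpha = np.linspace(0.0, 1.0, width)
    beta = np.linspace(1.0, 0.0, width)
    return alpha, beta
```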
  • Step 1248 Fusion of the images of each first splicing area according to the optical flow field between the left image and the right image of each cylindrical area to generate a left-eye panoramic image.
  • the optical flow field between the left image and the right image of the cylindrical area can be used to fuse the images in the first splicing area of the left eye longitude and latitude images to achieve seamless splicing of the first splicing area. , thus achieving seamless splicing of left eye latitude and longitude images.
  • step 1248 may specifically include:
  • Step A1 On the longitude and latitude map, according to the distances between the pixels in the area and the left and right boundaries of the area, adaptively adjust the weight value α′ corresponding to the left image and the weight value β′ corresponding to the right image when the left image and the right image are fused.
  • Step A2 According to the projection relationship between the cylinder and the longitude and latitude map, back-project the optical flow field between the left image and the right image of the cylindrical area onto the longitude and latitude map to obtain the optical flow field between the left image and the right image of the first splicing area of the left eye longitude and latitude image.
  • the optical flow field between the left image and the right image of the cylindrical area may include the optical flow field f_{L→R}(x, y) from the left image to the right image and the optical flow field f_{R→L}(x, y) from the right image to the left image;
  • after back-projection, the optical flow field between the left image and the right image of the first splicing area of the left eye longitude and latitude image includes the optical flow field f′_{L→R}(x, y) from the left image to the right image and the optical flow field f′_{R→L}(x, y) from the right image to the left image.
  • Step A3 According to the weight value α′ corresponding to the left image, the weight value β′ corresponding to the right image, and the optical flow field between the left image and the right image of the first splicing area, use Formula 3 to fuse the left image S_L and the right image S_R of the first splicing area, obtaining the fused area S_B and thereby the left eye panoramic image:
  • Formula 3: S_B(x, y) = α′ · S_L(x + f′_x(L→R)(x, y), y + f′_y(L→R)(x, y)) + β′ · S_R(x + f′_x(R→L)(x, y), y + f′_y(R→L)(x, y))
  • where S_B(x, y) represents the value of the pixel (x, y) in the fusion area S_B;
  • S_L(x, y) represents the value of the pixel (x, y) in the left image S_L;
  • S_R(x, y) represents the value of the pixel (x, y) in the right image S_R;
  • α′ is the weight value corresponding to the left image;
  • β′ is the weight value corresponding to the right image;
  • near the left boundary, the pixel value of the left image should carry greater weight, so the α′ value changes from large to small as the distance from the left boundary of S_L increases;
  • near the right boundary, the pixel value of the right image should carry greater weight, so the β′ value changes from large to small as the distance from the right boundary of S_R increases.
  • the left-eye panoramic image is obtained, and the obtained left-eye panoramic image is a seamlessly spliced longitude and latitude map.
  • in steps A1 to A3, the optical flow field between the left image and the right image is first back-projected onto the longitude and latitude map, and then the images of the first areas to be spliced are fused directly on the longitude and latitude map to obtain the left eye panoramic image.
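Steps A1 to A3 can be sketched as follows for a single-channel splicing area. The linear weight ramps and nearest-neighbor flow warping are simplifying assumptions; flow_lr and flow_rl stand for the back-projected fields f′_{L→R} and f′_{R→L}, with channel 0 holding the x component and channel 1 the y component.

```python
import numpy as np

def fuse_region(s_left, s_right, flow_lr, flow_rl):
    """Flow-guided blend of a splicing area in the style of Formula 3:
    the left image's weight alpha' decays with distance from the left
    boundary, the right image's weight beta' decays with distance from
    the right boundary. s_left/s_right: (h, w); flows: (h, w, 2)."""
    h, w = s_left.shape
    alpha = np.linspace(1.0, 0.0, w)[None, :]  # alpha': 1 at the left edge
    beta = 1.0 - alpha                         # beta': 1 at the right edge
    ys, xs = np.mgrid[0:h, 0:w]
    # Warp each image along its flow field (nearest neighbor for brevity).
    xl = np.clip(np.rint(xs + flow_lr[..., 0]).astype(int), 0, w - 1)
    yl = np.clip(np.rint(ys + flow_lr[..., 1]).astype(int), 0, h - 1)
    xr = np.clip(np.rint(xs + flow_rl[..., 0]).astype(int), 0, w - 1)
    yr = np.clip(np.rint(ys + flow_rl[..., 1]).astype(int), 0, h - 1)
    return alpha * s_left[yl, xl] + beta * s_right[yr, xr]
```

A production version would use subpixel (bilinear) sampling rather than rounding the warped coordinates.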
  • step 1248 may specifically include:
  • Step B1 According to the distances between the pixels of the cylindrical area and the left and right boundaries of the cylindrical area, adaptively adjust the weight value α corresponding to the left image and the weight value β corresponding to the right image when the left image and the right image are fused.
  • Step B2 In the cylindrical area, according to the weight value α corresponding to the left image, the weight value β corresponding to the right image, and the optical flow field between the left image and the right image of the first area to be spliced, use Formula 4 to fuse the left image T_L and the right image T_R of the first area to be spliced, obtaining the fused area T_B.
  • Formula 4: T_B(x, y) = α · T_L(x + f_x(L→R)(x, y), y + f_y(L→R)(x, y)) + β · T_R(x + f_x(R→L)(x, y), y + f_y(R→L)(x, y))
  • where T_B(x, y) represents the value of the pixel (x, y) in the fusion area T_B;
  • T_L(x, y) represents the value of the pixel (x, y) in the left image T_L;
  • T_R(x, y) represents the value of the pixel (x, y) in the right image T_R;
  • α is the weight value corresponding to the left image;
  • β is the weight value corresponding to the right image, with α ∈ [0, 1] and β ∈ [0, 1].
  • near the left boundary, the pixel value of the left image should carry greater weight, so the α value changes from large to small as the distance from the left boundary of the cylindrical area increases.
  • near the right boundary, the pixel value of the right image should carry greater weight, so the β value changes from large to small as the distance from the right boundary of the cylindrical area increases.
  • the images of each first area to be spliced are fused respectively until all first areas to be spliced are fused and all fusion areas T_B are obtained.
  • Step B3 According to the projection relationship between the cylinder and the longitude and latitude map, back-project all fusion areas T_B onto the longitude and latitude map to obtain the left eye panoramic image.
  • the left-eye panoramic image obtained after back-projecting it onto the latitude and longitude map is also seamless.
  • the images of the first areas to be spliced are first fused on the cylinder, and the fused areas are then back-projected onto the longitude and latitude map, thereby completing the fusion of the first splicing areas and obtaining the left eye panoramic image.
  • Step 14 Extract second splicing areas between multiple right eye latitude and longitude images to obtain multiple second splicing areas, and perform image fusion on each second splicing area to generate a right eye panoramic image.
  • step 14 may specifically include:
  • Step 142 Extract second splicing areas between multiple right eye longitude and latitude images to obtain multiple second splicing areas.
  • Step 144 Perform image fusion on each second splicing area to generate a right-eye panoramic image.
  • step 144 may specifically include:
  • Step 1442 Project the plurality of second splicing areas onto the unit sphere to obtain multiple second areas to be spliced on the unit sphere.
  • Step 1444 Map the plurality of second regions to be spliced on the unit sphere to the cylinder surfaces corresponding to the unit sphere, respectively, to obtain cylinder regions corresponding to the plurality of second regions to be spliced.
  • Step 1446 Calculate the optical flow field between the left image and the right image of the cylindrical area corresponding to the plurality of second areas to be spliced.
  • Step 1448 Fusion of the images of each second splicing area according to the optical flow field between the left image and the right image of each cylindrical area to generate a right-eye panoramic image.
  • Step 16 Generate a panoramic stereoscopic image based on the left eye panoramic image and the right eye panoramic image.
  • step 16 may specifically include:
  • Step 161 Reproject the left-eye panoramic image to generate a left-eye visible picture.
  • the left-eye viewable picture is a panoramic stereoscopic left-eye viewable picture, and the left-eye viewable picture is an image including the left eye's visible range.
  • Step 162 Reproject the right eye panoramic image to generate a right eye viewable picture.
  • the right-eye viewable picture is a panoramic stereoscopic right-eye viewable picture, and the right-eye viewable picture includes images within the right eye's visible range.
  • the left-eye panoramic image and the right-eye panoramic image are re-projected according to the orientation of the user's eyes to generate left-eye visible images and right-eye visible images respectively.
  • Step 163 Synchronize the left eye viewable picture and the right eye viewable picture to generate the panoramic stereoscopic image.
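The synchronized presentation of step 163 and Figure 6 shows the two viewable pictures side by side. Assembling one side-by-side stereo frame (a trivial sketch; the actual display pipeline is device-dependent) is horizontal concatenation of equally shaped views:

```python
import numpy as np

def side_by_side(left_view, right_view):
    """One synchronized stereo frame: the left eye picture in the left
    window, the right eye picture in the right window."""
    assert left_view.shape == right_view.shape, "views must match in size"
    return np.hstack([left_view, right_view])
```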
  • Figure 6 is a schematic diagram of a presentation method of a panoramic stereoscopic image generated in an embodiment of the present invention.
  • the left window shows the left eye viewable picture obtained by reprojecting the left eye panoramic image, and the right window shows the right eye viewable picture obtained by reprojecting the right eye panoramic image.
  • the pictures in the left and right windows have real parallax, thus achieving a three-dimensional display effect.
  • the pictures in the left and right windows can change in real time according to the direction of the user's eyes, and can present pictures from every angle in the original left-eye panoramic image and right-eye panoramic image, thereby achieving a panoramic three-dimensional display effect.
  • the left eye longitude and latitude image and the right eye longitude and latitude image corresponding to each fisheye image are generated; the first splicing areas between the multiple left eye longitude and latitude images are extracted and fused into a left eye panoramic image, and the second splicing areas between the multiple right eye longitude and latitude images are extracted and fused into a right eye panoramic image;
  • the left eye panoramic image and the right eye panoramic image are then used to generate a panoramic stereoscopic image. This can eliminate the faults or ghosting phenomena in the splicing area of the panoramic stereoscopic image and reduce its picture distortion, thus improving the realism of the panoramic stereoscopic image.
  • FIG. 7 is a schematic structural diagram of a panoramic stereoscopic image generation device provided by an embodiment of the present invention.
  • the device includes: a fisheye image acquisition module 11, a longitude and latitude image generation module 12, a fusion module 13, and a generation module 14.
  • the fisheye image acquisition module 11 may include multiple fisheye lenses.
  • the fisheye image acquisition module 11 is used to acquire multiple fisheye images.
  • the longitude and latitude image generation module 12 is used to generate a left eye longitude and latitude image and a right eye longitude and latitude image corresponding to each fisheye image based on the multiple acquired fisheye images;
  • The fusion module 13 is used to extract the first splicing areas between the multiple left-eye longitude and latitude images to obtain multiple first splicing areas, and perform image fusion on each first splicing area to generate a left-eye panoramic image; and to extract the second splicing areas between the multiple right-eye longitude and latitude images to obtain multiple second splicing areas, and perform image fusion on each second splicing area to generate a right-eye panoramic image;
  • the generating module 14 is configured to generate a panoramic stereoscopic image according to the left-eye panoramic image and the right-eye panoramic image.
  • The longitude and latitude image generation module 12 is specifically used to divide a left-eye viewing-angle splicing block area on a set left-eye blank longitude and latitude map and, in the left-eye viewing-angle splicing block area, expand the fisheye image into a longitude and latitude map to generate the left-eye longitude and latitude image corresponding to the fisheye image; and to divide a right-eye viewing-angle splicing block area on a set right-eye blank longitude and latitude map and, in the right-eye viewing-angle splicing block area, expand the fisheye image into a longitude and latitude map to generate the right-eye longitude and latitude image corresponding to the fisheye image.
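The expansion of a fisheye image into a longitude/latitude (equirectangular) map can be illustrated with a simple ideal lens model. This is a hedged sketch, not the patent's actual projection relationship: the equidistant model (r = f·θ) and the placement of the lens axis at lon = lat = 0 are assumptions, and a real lens would use a calibrated projection instead.

```python
import numpy as np

def fisheye_to_equirect(fish, fov_deg, out_w, out_h):
    """Expand a single equidistant fisheye image (square, H x W) into the
    longitude/latitude block it covers, centred on the lens axis
    (lon = 0, lat = 0). Pixels outside the lens FOV are set to 0."""
    fh, fw = fish.shape[:2]
    cx, cy = fw / 2.0, fh / 2.0
    max_theta = np.radians(fov_deg) / 2
    f = (fw / 2) / max_theta                     # equidistant model: r = f * theta
    lon = (np.arange(out_w) / out_w - 0.5) * 2 * np.pi
    lat = (np.arange(out_h) / out_h - 0.5) * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # unit ray for every (lon, lat); the lens looks along +z
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    theta = np.arccos(np.clip(z, -1, 1))         # angle from the optical axis
    r = f * theta
    phi = np.arctan2(y, x)
    u = np.clip((cx + r * np.cos(phi)).astype(int), 0, fw - 1)
    v = np.clip((cy + r * np.sin(phi)).astype(int), 0, fh - 1)
    out = fish[v, u]
    out[theta > max_theta] = 0                   # outside the lens field of view
    return out
```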
  • The longitude and latitude image generation module 12 is specifically configured to project the center point of the fisheye image onto the left-eye blank longitude and latitude map, according to the projection relationship between the fisheye image and the longitude and latitude map, to form a projection of the center point;
  • the meridian on which the projection of the center point lies is taken as the center line of the left-eye blank longitude and latitude map; the meridian offset by a set angle value to the left of that center line is taken as the left-eye viewing-angle center line; and in the left-eye blank longitude and latitude map, the image area whose field-of-view center line is the left-eye viewing-angle center line and whose viewing angle equals the set viewing angle is taken as the left-eye viewing-angle splicing block area.
  • The longitude and latitude image generation module 12 is likewise configured to project the center point of the fisheye image onto the right-eye blank longitude and latitude map, according to the projection relationship between the fisheye image and the longitude and latitude map, to form a projection of the center point;
  • the meridian on which the projection of the center point lies is taken as the center line of the right-eye blank longitude and latitude map; the meridian offset by a set angle value to the right of that center line is taken as the right-eye viewing-angle center line; and in the right-eye blank longitude and latitude map, the image area whose field-of-view center line is the right-eye viewing-angle center line and whose viewing angle equals the set viewing angle is taken as the right-eye viewing-angle splicing block area.
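The center-line construction above can be illustrated with a small helper that offsets the lens's center meridian by a set angle (to the left for the left eye, to the right for the right eye) and derives the longitude range of the splicing block. The function name and the offset and field-of-view values in the example are placeholders, not values disclosed in the patent:

```python
def eye_block_range(center_lon_deg, offset_deg, block_fov_deg, eye):
    """Longitude range (start, end) in degrees of the viewing-angle
    splicing block for one eye. 'left' shifts the center meridian left
    (negative longitude), 'right' shifts it right."""
    def wrap(a):
        return (a + 180.0) % 360.0 - 180.0   # wrap into [-180, 180)
    shift = -offset_deg if eye == 'left' else offset_deg
    eye_center = wrap(center_lon_deg + shift)     # eye viewing-angle center line
    half = block_fov_deg / 2.0
    return wrap(eye_center - half), wrap(eye_center + half)
```

For example, with a lens center meridian at 0°, a 15° offset and a 90° block, the left-eye block would span longitudes -60° to 30° and the right-eye block -30° to 60° (hypothetical numbers).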
  • The fusion module 13 is specifically used to project the multiple first splicing areas onto a unit sphere to obtain multiple first areas to be spliced on the unit sphere; map the multiple first areas to be spliced onto the cylinder corresponding to the unit sphere to obtain the cylindrical areas corresponding to the multiple first areas to be spliced; calculate the optical flow field between the left image and the right image of each of these cylindrical areas; and fuse the images of each first splicing area according to the optical flow field between the left image and the right image of each cylindrical area to generate the left-eye panoramic image.
  • The fusion module 13 is likewise used to project the multiple second splicing areas onto the unit sphere to obtain multiple second areas to be spliced on the unit sphere; map the multiple second areas to be spliced onto the cylinder corresponding to the unit sphere to obtain the cylindrical areas corresponding to the multiple second areas to be spliced; calculate the optical flow field between the left image and the right image of each of these cylindrical areas; and fuse the images of each second splicing area according to the optical flow field between the left image and the right image of each cylindrical area to generate the right-eye panoramic image.
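The optical-flow-based fusion of a splicing area can be sketched as a flow-guided cross-fade: each of the two overlapping images is warped part-way along the flow field before blending, so that matching content lines up instead of ghosting. This toy NumPy version assumes the flow field is already computed (by any dense optical-flow estimator) and uses nearest-neighbour backward warping; it is an illustration of the general technique, not the embodiment's fusion algorithm:

```python
import numpy as np

def flow_blend(left_img, right_img, flow, alpha):
    """Blend the overlapping left/right images of one splicing area.
    `flow` (H x W x 2) maps each left-image pixel to its match in the
    right image. Each image is warped a fraction of the way along the
    flow and cross-faded with weight `alpha` (0 = left, 1 = right)."""
    h, w = left_img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # backward warping: sample left behind the flow, right ahead of it
    lx = np.clip((xs - alpha * flow[..., 0]).round().astype(int), 0, w - 1)
    ly = np.clip((ys - alpha * flow[..., 1]).round().astype(int), 0, h - 1)
    rx = np.clip((xs + (1 - alpha) * flow[..., 0]).round().astype(int), 0, w - 1)
    ry = np.clip((ys + (1 - alpha) * flow[..., 1]).round().astype(int), 0, h - 1)
    warped_l = left_img[ly, lx]
    warped_r = right_img[ry, rx]
    return (1 - alpha) * warped_l + alpha * warped_r
```

In practice `alpha` would ramp from 0 to 1 across the width of the splicing area, so each panorama transitions smoothly from one lens's image to the next.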
  • The generation module 14 is specifically configured to re-project the left-eye panoramic image to generate the left-eye viewable picture, re-project the right-eye panoramic image to generate the right-eye viewable picture, and synchronize the left-eye viewable picture and the right-eye viewable picture to generate the panoramic stereoscopic image.
  • The panoramic stereoscopic image generated by the panoramic stereoscopic image generation device in the embodiment of the present invention can eliminate the faults or ghosting phenomena that appear in the splicing areas of the panoramic stereoscopic image, and can reduce the picture distortion of the panoramic stereoscopic image, thereby improving the realism of the panoramic stereoscopic image.
  • An embodiment of the present invention provides an electronic device.
  • the electronic device includes a plurality of fisheye lenses for acquiring fisheye images.
  • the electronic device also includes a memory and a processor.
  • The memory is used to store information including program instructions, and the processor is used to control the execution of the program instructions.
  • When the program instructions are loaded and executed by the processor, the steps of the above-mentioned embodiments of the method for generating a panoramic stereoscopic image are implemented.
  • For details, please refer to the above-mentioned embodiments of the method for generating a panoramic stereoscopic image, which are not repeated here.
  • the electronic device 30 includes, but is not limited to, a processor 31 and a memory 32 .
  • FIG. 8 is only an example of the electronic device 30 and does not constitute a limitation on the electronic device 30.
  • the electronic device 30 may also include input and output devices, network access devices, buses, etc.
  • The processor 31 may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • The division of the units is only a logical functional division; in actual implementation, there may be other division methods.
  • For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The mutual coupling, direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or in other forms.
  • The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units.
  • Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in various embodiments of the present invention can be integrated into one processing unit, or each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
  • the above-mentioned integrated unit implemented in the form of a software functional unit can be stored in a computer-readable storage medium.
  • The above-mentioned software functional unit is stored in a storage medium and includes a number of instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute some of the steps of the methods described in the various embodiments of the present invention.
  • The aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

Embodiments of the present invention relate to a method and apparatus for generating a panoramic stereoscopic image, and an electronic device. The method comprises: generating, according to a plurality of acquired fisheye images, a left-eye longitude and latitude image and a right-eye longitude and latitude image corresponding to each fisheye image; extracting first splicing areas between a plurality of left-eye longitude and latitude images to obtain a plurality of first splicing areas, and performing image fusion on the first splicing areas to generate a left-eye panoramic image; extracting second splicing areas between a plurality of right-eye longitude and latitude images to obtain a plurality of second splicing areas, and performing image fusion on the second splicing areas to generate a right-eye panoramic image; and generating a panoramic stereoscopic image according to the left-eye panoramic image and the right-eye panoramic image, which makes it possible to eliminate fault or ghosting phenomena.
PCT/CN2023/102498 2022-06-27 2023-06-26 Method and apparatus for generating a panoramic stereoscopic image, and electronic device WO2024002023A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210744903.5 2022-06-27
CN202210744903.5A CN115174805B (zh) 2022-06-27 2022-06-27 全景立体图像的生成方法、装置和电子设备

Publications (1)

Publication Number Publication Date
WO2024002023A1 true WO2024002023A1 (fr) 2024-01-04

Family

ID=83489272

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/102498 WO2024002023A1 (fr) 2022-06-27 2023-06-26 Procédé et appareil de génération d'image stéréoscopique panoramique, et dispositif électronique

Country Status (2)

Country Link
CN (1) CN115174805B (fr)
WO (1) WO2024002023A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115174805B (zh) * 2022-06-27 2024-07-02 影石创新科技股份有限公司 全景立体图像的生成方法、装置和电子设备
WO2024103366A1 (fr) * 2022-11-18 2024-05-23 影石创新科技股份有限公司 Véhicule aérien sans pilote panoramique

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160269717A1 (en) * 2015-03-12 2016-09-15 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and recording medium
CN106534832A (zh) * 2016-11-21 2017-03-22 深圳岚锋创视网络科技有限公司 立体影像处理方法及系统
CN106791762A (zh) * 2016-11-21 2017-05-31 深圳岚锋创视网络科技有限公司 立体影像处理方法及系统
CN107369129A (zh) * 2017-06-26 2017-11-21 深圳岚锋创视网络科技有限公司 一种全景图像的拼接方法、装置及便携式终端
CN110349077A (zh) * 2018-04-02 2019-10-18 杭州海康威视数字技术股份有限公司 一种全景图像合成方法、装置及电子设备
CN115174805A (zh) * 2022-06-27 2022-10-11 影石创新科技股份有限公司 全景立体图像的生成方法、装置和电子设备

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106981050A (zh) * 2016-01-18 2017-07-25 深圳岚锋创视网络科技有限公司 对鱼眼镜头拍摄的图像矫正的方法和装置
CN107274340A (zh) * 2016-04-08 2017-10-20 北京岚锋创视网络科技有限公司 一种全景图像生成方法及装置
CN108122191B (zh) * 2016-11-29 2021-07-06 成都美若梦景科技有限公司 鱼眼图像拼接成全景图像和全景视频的方法及装置


Also Published As

Publication number Publication date
CN115174805B (zh) 2024-07-02
CN115174805A (zh) 2022-10-11

Similar Documents

Publication Publication Date Title
WO2024002023A1 (fr) Method and apparatus for generating a panoramic stereoscopic image, and electronic device
US11076142B2 (en) Real-time aliasing rendering method for 3D VR video and virtual three-dimensional scene
EP2328125B1 (fr) Procédé et dispositif de raccordement d'images
CN107180406B (zh) 图像处理方法和设备
US8928654B2 (en) Methods, systems, devices and associated processing logic for generating stereoscopic images and video
CN108475327A (zh) 三维采集与渲染
US6388666B1 (en) System and method for generating stereoscopic image data
Devernay et al. Stereoscopic cinema
CN106228530B (zh) 一种立体摄影方法、装置及立体摄影设备
WO2015196791A1 (fr) Procédé de rendu graphique tridimensionnel binoculaire et système associé
CN109510975B (zh) 一种视频图像的提取方法、设备及系统
US11812009B2 (en) Generating virtual reality content via light fields
US20190266802A1 (en) Display of Visual Data with a Virtual Reality Headset
WO2017128887A1 (fr) Procédé et système d'affichage 3d corrigé d'image panoramique et dispositif
US12100106B2 (en) Stereoscopic rendering of virtual 3D objects
CN104599308B (zh) 一种基于投射的动态贴图方法
CN111294580B (zh) 基于gpu的摄像头视频投影方法、装置、设备及存储介质
JP2018504009A (ja) デジタルビデオのレンダリング
CN114513646B (zh) 一种三维虚拟场景中全景视频的生成方法及设备
Bourke Synthetic stereoscopic panoramic images
CN108765582B (zh) 一种全景图片显示方法及设备
Kutka Reconstruction of correct 3-D perception on screens viewed at different distances
CN113821107B (zh) 一种实时、自由视点的室内外裸眼3d系统
CN116957913A (zh) 全景视频映射方法、装置、电子设备和存储介质
CA2252063C (fr) Systeme et methode de production de donnees d'image stereoscopique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23830200

Country of ref document: EP

Kind code of ref document: A1