CN211018942U - Imaging device with sensors of the same height - Google Patents

Imaging device with sensors of the same height

Info

Publication number
CN211018942U
Authority
CN
China
Prior art keywords
camera
optical axis
imaging apparatus
imaging device
images
Prior art date
Legal status
Expired - Fee Related
Application number
CN201921573500.9U
Other languages
Chinese (zh)
Inventor
马龙·托马尔
阿伦森·鲁思
宾南·沙哈尔
Current Assignee
HUMANEYES TECHNOLOGIES Ltd
Original Assignee
HUMANEYES TECHNOLOGIES Ltd
Priority date
Filing date
Publication date
Application filed by HUMANEYES TECHNOLOGIES Ltd
Application granted
Publication of CN211018942U

Classifications

    • H04N 21/816 Monomedia components thereof involving special video data, e.g. 3D video
    • H04N 13/111 Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H04N 13/167 Synchronising or controlling image signals
    • H04N 13/178 Metadata, e.g. disparity information
    • H04N 13/194 Transmission of image signals
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N 13/282 Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
    • H04N 13/286 Image signal generators having separate monoscopic and stereoscopic modes
    • H04N 13/296 Synchronisation or control of image signal generators
    • H04N 21/21805 Source of audio or video content, e.g. local disk arrays, enabling multiple viewpoints, e.g. using a plurality of cameras
    • H04N 21/2187 Live feed
    • H04N 21/4122 Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • H04N 21/4728 End-user interface for requesting content, additional data or services, for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
    • H04N 21/8106 Monomedia components thereof involving special audio data, e.g. different tracks for different languages
    • H04N 23/45 Generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • H04N 2213/001 Constructional or mechanical details of stereoscopic systems

Abstract

The utility model discloses an imaging device for acquiring a plurality of simultaneous video images in order to produce 180° stereoscopic and 360° panoramic video streams. The imaging device includes: a rigid body comprising two major surfaces and a bottom surface, with a mechanical mount interface on the bottom surface; and three cameras having ultra-wide-angle lenses, each camera defining a respective optical axis. The first camera is on the first major surface and its optical axis defines a first direction; the second camera is on the second major surface and its optical axis defines a second direction opposite to the first direction; and the third camera is on the second major surface and its optical axis defines a third direction parallel to the second direction. The apparatus includes a video display module comprising a display screen. When the imaging device stands upright, all three cameras are vertically aligned with one another at the same height, within a tolerance of one-half of any diameter of the respective ultra-wide-angle lenses.

Description

Imaging device with sensors of the same height
Technical Field
The present invention relates to an imaging device with multiple sensors for acquiring simultaneous real-time video images to be combined and converted into 180° stereoscopic and/or 360° panoramic video images, and in particular to a single device that does not require mechanical manipulation to change the image acquisition mode.
Background
Dual-mode cameras have been introduced which allow the user to capture simultaneous images for conversion into 180° stereoscopic and 360° panoramic video images. Such dual-mode cameras typically require the user to select between a stereoscopic mode and a panoramic mode by manipulating one or more mechanical devices, for example by folding or unfolding a movable camera platform, or by changing its orientation. Furthermore, such dual-mode cameras typically allow either a stereoscopic mode or a panoramic mode, but not both modes simultaneously, because the built-in requirements for mechanical manipulation and/or orientation changes tend to make it impractical or even impossible to acquire images simultaneously for creating both stereoscopic and panoramic images. Therefore, there is a need for a multi-mode, multi-sensor camera that allows multiple simultaneous images to be captured for combination and conversion into either or both of 180° stereoscopic and 360° panoramic video images.
Disclosure of Invention
According to an embodiment of the present invention, an imaging apparatus for acquiring a plurality of simultaneous video images to generate 180° stereoscopic and 360° panoramic video streams comprises: (a) a rigid body comprising first and second major surfaces; (b) first, second and third cameras fixedly mounted to the rigid body, each camera including a respective ultra-wide-angle lens and defining a respective optical axis, wherein: (i) the first camera is positioned such that its lens extends from the first major surface and its optical axis defines a first direction; (ii) the second camera is positioned such that its lens extends from the second major surface and its optical axis defines a second direction that is opposite to the first direction, or within 5 degrees of opposite; and (iii) the third camera is positioned such that its lens extends from the second major surface and its optical axis defines a third direction that is parallel to the second direction, or within 10 degrees of parallel; (c) a video display module comprising a display screen mounted on the first major surface, the display screen being operable to display and/or play 180° or 360° images in response to input received through the display screen and/or through a display control interface; and (d) a mechanical mount interface mounted on a first minor surface of the rigid body and disposed closer to each of the first and second cameras than to the third camera.
In some embodiments, a majority of the footprint of the mechanical mount interface on the first minor surface falls within the boundary of the orthographic projections of the first and second cameras on the first minor surface.
In some embodiments, the entire footprint of the mechanical mount interface on the first minor surface falls within the boundary of the orthographic projections of the first and second cameras on the first minor surface.
In some embodiments, the footprint of the mechanical mount interface on the first minor surface is aligned, along two orthogonal axes, with the center of the orthographic projections of the first and second cameras on the first minor surface.
In some embodiments, the imaging device further comprises a first electronic status indicator on the first major surface and a second electronic status indicator on the second major surface.
In some embodiments, the optical axis 401₂ of the second camera 520₂ defines a second direction that is opposite to the first direction, or within 1 degree of opposite.
In some embodiments, the optical axis 401₃ of the third camera 520₃ defines a third direction that is parallel to the second direction, or within 5 degrees of parallel.
In some embodiments, the optical axis 401₃ of the third camera 520₃ defines a third direction that is parallel to the second direction, or within 1 degree of parallel.
In some embodiments, the imaging device is configured to acquire a plurality of simultaneous video images in a first mode, a second mode or a third mode, in response to input received via the display screen and/or via the display control interface: in the first mode, the first and second cameras acquire simultaneous video images for stitching into a 360° panoramic video image; in the second mode, the second and third cameras acquire simultaneous video images for compositing into a 180° stereoscopic video image; and in the third mode, the first, second and third cameras acquire simultaneous video images for producing both 180° stereoscopic and 360° panoramic video images.
In some embodiments, a second minor surface, opposite the first minor surface, may comprise a concave, textured finger-grip region extending across a majority of the second minor surface in each of two perpendicular directions.
According to an embodiment of the present invention, there is provided an imaging device for acquiring a plurality of simultaneous video images to produce 180° stereoscopic and 360° panoramic video streams, the imaging device comprising: (a) a rigid body comprising first and second major surfaces and a bottom surface on which a mechanical mount interface is present; (b) first, second and third cameras fixedly mounted to the rigid body, each camera comprising a respective ultra-wide-angle lens and defining a respective optical axis, wherein (i) the first camera is positioned such that its lens extends from the first major surface and its optical axis defines a first direction, (ii) the second camera is positioned such that its lens extends from the second major surface and its optical axis defines a second direction that is opposite to the first direction or within 5 degrees of opposite, and (iii) the third camera is positioned such that its lens extends from the second major surface and its optical axis defines a third direction that is parallel to the second direction or within 10 degrees of parallel; and (c) a video display module comprising a display screen mounted on the first major surface, the display screen being operable to display and/or play 180° or 360° images in response to input received through the display screen and/or through a display control interface; wherein, when the imaging device stands unattended on a flat surface, resting on its bottom surface, the first, second and third cameras are vertically aligned with one another at the same height, within a tolerance of one-half of any diameter of the respective ultra-wide-angle lenses.
In some embodiments, when the imaging device stands unattended on a flat surface, resting on its bottom surface, the first, second and third cameras are vertically aligned with one another at the same height, within a tolerance of one-fifth of any diameter of the respective ultra-wide-angle lenses.
In some embodiments, when the imaging device stands unattended on a flat surface, resting on its bottom surface, the first, second and third cameras are vertically aligned with one another at the same height, within a tolerance of 1 millimeter.
In some embodiments, the first and second cameras are aligned with each other in the x-axis dimension within a tolerance of 1 millimeter.
In some embodiments, the mechanical mount interface is aligned with the first and second cameras in the x-axis dimension to within a 5 mm tolerance.
In some embodiments, the optical axis of the second camera defines a second direction that is opposite to the first direction, or within 1 degree of opposite.
In some embodiments, the optical axis of the third camera defines a third direction that is parallel to the second direction, or within 5 degrees of parallel.
In some embodiments, the optical axis of the third camera defines a third direction that is parallel to the second direction, or within 1 degree of parallel.
In some embodiments, the imaging device further comprises a first electronic status indicator on the first major surface and a second electronic status indicator on the second major surface.
In some embodiments, the imaging device is configured to acquire a plurality of simultaneous video images in a first mode, a second mode or a third mode, in response to input received via the display screen and/or via the display control interface: in the first mode, the first and second cameras acquire simultaneous video images for stitching into a 360° panoramic video image; in the second mode, the second and third cameras acquire simultaneous video images for compositing into a 180° stereoscopic video image; and in the third mode, the first, second and third cameras acquire simultaneous video images for producing both 180° stereoscopic and 360° panoramic video images.
According to an embodiment of the present invention, there is provided an imaging apparatus for acquiring a plurality of simultaneous video images to generate 180 ° stereoscopic and 360 ° panoramic video streams, the imaging apparatus including: (a) a rigid body comprising first and second major surfaces; (b) first, second and third cameras fixedly mounted to the rigid body, each camera including a respective ultra-wide angle lens and a respective planar array of photodetectors defining a respective photodetector plane, each camera defining a respective optical axis perpendicularly intersecting the respective photodetector plane, wherein: (i) said first camera being positioned such that its lens extends from said first major surface and its optical axis defines a first direction; (ii) the second camera is positioned such that its lens extends from the second major surface and its optical axis defines a second direction that is opposite, or no more than 5 degrees opposite, the first direction; and (iii) the third camera is positioned such that its lens extends from the second major surface and its optical axis defines a third direction, the third direction being parallel to the second direction or within 10 degrees of parallel; wherein the respective photodetector planes of the first and second cameras are non-coplanar.
In some embodiments, the photodetector plane of the second camera is disposed outwardly from the second major surface at least 3 millimeters further than the photodetector plane of the third camera.
In some embodiments, (i) the second camera has a view angle of at least 180 °, and (ii) the view angle does not encompass any portion of the lens of the third camera.
In some embodiments, (i) the second camera has a viewing angle of at least 180°, and (ii) a 180° cone of the viewing angle does not encompass any portion of the lens of the third camera.
In some embodiments, the imaging device further comprises a video display module including a display screen mounted on the first major surface.
In some embodiments, (i) the first camera has a viewing angle of at least 180 °, and (ii) the viewing angle does not encompass any portion of the display screen.
In some embodiments, the optical axis of the third camera defines a third direction that is parallel to the second direction, or within 5 degrees of parallel.
In some embodiments, the optical axis of the third camera defines a third direction that is parallel to the second direction, or within 1 degree of parallel.
In some embodiments, the imaging device is configured to acquire a plurality of simultaneous video images in a first mode, a second mode or a third mode, in response to input received via the display screen and/or via the display control interface: in the first mode, the first and second cameras acquire simultaneous video images for stitching into a panoramic video image; in the second mode, the second and third cameras acquire simultaneous video images for compositing into a 180° stereoscopic video image; and in the third mode, the first, second and third cameras acquire simultaneous video images for producing both 180° stereoscopic and 360° panoramic video images.
In some embodiments, the imaging device further comprises a first minor surface and a second minor surface, the second minor surface, opposite the first minor surface, comprising a concave, textured finger-grip region extending across a majority of the second minor surface in each of two perpendicular directions.
Drawings
The invention will be further described with reference to the accompanying drawings. The dimensions of the components and features shown in the drawings are chosen for clarity of presentation and are not necessarily to scale. In some drawings the relative sizes of objects and the relative distances between objects may be exaggerated or minimized for convenience and clarity. In these drawings:
fig. 1A, 1B, 1C, and 1D are front, rear, bottom, and top views, respectively, of an imaging device including three cameras according to a specific embodiment of the present invention.
Fig. 2 is a bottom view of an imaging device including three cameras according to a specific embodiment of the present invention, showing the boundaries of the orthographic projection of the two cameras on the bottom surface of the device.
FIG. 3 is a bottom view of an imaging device including three cameras according to a specific embodiment of the invention, schematically illustrating respective photodetector planes of the three cameras.
FIG. 4 is a bottom view of an imaging device including three cameras according to a specific embodiment of the invention, schematically illustrating the respective viewing angles of two of the cameras with respect to physical features on the two major surfaces of the imaging device.
FIGS. 5A and 5B are block diagrams of systems for acquiring simultaneous images, processing the images to create stereoscopic and/or panoramic video streams, and delivering one or both of the video streams to client browsing devices.
FIGS. 6 and 7 are flowcharts of methods for delivering simultaneous real-time video streaming of panoramic and three-dimensional stereoscopic images, according to specific embodiments of the present invention.
Detailed Description
The invention will be described herein by way of example only and with reference to the accompanying drawings. Referring now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention; the description taken with the drawings makes apparent to those skilled in the art how the several forms of the invention may be embodied in practice. Like reference characters generally refer to like elements throughout the several views.
Note that: throughout this specification, subscripted reference numerals (e.g., 10)1Or 10A) May be used to indicate multiple separate appearances of a single kind of element, for example: 101Is a single appearance (from among multiple appearances) of the element 10. When not referring to a particular one of the plurality of separate appearances, that is, for a generic class, the same element may instead be referred to as being without a subscript (e.g., 10, instead of 10)1)。
For convenience, in the context of the description herein, various terms are presented herein. To the extent that a definition is provided herein or elsewhere in this application, either explicitly or implicitly, such definition is understood to be consistent with the use of the defined term by those skilled in the relevant art. Further, such definitions are to be construed in the broadest possible sense consistent with such usage.
The term "module" as used herein refers to any combination of electronic hardware, software, and firmware necessary to perform the functions associated with the module, e.g., "stitching module" includes hardware, software, and firmware for "stitching" an image. The myriad of techniques available in each of the three technical categories of hardware, software, and firmware are well known in the particular arts of their technical endeavors, and the present invention is intended to encompass the use of any such techniques, whether known and commonly used within the time of the present invention or developed and made available during the life of any patent after disclosure.
The terms "imaging device", "sensor" and "camera" are used interchangeably in this specification and the appended claims, and all have the same meaning: a device for capturing digital images comprises (at least) a CMOS or CCD device and a lens, and optionally any number of mechanical and electronic components and accessories. In contrast, the term "imaging device" as used herein is specific to a device comprising a plurality of sensors/cameras/imaging devices. In this statement, the imaging device, sensor or camera may be a component of the imaging device, but this is not the case.
According to some embodiments of the present invention, an imaging apparatus including multiple imaging devices may acquire simultaneous video images of a scene. Each of these imaging devices may have a wide viewing angle, e.g., 180° or more. The scene may span a field of view of 180° or more, or 360°. The video images may be overlapping, meaning that at least two of the video images overlap, or equivalently at least abut or nearly abut. In some embodiments, one of the video images may overlap each of the other two video images. In some embodiments, the angle of view captured by one or more respective imaging devices is exactly 180°, or the sum of the angles of view of two such imaging devices amounts to exactly 360°, and in this case the concept of "overlapping" as used herein is extended to include adjacent or abutting images. In some embodiments, the angles of view captured by one or more respective imaging devices are less than 180°, or the angles of view of at least some of the respective imaging devices are non-overlapping, and in such cases the concept of "stitching" as used herein is extended to include interpolation, such that the images may be considered "stitched" even if there is no actual overlap between them.
These images are acquired by the imaging apparatus simultaneously, or with a minimal delay of no more than a few milliseconds, or tens or hundreds of milliseconds. A stitching module may create a panoramic image from two or more video images that overlap (or abut or nearly abut). At the same time, a 3D stereoscopic composition module may create a 3D stereoscopic image from two or more overlapping video images, which may include, at least in part, the same images used when stitching the panoramic image.
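By way of illustration only, the following Python sketch shows how two overlapping frames could be stitched and how two same-direction frames could be packed side by side using generic OpenCV routines. It is a minimal stand-in for the stitching module and the 3D stereoscopic composition module described above, not the actual algorithms of those modules; the file names and the side-by-side packing format are assumptions.

```python
# Minimal sketch only: OpenCV's generic Stitcher stands in for the stitching module,
# and simple side-by-side packing stands in for the 3D stereoscopic composition module.
# The image file names are hypothetical.
import cv2
import numpy as np

def stitch_panorama(img_front, img_back):
    """Blend two overlapping wide-angle frames into a single panoramic frame."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, pano = stitcher.stitch([img_front, img_back])
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return pano

def compose_stereo(img_left, img_right):
    """Pack two same-direction frames as a left/right pair for a 3D viewer."""
    h = min(img_left.shape[0], img_right.shape[0])
    left = cv2.resize(img_left, (img_left.shape[1], h))
    right = cv2.resize(img_right, (img_right.shape[1], h))
    return np.hstack([left, right])

if __name__ == "__main__":
    frame1 = cv2.imread("cam1.png")   # first camera (front surface)
    frame2 = cv2.imread("cam2.png")   # second camera (back surface)
    frame3 = cv2.imread("cam3.png")   # third camera (back surface)
    cv2.imwrite("panorama.png", stitch_panorama(frame1, frame2))
    cv2.imwrite("stereo_sbs.png", compose_stereo(frame2, frame3))
```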
Panoramic stitching and 3D stereoscopic composition of multiple overlapping images, including asymmetric cropping, intra-camera calibration, and image acquisition and manipulation, may be performed using any of the systems and methods described in co-pending International Application No. PCT/IB2018/059771, the contents of which are incorporated herein by reference in their entirety. The above-described methods and systems are also disclosed in U.S. Provisional Patent Application No. 62/823,409, filed on March 25, 2019, and U.S. Provisional Patent Application No. 62/596,112, filed on December 7, 2018, the entire contents of which are incorporated herein by reference.
In some embodiments, either or both of the stitching module and the 3D stereoscopic composition module are included in the imaging apparatus, and in some embodiments either or both of the stitching module and the 3D stereoscopic composition module reside remotely, for example at one or more remote servers, which may be part of a server farm that includes servers dedicated to the stitching module and/or the 3D stereoscopic composition module (and/or to other modules that may be necessary for processing the video images, storing the images, and/or delivering them as video streams to clients, where delivering them to clients may include selecting which video stream to deliver to any particular client). Alternatively, one or more servers may be "in the cloud", that is, servers available from commercial computing operations, and may include any number of networked servers (including but not limited to shared, dedicated and virtual servers) available to any number of unrelated customers. For convenience, the remote servers are described throughout as being "in the cloud", but as noted above this is not necessarily the case in some embodiments.
In some embodiments, the imaging apparatus may include a communication circuit configured for uploading the overlapping video images to one or more servers; in other embodiments, the communication circuit is configured to upload the stitched panoramic image and the composited 3D stereoscopic image to one or more servers. Any reference herein to a communication circuit may include an integrated or "built-in" communication circuit, or a module that is readily attachable to the imaging apparatus, for example where the imaging apparatus is equipped with an appropriate connector.
Reference is now made to fig. 1A, 1B, 1C and 1D, which respectively illustrate front, rear, bottom and top views of an imaging apparatus 500 capable of acquiring a plurality of simultaneous video images from which 180 ° stereoscopic and 360 ° panoramic video streams (or files) can be created.
The imaging apparatus 500 according to this embodiment is a "unitary" device, preferably constructed with a rigid frame 501. By "unitary" is meant herein that the device has no moving parts for positioning or repositioning the individual cameras, nor a folding mechanism for placing one or more individual cameras into position for imaging in a particular mode (stereoscopic or panoramic). Of course, the rigid frame 501 may be constructed of multiple parts, and it may also have movable or removable openings for connections and controls.
The imaging device 500 shown in FIGS. 1A-1D has two "major surfaces", a front surface 540₁ and a back surface 540₂, and four minor surfaces. The shape of the imaging apparatus 500 is shown consistently in all of the figures as a single example of a product design for purposes of clarity, but the product may have any generally rectangular/prismatic shape (including, for example, a rectangular box with or without cut-outs or rounded corners, or with rounded edges). Nevertheless, it may be desirable to provide a bottom surface 545 that allows imaging apparatus 500 to stand unattended on a flat surface without support, whether for storage or for operation.
The imaging device 500 includes three cameras 520. A dual-mode device having only two cameras would necessarily require mechanical manipulation of one or both cameras in order to orient them correctly for the selected mode. The device may have more than three cameras, but three is the practical minimum number of cameras for providing dual-mode capability in a rigid design. Each of the cameras 520 is a wide-angle camera, meaning that it has a respective wide-angle lens 530 mounted thereon. Wide-angle lens 530 may comprise a fisheye lens, which advantageously has a viewing angle of at least 180°, i.e., a hemispherical or larger field of view. All three cameras 520 are operable to acquire images at least when the cameras are in the orientation shown in FIGS. 1A to 1D.
First camera 520₁ is mounted on the first major surface 540₁, and the two other cameras 520₂ and 520₃ are mounted on the second major surface 540₂. The two cameras 520₁, 520₂ each have a respective optical axis 401₁, 401₂; these axes are: directly opposite each other and therefore collinear; or in exactly opposite directions but not collinear, and thus parallel; or in nearly opposite directions, meaning within ±10° of opposite, or within ±5° of opposite, or within ±1° of opposite. The terms opposite or parallel "±x°" refer to a range meaning anywhere from −x° to +x° away from exactly opposite or exactly parallel, and in each case the range "from −x° to +x°" excludes its endpoints.
The display screen 510 may be flush with the first major surface 540₁, or slightly recessed or otherwise offset from flush on the basis of aesthetic or ergonomic considerations. It is preferably a touch screen, to facilitate receiving user input. Whether or not display screen 510 is a touch screen, a display control interface 515 may be provided, which in the example shown in FIG. 1B is a button. The display screen 510 is operable to display and/or play back 180° or 360° images in response to input received via the display screen 510 (if display screen 510 is a touch screen) or via the display control interface 515. A video display module may include the display screen 510, the display control interface 515, and any electronics or software required for performing the video display functions.
First major surface 540₁ also includes an electronic status indicator 516₁, which is operable to display an on/off state or to indicate that capture, recording or transmission is in progress. Electronic status indicator 516₁ can include any kind of electronic display, for example a small LCD screen or a plurality of LEDs. Either of the two major surfaces 540₁, 540₂ may be flat or rounded, and/or may optionally include raised or recessed surfaces for aesthetic or ergonomic purposes. First major surface 540₁ is identified in the figures as the "front" of the imaging device 500 (and second major surface 540₂ as the "back"), but this is for convenience only and has no significance for design or function, particularly in view of the fact that imaging apparatus 500 is operable to acquire images in any and all directions.
Two cameras 520₂, 520₃ are both mounted on the second major surface 540₂. A second electronic status indicator 516₂ can also be mounted on the second major surface 540₂. Second and third cameras 520₂, 520₃ face in the same direction as each other, meaning that their respective optical axes 401₂, 401₃ are parallel to each other, or within ±5° of parallel, or within ±2° of parallel, or within ±1° of parallel. Images acquired simultaneously by the second camera 520₂ and the third camera 520₃ may be used together to synthesize a stereoscopic video image, for example a 180° stereoscopic video image for viewing on a 3D browsing device.
Second and third cameras 5202、5203Preferably at the same height H as each otherLENSWherein the height HLENSIs the height in the y-dimension to the centerline as shown in fig. 1A, as measured in the y-dimension, when imaging device 500 is placed on a flat surface, i.e., has a bottom surface 545 on the flat surface. First camera 5201Is also preferably mounted at the same height H as shown in fig. 1BLENSAnd particularly in the second camera 520 "back-to-back" therewith2At the same height. The expression "same height" may mean that the cameras 520 are exactly at the same height, or have respective heights different from each other, three respective lenses 5301、5302、5303(or the smallest diameter of the respective diameters 908 if the lenses 530 are not all of the same diameter) of no more than 50% of the diameter 908, or no more than 20% of the diameter 908, or no more than 2mm, or no more than 1 mm.
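The "same height" criterion described above can be expressed as a simple check. The sketch below is illustrative only; the function name and the numeric values are assumptions, not measurements of any actual device.

```python
# Minimal sketch of the "same height" criterion; illustrative values only.
def same_height(lens_center_heights_mm, lens_diameters_mm, tolerance="50%"):
    """Return True if all lens centers lie at the same height H_LENS within the
    stated tolerance: a fraction of the smallest lens diameter, or an absolute
    figure interpreted as millimetres."""
    spread = max(lens_center_heights_mm) - min(lens_center_heights_mm)
    if tolerance.endswith("%"):
        allowed = float(tolerance[:-1]) / 100.0 * min(lens_diameters_mm)
    else:
        allowed = float(tolerance)  # millimetres
    return spread <= allowed

# Example: three lenses of 14 mm diameter whose centers differ by at most 0.6 mm
heights = [52.0, 52.6, 52.1]
diameters = [14.0, 14.0, 14.0]
print(same_height(heights, diameters, "50%"))  # True: 0.6 mm <= 7 mm
print(same_height(heights, diameters, "1"))    # True: 0.6 mm <= 1 mm
```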
FIG. 1D shows an example of a top surface 546 of the imaging device 500. A finger-grip region 547 preferably extends across most of the top surface 546 in each direction, in order to provide as large a gripping area as possible on what is typically a small piece of equipment. The finger-grip region 547 may be at least partially textured, as shown, i.e., having an area that is less smooth to the touch than the surrounding surface. The texture pattern shown in FIG. 1D is non-limiting and shows only one possible example. In some embodiments, not shown, the finger-grip region may be flat or convex, or concave but not textured.
FIG. 1C is a bottom view of imaging device 500, showing mechanical mount interface 550 mounted on bottom surface 545. Interface 550 may include, for example, a threaded socket for attaching a tripod having a mating threaded connection element. In some embodiments, it may be desirable for the mount interface 550 to be closer to the first and second cameras 520₁, 520₂ than to the third camera 520₃. Interface 550 is preferably mounted along a longitudinal centerline CL_x, which represents the "middle" of imaging device 500 in the z-dimension. The "middle" may be a line drawn along the central longitudinal axis of bottom surface 545, or may be a line indicating the center of balance, taking into account the weights of the various elements mounted on the two major surfaces 540 of the device. It may be desirable for interface 550 to be positioned along the x-dimension centerline CL_x and centered on the intersection with the optical axes 401₁, 401₂ of the first and second cameras 520₁, 520₂, as shown in FIG. 1C. In one example, this location of the interface 550 provides maximum angular stability for capturing images for a 360° panorama, i.e., it ensures that the two cameras 520₁, 520₂ responsible for the task of acquiring the 360° images can be stably maintained at an optimum angle for capturing images.
In some embodiments, the interface 550 is not located exactly at the intersection of the centerline CL_x and the optical axes 401₁, 401₂, but is located within the footprint of the two cameras 520₁, 520₂. FIG. 2 shows the orthographic projection 590 of the first camera 520₁ and the second camera 520₂ (including the respective lenses 530₁, 530₂) on the bottom surface 545 of imaging device 500; orthographic projection 590 is a projection of the combined footprint of the two cameras. As described in the previous paragraph, the mechanical mount interface 550, or at least a majority of interface 550, is preferably located within orthographic projection 590, and it may be desirable for interface 550 to be substantially (±3 mm, or ±2 mm, or ±1 mm) centered about the intersection of the x-dimension centerline CL_x and the optical axes 401₁, 401₂, the optical axes 401₁, 401₂ being represented in FIG. 2 by the optical centerline CL_OPT.
The inner workings of a digital camera may include a planar array of photodetectors. Such a planar array may define a respective photodetector plane for each camera. This is illustrated in FIG. 3, where the first camera 520₁, second camera 520₂ and third camera 520₃ are associated with respective photodetector planes P₁, P₂, P₃. As discussed above with respect to FIG. 1C, the optical axes 401₁, 401₂ of the first camera 520₁ and second camera 520₂ are directly opposite (collinear) each other, or opposite each other within ±10° of rotation, or within ±5°, or within ±1°. The respective optical axis 401 of each camera is perpendicular to the photodetector plane P of that camera, and therefore the photodetector planes P₁, P₂ associated with the first and second cameras 520₁, 520₂ are parallel to each other, or within ±10° of parallel, or within ±5°, or within ±1°. As discussed above with respect to FIG. 1C, the optical axes 401₂, 401₃ of the second camera 520₂ and third camera 520₃ are parallel to each other, or within ±5° of parallel, or within ±2°, or within ±1°, so that the respective photodetector planes P₂, P₃ are parallel to each other, or within ±5° of parallel, or within ±2°, or within ±1°. In these embodiments, the respective photodetector planes P₂, P₃ of the second camera 520₂ and third camera 520₃ are not coplanar: in an embodiment, the second camera 520₂ extends further from the rigid body 501 than the third camera 520₃, the second camera 520₂ being further from the second major surface 540₂ of the rigid body 501 than the third camera 520₃ by at least 2 mm and not more than 10 mm, or at least 3 mm and not more than 9 mm, or at least 4 mm and not more than 8 mm.
As shown schematically in FIG. 4, each camera 520 has a viewing angle α limited by its optics (e.g., by its respective wide-angle lens 530), and the second camera 520₂ preferably has a viewing angle α₂ of at least 180°. Because the photodetector planes P₂, P₃ associated with the second camera 520₂ and third camera 520₃ are not coplanar, and the second camera 520₂ extends further from the second major surface 540₂ of the rigid body 501 than the third camera 520₃, the third camera 520₃ and its wide-angle lens 530₃ lie outside the (at least) 180° viewing angle α₂ of the second camera 520₂. In FIG. 4, the line labeled AOV₂ (marking the limit of the viewing angle α₂ of the second camera 520₂) can be seen to pass above the wide-angle lens 530₃ of the third camera 520₃ and clear it. On the other major surface 540₁, the first camera 520₁ also has a viewing angle α₁ of at least 180°, and the corresponding AOV₁ line passes well clear of the display screen 510 and of any other feature of the first major surface 540₁. The net effect is that a panoramic image produced using images acquired by the first and second cameras 520₁, 520₂ can be generated without capturing any external portion of the imaging apparatus 500, as in each of the examples described in this paragraph.
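The geometric effect of the outward offset can be illustrated with a crude two-dimensional check of whether the third camera's lens falls inside the second camera's viewing angle. The sketch below models the third lens as a single point; all coordinates, dimensions and names are illustrative assumptions, not the patent's tolerance analysis.

```python
# Crude 2-D sketch: does the third lens fall inside the second camera's view?
import math

def third_lens_in_view(alpha2_deg, outward_offset_mm, lateral_distance_mm, lens3_protrusion_mm):
    """The third lens is modelled as a point at a given lateral distance from the
    second camera's optical axis; its axial position is taken relative to the
    second camera's entrance aperture (negative = behind it). The point is inside
    the view if its angle from the optical axis does not exceed half the viewing
    angle."""
    z = lens3_protrusion_mm - outward_offset_mm            # axial position, mm
    angle_from_axis = math.degrees(math.atan2(lateral_distance_mm, z))
    return angle_from_axis <= alpha2_deg / 2.0

# 180-degree lens, second camera set 3 mm further out, lenses 30 mm apart,
# third lens protruding 2 mm above its own mounting plane: stays out of view.
print(third_lens_in_view(180.0, 3.0, 30.0, 2.0))   # False (angle ~92 deg > 90 deg)
# A 200-degree lens with the same geometry would see the third lens.
print(third_lens_in_view(200.0, 3.0, 30.0, 2.0))   # True  (angle ~92 deg <= 100 deg)
```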
In these embodiments, imaging device 500 may provide, among other things, the following image capture modes: (i) using only the first camera 520₁ and the second camera 520₂ to capture images for creating (stitching) a 360° panoramic image; (ii) using only the second and third cameras 520₂, 520₃ to capture images for creating 180° 3D stereoscopic images; and (iii) using all three cameras 520₁, 520₂, 520₃ to acquire images for creating synchronized panoramic and stereoscopic images, wherein images acquired by the first and second cameras 520₁, 520₂ may be used to create a panoramic image while images acquired by the second and third cameras 520₂, 520₃ may be used simultaneously to create a stereoscopic image. The image capture mode can be user-selected, using the display screen 510 (if a touch screen) and/or the display control interface 515.
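A minimal sketch of the three capture modes as a selection table follows; the enum and camera identifiers are hypothetical names introduced only for illustration.

```python
# Hypothetical mapping of capture mode to the cameras that must deliver frames.
from enum import Enum

class CaptureMode(Enum):
    PANORAMIC_360 = "360 panorama"   # cameras 1 + 2, stitched
    STEREO_180 = "180 stereo"        # cameras 2 + 3, composited
    COMBINED = "panorama + stereo"   # all three cameras

ACTIVE_CAMERAS = {
    CaptureMode.PANORAMIC_360: {"cam1", "cam2"},
    CaptureMode.STEREO_180: {"cam2", "cam3"},
    CaptureMode.COMBINED: {"cam1", "cam2", "cam3"},
}

def cameras_for(mode: CaptureMode) -> set:
    """Return the cameras that must acquire simultaneous frames in this mode."""
    return ACTIVE_CAMERAS[mode]

print(cameras_for(CaptureMode.COMBINED))  # e.g. {'cam1', 'cam2', 'cam3'} (order may vary)
```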
The following paragraphs will discuss a system and method suitable for delivering synchronized panoramic and stereoscopic video streams, employing all three cameras 520₁, 520₂, 520₃.
Referring now to FIG. 5A, a block diagram illustrates an example of a system 100 for delivering synchronized real-time panoramic and stereoscopic video streams. The system 100 shown in FIG. 5A includes an imaging device 500 that includes three camera devices 520₁, 520₂, 520₃. The imaging device 500 is preferably operable to use the camera devices 520₁, 520₂, 520₃ to acquire three corresponding overlapping real-time video images 111₁, 111₂, 111₃. The system 100 also includes a communication circuit 157 configured to, among other things, upload the overlapping video images 111₁, 111₂, 111₃ to a remote server 170 over a communication channel 201, which, like all other communication channels disclosed herein, can incorporate any combination of wired and wireless communication technologies, and is preferably configured to transmit video images at the resolution usable by the camera devices 520₁, 520₂, 520₃ and at the throughput necessary to maintain substantially live, real-time transmission. The communication circuit 157 may be integrated into the imaging device 500 or may be an attachable module. When the term "server" is used herein, it is synonymous with the terms "server(s)", "remote server(s)" and "remote server", and we note that the particular distribution of load and functionality among servers or processors is not relevant to the present invention.
According to these embodiments, the system 100 may further comprise multiple modules residing on one or more remote servers 170 in a cloud 199: a stitching module 175 for stitching together the overlapping real-time video images 111₁, 111₂, 111₃ transferred by the imaging device 500 to the server 170; and a stereoscopic image composition module 176 for compositing a 3D stereoscopic image from the overlapping real-time video images 111₁, 111₂, 111₃. In connection with server 170, a storage medium 140 (e.g., a non-transitory computer-readable storage medium) can also reside in the cloud 199.
As shown in FIG. 5A, the stitching module 175 is operable to transmit (i.e., download) panoramic video images, stitched from the overlapping real-time video images 111₁, 111₂, 111₃, to a plurality 195 of client browsing devices 190 as a real-time panoramic video stream 180, using communication channel 202. The plurality 195 can include any number N of client browsing devices 190₁ … 190ₙ, depending on the operational capabilities of the server 170 and of the communication channels 202, 203. The stereoscopic image composition module 176 is operable to transmit (i.e., download) 3D stereoscopic video images, composited from the overlapping real-time video images 111₁, 111₂, 111₃, to the same plurality 195 of client browsing devices 190 as a real-time 3D video stream 181, using communication channel 203. It is important to note that the entire process implemented in the system shown in FIG. 5A, including but not limited to acquiring the multiple overlapping real-time video images 111₁, 111₂, 111₃, uploading these real-time video images 111₁, 111₂, 111₃ to the one or more remote servers 170, employing the respective stitching module 175 and stereoscopic image composition module 176 to generate respective panoramic and stereoscopic images therefrom, and delivering the images to the plurality 195 of client browsing devices 190 as a real-time panoramic video stream 180 and a real-time 3D stereoscopic video stream 181, is performed without processing delays that would prevent delivering the video streams 180, 181 in real time, i.e., within an amount of delay time acceptable in the industry for "live" and "real-time" transmissions. It should be apparent to those skilled in the art that not all of the components shown in FIG. 5A are necessary for proper operation of the system 100, and any of the imaging devices 500 disclosed herein is suitable for use in the system 100.
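The following sketch outlines the server-side portion of the FIG. 5A flow, with trivial placeholder functions standing in for the stitching module 175, the composition module 176 and the delivery channels 202/203; the data structure and all names are assumptions made for illustration only.

```python
# Illustrative sketch of the FIG. 5A server-side flow; all names are hypothetical.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Frame:
    camera_id: int     # 1, 2 or 3
    timestamp_ms: int  # capture time, used to keep the two streams in sync
    pixels: bytes      # encoded image data

def stitch(front: bytes, back: bytes) -> bytes:
    """Stand-in for stitching module 175 (see the earlier OpenCV sketch)."""
    return front + back  # placeholder only

def compose_stereo(left: bytes, right: bytes) -> bytes:
    """Stand-in for 3D stereoscopic composition module 176."""
    return left + right  # placeholder only

def deliver(stream: str, payload: bytes, ts: int) -> None:
    """Stand-in for delivery over channels 202/203 to subscribed client devices."""
    print(f"{stream}: {len(payload)} bytes at t={ts} ms")

def serve_frame_set(frames: List[Frame]) -> None:
    """Produce one panoramic and one stereoscopic output per set of three
    simultaneous frames uploaded over channel 201."""
    by_cam: Dict[int, Frame] = {f.camera_id: f for f in frames}
    ts = frames[0].timestamp_ms
    deliver("panoramic", stitch(by_cam[1].pixels, by_cam[2].pixels), ts)
    deliver("stereoscopic", compose_stereo(by_cam[2].pixels, by_cam[3].pixels), ts)

serve_frame_set([Frame(i, 0, b"...") for i in (1, 2, 3)])
```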
Referring now to fig. 5B, a second embodiment of a system 100 for delivering a synchronized real-time video stream is shown in block diagram form. The system 100 shown in fig. 5B has similarities to the system 100 previously discussed with respect to fig. 5A, with a number of differences:
in the example shown in FIG. 5B, the stitching module 175 and the stereoscopic image composition module 176 reside within the imaging device 500 rather than in the cloud at the one or more servers 170, as was the case in the example of FIG. 5A;
the communication circuitry 157 of the imaging apparatus 500 is configured to upload the processed (pre-stitched/pre-composited) video images to the one or more servers 170 as live streams (i.e., the "stitched" real-time panoramic video stream 180 and the "composited" 3D stereoscopic video stream 181).
the delivery module 178 at the one or more servers 170 is configured to deliver the respective real-time video streams 180, 181 to the plurality 195 of client browsing devices 190. Although not shown in FIG. 5A, in some embodiments the delivery module 178 may also be a component of the system 100 of FIG. 5A.
It should be apparent to those skilled in the art that not all of the components shown in FIGS. 5A and 5B are necessary for proper operation of the system 100, nor is every possible component and interface of such a system 100 shown in FIGS. 5A and 5B. It should also be apparent that aspects of the two figures may be combined; as one non-limiting example, one of the stitching module 175 and the stereoscopic image composition module 176 may be included in the imaging apparatus 500, while the other of the two modules resides in the cloud 199, i.e., at the one or more remote servers 170.
As is known in the art, images acquired by imaging devices having ultra-wide-angle lenses, such as fisheye lenses, are typically dewarped to remove the distortion effects of the lenses before further processing into composite images such as panoramic images or stereoscopic images (or even before being treated as standalone images). A dewarping module may be included as part of the system 100 for delivering the synchronized real-time panoramic video stream 180 and stereoscopic video stream 181, or may otherwise be available for use by components of the system. Dewarping module 177 may reside in any of a variety of different locations, depending on the particular embodiment.
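As one possible illustration of the dewarping step, the sketch below uses OpenCV's generic fisheye camera model; the intrinsic matrix and distortion coefficients are placeholder values, not calibration data for the cameras described herein, and the file names are hypothetical.

```python
# Minimal dewarping sketch using OpenCV's fisheye model; placeholder calibration only.
import cv2
import numpy as np

def dewarp_fisheye(img, K, D):
    """Remove fisheye distortion from one frame before stitching/compositing."""
    h, w = img.shape[:2]
    new_K = K.copy()  # keep the same focal length for simplicity
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, np.eye(3), new_K, (w, h), cv2.CV_16SC2)
    return cv2.remap(img, map1, map2, interpolation=cv2.INTER_LINEAR)

if __name__ == "__main__":
    frame = cv2.imread("cam2.png")                                   # hypothetical file
    K = np.array([[500.0, 0, 960], [0, 500.0, 540], [0, 0, 1]])      # placeholder intrinsics
    D = np.array([0.05, 0.01, 0.0, 0.0])                             # placeholder distortion
    cv2.imwrite("cam2_dewarped.png", dewarp_fisheye(frame, K, D))
```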
Referring now to fig. 6, a flow diagram of a method for delivering simultaneous real-time video streaming of panoramic and three-dimensional stereoscopic images to multiple clients is presented. The method comprises four steps as follows:
in step S01, the imaging device 500 including the three imaging devices 520 is used to acquire the corresponding three simultaneous real-time video images 111 of the scene. Any of the imaging devices 500 disclosed herein would be suitable for performing the method.
At step S02, the plurality of real-time video images 111 are transferred from the imaging device 500 to one or more remote servers 170 using the communication circuit 157 and communication channel 201 of the imaging device 500.
In step S03, a real-time panoramic video image and a real-time 3D stereoscopic video image are created from the plurality of real-time video images 111, using the respective stitching module 175 and stereoscopic image composition module 176 residing at the one or more servers 170. In an embodiment, creating the real-time panoramic video image includes using simultaneous real-time video images acquired by the first and second cameras 520₁, 520₂, and creating the real-time 3D stereoscopic video image includes using simultaneous real-time video images acquired by the second and third cameras 520₂, 520₃.
In step S04, the real-time panoramic video image and the real-time 3D stereoscopic video image are simultaneously delivered from the one or more remote servers 170 to client browsing devices 190 in communication therewith, delivered as live video streams 180, 181 respectively, such that the real-time panoramic video stream 180 is delivered to a first plurality 191 or 193 of client browsing devices 190, while the real-time 3D stereoscopic video stream 181 is delivered to a second plurality 192 or 194 of client browsing devices 190. In some embodiments the two pluralities 191, 192 do not overlap, while in other embodiments the two pluralities 193, 194 overlap, such that the client browsing devices 190 in the overlap have both real-time video streams delivered to them. The two real-time video streams are synchronized with each other so as to enable smooth or seamless switching (transition) between them.
In some embodiments, not all steps of the method are performed.
Referring now to fig. 7, a flow diagram of another method for delivering simultaneous real-time video streaming of panoramic and 3D stereoscopic images to multiple clients is presented. The method comprises four steps as follows:
in step S11, the imaging device 500 including the three imaging devices 520 is used to acquire the corresponding three real-time video images 111 of the scene. Any of the imaging devices 500 disclosed herein would be suitable for performing the method.
In step S12, a real-time panoramic video image and a real-time 3D stereoscopic video image are created from the plurality of real-time video images 111, using the respective stitching module 175 and stereoscopic image composition module 176 residing at the imaging device 500. In an embodiment, creating the real-time panoramic video image includes using simultaneous real-time video images acquired by the first and second cameras 520₁, 520₂, and creating the real-time 3D stereoscopic video image includes using simultaneous real-time video images acquired by the second and third cameras 520₂, 520₃.
In step S13, the real-time panoramic video image and the real-time 3D stereoscopic video image are transferred from the imaging apparatus 500 to the one or more remote servers 170 using the communication circuit 157 and the communication channel 201 of the imaging apparatus 500.
In step S14, the real-time panoramic video image and the real-time 3D stereoscopic video image are delivered simultaneously, as live video streams 180 and 181 respectively, from the one or more remote servers 170 to client browsing devices 190 in communication therewith, the delivering being such that the real-time panoramic video stream 180 is delivered to a first plurality 191 or 193 of the client browsing devices 190, while the real-time 3D stereoscopic video stream 181 is delivered to a second plurality 192 or 194 of the client browsing devices 190. In some embodiments, the two pluralities of devices 191, 192 do not overlap, while in other embodiments the two pluralities of devices 193, 194 overlap, such that the client browsing devices 190 in the overlap have both real-time video streams delivered to them. These real-time video streams are synchronized with each other to enable smooth or seamless switching (transition) between them.
In some embodiments, not all steps of the method are performed. Steps from the different methods may be combined into a composite method.
In some embodiments, either method may further include a storing and retrieving step, in which the uploaded video images 111, or the uploaded stitched and composited panoramic and stereoscopic images, are stored for subsequent processing, e.g., on-demand retrieval by a customer.
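By way of a sketch only, and assuming a trivial in-memory backend (a real deployment would presumably use object storage or a database), stored frames could be keyed by stream type and capture time for later on-demand retrieval; the class and method names are hypothetical.

```python
# Hypothetical storage-and-retrieval sketch for the optional step (illustrative
# only). Frames are keyed by (stream type, capture timestamp); an in-memory
# dict stands in for whatever persistent store a real deployment would use.
from typing import Dict, Optional, Tuple

class FrameStore:
    def __init__(self) -> None:
        self._frames: Dict[Tuple[str, float], bytes] = {}

    def store(self, stream: str, timestamp: float, frame: bytes) -> None:
        self._frames[(stream, timestamp)] = frame

    def retrieve(self, stream: str, timestamp: float) -> Optional[bytes]:
        """Return the stored frame, or None if it was never uploaded."""
        return self._frames.get((stream, timestamp))
```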
The present invention has been illustrated by a detailed description of specific embodiments, which are provided by way of example and are not intended to limit the scope of the invention. The described embodiments comprise different features, not all of which are required in all embodiments of the invention. Some embodiments of the invention utilize only some of the features, or only some of the possible combinations of the features. Variations of the described embodiments of the invention, comprising different combinations of the described features, will occur to persons skilled in the art to which the invention pertains.
In this specification and in the claims, each of the verbs "comprise", "include" and "have", and conjugates thereof, is used to indicate that the object or objects of the verb are not necessarily a complete listing of the members, components, elements or parts of the subject or subjects of the verb. The singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "marker" or "at least one marker" may include a plurality of markers.

Claims (10)

1. An imaging apparatus (500) for acquiring a plurality of simultaneous video images to produce 180° stereoscopic and 360° panoramic video streams, the imaging apparatus comprising:
a. a rigid body (501) comprising a first main surface (540₁), a second main surface (540₂), and a bottom surface (545) on which a mechanical mount interface (550) is present;
b. a first camera (520₁), a second camera (520₂) and a third camera (520₃) fixedly mounted to the rigid body (501), each camera (520) comprising a respective ultra-wide-angle lens (530) and defining a respective optical axis (401), wherein:
i. the first camera (520₁) is arranged such that its lens (530₁) extends out from the first main surface (540₁) and its optical axis (401₁) defines a first direction;
ii. the second camera (520₂) is arranged such that its lens (530₂) extends out from the second main surface (540₂) and its optical axis (401₂) defines a second direction that is opposite to the first direction, or within 5 degrees of being opposite; and
iii. the third camera (520₃) is arranged such that its lens (530₃) extends out from the second main surface (540₂) and its optical axis (401₃) defines a third direction that is parallel to the second direction, or within 10 degrees of being parallel;
c. a video display module including a display screen (510) mounted on the first main surface (540₁), the display screen (510) being operable to display and/or play 180° or 360° images in response to input received through the display screen (510) and/or through a display control interface (515);
wherein:
i. the bottom surface (545) defines an x-z plane oriented such that a long centerline CLx of the bottom surface (545) has only an x-axis dimension;
ii. a y-axis intersects the x-z plane perpendicularly and defines a vertical direction; and
iii. when the imaging apparatus (500) stands unattended on its bottom surface (545) on a flat surface, the first camera (520₁), the second camera (520₂) and the third camera (520₃) are vertically aligned with each other at the same height, to within a tolerance of one-half of the diameter (908) of any one of the respective ultra-wide-angle lenses (530₁, 530₂, 530₃).
2. The imaging apparatus (500) of claim 1, wherein: when the imaging apparatus (500) stands unattended on its bottom surface (545) on a flat surface, the first camera (520₁), the second camera (520₂) and the third camera (520₃) are vertically aligned with each other at the same height, to within a tolerance of one-fifth of the diameter (908) of any one of the respective ultra-wide-angle lenses (530₁, 530₂, 530₃).
3. The imaging apparatus (500) of claim 1 or 2, wherein: when the imaging apparatus (500) stands unattended on its bottom surface (545) on a flat surface, the first camera (520₁), the second camera (520₂) and the third camera (520₃) are vertically aligned with each other at the same height, to within a tolerance of 1 mm.
4. The imaging apparatus (500) of claim 1 or 2, wherein: the first camera (520₁) and the second camera (520₂) are aligned with each other in the x-axis dimension to within a tolerance of 1 mm.
5. The imaging apparatus (500) of claim 1 or 2, wherein: the mechanical mount interface (550) is aligned with the first camera (520₁) and the second camera (520₂) in the x-axis dimension to within a tolerance of 5 mm.
6. The imaging apparatus (500) of claim 1 or 2, wherein: the optical axis (401₂) of the second camera (520₂) defines a second direction that is opposite to the first direction, or within 1 degree of being opposite.
7. The imaging apparatus (500) of claim 1 or 2, wherein: the optical axis (401₃) of the third camera (520₃) defines a third direction that is parallel to the second direction, or within 5 degrees of being parallel.
8. The imaging apparatus (500) of claim 7, wherein: the optical axis (401₃) of the third camera (520₃) defines a third direction that is parallel to the second direction, or within 1 degree of being parallel.
9. The imaging apparatus (500) of claim 1 or 2, further comprising a first electronic status indicator (516₁) on the first main surface (540₁) and a second electronic status indicator (516₂) on the second main surface (540₂).
10. The imaging apparatus (500) of claim 1 or 2, wherein: the imaging apparatus is configured to acquire a plurality of simultaneous video images in a first mode, a second mode, or a third mode; in the first mode, the first camera (520₁) and the second camera (520₂) acquire simultaneous video images for stitching into a 360° panoramic video image; in the second mode, the second camera (520₂) and the third camera (520₃) acquire simultaneous video images for compositing into a 180° stereoscopic video image; in the third mode, the first camera (520₁), the second camera (520₂) and the third camera (520₃) acquire simultaneous video images for producing 180° stereoscopic and 360° panoramic video images; and the apparatus is arranged to respond to input received through the display screen (510) and/or through the display control interface (515).
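Purely as an illustration of the vertical-alignment tolerances recited in claims 1-3, and forming no part of the claims, the short sketch below checks measured camera heights against the three tolerance levels; the lens diameter and height values are hypothetical.

```python
# Illustrative tolerance check for the alignment recited in claims 1-3
# (not part of the claims). All values are hypothetical, in millimetres.
def heights_within_tolerance(heights_mm, tolerance_mm):
    """True if the spread of camera heights fits within the given tolerance."""
    return max(heights_mm) - min(heights_mm) <= tolerance_mm

lens_diameter_mm = 30.0             # hypothetical lens diameter (908)
heights_mm = [100.0, 100.4, 99.8]   # hypothetical heights of the three lens centers

print(heights_within_tolerance(heights_mm, lens_diameter_mm / 2))  # claim 1: half diameter
print(heights_within_tolerance(heights_mm, lens_diameter_mm / 5))  # claim 2: one-fifth diameter
print(heights_within_tolerance(heights_mm, 1.0))                   # claim 3: 1 mm
```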
CN201921573500.9U 2019-03-25 2019-09-20 Imaging device with sensors of the same height Expired - Fee Related CN211018942U (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962823409P 2019-03-25 2019-03-25
US62/823,409 2019-03-25
US201962893497P 2019-08-29 2019-08-29
US62/893,497 2019-08-29

Publications (1)

Publication Number Publication Date
CN211018942U true CN211018942U (en) 2020-07-14

Family

ID=71474902

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201921573500.9U Expired - Fee Related CN211018942U (en) 2019-03-25 2019-09-20 Imaging device with sensors of the same height
CN201921572238.6U Expired - Fee Related CN211019015U (en) 2019-03-25 2019-09-20 Imaging device with non-coplanar sensors
CN201921572237.1U Expired - Fee Related CN211018941U (en) 2019-03-25 2019-09-20 Multi-sensor imaging device with cradle interface

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN201921572238.6U Expired - Fee Related CN211019015U (en) 2019-03-25 2019-09-20 Imaging device with non-coplanar sensors
CN201921572237.1U Expired - Fee Related CN211018941U (en) 2019-03-25 2019-09-20 Multi-sensor imaging device with cradle interface

Country Status (2)

Country Link
CN (3) CN211018942U (en)
WO (1) WO2020194190A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10368011B2 (en) * 2014-07-25 2019-07-30 Jaunt Inc. Camera array removing lens distortion
US10750153B2 (en) * 2014-09-22 2020-08-18 Samsung Electronics Company, Ltd. Camera system for three-dimensional video
CN108700798A (en) * 2016-01-03 2018-10-23 人眼技术有限公司 Frame during creating panoramic frame adapts to splicing
US10600155B2 (en) * 2017-02-09 2020-03-24 Verizon Patent And Licensing Inc. Generating virtual reality content based on corrections to stitching errors
EP3580935A4 (en) * 2017-02-13 2020-07-22 Nokia Technologies Oy An apparatus, a method and a computer program for video coding and decoding
US20180342043A1 (en) * 2017-05-23 2018-11-29 Nokia Technologies Oy Auto Scene Adjustments For Multi Camera Virtual Reality Streaming

Also Published As

Publication number Publication date
WO2020194190A1 (en) 2020-10-01
CN211018941U (en) 2020-07-14
CN211019015U (en) 2020-07-14

Similar Documents

Publication Publication Date Title
US11259009B2 (en) Modular configurable camera system
US9264695B2 (en) System and method for multi-viewpoint video capture
US7224382B2 (en) Immersive imaging system
WO2011052064A1 (en) Information processing device and method
CN105165000A (en) Panoramic-imaging digital camera, and panoramic imaging system
CN209402638U (en) Photographic device
KR101649752B1 (en) Generating method for multiview image using interface with image matrix and generating system for multiview image
WO2017118662A1 (en) Spherical virtual reality camera
US20130021448A1 (en) Stereoscopic three-dimensional camera rigs
JP2007264592A (en) Automatic three-dimensional image forming device and method
JP2000112019A (en) Electronic triplet lens camera apparatus
JP2012129768A (en) Document camera, document camera control method, program, and display processing system
CN211018942U (en) Imaging device with sensors of the same height
WO2018079283A1 (en) Image-processing device, image-processing method, and program
KR102019879B1 (en) Apparatus and method for acquiring 360 VR images in a game using a virtual camera
KR102019880B1 (en) 360 VR image acquisition system and method using distributed virtual camera
WO2013144506A1 (en) Method and device for creating images
US20100289881A1 (en) Camera for multiple perspective image capturing
CN213461928U (en) Panoramic camera and electronic device
CN217590970U (en) Double-shooting equipment and double-shooting system
JP2020022065A (en) Distribution device, camera device, distribution system, distribution method, and distribution program
CN217445411U (en) System for generating successive images from independent image sources
WO2009014416A1 (en) Setup for three dimensional image capture
Baker et al. Building camera arrays for light-field capture, display, and analysis
JP2018180324A (en) Panoramic image imaging system

Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200714

Termination date: 20210920