WO2014162324A1 - Spherical omnidirectional video-shooting system - Google Patents

Spherical omnidirectional video-shooting system

Info

Publication number
WO2014162324A1
WO2014162324A1 (PCT/IT2014/000095)
Authority
WO
WIPO (PCT)
Prior art keywords
data
video
images
spherical
cameras
Prior art date
Application number
PCT/IT2014/000095
Other languages
French (fr)
Inventor
Davide Angelelli
Original Assignee
Virtualmind Di Davide Angelelli
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Virtualmind Di Davide Angelelli filed Critical Virtualmind Di Davide Angelelli
Priority to GBGB1520437.3A priority Critical patent/GB201520437D0/en
Priority to CN201480032323.7A priority patent/CN105684415A/en
Publication of WO2014162324A1 publication Critical patent/WO2014162324A1/en
Priority to SG10201508072WA priority patent/SG10201508072WA/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/12Panospheric to cylindrical image transformations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • None of the current systems offers real-time interaction between camera and user and vice versa, and no network transmits images, content or immersive omnidirectional videos that are interactive with their users.
  • Networks like YouTube and similar platforms, even though they allow users to upload content online, do not allow any interaction with what is shown, certainly not in real time.
  • Real-time live streaming: the viewer becomes the director of the video being transmitted, building his own personal, unique schedule within the event as he wishes, as if he were present at the shooting point of the omnidirectional cameras, deciding to follow the event from whatever angle he prefers and looking wherever he likes, without having to adapt to the decisions of the director or operator covering the event, thus crossing the current, otherwise insurmountable, limits of video.
  • Sharing and uploading content: users can not only use whatever part of the video they want, selecting it from the footage recorded by the omnidirectional cameras of this system, but can also upload their own content, likewise recorded with these omnidirectional cameras, thanks to a real-time upload streaming server. It will be possible to enjoy and "navigate" the content created by other connected users, who can interact or participate effectively in shared video sessions (e.g., teleconferencing). This will be the real new frontier of interactive TV: every user becomes a television producer, and it is possible to simultaneously enjoy movies uploaded by other users via PC, phone or tablet and participate in omnidirectional remote sessions.
  • Video management software that combines various video sources in real time (real-time video stitching). At the same time, this software is able to merge video signals coming from classic single-optic cameras, to build three-dimensional objects and scenes for biometric analysis of people.
  • This system provides an innovative way to combine or cut out portions of video from different video signals, taken with different optics, including different qualities and resolutions.
  • Using the same concept, different video signals pointing in one direction can be merged to produce a video of a single three-dimensional object, for analysis of its shape and size using telemetry and/or biometric data.
  • Figure 1 illustrates an embodiment of the system and the printed circuit board, where the various electronic components and the digital micro-camera with optics greater than 2 megapixels are placed;
  • Figure 2 illustrates one of the possible embodiments of the inventive system, assembled, with a chassis 3 of spherical shape using just two optics 2, the figure also showing the monitor 9 and the function keys;
  • Figure 3 illustrates one of the possible embodiments of the assembled system, with the chassis 3, in this case of spherical shape, using three optics 2, the figure also showing the status LED 10, the slot for the memory card 13, the threading for any tripod or mounting brackets 11, and the USB 3 port for data exchange 12;
  • Figure 4 illustrates a side view of one of the possible embodiments of the assembled system, where the chassis 5 of hemispherical shape accommodates six optics 2, the figure highlighting the optional mounted tripod 4;
  • Figure 5 illustrates a side view of one of the possible embodiments of the assembled system, with the spherical chassis 6 housing the optics 2, the figure highlighting the optional mounted tripod 4;
  • Figure 6 illustrates a perspective view of one of the possible embodiments of the system, partially disassembled, with the spherical chassis 17 housing the optics 2, there being also visible the electronic components and the microprocessor 1 housed on the printed circuit board 16;
  • Figures 7a, 7b illustrate front and top views of the radial mounting of the various shooting cones 19, which overlap by 15-20 degrees and cover the entire shooting hemisphere;
  • Figure 8 illustrates a scheme of the union of the various videos recorded using the chassis with six optics, and the related video merge for spherical imaging;
  • Figure 9 shows a diagram of the merge of the various videos recorded by the different optics, merged via the video-stitching algorithms of the software present in the microprocessor;
  • Figure 10 is the schematic of the communication interface between the electronic cameras, the microprocessor, the battery and the related video outputs for USB 3, the wireless module and the other devices integrated into the system;
  • Figure 11 shows an example of the methods and procedures by which the various video signals and metadata are assembled and then sent in real time to remote devices, as shown in Figure 16;
  • Figure 12 illustrates an example of two frames recorded by adjacent optics inside the device; Figure 13 shows the processing stage, and Figure 14 the merging, of these frames with a 15-20° overlap 19;
  • Figure 15 illustrates how the device is able to send pictures, video and metadata in real time to remote devices through the wireless module and interact with such equipment;
  • Figures 16a, 16b, 16c illustrate how the highly innovative software at the edge of the device can also be used on board modern smartphones, tablets and mobile phones in general, but also on classic cameras such as those already installed for video surveillance, transforming them into devices capable of recording 360° omnidirectional content via front and rear cameras, or via cameras in the system without restriction in number: Figure 16 illustrates the two cameras and the angles of the recorded views: Figure 16a rear, Figure 16b front, and Figure 16c side;
  • Figure 17 illustrates one version of the device with the hand support 18 and the travel support 4.
  • the system of the invention is substantially constituted by the following components:
  • chassis preferably made of aluminum or composite material, containing electronics and two or more optics;
  • a frame structure of the chassis, preferably made of aluminum or composite material, containing the electronics and multiple optics coupled in pairs for each shooting quadrant (in the stereoscopic version);
  • the invention is related to a system comprising at least two lenses that create an omnidirectional video camera, an images and videos recording system that can be used to create a 3D immersive environment at 360°, also stereoscopic.
  • the system uses at least two cameras, in preferred variants even six or eleven cameras; in the 3D stereoscopic version the lenses are doubled for each shooting quadrant. The optics have a resolution >2 megapixels and a camera angle >100°, arranged with overlapping visual frames to capture image data covering the entire 360° scene, oriented so as to have a shooting overlap of at least 15°.
  • the collected data are processed by the chip on the device, which records data from the different cameras; they can be sent in real time to remote devices via a wireless interface, or via cable using USB 3 technology with its greater bandwidth.
  • the camera system can be used to create a 3D model taken from the real world of a scene at 360°, using the triangulation of the image data in the frame of overlap view.
  • a microphone to record sounds in stereo mode, a geolocation (GPS) module, a wireless module, a rangefinder, a Bluetooth module, a GSM-4G module, battery power, memory storage and possibly an optional parachute triggered by an accelerometer.
  • Another aspect of this invention is describing a system of modular cameras with interchangeable parts, in which a single configuration is sufficient for photographing an entire hemisphere of the visual field; in addition, it is possible to shoot with a single operator and a single camera that can be easily carried on one's shoulder, in cars, planes, drones, helicopters or other mobile devices, thanks to its very low size and weight.
  • the microprocessor inside the device has executable instructions and data-fusion algorithms for the recorded data, comprising the steps of:
  • - acquiring the sequence of image data transferred from the plurality of cameras; - acquiring audio data from the at least one microphone in synchronism with the acquisition of the image data;
  • the system is water resistant up to 20 atmospheres of pressure, and is also resistant to weathering and chemicals. It can also be connected to a flexible tube for moving within the human body, in the endoscopic-probe version, or for moving within ducts in video-inspection or video-surveillance applications.
  • a microprocessor built into the camera, using algorithms, processes the data acquired by the cameras: it crops the scanned data from the first, the second and then the third camera; scales the data from the three cameras; rotates the image data produced; then adjusts one or more visual properties of the rotated images, varying exposure, color, brightness and contrast; and finally merges them into a single frame, overlapping each image by about 20%.
  • These frames are produced by the processor board 30 times per second to create a smooth video display, visible through an external system; on board the camera, a detector acquires telemetry data for the acquired footage, and the data are transferred outside through a physical port and via a wireless module, with a transfer speed of at least 1 gigabyte per second.
  • the system includes a card slot accepting a memory card where image data are stored; it also includes a wireless module, and the microprocessor has executable instructions including functions for transmitting the spherical video to remote devices: PCs, tablets, the internet, mobile phones.
  • the system also includes a small parachute driven by an accelerometer: the parachute offers the possibility of releasing the system over inaccessible territories or areas to be put under remote surveillance.
  • the system can also be provided, in a particular version, with a miniaturized motor and a propulsion system to become self-propelled in air, in liquids, on the ground, within conduits, or for use as an endoscopic probe.
  • a gyroscope and an accelerometer are also provided, to determine rotational acceleration; these data are stored in the system as rotation metadata.
  • the system also includes a Global Positioning System ("GPS") to determine changes in the position of the camera system during its movement: these data are stored in the system as global-position GPS metadata, and can then be analyzed to produce charts or patterns.
  • the system also has, in its ROM memory, an algorithm for displaying the spherical video files and the directional sound created by the camera: the videos taken are transferred wirelessly to a spherical video viewer.
  • This display is controlled by a computer, creating, for the processing system, a three-dimensional video "navigable" with a mouse or touchscreen mode in a virtual environment.
  • the system also has a module with GSM data transmission with 4G LTE technology or higher.
  • the system weighs around 180 grams including the chassis, optics and electronics, battery, and its dimensions are: diameter 10 cm, and 2 cm in the endoscopic version, which may be reduced to 1mm or less with modern nanotechnology.
  • the peculiarity of this invention is the ability to record the entire surrounding world in real time, merging the video shot by the various optics so as to cancel the deformation of the images, and obtaining a stereoscopic image by merging the two images from a single shooting quadrant (for stereoscopic vision).
  • the acquired data can also be used to create panoramic images with various configurations of two-dimensional audio-video compression: this gives the operator the ability to view the entire visual frame, including what he would not otherwise perceive behind, above and below him, in a single moment and a single image or video, unlike the human eye, which has a field of vision of approximately 90°; this is useful for control rooms or video security, with the immense advantage of drastically reducing the number of cameras and related monitors in the control room.
  • the sequences of images or videos generated are then sent through the wireless module on board the equipment and, through appropriate software, are usable on standard personal computers, touch screens or via the Internet, with an application that makes the whole 360° scene navigable, or through the new stereoscopic eye viewers.
  • the cameras are oriented in a radial manner with respect to a structure made of plastic, aluminum or composite material. Each camera has a field of view of >90°, which overlaps with the field of view of the adjacent digital optics.
  • the digital cameras, to create the stereoscopic version, are two digital cameras for each shooting quadrant.
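The stereo depth recovery suggested by the paired, overlapping cameras follows the classic triangulation relation Z = f·B/d. A minimal sketch (not from the patent text; focal length, baseline and disparity values are purely hypothetical):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo triangulation: Z = f * B / d.
    A scene point seen by two overlapping cameras, offset by
    `disparity_px` pixels between the views, lies at depth Z."""
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 800 px focal length, 6 cm baseline, 20 px disparity
print(depth_from_disparity(800, 0.06, 20))  # -> 2.4 (metres)
```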

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)
  • Surgical Instruments (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Endoscopes (AREA)

Abstract

A system is described for 360° spherical and stereoscopic video recording, also adapted to view and send data wirelessly to other devices, said system comprising: a structure (3, 5, 6) supporting an electronic board (16) where a microprocessor (1) and other electronic components are housed; at least two electronic high-definition wide-angle cameras (2), with an angle of view greater than 100°, inserted in the structure (3) and arranged radially from the center of the apparatus, designed to create spherical photos and videos; wherein said system is suitable for recording at the same time also in stereoscopic mode, producing already-merged images, stitching in real time the signals from the different optics, and transferring them to remote devices through a wireless module present therein, eliminating the distortion of the images.

Description

SPHERICAL OMNIDIRECTIONAL VIDEO-SHOOTING SYSTEM
This invention relates to a spherical omnidirectional video-shooting system. This system has the innovative function of recording, producing and sending spherical video in real time, via wireless signal, to remote devices. In particular, the invention has the objective of recording images and video content in 'immersive omnidirectional' 360° mode, and is a miniaturized system that has two or more different cameras, combined together in geometric arrangements, inside a chassis.
The limit of modern cameras is that they record video with a very narrow angle of view, losing the ability to record many important details. Even more difficult is rotating the optics to shoot what is behind the camera and comparing it with the images or video taken from the original angle at that time.
This system and its integrated software offer enormous potential: the system has two or more video lenses inside it and can merge, in real time, the streams from all of its video cameras, generating omnidirectional movies. The footage thus obtained can be cut to the desired portion of the video, allowing users to create their own timeline with multiple shots and then upload it to the internet or send it to remote devices. The system can also assist and/or integrate mobile-phone cameras, providing the ability to record omnidirectional content exploiting the optics present in this system, or by using the front and rear optics present in modern phones, smartphones and tablets, with technologies specially designed for this kind of equipment, as shown in Fig. 16.
The classic shots typically used for planetarium or hemispherical IMAX-type screens are created to give the public the illusion of being at the center of the scene, in an immersive environment, such as outdoors on a starry night, with the sky filling the entire curved screen on which it is projected. The classic video recorded by current cameras, even with an optimized lens, is produced at 30 frames per second through a photographic lens of circular shape; distortion and loss of resolution grow with the distance from the center of the captured image, and are therefore greatest at its perimeter. This image is usually cropped to a rectangular shape, traditionally derived from the shape of paintings and the theater stage. In this way, a large part of the useful surface is wasted: a photo with a rectangular aspect ratio of 1:1.85, used for moving images, is able to record only 53% of this circular area.
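The 53% figure can be checked with a short calculation: the largest 1:1.85 rectangle inscribed in the circular image has its diagonal equal to the circle's diameter, and its area comes out at roughly 53% of the circle's area.

```python
import math

# Area of a 1:1.85 rectangle inscribed in a circular image,
# relative to the area of the circle itself.
aspect = 1.85
d = 1.0                                  # image (circle) diameter, normalised
h = d / math.sqrt(1 + aspect**2)         # rectangle height, from w^2 + h^2 = d^2
rect_area = aspect * h * h               # w = aspect * h
circle_area = math.pi * d**2 / 4
print(f"{rect_area / circle_area:.0%}")  # -> 53%
```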
The earlier documents most relevant to this invention are: US-A-7710463, US-A1-2009/0278917 and US-A-7003136.
Object of this invention is solving the above prior-art problems. The proposed mechanism is highly miniaturized, watertight and water resistant up to 20 atmospheres of pressure, and it can be used in movies, video surveillance, commercials, security, video inspections in inaccessible or contaminated places, by the military aboard drones (aerial and underwater), parachuted, as an endoscopic probe, or on board phones or smartphones. This system is dedicated to recording images and videos and, more particularly, to digitally recording images and 360-degree panoramic videos; a special version enables immersive stereoscopic shooting, that is, a system of cameras recording a 360° panoramic field of vision in 3D in digital mode. The system is composed of multiple cameras, at least four in the stereoscopic version, coupled, or doubled for each shooting quadrant, allowing the user to obtain stereoscopic vision: the cameras of the system are able to capture images covering an entire 360° panorama to create sequences of immersive 3D images and video, a 3D movie or a stereoscopic panoramic animation.
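As a back-of-envelope illustration (not from the patent text), the number of cameras needed to close a full 360° ring follows from the per-camera field of view and the required overlap between adjacent views, using the >100° optics and ≥15° overlap figures given elsewhere in the document:

```python
import math

def min_cameras(fov_deg: float, overlap_deg: float) -> int:
    """Minimum cameras needed to cover a full 360° ring, given each
    camera's horizontal field of view and the overlap required
    between adjacent views (one overlap per camera in a ring)."""
    effective = fov_deg - overlap_deg  # unique coverage per camera
    return math.ceil(360.0 / effective)

# With 100° optics and a 15° overlap: 360 / 85 -> 5 cameras for the ring
print(min_cameras(100, 15))  # -> 5
```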
With the optics of the new digital cameras, offering greater resolution and picture quality of 5, 10 or more megapixels and recording at 30-60 or 100 frames per second and beyond, together with the constantly growing speed and processing power of electronic processors, the foundations have been laid for 360° omnidirectional three-dimensional video-shooting systems that can capture image and video data to create three-dimensional images and animation for immersive movies.
In detail, for the optics the USB 3 protocol is used, but the system is already prepared to accept video streams of higher speed, such as Thunderbolt 2 or higher, without limitations, being an open system able to receive new components. The video streams coming from the different optics are merged into one omnidirectional image by the processor of the system, recorded in the storage memory inside the system and/or sent to remote devices via the high-speed wireless module integrated into the system. The movies shot with this innovative technology are designed to be projected on flat screens, but also in hemispherical-dome theaters. Video created for a flat screen and then displayed on a curved surface, such as a dome or IMAX screen, is distorted if shot with classic cameras, while videos recorded with the 360° immersive technology are perfect once projected on both flat and hemispherical screens, in operation rooms, traffic-control or surveillance programs and television formats. The immersive movies can be "navigated" with three-dimensional eyeglasses and visors, which also produce sounds to offer the viewer a realistic view, as if he were at the center of the scene.
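A rough, purely illustrative estimate shows why high-speed links such as USB 3 or Thunderbolt 2 matter here: even a modest multi-camera configuration exceeds USB 3's roughly 5 Gbit/s when uncompressed, so on-board compression or a faster link is needed. All figures below are hypothetical:

```python
# Uncompressed-bandwidth estimate for a hypothetical configuration:
# six 2-megapixel sensors at 30 fps, 24-bit RGB.
cameras = 6
megapixels = 2          # per sensor
bytes_per_pixel = 3     # 24-bit RGB
fps = 30

bits_per_second = cameras * megapixels * 1e6 * bytes_per_pixel * 8 * fps
print(f"{bits_per_second / 1e9:.2f} Gbit/s uncompressed")  # -> 8.64 Gbit/s
```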
To create the illusion of a continuous picture able to fill the inside of a hemisphere, it is necessary to shoot the images with an appropriate camera system that covers the entire field of vision, producing images that are appropriate for such a semi-curved, cylindrical or hemispherical screen, which is exactly the technology that is the objective of this patent. So, the importance and relevance of the technical innovation of this invention is creating 3D spherical video, through a system that includes several (two or more) video cameras, mounted on a modular structure shaped as a sphere, pentagon or dodecahedron, or any other shape, without limitations in size and shape, to shoot in video mode the entire surrounding landscape at 360°, spherically, from the shooting point, with no dead spots. Another innovation is the real chance to realize immersive shooting also taking advantage of the front and rear optics of modern mobile phones, smartphones and tablets. This technology can also be used, for example, on classic cameras mounted on a pole that shoot from different angles, combining the various video signals from each camera to create an omnidirectional image.
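Full-sphere footage of this kind is commonly stored in an equirectangular layout before being projected onto a flat, cylindrical or dome screen. The patent does not name a storage format; purely as an illustration, the mapping from a viewing direction to equirectangular pixel coordinates can be sketched as:

```python
def dir_to_equirect(yaw_deg: float, pitch_deg: float, width: int, height: int):
    """Map a viewing direction (yaw = longitude, pitch = latitude)
    to pixel coordinates in an equirectangular panorama, the flat
    format commonly used to store full-sphere video (illustrative
    only; not taken from the patent text)."""
    u = (yaw_deg % 360.0) / 360.0 * width    # longitude -> x
    v = (90.0 - pitch_deg) / 180.0 * height  # latitude  -> y
    return int(u) % width, int(v)

# A direction at yaw 180°, pitch 0° lands at the center of a 4096x2048 panorama
print(dir_to_equirect(180, 0, 4096, 2048))  # -> (2048, 1024)
```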
In this system, unlike other systems, the video shots are merged in real time by the on-board software, which uses appropriate mathematical image-fusion algorithms hosted on the microprocessor of the device. The high-resolution video projection resulting from a movie that covers the entire (in this case spherical) field of vision has no residual visual distortion, giving the viewer the feeling of being immersed in a spherical dome and the perception of being inside the scene, just as in a virtual-reality video game with synthetic vision, and just as it would be perceived by a viewer in reality, without any optical distortion. This is because the system, with two or more high-definition optics of 2 or more megapixels, has shot in real time across the entire 360° optical field around the shooting point.
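The real-time fusion of adjacent frames across their overlap can be sketched, in highly simplified form, as a linear cross-fade over the shared columns of two frames. This is an illustrative stand-in, not the patent's actual stitching algorithm:

```python
import numpy as np

def blend_overlap(left: np.ndarray, right: np.ndarray, overlap: int) -> np.ndarray:
    """Merge two same-height frames whose last/first `overlap` columns
    show the same part of the scene, cross-fading linearly across the
    overlap (a toy stand-in for the on-board video stitcher)."""
    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]  # weight of left frame
    seam = left[:, -overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return np.concatenate([left[:, :-overlap], seam, right[:, overlap:]], axis=1)

# Two tiny 2x4 RGB "frames" overlapping by two columns
a = np.full((2, 4, 3), 100.0)
b = np.full((2, 4, 3), 200.0)
print(blend_overlap(a, b, 2).shape)  # -> (2, 6, 3)
```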
Compared with existing solutions, the system of this invention has the particularity of being miniaturized, self-powered and portable, but above all of mounting the images internally without the aid of other devices. It is therefore a unique and innovative shooting system with wide-angle lenses that can record at the same time in stereoscopic mode and realize omnidirectional images by merging the signals coming from the different optics, already assembled and processed in real time, ready to be sent via the wireless system, in real time, to remote devices.
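As a minimal illustration of the real-time fusion described above, the seam between two adjacent optics can be hidden by cross-fading the pixels in the band where their fields of view overlap. The sketch below is deliberately simplified (one scanline, linear weights) and entirely hypothetical: the patent does not specify the actual blending function, and all names here are invented for the example.

```python
# Hypothetical sketch of the per-frame blend that stitching software could
# perform in the ~15-20 degree overlap band between two adjacent optics.
# The linear-feather weighting is an illustrative assumption.

def feather_blend(left, right, overlap):
    """Merge two horizontally adjacent scanlines whose last/first
    `overlap` pixels image the same scene region."""
    out = left[:-overlap]                      # exclusive part of left camera
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)            # weight ramps toward right camera
        a, b = left[-overlap + i], right[i]
        out.append((1 - w) * a + w * b)        # cross-fade hides the seam
    out += right[overlap:]                     # exclusive part of right camera
    return out

# Two flat scanlines of brightness 10 and 20 with a 4-pixel overlap:
row = feather_blend([10.0] * 8, [20.0] * 8, overlap=4)
```

The blended row ramps smoothly from one camera's brightness to the other across the overlap band instead of showing a hard edge.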
Video and photo contents are also stored on an on-board memory card. Furthermore, the system is equipped with a GPS module for locating and tracing the movements of the shots taken, so as to produce analyses of the path followed, and with a rangefinder, which records the data necessary to obtain distance parameters; all these data are also assembled together and stored in the storage system as metadata.
Other highly innovative aspects are described below.
In recent years there has been much talk of interactive TV, although so far the results have been poor: little progress and many utopian ideas have come out, which are partial and unsatisfactory and do not offer much to the end user of the service. This system introduces important new innovative concepts. The systems on the market today are created to be controlled by voice commands and hand signals to the TV and by touch gestures, but there is still no real, concrete interactivity in real time with the television "object" or with the network.
Networks have an obsolete design and the viewer is totally passive: he can only watch the images that are presented to him by the director, without being able to interact with them in any way; the only possible interaction is changing the channel.
Moreover, all video systems, including video surveillance, have a very big limitation: they record images with a fixed angle, even in the best cases with the use of wide-angle (>90°) and PTZ cameras. This recording mode, framing only a part of what surrounds the camera, is obviously very disadvantageous and constitutes a limit that has so far been insurmountable.
None of the current systems offers real-time interaction between camera and user and vice versa, and no network transmits images, contents or immersive omnidirectional videos that are interactive with the users.
Networks like YouTube or similar, even if they allow users to upload contents online, do not allow them to interact with what is proposed in any way, certainly not in real time.
This is the innovation of the present system: new systems can be designed starting from four new innovative concepts, fundamentals for the system of this patent application:
1. The usage, by television networks, of omnidirectional real-time cameras, able to record an omnidirectional 360° x 180° video because they are equipped with multiple remotely controlled optics, even introducing new concepts for video direction: compared with "standard" PAL or Full HD devices, our system offers many advantages, namely:
• Vision at 360° x 180°
• Resolution more than 10 times that of standard cameras, up to 100 million pixels or more.
• Real-time live streaming: the viewer becomes the director of the video transmitted in real time, building as he wants his own personal and unique schedule within the event, as if he were present at the shooting point of the omnidirectional cameras, deciding to follow the event from the angle he prefers and looking where he prefers, without having to adapt to the decisions of the director or operator who is shooting the event, thus crossing the insurmountable limits of current video.
• Ubiquity: the viewer can observe as if he were present at several points of the event, or at several events, simultaneously.
2. Sharing and uploading content: users not only can use the part of the video they want, by selecting it from the video recorded by the omnidirectional cameras of this system, but will also be able to upload their own content, also recorded with these omnidirectional cameras, thanks to a real-time upload streaming server. It will be possible to enjoy and "navigate" the contents created by other connected users, who can interact or participate effectively in shared video sessions (e.g., teleconferencing). This will be the new real frontier of interactive TV: every user becomes a television producer, and it is possible to simultaneously enjoy movies uploaded by other users via their PC, phone or tablet and participate in omnidirectional remote sessions.
3. Video management software that combines various video sources in real time (real-time video stitching). At the same time this software is able to merge video signals coming from classic cameras with single optics, to build three-dimensional objects and scenes for biometric analysis of people.
4. Thanks to applications for tablets, mobile phones and smart TVs, the user will have the option to choose and crop, within a 360° omnidirectional spherical video, the portions he desires, to create new scenes. This will allow the user not to miss any detail of the space surrounding the omnidirectional camera object of this patent, unlike classic cameras, which miss everything that is not exactly in front of the camera at that moment, i.e. most of what surrounds them, making them very limited and, when used for surveillance purposes, very vulnerable.
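As a toy illustration of the cropping described in item 4, the sketch below cuts a viewing window out of a one-row 360° panoramic frame, handling the wrap-around at the ±180° seam. The frame layout, the linear mapping of yaw to pixels, and the function name are all assumptions made for the example, not details from the patent.

```python
# Hypothetical sketch: extract a viewing window from a panoramic frame
# stored as one pixel per degree of yaw. Wrap-around lets the window
# straddle the seam where 359° meets 0°.

def crop_viewport(frame, frame_w, yaw_deg, fov_deg):
    """Return the horizontal slice of a 1-row panoramic `frame`
    centred on `yaw_deg` with field of view `fov_deg`."""
    centre = int((yaw_deg + 180.0) / 360.0 * frame_w)  # yaw -> pixel column
    half = int(fov_deg / 360.0 * frame_w) // 2         # half-window in pixels
    return [frame[(centre + dx) % frame_w] for dx in range(-half, half + 1)]

pano = list(range(360))                      # one pixel per degree, 0..359
view = crop_viewport(pano, 360, yaw_deg=0.0, fov_deg=90.0)
```

A phone or tablet app would re-run this crop every time the user drags the view, so each spectator extracts a different window from the same recorded sphere.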
Starting from these new innovative and revolutionary concepts, it is easy to understand, even for people not skilled in multimedia, the immense potential offered by the system of this patent and the possible uses in many different applications.
Thanks to the many options available, this system provides an innovative way to combine, or cut out portions of video from, different video signals taken with different optics, including optics of different quality and resolution.
Conversely, using the same concept, different video signals pointing in one direction can be merged to realize a video of a single three-dimensional object, for analysis of its shape and size using telemetry and/or biometric data.
These and other advantages of the invention just described, which will be highlighted below, are achieved with a system as described in claim 1. Preferred embodiments and non-trivial variations of the present invention are the subject matter of the dependent claims.
It is understood and clear that all the appended claims form an integral part of the present description.
This invention will be better described by some examples of the final product, given by way of non-limiting example, with reference to the related drawings, in which:
Figure 1 illustrates an embodiment of the system and the printed circuit board, where the various electronic components and the digital micro-camera with optics greater than 2 megapixels are placed;
Figure 2 illustrates one of the possible embodiments of the inventive system, having assembled therein the chassis 3 with spherical shape with the use of two optics 2, the figure also showing the monitor 9 and the function keys; Figure 3 illustrates one of the possible embodiments of the assembled system, the chassis 3 in this case of spherical shape with the use of three optics 2, the figure also showing the status LED 10, the slot for the memory card 13, the threading for any tripod or mounting brackets 11, and the USB3 port for data exchange 12;
Figure 4 illustrates the side view of one of the possible embodiments of the assembled system, where the chassis 5 of hemispherical shape accommodates six optics 2, the figure highlighting the possibly mounted tripod 4;
Figure 5 illustrates the side view of one of the possible embodiments of the assembled system, with the chassis 6 of spherical shape housing the optics 2, the figure highlighting the possibly mounted tripod 4;
Figure 6 illustrates a perspective view of one of the possible embodiments of the system partially disassembled, with the chassis 17 of spherical shape housing the optics 2, there being also visible the electronic components and the microprocessor 1 housed on the printed circuit 16;
Figures 7a, 7b illustrate a front and top view of the radial mounting of the various shooting cones 19, which overlap by 15-20 degrees and cover the entire shooting hemisphere;
Figure 8 illustrates a scheme of the union of the various videos recorded with the use of the chassis with six optics and the related video merge for spherical imaging;
Figure 9 shows a diagram of the merge of the various videos recorded by the different optics, merged via the algorithms of the video-stitching software present in the microprocessor;
Figure 10 is the schematics of the communication interface between electronic cameras, microprocessor, battery and related video outputs for USB 3, wireless module and other devices integrated into the system;
Figure 11 shows an example of the methods and procedures by which the various video signals and metadata are assembled and then sent in real time to remote devices, as shown in Figure 16;
Figure 12 illustrates an example of two frames recorded by adjacent optics inside the device, together with the processing stage (Figure 13) and the merging (Figure 14) of these frames with a 15-20° overlap 19;
Figure 15 illustrates how the device is able to send pictures, video and metadata in real time to remote devices through the wireless module and interact with such equipment.
Figures 16a, 16b, 16c illustrate how the highly innovative software on board of the device can also be used on board of modern smartphones, tablets and mobile phones in general, but also on classic cameras such as those already installed for video surveillance, transforming them into devices capable of 360° omnidirectional content recording via front and rear cameras, or cameras in the system without restrictions in number: Figure 16 illustrates the two cameras and the angles of the recording views: Figure 16a rear, Figure 16b front, and Figure 16c side;
Figure 17 illustrates one version of the device with hand support 18 and travel support 4.
The system of the invention is substantially constituted by the following components:
- a chassis, preferably made of aluminum or composite material, containing electronics and two or more optics;
- a processor and other electronic components that can be updated using new software and hardware with other components;
- a series of digital cameras of 2 megapixels or greater, with wide-angle lenses and a viewing and recording angle >90°;
- a frame structure of the chassis, preferably made of aluminum or composite material, containing the electronics and multiple optics coupled in pairs, two for each shooting quadrant (in the stereoscopic version);
- possible extension tube for mounting with a preferable range from 5-25 cm;
- frame plate for mounting;
- an interface and/or the possibility of installing a cable for data transfer, in particular of USB3 type or faster;
- battery power;
- wireless module;
- GPS module;
- status LEDs (at least two);
- an optional parachute, and a tripod in the parachuted-probe version;
- display (at least one) ;
- gyroscope;
- GSM module;
- memory storage;
- a telemetric laser rangefinder;
- an engine for a movement system on wheels, or propulsion for moving on the ground, in the air, on surfaces, in liquids or in conduits;
- stereo microphone (at least one) .
The invention is related to a system comprising at least two lenses that create an omnidirectional video camera: an image and video recording system that can be used to create a 360° immersive 3D environment, also stereoscopic. The system uses at least two cameras, with preferable variations even with six or eleven cameras; in the 3D stereoscopic version the lenses are doubled for each shooting quadrant. The optics have a resolution >2 megapixels and a camera angle >100°, organized with overlapping visual frames so as to capture image data covering the entire 360° scene, oriented so as to have a shooting overlap of at least 15°.
The collected data are processed by the chip on the device, which records data from the different cameras; they can be sent in real time to remote devices via a wireless interface or via a cable with USB 3 technology, with greater bandwidth. The camera system can be used to create a 3D model of a 360° scene taken from the real world, using triangulation of the image data in the overlapping frame regions.
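The triangulation mentioned here can be illustrated with the classic stereo relation between two overlapping cameras: a scene point's apparent shift (disparity) between the two views, together with the known distance between the optics (baseline), yields its depth. The function name and all numeric values below are illustrative assumptions, not parameters from the patent.

```python
# Illustrative sketch of depth-from-overlap triangulation: two adjacent
# optics see the same point, and its pixel shift between the two frames
# encodes distance via the pinhole-camera stereo relation Z = f * B / d.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance of a point seen by both cameras (metres)."""
    if disparity_px <= 0:
        raise ValueError("point must shift between the two views")
    return focal_px * baseline_m / disparity_px

# Assumed example values: 800 px focal length, 5 cm between optics,
# 8 px shift of the matched point between the two overlapping frames.
z = depth_from_disparity(focal_px=800.0, baseline_m=0.05, disparity_px=8.0)
```

Points matched across the overlap band would each get such a depth estimate, from which a 3D model of the surrounding scene can be assembled.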
Inside the object there are also a microphone to record sounds in stereo mode, a geolocation (GPS) module, a wireless module, a rangefinder, a Bluetooth module, a GSM-4G module, a battery power supply, memory storage and possibly an optional parachute driven by an accelerometer.
Another aspect of this invention is the description of a system of modular cameras with interchangeable parts, in which a single configuration is sufficient for photographing an entire hemisphere of the visual field; in addition, it is possible to shoot with one operator and a single camera that can be easily carried on one's shoulder, or in cars, planes, drones, helicopters or other mobile devices, due to its very small size and low weight.
In particular, the microprocessor inside the device has executable instructions and fusion algorithms of the recorded data comprising the steps of:
- acquiring images;
- transferring the sequence of acquired image data from the plurality of cameras;
- acquiring audio data from the microphone at least in synchronism with the acquisition of the image data;
- processing the data acquired from the multiple cameras, whose fields of vision overlap by about 15-20° on the recorded images;
- processing the acquired telemetry data;
- assembling, via the microprocessor, the encoded data of the individual images and the audio of the acquired data, to provide as a result a spherical image, product of the merge of the various optics joined together;
- producing a spherical video file;
- recording the immersive 360° video in the internal memory of the device system; and
- sending images via the wireless module to remote systems (cloud/internet).
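The acquisition-and-assembly steps above can be sketched as a per-frame loop. This is a minimal sketch under stated assumptions: every class and function name here (StubCamera, stitch, process_frame) is invented for illustration, with trivial stubs standing in for real hardware drivers and for the patent's actual fusion algorithms.

```python
# Hypothetical per-frame pipeline mirroring the listed steps:
# acquire all optics, merge, assemble with audio and telemetry,
# record locally, then hand the frame to the wireless sender.

class StubCamera:
    def __init__(self, pixels):
        self.pixels = pixels

    def capture(self):
        return list(self.pixels)

def stitch(frames):
    # placeholder fusion: concatenate the frames (real code would blend
    # the 15-20 degree overlap bands between adjacent optics)
    merged = []
    for f in frames:
        merged.extend(f)
    return merged

def process_frame(cameras, audio_sample, telemetry, store):
    images = [cam.capture() for cam in cameras]   # 1. acquire all optics
    sphere = stitch(images)                       # 2. merge into one sphere
    frame = {"video": sphere,                     # 3. assemble video, audio
             "audio": audio_sample,               #    and metadata together
             "meta": telemetry}
    store.append(frame)                           # 4. record to local memory
    return frame                                  # 5. ready to send wirelessly

store = []
cams = [StubCamera([1, 2]), StubCamera([3, 4])]
f = process_frame(cams, audio_sample=[0.1],
                  telemetry={"gps": (0.0, 0.0)}, store=store)
```

In the real device this loop would run once per video frame, with the wireless module consuming each assembled frame as it is stored.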
The system is water resistant up to 20 atmospheres of pressure, and is also resistant to weathering and chemicals. It also has the ability to be connected to a flexible tube, for moving within the human body in the endoscopic-probe version, or within ducts to be video-inspected in the video-surveillance application. A microprocessor built into the camera processes, using algorithms, the data acquired by the cameras: it crops the scanned data from the first, the second and then the third camera; scales the data from the three cameras; rotates the image data produced; then adjusts one or more visual properties of the rotated images, varying exposure, color, brightness and contrast; and finally merges them into a single frame, with each image overlapping by about 20%. These frames are produced by the processor board 30 times per second to create a smooth video display, visible through an external system on board the camera, which has a detector for the telemetry data among the acquired data, and transfers the data outside through a physical port and via a wireless module, with a transfer speed of at least 1 gigabyte per second.
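The visual-property adjustment step just described (varying exposure, color, brightness and contrast before the merge) can be illustrated with a toy gain-matching routine. This is an assumption-laden sketch: matching mean brightness between adjacent frames is one common, simple choice, not the method the patent mandates, and the function name is invented.

```python
# Hypothetical per-camera adjustment before merging: scale one image's
# brightness so its mean matches a reference (e.g. the adjacent camera),
# reducing visible exposure steps at the stitching seam.

def match_brightness(image, reference):
    """Return `image` scaled so its mean brightness equals `reference`'s."""
    mean_img = sum(image) / len(image)
    mean_ref = sum(reference) / len(reference)
    gain = mean_ref / mean_img        # single multiplicative exposure gain
    return [p * gain for p in image]

# A darker frame is lifted to the exposure level of its neighbour:
adjusted = match_brightness([50, 100, 150], [100, 200, 300])
```

After this normalization the overlap-band blend no longer has to hide an exposure jump, only residual geometric misalignment.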
The system includes a card slot for accepting a memory card where image data are stored; it also includes a wireless module for which the microprocessor has executable instructions, including the functions of transmitting the spherical video to remote devices: PCs, tablets, the internet, mobile phones.
The system also includes a small parachute driven by an accelerometer: the parachute system offers the possibility of releasing the device over inaccessible territories or areas, or wherever an area is to be put under remote surveillance.
The system can also be provided, in a particular version, with a miniaturized motor and a propulsion system to become self-propelled in air, in liquids, on the ground, within conduits, or for use as an endoscopic probe.
A gyroscope and an accelerometer are also provided, to determine the rotational acceleration; these data are stored in the system as rotation metadata.
The system also includes a Global Positioning System ("GPS") to determine changes in the position of the camera system during its movement: these data are stored in the system as global GPS position metadata, and can then be analyzed to produce charts or patterns.
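As one example of analysing the stored GPS metadata to produce charts or patterns, the path length travelled by the device can be recovered from consecutive position fixes. The haversine formula used below is a standard great-circle approximation; it is an illustrative choice, since the patent does not prescribe a specific analysis, and the sample coordinates are made up.

```python
import math

# Illustrative sketch: reconstruct the distance travelled from the
# sequence of (lat, lon) fixes stored as GPS metadata alongside the video.

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) fixes."""
    R = 6371000.0                                   # mean Earth radius, m
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def path_length_m(fixes):
    """Sum of leg distances along the recorded track."""
    return sum(haversine_m(a, b) for a, b in zip(fixes, fixes[1:]))

track = [(45.0, 9.0), (45.0, 9.001)]   # two fixes roughly 79 m apart
d = path_length_m(track)
```

The same per-fix metadata could equally feed a plotted track or a speed profile, synchronized with the recorded video frames.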
The system also has, in its ROM memory, an algorithm for displaying spherical video files and the directional sound created by the camera: the videos taken are transferred wirelessly to a spherical video viewer. This viewer is controlled by a computer, creating, for the processing system, a three-dimensional video "navigable" with a mouse or in touchscreen mode in a virtual environment.
The system also has a module with GSM data transmission with 4G LTE technology or higher.
In the preferred embodiment, the system weighs around 180 grams including the chassis, optics, electronics and battery, and its dimensions are: 10 cm in diameter, and 2 cm in the endoscopic version, which may be reduced to 1 mm or less with modern nanotechnology.
In summary, the peculiarity of this invention is the ability to record the entire surrounding world in real time, by merging the videos shot from the various optics so as to cancel image deformation, and by merging the two images from a single shooting quadrant to obtain a stereoscopic image (for stereoscopic vision). The acquired data can also be used for creating panoramic images with various configurations of two-dimensional audio-video compression: this gives the operator the ability to view the entire visual frame, including what he cannot perceive behind, above and below him, in a single moment and in a single image or video, unlike the human eye, which has a field of vision of approximately 90°. This is useful for control rooms or for video security, with the immense advantage of drastically reducing the number of cameras and related monitors present in the control room.
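One common way to store a full spherical view as the two-dimensional panoramic image mentioned above is the equirectangular projection, which maps yaw and pitch linearly to pixel coordinates. This mapping is an illustrative assumption on my part; the patent does not name a specific projection, and the function name and resolution below are invented for the example.

```python
# Sketch of the equirectangular mapping: a viewing direction
# (yaw in [-180, 180), pitch in [-90, 90]) becomes a pixel of a flat
# 2:1 panoramic frame that holds the entire sphere.

def direction_to_pixel(yaw_deg, pitch_deg, width, height):
    """Map a spherical direction to (x, y) in an equirectangular image."""
    x = (yaw_deg + 180.0) / 360.0 * width    # yaw sweeps the full width
    y = (90.0 - pitch_deg) / 180.0 * height  # pitch: zenith at top row
    return int(x) % width, min(int(y), height - 1)

# Straight ahead (yaw 0, pitch 0) lands at the centre of the frame:
px = direction_to_pixel(0.0, 0.0, width=3840, height=1920)
```

Such a flat frame can then be compressed and transmitted with ordinary 2D video codecs, while the viewer application inverts the mapping to render the navigable sphere.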
The sequences of images or videos generated are then sent through the wireless module on board the equipment and, through appropriate software, are usable on standard personal computers, on touch screens or via the Internet, with an application that makes the whole 360° scene navigable, or through the new stereoscopic eye viewers.
The cameras are oriented in a radial manner with respect to a structure made of plastic, aluminum or composite material. Each camera has a field of view >90°, which overlaps with the field of view of the adjacent digital optics. To create the stereoscopic version, there are two digital cameras for each quadrant.

Claims

1. System for 360° spherical and stereoscopic video recordings, also adapted to view and send data wirelessly to other devices, said system comprising:
a structure (3, 5, 6) supporting an electronic board (16) where a microprocessor (1) and other electronic components are housed;
at least two electronic high-definition wide-angle cameras (2), with an angle greater than 100°, inserted in the structure (3) and arranged radially from the center of the apparatus, designed to create spherical photos and videos,
wherein said system is suitable for recording at the same time also in stereoscopic mode, producing already merged images, stitching in real time signals from different optics, and transferring them to remote devices through a wireless module present therein, eliminating the distortion of images.
2. System according to claim 1, characterized in that it also includes:
- at least one touch-screen display (9) to display functions, data and images;
- at least one multi-color LED (10) to indicate the status of the equipment;
- the electronic board (16), on which a GPS, an accelerometer and a laser rangefinder for tracking objects are also installed, with sync and data transfer via the wireless module or using the disconnectable power connection (7).
3. System according to claim 1 or 2, characterized in that it is adapted to operate in air or liquid, on the ground, or driven by a motor and a propulsion system, to be used in areas affected by natural disasters, earthquakes, seismic events or volcanic events, in areas subjected to geological or potholing studies, in video-surveillance areas, or as an endoscopic probe inside the human body, or also in the circulatory system of living beings and all those places inaccessible to the human eye, aboard terrestrial, aerial and underwater vehicles, said system being further adapted to also record an audio signal, a GPS position, telemetry data and laser tracking, transferring the images and data recorded and sending data via wireless or Bluetooth to other remote devices.
4. System according to any one of claims 1 to 3, characterized in that said system further comprises:
a parachute adapted to be opened, when the system is parachuted, by means of an accelerometer, and then detached when the system lies on the ground, leaning on a possible tripod (4);
- two function keys (14, 15) that can be remotely controlled.
5. System according to any one of the preceding claims, characterized in that said system further comprises:
- a rechargeable battery power supply;
- a GSM module, of 4G LTE type or higher;
- an accelerometer;
- a gyroscope;
- a bracket (11) for attaching tripods (4), or supports for attaching to helmets or equipment, or additional support flanges;
- a flexible tube (8), attachable for use as an endoscopic probe, that engages the bracket (11);
- a wireless module to provide access to other devices;
- a bluetooth module;
- a telemetry data detector;
- a slot (13) for the acceptance of cards for additional storage, where the sequences of images or videos are stored;
- a data input/output (12) of USB3 type or higher;
- at least one stereo omnidirectional microphone to capture audio data corresponding to the acquired image data;
- a memory for storing data;
- a microprocessor that performs the functions of the image fusion algorithms, using the functions of storing images and acquiring GPS, telemetry and accelerometer metadata, and of transmitting data wirelessly or through a physical cable and/or storing them in the internal memory.
6. System according to claim 5, characterized in that said microprocessor has stored executable instructions designed to implement the following steps :
- acquiring images from the different lenses (19), whose fields of view overlap by about 15-20° between each optic and the adjacent one;
- once the sequence of image acquisition has been initiated, transferring the data from the plurality of cameras to the microprocessor;
- processing the data from the GPS module, accelerometer and rangefinder;
- assembling the acquired data, to provide as a result a spherical image created from the union of the various images from the various optics joined together;
- generating a spherical video file, and
- saving it in the internal memory of the device, after which the system sends it to remote devices through the integrated wireless module.
7. System according to claim 5 or 6, characterized in that the microprocessor is additionally provided with instructions to perform the following steps:
- trimming the data acquired from the first, the second and then the third camera and so on;
- scaling the cut data from the first, then the second and finally the third camera, and so on;
- rotating the image data produced by the cameras;
- intercepting similar points in each image in the overlapping areas (19) ;
- controlling similar points;
- adjusting one or more visual properties of the images, rotating them, varying exposure, color, brightness and contrast, and then merging them into a single frame, producing such frames 30 times per second and generating a fluid video that can be used by remote devices.
8. System according to any one of the preceding claims, characterized in that it further comprises:
- an accelerometer and a gyroscope to determine rotational acceleration, these data being stored as rotation metadata.
9. System according to any one of the preceding claims, characterized in that it further comprises a global positioning system, GPS, for determining changes in position of the camera system while moving the camera, these data being stored as global GPS position metadata.
10. System according to any one of the preceding claims, characterized in that it further comprises, inserted in its ROM memory system, an algorithm for the assembly of video files in which there is directional sound created by the camera, a video shot being transferred wirelessly to a spherical video viewer, said viewer being run on remote devices, mobile phones, tablets, PCs or similar, presenting to the processing system a three-dimensional video navigable with a mouse or in touch-screen mode, or in a virtual environment navigable through stereoscopic eye viewers.
PCT/IT2014/000095 2013-04-04 2014-04-03 Spherical omnidirectional video-shooting system WO2014162324A1 (en)


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT000063U ITRM20130063U1 (en) 2013-04-04 2013-04-04 PROBE FOR ENDOSCOPIC SHOOTS AND VIDEOINSPECTS, NAME REALWORLD360
ITRM2013U000063 2013-04-04

Publications (1)

Publication Number Publication Date
WO2014162324A1 true WO2014162324A1 (en) 2014-10-09


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2600308C1 (en) * 2015-11-03 2016-10-20 Вячеслав Михайлович Смелков Device of computer system for panoramic television surveillance
CN106488139A (en) * 2016-12-27 2017-03-08 深圳市道通智能航空技术有限公司 Image compensation method, device and unmanned plane that a kind of unmanned plane shoots
WO2017127816A1 (en) * 2016-01-22 2017-07-27 Ziyu Wen Omnidirectional video encoding and streaming
WO2017152600A1 (en) * 2016-03-11 2017-09-14 Effire Universal Limited Smartphone with a vr content capturing assembly
US9838668B2 (en) 2014-06-17 2017-12-05 Actality, Inc. Systems and methods for transferring a clip of video data to a user facility
WO2018154589A1 (en) * 2017-02-23 2018-08-30 Kshitij Marwah An apparatus, method, and system for capturing 360/virtual reality video using a mobile phone add-on
US10102610B2 (en) 2016-04-05 2018-10-16 Qualcomm Incorporated Dual fisheye images stitching for spherical video
KR101914206B1 (en) * 2016-09-19 2018-11-01 주식회사 씨오티커넥티드 Server of cloud audio rendering based on 360-degree vr video
US10275928B2 (en) 2016-04-05 2019-04-30 Qualcomm Incorporated Dual fisheye image stitching for spherical image content
EP3696589A1 (en) * 2019-02-13 2020-08-19 Vecnos Inc. Imaging device
JP2020134938A (en) * 2019-02-13 2020-08-31 ベクノス株式会社 Imaging device
CN111953863A (en) * 2020-08-07 2020-11-17 山东金东数字创意股份有限公司 Special-shaped LED point-to-point video snapshot mapping system and method
CN114339157A (en) * 2021-12-30 2022-04-12 福州大学 Multi-camera real-time splicing system and method with adjustable observation area

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106027919A (en) * 2016-06-30 2016-10-12 北京和兴宏图科技有限公司 Video camera
CN106210535A (en) * 2016-07-29 2016-12-07 北京疯景科技有限公司 The real-time joining method of panoramic video and device
CN106162206A (en) * 2016-08-03 2016-11-23 北京疯景科技有限公司 Panorama recording, player method and device
KR20180040451A (en) * 2016-10-12 2018-04-20 엘지전자 주식회사 Mobile terminal and operating method thereof
CN106572356A (en) * 2016-10-20 2017-04-19 安徽协创物联网技术有限公司 Motion VR camera for enabling real-time video broadcast
CN106713996A (en) * 2016-12-31 2017-05-24 天脉聚源(北京)科技有限公司 Method and apparatus for constructing panoramic image television program
CN106989730A (en) * 2017-04-27 2017-07-28 上海大学 A kind of system and method that diving under water device control is carried out based on binocular flake panoramic vision
CN107105143A (en) * 2017-05-13 2017-08-29 杜广香 A kind of image acquiring method and equipment
CN108267454B (en) * 2018-01-30 2023-07-07 中国计量大学 Is applied to the blocking of the inside of a pressure fluid pipe fitting Defect measurement positioning system and method
KR102177401B1 (en) * 2018-02-02 2020-11-11 재단법인 다차원 스마트 아이티 융합시스템 연구단 A noiseless omnidirectional camera device
KR101982751B1 (en) * 2018-12-27 2019-05-27 주식회사 월드씨엔에스 Video surveillance device with motion path tracking technology using multi camera
CN109474797B (en) * 2019-01-04 2023-12-08 北京快鱼电子股份公司 Conference transcription system based on panoramic camera and microphone array
CN111284692A (en) * 2020-03-27 2020-06-16 深圳市格上格创新科技有限公司 Panoramic camera unmanned aerial vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040246333A1 (en) * 2003-06-03 2004-12-09 Steuart Leonard P. (Skip) Digital 3D/360 degree camera system
US20050012745A1 (en) * 2002-06-03 2005-01-20 Tetsujiro Kondo Image processing device and method, program, program recording medium, data structure, and data recording medium
US7003136B1 (en) 2002-04-26 2006-02-21 Hewlett-Packard Development Company, L.P. Plan-view projections of depth image data for object tracking
US20090278914A1 (en) 1997-04-21 2009-11-12 Masakazu Koyanagi Controller for Photographing Apparatus and Photographing System
US7710463B2 (en) 1999-08-09 2010-05-04 Fuji Xerox Co., Ltd. Method and system for compensating for parallax in multiple camera systems
ITRM20120329A1 (en) * 2012-07-12 2012-10-11 Virtualmind Di Davide Angelelli 360 ° IMMERSIVE / SPHERICAL VIDEO CAMERA WITH 6-11 OPTICS 5-10 MEGAPIXEL WITH GPS GEOLOCALIZATION

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5023725A (en) * 1989-10-23 1991-06-11 Mccutchen David Method and apparatus for dodecahedral imaging system
IL139995A (en) * 2000-11-29 2007-07-24 Rvc Llc System and method for spherical stereoscopic photographing
US6831699B2 (en) * 2001-07-11 2004-12-14 Chang Industry, Inc. Deployable monitoring device having self-righting housing and associated method
CN201903752U (en) * 2010-12-04 2011-07-20 徐进 Panoramic camera

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090278914A1 (en) 1997-04-21 2009-11-12 Masakazu Koyanagi Controller for Photographing Apparatus and Photographing System
US7710463B2 (en) 1999-08-09 2010-05-04 Fuji Xerox Co., Ltd. Method and system for compensating for parallax in multiple camera systems
US7003136B1 (en) 2002-04-26 2006-02-21 Hewlett-Packard Development Company, L.P. Plan-view projections of depth image data for object tracking
US20050012745A1 (en) * 2002-06-03 2005-01-20 Tetsujiro Kondo Image processing device and method, program, program recording medium, data structure, and data recording medium
US20040246333A1 (en) * 2003-06-03 2004-12-09 Steuart Leonard P. (Skip) Digital 3D/360 degree camera system
ITRM20120329A1 (en) * 2012-07-12 2012-10-11 Virtualmind Di Davide Angelelli 360 ° IMMERSIVE / SPHERICAL VIDEO CAMERA WITH 6-11 OPTICS 5-10 MEGAPIXEL WITH GPS GEOLOCALIZATION
EP2685707A1 (en) * 2012-07-12 2014-01-15 Virtualmind di Davide Angelelli System for spherical video shooting

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9838668B2 (en) 2014-06-17 2017-12-05 Actality, Inc. Systems and methods for transferring a clip of video data to a user facility
RU2600308C1 (en) * 2015-11-03 2016-10-20 Вячеслав Михайлович Смелков Device of computer system for panoramic television surveillance
WO2017127816A1 (en) * 2016-01-22 2017-07-27 Ziyu Wen Omnidirectional video encoding and streaming
WO2017152600A1 (en) * 2016-03-11 2017-09-14 Effire Universal Limited Smartphone with a vr content capturing assembly
US10275928B2 (en) 2016-04-05 2019-04-30 Qualcomm Incorporated Dual fisheye image stitching for spherical image content
US10102610B2 (en) 2016-04-05 2018-10-16 Qualcomm Incorporated Dual fisheye images stitching for spherical video
KR101914206B1 (en) * 2016-09-19 2018-11-01 주식회사 씨오티커넥티드 Server for cloud audio rendering based on 360-degree VR video
CN106488139A (en) * 2016-12-27 2017-03-08 深圳市道通智能航空技术有限公司 Image compensation method and device for images captured by an unmanned aerial vehicle, and unmanned aerial vehicle
WO2018154589A1 (en) * 2017-02-23 2018-08-30 Kshitij Marwah An apparatus, method, and system for capturing 360/virtual reality video using a mobile phone add-on
EP3696589A1 (en) * 2019-02-13 2020-08-19 Vecnos Inc. Imaging device
CN111565269A (en) * 2019-02-13 2020-08-21 唯光世株式会社 Image capturing apparatus
JP2020134938A (en) * 2019-02-13 2020-08-31 ベクノス株式会社 Imaging device
JP7467958B2 (en) 2019-02-13 2024-04-16 株式会社リコー Imaging device
CN111565269B (en) * 2019-02-13 2024-05-03 株式会社理光 Image pickup apparatus
CN111953863A (en) * 2020-08-07 2020-11-17 山东金东数字创意股份有限公司 Special-shaped LED point-to-point video snapshot mapping system and method
CN114339157A (en) * 2021-12-30 2022-04-12 福州大学 Multi-camera real-time stitching system and method with adjustable observation area

Also Published As

Publication number Publication date
SG10201508072WA (en) 2015-10-29
GB201520437D0 (en) 2016-01-06
ITRM20130063U1 (en) 2014-10-05
CN105684415A (en) 2016-06-15

Similar Documents

Publication Publication Date Title
WO2014162324A1 (en) Spherical omnidirectional video-shooting system
US10237455B2 (en) Camera system
US20170195568A1 (en) Modular Panoramic Camera Systems
US20160286119A1 (en) Mobile Device-Mountable Panoramic Camera System and Method of Displaying Images Captured Therefrom
US9939843B2 (en) Apparel-mountable panoramic camera systems
US7429997B2 (en) System and method for spherical stereoscopic photographing
US20170155888A1 (en) Systems and Methods for Transferring a Clip of Video Data to a User Facility
US20150234156A1 (en) Apparatus and method for panoramic video imaging with mobile computing devices
US20170227841A1 (en) Camera devices with a large field of view for stereo imaging
WO2018133849A1 (en) Panoramic image photographic method, panoramic image display method, panoramic image photographic device, and panoramic image display device
US20100045773A1 (en) Panoramic adapter system and method with spherical field-of-view coverage
EP2685707A1 (en) System for spherical video shooting
JP5483027B2 (en) 3D image measurement method and 3D image measurement apparatus
US20180295284A1 (en) Dynamic field of view adjustment for panoramic video content using eye tracker apparatus
US20170195563A1 (en) Body-mountable panoramic cameras with wide fields of view
US20130021448A1 (en) Stereoscopic three-dimensional camera rigs
WO2020059327A1 (en) Information processing device, information processing method, and program
KR20160102845A (en) Flight possible omnidirectional image-taking camera system
JP6969249B2 (en) Systems and programs for panoramic portals for connecting remote spaces
KR101889225B1 (en) Method of obtaining stereoscopic panoramic images, playing the same and stereoscopic panoramic camera
WO2022220306A1 (en) Video display system, information processing device, information processing method, and program
WO2016196825A1 (en) Mobile device-mountable panoramic camera system method of displaying images captured therefrom
WO2021030518A1 (en) System for producing a continuous image from separate image sources
TW202203646A (en) Analog panoramic system and method of used the same

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 14733737

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 10201500001436

Country of ref document: CH

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 1520437.3

Country of ref document: GB

122 Ep: PCT application non-entry in European phase

Ref document number: 14733737

Country of ref document: EP

Kind code of ref document: A1