US20130141526A1 - Apparatus and Method for Video Image Stitching

Apparatus and Method for Video Image Stitching

Info

Publication number
US20130141526A1
Authority
US
United States
Prior art keywords
video camera
video
image
method
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/691,632
Inventor
Bill Banta
Geoff Donaldson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Amazon Technologies Inc
Stealth HD Corp
Original Assignee
Stealth HD Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201161566269P priority Critical
Application filed by Stealth HD Corp filed Critical Stealth HD Corp
Priority to US13/691,632 priority patent/US20130141526A1/en
Assigned to Stealth HD Corp. reassignment Stealth HD Corp. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BANTA, BILL, DONALDSON, GEOFF
Publication of US20130141526A1 publication Critical patent/US20130141526A1/en
Assigned to STEALTHHD, INC. reassignment STEALTHHD, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BANTA, BILL, DONALDSON, GEOFF
Assigned to STEALTHHD LLC reassignment STEALTHHD LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: STEALTHHD, INC.
Assigned to AMAZON TECHNOLOGIES, INC. reassignment AMAZON TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: STEALTHHD LLC
Application status: Abandoned

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; operations thereof
              • H04N 21/21: Server components or server architectures
                • H04N 21/218: Source of audio or video content, e.g. local disk arrays
                  • H04N 21/21805: Source of audio or video content enabling multiple viewpoints, e.g. using a plurality of cameras
              • H04N 21/23: Processing of content or additional data; elementary server operations; server middleware
                • H04N 21/234: Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
                  • H04N 21/2343: Reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
                    • H04N 21/234363: Reformatting by altering the spatial resolution, e.g. for clients with a lower screen resolution
                    • H04N 21/23439: Reformatting for generating different versions
                • H04N 21/239: Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
                  • H04N 21/2393: Interfacing involving handling client requests
            • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; operations thereof
              • H04N 21/41: Structure of client; structure of client peripherals
                • H04N 21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
                  • H04N 21/4223: Cameras
              • H04N 21/45: Management operations performed by the client for facilitating the reception of or the interaction with the content, or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
                • H04N 21/466: Learning process for intelligent management, e.g. learning user preferences for recommending movies
                  • H04N 21/4668: Learning process for recommending content, e.g. movies
              • H04N 21/47: End-user applications
                • H04N 21/472: End-user interface for requesting content, additional data or services, or for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
                  • H04N 21/4728: End-user interface for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
                  • H04N 21/47202: End-user interface for requesting content on demand, e.g. video on demand
            • H04N 21/80: Generation or processing of content or additional data by content creator independently of the distribution process; content per se
              • H04N 21/81: Monomedia components thereof
                • H04N 21/816: Monomedia components involving special video data, e.g. 3D video
          • H04N 5/00: Details of television systems
            • H04N 5/222: Studio circuitry; studio devices; studio equipment; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices such as mobile phones, computers or vehicles
              • H04N 5/225: Television cameras; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices such as mobile phones, computers or vehicles
                • H04N 5/232: Devices for controlling television cameras, e.g. remote control; control of cameras comprising an electronic image sensor
                  • H04N 5/23238: Control of image capture or reproduction to achieve a very large field of view, e.g. panorama
          • H04N 2101/00: Still video cameras

Abstract

A method includes collecting a first image from a first video camera of a test pattern and a second image from a second video camera of the test pattern. The first image and the second image have an overlap region. The overlap region is evaluated to generate calibration parameters that accommodate for any vertical, horizontal or rotational misalignment between the first image and the second image. Calibration parameters are applied to video streams from the first video camera and the second video camera.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority to U.S. Provisional Patent Application 61/566,269, filed Dec. 2, 2011, entitled “Panoramic Video Camera System and Related Methods”, the contents of which are incorporated herein by reference.
  • FIELD OF THE INVENTION
  • This invention relates generally to video image processing. More particularly, this invention relates to panoramic video image stitching utilizing alignment and calibration parameters for dynamic video image compensation.
  • BACKGROUND OF THE INVENTION
  • Panoramic video feeds should be combined in a seamless manner. Existing techniques that endeavor to achieve this have high power requirements and excessive processing times, particularly in the context of mobile devices, which are constrained by these parameters.
  • SUMMARY OF THE INVENTION
  • A method includes collecting a first image from a first video camera of a test pattern and a second image from a second video camera of the test pattern. The first image and the second image have an overlap region. The overlap region is evaluated to generate calibration parameters that accommodate for any vertical, horizontal or rotational misalignment between the first image and the second image. Calibration parameters are applied to video streams from the first video camera and the second video camera.
  • BRIEF DESCRIPTION OF THE FIGURES
  • The invention is more fully appreciated in connection with the following detailed description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 illustrates a calibration system utilized in accordance with an embodiment of the invention.
  • FIG. 2 illustrates processing operations associated with an embodiment of the invention.
  • FIGS. 3 and 4 illustrate image correction operations performed in accordance with an embodiment of the invention.
  • FIGS. 5 and 6 illustrate error correction operations performed in accordance with an embodiment of the invention.
  • Like reference numerals refer to corresponding parts throughout the several views of the drawings.
  • DETAILED DESCRIPTION OF THE INVENTION
  • An embodiment of the invention enables image stitching at high frame rates to produce an output video data stream created from more than one input video stream. Test and calibration processes using still photographs of charts and video capture of moving targets enable alignment of the cameras and determine the known overlap between cameras.
  • FIG. 1 illustrates a calibration technique utilized in accordance with an embodiment of the invention. A first camera 100 with a viewing angle 102 takes an image of a test pattern 104. Similarly, a second camera 106 with a viewing angle 108 takes an image of the test pattern 104. The viewing angles have an overlap region 110. The cameras 100 and 106 have fixed positions in mount 112. Each camera delivers image data to an image processor 114. The image processor 114 may be a computer with a central processor, graphics processing unit and/or an additional processor.
  • Still photographs are used to determine the fixed overlap area 110 from one image sensor to the adjacent image sensor, based on alignment within the final system or on a standardized calibration fixture. Sharp lines of the test pattern 104 allow the image processor 114 to program the known overlap at the pixel level. That is, the image processor 114 accommodates for any vertical, horizontal or rotational misalignment. The test pattern 104 also allows the image processor 114 to determine any focus errors or areas of soft focus for each image sensor so that image correction processing can be applied later to produce uniformly sharp images.
  • The test pattern 104 may be in the form of a grid. In this case, the field of view of each image capture device will have areas where the image has some distortion caused by either the lens or some other irregularity in the system. To produce an output video stream from more than one video input, the images should appear to have no distortion or uniform distortion along the edge where the input streams are joined. By calculating a known distortion for each image capture device, distortion for individual cameras is corrected. Since resulting distortions for points in the image plane will be known for each image capture device, the distortion between images will be corrected by image processing to create an image with minimal distortion between camera data streams.
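  • As an illustration of how the per-camera distortion just described can be measured, the sketch below uses OpenCV's chessboard calibration to recover a camera matrix and distortion coefficients from several views of a grid test pattern. It is a minimal example under stated assumptions: the 9x6 grid size, the chessboard-style pattern and the helper name calibrate_camera_from_grid are illustrative choices, not details taken from this specification.

        import cv2
        import numpy as np

        GRID_COLS, GRID_ROWS = 9, 6   # hypothetical grid geometry (assumption)

        def calibrate_camera_from_grid(image_paths):
            """Estimate one camera's intrinsics and distortion coefficients
            from several still photographs of the grid test pattern."""
            # 3D grid points in the pattern's own frame (z = 0 plane).
            objp = np.zeros((GRID_ROWS * GRID_COLS, 3), np.float32)
            objp[:, :2] = np.mgrid[0:GRID_COLS, 0:GRID_ROWS].T.reshape(-1, 2)

            obj_points, img_points, image_size = [], [], None
            for path in image_paths:
                gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
                image_size = gray.shape[::-1]
                found, corners = cv2.findChessboardCorners(gray, (GRID_COLS, GRID_ROWS))
                if found:
                    obj_points.append(objp)
                    img_points.append(corners)

            # dist_coeffs (k1, k2, p1, p2, k3) play the role of the stored
            # "known camera distortion parameters" used later in FIG. 2.
            rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
                obj_points, img_points, image_size, None, None)
            return camera_matrix, dist_coeffs
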
  • The test pattern 104 may also be in the form of a color chart and/or uniform grey chart. Such a test pattern allows the image processor 114 to analyze any potential differences in color accuracy, relative illumination, and relative uniformity between cameras. These differences are stored as correction factors utilized by the image processor 114 to reduce noticeable differences between image streams.
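  • As a rough sketch of how such correction factors might be derived and stored, the snippet below compares each camera's mean response over a uniform grey chart with a reference camera and keeps per-channel gains. The per-channel gain model and the function names are simplifying assumptions, not the specific correction applied by the image processor 114.

        import numpy as np

        def grey_chart_gains(reference_frame, camera_frame):
            """Per-channel gains that map a camera's response on a uniform
            grey chart onto the reference camera's response."""
            ref_mean = reference_frame.reshape(-1, 3).mean(axis=0)
            cam_mean = camera_frame.reshape(-1, 3).mean(axis=0)
            return ref_mean / cam_mean            # e.g. array([1.02, 0.97, 1.01])

        def apply_gains(frame, gains):
            """Apply the stored correction factors to a frame from that camera."""
            corrected = frame.astype(np.float32) * gains
            return np.clip(corrected, 0, 255).astype(np.uint8)
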
  • This calibration process allows for an output video stream to be assembled from multiple video streams with the viewer being unable to perceive any meaningful change in image quality throughout an entire field of view, including up to 360 degrees along multiple axes.
  • FIG. 2 illustrates processing operations associated with an embodiment of the invention. The image processor 114 simultaneously captures frames from the first camera 100 and the second camera 106, as shown with block 200. Stored known camera distortion parameters 202 are then applied 204 by the image processor. The frames may then be converted from rectangular coordinates to cylindrical or spherical coordinates 206. It is then determined whether alignment has been calculated 208. If not, alignment is calculated 210, the alignment calculation is stored 212 and processing proceeds to block 214, where adjacent frames are stitched and blended based on the alignment calculation. The stitched frames are then applied as output 216 to a display.
  • Thus, through the calibration process, camera distortion and alignment parameters are obtained. These parameters are subsequently utilized as additional frames are received.
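  • The loop of FIG. 2 can be summarized with the sketch below, in which undistort, to_cylindrical, compute_alignment and stitch_and_blend are hypothetical placeholders for the operations described above rather than a definitive implementation.

        def process_streams(cam_a, cam_b, distortion_params):
            """Hypothetical outline of the FIG. 2 processing loop."""
            alignment = None                                          # calculated once, then reused
            while True:
                frame_a, frame_b = cam_a.read(), cam_b.read()         # block 200: capture frames
                frame_a = undistort(frame_a, distortion_params["a"])  # blocks 202/204
                frame_b = undistort(frame_b, distortion_params["b"])
                frame_a = to_cylindrical(frame_a)                     # block 206: coordinate conversion
                frame_b = to_cylindrical(frame_b)
                if alignment is None:                                 # block 208
                    alignment = compute_alignment(frame_a, frame_b)   # blocks 210/212
                yield stitch_and_blend(frame_a, frame_b, alignment)   # blocks 214/216
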
  • The following is an example of lens distortion correction and perspective correction applied to an image in accordance with an embodiment of the invention. Consider the following parameters.
      • Image Size: 1280 px×720 px
      • Image Center: (640, 360)
      • Distortion Size: 1 (Scaling factor, 0-1)
      • Distortion: 0.03 (Distortion correction polynomial coefficient)
      • Distance Param: 103.00 px (focal length in pixels; distanceparam in the code below)
  • The following GLSL shader code implements the lens distortion correction and coordinate conversion operations of FIG. 2.
        vec2 tc = gl_TexCoord[0].st;

        // Lens Distortion Correction
        vec2 P = (tc / imageCenter) - 1.;          // to normalized image coordinates
        P /= distortionSize;
        vec2 Pp = P / (1. - distortion * dot(P, P));
        P *= distortionSize;
        tc = (Pp + 1.) * imageCenter;              // back to pixel coordinates

        // Cartesian coordinates
        tc -= vec2( imageCenter.x, imageCenter.y );

        // Sphere(FishEye)-to-Erectangular
        float phi = tc.s / distanceparam;
        float theta = -tc.t / distanceparam + (M_PI/2.);
        if (theta < 0.0)  { theta = -theta;                phi += M_PI; }
        if (theta > M_PI) { theta = M_PI - (theta - M_PI); phi += M_PI; }
        float s = sin(theta);
        vec2 v = vec2(s * sin(phi), cos(theta));
        float r = length(v);
        theta = distanceparam * atan(r, s * cos(phi));
        tc = v * (theta / r);

        // Erectangular-to-Cylindrical
        tc.t = distanceparam * atan(tc.t / distanceparam);

        // Pixel coordinates
        tc += vec2( imageCenter.x - 0.5, imageCenter.y - 0.5);
  • In one embodiment, the process of automated alignment uses processing techniques implemented in OpenCV to determine the offset between two images that have overlapping regions. For example, feature matching and homography calculations may be used.
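  • One plausible OpenCV implementation of that automated step, offered only as a sketch (the choice of ORB features, a brute-force matcher and RANSAC is an assumption; other detectors or OpenCV's stitching module could be substituted):

        import cv2
        import numpy as np

        def estimate_homography(img_a, img_b):
            """Estimate the homography mapping img_b onto img_a from their overlap."""
            orb = cv2.ORB_create(2000)
            kp_a, des_a = orb.detectAndCompute(img_a, None)
            kp_b, des_b = orb.detectAndCompute(img_b, None)

            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:200]

            src = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
            return H
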
  • In one embodiment, manual alignment may be used to manually position image frames in a panoramic frame until the overlapping regions are seamless.
  • When dealing with a 360-degree panoramic frame, the beginning and end of the frame must also be aligned and stitched together. To accomplish this, three parameters need to be calibrated, as illustrated in the sketch following this list.
      • Overlap—The amount of pixel overlap between the beginning and end of the frame.
      • Twist—The amount of vertical error between the beginning and end of the frame.
      • Blend Width—The width of the blending area across the overlapping region.
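  • The sketch below shows how these three values might be applied when closing the 360-degree seam; the linear alpha ramp and the H x W x 3 array layout are illustrative assumptions rather than the blend used in any particular embodiment.

        import numpy as np

        def close_seam(pano, overlap, twist, blend_width):
            """Blend the last `overlap` columns of the panorama back onto its
            first columns (assumes blend_width <= overlap and a color image)."""
            end = np.roll(pano[:, -overlap:], shift=twist, axis=0)      # compensate vertical twist
            start = pano[:, :overlap]

            alpha = np.linspace(0.0, 1.0, blend_width)[None, :, None]   # ramp across the band
            band = start[:, :blend_width] * alpha + end[:, :blend_width] * (1.0 - alpha)

            out = pano[:, :-overlap].copy()                             # drop the duplicated end columns
            out[:, :blend_width] = band.astype(pano.dtype)
            return out
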
  • Those skilled in the art will appreciate that the invention may be used in the production of video camera systems that create ultra wide-angle video streams or panoramic video. The invention can also be used post-production to address camera performance issues in the field. If cameras are knocked out of alignment during use, the calibration and alignment steps can re-calibrate the system, preventing the unnecessary replacement of certain parts or the whole system.
  • This invention can also be used when creating camera systems in multiple axes. For example, if camera systems are stacked along a vertical axis to create a “tall” cylindrical video stream, the alignment and calibration process can be used to ensure that camera performance is set within a pre-defined standard so there is not a wide variation in image performance between the separate video camera systems.
  • The invention also allows multiple cameras to be calibrated within a specific performance range. If there are multiple camera systems being used at the same event, it is critical that they all have image performance within a defined range to reduce variation in image performance/quality between camera systems. The invention allows for calibration across a group of panoramic or wide-angle camera systems to ensure video outputs appear consistent to the end user.
  • A reduction in bandwidth can be achieved by calibrating the Lens Distortion Parameter and the Camera Field of View (FOV) for each camera in the system. When the system processes each camera's video stream frame by frame, it uses the Lens Distortion Parameter and Camera FOV to correct the inherent lens distortion and perspective in each frame. Correcting the video frames with these parameters warps the pixels to a certain degree, so each frame is cropped to remove the warping effects. For example, FIG. 3 illustrates distortion 300 between individual images 302 and 304. As shown in FIG. 4, cropping lines 400 and 402 may be used to compensate for this warping effect and provide a more uniform viewing area that is pleasing to the eye.
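  • That correct-then-crop step could be sketched with OpenCV as follows; the fixed crop_margin is a placeholder value rather than a figure from the specification.

        import cv2

        def correct_and_crop(frame, camera_matrix, dist_coeffs, crop_margin=40):
            """Apply the per-camera lens distortion parameters, then crop away
            the warped border to leave a uniform viewing area."""
            h, w = frame.shape[:2]
            new_K, _ = cv2.getOptimalNewCameraMatrix(camera_matrix, dist_coeffs, (w, h), 0)
            undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs, None, new_K)
            return undistorted[crop_margin:h - crop_margin, crop_margin:w - crop_margin]
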
  • The image processor 114 may execute software with calibrated parameters that reduce the overall time to stitch camera frames together. These parameters may include FOV overlap, image alignment, individual camera illumination correction and system relative illumination.
  • Calibrating the image alignment and exposure values for the system can reduce the overall per-frame system processing time by approximately 400 ms and 450 ms, respectively. This improvement is critical for live viewing applications, or for applications in which the camera may be turned on and off frequently.
  • Image alignment is the parameter that defines the offset between each camera in order to stitch and produce a complete 360° panoramic video. Each camera's exposure settings should be identical and should change at the same rate to eliminate under-exposed and over-exposed areas in the 360-degree image. Dynamic range software can help compensate for cameras that are under- or over-exposed during difficult lighting conditions when the system is set to a single exposure level.
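  • A minimal sketch of keeping two streams consistent by matching mean luminance over the shared overlap region is shown below; applying a digital gain (rather than driving the sensors' exposure registers directly) is a simplifying assumption.

        import numpy as np

        def exposure_gain(overlap_a, overlap_b):
            """Gain for camera B so its overlap region matches camera A's brightness
            (frames assumed to be 8-bit BGR, as delivered by OpenCV)."""
            luma = lambda img: (0.114 * img[..., 0] + 0.587 * img[..., 1]
                                + 0.299 * img[..., 2])
            return luma(overlap_a).mean() / max(luma(overlap_b).mean(), 1e-6)

        def match_exposure(frame_b, gain):
            return np.clip(frame_b.astype(np.float32) * gain, 0, 255).astype(np.uint8)
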
  • The alignment and calibration techniques may be used for initial system set up and initial video streaming. Further, they may be used once the system is operative and is otherwise streaming video. In such a context, real world problems occur, such as camera faults and subsequent camera displacement or offset. Techniques of the invention address these issues.
  • FIG. 5 illustrates processing associated with one such technique. The image processor 114 streams video from multiple cameras 500. The image processor 114 is configured to check for a faulty camera 502 (e.g., a lost signal). If a faulty camera exists, the image processor 114 maximizes the field of view of adjacent cameras 504. As shown in FIG. 1, an overlapping region 110 exists between cameras. So for example, if camera 100 is faulty, the full field of view 108 of camera 106 would be utilized.
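  • The check-and-fallback logic of FIG. 5 might be organized as in the following sketch, where a frame of None stands in for a lost signal and crops/full_fovs are precomputed tuples of slices; these data structures are hypothetical conveniences, not part of the specification.

        def stitch_with_fallback(frames, crops, full_fovs, stitch):
            """frames[i] is None when camera i is faulty (block 502)."""
            usable = []
            n = len(frames)
            for i, frame in enumerate(frames):
                if frame is None:
                    continue                                  # skip the faulty camera
                neighbor_down = frames[(i - 1) % n] is None or frames[(i + 1) % n] is None
                # Use the full field of view when a neighbor is down (block 504),
                # otherwise trim to this camera's normal share of the panorama.
                region = full_fovs[i] if neighbor_down else crops[i]
                usable.append(frame[region])
            return stitch(usable)
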
  • FIG. 6 illustrates processing associated with a camera misalignment after initial alignment and calibration. The image processor 114 streams video from multiple cameras 600. The image processor 114 is configured to check for an offset camera 602 (e.g., a camera misaligned from its original position). If an offset camera is identified, the offset is evaluated 604. For example, the displacement between matching pixels of adjacent video streams may be evaluated. Also, outside sensors may be used to determine this displacement, such as accelerometers or digital gyroscopes. This displacement is then used to form updated alignment parameters (e.g., the original alignment parameters are compensated to incorporate the identified displacement) 606. Video streams are subsequently stitched based on the updated alignment parameters 608.
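  • One way to measure such a displacement from the imagery itself is phase correlation over the shared overlap region, sketched below with OpenCV; folding the result into a simple (x, y) alignment offset is an illustrative assumption.

        import cv2
        import numpy as np

        def detect_offset(overlap_a, overlap_b):
            """Displacement (dx, dy) between matching pixels of adjacent streams."""
            a = np.float32(cv2.cvtColor(overlap_a, cv2.COLOR_BGR2GRAY))
            b = np.float32(cv2.cvtColor(overlap_b, cv2.COLOR_BGR2GRAY))
            (dx, dy), _response = cv2.phaseCorrelate(a, b)
            return dx, dy

        def update_alignment(alignment, dx, dy, threshold=2.0):
            """Fold a detected displacement into the stored alignment parameters
            (block 606) when it exceeds a small threshold."""
            if abs(dx) > threshold or abs(dy) > threshold:
                alignment = {"x": alignment["x"] + dx, "y": alignment["y"] + dy}
            return alignment
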
  • An embodiment of the present invention relates to a computer storage product with a computer readable storage medium having computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs, DVDs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (“ASICs”), programmable logic devices (“PLDs”) and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment of the invention may be implemented using JAVA®, C++, or other object-oriented programming language and development tools. Another embodiment of the invention may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.
  • The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed; obviously, many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the invention.

Claims (12)

1. A method, comprising:
collecting a first image from a first video camera of a test pattern and a second image from a second video camera of the test pattern, wherein the first image and the second image have an overlap region;
evaluating the overlap region to generate calibration parameters that accommodate for any vertical, horizontal or rotational misalignment between the first image and the second image; and
applying the calibration parameters to video streams from the first video camera and the second video camera.
2. The method of claim 1 further comprising collecting camera distortion values from the first video camera and the second video camera to generate calibration parameters.
3. The method of claim 1 further comprising collecting an alignment parameter between the first video camera and the second video camera to generate calibration parameters.
4. The method of claim 1 further comprising collecting overlap values between the first video camera and the second video camera to generate calibration parameters.
5. The method of claim 1 further comprising collecting twist parameters between the first video camera and the second video camera to generate calibration parameters.
6. The method of claim 1 further comprising collecting blend width parameters between the first video camera and the second video camera to generate calibration parameters.
7. The method of claim 1 further comprising collecting field of view overlap parameters between the first video camera and the second video camera to generate calibration parameters.
8. The method of claim 1 further comprising collecting individual camera illumination parameters from the first video camera and the second video camera to generate calibration parameters.
9. The method of claim 1 further comprising collecting system relative illumination parameters from the first video camera and the second video camera to generate calibration parameters.
10. The method of claim 1 further comprising applying cropping lines to the video streams to correct for warping.
11. The method of claim 1 further comprising:
identifying a faulty video camera in a stream of video from multiple video cameras; and
maximizing the field of view of video cameras adjacent to the faulty video camera.
12. The method of claim 1 further comprising:
identifying an offset video camera in a stream of video from multiple video cameras;
evaluating the offset;
forming updated alignment parameters; and
combining video based upon the updated alignment parameters.
US13/691,632 2011-12-02 2012-11-30 Apparatus and Method for Video Image Stitching Abandoned US20130141526A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201161566269P 2011-12-02 2011-12-02
US13/691,632 US20130141526A1 (en) 2011-12-02 2012-11-30 Apparatus and Method for Video Image Stitching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/691,632 US20130141526A1 (en) 2011-12-02 2012-11-30 Apparatus and Method for Video Image Stitching

Publications (1)

Publication Number Publication Date
US20130141526A1 true US20130141526A1 (en) 2013-06-06

Family

ID=48523714

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/691,632 Abandoned US20130141526A1 (en) 2011-12-02 2012-11-30 Apparatus and Method for Video Image Stitching
US13/691,654 Active 2033-08-26 US9516225B2 (en) 2011-12-02 2012-11-30 Apparatus and method for panoramic video hosting
US15/366,878 Active US9843840B1 (en) 2011-12-02 2016-12-01 Apparatus and method for panoramic video hosting

Family Applications After (2)

Application Number Title Priority Date Filing Date
US13/691,654 Active 2033-08-26 US9516225B2 (en) 2011-12-02 2012-11-30 Apparatus and method for panoramic video hosting
US15/366,878 Active US9843840B1 (en) 2011-12-02 2016-12-01 Apparatus and method for panoramic video hosting

Country Status (1)

Country Link
US (3) US20130141526A1 (en)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9219768B2 (en) 2011-12-06 2015-12-22 Kenleigh C. Hobby Virtual presence model
US9836875B2 (en) * 2013-04-26 2017-12-05 Flipboard, Inc. Viewing angle image manipulation based on device rotation
US8917355B1 (en) 2013-08-29 2014-12-23 Google Inc. Video stitching system and method
US8818081B1 (en) 2013-10-16 2014-08-26 Google Inc. 3D model updates using crowdsourced video
US9300882B2 (en) 2014-02-27 2016-03-29 Sony Corporation Device and method for panoramic image processing
US9699437B2 (en) * 2014-03-03 2017-07-04 Nextvr Inc. Methods and apparatus for streaming content
US10063851B2 (en) * 2014-04-30 2018-08-28 Intel Corporation System for and method of generating user-selectable novel views on a viewing device
US9866765B2 (en) * 2014-11-18 2018-01-09 Elwha, Llc Devices, methods, and systems for visual imaging arrays
US10027873B2 (en) 2014-11-18 2018-07-17 The Invention Science Fund Ii, Llc Devices, methods and systems for visual imaging arrays
US9942583B2 (en) 2014-11-18 2018-04-10 The Invention Science Fund Ii, Llc Devices, methods and systems for multi-user capable visual imaging arrays
TW201637432A (en) * 2015-04-02 2016-10-16 Ultracker Technology Co Ltd Real-time image stitching device and real-time image stitching method
US20160353146A1 (en) * 2015-05-27 2016-12-01 Google Inc. Method and apparatus to reduce spherical video bandwidth to user headset
US10419770B2 (en) 2015-09-09 2019-09-17 Vantrix Corporation Method and system for panoramic multimedia streaming
US20170195561A1 (en) * 2016-01-05 2017-07-06 360fly, Inc. Automated processing of panoramic video content using machine learning techniques
WO2017205642A1 (en) * 2016-05-25 2017-11-30 Livit Media Inc. Methods and systems for live sharing 360-degree video streams on a mobile device
CN106060513B (en) * 2016-06-29 2017-11-21 深圳市优象计算技术有限公司 A kind of code stream caching method for cylinder three-dimensional panoramic video netcast
CN106131581A (en) * 2016-07-12 2016-11-16 上海摩象网络科技有限公司 The panoramic video manufacturing technology of mixed image
CN106101847A (en) * 2016-07-12 2016-11-09 三星电子(中国)研发中心 The method and system of panoramic video alternating transmission
KR20180039939A (en) * 2016-10-11 2018-04-19 삼성전자주식회사 Display apparatus and method for generating capture image
KR20180051202A (en) * 2016-11-08 2018-05-16 삼성전자주식회사 Display apparatus and control method thereof
GB2557175A (en) * 2016-11-17 2018-06-20 Nokia Technologies Oy Method for multi-camera device
JP2018110354A (en) * 2017-01-05 2018-07-12 株式会社リコー Communication terminal, image communication system, communication method, and program
CN106954093A (en) * 2017-03-15 2017-07-14 北京小米移动软件有限公司 Panoramic video processing method, apparatus and system
US20180295282A1 (en) * 2017-04-10 2018-10-11 Intel Corporation Technology to encode 360 degree video content
EP3404913B1 (en) * 2017-05-16 2019-11-06 Axis AB A system comprising a video camera and a client device and a method performed by the same
FR3066672A1 (en) * 2017-05-19 2018-11-23 Sagemcom Broadband Sas Method for communicating an immersive video
CN107277474B (en) * 2017-06-26 2019-06-25 深圳看到科技有限公司 Panorama generation method and generating means
CN109698952A (en) * 2017-10-23 2019-04-30 腾讯科技(深圳)有限公司 Playback method, device, storage medium and the electronic device of full-view video image

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2240961C (en) * 1995-12-18 2001-06-12 David Alan Braun Head mounted displays linked to networked electronic panning cameras
US6043837A (en) * 1997-05-08 2000-03-28 Be Here Corporation Method and apparatus for electronically distributing images from a panoptic camera system
US6192393B1 (en) 1998-04-07 2001-02-20 Mgi Software Corporation Method and system for panorama viewing
US20020049979A1 (en) 2000-05-18 2002-04-25 Patrick White Multiple camera video system which displays selected images
IL153164D0 (en) 2000-06-09 2003-06-24 Imove Inc Streaming panoramic video
US7796162B2 (en) 2000-10-26 2010-09-14 Front Row Technologies, Llc Providing multiple synchronized camera views for broadcast from a live venue activity to remote viewers
GB0313866D0 (en) * 2003-06-14 2003-07-23 Impressive Ideas Ltd Display system for recorded media
US8126155B2 (en) 2003-07-02 2012-02-28 Fuji Xerox Co., Ltd. Remote audio device management system
US20050280701A1 (en) 2004-06-14 2005-12-22 Wardell Patrick J Method and system for associating positional audio to positional video
US20070035612A1 (en) 2005-08-09 2007-02-15 Korneluk Jose E Method and apparatus to capture and compile information perceivable by multiple handsets regarding a single event
BRPI0622048B1 (en) 2006-10-20 2018-09-18 Thomson Licensing method, device, and system for generating regions of interest in video content
US8339456B2 (en) 2008-05-15 2012-12-25 Sri International Apparatus for intelligent and autonomous video content generation and streaming
US20100050221A1 (en) 2008-06-20 2010-02-25 Mccutchen David J Image Delivery System with Image Quality Varying with Frame Rate
US8893026B2 (en) 2008-11-05 2014-11-18 Pierre-Alain Lindemann System and method for creating and broadcasting interactive panoramic walk-through applications
GB0907870D0 (en) 2009-05-07 2009-06-24 Univ Louvain Systems and methods for the autonomous production of videos from multi-sensored data
US8605783B2 (en) * 2009-05-22 2013-12-10 Microsoft Corporation Composite video generation
US10440329B2 (en) 2009-05-22 2019-10-08 Immersive Media Company Hybrid media viewing application including a region of interest within a wide field of view
US20120240061A1 (en) 2010-10-11 2012-09-20 Teachscape, Inc. Methods and systems for sharing content items relating to multimedia captured and/or direct observations of persons performing a task for evaluation
US8970666B2 (en) 2011-09-16 2015-03-03 Disney Enterprises, Inc. Low scale production system and method
US20130141526A1 (en) 2011-12-02 2013-06-06 Stealth HD Corp. Apparatus and Method for Video Image Stitching

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5319465A (en) * 1991-09-20 1994-06-07 Sony Pictures Entertainment, Inc. Method for generating film quality images on videotape
US6611241B1 (en) * 1997-12-02 2003-08-26 Sarnoff Corporation Modular display system
US6788333B1 (en) * 2000-07-07 2004-09-07 Microsoft Corporation Panoramic video
US20040030527A1 (en) * 2002-02-07 2004-02-12 Accu-Sport International, Inc. Methods, apparatus and computer program products for processing images of a golf ball
US7324664B1 (en) * 2003-10-28 2008-01-29 Hewlett-Packard Development Company, L.P. Method of and system for determining angular orientation of an object
US8406562B2 (en) * 2006-08-11 2013-03-26 Geo Semiconductor Inc. System and method for automated calibration and correction of display geometry and color
US20100150458A1 (en) * 2008-12-12 2010-06-17 International Business Machines Corporation Generating Cohorts Based on Attributes of Objects Identified Using Video Input
US8687070B2 (en) * 2009-12-22 2014-04-01 Apple Inc. Image capture device having tilt and/or perspective correction
US8334905B2 (en) * 2010-05-05 2012-12-18 Cisco Technology, Inc. Zone, system and failure aware self adjusting IP surveillance cameras

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9723223B1 (en) 2011-12-02 2017-08-01 Amazon Technologies, Inc. Apparatus and method for panoramic video hosting with directional audio
US10349068B1 (en) 2011-12-02 2019-07-09 Amazon Technologies, Inc. Apparatus and method for panoramic video hosting with reduced bandwidth streaming
US9516225B2 (en) 2011-12-02 2016-12-06 Amazon Technologies, Inc. Apparatus and method for panoramic video hosting
US9843840B1 (en) 2011-12-02 2017-12-12 Amazon Technologies, Inc. Apparatus and method for panoramic video hosting
US9838687B1 (en) 2011-12-02 2017-12-05 Amazon Technologies, Inc. Apparatus and method for panoramic video hosting with reduced bandwidth streaming
US9094540B2 (en) * 2012-12-13 2015-07-28 Microsoft Technology Licensing, Llc Displacing image on imager in multi-lens cameras
US20140168357A1 (en) * 2012-12-13 2014-06-19 Microsoft Corporation Displacing image on imager in multi-lens cameras
US9870602B2 (en) * 2013-12-06 2018-01-16 Huawei Device (Dongguan) Co., Ltd. Method and apparatus for fusing a first image and a second image
US20160307300A1 (en) * 2013-12-06 2016-10-20 Huawei Device Co. Ltd. Image processing method and apparatus, and terminal
US9781356B1 (en) 2013-12-16 2017-10-03 Amazon Technologies, Inc. Panoramic video viewer
US10015527B1 (en) 2013-12-16 2018-07-03 Amazon Technologies, Inc. Panoramic video distribution and viewing
US9998661B1 (en) 2014-05-13 2018-06-12 Amazon Technologies, Inc. Panoramic camera enclosure
US9854155B1 (en) 2015-06-16 2017-12-26 Amazon Technologies, Inc. Determining camera auto-focus settings
US9826156B1 (en) 2015-06-16 2017-11-21 Amazon Technologies, Inc. Determining camera auto-focus settings
CN106470313A (en) * 2015-08-23 2017-03-01 宏达国际电子股份有限公司 Image generation system and image generating method
US10432855B1 (en) * 2016-05-20 2019-10-01 Gopro, Inc. Systems and methods for determining key frame moments to construct spherical images
WO2017214291A1 (en) * 2016-06-07 2017-12-14 Visbit Inc. Virtual reality 360-degree video camera system for live streaming
US20170352191A1 (en) * 2016-06-07 2017-12-07 Visbit Inc. Virtual Reality 360-Degree Video Camera System for Live Streaming
CN106447735A (en) * 2016-10-14 2017-02-22 安徽协创物联网技术有限公司 Panoramic camera geometric calibration processing method
WO2018171487A1 (en) * 2017-03-23 2018-09-27 华为技术有限公司 Panoramic video playback method and client terminal
TWI617195B (en) * 2017-06-22 2018-03-01 宏碁股份有限公司 Image capturing apparatus and image stitching method thereof
CN107592452A (en) * 2017-09-05 2018-01-16 深圳市圆周率软件科技有限责任公司 A kind of panorama audio-video acquisition equipment and method

Also Published As

Publication number Publication date
US9843840B1 (en) 2017-12-12
US20130141523A1 (en) 2013-06-06
US9516225B2 (en) 2016-12-06

Similar Documents

Publication Publication Date Title
US8427538B2 (en) Multiple view and multiple object processing in wide-angle video camera
US6778207B1 (en) Fast digital pan tilt zoom video
JP4885179B2 (en) Image distortion correction method and image processing apparatus employing the correction method
CN101710932B (en) Image stitching method and device
US20130021434A1 (en) Method and System of Simultaneously Displaying Multiple Views for Video Surveillance
TWI298416B (en) Projector and image correction method
US20070248260A1 (en) Supporting a 3D presentation
KR100996995B1 (en) Graphics processing unit use and device
JP2006060447A (en) Keystone correction using partial side of screen
US8023009B2 (en) Imaging apparatus for correcting optical distortion and wide-angle distortion
US8872887B2 (en) Object detection and rendering for wide field of view (WFOV) image acquisition systems
US9576403B2 (en) Method and apparatus for fusion of images
CN101064780B (en) Method and apparatus for improving image joint accuracy using lens distortion correction
US20130208134A1 (en) Image Stabilization
US20060181610A1 (en) Method and device for generating wide image sequences
US20040076336A1 (en) System and method for electronic correction of optical anomalies
JP2007282245A (en) Image synthesizing apparatus and method
CN101616237B (en) Image processing apparatus, image processing method
ES2370512T3 (en) Method and appliance to automatically adjust the alignment of a projector with regard to a projection screen.
JP6273163B2 (en) Stereoscopic panorama
WO2008005066A1 (en) Parametric calibration for panoramic camera systems
JP4297111B2 (en) Imaging apparatus, image processing method and program thereof
CN101577795A (en) Method and device for realizing real-time viewing of panoramic picture
EP1813105B1 (en) Methods and systems for producing seamless composite images without requiring overlap of source images
WO2006025191A1 (en) Geometrical correcting method for multiprojection system

Legal Events

Date Code Title Description
AS Assignment

Owner name: STEALTH HD CORP., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BANTA, BILL;DONALDSON, GEOFF;REEL/FRAME:029389/0153

Effective date: 20121126

AS Assignment

Owner name: STEALTHHD, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BANTA, BILL;DONALDSON, GEOFF;REEL/FRAME:033761/0874

Effective date: 20140917

AS Assignment

Owner name: STEALTHHD LLC, WASHINGTON

Free format text: CHANGE OF NAME;ASSIGNOR:STEALTHHD, INC.;REEL/FRAME:034749/0590

Effective date: 20141118

AS Assignment

Owner name: AMAZON TECHNOLOGIES, INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STEALTHHD LLC;REEL/FRAME:038338/0297

Effective date: 20160222

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION