US20140226952A1 - Method and system for split-screen video display - Google Patents
- Publication number
- US20140226952A1 (U.S. application Ser. No. 14/254,384)
- Authority
- US
- United States
- Prior art keywords
- image data
- omnidirectional
- images
- view
- narrow
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2624—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
- H04N25/443—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
Definitions
- a system includes a first camera operable to capture omnidirectional images and send omnidirectional-image data representing the omnidirectional images, a second camera operable to capture narrow-view images and send narrow-view-image data representing the narrow-view images, a video processor coupled to the first camera and the second camera and operable to form combined-image data using at least part of the omnidirectional-image data and the narrow-view-image data, and a display module interoperably coupled to the video processor and operable to display combined images from the combined-image data.
- the combined images each comprise a narrow-view-display portion and an omnidirectional-display portion.
- a method includes concurrently capturing omnidirectional images and narrow-view images, storing data representing the captured omnidirectional images as omnidirectional-image data, storing data representing the captured narrow-view images as narrow-view-image data, removing data representing an unimportant portion of the narrow-view images to create cropped narrow-view-image data, creating combined-image data using the cropped narrow-view-image data and at least part of the omnidirectional-image data, and displaying combined images from the combined-image data.
- a system includes an omnidirectional sensor operable to capture images and create therefrom image data, a video processor operable to create, from at least part of the image data, combined-image data including narrow-view-image data and non-narrow-view-image data, and a display module interoperably coupled to the video processor and operable, using the combined-image data, to display combined images including narrow-view images and non-narrow-view images.
- the displayed narrow-view images comprise an enlarged version of a portion of images represented by the image data.
- a method includes capturing omnidirectional images, enlarging a relevant area of the omnidirectional images via a video processor, the enlarging resulting in enlarged relevant-area images, downsampling and cropping the omnidirectional images via the video processor, the downsampling resulting in downsampled cropped omnidirectional images, combining the enlarged-relevant-area images and the downsampled cropped omnidirectional images into combined images via the video processor, and displaying the combined images via a display module.
- FIG. 1 is a block diagram of a dual-camera system
- FIG. 2A is a side elevation view of an omnidirectional camera
- FIG. 2B is a side elevation view of another omnidirectional camera
- FIG. 3 is an illustrative field of view (FOV) of an omnidirectional camera
- FIG. 4A is a top view of a dual-camera system
- FIG. 4B is a top view of another dual-camera system
- FIG. 4C is a top view of another dual-camera system
- FIG. 5A is a detailed view of a combined image
- FIG. 5B is a flow diagram illustrating a process for operation of the camera system of FIG. 1 ;
- FIG. 6A is a block diagram of a single-camera system
- FIG. 6B is a detailed view of an image captured by the camera system of FIG. 6A ;
- FIG. 6C is a detailed view of a modified image displayed by display module of the camera system shown in FIG. 6A ;
- FIG. 7 is a flow diagram illustrating a process for operation of the camera system of FIG. 6A .
- FIG. 1 is a block diagram of a dual-camera system.
- a dual-camera system 100 includes an omnidirectional camera 10 , a narrow-view camera 12 , a video processor 14 , and a display module 16 .
- the omnidirectional camera 10 is coupled to the video processor 14 by way of a connection 18 .
- the omnidirectional camera 10 is a front-facing camera equipped with a fish-eye lens and has a field of view of at least 90 degrees.
- the omnidirectional camera 10 can be any type of omnidirectional camera such as, for example, a conical mirror camera, and typically has a field of view of at least 180 degrees.
- although the dual-camera system 100 is depicted by way of example as including a single omnidirectional camera 10 , a dual-camera system in accordance with principles of the invention can incorporate any number of omnidirectional cameras 10 arranged in any orientation such as, for example, a front-facing omnidirectional camera and a rear-facing omnidirectional camera.
- the narrow-view camera 12 is coupled to the video processor 14 by way of a connection 20 .
- the narrow-view camera 12 has a field of view, for example, of approximately 10-50°; however, a camera that has any appropriate field of view may be used.
- although the omnidirectional camera 10 and the narrow-view camera 12 are depicted by way of example as being connected to the video processor 14 via the connections 18 and 20 , it is also contemplated that the omnidirectional camera 10 and the narrow-view camera 12 could be wirelessly connected to the video processor 14 .
- the omnidirectional camera 10 and the narrow-view camera 12 are placed in close proximity to one another so that the points of view of the omnidirectional camera 10 and of the narrow-view camera 12 are at least approximately the same.
- the video processor 14 may be, for example, a stand-alone unit or contained within the same housing as one or both of the narrow-view camera 12 and the omnidirectional camera 10 .
- the video processor 14 receives image data from both of the narrow-view camera 12 and the omnidirectional camera 10 .
- the display module 16 is coupled to the video processor 14 by way of a connection 22 .
- the display module 16 includes a video display that simultaneously displays images captured by the omnidirectional camera 10 and the narrow-view camera 12 and processed by the video processor 14 .
- although the display module 16 is depicted by way of example as being connected to the video processor 14 via the connection 22 , the display module 16 could be wirelessly connected to the video processor 14 .
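The data flow described above (the omnidirectional camera 10 and the narrow-view camera 12 feeding the video processor 14, which drives the display module 16) can be sketched as follows. This is a minimal illustration in Python/NumPy; the frame sizes, function names, and the simple stack-based composition are assumptions for the sketch, not details taken from the patent.

```python
import numpy as np

def capture_omni():
    # stand-in for one frame from the omnidirectional camera 10 (connection 18)
    return np.zeros((480, 1440, 3), dtype=np.uint8)

def capture_narrow():
    # stand-in for one frame from the narrow-view camera 12 (connection 20)
    return np.zeros((480, 720, 3), dtype=np.uint8)

def combine(narrow, omni, strip_height=72):
    # role of the video processor 14: reduce the omnidirectional frame to a
    # strip and stack it below the narrow-view frame (cf. FIG. 5A)
    step_y = omni.shape[0] // strip_height
    step_x = omni.shape[1] // narrow.shape[1]
    strip = omni[::step_y, ::step_x][:strip_height, :narrow.shape[1]]
    return np.vstack([narrow, strip])

combined = combine(capture_narrow(), capture_omni())  # sent to display module 16
```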
- FIG. 2A is a side elevation view of a typical omnidirectional camera.
- an omnidirectional camera 10 includes a sensor 11 and a lens 13 .
- the lens 13 is a fish-eye lens and has a field of view of approximately 180 degrees; however, lenses having different fields of view may be used.
- any lens adapted to focus omnidirectional images such as, for example, a wide-angle lens, a super-wide-angle lens, a full-circle lens, a spherical mirror-type lens, a conical mirror-type lens, or other lens or mirror configuration capable of focusing omnidirectional images may be employed in place of the lens 13 .
- the omnidirectional camera 10 outputs image data to a display module or a video processor.
- FIG. 2B is a side elevation view of another omnidirectional camera.
- an omnidirectional camera 10 ′ includes a sensor 24 arranged relative to an external mirror 26 and a dome 28 , the dome 28 being concave relative to the sensor 24 .
- the dome 28 and the mirror 26 in combination are adapted to allow light to pass therethrough.
- the dome 28 may be convex relative to the sensor 24 , the dome 28 and mirror 26 in combination being adapted to reflect light towards the sensor 24 .
- a resulting omnidirectional image captured by the omnidirectional camera 10 ′ may be, for example, a 360-degree image of a scene surrounding the omnidirectional camera 10 ′, wherein 360 degrees is relative to a centerline 31 of the sensor 24 .
- the omnidirectional camera 10 ′ may be a high-definition camera such as, for example, a camera having a sensor adapted to capture images on the order of several Megapixels.
- the omnidirectional camera 10 ′ may be used interchangeably with the omnidirectional camera 10 in various embodiments.
- the omnidirectional camera 10 ′ outputs image data to a display module or a video processor.
- FIG. 3 is an illustrative field of view (FOV) of the omnidirectional camera 10 ′.
- a coordinate system has been superimposed about the omnidirectional camera 10 ′.
- the coordinate system has an optical axis 30 shown running vertically along the centerline 31 of the omnidirectional camera 10 ′ and a horizontal axis 32 perpendicular thereto and passing through the mirror 26 .
- the FOV of a camera is the area of a scene around the camera that can be captured by the camera.
- the FOV 34 of the omnidirectional camera 10 ′ along the horizontal axis 32 is shown.
- the FOV 34 extends both above and below the horizontal axis 32 .
- the FOV 34 extends approximately 10 degrees above the horizontal axis 32 and approximately 45 degrees below the horizontal axis 32 .
- the FOV 34 may extend more than or less than 10 degrees above the horizontal axis 32 and/or may extend more than or less than 45 degrees below the horizontal axis 32 .
- although FIG. 3 shows the FOV 34 along one axis, the full FOV of the omnidirectional camera 10 ′ may include all 360 degrees of rotation about the optical axis 30 .
- the entire panorama of the omnidirectional camera 10 ′ would then be a 55°×360° FOV, where the 55 degrees represents the size of the angle relative to the horizontal axis 32 .
- a FOV of the omnidirectional camera 10 and the FOV 34 of the omnidirectional camera 10 ′ would be similar.
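The 55°×360° panorama noted above follows from simple addition of the vertical extents; a quick arithmetic check (the pixels-per-degree figure is an assumed value for illustration only):

```python
above_horizon_deg = 10        # FOV 34 extends about 10 degrees above axis 32
below_horizon_deg = 45        # and about 45 degrees below it
vertical_span_deg = above_horizon_deg + below_horizon_deg   # 55 degrees
horizontal_span_deg = 360     # full rotation about the optical axis 30

# size of an unwrapped panorama at an assumed angular resolution
pixels_per_degree = 10
panorama_width_px = horizontal_span_deg * pixels_per_degree   # 3600
panorama_height_px = vertical_span_deg * pixels_per_degree    # 550
```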
- FIG. 4A is a top view of the dual-camera system 100 in an illustrative environment.
- the omnidirectional camera 10 and the narrow-view camera 12 are positioned, for example, on a dashboard of a police vehicle 36 .
- the narrow-view camera 12 is oriented to capture images in front of the police vehicle 36 as shown by a field of view 35 and output image data representing the captured images.
- the omnidirectional camera 10 is oriented to have a similar point of view as that of the narrow-view camera 12 .
- a field of view of the omnidirectional camera 10 is illustrated by arrows 40 .
- the omnidirectional camera 10 captures images of objects in front of the police vehicle 36 as well as objects on the sides of the police vehicle 36 that are outside the field of view 35 of the narrow-view camera 12 .
- FIG. 4B is a top view of another dual-camera system in an illustrative environment.
- a system 102 includes an omnidirectional camera 10 ′′ that has a field of view that is greater than the 180 degrees illustrated in the system 100 of FIG. 4A .
- the field of view of the omnidirectional camera 10 ′′ is illustrated by arrows 40 ′.
- the narrow-view camera 12 and the omnidirectional camera 10 ′′ are placed in close proximity to each other such as, for example, on the dashboard of the police vehicle 36 .
- the narrow-view camera 12 is oriented to capture images in front of the police vehicle as shown by the field of view 35 and output image data representing the captured images.
- FIG. 4C is a top view of another dual-camera system in an illustrative environment.
- a system 104 includes omnidirectional cameras 10 ( 1 ) and 10 ( 2 ).
- the omnidirectional camera 10 ( 1 ) is shown arranged in a front-facing orientation while the omnidirectional camera 10 ( 2 ) is shown arranged in a rear-facing orientation relative to the police vehicle 36 .
- a field of view of the front-facing omnidirectional camera 10 ( 1 ) is shown by the arrows 40 .
- a field of view of the rear-facing omnidirectional camera 10 ( 2 ) is shown by arrows 40 ′′.
- the inclusion of the rear-facing omnidirectional camera 10 ( 2 ) allows the system 104 to obtain a full 360 degrees of coverage.
- the narrow-view camera 12 and the omnidirectional camera 10 ( 1 ) are placed in close proximity to each other such as, for example, on the dashboard of the police vehicle 36 .
- the narrow-view camera 12 is oriented to capture images occurring directly in front of the police vehicle as shown by the field of view 35 and output image data representing the captured images.
- a second narrow-view camera that is rear-facing may also be employed. Output of cameras facing different directions such as, for example the omnidirectional cameras 10 ( 1 ) and 10 ( 2 ), can be displayed simultaneously or sequentially in an automated fashion or responsive to user input.
- FIG. 5A is a detailed view of a combined image displayable via the display module 16 .
- a combined image 42 includes a narrow-view portion 44 and an omnidirectional portion 46 .
- the narrow-view portion 44 includes, for example, about 85% of the total viewable area of the combined image 42 .
- the narrow-view portion 44 typically has a standard resolution of D1.
- the term D1 is commonly understood to represent a resolution of approximately 720 ⁇ 480.
- the narrow-view portion 44 may have a high-definition resolution such as, for example, 720p or 1080i.
- the narrow-view portion 44 typically includes at least part of an image captured by the narrow-view camera 12 .
- the omnidirectional portion 46 includes, for example, a lower 15% of the area of the combined image 42 ; however, the size and positioning of the omnidirectional portion 46 may be altered as needed for particular applications.
- the omnidirectional portion 46 typically includes at least part of an image captured by an omnidirectional camera such as, for example, the omnidirectional camera 10 .
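The 85%/15% split at D1 resolution described above implies the following sub-region sizes; the rounding choice is an assumption for the sketch:

```python
D1_W, D1_H = 720, 480                  # "D1" resolution of the combined image 42

narrow_share = 0.85                    # narrow-view portion 44: about 85% of area
narrow_h = round(D1_H * narrow_share)  # 408 rows for the narrow-view portion 44
omni_h = D1_H - narrow_h               # remaining 72 rows for the portion 46

# both portions span the full 720-pixel width of the combined image
narrow_size = (narrow_h, D1_W)
omni_size = (omni_h, D1_W)
```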
- FIG. 5B is a flow diagram illustrating a process for operation of the camera system of FIG. 1 .
- a process 500 begins at step 502 .
- the omnidirectional camera 10 and the narrow-view camera 12 each capture images and create image data representing the captured images.
- the image data are transmitted to the video processor 14 .
- the video processor 14 digitally unfolds and crops the image data received by the video processor 14 from the omnidirectional camera 10 .
- Unfolding may be performed in an effort to minimize edge distortion caused by the use of, for example, a fish-eye lens. Cropping may be performed to remove undesired or unimportant image portions.
- analog unfolding may be accomplished through use of a special lens designed to correct edge distortion.
- the omnidirectional camera 10 may capture images at a greater resolution than that of images captured by the narrow-view camera 12 . In some embodiments, one or both of unfolding and cropping of the output by the omnidirectional camera 10 may not be performed.
- step 508 also includes cropping by the video processor of image data from the narrow-view camera 12 that contain irrelevant or unimportant information such as, for example, data representing a hood of a police vehicle. Cropping of the image data from the narrow-view camera 12 is performed so that irrelevant image portions are not displayed. In other words, a portion of a captured image that would otherwise be displayed and that often contains irrelevant image portions may be discarded and not displayed without loss of useful information.
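The digital "unfolding" discussed in connection with step 508 is commonly implemented as a polar-to-rectangular remap: each column of the output panorama samples one ray from the centre of the circular fisheye image outward. The sketch below, with nearest-neighbour sampling and an assumed output size, illustrates that general technique; it is not the patent's specified implementation.

```python
import numpy as np

def unfold_fisheye(img, out_h=64, out_w=360):
    # remap a circular fisheye image to a panorama by sampling along rays
    # from the image centre (nearest-neighbour)
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    radius = min(cx, cy)
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)  # azimuth
    r = np.linspace(radius, 0.0, out_h, endpoint=False)           # rim -> centre
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return img[ys, xs]

fisheye = np.random.randint(0, 255, (256, 256), dtype=np.uint8)
panorama = unfold_fisheye(fisheye)  # 64 rows of elevation, 360 columns of azimuth
```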
- the video processor creates combined images 42 and transmits data representing the combined images 42 to the display module 16 .
- the combined images 42 are composed of narrow-view portions 44 and omnidirectional portions 46 .
- the display module displays the combined images 42 .
- the omnidirectional portions 46 can be thought of as being displayed in place of a portion of images output from the narrow-view camera 12 that are considered unimportant.
- in some embodiments, data representing the narrow-view portion 44 and the omnidirectional portion 46 are transmitted from the video processor 14 to the display module 16 as separate data streams and are displayed by the display module 16 as separate images to form the combined image 42 , while in other embodiments, a single combined-image data stream is employed.
- FIG. 6A is a block diagram of a single-camera system.
- a single-camera system 200 includes the video processor 14 , the display module 16 , and a sensor 202 .
- the display module 16 is coupled to the video processor 14 by way of the connection 22 .
- the sensor 202 is coupled to the video processor 14 by way of the connection 18 .
- the sensor 202 may be any appropriate video sensor but is typically a 20-40 megapixel sensor. In a typical embodiment, the sensor 202 has a field of view of approximately 180 degrees; however, fields of view up to and including 360 degrees may also be utilized.
- FIG. 6B is a detailed view of an image captured by a sensor such as the sensor 202 .
- an omnidirectional image 204 captured by the sensor 202 includes a relevant area 206 as well as portions of the omnidirectional image 204 that are not within the relevant area 206 as illustrated by a shaded area 208 .
- in some embodiments, the shaded area 208 includes all or part of the relevant area 206 .
- the relevant area 206 may be, for example, the area directly in front of a police vehicle or areas including license plates.
- FIG. 6C is a detailed view of a modified image displayed by the display module 16 of the camera system 200 .
- a modified image 42 ′ includes a narrow-view portion 44 ′ and an omnidirectional portion 46 ′.
- the narrow-view portion 44 ′ is an enlarged version of the relevant area 206 and the omnidirectional portion 46 ′ is a cropped version of the shaded area 208 .
- the cropped omnidirectional portion 46 ′ is also downsampled.
- the sensor 202 captures the omnidirectional image 204 at very high resolution such as, for example, 20-40 megapixels.
- Data representing the omnidirectional image 204 is transmitted from the sensor 202 to the video processor 14 via the connection 18 .
- the video processor 14 identifies and enlarges the relevant area 206 , the enlargement thereof resulting in the narrow-view portion 44 ′.
- the video processor 14 also crops the shaded area 208 , thereby forming a cropped version thereof (i.e., the omnidirectional portion 46 ′).
- the shaded area 208 includes all or part of the relevant area 206 .
- the display module 16 displays the narrow-view portion 44 ′ and the cropped version of the shaded area 208 (i.e., the omnidirectional portion 46 ′). In this sense, the system 200 creates data representing the narrow-view portion 44 ′ via what is sometimes referred to as digital zoom.
- the video processor 14 also typically downsamples at least portions of data representing the omnidirectional image 204 not within the relevant area 206 (e.g., the shaded area 208 ). In other embodiments, both data representing the relevant area 206 and the shaded area 208 are downsampled. Downsampling reduces the amount of data needed to be displayed and, in some cases, transferred between components of the system 200 .
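The two operations described here, enlarging the relevant area 206 and downsampling the shaded area 208, can both be expressed as simple array operations. The factors and region coordinates below are arbitrary illustration values, not figures from the patent:

```python
import numpy as np

def digital_zoom(img, top, left, h, w, factor=2):
    # enlarge a relevant area (cf. area 206) by cropping and
    # nearest-neighbour upscaling, i.e. "digital zoom"
    crop = img[top:top + h, left:left + w]
    return np.repeat(np.repeat(crop, factor, axis=0), factor, axis=1)

def downsample(img, step=4):
    # keep every step-th pixel (cf. shaded area 208), reducing the data
    # that must be transferred between components and displayed
    return img[::step, ::step]

frame = np.zeros((1200, 1600), dtype=np.uint8)     # stand-in for image 204
zoomed = digital_zoom(frame, 500, 600, 200, 300)   # relevant area, 2x enlarged
context = downsample(frame)                        # coarse surrounding view
```

Downsampling by a factor of 4 in each axis reduces the pixel count sixteen-fold, which is the data saving the passage above refers to.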
- the shaded area 208 need not necessarily include all of the omnidirectional image 204 other than the relevant area 206 .
- one or both of the relevant area 206 and the enlarged version of the relevant area 206 may be retained so as to be available to be presented to and displayed by the display module 16 .
- downsampling may be performed by the sensor 202 , thereby reducing the amount of data that must be transmitted from the sensor 202 to the video processor 14 .
- the video processor 14 typically transmits data representing the combined image 42 ′ to the display module 16 as a single data stream.
- the combined image 42 ′ includes the narrow-view portion 44 ′ and the omnidirectional portion 46 ′.
- the display module 16 displays at least part of the omnidirectional image 204 or a downsampled version thereof in the omnidirectional portion 46 ′ of the display module 16 .
- the display module 16 displays the relevant area 206 or an enlarged version thereof in the narrow-view portion 44 ′. In this way, more-relevant images are in some embodiments presented at a relatively higher resolution, while less relevant images are presented at a relatively lower resolution.
- the combined image 42 ′ is created by the display module 16 from a first video stream containing, for example, the enlarged version of the relevant area 206 and a second video stream containing, for example, all or part of a downsampled version of the omnidirectional image 204 .
- the video processor 14 presents a first video stream to the display module 16 containing the enlarged version of the relevant area 206 .
- the video processor 14 also presents a second video stream containing all or part of the downsampled version of the omnidirectional image 204 .
- FIG. 7 is a flow diagram illustrating a process of operation of the camera system 200 .
- a process 700 starts at step 702 . From step 702 , execution proceeds to step 704 .
- the sensor 202 captures an omnidirectional image and transmits the data representing the captured omnidirectional image to the video processor 14 .
- execution proceeds to step 706 .
- the video processor 14 identifies the relevant area 206 .
- execution proceeds to step 708 .
- the video processor 14 enlarges the relevant area 206 to create an enlarged version thereof; however, in some embodiments, step 708 may not be performed such that the relevant area 206 is not enlarged. From step 708 , execution proceeds to step 710 .
- the video processor 14 optionally downsamples at least portions of the omnidirectional image 204 , such as those within the shaded area 208 .
- the video processor creates a combined image 42 ′ that includes the enlarged version of the relevant area 206 and at least part of the downsampled portions of the omnidirectional image 204 and presents the combined image 42 ′ to the display module 16 .
- the combined image 42 ′ may be created by the display module 16 from a first video stream containing the enlarged version of the relevant area 206 and a second video stream containing at least part of the downsampled portions of the omnidirectional image 204 .
- the display module 16 displays the combined image 42 ′.
- the display module 16 displays the enlarged version of the relevant area 206 in the narrow-view portion 44 ′ and at least part of the downsampled portions of the omnidirectional image 204 in the omnidirectional portion 46 ′.
- the process ends at step 714 .
- Various steps of the process 700 may be performed concurrently or in a different order than described above without departing from principles of the invention.
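The steps of the process 700 can be sketched end to end; the fixed relevant-area window, the 2x enlargement factor, and the stacking layout below are assumptions for illustration only:

```python
import numpy as np

def run_process_700(omni):
    # step 706: identify the relevant area 206 (a fixed window here)
    top, left, h, w = 40, 60, 30, 40
    relevant = omni[top:top + h, left:left + w]
    # step 708: enlarge the relevant area (nearest-neighbour, 2x)
    narrow = np.repeat(np.repeat(relevant, 2, axis=0), 2, axis=1)
    # step 710: downsample the omnidirectional image for the context strip
    strip = omni[::2, ::2]
    # step 712: combine into image 42': enlarged area above the context strip
    pad = narrow.shape[1] - strip.shape[1]   # widths must match to stack
    strip = strip[:, :narrow.shape[1]] if pad < 0 else np.pad(strip, ((0, 0), (0, pad)))
    return np.vstack([narrow, strip])

# step 704 stand-in: a blank omnidirectional frame from the sensor 202
combined = run_process_700(np.zeros((100, 160), dtype=np.uint8))
# step 714: the combined image would now be handed to the display module 16
```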
- although the omnidirectional camera 10 and the narrow-view camera 12 are described herein as separate units, a system could contain both the omnidirectional camera 10 and the narrow-view camera 12 in a single housing.
- components may have different functions from those described herein.
- functions described herein as being performed by the video processor 14 may, in various embodiments, be performed by one or both of the omnidirectional camera 10 or the narrow-view camera 12 .
- the system 100 and the system 200 and the displayed images 42 and 42 ′ are only examples of split-screen displayed images that could be created by various embodiments. It is intended that the specification and examples be considered as illustrative only. For example, either of the system 100 or the system 200 could be used to display either or both of the combined image 42 or the combined image 42 ′ or other configurations of combined images in accordance with principles of the invention.
- regardless of whether operations performed by the video processor 14 are described as being performed on images or image data, it will be understood that the operations are digital operations performed on image data.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
Description
- This application is a continuation of U.S. patent application Ser. No. 13/109,557, filed on May 17, 2011. U.S. patent application Ser. No. 13/109,557 claims priority from U.S. Provisional Patent Application No. 61/345,663, filed May 18, 2010, entitled METHOD AND SYSTEM FOR SPLIT-SCREEN VIDEO DISPLAY. U.S. patent application Ser. No. 13/109,557 and U.S. Provisional Patent Application No. 61/345,663 are incorporated herein by reference. In addition, this patent application incorporates by reference U.S. patent application Ser. No. 12/362,381 filed Jan. 29, 2009, entitled OMNIDIRECTIONAL CAMERA FOR USE IN POLICE CAR EVENT RECORDING and U.S. patent application Ser. No. 12/188,273 filed Aug. 8, 2008, entitled COMBINED WIDE-ANGLE/ZOOM CAMERA FOR LICENSE-PLATE IDENTIFICATION.
- 1. Field of the Invention
- In general, this patent application relates to video-recording devices and more particularly, but not by way of limitation, to systems that include split-screen video displays for use with law-enforcement vehicles.
- 2. History of the Related Art
- Cameras and other video-recording devices have long been used to capture still images and video. In general, cameras include an enclosed hollow portion with an opening or aperture at one end to allow light to enter and a recording surface for capturing the light at another end. In addition, cameras often have a lens positioned in front of the aperture along an optical axis to gather incoming light and focus all or part of an image onto the recording surface.
- Use of dashboard cameras in police vehicles has been known for years and is an integral part of a police department's evidence-gathering capability. One limitation of conventional cameras is a limited field of vision. Fields of view vary from camera to camera but, in general, most cameras have a field of view that ranges from a few degrees to, at most, 180°.
- To overcome the limited field of view, surveillance cameras used for monitoring large areas are oftentimes mounted to mechanisms adapted to enable the camera to pan, tilt, and zoom in order to move objects into the camera's field of view. One type of camera, called an omnidirectional camera, has been used to monitor large areas without a need for mechanisms to enable pan, tilt, and zoom.
- Some omnidirectional cameras may be adapted to capture images from all directions (i.e., a full sphere). However, many omnidirectional cameras do not capture a full sphere of images, but rather capture 360 degrees of images along a single axis with the field of view being limited angularly above and below the axis. As referred to herein, an omnidirectional camera is a camera adapted to capture omnidirectional images. The omnidirectional camera is adapted to capture wide-angle images from a wide-angle field of view up to and including 360-degree images from a 360-degree field of view. An omnidirectional image may be a wide-angle image, for example, of 130-190° from a wide-angle field of view, for example, of 130-360°. In some cases, the omnidirectional camera may have a field of view on the order of 180°, 190°, 200°, 210°, 220°, 230°, 240°, 250°, 260°, 270°, 280°, 290°, 300°, 310°, 320°, 330°, 340°, 350°, or 360°, and the omnidirectional images may be less than or equal to an omnidirectional-camera field of view.
- More recently, dual-lens devices have been developed that combine a narrow-view lens and an omnidirectional lens. These dual-lens devices typically allow recording of up to 360 degrees of images at a plurality of different resolutions. However, display of the output from such dual-lens devices in a way that eliminates unimportant portions of images remains problematic.
- A system includes a first camera operable to capture omnidirectional images and send omnidirectional-image data representing the omnidirectional images, a second camera operable to capture narrow-view images and send narrow-view-image data representing the narrow-view images, a video processor coupled to the first camera and the second camera and operable to form combined-image data using at least part of the omnidirectional-image data and the narrow-view-image data, and a display module interoperably coupled to the video processor and operable to display combined images from the combined-image data. The combined images each comprise a narrow-view-display portion and an omnidirectional-display portion.
- A method includes concurrently capturing omnidirectional images and narrow-view images, storing data representing the captured omnidirectional images as omnidirectional-image data, storing data representing the captured narrow-view images as narrow-view-image data, removing data representing an unimportant portion of the narrow-view images to create cropped narrow-view-image data, creating combined-image data using the cropped narrow-view-image data and at least part of the omnidirectional-image data, and displaying combined images from the combined-image data.
- A system includes an omnidirectional sensor operable to capture images and create therefrom image data, a video processor operable to create, from at least part of the image data, combined-image data that includes narrow-view-image data and non-narrow-view-image data, and a display module interoperably coupled to the video processor and operable, using the combined-image data, to display combined images that include narrow-view images and non-narrow-view images. The displayed narrow-view images comprise an enlarged version of a portion of images represented by the image data.
- A method includes capturing omnidirectional images, enlarging a relevant area of the omnidirectional images via a video processor, the enlarging resulting in enlarged relevant-area images, downsampling and cropping the omnidirectional images via the video processor, the downsampling resulting in downsampled cropped omnidirectional images, combining the enlarged-relevant-area images and the downsampled cropped omnidirectional images into combined images via the video processor, and displaying the combined images via a display module.
- For a more complete understanding of the present invention and for further objects and advantages thereof, reference may now be had to the following description taken in conjunction with the accompanying drawings in which:
- FIG. 1 is a block diagram of a dual-camera system;
- FIG. 2A is a side elevation view of an omnidirectional camera;
- FIG. 2B is a side elevation view of another omnidirectional camera;
- FIG. 3 is an illustrative field of view (FOV) of an omnidirectional camera;
- FIG. 4A is a top view of a dual-camera system;
- FIG. 4B is a top view of another dual-camera system;
- FIG. 4C is a top view of another dual-camera system;
- FIG. 5A is a detailed view of a combined image;
- FIG. 5B is a flow diagram illustrating a process for operation of the camera system of FIG. 1;
- FIG. 6A is a block diagram of a single-camera system;
- FIG. 6B is a detailed view of an image captured by the camera system of FIG. 6A;
- FIG. 6C is a detailed view of a modified image displayed by a display module of the camera system shown in FIG. 6A; and
- FIG. 7 is a flow diagram illustrating a process for operation of the camera system of FIG. 6A.
- Various embodiments of the present invention will now be described more fully with reference to the accompanying drawings. The invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, the embodiments are provided so that this disclosure will be thorough and will fully convey the scope of the invention to those skilled in the art.
- FIG. 1 is a block diagram of a dual-camera system. In FIG. 1, a dual-camera system 100 includes an omnidirectional camera 10, a narrow-view camera 12, a video processor 14, and a display module 16. The omnidirectional camera 10 is coupled to the video processor 14 by way of a connection 18. In a typical embodiment, the omnidirectional camera 10 is a front-facing camera equipped with a fish-eye lens and has a field of view of at least 90 degrees. However, the omnidirectional camera 10 can be any type of omnidirectional camera such as, for example, a conical mirror camera, and typically has a field of view of at least 180 degrees. Although the dual-camera system 100 is depicted by way of example as including a single omnidirectional camera 10, a dual-camera system in accordance with principles of the invention can incorporate any number of omnidirectional cameras 10 arranged in any orientation such as, for example, a front-facing omnidirectional camera and a rear-facing omnidirectional camera. The narrow-view camera 12 is coupled to the video processor 14 by way of a connection 20.
- In a typical embodiment, the narrow-view camera 12 has a field of view, for example, of approximately 10-50°; however, a camera that has any appropriate field of view may be used. Although the omnidirectional camera 10 and the narrow-view camera 12 are depicted by way of example as being connected to the video processor 14 via the connections 18 and 20, the omnidirectional camera 10 and the narrow-view camera 12 could be wirelessly connected to the video processor 14.
- In a typical embodiment, the omnidirectional camera 10 and the narrow-view camera 12 are placed in close proximity to one another so that the points of view of the omnidirectional camera 10 and of the narrow-view camera 12 are at least approximately the same. The video processor 14 may be, for example, a stand-alone unit or contained within the same housing as one or both of the narrow-view camera 12 and the omnidirectional camera 10. The video processor 14 receives image data from both of the narrow-view camera 12 and the omnidirectional camera 10. The display module 16 is coupled to the video processor 14 by way of a connection 22. In a typical embodiment, the display module 16 includes a video display that simultaneously displays images captured by the omnidirectional camera 10 and the narrow-view camera 12 and processed by the video processor 14. Although the display module 16 is depicted by way of example as being connected to the video processor 14 via the connection 22, the display module 16 could be wirelessly connected to the video processor 14.
- FIG. 2A is a side elevation view of a typical omnidirectional camera. In FIG. 2A, an omnidirectional camera 10 includes a sensor 11 and a lens 13. In a typical embodiment, the lens 13 is a fish-eye lens and has a field of view of approximately 180 degrees; however, lenses having different fields of view may be used. In addition, any lens adapted to focus omnidirectional images, such as, for example, a wide-angle lens, a super-wide-angle lens, a full-circle lens, a spherical mirror-type lens, a conical mirror-type lens, or other lens or mirror configuration capable of focusing omnidirectional images may be employed in place of the lens 13. In a typical embodiment, the omnidirectional camera 10 outputs image data to a display module or a video processor.
- FIG. 2B is a side elevation view of another omnidirectional camera. In FIG. 2B, an omnidirectional camera 10′ includes a sensor 24 arranged relative to an external mirror 26 and a dome 28, the dome 28 being concave relative to the sensor 24. The dome 28 and the mirror 26 in combination are adapted to allow light to pass therethrough. In some embodiments, the dome 28 may be convex relative to the sensor 24, the dome 28 and mirror 26 in combination being adapted to reflect light towards the sensor 24. A resulting omnidirectional image captured by the omnidirectional camera 10′ may be, for example, a 360-degree image of a scene surrounding the omnidirectional camera 10′, wherein 360 degrees is relative to a centerline 31 of the camera 10′. In some embodiments, the omnidirectional camera 10′ may be a high-definition camera such as, for example, a camera having a sensor adapted to capture images on the order of several megapixels. The omnidirectional camera 10′ may be used interchangeably with the omnidirectional camera 10 in various embodiments. In a typical embodiment, the omnidirectional camera 10′ outputs image data to a display module or a video processor.
- FIG. 3 is an illustrative field of view (FOV) of the omnidirectional camera 10′. For descriptive purposes, a coordinate system has been superimposed about the omnidirectional camera 10′. The coordinate system has an optical axis 30 shown running vertically along the centerline 31 of the omnidirectional camera 10′ and a horizontal axis 32 perpendicular thereto and passing through the mirror 26.
- In general, the FOV of a camera is the area of a scene around the camera that can be captured by the camera. The FOV 34 of the omnidirectional camera 10′ along the horizontal axis 32 is shown. The FOV 34 extends both above and below the horizontal axis 32. For example, in the embodiment shown, the FOV 34 extends approximately 10 degrees above the horizontal axis 32 and approximately 45 degrees below the horizontal axis 32.
- In various embodiments, the FOV 34 may extend more than or less than 10 degrees above the horizontal axis 32 and/or may extend more than or less than 45 degrees below the horizontal axis 32. Although FIG. 3 shows the FOV 34 along one axis, the full FOV of the omnidirectional camera 10′ may include all 360 degrees of rotation about the optical axis 30. The entire panorama of the omnidirectional camera 10′ would then be a 55°×360° FOV, where the 55 degrees represents the size of the angle relative to the horizontal axis 32. In typical embodiments, a FOV of the omnidirectional camera 10 and the FOV 34 of the omnidirectional camera 10′ would be similar.
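As a quick check of the panorama geometry above, the vertical extent is simply the sum of the angles above and below the horizontal axis. A minimal sketch (the variable names are illustrative, not from the patent):

```python
# Vertical FOV of the illustrative panorama: approximately 10 degrees
# above the horizontal axis 32 plus approximately 45 degrees below it.
degrees_above = 10
degrees_below = 45
vertical_fov = degrees_above + degrees_below  # 55 degrees

# A full rotation about the optical axis 30 gives the 55 x 360-degree panorama.
panorama_fov = (vertical_fov, 360)
print(panorama_fov)
```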
- FIG. 4A is a top view of the dual-camera system 100 in an illustrative environment. During operation, the omnidirectional camera 10 and the narrow-view camera 12 are positioned, for example, on a dashboard of a police vehicle 36. In a typical embodiment, the narrow-view camera 12 is oriented to capture images in front of the police vehicle 36 as shown by a field of view 35 and output image data representing the captured images. The omnidirectional camera 10 is oriented to have a similar point of view as that of the narrow-view camera 12. A field of view of the omnidirectional camera 10 is illustrated by arrows 40. The omnidirectional camera 10 captures images of objects in front of the police vehicle 36 as well as objects on the sides of the police vehicle 36 that are outside the field of view 35 of the narrow-view camera 12.
- FIG. 4B is a top view of another dual-camera system in an illustrative environment. In FIG. 4B, a system 102 includes an omnidirectional camera 10″ that has a field of view that is greater than the 180 degrees illustrated in the system 100 of FIG. 4A. The field of view of the omnidirectional camera 10″ is illustrated by arrows 40′. Similarly to the system 100, the narrow-view camera 12 and the omnidirectional camera 10″ are placed in close proximity to each other such as, for example, on the dashboard of the police vehicle 36. In a typical embodiment, the narrow-view camera 12 is oriented to capture images in front of the police vehicle as shown by the field of view 35 and output image data representing the captured images.
- FIG. 4C is a top view of another dual-camera system in an illustrative environment. In FIG. 4C, a system 104 includes omnidirectional cameras 10(1) and 10(2). Those having skill in the art will recognize that the number of omnidirectional or narrow-view cameras in a given system need not be limited to any particular number and that a plurality of either type of camera as dictated by design considerations may be used. The omnidirectional camera 10(1) is shown arranged in a front-facing orientation while the omnidirectional camera 10(2) is shown arranged in a rear-facing orientation relative to the police vehicle 36.
- A field of view of the front-facing omnidirectional camera 10(1) is shown by the arrows 40. A field of view of the rear-facing omnidirectional camera 10(2) is shown by arrows 40″. The inclusion of the rear-facing omnidirectional camera 10(2) allows the system 104 to obtain a full 360 degrees of coverage. In similar fashion to the system 100, the narrow-view camera 12 and the omnidirectional camera 10(1) are placed in close proximity to each other such as, for example, on the dashboard of the police vehicle 36. In a typical embodiment, the narrow-view camera 12 is oriented to capture images occurring directly in front of the police vehicle as shown by the field of view 35 and output image data representing the captured images. In some embodiments, a second narrow-view camera that is rear-facing may also be employed. Output of cameras facing different directions such as, for example, the omnidirectional cameras 10(1) and 10(2), can be displayed simultaneously or sequentially in an automated fashion or responsive to user input.
- FIG. 5A is a detailed view of a combined image displayable via the display module 16. In FIG. 5A, a combined image 42 includes a narrow-view portion 44 and an omnidirectional portion 46. In a typical embodiment, the narrow-view portion 44 includes, for example, about 85% of the total viewable area of the combined image 42. The narrow-view portion 44 typically has a standard resolution of D1. The term D1 is commonly understood to represent a resolution of approximately 720×480. However, the narrow-view portion 44 may have a high-definition resolution such as, for example, 720p or 1080i. The narrow-view portion 44 typically includes at least part of an image captured by the narrow-view camera 12. The omnidirectional portion 46 includes, for example, a lower 15% of the area of the combined image 42; however, the size and positioning of the omnidirectional portion 46 may be altered as needed for particular applications. The omnidirectional portion 46 typically includes at least part of an image captured by an omnidirectional camera such as, for example, the omnidirectional camera 10.
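To make the 85%/15% split concrete, the row budget on a D1 frame works out as follows. This is a minimal sketch under the stated assumptions: the fractions and frame size come from the description above, while the variable names are illustrative:

```python
# D1 frame: approximately 720 x 480.
FRAME_W, FRAME_H = 720, 480

# Narrow-view portion 44 occupies about 85% of the viewable area;
# the omnidirectional portion 46 occupies the lower 15%.
narrow_rows = round(FRAME_H * 0.85)  # rows given to the narrow-view portion
omni_rows = FRAME_H - narrow_rows    # rows left for the omnidirectional strip

print(narrow_rows, omni_rows)
```

On a 480-row frame this leaves 408 rows for the narrow-view portion and a 72-row strip for the omnidirectional portion.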
- FIG. 5B is a flow diagram illustrating a process for operation of the camera system of FIG. 1. Referring now to FIGS. 1, 5A, and 5B, a process 500 begins at step 502. At step 504, the omnidirectional camera 10 and the narrow-view camera 12 each capture images and create image data representing the captured images. At step 506, the image data are transmitted to the video processor 14.
- At step 508, the video processor 14 digitally unfolds and crops the image data received by the video processor 14 from the omnidirectional camera 10. Unfolding may be performed in an effort to minimize edge distortion caused by the use of, for example, a fish-eye lens. Cropping may be performed to remove undesired or unimportant image portions. In another option, analog unfolding may be accomplished through use of a special lens designed to correct edge distortion. In order to minimize unacceptable image resolution post-unfolding, the omnidirectional camera 10 may capture images at a greater resolution than that of images captured by the narrow-view camera 12. In some embodiments, one or both of unfolding and cropping of the output by the omnidirectional camera 10 may not be performed.
- In a typical embodiment, step 508 also includes cropping by the video processor of image data from the narrow-view camera 12 that contain irrelevant or unimportant information such as, for example, data representing a hood of a police vehicle. Cropping of the image data from the narrow-view camera 12 is performed so that irrelevant image portions are not displayed. In other words, a portion of a captured image that would otherwise be displayed and that often contains irrelevant image portions may be discarded and not displayed without loss of useful information.
- At step 510, the video processor creates combined images 42 and transmits data representing the combined images 42 to the display module 16. The combined images 42 are composed of narrow-view portions 44 and omnidirectional portions 46. At step 512, the display module displays the combined images 42. The omnidirectional portions 46 can be thought of as being displayed in place of a portion of images output from the narrow-view camera 12 that are considered unimportant. In some embodiments, data representing the narrow-view portion 44 and the omnidirectional portion 46 are transmitted from the video processor 14 to the display module 16 as separate data streams and are displayed by the display module 16 as separate images to form the combined image 42, while in other embodiments, a single combined-image data stream is employed.
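Steps 508-510 above can be sketched in a few lines of image-array code. This is a hedged illustration, not the patent's implementation: the unfolding here is a simple nearest-neighbour polar-to-panoramic remap, the hood crop is a fixed row count, and all function and variable names are made up for the example.

```python
import numpy as np

def unfold_fisheye(omni, out_h, out_w):
    """Unfold a circular fish-eye image into a panoramic strip by sampling
    along radii (nearest neighbour). Row 0 of the output corresponds to the
    rim of the fish-eye circle; the last row approaches its centre."""
    h, w = omni.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    max_r = min(cx, cy)
    out = np.zeros((out_h, out_w) + omni.shape[2:], dtype=omni.dtype)
    for row in range(out_h):
        r = max_r * (1.0 - row / out_h)
        for col in range(out_w):
            theta = 2.0 * np.pi * col / out_w
            y = int(round(cy + r * np.sin(theta)))
            x = int(round(cx + r * np.cos(theta)))
            out[row, col] = omni[y, x]
    return out

def crop_hood(narrow, hood_rows):
    """Discard the bottom rows of the narrow-view image (step 508's crop of
    unimportant portions such as the vehicle hood)."""
    return narrow[:-hood_rows] if hood_rows else narrow

def combine(narrow, omni_strip):
    """Step 510: stack the cropped narrow-view image above the unfolded
    omnidirectional strip to form the combined image."""
    assert narrow.shape[1] == omni_strip.shape[1], "widths must match"
    return np.vstack([narrow, omni_strip])
```

For example, a 200×200 fish-eye frame unfolded to a 72×720 strip, stacked under a 440-row narrow-view frame cropped by 32 hood rows, yields a 480×720 combined frame.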
- FIG. 6A is a block diagram of a single-camera system. In FIG. 6A, a single-camera system 200 includes the video processor 14, the display module 16, and a sensor 202. The display module 16 is coupled to the video processor 14 by way of the connection 22. The sensor 202 is coupled to the video processor 14 by way of the connection 18. The sensor 202 may be any appropriate video sensor but is typically a 20-40 megapixel sensor. In a typical embodiment, the sensor 202 has a field of view of approximately 180 degrees; however, fields of view up to and including 360 degrees may also be utilized.
- FIG. 6B is a detailed view of an image captured by a sensor such as the sensor 202. In FIG. 6B, an omnidirectional image 204 captured by the sensor 202 includes a relevant area 206 as well as portions of the omnidirectional image 204 that are not within the relevant area 206 as illustrated by a shaded area 208. In some embodiments, the shaded area 208 includes all or part of the relevant area 206. In a typical embodiment, the relevant area 206 may be, for example, the area directly in front of a police vehicle or areas including license plates.
- FIG. 6C is a detailed view of a modified image displayed by the display module 16 of the camera system 200. In FIG. 6C, a modified image 42′ includes a narrow-view portion 44′ and an omnidirectional portion 46′. The narrow-view portion 44′ is an enlarged version of the relevant area 206 and the omnidirectional portion 46′ is a cropped version of the shaded area 208. In some embodiments, the cropped omnidirectional portion 46′ is also downsampled.
- Referring now to FIGS. 6A-6C, during operation, the sensor 202 captures the omnidirectional image 204 at very high resolution such as, for example, 20-40 megapixels. Data representing the omnidirectional image 204 is transmitted from the sensor 202 to the video processor 14 via the connection 18. The video processor 14 identifies and enlarges the relevant area 206, the enlargement thereof resulting in the narrow-view portion 44′. The video processor 14 also crops the shaded area 208, thereby forming a cropped version thereof (i.e., the omnidirectional portion 46′). As noted above, in some embodiments, the shaded area 208 includes all or part of the relevant area 206. The display module 16 displays the narrow-view portion 44′ and the cropped version of the shaded area 208 (i.e., the omnidirectional portion 46′). In this sense, the system 200 creates data representing the narrow-view portion 44′ via what is sometimes referred to as digital zoom.
- The video processor 14 also typically downsamples at least portions of the data representing the omnidirectional image 204 not within the relevant area 206 (e.g., the shaded area 208). In other embodiments, both data representing the relevant area 206 and the shaded area 208 are downsampled. Downsampling reduces the amount of data needed to be displayed and, in some cases, transferred between components of the system 200. The shaded area 208 need not necessarily include all of the omnidirectional image 204 other than the relevant area 206. Regardless of whether only the shaded area 208 or both the shaded area 208 and the relevant area 206 are downsampled, one or both of the relevant area 206 and the enlarged version of the relevant area 206 may be retained so as to be available to be presented to and displayed by the display module 16. In another option, downsampling may be performed by the sensor 202, thereby reducing the amount of data that must be transmitted from the sensor 202 to the video processor 14.
- The video processor 14 typically transmits data representing the combined image 42′ to the display module 16 as a single data stream. As illustrated, the combined image 42′ includes the narrow-view portion 44′ and the omnidirectional portion 46′. The display module 16 displays at least part of the omnidirectional image 204 or a downsampled version thereof in the omnidirectional portion 46′ of the display module 16. In similar fashion, the display module 16 displays the relevant area 206 or an enlarged version thereof in the narrow-view portion 44′. In this way, more-relevant images are in some embodiments presented at a relatively higher resolution, while less relevant images are presented at a relatively lower resolution.
- In other embodiments, the combined image 42′ is created by the display module 16 from a first video stream containing, for example, the enlarged version of the relevant area 206 and a second video stream containing, for example, all or part of a downsampled version of the omnidirectional image 204. In such embodiments, the video processor 14 presents a first video stream to the display module 16 containing the enlarged version of the relevant area 206. The video processor 14 also presents a second video stream containing all or part of the downsampled version of the omnidirectional image 204.
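The digital zoom and downsampling steps described above reduce to crop-and-resample operations on the pixel array. A minimal sketch using nearest-neighbour resampling (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def digital_zoom(img, top, left, height, width, factor):
    """Enlarge a relevant area by integer pixel replication (a simple form
    of the 'digital zoom' described above)."""
    roi = img[top:top + height, left:left + width]
    return np.repeat(np.repeat(roi, factor, axis=0), factor, axis=1)

def downsample(img, step):
    """Reduce resolution by keeping every step-th pixel in each dimension,
    cutting the amount of data to display or transfer."""
    return img[::step, ::step]
```

For instance, a 4×5 relevant area zoomed 3× yields a 12×15 narrow-view patch, while downsampling a 10×10 capture by 2 leaves a 5×5 array, one quarter of the data.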
- FIG. 7 is a flow diagram illustrating a process of operation of the camera system 200. In FIG. 7, a process 700 starts at step 702. From step 702, execution proceeds to step 704. At step 704, the sensor 202 captures an omnidirectional image and transmits the data representing the captured omnidirectional image to the video processor 14. From step 704, execution proceeds to step 706. At step 706, the video processor 14 identifies the relevant area 206. From step 706, execution proceeds to step 708. At step 708, the video processor 14 enlarges the relevant area 206 to create an enlarged version thereof; however, in some embodiments, step 708 may not be performed such that the relevant area 206 is not enlarged. From step 708, execution proceeds to step 710.
- At step 710, the video processor 14 optionally downsamples at least portions of the omnidirectional image 204, such as those within the shaded area 208. At step 711, the video processor creates a combined image 42′ that includes the enlarged version of the relevant area 206 and at least part of the downsampled portions of the omnidirectional image 204 and presents the combined image 42′ to the display module 16. In another option, the combined image 42′ may be created by the display module 16 from a first video stream containing the enlarged version of the relevant area 206 and a second video stream containing at least part of the downsampled portions of the omnidirectional image 204.
- At step 712, the display module 16 displays the combined image 42′. In other words, the display module 16 displays the enlarged version of the relevant area 206 in the narrow-view portion 44′ and at least part of the downsampled portions of the omnidirectional image 204 in the omnidirectional portion 46′. The process ends at step 714. Various steps of the process 700 may be performed concurrently or in a different order than described above without departing from principles of the invention.
- Although various embodiments of the method and apparatus of the present invention have been illustrated in the accompanying Drawings and described in the foregoing Detailed Description, it will be understood that the invention is not limited to the embodiments disclosed, but is capable of numerous rearrangements, modifications and substitutions without departing from the spirit of the invention as set forth herein. For example, although the omnidirectional camera 10 and the narrow-view camera 12 are described herein as separate units, a system could contain both the omnidirectional camera 10 and the narrow-view camera 12 in a single housing. Furthermore, components may have different functions from those described herein. In particular, functions described herein as being performed by the video processor 14 may, in various embodiments, be performed by one or both of the omnidirectional camera 10 or the narrow-view camera 12. The system 100 and the system 200 and the displayed images 42 and 42′ are only examples of split-screen displayed images that could be created by various embodiments. It is intended that the specification and examples be considered as illustrative only. For example, either of the system 100 or the system 200 could be used to display either or both of the combined image 42 or the combined image 42′ or other configurations of combined images in accordance with principles of the invention. In addition, regardless of whether operations performed by the video processor 14 are described as being performed on images or image data, it will be understood that the operations are digital operations performed on image data.
Claims (19)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/254,384 US20140226952A1 (en) | 2010-05-18 | 2014-04-16 | Method and system for split-screen video display |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US34566310P | 2010-05-18 | 2010-05-18 | |
US13/109,557 US8736680B1 (en) | 2010-05-18 | 2011-05-17 | Method and system for split-screen video display |
US14/254,384 US20140226952A1 (en) | 2010-05-18 | 2014-04-16 | Method and system for split-screen video display |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/109,557 Continuation US8736680B1 (en) | 2010-05-18 | 2011-05-17 | Method and system for split-screen video display |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140226952A1 true US20140226952A1 (en) | 2014-08-14 |
Family
ID=50736491
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/109,557 Active 2031-12-22 US8736680B1 (en) | 2010-05-18 | 2011-05-17 | Method and system for split-screen video display |
US14/254,384 Abandoned US20140226952A1 (en) | 2010-05-18 | 2014-04-16 | Method and system for split-screen video display |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/109,557 Active 2031-12-22 US8736680B1 (en) | 2010-05-18 | 2011-05-17 | Method and system for split-screen video display |
Country Status (1)
Country | Link |
---|---|
US (2) | US8736680B1 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150185484A1 (en) * | 2013-12-30 | 2015-07-02 | Electronics And Telecommunications Research Institute | Pupil tracking apparatus and method |
KR20150079393A (en) * | 2013-12-30 | 2015-07-08 | 한국전자통신연구원 | Apparatus and method for tracking pupil |
US20150288886A1 (en) * | 2010-08-27 | 2015-10-08 | Sony Corporation | Imaging device, imaging system, and imaging method |
US9560309B2 (en) | 2004-10-12 | 2017-01-31 | Enforcement Video, Llc | Method of and system for mobile surveillance and event recording |
CN106502557A (en) * | 2016-09-14 | 2017-03-15 | 深圳众思科技有限公司 | A kind of split screen transmits the method and device of file |
US10334249B2 (en) | 2008-02-15 | 2019-06-25 | WatchGuard, Inc. | System and method for high-resolution storage of images |
US10341605B1 (en) | 2016-04-07 | 2019-07-02 | WatchGuard, Inc. | Systems and methods for multiple-resolution storage of media streams |
CN111163283A (en) * | 2018-11-07 | 2020-05-15 | 浙江宇视科技有限公司 | Monitoring method and device |
WO2020138536A1 (en) * | 2018-12-24 | 2020-07-02 | 서울과학기술대학교 산학협력단 | System and method for transmitting image on basis of hybrid network |
US20200227089A1 (en) * | 2016-03-25 | 2020-07-16 | Samsung Electronics Co., Ltd. | Method and device for processing multimedia information |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130089301A1 (en) * | 2011-10-06 | 2013-04-11 | Chi-cheng Ju | Method and apparatus for processing video frames image with image registration information involved therein |
US10493916B2 (en) | 2012-02-22 | 2019-12-03 | Magna Electronics Inc. | Vehicle camera system with image manipulation |
US11327302B2 (en) | 2013-09-18 | 2022-05-10 | Beth Holst | Secure capture and transfer of image and audio data |
US10008124B1 (en) | 2013-09-18 | 2018-06-26 | Beth Holst | Method and system for providing secure remote testing |
US10140827B2 (en) | 2014-07-07 | 2018-11-27 | Google Llc | Method and system for processing motion event notifications |
US10127783B2 (en) * | 2014-07-07 | 2018-11-13 | Google Llc | Method and device for processing motion events |
US9082018B1 (en) | 2014-09-30 | 2015-07-14 | Google Inc. | Method and system for retroactively changing a display characteristic of event indicators on an event timeline |
JP6304391B2 (en) * | 2014-10-17 | 2018-04-04 | 株式会社リコー | Image display system for vehicles |
CN105635635A (en) | 2014-11-19 | 2016-06-01 | 杜比实验室特许公司 | Adjustment for space consistency in video conference system |
US9888174B2 (en) | 2015-10-15 | 2018-02-06 | Microsoft Technology Licensing, Llc | Omnidirectional camera with movement detection |
US10277858B2 (en) * | 2015-10-29 | 2019-04-30 | Microsoft Technology Licensing, Llc | Tracking object of interest in an omnidirectional video |
US20170134714A1 (en) * | 2015-11-11 | 2017-05-11 | Microsoft Technology Licensing, Llc | Device and method for creating videoclips from omnidirectional video |
KR20180060236A (en) * | 2016-11-28 | 2018-06-07 | 엘지전자 주식회사 | Mobile terminal and operating method thereof |
JP6349558B1 (en) * | 2017-05-12 | 2018-07-04 | パナソニックIpマネジメント株式会社 | Imaging system and display system |
US11226731B1 (en) * | 2018-01-24 | 2022-01-18 | Snap Inc. | Simulated interactive panoramas |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100225817A1 (en) * | 2000-06-28 | 2010-09-09 | Sheraizin Semion M | Real Time Motion Picture Segmentation and Superposition |
US20100238327A1 (en) * | 2009-03-19 | 2010-09-23 | Griffith John D | Dual Sensor Camera |
US20110053654A1 (en) * | 2008-03-26 | 2011-03-03 | Tessera Technologies Ireland Limited | Method of Making a Digital Camera Image of a Scene Including the Camera User |
US20110134141A1 (en) * | 2007-04-03 | 2011-06-09 | Lifetouch Inc. | Method and apparatus for background replacement in still photographs |
US20110242277A1 (en) * | 2010-03-30 | 2011-10-06 | Do Minh N | Systems and methods for embedding a foreground video into a background feed based on a control input |
US20110249153A1 (en) * | 2009-01-20 | 2011-10-13 | Shinichiro Hirooka | Obstacle detection display device |
US20110310435A1 (en) * | 2002-06-05 | 2011-12-22 | Seiko Epson Corporation | Digital camera recording a composite image |
US20120092522A1 (en) * | 2007-04-13 | 2012-04-19 | Fujifilm Corporation | Imaging apparatus, method and program |
Family Cites Families (86)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4389706A (en) | 1972-05-03 | 1983-06-21 | Westinghouse Electric Corp. | Digital computer monitored and/or operated system or process which is structured for operation with an improved automatic programming process and system |
US4949186A (en) | 1987-02-13 | 1990-08-14 | Peterson Roger D | Vehicle mounted surveillance system |
EP0374419A3 (en) | 1988-12-21 | 1991-04-10 | International Business Machines Corporation | Method and apparatus for efficient loop constructs in hardware and microcode |
US5408330A (en) | 1991-03-25 | 1995-04-18 | Crimtec Corporation | Video incident capture system |
DE69330513D1 (en) | 1992-03-20 | 2001-09-06 | Commw Scient Ind Res Org | OBJECT MONITORING SYSTEM |
CA2135240A1 (en) | 1993-12-01 | 1995-06-02 | James F. Frazier | Automated license plate locator and reader |
CA2148631C (en) | 1994-06-20 | 2000-06-13 | John J. Hildin | Voice-following video system |
US5703604A (en) | 1995-05-22 | 1997-12-30 | Dodeca Llc | Immersive dodecaherdral video viewing system |
US6731334B1 (en) | 1995-07-31 | 2004-05-04 | Forgent Networks, Inc. | Automatic voice tracking camera system and method of operation |
CA2260195C (en) | 1996-06-28 | 2003-09-23 | T. Eric Hopkins | Image acquisition system |
US6252989B1 (en) | 1997-01-07 | 2001-06-26 | Board Of The Regents, The University Of Texas System | Foveated image coding system and method for image bandwidth reduction |
JPH10304334A (en) | 1997-04-25 | 1998-11-13 | Canon Inc | Communication method and device, transmission device and reception device, communication system and recording medium |
WO1998052795A2 (en) | 1997-05-21 | 1998-11-26 | Siemens Aktiengesellschaft | Passenger protection control system, and its control method |
US6477202B1 (en) | 1997-09-03 | 2002-11-05 | Matsushita Electric Industrial Co., Ltd. | Apparatus of layered picture coding, apparatus of picture decoding, methods of picture decoding, apparatus of recording for digital broadcasting signal, and apparatus of picture and audio decoding |
CN1137487C (en) | 1997-09-17 | 2004-02-04 | 松下电器产业株式会社 | Optical disc, video data editing apparatus, computer-readable recording medium storing editing program, reproduction apparatus for optical disc, and computer-readable recording medium |
US6389340B1 (en) | 1998-02-09 | 2002-05-14 | Gary A. Rayner | Vehicle data recorder |
US6546119B2 (en) | 1998-02-24 | 2003-04-08 | Redflex Traffic Systems | Automated traffic violation monitoring and reporting system |
JPH11242518A (en) | 1998-02-25 | 1999-09-07 | Honda Motor Co Ltd | Radar device |
US6215519B1 (en) | 1998-03-04 | 2001-04-10 | The Trustees Of Columbia University In The City Of New York | Combined wide angle and narrow angle imaging system and method for surveillance and monitoring |
JPH11306283A (en) | 1998-04-24 | 1999-11-05 | Chuo Spring Co Ltd | Number plate reader |
JP2000059758A (en) | 1998-08-05 | 2000-02-25 | Matsushita Electric Ind Co Ltd | Monitoring camera apparatus, monitoring device and remote monitor system using them |
US20030025599A1 (en) | 2001-05-11 | 2003-02-06 | Monroe David A. | Method and apparatus for collecting, sending, archiving and retrieving motion video and still images and notification of detected events |
US7023913B1 (en) | 2000-06-14 | 2006-04-04 | Monroe David A | Digital security multimedia sensor |
US7583290B2 (en) | 1998-10-09 | 2009-09-01 | Diebold, Incorporated | Cash dispensing automated banking machine with improved fraud detection capabilities |
US20020140924A1 (en) | 1999-01-08 | 2002-10-03 | Richard J. Wangler | Vehicle classification and axle counting sensor system and method |
FI106998B (en) | 1999-01-15 | 2001-05-15 | Nokia Mobile Phones Ltd | Bit rate control on a multimedia device |
US6738073B2 (en) * | 1999-05-12 | 2004-05-18 | Imove, Inc. | Camera system with both a wide angle view and a high resolution view |
US6734911B1 (en) | 1999-09-30 | 2004-05-11 | Koninklijke Philips Electronics N.V. | Tracking camera using a lens that generates both wide-angle and narrow-angle views |
US20020040475A1 (en) | 2000-03-23 | 2002-04-04 | Adrian Yap | DVR system |
US6829391B2 (en) | 2000-09-08 | 2004-12-07 | Siemens Corporate Research, Inc. | Adaptive resolution system and method for providing efficient low bit rate transmission of image data for distributed applications |
US7027655B2 (en) | 2001-03-29 | 2006-04-11 | Electronics For Imaging, Inc. | Digital image compression with spatially varying quality levels determined by identifying areas of interest |
JP2002308030A (en) | 2001-04-16 | 2002-10-23 | Yazaki Corp | Periphery monitoring system for vehicle |
US6831556B1 (en) | 2001-05-16 | 2004-12-14 | Digital Safety Technologies, Inc. | Composite mobile digital information system |
GB0116877D0 (en) | 2001-07-10 | 2001-09-05 | Hewlett Packard Co | Intelligent feature selection and pan zoom control |
US7119832B2 (en) | 2001-07-23 | 2006-10-10 | L-3 Communications Mobile-Vision, Inc. | Wireless microphone for use with an in-car video system |
US7940299B2 (en) | 2001-08-09 | 2011-05-10 | Technest Holdings, Inc. | Method and apparatus for an omni-directional video surveillance system |
AU2002357686A1 (en) | 2001-11-01 | 2003-05-12 | A4S Technologies, Inc. | Remote surveillance system |
US6892167B2 (en) | 2001-11-28 | 2005-05-10 | Sypris Data Systems, Inc. | Real-time data acquisition and storage network |
US6741168B2 (en) | 2001-12-13 | 2004-05-25 | Samsung Electronics Co., Ltd. | Method and apparatus for automated collection and transfer of collision information |
US7262790B2 (en) | 2002-01-09 | 2007-08-28 | Charles Adams Bakewell | Mobile enforcement platform with aimable violation identification and documentation system for multiple traffic violation types across all lanes in moving traffic, generating composite display images and data to support citation generation, homeland security, and monitoring |
WO2004004320A1 (en) | 2002-07-01 | 2004-01-08 | The Regents Of The University Of California | Digital processing of video images |
US20040056779A1 (en) | 2002-07-01 | 2004-03-25 | Rast Rodger H. | Transportation signaling device |
EP1391859A1 (en) | 2002-08-21 | 2004-02-25 | Strategic Vista International Inc. | Digital video security system |
US20040119869A1 (en) | 2002-12-24 | 2004-06-24 | Tretter Daniel R. | Dual sensor camera |
US20040150717A1 (en) | 2003-01-21 | 2004-08-05 | Page Warren S. | Digital in-car video surveillance system |
US7735104B2 (en) | 2003-03-20 | 2010-06-08 | The Directv Group, Inc. | System and method for navigation of indexed video content |
EP1627524A4 (en) | 2003-03-20 | 2009-05-27 | Ge Security Inc | Systems and methods for multi-resolution image processing |
EP1513342A3 (en) | 2003-04-29 | 2005-03-16 | Synectic Systems Limited | System and method for storing audio/video data |
US7450165B2 (en) | 2003-05-02 | 2008-11-11 | Grandeye, Ltd. | Multiple-view processing in wide-angle video camera |
US7986339B2 (en) | 2003-06-12 | 2011-07-26 | Redflex Traffic Systems Pty Ltd | Automated traffic violation monitoring and reporting system with combined video and still-image data |
US20060193384A1 (en) | 2003-06-19 | 2006-08-31 | Boyce Jill M | Method and apparatus for low-complexity spatial scalable decoding |
US7711150B2 (en) | 2003-07-10 | 2010-05-04 | James Simon | Autonomous wide-angle license plate recognition |
US7609941B2 (en) | 2003-10-20 | 2009-10-27 | Panasonic Corporation | Multimedia data recording apparatus, monitor system, and multimedia data recording method |
US7929010B2 (en) | 2003-10-24 | 2011-04-19 | Motorola Mobility, Inc. | System and method for generating multimedia composites to track mobile events |
AR046430A1 (en) | 2003-10-28 | 2005-12-07 | Cargill Inc | AGRICULTURAL HANDLING SYSTEM. |
US7373395B2 (en) | 2004-02-04 | 2008-05-13 | Perseus Wireless, Inc. | Method and system for providing information to remote clients |
JP2005251313A (en) | 2004-03-04 | 2005-09-15 | Toshiba Corp | Device and method for information recording and reproducing |
US7768571B2 (en) | 2004-03-22 | 2010-08-03 | Angstrom, Inc. | Optical tracking system using variable focal length lens |
WO2005120924A1 (en) | 2004-06-11 | 2005-12-22 | Stratech Systems Limited | Method and system for rail track scanning and foreign object detection |
WO2005125209A1 (en) | 2004-06-22 | 2005-12-29 | Stratech Systems Limited | Method and system for surveillance of vessels |
WO2006006081A2 (en) | 2004-07-09 | 2006-01-19 | Emitall Surveillance S.A. | Smart video surveillance system ensuring privacy |
US20060028547A1 (en) | 2004-08-04 | 2006-02-09 | Chao-Hung Chang | Integrated active surveillance system |
US7750936B2 (en) | 2004-08-06 | 2010-07-06 | Sony Corporation | Immersive surveillance system interface |
WO2006044476A2 (en) | 2004-10-12 | 2006-04-27 | Robert Vernon Vanman | Method of and system for mobile surveillance and event recording |
US7355527B2 (en) | 2005-01-10 | 2008-04-08 | William Franklin | System and method for parking infraction detection |
US20060159325A1 (en) | 2005-01-18 | 2006-07-20 | Trestle Corporation | System and method for review in studies including toxicity and risk assessment studies |
US20070109411A1 (en) * | 2005-06-02 | 2007-05-17 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Composite image selectivity |
US7495579B2 (en) | 2005-06-13 | 2009-02-24 | Sirota J Marcos | Traffic light status remote sensor system |
US20070024706A1 (en) | 2005-08-01 | 2007-02-01 | Brannon Robert H Jr | Systems and methods for providing high-resolution regions-of-interest |
US7768548B2 (en) | 2005-08-12 | 2010-08-03 | William Bradford Silvernail | Mobile digital video recording system |
US7405834B1 (en) | 2006-02-15 | 2008-07-29 | Lockheed Martin Corporation | Compensated coherent imaging for improved imaging and directed energy weapons applications |
JP4566166B2 (en) | 2006-02-28 | 2010-10-20 | 三洋電機株式会社 | Imaging device |
US20070217761A1 (en) | 2006-03-07 | 2007-09-20 | Coban Research And Technologies, Inc. | Method for video/audio recording using unrestricted pre-event/post-event buffering with multiple bit and frame rates buffer files |
JP2007279017A (en) | 2006-03-15 | 2007-10-25 | Omron Corp | Radar system |
US20070222859A1 (en) | 2006-03-23 | 2007-09-27 | Coban Research And Technologies, Inc. | Method for digital video/audio recording with backlight compensation using a touch screen control panel |
US7574131B2 (en) | 2006-03-29 | 2009-08-11 | Sunvision Scientific Inc. | Object detection system and method |
US7535383B2 (en) | 2006-07-10 | 2009-05-19 | Sharp Laboratories Of America Inc. | Methods and systems for signaling multi-layer bitstream data |
IL179186A0 (en) | 2006-11-12 | 2008-01-20 | Elta Systems Ltd | Method and system for detecting signal sources in a surveillance space |
KR100819047B1 (en) | 2006-11-27 | 2008-04-02 | 한국전자통신연구원 | Apparatus and method for estimating a center line of intersection |
US20090046157A1 (en) | 2007-08-13 | 2009-02-19 | Andrew Cilia | Combined wide-angle/zoom camera for license plate identification |
US20090049491A1 (en) | 2007-08-16 | 2009-02-19 | Nokia Corporation | Resolution Video File Retrieval |
US8045799B2 (en) | 2007-11-15 | 2011-10-25 | Sony Ericsson Mobile Communications Ab | System and method for generating a photograph with variable image quality |
EP2243290A4 (en) | 2008-01-29 | 2011-06-22 | Enforcement Video Llc | Omnidirectional camera for use in police car event recording |
WO2009102477A1 (en) | 2008-02-15 | 2009-08-20 | Enforcement Video, Llc | System and method for high-resolution storage of images |
TWI433531B (en) | 2009-12-25 | 2014-04-01 | Primax Electronics Ltd | Method of starting snapping static screen and system therof |
US10643467B2 (en) | 2010-03-28 | 2020-05-05 | Roadmetric Ltd. | System and method for detecting and recording traffic law violation events |
- 2011
- 2011-05-17 US US13/109,557 patent/US8736680B1/en active Active
- 2014
- 2014-04-16 US US14/254,384 patent/US20140226952A1/en not_active Abandoned
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9871993B2 (en) | 2004-10-12 | 2018-01-16 | WatchGuard, Inc. | Method of and system for mobile surveillance and event recording |
US10075669B2 (en) | 2004-10-12 | 2018-09-11 | WatchGuard, Inc. | Method of and system for mobile surveillance and event recording |
US9560309B2 (en) | 2004-10-12 | 2017-01-31 | Enforcement Video, Llc | Method of and system for mobile surveillance and event recording |
US10063805B2 (en) | 2004-10-12 | 2018-08-28 | WatchGuard, Inc. | Method of and system for mobile surveillance and event recording |
US9756279B2 (en) | 2004-10-12 | 2017-09-05 | Enforcement Video, Llc | Method of and system for mobile surveillance and event recording |
US10334249B2 (en) | 2008-02-15 | 2019-06-25 | WatchGuard, Inc. | System and method for high-resolution storage of images |
US10462372B2 (en) * | 2010-08-27 | 2019-10-29 | Sony Corporation | Imaging device, imaging system, and imaging method |
US20150288886A1 (en) * | 2010-08-27 | 2015-10-08 | Sony Corporation | Imaging device, imaging system, and imaging method |
US10110820B2 (en) * | 2010-08-27 | 2018-10-23 | Sony Corporation | Imaging device, imaging system, and imaging method |
KR20150079393A (en) * | 2013-12-30 | 2015-07-08 | 한국전자통신연구원 | Apparatus and method for tracking pupil |
US20150185484A1 (en) * | 2013-12-30 | 2015-07-02 | Electronics And Telecommunications Research Institute | Pupil tracking apparatus and method |
KR102269088B1 (en) | 2013-12-30 | 2021-06-24 | 한국전자통신연구원 | Apparatus and method for tracking pupil |
US20200227089A1 (en) * | 2016-03-25 | 2020-07-16 | Samsung Electronics Co., Ltd. | Method and device for processing multimedia information |
EP3716635A1 (en) * | 2016-03-25 | 2020-09-30 | Samsung Electronics Co., Ltd. | Method and device for processing multimedia information |
US11081137B2 (en) | 2016-03-25 | 2021-08-03 | Samsung Electronics Co., Ltd | Method and device for processing multimedia information |
US10341605B1 (en) | 2016-04-07 | 2019-07-02 | WatchGuard, Inc. | Systems and methods for multiple-resolution storage of media streams |
CN106502557A (en) * | 2016-09-14 | 2017-03-15 | 深圳众思科技有限公司 | A kind of split screen transmits the method and device of file |
CN111163283A (en) * | 2018-11-07 | 2020-05-15 | 浙江宇视科技有限公司 | Monitoring method and device |
WO2020138536A1 (en) * | 2018-12-24 | 2020-07-02 | 서울과학기술대학교 산학협력단 | System and method for transmitting image on basis of hybrid network |
Also Published As
Publication number | Publication date |
---|---|
US8736680B1 (en) | 2014-05-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8736680B1 (en) | Method and system for split-screen video display | |
US5657073A (en) | Seamless multi-camera panoramic imaging with distortion correction and selectable field of view | |
US7450165B2 (en) | Multiple-view processing in wide-angle video camera | |
JP5194679B2 (en) | Vehicle periphery monitoring device and video display method | |
KR100983625B1 (en) | Imaging device | |
JP5321711B2 (en) | Vehicle periphery monitoring device and video display method | |
US20100045773A1 (en) | Panoramic adapter system and method with spherical field-of-view coverage | |
US20110234475A1 (en) | Head-mounted display device | |
KR101179131B1 (en) | Monitoring system using synthetic simultaneous monitoring camera with pan/tilt/zoom function | |
JP4844979B2 (en) | Image processing method and imaging apparatus using the image processing method | |
US8994825B2 (en) | Vehicle rear view camera system and method | |
US20140176700A1 (en) | Driving assistant system and method | |
CN106488181B (en) | Display control device, display control method, and recording medium | |
US20140184737A1 (en) | Driving assistant system and method | |
US20140176699A1 (en) | Driving assistant system and method | |
KR101478980B1 (en) | System for multi channel display to use a fish-eye lens | |
US8139120B2 (en) | Image processing device, camera device and image processing method | |
KR101360244B1 (en) | Monitoring system for moving things using cctv fluoroscopy | |
KR100445548B1 (en) | Panorama shooting monitoring method and shooting monitoring device | |
JP2003116029A (en) | Imaging device and image recorder using the same | |
JP2019001325A (en) | On-vehicle imaging device | |
JP2010134617A (en) | Panoramic imaging apparatus | |
CN112272829A (en) | Camera with scanning optical path folding element for automotive or surveillance applications | |
JP6844055B1 (en) | Surveillance camera | |
JP2000322564A (en) | Omnidirectional visual sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ENFORCEMENT VIDEO, LLC, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CILIA, ANDREW;VANMAN, ROBERT V.;SIGNING DATES FROM 20111025 TO 20111129;REEL/FRAME:032767/0485 |
|
AS | Assignment |
Owner name: TEXAS CAPITAL BANK (NATIONAL BANKING ASSOCIATION), Free format text: RESTATED INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:ENFORCEMENT VIDEO, LLC (A TEXAS LIMITED LIABILITY COMPANY);REEL/FRAME:041564/0706 Effective date: 20161229 |
|
AS | Assignment |
Owner name: WATCHGUARD, INC., TEXAS Free format text: CERTIFICATE OF CONVERSION;ASSIGNOR:ENFORCEMENT VIDEO, LLC;REEL/FRAME:044712/0932 Effective date: 20170927 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |
|
AS | Assignment |
Owner name: WATCHGUARD, INC., TEXAS Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:TEXAS CAPITAL BANK, NATIONAL ASSOCIATION;REEL/FRAME:049735/0091 Effective date: 20190711 |
|
AS | Assignment |
Owner name: MOTOROLA SOLUTIONS INC., ILLINOIS Free format text: CHANGE OF NAME;ASSIGNORS:WATCHGUARD, INC.;WATCHGUARD VIDEO, INC.;REEL/FRAME:051325/0261 Effective date: 20191031 |
|
AS | Assignment |
Owner name: WATCHGUARD VIDEO, INC., TEXAS Free format text: MERGER AND CHANGE OF NAME;ASSIGNORS:WATCHGUARD, INC.;WATCHGUARD VIDEO, INC.;REEL/FRAME:052536/0535 Effective date: 20191031 |