WO2014159868A1 - System and method for adjusting an image for a vehicle mounted camera - Google Patents
- Publication number
- WO2014159868A1 (PCT/US2014/025362)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- accordance
- video
- horizon
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/28—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with an adjustable field of view
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/243—Aligning, centring, orientation detection or correction of the image by compensating for image skew or non-uniform image deformations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/602—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
- B60R2300/605—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint the adjustment being automatic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
- B60W2050/0083—Setting, resetting, calibration
- B60W2050/0085—Setting or resetting initial positions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/02—Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
- B60W50/0205—Diagnosing or detecting failures; Failure detection models
- B60W2050/0215—Sensor drifts or sensor failures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo or light sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/16—Pitch
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
Definitions
- The present invention is generally related to vehicle mounted cameras. More particularly, example embodiments of the present invention are related to systems and methods for adjusting an image, e.g., an image horizon, for a vehicle mounted camera.
- Vehicle mounted cameras are utilized in a variety of applications, from personal use to record street, track or flight performance, to professional use in racecars.
- In FIG. 1, a traditional camera image in NASCAR is illustrated generally at 10, with FIGURES 1 and 2 illustrating a fixed image horizon (note virtual image horizon line 12 provided across the image to show the fixed perspective of the image) relative to the hood 14 of the racecar between a straightaway and a turn.
- This virtual line 12 shows a change in horizon relative to the sky 16 due to a change in angle of the track.
- the present system and method for adjusting an image for a vehicle mounted camera overcomes and alleviates the problems and disadvantages in the prior art by providing an adjustable image that adjusts in response to at least one vehicle mounted sensor.
- telemetry of a vehicle from a plurality of sensors may be used to automatically adjust an image, e.g. an image horizon, in a desired way.
- both image horizon and zoom are automatically adjusted during tilting of a vehicle.
- such image horizon adjustment may be provided as a digital video effect, alleviating the need to actually adjust the angle of a camera during vehicle tilt.
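The digital video effect described above can be sketched as a pixel remapping driven by a sensed roll angle. The following is an illustrative sketch only; the function name, parameters, and the use of a simple rotation about the image center are assumptions for the example, not the patented implementation:

```python
import math

def horizon_correction(px, py, roll_deg, width, height):
    """Map an output pixel back to the source pixel that a digital
    horizon-leveling effect would sample, by rotating about the image
    center through the sensed roll angle (illustrative only)."""
    cx, cy = width / 2.0, height / 2.0
    theta = math.radians(roll_deg)
    # Rotate the output coordinate by the vehicle's roll so the
    # rendered horizon stays level while the camera itself tilts.
    dx, dy = px - cx, py - cy
    sx = cx + dx * math.cos(theta) - dy * math.sin(theta)
    sy = cy + dx * math.sin(theta) + dy * math.cos(theta)
    return sx, sy
```

In a real pipeline this mapping would be applied per pixel, with interpolation, by a GPU or a dedicated video effects processor.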
- FIGURE 1 is a view of a racecar camera image with a fixed image horizon on a racetrack straightaway;
- FIGURE 2 is a view of the racecar camera image with a fixed image horizon on a banked turn of a racetrack;
- FIGURE 3 is a flowchart of an exemplary method for adjusting the image horizon of a vehicle mounted camera;
- FIGURE 4 is a view of a racecar camera image with an adjustable image horizon on a racetrack straightaway;
- FIGURE 5 is a view of a racecar camera image with an adjusted image horizon on a racetrack turn with zoom remaining constant;
- FIGURE 6 is a view of a racecar camera image with an adjustable image horizon on a racetrack straightaway;
- FIGURE 7 is a view of a racecar camera image with an adjusted image horizon on a racetrack turn with adjusted zoom;
- FIGURE 8 is an exemplary system for adjusting an image for a vehicle mounted camera;
- FIGURE 9 is a diagram comparing relative pixel dimensions of high definition and greater than high definition images;
- FIGURE 10 is an exemplary graphical user interface of a 4K captured image with a 720p selectable extraction window;
- FIGURE 11 is an exemplary first system for capturing and transporting a 4K image to an offsite processor and graphical user interface; and
- FIGURE 12 is an exemplary second system for capturing and processing a 4K image onsite, followed by transport of a high definition image offsite.
- the present invention relates to adjusting an image, e.g., an image horizon, for a vehicle mounted camera by providing an image that adjusts in response to at least one vehicle mounted sensor.
- telemetry of a vehicle from a plurality of sensors may be used to automatically adjust an image horizon in a desired way.
- Sensor data may include any convenient type of data, including gyro data; vehicle angle, attitude, altitude, speed, acceleration or traction data; navigational data; or the like.
- Sensor data may also comprise data that describes environmental conditions for the vehicle, such as weather, sensed track conditions, wind, including turbulence, shear, etc., temperature, and others, including any sensed data that may be useful in adjusting an image.
- Such adjusting of an image may include, as in specific examples described below, adjustment of an image horizon, or another type of image adjustment, such as crop, selection of image portions, tracking of objects of interest in images, rendering selective high definition images from greater than high definition cameras, selective capture of image points of interest, adjustment of the image responsive to environmental conditions, etc. Examples are described in co-pending U.S. Patent Application Serial No. 13/567,323 to the present inventor, filed August 6, 2012 and claiming priority to U.S. Patent Application Serial Nos. 61/515,549, filed August 5, 2011 and 61/563,126, filed November 23, 2011, the entire contents of which are incorporated herein by reference. A selection from 13/567,323 relating to selective capture and presentation of native image portions follows:
- Prior Art FIGURE 9 shows an example of relative pixel dimensions at a 2.39:1 aspect ratio, with 720p and 1080p formats being letterboxed.
- Examples of vertical high resolution designators are 720p (1280 x 720 pixels), 1080i (utilizing an interlace of two fields of 1920 x 540 pixels for a total resolution of 1920 x 1080 pixels) or 1080p (representing a progressive scan of 1920 x 1080 pixels).
- Examples of horizontal high resolution designators which are more common to digital cinema terminology, include 2K (2048 pixels wide) and 4K (4096 pixels wide).
- FIGURE 9 illustrates a comparison of relative pixel dimensions for 720p, 1080p, 2K and 4K captured images.
- a first image or video is captured at a first resolution, which resolution is greater than high definition and higher than a predetermined broadcast display resolution.
- a desired portion of the first image or video is then displayed at a second, lower resolution, which resolution is less than and closer to the predetermined broadcast display resolution. Accordingly, a selected portion of the captured image may be displayed at or near the predetermined broadcast display resolution (i.e., minimizing or eliminating loss of image detail relative to the predetermined broadcast display resolution).
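The relationship between the greater-than-HD capture and the lower-resolution display portion can be illustrated with simple window arithmetic. A minimal sketch; the function name and the 4096 x 2160 raster dimensions are assumptions for the example:

```python
def clamp_window(x, y, win_w=1280, win_h=720, raster_w=4096, raster_h=2160):
    """Keep a 720p extraction window fully inside a 4K raster, so the
    selected portion can be cut out at native resolution with no scaling
    (and thus no loss of detail relative to the broadcast resolution)."""
    x = max(0, min(x, raster_w - win_w))
    y = max(0, min(y, raster_h - win_h))
    return x, y
```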
- FIGURE 10 shows a screenshot of a full- raster 4K moving video image 1110.
- A portion of the 4K image, illustrated as a 720p moving video selectable extraction window 1112, is then selected for presentation.
- native image capture occurs at a greater than high definition resolution, and portions of that greater than high definition image are selected for presentation via the 720p extraction window.
- While FIGURE 10 specifically illustrates 4K capture and a 720p extraction window, it should be recognized that both or either of the captured image and extraction window may be provided at or sized to other resolutions.
- the selectable extraction window (1112 in FIGURE 10) is provided at a graphical user interface ("GUI") (1114 in FIGURES 11 and 12) that is configured to allow an operator to navigate within a captured image and select portions of the captured image for presentation.
- the extraction window is configured to allow the operator to adjust the size and position of the extraction window.
- the extraction window is configured to track or scan across moving images, e.g., to follow a play or subject of interest during a sporting event.
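Tracking a subject of interest with the extraction window reduces, at its simplest, to re-centering the window on the subject's coordinates each frame while keeping it inside the raster. A sketch with invented names (real tracking would also smooth the motion):

```python
def track_subject(sub_x, sub_y, win_w=1280, win_h=720,
                  raster_w=4096, raster_h=2160):
    """Center the extraction window on a tracked subject (e.g., a ball
    or a tagged player), clamping so the window never leaves the raster."""
    x = min(max(sub_x - win_w // 2, 0), raster_w - win_w)
    y = min(max(sub_y - win_h // 2, 0), raster_h - win_h)
    return x, y
```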
- plural operators may extract from the same images via the same or via plural GUIs.
- As shown in FIGURES 11 and 12, processing of the captured images may occur either offsite (FIGURE 11) or onsite (FIGURE 12).
- a camera 1116 captures 4K images onsite, e.g., at a field (shown generally at 1118) for a sporting event.
- A transport mechanism 1120, e.g., a fiber capable of transporting a full bandwidth 4K video, transports the captured images to an operations base ("OB") (shown generally at 1122), e.g., a production truck away from the field 1118.
- An image recorder 1124 records the captured images, e.g., as a data stream on a server, and is configured to allow an operator to go back in time relative to the recording and examine selected portions of the captured image as described above. Such control is provided to an operator via the GUI 1114 through a processor 1126 interfacing with the GUI 1114 and recorder 1124.
- the recorder, processor and GUI are configured to allow the operator to go back instantaneously or near-instantaneously to select portions of the recorded image for presentation.
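The go-back capability can be modeled as a rolling record of recent frames from which any retained frame is retrievable near-instantaneously. A toy sketch (the class and method names are invented; real systems record to a server as a data stream, not to memory):

```python
from collections import deque

class FrameRecorder:
    """Fixed-capacity rolling record of captured frames; an operator
    can step back near-instantaneously to any retained frame and
    extract a portion of it for presentation."""

    def __init__(self, capacity):
        # Oldest frames are discarded automatically once full.
        self.buffer = deque(maxlen=capacity)

    def record(self, frame):
        self.buffer.append(frame)

    def go_back(self, n_frames):
        # 0 = live frame, 1 = one frame earlier, etc.
        return self.buffer[-1 - n_frames]
```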
- an operator in a truck would use a GUI to navigate the full raster 4K image and maneuver the selective 16:9 extraction window, in a manner similar to a cursor, to select an area of interest.
- the GUI is configured such that the extraction window may select an area of interest in one or both of live and recorded video.
- the present disclosure contemplates sizing and zooming capabilities for the extraction window.
- the system is configured to mark keyframes and establish mapping for desired moves, e.g., pans and zooms, among others, around the image.
- the output 1128 of the system (e.g., a 720p/59.94 output relative to a 4K capture) is provided to a router 1130 that allows the output to be taken live to a switcher 1132 or to be ingested at a server 1134 ("EVS") for later playout.
- a resulting image can be slowed down for replay or rendered as a still image, if desired, either at the server 1134 or at the operator's position (via processor 1126).
- FIGURE 12 provides an alternate exemplary embodiment, wherein capture, transport and recording of the native image (in this example 4K images) occurs onsite, e.g., at the field 1118 of a sporting event.
- An onsite processor 1126 provides or interfaces with an operator GUI 1114 in an operations base 1122 (e.g., a truck, though the GUI could be accessed from any convenient location) and provides a reference video 1138 of the image to allow the operator to navigate the image via the extraction window.
- the output 1128 is then transported from the field to an offsite router 1130.
- At least one GUI is accessed by a tablet controller as a navigation tool for the system.
- A tablet controller may be wireless and portable, allowing flexible use as a primary or supplemental navigation tool.
- multiple cameras may be positioned to capture images from different points of view, and extraction windows may be provided relative to the multiple image captures in a system for selectively displaying portions of native images from different points of view.
- Further exemplary embodiments provide real time or near real time tracking of subjects of interest (e.g., identified, selected or pre-tagged players of interest or automatic tracking of a ball in a game). Additional exemplary embodiments also provide virtual directing of operated and automatically tracked subjects of interest for cutting into a full live broadcast, utilizing backend software and tracking technology to provide a virtual viewfinder that operates in manners similar to otherwise human camera operators.
- Such processes may also use artificial intelligence technology for simple tracking, e.g., of a single identified object, or for more complex operations approximating motions utilized by human camera operators, e.g., pan, tilt and zoom of the extraction window in a manner similar to human operators.
- camera capture could utilize a specifically designed 4K camera.
- a camera may also use wider lensing to capture more of the subject, with possible reconstituting or flattening in post production. Also, different lensing can be used specific to different applications.
- Such processes may use the above-described multiple cameras and/or multiple extraction windows, or may run with specific regard to one camera and/or one extraction window.
- an artificial intelligence can automatically capture, extract and display material for broadcast, utilizing the extraction window(s) as virtual viewfinders.
- Additional exemplary embodiments also provide for virtual 3D extraction, e.g., via a single camera at 4K or 8K with a two window output.
- Other exemplary embodiments utilize an increased image capture frame rate relative to a broadcast frame rate, along with or in lieu of an increased image capture resolution, as has been discussed above.
- a first video is captured at a first frame rate, which frame rate is higher than a predetermined broadcast frame rate.
- a desired portion of the first video is then displayed at a second, lower frame rate, which frame rate is less than and closer to the predetermined broadcast frame rate.
- the desired portion of the first video is captured by an extraction window that extracts frames across the native captured video. In such a way, the extracted video provides smooth and clear video, without edgy or blurred frames.
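Extracting a broadcast-rate video from a higher-rate capture amounts to selecting every Nth captured frame, where N is the ratio of capture rate to broadcast rate. A sketch, assuming integer-friendly rates (the function name is invented):

```python
def extract_frames(capture_fps, broadcast_fps, n_captured):
    """Indices of captured frames to emit so that high-frame-rate
    capture plays back at the (lower) broadcast frame rate in real
    time; the skipped frames remain available for slow-motion replay."""
    step = capture_fps / broadcast_fps
    indices = []
    i = 0.0
    while round(i) < n_captured:
        indices.append(round(i))
        i += step
    return indices
```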
- Such captured first video may be at any frame rate that is above the predetermined broadcast frame rate.
- The first video is captured at a first frame rate that is in super motion or hyper motion. In traditional video, this equates to approximately 180 ("supermotion") frames per second or above ("hypermotion" or "ultramotion") in a progressive frame rate.
- Hypermotion is recorded in discrete times sufficient to capture a triggered instance of an action of a camera subject for playback.
- the present system performs a full time record of a camera in hypermotion, e.g., of sufficient length for replay playback archiving, such as more than fifteen minutes, more than thirty minutes, more than an hour, more than an hour and a half, or more than two hours, among others.
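The demands of such a full-time hypermotion record can be estimated with back-of-the-envelope arithmetic; the figures below (uncompressed frames, 3 bytes per pixel) are illustrative assumptions only:

```python
def record_size_gb(fps, width, height, bytes_per_pixel, minutes):
    """Approximate uncompressed storage for a continuous
    high-frame-rate record (illustrative; real systems compress
    heavily and record to dedicated servers)."""
    frames = fps * 60 * minutes
    return frames * width * height * bytes_per_pixel / 1e9
```

At 180 fps and 1080p this works out to roughly 67 GB per minute uncompressed, which illustrates why multi-hour records of this kind imply substantial server storage.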
- raw data from at least one camera is manipulated to adjust the image quality (make it "paintable") to broadcast specifications.
- broadcast "handles" may be integrated into the system to affect the raw data in a manner that is more germane to broadcast color temperatures, hues and gamma variables.
- the present disclosure thus advantageously provides systems and methods for selective capture of and presentation of native image portions, for broadcast production or other applications.
- By providing a selectable extraction window through a GUI, an operator has complete control over portions within the native images that the operator desires for presentation.
- With image capture greater than high definition, e.g., 4K, desired portions of the image selected by an operator may be presented at or relatively near high definition quality (i.e., without relative degradation of image quality).
- In FIGURE 3, an exemplary method for adjusting an image for a vehicle mounted camera is illustrated generally at 20, including receiving image data from a vehicle mounted camera (described at box 22), receiving data from at least one vehicle mounted sensor (described at box 24), and adjusting the image horizon utilizing the data received from the at least one vehicle mounted sensor (described at box 26).
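The three boxes of FIGURE 3 can be sketched as one pass of a processing loop; the telemetry keys and return format here are invented for illustration:

```python
def adjust_frame(frame, telemetry):
    """One pass of the FIGURE 3 flow: image data is received (box 22),
    sensor data is received (box 24), and a horizon correction is
    derived from the sensed telemetry (box 26)."""
    roll = telemetry.get("roll_deg", 0.0)  # box 24: sensor data
    correction = -roll                     # box 26: level the horizon
    return {"frame": frame, "rotate_deg": correction}
```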
- such adjusting of the image horizon may be applied as a digital video effect, such that actual manipulation of a vehicle mounted camera is unnecessary. Further, any type of image horizon adjustment is contemplated, whether or not such adjustment results in matching image horizon with a skyline horizon.
- image adjustment may be performed on the vehicle.
- An on-board (on the vehicle) processor may perform some or all of the image adjustment based upon data from the at least one sensor. Allocating processing power to the vehicle may be particularly useful, e.g., in wireless transmission applications where a reduced data package can accommodate bandwidth limitations.
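The bandwidth motivation for on-board processing is straightforward arithmetic: extracting or adjusting on the vehicle means transmitting only the output window rather than the full raster. A sketch, assuming a 4K raster and 720p output at equal frame rate and bit depth:

```python
def bandwidth_ratio(raster_w=4096, raster_h=2160, out_w=1280, out_h=720):
    """Rough pixel-rate ratio between sending the full raster and
    sending only the on-board-extracted window."""
    return (raster_w * raster_h) / (out_w * out_h)
```

Under these assumptions the reduction is roughly 9.6x in pixel rate, before any compression.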
- an operator can communicate with an on-board processor over a separate channel, leaving one or more wireless transmission channels from the vehicle substantially dedicated to video output.
- exemplary embodiments contemplate automatic adjustment of image horizon based upon received vehicle telemetry data.
- data from at least one sensor is used to calculate the distance between the sensors.
- FIGURES 4 and 5 show exemplary adjustment of an image horizon 12 such that it matches a skyline horizon (shown as line 28) during tilting of a racecar as it transitions from a straightaway to a banked turn.
- both image horizon and zoom are automatically adjusted during tilting of a vehicle.
- FIGURES 6 and 7 show exemplary adjustment of an image horizon 12 such that it matches a skyline horizon (shown as line 28) during tilting of a racecar as it transitions from a straightaway to a banked turn, with an increase in zoom during the turn.
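Why zoom accompanies the horizon rotation in FIGURES 6 and 7 can be seen geometrically: a rotated frame no longer covers its own corners, so a minimum digital zoom is needed to avoid blank regions. A sketch of that minimum scale, assuming modest bank angles (function name invented):

```python
import math

def fill_zoom(roll_deg, width=1920, height=1080):
    """Minimum digital zoom so a horizon-leveling rotation leaves no
    blank corners: the rotated output frame's bounding box must stay
    covered by the (scaled) source frame."""
    t = math.radians(abs(roll_deg))  # small bank angles assumed
    w, h = float(width), float(height)
    zx = (w * math.cos(t) + h * math.sin(t)) / w
    zy = (w * math.sin(t) + h * math.cos(t)) / h
    return max(zx, zy)
```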
- FIGURE 8 illustrates an exemplary system for adjusting an image horizon from a vehicle mounted camera.
- the system 100 may include a server 101.
- the server 101 may include a plurality of information, including but not limited to, vehicle telemetry information, static and continuous video images from a vehicle mounted camera, algorithms and processing modules and other data storage.
- the server 101 may be in communication with a network 106 via a communication channel 110.
- the system 100 may access or interface with additional, third party data sources or servers 103.
- Third party sources of data 103 may be in communication with the network 106 via a communication channel 111.
- the source 103 may include a server substantially similar to server 101.
- the server 101 or source 103 may include a data service provider, for example, a cellular service provider, a business information provider, or any other suitable provider or repository.
- the server 101 or source 103 may also include an application server providing applications and/or computer executable code implementing any of the interfaces/methodologies described herein.
- the server 101 or source 103 may present a plurality of application defaults, choices, set-ups, and/or configurations such that a device may receive and process the application accordingly.
- the server 101 or source 103 may present any application on a viewer interface or web-browser of a device for relatively easy selection by a viewer of the device.
- another server component or local computer apparatus may produce the viewer interface and control connectivity to the server 101 or source 103.
- the server 101 or one or more of the local computer apparatus 104, 105 and 106 may be configured to periodically access the source 103 and cache data relevant to data used in embodiments of the present invention.
- the network 106 may be any suitable network, including the Internet, wide area network, and/or a local network.
- The server 101 and the source 103 may be in communication with each other via the network 106.
- the communication channels 110, 111 may be any suitable communication channels including wireless, satellite, wired, or otherwise.
- An exemplary system 100 further includes computer apparatus 105 in communication with the network 106, over communication channel 112.
- the computer apparatus 105 may be any suitable computer apparatus including a personal computer (fixed location), a laptop or portable computer, a personal digital assistant, a cellular telephone, a portable tablet computer, a portable audio player, or otherwise.
- the system 100 may include computer apparatuses 104 and 106, which are embodied as a portable cellular telephone and a tablet, respectively.
- the apparatuses 104 and 106 may include display means 141, 161, and/or buttons/controls 142.
- the controls 142 may operate independently or in combination with any of the controls noted above.
- the apparatuses 104, 105, and 106 may be in communication with each other over communication channels 115, 116 (for example, wired, wireless, Bluetooth channels, etc); and may further be in communication with the network 106 over communication channels 112, 113, and 114.
- the apparatuses 104, 105, and 106 may all be in communication with one or both of the server 101 and the source 103, as well as each other.
- Each of the apparatuses may be in severable communication with the network 106 and each other, such that the apparatuses 104, 105, and 106 may be operated without constant communication with the network 106 (e.g., using data connection controls of an interface). For example, if there is no data availability or if a viewer directs an apparatus to work offline, the data used by any of the apparatuses 104, 105, and 106 may be based on stored or cached information/parameters. It follows that each of the apparatuses 104, 105, and 106 may be configured to perform the methodologies described in the various exemplary embodiments.
- the apparatuses 104, 105, and 106 may manipulate, share, transmit, and/or receive different data previously or currently produced at any one of the illustrated elements of the system 100.
- data may be available on the server 101 and/or the source 103.
- viewers of any of the devices 104, 105, and 106 may independently manipulate, transmit, etc., data, e.g., to separately determine a current value of the index at a given time.
- any suitable device may be utilized to use vehicle telemetry data from at least one vehicle sensor to adjust image horizon from a vehicle mounted camera.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2014244374A AU2014244374A1 (en) | 2013-03-13 | 2014-03-13 | System and method for adjusting an image for a vehicle mounted camera |
EP14776040.9A EP2969652A4 (en) | 2013-03-13 | 2014-03-13 | System and method for adjusting an image for a vehicle mounted camera |
BR112015022394A BR112015022394A2 (en) | 2013-03-13 | 2014-03-13 | method and system for adjusting image horizon for a vehicle mounted camera |
MX2015011989A MX369718B (en) | 2013-03-13 | 2014-03-13 | System and method for adjusting an image for a vehicle mounted camera. |
JP2016501836A JP6593929B2 (en) | 2013-03-13 | 2014-03-13 | System and method for adjusting images for in-vehicle cameras |
HK16102841.9A HK1214796A1 (en) | 2013-03-13 | 2016-03-11 | System and method for adjusting an image for a vehicle mounted camera |
AU2018201913A AU2018201913A1 (en) | 2013-03-13 | 2018-03-16 | System and method for adjusting an image for a vehicle mounted camera |
AU2019271924A AU2019271924B2 (en) | 2013-03-13 | 2019-11-26 | System and method for adjusting an image for a vehicle mounted camera |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361778641P | 2013-03-13 | 2013-03-13 | |
US61/778,641 | 2013-03-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014159868A1 (en) | 2014-10-02 |
Family
ID=51625277
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2014/025362 WO2014159868A1 (en) | 2013-03-13 | 2014-03-13 | System and method for adjusting an image for a vehicle mounted camera |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP2969652A4 (en) |
JP (1) | JP6593929B2 (en) |
AU (3) | AU2014244374A1 (en) |
BR (1) | BR112015022394A2 (en) |
HK (1) | HK1214796A1 (en) |
MX (1) | MX369718B (en) |
WO (1) | WO2014159868A1 (en) |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6738697B2 (en) * | 1995-06-07 | 2004-05-18 | Automotive Technologies International Inc. | Telematics system for vehicle diagnostics |
JPH08164896A (en) * | 1994-12-15 | 1996-06-25 | Mitsubishi Heavy Ind Ltd | Visibility display in operating unmanned aircraft |
JPH0952555A (en) * | 1995-08-11 | 1997-02-25 | Mitsubishi Electric Corp | Periphery monitoring device |
US5865624A (en) * | 1995-11-09 | 1999-02-02 | Hayashigawa; Larry | Reactive ride simulator apparatus and method |
US6201554B1 (en) * | 1999-01-12 | 2001-03-13 | Ericsson Inc. | Device control apparatus for hand-held data processing device |
AU2001277259A1 (en) * | 2000-06-30 | 2002-01-14 | The Muller Sports Group, Inc. | Sporting events broadcasting system |
JP2003162213A (en) * | 2001-11-27 | 2003-06-06 | Mitsubishi Heavy Ind Ltd | Simulated environment creating device and simulated environment creating method |
JP2004354256A (en) * | 2003-05-29 | 2004-12-16 | Olympus Corp | Calibration slippage detector, and stereo camera and stereo camera system equipped with the detector |
JP2004354236A (en) * | 2003-05-29 | 2004-12-16 | Olympus Corp | Device and method for stereoscopic camera supporting and stereoscopic camera system |
JP2006245726A (en) * | 2005-03-01 | 2006-09-14 | Fuji Photo Film Co Ltd | Digital camera |
JP2006340108A (en) * | 2005-06-02 | 2006-12-14 | Canon Inc | Image processing unit, image processing method, program, and storage medium |
US20090040308A1 (en) * | 2007-01-15 | 2009-02-12 | Igor Temovskiy | Image orientation correction method and system |
JP2010245821A (en) * | 2009-04-06 | 2010-10-28 | Denso Corp | Image display for vehicle |
2014
- 2014-03-13 WO PCT/US2014/025362 patent/WO2014159868A1/en active Application Filing
- 2014-03-13 AU AU2014244374A patent/AU2014244374A1/en not_active Abandoned
- 2014-03-13 EP EP14776040.9A patent/EP2969652A4/en not_active Ceased
- 2014-03-13 BR BR112015022394A patent/BR112015022394A2/en not_active Application Discontinuation
- 2014-03-13 JP JP2016501836A patent/JP6593929B2/en active Active
- 2014-03-13 MX MX2015011989A patent/MX369718B/en active IP Right Grant

2016
- 2016-03-11 HK HK16102841.9A patent/HK1214796A1/en unknown

2018
- 2018-03-16 AU AU2018201913A patent/AU2018201913A1/en not_active Abandoned

2019
- 2019-11-26 AU AU2019271924A patent/AU2019271924B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030021445A1 (en) * | 1999-12-23 | 2003-01-30 | Markus Larice | Method for optically monitoring the environment of a moving vehicle to determine an inclination angle |
US20050237385A1 (en) * | 2003-05-29 | 2005-10-27 | Olympus Corporation | Stereo camera supporting apparatus, stereo camera supporting method, calibration detection apparatus, calibration correction apparatus, and stereo camera system |
US20110181728A1 (en) * | 2008-12-19 | 2011-07-28 | Delphi Technologies, Inc. | Electronic side view display system |
JP2013020308A (en) * | 2011-07-07 | 2013-01-31 | Clarion Co Ltd | Vehicle surrounding image pickup system |
US20130033605A1 (en) * | 2011-08-05 | 2013-02-07 | Fox Sports Productions, Inc. | Selective capture and presentation of native image portions |
Non-Patent Citations (1)
Title |
---|
See also references of EP2969652A4 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019523622A (en) * | 2016-07-13 | 2019-08-22 | Mobile Appliance, Inc. | Vehicle driving assistance device |
EP3486131A4 (en) * | 2016-07-13 | 2020-01-08 | Mobile Appliance, Inc. | Vehicle driving assistance device |
CN108491840A (en) * | 2018-03-18 | 2018-09-04 | 刘兴丹 | Method and apparatus for automatically adjusting a picture display angle |
US11405559B1 (en) | 2021-02-19 | 2022-08-02 | Honda Motor Co., Ltd. | Systems and methods for live signal adjustment of a movable camera |
Also Published As
Publication number | Publication date |
---|---|
AU2018201913A1 (en) | 2018-04-26 |
AU2019271924A1 (en) | 2019-12-19 |
BR112015022394A2 (en) | 2017-07-18 |
JP6593929B2 (en) | 2019-10-23 |
AU2014244374A1 (en) | 2015-08-13 |
MX369718B (en) | 2019-11-20 |
EP2969652A4 (en) | 2016-11-09 |
MX2015011989A (en) | 2015-12-01 |
HK1214796A1 (en) | 2016-08-05 |
AU2019271924B2 (en) | 2021-12-02 |
EP2969652A1 (en) | 2016-01-20 |
JP2016521473A (en) | 2016-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11490054B2 (en) | System and method for adjusting an image for a vehicle mounted camera | |
US10573351B2 (en) | Automatic generation of video and directional audio from spherical content | |
US10939140B2 (en) | Selective capture and presentation of native image portions | |
US10084961B2 (en) | Automatic generation of video from spherical content using audio/visual analysis | |
US10313665B2 (en) | Behavioral directional encoding of three-dimensional video | |
US10110814B1 (en) | Reducing bandwidth for video streaming using de-warping and video analytics | |
US10402445B2 (en) | Apparatus and methods for manipulating multicamera content using content proxy | |
AU2019271924B2 (en) | System and method for adjusting an image for a vehicle mounted camera | |
US20180103197A1 (en) | Automatic Generation of Video Using Location-Based Metadata Generated from Wireless Beacons | |
US10839601B2 (en) | Information processing device, information processing method, and program | |
US9871994B1 (en) | Apparatus and methods for providing content context using session metadata | |
CN105939497B (en) | Media streaming system and media streaming method | |
US9787862B1 (en) | Apparatus and methods for generating content proxy | |
WO2017112800A1 (en) | Macro image stabilization method, system and devices | |
US10192362B2 (en) | Generating virtual reality and augmented reality content for a live event | |
JP2014082764A (en) | Image display device, image display method, server apparatus and image data structure | |
NZ719619A (en) | Selective capture and presentation of native image portions | |
NZ719619B2 (en) | Selective capture and presentation of native image portions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14776040; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2014244374; Country of ref document: AU; Date of ref document: 20140313; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 2016501836; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: MX/A/2015/011989; Country of ref document: MX |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2014776040; Country of ref document: EP |
| REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112015022394; Country of ref document: BR |
| ENP | Entry into the national phase | Ref document number: 112015022394; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20150910 |