WO2010144635A1 - Cameras, camera systems, and methods of use thereof - Google Patents

Cameras, camera systems, and methods of use thereof

Info

Publication number
WO2010144635A1
WO2010144635A1 PCT/US2010/038055 US2010038055W WO2010144635A1 WO 2010144635 A1 WO2010144635 A1 WO 2010144635A1 US 2010038055 W US2010038055 W US 2010038055W WO 2010144635 A1 WO2010144635 A1 WO 2010144635A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
video
cameras
image
markers
Prior art date
Application number
PCT/US2010/038055
Other languages
English (en)
Inventor
Gregory David Gallinat
Linda Rheinstein
Original Assignee
Gregory David Gallinat
Linda Rheinstein
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gregory David Gallinat, Linda Rheinstein filed Critical Gregory David Gallinat
Priority to US13/377,531 priority Critical patent/US20120140085A1/en
Publication of WO2010144635A1 publication Critical patent/WO2010144635A1/fr

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/173Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42203Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223Cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/61Network physical structure; Signal processing
    • H04N21/6156Network physical structure; Signal processing specially adapted to the upstream path of the transmission network
    • H04N21/6181Network physical structure; Signal processing specially adapted to the upstream path of the transmission network involving transmission via a mobile phone network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/65Control of camera operation in relation to power supply
    • H04N23/651Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet

Definitions

  • This disclosure relates to devices and methods for generating, processing, transmitting, and displaying images, either locally or remotely.
  • This disclosure relates to devices and methods for monitoring a specific location, function, or event, such as a sporting event.
  • the devices of the present disclosure may be concealed, portable, or comprise plural cameras.
  • Action sports spectators are particularly drawn to images from the player's point of view — seeing through their eyes.
  • an intimate viewpoint might be compelling in numerous situations to many types of viewers, parents, video artists, behavioral scientists, advertisers, etc.
  • U.S. Patent No. 6,819,354 provides a helmet-mounted camera.
  • U.S. Patent No. 6,704,044 provides a camera mounted to a baseball-style cap.
  • Although the helmet and cap mounted cameras were of great interest to the spectators (including the professional announcers), those cameras suffered from several insurmountable problems.
  • the battery packs were relatively large and mounted inside the helmet or cap. The mounting location, coupled with the weight of the battery pack, was uncomfortable and dangerous for the players.
  • the picture quality was nominal because the lighting inside the stadium was constantly changing and the image would rapidly lighten or darken as the angle of the helmet changed with the viewpoint of the player. In addition, the nature of the player movement caused jumpiness in the image.
  • the wireless transmission and NTSC signal encroached on the frequencies of the other wireless systems already in place.
  • the present specification provides hands-free, mobile, real-time video cameras that overcome the shortcomings of previous designs.
  • Cameras described in the present specification may be light-weight and small enough to be mounted anywhere, especially on a user's body. Cameras described in the present specification may also be cost-effective and rugged enough for use during very strenuous and/or high contact, semi- or full collision activities. Strenuous activities can be defined by perceived exertion, for example, according to the Borg RPE Scale. High contact, semi- or full collision activities can be defined by the American Academy of Pediatrics.
  • cameras described in the present specification offer full-motion, enhanced, and/or high-definition video capture over an extended period of time.
  • the combination of diminutive size, low-power consumption, and high resolution has been heretofore unavailable in the art.
  • cameras described in the present specification may be seamlessly compatible with various software applications and platform independent.
  • FIG. 1 is a diagram of the modules of a camera according to the present specification.
  • any reference to light or optical devices may contemplate any type of electromagnetic radiation of any frequency and wavelength, including and not limited to visible, infrared, and ultraviolet light.
  • the term “sensor” may include any device that converts at least one type of electromagnetic radiation to an electric signal. Nonetheless, the term “sensor” may preferably be limited to devices that convert visible light to an electrical signal.
  • Real-time means without intentional delay, given the features of the camera and camera apparatuses described herein, including the time required to accurately receive, process, and transmit image data.
  • the present specification describes cameras, external and/or remote interfaces for cameras, and camera apparatuses.
  • Cameras according to the present specification may include a sensor module, a processing module, a communication module, a power supply, and a mount.
  • the modules of the cameras according to the present specification may also be themselves modular or customizable.
  • the modules of the cameras according to the present specification may be integrated, separate, or separable.
  • the sensor module is adapted to receive at least one type of electromagnetic radiation and produce an output signal related to the received electromagnetic radiation.
  • the sensor module comprises a sensor and, optionally, other optical devices including and not limited to at least one lens, a waveguide (e.g., optical fiber), an optical and/or mechanical image stabilizer, and/or a protective cover (e.g. a pull-tab lens cover).
  • the sensor may be, for example, a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) active-pixel sensor.
  • the sensor module may be automatically or user-selectably controlled for different focal lengths, lighting conditions, or other camera and video performance features.
  • different lens configurations may be employed, such as wide angle, fish eye, miniature, and/or zoom.
  • the sensor module may comprise a solid state auto-focus mechanism.
  • the sensor module may comprise an optical, and/or electrical, and/or mechanical image stabilizer.
  • An optical image stabilizer as part of the sensor module could be implemented in front of the sensor, e.g., by a floating lens element that may be moved relative to the optical axis of the lens using at least one actuator, such as an electromagnet. Vibration could be detected using piezoelectric angular velocity sensors (often called gyroscopic sensors).
  • an electrical image stabilizer could be incorporated into the software processing portion of the image sensor and/or the processor module itself.
  • a mechanical image stabilizer as part of the image sensor module could be implemented by moving the sensor itself.
  • a mechanical image stabilizer may employ gyroscopic sensors to encode information to at least one actuator, such as an electromagnet, that moves the sensor. It could also employ dedicated gyroscopic sensors which provide acceleration and/or movement data to aid in calculations to stabilize the detected image.
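The gyroscope-driven compensation described above can be sketched in a few lines. This is an illustrative model only, not the patent's implementation: it assumes the angular rate is sampled once per frame interval and that the lens focal length is known in pixel units.

```python
import math

def stabilization_shift(angular_rate_dps, dt_s, focal_length_px):
    """Convert a gyroscope angular-rate reading into the pixel shift
    needed to counteract camera rotation during one frame interval.

    angular_rate_dps : angular velocity in degrees per second
    dt_s             : frame interval in seconds
    focal_length_px  : lens focal length expressed in pixels
    """
    # Angle swept during the frame, in radians.
    angle_rad = math.radians(angular_rate_dps * dt_s)
    # Image-plane shift for a rotation of theta: shift = f * tan(theta).
    return focal_length_px * math.tan(angle_rad)

def compensate(frame_offset_px, angular_rate_dps, dt_s, focal_length_px):
    # Move the floating lens element (or crop window) opposite to the shake.
    return frame_offset_px - stabilization_shift(angular_rate_dps, dt_s,
                                                 focal_length_px)
```

A 1-degree-per-frame wobble at a 1000-pixel focal length, for instance, maps to roughly a 17-pixel correction.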
  • Resolutions that may be output from the sensor include and are not limited to NTSC, 480p (i.e., VGA 640x480), PAL, 525p, HDTV, 720p (i.e., 1280x720 pixels), 1080p, and 1080i.
  • the sensor may be capable of variable output, i.e., automatically or user selectively sending more or less data to the processor.
  • a variable output sensor is described in U.S. Patent No. 5,262,871 , which is incorporated by reference herein in its entirety.
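A variable-output sensor of this kind can be approximated as a resolution ladder keyed to what the link or processor can absorb. The ladder and bit-rate thresholds below are hypothetical, chosen only to illustrate the automatic "more or less data" behavior; they are not taken from the referenced patent.

```python
# Hypothetical resolution ladder; thresholds are illustrative only.
RESOLUTION_LADDER = [
    (12_000_000, (1920, 1080)),  # ~12 Mbit/s or more -> 1080p
    (6_000_000,  (1280, 720)),   # ~6 Mbit/s          -> 720p
    (2_000_000,  (640, 480)),    # ~2 Mbit/s          -> 480p (VGA)
    (0,          (320, 240)),    # fallback for very constrained links
]

def select_output(available_bps):
    """Variable-output behaviour: send more or less data downstream
    depending on the bandwidth currently available."""
    for threshold, resolution in RESOLUTION_LADDER:
        if available_bps >= threshold:
            return resolution
    return RESOLUTION_LADDER[-1][1]
```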
  • the image sensor module may be, for example, a High Definition 720p or 1080p Camera Module that may be about 7 mm by 7 mm by 6 mm (x by y by z) in size including the lens.
  • the image sensor may also be an Enhanced Definition 480p Camera Module (VGA, i.e., 640 x 480 square pixels).
  • Major manufacturers from which such image sensor modules may be available include OmniVision (e.g., native HD sensors), Samsung, and Sony.
  • a preferred sensor module comprises support for YUV, combined RGB, and raw RGB output formats, parallel DVP output interface, automatic exposure/gain, horizontal and vertical windowing capability, auto white balance control, aperture/gamma correction, serial camera control bus for register programming, external frame sync capability, flicker cancellation, defective pixel correction, a power requirement of less than about 600 mW, an input clock frequency of about 5 to about 30 MHz, progressive scan mode, rolling shutter, 30 fps full resolution, a sensitivity of at least about 5 V/lux-sec, a dynamic range of at least about 100 dB, and a pixel size of less than 5 μm.
  • the sensor module may be optionally adapted to receive at least one type of mechanical vibration (e.g., sound, ultrasound, and/or infrasound) and produce an output signal related to the received mechanical wave.
  • the sensor module may include a microphone.
  • the data output from the sensor module may be provided to a processing module.
  • the image processing module preferably provides highly integrated, fully compliant encoding, decoding, pre-processing, and post-processing.
  • the image processing module may be a system-on-a-chip and its potential features may be limited only by the constraints of weight, size, and power consumption.
  • Hardware or software enabled features of the image processing module may include: a high, main, and baseline H.264 HD 1920x1080i codec; an HD 1920x1080i MPEG2 decoder; a MJPEG codec (up to 12MP); multiple audio formats, such as, for example, AAC, G.7xx, AMR, MP1/2/3, and Dolby; dual, high-profile 720p30; multichannel 8 D1 or 16 CIF; 720p30 full-duplex operation; 1920x1080i MPEG2 to H.264 transcoding; AES and SHA hardware assist; motion adaptive de-interlacing and noise reduction; temporal/spatial filters; video cropping, scaling, and compositing; frame- and bit-rate control; advanced edge preservation; image stabilization, which feature may employ gyroscopes or other positioning and/or acceleration detection capability; multiple inputs and outputs; time and/or date coding; and/or a GPS locator, which may communicate with satellite and/or terrestrial GPS transmitters for highly accurate tracking (e.g., …).
  • the image processing module may provide high dynamic range (HDR) imaging.
  • Exposure bracketing may be used to achieve HDR.
  • Tone mapping techniques which reduce overall contrast to facilitate display of HDR images on devices with lower dynamic range, can be applied to produce images with preserved or exaggerated local contrast for artistic effect.
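The bracketing-and-tone-mapping pipeline above can be illustrated with a toy merge over flat pixel lists. The triangle weighting and the Reinhard-style global operator are common textbook choices, not necessarily the ones the specification contemplates.

```python
def merge_brackets(brackets):
    """Merge bracketed exposures into one HDR radiance value per pixel.

    brackets: list of (pixel_values, exposure_time) pairs, where pixel_values
    is a flat list of 0-255 readings. Each reading is divided by its exposure
    time to estimate scene radiance; mid-range readings are trusted most.
    """
    n = len(brackets[0][0])
    merged = []
    for i in range(n):
        num = den = 0.0
        for pixels, t in brackets:
            v = pixels[i]
            w = 1.0 - abs(v - 127.5) / 127.5  # triangle weight: trust mid-tones
            num += w * (v / t)
            den += w
        merged.append(num / den if den else 0.0)
    return merged

def tone_map(radiance):
    """Reinhard-style global operator: compresses high dynamic range
    into [0, 1) for display on a low-dynamic-range device."""
    return [r / (1.0 + r) for r in radiance]
```

Two brackets that agree on the underlying radiance (e.g., a reading of 100 at 1 s and 50 at 0.5 s) merge to the same value, which is the basic sanity check for an exposure-bracketing merge.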
  • the Image Processing Module may comprise integrated and/or removable image storage.
  • the image processing module may comprise on-board video memory that may be exportable for viewing and processing with an external and/or remote interface.
  • the image processing module may be the size of a small pack of matches and consume less than 1 watt of power.
  • the image processing module may be about 20 mm by 20 mm in size.
  • Suitable processing modules may be available from Maxim Integrated Products, Inc., Texas Instruments, Inc. (e.g., OMAP), Xilinx® (e.g., Spartan® FPGA), and Freescale Semiconductor Inc. (e.g., i.MX multimedia applications processors).
  • a preferred processing module comprises an 800 MHz CPU with 32KB instruction and data caches, unified L2 cache, SIMD media accelerator, and vector floating point co-processor.
  • a preferred processing module further comprises a multi-format HD720p encoder, an HD720p video decoder and D1 video encoder hardware engine, 24-bit primary display support up to WXGA resolution, 18-bit secondary display support, analog HD720p component TV output, hardware video de-interlacing, image and video resize, inversion and rotation hardware, alpha blending and color space conversion, color correction, gamut mapping, and gamma correction.
  • a preferred processing module also comprises an external memory interface for mDDR and DDR2 SDRAM, and SLC/MLC NAND flash memory.
  • the processed data output from the processing module may be provided to a communication module for transmission to an external and/or remote receiver.
  • the communication module may also receive input from an external and/or remote transmitter, such as, for example, signals for controlling the sensor module and/or processing module. Communication may be wired or wireless.
  • the communication module may be preferably a complete client device comprising an integrated media access controller (MAC), baseband processor, transceiver, and amplifier.
  • Hardware or software enabled features of the communication module may include: compliance to IEEE 802.11 b/g and single or multiple stream IEEE 802.11 n; compliance to WiMAX (e.g., IEEE 802.16e "mobile WiMAX"); a host interface through SDIO and SPI; Bluetooth coexistence; ultra low power operation; complete WLAN software along with a host driver for Windows Embedded CE, Windows Mobile, Windows XP, Linux, iPhone, Mac, and/or Google Android OS; single supply 3.0 to 3.6 V operation; robust multipath performance and extended range using STBC; and a small footprint.
  • the communication module may be adapted for a wireless transmission environment that may be entirely scalable and able to support multiple mobile camera feeds or cameras placed in fixed locations (e.g., goal line markers or goal nets).
  • the access point receivers may be placed virtually anywhere inside a field and/or stadium to provide live action feeds from anywhere on the field.
  • players may carry wireless transmission booster packs to increase signal strength for transmission to the sideline.
  • cameras described in the present specification may be remotely utilized (e.g., controlled and/or viewed) via a mobile telephone/smart phone, laptop computer, or other wireless or wired display-capable (e.g., LCD) viewing and/or control interface.
  • the communication module may be about 20 mm by 30 mm in size.
  • Suitable communication modules may be available from Redpine Signals, Inc.
  • a preferred communications module is a complete IEEE 802.11 bgn Wi-Fi client device with a standard serial or SPI interface to a host processor or data source. It integrates a MAC, baseband processor, RF transceiver with power amplifier, a frequency reference, an antenna, and all WLAN protocol and configuration functionality in embedded firmware to provide a self-contained 802.11 n WLAN solution.
  • the power supply may be selected by balancing various parameters including and not limited to the size, weight, and capacity of the power supply versus the size, weight, and efficiency of the other camera modules.
  • a suitable battery for cameras according to the present specification may provide power for at least about an hour (and preferably two hours or more) and be about 20 mm in diameter and weigh about 5 grams.
  • the power supply may be disposable or rechargeable.
  • the power supply may comprise an alternative energy source, such as, for example, a power generator powered by solar energy or kinetic energy (i.e., power from the user's body motion or body heat).
  • Suitable power supplies include and are not limited to lithium ion batteries, nickel metal hydride batteries, and alkaline batteries.
  • the power supply may rely on wireless energy transfer, such as, for example, induction, and/or printed electronics techniques, such as, for example, flexible polymer batteries.
  • a light sensitive on/off switch may be utilized to conserve power while allowing for a quick transition from low-power standby mode (also known as "sleep mode") to full-power operation.
  • the image sensor chip may include at least one pixel that is always "on,” i.e., always held within an operational voltage range.
  • the always-on pixel may be located in the test pixel area. While the lens is covered, the camera can be in standby or sleep mode. Once the cover is removed, the always-on pixel detects light entering the lens, and the camera returns to full-power operation.
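The always-on-pixel scheme above is essentially a two-state machine. A minimal sketch, in which the wake threshold is a made-up digital value; a real sensor would compare against a calibrated dark-level voltage:

```python
# Illustrative threshold; a real sensor would compare the always-on pixel
# against a calibrated dark-level voltage, not this made-up digital value.
WAKE_THRESHOLD = 16

class CameraPower:
    """State machine for the always-on-pixel wake scheme: while the lens
    cover blocks light the camera sleeps; light on the test pixel wakes it."""

    def __init__(self):
        self.state = "sleep"

    def sample_always_on_pixel(self, level):
        if self.state == "sleep" and level >= WAKE_THRESHOLD:
            self.state = "full-power"     # cover removed: light detected
        elif self.state == "full-power" and level < WAKE_THRESHOLD:
            self.state = "sleep"          # cover replaced: back to standby
        return self.state
```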
  • Cameras according to the present specification may incorporate a mount removeably attachable to a user or an object.
  • a camera according to the present specification may be reversibly mounted on a wall, a goal post, or even a helmet (just like a postage stamp).
  • cameras according to the present specification may not have a display, which might require an inconvenient amount of both space and power. Also, cameras according to the present specification may not have built-in control interfaces to operate various system parameters. In fact, by utilizing the sensor, processing, and communication modules described herein above, cameras according to the present invention may have only an on/off switch (or no switches at all); all other control features being available through an external and/or remote interface. The external and/or remote interface may also provide further processing subsequent to transmission.
  • External and/or remote interface may include: single and multiple camera control and synchronization; software for image and audio processing; mobile phone/smart phone compatibility.
  • the sensor module of a camera receives light through a lens that focuses the light onto a sensor.
  • the light causes a voltage change within the pixel structure of the sensor. This voltage change may be detected and, by having the sensor pixel structure arranged in an array pattern, an image may be built from each individual pixel voltage level change.
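Building an image from per-pixel voltage-change readings amounts to reshaping a flat sequence of samples into rows of the sensor's array pattern; a minimal sketch:

```python
def build_image(voltages, width):
    """Assemble a 2-D image from per-pixel voltage-change readings,
    which arrive as one flat sequence in row-major (array) order."""
    assert len(voltages) % width == 0, "readings must fill whole rows"
    return [voltages[row:row + width]
            for row in range(0, len(voltages), width)]
```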
  • the image data may be transferred to the processing module in which video processing occurs that may, for example, construct a video stream and/or improve the image quality.
  • Image stabilization may occur either in the camera module depending on the capabilities of the camera module, in the central processor, or in an external and/or remote interface.
  • the image stabilization process may use data obtained from gyroscopes or other acceleration/positioning detection technology incorporated within the camera.
  • the processed image data may be then compressed using MPEG-4, Motion-JPEG, or various other video compression techniques.
  • the processed image data may be sent to the communications module where the image data may be formatted for wireless broadcast.
  • Wireless broadcast may be via 802.11 n, or WiMax, or another wireless transmission capability.
  • Control features and functions may be controlled via an external and/or remote wired or wireless interface, such as a laptop computer, smart phone, or other wireless device with an image display or projection capability.
  • the processed image data could also be stored within the camera itself, in a dedicated memory location.
  • the wireless video broadcast may be user selectable between different target reception devices.
  • User control may select a single reception device such as a laptop computer, smart phone, or other video display device to receive and/or decrypt the video image.
  • the user control may enable selection of multiple reception devices, such as, for example, a group of devices or user-defined specific devices, to be allowed to receive and/or decrypt the video data.
  • the user control may select a broadcast which allows any device within range to receive the video data.
  • Video broadcast data may be encrypted to ensure privacy of the video data.
  • the video capture and display function may be partially performed by having the video data stored by the camera on an optional memory device for processing and playback later in an external and/or remote display device such as a laptop, smart phone, or other display device.
  • the video capture and display function may be partially performed by having the video data stored by a first external and/or remote interface for processing and playback later in a second external and/or remote display device such as a laptop, smart phone, or other display device.
  • the video data may be stored on a video hosting server for processing and/or playback later in a web-based interface.
  • On-camera or external processing may include combining real image data from the camera with virtual image data to create a composite image.
  • real image data of a user may be combined with virtual image data of a background to create a combined image of a user in a location that the user did not visit.
  • On-camera or external processing may include using real image data from the camera to create or paint virtual images.
  • real image data of a user may be used to paint a virtual image of the user (i.e., an avatar).
  • cameras according to the present specification may provide a single control point user interface.
  • a camera may broadcast a handshake protocol request and wait for a reply from a wireless transmission video display device.
  • a user would reply to the camera's handshake request enabling the user's interface to be the only recognized control point for accessing the camera.
  • a single point user interface allows the individual user to control the user interface options available on the camera, such as, for example, lighting controls, selectable compression techniques and algorithms, broadcast type (single point, select group, or worldwide), power modes, such as, for example, on/off or sleep, continuous or intermittent video image capture and/or broadcast, or other camera performance capabilities.
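One plausible reading of the handshake described above is "first reply wins": the camera broadcasts a request, and the first interface to answer becomes the only recognized control point. This sketch is an assumption about the protocol's behavior, not the specification's definition; the message fields are invented for illustration.

```python
class SingleControlCamera:
    """Single-control-point handshake: the camera broadcasts a request and
    the first interface that replies becomes the only recognized controller."""

    def __init__(self):
        self.control_point = None

    def broadcast_handshake(self):
        # Hypothetical message format for the handshake protocol request.
        return {"type": "handshake-request", "camera": id(self)}

    def receive_reply(self, interface_id):
        if self.control_point is None:
            self.control_point = interface_id        # first reply wins
            return True
        return self.control_point == interface_id    # later replies ignored

    def handle_command(self, interface_id, command):
        # Only the recognized control point may change camera settings
        # (lighting, compression, broadcast type, power mode, etc.).
        if interface_id != self.control_point:
            return "rejected"
        return f"applied:{command}"
```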
  • multiple cameras could be controlled by a single external and/or remote user interfaces.
  • a single camera could be controlled by multiple external and/or remote user interfaces.
  • multiple cameras could be controlled by multiple external and/or remote user interfaces.
  • The design of cameras according to the present invention requires the careful balancing of processing and communication power usage versus energy supply (and price). Ideally, all data generated by the camera (e.g., gyroscope and accelerometer data along with video and timecode) would be transmitted to an external interface (i.e., control device) as a sustainable, clean signal (i.e., acceptable communication using the least amount of power). For example, boosting power to the transceiver may allow for more accurate communication between the camera and the remote interface, but at the expense of being able to perform image stabilization in the processing module.
  • Each application (e.g., a professional sporting event or a small family picnic) requires a different balance and a different combination of modules.
  • the miniaturized and modular cameras described herein are well adapted for achieving the right balance.
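The stabilization-versus-transmit-power trade-off can be sketched as a power-budget planner. All the milliwatt figures below are invented for illustration; real numbers would come from the chosen sensor, processor, and radio datasheets.

```python
# Illustrative module power draws in milliwatts (hypothetical values).
BASE_DRAW_MW = {"sensor": 150, "processor": 400, "radio": 250}
STABILIZATION_MW = 200   # extra processing cost of image stabilization
RADIO_BOOST_MW = 200     # extra transmit power for a cleaner signal

def plan_features(budget_mw, prefer_boost):
    """Enable optional features (stabilization vs. boosted transmit) only
    while they fit inside the battery's power budget, mirroring the
    trade-off described in the text."""
    draw = sum(BASE_DRAW_MW.values())
    cost = {"stabilization": STABILIZATION_MW, "radio-boost": RADIO_BOOST_MW}
    order = (["radio-boost", "stabilization"] if prefer_boost
             else ["stabilization", "radio-boost"])
    enabled = []
    for feature in order:
        if draw + cost[feature] <= budget_mw:
            draw += cost[feature]
            enabled.append(feature)
    return enabled, draw
```

With a tight 1000 mW budget only one optional feature fits, so preferring the radio boost means giving up stabilization, exactly the tension the passage describes.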
  • Camera apparatuses according to the present specification include at least one camera as described herein above in combination with markers that are employed for enhanced video processing, such as, for example, enhanced image stabilization and enhanced image tracking. Camera apparatuses according to the present specification including markers are even capable of producing data for a 3-D display. Enhancing the capture for use with 3-D display could also include two or more cameras.
  • the markers may be passive (such as, for example, paint, ink, chalk, or a reflective surface) or active (such as, for example, radio transmitters or LEDs). Markers may be located or defined upon persons and objects that are within an area of interest or that will pass through an area of interest. For example, if a football field is the area of interest, marker(s) may be located or defined on all the player's helmets, caps, jerseys, uniforms, shoulder pads, hip pads, gloves, shoes, hands, and feet, as well as on sidelines, goal lines, and even the ball. Markers may be pre-determined or dynamically defined and be of any shape and size. For example, regarding a ball, a marker may be defined as the laces, as a stripe applied to the ball, or as either of the laces or the stripe depending upon which marker is visible to a camera in a scene.
  • Cameras and external interfaces can receive and transmit more data using less power if the processing module (or external interface) can process data faster and more accurately. For example, using one or more techniques for enhancing edge detection/determination allows the processing of each frame of data to be faster and more accurate.
  • colors that have a higher contrast ratio may be used. For example, with dynamic action on a relatively static background (e.g., a football game on a green field or a skiing competition on a white slope), having an individual in a highly contrasting color allows the processing module (or external interface) to better determine lines, shapes, and edges (e.g., a white jersey against the green field, or a red ski jacket against the white slope).
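The contrast-ratio idea can be made concrete with a one-dimensional scanline. The luminance values and the minimum-ratio threshold below are hypothetical, chosen to mimic a bright jersey against darker turf.

```python
def contrast_ratio(luma_a, luma_b):
    """Ratio between two luminance values (0-255 scale)."""
    hi, lo = max(luma_a, luma_b), min(luma_a, luma_b)
    return (hi + 1) / (lo + 1)  # +1 avoids division by zero at pure black

def find_edges(row, min_ratio=2.0):
    """Return indices in a scanline where neighboring pixels differ by at
    least `min_ratio`: a white jersey (~235) against green turf (~90)
    passes easily, while turf texture (~90 vs ~92) does not."""
    return [i for i in range(1, len(row))
            if contrast_ratio(row[i - 1], row[i]) >= min_ratio]
```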
  • patterns of markers may also be used to process edge definitions faster and more accurately.
  • Using easily defined patterns of markers (e.g., dots, squares, or diamonds) helps the processing module (or external interface) process edge definitions faster and more accurately. When a pre-determined pattern of markers is defined or applied to a specific location (e.g., numbers on a jersey, diamonds on a helmet, or stripes on shoes), this allows for better detection and deterministic calculation, which better defines the scene.
  • Active markers may emit a signal in continuous, random, or controlled patterns.
  • Controlled patterns could include intelligent information such as velocity, acceleration, and/or biological information of the wearer (e.g., heart beat or body temperature).
  • an LED can be pulsed depending on the action or activity of the player. Faster pulses could be from speed, acceleration, or other physical attributes.
  • the control of the LEDs can be both from on-player sensors such as G-force or accelerometers, and from remote determination.
  • the LED emission can be in the visible, infrared, and/or ultraviolet spectrum.
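The speed-to-pulse-rate encoding for an active LED marker can be sketched as a simple linear mapping with its inverse for the watching interface. The rate range and maximum speed below are invented; real hardware would use calibrated values.

```python
# Hypothetical pulse-rate range for the LED marker.
MIN_HZ, MAX_HZ = 1.0, 60.0

def pulse_rate(speed_mps, max_speed_mps=12.0):
    """Map a player's speed onto an LED pulse rate: faster player, faster
    pulses, clamped to the marker's supported range."""
    frac = max(0.0, min(speed_mps / max_speed_mps, 1.0))
    return MIN_HZ + frac * (MAX_HZ - MIN_HZ)

def decode_speed(rate_hz, max_speed_mps=12.0):
    """Inverse mapping, as used by the external interface observing the
    marker's pulses to recover the player's speed."""
    frac = (rate_hz - MIN_HZ) / (MAX_HZ - MIN_HZ)
    return frac * max_speed_mps
```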
  • edge-enhancement techniques may be utilized simultaneously and each technique may be employed in numerous schemes.
  • the front of a jersey may be red with blue dots and the back of the jersey may be blue with red dots.
  • Image data from a camera may be received and processed by an external and/or remote interface together with processing data (e.g., timecode, and gyroscope and accelerometer data) received from that camera, as well as tracking data based on the markers (received by the external and/or remote interface either within the image data from the camera or from an active marker).
  • Color and/or time vector analysis based on tracking data may be performed with or without processing data.
  • A color/time vector may track a "video paint" through an area of interest for wireless signal detection.
  • Color and/or time vector analysis based on tracking data allows individual players to be tracked within a scene. Such tracking might allow a player's helmet camera to turn on depending on whether the player is determined to be "in the play" or not. Directors and/or computers could provide real-time, play-by-play updates of which cameras on the field may be "always-on" or "sometimes-on."
  • Employing two cameras as part of a camera apparatus according to the present invention may be the basis for 3-D image capture and display.
  • Each individual camera creates a video vector to a target image (e.g., a football).
  • Software processing may create a 3-D image.
  • Parallax errors may be introduced within the image scene due to the two cameras having slightly different views of the target.
  • Static markers provide a fixed reference plane.
  • With dynamic markers (i.e., markers on moving objects, including players), the player may be accurately placed into the 3-D field (e.g., an X, Y, Z reference Cartesian coordinate system space).
  • Each frame of video data may be accurately placed into 4-D space (e.g., time plus an X, Y, Z reference Cartesian coordinate system space).
  • Multiple cameras may have a constant timecode, which allows for a more accurate recreation of the entire scene.
  • Other cameras within the same field provide additional video vectors. Knowing the video vector from each camera allows for processing parallax removal, which helps increase the depth of the 3-D field. Also, the additional video vectors may be used to sharpen the primary 3-D image.
  • Processing data, including gyroscope and accelerometer data from the cameras, enables video image processing that combines digital image data with external sensor inputs for position and movement parameters, affording an enhanced ability to re-create 3-D video at the best possible resolution, plus added dynamic and static information on position, alignment, acceleration, and shock.
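The contrast-based edge detection in the bullets above can be illustrated with a toy example. This sketch is illustrative only and not part of the patent disclosure; the luminance values and threshold are assumptions.

```python
def find_edges(luma, threshold=80):
    """Return indices where adjacent-pixel contrast exceeds the threshold."""
    # A high-contrast marker (e.g., a white jersey on a green field)
    # produces large luminance steps that are easy to localize.
    return [i for i in range(1, len(luma)) if abs(luma[i] - luma[i - 1]) >= threshold]

# A scanline crossing a white jersey (~235) on a green field (~90):
scanline = [90, 92, 91, 235, 236, 234, 90, 88]
print(find_edges(scanline))  # indices where the jersey edge enters and leaves
```

In practice the same idea would run per color channel over full frames, but the principle is the one the description relies on: the higher the contrast ratio, the larger and more unambiguous the step.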
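The active-marker bullets describe pulsing an LED faster under higher acceleration. One hypothetical encoding (the base rate, gain, and ceiling below are invented for illustration; the patent does not fix numbers):

```python
def pulse_rate_hz(accel_g, base_hz=2.0, gain_hz_per_g=1.5, max_hz=30.0):
    """Map on-player accelerometer magnitude (in g) to an LED pulse rate in Hz."""
    # Faster pulses for harder acceleration, clamped to a ceiling so the
    # marker remains detectable by the processing module.
    return min(max_hz, base_hz + gain_hz_per_g * abs(accel_g))

print(pulse_rate_hz(0.0))  # resting player: base rate
print(pulse_rate_hz(4.0))  # hard cut or hit: faster pulses
```

A remote controller could drive the same function from externally determined values instead of on-player sensors, matching the bullet that allows both control paths.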
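The two-camera bullets rest on standard stereo triangulation: once both video vectors to the target are known, depth follows from the disparity between the two views. A minimal pinhole-model sketch (the focal length and baseline values are assumed, not taken from the patent):

```python
def depth_from_disparity(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth of a target (e.g., a football) seen by two horizontally offset cameras.

    Pinhole stereo model: Z = f * B / d, where d is the disparity in pixels.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("target must project further left in the left image")
    return focal_px * baseline_m / disparity

# Target seen 20 px apart by two cameras 0.5 m apart with f = 800 px:
print(depth_from_disparity(800, 0.5, 420, 400))  # depth in metres
```

The static markers mentioned in the description give the fixed reference plane against which such per-target depths can be registered, and parallax between the two views is exactly the disparity this model consumes.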
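The constant-timecode bullet amounts to tagging each triangulated (X, Y, Z) position with a shared clock, yielding the 4-D (time plus space) samples the description mentions. A sketch; the type and field names are assumptions:

```python
from typing import NamedTuple

class Sample4D(NamedTuple):
    t: float  # shared timecode, in seconds
    x: float
    y: float
    z: float

def place_frames(timecodes, positions):
    """Pair each frame's shared timecode with its triangulated (x, y, z) position."""
    return [Sample4D(t, *p) for t, p in zip(timecodes, positions)]

# Two consecutive frames at 30 fps for a player moving along x:
samples = place_frames([0.0, 1 / 30], [(1.0, 2.0, 0.0), (1.2, 2.0, 0.0)])
print(samples[1])
```

Because every camera stamps frames against the same timecode, samples from many cameras can be merged into one 4-D record of the scene, which is what makes the fuller recreation described above possible.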

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides real-time, mobile, hands-free video cameras that overcome the shortcomings of prior designs. The cameras described in this specification are light and small enough to be mounted virtually anywhere, especially on the user's body; they are also inexpensive and rugged enough to be used in strenuous and/or high-contact activities, and in addition to their very small, durable form factor they provide improved and/or high-definition motion and video capture over extended periods. This combination of small size, low power consumption, and high resolution has not previously been available in the prior art. Moreover, these cameras can interoperate seamlessly with a variety of software applications and independent platforms.
PCT/US2010/038055 2009-06-09 2010-06-09 Cameras, camera systems, and methods of using same WO2010144635A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/377,531 US20120140085A1 (en) 2009-06-09 2010-06-09 Cameras, camera apparatuses, and methods of using same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US18519909P 2009-06-09 2009-06-09
US61/185,199 2009-06-09

Publications (1)

Publication Number Publication Date
WO2010144635A1 true WO2010144635A1 (fr) 2010-12-16

Family

ID=43309217

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/038055 WO2010144635A1 (fr) Cameras, camera systems, and methods of using same

Country Status (2)

Country Link
US (1) US20120140085A1 (fr)
WO (1) WO2010144635A1 (fr)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013033954A1 (fr) 深圳市大疆创新科技有限公司 Dynamic self-balancing gyroscopic gimbal
US8903568B1 (en) 2013-07-31 2014-12-02 SZ DJI Technology Co., Ltd Remote control method and terminal
JP2016541026A (ja) * 2013-10-08 2016-12-28 SZ DJI Technology Co., Ltd. Apparatus and method for stabilization and vibration reduction
US9622720B2 (en) 2013-11-27 2017-04-18 Clear Guide Medical, Inc. Ultrasound system with stereo image guidance or tracking
US8880151B1 (en) 2013-11-27 2014-11-04 Clear Guide Medical, Llc Surgical needle for a surgical system with optical recognition
CN103826100A (zh) * 2013-12-31 2014-05-28 南宁市公安局 Dual-camera-based monitoring method
CN103813139A (zh) * 2013-12-31 2014-05-21 南宁市公安局 Vehicle-mounted monitoring device
KR102191869B1 (ko) 2014-02-17 2020-12-16 LG Electronics Inc. Portable device and control method therefor
US10516816B2 (en) * 2014-11-19 2019-12-24 Lenovo (Singapore) Pte. Ltd. Enhanced information handling device cover
US20170048495A1 (en) * 2015-02-17 2017-02-16 SkyBell Technologies, Inc. Power outlet cameras
JP6560709B2 (ja) * 2017-05-18 2019-08-14 Lenovo (Singapore) Pte. Ltd. Information processing apparatus and mode selection method
EP3855387A4 (fr) * 2018-09-18 2022-02-23 Zhejiang Uniview Technologies Co., Ltd. Image processing method and apparatus, electronic device, and computer-readable storage medium
US10594856B1 (en) * 2019-06-27 2020-03-17 Shenzhen GOODIX Technology Co., Ltd. Immediate-mode camera for portable personal electronic devices

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6072529A (en) * 1996-09-16 2000-06-06 Eastman Kodak Company Electronic camera for the realization of the imaging properties of a studio bellow camera
EP0951697B1 (fr) * 1997-01-13 2002-12-18 Qualisys AB Method and system for motion analysis
US6690001B2 (en) * 2000-04-06 2004-02-10 Rensselaer Polytechnic Institute THz pulse measurement with an optical streak camera
US6819354B1 (en) * 2000-06-13 2004-11-16 Omnivision Technologies, Inc. Completely integrated helmet camera
US20070091175A1 (en) * 1998-09-28 2007-04-26 3Dv Systems Ltd. 3D Vision On A Chip
US7277599B2 (en) * 2002-09-23 2007-10-02 Regents Of The University Of Minnesota System and method for three-dimensional video imaging using a single camera
US7301465B2 (en) * 2005-03-24 2007-11-27 Tengshe Vishwas V Drowsy driving alarm system
US20080145044A1 (en) * 2001-01-10 2008-06-19 Ip Holdings, Inc. Motion detector camera
US20080246759A1 (en) * 2005-02-23 2008-10-09 Craig Summers Automatic Scene Modeling for the 3D Camera and 3D Video
WO2008128205A1 (fr) * 2007-04-13 2008-10-23 Presler Ari M Digital cinema camera system for recording, editing and visualizing images
US7489340B2 (en) * 2004-11-04 2009-02-10 Samsung Electronics Co., Ltd. Optical image stabilizer for camera lens assembly

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4630913A (en) * 1985-12-16 1986-12-23 Lo Allen K W Extended depth-of-field 3-D camera
JP3530679B2 (ja) * 1996-06-14 2004-05-24 Canon Inc. Imaging device with eyepiece detection function
US7046273B2 (en) * 2001-07-02 2006-05-16 Fuji Photo Film Co., Ltd System and method for collecting image information
US7663666B2 (en) * 2003-06-23 2010-02-16 Canon Kabushiki Kaisha Operation at mobile terminal when communicating with remote camera
CN101129071A (zh) * 2004-02-19 2008-02-20 安全摄像机股份有限公司 Camera system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011103675A1 (de) * 2011-06-01 2012-12-06 von Friedrich Georg Joachim Heinrich Bernewitz Freiherr Device and method for recording and transmitting sports events
CN103051830A (zh) * 2012-12-31 2013-04-17 北京中科大洋科技发展股份有限公司 System and method for multi-angle real-time rebroadcasting of a filmed target
CN103051830B (zh) * 2012-12-31 2015-12-23 北京中科大洋科技发展股份有限公司 System and method for multi-angle real-time rebroadcasting of a filmed target

Also Published As

Publication number Publication date
US20120140085A1 (en) 2012-06-07

Similar Documents

Publication Publication Date Title
US20120140085A1 (en) Cameras, camera apparatuses, and methods of using same
JP7466115B2 (ja) Imaging system and calibration method
US11647204B2 (en) Systems and methods for spatially selective video coding
WO2020200067A1 (fr) Image processing method and device for head-mounted display
US10936894B2 (en) Systems and methods for processing image data based on region-of-interest (ROI) of a user
US10484652B2 (en) Smart headgear
CN109076249B (zh) Systems and methods for video processing and display
WO2020238741A1 (fr) Image processing method, related device, and computer storage medium
US8730388B2 (en) Wearable video camera with dual rotatable imaging devices
WO2014010157A1 (fr) Image generation device and image generation method
US20170308116A1 (en) Apparel-Mountable Panoramic Camera Systems
CN113542529B (zh) 940 nm LED flash synchronization for DMS and OMS
CN103562791A (zh) Apparatus and method for panoramic video imaging with mobile computing devices
WO2014162324A1 (fr) Spherical omnidirectional system for shooting video
US20170195563A1 (en) Body-mountable panoramic cameras with wide fields of view
US10969655B2 (en) Camera system using interchangeable fixed lenses
JP2003324649A (ja) Electronic camera with communication function and imaging method in the electronic camera system
CN110069236A (zh) Head-mounted electronic device and audio output control method therefor
EP3542541A1 (fr) Method for a multi-camera device
GB2558893A (en) Method for processing media content and technical equipment for the same
WO2013186805A1 (fr) Image capturing device and image capturing method
WO2018010473A1 (fr) Smart-display-device-based method for controlling rotation of an unmanned aerial vehicle gimbal
WO2022220305A1 (fr) Video display system, information processing method, and program
US20240112305A1 (en) Real-time fiducials and event-driven graphics in panoramic video
KR20130064533A (ko) Patch-type camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10786816

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 13377531

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 10786816

Country of ref document: EP

Kind code of ref document: A1