US20140192235A1 - Systems, methods, and media for reconstructing a space-time volume from a coded image - Google Patents
- Publication number
- US20140192235A1 (application US14/001,139)
- Authority
- US
- United States
- Prior art keywords
- image
- shutter function
- coded
- image data
- space
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/001—Image restoration
- G06T5/002—Denoising; Smoothing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G06T5/70—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/388—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
- H04N25/53—Control of the integration time
- H04N25/533—Control of the integration time by using differing integration times for different sensor regions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/21—Indexing scheme for image data processing or generation, in general involving computational photography
Definitions
- A/D analog-to-digital
- Systems, methods, and media for reconstructing a space-time volume from a coded image comprising: an image sensor that outputs image data; and at least one processor that: causes a projection of the space-time volume to be captured in a single image of the image data in accordance with a coded shutter function; receives the image data; and performs a reconstruction process on the image data to provide a space-time volume corresponding to the image data.
- Methods for reconstructing a space-time volume from a coded image comprising: causing a projection of the space-time volume to be captured by an image sensor in a single image of image data in accordance with a coded shutter function using a hardware processor; receiving the image data using a hardware processor; and performing a reconstruction process on the image data to provide a space-time volume corresponding to the image data using a hardware processor.
- Non-transitory computer-readable media containing computer-executable instructions that, when executed by a processor, cause the processor to perform a method for reconstructing a space-time volume from a coded image, the method comprising: causing a projection of the space-time volume to be captured in a single image of image data in accordance with a coded shutter function; receiving the image data; and performing a reconstruction process on the image data to provide a space-time volume corresponding to the image data.
- FIG. 1 is a diagram of a process for producing a space-time volume video from a single coded image in accordance with some embodiments.
- FIG. 2 is a diagram of a process for generating a coded shutter function in accordance with some embodiments.
- FIG. 3 is a diagram of hardware that can be used in accordance with some embodiments.
- FIG. 4 is a diagram of an image sensor that can be used in accordance with some embodiments.
- Systems, methods, and media for reconstructing a space-time volume from a coded image are provided.
- These systems, methods, and media can provide improved temporal resolution without sacrificing spatial resolution in a captured video.
- A video can be produced by reconstructing a space-time volume E from a single coded image I captured using a per-pixel coded shutter function S, which defines how pixels of a camera sensor capture the coded image I.
- The coded image I can be defined as shown in equation (1):
- I(x, y) = Σₜ S(x, y, t)·E(x, y, t)  (1)
- where x and y correspond to the two dimensions of an M×M pixel neighborhood of a camera sensor, and
- t corresponds to N intervals of one integration time of the camera sensor.
- The resolution of this space-time volume E is M×M×N.
- Although a neighborhood of a camera sensor is described herein as square (M×M) for simplicity and consistency, in some embodiments a neighborhood need not be square and can be any suitable shape.
- The volume can be expressed as E = Dα, where D is a basis in which E is sparse and
- α is the sparse representation of E.
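As a concrete illustration, the per-pixel image-formation model of equation (1) can be sketched in a few lines of NumPy. The patch size M and interval count N below are illustrative assumptions, not values mandated by the text:

```python
import numpy as np

# Sketch of equation (1): the coded image I is the per-pixel shuttered
# sum of the space-time volume E over the N intervals of one
# integration time. Sizes here are illustrative assumptions.
M, N = 7, 36
rng = np.random.default_rng(0)

E = rng.random((M, M, N))          # space-time volume, M x M x N
S = rng.random((M, M, N)) < 0.5    # binary per-pixel shutter function

# I(x, y) = sum over t of S(x, y, t) * E(x, y, t)
I = (S * E).sum(axis=2)
print(I.shape)  # (7, 7)
```

Each pixel of I thus integrates only the time intervals its shutter bit enables, which is what makes later recovery of the full M×M×N volume possible.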
- FIG. 1 shows an example of a process 100 for reconstructing a space-time volume from a captured image. As illustrated, after process 100 begins at 102, the space-time volume can be sampled into a coded image using a coded shutter function at 104.
- Any suitable coded shutter function can be used to capture an image at 104, and the shutter function used can have any suitable attributes.
- For example, the shutter function can be a binary shutter function (i.e., S(x, y, t) ∈ {0, 1}) wherein, at every time interval t, the shutter is either integrating light (on) or not (off).
- The shutter function can have only one continuous exposure period (or "bump") for each pixel during a camera sensor's integration time.
- The shutter function can have one or more bump lengths (i.e., durations of exposure) measured in intervals t.
- The shutter function can have bumps that start at periodic or random times.
- The shutter function can have groups of pixels with the same start time based on location (e.g., in the same row) in a camera sensor.
- The shutter function can have the attribute that at least one pixel of each M×M pixel neighborhood of a camera sensor is sampled at each interval during the camera sensor's integration time.
- A coded shutter function can include a combination of such attributes.
- For example, a coded shutter function can be binary, can have only one continuous exposure period (or "bump") of a single length for each pixel during a camera sensor's integration time, can have bumps that start at random times, and can sample at least one pixel of each M×M pixel neighborhood of a camera sensor at each interval during the camera sensor's integration time.
- A process 200 for generating such a coded shutter function in accordance with some embodiments is illustrated in FIG. 2. This process can be performed at any suitable point or points in time, and can be performed only once in some embodiments.
- The process can set a first bump length at 204.
- Any suitable bump length can be set as the first bump length.
- For example, the first bump length can be set to one interval t.
- The process can then select the first camera sensor pixel. Any suitable pixel can be selected as the first camera sensor pixel.
- For example, the camera sensor pixel with the lowest set of coordinate values can be selected as the first camera sensor pixel.
- Process 200 can then randomly (or pseudo-randomly) select a start time during the integration time of the camera's sensor for the selected pixel and assign the bump length and start time to the pixel.
- It can then be determined if the selected pixel is the last pixel. If not, then process 200 can select the next pixel (using any suitable technique) at 212 and loop back to 208.
- Process 200 can next select a first M×M pixel neighborhood at 214.
- This neighborhood can be selected in any suitable manner.
- For example, a first M×M pixel neighborhood can be selected as the M×M pixel neighborhood with the lowest set of coordinates.
- The process can then determine if at least one pixel in the selected neighborhood was sampled at each time t. This determination can be made in any suitable manner. For example, in some embodiments, the process can loop through each time t and determine if a pixel in the neighborhood has a bump that occurs during that time t. If no pixel in the neighborhood is determined to have a bump during the time t, then the neighborhood can be determined as not having at least one pixel sampled at each time t, and process 200 can loop back to 206.
- Otherwise, the process can determine if the current neighborhood is the last neighborhood at 218. This determination can be made in any suitable manner. For example, in some embodiments, the current neighborhood can be determined to be the last neighborhood if it has the highest coordinate pair of all of the neighborhoods. If it is determined that the current neighborhood is not the last neighborhood, then process 200 can select the next neighborhood at 220 and loop back to 216.
- Process 200 can next simulate image capture using the bump length and start time assigned to each pixel.
- Image capture can be simulated in any suitable manner. For example, in some embodiments, image capture can be simulated using real high-speed video data.
- Reconstruction of the M×M×N sub-volumes and averaging of the sub-volumes to provide a single volume can then be performed as described in connection with 106 and 108 of FIG. 1 below.
- The peak signal to noise ratio (PSNR) for the single volume produced at 222 and 224 can then be determined. This PSNR can be determined in any suitable manner, such as by comparing the single volume to the real high-speed video used for the simulated image capture.
- Process 200 can then determine if the current bump length is the last bump length to be checked. This can be determined in any suitable manner. For example, when the bump length is equal to the camera sensor's integration time, the bump length can be determined to be the last bump length. If the bump length is determined to not be the last bump length, then process 200 can select the next bump length at 230 and loop back to 206. The next bump length can be selected in any suitable manner. For example, the next bump length can be set to the previous bump length plus one interval t in some embodiments.
- The bump length and starting time assignments with the best PSNR can then be selected as the coded shutter function at 232.
- The best PSNR can be selected on any suitable basis.
- For example, the best PSNR can be selected as the highest PSNR value determined in the presence of noise similar to anticipated camera noise.
- Finally, process 200 can terminate at 234.
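The per-pixel assignment and neighborhood-coverage check of process 200 can be sketched as follows for a single bump length. This is a hedged illustration: the sensor size, bump length, and retry limit are assumptions, and the PSNR-driven search over bump lengths is omitted:

```python
import numpy as np

def make_shutter(H, W, N, bump_len, M, rng, max_tries=1000):
    """Assign one continuous exposure bump of length bump_len, starting
    at a random interval, to every pixel; retry until every M x M
    neighborhood has at least one pixel exposed at each of N intervals."""
    for _ in range(max_tries):
        starts = rng.integers(0, N - bump_len + 1, size=(H, W))
        S = np.zeros((H, W, N), dtype=bool)
        for y in range(H):
            for x in range(W):
                S[y, x, starts[y, x]:starts[y, x] + bump_len] = True
        # Coverage check: in every M x M neighborhood, each interval t
        # must be sampled by at least one pixel.
        covered = all(
            S[y:y + M, x:x + M].any(axis=(0, 1)).all()
            for y in range(H - M + 1)
            for x in range(W - M + 1)
        )
        if covered:
            return S
    raise RuntimeError("no valid assignment found; try a longer bump length")

S = make_shutter(H=8, W=8, N=8, bump_len=4, M=4,
                 rng=np.random.default_rng(0))
```

In a full implementation of process 200, this generation would be repeated for each candidate bump length, image capture would be simulated against high-speed video, and the assignment with the best PSNR would be kept.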
- Returning to FIG. 1, a reconstruction process can be performed on patches of size M×M for every spatial location in the captured image to produce volume patches of size M×M×N at 106.
- This reconstruction process can be performed in any suitable manner. For example, in some embodiments, this reconstruction process can be performed by solving the following sparse approximation problem to find α̂:
- α̂ = argmin ‖α‖₀ subject to ‖I − S·D·α‖₂ < ε
- where:
- α is a sparse representation of E;
- S is a matrix form of the shutter function;
- I is a vector form of the captured coded image; and
- ε is a bound on the error between the reconstructed space-time volume and the ground truth.
- Any suitable mechanism can be used to solve this approximation problem.
- For example, the orthogonal matching pursuit (OMP) algorithm can be used to solve this approximation problem.
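A minimal orthogonal matching pursuit for the sparse approximation above can be sketched as follows. Here a generic measurement matrix A stands in for the product of the shutter matrix S and dictionary D, and the sizes, sparsity level, and stopping tolerance are illustrative assumptions:

```python
import numpy as np

def omp(A, b, k, tol=1e-8):
    """Minimal orthogonal matching pursuit: greedily add the column of A
    most correlated with the current residual, then re-fit the selected
    columns to b by least squares."""
    residual = b.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    alpha = np.zeros(A.shape[1])
    alpha[support] = coef
    return alpha

# Synthetic check: a 3-sparse vector observed through 64 measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((64, 128))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
true_alpha = np.zeros(128)
true_alpha[[5, 40, 99]] = [1.0, -2.0, 0.5]
b = A @ true_alpha
alpha_hat = omp(A, b, k=5)              # typically recovers true_alpha
```

The greedy support selection is what makes OMP attractive here: each M×M patch needs only a few dictionary atoms, so a handful of iterations per patch suffices.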
- Any suitable over-complete dictionary D can be used in some embodiments, and such a dictionary can be formed in any suitable manner.
- For example, an over-complete dictionary for sparsely expressing target video volumes can be built from a large collection of natural video data.
- Such an over-complete dictionary can be trained from patches of natural scenes in a training data set using the K-SVD algorithm, as described in Aharon et al., "K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation," IEEE Transactions on Signal Processing, vol. 54, no. 11, November 2006, which is hereby incorporated by reference herein in its entirety.
- Such training can occur any suitable number of times (such as only once) and can occur at any suitable point(s) in time.
- Any suitable number of videos of any suitable type can be used to train the dictionary in some embodiments.
- For example, a random selection of 20 video sequences with frame rates close to a target frame rate (e.g., 300 fps) can be used.
- Spatial rotations can be performed on the sequences, and the sequences can be used for training in both their forward (i.e., normal playback) and backward (i.e., reverse playback) directions, in some embodiments.
- Any suitable rotations can be performed in some embodiments.
- For example, rotations of 0, 45, 90, 135, 180, 225, 270, and 315 degrees can be performed.
- Any suitable number of basis elements (e.g., 5,000) can be used in the dictionary in some embodiments.
- The learned dictionary can capture various features, such as shifting edges in various orientations, in some embodiments.
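The training-data preparation described above (patch extraction plus forward/backward and rotational augmentation) can be sketched as below. This is an assumption about one way to organize the data: it covers only the 90-degree rotations (the 45-degree rotations would need spatial resampling, e.g. scipy.ndimage.rotate), and the K-SVD step itself is left to a library implementation:

```python
import numpy as np

def augment(volume):
    """Yield augmented copies of a (T, H, W) training volume: four
    90-degree spatial rotations, each in forward and reverse playback."""
    for k in range(4):
        rotated = np.rot90(volume, k, axes=(1, 2))
        yield rotated            # forward (normal) playback
        yield rotated[::-1]      # backward (reverse) playback

def extract_patches(volume, M, N, stride=4):
    """Flatten M x M x N space-time patches into columns, the form in
    which dictionary-learning algorithms such as K-SVD consume them."""
    T, H, W = volume.shape
    cols = [volume[t:t + N, y:y + M, x:x + M].reshape(-1)
            for t in range(0, T - N + 1, N)
            for y in range(0, H - M + 1, stride)
            for x in range(0, W - M + 1, stride)]
    return np.stack(cols, axis=1)

rng = np.random.default_rng(0)
clip = rng.random((9, 12, 12))   # stand-in for a natural high-speed clip
X = np.hstack([extract_patches(v, M=4, N=9) for v in augment(clip)])
```

Each column of X is one vectorized space-time patch; a dictionary learner would then fit D so that these columns are sparsely representable.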
- At 108, the overlapping reconstructed patches can be averaged to obtain the full space-time volume, and process 100 can then terminate at 110.
- The resulting space-time volume video can then be used in any suitable manner.
- For example, this video can be presented on a display, stored, analyzed, etc.
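The averaging of overlapping patches at 108 can be sketched as follows. The helper name and the accumulate-and-count scheme are assumptions about one straightforward implementation:

```python
import numpy as np

def average_overlapping(patches, positions, out_shape, M):
    """Accumulate reconstructed M x M x N patches at their (y, x)
    positions and divide by per-voxel counts to average the overlaps
    into a single space-time volume of shape (H, W, N)."""
    acc = np.zeros(out_shape)
    cnt = np.zeros(out_shape)
    for patch, (y, x) in zip(patches, positions):
        acc[y:y + M, x:x + M, :] += patch
        cnt[y:y + M, x:x + M, :] += 1
    return acc / np.maximum(cnt, 1)   # avoid dividing uncovered voxels by 0

# Two constant 2 x 2 x 1 patches overlapping in one column:
patches = [np.full((2, 2, 1), 2.0), np.full((2, 2, 1), 4.0)]
volume = average_overlapping(patches, [(0, 0), (0, 1)], (2, 3, 1), M=2)
```

In the overlapping column the two patch values are averaged, which suppresses patch-boundary artifacts in the assembled video.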
- As shown in FIG. 3, the hardware can include an objective lens 302, relay lenses 306, 310, and 314, a polarizing beam splitter 308, a Liquid Crystal on Silicon (LCoS) chip 312, an image sensor 316, and a computer 318.
- The scene can be imaged onto a virtual image plane 304 using objective lens 302.
- Objective lens 302 can be any suitable lens, such as an objective lens with a focal length equal to 25 mm, for example.
- The virtual image can then be re-focused onto an image plane of LCoS chip 312 via relay lenses 306 and 310 and polarizing beam splitter 308.
- LCoS chip 312 can be any suitable LCoS chip, such as an LCoS chip part number SXGA-3DM from Forth Dimension Displays Ltd. of Birmingham, UK.
- Relay lenses 306 and 310 can be any suitable lenses, such as relay lenses with focal lengths equal to 100 mm, for example.
- Polarizing beam splitter 308 can be any suitable polarizing beam splitter.
- The image formed on the image plane of LCoS chip 312 can be polarized according to the shutter function and reflected back to polarizing beam splitter 308, which can reflect the image through relay lens 314, which can focus the image on image sensor 316.
- Relay lens 314 can be any suitable relay lens, such as a relay lens with a focal length equal to 100 mm, for example.
- Image sensor 316 can be any suitable image sensor, such as a Point Grey Grasshopper sensor from Point Grey Research Inc. of Richmond, BC, Canada.
- In this configuration, the virtual image can be focused on both the image plane of the LCoS chip and the image sensor, thereby enabling per-pixel alignment between the pixels of the LCoS chip and the pixels of the image sensor.
- A trigger signal from the LCoS chip into computer 318 can be used to temporally synchronize the LCoS chip and the image sensor.
- The LCoS chip can be run at any suitable frequency. For example, in some embodiments, the LCoS chip can be run at 9-18 times the frame rate of the image sensor.
- In some embodiments, the shutter function can instead be performed by pixel-wise control of the reset and reading of the pixels in an image sensor 416 as shown in FIG. 4.
- Image sensor 416 can allow pixel-wise access by providing both row and column select lines for the pixel array.
- Image sensor 416 can be a CMOS image sensor in some embodiments. In such an embodiment, the LCoS chip, the beam splitter, and some of the lenses can be omitted.
- Computer 318 can be used to perform functions described above and any additional or alternative function(s).
- Computer 318 can be used to perform the functions described above in connection with FIGS. 1 and 2 .
- Computer 318 can be any of a general purpose device such as a computer or a special purpose device such as a client, a server, etc. Any of these general or special purpose devices can include any suitable components such as a hardware processor (which can be a microprocessor, digital signal processor, a controller, etc.), memory, communication interfaces, display controllers, input devices, etc.
- Computer 318 can be part of another device (such as a camera, a mobile phone, a computing device, a gaming device, etc.) or can be a stand-alone device.
- Any suitable computer readable media can be used for storing instructions for performing the processes described herein.
- Such computer readable media can be transitory or non-transitory.
- For example, non-transitory computer readable media can include media such as magnetic media (such as hard disks, floppy disks, etc.), optical media (such as compact discs, digital video discs, Blu-ray discs, etc.), semiconductor media (such as flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read only memory (EEPROM), etc.), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media.
- Transitory computer readable media can include signals on networks, in wires, conductors, optical fibers, or circuits, any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/001,139 US20140192235A1 (en) | 2011-02-25 | 2012-02-27 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161446970P | 2011-02-25 | 2011-02-25 | |
PCT/US2012/026816 WO2012116373A1 (en) | 2011-02-25 | 2012-02-27 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
US14/001,139 US20140192235A1 (en) | 2011-02-25 | 2012-02-27 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2012/026816 A-371-Of-International WO2012116373A1 (en) | 2011-02-25 | 2012-02-27 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/405,962 Continuation US9979945B2 (en) | 2011-02-25 | 2017-01-13 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140192235A1 true US20140192235A1 (en) | 2014-07-10 |
Family
ID=46721259
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/001,139 Abandoned US20140192235A1 (en) | 2011-02-25 | 2012-02-27 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
US15/405,962 Active US9979945B2 (en) | 2011-02-25 | 2017-01-13 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
US15/950,297 Active US10277878B2 (en) | 2011-02-25 | 2018-04-11 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/405,962 Active US9979945B2 (en) | 2011-02-25 | 2017-01-13 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
US15/950,297 Active US10277878B2 (en) | 2011-02-25 | 2018-04-11 | Systems, methods, and media for reconstructing a space-time volume from a coded image |
Country Status (4)
Country | Link |
---|---|
US (3) | US20140192235A1 (ja) |
EP (1) | EP2678740B1 (ja) |
JP (1) | JP5989681B2 (ja) |
WO (1) | WO2012116373A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9736425B2 (en) | 2009-10-28 | 2017-08-15 | Sony Corporation | Methods and systems for coded rolling shutter |
US9979945B2 (en) | 2011-02-25 | 2018-05-22 | Sony Corporation | Systems, methods, and media for reconstructing a space-time volume from a coded image |
US10070067B2 (en) | 2014-04-15 | 2018-09-04 | Sony Corporation | Systems, methods, and media for extracting information and a display image from two captured images |
US10110827B2 (en) | 2011-08-31 | 2018-10-23 | Sony Semiconductor Solutions Corporation | Imaging apparatus, signal processing method, and program |
US10148893B2 (en) | 2012-12-17 | 2018-12-04 | Sony Corporation | Methods, systems, and media for high dynamic range imaging |
US10375321B2 (en) | 2016-04-11 | 2019-08-06 | Samsung Electronics Co., Ltd | Imaging processing method and electronic device supporting the same |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6702323B2 (ja) | 2015-07-22 | 2020-06-03 | Sony Corporation | Camera module, solid-state imaging element, electronic apparatus, and imaging method |
CN109903719A (zh) * | 2017-12-08 | 2019-06-18 | Ningbo Yingxin Information Technology Co., Ltd. | Method and apparatus for generating a space-time coded structured-light coding pattern |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070103595A1 (en) * | 2005-10-27 | 2007-05-10 | Yihong Gong | Video super-resolution using personalized dictionary |
US20070104382A1 (en) * | 2003-11-24 | 2007-05-10 | Koninklijke Philips Electronics N.V. | Detection of local visual space-time details in a video signal |
US7428019B2 (en) * | 2001-12-26 | 2008-09-23 | Yeda Research And Development Co. Ltd. | System and method for increasing space or time resolution in video |
US7511643B2 (en) * | 2005-05-10 | 2009-03-31 | William Marsh Rice University | Method and apparatus for distributed compressed sensing |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6335082A (ja) * | 1986-07-30 | 1988-02-15 | Rhythm Watch Co Ltd | Light-quantity correction device for a video camera |
US6392689B1 (en) * | 1991-02-21 | 2002-05-21 | Eugene Dolgoff | System for displaying moving images pseudostereoscopically |
US7006567B2 (en) * | 2001-11-30 | 2006-02-28 | International Business Machines Corporation | System and method for encoding three-dimensional signals using a matching pursuit algorithm |
US7978929B2 (en) * | 2003-01-20 | 2011-07-12 | Nexvi Corporation | Device and method for outputting a private image using a public display |
US20060215934A1 (en) * | 2005-03-25 | 2006-09-28 | Yissum Research Development Co of the Hebrew University of Jerusalem Israeli Co | Online registration of dynamic scenes using video extrapolation |
GB0709026D0 (en) * | 2007-05-10 | 2007-06-20 | Isis Innovation | High speed imaging with slow scan cameras using pixel level dynamic shuttering |
US7812869B2 (en) * | 2007-05-11 | 2010-10-12 | Aptina Imaging Corporation | Configurable pixel array system and method |
WO2011053678A1 (en) | 2009-10-28 | 2011-05-05 | The Trustees Of Columbia University In The City Of New York | Methods and systems for coded rolling shutter |
JP2012175234A (ja) | 2011-02-18 | 2012-09-10 | Sony Corp | Imaging apparatus, imaging element, imaging control method, and program |
JP5655626B2 (ja) | 2011-02-24 | 2015-01-21 | Sony Corp | Image processing apparatus, image processing method, and program |
US20140192235A1 (en) | 2011-02-25 | 2014-07-10 | Sony Corporation | Systems, methods, and media for reconstructing a space-time volume from a coded image |
JP2012182657A (ja) | 2011-03-01 | 2012-09-20 | Sony Corp | Imaging apparatus, imaging apparatus control method, and program |
JP2012234393A (ja) | 2011-05-02 | 2012-11-29 | Sony Corp | Image processing apparatus, image processing method, and program |
JP2012257193A (ja) | 2011-05-13 | 2012-12-27 | Sony Corp | Image processing apparatus, imaging apparatus, image processing method, and program |
JP2013005017A (ja) | 2011-06-13 | 2013-01-07 | Sony Corp | Imaging apparatus, imaging apparatus control method, and program |
JP2013021660A (ja) | 2011-07-14 | 2013-01-31 | Sony Corp | Image processing apparatus, imaging apparatus, image processing method, and program |
JP2013038504A (ja) | 2011-08-04 | 2013-02-21 | Sony Corp | Imaging apparatus, image processing method, and program |
JP2013050538A (ja) | 2011-08-30 | 2013-03-14 | Sony Corp | Display device and electronic apparatus |
JP2013050537A (ja) | 2011-08-30 | 2013-03-14 | Sony Corp | Display device and electronic apparatus |
JP2013066145A (ja) | 2011-08-31 | 2013-04-11 | Sony Corp | Image processing apparatus, image processing method, and program |
JP2013066142A (ja) | 2011-08-31 | 2013-04-11 | Sony Corp | Image processing apparatus, image processing method, and program |
JP2013066140A (ja) | 2011-08-31 | 2013-04-11 | Sony Corp | Imaging apparatus, signal processing method, and program |
WO2014099320A1 (en) | 2012-12-17 | 2014-06-26 | The Trustees Of Columbia University In The City Of New York | Methods, systems, and media for high dynamic range imaging |
WO2015160967A1 (en) | 2014-04-15 | 2015-10-22 | The Trustees Of Columbia University In The City Of New York | Systems, methods, and media for extracting information and a display image from two captured images |
-
2012
- 2012-02-27 US US14/001,139 patent/US20140192235A1/en not_active Abandoned
- 2012-02-27 EP EP12749484.7A patent/EP2678740B1/en active Active
- 2012-02-27 WO PCT/US2012/026816 patent/WO2012116373A1/en active Application Filing
- 2012-02-27 JP JP2013555637A patent/JP5989681B2/ja not_active Expired - Fee Related
-
2017
- 2017-01-13 US US15/405,962 patent/US9979945B2/en active Active
-
2018
- 2018-04-11 US US15/950,297 patent/US10277878B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7428019B2 (en) * | 2001-12-26 | 2008-09-23 | Yeda Research And Development Co. Ltd. | System and method for increasing space or time resolution in video |
US20070104382A1 (en) * | 2003-11-24 | 2007-05-10 | Koninklijke Philips Electronics N.V. | Detection of local visual space-time details in a video signal |
US7511643B2 (en) * | 2005-05-10 | 2009-03-31 | William Marsh Rice University | Method and apparatus for distributed compressed sensing |
US20070103595A1 (en) * | 2005-10-27 | 2007-05-10 | Yihong Gong | Video super-resolution using personalized dictionary |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9736425B2 (en) | 2009-10-28 | 2017-08-15 | Sony Corporation | Methods and systems for coded rolling shutter |
US9979945B2 (en) | 2011-02-25 | 2018-05-22 | Sony Corporation | Systems, methods, and media for reconstructing a space-time volume from a coded image |
US10277878B2 (en) | 2011-02-25 | 2019-04-30 | Sony Corporation | Systems, methods, and media for reconstructing a space-time volume from a coded image |
US10110827B2 (en) | 2011-08-31 | 2018-10-23 | Sony Semiconductor Solutions Corporation | Imaging apparatus, signal processing method, and program |
US10148893B2 (en) | 2012-12-17 | 2018-12-04 | Sony Corporation | Methods, systems, and media for high dynamic range imaging |
US10070067B2 (en) | 2014-04-15 | 2018-09-04 | Sony Corporation | Systems, methods, and media for extracting information and a display image from two captured images |
US10375321B2 (en) | 2016-04-11 | 2019-08-06 | Samsung Electronics Co., Ltd | Imaging processing method and electronic device supporting the same |
Also Published As
Publication number | Publication date |
---|---|
US20170134706A1 (en) | 2017-05-11 |
JP2014511531A (ja) | 2014-05-15 |
US10277878B2 (en) | 2019-04-30 |
EP2678740A1 (en) | 2014-01-01 |
EP2678740A4 (en) | 2017-05-31 |
WO2012116373A1 (en) | 2012-08-30 |
JP5989681B2 (ja) | 2016-09-07 |
US20180234672A1 (en) | 2018-08-16 |
US9979945B2 (en) | 2018-05-22 |
EP2678740B1 (en) | 2020-12-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10277878B2 (en) | Systems, methods, and media for reconstructing a space-time volume from a coded image | |
US10277914B2 (en) | Measuring spherical image quality metrics based on user field of view | |
CN108141610B (zh) | Method and device for encoding and decoding light-field-based images |
CN108322650A (zh) | Video capture method and apparatus, electronic device, and computer-readable storage medium |
US11659294B2 (en) | Image sensor, imaging apparatus, electronic device, image processing system, and signal processing method | |
US11438503B2 (en) | Plenoptic sub aperture view shuffling with improved resolution | |
CN111369443A (zh) | Cross-scale zero-shot learning super-resolution method for light fields |
KR102437510B1 (ko) | Plenoptic sub-aperture view shuffling for richer color sampling |
Meinel et al. | Effective display resolution of 360 degree video footage in virtual reality | |
US20100302403A1 (en) | Generating Images With Different Fields Of View | |
Li et al. | Compressive imaging beyond the sensor's physical resolution via coded exposure combined with time-delay integration | |
CN114040184A (zh) | Image display method and system, storage medium, and computer program product |
CN116033138B (zh) | Single-exposure compressive-sensing passive three-dimensional imaging system and method |
Ge et al. | Real-time Night Surveillance Video Retrieval through Calibrated Denoising and Super-resolution | |
US20240114118A1 (en) | Multi-Viewpoint Image Processing System and Method Thereof | |
RU2379726C1 (ru) | Method for obtaining and reconstructing a volumetric image |
CN113870165A (zh) | Image synthesis method and apparatus, electronic device, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HITOMI, YASUNOBU;MITSUNAGA, TOMOO;SIGNING DATES FROM 20131002 TO 20131004;REEL/FRAME:032605/0287 Owner name: THE TRUSTEES OF COLUMBIA UNIVERSITY IN THE CITY OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GU, JINWEI;GUPTA, MOHIT;NAYAR, SHREE K.;SIGNING DATES FROM 20131030 TO 20140226;REEL/FRAME:032605/0326 |
|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:THE TRUSTEES OF COLUMBIA UNIVERSITY;REEL/FRAME:040187/0545 Effective date: 20161025 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |