WO2012089895A1 - Rolling shutter compensation in a camera array - Google Patents

Rolling shutter compensation in a camera array

Info

Publication number
WO2012089895A1
WO2012089895A1 PCT/FI2010/051103
Authority
WO
WIPO (PCT)
Prior art keywords
cameras
adjacent
images
sub
rolling shutter
Prior art date
Application number
PCT/FI2010/051103
Other languages
English (en)
Inventor
Tommi Ilmonen
Original Assignee
Multitouch Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Multitouch Oy filed Critical Multitouch Oy
Priority to PCT/FI2010/051103 priority Critical patent/WO2012089895A1/fr
Publication of WO2012089895A1 publication Critical patent/WO2012089895A1/fr

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/53Control of the integration time
    • H04N25/531Control of the integration time by controlling rolling shutters in CMOS SSIS

Definitions

  • the present invention generally relates to rolling shutter compensation in a camera array.
  • CMOS complementary metal-oxide-semiconductor
  • the slanting is not, however, particularly disturbing under normal circumstances when a single camera is used.
  • different cameras in the array may relatively easily cause very surprising and disturbing effects in which some parts of an image object are substituted by a surrounding part of the image object.
  • the camera arrays are also technically complex systems, as individual camera units have to be made to work together.
  • an apparatus comprising:
  • an array of rolling shutter cameras that are configured to capture overlapping or adjacent sub-images, which sub-images collectively form a full image
  • a synchronizing circuitry configured to control the adjacent cameras to capture images continuously taking into account the rolling shutter such that line scan is performed with a substantially constant delay between each of a plurality of lines scanned out from the adjacent cameras.
  • the adjacent cameras may be oriented in a common scan direction.
  • the synchronizing circuitry may be configured to command each of the adjacent cameras to start exposing an image frame in a cascade with such a delay in the scan direction that lines are scanned by the adjacent cameras so that the full image is formed in a consistent progression.
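The cascaded triggering described above can be sketched as follows. This is an illustrative model only; the function name and the parameter values (line count, per-line scan time) are assumptions rather than figures from the application, and the trigger-to-scan-out latency is ignored here.

```python
def trigger_schedule(num_cameras, lines_per_camera, line_time_us):
    """Return a trigger time (in microseconds) for each camera in the scan
    direction so that line scan-out progresses with a constant per-line
    delay across the whole array, forming the full image in one
    consistent progression."""
    # Time for one camera to scan out its entire sub-image.
    scan_out_period = lines_per_camera * line_time_us
    # Each camera starts exactly when the previous one scans its last line.
    return [i * scan_out_period for i in range(num_cameras)]

# Example: 8 cameras per row, 480 lines per sub-image, 30 us per line.
schedule = trigger_schedule(8, 480, 30)
```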
  • the synchronizing circuitry may comprise a field-programmable gate array (FPGA) that is communicatively connected with at least two of the cameras.
  • FPGA field-programmable gate array
  • the synchronizing circuitry may comprise a single field-programmable gate array (FPGA) that is communicatively connected with at least two of the cameras.
  • the single field-programmable gate array may be communicatively connected with all of the cameras
  • the synchronizing circuitry may comprise a plurality of output ports configured to control respective trigger ports of each of the cameras.
  • the apparatus may further comprise a data bus communicatively connected with the synchronizing circuitry.
  • the synchronizing circuitry may be configured to receive image data from one or more of the cameras at a time and to pass the received image data to the data bus without prior buffering on an external buffer circuitry.
  • the apparatus may enable a significantly simplified construction in which separate memory buffers may be omitted without necessitating the use of very fast data buses.
  • the cameras may be infra-red cameras.
  • the apparatus may further comprise a touch detection circuitry configured to detect touching of a touching surface based on the full image. It may be particularly advantageous to compensate for rolling shutter in an apparatus detecting touching, where continuous images may be of major importance.
  • controlling said adjacent cameras of the array to capture images continuously taking into account the rolling shutter such that line scan is performed with a substantially constant delay between each of a plurality of lines scanned out from the adjacent cameras.
  • a computer program configured, when executed by a computer, to cause performance of a method according to the second aspect of the invention.
  • a computer readable memory medium embodied with a computer program which when executed by a computer causes a computer to perform a method according to the second aspect of the invention.
  • a computer program product comprising a non-transitory computer readable medium having computer executable program code stored thereon, which when executed by at least one processor causes an apparatus at least to perform a method according to the second aspect of the invention.
  • Fig. 1 shows a block diagram of a system according to an embodiment of the invention
  • Fig. 2 shows a simplified block diagram of the structure of a control unit shown in Fig. 1 according to an embodiment of the invention
  • Fig. 3a shows an example of distribution of a full image into a plurality of sub-images
  • Fig. 3c shows a schematic flow chart according to an embodiment of the invention.
  • Fig. 4 shows a system to illustrate further details regarding possible circuitries suited e.g. for operation as shown in Figs. 3a, 3b and/or 3c.
  • Fig. 1 shows a block diagram of a system 100 according to an embodiment of the invention.
  • the system 100 comprises a plurality of cameras 101 arranged in an array (here only three shown), sub-images or sub-image areas 102 of the cameras 101, a control unit 103 that controls the operation of the cameras 101 and a data bus 104 connected with the control unit 103.
  • the terms sub-image and sub-image area are used interchangeably.
  • the sub-images 102 are adjacent such that the adjacent sub-images form a continuous full image without overlap between different sub-images.
  • the sub-images 102 do overlap: this is particularly the case when more accurate images are desired by imaging a common region by two or more overlapping sub-images 102, which technique is also referred to as super-resolution imaging.
  • the cameras 101 are rolling shutter cameras such as complementary metal-oxide-semiconductor (CMOS) cameras that produce the image line-by-line (of image pixels) with a minute delay between each line.
  • CMOS complementary metal-oxide-semiconductor
  • the cameras typically produce square or rectangular sub-images 102.
  • the array typically conforms to a regular matrix, but other alignments are also provided for in other embodiments. If the cameras have common or individually differing overlaps, it is advantageous to account for this by storing a measure of the overlap and by adjusting the timing at which each of the cameras 101 is triggered, so that the sub-images 102 are formed on each camera 101 such that, at any moment of time, the lines being scanned have a substantially linear correspondence to the full image and thus to the object that is being imaged.
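The overlap compensation just described can be sketched minimally: shortening each trigger delay by the stored overlap keeps the scanned lines in linear correspondence to the full image. The function name and the numbers below are hypothetical, not taken from the application.

```python
def trigger_delays_with_overlap(lines_per_camera, line_time_us, overlap_lines):
    """Per-camera trigger delays (relative to the previous camera in the
    scan direction) when camera i and camera i+1 share overlap_lines[i]
    lines of the full image.  With zero overlap this reduces to one full
    scan-out period per camera."""
    delays = []
    for ov in overlap_lines:
        # Trigger the next camera earlier by the overlap, so that the
        # shared full-image lines are scanned at substantially the same
        # moment by both cameras.
        delays.append((lines_per_camera - ov) * line_time_us)
    return delays

# Example: 480-line sub-images, 30 us per line; second pair overlaps by 16 lines.
delays = trigger_delays_with_overlap(480, 30, [0, 16])
```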
  • Fig. 2 shows a simplified block diagram of the structure of the control unit 103.
  • the control unit 103 may be based on, for example, a general purpose computer supplied with suitable software and/or on a particularly adapted computing device. While it is possible to implement the control unit 103 as a purely hardware-based device, it is typically more economical and faster to produce by making use of software.
  • control unit 103 is drawn to comprise a memory 201 that comprises a work memory 202, a non-volatile memory 203 that is configured to store software 204, and settings 206 needed e.g. for manual or automatic calibration of the system 100.
  • the software 204 may comprise any one or more of the following items: operating system, device drivers, display presentation application, hypertext markup language parser, image processing software, and drivers for different external equipment that may be connected to the system such as printers, further displays, further interactive systems 100, audio systems, and external IR illumination equipment (not shown).
  • the control unit 103 further comprises a processor 207 (such as a field-programmable gate array, FPGA) configured to control the operation of the control unit 103 according to the software 204 by executing computer executable program code contained by the software in the work memory 202.
  • the control unit 103 may be configured to execute the software in place in the non-volatile memory, in which case the work memory may not be necessary.
  • the control unit 103 further comprises an input/output unit (I/O) 208 for exchanging signals with other elements of the system 100 and optionally also with external equipment.
  • the I/O 208 may comprise e.g.
  • the system 100 may be provided with a transferable memory reception unit 209 such as a CD-ROM or DVD-ROM drive, memory card reader or memory stick reader, which enables replacing part of the non-volatile memory.
  • control unit may consist of one separate unit
  • control unit 103 may alternatively be integrated with any other element or comprise two or more discrete elements, each for one or more of the aforementioned acts.
  • Fig. 3a shows an example of distribution of a full image into a plurality of sub-images 102, also referred to as sub-regions.
  • the sub-regions 102 are drawn to form an 8 x 4 matrix and denoted with reference signs 1-1 to 4-8.
  • a rolling shutter direction or scanning direction 300 is shown, here from left to right, implying that camera scan-lines are aligned vertically.
  • Fig. 3b shows a timing chart that illustrates how scan-out periods of adjacent rolling shutter cameras 101 are synchronized in one example embodiment.
  • Fig. 3b shows the timing of some main events for three adjacent cameras 101 in the rolling shutter direction 300.
  • Fig. 3b is not drawn to scale.
  • Fig. 3b clearly shows how the scan-out periods continuously progress as a function of time. After the eight cameras 101 on each row (in the Fig. 3a embodiment), the first camera 101 would be triggered again so that its scan-out period starts where the scan-out period of the eighth camera 101 ends.
  • the cameras 101 on other rows are preferably synchronized with the first row so that the cameras 101 on each column of the matrix shown in Fig. 3a are triggered simultaneously.
  • Fig. 3b is based on the assumption that the sub-images or regions 102 are adjacent and non-overlapping. If there is overlap, the timing is preferably adjusted for the cameras 101 so that such lines, that overlap in the full image, are scanned out substantially simultaneously from all cameras against the rolling shutter direction 300.
  • the frame-capture signal is sent at a moment of time t1. Actual frame capture starts at a moment of time t2, with the time between the moments t1 and t2 (if any) being spent in preparing the image capture. Image exposure takes place between moments of time t2 and t3, with the image scan-out happening between moments of time t3 and t4.
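Given the t1-t4 sequence above, the moment for sending the frame-capture signal can be back-computed from the desired scan-out start. The sketch below only illustrates that arithmetic; the setup and exposure durations are assumed values, not figures from the application.

```python
def frame_capture_time(desired_scanout_start_us, setup_us, exposure_us):
    """Return t1, the moment to send the frame-capture signal, so that
    scan-out starts at t3 = desired_scanout_start_us.  Preparation runs
    t1..t2 and exposure runs t2..t3, as in the timing chart."""
    t3 = desired_scanout_start_us
    t2 = t3 - exposure_us  # exposure must begin this early
    t1 = t2 - setup_us     # frame-capture signal must be sent this early
    return t1

# For a second camera whose scan-out should begin right as the first
# camera's 14400 us scan-out period ends (setup 100 us, exposure 5000 us):
t1_second = frame_capture_time(14400, 100, 5000)
```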
  • Fig. 3c shows a schematic flow chart according to an embodiment of the invention.
  • in step 301 of the process, the camera units of the first column (in the embodiment of Fig. 3a) are activated for capturing first sub-images 102, which in the Fig. 3a example means sub-regions 1-1, 2-1, 3-1 and 4-1.
  • in step 302, the following sub-regions are selected and triggered like the first sub-regions in step 301, in the rolling shutter direction 300. It is understood that more (or fewer) than two sub-regions may be processed simultaneously. Moreover, the sub-regions need not be processed exactly simultaneously; the exact timing of parallel processing of different sub-regions may be partly random, or there may be a given time offset such as 1/10 or 1/100 of the scan-out period. Such offsetting may help in feeding all the image data to the data bus 104, e.g. as interlaced bursts. Of course, in some other embodiments the scanning may proceed along columns rather than rows.
  • the operation is preferably advanced from one sub-region 102 to the following one in the rolling shutter direction 300 in a continuous manner, without a disturbingly different interval between neighboring (outermost) lines of two cameras 101. Therefore, there should be no holes or other discontinuities in images formed by combining plural adjacent sub-images, unlike with current systems in which rolling shutter images may suffer from temporal disruptions.
  • in step 304, the full image is formed from all the different sub-images 102.
  • the mutual alignment of the different sub-images 102 is accounted for. For instance, stored alignment data (e.g. information that identifies overlapping portions of different sub-images 102) is used in one embodiment to determine how the full image is put together. In another embodiment, however, the mutual alignment of different sub-images 102 is determined entirely or in part based on the image information of the different sub-images 102. This determining of mutual alignment may be implemented e.g. by searching for the mutual alignment with the least difference between adjacent pixels of two sub-images along the adjoining edge of one of the sub-images.
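The edge-based alignment search in the last sentence can be sketched with NumPy. This is a minimal, hypothetical version that compares only the single adjoining pixel columns of two sub-images and searches a small range of vertical shifts for the least mean absolute difference:

```python
import numpy as np

def best_edge_alignment(left, right, max_shift=8):
    """Find the vertical shift of `right` (relative to `left`) that
    minimizes the mean absolute difference between the rightmost column
    of `left` and the leftmost column of `right`."""
    left_edge = left[:, -1].astype(np.int32)
    right_edge = right[:, 0].astype(np.int32)
    best_shift, best_cost = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        shifted = np.roll(right_edge, shift)
        # Compare only rows unaffected by the roll's wrap-around.
        lo, hi = max(shift, 0), len(left_edge) + min(shift, 0)
        cost = np.abs(left_edge[lo:hi] - shifted[lo:hi]).mean()
        if cost < best_cost:
            best_shift, best_cost = shift, cost
    return best_shift
```

A full implementation would compare wider overlap bands and possibly search horizontal shifts too; the single-column comparison is kept deliberately small here.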
  • Fig. 4 shows a system 410 to illustrate further details regarding possible circuitries suited e.g. for operation as shown in Figs. 3a, 3b and/or 3c.
  • Fig. 4 also makes clear how the circuitry may be greatly simplified. This simplification takes advantage of the rolling shutter nature of the cameras 101: the cameras 101 produce image pixels for different parts of the image at slightly different moments of time.
  • some of the cameras 101 are laid out on one or more circuit boards to form an array.
  • one or more cameras 101 are connected as one common camera unit 420.
  • the common camera unit 420 comprises a field-programmable gate array (FPGA) 422 that is communicatively connected with each of the cameras 101 of the common camera unit 420.
  • the FPGAs 422 are configured to synchronise the cameras 101 so that the image formed by adjacent cameras 101 is formed continuously, taking into account the rolling shutter. That is, in Fig. 4, the topmost cameras 101 first start exposing a first sub-image 102.
  • the FPGAs 422 of the top-most camera units 420 start substantially simultaneously scanning out these sub-images 102 from the first (top-most) cameras 101.
  • the FPGAs 422 also trigger the second cameras 101 (downward in Fig. 4) so that the scan continues downward without discontinuity.
  • the system 410 further comprises one or more common camera unit data buses 430 through which the FPGA 422 passes the data stream for subsequent processing.
  • the common camera unit data buses 430 are data channels that are configured to be capable of transporting image data from all of the FPGAs 422 connected thereto.
  • each data bus should simultaneously convey the image data of two cameras 101.
  • a single FPGA 422 may be configured to control all the cameras 101 .
  • An interfacing unit 440 is connected to each of the common camera unit data buses and configured to pass all the image data and necessary metadata.
  • the metadata comprises e.g. an identification of the camera 101 from which the image data in question comes.
  • the metadata is typically provided by the FPGA 422.
  • the interfacing unit 440 may comprise a coordinating processor 442 (such as an FPGA circuitry, central processing unit, digital signal processor or the like), a data input port 444 and possibly a control output port 446 for controlling the FPGAs 422.
  • the interfacing unit further comprises a data output port 448 which may comprise a relatively small buffer memory 4482, e.g. to allow retransmissions should data be corrupted over a connection 450 between the system 410 and an auxiliary device that receives the image data (such as a computer, not shown).
  • when data is scanned out from a camera 101, the associated FPGA 422 receives and forwards that data, potentially together with an identification of the camera 101, and the interfacing unit 440 further passes on the image data it receives from the different FPGAs 422. All of this may take place without a need to buffer and re-buffer the scanned-out image data.
  • Fig. 4 exemplifies some factors related to the design of the system 410. The fewer FPGAs 422 there are, the higher the number of data ports needed at the FPGAs 422 and the more complex the wiring may become, but the overall costs may still be reduced by reducing the number of FPGAs 422.
  • also, the fewer FPGAs 422 there are per simultaneously scanned cameras 101, the faster the data bus or buses 430 have to be.
  • the system 410 is implemented so that the data bus 430 is preferably fast enough to relay image data from all the simultaneously scanned cameras 101 , or if that is not possible, there is the smallest possible number of data buses 430 that suffices for this task. Then, a single FPGA 422 is provided per data bus and common camera units are formed of all the cameras 101 that shall communicate via each data bus 430.
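The bus-speed trade-off described above amounts to simple arithmetic: the required bus bandwidth grows with the number of simultaneously scanned cameras sharing one bus. A sketch with assumed figures (line width, line time and pixel depth are illustrative, not from the application):

```python
def required_bus_bandwidth(cameras_scanned_simultaneously,
                           pixels_per_line, line_time_us, bytes_per_pixel=1):
    """Minimum data-bus bandwidth (bytes/s) needed to relay scan-out data
    from all simultaneously scanned cameras on one bus without any
    external frame buffering."""
    pixel_rate_per_camera = pixels_per_line / (line_time_us * 1e-6)  # pixels/s
    return cameras_scanned_simultaneously * pixel_rate_per_camera * bytes_per_pixel

# Example: two cameras scanned at once per bus (as in the example above),
# 640-pixel lines, 30 us per line, 1 byte per pixel:
bw = required_bus_bandwidth(2, 640, 30)  # roughly 42.7 million bytes/s
```

Halving the number of buses (doubling the cameras per bus) doubles this figure, which is the trade-off against FPGA count discussed above.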
  • the common camera units may altogether lack memory buffers, whereas in a normal implementation, each of the FPGAs 422 would be associated with a memory buffer large enough to store at least one entire image frame.
  • the system 410 can thus be simplified, with the advantages that manufacturing and maintenance become cheaper and faster, and debugging of possible problems is also improved in comparison to an alternative implementation in which the memory buffers are provided.
  • Such an alternative implementation may be more advantageous e.g. when fast data buses are not desired for any reason
  • the words comprise, include and contain are each used as open-ended expressions with no intended exclusivity.
  • the term light is here interchangeable with radiation. While infrared light has on occasion been used, this terminology is merely for convenience of explanation; the term light is not intended to imply suitability for perception by means of a human eye.
  • the foregoing description has provided by way of non-limiting examples of particular implementations and embodiments of the invention a full and informative description of the best mode presently contemplated by the inventors for carrying out the invention. It is however clear to a person skilled in the art that the invention is not restricted to details of the embodiments presented above, but that it can be implemented in other embodiments using equivalent means without deviating from the characteristics of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

According to the present invention, the effects of a rolling shutter are compensated in a camera array provided with rolling shutter cameras (101) that are configured to capture overlapping or adjacent sub-images (102), said sub-images (102) collectively forming a full image of the captured sub-images (102). A synchronizing circuitry (103) controls the adjacent cameras (101) to capture images continuously, taking the rolling shutter into account such that line scan is performed with a substantially constant delay between each of a plurality of lines scanned out from the adjacent cameras (101).
PCT/FI2010/051103 2010-12-31 2010-12-31 Rolling shutter compensation in a camera array WO2012089895A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/FI2010/051103 WO2012089895A1 (fr) 2010-12-31 2010-12-31 Rolling shutter compensation in a camera array

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2010/051103 WO2012089895A1 (fr) 2010-12-31 2010-12-31 Rolling shutter compensation in a camera array

Publications (1)

Publication Number Publication Date
WO2012089895A1 (fr) 2012-07-05

Family

ID=46382359

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2010/051103 WO2012089895A1 (fr) 2010-12-31 2010-12-31 Rolling shutter compensation in a camera array

Country Status (1)

Country Link
WO (1) WO2012089895A1 (fr)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103581531A (zh) * 2012-07-20 2014-02-12 万里科技股份有限公司 Device for expanding and controlling camera shutter cable signals through an isolated switch
WO2015058157A1 (fr) 2013-10-18 2015-04-23 The Lightco Inc. Image capture control methods and apparatus
WO2015150619A1 (fr) * 2014-04-03 2015-10-08 Nokia Technologies Oy Apparatus, method and computer program for obtaining images
EP2946336A2 (fr) * 2013-01-15 2015-11-25 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
US20170289482A1 (en) * 2016-03-30 2017-10-05 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus
EP3058713A4 (fr) * 2013-10-18 2017-11-15 The Lightco Inc. Image capture control methods and apparatus
US9857584B2 (en) 2015-04-17 2018-01-02 Light Labs Inc. Camera device methods, apparatus and components
US9955082B2 (en) 2013-10-18 2018-04-24 Light Labs Inc. Methods and apparatus for capturing images using optical chains and/or for using captured images
US9967535B2 (en) 2015-04-17 2018-05-08 Light Labs Inc. Methods and apparatus for reducing noise in images
US9998638B2 (en) 2014-12-17 2018-06-12 Light Labs Inc. Methods and apparatus for implementing and using camera devices
US10003738B2 (en) 2015-12-18 2018-06-19 Light Labs Inc. Methods and apparatus for detecting and/or indicating a blocked sensor or camera module
US10009530B2 (en) 2013-10-18 2018-06-26 Light Labs Inc. Methods and apparatus for synchronized image capture using camera modules with different focal lengths
US10051182B2 (en) 2015-10-05 2018-08-14 Light Labs Inc. Methods and apparatus for compensating for motion and/or changing light conditions during image capture
US10091447B2 (en) 2015-04-17 2018-10-02 Light Labs Inc. Methods and apparatus for synchronizing readout of multiple image sensors
US10129483B2 (en) 2015-06-23 2018-11-13 Light Labs Inc. Methods and apparatus for implementing zoom using one or more moveable camera modules
US10225445B2 (en) 2015-12-18 2019-03-05 Light Labs Inc. Methods and apparatus for providing a camera lens or viewing point indicator
US10365480B2 (en) 2015-08-27 2019-07-30 Light Labs Inc. Methods and apparatus for implementing and/or using camera devices with one or more light redirection devices
US10491806B2 (en) 2015-08-03 2019-11-26 Light Labs Inc. Camera device control related methods and apparatus
US10516834B2 (en) 2015-10-06 2019-12-24 Light Labs Inc. Methods and apparatus for facilitating selective blurring of one or more image portions
US10670858B2 (en) 2017-05-21 2020-06-02 Light Labs Inc. Methods and apparatus for maintaining and accurately determining the position of a moveable element
US10931866B2 (en) 2014-01-05 2021-02-23 Light Labs Inc. Methods and apparatus for receiving and storing in a camera a user controllable setting that is used to control composite image generation performed after image capture

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60172888A (ja) * 1984-02-17 1985-09-06 Matsushita Electric Ind Co Ltd Wide-angle television camera
US6937270B1 (en) * 1999-05-03 2005-08-30 Omnivision Technologies, Inc. Analog video monitoring system using a plurality of phase locked CMOS image sensors
US20070030342A1 (en) * 2004-07-21 2007-02-08 Bennett Wilburn Apparatus and method for capturing a scene using staggered triggering of dense camera arrays
WO2007045714A1 (fr) * 2005-10-21 2007-04-26 Nokia Corporation Method and device for reducing motion distortion in digital imaging
US20090201361A1 (en) * 2008-02-08 2009-08-13 Google Inc. Panoramic Camera With Multiple Image Sensors Using Timed Shutters

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103581531A (zh) * 2012-07-20 2014-02-12 万里科技股份有限公司 Device for expanding and controlling camera shutter cable signals through an isolated switch
US10764517B2 (en) 2013-01-15 2020-09-01 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
US10200638B2 (en) 2013-01-15 2019-02-05 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
EP2946336A2 (fr) * 2013-01-15 2015-11-25 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
EP2946336B1 (fr) * 2013-01-15 2023-06-21 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
US10048472B2 (en) 2013-10-18 2018-08-14 Light Labs Inc. Methods and apparatus for implementing and/or using a camera device
US10120159B2 (en) 2013-10-18 2018-11-06 Light Labs Inc. Methods and apparatus for supporting zoom operations
US9955082B2 (en) 2013-10-18 2018-04-24 Light Labs Inc. Methods and apparatus for capturing images using optical chains and/or for using captured images
US10274706B2 (en) 2013-10-18 2019-04-30 Light Labs Inc. Image capture control methods and apparatus
US10205862B2 (en) 2013-10-18 2019-02-12 Light Labs Inc. Methods and apparatus relating to a camera including multiple optical chains
US10009530B2 (en) 2013-10-18 2018-06-26 Light Labs Inc. Methods and apparatus for synchronized image capture using camera modules with different focal lengths
US10038860B2 (en) 2013-10-18 2018-07-31 Light Labs Inc. Methods and apparatus for controlling sensors to capture images in a synchronized manner
EP3058713A4 (fr) * 2013-10-18 2017-11-15 The Lightco Inc. Image capture control methods and apparatus
WO2015058157A1 (fr) 2013-10-18 2015-04-23 The Lightco Inc. Image capture control methods and apparatus
US10931866B2 (en) 2014-01-05 2021-02-23 Light Labs Inc. Methods and apparatus for receiving and storing in a camera a user controllable setting that is used to control composite image generation performed after image capture
WO2015150619A1 (fr) * 2014-04-03 2015-10-08 Nokia Technologies Oy Apparatus, method and computer program for obtaining images
US9998638B2 (en) 2014-12-17 2018-06-12 Light Labs Inc. Methods and apparatus for implementing and using camera devices
US10091447B2 (en) 2015-04-17 2018-10-02 Light Labs Inc. Methods and apparatus for synchronizing readout of multiple image sensors
US9857584B2 (en) 2015-04-17 2018-01-02 Light Labs Inc. Camera device methods, apparatus and components
US9967535B2 (en) 2015-04-17 2018-05-08 Light Labs Inc. Methods and apparatus for reducing noise in images
US10129483B2 (en) 2015-06-23 2018-11-13 Light Labs Inc. Methods and apparatus for implementing zoom using one or more moveable camera modules
US10491806B2 (en) 2015-08-03 2019-11-26 Light Labs Inc. Camera device control related methods and apparatus
US10365480B2 (en) 2015-08-27 2019-07-30 Light Labs Inc. Methods and apparatus for implementing and/or using camera devices with one or more light redirection devices
US10051182B2 (en) 2015-10-05 2018-08-14 Light Labs Inc. Methods and apparatus for compensating for motion and/or changing light conditions during image capture
US10516834B2 (en) 2015-10-06 2019-12-24 Light Labs Inc. Methods and apparatus for facilitating selective blurring of one or more image portions
US10225445B2 (en) 2015-12-18 2019-03-05 Light Labs Inc. Methods and apparatus for providing a camera lens or viewing point indicator
US10003738B2 (en) 2015-12-18 2018-06-19 Light Labs Inc. Methods and apparatus for detecting and/or indicating a blocked sensor or camera module
US10044963B2 (en) * 2016-03-30 2018-08-07 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus
US20170289482A1 (en) * 2016-03-30 2017-10-05 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus
US10670858B2 (en) 2017-05-21 2020-06-02 Light Labs Inc. Methods and apparatus for maintaining and accurately determining the position of a moveable element

Similar Documents

Publication Publication Date Title
WO2012089895A1 (fr) Rolling shutter compensation in a camera array
US10237506B2 (en) Image adjustment apparatus and image sensor for synchronous image and asynchronous image
RU2570354C2 (ru) Захват и отображение изображений в реальном времени
TWI523516B (zh) 電視牆
US7525576B2 (en) Method and apparatus for panning and tilting a camera
US10645258B2 (en) Multi-camera system, method of controlling a multi-camera system, and camera
US9270907B2 (en) Radiation imaging apparatus, control method for radiation imaging apparatus, and storage medium
US20130235149A1 (en) Image capturing apparatus
JP2007079028A (ja) Projection-type image display device and multi-projection system
US20030222987A1 (en) Line scan image recording device with internal system for delaying signals from multiple photosensor arrays
US7619195B2 (en) Imaging device driver, imaging device driving method, and image signal processor
US20170046843A1 (en) Method, Apparatus and System for Detecting Location of Laser Point on Screen
JP7466229B2 (ja) Inspection device for press parts and inspection method for press parts
JP5824278B2 (ja) Image processing apparatus
US20230228691A1 (en) Smart synchronization method of a web inspection system
US7705910B2 (en) Photographic device for obtaining a plurality of images at a time by rolling shutter method
US9075482B2 (en) Optical touch display
JP2013048333A (ja) Image processing apparatus, image processing method, and image processing system
US20060214087A1 (en) Imaging device and method, and imaging controlling apparatus and method
US10212405B2 (en) Control apparatus and method
JP2014175931A (ja) Imaging system, imaging apparatus and control method thereof
US20060215050A1 (en) Driving controlling method for image sensing device, and imaging device
JP4320780B2 (ja) Imaging apparatus
US20190052829A1 (en) Image pickup apparatus and method utilizing the same line rate for upscaling and outputting image
US7839433B2 (en) Imaging apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10861460

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10861460

Country of ref document: EP

Kind code of ref document: A1