EP3507974B1 - Motion triggered gated imaging - Google Patents

Motion triggered gated imaging

Info

Publication number
EP3507974B1
Authority
EP
European Patent Office
Prior art keywords
capture unit
sensor
image capture
motion
digital image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP17761684.4A
Other languages
German (de)
French (fr)
Other versions
EP3507974A1 (en)
Inventor
Christian Mäkelä
Ossi Pirinen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC
Publication of EP3507974A1
Application granted
Publication of EP3507974B1
Legal status: Active

Classifications

    • H: ELECTRICITY > H04: ELECTRIC COMMUNICATION TECHNIQUE > H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/684 - Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time
    • H04N23/6845 - Vibration or motion blur correction performed by controlling the image sensor readout by combination of a plurality of images sequentially taken
    • H04N23/687 - Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • H04N23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N23/743 - Bracketing, i.e. taking a series of images with varying exposure conditions
    • H04N25/587 - Control of the dynamic range involving two or more exposures acquired sequentially, e.g. using the combination of odd and even image fields

Definitions

  • An embodiment of a digital image capture unit comprises a gated image sensor configured to operate multiple sensor exposure events per a single image frame readout; a motion monitor configured to monitor motion related to the digital image capture unit; and a controller configured to instruct the gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • the motion related to the digital image capture unit comprises motion of an object in a scene to be captured during the image frame readout.
  • the motion related to the digital image capture unit comprises motion of the digital image capture unit.
  • the motion monitor comprises an image stabilizer configured to stabilize image frames by compensating for the monitored motion of the digital image capture unit.
  • the image stabilizer comprises at least one motion sensor configured to detect the motion of the digital image capture unit.
  • At least one motion sensor comprises a gyroscope.
  • the digital image capture unit further comprises a lens system.
  • the image stabilizer further comprises at least one actuator configured to shift one of the gated image sensor and the lens system in order to compensate for the detected motion of the digital image capture unit; and at least one position feedback sensor configured to measure the movement of the shifted one of the gated image sensor and the lens system.
  • At least one position feedback sensor comprises a Hall effect sensor.
  • the motion requirement comprises the difference between the detected motion of the digital image capture unit and the measured movement of the shifted one of the gated image sensor and the lens system staying below a threshold.
  • the threshold is increased over time during the image frame readout.
  • the controller comprises an image stabilizer driver.
  • the gated image sensor is further configured to operate in a global shutter mode.
  • An embodiment of a method comprises operating, by a gated image sensor of a digital image capture unit, multiple sensor exposure events per a single image frame readout; monitoring, by a motion monitor of the digital image capture unit, motion related to the digital image capture unit; and instructing, by a controller of the digital image capture unit, the gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • the motion related to the digital image capture unit comprises motion of an object in a scene to be captured during the image frame readout.
  • the motion related to the digital image capture unit comprises motion of the digital image capture unit.
  • the method further comprises stabilizing image frames by compensating for the monitored motion of the digital image capture unit.
  • the stabilizing of the image frames comprises detecting, by at least one motion sensor of the digital image capture unit, motion of the digital image capture unit.
  • the stabilizing of the image frames further comprises shifting, by at least one actuator of the digital image capture unit, one of the gated image sensor and a lens system of the digital image capture unit in order to compensate for the detected motion of the digital image capture unit; and measuring, by at least one position feedback sensor of the digital image capture unit, the movement of the shifted one of the gated image sensor and the lens system.
  • the motion requirement comprises the difference between the detected motion of the digital image capture unit and the measured movement of the shifted one of the gated image sensor and the lens system staying below a threshold.
  • the method further comprises increasing the threshold over time during the image frame readout.
  • An embodiment of an electronic apparatus comprises a digital image capture unit comprising a gated image sensor configured to operate multiple sensor exposure events per a single image frame readout; a motion monitor configured to monitor motion related to the digital image capture unit; and a controller configured to instruct the gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • FIGS. 1A-1B constitute exemplary means for operating multiple sensor exposure events per a single image frame readout, exemplary means for monitoring motion related to a digital image capture unit, and exemplary means for instructing a gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • 'computer' or 'computing-based device' is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms 'computer' and 'computing-based device' each include mobile telephones (including smart phones), tablet computers and many other devices.
  • the processes described herein may be performed by software in machine readable form on a tangible storage medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the processes described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium.
  • tangible storage media include disks, thumb drives, memory etc. and do not include propagated signals.
  • the software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
  • a remote computer may store an example of the process described as software.
  • a local or terminal computer may access the remote computer and download a part or all of the software to run the program.
  • the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network).
  • Alternatively, all or a portion of the software instructions may be carried out by a dedicated circuit such as a digital signal processor (DSP), programmable logic array, or the like.
  • the functionality described herein can be performed, at least in part, by one or more hardware logic components.
  • illustrative types of hardware logic components include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Description

    BACKGROUND
  • Digital cameras may need extended exposure times, e.g. in low light conditions. Images captured with extended exposure times are prone to blur caused by camera movement, such as hand-shaking motion. This camera movement can be partially compensated for by an optical image stabilization system. In some instances, the camera movement compensation by the optical image stabilization system may have limitations which make capturing sharp, well-exposed images with a hand-held device in low light conditions difficult or even impossible.
  • Reference is made to the cited documents WO 2015/198300 and US 2014/362256. WO 2015/198300 relates to an active or passive gated-sensor imaging system characterized by a minimized time period between successive sensor exposures. US 2014/362256 relates to a system and method for reference frame selection for still image stabilisation, in which a combination of image quality and commonality metrics is used to identify a reference frame from a set of commonly captured images which, when the set's other images are combined with it, results in a quality stabilized image.
  • SUMMARY
  • In accordance with the claims there is provided a digital image capture unit, as defined in claim 1; an electronic apparatus comprising a digital image capture unit, as defined in claim 9; and a method, as defined in claim 10. Further features are in accordance with the dependent claims.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
  • In one example, a digital image capture unit comprises a gated image sensor configured to operate multiple sensor exposure events per a single image frame readout. The digital image capture unit further comprises a motion monitor configured to monitor motion related to the digital image capture unit. The digital image capture unit further comprises a controller configured to instruct the gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • In another example, a method and an electronic apparatus have been discussed along with the features of the digital image capture unit.
  • Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
  • DESCRIPTION OF THE DRAWINGS
  • The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
    • FIG. 1A is an example block diagram of a digital image capture unit in accordance with an example embodiment;
    • FIG. 1B is an example block diagram of a digital image capture unit in accordance with another example embodiment;
    • FIGS. 2A-2B illustrate thresholds in accordance with an example embodiment;
    • FIGS. 3A-3B are example flow diagrams of methods in accordance with example embodiments; and
    • FIG. 4 illustrates an example block diagram of an electronic apparatus capable of implementing example embodiments described herein.
  • Like reference numerals are used to designate like parts in the accompanying drawings.
  • DETAILED DESCRIPTION
  • The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of operations for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
  • At least some of the disclosed examples may allow motion triggered gated imaging, for example to enhance image stabilization used in digital cameras. Accordingly, at least some of the disclosed examples may allow capturing sharp and well-exposed images with a hand-held digital camera even in low light conditions. At least some of the disclosed examples may eliminate, or at least decrease, blur originating from non-ideal OIS compensation in the final image: by combining information from the gyroscopes and the Hall effect sensors with the storage/reset capability of a gated imaging sensor, only those image signal fragments for which the OIS is able to compensate the motion of the digital image capture unit, either fully or to an agreeable extent, are used in final image formation. At least some of the disclosed examples may allow capturing sharp and well-exposed images with a hand-held digital camera for both video and still images.
  • FIG. 1A is an example block diagram of a digital image capture unit 100A in accordance with an example embodiment. The digital image capture unit 100A may be employed, for example, in the electronic apparatus 400 of FIG. 4. However, it should be noted that the digital image capture unit 100A may also be employed on a variety of other devices and apparatuses, and therefore, embodiments should not be limited to application on devices and apparatuses such as the electronic apparatus 400 of FIG. 4. Furthermore, it should be noted that at least some of the elements described below may not be mandatory and thus some may be omitted in certain embodiments. The digital image capture unit 100A may be included e.g. in a stand-alone digital camera or an integrated digital camera which may be still cameras and/or video cameras, and the like.
  • The digital image capture unit 100A comprises a gated image sensor 110 that is configured to operate (e.g. store and/or discard) multiple sensor exposure events per a single image frame readout. The gated image sensor may be further configured to operate in a global shutter mode in which case the sensor exposure events are global sensor exposure events. In an example, the duration of the single image frame readout may be 30 milliseconds (ms). The duration of each sensor exposure event is a fragment of the duration of the single image frame readout, for example in the range of microseconds or nanoseconds.
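  • As a rough, illustrative calculation only (the 30 ms readout duration comes from the example above; the 10 microsecond event duration is an assumed figure, not taken from this description), a single image frame readout can contain thousands of gated exposure events:

```python
# Illustrative arithmetic only: the 30 ms frame readout is the example from the text,
# the 10-microsecond exposure event duration is an assumed value.
frame_readout_ms = 30.0      # duration of a single image frame readout
exposure_event_us = 10.0     # assumed duration of one gated exposure event

events_per_readout = int(frame_readout_ms * 1000.0 / exposure_event_us)
print(events_per_readout)    # -> 3000 exposure events in one readout
```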
  • In the global shutter mode, an entire image frame is captured at the same instant. This is in contrast to e.g. rolling shutter mode in which different parts (e.g. pixel rows) of an image frame are captured at slightly different times, for example one row after another.
  • The gated image sensor 110 may include e.g. a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor. A gated CMOS image sensor is also known as a GCMOS. A single frame is typically composed of repeated global exposure events. A GCMOS may be manufactured using e.g. contact image sensor (CIS) technology on a near-infrared (NIR) global shutter platform.
  • Herein, the term "image frame readout" refers to an event that starts with the shutter opening (or the image frame light accumulation beginning, in the case of an electronic shutter) and ends with the shutter closing (or the image frame light accumulation finishing, in the case of an electronic shutter), with no other shutter actuation in-between. As discussed above, a single "image frame readout" may contain multiple exposure events. In other words, "image frame readout" is the period needed for accumulating or integrating light for an entire single image frame, and an exposure event is a temporal segment or subset of this light accumulation period. This is also illustrated in FIGS. 2A-2B in which s_open represents the instant of the shutter opening and s_close represents the instant of the shutter closing. These multiple exposure events are controlled or operated with the gating functionality (comprising e.g. one or more logical switches or gates) of the gated image sensor (rather than shutter functionality) by gating the charge generated in the photodiode(s) to either a storage node or to a reset (i.e. ground) node.
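  • The gating described above can be sketched conceptually in a few lines of code. This is a minimal model, not the sensor circuitry of the embodiments: each exposure event's charge is routed either to a storage node (kept) or to a reset node (dropped), and only the stored charge contributes to the frame readout. The names ExposureEvent and gate_exposure_events are hypothetical, introduced only for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ExposureEvent:
    t_start: float   # seconds after the shutter opens (s_open)
    duration: float  # seconds; a small fragment of the frame readout
    charge: float    # photo-generated charge accumulated during this event

def gate_exposure_events(events: List[ExposureEvent],
                         keep: Callable[[ExposureEvent], bool]) -> float:
    """Route each event's charge to the storage node (kept) or to the
    reset/ground node (discarded); return the charge read out for the frame."""
    storage_node = 0.0
    for event in events:
        if keep(event):
            storage_node += event.charge   # gated to the storage node
        # else: gated to the reset (ground) node, i.e. the charge is dropped
    return storage_node
```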
  • The digital image capture unit 100A further comprises a motion monitor 120A that is configured to monitor motion related to the digital image capture unit 100A. The motion related to the digital image capture unit may comprise e.g. motion of an object in a scene to be captured during the image frame readout. Alternatively, the motion related to the digital image capture unit may comprise motion of the digital image capture unit, as discussed in more detail with reference to FIG. 1B.
  • The digital image capture unit 100A further comprises a controller 130 that is configured to instruct the gated image sensor 110 to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit 100A failing to meet a motion requirement. Here, "temporally corresponding" indicates that a given monitored motion occurs at the same instant as its corresponding sensor exposure event, as also illustrated e.g. in FIGS. 2A-2B. As discussed above, the duration of each sensor exposure event is a fragment of the duration of the single image frame readout, for example in the range of microseconds or nanoseconds.
  • FIG. 1B is an example block diagram of a digital image capture unit 100B in accordance with an example embodiment. The digital image capture unit 100B may be employed, for example, in the electronic apparatus 400 of FIG. 4. However, it should be noted that the digital image capture unit 100B may also be employed on a variety of other devices and apparatuses, and therefore, embodiments should not be limited to application on devices and apparatuses such as the electronic apparatus 400 of FIG. 4. Furthermore, it should be noted that at least some of the elements described below may not be mandatory and thus some may be omitted in certain embodiments. The digital image capture unit 100B may be included e.g. in a stand-alone digital camera or an integrated digital camera which may be still cameras and/or video cameras, and the like.
  • In the example of FIG. 1B, the functionalities and properties of the gated image sensor 110 and the controller 130 are substantially similar to those of their counterparts in the example of FIG. 1A, so their descriptions are not repeated here in detail.
  • The digital image capture unit 100B further comprises an image stabilizer 120B (e.g. an optical image stabilizer or OIS) that is configured to stabilize image frames by compensating for the monitored motion of the digital image capture unit.
  • The image stabilizer 120B comprises one or more motion sensors 121 that are configured to detect the motion of the digital image capture unit 100B. At least one of the motion sensors may comprise a gyroscope. The motion of the digital image capture unit 100B may comprise e.g. pitch, yaw and/or roll of the digital image capture unit 100B. The motion of the digital image capture unit 100B may be caused e.g. by hand shaking of a user operating the digital image capture unit 100B.
  • The digital image capture unit 100B further comprises a lens system 140. The image stabilizer 120B further comprises one or more actuators 122 that are configured to shift either the gated image sensor 110 or the lens system 140 in order to compensate for the detected motion of the digital image capture unit 100B. The image stabilizer 120B further comprises one or more position feedback sensors 123 that are configured to measure the movement of the shifted gated image sensor 110 or lens system 140. At least one of the position feedback sensors may comprise a Hall effect sensor.
  • In the embodiment of FIG. 1B, the motion requirement may comprise the difference between the detected motion of the digital image capture unit 100B and the measured movement of the shifted gated image sensor 110 or lens system 140 staying below a threshold. Accordingly, the failure to meet the motion requirement may comprise e.g. the difference between the detected motion of the digital image capture unit 100B and the measured movement of the shifted gated image sensor 110 or lens system 140 failing to stay below a threshold. FIGS. 2A-2B illustrate thresholds in such a case. In FIG. 2A, the dashed line represents a gyroscope signal, and the dotted line represents a Hall effect sensor signal. As can be seen in FIG. 2A, there are temporal segments when the Hall effect sensor(s) 123 lag behind the motion of the digital image capture unit 100B detected by gyroscope(s) 121. This lag may be due to e.g. processing logic. In this example, for these segments the difference between the detected motion of the digital image capture unit 100B and the measured movement of the shifted gated image sensor 110 or lens system 140 is considered to exceed the threshold. FIG. 2B illustrates how these temporal segments with lag are discarded by setting a control value to zero, thereby ruling them out of exposure accumulation of the gated image sensor 110. For temporal segments without lag, the control value may be set to one, thereby enabling their use in image signal integration of the gated image sensor 110.
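  • The per-event decision described above can be expressed as a control value (1 = integrate, 0 = discard) derived from the difference between the gyroscope signal (detected motion of the unit) and the Hall effect sensor signal (measured compensation movement). The sketch below is illustrative only; the sample values and the threshold are invented and not taken from FIGS. 2A-2B.

```python
def control_values(gyro, hall, threshold):
    """Per exposure event: 1 if the OIS compensation tracks the detected motion
    closely enough (|gyro - hall| below the threshold), otherwise 0."""
    return [1 if abs(g - h) < threshold else 0 for g, h in zip(gyro, hall)]

# Illustrative samples: the Hall-sensor movement lags the gyroscope signal
# for the third and fourth exposure events, so those events are ruled out.
gyro = [0.0, 0.2, 0.5, 0.9, 0.6, 0.2]
hall = [0.0, 0.2, 0.1, 0.3, 0.6, 0.2]
print(control_values(gyro, hall, threshold=0.2))   # -> [1, 1, 0, 0, 1, 1]
```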
  • The threshold may be increased over time during the image frame readout. This results in the likelihood of sensor exposure events being discarded decreasing over time, thereby lessening or removing the risk of having no non-discarded sensor exposure events at all for the duration of the image frame readout. Alternatively, it may be determined that at least for a given portion of the duration of the image frame readout (e.g. 10 ms out of 30 ms) sensor exposure events must not be discarded to avoid having no non-discarded sensor exposure events at all for the duration of the image frame readout.
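  • One way to realize the increasing threshold is a simple ramp over the frame readout; the linear schedule and the numbers below are assumptions made for illustration, not a form prescribed by this description.

```python
def threshold_at(t_ms, readout_ms, start, end):
    """Threshold that relaxes linearly from `start` at shutter open to `end` at
    shutter close, so late exposure events are less likely to be discarded."""
    return start + (end - start) * (t_ms / readout_ms)

# Example: 30 ms readout, threshold relaxing from 0.1 to 0.4 (arbitrary units).
for t in (0, 10, 20, 30):
    print(t, round(threshold_at(t, 30.0, 0.1, 0.4), 2))   # 0.1, 0.2, 0.3, 0.4
```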
  • The controller 130 may comprise an image stabilizer driver, such as an OIS driver. The image stabilizer driver may be included in an integrated circuit.
  • FIG. 3A is an example flow diagram of a method 300A in accordance with an example embodiment. At operation 301, multiple sensor exposure events per a single image frame readout are operated by a gated image sensor of a digital image capture unit.
  • At operation 302, motion related to the digital image capture unit is monitored by a motion monitor of the digital image capture unit. The motion related to the digital image capture unit may comprise e.g. motion of an object in a scene to be captured during the image frame readout. Alternatively, the motion related to the digital image capture unit may comprise motion of the digital image capture unit, as discussed in more detail with reference to FIG. 3B.
  • At operation 303, it is determined whether a monitored motion related to the digital image capture unit meets a motion requirement. If yes, the method returns to operation 302. Otherwise, the method proceeds to operation 304.
  • At operation 304, a controller of the digital image capture unit instructs the gated image sensor to discard a temporally corresponding sensor exposure event of the multiple sensor exposure events. The non-discarded exposure events may then be used in accumulating the final or actual image signal.
  • FIG. 3B is an example flow diagram of a method 300B in accordance with an example embodiment. At operation 301, multiple sensor exposure events per a single image frame readout are operated by a gated image sensor of a digital image capture unit.
  • In the example of FIG. 3B, motion of the digital image capture unit is monitored, e.g. in order to stabilize image frames by compensating for the monitored motion of the digital image capture unit. Accordingly, at operation 302A, motion of the digital image capture unit is detected by at least one motion sensor of the digital image capture unit.
  • At operation 302B, the gated image sensor or a lens system of the digital image capture unit is shifted by at least one actuator of the digital image capture unit in order to compensate for the detected motion of the digital image capture unit.
  • At operation 302C, the compensating movement of the shifted gated image sensor or lens system caused by the actuator(s) is measured by at least one position feedback sensor of the digital image capture unit.
  • At operation 303A, it is determined whether the difference between the detected motion of the digital image capture unit and the measured movement of the shifted gated image sensor or lens system stays below a threshold. If yes, the method returns to operation 302C. Otherwise, the method proceeds to operation 304. The threshold may be increased over time during the image frame readout.
  • At operation 304, a controller of the digital image capture unit instructs the gated image sensor to discard a temporally corresponding sensor exposure event of the multiple sensor exposure events. The non-discarded exposure events may then be used in accumulating the final or actual image signal.
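  • Method 300B can be summarized as a per-event loop, sketched below. The sketch is schematic rather than an implementation of the claims: read_gyro and read_hall stand in for the motion sensor and the position feedback sensor, the actuator shift of operation 302B is assumed to run inside the OIS control loop, and the event list format is invented for the example.

```python
def run_frame_readout(events, read_gyro, read_hall, threshold):
    """Schematic per-event loop for method 300B.
    `events` is a list of (t_start_s, charge) pairs for one frame readout."""
    stored_charge = 0.0
    for t_start, charge in events:
        motion = read_gyro(t_start)        # operation 302A: detected unit motion
        compensation = read_hall(t_start)  # operation 302C: measured OIS movement
        if abs(motion - compensation) < threshold:   # operation 303A
            stored_charge += charge        # kept: gated to the storage node
        # else: operation 304 - gated to the reset node and discarded
    return stored_charge

# Toy run: three exposure events; the second coincides with a Hall-sensor lag.
events = [(0.000, 1.0), (0.010, 1.0), (0.020, 1.0)]
print(run_frame_readout(events,
                        read_gyro=lambda t: 0.5 if t == 0.010 else 0.1,
                        read_hall=lambda t: 0.1,
                        threshold=0.2))    # -> 2.0 (second event discarded)
```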
  • Operation 301 may be performed e.g. by the gated image sensor 110 of FIGS. 1A-1B. Operation 302 may be performed e.g. by the motion monitor 120A of FIG. 1A. Operations 302A-302C may be performed e.g. by the image stabilizer 120B of FIG. 1B. More particularly, operation 302A may be performed e.g. by the motion sensor 121 of FIG. 1B, operation 302B may be performed e.g. by the actuator 122 of FIG. 1B, and operation 302C may be performed e.g. by the position feedback sensor 123 of FIG. 1B. Operations 303, 303A and 304 may be performed e.g. by the controller 130 of FIGS. 1A-1B.
  • A gated imaging sensor is a sensor that is capable of gating the charge generated in its photodiode to either a storage node or to a reset (i.e. ground) node at very fast intervals. This allows for precise control over which parts of the signal are used to accumulate the actual image signal and which parts are omitted. At least in some of the examples disclosed in FIGS. 1-3B, by combining the information from the gyroscopes and the Hall effect sensors with the storage/reset capability of a gated imaging sensor, it is possible to use in final image formation only those image signal fragments for which the OIS is able to compensate, either fully or to an agreeable extent, the motion of the digital image capture unit. This eliminates blur originating from non-ideal OIS compensation from the final image, resulting in a sharper capture.
  • FIG. 4 is a schematic block diagram of an electronic apparatus 400 capable of implementing embodiments of the techniques described herein. It should be understood that the electronic apparatus 400 as illustrated and hereinafter described is merely illustrative of one type of apparatus or electronic device and should not be taken to limit the scope of the embodiments. As such, it should be appreciated that at least some of the components described below in connection with the electronic apparatus 400 may be optional, and thus an example embodiment may include more, fewer or different components than those described in connection with the example embodiment of FIG. 4. As such, among other examples, the electronic apparatus 400 could be any apparatus incorporating a digital image capture unit. For example, the device 400 may be implemented as a smart phone, tablet computer, laptop computer, laptop/tablet hybrid, stand-alone digital (still and/or video) camera or the like.
  • The illustrated electronic apparatus 400 includes a controller or a processor 402 (e.g. a signal processor, microprocessor, ASIC, or other control and processing logic circuitry) for performing such tasks as signal coding, data processing, input/output processing, power control, and/or other functions. An operating system 404 controls the allocation and usage of the components of the electronic apparatus 400 and provides support for one or more application programs 406. The application programs 406 can include common mobile applications, for instance, telephony applications, email applications, calendars, contact managers, web browsers, messaging applications, or any other application.
  • The illustrated electronic apparatus 400 includes one or more memory components, for example, a non-removable memory 408 and/or removable memory 410. The non-removable memory 408 may include RAM, ROM, flash memory, a hard disk, or other well-known memory storage technologies. The removable memory 410 may include flash memory or smart cards. The one or more memory components may be used for storing data and/or code for running the operating system 404 and the applications 406. Examples of data may include web pages, text, images, sound files, image data, video data, or other data sets to be sent to and/or received from one or more network servers or other devices via one or more wired or wireless networks. The electronic device 400 may further include a subscriber identity module (SIM) 412. The SIM 412 typically stores information elements related to a mobile subscriber. A SIM is well known in Global System for Mobile Communications (GSM) communication systems, Code Division Multiple Access (CDMA) systems, or with third-generation (3G) wireless communication protocols such as Universal Mobile Telecommunications System (UMTS), CDMA2000, wideband CDMA (WCDMA) and time division-synchronous CDMA (TD-SCDMA), or with fourth-generation (4G) wireless communication protocols such as LTE (Long-Term Evolution). The SIM 412 may comprise a virtual SIM. Furthermore, multiple SIMs may be utilized.
  • The electronic apparatus 400 can support one or more input devices 420 and one or more output devices 430. Examples of the input devices 420 may include, but are not limited to, a touchscreen 422 (i.e., capable of capturing finger tap inputs, finger gesture inputs, multi-finger tap inputs, multi-finger gesture inputs, or keystroke inputs from a virtual keyboard or keypad), a microphone 424 (i.e., capable of capturing voice input), a camera module 426 (i.e., capable of capturing still picture images and/or video images) and a physical keyboard 428. The camera module 426 may include the digital image capture unit 100A, 100B of FIGS. 1A-1B. Examples of the output devices 430 may include, but are not limited to, a speaker 432 and a display 434. Other possible output devices (not shown) can include piezoelectric or other haptic output devices. Some devices can serve more than one input/output function. For example, the touchscreen 422 and the display 434 can be combined into a single input/output device.
  • In an embodiment, the electronic device 400 may comprise a wireless radio(s) 440. The wireless radio(s) 440 can support two-way communications between the processor 402 and external devices, as is well understood in the art. The wireless radio(s) 440 are shown generically and can include, for example, a cellular modem 442 for communicating at long range with the mobile communication network, a Wi-Fi radio 444 for communicating at short range with a local wireless data network or router, and/or a BLUETOOTH radio 446. The cellular modem 442 is typically configured for communication with one or more cellular networks, such as a GSM/3G/4G network for data and voice communications within a single cellular network, between cellular networks, or between the mobile device and a public switched telephone network (PSTN).
  • The electronic device 400 can further include one or more input/output ports 450, a power supply 452, one or more sensors 454 (for example an accelerometer, a gyroscope, a compass, or an infrared proximity sensor for detecting the orientation or motion of the electronic device 400), a transceiver 456 (for wirelessly transmitting analog or digital signals), and an integrated circuit 460. The illustrated components are not required or all-inclusive, as any of the components shown can be omitted and other components can be added. The integrated circuit 460 may include the controller 130 of FIGS. 1A-1B.
  • Computer executable instructions may be provided using any computer-readable media that are accessible by computing-based devices. Computer-readable media may include, for example, computer storage media such as memory, and communication media. Computer storage media, such as memory, include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or the like. Computer storage media include, but are not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device. In contrast, communication media may embody computer readable instructions, data structures, program modules, or the like in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media do not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Although the computer storage media are shown within the computing-based devices, it will be appreciated that the storage may be distributed or located remotely and accessed via a network or other communication link, for example by using a communication interface.
  • At least some of the examples disclosed in FIGS. 1-4 are able to provide motion triggered gated imaging, for example to enhance image stabilization used in digital cameras. At least some of the examples disclosed in FIGS. 1-4 enable capturing sharp and well-exposed images with a hand-held digital camera even in low light conditions.
  • An embodiment of a digital image capture unit comprises a gated image sensor configured to operate multiple sensor exposure events per a single image frame readout; a motion monitor configured to monitor motion related to the digital image capture unit; and a controller configured to instruct the gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • In an embodiment, alternatively or in addition to the above described embodiments, the motion related to the digital image capture unit comprises motion of an object in a scene to be captured during the image frame readout.
  • In an embodiment, alternatively or in addition to the above described embodiments, the motion related to the digital image capture unit comprises motion of the digital image capture unit.
  • In an embodiment, alternatively or in addition to the above described embodiments, the motion monitor comprises an image stabilizer configured to stabilize image frames by compensating for the monitored motion of the digital image capture unit.
  • In an embodiment, alternatively or in addition to the above described embodiments, the image stabilizer comprises at least one motion sensor configured to detect the motion of the digital image capture unit.
  • In an embodiment, alternatively or in addition to the above described embodiments, the at least one motion sensor comprises a gyroscope.
  • In an embodiment, alternatively or in addition to the above described embodiments, the digital image capture unit further comprises a lens system, and the image stabilizer further comprises at least one actuator configured to shift one of the gated image sensor and the lens system in order to compensate for the detected motion of the digital image capture unit; and at least one position feedback sensor configured to measure the movement of the shifted one of the gated image sensor and the lens system.
  • In an embodiment, alternatively or in addition to the above described embodiments, the at least one position feedback sensor comprises a Hall effect sensor.
  • In an embodiment, alternatively or in addition to the above described embodiments, the motion requirement comprises the difference between the detected motion of the digital image capture unit and the measured movement of the shifted one of the gated image sensor and the lens system staying below a threshold.
  • In an embodiment, alternatively or in addition to the above described embodiments, the threshold is increased over time during the image frame readout.
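  • A minimal sketch of such a time-dependent motion requirement is shown below; the linear ramp and the names threshold_at and meets_requirement are illustrative assumptions, not taken from the patent. The allowed residual error grows during the frame readout, so that early exposure events must be well stabilized while later events are accepted more leniently (for example, to make sure the frame still gathers enough signal).

```python
def threshold_at(t: float, readout_time: float,
                 start: float = 0.5, end: float = 2.0) -> float:
    """Allowed OIS residual error, ramping up linearly during the frame readout."""
    frac = min(max(t / readout_time, 0.0), 1.0)
    return start + (end - start) * frac

def meets_requirement(detected_motion: float, measured_shift: float,
                      t: float, readout_time: float) -> bool:
    """The difference between detected motion and measured compensation
    must stay below the time-increasing threshold."""
    return abs(detected_motion - measured_shift) < threshold_at(t, readout_time)
```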
  • In an embodiment, alternatively or in addition to the above described embodiments, the controller comprises an image stabilizer driver.
  • In an embodiment, alternatively or in addition to the above described embodiments, the gated image sensor is further configured to operate in a global shutter mode.
  • An embodiment of a method comprises operating, by a gated image sensor of a digital image capture unit, multiple sensor exposure events per a single image frame readout; monitoring, by a motion monitor of the digital image capture unit, motion related to the digital image capture unit; and instructing, by a controller of the digital image capture unit, the gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • In an embodiment, alternatively or in addition to the above described embodiments, the motion related to the digital image capture unit comprises motion of an object in a scene to be captured during the image frame readout.
  • In an embodiment, alternatively or in addition to the above described embodiments, the motion related to the digital image capture unit comprises motion of the digital image capture unit, and the method further comprises stabilizing image frames by compensating for the monitored motion of the digital image capture unit.
  • In an embodiment, alternatively or in addition to the above described embodiments, the stabilizing of the image frames comprises detecting, by at least one motion sensor of the digital image capture unit, motion of the digital image capture unit.
  • In an embodiment, alternatively or in addition to the above described embodiments, the stabilizing of the image frames further comprises shifting, by at least one actuator of the digital image capture unit, one of the gated image sensor and a lens system of the digital image capture unit in order to compensate for the detected motion of the digital image capture unit; and measuring, by at least one position feedback sensor of the digital image capture unit, the movement of the shifted one of the gated image sensor and the lens system.
  • In an embodiment, alternatively or in addition to the above described embodiments, the motion requirement comprises the difference between the detected motion of the digital image capture unit and the measured movement of the shifted one of the gated image sensor and the lens system staying below a threshold.
  • In an embodiment, alternatively or in addition to the above described embodiments, the method further comprises increasing the threshold over time during the image frame readout.
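  • Putting the method steps together, a self-contained and purely illustrative sketch of one frame readout might look as follows; the helper names, the random stand-in motion values and the parameter choices are assumptions made for this example, not an interface defined by the patent.

```python
import random

def run_frame(num_events: int = 8, readout_time: float = 1.0) -> float:
    """One frame readout composed of several short exposure events."""
    frame_signal = 0.0
    for i in range(num_events):
        t = (i + 0.5) / num_events * readout_time
        # 301: operate one short sensor exposure event
        slice_signal = 1.0 / num_events
        # 302: monitor motion related to the capture unit (random stand-in values)
        detected_motion = random.uniform(0.0, 2.0)
        measured_shift = detected_motion + random.uniform(-0.5, 0.5)
        threshold = 0.5 + 1.5 * (t / readout_time)  # requirement relaxes over time
        # 303: instruct the sensor to store or discard this exposure event
        if abs(detected_motion - measured_shift) < threshold:
            frame_signal += slice_signal            # gated to the storage node
        # else: gated to the ground node and omitted from the frame
    return frame_signal

if __name__ == "__main__":
    print(f"accumulated frame signal: {run_frame():.3f}")
```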
  • An embodiment of an electronic apparatus comprises a digital image capture unit comprising a gated image sensor configured to operate multiple sensor exposure events per a single image frame readout; a motion monitor configured to monitor motion related to the digital image capture unit; and a controller configured to instruct the gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • The embodiments illustrated and described herein as well as embodiments not specifically described herein but within the scope of aspects of the disclosure constitute exemplary means for motion triggered gated imaging. For example, the elements illustrated in FIGS. 1A-1B constitute exemplary means for operating multiple sensor exposure events per a single image frame readout, exemplary means for monitoring motion related to a digital image capture unit, and exemplary means for instructing a gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement.
  • The term 'computer' or 'computing-based device' is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms 'computer' and 'computing-based device' each include mobile telephones (including smart phones), tablet computers and many other devices.
  • The processes described herein may be performed by software in machine readable form on a tangible storage medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the processes described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. Examples of tangible storage media include disks, thumb drives, memory etc. and do not include propagated signals. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
  • This acknowledges that software can be a valuable, separately tradable commodity. It is intended to encompass software which runs on or controls "dumb" or standard hardware to carry out the desired functions. It is also intended to encompass software which "describes" or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
  • Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that, by utilizing conventional techniques known to those skilled in the art, all or a portion of the software instructions may be carried out by a dedicated circuit, such as a digital signal processor (DSP), programmable logic array, or the like.
  • Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
  • Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
  • Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims, and other equivalent features and acts are intended to be within the scope of the claims.
  • It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to 'an' item refers to one or more of those items.
  • Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
  • The term 'comprising' is used herein to mean including the blocks or elements identified, but that such blocks or elements do not comprise an exclusive list, and a system, a device or an apparatus may contain additional blocks or elements.
  • It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments.

Claims (13)

  1. A digital image capture unit (100A, 100B), comprising:
    a gated image sensor (110) configured to operate in a global shutter mode to control multiple global sensor exposure events that occur during a single image frame readout, the single image frame readout starting with a shutter opening and ending with the shutter closing;
    wherein the single image frame readout is a period of time during which light is accumulated for a single image frame, and an exposure event of the multiple sensor exposure events is a temporal segment of said single image frame readout period of time during which an image frame is captured, such that the single image frame readout is composed from multiple sensor exposure events;
    wherein the multiple sensor exposure events are controlled by the gated image sensor (110) operating to store sensor exposure events by gating the charge generated in photodiodes of the gated image sensor to a storage node to store the sensor exposure event,
    characterized in that the digital image capture unit further comprises: a motion monitor (120A) configured to monitor motion related to the digital image capture unit;
    a controller (130) configured to instruct the gated image sensor (110) to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement of being below a threshold, wherein the discarded sensor exposure event and the temporally corresponding monitored motion occurred at a same time,
    wherein the gated image sensor (110) is configured to discard sensor exposure events by gating the charge generated in the photodiodes of the gated image sensor to a ground node.
  2. The digital image capture unit as claimed in claim 1, wherein the motion related to the digital image capture unit (100A, 100B) comprises motion of an object in a scene to be captured during the image frame readout.
  3. The digital image capture unit as claimed in claim 1, wherein the motion related to the digital image capture unit (100A, 100B) comprises motion of the digital image capture unit.
  4. The digital image capture unit as claimed in claim 3, wherein the motion monitor (120A) comprises an image stabilizer (120B) configured to stabilize image frames by compensating for the monitored motion of the digital image capture unit.
  5. The digital image capture unit as claimed in claim 4, wherein the image stabilizer (120B) comprises at least one motion sensor (121) configured to detect the motion of the digital image capture unit.
  6. The digital image capture unit as claimed in claim 5, further comprising a lens system (140), wherein the image stabilizer (120B) further comprises: at least one actuator (122) configured to shift one of the gated image sensor (110) and the lens system (140) in order to compensate for the detected motion of the digital image capture unit; and
    at least one position feedback sensor (123) configured to measure the movement of the shifted one of the gated image sensor (110) and the lens system (140).
  7. The digital image capture unit as claimed in claim 6, wherein the motion requirement comprises the difference between the detected motion of the digital image capture unit (100A, 100B) and the measured movement of the shifted one of the gated image sensor (110) and the lens system (140) staying below the threshold.
  8. The digital image capture unit as claimed in claim 7, wherein the threshold is increased over time during the image frame readout.
  9. A method, comprising:
    operating (301), by a gated image sensor of a digital image capture unit, multiple sensor exposure events that occur during a single image frame readout, the single image frame readout starting with a shutter opening and ending with the shutter closing;
    wherein the single image frame readout is a period of time during which light is accumulated for a single image frame, and an exposure event of the multiple sensor exposure events is a temporal segment of said single image frame readout period of time during which an image frame is captured, such that the single image frame readout is composed from multiple sensor exposure events;
    wherein the gated image sensor (110) operates to store sensor exposure events of the multiple sensor exposure events by gating the charge generated in a photodiode of the gated image sensor to a storage node to store a sensor exposure event,
    characterized in that the method further comprises:
    monitoring (302), by a motion monitor of the digital image capture unit, motion related to the digital image capture unit,
    instructing (303), by a controller of the digital image capture unit, the gated image sensor to discard a sensor exposure event of the multiple sensor exposure events in response to a temporally corresponding monitored motion related to the digital image capture unit failing to meet a motion requirement of being below a threshold, when the sensor exposure event and the temporally corresponding monitored motion occurred at a same time, the gated image sensor (110) discarding said sensor exposure event by gating the charge generated in the photodiodes of the gated image sensor to a ground node, such that the single image frame readout is composed from the multiple sensor exposure events that meet the motion requirement threshold.
  10. The method as claimed in claim 9, wherein the motion related to the digital image capture unit (100A, 100B) comprises motion of the digital image capture unit, and the method further comprises stabilizing image frames by compensating for the monitored motion of the digital image capture unit.
  11. The method as claimed in claim 10, wherein the stabilizing of the image frames comprises detecting (302A), by at least one motion sensor of the digital image capture unit, motion of the digital image capture unit.
  12. The method as claimed in claim 11, wherein the stabilizing of the image frames further comprises:
    shifting (302b), by at least one actuator of the digital image capture unit, one of the gated image sensor (110) and a lens system (140) of the digital image capture unit in order to compensate for the detected motion of the digital image capture unit; and measuring, by at least one position feedback sensor of the digital image capture unit, the movement of the shifted one of the gated image sensor and the lens system.
  13. The method as claimed in claim 12, wherein the motion requirement comprises the difference between the detected motion of the digital image capture unit (100A, 100B) and the measured movement of the shifted one of the gated image sensor (110) and the lens system (140) staying below a threshold.
EP17761684.4A 2016-08-30 2017-08-23 Motion triggered gated imaging Active EP3507974B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/252,194 US10306148B2 (en) 2016-08-30 2016-08-30 Motion triggered gated imaging
PCT/US2017/048092 WO2018044627A1 (en) 2016-08-30 2017-08-23 Motion triggered gated imaging

Publications (2)

Publication Number Publication Date
EP3507974A1 EP3507974A1 (en) 2019-07-10
EP3507974B1 true EP3507974B1 (en) 2022-08-31

Family

ID=59772748

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17761684.4A Active EP3507974B1 (en) 2016-08-30 2017-08-23 Motion triggered gated imaging

Country Status (4)

Country Link
US (2) US10306148B2 (en)
EP (1) EP3507974B1 (en)
CN (1) CN109644238B (en)
WO (1) WO2018044627A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10306148B2 (en) * 2016-08-30 2019-05-28 Microsoft Technology Licensing, Llc Motion triggered gated imaging
JP7285058B2 (en) 2018-10-04 2023-06-01 株式会社ソニー・インタラクティブエンタテインメント Sensor module, electronic device, object detection method, program and processing circuit
JP7369517B2 (en) * 2018-10-04 2023-10-26 株式会社ソニー・インタラクティブエンタテインメント Sensor module, electronic equipment, object detection method and program
JP7023209B2 (en) 2018-10-04 2022-02-21 株式会社ソニー・インタラクティブエンタテインメント Control methods and programs for electronic devices and actuators
CN113243016A (en) * 2018-12-10 2021-08-10 株式会社小糸制作所 Object recognition system, arithmetic processing device, automobile, vehicle lamp, and method for learning classifier
US10855896B1 (en) 2018-12-13 2020-12-01 Facebook Technologies, Llc Depth determination using time-of-flight and camera assembly with augmented pixels
US10791286B2 (en) 2018-12-13 2020-09-29 Facebook Technologies, Llc Differentiated imaging using camera assembly with augmented pixels
US10791282B2 (en) * 2018-12-13 2020-09-29 Fenwick & West LLP High dynamic range camera assembly with augmented pixels
US10902623B1 (en) 2019-11-19 2021-01-26 Facebook Technologies, Llc Three-dimensional imaging with spatial and temporal coding for depth camera assembly
US11194160B1 (en) 2020-01-21 2021-12-07 Facebook Technologies, Llc High frame rate reconstruction with N-tap camera sensor

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007148169A1 (en) * 2006-06-22 2007-12-27 Nokia Corporation Method and system for image stabilization

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000305126A (en) * 1999-04-26 2000-11-02 Olympus Optical Co Ltd Camera with shake reducing function
JP3804617B2 (en) * 2003-02-14 2006-08-02 コニカミノルタフォトイメージング株式会社 Image processing apparatus and method
JP2004301939A (en) * 2003-03-28 2004-10-28 Sony Corp Camera system, camera, and interchangeable lens
JP2005077886A (en) * 2003-09-02 2005-03-24 Canon Inc Photographing equipment
US8045009B2 (en) * 2004-05-10 2011-10-25 Hewlett-Packard Development Company, L.P. Image-exposure systems and methods using detecting motion of a camera to terminate exposure
US8482618B2 (en) * 2005-02-22 2013-07-09 Hewlett-Packard Development Company, L.P. Reduction of motion-induced blur in images
US20070009722A1 (en) * 2005-07-11 2007-01-11 Strait Michael A Polymer/WUCS mat and method of forming same
US20070097221A1 (en) * 2005-10-28 2007-05-03 Stavely Donald J Systems and methods of exposure restart for cameras
US20070237514A1 (en) * 2006-04-06 2007-10-11 Eastman Kodak Company Varying camera self-determination based on subject motion
JP4717748B2 (en) * 2006-08-11 2011-07-06 キヤノン株式会社 Camera body and camera system having the same
KR100819301B1 (en) 2006-12-20 2008-04-03 삼성전자주식회사 Method and apparatus for optical image stabilizer on mobile camera module
US7796872B2 (en) 2007-01-05 2010-09-14 Invensense, Inc. Method and apparatus for producing a sharp image from a handheld device containing a gyroscope
TWI367026B (en) 2007-03-28 2012-06-21 Quanta Comp Inc Method and apparatus for image stabilization
US8564676B2 (en) * 2007-11-28 2013-10-22 Sanyo Semiconductor Co., Ltd. Semiconductor device with anti-shake control function
JP5237620B2 (en) 2007-12-14 2013-07-17 セミコンダクター・コンポーネンツ・インダストリーズ・リミテッド・ライアビリティ・カンパニー Anti-vibration control circuit for imaging device
JP4900401B2 (en) 2008-05-16 2012-03-21 カシオ計算機株式会社 Imaging apparatus and program
JP5465500B2 (en) 2008-10-20 2014-04-09 日本電産サンキョー株式会社 Optical unit with shake correction function, and shake correction control method in optical unit with shake correction function
KR101575626B1 (en) 2008-11-26 2015-12-08 삼성전자주식회사 Digital camera and controlling method thereof
US8170408B2 (en) 2009-05-18 2012-05-01 Invensense, Inc. Optical image stabilization in a digital still camera or handset
JP5300591B2 (en) 2009-05-21 2013-09-25 キヤノン株式会社 Image processing apparatus and method
KR101630297B1 (en) * 2009-12-03 2016-06-14 삼성전자주식회사 Method and apparatus for correcting a shakiness
JP5553597B2 (en) 2009-12-25 2014-07-16 キヤノン株式会社 Imaging apparatus and control method thereof
US8493454B1 (en) 2010-02-17 2013-07-23 Ambarella, Inc. System for camera motion compensation
US9420179B2 (en) * 2010-11-02 2016-08-16 Canon Kabushiki Kaisha Optical device for performing image blur compensation
TWI444753B (en) * 2010-11-16 2014-07-11 Altek Corp Image capturing device and adjusting method of exposure time thereof
JP5869812B2 (en) * 2011-09-13 2016-02-24 キヤノン株式会社 Image blur correction apparatus, image pickup apparatus including the same, and method for controlling image blur correction apparatus
JP5984574B2 (en) 2012-08-14 2016-09-06 キヤノン株式会社 Imaging system, control method therefor, and imaging apparatus
US9491360B2 (en) 2013-06-06 2016-11-08 Apple Inc. Reference frame selection for still image stabilization
KR20150058952A (en) 2013-11-21 2015-05-29 삼성전기주식회사 System for correcting hand-shake and controlling method thereof
US20150195457A1 (en) 2014-01-03 2015-07-09 Samsung Electro-Mechanics Co., Ltd. Apparatus and method for image correction
US20150195461A1 (en) * 2014-01-03 2015-07-09 Samsung Electro-Mechanics Co., Ltd. Apparatus and method for image correction
IL233356A (en) * 2014-06-24 2015-10-29 Brightway Vision Ltd Gated sensor based imaging system with minimized delay time between sensor exposures
JP2016019076A (en) * 2014-07-07 2016-02-01 ソニー株式会社 Imaging apparatus, imaging method, program and reproducing apparatuses
CN204119344U (en) 2014-09-18 2015-01-21 深圳市四季春科技有限公司 A kind of optical anti-vibration moves camera module
CN104716151B (en) * 2015-03-14 2017-05-17 长春长光辰芯光电技术有限公司 Back lighting type TDI image sensor and electronic shutter control method thereof
US10306148B2 (en) * 2016-08-30 2019-05-28 Microsoft Technology Licensing, Llc Motion triggered gated imaging

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007148169A1 (en) * 2006-06-22 2007-12-27 Nokia Corporation Method and system for image stabilization

Also Published As

Publication number Publication date
US10306148B2 (en) 2019-05-28
CN109644238B (en) 2020-12-01
EP3507974A1 (en) 2019-07-10
US20180063442A1 (en) 2018-03-01
CN109644238A (en) 2019-04-16
US20190253632A1 (en) 2019-08-15
US10917574B2 (en) 2021-02-09
WO2018044627A1 (en) 2018-03-08

Similar Documents

Publication Publication Date Title
EP3507974B1 (en) Motion triggered gated imaging
US9854151B2 (en) Imaging device and focusing control method
CN108139562B (en) Focus control device, focus control method, storage medium, lens device, and imaging device
CN108027496B (en) Focus control device, focus control method, recording medium, lens device, and imaging device
US10848692B2 (en) Global shutter and rolling shutter drive start timings for imaging apparatus, imaging method, and imaging program
US10003758B2 (en) Defective pixel value correction for digital raw image frames
US20160227114A1 (en) Automatic processing of automatic image capture parameter adjustment
US10944925B2 (en) Global shuttering, first rolling readout and second rolling readout employed with an imaging apparatus, imaging method, and imaging program
US10750105B2 (en) Imaging apparatus, operation method of imaging apparatus, and operation program of imaging apparatus
CN111108743B (en) Image pickup control device, image pickup control method, and recording medium
US9942459B2 (en) Method and system for image quality learning with solid state image sensors
CN108139563B (en) Focus control device, focus control method, focus control program, lens device, and imaging device
US11861874B2 (en) Imaging apparatus, imaging control method, and imaging control program
WO2022145322A1 (en) Imaging device, focus control method, and focus control program
CN112203015B (en) Camera control method, device and medium system
JP2023094941A (en) Control apparatus, imaging apparatus, control method, and control program
JP2017163254A (en) Imaging apparatus and control method of imaging apparatus

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190215

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20201105

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP3 Party data changed (applicant data changed or rights of an application transferred)

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20220420

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1516176

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220915

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017061267

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602017061267

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: H04N0005232000

Ipc: H04N0023600000

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20220831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1516176

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221231

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221201

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230102

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017061267

Country of ref document: DE

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20230601

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230823

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230831

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20230831

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230823

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230831

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240723

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20240723

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20240723

Year of fee payment: 8