EP3315905B1 - Image processing device, stereo camera device, vehicle, and image processing method - Google Patents

Image processing device, stereo camera device, vehicle, and image processing method

Info

Publication number
EP3315905B1
Authority
EP
European Patent Office
Prior art keywords
feature points
captured image
controller
image processing
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP16813972.3A
Other languages
German (de)
French (fr)
Other versions
EP3315905A1 (en)
EP3315905A4 (en)
Inventor
Tomofumi KOISHI
Shushin Inoue
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kyocera Corp
Original Assignee
Kyocera Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corp filed Critical Kyocera Corp
Publication of EP3315905A1 publication Critical patent/EP3315905A1/en
Publication of EP3315905A4 publication Critical patent/EP3315905A4/en
Application granted granted Critical
Publication of EP3315905B1 publication Critical patent/EP3315905B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C3/10 - Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • G01C3/14 - Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 - Stereoscopic photography
    • G03B35/08 - Stereoscopic photography by simultaneous recording
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 - Stereo camera calibration
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/204 - Image signal generators using stereoscopic image cameras
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/204 - Image signal generators using stereoscopic image cameras
    • H04N13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/204 - Image signal generators using stereoscopic image cameras
    • H04N13/246 - Calibration of cameras
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261 - Obstacle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Manufacturing & Machinery (AREA)
  • Electromagnetism (AREA)
  • Human Computer Interaction (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Description

    TECHNICAL FIELD
  • The present disclosure relates to an image processing device, a stereo camera device, a vehicle, and an image processing method.
  • BACKGROUND
  • In recent years, stereo cameras have been known that use a plurality of cameras to simultaneously capture images of a target, such as an object or a human, and that use the captured images to measure the distance to the target based on the triangulation principle. Such a stereo camera may be mounted, for example, in a vehicle to inform the driver of the presence of a target in the vicinity of the vehicle and to assist safe driving.
  • EP 2 624 575 A1 discloses an image processing apparatus comprising a corresponding-area computing section for finding a relation between areas on images taken by two cameras, a coincidence-degree computing section for finding a degree of coincidence of information obtained from corresponding areas on the images taken by the cameras, and a camera-parameter computing section for finding camera parameters on the basis of the coincidence degree computed by the coincidence-degree computing section.
  • EP 2 803 944 A2 discloses an image processing apparatus that uses two captured images captured by two image capturing units disposed on a given base line, one of the two captured images being used as a reference image and the other as a comparing image. The image processing apparatus includes a parallax computing unit that conducts a matching process and a parallax computing process. In the matching process, the parallax computing unit compares a feature value of a search target area related to a target portion for computing parallax in the reference image with a feature value of each of candidate corresponding areas in the comparing image respectively related to a plurality of candidate corresponding portions, and identifies, as a corresponding area, a candidate corresponding area having a feature value substantially matching the feature value of the search target area. In the parallax computing process, the parallax computing unit computes, as a parallax value of the target portion, a deviation amount along the base line direction between the target portion for computing parallax and a corresponding portion related to the corresponding area identified by the matching process. When the matching process is conducted, the greater the deviation amount of the candidate corresponding portion along the base line direction with respect to the target portion for computing parallax, the greater the size set for the search target area that is compared with the feature value of the candidate corresponding area related to that candidate corresponding portion.
  • CITATION LIST Patent Literature
  • PTL1: JP 2001-082955 A
  • SUMMARY
  • The present invention provides an image processing device according to claim 1, a stereo camera device according to claim 7, a vehicle according to claim 8, and an image processing method according to claim 9. Further embodiments of the invention are disclosed in the dependent claims.
  • One of aspects of the present disclosure resides in an image processing device that is configured to calibrate a stereo camera including a plurality of imaging units and that includes an input interface and a controller. The input interface is configured to acquire a first captured image and a second captured image captured by the plurality of imaging units. The controller is configured to calculate parallax by performing a one-dimensional matching based on pixel values of the first captured image and pixel values of the second captured image. The controller is also configured to extract one or more first feature points from a region in the first captured image that includes continuous pixels having a difference in parallax which is within a predetermined range and extract one or more second feature points corresponding respectively to the one or more first feature points by performing a two-dimensional matching between the one or more first feature points and pixels in the second captured image. The controller is also configured to calibrate at least one of the plurality of imaging units based on positions of the one or more first feature points and positions of the one or more second feature points.
  • Another aspect of the present disclosure resides in a stereo camera device, including: a plurality of imaging units; and an image processing device that is configured to calibrate a stereo camera including the plurality of imaging units. The image processing device includes an input interface and a controller. The input interface is configured to receive input of a first captured image and a second captured image captured by the plurality of imaging units. The controller is configured to calculate parallax by performing a one-dimensional matching based on pixel values of the first captured image and pixel values of the second captured image. The controller is also configured to extract one or more first feature points from a region in the first captured image that includes continuous pixels having a difference in parallax which is within a predetermined range and extract one or more second feature points corresponding respectively to the one or more first feature points by performing a two-dimensional matching between the one or more first feature points and pixels in the second captured image. The controller is also configured to calibrate at least one of the plurality of imaging units based on positions of the one or more first feature points and positions of the one or more second feature points.
  • Yet another aspect of the present disclosure resides in a vehicle, including a stereo camera device. The stereo camera device includes: a plurality of imaging units; and an image processing device that is configured to calibrate a stereo camera including the plurality of imaging units. The image processing device includes an input interface and a controller. The input interface is configured to receive input of a first captured image and a second captured image captured by the plurality of imaging units. The controller is configured to calculate parallax by performing a one-dimensional matching based on pixel values of the first captured image and pixel values of the second captured image. The controller is also configured to extract one or more first feature points from a region in the first captured image that includes continuous pixels having a difference in parallax which is within a predetermined range and extract one or more second feature points corresponding respectively to the one or more first feature points by performing a two-dimensional matching between the one or more first feature points and pixels in the second captured image. The controller is also configured to calibrate at least one of the plurality of imaging units based on positions of the one or more first feature points and positions of the one or more second feature points.
  • Yet another aspect of the present disclosure resides in an image processing method performed by an image processing device configured to calibrate a stereo camera including a plurality of imaging units. In the image processing method, a controller in the image processing device performs steps including: calculating parallax by performing a one-dimensional matching based on pixel values of a first captured image captured by a first imaging unit in the plurality of imaging units and pixel values of a second captured image captured by a second imaging unit, different from the first imaging unit, in the plurality of imaging units; extracting one or more first feature points from a region in the first captured image that includes continuous pixels having a difference in parallax which is within a predetermined range; extracting one or more second feature points corresponding respectively to the one or more first feature points by performing a two-dimensional matching between the one or more first feature points and pixels in the second captured image; and calibrating at least one of the plurality of imaging units based on positions of the one or more first feature points and positions of the one or more second feature points.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the accompanying drawings:
    • FIG. 1 is a perspective view of a stereo camera device including an image processing device according to an embodiment;
    • FIG. 2 is a side view of a vehicle in which the stereo camera device illustrated in FIG. 1 is mounted;
    • FIG. 3 is a configuration diagram of the stereo camera device illustrated in FIG. 1;
    • FIGS. 4(1) to 4(4) illustrate a first captured image and a second captured image captured respectively by a first imaging unit and a second imaging unit illustrated in FIG. 1, and one-dimensional pixel value distributions thereof;
    • FIG. 5 is a conceptual view of parallax of pixels constituting a subject region of the first captured image illustrated in FIG. 4; and
    • FIG. 6 is a flowchart illustrating processing performed by the image processing device illustrated in FIG. 1.
    DETAILED DESCRIPTION
  • In order to accurately notify a user of the distance to a target in the vicinity of a stereo camera, the stereo camera must accurately measure the distance from its plurality of cameras to the target captured by those cameras. However, when the plurality of cameras is mounted at positions that deviate from a standard, the distance to the target cannot be measured with accuracy.
  • Accordingly, an existing calibration method determines that a reference region is appropriate for matching and accurately calibrates the horizontality of the camera when the maximum appearance frequency of the distances in real space to the subject captured in the reference region is greater than a predetermined value.
  • According to the above calibration method, since the reference region appropriate for matching is determined based on the maximum appearance frequency of the distances to the subject, a distance distribution histogram needs to be generated, and this processing may require time. Furthermore, when positions in the reference region in which a portion corresponding to the distance of the maximum appearance frequency is captured are discrete, occlusion may occur in the reference region, and this makes accurate matching difficult.
  • An image processing device according to the present disclosure calibrates an imaging unit by performing one-dimensional matching to calculate parallax, and by performing two-dimensional matching with at least one first feature point in a first captured image captured by the stereo camera to extract a second feature point in a second captured image that corresponds to the first feature point.
  • The image processing device of the present disclosure is described in detail below with reference to the drawings.
  • As illustrated in FIG. 1, a stereo camera device 1 includes a plurality of imaging units 11L and 11R whose optical axes do not overlap each other, and an image processor 10. In the description below, the image processor 10 may also be called an image processing device. Since the optical axes are separated, a plurality of images captured simultaneously by the plurality of imaging units 11L and 11R will have parallax d.
  • The term "simultaneous" as used herein is not limited to the exact same time. For example, "simultaneous" imaging as used in the disclosure includes (i) the plurality of cameras capturing images at the same time, (ii) the plurality of cameras capturing images in response to the same signal, and (iii) the plurality of cameras capturing images at the same time according to respective internal clocks. The imaging start time, the imaging end time, the transmission time of captured image data, and the time at which another device receives image data are included in the time standard for imaging. The stereo camera device 1 may include a plurality of cameras in a single housing. The stereo camera device 1 may also include two or more independent cameras positioned apart from each other. The stereo camera device 1 is not limited to a plurality of independent cameras. In the present disclosure, a camera having an optical mechanism that guides light incident at two separate locations to one optical detector, for example, may be adopted as the stereo camera device 1.
  • The first imaging unit 11L and the second imaging unit 11R each include a solid-state image sensor. Solid-state image sensors include Charge-Coupled Device (CCD) image sensors and Complementary MOS (CMOS) image sensors. The first imaging unit 11L and the second imaging unit 11R may each include a lens mechanism. The first imaging unit 11L and the second imaging unit 11R capture images of real space to generate a first captured image 14L and a second captured image 14R, respectively.
  • The plurality of imaging units 11L and 11R is mounted in a vehicle 15, which is placed on a horizontal plane as illustrated in FIG. 2, in a manner such that the optical axes of the imaging units 11L and 11R are parallel, lens surfaces and imaging surfaces of the imaging units 11L and 11R are on the same planes, and the baseline length direction is horizontal. In the state where the two imaging units 11L and 11R are mounted in the correct position and correct posture, the stereo camera device 1 is able to measure a distance from the stereo camera device 1 to the subject with accuracy. Hereinafter, one of the two imaging units 11L and 11R that is mounted on the left side when looking at the stereo camera device 1 from the opposite side of the subject is called the first imaging unit 11L, and the other one that is mounted on the right side is called the second imaging unit 11R.
  • In the present embodiment, the first imaging unit 11L and the second imaging unit 11R are able to capture an image of the outside of the vehicle 15 via the windshield of the vehicle 15. Alternatively, the first imaging unit 11L and the second imaging unit 11R may be fixed to any one of the front bumper, the fender grille, the side fenders, the light modules, and the bonnet of the vehicle 15.
  • The term "parallel" as used above is not limited to strict parallelism. The term "parallel" may encompass a substantially parallel state in which the optical axes of the imaging units 11L and 11R are considered substantially parallel, e.g., misaligned and not perfectly parallel. The term "horizontal" as used above is not limited to strict horizontality. The term "horizontal" may encompass a substantially horizontal state in which the baseline length direction is for example deviated from the perfectly horizontal position with respect to the direction of the horizon plane.
  • Here, "vehicle" in the present disclosure includes, but is not limited to, automobiles, railway vehicles, industrial vehicles, and vehicles for daily life. For example, "vehicle" may include aircraft that travel down a runway. Automobiles include, but are not limited to, passenger vehicles, trucks, buses, two-wheeled vehicles, and trolley buses, and may include other vehicles that drive on a road. Railway vehicles include, but are not limited to, locomotives, freight cars, passenger cars, streetcars, guided railway vehicles, ropeways, cable cars, linear motor cars, and monorails, and may include other vehicles that travel along a rail. Industrial vehicles include industrial vehicles for agriculture and for construction. Industrial vehicles include, but are not limited to, forklifts and golf carts. Industrial vehicles for agriculture include, but are not limited to, tractors, cultivators, translators, binders, combines, and lawnmowers. Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, backhoes, cranes, dump cars, and road rollers. Vehicles for daily life include, but are not limited to, bicycles, wheelchairs, baby carriages, wheelbarrows, and motorized, two-wheeled standing vehicles. Power engines for the vehicle include, but are not limited to, internal-combustion engines including diesel engines, gasoline engines, and hydrogen engines, and electrical engines including motors. The "vehicle" is not limited to the above-listed types. For example, automobiles may include industrial vehicles that can drive on a road, and the same vehicle may be included in multiple categories.
  • Next, the image processor 10 is described with reference to FIG. 3. As illustrated in FIG. 3, the image processor 10 includes an input interface 12 and a controller 13.
  • The input interface 12 is an input interface for inputting image data to the image processor 10. A physical connector or a wireless communication device may be used in the input interface 12. Physical connectors include an electrical connector corresponding to transmission by an electric signal, an optical connector corresponding to transmission by an optical signal, and an electromagnetic connector corresponding to transmission by an electromagnetic wave. Electrical connectors include connectors conforming to IEC 60603, connectors conforming to the USB standard, connectors comprising an RCA terminal, connectors comprising an S terminal prescribed by EIAJ CP-1211A, connectors comprising a D terminal prescribed by EIAJ RC-5237, connectors conforming to the HDMI® (HDMI is a registered trademark in Japan, other countries, or both) standard, and connectors corresponding to a coaxial cable including a BNC terminal. Optical connectors include a variety of connectors conforming to IEC 61754. Wireless communication devices include wireless communication devices conforming to standards that include Bluetooth® (Bluetooth is a registered trademark in Japan, other countries, or both) and IEEE 802.11. The wireless communication device includes at least one antenna.
  • Image data of images captured by the first imaging unit 11L and the second imaging unit 11R is inputted to the input interface 12. The input interface 12 delivers the inputted image data to the controller 13. Input to the input interface 12 includes signals input over a wired cable and signals input over a wireless connection. The input interface 12 may correspond to the transmission method of an image signal in the stereo camera device 1.
  • The controller 13 includes one or a plurality of processors. The controller 13 or the processors may include one or a plurality of memories that store programs for various processing and information on which operations are being performed. Memories include volatile and nonvolatile memories. Memories also include those independent of processors and those embedded in processors. Processors include universal processors that execute particular functions by reading particular programs and dedicated processors that are specialized for particular processing. Dedicated processors include an Application Specific Integrated Circuit (ASIC) for a specific application. Processors include a Programmable Logic Device (PLD). PLDs include a Field-Programmable Gate Array (FPGA). The controller 13 may be either a System-on-a-Chip (SoC) or a System in a Package (SiP) with one processor or a plurality of processors that work together.
  • The controller 13 measures the distance in real space from the stereo camera device 1 to the subject captured in the first captured image 14L and the second captured image 14R, which have been inputted to the input interface 12.
  • In the stereo camera device 1, the controller 13 calculates the distance from the stereo camera device 1 to the subject in a spatial coordinate system as illustrated in FIG. 1. The spatial coordinate system includes an X-axis in the direction of the baseline length and a Y-axis and a Z-axis in two directions that are perpendicular to the baseline length and that are also perpendicular to each other, with any point being defined as the origin. The optical axes of the first imaging unit 11L and the second imaging unit 11R are parallel to the Z-axis, the row direction of the imaging surfaces is parallel to the X-axis, and the column direction of the imaging surfaces is parallel to the Y-axis. In the spatial coordinate system, the rotation angle around the X-axis is defined as pitch angle φ, and the rotation angle around the Z-axis is defined as roll angle ω.
  • In the stereo camera device 1, both of the optical axes are parallel to the Z-axis, and the column direction of the imaging surfaces is parallel to the Y-axis, which is perpendicular to the baseline length direction. Accordingly, the positions of spot images of the same subject differ only in the row direction in the first captured image 14L and in the second captured image 14R. Therefore, to calculate the distance at a high speed, such as 30 fps, the stereo camera device 1 performs one-dimensional matching along the direction parallel to the baseline length, i.e., along the X-axis direction, to bring the spot images of the same subject in the first captured image 14L and in the second captured image 14R into correspondence with each other.
  • However, the accuracy of the correspondence between the spot images according to the above one-dimensional matching decreases as a displacement ΔY along the Y-axis direction of the first imaging unit 11L with respect to the second imaging unit 11R in the external orientation parameters increases. Similarly, the accuracy of the correspondence between the spot images according to the one-dimensional matching decreases as misalignment Δφ of pitch angle φ of the optical axes increases. To address the above, as described below, the stereo camera device 1 calibrates the first imaging unit 11L, with reference to the second imaging unit 11R, for at least one of the position in the Y-axis direction and the pitch angle φ, based on the first captured image 14L and the second captured image 14R.
  • The controller 13 calculates parallax d of pixel positions in the first captured image 14L and in the second captured image 14R. The pixel positions in the first captured image 14L and in the second captured image 14R are represented by the image coordinate system (u, v), having a U-axis that is parallel to the row direction of the imaging surfaces and a V-axis that is parallel to the column direction of the imaging surfaces. The controller 13 calculates parallax d according to the one-dimensional matching along the row direction of the imaging surfaces. In detail, for each v-coordinate, the controller 13 compares the one-dimensional pixel value distribution in the U-axis direction in the first captured image 14L with the one-dimensional pixel value distribution at the same v-coordinate in the second captured image 14R. The controller 13 calculates, as parallax d, the difference in position between two pixels whose pixel values correspond to each other in the two distributions.
  • Here, a detailed description of a method to calculate parallax d is provided. The controller 13 determines a constant v-coordinate on the V-axis for which parallax d is to be calculated. Here, the controller 13 is assumed to calculate parallax d for v1 on the V-axis. The controller 13 extracts pixel values of the pixels at v = v1 from the first captured image 14L and the second captured image 14R, as illustrated in FIGS. 4(1) and 4(2). The pixel value distributions extracted from the first captured image 14L and the second captured image 14R are illustrated, for example, in FIGS. 4(3) and 4(4). Based on the two extracted pixel value distributions, the controller 13 brings pixels of the second captured image 14R into correspondence with pixels of the first captured image 14L according to the one-dimensional matching. That is, the controller 13 extracts the pixels on the second captured image 14R that are most likely to represent the same spot image as pixels on the first captured image 14L and brings the extracted pixels into correspondence with the pixels on the first captured image 14L. The controller 13 calculates the difference between the position (uL1, v1) and the position (uR1, v1) of corresponding pixels respectively in the first captured image 14L and the second captured image 14R as parallax d = uL1 - uR1.
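  • For reference, the following is a minimal sketch, in Python with NumPy, of the one-dimensional matching described above: for one image row (a fixed v), each pixel of the first captured image is matched along the same row of the second captured image and the offset of the best match is taken as parallax d = uL1 - uR1. The window size, search range, and sum-of-absolute-differences cost are assumptions for illustration; the description does not prescribe a particular matching cost.

```python
import numpy as np

def scanline_parallax(left_row: np.ndarray, right_row: np.ndarray,
                      half_window: int = 3, max_d: int = 100) -> np.ndarray:
    """Parallax d for every u on one image row (v = const) of the first captured image."""
    width = left_row.shape[0]
    d_map = np.zeros(width, dtype=np.float32)
    for u_l in range(half_window, width - half_window):
        template = left_row[u_l - half_window: u_l + half_window + 1].astype(np.int32)
        best_cost, best_d = np.inf, 0
        # d = uL - uR >= 0: the spot image appears at a smaller u in the second image.
        for d in range(0, min(max_d, u_l - half_window) + 1):
            u_r = u_l - d
            candidate = right_row[u_r - half_window: u_r + half_window + 1].astype(np.int32)
            cost = int(np.abs(template - candidate).sum())  # sum of absolute differences
            if cost < best_cost:
                best_cost, best_d = cost, d
        d_map[u_l] = best_d
    return d_map
```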
  • The controller 13 determines, as a parallax approximate region, a region in a parallax image that includes continuous pixels whose difference in parallax d is within a predetermined range. A parallax image refers to an image representing the shift amount between pixels forming the same spot image in two captured images captured simultaneously. In the example of the present embodiment, the parallax image represents, for each pixel of the first captured image 14L, the shift amount in the U-axis direction between that pixel and the corresponding pixel forming the same spot image in the second captured image 14R.
  • A description of the specific processing of the parallax approximate region is now provided with reference to the example illustrated in FIG. 5. FIG. 5 is a conceptual view of parallax d in the subject region 16 illustrated in FIG. 4(1). The squares each surrounding a number in FIG. 5 correspond to the pixels in the subject region 16, and parallax d of each pixel is illustrated at the position of the pixel. The thick line in FIG. 5 encloses the pixels constituting an optical image of a vehicle in FIG. 4(1). Parallax d of the pixels surrounded by the thick line is generally within the range of 79 to 82. Parallax d changes depending on the distance from the stereo camera device 1, and a subject located at substantially the same distance from the stereo camera device 1 has substantially the same parallax d. A subject having a measurable size on the first captured image 14L and on the second captured image 14R is located at substantially the same distance from the stereo camera device 1, and parallax d of the pixels forming the optical image of the subject therefore falls within a predetermined range. In other words, an object is present in a region in which parallax d is within a predetermined range. In the present embodiment, the controller 13 determines, as the parallax approximate region, a region including continuous pixels in which parallax d is within a predetermined range.
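  • A minimal sketch of determining a parallax approximate region is shown below: starting from a seed pixel, 4-connected pixels whose parallax differs from the seed parallax by no more than a tolerance are grouped together. The seed, the connectivity, and the tolerance (a few pixels, in line with the 79-to-82 range of the FIG. 5 example) are assumptions; the description only requires the difference in parallax to be within a predetermined range.

```python
from collections import deque

import numpy as np

def parallax_approximate_region(d_map: np.ndarray, seed: tuple[int, int],
                                tolerance: float = 3.0) -> list[tuple[int, int]]:
    """Collect 4-connected (v, u) pixels whose parallax stays within `tolerance` of the seed."""
    rows, cols = d_map.shape
    seed_d = d_map[seed]
    visited = np.zeros(d_map.shape, dtype=bool)
    visited[seed] = True
    region, queue = [], deque([seed])
    while queue:
        v, u = queue.popleft()
        region.append((v, u))
        for dv, du in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nv, nu = v + dv, u + du
            if (0 <= nv < rows and 0 <= nu < cols and not visited[nv, nu]
                    and abs(d_map[nv, nu] - seed_d) <= tolerance):
                visited[nv, nu] = True
                queue.append((nv, nu))
    return region
```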
  • The controller 13 extracts at least one first feature point P1 from the parallax approximate region. The first feature point P1 refers to a characteristic point on the first captured image 14L, that is, a point having a pixel feature value that satisfies predetermined requirements. For example, a vertex of an edge at which the differential of the brightness value is equal to or greater than a predetermined value may be regarded as the first feature point P1.
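  • A minimal sketch of extracting candidate first feature points P1 from a grayscale first captured image follows. It uses a central-difference gradient magnitude as a stand-in for the "differential of brightness value"; the threshold value is an assumption.

```python
import numpy as np

def extract_feature_points(image: np.ndarray, region: list[tuple[int, int]],
                           threshold: float = 80.0) -> list[tuple[int, int]]:
    """Return (v, u) positions inside the region whose brightness gradient is large."""
    img = image.astype(np.float32)
    gy, gx = np.gradient(img)       # central-difference derivatives along v and u
    magnitude = np.hypot(gx, gy)    # gradient magnitude used as the "differential"
    return [(v, u) for (v, u) in region if magnitude[v, u] >= threshold]
```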
  • The controller 13 determines whether the first feature point P1 extracted from the first captured image 14L is suitable to be used for calibration processing. When the first feature point P1 is a part of at least one of a linear edge and a repetitive pattern, an error tends to occur in matching the second feature point P2 with the first feature point P1. For the above reason, the controller 13 determines whether the first feature point P1 and a region including the vicinity of the first feature point P1 include a linear pattern or a repetitive pattern. The second feature point P2 refers to a feature point on the second captured image 14R that has a feature value within a predetermined range of the feature value of the first feature point P1.
  • When it is determined that the first feature point P1 is a part of at least one of a linear edge and a repetitive pattern, this first feature point P1 is not used in the subsequent processing, and another first feature point P1 different from this first feature point P1 is used. When it is determined that the first feature point P1 is not a part of at least one of a linear edge and a repetitive pattern, the controller 13 retrieves and extracts the second feature point P2 corresponding to this first feature point P1 by using a conventionally known two-dimensional pattern matching method. The controller 13 performs the two-dimensional pattern matching with sub-pixel precision by using, for example, an interpolation method.
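  • One possible heuristic for the suitability check, sketched below, is not the method prescribed by the description: a structure-tensor test rejects points lying on a purely linear edge, and a self-similarity test along the U-axis rejects points inside a repetitive pattern. The patch size and both thresholds are assumptions, and a grayscale image is assumed.

```python
import numpy as np

def is_suitable_feature(image: np.ndarray, p1: tuple[int, int],
                        half_patch: int = 4, corner_ratio: float = 0.1,
                        repeat_threshold: float = 2.0) -> bool:
    """Reject P1 if its neighbourhood looks like a linear edge or a repetitive pattern."""
    v, u = p1
    img = image.astype(np.float32)
    rows, cols = img.shape
    if (v - half_patch < 0 or u - half_patch < 0
            or v + half_patch + 1 > rows or u + half_patch + 1 > cols):
        return False  # too close to the border for the checks below
    gy, gx = np.gradient(img)
    sl = (slice(v - half_patch, v + half_patch + 1),
          slice(u - half_patch, u + half_patch + 1))
    # Structure tensor of the patch: a linear edge (or a flat patch) has one
    # dominant eigenvalue, so the smaller eigenvalue must not be negligible.
    ixx, iyy, ixy = (gx[sl] ** 2).sum(), (gy[sl] ** 2).sum(), (gx[sl] * gy[sl]).sum()
    trace, det = ixx + iyy, ixx * iyy - ixy ** 2
    if trace <= 0.0:
        return False
    lam_small = (trace - np.sqrt(max(trace ** 2 - 4.0 * det, 0.0))) / 2.0
    if lam_small < corner_ratio * (trace - lam_small):
        return False  # essentially a linear edge
    # Repetitive-pattern check: if the patch matches itself almost exactly at a
    # nonzero shift along the U-axis, matching in the other image is ambiguous.
    patch = img[sl]
    for shift in range(2, 4 * half_patch):
        shifted = img[v - half_patch: v + half_patch + 1,
                      u - half_patch + shift: u + half_patch + 1 + shift]
        if shifted.shape != patch.shape:
            break
        if np.abs(patch - shifted).mean() < repeat_threshold:
            return False  # repetitive along the row direction
    return True
```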
  • A starting point PS of the retrieval of the second feature point P2 in the second captured image 14R is positioned at (uL2 + d, vL2), which is offset by parallax d in the U-axis direction from the position (uL2, vL2) of the first feature point P1. The controller 13 retrieves the second feature point P2 within a predetermined dimensional range centered on the position (uL2 + d, vL2). The predetermined dimensional range may be, for example, the range within 1 to 2 pixels from the starting point PS in the U-axis direction and within 1 to 2 pixels from the starting point PS in the V-axis direction.
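  • The sketch below illustrates one way to realize the two-dimensional matching around the starting point PS, refined to sub-pixel precision by parabolic interpolation of the cost surface. Parabolic interpolation is one common choice and an assumption here, since the description only refers to "an interpolation method"; the SAD cost and window size are also assumptions, while the search radius follows the 1-to-2-pixel range mentioned above. P1 and PS are assumed to lie far enough from the image borders.

```python
import numpy as np

def match_second_feature(first_image: np.ndarray, second_image: np.ndarray,
                         u1: int, v1: int, u_s: int, v_s: int,
                         half_window: int = 4, radius: int = 2) -> tuple[float, float]:
    """Return the sub-pixel position (u2, v2) of P2, given P1 at (u1, v1) and PS at (u_s, v_s)."""
    template = first_image[v1 - half_window: v1 + half_window + 1,
                           u1 - half_window: u1 + half_window + 1].astype(np.float32)
    # SAD cost of every integer candidate within +/- radius of the starting point PS.
    cost = np.full((2 * radius + 1, 2 * radius + 1), np.inf, dtype=np.float32)
    for dv in range(-radius, radius + 1):
        for du in range(-radius, radius + 1):
            v, u = v_s + dv, u_s + du
            patch = second_image[v - half_window: v + half_window + 1,
                                 u - half_window: u + half_window + 1].astype(np.float32)
            if patch.shape == template.shape:
                cost[dv + radius, du + radius] = np.abs(template - patch).sum()
    iv, iu = np.unravel_index(int(np.argmin(cost)), cost.shape)

    def parabolic_offset(c_minus: float, c_zero: float, c_plus: float) -> float:
        denom = c_minus - 2.0 * c_zero + c_plus
        return 0.0 if denom == 0.0 else 0.5 * (c_minus - c_plus) / denom

    # Refine the integer minimum to sub-pixel precision along each axis separately.
    du_sub = dv_sub = 0.0
    if 0 < iu < 2 * radius and np.isfinite(cost[iv, iu - 1]) and np.isfinite(cost[iv, iu + 1]):
        du_sub = parabolic_offset(cost[iv, iu - 1], cost[iv, iu], cost[iv, iu + 1])
    if 0 < iv < 2 * radius and np.isfinite(cost[iv - 1, iu]) and np.isfinite(cost[iv + 1, iu]):
        dv_sub = parabolic_offset(cost[iv - 1, iu], cost[iv, iu], cost[iv + 1, iu])
    return u_s + iu - radius + du_sub, v_s + iv - radius + dv_sub
```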
  • Based on the position of the first feature point P1 and the position of the second feature point P2 that have been extracted, the controller 13 calibrates the first imaging unit 11L, with reference to the second imaging unit 11R.
  • Here, a concrete example of the calibration performed by the controller 13 is described.
  • In the following concrete example, a parameter corresponding to translation of the first imaging unit 11L in a direction perpendicular to the baseline length and to the optical axis is set as the external orientation parameter to be updated. The direction perpendicular to the baseline length and to the optical axis is the Y-axis direction in the spatial coordinate system illustrated in FIG. 1. The X-axis and Y-axis in the spatial coordinate system are respectively parallel to the U-axis and V-axis in the image coordinate system.
  • In a case where the position of the first feature point P1 extracted as above is represented by (uL2, vL2) and the position of the second feature point P2 extracted as above is represented by (uR2, vR2), the controller 13 determines that the positions in the first captured image 14L are offset by the shift amount Δv = vL2 - vR2, with reference to the second captured image 14R. The controller 13 determines that the first imaging unit 11L is offset, with reference to the second imaging unit 11R, by the shift amount ΔYL in the spatial coordinate system that corresponds to the shift amount Δv in the image coordinate system, and updates the position YL of the first imaging unit 11L in the Y-axis direction by using the shift amount ΔYL. The controller 13 performs the same processing when the pitch angle φ, which is the rotation angle around the X-axis, is set as the external orientation parameter to be updated.
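  • A minimal sketch of this calibration step follows: the shift Δv = vL2 - vR2 is averaged over the matched feature-point pairs and then mapped to an update of the position YL. The conversion factor from an image-row shift to a shift ΔYL in the spatial coordinate system depends on the camera model and is left as an assumed parameter here; how the shift is instead mapped to the pitch angle φ is likewise model-dependent and not shown.

```python
import numpy as np

def mean_row_shift(p1_list: list[tuple[float, float]],
                   p2_list: list[tuple[float, float]]) -> float:
    """Mean shift amount dv = vL2 - vR2 over the matched (u, v) feature-point pairs."""
    if not p1_list:
        return 0.0
    return float(np.mean([v1 - v2 for (_, v1), (_, v2) in zip(p1_list, p2_list)]))

def update_y_position(y_l: float, dv: float, rows_to_metres: float) -> float:
    """Update position Y_L of the first imaging unit from the image-row shift dv.

    rows_to_metres is an assumed conversion factor from the shift dv in the image
    coordinate system to the shift dY_L in the spatial coordinate system; its value
    (and sign) depends on the camera model and is not specified by the description.
    """
    return y_l + rows_to_metres * dv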
  • Next, processing performed by the image processor 10 according to the present embodiment is described with reference to the flowchart illustrated in FIG. 6. The controller 13 starts the processing upon receiving at least one of a start-up instruction, a stop instruction, and a calibration control executing instruction from the stereo camera device 1.
  • Firstly, the input interface 12 receives input of the first captured image 14L and the second captured image 14R, which have been respectively generated by the first imaging unit 11L and the second imaging unit 11R (Step S1).
  • Subsequently, based on the pixel values of the first captured image 14L and the pixel values of the second captured image 14R that have been inputted to the input interface 12, the controller 13 calculates parallax d of the pixels of the first captured image 14L with respect to the corresponding pixels of the second captured image 14R according to the one-dimensional matching (Step S2).
  • Subsequently, the controller 13 determines the parallax approximate region in the first captured image 14L (Step S3). Then, the controller 13 retrieves and extracts at least one first feature point P1 from the determined parallax approximate region (Step S4). The controller 13 determines whether the extracted first feature point P1 is suitable to be used for calibration processing (Step S5); the first feature point P1 is determined as suitable when it is not a part of at least one of a linear edge and a repetitive pattern. After that, the controller 13 retrieves and extracts from the second captured image 14R the second feature point P2 corresponding to the first feature point P1 that has been determined as suitable to be used for calibration processing (Step S6).
  • Then, based on the first feature point P1 and the second feature point P2, the controller 13 calibrates the first imaging unit 11L, with reference to the second imaging unit 11R (Step S7).
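  • For reference, a short sketch that strings the functions from the previous sketches together, following steps S2 to S7 of the flowchart. The seed pixel for the parallax approximate region, all thresholds, and the final mapping of the mean row shift to an external orientation parameter are assumed inputs; grayscale images are assumed.

```python
import numpy as np

def calibrate(first_image: np.ndarray, second_image: np.ndarray,
              seed: tuple[int, int]) -> float:
    """Run steps S2 to S7 on a pair of grayscale images; return the mean row shift dv."""
    d_map = np.zeros(first_image.shape, dtype=np.float32)
    for v in range(first_image.shape[0]):                              # S2: one-dimensional matching
        d_map[v] = scanline_parallax(first_image[v], second_image[v])
    region = parallax_approximate_region(d_map, seed)                  # S3: parallax approximate region
    candidates = extract_feature_points(first_image, region)           # S4: first feature points P1
    p1_list, p2_list = [], []
    for (v1, u1) in candidates:
        if not is_suitable_feature(first_image, (v1, u1)):             # S5: suitability check
            continue
        d = float(d_map[v1, u1])
        # Starting point PS: offset P1 along the U-axis by the parallax
        # (sign according to the d = uL - uR convention used in scanline_parallax).
        u2, v2 = match_second_feature(first_image, second_image,
                                      u1, v1, int(round(u1 - d)), v1)  # S6: second feature point P2
        p1_list.append((u1, v1))
        p2_list.append((u2, v2))
    return mean_row_shift(p1_list, p2_list)                            # S7: calibration quantity
```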
  • According to the image processing device of the present embodiment, the controller 13 calculates parallax d according to the one-dimensional matching and also extracts at least one first feature point P1 from the region including continuous pixels whose difference in parallax d is within a predetermined range. Accordingly, compared with cases where the controller 13 retrieves the first feature point P1 over the entire region of the first captured image 14L according to the two-dimensional matching, the first imaging unit 11L is calibrated rapidly, with reference to the second imaging unit 11R. Since it is highly likely that the pixels constituting the parallax approximate region do not contain noise, the controller 13 is able to extract the first feature point P1 with accuracy. This allows the controller 13 to perform the calibration with accuracy.
  • According to the image processing device of the present embodiment, the second feature point P2 is extracted based on the first feature point P1 that is different from any first feature point P1 corresponding to a part of at least one of a linear edge and a repetitive pattern. Since the first feature point P1 is neither a part of a linear edge nor a part of a repetitive pattern, a feature point similar to the first feature point P1 is less likely to be included in the first captured image 14L. Similarly, a feature point similar to the first feature point P1 is less likely to be included in the second captured image 14R, which corresponds to the first captured image 14L, apart from the second feature point P2 which corresponds to the first feature point P1. Accordingly, the second feature point P2, which corresponds to the first feature point P1, is determined with accuracy and without being mistaken for another feature point similar to the first feature point P1.
  • According to the image processing device of the present embodiment, the controller 13 determines the starting point PS for the two-dimensional matching based on the position of the first feature point P1 and parallax d at the position, and therefore, the second feature point P2 is retrieved within a range in which the second feature point P2 is very likely to be present. This allows the controller 13 to extract the second feature point P2 rapidly.
  • According to the image processing device of the present embodiment, since the controller 13 performs the two-dimensional matching with sub-pixel precision, the calibration is performed even when the first imaging unit 11L is offset by a shift amount of less than one pixel, with reference to the second imaging unit 11R. That is, the controller 13 calibrates the first imaging unit 11L with high precision, with reference to the second imaging unit 11R.
  • For example, in the present embodiment, the controller 13 may use a plurality of first feature points P1 and extract a plurality of second feature points P2 corresponding to the plurality of first feature points P1. In this case, the calibration is performed with high accuracy even when the roll angle ωL of the first imaging unit 11L is misaligned with the roll angle ωR of the second imaging unit 11R.
  • Although in the present embodiment the controller 13 calibrates the first imaging unit 11L with reference to the second imaging unit 11R, the controller 13 may instead calibrate the second imaging unit 11R with reference to the first imaging unit 11L.
  • Although in the present embodiment the controller 13 extracts the second feature point P2 based on the first feature point P1 extracted from the first captured image 14L, the controller 13 may also extract the first feature point P1 based on the second feature point P2 extracted from the second captured image 14R.
  • Although in the present embodiment the stereo camera device 1 includes the image processor 10, another device may include the image processor 10, and the controller 13 of the stereo camera device 1 may perform control for calibration based on the first captured image 14L and the second captured image 14R that are inputted from the other device to the input interface 12 via a communication network or the like.
  • REFERENCE SIGNS LIST
  • 1
    Stereo camera device
    10
    Image processor
    11L
    First imaging unit
    11R
    Second imaging unit
    12
    Input interface
    13
    Controller
    14L
    First captured image
    14R
    Second captured image
    15
    Vehicle
    16
    Subject region

Claims (9)

  1. An image processing device (10), comprising:
    an input interface (12) configured to receive input of a first captured image (14L) and a second captured image (14R) captured by a plurality of imaging units (11L, 11R) of a stereo camera; and
    a controller (13) configured to calculate parallax (d) by performing a one-dimensional matching based on pixel values of the first captured image (14L) and pixel values of the second captured image (14R), extract one or more first feature points (P1) from a region in the first captured image (14L) that includes continuous pixels having a difference in parallax (d) which is within a predetermined range, extract one or more second feature points (P2) corresponding respectively to the one or more first feature points (P1) by performing a two-dimensional matching between the one or more first feature points (P1) and pixels in the second captured image (14R), and calibrate at least one of the plurality of imaging units (11L, 11R) based on positions of the one or more first feature points (P1) and positions of the one or more second feature points (P2).
  2. The image processing device (10) of claim 1, wherein
    the controller (13) is configured to perform the two-dimensional matching based on at least one first feature point (P1) in the one or more first feature points (P1) that is different from the first feature points (P1) corresponding to a part of at least one of a linear edge and a repetitive pattern.
  3. The image processing device (10) of claim 1 or 2, wherein
    the controller (13) is configured to determine a starting point (PS) for the two-dimensional matching based on positions of the one or more first feature points (P1) and parallax (d) at the positions.
  4. The image processing device (10) of any one of claims 1 to 3, wherein
    the controller (13) is configured to perform the two-dimensional matching with sub-pixel precision.
  5. The image processing device (10) of any one of claims 1 to 4, wherein
    the one or more first feature points (P1) extracted from the first captured image (14L) by the controller (13) comprise a plurality of first feature points (P1), and the one or more second feature points (P2) extracted from the second captured image (14R) by the controller (13) comprise a plurality of second feature points (P2) corresponding respectively to the plurality of first feature points (P1), and the controller (13) is configured to calibrate at least one of the plurality of imaging units (11L, 11R) based on positions of the plurality of first feature points (P1) and positions of the plurality of second feature points (P2) corresponding to the plurality of first feature points (P1).
  6. The image processing device (10) of claim 5, wherein
    the controller (13) is configured to calibrate a posture of the stereo camera based on the positions of the plurality of first feature points (P1) and the positions of the plurality of second feature points (P2).
  7. A stereo camera device (1), comprising:
    a plurality of imaging units (11L, 11R); and
    an image processing device (10) according to claim 1.
  8. A vehicle, comprising
    a stereo camera device (1) according to claim 7.
  9. An image processing method, in which a controller (13) in an image processing device (10) performs steps comprising
    calculating parallax (d) by performing a one-dimensional matching based on pixel values in a first captured image (14L) captured by a first imaging unit in a plurality of imaging units (11L, 11R) and pixel values in a second captured image (14R) captured by a second imaging unit in the plurality of imaging units (11L, 11R) that is different from the first imaging unit;
    extracting one or more first feature points (P1) from a region in the first captured image (14L) that includes continuous pixels having a difference in parallax (d) which is within a predetermined range;
    extracting one or more second feature points (P2) corresponding respectively to the one or more first feature points (P1) by performing a two-dimensional matching between the one or more first feature points (P1) and pixels in the second captured image (14R); and
    calibrating at least one of the plurality of imaging units (11L, 11R) based on positions of the one or more first feature points (P1) and positions of the one or more second feature points (P2).
EP16813972.3A 2015-06-24 2016-06-24 Image processing device, stereo camera device, vehicle, and image processing method Active EP3315905B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015126817 2015-06-24
PCT/JP2016/003058 WO2016208200A1 (en) 2015-06-24 2016-06-24 Image processing device, stereo camera device, vehicle, and image processing method

Publications (3)

Publication Number Publication Date
EP3315905A1 EP3315905A1 (en) 2018-05-02
EP3315905A4 EP3315905A4 (en) 2019-02-27
EP3315905B1 true EP3315905B1 (en) 2020-04-22

Family

ID=57584803

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16813972.3A Active EP3315905B1 (en) 2015-06-24 2016-06-24 Image processing device, stereo camera device, vehicle, and image processing method

Country Status (4)

Country Link
US (1) US10360462B2 (en)
EP (1) EP3315905B1 (en)
JP (1) JP6121641B1 (en)
WO (1) WO2016208200A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110998241A (en) * 2018-01-23 2020-04-10 深圳市大疆创新科技有限公司 System and method for calibrating an optical system of a movable object
US11182914B2 (en) 2018-05-21 2021-11-23 Facebook Technologies, Llc Dynamic structured light for depth sensing systems based on contrast in a local area
JP7173836B2 (en) 2018-11-05 2022-11-16 京セラ株式会社 Controller, position determination device, position determination system, display system, program, and recording medium
CN109559305B (en) * 2018-11-26 2023-06-30 易思维(杭州)科技有限公司 Line structured light image rapid processing system based on SOC-FPGA

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5383013A (en) 1992-09-18 1995-01-17 Nec Research Institute, Inc. Stereoscopic computer vision system
JP2966248B2 (en) * 1993-10-01 1999-10-25 シャープ株式会社 Stereo compatible search device
JP3280001B2 (en) 1999-09-16 2002-04-30 富士重工業株式会社 Stereo image misalignment adjustment device
JP2001266144A (en) 2000-03-17 2001-09-28 Glory Ltd Device and method for collating picture, and computer- readable recording medium with program for making computer perform the method recorded thereon
JP2004053407A (en) 2002-07-19 2004-02-19 Fuji Heavy Ind Ltd Stereo image processing apparatus and stereo image processing method
US8922623B2 (en) 2009-03-31 2014-12-30 Panasonic Corporation Stereo image processor and stereo image processing method
JP5588812B2 (en) * 2010-09-30 2014-09-10 日立オートモティブシステムズ株式会社 Image processing apparatus and imaging apparatus using the same
JP2012198075A (en) * 2011-03-18 2012-10-18 Ricoh Co Ltd Stereoscopic camera device and image adjusting method
JP6202367B2 (en) * 2013-05-14 2017-09-27 株式会社リコー Image processing device, distance measurement device, mobile device control system, mobile device, and image processing program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
EP3315905A1 (en) 2018-05-02
JPWO2016208200A1 (en) 2017-06-29
WO2016208200A1 (en) 2016-12-29
US20180165528A1 (en) 2018-06-14
US10360462B2 (en) 2019-07-23
EP3315905A4 (en) 2019-02-27
JP6121641B1 (en) 2017-04-26

Similar Documents

Publication Publication Date Title
EP3358295B1 (en) Image processing device, stereo camera device, vehicle, and image processing method
EP3306267B1 (en) Arithmetic logic device, camera device, vehicle and calibration method
EP3315905B1 (en) Image processing device, stereo camera device, vehicle, and image processing method
US9378553B2 (en) Stereo image processing device for vehicle
EP3330664B1 (en) Parallax calculating device, stereo camera device, vehicle, and parallax calculating method
EP3316006B1 (en) Three-dimensional-object detection device, stereo camera device, vehicle, and three-dimensional-object detection method
WO2018220184A1 (en) 3d vision system for a motor vehicle and method of controlling a 3d vision system
EP2913999A1 (en) Disparity value deriving device, equipment control system, movable apparatus, robot, disparity value deriving method, and computer-readable storage medium
US10769804B2 (en) Parallax calculation apparatus, stereo camera apparatus, vehicle, and parallax calculation method
CN114365182A (en) Object detection device, object detection system, moving body, and object detection method
JP6855325B2 (en) Image processing equipment, stereo camera systems, moving objects, road surface shape detection methods and programs
CN112639864B (en) Method and apparatus for ranging
JP2018201167A (en) Image processing device, periphery monitoring system, and mobile body
JP2022024872A (en) Image processing device, stereo camera device, moving object, and image processing method
CN114402357A (en) Road surface detection device, object detection system, moving body, and object detection method
JP2022066925A (en) Image processing device, stereo camera device, moving body, and image processing method
CN112270311A (en) Near-target rapid detection method and system based on vehicle-mounted panoramic inverse projection
CN111267841A (en) Robot distance acquisition system

Legal Events

Date Code Title Description

STAA  Information on the status of an ep patent application or granted ep patent
      STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI  Public reference made under article 153(3) epc to a published international application that has entered the european phase
      ORIGINAL CODE: 0009012

STAA  Information on the status of an ep patent application or granted ep patent
      STATUS: REQUEST FOR EXAMINATION WAS MADE

17P   Request for examination filed
      Effective date: 20180115

AK    Designated contracting states
      Kind code of ref document: A1
      Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX    Request for extension of the european patent
      Extension state: BA ME

DAV   Request for validation of the european patent (deleted)

DAX   Request for extension of the european patent (deleted)

A4    Supplementary search report drawn up and despatched
      Effective date: 20190130

RIC1  Information provided on ipc code assigned before grant
      Ipc: G01C 25/00 20060101ALI20190124BHEP
      Ipc: G06T 7/80 20170101ALI20190124BHEP
      Ipc: G03B 35/08 20060101ALI20190124BHEP
      Ipc: G06T 1/00 20060101ALI20190124BHEP
      Ipc: H04N 7/18 20060101ALI20190124BHEP
      Ipc: G01C 3/06 20060101AFI20190124BHEP
      Ipc: G01C 3/14 20060101ALI20190124BHEP
      Ipc: H04N 13/246 20180101ALI20190124BHEP

REG   Reference to a national code
      Ref country code: DE; Ref legal event code: R079
      Ref document number: 602016034707; Country of ref document: DE
      PREVIOUS MAIN CLASS: G01C0003000000; Ipc: G01C0003060000

GRAP  Despatch of communication of intention to grant a patent
      ORIGINAL CODE: EPIDOSNIGR1

STAA  Information on the status of an ep patent application or granted ep patent
      STATUS: GRANT OF PATENT IS INTENDED

RIC1  Information provided on ipc code assigned before grant
      Ipc: G01C 3/14 20060101ALI20191115BHEP
      Ipc: G01C 25/00 20060101ALI20191115BHEP
      Ipc: G06T 7/80 20170101ALI20191115BHEP
      Ipc: H04N 7/18 20060101ALI20191115BHEP
      Ipc: G01C 3/06 20060101AFI20191115BHEP
      Ipc: G06T 1/00 20060101ALI20191115BHEP
      Ipc: H04N 13/246 20180101ALI20191115BHEP
      Ipc: G03B 35/08 20060101ALI20191115BHEP
      Ipc: H04N 13/239 20180101ALI20191115BHEP

INTG  Intention to grant announced
      Effective date: 20191217

RIN1  Information on inventor provided before grant (corrected)
      Inventor name: INOUE, SHUSHIN
      Inventor name: KOISHI, TOMOFUMI

GRAS  Grant fee paid
      ORIGINAL CODE: EPIDOSNIGR3

GRAA  (expected) grant
      ORIGINAL CODE: 0009210

STAA  Information on the status of an ep patent application or granted ep patent
      STATUS: THE PATENT HAS BEEN GRANTED

AK    Designated contracting states
      Kind code of ref document: B1
      Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG   Reference to a national code
      Ref country code: CH; Ref legal event code: EP

REG   Reference to a national code
      Ref country code: IE; Ref legal event code: FG4D

REG   Reference to a national code
      Ref country code: AT; Ref legal event code: REF; Ref document number: 1260704; Country of ref document: AT; Kind code of ref document: T; Effective date: 20200515

REG   Reference to a national code
      Ref country code: DE; Ref legal event code: R096; Ref document number: 602016034707; Country of ref document: DE

REG   Reference to a national code
      Ref country code: LT; Ref legal event code: MG4D

REG   Reference to a national code
      Ref country code: NL; Ref legal event code: MP; Effective date: 20200422

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
      IS (effective 20200822), SE (20200422), NL (20200422), GR (20200723), LT (20200422), FI (20200422), NO (20200722), PT (20200824)

REG   Reference to a national code
      Ref country code: AT; Ref legal event code: MK05; Ref document number: 1260704; Country of ref document: AT; Kind code of ref document: T; Effective date: 20200422

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
      HR (20200422), RS (20200422), LV (20200422), BG (20200722)

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
      AL (20200422)

REG   Reference to a national code
      Ref country code: DE; Ref legal event code: R097; Ref document number: 602016034707; Country of ref document: DE

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
      SM (20200422), MC (20200422), ES (20200422), CZ (20200422), IT (20200422), RO (20200422), DK (20200422), AT (20200422), EE (20200422)

REG   Reference to a national code
      Ref country code: CH; Ref legal event code: PL

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
      PL (20200422), SK (20200422)

PLBE  No opposition filed within time limit
      ORIGINAL CODE: 0009261

STAA  Information on the status of an ep patent application or granted ep patent
      STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N   No opposition filed
      Effective date: 20210125

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
      LU (20200624)

REG   Reference to a national code
      Ref country code: BE; Ref legal event code: MM; Effective date: 20200630

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
      IE (20200624), CH (20200630), LI (20200630)

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      BE: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES (20200630)
      SI: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT (20200422)

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
      TR (20200422), MT (20200422), CY (20200422)

PG25  Lapsed in a contracting state [announced via postgrant information from national office to epo]
      Ground: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
      MK (20200422)

P01   Opt-out of the competence of the unified patent court (upc) registered
      Effective date: 20230508

PGFP  Annual fee paid to national office [announced via postgrant information from national office to epo]
      FR: payment date 20230510, year of fee payment 8
      DE: payment date 20230502, year of fee payment 8

PGFP  Annual fee paid to national office [announced via postgrant information from national office to epo]
      GB: payment date 20230504, year of fee payment 8