WO2017057041A1 - Signal processing device, signal processing method, and program - Google Patents
Signal processing device, signal processing method, and program
- Publication number: WO2017057041A1 (PCT/JP2016/077397)
- Authority: WIPO (PCT)
- Prior art keywords
- target
- coordinate system
- signal processing
- correspondence
- detected
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4004—Means for monitoring or calibrating of parts of a radar system
- G01S7/4026—Antenna boresight
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/10—Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
- G01C3/14—Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/28—Details of pulse systems
- G01S7/285—Receivers
- G01S7/295—Means for transforming co-ordinates or for evaluating data, e.g. using computers
- G01S7/2955—Means for determining the position of the radar coordinate system for evaluating the position data of the target in another coordinate system
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4052—Means for monitoring or calibrating by simulation of echoes
- G01S7/4082—Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
- G01S7/4086—Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder in a calibrating environment, e.g. anechoic chamber
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G06V20/647—Three-dimensional objects by matching two-dimensional images to three-dimensional objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/86—Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9324—Alternative operation using ultrasonic waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
- G01S7/4972—Alignment of sensor
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52004—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- The present technology relates to a signal processing device, a signal processing method, and a program, and more particularly to a signal processing device, a signal processing method, and a program that enable highly accurate calibration.
- Patent Document 1 proposes a method of performing sensor fusion calibration using a dedicated calibration board (reflecting plate).
- The present technology has been made in view of such a situation and enables highly accurate calibration.
- A signal processing device according to one aspect of the present technology includes: a first position calculation unit that calculates a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; a second position calculation unit that calculates a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions as well as position information in the depth direction; a correspondence detection unit that detects a correspondence between the target in the first coordinate system and the target in the second coordinate system; and a positional relationship information estimation unit that estimates positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence.
- A signal processing method according to one aspect of the present technology includes: calculating a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; calculating a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions as well as position information in the depth direction; detecting a correspondence between the target in the first coordinate system and the target in the second coordinate system; and estimating positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence.
- A program according to one aspect of the present technology causes a computer to execute processing including: calculating a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; calculating a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions as well as position information in the depth direction; detecting a correspondence between the target in the first coordinate system and the target in the second coordinate system; and estimating positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence.
- In one aspect of the present technology, a three-dimensional position of a target in a first coordinate system is calculated from a stereo image captured by a stereo camera; a three-dimensional position of the target in a second coordinate system is calculated from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions as well as position information in the depth direction; a correspondence between the target in the first coordinate system and the target in the second coordinate system is detected; and positional relationship information between the first coordinate system and the second coordinate system is estimated based on the detected correspondence.
- The program can be provided by being transmitted through a transmission medium or by being recorded on a recording medium.
- The signal processing device may be an independent device or an internal block constituting a single device.
- According to one aspect of the present technology, calibration can be performed with high accuracy.
- FIG. 18 is a block diagram illustrating a configuration example of an embodiment of a computer to which the present technology is applied.
- FIG. 1 is a block diagram illustrating a configuration example of an object detection system to which the present technology is applied.
- The object detection system 1 in FIG. 1 includes a millimeter wave radar 11, a stereo camera 12, and a signal processing device 13, and is a system that detects objects that may become obstacles using both the millimeter wave radar 11 and the stereo camera 12.
- The object detection system 1 is mounted on a vehicle such as an automobile or a truck.
- In the present embodiment, the millimeter wave radar 11 and the stereo camera 12 are mounted so that their detection directions face the front of the vehicle; however, the object detection direction is not limited to the front of the vehicle. For example, when the millimeter wave radar 11 and the stereo camera 12 are mounted facing the rear of the vehicle, the object detection system 1 detects objects behind the vehicle.
- The millimeter wave radar 11 emits a millimeter wave in a predetermined direction θ, receives the wave reflected from an object, and supplies a reflection signal corresponding to the received reflected wave to the signal processing device 13.
- The millimeter wave radar 11 scans the millimeter wave within a predetermined angle range in front of the vehicle and supplies the resulting reflection signal, together with the irradiation direction θ, to the signal processing device 13.
- One scan of the predetermined angle range by the millimeter wave radar 11 is called one frame.
- The stereo camera 12 includes a right camera 21R and a left camera 21L.
- The right camera 21R and the left camera 21L are disposed at the same height with a predetermined horizontal spacing between them, and capture images of a predetermined range in front of the vehicle.
- An image captured by the right camera 21R (hereinafter also referred to as a right camera image) and an image captured by the left camera 21L (hereinafter also referred to as a left camera image) exhibit a horizontal shift (parallax) due to the difference in their mounting positions.
- It is assumed that the positional relationship between the right camera 21R and the left camera 21L has been accurately calibrated.
- When the right camera image and the left camera image need not be distinguished, they are also referred to collectively as stereo images.
- The signal processing device 13 performs signal processing on the sensor signals output from the millimeter wave radar 11 and the stereo camera 12. It is assumed that the sensing of the millimeter wave radar 11 and the stereo camera 12 is temporally synchronized to some extent.
- The signal processing device 13 includes a target detection unit 31, a three-dimensional position calculation unit 32, a target detection unit 33, a parallax estimation unit 34, a three-dimensional position calculation unit 35, a correspondence detection unit 36, a position and orientation estimation unit 37, and a storage unit 38.
- In the object detection system 1, in order to detect an object accurately, it is necessary to identify the correspondence between the objects detected by the millimeter wave radar 11 and by the stereo camera 12. That is, a detected object is represented in the different coordinate systems of the millimeter wave radar 11 and the stereo camera 12, so when the same object is detected, the coordinate values obtained by the two sensors must be converted into a single common coordinate system in which they coincide.
- Therefore, the signal processing device 13 performs processing for calculating the correspondence between the coordinate system of the millimeter wave radar 11 and the coordinate system of the stereo camera 12. In other words, the signal processing device 13 calculates the position and orientation of one of the millimeter wave radar 11 and the stereo camera 12 relative to the other.
- The calibration process that the signal processing device 13 performs to calculate the positional relationship between the millimeter wave radar 11 and the stereo camera 12 comes in two forms: a pre-shipment calibration process performed before the vehicle is shipped, and an in-operation calibration process that adjusts for deviations arising after shipment.
- Deviations after shipment may be caused by, for example, aging, heat, or vibration.
- The target used in the pre-shipment calibration process is, for example, a pole that has a texture (pattern) whose position can be uniquely identified in a stereo image captured by the stereo camera 12 and that reflects millimeter waves.
- FIG. 2 shows an example of a target used in the pre-shipment calibration process.
- The target 51 shown in FIG. 2 is a cylindrical pole made of a material that reflects millimeter waves, with a lattice-pattern texture formed on its outer circumference.
- When the target detection unit 33 calculates the position of the target 51 in the stereo image, for example, it calculates the pixel position of the lattice-pattern intersection 52 by pattern matching or feature extraction.
- FIGS. 3 and 4 show examples of the arrangement of the targets 51 in the pre-shipment calibration process.
- FIG. 3 is a layout diagram of the targets 51 in the pre-shipment calibration process as viewed from above. In FIG. 3, the vertical direction of the drawing, which is the front (depth) direction of the vehicle, is the Z axis; the horizontal direction of the drawing, which is the lateral direction of the vehicle, is the X axis; and the direction perpendicular to the drawing is the Y axis.
- The plurality of targets 51 are arranged so that they do not overlap when captured by the stereo camera 12. Furthermore, as shown in FIG. 3, it is desirable that the targets 51 be arranged so that no target occupies the same position as another target in either the X-axis direction or the Z-axis direction.
- FIG. 4 is a layout diagram of the targets 51 in the pre-shipment calibration process as viewed from the side. In FIG. 4, the horizontal direction of the drawing is the Z axis, the vertical direction of the drawing is the Y axis, and the direction perpendicular to the drawing is the X axis.
- The millimeter wave radar 11 is arranged so as to irradiate millimeter waves in the XZ plane at a height h from the ground, and the plurality of targets 51 are arranged so that their intersections 52 lie at the millimeter wave height h, as shown in FIG. 4. In other words, the lattice-pattern intersections 52 of each target 51 are formed to match the height h of the millimeter wave irradiated by the millimeter wave radar 11.
- The stereo camera 12 may be arranged at the height h from the ground, the same as the lattice-pattern intersections 52, but it is not necessary to match its height position to that of the millimeter wave radar 11.
- In the in-operation calibration process, since a dedicated fixed target cannot be prepared, an object that exists along the path the vehicle travels is used as the target.
- For example, pedestrians and pole-like objects such as signs and utility poles serve as targets in the in-operation calibration process.
- For the calibration process, position information of a plurality of targets at different positions is necessary.
- The position information of the plurality of targets may be acquired by detecting a plurality of targets in a single frame of each of the millimeter wave radar 11 and the stereo camera 12, or by acquiring a plurality of frames capturing a single target.
- The target detection unit 31 and the three-dimensional position calculation unit 32 on the millimeter wave radar 11 side will be described with reference to FIG.
- The target detection unit 31 detects the position of the target in front of the vehicle based on the reflection signal and the irradiation direction θ supplied from the millimeter wave radar 11. More specifically, the target detection unit 31 detects, as the target position, a peak position where the reflection signal intensity is equal to or higher than a predetermined intensity, based on a reflection intensity map in which the intensity of the reflection signal is associated with the irradiation direction θ.
- The target detection position is represented in a polar coordinate system consisting of a distance L based on the intensity of the reflected signal and the irradiation direction θ.
- The detected target position is supplied to the three-dimensional position calculation unit 32.
- In the figure, the black triangle extending from the millimeter wave radar 11 indicates the irradiation range of the millimeter wave, and positions where targets are detected are shown in white; the greater the intensity of the reflected signal, the whiter the representation.
- The three-dimensional position calculation unit 32 converts the target detection position represented in the polar coordinate system supplied from the target detection unit 31 into a target detection position on a three-dimensional coordinate system in which the front (depth) direction of the vehicle is the Z axis, the lateral (horizontal) direction is the X axis, and the vertical direction is the Y axis.
- In other words, the target detection position represented in the polar coordinate system consisting of the distance L based on the intensity of the reflected signal and the irradiation direction θ is converted by the three-dimensional position calculation unit 32 into an orthogonal coordinate system, yielding a target detection position on the XZ plane of the three-dimensional coordinate system.
- The calculated target detection position is a position on a three-dimensional coordinate system referenced to the millimeter wave radar 11. To distinguish it from the three-dimensional coordinate system referenced to the stereo camera 12, described later, this coordinate system is also referred to as the radar three-dimensional coordinate system.
- The three-dimensional position calculation unit 32 supplies the calculated target detection position represented in the radar three-dimensional coordinate system to the correspondence detection unit 36.
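- As a concrete illustration of this polar-to-orthogonal conversion, the following is a minimal Python sketch, assuming the irradiation direction θ is measured from the forward (Z) axis and the scan plane lies at a fixed height; the function name and parameters are illustrative, not from the patent:

```python
import numpy as np

def polar_to_radar_xyz(L, theta, height=0.0):
    """Convert a radar detection given in polar form (distance L,
    irradiation direction theta in radians, measured from the forward
    Z axis) into the radar three-dimensional coordinate system."""
    x = L * np.sin(theta)  # lateral position on the XZ scan plane
    z = L * np.cos(theta)  # forward (depth) distance
    y = height             # the radar provides no vertical information,
                           # so Y is a fixed value
    return np.array([x, y, z])
```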
- The target detection unit 33, the parallax estimation unit 34, and the three-dimensional position calculation unit 35 on the stereo camera 12 side will be described with reference to FIG.
- The target detection unit 33 detects the position of the target on the two-dimensional coordinate system consisting of the X axis and the Y axis by performing, on the stereo images supplied from the right camera 21R and the left camera 21L, pattern matching (image recognition processing) using a pre-registered pattern (shape or texture), or feature detection processing that detects features of the target image.
- Specifically, in whichever of the right camera image supplied from the right camera 21R and the left camera image supplied from the left camera 21L serves as the reference (the left camera image in the present embodiment), the target detection unit 33 detects the position of the intersection 52 of the target 51 with pixel-level accuracy and supplies it to the three-dimensional position calculation unit 35.
- The parallax estimation unit 34 calculates the parallax from the right camera image supplied from the right camera 21R and the left camera image supplied from the left camera 21L, and supplies the calculation result to the three-dimensional position calculation unit 35 as parallax information.
- FIG. 6 shows a parallax image, based on the left camera image, in which larger parallax values calculated from the right and left camera images are represented by higher luminance values.
- The higher the luminance value, the closer the target 51.
- The three-dimensional position calculation unit 35 calculates the position (distance) in the Z-axis direction, which is the forward direction of the vehicle, from the target parallax information supplied from the parallax estimation unit 34. Then, from the calculated Z-axis position of the target and the target position on the two-dimensional coordinate system (XY plane) supplied from the target detection unit 33, the three-dimensional position calculation unit 35 calculates the target detection position on the three-dimensional coordinate system in which the front (depth) direction of the vehicle is the Z axis, the lateral (horizontal) direction is the X axis, and the vertical direction is the Y axis.
- The target detection position calculated here is a position on a three-dimensional coordinate system referenced to the stereo camera 12; its axis directions are the same as those of the radar three-dimensional coordinate system, but its origin differs.
- The three-dimensional coordinate system referenced to the stereo camera 12 is referred to as the camera three-dimensional coordinate system, in distinction from the radar three-dimensional coordinate system described above. When there is no need to distinguish between the radar three-dimensional coordinate system and the camera three-dimensional coordinate system, the two are collectively referred to as the sensor coordinate system.
- The three-dimensional position calculation unit 35 supplies the calculated target detection position represented in the camera three-dimensional coordinate system to the correspondence detection unit 36.
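- The depth computation from parallax is not spelled out in the text above; the sketch below assumes the standard rectified pinhole-stereo relation Z = f·B/d (focal length f, baseline B, disparity d), with illustrative parameter names:

```python
import numpy as np

def stereo_pixel_to_camera_xyz(u, v, disparity, f, baseline, cx, cy):
    """Back-project a reference-image pixel (u, v) with disparity d into
    the camera three-dimensional coordinate system via Z = f * B / d.
    (cx, cy) is the principal point; f and baseline in consistent units."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    z = f * baseline / disparity  # depth from parallax
    x = (u - cx) * z / f          # lateral (X-axis) position
    y = (v - cy) * z / f          # vertical (Y-axis) position
    return np.array([x, y, z])
```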
- The correspondence detection unit 36 detects the correspondence between the targets detected in the radar three-dimensional coordinate system and the targets detected in the camera three-dimensional coordinate system. In other words, the correspondence detection unit 36 detects which target detected in the radar three-dimensional coordinate system corresponds to which target detected in the camera three-dimensional coordinate system.
- In the pre-shipment calibration process, the target arrangement is known in advance.
- Therefore, the correspondence detection unit 36 acquires the target pre-arrangement information from the storage unit 38, collates the target detection positions detected in the radar three-dimensional coordinate system and the target detection positions detected in the camera three-dimensional coordinate system against the target pre-arrangement information to identify each target, and then detects the correspondence between the targets detected in the radar three-dimensional coordinate system and those detected in the camera three-dimensional coordinate system.
- For example, the correspondence detection unit 36 detects that the target detection position a detected in the radar three-dimensional coordinate system corresponds to target position 1 of the target pre-arrangement information, that target detection position b corresponds to target position 2, and that target detection positions c to g correspond to target positions 3 to 7, respectively.
- Similarly, the correspondence detection unit 36 detects that the target detection position A detected in the camera three-dimensional coordinate system corresponds to target position 1 of the target pre-arrangement information, that target detection position B corresponds to target position 2, and that target detection positions C to G correspond to target positions 3 to 7, respectively.
- As a result, the correspondence detection unit 36 detects that the target at detection position a in the radar three-dimensional coordinate system corresponds to the target at detection position A in the camera three-dimensional coordinate system, and likewise that the targets at detection positions b to g in the radar three-dimensional coordinate system correspond to the targets at detection positions B to G in the camera three-dimensional coordinate system, respectively.
- In the in-operation calibration process, the correspondence detection unit 36 compares the target detection positions detected in the radar three-dimensional coordinate system with those detected in the camera three-dimensional coordinate system, based on the positional relationship already obtained in a previously executed pre-shipment calibration process or in-operation calibration process, and detects the correspondence between the targets detected in the radar three-dimensional coordinate system and those detected in the camera three-dimensional coordinate system.
- The position and orientation estimation unit 37 calculates the positional relationship between the millimeter wave radar 11 and the stereo camera 12 using the plurality of targets whose correspondences have been identified by the correspondence detection unit 36.
- For example, let the position of the k-th target (0 < k < K+1) among the K targets whose correspondences have been identified by the correspondence detection unit 36 be represented in the radar three-dimensional coordinate system as P_MMW(k) = [X_MMW(k), Y_A, Z_MMW(k)]^T, and in the camera three-dimensional coordinate system as P_cam(k) = [X_cam(k), Y_cam(k), Z_cam(k)]^T.
- Here, T denotes transposition, and Y_A denotes a predetermined fixed value.
- k is a variable (0 < k < K+1) identifying one of the plurality (K) of targets; P_cam(k) denotes the target detection position of the k-th target detected in the camera three-dimensional coordinate system, and P_MMW(k) denotes the target detection position of the k-th target detected in the radar three-dimensional coordinate system.
- Expression (1), P_cam(k) = R · P_MMW(k) + V, converts the target detection position P_MMW(k) of the k-th target detected in the radar three-dimensional coordinate system into the target detection position P_cam(k) on the camera three-dimensional coordinate system.
- The rotation matrix R represents the attitude of the millimeter wave radar 11 with respect to the stereo camera 12, and the translation vector V represents the position of the millimeter wave radar 11 with respect to the stereo camera 12.
- Since the rotation matrix R has three degrees of freedom and the translation vector V has three variables, the rotation matrix R and the translation vector V of Expression (1) can be calculated, for example, by the least squares method, provided at least six target detection positions are obtained. Note that the rotation matrix R can also be expressed by a quaternion.
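- The patent does not specify the solver beyond least squares; the SVD-based (Kabsch) closed form below is one standard way to recover R and V from K ≥ 6 corresponding target positions, shown here as a minimal sketch:

```python
import numpy as np

def estimate_rigid_transform(p_mmw, p_cam):
    """Least-squares solution of P_cam(k) = R @ P_MMW(k) + V for K >= 6
    corresponding target positions (inputs are K x 3 arrays, one row per
    target), using the SVD-based Kabsch closed form."""
    mu_m = p_mmw.mean(axis=0)              # centroid of the radar-side points
    mu_c = p_cam.mean(axis=0)              # centroid of the camera-side points
    H = (p_mmw - mu_m).T @ (p_cam - mu_c)  # 3 x 3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # force a proper rotation (det(R) = +1, no reflection)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                     # rotation matrix R of Expression (1)
    V = mu_c - R @ mu_m                    # translation vector V of Expression (1)
    return R, V
```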
- The storage unit 38 stores the positional relationship information (calibration information) between the millimeter wave radar 11 and the stereo camera 12 calculated by the position and orientation estimation unit 37. Specifically, the rotation matrix R and the translation vector V of Expression (1) are supplied from the position and orientation estimation unit 37 to the storage unit 38 and stored.
- The object detection system 1 is configured as described above.
- Let the target detection position on the sensor coordinate system be represented by P_Det(k) = [X_Det(k), Y_Det(k), Z_Det(k)]^T.
- In the case of the millimeter wave radar 11, Y_Det(k) is a fixed value, as described above.
- Although there are K targets, K or more targets may be detected in the sensor coordinate system of the millimeter wave radar 11 or the stereo camera 12 due to the influence of disturbances or the like.
- In the example shown, the target detection position f arises from, for example, noise, so that six targets, at target detection positions a to f, are detected.
- The detection of the correspondence between the five target positions 1 to 5 on the world coordinate system and the six target detection positions a to f on the sensor coordinate system can be solved by treating it as a graph matching problem of finding the correspondence under which the three-dimensional points of the two different coordinate systems overlap best.
- The correspondence (connection) between the five target positions 1 to 5 on the world coordinate system and the six target detection positions a to f on the sensor coordinate system is represented by a matrix variable X of M rows and N columns, as in Expression (2).
- The subscript i of x is a variable identifying a target on the world coordinate system (0 < i < M+1), and the subscript j of x is a variable identifying a target on the sensor coordinate system (0 < j < N+1).
- x_{i,j} is a variable indicating whether the i-th target on the world coordinate system and the j-th target on the sensor coordinate system are connected; it takes the value 1 when they are connected and 0 when they are not.
- For example, as shown by the thick solid lines in FIG. 11, when target position 1 on the world coordinate system corresponds to target detection position a on the sensor coordinate system, target position 2 to target detection position b, target position 3 to target detection position c, target position 4 to target detection position d, and target position 5 to target detection position e, the matrix variable X representing the correspondence is expressed as follows.
- The correspondence detection unit 36 obtains the X that maximizes the score function score(X), using the matrix variable X represented by Expression (2).
- The score function score(X) is expressed by Expression (3).
- Here, i1 and i2 are variables identifying targets on the world coordinate system, and j1 and j2 are variables identifying targets on the sensor coordinate system.
- l_{i1,i2} represents the length of the line segment connecting P_MAP(i1) and P_MAP(i2) on the world coordinate system, and h_{j1,j2} represents the length of the line segment connecting P_Det(j1) and P_Det(j2) on the sensor coordinate system.
- S(l_{i1,i2}, h_{j1,j2}) represents the similarity between the segment length l_{i1,i2} and the segment length h_{j1,j2}; the closer the two lengths are, the larger its value.
- As the similarity S(l_{i1,i2}, h_{j1,j2}), for example, the following Expression (4), which uses the difference d(l_{i1,i2}, h_{j1,j2}) between the segment lengths l_{i1,i2} and h_{j1,j2}, can be employed.
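- The bodies of Expressions (3) and (4) are not reproduced in this text, so the sketch below assumes a typical form: the score sums, over all connected pairs, a similarity that decays as exp(-d²/σ²) with the length difference d, and the best assignment is found by exhaustive search (feasible for the handful of calibration targets involved). The names and the exact similarity form are assumptions:

```python
import numpy as np
from itertools import permutations

def score(assign, p_map, p_det, sigma=0.1):
    """Evaluate one candidate correspondence: assign[i] is the sensor-side
    index matched to world-side target i. Sums, over all target pairs, the
    similarity of the segment lengths l_{i1,i2} (world) and h_{j1,j2}
    (sensor); a Gaussian of the length difference stands in for the
    similarity S of Expression (4)."""
    total = 0.0
    for i1 in range(len(assign)):
        for i2 in range(i1 + 1, len(assign)):
            l = np.linalg.norm(p_map[i1] - p_map[i2])
            h = np.linalg.norm(p_det[assign[i1]] - p_det[assign[i2]])
            total += np.exp(-((l - h) ** 2) / sigma**2)
    return total

def best_match(p_map, p_det, sigma=0.1):
    """Exhaustively try every injective mapping of the M world targets onto
    the N >= M sensor detections (spurious detections stay unmatched) and
    return the highest-scoring assignment."""
    M, N = len(p_map), len(p_det)
    return max(permutations(range(N), M),
               key=lambda a: score(a, p_map, p_det, sigma))
```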
- The first correspondence detection process described above uses the target pre-arrangement information, but it is also possible to detect the correspondence between the targets detected in the radar three-dimensional coordinate system and those detected in the camera three-dimensional coordinate system without using the target pre-arrangement information.
- In that case, the correspondence detection unit 36 slides at least one of the target positions P_MMW(k) on the radar three-dimensional coordinate system and the target positions P_cam(k) on the camera three-dimensional coordinate system by a fixed amount and superimposes the two sets, thereby associating the targets that lie nearest to each other.
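- A minimal sketch of this nearest-neighbor association follows; the fixed slide amount is modeled here as a rough prior offset between the two coordinate systems, which is an assumption, since the patent does not specify how it is chosen:

```python
import numpy as np

def match_nearest(p_mmw, p_cam, offset):
    """Slide the radar-side target positions by a fixed offset so that the
    two point sets roughly overlap, then pair each radar target with the
    nearest camera target. Inputs are K x 3 arrays; 'offset' is a rough
    prior displacement between the two sensor coordinate systems."""
    shifted = p_mmw + offset
    pairs = []
    for k, p in enumerate(shifted):
        j = int(np.argmin(np.linalg.norm(p_cam - p, axis=1)))
        pairs.append((k, j))  # radar target k <-> camera target j
    return pairs
```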
- FIG. 14 is a block diagram illustrating a detailed configuration example of the target detection unit 31 on the millimeter wave radar 11 side when the in-operation calibration process is executed.
- In the in-operation calibration process, the target detection unit 31 includes a motion detection unit 71, a peak detection unit 72, and an AND operation unit 73.
- The motion detection unit 71 includes a storage unit that holds the reflection signal of at least the previous frame.
- The motion detection unit 71 detects movement of peak positions by comparing the reflection signal of the current frame supplied from the millimeter wave radar 11 with the reflection signal of the previous frame.
- The motion detection unit 71 supplies the peak positions where motion is detected to the AND operation unit 73.
- The peak detection unit 72 detects, from the reflection signal of the current frame supplied from the millimeter wave radar 11, peak positions where the reflection signal intensity is equal to or higher than a predetermined intensity, and supplies the detection result to the AND operation unit 73.
- The AND operation unit 73 performs an AND operation on the peak positions supplied from the motion detection unit 71 and the peak positions supplied from the peak detection unit 72. In other words, of the peak positions supplied from the peak detection unit 72, the AND operation unit 73 extracts only those also supplied from the motion detection unit 71, that is, the peak positions where motion was detected, and supplies the extraction result to the three-dimensional position calculation unit 32 as target detection positions.
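- A minimal sketch of this moving-peak extraction, assuming the reflection signal is held as a two-dimensional intensity map indexed by range and direction bins (the thresholds and array layout are illustrative):

```python
import numpy as np

def detect_moving_peaks(curr, prev, peak_threshold, motion_threshold):
    """Emulate the three blocks of FIG. 14: the peak detection unit 72
    keeps cells of the current reflection-intensity map at or above a
    threshold; the motion detection unit 71 keeps cells that changed since
    the previous frame; the AND operation unit 73 intersects the two.
    curr and prev are 2-D maps indexed by (range bin, direction bin)."""
    peaks = curr >= peak_threshold                   # peak detection unit 72
    moved = np.abs(curr - prev) >= motion_threshold  # motion detection unit 71
    return np.argwhere(peaks & moved)                # AND operation unit 73
```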
- FIG. 15 is a block diagram illustrating a detailed configuration example of the target detection unit 33 on the stereo camera 12 side when the in-operation calibration process is executed.
- In the in-operation calibration process, the target detection unit 33 includes a motion region detection unit 81, an image recognition unit 82, an AND operation unit 83, and a center position calculation unit 84.
- The motion region detection unit 81 includes a storage unit that holds the stereo image of at least the previous frame.
- The motion region detection unit 81 detects motion regions of the stereo image by comparing the stereo image of the current frame supplied from the stereo camera 12 with the stereo image of the previous frame. Motion regions of a stereo image can be detected using motion vector estimation, frame differencing, or the like.
- The motion region detection unit 81 supplies the detected motion regions to the AND operation unit 83.
- The image recognition unit 82 detects the target region by performing image recognition on the stereo image of the current frame supplied from the stereo camera 12. For example, when a pedestrian (person) is detected as a target, the target region can be detected by image recognition processing that recognizes a human figure (silhouette) or face. The image recognition unit 82 supplies the detected target region to the AND operation unit 83.
- The AND operation unit 83 performs an AND operation on the motion regions supplied from the motion region detection unit 81 and the target region supplied from the image recognition unit 82. In other words, of the target regions supplied from the image recognition unit 82, the AND operation unit 83 extracts only those that overlap the motion regions supplied from the motion region detection unit 81, that is, the target regions where motion was detected, and supplies the extraction result to the center position calculation unit 84.
- The center position calculation unit 84 calculates the pixel position at the center of the target region supplied from the AND operation unit 83 and supplies the calculated pixel position to the three-dimensional position calculation unit 35 as the target detection position.
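- The camera-side AND operation and center calculation can be sketched as follows, assuming the target region and motion region are represented as boolean masks over the reference image (the mask representation is an assumption):

```python
import numpy as np

def moving_target_center(target_mask, motion_mask):
    """Emulate the AND operation unit 83 and the center position
    calculation unit 84: intersect the recognized target region with the
    detected motion region, then return the center pixel of the result.
    Both masks are boolean H x W arrays over the reference (left) image."""
    moving_target = target_mask & motion_mask  # AND operation unit 83
    ys, xs = np.nonzero(moving_target)
    if xs.size == 0:
        return None                            # no moving target this frame
    # center position calculation unit 84: (u, v) target detection position
    return (int(xs.mean()), int(ys.mean()))
```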
- A case will now be described in which the object detection system 1 detects the pedestrian 101 in front, shown in FIG. 16, as a target and executes the in-operation calibration process.
- The detection range of the millimeter wave radar 11 and the stereo camera 12 includes the pedestrian 101 and two fixed objects 102-1 and 102-2.
- The pedestrian 101 is moving rightward in the figure, and the fixed objects 102-1 and 102-2 are objects that do not move.
- On the millimeter wave radar 11 side, the peak detection unit 72 detects peak positions 111 to 113 from the reflection signal of the current frame supplied from the millimeter wave radar 11, as shown in the figure, and supplies the detection result to the AND operation unit 73.
- The peak position 111 corresponds to the pedestrian 101 in FIG. 16, and the peak positions 112 and 113 correspond to the fixed objects 102-1 and 102-2.
- Meanwhile, the motion detection unit 71 compares the reflection signal of the current frame with the reflection signal of the previous frame and supplies only the peak position 111, determined to be a peak position where motion was detected, to the AND operation unit 73.
- The AND operation unit 73 supplies, of the peak positions 111 to 113 supplied from the peak detection unit 72, only the peak position 111 also supplied from the motion detection unit 71 to the three-dimensional position calculation unit 32 as the target detection position.
- On the stereo camera 12 side, the image recognition unit 82 performs image recognition processing for recognizing a human figure and face on the stereo image of the current frame, and detects the target region 121.
- The target region 121 corresponds to the pedestrian 101 in FIG. 16.
- The motion region detection unit 81 detects the motion region 122 of the stereo image by comparing the stereo image of the current frame supplied from the stereo camera 12 with the stereo image of the previous frame.
- The motion region 122 detected here also corresponds to the pedestrian 101 in FIG. 16.
- The same left camera image that serves as the reference for the parallax image is used both as the stereo image on which the target detection unit 33 performs image recognition processing and as the stereo image in which the motion region detection unit 81 detects motion regions.
- The AND operation unit 83 performs an AND operation on the motion region 122 supplied from the motion region detection unit 81 and the target region 121 supplied from the image recognition unit 82, and as a result supplies the target region 121 to the center position calculation unit 84.
- The center position calculation unit 84 calculates the center pixel position 123 of the target region 121 supplied from the AND operation unit 83 and supplies it to the three-dimensional position calculation unit 35 as the target detection position.
- The three-dimensional position calculation unit 35 on the stereo camera 12 side calculates the target detection position 131 on the camera three-dimensional coordinate system, in which the front direction of the vehicle is the Z axis, the horizontal direction is the X axis, and the vertical direction is the Y axis, from the parallax information based on the left camera image supplied from the parallax estimation unit 34 and the target detection position 123 supplied from the center position calculation unit 84 of the target detection unit 33, and supplies it to the correspondence detection unit 36.
- The three-dimensional position calculation unit 32 on the millimeter wave radar 11 side converts the target detection position 111 represented in the polar coordinate system supplied from the target detection unit 31 into the target detection position 132 on the radar three-dimensional coordinate system, in which the front direction of the vehicle is the Z axis, the horizontal direction is the X axis, and the vertical direction is the Y axis, and supplies it to the correspondence detection unit 36.
- Using the current positional relationship information between the millimeter wave radar 11 and the stereo camera 12, calculated in the pre-shipment calibration process and stored in the storage unit 38, the correspondence detection unit 36 corrects the target detection position 132 on the radar three-dimensional coordinate system supplied from the three-dimensional position calculation unit 32 to a position on the camera three-dimensional coordinate system, obtaining the target detection position 133.
- The correspondence detection unit 36 then compares the target detection position 131 on the camera three-dimensional coordinate system supplied from the three-dimensional position calculation unit 35 on the stereo camera 12 side with the target detection position 133 on the millimeter wave radar 11 side whose position has been corrected onto the camera three-dimensional coordinate system, and detects the correspondence between the target detected on the stereo camera 12 side and the target detected on the millimeter wave radar 11 side.
- The correspondence detection unit 36 recognizes the targets whose coordinate positions are closest to each other as corresponding targets.
- In this example, the number of detected targets is one, and the detected position has been corrected using the positional relationship information calculated in the pre-shipment calibration process, so the correspondence can be detected easily.
- The signal processing device 13 executes the above processing over a plurality of frames (N frames), as shown in FIG. 20. As a result, when one corresponding point is detected in each frame, N corresponding points with different detection positions are detected over the N frames and stored in the storage unit 38.
- To solve for the rotation matrix R and the translation vector V in Expression (1), at least six target detection positions are required.
- The six target detection positions may be obtained, for example, from a total of six frames with one point in each frame, or from a total of three frames with two points in each frame.
- To improve the accuracy of the calculated rotation matrix R and translation vector V, the number of frames N is preferably six or more, and the larger the better.
- The times t to t+N of the N frames used to solve Expression (1), shown in FIG. 20, do not necessarily have to be temporally continuous.
- For example, the in-operation calibration process described above can also be executed using corresponding points detected in 10 frames on one day that satisfy a predetermined condition and corresponding points detected in 20 frames on another day that satisfy the predetermined condition.
- The signal processing device 13 can also select the frames used for the in-operation calibration process in order to improve calibration accuracy.
- For example, the signal processing device 13 adopts, for the in-operation calibration process, only frames in which no other target detection position exists within a predetermined range (distance) of a detected target detection position, and stores them in the storage unit 38.
- For example, frame A in FIG. 21 is selected as a frame used for the in-operation calibration process.
- In frame B of FIG. 21, another target detection position 142 exists within the predetermined range 143 of the target detection position 141.
- Therefore, frame B in FIG. 21 is excluded from the frames used for the in-operation calibration process.
- The signal processing device 13 also excludes a frame from those used for the in-operation calibration process when a predetermined number of targets or more are detected in one frame, as in frame C of FIG. 21.
- Since the in-operation calibration process does not use targets prepared in advance, frames (targets) from which corresponding points are detected are selected in this way so that higher accuracy can be obtained; a minimal sketch of such a selection check follows below.
- The frame selection may be performed by the target detection unit 31 or 33, or by the correspondence detection unit 36.
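- The following sketch combines the two rejection rules described above (frames B and C of FIG. 21); the threshold parameters min_separation and max_targets are illustrative choices, not values from the patent:

```python
import numpy as np

def frame_usable(detections, min_separation, max_targets):
    """Decide whether a frame's target detections should be stored for the
    in-operation calibration process: reject frames containing more than
    max_targets detections (frame C) or any two detections closer than
    min_separation (frame B). detections is a list of 3-D positions."""
    n = len(detections)
    if n == 0 or n > max_targets:
        return False
    pts = np.asarray(detections)
    for i in range(n):
        d = np.linalg.norm(pts - pts[i], axis=1)
        d[i] = np.inf                 # ignore the point's distance to itself
        if d.min() < min_separation:
            return False
    return True
```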
- In step S1, the signal processing device 13 executes the pre-shipment calibration process.
- Details of this process will be described later with reference to the flowchart of FIG. 23.
- Through the pre-shipment calibration process, the rotation matrix R and the translation vector V of Expression (1), which constitute the positional relationship information between the millimeter wave radar 11 and the stereo camera 12, are calculated and stored in the storage unit 38.
- The process in step S1 is executed, for example, at a factory that manufactures vehicles equipped with the object detection system 1, or at a sales outlet such as a dealership, when the user (operator) instructs the start of calibration on an operation panel or the like. Alternatively, it may be executed automatically when it is detected that the vehicle has stopped at a place where the calibration environment is prepared. After the pre-shipment calibration process is completed, the vehicle is shipped and delivered to the owner (driver).
- step S2 the signal processing device 13 determines whether to start the calibration process during operation. For example, when a certain period or more has elapsed since the last shipping calibration process or operation calibration process, the signal processing device 13 accumulates a predetermined number or more of corresponding points in the storage unit 38 as described with reference to FIG. When the predetermined start condition is satisfied, such as when the displacement amount of the corresponding point after the position correction of the millimeter wave radar 11 and the stereo camera 12 is always (predetermined number of times) or more, a predetermined value is satisfied. It is determined that the calibration process is started.
- However, the signal processing device 13 determines that the in-operation calibration process should not be started under environmental conditions of low reliability: for example, when the vehicle is not level with the road surface (is tilted), when the vehicle is moving at high speed, when the number of targets detected at one time exceeds a predetermined value, when the stereo image captured by the stereo camera 12 is unreliable owing to conditions such as bad weather, backlighting, or darkness, or when the vehicle is in a place (for example, a tunnel) where the millimeter waves of the millimeter wave radar 11 are likely to undergo multiple reflection.
- Whether the vehicle is in a place prone to multiple reflection can be determined based on, for example, a GPS reception signal.
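- A minimal sketch of this step S2 decision, assuming a hypothetical state snapshot; all field names and threshold values here are illustrative assumptions, not values from the patent:

```python
def should_start_calibration(state):
    """Decide whether to start the in-operation calibration (step S2)."""
    start = (
        state["seconds_since_last_calibration"] > 7 * 24 * 3600
        or state["stored_corresponding_points"] >= 100
        or state["consecutive_large_displacements"] >= 10
    )
    # Low-reliability environmental conditions veto the start.
    unreliable = (
        state["vehicle_tilted"]
        or state["speed_kmh"] > 80
        or state["targets_in_frame"] > 8
        or state["bad_weather_or_backlight_or_dark"]
        or state["in_multipath_prone_area"]  # e.g. a tunnel, from GPS
    )
    return start and not unreliable
```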
- If it is determined in step S2 that the in-operation calibration process is not to be started, the process returns to step S2, and step S2 is repeated until it is determined that the process should be started.
- If it is determined in step S2 that the in-operation calibration process is to be started, the process proceeds to step S3, and the signal processing device 13 executes the in-operation calibration process.
- In this process, the rotation matrix R and translation vector V of equation (1), which constitute the positional relationship information between the millimeter wave radar 11 and the stereo camera 12, are recalculated and overwritten (updated) in the storage unit 38.
- FIG. 23 is a flowchart explaining the details of the pre-shipment calibration process in step S1 described above.
- In step S21, the target detection unit 31 on the millimeter wave radar 11 side detects the position of a target in front of the vehicle based on the reflection signal supplied from the millimeter wave radar 11 and the irradiation direction θ.
- The target detection position detected by the target detection unit 31 is represented in a polar coordinate system consisting of the distance L, based on the intensity of the reflection signal, and the irradiation direction θ, and is supplied to the three-dimensional position calculation unit 32.
- In step S22, the three-dimensional position calculation unit 32 on the millimeter wave radar 11 side converts the target detection position represented in the polar coordinate system, supplied from the target detection unit 31, into a target detection position on the radar three-dimensional coordinate system.
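- A sketch of this polar-to-Cartesian conversion, assuming θ is the azimuth measured from the forward (depth) axis and that the radar scans in a horizontal plane so no elevation is measured; the axis conventions are assumptions, since the patent does not fix them here:

```python
import numpy as np

def radar_polar_to_cartesian(L, theta):
    """Convert a radar detection (distance L, irradiation direction theta)
    into the radar three-dimensional coordinate system."""
    x = L * np.sin(theta)   # lateral position
    y = 0.0                 # no elevation measurement in a 2-D scan
    z = L * np.cos(theta)   # depth (forward) position
    return np.array([x, y, z])
```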
- In step S23, the target detection unit 33 on the stereo camera 12 side detects the position of the target on the two-dimensional coordinate system by performing image processing, such as pattern matching or feature detection, on the stereo image.
- In step S24, the parallax estimation unit 34 calculates parallax from the right camera image supplied from the right camera 21R and the left camera image supplied from the left camera 21L, and supplies the parallax information to the three-dimensional position calculation unit 35.
- In step S25, the three-dimensional position calculation unit 35 calculates the target detection position on the camera three-dimensional coordinate system from the parallax information supplied from the parallax estimation unit 34 and the target detection position on the two-dimensional coordinate system supplied from the target detection unit 33.
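- The back-projection in step S25 follows standard stereo geometry. The sketch below assumes the camera intrinsics (focal lengths, principal point) and the baseline are known from a separate stereo calibration, and that the disparity is positive:

```python
import numpy as np

def pixel_to_camera_3d(u, v, disparity, fx, fy, cx, cy, baseline):
    """Back-project a pixel (u, v) with known disparity into the camera
    three-dimensional coordinate system (pinhole stereo model)."""
    Z = fx * baseline / disparity      # depth from disparity
    X = (u - cx) * Z / fx              # lateral position
    Y = (v - cy) * Z / fy              # vertical position
    return np.array([X, Y, Z])
```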
- As mentioned above, the processes of steps S21 and S22 and the processes of steps S23 through S25 can be performed in parallel.
- In step S26, the correspondence detection unit 36 executes the first correspondence detection process described above, thereby detecting the correspondence between the targets detected in the radar three-dimensional coordinate system and the targets detected in the camera three-dimensional coordinate system.
- That is, the correspondence detection unit 36 collates the target detection positions detected in the radar three-dimensional coordinate system with the target pre-arrangement information and identifies each target. The correspondence detection unit 36 likewise collates the target detection positions detected in the camera three-dimensional coordinate system with the target pre-arrangement information and identifies each target. The correspondence detection unit 36 then detects, based on the collation results against the target pre-arrangement information, which target detected in the radar three-dimensional coordinate system corresponds to which target detected in the camera three-dimensional coordinate system.
- In step S26, the second correspondence detection process described above may be executed instead of the first correspondence detection process.
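- A minimal sketch of the collation against pre-arrangement information; the helper name, the dictionary layout, and the tolerance are assumptions, since the patent does not prescribe a matching rule. Running it once on the radar detections and once on the camera detections yields, per target ID, a (radar position, camera position) pair:

```python
import numpy as np

def identify_targets(detections, prearranged_positions, max_error=0.5):
    """Match each detected 3-D position to the known pre-arrangement
    layout (first correspondence detection process).

    prearranged_positions: {target_id: surveyed 3-D position} in the same
    coordinate system as `detections`; max_error is an assumed tolerance.
    Returns {target_id: detected_position}.
    """
    matches = {}
    for det in detections:
        tid, best = None, max_error
        for target_id, pos in prearranged_positions.items():
            err = np.linalg.norm(det - pos)
            if err < best:
                tid, best = target_id, err
        if tid is not None:
            matches[tid] = det
    return matches
```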
- In step S27, the position/orientation estimation unit 37 substitutes the target detection positions of the plurality of targets whose correspondences were identified by the correspondence detection unit 36 into equation (1) and solves it using the least squares method or the like, thereby calculating the positional relationship between the millimeter wave radar 11 and the stereo camera 12, that is, the rotation matrix R and translation vector V of equation (1).
- In step S28, the position/orientation estimation unit 37 stores the calculated rotation matrix R and translation vector V in the storage unit 38.
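- The least-squares solve in step S27 can be sketched with the standard SVD-based (Kabsch) procedure for rigid alignment of paired points. The patent only states that "the least squares method or the like" is used, so this concrete algorithm is one reasonable choice, not the prescribed one:

```python
import numpy as np

def estimate_rotation_translation(p_mmw, p_cam):
    """Least-squares solution of equation (1), Pcam(k) = R*Pmmw(k) + V,
    from paired target detection positions.

    p_mmw, p_cam: (N, 3) arrays of corresponding positions, N >= 3.
    """
    mu_m = p_mmw.mean(axis=0)
    mu_c = p_cam.mean(axis=0)
    # cross-covariance of the centred point sets
    H = (p_mmw - mu_m).T @ (p_cam - mu_c)
    U, _, Vt = np.linalg.svd(H)
    # reflection guard keeps R a proper rotation (det = +1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    V = mu_c - R @ mu_m
    return R, V
```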
- Next, the in-operation calibration process in step S3 will be described in detail.
- In step S41, the target detection unit 31 on the millimeter wave radar 11 side detects the position of a target for which motion is detected, based on the reflection signal supplied from the millimeter wave radar 11 and the irradiation direction θ.
- That is, the motion detection unit 71 compares the reflection signal of the current frame with that of the immediately preceding frame and detects motion of the peak positions.
- The peak detection unit 72 detects peak positions from the reflection signal of the current frame and supplies them to the AND operation unit 73.
- The AND operation unit 73 extracts, from the peak positions supplied from the peak detection unit 72, only those for which the motion detection unit 71 also reports motion, and supplies the extracted result to the three-dimensional position calculation unit 32 as the target detection positions for which motion was detected.
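- A sketch of this motion-gated peak extraction over a one-dimensional reflection signal indexed by direction bin. The peak definition (local maximum above a threshold) and the motion test (per-bin change above an epsilon) are assumptions standing in for units 71-73:

```python
import numpy as np

def moving_peak_positions(curr_signal, prev_signal, threshold, motion_eps):
    """AND of peak detection and motion detection on the radar signal."""
    curr = np.asarray(curr_signal, dtype=float)
    prev = np.asarray(prev_signal, dtype=float)
    peaks = [
        i for i in range(1, len(curr) - 1)
        if curr[i] > threshold
        and curr[i] >= curr[i - 1]
        and curr[i] >= curr[i + 1]
    ]
    moved = np.abs(curr - prev) > motion_eps  # frame-to-frame motion per bin
    # keep only peaks whose position also shows motion (the AND operation)
    return [i for i in peaks if moved[i]]
```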
- In step S42, the three-dimensional position calculation unit 32 on the millimeter wave radar 11 side converts the target detection position represented in the polar coordinate system, supplied from the target detection unit 31, into a target detection position on the radar three-dimensional coordinate system.
- In step S43, the target detection unit 33 on the stereo camera 12 side detects the position of a target for which motion is detected by performing image processing, such as pattern matching or feature detection, on the stereo image.
- That is, the motion region detection unit 81 detects motion regions in the stereo image by comparing the current stereo image with that of the immediately preceding frame.
- the image recognition unit 82 detects the target area by performing image recognition on the current stereo image supplied from the stereo camera 12.
- the AND operation unit 83 extracts the target region supplied from the image recognition unit 82 from the motion regions supplied from the motion region detection unit 81 and supplies the extracted result to the center position calculation unit 84.
- The center position calculation unit 84 calculates the center pixel position of the target region supplied from the AND operation unit 83, and supplies it to the three-dimensional position calculation unit 35 as the target detection position for which motion was detected.
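- The camera-side counterpart (units 81 through 84) can be sketched as follows, assuming 8-bit grayscale frames, a boolean target mask from some recogniser (unit 82), and a simple frame-difference motion test with an assumed threshold:

```python
import numpy as np

def moving_target_center(curr_img, prev_img, target_mask, diff_thresh=15):
    """Intersect the motion region with the recognised target region and
    return the centre pixel of the intersection."""
    motion_mask = np.abs(
        curr_img.astype(np.int16) - prev_img.astype(np.int16)
    ) > diff_thresh                      # unit 81: motion region
    region = motion_mask & target_mask   # unit 83: AND operation
    ys, xs = np.nonzero(region)
    if len(xs) == 0:
        return None                      # no moving target in this frame
    # unit 84: centre pixel position of the extracted region
    return float(xs.mean()), float(ys.mean())
```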
- In step S44, the parallax estimation unit 34 calculates parallax from the right camera image supplied from the right camera 21R and the left camera image supplied from the left camera 21L, and supplies the parallax information to the three-dimensional position calculation unit 35.
- In step S45, the three-dimensional position calculation unit 35 on the stereo camera 12 side calculates the target detection position on the camera three-dimensional coordinate system from the parallax information supplied from the parallax estimation unit 34 and the target detection position on the two-dimensional coordinate system supplied from the target detection unit 33.
- In step S46, the correspondence detection unit 36 acquires the positional relationship information between the millimeter wave radar 11 and the stereo camera 12 stored in the storage unit 38, specifically the rotation matrix R and translation vector V of equation (1).
- In the first in-operation calibration process, the positional relationship information acquired from the storage unit 38 is the data calculated in the pre-shipment calibration process; in the second and subsequent in-operation calibration processes, it is the data updated in the previous in-operation calibration process.
- In step S47, the correspondence detection unit 36 uses the acquired positional relationship information to convert the target detection position 132 on the radar three-dimensional coordinate system supplied from the three-dimensional position calculation unit 32 into a position on the camera three-dimensional coordinate system, thereby calculating the position-corrected target detection position.
- In step S48, the correspondence detection unit 36 compares the target detection positions on the camera three-dimensional coordinate system supplied from the three-dimensional position calculation unit 35 on the stereo camera 12 side with the position-corrected target detection positions from the millimeter wave radar 11 side, and detects the correspondence between the targets detected on the stereo camera 12 side and the targets detected on the millimeter wave radar 11 side.
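- Steps S47 and S48 can be sketched together: position-correct the radar detections into the camera frame with the stored (R, V), then pair each with its nearest camera detection. The distance gate is an assumption; the patent speaks only of comparing the corrected positions:

```python
import numpy as np

def match_after_correction(p_mmw, p_cam, R, V, max_dist=1.0):
    """Pair radar detections with camera detections after position
    correction. Returns (radar position, camera position) pairs, which
    can then be fed to the least-squares solve of step S49."""
    corrected = (R @ p_mmw.T).T + V   # equation (1) applied to radar points
    pairs = []
    for m_orig, m_corr in zip(p_mmw, corrected):
        d = np.linalg.norm(p_cam - m_corr, axis=1)
        j = int(d.argmin())
        if d[j] <= max_dist:
            pairs.append((m_orig, p_cam[j]))  # a corresponding point
    return pairs
```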
- In step S49, the position/orientation estimation unit 37 substitutes the target detection positions of the plurality of targets whose correspondences were identified in step S48 into equation (1) and solves it, thereby calculating the positional relationship between the millimeter wave radar 11 and the stereo camera 12.
- In step S50, the position/orientation estimation unit 37 compares the positional relationship information newly calculated in step S49 with the current positional relationship information stored in the storage unit 38, and determines whether the new information is within a predetermined range of the current information.
- If it is determined in step S50 that the new positional relationship information is within the predetermined range of the current positional relationship information, the process proceeds to step S51, where the position/orientation estimation unit 37 overwrites the current positional relationship information stored in the storage unit 38 with the new positional relationship information, and the in-operation calibration process ends.
- If it is determined in step S50 that the new positional relationship information is not within the predetermined range of the current positional relationship information, step S51 is skipped and the in-operation calibration process ends.
- That is, when the newly calculated positional relationship information differs significantly from the previous positional relationship information, the position/orientation estimation unit 37 regards the calculated value as unreliable, possibly including some error factor, and does not update the stored information. Note that the processing in step S50 may be omitted, and the information stored in the storage unit 38 may always be updated with the newly calculated positional relationship information.
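- One way to realise the step S50 gate is to bound the angle of the relative rotation and the shift of the translation; both thresholds below are illustrative assumptions, since the patent leaves the "predetermined range" unspecified:

```python
import numpy as np

def within_update_gate(R_old, V_old, R_new, V_new,
                       max_angle_rad=0.05, max_shift=0.1):
    """Accept the new extrinsics only if they stay within a predetermined
    range of the current ones (step S50)."""
    R_rel = R_new @ R_old.T
    # angle of the relative rotation, recovered from its trace
    cos_a = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.arccos(cos_a)
    shift = np.linalg.norm(V_new - V_old)
    return angle <= max_angle_rad and shift <= max_shift
```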
- The pre-shipment calibration process and the in-operation calibration process are executed as described above.
- In the pre-shipment and in-operation calibration processes described above, the process of calculating the calibration data indicating the positional relationship between the millimeter wave radar 11 and the stereo camera 12 is performed only once; however, it may be performed a plurality of times, and the average value may be stored in the storage unit 38 as the final calibration data.
- Furthermore, when calibration data calculated a plurality of times is used, the final calibration data can be computed after excluding any data that deviates greatly from the other calibration data.
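- A rough sketch of such fusion follows. The median-based outlier rule, its threshold, and the SVD projection of the element-wise mean rotation back onto a valid rotation are all assumptions; the patent only speaks of averaging after excluding deviating data:

```python
import numpy as np

def fuse_calibrations(Rs, Vs, v_outlier=0.2):
    """Fuse calibration data calculated a plurality of times.

    Runs whose translation deviates from the componentwise median by more
    than v_outlier (assumed, in meters) are excluded; the surviving
    translations are averaged, and the mean rotation is re-orthogonalised.
    """
    Vs = np.asarray(Vs)
    med = np.median(Vs, axis=0)
    keep = [i for i, v in enumerate(Vs)
            if np.linalg.norm(v - med) <= v_outlier]
    if not keep:                 # degenerate case: keep everything
        keep = list(range(len(Vs)))
    V_final = Vs[keep].mean(axis=0)
    M = sum(Rs[i] for i in keep) / len(keep)
    U, _, Vt = np.linalg.svd(M)  # project mean matrix onto SO(3)
    d = np.sign(np.linalg.det(U @ Vt))
    R_final = U @ np.diag([1.0, 1.0, d]) @ Vt
    return R_final, V_final
```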
- FIG. 25 shows other examples of targets that can be used in the pre-shipment calibration process.
- For example, a sphere 161 that reflects millimeter waves, or a corner reflector 162, can be used as the target.
- the target detection unit 33 that detects the target position based on the stereo image detects the target position at the pixel level in order to improve calibration accuracy.
- Whereas the intersection point 52 of the texture is calculated when the target is the target 51 described above, when the target is the sphere 161, the sphere 161 in the stereo image is detected as a circle by pattern matching or circular shape recognition, and the center position of the detected circle can be output as the target detection position.
- Similarly, when the target is the corner reflector 162, the corner reflector 162 in the stereo image is detected by pattern matching against a pre-registered pattern of the corner reflector 162, and the center position of the matched registered pattern can be output as the target detection position.
- FIG. 26 shows an example of the reflection signal of the millimeter wave radar 11 and the parallax image calculated from the stereo image captured by the stereo camera 12 when the target is the sphere 161.
- As the target, in addition to the pedestrian (person) described above, general objects present in the traffic environment can be used, for example a part of another vehicle (such as a license plate), a part of the own vehicle, signs, billboards, traffic lights, utility poles, and the like.
- For example, when the target is a pole-shaped object such as a utility pole, the target detection unit 33 detects vertical parallel lines from the stereo image and determines the region enclosed by the detected parallel lines to be a pole region.
- In this example, one pole region is detected in the stereo image, but a plurality of pole regions may be detected.
- Next, the target detection unit 33 performs pole determination on each detected pole region in both the horizontal and vertical directions, based on the parallax information of the stereo image calculated by the parallax estimation unit 34.
- Specifically, when the parallax viewed along the horizontal direction of the pole region changes so as to trace a convex curve, the target detection unit 33 sets the horizontal pole determination result for that region to true; otherwise it sets the result to false.
- Likewise, when the parallax viewed along the vertical direction of the pole region is substantially constant, the target detection unit 33 sets the vertical pole determination result for that region to true; otherwise it sets the result to false.
- Finally, the target detection unit 33 performs an AND operation on the horizontal and vertical pole determination results; that is, only when both results are true does it output the position of the pole region as the target detection position.
- In this manner, when the target is a pole-shaped object such as a utility pole, the target detection unit 33 can also detect the target using the parallax information calculated by the parallax estimation unit 34 and output its target detection position.
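- A compact sketch of the two pole tests on a disparity patch cut out around a candidate region. The centre-peaked convexity proxy, the constancy tolerance, and the reading that the vertical profile should be nearly constant are assumptions layered on the description above:

```python
import numpy as np

def is_pole_region(disparity_patch):
    """Pole determination: disparity across the region should bulge
    toward the camera horizontally (a round pole is closest at its
    centre line) and stay nearly constant vertically."""
    row = np.nanmean(disparity_patch, axis=0)   # profile across the width
    col = np.nanmean(disparity_patch, axis=1)   # profile down the height
    # horizontal test: maximum near the middle, falling toward both edges
    mid = len(row) // 2
    horizontal_ok = row[mid] > row[0] and row[mid] > row[-1]
    # vertical test: disparity roughly constant along the pole
    vertical_ok = (col.max() - col.min()) < 0.05 * col.mean()
    return horizontal_ok and vertical_ok        # the AND operation
```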
- With the object detection system 1 described above, the positional relationship information between the millimeter wave radar 11 and the stereo camera 12 can be detected and calibrated with accuracy on the order of a stereo-image pixel, realizing highly accurate calibration.
- Moreover, even when the positional relationship between the millimeter wave radar 11 and the stereo camera 12 changes owing to factors such as aging, vibration, and heat, it can be corrected (automatically) at an arbitrary timing.
- In the example described above, the object detection system 1 is mounted on a vehicle, but the present technology can also be mounted on other moving objects that travel on land, such as robots.
- In the embodiment described above, the millimeter wave radar 11 is employed as the first sensor for calculating the three-dimensional position of the target in the first coordinate system, and the stereo camera 12 is employed as the second sensor for calculating the three-dimensional position of the target in the second coordinate system.
- As the first sensor, in addition to the millimeter wave radar 11, other radar-type sensors may be used, such as an ultrasonic radar, an infrared laser radar, or a LiDAR.
- In other words, the first sensor may be any sensor that can acquire position information in at least one of the horizontal and vertical directions together with position information in the depth direction.
- the series of processes including the calibration process described above can be executed by hardware or can be executed by software.
- When the series of processes is executed by software, a program constituting the software is installed in a computer.
- Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 28 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
- In the computer, a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, and a RAM (Random Access Memory) 203 are connected to one another by a bus 204.
- An input / output interface 205 is further connected to the bus 204.
- An input unit 206, an output unit 207, a storage unit 208, a communication unit 209, and a drive 210 are connected to the input / output interface 205.
- the input unit 206 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 207 includes a display, a speaker, and the like.
- the storage unit 208 includes a hard disk, a nonvolatile memory, and the like.
- the communication unit 209 includes a network interface and the like.
- the drive 210 drives a removable recording medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer configured as described above, the CPU 201 loads, for example, the program stored in the storage unit 208 into the RAM 203 via the input/output interface 205 and the bus 204 and executes it, whereby the series of processes described above is performed.
- the program can be installed in the storage unit 208 via the input / output interface 205 by attaching the removable recording medium 211 to the drive 210. Further, the program can be received by the communication unit 209 via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting, and can be installed in the storage unit 208. In addition, the program can be installed in the ROM 202 or the storage unit 208 in advance.
- The program executed by the computer may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at necessary timing, such as when a call is made.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- Embodiments of the present technology are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present technology.
- the present technology can take a cloud computing configuration in which one function is shared by a plurality of devices via a network and is jointly processed.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
- When one step includes a plurality of processes, the plurality of processes included in that step can be executed by one device or shared among a plurality of devices.
- In addition, the present technology can also be configured as follows.
- (1) A signal processing device including: a first position calculation unit that calculates a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; a second position calculation unit that calculates a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions together with position information in the depth direction; a correspondence detection unit that detects a correspondence relationship between the target on the first coordinate system and the target on the second coordinate system; and a positional relationship information estimation unit that estimates positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence relationship.
- (2) The signal processing device according to (1), wherein the correspondence detection unit collates each of the target on the first coordinate system and the target on the second coordinate system with pre-arrangement information of the target, identifies the target, and then detects the correspondence relationship.
- (3) The signal processing device according to (1) or (2), wherein the correspondence detection unit superimposes the three-dimensional position of the target on the first coordinate system and the three-dimensional position of the target on the second coordinate system, associates the targets located nearest to each other, and detects the correspondence relationship.
- (4) The signal processing device according to any one of (1) to (3), wherein the first position calculation unit calculates the three-dimensional position of the target for which motion is detected, and the second position calculation unit calculates the three-dimensional position of the target for which motion is detected.
- (5) The signal processing device according to any one of (1) to (4), wherein the first position calculation unit calculates three-dimensional positions of a plurality of the targets from one or more frames of stereo images, the second position calculation unit calculates three-dimensional positions of the plurality of targets from one or more frames of sensor signals, and the correspondence detection unit detects correspondence relationships for the plurality of targets.
- (6) The signal processing device according to (5), wherein the first position calculation unit calculates the three-dimensional positions of the plurality of targets from one frame of a stereo image, and the second position calculation unit calculates the three-dimensional positions of the plurality of targets from one frame of sensor signals.
- (7) The signal processing device according to any one of (1) to (6), further including a storage unit that stores the three-dimensional positions of the target calculated by the first position calculation unit and the second position calculation unit, wherein the correspondence detection unit starts detection of the correspondence relationship when a predetermined number or more of three-dimensional positions of the target have been accumulated in the storage unit.
- (8) The signal processing device according to any one of (1) to (7), wherein a plurality of the targets are arranged at positions differing in the depth direction.
- (9) The signal processing device according to any one of (1) to (8), wherein a plurality of the targets are arranged at positions differing in the horizontal direction.
- (10) The signal processing device according to any one of (1) to (9), wherein a plurality of the targets are arranged at the same height position.
- (11) The signal processing device according to any one of (1) to (10), wherein a plurality of the targets are arranged at positions that do not overlap as viewed from the stereo camera.
- (12) The signal processing device according to any one of (1) to (10), wherein the target is a person.
- (13) The signal processing device according to any one of (1) to (10), wherein the target is an object having a predetermined texture.
- (14) The signal processing device according to any one of (1) to (10), wherein the target is a pole-shaped object.
- (15) The signal processing device according to any one of (1) to (14), wherein the positional relationship information between the first coordinate system and the second coordinate system is a rotation matrix and a translation vector.
- (16) The signal processing device according to any one of (1) to (15), wherein the sensor is a millimeter wave radar.
- (17) A signal processing method including the steps of: calculating a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; calculating a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions together with position information in the depth direction; detecting a correspondence relationship between the target on the first coordinate system and the target on the second coordinate system; and estimating positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence relationship.
- (18) A program for causing a computer to execute processing including the steps of: calculating a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; calculating a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions together with position information in the depth direction; detecting a correspondence relationship between the target on the first coordinate system and the target on the second coordinate system; and estimating positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence relationship.
- 1 object detection system, 11 millimeter wave radar, 12 stereo camera, 13 signal processing device, 21L left camera, 21R right camera, 31 target detection unit, 32 3D position calculation unit, 33 target detection unit, 34 parallax estimation unit, 35 3D position calculation unit, 36 correspondence detection unit, 37 position and orientation estimation unit, 38 storage unit, 51 target, 71 motion detection unit, 72 peak detection unit, 73 AND operation unit, 81 motion region detection unit, 82 image recognition unit, 83 AND operation unit, 84 center position calculation unit, 201 CPU, 202 ROM, 203 RAM, 206 input unit, 207 output unit, 208 storage unit, 209 communication unit, 210 drive
Abstract
Description
1. Configuration example of the object detection system
2. Detailed description of the correspondence detection process
3. In-operation calibration process
4. Processing flow of the calibration process
5. Examples of targets in the pre-shipment calibration process
6. Examples of targets in the in-operation calibration process
7. Computer configuration example
FIG. 1 is a block diagram showing a configuration example of an object detection system to which the present technology is applied.
Pcam(k)=R・PMMW(k)+V ・・・・・・・(1)
<First correspondence detection process>
Next, the first correspondence detection process performed by the correspondence detection unit 36 using the target pre-arrangement information will be described in more detail.
The first correspondence detection process described above is a detection method that uses the target pre-arrangement information; however, the correspondence between targets detected in the radar three-dimensional coordinate system and targets detected in the camera three-dimensional coordinate system can also be detected without using the target pre-arrangement information.
Next, the in-operation calibration process will be described.
The in-operation calibration process will be described concretely with reference to FIGS. 16 to 21.
Next, the calibration process executed by the signal processing device 13 will be described with reference to the flowchart of FIG. 22.
FIG. 25 shows other examples of targets that can be used in the pre-shipment calibration process.
Next, other examples of targets that can be used in the in-operation calibration process will be described.
The series of processes including the calibration process described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed in a computer. Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
Claims (18)
- A signal processing device comprising: a first position calculation unit that calculates a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; a second position calculation unit that calculates a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions together with position information in the depth direction; a correspondence detection unit that detects a correspondence relationship between the target on the first coordinate system and the target on the second coordinate system; and a positional relationship information estimation unit that estimates positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence relationship.
- The signal processing device according to claim 1, wherein the correspondence detection unit collates each of the target on the first coordinate system and the target on the second coordinate system with pre-arrangement information of the target, identifies the target, and then detects the correspondence relationship.
- The signal processing device according to claim 1, wherein the correspondence detection unit superimposes the three-dimensional position of the target on the first coordinate system and the three-dimensional position of the target on the second coordinate system, associates the targets located nearest to each other, and detects the correspondence relationship.
- The signal processing device according to claim 1, wherein the first position calculation unit calculates the three-dimensional position of the target for which motion is detected, and the second position calculation unit calculates the three-dimensional position of the target for which motion is detected.
- The signal processing device according to claim 1, wherein the first position calculation unit calculates three-dimensional positions of a plurality of the targets from one or more frames of stereo images, the second position calculation unit calculates three-dimensional positions of the plurality of targets from one or more frames of sensor signals, and the correspondence detection unit detects correspondence relationships for the plurality of targets.
- The signal processing device according to claim 5, wherein the first position calculation unit calculates the three-dimensional positions of the plurality of targets from one frame of a stereo image, and the second position calculation unit calculates the three-dimensional positions of the plurality of targets from one frame of sensor signals.
- The signal processing device according to claim 1, further comprising a storage unit that stores the three-dimensional positions of the target calculated by the first position calculation unit and the second position calculation unit, wherein the correspondence detection unit starts detection of the correspondence relationship when a predetermined number or more of three-dimensional positions of the target have been accumulated in the storage unit.
- The signal processing device according to claim 1, wherein a plurality of the targets are arranged at positions differing in the depth direction.
- The signal processing device according to claim 1, wherein a plurality of the targets are arranged at positions differing in the horizontal direction.
- The signal processing device according to claim 1, wherein a plurality of the targets are arranged at the same height position.
- The signal processing device according to claim 1, wherein a plurality of the targets are arranged at positions that do not overlap as viewed from the stereo camera.
- The signal processing device according to claim 1, wherein the target is a person.
- The signal processing device according to claim 1, wherein the target is an object having a predetermined texture.
- The signal processing device according to claim 1, wherein the target is a pole-shaped object.
- The signal processing device according to claim 1, wherein the positional relationship information between the first coordinate system and the second coordinate system is a rotation matrix and a translation vector.
- The signal processing device according to claim 1, wherein the sensor is a millimeter wave radar.
- A signal processing method comprising the steps of: calculating a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; calculating a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions together with position information in the depth direction; detecting a correspondence relationship between the target on the first coordinate system and the target on the second coordinate system; and estimating positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence relationship.
- A program for causing a computer to execute processing comprising the steps of: calculating a three-dimensional position of a target in a first coordinate system from a stereo image captured by a stereo camera; calculating a three-dimensional position of the target in a second coordinate system from a sensor signal of a sensor capable of acquiring position information in at least one of the horizontal and vertical directions together with position information in the depth direction; detecting a correspondence relationship between the target on the first coordinate system and the target on the second coordinate system; and estimating positional relationship information between the first coordinate system and the second coordinate system based on the detected correspondence relationship.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017543132A JP6825569B2 (ja) | 2015-09-30 | 2016-09-16 | 信号処理装置、信号処理方法、およびプログラム |
CN201680055407.1A CN108139475A (zh) | 2015-09-30 | 2016-09-16 | 信号处理设备、信号处理方法和程序 |
US15/762,136 US10908257B2 (en) | 2015-09-30 | 2016-09-16 | Signal processing apparatus, signal processing method, and program |
EP16851215.0A EP3358368A4 (en) | 2015-09-30 | 2016-09-16 | SIGNAL PROCESSING APPARATUS, SIGNAL PROCESSING METHOD, AND PROGRAM |
US17/139,290 US11719788B2 (en) | 2015-09-30 | 2020-12-31 | Signal processing apparatus, signal processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015194134 | 2015-09-30 | ||
JP2015-194134 | 2015-09-30 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/762,136 A-371-Of-International US10908257B2 (en) | 2015-09-30 | 2016-09-16 | Signal processing apparatus, signal processing method, and program |
US17/139,290 Continuation US11719788B2 (en) | 2015-09-30 | 2020-12-31 | Signal processing apparatus, signal processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017057041A1 true WO2017057041A1 (ja) | 2017-04-06 |
Family
ID=58427496
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/077397 WO2017057041A1 (ja) | 2015-09-30 | 2016-09-16 | 信号処理装置、信号処理方法、およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (2) | US10908257B2 (ja) |
EP (1) | EP3358368A4 (ja) |
JP (1) | JP6825569B2 (ja) |
CN (1) | CN108139475A (ja) |
WO (1) | WO2017057041A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017223680A (ja) * | 2016-12-30 | 2017-12-21 | 東軟集団股▲分▼有限公司 | 目標検出情報を生成する方法、装置、および、機器 |
JP2020030466A (ja) * | 2018-08-20 | 2020-02-27 | 株式会社Soken | 物体検知装置 |
KR20200052589A (ko) * | 2018-11-07 | 2020-05-15 | 현대자동차주식회사 | 전방 차량 오인식 제거 장치 및 그의 오인식 제거 방법과 그를 포함하는 차량 |
JP2020079781A (ja) * | 2018-09-07 | 2020-05-28 | バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド | 相対的位置姿勢の決定方法、装置、機器及び媒体 |
WO2020116206A1 (ja) * | 2018-12-07 | 2020-06-11 | ソニーセミコンダクタソリューションズ株式会社 | 情報処理装置、および情報処理方法、並びにプログラム |
JP2020530555A (ja) * | 2017-07-26 | 2020-10-22 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツングRobert Bosch Gmbh | 物体の位置を認識する装置および方法 |
WO2022044634A1 (ja) * | 2020-08-26 | 2022-03-03 | 株式会社デンソー | 対象物認識装置、移動体衝突予防装置および対象物認識方法 |
WO2024143349A1 (ja) * | 2022-12-28 | 2024-07-04 | 住友重機械工業株式会社 | 作業機械の周辺監視システム |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108139475A (zh) | 2015-09-30 | 2018-06-08 | 索尼公司 | 信号处理设备、信号处理方法和程序 |
JP6660751B2 (ja) * | 2016-02-04 | 2020-03-11 | 日立オートモティブシステムズ株式会社 | 撮像装置 |
JP6194520B1 (ja) * | 2016-06-24 | 2017-09-13 | 三菱電機株式会社 | 物体認識装置、物体認識方法および自動運転システム |
US11067996B2 (en) | 2016-09-08 | 2021-07-20 | Siemens Industry Software Inc. | Event-driven region of interest management |
US10481243B2 (en) * | 2016-10-31 | 2019-11-19 | Aptiv Technologies Limited | Automated vehicle radar system with self-calibration |
JP2019015553A (ja) * | 2017-07-05 | 2019-01-31 | ソニーセミコンダクタソリューションズ株式会社 | 情報処理装置、情報処理方法および個体撮像装置 |
US20190120934A1 (en) * | 2017-10-19 | 2019-04-25 | GM Global Technology Operations LLC | Three-dimensional alignment of radar and camera sensors |
CN110044371A (zh) * | 2018-01-16 | 2019-07-23 | 华为技术有限公司 | 一种车辆定位的方法以及车辆定位装置 |
US11041941B2 (en) * | 2018-02-26 | 2021-06-22 | Steradian Semiconductors Private Limited | Method and device for calibrating a radar object detection system |
CN110609274B (zh) * | 2018-06-15 | 2022-07-01 | 杭州海康威视数字技术股份有限公司 | 一种测距方法、装置及系统 |
KR102675522B1 (ko) * | 2018-09-07 | 2024-06-14 | 삼성전자주식회사 | 센서들에 대한 정렬 모델 조정 방법 및 그 방법을 수행하는 전자 장치 |
TWI734932B (zh) * | 2018-09-17 | 2021-08-01 | 為昇科科技股份有限公司 | 雷達偵測角度校正系統及其方法 |
US11982736B2 (en) * | 2018-10-01 | 2024-05-14 | Kpit Technologies Limited | Perception sensors based fusion system for vehicle control and method thereof |
JP7148064B2 (ja) * | 2018-10-25 | 2022-10-05 | 株式会社アイシン | カメラパラメータ推定装置、カメラパラメータ推定方法、およびカメラパラメータ推定プログラム |
WO2020141588A1 (ja) * | 2019-01-02 | 2020-07-09 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 情報処理装置、情報処理方法及びプログラム |
JP7238422B2 (ja) * | 2019-01-22 | 2023-03-14 | 株式会社リコー | 測距方法、測距装置、車載装置、移動体、測距システム |
JP2020118567A (ja) * | 2019-01-24 | 2020-08-06 | ソニーセミコンダクタソリューションズ株式会社 | 測距装置、車載システム及び測距方法 |
JP7062138B2 (ja) * | 2019-05-16 | 2022-05-02 | 三菱電機株式会社 | 情報処理装置、情報処理方法、及び、情報処理プログラム |
US10942267B2 (en) | 2019-06-17 | 2021-03-09 | Advanced New Technologies Co., Ltd. | Video object processing |
CN110263700B (zh) * | 2019-06-17 | 2021-04-27 | 创新先进技术有限公司 | 视频处理方法、装置、设备及视频监控系统 |
CN112208529B (zh) * | 2019-07-09 | 2022-08-02 | 毫末智行科技有限公司 | 用于目标检测的感知系统、驾驶辅助方法和无人驾驶设备 |
CN110597390B (zh) * | 2019-09-12 | 2022-05-20 | Oppo广东移动通信有限公司 | 控制方法、电子装置和存储介质 |
CN112558023B (zh) * | 2019-09-25 | 2024-03-26 | 华为技术有限公司 | 传感器的标定方法和装置 |
US11892559B2 (en) * | 2019-12-18 | 2024-02-06 | Rohde & Schwarz Gmbh & Co. Kg | Radar calibration system and method for moving a radar calibration target along a desired movement path |
CN111060904B (zh) * | 2019-12-25 | 2022-03-15 | 中国汽车技术研究中心有限公司 | 一种基于毫米波与视觉融合感知的盲区监测方法 |
JP7323057B2 (ja) * | 2020-03-31 | 2023-08-08 | 日本電気株式会社 | 制御装置、制御方法、および、制御プログラム |
US12105225B2 (en) * | 2020-04-17 | 2024-10-01 | Velodyne Lidar Usa, Inc. | Systems and methods for calibrating a LiDAR device |
US12055632B2 (en) * | 2020-10-13 | 2024-08-06 | Waymo Llc | LIDAR based stereo camera correction |
KR102484691B1 (ko) * | 2021-03-15 | 2023-01-05 | 주식회사 바이다 | 스테레오 카메라 및 레이더를 이용한 차량 감지 시스템 및 차량 감지 방법 |
US12111385B2 (en) * | 2021-12-23 | 2024-10-08 | Gm Cruise Holdings Llc | Radar sensor processing chain |
CN114509762A (zh) * | 2022-02-15 | 2022-05-17 | 南京慧尔视智能科技有限公司 | 一种数据处理方法、装置、设备及介质 |
EP4310534A1 (en) * | 2022-07-21 | 2024-01-24 | Inxpect S.p.A. | Target detection in world reference system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007218738A (ja) * | 2006-02-16 | 2007-08-30 | Kumamoto Univ | 校正装置、物標検知装置および校正方法 |
JP2010151682A (ja) * | 2008-12-25 | 2010-07-08 | Topcon Corp | レーザスキャナ及びレーザスキャナ測定システム及びレーザスキャナ測定システムの較正方法及び較正用ターゲット |
US20110122257A1 (en) * | 2009-11-25 | 2011-05-26 | Honeywell International Inc. | Geolocation of objects in an area of interest |
JP2014153211A (ja) * | 2013-02-08 | 2014-08-25 | Furukawa Electric Co Ltd:The | 周辺監視システム及び周辺監視システムの軸ずれ検知方法 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3212218B2 (ja) * | 1994-05-26 | 2001-09-25 | 三菱電機株式会社 | 車両用障害物検出装置 |
US6859705B2 (en) * | 2001-09-21 | 2005-02-22 | Ford Global Technologies, Llc | Method for operating a pre-crash sensing system with object classifier in a vehicle having a countermeasure system |
WO2005005206A1 (en) * | 2003-07-11 | 2005-01-20 | Toyota Jidosha Kabushiki Kaisha | Crash-safe vehicle control system |
JP3918791B2 (ja) * | 2003-09-11 | 2007-05-23 | トヨタ自動車株式会社 | 物体検出装置 |
JP2006011570A (ja) * | 2004-06-23 | 2006-01-12 | Daihatsu Motor Co Ltd | カメラキャリブレーション方法及びカメラキャリブレーション装置 |
JP2006252473A (ja) * | 2005-03-14 | 2006-09-21 | Toshiba Corp | 障害物検出装置、キャリブレーション装置、キャリブレーション方法およびキャリブレーションプログラム |
WO2006123615A1 (ja) * | 2005-05-19 | 2006-11-23 | Olympus Corporation | 距離計測装置、距離計測方法および距離計測プログラム |
JP4304517B2 (ja) * | 2005-11-09 | 2009-07-29 | トヨタ自動車株式会社 | 物体検出装置 |
US20070182623A1 (en) * | 2006-02-03 | 2007-08-09 | Shuqing Zeng | Method and apparatus for on-vehicle calibration and orientation of object-tracking systems |
JP4595833B2 (ja) * | 2006-02-24 | 2010-12-08 | トヨタ自動車株式会社 | 物体検出装置 |
JP2008116357A (ja) * | 2006-11-06 | 2008-05-22 | Toyota Motor Corp | 物体検出装置 |
US8855848B2 (en) * | 2007-06-05 | 2014-10-07 | GM Global Technology Operations LLC | Radar, lidar and camera enhanced methods for vehicle dynamics estimation |
JP5145585B2 (ja) * | 2007-06-08 | 2013-02-20 | 国立大学法人 熊本大学 | 物標検出装置 |
US20090292468A1 (en) * | 2008-03-25 | 2009-11-26 | Shunguang Wu | Collision avoidance method and system using stereo vision and radar sensor fusion |
JP4434296B1 (ja) * | 2008-09-05 | 2010-03-17 | トヨタ自動車株式会社 | 物体検出装置 |
JP5632762B2 (ja) * | 2011-01-25 | 2014-11-26 | パナソニック株式会社 | 測位情報形成装置、検出装置、及び測位情報形成方法 |
JP5503578B2 (ja) * | 2011-03-10 | 2014-05-28 | パナソニック株式会社 | 物体検出装置及び物体検出方法 |
US8970425B2 (en) * | 2011-06-09 | 2015-03-03 | Sony Corporation | Radar apparatus and method |
US9405006B2 (en) * | 2012-09-03 | 2016-08-02 | Toyota Jidosha Kabushiki Kaisha | Collision determination device and collision determination method |
CN104318561B (zh) * | 2014-10-22 | 2017-05-03 | 上海理工大学 | 基于双目立体视觉与光流融合的车辆运动信息检测方法 |
CN108139475A (zh) | 2015-09-30 | 2018-06-08 | 索尼公司 | 信号处理设备、信号处理方法和程序 |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007218738A (ja) * | 2006-02-16 | 2007-08-30 | Kumamoto Univ | 校正装置、物標検知装置および校正方法 |
JP2010151682A (ja) * | 2008-12-25 | 2010-07-08 | Topcon Corp | レーザスキャナ及びレーザスキャナ測定システム及びレーザスキャナ測定システムの較正方法及び較正用ターゲット |
US20110122257A1 (en) * | 2009-11-25 | 2011-05-26 | Honeywell International Inc. | Geolocation of objects in an area of interest |
JP2014153211A (ja) * | 2013-02-08 | 2014-08-25 | Furukawa Electric Co Ltd:The | 周辺監視システム及び周辺監視システムの軸ずれ検知方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3358368A4 * |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10217005B2 (en) | 2016-12-30 | 2019-02-26 | Neusoft Corporation | Method, apparatus and device for generating target detection information |
JP2017223680A (ja) * | 2016-12-30 | 2017-12-21 | 東軟集団股▲分▼有限公司 | 目標検出情報を生成する方法、装置、および、機器 |
JP2020530555A (ja) * | 2017-07-26 | 2020-10-22 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツングRobert Bosch Gmbh | 物体の位置を認識する装置および方法 |
US11313961B2 (en) | 2017-07-26 | 2022-04-26 | Robert Bosch Gmbh | Method and device for identifying the height of an object |
JP2020030466A (ja) * | 2018-08-20 | 2020-02-27 | 株式会社Soken | 物体検知装置 |
JP7135579B2 (ja) | 2018-08-20 | 2022-09-13 | 株式会社Soken | 物体検知装置 |
JP7222854B2 (ja) | 2018-09-07 | 2023-02-15 | アポロ インテリジェント ドライビング テクノロジー(ペキン)カンパニー リミテッド | 相対的位置姿勢の決定方法、装置、機器及び媒体 |
US11372101B2 (en) | 2018-09-07 | 2022-06-28 | Apollo Intelligent Driving Technology (Beijing) Co., Ltd. | Method and apparatus for determining relative pose, device and medium |
JP2020079781A (ja) * | 2018-09-07 | 2020-05-28 | バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド | 相対的位置姿勢の決定方法、装置、機器及び媒体 |
KR20200052589A (ko) * | 2018-11-07 | 2020-05-15 | 현대자동차주식회사 | 전방 차량 오인식 제거 장치 및 그의 오인식 제거 방법과 그를 포함하는 차량 |
KR102524293B1 (ko) | 2018-11-07 | 2023-04-21 | 현대자동차주식회사 | 전방 차량 오인식 제거 장치 및 그의 오인식 제거 방법과 그를 포함하는 차량 |
WO2020116206A1 (ja) * | 2018-12-07 | 2020-06-11 | ソニーセミコンダクタソリューションズ株式会社 | 情報処理装置、および情報処理方法、並びにプログラム |
US11978261B2 (en) | 2018-12-07 | 2024-05-07 | Sony Semiconductor Solutions Corporation | Information processing apparatus and information processing method |
WO2022044634A1 (ja) * | 2020-08-26 | 2022-03-03 | 株式会社デンソー | 対象物認識装置、移動体衝突予防装置および対象物認識方法 |
JP2022037985A (ja) * | 2020-08-26 | 2022-03-10 | 株式会社デンソー | 対象物認識装置、移動体衝突予防装置および対象物認識方法 |
JP7359107B2 (ja) | 2020-08-26 | 2023-10-11 | 株式会社デンソー | 対象物認識装置、移動体衝突予防装置および車両 |
WO2024143349A1 (ja) * | 2022-12-28 | 2024-07-04 | 住友重機械工業株式会社 | 作業機械の周辺監視システム |
Also Published As
Publication number | Publication date |
---|---|
US11719788B2 (en) | 2023-08-08 |
US20210124013A1 (en) | 2021-04-29 |
US20180267142A1 (en) | 2018-09-20 |
JP6825569B2 (ja) | 2021-02-03 |
EP3358368A1 (en) | 2018-08-08 |
CN108139475A (zh) | 2018-06-08 |
EP3358368A4 (en) | 2019-03-13 |
JPWO2017057041A1 (ja) | 2018-08-09 |
US10908257B2 (en) | 2021-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017057041A1 (ja) | 信号処理装置、信号処理方法、およびプログラム | |
CN111191600B (zh) | 障碍物检测方法、装置、计算机设备和存储介质 | |
EP3598874B1 (en) | Systems and methods for updating a high-resolution map based on binocular images | |
CA3028653C (en) | Methods and systems for color point cloud generation | |
CN111448591B (zh) | 不良光照条件下用于定位车辆的系统和方法 | |
AU2018282302B2 (en) | Integrated sensor calibration in natural scenes | |
EP3283843B1 (en) | Generating 3-dimensional maps of a scene using passive and active measurements | |
WO2017159382A1 (ja) | 信号処理装置および信号処理方法 | |
JP5588812B2 (ja) | 画像処理装置及びそれを用いた撮像装置 | |
US7103213B2 (en) | Method and apparatus for classifying an object | |
US20170285161A1 (en) | Object Detection Using Radar And Vision Defined Image Detection Zone | |
US11061122B2 (en) | High-definition map acquisition system | |
WO2020140164A1 (en) | Systems and methods for updating a high-definition map | |
WO2020133415A1 (en) | Systems and methods for constructing a high-definition map based on landmarks | |
AU2018102199A4 (en) | Methods and systems for color point cloud generation | |
Li et al. | Automatic Surround Camera Calibration Method in Road Scene for Self-driving Car | |
US20240098231A1 (en) | Image processing device, image processing method, and computer-readable medium | |
Paracchini et al. | Accurate omnidirectional multi-camera embedded structure from motion | |
CN117953046A (zh) | 数据处理方法、装置、控制器、车辆及存储介质 | |
CN115511975A (zh) | 一种单目相机的测距方法及计算机程序产品 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16851215; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017543132; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 15762136; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2016851215; Country of ref document: EP |