WO2017130397A1 - Position estimation device, position estimation method, and position estimation program - Google Patents

Position estimation device, position estimation method, and position estimation program

Info

Publication number
WO2017130397A1
Authority
WO
WIPO (PCT)
Prior art keywords
movement
moving object
image
imaging
trajectory
Prior art date
Application number
PCT/JP2016/052758
Other languages
English (en)
Japanese (ja)
Inventor
藤田 卓志
水谷 政美
真司 神田
佐藤 裕一
直之 沢崎
Original Assignee
富士通株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通株式会社 filed Critical 富士通株式会社
Priority to PCT/JP2016/052758 priority Critical patent/WO2017130397A1/fr
Publication of WO2017130397A1 publication Critical patent/WO2017130397A1/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network

Definitions

  • the present invention relates to a position estimation device, a position estimation method, and a position estimation program.
  • Techniques for estimating the position of a moving object by analyzing images captured by an imaging device include SLAM (Simultaneous Localization and Mapping) and Visual Odometry.
  • In these techniques, a target is detected from each of the images captured at different times, and the movement direction and distance of the moving object are calculated by tracking changes in the coordinates at which the target appears in the images.
  • the technology for estimating the position of a moving object from an image is useful when the moving object exists indoors where it is difficult to use a satellite positioning system such as GPS (Global Positioning System).
  • the technique for estimating the position of a moving object from an image is also useful when it is desired to estimate the position of a moving object with higher accuracy than a satellite positioning system.
  • For position estimation using an image, a technique has been proposed in which an imaging device with a fisheye lens having a wide visual field range is mounted on a moving object, and the position is estimated by analyzing the fisheye-lens image.
  • a technique has been proposed in which two imaging devices (stereo cameras) facing in the same direction are mounted on a moving object, and the position is estimated using stereo images captured by the two imaging devices.
  • Depending on the situation around the moving object, however, the image captured by the imaging device may be one from which it is difficult to detect targets around the moving object.
  • In that case, the accuracy of estimating the position of the moving object decreases.
  • For example, when a plane without a pattern, such as the ground or a wall, occupies a large area of the image, the estimation accuracy is likely to decrease.
  • Also, when halation occurs and the image is washed out white because strong light such as sunlight strikes the imaging device, the estimation accuracy is likely to be lowered.
  • Also, when a shadow of the moving object appears in the image and moves together with the moving object, or when another moving object appearing in the image is itself moving, the estimation accuracy tends to be lowered.
  • an object of the present invention is to provide a position estimation device, a position estimation method, and a position estimation program that improve the accuracy of estimating the position of a moving object from an image.
  • A position estimation device having an image acquisition unit and a determination unit is provided.
  • The image acquisition unit acquires a sequence of first image data captured at different times from a first imaging device provided in the moving object, and acquires a sequence of second image data captured at different times from a second imaging device provided in the moving object whose imaging direction differs from that of the first imaging device.
  • The determination unit calculates, from the sequence of first image data, a first movement trajectory indicating a first estimate of the route along which the moving object has moved, calculates, from the sequence of second image data, a second movement trajectory indicating a second estimate of the route along which the moving object has moved, and determines the position of the moving object using the first movement trajectory and the second movement trajectory.
  • a position estimation method executed by the position estimation device is provided.
  • a position estimation program to be executed by a computer is provided.
  • FIG. 1 is a diagram showing an example of the position estimation device of the first embodiment. FIG. 2 is a diagram showing an example of the arrangement of imaging devices in a vehicle.
  • FIG. 1 is a diagram illustrating an example of a position estimation apparatus according to the first embodiment.
  • the position estimation device 10 estimates the position of the moving object 20 from images of a plurality of imaging devices mounted on the moving object 20.
  • the moving object 20 is a movable artificial object, such as a vehicle, a robot, or a drone.
  • the position estimation device 10 may be mounted on the moving object 20 or may exist outside the moving object 20.
  • Information indicating the estimated position of the moving object 20 may be output from an output device such as a display or a speaker, or may be used for autonomous traveling of the moving object 20 (for example, automatic parking of a vehicle).
  • the position estimation device 10 includes an image acquisition unit 11 and a determination unit 12.
  • the image acquisition unit 11 is an interface that captures image data from the imaging apparatus.
  • the determination unit 12 determines the position of the moving object 20 by processing the image data captured by the image acquisition unit 11.
  • As the determination unit 12, for example, a processor such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor) can be used.
  • the determination unit 12 may include an electronic circuit for a specific application such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the processor executes a program stored in a memory such as a RAM (Random Access Memory).
  • the program includes a position estimation program in which processing described below is described.
  • a set of multiple processors (multiprocessor) may be referred to as a “processor”.
  • the moving object 20 has a plurality of imaging devices including an imaging device 21 (first imaging device) and an imaging device 22 (second imaging device).
  • The imaging devices 21 and 22 are installed so that their imaging directions differ from each other. It is preferable that the visual field range of the imaging device 21 and that of the imaging device 22 overlap as little as possible. The two visual field ranges may not overlap at all.
  • The center of the visual field range of the imaging device 21 (the direction in which the center of the image is captured) and the center of the visual field range of the imaging device 22 may be separated by 90° or more as seen from the moving object 20.
  • the imaging device 21 is installed in front of the moving object 20 (in the normal traveling direction), and the imaging device 22 is installed in the rear of the moving object 20 (in the direction opposite to the normal traveling direction).
  • the imaging device 21 is installed on the left side surface of the moving object 20 and the imaging device 22 is installed on the right side surface of the moving object 20.
  • the imaging device 21 is installed in front of or behind the moving object 20 and the imaging device 22 is installed on the left side or right side of the moving object 20.
  • the image acquisition unit 11 acquires a sequence of image data with different imaging times from each of a plurality of imaging devices included in the moving object 20.
  • For example, the image acquisition unit 11 acquires, from the imaging device 21, a sequence 23 of image data (first sequence of image data) captured by the imaging device 21. Further, the image acquisition unit 11 acquires, from the imaging device 22, a sequence 24 of image data (second sequence of image data) captured by the imaging device 22.
  • the determination unit 12 calculates a movement trajectory indicating an estimation of a route traveled by the moving object 20 from each of the plurality of image data sequences acquired by the image acquisition unit 11.
  • the determination unit 12 calculates a movement trajectory 13 (first movement trajectory) indicating the first estimation of the route from the image data sequence 23 acquired from the imaging device 21.
  • the determination unit 12 calculates a movement trajectory 14 (second movement trajectory) indicating the second estimation of the route from the sequence 24 of image data acquired from the imaging device 22. Since the movement locus 13 and the movement locus 14 are calculated from different image data, there is a high possibility that they do not coincide.
  • Image processing technology such as SLAM can be used for calculating the movement trajectories 13 and 14.
  • For example, the determination unit 12 detects targets around the moving object 20 (for example, the outer wall of a building or a white line on the ground) from each image data item in the sequence 23. Then, the determination unit 12 tracks the change in the coordinates at which each target appears in the image, estimates the route along which the moving object 20 has moved, and calculates the movement trajectory 13. Similarly, the determination unit 12 detects targets from each image data item in the sequence 24 and calculates the movement trajectory 14.
  • The determination unit 12 determines the position 15 of the moving object 20 (for example, the current position of the moving object 20) using a plurality of movement trajectories including the movement trajectories 13 and 14. For example, the determination unit 12 generates a single combined movement trajectory by combining the plurality of movement trajectories, and determines the position 15 based on the combined movement trajectory. The plurality of movement trajectories are combined, for example, by taking their average. The determination unit 12 may instead assign a weight to each movement trajectory according to its shape and take a weighted average of the movement trajectories. In that case, a small weight may be given to an unnatural movement trajectory such as a meandering one.
  • When three or more movement trajectories are calculated, the determination unit 12 may compare them, determine an abnormal movement trajectory that deviates greatly from the other movement trajectories, and exclude it from the combination.
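  • As an illustration of such a combination, the following is a minimal sketch (not the patented implementation) that averages several movement trajectories sampled at the same times, weights them by how close each one stays to a provisional average, and excludes any trajectory whose deviation exceeds a threshold. The function name, array shapes, and threshold are hypothetical.

```python
import numpy as np

def synthesize_trajectories(trajectories, deviation_threshold=1.0):
    """Combine several estimated trajectories (each an (N, 3) array of
    positions at the same sample times) into one combined trajectory.

    A trajectory whose mean distance from the provisional average exceeds
    deviation_threshold is treated as abnormal and excluded.  (Assumes at
    least one trajectory passes the check.)
    """
    stack = np.stack(trajectories)            # (num_trajectories, N, 3)
    mean_traj = stack.mean(axis=0)            # provisional average trajectory

    # Degree of deviation of each trajectory from the provisional average.
    deviations = np.linalg.norm(stack - mean_traj, axis=2).mean(axis=1)
    keep = deviations < deviation_threshold

    # Weighted average of the remaining trajectories; a smaller deviation
    # (a less "unnatural" trajectory) gets a larger weight.
    weights = 1.0 / (deviations[keep] + 1e-9)
    weights /= weights.sum()
    combined = np.tensordot(weights, stack[keep], axes=1)   # (N, 3)
    return combined, keep
```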
  • As described above, the sequence 23 of image data is acquired from the imaging device 21 included in the moving object 20, and the movement trajectory 13 is calculated from it. Further, the sequence 24 of image data is acquired from the imaging device 22, whose imaging direction differs from that of the imaging device 21, and the movement trajectory 14 is calculated from it independently of the movement trajectory 13. Then, the position 15 of the moving object 20 is estimated using the movement trajectories 13 and 14.
  • Compared with a method that calculates a single movement trajectory from stereo images captured by the imaging devices 21 and 22, the imaging direction of the imaging device 22 can be shifted greatly from that of the imaging device 21. Therefore, the risk that the images of the imaging devices 21 and 22 both become unsuitable for position estimation at the same time can be reduced, and the accuracy of determining the position 15 improves.
  • the vehicle 30 according to the second embodiment is a four-wheeled vehicle driven by a person.
  • the vehicle 30 includes four imaging devices as sensors that monitor the situation around the vehicle 30.
  • FIG. 2 is a diagram illustrating an arrangement example of the imaging devices in the vehicle.
  • the vehicle 30 includes imaging devices 31 to 34.
  • The imaging device 31 is installed at the front of the vehicle 30 so that the imaging direction (the direction perpendicular to the lens surface) coincides with the front direction of the vehicle.
  • The imaging device 32 is installed at the rear of the vehicle 30 so that the imaging direction is opposite to the front direction of the vehicle.
  • The imaging device 33 is installed on the left side surface of the vehicle so that the imaging direction is shifted 90° to the left from the front direction of the vehicle.
  • The imaging device 34 is installed on the right side surface of the vehicle so that the imaging direction is shifted 90° to the right from the front direction of the vehicle.
  • a fisheye lens is used for the imaging devices 31 to 34.
  • The field of view of each of the imaging devices 31 to 34 is 190° with respect to the lens surface. That is, the imaging devices 31 to 34 each have a visual field range of 190° in the horizontal direction and 190° in the vertical direction.
  • the vehicle 30 has four imaging devices whose imaging directions are shifted by 90 °.
  • However, the vehicle 30 may have any number of imaging devices equal to or greater than three.
  • the vehicle 30 may include six imaging devices whose imaging directions are shifted by 60 °.
  • FIG. 3 is a block diagram illustrating a hardware example of the vehicle.
  • the vehicle 30 includes an odometer 35, a GPS measurement unit 36, an automatic parking device 37, a position estimation device 100, and a navigation device 200 in addition to the imaging devices 31 to 34.
  • the position estimation apparatus 100 corresponds to the position estimation apparatus 10 of the first embodiment.
  • the odometer 35 measures the distance traveled by the vehicle 30 based on the rotational speed of the tire of the vehicle 30 and the like.
  • The travel distance provided by the odometer 35 may be a cumulative travel distance from the start of measurement, or may be a travel distance during the most recent fixed period of time.
  • the GPS measurement unit 36 receives a GPS signal from a GPS satellite, and calculates the current position of the vehicle 30 in the earth coordinate system based on the GPS signal.
  • the position in the earth coordinate system can be expressed by latitude and longitude.
  • the GPS measurement unit 36 may not receive a GPS signal and may not be able to calculate the current position.
  • the current position calculated by the GPS measurement unit 36 may include an error of several meters to several tens of meters.
  • the automatic parking device 37 moves the vehicle 30 to the parking space of the parking lot by automatic driving regardless of the user's driving.
  • According to an instruction from the user, the automatic parking device 37 shifts to an automatic parking mode in which the accelerator, brake, steering wheel, and the like of the vehicle 30 are operated automatically.
  • the automatic parking device 37 uses the estimation result of the current position provided by the position estimation device 100 described later in order to determine an appropriate moving direction and moving amount.
  • the current position estimated by the position estimation device 100 is more accurate than the current position calculated by the GPS measurement unit 36.
  • the estimation error of the position estimation apparatus 100 is expected to be about several centimeters.
  • the position estimation apparatus 100 can estimate the current position even indoors where GPS signals cannot be received. Therefore, in the automatic parking mode, the estimation result of the position estimation device 100 is used.
  • the position estimation apparatus 100 analyzes images captured by the image capturing apparatuses 31 to 34 using SLAM, which is an image processing technique, and estimates the movement trajectory and the current position of the vehicle 30.
  • the movement trajectory and the current position output by the position estimation apparatus 100 can be expressed using absolute coordinates in the earth coordinate system, as in the case of the GPS measurement unit 36.
  • The position estimation device 100 sets a reference position in the earth coordinate system using external information such as a GPS signal, detects the relative movement of the vehicle 30 from the reference position by image analysis, and calculates the movement trajectory and the current position.
  • the position estimation device 100 outputs the estimated current position to the automatic parking device 37. Thus, automatic parking of the vehicle 30 is executed based on the estimated current position. Further, the position estimation device 100 outputs the estimated movement trajectory and the current position to the navigation device 200. Thereby, the movement trajectory and the current position are displayed on the screen of the navigation device 200 so as to overlap the map prepared in advance.
  • the position estimation apparatus 100 may use the output of the odometer 35 or the output of the GPS measurement unit 36. Details of the inside of the position estimation apparatus 100 will be described later.
  • the navigation device 200 is an in-vehicle device that supports the driving of the user of the vehicle 30 and presents the situation around the vehicle 30 to the user. For example, the navigation device 200 accepts designation of the destination from the user, calculates a recommended route from the current position measured by the GPS measurement unit 36 to the destination, and displays the recommended route on the display so as to overlap the map. The navigation device 200 may reproduce a voice message indicating a recommended route from a speaker.
  • the navigation device 200 acquires information on the movement locus and the current position from the position estimation device 100, and displays the movement locus and the current position on the display so as to overlap with the map.
  • the navigation device 200 may reproduce a voice message indicating the state of automatic parking from a speaker. Thereby, the user can confirm whether the vehicle 30 is moving appropriately during the automatic parking mode.
  • the automatic parking mode may be canceled by the user operating the navigation device 200 or another device.
  • position estimation device 100 and the automatic parking device 37 may be housed in separate housings or in the same housing. Further, the position estimation device 100 and the navigation device 200 may be housed in separate housings or in the same housing.
  • FIG. 4 is a diagram illustrating a hardware example of the position estimation device and the navigation device.
  • the position estimation apparatus 100 includes a processor 101, a RAM 102, a ROM (Read Only Memory) 103, an image signal interface 104, an input interface 105, and an output interface 106. These units are connected to the bus.
  • the processor 101 corresponds to the determination unit 12 of the first embodiment.
  • the image signal interface 104 corresponds to the image acquisition unit 11 of the first embodiment.
  • the processor 101 is a controller including an arithmetic circuit that executes program instructions.
  • the processor 101 may be called a CPU or an ECU (Electronic Control Unit).
  • the processor 101 loads at least a part of the program and data stored in the ROM 103 into the RAM 102 and executes the program.
  • the RAM 102 is a volatile semiconductor memory that temporarily stores programs executed by the processor 101 and data used by the processor 101 for operations.
  • the position estimation apparatus 100 may include a type of memory other than the RAM, or may include a plurality of memories.
  • the ROM 103 is a non-volatile storage device that stores programs and data.
  • the program includes a position estimation program.
  • The ROM 103 may be a rewritable non-volatile storage device such as a flash memory.
  • the position estimation device 100 may include other types of storage devices, and may include a plurality of nonvolatile storage devices.
  • the image signal interface 104 is connected to the imaging devices 31 to 34, and acquires the image data generated by the imaging devices 31 to 34.
  • the input interface 105 is connected to the odometer 35, the GPS measurement unit 36, and the like, and acquires information on the measured mileage and the calculated current position.
  • the output interface 106 is connected to the automatic parking device 37, the navigation device 200, and the like, and outputs information on the estimated movement locus and the current position.
  • the navigation device 200 includes a processor 201, a RAM 202, a flash memory 203, a display 204, an input device 205, and a media reader 206. These units are connected to the bus.
  • the processor 201 is a controller including an arithmetic circuit that executes program instructions.
  • the processor 201 loads at least a part of the program and data stored in the flash memory 203 into the RAM 202 and executes the program.
  • the RAM 202 is a volatile semiconductor memory that temporarily stores programs executed by the processor 201 and data used by the processor 201 for calculation.
  • the flash memory 203 is a non-volatile storage device that stores programs and data.
  • the navigation device 200 may include other types of storage devices such as an HDD (Hard Disk Drive).
  • Display 204 displays an image in accordance with a command from processor 201.
  • As the display 204, various types of displays can be used, such as a liquid crystal display (LCD: Liquid Crystal Display) or an organic EL (OEL: Organic Electro-Luminescence) display.
  • the input device 205 receives a user operation and outputs an input signal to the processor 201.
  • As the input device 205, various types of input devices can be used, such as a touch panel, a keypad, or a trackball.
  • the media reader 206 is a reading device that reads programs and data recorded on the recording medium 207.
  • As the recording medium 207, various types of recording media can be used, such as a magnetic disk (for example, a flexible disk (FD) or an HDD), an optical disk (for example, a CD (Compact Disk) or a DVD (Digital Versatile Disk)), a magneto-optical disk (MO), or a semiconductor memory.
  • the medium reader 206 stores the read program and data in the RAM 202 or the flash memory 203.
  • FIG. 5 is a diagram illustrating an example of an imaging apparatus and a vehicle coordinate system.
  • the description will be made assuming that the vehicle 30 is placed horizontally with respect to the horizontal ground in the earth coordinate system.
  • the position estimation device 100 defines a coordinate system (camera coordinate system) unique to each imaging device for each of the imaging devices 31 to 34.
  • the camera coordinate system is a logical coordinate system for image analysis and is different from the earth coordinate system.
  • coordinate axes C 1X , C 1Y , and C 1Z are defined with the position of the imaging device 31 as the origin.
  • the positive direction of C 1X is the horizontal right direction when viewed from the imaging device 31.
  • the positive direction of C 1Y is a vertically downward direction when viewed from the imaging device 31.
  • the positive direction of C 1Z is the front direction (imaging direction) when viewed from the imaging device 31.
  • Similarly, coordinate axes C 2X , C 2Y , and C 2Z are defined with the position of the imaging device 32 as the origin.
  • coordinate axes C 3X , C 3Y , and C 3Z with the position of the imaging device 33 as the origin are defined.
  • coordinate axes C 4X , C 4Y , and C 4Z are defined with the position of the imaging device 34 as the origin.
  • the XY plane of each of the imaging devices 31 to 34 is a plane parallel to the lens of the imaging device, and the XZ plane is a horizontal plane. Therefore, the images picked up by the image pickup devices 31 to 34 represent the XY plane of the camera coordinate system.
  • C 2Z is parallel to the C 1Z rotated 180 ° in the XZ plane.
  • C 3Z is parallel to C 1Z rotated 90 ° to the left in the XZ plane.
  • C 4Z is parallel to the C 1Z rotated 90 ° to the right in the XZ plane.
  • C 2X is parallel to the C 1X rotated 180 ° in the XZ plane.
  • C 3X is parallel to the C 1X rotated 90 ° to the left in the XZ plane.
  • C 4X is parallel to the C 1X rotated 90 ° to the right in the XZ plane.
  • C 1Y , C 2Y , C 3Y and C 4Y are parallel.
  • the position estimation device 100 defines one coordinate system (vehicle coordinate system) for the vehicle 30 in addition to the camera coordinate system when calculating the movement trajectory of the vehicle 30.
  • vehicle coordinate system is also a logical coordinate system for image analysis and is different from the earth coordinate system.
  • the origin of the vehicle coordinate system is a predetermined position in the vehicle 30, and is set to the center of gravity of the vehicle 30, for example.
  • In the vehicle coordinate system, coordinate axes V X , V Y , and V Z are defined.
  • The positive direction of V X is the horizontal right direction with respect to the front direction of the vehicle 30.
  • The positive direction of V Y is the vertically downward direction.
  • The positive direction of V Z is the front direction of the vehicle 30. That is, V X is parallel to C 1X , V Y is parallel to C 1Y , and V Z is parallel to C 1Z .
  • the coordinates of the camera coordinate system of the imaging devices 31 to 34 are converted to the coordinates of the vehicle coordinate system of the vehicle 30.
  • the coordinate transformation from the former to the latter may be implemented using a transformation matrix or the like.
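  • As a rough sketch of such a conversion (hypothetical function and parameter names; it assumes the camera's pitch and roll are 0°, as in the arrangement described later, so only the yaw angle and the mounting position of each imaging device are needed):

```python
import numpy as np

def camera_to_vehicle(points_cam, yaw_deg, mount_xyz):
    """Convert points from a camera coordinate system into the vehicle
    coordinate system, assuming the camera's pitch and roll are 0 degrees.

    points_cam : (N, 3) array of (X, Y, Z) in the camera coordinate system
    yaw_deg    : imaging direction relative to the vehicle front (+ = right)
    mount_xyz  : camera position expressed in the vehicle coordinate system
    """
    yaw = np.radians(yaw_deg)
    # Rotation about the (downward-pointing) Y axis by the yaw angle.
    rot = np.array([[ np.cos(yaw), 0.0, np.sin(yaw)],
                    [ 0.0,         1.0, 0.0        ],
                    [-np.sin(yaw), 0.0, np.cos(yaw)]])
    return points_cam @ rot.T + np.asarray(mount_xyz)

# Example: a point 1 m straight ahead of a left-side camera (yaw -90 deg)
# mounted at (-1.0, 0.0, 0.8) maps to about (-2.0, 0.0, 0.8) in vehicle
# coordinates.
print(camera_to_vehicle(np.array([[0.0, 0.0, 1.0]]), -90.0, (-1.0, 0.0, 0.8)))
```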
  • FIG. 6 is a diagram illustrating an example of images of four imaging devices.
  • the imaging devices 31 to 34 continuously capture images and output a sequence of image data indicating a sequence of images with different imaging times. For example, the imaging devices 31 to 34 capture images at a 1/30 second period. Images captured by the imaging devices 31 to 34 may be color images or monochrome images.
  • The imaging device 31 captures the image 41 at time "9:30:00". Thereafter, the imaging device 31 continues capturing images at a period of 1/30 second, and captures the image 42 at time "9:30:05". Since the vehicle 30 is moving in the depth direction of the image, in the sequence of images captured by the imaging device 31, objects near the vehicle 30, such as other parked vehicles and the white lines indicating parking spaces, appear to move toward the periphery of the image. On the other hand, buildings far away from the vehicle 30 show little change in position and size.
  • The imaging device 32 captures the image 43 at time "9:30:00". Thereafter, the imaging device 32 continues capturing images at a period of 1/30 second, and captures the image 44 at time "9:30:05". Since the vehicle 30 travels in the direction opposite to the depth direction of the image, in the sequence of images captured by the imaging device 32, objects near the vehicle 30, such as other vehicles and the white lines indicating parking spaces, appear to move toward the center of the image. On the other hand, buildings far away from the vehicle 30 show little change in position and size.
  • The imaging device 33 captures the image 45 at time "9:30:00". Thereafter, the imaging device 33 continues capturing images at a period of 1/30 second, and captures the image 46 at time "9:30:05". Since the vehicle 30 is moving toward the right of the image, in the sequence of images captured by the imaging device 33, objects near the vehicle 30, such as other vehicles and the white lines indicating parking spaces, appear to move from the right to the left of the image. Buildings far away from the vehicle 30 also appear to move from right to left.
  • The imaging device 34 captures the image 47 at time "9:30:00". Thereafter, the imaging device 34 continues capturing images at a period of 1/30 second, and captures the image 48 at time "9:30:05". Since the vehicle 30 is moving toward the left of the image, in the sequence of images captured by the imaging device 34, objects near the vehicle 30, such as other vehicles and the white lines indicating parking spaces, appear to move from the left to the right of the image. Buildings far away from the vehicle 30 also appear to move from left to right.
  • FIG. 7 is a diagram illustrating an example of target point extraction in SLAM processing.
  • the position estimation device 100 performs SLAM processing separately for each of the imaging devices 31 to 34.
  • In each SLAM process, trajectory data indicating an estimate of the relative movement path from a reference position and point cloud data indicating an estimate of the position, in three-dimensional space, of each target point appearing in the images are generated. That is, the position estimation device 100 generates trajectory data and point cloud data as estimation results from the sequence of image data output from the imaging device 31.
  • the position estimation device 100 generates trajectory data and point cloud data, which are estimation results, from the sequence of image data output by the imaging device 32.
  • the position estimation device 100 generates trajectory data and point cloud data as estimation results from the sequence of image data output by the imaging device 33.
  • the position estimation device 100 generates trajectory data and point cloud data as estimation results from the sequence of image data output by the imaging device 34.
  • the image 51 is an image captured by the imaging device 33.
  • the image 52 is an image captured by the imaging device 33 next to the image 51 (for example, after 1/30 second).
  • the position estimation device 100 analyzes the image 51 and extracts a target point from the image 51.
  • A target point is, for example, a pixel of the image 51 whose color gradient (the amount of change in value from adjacent pixels) is equal to or greater than a threshold.
  • As target points, pixels whose color differs from that of adjacent pixels are extracted, such as the outline of a building, a window frame of a building, or a white line drawn on the ground.
  • the position estimation apparatus 100 generates an analysis image 53 indicating the target point extracted from the image 51.
  • the position estimation apparatus 100 analyzes the image 52 and extracts target points from the image 52.
  • the position estimation apparatus 100 generates an analysis image 54 indicating the target point extracted from the image 52.
  • The position estimation device 100 compares the analysis image 53 and the analysis image 54 and determines the correspondence between the target points included in the analysis image 53 and those included in the analysis image 54. For example, for each target point of the analysis image 54, the position estimation device 100 searches for a target point of the analysis image 53 whose distance from it is equal to or less than a threshold when the two analysis images are superimposed, and thereby finds the pair of target points that indicate substantially the same object. Then, the position estimation device 100 determines the change in position of each target point between the analysis image 53 and the analysis image 54. From these position changes, the moving direction and moving distance of the imaging device 33 can be estimated. Further, the position in three-dimensional space of each target point extracted from the images 51 and 52 can also be estimated.
  • For example, when the target points are moving from the periphery of the image toward the center, it can be estimated that the imaging device 33 is moving in the negative direction of C 3Z .
  • Conversely, when the target points are moving from the center of the image toward the periphery, it can be estimated that the imaging device 33 is moving in the positive direction of C 3Z .
  • When the target points are moving from the left to the right of the image, it can be estimated that the imaging device 33 is moving in the negative direction of C 3X .
  • When the target points are moving from the right to the left of the image, it can be estimated that the imaging device 33 is moving in the positive direction of C 3X .
  • It can also be estimated that a target point with a large amount of movement exists near the imaging device 33, and that a target point with a small amount of movement exists far from the imaging device 33.
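  • A minimal sketch of this kind of target-point extraction and frame-to-frame matching is shown below (pure NumPy; the gradient threshold, matching radius, and function names are hypothetical and not taken from the patent):

```python
import numpy as np

def extract_target_points(gray, grad_threshold=30.0):
    """Extract target points: pixels whose gradient (amount of change in
    value from adjacent pixels) is at or above a threshold.  `gray` is a
    2-D array of pixel values."""
    gy, gx = np.gradient(gray.astype(float))
    magnitude = np.hypot(gx, gy)
    ys, xs = np.nonzero(magnitude >= grad_threshold)
    return np.column_stack([xs, ys])                 # (N, 2) pixel coordinates

def match_target_points(pts_prev, pts_curr, max_dist=5.0):
    """For each target point in the current image, find the nearest target
    point in the previous image within max_dist pixels, and return the
    image-plane displacements of the matched pairs."""
    displacements = []
    for p in pts_curr:
        d = np.linalg.norm(pts_prev - p, axis=1)
        i = int(np.argmin(d))
        if d[i] <= max_dist:
            displacements.append(p - pts_prev[i])
    return np.array(displacements)

# The dominant displacement direction hints at the camera motion: for example,
# target points flowing from right to left suggest motion in the camera's
# positive X direction, as described above for the imaging device 33.
```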
  • FIG. 8 is a diagram illustrating an example of SLAM results for four imaging devices.
  • the position estimation device 100 calculates the movement locus 61 and the point group 65 using the image data output from the imaging device 31.
  • the position estimation device 100 calculates the movement locus 62 and the point group 66 using the image data output from the imaging device 32.
  • the position estimation device 100 calculates the movement locus 63 and the point group 67 using the image data output from the imaging device 33.
  • The position estimation device 100 calculates the movement locus 64 and the point group 68 using the image data output from the imaging device 34.
  • the movement trajectories 61 to 64 are calculated independently of each other.
  • the point groups 65 to 68 are calculated independently of each other.
  • the scale of the movement trajectory calculated from the image of each imaging device depends on the camera coordinate system of the imaging device, and the scales of the movement trajectories 61 to 64 are not unified. Further, the scale of the point group calculated from the image of each imaging device also depends on the camera coordinate system of the imaging device, and the scales of the point groups 65 to 68 are not unified.
  • some of the imaging devices 31 to 34 may take images (images unsuitable for SLAM processing) for which it is difficult to accurately extract the target points. For example, an image in which a plane such as a ground or a wall without a pattern occupies a large area is unsuitable for SLAM processing. In addition, an image in which halation has occurred due to strong light such as sunlight hitting the imaging device is not suitable for SLAM processing. Further, when the shadow of the vehicle 30 appears in the image and the shadow moves as the vehicle 30 moves, the image is not suitable for SLAM processing. Further, when another vehicle in the image is moving, the image is not suitable for SLAM processing.
  • From such an unsuitable image, a low-accuracy movement trajectory or point cloud may be calculated.
  • That is, a movement trajectory that is significantly different from the actual movement route of the vehicle 30, or a point group that is significantly different from the arrangement of the targets around the vehicle 30, may be calculated.
  • In the following, assume that the images of the imaging device 32 (for example, the images 43 and 44 in FIG. 6) are images from which it is difficult to extract target points because of the influence of the ground and distant buildings.
  • SLAM processing is performed independently of each other on the imaging devices 31 to 34 whose imaging directions are shifted by 90 °.
  • In contrast, the images of the imaging devices 31, 33, and 34 are not images from which it is difficult to extract target points, so the decrease in the estimation accuracy of the movement trajectory 62 does not affect the movement trajectories 61, 63, and 64.
  • the movement trajectories 61, 63, and 64 are movement trajectories calculated from different images, and thus do not completely match.
  • FIG. 9 is a diagram illustrating a synthesis example of SLAM results.
  • the position estimation apparatus 100 calculates the movement trajectories 61 to 64 as described above.
  • The movement trajectories 61 to 64 each represent the movement path from the reference position, here for a predetermined time (for example, the 5 seconds from time "9:30:00" to time "9:30:05").
  • the movement trajectories 61 to 64 are expressed in different camera coordinate systems. Therefore, the position estimation apparatus 100 converts the coordinate system of the movement trajectories 61 to 64 from the camera coordinate system to the vehicle coordinate system. As a result, the start points of the movement trajectories 61 to 64 move to the origin of the vehicle coordinate system. Further, the coordinate systems of the movement trajectories 61 to 64 are all unified to a coordinate system defined by V X , V Y , and V Z.
  • the position estimation apparatus 100 converts the movement trajectories 61 to 64 into the movement trajectories 71 to 74 so that the lengths are the same in the vehicle coordinate system. For example, the position estimation apparatus 100 determines a uniform length, and performs similar transformation on each of the movement trajectories 61 to 64 so that the uniform length is obtained.
  • the unified length may be any one of the movement trajectories 61 to 64 (for example, the length of the longest movement trajectory), or may be different from any of the movement trajectories 61 to 64.
  • the position estimation apparatus 100 searches for an abnormal movement trajectory that is significantly different from other movement trajectories, and when an abnormal movement trajectory is found, excludes the abnormal movement trajectory from the subsequent processing.
  • For example, the position estimation device 100 calculates an average movement trajectory obtained by averaging the movement trajectories 71 to 74, and calculates the degree of deviation (the "distance" between movement trajectories) between the average movement trajectory and each of the movement trajectories 71 to 74.
  • the position estimation apparatus 100 determines that a movement locus whose degree of deviation from the average movement locus is equal to or greater than a threshold is an abnormal movement locus.
  • Alternatively, the position estimation device 100 calculates the degree of deviation for each pair of movement trajectories, and determines a movement trajectory whose degree of deviation from every other movement trajectory is equal to or greater than the threshold to be an abnormal movement trajectory.
  • Here, it is assumed that the movement trajectory 72 is determined to be an abnormal movement trajectory.
  • the position estimation apparatus 100 synthesizes movement trajectories other than the abnormal movement trajectory to calculate a movement trajectory 70 that is a composite movement trajectory.
  • the movement locus 70 is regarded as a correct movement locus.
  • the position estimation apparatus 100 sets the average of the movement trajectories 71, 73, and 74 as the combined movement trajectory.
  • the position estimation apparatus 100 may assign weights to the movement trajectories 71, 73, and 74 and use a weighted average of the movement trajectories 71, 73, and 74 as a combined movement trajectory. The weight may be determined based on the shape of each movement locus.
  • In that case, a small weight may be given to an unnatural movement trajectory such as a meandering one.
  • Instead of excluding the abnormal movement trajectory, the weight of the abnormal movement trajectory may be reduced and a weighted average of the movement trajectories 71 to 74 may be calculated.
  • the position estimation apparatus 100 converts the coordinate system of the movement locus 70 from the vehicle coordinate system to the earth coordinate system.
  • the length of the movement locus 70 may be different from the actual movement distance of the vehicle 30 in the earth coordinate system. Therefore, the position estimation apparatus 100 performs scaling of the movement locus 70 and adjusts the scale of the movement locus 70.
  • As a first method of scaling the movement trajectory 70, a method using the travel distance of the vehicle 30 measured by the odometer 35 is conceivable. Based on the information output from the odometer 35, the position estimation device 100 obtains the travel distance of the vehicle 30 from the time at the reference position to the present (for example, the 5 seconds from time "9:30:00" to time "9:30:05"). The position estimation device 100 then performs a similarity transformation of the movement trajectory 70 so that its length matches this travel distance. When the actual travel distance is used, the position estimation device 100 may instead make the lengths of the movement trajectories 71 to 74 match the travel distance when converting the movement trajectories 61 to 64 into the movement trajectories 71 to 74 in FIG. 9. In that case, scaling of the movement trajectory 70 is unnecessary.
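  • A minimal sketch of this odometer-based scaling, assuming the combined movement trajectory is an (N, 3) array of positions in the vehicle coordinate system (the names are hypothetical):

```python
import numpy as np

def scale_to_odometer(trajectory, odometer_distance):
    """Similarity-transform a movement trajectory about its start point so
    that its total path length matches the travel distance measured by the
    odometer."""
    start = trajectory[0]
    # Current path length: sum of position changes between adjacent times.
    length = np.linalg.norm(np.diff(trajectory, axis=0), axis=1).sum()
    return start + (trajectory - start) * (odometer_distance / length)
```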
  • As another method, the point groups can be collated with map data. The position estimation device 100 selects, from the point groups 65 to 68 calculated by the SLAM processing, the point groups 65, 67, and 68, excluding the point group 66 that corresponds to the abnormal movement trajectory.
  • The position estimation device 100 then collates the point groups 65, 67, and 68 with the map.
  • Specifically, the position estimation device 100 changes the scale of each of the point groups 65, 67, and 68 and calculates the magnification at which the target points and the lines drawn on the map overlap most.
  • The position estimation device 100 obtains the desired length of the movement trajectory 70 by applying the magnifications calculated for the point groups 65, 67, and 68 to the movement trajectories 61, 63, and 64.
  • FIG. 10 is a diagram illustrating an example of the scale adjustment of the movement trajectory using the point group.
  • the position estimation apparatus 100 selects point groups 65, 67, and 68 excluding the point group 66 corresponding to the movement locus 62 that is an abnormal movement locus among the point groups 65 to 68 calculated by the SLAM process.
  • The position estimation device 100 performs a similarity transformation of the point group 65 into the point group 75, centered on the origin of the camera coordinate system of the imaging device 31, at the magnification used when the movement trajectory 61 was converted into the movement trajectory 71.
  • The position estimation device 100 performs a similarity transformation of the point group 67 into the point group 77, centered on the origin of the camera coordinate system of the imaging device 33, at the magnification used when the movement trajectory 63 was converted into the movement trajectory 73.
  • The position estimation device 100 performs a similarity transformation of the point group 68 into the point group 78, centered on the origin of the camera coordinate system of the imaging device 34, at the magnification used when the movement trajectory 64 was converted into the movement trajectory 74.
  • the position estimation apparatus 100 scales the point groups 75, 77, 78 with a common magnification.
  • the scaling of the point group 75 is performed with the origin of the camera coordinate system of the imaging device 31 as the center.
  • the scaling of the point group 77 is performed with the origin of the camera coordinate system of the imaging device 33 as the center.
  • the scaling of the point group 78 is performed with the origin of the camera coordinate system of the imaging device 34 as the center. Since the origins of the coordinate systems are different from each other, the overlapping degree of the target points between the point groups 75, 77, and 78 changes when the magnification is changed.
  • the position estimation apparatus 100 calculates a magnification that maximizes the degree of overlap. For example, for each target point, a probability distribution that extends in a certain range around the target point is defined. The position estimation apparatus 100 adjusts the magnification so that the sum of the overlapping amounts of the probability distributions is maximized. In addition, for example, the position estimation apparatus 100 counts other target points that are within a predetermined range from the target point for each target point, and adjusts the magnification so that the total of the counts is maximized.
  • Here, it is assumed that the position estimation device 100 converts the point group 75 into the point group 85 at the common magnification, converts the point group 77 into the point group 87 at the common magnification, and converts the point group 78 into the point group 88 at the common magnification.
  • the position estimation apparatus 100 obtains the desired length of the movement track 70 by applying the common magnification to the movement tracks 71, 73, and 74. As a result, a movement locus 80 that matches the scale of the earth coordinate system is calculated.
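  • The magnification search can be sketched as follows, using the simple neighbor-counting variant mentioned above: each candidate common magnification scales every point group (here assumed to be expressed in the vehicle coordinate system) about its own camera origin, and the candidate that yields the most cross-group neighbor pairs is kept. The candidate range, radius, and names are hypothetical.

```python
import numpy as np

def overlap_score(groups, radius=0.5):
    """Count how many points in each group have a point of another group
    within `radius` (a simple measure of how well the groups overlap)."""
    score = 0
    for i, gi in enumerate(groups):
        for j, gj in enumerate(groups):
            if i == j:
                continue
            for p in gi:
                if np.min(np.linalg.norm(gj - p, axis=1)) <= radius:
                    score += 1
    return score

def find_common_magnification(groups, origins,
                              candidates=np.linspace(0.5, 2.0, 31)):
    """Search for the common magnification at which the point groups
    overlap most; each group is scaled about its own camera origin."""
    best_mag, best_score = None, -1
    for mag in candidates:
        scaled = [o + (g - o) * mag for g, o in zip(groups, origins)]
        s = overlap_score(scaled)
        if s > best_score:
            best_mag, best_score = mag, s
    return best_mag
```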
  • FIG. 11 is a block diagram illustrating an example of functions of the position estimation apparatus.
  • The position estimation device 100 includes a SLAM result storage unit 111, a parameter storage unit 112, a map data storage unit 113, SLAM processing units 121 to 124, a trajectory comparison unit 125, a point group comparison unit 126, a position determination unit 127, a travel distance acquisition unit 128, and a GPS information acquisition unit 129.
  • the SLAM result storage unit 111, the parameter storage unit 112, and the map data storage unit 113 can be mounted using the storage area of the RAM 102 or the ROM 103.
  • the SLAM processing units 121 to 124, the trajectory comparison unit 125, the point group comparison unit 126, the position determination unit 127, the travel distance acquisition unit 128, and the GPS information acquisition unit 129 can be implemented using program modules.
  • the SLAM result storage unit 111 stores trajectory data and point cloud data generated by the SLAM processing units 121-124.
  • the SLAM result storage unit 111 stores intermediate data generated in the course of processing by the trajectory comparison unit 125 and the point group comparison unit 126.
  • the intermediate data includes data converted from the initial trajectory data and point cloud data.
  • the parameter storage unit 112 stores parameters indicating the camera coordinate system of the imaging devices 31 to 34 and the vehicle coordinate system of the vehicle 30.
  • the parameter is used for converting the coordinate system of the trajectory data and the point cloud data.
  • the parameters are defined in advance according to the arrangement of the imaging devices 31 to 34.
  • the map data storage unit 113 stores map data indicating roads and parking lots.
  • the map data includes, for example, the coordinates (latitude and longitude) of the earth coordinate system indicating the location of the road and the parking lot, and line data indicating the shape of the road and the parking lot.
  • the map data storage unit 113 may store only map data related to the parking lot.
  • the map data related to the parking lot preferably represents a detailed shape in the parking lot such as the arrangement of the parking space.
  • the SLAM processing units 121 to 124 generate trajectory data and point cloud data by image processing.
  • the SLAM processing units 121 to 124 operate independently of each other.
  • the SLAM processing units 121 to 124 output the generated trajectory data to the trajectory comparison unit 125.
  • the SLAM processing units 121 to 124 output the generated point group data to the point group comparison unit 126.
  • the SLAM processing unit 121 acquires image data from the imaging device 31, and generates trajectory data and point cloud data corresponding to the imaging device 31.
  • the SLAM processing unit 122 acquires image data from the imaging device 32 and generates trajectory data and point cloud data corresponding to the imaging device 32.
  • the SLAM processing unit 123 acquires image data from the imaging device 33 and generates trajectory data and point cloud data corresponding to the imaging device 33.
  • the SLAM processing unit 124 acquires image data from the imaging device 34 and generates trajectory data and point cloud data corresponding to the imaging device 34.
  • the trajectory comparison unit 125 uses the trajectory data generated by the SLAM processing units 121 to 124 to calculate a movement trajectory 70 (composite movement trajectory) in the vehicle coordinate system.
  • the trajectory comparison unit 125 uses the parameters stored in the parameter storage unit 112 to convert the coordinate system of the movement trajectories 61 to 64 indicated by the trajectory data of the SLAM processing units 121 to 124 from the camera coordinate system to the vehicle coordinate system.
  • the trajectory comparison unit 125 adjusts the scales of the movement trajectories 61 to 64 so that the movement trajectories 61 to 64 have the same length, and generates the movement trajectories 71 to 74.
  • the trajectory comparison unit 125 notifies the point group comparison unit 126 of the magnification applied to the movement trajectories 61 to 64.
  • the trajectory comparison unit 125 generates the movement trajectory 70 by combining the movement trajectories 71 to 74 having the same length, and notifies the position determination unit 127 of the movement trajectory 70.
  • the synthesis of the movement trajectories 71 to 74 may include detecting an abnormal movement trajectory from the movement trajectories 71 to 74 and excluding the abnormal movement trajectory. In that case, the trajectory comparison unit 125 notifies the point group comparison unit 126 of the abnormal movement trajectory. Further, the synthesis of the movement trajectories 71 to 74 may include calculating an average of all or a part of the movement trajectories 71 to 74.
  • the composition of the movement trajectories 71 to 74 may include assigning weights to the movement trajectories 71 to 74 and calculating a weighted average of the movement trajectories 71 to 74.
  • the point group comparison unit 126 uses the point group data generated by the SLAM processing units 121 to 124 to determine a magnification for adjusting the scale of the moving locus 70 generated by the locus comparison unit 125.
  • The point group comparison unit 126 applies the magnifications of the movement trajectories 61 to 64 notified from the trajectory comparison unit 125 to the point groups 65 to 68 to generate the point groups 75 to 78 corresponding to the movement trajectories 71 to 74. That is, the point group comparison unit 126 applies the magnification applied to the movement trajectory 61 to the point group 65, applies the magnification applied to the movement trajectory 62 to the point group 66, applies the magnification applied to the movement trajectory 63 to the point group 67, and applies the magnification applied to the movement trajectory 64 to the point group 68.
  • the point group comparison unit 126 scales the point groups 75 to 78 by a common magnification, and determines the magnification at which the target points overlap most between the point groups 75 to 78.
  • the point group comparison unit 126 may exclude the point group corresponding to the abnormal movement locus from the point groups 75 to 78.
  • the point group comparison unit 126 notifies the position determination unit 127 of the determined magnification.
  • the position determination unit 127 adjusts the length of the movement locus 70 notified from the locus comparison unit 125 based on the magnification notified from the point group comparison unit 126, and calculates the movement locus 80.
  • the position determination unit 127 maps the movement trajectory 80 to the earth coordinate system so that the starting point of the movement trajectory 80 becomes the reference position, and outputs an estimation result of the movement trajectory and the current position of the vehicle 30 in the earth coordinate system.
  • the current position corresponds to the end point of the movement track 80.
  • the movement trajectory and the current position can be expressed using the coordinates (latitude and longitude) of the earth coordinate system.
  • the position determination unit 127 may perform scaling of the movement locus 70 using the travel distance acquired from the travel distance acquisition unit 128 instead of using the processing result of the point group comparison unit 126. Further, the position determining unit 127 determines the magnification by collating the point cloud data generated by the SLAM processing unit 123 with the map data stored in the map data storage unit 113, and using the determined magnification, the position of the moving locus 70 is determined. Scaling may be performed. As the reference position in the earth coordinate system, the current position of the vehicle 30 last measured by the GPS information acquisition unit 129 may be used.
  • the position of the ETC (Electronic Toll Collection System) gate through which the vehicle 30 last passed may be specified from the map data, and the position of the ETC gate may be used as the reference position. Further, the current position previously estimated by the position determination unit 127 may be used as the reference position in the next estimation.
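  • As a small sketch of anchoring the estimated trajectory to a reference position (hypothetical names; it works in a local east-north plane in metres rather than latitude and longitude, and assumes the vehicle's heading at the reference time is known):

```python
import numpy as np

def map_to_reference(traj_xz, ref_east_north, ref_heading_deg):
    """Place a trajectory given in the vehicle coordinate system at the
    reference time (columns: X = right, Z = forward, in metres) into a local
    east-north plane so that its start point becomes the reference position.
    The heading is measured clockwise from north."""
    h = np.radians(ref_heading_deg)
    east  = traj_xz[:, 1] * np.sin(h) + traj_xz[:, 0] * np.cos(h)
    north = traj_xz[:, 1] * np.cos(h) - traj_xz[:, 0] * np.sin(h)
    mapped = np.column_stack([east, north]) + np.asarray(ref_east_north)
    return mapped   # the last row corresponds to the current position
```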
  • The travel distance acquisition unit 128 acquires travel distance information from the odometer 35.
  • the GPS information acquisition unit 129 acquires current position information from the GPS measurement unit 36. However, the GPS information acquisition unit 129 cannot acquire the current position information in a place where the GPS signal does not reach.
  • FIG. 12 is a diagram illustrating an example of a parameter table.
  • the parameter table 114 is stored in the parameter storage unit 112.
  • the parameter table 114 includes items of an imaging device ID, an X coordinate, a Y coordinate, a Z coordinate, a yaw (Yaw), a pitch (Pitch), and a roll (Roll).
  • the imaging device ID is identification information of the imaging devices 31 to 34 mounted on the vehicle 30.
  • the X coordinate, the Y coordinate, and the Z coordinate are coordinates in the vehicle coordinate system that indicate the locations where the imaging devices 31 to 34 are arranged.
  • the imaging device 31 exists at (0.0 m, 0.0 m, 2.5 m).
  • the imaging device 32 exists at (0.0 m, 0.0 m, ⁇ 2.5 m).
  • the imaging device 33 exists at ( ⁇ 1.0 m, 0.0 m, 0.8 m).
  • the imaging device 34 exists at (1.0 m, 0.0 m, 0.8 m).
  • the yaw is an angle in the imaging direction of the imaging devices 31 to 34 on the XZ plane, that is, an angle in the left-right direction with respect to the front of the vehicle 30.
  • the pitch is an angle in the imaging direction of the imaging devices 31 to 34 on the YZ plane, that is, an angle in the vertical direction with respect to the front of the vehicle 30.
  • the roll is an angle in the imaging direction of the imaging devices 31 to 34 on the XY plane, that is, an angle between an upward direction of an image to be captured and a vertical upward direction of the vehicle 30.
  • the imaging directions of the imaging devices 31 to 34 are parallel to the horizontal plane, and the pitch and roll are all 0 °.
  • the imaging devices 31 to 34 are set so as to be shifted by 90 ° forward, backward, left and right of the vehicle 30. Therefore, the yaw of the imaging device 31 is 0 °, the yaw of the imaging device 32 is 180 °, the yaw of the imaging device 33 is ⁇ 90 °, and the yaw of the imaging device 34 is 90 °.
  • FIG. 13 is a diagram illustrating an example of trajectory data generated by SLAM processing.
  • trajectory data 131 to 134 are stored in the SLAM result storage unit 111.
  • the trajectory data 131 is generated from the image of the imaging device 31 by the SLAM processing unit 121.
  • the trajectory data 132 is generated from the image of the imaging device 32 by the SLAM processing unit 122.
  • the trajectory data 133 is generated from the image of the imaging device 33 by the SLAM processing unit 123.
  • the trajectory data 134 is generated from the image of the imaging device 34 by the SLAM processing unit 124.
  • Each of the trajectory data 131 to 134 includes a plurality of records in which time, X coordinate, Y coordinate, Z coordinate, yaw, pitch, roll, and distance are associated with each other. However, since the pitch and roll are all 0° in the second embodiment, they are omitted from FIG. 13. In the example of FIG. 13, each of the trajectory data 131 to 134 includes six records corresponding to the times from "9:30:00" to "9:30:05". The first time, "9:30:00", is the time when the vehicle 30 was at the reference position (reference time).
  • the time of the trajectory data 131 to 134 is the time when the image is captured.
  • the X coordinate, Y coordinate, and Z coordinate of the trajectory data 131 to 134 are coordinates in the camera coordinate system, and represent the estimation of the relative position from the position of the imaging device at the reference time.
  • the yaw, pitch, and roll of the trajectory data 131 to 134 are directions in the camera coordinate system, and represent estimation of the moving direction at each time.
  • the distance of the trajectory data 131 to 134 is a distance in the camera coordinate system, and represents the cumulative moving distance from the position of the imaging device at the reference time. This distance is the sum of the changes in the relative position between adjacent times.
  • the distance at the last time “9:30:05” represents the length of the movement locus.
  • the trajectory data 132 includes a record with time "9:30:01", X coordinate "0.080 m", Y coordinate "0.000 m", Z coordinate "-0.500 m", yaw "-9.090°", and distance "0.506 m". This represents that the imaging device 32 has moved from (0.000 m, 0.000 m, 0.000 m) to (0.080 m, 0.000 m, -0.500 m) one second after the reference time. It also indicates that the traveling direction has changed by 9.090° to the left from the traveling direction at the reference time, and that the imaging device 32 has moved 0.506 m from its position at the reference time.
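  • As an illustration of how the distance column of such trajectory data could be computed as the accumulated change in relative position between adjacent times, a minimal Python sketch is shown below; the record layout used here is an assumption for the example.

      import numpy as np

      # One trajectory record per imaging time: (time, x, y, z, yaw) -- assumed layout.
      records = [
          ("9:30:00", 0.000, 0.000,  0.000,  0.000),
          ("9:30:01", 0.080, 0.000, -0.500, -9.090),
      ]

      def with_cumulative_distance(records):
          """Append the cumulative moving distance (the sum of the changes in
          relative position between adjacent times) to each record."""
          out, total, prev = [], 0.0, None
          for t, x, y, z, yaw in records:
              p = np.array([x, y, z])
              if prev is not None:
                  total += float(np.linalg.norm(p - prev))
              prev = p
              out.append((t, x, y, z, yaw, round(total, 3)))
          return out

      # with_cumulative_distance(records)[-1][-1] evaluates to roughly 0.506,
      # matching the distance of the trajectory data 132 record shown above.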
  • FIG. 14 is a diagram illustrating an example of point cloud data generated by SLAM processing.
  • point cloud data 135 to 138 are stored in the SLAM result storage unit 111.
  • the point cloud data 135 is generated from the image of the imaging device 31 by the SLAM processing unit 121.
  • the point cloud data 136 is generated from the image of the imaging device 32 by the SLAM processing unit 122.
  • the point cloud data 137 is generated from the image of the imaging device 33 by the SLAM processing unit 123.
  • the point cloud data 138 is generated from the image of the imaging device 34 by the SLAM processing unit 124.
  • Each of the point cloud data 135 to 138 includes a plurality of records in which the X coordinate, the Y coordinate, and the Z coordinate are associated with each other.
  • One record of the point cloud data 135 to 138 corresponds to one target point.
  • the X coordinate, Y coordinate, and Z coordinate of the point group data 135 to 138 are coordinates in the camera coordinate system, and represent the estimation of the position where the target point exists in the three-dimensional space.
  • the number of records included in the point cloud data 135 to 138, that is, the number of target points recognized by the SLAM processing units 121 to 124, may differ among the point cloud data.
  • the position of each target point is not classified by time because it is estimated by combining analysis results of a plurality of images having different imaging times.
  • FIG. 15 is a diagram illustrating a first conversion example of trajectory data.
  • the SLAM result storage unit 111 stores trajectory data 141 to 144.
  • the trajectory data 141 to 144 are converted from the trajectory data 131 to 134 by the trajectory comparison unit 125.
  • the trajectory data 131 is converted into trajectory data 141.
  • the trajectory data 132 is converted into trajectory data 142.
  • the trajectory data 133 is converted into trajectory data 143.
  • the trajectory data 134 is converted into trajectory data 144.
  • the trajectory data 141 is obtained by converting the coordinate system of the trajectory data 131 from the camera coordinate system of the imaging device 31 to the vehicle coordinate system using the parameters of the imaging device 31 included in the parameter table 114.
  • the trajectory data 142 is obtained by converting the coordinate system of the trajectory data 132 from the camera coordinate system of the imaging device 32 to the vehicle coordinate system using the parameters of the imaging device 32 included in the parameter table 114.
  • the trajectory data 143 is obtained by converting the coordinate system of the trajectory data 133 from the camera coordinate system of the imaging device 33 to the vehicle coordinate system using the parameters of the imaging device 33 included in the parameter table 114.
  • the trajectory data 144 is obtained by converting the coordinate system of the trajectory data 134 from the camera coordinate system of the imaging device 34 to the vehicle coordinate system using the parameters of the imaging device 34 included in the parameter table 114.
  • the X coordinate, Y coordinate, and Z coordinate of the trajectory data 141 to 144 are coordinates in the vehicle coordinate system and represent the estimation of the relative position of the vehicle 30 from the reference position.
  • the reference position is, for example, the center position of the vehicle 30 at the reference time.
  • the relative position at each time is, for example, the center position of the vehicle 30 at that time.
  • the yaw, pitch, and roll of the trajectory data 141 to 144 represent the estimation of the moving direction at each time in the vehicle coordinate system.
  • the distance of the trajectory data 141 to 144 represents the cumulative moving distance from the reference position in the vehicle coordinate system.
  • the trajectory data 142 includes a record with time "9:30:01", X coordinate "-0.475 m", Y coordinate "0.000 m", Z coordinate "0.500 m", yaw "-9.090°", and distance "0.690 m".
  • compared with the trajectory data 132, the X coordinate at time "9:30:01" has been adjusted by the conversion into the vehicle coordinate system.
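  • A minimal Python sketch of this kind of conversion (from a camera coordinate system to the vehicle coordinate system, using the mounting position and yaw from the parameter table 114) is shown below. The function camera_to_vehicle and, in particular, the lever-arm compensation it applies are a plausible formulation assumed for the example rather than the exact computation of the trajectory comparison unit 125.

      import numpy as np

      def yaw_matrix(yaw_deg):
          """Rotation about the vehicle's vertical (Y) axis."""
          a = np.radians(yaw_deg)
          return np.array([[ np.cos(a), 0, np.sin(a)],
                           [         0, 1,         0],
                           [-np.sin(a), 0, np.cos(a)]])

      def camera_to_vehicle(cam_xyz, cam_yaw_deg, mount_pos, mount_yaw_deg):
          """Convert one trajectory record from a camera coordinate system to the
          vehicle coordinate system. cam_xyz and cam_yaw_deg come from the
          trajectory data; mount_pos and mount_yaw_deg come from the parameter
          table 114 (pitch and roll are 0 in this embodiment)."""
          offset = np.asarray(mount_pos, dtype=float)
          # Camera displacement expressed in the vehicle reference frame.
          d_veh = yaw_matrix(mount_yaw_deg) @ np.asarray(cam_xyz, dtype=float)
          # Subtract the displacement of the mounting point that is caused purely
          # by the vehicle's own rotation, to obtain the vehicle-centre displacement.
          centre = d_veh - (yaw_matrix(cam_yaw_deg) @ offset - offset)
          return centre, cam_yaw_deg   # the yaw angle itself is unchanged

      # Example with the record of trajectory data 132 at time "9:30:01":
      # camera_to_vehicle((0.080, 0.0, -0.500), -9.090, (0.0, 0.0, -2.5), 180.0)
      # gives a position close to the vehicle-coordinate record of trajectory data 142.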
  • FIG. 16 is a diagram illustrating a second conversion example of trajectory data.
  • Trajectory data 145 to 148 are stored in the SLAM result storage unit 111.
  • the trajectory data 145 to 148 are converted from the trajectory data 141 to 144 by the trajectory comparison unit 125.
  • the trajectory data 141 is converted into trajectory data 145.
  • the trajectory data 142 is converted into trajectory data 146.
  • the trajectory data 143 is converted into trajectory data 147.
  • the trajectory data 144 is converted into trajectory data 148.
  • the trajectory data 145 is obtained by similarity conversion of the trajectory data 141 so that the lengths of the moving trajectories are aligned.
  • the trajectory data 145 corresponds to the movement trajectory 71.
  • the trajectory data 146 is obtained by similarity conversion of the trajectory data 142 so that the lengths of the movement trajectories are uniform.
  • the trajectory data 146 corresponds to the movement trajectory 72.
  • the trajectory data 147 is obtained by similarity conversion of the trajectory data 143 so that the lengths of the movement trajectories are uniform.
  • the trajectory data 147 corresponds to the movement trajectory 73.
  • the trajectory data 148 is obtained by similarity conversion of the trajectory data 144 so that the lengths of the moving trajectories are uniform.
  • the trajectory data 148 corresponds to the movement trajectory 74.
  • the length of the movement trajectories 71 to 74 indicated by the trajectory data 145 to 148 is unified to 5.000 m.
  • the trajectory data 146 includes a record with time "9:30:01", X coordinate "-0.525 m", Y coordinate "0.000 m", Z coordinate "0.811 m", yaw "-9.090°", and distance "0.965 m".
  • by this similarity conversion, the length of the movement trajectory is increased from 3.432 m to 5.000 m. Therefore, the distance at time "9:30:01" also increases from 0.690 m to 0.965 m.
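  • The similarity conversion used here amounts to multiplying all coordinates and distances of a movement trajectory by a single magnification so that its total length matches a target length, leaving the angles unchanged. A minimal Python sketch, with an assumed record layout, is shown below.

      def scale_trajectory(records, target_length):
          """Scale a trajectory so that its length (the cumulative distance of the
          last record) becomes target_length. records is a list of dicts with the
          keys "x", "y", "z" and "distance" (assumed layout); angles are unchanged."""
          length = records[-1]["distance"]
          k = target_length / length            # magnification of the similarity conversion
          scaled = []
          for r in records:
              s = dict(r)
              for key in ("x", "y", "z", "distance"):
                  s[key] = r[key] * k
              scaled.append(s)
          return scaled, k

      # Example: converting the trajectory data 142 (length 3.432 m) to the common
      # length 5.000 m uses the magnification k = 5.000 / 3.432.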
  • FIG. 17 is a diagram illustrating an example of synthesized trajectory data.
  • the trajectory comparison unit 125 generates trajectory data 151 from the trajectory data 145 to 148. This corresponds to calculating the movement trajectory 70 by combining the movement trajectories 71 to 74.
  • the locus comparison unit 125 determines that the movement locus 72 indicated by the locus data 146 is an abnormal movement locus.
  • the trajectory comparison unit 125 generates trajectory data 151 by averaging the trajectory data 145, 147, and 148. In this averaging, the trajectory comparison unit 125 calculates the average values of the X coordinate, Y coordinate, Z coordinate, yaw, pitch, and roll of the records having the same time, and recalculates the distance from the averaged coordinates.
  • the trajectory data 151 includes a record with time "9:30:01", X coordinate "-0.037 m", Y coordinate "0.000 m", Z coordinate "1.014 m", yaw "-2.839°", and distance "1.016 m".
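  • A minimal Python sketch of this averaging (per-time average of the records of the non-abnormal trajectories, with the distance recomputed afterwards) is shown below; the record layout and the simple arithmetic averaging of the yaw angle are assumptions for the example.

      import numpy as np

      def average_trajectories(trajectories):
          """trajectories: list of trajectory data, each a list of records
          (time, x, y, z, yaw) sampled at the same times. Returns the averaged
          trajectory with a recomputed cumulative distance."""
          merged, total, prev = [], 0.0, None
          for recs in zip(*trajectories):          # records sharing the same time
              t = recs[0][0]
              x, y, z, yaw = np.mean([r[1:] for r in recs], axis=0)
              p = np.array([x, y, z])
              if prev is not None:
                  total += float(np.linalg.norm(p - prev))
              prev = p
              merged.append((t, x, y, z, yaw, round(total, 3)))
          return merged

      # average_trajectories([trajectory_145, trajectory_147, trajectory_148])
      # corresponds to generating the trajectory data 151 (movement trajectory 70).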
  • the trajectory comparison unit 125 performs scaling on the trajectory data 151 and generates trajectory data 152. This corresponds to calculating the movement locus 80 by converting the length of the movement locus 70 into an actual distance in the earth coordinate system.
  • the length of the movement locus 80 indicated by the locus data 152 is extended to 8.000 m.
  • the trajectory data 152 includes a record with time "9:30:01", X coordinate "-0.007 m", Y coordinate "0.000 m", Z coordinate "1.615 m", yaw "-2.839°", and distance "1.615 m".
  • FIG. 18 is a flowchart illustrating an exemplary procedure for position estimation.
  • the position determination unit 127 sets a reference position. Setting of the reference position at the start of position estimation is performed using GPS or the like. After the start of position estimation, the position determination unit 127 may set the current position estimated last time as the reference position.
  • the SLAM processing unit 121 analyzes the sequence of images captured by the imaging device 31, and generates trajectory data 131 and point cloud data 135.
  • the SLAM processing unit 122 analyzes the sequence of images captured by the imaging device 32 and generates trajectory data 132 and point cloud data 136.
  • the SLAM processing unit 123 analyzes the sequence of images captured by the imaging device 33 and generates trajectory data 133 and point cloud data 137.
  • the SLAM processing unit 124 analyzes the sequence of images captured by the imaging device 34 and generates trajectory data 134 and point cloud data 138.
  • the trajectory comparison unit 125 refers to the parameter table 114 stored in the parameter storage unit 112, and converts the trajectory data 131 to 134 expressed in the camera coordinate systems into the trajectory data 141 to 144 expressed in the vehicle coordinate system.
  • the trajectory comparison unit 125 performs scaling so that the lengths of the movement trajectories 61 to 64 indicated by the trajectory data 141 to 144 are equal, and calculates the movement trajectories 71 to 74. That is, the trajectory comparison unit 125 converts the trajectory data 141 to 144 into trajectory data 145 to 148 so that the distance at the last time is the same between the trajectory data 141 to 144. When the travel distance can be acquired from the travel distance acquisition unit 128, the trajectory comparison unit 125 may match the lengths of the travel tracks 71 to 74 with the travel distance at this time.
  • the point group comparison unit 126 refers to the parameter table 114 stored in the parameter storage unit 112, and rotates the coordinates of the point group data 135 to 138 expressed in the camera coordinate systems about the origin of each camera coordinate system so that their orientations are aligned with the vehicle coordinate system.
  • the point group comparison unit 126 performs scaling by applying the same magnification as the movement trajectories 61 to 64 to the point groups 65 to 68 indicated by the point group data 135 to 138. Scaling is performed around the position of the imaging devices 31 to 34 (the origin of the camera coordinate system).
  • the point group comparison unit 126 applies the magnification obtained when the movement locus 61 is converted into the movement locus 71 to the point group 65. Further, the point group comparison unit 126 applies the magnification obtained when the movement locus 62 is converted into the movement locus 72 to the point group 66.
  • the point group comparison unit 126 applies the magnification obtained when the movement locus 63 is converted into the movement locus 73 to the point group 67.
  • the point group comparison unit 126 applies the magnification obtained when the movement locus 64 is converted to the movement locus 74 to the point group 68.
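  • A minimal Python sketch of the rotation and scaling of a point group described here (about the origin of its camera coordinate system, using the yaw from the parameter table 114, since pitch and roll are 0 in this embodiment) is shown below; the function name and array layout are assumptions for the example.

      import numpy as np

      def align_and_scale_point_group(points, mount_yaw_deg, magnification):
          """points: N x 3 array of target-point coordinates in one camera
          coordinate system. The point group is rotated about the origin of that
          camera coordinate system so that its axes match the vehicle coordinate
          system, and then scaled about the same origin with the magnification
          applied to the corresponding movement trajectory."""
          a = np.radians(mount_yaw_deg)
          rot_y = np.array([[ np.cos(a), 0, np.sin(a)],
                            [         0, 1,         0],
                            [-np.sin(a), 0, np.cos(a)]])
          aligned = points @ rot_y.T          # rotation about the camera origin
          return aligned * magnification      # scaling about the camera origin

      # Example: the point group 66 of the imaging device 32 would be processed as
      # align_and_scale_point_group(point_cloud_136, 180.0, k_62_to_72), where
      # k_62_to_72 is the magnification used to convert movement locus 62 into 72.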
  • the trajectory comparison unit 125 detects an abnormal movement trajectory greatly different from other movement trajectories from the movement trajectories 71 to 74 indicated by the trajectory data 145 to 148, and excludes the abnormal movement trajectory. For example, the trajectory comparison unit 125 calculates the average value of the coordinates indicated by the trajectory data 145 to 148 at the same time, and calculates the average movement trajectory. The trajectory comparison unit 125 calculates the square of the deviation between the coordinates indicated by the trajectory data 145 and the coordinates of the average moving trajectory at the same time, and defines the sum of squared deviations as the degree of divergence between the moving trajectory 71 and the average moving trajectory.
  • the trajectory comparison unit 125 calculates the divergence degree for the movement trajectories 72 to 74 as well. Then, the trajectory comparison unit 125 determines a movement trajectory having a deviation degree equal to or greater than a threshold among the movement trajectories 71 to 74 as an abnormal movement trajectory.
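  • A minimal Python sketch of this divergence calculation (sum of squared deviations from the per-time average movement trajectory, followed by a threshold test) is shown below; the array layout and the threshold value are assumptions for the example.

      import numpy as np

      def detect_abnormal_trajectories(trajectories, threshold):
          """trajectories: list of N x 3 arrays holding the coordinates of each
          movement trajectory at the same times. Returns the indices of the
          trajectories whose sum of squared deviations from the per-time average
          trajectory is equal to or greater than the threshold."""
          stack = np.stack(trajectories)           # shape: (num_trajectories, num_times, 3)
          mean_traj = stack.mean(axis=0)           # average movement trajectory
          divergence = ((stack - mean_traj) ** 2).sum(axis=(1, 2))
          return [i for i, d in enumerate(divergence) if d >= threshold]

      # With the coordinates of the movement trajectories 71 to 74 and a suitable
      # threshold, the index of the movement trajectory 72 would be returned here.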
  • the trajectory comparison unit 125 generates the trajectory data 151 by synthesizing the remaining trajectory data other than the trajectory data indicating the abnormal movement trajectory from the trajectory data 145 to 148. For example, the trajectory comparison unit 125 calculates the average value of the coordinates indicated by the remaining trajectory data at the same time. This represents that the movement trajectory 70 is calculated by averaging the movement trajectories other than the abnormal movement trajectory.
  • from the point group data indicating the point groups scaled in step S6, the point group comparison unit 126 selects the remaining point group data, excluding the point group data corresponding to the abnormal movement trajectory.
  • the point group comparison unit 126 scales the point group indicated by the selected point group data with the same magnification around the origin of each camera coordinate system, and searches for a common magnification that maximizes the overlap of the target points.
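  • The search described here can be sketched as trying candidate magnifications, scaling the selected point groups, and counting how many target points from different point groups fall close to one another. In the sketch below, the point groups are assumed to have already been brought into a common coordinate frame by the preceding rotation, and the candidate range and distance tolerance are assumptions for the example.

      import numpy as np

      def count_overlap(point_groups, tolerance):
          """Count the target points of each group that have a neighbour from
          another group within the given tolerance."""
          overlap = 0
          for i, pts in enumerate(point_groups):
              others = np.vstack([p for j, p in enumerate(point_groups) if j != i])
              # nearest-neighbour distance from each point to the other groups
              d = np.linalg.norm(pts[:, None, :] - others[None, :, :], axis=2).min(axis=1)
              overlap += int(np.sum(d < tolerance))
          return overlap

      def search_common_magnification(point_groups, candidates, tolerance=0.1):
          """Apply each candidate magnification to every point group and return the
          magnification that maximises the overlap of the target points."""
          best_k, best_overlap = None, -1
          for k in candidates:
              scaled = [pts * k for pts in point_groups]
              ov = count_overlap(scaled, tolerance)
              if ov > best_overlap:
                  best_k, best_overlap = k, ov
          return best_k

      # e.g. search_common_magnification([pc_135, pc_137, pc_138], np.linspace(0.5, 3.0, 26))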
  • the position determination unit 127 generates trajectory data 152 indicating the movement trajectory 80 by applying the found common magnification to the trajectory data 151 generated in step S8.
  • the length of the movement locus 80 is an estimated value of the actual movement distance in the earth coordinate system.
  • based on the reference position set in step S1 and the trajectory data 152 generated in step S9, the position determination unit 127 maps the movement locus 80 into the map space (latitude-longitude space) of the earth coordinate system.
  • the position indicated by the end point of the movement locus 80 represents the current position of the vehicle 30.
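  • One simple way to perform such a mapping, assuming that the reference position is given as latitude and longitude and that the trajectory offsets have already been resolved into east and north components using the vehicle's heading at the reference position, is a local equirectangular approximation; the formulation below is an illustrative assumption rather than the computation prescribed by the embodiment.

      import math

      EARTH_RADIUS_M = 6378137.0   # WGS84 equatorial radius

      def map_to_lat_lon(ref_lat_deg, ref_lon_deg, east_m, north_m):
          """Map a metric offset (east, north) from the reference position to
          latitude/longitude with a local equirectangular approximation, which is
          adequate for the short distances handled here."""
          dlat = math.degrees(north_m / EARTH_RADIUS_M)
          dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(ref_lat_deg))))
          return ref_lat_deg + dlat, ref_lon_deg + dlon

      # Applying this to the end point of the movement locus 80, measured from the
      # reference position set in step S1, gives the estimated current position of the vehicle 30.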
  • the automatic parking device 37 controls automatic parking based on the current position estimated by the position estimation device 100.
  • the navigation device 200 displays the current position and movement trajectory estimated by the position estimation device 100 on the display 204 so as to overlap the map.
  • the position estimation apparatus 100 determines whether or not to continue position estimation. For example, the position estimation is continued while the automatic parking device 37 is in the automatic parking mode, and the position estimation is terminated when the automatic parking mode is canceled. When the position estimation is continued, the process proceeds to step S1. When the position estimation is not continued, the process of the position estimation device 100 ends.
  • FIG. 19 is a diagram illustrating an example of a navigation screen.
  • a navigation screen 90 is displayed on the display 204 of the navigation device 200.
  • Map data indicating a map of the parking lot is stored in the flash memory 203 of the navigation device 200.
  • the map of the parking lot shows the detailed shape of the parking lot such as the arrangement of parking spaces.
  • the estimated current position of the vehicle 30 is displayed on the navigation screen 90 so as to overlap the map of the parking lot.
  • the navigation screen 90 displays the movement trajectory of the vehicle 30 from at least a predetermined time before to the present. Thereby, the user can confirm that automatic driving is being performed properly.
  • images of the imaging devices 31 to 34 having different imaging directions are acquired, and movement trajectories are calculated for each of the imaging devices 31 to 34 by SLAM processing.
  • An abnormal movement track is detected by comparing the four movement tracks, and the remaining movement tracks excluding the abnormal movement track are synthesized.
  • the imaging directions of the imaging devices 31 to 34 can be greatly shifted from each other as compared with a method of calculating one movement locus from a stereo image captured by a stereo camera. Therefore, it is possible to reduce a risk that all images of the imaging devices 31 to 34 are unsuitable for SLAM processing, and the accuracy of position estimation using SLAM is improved.
  • the information processing according to the first embodiment can be realized by causing the position estimation apparatus 10 to execute a program.
  • the information processing of the second embodiment can be realized by causing the position estimation device 100 and the navigation device 200 to execute a program.
  • the program can be recorded on a computer-readable recording medium (for example, the recording medium 207).
  • as the recording medium, for example, a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like can be used.
  • Magnetic disks include FD and HDD.
  • Optical disks include CD, CD-R (Recordable) / RW (Rewritable), DVD, and DVD-R / RW.
  • the program may be recorded and distributed on a portable recording medium. In that case, the program may be copied from a portable recording medium to another recording medium (for example, the flash memory 203) and executed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention aims to improve the accuracy of estimating the position of a moving object from an image. An image acquisition unit (11) acquires a sequence of image data (23) from an imaging device (21) mounted on a moving object (20), and acquires a sequence of image data (24) from an imaging device (22) mounted on the moving object (20), the imaging direction of the imaging device (22) being different from that of the imaging device (21). A determination unit (12) calculates, from the image data sequence (23), a movement trajectory (13) indicating an estimate of the path along which the moving object (20) has moved, and calculates, from the image data sequence (24), a movement trajectory (14) indicating an estimate of the path along which the moving object (20) has moved. The determination unit (12) determines the position (15) of the moving object (20) using the movement trajectory (13) and the movement trajectory (14).
PCT/JP2016/052758 2016-01-29 2016-01-29 Dispositif d'estimation de position, procédé d'estimation de position, et programme d'estimation de position WO2017130397A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/052758 WO2017130397A1 (fr) 2016-01-29 2016-01-29 Dispositif d'estimation de position, procédé d'estimation de position, et programme d'estimation de position

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/052758 WO2017130397A1 (fr) 2016-01-29 2016-01-29 Dispositif d'estimation de position, procédé d'estimation de position, et programme d'estimation de position

Publications (1)

Publication Number Publication Date
WO2017130397A1 true WO2017130397A1 (fr) 2017-08-03

Family

ID=59397676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/052758 WO2017130397A1 (fr) 2016-01-29 2016-01-29 Dispositif d'estimation de position, procédé d'estimation de position, et programme d'estimation de position

Country Status (1)

Country Link
WO (1) WO2017130397A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107765263A (zh) * 2017-10-30 2018-03-06 武汉海达数云技术有限公司 激光扫描装置及移动测量系统
JP6430087B1 (ja) * 2018-03-23 2018-11-28 三菱電機株式会社 経路生成装置、および、車両制御システム
JP2019172219A (ja) * 2018-03-29 2019-10-10 トヨタ自動車株式会社 車両走行管理システム
JP2020068499A (ja) * 2018-10-26 2020-04-30 現代自動車株式会社Hyundai Motor Company 車両周囲画像表示システム及び車両周囲画像表示方法
CN111738047A (zh) * 2019-03-25 2020-10-02 本田技研工业株式会社 自身位置推测方法
CN111902692A (zh) * 2018-09-14 2020-11-06 松下电器(美国)知识产权公司 判定方法及判定装置
CN114435470A (zh) * 2020-11-05 2022-05-06 长沙智能驾驶研究院有限公司 自动倒车控制方法、装置、车辆和存储介质
US12030521B2 (en) 2018-03-23 2024-07-09 Mitsubishi Electric Corporation Path generation device and vehicle control system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007278871A (ja) * 2006-04-07 2007-10-25 Technical Research & Development Institute Ministry Of Defence 動き量計算装置
WO2014070334A1 (fr) * 2012-11-02 2014-05-08 Qualcomm Incorporated Utiliser une pluralité de capteurs pour la cartographie et la localisation
WO2015169338A1 (fr) * 2014-05-05 2015-11-12 Hexagon Technology Center Gmbh Système d'arpentage

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007278871A (ja) * 2006-04-07 2007-10-25 Technical Research & Development Institute Ministry Of Defence 動き量計算装置
WO2014070334A1 (fr) * 2012-11-02 2014-05-08 Qualcomm Incorporated Utiliser une pluralité de capteurs pour la cartographie et la localisation
WO2015169338A1 (fr) * 2014-05-05 2015-11-12 Hexagon Technology Center Gmbh Système d'arpentage

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019085376A1 (fr) * 2017-10-30 2019-05-09 武汉海达数云技术有限公司 Dispositif de balayage laser et son procédé de commande, et système de mesure mobile et son procédé de commande
CN107765263A (zh) * 2017-10-30 2018-03-06 武汉海达数云技术有限公司 激光扫描装置及移动测量系统
JP6430087B1 (ja) * 2018-03-23 2018-11-28 三菱電機株式会社 経路生成装置、および、車両制御システム
WO2019180919A1 (fr) * 2018-03-23 2019-09-26 三菱電機株式会社 Dispositif de génération d'itinéraire et système de commande de véhicule
CN111868801A (zh) * 2018-03-23 2020-10-30 三菱电机株式会社 路径生成装置及车辆控制系统
US12030521B2 (en) 2018-03-23 2024-07-09 Mitsubishi Electric Corporation Path generation device and vehicle control system
JP2019172219A (ja) * 2018-03-29 2019-10-10 トヨタ自動車株式会社 車両走行管理システム
CN111902692A (zh) * 2018-09-14 2020-11-06 松下电器(美国)知识产权公司 判定方法及判定装置
JP7426174B2 (ja) 2018-10-26 2024-02-01 現代自動車株式会社 車両周囲画像表示システム及び車両周囲画像表示方法
JP2020068499A (ja) * 2018-10-26 2020-04-30 現代自動車株式会社Hyundai Motor Company 車両周囲画像表示システム及び車両周囲画像表示方法
CN111738047A (zh) * 2019-03-25 2020-10-02 本田技研工业株式会社 自身位置推测方法
CN114435470B (zh) * 2020-11-05 2022-11-25 长沙智能驾驶研究院有限公司 自动倒车控制方法、装置、车辆和存储介质
CN114435470A (zh) * 2020-11-05 2022-05-06 长沙智能驾驶研究院有限公司 自动倒车控制方法、装置、车辆和存储介质

Similar Documents

Publication Publication Date Title
WO2017130397A1 (fr) Dispositif d'estimation de position, procédé d'estimation de position, et programme d'estimation de position
EP2948927B1 (fr) Procédé de détection de parties structurelles d'une scène
JP7147119B2 (ja) 自律的な自己位置推定のためのデバイス及び方法
TWI695181B (zh) 用於產生彩色點雲的方法和系統
JP6595182B2 (ja) マッピング、位置特定、及び姿勢補正のためのシステム及び方法
EP3650814B1 (fr) Navigation par vision augmentée
CN109443348B (zh) 一种基于环视视觉和惯导融合的地下车库库位跟踪方法
CN111862673B (zh) 基于顶视图的停车场车辆自定位及地图构建方法
CN108759823B (zh) 基于图像匹配的指定道路上低速自动驾驶车辆定位及纠偏方法
CN110675307A (zh) 基于vslam的3d稀疏点云到2d栅格图的实现方法
US20220270358A1 (en) Vehicular sensor system calibration
KR102006291B1 (ko) 전자 장치의 이동체 포즈 추정 방법
US11567497B1 (en) Systems and methods for perceiving a field around a device
JP6552448B2 (ja) 車両位置検出装置、車両位置検出方法及び車両位置検出用コンピュータプログラム
Song et al. End-to-end learning for inter-vehicle distance and relative velocity estimation in ADAS with a monocular camera
JP2019056629A (ja) 距離推定装置及び方法
CN112455502A (zh) 基于激光雷达的列车定位方法及装置
US20190293444A1 (en) Lane level accuracy using vision of roadway lights and particle filter
WO2020113425A1 (fr) Systèmes et procédés pour construire une carte de haute définition
CN112577499B (zh) 一种vslam特征地图尺度恢复方法及系统
KR20160125803A (ko) 영역 추출 장치, 물체 탐지 장치 및 영역 추출 방법
WO2023222671A1 (fr) Détermination de position d'un véhicule à l'aide de segmentations d'image
CN116804553A (zh) 基于事件相机/imu/自然路标的里程计系统及方法
JP2022513830A (ja) 道路の表面上の対象体を検出してモデル化する方法
JP5557036B2 (ja) 退出判定装置、退出判定プログラム及び退出判定方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16887983

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16887983

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP