WO2018142533A1 - Position/orientation estimating device and position/orientation estimating method - Google Patents

Position/orientation estimating device and position/orientation estimating method

Info

Publication number
WO2018142533A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
recognition
azimuth
distance marker
orientation
Prior art date
Application number
PCT/JP2017/003757
Other languages
French (fr)
Japanese (ja)
Inventor
Takahiro Kashima (加島 隆博)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to JP2018562138A priority Critical patent/JP6479296B2/en
Priority to PCT/JP2017/003757 priority patent/WO2018142533A1/en
Priority to TW106113892A priority patent/TW201830336A/en
Publication of WO2018142533A1 publication Critical patent/WO2018142533A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation specially adapted for navigation in a road network
    • G01C21/28: Navigation in a road network with correlation of data from several navigational instruments
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • the present invention relates to a position / orientation estimation apparatus and a position / orientation estimation method for estimating the position and orientation of an estimation object.
  • Patent Literature 1 describes a vehicle position detection device that corrects a vehicle position using the position information of a kilometer post (distance marker). This vehicle position detection device specifies, from among pre-stored position information and feature information of kilometer posts, the position information of the kilometer post corresponding to the feature information recognized from a photographed image, and corrects the vehicle position using this position information.
  • A kilometer post is installed every 100 meters, starting from the start point of the down line and the end point of the up line, on a road where the lanes are separated into an up line and a down line.
  • A kilometer post indicates the number of kilometers, that is, the distance from the starting point set on the road to the kilometer post, so the same number of kilometers is marked on a kilometer post on the up line and on the kilometer post on the down line at the corresponding position.
  • When the number of kilometers is used as the feature information, the position information of the kilometer post on the up line should be specified, but there is a possibility that the position information of the kilometer post on the down line at the corresponding position is erroneously specified instead.
  • The present invention solves the above problem, and aims to obtain a position/orientation estimation apparatus and a position/orientation estimation method that can improve the estimation accuracy of the position and orientation of an estimation object.
  • a position and orientation estimation apparatus includes an object search unit, an orientation calculation unit, an object identification unit, and a position and orientation estimation unit.
  • The object search unit searches a database, in which the position coordinates in three-dimensional space of the reference points set for recognition objects are associated with the marking information indicated by the recognition objects, for the recognition object corresponding to the marking information recognized from the photographed image.
  • the azimuth calculating unit calculates the azimuth of the recognition target based on the position coordinates of the reference point set for the recognition target searched by the target search unit.
  • The object specifying unit specifies the recognition object in the captured image from among the recognition objects searched for by the object search unit, based on the shooting direction of the estimation object that shot the captured image and the azimuth of the recognition object calculated by the azimuth calculation unit.
  • The position/orientation estimation unit estimates the position and orientation of the estimation object based on the correspondence between the position coordinates in three-dimensional space of the reference points set for the recognition object specified by the object specifying unit and their position coordinates in the captured image.
  • According to the present invention, it is possible to accurately specify the recognition object in the photographed image, from among recognition objects that indicate the same marking information, based on the shooting direction of the estimation object and the azimuth of the recognition object.
  • the position and orientation of the estimation object can be accurately estimated based on the correspondence relationship between the position coordinates of the reference point of the accurately identified recognition object in the three-dimensional space and the position coordinates in the captured image.
  • FIG. 3A is a block diagram showing a hardware configuration for realizing the function of the position / orientation estimation apparatus according to Embodiment 1.
  • FIG. 3B is a block diagram illustrating a hardware configuration that executes software that implements the functions of the position and orientation estimation apparatus according to Embodiment 1.
  • FIG. 4 is a flowchart showing the operation of the position and orientation estimation apparatus according to the first embodiment.
  • FIG. 1 is a block diagram showing a functional configuration of a position / orientation estimation apparatus 1 according to Embodiment 1 of the present invention.
  • the position / orientation estimation device 1 is connected to each of the imaging device 2, the sensor device 3, and the distance marker database 4, and estimates the position and orientation of the imaging device 2.
  • the imaging device 2 is an estimation target whose position and orientation are estimated by the position / orientation estimation device 1, and is installed at a predetermined location of the vehicle to image the periphery of the vehicle.
  • position indicates the position of the estimation object in the three-dimensional space
  • posture is the tilt angle or rotation angle of the estimation object in the three-dimensional space.
  • the imaging device 2 is assumed to be an in-vehicle camera that images the front of the vehicle.
  • the sensor device 3 is a sensor mounted on the vehicle, and is a three-axis acceleration sensor and a three-axis geomagnetic sensor.
  • the acceleration sensor detects acceleration along the x-axis, acceleration along the y-axis, and acceleration along the z-axis in a three-dimensional space.
  • the geomagnetic sensor detects geomagnetism along each of the x-axis, y-axis, and z-axis. In the vehicle, the positional relationship between the imaging device 2 and the sensor device 3 is fixed.
  • the distance marker database 4 is a database in which the position coordinates of the reference point set in the distance marker in the three-dimensional space are associated with the number of kilometers of the distance marker.
  • the distance marker is a recognition object whose number of kilometers is recognized from the photographed image photographed by the photographing device 2.
  • the number of kilometers is the marking information marked on the distance marker that is the recognition object, and indicates the distance from the starting point set on the road to the distance marker.
  • FIG. 2 is a diagram showing an example of the distance marker database 4.
  • The distance marker is a rectangular sign board, and the reference points are the four corner points of the surface of the distance marker on which the number of kilometers is marked.
  • the distance marker registered in the distance marker database 4 shown in FIG. 2 is a distance marker set on a road in which lanes are separated into an up line and a down line.
  • the distance markers indicating the same number of kilometers are the distance marker on the up line side and the distance marker on the down line side at the position corresponding to the distance marker.
  • the reference points registered in the distance marker database 4 are points in a three-dimensional space representing the real space.
  • The three-dimensional space is represented by a three-dimensional coordinate system that takes a specific point on the earth as the origin and defines, for example, an x-axis in the east-west direction (positive toward the east), a y-axis in the north-south direction (positive toward the north), and a z-axis in the elevation direction (positive toward the height of the distance marker).
  • the position coordinates of the four corner points are distances along the x-axis, y-axis, and z-axis from the origin to the four corner points, expressed in meters.
  • the three-dimensional coordinate system representing the real space is referred to as a “world coordinate system”.
  • the position / orientation estimation apparatus 1 includes a distance marker recognition unit 5, an orientation estimation unit 6, a distance marker search unit 7, an orientation calculation unit 8, a distance marker identification unit 9, and a position / orientation estimation unit 10.
  • The distance marker recognition unit 5 is a component corresponding to the object recognition unit. From the captured image captured by the imaging device 2, it recognizes the number of kilometers marked on the distance marker and the position coordinates, in the two-dimensional coordinate system of the captured image, of the reference points set for the distance marker. For example, the distance marker recognition unit 5 performs pattern recognition processing or character recognition processing on the two-dimensional captured image, thereby recognizing the number of kilometers of the distance marker captured in the image and the position coordinates of the reference points.
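As a hedged illustration of the character recognition step, the following Python sketch extracts a kilometer figure from text that an OCR stage might return; the input format and the helper name are assumptions, since the patent does not describe the recognizer's output.

```python
import re

def parse_kilometers(ocr_text):
    """Extract a kilometer figure such as '123.4' from text returned by
    a character recognition step (hypothetical output format).
    Returns the first decimal number found, or None if there is none."""
    match = re.search(r"\d+(?:\.\d+)?", ocr_text)
    return match.group(0) if match else None

kilometers = parse_kilometers("123.4")  # → "123.4"
```

The extracted string can then serve as the search key into the distance marker database.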
  • the two-dimensional coordinate system of the captured image is referred to as an “image coordinate system”.
  • the orientation estimation unit 6 estimates the imaging orientation of the imaging device 2 based on the detection information of the sensor device 3.
  • The shooting direction of the imaging device 2 is the direction in which the viewpoint of the imaging device 2 faces.
  • This direction may be expressed as an angle indicating the shooting azimuth of the imaging device 2; for example, north may be represented by 0 degrees, east by 90 degrees, south by 180 degrees, and west by 270 degrees, or the angle may be expressed in radians.
  • the distance marker search unit 7 is a component corresponding to the object search unit, and searches the distance marker database 4 for a distance marker corresponding to the number of kilometers recognized by the distance marker recognition unit 5 from the photographed image.
  • In the distance marker database 4 shown in FIG. 2, the distance marker on the up line side and the corresponding distance marker on the down line side have the same number of kilometers, so two distance markers corresponding to that number are retrieved. For example, if the number of kilometers recognized by the distance marker recognition unit 5 is 123.4, two distance markers corresponding to that number are retrieved, and the position coordinates of the reference points set for each of these distance markers are output to the azimuth calculation unit 8.
  • the azimuth calculating unit 8 calculates the normal direction of the surface on which the number of kilometers is indicated as the azimuth of the distance marker based on the position coordinates of the reference point set in the distance marker searched by the distance marker searching unit 7.
  • The azimuth of the distance marker may be an angle indicating the azimuth of the distance marker with respect to the imaging device 2. For example, as with the azimuth estimation unit 6, the angle may be expressed with north as 0 degrees, east as 90 degrees, south as 180 degrees, and west as 270 degrees, or it may be expressed in radians.
  • The distance marker specifying unit 9 is a component corresponding to the object specifying unit, and specifies the distance marker in the captured image from among the distance markers searched for by the distance marker search unit 7, based on the shooting direction of the imaging device 2 and the azimuth of the distance marker.
  • the shooting direction of the shooting apparatus 2 is the direction estimated by the direction estimation unit 6, and the direction of the distance marker is the direction calculated by the direction calculation unit 8.
  • The position/orientation estimation unit 10 estimates the position and orientation of the imaging device 2 based on the correspondence between the position coordinates in three-dimensional space of the reference points set for the distance marker specified by the distance marker specifying unit 9 and their position coordinates in the captured image. For example, the position/orientation estimation unit 10 solves a PnP (Perspective-n-Point) problem using the internal parameters of the imaging device 2, based on the correspondence between the position coordinates of the reference points in the world coordinate system and their position coordinates in the image coordinate system. The position and orientation of the imaging device 2 are thereby estimated.
  • the internal parameters are information including the focal length and principal point of the photographing lens provided in the photographing apparatus 2.
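As a hedged illustration of the correspondence the PnP problem exploits, the following Python sketch shows the forward pinhole projection that a PnP solver inverts. The rotation, translation, focal lengths, and principal point values are hypothetical, not taken from the patent.

```python
def project(point_w, R, t, fx, fy, cx, cy):
    """Project a world-coordinate point into the image with a pinhole
    camera model. R is a 3x3 rotation matrix (list of rows), t a
    3-vector; fx, fy, cx, cy are internal parameters (focal lengths
    and principal point, in pixels)."""
    # World -> camera coordinates: p_c = R * p_w + t
    xc = sum(R[0][i] * point_w[i] for i in range(3)) + t[0]
    yc = sum(R[1][i] * point_w[i] for i in range(3)) + t[1]
    zc = sum(R[2][i] * point_w[i] for i in range(3)) + t[2]
    # Perspective division, then scaling by the internal parameters
    return fx * xc / zc + cx, fy * yc / zc + cy

# Identity rotation and zero translation: camera at the world origin.
R_id = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
u, v = project((0.0, 0.0, 5.0), R_id, (0.0, 0.0, 0.0),
               fx=1000.0, fy=1000.0, cx=640.0, cy=360.0)
# A point straight ahead lands on the principal point: (640.0, 360.0)
```

A PnP solver (for example, OpenCV's `cv2.solvePnP`) searches for the R and t that make the projections of the four reference points coincide with the pixel coordinates recognized from the captured image.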
  • the distance marker database 4 may be constructed on a storage area of an external storage device provided separately from the position and orientation estimation device 1.
  • the distance marker search unit 7 searches the distance marker by accessing the distance marker database 4 in the external storage device via a communication line such as the Internet or an intranet.
  • the distance marker recognizing unit 5 may be a constituent element included in the photographing apparatus 2.
  • In this case, the distance marker recognition unit 5 is removed from the position/orientation estimation apparatus 1; the marking information recognized from the captured image is output from the imaging device 2 to the distance marker search unit 7, and the position coordinates of the reference points recognized from the captured image are output from the imaging device 2 to the position/orientation estimation unit 10.
  • the direction estimation unit 6 may be a component included in the sensor device 3.
  • the azimuth estimation unit 6 is removed from the position / orientation estimation device 1, and information indicating the photographing orientation of the photographing device 2 is output from the sensor device 3 to the distance indicator specifying unit 9.
  • FIG. 3A is a block diagram showing a hardware configuration for realizing the function of the position / orientation estimation apparatus 1.
  • the imaging device 100, the sensor device 101, the storage device 102, and the processing circuit 103 are connected to each other by a bus.
  • FIG. 3B is a block diagram illustrating a hardware configuration that executes software that implements the functions of the position and orientation estimation apparatus 1.
  • the imaging device 100, the sensor device 101, the storage device 102, the CPU (Central Processing Unit) 104, and the memory 105 are connected to each other by a bus.
  • the image capturing device 100 is an image capturing device 2 that captures the periphery of the vehicle on which the position and orientation estimation device 1 is mounted, and the sensor device 101 is a sensor that constitutes the sensor device 3.
  • the storage device 102 stores a distance marker database 4.
  • the storage device 102 is realized by, for example, a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an HDD (Hard Disk Drive), or the like, and may be a storage device that combines these. Further, a part or all of the storage area of the storage device 102 may be provided in the external device.
  • In this case, the position/orientation estimation apparatus 1 communicates with the external device via a communication line such as the Internet or an intranet to execute the distance marker search process.
  • the functions of the distance marker recognition unit 5, the orientation estimation unit 6, the distance marker search unit 7, the orientation calculation unit 8, the distance marker identification unit 9, and the position / orientation estimation unit 10 are realized by a processing circuit.
  • the position / orientation estimation apparatus 1 includes a processing circuit for executing these functions.
  • the processing circuit may be dedicated hardware or a CPU that executes a program stored in a memory.
  • The processing circuit 103 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The functions of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker specifying unit 9, and the position/orientation estimation unit 10 may each be realized by a separate processing circuit, or they may be realized collectively by a single processing circuit.
  • When the processing circuit is the CPU 104 shown in FIG. 3B, the functions of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker specifying unit 9, and the position/orientation estimation unit 10 are realized by software, firmware, or a combination of software and firmware. The software and firmware are described as programs and stored in the memory 105.
  • The CPU 104 realizes each function by reading and executing the programs stored in the memory 105. That is, the position/orientation estimation apparatus 1 includes the memory 105 for storing programs that, when executed by the CPU 104, result in the execution of the processing from step ST1 to step ST9 shown in FIG. 4. These programs cause a computer to execute the procedures or methods of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker specifying unit 9, and the position/orientation estimation unit 10.
  • The memory 105 corresponds to, for example, a non-volatile or volatile semiconductor memory such as a RAM, a ROM, a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), and the like.
  • For example, the distance marker recognition unit 5 and the azimuth estimation unit 6 may realize their functions by a processing circuit of dedicated hardware, while the functions of the distance marker search unit 7, the azimuth calculation unit 8, the distance marker specifying unit 9, and the position/orientation estimation unit 10 are realized by the CPU 104 executing the programs stored in the memory 105.
  • the processing circuit can realize the above-described functions by hardware, software, firmware, or a combination thereof.
  • FIG. 4 is a flowchart showing the operation of the position / orientation estimation apparatus 1 and shows a series of processes from obtaining a captured image in front of the vehicle until the position and orientation of the imaging apparatus 2 are estimated.
  • When the distance marker recognition unit 5 receives a captured image of the area in front of the vehicle from the imaging device 2 (step ST1), it recognizes the distance marker from the captured image (step ST2).
  • FIG. 5 is a diagram showing an outline of processing for recognizing the distance marker 401 from the captured image 2A.
  • the pixel in the upper left corner of the captured image 2A is the origin (0, 0)
  • the axis extending rightward from the origin is defined as the x axis
  • the axis extending downward from the origin is defined as the y axis.
  • Each point on the captured image 2A is expressed in units of pixels and represents a position in the image coordinate system.
  • the definition of the origin and the coordinate axes described above is merely an example, and the present invention is not limited to this.
  • When the distance marker recognition unit 5 recognizes the distance marker 401 in the captured image 2A by performing image processing on the captured image 2A (step ST3; YES), it outputs the number of kilometers indicated on the distance marker 401 to the distance marker search unit 7. In FIG. 5, the marking information "123.4" is output to the distance marker search unit 7. The distance marker recognition unit 5 also recognizes the position coordinates, in the image coordinate system, of the four corner points of the distance marker 401 and outputs them to the position/orientation estimation unit 10. The four corner points of the distance marker 401 are points on the surface on which the number of kilometers is marked, namely the points 402a, 402b, 402c, and 402d shown in FIG. 5. On the other hand, when no distance marker appears in the captured image, the distance marker recognition unit 5 cannot recognize a distance marker (step ST3; NO), and the process returns to step ST1.
  • The distance marker search unit 7 searches the distance marker database 4 for the distance marker corresponding to the number of kilometers recognized from the captured image 2A by the distance marker recognition unit 5 (step ST4). For example, when the distance marker search unit 7 searches the distance marker database 4 shown in FIG. 2 based on the number of kilometers "123.4" input from the distance marker recognition unit 5, the information on the two distance markers corresponding to the number of kilometers "123.4" is read out and output to the azimuth calculation unit 8.
  • The above-mentioned distance marker information is the position coordinates, in the world coordinate system, of each of the four corner points of the distance marker.
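A minimal Python sketch of this search step, with a hypothetical in-memory stand-in for the distance marker database of FIG. 2 (the field names and coordinate values are invented for illustration):

```python
# Hypothetical stand-in for the distance marker database: each number of
# kilometers maps to the reference-point coordinates (world coordinate
# system, metres) of every distance marker bearing that number.
distance_marker_db = {
    "123.4": [
        {"line": "up",   "corners": [(10.0, 20.0, 1.5), (10.5, 20.0, 1.5),
                                     (10.5, 20.0, 0.5), (10.0, 20.0, 0.5)]},
        {"line": "down", "corners": [(18.0, 20.0, 1.5), (17.5, 20.0, 1.5),
                                     (17.5, 20.0, 0.5), (18.0, 20.0, 0.5)]},
    ],
}

def search_markers(kilometers):
    """Return every distance marker whose marking matches `kilometers`."""
    return distance_marker_db.get(kilometers, [])

candidates = search_markers("123.4")  # two markers: up line and down line
```

Both candidates are passed on to the azimuth calculation, which is what eventually disambiguates them.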
  • The azimuth calculation unit 8 calculates the azimuth of the distance marker with respect to the imaging device 2 based on the position coordinates, in the world coordinate system, of the four corner points of the distance marker input from the distance marker search unit 7 (step ST5). As described above, when two distance markers corresponding to the number of kilometers "123.4" recognized from the captured image 2A are retrieved from the distance marker database 4, the azimuth calculation unit 8 calculates the azimuth of each of these distance markers. The azimuth calculation unit 8 outputs the calculated azimuth of each distance marker, together with the position coordinates in the world coordinate system of the four corner points (reference points) of each distance marker, to the distance marker specifying unit 9.
  • FIG. 6 is a diagram showing an outline of processing for calculating the azimuth of the distance marker 601 and shows a state where the distance marker 601 is viewed from the sky.
  • the positive direction of the x axis is the east direction
  • the positive direction of the y axis is the north direction.
  • a point 602 is a point at the upper left corner of the distance marker 601
  • a point 603 is a point at the upper right corner of the distance marker 601. Points 602 and 603 are the reference points described above.
  • An arrow 604 indicates the installation direction of the distance marker 601, that is, the normal direction of the surface on which the number of kilometers is marked.
  • the angle 605 is an angle indicating the azimuth of the distance marker 601 calculated by the azimuth calculation unit 8.
  • The azimuth calculation unit 8 calculates the angle 605 using the position coordinates of the upper-left corner point 602 and the upper-right corner point 603 of the distance marker 601 and an inverse trigonometric function.
  • the azimuth calculation unit 8 calculates the azimuth for all the distance markers searched by the distance marker search unit 7 and outputs the calculated azimuth to the distance marker identification unit 9.
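A hedged Python sketch of this calculation, assuming x grows toward the east, y toward the north, and a corner-labelling convention (left and right as seen by a viewer facing the marked surface) that the patent does not spell out:

```python
import math

def marker_azimuth(upper_left, upper_right):
    """Azimuth (degrees, north = 0, east = 90) of the outward normal of
    the marked face, from the world coordinates of its top corners.
    Assumed conventions: x grows east, y grows north, and 'upper left'
    is the left corner as seen by a viewer facing the marked face."""
    ex = upper_right[0] - upper_left[0]   # top-edge vector, east component
    ey = upper_right[1] - upper_left[1]   # top-edge vector, north component
    # Rotating the top edge 90 degrees clockwise (seen from above)
    # gives the outward normal under the assumed convention.
    nx, ny = ey, -ex
    return math.degrees(math.atan2(nx, ny)) % 360.0

# A sign whose top edge runs west to east faces due south (azimuth 180).
azimuth = marker_azimuth((0.0, 0.0), (1.0, 0.0))  # → 180.0
```

The inverse trigonometric function here is `atan2`, which resolves the quadrant from the signs of both vector components.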
  • In the above description, the azimuth of the distance marker 601 is calculated using the position coordinates of the upper-left corner point 602 and the upper-right corner point 603 of the distance marker 601, but the azimuth may instead be calculated using the position coordinates of the lower-left corner point and the lower-right corner point, or using a combination of the upper-left, upper-right, lower-left, and lower-right corner points.
  • Alternatively, the azimuth calculation unit 8 may calculate the azimuth of the distance marker based on the outer product of the vector from the upper-left corner point to the lower-left corner point and the vector from the upper-left corner point to the upper-right corner point.
  • In this case, the azimuth calculation unit 8 calculates, from the value of the outer product, a vector perpendicular to the surface of the distance marker on which the number of kilometers is marked, and calculates the azimuth of the distance marker using the calculated vector and an inverse trigonometric function.
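A hedged sketch of this outer-product variant in Python, under an assumed corner-labelling convention (corners named as seen by a viewer facing the marked surface; x east, y north, z up):

```python
import math

def marker_azimuth_from_normal(upper_left, lower_left, upper_right):
    """Azimuth (degrees, north = 0, east = 90) of the marked face via
    the cross product of two edge vectors. The corner ordering is an
    assumed convention, as seen by a viewer facing the marked face."""
    # Edge vectors from the upper-left corner
    a = [lower_left[i] - upper_left[i] for i in range(3)]   # downward edge
    b = [upper_right[i] - upper_left[i] for i in range(3)]  # rightward edge
    # Cross product a x b is perpendicular to the marked face
    nx = a[1] * b[2] - a[2] * b[1]
    ny = a[2] * b[0] - a[0] * b[2]
    # The z component of the normal is not needed for a compass azimuth
    return math.degrees(math.atan2(nx, ny)) % 360.0

# A south-facing sign: top edge runs west to east, face toward -y.
azimuth = marker_azimuth_from_normal(
    (0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (1.0, 0.0, 1.0))  # → 180.0
```

Using two non-parallel edges makes the result insensitive to which pair of corners is chosen, which is why the patent offers it as an alternative.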
  • The azimuth estimation unit 6 acquires sensor values from the sensor device 3 (step ST6) and estimates the shooting direction of the imaging device 2 based on the acquired sensor values (step ST7).
  • For example, the azimuth estimation unit 6 calculates the posture of the imaging device 2 with respect to the ground based on the gravitational acceleration detected by the acceleration sensor, and estimates the shooting direction of the imaging device 2 based on the calculated posture and the geomagnetism detected by the geomagnetic sensor. Alternatively, the azimuth estimation unit 6 may estimate the shooting direction of the imaging device 2 using only the sensor information of the geomagnetic sensor.
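A common tilt-compensation scheme along these lines can be sketched in Python; the axis conventions (x forward, y right, z up while level) and sign choices are assumptions for illustration, since real sensor packages differ:

```python
import math

def estimate_heading(ax, ay, az, mx, my, mz):
    """Estimate a compass heading (degrees, north = 0, east = 90) from
    accelerometer (ax, ay, az) and magnetometer (mx, my, mz) readings.
    Assumed axes: x forward, y right, z up while the device is level."""
    # Posture with respect to the ground, from the gravity reaction
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Rotate the magnetic vector back into the horizontal plane
    mx2 = mx * math.cos(pitch) + mz * math.sin(pitch)
    my2 = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-my2, mx2)) % 360.0

# Device level, forward axis aligned with the horizontal field: heading 0.
heading = estimate_heading(0.0, 0.0, 9.8, 30.0, 0.0, -40.0)  # → 0.0
```

The magnetometer-only alternative mentioned above corresponds to skipping the roll and pitch terms and taking the heading directly from the horizontal field components.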
  • FIG. 7 is a diagram showing an outline of the distance marker specifying process.
  • A shooting azimuth 701 is the shooting direction of the imaging device 2 estimated by the azimuth estimation unit 6, and its azimuth angle is 60 degrees.
  • The azimuths 702 and 703 shown with broken lines are the azimuths of the two distance markers calculated by the azimuth calculation unit 8. The azimuth angle of the azimuth 702 is 50 degrees.
  • the distance marker specifying unit 9 inverts the azimuth angle of the imaging azimuth 701 by 180 degrees in order to facilitate comparison with the azimuth of the distance marker.
  • an azimuth 704 is an azimuth obtained by inverting the azimuth angle of the shooting azimuth 701 by 180 degrees, and the azimuth angle is 240 degrees.
  • the distance marker specifying unit 9 compares the azimuth 704 obtained by inverting the imaging azimuth 701 by 180 degrees with the azimuths 702 and 703 of the two distance markers.
  • The distance marker specifying unit 9 calculates the absolute value 705 of the difference between the azimuth angles of the azimuth 704 and the azimuth 702, calculates the absolute value 706 of the difference between the azimuth angles of the azimuth 704 and the azimuth 703, and compares the magnitudes of the absolute value 705 and the absolute value 706.
  • the distance marker specifying unit 9 specifies the distance marker having the smallest absolute value of the azimuth angle difference as the distance marker captured by the imaging device 2, that is, the distance marker in the captured image. Since the absolute value 705 is 170 degrees and the absolute value 706 is 20 degrees, the distance marker of the azimuth 703 is specified as the distance marker in the captured image.
  • the distance marker specifying unit 9 outputs the position coordinates in the world coordinate system of each point (reference point) at the four corners of the specified distance marker to the position and orientation estimating unit 10.
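The comparison can be sketched in Python. The shooting azimuth of 60 degrees and the marker azimuth of 50 degrees come from FIG. 7, while the second marker azimuth of 220 degrees is an assumed value (the patent does not state it) chosen so that the differences match the 170-degree and 20-degree values described above:

```python
def identify_marker(shooting_azimuth, marker_azimuths):
    """Return the index of the marker whose facing direction best
    opposes the shooting direction. Azimuths are in degrees
    (north = 0, east = 90)."""
    # Invert the shooting azimuth by 180 degrees for comparison
    inverted = (shooting_azimuth + 180.0) % 360.0

    def angular_diff(a, b):
        # Absolute difference on the circle, in the range [0, 180]
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    # Pick the marker whose azimuth is closest to the inverted direction
    return min(range(len(marker_azimuths)),
               key=lambda i: angular_diff(inverted, marker_azimuths[i]))

# 60 degrees inverts to 240; differences to 50 and 220 are 170 and 20,
# so the second marker is chosen.
best = identify_marker(60.0, [50.0, 220.0])  # → 1
```

Taking the difference on the circle (wrapping past 360 degrees) is what makes the two absolute values in the example come out as 170 and 20 rather than 190 and 20.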
  • The distance markers are set along the road, and each indicates the number of kilometers, that is, the distance from the starting point to the distance marker. The same number of kilometers is marked on the distance marker 800a on the up line side and on the distance marker 800b on the down line side at the position corresponding to the distance marker 800a. Likewise, the same number of kilometers is indicated on the distance marker 801b on the down line side at the corresponding position.
  • The distance marker specifying unit 9 specifies the distance marker in the captured image from among the distance markers searched for by the distance marker search unit 7, based on the azimuth of each distance marker calculated by the azimuth calculation unit 8 and the shooting direction of the imaging device 2 estimated by the azimuth estimation unit 6. This makes it possible to accurately identify the distance marker 800a on the up line side that actually appears in the captured image.
  • The position/orientation estimation unit 10 estimates the position and orientation of the imaging device 2 based on the correspondence between the position coordinates, in the world coordinate system, of the four corner points of the distance marker specified by the distance marker specifying unit 9 and the position coordinates, in the image coordinate system, of those four corner points (step ST9). Since the distance marker specified by the distance marker specifying unit 9 is the distance marker captured by the imaging device 2, the position/orientation estimation unit 10 can acquire the position coordinates in the image coordinate system of its four corner points from the distance marker recognition unit 5.
  • The position/orientation estimation unit 10 estimates the position and orientation, including the rotation component, of the imaging device 2 by solving the PnP problem using the internal parameters of the imaging device 2, based on the correspondence between the position coordinates in the world coordinate system and the position coordinates in the image coordinate system of the four corner points of the distance marker.
•  based on the estimated position and orientation of the imaging device 2 and the fixed positional relationship between the imaging device 2 and the vehicle, the position of the vehicle can be accurately estimated. Since the vehicle position can be estimated accurately, navigation accuracy can be improved.
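The step above composes the estimated camera pose with the fixed camera-to-vehicle offset. A minimal sketch follows; the pose convention (camera-to-world rotation `R_wc` and camera position `t_wc` in world coordinates) and the sample offset values are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def vehicle_pose(R_wc, t_wc, R_cv, t_cv):
    """Compose the camera pose in the world frame (R_wc, t_wc) with the
    fixed camera-to-vehicle transform (R_cv, t_cv) to get the vehicle pose."""
    R_wv = R_wc @ R_cv
    t_wv = R_wc @ t_cv + t_wc
    return R_wv, t_wv

# camera at (1, 2, 0) aligned with the world axes; vehicle origin 1 m behind it
R_wc = np.eye(3)
t_wc = np.array([1.0, 2.0, 0.0])
R_cv = np.eye(3)
t_cv = np.array([0.0, -1.0, 0.0])
R_wv, t_wv = vehicle_pose(R_wc, t_wc, R_cv, t_cv)
```

Because the camera-to-vehicle transform is fixed at installation time, an accurate camera pose immediately yields an equally accurate vehicle pose.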
•  since there is no dependency between the processing of steps ST1 to ST5 and the processing of steps ST6 to ST7, the former and the latter may be executed in parallel.
•  the case where the recognition target object is a distance marker and the marking information is a kilometer number has been described, but Embodiment 1 is not limited to this.
•  the recognition target object may be a piece of equipment installed along a road, and the marking information may be code information such as a number, character string, image, or two-dimensional barcode marked on that equipment. That is, the recognition target object and the marking information may be anything that can be recognized from a captured image.
•  the reference points are not limited to the four corner points described above; a reference point may be a point on an edge of the distance marker or a point on the surface on which the kilometer number is marked.
•  although the case where the position/orientation estimation apparatus 1 is mounted on a vehicle has been described, the present invention is not limited to this.
•  the position/orientation estimation apparatus 1 may be mounted on a moving body such as a railway vehicle or an aircraft, or may be realized by a mobile terminal such as a smartphone or tablet PC and carried by a person.
•  as described above, the position/orientation estimation apparatus 1 searches the distance marker database 4 for the distance markers corresponding to the kilometer number recognized from the captured image and the position coordinates of their reference points;
•  calculates the azimuth of each retrieved distance marker based on the position coordinates of its reference points;
•  identifies the distance marker in the captured image based on the calculated azimuths and the shooting azimuth of the imaging device 2; and
•  estimates the position and orientation of the imaging device 2 based on the correspondence between the position coordinates of the reference points of the identified distance marker in the world coordinate system and those in the image coordinate system.
•  as a result, the distance marker in the captured image can be accurately identified from among distance markers bearing the same kilometer number, and the position and orientation of the imaging device 2 can be accurately estimated based on the position coordinates of the reference points of the accurately identified distance marker. Since the position of the photographed distance marker can also be identified accurately, the position/orientation estimation apparatus 1 can be applied to recording road maintenance and inspection work. Further, since the position and orientation of the position/orientation estimation apparatus 1 can be accurately estimated from the position and orientation of the imaging device 2, the apparatus can also be applied to augmented reality applications that require the accurate position and orientation of an object.
•  the azimuth estimation unit 6 estimates the attitude of the imaging device 2 with respect to the ground based on the detection information of the acceleration sensor, and estimates the shooting azimuth of the imaging device 2 based on the estimated attitude. With this configuration, the shooting azimuth of the imaging device 2 can be estimated accurately.
•  within the scope of the invention, any component of the embodiment may be modified, and any component of the embodiment may be omitted.
  • the position / orientation estimation apparatus can improve the estimation accuracy of the position and orientation of the estimation object, and is suitable, for example, for an in-vehicle navigation apparatus.
•  1: position/orientation estimation apparatus; 2, 100: imaging device; 2A: captured image; 3, 101: sensor device; 4: distance marker database; 5: distance marker recognition unit; 6: azimuth estimation unit; 7: distance marker search unit; 8: azimuth calculation unit; 9: distance marker identification unit; 10: position/orientation estimation unit; 102: storage device; 103: processing circuit; 104: CPU; 105: memory; 401, 601: distance marker; 604: arrow; 605: angle; 701: shooting azimuth; 702–704: azimuth; 705, 706: absolute value; 800a, 800b, 801a, 801b: distance markers.

Abstract

A position/orientation estimating device (1): retrieves, from a kilometer post database (4), the positional coordinates of a reference point of a kilometer post corresponding to the kilometer number recognized from a photographed image; calculates the azimuth of the kilometer post on the basis of the retrieved positional coordinates of the reference point; specifies a kilometer post in the photographed image on the basis of the calculated azimuth of the kilometer post and the photographing azimuth of a photographing device (2); and estimates the position and the orientation of the photographing device (2) on the basis of the correspondence between the position coordinates of the reference point of the specified kilometer post in the world coordinate system and the position coordinates thereof in the image coordinate system.

Description

Position/orientation estimation apparatus and position/orientation estimation method
The present invention relates to a position/orientation estimation apparatus and a position/orientation estimation method for estimating the position and orientation of an estimation object.
A technique is known that estimates the position and orientation of an estimation object using the position information of a distance marker recognized from a captured image. For example, Patent Literature 1 describes a vehicle position detection device that corrects the vehicle position using the position information of a kilometer post (distance marker). This vehicle position detection device specifies, from among pre-stored position information and feature information of kilometer posts, the position information of the kilometer post corresponding to the feature information recognized from the captured image, and corrects the vehicle position using this position information.
JP 2009-250718 A
However, in the vehicle position detection device described in Patent Literature 1, when a plurality of kilometer posts having the same feature information exist at different locations, the kilometer post actually captured by the camera cannot be identified accurately from the feature information recognized from the captured image alone. As a result, the vehicle position could be corrected to an incorrect position.
For example, consider a road in which the lanes are separated into an up line and a down line, with a kilometer post installed every 100 meters starting from the start point of the down line and the end point of the up line.
Each kilometer post displays a kilometer number, which is the distance from the starting point set on the road to that post, so a kilometer post on the up-line side and the kilometer post at the corresponding position on the down-line side display the same kilometer number. In the above vehicle position detection device, this kilometer number serves as the feature information; therefore, where the position information of the up-line side kilometer post should be specified, the position information of the down-line side kilometer post at the corresponding position may be specified by mistake.
The present invention solves the above problem, and an object of the invention is to obtain a position/orientation estimation apparatus and a position/orientation estimation method capable of improving the accuracy of estimating the position and orientation of an estimation object.
A position/orientation estimation apparatus according to the present invention includes an object search unit, an azimuth calculation unit, an object identification unit, and a position/orientation estimation unit.
The object search unit searches a database, in which the position coordinates in three-dimensional space of the reference points set for each recognition target object are associated with the marking information displayed by that object, for the recognition target objects corresponding to the marking information recognized from a captured image.
The azimuth calculation unit calculates the azimuth of each recognition target object based on the position coordinates of the reference points set for the recognition target objects retrieved by the object search unit.
The object identification unit identifies the recognition target object in the captured image from among the recognition target objects retrieved by the object search unit, based on the shooting azimuth of the estimation object that captured the image and the azimuths calculated by the azimuth calculation unit.
The position/orientation estimation unit estimates the position and orientation of the estimation object based on the correspondence between the position coordinates in three-dimensional space of the reference points set for the recognition target object identified by the object identification unit and their position coordinates in the captured image.
According to the present invention, the recognition target object in the captured image can be accurately identified from among recognition target objects displaying the same marking information, based on the shooting azimuth of the estimation object and the azimuths of the recognition target objects. The position and orientation of the estimation object can then be accurately estimated based on the correspondence between the position coordinates in three-dimensional space of the reference points of the accurately identified recognition target object and their position coordinates in the captured image.
FIG. 1 is a block diagram showing a functional configuration of a position/orientation estimation apparatus according to Embodiment 1 of the present invention.
FIG. 2 is a diagram showing an example of a distance marker database.
FIG. 3A is a block diagram showing a hardware configuration that realizes the functions of the position/orientation estimation apparatus according to Embodiment 1. FIG. 3B is a block diagram showing a hardware configuration that executes software realizing the functions of the position/orientation estimation apparatus according to Embodiment 1.
FIG. 4 is a flowchart showing the operation of the position/orientation estimation apparatus according to Embodiment 1.
FIG. 5 is a diagram showing an outline of processing for recognizing a distance marker from a captured image.
FIG. 6 is a diagram showing an outline of processing for calculating the azimuth of a distance marker.
FIG. 7 is a diagram showing an outline of processing for identifying a distance marker.
FIG. 8 is a diagram showing an installation example of distance markers.
Hereinafter, in order to describe the present invention in more detail, embodiments for carrying out the invention will be described with reference to the accompanying drawings.
Embodiment 1.
FIG. 1 is a block diagram showing a functional configuration of a position/orientation estimation apparatus 1 according to Embodiment 1 of the present invention. As shown in FIG. 1, the position/orientation estimation apparatus 1 is connected to each of an imaging device 2, a sensor device 3, and a distance marker database 4, and estimates the position and orientation of the imaging device 2.
The imaging device 2 is the estimation object whose position and orientation are estimated by the position/orientation estimation apparatus 1; it is installed at a predetermined location on a vehicle and captures images of the surroundings of the vehicle.
In the following description, "position" refers to the position of the estimation object in three-dimensional space, and "orientation" refers to the tilt angle or rotation angle of the estimation object in three-dimensional space. The imaging device 2 is assumed to be an in-vehicle camera that captures images of the area ahead of the vehicle.
The sensor device 3 comprises sensors mounted on the vehicle: a three-axis acceleration sensor and a three-axis geomagnetic sensor. The acceleration sensor detects acceleration along the x-axis, y-axis, and z-axis in three-dimensional space. The geomagnetic sensor detects geomagnetism along each of the x-axis, y-axis, and z-axis.
In the vehicle, the positional relationship between the imaging device 2 and the sensor device 3 is fixed.
The distance marker database 4 is a database in which the position coordinates in three-dimensional space of the reference points set for each distance marker are associated with the kilometer number of that distance marker.
A distance marker is a recognition target object whose kilometer number is image-recognized from a captured image taken by the imaging device 2. The kilometer number is the marking information displayed on the distance marker and indicates the distance from the starting point set on the road to that distance marker.
FIG. 2 is a diagram showing an example of the distance marker database 4. In FIG. 2, each distance marker is a rectangular sign board, and the reference points are the four corner points of the surface on which the kilometer number is marked. The distance markers registered in the distance marker database 4 shown in FIG. 2 are set on a road whose lanes are separated into an up line and a down line. In the distance marker database 4, the distance markers displaying the same kilometer number are a distance marker on the up-line side and the distance marker at the corresponding position on the down-line side.
The reference points registered in the distance marker database 4 are points in a three-dimensional space representing the real space. This three-dimensional space is represented, for example, by a three-dimensional coordinate system with a specific point on the earth as the origin, an x-axis in the east-west direction (e.g., east positive), a y-axis in the north-south direction (e.g., north positive), and a z-axis in the elevation direction (e.g., the height direction of the distance marker positive).
The position coordinates of the four corner points are the distances in meters from the origin to those points along the x-axis, y-axis, and z-axis.
Hereinafter, the three-dimensional coordinate system representing the real space is referred to as the "world coordinate system".
The position/orientation estimation apparatus 1 includes a distance marker recognition unit 5, an azimuth estimation unit 6, a distance marker search unit 7, an azimuth calculation unit 8, a distance marker identification unit 9, and a position/orientation estimation unit 10.
The distance marker recognition unit 5 is a component corresponding to the object recognition unit; from the captured image taken by the imaging device 2, it recognizes the kilometer number marked on a distance marker and the position coordinates, in the two-dimensional coordinate system of the captured image, of the reference points set for that distance marker.
For example, the distance marker recognition unit 5 performs pattern recognition processing or character recognition processing on the captured image, which is a two-dimensional photographed image, to recognize the kilometer number of the distance marker appearing in the image and the position coordinates of its reference points.
Hereinafter, the two-dimensional coordinate system of the captured image is referred to as the "image coordinate system".
The azimuth estimation unit 6 estimates the shooting azimuth of the imaging device 2 based on the detection information of the sensor device 3. The shooting azimuth of the imaging device 2 is the azimuth toward which the viewpoint of the imaging device 2 faces. This azimuth may be any angle that indicates the shooting direction of the imaging device 2; for example, north may be represented as 0 degrees, east as 90 degrees, south as 180 degrees, and west as 270 degrees, or the angle may be expressed in radians.
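As one concrete way such an azimuth could be derived from the geomagnetic sensor, the sketch below converts the horizontal magnetometer components of a level device into a bearing in degrees (0 = north, 90 = east). The axis convention and the omission of tilt compensation are illustrative assumptions, not the patented method:

```python
import math

def heading_from_mag(mx, my):
    """Heading of a level device from its magnetometer, in degrees clockwise
    from magnetic north (x axis to the device's right, y axis forward).
    A tilted device would first need the magnetic vector rotated into the
    horizontal plane using the accelerometer's gravity estimate."""
    return math.degrees(math.atan2(-mx, my)) % 360.0
```

Facing north puts the field along the device's +y axis (heading 0 degrees); facing east puts it along the device's −x axis (heading 90 degrees).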
The distance marker search unit 7 is a component corresponding to the object search unit, and searches the distance marker database 4 for the distance markers corresponding to the kilometer number recognized from the captured image by the distance marker recognition unit 5. In the distance marker database 4 shown in FIG. 2, a distance marker on the up-line side and the distance marker at the corresponding position on the down-line side have the same kilometer number, so two distance markers corresponding to the kilometer number recognized by the distance marker recognition unit 5 are retrieved. For example, if the recognized kilometer number is 123.4, the two distance markers corresponding to this number are retrieved, and the position coordinates of the reference points set for each of them are output to the azimuth calculation unit 8.
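The lookup performed by the distance marker search unit 7 can be sketched as a keyed search that returns every marker carrying the recognized kilometer number. The marker names and coordinates below are invented placeholders, not values from the database of FIG. 2:

```python
# kilometer number -> list of (marker_id, four reference-point coordinates in
# the world coordinate system, metres)
marker_db = {
    123.4: [
        ("up_line",   [(10.0, 50.0, 2.0), (10.5, 50.0, 2.0),
                       (10.5, 50.0, 1.5), (10.0, 50.0, 1.5)]),
        ("down_line", [(20.0, 58.0, 2.0), (19.5, 58.0, 2.0),
                       (19.5, 58.0, 1.5), (20.0, 58.0, 1.5)]),
    ],
}

def search_markers(kilometers):
    """Return every registered distance marker whose kilometer number matches."""
    return marker_db.get(kilometers, [])

candidates = search_markers(123.4)   # two markers share the number 123.4
```

The point of the search is precisely that it is ambiguous: both candidates are passed on so that the azimuth comparison can disambiguate them.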
The azimuth calculation unit 8 calculates, as the azimuth of each distance marker, the normal direction of the surface on which the kilometer number is marked, based on the position coordinates of the reference points set for the distance markers retrieved by the distance marker search unit 7.
The azimuth of a distance marker may be any angle indicating the direction of the distance marker with respect to the imaging device 2; for example, as with the azimuth estimation unit 6, north may be represented as 0 degrees, east as 90 degrees, south as 180 degrees, and west as 270 degrees, or the angle may be expressed in radians.
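A minimal sketch of this calculation: take two edge vectors of the marking face from its corner points, form their cross product as the face normal, and convert the normal's horizontal components into a compass bearing. The corner ordering (top-left, top-right, bottom-right, bottom-left as seen from the front) is an assumption for illustration; the east/north/up axes follow the world coordinate system defined above:

```python
import math

def marker_azimuth(corners):
    """Bearing (degrees, 0 = north, 90 = east) of the marking-face normal.
    corners: four (x, y, z) points in metres, x east / y north / z up,
    ordered top-left, top-right, bottom-right, bottom-left as seen from
    the front of the marker."""
    p0, p1, p3 = corners[0], corners[1], corners[3]
    e_right = [p1[i] - p0[i] for i in range(3)]   # along the top edge
    e_up = [p0[i] - p3[i] for i in range(3)]      # up the left edge
    # outward face normal = e_right x e_up (only horizontal components needed)
    nx = e_right[1] * e_up[2] - e_right[2] * e_up[1]
    ny = e_right[2] * e_up[0] - e_right[0] * e_up[2]
    return math.degrees(math.atan2(nx, ny)) % 360.0
```

For a marker whose face lies in an east-west vertical plane with its front toward the north, the function returns 0 degrees.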
The distance marker identification unit 9 is a component corresponding to the object identification unit, and identifies the distance marker in the captured image from among the distance markers retrieved by the distance marker search unit 7, based on the shooting azimuth of the imaging device 2 and the azimuths of the distance markers. Here, the shooting azimuth of the imaging device 2 is the azimuth estimated by the azimuth estimation unit 6, and the azimuth of each distance marker is the azimuth calculated by the azimuth calculation unit 8.
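One plausible selection rule, sketched under the assumption (consistent with the absolute-value azimuth differences appearing in the reference numeral list, but not a verbatim reproduction of the patented rule) that the marker actually photographed is the one whose marking face most directly opposes the shooting azimuth, i.e. whose absolute angular difference from the shooting azimuth is closest to 180 degrees:

```python
def bearing_diff(a, b):
    """Smallest absolute difference between two bearings, in degrees (0..180)."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def identify_marker(shooting_azimuth, candidates):
    """candidates: list of (marker_id, marker_azimuth in degrees).
    Pick the candidate whose face most directly opposes the camera."""
    return max(candidates, key=lambda c: bearing_diff(shooting_azimuth, c[1]))
```

With the camera shooting toward 90 degrees (east), an up-line marker facing 270 degrees (west, toward the camera) is chosen over a down-line marker facing 90 degrees, whose face could not be visible from that viewpoint.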
The position/orientation estimation unit 10 estimates the position and orientation of the imaging device 2 based on the correspondence between the position coordinates in three-dimensional space of the reference points set for the distance marker identified by the distance marker identification unit 9 and their position coordinates in the captured image.
For example, the position/orientation estimation unit 10 solves the PnP (Perspective-n-Point) problem using the internal parameters of the imaging device 2, based on the correspondence between the positions of the reference points in the world coordinate system and in the image coordinate system. The position and orientation of the imaging device 2 are thereby estimated. The internal parameters (intrinsic parameters) are information including the focal length and principal point of the imaging lens of the imaging device 2.
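Since the four reference points are coplanar, the PnP problem here reduces to the planar case. The sketch below is an illustrative stand-in for whatever solver the apparatus actually uses (in practice a library routine such as OpenCV's `solvePnP` would be typical): it estimates the world-plane-to-image homography with the DLT and decomposes it with the intrinsic matrix K to recover the pose:

```python
import numpy as np

def homography_dlt(world_xy, image_uv):
    """Homography mapping plane points (z = 0) to pixel coordinates (DLT)."""
    A = []
    for (x, y), (u, v) in zip(world_xy, image_uv):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def pose_from_plane(world_xy, image_uv, K):
    """World-to-camera rotation R and translation t from >= 4 coplanar points."""
    M = np.linalg.inv(K) @ homography_dlt(world_xy, image_uv)
    lam = 1.0 / np.linalg.norm(M[:, 0])
    if M[2, 2] < 0:        # the marker must lie in front of the camera (t_z > 0)
        lam = -lam
    r1, r2, t = lam * M[:, 0], lam * M[:, 1], lam * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    return R, t

# synthetic check: project four corner points with a known pose, then recover it
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
c, s = np.cos(0.1), np.sin(0.1)
R_true = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
t_true = np.array([0.2, -0.1, 5.0])
corners = [(-0.25, -0.3), (0.25, -0.3), (0.25, 0.3), (-0.25, 0.3)]
uv = []
for x, y in corners:
    p = K @ (R_true @ np.array([x, y, 0.0]) + t_true)
    uv.append((p[0] / p[2], p[1] / p[2]))
R_est, t_est = pose_from_plane(corners, uv, K)
```

With noise-free correspondences the decomposition recovers the pose exactly; with real detections, the columns of R would additionally be re-orthogonalized and the result refined by reprojection-error minimization.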
The distance marker database 4 may be constructed in a storage area of an external storage device provided separately from the position/orientation estimation apparatus 1. In this case, the distance marker search unit 7 accesses the distance marker database 4 in the external storage device via a communication line such as the Internet or an intranet to search for distance markers.
The distance marker recognition unit 5 may be a component of the imaging device 2. In this case, the distance marker recognition unit 5 is removed from the position/orientation estimation apparatus 1; the marking information recognized from the captured image is output from the imaging device 2 to the distance marker search unit 7, and the position coordinates of the reference points recognized from the captured image are output from the imaging device 2 to the position/orientation estimation unit 10.
The azimuth estimation unit 6 may be a component of the sensor device 3. In this case, the azimuth estimation unit 6 is removed from the position/orientation estimation apparatus 1, and information indicating the shooting azimuth of the imaging device 2 is output from the sensor device 3 to the distance marker identification unit 9.
FIG. 3A is a block diagram showing a hardware configuration that realizes the functions of the position/orientation estimation apparatus 1. An imaging device 100, a sensor device 101, a storage device 102, and a processing circuit 103 are connected to one another by a bus. FIG. 3B is a block diagram showing a hardware configuration that executes software realizing the functions of the position/orientation estimation apparatus 1. The imaging device 100, the sensor device 101, the storage device 102, a CPU (Central Processing Unit) 104, and a memory 105 are connected to one another by a bus.
In FIGS. 3A and 3B, the imaging device 100 is the imaging device 2 that captures images of the surroundings of the vehicle on which the position/orientation estimation apparatus 1 is mounted, and the sensor device 101 comprises the sensors constituting the sensor device 3. The storage device 102 stores the distance marker database 4. The storage device 102 is realized by, for example, a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an HDD (Hard Disk Drive), or the like, or may be a storage device combining these. Part or all of the storage area of the storage device 102 may also be provided in an external device. In this case, the position/orientation estimation apparatus 1 communicates with the external device via a communication line such as the Internet or an intranet to execute the distance marker search processing.
The functions of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker identification unit 9, and the position/orientation estimation unit 10 in the position/orientation estimation apparatus 1 are realized by a processing circuit. That is, the position/orientation estimation apparatus 1 includes a processing circuit for executing these functions. The processing circuit may be dedicated hardware or a CPU that executes a program stored in a memory.
When the processing circuit is the dedicated hardware processing circuit 103 shown in FIG. 3A, the processing circuit 103 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
The functions of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker identification unit 9, and the position/orientation estimation unit 10 may each be realized by a separate processing circuit, or the functions may be realized collectively by a single processing circuit.
When the processing circuit is the CPU 104 shown in FIG. 3B, the functions of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker identification unit 9, and the position/orientation estimation unit 10 are realized by software, firmware, or a combination of software and firmware. The software and firmware are written as programs and stored in the memory 105.
The CPU 104 realizes each function by reading and executing the programs stored in the memory 105. That is, the position/orientation estimation apparatus 1 includes the memory 105 for storing programs that, when executed by the CPU 104, result in the execution of the processing from step ST1 to step ST9 shown in FIG. 4.
These programs cause a computer to execute the procedures or methods of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker identification unit 9, and the position/orientation estimation unit 10.
The memory corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM, a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically EPROM), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disk), or the like.
Some of the functions of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker identification unit 9, and the position/orientation estimation unit 10 may be realized by dedicated hardware, and some by software or firmware.
For example, the distance marker recognition unit 5 and the azimuth estimation unit 6 may realize their functions with dedicated hardware processing circuits, while the functions of the distance marker search unit 7, the azimuth calculation unit 8, the distance marker identification unit 9, and the position/orientation estimation unit 10 are realized by the CPU 104 executing programs stored in the memory 105.
In this way, the processing circuit can realize the above functions by hardware, software, firmware, or a combination thereof.
Next, the operation will be described.
FIG. 4 is a flowchart showing the operation of the position/orientation estimation device 1, illustrating the series of processes from obtaining a captured image of the area ahead of the vehicle to estimating the position and orientation of the imaging device 2.
First, when the distance marker recognition unit 5 receives a captured image of the area ahead of the vehicle from the imaging device 2 (step ST1), it recognizes a distance marker in the captured image (step ST2).
FIG. 5 is a diagram outlining the process of recognizing a distance marker 401 in a captured image 2A. In FIG. 5, the pixel at the upper-left corner of the captured image 2A is the origin (0, 0); the axis extending rightward from the origin is defined as the x-axis, and the axis extending downward from the origin as the y-axis. Each point on the captured image 2A is defined in units of pixels, and its position is expressed in the image coordinate system.
Note that the definitions of the origin and coordinate axes described above are merely an example, and the invention is not limited to them.
When the distance marker recognition unit 5 recognizes the distance marker 401 appearing in the captured image 2A by image-processing the captured image 2A (step ST3; YES), it recognizes the kilometer value marked on the distance marker 401 and outputs it to the distance marker search unit 7. In FIG. 5, the marking information consisting of the number "123.4" is output to the distance marker search unit 7.
The distance marker recognition unit 5 also recognizes the position coordinates, in the image coordinate system, of the four corner points of the distance marker 401 and outputs them to the position/orientation estimation unit 10. The four corner points of the distance marker 401 are points on the face of the distance marker 401 on which the kilometer value is marked, namely, points 402a, 402b, 402c, and 402d shown in FIG. 5.
On the other hand, if no distance marker appears in the captured image, the distance marker recognition unit 5 cannot recognize a distance marker (step ST3; NO), and the processing returns to step ST1.
The distance marker search unit 7 searches the distance marker database 4 for distance markers corresponding to the kilometer value recognized from the captured image 2A by the distance marker recognition unit 5 (step ST4).
For example, when the distance marker search unit 7 searches the distance marker database 4 shown in FIG. 2 on the basis of "123.4", the kilometer value received from the distance marker recognition unit 5, the information of the two distance markers corresponding to the kilometer value "123.4" is read out and output to the azimuth calculation unit 8. The distance marker information here is the position coordinates, in the world coordinate system, of the four corner points of each distance marker.
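As a concrete illustration, the lookup in step ST4 can be sketched as a plain dictionary keyed by the kilometer value. The entries and corner coordinates below are hypothetical placeholders, not the actual contents of the distance marker database 4 shown in FIG. 2.

```python
# Hypothetical in-memory stand-in for the distance marker database 4:
# a mapping from the marked kilometer value to the markers bearing it.
# Two markers share "123.4" (one per travel direction), as in the example.
marker_db = {
    "123.4": [
        {"id": "up-line", "corners_world": [(10.0, 5.0, 1.5), (10.4, 5.0, 1.5),
                                            (10.4, 5.0, 1.2), (10.0, 5.0, 1.2)]},
        {"id": "down-line", "corners_world": [(10.0, 12.0, 1.5), (9.6, 12.0, 1.5),
                                              (9.6, 12.0, 1.2), (10.0, 12.0, 1.2)]},
    ],
}

def search_markers(km_value):
    """Step ST4: return every stored marker whose kilometer value matches."""
    return marker_db.get(km_value, [])
```

A real database would of course hold many kilometer values; the point is only that a single recognized value can map to more than one candidate marker, which is what the azimuth comparison later resolves.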
Next, the azimuth calculation unit 8 calculates the azimuth of each distance marker relative to the imaging device 2 on the basis of the position coordinates, in the world coordinate system, of its four corner points received from the distance marker search unit 7 (step ST5). As described above, when two distance markers corresponding to the kilometer value "123.4" recognized from the captured image 2A are retrieved from the distance marker database 4, the azimuth calculation unit 8 calculates an azimuth for each of these distance markers. The azimuth calculation unit 8 outputs the calculated azimuth of each distance marker, together with the position coordinates in the world coordinate system of the four corner points (reference points) of that distance marker, to the distance marker identification unit 9.
FIG. 6 is a diagram outlining the process of calculating the azimuth of a distance marker 601, viewed from above. In FIG. 6, the positive x-axis direction is east and the positive y-axis direction is north. A point 602 is the upper-left corner point of the distance marker 601, and a point 603 is its upper-right corner point. The points 602 and 603 are the reference points described above. An arrow 604 indicates the installation direction of the distance marker 601, that is, the normal direction of the face on which the kilometer value is marked.
An angle 605 is the angle indicating the azimuth of the distance marker 601 calculated by the azimuth calculation unit 8. The azimuth calculation unit 8 calculates the angle 605 using the position coordinates of the upper-left corner point 602, the position coordinates of the upper-right corner point 603, and an inverse trigonometric function.
The azimuth calculation unit 8 calculates the azimuth of every distance marker retrieved by the distance marker search unit 7 and outputs the calculated azimuths to the distance marker identification unit 9.
In the above description, the azimuth of the distance marker 601 is calculated using the position coordinates of the upper-left corner point 602 and the upper-right corner point 603; alternatively, the azimuth may be calculated using the position coordinates of the lower-left and lower-right corner points, or using a combination of the upper-left, upper-right, lower-left, and lower-right corner points.
Furthermore, the azimuth calculation unit 8 may calculate the azimuth of the distance marker on the basis of the cross product of the vector from the upper-left corner point to the lower-left corner point and the vector from the upper-left corner point to the upper-right corner point. For example, the azimuth calculation unit 8 calculates, from the value of the cross product, a vector perpendicular to the face of the distance marker on which the kilometer value is marked, and calculates the azimuth of the distance marker using the calculated vector and an inverse trigonometric function.
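The corner-point computation described for FIG. 6 can be sketched as follows. The embodiment only specifies that two corner points and an inverse trigonometric function are used; the choice of which side of the top edge counts as the outward face normal is an assumed convention for illustration.

```python
import math

def marker_azimuth(upper_left, upper_right):
    """Azimuth of a distance marker's face normal, clockwise from north,
    from the world coordinates (x = east, y = north) of its upper-left
    and upper-right corner points, as in FIG. 6.
    """
    ex = upper_right[0] - upper_left[0]
    ey = upper_right[1] - upper_left[1]
    # Rotate the top-edge vector by -90 degrees to get the face normal
    # (which side counts as "outward" is an assumed convention here).
    nx, ny = ey, -ex
    # atan2(east component, north component) is the inverse trigonometric
    # step: it yields the compass angle of the normal, in [0, 360).
    return math.degrees(math.atan2(nx, ny)) % 360.0
```

For example, a marker whose top edge runs due east (left corner west of right corner) faces due south under this convention, and one whose top edge runs due north faces due east. The cross-product variant mentioned above differs only in how the normal vector is obtained, not in the final inverse trigonometric step.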
Next, when the azimuth estimation unit 6 acquires sensor values from the sensor device 3 (step ST6), it estimates the imaging azimuth of the imaging device 2 on the basis of the acquired sensor values (step ST7).
For example, the azimuth estimation unit 6 calculates the attitude of the imaging device 2 with respect to the ground on the basis of the gravitational acceleration detected by the acceleration sensor, and estimates the imaging azimuth of the imaging device 2 on the basis of the calculated attitude and the geomagnetic values detected by the geomagnetic sensor.
When the imaging device 2 is held perpendicular to the ground (i.e., the imaging direction is horizontal), or when the attitude of the imaging device 2 is fixed to the vehicle and therefore known, the azimuth estimation unit 6 may estimate the imaging azimuth of the imaging device 2 using only the sensor information from the geomagnetic sensor.
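One way the two-step computation of the azimuth estimation unit 6 (attitude from gravity, then azimuth from geomagnetism) could look is a tilt-compensated compass. This is a hedged sketch, not the embodiment's implementation: the device axis convention (x forward, y right, z up) and the accelerometer sign convention are assumptions.

```python
import math

def _norm(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def heading(acc, mag):
    """Tilt-compensated compass heading in degrees, clockwise from
    magnetic north, of the device's forward (+x) axis.

    Assumed conventions: device frame x forward, y right, z up; the
    accelerometer reads +g on z when the device lies flat and static.
    """
    up = _norm(acc)  # gravity defines the vertical direction
    # Horizontal component of the magnetic field = magnetic north.
    dm = sum(m * u for m, u in zip(mag, up))
    north = tuple(m - dm * u for m, u in zip(mag, up))
    # Horizontal component of the device's forward axis (+x).
    fwd = tuple(f - up[0] * u for f, u in zip((1.0, 0.0, 0.0), up))
    # Signed angle from north to forward, clockwise about 'up'.
    sin_a = sum(c * u for c, u in zip(_cross(north, fwd), up))
    cos_a = sum(n * f for n, f in zip(north, fwd))
    return math.degrees(math.atan2(sin_a, cos_a)) % 360.0
```

Because the vertical is taken from the accelerometer, the same code covers both cases in the text: for a device level with the ground it reduces to a plain magnetometer compass, and for an arbitrarily tilted device it first projects both the field and the forward axis onto the horizontal plane.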
Subsequently, the distance marker identification unit 9 identifies the distance marker in the captured image from among the distance markers retrieved by the distance marker search unit 7, on the basis of the azimuths of the distance markers calculated by the azimuth calculation unit 8 and the imaging azimuth of the imaging device 2 estimated by the azimuth estimation unit 6 (step ST8).
FIG. 7 is a diagram outlining the distance marker identification process. In FIG. 7, an imaging azimuth 701 is the imaging azimuth of the imaging device 2 estimated by the azimuth estimation unit 6, with an azimuth angle of 60 degrees. Azimuths 702 and 703, shown by broken lines, are the azimuths of the two distance markers calculated by the azimuth calculation unit 8. The azimuth angle of the azimuth 702 is 50 degrees, and that of the azimuth 703 is 260 degrees.
To make the comparison with the distance marker azimuths easier, the distance marker identification unit 9 inverts the azimuth angle of the imaging azimuth 701 by 180 degrees. In FIG. 7, an azimuth 704 is the result of inverting the azimuth angle of the imaging azimuth 701 by 180 degrees, giving an azimuth angle of 240 degrees.
Next, the distance marker identification unit 9 compares the azimuth 704, obtained by inverting the imaging azimuth 701 by 180 degrees, with the azimuths 702 and 703 of the two distance markers.
For example, the distance marker identification unit 9 calculates an absolute value 705 of the difference between the azimuth angles of the azimuth 704 and the azimuth 702, calculates an absolute value 706 of the difference between the azimuth angles of the azimuth 704 and the azimuth 703, and compares the absolute values 705 and 706. The distance marker identification unit 9 identifies the distance marker with the smallest absolute azimuth-angle difference as the distance marker being imaged by the imaging device 2, that is, the distance marker in the captured image. Since the absolute value 705 is 170 degrees and the absolute value 706 is 20 degrees, the distance marker of the azimuth 703 is identified as the distance marker in the captured image.
The distance marker identification unit 9 outputs the position coordinates, in the world coordinate system, of the four corner points (reference points) of the identified distance marker to the position/orientation estimation unit 10.
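Note that the naive difference |240 − 50| is 190 degrees, while FIG. 7 gives 170 degrees for the absolute value 705; the comparison therefore has to account for wraparound at 360 degrees. A short sketch of step ST8 with the figure's numbers (function names are illustrative, not from the embodiment):

```python
def angular_difference(a, b):
    """Smallest absolute difference between two azimuth angles in degrees,
    accounting for wraparound at 360 (e.g. 240 vs 50 -> 170, not 190)."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def identify_marker(imaging_azimuth, marker_azimuths):
    """Step ST8: index of the candidate marker whose facing azimuth is
    closest to the imaging azimuth inverted by 180 degrees."""
    inverted = (imaging_azimuth + 180.0) % 360.0
    return min(range(len(marker_azimuths)),
               key=lambda i: angular_difference(inverted, marker_azimuths[i]))
```

With the values of FIG. 7, `identify_marker(60.0, [50.0, 260.0])` inverts 60 degrees to 240 degrees, obtains differences of 170 and 20 degrees, and selects the second candidate, i.e., the marker of the azimuth 703.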
On a road whose lanes are divided into an up line and a down line, when distance markers are installed every 100 meters with the start point of the down line and the end point of the up line as the origin, each distance marker is marked with a kilometer value, that is, the distance from the origin set for the road to that distance marker.
In this case, as shown in FIG. 8, the distance marker 800a on the up-line side and the distance marker 800b at the corresponding position on the down-line side are marked with the same kilometer value, and likewise the up-line distance marker 801a and the corresponding down-line distance marker 801b are marked with the same kilometer value.
In the conventional technique, the distance marker being imaged is identified solely on the basis of the kilometer value recognized from the captured image; as a result, where the up-line distance marker 800a should be identified, for example, the down-line distance marker 800b at the corresponding position could be identified by mistake.
In contrast, the distance marker identification unit 9 identifies the distance marker in the captured image from among the distance markers retrieved by the distance marker search unit 7, on the basis of the azimuths of the distance markers calculated by the azimuth calculation unit 8 and the imaging azimuth of the imaging device 2 estimated by the azimuth estimation unit 6. This makes it possible to correctly identify the up-line distance marker 800a that actually appears in the captured image.
The position/orientation estimation unit 10 estimates the position and orientation of the imaging device 2 on the basis of the correspondence between the position coordinates, in the world coordinate system, of the four corner points of the distance marker identified by the distance marker identification unit 9 and the position coordinates of those corner points in the image coordinate system (step ST9).
Since the distance marker identified by the distance marker identification unit 9 is the distance marker being imaged by the imaging device 2, the position/orientation estimation unit 10 can obtain the position coordinates of its four corner points in the image coordinate system from the distance marker recognition unit 5.
The position/orientation estimation unit 10 estimates the position and orientation of the imaging device 2, including its rotational component, by solving a PnP (Perspective-n-Point) problem using the internal parameters of the imaging device 2, on the basis of the correspondence between the world-coordinate and image-coordinate positions of the four corner points of the distance marker.
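Since the four reference points lie on one face of the marker, the PnP problem of step ST9 is the planar case, which can be solved by estimating and decomposing a homography. The sketch below is a minimal NumPy illustration under that assumption; a production system would more likely call a library routine such as OpenCV's `solvePnP`, and the parametrization of the marker plane as z = 0 is a choice made here for illustration.

```python
import numpy as np

def pose_from_planar_points(obj_xy, img_uv, K):
    """Minimal planar-PnP sketch: camera pose from four (or more)
    coplanar reference points via homography decomposition.

    obj_xy : (N, 2) reference-point coordinates on the marker plane (z = 0)
    img_uv : (N, 2) corresponding pixel coordinates in the image
    K      : (3, 3) internal-parameter matrix of the camera
    Returns (R, t) mapping marker-plane coordinates to camera coordinates.
    """
    # Direct linear transform for the homography H: s*[u, v, 1] = H [X, Y, 1].
    A = []
    for (X, Y), (u, v) in zip(obj_xy, img_uv):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)

    # For points on the plane z = 0, H ~ K [r1 r2 t]; undo K and rescale.
    B = np.linalg.solve(K, H)
    lam = 1.0 / np.linalg.norm(B[:, 0])
    if B[2, 2] < 0:          # keep the marker in front of the camera (t_z > 0)
        lam = -lam
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # Re-orthonormalize R: noise makes it only approximately a rotation.
    U, _, Vt2 = np.linalg.svd(R)
    return U @ Vt2, t
```

The camera position in marker-plane coordinates then follows as C = −Rᵀt, and together with a known camera-to-vehicle offset this yields the vehicle pose discussed below.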
When the positional relationship between the vehicle carrying the position/orientation estimation device 1 and the imaging device 2 is known, the position/orientation estimation unit 10 can accurately estimate the position and orientation of the vehicle on the basis of the estimated position and orientation of the imaging device 2 and that positional relationship. Since the vehicle position can be estimated accurately, navigation accuracy can also be improved.
Of the processes shown in FIG. 4, there is no dependency between the processing of steps ST1 to ST5 and the processing of steps ST6 to ST7, so the former and the latter may be executed in parallel.
So far, the case where the recognition target is a distance marker and the marking information is a kilometer value has been described, but Embodiment 1 is not limited to this.
For example, the recognition target may be a facility installed along a road, and the marking information may be code information marked on the facility, such as a number, a character string, an image, or a two-dimensional barcode. That is, the recognition target and the marking information may be anything that can be recognized by image recognition from the captured image.
The case where the reference points of the distance marker are its four corner points has been described, but the invention is not limited to this. For example, the reference points may be points on the edges of the distance marker or points on the face on which the kilometer value is marked.
Furthermore, the case where the position/orientation estimation device 1 is mounted on a vehicle has been described, but the invention is not limited to this. For example, the position/orientation estimation device 1 may be mounted on a moving body such as a train or an aircraft, or may be realized as a mobile terminal such as a smartphone or a tablet PC and carried by a person.
As described above, the position/orientation estimation device 1 according to Embodiment 1 searches the distance marker database 4 for the position coordinates of the reference points of the distance markers corresponding to the kilometer value recognized from the captured image, calculates the azimuth of each retrieved distance marker on the basis of the position coordinates of its reference points, identifies the distance marker in the captured image on the basis of the calculated distance marker azimuths and the imaging azimuth of the imaging device 2, and estimates the position and orientation of the imaging device 2 on the basis of the correspondence between the position coordinates of the reference points of the identified distance marker in the world coordinate system and in the image coordinate system.
By relying on the imaging azimuth of the imaging device 2 and the azimuths of the distance markers in this way, the distance marker in the captured image can be correctly identified from among distance markers marked with the same kilometer value.
The position and orientation of the imaging device 2 can be estimated accurately on the basis of the position coordinates of the reference points of the correctly identified distance marker.
Since the position of the imaged distance marker can also be identified accurately, the position/orientation estimation device 1 can be applied to recording road maintenance and inspection work.
Furthermore, since the position and orientation of the position/orientation estimation device 1 can be accurately estimated from the position and orientation of the imaging device 2, the device can also be applied to augmented reality applications that require the accurate position and orientation of an object.
In the position/orientation estimation device 1 according to Embodiment 1, the azimuth estimation unit 6 estimates the attitude of the imaging device 2 with respect to the ground on the basis of the detection information of the acceleration sensor, and estimates the imaging azimuth of the imaging device 2 on the basis of the estimated attitude. This configuration makes it possible to estimate the imaging azimuth of the imaging device 2 even when it is mounted in an arbitrary attitude with respect to the ground.
Within the scope of the present invention, any component of the embodiment may be modified, or any component of the embodiment may be omitted.
The position/orientation estimation device according to the present invention can improve the accuracy of estimating the position and orientation of an estimation target, and is therefore suitable, for example, for an in-vehicle navigation device.
1: position/orientation estimation device; 2, 100: imaging device; 2A: captured image; 3, 101: sensor device; 4: distance marker database; 5: distance marker recognition unit; 6: azimuth estimation unit; 7: distance marker search unit; 8: azimuth calculation unit; 9: distance marker identification unit; 10: position/orientation estimation unit; 102: storage device; 103: processing circuit; 104: CPU; 105: memory; 401, 601: distance marker; 604: arrow; 605: angle; 701: imaging azimuth; 702 to 704: azimuth; 705, 706: absolute value; 800a, 800b, 801a, 801b: distance marker.

Claims (6)

  1.  A position/orientation estimation device comprising:
      an object search unit that searches a database, in which position coordinates in a three-dimensional space of reference points set on recognition targets are associated with marking information indicated by the recognition targets, for recognition targets corresponding to marking information recognized from a captured image;
      an azimuth calculation unit that calculates an azimuth of each recognition target retrieved by the object search unit on a basis of the position coordinates of the reference points set on the recognition target;
      an object identification unit that identifies, from among the recognition targets retrieved by the object search unit, the recognition target in the captured image on a basis of an imaging azimuth of an estimation target that captured the captured image and the azimuths of the recognition targets calculated by the azimuth calculation unit; and
      a position/orientation estimation unit that estimates a position and an orientation of the estimation target on a basis of a correspondence between the position coordinates in the three-dimensional space of the reference points set on the recognition target identified by the object identification unit and position coordinates thereof in the captured image.
  2.  The position/orientation estimation device according to claim 1, further comprising an object recognition unit that recognizes, from the captured image captured by the estimation target, marking information and position coordinates of the reference points in the captured image.
  3.  The position/orientation estimation device according to claim 1, further comprising an azimuth estimation unit that estimates the imaging azimuth of the estimation target.
  4.  The position/orientation estimation device according to claim 3, wherein the azimuth estimation unit estimates an attitude of the estimation target with respect to the ground on a basis of detection information of an acceleration sensor, and estimates the imaging azimuth of the estimation target on a basis of the estimated attitude.
  5.  The position/orientation estimation device according to claim 1, wherein the reference points are a plurality of points on a face of the recognition target on which the marking information is marked.
  6.  A position/orientation estimation method comprising the steps of:
      recognizing, by an object recognition unit, from a captured image captured by an estimation target, marking information indicated by a recognition target and position coordinates in the captured image of reference points set on the recognition target;
      searching, by an object search unit, a database in which position coordinates in a three-dimensional space of the reference points set on recognition targets are associated with the marking information indicated by the recognition targets, for recognition targets corresponding to the marking information recognized from the captured image by the object recognition unit;
      calculating, by an azimuth calculation unit, an azimuth of each recognition target retrieved by the object search unit on a basis of the position coordinates of the reference points set on the recognition target;
      estimating, by an azimuth estimation unit, an imaging azimuth of the estimation target;
      identifying, by an object identification unit, from among the recognition targets retrieved by the object search unit, the recognition target in the captured image on a basis of the imaging azimuth of the estimation target estimated by the azimuth estimation unit and the azimuths of the recognition targets calculated by the azimuth calculation unit; and
      estimating, by a position/orientation estimation unit, a position and an orientation of the estimation target on a basis of a correspondence between the position coordinates in the three-dimensional space of the reference points set on the recognition target identified by the object identification unit and position coordinates thereof in the captured image.
PCT/JP2017/003757 2017-02-02 2017-02-02 Position/orientation estimating device and position/orientation estimating method WO2018142533A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2018562138A JP6479296B2 (en) 2017-02-02 2017-02-02 Position / orientation estimation apparatus and position / orientation estimation method
PCT/JP2017/003757 WO2018142533A1 (en) 2017-02-02 2017-02-02 Position/orientation estimating device and position/orientation estimating method
TW106113892A TW201830336A (en) 2017-02-02 2017-04-26 Position/orientation estimating device and position/orientation estimating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/003757 WO2018142533A1 (en) 2017-02-02 2017-02-02 Position/orientation estimating device and position/orientation estimating method

Publications (1)

Publication Number Publication Date
WO2018142533A1

Family

ID=63039486

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/003757 WO2018142533A1 (en) 2017-02-02 2017-02-02 Position/orientation estimating device and position/orientation estimating method

Country Status (3)

Country Link
JP (1) JP6479296B2 (en)
TW (1) TW201830336A (en)
WO (1) WO2018142533A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09119851A (en) * 1995-10-24 1997-05-06 Nissan Diesel Motor Co Ltd Position detection apparatus for vehicle
JP2007147515A (en) * 2005-11-29 2007-06-14 Denso Corp Navigation apparatus for vehicle
JP2009250718A (en) * 2008-04-03 2009-10-29 Nissan Motor Co Ltd Vehicle position detecting apparatus and vehicle position detection method
JP2016018405A (en) * 2014-07-09 2016-02-01 東日本高速道路株式会社 On-vehicle kilometer post value display terminal and kilometer post value update method used in the same

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020045345A1 (en) * 2018-08-31 2020-03-05 株式会社デンソー Sign recognition system and sign recognition method
CN112639905A (en) * 2018-08-31 2021-04-09 株式会社电装 Marker recognition system and marker recognition method
CN112639905B (en) * 2018-08-31 2023-09-29 株式会社电装 Marker identification system and marker identification method
US11830255B2 (en) 2018-08-31 2023-11-28 Denso Corporation Method and system for recognizing sign
CN113188439A (en) * 2021-04-01 2021-07-30 深圳市磐锋精密技术有限公司 Internet-based automatic positioning method for mobile phone camera shooting
WO2023013171A1 (en) * 2021-08-02 2023-02-09 ミネベアミツミ株式会社 Distance estimation device, antenna device, power feeding system, power feeding apparatus, and power feeding method
WO2023013160A1 (en) * 2021-08-02 2023-02-09 ミネベアミツミ株式会社 Distance estimation device, antenna device, power feeding system, power feeding device, and power feeding method
JP7407213B2 (en) 2022-02-22 2023-12-28 本田技研工業株式会社 Direction identification device and direction identification method

Also Published As

Publication number Publication date
JPWO2018142533A1 (en) 2019-03-14
JP6479296B2 (en) 2019-03-06
TW201830336A (en) 2018-08-16

Similar Documents

Publication Publication Date Title
CN110411441B (en) System and method for multi-modal mapping and localization
JP6479296B2 (en) Position / orientation estimation apparatus and position / orientation estimation method
US10515458B1 (en) Image-matching navigation method and apparatus for aerial vehicles
WO2021185218A1 (en) Method for acquiring 3d coordinates and dimensions of object during movement
EP2917754B1 (en) Image processing method, particularly used in a vision-based localization of a device
CN110570477B (en) Method, device and storage medium for calibrating relative attitude of camera and rotating shaft
WO2018142900A1 (en) Information processing device, data management device, data management system, method, and program
EP2491529B1 (en) Providing a descriptor for at least one feature of an image
Wendel et al. Natural landmark-based monocular localization for MAVs
WO2016199605A1 (en) Image processing device, method, and program
JPWO2015045834A1 (en) Marker image processing system
CN107044853B (en) Method and device for determining landmarks and method and device for positioning
JP4132068B2 (en) Image processing apparatus, three-dimensional measuring apparatus, and program for image processing apparatus
CN111279354A (en) Image processing method, apparatus and computer-readable storage medium
JP2022042146A (en) Data processor, data processing method, and data processing program
JP6410231B2 (en) Alignment apparatus, alignment method, and computer program for alignment
JP6922348B2 (en) Information processing equipment, methods, and programs
JP2011112556A (en) Search target position locating device, method, and computer program
Huttunen et al. A monocular camera gyroscope
KR102195040B1 (en) Method for collecting road signs information using MMS and mono camera
US20230079899A1 (en) Determination of an absolute initial position of a vehicle
JP6886136B2 (en) Alignment device, alignment method and computer program for alignment
CN113011212B (en) Image recognition method and device and vehicle
CN114842224A (en) Monocular unmanned aerial vehicle absolute vision matching positioning scheme based on geographical base map
KR20220062709A (en) System for detecting disaster situation by clustering of spatial information based an image of a mobile device and method therefor

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018562138

Country of ref document: JP

Kind code of ref document: A

121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17895095

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 17895095

Country of ref document: EP

Kind code of ref document: A1