WO2021220331A1 - Mobile body system

Mobile body system

Info

Publication number
WO2021220331A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
locus
moving body
data
mobile system
Application number
PCT/JP2020/017937
Other languages
English (en)
Japanese (ja)
Inventor
修一 槙
Original Assignee
株式会社日立産機システム
Application filed by 株式会社日立産機システム
Priority to PCT/JP2020/017937 (WO2021220331A1)
Priority to CN202080099292.2A (CN115362423A)
Priority to JP2022518431A (JP7338048B2)
Publication of WO2021220331A1


Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions

Definitions

  • The present invention relates to techniques for a mobile system, such as a technique for measuring the position of a moving body.
  • In a mobile system or the like having a function of measuring the position of a moving body (sometimes referred to as a positioning function), a sensor is installed on the moving body.
  • This sensor is of a type whose position can at least be detected, or calculated from its sensor data. Examples of this sensor include a distance measuring sensor such as a laser scanner, a GPS receiver, and the like.
  • The mobile system can realize the positioning function, a function of creating a map of the surroundings of the moving body, and the like by using the information from the sensor.
  • Patent Document 1: JP-A-2017-97402.
  • Patent Document 1 describes, as a "peripheral map creation method" or the like, that the self-position estimation device of a mobile robot creates the latest self-position/attitude data by matching the distance data of a laser range finder (LRF) against a map.
  • In some cases, a plurality of (for example, two) sensors are installed on one moving body.
  • The purpose of such an installation depends on details such as the function and application of the mobile system; examples include securing a wide detection range, calculating one position from a plurality of sensor data, and realizing a redundant configuration.
  • For example, a plurality of distance measuring sensors may be installed on the moving body.
  • The type and shape of the moving body to which they are applied, the positions and directions in which the sensors are installed, and the like vary depending on the application environment such as a factory.
  • The positions and directions in which the plurality of sensors are installed may also be changed on the same moving body.
  • The relative positional relationship between the sensors can be initially set in advance, and there is no problem as long as it does not change.
  • However, when the installation positions of the two sensors on the moving body are changed, it may be difficult to measure the changed relative positional relationship between the sensors with high accuracy, and even if the user sets it manually, doing so may take much time and effort. If the accuracy of the setting of the changed relative positional relationship between the sensors is low, functions such as the positioning function and the map creation function of the mobile system may be affected, and the accuracy of those functions may also become low.
  • An object of the present invention is to provide, for mobile system technology, a technique that can obtain the relative positional relationship between sensors even when a plurality of sensors are installed on a moving body or when their installation positions are changed, and that can thereby improve the accuracy of the positioning function and the like.
  • The mobile system of one embodiment includes a moving body; a plurality of sensors including a first sensor and a second sensor installed at different positions in the moving body coordinate system of the moving body; and a control device that realizes a positioning function of measuring at least the position of the moving body in the spatial coordinate system based on the sensor data of the plurality of sensors. The first sensor and the second sensor are of a type that can detect their own positions in the spatial coordinate system. Based on the first sensor data of the first sensor and the second sensor data of the second sensor obtained when the moving body moves in the environment, the control device identifies the position of the first sensor and the position of the second sensor in the spatial coordinate system; based on the position identification results, it acquires the first locus of the first sensor and the second locus of the second sensor over the time series; using the first locus and the second locus, it compares and collates the shapes of the loci and calculates the relative positional relationship between the position of the first sensor and the position of the second sensor in the moving body coordinate system of the moving body; and it sets information representing the calculated relative positional relationship in the moving body.
  • According to one embodiment, the relative positional relationship between the sensors can be obtained even when a plurality of sensors are installed on the moving body or when their installation positions or the like are changed, and the accuracy of the positioning function and the like can be improved.
  • FIG. 1 is a diagram showing the configuration of the mobile system of Embodiment 1 of the present invention.
  • FIG. 2 is a diagram showing the configuration of the moving body in Embodiment 1.
  • FIG. 3 is a diagram showing a configuration example of the detection range of a sensor in Embodiment 1.
  • FIG. 4 is a diagram showing a configuration example of the moving mechanism in Embodiment 1.
  • FIG. 5 is a diagram showing a functional block configuration example of the mobile system in Embodiment 1.
  • FIG. 6 is a diagram showing a configuration example of the software and hardware of the position identification device in Embodiment 1.
  • FIG. 7 is a diagram showing the main processing flow of the position identification device in Embodiment 1.
  • FIG. 8 is a diagram showing an example of the loci of the moving body and the first sensor in Embodiment 1.
  • FIG. 9 is a diagram showing an example of the loci of the moving body and the second sensor in Embodiment 1.
  • FIG. 10 is a diagram showing examples of the shape of the locus depending on the sensor position in Embodiment 1.
  • FIG. 11 is a diagram showing a generation example of relative position parameters in Embodiment 1.
  • FIG. 12 is a diagram showing an example of the tentative locus of the temporary position of the second sensor in Embodiment 1.
  • FIG. 13 is a diagram showing an example of the matching process in Embodiment 1 when time is not synchronized between the sensor data.
  • FIG. 14 is a diagram showing an example of detection by the first sensor in the environment in Embodiment 1.
  • FIG. 15 is a diagram showing an example of detection by the second sensor in the environment in Embodiment 1.
  • FIG. 16 is a diagram showing an example of the first map created by the first sensor in Embodiment 1.
  • FIG. 17 is a diagram showing an example of the association between the first map and the second map in Embodiment 1.
  • Hereinafter, the mobile system according to the first embodiment of the present invention will be described with reference to FIGS. 1 to 19.
  • The mobile system of the first embodiment has a function (sometimes referred to as a relative positional relationship calculation function) that can automatically adjust (in other words, calibrate) the relative positional relationship between a plurality of sensors provided on the moving body. Even if the installation position of a sensor on the moving body is changed, the installation position of each sensor can be set with high accuracy by the automatic adjustment of this function, with little trouble for the user. The positioning function of the mobile system can therefore be maintained with high accuracy.
  • FIG. 1 shows the configuration of the mobile system of the first embodiment.
  • This mobile system is a system applied to an environment 101 such as a factory.
  • In the environment 101, a production facility 102 is installed in a building.
  • This mobile system has a moving body 1.
  • The moving body 1 is an AGV (or an autonomous traveling robot or the like) capable of unmanned transport of a load 103 such as a product or a component.
  • The moving body 1 travels along a predetermined route in the factory and supplies, for example, a load 103 to the production facility 102.
  • The moving body 1 includes a control device 100, sensors 2, a moving mechanism 3, a mounting mechanism 4, and the like in a housing 10.
  • The control device 100 is a device that controls the moving body 1.
  • The spatial coordinate system CS of the environment 101 is represented by (X, Y, Z).
  • The origin of the spatial coordinate system CS is set at an arbitrary position.
  • The moving body 1 of FIG. 1 is arranged so that front and rear correspond to the X direction, left and right correspond to the Y direction, and the vertical (height) direction corresponds to the Z direction.
  • The coordinate system of the moving body 1 is defined as the moving body coordinate system CM and is represented by (x, y, z).
  • The origin of the moving body coordinate system CM is set at an arbitrary position, for example, a representative position of the moving body 1.
  • The moving mechanism 3 is a mechanism including, for example, wheels and a drive unit.
  • The drive unit includes, for example, a motor, a drive circuit, and the like.
  • The moving mechanism 3 in this example is a mechanism capable of traveling forward and backward and turning left and right by means of the wheels (FIG. 4 described later), but it is not limited to this.
  • The mounting mechanism 4 is a structural portion for stably mounting the load 103, and its details are not limited.
  • The mounting mechanism 4 comes in various types depending on the application and the like, and is, for example, a structural part including a conveyor or the like.
  • The housing 10 of the moving body 1 has a flat plate-shaped first portion 10a parallel to a horizontal plane and a flat plate-shaped second portion 10b standing vertically from a part of the first portion 10a, but it is not limited to this.
  • The first portion 10a is provided with a moving mechanism 3 including front and rear axles and four wheels (front, rear, left, and right).
  • The control device 100 is built into the second portion 10b. The control device 100 may instead be installed so as to be exposed on the outside of the second portion 10b or the like.
  • In this example, the side with the second portion 10b is defined as the front, the side without the second portion 10b as the rear, and the left and right directions with respect to the front-rear direction, but the definitions are not limited to this.
  • A plurality of sensors 2, two sensors 2 (2A, 2B) in this example, are installed on the moving body 1.
  • Let the two sensors 2 be the first sensor, sensor 2A, and the second sensor, sensor 2B.
  • In this example, each sensor 2 is a distance measuring sensor, specifically a two-dimensional laser scanner (laser range finder, sometimes called an LRF).
  • The control device 100 can calculate the position of a sensor 2 from the distance measurement data of that sensor 2.
  • Here, two-dimensional means that the distance to an object can be detected within a plane (a horizontal plane in this example) centered on the direction of the sensor 2.
  • The sensor 2, being a laser scanner, detects objects in each direction around the moving body 1 as feature points and measures the distances to them.
  • The sensors 2 in this example also play a role like that of safety sensors in the function of realizing safe automatic transport of the moving body 1 in the mobile system.
  • Each sensor 2 may be any type of sensor whose own position in the spatial coordinate system CS (in particular, the locus of that position over the time series) can be measured by the mobile system, and its details are not limited.
  • In other words, the sensor 2 may be at least a type of sensor with which the positioning function can be realized.
  • The sensor 2 may be of a type that can detect its own position, or of a type whose position can be calculated by the control device 100 or the like from the sensor data of the sensor 2.
  • The positioning function may be realized by the sensor 2 alone, or may be realized in combination with the control device 100 or the like.
  • For example, the sensor data output by the sensor 2 includes information on the position and orientation of the sensor 2.
  • Alternatively, the control device 100 calculates the position and orientation of the sensor 2 based on the sensor data of the sensor 2.
  • In short, the sensor 2 is of a type whose position and orientation can be detected or calculated.
  • The posture of the sensor 2 is its state of direction and rotation.
  • In the first embodiment, the control device 100 calculates the position and posture of each sensor 2 based on the distance measurement data from that sensor 2.
  • The two sensors 2 (2A, 2B) are installed at different positions (sometimes referred to as installation positions) in the moving body coordinate system CM of the moving body 1.
  • In the example of FIG. 1, the sensor 2A, which is the first sensor, is installed on the front side, and the sensor 2B, which is the second sensor, is installed at a position near the left corner of the upper surface on the rear side of the first portion 10a.
  • The plurality of sensors 2 (2A, 2B) installed on the moving body 1 of FIG. 1 are distance measuring sensors having the same functions and specifications, but they are not limited to this and may be a plurality of sensors 2 of different types and specifications.
  • The sensor 2 is not limited to a distance measuring sensor such as a laser scanner; an acceleration sensor, a gyro sensor, a geomagnetic sensor, a GPS receiver, or the like may be used.
  • FIG. 2 shows (A) a side view and (B) a top view of the configuration of the moving body 1 of FIG. 1.
  • The position of the sensor 2A (indicated by a black dot) in the moving body coordinate system CM is denoted PA and its height position ZA; the position of the sensor 2B is denoted PB and its height position ZB.
  • In this example, the height positions (ZA, ZB) of the two sensors 2 (2A, 2B) are different.
  • The height position ZA of the sensor 2A is higher than the height position ZB of the sensor 2B (ZA > ZB > 0).
  • For one height position, the positioning function of the mobile system is a function of measuring the position of the moving body in the horizontal plane corresponding to that height position.
  • Similarly, the map creation function is a function of creating a map showing the shapes of objects in the horizontal plane corresponding to that height position.
  • Accordingly, the positioning function in the first embodiment is a function of measuring the position of the moving body 1 based on the positions of the respective sensors 2 in the two horizontal planes corresponding to the two height positions.
  • The map creation function (described later) is a function of creating maps showing the shapes of objects in the two horizontal planes corresponding to the two height positions.
  • The coordinate system CA of the sensor 2A is shown as (x, y) with the position PA of the sensor 2A as its reference and origin; its x-axis is the installation direction of the sensor 2A.
  • Likewise, the coordinate system CB of the sensor 2B is shown as (x, y) with the position PB of the sensor 2B as its reference and origin; its x-axis is the installation direction of the sensor 2B.
  • The sensor 2A and the sensor 2B each perform detection within a range of a predetermined angle about their x-axis (FIG. 3 described later).
  • When three dimensions are considered, the coordinate system (CA, CB) of each sensor 2 is (x, y, z).
  • Each sensor 2 has a direction (sometimes referred to as an installation direction) that serves as the reference for installation and detection; the installation direction of the sensor 2A is denoted θA, and the installation direction of the sensor 2B is denoted θB.
  • In the case of a laser scanner, the direction of the sensor 2 is the reference direction when emitting laser light.
  • In this example, the direction of the sensor 2A is the forward direction along the X-axis, which is the x-axis direction of the coordinate system CA.
  • The direction of the sensor 2B differs from the X-axis by the relative angle Δθ; it is roughly the direction diagonally backward to the left, and is the x-axis direction of the coordinate system CB.
  • When the installation position of a sensor 2 is changed, its installation direction may change along with it.
  • The relative positional relationship calculation function in the first embodiment is a function of calculating the relative positional relationship 105, including the relationship between the installation positions and the installation directions of the sensors 2.
  • The position PA of the sensor 2A and the position PB of the sensor 2B can be expressed as coordinate values (xA, yA, zA) and (xB, yB, zB) in the moving body coordinate system CM (x, y, z). Further, the position PA of the sensor 2A and the position PB of the sensor 2B are represented by different coordinate values in the spatial coordinate system CS.
  • The relationship of the position coordinates is represented by the illustrated values Δx, Δy, Δz.
  • The values Δx, Δy, Δz are the difference values of the origin position PB (xB, yB, zB) of the coordinate system CB of the sensor 2B with respect to the origin position PA (xA, yA, zA) of the coordinate system CA of the sensor 2A.
  • For example, Δx = xB − xA.
  • Equivalently, the value Δx is the difference between the X coordinate value of the position PB of the sensor 2B and the X coordinate value of the position PA of the sensor 2A in the spatial coordinate system CS.
  • The same applies to Δy and Δz.
  • In obtaining the relative positional relationship 105, the relationship of the position and direction of the sensor 2B, the second sensor, is obtained with the sensor 2A, the first sensor, as the reference. The reverse also holds.
  • The relative positional relationship 105, in particular the relationship of the position coordinates, is also represented as a vector vAB.
  • The relationship of the directions in particular is represented by the illustrated value Δθ.
  • The value Δθ is the difference value between the directions (θA, θB) of the sensors 2 (2A, 2B).
  • The direction θA of the sensor 2A is the positive direction (angle 0 degrees) of the x-axis of the coordinate system CA.
  • The direction θB of the sensor 2B is the positive direction (angle 0 degrees) of the x-axis of the coordinate system CB.
  • The relative positional relationship calculation function in this mobile system obtains, as at least the relative positional relationship 105 between the sensors 2, the values (Δx, Δy) representing the positional relationship described above.
  • In the first embodiment, this relative positional relationship calculation function also obtains the value Δθ representing the directional relationship described above.
  • The relative positional relationship calculation function may also be a function capable of obtaining the value Δz in the Z direction of FIG. 2(A) when three dimensions are considered.
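  • As a rough illustration (not part of the patent text), the relative positional relationship 105 can be held as a small data record. A minimal sketch in Python; the names and sample offsets are assumptions for illustration only:

```python
from dataclasses import dataclass
import math

@dataclass
class RelativePositionalRelationship:
    # Offset of sensor 2B's origin and direction relative to sensor 2A's
    # coordinate system CA (hypothetical record, for illustration).
    dx: float        # Δx = xB - xA [m]
    dy: float        # Δy = yB - yA [m]
    dtheta: float    # Δθ = θB - θA [rad]
    dz: float = 0.0  # Δz; in Embodiment 1 this follows from the set heights ZA, ZB

# Illustrative values: sensor 2B mounted 0.8 m behind and 0.3 m to the left
# of sensor 2A, facing diagonally backward to the left.
rel_105 = RelativePositionalRelationship(dx=-0.8, dy=0.3, dtheta=math.radians(135.0))
```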
  • The installation position and direction of each sensor 2 are selected so as to form a predetermined detection range, as in the example of FIG. 3. Depending on the shape of the moving body 1, an appropriate installation must be selected so that, for example, the laser light is not blocked by a portion such as the mounting mechanism 4.
  • The shape and the like of the moving body 1 vary with the application environment and the application, and the installation position and direction of the sensor 2 can be changed accordingly. Further, even after a sensor 2 has been installed at a certain position with the intention that it be fixed, a deviation from that position may occur contrary to the user's intention; for example, the sensor 2 may hit something and be slightly displaced.
  • In other words, for the plurality of sensors 2, the position and direction of installation on the moving body 1 are determined, and can be changed as appropriate, according to the application environment, the application and functions of the mobile system (the map creation function, etc.), the type and shape of the moving body 1, user operation, and the like, so as to form a predetermined detection range (FIG. 3) and so on.
  • The relative positional relationship calculation function in the mobile system of the first embodiment can automatically adjust the relative positional relationship 105 between the sensors 2 with high accuracy and ease in response to such changes in the installation state of the sensors 2.
  • This relative positional relationship calculation function calculates the relative positional relationship 105 by a mechanism that matches the shapes of the loci of the sensors 2, as described later (FIG. 7). The mobile system can then realize the positioning function, the map creation function, and the like with high accuracy based on the resulting high-precision relative positional relationship 105.
  • In some cases, the height positions (ZA, ZB) of the sensors 2 differ, as in FIG. 2 and elsewhere, because of restrictions on the installation of the sensors 2 or by intentional design.
  • In that case, with the conventional technology, the maps that can be created by the map creation function described later are basically different (two) maps, one for each of the height positions that differ per sensor 2.
  • When the relative positional relationship 105 is unknown or of low accuracy, the relationship between those maps is also unknown or of low accuracy. It may therefore be difficult for the user or the moving body 1 to grasp the same object in the environment 101 of FIG. 1, such as the shape of the building or the production facility 102, from those maps.
  • In the first embodiment, the map around the moving body 1 can be created with high accuracy by the map creation function based on the highly accurate relative positional relationship 105 obtained by the relative positional relationship calculation function.
  • Further, this mobile system associates the plurality of (two) map data, corresponding to the height positions that differ per sensor 2, with each other using the relative positional relationship 105.
  • As a result, those maps can be treated in an integrated manner as one map, which makes it easier to grasp the shape and the like of the same object in the environment 101.
  • FIG. 3 shows a configuration example of the detection ranges of the sensors 2.
  • The sensor 2 emits laser light while scanning in each direction of its surroundings (a horizontal plane in this example) within the detection range from its installation position (PA, PB), and receives the laser light returning from feature points where the light hits objects in the environment 101.
  • The sensor 2 calculates the distance to the feature point in each direction from the time between the emission and the reception of the laser light, by the so-called TOF (time of flight) method.
  • The sensor data output from the sensor 2, in other words the distance measurement data, includes, at least for each time point in the time series, an angle (θ) representing the direction in which the sensor 2 looks at its surroundings and the distance value (d) corresponding to that angle.
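  • The TOF relation and the per-time-point data layout described above can be sketched as follows; this is a minimal illustration, and the function name and sample values are assumptions rather than anything specified in the patent:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    # TOF method: the light travels to the feature point and back,
    # so the one-way distance is d = c * t / 2.
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Distance measurement data at one time point: (angle θ [rad], distance d [m])
# pairs covering the sensor's angular detection range.
scan = [(-1.57, 2.4), (0.0, 3.1), (1.57, 2.7)]
```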
  • FIG. 3(A) schematically shows an example of the detection ranges of the sensors 2, corresponding to the configuration of the moving body 1 of FIG. 2, in a horizontal plane (X-Y plane).
  • Each sensor 2 takes a horizontal plane, the x-y plane in its sensor coordinate system, as its detection target, in other words its distance measurement target.
  • The detection range 301 indicates the detection range of the sensor 2A and is defined by the angle range 305 in the horizontal plane.
  • The detection range 302 indicates the detection range of the sensor 2B and is defined by the angle range 306 in the horizontal plane. In this example, the angle ranges 305 and 306 are greater than 180 degrees.
  • The detection direction of each sensor 2 is expressed by an angle (θ) with respect to its reference direction (direction θA, θB) in the horizontal plane.
  • The black dot in one such direction is an example of a feature point 303 of an object, at the distance 304 (value d).
  • The actual feature points and detection ranges cover positions farther from the moving body 1; the distance that can be measured depends on the type of sensor 2 and the like.
  • The angles (θ) in this example can take positive and negative values.
  • The detection ranges of the sensors 2 differ as illustrated; parts of the detection ranges may overlap, and there may be parts of the surroundings of the moving body 1 that cannot be detected. Even a range that cannot currently be detected can be detected by changing the position and posture of the moving body 1. As shown in the figure, providing a plurality of sensors 2 secures a wide detection range for the mobile system.
  • FIG. 3(B) shows another installation example of the same two sensors 2 (2A, 2B) as in (A), for a moving body 1 having a shape different from that of FIG. 2.
  • This moving body 1 does not have the second portion 10b of FIG. 2; it has a flat plate-shaped first portion 10a in a horizontal plane and a mounting mechanism 4 on the first portion 10a.
  • The sensor 2A is installed on the front side along the x-axis of the moving body coordinate system CM, at the left-right center position PA, in the forward direction θA.
  • The sensor 2B is installed on the rear side along the x-axis, at the left-right center position PB, in the rearward direction θB.
  • That is, the two sensors 2 are installed at positions (PA, PB) symmetrical with respect to the moving body 1.
  • The detection ranges (301, 302) of the two sensors 2 are configured as detection ranges symmetrical in the front-rear direction.
  • In this way, the installation positions, directions, detection ranges, and the like of the plurality of sensors 2 can be changed as appropriate, which makes it possible to respond to various environments and applications.
  • When the sensor 2 is a laser scanner, laser light can be emitted in each of the surrounding directions by a scan that rotationally drives the laser irradiation unit, and distance information for each direction can be obtained.
  • The distance information can be converted into position information of feature points based on the position of the sensor 2. Since the position information of surrounding objects seen from the moving body 1 or the sensor 2 represents the geometric shape of those objects, it may be described as shape data.
  • A camera or the like can also be applied as the sensor 2. For example, in the case of a stereo camera, the distance to an object can be calculated from the images of the left and right cameras.
  • A positioning system or the like using sensors (for example, RFID tags or beacons) installed in the environment rather than on the moving body 1 can also be applied.
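  • The conversion from distance information to shape data described above amounts to a polar-to-Cartesian transform based on the sensor's own position and orientation. A minimal sketch, with illustrative names only:

```python
import math

def scan_to_shape_data(scan, sensor_x, sensor_y, sensor_theta):
    # Convert (angle, distance) range data into feature-point positions
    # ("shape data") in the spatial coordinate system.
    points = []
    for angle, d in scan:
        a = sensor_theta + angle  # beam direction in the spatial frame
        points.append((sensor_x + d * math.cos(a),
                       sensor_y + d * math.sin(a)))
    return points

# Example: sensor at (1.0, 2.0) facing +X; a point 3 m straight ahead.
shape = scan_to_shape_data([(0.0, 3.0)], 1.0, 2.0, 0.0)  # -> [(4.0, 2.0)]
```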
  • FIG. 4 shows a configuration example of the moving mechanism 3 of the moving body 1 of FIG. 1, as a schematic view in a horizontal plane (X-Y plane).
  • The moving mechanism 3 is a mechanism capable of traveling forward and backward, stopping, and turning left and right; it includes two axles and four wheels and is, for example, a rear wheel drive type mechanism.
  • The moving mechanism 3 has a left wheel 401 and a right wheel 402 on the front axle 410, and a left wheel 403 and a right wheel 404 on the rear axle 420.
  • The moving mechanism 3 is, for example, a mechanism in which the speeds of the left and right wheels can be controlled independently, so that turning and the like can be controlled through the difference in speed between the left and right wheels.
  • FIG. 4(A) shows the state when traveling forward (in the X direction).
  • The position PM indicates the center point in the front-rear and left-right directions, as an example of a representative position of the moving body 1 in the moving body coordinate system CM.
  • The position PM1 is the left-right center point of the front axle 410, and the position PM2 is the left-right center point of the rear axle 420.
  • The moving mechanism 3 realizes forward traveling by driving each wheel at the same rotational speed.
  • The broken-line locus 431 indicates the future locus when traveling forward from the current position PM.
  • FIG. 4(B) shows the state when turning to the right.
  • A right turn is realized by controlling the rotational speeds of the right wheels 402 and 404 to be smaller than those of the left wheels 401 and 403.
  • The broken-line locus 432 indicates the future locus when turning right from the current position PM.
  • The moving mechanism 3 may be any mechanism capable of traveling and changing direction.
  • The moving mechanism 3 may be a mechanism in which the direction of the wheels is fixed, a mechanism in which the wheels can be steered, or a mechanism using structures other than axles and wheels, for example casters, caterpillar tracks, or a leg structure.
  • The moving mechanism 3 may also be, for example, a mechanism used in a cleaning robot or the like, for example one capable of independently controlling the direction and rotational speed of each wheel.
  • The turning motion is not limited to a turning motion that follows an arc-shaped locus as illustrated.
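  • As a sketch of the differential speed control described above (the standard unicycle approximation, assumed here rather than taken from the patent), equal wheel speeds give straight travel and slower right wheels give a right turn:

```python
def body_velocity(v_left, v_right, track_width):
    # Differential-drive base: forward speed is the mean of the wheel speeds,
    # and the yaw rate follows from their difference across the track width.
    v = (v_left + v_right) / 2.0              # forward speed [m/s]
    omega = (v_right - v_left) / track_width  # yaw rate [rad/s], CCW positive
    return v, omega

# Right turn as in FIG. 4(B): right wheels slower than left wheels.
v, omega = body_velocity(v_left=1.0, v_right=0.6, track_width=0.5)
# v = 0.8 m/s, omega = -0.8 rad/s, giving an arc-shaped locus like locus 432
```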
  • As described above, the relative positional relationship calculation function in the mobile system is a function of obtaining the relative relationship between the installation positions and installation directions of the plurality of sensors 2 in the moving body coordinate system CM.
  • The representative position of the moving body 1 can be specified in advance, and the method of specifying it is not limited.
  • This representative position may be defined using the positions (PA, PB) of the sensors 2 (2A, 2B).
  • This representative position may be, for example, the same as the position PA of one specific sensor 2, for example the sensor 2A, or may be an intermediate position between the two sensors 2 (2A, 2B).
  • This representative position may be a predetermined position on the shape of the housing 10 or the like, for example a center position or an intermediate position on an axle (for example, the positions PM1 and PM2).
  • This representative position may be a position having a predetermined relative relationship (direction and distance) from the position of a sensor 2.
  • FIG. 5 shows the functional block configuration of the mobile system of the first embodiment.
  • The moving body 1 of this mobile system includes the control device 100, the two sensors 2 (2A, 2B), the moving mechanism 3, and the like.
  • The control device 100 includes a position identification device 5 and a movement mechanism control device 6.
  • The position identification device 5 is implemented on, for example, a microcomputer or the like.
  • The movement mechanism control device 6 is implemented by, for example, a PLC (programmable logic controller) or the like.
  • In this example, the position identification device 5 and the movement mechanism control device 6 are mounted integrally as the control device 100, but the configuration is not limited to this.
  • The control device 100 may also include a portion that drives and controls the operation of the mounting mechanism 4.
  • The position identification device 5 has the positioning function (in other words, a position/orientation estimation function), an automatic transport control function, the map creation function, the relative positional relationship calculation function, and the like (FIG. 6).
  • The position identification device 5 realizes each unit, such as the sensor control unit 51, through program processing by the processor 601 of FIG. 6.
  • The position identification device 5 includes, as its units, a sensor control unit 51, a position identification unit 52, a map creation unit 53, a data storage unit 54, an adjustment unit 55, and the like.
  • The data storage unit 54 stores data and information such as the first position identification result 41A and the second position identification result 41B as position identification result data, the first map data 42A and the second map data 42B as map data, and the relative positional relationship data 43.
  • The sensor control unit 51 includes a first sensor control unit 51A and a second sensor control unit 51B.
  • The first sensor control unit 51A controls the sensor 2A and obtains the sensor data SDA from the sensor 2A.
  • The second sensor control unit 51B controls the sensor 2B and obtains the sensor data SDB from the sensor 2B.
  • The sensor data SDA and the sensor data SDB include distance measurement data for each time point in the time series, that is, distance information for each angle representing a direction.
  • The sensor control unit 51 holds the time-series sensor data, at least the data for a certain period or longer, in memory.
  • The position identification unit 52 is an element constituting the positioning function (particularly the position/orientation estimation function), and is the part that identifies the position and orientation of a sensor 2 in the spatial coordinate system using the sensor data.
  • The position identification unit 52 has a first position identification unit 52A and a second position identification unit 52B.
  • The first position identification unit 52A estimates the position and orientation of the sensor 2A based on the sensor data SDA, and takes the result as the first position identification result 41A.
  • The second position identification unit 52B estimates the position and orientation of the sensor 2B based on the sensor data SDB, and takes the result as the second position identification result 41B.
  • Specifically, the position identification unit 52 creates shape data representing the geometric shapes of surrounding objects from the distance measurement data, which is the sensor data, and compares and collates the shape data with the existing map data in the data storage unit 54. The position identification unit 52 then estimates the position and orientation of the sensor 2 in the spatial coordinate system from the collation result.
  • The map creation unit 53 is an element constituting the map creation function, and is the part that creates and updates the map data of the environment 101 (FIG. 1) while using the processing results of the position identification unit 52.
  • The map creation unit 53 uses the shape data created by the position identification unit 52 to create new map data and to update existing map data.
  • The map creation unit 53 has a first map creation unit 53A and a second map creation unit 53B.
  • The first map creation unit 53A creates the first map data 42A using the sensor data SDA and the first position identification result 41A.
  • The second map creation unit 53B creates the second map data 42B using the sensor data SDB and the second position identification result 41B.
  • The first map data 42A is data representing the shapes of objects around the moving body 1 in the horizontal plane at the height position ZA of FIG. 2.
  • The second map data 42B is data representing the shapes of objects around the moving body 1 in the horizontal plane at the height position ZB.
  • The data storage unit 54 temporarily stores each piece of data created by the above processing, such as the first position identification result 41A, the second position identification result 41B, the first map data 42A, and the second map data 42B.
  • The adjustment unit 55 is the element constituting the relative positional relationship calculation function; in other words, it is a relative positional relationship calculation unit.
  • The adjustment unit 55 performs the calculation of the relative positional relationship between the sensors 2 (the relative positional relationship 105 of FIGS. 1 and 2) while referring to each piece of data (41A, 41B, 42A, 42B) in the data storage unit 54.
  • This processing is processing that calibrates between the sensor coordinate systems, and is processing related to setting the position coordinates of each sensor 2 in the moving body coordinate system CM.
  • The adjustment unit 55 stores the result in the data storage unit 54 as the relative positional relationship data 43. This corresponds to the latest setting related to the sensors 2 and the positioning function, in other words an automatic setting update.
  • The relative positional relationship data 43 is data including sensor relative coordinate information and the like; specifically, it is data including the values (Δx, Δy) representing the positional relationship and the value (Δθ) representing the directional relationship of the relative positional relationship 105 as shown in FIG. 2.
  • The position identification device 5 may output the relative positional relationship data 43 and the like in the data storage unit 54 to the user in the form of a display or the like.
  • For example, the position identification device 5 provides a setting screen related to the sensors 2 in the form of, for example, a Web page, displays relative positional relationship information based on the relative positional relationship data 43 on that setting screen, and allows confirmation or manual setting by the user.
  • The relative positional relationship information displayed on the setting screen may be, for example, the above values (Δx, Δy, Δθ), or the relative positional relationship information may be displayed graphically against an image showing the external configuration of the moving body 1 as in FIG. 2.
  • In addition, the business operator can set initial values of the relative positional relationship of the sensors 2 in advance on the setting screen of the position identification device 5. After this initial setting, while the relative positional relationship calculation function is enabled, the setting of the relative positional relationship data 43 can be updated automatically even without manual setting by a person.
  • The mobile system may take a form in which a device such as a PC (the PC 110 of FIG. 1) is further connected to the control device 100 of the moving body 1 by communication.
  • The device such as a PC is provided with an OS, application programs, and the like.
  • Examples of such application programs include those that handle user setting of the mobile system and processing related to the positioning function, the automatic transport function, the map creation function, and the like, and those that support a function of setting automatic transport routes, a function for the user to browse maps, and the like.
  • The user can operate the device such as the PC and use those functions on its display screen.
  • The user can also confirm the relative positional relationship data 43 on the display screen of the device.
  • The adjustment unit 55 or the map creation unit 53 of the position identification device 5 further performs processing that associates the plurality of (two) map data (the first map data 42A and the second map data 42B) with each other using the relative positional relationship data 43. As a result, the plurality of (two) map data can be treated roughly as one map data as a whole.
  • The position identification device 5 may also create one map data from the plurality of (two) map data by synthesis or the like. The user can browse that one map data on the display screen of the PC 110.
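  • Associating the two maps using the relative positional relationship 105 amounts to mapping points of one map frame into the other by a planar rigid transform; a minimal sketch under that assumption:

```python
import math

def map_b_point_to_map_a(xb, yb, dx, dy, dtheta):
    # Express a point from sensor 2B's map frame in sensor 2A's map frame,
    # using the relative positional relationship (Δx, Δy, Δθ).
    return (dx + xb * math.cos(dtheta) - yb * math.sin(dtheta),
            dy + xb * math.sin(dtheta) + yb * math.cos(dtheta))
```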
  • The movement mechanism control device 6 is a part including a drive control circuit and the like, and controls the operation of the moving mechanism 3 using the position identification result data and the map data produced by the position identification device 5.
  • The movement mechanism control device 6 includes a position identification device control unit 61 and a movement mechanism control unit 62.
  • The position identification device control unit 61 communicates with the position identification device 5 and acquires the data necessary for control from the position identification device 5.
  • The movement mechanism control unit 62 controls the traveling and turning operations of the moving mechanism 3 based on the position of the moving body 1 grasped by the positioning function and the map data created by the map creation function.
  • FIG. 6 shows an implementation configuration example, including software and hardware, of the position identification device 5 of FIG. 5.
  • The position identification device 5 includes a processor 601, a memory 603, an auxiliary storage device 605, a communication interface device 607, an input/output interface device 608, a power supply device 609, and the like, and these are connected to one another via a bus or the like.
  • The moving body 1 may also be provided with a mechanism such as an operation unit for the user to operate the moving body 1.
  • The processor 601 is composed of, for example, a CPU, ROM, RAM, and the like; in other words, it is a controller.
  • The processor 601 and the like may also be implemented by a programmable hardware circuit such as an FPGA.
  • The processor 601 reads a program stored in the auxiliary storage device 605 or the like into the memory 603, expands it, and executes processing according to the program. As a result, each unit of FIG. 5, such as the position identification unit 52, is realized as an execution module.
  • The memory 603 stores the control program 630, the data 640 processed by the processor 601, and the like.
  • The control program 630 includes a sensor control program 631, a position identification program 632, a map creation program 633, an adjustment program 635, and the like, whereby the sensor control unit 51, the position identification unit 52, the map creation unit 53, the adjustment unit 55, and the like of FIG. 5 are realized.
  • The processed data 640 includes data such as map data, position identification results, and relative positional relationship data (corresponding to the data of the data storage unit 54 in FIG. 5).
  • The auxiliary storage device 605 is composed of a non-volatile memory or storage device, for example a storage medium such as a disk or memory card, or a DB server on a communication network, and stores programs and various data in advance.
  • The auxiliary storage device 605 stores, for example, map data 651, sensor position identification result data 652, sensor relative positional relationship data 653, and the like. As necessary, the processor 601 reads data from the auxiliary storage device 605 into the memory 603, and writes and saves data from the memory 603 into the auxiliary storage device 605.
  • The map data 651 is environment map data and may be a map database (DB) storing a plurality of map data; it includes map data corresponding to the first map data 42A and the second map data 42B of FIG. 5. Each map data is composed of, for example, an image.
  • The sensor position identification result data 652 is the position identification result data of each sensor 2 (2A, 2B), corresponding to the first position identification result 41A and the second position identification result 41B of FIG. 5, and contains information representing the position and orientation of each sensor 2.
  • The sensor relative positional relationship data 653 is data representing the relative positional relationship between the sensors 2 (2A, 2B); in other words, it is setting data for the automatic adjustment of the sensors 2, and corresponds to the relative positional relationship data 43 of FIG. 5.
  • The communication interface device 607 performs communication processing according to the respective communication interfaces with the sensors 2 and the movement mechanism control device 6, and with external devices such as a PC or a server.
  • The communication interfaces may be wired or wireless, and may use short-range communication or remote communication.
  • An input device (for example, a keyboard) or an output device (for example, a display device) may be connected via the input/output interface device 608, and an input device or an output device may be mounted on the moving body 1.
  • The power supply device 609 is composed of a battery or the like and supplies electric power to each part.
  • The processor 601 has, as functions realized by program processing and the like, at least the above-mentioned positioning function (in other words, position/orientation estimation function), automatic transport control function, map creation function, and relative positional relationship calculation function.
  • The positioning function is a function of measuring the position of the moving body 1 based on the sensors 2.
  • The position/orientation estimation function is a function of estimating the position and orientation of a sensor 2.
  • The positioning function and the position/orientation estimation function are mainly realized by the position identification unit 52 of FIG. 5.
  • The automatic transport control function is a function of controlling the automatic transport of the moving body 1, for example a function of traveling along a route set in the environment 101 (FIG. 1) without colliding with surrounding objects.
  • The map creation function is a function of creating and updating a map of the environment 101 based on movement within the environment 101 and the sensors 2, and is mainly realized by the map creation unit 53 of FIG. 5.
  • The relative positional relationship calculation function is a function of calculating the relative positional relationship 105 (FIG. 1 and elsewhere) between the sensors 2 and performing automatic adjustment, and is mainly realized by the adjustment unit 55 of FIG. 5.
  • The positioning function is a function in which the control device 100 of FIG. 1 and elsewhere measures or estimates the position and orientation of the moving body 1 in the spatial coordinate system CS of the environment 101 by calculation using the sensor data from the two sensors 2.
  • The control device 100 uses the loci of the positions of the sensors 2 in the spatial coordinate system CS, grasped through this function, for the calculation by the relative positional relationship calculation function.
  • The automatic transport control function is a function realized by using the positioning function and the position/attitude estimation function.
  • The moving body 1 controls suitable automatic transport, for example safe transport along a route, based on the position and posture of the moving body 1 grasped by those functions. Since this mobile system includes a plurality of (two) sensors 2, the overall detection range can be made wide and the position and the like can be detected stably by the positioning function; as a result, suitable automatic transport can be realized.
  • This mobile system may have, as functions configured using the plurality of sensors 2, not only the positioning function but other functions as well; in the first embodiment, it has the map creation function.
  • The map creation function, together with the position/orientation estimation function, constitutes a so-called SLAM (Simultaneous Localization and Mapping) function.
  • SLAM is a method in which a moving body such as an automated guided vehicle estimates its own position and posture based on detection of the surrounding situation by a sensor and, at the same time, creates and updates a map of the surroundings. That is, the map creation function is a function that can automatically create or update a map of the surroundings of the moving body 1 in the environment 101 based on the sensor data of the sensors 2 as the moving body 1 travels.
  • This map, corresponding to the type of the sensor 2 (a two-dimensional laser scanner), is a map in a horizontal plane, configured as image data representing the shapes of objects in the environment 101. Since this mobile system includes a plurality of (two) sensors 2, map creation by the map creation function can be performed all the more suitably.
  • The relative positional relationship calculation function is a function of calculating, and automatically setting, the relative positional relationship 105 between the sensors 2 in the moving body coordinate system CM based on the sensor data of the two sensors 2 (2A, 2B).
  • The moving body 1 automatically activates this function during normal traveling and automatically adjusts the relative positional relationship 105.
  • The control device 100 detects and measures surrounding objects with the sensors 2 when moving in the environment 101 (FIG. 1), for example when automatically transporting along a set route.
  • The control device 100 creates shape data representing the shapes of objects around the moving body 1, referenced to the position of the sensor 2, from the distance measurement data that is the sensor data from the sensor 2.
  • The control device 100 estimates the current position and orientation of the moving body 1 in the environment 101 (and in the corresponding map data) by comparing and collating the shape data with the existing map data stored in the control device 100. This is the position identification result.
  • This estimation is, for example, a process of evaluating and judging the degree of matching or similarity between the shape data and the map data within a predetermined search range; the degree can be evaluated, for example, from the number of overlaps in units of pixels.
  • The control device 100 controls suitable movement while estimating the current position and posture of the moving body 1 for each section along the route.
  • The control device 100 determines the next target position on the route from the current position and posture, and controls the moving mechanism 3 so as to move to that target position. In doing so, the control device 100 controls the moving mechanism 3 so that the relationship between the surrounding objects and the current position and posture of the moving body 1 remains suitable.
  • While estimating the position and posture, the control device 100 also creates and registers new map data using the shape data created above, or updates the existing map data using that shape data. The control device 100 similarly repeats the above processing for each section along the route.
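  • The pixel-overlap evaluation described above might be realized as follows; this is one assumed concrete form (the rasterize helper is hypothetical), not the patent's prescribed implementation:

```python
def overlap_score(shape_pixels, map_pixels):
    # Degree of matching as the number of overlapping occupied pixels
    # between rasterized shape data and the existing map data.
    return len(set(shape_pixels) & set(map_pixels))

def estimate_pose(shape_points, map_pixels, candidate_poses, rasterize):
    # Search the predetermined range of candidate poses and keep the pose
    # whose transformed shape data best overlaps the map.
    best_pose, best_score = None, -1
    for pose in candidate_poses:
        score = overlap_score(rasterize(shape_points, pose), map_pixels)
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose
```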
  • FIG. 7 shows the flow of the main processing (particularly the relative positional relationship calculation) by the position identification device 5 of the control device 100 of FIG. 5, in particular by its adjustment unit 55.
  • The flow of FIG. 7 has steps S1 to S5.
  • In step S1, the adjustment unit 55 inputs/acquires the data of the position identification results (41A, 41B) over a time series of at least a certain length, in other words the locus data, obtained as the moving body 1 travels.
  • Each position identification result includes information on the position and orientation (an angle representing the direction) of the sensor 2 in the spatial coordinate system CS.
  • The adjustment unit 55 also generates relative position parameters 700 (Δx, Δy).
  • A relative position parameter 700 is a parameter representing the relationship between the positions (PA, PB) of the sensors 2 (2A, 2B) in the moving body coordinate system CM of the moving body 1, as in FIG. 2.
  • A relative position parameter 700 has, for example, a difference value Δx in the x-axis direction and a difference value Δy in the y-axis direction of the position PB of the sensor 2B relative to the position PA of the sensor 2A.
  • The height positions (ZA, ZB) and the difference (Δz) in the z-axis direction are excluded from the calculation because they are set values.
  • In the notation below, the subscript k represents a certain time point, and the subscript T represents the last time point.
  • In step S2, with respect to the locus 701 (sometimes referred to as the first locus) of the first position identification results 41A of the sensor 2A, which is the first sensor, the adjustment unit 55 generates, for each relative position parameter 700 from step S1, a locus 703 (sometimes referred to as a tentative locus) of a "temporary position" (denoted VPB) of the sensor 2B, which is the second sensor.
  • This "temporary position" sets a provisional position of the sensor 2B for the matching process; it corresponds to the position reached from the position PA of the sensor 2A at each time point (t) by applying the relative position parameter 700 (Δx, Δy) as a vector.
  • Temporary positions VPB are generated as a plurality of candidates (VPB1, ..., VPBn).
  • The locus 701 of the first position identification results 41A of the sensor 2A is expressed as time-series data, for example {(xA_1, yA_1), ..., (xA_k, yA_k), ..., (xA_T, yA_T)}.
  • The locus 703 of the temporary positions of the sensor 2B is expressed as time-series data, for example {(vxB_1, vyB_1), ..., (vxB_k, vyB_k), ..., (vxB_T, vyB_T)}.
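  • A sketch of the step S2 generation of the tentative locus follows. That the offset (Δx, Δy) is rotated by the heading θA_t of the sensor 2A at each time point is an assumption about how the vector is applied, since the text above leaves it implicit:

```python
import math

def tentative_locus(locus_a, headings_a, dx, dy):
    # Offset each point (xA_t, yA_t) of the first locus by the candidate
    # relative position parameter (dx, dy), applied in sensor 2A's frame.
    out = []
    for (xa, ya), th in zip(locus_a, headings_a):
        vx = xa + dx * math.cos(th) - dy * math.sin(th)
        vy = ya + dx * math.sin(th) + dy * math.cos(th)
        out.append((vx, vy))  # temporary position VPB at time t
    return out
```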
  • In step S3, the adjustment unit 55 matches the locus 703 of the temporary positions VPB generated in step S2 against the locus 702 (sometimes described as the second locus) of the second position identification results 41B of the sensor 2B.
  • This matching process is a process of evaluating and judging the degree of matching or similarity of the shapes of the loci.
  • In this matching, a "matching degree" (denoted K) and a rotation parameter (denoted R) are calculated. The adjustment unit 55 stores the calculated matching degree K and rotation parameter R in memory for each candidate relative position parameter 700.
  • The locus 702 of the second position identification results 41B is expressed as time-series data, for example {(xB_1, yB_1), ..., (xB_k, yB_k), ..., (xB_T, yB_T)}.
  • In step S4, from the results of step S3, the adjustment unit 55 determines and extracts the relative position parameter 700 (Δx, Δy) corresponding to the pair of loci with the highest matching degree K as the optimum relative position 704 (Δx_opt, Δy_opt), and stores it in memory. Up to this point, the positional relationship (Δx, Δy) of the relative positional relationship 105 of FIG. 1 and elsewhere has been obtained.
  • In step S5, the adjusting unit 55 further calculates the optimum value for the relationship (Δθ) of the directions (θ) between the sensors 2.
  • The adjusting unit 55 uses the optimum relative position 704 (Δx_opt, Δy_opt) from step S4 to generate a quantity {θB_t + R − θA_t} that represents the relationship (Δθ) of the directions (θ).
  • The angle θB_t is an angle representing the posture of the sensor 2B at each time point (t), included in the second position identification result 41B of the sensor 2B.
  • The angle θA_t is an angle representing the posture of the sensor 2A at each time point (t), included in the first position identification result 41A of the sensor 2A.
  • The term "+R" is the addition of the rotation parameter R; this addition is an operation that adjusts the quantity to the coordinate system CA of the first sensor (sensor 2A) when obtaining the relationship of the directions.
  • The term "−θA_t" is an operation that takes the difference between the angle θB_t and the angle θA_t.
  • The adjusting unit 55 takes the average of the quantities {θB_t + R − θA_t} generated at the respective time points, determines it as the optimum relative direction 705 (Δθ_opt), and stores it in the memory.
  • This average value is expressed as Σ{θB_t + R − θA_t} / T.
  • The optimum relative direction 705 (Δθ_opt) is the optimum value for the relationship (Δθ) of the directions between the sensors 2, expressed as a difference of angles. A minimal sketch of this averaging is given below.
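A minimal sketch of the step S5 averaging, assuming angles in radians. The circular mean used here is an implementation choice to avoid wrap-around bias near ±180 degrees; the patent itself states the plain average Σ{θB_t + R − θA_t} / T.

```python
import numpy as np

def optimal_relative_direction(theta_b, theta_a, rotation_r):
    """Estimate the optimum relative direction (Delta-theta_opt).

    theta_b, theta_a : (T,) arrays of sensor 2B / 2A headings per time point [rad]
    rotation_r       : rotation parameter R found in the matching step [rad]
    """
    diffs = theta_b + rotation_r - theta_a
    # Circular mean of the per-time-point differences (handles wrap-around).
    return np.arctan2(np.mean(np.sin(diffs)), np.mean(np.cos(diffs)))
```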
  • Step S5 shows one example of calculating the directional relationship, but the present invention is not limited to this.
  • The adjusting unit 55 stores information including the optimum relative position 704 (Δx_opt, Δy_opt) and the optimum relative direction 705 (Δθ_opt) obtained as described above as the relative positional relationship data 43 in FIG. 5.
  • The relative positional relationship data 43 corresponds to the values (Δx, Δy, Δθ) representing the relative positional relationship 105 in FIG. 1.
  • FIG. 8A shows an example of the movement of the moving body 1 in a horizontal plane and the locus at that time.
  • The moving body 1 (here represented by the typical position PM in FIG. 4) is initially at the position P1 at the time point t1, facing forward (the X direction in the spatial coordinate system CS, the x direction in the moving body coordinate system CM), and travels straight ahead to the position P2 at the time point t2.
  • The moving body 1 makes a right turn from the position P2 so as to pass through the position P3 at the time point t3.
  • The moving body 1 faces to the right at the position P4 at the time point t4.
  • The locus 800 shown by the broken line indicates the locus of the movement of the moving body 1 (position PM) described above, specifically the locus through which the position PM2 of the rear axle 420 of FIG. 4 passes.
  • The locus 701 shown by the solid line indicates the locus (first locus) through which the position PA of the sensor 2A on the front side passes along with this movement.
  • FIG. 8B extracts and shows only the locus 701 of the sensor 2A (position PA) from (A).
  • This locus 701 includes a straight portion 701a during the initial straight travel, a curved portion 701b (in other words, an arc) during the right turn, and a straight portion 701c during the subsequent straight travel to the right.
  • The first position identification unit 52A in FIG. 5 obtains such locus data based on the sensor 2A.
  • This locus data has position coordinates in the horizontal plane, corresponding to the height position ZA of FIG. 2, at each time point in the time series. Although the locus is drawn as a line, in detail it is a point cloud.
  • FIG. 9A corresponds to FIG. 8A, and the locus 800 of the moving body 1 is the same.
  • The locus 702 shown by the solid line indicates the locus (second locus) through which the position PB of the sensor 2B on the rear side passes along with this movement.
  • FIG. 9B extracts and shows only the locus 702 of the sensor 2B (position PB) from (A).
  • This locus 702 includes a straight portion 702a during the initial straight travel, a curved portion 702b (in other words, an arc) during the right turn, and a straight portion 702c during the subsequent straight travel to the right.
  • The second position identification unit 52B in FIG. 5 obtains such locus data based on the sensor 2B.
  • This locus data has position coordinates in the horizontal plane, corresponding to the height position ZB of FIG. 2, at each time point in the time series.
  • FIG. 10 shows, together with the locus 800 of the moving body 1 and the locus 701 (first locus) of the sensor 2A as in FIG. 9, two loci 702 (second loci) for different installation positions PB of the sensor 2B, as the example loci 7011 and 7012.
  • The locus 7011 is an example of a locus based on actual measured values when the position PB of the sensor 2B in the moving body coordinate system CM is the position PB1, and the locus 7012 is an example of a locus when the position PB is the position PB2.
  • For each locus (7011, 7012) of the sensor 2B, it can be seen that the radius of the arc portion in the locus changes according to the distance (for example, the distances 1001, 1002) from the turning center of the moving body 1 (for example, the locus 800 of the position PM of the moving body 1) to the position PB (PB1, PB2).
  • Using locus data that includes, over a certain period of time or more, a locus such as a straight line corresponding to straight-line motion and a locus such as an arc corresponding to a direction-changing motion such as turning, the mobile system can calculate the relative positional relationship by the matching process in step S3 of FIG. 7.
  • Each relative position parameter 700 is set so that each such position PB becomes a temporary position VPB.
  • FIG. 11A shows an example of generating the relative position parameters 700 (Δx, Δy) in step S1 of FIG. 7.
  • The adjusting unit 55 can select a plurality of relative position parameters 700 (Δx, Δy) in each direction and at each distance in the horizontal plane from the position PA of the reference sensor 2A at each time point (t).
  • The relative position parameters 700 (Δx, Δy) are generated as a plurality of candidate values shifted within a predetermined range based on, for example, the initial setting value of the relative positional relationship (the relative positional relationship data 43 in FIG. 5).
  • The adjusting unit 55 sets a temporary position VPB according to each relative position parameter 700 on a grid (a lattice of position coordinate points), as shown in the figure.
  • The temporary position VPB0 is the value generated by the relative position parameters (Δx0, Δy0) corresponding to the initial setting value of the relative positional relationship.
  • The temporary positions VPB1 and VPB2 are the values generated by other relative position parameters (Δx1, Δy1) and (Δx2, Δy2).
  • The range in which the adjusting unit 55 generates the relative position parameters 700 may be the range in which the sensor 2 can be installed, based on the shape of the moving body 1, or may be a predetermined range centered on the initial setting value of the relative positional relationship.
  • The direction (x-axis, y-axis) of the sensor 2B (coordinate system CB) at each temporary position VPB is a constant direction based on the initial setting value of the relative positional relationship. A minimal sketch of this candidate generation is given below.
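A minimal sketch of the step S1 candidate generation on a grid. The search half-width and pitch below are illustrative assumptions; the patent only requires candidates within a predetermined range based on the initial setting value, or within the installable area of the moving body.

```python
import numpy as np
from itertools import product

def candidate_offsets(dx0, dy0, half_range=0.5, step=0.05):
    """Enumerate candidate relative position parameters 700 on a grid.

    (dx0, dy0) : initial setting value of the relative positional relationship
    half_range : half-width of the search window around the initial value [m]
    step       : grid pitch [m]
    Returns a list of (dx, dy) candidate tuples.
    """
    shifts = np.arange(-half_range, half_range + step, step)
    return [(dx0 + sx, dy0 + sy) for sx, sy in product(shifts, shifts)]
```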
  • FIG. 11B shows an example of the parameters used when setting a temporary direction ("temporary direction": Vθ) of the sensor 2B at the temporary position VPB.
  • Various temporary directions (Vθ) may be generated by using the parameters of the temporary direction (Vθ) of the sensor 2B at each temporary position VPB corresponding to each relative position parameter 700.
  • For the parameter of the temporary direction (Vθ), for example, a plurality of candidates are generated within a predetermined range based on the initial setting value of the relative positional relationship.
  • Three examples, Vθ0, Vθ1, and Vθ2, are shown as parameters of the temporary direction (Vθ).
  • The parameter of the temporary direction (Vθ) can be defined by using the angle difference with respect to the direction (θA, x-axis) of the reference sensor 2A, or the rotation parameter R described above. A minimal enumeration sketch is given below.
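A minimal sketch of enumerating the temporary-direction candidates. The range and step are illustrative assumptions, not values from the patent.

```python
import numpy as np

def candidate_directions(theta0, half_range=np.deg2rad(10), step=np.deg2rad(1)):
    """Enumerate candidate temporary directions (V-theta) around the
    initial setting value theta0 [rad], expressible as an angle difference
    from the reference sensor 2A direction or as the rotation parameter R.
    """
    return theta0 + np.arange(-half_range, half_range + step, step)
```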
  • In step S2, the locus 703 of the temporary positions VPB of the sensor 2B is generated from the locus 701, which is the first position identification result for the position PA of the sensor 2A, according to each relative position parameter 700 (Δx, Δy) from step S1.
  • For the locus 703 corresponding to a certain candidate relative position parameter 700, the same relative position parameter 700 is applied at every time point (t).
  • FIG. 12 shows an example in which the locus 703 (provisional locus) of each temporary position VPB corresponding to each relative position parameter 700 is generated in step S2.
  • The provisional locus is indicated by an alternate long and short dash line.
  • First, the temporary positions VPB are set from each position PA (PA1, PA2, ...) having a direction (θA) at each time point (t = t1, t2, ...), using the relative position parameter 700 (Δx, Δy).
  • Each position at each time point (t) is shown as {(vxB_1, vyB_1), (vxB_2, vyB_2), (vxB_3, vyB_3), (vxB_4, vyB_4), (vxB_5, vyB_5)}.
  • One provisional locus corresponds to the case where the relative position parameter 700 (Δx, Δy) is the same at every time point.
  • The generated provisional loci 1201 and 1202 of the respective temporary positions VPB have arcs (curved portions 1201b and 1202b) of different radii depending on the distance from the turning center.
  • The shape of the provisional locus therefore differs from the shape of the first locus.
  • FIG. 13 is an explanatory diagram of an example of the matching process in step S3 of FIG. 7.
  • The provisional locus of the temporary position VPB of the second sensor, generated based on the first locus of the first sensor and a relative position parameter 700, is compared and collated in shape with the second locus of the second sensor, in various orientations using the rotation parameter R.
  • FIG. 13A shows, as pairs of loci to be compared, the locus 703 (provisional locus) shown by the alternate long and short dash line and a plurality of loci (for example, the loci 7021 and 7022) based on the locus 702 (second locus) shown by the solid line.
  • The locus 703 is the provisional locus of a certain temporary position VPB corresponding to a certain relative position parameter 700 (Δx, Δy).
  • The plurality of loci (7021 and the like) are loci generated in different directions using the rotation parameter R.
  • The plurality of loci (7021 and the like) are superposed on the locus 703 so that the positions at the first time point (t1) coincide.
  • The adjusting unit 55 compares the locus 703 of the temporary position VPB with each of the plurality of loci (7021 and the like) and calculates the degree of coincidence K.
  • Specifically, at each time point (t), the adjusting unit 55 calculates the distance 1301 between corresponding positions on the locus 703 of the temporary position VPB (for example, the position 1300 corresponding to the time point t3) and on each of the loci 7021 to 7025.
  • The adjusting unit 55 then calculates the degree of coincidence K according to the sum of these distances; roughly, the degree of coincidence K is defined and calculated so that the smaller the sum, the higher the degree of coincidence K.
  • The comparison targets need not be generated by changing the rotation parameter R in the locus 702 (second locus); without limitation, the rotation parameter R may instead be changed in the locus 703 (provisional locus) to generate the comparison targets.
  • The important point of the matching process is to determine the degree of coincidence of the shapes between loci that include a curved portion, and the provisional locus may be generated from either the first locus or the second locus. A minimal sketch of this matching step is given below.
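A minimal sketch of this matching step, assuming time-synchronized loci of equal length and a discrete set of candidate rotation parameters. The distance measure follows the per-time-point distances 1301 described above, with K defined as the negative sum so that a higher K means better coincidence; names are illustrative.

```python
import numpy as np

def matching_degree(provisional, second_locus, rotations):
    """Score one candidate relative position parameter by locus-shape matching.

    provisional  : (T, 2) provisional locus of the temporary position VPB
    second_locus : (T, 2) locus 702 of the second position identification result
    rotations    : iterable of candidate rotation parameters R [rad]
    Returns (best K, best R) over the rotation candidates.
    """
    # Align both loci at the first time point (t1).
    p = provisional - provisional[0]
    q = second_locus - second_locus[0]
    best_k, best_r = -np.inf, 0.0
    for r in rotations:
        c, s = np.cos(r), np.sin(r)
        # Rotate the second locus about the shared start point by r
        # (row vectors, so multiply by the transposed rotation matrix).
        q_rot = q @ np.array([[c, s], [-s, c]])
        k = -np.linalg.norm(p - q_rot, axis=1).sum()
        if k > best_k:
            best_k, best_r = k, r
    return best_k, best_r
```

In step S4, the adjusting unit would then evaluate this for every candidate relative position parameter 700 from step S1 and keep the candidate whose best K is highest.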
  • In step S5, the optimum relative direction (Δθ_opt) is calculated from the quantity {θ2_t + R − θ1_t}.
  • FIG. 13B shows an example of the relationship between the locus 701 of the sensor 2A, the locus 702 of the sensor 2B, the locus 703 of the temporary position VPB, the optimum relative position 704 (Δx_opt, Δy_opt), and the rotation parameter R.
  • The quantity {θ2_t + R − θ1_t} indicates the relationship between the direction of the sensor 2A (θ1_t1) and the direction of the sensor 2B (θ2_t1) at the time point t1.
  • As the relative positional relationship calculation function, the control device 100 performs the locus-shape matching process in step S3 of FIG. 7. At that time, the matching process is possible regardless of whether the detection time points (t) are synchronized between the sensor data SDA of the first sensor and the sensor data SDB of the second sensor. Even when the times are not synchronized between the sensor data SDA and the sensor data SDB, the control device 100 can, for example, perform the matching by trying correspondences between points of the first locus of the first sensor (specifically, the provisional locus generated based on the first locus) and points of the second locus of the second sensor, as described below.
  • FIG. 14 shows an example of the matching process when the times are not synchronized between the sensor data.
  • The locus 1401 (provisional locus) shown by the alternate long and short dash line is generated from, for example, the locus of the first position identification result 41A based on the sensor data SDA of the sensor 2A and a certain relative position parameter 700; it is the locus of the temporary positions VPB of the sensor 2B.
  • The locus 1402 shown by the solid line is, for example, the locus (second locus) of the second position identification result 41B based on the sensor data SDB of the sensor 2B.
  • The time points (t) of the two sets of sensor data are not synchronized with each other.
  • For example, the time point t1 on the locus 1401 and the time point t1 on the locus 1402 are different times.
  • Note that FIG. 14 does not consider the rotation parameter R.
  • The control device 100 tries comparisons for each pair so that, for example, the position at a certain time point on the locus 1401 corresponds in turn to the position at each time point (at least some candidates) on the locus 1402.
  • FIG. 14B shows the case where, when the locus 1402 is superposed on the locus 1401, the position p1 at the time point t1 of the locus 1401 is aligned with the position p11 at the time point t1 of the locus 1402.
  • FIG. 14C shows the case where, when the locus 1402 is superposed on the locus 1401, the position p1 at the time point t1 of the locus 1401 is aligned with the position p12 at the time point t2 of the locus 1402.
  • The method of taking the distance between the loci is not limited to the example of the distance 1301 in FIG. 13; for example, the shortest distance from a point of one locus to a point of the other locus may be taken.
  • The control device 100 may then select, from the results of the trials described above, the one having the highest degree of coincidence K. A minimal sketch of such trials is given below.
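A minimal sketch of such trials, assuming the clock mismatch can be modeled as an integer index shift between the two point sequences. The rotation parameter R is omitted, as in FIG. 14, and `max_shift` is an illustrative bound; names are not from the patent.

```python
import numpy as np

def best_time_alignment(provisional, second_locus, max_shift=10):
    """Try correspondences when the two sensor clocks are not synchronized.

    A point of the provisional locus is aligned in turn with each candidate
    point of the second locus (FIG. 14 (B), (C)); each trial is scored by
    the summed point distances after aligning the paired start points, and
    the shift with the highest degree of coincidence K is kept.
    """
    best_k, best_shift = -np.inf, 0
    for shift in range(max_shift):
        n = min(len(provisional), len(second_locus) - shift)
        if n < 2:
            break
        p = provisional[:n] - provisional[0]
        q = second_locus[shift:shift + n] - second_locus[shift]
        k = -np.linalg.norm(p - q, axis=1).sum()
        if k > best_k:
            best_k, best_shift = k, shift
    return best_shift, best_k
```

The nearest-point distance variant mentioned above could replace the per-index distance here without changing the structure of the trials.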
  • For the calibration, the moving body 1 may be made to perform a specific, preset operation, and the sensor data at that time may be acquired.
  • This specific operation is an operation that satisfies the condition under which the relative positional relationship can be calculated and determined, namely an operation that includes a change of direction, such as turning, within a period of a certain length or more. That is, the locus data based on the sensor data obtained in this period includes curved portions (701b, 702b) such as arcs corresponding to turning, as shown in, for example, FIGS. 8 and 9. With such locus data, the matching process described above is well posed, and a solution as an optimum value can be obtained. As a result, situations where no solution can be obtained, or where obtaining one takes a long time, are avoided, so the calibration can be performed efficiently.
  • Based on examination including experiments, the present inventor has confirmed that the relative positional relationship between the sensors 2 can be calculated and determined from locus data that includes at least arcs and the like corresponding to turning, as described above; from this, the necessary conditions and the specific operation can be specified. As shown in FIG. 10 and elsewhere, the matching process described above is effective because the shape of the locus differs depending on the position of the sensor 2 on the moving body 1. A simple check of this condition is sketched below.
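As a simple illustration, the condition that the locus includes enough turning can be checked mechanically from the heading sequence. The 45-degree threshold below is an assumption for illustration, not a value from the patent.

```python
import numpy as np

def has_enough_turning(headings, min_turn_rad=np.deg2rad(45)):
    """Check whether locus data satisfies the calibration condition.

    The matching process needs a locus that includes a change of direction
    (an arc portion); a straight-line-only locus leaves the relative
    position ambiguous. Here the accumulated heading change is compared
    with an illustrative threshold.
    """
    d = np.diff(np.unwrap(headings))
    return np.abs(d).sum() >= min_turn_rad
```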
  • The mobile system may be provided with a user interface that allows the user to set the specific operation for the calibration; for example, on the display screen of a device such as a PC 101 (FIG. 1) connected to the moving body 1, the user can set a route for the specific operation.
  • Map creation function: FIGS. 15 to 19 are explanatory views relating to the map creation function.
  • In this function, the two sets of map data (42A, 42B) are associated with each other based on the relative positional relationship data 43 (FIG. 5) obtained by the relative positional relationship calculation function.
  • FIG. 15 shows a schematic configuration example, in the horizontal plane (X-Y plane), of an environment 1500 such as a factory to which the moving body 1 is applied.
  • FIG. 15 shows, in particular, the configuration of the environment 1500 as detected by the sensor 2A. Based on the sensor data from the sensor 2A at the height position ZA in FIG. 2, the surrounding shapes and the locus are grasped as the first position identification result 41A, and the first map data 42A is created or updated based on them.
  • The object 1501 shown as a shaded hatched region is an example of an object detected as feature points by the sensor 2A of the moving body 1.
  • FIG. 15 also shows an example of the travel and locus of the moving body 1.
  • In the spatial coordinate system CS (X, Y, Z), the moving body 1 first moves straight from the position P1 in the positive direction (for example, south) of the X-axis. From the position P2, the moving body 1 turns to the left as viewed from the moving body 1. After the left turn, the moving body 1 faces the positive direction (for example, east) of the Y-axis at the position P3, turns further to the left from the position P3, and comes to face the negative direction (for example, north) of the X-axis.
  • The moving body 1 then moves straight to the position P4 in the negative direction (for example, north) of the X-axis.
  • The broken-line locus 1510 shows the locus of the typical position PM of the moving body 1.
  • The solid-line locus 1511 shows the locus of the position PA of the sensor 2A.
  • The range 1502 shows an example of the emission range of the scanning laser beam from the sensor 2A (position PA).
  • The range 1502 is shown for a case where it is larger than 180 degrees, as in the figure described above.
  • The solid arrows indicate laser beams.
  • The laser beam 1503 hits the object 1501, is reflected, returns to the sensor 2A, and is detected as feature points.
  • The laser beam 1504 hits a wall or the like of the factory building, is reflected, returns to the sensor 2A, and is detected as feature points.
  • Objects may also exist in the white areas, but they are not detected by the sensor 2A because they are not at the height position ZA.
  • The position identification and map creation described above can be realized as stated, and their detailed technical contents are not limited.
  • FIG. 16 shows the configuration detected by the sensor 2B in the same environment 1500 as in FIG. 15.
  • The locus 1510 of the moving body 1 is the same as in FIG. 15.
  • The solid-line locus 1512 is the locus of the position PB of the sensor 2B.
  • The range 1602 shows an example of the emission range of the scanning laser beam from the sensor 2B (position PB).
  • The laser beam 1603 hits the object 1601, is reflected, returns to the sensor 2B, and is detected as feature points. Since this object 1601 is at the height position ZB, it is detected by the sensor 2B.
  • Objects such as production equipment in a factory can have various shapes, and the height may differ for each part.
  • When the moving body 1 is provided with a plurality of (two) sensors 2 at different positions, including different height positions, as in the first embodiment, the detection range that can be covered differs for each sensor 2, and so do the object shapes that can be measured.
  • With this mobile system, it is therefore possible to create maps that measure and reflect the shape of the environment 1500 in more detail.
  • A plurality of map data (42A, 42B) are created as separate maps, one for each sensor 2.
  • The plurality of map data can be associated with each other based on the calibration of the relative positional relationship between the sensors 2.
  • Map data: FIGS. 17 and 18 show configuration examples of map data created based on the environment 1500 and the measurements of FIGS. 15 and 16.
  • FIG. 17 shows a map 1700 corresponding to the first map data 42A created based on the sensor data of the sensor 2A of FIG. 15.
  • The line 1701 corresponds to the contour (the corresponding feature point group) of the object 1501 in FIG. 15.
  • The case shown is one where the origin is the initial position PA of the sensor 2A at the time the measurement by the moving body 1 is started, the x-axis direction of the sensor 2A is taken as the X-axis, and the y-axis direction orthogonal to it is taken as the Y-axis.
  • FIG. 18 shows a map 1800 corresponding to the second map data 42B created based on the sensor data of the sensor 2B of FIG. 16.
  • The line 1801 corresponds to the contour of the object 1601 in FIG. 16.
  • The case shown is one where the origin is the initial position PB of the sensor 2B at the time the measurement by the moving body 1 is started, the x-axis direction of the sensor 2B is taken as the X-axis, and the y-axis direction orthogonal to it is taken as the Y-axis.
  • FIG. 19 shows an example in which the above two sets of map data (42A, 42B) are associated into one and output to the user by using the relative positional relationship data 43 obtained by the calculation described above.
  • With the coordinate system of the map 1700 as the reference, the map 1800 of the sensor 2B is rotated and superposed on the map 1700 by using the relative positional relationship 105 between the position PA and the position PB.
  • In this way, the user can refer to the two maps, associated with each other, as one map.
  • The map data is configured as, for example, image data, and holds information such as the position and the presence or absence of an object for each pixel. A minimal sketch of this superposition is given below.
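A minimal sketch of this superposition for feature points expressed in metric coordinates; a pixel-based map would additionally need a pixel-to-metre scale and resampling, which are omitted here. The transform direction (points of the map 1800 into the frame of the map 1700) is one plausible reading of FIG. 19; names are illustrative.

```python
import numpy as np

def map_b_to_map_a(points_b, dx_opt, dy_opt, dtheta_opt):
    """Transform feature points of the map 1800 into the map 1700 frame.

    points_b : (N, 2) feature-point coordinates in the sensor 2B map
    (dx_opt, dy_opt, dtheta_opt) : relative positional relationship data 43
    Each point is rotated by the optimum relative direction and translated
    by the optimum relative position, superposing the two maps as one.
    """
    c, s = np.cos(dtheta_opt), np.sin(dtheta_opt)
    rot = np.array([[c, -s], [s, c]])
    return points_b @ rot.T + np.array([dx_opt, dy_opt])
```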
  • The relative positional relationship calculation function makes it possible to obtain the relative positional relationship between the sensors 2 easily and with high accuracy.
  • Since the relative positional relationship calculation function can perform the adjustment automatically as the moving body 1 travels, the time and effort required for the user to set up the sensors 2 manually is reduced.
  • In the embodiment, the detection direction and the detection range of the sensor 2 are set in the horizontal plane, but the present invention is not limited to this and can be applied similarly to directions and ranges other than the horizontal plane.
  • The sensor 2 may be of a type capable of three-dimensional positioning or distance measurement, including the height direction.
  • Although the present invention has been specifically described above based on the embodiments, the present invention is not limited to the above-described embodiments and can be variously modified without departing from the gist.
  • As the moving body, an autonomously movable moving body such as an AGV has been described; however, the present invention is not limited to this and can also be applied to a moving body operated by a user.
  • The moving body is not limited to a vehicle and may be a ship, or a flying object such as a drone.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a technology with which it is possible to determine a relative positional relationship between sensors when a plurality of sensors are installed in a moving body and the installation positions or the like are changed. This mobile body system comprises: a moving body 1; a plurality of sensors 2, including a first sensor (2A) and a second sensor (2B), which are installed at different positions in a moving body coordinate system CM; and a control device 100 that realizes at least a position measurement function based on sensor data. The sensors 2 are of a type that makes it possible to detect the position of the sensor in a spatial coordinate system (CS). When the moving body 1 moves within an environment 101, the control device 100 identifies a first sensor position and a second sensor position in the spatial coordinate system CS based on the sensor data, acquires a first locus of the first sensor and a second locus of the second sensor in time series, and, based on a comparison of the locus shapes, calculates a relative positional relationship 105 between the first sensor and the second sensor in the moving body coordinate system CM and sets said relationship in the moving body 1.
PCT/JP2020/017937 2020-04-27 2020-04-27 Système de corps mobile WO2021220331A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2020/017937 WO2021220331A1 (fr) 2020-04-27 2020-04-27 Système de corps mobile
CN202080099292.2A CN115362423A (zh) 2020-04-27 2020-04-27 移动体系统
JP2022518431A JP7338048B2 (ja) 2020-04-27 2020-04-27 移動体システム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/017937 WO2021220331A1 (fr) 2020-04-27 2020-04-27 Système de corps mobile

Publications (1)

Publication Number Publication Date
WO2021220331A1 true WO2021220331A1 (fr) 2021-11-04

Family

ID=78332347

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/017937 WO2021220331A1 (fr) 2020-04-27 2020-04-27 Système de corps mobile

Country Status (3)

Country Link
JP (1) JP7338048B2 (fr)
CN (1) CN115362423A (fr)
WO (1) WO2021220331A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012226675A (ja) * 2011-04-22 2012-11-15 Hitachi Industrial Equipment Systems Co Ltd 移動体
JP2015127664A (ja) * 2013-12-27 2015-07-09 株式会社国際電気通信基礎技術研究所 キャリブレーション装置、キャリブレーション方法およびキャリブレーションプログラム
JP2015222541A (ja) * 2014-05-23 2015-12-10 株式会社日立産機システム 台車搬送システム、搬送車、及び台車搬送方法
JP2017096813A (ja) * 2015-11-25 2017-06-01 株式会社国際電気通信基礎技術研究所 キャリブレーション装置、キャリブレーション方法およびキャリブレーションプログラム
JP2018022215A (ja) * 2016-08-01 2018-02-08 村田機械株式会社 移動教示装置、及び、移動教示方法


Also Published As

Publication number Publication date
JPWO2021220331A1 (fr) 2021-11-04
CN115362423A (zh) 2022-11-18
JP7338048B2 (ja) 2023-09-04

Similar Documents

Publication Publication Date Title
CN111511620B (zh) 使用最优交互避碰代价评估的动态窗口方法
CN109154827B (zh) 机器人车辆的定位
Schneier et al. Literature review of mobile robots for manufacturing
US11568559B2 (en) Localization of a surveying instrument
JP6825712B2 (ja) 移動体、位置推定装置、およびコンピュータプログラム
CN112840285A (zh) 具有路径点匹配的自主地图遍历
CN110998473A (zh) 位置推断系统和具有该位置推断系统的移动体
CN110789529B (zh) 车辆的控制方法、装置及计算机可读存储介质
JP6074205B2 (ja) 自律移動体
US20230064071A1 (en) System for 3d surveying by an autonomous robotic vehicle using lidar-slam and an estimated point distribution map for path planning
JP2013020345A (ja) 移動体の位置・姿勢推定システム
US11537140B2 (en) Mobile body, location estimation device, and computer program
CN111971633B (zh) 位置推断系统、具有该位置推断系统的移动体以及记录介质
WO2018179960A1 (fr) Corps mobile et dispositif d'estimation de position locale
KR102564663B1 (ko) 무인 반송 차량의 위치 인식 장치 및 방법
US20230333568A1 (en) Transport vehicle system, transport vehicle, and control method
WO2021220331A1 (fr) Système de corps mobile
US20230316567A1 (en) Localization of a surveying instrument
Buck et al. Multi-sensor payload detection and acquisition for truck-trailer AGVs
JP7396353B2 (ja) 地図作成システム、信号処理回路、移動体および地図作成方法
JP7300413B2 (ja) 制御装置、移動体、移動制御システム、制御方法及びプログラム
JP2023000901A (ja) 検出システム、処理装置、移動体、検出方法及びプログラム
JP2018013860A (ja) 自律移動体制御装置
García-Gutierrez et al. Obstacle Coordinates Transformation from TVS Body-Frame to AGV Navigation-Frame
JP2017130006A (ja) 自律移動体制御装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20934195

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022518431

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20934195

Country of ref document: EP

Kind code of ref document: A1