WO2023219101A1 - Corps mobile et programme - Google Patents

Corps mobile et programme (Moving body and program)

Info

Publication number
WO2023219101A1
WO2023219101A1 (PCT Application No. PCT/JP2023/017555)
Authority
WO
WIPO (PCT)
Prior art keywords
tracking target
sensor
moving
moving body
unit
Prior art date
2022-05-12
Application number
PCT/JP2023/017555
Other languages
English (en)
Japanese (ja)
Inventor
直広 早石
宏佑 竹内
Original Assignee
株式会社計数技研
Priority date
2022-05-12
Filing date
2023-05-10
Publication date
2023-11-16
Application filed by 株式会社計数技研
Publication of WO2023219101A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • The present invention relates to a moving body and the like that moves while following a tracking target.
  • Conventionally, a moving body is known that follows a person using the measurement results of a distance sensor (see, for example, Japanese Patent Application Laid-Open No. 2014-178789).
  • The present invention has been made in order to solve problems with such conventional moving bodies, and an object thereof is to provide a moving body and the like that can move while appropriately following a tracking target.
  • A moving body according to one aspect of the present invention is a moving body that moves while following a tracking target, and comprises: a distance measurement sensor that measures distances to surrounding objects in multiple directions; an image sensor that acquires captured images of the surroundings of the moving body; an identifying unit that identifies the tracking target in a captured image acquired by the image sensor, and identifies the position of the tracking target in a local coordinate system using the measurement results of the distance measurement sensor corresponding to the region of the tracking target identified in the captured image; a moving mechanism that moves the moving body; and a movement control unit that controls the moving mechanism so as to follow the tracking target identified by the identifying unit.
  • With this configuration, the distance measurement sensor and the image sensor are both used to identify the position of the tracking target: for example, after a person is identified in the captured image, the person's position can be determined from the measurement results of the distance measurement sensor. The position of the tracking target can therefore be identified more accurately, and the moving body can move while appropriately following it.
  • The moving body may further comprise a storage unit and an accumulation unit that accumulates, in the storage unit, the latest position of the tracking target identified by the identifying unit, and the movement control unit may control the moving mechanism so as to move to the latest accumulated position of the tracking target when the identifying unit cannot identify that position.
  • With this configuration, even when the tracking target is lost, moving to its last accumulated position gives the moving body a chance to detect and follow the tracking target again.
  • The accumulation unit may accumulate the positions of the tracking target within a latest predetermined range, and when the identifying unit cannot identify the position of the tracking target, the movement control unit may control the moving mechanism so as to move along a path corresponding to those accumulated positions.
  • The moving body may be one that transports an object to be transported, and may further comprise an acquisition unit that acquires the size of the object using at least one of the measurement results of the distance measurement sensor and the captured image acquired by the image sensor. The movement control unit may then control the moving mechanism so as to avoid collision with obstacles, using at least one of the measurement results and the captured image together with the acquired size.
  • With this configuration, the larger the transported object is, the farther from an obstacle the moving body can pass, so obstacles can be avoided appropriately.
  • The identifying unit may use the measurement results of the distance sensor to treat, as the position of the tracking target, the position of the object whose position changes less in the local coordinate system than that of other objects while the moving body is moving.
  • Schematic diagram showing a moving body and a transport vehicle according to an embodiment of the present invention
  • Bottom view of the moving body according to the embodiment
  • Functional block diagram showing a configuration of the moving body according to the embodiment
  • Functional block diagram showing another configuration of the moving body according to the embodiment
  • Flowchart showing the operation of the moving body according to the embodiment
  • Diagram for explaining the ranging sensor in the embodiment
  • Diagram for explaining the image sensor in the embodiment
  • Diagram showing an example of a captured image in the embodiment
  • Diagram showing an example of the measurement results of the ranging sensor in the embodiment
  • Diagrams for explaining examples of movement of the moving body in the embodiment
  • Diagram showing an example of movement of the moving body in the embodiment
  • Diagram showing an example of the configuration of a computer system in the embodiment
  • the moving body according to the present embodiment identifies the position of the tracking target using the measurement results of the ranging sensor and the captured image acquired by the image sensor, and moves so as to follow the tracking target whose position has been identified.
  • FIG. 1 is a schematic diagram showing a moving body 1 and a transport vehicle 2 according to the present embodiment, FIG. 2 is a bottom view of the moving body 1, and FIG. 3A is a functional block diagram showing the configuration of the moving body 1.
  • the moving body 1 may be, for example, a moving body that transports the object 3 to be transported.
  • the moving body 1 moves by following an object to be tracked, and may include a main body 10 and a moving mechanism 16 attached to the main body 10, as shown in FIGS. 1 and 2.
  • that the moving body 1 transports the object 3 may mean that the object 3 is loaded on the moving body 1 itself, or that the moving body 1 tows the transport vehicle 2 on which the object 3 is loaded. In this embodiment, the latter case will mainly be described. Note that although this embodiment mainly describes the case where the moving body 1 transports the object 3, the moving body 1 may be used for other purposes.
  • the moving body 1 may be a moving body for guiding, guarding, monitoring, entertainment, etc., for example. That is, the moving body 1 may be a transport robot, a guide robot, a guard robot, a monitoring robot, an entertainment robot, or the like. Furthermore, in this embodiment, the case where the moving body 1 is a traveling body will be mainly described, and other cases will be described later. Further, in this embodiment, as shown in FIG. 1, a case where the tracking target is a person 5 will be mainly described.
  • the main body 10 of the moving body 1 may be, for example, a loading platform or a member to which the moving mechanism 16 or the like is attached.
  • the main body 10 is a rectangular parallelepiped-shaped member to which the moving mechanism 16 is attached.
  • the moving mechanism 16 will be described later.
  • the moving body 1 includes a distance measurement sensor 11, an image sensor 12, an identifying unit 13, a storage unit 14, an accumulation unit 15, a moving mechanism 16, and a movement control unit 17.
  • the distance sensor 11 measures distances to surrounding objects in multiple directions.
  • the distance sensor 11 may be, for example, a laser sensor, an ultrasonic sensor, a distance sensor using microwaves, or the like.
  • the laser sensor may be, for example, a laser range sensor (laser range scanner) or LIDAR. Note that these distance measuring sensors are already known and their explanation will be omitted. In this embodiment, the case where the distance measurement sensor 11 is a laser range sensor will mainly be described. Moreover, the distance measurement sensor 11 may have one laser range sensor, or may have two or more laser range sensors. In the latter case, all directions may be covered by two or more laser range sensors.
  • distances in multiple directions may be measured by rotating the distance measurement direction of a single distance sensor, or by using a plurality of distance sensors arranged to face a plurality of directions. Measuring distances in multiple directions may mean, for example, measuring distances at predetermined angular intervals over a predetermined angular range or over the entire circumference (that is, 360 degrees).
  • the angular intervals may be constant, for example 1-degree, 2-degree, or 5-degree intervals.
  • the information obtained from the distance measurement sensor 11 may be, for example, distances to surrounding objects at each of a plurality of azimuth angles based on a certain direction of the moving body 1. By using this distance, it becomes possible to know what objects exist around the moving body 1 in the local coordinate system of the moving body 1.
  • the distance measurement sensor 11 may, for example, measure distances in a two-dimensional planar direction, or may measure distances in a plurality of three-dimensional directions. In this embodiment, a case will be mainly described in which the distance measuring sensor 11 measures a distance in a two-dimensional horizontal direction.
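  • As an informal illustration (not part of the patent), the following Python sketch shows how such a planar scan, one distance per azimuth at fixed angular intervals, can be converted into (x, y) points in the local coordinate system of the moving body 1; the function name, the 1-degree step, and the beam geometry are assumptions for the example.

```python
import math

def scan_to_points(ranges, angle_min=-math.pi / 2, angle_step=math.radians(1.0)):
    """Convert a planar scan (one distance per azimuth) into (x, y) points in
    the moving body's local frame: x forward, y to the left."""
    points = []
    for i, r in enumerate(ranges):
        if r is None or math.isinf(r):       # no return in this direction
            continue
        theta = angle_min + i * angle_step   # azimuth of the i-th beam
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```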
  • the image sensor 12 acquires captured images of the surroundings of the moving body 1.
  • the image sensor 12 may be, for example, a CCD image sensor or a CMOS image sensor.
  • the image sensor 12 may include, for example, an optical system such as a lens for forming an image on the image sensor.
  • the image sensor 12 may be monocular or binocular (stereo camera). In this embodiment, the case where the image sensor 12 is monocular will mainly be described.
  • the image sensor 12 may be one that captures moving images, that is, one that captures continuous image frames, or may be one that captures still images.
  • the photographed image may be, for example, in color or in gray scale, but from the viewpoint of specifying the tracking target, color is preferable.
  • the distance measurement sensor 11 and the image sensor 12 each acquire at least information in the traveling direction (that is, forward) of the moving object 1.
  • FIGS. 5A and 5B are top views of the moving body 1, showing distance measurement by the distance measurement sensor 11 and photographing by the image sensor 12, respectively. Note that the two sensors are drawn in separate figures for convenience of explanation; in reality, both the distance measurement sensor 11 and the image sensor 12 are attached to the single moving body 1.
  • the distance measurement direction of the distance measurement sensor 11 and the shooting direction of the image sensor 12 may be the same, for example, as shown in FIGS. 5A and 5B.
  • the sensing ranges of both sensors have at least an overlapping range.
  • the sensing range of both sensors may be the same, for example, or one may be wider than the other.
  • it is preferable that, for the measurement results obtained by the distance measurement sensor 11 and the captured image obtained by the image sensor 12, the correspondence between the angle of each distance measurement direction and the position in the captured image is known. For example, it is preferable to know which horizontal position in the captured image corresponds to each angle of the distance measurement direction.
  • the frequency at which the distance measurement sensor 11 acquires measurement results and the frequency at which the image sensor 12 acquires captured images may be approximately the same, or the former may be higher than the latter. Neither frequency is particularly limited; for example, the ranging sensor 11 may perform measurements approximately 1 to 100 times per second, and the image sensor 12 may capture an image approximately once every 1 to 10 seconds.
  • the identifying unit 13 identifies the position of the tracking target in the local coordinate system of the moving body 1 using the measurement results of the distance measuring sensor 11 and the captured image acquired by the image sensor 12. For example, the identifying unit 13 may first identify the tracking target in the captured image.
  • when the tracking target is a person, this identification may be performed, for example, by recognizing the region of the person in the captured image: by pattern matching, by performing segmentation and taking the person's region from the result, or by using a person detection model (for example, YOLOv5).
  • the identifying unit 13 may then obtain the distance and direction to the tracking target from the measurement results of the distance measuring sensor 11 corresponding to the region of the tracking target identified in the captured image. More specifically, the identifying unit 13 may identify the position of the tracking target in the local coordinate system of the moving body 1 using the azimuth angle or azimuth range of the region identified in the captured image and the distances measured by the ranging sensor 11 within that range, as in the sketch below. The identified position may be, for example, a coordinate value in a two-dimensional orthogonal coordinate system that is the local coordinate system of the moving body 1.
  • the measurement results of the distance measurement sensor 11 are a set of distances, one for each angle, that is, a set of distances to measurement points. The identifying unit 13 may therefore perform clustering on the measurement results. Through this clustering, the measurement points can be classified, for example, into a cluster corresponding to a wall, a cluster corresponding to a person, and a cluster corresponding to an obstacle other than a person. For an example of such clustering, see the above-mentioned Japanese Patent Application Laid-Open No. 2014-178789.
  • the identifying unit 13 may then identify the cluster corresponding to the region of the tracking target identified in the captured image, and may set the representative position of the measurement points included in that cluster as the position of the tracking target.
  • the cluster corresponding to the area of the tracking target specified in the captured image may be, for example, a cluster existing in the direction of the azimuth of the tracking target specified in the captured image.
  • the representative position of the plurality of measurement points may be, for example, the position of the center of gravity of the plurality of measurement points, the position of the measurement point closest to the mobile object 1 among the plurality of measurement points, or the like.
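  • The clustering itself can be done in many ways; the following sketch, which starts a new cluster whenever the gap between consecutive scan points exceeds a threshold and then takes the point closest to the moving body as the representative position, is one plausible reading of the above (the 0.3 m gap is an arbitrary assumption).

```python
import math

def cluster_points(points, gap=0.3):
    """Start a new cluster whenever consecutive scan points are more than
    `gap` metres apart."""
    clusters, current = [], []
    for p in points:
        if current and math.dist(current[-1], p) > gap:
            clusters.append(current)
            current = []
        current.append(p)
    if current:
        clusters.append(current)
    return clusters

def representative_position(cluster, origin=(0.0, 0.0)):
    """Here: the measurement point closest to the moving body; the centroid
    would be an equally valid representative."""
    return min(cluster, key=lambda p: math.dist(origin, p))
```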
  • in addition, once the position of the tracking target has been identified, the identifying unit 13 may track that position using the measurement results of the distance measurement sensor 11 and thereby identify the latest position of the tracking target. More specifically, the identifying unit 13 may perform clustering on the distances to the measurement points included in the measurement results at each point in time, and may identify the correspondence between clusters by comparing the shape of the measurement points included in each cluster at one time with that at the next time.
  • clusters at adjacent times whose measurement points are similar in shape and located at nearby positions may be identified as corresponding clusters.
  • the shapes being similar may mean, for example, that the degree of similarity of the shapes of the sets of measurement points exceeds a threshold value, and a nearby position may be, for example, a position closer than a threshold value.
  • by following the corresponding clusters in this way, the identifying unit 13 can track the position of the cluster to be tracked. Note that even if the distance measurement results and the captured images are obtained at approximately the same frequency, the identifying unit 13 may use captured images to identify the position of the tracking target less frequently than it uses distance measurement results, because image processing usually imposes a greater load than processing of distance measurement results.
  • the moving body 1 moves so as to follow the tracking target; therefore, in a situation where appropriate movement control is being performed, the position of the tracking target in the local coordinate system stays approximately the same. The identifying unit 13 may therefore use the measurement results of the ranging sensor 11 to treat, as the position of the tracking target, the position of the object whose position changes less in the local coordinate system than that of other objects while the moving body 1 is moving. By doing so, the identifying unit 13 can avoid, for example, erroneously recognizing a person crossing in front of the moving body 1 as the tracking target. More specifically, the identifying unit 13 may perform clustering on the distances to the measurement points at each point in time, identify the correspondence of the clusters in time series, thereby identify the change in position of each cluster in the local coordinate system, and select the cluster with the least change in position as the cluster to be tracked, as sketched below.
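  • A hypothetical sketch of that cluster-association logic: clusters in consecutive scans are matched by centroid proximity (shape similarity is omitted here for brevity), and the matched cluster that moved least in the local frame is taken as the tracking target. The threshold is an arbitrary assumption.

```python
import math

def centroid(cluster):
    xs, ys = zip(*cluster)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def match_clusters(prev_clusters, curr_clusters, max_jump=0.5):
    """Match each previous cluster to the nearest current cluster whose
    centroid moved no more than max_jump metres between scans."""
    pairs = []
    for prev in prev_clusters:
        best = min(curr_clusters,
                   key=lambda c: math.dist(centroid(prev), centroid(c)),
                   default=None)
        if best is not None and math.dist(centroid(prev), centroid(best)) <= max_jump:
            pairs.append((prev, best))
    return pairs

def least_motion_cluster(pairs):
    """Pick the matched cluster whose position changed least in the local
    frame while the body was moving: the followed target, per the text."""
    if not pairs:
        return None
    prev, curr = min(pairs,
                     key=lambda pc: math.dist(centroid(pc[0]), centroid(pc[1])))
    return curr
```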
  • the storage unit 14 stores at least the latest position of the tracking target identified by the identifying unit 13.
  • the storage unit 14 may store the positions of the tracking target within a latest predetermined range, including the latest position. The predetermined range may be, for example, a time range (for example, the latest 5 or 10 seconds) or a distance range (for example, the latest 5 or 10 meters).
  • in the former case, the positions acquired within the predetermined time before the latest position are stored in the storage unit 14; in the latter case, the positions acquired within the predetermined distance of movement before the latest position are stored. In this embodiment, the case where the latest predetermined range is a time range will mainly be described.
  • the storage unit 14 may also store information other than the position of the tracking target, as will be described later.
  • the storage unit 14 is preferably implemented by a nonvolatile recording medium, but may also be implemented by a volatile recording medium.
  • the recording medium may be, for example, a semiconductor memory, a magnetic disk, an optical disk, or the like.
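  • One straightforward way to realize a storage unit that keeps only the positions in the latest predetermined time range is a pruned deque, as in the following sketch; the 10-second window and the class name are assumptions, not values from the patent.

```python
import time
from collections import deque

class PositionHistory:
    """Stores the tracking target's positions within the latest predetermined
    time range (a 10-second window is an arbitrary choice)."""

    def __init__(self, window_s=10.0):
        self.window_s = window_s
        self._entries = deque()              # (timestamp, (x, y)) pairs

    def accumulate(self, position, now=None):
        now = time.monotonic() if now is None else now
        self._entries.append((now, position))
        while self._entries and now - self._entries[0][0] > self.window_s:
            self._entries.popleft()          # drop positions outside the range

    def latest(self):
        return self._entries[-1][1] if self._entries else None

    def path(self):
        """Oldest-to-newest positions, usable as a route to replay."""
        return [p for _, p in self._entries]
```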
  • the accumulation unit 15 accumulates, in the storage unit 14, the latest position of the tracking target identified by the identifying unit 13.
  • the accumulation unit 15 may store the latest position by overwriting the previous one, or may store it separately from the previously accumulated positions. In the latter case, it is preferable that the information is stored in such a way that the latest position can be determined.
  • when the positions within the latest predetermined range are stored in the storage unit 14, the accumulation unit 15 may, for example, delete positions outside that range so that only the latest predetermined range remains, or may accumulate the latest positions one after another in such a way that it is known which positions fall within the latest predetermined range.
  • the accumulation unit 15 may accumulate every latest position identified by the identifying unit 13, or only some of them; in the latter case, the positions may be accumulated, for example, at predetermined time intervals or every predetermined distance of movement.
  • the accumulation unit 15 may also store, in the storage unit 14, information indicating the movement difference of the moving body 1 from when one position is accumulated until when the next position is accumulated.
  • the information indicating the movement difference may be, for example, information indicating the position and direction of the moving body 1 at the time the next position is accumulated, relative to the position and direction of the moving body 1 at the time the previous position was accumulated; that is, the position and direction at the later time expressed in the local coordinate system of the moving body 1 at the earlier time.
  • the information indicating the movement difference may be obtained, for example, using the rotational speeds of the wheels 31a and 31b, which are the drive wheels described later. Note that the method of obtaining the current position and direction from the rotational speeds of the drive wheels is already known, and its explanation is omitted. Alternatively, instead of storing the movement difference, the accumulation unit 15 may constantly update the positions stored in the storage unit 14 so that they remain positions in the current local coordinate system.
  • for example, each time the moving body 1 moves by a minute amount, the accumulation unit 15 may update each position stored in the storage unit 14 to the corresponding position in the local coordinate system after that minute movement. This update may also be performed using the rotational speeds of the drive wheels 31a and 31b.
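  • The constant-update variant can be sketched as a rigid planar frame change: after the body moves by (dx, dy) and rotates by dθ, every stored position is re-expressed in the new local frame. A minimal version, assuming exact planar odometry (names are illustrative):

```python
import math

def reexpress_in_new_frame(positions, dx, dy, dtheta):
    """Re-express stored (x, y) positions after the body moves by (dx, dy)
    and rotates by dtheta (its pose increment, given in the old frame):
    p_new = R(-dtheta) @ (p_old - (dx, dy))."""
    c, s = math.cos(-dtheta), math.sin(-dtheta)
    return [(c * (x - dx) - s * (y - dy),
             s * (x - dx) + c * (y - dy)) for (x, y) in positions]
```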
  • the moving mechanism 16 moves the moving body 1, and may include wheels 31a to 31d provided on the underside of the main body 10 and motors 32a and 32b, as shown in FIG. 2.
  • each of the wheels 31a to 31d may also be referred to as a wheel 31 unless otherwise distinguished.
  • similarly, the motors 32a and 32b may be referred to as the motor 32 when they are not distinguished.
  • the number of wheels 31 is not limited as long as it is three or more.
  • the wheels 31 may be omnidirectional wheels. In this embodiment, as shown in FIG. 2, the wheels 31a and 31b are drive wheels driven by the motors 32a and 32b, respectively, and the wheels 31c and 31d are driven wheels that can turn.
  • in this case, the traveling direction of the moving body 1 may be changed by the difference between the rotational speeds of the drive wheels 31a and 31b.
  • alternatively, all the wheels 31 may be drive wheels.
  • also, some of the wheels 31, for example the front wheels 31c and 31d, may be steering wheels, or all the wheels 31 may be steering wheels, and the moving mechanism 16 may have a steering mechanism that can change the direction of the steering wheels.
  • the motor 32 or the wheels 31a, 31b may be provided with an encoder for obtaining the rotation speed of the wheels 31a, 31b.
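  • For reference, the known odometry computation mentioned above can be sketched as standard differential-drive dead reckoning from the drive-wheel speeds; the wheel radius and track width depend on the particular vehicle, and the names here are illustrative.

```python
import math

def integrate_odometry(pose, w_left, w_right, wheel_radius, track_width, dt):
    """Advance the pose (x, y, heading) by standard differential-drive dead
    reckoning from the angular speeds (rad/s) of drive wheels 31a and 31b."""
    v_left = w_left * wheel_radius            # linear speed of the left wheel
    v_right = w_right * wheel_radius          # linear speed of the right wheel
    v = (v_left + v_right) / 2.0              # forward speed of the body
    omega = (v_right - v_left) / track_width  # yaw rate from speed difference
    x, y, th = pose
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + omega * dt)
```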
  • the movement control unit 17 controls the movement mechanism 16 to follow the tracking target specified by the identification unit 13.
  • the movement mechanism 16 may be controlled, for example, by controlling the start of movement, controlling the stop of movement, controlling the direction of movement, and the like.
  • when the identifying unit 13 can identify the position of the tracking target, the movement control unit 17 may control the moving mechanism 16 so as to move to the latest identified position.
  • the movement control unit 17 may, for example, control the moving mechanism 16 so as to stop at a position where the distance between the tracking target and the moving body 1 is a predetermined distance.
  • alternatively, the movement control unit 17 may perform follow control so that the distance between the tracking target and the moving body 1 is maintained at a predetermined distance, as sketched below. Note that since the movement control unit 17 performs control to follow the tracking target, it does not need to know positions in a global coordinate system; the moving body 1 therefore does not need a configuration for acquiring such positions, such as GPS (Global Positioning System) or SLAM (Simultaneous Localization and Mapping).
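  • As an illustrative sketch only, such follow control can be as simple as a proportional law on the target's local-frame distance and bearing; the gains and limits below are arbitrary assumptions, not values from the patent.

```python
import math

def follow_command(target_xy, keep_distance=1.0, k_v=0.8, k_w=1.5, v_max=1.0):
    """Proportional follow control: drive toward the target's local-frame
    position and stop once the predetermined distance is reached.
    Returns (forward_speed, yaw_rate)."""
    x, y = target_xy
    dist = math.hypot(x, y)
    bearing = math.atan2(y, x)               # 0 when the target is dead ahead
    v = max(0.0, min(v_max, k_v * (dist - keep_distance)))
    w = k_w * bearing
    return (v, w)
```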
  • when the identifying unit 13 cannot identify the position of the tracking target, the movement control unit 17 may control the moving mechanism 16 so as to move to the latest position of the tracking target accumulated by the accumulation unit 15. Such movement control using the positions stored in the storage unit 14 may be performed, for example, when the person being followed turns a corner and leaves the sensing range of the moving body 1. Note that if a stored position is a position in the local coordinate system at the time it was accumulated, the movement control unit 17 may convert it into a position in the current local coordinate system using the information indicating the movement difference, and move to the converted position.
  • by doing so, the moving body 1 can at least approach the tracking target, and if it can identify the position of the tracking target again at the destination, it can continue following. If the position of the tracking target cannot be identified even after moving to the stored position, the moving body 1 may stop at that position on the assumption that an error has occurred, or may search for the tracking target, for example by turning 360 degrees at the position reached.
  • in that case, the movement control unit 17 may, for example, perform movement control with the latest position stored in the storage unit 14 as the target position, and the moving body 1 may move from the current position to that position by the shortest route, which is usually a straight line.
  • alternatively, when the positions within the latest predetermined range are accumulated, the moving mechanism 16 may be controlled so as to move along a path corresponding to those positions. In this case, the moving body 1 moves to the latest stored position along the same route as the tracking target; for example, if the tracking target moved while avoiding an obstacle, the moving body 1 likewise moves along a route that avoids the obstacle.
  • the movement control unit 17 may also detect obstacles using at least one of the measurement results of the distance measurement sensor 11 and the captured image acquired by the image sensor 12, and control the moving mechanism 16 so as to avoid collision with a detected obstacle. A collision may be avoided, for example, by decelerating or stopping, or by changing the movement route so as to go around the obstacle. Note that movement control toward a target position and movement control for avoiding collisions with obstacles are already known, and their detailed explanation is omitted.
  • the transport vehicle 2 includes a main body 20, a plurality of wheels 21 attached to the underside of the main body 20, and frame parts 22 provided on the front and rear sides of the main body 20 in the movement direction. The main body 20 is a loading platform, and the object 3 to be transported is placed on it. The number of wheels 21 is not limited as long as it is two or more. All of the wheels 21 may be driven wheels that can turn, or some may be driven wheels that can turn and the rest fixed wheels that cannot. The transport vehicle 2 may be towed by the moving body 1 by connecting the connecting portion 10a of the moving body 1 and the connecting portion 20a of the transport vehicle 2.
  • the connecting portion 10a of the moving body 1 and the connecting portion 20a of the transport vehicle 2 may be connected by a connecting shaft extending in the vertical direction.
  • at least one of the connecting portions 10a, 20a may be provided rotatably about its connecting shaft.
  • the configuration of the transport vehicle 2 is not limited as long as the object 3 to be transported is loaded on it and it is towed by the moving body 1.
  • the frame portion 22 may not be provided in the transport vehicle 2, or the frame portion 22 may be provided all around the main body 20.
  • the object 3 to be transported is not particularly limited, and may be, for example, cardboard boxes, boxes, or folding containers placed on a pallet, or any other object to be transported.
  • Step S101 The distance measurement sensor 11 measures distances to surrounding objects in multiple directions.
  • Step S102 The image sensor 12 acquires a captured image of the surroundings.
  • Step S103 The identifying unit 13 identifies the position of the tracking target in the local coordinate system using the measurement result acquired in step S101 and the captured image acquired in step S102.
  • Step S104 The accumulation unit 15 accumulates the latest position identified in step S103 in the storage unit 14, possibly together with, for example, the movement difference. If the latest position could not be identified in step S103, the accumulation unit 15 need not accumulate a position, or may store in the storage unit 14 information indicating that the position could not be identified. Note that even when the latest position cannot be identified, the accumulation unit 15 may still store the movement difference in the storage unit 14.
  • Step S105 The movement control unit 17 determines whether the position of the tracking target has been identified in step S103. If the position of the tracking target can be specified, the process proceeds to step S106; otherwise, the process proceeds to step S107.
  • Step S106 The movement control unit 17 controls the movement mechanism 16 to move to the latest position specified in step S103.
  • Step S107 The movement control unit 17 controls the moving mechanism 16 to move to the latest position stored in the storage unit 14.
  • as described above, this movement control may move to the latest position by the shortest route, or may follow the tracking target's own route to the latest position.
  • Step S108 The movement control unit 17 determines whether to end movement control. If the movement control is to be terminated, a series of movement processes for following the object to be followed is terminated; if not, the process returns to step S101. Note that the movement control unit 17 may determine to end the movement control, for example, when an instruction to end the movement is input.
  • in the flowchart of FIG. 4, the distance measurement by the distance measurement sensor 11 may be performed more frequently than the acquisition of captured images by the image sensor 12. The order of processing in the flowchart is also an example, and the order of the steps may be changed as long as the same result is obtained; for example, the captured image may be acquired (step S102) before the distance is measured (step S101).
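  • The flowchart can be summarized in Python-like pseudocode; the sensor, identifier, history, and controller objects here are hypothetical stand-ins for the units described above, not an API defined by the patent.

```python
def follow_loop(ranging_sensor, image_sensor, identify, history, controller, done):
    """Control loop mirroring steps S101-S108 of the flowchart."""
    while not done():                               # S108: end of control?
        scan = ranging_sensor.measure()             # S101: measure distances
        image = image_sensor.capture()              # S102: capture image
        position = identify(scan, image)            # S103: local-frame position
        if position is not None:
            history.accumulate(position)            # S104: accumulate latest
            controller.move_to(position)            # S105 -> S106
        else:
            last = history.latest()                 # S105 -> S107
            if last is not None:
                controller.move_to(last)            # or replay history.path()
    controller.stop()
```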
  • the moving object 1 moves following the movement of a human being to be followed.
  • a user who is a tracking target operates the moving body 1, inputs an instruction to start tracking, and starts walking.
  • the distance measurement sensor 11 measures the distances to objects around the moving body 1 and passes them to the identifying unit 13, and the image sensor 12 acquires the captured image of the surroundings of the moving body 1 shown in FIG. 6 and passes it to the identifying unit 13 (steps S101, S102).
  • suppose the distance measurement results are as shown in FIG. 7, where the measurement points are indicated by black circles in the xy coordinate system that is the local coordinate system of the moving body 1.
  • the identifying unit 13 identifies the region of the person 5 in the captured image of FIG. 6 using the person detection model, and identifies the azimuth range of that region. The identifying unit 13 then identifies, among the distance measurement results of FIG. 7, the measurement points corresponding to that azimuth range; in this specific example, the set of measurement points located on both sides of the y-axis is identified. The identifying unit 13 passes the position of the measurement point 6 closest to the moving body 1 among that set to the accumulation unit 15 and the movement control unit 17 as the latest position of the tracking target (step S103).
  • the accumulation unit 15 stores the position (X11, Y11) of the tracking target, the current time T11, and the movement difference in the storage unit 14 (step S104). The time may be obtained from a clock unit or a timer (not shown). At this point, movement has only just started, so the movement differences may all be 0. The movement control unit 17 then controls the moving mechanism 16 to move to the position (X11, Y11) of the tracking target received from the identifying unit 13 (steps S105, S106).
  • the movement control unit 17 may obtain the movement difference of the moving body 1 after the latest position of the tracking target was identified, using the rotational speeds of the drive wheels 31a and 31b of the moving mechanism 16. In this way, as shown in FIG. 9A, the moving body 1 moves so as to follow the person 5 who is the tracking target. In FIG. 9A, for convenience of explanation, the movement directions of the moving body 1 and the person 5 are indicated by arrows, and the transport vehicle 2 is omitted.
  • as this is repeated, the storage unit 14 comes to store the positions and movement differences of the tracking target as shown in FIG. 8A, with time passing as T11, T12, T13. In this specific example, the latest three entries are stored in the storage unit 14, and the movement differences consist of position differences such as (a13, b13) and angular differences such as θ13.
  • thereafter, the distance measurement and the acquisition of captured images continue in the same manner, but as shown in FIG. 9B, after the person 5 who is the tracking target turns the corner of the wall 7, the person 5 can no longer be detected from the moving body 1, and the identifying unit 13 becomes unable to identify the position of the person 5 (steps S101 to S104). As shown in FIG. 8B, the storage unit 14 therefore has no position of the tracking target at the latest time T23.
  • the movement control unit 17 then converts the accumulated positions (X21, Y21) and (X22, Y22) of the tracking target shown in FIG. 8B into positions in the current local coordinate system using the movement differences, and performs movement control using the converted positions (steps S105, S107). Specifically, let the movement difference (a23, b23, θ23) be the movement from time T22 to time T23. The movement control unit 17 can then use this difference and the position (X22, Y22) of the tracking target to find the position P22 corresponding to (X22, Y22) in the current local coordinate system, and can similarly find the position P21 corresponding to (X21, Y21).
  • assuming that the positions P21 and P22 are as illustrated, the movement control unit 17 performs movement control so as to move first toward P21 and then toward P22 until the position of the tracking target can be identified again. When the moving body 1 moves in this way, the person 5 to be followed enters the sensing range again, so the moving body 1 can resume following the person 5.
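  • The conversion used in this example can be sketched as chaining the accumulated movement differences: P22 is obtained from (X22, Y22) and (a23, b23, θ23), and P21 by additionally applying the difference recorded between T21 and T22. The formula is the same planar frame change as in the earlier sketch; all names here mirror the example and the values are placeholders.

```python
import math

def to_current_frame(point, diffs):
    """Re-express an accumulated position in the current local frame by
    applying, in order, the movement differences (a, b, theta) recorded
    after it was stored."""
    x, y = point
    for a, b, th in diffs:
        px, py = x - a, y - b                # shift by the translation
        c, s = math.cos(-th), math.sin(-th)  # undo the rotation of the frame
        x, y = c * px - s * py, s * px + c * py
    return (x, y)

# With the names of the example (placeholder values):
# P22 = to_current_frame((X22, Y22), [(a23, b23, th23)])
# P21 = to_current_frame((X21, Y21), [(a22, b22, th22), (a23, b23, th23)])
```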
  • Accumulation of the position of the tracking target, etc. may be performed at a lower frequency than specifying the position of the tracking target. Accumulation of the position of the tracking target, etc. may be performed, for example, at predetermined time intervals, or may be performed only once every time the position of the tracking target is specified a predetermined number of times.
  • as described above, since the moving body 1 identifies the position of the tracking target using the distance measurement sensor 11 and the image sensor 12, it can, for example, identify a person with the image sensor 12 and then determine the person's position from the measurement results of the distance measurement sensor 11, so the position of the tracking target can be identified more accurately and followed appropriately. Furthermore, by accumulating the latest identified positions, the moving body can move based on the accumulated positions even when the position of the tracking target cannot be identified; if it thereby approaches a target hidden by a corner or an obstacle and can identify it again, following can continue. If a plurality of positions are stored in the storage unit 14, the moving body can also move along the movement route of the tracking target.
  • as described above, the frequency with which captured images are used to identify the position of the tracking target may be lower than the frequency with which distance measurement results are used; however, when the tracking target is lost, that is, when its position cannot be identified, the frequency of acquiring captured images and of using them to identify the position may be made higher than in other cases, so that the position of the tracking target can be identified again more quickly.
  • the distance measuring sensor 11 and the image sensor 12 may be used for purposes other than specifying the position of a tracking target.
  • FIG. 3B is a block diagram showing the configuration of the moving object 1 that uses the measurement results of the distance measurement sensor 11 and the photographed image acquired by the image sensor 12 for purposes other than specifying the position of a tracking target.
  • the mobile body 1 further includes an acquisition unit 18.
  • the acquisition unit 18 acquires the size of the conveyed object 3 using at least one of the measurement results of the distance measurement sensor 11 and the captured image acquired by the image sensor 12: the size may be obtained from the measurement results alone, from the captured image alone, or from both.
  • when both are used, the final size of the conveyed object 3 may be a representative value of the sizes obtained from each, for example the average or the maximum.
  • for example, the acquisition unit 18 may identify the conveyed object 3 in the captured image acquired by the image sensor 12 and acquire the size of the identified object using the measurement results of the distance measuring sensor 11. More specifically, the azimuth angle or azimuth range of the conveyed object 3 may be identified from the captured image, and the size may be obtained from that identification result and the measurement results of the distance measurement sensor 11.
  • the transportation target object 3 in the photographed image may be identified using, for example, pattern matching, segmentation, a detection model, etc., similarly to the identification of a person.
  • the conveyed object 3 whose size is to be acquired may be, for example, the conveyed object 3 placed on the transport vehicle 2 as shown in FIG. 1, or an object to be transported that is placed on the moving body 1 itself.
  • the movement control unit 17 may control the moving mechanism 16 so as to avoid collision with obstacles, using at least one of the measurement results of the distance sensor 11 and the captured image acquired by the image sensor 12 together with the size of the conveyed object 3 acquired by the acquisition unit 18. That is, the movement control unit 17 may detect obstacles existing around the moving body 1 and control movement so as not to collide with the detected obstacles.
  • for example, the movement control unit 17 may perform movement control such that the larger the size of the conveyed object 3 acquired by the acquisition unit 18, the farther from an obstacle the moving body 1 passes, or the farther from an obstacle it stops.
  • the size of the conveyed object 3 acquired by the acquisition unit 18 is usually the width in the horizontal direction, but other sizes, such as the height of the conveyed object 3, may also be acquired. In that case, the movement control unit 17 may, for example, perform movement control so that the conveyed object 3 moves while avoiding overhead obstacles located lower than that height.
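  • A hedged sketch of size-dependent avoidance: the clearance to keep from obstacles grows with the width of the conveyed object 3, and a straight-ahead corridor of that half-width is checked against detected obstacle points. The margin and look-ahead values are arbitrary assumptions, not parameters from the patent.

```python
def required_clearance(object_width, body_width, margin=0.2):
    """Lateral clearance (m) to keep from obstacles: half of the wider of
    the moving body and the conveyed object, plus a safety margin."""
    return max(object_width, body_width) / 2.0 + margin

def path_blocked(obstacle_points, clearance, look_ahead=3.0):
    """True if any obstacle point (x, y) in the local frame lies inside the
    corridor the body and its load would sweep while driving straight."""
    return any(0.0 < x < look_ahead and abs(y) < clearance
               for x, y in obstacle_points)
```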
  • in the above description, the tracking target is a person, and the identifying unit 13 may perform face recognition or the like using the captured image so as to track only a specific person.
  • the tracking target may be another moving object instead of a person.
  • in that case, a figure, marker, or the like that can identify the other moving body may be attached to it, and the identifying unit 13 may identify the position of the moving body to be tracked by recognizing the figure or marker in the captured image.
  • alternatively, no figure or marker may be attached, and the identifying unit 13 may identify the tracking target using characteristics such as the shape of the moving body to be followed.
  • other moving objects in the photographed image may be identified using pattern matching, segmentation, a detection model, or the like, similar to identifying a person.
  • although the case where the image sensor 12 is monocular has mainly been described, the image sensor 12 may be, for example, a stereo camera.
  • in that case, the identifying unit 13 may identify the final position of the tracking target using both the position obtained from the distance measurement results and the position obtained from the captured stereo images.
  • the final position may be, for example, a representative position of the two, such as an intermediate position between them, or a position obtained by combining them weighted by their respective reliabilities.
  • in the above, the case where movement control is performed using the accumulated past positions when the position of the tracking target cannot be identified has mainly been described, but this need not be the case.
  • the moving body 1 need not perform such control, in which case it need not include the storage unit 14 or the accumulation unit 15.
  • the moving object 1 may be, for example, a flying object that flies in the air, or a water moving object (for example, a ship) that moves on water.
  • the flying object may be, for example, a rotary wing aircraft, an airplane, an airship, or any other flying object. From the viewpoint of being movable to any position, it is preferable that the flying object is a rotary wing aircraft.
  • the rotary wing aircraft may be, for example, a helicopter or a multicopter having three or more rotors.
  • the multicopter may be, for example, a quadrotor with four rotors, or may have other numbers of rotors.
  • the moving mechanism 16 may include, for example, a propeller and driving means such as a motor or an engine that drives the propeller.
  • when the moving body 1 is a water moving body, the moving mechanism 16 may include, for example, a screw, driving means such as a motor or engine that drives the screw, a rudder, and a steering mechanism that can change the direction of the rudder.
  • each process or each function may be realized by centralized processing by a single device or a single system, or by distributed processing by multiple devices or multiple systems.
  • information exchanged between components may be passed, for example, by one component outputting the information and another component receiving it when the two components are physically different, or, when the two components are physically the same, by moving from a processing phase corresponding to one component to a processing phase corresponding to the other.
  • information related to the processing executed by each component, for example information accepted, acquired, selected, generated, transmitted, or received by each component, and information such as threshold values, formulas, and addresses used by each component in processing, may be held temporarily or for a long period in a recording medium (not shown), even if not stated in the above description. Each component, or a storage unit (not shown), may store information in that recording medium, and each component, or a reading unit (not shown), may read information from it.
  • the information used in each component, such as threshold values and addresses, may be changeable by the user, even if not stated in the above description; the user may or may not be able to change such information as appropriate.
  • such a change may be realized, for example, by a reception unit (not shown) that receives a change instruction from the user and a change unit (not shown) that changes the information in accordance with the instruction.
  • the reception of the change instruction may be, for example, reception from an input device, reception of information transmitted via a communication line, or reading of information from a predetermined recording medium.
  • two or more components included in the moving body 1 may physically share a single device, or may have separate devices.
  • each component may be configured by dedicated hardware, or components that can be realized by software may be realized by executing a program.
  • each component can be realized by a program execution unit such as a CPU reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the program execution section may execute the program while accessing the storage section or recording medium.
  • the software that implements the mobile object 1 in the above embodiment is, for example, the following program.
  • this program causes a computer to execute: a step of identifying the tracking target in a captured image acquired by an image sensor that captures the surroundings of a moving body that moves while following the tracking target, and identifying the position of the tracking target in the local coordinate system using the measurement results, corresponding to the region of the identified tracking target, of a distance sensor that measures distances to surrounding objects in multiple directions; and a step of controlling a moving mechanism that moves the moving body so as to follow the identified tracking target.
  • note that steps such as a step of acquiring information or a step of accumulating information do not include, at least, processing that can only be performed by hardware, for example processing performed by an interface card in the step of acquiring information.
  • this program may be executed by being downloaded from a server or the like, or by reading out a program recorded on a predetermined recording medium (for example, an optical disk such as a CD-ROM, a magnetic disk, or a semiconductor memory). This program may also be used as a program constituting a program product.
  • the number of computers that execute this program may be one or more. That is, centralized processing or distributed processing may be performed.
  • FIG. 10 is a diagram showing an example of a computer system 901 that executes the above program to realize the mobile object 1 according to the above embodiment.
  • the above embodiments may be implemented by computer hardware and a computer program executed on the computer hardware.
  • the computer system 901 includes an MPU (Micro Processing Unit) 911, a ROM 912 such as a flash memory, a RAM 913 that is connected to the MPU 911, temporarily stores instructions of an application program, and provides a temporary storage space, a hard disk 914, and a bus 915 that interconnects the MPU 911, the ROM 912, and the like.
  • Programs such as a boot-up program, application programs, system programs, and data may be stored in the ROM 912 or the hard disk 914.
  • the program is loaded into RAM 913 during execution. Note that the program may be loaded directly from the network.
  • the computer system 901 may include a wireless communication module and an input device such as a touch panel.
  • the program need not include an operating system (OS), a third-party program, or the like that causes the computer system 901 to execute the functions of the moving body 1 according to the above embodiment; it may include only those portions of instructions that call appropriate functions or modules in a controlled manner to achieve the desired results. How the computer system 901 operates is well known, and a detailed description is omitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present invention addresses the problem of providing a moving body that can move while appropriately following a subject to be followed. The solution according to the present invention is a moving body (1) that moves while following a subject to be followed, comprising: a ranging sensor (11) that measures distances to surrounding objects in a plurality of directions; an image sensor (12) that acquires a captured image of the surroundings of the moving body (1); a specifying unit (13) that specifies the subject to be followed in the captured image acquired by the image sensor (12), and specifies the position of the subject in a local coordinate system using the measurement result of the ranging sensor (11) corresponding to a region of the subject specified in the captured image; a moving mechanism (16) for moving the moving body (1); and a movement control unit (17) that controls the moving mechanism (16) so as to follow the subject specified by the specifying unit (13).
PCT/JP2023/017555 2022-05-12 2023-05-10 Corps mobile et programme WO2023219101A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-078613 2022-05-12
JP2022078613A JP7481029B2 (ja) 2022-05-12 2022-05-12 移動体、及びプログラム

Publications (1)

Publication Number Publication Date
WO2023219101A1 true WO2023219101A1 (fr) 2023-11-16

Family

ID=88730237

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/017555 WO2023219101A1 (fr) 2022-05-12 2023-05-10 Corps mobile et programme

Country Status (2)

Country Link
JP (2) JP7481029B2 (fr)
WO (1) WO2023219101A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006185239A (ja) * 2004-12-28 2006-07-13 Toshiba Corp ロボット装置、ロボット装置の移動追従方法、および、プログラム
WO2019159519A1 (fr) * 2018-02-13 2019-08-22 セイコーエプソン株式会社 Système de commande de déplacement de véhicule de transport et procédé de commande de déplacement de véhicule de transport
JP2020155033A (ja) * 2019-03-22 2020-09-24 株式会社豊田自動織機 移動車両
WO2021241189A1 (fr) * 2020-05-25 2021-12-02 ソニーセミコンダクタソリューションズ株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7081194B2 (ja) 2018-02-13 2022-06-07 セイコーエプソン株式会社 搬送車の走行制御システム、及び、搬送車の走行制御方法


Also Published As

Publication number Publication date
JP2023167432A (ja) 2023-11-24
JP2024031992A (ja) 2024-03-07
JP7481029B2 (ja) 2024-05-10

Similar Documents

Publication Publication Date Title
US10234278B2 (en) Aerial device having a three-dimensional measurement device
JP7141403B2 (ja) 実時間オンライン自己運動推定を備えたレーザスキャナ
US10152059B2 (en) Systems and methods for landing a drone on a moving base
EP3591490B1 (fr) Procédé et dispositif d'évitement d'obstacle, et véhicule aérien autonome
KR102159376B1 (ko) 레이저 스캔 시스템, 레이저 스캔 방법, 이동 레이저 스캔 시스템 및 프로그램
KR20210109529A (ko) 파이프에 uav 착지를 위한 자동화 방법
JP2014119828A (ja) 自律飛行ロボット
JP2014119901A (ja) 自律移動ロボット
JP2018109564A (ja) 情報処理装置、情報処理方法、および情報処理プログラム
TW201904643A (zh) 控制裝置、飛行體以及記錄媒體
JP7424390B2 (ja) 画像処理装置、画像処理方法、および画像処理プログラム
JP6014484B2 (ja) 自律移動ロボット
JP7385388B2 (ja) 自己位置推定装置
WO2021081958A1 (fr) Procédé de détection de terrain, plateforme mobile, dispositif de commande, système et support de stockage
JP2017182690A (ja) 自律移動ロボット
WO2023219101A1 (fr) Corps mobile et programme
EP3943979A1 (fr) Localisation de dispositif intérieur
JP6745111B2 (ja) 移動体
WO2020079309A1 (fr) Détection d'obstacle
US20210149412A1 (en) Position estimating apparatus, method for determining position of movable apparatus, and non-transitory computer readable medium
CN111736622B (zh) 基于双目视觉与imu相结合的无人机避障方法及系统
JP2746487B2 (ja) 垂直離着陸航空機の機体位置測定方法
JP7199337B2 (ja) 位置推定装置、位置推定方法およびプログラム
WO2022227096A1 (fr) Procédé de traitement de données de nuage de points, dispositif, et support de stockage
de la Puente et al. Extraction of geometrical features in 3d environments for service robotic applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23803576

Country of ref document: EP

Kind code of ref document: A1