WO2018074069A1 - Mobile body tracking control device

Mobile body tracking control device

Info

Publication number
WO2018074069A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
tracking
wheelchair
moving
movement
Application number
PCT/JP2017/030890
Other languages
French (fr)
Japanese (ja)
Inventor
安藤 充宏
博敏 落合
高柳 渉
Original Assignee
Aisin Seiki Co., Ltd. (アイシン精機株式会社)
Application filed by Aisin Seiki Co., Ltd. (アイシン精機株式会社)
Publication of WO2018074069A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions

Description

  • Embodiments of the present invention relate to a moving body tracking control device.
  • Tracking methods for moving bodies have been proposed in which the movement path of an object to be tracked (an operator or the like) is calculated, a tracking path is set, and tracking (following) is executed.
  • A follow-up type traveling vehicle that automatically follows an operator, without requiring the operator to perform start and stop operations, has also been proposed.
  • One object of the present invention is to provide a moving body tracking control device that can reduce the calculation load and smoothly track (follow) a tracking target (an operator or other target).
  • The mobile body tracking control device includes, for example: a target specifying unit that specifies the tracking target object that the mobile body tracks; an information acquisition unit that acquires information indicating the positional relationship between the mobile body and surrounding objects; an acquisition unit that acquires, based on the acquisition result of the information acquisition unit, at least the relative distance between the mobile body and the tracking target object and the moving direction of the tracking target object; a target setting unit that sets a movement target position of the mobile body at a position that lies in a direction deviated by a predetermined angle from the moving direction of the tracking target object and at a first relative distance from the tracking target object; and a control unit that controls the mobile body to move to the movement target position.
  • According to this configuration, the mobile body moves to a position at a predetermined angle with respect to the moving direction of the tracking target object while maintaining the first relative distance.
  • Since the mobile body always remains at a predetermined position relative to the tracking target object while tracking it, management (monitoring) of the mobile body becomes easy.
  • The target setting unit of the mobile body tracking control device may set the movement target position on a circular orbit whose center is the position of the tracking target object and whose radius is defined by the first relative distance. According to this configuration, even when the surroundings of the mobile body change or the tracking target object moves irregularly, the mobile body stays on a circular orbit of constant radius centered on the position of the tracking target object, so a smooth tracking operation can be realized without falling behind the tracking target, as in the sketch below.
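  • The following is a minimal geometric sketch (in Python, with hypothetical names) of how such a movement target position could be computed; it assumes the tracking angle is measured counterclockwise from the target's moving direction, as in the embodiment of FIG. 5, and is not the claimed implementation itself.

```python
import math

def movement_target_position(target_xy, target_heading, tracking_angle, radius):
    """Place the moving body on a circle of the given radius (first relative
    distance) centered on the tracking target, in a direction deviated from the
    target's moving direction by the tracking angle (radians, counterclockwise)."""
    tx, ty = target_xy
    bearing = target_heading + tracking_angle  # direction from target to tracking position
    return (tx + radius * math.cos(bearing), ty + radius * math.sin(bearing))
```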
  • The target setting unit of the mobile body tracking control device may set the movement target position, for example, in a region obtained by excluding, from the lateral and rear regions of the tracking target object, a predetermined rear region that includes the backward vector of the moving direction.
  • According to this configuration, the mobile body performs the tracking operation while staying in the lateral-to-rear region of the tracking target object, excluding the area directly behind it.
  • Since the mobile body does not travel directly behind the tracking target object, contact between the tracking target object and the mobile body can be prevented even when the tracking target object suddenly stops; a sketch of this exclusion check follows.
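  • A small sketch of the rear-region exclusion (Python, hypothetical names): angles are assumed to be measured from the target's moving direction, so the backward vector lies at pi radians, and candidate positions within half the release angle of that vector are rejected.

```python
import math

def is_allowed_tracking_angle(tracking_angle, release_angle):
    """Reject candidate tracking angles falling inside the excluded rear region,
    i.e. within +/- release_angle / 2 of the backward vector (at pi radians)."""
    offset_from_rear = abs(math.remainder(tracking_angle - math.pi, 2 * math.pi))
    return offset_from_rear > release_angle / 2
```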
  • The target setting unit of the moving body tracking control device may change the first relative distance according to, for example, the surrounding state of the moving body.
  • According to this configuration, for example, when the surroundings of the moving body are congested, the moving body can be brought closer to the tracking target object.
  • As a result, the moving body can move smoothly even when its surroundings are congested.
  • Conversely, when the surroundings of the moving body are sparse, the relative distance between the moving body and the tracking target object can be increased, so that, for example, the moving body can track the tracking target object in a state where the mutual feeling of pressure is reduced.
  • The acquisition unit of the mobile body tracking control device may acquire a second relative distance between the mobile body and an obstacle present around the mobile body, based on the acquisition result of the information acquisition unit, and the target setting unit may set the movement target position at a position closer to the tracking target object than the first relative distance when the second relative distance is equal to or less than a predetermined distance threshold.
  • The acquisition unit of the mobile body tracking control device may also acquire a predicted contact time until an obstacle present around the mobile body comes into contact with the mobile body, based on the acquisition result of the information acquisition unit. When the second relative distance is equal to or less than a predetermined distance threshold, or when the predicted contact time is equal to or less than a predetermined time threshold, the control unit may control the mobile body to move to a movement target position on the side of the tracking target object opposite the obstacle, through a trajectory that moves away from the obstacle.
  • According to this configuration, the mobile body moves so as to increase its relative distance from the obstacle.
  • As a result, the probability of contact with the obstacle is further reduced, and a smoother avoidance movement of the mobile body with respect to the obstacle can easily be performed.
  • FIG. 1 is a perspective view of a wheelchair that is an example of a moving body on which the moving body tracking control device according to the embodiment is mounted.
  • FIG. 2 is a block diagram illustrating an example of a moving body tracking system including a moving body tracking control apparatus according to the embodiment.
  • FIG. 3 is an explanatory diagram illustrating an example of a detection range of a laser sensor that detects a surrounding situation mounted on the moving body of the moving body tracking control device according to the embodiment.
  • FIG. 4 is a block diagram illustrating an example of a configuration of a control unit (CPU) that realizes a tracking operation realized in the ECU of the moving body tracking control device according to the embodiment.
  • FIG. 5 is an explanatory diagram illustrating an example of the tracking position of the moving object of the moving object tracking control device according to the embodiment.
  • FIG. 6 is an explanatory diagram illustrating an example of the movement target position of the moving object when the tracking target object travels straight ahead in the moving object tracking control device according to the embodiment.
  • FIG. 7 is an explanatory diagram illustrating an example of the movement target position of the moving object when the tracking target object moves obliquely forward to the right in the moving object tracking control device according to the embodiment.
  • FIG. 8 is an explanatory diagram illustrating a case, in the moving body tracking control device according to the embodiment, where the tracking target object moves to the same position as in FIG. 7 but its orientation (movement direction) at that position differs from that in FIG. 7.
  • FIG. 9 is a flowchart for explaining the first half of the processing procedure for realizing an example of the tracking operation of the mobile object in the mobile object tracking control device according to the embodiment.
  • FIG. 10 is a flowchart for explaining the second half of the processing procedure for realizing an example of the tracking operation of the mobile object in the mobile object tracking control device according to the embodiment.
  • FIG. 11 is an explanatory diagram illustrating an example of the tracking position of the moving object when the degree of congestion around the moving object is low in the moving object tracking control device according to the embodiment.
  • FIG. 12 is an explanatory diagram illustrating an example of the tracking position of the moving object when the degree of congestion around the moving object is higher than in FIG. 11, in the moving object tracking control device according to the embodiment.
  • FIG. 13 is an explanatory diagram illustrating an example of an avoidance position of the moving object when an obstacle approaches from the front of the moving object, and an example of a movement trajectory for moving to the avoidance position, in the moving object tracking control device according to the embodiment.
  • FIG. 14 is an explanatory diagram illustrating an example of an avoidance position of the moving object when an obstacle approaches from the rear of the moving object, and an example of a movement trajectory for moving to the avoidance position, in the moving object tracking control device according to the embodiment.
  • The moving body tracking control device can be applied to a small moving body such as the "wheelchair" shown in FIG. 1.
  • a “wheelchair” is shown as an example of the moving body, but the present invention is not limited to this.
  • As a means of movement for healthy persons, a moving vehicle that a person rides in a sitting posture or a standing posture may also be used.
  • The moving body is a vehicle including a drive source such as a motor, and may have a normal travel mode in which the user (passenger) moves the vehicle by maneuvering (driving) it, as well as a tracking travel mode in which a tracking target object is automatically specified and tracked.
  • The moving body may also be an unmanned vehicle including a loading platform on which something other than a human (for example, luggage or an animal) can be loaded.
  • The moving body may travel alone, or a plurality of moving bodies may travel either coupled together or in a separated, uncoupled state.
  • FIG. 1 shows a specific configuration in which a moving body tracking control device is mounted on a wheelchair 10 as an example of a moving body.
  • the wheelchair 10 is a foldable chair in which a seat 14 and a back support 16 formed of, for example, a flexible cloth material or a resin material are fixed to a metal frame 12.
  • A retractable foot support 18 is provided at the lower front portion of the seat 14 so that the user can rest his or her feet while seated.
  • A leg support 20 is provided between the foot support 18 and the seat 14 to support the user's shins and prevent the feet from slipping under the seat 14 when the user is seated on the seat 14.
  • Driving wheels 22 are arranged on the left and right sides of the seat 14.
  • A ring-shaped hand rim 22a is integrally fixed to the outer side of each drive wheel 22.
  • the hand rim 22a realizes a self-running mode in which the user sitting on the seat 14 grips and rotates the hand rim 22a by hand to drive the driving wheel 22 and move the wheelchair 10.
  • The drive wheels 22 of the wheelchair 10 of this embodiment are provided with a drive system 24 and are configured so that automatic traveling is possible.
  • the drive wheel 22 is provided with, for example, a drum brake system 26, and the drive wheel 22 can be decelerated, stopped, and held in a stopped state.
  • the brake system 26 can also be integrated with the drive system 24.
  • a pair of casters 28 for determining the traveling direction of the wheelchair 10 are provided in front of the drive wheels 22.
  • the casters 28 are connected to a steering system 30 composed of gears and the like.
  • The steering system 30 allows the casters 28 to turn freely in the self-propelled mode, in which the user rotates the hand rims 22a to move the wheelchair 10, or when a companion or the like grips the handles 12a to push the wheelchair 10; that is, the steering direction is not restricted.
  • In the normal travel mode or the tracking travel mode, the steering system 30 turns the casters 28 in a predetermined direction in accordance with a turning instruction from the user or tracking control by the ECU 44 (see FIG. 2), thereby determining the traveling direction of the wheelchair 10.
  • the drive system 24, the brake system 26, and the steering system 30 can be driven by receiving power supply from the battery 32a of the battery system 32 disposed on the lower surface of the seat 14, for example.
  • a control box 34 is arranged at a position where the user can reach in a posture where the user is seated on the seat 14, for example, at a side of the seat 14.
  • the control box 34 contains a control unit for causing the wheelchair 10 to travel in the normal travel mode or the tracking travel mode.
  • the control box 34 may be provided with an operation unit 36 for instructing a moving speed and a moving direction when the user moves the wheelchair 10 in the normal driving mode.
  • the operation unit 36 can be a joystick that can be tilted in an arbitrary direction of 360 °, for example. In this case, the traveling direction of the wheelchair 10 can be instructed in the direction in which the joystick is tilted, and the moving speed of the wheelchair 10 can be instructed by the amount by which the joystick is tilted.
  • the wheelchair 10 can be stopped by returning the joystick to the neutral position (upright position). In this case, the actuator 26a of the brake system 26 may be driven to maintain the stopped state of the drive wheels 22.
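  • As a concrete illustration of the joystick mapping described above, the following sketch (Python, hypothetical names and scaling) converts a tilt into a commanded traveling direction and speed, with the neutral position meaning stop.

```python
import math

def joystick_to_command(tilt_x, tilt_y, max_speed):
    """Direction of tilt gives the traveling direction; amount of tilt gives the
    moving speed (0 at the neutral position, max_speed at full tilt)."""
    amount = min(math.hypot(tilt_x, tilt_y), 1.0)
    direction = math.atan2(tilt_y, tilt_x)
    return direction, amount * max_speed
```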
  • The control box 34 is provided with a laser sensor 38 (front sensor 38F) as an example of an information acquisition unit that acquires information indicating the positional relationship of objects existing in the region mainly in front of the wheelchair 10.
  • A laser sensor 38 (rear sensor 38R) that acquires information indicating the positional relationship of objects in the region mainly behind the wheelchair 10 is also provided.
  • The rear sensor 38R is supported by a casing constituting the battery system 32. It is desirable that the front sensor 38F and the rear sensor 38R together cover the range around the wheelchair 10, for example a range of 360°.
  • The positions where the front sensor 38F and the rear sensor 38R are installed are only examples and can be changed as appropriate as long as the situation around the wheelchair 10 can be acquired. Moreover, although this embodiment shows an example with two laser sensors 38, one sensor, or three or more, may be used as long as the range around the wheelchair 10 (for example, 360°) can be covered. Increasing the number of laser sensors 38 can reduce or eliminate the non-detection range (blind spots).
  • the wheelchair 10 is provided with a hand brake lever 40 that keeps the stop state of the drive wheel 22 by bringing the brake pad 40 a into contact with the tire surface of the drive wheel 22.
  • a monitor device 42 that provides the user with various information such as the running state of the wheelchair 10 and the surrounding situation may be provided.
  • The monitor device 42 can be fixed in an arbitrary posture, for example by adjusting the angle of its support arm, so that it is easy to see while the user is seated on the seat 14 and can be arranged so as not to obstruct the sitting posture or the seating operation.
  • FIG. 2 is a block diagram illustrating an example of the mobile tracking system 100 including the mobile tracking control device according to the embodiment.
  • An ECU 44 (electronic control unit) is connected to an interface 102 to realize traveling control (tracking control) of the wheelchair 10 as the moving body.
  • the drive system 24, the brake system 26, the steering system 30, the battery system 32, the laser sensor 38, the steering angle sensor 46, the wheel speed sensor 48, and the like are connected to the interface 102.
  • the drive system 24 includes an actuator 24a and a torque sensor 24b.
  • the drive system 24 is electrically controlled by the ECU 44 or the like, and drives the actuator 24a (for example, a motor) when the wheelchair 10 is in the normal travel mode or the tracking travel mode.
  • the drive system 24 adjusts the speed and torque of the drive wheels 22 and moves the wheelchair 10 in a forward direction or a reverse direction at a desired speed (for example, a speed indicated by an operation amount of the operation unit 36).
  • the torque sensor 24b detects the torque of the actuator 24a, and the ECU 44 analyzes the torque generated in the actuator 24a, thereby correcting the command value of the actuator 24a and improving the speed control accuracy of the wheelchair 10.
  • the brake system 26 includes an actuator 26a and a brake sensor 26b.
  • the brake system 26 is electrically controlled by the ECU 44 or the like, and operates an actuator 26a (for example, a piston) to change, for example, the pressing state of the brake shoe against the drum of the drum brake. By changing the pressing state of the brake shoe, the speed of the drive wheel 22 is adjusted, the drive wheel 22 is stopped, or the stop state of the drive wheel 22 is maintained.
  • the brake sensor 26b is a sensor that detects an operation state (depression amount or the like) of a brake pedal disposed on the foot support 18, for example.
  • the steering system 30 includes an actuator 30a and a torque sensor 30b.
  • the steering system 30 is electrically controlled by the ECU 44 and the like to apply torque to gears and the like by driving an actuator 30a (for example, a motor) when the wheelchair 10 is driven in the normal driving mode or the tracking driving mode.
  • the caster 28 is turned.
  • the traveling direction of the wheelchair 10 is changed by changing the direction in which the caster 28 faces (the steering direction).
  • the actuator 30a may steer one caster 28, or may steer the left and right casters 28.
  • the torque sensor 30b detects, for example, torque generated on the turning shaft of the caster 28, and detects whether the turning state of the caster 28 by the actuator 30a is being executed as instructed.
  • the ECU 44 performs feedback control of the actuator 30a according to the detection result, and improves the turning accuracy of the caster 28.
  • the steering angle sensor 46 is, for example, a sensor that detects the direction of the caster 28.
  • The steering angle sensor 46 is configured using, for example, a Hall element, and acquires, for example, angles in the positive direction (clockwise) and the negative direction (counterclockwise) with respect to the forward direction of the wheelchair 10.
  • the wheel speed sensor 48 is, for example, a sensor that detects the amount of rotation of the drive wheels 22 and the number of rotations per unit time.
  • the wheel speed sensor 48 is disposed on each drive wheel 22 and outputs a wheel speed pulse number indicating the rotation speed detected by each drive wheel 22 as a sensor value.
  • the wheel speed sensor 48 can be configured using, for example, a hall element.
  • the ECU 44 calculates the amount of movement of the wheelchair 10 based on the sensor value acquired from the wheel speed sensor 48 and executes various controls.
  • In addition, the ECU 44 determines the vehicle speed of the wheelchair 10 based on the speed of whichever of the left and right drive wheels 22 has the smaller sensor value, and executes various controls.
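  • The following is a rough sketch (Python, hypothetical parameter names) of the two calculations just described: converting wheel speed pulses into a movement amount and taking the vehicle speed from the slower drive wheel.

```python
import math

def movement_amount(pulse_count, pulses_per_revolution, wheel_diameter_m):
    """Convert wheel speed pulses into the distance travelled by a drive wheel."""
    revolutions = pulse_count / pulses_per_revolution
    return revolutions * math.pi * wheel_diameter_m

def vehicle_speed(left_wheel_speed, right_wheel_speed):
    """Use the drive wheel with the smaller sensor value as the vehicle speed."""
    return min(left_wheel_speed, right_wheel_speed)
```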
  • the battery system 32 includes a battery 32a for charging / discharging, a voltage sensor 32b, and a current sensor 32c.
  • The battery 32a may be of a type that is charged by connection to an external power source, or it may be charged while the drive wheels 22 rotate in the normal travel mode; it may also be charged by a regenerative operation of the drive wheels 22.
  • The voltage sensor 32b and the current sensor 32c are used to manage the voltage value and the current value when the battery 32a is charged and discharged; when the battery 32a is charged by regeneration, they are used to keep the battery 32a in an optimal charging state.
  • the laser sensor 38 (front sensor 38F, rear sensor 38R) functions as an information acquisition unit that acquires information indicating the positional relationship of objects around the wheelchair 10 as a moving body.
  • In the laser sensor 38, a laser beam emitted from a light source (a laser diode or the like) inside the sensor is reflected when it hits a measurement target (for example, the tracking target object or an obstacle) and is received by a light receiving element.
  • The distance to the position where the laser beam was reflected is calculated by evaluating the received reflected light, and by comparing this information in time series, the moving direction of a moving measurement target can also be calculated.
  • As shown in FIG. 3, the front sensor 38F of the laser sensor 38 is directed to the front of the wheelchair 10 and has, for example, a first region FE including the front, at a detection angle θF, as its detection range.
  • The rear sensor 38R is directed to the rear of the wheelchair 10 and has, for example, a second region RE including the rear and the sides, at a detection angle θR, as its detection range.
  • Depending on the arrangement of the laser sensors 38, a non-detection region (blind spot) may be formed.
  • In the case of FIG. 3, the first region FE and the second region RE overlap each other in an overlapping region CE, so the non-detection region DE is reduced.
  • the interface 102 is further connected to an ECU 44, a monitor device 42, and the like.
  • The ECU 44 can control the drive system 24, the brake system 26, the steering system 30, the battery system 32, and the like by sending control signals through the interface 102. Further, via the interface 102, the ECU 44 can receive the detection results of the torque sensor 24b, the brake sensor 26b, the torque sensor 30b, the voltage sensor 32b, the current sensor 32c, the steering angle sensor 46, the wheel speed sensor 48, the laser sensor 38, and the like, as well as operation signals from the operation unit 36 and the like.
  • the ECU 44 includes, for example, a CPU 44a (central processing unit), a ROM 44b (read only memory), a RAM 44c (random access memory), an SSD 44d (solid state drive, flash memory), and the like.
  • The CPU 44a specifies, for example, a tracking target object (for example, a companion, a guide, or a leading conductor) to be tracked by the wheelchair 10, and executes tracking control so that the wheelchair 10 follows the tracking target object while keeping the relative distance from it constant.
  • In the normal travel mode, the CPU 44a controls the drive system 24, the steering system 30, the battery system 32, and the like so as to realize traveling of the wheelchair 10 in accordance with the moving direction and moving speed instructed by the user via the operation unit 36. When the wheelchair 10 is used in the self-propelled mode, the drive system 24 and the steering system 30 are disconnected from the control system so that the drive wheels 22 can be rotated easily under a light load and the casters 28 can be turned in an arbitrary direction. In this case, the brake system 26 may remain in a controllable state and may be used as an emergency stop brake in the self-propelled mode.
  • the CPU 44a reads a program stored (installed) in a non-volatile storage device such as the ROM 44b, and executes arithmetic processing according to the program.
  • the RAM 44c temporarily stores various data used in the calculation by the CPU 44a.
  • the SSD 44d is a rewritable nonvolatile storage unit, and can store data even when the power of the ECU 44 is turned off.
  • the CPU 44a, ROM 44b, RAM 44c and the like can be integrated in the same package.
  • the ECU 44 may have a configuration in which another logical operation processor, a logic circuit, or the like such as a DSP (digital signal processor) is used instead of the CPU 44a.
  • An HDD (hard disk drive) may be provided in place of the SSD 44d, and the SSD 44d or HDD may be provided separately from the ECU 44.
  • the monitor device 42 includes a display device 42a, an operation input unit 42b, and an audio output device 42c.
  • The display device 42a may be, for example, an LCD (liquid crystal display) or an OELD (organic electroluminescent display). The surroundings of the wheelchair 10, for example the positions of obstacles, may be displayed on the display device 42a. In particular, a rear view or other situation that is difficult to see in the sitting posture can be captured with an imaging device (camera) and displayed. Further, when the wheelchair 10 is in the normal travel mode or the tracking travel mode, the control state of the wheelchair 10, for example the charging state of the battery 32a or the traveling speed, may be displayed.
  • the display device 42a may be a simple device that includes only an indicator lamp.
  • the approach of an obstacle may be displayed by a change in display mode such as a change in display color of the indicator lamp or blinking.
  • the operation input unit 42b can be configured with a switch, a dial, a push button, and the like.
  • the operation input unit 42b may be realized on the display screen of the display device 42a.
  • The audio output device 42c outputs a message (such as an alarm) when the wheelchair 10 is in the normal travel mode or the tracking travel mode, for example when suddenly stopping, when changing course, or when an obstacle exists, and it may also notify the user of the state of the battery 32a.
  • the CPU 44a included in the ECU 44 includes various modules as shown in FIG. 4 in order to realize the tracking traveling as described above.
  • the CPU 44a includes an acquisition unit 58, a target identification unit 60, an obstacle identification unit 62, a target setting unit 64, a travel control unit 66, a display control unit 68, a voice control unit 70, an output unit 72, and the like as modules.
  • the acquisition unit 58 includes a relative distance acquisition unit 58a, a movement direction acquisition unit 58b, a movement speed acquisition unit 58c, an obstacle distance calculation unit 58d, a contact prediction time calculation unit 58e, and the like.
  • the target setting unit 64 includes an orbital circle setting unit 64a, a movement target position setting unit 64b, a surrounding situation recognition unit 64c, and the like.
  • the traveling control unit 66 includes a straight traveling speed calculation unit 66a, a turning speed calculation unit 66b, a steering angle setting unit 66c, a battery control unit 66d, a motor control unit 66e, and the like. These modules can be realized by reading a program installed and stored in a storage device such as the ROM 44b and executing it.
  • By executing processing with these modules, the CPU 44a can make the wheelchair 10 track the tracking target object while keeping it in a predetermined direction and at a predetermined relative distance from the tracking target object.
  • The acquisition unit 58 acquires the relative distance between the wheelchair 10 and surrounding objects, and the moving direction of those objects, based on the acquisition result of the laser sensor 38 (information acquisition unit), which acquires information indicating the positional relationship of objects around the wheelchair 10 as the moving body.
  • The laser sensor 38 transmits and receives laser light, for example, every 1/16 second. Based on the detection result of the laser sensor 38, the relative distance acquisition unit 58a detects the presence or absence of objects around the wheelchair 10 and, when an object exists, the relative distance between the object and the wheelchair 10.
  • the moving direction acquisition unit 58b acquires the presence / absence and moving direction of the object by comparing the detection results of the laser sensor 38 in time series. For example, by comparing the position of an object at a certain detection timing with the position of the same object at the next detection timing, a direction vector that connects the two can be obtained. This direction vector is the moving direction of the object.
  • the moving speed acquisition unit 58c calculates the moving speed of the object detected around the wheelchair 10 by differentiating the detection result of the laser sensor 38.
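  • The time-series comparison just described can be sketched as follows (Python; the dt value assumes the example 1/16-second detection interval, and the data layout is hypothetical): the vector connecting the same object's positions at two consecutive detection timings gives its moving direction, and its length divided by dt approximates the moving speed.

```python
import math

def direction_and_speed(prev_xy, curr_xy, dt=1.0 / 16):
    """Estimate an object's moving direction and speed from two consecutive
    detections of its position."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    heading = math.atan2(dy, dx)      # moving direction of the object
    speed = math.hypot(dx, dy) / dt   # approximate moving speed
    return heading, speed
```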
  • the target specifying unit 60 specifies a tracking target object that the wheelchair 10 tracks from among objects existing around the wheelchair 10 acquired by the acquiring unit 58.
  • For example, a moving object (a human being, an animal, or another moving body) among the objects around the wheelchair 10 may be specified as the tracking target object.
  • Alternatively, the situation around the wheelchair 10 may be imaged by an imaging unit (for example, a camera), the objects included in the captured image may be displayed on the display device 42a, and the tracking target object may be specified by the user via the operation input unit 42b or the like.
  • The object to be tracked may also carry a transmitter (for example, a beacon), and the target specifying unit 60 may regard the object carrying the transmitter as the tracking target by receiving the signal from the transmitter.
  • A tracking target object may also be specified by extracting an object included in an image captured by an imaging unit (for example, a camera) using a known pattern matching technique or the like.
  • the obstacle specifying unit 62 specifies an object existing around the wheelchair 10 as an obstacle other than the object (tracking target object) specified by the target specifying unit 60.
  • the obstacle distance calculating unit 58d calculates a relative distance (second relative distance) between the obstacle specified by the obstacle specifying unit 62 and the wheelchair 10.
  • the moving speed acquisition unit 58c also acquires the moving speed for the obstacle specified by the obstacle specifying unit 62.
  • The predicted contact time calculation unit 58e uses the obstacle's moving speed acquired by the moving speed acquisition unit 58c and the speed of the wheelchair 10 based on the sensor value of the wheel speed sensor 48 to calculate the predicted contact time until the obstacle and the wheelchair 10 come into contact.
  • the second relative distance calculated by the obstacle distance calculating unit 58d and the predicted contact time calculated by the predicted contact time calculating unit 58e are used when the wheelchair 10 determines an avoidance exercise when avoiding the obstacle.
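  • A minimal sketch of such a predicted contact time (Python, assumed form rather than the embodiment's exact computation): the current obstacle distance divided by the speed at which the wheelchair and the obstacle are closing on each other.

```python
def predicted_contact_time(second_relative_distance, closing_speed):
    """Time until contact, assuming a constant closing speed; returns infinity
    when the wheelchair and the obstacle are not approaching each other."""
    if closing_speed <= 0.0:
        return float("inf")
    return second_relative_distance / closing_speed
```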
  • For example, when the second relative distance becomes equal to or less than a predetermined distance threshold (for example, 3 m or less), or when the predicted contact time becomes equal to or less than a predetermined time threshold, the CPU 44a executes an avoidance movement to avoid the obstacle.
  • The avoidance movement is realized, for example, by reducing the radius R1 of the concentric circle 76a, or by changing the tracking position of the wheelchair 10 (its position as viewed from the target 74) to the side of the target 74 where there are fewer obstacles.
  • the target setting unit 64 determines the movement target position of the wheelchair 10 based on the tracking target object determined by the target specifying unit 60.
  • The movement target position of the wheelchair 10 as the moving body is set, for example, on a concentric circle of radius R1 corresponding to the tracking interval (first relative distance), centered on the target 74 as the tracking target object, as shown in FIG. 5.
  • The orbital circle setting unit 64a always sets a concentric circle 76a (circular orbit) of radius R1 around the tracking target object (target 74) specified by the target specifying unit 60, regardless of whether the target 74 is moving or stopped.
  • The movement target position setting unit 64b sets the movement target position of the wheelchair 10 on the concentric circle 76a set by the orbital circle setting unit 64a around the target 74 specified by the target specifying unit 60, at a position defined by the tracking angle θ with respect to the target moving direction P of the target 74 acquired by the moving direction acquisition unit 58b. In the example of FIG. 5, the movement target position of the wheelchair 10 is set at a position rotated (shifted) counterclockwise by a predetermined angle (tracking angle θ).
  • As the tracking angle θ, the movement target position setting unit 64b sets the movement target position in a region obtained by excluding, from the side and rear regions of the target 74 (tracking target object), a predetermined rear region (release angle θN) that includes the reverse vector PV of the target moving direction P. That is, the movement target position of the wheelchair 10 is not set at a position directly behind the target 74.
  • The movement target position of the wheelchair 10 may basically be set anywhere on the concentric circle 76a excluding the rear region (release angle θN). However, in order to give the user of the wheelchair 10 a greater sense of security, it is preferably set at a rearward position relative to the target 74 (a position defined by the standard angle θG). In this case, the target 74 (here, an attendant) always remains in the field of view of the user sitting on the wheelchair 10, which gives a sense of security. Moreover, since the human field of view is, for example, about 120°, the target 74 can confirm the wheelchair 10 located at the standard angle θG merely by turning the head slightly while facing the target moving direction P. As a result, the attendant who is the target 74 can move while confirming the user sitting on the wheelchair 10 (for example, while having a conversation), and even if the tracked wheelchair 10 becomes separated from the target 74 for some reason, the situation can be recognized quickly.
  • the surrounding situation recognition unit 64c detects the degree of congestion of obstacles around the wheelchair 10 specified by the obstacle specifying unit 62.
  • The ROM 44b holds in advance a congestion degree M indicating a density of obstacles per unit area, and the surrounding situation recognition unit 64c recognizes whether the surroundings of the wheelchair 10 are in a "dense" state at or above the congestion degree M or in a "sparse" state below it.
  • In the "sparse" state, the orbital circle setting unit 64a sets the radius R1 of the orbital circle (first relative distance) to, for example, a standard distance (for example, 1 m).
  • In the "dense" state, the orbital circle setting unit 64a decreases the radius R1 of the orbital circle (first relative distance), for example from 1 m to 0.5 m, so that the wheelchair 10 moves closer to the target 74 and can travel easily even in a crowded state.
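  • A sketch of this congestion-dependent radius selection (Python; the threshold comparison and the 1 m / 0.5 m radii follow the examples in the text, while the function and parameter names are hypothetical):

```python
def orbit_radius(obstacle_density, congestion_threshold,
                 standard_radius=1.0, reduced_radius=0.5):
    """Use the reduced radius in a 'dense' state so the wheelchair keeps closer to
    the target; otherwise use the standard radius."""
    if obstacle_density >= congestion_threshold:
        return reduced_radius   # crowded surroundings
    return standard_radius      # sparse surroundings
```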
  • the traveling control unit 66 performs overall control related to traveling of the wheelchair 10.
  • The straight traveling speed calculation unit 66a calculates the straight traveling speed of the wheelchair 10, that is, the rotational speed of the drive wheels 22, based on the moving speed of the target 74 acquired by the moving speed acquisition unit 58c.
  • The turning speed calculation unit 66b calculates the turning speed ωM of the casters 28 in accordance with the current speed of the wheelchair 10, based on the moving direction of the target 74 acquired by the moving direction acquisition unit 58b and the sensor value acquired by the wheel speed sensor 48.
  • the steering angle setting unit 66c determines the control angle of the caster 28, that is, the control amount of the actuator 30a of the steering system 30, so that the wheelchair 10 reaches the movement target position set by the movement target position setting unit 64b.
  • The battery control unit 66d controls the output value of the battery system 32 so that the drive system 24 can realize a speed at which the wheelchair 10 reaches the movement target position set by the movement target position setting unit 64b before the movement target position of the next control cycle is determined.
  • The battery control unit 66d also controls charging and discharging of the battery 32a and manages the battery level.
  • The motor control unit 66e acquires the load state of the actuator 24a (motor) based on the torque value obtained from the torque sensor 24b, and increases or decreases the speed so that the wheelchair 10 can reach the movement target position set by the movement target position setting unit 64b before the movement target position of the next control cycle is determined.
  • As a result, the output value of the actuator 24a (motor) is determined in accordance with the road surface state (gradient, unevenness, etc.).
  • The display control unit 68 executes processing for displaying the traveling state (speed, traveling direction, etc.) of the wheelchair 10 in the normal travel mode and the tracking travel mode, the charging state of the battery 32a, the number and positions of obstacles around the wheelchair 10, and the like.
  • the voice control unit 70 executes processing for providing information on the driving state of the wheelchair 10 or an alarm when there is an obstacle or the like in the normal traveling mode or the tracking traveling mode.
  • the output unit 72 outputs the control results from the travel control unit 66 to each system, and outputs the control results from the display control unit 68 and the voice control unit 70 to the monitor device 42.
  • In the normal travel mode, the travel control unit 66 executes control by the straight traveling speed calculation unit 66a, the turning speed calculation unit 66b, the steering angle setting unit 66c, the battery control unit 66d, the motor control unit 66e, and the like in accordance with the operation content of the operation unit 36 (for example, the direction and amount of joystick tilt), and drives the wheelchair 10 according to the user's instructions.
  • FIGS. 6 to 8 show cases in which the movement target position is set when there is no obstacle around the wheelchair 10, or when obstacles are present only in a "sparse" (non-congested) state.
  • FIG. 6 shows the movement target position 78 of the wheelchair 10 when the target 74 that is the tracking object moves straight from the current position in the target movement direction P to the position indicated by the dotted line at the target speed VT.
  • the object specifying unit 60 first specifies the target 74 based on the positional relationship of objects around the wheelchair 10 acquired by the acquiring unit 58. For example, an object existing at a position closest to the wheelchair 10 is set as the target 74. Further, the orbital circle setting unit 64a acquires the degree of congestion around the wheelchair 10 based on the recognition result of the surrounding state recognition unit 64c, and when it is not crowded as shown in FIG. Relative distance).
  • In FIG. 6, the wheelchair 10 is currently located on the concentric circle 76a centered on the current position of the target 74, at the position rotated counterclockwise by the tracking angle θ with respect to the target moving direction P of the target 74 acquired by the moving direction acquisition unit 58b; this is the current position of the wheelchair 10 corresponding to the current position of the target 74.
  • When the target 74 moves straight in the target moving direction P, the concentric circle 76a of radius R1 is likewise translated so as to be centered on the target 74 indicated by the dotted line.
  • The movement target position 78 is then set on this concentric circle 76a centered on the dotted-line target 74, at the position rotated counterclockwise by the tracking angle θ with respect to the target moving direction P of the dotted-line target 74.
  • the straight speed calculation unit 66a calculates the straight speed VM of the wheelchair 10 such that the wheelchair 10 reaches the movement target position 78 until the next movement target position is set.
  • In this case, because the wheelchair 10 moves straight, the turning speed calculation unit 66b does not need to calculate a turning speed ω.
  • As a result, the wheelchair 10 can track the target 74 without changing the relative positional relationship as viewed from the target 74.
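  • A rough sketch of the speed calculation described here (Python, assuming simple kinematics rather than the claimed control law): the straight traveling speed VM is chosen so the wheelchair covers the distance to the movement target position within one control cycle, and the turning speed removes the heading error within the same cycle.

```python
import math

def tracking_speeds(current_xy, current_heading, target_position_xy, cycle_time):
    """Compute a straight traveling speed and a turning speed that would bring the
    wheelchair to the movement target position by the next control cycle."""
    dx = target_position_xy[0] - current_xy[0]
    dy = target_position_xy[1] - current_xy[1]
    distance = math.hypot(dx, dy)
    heading_error = math.remainder(math.atan2(dy, dx) - current_heading, 2 * math.pi)
    straight_speed = distance / cycle_time        # VM
    turning_speed = heading_error / cycle_time    # omega
    return straight_speed, turning_speed
```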
  • FIG. 7 shows the movement target position 78 of the wheelchair 10 when the target 74 as the tracking target object makes a turning movement from its current position and moves to the position of the target 74 indicated by the dotted line, with the direction after turning as the target moving direction P.
  • the target specifying unit 60 specifies the target 74 based on the positional relationship of the objects around the wheelchair 10 acquired by the acquiring unit 58.
  • The orbital circle setting unit 64a acquires the degree of congestion around the wheelchair 10 based on the recognition result of the surrounding situation recognition unit 64c and, in the non-congested case shown in FIG. 7, sets a concentric circle 76a of radius R1 (first relative distance).
  • Even when the target 74 turns and then moves as shown in FIG. 7, the concentric circle 76a of radius R1 is set at the position centered on the target 74 indicated by the dotted line.
  • The movement target position 78 is set on the concentric circle 76a centered on the dotted-line target 74, at the position rotated counterclockwise by the tracking angle θ with respect to the target moving direction P of the dotted-line target 74.
  • the straight speed calculation unit 66a calculates the straight speed VM of the wheelchair 10 such that the wheelchair 10 reaches the movement target position until the next movement target position 78 is set.
  • The turning speed calculation unit 66b calculates the turning speed ω with which the wheelchair 10 turns so as to reach the movement target position before the next movement target position 78 is set.
  • As a result, even when the target 74 turns, the wheelchair 10 can track the target 74 without changing the relative positional relationship as viewed from the target 74.
  • FIG. 8 shows the movement target position 78 of the wheelchair 10 in a case where, after the target 74 as the tracking target object makes a turning movement from its current position, its target moving direction P differs from the turning direction, as indicated by the dotted-line target 74.
  • the position of the dotted target 74 is the same as the position of the dotted target 74 in FIG. 7, but the direction in which the dotted target 74 is currently facing is different.
  • the target specifying unit 60 specifies the target 74 based on the positional relationship of the objects around the wheelchair 10 acquired by the acquiring unit 58.
  • The orbital circle setting unit 64a acquires the degree of congestion around the wheelchair 10 based on the recognition result of the surrounding situation recognition unit 64c and, in the non-congested case shown in FIG. 8, sets a concentric circle 76a of radius R1 (first relative distance). When the dotted-line target 74 moves straight in the target moving direction P after turning as shown in FIG. 8, the concentric circle 76a of radius R1 is set at the position centered on the dotted-line target 74.
  • The movement target position 78 is set on the concentric circle 76a centered on the dotted-line target 74, at the position defined by the tracking angle θ with respect to the target moving direction P of the dotted-line target 74.
  • Since the target moving direction P differs from that in FIG. 7, this position differs from the movement target position 78 in FIG. 7.
  • the straight speed calculation unit 66a calculates the straight speed VM of the wheelchair 10 such that the wheelchair 10 reaches the movement target position 78 until the next movement target position 78 is set.
  • The turning speed calculation unit 66b calculates the turning speed ω with which the wheelchair 10 turns so as to reach the movement target position 78 before the next movement target position 78 is set.
  • As a result, the wheelchair 10 can track the target 74 in a fixed relative positional relationship regardless of the direction in which the target 74 moves.
  • FIG. 9 is a flowchart for explaining the first half of the process
  • FIG. 10 is a flowchart for explaining the second half of the process.
  • First, based on the operation state of the operation input unit 42b of the monitor device 42 or the like, the CPU 44a detects whether the tracking travel mode, in which the target 74 is specified and tracked, or the normal travel mode, in which the wheelchair 10 is moved by operating the operation unit 36, is selected as the travel mode of the wheelchair 10 (S100). When neither the tracking travel mode nor the normal travel mode is set, that is, in the self-propelled mode, the flowcharts of FIGS. 9 and 10 are not executed, and the wheelchair 10 is moved by the user turning the left and right hand rims 22a (drive wheels 22).
  • When the CPU 44a detects that the tracking travel mode is selected (Yes in S100), the display control unit 68 outputs a control signal indicating the tracking travel mode via the output unit 72 and displays a message such as "tracking travel mode control in progress" on the display device 42a (S102). Subsequently, the acquisition unit 58 acquires the surrounding situation of the wheelchair 10 (the positional relationship of objects existing around the wheelchair 10 and the like) via the laser sensor 38 (S104). Further, the target specifying unit 60 specifies the tracking target object (target 74) from among the objects existing around the wheelchair 10 (S106). Various methods can be used to specify the target 74.
  • an object closest to the wheelchair 10 among the objects detected by the acquisition unit 58 may be used as the target 74.
  • the target 74 may be specified by previously carrying a transmitter such as a beacon with the target 74 and acquiring the signal.
  • The target 74 may also be specified based on an image acquired by an imaging unit provided on the wheelchair 10. These methods may also be combined. For example, when the density of objects around the wheelchair 10 is "dense", a method using a transmitter such as a beacon has the advantage that the target 74 can be identified quickly.
  • The method of using the object closest to the wheelchair 10 as the target 74 has the advantages that the processing load is small and that no device such as a beacon is required, which keeps the cost low.
  • The method based on captured images has the advantage that the type of target 74 to be specified can easily be changed without using a device such as a beacon.
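  • The sketch below (Python, hypothetical data layout) illustrates the nearest-object specification method: among the objects detected around the wheelchair, the one at the shortest relative distance is taken as the tracking target (target 74).

```python
import math

def specify_nearest_target(wheelchair_xy, detected_object_positions):
    """Return the detected object position closest to the wheelchair, or None if
    nothing was detected."""
    def distance(obj_xy):
        return math.hypot(obj_xy[0] - wheelchair_xy[0], obj_xy[1] - wheelchair_xy[1])
    return min(detected_object_positions, key=distance) if detected_object_positions else None
```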
  • Next, the relative distance acquisition unit 58a acquires the relative distance to the target 74, and the moving direction acquisition unit 58b acquires the moving direction of the target 74 based on time-series information of the relative distance (S110).
  • the target 74 specifying process is continuously executed.
  • The surrounding situation recognition unit 64c calculates the congestion state around the wheelchair 10 based on the object presence information acquired by the relative distance acquisition unit 58a, and compares it with the congestion degree M previously stored in the ROM 44b or the like (S112).
  • When the congestion degree around the wheelchair 10 is less than the predetermined congestion degree M (No in S112), that is, when the surrounding situation recognized by the surrounding situation recognition unit 64c is "sparse", for example when only a few pedestrians 80a (walking speed VSa) and 80b (walking speed VSb) exist around the wheelchair 10 in addition to the target 74 moving in the target moving direction P at the target speed VT as shown in FIG. 11, the orbital circle setting unit 64a sets a concentric circle 76a (first orbital circle) whose radius R1 is larger than in the "dense" case of FIG. 12, in which many pedestrians 80a (walking speed VSa) to 80g (walking speed VSg) are present (S114).
  • the first orbit circle may be a circle whose first relative distance is a standard distance (1 m), for example.
  • Conversely, when the congestion degree around the wheelchair 10 is equal to or greater than the predetermined congestion degree M (Yes in S112), that is, when the surrounding situation recognized by the surrounding situation recognition unit 64c is "dense" as shown in FIG. 12, the orbital circle setting unit 64a sets a concentric circle 76b (second orbital circle) whose radius R2 is smaller than in the "sparse" case with only the pedestrians 80a and 80b (S116).
  • The second orbital circle may, for example, have a first relative distance of 0.5 m, half the standard distance (for example, 1 m).
  • the movement target position of the wheelchair 10 is set to be directly beside the target 74 as an example.
  • Next, the CPU 44a determines whether there is an obstacle to be avoided around the wheelchair 10 based on the recognition result of the surrounding situation recognition unit 64c (S118). For example, when the second relative distance between an obstacle and the wheelchair 10 calculated by the obstacle distance calculation unit 58d is longer than a predetermined distance threshold (for example, 3 m or more), the CPU 44a determines that there is no obstacle to be avoided. Likewise, when the predicted contact time until the obstacle contacts the wheelchair 10, calculated by the predicted contact time calculation unit 58e, is longer than a predetermined time threshold (for example, 5 seconds or more), the CPU 44a determines that there is no obstacle to be avoided.
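  • One possible reading of this decision, sketched in Python with the example thresholds from the text (3 m, 5 s): an obstacle is treated as one to be avoided when it is already close or is predicted to make contact soon.

```python
def needs_avoidance(second_relative_distance, predicted_contact_time,
                    distance_threshold=3.0, time_threshold=5.0):
    """Decision sketch for S118: avoid when the obstacle is within the distance
    threshold or the predicted contact time is within the time threshold."""
    return (second_relative_distance <= distance_threshold
            or predicted_contact_time <= time_threshold)
```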
  • When there is no obstacle to be avoided (No in S118), the movement target position setting unit 64b sets the movement target position 78 at the tracking angle θ with respect to the target moving direction P of the target 74 (S120).
  • Subsequently, the straight traveling speed calculation unit 66a calculates the tracking straight traveling speed, and the turning speed calculation unit 66b calculates the tracking turning speed (S122).
  • The travel control unit 66 then sets the direction of the casters 28 with the steering angle setting unit 66c so that the wheelchair 10 moves toward the movement target position, drives the actuator 24a (motor) with the motor control unit 66e, and moves the wheelchair 10 to the movement target position (S124).
  • When the CPU 44a confirms, based on the relative position with respect to the target 74 after the movement and the tracking angle θ with respect to the target moving direction P of the target 74, that the movement to the movement target position in the current control cycle has been completed (Yes in S126), it detects whether the completion condition of the tracking travel mode is satisfied (S128). For example, when the travel mode has been switched to a mode other than the tracking travel mode, or when no movement of the target 74 has been confirmed for a predetermined period or longer (for example, when the target 74 has stopped for 60 seconds), the CPU 44a determines that the completion condition of the tracking travel mode is satisfied (Yes in S128). In that case, the CPU 44a causes the display control unit 68 to display the end of the tracking travel mode (S130) and temporarily ends this flow.
  • On the other hand, when it is determined that there is an obstacle to be avoided (Yes in S118), the movement target position setting unit 64b sets an avoidance position in a region different from the region in which the wheelchair 10 currently exists (S132).
  • This avoidance position is set, for example, in the region on the side of the target 74 where fewer obstacles to be avoided exist. For example, as shown in FIGS. 13 and 14, when an obstacle 80a (80b) to be avoided exists on the left side of the target 74, the avoidance position 82 of the wheelchair 10 is set on the right side (opposite side) of the target 74, and the tracking movement is thereafter executed based on this avoidance position.
  • The avoidance trajectory for moving to the avoidance position 82 differs depending on the direction from which the obstacle to be avoided approaches.
  • FIG. 13 shows a case where the pedestrian 80b approaches from the front of the wheelchair 10 as an obstacle at the approach speed VQ.
  • In this case, the movement target position setting unit 64b selects a retreat avoidance trajectory in which the wheelchair 10 moves backward away from the pedestrian 80b (S136). For example, when the second relative distance L between the pedestrian 80b and the wheelchair 10 becomes equal to or less than a predetermined distance threshold (for example, 3 m or less), the wheelchair 10 moves, along the retreat avoidance trajectory, to the avoidance position 82 on the opposite side of the target 74.
  • Likewise, when the second relative distance L decreases and the predicted contact time until the wheelchair 10 traveling at the straight traveling speed VM contacts the pedestrian 80b approaching at the approach speed VQ becomes equal to or less than a predetermined time threshold (for example, 5 seconds or less), the wheelchair 10 moves, along the retreat avoidance trajectory away from the pedestrian 80b, to the avoidance position 82 on the opposite side of the target 74.
  • FIG. 14 shows a case where the pedestrian 80a approaches from the rear of the wheelchair 10 as an obstacle at the approach speed VQ.
  • In this case, the movement target position setting unit 64b selects a forward avoidance trajectory in which the wheelchair 10 moves forward away from the pedestrian 80a (S138). For example, when the second relative distance L between the pedestrian 80a and the wheelchair 10 becomes equal to or less than a predetermined distance threshold (for example, 3 m or less), the wheelchair 10 moves, along the forward avoidance trajectory away from the pedestrian 80a, to the avoidance position 82 on the opposite side of the target 74.
  • Similarly, when the predicted contact time until the wheelchair 10 traveling at the straight traveling speed VM and the pedestrian 80a approaching at the approach speed VQ come into contact becomes equal to or less than a predetermined time threshold (for example, 5 seconds or less), the wheelchair 10 moves, along the forward avoidance trajectory away from the pedestrian 80a, to the avoidance position 82 on the opposite side of the target 74.
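  • A minimal sketch of the trajectory selection in S136/S138 (Python, hypothetical names): an obstacle approaching from the front leads to a retreat (backward) avoidance trajectory, one approaching from the rear leads to a forward avoidance trajectory, and in both cases the avoidance position lies on the side of the target opposite the obstacle.

```python
def select_avoidance_trajectory(obstacle_approach_direction):
    """Choose how the wheelchair crosses to the avoidance position on the far side
    of the target, depending on where the obstacle approaches from."""
    if obstacle_approach_direction == "front":
        return "retreat_avoidance_trajectory"   # back away from the obstacle (S136)
    if obstacle_approach_direction == "rear":
        return "forward_avoidance_trajectory"   # move ahead of the obstacle (S138)
    return "normal_tracking"                    # no avoidance needed
```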
  • Next, the straight traveling speed calculation unit 66a calculates the avoidance straight traveling speed of the wheelchair 10 for moving to the avoidance position 82, and the turning speed calculation unit 66b calculates the avoidance turning speed (S140).
  • the traveling control unit 66 performs avoidance movement of the wheelchair 10 to the avoidance position 82 (S142).
  • This avoidance movement may be a straight-line movement from the current position of the wheelchair 10 to the avoidance position 82, or a movement along the orbital circle (the first orbital circle of radius R1 or the second orbital circle of radius R2).
  • In the case of a straight-line movement, the travel time can be shortened and the wheelchair 10 can be quickly separated from the pedestrian 80a or the pedestrian 80b.
  • In the case of a movement along the orbital circle, the avoidance movement is performed while the relative distance between the wheelchair 10 and the target 74 is kept constant, so the wheelchair 10 does not come closer to the target 74 than usual; therefore, even if the avoidance movement starts suddenly, the user and the target 74 are less likely to feel uncomfortable.
  • After the avoidance position 82 is reached, the CPU 44a proceeds to S120 and executes the subsequent processing, thereby performing the tracking control of the target 74 based on the avoidance position 82.
  • Until the avoidance position 82 is reached, the avoidance movement in S142 is continued.
  • When the mode is not the self-propelled mode and selection of the tracking travel mode is not detected (No in S100), the display control unit 68 outputs a control signal indicating the normal travel mode to the output unit 72, and a message such as "during normal travel mode control" is displayed on the display device 42a (S146). Then, the traveling control unit 66 executes the stick operation driving process according to the operation state of the operation unit 36 (S148). For example, the straight traveling speed calculation unit 66a calculates the straight traveling speed according to the amount by which the joystick is tilted, and the turning speed calculation unit 66b calculates the turning speed according to the direction in which the joystick is tilted.
  • The steering angle setting unit 66c sets the steering angle of the caster 28 and drives the actuator 30a of the steering system 30 by a predetermined amount. Further, the motor control unit 66e rotates the actuator 24a (motor) of the drive system 24 at a predetermined speed to move the wheelchair 10 in the direction instructed by the joystick.
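  The stick operation driving process of S148 essentially maps the joystick tilt to a straight traveling speed and a steering command. A minimal sketch of such a mapping is shown below; the maximum speed, the deadband, and the function name are assumed values chosen for illustration.

```python
import math

MAX_SPEED_MPS = 1.5   # assumed maximum straight traveling speed
DEADBAND = 0.05       # assumed neutral-position deadband on the tilt amount

def joystick_to_command(tilt_x, tilt_y):
    """Map a joystick tilt (each axis in -1..1) to (straight_speed, steering_angle).

    The tilt amount sets the straight traveling speed, the tilt direction sets
    the steering angle of the caster 28.
    """
    amount = min(1.0, math.hypot(tilt_x, tilt_y))
    if amount < DEADBAND:                        # neutral: stop request
        return 0.0, 0.0
    straight_speed = MAX_SPEED_MPS * amount      # speed from how far the stick is tilted
    steering_angle = math.atan2(tilt_x, tilt_y)  # direction from where it is tilted
    return straight_speed, steering_angle

print(joystick_to_command(0.3, 0.7))
```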
  • The CPU 44a continues the stick operation driving process until the stop processing condition in the normal travel mode is satisfied (No in S150).
  • The stop processing condition in the normal travel mode can be regarded as satisfied, for example, when a predetermined period (for example, 30 seconds) has elapsed after the joystick is returned to the neutral position.
  • In that case, the CPU 44a determines that the end condition for the normal travel mode is satisfied, causes the display control unit 68 to display the end of the normal travel mode (S152), and this flow is temporarily ended.
  • As described above, in the present embodiment, the movement target position of the wheelchair 10 is always set at the position of the tracking angle θ with respect to the target moving direction P of the target 74 (for example, at a position to the left or right of the target 74) while the first relative distance from the target 74 is maintained, and the wheelchair 10 follows the target 74 as it moves.
  • As a result, the calculation load for tracking can be reduced, and a smooth, low-cost tracking (following) operation can be realized.
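  Because the movement target position depends only on the current target position, the target moving direction P, the tracking angle θ, and the first relative distance, it can be computed without any path prediction. The following sketch shows one way to evaluate it; the coordinate convention and the names are assumptions for illustration.

```python
import math

def movement_target_position(target_xy, heading_p, tracking_angle, radius_r1):
    """Point at the first relative distance R1 from the target, in the direction
    obtained by rotating the target moving direction P by the tracking angle."""
    tx, ty = target_xy
    direction = heading_p + tracking_angle   # counterclockwise rotation from P
    return (tx + radius_r1 * math.cos(direction),
            ty + radius_r1 * math.sin(direction))

# Example: target at (2, 0) moving along +x; a tracking angle of 135 deg and
# R1 = 1 m place the movement target diagonally behind and to the left of the target.
print(movement_target_position((2.0, 0.0), 0.0, math.radians(135), 1.0))
```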
  • In the embodiment described above, the wheelchair 10 is shown as an example of the moving body.
  • However, the present invention is not limited to this; a vehicle on which a healthy person rides in a sitting posture or a standing posture may be used as the moving body, and similar effects can be obtained.
  • The moving body may also be a vehicle that carries something other than a human.
  • For example, it may be an unmanned vehicle or a robot for transporting luggage or animals.
  • In that case, the traveling mode may be only the tracking travel mode.
  • Further, a plurality of moving bodies may be operated together.
  • In this case, the leading moving body may track the target 74, and each subsequent moving body may regard the preceding moving body as its tracking target and track it; the same effect can be obtained.
  • The CPU 44a may acquire the degree of congestion and the positions of obstacles (pedestrians 80) from infrastructure around the moving body (wheelchair 10), or may acquire the degree of congestion of the area where the moving body (wheelchair 10) is present (for example, the degree of congestion by time zone) with reference to past congestion degree data, and reflect it in the tracking control.
  • In the embodiment described above, the laser sensor 38 is shown as an example of the information acquisition unit that acquires information indicating the positional relationship of objects around the moving body (wheelchair 10); however, the information acquisition unit can be changed as appropriate as long as it can acquire the presence of surrounding objects, the relative distance to them, and the like.
  • For example, an image may be acquired by an imaging device (a stereo camera or the like), and information indicating the positional relationship of surrounding objects may be acquired by performing image processing on that image; in this case, the same effect as when the laser sensor 38 is used can be obtained.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Propulsion And Braking For Vehicles (AREA)

Abstract

A mobile body tracking control device is provided with: an object-identifying unit that identifies a tracking object tracked by a mobile body; an acquisition unit that, on the basis of acquisition results of an information acquisition unit that acquires information indicating the positional relationship of an object in the periphery of a mobile body, acquires at least the relative distance between the mobile body and the tracking object, and the movement direction of the tracking object; a target-setting unit that, with the movement direction of the tracking object for reference, sets the movement target position of the mobile body at a position in a direction skewed at a prescribed angle from that movement direction, and at a position of a first relative distance from the tracking object; and a control unit that controls the mobile body so as to move to the movement target position.

Description

移動体追尾制御装置Moving body tracking control device
 本発明の実施形態は、移動体追尾制御装置に関する。 Embodiments of the present invention relate to a moving body tracking control device.
 従来、追従(追尾)する対象(オペレータ等)の移動経路を算出して、追従経路を設定して追従(追尾)を実行する移動体の追従方法が提案されている。また、オペレータの発進、停止の操作を必要とせずに、オペレータに自動で追従して行く追従型走行車両が提案されている。 Conventionally, a tracking method of a moving body has been proposed in which a movement path of an object (operator or the like) to be tracked (tracked) is calculated, a tracking path is set, and tracking (tracking) is executed. There has also been proposed a follow-up type traveling vehicle that automatically follows the operator without requiring the operator to start and stop.
特許第4316477号公報Japanese Patent No. 4316477 特許第4552869号公報Japanese Patent No. 4552869 特開平6-274223号公報JP-A-6-274223
 しかしながら、移動体の周囲には、追従対象者(オペレータ等)の他にも歩行者や障害物が多数存在する場合が想定されるため、オペレータの移動経路の算出には高負荷な演算が要求される。また、オペレータ以外の周囲の移動体(例えば、歩行者等)との接触回避も要求されるため、移動経路の算出の複雑性がさらに増加してしまうという問題があった。 However, since it is assumed that there are many pedestrians and obstacles in addition to the tracking target person (operator, etc.) around the moving body, a high load calculation is required for calculating the movement path of the operator. Is done. In addition, since it is also required to avoid contact with surrounding moving bodies (for example, pedestrians) other than the operator, there has been a problem that the complexity of calculating the moving route is further increased.
 そこで、本発明の課題の一つは、演算負荷の軽減が可能で、スムーズに追尾対象体(オペレータ、ターゲット)を追尾(追従)できる移動体追尾制御装置を提供することである。 Therefore, one of the problems of the present invention is to provide a moving body tracking control device that can reduce the calculation load and can smoothly track (follow) a tracking target (operator, target).
 本発明の実施形態にかかる移動体追尾制御装置は、例えば、移動体が追尾する追尾対象体を特定する対象特定部と、上記移動体の周辺の物体の位置関係を示す情報を取得する情報取得部の取得結果に基づき、少なくとも上記移動体と上記追尾対象体との相対距離と、上記追尾対象体の移動方向と、を取得する取得部と、上記追尾対象体の上記移動方向を基準に当該移動方向から所定角度ずれた方向上に位置し、かつ上記追尾対象体から第一の相対距離の位置に上記移動体の移動目標位置を設定する目標設定部と、上記移動体が上記移動目標位置に移動するように制御する制御部と、を備える。この構成によれば、移動体は、追尾対象体の移動方向を基準とした所定角度の位置に、第一の相対距離を維持しながら移動する。その結果、例えば、追尾対象体の移動経路を推定(算出)する必要がなく、演算負荷の軽減ができる。また、移動体は、追尾対象体に対して、常に所定の位置に存在しながら追尾対象体を追尾するので、移動体の管理(監視)が容易になる。 The mobile body tracking control device according to the embodiment of the present invention acquires, for example, information indicating a positional relationship between a target specifying unit that specifies a tracking target object that the mobile body tracks and an object around the mobile body. Based on the acquisition result of the part, the acquisition unit for acquiring at least the relative distance between the moving object and the tracking target object and the moving direction of the tracking target object, and the moving direction of the tracking target object A target setting unit that is located on a direction deviated by a predetermined angle from the moving direction and that sets a moving target position of the moving body at a position of a first relative distance from the tracking target body; and And a control unit that controls to move to. According to this configuration, the moving body moves to a position at a predetermined angle with respect to the moving direction of the tracking target object while maintaining the first relative distance. As a result, for example, it is not necessary to estimate (calculate) the movement path of the tracking target object, and the calculation load can be reduced. In addition, since the moving body tracks the tracking target object while always existing at a predetermined position with respect to the tracking target object, the management (monitoring) of the moving object becomes easy.
 本発明の実施形態にかかる移動体追尾制御装置の上記目標設定部は、例えば、上記追尾対象体の位置を中心とし、半径が上記第一の相対距離で規定される円軌道上に上記移動目標位置を設定してもよい。この構成によれば、例えば、移動体の周辺状況に変化がある場合や追尾対象体が不規則な移動を行う場合でも、移動体は追尾対象体の位置を中心とする一定半径の円軌道上に存在するので、追尾対象体から離れることなく、スムーズな追尾動作が実現できる。 For example, the target setting unit of the mobile tracking control apparatus according to the embodiment of the present invention has the moving target on a circular orbit whose center is the position of the tracking target and whose radius is defined by the first relative distance. The position may be set. According to this configuration, for example, even when there is a change in the surrounding situation of the moving object or when the tracking target object moves irregularly, the moving object is on a circular orbit with a constant radius centered on the position of the tracking target object. Therefore, a smooth tracking operation can be realized without leaving the tracking target.
 本発明の実施形態にかかる移動体追尾制御装置の上記目標設定部は、例えば、上記追尾対象体の側方および後方の領域から上記移動方向の逆方向ベクトルを含む所定の後方領域を除く領域に上記移動目標位置を設定してもよい。この構成によれば、移動体は追尾対象体の側方から真後ろを除く後方の領域に存在するように追尾動作を行う。その結果、例えば、移動体が追尾対象体より前方に離れて先行してしまうことが抑制できる。また、追尾対象体の真後ろには、移動しないので、例えば、追尾対象体が急停止した場合等でも追尾対象体と移動体とが接触してしまうことを避けることができる。 The target setting unit of the mobile body tracking control device according to the embodiment of the present invention may be, for example, in a region excluding a predetermined rear region including a backward vector of the moving direction from a lateral region and a rear region of the tracking target body. The movement target position may be set. According to this configuration, the moving body performs the tracking operation so as to exist in the rear region excluding the back side from the side of the tracking target body. As a result, for example, it is possible to suppress the moving body from leading ahead of the tracking target object. Further, since the object does not move directly behind the tracking object, for example, even when the tracking object suddenly stops, it is possible to prevent the tracking object and the moving object from coming into contact with each other.
 本発明の実施形態にかかる移動体追尾制御装置の上記目標設定部は、例えば、上記移動体の周辺状況に応じて、上記第一の相対距離を変更してもよい。この構成によれば、例えば、移動体の周囲の物体の存在状況が通常の場合、例えば「疎」の場合より「密」の場合は、より移動体を追尾対象体に接近させた位置で移動させることができる。その結果、移動体の周囲が混雑している場合でも移動体をスムーズに移動させることができる。逆に、移動体の周囲が混雑していない「疎」場合は、移動体と追尾対象体との相対距離を広げて、例えば、お互いの圧迫感を軽減した状態で移動体を追尾対象体に追尾させることができる。 The target setting unit of the moving body tracking control device according to the embodiment of the present invention may change the first relative distance according to, for example, the surrounding state of the moving body. According to this configuration, for example, when the existence situation of the object around the moving body is normal, for example, when it is “dense” rather than “sparse”, the moving body moves closer to the tracking target object. Can be made. As a result, the moving body can be smoothly moved even when the periphery of the moving body is congested. On the other hand, if the surrounding area of the moving object is not sparse, the relative distance between the moving object and the tracking target object is increased, for example, the moving object is changed to the tracking target object in a state where the mutual feeling of pressure is reduced. Can be tracked.
 本発明の実施形態にかかる移動体追尾制御装置の上記取得部は、例えば、上記情報取得部の上記取得結果に含まれる上記移動体の周辺に存在する障害物と上記移動体との第二の相対距離または、上記第二の相対距離が縮まる場合に上記移動体に接触するまでの接触予測時間を取得し、上記目標設定部は、上記第二の相対距離が所定の距離閾値以下になった場合または上記接触予測時間が所定の時間閾値以下になった場合、上記移動目標位置を上記追尾対象体から上記第一の相対距離より近い位置に設定してもよい。この構成によれば、例えば、障害物が接近する場合に、障害物と接触する可能性を低減させつつ、移動体のスムーズな追尾運動を実行させやすくなる。 For example, the acquisition unit of the mobile body tracking control device according to the embodiment of the present invention may include a second unit between the obstacle and the mobile body present around the mobile body included in the acquisition result of the information acquisition unit. When the relative distance or the second relative distance is reduced, the predicted contact time until contact with the moving body is acquired, and the target setting unit is configured such that the second relative distance is equal to or less than a predetermined distance threshold. In this case or when the predicted contact time is equal to or less than a predetermined time threshold, the movement target position may be set to a position closer to the tracking target object than the first relative distance. According to this configuration, for example, when an obstacle approaches, the smooth tracking movement of the moving body can be easily performed while reducing the possibility of contact with the obstacle.
 本発明の実施形態にかかる移動体追尾制御装置の上記取得部は、上記情報取得部の上記取得結果に含まれる上記移動体の周辺に存在する障害物と上記移動体との第二の相対距離または、上記第二の相対距離が縮まる場合に上記移動体に接触するまでの接触予測時間を取得し、上記目標設定部は、上記第二の相対距離が所定の距離閾値以下になった場合または上記接触予測時間が所定の時間閾値以下になった場合、上記追尾対象体を挟んで、上記障害物の逆側に上記移動目標位置を設定し、上記制御部は、上記移動体が上記障害物から離れるような軌道を通り上記逆側の上記移動目標位置に移動するように制御してもよい。この構成によれば、障害物が接近する場合、移動体は、障害物との相対距離を広げるように移動する。その結果、例えば、障害物との接触確率をより低減させて、障害物に対する移動体のよりスムーズな回避運動を実行させやすくなる。 The acquisition unit of the mobile body tracking control device according to the embodiment of the present invention is configured such that the second relative distance between the obstacle and the obstacle existing around the mobile body included in the acquisition result of the information acquisition unit. Alternatively, when the second relative distance is shortened, a contact prediction time until contact with the moving body is acquired, and the target setting unit is configured to detect when the second relative distance is equal to or less than a predetermined distance threshold. When the predicted contact time is equal to or less than a predetermined time threshold, the movement target position is set on the opposite side of the obstacle with the tracking target object interposed therebetween, and the control unit is configured so that the moving object is the obstacle. It may be controlled to move to the movement target position on the opposite side through a trajectory away from the vehicle. According to this configuration, when an obstacle approaches, the moving body moves so as to increase the relative distance from the obstacle. As a result, for example, the probability of contact with an obstacle is further reduced, and a smoother avoidance movement of the moving body with respect to the obstacle can be easily performed.
図1は、実施形態にかかる移動体追尾制御装置を搭載する移動体の一例である車いすの斜視図である。FIG. 1 is a perspective view of a wheelchair that is an example of a moving body on which the moving body tracking control device according to the embodiment is mounted. 図2は、実施形態にかかる移動体追尾制御装置を含む移動体追尾システムの一例が示されたブロック図である。FIG. 2 is a block diagram illustrating an example of a moving body tracking system including a moving body tracking control apparatus according to the embodiment. 図3は、実施形態にかかる移動体追尾制御装置の移動体に搭載される周囲の状況を検出するレーザセンサの検出範囲の一例を示す説明図である。FIG. 3 is an explanatory diagram illustrating an example of a detection range of a laser sensor that detects a surrounding situation mounted on the moving body of the moving body tracking control device according to the embodiment. 図4は、実施形態にかかる移動体追尾制御装置のECU内に実現される追尾動作を実現する制御部(CPU)の構成の一例を示すブロック図である。FIG. 4 is a block diagram illustrating an example of a configuration of a control unit (CPU) that realizes a tracking operation realized in the ECU of the moving body tracking control device according to the embodiment. 図5は、実施形態にかかる移動体追尾制御装置の移動体の追尾位置の一例を示す説明図である。FIG. 5 is an explanatory diagram illustrating an example of the tracking position of the moving object of the moving object tracking control device according to the embodiment. 図6は、実施形態にかかる移動体追尾制御装置において、追尾対象体が前方に直進する場合の、移動体の移動目標位置の一例を示す説明図である。FIG. 6 is an explanatory diagram illustrating an example of the movement target position of the moving object when the tracking target object travels straight ahead in the moving object tracking control device according to the embodiment. 図7は、実施形態にかかる移動体追尾制御装置において、追尾対象体が右斜め前方に移動する場合の、移動体の移動目標位置の一例を示す説明図である。FIG. 7 is an explanatory diagram illustrating an example of the movement target position of the moving object when the tracking target object moves obliquely forward to the right in the moving object tracking control device according to the embodiment. 図8は、実施形態にかかる移動体追尾制御装置において、追尾対象体が図7と同じ位置に移動するものの、その位置で向いている方向(移動方向)が図7の場合と異なるときの、移動体の移動目標位置の一例を示す説明図である。FIG. 8 illustrates a case where the tracking target object moves to the same position as that in FIG. 7 in the moving body tracking control apparatus according to the embodiment, but the direction (movement direction) facing the position is different from that in FIG. It is explanatory drawing which shows an example of the movement target position of a moving body. 図9は、実施形態にかかる移動体追尾制御装置において、移動体の追尾動作の一例を実現するための処理手順の前半部分を説明するフローチャートである。FIG. 9 is a flowchart for explaining the first half of the processing procedure for realizing an example of the tracking operation of the mobile object in the mobile object tracking control device according to the embodiment. 図10は、実施形態にかかる移動体追尾制御装置において、移動体の追尾動作の一例を実現するための処理手順の後半部分を説明するフローチャートである。FIG. 10 is a flowchart for explaining the second half of the processing procedure for realizing an example of the tracking operation of the mobile object in the mobile object tracking control device according to the embodiment. 図11は、実施形態にかかる移動体追尾制御装置において、移動体の周囲の混雑度が低い場合の移動体の追尾位置の一例を示す説明図である。FIG. 11 is an explanatory diagram illustrating an example of the tracking position of the moving object when the degree of congestion around the moving object is low in the moving object tracking control device according to the embodiment. 図12は、実施形態にかかる移動体追尾制御装置において、移動体の周囲の混雑度が図11の場合より高い場合の移動体の追尾位置の一例を示す説明図である。FIG. 12 is an explanatory diagram illustrating an example of the tracking position of the moving object when the degree of congestion around the moving object is higher than that in FIG. 
11 in the moving object tracking control device according to the embodiment. 図13は、実施形態にかかる移動体追尾制御装置において、移動体の前方から障害物が接近する場合の移動体の回避位置および、回避位置に移動するための移動軌道の一例を示す説明図である。FIG. 13 is an explanatory diagram illustrating an example of an avoidance position of a moving object when an obstacle approaches from the front of the moving object and an example of a moving track for moving to the avoidance position in the moving object tracking control device according to the embodiment. is there. 図14は、実施形態にかかる移動体追尾制御装置において、移動体の後方から障害物が接近する場合の移動体の回避位置および、回避位置に移動するための移動軌道の一例を示す説明図である。FIG. 14 is an explanatory diagram illustrating an example of an avoidance position of a moving object when an obstacle approaches from the rear of the moving object and a moving trajectory for moving to the avoidance position in the moving object tracking control device according to the embodiment. is there.
 以下、本発明の例示的な実施形態が開示される。以下に示される実施形態の構成、ならびに当該構成によってもたらされる作用、結果、および効果は、一例である。本発明は、以下の実施形態に開示される構成以外によっても実現可能であるとともに、基本的な構成に基づく種々の効果や、派生的な効果のうち、少なくとも一つを得ることが可能である。 Hereinafter, exemplary embodiments of the present invention will be disclosed. The configuration of the embodiment shown below and the operations, results, and effects brought about by the configuration are examples. The present invention can be realized by configurations other than those disclosed in the following embodiments, and at least one of various effects based on the basic configuration and derivative effects can be obtained. .
 本実施形態において、移動体追尾制御装置は、図1に示すような小型の移動体、例えば、「車いす」等に適用されうる。以下に示す実施形態では、移動体として「車いす」を一例として示すが、これに限定されない。例えば、健常者の移動手段として、着座姿勢や起立姿勢で搭乗する移動車両でもよい。移動体は、モータ等の駆動源を備える車両であり、ユーザ(搭乗者)が操縦(運転)することにより移動する通常走行モードを備えてもよいし、追尾対象体を特定して自動的に追尾する追尾走行モードを備えていてもよい。また、ユーザが自ら手や足を用いて移動体に駆動力を付与して移動する自走モードを備えていてもよい。移動体は、人間以外の物体(荷物、動物等)を積載可能な荷台を備える無人車両でもよい。また、移動体は、単体で走行してもよいし、複数が連結されたり、複数が分離状態で連なる態様で移動されたりしてもよい。 In this embodiment, the moving body tracking control device can be applied to a small moving body such as a “wheelchair” as shown in FIG. In the embodiment described below, a “wheelchair” is shown as an example of the moving body, but the present invention is not limited to this. For example, as a moving means for healthy persons, a moving vehicle that rides in a sitting posture or a standing posture may be used. The moving body is a vehicle including a driving source such as a motor, and may include a normal traveling mode in which the user (passenger) moves by maneuvering (driving), or automatically specifying a tracking target object. A tracking travel mode for tracking may be provided. Moreover, you may provide the self-propelled mode which a user gives a driving force to a moving body and moves by himself / herself using a hand or a foot. The moving body may be an unmanned vehicle including a loading platform on which an object other than a human (eg, luggage, animal) can be loaded. In addition, the moving body may run alone, or a plurality of moving bodies may be connected, or a plurality of moving bodies may be moved in a state where they are connected in a separated state.
 図1は、移動体の一例としての車いす10に移動体追尾制御装置を搭載した具体的な構成を示す。車いす10は、例えば金属製のフレーム12に柔軟性を有する布材や樹脂材等で形成されるシート14やバックサポート16が固定された折り畳み自在の椅子である。シート14の前方下部には、収納自在なフットサポート18が設けられ、ユーザ(利用者)が着座姿勢で足を乗せることができる。フットサポート18とシート14との間には、ユーザがシート14に着座した場合に、脹ら脛を支持してシート14の下側に足が入り込んでしまうことを防止するレッグサポート20が設けられている。 FIG. 1 shows a specific configuration in which a moving body tracking control device is mounted on a wheelchair 10 as an example of a moving body. The wheelchair 10 is a foldable chair in which a seat 14 and a back support 16 formed of, for example, a flexible cloth material or a resin material are fixed to a metal frame 12. A retractable foot support 18 is provided at the lower front portion of the seat 14 so that a user (user) can put his / her foot in a sitting position. A leg support 20 is provided between the foot support 18 and the seat 14 to support the inflated shin and prevent the foot from entering the lower side of the seat 14 when the user is seated on the seat 14. ing.
 シート14の左右側方には、駆動輪22(左駆動輪22L,右駆動輪22R)が配置されている。駆動輪22には、当該駆動輪22の外側に一体にリング状のハンドリム22aが固定されている。ハンドリム22aは、シート14に着座したユーザが当該ハンドリム22aを手で把持して回転させることにより駆動輪22を駆動して車いす10を移動させる自走モードを実現する。また、本実施形態の車いす10の駆動輪22は、駆動システム24を備え、自動走行ができるように構成されている。なお、駆動輪22には例えばドラム式のブレーキシステム26が設けられ、駆動輪22の減速および停止、停止状態の保持を行うことができる。ブレーキシステム26は駆動システム24と一体化することもできる。 Driving wheels 22 (left driving wheel 22L and right driving wheel 22R) are arranged on the left and right sides of the seat 14. A ring-shaped hand rim 22 a is integrally fixed to the drive wheel 22 outside the drive wheel 22. The hand rim 22a realizes a self-running mode in which the user sitting on the seat 14 grips and rotates the hand rim 22a by hand to drive the driving wheel 22 and move the wheelchair 10. Moreover, the drive wheel 22 of the wheelchair 10 of this embodiment is provided with the drive system 24, and is comprised so that automatic driving | running | working can be performed. The drive wheel 22 is provided with, for example, a drum brake system 26, and the drive wheel 22 can be decelerated, stopped, and held in a stopped state. The brake system 26 can also be integrated with the drive system 24.
 駆動輪22の前方には、車いす10の進行方向を決めるためのキャスタ28が一対設けられている。キャスタ28はギヤ等で構成される操舵システム30に接続されている。操舵システム30は、ユーザがハンドリム22aを回転させて車いす10を移動させる自走モードの場合、または同伴者等が手押しハンドル12aを把持して車いす10を押し動かす場合は、キャスタ28を旋回自在にする。つまり、操舵方向を拘束しないようしている。一方、操舵システム30は、駆動輪22が駆動システム24によって回転する通常走行モードまたは後述する追尾走行モードで駆動される場合には、ユーザによる旋回指示やECU44(図2参照)による追尾制御により、キャスタ28を所定の方向に旋回させて車いす10の走行方向を決定するように制御される。 A pair of casters 28 for determining the traveling direction of the wheelchair 10 are provided in front of the drive wheels 22. The casters 28 are connected to a steering system 30 composed of gears and the like. The steering system 30 allows the caster 28 to turn freely in a self-propelled mode in which the user rotates the hand rim 22a to move the wheelchair 10 or when a companion or the like grips the handwheel 12a to push the wheelchair 10 to move. To do. That is, the steering direction is not restricted. On the other hand, when the driving wheel 22 is driven in the normal traveling mode in which the driving wheel 22 is rotated by the driving system 24 or the tracking traveling mode described later, the steering system 30 performs a turning instruction by the user or tracking control by the ECU 44 (see FIG. 2). The caster 28 is controlled to turn in a predetermined direction to determine the traveling direction of the wheelchair 10.
 駆動システム24、ブレーキシステム26および操舵システム30は、例えばシート14の下面に配置されたバッテリシステム32のバッテリ32aから電力供給を受けて駆動することができる。 The drive system 24, the brake system 26, and the steering system 30 can be driven by receiving power supply from the battery 32a of the battery system 32 disposed on the lower surface of the seat 14, for example.
 また、ユーザがシート14に着座した姿勢で手の届く位置、例えばシート14の側方には、コントロールボックス34が配置されている。コントロールボックス34には、車いす10を通常走行モードまたは追尾走行モードで走行させる場合の制御ユニットが収められている。また、コントロールボックス34には、車いす10をユーザが通常走行モードで走行させる場動に移動速度や移動方向を指示する操作部36が設けられていてもよい。操作部36は、例えば360°任意の方向に倒すことのできるジョイスティックとすることができる。この場合、ジョイスティックを倒す方向で車いす10の進行方向を指示し、ジョイスティックを倒す量によって、車いす10の移動速度を指示することができる。なお、ジョイスティックを中立位置(直立位置)に戻すことにより車いす10を停止させことができる。この場合、ブレーキシステム26のアクチュエータ26aを駆動して、駆動輪22の停止状態を維持するようにしてもよい。 Further, a control box 34 is arranged at a position where the user can reach in a posture where the user is seated on the seat 14, for example, at a side of the seat 14. The control box 34 contains a control unit for causing the wheelchair 10 to travel in the normal travel mode or the tracking travel mode. In addition, the control box 34 may be provided with an operation unit 36 for instructing a moving speed and a moving direction when the user moves the wheelchair 10 in the normal driving mode. The operation unit 36 can be a joystick that can be tilted in an arbitrary direction of 360 °, for example. In this case, the traveling direction of the wheelchair 10 can be instructed in the direction in which the joystick is tilted, and the moving speed of the wheelchair 10 can be instructed by the amount by which the joystick is tilted. The wheelchair 10 can be stopped by returning the joystick to the neutral position (upright position). In this case, the actuator 26a of the brake system 26 may be driven to maintain the stopped state of the drive wheels 22.
 また、コントロールボックス34には車いす10の主として前方の領域について、そこに存在する物体の位置関係を示す情報を取得する情報取得部の一例として、レーザセンサ38(フロントセンサ38F)が設けられる。同様に、車いす10の主として後方の領域について、そこに存在する物体の位置関係を示す情報を取得するレーザセンサ38(リヤセンサ38R)が設けられる。図1の場合、一例としてリヤセンサ38Rは、バッテリシステム32を構成する筐体に支持されている例が示されている。フロントセンサ38Fとリヤセンサ38Rとで、車いす10の周囲の範囲、例えば360°の範囲をカバーできるようにすることが望ましい。フロントセンサ38Fやリヤセンサ38Rを設置する位置は、一例であり、車いす10の周囲の状況が取得できれば適宜変更することができる。また、本実施形態は、レーザセンサ38を二個設けた例を示したが、車いす10の周囲の範囲、例えば360°の範囲をカバーできれば、一個でも三個以上でもよい。レーザセンサ38の数を増やすことにより非検出範囲(死角)を縮小または排除することができる。 Also, the control box 34 is provided with a laser sensor 38 (front sensor 38F) as an example of an information acquisition unit that acquires information indicating the positional relationship of objects existing in the area mainly in front of the wheelchair 10. Similarly, a laser sensor 38 (rear sensor 38 </ b> R) that acquires information indicating the positional relationship of objects existing there is provided for a region mainly behind the wheelchair 10. In the case of FIG. 1, as an example, the rear sensor 38 </ b> R is supported by a casing constituting the battery system 32. It is desirable that the front sensor 38F and the rear sensor 38R can cover a range around the wheelchair 10, for example, a range of 360 °. The positions where the front sensor 38F and the rear sensor 38R are installed are only examples, and can be appropriately changed as long as the situation around the wheelchair 10 can be acquired. Moreover, although this embodiment showed the example which provided the two laser sensors 38, if the range around the wheelchair 10 (for example, the range of 360 degrees) can be covered, it may be one piece or three or more pieces. By increasing the number of laser sensors 38, the non-detection range (dead angle) can be reduced or eliminated.
 なお、車いす10は、ブレーキパッド40aを駆動輪22のタイヤ表面に接触させて、駆動輪22の停止状態を維持するハンドブレーキレバー40が設けられている。また、例えば、コントロールボックス34の反対側には、車いす10の走行状態や周囲の状況等の各種情報をユーザに提供するモニタ装置42が設けられてもよい。モニタ装置42は、例えば支持アームの角度調整により任意の姿勢で固定可能であり、ユーザがシート14に着座した姿勢での視認性を確保しやすくするとともに、着座姿勢および着座動作の際に邪魔にならないように配置できるようにしている。 In addition, the wheelchair 10 is provided with a hand brake lever 40 that keeps the stop state of the drive wheel 22 by bringing the brake pad 40 a into contact with the tire surface of the drive wheel 22. Further, for example, on the opposite side of the control box 34, a monitor device 42 that provides the user with various information such as the running state of the wheelchair 10 and the surrounding situation may be provided. The monitor device 42 can be fixed in an arbitrary posture, for example, by adjusting the angle of the support arm, making it easy to ensure the visibility in the posture in which the user is seated on the seat 14, and is obstructive during the sitting posture and the seating operation. It can be arranged so that it does not become.
 図2は、実施形態にかかる移動体追尾制御装置を含む移動体追尾システム100の一例が示されたブロック図である。移動体追尾制御装置の主構成であるECU44(electronic control unit)と各種システム、各種センサ等はインタフェース102を介して電気的に接続され、移動体としての車いす10の走行制御(追尾制御)を実現する。インタフェース102には、例えば、駆動システム24、ブレーキシステム26、操舵システム30、バッテリシステム32、レーザセンサ38、舵角センサ46、車輪速センサ48等が接続されている。 FIG. 2 is a block diagram illustrating an example of the mobile tracking system 100 including the mobile tracking control device according to the embodiment. ECU44 (electronic control unit), which is the main configuration of the moving body tracking control device, and various systems, various sensors, etc. are electrically connected via the interface 102 to realize traveling control (tracking control) of the wheelchair 10 as the moving body. To do. For example, the drive system 24, the brake system 26, the steering system 30, the battery system 32, the laser sensor 38, the steering angle sensor 46, the wheel speed sensor 48, and the like are connected to the interface 102.
 駆動システム24は、アクチュエータ24aとトルクセンサ24bを有する。駆動システム24は、ECU44等によって電気的に制御されて、車いす10を通常走行モードや追尾走行モードの際にアクチュエータ24a(例えば、モータ)を駆動する。駆動システム24は、駆動輪22の速度調整やトルク調整を行い、車いす10を前進方向または後退方向に所望の速度(例えば操作部36の操作量で指示された速度)で移動させる。なお、トルクセンサ24bは、アクチュエータ24aのトルクを検出し、ECU44がアクチュエータ24aに発生しているトルクを解析することにより、アクチュエータ24aの指令値を補正して、車いす10の速度制御の精度を向上させる。 The drive system 24 includes an actuator 24a and a torque sensor 24b. The drive system 24 is electrically controlled by the ECU 44 or the like, and drives the actuator 24a (for example, a motor) when the wheelchair 10 is in the normal travel mode or the tracking travel mode. The drive system 24 adjusts the speed and torque of the drive wheels 22 and moves the wheelchair 10 in a forward direction or a reverse direction at a desired speed (for example, a speed indicated by an operation amount of the operation unit 36). The torque sensor 24b detects the torque of the actuator 24a, and the ECU 44 analyzes the torque generated in the actuator 24a, thereby correcting the command value of the actuator 24a and improving the speed control accuracy of the wheelchair 10. Let
 ブレーキシステム26は、アクチュエータ26aとブレーキセンサ26bを有する。ブレーキシステム26は、ECU44等によって電気的に制御されて、アクチュエータ26a(例えばピストン)を動作させて、例えば、ドラムブレーキのドラムに対するブレーキシューの押圧状態を変化させる。ブレーキシューの押圧状態を変化させることにより、駆動輪22の速度調整を行ったり、駆動輪22を停止させたり、駆動輪22の停止様態の維持を行ったりする。ブレーキセンサ26bは、例えばフットサポート18に配置されたブレーキペダルの操作状態(踏み込み量等)を検出するセンサである。 The brake system 26 includes an actuator 26a and a brake sensor 26b. The brake system 26 is electrically controlled by the ECU 44 or the like, and operates an actuator 26a (for example, a piston) to change, for example, the pressing state of the brake shoe against the drum of the drum brake. By changing the pressing state of the brake shoe, the speed of the drive wheel 22 is adjusted, the drive wheel 22 is stopped, or the stop state of the drive wheel 22 is maintained. The brake sensor 26b is a sensor that detects an operation state (depression amount or the like) of a brake pedal disposed on the foot support 18, for example.
 操舵システム30は、アクチュエータ30aとトルクセンサ30bを有する。操舵システム30は、ECU44等によって電気的に制御されて、車いす10を通常走行モードや追尾走行モードで走行させる際に、アクチュエータ30a(例えばモータ)を駆動することによりギヤ等にトルクを付加してキャスタ28を旋回させる。その結果、キャスタ28の向く方向(操舵方向)を変化させて車いす10の走行方向を変化させる。この場合、アクチュエータ30aは、一つのキャスタ28を転舵してもよいし、左右のキャスタ28を転舵してもよい。また、トルクセンサ30bは、例えば、キャスタ28の旋回軸に生じているトルクを検出し、アクチュエータ30aによるキャスタ28の旋回状態が指令通りに実行されているかを検出する。ECU44はこの検出結果にしたがいアクチュエータ30aのフィードバック制御を行い、キャスタ28の転舵精度を向上させる。 The steering system 30 includes an actuator 30a and a torque sensor 30b. The steering system 30 is electrically controlled by the ECU 44 and the like to apply torque to gears and the like by driving an actuator 30a (for example, a motor) when the wheelchair 10 is driven in the normal driving mode or the tracking driving mode. The caster 28 is turned. As a result, the traveling direction of the wheelchair 10 is changed by changing the direction in which the caster 28 faces (the steering direction). In this case, the actuator 30a may steer one caster 28, or may steer the left and right casters 28. Further, the torque sensor 30b detects, for example, torque generated on the turning shaft of the caster 28, and detects whether the turning state of the caster 28 by the actuator 30a is being executed as instructed. The ECU 44 performs feedback control of the actuator 30a according to the detection result, and improves the turning accuracy of the caster 28.
 舵角センサ46は、例えば、キャスタ28の方向を検出するセンサである。舵角センサ46は、例えば、ホール素子などを用いて構成され、例えば、車いす10の正面方向を基準として、正方向(時計回り方向)の角度と負方向(反時計回り方向)の角度を取得する。車輪速センサ48は、例えば、駆動輪22の回転量や単位時間当たりの回転数を検出するセンサである。車輪速センサ48は、各駆動輪22に配置され、各駆動輪22で検出した回転数を示す車輪速パルス数をセンサ値として出力する。車輪速センサ48は、例えば、ホール素子などを用いて構成されうる。ECU44は、車輪速センサ48から取得したセンサ値に基づいて車いす10の移動量などを演算し、各種制御を実行する。ECU44は、各車輪速センサ48のセンサ値に基づいて車両1の車速を算出する場合、左右の駆動輪22のうち小さい方のセンサ値の駆動輪22の速度に基づき車いす10の車速を決定し、各種制御を実行する。 The steering angle sensor 46 is, for example, a sensor that detects the direction of the caster 28. The rudder angle sensor 46 is configured using, for example, a hall element, and acquires, for example, an angle in the positive direction (clockwise direction) and a negative direction (counterclockwise direction) with respect to the front direction of the wheelchair 10. To do. The wheel speed sensor 48 is, for example, a sensor that detects the amount of rotation of the drive wheels 22 and the number of rotations per unit time. The wheel speed sensor 48 is disposed on each drive wheel 22 and outputs a wheel speed pulse number indicating the rotation speed detected by each drive wheel 22 as a sensor value. The wheel speed sensor 48 can be configured using, for example, a hall element. The ECU 44 calculates the amount of movement of the wheelchair 10 based on the sensor value acquired from the wheel speed sensor 48 and executes various controls. When calculating the vehicle speed of the vehicle 1 based on the sensor value of each wheel speed sensor 48, the ECU 44 determines the vehicle speed of the wheelchair 10 based on the speed of the drive wheel 22 having the smaller sensor value of the left and right drive wheels 22. Execute various controls.
 バッテリシステム32は、充放電を行うバッテリ32aと電圧センサ32bと電流センサ32cを有する。バッテリ32aは、外部電源に接続して充電されるタイプでもよいし、駆動輪22が通常走行モードで回転する場合に充電されてもよい。また、駆動システム24のアクチュエータ24a(モータ)の回生動作によって充電されてもよい。またそれらの組み合わせでもよい。電圧センサ32b、電流センサ32cは、バッテリ32aの充放電の際に電圧値や電流値を管理する。また、電圧センサ32b、電流センサ32cは、バッテリ32aを回生により充電する場合、バッテリ32aを最適な充電状態に維持するように管理する。 The battery system 32 includes a battery 32a for charging / discharging, a voltage sensor 32b, and a current sensor 32c. The battery 32a may be of a type that is connected to an external power source and charged, or may be charged when the drive wheels 22 rotate in the normal travel mode. Moreover, you may charge by the regeneration operation | movement of the actuator 24a (motor) of the drive system 24. FIG. A combination thereof may also be used. The voltage sensor 32b and the current sensor 32c manage the voltage value and the current value when the battery 32a is charged and discharged. Moreover, when charging the battery 32a by regeneration, the voltage sensor 32b and the current sensor 32c are managed so as to maintain the battery 32a in an optimal charging state.
 レーザセンサ38(フロントセンサ38F、リヤセンサ38R)は、移動体としての車いす10の周辺の物体の位置関係を示す情報を取得する情報取得部として機能する。レーザセンサ38は、センサ内部の光源(レーザダイオード等)から照射されたレーザ光が、測定対象物(例えば、追尾対象体や障害物等)にあたると反射され、受光素子で受光される。このとき受光した反射光を評価、演算することで、レーザ光が反射された位置までの距離を算出する。また、その情報を時系列で比較することにより、測定対象物が移動する場合の移動方向を算出することができる。レーザセンサ38のうちフロントセンサ38Fは、図3に示すように、車いす10の前方に向けられ、例えば、検出角θFで前方を含む第1領域FEを検出範囲とする。また、リヤセンサ38Rは、車いす10の後方に向けられ、例えば、検出角θRで後方および側方を含む第2領域REを検出範囲とする。なお、複数のレーザセンサ38を組み合わせて車いす10の周囲360°を検出領域とする場合、非検出領域(死角)が形成されてしまう場合がある。図3に示す例では、第1領域FEと第2領域REとは重複領域CEで重なり、非検出領域DEを低減するようにしている。なお、図3の場合、非検出領域DEが存在するが、車いす10の前方の直近の位置なので、レーザセンサ38としては死角になるが、車いす10に着座するユーザによる目視確認が容易な領域なので、制御上の影響は最小限(実質無視できる程度)である。 The laser sensor 38 (front sensor 38F, rear sensor 38R) functions as an information acquisition unit that acquires information indicating the positional relationship of objects around the wheelchair 10 as a moving body. The laser sensor 38 is reflected when a laser beam emitted from a light source (laser diode or the like) inside the sensor hits a measurement target (for example, a tracking target or an obstacle) and is received by a light receiving element. The distance to the position where the laser beam is reflected is calculated by evaluating and calculating the reflected light received at this time. Further, by comparing the information in time series, the moving direction when the measurement object moves can be calculated. As shown in FIG. 3, the front sensor 38F of the laser sensor 38 is directed to the front of the wheelchair 10 and has, for example, a first region FE including the front at a detection angle θF as a detection range. Further, the rear sensor 38R is directed to the rear of the wheelchair 10, and, for example, the second region RE including the rear and the side at the detection angle θR is set as a detection range. Note that when a plurality of laser sensors 38 are combined to form 360 ° around the wheelchair 10 as a detection region, a non-detection region (dead angle) may be formed. In the example shown in FIG. 3, the first region FE and the second region RE overlap with each other in the overlapping region CE, and the non-detection region DE is reduced. In the case of FIG. 3, there is a non-detection area DE, but since it is the closest position in front of the wheelchair 10, it is a blind spot as the laser sensor 38, but it is an area that can be easily visually confirmed by a user sitting on the wheelchair 10. The control impact is minimal (substantially negligible).
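  As a simple illustration of how the front sensor 38F and the rear sensor 38R together cover the surroundings, the sketch below checks whether a bearing measured from the wheelchair 10 falls inside at least one detection range. The numeric detection angles are assumptions, since the embodiment only names them θF and θR.

```python
import math

# Assumed example detection ranges (the text only names the angles θF and θR).
FRONT_HALF_ANGLE = math.radians(100.0)   # front sensor 38F, centred on the front
REAR_HALF_ANGLE = math.radians(110.0)    # rear sensor 38R, centred on the rear

def covered(bearing):
    """True if a bearing (rad, 0 = straight ahead) is seen by 38F or 38R."""
    a = math.atan2(math.sin(bearing), math.cos(bearing))   # normalise to (-pi, pi]
    in_front = abs(a) <= FRONT_HALF_ANGLE
    in_rear = abs(math.pi - abs(a)) <= REAR_HALF_ANGLE
    return in_front or in_rear

print(covered(math.radians(90)))    # side: covered by the overlap of both ranges
print(covered(math.radians(0)))     # straight ahead: covered by 38F
```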
 また、インタフェース102には、さらにECU44、モニタ装置42等が接続されている。ECU44は、インタフェース102を通じて制御信号を送ることで、駆動システム24、ブレーキシステム26、操舵システム30、バッテリシステム32等を制御することができる。また、ECU44は、インタフェース102を介して、トルクセンサ24b、ブレーキセンサ26b、トルクセンサ30b、電圧センサ32b、電流センサ32c、舵角センサ46、車輪速センサ48、レーザセンサ38等の検出結果や、操作部36等の操作信号を、受け取ることができる。 The interface 102 is further connected to an ECU 44, a monitor device 42, and the like. The ECU 44 can control the drive system 24, the brake system 26, the steering system 30, the battery system 32, and the like by sending a control signal through the interface 102. Further, the ECU 44 detects detection results of the torque sensor 24b, the brake sensor 26b, the torque sensor 30b, the voltage sensor 32b, the current sensor 32c, the rudder angle sensor 46, the wheel speed sensor 48, the laser sensor 38, etc. via the interface 102, An operation signal from the operation unit 36 or the like can be received.
 ECU44は、例えば、CPU44a(central processing unit)や、ROM44b(read only memory)、RAM44c(random access memory)、SSD44d(solid state drive、フラッシュメモリ)等を有している。CPU44aは、追尾走行モードの場合、例えば、車いす10が追尾すべき追尾対象体(例えば、同伴者や先導者、先導体等)を特定し、追尾対象体との相対距離を一定に保ちつつ、追尾対象体を基準に一定の位置で車いす10を追尾させるための処理を行う。CPU44aは、通常走行モードの場合、操作部36を介してユーザから指示される移動方向や移動速度による車いす10の走行を実現するように、駆動システム24や操舵システム30、バッテリシステム32等を制御する。また、車いす10が自走モードで走行される場合には、駆動システム24や操舵システム30をシステムから切り離し、駆動輪22が軽負荷で容易に回転し、キャスタ28が任意の方向に向くようにする。この場合、ブレーキシステム26は制御可能な状態としてもよく、自走モードにおける緊急停止ブレーキとして利用できるようにしてもよい。CPU44aは、ROM44b等の不揮発性の記憶装置に記憶された(インストールされた)プログラムを読み出し、当該プログラムに従って演算処理を実行する。 The ECU 44 includes, for example, a CPU 44a (central processing unit), a ROM 44b (read only memory), a RAM 44c (random access memory), an SSD 44d (solid state drive, flash memory), and the like. In the tracking driving mode, the CPU 44a specifies, for example, a tracking target object (for example, a companion, a leader, a leading conductor, etc.) to be tracked by the wheelchair 10, and keeps a relative distance from the tracking target object constant. A process for tracking the wheelchair 10 at a fixed position on the basis of the tracking object is performed. In the normal traveling mode, the CPU 44a controls the drive system 24, the steering system 30, the battery system 32, and the like so as to realize traveling of the wheelchair 10 according to the moving direction and moving speed instructed by the user via the operation unit 36. To do. Further, when the wheelchair 10 is driven in the self-propelled mode, the drive system 24 and the steering system 30 are disconnected from the system so that the drive wheels 22 can be easily rotated with a light load and the casters 28 are directed in an arbitrary direction. To do. In this case, the brake system 26 may be in a controllable state or may be used as an emergency stop brake in the self-propelled mode. The CPU 44a reads a program stored (installed) in a non-volatile storage device such as the ROM 44b, and executes arithmetic processing according to the program.
 RAM44cは、CPU44aでの演算で用いられる各種データを一時的に記憶する。また、SSD44dは、書き換え可能な不揮発性の記憶部であって、ECU44の電源がオフされた場合であってもデータを記憶することができる。なお、CPU44a、ROM44b、RAM44c等は、同一パッケージ内に集積されることができる。また、ECU44は、CPU44aに替えて、DSP(digital signal processor)等の他の論理演算プロセッサや論理回路等が用いられる構成であってもよい。また、SSD44dに替えてHDD(hard disk drive)が設けられてもよい。 The RAM 44c temporarily stores various data used in the calculation by the CPU 44a. Further, the SSD 44d is a rewritable nonvolatile storage unit, and can store data even when the power of the ECU 44 is turned off. Note that the CPU 44a, ROM 44b, RAM 44c and the like can be integrated in the same package. Further, the ECU 44 may have a configuration in which another logical operation processor, a logic circuit, or the like such as a DSP (digital signal processor) is used instead of the CPU 44a. Further, an HDD (hard disk drive) may be provided instead of the SSD 44d.
 モニタ装置42は、表示装置42aと操作入力部42bと音声出力装置42cを含む。表示装置42aは、例えば、LCD(liquid crystal display)や、OELD(organic electroluminescent display)等であってもよい。表示装置42aに車いす10の周囲の状況、例えば障害物の位置等を表示してもよい。特に着座姿勢で目視し難い後方の状況等を撮像装置(カメラ)で撮像して、表示することができる。また、車いす10が通常走行モードや追尾走行モードの場合に、車いす10の制御状態、例えば、バッテリ32aの充電状態や走行速度等を表示してもよい。また、表示装置42aは、表示灯のみで構成されるシンプルなものであってもよい。この場合、障害物の接近等は、表示灯の表示色の変化や点滅等の表示態様の変化で表示するようにしてもよい。操作入力部42bは、スイッチや、ダイヤル、押しボタン等で構成することができる。なお、表示装置42aがタッチパネル等で構成される場合、操作入力部42bを表示装置42aの表示画面上で実現してもよい。音声出力装置42cは、車いす10が通常走行モードや追尾走行モードの場合に、例えば、急停止する場合や進路変更する場合、障害物が存在する場合等に、メッセージ(警報等)を出力したり、バッテリ32aの状態を通知したりしてもよい。 The monitor device 42 includes a display device 42a, an operation input unit 42b, and an audio output device 42c. The display device 42a may be, for example, an LCD (liquid crystal display), an OELD (organic electroluminescent display), or the like. You may display the surroundings of the wheelchair 10, for example, the position of an obstacle, etc. on the display device 42a. In particular, it is possible to capture and display a rear situation or the like that is difficult to see in the sitting posture with an imaging device (camera). Further, when the wheelchair 10 is in the normal traveling mode or the tracking traveling mode, the control state of the wheelchair 10, for example, the charging state of the battery 32 a or the traveling speed may be displayed. In addition, the display device 42a may be a simple device that includes only an indicator lamp. In this case, the approach of an obstacle may be displayed by a change in display mode such as a change in display color of the indicator lamp or blinking. The operation input unit 42b can be configured with a switch, a dial, a push button, and the like. When the display device 42a is configured with a touch panel or the like, the operation input unit 42b may be realized on the display screen of the display device 42a. The voice output device 42c outputs a message (such as an alarm) when the wheelchair 10 is in the normal travel mode or the tracking travel mode, for example, when suddenly stopping, when changing course, or when an obstacle exists. Alternatively, the state of the battery 32a may be notified.
 なお、上述した各種センサやアクチュエータの構成や、配置、電気的な接続形態等は、一例であって、種々に設定(変更)することができる。 The configuration, arrangement, electrical connection form, and the like of the various sensors and actuators described above are merely examples, and can be set (changed) in various ways.
 ECU44に含まれるCPU44aは、上述したような追尾走行を実現するために、図4に示されるような各種モジュールを備える。CPU44aは、モジュールとして、取得部58、対象特定部60、障害物特定部62、目標設定部64、走行制御部66、表示制御部68、音声制御部70、出力部72等を含む。そして、取得部58は、相対距離取得部58a、移動方向取得部58b、移動速度取得部58c、障害物距離算出部58d、接触予測時間算出部58e等を含む。また、目標設定部64は、軌道円設定部64a、移動目標位置設定部64b、周辺状況認識部64c等を含む。走行制御部66は、直進速度算出部66a、旋回速度算出部66b、舵角設定部66c、バッテリ制御部66d、モータ制御部66e等を含む。これらのモジュールは、ROM44b等の記憶装置にインストールされ記憶されたプログラムを読み出し、それを実行することで実現可能である。CPU44aは、各種モジュールによる処理を実行することにより、例えば追尾対象体を基準に、車いす10を所定の方向に所定の相対距離だけ離れるように誘導して追尾させることができる。 The CPU 44a included in the ECU 44 includes various modules as shown in FIG. 4 in order to realize the tracking traveling as described above. The CPU 44a includes an acquisition unit 58, a target identification unit 60, an obstacle identification unit 62, a target setting unit 64, a travel control unit 66, a display control unit 68, a voice control unit 70, an output unit 72, and the like as modules. The acquisition unit 58 includes a relative distance acquisition unit 58a, a movement direction acquisition unit 58b, a movement speed acquisition unit 58c, an obstacle distance calculation unit 58d, a contact prediction time calculation unit 58e, and the like. The target setting unit 64 includes an orbital circle setting unit 64a, a movement target position setting unit 64b, a surrounding situation recognition unit 64c, and the like. The traveling control unit 66 includes a straight traveling speed calculation unit 66a, a turning speed calculation unit 66b, a steering angle setting unit 66c, a battery control unit 66d, a motor control unit 66e, and the like. These modules can be realized by reading a program installed and stored in a storage device such as the ROM 44b and executing it. The CPU 44a can guide and track the wheelchair 10 in a predetermined direction so as to be separated by a predetermined relative distance, for example, based on the tracking target object by executing processing by various modules.
 取得部58は、移動体としての車いす10の周辺の物体の位置関係を示す情報を取得するレーザセンサ38(情報取得部)の取得結果に基づき、車いす10と、その周囲の物体との相対距離やその物体の移動方向等を取得する。レーザセンサ38は、例えば、1/16秒ごとにレーザ光の送受信を行う。そして、相対距離取得部58aは、レーザセンサ38の検出結果に基づき、車いす10の周囲について物体の有無および物体が存在する場合にその物体と車いす10との相対距離を検出する。 The acquisition unit 58 is a relative distance between the wheelchair 10 and its surrounding objects based on the acquisition result of the laser sensor 38 (information acquisition unit) that acquires information indicating the positional relationship between objects around the wheelchair 10 as a moving body. And the moving direction of the object. The laser sensor 38 transmits and receives laser light every 1/16 second, for example. Then, based on the detection result of the laser sensor 38, the relative distance acquisition unit 58a detects the presence or absence of an object around the wheelchair 10 and the relative distance between the object and the wheelchair 10 when an object exists.
 移動方向取得部58bは、レーザセンサ38の検出結果を時系列で比較することにより、物体の移動の有無および移動方向を取得する。例えば、ある検出タイミングにおける物体の位置と、次の検出タイミングにおける同じ物体の位置とを比較することにより、両者をつなぐ方向ベクトルを得ることができる。この方向ベクトルが物体の移動方向となる。 The moving direction acquisition unit 58b acquires the presence / absence and moving direction of the object by comparing the detection results of the laser sensor 38 in time series. For example, by comparing the position of an object at a certain detection timing with the position of the same object at the next detection timing, a direction vector that connects the two can be obtained. This direction vector is the moving direction of the object.
 移動速度取得部58cは、レーザセンサ38の検出結果を微分することにより、車いす10の周囲で検出された物体の移動速度を算出する。 The moving speed acquisition unit 58c calculates the moving speed of the object detected around the wheelchair 10 by differentiating the detection result of the laser sensor 38.
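  A minimal sketch of how a moving direction and a moving speed could be obtained from two successive detections of the same object, in the spirit of the moving direction acquisition unit 58b and the moving speed acquisition unit 58c. The 1/16 second sampling period follows the example given for the laser sensor 38; the function name and the 2-D coordinates are assumptions.

```python
import math

SCAN_PERIOD_S = 1.0 / 16.0   # example laser scan interval from the text

def direction_and_speed(prev_xy, curr_xy, dt=SCAN_PERIOD_S):
    """Moving direction (unit vector) and speed of an object from two detections."""
    dx, dy = curr_xy[0] - prev_xy[0], curr_xy[1] - prev_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0), 0.0            # object did not move between scans
    return (dx / dist, dy / dist), dist / dt

# Example: an object that moved 5 cm along +x in one scan (about 0.8 m/s).
print(direction_and_speed((1.00, 2.00), (1.05, 2.00)))
```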
 対象特定部60は、取得部58が取得した車いす10の周囲に存在する物体の中から、車いす10が追尾する追尾対象体を特定する。例えば、車いす10に最も近い位置に存在する移動体(例えば、人間、動物、他の移動体)を追尾対象体としてもよい。また、車いす10の周囲の状況を撮像部(例えばカメラ)で撮像し、表示装置42aに撮像画像に含まれる物体を表示させて、ユーザに操作入力部42b等を介して指定させて追尾対象体を特定してもよい。また、追尾対象体となる物体に送信機(例えばビーコン)を所持させて、その送信機からの信号を対象特定部60が受信することにより、送信機を携帯する物体を追尾対象体と見なしてもよい。また、撮像部(例えばカメラ)が撮像した撮像画像に含まれる物体を周知のパターンマッチング技術等を用いて抽出して追尾対象体を特定してもよい。一方、障害物特定部62は、対象特定部60が特定した物体(追尾対象体)以外に車いす10の周囲に存在する物体を障害物として特定する。 The target specifying unit 60 specifies a tracking target object that the wheelchair 10 tracks from among objects existing around the wheelchair 10 acquired by the acquiring unit 58. For example, a moving object (for example, a human being, an animal, or another moving object) present at a position closest to the wheelchair 10 may be used as the tracking target object. In addition, the situation around the wheelchair 10 is imaged by an imaging unit (for example, a camera), an object included in the captured image is displayed on the display device 42a, and the tracking target object is specified by the user via the operation input unit 42b or the like. May be specified. In addition, an object to be tracked is possessed by a transmitter (for example, a beacon), and the target specifying unit 60 receives a signal from the transmitter, so that the object carrying the transmitter is regarded as a tracking target. Also good. In addition, a tracking target object may be specified by extracting an object included in a captured image captured by an imaging unit (for example, a camera) using a known pattern matching technique or the like. On the other hand, the obstacle specifying unit 62 specifies an object existing around the wheelchair 10 as an obstacle other than the object (tracking target object) specified by the target specifying unit 60.
 障害物距離算出部58dは、障害物特定部62が特定した障害物と車いす10との相対距離(第二の相対距離)を算出する。また、移動速度取得部58cは、障害物特定部62が特定した障害物についても移動速度を取得する。そして、接触予測時間算出部58eは、移動速度取得部58cが取得した障害物の移動速度と車輪速センサ48のセンサ値に基づく車いす10の速度とを用いて(比較して)、障害物と車いす10とが接触するまでの接触予測時間を算出する。障害物距離算出部58dの算出する第二の相対距離と接触予測時間算出部58eの算出する接触予測時間は、車いす10がその障害物を回避する場合の回避運動を決定する際に用いる。例えば、車いす10の周辺に存在する障害物と車いす10との第二の相対距離が所定の距離閾値以下(例えば3m以下)の場合、CPU44aは障害物を回避する回避運動を実行する。同様に、車いす10と障害物のとの第二の相対距離が縮まる場合で、車いす10に接触するまでの接触予測時間が所定の時間閾値以下(例えば5秒以下)になった場合、CPU44aは、障害物を回避する回避運動を実行する。回避運動は、同心円76aの半径R1を小さくしたり、車いす10の追尾位置(ターゲット74から見た位置)をターゲット74を挟んで障害物が少ない側に変更したりすることで実現する。 The obstacle distance calculating unit 58d calculates a relative distance (second relative distance) between the obstacle specified by the obstacle specifying unit 62 and the wheelchair 10. The moving speed acquisition unit 58c also acquires the moving speed for the obstacle specified by the obstacle specifying unit 62. Then, the predicted contact time calculation unit 58e uses (compares) the obstacle movement speed acquired by the movement speed acquisition part 58c and the speed of the wheelchair 10 based on the sensor value of the wheel speed sensor 48, and The predicted contact time until the wheelchair 10 comes into contact is calculated. The second relative distance calculated by the obstacle distance calculating unit 58d and the predicted contact time calculated by the predicted contact time calculating unit 58e are used when the wheelchair 10 determines an avoidance exercise when avoiding the obstacle. For example, when the second relative distance between the obstacle present around the wheelchair 10 and the wheelchair 10 is equal to or less than a predetermined distance threshold (for example, 3 m or less), the CPU 44a executes an avoidance exercise to avoid the obstacle. Similarly, when the second relative distance between the wheelchair 10 and the obstacle is reduced, and the predicted contact time until the wheelchair 10 comes into contact is less than a predetermined time threshold (for example, 5 seconds or less), the CPU 44a Execute avoidance exercise, avoiding obstacles. The avoidance movement is realized by reducing the radius R1 of the concentric circle 76a, or changing the tracking position (position viewed from the target 74) of the wheelchair 10 to the side with few obstacles with the target 74 interposed therebetween.
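  A hedged sketch of the avoidance decision described here: when the second relative distance or the predicted contact time falls below its threshold, the orbit radius is reduced or the tracking position is mirrored to the other side of the target moving direction. The concrete adjustment factors and names are assumptions, not values from the embodiment.

```python
import math

DIST_THRESHOLD_M = 3.0   # example second-relative-distance threshold (3 m)
TIME_THRESHOLD_S = 5.0   # example predicted-contact-time threshold (5 s)

def avoidance_adjustment(second_relative_distance, closing_speed,
                         radius_r1, tracking_angle):
    """Return an adjusted (radius, tracking_angle) when an obstacle gets too close.

    Avoidance is expressed here either as shrinking the orbit radius R1 or as
    mirroring the tracking angle to the other side of the target moving direction,
    so that the wheelchair ends up farther from the obstacle.
    """
    ttc = (second_relative_distance / closing_speed) if closing_speed > 0 else math.inf
    if second_relative_distance > DIST_THRESHOLD_M and ttc > TIME_THRESHOLD_S:
        return radius_r1, tracking_angle          # no avoidance needed
    return 0.5 * radius_r1, -tracking_angle       # closer to the target, mirrored side

print(avoidance_adjustment(2.0, 0.5, radius_r1=1.0, tracking_angle=math.radians(135)))
```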
 目標設定部64は、対象特定部60が決定した追尾対象体を基準として、車いす10の移動目標位置を決定する。本実施形態の移動体追尾システム100の場合、移動体としての車いす10の移動目標位置は、一例として、図5に示すように追尾対象体としてのターゲット74を中心とする追尾間隔(第一の相対距離)が半径R1の同心円上に設定される。軌道円設定部64aは、対象特定部60が特定した追尾対象体(ターゲット74)を中心にターゲット74の移動、停止に拘わらず、常に半径R1の同心円76a(円軌道)を設定する。 The target setting unit 64 determines the movement target position of the wheelchair 10 based on the tracking target object determined by the target specifying unit 60. In the case of the moving body tracking system 100 according to the present embodiment, the movement target position of the wheelchair 10 as a moving body is, for example, a tracking interval (first step) centered on a target 74 as a tracking target body as shown in FIG. Relative distance) is set on a concentric circle of radius R1. The orbital circle setting unit 64a always sets a concentric circle 76a (circular orbit) having a radius R1 regardless of whether the target 74 is moved or stopped around the tracking target object (target 74) identified by the object identifying unit 60.
 また、移動目標位置設定部64bは、対象特定部60が特定したターゲット74に対して、軌道円設定部64aが設定した同心円76a上で、移動方向取得部58bが取得したターゲット74のターゲット移動方向Pを基準に追尾角度θとなる位置に車いす10の移動目標位置を設定する。図5の場合一例として、反時計方向に所定角度(追尾角度θ)だけ回転した(ずれた)位置に車いす10の移動目標位置を設定する。移動目標位置設定部64bは、追尾角度θとして、ターゲット74(追尾対象体)の側方および後方の領域からターゲット移動方向Pの逆方向ベクトルPVを含む所定の後方領域(解除角度θN)を除く領域に移動目標位置を設定する。つまり、車いす10の移動目標位置は、ターゲット74の真後ろの位置には設定されない。このような設定を行うことにより、例えば、車いす10がターゲット74を追尾して移動しているときに、ターゲット74が急停止したり、急に方向転換したりした場合でも、車いす10は、ターゲット74と接触することを回避できる。図5の場合、逆方向ベクトルPVを中心として、例えば、左右45°(合計90°)で解除角度θNが設定されている例を示している。そして、図5の場合は、ターゲット74が図示の位置に移動した場合の車いす10の位置が示されている。このように、ターゲット74がターゲット移動方向Pで示される方向にターゲット速度VTで移動する場合、車いす10も同様に移動し、ターゲット74と車いす10の相対位置関係を保ちながら移動するとともに、ターゲット74との接触も容易に回避することができる。 In addition, the movement target position setting unit 64b is the target movement direction of the target 74 acquired by the movement direction acquisition unit 58b on the concentric circle 76a set by the trajectory circle setting unit 64a with respect to the target 74 specified by the target specifying unit 60. The movement target position of the wheelchair 10 is set at a position where the tracking angle θ is based on P. As an example in the case of FIG. 5, the movement target position of the wheelchair 10 is set at a position rotated (shifted) by a predetermined angle (tracking angle θ) counterclockwise. The movement target position setting unit 64b excludes a predetermined rear region (release angle θN) including the reverse vector PV in the target moving direction P from the side and rear regions of the target 74 (tracking target) as the tracking angle θ. Set the movement target position in the area. That is, the movement target position of the wheelchair 10 is not set to a position immediately behind the target 74. By performing such setting, for example, when the wheelchair 10 is moving while tracking the target 74, even if the target 74 suddenly stops or suddenly changes direction, the wheelchair 10 Contact with 74 can be avoided. In the case of FIG. 5, an example in which the release angle θN is set at 45 ° left and right (total 90 °) with the reverse direction vector PV as the center is shown. In the case of FIG. 5, the position of the wheelchair 10 when the target 74 moves to the illustrated position is shown. Thus, when the target 74 moves in the direction indicated by the target moving direction P at the target speed VT, the wheelchair 10 moves in the same manner, and moves while maintaining the relative positional relationship between the target 74 and the wheelchair 10. Contact with can also be easily avoided.
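  One possible way to keep the movement target out of the release region around the reverse direction vector PV is to clamp the requested tracking angle, as sketched below. The 45 degree half-width follows the example in the text; the clamping rule itself is an assumption for illustration.

```python
import math

RELEASE_HALF_ANGLE = math.radians(45.0)   # example: 45 deg on each side of the reverse of P

def clamp_tracking_angle(tracking_angle):
    """Push a requested tracking angle out of the release region behind the target.

    Angles are measured counterclockwise from the target moving direction P and
    normalised to (-pi, pi]; the region within RELEASE_HALF_ANGLE of +/-180 deg,
    i.e. directly behind the target, is not allowed.
    """
    a = math.atan2(math.sin(tracking_angle), math.cos(tracking_angle))  # normalise
    limit = math.pi - RELEASE_HALF_ANGLE
    if a > limit:
        return limit
    if a < -limit:
        return -limit
    return a

print(math.degrees(clamp_tracking_angle(math.radians(170))))   # clamped to 135 deg
```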
 なお、車いす10の移動目標位置は、基本的には、同心円76a上の後方領域(解除角度θN)を除く領域で設定されればよいが、車いす10のユーザに、より安心感を与えるためには、ターゲット74の真横より後方の位置(標準角度θGで規定される位置)で設定されることが望ましい。この場合、車いす10に着座するユーザの視野に常にターゲット74(この場合、付添人)が入り、安心感を得ることができる。なお、人間の視野は例えば、120°程度あるので、ターゲット74(この場合、付添人)は、ターゲット移動方向Pを向きつつも首を少し振るのみで標準角度θGに存在する車いす10を確認することができる。その結果、例えば、ターゲット74である付添人は車いす10に着座するユーザを確認しながら(例えば会話をしながら)移動することができる。また、追尾中の車いす10が何らかの原因でターゲット74から離れてしまうような場合でも、迅速にその状況を認識することができる。 The movement target position of the wheelchair 10 may be basically set in an area excluding the rear area (release angle θN) on the concentric circle 76a. However, in order to give the wheelchair 10 a more secure feeling. Is preferably set at a position behind the target 74 (position defined by the standard angle θG). In this case, the target 74 (in this case, an attendant) always enters the field of view of the user sitting on the wheelchair 10, and a sense of security can be obtained. Note that since the human field of view is, for example, about 120 °, the target 74 (in this case, an attendant) confirms the wheelchair 10 existing at the standard angle θG only by slightly shaking the head while facing the target moving direction P. be able to. As a result, for example, the attendant who is the target 74 can move while confirming the user sitting on the wheelchair 10 (for example, while having a conversation). Even when the wheelchair 10 being tracked is separated from the target 74 for some reason, the situation can be recognized quickly.
 The surrounding situation recognition unit 64c detects the degree of congestion of the obstacles around the wheelchair 10 specified by the obstacle specifying unit 62. For example, the ROM 44b holds in advance a congestion degree M indicating the density of obstacles per unit area, and the unit recognizes whether the surroundings of the wheelchair 10 are in a "dense" state at or above the congestion degree M or in a "sparse" state below the congestion degree M. When the surroundings recognized by the surrounding situation recognition unit 64c are "sparse", the trajectory circle setting unit 64a sets the radius R1 of the trajectory circle (the first relative distance) to, for example, a standard distance (for example, 1 m). Maintaining the relative distance between the wheelchair 10 and the target 74 at, say, 1 m reduces the feeling of pressure on both and also reduces the anxiety caused by being too far apart. Conversely, when the surroundings recognized by the surrounding situation recognition unit 64c are "dense", the trajectory circle setting unit 64a reduces the radius R1 of the trajectory circle (the first relative distance, for example from 1 m to 0.5 m) so that the wheelchair 10 comes closer to the target 74 and can travel easily even in crowded conditions.
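 As a rough sketch of this radius selection, assuming that the obstacle density is simply the number of detected obstacles divided by the scanned area and that the sparse and dense radii are 1 m and 0.5 m as in the example above (the function and parameter names are illustrative, not part of this embodiment):

```python
def select_trajectory_radius(num_obstacles, scanned_area_m2, congestion_degree_m,
                             sparse_radius_m=1.0, dense_radius_m=0.5):
    """Pick the trajectory-circle radius R1 from the obstacle density.

    A density at or above the stored congestion degree M is treated as
    "dense" and the smaller radius is returned; otherwise the standard
    (sparse) radius is kept.
    """
    density = num_obstacles / scanned_area_m2   # obstacles per unit area
    return dense_radius_m if density >= congestion_degree_m else sparse_radius_m
```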
 The traveling control unit 66 performs the overall control of the travel of the wheelchair 10. When the wheelchair 10 tracks the target 74 in the tracking travel mode, the straight speed calculation unit 66a calculates the straight speed VM of the wheelchair 10, that is, the rotational speed of the drive wheels 22, based on the moving speed of the target 74 acquired by the moving speed acquisition unit 58c. When the wheelchair 10 tracks the target 74, the turning speed calculation unit 66b calculates the turning speed ωM of the casters 28 from the moving direction of the target 74 acquired by the moving direction acquisition unit 58b and the current speed of the wheelchair 10 based on the sensor values acquired by the wheel speed sensor 48. For example, when the target 74 changes its traveling direction abruptly the turning speed ωM becomes higher, and when it changes direction slowly the turning speed ωM becomes lower. The steering angle setting unit 66c determines the control of the steering angle of the casters 28, that is, the control amount of the actuator 30a of the steering system 30, so that the wheelchair 10 reaches the movement target position set by the movement target position setting unit 64b.
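 The embodiment does not prescribe specific formulas for VM and ωM. The sketch below shows only one plausible choice, assumed for illustration: the straight speed covers the remaining distance to the movement target position within one control period (and is at least the target's own speed), and the turning speed removes the current heading error within the same period. All names and the formulas themselves are assumptions for the example.

```python
def tracking_speed_commands(target_speed_ms, distance_to_goal_m,
                            heading_error_rad, control_period_s):
    """Very simplified per-cycle speed commands for the tracking travel mode.

    Returns (straight speed VM, turning speed) under the assumed rule that
    the wheelchair should reach the current movement target position before
    the next one is set.
    """
    straight_speed_vm = max(target_speed_ms, distance_to_goal_m / control_period_s)
    turning_speed = heading_error_rad / control_period_s
    return straight_speed_vm, turning_speed
```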
 The battery control unit 66d controls the output value of the battery system 32 so that the drive system 24 can achieve a speed at which the wheelchair 10 can reach the movement target position set by the movement target position setting unit 64b before the movement target position for the next control cycle is determined. The battery control unit 66d also controls the charging and discharging of the battery 32a and manages the remaining battery level. The motor control unit 66e acquires the load state of the actuator 24a (motor) from the torque value obtained from the torque sensor 24b, and increases or decreases the speed so that the wheelchair can reach the movement target position set by the movement target position setting unit 64b before the movement target position for the next control cycle is determined. For example, the output value of the actuator 24a (motor) is determined according to the road surface condition (gradient, unevenness, and so on).
 The display control unit 68 executes processing for displaying the traveling state of the wheelchair 10 (speed, traveling direction, and so on) in the normal travel mode and the tracking travel mode, the state of charge of the battery 32a, and the number and positions of obstacles around the wheelchair 10. The voice control unit 70 executes processing for announcing information on the driving state of the wheelchair 10 and for issuing warnings when obstacles or the like are present in the normal travel mode or the tracking travel mode. The output unit 72 outputs the control results of the traveling control unit 66 to each system and outputs the control results of the display control unit 68 and the voice control unit 70 to the monitor device 42.
 In the normal travel mode, the traveling control unit 66 executes the control of the straight speed calculation unit 66a, the turning speed calculation unit 66b, the steering angle setting unit 66c, the battery control unit 66d, the motor control unit 66e, and so on in accordance with the operation of the operation unit 36 (for example, the direction and amount of joystick tilt), so that the wheelchair 10 travels according to the user's instructions.
 Examples of setting the movement target position when the wheelchair 10 configured in this way tracks the target 74 will be described with reference to FIGS. 6 to 8. The examples shown in FIGS. 6 to 8 illustrate the setting of the movement target position when there are no obstacles around the wheelchair 10 or when, even though obstacles are present, the surroundings are in the "sparse" (uncrowded) state.
 FIG. 6 shows the movement target position 78 of the wheelchair 10 when the target 74, which is the tracking target object, moves straight ahead from its current position in the target movement direction P at the target speed VT to the position indicated by the dotted line. The target specifying unit 60 first specifies the target 74 based on the positional relationship of the objects around the wheelchair 10 acquired by the acquisition unit 58; for example, the object closest to the wheelchair 10 is set as the target 74. The trajectory circle setting unit 64a acquires the degree of congestion around the wheelchair 10 from the recognition result of the surrounding situation recognition unit 64c and, in the uncrowded case shown in FIG. 6, sets the concentric circle 76a of radius R1 (the first relative distance). At this time, if the wheelchair 10 is not on the concentric circle 76a, the traveling control unit 66 moves the wheelchair 10 to the position on the concentric circle 76a centered on the target 74 that is shifted counterclockwise by the tracking angle θ from the target movement direction P acquired by the moving direction acquisition unit 58b, and this position is taken as the current position of the wheelchair 10 corresponding to the current position of the target 74. When the target 74 moves straight ahead in the target movement direction P as shown in FIG. 6, the concentric circle 76a of radius R1 likewise shifts straight ahead in the target movement direction P, that is, it is set around the dotted-line target 74. The movement target position 78 is then set on the concentric circle 76a centered on the dotted-line target 74, at the position shifted counterclockwise by the tracking angle θ from the target movement direction P of the dotted-line target 74. The straight speed calculation unit 66a calculates the straight speed VM of the wheelchair 10 so that the wheelchair 10 reaches the movement target position 78 before the next movement target position is set. In this case, since the target 74 is only moving straight ahead in the target movement direction P, the turning speed calculation unit 66b does not calculate the turning speed ωθ. Thus, when the target 74 moves straight ahead in the target movement direction P, as shown by the dotted line, the relative positional relationship of the wheelchair 10 as seen from the target 74 does not change and the target 74 can be tracked.
 FIG. 7 shows the movement target position 78 of the wheelchair 10 when the target 74, which is the tracking target object, makes a turning motion from its current position and moves to the dotted-line position, taking the direction after the turn as the target movement direction P. In this case as well, the target specifying unit 60 specifies the target 74 based on the positional relationship of the objects around the wheelchair 10 acquired by the acquisition unit 58, and the trajectory circle setting unit 64a acquires the degree of congestion around the wheelchair 10 from the recognition result of the surrounding situation recognition unit 64c and, in the uncrowded case shown in FIG. 7, sets the concentric circle 76a of radius R1 (the first relative distance). When the target 74 turns and then moves straight ahead in the target movement direction P as shown in FIG. 7, the concentric circle 76a of radius R1 is again set around the dotted-line target 74, and the movement target position 78 is set on that circle at the position shifted counterclockwise by the tracking angle θ from the target movement direction P of the dotted-line target 74. The straight speed calculation unit 66a calculates the straight speed VM of the wheelchair 10 so that the wheelchair 10 reaches the movement target position before the next movement target position 78 is set. In this case, since the target 74 is turning, the turning speed calculation unit 66b calculates the turning speed ωθ at which the wheelchair 10 must turn to reach the movement target position before the next movement target position 78 is set. Thus, even when the target 74 turns and then moves straight ahead with the target movement direction P pointing in the turning direction, the relative positional relationship of the wheelchair 10 as seen from the target 74 does not change and the target 74 can be tracked.
 FIG. 8 shows the movement target position 78 of the wheelchair 10 when the target 74, which is the tracking target object, makes a turning motion from its current position and then, as indicated by the dotted-line target 74, takes a target movement direction P that differs from the turning direction. In this case, the position of the dotted-line target 74 is the same as in FIG. 7, but the direction in which it currently faces is different. Here as well, the target specifying unit 60 specifies the target 74 based on the positional relationship of the objects around the wheelchair 10 acquired by the acquisition unit 58, and the trajectory circle setting unit 64a acquires the degree of congestion around the wheelchair 10 from the recognition result of the surrounding situation recognition unit 64c and, in the uncrowded case shown in FIG. 8, sets the concentric circle 76a of radius R1 (the first relative distance). When the dotted-line target 74 turns and then moves straight ahead in the target movement direction P as shown in FIG. 8, the concentric circle 76a of radius R1 is set around the dotted-line target 74, and the movement target position 78 is set on that circle at the position shifted counterclockwise by the tracking angle θ from the target movement direction P of the dotted-line target 74. Because the target movement direction P of the dotted-line target 74 in FIG. 8 differs from that in FIG. 7, the movement target position 78 set at the tracking angle θ with respect to P also differs from the movement target position 78 of FIG. 7. The straight speed calculation unit 66a calculates the straight speed VM of the wheelchair 10 so that the wheelchair 10 reaches the movement target position 78 before the next movement target position 78 is set. In this case too, since the target 74 is turning, the turning speed calculation unit 66b calculates the turning speed ωθ at which the wheelchair 10 must turn to reach the movement target position 78 before the next one is set. In this way, the movement target position 78 is determined according to the target movement direction P after the target 74 turns, so the relative positional relationship of the wheelchair 10 as seen from the target 74 does not change and the target 74 can be tracked. By combining the operations of FIGS. 6 to 8, the wheelchair 10 can track the target 74 with a fixed relative positional relationship no matter which direction the target 74 moves in.
 An example of a processing procedure for realizing the tracking operation of the wheelchair 10 described above will be explained with reference to the flowcharts of FIGS. 9 and 10. FIG. 9 is a flowchart explaining the first half of the processing, and FIG. 10 is a flowchart explaining the second half.
 First, the CPU 44a detects, based on the operation state of, for example, the operation input unit 42b of the monitor device 42, whether the tracking travel mode, in which the target 74 is specified and tracked, or the normal travel mode, in which the user operates the operation unit 36 to move the wheelchair 10, is selected as the travel mode of the wheelchair 10 (S100). When neither the tracking travel mode nor the normal travel mode is selected, that is, in the self-propelled mode, the flowcharts of FIGS. 9 and 10 are not executed and the user moves the wheelchair 10 by turning the left and right hand rims 22a (drive wheels 22) directly.
 When the CPU 44a detects that the tracking travel mode is selected (Yes in S100), the display control unit 68 outputs, via the output unit 72, a control signal indicating the tracking travel mode, and the display device 42a shows a message such as "tracking travel mode active" (S102). Next, the acquisition unit 58 acquires the surrounding situation of the wheelchair 10 (the positional relationship of the objects present around the wheelchair 10, and so on) via the laser sensor 38 (S104). The target specifying unit 60 then specifies the tracking target object (target 74) from among the objects present around the wheelchair 10 (S106). There are various ways of specifying the target 74. For example, the object closest to the wheelchair 10 among the objects detected by the acquisition unit 58 may be taken as the target 74. In another embodiment, the target 74 may carry a transmitter such as a beacon in advance and be specified by acquiring its signal. In yet another embodiment, the target 74 may be specified based on an image acquired by an imaging unit provided on the wheelchair 10. The target 74 may also be specified by combining these methods. For example, when the density of objects around the wheelchair 10 is "dense", a method using a transmitter such as a beacon is advantageous in that the target 74 can be specified quickly. When the density of objects around the wheelchair 10 is "sparse", specifying the closest object as the target 74 is advantageous in that the processing load is small and no beacon or similar device is needed, which also reduces cost. Image processing is advantageous in that the type of target 74 to be specified can be changed easily and the accuracy of specification can be improved without using a device such as a beacon.
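 A minimal sketch of the simplest of these methods, taking the detected object closest to the wheelchair as the target (the function name and the (x, y) point representation are assumptions for the example):

```python
import math

def specify_nearest_target(detected_objects, wheelchair_xy):
    """Return the detected object closest to the wheelchair, or None.

    detected_objects: iterable of (x, y) positions from the laser scan.
    Returning None corresponds to the "target not specified" branch (No in S108).
    """
    if not detected_objects:
        return None
    wx, wy = wheelchair_xy
    return min(detected_objects, key=lambda p: math.hypot(p[0] - wx, p[1] - wy))
```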
 When the specification of the target 74 is complete (Yes in S108), the relative distance acquisition unit 58a acquires the relative distance to the target 74, and the moving direction acquisition unit 58b acquires the moving direction of the target 74 based on the time-series information of the relative distance (S110). When the target 74 cannot be specified (No in S108), the specification processing for the target 74 continues.
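 As one illustration of deriving the moving direction from time-series information, the sketch below assumes that consecutive target positions are available in a fixed planar frame (the function name and the data layout are assumptions, not part of this embodiment):

```python
import math

def moving_direction_from_history(position_history):
    """Estimate the target movement direction P from time-series positions.

    position_history: list of (x, y) target positions, oldest first.
    Returns the heading (radians) of the most recent displacement, or None
    when fewer than two samples are available.
    """
    if len(position_history) < 2:
        return None
    (x0, y0), (x1, y1) = position_history[-2], position_history[-1]
    return math.atan2(y1 - y0, x1 - x0)
```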
 Next, the surrounding situation recognition unit 64c calculates the congestion state around the wheelchair 10 based on the object presence information acquired by the relative distance acquisition unit 58a and compares it with the congestion degree M held in advance in the ROM 44b or the like (S112). When the congestion around the wheelchair 10 is below the predetermined congestion degree M (No in S112), that is, when the surroundings recognized by the surrounding situation recognition unit 64c are "sparse" as shown in FIG. 11, with only a few pedestrians (for example, two), pedestrian 80a (walking speed VSa) and pedestrian 80b (walking speed VSb), present around the wheelchair 10 in addition to the target 74 moving in the target movement direction P at the target speed VT, the trajectory circle setting unit 64a sets the concentric circle 76a (first trajectory circle) with a radius R1 larger than in the "dense" case of pedestrians 80a (walking speed VSa) through 80g (walking speed VSg) shown in FIG. 12 (S114). The first trajectory circle can be, for example, a circle whose first relative distance is the standard distance (1 m). On the other hand, when the congestion around the wheelchair 10 is at or above the predetermined congestion degree M (Yes in S112), that is, when the surroundings recognized by the surrounding situation recognition unit 64c are "dense" as shown in FIG. 12, with many pedestrians 80a (walking speed VSa) through 80g (walking speed VSg) present around the wheelchair 10 in addition to the target 74, the trajectory circle setting unit 64a sets the concentric circle 76b (second trajectory circle) with a radius R2 smaller than in the "sparse" case of pedestrians 80a and 80b shown in FIG. 11 (S116). The second trajectory circle can be a circle whose first relative distance is 0.5 m, half the standard distance (for example, 1 m). Using the first and second trajectory circles selectively according to the surrounding situation makes it easier to avoid obstacles and easier for the wheelchair 10 to move. In addition, because the relative distance between the target 74 and the wheelchair 10 changes according to the surrounding situation at the time, the target 74 and the wheelchair 10 neither come unnecessarily close, which would create a feeling of pressure, nor drift too far apart, which would create anxiety. In FIGS. 11 and 12, as an example, the movement target position of the wheelchair 10 is set so as to lie directly beside the target 74.
 The CPU 44a determines, from the recognition result of the surrounding situation recognition unit 64c, whether an obstacle to be avoided exists around the wheelchair 10 (S118). For example, the CPU 44a determines that there is no obstacle to be avoided when the second relative distance between the obstacle and the wheelchair 10 calculated by the obstacle distance calculation unit 58d is longer than a predetermined distance threshold (for example, 3 m or more). Alternatively, the CPU 44a determines that there is no obstacle to be avoided when the predicted contact time until the obstacle contacts the wheelchair 10, calculated by the contact prediction time calculation unit 58e, is longer than a predetermined time threshold (for example, 5 seconds or more). When the CPU 44a determines that there is no obstacle to be avoided (No in S118), the movement target position setting unit 64b sets the movement target position 78 at the tracking angle θ relative to the target movement direction P of the target 74 (S120). Once the movement target position 78 is set, the straight speed calculation unit 66a calculates the tracking straight speed and the turning speed calculation unit 66b calculates the tracking turning speed (S122). The traveling control unit 66 then sets the direction of the casters 28 with the steering angle setting unit 66c and drives the actuator 24a (motor) with the motor control unit 66e so that the wheelchair 10 moves to the movement target position (S124).
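 The decision of S118 can be summarized as follows, using the 3 m and 5 s example thresholds mentioned above (the function name and argument layout are illustrative only; the contact time may be unavailable when the distance is not shrinking):

```python
def obstacle_needs_avoidance(second_relative_distance_m, predicted_contact_time_s,
                             distance_threshold_m=3.0, time_threshold_s=5.0):
    """Decide whether an obstacle around the wheelchair must be avoided (S118).

    An obstacle is not avoided when it is farther than the distance threshold,
    or when the predicted contact time (if available) exceeds the time threshold.
    """
    if second_relative_distance_m > distance_threshold_m:
        return False
    if predicted_contact_time_s is not None and predicted_contact_time_s > time_threshold_s:
        return False
    return True
```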
 When the CPU 44a can confirm, from the relative position to the target 74 after the movement and the tracking angle θ based on the target movement direction P of the target 74, that the movement to the movement target position for the current control cycle is complete (Yes in S126), it checks whether the completion condition of the tracking travel mode is satisfied (S128). For example, when the travel mode has changed to a mode other than the tracking travel mode, or when no movement of the target 74 can be confirmed for a predetermined period or longer (for example, when the target 74 has remained stationary for 60 seconds or more), the CPU 44a determines that the completion condition of the tracking travel mode is satisfied (Yes in S128). The CPU 44a then causes the display control unit 68 to display the end of the tracking travel mode (S130) and temporarily ends this flow. When it cannot be detected in S128 that the end condition of the tracking travel mode is satisfied (No in S128), the CPU 44a returns to S108 and executes the subsequent processing. As a result, the wheelchair 10 can keep tracking the target 74 as long as the target 74 keeps moving. When it cannot be confirmed in S126 that the movement to the movement target position for the current control cycle is complete (No in S126), the CPU 44a returns to S124 and continues the movement to the movement target position.
 When the CPU 44a determines in S118 that an obstacle to be avoided exists around the wheelchair 10 (Yes in S118), the movement target position setting unit 64b sets an avoidance position, as the region in which the wheelchair 10 is to be located, in a region different from the region where the obstacle to be avoided exists (S132). This avoidance position is set, for example, in the region on the side of the target 74 where fewer obstacles to be avoided are present. For example, as shown in FIGS. 13 and 14, when an obstacle 80a (80b) to be avoided exists on the left side of the target 74, the wheelchair 10 is moved to the right (opposite) side of the target 74 and tracking movement continues from that position. In this embodiment, the avoidance trajectory used to move to the avoidance position 82 differs depending on the direction from which the obstacle to be avoided approaches. For example, FIG. 13 shows the case where a pedestrian 80b approaches as an obstacle from in front of the wheelchair 10 at the approach speed VQ. When the pedestrian 80b approaches from in front of the wheelchair 10 as indicated by the dotted line (Yes in S134), the movement target position setting unit 64b selects a backward avoidance trajectory along which the wheelchair 10 moves while backing away from the pedestrian 80b (S136). For example, when the second relative distance L between the pedestrian 80b and the wheelchair 10 becomes equal to or less than a predetermined distance threshold (for example, 3 m or less), the wheelchair 10 moves along the backward avoidance trajectory to the avoidance position 82 on the opposite side of the target 74. Likewise, when the second relative distance L between the pedestrian 80b and the wheelchair 10 is shrinking and the predicted contact time until the wheelchair 10 traveling at the straight speed VM and the pedestrian 80b approaching at the speed VQ would make contact becomes equal to or less than a predetermined time threshold (for example, 5 seconds or less), the wheelchair 10 moves along the backward avoidance trajectory, backing away from the pedestrian 80b, to the avoidance position 82 on the opposite side of the target 74.
 FIG. 14 shows the case where a pedestrian 80a approaches as an obstacle from behind the wheelchair 10 at the approach speed VQ. When the pedestrian 80a approaches from behind the wheelchair 10 as indicated by the dotted line (No in S134), the movement target position setting unit 64b selects a forward avoidance trajectory along which the wheelchair 10 moves while advancing away from the pedestrian 80a (S138). For example, when the second relative distance L between the pedestrian 80a and the wheelchair 10 becomes equal to or less than a predetermined distance threshold (for example, 3 m or less), the wheelchair 10 moves along the forward avoidance trajectory, away from the pedestrian 80a, to the avoidance position 82 on the opposite side of the target 74. Likewise, when the second relative distance L between the pedestrian 80a and the wheelchair 10 is shrinking and the predicted contact time until the wheelchair 10 traveling at the straight speed VM and the pedestrian 80a approaching at the speed VQ would make contact becomes equal to or less than a predetermined time threshold (for example, 5 seconds or less), the wheelchair 10 moves along the forward avoidance trajectory, away from the pedestrian 80a, to the avoidance position 82 on the opposite side of the target 74.
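 The trigger conditions and the choice between the backward and forward avoidance trajectories (S132 to S138) can be sketched as follows; the function name, argument layout, and string return values are assumptions made for the example:

```python
def avoidance_decision(obstacle_in_front, relative_distance_l_m, distance_shrinking,
                       predicted_contact_time_s,
                       distance_threshold_m=3.0, time_threshold_s=5.0):
    """Decide whether to start avoidance and which trajectory type to use.

    Avoidance toward the far side of the target starts when the second
    relative distance L drops to the distance threshold or, while L is
    shrinking, the predicted contact time drops to the time threshold.
    An obstacle ahead selects the backward avoidance trajectory; an obstacle
    behind selects the forward avoidance trajectory.
    """
    triggered = (relative_distance_l_m <= distance_threshold_m or
                 (distance_shrinking and predicted_contact_time_s is not None
                  and predicted_contact_time_s <= time_threshold_s))
    if not triggered:
        return None
    return "backward" if obstacle_in_front else "forward"
```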
 Once the avoidance position 82 and the avoidance trajectory have been determined, the straight speed calculation unit 66a calculates the avoidance straight speed of the wheelchair 10 for moving to the avoidance position 82, and the turning speed calculation unit 66b calculates the avoidance turning speed (S140). The traveling control unit 66 then executes the avoidance movement of the wheelchair 10 to the avoidance position 82 (S142). This avoidance movement may move the wheelchair 10 in a straight line from its current position to the avoidance position 82, or may follow the trajectory circle (first trajectory circle: radius R1, or second trajectory circle: radius R2). When the wheelchair 10 avoids by moving in a straight line, the travel time can be shortened and the wheelchair can quickly move away from the pedestrian 80a or 80b. On the other hand, when the avoidance movement follows the first or second trajectory circle, the avoidance motion is performed while the relative distance between the wheelchair 10 and the target 74 is kept constant, so the wheelchair 10 does not come closer to the target 74 than usual, and even if the avoidance motion starts suddenly it is less likely to feel unnatural to the user or the target 74. When the avoidance movement of the wheelchair 10 to the avoidance position 82 is complete (Yes in S144), the CPU 44a proceeds to S120 and executes the subsequent processing, carrying out tracking control of the target 74 with the avoidance position 82 as the reference. When the avoidance movement is not complete in S144 (No in S144), the avoidance movement of S142 continues until the wheelchair reaches the avoidance position 82.
 In S100, when the mode is not the self-propelled mode and it is not detected that the tracking travel mode is selected (No in S100), the display control unit 68 outputs, via the output unit 72, a control signal indicating the normal travel mode, and the display device 42a shows a message such as "normal travel mode active" (S146). The traveling control unit 66 then executes stick-operation drive processing according to the operation state of the operation unit 36 (S148). For example, the straight speed calculation unit 66a calculates the straight speed according to how far the joystick is tilted, and the turning speed calculation unit 66b calculates the turning speed according to the direction in which it is tilted. The steering angle setting unit 66c sets the steering angle of the casters 28 and drives the actuator 30a of the steering system 30 by a predetermined amount. The motor control unit 66e rotates the actuator 24a (motor) of the drive system 24 at a predetermined speed, moving the wheelchair 10 in the direction and at the speed indicated by the joystick.
 The CPU 44a continues the stick-operation drive processing until the stop condition in the normal travel mode is satisfied (No in S150). The stop condition in the normal travel mode can be regarded as satisfied, for example, when a predetermined period (for example, 30 seconds) has elapsed after the joystick is returned to the neutral position. When the stop condition in the normal travel mode is satisfied (Yes in S150), the CPU 44a determines that the end condition of the normal travel mode is satisfied, causes the display control unit 68 to display the end of the normal travel mode (S152), and temporarily ends this flow.
 As described above, according to the moving body tracking system 100 of this embodiment, the wheelchair 10 always sets its movement target position at a position that maintains the first relative distance from the target 74 and that lies at the tracking angle θ based on the target movement direction P of the target 74 (for example, a left or right position within arm's reach of the target 74), and moves to it. As a result, there is no need to estimate a complicated movement path of the target 74 in order for the wheelchair 10 to track it, the computational load for tracking can be reduced, and a smooth, low-cost tracking (following) operation can be realized.
 In the embodiment described above, the wheelchair 10 was shown as an example of the moving body, but the moving body is not limited to this. It may be a vehicle on which a healthy person rides in a seated or standing posture, and the same effects as in the embodiment described above can be obtained. It may also be a vehicle that moves something other than a person, for example an unmanned vehicle or a robot for carrying luggage or transporting animals; in that case, the travel mode may be the tracking travel mode only. A plurality of moving bodies may also be operated: for example, the leading moving body may track the target 74 while each subsequent moving body regards the preceding moving body as its tracking target object and tracks it, and the same effects can be obtained.
 In the embodiment described above, the degree of congestion of obstacles is determined by the surrounding situation recognition unit 64c based on the acquisition result of the acquisition unit 58. In another embodiment, the CPU 44a may acquire the degree of congestion and the positions of obstacles (pedestrians 80) from infrastructure around the moving body (wheelchair 10), or may acquire the degree of congestion (for example, the congestion by time of day) by referring to past congestion data for the area where the moving body (wheelchair 10) is located, and reflect it in the tracking control.
 In the embodiment described above, the laser sensor 38 was shown as an example of the information acquisition unit that acquires information indicating the positional relationship of objects around the moving body (wheelchair 10), but any means capable of detecting the positional relationship of objects (detecting objects and their relative distances, and so on) may be used instead. For example, an image may be acquired by an imaging device (such as a stereo camera) and information indicating the positional relationship of surrounding objects may be obtained by applying image processing to the image, with the same effect as when the laser sensor 38 is used.
 While embodiments and modifications of the present invention have been described, these embodiments and modifications are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and their equivalents.
Description of reference numerals: 10…wheelchair, 22…drive wheel, 24…drive system, 26…brake system, 28…caster, 30…steering system, 32…battery system, 36…operation unit, 38…laser sensor, 42…monitor device, 44…ECU, 60…target specifying unit, 62…obstacle specifying unit, 64…target setting unit, 64a…trajectory circle setting unit, 64b…movement target position setting unit, 64c…surrounding situation recognition unit, 66…traveling control unit, 66a…straight speed calculation unit, 66d…battery control unit, 66e…motor control unit, 68…display control unit, 72…output unit, 74…target, 76a, 76b…concentric circles, 78…movement target position, 80…pedestrian, 82…avoidance position, 100…moving body tracking system, P…target movement direction, θ…tracking angle.

Claims (6)

  1.  A moving body tracking control device comprising: a target specifying unit that specifies a tracking target object to be tracked by a moving body; an acquisition unit that acquires, based on an acquisition result of an information acquisition unit that acquires information indicating a positional relationship of objects around the moving body, at least a relative distance between the moving body and the tracking target object and a moving direction of the tracking target object; a target setting unit that sets a movement target position of the moving body at a position that lies in a direction shifted by a predetermined angle from the moving direction of the tracking target object, taking that moving direction as a reference, and that is at a first relative distance from the tracking target object; and a control unit that controls the moving body so that the moving body moves to the movement target position.

  2.  The moving body tracking control device according to claim 1, wherein the target setting unit sets the movement target position on a circular trajectory centered on the position of the tracking target object and having a radius defined by the first relative distance.

  3.  The moving body tracking control device according to claim 1 or 2, wherein the target setting unit sets the movement target position in a region of the sides and rear of the tracking target object excluding a predetermined rear region that includes a vector in the direction opposite to the moving direction.

  4.  The moving body tracking control device according to any one of claims 1 to 3, wherein the target setting unit changes the first relative distance according to the surrounding situation of the moving body.

  5.  The moving body tracking control device according to any one of claims 1 to 4, wherein the acquisition unit acquires a second relative distance between the moving body and an obstacle present around the moving body included in the acquisition result of the information acquisition unit or, when the second relative distance is shrinking, a predicted contact time until the obstacle contacts the moving body, and the target setting unit sets the movement target position at a position closer to the tracking target object than the first relative distance when the second relative distance becomes equal to or less than a predetermined distance threshold or when the predicted contact time becomes equal to or less than a predetermined time threshold.

  6.  The moving body tracking control device according to any one of claims 1 to 4, wherein the acquisition unit acquires a second relative distance between the moving body and an obstacle present around the moving body included in the acquisition result of the information acquisition unit or, when the second relative distance is shrinking, a predicted contact time until the obstacle contacts the moving body, the target setting unit sets the movement target position on the opposite side of the tracking target object from the obstacle when the second relative distance becomes equal to or less than a predetermined distance threshold or when the predicted contact time becomes equal to or less than a predetermined time threshold, and the control unit controls the moving body so that the moving body moves to the movement target position on the opposite side along a trajectory that takes the moving body away from the obstacle.
PCT/JP2017/030890 2016-10-20 2017-08-29 Mobile body tracking control device WO2018074069A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-206039 2016-10-20
JP2016206039A JP2018067183A (en) 2016-10-20 2016-10-20 Mobile body tracking controller

Publications (1)

Publication Number Publication Date
WO2018074069A1 true WO2018074069A1 (en) 2018-04-26

Family

ID=62019638

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/030890 WO2018074069A1 (en) 2016-10-20 2017-08-29 Mobile body tracking control device

Country Status (2)

Country Link
JP (1) JP2018067183A (en)
WO (1) WO2018074069A1 (en)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7070265B2 (en) * 2018-09-14 2022-05-18 トヨタ車体株式会社 Follow-up trolley
JP2020086995A (en) * 2018-11-27 2020-06-04 富士ゼロックス株式会社 Autonomous mobile device and program
JP7281306B2 (en) * 2019-03-06 2023-05-25 パナソニックホールディングス株式会社 Mobile object management device and mobile object management method
JP2020166708A (en) * 2019-03-29 2020-10-08 株式会社エクォス・リサーチ Moving body
JP7336935B2 (en) * 2019-09-26 2023-09-01 ダイムラー トラック エージー Vehicle control system and vehicle control method
CN113450923B (en) * 2020-03-27 2023-07-14 中国科学院深圳先进技术研究院 Method and system for simulating influenza space-time propagation process by large-scale track data
WO2021246169A1 (en) * 2020-06-01 2021-12-09 ソニーグループ株式会社 Information processing device, information processing system, method, and program
JP2024004557A (en) * 2022-06-29 2024-01-17 株式会社日立製作所 Mobile object introduction support system and mobile object introduction support method


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4316477B2 (en) * 2004-11-18 2009-08-19 パナソニック株式会社 Tracking method of mobile robot
JP4552869B2 (en) * 2006-02-10 2010-09-29 パナソニック株式会社 Tracking method for moving objects
JP2008191800A (en) * 2007-02-02 2008-08-21 Hitachi Ltd Follow-up vehicle
JP2014123304A (en) * 2012-12-21 2014-07-03 Secom Co Ltd Autonomous mobile robot

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7166712B2 (en) 2018-06-13 2022-11-08 アルパイン株式会社 personal mobility
JP2019215699A (en) * 2018-06-13 2019-12-19 アルパイン株式会社 Personal mobility
CN112393724A (en) * 2019-08-15 2021-02-23 阿里巴巴集团控股有限公司 Following method and device for target object
CN112393724B (en) * 2019-08-15 2024-04-02 阿里巴巴集团控股有限公司 Following method and device for target object
JP7249565B2 (en) 2020-12-23 2023-03-31 パナソニックIpマネジメント株式会社 ROBOT CONTROL METHOD, ROBOT, AND PROGRAM
WO2022138476A1 (en) * 2020-12-23 2022-06-30 パナソニックIpマネジメント株式会社 Method of controlling robot, robot, and program
JPWO2022138476A1 (en) * 2020-12-23 2022-06-30
JP7281654B1 (en) 2020-12-23 2023-05-26 パナソニックIpマネジメント株式会社 ROBOT CONTROL METHOD, ROBOT, AND PROGRAM
CN116249604A (en) * 2020-12-23 2023-06-09 松下知识产权经营株式会社 Robot control method, robot, and program
JP2023082713A (en) * 2020-12-23 2023-06-14 パナソニックIpマネジメント株式会社 Control method for robot, robot, and program
US11886190B2 (en) 2020-12-23 2024-01-30 Panasonic Intellectual Property Management Co., Ltd. Method for controlling robot, robot, and recording medium
US11906966B2 (en) 2020-12-23 2024-02-20 Panasonic Intellectual Property Management Co., Ltd. Method for controlling robot, robot, and recording medium
US11960285B2 (en) 2020-12-23 2024-04-16 Panasonic Intellectual Property Management Co., Ltd. Method for controlling robot, robot, and recording medium
CN116249604B (en) * 2020-12-23 2024-05-28 松下知识产权经营株式会社 Robot control method, robot, and computer program product
CN114047743A (en) * 2021-08-11 2022-02-15 中国舰船研究设计中心 Unmanned ship target tracking control method and system with prediction function
CN113923592B (en) * 2021-10-09 2022-07-08 广州宝名机电有限公司 Target following method, device, equipment and system
CN113923592A (en) * 2021-10-09 2022-01-11 广州宝名机电有限公司 Target following method, device, equipment and system

Also Published As

Publication number Publication date
JP2018067183A (en) 2018-04-26

Similar Documents

Publication Publication Date Title
WO2018074069A1 (en) Mobile body tracking control device
CN107817791B (en) Vehicle control device
JP4687395B2 (en) Parking control device
US10202123B2 (en) Vehicle control system
US9505436B2 (en) Parking assist system
JP4108314B2 (en) Vehicle periphery monitoring device
US10131356B2 (en) Travel control method and travel control apparatus
US20210061264A1 (en) Method for performing automatic valet parking
US20190187719A1 (en) Emergency lane change assistance system
WO2018011872A1 (en) Drive assistance device
WO2007142345A1 (en) Travel assistance device
KR20200017971A (en) Vehicle and method for controlling thereof
JP2020111310A (en) Vehicle and control method therefor
JP5461071B2 (en) Autonomous mobile body and mobile body system using it
US20140055615A1 (en) Parking assistant device
JP7301471B2 (en) Electronic devices and moving bodies equipped with them
JP2017146945A (en) Lead robot
JP4677880B2 (en) Parking assistance device and parking assistance method
JP2021000974A (en) Parking support system
JP2020203620A (en) Trailer
KR20220121186A (en) Vehicle control method, vehicle control system, and vehicle
JP2014164315A (en) Self-propelled transporting device and system
WO2016092773A1 (en) Autonomous driving control device, driving information output device, footrest, autonomous driving control method, and driving information output method
JP2017136990A (en) Parking support method
WO2017164911A1 (en) Transportation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17861314

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17861314

Country of ref document: EP

Kind code of ref document: A1