WO2017038883A1 - Autonomous mobile body and signal control system - Google Patents
Autonomous mobile body and signal control system
- Publication number
- WO2017038883A1 (application PCT/JP2016/075535)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- autonomous mobile
- mobile body
- signal
- pedestrian
- travel
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/095—Traffic lights
- G08G1/0955—Traffic lights transportable
Definitions
- the present invention relates to an autonomous mobile body and a signal control system, and is particularly suitable for application to an autonomous mobile body that safely crosses a pedestrian crossing just as a pedestrian does, and to a signal control system that cooperates with that autonomous mobile body.
- SLAM (Simultaneous Localization and Mapping)
- an autonomous mobile robot using SLAM technology dynamically generates an environment map representing the three-dimensional positions of objects in real space while estimating its own position with high accuracy, and moves autonomously through the environment by planning a route on that map.
- Patent Document 1 proposes detecting fixed or moving obstacles while an unmanned mobile device is moving, and resuming movement along the planned movement path after avoiding collision with them.
- JP 2008-294934 A; JP 11-275562 A
- when an autonomous mobile robot crosses a pedestrian crossing autonomously, unlike a walking pedestrian it may contact or collide with a car traveling on the roadway, or may otherwise interfere with traffic.
- the present invention has been made in consideration of the above points, and proposes an autonomous mobile body and a signal control system capable of traveling while ensuring far higher safety than before.
- an autonomous mobile body that moves freely within a designated area including a sidewalk comprises: a SLAM function unit that estimates the body's own position relative to the external environment in the designated area while creating a planar or three-dimensional environment map of the area; a travel control unit that, based on the output of the SLAM function unit, changes the body's travel route as needed so as not to contact surrounding objects; an imaging unit that images the outside world within a predetermined range ahead on the travel route; and a signal light color determination unit that recognizes a pedestrian traffic light in the designated area based on the imaging result and determines its signal light color and lighting/blinking switching.
- when the autonomous mobile body traveling on the sidewalk reaches a pedestrian crossing, the travel control unit, based on the determination result of the signal light color determination unit, makes the body travel across the pedestrian crossing in synchronization with the timing at which the signal light color of the pedestrian traffic light switches from red to green, and keeps the body stopped at any other timing.
- a signal control system that cooperates with an autonomous mobile body moving freely within a designated area including a sidewalk comprises: a signal light color control unit that controls the signal light color and lighting/blinking switching of a pedestrian traffic light in the designated area; and a signal transmission unit that transmits a signal from the signal light color control unit.
- the autonomous mobile body comprises: a SLAM function unit that estimates the body's own position relative to the external environment in the designated area while creating a planar or three-dimensional environment map of the area; a travel control unit that, based on the output of the SLAM function unit, changes the body's travel route as needed so as not to contact surrounding objects; and a signal receiving unit that receives control signals from the signal transmission unit. When the autonomous mobile body traveling on the sidewalk reaches a pedestrian crossing, the travel control unit, based on the control state of the signal light color control unit obtained via the signal receiving unit, makes the body travel across the pedestrian crossing in synchronization with the timing at which the signal light color of the pedestrian traffic light switches from red to green, and keeps the body stopped at any other timing.
- FIG. 1 is an external view showing the overall configuration of an autonomous mobile robot and a signal control system according to an embodiment of the present invention. FIG. 2 is a perspective view and a side view showing the external configuration of the autonomous mobile robot according to the embodiment. FIG. 3 is a block diagram showing the functional configuration of the autonomous mobile robot according to the embodiment. FIG. 4 is a conceptual diagram used to explain the pedestrian collision avoidance function during crossing according to the embodiment.
- FIGS. 1(A) and 1(B) show the signal control system 1 of the present invention: with reference to a pedestrian crossing 3 laid so as to cross a roadway 2, pedestrian traffic lights 4 (4A, 4B) are installed facing each other on top of columns erected on the road sides at both ends of the crossing.
- the signal control unit 5, under system control from a traffic control system (not shown) for the corresponding city, controls the pedestrian traffic lights 4 (4A, 4B) and the corresponding vehicle traffic lights 6 (6A, 6B) at a specific intersection or road.
- when the signal control unit 5 in the present invention receives a request signal from the autonomous mobile robot 10 via wireless communication, it transmits in real time, in response, the current display state of the pedestrian traffic light 4: the signal light color with its lighting start and end times, and, for a green light, the blinking start and end times as well.
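As a sketch of the kind of real-time display-state message described above, the following shows one possible encoding. The field names and the JSON format are illustrative assumptions, not specified in the patent:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SignalStatus:
    """Display state of a pedestrian traffic light, as the patent describes:
    current light color, its lighting start/end times, and (for a green
    light) the blinking start/end times. Field names are assumed."""
    light_color: str          # "red" or "green"
    lighting_start: float     # epoch seconds when the current color turned on
    lighting_end: float       # epoch seconds when it is scheduled to turn off
    blink_start: float = 0.0  # green only: when blinking begins
    blink_end: float = 0.0    # green only: when blinking ends

def encode_status(status: SignalStatus) -> str:
    """Serialize the status for wireless transmission to the robot."""
    return json.dumps(asdict(status))

def decode_status(payload: str) -> SignalStatus:
    """Reconstruct the status on the robot side."""
    return SignalStatus(**json.loads(payload))

msg = SignalStatus("green", 100.0, 130.0, blink_start=125.0, blink_end=130.0)
restored = decode_status(encode_status(msg))
```

With such a message, the robot can compute how much steady-green time remains before blinking starts, which the crossing logic later in the document relies on.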
- on receiving an emergency signal, the signal control unit 5 in the present invention adjusts the green blinking time of the pedestrian traffic light 4 accordingly.
- the autonomous mobile robot 10 can move autonomously and freely; it not only travels on an ordinary sidewalk 7 but, when it needs to cross the roadway 2 on the way to its destination, reaches the pedestrian crossing 3 and crosses it while observing the traffic rules.
- the autonomous mobile robot 10 is a two-wheel-drive mobile body that can run autonomously or in response to external operation; it comprises a substantially disk-shaped travel base portion 11, to which the two drive wheels are attached along a diameter, and a substantially U-shaped sensor holding portion 12 erected from the top of its upper surface.
- the traveling base portion 11 has, at the bottom of the main body, a pair of drive wheels 13a and 13b to the left and right of the center position in the front-rear direction, and a front caster 14a and a rear caster 14b at the front and rear that can swivel as the autonomous mobile robot 10 travels.
- the left and right drive wheels 13a and 13b are independently driven to rotate by drive motors 15a and 15b (FIG. 3).
- the robot moves forward or backward by rotating the drive wheels 13a and 13b forward or in reverse, and turns to the right or left while moving forward when a difference is given between their rotation speeds.
- by driving the drive wheels 13a and 13b in opposite directions, the autonomous mobile robot 10 spins, that is, changes its direction in place.
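The two-wheel behavior described above (forward/backward by common rotation, turning by a speed difference, spinning in place by opposite rotation) can be sketched as standard differential-drive kinematics; the function and parameter names are illustrative, not from the patent:

```python
import math

def diff_drive_step(x, y, heading, v_left, v_right, wheel_base, dt):
    """Advance a two-wheel differential-drive pose by one time step.
    v_left/v_right are wheel ground speeds (m/s); wheel_base is the
    distance between the drive wheels (m)."""
    v = (v_left + v_right) / 2.0             # linear velocity of body center
    omega = (v_right - v_left) / wheel_base  # angular velocity (rad/s)
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += omega * dt
    return x, y, heading

# Equal wheel speeds: the robot drives straight ahead.
x, y, h = diff_drive_step(0.0, 0.0, 0.0, 0.5, 0.5, 0.4, 1.0)
# Opposite wheel speeds: the robot spins in place (heading changes,
# position does not).
sx, sy, sh = diff_drive_step(0.0, 0.0, 0.0, -0.5, 0.5, 0.4, 1.0)
```

A speed difference between the wheels thus maps directly to the turning behavior the patent attributes to the left and right drive motors 15a and 15b.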
- a laser range sensor 20 that detects obstacles in the diagonally forward direction and the left-right direction is provided at a position in front of the front caster 14a in the traveling base unit 11. Further, an RGB-D sensor 21 and a 3D distance image sensor 22 capable of three-dimensional scanning are provided in the upper center of the sensor holding unit 12.
- the laser range sensor 20 irradiates an object (obstacle) viewed from its installation position with laser light, receives the reflected light, and calculates the distance.
- fan-shaped distance information on a plane can thus be obtained over a range of up to 30 m and an angle of 240 degrees.
- the RGB-D sensor 21 has a depth sensor that can measure the distance to the object (obstacle) viewed from the camera, and can perform three-dimensional scanning of the object.
- this depth sensor is composed of an infrared sensor; it captures the object while a single pattern of structured light is projected onto it, and calculates the depth of each point in the image by triangulation using the camera parameters.
- for example, Kinect (a trademark of Microsoft Corporation) may be applied as the RGB-D sensor 21; its RGB image is 640 × 480 pixels and its depth image 320 × 240 pixels, both acquired at 30 frames per second.
- the RGB-D sensor 21 is installed at the center of the upper part of the sensor holding part 12 because a vertical field of view cannot be secured at the traveling base part 11, which sits close to the floor surface; a mounting height of 0.6 m to 1.8 m above the floor is required.
- the 3D distance image sensor 22 emits LED pulses, measures for each pixel the arrival time of the light reflected from the object, and superimposes this on the simultaneously acquired image information, thereby calculating the distance to the object for each pixel.
- the 3D distance image sensor 22 is needed as a complementary sensor for outdoor use because it detects with higher accuracy than the RGB-D sensor 21 and has a wider viewing angle than the laser range sensor 20.
- when Pixel Soleil (a trade name of Nippon Signal Co., Ltd.) is applied as the 3D distance image sensor 22, a horizontal field of view of 72 degrees, a vertical field of view of 72 degrees, and a sensing range of 0.3 m to 4.0 m can be obtained.
- an imaging camera 23, a sound-collecting microphone 24, and a speaker 25 are also mounted on the upper center of the sensor holding unit 12; the robot collects surrounding sound while acquiring images of the surrounding environment, and can emit spoken utterances and warning sounds.
- a direction indicator 26 is composed of one or more LEDs whose emission color can be switched and which can blink; it lights and blinks in accordance with the traveling state of the autonomous mobile robot 10 (including the direction of travel while running, changes to it, and the stopped state).
- the direction indicators 26a and 26b are attached at predetermined positions of the sensor holding unit 12 so that the pair of left and right light emitting portions are spaced 30 cm or more apart and positioned 35 cm or more above the ground.
- their blinking can thereby be seen by pedestrians located ahead of and behind the direction of travel.
- the direction indicators 26a and 26b may each be structured to open and close at their predetermined positions on the sensor holding portion 12, so that only the light emitting portions protrude outward to the left and right.
- the direction indicators 26a and 26b may emit light only toward either the forward or the rearward direction of travel of the autonomous mobile robot 10.
- the spacing between the pair of left and right light emitting portions may instead be 15 cm or more.
- the direction indicators 26a and 26b need not each be a single light emitter; they may be configured as groups of light emitters arranged in a matrix, with only the emitters corresponding to a desired mark selectively lit or blinked. As a result, the autonomous mobile robot 10 can travel while indicating its course to pedestrians not only with the conventional blinking pattern but also with an arrow-shaped light emission display.
- FIG. 3 is a configuration diagram of the overall control unit 30 mounted on the autonomous mobile robot 10.
- the overall control unit 30 is composed mainly of a microcomputer, and includes a travel control unit 31 that governs overall control, a target travel route storage unit 32 that stores travel route information, and an operation control unit 33 that controls the drive system.
- based on travel route information from the target travel route storage unit 32, which stores preset travel route information, and on detection signals from the laser range sensor 20, the RGB-D sensor 21, and the 3D distance image sensor 22, the travel control unit 31 performs the self-position estimation and environment map construction described later, judges whether the travel route is suitable or needs changing, and judges whether obstacles to travel are present.
- the travel control unit 31 sends the decided travel route information to the operation control unit 33, which controls the left and right motor drivers 34a and 34b according to that information and thereby controls the rotation of the drive motors 15a and 15b.
- the autonomous mobile robot 10 automatically creates an environment map of a target area that can be traveled in a real environment such as a sidewalk in a predetermined area including a pedestrian crossing, using the above-described SLAM technology.
- the autonomous mobile robot 10 builds local maps on a grid of two-dimensional cells from the distance and angle information to objects obtained from the laser range sensor 20 and the 3D distance image sensor 22, and joins them into an environment map representing the entire desired target area.
- the robot also calculates its own travel amount, and estimates its self-position from the matching between the current sensor data and the environment map created so far, combined with that travel amount.
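The grid-based mapping and odometry steps above can be sketched minimally as follows. The scan-to-grid conversion and the odometry formula are illustrative simplifications; the cell size, tick counts, and function names are assumptions, not taken from the patent:

```python
import math

def update_grid(grid, pose, ranges, angles, cell_size):
    """Mark grid cells hit by range readings as occupied (value 1).
    grid: dict mapping (ix, iy) -> occupancy; pose: (x, y, heading)."""
    x, y, heading = pose
    for r, a in zip(ranges, angles):
        # Convert a (range, bearing) sensor reading into world coordinates.
        wx = x + r * math.cos(heading + a)
        wy = y + r * math.sin(heading + a)
        grid[(int(wx // cell_size), int(wy // cell_size))] = 1
    return grid

def estimate_travel(left_ticks, right_ticks, ticks_per_meter):
    """Odometry: the average wheel travel approximates the robot's own
    travel amount, which is then combined with map matching."""
    return (left_ticks + right_ticks) / (2.0 * ticks_per_meter)

# Two readings: one 2 m straight ahead, one 2 m to the left.
grid = update_grid({}, (0.0, 0.0, 0.0), [2.0, 2.0], [0.0, math.pi / 2], 0.5)
travel = estimate_travel(100, 100, ticks_per_meter=100)
```

Successive local grids built this way can be matched against the accumulated map to correct the odometry estimate, which is the essence of the matching step the patent describes.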
- the autonomous mobile robot 10 also includes a communication unit 35 that wirelessly communicates with the signal control unit 5 (FIG. 1(B)) of the signal control system 1, and receives the control signals transmitted from the signal control unit 5 that control the signal light color and lighting/blinking switching of the pedestrian traffic light 4.
- the autonomous mobile robot 10 incorporates a relatively large-capacity drive battery 36 consisting of a secondary battery or a capacitor; when the robot is docked at a power supply stand (not shown), power supplied from a commercial power source can be fed to the drive battery 36 via the stand's power feeding terminal for charging.
- the autonomous mobile robot 10 adjusts its crossing start timing while recognizing the pedestrian traffic light 4. Specifically, it waits in front of the pedestrian crossing 3 and then starts running in synchronization with the timing at which the pedestrian traffic light 4 switches from red to green. That is, except at that timing, the autonomous mobile robot 10 always remains stopped, even while the pedestrian traffic light 4 is lit green, so that it crosses only when it has sufficient time to spare.
- there are two techniques for this: first, the autonomous mobile robot 10 captures the pedestrian traffic light 4 within its imaging range and judges the signal light color and lighting/blinking switching itself; second, the signal control unit 5 in the signal control system 1 notifies the autonomous mobile robot 10 of the signal control state of the pedestrian traffic light 4 via wireless communication.
- in the first technique, the autonomous mobile robot 10 uses the laser range sensor 20 and the imaging camera 23 to recognize the pedestrian traffic light 4 based on the distance (crosswalk width) and the appearance shape (traffic light shape), and then judges the light color from its brightness.
- in the second technique, the display state of the pedestrian traffic light 4 in the signal control system 1 is transmitted to the autonomous mobile robot 10 via wireless communication in real time, so that the robot grasps the signal light color, the lighting/blinking state, and their switching.
- when the autonomous mobile robot 10 is about to start crossing in synchronization with the timing at which the pedestrian traffic light 4 switches from red to green, it checks for approaching vehicles based on the detection results of the laser range sensor 20, the imaging camera 23, and the sound-collecting microphone 24, and remains stopped without starting to travel until any such vehicle has passed. If it determines that the stop has lasted relatively long and there is no longer enough time to cross the pedestrian crossing, the autonomous mobile robot 10 forgoes the current crossing and waits until the next crossing start timing.
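The crossing-start logic above (start when the light turns green, stay stopped while a vehicle approaches, forgo the crossing when too little green time remains) can be sketched as a small decision function. The state names, parameters, and thresholds are illustrative assumptions, not from the patent:

```python
def crossing_decision(state, light, seconds_until_blink, crossing_time_needed,
                      vehicle_approaching):
    """One evaluation of a simple crossing policy, invoked by the caller at
    the moment the light state or sensor readings change.

    state: "WAITING" or "CROSSING"; light: "red" or "green";
    seconds_until_blink: remaining steady-green time reported by the
    signal control unit. Returns the next state.
    """
    if state == "WAITING":
        # Start only if the light is green, no vehicle is seen or heard
        # approaching, and enough green time remains to finish crossing;
        # otherwise keep waiting (possibly until the next green cycle).
        if (light == "green" and not vehicle_approaching
                and seconds_until_blink >= crossing_time_needed):
            return "CROSSING"
        return "WAITING"
    return state

# Fresh green, road clear, plenty of time: start crossing.
s1 = crossing_decision("WAITING", "green", 20.0, 12.0, False)
# Vehicle approaching: stay stopped until it passes.
s2 = crossing_decision("WAITING", "green", 20.0, 12.0, True)
# Not enough green time left: forgo this cycle, wait for the next one.
s3 = crossing_decision("WAITING", "green", 5.0, 12.0, False)
```

Keeping the start condition this strict is what lets the robot guarantee it never enters the crossing mid-green with insufficient time.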
- the autonomous mobile robot 10 uses the laser range sensor 20 and the imaging camera 23 to recognize the number and positions of pedestrians waiting on the opposite side of the pedestrian crossing 3, based on distance (crosswalk width) and appearance shape (person recognition and face recognition), and then predicts the actual degree of congestion during crossing from the recognition result.
- the autonomous mobile robot 10 has already identified the pedestrian crossing 3 as a route plan using the laser range sensor 20, the RGB-D sensor 21 and the 3D distance image sensor 22 when creating the three-dimensional environment map by the SLAM technology.
- An image of the pedestrian crossing 3 may be captured during actual travel using the imaging camera 23, and the pedestrian crossing 3 may be recognized using an image processing technique such as edge detection or contour extraction.
- when the autonomous mobile robot 10 determines that it is highly likely to pass oncoming pedestrians while crossing the pedestrian crossing 3, it moves toward a side edge of the crossing and its vicinity so as to avoid them, preventing contact with pedestrians in advance (FIG. 4(C)). In this determination, if the degree of congestion of the pedestrian crossing 3 has been predicted by the congestion prediction function described above, the avoidance movement can be performed promptly from the start of the crossing.
- the autonomous mobile robot 10 uses the laser range sensor 20 to receive the reflected light of laser light emitted over the width of the pedestrian crossing 3, obtains position information from the emission direction and the time difference between emission and reflection, detects pedestrians as objects whose position information changes over time, and calculates the pedestrian density from the area they occupy.
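The density calculation above (treat points whose position changes between scans as pedestrians, then relate the area they occupy to the crossing area) might be sketched as follows. The fixed per-pedestrian footprint and the change threshold are assumed simplifications:

```python
def moving_points(scan_t0, scan_t1, threshold=0.05):
    """Detect pedestrian candidates as points whose (x, y) position
    changed by more than `threshold` meters between two scans."""
    return [p1 for p0, p1 in zip(scan_t0, scan_t1)
            if abs(p1[0] - p0[0]) > threshold or abs(p1[1] - p0[1]) > threshold]

def pedestrian_density(pedestrians, crossing_width, crossing_length,
                       footprint_area=0.2):
    """Fraction of the crossing area occupied by detected pedestrians,
    approximating each pedestrian by a fixed footprint area (m^2)."""
    occupied = len(pedestrians) * footprint_area
    return occupied / (crossing_width * crossing_length)

# The point at (3.0, 0.0) moved; the point at (1.0, 0.0) is static.
movers = moving_points([(1.0, 0.0), (3.0, 0.0)], [(1.0, 0.0), (3.4, 0.1)])
density = pedestrian_density(movers, crossing_width=4.0, crossing_length=10.0)
```

A higher density value would then argue for edge-hugging or gap-following behavior during the crossing.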
- by predicting the moving directions and speeds of the plurality of pedestrians in advance, the autonomous mobile robot 10 can move along the gaps between pedestrians and thereby avoid contact and collision with them.
- the autonomous mobile robot 10 uses the RGB-D sensor 21 and the 3D distance image sensor 22 to sequentially detect the relative position of each pedestrian with respect to the robot as time-series position data, calculates the pedestrian's position in real space from the plurality of time-series position data, and computes a relative movement vector from those real-space positions. Based on this relative movement vector, it becomes possible to recognize the relative movement locus of a pedestrian who is highly likely to collide or come into contact with the autonomous mobile robot 10.
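The relative-movement-vector idea above can be sketched simply: differencing successive robot-frame positions gives a relative velocity, and extrapolating that velocity flags a likely contact. The horizon, step count, and safety radius below are illustrative assumptions:

```python
def relative_velocity(track, dt):
    """Estimate a pedestrian's relative movement vector from the two most
    recent robot-frame positions, sampled dt seconds apart."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    return ((x1 - x0) / dt, (y1 - y0) / dt)

def closing_in(position, velocity, horizon, safety_radius):
    """Extrapolate the relative track over `horizon` seconds and flag a
    predicted pass within `safety_radius` of the robot (the origin)."""
    x, y = position
    vx, vy = velocity
    steps = 10
    for i in range(1, steps + 1):
        t = horizon * i / steps
        px, py = x + vx * t, y + vy * t
        if (px * px + py * py) ** 0.5 < safety_radius:
            return True
    return False

track = [(4.0, 0.0), (3.5, 0.0)]        # pedestrian approaching head-on
vel = relative_velocity(track, dt=0.5)  # relative movement vector
danger = closing_in(track[-1], vel, horizon=4.0, safety_radius=0.6)
```

A pedestrian flagged this way is exactly the "highly likely to collide or contact" case for which the patent recognizes the relative movement locus.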
- alternatively, the autonomous mobile robot 10 may measure a plurality of three-dimensional point groups with the laser range sensor 20, separate three-dimensional objects, and identify pedestrians by clustering the point clouds of those objects. Based on movement information over time, it is also possible to predict the movement direction and speed of approaching pedestrians (those walking across the pedestrian crossing 3 from the opposite side).
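The clustering alternative above might be sketched in 2D, with a greedy distance threshold standing in for a full clustering method; the threshold value and function names are assumptions:

```python
def cluster_points(points, max_gap=0.4):
    """Greedy Euclidean clustering: a point closer than max_gap meters to
    any member of an existing cluster joins that cluster; each resulting
    cluster is treated as one pedestrian candidate."""
    clusters = []
    for p in points:
        for c in clusters:
            if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= max_gap ** 2
                   for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

# Four scan points forming two tight groups -> two pedestrian candidates.
scan = [(1.0, 0.0), (1.1, 0.1), (4.0, 2.0), (4.2, 2.1)]
clusters = cluster_points(scan)
```

Tracking each cluster's centroid across scans then yields the per-pedestrian direction and speed predictions the text mentions.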
- while traveling, the traveling control unit 31 controls the direction indicators 26a and 26b so that they blink in a predetermined color, at a predetermined timing and in a predetermined pattern according to the course direction of the autonomous mobile robot 10 and its changes, so that pedestrians ahead of and behind the robot can visually recognize the course direction or a course change.
- immediately before the autonomous mobile robot 10 shifts to the left (right) or diagonally left (right) so as to approach a side edge of the pedestrian crossing 3, the direction indicator 26b (26a) blinks orange at a predetermined timing (a constant cycle of 60 to 120 times per minute), so that pedestrians located ahead, behind, or diagonal to the robot can visually confirm that the autonomous mobile robot 10 is changing course.
- the traveling control unit 31 may also utter "turning left (right)" from the speaker 25 in synchronization with the blinking of the direction indicator 26b (26a).
- if the autonomous mobile robot 10 finds itself unable to travel normally while crossing the pedestrian crossing (for example, when surrounded by stray dogs or malicious pedestrians), then once the pedestrian traffic light 4 has switched from steady green to blinking green, it evacuates to the nearest sidewalk at a moment when the dogs or pedestrians have moved away from it.
- at the same time, the travel control unit 31 makes the direction indicators 26a and 26b blink red (as hazard lights) and emits a spoken message and a siren to that effect from the speaker 25, thereby alerting surrounding pedestrians.
- when the pedestrian traffic light 4 switches to blinking green, the travel control unit 31 transmits an emergency signal via the communication unit 35, so that the signal control unit 5 of the signal control system 1 extends the green blinking time of the corresponding pedestrian traffic light 4 by a predetermined time (up to 18 seconds) and extends the red lighting time of the corresponding vehicle traffic light 6 by the same amount.
- as described above, according to this embodiment, the autonomous mobile robot 10 predicts the degree of congestion during crossing while waiting at the pedestrian crossing 3, shifts toward a side edge of the crossing when it starts to cross, and, as necessary, predicts the pedestrian density and the moving direction and speed of each pedestrian, so that collision and contact with pedestrians can be avoided.
- in this embodiment, the autonomous mobile robot 10 has been described as applied as an autonomous transport robot that mainly travels on the sidewalk 7, but the present invention is not limited to this; it can be applied to various self-propelled robots such as self-propelled cleaning robots for office floors, self-propelled working robots for construction sites and agrochemical spraying, automatic security robots, and walking guide robots for nursing care.
- the signal light color determination unit, the pedestrian recognition unit, and the pedestrian prediction unit according to the present invention have been described as executed by the travel control unit 31 in the overall control unit 30; however, the invention is not limited to this, and separate corresponding circuits may be provided in the overall control unit 30.
- the autonomous mobile robot 10 has been described as detecting surrounding pedestrians and the like with the laser range sensor 20 and the imaging camera 23, but it may also use the sound-collecting microphone 24 to collect surrounding sound and recognize the presence or approach of pedestrians and the like.
Description
FIGS. 1(A) and (B) show the signal control system 1 of the present invention: with reference to a pedestrian crossing 3 laid so as to cross a roadway 2, pedestrian traffic lights 4 (4A, 4B) are installed facing each other on top of columns erected on the road sides at both ends of the crossing.
As shown in FIGS. 2(A) to (C), the autonomous mobile robot 10 is a two-wheel-drive mobile body capable of running autonomously or in response to external operation, and comprises a substantially disk-shaped travel base portion 11, to which the two drive wheels are attached along a diameter, and a substantially U-shaped sensor holding portion 12 erected from the top of its upper surface.
FIG. 3 is a configuration diagram of the overall control unit 30 mounted on the autonomous mobile robot 10. The overall control unit 30 is composed mainly of a microcomputer and comprises a travel control unit 31 that governs overall control, a target travel route control unit 32 that stores travel route information, and an operation control unit 33 that controls the drive system.
When the autonomous mobile robot 10 of the present invention reaches a pedestrian crossing while traveling on a sidewalk, it crosses the pedestrian crossing completely within the allowed crossing time, recognizing the pedestrian traffic light 4 and avoiding contact or collision with oncoming pedestrians.
The autonomous mobile robot 10 adjusts its crossing start timing while recognizing the pedestrian traffic light 4. Specifically, after waiting in front of the pedestrian crossing 3, it starts running in synchronization with the timing at which the pedestrian traffic light 4 switches from red to green. That is, except at that timing, the autonomous mobile robot 10 always stops, even while the pedestrian traffic light 4 is lit green, so that it can cross with sufficient time to spare.
While waiting at the pedestrian crossing 3, the autonomous mobile robot 10 performs image recognition of the number and positions of pedestrians waiting for the signal on the opposite side of the crossing, and thereby predicts the degree of congestion during crossing before the pedestrian traffic light 4 turns green.
When the autonomous mobile robot 10 starts crossing the pedestrian crossing 3 (FIG. 4(A)) and recognizes oncoming pedestrians in the crossing direction (straight ahead), it advances while shifting laterally or diagonally toward a side edge of the pedestrian crossing 3 (or, where a bicycle lane is provided, toward the opposite side edge), and then travels across the roadway along that side edge (FIG. 4(B)).
If, while crossing the pedestrian crossing, the autonomous mobile robot 10 falls into a situation in which normal travel is difficult (for example, being surrounded by stray dogs or malicious pedestrians), then once the pedestrian traffic light 4 has switched from steady green to blinking green, it evacuates to the nearest sidewalk at a moment when the dogs or pedestrians have moved away.
As described above, according to this embodiment, the autonomous mobile robot 10 waits in front of the pedestrian crossing 3 and then starts running in synchronization with the timing at which the pedestrian traffic light 4 switches from red to green, so that the possibility of the light turning red during crossing is very small and the robot can cross with ample time to spare.
In this embodiment, the autonomous mobile robot 10 has been described as applied as an autonomous transport robot that mainly travels on the sidewalk 7, but the present invention is not limited to this, and can be applied to various self-propelled robots such as autonomous cleaning robots for office floors, self-propelled working robots for construction sites and agrochemical spraying, automatic security robots, and walking guide robots for nursing care.
Claims (15)
- An autonomous mobile body that moves freely within a designated area including a sidewalk, comprising:
a SLAM function unit that estimates the position of the autonomous mobile body relative to the external environment within the designated area while simultaneously creating a two-dimensional or three-dimensional environmental map of the designated area;
a travel control unit that, based on the output of the SLAM function unit, changes the travel route of the autonomous mobile body as appropriate so that it does not contact surrounding objects;
an imaging unit that images the outside world over a predetermined range ahead on the travel route; and
a signal light color determination unit that, based on the imaging result of the imaging unit, recognizes a pedestrian traffic signal within the designated area and determines the signal light color of the pedestrian traffic signal and its switching between steady and blinking lighting,
wherein, when the autonomous mobile body traveling on the sidewalk reaches a crosswalk, the travel control unit, based on the determination result of the signal light color determination unit, causes the autonomous mobile body to travel across the crosswalk in synchronization with the timing at which the signal light color of the pedestrian traffic signal switches from red to green, and keeps the autonomous mobile body stopped at any other timing.
- The autonomous mobile body according to claim 1, further comprising a pedestrian recognition unit that, when the autonomous mobile body is stopped at the crosswalk, recognizes, based on the imaging result of the imaging unit, the number and positions of pedestrians waiting for the signal on the opposite side of the crosswalk.
- The autonomous mobile body according to claim 2, wherein, when the autonomous mobile body has begun to cross the crosswalk and the pedestrians on the opposite side are recognized based on the output of the SLAM function unit, the travel control unit advances the autonomous mobile body while shifting it laterally or diagonally so as to approach a side edge of the crosswalk, and thereafter causes it to travel along that side edge.
- The autonomous mobile body according to claim 2 or 3, further comprising a walking prediction unit that predicts, based on the output of the SLAM function unit, the movement direction and speed of each of a plurality of the pedestrians on the crosswalk, wherein the travel control unit causes the autonomous mobile body to travel along the gaps between the pedestrians based on the prediction result of the walking prediction unit.
- The autonomous mobile body according to any one of claims 1 to 4, wherein, when the autonomous mobile body encounters a situation in which travel is difficult while crossing the crosswalk, the travel control unit causes the autonomous mobile body to retreat to the nearest sidewalk at the point the pedestrian traffic signal switches from steady green to blinking green.
- The autonomous mobile body according to any one of claims 3 to 5, further comprising a direction indicator unit that is attached at a predetermined position on the autonomous mobile body and can emit light toward either or both of the forward and rearward travel directions of the autonomous mobile body, wherein the travel control unit causes the direction indicator unit to emit light at predetermined timings in accordance with the travel state of the autonomous mobile body.
- The autonomous mobile body according to claim 6, further comprising a speaker attached at a predetermined position on the autonomous mobile body, wherein the travel control unit causes the speaker to output sound in accordance with the light emission state of the direction indicator unit.
- A signal control system that cooperates with an autonomous mobile body moving freely within a designated area including a sidewalk, the system comprising:
a signal light color control unit that controls the signal light color of a pedestrian traffic signal within the designated area and its switching between steady and blinking lighting; and
a signal transmission unit that transmits signals from the signal light color control unit,
wherein the autonomous mobile body comprises:
a SLAM function unit that estimates the position of the autonomous mobile body relative to the external environment within the designated area while simultaneously creating a two-dimensional or three-dimensional environmental map of the designated area;
a travel control unit that, based on the output of the SLAM function unit, changes the travel route of the autonomous mobile body as appropriate so that it does not contact surrounding objects; and
a signal reception unit that receives control signals from the signal transmission unit,
and wherein, when the autonomous mobile body traveling on the sidewalk reaches a crosswalk, the travel control unit, based on the control of the signal light color control unit obtained via the signal reception unit, causes the autonomous mobile body to travel across the crosswalk in synchronization with the timing at which the signal light color of the pedestrian traffic signal switches from red to green, and keeps the autonomous mobile body stopped at any other timing.
- The signal control system according to claim 8, further comprising a pedestrian recognition unit, provided on the autonomous mobile body, that, when the autonomous mobile body is stopped at the crosswalk, recognizes, based on the imaging result of the imaging unit, the number and positions of pedestrians waiting for the signal on the opposite side of the crosswalk.
- The signal control system according to claim 9, wherein, when the autonomous mobile body has begun to cross the crosswalk and the pedestrians on the opposite side are recognized based on the output of the SLAM function unit, the travel control unit advances the autonomous mobile body while shifting it laterally or diagonally so as to approach a side edge of the crosswalk, and thereafter causes it to travel along that side edge.
- The signal control system according to claim 9 or 10, further comprising a walking prediction unit, provided on the autonomous mobile body, that predicts, based on the output of the SLAM function unit, the movement direction and speed of each of a plurality of the pedestrians on the crosswalk, wherein the travel control unit causes the autonomous mobile body to travel along the gaps between the pedestrians based on the prediction result of the walking prediction unit.
- The signal control system according to any one of claims 9 to 11, wherein, when the autonomous mobile body encounters a situation in which travel is difficult while crossing the crosswalk, the travel control unit causes the autonomous mobile body to retreat to the nearest sidewalk at the point the pedestrian traffic signal switches from steady green to blinking green.
- The signal control system according to any one of claims 10 to 12, further comprising a direction indicator unit that is attached at a predetermined position on the autonomous mobile body and can emit light toward either or both of the forward and rearward travel directions of the autonomous mobile body, wherein the travel control unit causes the direction indicator unit to emit light at predetermined timings in accordance with the travel state of the autonomous mobile body.
- The signal control system according to claim 13, further comprising a speaker attached at a predetermined position on the autonomous mobile body, wherein the travel control unit causes the speaker to output sound in accordance with the light emission state of the direction indicator unit.
- The signal control system according to claim 12, wherein the travel control unit transmits an emergency signal via the signal reception unit when the pedestrian traffic signal switches to blinking green, and the signal light color control unit extends the blinking-green period of the pedestrian traffic signal by a predetermined time based on the emergency signal received via the signal transmission unit.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11201801679UA SG11201801679UA (en) | 2015-09-01 | 2016-08-31 | Autonomous mobile body and signal control system |
JP2017538077A JP6510654B2 (ja) | 2015-09-01 | 2016-08-31 | Autonomous mobile body and signal control system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-172003 | 2015-09-01 | ||
JP2015172003 | 2015-09-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017038883A1 true WO2017038883A1 (ja) | 2017-03-09 |
Family
ID=58187725
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/075535 WO2017038883A1 (ja) | Autonomous mobile body and signal control system | 2015-09-01 | 2016-08-31 |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP6510654B2 (ja) |
SG (1) | SG11201801679UA (ja) |
WO (1) | WO2017038883A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110136482A (zh) * | 2019-05-27 | 2019-08-16 | 上海海事大学 | Pedestrian warning system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5771098A (en) * | 1980-10-20 | 1982-05-01 | Shinko Electric Co Ltd | Intersection controller for operatorless guided vehicle |
JP2008152714A (ja) * | 2006-12-20 | 2008-07-03 | Honda Motor Co Ltd | Mobile device, and control system, control program, and supervisory system therefor |
JP2012187698A (ja) * | 2011-03-08 | 2012-10-04 | Rota Kk | Redo traveling of a traveling robot, and teaching and control methods therefor |
JP2012200818A (ja) * | 2011-03-25 | 2012-10-22 | Advanced Telecommunication Research Institute International | Robot that predicts pedestrian trajectories and determines its own avoidance behavior |
2016
- 2016-08-31 JP JP2017538077A patent/JP6510654B2/ja active Active
- 2016-08-31 WO PCT/JP2016/075535 patent/WO2017038883A1/ja active Application Filing
- 2016-08-31 SG SG11201801679UA patent/SG11201801679UA/en unknown
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019016150A (ja) * | 2017-07-06 | 2019-01-31 | 矢崎エナジーシステム株式会社 | Unmanned taxi control method and unmanned taxi control device |
JP7107647B2 (ja) | 2017-07-06 | 2022-07-27 | 矢崎エナジーシステム株式会社 | Unmanned taxi control method and unmanned taxi control device |
CN111052024A (zh) * | 2017-09-27 | 2020-04-21 | 日本电产株式会社 | Mobile body and production system |
WO2019073554A1 (ja) * | 2017-10-11 | 2019-04-18 | 本田技研工業株式会社 | Vehicle control device |
CN111201557A (zh) | 2017-10-11 | 2020-05-26 | 本田技研工业株式会社 | Vehicle control device |
JPWO2019073554A1 (ja) | 2017-10-11 | 2020-07-02 | 本田技研工業株式会社 | Vehicle control device |
US11180166B2 (en) | 2017-10-11 | 2021-11-23 | Honda Motor Co., Ltd. | Vehicle control device |
JP2019149013A (ja) * | 2018-02-27 | 2019-09-05 | アルパイン株式会社 | Automated driving control device and automated driving control method |
US11419193B2 (en) | 2018-03-23 | 2022-08-16 | Toyota Jidosha Kabushiki Kaisha | Moving body |
JP2020086995A (ja) * | 2018-11-27 | 2020-06-04 | 富士ゼロックス株式会社 | Autonomous mobile device and program |
WO2023037796A1 (ja) * | 2021-09-09 | 2023-03-16 | コイト電工株式会社 | Crossing support system, pedestrian traffic signal, and automated traveling device |
Also Published As
Publication number | Publication date |
---|---|
JPWO2017038883A1 (ja) | 2018-08-02 |
SG11201801679UA (en) | 2018-03-28 |
JP6510654B2 (ja) | 2019-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017038883A1 (ja) | Autonomous mobile body and signal control system | |
JP6857728B2 (ja) | Method for autonomously navigating uncontrolled and controlled intersections | |
EP3088280B1 (en) | Autonomous driving vehicle system | |
US11099561B1 (en) | Control of an autonomous vehicle in unmapped regions | |
EP2333742B1 (en) | Vehicle support systems for pedestrians to cross roads and support methods for pedestrians to cross roads | |
US8234009B2 (en) | Autonomous mobile apparatus and method of mobility | |
CN110271543B (zh) | Vehicle control device, vehicle control method, and storage medium | |
JP6768974B2 (ja) | Vehicle control device, vehicle control method, and program | |
US11631330B2 (en) | Vehicle control device | |
US20130018572A1 (en) | Apparatus and method for controlling vehicle at autonomous intersection | |
US11927445B2 (en) | System and method for intersection management by an autonomous vehicle | |
JPH11212640A (ja) | Autonomous traveling vehicle and method of controlling an autonomous traveling vehicle | |
KR101943809B1 (ko) | Vehicle notification device | |
JP6717272B2 (ja) | Exterior notification device for vehicle | |
JP7166712B2 (ja) | Personal mobility vehicle | |
JP5530000B2 (ja) | Pedestrian crossing support notification system and pedestrian crossing support method | |
CN110271546A (zh) | Vehicle control device, vehicle control method, and storage medium | |
EP4115253A1 (en) | Method, system and device for analyzing pedestrian motion patterns | |
US11600181B2 (en) | Saddle-riding type vehicle | |
CN210626981U (zh) | Automatic following system and device based on a cleaning vehicle | |
KR101612890B1 (ko) | System for providing information on vehicles traveling ahead on a road | |
CN106176157B (zh) | Guide device for the blind and control method thereof | |
JP2022024379A (ja) | Display system and display device for an automated traveling vehicle | |
KR20210092865A (ko) | Traffic signal information terminal device and traffic signal information providing system including the same | |
JP2023146579A (ja) | Map generation device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16841913 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2017538077 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 11201801679U Country of ref document: SG |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 16841913 Country of ref document: EP Kind code of ref document: A1 |