WO2016104265A1 - 移動体 - Google Patents
Moving body (移動体)
- Publication number: WO2016104265A1 (PCT/JP2015/085151)
- Authority: WIPO (PCT)
- Prior art keywords: user, moving, movement, moving body, situation
Classifications
- G01C21/005 — Navigation with correlation of navigation data from several sources, e.g. map or contour matching
- G01C21/20 — Instruments for performing navigational calculations
- G01C21/3626 — Route guidance: details of the output of route guidance instructions
- G01P13/02 — Indicating direction of movement only, e.g. by weather vane
- G05D1/0212 — Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory
- G06V20/20 — Scene-specific elements in augmented reality scenes
- G06V20/58 — Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects
- G06V40/103 — Static human body considered as a whole, e.g. static pedestrian or occupant recognition
- G06V40/20 — Movements or behaviour, e.g. gesture recognition
- G06V40/172 — Human faces: classification, e.g. identification
Definitions
- the present invention relates to a moving body, and more particularly, to a moving body that can present to the user an appropriate moving direction according to the surrounding situation so that the user can easily move.
- Patent Document 1 discloses a moving body that moves so as not to leave the user while avoiding surrounding obstacles.
- Patent Document 2 discloses an autonomous mobile device that predicts a user's surrounding environment and moves between the user and a dangerous object when contact between the user and the dangerous object is predicted.
- Patent Documents 1 and 2 do not consider how easily the user can move in the surrounding environment. For example, although the moving body described in Patent Document 1 can accompany the user while itself avoiding obstacles, obstacle avoidance is left to the user's own judgment, so the user may fail to avoid an obstacle and come into contact with it.
- with the autonomous mobile device described in Patent Document 2, the user may come into contact with the device itself as it moves between the user and the dangerous object.
- the present invention has been made in view of the above-described circumstances, and an object of the present invention is to provide a moving body that can present an appropriate moving direction in accordance with the surrounding situation to the user so that the user can easily move.
- the mobile body includes a user detection means that detects the user's situation; a surrounding state detection means that detects the surrounding state; a surrounding prediction means that, based on the surrounding state detected by the surrounding state detection means, predicts the movement of objects existing in the vicinity as the future surrounding situation; a determination means that determines a moving direction to be presented to the user based on the future surrounding situation predicted by the surrounding prediction means and the user situation detected by the user detection means; and a presenting means that presents a direction corresponding to the moving direction determined by the determination means to the user around the front of the user.
- the surrounding prediction means predicts the movement of an object existing in the vicinity as the future situation of the surrounding. Based on the surrounding future situation predicted by the surrounding prediction means and the user situation detected by the user detecting means, the moving direction to be presented to the user is determined by the determining means, and the presenting means determines the moving direction. The direction corresponding to the moving direction is presented to the user around the front of the user. Therefore, there is an effect that it is possible to present the user with an appropriate moving direction according to the surrounding situation.
- the moving direction determined by the determining means is not limited to one direction, but includes a direction having a predetermined range of width.
- the mobile body according to claim 2 has the following effect in addition to the effect of claim 1.
- the determining means evaluates the ease of movement of the user in the vicinity based on the future situation of the surrounding predicted by the surrounding prediction means and the user situation detected by the user detecting means, and facilitates the movement. Based on this, the moving direction to be presented to the user is determined. Therefore, there is an effect that it is possible to present the user with an appropriate moving direction according to the surrounding situation so that the user can easily move.
- the mobile body according to claim 3 has the following effect in addition to the effect of claim 1 or 2.
- the ease of movement of the user in the vicinity is evaluated by the first evaluation means based on the surrounding future situation predicted by the surrounding prediction means and the user situation detected by the user detection means.
- the determining means determines the moving direction to be presented to the user based on the evaluation by the first evaluating means, so that an appropriate moving direction according to the surrounding situation can be presented to the user so that the user can easily move. There is.
- the following effect is produced.
- the flow of the object in the surroundings is evaluated by the second evaluation means.
- the determining means determines a moving direction to be presented to the user based on the object flow evaluated by the second evaluating means. Therefore, it is possible to present the moving direction suitable for the flow in consideration of the flow of the object to the user.
- the mobile body according to claim 5 has the following effect in addition to the effect of any one of claims 1 to 4.
- the presenting means moves the moving body by the movement control means around the front of the user so as to indicate the direction according to the moving direction determined by the determining means. Therefore, there is an effect that the user can easily move according to the surrounding situation by following the movement of the moving body.
- the mobile body according to claim 6 has the following effect in addition to the effect of any one of claims 1 to 5.
- candidates of directions for determining the moving direction to be presented to the user are extracted by the candidate extraction means. Since the determination means determines the moving direction to be presented to the user based on the candidates extracted by the candidate extraction means, an appropriate moving direction according to the surrounding situation can be presented to the user so that the user can move easily.
- the following effect is achieved.
- a direction corresponding to a flow that does not oppose the user's moving direction is extracted with priority by the candidate extraction unit. Therefore, there is an effect that the moving direction corresponding to the flow that does not oppose the moving direction of the user can be presented to the user.
- according to the moving body described in claim 8, in addition to the effect of claim 6 or 7, the following effect is achieved.
- a direction corresponding to a flow that does not oppose the user's moving direction and whose speed differs from the user's moving speed by no more than a predetermined amount is preferentially extracted by the candidate extraction means. By moving in the presented direction, the user can follow the flow of objects moving at a speed close to his or her own moving speed, so the user can move easily.
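The priority rule above can be sketched as follows. This is an illustrative assumption, not code from the patent: the function name, the parameters, and the 90-degree "opposition" test are all hypothetical.

```python
import math

def extract_flow_candidates(user_heading, user_speed, flows, max_speed_diff=2.0):
    """Keep flows (heading_rad, speed) that do not oppose the user's
    heading and whose speed is within max_speed_diff of the user's,
    ordered so the closest speed match comes first."""
    candidates = []
    for heading, speed in flows:
        # Smallest absolute angle between the flow and the user's heading.
        diff = abs((heading - user_heading + math.pi) % (2 * math.pi) - math.pi)
        if diff < math.pi / 2 and abs(speed - user_speed) <= max_speed_diff:
            candidates.append((heading, speed))
    return sorted(candidates, key=lambda f: abs(f[1] - user_speed))
```

A flow heading within 90 degrees of the user's is treated here as "not opposing"; the patent leaves the exact criterion open.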
- the following effect is achieved.
- the movement of objects existing in the vicinity, and regions whose future transition is uncertain because of blind spots created by those objects, are predicted by the surrounding prediction means as the future surrounding situation. Accordingly, a moving direction in which the user can easily move can be presented in consideration not only of the movement of nearby objects but also of the regions whose future transition is uncertain because of those blind spots.
- according to the moving body of the tenth aspect, in addition to the effect produced by any one of the first to ninth aspects, the following effect is produced: since the region where an object can exist after a predetermined prediction time is predicted as the movement of that object by the surrounding prediction means, a moving direction in which the user can easily move can be presented, taking that region into account.
- the predetermined prediction time is a variable value corresponding to the state of each object, for each of the objects existing in the vicinity. Therefore, using a prediction time matched to each object's state, a moving direction in which the user can easily move can be presented in consideration of the region where each object can exist after its prediction time.
- the following effect is achieved.
- objects relatively close to the user and the moving body are targeted, and for each target object the region where the object can exist is predicted by the surrounding prediction means.
- the predetermined prediction time is a value based on the distance between the user or the moving body and the object and on the relative speed of the object with respect to the user or the moving body. Therefore, for each object that may come into contact with the user, a moving direction can be presented in consideration of that object's relative position and relative speed.
- the predetermined prediction time is calculated as the time until a first region provided with a predetermined size around the user and the moving body and a second region provided with a predetermined size around the object come within a predetermined grace distance of each other, assuming that they approach at the object's relative speed with respect to the user or the moving body. Therefore, since the state before the moving body (which moves around the front of the user) and the object fully approach each other is predicted, a moving direction can be presented to the user that not only makes it easy for the user to move but also allows the object to avoid the moving body with a margin.
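A minimal sketch of how such a prediction time could be computed, assuming circular first and second regions and a straight-line approach at the relative speed (the patent does not fix a formula; all names are illustrative):

```python
def prediction_time(distance, r_first, r_second, grace, rel_speed):
    """Time until the first region (radius r_first, around the user and
    moving body) and the second region (radius r_second, around the
    object) come within the grace distance, assuming a straight-line
    approach at rel_speed."""
    gap = distance - (r_first + r_second + grace)
    if rel_speed <= 0:  # not approaching: no finite prediction time
        return float("inf")
    return max(gap, 0.0) / rel_speed
```

For example, two regions of radius 1 m whose centers are 10 m apart, with a 2 m grace distance and a 2 m/s closing speed, give a 3 s prediction time.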
- the route estimation unit estimates the route that is least affected by the object before reaching the target area.
- the first evaluation means evaluates that the user can move more easily as the width between the regions sandwiching the route is larger. Therefore, the ease of movement of the user can be evaluated according to the spacing between the regions, predicted by the surrounding prediction means, that sandwich the route estimated by the route estimation means.
- the following effect is provided.
- when the width between the regions sandwiching the route estimated by the route estimation means is wider than the width of the first region provided with a predetermined size around the user and the moving body, the first evaluation means evaluates that the user can move more easily than when that width is narrower. Therefore, a route along which the first region passes easily is determined to involve little interference with the first region, so the user can move easily.
- the width between the regions sandwiching the route estimated by the route estimation means is the length of the portion of the line segment connecting the centers of those regions that lies outside the second regions provided with a predetermined size around the objects corresponding to the regions. Therefore, the ease of the user's movement can be determined in consideration of the user's interference with the second regions.
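Under the simplifying assumption that each second region is a circle around its object, the width described above can be sketched as the center-to-center distance minus the two radii:

```python
import math

def route_width(center_a, center_b, radius_a, radius_b):
    """Length of the segment joining the two region centers that lies
    outside both second regions (modelled here as circles)."""
    d = math.hypot(center_b[0] - center_a[0], center_b[1] - center_a[1])
    return max(d - radius_a - radius_b, 0.0)
```

The circular model and the clamping at zero (overlapping regions leave no passable width) are assumptions for illustration.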
- the route estimation unit estimates the route that is least affected by the object before reaching the target area.
- the first evaluation means evaluates that the user can move more easily as the end point of the route is closer to the point where the movement direction indicated by the user status detected by the user detection means intersects the target position. Therefore, a route that deviates less from simply continuing in the user's current moving direction is determined to be easier for the user to move along.
- the route estimation unit estimates the route that is least affected by the object before reaching the target area.
- the first evaluation means evaluates that the user can move more easily as the absolute value of the change in angle of the movement direction from the start point, relative to the user's movement direction indicated by the user status detected by the user detection means, is smaller. Therefore, a route that does not greatly change the user's current heading is determined to be easier for the user to move along.
- the movement of the object is predicted both using the moving speed of the user indicated by the user status detected by the user detection means and using moving speeds different from the user's moving speed.
- direction candidates for determining the moving direction to be presented to the user are extracted for each of those user moving speeds, based on the surrounding future situation, including the movement of the object, predicted for each speed.
- the path that is least affected is estimated.
- the moving speed of the user used for predicting the movement of the object is the moving speed of the user indicated by the user status detected by the user detecting means.
- FIG. 1 is a schematic diagram showing the external appearance of a moving body. FIG. 2 is a block diagram showing the schematic electrical configuration of the moving body. FIG. 3 is a flowchart showing the main process.
- FIG. 4(a) is a flowchart showing the S100 process, and (b) is a schematic diagram showing an example of the surrounding situation converted into the user coordinate system. A further figure explains the individual regions.
- FIG. 1 is a schematic diagram showing an appearance of a moving body 100 that is an embodiment of the moving body of the present invention.
- the moving body 100 functions as a device that supports the movement of the user 150 by autonomously moving around the front of the moving user 150.
- the moving body 100 of the present embodiment presents to the user 150 a moving direction that is estimated to be easy to move under circumstances around the moving body 100 and the user 150, thereby supporting the movement of the user 150.
- the movement of the user 150 supported by the moving body 100 is a movement at a speed at which the user 150 moves by walking or running.
- the front periphery of the user 150, which is the movement range of the moving body 100, is, for example, a range of 180 degrees in front of and centered on the user 150, or it may be the range of the user's field of view.
- the distance between the moving body 100 and the user 150 is kept within a range set so that it is at least a predetermined distance (for example, about 40 cm) from the user 150 and does not exceed a distance (for example, about 70 cm) that feels natural for moving together with the user 150.
- the moving body 100 includes a main body portion 101, wheels 102, and a display portion 103.
- the main body 101 is formed in a substantially cylindrical shape.
- the shape of the main body 101 is not limited to a substantially cylindrical shape, and various shapes can be adopted.
- a plurality of imaging devices (not shown) for imaging the periphery of the user 150 or the moving body 100 are provided around the main body 101.
- the wheel 102 is configured as an omnidirectional wheel that can move in all directions. Therefore, the moving body 100 can smoothly move in all directions.
- three wheels 102 are provided, but the number of wheels 102 is not limited to three, and an appropriate number can be adopted.
- the display unit 103 has a display such as a liquid crystal display device, and transmits information to the user 150 by display on the display.
- the display of the display unit 103 is provided on a surface facing the user 150.
- the display of the display unit 103 is configured as a touch panel, and can input an instruction from the user 150 to the moving body 100.
- FIG. 2 is a block diagram showing a schematic electrical configuration of the moving body 100.
- the moving body 100 includes a sensing unit 10, a control unit 20, a driving unit 30, and a human machine interface (hereinafter referred to as “HMI”) unit 40.
- the sensing unit 10 detects a state of the user 150 (hereinafter referred to as “user state”) and a state around the mobile object 100 and the user (hereinafter referred to as “peripheral state”).
- the sensing unit 10 that detects the user state is an example of a user detection unit in the present invention.
- the sensing unit 10 that detects the peripheral state is an example of the peripheral state detection means in the present invention.
- the sensing unit 10 includes, for example, a plurality of imaging devices provided around the main body unit 101, various sensor devices such as millimeter waves and lasers, and the like.
- the imaging device for example, a camera device such as a CCD (Charge Coupled Device) camera or a CMOS (Complementary Metal Oxide Semiconductor) camera can be employed.
- An image pickup apparatus having a range finder may be employed.
- the sensing unit 10 outputs the result detected by the sensing unit 10 to the control unit 20. More specifically, the sensing unit 10 outputs a captured image targeted for the user 150, a detection result by the radar, and the like to the control unit 20 as a detection result of the user state. On the other hand, the sensing unit 10 outputs a captured image targeting the periphery of the moving body 100 and the user 150, a detection result by the radar, and the like to the control unit 20 as a detection result of the peripheral state.
- the control unit 20 includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and functions as a control device that controls the entire mobile unit 100. Note that a program for realizing processing described later with reference to FIG. 3 is stored in the ROM of the control unit 20 and executed by the CPU of the control unit 20.
- the control unit 20 generates a control signal corresponding to the input from the sensing unit 10 and outputs the control signal to the drive unit 30 or the HMI unit 40.
- the driving unit 30 includes a wheel 102 and a motor that is a driving source of the wheel 102.
- a control signal is input to the drive unit 30 from the control unit 20.
- the motor rotates based on the input signal, and the wheel 102 is driven by the rotation of the motor as power.
- the HMI unit 40 is an interface for outputting information to the user 150 and allowing the user 150 to input instructions.
- the display of the display unit 103 is an example of the HMI unit 40.
- a speaker that outputs sound and a microphone that inputs sound may be provided as the HMI unit 40.
- the HMI unit 40 outputs information according to the control signal input from the control unit 20.
- the HMI unit 40 outputs a control signal corresponding to the input to the control unit 20.
- FIG. 3 is a flowchart showing a main process executed by the CPU of the control unit 20 of the moving body 100 having the above-described configuration according to a program stored in the ROM of the control unit 20. This main process is periodically executed every predetermined time (for example, every 0.1 second).
- the CPU of the control unit 20 (hereinafter simply referred to as “control unit 20”) executes a process of recognizing the user / periphery state (S100).
- in the process of S100, based on the input from the sensing unit 10, each parameter represented in the moving body coordinate system (that is, the coordinate system based on the moving body 100) is converted into the user coordinate system (that is, the coordinate system based on the user 150). In other words, relative parameters based on the moving body 100 are converted into relative parameters based on the user 150.
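A minimal 2-D sketch of such a conversion: translate so the user is at the origin, then rotate so the x axis points along the user's moving direction. The axis conventions and function name are assumptions; the patent does not specify them.

```python
import math

def to_user_frame(point_mb, user_pos_mb, user_heading_mb):
    """Convert a point in the moving-body coordinate system into the
    user coordinate system: translate so the user is the origin, then
    rotate so the x axis points along the user's moving direction."""
    dx = point_mb[0] - user_pos_mb[0]
    dy = point_mb[1] - user_pos_mb[1]
    c, s = math.cos(-user_heading_mb), math.sin(-user_heading_mb)
    return (c * dx - s * dy, s * dx + c * dy)
```

With this convention, a point directly ahead of the user always maps to the positive x axis of the user frame.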
- the control unit 20 then executes a process of evaluating individual regions (S200).
- the process of S200 is a process of evaluating individual regions for each of the moving object 100, the user 150, and each surrounding object. Although details will be described later with reference to FIGS. 5 and 6, in this embodiment, each area is evaluated by scoring each area.
- the "individual region" for the user 150 or another person in the vicinity is the so-called psychological personal space: a region that the person does not want the other party to enter and, at the same time, a region into which the person feels he or she should not intrude on the other party's side.
- the "individual region" for an object such as the moving body 100 or a nearby obstacle is a region in which anxiety about contact with that object arises.
- after the process of S200, the control unit 20 executes a process of predicting and evaluating the future situation (S300).
- the process of S300 is a process of predicting the future situation around the user 150 and the moving body 100 and evaluating the predicted future situation.
- the process of S300 is an example of the surrounding prediction means of the present invention.
- for each other person present in the vicinity, the movement area after a prediction time t is predicted as one element of the future situation.
- an area whose future transition is uncertain because of a blind spot (hereinafter referred to as an "indeterminate area") is also predicted as one element of the future situation.
- each movement area predicted as part of the future situation is evaluated by scoring it in consideration of the existence probability of the person corresponding to it. Each predicted indeterminate area is likewise scored in consideration of the size of its blind spot. A map reflecting the evaluation of each movement area and each indeterminate area is then generated, giving a comprehensive evaluation of the surrounding future situation.
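A toy sketch of such a scoring map. The grid cells, the data shapes, and the 0.5 weight are all assumptions for illustration, not values from the patent:

```python
def build_score_map(cells, move_areas, blind_areas):
    """Accumulate penalties per grid cell: predicted movement areas
    contribute the other person's existence probability, indeterminate
    areas contribute a weighted blind-spot size."""
    scores = {c: 0.0 for c in cells}
    for cell, prob in move_areas:       # (cell, existence probability)
        if cell in scores:
            scores[cell] += prob
    for cell, size in blind_areas:      # (cell, blind-spot size)
        if cell in scores:
            scores[cell] += 0.5 * size  # 0.5 is an arbitrary weight
    return scores
```

Higher scores mark cells that are more likely to be occupied or harder to predict, so lower-scoring directions are preferred in the later steps.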
- after the process of S300, the control unit 20 executes a process of evaluating ease of movement (S400).
- the process of S400 is a process of evaluating the ease of movement of the user 150 in the future situation predicted and evaluated in S300.
- the process of S400 is an example of the candidate extraction means and the first evaluation means of the present invention. Although details will be described later, in this embodiment, for each direction extracted in consideration of the flow of other persons present in the vicinity, the ease of movement of the user 150 in that direction is evaluated by scoring, based on the future situation predicted and evaluated in S300.
- the control unit 20 executes a route determination process after the process of S400 (S500).
- the process of S500 determines the optimal moving direction for the user 150 based on the evaluation by S400.
- the process of S500 is an example of the determination means of the present invention. Although details will be described later, in the present embodiment, the optimal moving direction for the user 150 is determined in consideration of the score of each direction calculated in S400 and the score of the area corresponding to each direction in the future situation predicted in S300.
- after the process of S500, the control unit 20 executes a presentation process (S600).
- the moving direction determined by the process of S500 is presented to the user 150 by the movement of the moving body 100.
- the moving body 100, which moves in front of the user 150, is shifted in the left-right direction relative to the user 150 by a distance that indicates the determined moving direction, thereby presenting the moving direction to the user 150.
- control unit 20 sends a control signal to the drive unit 30 so that the moving body 100 indicates the moving direction determined in S500 and moves forward at a speed corresponding to the moving speed of the user 150.
- the drive unit 30 to which the control signal is input moves the wheel 102 so that the moving body 100 moves in the left-right direction by a distance indicating the determined moving direction while moving forward at a speed corresponding to the moving speed of the user 150.
- the movement of the moving body 100 in the left-right direction is a movement in the left-right direction while keeping the longitudinal distance from the user 150 constant in the user coordinate system or the moving body coordinate system.
- the process of S600 is an example of the presenting means and the movement control means of the present invention.
- FIG. 4A is a flowchart showing the above-described process (S100) for recognizing the user / peripheral state.
- the control unit 20 acquires parameters in the moving body coordinate system based on the input from the sensing unit 10 (S101).
- the parameters acquired in S101 are roughly classified into parameters indicating the user state and parameters indicating the peripheral state.
- the parameters indicating the user state include, for example, the relative position (x0, y0) of the user 150 with respect to the moving body 100, the relative speed v0 of the user 150 with respect to the moving body 100, the relative moving direction θ0 of the user 150 with respect to the moving body 100, and the orientation of the body of the user 150 with respect to the moving body 100.
- for the relative speed v0, in addition to the current value (that is, the value based on the current input from the sensing unit 10), values obtained by changing the moving speed of the user 150 from the current value by a predetermined amount are separately acquired.
- in the present embodiment, three moving speeds of the user 150 are considered, namely the current moving speed and speeds that differ by ±2 km/h from it, and the relative speed v0 is acquired for each of these moving speeds.
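The three-pattern acquisition of the relative speed described above can be sketched in code. This is an illustrative sketch only, not part of the embodiment; the function name and the one-dimensional treatment of speed are assumptions made for the illustration.

```python
def relative_speed_patterns(user_speed_kmh, body_speed_kmh, step_kmh=2.0):
    """Return the user's relative speed v0 with respect to the moving body
    for three speed patterns: the current speed and speeds changed by
    +/- step_kmh (a 1-D simplification of the description above)."""
    candidates = (user_speed_kmh - step_kmh,
                  user_speed_kmh,
                  user_speed_kmh + step_kmh)
    # v0 for each candidate speed: candidate speed minus the body's speed
    return {s: s - body_speed_kmh for s in candidates}

print(relative_speed_patterns(4.0, 4.0))  # {2.0: -2.0, 4.0: 0.0, 6.0: 2.0}
```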
- the parameters indicating the peripheral state include, for example, the relative position (xi, yi) of another person or an obstacle with respect to the moving body 100, the relative speed vi of the other person or the obstacle with respect to the moving body 100, and the relative moving direction θi of the other person or the obstacle with respect to the moving body 100.
- the variable i is an integer of 1 or more, and is individually assigned to another person or an obstacle present in the vicinity. That is, the variable i is a value for individually identifying other persons or obstacles existing in the vicinity.
- the relative positions of the user 150 and of other persons or obstacles present in the vicinity, as well as the orientation of the body of the user 150, can be acquired by, for example, subjecting each of the images captured by a plurality of imaging devices (for example, cameras) provided around the main body 101 to image processing such as edge extraction and pattern recognition.
- whether a detected person is the user 150 is recognized by, for example, storing the user's face image in advance and comparing it with the captured image. Whether an object present in the vicinity is a moving other person or a stationary obstacle is also determined based on the captured images. On the other hand, the relative speed and the relative moving direction of the user 150 or of another person or an obstacle present in the vicinity are acquired based on, for example, the images acquired this time and the images acquired last time.
- the control unit 20 converts each parameter in the moving body coordinate system acquired in S101 into the user coordinate system (S102).
- FIG. 4B is a schematic diagram illustrating an example of the surrounding situation converted into the user coordinate system. As shown in FIG. 4B, the control unit 20 grasps, from the parameters converted into the user coordinate system, the relative position, relative speed, and relative moving direction with respect to the user 150 for each other person 200 present in the vicinity. In the example shown in FIG. 4B, the relative moving direction and the relative speed with respect to the user 150 are represented by the direction and length of each vector, respectively.
- the control unit 20 grasps the relative position with respect to the user 150 for each obstacle 300 existing in the vicinity from each parameter converted into the user coordinate system.
- in FIG. 4B, two other persons 200 and one obstacle 300 are illustrated as a simple example, but in reality the control unit 20 grasps the relative position, relative speed, and relative moving direction with respect to the user 150 for each of a large number of other persons 200, and the relative position with respect to the user 150 for each of a number of obstacles 300.
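The conversion from the moving-body coordinate system into the user coordinate system (S102) amounts to a translation and a rotation. A minimal sketch, assuming a 2-D frame whose x-axis points along the user's moving direction; the names and conventions are illustrative, not those of the embodiment:

```python
import math

def to_user_frame(point, user_pos, user_heading):
    """Convert a 2-D point given in the moving-body coordinate system into
    a user coordinate system whose origin is the user and whose x-axis
    points along the user's moving direction (heading in radians,
    measured in the moving-body frame)."""
    dx = point[0] - user_pos[0]
    dy = point[1] - user_pos[1]
    c, s = math.cos(user_heading), math.sin(user_heading)
    # rotate the offset by -heading so the heading maps onto the +x axis
    return (c * dx + s * dy, -s * dx + c * dy)
```

Relative speed and relative moving-direction vectors can be converted with the same rotation, omitting the translation.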
- next, the regions evaluated in S200 described above will be described.
- areas 120, 170, and 220 are set for the mobile object 100, the user 150, and the other person 200, respectively.
- the individual area 170 for the user 150 is set so as to be wider in the front direction (the direction of the arrow extending from the user 150), which is the moving direction of the user 150, than in the rear direction. The same applies to the moving body 100 and the other persons 200.
- the individual areas 120, 170, and 220 for the mobile object 100, the user 150, and the other person 200 are all circles, but an appropriate shape such as an ellipse can be adopted.
- although the individual areas 120, 170, and 220 are all the same size in the present embodiment, they need not be the same size.
- the individual areas 120, 170, and 220 may have different sizes depending on the moving speed of the mobile object 100, the user 150, or the other person 200.
- an area wider than the rear provided in front of the moving body 100, the user 150, or the other person 200 may have a size corresponding to the moving speed of the moving body 100, the user 150, or the other person 200.
- each of the areas 120, 170, and 220 is evaluated by a score that is set to be maximum in the exclusion area 120a, 170a, or 220a provided in that area and to gradually decrease toward the periphery.
- the exclusion areas 120a, 170a, and 220a are areas for preventing the other party from entering.
- in the present embodiment, the maximum value of 200 points is set in the exclusion areas, and from the periphery of each exclusion area to the periphery of the corresponding area, the score is set as a linear function that decreases linearly from 100 to 0.
- scores are set for the individual areas 120 and 220 as well as the individual areas 170.
- the “score” of each area in this specification indicates that the lower the score, the higher the tolerance for the opponent's entry, and the higher the score, the lower the tolerance for the opponent's entry.
- the maximum score is not limited to 200, and an appropriate value can be adopted.
- the function for setting the score from the periphery of the exclusion area 170a to the periphery of the area 170 is not limited to a linear function; any appropriate function can be adopted as long as the score decreases toward the periphery. The function may also differ depending on whether the individual area belongs to the moving body 100, the user 150, or another person 200.
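The scoring of an individual area described above can be sketched as a radial function. A hypothetical helper, assuming circular areas with the 200-point exclusion core and the 100-to-0 linear skirt described in this embodiment:

```python
def area_score(d, r_excl, r_outer):
    """Score at distance d from the center of an individual area:
    200 inside the exclusion area, then a linear drop from 100 at the
    exclusion boundary to 0 at the outer periphery (circular areas
    assumed, as in the present embodiment)."""
    if d <= r_excl:
        return 200.0
    if d >= r_outer:
        return 0.0
    return 100.0 * (r_outer - d) / (r_outer - r_excl)
```

Any monotonically decreasing skirt function could be substituted for the linear one, as the description notes.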
- in addition, a single area 130 that covers the user 150 and the moving body 100 together is set as shown in FIG.
- the user 150 and the moving body 100 are located apart from each other by a distance L to be secured, and a region including the individual regions 120 and 170 in that case is set as the individual region 130.
- in the present embodiment, the oval region including the individual areas 120 and 170 is set as the area 130, but an elliptical region that swells to the left and right may also be used.
- the distance L to be secured is the distance between the user 150 and the moving body 100 when the individual area 170 of the user 150 is in contact with the moving body 100 positioned in front of the user 150. An appropriate distance that is neither too close nor too far can be employed.
- An exclusion area 130 a is provided inside each area 130.
- the exclusion area 130a is an area that prevents the other party from entering, similar to the exclusion areas 120a, 170a, and 220a.
- in the present embodiment, the oval area including the exclusion areas 120a and 170a is set as the exclusion area 130a, but an elliptical region that swells to the left and right may also be used.
- the maximum score is set in the exclusion area 130 a and the progressively lower scores are set toward the periphery, as in the individual areas 120, 170, and 220.
- specifically, 200 points, the maximum value, are set for the exclusion area 130a, and from the periphery of the exclusion area 130a to the periphery of the area 130, the score is set as a linear function that decreases linearly from 100 to 0.
- the function for setting the number of points from the periphery of the exclusion region 130a to the periphery of the individual regions 130 is not limited to a linear function as long as it decreases toward the periphery, and an appropriate value can be adopted.
- an area that is a predetermined distance away from the obstacle 300 and that is visible from the moving body 100 is set as an individual area 320 with respect to the obstacle 300.
- in the present embodiment, each area 320 consists only of an exclusion area that prevents the other party from entering. Therefore, 200 points, the maximum value, are set over the entire area 320.
- the individual regions 320 may be configured to include an exclusion region and a region whose score is lower than the exclusion region, such as the individual region 120.
- next, the prediction and the evaluation performed in S300 described above in the present embodiment will be described.
- the moving area after the prediction time t is predicted as one of the future situations, and each predicted moving area is evaluated by the score.
- the “score” of the future situation in this specification indicates that the lower the score, the easier the user 150 moves in the future situation.
- in order to predict the moving area for each other person 200 present in the vicinity, the control unit 20 first calculates a prediction time t for each of the other persons 200 present in the vicinity.
- the prediction time t is calculated as t = (Δx − α) / Δv, where Δx is the distance between the other person 200 and the moving body 100, Δv is the relative speed of the other person 200 in the user coordinate system, and α is the grace distance. Δv is a value larger than 0; therefore, the prediction time t is calculated only for other persons 200 that are relatively approaching the moving body 100 and the user 150.
- the grace distance α is a distance set on the assumption that, when the moving body 100 and the user 150 on one side and the other person 200 on the other approach each other along the line connecting the other person 200 and the moving body 100, the other person 200 and the moving body 100 should not come closer than a certain extent. In the present embodiment, the distance between the other person 200 and the moving body 100 when the exclusion area 220a and the exclusion area 170a are separated from each other by a predetermined distance (for example, 1 m) is defined as the grace distance.
- in other words, the prediction time t is calculated as the time until the exclusion area 220a and the exclusion area 170a reach positions separated by the predetermined distance (for example, 1 m) when the moving body 100 and the user 150 approach the other person 200 along the line connecting the other person 200 and the moving body 100. Since the prediction time t thus depends on the position and moving speed of each other person 200, it is a variable value according to the situation of each other person 200.
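The calculation of the prediction time t can be summarized as a short function. A sketch under the assumptions stated above (Δv larger than 0 only for relatively approaching others; the names are illustrative):

```python
def prediction_time(dx, dv, grace):
    """t = (dx - grace) / dv, where dx is the distance between the other
    person and the moving body, dv the relative closing speed in the user
    coordinate system, and grace the grace distance. Computed only while
    dv > 0 (the other person is relatively approaching)."""
    if dv <= 0:
        return None  # not approaching: no prediction time is calculated
    return (dx - grace) / dv
```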
- the control unit 20 predicts the movement area 250 after the prediction time t when each other person 200 moves in the current movement direction and the movement direction for each of the other persons 200 existing around.
- the predicted time t is a value calculated for each other person 200 with respect to the other person 200 relatively approaching the mobile object 100 and the user 150.
- Each moving region 250 is predicted as a circle having a center on the line of the moving direction of the other person 200, and the closer to the center, the higher the existence probability of the other person 200 after the prediction time t.
- the predicted moving area 250 is an area in which the other person 200 does not enter the exclusion area 170a for the user 150, and in which the moving body 100 and the user 150 do not enter the exclusion area 220a for the other person 200. It is therefore considered that the load imposed on the other person 200 by the movement of the moving body 100 and the user 150 is minimized.
- the control unit 20 evaluates the movement area 250 predicted for each other person 200.
- the score is set as a linear function that decreases linearly from a predetermined value of 0 or more at the center of the moving area 250 to 0 at the periphery.
- the “score” of the moving area in this specification indicates that the higher the score, the higher the existence probability of the other person. In the present embodiment, 100 points are assigned to the existence probability of 100%. Therefore, the “score” of the moving area is a numerical score equal to the numerical value of the existence probability.
- the number of points at the center of the moving area 250 is set as the value of the existence probability at the center.
- for example, the score Ka set for the center of a moving area 250 with a higher existence probability is higher than the score Kb set for the center of the moving area 250b with a lower existence probability.
- the value set for the center of the moving area 250 is not limited to the value of the existence probability at the center, and an appropriate value according to the existence probability can be adopted.
- the function for setting the number of points from the center to the periphery of the moving region 250 is not limited to a linear function as long as it is a function that decreases from the center toward the periphery, and an appropriate value can be adopted. The function may be different depending on the individual state of the other person 200.
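The scoring of a predicted moving area 250 can likewise be sketched. A hypothetical helper, assuming a circular area whose center score equals the existence probability in percent and which falls linearly to 0 at the periphery, as in this embodiment:

```python
def moving_area_score(d, radius, p_center):
    """Score at distance d from the center of a predicted moving area 250:
    equals the existence probability at the center (in %, so 100 points
    for 100%) and decreases linearly to 0 at the periphery."""
    if d >= radius:
        return 0.0
    return p_center * (radius - d) / radius
```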
- in the present embodiment, the moving area 250 after the prediction time t is also predicted for the case where the moving speed of the user 150 changes. Specifically, the range of change of the moving speed of the user 150 is set from −2 km/h to +2 km/h, and the moving speed is changed in increments of 2 km/h within that range. That is, the prediction and evaluation of the moving area 250 for each other person 200 are performed for three patterns: the current moving speed of the user 150 and speeds that differ by ±2 km/h from it.
- in these cases, the prediction time t is calculated with the relative speed corresponding to the changed moving speed of the user. For example, the faster the moving speed of the user, the longer the prediction time t becomes; accordingly, the moving area predicted for each other person 200 is located farther along the moving direction of the other person 200 and is wider than the moving area 250 shown in FIG.
- control unit 20 first predicts an indeterminate region.
- the control unit 20 predicts an indeterminate region 320 in a region adjacent to the blind spot region with respect to the blind spot formed on the back side of the obstacle 300.
- the indeterminate region 320 is a semicircle whose vertical width K1 is twice the horizontal width K2, but the ratio of the vertical width K1 to the horizontal width K2 is not limited to 2: 1.
- the shape is not limited to a semicircle or another partial circle; an indeterminate region 320 having any appropriate shape can be employed.
- the control unit 20 evaluates the predicted indeterminate region.
- in the present embodiment, 100 points are set at the center of the vertical width K1 of the indeterminate region 320, and the score is set as a linear function that decreases linearly from that center position toward the periphery; along the horizontal width K2, the score is set as a linear function that decreases linearly from the obstacle-side end toward the periphery.
- the evaluation of the indeterminate region 320 is not limited to this; as long as the condition is satisfied that the score is set higher at positions of higher uncertainty in the indeterminate region 320 and lower toward the periphery, an evaluation according to the shape of the obstacle 300 and the like can be performed. Further, the maximum value of the score in the indeterminate region 320 is not limited to 100, and various values can be adopted.
- for a blind spot formed by other persons 200 present in the vicinity, the control unit 20 likewise predicts and evaluates an indeterminate region corresponding to that blind spot, in the same manner as the indeterminate region 320 for the obstacle 300.
- the size of the indeterminate area is changed according to the number of other persons 200 that form a blind spot. For example, when several other people 200 overlap to form a blind spot, an area larger than the uncertain area for the blind spot formed from one other person 200 is predicted as the uncertain area.
- in the present embodiment, an evaluation map reflecting the evaluation of each future situation is generated.
- as described above, the prediction of the moving area for the other persons 200 is performed for three patterns: the current moving speed of the user 150 and speeds that differ by ±2 km/h from it. The future situation is evaluated for each pattern; that is, three patterns of maps corresponding to the moving speed of the user 150 are generated as evaluation maps.
- the evaluation according to the moving speed of the user 150 is added to the map of each pattern.
- An example of the relationship between the amount of change from the current moving speed of the user 150 and the number of evaluation points is shown in FIG.
- when the amount of change in the moving speed is 1 km/h or less, the evaluation score is 0, the minimum value for evaluating the change in the moving speed of the user. In other words, a change in the moving speed of 1 km/h or less is evaluated as not affecting the ease of movement of the user 150.
- when the amount of change exceeds 1 km/h, the evaluation score increases as a linear function as the amount of change increases. When the amount of change in the moving speed is 3 km/h or more, the evaluation score is 100, the maximum value for evaluating the change in the moving speed of the user. That is, when the moving speed changes by 3 km/h or more, the ease of movement of the user 150 associated with the change in moving speed is evaluated as the lowest (that is, movement is most difficult).
- the evaluation score is added to the entire evaluation map. Note that the relationship between the amount of change in the moving speed of the user 150 and the evaluation score is not limited to the relationship shown in the example of FIG. 9; various relationships can be adopted as long as the condition is satisfied that the evaluation score increases when the amount of change in the moving speed is greater than 0.
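The piecewise-linear relationship of FIG. 9 can be sketched as follows. The 1 km/h and 3 km/h breakpoints are taken from the description above; the function itself is an illustration, not the claimed implementation:

```python
def speed_change_score(delta_kmh):
    """Evaluation score added over the whole map for a change in the
    user's moving speed: 0 up to 1 km/h of change, rising linearly to
    100 at 3 km/h or more (the FIG. 9 relationship)."""
    d = abs(delta_kmh)
    if d <= 1.0:
        return 0.0
    if d >= 3.0:
        return 100.0
    return 100.0 * (d - 1.0) / 2.0
```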
- FIG. 10 is a flowchart showing the above-described process (S400) for evaluating the ease of movement.
- hereinafter, the process of S400 for evaluating the ease of movement will be described, referring to FIGS. 11 to 16 as necessary.
- the control unit 20 evaluates the global flow in the surrounding field (S401).
- the process of S401 is an example of the second evaluation unit of the present invention.
- the “global flow” is not a flow in which an individual other person 200 moves, but an overall flow formed by the individual movements of the other persons 200. Therefore, in some cases, the moving direction of a given other person 200 may be opposite to the direction of the global flow that includes that person.
- control unit 20 first determines the presence or absence of a global flow based on the flow density calculated from the relative position, relative speed, relative movement direction, etc. of each other person 200 in the user coordinate system.
- the flow density is calculated according to, for example, the degree of overlap in the relative movement direction of the other person 200 per unit area.
- when a global flow exists, the control unit 20 narrows down the flows in consideration of the direction and strength of each flow, the distance between the flow and the user 150 and the moving body 100, and the like. For example, priority is given to a flow in a direction away from the user 150 and the moving body 100, while a flow in the direction toward the user 150 and the moving body 100 is excluded.
- further, among the flows moving away from the user 150 and the moving body 100, a stronger flow is prioritized. For example, when there is a strong flow in a direction away from the user 150 and the moving body 100, the other flows are excluded; however, when there is a flow of a certain strength near the user 150 and the moving body 100, a strong flow in the distance is excluded in favor of the nearby flow.
- the control unit 20 sets a movement destination area (hereinafter, this area is referred to as a “first movement destination area”) for riding on the narrowed flow.
- the first movement destination area is an area that the user 150 and the moving body 100 can reach with a short moving distance and little direction change in order to ride on the flow.
- then, a candidate for the moving direction in which the user 150 and the moving body 100 get on the global flow is determined with some latitude as the direction from the current position toward the first movement destination area (for example, within an angle range of about 0 to 10 degrees).
- the control unit 20 evaluates the local flow in the surrounding field after the process of S401 (S402).
- a “local flow” is a flow in which each other person 200 moves individually. Specifically, the control unit 20 evaluates the local flow according to the relative speed of each other person 200 in the user coordinate system.
- the process of S402 is an example of a second evaluation unit of the present invention.
- specifically, when the relative speed of the other person 200 with respect to the user 150 is 0 or more (that is, when the other person 200 ahead is moving relatively away from the user 150), it is evaluated that the user 150 can walk more easily than when the relative speed is a negative value (that is, when the other person 200 ahead is relatively approaching the user 150).
- the local flow by each other person 200 is scored and evaluated according to the relative speed of the other person 200 in the user coordinate system. The score for the local flow is added to the score for the moving area estimated for each other person 200 evaluated in S300.
- FIG. 11 shows an example of the relationship between the relative speed of the other person 200 and the evaluation score in the user coordinate system.
- for example, when the relative speed of the other person 200 with respect to the user 150 is +2 km/h or more, the evaluation score is 0, the minimum value for evaluating the local flow. This is because the user 150 is likely to move easily if the user 150 moves in front of another person 200 who moves in the same direction as the user 150 while the user 150 moves at the same or a somewhat higher speed.
- when the relative speed of the other person 200 with respect to the user 150 is a negative value, the evaluation score increases as a linear function as the absolute value of the relative speed increases, that is, as the other person 200 ahead approaches the user 150 more quickly. When the relative speed is −3 km/h or less, the evaluation score is 100, the maximum value for evaluating the local flow. That is, when the other person 200 approaches the user 150 at a speed of 3 km/h or more, the ease of movement of the user 150 with respect to the local flow caused by that other person 200 is evaluated as the lowest (that is, movement is most difficult).
- even when the relative speed is 0 or more, the score remains greater than 0 over part of that range. This is because it does not become particularly easy to move even if the user moves in front of another person 200 who is moving away from the user 150 at only a modest speed.
- note that the relationship between the relative speed of the other person 200 in the user coordinate system and the evaluation score is not limited to the relationship shown in FIG. 11, as long as the case where the relative speed is 0 or more is evaluated as easier to move than the case where the relative speed of the other person 200 with respect to the user 150 is a negative value. For example, the evaluation score when the relative speed is 0 or more may be a constant value, and the value assigned when the relative speed is +2 km/h or more need not be the extreme value used for evaluating the local flow; a different value may be adopted.
- the control unit 20 narrows down candidate areas after the process of S402 (S403).
- the process of S403 is an example of candidate extraction means of the present invention.
- specifically, the control unit 20 first divides the evaluation map obtained based on the evaluation in S300 (that is, the map reflecting the evaluation of the predicted future situation) in constant angle increments over the range of 180 degrees in front of the user 150, with the user 150 as the reference. In the present embodiment, the evaluation map is divided in steps of 10 degrees to the left and right, centered on the moving direction of the user 150.
- next, the control unit 20 sets, for each dividing line drawn in front of the user 150, an area that includes the dividing line and has a predetermined width (hereinafter, this area is referred to as a “divided area”). For example, as shown in FIG. 12B, for each of the 17 dividing lines, excluding the dividing lines extending in the left-right direction from the position of the user 150, a divided area of a predetermined width centered on the dividing line (1 m in the present embodiment) is set. One end of each divided area is defined by the periphery of the area 130 including the moving body 100 and the user 150, and the other end is defined by the boundary line 501 with the second movement destination area defined in front of the user 150.
- the “second movement destination area” is an area in which the user 150 is predicted to move in the future at the current movement speed, and is set according to the movement speed of the user 150 at that time. For example, the second movement destination area is set closer to the current position of the user 150 as the movement speed of the user 150 is slower.
- when a global flow exists, the control unit 20 limits the setting of divided areas to those that include the direction from the current position toward the first movement destination area. Otherwise, the control unit 20 sets divided areas for all of the 17 dividing lines excluding the dividing lines extending in the left-right direction from the position of the user 150.
- thus, a maximum of 17 divided areas can be set for one evaluation map. Since the evaluation map is obtained as a different map according to the moving speed of the user 150, 17 × 3 patterns, that is, a maximum of 51 divided areas are set.
- the control unit 20 integrates the evaluation scores of the future situation included in each divided area, for each of the set divided areas (that is, a maximum of 51 divided areas), and narrows down the candidate areas for the route search from the divided areas based on the obtained integrated values. For example, a predetermined number (for example, 5) of divided areas are selected as candidate areas in ascending order of integrated value.
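The narrowing-down of candidate areas can be sketched as an integrate-and-sort step. A minimal illustration; the data layout and names are assumptions, not the embodiment's:

```python
def narrow_candidates(divided_area_scores, keep=5):
    """divided_area_scores: {area_id: list of future-situation scores
    falling in that divided area}. Integrate the scores per area and
    keep the `keep` areas with the lowest integrated value (a lower
    integrated value means easier movement)."""
    totals = {a: sum(scores) for a, scores in divided_area_scores.items()}
    return sorted(totals, key=totals.get)[:keep]
```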
- the control unit 20 sets a candidate route that the user 150 can easily move for the candidate area narrowed down in S403 (S404).
- the process of S404 is an example of the route estimation means of the present invention. For example, as illustrated in FIG. 13, the control unit 20 sets a line segment 601 that is orthogonal to the dividing line for the candidate region at regular intervals. Next, for each line segment 601, the control unit 20 searches for a portion having the lowest evaluation score for the future situation and identifies a passing candidate point. Next, the control unit 20 connects the candidate passage points specified in each line segment 601 and sets a candidate route.
- when the portion with the lowest evaluation score has a certain width, the middle point of that width is specified as the passing candidate point. When there are multiple such portions, the location with the least degree of direction change from the passing candidate point determined on the line segment 601 searched immediately before the current one is specified as the passing candidate point. For the first line segment 601, the location with the least degree of direction change from the moving direction of the user 150 is specified as the passing candidate point.
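The passing-candidate-point search of S404 can be sketched as a per-segment minimum search. Here lateral offsets stand in for points on each line segment 601, and breaking ties by proximity to the previous pick is a simplified stand-in for "least degree of direction change"; all names are illustrative:

```python
def candidate_route(scanlines):
    """scanlines: list of lists of (offset, score) samples along the
    successive line segments 601 crossing a candidate area. Pick the
    lowest-score sample on each segment, breaking ties by proximity to
    the previous pick (simplified 'least direction change')."""
    route, prev = [], None
    for line in scanlines:
        best = min(line, key=lambda ps: (ps[1],
                   abs(ps[0] - prev) if prev is not None else 0))
        route.append(best[0])
        prev = best[0]
    return route
```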
- the control unit 20 evaluates the ease of movement of the user 150 for each candidate route set in S404 (S405), and ends the process of S400.
- the process of S405 is an example of a first evaluation unit of the present invention.
- in the present embodiment, a candidate route with less direction change from the current moving direction of the user 150 is evaluated as a route along which the user 150 can move easily.
- a route with little change in direction can be said to be a route whose moving direction from the starting point is close to the current moving direction of the user 150, for example.
- specifically, as shown in FIG. 14A, a candidate route 701 is evaluated as easier for the user 150 to move along as the absolute value θ of the angle formed by the moving direction from the starting point of the candidate route 701 and the current moving direction of the user 150 decreases. For example, as shown in FIG. 14B, when the value of θ is 0 to 5 degrees, the evaluation score is 0, the minimum value for evaluating the candidate route. This takes into account the fluctuation of direction when the user 150 moves: a direction change of 0 to 5 degrees is regarded as such fluctuation and evaluated as not affecting the ease of movement of the user 150.
- when the value of θ exceeds 5 degrees, the evaluation score increases as a linear function as θ increases, in the range up to 45 degrees.
- when the value of θ is 45 degrees or more, the evaluation score is 100, the maximum value for evaluating the candidate route. That is, when the moving direction from the starting point of the candidate route 701 differs by 45 degrees or more from the current moving direction of the user 150, the ease of movement when the user 150 travels along that candidate route is evaluated as the lowest (that is, movement is most difficult).
- FIG. 14B is an example of the evaluation. As long as the condition is satisfied that the evaluation score increases when the value of θ is greater than 0, the evaluation can be performed using various relationships, not limited to the relationship shown in FIG. 14B.
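The θ evaluation of FIG. 14B can be sketched as a clamped linear function. The 5-degree and 45-degree breakpoints come from the description above; the function itself is illustrative:

```python
def direction_change_score(theta_deg):
    """Candidate-route evaluation by direction change theta from the
    user's current moving direction: 0 up to 5 degrees, rising linearly
    to 100 at 45 degrees or more (the FIG. 14B relationship)."""
    t = abs(theta_deg)
    if t <= 5.0:
        return 0.0
    if t >= 45.0:
        return 100.0
    return 100.0 * (t - 5.0) / 40.0
```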
- from another viewpoint, a route with little direction change can be said to be a route whose end point (that is, the point of arrival at the second movement destination area) is close to the intersection with the second movement destination area that the user 150 would reach by moving in the current moving direction.
- specifically, as shown in FIG. 15A, a candidate route 701 is evaluated as easier for the user 150 to travel as the distance Q between the end point of the candidate route 701 (that is, the intersection of the candidate route 701 and the boundary line 501) and the intersection of the axis line 801, extending in the current moving direction of the user 150, with the boundary line 501 decreases.
- when the distance Q is within a certain range, the candidate route is evaluated as substantially matching the current moving direction of the user 150 and as not affecting the ease of movement of the user 150. Since the ease of movement of the user 150 depends on the position of the second movement destination area, the range of the distance Q that has no effect also differs: the farther the position of the second movement destination area (that is, the position of the boundary line 501) is from the user 150, the wider the range of the distance Q that does not affect the ease of movement of the user 150.
- for example, as shown in FIG. 15B, when the distance Q is 2 m or less for a movement of 5 m, the evaluation score is 0, the minimum value for evaluating the candidate route; that is, such a route is evaluated as not affecting the ease of movement of the user 150.
- when the value of Q exceeds 2 m, the evaluation score increases as a linear function as Q increases, in the range up to 10 m.
- when the value of Q is 10 m or more, the evaluation score is 100, the maximum value for evaluating the candidate route. That is, when the end point of the candidate route is 10 m or more away from the intersection of the axis line 801 and the boundary line 501, the ease of movement when the user 150 travels along that candidate route is evaluated as the lowest (that is, movement is most difficult).
- FIG. 15B illustrates the case where the boundary line 501 is 5 m away from the current position of the user 150.
- A plurality of relationships between Q and the evaluation score may be prepared, one for each distance that can be set between the current position of the user 150 and the boundary line 501, and among them the relationship corresponding to the position of the second movement destination area, which is set according to the movement speed of the user 150, is used.
- the value on the horizontal axis may be calculated from the moving speed of the user 150 according to a predetermined function.
- FIG. 15B is an example of evaluation. As long as the condition that the evaluation score increases when the value of Q is greater than 0 is satisfied, the evaluation can be performed using various relationships, not limited to the relationship shown in FIG. 15B.
- In the second evaluation, the ease of movement of the user is evaluated in consideration of obstruction by the path of the other person 200.
- Specifically, the ease of movement of the user is evaluated in consideration of the width between the moving areas 250 of the other persons 200 that sandwich the candidate route.
- The advance width W is the length of the line segment obtained by subtracting half of the width M of the exclusion area 220a of each other person 200 from both ends of the line segment connecting the centers of the moving areas 250 sandwiching the candidate route.
- When the value of W is equal to or larger than the width Z, the evaluation score is 0, the minimum value for evaluating the candidate route; that is, in this example, such a route is evaluated as not affecting the ease of movement of the user 150.
- As the value of W decreases below Z, the evaluation score increases linearly.
- When the value of W is 1/4 of the width Z or less (Z being the width of the exclusion area 170a of the user 150), the evaluation score is 100, the maximum value for evaluating the candidate route; that is, the ease of movement when the user 150 travels along the candidate route is evaluated as lowest (that is, it is most difficult to move).
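The relationship between the advance width W and the evaluation score (0 when W ≥ Z, 100 when W ≤ Z/4) can likewise be sketched; interpolating linearly between the two stated thresholds is an assumption consistent with the description:

```python
def w_score(w, z):
    """Evaluation score for the advance width W between moving areas
    sandwiching a candidate route, where z is the width Z of the
    exclusion area 170a of the user 150.
    W >= Z: 0 (no effect); W <= Z/4: 100 (hardest to move);
    linearly interpolated in between (an assumption consistent with
    the stated endpoints)."""
    if w >= z:
        return 0.0
    if w <= z / 4.0:
        return 100.0
    return 100.0 * (z - w) / (z - z / 4.0)
```

For example, with `z = 4.0`, `w_score(2.5)` evaluates to 50.0.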
- The advance width W is evaluated for each pair of moving areas sandwiching the candidate route, and the evaluation scores of those evaluations are summed to evaluate the ease of movement of the user. Since the exclusion area 220a of the other person 200 is taken into account in the width W, the load imposed on the other person 200 when the user 150 moves is kept as small as possible.
- For each candidate route, the combined value of the evaluation score obtained from the above-described first evaluation (that is, the evaluation according to the direction change from the current movement direction of the user 150) and the evaluation score obtained from the second evaluation (that is, the evaluation according to the advance width W) is used as the evaluation of the ease of movement of the user 150.
- FIG. 16B is an example of evaluation. As long as the condition that the evaluation score increases when the width W is smaller than the width Z is satisfied, the evaluation can be performed using various relationships, not limited to the relationship shown in FIG. 16B.
- the route determination process is a process for determining the optimum route from the candidate routes based on the evaluation in S400.
- The control unit 20 adds together the score evaluated in S405 (in this embodiment, the total of the evaluation scores based on the first evaluation and the second evaluation) and the score on the candidate route in the evaluation map, and performs the evaluation.
- Specifically, for each candidate route, the control unit 20 first plots the score evaluated in S405 at each passing candidate point on a graph whose horizontal axis is the distance from the starting point of the candidate route. The plotted score therefore increases linearly from 0 to the score evaluated in S405 between the starting point C0 of the candidate route and the first passing candidate point Ca, and decreases linearly from the score evaluated in S405 to 0 between the last passing candidate point Cb and the end point C1 of the candidate route. Such a graph is created for each candidate route.
- Next, for each candidate route's graph created by plotting the scores evaluated in S405, the control unit 20 adds, at each passing candidate point, the score on the candidate route in the evaluation map. Thereby, for example, the graph shown in FIG. 16B is obtained.
- The control unit 20 then extracts, from among the graphs to which the scores in the evaluation map have been added, the candidate route corresponding to the graph having the smallest integral value.
- For the extracted candidate route, the control unit 20 adjusts the score in the evaluation map and the score indicating the ease of movement of the user 150 with respect to the candidate route so that the integral value decreases.
- the control unit 20 determines the candidate route optimized by the adjustment as the optimum route.
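The selection step can be illustrated as follows: each candidate route's graph is represented as (distance, total score) points, where the total score at each passing candidate point is the S405 score plus the evaluation-map score and the start C0 and end C1 carry a score of 0, and the route whose graph has the smallest trapezoidal integral is extracted. The data layout and function names are illustrative assumptions:

```python
def route_integral(points):
    """Trapezoidal integral of the total score over distance for one
    candidate route. `points` is a list of (distance, total_score)
    pairs ordered by distance; scores vary linearly between points,
    matching the linear rise/fall described for C0..Ca and Cb..C1."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += 0.5 * (y0 + y1) * (x1 - x0)
    return area

def extract_best_route(candidates):
    """Return the id of the candidate route whose graph has the
    smallest integral value. `candidates` maps route id -> points."""
    return min(candidates, key=lambda rid: route_integral(candidates[rid]))
```

For example, with `routes = {"A": [(0, 0), (2, 40), (4, 40), (6, 0)], "B": [(0, 0), (3, 80), (6, 0)]}`, `extract_best_route(routes)` returns `"A"` (integral 160 versus 240).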
- the control unit 20 determines the first moving direction on the determined optimal route as the optimal moving direction for the user 150.
- the movement direction optimal for the user 150 determined in this way is presented to the user 150 by the movement of the moving body 100 in the process of S600.
- the user 150 can move with reference to the movement of the moving body 100.
- Since the movement of the moving body 100 indicates the movement direction estimated to be optimal for the user 150 under the surrounding conditions, by moving in accordance with the movement of the moving body 100, the user 150 can move suitably with respect to the surrounding situation.
- In other words, by using the evaluation map in which the surrounding situation is evaluated, an appropriate movement direction according to the surrounding state, one in which the user 150 can move easily, can be presented to the user 150.
- Since the movement direction can be set according to the flow, a movement direction suited to the surrounding flow can be presented to the user 150.
- Since the moving body 100 preferentially presents a direction involving a small direction change from the current movement direction of the user 150, forcing the user 150 into a large direction change can be suppressed.
- Since the moving body 100 preferentially presents a direction involving a small speed change from the current movement speed of the user 150, forcing the user 150 into a large speed change can be suppressed.
- Since the moving body 100 preferentially presents a direction that is not easily obstructed by the path of the other person 200, contact between the user 150 and the other person 200 can be suppressed.
- Since the moving body 100 presents the user with a direction that takes into consideration the indeterminate area based on blind spots, contact with a person jumping out from a blind spot can be suppressed.
- When predicting the movement area, after the prediction time t, of an other person 200 existing in the vicinity, a variable value corresponding to the situation of that other person 200 is used as the prediction time t.
- the user 150 can be presented with an easy-to-move direction that takes the situation into consideration.
- Since the predicted time t is a value corresponding to the relative position between the moving body 100 and the other person 200 and to the relative speed of the other person 200 with respect to the user 150, a direction in which contact between the moving body 100 and the other person 200 can be avoided can be presented.
- Since the predicted time t is calculated in consideration of the grace distance β, the moving body 100 can present the movement direction to the user 150 at a position that leaves room with respect to the other person 200. Thereby, the moving body 100 can present the movement direction to the user 150 as if clearing the way through the surroundings.
- Since the moving body 100 presents to the user 150 a direction that takes into consideration not only the exclusion area 130a of the area 130 encompassing the user 150 and the moving body 100 but also the exclusion area of the individual area 220 of the other person 200, the presented direction is not only one in which the user 150 can move easily but also one that suppresses the load imposed on the other person 200 by the movement of the user 150.
- In the above embodiment, the human 200 is exemplified as a moving body that moves around the user 150, but animals such as dogs and cats, vehicles such as bicycles running at low speed, and the like can be evaluated similarly as moving bodies.
- In the above embodiment, the exclusion areas 120a, 170a, and 220a are provided for the individual areas 120, 170, and 220 so as not to allow the other party to enter, and a maximum score of 200 points is set for the exclusion areas 120a, 170a, and 220a. Alternatively, the exclusion areas may be omitted, and the score may instead decrease gradually from the center of each of the moving body 100, the user 150, and the other person 200 toward the periphery.
- In the above embodiment, the exclusion area 130a and the exclusion area 220a are considered when calculating the predicted time t and the width W, but the individual areas 130 and 220 may be considered instead.
- control unit 20 is configured to convert each parameter in the moving body coordinate system into the user coordinate system and grasp the surrounding state on the basis of the state of the user 150.
- Alternatively, the control unit 20 may be configured to grasp the states of the user 150 and the surroundings from each parameter in the moving body coordinate system (that is, each parameter acquired based on input from the sensing unit 10).
- the relative movement direction of the other person in the user coordinate system is calculated from the relative movement direction ⁇ i of the other person or the obstacle with respect to the moving body 100 and the relative movement direction ⁇ 0 of the user 150 with respect to the moving body 100.
- the relative movement direction of the other person in the user coordinate system may be calculated in consideration of the orientation ⁇ B of the user 150 with respect to the moving body 100. That is, the relative movement direction ⁇ 0 may be corrected with the body direction ⁇ B, and the relative movement direction of the other person in the user coordinate system may be calculated from the corrected relative movement direction and the relative movement direction ⁇ i.
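As a minimal sketch of this conversion, assuming all directions are planar angles measured in the moving-body coordinate system and that the conversion amounts to an angle difference (the exact correction rule is not specified here, so this formula is an illustrative assumption):

```python
import math

def wrap(angle):
    """Wrap an angle to the interval (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

def other_direction_in_user_frame(theta_i, theta_0, theta_b=0.0):
    """Relative movement direction of another person in the user
    coordinate system, computed from the other person's relative
    direction theta_i and the user's relative direction theta_0
    (optionally corrected by the body orientation theta_b), all in
    the moving-body coordinate system. Treating the conversion as a
    simple subtraction is an illustrative assumption."""
    return wrap(theta_i - wrap(theta_0 + theta_b))
```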
- In the above embodiment, the predicted time t is calculated as (Δx − β) / Δv for each other person 200.
- In the above embodiment, the predicted time t is a variable value according to the situation of each other person 200, but the predicted time t may instead be a fixed value that does not depend on the situation of each other person 200.
- the predicted time t may be changed each time according to the flow speed around the user. For example, when the flow around the user is relatively slow, the prediction time t may be set to a longer time, and the prediction time t may be shortened as the flow around the user becomes faster.
- In that case, the execution interval of the main process may also be changed accordingly.
- In the above embodiment, the predicted time t is calculated using Δx, which is the distance between the other person 200 and the moving body 100, and Δv, which is the relative speed of the other person 200 in the user coordinate system.
- As Δv, the component velocity obtained by resolving the relative velocity of the other person 200 in the user coordinate system along the direction connecting the other person 200 and the moving body 100 may be used.
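Combining the quantities just described, the prediction time t = (Δx − β) / Δv with the component-velocity variant of Δv can be sketched as below; the non-negative clamp and the infinite result for a non-approaching person are added safeguards, not stated in the text:

```python
import math

def predicted_time(dx, beta, rel_vel, toward_other):
    """Prediction time t = (dx - beta) / dv for one other person 200.
    dx:           distance between the other person and the moving body [m]
    beta:         grace distance [m]
    rel_vel:      (vx, vy) relative velocity of the other person in the
                  user coordinate system [m/s]
    toward_other: (ux, uy) unit vector from the moving body toward the
                  other person; dv is the closing component of the
                  relative velocity along this line."""
    dv = -(rel_vel[0] * toward_other[0] + rel_vel[1] * toward_other[1])
    if dv <= 0.0:
        # The other person is not approaching: no finite prediction time.
        return math.inf
    return max(0.0, (dx - beta) / dv)
```

For example, an other person 10 m away closing at 2 m/s with a 2 m grace distance gives `predicted_time(10.0, 2.0, (-2.0, 0.0), (1.0, 0.0))` = 4.0 s.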
- In the above embodiment, the range of change in the movement speed of the user 150 is set from −2 km/h to +2 km/h. However, the range of change is not limited to this, and various ranges, such as from −3 km/h to +3 km/h, can be adopted.
- Although the step width within the range of change is 2 km/h in the above embodiment, various step widths, such as 1 km/h, can be adopted within that range.
- In the above embodiment, the dividing lines for setting the divided areas in the evaluation map divide the area in front of the user 150 at a constant angular step. However, the dividing lines need not be at a fixed angular unit; for example, they may be drawn finely at angles close to the movement direction of the user 150 and sparsely at angles close to the left and right directions of the user 150.
- In the above embodiment, the movement direction for riding the global flow is narrowed down in consideration of the flow direction and strength, etc., while in S402 the local flow is evaluated by a score and the evaluation score is added on the evaluation map.
- However, the global flow may also be evaluated by a score, as with the local flow, and the evaluation score added on the evaluation map.
- Conversely, for the local flow, as with the global flow, the presence or absence of a flow may be determined, and when a local flow is determined to exist, the movement direction for riding the local flow may be narrowed down according to the relative speed with respect to the user 150 or the like. When the movement direction for riding the local flow is narrowed down, the setting of the divided areas in S403 is limited to the area including the narrowed-down movement direction.
- In the above embodiment, both the global flow and the local flow are considered, but only one of them may be used. Alternatively, the optimum movement direction may be determined separately for the global flow and for the local flow. In that case, when the two determinations yield different optimum movement directions, the presentation mode may be changed depending on whether the direction was determined based on the global flow or the local flow.
- In the above embodiment, the candidate route is set by connecting the places having the lowest evaluation score for the future situation on the line segments 601 set in the candidate area. Alternatively, a configuration may be adopted in which route candidates are set by connecting line segments in contact with low-score regions in the candidate area.
- the candidate route corresponding to the graph with the smallest integral value is extracted from the candidate routes in S500.
- A candidate route in which the total of the score evaluated in S405 and the score on the candidate route in the evaluation map at each passing candidate point satisfies a predetermined condition may be excluded from the extraction targets. For example, candidate routes whose total score rises sharply between adjacent passing candidate points may be excluded, or candidate routes whose maximum total score exceeds a predetermined value may be excluded.
- In the above embodiment, for the extracted candidate route, the score in the evaluation map and the score indicating the ease of movement of the user 150 with respect to the candidate route are adjusted so as to reduce the integral value. Alternatively, the scores may be adjusted so that the total of the score evaluated in S405 and the score on the candidate route in the evaluation map changes favorably; for example, so that a total score that rises sharply between adjacent passing candidate points is lowered, or so that the maximum value of the total score is lowered.
- In the above embodiment, one optimum route is determined in S500, but a route may instead be determined with a certain range in direction. In that case, a predetermined direction within that range (for example, its center) may be used.
- the optimal moving direction for the user 150 is presented to the user 150 by the movement of the moving body 100.
- However, the present invention is not limited to this; for example, the optimum movement direction may be presented by displaying it with an arrow or the like on the display of the display unit 103. Alternatively, a speaker may be provided in the HMI unit 40, and the optimum movement direction may be presented by audio.
- The prediction and evaluation of the field are not limited to the above method, and may be performed using other models.
- For example, the prediction and evaluation of the field may be performed using a human-flow model based on human particles or a human-flow model based on a velocity vector field.
- Examples of such models include the social force model, models similar to the social force model, and models that add predictive collision-avoidance behavior to them.
- Alternatively, the prediction and evaluation of the field may be performed using a decision-making model, such as one for destination selection, or a model focusing on the phenomenon called "cross flow," in which a plurality of human flows intersect.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Aviation & Aerospace Engineering (AREA)
- Traffic Control Systems (AREA)
- Manipulator (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Image Analysis (AREA)
Abstract
Description
150 User
200 Other persons present in the vicinity
300 Obstacles present in the vicinity
Claims (19)
- A moving body comprising: user detection means for detecting a situation of a user; peripheral situation detection means for detecting a situation of the surroundings; periphery prediction means for predicting, based on the surrounding situation detected by the peripheral situation detection means, movement of objects present in the surroundings as a future situation of the surroundings; determination means for determining a movement direction to be presented to the user, based on the future situation of the surroundings predicted by the periphery prediction means and the situation of the user detected by the user detection means; and presentation means for presenting to the user, in the front vicinity of the user, an orientation corresponding to the movement direction determined by the determination means.
- The moving body according to claim 1, wherein the determination means evaluates ease of movement of the user in the surroundings based on the future situation of the surroundings predicted by the periphery prediction means and the situation of the user detected by the user detection means, and determines the movement direction to be presented to the user based on the ease of movement.
- The moving body according to claim 1 or 2, wherein the determination means comprises first evaluation means for evaluating ease of movement of the user in the surroundings based on the future situation of the surroundings predicted by the periphery prediction means and the situation of the user detected by the user detection means, and determines the movement direction to be presented to the user based on the evaluation by the first evaluation means.
- The moving body according to any one of claims 1 to 3, wherein the determination means comprises second evaluation means for evaluating a flow of the objects in the surroundings based on the future situation of the surroundings predicted by the periphery prediction means, and determines the movement direction to be presented to the user based on the flow of the objects evaluated by the second evaluation means.
- The moving body according to any one of claims 1 to 4, further comprising movement control means for moving the moving body in the front vicinity of the user based on the situation of the user detected by the user detection means, wherein the presentation means presents the movement direction to the user by causing the movement control means to move the moving body in the front vicinity of the user so as to indicate an orientation corresponding to the movement direction determined by the determination means.
- The moving body according to any one of claims 1 to 5, wherein the determination means comprises candidate extraction means for extracting direction candidates for determining the movement direction to be presented to the user, based on the future situation of the surroundings predicted by the periphery prediction means, and determines the movement direction to be presented to the user based on the candidates extracted by the candidate extraction means.
- The moving body according to claim 6, wherein the candidate extraction means preferentially extracts, as the candidates, directions corresponding to flows that do not oppose the movement direction of the user, among the flows of the objects evaluated by the second evaluation means.
- The moving body according to claim 6 or 7, wherein the candidate extraction means preferentially extracts, as the candidates, directions corresponding to flows whose speed difference from the movement speed of the user is within a predetermined range, among the flows that do not oppose the movement direction of the user.
- The moving body according to any one of claims 1 to 8, wherein the periphery prediction means predicts, as the future situation of the surroundings and based on the surrounding situation detected by the peripheral situation detection means, the movement of the objects and a region whose future transition is uncertain due to blind spots caused by the objects present in the surroundings.
- The moving body according to any one of claims 1 to 9, wherein the periphery prediction means predicts, as the movement of an object, a region in which the object may be present after a predetermined prediction time.
- The moving body according to claim 10, wherein the predetermined prediction time is, for each of the objects present in the surroundings, a variable value according to the situation of that object.
- The moving body according to claim 10 or 11, wherein the periphery prediction means targets, among the objects, objects that relatively approach the user and the moving body, and predicts the region for each of the targeted objects, and the predetermined prediction time is a value based on a value corresponding to the distance between the user or the moving body and the object and on the relative speed of the object with respect to the user or the moving body.
- The moving body according to claim 12, wherein the predetermined prediction time is calculated as the time until, assuming that the user and the moving body on one side and the object on the other approach each other at the relative speed of the object with respect to the user or the moving body, a first region of a predetermined size provided around and encompassing the user and the moving body and a second region of a predetermined size provided around the object reach positions separated from each other by a predetermined grace distance.
- The moving body according to any one of claims 10 to 13, further comprising route estimation means for estimating, when the user moves, in the future situation of the surroundings predicted by the periphery prediction means, in an orientation corresponding to a direction extracted as a candidate by the candidate extraction means toward a target area determined according to the situation of the user detected by the user detection means, a route least affected by the objects until the target area is reached, wherein, when there are regions predicted by the periphery prediction means that sandwich the route estimated by the route estimation means, the first evaluation means evaluates the user as moving more easily the wider the width between the regions sandwiching the route.
- The moving body according to claim 14, wherein the first evaluation means evaluates the user as moving more easily when the width between the regions sandwiching the route is wider than the width of a first region of a predetermined size provided around and encompassing the user and the moving body than when it is narrower than the width of the first region.
- The moving body according to claim 14 or 15, wherein the width between the regions sandwiching the route is the length of the portion of the line segment connecting the centers of the regions that lies outside second regions each provided with a predetermined size around the object corresponding to each region.
- The moving body according to any one of claims 3 to 16, further comprising route estimation means for estimating, when the user moves, in the future situation of the surroundings predicted by the periphery prediction means, in an orientation corresponding to a direction extracted as a candidate by the candidate extraction means toward a target area determined according to the situation of the user detected by the user detection means, a route least affected by the objects until the target area is reached, wherein the first evaluation means evaluates the user as moving more easily along a route, among the routes estimated by the route estimation means, whose end point is closer to the position where the movement direction indicated by the situation of the user detected by the user detection means intersects the target area.
- The moving body according to any one of claims 3 to 17, further comprising route estimation means for estimating, when the user moves, in the future situation of the surroundings predicted by the periphery prediction means, in an orientation corresponding to a direction extracted as a candidate by the candidate extraction means toward a target area determined according to the situation of the user detected by the user detection means, a route least affected by the objects until the target area is reached, wherein the first evaluation means evaluates the user as moving more easily along a route, among the routes estimated by the route estimation means, having a smaller absolute value of angular change of the movement direction from the start point with respect to the movement direction indicated by the situation of the user detected by the user detection means.
- The moving body according to any one of claims 14 to 18, wherein the periphery prediction means predicts the movement of the objects using the movement speed of the user indicated by the situation of the user detected by the user detection means, and also predicts the movement of the objects using movement speeds different from the movement speed of the user; the candidate extraction means extracts, for each movement speed of the user, direction candidates for determining the movement direction to be presented to the user, based on the future situation of the surroundings including the movement of the objects predicted for each movement speed by the periphery prediction means; the route estimation means estimates, for each movement speed of the user, the route least affected by the objects; and the first evaluation means evaluates the user as moving more easily along a route, among the routes estimated for each movement speed of the user by the route estimation means, for which the movement speed of the user used to predict the movement of the objects is closer to the movement speed of the user indicated by the situation of the user detected by the user detection means.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016566145A JP6388141B2 (ja) | 2014-12-25 | 2015-12-16 | 移動体 |
CN201580068904.0A CN107111317B (zh) | 2014-12-25 | 2015-12-16 | 移动体 |
US15/539,936 US10331140B2 (en) | 2014-12-25 | 2015-12-16 | Moving body |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-262883 | 2014-12-25 | ||
JP2014262883 | 2014-12-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016104265A1 true WO2016104265A1 (ja) | 2016-06-30 |
Family
ID=56150290
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/085151 WO2016104265A1 (ja) | 2014-12-25 | 2015-12-16 | 移動体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10331140B2 (ja) |
JP (1) | JP6388141B2 (ja) |
CN (1) | CN107111317B (ja) |
WO (1) | WO2016104265A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019035362A1 (ja) * | 2017-08-18 | 2019-02-21 | 三菱電機株式会社 | ロボット制御装置およびこれを用いたロボットシステム |
WO2019244644A1 (ja) * | 2018-06-19 | 2019-12-26 | ソニー株式会社 | 移動体制御装置および移動体制御方法、並びにプログラム |
WO2022102078A1 (ja) * | 2020-11-13 | 2022-05-19 | 三菱電機株式会社 | 経路生成システム、経路生成方法、経路生成プログラム、及び自律移動体 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6894596B2 (ja) * | 2018-03-29 | 2021-06-30 | 株式会社エクォス・リサーチ | 移動体 |
EP3841358A4 (en) | 2018-08-21 | 2022-06-08 | Moonshot Health Inc. | SYSTEMS AND METHODS FOR MAPPING A GIVEN ENVIRONMENT |
CN114153200A (zh) * | 2018-10-26 | 2022-03-08 | 科沃斯机器人股份有限公司 | 轨迹预测、自移动设备控制方法 |
US20210072027A1 (en) * | 2019-09-09 | 2021-03-11 | Caci, Inc. - Federal | Systems and methods for providing localization and navigation services |
JP7487478B2 (ja) * | 2020-01-23 | 2024-05-21 | セイコーエプソン株式会社 | 移動ロボットの制御方法及び制御装置、並びに、ロボットシステム |
CN112880689A (zh) * | 2021-01-29 | 2021-06-01 | 北京百度网讯科技有限公司 | 一种领位方法、装置、电子设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008152504A (ja) * | 2006-12-18 | 2008-07-03 | Hitachi Ltd | 案内ロボット装置及び案内システム |
JP2008152600A (ja) * | 2006-12-19 | 2008-07-03 | Toyota Motor Corp | 移動経路作成方法、自律移動体および自律移動体制御システム |
JP2010055498A (ja) * | 2008-08-29 | 2010-03-11 | Hitachi Ltd | 自律移動ロボット装置及びかかる装置における飛び出し衝突回避方法 |
JP2010061293A (ja) * | 2008-09-02 | 2010-03-18 | Toyota Motor Corp | 経路探索装置、経路探索方法、及び経路探索プログラム |
JP2012203646A (ja) * | 2011-03-25 | 2012-10-22 | Advanced Telecommunication Research Institute International | 流れ状態判別装置、流れ状態判別方法、流れ状態判別プログラムおよびそれらを用いたロボット制御システム |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10301468B4 (de) * | 2002-01-18 | 2010-08-05 | Honda Giken Kogyo K.K. | Vorrichtung zur Beobachtung der Umgebung eines Fahrzeugs |
JP3987048B2 (ja) * | 2003-03-20 | 2007-10-03 | 本田技研工業株式会社 | 車両周辺監視装置 |
JP4055656B2 (ja) * | 2003-05-30 | 2008-03-05 | トヨタ自動車株式会社 | 衝突予測装置 |
US7117090B2 (en) * | 2003-08-28 | 2006-10-03 | Siemens Aktiengesellschaft | Moving vehicle in cuboidal panorama |
JP3987057B2 (ja) * | 2004-06-14 | 2007-10-03 | 本田技研工業株式会社 | 車両周辺監視装置 |
US7899211B2 (en) * | 2005-12-07 | 2011-03-01 | Nissan Motor Co., Ltd. | Object detecting system and object detecting method |
CN101583842B (zh) * | 2006-12-05 | 2011-11-16 | 株式会社纳维泰 | 导航系统、便携终端装置及周边图像显示方法 |
JP4371153B2 (ja) | 2007-06-15 | 2009-11-25 | トヨタ自動車株式会社 | 自律移動装置 |
CN100524363C (zh) * | 2007-06-29 | 2009-08-05 | 中国科学院计算技术研究所 | 一种用于动态实体的分层避障方法 |
JP2009151382A (ja) | 2007-12-18 | 2009-07-09 | Toyota Motor Corp | 移動体 |
JP5581770B2 (ja) * | 2010-03-26 | 2014-09-03 | ソニー株式会社 | ロボット装置及びロボット装置による情報提供方法 |
JP5971341B2 (ja) * | 2012-08-09 | 2016-08-17 | トヨタ自動車株式会社 | 物体検出装置及び運転支援装置 |
US20150336579A1 (en) * | 2012-11-08 | 2015-11-26 | Toyota Jidosha Kabushiki Kaisha | Drive assist device and method, collision prediction device and method, and alerting device and method |
CN103823466B (zh) * | 2013-05-23 | 2016-08-10 | 电子科技大学 | 一种动态环境下移动机器人路径规划方法 |
KR20150055271A (ko) * | 2013-11-13 | 2015-05-21 | 현대모비스 주식회사 | 타겟의 운동 특성 판단 장치 및 이를 구비하는 주행 경로 제어 장치 |
-
2015
- 2015-12-16 US US15/539,936 patent/US10331140B2/en active Active
- 2015-12-16 JP JP2016566145A patent/JP6388141B2/ja active Active
- 2015-12-16 WO PCT/JP2015/085151 patent/WO2016104265A1/ja active Application Filing
- 2015-12-16 CN CN201580068904.0A patent/CN107111317B/zh active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008152504A (ja) * | 2006-12-18 | 2008-07-03 | Hitachi Ltd | 案内ロボット装置及び案内システム |
JP2008152600A (ja) * | 2006-12-19 | 2008-07-03 | Toyota Motor Corp | 移動経路作成方法、自律移動体および自律移動体制御システム |
JP2010055498A (ja) * | 2008-08-29 | 2010-03-11 | Hitachi Ltd | 自律移動ロボット装置及びかかる装置における飛び出し衝突回避方法 |
JP2010061293A (ja) * | 2008-09-02 | 2010-03-18 | Toyota Motor Corp | 経路探索装置、経路探索方法、及び経路探索プログラム |
JP2012203646A (ja) * | 2011-03-25 | 2012-10-22 | Advanced Telecommunication Research Institute International | 流れ状態判別装置、流れ状態判別方法、流れ状態判別プログラムおよびそれらを用いたロボット制御システム |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019035362A1 (ja) * | 2017-08-18 | 2019-02-21 | 三菱電機株式会社 | ロボット制御装置およびこれを用いたロボットシステム |
JPWO2019035362A1 (ja) * | 2017-08-18 | 2019-11-07 | 三菱電機株式会社 | ロボット制御装置およびこれを用いたロボットシステム |
JP2019206080A (ja) * | 2017-08-18 | 2019-12-05 | 三菱電機株式会社 | ロボット制御装置およびこれを用いたロボットシステム |
WO2019244644A1 (ja) * | 2018-06-19 | 2019-12-26 | ソニー株式会社 | 移動体制御装置および移動体制御方法、並びにプログラム |
US11526172B2 (en) | 2018-06-19 | 2022-12-13 | Sony Corporation | Mobile object control apparatus and mobile object control method |
WO2022102078A1 (ja) * | 2020-11-13 | 2022-05-19 | 三菱電機株式会社 | 経路生成システム、経路生成方法、経路生成プログラム、及び自律移動体 |
Also Published As
Publication number | Publication date |
---|---|
US10331140B2 (en) | 2019-06-25 |
CN107111317A (zh) | 2017-08-29 |
US20170357264A1 (en) | 2017-12-14 |
JP6388141B2 (ja) | 2018-09-12 |
CN107111317B (zh) | 2020-07-17 |
JPWO2016104265A1 (ja) | 2017-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6388141B2 (ja) | 移動体 | |
US11001196B1 (en) | Systems and methods for communicating a machine intent | |
CN108537326B (zh) | 用于自动驾驶车辆的方法、介质和系统 | |
CN108303972B (zh) | 移动机器人的交互方法及装置 | |
AU2015262344B2 (en) | Processing apparatus, processing system, processing program, and processing method | |
US20180150701A1 (en) | Method and apparatus for determining abnormal object | |
JP5768273B2 (ja) | 歩行者の軌跡を予測して自己の回避行動を決定するロボット | |
WO2013046563A1 (ja) | 自律移動装置、自律移動方法、及び、自律移動装置用のプログラム | |
JPWO2016098238A1 (ja) | 走行制御装置 | |
JP6959056B2 (ja) | 移動ロボットの制御装置と制御方法 | |
JP2008065755A (ja) | 移動装置 | |
US9607230B2 (en) | Mobile object control apparatus and target object detecting apparatus | |
JP5577126B2 (ja) | 走行支援装置 | |
JP2016184337A (ja) | 移動体 | |
Pundlik et al. | Collision detection for visually impaired from a body-mounted camera | |
KR20150076757A (ko) | 이동 수단의 집단 제어를 위한 장치 및 방법 | |
JP2008310440A (ja) | 歩行者検出装置 | |
Kenk et al. | Human-aware Robot Navigation in Logistics Warehouses. | |
JP2022008854A (ja) | 制御装置 | |
KR20180097962A (ko) | 영상 분석 기반 상황 판단형 가이드 장치 및 방법 | |
US11756418B2 (en) | Device, method, and storage medium | |
Vorapatratorn et al. | Fast obstacle detection system for the blind using depth image and machine learning. | |
JP2006092253A (ja) | 自律移動装置 | |
Vasconcelos et al. | Socially acceptable robot navigation in the presence of humans | |
Ghandour et al. | Interactive collision avoidance system for indoor mobile robots based on human-robot interaction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15872834 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016566145 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15539936 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15872834 Country of ref document: EP Kind code of ref document: A1 |