CN112136090A - Autonomous moving device - Google Patents


Info

Publication number: CN112136090A (application CN201980033434.2A)
Authority: CN (China)
Prior art keywords: entry, area, behavior, restricted, time
Prior art date: 2018-06-06
Legal status: Granted
Application number: CN201980033434.2A
Other languages: Chinese (zh)
Other versions: CN112136090B (en)
Inventors: 上田泰士, 中村亮介, 网野梓, 山本晃弘
Current Assignee: Hitachi Ltd
Original Assignee: Hitachi Ltd
Priority date: 2018-06-06
Filing date: 2019-02-28
Publication date: 2020-12-25
Application filed by Hitachi Ltd
Publication of CN112136090A
Application granted
Publication of CN112136090B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention provides an autonomous moving apparatus that can execute an appropriate behavior when it enters an entry-restricted area, for each such area. The autonomous moving apparatus (1) estimates its own position and determines a travel route based on area information indicating the positions of obstacles and on obstacle detection information, and includes: an entry-restricted area setting unit (14) that sets entry-restricted areas in the area information; an entry-time behavior setting unit (15) that sets, for each entry-restricted area, the behavior to be taken when that area is entered; and a travel control unit (11) that determines whether the self position is included in any entry-restricted area, controls travel so as not to enter an entry-restricted area when the self position is not included in one, and controls travel in accordance with the entry-time behavior of the entry-restricted area when the self position is included in it.

Description

Autonomous moving device
Technical Field
The present invention relates to an autonomous moving apparatus.
Background
There are autonomous moving apparatuses such as service robots, automated guided vehicles, and unmanned aerial vehicles that autonomously move to a determined position based on map information generated from information on the position and shape of an obstacle such as a wall. The autonomous moving apparatus detects surrounding obstacles with a laser scanner or the like mounted on the autonomous moving apparatus, compares information of the obstacles with map information, identifies a current position, and selects a travel route.
However, even where no obstacle exists, areas with level differences (steps) or slippery road surfaces are unsuitable for travel by the autonomous moving apparatus. In addition, when there is an obstacle that the mounted laser scanner cannot detect, the current position may be recognized erroneously.
Therefore, in the autonomous moving apparatus described in patent document 1, areas unsuitable for travel and areas containing obstacles that the laser scanner cannot detect are set as entry-prohibited areas into which the autonomous moving apparatus must not enter. The current position of the autonomous moving apparatus is then identified based on an environment map indicating the obstacle areas, and its travel route is planned based on both the entry-prohibited area map and the environment map.
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open Publication No. 2011-128899
Disclosure of Invention
Problems to be solved by the invention
The autonomous moving apparatus of patent document 1 can move autonomously without traveling through the entry-prohibited areas or the obstacle areas.
However, depending on the configuration of the autonomous moving apparatus, it may enter an entry-prohibited area because of an external disturbance such as wheel slip or a gust of wind. In particular, in a complicated indoor environment the drivable area is narrow, so the travel route passes close to entry-prohibited areas and the frequency of entering them increases. Once it has entered an entry-prohibited area, the autonomous moving apparatus can no longer travel.
When the autonomous moving apparatus enters an entry-prohibited area, the appropriate countermeasure differs depending on the situation of that area. For example, when the road surface of the area is slippery and there is a risk that travel control will fail, for example by tipping over, travel must be stopped immediately. On the other hand, when the area contains a step and the apparatus has entered only a short distance, it may still be able to leave the area. Furthermore, when the outside of the service area of the autonomous moving apparatus is set as an entry-prohibited area, the apparatus needs to return to the service area.
An object of the present invention is to provide an autonomous moving apparatus that both avoids entering restricted areas and performs an appropriate behavior when it does enter one (hereinafter referred to as an entry-time behavior).
Hereinafter, in this specification, an area in which the behavior upon entry is restricted is referred to as an entry-restricted area. The entry-prohibited area of patent document 1 can be regarded as one kind of entry-restricted area whose restriction is to stop upon entry.
Means for solving the problems
In order to solve the above problem, an autonomous moving apparatus according to the present invention is an autonomous moving apparatus capable of estimating its own position and determining a travel route based on area information indicating a position of an obstacle and detection information of the obstacle, the autonomous moving apparatus including: an entry restriction area setting unit that sets an entry restriction area in the area information; an entry-time behavior setting unit that sets, for each entry-restricted area, a behavior when the entry-restricted area is entered; and a travel control unit that determines whether or not the self position is included in any restricted entry area, controls travel so as not to enter the restricted entry area when the self position is not included in the restricted entry area, and controls travel according to the behavior at the time of entry of the restricted entry area when the self position is included in the restricted entry area.
Advantageous Effects of Invention
According to the autonomous moving apparatus of the present invention, when it enters an entry-restricted area it executes an appropriate behavior set individually for that area, so the usability of the autonomous moving apparatus can be improved.
Drawings
Fig. 1 is a diagram showing a control module structure of an autonomous moving apparatus according to an embodiment.
Fig. 2 is an external view showing the front and side surfaces of the autonomous moving apparatus according to the embodiment.
Fig. 3 is a diagram showing an example of an area in which the autonomous moving apparatus autonomously travels.
Fig. 4 is a diagram showing a part of the area information.
Fig. 5 is a diagram showing display color management information.
Fig. 6 is a diagram showing behavior management information at the time of entry.
Fig. 7 is a diagram showing behavior information at the time of entry.
Fig. 8 is a diagram illustrating an operation flow of the autonomous moving apparatus.
Fig. 9 is a diagram showing a control module structure of an autonomous moving apparatus according to another embodiment.
Fig. 10 is a diagram showing another example of the area in which the autonomous moving apparatus autonomously travels.
Fig. 11 is a diagram showing an example of the structure of scene information.
Fig. 12 is a diagram illustrating another operation flow of the autonomous moving apparatus.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
First embodiment
Fig. 1 is a diagram showing a control module configuration of an autonomous moving apparatus 1 according to an embodiment.
The autonomous moving apparatus 1 includes a mobile device 10 that autonomously travels to prescribed positions in an area, and an operation setting device 19 that sets operating conditions and operation commands for the mobile device 10.
More specifically, the operation setting device 19 includes: a command input unit 13 for inputting commands such as a target position and a travel speed to the mobile device 10; an entry restriction area setting unit 14 for setting areas in which entry of the mobile device 10 is restricted; and an entry-time behavior setting unit 15 for setting the behavior that the mobile device 10 should take when it enters an entry-restricted area.
The mobile device 10 includes: a travel control unit 11 that controls movement of the mobile device 10; a moving unit 12 that moves the mobile device 10 in accordance with signals from the travel control unit 11; and a movement state recognition unit 16 that recognizes the movement state, such as speed, based on the state of the surrounding environment and information obtained from the moving unit 12.
For example, the movement state recognition unit 16 acquires distance information between the mobile device 10 and surrounding structures using a laser scanner such as LiDAR (Light Detection and Ranging).
The travel control unit 11 includes: a map generating unit 111 that generates the map referred to when identifying the position of the mobile device 10; a storage unit 112 that stores the commands obtained from the command input unit 13, the information on the entry-restricted areas obtained from the entry restriction area setting unit 14, the entry-time behavior settings for each entry-restricted area obtained from the entry-time behavior setting unit 15, and the map information; a self-position estimating unit 113 that estimates where the apparatus itself is located in the map based on the movement state obtained from the movement state recognition unit 16; and a movement control unit 114 that calculates command signals for driving the moving unit 12 and transmits them to the moving unit 12.
Fig. 2 is an external view showing the front and side surfaces of the autonomous moving apparatus 1 according to the embodiment. The autonomous moving apparatus 1 according to the embodiment can detect the position and the movement of an obstacle using a mounted LiDAR, calculate a path for avoiding a collision, and autonomously move to a target location.
A tablet control device 18, which notifies the surroundings of the device's own state and also functions as the operation setting device 19 (see fig. 1), is provided on the head of the mobile device 10. The travel control unit 11 and the movement state recognition unit 16 including LiDAR (see fig. 1) are built into the trunk of the mobile device 10, and a pair of differentially driven wheels and one free wheel are provided in the foot section as the moving unit 12.
In the autonomous moving apparatus 1 of fig. 2, the operation setting device 19 is shown as being mounted as a function of the tablet control device 18, but the operation setting device 19 may be a notebook computer 17, not shown, wirelessly connected to the moving apparatus 10.
That is, the user of the autonomous moving apparatus 1 can also remotely operate the autonomous moving apparatus 1 with the notebook computer 17, and can also directly operate the moving apparatus 10 with the tablet control apparatus 18 mounted in the autonomous moving apparatus 1.
Hereinafter, the entry restriction area of the autonomous moving apparatus 1 and the method of setting behavior at the time of entry will be described in detail.
Fig. 3 is a diagram showing an example of an area in which the mobile device 10 autonomously travels.
The mobile device 10 estimates its own position based on the position of the wall detected by the mounted LiDAR, and autonomously travels in a travel area surrounded by the wall indicated by a thick line.
A chair is placed in the travel area. Around the chair, an entry-restricted area 302 whose entry-time behavior is to stop is set, sized to account for the space occupied when a person sits down; around the entry-restricted area 302, an entry-restricted area 301 whose entry-time behavior is to move backward along the entry route is set.
The mobile device 10 plans its travel route so as not to enter the entry-restricted area 301, and when it has entered the area 301 because of an external disturbance, it controls travel so as to leave the area.
When the mobile device 10 enters the entry-restricted area 301 and then further enters the entry-restricted area 302, it stops. This can happen, for example, when the mobile device 10 is trying to leave the entry-restricted area 301 but detects an approaching obstacle, such as a person or another mobile device 10, and performs an avoidance operation.
In this way, by setting entry-restricted areas in stages around an object that the mobile device 10 should not approach, the mobile device 10 can be made to behave differently depending on how closely it has approached the object.
In addition, cones are placed at the four corners of a section of floor in the travel area where wax has just been applied. The area surrounded by the cones is set as the entry-restricted area 305.
When the mobile device 10 enters the entry-restricted area 305, it stops. This minimizes damage such as peeling of the freshly applied wax.
Further, another entry-restricted area whose entry-time behavior is to move backward along the entry route may be provided so as to surround the entry-restricted area 305.
This further reduces the chance of entering the waxed area.
Further, in the travel area on the left side of fig. 3, there is a step just outside the travel area. To keep the mobile device 10 away from the step, an entry-restricted area 304 with its own entry-time behavior is provided.
Because there is a risk of falling off the step, the mobile device 10 leaves the entry-restricted area 304 quickly once it has entered it.
Specifically, when the mobile device 10 enters while moving in the vertical direction of fig. 3 (across the front of the step), it continues in its entry direction so as to pass quickly through the entry-restricted area 304 and then exits it. When it enters while moving toward the step, it leaves in the direction opposite to its entry direction. That is, in the entry-restricted area 304, the entry-time behavior is set to differ depending on the entry direction.
Of course, the mobile device 10 may instead be made to move backward along its entry route and leave the entry-restricted area 304 regardless of the entry direction.
Since the mobile device 10 cannot travel on the step, the step area itself can be regarded as an entry-restricted area whose entry-time behavior is to stop, just like the entry-restricted area 302. In this example, too, entry-restricted areas are therefore set in stages around an object that the mobile device 10 should not approach.
In the travel area on the left side of fig. 3, an entry-restricted area 303 whose entry-time behavior is to move backward along the entry route is set at the connection between the service area, which is the travel area of the autonomous moving apparatus 1, and the outside of the service area.
When the mobile device 10 enters the entry-restricted area 303, it moves backward along its entry route so as to return to the service area.
In addition, in the entry-restricted area 303 the mobile device 10 also issues an alarm to notify the user that it is leaving the service area. The alarm is issued by the tablet control device 18 of the mobile device 10 (see fig. 2).
Such a situation can occur, for example, when the autonomous moving apparatus 1 is moving while accompanying a user or when it is pushed from behind.
In this example, since the mobile device 10 does not travel autonomously outside the service area, the area outside the service area can also be regarded as an entry-restricted area whose entry-time behavior is to stop. Here, too, entry-restricted areas are set in stages around a region that the mobile device 10 should not approach.
In the above description, entry-restricted areas are set in stages around objects that the mobile device 10 should not approach. Alternatively, the autonomous moving apparatus 1 of the embodiment may be configured so that an entry-avoidance area, which controls the behavior of the mobile device 10 so that it does not enter the entry-prohibited area of patent document 1, is provided on the approach side of that entry-prohibited area, and autonomous travel of the mobile device 10 is controlled accordingly.
Next, a method of controlling the movement of the mobile device 10 by setting entry-restricted areas and setting their entry-time behaviors will be described.
In the autonomous moving apparatus 1 of the embodiment, the travel area of the mobile device 10 is divided into rectangular cells of a predetermined size, which are managed as two-dimensional area information and used for planning the travel route.
Fig. 4 is a diagram showing area information of the area 309 in fig. 3.
In the information corresponding to each rectangular cell of the area information, an identifier indicating the type of the cell (obstacle, travel area, entry-restricted area, entry-prohibited area) is recorded.
For example, obstacles such as walls, chairs, and cones are given the identifier "255", and the travel area is given the identifier "0". In fig. 4, the cells corresponding to the inside of the chair have the identifier "0", but they may instead be given the identifier "255" because they are surrounded by obstacles and cannot be traveled.
Then, each entry-restricted area is assigned an identifier that specifies the area, and the identifier is recorded in the area information at the position of the area. In fig. 4, the entry-restricted area 301 is assigned the identifier "80" and the entry-restricted area 302 is assigned the identifier "200".
In this case, the numerical range of the identifiers may be limited for each type of region, so that the type of a cell can be identified from the numerical range of its identifier.
Further, it is preferable to assign different identifiers to obstacles such as walls, chairs, and cones so that obstacles that are easy for the mounted LiDAR to detect can be distinguished from obstacles that are difficult to detect. In the self-position estimation described later (step S205 in fig. 8), this improves the accuracy of matching the positions of easily detected obstacles in the area information against the LiDAR detection results.
The autonomous moving apparatus 1 of the embodiment plans the travel route of the mobile device 10 on a map, and fig. 4 shows a part of the area information associated with the map used for route planning. Alternatively, the area information itself may be used as the map.
The area information is stored in the storage unit 112 (see fig. 1).
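To make the layout of the area information concrete, the following is a minimal sketch in Python (not taken from the patent) that stores the area information as a two-dimensional grid of integer identifiers and classifies each cell according to the conventions assumed from the text above: 0 for the travel area, 255 for obstacles, and intermediate values such as 80 and 200 for entry-restricted areas. The grid contents are illustrative.

```python
from enum import Enum

class CellType(Enum):
    TRAVEL = "travel area"
    RESTRICTED = "entry-restricted area"
    OBSTACLE = "obstacle"

def classify(identifier: int) -> CellType:
    # Assumed identifier conventions: 0 = travel area, 255 = obstacle,
    # any other value = an entry-restricted area (e.g. 80, 200).
    if identifier == 0:
        return CellType.TRAVEL
    if identifier == 255:
        return CellType.OBSTACLE
    return CellType.RESTRICTED

# A small patch of area information, roughly in the spirit of fig. 4
# (cell values are assumptions, not the actual figure data).
area_info = [
    [255, 255, 255, 255, 255],   # wall
    [255,  80,  80,  80,   0],   # outer entry-restricted area 301 (identifier 80)
    [255,  80, 200,  80,   0],   # inner entry-restricted area 302 (identifier 200)
    [255,  80,  80,  80,   0],
    [  0,   0,   0,   0,   0],   # ordinary travel area
]

if __name__ == "__main__":
    for row in area_info:
        print(" ".join(f"{classify(v).name:10}" for v in row))
```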
In the autonomous moving apparatus 1 of the embodiment, the entry restriction area setting unit 14 sets entry-restricted areas, and the entry-time behavior setting unit 15 sets their entry-time behaviors, through the GUI (Graphical User Interface) of the operation setting device 19 (the tablet control device 18 or the notebook computer 17). For this purpose, display color management information that associates a display color with each identifier of a designated area is prepared, as shown in fig. 5. The display color management information is stored in the storage unit 112 (see fig. 1).
In this case, if the hues used for each type of region belong to the same color family, the regions become easier to distinguish on the display.
The entry restriction area setting unit 14 sets the area designated by a drag operation on the screen displaying the area information as an entry-restricted area. For example, in fig. 3, when a rectangular area is designated by dragging so as to surround the chair displayed as an obstacle, the entry restriction area setting unit 14 registers the designated rectangle as an entry-restricted area in the area information.
More specifically, when a rectangular area is designated, the entry restriction area setting unit 14 generates an identifier that specifies the area and records the generated identifier in the cells of the area information corresponding to the designated rectangle. In this way, the entry restriction area setting unit 14 sets an entry-restricted area in the area information.
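The registration step can be pictured as writing the newly generated identifier into every cell of the designated rectangle. The helper below is a hypothetical illustration of that step, with row and column indices standing in for the dragged screen coordinates; it is not the patent's actual implementation.

```python
from typing import List

def set_restricted_area(area_info: List[List[int]],
                        top: int, left: int, bottom: int, right: int,
                        identifier: int) -> None:
    """Record the identifier in every cell of the designated rectangle."""
    for r in range(top, bottom + 1):
        for c in range(left, right + 1):
            area_info[r][c] = identifier

# Example: register a 3 x 3 entry-restricted area with identifier 80.
grid = [[0] * 6 for _ in range(6)]
set_restricted_area(grid, top=1, left=1, bottom=3, right=3, identifier=80)
```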
When a set entry-restricted area is right-clicked, the entry-time behavior setting unit 15 opens a pop-up window for setting the behavior to be taken when the mobile device 10 enters that area.
At this time, in order to associate the entry-time behavior set in the pop-up window with the entry-restricted area, the identifier of the entry-time behavior is recorded together with the identifier of the entry-restricted area in the entry-time behavior management information shown in fig. 6.
The entry-time behavior management information is stored in the storage unit 112 (see fig. 1).
In the pop-up window, the entry-time behavior of the right-clicked entry-restricted area is set as a combination of the type of action after entry, which is one of "stop on the spot", "move backward along the entry route", and "pass through in the entry direction", and the type of alarm, which is one of "warning display", "warning sound", and "no alarm".
Note that when the mobile device 10 enters an entry-restricted area whose entry-time behavior is set to "stop on the spot", it stops after entering and autonomous travel ends there; it is therefore preferable to set "warning display" as the alarm.
The behavior at entry set in the pop-up window is recorded in the behavior at entry information shown in fig. 7 for each entry restriction area.
The entry behavior information is stored in the storage unit 112 (see fig. 1).
That is, in the entry-time behavior information of fig. 7, a "type of action" and a "type of alarm" are recorded for each entry-time behavior identifier as the behavior to be taken when the mobile device 10 enters an entry-restricted area. The entry-time behavior is not limited to a "type of action" and a "type of alarm"; other actions may be added.
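The relationship between the two tables can be illustrated with the following sketch. The concrete identifiers ("A", "B", "C", 80, 200, 82) and field names are assumptions rather than values taken from figs. 6 and 7: the management information maps an entry-restricted-area identifier to a behavior identifier, and the behavior information maps that behavior identifier to a type of action and a type of alarm.

```python
from dataclasses import dataclass

@dataclass
class EntryBehavior:
    action: str   # e.g. "stop on the spot", "move backward along the entry route",
                  # or "pass in the entry direction"
    alarm: str    # e.g. "warning display", "warning sound", or "no alarm"

# Entry-time behavior information (in the spirit of fig. 7):
# behavior identifier -> entry-time behavior.
behavior_info = {
    "A": EntryBehavior(action="stop on the spot", alarm="warning display"),
    "B": EntryBehavior(action="move backward along the entry route", alarm="no alarm"),
    "C": EntryBehavior(action="pass in the entry direction", alarm="warning sound"),
}

# Entry-time behavior management information (in the spirit of fig. 6):
# entry-restricted-area identifier -> behavior identifier.
behavior_management = {80: "B", 200: "A", 82: "A"}

def lookup_entry_behavior(area_identifier: int) -> EntryBehavior:
    """Two-step lookup: area identifier -> behavior identifier -> behavior."""
    return behavior_info[behavior_management[area_identifier]]

print(lookup_entry_behavior(80).action)  # -> "move backward along the entry route"
```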
Next, an operation flow of the autonomous moving apparatus 1 will be described with reference to fig. 8.
First, in step S201, the autonomous moving apparatus 1 uses the map generating unit 111 to generate a map of the travel area, including the arrangement of obstacles in the area where the mobile device 10 travels, and stores the generated map in the storage unit 112. The autonomous moving apparatus 1 estimates the position of the mobile device 10 based on the map and the obstacle position information detected by the LiDAR of the movement state recognition unit 16, and plans the travel route. In the autonomous moving apparatus 1 of the embodiment, the map on which the mobile device 10 travels and the area information are the same.
In step S202, the autonomous moving apparatus 1 sets an entry restricted area in the map (area information) stored in the storage unit 112 by the entry restricted area setting unit 14. At this time, an identifier is assigned for each set restricted entry area.
In step S203, the autonomous moving apparatus 1 sets the behavior at the time of entering the area for each set entry restriction area by the entry-time behavior setting unit 15. The set entry behavior is stored in the storage unit 112 as entry behavior management information so as to be specified by the identifier of the restricted entry area.
Through the above steps, the initial setting process of the autonomous moving apparatus 1 is ended.
In step S204, the autonomous moving apparatus 1 waits for the command input unit 13 to receive the target position for autonomous travel of the mobile device 10 in the map (area information) and a movement command from the user. When the movement command is set, the process proceeds to step S205.
In step S205, the self-position estimating unit 113 of the autonomous moving apparatus 1 compares the position information of obstacles such as walls obtained by the movement state recognition unit 16 (LiDAR) with the map (area information) in the storage unit 112, and estimates the position of the mobile device 10 on the map.
Specifically, the self position is estimated by matching the positions of the identifiers indicating easily detected obstacles in the map (area information) against the detection results of the movement state recognition unit 16 (LiDAR).
Further, since the size of the rectangular cells of the map (area information) is known in advance, the position of the mobile device 10 in the actual environment can also be estimated.
In step S206, the autonomous moving apparatus 1 determines whether an end condition of autonomous travel of the mobile device 10 is satisfied. The end condition is any one of the following: the position of the mobile device 10 in the map (area information) estimated in step S205 matches the target position set in step S204; a stop command is received from the command input unit 13; or, as described later, the mobile device 10 has entered an entry-restricted area whose entry-time behavior is to stop.
In step S206, when the mobile device 10 satisfies the termination condition (yes in S206), the autonomous mobile device 1 terminates the operation of autonomous traveling of the mobile device 10. If the end condition is not satisfied (no in S206), the process proceeds to step S207.
In step S206, instead of the position comparison in the map (area information), it may be determined whether or not the distance between the target position in the actual environment of autonomous travel of the mobile device 10 and the position in the actual environment obtained by the position estimation is equal to or less than a predetermined value.
In step S207, the autonomous moving apparatus 1 determines whether the mobile device 10 is inside an entry-restricted area, by checking whether the identifier recorded in the map (area information) at the position estimated in step S205 is one assigned to an entry-restricted area.
If it is, the mobile device 10 is judged to have entered that entry-restricted area (yes in S207) and the process proceeds to step S209. If it is not, the mobile device 10 is judged not to have entered an entry-restricted area (no in S207) and the process proceeds to step S208.
In step S208, the autonomous moving apparatus 1 plans the travel route of the moving apparatus 10 with reference to the map (area information), and controls the moving unit 12 by the movement control unit 114 to cause the moving apparatus 10 to travel to the target position.
In this case, since the travel route is planned with reference to a map (area information) in which an obstacle and an entry restriction area are set, the travel of the mobile device 10 is performed while avoiding entry into the entry restriction area.
Then, the process returns to step S205, and the autonomous moving apparatus 1 continues the autonomous travel process.
In step S209, the autonomous moving apparatus 1 uses the identifier of the position of the mobile apparatus 10 estimated in step S205 in the map (area information) as the identifier of the entry restriction area, and acquires the identifier of the entry-time behavior with reference to the entry-time behavior management information described with reference to fig. 6. Then, the entry-time behavior corresponding to the identifier of the entry-time behavior is acquired with reference to the entry-time behavior information described with reference to fig. 7.
Then, in step S210, the autonomous moving apparatus 1 executes the entry-time behavior acquired in step S209.
After the autonomous moving apparatus 1 performs the entry behavior, it returns to step S205 to continue the process of autonomous traveling.
In parallel with the autonomous travel process based on the map (area information), the autonomous moving apparatus 1 controls an avoidance operation of the mobile device 10 when the movement state recognition unit 16 (LiDAR) detects the approach of a person or another autonomous moving apparatus.
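The loop of steps S205 to S210 can be condensed into the following sketch. The position estimation, route planning, behavior execution and stop-request checks are passed in as placeholder callables, and each behavior entry is assumed to be a record with an "action" field; this is a summary of the flow described above, not the patent's implementation.

```python
def autonomous_travel(area_info, behavior_management, behavior_info,
                      estimate_self_position, plan_and_move, execute_behavior,
                      target_cell, stop_requested):
    while True:
        row, col = estimate_self_position()              # S205: self-position estimation
        identifier = area_info[row][col]

        # S206: end conditions - target reached or stop command received.
        if (row, col) == target_cell or stop_requested():
            return

        if identifier not in behavior_management:        # S207: outside every restricted area
            # S208: plan a route that avoids entry-restricted areas and move.
            plan_and_move(area_info, (row, col), target_cell)
            continue

        # S209: look up the entry-time behavior set for this restricted area.
        behavior = behavior_info[behavior_management[identifier]]
        execute_behavior(behavior)                       # S210: execute the behavior
        if behavior["action"] == "stop on the spot":     # remaining end condition of S206
            return
```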
In this embodiment, setting entry-restricted areas through the GUI has been described with reference to figs. 3 and 4: an identifier that specifies each entry-restricted area is assigned, recorded in the area information at the position of the area, and displayed on the screen in the display color corresponding to that identifier. However, when many entry-restricted areas are set, the display screen can become cluttered.
In that case, all entry-restricted areas may be displayed in the same color.
Alternatively, an entry-restricted area may be expressed as a polygon, with only the pixels corresponding to the positions of the polygon's vertices displayed and an identifier shown to identify the area.
Since the mobile device 10 travels based on the area information, entry determination for the entry-restricted areas is performed regardless of the screen display, so there is no problem even if the display of the entry-restricted areas is simplified.
According to the autonomous moving apparatus 1 of the embodiment, entry-restricted areas with individually set entry-time behaviors are provided and the autonomous travel of the mobile device 10 is controlled accordingly, so unnecessary stops can be prevented by executing an appropriate behavior at the time of entry, and the usability of the mobile device 10 is improved.
Further, by providing a GUI that displays the area information together with the map used to control autonomous travel, the entry-restricted areas and their entry-time behaviors can be set easily, which improves the operability of the autonomous moving apparatus 1.
Second embodiment
In the present embodiment, an autonomous moving apparatus 1 is described that sets a plurality of entry-time behaviors for each entry-restricted area and selects and executes one of them based on predetermined conditions.
For example, in the entry-restricted area 304 of fig. 3, either the entry-time behavior of passing through in the travel direction or the entry-time behavior of moving backward to leave is selected depending on the entry direction; the present embodiment realizes this kind of selection.
In this specification, the set of entry-time behavior settings and their execution conditions defined for each entry-restricted area is referred to as a scene.
Fig. 9 is a diagram showing a control module configuration of the autonomous moving apparatus 1 according to the present embodiment.
Compared with the control module configuration of the autonomous moving apparatus 1 in fig. 1, it differs in that the operation setting device 19 includes a scene management unit 21 instead of the entry-time behavior setting unit 15, and the travel control unit 11 of the mobile device 10 additionally includes a state management unit 115.
The scene management unit 21 and the state management unit 115 added to fig. 9 will be described in detail below.
In fig. 9, the same reference numerals as those in fig. 1 denote common control modules, and the description thereof will be omitted.
The state management unit 115 added to the travel control unit 11 notifies the command determination unit 212 of the scene management unit 21 of the state of the mobile device 10 according to the state of the movement control unit 114. More specifically, the state management unit 115 notifies the command determination unit 212 of whether the mobile device 10 has entered an entry-restricted area, the obstacle position information obtained from the movement state recognition unit 16, and the travel information (position, speed, direction, and so on) of the mobile device 10. Further, the state management unit 115 notifies the command determination unit 212 of the command currently being executed by the travel control unit 11.
The scene storage unit 211 of the scene management unit 21 stores the scene information described in fig. 11 and the behavior at entry time information (see fig. 7) in which the behavior is recorded for each identifier for identifying the behavior at entry time.
The command determination unit 212 of the scene management unit 21 refers to the scene storage unit 211, determines an operation command based on the state of the mobile device 10 obtained from the state management unit 115, and thereby sets the behavior of the mobile device 10 in the entry-restricted area.
Next, with reference to fig. 10 and 11, the operation of the scene management unit 21 will be described.
Fig. 10 is a diagram showing an example of a travel area in which the mobile device 10 of the present embodiment autonomously travels.
In the travel area there are chairs, around which the entry-restricted area 301 is set. The identifier of the entry-restricted area 301 is "80".
In addition, cones are placed at the four corners of a section of floor in the travel area where wax has just been applied. The area surrounded by the cones is set as the entry-restricted area 305 (identifier "82").
A step is provided outside the travel area, and an entry restriction area 304 is set in front of the step (the identifier is "83").
In addition, an entry restriction area 303 is set in a connection between a service area, which is a travel area of the autonomous moving apparatus 1, and outside the service area (the identifier is "81").
Fig. 11 is a diagram showing an example of the configuration of scene information stored in the scene storage unit 211.
The scene information is an extension of the entry-time behavior management information explained in fig. 6. Specifically, for each "identifier of the entry-restricted area" that specifies an entry-restricted area, the scene information holds an "identifier of the predetermined entry-time behavior", which is the behavior executed when the execution condition described below is not satisfied, and an "entry-time behavior setting in a specific state". Each "entry-time behavior setting in a specific state" consists of an "execution condition" for an entry-time behavior and the "identifier of the entry-time behavior" to be executed when that condition is satisfied.
The "execution condition" is constituted by an "instruction under execution" conditioned on an instruction being executed by the travel control unit 11 when the mobile device 10 enters the restricted entry area notified from the state management unit 115, and a "time" conditioned on a time when the mobile device 10 enters the restricted entry area. When the condition of "command under execution" and the condition of "time" are satisfied at the same time, the entry-time behavior corresponding to the identifier set in the "recognition of entry-time behavior" is executed.
Here, the entry-time behavior designated by the identifier is looked up in the entry-time behavior information described in fig. 7. In the present embodiment, the entry-time behavior information is stored in the scene storage unit 211, but it may instead be stored in the storage unit 112 of the travel control unit 11.
In addition to "command under execution" and "time", the command determination unit 212 can use as "execution conditions" a "distance to the obstacle" condition on the distance, calculated from the position information reported by the state management unit 115, between the mobile device 10 and the obstacle for which the entry-restricted area is set, and an "entry direction" condition on the direction in which the mobile device 10 entered the entry-restricted area.
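A possible in-memory form of this scene information and of the condition evaluation performed by the command determination unit 212 is sketched below. The field names, the area identifiers 80 to 83 and the concrete conditions are illustrative assumptions loosely modeled on fig. 11, not the actual data of the patent.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class RobotState:
    executing_command: str        # command currently executed by the travel control unit
    hour: int                     # time at which the restricted area was entered
    entry_direction: str          # direction in which the restricted area was entered
    distance_to_obstacle: float   # distance to the obstacle the area surrounds

@dataclass
class SceneEntry:
    default_behavior: str         # identifier of the predetermined entry-time behavior
    conditional: List[Tuple[Callable[[RobotState], bool], str]] = field(default_factory=list)

def select_entry_behavior(scene: Dict[int, SceneEntry], area_id: int,
                          state: RobotState) -> str:
    """Return the identifier of the entry-time behavior to execute."""
    for condition, behavior_id in scene[area_id].conditional:
        if condition(state):                       # execution condition satisfied
            return behavior_id
    return scene[area_id].default_behavior         # fall back to the predetermined behavior

# Illustrative scene, loosely in the spirit of fig. 11 (identifiers assumed):
scene = {
    80: SceneEntry("B", [(lambda s: s.executing_command == "back", "A")]),   # already backing up -> stop
    81: SceneEntry("B"),                                                     # service-area boundary: back up
    82: SceneEntry("A", [(lambda s: s.hour >= 13, "B")]),                    # waxed floor: stop until 13:00, then back up
    83: SceneEntry("C", [(lambda s: s.entry_direction == "toward step", "B")]),  # step: pass through unless heading at the step
}
```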
Hereinafter, the operation of the mobile device 10 in the entry restriction area of fig. 10 will be described with reference to fig. 11.
In the entry-restricted area 301, the operation follows the condition determination and the entry-time behavior settings recorded for the identifier "80" of the entry-restricted area 301 in the scene information shown in fig. 11.
Specifically, when entry of the mobile device 10 into the entry-restricted area 301 is detected, the "command under execution" and "time" execution conditions are evaluated. On the first entry into the area 301 the "command under execution" is not "back", so the execution condition is not satisfied. Therefore, the behavior with identifier "B" in the "identifier of the predetermined entry behavior" field is executed as the entry-time behavior; referring to the entry-time behavior information of fig. 7, this is "move backward along the entry route".
When entry of the mobile device 10 into the entry-restricted area 301 is detected again, the "command under execution" is now "back", so the execution condition is satisfied and the behavior with identifier "A" in the "identifier of the entry-time behavior" field is executed; referring to fig. 7, this is "stop on the spot".
That is, when the mobile device 10 enters the entry-restricted area 301 and fails to back out of it, it is controlled to stop.
In the entry-restricted area 303, the operation follows the condition determination and the entry-time behavior settings recorded for the identifier "81" of the entry-restricted area 303 in the scene information shown in fig. 11.
For this area, no "entry-time behavior setting in a specific state" (no "execution condition" or "identifier of the entry-time behavior") is set.
Therefore, when entry of the mobile device 10 into the entry-restricted area 303 is detected, the behavior with identifier "B" in the "identifier of the predetermined entry behavior" field is executed; referring to the entry-time behavior information of fig. 7, this is "move backward along the entry route".
For the entry-restricted area 303, a different entry-time behavior can be set simply by changing the identifier recorded as the "predetermined entry behavior". That is, the control based on scene information can reproduce the entry-time behavior control of the entry-restricted areas described in the first embodiment.
In the entry-restricted area 305, the operation follows the condition determination and the entry-time behavior settings recorded for the identifier "82" of the entry-restricted area 305 in the scene information of fig. 11.
In the entry-restricted area 305, the mobile device 10 can travel once the applied wax has dried. Therefore, the entry-time behavior is switched before and after the wax drying time by using "time" as the execution condition.
That is, when entry of the mobile device 10 into the entry-restricted area 305 is detected, the behavior with identifier "B" is executed as the entry-time behavior if the "time" is at or after the drying time, i.e. "13:00 and thereafter", and otherwise the behavior with identifier "A" of the "predetermined entry behavior" is executed. Referring to the entry-time behavior information of fig. 7, the device moves backward along the entry route at or after 13:00 and stops on the spot before then.
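As a standalone illustration of this time-based switch, the snippet below assumes that "13:00 and thereafter" is the drying time and returns the corresponding entry-time behavior for the waxed area; the drying hour and the behavior strings are assumptions.

```python
from datetime import datetime

DRYING_HOUR = 13  # assumed drying time, corresponding to "13:00 and thereafter"

def wax_area_entry_behavior(now: datetime) -> str:
    """Entry-time behavior of area 305 (identifier 82) before and after the wax dries."""
    if now.hour >= DRYING_HOUR:
        return "move backward along the entry route"
    return "stop on the spot"

print(wax_area_entry_behavior(datetime(2019, 2, 28, 11, 0)))  # before drying -> stop on the spot
print(wax_area_entry_behavior(datetime(2019, 2, 28, 15, 0)))  # after drying  -> move backward along the entry route
```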
Other scene information can also be set, as in the following example.
For example, in the scene information of fig. 11, the execution condition of the entry-restricted area 301 can be replaced with a "distance to the obstacle" condition, so that the entry-time behavior given by the "identifier of the entry-time behavior" in the "entry-time behavior setting in a specific state" is executed when the distance between the mobile device 10 and the obstacle is equal to or less than a predetermined value.
In this scene, when the mobile device 10 enters the entry-restricted area 301, it stops on the spot if the distance to the obstacle is equal to or less than the predetermined value, and moves backward along the entry route if the distance is greater than the predetermined value.
This operation is the same as that of the entry-restricted areas provided around the "chair" in the first embodiment.
Next, the scene information of the restricted entry area 304 of fig. 10 will be described.
In the entry-restricted area 304, when the mobile device 10 enters heading toward the step, it backs away so as to leave the area 304; when it enters moving across the front of the step (the vertical direction of fig. 10), it passes through the area 304 in its travel direction.
To achieve this, the scene information contains an "execution condition" stating that the "entry direction", i.e. the direction in which the mobile device 10 entered the entry-restricted area, is the "step direction", with the entry-time behavior when the condition is satisfied set to "move backward along the entry route". The predetermined entry-time behavior is "pass through in the travel direction".
The control based on the scene information may be performed as the processing of step S209 in the operation flowchart of the autonomous moving apparatus in fig. 8.
Step S203 is replaced with the setting of scene information.
In the above description of the scene information, one execution condition for a single entry-time behavior was set for each entry-restricted area, but a plurality of execution conditions may be set for one entry-restricted area, with a different entry-time behavior for each condition.
According to the autonomous moving apparatus 1 of the present embodiment, the entry-time behavior can be adapted to the state of the entry-restricted area by using the scene information, so the usability of the mobile device 10 can be improved.
Third embodiment
In the present embodiment, an autonomous moving apparatus 1 is described in which an entry-time behavior matched to the actual state of an entry-restricted area can be set after the mobile device 10 has started traveling. Parts common to the autonomous moving apparatus 1 of the other embodiments described above are not explained again.
Fig. 12 is a diagram illustrating an operation flow of the autonomous moving apparatus 1 according to the embodiment.
This flow differs from the operation flow of the autonomous moving apparatus 1 described with reference to fig. 8 in that, after the entry-time behavior is looked up in step S209, step S212 determines whether an entry-time behavior has been set, and if it has not been set (no in S212), the entry-time behavior is set in step S203.
The determination in step S212 checks whether an identifier of an entry-time behavior is set for the target entry-restricted area in the entry-time behavior management information of fig. 6. More specifically, when the identifier of the entry-time behavior is null, or when the identifier refers to an entry of the entry-time behavior information of fig. 7 in which neither the type of action nor the type of alarm is recorded, it is judged that no entry-time behavior has been set.
In step S203, the mobile device 10 is actually driven within the target entry-restricted area using the operation setting device 19 (the tablet control device 18 or the notebook computer 17), and the entry-time behavior suited to the state of that area is obtained in this way and recorded in the entry-time behavior information of fig. 7. If no identifier of an entry-time behavior is registered for the area in the entry-time behavior management information of fig. 6, an identifier corresponding to the obtained behavior is registered.
Then, in step S210, the set entry behavior is executed.
When the mobile device 10 enters the target entry-restricted area again, step S212 determines that the entry-time behavior has been set, and the behavior set in step S203 is executed.
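The check in step S212 and the on-site teaching in step S203 can be pictured with the following hypothetical sketch: if no behavior identifier is registered for the target area, the behavior obtained through the operation setting device is recorded and reused on subsequent entries. The table layout and the teaching callback are assumptions, not the patent's data structures.

```python
# Hypothetical tables: area identifier -> behavior identifier, and
# behavior identifier -> {"action": ..., "alarm": ...}.
behavior_management = {84: None}   # area 84 has no entry-time behavior yet
behavior_info = {}

def get_or_teach_entry_behavior(area_id, teach_behavior):
    behavior_id = behavior_management.get(area_id)
    if behavior_id is None or behavior_id not in behavior_info:   # S212: not set
        action, alarm = teach_behavior()                          # S203: teach the behavior on site
        behavior_id = f"taught-{area_id}"
        behavior_info[behavior_id] = {"action": action, "alarm": alarm}
        behavior_management[area_id] = behavior_id                # registered for later entries
    return behavior_id                                            # behavior executed in S210

# Example: the teaching callback stands in for driving the mobile device with
# the operation setting device and recording the resulting behavior.
taught = get_or_teach_entry_behavior(
    84, lambda: ("move backward along the entry route", "no alarm"))
```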
In step S203, the entry-time behavior may instead be taught by using a remote controller with operation levers, such as a radio-control proportional transmitter, in place of the operation setting device 19 (the tablet control device 18 or the notebook computer 17), or by a person directly moving the mobile device 10 by hand.
The present invention is not limited to the above embodiment, and includes various modifications. The above embodiments are described in detail for easy understanding of the present invention, and are not limited to the embodiments having all the configurations described.
Further, a part of the structure of one embodiment may be replaced with the structure of another embodiment, and the structure of another embodiment may be added to the structure of one embodiment.
For example, in the first embodiment, the third embodiment's method of setting the entry-time behavior after traveling may be applied to an entry-restricted area for which no identifier of an entry-time behavior has been set by the entry-time behavior setting of step S203 in the entry-time behavior management information shown in fig. 6.
In the second embodiment, the control method of setting the behavior at entry after traveling according to the third embodiment may be applied to the restricted entry area of the scene information shown in fig. 11 in which the identifier of the behavior at entry is not set.
Description of the reference numerals
1 autonomous moving device
10 moving device
11 travel control unit
111 map generating part
112 storage section
113 self position estimating unit
114 movement control unit
12 moving unit
13 command input unit
14 restricted entry area setting unit
15 behavior setting unit at entry
16 moving state recognition unit
19 operation setting device

Claims (10)

1. An autonomous moving apparatus capable of estimating its own position and determining a travel route based on area information indicating a position of an obstacle and detection information of the obstacle, the autonomous moving apparatus comprising:
an entry restriction area setting unit that sets an entry restriction area in the area information;
an entry-time behavior setting unit that sets, for each entry-restricted area, a behavior when the entry-restricted area is entered; and
a travel control unit that determines whether or not the self position is included in any entry-restricted area, controls travel so as not to enter an entry-restricted area when the self position is not included in one, and controls travel according to the entry-time behavior of the entry-restricted area when the self position is included in it.
2. The autonomous mobile apparatus of claim 1 wherein:
the entry-time behavior setting means is configured by scene management means for setting entry-time behaviors for each entry-restricted area based on scene information in which the entry-time behavior in the entry-restricted area and the execution condition of the entry-time behavior are determined,
the travel control means controls travel of the restricted entry area in accordance with the entry-time behavior set by the scene management means.
3. The autonomous mobile apparatus of claim 2 wherein:
the scene information includes setting information of an execution condition for performing a predetermined entry-time behavior, an identifier for a first entry-time behavior performed when the execution condition is satisfied, and an identifier for a second entry-time behavior performed when the execution condition is not satisfied.
4. The autonomous mobile apparatus of claim 3 wherein:
the scene management unit is used for managing the scene,
determining whether the execution condition is satisfied or not based on any one of a command being executed, time, travel information of a self position or a traveling direction, and a distance from an obstacle,
sets, as the entry-time behavior of the target entry-restricted area, the entry-time behavior corresponding to the identifier of the first entry-time behavior when it determines that the execution condition is satisfied, and
the entry-time behavior corresponding to the identifier of the second entry-time behavior when it determines that the execution condition is not satisfied.
5. The autonomous mobile apparatus of claim 4 wherein:
when the execution condition of the scene information is set to be that the entering direction is the obstacle direction,
an identifier of the first entry-time behavior of the scene information indicates a behavior of moving backward on a path at the entry time,
an identifier of the second entry-time behavior of the scene information represents a behavior passing in a direction of travel.
6. The autonomous moving apparatus of any one of claims 1 to 5, wherein:
the entry restriction area setting means displays an obstacle on the screen in correspondence with the area information so as to be able to set an entry restriction area, and displays the entered entry restriction area so as to be able to recognize the entry restriction area.
7. The autonomous mobile apparatus of claim 6 wherein:
the entry limit region setting unit may set the entry limit region as a polygon, display vertices of the polygon so that the entered entry limit region can be recognized, and display an identifier of the entry limit region.
8. The autonomous mobile apparatus of claim 1 wherein:
the entry-time behavior setting unit sets an entry-time behavior of the entered entry-restricted area when entering the entry-restricted area.
9. An autonomous moving apparatus capable of estimating its own position and determining a travel route based on area information indicating a position of an obstacle and detection information of the obstacle, the autonomous moving apparatus comprising:
an entry restriction area setting unit capable of setting entry restriction areas in multiple stages in the area information according to the distance from the obstacle;
an entry behavior setting unit that sets an entry behavior of the entry-restricted area so as to restrict travel in stages in accordance with a distance from the obstacle; and
a travel control unit that determines whether or not the self position is included in any entry restriction area, controls travel so as not to enter an entry restriction area when the self position is not included in one, and, when the self position is included in an entry restriction area, controls travel according to the entry-time behavior of that area so that the apparatus does not approach the obstacle.
10. The autonomous mobile apparatus of claim 9 wherein:
the restricted entry area setting unit sets a first restricted entry area set to surround the obstacle and a second restricted entry area set to surround the first restricted entry area,
the behavior setting unit sets the behavior when entering the first restricted entry area to stop traveling, and sets the behavior when entering the second restricted entry area to retreat on the route when entering.
CN201980033434.2A (priority date 2018-06-06, filing date 2019-02-28): Autonomous mobile apparatus. Active; granted as CN112136090B.

Applications Claiming Priority (3)

JP2018108372A (granted as JP7107757B2), priority date 2018-06-06: Autonomous mobile device
JP2018-108372, priority date 2018-06-06
PCT/JP2019/007828 (published as WO2019235002A1), filed 2019-02-28: Autonomous movement device

Publications (2)

Publication Number, Publication Date
CN112136090A, 2020-12-25
CN112136090B, 2024-03-26

Family

ID=68769350

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980033434.2A Active CN112136090B (en) 2018-06-06 2019-02-28 Autonomous mobile apparatus

Country Status (3)

Country Link
JP (1) JP7107757B2 (en)
CN (1) CN112136090B (en)
WO (1) WO2019235002A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7160171B1 (en) * 2021-11-30 2022-10-25 三菱電機株式会社 Management system, map management device, management method, and management program

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007237873A (en) * 2006-03-08 2007-09-20 Mitsubishi Electric Corp Control device and control method for flying machine and program
JP2008186097A (en) * 2007-01-26 2008-08-14 Matsushita Electric Works Ltd Autonomous moving system
CN101408925A (en) * 2007-10-11 2009-04-15 上海平安投资有限公司 RFID region partitioning and scope monitoring system and method
JP2010061293A (en) * 2008-09-02 2010-03-18 Toyota Motor Corp Route searching device, route searching method, and route searching program
JP2010061442A (en) * 2008-09-04 2010-03-18 Murata Machinery Ltd Autonomous mobile device
JP2011227807A (en) * 2010-04-22 2011-11-10 Toyota Motor Corp Route search system, route search method, and mobile body
US20150254988A1 (en) * 2014-04-17 2015-09-10 SZ DJI Technology Co., Ltd Flight control for flight-restricted regions
WO2016006588A1 (en) * 2014-07-09 2016-01-14 パイオニア株式会社 Mobile object control device, mobile object control method, mobile object control program, and recording medium
JP2016220823A (en) * 2015-05-28 2016-12-28 シャープ株式会社 Self-propelled cleaner
JP2017033232A (en) * 2015-07-31 2017-02-09 セコム株式会社 Autonomous flight robot
JP2017097640A (en) * 2015-11-25 2017-06-01 株式会社Ihiエアロスペース Remote control image acquisition device, remote control image acquisition method and remote control device
JP2017117017A (en) * 2015-12-21 2017-06-29 凸版印刷株式会社 Method for registering/setting no-fly zone for small unmanned aircraft
CN107272682A (en) * 2017-06-16 2017-10-20 深圳市可飞科技有限公司 Mobile platform evades the method, system and mobile platform of collision automatically

Also Published As

Publication number Publication date
WO2019235002A1 (en) 2019-12-12
JP7107757B2 (en) 2022-07-27
JP2019212079A (en) 2019-12-12
CN112136090B (en) 2024-03-26

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant