CN107045355B - Movement control method and autonomous mobile robot

Info

Publication number: CN107045355B
Application number: CN201610918571.2A
Authority: CN (China)
Prior art keywords: person, mobile robot, autonomous mobile, movement, region
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN107045355A
Inventors: 藤村亮太, 船濑和记, 渕上哲司
Assignee (original and current): Panasonic Intellectual Property Corp of America
Publication of CN107045355A (application publication)
Publication of CN107045355B (grant publication)

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

Provided are a movement control method, an autonomous mobile robot, and a movement control program. By moving the autonomous mobile robot within the range that a peripheral person present around it can visually recognize, the peripheral person can monitor the autonomous mobile robot at all times; by moving the autonomous mobile robot outside that range, the autonomous mobile robot can move without being seen by the peripheral person. A movement control method for controlling movement of an autonomous mobile robot includes: acquiring information on a peripheral person present around the autonomous mobile robot, calculating a visual recognition range of the peripheral person based on the acquired information, and determining a movement range in which the autonomous mobile robot can move based on the calculated visual recognition range.

Description

Movement control method and autonomous mobile robot
Technical Field
The present disclosure relates to a movement control method of controlling movement of an autonomous mobile robot and an autonomous mobile robot that autonomously moves.
Background
In recent years, small autonomous flying robots that autonomously fly on a predetermined flight path have been developed. The autonomous flying robot includes a plurality of propellers, and is capable of flying freely in the air by controlling the rotational speed of each of the plurality of propellers, thereby autonomously flying along a predetermined flight path.
For example, patent document 1 discloses an autonomous flying robot that flies with a moving object while keeping a predetermined distance from the moving object and images the moving object by an imaging unit.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open publication No. 2014-119828
Disclosure of Invention
Problems to be solved by the invention
The conventional autonomous flying robot described in patent document 1 flies alongside a moving object such as a person while keeping a predetermined distance from it. However, it does not take the field of view of that moving object into account, and therefore cannot impose the restrictions on movement required for practical use, such as moving without entering a person's field of view or moving only within a person's field of view.
The present disclosure has been made to solve the above-described problems, and an object thereof is to provide a movement control method and an autonomous mobile robot in which, by moving the autonomous mobile robot within the range that a peripheral person present around it can visually recognize, the peripheral person can monitor the autonomous mobile robot at all times, and by moving the autonomous mobile robot outside that range, the autonomous mobile robot can move without being seen by the peripheral person.
Means for solving the problems
A movement control method according to an aspect of the present disclosure controls movement of an autonomous mobile robot, the method including: the method includes acquiring information on a peripheral person present around the autonomous mobile robot, calculating a visual recognition range in which the peripheral person can be visually recognized based on the acquired information on the peripheral person, and determining a movement range in which the autonomous mobile robot can move based on the calculated visual recognition range.
These general or specific aspects may be realized by a system, an apparatus, an integrated circuit, a computer program, or a computer-readable storage medium such as a CD-ROM, or by any combination of systems, methods, apparatuses, integrated circuits, computer programs, and computer-readable storage media.
Advantageous Effects of Invention
According to the present disclosure, since the moving range in which the autonomous mobile robot can move is determined based on the visual recognition range in which the peripheral person existing around the autonomous mobile robot can be visually recognized, the peripheral person can always monitor the autonomous mobile robot by moving the autonomous mobile robot within the visual recognition range in which the peripheral person existing around the autonomous mobile robot can be visually recognized, or the autonomous mobile robot can move without being seen by the peripheral person by moving the autonomous mobile robot outside the visual recognition range in which the peripheral person existing around the autonomous mobile robot can be visually recognized.
Drawings
Fig. 1 is a diagram showing the configuration of a flight control system according to embodiment 1 of the present disclosure.
Fig. 2 is a block diagram showing the configuration of the autonomous flying robot according to embodiment 1 of the present disclosure.
Fig. 3 is a flowchart for explaining a flight control process of the autonomous flying robot according to embodiment 1 of the present disclosure.
Fig. 4 is a diagram showing an example of the visual recognition range in embodiment 1.
Fig. 5 is a schematic diagram for explaining a method of generating a movement path according to embodiment 1.
Fig. 6 is a schematic diagram for explaining a method of generating a movement path in consideration of an obstacle according to embodiment 1.
Fig. 7 is a perspective view showing an example of the 1 st visual recognition range of the modification of embodiment 1.
Fig. 8(a) is a side view showing an example of the 2 nd visual recognition range of the modification of embodiment 1, and fig. 8(B) is a plan view showing an example of the 2 nd visual recognition range of the modification of embodiment 1.
Fig. 9(a) is a side view showing an example of the 3 rd visual recognition range of the modification of embodiment 1, and fig. 9(B) is a plan view showing an example of the 3 rd visual recognition range of the modification of embodiment 1.
Fig. 10(a) is a side view showing an example of the 4 th visibility range of the modification of embodiment 1, and fig. 10(B) is a plan view showing an example of the 4 th visibility range of the modification of embodiment 1.
Fig. 11 is a schematic diagram for explaining a method of generating a movement path according to a modification of embodiment 1.
Fig. 12 is a block diagram showing the configuration of an autonomous flying robot according to embodiment 2 of the present disclosure.
Fig. 13 is a block diagram showing the configuration of a flight control system according to embodiment 3 of the present disclosure.
Description of the reference numerals
10, 11, 12: autonomous flying robot
20: monitoring camera
30: terminal device
40: network
50: server
101: actuator
102: position measuring part
103: image acquisition unit
104: communication unit
105: control unit
106: storage unit
111: person information acquisition unit
112: visual recognition range calculating unit
113: moving range determining part
114: moving path generating unit
115: movement control unit
116: registered person determination unit
501: communication unit
502: control unit
511: person information acquisition unit
512: visual recognition range calculating unit
513: moving range determining part
514: moving path generating unit
1051: control unit
1052: control unit
Detailed Description
(insight underlying the present disclosure)
The conventional autonomous flying robot described above flies alongside a moving object such as a person while keeping a predetermined distance from it. Therefore, the autonomous flying robot may stay within the person's field of view and become bothersome.
In addition, the conventional autonomous flying robot does not consider the movement of the autonomous flying robot within the field of view of a human. Therefore, in a case where the autonomous flying robot moves outside the visual field of the human, the human may not be able to monitor the autonomous flying robot.
In order to solve the above problem, a movement control method according to an aspect of the present disclosure controls movement of an autonomous mobile robot, the method including: the method includes acquiring information on a peripheral person present around the autonomous mobile robot, calculating a visual recognition range in which the peripheral person can be visually recognized based on the acquired information on the peripheral person, and determining a movement range in which the autonomous mobile robot can move based on the calculated visual recognition range.
According to this configuration, information on a peripheral person present around the autonomous mobile robot is acquired, a visual recognition range in which the peripheral person can visually recognize is calculated based on the acquired information on the peripheral person, and a movement range in which the autonomous mobile robot can move is determined based on the calculated visual recognition range.
Therefore, the range of movement in which the autonomous mobile robot can move is determined based on the range of visual recognition in which the peripheral person present around the autonomous mobile robot can visually recognize, and therefore, by moving the autonomous mobile robot within the range of visual recognition in which the peripheral person present around the autonomous mobile robot can visually recognize, the peripheral person can always monitor the autonomous mobile robot, or by moving the autonomous mobile robot outside the range of visual recognition in which the peripheral person present around the autonomous mobile robot can visually recognize, the autonomous mobile robot can be moved without being seen by the peripheral person.
In the movement control method, a movement path along which the autonomous mobile robot moves within the determined movement range may be generated.
According to this configuration, since the movement path along which the autonomous mobile robot moves within the movement range is generated, the autonomous mobile robot can be moved along the movement path.
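For illustration, the flow described above (acquiring person information, calculating the visual recognition range, determining the movement range, and generating a path within it) might be organized as in the following Python sketch. It is a minimal stand-in rather than the patented implementation: the circular visual range, the 10.0 radius, and the data structures are assumptions made only for this example.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

# Minimal, self-contained sketch of the flow described above. All geometry is a
# placeholder (a circular visual range of fixed radius), not the patented method.

@dataclass
class Person:
    position: Tuple[float, float]   # (x, y) on a two-dimensional map
    body_heading_deg: float         # direction the body faces, in degrees

def acquire_person_info(sensor_data: List[Person]) -> List[Person]:
    # In the patent this information comes from on-board cameras, surveillance
    # cameras, position sensors or geomagnetic sensors; here it is passed in directly.
    return sensor_data

def calc_visual_range(person: Person) -> Tuple[Tuple[float, float], float]:
    # Simplest possible stand-in: a circle of assumed radius 10.0 around the person.
    return (person.position, 10.0)

def determine_movement_range(visual_ranges) -> Callable[[Tuple[float, float]], bool]:
    # Embodiment 1 uses the OUTSIDE of all visual ranges as the movement range.
    def allowed(point: Tuple[float, float]) -> bool:
        return all((point[0] - c[0]) ** 2 + (point[1] - c[1]) ** 2 > r ** 2
                   for c, r in visual_ranges)
    return allowed

if __name__ == "__main__":
    persons = acquire_person_info([Person((5.0, 5.0), 90.0)])
    ranges = [calc_visual_range(p) for p in persons]
    allowed = determine_movement_range(ranges)
    # A path generator would then search only among points for which allowed(...) is True.
    print(allowed((30.0, 0.0)), allowed((5.0, 6.0)))  # True (outside), False (inside)
```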
In the above-described movement control method, identification information for identifying a predetermined person may be registered in advance, whether or not the peripheral person existing around the autonomous mobile robot is the predetermined person may be determined based on the registered identification information, and when it is determined that the peripheral person existing around the autonomous mobile robot is the predetermined person, a range of visual recognition of the predetermined person existing around the autonomous mobile robot may be determined as the movement range.
According to this configuration, identification information for identifying a predetermined person is registered in advance. Whether or not a peripheral person existing around the autonomous mobile robot is the predetermined person is determined based on the registered identification information. When it is determined that the peripheral person existing around the autonomous mobile robot is the predetermined person, the visual recognition range of that predetermined person is determined as the movement range.
Therefore, since the visual recognition range of the predetermined person whose identification information is registered in advance is determined as the movement range, the predetermined person can always monitor the autonomous mobile robot moving within that visual recognition range.
In the above-described movement control method, the identification information may be a face image of the predetermined person. According to this configuration, it is possible to easily determine whether or not a peripheral person existing around the autonomous mobile robot is a predetermined person by authenticating a face image of the predetermined person.
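As an illustration of such a face-based check, the following sketch compares a detected face embedding against pre-registered ones. The embedding model, the cosine-similarity measure, and the 0.8 threshold are assumptions for illustration only and are not specified by the patent.

```python
from typing import List
import numpy as np

# Hypothetical sketch: decide whether a detected face belongs to a pre-registered
# person by comparing face embeddings produced by some external face-recognition
# model (not part of this sketch).

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_registered_person(detected: np.ndarray,
                         registered: List[np.ndarray],
                         threshold: float = 0.8) -> bool:
    # The peripheral person is treated as the predetermined person when the
    # detected face embedding is close enough to any registered one.
    return any(cosine_similarity(detected, reg) >= threshold for reg in registered)
```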
In the above-described movement control method, the identification information may be transmitted from a communication device carried by the predetermined person.
According to this configuration, since the identification information is transmitted from the communication device carried by the predetermined person, it is possible to determine whether or not the peripheral person existing around the autonomous mobile robot is the predetermined person by a simple authentication process.
In the above-described movement control method, the outside of the visual recognition range of the peripheral person existing around the autonomous mobile robot may be determined as the movement range.
According to this configuration, the range outside the visual recognition range of the peripheral person existing around the autonomous mobile robot is determined as the movement range. Therefore, by moving the autonomous mobile robot outside the range where the peripheral person existing around the autonomous mobile robot can be visually recognized, the autonomous mobile robot can be moved without being seen by the peripheral person.
In the above-described movement control method, the information on the peripheral person existing around the autonomous mobile robot may include at least one of a position of the peripheral person, a body orientation of the peripheral person, a face orientation of the peripheral person, and a line-of-sight orientation of the peripheral person.
According to this configuration, the information on the peripheral person existing around the autonomous mobile robot includes at least one of the position of the peripheral person, the body orientation of the peripheral person, the face orientation of the peripheral person, and the line-of-sight orientation of the peripheral person. Therefore, the visual recognition range in which the peripheral person can be visually recognized can be calculated from at least one of the position of the peripheral person, the body orientation of the peripheral person, the face orientation of the peripheral person, and the line-of-sight orientation of the peripheral person.
In the above-described movement control method, the visual recognition range may be an area on a two-dimensional plane determined based on at least one of a position of the peripheral person, a body orientation of the peripheral person, a face orientation of the peripheral person, and a line-of-sight orientation of the peripheral person.
According to this configuration, since the visual recognition range is an area on the two-dimensional plane determined according to at least one of the position of the peripheral person, the body orientation of the peripheral person, the face orientation of the peripheral person, and the line-of-sight orientation of the peripheral person, the processing load for calculating the visual recognition range can be reduced.
In the above movement control method, the visual recognition range may have a cost value whose value differs depending on the information on the surrounding people, and the autonomous mobile robot may generate the movement path by using an optimal algorithm by superimposing the visual recognition range on a map including a departure point and a destination point of the autonomous mobile robot.
According to this configuration, the visual recognition range has a cost value whose numerical value differs depending on the information on the surrounding person. Then, the visual recognition range is superimposed on a map including a departure point and a destination point of the autonomous mobile robot, and a movement path is generated using an optimal algorithm.
Therefore, the movement path can be easily generated using the optimal algorithm based on the map on which the visual recognition range having the cost value whose value differs according to the information on the peripheral person existing around the autonomous mobile robot is superimposed.
In the above-described movement control method, the visual recognition range may include a 1 st region formed on the front side of the body of the peripheral person, a 2 nd region formed on the back side of the body of the peripheral person, and a 3 rd region formed outside the 1 st region and the 2 nd region. The visual recognition range may be superimposed on a map including a departure point and a destination point of the autonomous mobile robot, and the map may be divided into a lattice such that the departure point and the destination point coincide with intersections of the lattice edges. Among the edges of the divided lattice, a cost value having a 1 st value may be assigned to an edge a part of which lies within the 1 st region; a cost value having a 2 nd value smaller than the 1 st value may be assigned to an edge no part of which lies within the 1 st region and a part of which lies within the 2 nd region; a cost value having a 3 rd value smaller than the 2 nd value may be assigned to an edge no part of which lies within the 1 st region or the 2 nd region and a part of which lies within the 3 rd region; and a cost value having a 4 th value smaller than the 3 rd value may be assigned to an edge no part of which lies within any of the 1 st region, the 2 nd region, and the 3 rd region. Among all paths that pass along the edges of the lattice from the departure point to the destination point, a path whose total cost value is smallest and whose distance from the departure point to the destination point is shortest may be generated as the movement path.
According to this configuration, the visual recognition range includes a 1 st region formed on the front side of the peripheral person, a 2 nd region formed on the back side of the body of the peripheral person, and a 3 rd region formed outside the 1 st region and the 2 nd region. The visual recognition range is superimposed on a map including a departure point and a destination point of the autonomous mobile robot, and the map is divided into a lattice such that the departure point and the destination point coincide with intersections of the lattice edges. In the divided lattice, a cost value having the 1 st value is assigned to an edge a part of which lies within the 1 st region, a cost value having the 2 nd value smaller than the 1 st value is assigned to an edge no part of which lies within the 1 st region and a part of which lies within the 2 nd region, a cost value having the 3 rd value smaller than the 2 nd value is assigned to an edge no part of which lies within the 1 st region or the 2 nd region and a part of which lies within the 3 rd region, and a cost value having the 4 th value smaller than the 3 rd value is assigned to an edge no part of which lies within any of the three regions. A path whose total cost value is smallest and whose distance from the departure point to the destination point is shortest, among all paths passing along the edges of the lattice from the departure point to the destination point, is generated as the movement path.
Therefore, the movement path can be easily generated using the cost value given according to the distance from the surrounding people present around the autonomous mobile robot.
In the movement control method, the autonomous mobile robot may include an image acquisition unit that acquires an image of the surroundings of the autonomous mobile robot, and the information on the surrounding person may be acquired from the image information acquired by the image acquisition unit.
According to this configuration, since the information on the peripheral person existing around the autonomous mobile robot is acquired from the image information acquired by the image acquisition unit provided in the autonomous mobile robot, the information on the peripheral person that can be observed from the autonomous mobile robot can be actually acquired.
In the movement control method, the information on the surrounding person may be acquired from image information acquired by an image acquisition device provided around the autonomous mobile robot.
According to this configuration, it is possible to acquire information on a person around the autonomous mobile robot from image information acquired by an image acquisition device provided around the autonomous mobile robot.
In the above-described movement control method, the information on the surrounding person may be a current position of the surrounding person acquired by a position sensor carried by the surrounding person.
With this configuration, the current position of the peripheral person obtained from the position sensor carried by the peripheral person existing around the autonomous mobile robot can be used as the information on the peripheral person.
In the above-described movement control method, the information on the surrounding person may be a direction in which the surrounding person is oriented, the direction being acquired by a geomagnetic sensor carried by the surrounding person.
According to this configuration, the direction in which the peripheral person is oriented, which is acquired from the geomagnetic sensor carried by the peripheral person present around the autonomous mobile robot, can be used as the information on the peripheral person.
In the movement control method, the information on the surrounding person may be acquired from image information on the surroundings of the other autonomous mobile robot acquired by an image acquiring unit provided in the other autonomous mobile robot than the autonomous mobile robot.
According to this configuration, it is possible to acquire information on a person around the autonomous mobile robot from image information of the surroundings of another autonomous mobile robot acquired by an image acquisition unit provided in another autonomous mobile robot other than the autonomous mobile robot.
In the above-described movement control method, information on an obstacle present around the autonomous mobile robot may be acquired, and the movement range may be determined based on the acquired information on the obstacle and the calculated visual recognition range.
According to this configuration, since the information on the obstacle existing around the autonomous mobile robot is acquired and the movement range is determined based on the acquired information on the obstacle and the calculated visual recognition range, the movement range can be determined in consideration of a place where the obstacle forms a blind spot.
In the movement control method, the autonomous mobile robot may be an autonomous flying robot, an altitude at which the autonomous flying robot flies may be acquired, and the movement range may be determined only when the acquired altitude is lower than a predetermined altitude.
According to this configuration, the autonomous mobile robot is an autonomous flying robot. The altitude at which the autonomous flying robot flies is acquired, and the movement range is determined only when the acquired altitude is lower than a predetermined altitude.
Therefore, since the movement range is determined only when the autonomous flying robot flies at an altitude lower than the predetermined altitude, the movement range is determined when the robot flies at a low altitude where it is easily visually recognized by peripheral persons, while no movement range needs to be determined when it flies at a high altitude where peripheral persons can hardly visually recognize it, so that the autonomous flying robot can fly the shortest distance from the current position to the destination point.
In the movement control method, the current position of the autonomous mobile robot may be notified to a predetermined terminal device when the movement range is not determined.
According to this configuration, when the movement range is not determined, the current position of the autonomous mobile robot is notified to a predetermined terminal device. Therefore, when the movement range cannot be determined, an input as to how to move the autonomous mobile robot can be received.
In the movement control method, when the distance of the generated movement path is longer than a predetermined distance, the distance of the generated movement path may be notified to a predetermined terminal device.
According to this configuration, when the distance of the generated movement path is longer than the predetermined distance, the distance of the generated movement path is notified to the predetermined terminal device. Therefore, although the remaining amount of power of the battery built in the autonomous mobile robot may be insufficient when the distance of the movement path is longer than the predetermined distance, the remaining amount of power of the battery can be prevented from becoming insufficient when the autonomous mobile robot moves by notifying the predetermined terminal device in advance.
In the above-described movement control method, at least one of the acquisition of the information on the surrounding person, the calculation of the visual recognition range, and the determination of the movement range may be performed by a processor.
With this configuration, at least one of acquisition of information on surrounding persons, calculation of a visual recognition range, and determination of a movement range can be performed by the processor.
An autonomous mobile robot according to another aspect of the present disclosure is an autonomous mobile robot including: a person acquisition unit that acquires information on a surrounding person present around the autonomous mobile robot; a calculation unit that calculates a visual recognition range in which the peripheral person can be visually recognized, based on the acquired information on the peripheral person; and a determination unit configured to determine a movement range in which the autonomous mobile robot can move, based on the calculated visual recognition range.
According to this configuration, information on a peripheral person present around the autonomous mobile robot is acquired, a visual recognition range in which the peripheral person can visually recognize is calculated based on the acquired information on the peripheral person, and a movement range in which the autonomous mobile robot can move is determined based on the calculated visual recognition range.
Therefore, since the moving range in which the autonomous mobile robot can move is determined based on the visual recognition range in which the peripheral person existing around the autonomous mobile robot can visually recognize, the peripheral person can always monitor the autonomous mobile robot by moving the autonomous mobile robot within the visual recognition range in which the peripheral person existing around the autonomous mobile robot can visually recognize, or the autonomous mobile robot can move without being seen by the peripheral person by moving the autonomous mobile robot outside the visual recognition range in which the peripheral person existing around the autonomous mobile robot can visually recognize.
In the autonomous mobile robot described above, at least one of the person acquisition unit, the calculation unit, and the determination unit may include a processor.
With this configuration, at least one of acquisition of information on surrounding persons, calculation of a visual recognition range, and determination of a movement range can be performed by the processor.
A program according to still another aspect of the present disclosure controls movement of an autonomous mobile robot, is readable by a computer, and causes the computer to function as an acquisition unit that acquires information on a peripheral person present around the autonomous mobile robot, a calculation unit that calculates a visual recognition range in which the peripheral person can be visually recognized based on the acquired information on the peripheral person, and a determination unit that determines a movement range in which the autonomous mobile robot can move based on the calculated visual recognition range.
According to this configuration, information on a peripheral person present around the autonomous mobile robot is acquired, a visual recognition range in which the peripheral person can visually recognize is calculated based on the acquired information on the peripheral person, and a movement range in which the autonomous mobile robot can move is determined based on the calculated visual recognition range.
Therefore, since the moving range in which the autonomous mobile robot can move is determined based on the visual recognition range in which the peripheral person present around the autonomous mobile robot can visually recognize, the peripheral person can always monitor the autonomous mobile robot by moving the autonomous mobile robot within the visual recognition range in which the peripheral person present around the autonomous mobile robot can visually recognize, or the autonomous mobile robot can move without being seen by the peripheral person by moving the autonomous mobile robot outside the visual recognition range in which the peripheral person present around the autonomous mobile robot can visually recognize.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. The following embodiments are specific examples of the present disclosure, and do not limit the technical scope of the present disclosure.
(embodiment mode 1)
Fig. 1 is a diagram showing the configuration of a flight control system according to embodiment 1 of the present disclosure. The flight control system shown in fig. 1 includes an autonomous flying robot 10, a monitoring camera 20, and a terminal device 30. The autonomous flying robot 10 is an example of an autonomous mobile robot, and the monitoring camera 20 is an example of an image acquisition device.
The autonomous flying robot 10 autonomously flies from a predetermined departure point to a destination point. When the departure point and the destination point are input, the autonomous flying robot 10 automatically generates a movement route from the departure point to the destination point. The autonomous flying robot 10 includes a plurality of propellers, and moves forward, backward, leftward, rightward, upward, and downward by controlling the rotation speed of each of the plurality of propellers. The autonomous flying robot 10 autonomously flies from a departure point to a destination point while acquiring a current position by a GPS (Global Positioning System). The autonomous flying robot 10 is connected to and can communicate with the monitoring camera 20 and the terminal device 30 via the network 40. The network 40 is, for example, the internet or a cellular communication network.
The monitoring camera 20 captures images of human beings (surrounding humans) present around the autonomous flying robot 10 and transmits captured image information to the autonomous flying robot 10. Further, the flight control system may include a plurality of monitoring cameras 20 instead of a single monitoring camera 20. In addition, the flight control system may not include the monitoring camera 20 at all.
The terminal device 30 sets the departure point and the destination of the autonomous flying robot 10. The terminal device 30 is, for example, a smartphone, a tablet computer, or a personal computer. The terminal device 30 receives inputs of a departure point and a destination of the autonomous flying robot 10 from a user, and transmits the received information indicating the departure point and the destination to the autonomous flying robot 10. Further, the terminal device 30 may receive an input of the departure time of the autonomous flying robot 10 by the user, and transmit the received information indicating the departure time to the autonomous flying robot 10. In addition, the terminal device 30 may transmit environmental information indicating a map of the surroundings of the autonomous flying robot 10 to the autonomous flying robot 10 in response to a request from the autonomous flying robot 10.
Fig. 2 is a block diagram showing the configuration of the autonomous flying robot according to embodiment 1 of the present disclosure. The autonomous flying robot 10 shown in fig. 2 includes: an actuator 101, a position measuring unit 102, an image acquiring unit 103, a communication unit 104, and a control unit 105.
The actuators 101 drive a plurality of propellers, respectively. The actuators 101 rotate a plurality of propellers used to fly the autonomous flying robot 10.
The position measurement unit 102 is, for example, a GPS, and acquires position information indicating the current position of the autonomous flying robot 10. Further, the current position is represented by latitude, longitude, and altitude.
The image acquisition unit 103 is, for example, a camera, and is preferably an omnidirectional camera. The image acquisition unit 103 captures images of human beings existing around the autonomous flying robot 10, for example, and acquires captured image information.
The communication unit 104 receives image information from the monitoring camera 20 via the network 40. The communication unit may be configured using a communication circuit, for example. The communication unit 104 receives information indicating a departure point and a destination from the terminal device 30 via the network 40. The communication unit 104 transmits the position information acquired by the position measurement unit 102 to the terminal device 30, and receives environment information indicating a map of the surroundings of the autonomous flying robot 10 specified from the position information from the terminal device 30.
The control unit 105 includes a processor such as a CPU (central processing unit), for example, and controls the operation of the autonomous flying robot 10. The control unit 105 includes a person information acquisition unit 111, a visual recognition range calculation unit 112, a movement range determination unit 113, a movement route generation unit 114, and a movement control unit 115. The autonomous flying robot 10 may have a memory, not shown, and store a program for functioning as the control unit 105 in the memory. The control unit 105 functions by the CPU executing this program. Alternatively, the control unit 105 may be configured using a dedicated circuit in which the functions of the control unit 105 are incorporated. The dedicated circuit may be, for example, an integrated circuit.
The person information acquisition unit 111 acquires information on the human beings present around the autonomous flying robot 10 based on the image information acquired by the image acquisition unit 103 and/or the image information received by the communication unit 104. The information on a human being existing around the autonomous flying robot 10 includes at least one of the position of the person, the body orientation of the person, the face orientation of the person, and the line-of-sight orientation of the person. The person information acquisition unit 111 performs image recognition processing on the image information to acquire at least one of the position of the person, the body orientation of the person, the face orientation of the person, and the line-of-sight orientation of the person included in the image information. The autonomous flying robot 10 and the monitoring camera 20 may include a distance sensor that measures the distance to a human being, and the person information acquisition unit 111 may acquire information on human beings existing around the autonomous flying robot 10 based on the measured distance.
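A minimal sketch of how the image recognition step might be realized is shown below, using OpenCV's bundled Haar cascade face detector as a stand-in; the patent does not specify any particular detection method, and only face positions in image coordinates are extracted here.

```python
import cv2

# Stand-in for the image recognition processing of the person information
# acquisition unit 111: detect faces in a camera frame with OpenCV's bundled
# Haar cascade. Estimating body, face or line-of-sight orientation would
# require additional processing not shown here.

def detect_face_positions(frame_bgr):
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Return the centre of each detected face in image coordinates.
    return [(x + w / 2.0, y + h / 2.0) for (x, y, w, h) in faces]
```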
The visual recognition range calculation unit 112 calculates a visual recognition range in which the person can be visually recognized based on the information about the person acquired by the person information acquisition unit 111. The visual recognition range is a region on a two-dimensional plane determined according to at least one of the position of the person, the body orientation of the person, the face orientation of the person, and the line-of-sight orientation of the person.
The movement range determination unit 113 determines a movement range in which the autonomous flying robot 10 can move, based on the visual recognition range calculated by the visual recognition range calculation unit 112.
The movement path generation unit 114 generates a movement path along which the autonomous flying robot 10 moves within the movement range determined by the movement range determination unit 113. The movement path generation unit 114 generates a movement path along which the autonomous flying robot 10 moves within the movement range determined by the movement range determination unit 113, using an optimization algorithm such as dynamic programming, for example.
The movement path generation unit 114 may generate the movement path by using an optimization algorithm by superimposing a visual recognition range having a cost value whose value differs depending on the information on the surrounding people on a map including the departure point and the destination point of the autonomous mobile robot 10.
The movement control unit 115 controls the movement of the autonomous flying robot 10 according to the movement path generated by the movement path generation unit 114.
Next, a flight control process of the autonomous flying robot 10 according to embodiment 1 will be described.
Fig. 3 is a flowchart for explaining a flight control process of the autonomous flying robot according to embodiment 1 of the present disclosure.
First, in step S1, the communication unit 104 receives information about the departure point and the destination point from the terminal device 30 via the network 40. The terminal device 30 receives input of information indicating a departure point and a destination point by a user, and transmits the received information indicating the departure point and the destination point to the autonomous flying robot 10. Note that the current position of the autonomous flying robot 10 may be set as the departure point without particularly receiving an input of the departure point. Further, the terminal device 30 may receive a departure time from the departure point or an arrival time at the destination point, and transmit the received departure time or arrival time to the autonomous flying robot 10.
Next, in step S2, the image acquisition unit 103 acquires image information obtained by imaging human beings existing around the autonomous flying robot 10. Further, depending on the installation position of the image acquisition unit 103 or the flight state of the autonomous flying robot 10, image information may not be acquired. That is, when the image acquisition unit 103 is attached to the lower part of the autonomous flying robot 10 and the autonomous flying robot 10 is in the landing state, there is a possibility that the image information cannot be acquired.
Next, in step S3, the communication unit 104 receives image information from the monitoring camera 20 via the network 40. At this time, the communication unit 104 transmits an image request requesting image information to at least one monitoring camera 20 existing between the departure point and the destination point. Upon receiving the image request, the monitoring camera 20 transmits image information obtained by imaging human beings existing around the autonomous flying robot 10 to the autonomous flying robot 10. The communication unit 104 receives image information transmitted by the monitoring camera 20.
The monitoring camera 20 may transmit the image information to the autonomous flying robot 10 only when a person is captured, or may not transmit the image information to the autonomous flying robot 10 when a person is not captured. The monitoring camera 20 may transmit position information indicating the position of the monitoring camera 20 together with the image information. Further, the monitoring camera 20 may perform image recognition processing on the acquired image information without transmitting the image information, and may transmit person information indicating at least one of a position of a person, a body orientation of the person, a face orientation of the person, and a line-of-sight orientation of the person included in the image information. In addition, when the monitoring camera 20 is not present around the autonomous flying robot 10, there is a possibility that image information cannot be acquired from the monitoring camera 20.
Next, in step S4, the communication unit 104 receives environmental information indicating a map of the surroundings of the autonomous flying robot 10 from the terminal device 30 via the network 40. At this time, the communication unit 104 may transmit the position information acquired by the position measurement unit 102 to the terminal device 30, or may receive environment information indicating a map of the surroundings of the autonomous flying robot 10 specified from the position information from the terminal device 30. The communication unit 104 may receive environment information indicating a map including a departure point and a destination point from the terminal device 30. In the present embodiment, the communication unit 104 receives the environment information from the terminal device 30, but the present disclosure is not particularly limited thereto, and the environment information may be received from a server that provides a map. Further, the information on the environment around the autonomous flying robot 10 may be image information acquired by the image acquisition unit 103 and/or image information received by the communication unit 104.
Next, in step S5, the person information acquisition unit 111 acquires information on the human beings present around the autonomous flying robot 10 based on the image information acquired by the image acquisition unit 103 and/or the image information received by the communication unit 104. At this time, the person information acquisition unit 111 performs image recognition processing on the image information to acquire information including at least one of the position of the person, the body orientation of the person, the face orientation of the person, and the line-of-sight orientation of the person.
The person information acquiring unit 111 acquires information on a person present around the autonomous flying robot 10 based on the image information, but the present disclosure is not particularly limited thereto. Information on the human being existing around the autonomous flying robot 10 may be acquired by a position sensor carried by the human being. In this case, the position sensor transmits the current positions of the human beings existing around the autonomous flying robot 10 to the autonomous flying robot 10. For example, a communication device carried by a person may be provided with a position sensor. The position sensor may have a communication function, for example.
Further, information on human beings existing around the autonomous flying robot 10 may be acquired by a geomagnetic sensor carried by the human beings. In this case, the geomagnetic sensor transmits, to the autonomous flying robot 10, a direction in which a human being present around the autonomous flying robot 10 is oriented. For example, a communication device carried by a person may be provided with a geomagnetic sensor. The geomagnetic sensor may have a communication function, for example. Further, information on human beings existing around the autonomous flying robot 10 may be acquired from image information acquired by an image acquisition unit provided in an autonomous flying robot other than the autonomous flying robot 10. That is, when another autonomous flying robot flies in the vicinity of the autonomous flying robot 10, the autonomous flying robot 10 may receive image information obtained by imaging the surroundings of the other autonomous flying robot, and may acquire information on a human being present around the autonomous flying robot 10 from the received image information.
Next, in step S6, the visual recognition range calculation unit 112 calculates the visual recognition range of each person, that is, the range that the person can visually recognize, based on the information on the person acquired by the person information acquisition unit 111. Here, the visual recognition range of embodiment 1 will be described.
Fig. 4 is a diagram showing an example of the visual recognition range in embodiment 1. As shown in fig. 4, the visual recognition range 100 includes: a 1 st region 151 formed on the front side of the body of person 110; a 2 nd region 152 formed on the back side of the body of the person 110, adjacent to the 1 st region 151; and a 3 rd region 153 adjacent to the 1 st region 151 and the 2 nd region 152, formed outside the 1 st region 151 and the 2 nd region 152. In the example shown in fig. 4, the visual recognition range calculation unit 112 calculates the visual recognition range 100 in which the person can visually recognize the person based on the body orientation of the person.
The visual recognition range calculation unit 112 may calculate the visual recognition range 100 in which the person can visually recognize the person based on the face orientation of the person. In this case, the visual recognition range 100 includes: a 1 st region 151 formed on the front side of the face of the person 110; a 2 nd region 152 formed adjacent to the 1 st region 151 on the back side of the face of the person 110; and a 3 rd region 153 adjacent to the 1 st region 151 and the 2 nd region 152, formed outside the 1 st region 151 and the 2 nd region 152.
The shape of the visual recognition range 100 is not limited to the shape shown in fig. 4. For example, when only the current position of a person is obtained, it is not known in which direction the person is facing. Therefore, the visual recognition range calculation unit 112 may calculate a circle of predetermined radius centered on the current position of the person as the visual recognition range 100.
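The following sketch classifies a point on the two-dimensional plane into the 1 st, 2 nd, or 3 rd region of fig. 4, or against the circular visual recognition range when only the person's position is known. The radii and the front/back split are illustrative assumptions, since the text does not define the exact shapes of the regions.

```python
import math

# Sketch of the region classification of fig. 4 on a two-dimensional plane.
# The radii and the front/back split used here are illustrative assumptions.

def classify_point(person_pos, body_heading_deg, point,
                   inner_radius=5.0, outer_radius=10.0):
    dx, dy = point[0] - person_pos[0], point[1] - person_pos[1]
    dist = math.hypot(dx, dy)
    if dist > outer_radius:
        return "outside"   # outside the visual recognition range 100
    if dist > inner_radius:
        return "region3"   # outer band around the 1 st and 2 nd regions
    # Angle between the body heading and the direction towards the point.
    heading = math.radians(body_heading_deg)
    angle = abs((math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi) - math.pi)
    return "region1" if angle <= math.pi / 2 else "region2"

def classify_point_position_only(person_pos, point, radius=10.0):
    # When only the current position is known, a circle of predetermined radius
    # around the person is used as the visual recognition range 100.
    inside = math.hypot(point[0] - person_pos[0], point[1] - person_pos[1]) <= radius
    return "inside" if inside else "outside"
```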
The visual recognition range calculation unit 112 calculates the visual recognition ranges corresponding to all the persons present around the autonomous flying robot 10.
Returning to fig. 3, next, in step S7, the movement range determination unit 113 determines the movement range in which the autonomous flying robot 10 can move, based on the visual recognition range calculated by the visual recognition range calculation unit 112. In embodiment 1, the movement range determination unit 113 determines, as the movement range, the outside of the visual recognition range of the human being existing around the autonomous flying robot 10. That is, the movement range determination unit 113 determines that the outside of the visual recognition range calculated by the visual recognition range calculation unit 112 is the movement range in which the autonomous flying robot 10 can move.
Next, in step S8, the movement path generation unit 114 generates a movement path along which the autonomous flying robot 10 moves within the movement range determined by the movement range determination unit 113. Here, a method of generating a movement path according to embodiment 1 will be described.
Fig. 5 is a schematic diagram for explaining a method of generating a movement path according to embodiment 1. As shown in fig. 5, the movement route generation unit 114 superimposes the visual recognition range 100 on a map 210 including the departure point 201 and the destination point 202 of the autonomous flying robot 10, and divides the map 210 into a grid so that the intersection points of the departure point and the destination point with the sides coincide with each other. Among the respective sides of the divided lattice, the movement path generating unit 114 gives a cost value of 1 st value to a side where a part of the side exists in the 1 st region 151, gives a cost value of 2 nd value smaller than the 1 st value to a side where a part of the side does not exist in the 1 st region 151 and a part of the side exists in the 2 nd region 152, gives a cost value of 3 rd value smaller than the 2 nd value to a side where a part of the side does not exist in the 1 st region 151 and the 2 nd region 152 and a part of the side exists in the 3 rd region 153, and gives a cost value of 4 th value smaller than the 3 rd value to a side where a part of the side does not exist in the 1 st region 151, the 2 nd region 152, and the 3 rd region 153. The 1 st value is, for example, "3", the 2 nd value is, for example, "2", the 3 rd value is, for example, "1", and the 4 th value is, for example, "0".
The movement route generation unit 114 generates, as the movement route 204, a route having the smallest total cost value and the shortest distance from the departure point 201 to the destination point 202 among all routes passing through the sides of the grid from the departure point 201 to the destination point 202. In fig. 5, of all paths passing through the sides of the grid from the departure point 201 to the destination point 202, a path of "0" whose total cost value is the smallest is generated as the movement path 204.
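A possible realization of this path generation step is sketched below: a Dijkstra-style search over the grid edges that minimizes the total cost value first and the travelled distance second. The edge_cost function stands for the region test described above (returning 3, 2, 1, or 0 for an edge); how an edge is tested against the regions is left to the caller, and the toy demonstration at the end is only an assumption for illustration.

```python
import heapq

# Sketch of the path generation of fig. 5: search along grid edges for the path
# with the smallest total cost value and, among those, the shortest distance.

def generate_path(grid_w, grid_h, start, goal, edge_cost):
    def neighbours(node):
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < grid_w and 0 <= ny < grid_h:
                yield (nx, ny)

    inf = (float("inf"), float("inf"))
    best = {start: (0, 0.0)}      # best (total cost, total distance) per node
    prev = {}
    heap = [(0, 0.0, start)]
    while heap:
        cost, dist, node = heapq.heappop(heap)
        if node == goal:          # reconstruct the cheapest, shortest path
            path = [node]
            while node != start:
                node = prev[node]
                path.append(node)
            return list(reversed(path))
        if (cost, dist) > best.get(node, inf):
            continue
        for nxt in neighbours(node):
            cand = (cost + edge_cost(node, nxt), dist + 1.0)
            if cand < best.get(nxt, inf):
                best[nxt] = cand
                prev[nxt] = node
                heapq.heappush(heap, (cand[0], cand[1], nxt))
    return None  # no path exists

if __name__ == "__main__":
    # Toy demonstration: edges touching the square 2 <= x, y <= 3 get cost 3
    # (as if part of the edge lay in the 1 st region 151), all others cost 0.
    def edge_cost(u, v):
        return 3 if any(2 <= x <= 3 and 2 <= y <= 3 for x, y in (u, v)) else 0
    print(generate_path(6, 6, (0, 0), (5, 5), edge_cost))
```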
Returning to fig. 3, next, in step S9, the movement control unit 115 controls the movement of the autonomous flying robot 10 according to the movement path generated by the movement path generation unit 114. That is, the movement control unit 115 causes the autonomous flying robot 10 to start flying toward the destination along the movement path generated by the movement path generation unit 114.
In embodiment 1, the movement control unit 115 causes the autonomous flying robot 10 to depart as soon as the movement path is generated, but the present disclosure is not particularly limited to this; when a departure time is set in advance, the autonomous flying robot 10 may wait and depart when the departure time arrives. When an arrival time is set in advance, the movement control unit 115 may calculate the travel time from the movement path and the movement speed, and may calculate the departure time by subtracting the travel time from the arrival time.
Next, in step S10, the movement control unit 115 determines whether or not the autonomous flying robot 10 has reached the destination. Here, when it is determined that the autonomous flying robot 10 has reached the destination (yes in step S10), the flight control process is ended. On the other hand, if it is determined that the autonomous flying robot 10 has not reached the destination (no in step S10), the process returns to step S2. The processing of steps S2 to S10 is repeated at predetermined timings; the shorter the interval between one execution of step S2 and the next, the more accurately a movement path that keeps the autonomous flying robot 10 out of the persons' view can be generated.
Further, when an obstacle having a predetermined height exists in the vicinity of the person, the movement route generation unit 114 may generate the movement route in consideration of the obstacle.
Fig. 6 is a schematic diagram for explaining a method of generating a movement path in consideration of an obstacle according to embodiment 1. The communication unit 104 may acquire information on an obstacle present around the autonomous flying robot 10. The movement range determination unit 113 may determine the movement range based on the acquired information about the obstacle and the calculated visual recognition range. As shown in fig. 6, when an obstacle 301 exists diagonally in front of the person, an area 302 in the visual recognition range 100 that is a blind spot due to the obstacle 301 is determined as the movement range.
As shown in fig. 6, when an obstacle 301 exists diagonally in front of the person, the cost value is reduced for a region 302 in the visual recognition range 100 that is blind due to the obstacle 301. For example, among the respective grids divided, the movement path generating unit 114 gives a cost value having a value of 4 to a side having a part of the side in the region 302 which is a dead space due to the obstacle 301. The position of the obstacle 301 may be acquired from environmental information or may be acquired from image information acquired by the image acquisition unit 103 of the autonomous flying robot 10. The region 302 that becomes a blind spot may be determined based on at least one of the position of the person, the body orientation of the person, the face orientation of the person, and the line-of-sight orientation of the person.
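One way to test whether a candidate point lies in such a blind spot is a simple line-of-sight check, sketched below with the obstacle modelled as a circle; this modelling choice is an assumption made only for the example.

```python
import math

# Sketch of the blind-spot test: a candidate point is treated as a blind spot
# when the line of sight from the person to the point passes through the
# obstacle (here modelled as a circle standing in for obstacle 301).

def is_blind_spot(person_pos, point, obstacle_center, obstacle_radius):
    px, py = person_pos
    qx, qy = point
    cx, cy = obstacle_center
    dx, dy = qx - px, qy - py
    length_sq = dx * dx + dy * dy
    if length_sq == 0.0:
        return False
    # Closest point on the segment person -> point to the obstacle centre.
    t = max(0.0, min(1.0, ((cx - px) * dx + (cy - py) * dy) / length_sq))
    closest_x, closest_y = px + t * dx, py + t * dy
    return math.hypot(cx - closest_x, cy - closest_y) <= obstacle_radius
```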
The current position acquired by the position measurement unit 102 may include not only the latitude and longitude where the autonomous flying robot 10 is located, but also the altitude at which the autonomous flying robot 10 flies. The movement range determination unit 113 may determine the movement range only when the height acquired by the position measurement unit 102 is lower than a predetermined height. When the height acquired by the position measurement unit 102 is equal to or greater than a predetermined height, the movement range determination unit 113 may not determine the movement range, and the movement route generation unit 114 may generate a path connecting the current position and the destination with the shortest distance as the movement route. The height may be measured by an altimeter provided in the autonomous flying robot 10.
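A minimal sketch of this altitude gating follows; the 30 m threshold is a placeholder, since the text only refers to a predetermined altitude.

```python
# Sketch of the altitude gating described above. The threshold value is an
# assumption; the patent only refers to a "predetermined altitude".
LOW_ALTITUDE_THRESHOLD_M = 30.0

def movement_range_needed(current_altitude_m: float) -> bool:
    # Restrict the movement range only when the robot flies low enough to be
    # easily visually recognized by surrounding persons.
    return current_altitude_m < LOW_ALTITUDE_THRESHOLD_M
```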
Note that, when the movement range cannot be determined by the movement range determination unit 113, the communication unit 104 may notify the terminal device 30 of the current position of the autonomous flying robot 10. The terminal device 30 is carried by a monitor who monitors the autonomous flying robot 10, a manager who manages the autonomous flying robot 10, or an owner who owns the autonomous flying robot 10. For example, when there are a plurality of people around the autonomous flying robot 10 and no area on the map lies outside all of their visual recognition ranges, the movement range determination unit 113 may not be able to determine the movement range. In this case, the communication unit 104 notifies the terminal device 30 of the current position of the autonomous flying robot 10. The terminal device 30 receives a flight control instruction on how to fly the autonomous flying robot 10, and transmits the received flight control instruction to the autonomous flying robot 10. The movement control unit 115 flies the autonomous flying robot 10 according to the received flight control instruction.
Further, when the distance of the movement path generated by the movement path generating unit 114 is longer than a predetermined distance, the communication unit 104 may notify the terminal device 30 of the distance of the generated movement path. The terminal device 30 is carried by a monitor who monitors the autonomous flying robot 10, a manager who manages the autonomous flying robot 10, or an owner who owns the autonomous flying robot 10. For example, when a route that passes through a region outside the visual recognition range is generated, the distance from the departure point to the destination point can become extremely long, and the battery may run out before the autonomous flying robot 10 reaches the destination point. Therefore, when the distance of the movement path generated by the movement path generating unit 114 is longer than the predetermined distance, the communication unit 104 notifies the terminal device 30 of the distance of the generated movement path. The terminal device 30 receives a flight control instruction on how to fly the autonomous flying robot 10, and transmits the received flight control instruction to the autonomous flying robot 10. The movement control unit 115 flies the autonomous flying robot 10 according to the received flight control instruction.
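The two notification fallbacks just described (no determinable movement range, and a route longer than the predetermined distance) might be combined as in the following sketch; every method name on the robot and terminal objects is hypothetical.

```python
def move_or_request_remote_control(robot, terminal, route, max_route_distance_m):
    """Fall back to manual flight control in the two notification cases above."""
    if route is None:
        # Movement range could not be determined: report the current position.
        terminal.notify_position(robot.current_position)
    elif route.length > max_route_distance_m:
        # Route is longer than the predetermined distance: report its length.
        terminal.notify_route_distance(route.length)
    else:
        return robot.follow(route)
    instruction = terminal.wait_for_flight_control_instruction()
    return robot.execute(instruction)
```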
Next, a movement path generating method according to a modification of embodiment 1 will be described. In embodiment 1 described above, the visual recognition range is represented on a two-dimensional plane and a movement path for moving on the two-dimensional plane is generated, whereas in the modification of embodiment 1, the visual recognition range is represented in a three-dimensional space and a movement path for moving in the three-dimensional space is generated. The visual recognition range changes according to the type of the information acquired about the person existing around the autonomous flying robot 10. That is, the visual recognition range calculation unit 112 calculates: a 1 st visual recognition range defined according to the current position of the person, a 2 nd visual recognition range defined according to the body orientation of the person, a 3 rd visual recognition range defined according to the face orientation of the person, and a 4 th visual recognition range defined according to the line-of-sight orientation of the person.
Fig. 7 is a perspective view showing an example of the 1 st visual recognition range of the modification of embodiment 1. Fig. 8(a) is a side view showing an example of the 2 nd visual recognition range of the modification of embodiment 1, and fig. 8(B) is a plan view showing an example of the 2 nd visual recognition range of the modification of embodiment 1. Fig. 9(a) is a side view showing an example of the 3 rd visual recognition range of the modification of embodiment 1, and fig. 9(B) is a plan view showing an example of the 3 rd visual recognition range of the modification of embodiment 1. Fig. 10(a) is a side view showing an example of the 4 th visibility range of the modification of embodiment 1, and fig. 10(B) is a plan view showing an example of the 4 th visibility range of the modification of embodiment 1.
The 1 st visual recognition range 401 shown in fig. 7 is calculated when the current position of the person 110 is acquired. The 1 st visual recognition range 401 is represented by a hemispherical shape centered on the current position of the person 110 and having a predetermined radius.
The 2 nd visual recognition range 402 shown in fig. 8(a) and 8(B) is calculated when the body orientation of the person 110 is acquired. The 2 nd visual recognition range 402 is represented by a solid obtained by removing, from a hemisphere having a predetermined radius and centered on the current position of the person 110, a cut whose bottom surface has a central angle of 60 degrees on the back side of the body of the person 110.
The 3 rd visual recognition range 403 shown in fig. 9(a) and 9(B) is calculated when the face orientation of the person 110 is acquired. The 3 rd visual recognition range 403 is formed in the direction in which the face of the person 110 faces, and is represented by a solid formed by rotating a fan shape having a center angle of 200 degrees in the horizontal direction by 50 degrees in the upward direction and 75 degrees in the downward direction around the position of the face of the person 110. Further, the 3 rd visual recognition range 403 includes: a binocular visual field range 4031 formed by rotating a fan shape having a center angle of 120 degrees in the horizontal direction in the up-down direction, and a peripheral visual field range 4032 formed adjacently on the left and right sides of the binocular visual field range 4031 and formed by rotating a fan shape having a center angle of 40 degrees in the horizontal direction in the up-down direction.
The 4 th visual recognition range 404 shown in fig. 10(a) and 10(B) is calculated when the direction of the line of sight of the person 110 is acquired. The 4 th visual recognition range 404 is formed in a direction in which the line of sight of the person 110 is directed, and is expressed by a quadrangular pyramid shape in which the angle between two sides in the horizontal direction with the position of the eyes of the person 110 as the vertex is 30 degrees and the angle between two sides in the vertical direction with the position of the eyes of the person 110 as the vertex is 20 degrees.
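For illustration only, membership tests for the four visual recognition ranges might be written as below; the hemisphere radius, the pyramid depth, the angle conventions (azimuth in the horizontal plane, elevation from it), and the treatment of each range as a closed solid are assumptions made for this sketch, not values stated in the disclosure.

```python
import math

def _azimuth_elev(d):
    """Horizontal azimuth and vertical elevation (degrees) of a 3-D offset d."""
    az = math.degrees(math.atan2(d[1], d[0]))
    el = math.degrees(math.atan2(d[2], math.hypot(d[0], d[1])))
    return az, el

def _angle_diff(a, b):
    """Smallest absolute difference between two angles in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def in_1st_range(p, person_pos, radius=10.0):
    """Hemisphere of an assumed radius centred on the person's current position."""
    d = tuple(pi - qi for pi, qi in zip(p, person_pos))
    return d[2] >= 0.0 and math.dist(p, person_pos) <= radius

def in_2nd_range(p, person_pos, body_dir_deg, radius=10.0):
    """1 st-range hemisphere minus a 60-degree wedge directly behind the body."""
    if not in_1st_range(p, person_pos, radius):
        return False
    d = tuple(pi - qi for pi, qi in zip(p, person_pos))
    az, _ = _azimuth_elev(d)
    behind = (body_dir_deg + 180.0) % 360.0
    return _angle_diff(az, behind) > 30.0  # outside the 60-degree back cut

def in_3rd_range(p, face_pos, face_dir_deg, radius=10.0):
    """200-degree horizontal fan about the face, swept 50 deg up / 75 deg down."""
    if math.dist(p, face_pos) > radius:
        return False
    d = tuple(pi - qi for pi, qi in zip(p, face_pos))
    az, el = _azimuth_elev(d)
    return _angle_diff(az, face_dir_deg) <= 100.0 and -75.0 <= el <= 50.0

def in_4th_range(p, eye_pos, gaze_dir_deg, gaze_elev_deg=0.0, depth=15.0):
    """Pyramid along the line of sight: 30 deg wide horizontally, 20 deg vertically."""
    dist = math.dist(p, eye_pos)
    if dist == 0.0 or dist > depth:
        return False
    d = tuple(pi - qi for pi, qi in zip(p, eye_pos))
    az, el = _azimuth_elev(d)
    return _angle_diff(az, gaze_dir_deg) <= 15.0 and abs(el - gaze_elev_deg) <= 10.0
```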
Fig. 11 is a schematic diagram for explaining a method of generating a movement path according to a modification of embodiment 1. As shown in fig. 11, the travel route generation unit 114 divides a three-dimensional space 220 including a departure point 201 and a destination point 202 of the autonomous flying robot 10 into a grid shape. Thus, the three-dimensional space 220 is composed of a plurality of cubes. The lengths of the three-dimensional space 220 in the X-axis direction and the Y-axis direction may be determined according to the departure point and the destination point, and the length (height) in the Z-axis direction may be determined according to the height at which the autonomous flying robot 10 can fly. When the movement path generation unit 114 cannot generate a movement path, the movement path generation unit may set the three-dimensional space 220 to be larger and generate a movement path again.
The movement path generating unit 114 assigns a cost value to each of the divided sides of the plurality of cubes based on at least one of a 1 st visual recognition range 401 defined according to the current position of the person, a 2 nd visual recognition range 402 defined according to the body orientation of the person, a 3 rd visual recognition range 403 defined according to the face orientation of the person, and a 4 th visual recognition range 404 defined according to the line of sight orientation of the person.
That is, when only the current position of the person is acquired, the movement path generating unit 114 superimposes the 1 st visual recognition range 401 defined according to the current position of the person on the three-dimensional space 220. The movement route generating unit 114 gives a cost value calculated from the distance from the current position of the person to each of the plurality of cubes having a part of the sides in the 1 st visual recognition range 401 in the three-dimensional space 220. In this case, the cost value is calculated by dividing a predetermined constant α by the distance from the current position of the person. Therefore, the cost value becomes smaller as the distance from the person increases. The movement path generating unit 114 assigns a cost value of "0" to each of the plurality of cubes not having a part of the sides in the 1 st visual recognition range 401 in the three-dimensional space 220. The movement route generation unit 114 generates, as the movement route, a route having the smallest total cost value and the shortest distance from the departure point 201 to the destination point 202 among all routes passing through the sides of the cubes from the departure point 201 to the destination point 202.
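A minimal sketch of this cost-based search over the gridded space is shown below; the value of the constant α, the lexicographic (total cost, distance) ordering used to break ties, and the `neighbors` and `in_first_range` callables are assumptions introduced for illustration.

```python
import heapq
import math

ALPHA = 5.0  # the predetermined constant alpha (illustrative value)

def edge_cost(midpoint, person_pos, in_first_range):
    """alpha / distance if the edge lies partly in the 1 st range, otherwise 0."""
    if not in_first_range(midpoint):
        return 0.0
    d = math.dist(midpoint, person_pos)
    return ALPHA / d if d > 0.0 else ALPHA

def plan(start, goal, neighbors, person_pos, in_first_range):
    """Lexicographic Dijkstra: minimise total cost first, path length second."""
    best = {start: (0.0, 0.0)}
    queue = [(0.0, 0.0, start, [start])]
    while queue:
        cost, dist, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if (cost, dist) > best.get(node, (math.inf, math.inf)):
            continue  # stale queue entry
        for nxt in neighbors(node):
            midpoint = tuple((a + b) / 2.0 for a, b in zip(node, nxt))
            new_cost = cost + edge_cost(midpoint, person_pos, in_first_range)
            new_dist = dist + math.dist(node, nxt)
            if (new_cost, new_dist) < best.get(nxt, (math.inf, math.inf)):
                best[nxt] = (new_cost, new_dist)
                heapq.heappush(queue, (new_cost, new_dist, nxt, path + [nxt]))
    return None  # no route found inside the considered space
```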
When the current position of the person and the body orientation of the person are acquired, the movement path generating unit 114 superimposes the 1 st visual recognition range 401 defined according to the current position of the person and the 2 nd visual recognition range 402 defined according to the body orientation of the person on the three-dimensional space 220. The movement route generating unit 114 assigns a 1 st cost value calculated from the distance from the current position of the person to each of the plurality of cubes having a part of the sides within the 1 st visual recognition range 401 in the three-dimensional space 220, and assigns a 2 nd cost value calculated from the distance from the current position of the person to each of the plurality of cubes having a part of the sides within the 2 nd visual recognition range 402 in the three-dimensional space 220. In this case, the 1 st cost value and the 2 nd cost value are calculated by dividing a predetermined constant α by the distance from the current position of the person.
Then, the movement path generation unit 114 adds the 1 st cost value to the 2 nd cost value multiplied by a predetermined weight value for each of the plurality of cubes whose sides partially lie within the 1 st visual recognition range 401 and the 2 nd visual recognition range 402 in the three-dimensional space 220. The movement path generating unit 114 assigns a cost value of "0" to each of the plurality of cubes not having a part of the sides in the 1 st visual recognition range 401 in the three-dimensional space 220. The movement route generation unit 114 generates, as the movement route, a route having the smallest total cost value and the shortest distance from the departure point 201 to the destination point 202 among all routes passing through the sides of the cubes from the departure point 201 to the destination point 202.
When the current position of the person, the body orientation of the person, and the face orientation of the person are acquired, the movement route generation unit 114 superimposes the 1 st visual recognition range 401 defined according to the current position of the person, the 2 nd visual recognition range 402 defined according to the body orientation of the person, and the 3 rd visual recognition range 403 defined according to the face orientation of the person on the three-dimensional space 220. The movement route generating unit 114 assigns a 1 st cost value calculated based on the distance from the current position of the person to each of the plurality of cubes in which a part of the sides exists within the 1 st visual recognition range 401 in the three-dimensional space 220, assigns a 2 nd cost value calculated based on the distance from the current position of the person to each of the plurality of cubes in which a part of the sides exists within the 2 nd visual recognition range 402 in the three-dimensional space 220, and assigns a 3 rd cost value calculated based on the distance from the position of the face of the person to each of the plurality of cubes in which a part of the sides exists within the 3 rd visual recognition range 403 in the three-dimensional space 220. In this case, the 1 st, 2 nd, and 3 rd cost values are calculated by dividing a predetermined constant α by the respective distances described above.
Then, the movement path generation unit 114 adds the 1 st cost value to the 2 nd cost value multiplied by the 1 st weight value for each of the plurality of cubes whose sides partially lie within the 1 st visual recognition range 401 and the 2 nd visual recognition range 402 in the three-dimensional space 220. Further, for each of the plurality of cubes whose sides partially lie within the 1 st visual recognition range 401, the 2 nd visual recognition range 402, and the 3 rd visual recognition range 403 in the three-dimensional space 220, the movement path generating unit 114 adds the 1 st cost value, the 2 nd cost value multiplied by the 1 st weight value, and the 3 rd cost value multiplied by the 2 nd weight value larger than the 1 st weight value.
Note that the weight value by which the 3 rd cost value is multiplied may differ between the binocular visual field range 4031 and the peripheral visual field range 4032 included in the 3 rd visual recognition range 403; for example, the weight value for the binocular visual field range 4031 may be set larger than the weight value for the peripheral visual field range 4032. The movement path generating unit 114 assigns a cost value of "0" to each of the plurality of cubes not having a part of the sides in the 1 st visual recognition range 401 in the three-dimensional space 220. The movement route generation unit 114 generates, as the movement route, a route having the smallest total cost value and the shortest distance from the departure point 201 to the destination point 202 among all routes passing through the sides of the cubes from the departure point 201 to the destination point 202.
Further, when the current position of the person, the body orientation of the person, the face orientation of the person, and the line-of-sight orientation of the person are acquired, the movement route generating unit 114 superimposes the 1 st visual recognition range 401 defined according to the current position of the person, the 2 nd visual recognition range 402 defined according to the body orientation of the person, the 3 rd visual recognition range 403 defined according to the face orientation of the person, and the 4 th visual recognition range 404 defined according to the line-of-sight orientation of the person on the three-dimensional space 220. The movement route generating unit 114 gives a 1 st cost value calculated from the distance from the current position of the person to each of the plurality of cubes having a part of the sides in the 1 st visual recognition range 401 in the three-dimensional space 220, gives a 2 nd cost value calculated from the distance from the current position of the person to each of the plurality of cubes having a part of the sides in the 2 nd visual recognition range 402, gives a 3 rd cost value calculated from the distance from the position of the face of the person to each of the plurality of cubes having a part of the sides in the 3 rd visual recognition range 403, and gives a 4 th cost value calculated from the distance from the position of the eyes of the person to each of the plurality of cubes having a part of the sides in the 4 th visual recognition range 404. In this case, the 1 st, 2 nd, 3 rd, and 4 th cost values are calculated by dividing a predetermined constant α by the respective distances described above.
Then, for each of the plurality of cubes whose sides are partially within the 1 st visual recognition range 401 and the 2 nd visual recognition range 402 in the three-dimensional space 220, the movement path generating unit 114 adds the 1 st cost value to the 1 st weight value multiplied by the 2 nd cost value. Further, for each of the plurality of cubes whose sides partially lie within the 1 st visual recognition range 401, the 2 nd visual recognition range 402, and the 3 rd visual recognition range 403 in the three-dimensional space 220, the movement path generating unit 114 adds the 1 st cost value, the 2 nd cost value multiplied by the 1 st weight value, and the 3 rd cost value multiplied by the 2 nd weight value larger than the 1 st weight value. Further, for each of the plurality of cubes whose sides partially lie within the 1 st visual recognition range 401, the 2 nd visual recognition range 402, the 3 rd visual recognition range 403, and the 4 th visual recognition range 404 in the three-dimensional space 220, the movement path generating unit 114 adds the 1 st cost value, the 2 nd cost value multiplied by the 1 st weight value, the 3 rd cost value multiplied by the 2 nd weight value larger than the 1 st weight value, and the 4 th cost value multiplied by the 3 rd weight value larger than the 2 nd weight value. The movement path generating unit 114 assigns a cost value of "0" to each of the plurality of cubes in which no part of the sides exists in the 1 st visual recognition range 401 in the three-dimensional space 220. The movement route generation unit 114 generates, as the movement route, a route having the smallest total cost value and the shortest distance from the departure point 201 to the destination point 202 among all routes passing through the sides of the cube from the departure point 201 to the destination point 202.
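The weighted accumulation of the 1 st to 4 th cost values described above might be condensed as follows; the concrete weight values and the guard against zero distances are assumptions for this sketch.

```python
def combined_cost(in_ranges, distances, alpha=5.0, weights=(2.0, 4.0, 8.0)):
    """
    Combine the 1 st to 4 th cost values for one cube edge.
    in_ranges: (in1, in2, in3, in4) booleans for the four visual recognition ranges.
    distances: (d_person, d_face, d_eye) distances used for the cost values.
    weights:   increasing weights for the 2 nd, 3 rd and 4 th cost values (assumed).
    """
    in1, in2, in3, in4 = in_ranges
    d_person, d_face, d_eye = distances
    if not in1:
        return 0.0                                        # outside the 1 st range: cost "0"
    cost = alpha / max(d_person, 1e-6)                    # 1 st cost value, unweighted
    if in2:
        cost += weights[0] * alpha / max(d_person, 1e-6)  # 2 nd cost value x 1 st weight
    if in3:
        cost += weights[1] * alpha / max(d_face, 1e-6)    # 3 rd cost value x 2 nd weight
    if in4:
        cost += weights[2] * alpha / max(d_eye, 1e-6)     # 4 th cost value x 3 rd weight
    return cost
```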
As described above, in the modification of embodiment 1, since the movement path that moves in the three-dimensional space is generated, the movement path of the autonomous flying robot 10 that does not enter the view of the person can be generated with higher accuracy.
(embodiment mode 2)
In embodiment 1, the outside of the visual recognition range is determined as the movement range in which the autonomous flying robot can move, but in embodiment 2, the inside of the visual recognition range of a predetermined person is determined as the movement range in which the autonomous flying robot can move.
Fig. 12 is a block diagram showing the configuration of an autonomous flying robot according to embodiment 2 of the present disclosure. The configuration of the flight control system according to embodiment 2 is the same as that of fig. 1, and therefore, the description thereof is omitted.
The autonomous flying robot 11 shown in fig. 12 includes: an actuator 101, a position measuring unit 102, an image acquiring unit 103, a communication unit 104, a control unit 1051, and a storage unit 106. In embodiment 2, the same components as those in embodiment 1 are denoted by the same reference numerals, and descriptions thereof are omitted.
The storage unit 106 stores identification information for identifying a predetermined person in advance. The identification information is input through the terminal device 30 and transmitted to the autonomous flying robot 11. The communication unit 104 receives the identification information transmitted from the terminal device 30 and stores the identification information in the storage unit 106. In this way, the terminal device 30 registers the identification information in the storage unit 106 of the autonomous flying robot 11. Further, the identification information is a face image of the predetermined person. The predetermined person is, for example, a monitor who monitors the autonomous flying robot 11, a manager who manages the autonomous flying robot 11, or an owner who owns the autonomous flying robot 11. Further, the storage unit 106 is not limited to storing one piece of identification information corresponding to one person, and may store a plurality of pieces of identification information corresponding to a plurality of persons.
The control unit 1051 is, for example, a CPU, and controls the operation of the autonomous flying robot 11. The control unit 1051 includes: a person information acquisition unit 111, a visual recognition range calculation unit 112, a movement range determination unit 113, a movement route generation unit 114, a movement control unit 115, and a registered person determination unit 116. The autonomous flying robot 11 may have a memory, not shown, in which a program for functioning as the control unit 1051 is stored. The CPU executes the program, whereby the control unit 1051 functions. Alternatively, the control unit 1051 may be configured by using a dedicated circuit in which the function of the control unit 1051 is incorporated. The dedicated circuitry may be, for example, an integrated circuit.
The registered person determination unit 116 determines whether or not a person existing around the autonomous flying robot 11 is a predetermined person based on the identification information registered in the storage unit 106. That is, the registered person determination unit 116 compares the face image of the person included in the image information with the face image registered in the storage unit 106 in advance, and determines that the person existing around the autonomous flying robot 11 is a predetermined person when the persons in the two face images are the same person.
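As one possible (hypothetical) realization of this comparison, a face-embedding based match is sketched below; the `embed` function stands in for any face recognition model and the similarity threshold is an assumed value, since the disclosure does not specify how the two face images are compared.

```python
import numpy as np

def is_registered_person(detected_face, registered_faces, embed, threshold=0.6):
    """
    Return True if the detected face matches any pre-registered face image.
    `embed` maps a face image to a feature vector (hypothetical model);
    the cosine-similarity threshold of 0.6 is an assumed value.
    """
    candidate = np.asarray(embed(detected_face), dtype=float)
    for registered in registered_faces:
        reference = np.asarray(embed(registered), dtype=float)
        similarity = float(np.dot(candidate, reference) /
                           (np.linalg.norm(candidate) * np.linalg.norm(reference)))
        if similarity >= threshold:
            return True
    return False
```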
When it is determined that the person existing around the autonomous flying robot 11 is the predetermined person, the movement range determination unit 113 determines the inside of the visual recognition range of the predetermined person existing around the autonomous flying robot 11 as the movement range in which the autonomous flying robot 11 can move.
The movement path generation unit 114 generates a movement path along which the autonomous flying robot 11 moves within the movement range determined by the movement range determination unit 113. In embodiment 2, the movement path generating unit 114 generates a movement path along which the autonomous flying robot 11 moves within the visual recognition range of a predetermined person existing around the autonomous flying robot 11.
The method for calculating the visual recognition range and the method for generating the movement path in embodiment 2 are the same as those in embodiment 1. However, the movement route generation unit 114 generates, as the movement route 204, a route having the largest total cost value and the shortest distance from the departure point 201 to the destination point 202 among all routes passing through the sides of the grid from the departure point 201 to the destination point 202.
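Because embodiment 2 selects the route with the largest total cost value rather than the smallest, the lexicographic shortest-path search sketched earlier no longer applies directly; the following brute-force enumeration of simple routes implements the stated selection rule on small example grids, purely as an illustration and not as the method of the disclosure.

```python
import math

def best_route_max_cost(start, goal, neighbors, edge_cost):
    """
    Among all simple routes from start to goal, return the one with the largest
    total cost value and, among ties, the shortest total distance.
    Exponential-time enumeration: usable only on small example grids.
    """
    best_path, best_key = None, None

    def dfs(node, visited, cost, dist, path):
        nonlocal best_path, best_key
        if node == goal:
            key = (-cost, dist)  # maximise cost first, then minimise distance
            if best_key is None or key < best_key:
                best_key, best_path = key, list(path)
            return
        for nxt in neighbors(node):
            if nxt in visited:
                continue
            visited.add(nxt)
            path.append(nxt)
            dfs(nxt, visited,
                cost + edge_cost(node, nxt),
                dist + math.dist(node, nxt),
                path)
            path.pop()
            visited.remove(nxt)

    dfs(start, {start}, 0.0, 0.0, [start])
    return best_path
```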
In embodiment 2, the identification information may be transmitted from a communication device carried by a predetermined person. In this case, the communication device is, for example, a device that transmits identification information by infrared rays or a device that transmits identification information by radio. The communication unit 104 receives the identification information transmitted by the communication device. The registered person determination unit 116 determines whether or not a person existing around the autonomous flying robot 11 is a predetermined person based on the identification information registered in the storage unit 106. That is, the registered person determination unit 116 compares the identification information received via the communication unit 104 with the identification information registered in advance in the storage unit 106, and determines that the human being existing around the autonomous flying robot 11 is a predetermined person when both pieces of identification information match.
In this way, since the inside of the visual recognition range of the predetermined person whose identification information is registered in advance is determined as the movement range, the predetermined person can always monitor the autonomous flying robot 11 moving within the visual recognition range.
(embodiment mode 3)
In embodiment 1 and embodiment 2, the movement path is generated in the autonomous flying robots 10 and 11, and in embodiment 3, the movement path is generated in a server connected to the autonomous flying robots via a network.
Fig. 13 is a block diagram showing the configuration of a flight control system according to embodiment 3 of the present disclosure. In embodiment 3, the same configurations as those in embodiments 1 and 2 will not be described.
The flight control system shown in fig. 13 includes: the autonomous flying robot 12, the monitoring camera 20, the terminal device 30, and the server 50.
The autonomous flying robot 12 includes: an actuator 101, a position measuring unit 102, an image acquiring unit 103, a communication unit 104, and a control unit 1052. The control unit 1052 includes a movement control unit 115. The autonomous flying robot 12 may have a memory, not shown, in which a program for functioning as the control unit 1052 is stored. The control unit 1052 functions by the CPU executing the program. Alternatively, the control unit 1052 may be configured by using a dedicated circuit in which the function of the control unit 1052 is incorporated. The dedicated circuit may be, for example, an integrated circuit.
The communication unit 104 transmits, to the server 50, the image information acquired by the image acquisition unit 103 by imaging the persons present around the autonomous flying robot 12. The communication unit 104 also receives the movement route information transmitted from the server 50.
The movement control unit 115 controls the movement of the autonomous flying robot 12 based on the movement path indicated by the movement path information received by the communication unit 104.
The server 50 is connected to and can communicate with the autonomous flying robot 12, the monitoring camera 20, and the terminal device 30 via the network 40. The server 50 includes a communication unit 501 and a control unit 502.
The communication unit 501 receives image information from the monitoring camera 20 via the network 40. Further, the communication unit 501 receives image information from the autonomous flying robot 12 via the network 40. The communication unit 501 receives information indicating a departure point and a destination from the terminal device 30 via the network 40. Further, the communication unit 501 receives, from the terminal device 30, environment information indicating a map of the surroundings of the autonomous flying robot 12 specified from the position information indicating the current position of the autonomous flying robot 12.
The control unit 502 is, for example, a CPU, and generates a movement path along which the autonomous flying robot 12 moves. The control unit 502 includes: a personal information acquisition unit 511, a visual recognition range calculation unit 512, a movement range determination unit 513, and a movement route generation unit 514. The server 50 may have a memory, not shown, in which a program for functioning as the control unit 502 is stored. The control unit 502 functions by the CPU executing the program. Alternatively, the control unit 502 may be configured using a dedicated circuit in which the function of the control unit 502 is incorporated. The dedicated circuitry may be, for example, an integrated circuit.
The functions of the personal information acquisition unit 511, the visual recognition range calculation unit 512, the movement range determination unit 513, and the movement route generation unit 514 in embodiment 3 are the same as those of the personal information acquisition unit 111, the visual recognition range calculation unit 112, the movement range determination unit 113, and the movement route generation unit 114 in embodiment 1. In embodiment 3, a movement path is generated in the same manner as in embodiment 1.
The communication unit 501 transmits movement route information indicating the movement route generated by the movement route generation unit 514 to the autonomous flying robot 12.
In this way, since the movement path is generated in the server 50, the calculation load of the autonomous flying robot 12 can be reduced.
In embodiment 3, the movement path is generated by the server 50, but the present disclosure is not particularly limited thereto, and the movement path may be generated by the terminal device 30 or may be generated by a manipulator that manipulates the autonomous flying robot 12.
In embodiments 1 to 3, the autonomous flying robot is an example of an autonomous mobile robot, but the present disclosure is not particularly limited thereto, and the configuration of the autonomous flying robot may be applied to an autonomous traveling robot (unmanned vehicle) traveling on the ground or a cleaning robot autonomously cleaning the ground in the same manner. In addition, the space in which the autonomous mobile robot moves may be any one of indoor and outdoor.
Further, a part of the components included in the autonomous mobile robot or the server according to each of the above embodiments may be constituted by, for example, 1 system LSI (Large Scale Integration). For example, at least one of the communication unit 104 and the control unit 105 of the autonomous mobile robot 10 may be configured by a system LSI.
For example, at least one of the communication unit 104 and the control unit 1051 of the autonomous mobile robot 11 may be formed of a system LSI.
For example, at least one of the communication unit 104 and the control unit 1052 of the autonomous mobile robot 12 may be formed of a system LSI.
For example, at least one of the communication unit 501 and the control unit 502 of the server 50 may be constituted by a system LSI.
The system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on one chip, and specifically, is a computer system including a microprocessor, a ROM (Read Only Memory), a RAM (Random Access Memory), and the like. The ROM stores a computer program. The microprocessor operates in accordance with the computer program, whereby the system LSI achieves its functions.
Although the term system LSI is used here, it may also be called an IC, an LSI, a super LSI, or an ultra LSI depending on the degree of integration. The method of integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of the circuit elements inside the LSI can be reconfigured, may also be used.
Furthermore, if a technique for realizing an integrated circuit that replaces an LSI appears due to the progress of semiconductor technology or another derived technique, it is needless to say that the functional blocks may be integrated using this technique. There is also the possibility of applying biotechnology and the like.
(industrial applicability)
The movement control method and the autonomous mobile robot according to the present disclosure are useful as a movement control method for controlling the movement of an autonomous mobile robot and as an autonomously moving autonomous mobile robot, because moving the autonomous mobile robot within the range in which persons existing around it can visually recognize it allows those persons to always monitor the autonomous mobile robot, while moving it outside that range allows the autonomous mobile robot to move without being seen by those persons.

Claims (18)

1. A movement control method of controlling movement of an autonomous mobile robot, the method comprising:
acquiring information on a person around the autonomous mobile robot,
calculating a visual recognition range in which the surrounding person can be visually recognized based on the acquired information on the surrounding person,
identification information for identifying a predetermined person is registered in advance,
discriminating whether the peripheral person existing in the periphery of the autonomous mobile robot is the predetermined person based on the registered identification information,
determining only the outside of the visual recognition range of the predetermined person existing around the autonomous mobile robot as a movement range in which the autonomous mobile robot can move when it is determined that the peripheral person existing around the autonomous mobile robot is the predetermined person,
generating a movement path along which the autonomous mobile robot moves within the decided movement range,
the visual recognition range includes: a 1 st region formed on the front side of the surrounding person, a 2 nd region formed on the back side of the body of the surrounding person, and a 3 rd region formed outside the 1 st region and the 2 nd region,
superimposing the visual recognition range on a map including a departure point and a destination point of the autonomous mobile robot, dividing the map into a grid shape so that the departure point and the destination point coincide with intersections of sides, and, among the sides of the divided grids, assigning a cost value having a 1 st value to each side a part of which exists in the 1 st region, assigning a cost value having a 2 nd value smaller than the 1 st value to each side a part of which does not exist in the 1 st region and a part of which exists in the 2 nd region, assigning a cost value having a 3 rd value smaller than the 2 nd value to each side a part of which does not exist in the 1 st region or the 2 nd region and a part of which exists in the 3 rd region, and assigning a cost value having a 4 th value smaller than the 3 rd value to each side no part of which exists in the 1 st region, the 2 nd region, or the 3 rd region,
the movement route is generated as a route having the smallest total cost value and the shortest distance from the departure point to the destination point among all routes passing through the sides of the grid from the departure point to the destination point.
2. The movement control method according to claim 1,
the identification information is a face image of the predetermined person.
3. The movement control method according to claim 1,
transmitting the identification information from a communication device carried by the predetermined person.
4. The movement control method according to claim 1,
the information on the surrounding person existing around the autonomous mobile robot includes at least one of a position of the surrounding person, a body orientation of the surrounding person, a face orientation of the surrounding person, and a line-of-sight orientation of the surrounding person.
5. The movement control method according to claim 1,
the visual recognition range is an area on a two-dimensional plane determined according to at least one of the position of the peripheral person, the body orientation of the peripheral person, the face orientation of the peripheral person, and the line-of-sight orientation of the peripheral person.
6. The movement control method according to claim 1,
the visual recognition range has a cost value whose numerical value differs according to the information on the surrounding persons,
the autonomous mobile robot superimposes the visual recognition range on a map including a departure point and a destination point of the autonomous mobile robot, and generates the movement path using an optimization algorithm.
7. The movement control method according to claim 1,
the autonomous mobile robot includes an image acquisition unit that acquires an image of the surroundings of the autonomous mobile robot,
the information on the surrounding person is acquired from the image information acquired by the image acquiring unit.
8. The movement control method according to claim 1,
the information on the surrounding person is acquired from image information acquired by an image acquisition device provided around the autonomous mobile robot.
9. The movement control method according to claim 1,
the information on the surrounding person is the current position of the surrounding person obtained by a position sensor carried by the surrounding person.
10. The movement control method according to claim 1,
the information on the surrounding person is a direction in which the surrounding person is oriented, which is acquired by a geomagnetic sensor carried by the surrounding person.
11. The movement control method according to claim 1,
the information on the surrounding person is acquired from image information of the surroundings of the other autonomous mobile robot acquired by an image acquisition unit provided in the other autonomous mobile robot than the autonomous mobile robot.
12. The movement control method according to claim 1,
acquiring information on obstacles existing around the autonomous mobile robot,
the movement range is determined based on the acquired information on the obstacle and the calculated visual recognition range.
13. The movement control method according to claim 1,
the autonomous mobile robot is an autonomous flying robot,
acquiring the flying height of the autonomous flying robot,
the movement range is determined only if the obtained altitude is lower than a predetermined altitude.
14. The movement control method according to claim 1,
and notifying a predetermined terminal device of the current position of the autonomous mobile robot when the movement range is not determined.
15. The movement control method according to claim 1,
when the distance of the generated movement path is longer than a predetermined distance, the distance of the generated movement path is notified to a predetermined terminal device.
16. The movement control method according to claim 1,
at least one of acquisition of information on the surrounding person, calculation of the visual recognition range, determination of whether or not the predetermined person is present, and determination of the movement range is performed by a processor.
17. An autonomous mobile robot that autonomously moves, comprising:
a person acquisition unit that acquires information on a surrounding person present around the autonomous mobile robot;
a calculation unit that calculates a visual recognition range in which the peripheral person can be visually recognized, based on the acquired information on the peripheral person;
a registration unit that registers identification information for identifying a predetermined person in advance;
a discrimination unit that determines whether or not the peripheral person existing around the autonomous mobile robot is the predetermined person based on the registered identification information;
a determination unit configured to determine, when it is determined that the peripheral person existing around the autonomous mobile robot is the predetermined person, only a range outside a visual recognition range of the predetermined person existing around the autonomous mobile robot as a movement range in which the autonomous mobile robot can move; and
a movement path generation unit that generates a movement path along which the autonomous mobile robot moves within the determined movement range,
the visual recognition range includes: a 1 st region formed on the front side of the surrounding person, a 2 nd region formed on the back side of the body of the surrounding person, and a 3 rd region formed outside the 1 st region and the 2 nd region,
superimposing the visual recognition range on a map including a departure point and a destination point of the autonomous mobile robot, dividing the map into a grid shape so that the departure point and the destination point coincide with intersections of sides, and, among the sides of the divided grids, assigning a cost value having a 1 st value to each side a part of which exists in the 1 st region, assigning a cost value having a 2 nd value smaller than the 1 st value to each side a part of which does not exist in the 1 st region and a part of which exists in the 2 nd region, assigning a cost value having a 3 rd value smaller than the 2 nd value to each side a part of which does not exist in the 1 st region or the 2 nd region and a part of which exists in the 3 rd region, and assigning a cost value having a 4 th value smaller than the 3 rd value to each side no part of which exists in the 1 st region, the 2 nd region, or the 3 rd region,
the movement route is generated as a route having the smallest total cost value and the shortest distance from the departure point to the destination point among all routes passing through the sides of the grid from the departure point to the destination point.
18. The autonomous mobile robot of claim 17,
at least one of the person acquisition unit, the calculation unit, the discrimination unit, and the determination unit includes a processor.
CN201610918571.2A 2015-12-10 2016-10-21 Movement control method and autonomous mobile robot Active CN107045355B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015-241142 2015-12-10
JP2015241142 2015-12-10
JP2016-144363 2016-07-22
JP2016144363A JP2017111790A (en) 2015-12-10 2016-07-22 Movement control method, autonomous mobile robot, and program

Publications (2)

Publication Number Publication Date
CN107045355A CN107045355A (en) 2017-08-15
CN107045355B true CN107045355B (en) 2021-10-29

Family

ID=59080243

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610918571.2A Active CN107045355B (en) 2015-12-10 2016-10-21 Movement control method and autonomous mobile robot

Country Status (2)

Country Link
JP (1) JP2017111790A (en)
CN (1) CN107045355B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11567513B2 (en) * 2018-08-16 2023-01-31 Rakuten Group, Inc. Unmanned aerial vehicle control system, unmanned aerial vehicle control method, and program
CN110515385A (en) * 2019-09-09 2019-11-29 金鹏电子信息机器有限公司 A kind of path following method and device of mobile robot
JP7447670B2 (en) * 2020-05-15 2024-03-12 トヨタ自動車株式会社 Autonomous mobile device control system, its control method and its control program
US11971721B2 (en) * 2020-08-26 2024-04-30 Toyota Jidosha Kabushiki Kaisha Autonomous mobile robot control system, control method thereof, a non-transitory computer readable medium storing control program thereof, and autonomous mobile robot control device
JP7476727B2 (en) * 2020-08-26 2024-05-01 トヨタ自動車株式会社 Autonomous mobile robot control system, control method thereof, control program thereof, and autonomous mobile robot control device
JP7420048B2 (en) 2020-10-22 2024-01-23 トヨタ自動車株式会社 Control devices, systems, programs, control equipment, aircraft, sensors and system operation methods
WO2022141040A1 (en) * 2020-12-29 2022-07-07 Telefonaktiebolaget Lm Ericsson (Publ) Method and device for event detection
CN112975386B (en) * 2021-03-09 2022-09-06 重庆机器人有限公司 Automatic refrigerator assembling process

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006082774A (en) * 2004-09-17 2006-03-30 Hiroboo Kk Unmanned flying object and its controlling method
JP4456561B2 (en) * 2005-12-12 2010-04-28 本田技研工業株式会社 Autonomous mobile robot
JP4940698B2 (en) * 2006-02-27 2012-05-30 トヨタ自動車株式会社 Autonomous mobile robot
JP4699426B2 (en) * 2006-08-08 2011-06-08 パナソニック株式会社 Obstacle avoidance method and obstacle avoidance moving device
JP4528295B2 (en) * 2006-12-18 2010-08-18 株式会社日立製作所 GUIDANCE ROBOT DEVICE AND GUIDANCE SYSTEM
JP4682217B2 (en) * 2007-03-07 2011-05-11 パナソニック株式会社 Behavior control apparatus, method, and program
JP2008260107A (en) * 2007-04-13 2008-10-30 Yaskawa Electric Corp Mobile robot system
JP2009011362A (en) * 2007-06-29 2009-01-22 Sony Computer Entertainment Inc Information processing system, robot apparatus, and its control method
JP5160322B2 (en) * 2008-06-30 2013-03-13 株式会社Ihi Autonomous mobile robot apparatus and control method for autonomous mobile robot apparatus
TW201123031A (en) * 2009-12-24 2011-07-01 Univ Nat Taiwan Science Tech Robot and method for recognizing human faces and gestures thereof
JP2011209966A (en) * 2010-03-29 2011-10-20 Sony Corp Image processing apparatus and method, and program
CN102411368B (en) * 2011-07-22 2013-10-09 北京大学 Active vision human face tracking method and tracking system of robot
CN102819263B (en) * 2012-07-30 2014-11-05 中国航天科工集团第三研究院第八三五七研究所 Multi-camera visual perception system for UGV (Unmanned Ground Vehicle)
CN102853830A (en) * 2012-09-03 2013-01-02 东南大学 Robot vision navigation method based on general object recognition
JP5898022B2 (en) * 2012-09-18 2016-04-06 シャープ株式会社 Self-propelled equipment
CN104718507B (en) * 2012-11-05 2017-03-29 松下知识产权经营株式会社 The walking information generation device of automatic walking apparatus, method and automatic walking apparatus
WO2014102995A1 (en) * 2012-12-28 2014-07-03 株式会社日立製作所 Monitoring system, method, and information-recording medium containing program
JP2014197294A (en) * 2013-03-29 2014-10-16 株式会社日立産機システム Position identification device and mobile robot having the same
CN105022401B (en) * 2015-07-06 2017-08-04 南京航空航天大学 Many four rotor wing unmanned aerial vehicles collaboration SLAM methods of view-based access control model

Also Published As

Publication number Publication date
JP2017111790A (en) 2017-06-22
CN107045355A (en) 2017-08-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant