WO2023243221A1 - Movement path determination system, landing site determination system, movement path determination device, drone control device, and computer program - Google Patents

Publication number: WO2023243221A1
Authority: WIPO (PCT)
Application number: PCT/JP2023/015737
Other languages: French (fr), Japanese (ja)
Inventor: Jun Iwamoto (岩元 淳)
Applicant: Sumitomo Electric Industries, Ltd. (住友電気工業株式会社)


  • The present disclosure relates to a movement route determination system, a landing site determination system, a movement route determination device, a drone control device, and a computer program.
  • This application claims priority based on Japanese Application No. 2022-94867 filed on June 13, 2022, the entire contents of which are incorporated herein.
  • Patent Document 1 discloses a method for determining a delivery location of a package using a drone.
  • A movement route determination system includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters; a landing site determination unit that determines, based on the drone information and the spatial information, a landing site for the drone; a first movement route determination unit that determines a first movement route of the drone from a base to the approach opening; and a second movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a second movement route of the drone from the approach opening to the landing site.
  • FIG. 1 is a diagram showing the configuration of a delivery system according to Embodiment 1 of the present disclosure.
  • FIG. 2 is a block diagram showing the configuration of a terminal device according to Embodiment 1 of the present disclosure.
  • FIG. 3 is a diagram showing an example of an image output from the camera.
  • FIG. 4 is a diagram showing an example of drone information.
  • FIG. 5 is a diagram showing an example of opening information.
  • FIG. 6 is a diagram showing an example of movement route information.
  • FIG. 7 is a diagram showing an example of a display on the touch panel by the enterable opening display unit.
  • FIG. 8 is a diagram showing an example of a display on the touch panel by the possible landing site display unit.
  • FIG. 9 is a diagram showing an example of a display on the touch panel by the movement route display unit.
  • FIG. 10 is a diagram showing an example of a display on the touch panel by the movement route display unit.
  • FIG. 11 is a block diagram showing the configuration of a server according to Embodiment 1 of the present disclosure.
  • FIG. 12 is a sequence diagram showing the processing procedure of the delivery system according to Embodiment 1 of the present disclosure.
  • FIG. 13 is a diagram showing the configuration of a delivery system according to Embodiment 2 of the present disclosure.
  • FIG. 14 is a block diagram showing the configuration of a terminal device according to Embodiment 2 of the present disclosure.
  • FIG. 15 is a sequence diagram showing the processing procedure of the delivery system according to Embodiment 2 of the present disclosure.
  • FIG. 16 is a diagram showing an example of an image displayed on the touch panel.
  • Conventionally, a helipad on the roof of a building or a marked area about six tatami mats (roughly 10 m²) in size set up on the ground is used as a landing site for a drone, and the drone lands by aiming at the mark on the helipad or marked area. It is not practical to set up a helipad or marked area at a private home, and no method has been established for specifying a drone landing site for each private home.
  • The present disclosure has been made in view of such circumstances, and its purpose is to provide a movement route determination system, a landing site determination system, a movement route determination device, a drone control device, and a computer program that can determine the movement route or landing site of a drone in a simple manner.
  • According to an aspect of the present disclosure, a movement route determination system includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site; a landing site determination unit that determines, based on the drone information and the spatial information, a landing site for the drone; a first movement route determination unit that determines a first movement route of the drone from the base of the drone to the approach opening; and a second movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a second movement route of the drone from the approach opening to the landing site.
  • The drone information includes the size of the drone. Therefore, by acquiring the drone information and the spatial information, it is possible to determine the approach opening through which the drone enters the space around the candidate landing site, and also to determine the landing site for the drone. Once the approach opening and the landing site are determined, the first movement route and the second movement route can be determined. For example, a user can easily determine the movement route or landing site of the drone by acquiring spatial information with a distance sensor built into a smartphone or the like and acquiring the drone information from an external server.
  • The drone information may further include the flight accuracy of the drone. If the flight accuracy is low, the drone may come into contact with the surroundings of an opening and be unable to enter it, or may come into contact with the surroundings of a landing site and be unable to land there. According to this configuration, the approach opening and the landing site can be determined in consideration of the flight accuracy of the drone, so it is possible to determine an approach opening that the drone can reliably enter and a landing site where the drone can reliably land.
  • The approach opening determination unit may determine, based on the drone information and the spatial information, enterable openings into which the drone can enter. The movement route determination system may further include an enterable opening display unit that displays the enterable openings on a screen, and a first selection information acquisition unit that acquires information on the user's selection of an enterable opening; the approach opening determination unit may further determine the approach opening based on the selection information. With this configuration, the user can select any one of a plurality of enterable openings, excluding openings that the user does not want the drone to enter or choosing an opening that the drone can easily enter, so that an approach opening matching the user's wishes can be determined.
  • The approach opening determination unit may further determine, based on the drone information and the spatial information, non-enterable openings into which the drone cannot enter, and the enterable opening display unit may display the enterable openings and the non-enterable openings in a distinguishable manner. In this way, openings into which the drone cannot enter, for example because the opening is smaller than the drone, are displayed separately from enterable openings, allowing the user to efficiently select an enterable opening.
  • The landing site determination unit may determine, based on the drone information and the spatial information, possible landing sites where the drone can land. The movement route determination system may further include a possible landing site display unit that displays the possible landing sites on a screen, and a second selection information acquisition unit that acquires information on the user's selection of a possible landing site; the landing site determination unit may further determine the landing site based on the selection information. With this configuration, the user can select any of the possible landing sites, excluding sites where the user does not want the drone to land or choosing a site where the drone can easily land, so that a landing site matching the user's wishes can be determined.
  • The possible landing site display unit may further display the outline of the drone on the screen based on the drone information.
  • The movement route determination system may further include a movement route display unit that displays the second movement route on a screen. In this way, the movement route of the drone from the approach opening to the landing site can be displayed, making the user aware of the route and allowing the user to be warned not to place objects on it.
  • The movement route determination system may further include a first notification unit that, when the second movement route is not determined by the second movement route determination unit, notifies the user that the second movement route has not been determined. A user who receives this notification can take action such as changing the landing site.
  • The movement route determination system may further include a second notification unit that notifies the user when the drone was unable to move along the second movement route. A user who receives this notification can take measures such as removing obstacles on the second movement route or changing the landing site.
  • According to an aspect of the present disclosure, a landing site determination system includes: a position specifying unit that specifies the position of a terminal device; a communication unit that communicates with a wireless tag; a tag position calculation unit that calculates the relative position of the wireless tag with respect to the terminal device based on the communication between the communication unit and the wireless tag; and a landing site determination unit that determines the position of the wireless tag as the landing site of the drone based on the position of the terminal device and the relative position of the wireless tag. With this configuration, the landing site of the drone can be determined simply by placing a wireless tag at the landing site and acquiring the position of the terminal device.
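As an illustration of the tag position calculation, the relative position can be recovered from range measurements alone. The sketch below assumes the communication unit can measure the distance to the tag from three known terminal positions (for example via UWB round-trip timing); the planar setup, function names, and measurement model are illustrative assumptions, not details from the patent:

```python
def tag_relative_position(p1, p2, p3, d1, d2, d3):
    """Solve for the tag's 2-D position relative to the terminal from
    range measurements d1..d3 taken at three known terminal positions
    p1..p3 (x, y in metres). Classic trilateration: subtracting the
    first circle equation from the other two yields a linear system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Coefficients of the linear system A @ [x, y] = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    # Solve the 2x2 system by Cramer's rule.
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

A terminal whose radio also reports angle of arrival can obtain the relative position from a single measurement, in which case no trilateration is needed.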
  • According to an aspect of the present disclosure, a movement route determination device includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site; a landing site determination unit that determines, based on the drone information and the spatial information, a landing site for the drone; and a movement route determination unit that determines a movement route of the drone from the approach opening to the landing site.
  • The drone information includes the size of the drone. Therefore, by acquiring the drone information and the spatial information, the approach opening and the landing site can be determined, and once these are determined, the movement route of the drone from the approach opening to the landing site can be determined. For example, a user can easily determine the movement route or landing site of the drone by acquiring spatial information with a distance sensor built into a smartphone or the like and acquiring the drone information from an external server.
  • According to an aspect of the present disclosure, a drone control device includes: a drone information providing unit that provides drone information including the size of a drone to a terminal device; a position information acquisition unit that acquires, from the terminal device, position information of an approach opening through which the drone enters a building including a landing site; a first movement route determination unit that determines a first movement route of the drone from the base of the drone to the approach opening; a second movement route acquisition unit that acquires, from the terminal device, a second movement route of the drone from the approach opening to the landing site; and a movement instruction transmission unit that transmits a movement instruction to the drone based on the second movement route.
  • According to an aspect of the present disclosure, a computer program causes a computer to function as: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site; a landing site determination unit that determines, based on the drone information and the spatial information, a landing site for the drone; and a movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a movement route of the drone from the approach opening to the landing site. This allows the computer to function as the above-described movement route determination device, achieving the same operations and effects.
  • According to another aspect of the present disclosure, a computer program causes a computer to function as: a position specifying unit that specifies the position of a terminal device; a tag position calculation unit that calculates the relative position of a wireless tag with respect to the terminal device based on communication between a communication unit and the wireless tag; and a landing site determination unit that determines the position of the wireless tag as the landing site of the drone based on the position of the terminal device and the relative position of the wireless tag. This allows the computer to function as the above-described landing site determination system, achieving the same operations and effects.
  • FIG. 1 is a diagram showing the configuration of a delivery system according to Embodiment 1 of the present disclosure.
  • The delivery system 10 is a system for delivering packages from a package delivery base (hereinafter also referred to as a "base"), which is the base of the drone 1, to a delivery destination, and includes the drone 1, the terminal device 2, and a server 5.
  • The drone 1, the terminal device 2, and the server 5 are connected to the network 7 wirelessly or by wire.
  • The network 7 is configured by, for example, a public communication network such as 5G.
  • The drone 1 is an unmanned aircraft that flies from the delivery base to a delivery destination along a movement route determined by the server 5 and delivers a package to the delivery destination.
  • The drone 1 is equipped with a mechanism for grasping a package, a position specifying unit for specifying the position of the drone 1, and a camera.
  • The drone 1 can detect obstacles and the like existing on its route based on the images taken by the camera, and flies to the destination while avoiding them.
  • The terminal device 2 is, for example, a smartphone or tablet owned by the user, and determines the landing site of the drone 1 at the delivery destination, the approach opening provided at the delivery destination through which the drone 1 enters, and the movement route of the drone 1 from the approach opening to the landing site.
  • In Embodiment 1, it is assumed that the drone 1 delivers a package to the balcony of a roofed building such as an apartment or a detached house; that is, the drone 1 passes through the opening of the balcony from outside, enters the balcony, and lands on the balcony.
  • However, the balcony does not have to have a roof.
  • Also, the landing site of the drone 1 is not limited to a balcony; Embodiment 1 is also applicable when the drone 1 lands at any location that has an opening to the outside.
  • The server 5 determines the movement route of the drone 1 from the delivery base to the landing site of the drone 1 at the delivery destination based on the information determined by the terminal device 2, and instructs the drone 1 to fly along the determined movement route.
  • FIG. 2 is a block diagram showing the configuration of the terminal device 2 according to Embodiment 1 of the present disclosure.
  • The terminal device 2 includes a communication unit 21, a camera 22, a sensor 23, a touch panel 24, a speaker 25, a position specifying unit 26, a storage unit 27, a control unit 28, and a bus 29.
  • The communication unit 21 connects the terminal device 2 to the network 7 and performs wireless or wired communication with external devices.
  • The camera 22 photographs the surroundings of the terminal device 2 and outputs an image.
  • FIG. 3 is a diagram showing an example of an image output from the camera 22.
  • FIG. 3 shows an example of an image of an apartment balcony, captured by the user operating the camera 22.
  • The image includes the floor 101, which is a candidate landing site for the drone 1, and balcony openings 102A and 102B through which the drone 1 enters the apartment including the floor 101.
  • The sensor 23 measures the distance to objects around the terminal device 2 and outputs depth information corresponding to the distance.
  • The depth information of each pixel represents the distance to the target.
  • The sensor 23 is configured by, for example, LiDAR (Light Detection and Ranging).
  • Distance measurement by the sensor 23 and photographing by the camera 22 can be performed simultaneously; for example, while the camera 22 captures the balcony image shown in FIG. 3, the sensor 23 measures the distances to the objects on the balcony.
  • The touch panel 24 functions both as a display device that displays images and as an input device that accepts input operations by the user.
  • The speaker 25 outputs sound.
  • The position specifying unit 26 specifies the position of the terminal device 2.
  • For example, the position specifying unit 26 specifies the position of the terminal device 2 using satellite navigation.
  • Specifically, the position specifying unit 26 specifies the position of the terminal device 2 based on radio waves received from a plurality of GPS (Global Positioning System) satellites.
  • The location of the terminal device 2 can be specified by, for example, latitude and longitude.
  • Satellite navigation uses a global navigation satellite system (GNSS) such as GPS, but the satellite positioning system is not limited to GPS.
  • The storage unit 27 includes a volatile memory element such as SRAM (Static RAM) or DRAM (Dynamic RAM), a nonvolatile memory element such as flash memory or EEPROM (Electrically Erasable Programmable Read-Only Memory), or a magnetic storage device such as a hard disk.
  • The storage unit 27 stores a computer program 27P executed by the control unit 28, as well as data obtained when the computer program 27P is executed.
  • The control unit 28 is composed of a processor such as a CPU (Central Processing Unit), and includes functional processing units realized by executing the computer program 27P: a drone information acquisition unit 30, a spatial information acquisition unit 31, an approach opening determination unit 32, a landing site determination unit 33, a movement route determination unit 34, an enterable opening display unit 35, a selection information acquisition unit 36, and a possible landing site display unit 37.
  • The drone information acquisition unit 30 acquires, from the server 5 via the communication unit 21, drone information including the size of the drone 1 that delivers packages to the user's home and the flight accuracy of the drone 1.
  • FIG. 4 is a diagram showing an example of drone information.
  • The drone information 52D includes the drone size (height, width, depth), hovering accuracy (vertical, horizontal), and positional accuracy (vertical, horizontal). Hovering accuracy and positional accuracy are examples of the flight accuracy of the drone 1.
  • The sizes of the drone 1 in the height direction, width direction, and depth direction are 300 mm, 500 mm, and 500 mm, respectively.
  • The hovering accuracy of the drone 1 in both the vertical direction and the horizontal direction is ±0.1 m.
  • The positional accuracy of the drone 1 in both the vertical and horizontal directions is ±50 mm.
  • The positional accuracy of the drone 1 can be improved by a technique called RTK (Real Time Kinematic), which corrects the position information of the drone 1 obtained from a satellite positioning system such as GPS based on correction information from a reference station installed on the ground.
  • The spatial information acquisition unit 31 calculates spatial information including the three-dimensional positions of objects existing around the candidate landing site of the drone 1 based on the depth information output by the sensor 23.
  • The spatial information indicates the three-dimensional position of each object in a coordinate system preset in the terminal device 2 (for example, a right-handed coordinate system with the origin at the center of the lens of the camera 22 and the line-of-sight direction of the camera 22 as the X axis).
  • For example, the spatial information includes the three-dimensional coordinates of the floor 101, openings 102A and 102B, wall surface 103, window 104, pillar 105, and fence 106 in FIG. 3. Note that since the openings 102A and 102B are spaces, their three-dimensional coordinates are the coordinates of points at infinity or indefinite values.
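The calculation performed by the spatial information acquisition unit can be illustrated with a standard pinhole back-projection of a depth pixel. The intrinsic parameters (fx, fy, cx, cy) and the exact orientation of the Y and Z axes are assumptions for illustration; only the X-along-line-of-sight convention comes from the description above:

```python
def depth_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with measured depth into the terminal's
    coordinate system: X along the camera line of sight, with Y taken to
    the left and Z up so the frame is right-handed (the Y/Z orientation
    is an assumption; the patent only fixes the X axis)."""
    # Standard pinhole model: lateral offsets scale with depth.
    y = -(u - cx) * depth_m / fx   # image u grows rightwards -> Y leftwards
    z = -(v - cy) * depth_m / fy   # image v grows downwards  -> Z upwards
    x = depth_m                    # depth measured along the line of sight
    return x, y, z
```

Running this over every pixel of the depth image yields the point cloud from which the sizes of the floor 101, openings, and fence can be measured.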
  • The approach opening determination unit 32 determines, based on the drone information 52D acquired by the drone information acquisition unit 30 and the spatial information acquired by the spatial information acquisition unit 31, enterable openings through which the drone 1 can enter the apartment including the floor 101 and non-enterable openings through which the drone 1 cannot enter it.
  • Specifically, the approach opening determination unit 32 determines the horizontal and vertical sizes of each of the openings 102A and 102B shown in FIG. 3 based on the spatial information.
  • The approach opening determination unit 32 also determines, from the size, hovering accuracy, and positional accuracy of the drone 1 in the drone information 52D, the size of the region that the drone 1 may occupy.
  • For each of the openings 102A and 102B, if the width that the drone 1 may occupy (here, 650 mm) is no greater than the horizontal size of the opening and the height that the drone 1 may occupy (here, 450 mm) is no greater than the vertical size of the opening, the approach opening determination unit 32 determines the opening to be an enterable opening; otherwise, it determines the opening to be a non-enterable opening. For example, the approach opening determination unit 32 determines the opening 102A to be an enterable opening and the opening 102B to be a non-enterable opening.
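The classification above can be sketched as follows. The margin model (drone size plus one hovering-accuracy and one positional-accuracy allowance per direction) is an assumption chosen because it reproduces the 650 mm and 450 mm figures in the example; the class and function names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class DroneInfo:
    # Values from the FIG. 4 example, all in millimetres.
    height: float = 300.0
    width: float = 500.0
    depth: float = 500.0
    hover_acc: float = 100.0   # one-sided hovering accuracy (±0.1 m)
    pos_acc: float = 50.0      # one-sided positional accuracy (±50 mm)

    def occupied_width(self) -> float:
        # Assumed margin model: size + hovering + positional allowance.
        return self.width + self.hover_acc + self.pos_acc    # 650 mm

    def occupied_height(self) -> float:
        return self.height + self.hover_acc + self.pos_acc   # 450 mm

def is_enterable(opening_w_mm: float, opening_h_mm: float,
                 drone: DroneInfo) -> bool:
    """An opening is enterable when it is at least as large, in both
    directions, as the region the drone may occupy."""
    return (drone.occupied_width() <= opening_w_mm
            and drone.occupied_height() <= opening_h_mm)

drone = DroneInfo()
wide_ok = is_enterable(800.0, 2000.0, drone)    # an opening 102A-like case
narrow_ok = is_enterable(600.0, 2000.0, drone)  # too narrow, 102B-like case
```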
  • The approach opening determination unit 32 then determines the enterable opening selected by the user from among the determined enterable openings as the approach opening into which the drone 1 will enter.
  • The approach opening determination unit 32 transmits opening information regarding the determined approach opening to the server 5 via the communication unit 21.
  • FIG. 5 is a diagram showing an example of opening information.
  • The opening information includes opening coordinates (latitude), opening coordinates (longitude), opening height, and opening approach direction.
  • The opening coordinates (latitude) indicate the latitude of the approach opening and are, for example, "YY.YYYYYY" (degrees).
  • The opening coordinates (longitude) indicate the longitude of the approach opening and are, for example, "XXX.XXXXXX" (degrees).
  • The opening height indicates the height of the approach opening and is, for example, "15 m".
  • The opening approach direction indicates the direction of the approach opening with respect to the terminal device 2 and is, for example, "275 degrees", where true north is 0 degrees.
  • Specifically, the approach opening determination unit 32 calculates the opening coordinates (latitude) and opening coordinates (longitude) from the position of the terminal device 2 specified by the position specifying unit 26 and the three-dimensional position of the approach opening included in the spatial information. However, if the three-dimensional position of the approach opening indicates the coordinates of a point at infinity or is indefinite, the approach opening determination unit 32 may calculate the three-dimensional position of the approach opening from the three-dimensional coordinates of objects around it (in the example shown in FIG. 3, the pillar 105 and the fence 106). For example, when the approach opening is the opening 102A, the approach opening determination unit 32 may take a three-dimensional position obtained by extending the surface of the fence 106 upward as the three-dimensional position of the opening 102A, and then calculate the opening coordinates (latitude) and opening coordinates (longitude).
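As a sketch of how the opening coordinates might be derived from the terminal position, the distance to the opening, and the opening approach direction of FIG. 5 (the flat-earth conversion and all function names are illustrative assumptions; the patent does not prescribe a formula):

```python
import math

EARTH_R = 6_378_137.0  # WGS-84 equatorial radius in metres

def local_offset_to_latlon(term_lat, term_lon, east_m, north_m):
    """Convert a small east/north offset (metres) from the terminal
    position into latitude/longitude, using a flat-earth approximation
    that is adequate over balcony-scale distances."""
    dlat = math.degrees(north_m / EARTH_R)
    dlon = math.degrees(east_m / (EARTH_R * math.cos(math.radians(term_lat))))
    return term_lat + dlat, term_lon + dlon

def opening_coordinates(term_lat, term_lon, distance_m, azimuth_deg):
    """Opening position given its distance from the terminal and its
    direction in degrees clockwise from true north (0 degrees = north,
    matching the FIG. 5 convention)."""
    az = math.radians(azimuth_deg)
    east = distance_m * math.sin(az)
    north = distance_m * math.cos(az)
    return local_offset_to_latlon(term_lat, term_lon, east, north)

lat, lon = opening_coordinates(35.0, 139.0, 3.0, 275.0)
```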
  • The landing site determination unit 33 determines possible landing sites where the drone 1 can land based on the drone information 52D acquired by the drone information acquisition unit 30 and the spatial information acquired by the spatial information acquisition unit 31.
  • The landing site determination unit 33 determines the possible landing site selected by the user from among the determined possible landing sites as the landing site of the drone 1.
  • The movement route determination unit 34 determines the movement route of the drone 1 from the approach opening to the landing site based on the spatial information calculated by the spatial information acquisition unit 31, the information on the approach opening determined by the approach opening determination unit 32, and the information on the landing site determined by the landing site determination unit 33. For example, the movement route determination unit 34 determines a movement route in which the drone 1 enters the inside of the balcony from the approach opening at a predetermined elevation angle and, upon reaching a point directly above the landing site, changes course vertically and lands at the landing site. The movement route determination unit 34 transmits the determined movement route information to the server 5 via the communication unit 21. Note that if an obstacle or the like between the approach opening and the landing site prevents the drone 1 from moving, the movement route determination unit 34 does not determine a movement route and transmits information indicating that the movement route cannot be determined to the server 5 via the communication unit 21.
  • FIG. 6 is a diagram showing an example of movement route information.
  • The movement route information includes movement route coordinates (1) to (n), landing site coordinates, and a reference coordinate orientation.
  • Movement route coordinates (1) to (n) are the coordinates of n points that the drone 1 passes between the approach opening and the landing site.
  • The landing site coordinates are the coordinates of the point where the drone 1 lands.
  • That is, in a right-handed coordinate system with the origin at the lens center of the camera 22 and the X axis along the line-of-sight direction of the camera 22, the drone 1 passes through the coordinates (X1, Y1, Z1), ..., (Xn, Yn, Zn) and lands at the coordinates (XL, YL, ZL).
  • The reference coordinate orientation indicates the orientation of the X-axis direction of this coordinate system and is, for example, "0 degrees", where true north is 0 degrees.
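The route shape described above (enter through the opening at a fixed elevation angle, then descend vertically once directly above the landing site) can be sketched as follows; the axis convention, elevation angle, and waypoint step size are illustrative assumptions:

```python
from dataclasses import dataclass
import math

@dataclass
class Point3D:
    x: float  # along the camera line of sight (m), as in the patent's frame
    y: float  # lateral axis (m)
    z: float  # vertical axis (m); the assignment of z to "up" is assumed

def plan_second_route(opening: Point3D, landing: Point3D,
                      elevation_deg: float = 10.0, step: float = 0.5):
    """Build waypoints: descend from the opening at a fixed elevation
    angle until directly above the landing site, then drop vertically."""
    route = [opening]
    # Horizontal distance from the opening to a point above the landing site.
    horiz = math.hypot(landing.x - opening.x, landing.y - opening.y)
    drop = horiz * math.tan(math.radians(elevation_deg))
    above = Point3D(landing.x, landing.y, opening.z - drop)
    route.append(above)
    # Vertical descent in fixed steps down to the landing point.
    z = above.z
    while z - step > landing.z:
        z -= step
        route.append(Point3D(landing.x, landing.y, z))
    route.append(landing)
    return route

route = plan_second_route(Point3D(0.0, 0.0, 1.5), Point3D(2.0, 1.0, 0.0))
```

The returned list corresponds to movement route coordinates (1) to (n) followed by the landing site coordinates in FIG. 6.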
  • The enterable opening display unit 35 displays on the touch panel 24 the enterable openings and non-enterable openings determined by the approach opening determination unit 32 in a distinguishable manner.
  • For example, the enterable opening display unit 35 may display enterable openings and non-enterable openings with frames of different colors superimposed on the image taken by the camera 22, or may display non-enterable openings with a differently marked frame.
  • FIG. 7 is a diagram showing an example of a display on the touch panel 24 by the enterable opening display unit 35.
  • Frames 111 and 112 are displayed on the touch panel 24, superimposed on the balcony image taken by the camera 22 shown in FIG. 3.
  • A frame 111 highlights the opening 102A, and a frame 112 highlights the opening 102B.
  • The frame 111 is not marked with a cross, indicating that the opening 102A is an enterable opening; the frame 112 is marked with a cross, indicating that the opening 102B is a non-enterable opening.
  • The selection information acquisition unit 36 acquires information on the enterable opening selected by the user from among the enterable openings displayed on the touch panel 24 by the enterable opening display unit 35. Specifically, the selection information acquisition unit 36 acquires from the touch panel 24 the position touched by the user on the screen, and identifies the user-selected enterable opening based on the acquired position information and the display positions of the enterable openings on the screen. In the example shown in FIG. 7, the only enterable opening is the opening 102A, so the user selects the opening 102A.
  • the possible landing site display section 37 displays the possible landing site determined by the landing site determining section 33 on the touch panel 24.
  • the possible landing spot display section 37 may display an area including a possible landing spot superimposed on the image taken by the camera 22 in a different color from other areas.
  • the possible landing spot display section 37 displays the outline of the drone 1 on the touch panel 24 based on the drone information 52D.
  • the possible landing spot display section 37 displays a rectangular area that the drone 1 may occupy as the outline of the drone 1 based on the size, hovering accuracy, and positional accuracy of the drone 1.
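The rectangular area can be derived by padding the drone's body on every side with its hovering drift and positioning error; the exact padding rule below is an assumption for illustration, not stated verbatim in the disclosure.

```python
def occupancy_side(drone_size_m: float, hover_accuracy_m: float,
                   position_accuracy_m: float) -> float:
    """Side length of the square area the drone may occupy.

    Assumption: the body is padded on each side by the hovering drift
    plus the positioning error, so the padding is applied twice
    (once per side).
    """
    return drone_size_m + 2.0 * (hover_accuracy_m + position_accuracy_m)
```

For example, a 0.8 m drone with 0.1 m hovering accuracy and 0.05 m positional accuracy would be drawn as a square about 1.1 m on a side.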
  • FIG. 8 is a diagram showing an example of a display on the touch panel 24 by the possible landing point display section 37.
  • a surface 121 and a rectangular frame 131 are displayed on the touch panel 24, superimposed on the balcony image taken by the camera 22 shown in FIG.
  • the surface 121 is a surface that includes a possible landing site for the drone 1, and is, for example, a blue surface.
  • the rectangular frame 131 three-dimensionally indicates an area that the drone 1 may occupy.
  • the selection information acquisition unit 36 further acquires information on the possible landing site selected by the user from among the possible landing sites displayed on the touch panel 24 by the possible landing site display unit 37. Specifically, in the example of FIG. 8, the user drags the rectangular frame 131 on the touch panel 24 and releases (drops) his or her finger at the position where the user wants the drone 1 to land. Note that movement of the rectangular frame 131 by dragging is restricted so that the rectangular frame 131 can be moved only within the plane 121 that includes the possible landing site.
  • the selection information acquisition unit 36 acquires information on the position where the user dropped the frame from the touch panel 24, and identifies the possible landing site selected by the user based on the acquired position information and the display position of the possible landing site on the screen.
  • the movement route display unit 38 causes the touch panel 24 to display the movement route of the drone 1 from the approach opening to the landing site determined by the movement route determination unit 34, superimposed on the image taken by the camera 22.
  • FIGS. 9 and 10 are diagrams showing examples of display on the touch panel 24 by the movement route display section 38.
  • as shown in FIG. 9, the movement route display unit 38 displays a "route calculation in progress" message superimposed on the balcony image taken by the camera 22.
  • as shown in FIG. 10, the movement route display unit 38 displays the movement route 141 superimposed on the balcony image taken by the camera 22.
  • the movement route 141 is a dashed line indicating an area that the drone 1 may occupy when moving from the opening 102A to the landing site.
  • when the movement route determination unit 34 cannot determine the movement route of the drone 1 from the approach opening to the landing site, the error notification unit 39 notifies the user of this by displaying a message on the touch panel 24 or outputting sound from the speaker 25.
  • when the error notification unit 39 receives a notification from the server 5 or the drone 1 that the drone 1 could not land at the landing site due to the presence of an obstacle on the movement route of the drone 1, it notifies the user by displaying the notification content on the touch panel 24 or outputting the notification content audibly from the speaker 25.
  • FIG. 11 is a block diagram showing the configuration of the server 5 according to Embodiment 1 of the present disclosure.
  • the server 5 includes a communication section 51, a storage section 52, a control section 53, and a bus 54.
  • the communication unit 51 connects the server 5 to the network 7 and performs wireless or wired communication with an external device.
  • the storage unit 52 is composed of a volatile memory element such as SRAM or DRAM, a non-volatile memory element such as flash memory or EEPROM, or a magnetic storage device such as a hard disk.
  • the storage unit 52 stores a computer program 52P executed by the control unit 53 and the above-mentioned drone information 52D. Furthermore, the storage unit 52 stores data obtained when the computer program 52P is executed.
  • the control unit 53 is composed of a processor such as a CPU, and includes, as functional processing units realized by executing the computer program 52P, a drone information providing unit 55, a landing point information acquisition unit 56, a movement route determination unit 57, and an error notification unit 58.
  • the drone information providing unit 55 reads, from the storage unit 52, the drone information 52D of the drone that delivers packages to the home of the user of the terminal device 2, and transmits the read drone information 52D to the terminal device 2 via the communication unit 51.
  • the landing point information acquisition unit 56 receives opening information and travel route information from the terminal device 2 via the communication unit 51.
  • the moving route determination unit 57 determines the moving route of the drone 1 from a predetermined base of the drone 1 to the opening based on the opening information.
  • the movement route can be determined using a known technique. For example, a route that minimizes the travel distance of the drone 1 or a route that minimizes its travel time may be determined.
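As one example of such a known technique, a minimum-distance route can be found by breadth-first search over a coarse occupancy grid. The grid representation and cell granularity are illustrative assumptions; the disclosure does not fix a particular algorithm.

```python
from collections import deque
from typing import List, Optional, Tuple

Cell = Tuple[int, int]

def shortest_route(grid: List[List[int]], start: Cell, goal: Cell) -> Optional[List[Cell]]:
    """Breadth-first search over a 2-D occupancy grid (0 = free, 1 = blocked).

    Returns the minimum-length list of cells from start to goal, or None
    when the goal cannot be reached (cf. the 'route not found' case).
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the predecessor chain back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

A time-minimizing variant would instead weight edges by estimated traversal time and use Dijkstra's algorithm.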
  • the movement route determination unit 57 determines the movement route of the drone 1 from the base through the opening to the landing point, based on the determined movement route of the drone 1 from the base to the opening and the travel route information received from the terminal device 2.
  • the moving route determining unit 57 converts the coordinate system of the moving route information to match the coordinate system of the drone 1 according to the reference coordinate direction indicated in the moving route information received from the terminal device 2.
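Such a conversion according to the reference coordinate orientation might look like the following sketch. It assumes azimuth is measured clockwise from true north, the Z axis is vertical and unchanged, and the Y axis points 90 degrees to the left of the X axis; these conventions are assumptions for illustration.

```python
import math
from typing import Tuple

def camera_to_north_frame(x: float, y: float, z: float,
                          reference_orientation_deg: float) -> Tuple[float, float, float]:
    """Convert a point from the camera-referenced frame to a north/east frame.

    Assumptions: reference_orientation_deg is the azimuth of the camera
    X axis, measured clockwise from true north (0 degrees = north);
    Z is vertical; Y points 90 degrees to the left of X.
    """
    a = math.radians(reference_orientation_deg)
    north = x * math.cos(a) + y * math.sin(a)
    east = x * math.sin(a) - y * math.cos(a)
    return (north, east, z)
```

For instance, with a reference orientation of 90 degrees (camera facing east), one meter along the camera's line of sight maps to one meter east.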
  • the movement route determination unit 57 transmits a movement instruction to the drone 1 by transmitting, via the communication unit 51, information on the determined movement route from the base to the landing point.
  • the drone 1 flies from the base to the landing point according to the moving route determined by the moving route determining unit 57, and delivers the luggage to the landing point.
  • the error notification unit 58 receives, via the communication unit 51, error information transmitted from the drone 1 when the drone 1 cannot fly along the movement route. If the error information indicates that the drone 1 could not land at the landing site because of an obstacle between the opening and the landing site, the error notification unit 58 transmits a notification to that effect to the terminal device 2 via the communication unit 51.
  • FIG. 12 is a sequence diagram showing the processing procedure of the delivery system 10 according to the first embodiment of the present disclosure.
  • the server 5 transmits drone information 52D as shown in FIG. 4 to the terminal device 2, and the terminal device 2 receives the drone information 52D (step S1).
  • the camera 22 of the terminal device 2 captures an image of the balcony including the candidate landing site for the drone 1 as shown in FIG. 3 (step S2). Note that if the area including the landing candidate site does not fit within the photographing area of the camera 22, the user may pan the camera 22 to photograph a panoramic image wider than the photographing area.
  • the terminal device 2 also measures, with the sensor 23, the distances to the objects on the balcony shown in FIG. 3 and outputs depth information (step S3).
  • the terminal device 2 calculates spatial information around the candidate landing site, including the three-dimensional positions of objects existing around the candidate landing site (step S4).
  • based on the drone information 52D and the spatial information, the terminal device 2 determines accessible openings that the drone 1 can enter and inaccessible openings that the drone 1 cannot enter (step S5).
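The determination in step S5 can be sketched as a clearance check per opening. The rule below — an opening is accessible only when both its width and height clear the drone's size padded by its hovering and positioning errors — is an assumption for illustration, not stated verbatim in the disclosure.

```python
def is_accessible(opening_width_m: float, opening_height_m: float,
                  drone_size_m: float, hover_accuracy_m: float,
                  position_accuracy_m: float) -> bool:
    """Classify an opening as accessible (True) or inaccessible (False).

    Assumption: the required clearance is the drone size plus padding
    on each side for hovering drift and positioning error.
    """
    required = drone_size_m + 2.0 * (hover_accuracy_m + position_accuracy_m)
    return opening_width_m >= required and opening_height_m >= required
```

In the FIG. 7 example, opening 102A would pass this check while opening 102B (too small for the padded drone) would fail it.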
  • the terminal device 2 causes the touch panel 24 to display an image as shown in FIG. 7 (step S6). Thereby, the opening 102A, which is an accessible opening, and the opening 102B, which is an inaccessible opening, can be distinguished and displayed. Note that the terminal device 2 displays a message "Please specify the drone entrance" superimposed on the image.
  • the user selects, on the touch panel 24, the opening into which the drone 1 is to enter, and the terminal device 2 acquires information on the selected opening (for example, position information of the opening 102A on the image) (step S7).
  • the terminal device 2 determines the entry possible opening selected by the user as the entry opening into which the drone 1 will enter (step S8). For example, the terminal device 2 determines the selected opening 102A as the entry opening.
  • the terminal device 2 determines a landing spot where the drone 1 can land based on the drone information 52D and the spatial information (step S9).
  • the terminal device 2 causes the touch panel 24 to display an image including a surface 121 including a possible landing site for the drone 1 and a rectangular frame 131 indicating an area that the drone 1 may occupy ( Step S10). Note that the terminal device 2 displays a message "Please specify the landing site" superimposed on the image.
  • the user selects a landing spot for the drone 1 by dragging and dropping the rectangular frame 131 on the touch panel 24, and the terminal device 2 specifies the selected landing spot for the drone 1 (step S11).
  • when the user drops the rectangular frame 131 shown in FIG. 8 at a desired position, the position in real space corresponding to the position of the rectangular frame 131 on the image is specified as the possible landing site.
  • the terminal device 2 determines the possible landing site selected by the user and identified by the terminal device 2 as the landing site for the drone 1 (step S12).
  • the terminal device 2 displays an image on the touch panel 24 that shows that the drone 1 is calculating the movement route as shown in FIG. 9 (step S13).
  • the terminal device 2 determines the movement route of the drone 1 from the approach opening to the landing site based on the spatial information, the approach opening information, and the landing site information.
  • the terminal device 2 displays an image showing the movement route 141 of the drone 1 as shown in FIG. 10 on the touch panel 24 (step S14). Note that if the moving route is not found, the terminal device 2 causes the touch panel 24 to display an image indicating that the moving route is not found, and cancels the subsequent processing.
  • the terminal device 2 transmits opening information (FIG. 5) indicating the position of the determined entrance opening and moving route information (FIG. 6) indicating the determined moving route to the server 5, and the server 5 receives the opening information and the travel route information (step S15).
  • the server 5 determines the travel route of the drone 1 from the package delivery base to the opening based on the opening information. Furthermore, the server 5 determines a travel route from the delivery base to the landing point through the opening, based on the determined travel route and the travel route information received from the terminal device 2 (step S16).
  • the server 5 transmits travel route information indicating the travel route from the delivery base to the landing point to the drone 1, and the drone 1 receives the travel route information (step S17).
  • the drone 1 delivers the package to the landing site by flying from the package delivery base through the opening to the landing site according to the received movement route information (step S18).
  • if the drone 1 cannot deliver the package due to an obstacle or the like during flight, it returns to the base and transmits error information, including information on the point at which it turned back, to the server 5.
  • the server 5 receives the error information transmitted from the drone 1 (step S19).
  • if the drone 1 turned back at a point between the opening and the landing site, the server 5 transmits to the terminal device 2 error information indicating that the drone 1 could not land at the landing site because of an obstacle on its movement route, and the terminal device 2 receives the error information (step S20).
  • based on the error information, the terminal device 2 notifies the user that the drone 1 was unable to land at the landing site due to the presence of an obstacle on the movement route of the drone 1 (step S21).
  • the drone information 52D includes the size of the drone 1. Therefore, by acquiring the drone information 52D and the spatial information, the terminal device 2 can determine the entry opening through which the drone 1 enters the space around the candidate landing site, and can also determine the landing site of the drone 1. Once the approach opening and the landing site are determined, the terminal device 2 can determine the movement route of the drone 1 from the base to the approach opening and from the approach opening to the landing site. For example, the user acquires spatial information using a distance sensor included in a terminal device 2 such as a smartphone, and the terminal device 2 acquires the drone information 52D from the external server 5; the travel route or landing site of the drone 1 can thus be determined in a simple manner.
  • the approach opening and the landing site can be determined in consideration of the hovering accuracy and positional accuracy of the drone 1. Therefore, it is possible to determine an entry opening into which the drone 1 can reliably enter and a landing site into which the drone 1 can reliably land.
  • the user can select any one of the plurality of accessible openings.
  • the user can select an accessible opening by excluding an accessible opening that the user does not want the drone 1 to enter, or select an accessible opening that the drone 1 can easily enter.
  • the entrance opening can be determined according to the wishes of the user.
  • inaccessible openings into which the drone 1 cannot enter such as when the size of the opening is smaller than the size of the drone 1, can be displayed separately from accessible openings. This allows the user to efficiently select an accessible opening.
  • the user can select one of the possible landing sites, so the landing site can be determined according to the wishes of the user.
  • a rectangular frame 131 indicating an area that the drone 1 may occupy is displayed on the touch panel 24.
  • the user can select a possible landing site in consideration of the external shape of the drone 1. Therefore, it is possible to reliably determine a landing site where the drone 1 can land.
  • the movement route 141 of the drone 1 from the approach opening to the landing site can be displayed. Therefore, the user can be made aware of the moving route of the drone 1, and the user can be warned not to place anything on the moving route.
  • if the terminal device 2 cannot determine the movement route of the drone 1 from the approach opening to the landing site, it notifies the user to that effect. This allows the user to take measures such as changing the landing site.
  • the terminal device 2 notifies the user if the drone 1 cannot fly the travel route from the approach opening to the landing site. This allows the user to take measures such as removing obstacles on the travel route or changing the landing site.
  • the server 5 also acquires, from the terminal device 2, the position information of the approach opening determined based on the drone information 52D and the movement route of the drone 1 from the opening to the landing site, and determines the travel route of the drone 1.
  • the server 5 can transmit a movement instruction to the drone 1 based on the determined movement route. Thereby, the drone 1 can be controlled based on the movement route or landing point of the drone 1 determined by a simple method.
  • FIG. 13 is a diagram showing the configuration of a delivery system according to Embodiment 2 of the present disclosure.
  • the delivery system 10 is a system for delivering packages from a delivery base to a delivery destination, and includes a drone 1, a terminal device 2, a smart tag 8, and a server 5. As in Embodiment 1, the drone 1, the terminal device 2, and the server 5 are connected to the network 7 wirelessly or by wire. The drone 1 and the server 5 are the same as those described in the first embodiment.
  • the smart tag 8 is a wireless tag that is wirelessly connected to the terminal device 2.
  • the smart tag 8 and the terminal device 2 are connected according to Bluetooth (registered trademark), which is a wireless communication standard. Note that it is assumed that the version of Bluetooth (registered trademark) in the second embodiment is 5.1 or later. Bluetooth® from version 5.1 onwards includes direction finding functionality. The user places the smart tag 8 at the landing site of the drone 1.
  • the terminal device 2 is, for example, a smartphone or a tablet terminal device, and is paired with the smart tag 8 and calculates the direction of the smart tag 8 and the distance to the smart tag 8 by communicating wirelessly.
  • the terminal device 2 determines the position of the smart tag 8 as the landing point of the drone 1 based on the position of the own device and the calculated direction and distance of the smart tag 8.
  • FIG. 14 is a block diagram showing the configuration of the terminal device 2 according to Embodiment 2 of the present disclosure.
  • the terminal device 2 includes a communication section 21 , a camera 22 , a touch panel 24 , a speaker 25 , a position specifying section 26 , a storage section 27 , a control section 28 , and a bus 29 .
  • the communication unit 21 connects the terminal device 2 to the network 7 and performs wireless or wired communication with an external device.
  • the communication unit 21 is further connected to the smart tag 8 according to Bluetooth (registered trademark).
  • the control unit 28 includes, as functional processing units realized by executing the computer program 27P stored in the storage unit 27, a drone information acquisition unit 30, a drone area display unit 41, a tag position calculation unit 42, a landing site determination unit 43, and an error notification unit 39.
  • the drone area display unit 41 superimposes an area that may be occupied by the drone 1 on the image taken by the camera 22 and displays it on the touch panel 24.
  • the tag position calculation unit 42 calculates the relative position of the smart tag 8 with respect to the terminal device 2 by calculating, based on the communication between the communication unit 21 and the smart tag 8, the direction of the smart tag 8 and the distance to the smart tag 8. For example, the communication unit 21 may receive a signal transmitted from the smart tag 8 while switching among a plurality of antennas provided in the terminal device 2, and the tag position calculation unit 42 may calculate a phase difference from IQ samples of the received signal and calculate the relative position of the smart tag 8 from the phase difference. Alternatively, the communication unit 21 may receive, with one antenna, a signal transmitted while switching among a plurality of antennas provided in the smart tag 8, and the tag position calculation unit 42 may likewise calculate the relative position of the smart tag 8 from the phase difference of the IQ samples. Further, when the communication unit 21 performs ultra-wideband (UWB) wireless communication with the smart tag 8, the tag position calculation unit 42 may calculate the relative position of the smart tag 8 by short-range search using UWB.
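The phase-difference computation described above can be illustrated with a plane-wave model for a two-antenna array; the antenna spacing, wavelength, and function name below are illustrative assumptions, and the sketch omits the phase-ambiguity handling a real direction-finding stack would need.

```python
import math

def angle_of_arrival_deg(phase_diff_rad: float, antenna_spacing_m: float,
                         wavelength_m: float) -> float:
    """Angle of arrival (degrees from broadside) for a two-antenna array.

    Plane-wave model: phase_diff = 2*pi*d*sin(theta)/lambda, so
    theta = asin(phase_diff * lambda / (2*pi*d)).
    Valid while |phase_diff| <= 2*pi*d/lambda (no ambiguity resolution).
    """
    s = phase_diff_rad * wavelength_m / (2.0 * math.pi * antenna_spacing_m)
    return math.degrees(math.asin(s))
```

For example, at a 2.4 GHz carrier (wavelength about 0.125 m) with 0.04 m antenna spacing, the phase difference produced by a 30-degree arrival angle inverts back to 30 degrees.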
  • the landing point determination unit 43 determines the position of the smart tag 8 as the landing point of the drone 1, based on the position of the terminal device 2 specified by the position specifying unit 26 and the relative position of the smart tag 8 calculated by the tag position calculation unit 42. In other words, the landing point determination unit 43 calculates, from the relative position of the smart tag 8 with respect to the terminal device 2, the direction of the smart tag 8 with respect to the terminal device 2 and the distance from the terminal device 2 to the smart tag 8. The landing point determination unit 43 then determines the position of the smart tag 8 by moving the position of the terminal device 2 by the calculated distance in the calculated direction, and determines the resulting position as the landing point of the drone 1. The landing point determination unit 43 transmits information indicating the determined landing point (for example, latitude and longitude information) to the server 5 via the communication unit 21.
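Moving the terminal position by the calculated distance in the calculated direction can be sketched with an equirectangular approximation, which is adequate over the tens of meters separating a terminal device from a smart tag. The spherical-Earth radius and the convention of a bearing measured clockwise from true north are assumptions for illustration.

```python
import math
from typing import Tuple

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius (spherical model)

def offset_position(lat_deg: float, lon_deg: float,
                    bearing_deg: float, distance_m: float) -> Tuple[float, float]:
    """Latitude/longitude reached by moving distance_m along bearing_deg
    (clockwise from true north) from (lat_deg, lon_deg).

    Equirectangular approximation, suitable for short ranges only.
    """
    b = math.radians(bearing_deg)
    dlat = distance_m * math.cos(b) / EARTH_RADIUS_M
    dlon = distance_m * math.sin(b) / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg)))
    return (lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon))
```

Here, moving about 111.2 m due north shifts the latitude by roughly 0.001 degrees and leaves the longitude unchanged.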
  • when the error notification unit 39 receives a notification from the server 5 that the drone 1 was unable to land at the landing site, it notifies the user by displaying the notification content on the touch panel 24 or by outputting the notification content audibly from the speaker 25.
  • FIG. 15 is a sequence diagram showing the processing procedure of the delivery system 10 according to the second embodiment of the present disclosure. Pairing is performed between the smart tag 8 and the terminal device 2, and the two are connected (step S31).
  • the server 5 transmits drone information 52D as shown in FIG. 4 to the terminal device 2, and the terminal device 2 receives the drone information 52D (step S32).
  • the user installs the smart tag 8 at the landing site of the drone 1, operates the camera 22 of the terminal device 2, and photographs the area including the smart tag 8 (step S33).
  • the terminal device 2 superimposes the area that may be occupied by the drone 1 on the image taken by the camera 22 and displays it on the touch panel 24 (step S34).
  • FIG. 16 is a diagram showing an example of an image displayed on the touch panel 24.
  • the touch panel 24 displays an image in which a rectangular frame 131 indicating an area that may be occupied by the drone 1 is superimposed on the image of the area including the smart tag 8 captured by the camera 22.
  • the terminal device 2 specifies the position of the terminal device 2 using satellite navigation (step S35).
  • the terminal device 2 calculates the relative position of the smart tag 8 with respect to the terminal device 2 based on the communication with the smart tag 8 by the communication unit 21 (step S36).
  • the terminal device 2 determines the absolute position of the smart tag 8 as the landing point of the drone 1 (step S37).
  • the landing site of the drone 1 is indicated by latitude and longitude, for example.
  • the terminal device 2 transmits landing site information indicating the landing site of the drone 1 to the server 5, and the server 5 receives the landing site information (step S38).
  • the server 5 determines the travel route of the drone 1 from the cargo delivery base to the landing point indicated by the landing point information (step S39).
  • the movement route can be determined using a known technique.
  • the server 5 transmits travel route information indicating the travel route from the delivery base to the landing point to the drone 1, and the drone 1 receives the travel route information (step S40).
  • the drone 1 delivers the package to the landing site by flying from the package delivery base to the landing site according to the received movement route information (step S41).
  • if the drone 1 is unable to deliver the package due to an obstacle or the like during flight, it returns to the base and transmits error information, including information on the point at which it turned back, to the server 5.
  • the server 5 receives the error information transmitted from the drone 1 (step S42).
  • based on the received error information, if the drone 1 turned back near the landing site (for example, at a point within a predetermined distance from the landing site), the server 5 transmits error information indicating this to the terminal device 2, and the terminal device 2 receives the error information (step S43).
  • based on the error information, the terminal device 2 notifies the user that the drone 1 turned back near the landing site and was unable to land there (step S44).
  • the landing site of the drone 1 can be determined by placing the smart tag 8 at the landing site and acquiring the position of the terminal device 2. Thereby, the landing site of the drone 1 can be determined using a simple method.
  • a part or all of the components constituting each of the above devices may be composed of one or more semiconductor devices such as a system LSI.
  • the computer programs 27P and 52P described above may be recorded on a computer-readable non-transitory recording medium, such as an HDD, CD-ROM, or semiconductor memory, and distributed. Further, the computer programs 27P and 52P may be transmitted and distributed via telecommunication lines, wireless or wired communication lines, networks typified by the Internet, data broadcasting, and the like. Furthermore, each of the above devices may be realized by multiple computers or multiple processors.
  • each of the above devices may be provided by cloud computing.
  • some or all of the functions of each device may be realized by a cloud server.
  • at least some of the above embodiments may be combined arbitrarily.
  • Drone 2 Terminal device (landing point determination system, movement route determination device) 5 Server (drone control device) 7 Network 8 Smart tag 10 Delivery system (travel route determination system) 21 Communication unit 22 Camera 23 Sensor 24 Touch panel 25 Speaker 26 Position specifying unit 27 Storage unit 27P Computer program 28 Control unit 29 Bus 30 Drone information acquisition unit 31 Spatial information acquisition unit 32 Approach opening determination unit 33 Landing point determination unit 34 Travel route Determination unit (second movement route determination unit) 35 Accessible opening display section 36 Selection information acquisition section (first selection information acquisition section, second selection information acquisition section) 37 Possible landing point display section 38 Travel route display section 39 Error notification section (first notification section, second notification section) 41 Drone area display section 42 Tag position calculation section 43 Landing point determination section 51 Communication section 52 Storage section 52D Drone information 52P Computer program 53 Control section 54 Bus 55 Drone information provision section 56 Landing point information acquisition section (location information acquisition section, 2 movement route acquisition part) 57 Movement route determination unit (first movement route determination unit, movement instruction transmission unit) 58 Error notification section 101 Floor section 102A Opening section 102B

Abstract

This movement path determination system comprises: a drone information acquisition unit that acquires drone information which includes the size of a drone; a space information acquisition unit that acquires space information which includes the three-dimensional position of an object existing around a landing candidate site for the drone; an entry opening determination unit that determines an entry opening for entry of the drone into a building which includes the landing candidate site, on the basis of the drone information and the space information; a landing site determination unit that determines a landing site for the drone on the basis of the drone information and the space information; a first movement path determination unit that determines a first movement path for the drone from a drone base to the entry opening; and a second movement path determination unit that determines a second movement path for the drone from the entry opening to the landing site on the basis of the space information and the determined entry opening and landing site.

Description

Travel route determination system, landing point determination system, travel route determination device, drone control device, and computer program
The present disclosure relates to a travel route determination system, a landing site determination system, a travel route determination device, a drone control device, and a computer program. This application claims priority based on Japanese Application No. 2022-94867 filed on June 13, 2022, and incorporates all the contents described in the said Japanese application.
In recent years, demonstration experiments have been conducted on parcel delivery services using drones, a type of unmanned aerial vehicle. Patent Document 1 discloses a method for determining a delivery location of a package using a drone.
International Publication No. 2020/136711
A movement route determination system according to one aspect of the present disclosure includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters; a landing site determination unit that determines a landing site for the drone based on the drone information and the spatial information; a first movement route determination unit that determines a first movement route of the drone from a base to the approach opening; and a second movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a second movement route of the drone from the approach opening to the landing site.
FIG. 1 is a diagram showing the configuration of a delivery system according to Embodiment 1 of the present disclosure.
FIG. 2 is a block diagram showing the configuration of a terminal device according to Embodiment 1 of the present disclosure.
FIG. 3 is a diagram showing an example of an image output from the camera.
FIG. 4 is a diagram showing an example of drone information.
FIG. 5 is a diagram showing an example of opening information.
FIG. 6 is a diagram showing an example of travel route information.
FIG. 7 is a diagram showing an example of display on the touch panel by the accessible opening display section.
FIG. 8 is a diagram showing an example of display on the touch panel by the possible landing point display section.
FIG. 9 is a diagram showing an example of display on the touch panel by the movement route display section.
FIG. 10 is a diagram showing an example of display on the touch panel by the movement route display section.
FIG. 11 is a block diagram showing the configuration of a server according to Embodiment 1 of the present disclosure.
FIG. 12 is a sequence diagram showing the processing procedure of the delivery system according to Embodiment 1 of the present disclosure.
FIG. 13 is a diagram showing the configuration of a delivery system according to Embodiment 2 of the present disclosure.
FIG. 14 is a block diagram showing the configuration of a terminal device according to Embodiment 2 of the present disclosure.
FIG. 15 is a sequence diagram showing the processing procedure of the delivery system according to Embodiment 2 of the present disclosure.
FIG. 16 is a diagram showing an example of an image displayed on the touch panel.
[Problems to Be Solved by the Present Disclosure]

Generally, a heliport on a building rooftop or a marker area of roughly six tatami mats (about 10 m²) laid out on the ground is used as a drone landing site, and the drone lands by aiming at the mark of the heliport or marker area. It is not practical to provide a heliport or a marker area at a private home, and no method has been established for designating a drone landing site for each private home.
The present disclosure has been made in view of such circumstances, and an object thereof is to provide a movement route determination system, a landing site determination system, a movement route determination device, a drone control device, and a computer program that can determine a movement route or a landing site of a drone by a simple method.
[Effects of the Present Disclosure]

According to the present disclosure, a movement route or a landing site of a drone can be determined by a simple method.
[Overview of Embodiments of the Present Disclosure]

First, an overview of embodiments of the present disclosure is listed and described.

(1) A movement route determination system according to an embodiment of the present disclosure includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site of the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site; a landing site determination unit that determines, based on the drone information and the spatial information, a landing site of the drone; a first movement route determination unit that determines a first movement route of the drone from a base of the drone to the approach opening; and a second movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a second movement route of the drone from the approach opening to the landing site.
The drone information includes the size of the drone. Therefore, by acquiring the drone information and the spatial information, it is possible to determine the approach opening through which the drone enters the space around the candidate landing site of the drone, and also to determine the landing site of the drone. Once the approach opening and the landing site are determined, the first movement route and the second movement route can be determined. For example, a user can determine the movement route or landing site of a drone by a simple method, by acquiring the spatial information using a distance sensor provided in a smartphone or the like and acquiring the drone information from an external server.
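The two-stage route described above (base to approach opening, then approach opening to landing site) can be sketched as the concatenation of two waypoint lists. This is only an illustrative sketch; the waypoint coordinates below are hypothetical values, not taken from the disclosure.

```python
# Waypoints as (x, y, z) positions in a shared coordinate frame (hypothetical values).
opening = (120.0, 80.0, 9.0)   # position of the approach opening

# First movement route: from the base to the approach opening.
first_route = [(0.0, 0.0, 30.0), (120.0, 80.0, 30.0), opening]
# Second movement route: from the approach opening to the landing site.
second_route = [opening, (121.5, 80.0, 9.0), (122.0, 81.0, 8.5)]

# Full flight path; drop the duplicated opening waypoint at the join.
full_route = first_route + second_route[1:]
```

In this sketch the approach opening acts as the hand-off point between the route planned by the server and the route determined from the terminal device's spatial information.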
(2) In (1) above, the drone information may further include the flight accuracy of the drone.
For example, even if the size of an opening is larger than the size of the drone, the drone may contact the periphery of the opening and be unable to enter it when the flight accuracy is poor. Similarly, even if the size of a landing site is larger than the size of the drone, the drone may contact the surroundings of the landing site and be unable to land there. According to this configuration, the approach opening and the landing site can be determined in consideration of the flight accuracy of the drone. It is therefore possible to determine an approach opening that the drone can reliably enter and a landing site at which the drone can reliably land.
(3) In (1) or (2) above, the approach opening determination unit may determine, based on the drone information and the spatial information, enterable openings through which the drone can enter; the movement route determination system may further include an enterable opening display unit that displays the enterable openings on a screen, and a first selection information acquisition unit that acquires selection information of an enterable opening selected by a user; and the approach opening determination unit may further determine the approach opening based on the selection information of the enterable opening.
According to this configuration, the user can select one of the plurality of enterable openings. The user can thus exclude enterable openings that the user does not want the drone to enter, or select an enterable opening that the drone appears likely to enter easily, so that an approach opening that meets the user's wishes can be determined.
(4) In (3) above, the approach opening determination unit may further determine, based on the drone information and the spatial information, unenterable openings through which the drone cannot enter, and the enterable opening display unit may display the enterable openings and the unenterable openings in a distinguishable manner.
According to this configuration, unenterable openings through which the drone cannot enter, such as openings smaller than the size of the drone, can be displayed so as to be distinguishable from the enterable openings. This allows the user to select an enterable opening efficiently.
(5) In any one of (1) to (4) above, the landing site determination unit may determine, based on the drone information and the spatial information, landable sites at which the drone can land; the movement route determination system may further include a landable site display unit that displays the landable sites on a screen, and a second selection information acquisition unit that acquires selection information of a landable site selected by the user; and the landing site determination unit may further determine the landing site based on the selection information of the landable site.
According to this configuration, the user can select one of the landable sites. The user can thus exclude landable sites where the user does not want the drone to land, or select a landable site where the drone appears likely to land easily, so that a landing site that meets the user's wishes can be determined.
(6) In (5) above, the landable site display unit may further display the outline of the drone on the screen based on the drone information.
According to this configuration, a landable site can be selected in consideration of the outline of the drone. Therefore, a landing site at which the drone can reliably land can be determined.
(7) In any one of (1) to (6) above, the movement route determination system may further include a movement route display unit that displays the second movement route on a screen.
According to this configuration, the movement route of the drone from the approach opening to the landing site can be displayed. This makes the user aware of the movement route of the drone, and the user can be alerted not to place objects on the movement route.
(8) In any one of (1) to (7) above, the movement route determination system may further include a first notification unit that, when the second movement route determination unit does not determine the second movement route, notifies the user that the second movement route has not been determined.
A user who is notified that the second movement route has not been determined can take measures such as changing the landing site.
(9) In any one of (1) to (8) above, the movement route determination system may further include a second notification unit that, when the drone is unable to move along the second movement route, notifies the user that the drone was unable to move along the second movement route.
A user who is notified that the drone was unable to move along the second movement route can take measures such as removing obstacles on the second movement route or changing the landing site.
(10) A landing site determination system according to another embodiment of the present disclosure includes: a position specifying unit that specifies the position of a terminal device; a communication unit that communicates with a wireless tag; a tag position calculation unit that calculates the position of the wireless tag relative to the terminal device based on the communication between the communication unit and the wireless tag; and a landing site determination unit that determines the position of the wireless tag as a landing site of a drone based on the position of the terminal device and the relative position of the wireless tag.
According to this configuration, the landing site of the drone can be determined by placing a wireless tag at the landing site and acquiring the position of the terminal device. This allows the landing site of the drone to be determined by a simple method.
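The combination described above (absolute terminal position plus relative tag position yields the absolute tag position) can be sketched as follows. The disclosure does not specify the coordinate arithmetic; this sketch assumes the ranging (for example, UWB-style ranging) yields an east/north offset in meters and uses a flat-earth approximation, which is adequate for offsets of a few meters.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def tag_position(terminal_lat_deg: float, terminal_lon_deg: float,
                 east_offset_m: float, north_offset_m: float) -> tuple[float, float]:
    """Convert the tag's position relative to the terminal device
    (east/north offset in meters) into an absolute latitude/longitude,
    which is then used as the drone's landing site."""
    dlat = math.degrees(north_offset_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_offset_m /
                        (EARTH_RADIUS_M * math.cos(math.radians(terminal_lat_deg))))
    return terminal_lat_deg + dlat, terminal_lon_deg + dlon

# Example: tag 3 m east and 4 m north of the terminal (hypothetical values).
landing_lat, landing_lon = tag_position(35.0, 139.0, 3.0, 4.0)
```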
(11) A movement route determination device according to another embodiment of the present disclosure includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site of the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site; a landing site determination unit that determines, based on the drone information and the spatial information, a landing site of the drone; and a movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a movement route of the drone from the approach opening to the landing site.
The drone information includes the size of the drone. Therefore, by acquiring the drone information and the spatial information, it is possible to determine the approach opening through which the drone enters the space around the candidate landing site of the drone, and also to determine the landing site of the drone. Once the approach opening and the landing site are determined, the movement route of the drone from the approach opening to the landing site can be determined. For example, a user can determine the movement route or landing site of a drone by a simple method, by acquiring the spatial information using a distance sensor provided in a smartphone or the like and acquiring the drone information from an external server.
(12) A drone control device according to another embodiment of the present disclosure includes: a drone information providing unit that provides drone information including the size of a drone to a terminal device; a position information acquisition unit that acquires, from the terminal device, position information of an approach opening, determined based on the drone information, through which the drone enters a building including a landing site; a first movement route determination unit that determines a first movement route of the drone from a base of the drone to the approach opening; a second movement route acquisition unit that acquires, from the terminal device, a second movement route of the drone from the approach opening to the landing site; and a movement instruction transmission unit that transmits a movement instruction to the drone based on the first movement route and the second movement route.
The position information of the approach opening determined based on the drone information and the second movement route are acquired from the terminal device, and a movement instruction can be transmitted to the drone based on the first movement route and the second movement route. This allows the drone to be controlled based on a movement route or landing site determined by a simple method.
(13) A computer program according to another embodiment of the present disclosure causes a computer to function as: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site of the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site; a landing site determination unit that determines, based on the drone information and the spatial information, a landing site of the drone; and a movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a movement route of the drone from the approach opening to the landing site.
According to this configuration, the computer can be caused to function as the above-described movement route determination device, so the same operations and effects as those of the above-described movement route determination device can be achieved.
(14) A computer program according to another embodiment of the present disclosure causes a computer to function as: a position specifying unit that specifies the position of a terminal device; a tag position calculation unit that calculates the position of a wireless tag relative to the terminal device based on communication between a communication unit and the wireless tag; and a landing site determination unit that determines the position of the wireless tag as a landing site of a drone based on the position of the terminal device and the relative position of the wireless tag.
According to this configuration, the computer can be caused to function as the above-described landing site determination system, so the same operations and effects as those of the above-described landing site determination system can be achieved.
[Details of Embodiments of the Present Disclosure]

Embodiments of the present disclosure will be described below with reference to the drawings. Each of the embodiments described below represents a specific example of the present disclosure. The numerical values, shapes, materials, components, arrangement positions and connection forms of the components, steps, order of steps, and the like shown in the following embodiments are examples and do not limit the present disclosure. Among the components in the following embodiments, components not described in the independent claims can be added arbitrarily. Each figure is a schematic diagram and is not necessarily drawn strictly to scale.
The same components are given the same reference signs. Since their functions and names are also the same, descriptions thereof will be omitted as appropriate.
<Embodiment 1>

[Overall Configuration of the Delivery System]

FIG. 1 is a diagram showing the configuration of a delivery system according to Embodiment 1 of the present disclosure.

The delivery system 10 is a system for delivering a package from a package delivery base (hereinafter also referred to simply as the "base"), which is the base of the drone 1, to a delivery destination, and includes the drone 1, a terminal device 2, and a server 5. The drone 1, the terminal device 2, and the server 5 are connected to a network 7 wirelessly or by wire. The network 7 is configured by, for example, a public communication network such as 5G.
The drone 1 is an unmanned aircraft that flies from the delivery base to the delivery destination along a movement route determined by the server 5 and delivers the package to the delivery destination. The drone 1 is equipped with a mechanism for grasping the package, a position specifying unit for specifying the position of the drone 1, and a camera. The drone 1 can detect obstacles and the like existing in its path from images captured by the camera, and flies to the destination while avoiding the obstacles.
The terminal device 2 is, for example, a smartphone or tablet terminal owned by the user, and determines the landing site of the drone 1 at the delivery destination, an approach opening, which is an opening provided at the delivery destination through which the drone 1 enters, and the movement route of the drone 1 from the approach opening to the landing site. In this embodiment, it is assumed that the drone 1 delivers a package to the balcony of a roofed building such as an apartment building or a detached house. That is, it is assumed that the drone 1 passes through an opening of the balcony from the outside, enters the balcony, and lands on the balcony. However, the balcony does not have to be roofed, and the landing site of the drone 1 is not limited to a balcony; Embodiment 1 is also applicable when the drone 1 lands at any location having an opening to the outside.
The server 5 determines, based on the information determined by the terminal device 2, the movement route of the drone 1 from the delivery base to the landing site of the drone 1 at the delivery destination, and instructs the drone 1 to fly along the determined movement route.
[Configuration of the Terminal Device 2]

FIG. 2 is a block diagram showing the configuration of the terminal device 2 according to Embodiment 1 of the present disclosure.

The terminal device 2 includes a communication unit 21, a camera 22, a sensor 23, a touch panel 24, a speaker 25, a position specifying unit 26, a storage unit 27, a control unit 28, and a bus 29.
The communication unit 21 connects the terminal device 2 to the network 7 and performs wireless or wired communication with external devices.

The camera 22 photographs the surroundings of the terminal device 2 and outputs images.
FIG. 3 is a diagram showing an example of an image output from the camera 22. FIG. 3 shows an example of an image of an apartment balcony captured by the user operating the camera 22. The image includes a floor 101, which is a candidate landing site for the drone 1, and balcony openings 102A and 102B through which the drone 1 enters the apartment including the floor 101.
The sensor 23 measures the distances to objects around the terminal device 2 and outputs depth information corresponding to the distances; that is, the depth information of each pixel represents the distance to the object. The sensor 23 is configured by, for example, LIDAR (Light Detection and Ranging).
Distance measurement by the sensor 23 and image capture by the camera 22 can be performed simultaneously. Thus, for example, the sensor 23 measures the distances to the objects on the balcony shown in FIG. 3 (here, the floor 101, the openings 102A and 102B, a wall surface 103, a window 104, a pillar 105, a fence 106, and the like).
The touch panel 24 functions both as a display device that displays images and as an input device that accepts input operations by the user.

The speaker 25 outputs sound.
The position specifying unit 26 specifies the position of the terminal device 2 using satellite navigation. For example, the position specifying unit 26 specifies the position of the terminal device 2 based on radio waves received from a plurality of GPS (Global Positioning System) satellites. The position of the terminal device 2 can be specified by, for example, latitude and longitude. Satellite navigation uses a global navigation satellite system (GNSS) such as GPS, but the satellite positioning system is not limited to GPS.
The storage unit 27 is configured by a volatile memory element such as an SRAM (Static RAM) or DRAM (Dynamic RAM), a nonvolatile memory element such as a flash memory or EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic storage device such as a hard disk, or the like. The storage unit 27 stores a computer program 27P executed by the control unit 28, and also stores data obtained when the computer program 27P is executed.
The control unit 28 is configured by a processor such as a CPU (Central Processing Unit), and includes, as functional processing units realized by executing the computer program 27P, a drone information acquisition unit 30, a spatial information acquisition unit 31, an approach opening determination unit 32, a landing site determination unit 33, a movement route determination unit 34, an enterable opening display unit 35, a selection information acquisition unit 36, a landable site display unit 37, a movement route display unit 38, and an error notification unit 39.
The drone information acquisition unit 30 acquires, from the server 5 via the communication unit 21, drone information including the size and flight accuracy of the drone 1 that delivers the package to the user's home.
FIG. 4 is a diagram showing an example of drone information.

The drone information 52D includes the drone size (height, width, depth), hovering accuracy (vertical, horizontal), and position accuracy (vertical, horizontal). The hovering accuracy (vertical, horizontal) and position accuracy (vertical, horizontal) are examples of the flight accuracy of the drone 1. According to the drone information 52D, for example, the sizes of the drone 1 in the height, width, and depth directions are 300 mm, 500 mm, and 500 mm, respectively. The hovering accuracy of the drone 1 is ±0.1 m in both the vertical and horizontal directions, and its position accuracy is ±50 mm in both the vertical and horizontal directions. By using a technique called RTK (Real Time Kinematic), which corrects the position information of the drone 1 obtained from a satellite positioning system such as GPS based on correction information from a reference station installed on the ground, the position accuracy of the drone 1 can be improved to within a few centimeters.
The spatial information acquisition unit 31 calculates, based on the depth information output by the sensor 23, spatial information including the three-dimensional positions of objects existing around the candidate landing site of the drone 1. The spatial information expresses the three-dimensional position of each object as three-dimensional coordinates in a coordinate system preset in the terminal device 2 (for example, a right-handed coordinate system whose origin is the lens center of the camera 22 and whose X axis is the viewing direction of the camera 22). For example, the spatial information includes the three-dimensional coordinates of the floor 101, the openings 102A and 102B, the wall surface 103, the window 104, the pillar 105, and the fence 106 in FIG. 3. Since the openings 102A and 102B are empty space, their three-dimensional coordinates are the coordinates of a point at infinity or indefinite values.
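Back-projecting a depth pixel into the camera-centered coordinate system described above can be sketched with a pinhole camera model. The camera intrinsics (fx, fy, cx, cy) are an assumption of this sketch, not given in the disclosure; depth-sensing frameworks normally supply them alongside the depth map.

```python
def pixel_to_camera_coords(u: float, v: float, depth_m: float,
                           fx: float, fy: float, cx: float, cy: float):
    """Back-project pixel (u, v) with depth depth_m (measured along the
    viewing direction) into the coordinate system described in the text:
    origin at the lens center, X axis along the camera's viewing direction,
    right-handed (here Y points left and Z points up)."""
    x = depth_m
    y = -(u - cx) * depth_m / fx  # pixel right of center -> negative Y (to the right)
    z = -(v - cy) * depth_m / fy  # pixel below center -> negative Z (downward)
    return (x, y, z)
```

Applying this to every pixel of the depth map yields the point cloud from which the three-dimensional coordinates of the floor 101, wall surface 103, and other objects can be extracted.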
The approach opening determination unit 32 determines, based on the drone information 52D acquired by the drone information acquisition unit 30 and the spatial information acquired by the spatial information acquisition unit 31, enterable openings, which are openings through which the drone 1 can enter the apartment including the floor 101, and unenterable openings, which are openings through which the drone 1 cannot enter the apartment including the floor 101.
For example, the entry opening determination unit 32 determines the horizontal and vertical sizes of each of the openings 102A and 102B shown in FIG. 3 based on the spatial information. The entry opening determination unit 32 also determines, based on the drone information 52D, the size of the region that the drone 1 may occupy, derived from the size, hovering accuracy, and positional accuracy of the drone 1. In the example shown in FIG. 4, the entry opening determination unit 32 determines that the region the drone 1 may occupy is 450 mm (= 300 mm + 0.1 m + 50 mm) in the height direction and 650 mm (= 500 mm + 0.1 m + 50 mm) in the width direction. For each of the openings 102A and 102B, if the horizontal size of the opening is at least the 650 mm width that the drone 1 may occupy and the vertical size of the opening is at least the 450 mm height that the drone 1 may occupy, the entry opening determination unit 32 determines the opening to be an enterable opening; otherwise, it determines the opening to be a non-enterable opening. For example, the entry opening determination unit 32 determines the opening 102A to be an enterable opening and the opening 102B to be a non-enterable opening.
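The size comparison described above can be expressed compactly. The following sketch is illustrative (the function names are hypothetical); the numeric values are taken from the drone information example of FIG. 4:

```python
def occupied_size_mm(body_mm, hover_m, position_mm):
    """One-directional size of the region the drone may occupy:
    body size plus hovering accuracy plus positional accuracy."""
    return body_mm + hover_m * 1000 + position_mm

def is_enterable(opening_w_mm, opening_h_mm, need_w_mm, need_h_mm):
    """An opening is enterable only if both of its dimensions are at least
    as large as the region the drone may occupy."""
    return opening_w_mm >= need_w_mm and opening_h_mm >= need_h_mm

# Values from the drone information 52D example (FIG. 4):
need_h = occupied_size_mm(300, 0.1, 50)   # height: 450 mm
need_w = occupied_size_mm(500, 0.1, 50)   # width: 650 mm
```

A wide balcony opening such as 102A passes the check, while a small window such as 102B fails it.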
The entry opening determination unit 32 determines the enterable opening selected by the user, from among the determined enterable openings, as the entry opening through which the drone 1 will enter. The entry opening determination unit 32 transmits opening information regarding the determined entry opening to the server 5 via the communication unit 21.
FIG. 5 is a diagram illustrating an example of the opening information.
The opening information includes the opening coordinates (latitude), opening coordinates (longitude), opening height, and opening entry azimuth. The opening coordinates (latitude) indicate the latitude of the entry opening, for example "YY.YYYYYY" (degrees). The opening coordinates (longitude) indicate the longitude of the entry opening, for example "XXX.XXXXXX" (degrees). The opening height indicates the height of the entry opening, for example "15 m". The opening entry azimuth indicates the azimuth of the entry opening with respect to the terminal device 2, for example "275 degrees". Here, an azimuth of 0 degrees corresponds to true north.
Here, the entry opening determination unit 32 calculates the opening coordinates (latitude) and opening coordinates (longitude) based on the position (latitude, longitude) of the terminal device 2 identified by the position identification unit 26 and the three-dimensional position of the entry opening indicated by the spatial information. However, when the three-dimensional position of the entry opening is the coordinates of a point at infinity or indefinite, the entry opening determination unit 32 may calculate the three-dimensional position of the entry opening from the three-dimensional coordinates of objects around the entry opening (in the example of FIG. 3, the pillar 105 and the fence 106). For example, when the entry opening is the opening 102A, the entry opening determination unit 32 may calculate the three-dimensional position obtained by extending the surface of the fence 106 upward as the three-dimensional position of the opening 102A. The entry opening determination unit 32 then calculates the opening coordinates (latitude) and opening coordinates (longitude).
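One simple way to turn a terminal-relative position of the entry opening into the latitude and longitude carried in the opening information is a local flat-earth approximation. The patent does not specify the conversion; the sketch below is a hedged illustration that assumes the opening's offset from the terminal has already been resolved into east/north components in meters.

```python
import math

def offset_to_latlon(lat_deg, lon_deg, east_m, north_m):
    """Approximate the latitude/longitude of a point offset by east_m and
    north_m meters from (lat_deg, lon_deg), using a spherical-earth model.
    Adequate at the few-meter scale of a terminal-to-opening offset."""
    r = 6_371_000.0  # mean Earth radius in meters
    dlat = math.degrees(north_m / r)
    dlon = math.degrees(east_m / (r * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```

The east/north components would themselves come from the terminal's coordinate system and the opening entry azimuth described above.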
The landing site determination unit 33 determines possible landing sites where the drone 1 can land, based on the drone information 52D acquired by the drone information acquisition unit 30 and the spatial information acquired by the spatial information acquisition unit 31. The landing site determination unit 33 then determines the possible landing site selected by the user, from among the determined possible landing sites, as the landing site of the drone 1.
The movement route determination unit 34 determines the movement route of the drone 1 from the entry opening to the landing site, based on the spatial information calculated by the spatial information acquisition unit 31, the information on the entry opening determined by the entry opening determination unit 32, and the information on the landing site determined by the landing site determination unit 33. For example, the movement route determination unit 34 determines a movement route in which the drone 1 enters the interior of the balcony through the entry opening at a predetermined elevation angle and, once the drone 1 reaches the point directly above the landing site, changes course to the vertical direction and lands at the landing site. The movement route determination unit 34 transmits the determined movement route information to the server 5 via the communication unit 21. Note that when an obstacle or the like exists between the entry opening and the landing site and the drone 1 cannot move, the movement route determination unit 34 does not determine a movement route, and instead transmits information indicating that a movement route could not be determined to the server 5 via the communication unit 21.
FIG. 6 is a diagram showing an example of the movement route information.
The movement route information includes movement route coordinates (1) to (n), landing site coordinates, and a reference coordinate azimuth. The movement route coordinates (1) to (n) are the coordinates of n points that the drone 1 passes through between the entry opening and the landing site. The landing site coordinates are the coordinates of the point where the drone 1 lands. For example, in a right-handed coordinate system with its origin at the lens center of the camera 22 and the line-of-sight direction of the camera 22 as the X axis, the drone 1 passes through the coordinates (X1, Y1, Z1), ..., (Xn, Yn, Zn) and lands at the coordinates (XL, YL, ZL). The reference coordinate azimuth indicates the azimuth of the X-axis direction of this coordinate system, for example "0 degrees". Here, an azimuth of 0 degrees corresponds to true north.
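The movement route information of FIG. 6 can be represented as a simple record. The following sketch mirrors the fields described above; the type and field names are hypothetical, not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class MovementRouteInfo:
    """Movement route information: waypoints from the entry opening to the
    landing site, the landing-site coordinates, and the azimuth of the
    local X axis (true north = 0 degrees)."""
    route: List[Point3D]           # coordinates (X1, Y1, Z1) ... (Xn, Yn, Zn)
    landing_point: Point3D         # coordinates of the landing site
    reference_azimuth_deg: float   # orientation of the X axis
```

Such a record is what the terminal device 2 would serialize and transmit to the server 5.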
The enterable opening display unit 35 causes the touch panel 24 to display the enterable openings and non-enterable openings determined by the entry opening determination unit 32 in a manner that allows them to be distinguished. For example, the enterable opening display unit 35 may display the enterable openings and non-enterable openings superimposed on the image captured by the camera 22 using frames of different colors, or using frames of different styles.
FIG. 7 is a diagram showing an example of a display on the touch panel 24 by the enterable opening display unit 35.
Frames 111 and 112 are displayed on the touch panel 24, superimposed on the image of the balcony captured by the camera 22 shown in FIG. 3. The frame 111 highlights the opening 102A, and the frame 112 highlights the opening 102B. The frame 111 carries no cross mark, indicating that the opening 102A is an enterable opening. In contrast, the frame 112 carries a cross mark, indicating that the opening 102B is a non-enterable opening.
The selection information acquisition unit 36 acquires information on the enterable opening selected by the user from among the enterable openings displayed on the touch panel 24 by the enterable opening display unit 35. Specifically, the selection information acquisition unit 36 acquires from the touch panel 24 information on the position touched by the user on the screen, and identifies the enterable opening selected by the user based on the acquired position information and the display positions of the enterable openings on the screen. In the example shown in FIG. 7, the only enterable opening is the opening 102A, so the user selects the opening 102A.
The possible landing site display unit 37 causes the touch panel 24 to display the possible landing sites determined by the landing site determination unit 33. For example, the possible landing site display unit 37 may display the region including the possible landing sites superimposed on the image captured by the camera 22 in a color different from that of other regions.
The possible landing site display unit 37 also causes the touch panel 24 to display the outline of the drone 1 based on the drone information 52D. For example, the possible landing site display unit 37 displays, as the outline of the drone 1, a rectangular region that the drone 1 may occupy, derived from the size, hovering accuracy, and positional accuracy of the drone 1. In the example shown in FIG. 4, the possible landing site display unit 37 displays a rectangular box measuring 450 mm (= 300 mm + 0.1 m + 50 mm) in the height direction, 650 mm (= 500 mm + 0.1 m + 50 mm) in the width direction, and 650 mm (= 500 mm + 0.1 m + 50 mm) in the depth direction.
FIG. 8 is a diagram showing an example of a display on the touch panel 24 by the possible landing site display unit 37.
A surface 121 and a rectangular frame 131 are displayed on the touch panel 24, superimposed on the image of the balcony captured by the camera 22 shown in FIG. 3.
The surface 121 is a surface that includes the possible landing sites of the drone 1 and is, for example, displayed in blue. The rectangular frame 131 three-dimensionally indicates the region that the drone 1 may occupy.
The selection information acquisition unit 36 further acquires information on the possible landing site selected by the user from among the possible landing sites displayed on the touch panel 24. Specifically, in the example of FIG. 8, the user drags the rectangular frame 131 on the touch panel 24 and releases (drops) the finger at the position where the user wants the drone 1 to land. Note that movement of the rectangular frame 131 by dragging is restricted so that the rectangular frame 131 can be moved only within the surface 121 that includes the possible landing sites. The selection information acquisition unit 36 acquires from the touch panel 24 information on the position where the user dropped the frame, and identifies the possible landing site selected by the user based on the acquired position information and the display positions of the possible landing sites on the screen.
The movement route display unit 38 causes the touch panel 24 to display the movement route of the drone 1 from the entry opening to the landing site determined by the movement route determination unit 34, superimposed on the image captured by the camera 22.
FIGS. 9 and 10 are diagrams showing examples of displays on the touch panel 24 by the movement route display unit 38.
Referring to FIG. 9, while the movement route determination unit 34 is calculating the movement route of the drone 1, the movement route display unit 38 displays the message "Route calculation in progress" superimposed on the image of the balcony captured by the camera 22.
Referring to FIG. 10, when the movement route determination unit 34 has finished calculating the movement route of the drone 1, the movement route display unit 38 displays the movement route 141 superimposed on the image of the balcony captured by the camera 22. The movement route 141 indicates, with dashed lines, the region that the drone 1 may occupy while moving from the opening 102A to the landing site.
When the movement route determination unit 34 fails to determine a movement route, the error notification unit 39 notifies the user of this by displaying a message on the touch panel 24 or by outputting a sound from the speaker 25. In addition, when the error notification unit 39 receives a notification from the server 5 or the drone 1 that the drone 1 could not land at the landing site because an obstacle exists on the movement route of the drone 1, the error notification unit 39 notifies the user by displaying the notification content on the touch panel 24 or by outputting it as audio from the speaker 25.
[Configuration of server 5]
FIG. 11 is a block diagram showing the configuration of the server 5 according to Embodiment 1 of the present disclosure.
The server 5 includes a communication unit 51, a storage unit 52, a control unit 53, and a bus 54.
The communication unit 51 connects the server 5 to the network 7 and performs wireless or wired communication with an external device.
The storage unit 52 is configured of a volatile memory element such as SRAM or DRAM, a nonvolatile memory element such as flash memory or EEPROM, or a magnetic storage device such as a hard disk. The storage unit 52 stores a computer program 52P executed by the control unit 53 and the drone information 52D described above. The storage unit 52 also stores data obtained when the computer program 52P is executed.
The control unit 53 is configured of a processor such as a CPU, and includes, as functional processing units realized by executing the computer program 52P, a drone information providing unit 55, a landing site information acquisition unit 56, a movement route determination unit 57, and an error notification unit 58.
The drone information providing unit 55 reads from the storage unit 52 the drone information 52D of the drone that delivers a package to the home of the user of the terminal device 2, and transmits the read drone information 52D to the terminal device 2 via the communication unit 51.
The landing site information acquisition unit 56 receives the opening information and the movement route information from the terminal device 2 via the communication unit 51.
The movement route determination unit 57 determines the movement route of the drone 1 from a predetermined base of the drone 1 to the entry opening based on the opening information. This movement route can be determined by a known technique: for example, a movement route that minimizes the travel distance of the drone 1, or one that minimizes the travel time of the drone 1, may be determined. The movement route determination unit 57 then determines the movement route from the base through the entry opening to the landing site, based on the determined movement route of the drone 1 from the base to the entry opening and the movement route information received from the terminal device 2. In doing so, the movement route determination unit 57 converts the coordinate system of the movement route information to match the coordinate system of the drone 1, according to the reference coordinate azimuth indicated in the movement route information received from the terminal device 2. The movement route determination unit 57 transmits a movement instruction to the drone 1 by transmitting the information on the determined movement route from the base to the landing site to the drone 1 via the communication unit 51. The drone 1 flies from the base to the landing site according to the movement route determined by the movement route determination unit 57 and delivers the package to the landing site.
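The coordinate-system conversion mentioned above amounts to rotating the horizontal components of each waypoint by the azimuth difference between the terminal's X axis and the drone's reference direction. The patent does not give the formula; the sketch below is a hedged illustration whose function name and sign conventions (azimuths measured clockwise from true north, vertical component unchanged) are assumptions:

```python
import math

def to_drone_frame(x, y, z, ref_azimuth_deg, target_azimuth_deg):
    """Rotate a point given in a frame whose X axis points toward
    ref_azimuth_deg (clockwise from true north) into a frame whose X axis
    points toward target_azimuth_deg; the vertical component is unchanged."""
    theta = math.radians(ref_azimuth_deg - target_azimuth_deg)
    xr = x * math.cos(theta) - y * math.sin(theta)
    yr = x * math.sin(theta) + y * math.cos(theta)
    return xr, yr, z
```

Applying this to every movement route coordinate and the landing site coordinates would yield a route expressed in the drone's own frame.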
The error notification unit 58 receives, via the communication unit 51, error information transmitted from the drone 1 when the drone 1 cannot fly along the movement route. When the error information indicates that the drone 1 could not land at the landing site because an obstacle exists between the entry opening and the landing site, the error notification unit 58 notifies the terminal device 2 of this via the communication unit 51.
[Processing procedure of delivery system 10]
FIG. 12 is a sequence diagram showing the processing procedure of the delivery system 10 according to the first embodiment of the present disclosure.
The server 5 transmits drone information 52D as shown in FIG. 4 to the terminal device 2, and the terminal device 2 receives the drone information 52D (step S1).
The camera 22 of the terminal device 2 captures an image of the balcony including the candidate landing site of the drone 1 as shown in FIG. 3 (step S2). Note that if the region including the candidate landing site does not fit within the imaging area of the camera 22, the user may pan the camera 22 to capture a panoramic image wider than the imaging area.
At the same time as capturing the image, the terminal device 2 measures the distances to the objects on the balcony shown in FIG. 3 (here, the floor 101, openings 102A and 102B, wall surface 103, window 104, pillar 105, fence 106, and the like) and outputs depth information (step S3).
Based on the depth information, the terminal device 2 calculates spatial information around the candidate landing site, including the three-dimensional positions of objects existing around the candidate landing site (step S4).
Based on the drone information 52D and the spatial information, the terminal device 2 determines enterable openings through which the drone 1 can enter and non-enterable openings through which the drone 1 cannot enter (step S5).
The terminal device 2 causes the touch panel 24 to display an image as shown in FIG. 7 (step S6). This allows the opening 102A, which is an enterable opening, and the opening 102B, which is a non-enterable opening, to be displayed in a distinguishable manner. The terminal device 2 also displays the message "Please specify the drone entrance" superimposed on the image.
The user selects, on the touch panel 24, the enterable opening through which the drone 1 is to enter, and the terminal device 2 acquires information on the selected enterable opening (step S7). For example, when the user selects the opening 102A on the image shown in FIG. 7, information on the opening 102A (for example, position information of the opening 102A on the image) is acquired.
The terminal device 2 determines the enterable opening selected by the user as the entry opening through which the drone 1 will enter (step S8). For example, the terminal device 2 determines the selected opening 102A as the entry opening.
The terminal device 2 determines possible landing sites where the drone 1 can land, based on the drone information 52D and the spatial information (step S9).
As shown in FIG. 8, the terminal device 2 causes the touch panel 24 to display an image including the surface 121, which includes the possible landing sites of the drone 1, and the rectangular frame 131, which indicates the region that the drone 1 may occupy (step S10). The terminal device 2 also displays the message "Please specify the landing site" superimposed on the image.
The user selects a landing site for the drone 1 by dragging and dropping the rectangular frame 131 on the touch panel 24, and the terminal device 2 identifies the selected landing site of the drone 1 (step S11). For example, when the user drops the rectangular frame 131 at the position shown in FIG. 8, the position in real space corresponding to the position of the rectangular frame 131 on the image is identified as the possible landing site.
The terminal device 2 determines the possible landing site selected by the user and identified by the terminal device 2 as the landing site of the drone 1 (step S12).
The terminal device 2 displays on the touch panel 24 an image, as shown in FIG. 9, indicating that the movement route of the drone 1 is being calculated (step S13).
The terminal device 2 determines the movement route of the drone 1 from the entry opening to the landing site based on the spatial information, the information on the entry opening, and the information on the landing site. The terminal device 2 displays on the touch panel 24 an image showing the movement route 141 of the drone 1 as shown in FIG. 10 (step S14). Note that if no movement route is found, the terminal device 2 causes the touch panel 24 to display an image indicating that no movement route was found, and cancels the subsequent processing.
The terminal device 2 transmits to the server 5 the opening information (FIG. 5) indicating the position of the determined entry opening and the movement route information (FIG. 6) indicating the determined movement route, and the server 5 receives the opening information and the movement route information (step S15).
The server 5 determines the movement route of the drone 1 from the package delivery base to the entry opening based on the opening information. The server 5 then determines the movement route from the delivery base through the entry opening to the landing site, based on the determined movement route and the movement route information received from the terminal device 2 (step S16).
The server 5 transmits movement route information indicating the movement route from the delivery base to the landing site to the drone 1, and the drone 1 receives the movement route information (step S17).
The drone 1 delivers the package to the landing site by flying from the package delivery base through the entry opening to the landing site according to the received movement route information (step S18).
If the drone 1 cannot deliver the package during flight due to an obstacle or the like, it returns to the base and transmits error information including information on the point at which it turned back to the server 5. The server 5 receives the error information transmitted from the drone 1 (step S19).
If the received error information indicates that the drone 1 turned back at a point between the entry opening and the landing site, the server 5 transmits to the terminal device 2 error information indicating that the drone 1 could not land at the landing site because an obstacle exists on the movement route of the drone 1, and the terminal device 2 receives the error information (step S20).
Based on the error information, the terminal device 2 notifies the user that the drone 1 could not land at the landing site because an obstacle exists on the movement route of the drone 1 (step S21).
As described above, the drone information 52D includes the size of the drone 1. Therefore, by acquiring the drone information 52D and the spatial information, the terminal device 2 can determine the entry opening through which the drone 1 enters the space around the candidate landing site of the drone 1. The terminal device 2 can also determine the landing site of the drone 1. Once the entry opening and the landing site are determined, the terminal device 2 can determine the movement route of the drone 1 from the base to the entry opening and the movement route of the drone 1 from the entry opening to the landing site. For example, the user acquires spatial information using a distance sensor provided in the terminal device 2 such as a smartphone, and the terminal device 2 acquires the drone information 52D from the external server 5; the movement route or landing site of the drone 1 can thus be determined by a simple method.
Even when the size of an opening is larger than the size of the drone 1, if the hovering accuracy or positional accuracy of the drone 1 is poor, the drone 1 may come into contact with the surroundings of the opening and be unable to enter it. Similarly, even when the size of a landing site is larger than the size of the drone 1, the drone 1 may come into contact with the surroundings of the landing site and be unable to land there. In Embodiment 1, the entry opening and the landing site can be determined in consideration of the hovering accuracy and positional accuracy of the drone 1. Therefore, it is possible to determine an entry opening through which the drone 1 can reliably enter and a landing site at which the drone 1 can reliably land.
Furthermore, the user can select one of a plurality of enterable openings. This allows the user to exclude enterable openings through which the user does not want the drone 1 to enter, or to select an enterable opening through which the drone 1 appears likely to enter easily, so that the entry opening can be determined in accordance with the user's wishes.
 In addition, inaccessible openings that the drone 1 cannot enter, such as openings smaller than the drone 1, can be displayed separately from accessible openings. This allows the user to select an accessible opening efficiently.
 The user can also select one of the possible landing sites. This allows the user to exclude possible landing sites where the user does not want the drone 1 to land, or to select a possible landing site where the drone 1 is likely to land easily, so that the landing site can be determined according to the user's wishes.
 Further, as shown in FIG. 8, a rectangular frame 131 indicating the area that the drone 1 may occupy is displayed on the touch panel 24. This allows the user to select a possible landing site in consideration of the external shape of the drone 1, so that a landing site where the drone 1 can reliably land can be determined.
 Furthermore, as shown in FIG. 10, the movement route 141 of the drone 1 from the approach opening to the landing site can be displayed. This makes the user aware of the movement route of the drone 1 and warns the user not to place objects on the movement route.
 If the terminal device 2 cannot determine the movement route of the drone 1 from the approach opening to the landing site, it notifies the user to that effect. This allows the user to take measures such as changing the landing site.
 The terminal device 2 also notifies the user if the drone 1 cannot fly the movement route from the approach opening to the landing site. This allows the user to take measures such as removing obstacles on the movement route or changing the landing site.
 The server 5 acquires, from the terminal device 2, the position information of the approach opening determined based on the drone information 52D and the movement route of the drone 1 from the opening to the landing site, and determines the movement route from the base to the landing site. The server 5 can then transmit a movement instruction to the drone 1 based on the determined movement route. The drone 1 can thus be controlled based on a movement route or landing site determined by a simple method.
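 The route assembly performed by the server 5 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: it assumes routes are exchanged as waypoint lists that meet at the approach opening, and all names (`Waypoint`, `build_mission`) are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Tuple

Waypoint = Tuple[float, float, float]  # latitude, longitude, altitude (m)

@dataclass
class Mission:
    waypoints: List[Waypoint]

def build_mission(first_route: List[Waypoint],
                  second_route: List[Waypoint]) -> Mission:
    """Join the base-to-opening route with the opening-to-landing route.
    The approach opening is the last point of the first route and the
    first point of the second route, so the duplicate point is dropped."""
    if first_route[-1] != second_route[0]:
        raise ValueError("routes must meet at the approach opening")
    return Mission(waypoints=first_route + second_route[1:])

mission = build_mission(
    [(35.00, 135.00, 30.0), (35.01, 135.01, 15.0)],     # base -> opening
    [(35.01, 135.01, 15.0), (35.0101, 135.0101, 0.0)],  # opening -> landing
)
```

The combined waypoint list would then be what the server transmits to the drone 1 as a movement instruction.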
 <Embodiment 2>
 In Embodiment 1, it was assumed that the drone 1 would pass through an opening, enter the space, and land. In Embodiment 2, it is assumed that the drone 1 lands from above the landing site without passing through an opening.
 [Overall configuration of delivery system]
 FIG. 13 is a diagram showing the configuration of a delivery system according to Embodiment 2 of the present disclosure.
 The delivery system 10 is a system for delivering packages from a delivery base to a delivery destination, and includes a drone 1, a terminal device 2, a smart tag 8, and a server 5. As in Embodiment 1, the drone 1, the terminal device 2, and the server 5 are connected to the network 7 via wireless or wired links.
 The drone 1 and the server 5 are the same as those shown in Embodiment 1.
 The smart tag 8 is a wireless tag that is wirelessly connected to the terminal device 2. For example, the smart tag 8 and the terminal device 2 are connected according to Bluetooth (registered trademark), a wireless communication standard. In Embodiment 2, the version of Bluetooth (registered trademark) is assumed to be 5.1 or later; from version 5.1 onwards, Bluetooth (registered trademark) includes a direction-finding function. The user places the smart tag 8 at the landing site of the drone 1.
 The terminal device 2 is, for example, a smartphone or a tablet terminal, is paired with the smart tag 8, and calculates the direction of and distance to the smart tag 8 by communicating with it wirelessly. The terminal device 2 determines the position of the smart tag 8 as the landing site of the drone 1 based on its own position and the calculated direction and distance of the smart tag 8.
 [Configuration of terminal device 2]
 FIG. 14 is a block diagram showing the configuration of the terminal device 2 according to Embodiment 2 of the present disclosure.
 The terminal device 2 includes a communication unit 21, a camera 22, a touch panel 24, a speaker 25, a position specifying unit 26, a storage unit 27, a control unit 28, and a bus 29.
 The following description focuses on the differences from the configuration of the terminal device 2 shown in Embodiment 1.
 As in Embodiment 1, the communication unit 21 connects the terminal device 2 to the network 7 and performs wireless or wired communication with external devices. The communication unit 21 is further connected to the smart tag 8 according to Bluetooth (registered trademark).
 The control unit 28 includes, as functional processing units realized by executing the computer program 27P stored in the storage unit 27, a drone information acquisition unit 30, a drone area display unit 41, a tag position calculation unit 42, a landing site determination unit 43, and an error notification unit 39.
 The drone area display unit 41 superimposes the area that the drone 1 may occupy on the image captured by the camera 22 and displays it on the touch panel 24.
 The tag position calculation unit 42 calculates the relative position of the smart tag 8 with respect to the terminal device 2 by calculating the relative direction of and distance to the smart tag 8 based on the communication between the communication unit 21 and the smart tag 8. For example, the communication unit 21 may receive a signal transmitted from the smart tag 8 while switching among a plurality of antennas provided in the terminal device 2, and the tag position calculation unit 42 may calculate a phase difference from IQ samples of the received signal and calculate the relative position of the smart tag 8 from the phase difference. Alternatively, the communication unit 21 may receive, with a single antenna, a signal transmitted while switching among a plurality of antennas provided in the smart tag 8, and the tag position calculation unit 42 may likewise calculate a phase difference from IQ samples of the received signal and calculate the relative position of the smart tag 8 from the phase difference. Further, when the communication unit 21 performs ultra-wideband (UWB) wireless communication with the smart tag 8, the tag position calculation unit 42 may calculate the relative position of the smart tag 8 by short-range ranging using UWB.
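 The direction estimation described above can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes a two-antenna array with known spacing and uses the standard angle-of-arrival relation θ = arcsin(λ·Δφ / (2π·d)); the function name and parameters are hypothetical.

```python
import math

def angle_of_arrival(phase_diff_rad: float, antenna_spacing_m: float,
                     wavelength_m: float) -> float:
    """Estimate the signal's angle of arrival (radians from broadside)
    from the phase difference measured between two antennas."""
    # Normalized path-length difference implied by the phase difference.
    s = wavelength_m * phase_diff_rad / (2 * math.pi * antenna_spacing_m)
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)

# A ~2.4 GHz signal (wavelength ~0.125 m) on antennas 0.0625 m apart:
theta = angle_of_arrival(math.pi / 2, 0.0625, 0.125)
```

Combining this bearing with a distance estimate (e.g. from received signal strength or UWB ranging) would yield the tag's relative position.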
 The landing site determination unit 43 determines the position of the smart tag 8 as the landing site of the drone 1 based on the position of the terminal device 2 specified by the position specifying unit 26 and the relative position of the smart tag 8 calculated by the tag position calculation unit 42. That is, the landing site determination unit 43 calculates, based on the relative position of the smart tag 8 with respect to the terminal device 2, the direction of the smart tag 8 as seen from the terminal device 2 and the distance from the terminal device 2 to the smart tag 8. The landing site determination unit 43 determines the position obtained by moving the position of the terminal device 2 by the calculated distance in the calculated direction as the position of the smart tag 8, and determines this position as the landing site of the drone 1. The landing site determination unit 43 transmits information indicating the determined landing site (for example, latitude and longitude information) to the server 5 via the communication unit 21.
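 Moving the terminal's latitude/longitude by a bearing and distance, as described above, can be sketched with a flat-earth approximation that is adequate at the short ranges involved. This is an assumed sketch, not the patented implementation; all names are hypothetical.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def offset_position(lat_deg: float, lon_deg: float,
                    bearing_deg: float, distance_m: float):
    """Return the (lat, lon) reached by moving `distance_m` from the
    given point along `bearing_deg` (clockwise from north).
    Flat-earth approximation, adequate for tens of metres."""
    d_north = distance_m * math.cos(math.radians(bearing_deg))
    d_east = distance_m * math.sin(math.radians(bearing_deg))
    new_lat = lat_deg + math.degrees(d_north / EARTH_RADIUS_M)
    new_lon = lon_deg + math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return new_lat, new_lon

# Tag 10 m due east of the terminal:
tag_lat, tag_lon = offset_position(35.0, 135.0, 90.0, 10.0)
```

The resulting coordinates would correspond to the latitude/longitude landing site information sent to the server 5.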
 If the tag position calculation unit 42 fails to calculate the relative position of the smart tag 8, the error notification unit 39 notifies the user by displaying this on the touch panel 24 or outputting a sound from the speaker 25. In addition, when the error notification unit 39 receives a notification from the server 5 or the drone 1 that the drone 1 could not land at the landing site because an obstacle was present on its movement route, it notifies the user by displaying the notification content on the touch panel 24 or outputting it as sound from the speaker 25.
 [Processing procedure of delivery system 10]
 FIG. 15 is a sequence diagram showing the processing procedure of the delivery system 10 according to Embodiment 2 of the present disclosure.
 Pairing is performed between the smart tag 8 and the terminal device 2, and the two are connected (step S31).
 The server 5 transmits the drone information 52D shown in FIG. 4 to the terminal device 2, and the terminal device 2 receives the drone information 52D (step S32).
 The user installs the smart tag 8 at the landing site of the drone 1 and operates the camera 22 of the terminal device 2 to photograph the area including the smart tag 8 (step S33).
 The terminal device 2 superimposes the area that the drone 1 may occupy on the image captured by the camera 22 and displays it on the touch panel 24 (step S34).
 FIG. 16 is a diagram showing an example of an image displayed on the touch panel 24.
 The touch panel 24 displays an image in which a rectangular frame 131 indicating the area that the drone 1 may occupy is superimposed on the image, captured by the camera 22, of the area including the smart tag 8.
 The terminal device 2 specifies its own position using satellite navigation (step S35).
 The terminal device 2 calculates the relative position of the smart tag 8 with respect to the terminal device 2 based on the communication between the communication unit 21 and the smart tag 8 (step S36).
 The terminal device 2 determines the absolute position of the smart tag 8 as the landing site of the drone 1 based on the position of the terminal device 2 and the relative position of the smart tag 8 (step S37). The landing site of the drone 1 is indicated, for example, by latitude and longitude.
 The terminal device 2 transmits landing site information indicating the landing site of the drone 1 to the server 5, and the server 5 receives the landing site information (step S38).
 The server 5 determines the movement route of the drone 1 from the package delivery base to the landing site indicated by the landing site information (step S39). This movement route can be determined by a known technique.
 The server 5 transmits movement route information indicating the movement route from the delivery base to the landing site to the drone 1, and the drone 1 receives the movement route information (step S40).
 The drone 1 delivers the package to the landing site by flying from the package delivery base to the landing site according to the received movement route information (step S41).
 If the drone 1 cannot deliver the package during flight due to an obstacle or other cause, it returns to the base and transmits error information, including information on the point at which it turned back, to the server 5. The server 5 receives the error information transmitted from the drone 1 (step S42).
 If, based on the received error information, the drone 1 turned back in the vicinity of the landing site (for example, at a point within a predetermined distance of the landing site), the server 5 transmits error information indicating this to the terminal device 2, and the terminal device 2 receives the error information (step S43).
 Based on the error information, the terminal device 2 notifies the user that the drone 1 turned back near the landing site and was unable to land there (step S44).
 As explained above, the landing site of the drone 1 can be determined by placing the smart tag 8 at the landing site and acquiring the position of the terminal device 2. This makes it possible to determine the landing site of the drone 1 by a simple method.
 [Additional notes]
 Some or all of the components constituting each of the above devices may be composed of one or more semiconductor devices such as system LSIs.
 The computer programs 27P and 52P described above may be recorded on a computer-readable non-transitory recording medium, such as an HDD, CD-ROM, or semiconductor memory, and distributed. The computer programs 27P and 52P may also be transmitted and distributed via telecommunication lines, wireless or wired communication lines, networks typified by the Internet, data broadcasting, and the like.
 Each of the above devices may also be realized by a plurality of computers or a plurality of processors.
 Some or all of the functions of each of the above devices may also be provided by cloud computing; that is, some or all of the functions of each device may be realized by a cloud server.
 Furthermore, at least some of the above embodiments may be combined arbitrarily.
 The embodiments disclosed here should be considered illustrative in all respects and not restrictive. The scope of the present disclosure is indicated not by the description above but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
1    Drone
2    Terminal device (landing site determination system, movement route determination device)
5    Server (drone control device)
7    Network
8    Smart tag
10   Delivery system (movement route determination system)
21   Communication unit
22   Camera
23   Sensor
24   Touch panel
25   Speaker
26   Position specifying unit
27   Storage unit
27P  Computer program
28   Control unit
29   Bus
30   Drone information acquisition unit
31   Spatial information acquisition unit
32   Approach opening determination unit
33   Landing site determination unit
34   Movement route determination unit (second movement route determination unit)
35   Accessible opening display unit
36   Selection information acquisition unit (first selection information acquisition unit, second selection information acquisition unit)
37   Possible landing site display unit
38   Movement route display unit
39   Error notification unit (first notification unit, second notification unit)
41   Drone area display unit
42   Tag position calculation unit
43   Landing site determination unit
51   Communication unit
52   Storage unit
52D  Drone information
52P  Computer program
53   Control unit
54   Bus
55   Drone information provision unit
56   Landing site information acquisition unit (position information acquisition unit, second movement route acquisition unit)
57   Movement route determination unit (first movement route determination unit, movement instruction transmission unit)
58   Error notification unit
101  Floor
102A Opening
102B Opening
103  Wall surface
104  Window
105  Pillar
106  Fence
111  Frame
112  Frame
121  Surface
131  Rectangular frame
141  Movement route

Claims (14)

  1.  A movement route determination system comprising:
     a drone information acquisition unit that acquires drone information including a size of a drone;
     a spatial information acquisition unit that acquires spatial information including three-dimensional positions of objects existing around a candidate landing site for the drone;
     an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site;
     a landing site determination unit that determines a landing site of the drone based on the drone information and the spatial information;
     a first movement route determination unit that determines a first movement route of the drone from a base of the drone to the approach opening; and
     a second movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a second movement route of the drone from the approach opening to the landing site.
  2.  The movement route determination system according to claim 1, wherein the drone information further includes flight accuracy of the drone.
  3.  The movement route determination system according to claim 1 or claim 2, wherein
     the approach opening determination unit determines, based on the drone information and the spatial information, accessible openings into which the drone can enter,
     the movement route determination system further comprises:
     an accessible opening display unit that displays the accessible openings on a screen; and
     a first selection information acquisition unit that acquires selection information of an accessible opening by a user, and
     the approach opening determination unit further determines the approach opening based on the selection information of the accessible opening.
  4.  The movement route determination system according to claim 3, wherein
     the approach opening determination unit further determines, based on the drone information and the spatial information, inaccessible openings into which the drone cannot enter, and
     the accessible opening display unit displays the accessible openings and the inaccessible openings in a distinguishable manner.
  5.  The movement route determination system according to any one of claims 1 to 4, wherein
     the landing site determination unit determines, based on the drone information and the spatial information, possible landing sites where the drone can land,
     the movement route determination system further comprises:
     a possible landing site display unit that displays the possible landing sites on a screen; and
     a second selection information acquisition unit that acquires selection information of a possible landing site by a user, and
     the landing site determination unit further determines the landing site based on the selection information of the possible landing site.
  6.  The movement route determination system according to claim 5, wherein the possible landing site display unit further displays an external shape of the drone on the screen based on the drone information.
  7.  The movement route determination system according to any one of claims 1 to 6, further comprising a movement route display unit that displays the second movement route on a screen.
  8.  The movement route determination system according to any one of claims 1 to 7, further comprising a first notification unit that, when the second movement route determination unit does not determine the second movement route, notifies a user that the second movement route was not determined.
  9.  The movement route determination system according to any one of claims 1 to 8, further comprising a second notification unit that, when the drone could not move along the second movement route, notifies a user that the drone could not move along the second movement route.
  10.  A landing site determination system comprising:
     a position specifying unit that specifies a position of a terminal device;
     a communication unit that communicates with a wireless tag;
     a tag position calculation unit that calculates a relative position of the wireless tag with respect to the terminal device based on communication between the communication unit and the wireless tag; and
     a landing site determination unit that determines the position of the wireless tag as a landing site of a drone based on the position of the terminal device and the relative position of the wireless tag.
  11.  A movement route determination device comprising:
     a drone information acquisition unit that acquires drone information including a size of a drone;
     a spatial information acquisition unit that acquires spatial information including three-dimensional positions of objects existing around a candidate landing site of the drone;
     an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site;
     a landing site determination unit that determines a landing site of the drone based on the drone information and the spatial information; and
     a movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a movement route of the drone from the approach opening to the landing site.
  12.  A drone control device comprising:
     a drone information provision unit that provides drone information including a size of a drone to a terminal device;
     a position information acquisition unit that acquires, from the terminal device, position information of an approach opening, determined based on the drone information, through which the drone enters a building including a landing site;
     a first movement route determination unit that determines a first movement route of the drone from a base of the drone to the approach opening;
     a second movement route acquisition unit that acquires, from the terminal device, a second movement route of the drone from the approach opening to the landing site; and
     a movement instruction transmission unit that transmits a movement instruction to the drone based on the first movement route and the second movement route.
  13.  A computer program for causing a computer to function as:
     a drone information acquisition unit that acquires drone information including a size of a drone;
     a spatial information acquisition unit that acquires spatial information including three-dimensional positions of objects existing around a candidate landing site of the drone;
     an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building including the candidate landing site;
     a landing site determination unit that determines a landing site of the drone based on the drone information and the spatial information; and
     a movement route determination unit that determines, based on the spatial information and the determined approach opening and landing site, a movement route of the drone from the approach opening to the landing site.
  14.  A computer program for causing a computer to function as:
     a position specifying unit that specifies a position of a terminal device;
     a tag position calculation unit that calculates a relative position of a wireless tag with respect to the terminal device based on communication between a communication unit and the wireless tag; and
     a landing site determination unit that determines the position of the wireless tag as a landing site of a drone based on the position of the terminal device and the relative position of the wireless tag.
PCT/JP2023/015737 2022-06-13 2023-04-20 Movement path determination system, landing site determination system, movement path determination device, drone control device, and computer program WO2023243221A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-094867 2022-06-13
JP2022094867 2022-06-13

Publications (1)

Publication Number Publication Date
WO2023243221A1

Family

ID=89190900

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/015737 WO2023243221A1 (en) 2022-06-13 2023-04-20 Movement path determination system, landing site determination system, movement path determination device, drone control device, and computer program

Country Status (1)

Country Link
WO (1) WO2023243221A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004354351A (en) * 2003-05-30 2004-12-16 Sharp Corp Radio wave transmitter search system, cellular phone communication terminal apparatus, and radio wave transmitter search method by radio wave transmitter search system
JP2013519335A (en) * 2010-02-09 2013-05-23 エアロスカウト、リミテッド System and method for processing information related to tags and mobile phone
JP2019016197A (en) * 2017-07-07 2019-01-31 株式会社日立製作所 Moving entity induction system
US20210174301A1 (en) * 2019-12-04 2021-06-10 Wing Aviation Llc Uav balcony deliveries to multi-level buildings


Similar Documents

Publication Publication Date Title
US10354407B2 (en) Camera for locating hidden objects
US10354452B2 (en) Directional and x-ray view techniques for navigation using a mobile device
US10715963B2 (en) Navigation method and device
US20210019854A1 (en) Location Signaling with Respect to an Autonomous Vehicle and a Rider
US10181211B2 (en) Method and apparatus of prompting position of aerial vehicle
WO2020224375A1 (en) Positioning method, apparatus, and device, and computer-readable storage medium
US9162762B1 (en) System and method for controlling a remote aerial device for up-close inspection
US9684305B2 (en) System and method for mobile robot teleoperation
US20190003840A1 (en) Map registration point collection with mobile drone
CN110392908A (en) For generating the electronic equipment and its operating method of map datum
US20190251741A1 (en) System and method of on-site documentation enhancement through augmented reality
JP2022507715A (en) Surveying methods, equipment and devices
WO2023243221A1 (en) Movement path determination system, landing site determination system, movement path determination device, drone control device, and computer program
WO2021212499A1 (en) Target calibration method, apparatus, and system, and remote control terminal of movable platform
TWI750821B (en) Navigation method, system, equipment and medium based on optical communication device
CN115460539A (en) Method, device, medium and program product for acquiring electronic fence
CN109489678B (en) Positioning method and system for monitoring navigation
US20240118703A1 (en) Display apparatus, communication system, display control method, and recording medium
US20240053746A1 (en) Display system, communications system, display control method, and program
US20230205198A1 (en) Information processing apparatus, route generation system, route generating method, and non-transitory recording medium
JP2022146886A (en) Display device, communication system, display control method, and program
WO2023228283A1 (en) Information processing system, movable body, information processing method, and program
WO2022121606A1 (en) Method and system for obtaining identification information of device or user thereof in scenario
WO2023178495A1 (en) Drone, control terminal, server and control method therefor
JP2022146885A (en) Display device, communication system, display control method, and program

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23823524

Country of ref document: EP

Kind code of ref document: A1