WO2023243221A1 - Travel route determination system, landing site determination system, travel route determination device, drone control device, and computer program


Info

Publication number
WO2023243221A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
opening
information
unit
landing
Application number
PCT/JP2023/015737
Other languages
English (en)
Japanese (ja)
Inventor
淳 岩元
Original Assignee
Sumitomo Electric Industries, Ltd.
Application filed by Sumitomo Electric Industries, Ltd.
Publication of WO2023243221A1

Definitions

  • the present disclosure relates to a travel route determination system, a landing site determination system, a travel route determination device, a drone control device, and a computer program.
  • This application claims priority based on Japanese Application No. 2022-94867 filed on June 13, 2022, and incorporates all the contents described in the said Japanese application.
  • Patent Document 1 discloses a method for determining a delivery location of a package using a drone.
  • A movement route determination system includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters; a landing point determination unit that determines a landing point of the drone; a first movement route determination unit that determines a first movement route of the drone from a base to the approach opening; and a second movement route determination unit that determines a second movement route of the drone from the approach opening to the landing point based on the spatial information and the determined approach opening and landing point.
  • FIG. 1 is a diagram showing the configuration of a delivery system according to Embodiment 1 of the present disclosure.
  • FIG. 2 is a block diagram showing the configuration of a terminal device according to Embodiment 1 of the present disclosure.
  • FIG. 3 is a diagram showing an example of an image output from the camera.
  • FIG. 4 is a diagram showing an example of drone information.
  • FIG. 5 is a diagram illustrating an example of opening information.
  • FIG. 6 is a diagram showing an example of travel route information.
  • FIG. 7 is a diagram illustrating an example of display on the touch panel by the accessible opening display unit.
  • FIG. 8 is a diagram illustrating an example of a display on a touch panel by a possible landing point display section.
  • FIG. 9 is a diagram illustrating an example of display on the touch panel by the movement route display section.
  • FIG. 10 is a diagram illustrating an example of display on the touch panel by the movement route display section.
  • FIG. 11 is a block diagram showing the configuration of a server according to Embodiment 1 of the present disclosure.
  • FIG. 12 is a sequence diagram showing the processing procedure of the delivery system according to Embodiment 1 of the present disclosure.
  • FIG. 13 is a diagram showing the configuration of a delivery system according to Embodiment 2 of the present disclosure.
  • FIG. 14 is a block diagram showing the configuration of a terminal device according to Embodiment 2 of the present disclosure.
  • FIG. 15 is a sequence diagram showing the processing procedure of the delivery system according to Embodiment 2 of the present disclosure.
  • FIG. 16 is a diagram showing an example of an image displayed on the touch panel.
  • A helipad on the roof of a building or a marker area about the size of six tatami mats set up on the ground is conventionally used as a drone landing site, and the drone lands by aiming at the mark on the helipad or marker area. It is not practical to set up a helipad or marker area at a private home, and there is no established method for specifying a drone landing site for each private home.
  • The present disclosure has been made in view of such circumstances, and its purpose is to provide a travel route determination system, a landing point determination system, a travel route determination device, a drone control device, and a computer program that can determine the travel route or landing point of a drone in a simple manner.
  • A movement route determination system according to one aspect of the present disclosure includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building that includes the candidate landing site; a landing point determination unit that determines a landing point of the drone based on the drone information and the spatial information; a first movement route determination unit that determines a first movement route of the drone from the drone's base to the approach opening; and a second movement route determination unit that determines a second movement route of the drone from the approach opening to the landing point based on the spatial information and the determined approach opening and landing point.
  • Drone information includes the size of the drone. Therefore, by acquiring the drone information and the spatial information, it is possible to determine the approach opening through which the drone enters from the space around the candidate landing site, and also to determine the landing point of the drone. Once the approach opening and the landing point are determined, the first movement route and the second movement route can be determined. For example, a user can determine the drone's movement route or landing point in a simple manner by acquiring spatial information with a distance sensor built into a smartphone or the like and acquiring drone information from an external server.
  • the drone information may further include flight accuracy of the drone.
  • Depending on its flight accuracy, the drone may come into contact with the area around an opening and be unable to enter the opening.
  • Likewise, the drone may come into contact with the surroundings of the landing site and be unable to land at the landing site.
  • the approach opening and the landing site can be determined in consideration of the flight accuracy of the drone. Therefore, it is possible to determine an entry opening into which the drone can reliably enter and a landing site where the drone can reliably land.
  • The approach opening determination unit may determine, based on the drone information and the spatial information, accessible openings into which the drone can enter, and the movement route determination system may further include an accessible opening display unit that displays the accessible openings on a screen and a first selection information acquisition unit that acquires the user's selection of an accessible opening. The approach opening determination unit may then determine the approach opening based on the selection information of the accessible opening.
  • With this configuration, the user can select any one of a plurality of accessible openings. The user can exclude accessible openings that they do not want the drone to enter, or choose an accessible opening that the drone can enter easily, so an approach opening in line with the user's wishes can be determined.
  • The approach opening determination unit may further determine, based on the drone information and the spatial information, inaccessible openings into which the drone cannot enter, and the accessible opening display unit may display the accessible openings and the inaccessible openings in a distinguishable manner.
  • Inaccessible openings into which the drone cannot enter, such as openings smaller than the size of the drone, can be displayed separately from accessible openings. This allows the user to efficiently select an accessible opening.
  • The landing point determination unit may determine, based on the drone information and the spatial information, possible landing points where the drone can land, and the movement route determination system may further include a possible landing point display unit that displays the possible landing points on a screen and a second selection information acquisition unit that acquires the user's selection of a possible landing point. The landing point determination unit may then determine the landing point based on the selection information of the possible landing point.
  • With this configuration, the user can select any one of the possible landing points. The user can exclude possible landing points on which they do not want the drone to land, or choose a possible landing point on which the drone can land easily, so a landing point in line with the user's wishes can be determined.
  • the possible landing point display unit may further display the outline of the drone on the screen based on the drone information.
  • the travel route determination system may further include a travel route display unit that displays the second travel route on a screen.
  • the travel path of the drone from the approach opening to the landing site can be displayed. Therefore, the user can be made aware of the moving route of the drone, and the user can be warned not to place objects on the moving route.
  • The movement route determination system may further include a first notification unit that, when the second movement route determination unit does not determine the second movement route, notifies the user that the second movement route has not been determined.
  • the user who receives the notification that the second travel route has not been determined can take actions such as changing the landing point.
  • The movement route determination system may further include a second notification unit that notifies the user when the drone was unable to move along the second movement route.
  • the user who receives the notification that the drone could not move on the second movement route can take measures such as removing obstacles on the second movement route or changing the landing site.
  • A landing point determination system includes: a position specifying unit that specifies the position of a terminal device; a communication unit that communicates with a wireless tag; a tag position calculation unit that calculates the relative position of the wireless tag with respect to the terminal device based on the communication between the communication unit and the wireless tag; and a landing point determination unit that determines the position of the wireless tag as the landing point of a drone based on the position of the terminal device and the relative position of the wireless tag.
  • the landing site of the drone can be determined by placing a wireless tag at the landing site and acquiring the position of the terminal device. This allows the landing site of the drone to be determined in a simple manner.
  • A movement route determination device includes: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building that includes the candidate landing site; a landing point determination unit that determines a landing point of the drone based on the drone information and the spatial information; and a movement route determination unit that determines a movement route of the drone from the approach opening to the landing point.
  • Drone information includes the size of the drone. Therefore, by acquiring the drone information and the spatial information, it is possible to determine the approach opening through which the drone enters from the space around the candidate landing site, and also to determine the landing point of the drone. Once the approach opening and the landing point are determined, the movement route of the drone from the approach opening to the landing point can be determined. For example, a user can determine the drone's movement route or landing point in a simple manner by acquiring spatial information with a distance sensor built into a smartphone or the like and acquiring drone information from an external server.
  • A drone control device includes: a drone information providing unit that provides drone information including the size of a drone to a terminal device; a position information acquisition unit that acquires, from the terminal device, position information of an approach opening through which the drone enters a building that includes a landing point; a first movement route determination unit that determines a first movement route of the drone from the drone's base to the approach opening; a second movement route acquisition unit that acquires, from the terminal device, a second movement route of the drone from the approach opening to the landing point; and a movement instruction transmission unit that transmits a movement instruction to the drone based on the second movement route.
  • A computer program causes a computer to function as: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, based on the drone information and the spatial information, an approach opening through which the drone enters a building that includes the candidate landing site; a landing point determination unit that determines a landing point of the drone based on the drone information and the spatial information; and a movement route determination unit that determines a movement route of the drone from the approach opening to the landing point based on the spatial information and the determined approach opening and landing point.
  • the computer can function as the above-mentioned travel route determining device. Therefore, the same operation and effect as the above-described moving route determining device can be achieved.
  • A computer program causes a computer to function as: a position specifying unit that specifies the position of a terminal device; a tag position calculation unit that calculates the relative position of a wireless tag with respect to the terminal device based on communication with the wireless tag by a communication unit; and a landing point determination unit that determines the position of the wireless tag as the landing point of a drone based on the position of the terminal device and the relative position of the wireless tag.
  • the computer can function as the above-mentioned landing site determination system. Therefore, the same operation and effect as the above-mentioned landing site determination system can be achieved.
  • FIG. 1 is a diagram showing the configuration of a delivery system according to Embodiment 1 of the present disclosure.
  • The delivery system 10 is a system for delivering packages from a package delivery base (hereinafter also referred to as a "base"), which is the base of the drone 1, to a delivery destination, and includes the drone 1, the terminal device 2, and the server 5.
  • The drone 1, the terminal device 2, and the server 5 are connected to the network 7 wirelessly or by wire.
  • the network 7 is configured by, for example, a public communication network such as 5G.
  • the drone 1 is an unmanned aircraft that flies from a delivery base to a delivery destination according to a travel route determined by the server 5 and delivers packages to the delivery destination.
  • the drone 1 is equipped with a mechanism for grasping luggage, a position specifying unit for specifying the position of the drone 1, and a camera.
  • the drone 1 can detect obstacles and the like existing in its path based on the images taken by the camera, and flies to the destination while avoiding the obstacles.
  • The terminal device 2 is, for example, a smartphone or a tablet owned by the user, and determines the landing point of the drone 1 at the delivery destination, the approach opening provided at the delivery destination through which the drone 1 enters, and the movement route of the drone 1 from the approach opening to the landing point.
  • In Embodiment 1, it is assumed that the drone 1 delivers a package to the balcony of a building with a roof, such as an apartment or a detached house; that is, the drone 1 passes through an opening of the balcony from the outside, enters the balcony, and lands on the balcony.
  • the balcony does not have to have a roof.
  • the landing site of the drone 1 is not limited to the balcony. Embodiment 1 is also applicable when the drone 1 lands at a location that has an opening to the outside.
  • The server 5 determines the movement route of the drone 1 from the delivery base to the landing point of the drone 1 at the delivery destination based on the information determined by the terminal device 2, and instructs the drone 1 to fly along the determined route.
  • FIG. 2 is a block diagram showing the configuration of the terminal device 2 according to Embodiment 1 of the present disclosure.
  • The terminal device 2 includes a communication unit 21, a camera 22, a sensor 23, a touch panel 24, a speaker 25, a position specifying unit 26, a storage unit 27, a control unit 28, and a bus 29.
  • the communication unit 21 connects the terminal device 2 to the network 7 and performs wireless or wired communication with an external device.
  • the camera 22 photographs the surroundings of the terminal device 2 and outputs an image.
  • FIG. 3 is a diagram showing an example of an image output from the camera 22.
  • FIG. 3 shows an example of an image of a balcony of an apartment, which is captured by the user operating the camera 22.
  • the image includes the floor 101, which is a candidate landing site for the drone 1, and balcony openings 102A and 102B through which the drone 1 enters the apartment including the floor 101.
  • the sensor 23 measures the distance to objects around the terminal device 2 and outputs depth information corresponding to the distance.
  • the depth information of each pixel represents the distance to the target.
  • The sensor 23 is configured by, for example, LiDAR (Light Detection and Ranging, or Laser Imaging Detection and Ranging).
  • Distance measurement by the sensor 23 and photographing by the camera 22 can be performed simultaneously. Thereby, for example, the sensor 23 measures the distances to the objects on the balcony shown in FIG. 3.
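  • As an illustration of this step, the following sketch unprojects a depth map from a sensor like the sensor 23 into camera-frame 3D points using a pinhole model. This is a minimal sketch under stated assumptions: the intrinsics fx, fy, cx, cy and the axis convention are hypothetical and do not come from this publication.

      import numpy as np

      def depth_to_points(depth_m, fx, fy, cx, cy):
          # Pinhole unprojection: pixel (u, v) with depth d maps to
          # X = (u - cx) * d / fx, Y = (v - cy) * d / fy, Z = d
          # (camera frame, Z along the line of sight).
          h, w = depth_m.shape
          u, v = np.meshgrid(np.arange(w), np.arange(h))
          x = (u - cx) * depth_m / fx
          y = (v - cy) * depth_m / fy
          return np.stack([x, y, depth_m], axis=-1)  # shape (h, w, 3)

      # The publication's frame puts the line of sight on the X axis, so a fixed
      # axis permutation of these points would map Z-forward to X-forward.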
  • the touch panel 24 has a function as a display device that displays images, and a function as an input device that accepts input operations by the user.
  • the speaker 25 outputs sound.
  • the position specifying unit 26 specifies the position of the terminal device 2.
  • the position specifying unit 26 specifies the position of the terminal device 2 using satellite navigation.
  • the position specifying unit 26 specifies the position of the terminal device 2 based on radio waves received from a plurality of GPS (Global Positioning System) satellites.
  • the location of the terminal device 2 can be specified by latitude and longitude, for example.
  • Satellite navigation uses a global navigation satellite system (GNSS) such as GPS, but the satellite positioning system is not limited to GPS.
  • The storage unit 27 is composed of a volatile memory element such as SRAM (Static RAM) or DRAM (Dynamic RAM), a nonvolatile memory element such as flash memory or EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic storage device such as a hard disk.
  • the storage unit 27 stores a computer program 27P executed by the control unit 28. Furthermore, the storage unit 27 stores data obtained when the computer program 27P is executed.
  • The control unit 28 is composed of a processor such as a CPU (Central Processing Unit), and includes, as functional processing units realized by executing the computer program 27P, a drone information acquisition unit 30, a spatial information acquisition unit 31, an approach opening determination unit 32, a landing point determination unit 33, a movement route determination unit 34, an accessible opening display unit 35, a selection information acquisition unit 36, a possible landing point display unit 37, a movement route display unit 38, and an error notification unit 39.
  • the drone information acquisition unit 30 acquires drone information including the size of the drone 1 that delivers packages to the user's home and the flight accuracy of the drone 1 from the server 5 via the communication unit 21.
  • FIG. 4 is a diagram showing an example of drone information.
  • the drone information 52D includes drone size (height, width, depth), hovering accuracy (vertical, horizontal), and position accuracy (vertical, horizontal). Hovering accuracy (vertical, horizontal) and positional accuracy (vertical, horizontal) are examples of the flight accuracy of the drone 1.
  • the sizes of the drone 1 in the height direction, width direction, and depth direction are 300 mm, 500 mm, and 500 mm, respectively.
  • the hovering accuracy of the drone 1 in both the vertical direction and the horizontal direction is ⁇ 0.1 m.
  • the positional accuracy of the drone 1 in both the vertical and horizontal directions is ⁇ 50 mm.
  • The positional accuracy of the drone 1 can be improved by a technique called RTK (Real Time Kinematic), which corrects the position information of the drone 1 obtained from a satellite positioning system such as GPS based on correction information from a reference station installed on the ground.
  • the spatial information acquisition unit 31 calculates spatial information including the three-dimensional positions of objects existing around the candidate landing site of the drone 1 based on the depth information output by the sensor 23.
  • The spatial information indicates the three-dimensional positions of objects as coordinates in a coordinate system preset in the terminal device 2 (for example, a right-handed coordinate system with the origin at the lens center of the camera 22 and the line-of-sight direction of the camera 22 as the X axis).
  • The spatial information includes the three-dimensional coordinates of the floor 101, openings 102A and 102B, wall surface 103, window 104, pillar 105, and fence 106 in FIG. 3. Note that since the openings 102A and 102B are empty spaces, the three-dimensional coordinates of the openings 102A and 102B are the coordinates of a point at infinity or indefinite values.
  • The approach opening determination unit 32 determines, based on the drone information 52D acquired by the drone information acquisition unit 30 and the spatial information acquired by the spatial information acquisition unit 31, accessible openings through which the drone 1 can enter the apartment including the floor 101 and inaccessible openings through which the drone 1 cannot enter the apartment including the floor 101.
  • the entrance opening determining unit 32 determines the horizontal and vertical sizes of each of the openings 102A and 102B shown in FIG. 3 based on the spatial information.
  • the approach opening determining unit 32 determines the size of the area that the drone 1 may occupy from the size, hovering accuracy, and positional accuracy of the drone 1 based on the drone information 52D.
  • For each of the openings 102A and 102B, the approach opening determination unit 32 determines the opening to be an accessible opening if the horizontal size of the opening is at least 650 mm, the size the drone 1 may occupy in the width direction, and the vertical size of the opening is at least 450 mm, the size the drone 1 may occupy in the height direction; otherwise, it determines the opening to be an inaccessible opening. For example, the approach opening determination unit 32 determines the opening 102A to be an accessible opening and the opening 102B to be an inaccessible opening.
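  • A minimal sketch of this occupancy-envelope comparison. The margin added to the drone's body size is an assumption inferred from the example figures in this text (500 mm → 650 mm, 300 mm → 450 mm, i.e. hovering accuracy plus positional accuracy per axis); the exact computation of the approach opening determination unit 32 is not spelled out here, and the opening sizes below are hypothetical.

      from dataclasses import dataclass

      @dataclass
      class DroneInfo:
          height_mm: float     # 300 in the example of FIG. 4
          width_mm: float      # 500
          hover_acc_mm: float  # 100 (hovering accuracy +/-0.1 m)
          pos_acc_mm: float    # 50 (positional accuracy +/-50 mm)

      def occupancy_envelope(d):
          # Assumed margin: hovering accuracy + positional accuracy, which
          # reproduces the 650 mm x 450 mm figures in the text.
          margin = d.hover_acc_mm + d.pos_acc_mm
          return d.width_mm + margin, d.height_mm + margin

      def is_accessible(opening_w_mm, opening_h_mm, d):
          need_w, need_h = occupancy_envelope(d)
          return opening_w_mm >= need_w and opening_h_mm >= need_h

      drone = DroneInfo(300, 500, 100, 50)
      print(is_accessible(900, 2000, drone))  # a wide opening like 102A -> True
      print(is_accessible(600, 400, drone))   # a small opening like 102B -> False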
  • The approach opening determination unit 32 determines the accessible opening selected by the user, from among the determined accessible openings, as the approach opening into which the drone 1 will enter.
  • The approach opening determination unit 32 transmits opening information regarding the determined approach opening to the server 5 via the communication unit 21.
  • FIG. 5 is a diagram illustrating an example of opening information.
  • the opening information includes opening coordinates (latitude), opening coordinates (longitude), opening height, and opening approach direction.
  • the opening coordinates (latitude) indicate the latitude of the entrance opening, and are, for example, "YY.YYYYYY” (degrees).
  • the opening coordinates (longitude) indicate the longitude of the entrance opening, and are, for example, "XXX.XXXXXX" (degrees).
  • the opening height indicates the height of the entrance opening, and is, for example, "15 m”.
  • the opening entry direction indicates the direction of the entry opening with respect to the terminal device 2, and is, for example, "275 degrees.” Here, true north is 0 degrees.
  • The approach opening determination unit 32 calculates the opening coordinates (latitude) and opening coordinates (longitude) based on the position of the terminal device 2 specified by the position specifying unit 26 and the three-dimensional position of the approach opening indicated by the spatial information. However, if the three-dimensional position of the approach opening indicates the coordinates of a point at infinity or is indefinite, the approach opening determination unit 32 may calculate the three-dimensional position of the approach opening from the three-dimensional coordinates of objects around it (the pillar 105 and the fence 106 in the example shown in FIG. 3). For example, when the approach opening is the opening 102A, the approach opening determination unit 32 may take the three-dimensional position obtained by extending the surface of the fence 106 upward as the three-dimensional position of the opening 102A, and then calculate the opening coordinates (latitude) and the opening coordinates (longitude) from it.
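  • One way this conversion could be carried out, sketched under stated assumptions: x_m is the distance to the opening along the camera's line of sight (the X axis above), y_m the leftward offset, and heading_deg the compass bearing of the X axis (0 degrees = true north, as in FIG. 5); a flat-earth approximation is used, adequate for offsets of a few tens of meters. None of these conventions are fixed by the publication.

      import math

      EARTH_R = 6_371_000.0  # mean Earth radius in meters

      def opening_lat_lon(term_lat, term_lon, heading_deg, x_m, y_m):
          h = math.radians(heading_deg)
          # Project the camera-frame offset onto north/east axes.
          north_m = x_m * math.cos(h) + y_m * math.sin(h)
          east_m = x_m * math.sin(h) - y_m * math.cos(h)
          dlat = math.degrees(north_m / EARTH_R)
          dlon = math.degrees(east_m / (EARTH_R * math.cos(math.radians(term_lat))))
          return term_lat + dlat, term_lon + dlon

      # e.g. an opening 8 m ahead at bearing 275 degrees from a hypothetical fix:
      # print(opening_lat_lon(35.0, 135.0, 275.0, 8.0, 0.0))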
  • the landing site determination unit 33 determines a possible landing site where the drone 1 can land based on the drone information 52D acquired by the drone information acquisition unit 30 and the spatial information acquired by the spatial information acquisition unit 31.
  • the landing site determining unit 33 determines the possible landing site selected by the user among the determined possible landing sites as the landing site of the drone 1.
  • The movement route determination unit 34 determines the movement route of the drone 1 from the approach opening to the landing point based on the spatial information calculated by the spatial information acquisition unit 31, the information on the approach opening determined by the approach opening determination unit 32, and the information on the landing point determined by the landing point determination unit 33. For example, the movement route determination unit 34 determines a movement route in which the drone 1 enters the balcony through the approach opening at a predetermined elevation angle and, upon reaching a point directly above the landing point, changes course vertically and lands at the landing point. The movement route determination unit 34 transmits the determined movement route information to the server 5 via the communication unit 21. Note that if there is an obstacle or the like between the approach opening and the landing point and the drone 1 cannot move, the movement route determination unit 34 does not determine a movement route and transmits information indicating that a movement route cannot be determined to the server 5 via the communication unit 21.
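  • The route shape just described (enter through the opening at a predetermined elevation angle, then descend vertically from directly above the landing point) can be sketched as waypoint generation. Assumptions for illustration: coordinates are (x, y, z) in the terminal's frame with z taken as the vertical axis, the opening is higher than the landing point, and elevation_deg and the waypoint count n are hypothetical parameters; obstacle checking is omitted.

      import math

      def route_opening_to_landing(opening, landing, elevation_deg=10.0, n=5):
          ox, oy, oz = opening
          lx, ly, lz = landing
          horiz = math.hypot(lx - ox, ly - oy)
          drop = horiz * math.tan(math.radians(elevation_deg))
          pts = []
          # Straight descent from the opening to directly above the landing site.
          for i in range(1, n + 1):
              t = i / n
              pts.append((ox + t * (lx - ox), oy + t * (ly - oy), oz - t * drop))
          # Vertical descent onto the landing point.
          pts.append((lx, ly, lz))
          return pts

      # e.g. opening 1.2 m above a landing point 3 m inside the balcony:
      # print(route_opening_to_landing((0.0, 0.0, 1.2), (3.0, 0.0, 0.0)))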
  • FIG. 6 is a diagram showing an example of travel route information.
  • the travel route information includes travel route coordinates (1) to (n), landing point coordinates, and reference coordinate orientation.
  • Movement route coordinates (1) to (n) are the coordinates of n points that the drone 1 passes between the approach opening and the landing site.
  • the landing point coordinates are the coordinates of the point where the drone 1 lands.
  • For example, when considering a right-handed coordinate system in which the origin is the lens center of the camera 22 and the X axis is the line-of-sight direction of the camera 22, the drone 1 moves through coordinates (X1, Y1, Z1), ..., (Xn, Yn, Zn) in order and lands at coordinates (XL, YL, ZL).
  • the reference coordinate orientation indicates the orientation in the X-axis direction in the coordinate system, and is, for example, "0 degree". Here, true north is 0 degrees.
  • the accessible opening display unit 35 displays on the touch panel 24 the accessible openings and non-accessible openings determined by the accessible opening determining unit 32 in a distinguishable manner.
  • For example, the accessible opening display unit 35 may superimpose frames of different colors on the image taken by the camera 22 to distinguish accessible openings from inaccessible openings, or may display the inaccessible openings with differently marked frames.
  • FIG. 7 is a diagram showing an example of a display on the touch panel 24 by the accessible opening display section 35.
  • Frames 111 and 112 are displayed on the touch panel 24, superimposed on the balcony image taken by the camera 22 shown in FIG.
  • A frame 111 highlights the opening 102A, and a frame 112 highlights the opening 102B.
  • the frame 111 is not marked with a cross, which indicates that the opening 102A is an enterable opening.
  • the frame 112 is marked with a cross, indicating that the opening 102B is an inaccessible opening.
  • The selection information acquisition unit 36 acquires information on the accessible opening selected by the user from among the accessible openings displayed on the touch panel 24 by the accessible opening display unit 35. Specifically, the selection information acquisition unit 36 acquires from the touch panel 24 information on the position touched by the user on the screen, and identifies the accessible opening selected by the user based on the acquired position information and the display positions of the accessible openings on the screen. In the example shown in FIG. 7, the only accessible opening is the opening 102A, so the user selects the opening 102A.
  • the possible landing site display section 37 displays the possible landing site determined by the landing site determining section 33 on the touch panel 24.
  • the possible landing spot display section 37 may display an area including a possible landing spot superimposed on the image taken by the camera 22 in a different color from other areas.
  • the possible landing spot display section 37 displays the outline of the drone 1 on the touch panel 24 based on the drone information 52D.
  • the possible landing spot display section 37 displays a rectangular area that the drone 1 may occupy as the outline of the drone 1 based on the size, hovering accuracy, and positional accuracy of the drone 1.
  • FIG. 8 is a diagram showing an example of a display on the touch panel 24 by the possible landing point display section 37.
  • a surface 121 and a rectangular frame 131 are displayed on the touch panel 24, superimposed on the balcony image taken by the camera 22 shown in FIG.
  • the surface 121 is a surface that includes a possible landing site for the drone 1, and is, for example, a blue surface.
  • the rectangular frame 131 three-dimensionally indicates an area that the drone 1 may occupy.
  • The selection information acquisition unit 36 further acquires information on the possible landing site selected by the user from among the possible landing sites displayed on the touch panel 24 by the possible landing site display unit 37. Specifically, in the example of FIG. 8, the user drags the rectangular frame 131 on the touch panel 24 and releases (drops) it at the desired landing position for the drone 1. Note that movement of the rectangular frame 131 by dragging is restricted so that the rectangular frame 131 can be moved only within the surface 121 that includes the possible landing sites.
  • The selection information acquisition unit 36 acquires from the touch panel 24 information on the position where the user dropped the frame, and identifies the possible landing site selected by the user based on the acquired position information and the display positions of the possible landing sites on the screen.
  • the movement route display unit 38 causes the touch panel 24 to display the movement route of the drone 1 from the approach opening to the landing site determined by the movement route determination unit 34, superimposed on the image taken by the camera 22.
  • FIGS. 9 and 10 are diagrams showing examples of display on the touch panel 24 by the movement route display unit 38.
  • As shown in FIG. 9, while the movement route is being calculated, the movement route display unit 38 displays a message "route calculation in progress" superimposed on the balcony image taken by the camera 22.
  • As shown in FIG. 10, once the movement route is determined, the movement route display unit 38 displays the movement route 141 superimposed on the balcony image taken by the camera 22.
  • the movement route 141 is a dashed line indicating an area that the drone 1 may occupy when moving from the opening 102A to the landing site.
  • When the movement route determination unit 34 cannot determine a movement route, the error notification unit 39 notifies the user of this by displaying it on the touch panel 24 or outputting sound from the speaker 25.
  • When the error notification unit 39 receives a notification from the server 5 or the drone 1 that the drone 1 could not land at the landing site due to an obstacle on the movement route of the drone 1, it notifies the user by displaying the notification content on the touch panel 24 or outputting it audibly from the speaker 25.
  • FIG. 11 is a block diagram showing the configuration of the server 5 according to Embodiment 1 of the present disclosure.
  • the server 5 includes a communication section 51, a storage section 52, a control section 53, and a bus 54.
  • the communication unit 51 connects the server 5 to the network 7 and performs wireless or wired communication with an external device.
  • the storage unit 52 is composed of a volatile memory element such as SRAM or DRAM, a non-volatile memory element such as flash memory or EEPROM, or a magnetic storage device such as a hard disk.
  • the storage unit 52 stores a computer program 52P executed by the control unit 53 and the above-mentioned drone information 52D. Furthermore, the storage unit 52 stores data obtained when the computer program 52P is executed.
  • The control unit 53 is composed of a processor such as a CPU, and includes, as functional processing units realized by executing the computer program 52P, a drone information providing unit 55, a landing point information acquisition unit 56, a movement route determination unit 57, and an error notification unit 58.
  • The drone information providing unit 55 reads from the storage unit 52 the drone information 52D of the drone that delivers packages to the home of the user of the terminal device 2, and transmits the read drone information 52D to the terminal device 2 via the communication unit 51.
  • the landing point information acquisition unit 56 receives opening information and travel route information from the terminal device 2 via the communication unit 51.
  • the moving route determination unit 57 determines the moving route of the drone 1 from a predetermined base of the drone 1 to the opening based on the opening information.
  • the movement route can be determined using a known technique. For example, a travel route may be determined in which the travel distance of the drone 1 is the minimum, or a travel route in which the travel time of the drone 1 is the minimum may be determined.
  • The movement route determination unit 57 determines the movement route of the drone 1 from the base through the approach opening to the landing point based on the determined movement route from the base to the approach opening and the movement route information received from the terminal device 2.
  • The movement route determination unit 57 converts the coordinate system of the movement route information to match the coordinate system of the drone 1 according to the reference coordinate orientation indicated in the movement route information received from the terminal device 2.
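  • A minimal sketch of such a conversion, assuming the route coordinates are (x, y, z) with x along the camera's line of sight, y to the left, z vertical, and the reference coordinate orientation giving the compass bearing of the x axis; rotating each point yields north/east coordinates that a drone navigating by compass heading can use. These axis conventions are assumptions, not specified in this publication.

      import math

      def to_north_east(points, ref_orientation_deg):
          h = math.radians(ref_orientation_deg)
          out = []
          for x, y, z in points:
              # Same projection as for the opening coordinates above.
              north = x * math.cos(h) + y * math.sin(h)
              east = x * math.sin(h) - y * math.cos(h)
              out.append((north, east, z))
          return out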
  • The movement route determination unit 57 transmits a movement instruction to the drone 1 by transmitting information on the determined movement route from the base to the landing point to the drone 1 via the communication unit 51.
  • the drone 1 flies from the base to the landing point according to the moving route determined by the moving route determining unit 57, and delivers the luggage to the landing point.
  • The error notification unit 58 receives, via the communication unit 51, error information transmitted from the drone 1 when the drone 1 cannot fly along the movement route. If the error information indicates that the drone 1 could not land at the landing site due to an obstacle between the approach opening and the landing point, the error notification unit 58 transmits a notification to that effect to the terminal device 2 via the communication unit 51.
  • FIG. 12 is a sequence diagram showing the processing procedure of the delivery system 10 according to the first embodiment of the present disclosure.
  • the server 5 transmits drone information 52D as shown in FIG. 4 to the terminal device 2, and the terminal device 2 receives the drone information 52D (step S1).
  • the camera 22 of the terminal device 2 captures an image of the balcony including the candidate landing site for the drone 1 as shown in FIG. 3 (step S2). Note that if the area including the landing candidate site does not fit within the photographing area of the camera 22, the user may pan the camera 22 to photograph a panoramic image wider than the photographing area.
  • The terminal device 2 also measures, with the sensor 23, the distances to the objects on the balcony shown in FIG. 3 and outputs depth information (step S3).
  • the terminal device 2 calculates spatial information around the candidate landing site, including the three-dimensional positions of objects existing around the candidate landing site (step S4).
  • Based on the drone information 52D and the spatial information, the terminal device 2 determines accessible openings that the drone 1 can enter and inaccessible openings that the drone 1 cannot enter (step S5).
  • The terminal device 2 causes the touch panel 24 to display an image as shown in FIG. 7 (step S6). The opening 102A, which is an accessible opening, and the opening 102B, which is an inaccessible opening, are thereby displayed in a distinguishable manner. Note that the terminal device 2 also displays a message "Please specify the drone entrance" superimposed on the image.
  • the user selects, on the touch panel 24, the opening into which the drone 1 can enter, and the terminal device 2 acquires information on the selected opening (step S7).
  • For example, the terminal device 2 acquires information on the selected opening 102A (e.g., the position of the opening 102A on the image).
  • The terminal device 2 determines the accessible opening selected by the user as the approach opening into which the drone 1 will enter (step S8). For example, the terminal device 2 determines the selected opening 102A as the approach opening.
  • The terminal device 2 determines possible landing sites where the drone 1 can land based on the drone information 52D and the spatial information (step S9).
  • the terminal device 2 causes the touch panel 24 to display an image including a surface 121 including a possible landing site for the drone 1 and a rectangular frame 131 indicating an area that the drone 1 may occupy ( Step S10). Note that the terminal device 2 displays a message "Please specify the landing site" superimposed on the image.
  • the user selects a landing spot for the drone 1 by dragging and dropping the rectangular frame 131 on the touch panel 24, and the terminal device 2 specifies the selected landing spot for the drone 1 (step S11).
  • As shown in FIG. 8, when the user drops the rectangular frame 131 at a desired position, the position in real space corresponding to the position of the rectangular frame 131 on the image is specified as the possible landing site.
  • the terminal device 2 determines the possible landing site selected by the user and identified by the terminal device 2 as the landing site for the drone 1 (step S12).
  • The terminal device 2 displays on the touch panel 24 an image indicating that the movement route of the drone 1 is being calculated, as shown in FIG. 9 (step S13).
  • the terminal device 2 determines the movement route of the drone 1 from the approach opening to the landing site based on the spatial information, the approach opening information, and the landing site information.
  • the terminal device 2 displays an image showing the movement route 141 of the drone 1 as shown in FIG. 10 on the touch panel 24 (step S14). Note that if the moving route is not found, the terminal device 2 causes the touch panel 24 to display an image indicating that the moving route is not found, and cancels the subsequent processing.
  • The terminal device 2 transmits to the server 5 the opening information (FIG. 5) indicating the position of the determined approach opening and the movement route information (FIG. 6) indicating the determined movement route, and the server 5 receives the opening information and the movement route information (step S15).
  • the server 5 determines the travel route of the drone 1 from the package delivery base to the opening based on the opening information. Furthermore, the server 5 determines a travel route from the delivery base to the landing point through the opening, based on the determined travel route and the travel route information received from the terminal device 2 (step S16).
  • the server 5 transmits travel route information indicating the travel route from the delivery base to the landing point to the drone 1, and the drone 1 receives the travel route information (step S17).
  • the drone 1 delivers the package to the landing site by flying from the package delivery base through the opening to the landing site according to the received movement route information (step S18).
  • If the drone 1 cannot deliver the package due to an obstacle or the like during flight, it returns to the base and transmits error information, including information on the point at which it turned back, to the server 5.
  • the server 5 receives the error information transmitted from the drone 1 (step S19).
  • If the drone 1 turned back at a point between the approach opening and the landing point, the server 5 transmits to the terminal device 2 error information indicating that there is an obstacle on the movement route of the drone 1 and that the drone 1 could not land at the landing point, and the terminal device 2 receives the error information (step S20).
  • Based on the error information, the terminal device 2 notifies the user that the drone 1 was unable to land at the landing point due to an obstacle on its movement route (step S21).
  • As described above, the drone information 52D includes the size of the drone 1. Therefore, by acquiring the drone information 52D and the spatial information, the terminal device 2 can determine the approach opening through which the drone 1 enters from the space around the candidate landing site, and can also determine the landing point of the drone 1. Once the approach opening and the landing point are determined, the movement route of the drone 1 from the base to the approach opening and the movement route from the approach opening to the landing point can be determined. For example, the user acquires spatial information using a distance sensor built into the terminal device 2 such as a smartphone, and the terminal device 2 acquires the drone information 52D from the external server 5, so the movement route or landing point of the drone 1 can be determined in a simple manner.
  • The approach opening and the landing site can be determined in consideration of the hovering accuracy and positional accuracy of the drone 1. Therefore, it is possible to determine an approach opening into which the drone 1 can reliably enter and a landing site at which the drone 1 can reliably land.
  • Furthermore, the user can select any one of a plurality of accessible openings, excluding accessible openings that the user does not want the drone 1 to enter or choosing an accessible opening that the drone 1 can enter easily, so the approach opening can be determined according to the wishes of the user.
  • Inaccessible openings into which the drone 1 cannot enter, such as openings smaller than the size of the drone 1, can be displayed separately from accessible openings, allowing the user to efficiently select an accessible opening.
  • The user can also select one of the possible landing sites, so a landing site according to the user's wishes can be determined.
  • a rectangular frame 131 indicating an area that the drone 1 may occupy is displayed on the touch panel 24.
  • the user can select a possible landing site in consideration of the external shape of the drone 1. Therefore, it is possible to reliably determine a landing site where the drone 1 can land.
  • the movement route 141 of the drone 1 from the approach opening to the landing site can be displayed. Therefore, the user can be made aware of the moving route of the drone 1, and the user can be warned not to place anything on the moving route.
  • When the terminal device 2 cannot determine the movement route of the drone 1 from the approach opening to the landing point, it notifies the user to that effect. This allows the user to take measures such as changing the landing site.
  • the terminal device 2 notifies the user if the drone 1 cannot fly the travel route from the approach opening to the landing site. This allows the user to take measures such as removing obstacles on the travel route or changing the landing site.
  • The server 5 also acquires from the terminal device 2 the position information of the approach opening determined based on the drone information 52D and the movement route of the drone 1 from the approach opening to the landing point, and determines the movement route of the drone 1 from the base to the landing point.
  • the server 5 can transmit a movement instruction to the drone 1 based on the determined movement route. Thereby, the drone 1 can be controlled based on the movement route or landing point of the drone 1 determined by a simple method.
  • FIG. 13 is a diagram showing the configuration of a delivery system according to Embodiment 2 of the present disclosure.
  • The delivery system 10 is a system for delivering packages from a delivery base to a delivery destination, and includes the drone 1, the terminal device 2, the smart tag 8, and the server 5. As in Embodiment 1, the drone 1, the terminal device 2, and the server 5 are connected to the network 7 wirelessly or by wire. The drone 1 and the server 5 are similar to those described in Embodiment 1.
  • the smart tag 8 is a wireless tag that is wirelessly connected to the terminal device 2.
  • the smart tag 8 and the terminal device 2 are connected according to Bluetooth (registered trademark), which is a wireless communication standard. Note that it is assumed that the version of Bluetooth (registered trademark) in the second embodiment is 5.1 or later. Bluetooth® from version 5.1 onwards includes direction finding functionality. The user places the smart tag 8 at the landing site of the drone 1.
  • the terminal device 2 is, for example, a smartphone or a tablet terminal device, and is paired with the smart tag 8 and calculates the direction of the smart tag 8 and the distance to the smart tag 8 by communicating wirelessly.
  • the terminal device 2 determines the position of the smart tag 8 as the landing point of the drone 1 based on the position of the own device and the calculated direction and distance of the smart tag 8.
  • FIG. 14 is a block diagram showing the configuration of the terminal device 2 according to Embodiment 2 of the present disclosure.
  • The terminal device 2 includes a communication unit 21, a camera 22, a touch panel 24, a speaker 25, a position specifying unit 26, a storage unit 27, a control unit 28, and a bus 29.
  • the communication unit 21 connects the terminal device 2 to the network 7 and performs wireless or wired communication with an external device.
  • the communication unit 21 is further connected to the smart tag 8 according to Bluetooth (registered trademark).
  • The control unit 28 includes, as functional processing units realized by executing the computer program 27P stored in the storage unit 27, a drone information acquisition unit 30, a drone area display unit 41, a tag position calculation unit 42, a landing point determination unit 43, and an error notification unit 39.
  • the drone area display unit 41 superimposes an area that may be occupied by the drone 1 on the image taken by the camera 22 and displays it on the touch panel 24.
  • The tag position calculation unit 42 calculates the relative position of the smart tag 8 with respect to the terminal device 2 by calculating the relative direction of, and distance to, the smart tag 8 based on the communication between the communication unit 21 and the smart tag 8. For example, the communication unit 21 may receive a signal transmitted from the smart tag 8 while switching between a plurality of antennas provided in the terminal device 2, and the tag position calculation unit 42 may calculate a phase difference from IQ samples of the received signal and calculate the relative position of the smart tag 8 from the phase difference. Alternatively, the communication unit 21 may receive, with one antenna, a signal transmitted while switching between a plurality of antennas provided in the smart tag 8, and the tag position calculation unit 42 may likewise calculate the relative position of the smart tag 8 from the phase difference of the IQ samples. Further, when the communication unit 21 performs ultra-wideband (UWB) wireless communication with the smart tag 8, the tag position calculation unit 42 may calculate the relative position of the smart tag 8 by short-range search using UWB.
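  • For the antenna-switching case above, the core of direction finding is estimating the angle of arrival from the phase difference between IQ samples at two antennas. Below is a minimal sketch under the narrowband model (phase difference = 2·pi·d·sin(theta)/lambda); a real Bluetooth 5.1 direction-finding stack also handles CTE slot timing, antenna switching patterns, and calibration, all omitted here, and the sample values are hypothetical.

      import cmath
      import math

      def angle_of_arrival_deg(iq_a, iq_b, spacing_m, wavelength_m=0.125):
          # Phase difference between the two antennas' IQ samples (radians).
          dphi = cmath.phase(iq_b * iq_a.conjugate())
          # Invert dphi = 2*pi*spacing*sin(theta)/lambda; 0.125 m is roughly
          # the wavelength in the 2.4 GHz band used by Bluetooth.
          s = dphi * wavelength_m / (2 * math.pi * spacing_m)
          return math.degrees(math.asin(max(-1.0, min(1.0, s))))

      print(angle_of_arrival_deg(complex(1.0, 0.0), complex(0.84, 0.54), 0.04))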
  • The landing point determination unit 43 determines the position of the smart tag 8 as the landing point of the drone 1 based on the position of the terminal device 2 specified by the position specifying unit 26 and the relative position of the smart tag 8 calculated by the tag position calculation unit 42. In other words, the landing point determination unit 43 calculates, from the relative position of the smart tag 8 with respect to the terminal device 2, the direction of the smart tag 8 as seen from the terminal device 2 and the distance from the terminal device 2 to the smart tag 8. The landing point determination unit 43 then determines the position of the smart tag 8 by moving the position of the terminal device 2 by the calculated distance in the calculated direction, and determines that position as the landing point of the drone 1. The landing point determination unit 43 transmits information indicating the determined landing point (for example, latitude and longitude) to the server 5 via the communication unit 21.
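  • The offset step described above, as a short sketch: move the terminal's satellite-navigation fix by the computed distance along the computed bearing to obtain the smart tag's latitude and longitude. The flat-earth approximation and the bearing convention (degrees clockwise from true north) are assumptions, adequate for the short ranges involved.

      import math

      EARTH_R = 6_371_000.0  # mean Earth radius in meters

      def tag_lat_lon(term_lat, term_lon, bearing_deg, distance_m):
          b = math.radians(bearing_deg)
          dlat = math.degrees(distance_m * math.cos(b) / EARTH_R)
          dlon = math.degrees(distance_m * math.sin(b) /
                              (EARTH_R * math.cos(math.radians(term_lat))))
          return term_lat + dlat, term_lon + dlon

      # e.g. a tag 5 m northeast of a hypothetical terminal fix:
      # print(tag_lat_lon(35.0, 135.0, 45.0, 5.0))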
  • When the error notification unit 39 receives a notification from the server 5 or the drone 1 that the drone 1 could not land at the landing point, it notifies the user by displaying the notification content on the touch panel 24 or outputting it audibly from the speaker 25.
  • FIG. 15 is a sequence diagram showing the processing procedure of the delivery system 10 according to the second embodiment of the present disclosure. Pairing is performed between the smart tag 8 and the terminal device 2, and the two are connected (step S31).
  • the server 5 transmits drone information 52D as shown in FIG. 4 to the terminal device 2, and the terminal device 2 receives the drone information 52D (step S32).
  • the user installs the smart tag 8 at the landing site of the drone 1, operates the camera 22 of the terminal device 2, and photographs the area including the smart tag 8 (step S33).
  • the terminal device 2 superimposes the area that may be occupied by the drone 1 on the image taken by the camera 22 and displays it on the touch panel 24 (step S34).
  • FIG. 16 is a diagram showing an example of an image displayed on the touch panel 24.
  • the touch panel 24 displays an image in which a rectangular frame 131 indicating an area that may be occupied by the drone 1 is superimposed on the image of the area including the smart tag 8 captured by the camera 22.
  • the terminal device 2 specifies the position of the terminal device 2 using satellite navigation (step S35).
  • the terminal device 2 calculates the relative position of the smart tag 8 with respect to the terminal device 2 based on the communication with the smart tag 8 by the communication unit 21 (step S36).
  • the terminal device 2 determines the absolute position of the smart tag 8 as the landing point of the drone 1 (step S37).
  • the landing site of the drone 1 is indicated by latitude and longitude, for example.
  • the terminal device 2 transmits landing site information indicating the landing site of the drone 1 to the server 5, and the server 5 receives the landing site information (step S38).
  • the server 5 determines the travel route of the drone 1 from the cargo delivery base to the landing point indicated by the landing point information (step S39).
  • the movement route can be determined using a known technique.
  • The server 5 transmits travel route information indicating the travel route from the delivery base to the landing point to the drone 1, and the drone 1 receives the travel route information (step S40).
  • The drone 1 delivers the package to the landing point by flying from the package delivery base to the landing point according to the received travel route information (step S41).
  • If the drone 1 is unable to deliver the package during flight because of an obstacle or the like, it returns to the base and transmits error information, including information about the point at which it turned back, to the server 5.
  • The server 5 receives the error information transmitted from the drone 1 (step S42).
  • If, based on the received error information, the drone 1 turned back near the landing point (for example, at a point within a predetermined distance from the landing point), the server 5 transmits error information indicating this to the terminal device 2, and the terminal device 2 receives the error information (step S43).
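  • The "within a predetermined distance" test could, for example, use great-circle distance. The sketch below assumes a haversine computation and an illustrative 50 m threshold, neither of which is specified in this document.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two latitude/longitude points.
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2 +
         math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def returned_near_landing_point(return_pos, landing_pos, threshold_m=50.0):
    # True if the drone's turn-back point lies within the predetermined
    # distance of the landing point (the threshold value is an assumption).
    return haversine_m(*return_pos, *landing_pos) <= threshold_m
```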
  • Based on the error information, the terminal device 2 notifies the user that the drone 1 turned back near the landing point and was unable to land there (step S44).
  • As described above, the landing point of the drone 1 can be determined simply by placing the smart tag 8 at the landing point and acquiring the position of the terminal device 2. The landing point of the drone 1 can therefore be determined by a simple method.
  • Some or all of the components constituting each of the above devices may be implemented by one or more semiconductor devices such as a system LSI.
  • The computer programs 27P and 52P described above may be recorded on a computer-readable non-transitory recording medium, such as an HDD, CD-ROM, or semiconductor memory, and distributed. The computer programs 27P and 52P may also be transmitted and distributed via telecommunication lines, wireless or wired communication lines, networks typified by the Internet, data broadcasting, and the like. Each of the above devices may also be realized by multiple computers or multiple processors.
  • Each of the above devices may be provided by cloud computing; in other words, some or all of the functions of each device may be realized by a cloud server.
  • At least parts of the above embodiments may be combined in any way.
  • Reference signs list:
    1 Drone
    2 Terminal device (landing point determination system, movement route determination device)
    5 Server (drone control device)
    7 Network
    8 Smart tag
    10 Delivery system (travel route determination system)
    21 Communication unit
    22 Camera
    23 Sensor
    24 Touch panel
    25 Speaker
    26 Position specifying unit
    27 Storage unit
    27P Computer program
    28 Control unit
    29 Bus
    30 Drone information acquisition unit
    31 Spatial information acquisition unit
    32 Approach opening determination unit
    33 Landing point determination unit
    34 Travel route determination unit (second movement route determination unit)
    35 Accessible opening display unit
    36 Selection information acquisition unit (first selection information acquisition unit, second selection information acquisition unit)
    37 Possible landing point display unit
    38 Travel route display unit
    39 Error notification unit (first notification unit, second notification unit)
    41 Drone area display unit
    42 Tag position calculation unit
    43 Landing point determination unit
    51 Communication unit
    52 Storage unit
    52D Drone information
    52P Computer program
    53 Control unit
    54 Bus
    55 Drone information provision unit
    56 Landing point information acquisition unit (position information acquisition unit, second movement route acquisition unit)
    57 Movement route determination unit (first movement route determination unit, movement instruction transmission unit)
    58 Error notification unit
    101 Floor section
    102A Opening section
    102B

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A travel route determination system comprising: a drone information acquisition unit that acquires drone information including the size of a drone; a spatial information acquisition unit that acquires spatial information including the three-dimensional positions of objects existing around a candidate landing site for the drone; an approach opening determination unit that determines, on the basis of the drone information and the spatial information, an approach opening through which the drone enters a building that contains the candidate landing site; a landing site determination unit that determines a landing site for the drone on the basis of the drone information and the spatial information; a first travel route determination unit that determines a first travel route of the drone from a drone base to the approach opening; and a second travel route determination unit that determines a second travel route of the drone from the approach opening to the landing site on the basis of the spatial information and the determined approach opening and landing site.
PCT/JP2023/015737 2022-06-13 2023-04-20 Système de détermination de trajet de déplacement, système de détermination de site d'atterrissage, dispositif de détermination de trajet de déplacement, dispositif de commande de drone et programme informatique WO2023243221A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-094867 2022-06-13
JP2022094867 2022-06-13

Publications (1)

Publication Number Publication Date
WO2023243221A1 true WO2023243221A1 (fr) 2023-12-21

Family

ID=89190900

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/015737 WO2023243221A1 (fr) 2022-06-13 2023-04-20 Système de détermination de trajet de déplacement, système de détermination de site d'atterrissage, dispositif de détermination de trajet de déplacement, dispositif de commande de drone et programme informatique

Country Status (1)

Country Link
WO (1) WO2023243221A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004354351A (ja) * 2003-05-30 2004-12-16 Sharp Corp 電波発信器探索装置、携帯電話通信端末装置、及び電波発信器探索装置による電波発信器探索方法
JP2013519335A (ja) * 2010-02-09 2013-05-23 エアロスカウト、リミテッド タグに関連する情報を処理するシステム及び方法並びに携帯電話
JP2019016197A (ja) * 2017-07-07 2019-01-31 株式会社日立製作所 移動体誘導システム
US20210174301A1 (en) * 2019-12-04 2021-06-10 Wing Aviation Llc Uav balcony deliveries to multi-level buildings

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23823524

Country of ref document: EP

Kind code of ref document: A1