WO2017161563A1 - Method and apparatus for controlling aircraft - Google Patents
Method and apparatus for controlling aircraft
- Publication number
- WO2017161563A1 WO2017161563A1 PCT/CN2016/077351 CN2016077351W WO2017161563A1 WO 2017161563 A1 WO2017161563 A1 WO 2017161563A1 CN 2016077351 W CN2016077351 W CN 2016077351W WO 2017161563 A1 WO2017161563 A1 WO 2017161563A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- aircraft
- determining
- vertical
- drone
- detection plane
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D25/00—Emergency apparatus or devices, not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D1/00—Dropping, ejecting, releasing, or receiving articles, liquids, or the like, in flight
- B64D1/02—Dropping, ejecting, or releasing articles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/933—Lidar systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
- G08G5/0069—Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/04—Anti-collision systems
- G08G5/045—Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
Definitions
- the present invention relates to the field of aircraft control technology, and more particularly to a method and apparatus for controlling an aircraft.
- Unmanned aerial vehicles (UAVs), commonly known as drones, are unmanned aircraft operated by radio remote-control equipment and self-contained program control devices. A drone has no cockpit, but it carries a navigation flight control system, program control devices, and power and power-supply equipment. Personnel at a ground remote-control and telemetry station track, locate, remotely control, telemeter, and exchange digital data with the drone through data links and other equipment. Compared with manned aircraft, drones are small, inexpensive, convenient to use, and adaptable to various flight environments, so they are widely used in aerial remote sensing, meteorological research, agricultural aviation, pest control, and warfare.
- An aircraft such as a drone may fall during flight due to mechanical failure, collision with other objects, or other causes, and the fall may cause casualties and property loss to passers-by or vehicles below. Therefore, as aircraft represented by drones come into wide use, controlling an aircraft, especially when it is falling, becomes an urgent problem to be solved.
- In the related art, the loss caused by the fall of an aircraft is reduced by preventing the fall from occurring, but this approach cannot control the aircraft once the fall has occurred.
- To address this, the embodiments of the present invention provide a control method and device for an aircraft.
- an embodiment of the present invention provides a method for controlling an aircraft, the method comprising:
- determining a horizontal speed v_horizontal and a vertical speed v_vertical of the aircraft; acquiring, in the falling direction of the aircraft, an object whose distance from the aircraft is not greater than a preset distance L; predicting, according to v_horizontal, v_vertical, and L, the positional relationship between the aircraft and the object after the aircraft flies the distance L; and, if the positional relationship satisfies a preset relationship, taking preset control measures to control the aircraft.
- Before the determining of v_horizontal and v_vertical of the aircraft, the method further includes: determining that the aircraft is falling.
- Predicting, according to v_horizontal, v_vertical, and L, the positional relationship between the aircraft and the object after the flight L includes:
- determining a first projection position of the aircraft in a detection plane, and determining a scanning position of the object in the detection plane, where the detection plane is at a distance L from the drone and is perpendicular to the direction of motion of the drone;
- predicting, according to the first projection position, v_horizontal, v_vertical, and L, a second projection position of the aircraft in the detection plane after the flight L;
- determining the positional relationship between the second projection position and the scanning position as the positional relationship between the aircraft and the object after the flight L.
- The aircraft is equipped with a depth-of-field sensor, and the detection direction of the depth-of-field sensor is consistent with the direction of motion of the aircraft;
- acquiring, in the direction of motion of the aircraft, an object whose distance from the aircraft is not greater than the preset distance L then includes: acquiring an object detected by the depth-of-field sensor with L as the depth of field.
- Determining the first projection position of the aircraft in the detection plane comprises: acquiring a three-dimensional size of the aircraft; determining an angle between the depth-of-field sensor and an initial direction of the aircraft; projecting the aircraft into the detection plane according to the three-dimensional size and the angle;
- and determining the projection position of the aircraft in the detection plane as the first projection position.
- Predicting, according to the first projection position, v_horizontal, v_vertical, and L, the second projection position of the aircraft in the detection plane after the flight L includes: predicting, according to v_horizontal, v_vertical, and L, a distance s that the aircraft moves longitudinally in the detection plane after the flight L;
- and determining the position obtained by longitudinally moving the first projection position by the distance s as the second projection position.
- Predicting, according to v_horizontal, v_vertical, and L, the distance s that the aircraft moves longitudinally in the detection plane after the flight L includes: predicting s according to a preset formula.
- the preset control measure is: ejecting the airbag, or disassembling the aircraft.
- an embodiment of the present invention provides a control device for an aircraft, the device comprising:
- a first determining module configured to determine a horizontal speed v_horizontal and a vertical speed v_vertical of the aircraft;
- An acquiring module configured to acquire, in a falling direction of the aircraft, an object that is not more than a preset distance L from the aircraft;
- a prediction module configured to predict, according to the v_horizontal and v_vertical determined by the first determining module and the distance L, the positional relationship between the aircraft and the object acquired by the acquiring module after the flight L;
- a control module configured to take preset control measures to control the aircraft when the prediction module predicts that the positional relationship satisfies a preset relationship.
- The device further includes: a second determining module configured to determine that the aircraft is falling.
- the prediction module includes:
- a first determining unit configured to determine a first projection position of the aircraft in the detection plane, where the distance between the detection plane and the drone is L, and the detection plane is perpendicular to the direction of motion of the drone;
- a second determining unit configured to determine a scanning position of the object in the detection plane;
- a prediction unit configured to predict, according to the first projection position, v_horizontal, v_vertical, and L, a second projection position of the aircraft in the detection plane after the flight L;
- a third determining unit configured to determine the positional relationship between the second projection position predicted by the prediction unit and the scanning position determined by the second determining unit as the positional relationship between the aircraft and the object after the flight L.
- the aircraft is equipped with a depth of field sensor, and the detection direction of the depth of field sensor is consistent with the moving direction of the aircraft;
- the acquiring module is configured to acquire an object detected by the depth of field sensor with L as a depth of field.
- The first determining unit includes: an acquiring subunit configured to acquire a three-dimensional size of the aircraft;
- a first determining subunit configured to determine an angle between the depth-of-field sensor and an initial direction of the aircraft;
- a projection subunit configured to project the aircraft into the detection plane according to the three-dimensional size acquired by the acquiring subunit and the angle determined by the first determining subunit;
- a second determining subunit configured to determine the projection position of the aircraft in the detection plane, obtained by the projection subunit, as the first projection position.
- the prediction unit includes:
- a prediction subunit configured to predict, according to v_horizontal, v_vertical, and L, a distance s that the aircraft moves longitudinally in the detection plane after the flight L;
- a determining subunit configured to determine, as the second projection position, the position obtained by longitudinally moving the first projection position by the distance s obtained by the prediction subunit.
- the prediction subunit is configured to predict s according to a preset formula.
- the preset control measure is: ejecting the airbag, or disassembling the aircraft.
- By determining v_horizontal and v_vertical of the aircraft, acquiring, in the falling direction of the aircraft, an object whose distance from the aircraft is not greater than L, predicting, according to v_horizontal, v_vertical, and L, the positional relationship between the aircraft and the object after the flight L, and, if the positional relationship satisfies the preset relationship, taking preset control measures to control the aircraft, the aircraft can be controlled after the fall occurs.
- FIG. 1 is a flow chart showing a control method of an aircraft provided in an embodiment of the present invention
- FIG. 2 is a schematic view of a drone provided in another embodiment of the present invention.
- FIG. 3 is a flow chart showing a control method of another aircraft provided in another embodiment of the present invention.
- FIG. 4 is a schematic diagram showing the speed of a drone provided in another embodiment of the present invention.
- FIG. 5 is a diagram showing an obstacle information map provided in another embodiment of the present invention;
- FIG. 6 is a diagram showing a three-dimensional obstacle information map provided in another embodiment of the present invention;
- Figure 7 shows a top view of a drone provided in another embodiment of the present invention.
- FIG. 8 is a schematic diagram showing projection of a drone in a three-dimensional obstacle information map according to another embodiment of the present invention.
- FIG. 9 is a schematic diagram showing a position of a UAV projected in a three-dimensional obstacle information map according to another embodiment of the present invention.
- FIG. 10 is a schematic diagram showing displacement of a UAV projected in a three-dimensional obstacle information map according to another embodiment of the present invention.
- FIG. 11 is a schematic diagram showing another UAV provided in another embodiment of the present invention in a three-dimensional obstacle information map
- FIG. 12 is a flow chart showing a control method of another aircraft provided in another embodiment of the present invention.
- FIG. 13 is a diagram showing another obstacle information map provided in another embodiment of the present invention;
- FIG. 14 is a diagram showing another three-dimensional obstacle information map provided in another embodiment of the present invention;
- FIG. 15 is a schematic structural diagram of a control device for an aircraft provided in another embodiment of the present invention;
- FIG. 16 is a schematic structural diagram of another control device for an aircraft provided in another embodiment of the present invention;
- FIG. 17 is a schematic structural diagram of a prediction module provided in another embodiment of the present invention.
- FIG. 18 is a schematic structural diagram of a first determining unit provided in another embodiment of the present invention.
- FIG. 19 is a schematic structural diagram of a prediction unit provided in another embodiment of the present invention.
- In the related art, an aircraft is uncontrollable when it falls; it is difficult for it to avoid colliding with objects in front of it, and the loss of life and property caused when it strikes a passer-by or a vehicle cannot be avoided.
- To this end, the present application proposes a control method for an aircraft, which is applied to a control device for an aircraft; the control device of the aircraft may be the control device shown in any one of FIGS. 15 to 19.
- The control device of the aircraft is located on the aircraft, and the aircraft may be equipped with a depth-of-field sensor whose detection direction is kept consistent with the direction of movement of the aircraft. When the aircraft falls, the control device may determine v_horizontal and v_vertical of the aircraft; acquire, through the depth-of-field sensor, an object whose distance from the aircraft in the falling direction is not greater than the preset distance L; predict, according to v_horizontal, v_vertical, and L, the positional relationship between the aircraft and the object after the flight L; and, if the positional relationship satisfies the preset relationship, take preset control measures to control the aircraft, thereby controlling the aircraft after the fall occurs.
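For illustration only, this overall flow can be sketched as a simple loop. The object `drone` and the callables `predict_relation` and `is_dangerous` are hypothetical stand-ins for the modules described below, not APIs defined by the patent.

```python
def fall_protection_loop(drone, L, predict_relation, is_dangerous):
    """Minimal sketch of the described control flow (hypothetical helpers)."""
    while drone.is_falling():
        v_h = drone.horizontal_speed()          # e.g. from GPS
        v_v = drone.vertical_speed()            # e.g. from a height sensor
        for obj in drone.objects_within(L):     # depth-of-field scan along the motion direction
            relation = predict_relation(drone, obj, v_h, v_v, L)
            if is_dangerous(relation):          # positional relationship satisfies the preset relation
                drone.trigger_protection()      # eject airbag or disintegrate
                return
```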
- the present embodiment provides a method for controlling an aircraft.
- The flow of the method provided in this embodiment is as follows: determining v_horizontal and v_vertical of the aircraft; acquiring, in the falling direction of the aircraft, an object whose distance from the aircraft is not greater than a preset distance L; predicting, according to v_horizontal, v_vertical, and L, the positional relationship between the aircraft and the object after the flight L; and, if the positional relationship satisfies a preset relationship, taking preset control measures to control the aircraft.
- Before determining v_horizontal and v_vertical of the aircraft, the method further includes: determining that the aircraft is falling.
- the aircraft is equipped with a depth of field sensor, and the detection direction of the depth of field sensor is consistent with the moving direction of the aircraft;
- Acquiring, in the direction of motion of the aircraft, an object whose distance from the aircraft is not greater than the preset distance L then includes: acquiring an object detected by the depth-of-field sensor with L as the depth of field.
- Predicting, according to v_horizontal, v_vertical, and L, the positional relationship between the aircraft and the object after the flight L includes: determining a first projection position of the aircraft in a detection plane, and determining a scanning position of the object in the detection plane; predicting, according to the first projection position, v_horizontal, v_vertical, and L, a second projection position of the aircraft in the detection plane after the flight L;
- and determining the positional relationship between the second projection position and the scanning position as the positional relationship between the aircraft and the object after the flight L.
- Determining the first projection position of the aircraft in the detection plane comprises: acquiring the three-dimensional size of the aircraft; determining the angle between the depth-of-field sensor and the initial direction of the aircraft; projecting the aircraft into the detection plane according to the three-dimensional size and the angle;
- and determining the projected position of the aircraft in the detection plane as the first projection position.
- Predicting, according to the first projection position, v_horizontal, v_vertical, and L, the second projection position of the aircraft in the detection plane after the flight L includes:
- predicting, according to v_horizontal, v_vertical, and L, the distance s that the aircraft moves longitudinally in the detection plane after the flight L;
- and determining the position obtained by longitudinally moving the first projection position by the distance s as the second projection position.
- Predicting the distance s of the longitudinal movement of the aircraft in the detection plane after the flight L includes: predicting s according to a preset formula.
- the preset control measures are: ejecting the airbag, or disassembling the aircraft.
- By determining v_horizontal and v_vertical of the aircraft, acquiring, in the falling direction of the aircraft, an object whose distance from the aircraft is not greater than L, predicting, according to v_horizontal, v_vertical, and L, the positional relationship between the aircraft and the object after the flight L, and, if the positional relationship satisfies the preset relationship, taking preset control measures to control the aircraft, the aircraft can be controlled after the fall occurs.
- This embodiment provides a method for controlling an aircraft. Since there are many types of aircraft, for convenience of description this embodiment takes a drone as an example and assumes that the object whose distance from the drone is not greater than L is a single object A.
- The UAV is shown in FIG. 2, and the UAV is equipped with an infrared laser depth-of-field sensor that can rotate freely through 360 degrees; the detection direction of the 360-degree freely rotating infrared laser depth-of-field sensor is always kept consistent with the direction of movement of the UAV.
- The drone monitors its own state, the operation of its equipment, and so on, and judges from the monitoring result whether it is falling; when the monitoring result indicates a fall, it determines that the drone is falling.
- There are many reasons why a drone may fall, such as a mechanical failure of the drone shown in FIG. 2, a collision during flight, or other causes.
- There are also many forms of falling, such as falling in a free-fall mode, falling after some of the propellers stall, or other forms of fall.
- The acceleration of different drones may differ when they fall; this embodiment does not limit the specific falling acceleration of the drone.
- In this step, the horizontal speed v_horizontal of the UAV can be obtained by GPS, and the vertical speed v_vertical can be obtained by a height sensor.
- The flight speed v of the drone can also be calculated from v_horizontal and v_vertical to determine the speed of the drone in three-dimensional space.
- The flight speed v is the current actual speed of the drone, and the direction of v is rotated downward from the horizontal by θ degrees, as shown in FIG. 4.
- Alternatively, v can be obtained directly from the relevant measurement equipment of the drone.
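As a worked illustration of this relationship (not text from the patent), v and θ can be computed from the two components:

```python
import math

def flight_speed(v_horizontal, v_vertical):
    """Return (v, theta_deg): the total speed and the angle, in degrees,
    by which the velocity direction is rotated downward from the horizontal."""
    v = math.hypot(v_horizontal, v_vertical)
    theta_deg = math.degrees(math.atan2(v_vertical, v_horizontal))
    return v, theta_deg

# Example: 4 m/s horizontally and 3 m/s downward give v = 5 m/s, theta ~= 36.87 degrees.
print(flight_speed(4.0, 3.0))
```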
- In this step, the object detected by the 360-degree freely rotating infrared laser depth-of-field sensor with L as the depth of field can be acquired.
- In a specific implementation, the 360-degree freely rotating infrared laser depth-of-field sensor performs real-time depth-of-field scanning within L, where L is the farthest scanning distance, and an obstacle information map as shown in FIG. 5 is obtained.
- If the 360-degree freely rotating infrared laser depth-of-field sensor can also measure distances within its visible area, then for each pixel at which no object is detected the distance d is recorded as infinity, and for each pixel point at which the object A is detected the distance information d (in the range 0 to L) of that point is recorded.
- The distance information of each point is then plotted to obtain a three-dimensional obstacle information map as shown in FIG. 6.
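The three-dimensional obstacle information map can be thought of as a per-pixel depth image. The sketch below builds such a map from hypothetical raw range readings; the data layout is an illustrative assumption, not the patent's exact format.

```python
import math

def build_obstacle_map(range_readings, L):
    """Convert raw per-pixel range readings into a 3D obstacle information map.

    `range_readings` is a 2D list of measured distances; `None` (no return) or
    anything beyond L is recorded as infinity, as described above.
    """
    obstacle_map = []
    for row in range_readings:
        out_row = []
        for d in row:
            if d is None or d > L:
                out_row.append(math.inf)   # no object detected at this pixel
            else:
                out_row.append(d)          # object (e.g. object A) at distance d in (0, L]
        obstacle_map.append(out_row)
    return obstacle_map
```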
- One way to keep the detection direction of the 360-degree freely rotating infrared laser depth-of-field sensor always consistent with the direction of movement of the drone is as follows: using its own geomagnetic sensor, the sensor can align itself, in the horizontal plane, to the required angle measured from north toward east, and then rotate, in the vertical plane, to the required angle relative to the direction toward the center of the earth. In this way, even if the drone rotates or tumbles during the fall, the 360-degree freely rotating infrared laser depth-of-field sensor can still keep its absolute orientation aligned with the direction of the velocity on the basis of the absolute values of these two angles.
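One plausible way to compute those two pointing angles from the velocity vector is sketched below; the north/east/down component convention and the function name are assumptions for illustration, not taken from the patent.

```python
import math

def sensor_pointing_angles(v_north, v_east, v_down):
    """Angles that keep the sensor aimed along the current velocity direction.

    Returns (heading_deg, depression_deg): heading measured from north toward
    east, and the angle below the horizontal plane. Both are absolute angles,
    so they can be held even while the airframe rotates or tumbles.
    """
    heading_deg = math.degrees(math.atan2(v_east, v_north)) % 360.0
    horizontal_speed = math.hypot(v_north, v_east)
    depression_deg = math.degrees(math.atan2(v_down, horizontal_speed))
    return heading_deg, depression_deg
```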
- This embodiment only takes detection by the 360-degree freely rotating infrared laser depth-of-field sensor as an example.
- In practice, the drone can also be equipped with other forms of depth-of-field sensors, as long as the sensor can detect objects with L as the depth of field and can rotate freely through 360 degrees, so that the detection direction of the sensor is always consistent with the direction of movement of the drone.
- the specific implementation includes, but is not limited to, the following four steps:
- Step 1 Determine the first projection position of the drone in the detection plane
- the distance between the detection plane and the drone is L, and the detection plane is perpendicular to the movement direction of the drone.
- In a specific implementation, step 1 can be implemented by the following four sub-steps:
- Sub-step 1.1 Obtaining the three-dimensional size of the drone
- The drone has an accurate three-dimensional size at the time of manufacture, and this three-dimensional size is stored as three-dimensional model information in the drone's related program. This sub-step can obtain the three-dimensional size directly from the related program.
- Sub-step 1.2 determining the angle between the depth of field sensor and the initial direction of the drone
- The 360-degree freely rotating infrared laser depth-of-field sensor in FIG. 2 is connected to the UAV main body through two or more axes.
- The 360-degree freely rotating infrared laser depth-of-field sensor can know its current axis angles at any time.
- The current axis angles of the 360-degree freely rotating infrared laser depth-of-field sensor are determined as the angle between the depth-of-field sensor and the initial direction of the drone.
- Sub-step 1.3 projecting the drone into the detection plane according to the three-dimensional size and angle;
- The 360-degree freely rotating infrared laser depth-of-field sensor can rotate about the X-axis and the Y-axis; take the direction pointing out of the front of the plane of FIG. 2 as the positive direction. Looking at the Y-axis in a top view, as shown in FIG. 7, the Y-axis is vertical in that view.
- If the sensor has rotated by y about the Y-axis at the moment of falling, the drone should take y+180° as the projection component of the Y-axis rotation.
- The rotation about the X-axis is handled in the same way: x+180° should be taken as the projection component of the X-axis rotation.
- Using (x+180°, y+180°) as the projection angle of the drone's 3D model, the shape of the projection can be obtained. The size of the drone is known from sub-step 1.1, the size of the photosensitive device of the 360-degree freely rotating infrared laser depth-of-field sensor and the focal length of its lens are also known, and therefore the drone itself knows the actual size of this projection in the detected image at the distance L, as shown in FIG. 8.
- Sub-step 1.4 Determine the projection position of the drone in the detection plane as the first projection position.
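A pinhole-camera style scaling is one common way to estimate how large that projection appears in the detection image at the distance L; the formula below is a standard optics approximation offered as an assumption, not the patent's stated formula.

```python
def projected_size_in_pixels(actual_size_m, focal_length_m, L_m, pixel_pitch_m):
    """Approximate width (in pixels) of the drone's projection at depth L.

    Thin-lens / pinhole approximation: image_size = actual_size * f / L.
    All parameter names are illustrative assumptions.
    """
    image_size_m = actual_size_m * focal_length_m / L_m
    return image_size_m / pixel_pitch_m

# Example: a 0.5 m wide drone, a 4 mm focal length, L = 10 m and 3 um pixels
# give a projection roughly 67 pixels wide.
print(projected_size_in_pixels(0.5, 0.004, 10.0, 3e-6))
```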
- Step 2 Determine the scanning position of the object A in the detection plane
- Because the three-dimensional obstacle information map obtained in step 303 is itself part of the detection plane, step 2 can directly use the three-dimensional obstacle information map obtained in step 303, treat the map as the projection result of the object A in the detection plane, and determine the projection position of the object A in the map as the scanning position.
- the embodiment first performs the step 1 and then the step 2, and in actual application, the step 2 may be performed before the step 1 or the step 1 and the step 2 may be performed simultaneously.
- This embodiment does not limit the specific implementation order of step 1 and step 2.
- Step 3 Predict a second projection position of the drone in the detection plane after the flight L according to the first projection position, v_horizontal, v_vertical, and L;
- Step 3 can be implemented by the following two substeps:
- Sub-step 3.1 According to v_horizontal, v_vertical, and L, predict the distance s that the drone moves longitudinally in the detection plane after the flight L; s can be predicted by a preset formula in which:
- g is the gravitational acceleration
- a is the preset reduction ratio constant
- the s prediction formula can be derived as follows:
- From step 302, v, v_horizontal, and v_vertical of the drone are known, and the direction of v is rotated downward from the horizontal by θ degrees.
- From step 303, the angular velocities about the X-axis and the Y-axis between the 360-degree freely rotating infrared laser depth-of-field sensor and the drone body are also known; they are assumed to be ω_X and ω_Y, respectively.
- During the fall, v_horizontal and v_vertical will change, but the drone can still obtain v_horizontal and v_vertical at any moment and predict the motion according to the falling trajectory.
- the present embodiment takes the free fall as an example for further analysis.
- the detection distance is L
- The time for the drone to fly to the detection plane at the distance L is approximately L/v, as shown in FIG. 9.
- Within this L/v time, the projection image of the UAV moves longitudinally by a distance b in the detection image (because the horizontal speed and direction of the UAV do not change during free fall, there is no lateral movement in the detection image), as shown in FIG. 10.
- Here b is the actual longitudinal moving distance. On the imaging area of the 360-degree freely rotating infrared laser depth-of-field sensor, the distance moved in the image and the actual distance are related by a fixed reduction ratio, which is a known parameter once the 360-degree freely rotating infrared laser depth-of-field sensor and its lens group are assembled. Assuming that at the distance L this reduction ratio is the constant a, the longitudinal movement distance s on the 360-degree freely rotating infrared laser depth-of-field sensor is obtained by scaling b by a.
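The formula itself is not reproduced in this text. Under the free-fall assumption stated above, one plausible reconstruction takes b as the extra drop 0.5·g·(L/v)² accumulated over the L/v flight time and then scales it by the reduction constant a; the sketch below implements that reading and should be treated as an assumption, not as the patent's exact formula.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def longitudinal_shift_s(v_horizontal, v_vertical, L, a):
    """Predict s, the longitudinal shift of the drone's projection in the image.

    Assumed reading of the derivation above: flight time t = L / v, extra
    free-fall drop b = 0.5 * g * t**2, image-space shift s = a * b, where a is
    the preset reduction-ratio constant of the sensor and lens assembly.
    """
    v = math.hypot(v_horizontal, v_vertical)
    t = L / v                    # approximate time to reach the detection plane
    b = 0.5 * G * t ** 2         # actual longitudinal deviation during free fall
    return a * b                 # scaled into the sensor image
```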
- Sub-step 3.2 The position after the first projection position is longitudinally moved by the distance s is determined as the second projection position.
- The angular velocities about the X-axis and the Y-axis between the 360-degree freely rotating infrared laser depth-of-field sensor and the fuselage are ω_X and ω_Y respectively, and these angular velocities do not change during free-fall motion; therefore, after the L/v time, the rotation angles of the UAV about the X and Y axes are ω_X·L/v and ω_Y·L/v, respectively.
- The position, in the detection image, that the drone occupies after the first projection position has been moved longitudinally by the distance s within the L/v time is as shown in FIG. 11, and this position is determined as the second projection position.
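Continuing the same assumed reading, the second projection position can be sketched as the model re-projected at the angles accumulated over t = L/v and then shifted longitudinally by s. The renderer `project_model` is a hypothetical helper, not a function defined in the patent.

```python
def second_projection_position(project_model, x_angle, y_angle,
                               omega_x, omega_y, L, v, s):
    """Sketch of sub-step 3.2 under the stated free-fall assumptions.

    `project_model(x_deg, y_deg)` is assumed to return the (u, w) image
    position of the drone's 3D-model projection at the given rotation angles.
    The model is re-projected at the angles accumulated over t = L / v and the
    result is shifted longitudinally by s.
    """
    t = L / v
    u, w = project_model(x_angle + omega_x * t, y_angle + omega_y * t)
    return u, w + s   # longitudinal shift by s in the detection image
```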
- Step 4 Determine the positional relationship between the second projection position and the scanning position as the positional relationship between the drone and the object A after the flight L.
- the preset control measures include but are not limited to: ejecting the airbag, or disassembling the drone.
- In one implementation, the preset control measures are taken to control the drone only when it is determined in step 304 that there is overlap between the positions of the drone and the object A after the flight L.
- In another implementation, the preset control measures are taken to control the drone when such an overlap is determined in step 304, and also when there is no overlap between the drone and the object A after the flight L but the actual distance between them, c/a (where c is the distance between the second projection position and the scanning position in the detection image and a is the reduction-ratio constant), is not more than a preset threshold e.
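A minimal form of that decision rule, assuming a is the actual-to-image reduction constant so that an image-space gap c corresponds to an actual gap c / a, might look like this:

```python
def should_trigger_protection(projections_overlap, image_gap_c, a, margin_e):
    """Decide whether to take the preset control measures.

    Trigger when the predicted second projection overlaps the object's scanning
    position, or when the estimated actual gap c / a is within the margin e.
    The c / a reading of the relation above is an assumption.
    """
    if projections_overlap:
        return True
    return (image_gap_c / a) <= margin_e
```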
- When a collision is predicted, the drone should activate an emergency protection device, for example by popping up an airbag or disintegrating, so that the drone itself is protected from damage as far as possible and pedestrians and property on the ground are protected from being struck and damaged.
- In summary, the anti-collision method for a falling drone in this embodiment provides, on the drone, an infrared laser depth-of-field sensor that can rotate freely through 360 degrees and points in the current speed direction in real time. At the distance L it performs high-frequency scanning laser ranging or pattern-based full depth-of-field analysis, combines the result with the projection image of the drone at the angles at that point in time, and predicts whether a collision will occur based on the two velocity components in the current plane and the rotational speed relative to the projection plane. If a collision is about to occur, an emergency mechanism (such as ejecting the airbag or self-disintegrating) is initiated to minimize damage to the drone itself and to people or property on the ground.
- The above description takes as an example the case in which one infrared laser depth-of-field sensor capable of 360-degree free rotation is provided on the unmanned aerial vehicle shown in FIG. 2.
- In practice, two or more infrared laser depth-of-field sensors that can rotate through 360 degrees may be provided as appropriate; the specific number is not limited in this embodiment.
- When multiple sensors are provided, the data obtained by the individual 360-degree freely rotating infrared laser depth-of-field sensors can be combined into one set of data, which is used as the final data of the 360-degree freely rotating infrared laser depth-of-field sensor for subsequent processing.
- The anti-collision method for a falling drone in this embodiment is started when the drone begins to fall and is executed repeatedly; that is, throughout the entire fall the drone repeatedly obtains the horizontal speed and the vertical speed, acquires the objects whose distance from it in the direction of motion is not greater than L, and, when it determines that it may collide with an object, adopts the preset anti-collision measures, thereby protecting against collisions with objects throughout the entire fall.
- The above embodiment is described by taking the case in which there is a single object A whose distance from the drone is not more than L.
- The following describes the aircraft control method provided by the present application for the scenario in which there are a plurality of objects whose distance from the drone is not greater than L.
- The UAV shown in FIG. 2 is still used, and the example again assumes that the UAV is equipped with a 360-degree freely rotating infrared laser depth-of-field sensor whose detection direction is always kept consistent with the direction of movement of the drone.
- The implementation of this step is the same as that of step 301; for details, see step 301, which is not described in detail here.
- The implementation of this step is the same as that of step 302; for details, see step 302, which is not described in detail here.
- The difference is that this step acquires all objects whose distance from the drone is not more than L.
- For each object, the implementation of this step is the same as that of step 303; for details, refer to step 303, which is not described in detail here.
- Specifically, the 360-degree freely rotating infrared laser depth-of-field sensor performs real-time depth-of-field scanning within L to obtain an obstacle information map as shown in FIG. 13. If the 360-degree freely rotating infrared laser depth-of-field sensor can also measure the distances within its visible area, a three-dimensional obstacle information map as shown in FIG. 14 can be obtained.
- For each object, the implementation of predicting, according to v_horizontal, v_vertical, and L, the positional relationship between the drone and that object after the flight L is the same as that of step 304; for details, refer to step 304, which is not described in detail here.
- The method for determining whether the positional relationship between the UAV and each object after the flight L satisfies the preset relationship is the same as that of step 305; for details, refer to step 305, which is not described in detail here.
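Extending the single-object check to several objects is just a loop over the detected objects; the sketch below reuses the hypothetical helpers assumed in the earlier examples.

```python
def any_collision_predicted(drone, objects, v_h, v_v, L, predict_relation, is_dangerous):
    """Multi-object variant: report a collision as soon as any object qualifies.

    Each object is checked independently with the single-object procedure of
    steps 303-305; `predict_relation` and `is_dangerous` are hypothetical helpers.
    """
    for obj in objects:
        relation = predict_relation(drone, obj, v_h, v_v, L)
        if is_dangerous(relation):
            return True
    return False
```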
- This embodiment provides a control device for an aircraft. Since the principle by which the control device of the aircraft solves the problem is similar to that of the control method for an aircraft, the implementation of the control device of the aircraft can refer to the implementation of the method, and repeated details are not described again.
- The control device of the aircraft includes:
- a first determining module 1501, configured to determine a horizontal speed v_horizontal and a vertical speed v_vertical of the aircraft;
- an acquiring module 1502, configured to acquire, in the falling direction of the aircraft, an object whose distance from the aircraft is not more than a preset distance L;
- a prediction module 1503, configured to predict, according to the v_horizontal and v_vertical determined by the first determining module 1501 and the distance L, the positional relationship between the aircraft and the object acquired by the acquiring module 1502 after the flight L;
- a control module 1504, configured to take preset control measures to control the aircraft when the prediction module 1503 predicts that the positional relationship satisfies the preset relationship.
- the device further includes:
- the second determining module 1505 is configured to determine that the aircraft is falling.
- the prediction module 1503 includes:
- a first determining unit 15031 configured to determine a first projection position of the aircraft in the detecting plane, the distance between the detecting plane and the drone is L, and the detecting plane is perpendicular to the moving direction of the drone;
- a second determining unit 15032 configured to determine a scanning position of the object in the detecting plane
- a prediction unit 15033 configured to predict, according to the first projection position determined by the first determining unit 15031, v_horizontal, v_vertical, and L, a second projection position of the aircraft in the detection plane after the flight L;
- the third determining unit 15034 is configured to determine a positional relationship between the second projection position predicted by the prediction unit 15033 and the scan position determined by the second determining unit 15032 as a positional relationship between the aircraft and the object after the flight L.
- the aircraft is equipped with a depth of field sensor, and the detection direction of the depth of field sensor is consistent with the moving direction of the aircraft;
- the obtaining module 1502 is configured to acquire an object detected by the depth of field sensor with L as the depth of field.
- the first determining unit 15031 includes:
- the obtaining subunit 150311 is configured to acquire a three-dimensional size of the aircraft
- a first determining subunit 150312 configured to determine an angle between the depth of field sensor and an initial direction of the aircraft
- a projection subunit 150313 configured to project the aircraft into the detection plane according to the three-dimensional size acquired by the acquisition subunit 150311 and the angle determined by the first determination subunit 150312;
- the second determining subunit 150314 is configured to determine, as the first projection position, the projection position of the aircraft in the detection plane obtained by the projection subunit 150313.
- the prediction unit 15033 includes:
- a prediction subunit 150331, configured to predict, according to v_horizontal, v_vertical, and L, a distance s of the longitudinal movement of the aircraft in the detection plane after the flight L;
- a determining subunit 150332, configured to determine, as the second projection position, the position obtained by longitudinally moving the first projection position by the distance s predicted by the prediction subunit 150331.
- The prediction subunit 150331 is configured to predict s according to a preset formula.
- the preset control measures are: ejecting the airbag, or disintegrating the aircraft.
- By determining v_horizontal and v_vertical of the aircraft, acquiring, in the falling direction of the aircraft, all objects whose distance from the aircraft is not greater than L, predicting, according to v_horizontal, v_vertical, and L, the positional relationship between the aircraft and each object after the flight L, and, if there is an object whose positional relationship with the aircraft satisfies the preset relationship, taking the preset control measures to control the aircraft, the aircraft can be controlled after the fall occurs.
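As an illustration of the module decomposition described with reference to FIGS. 15 to 19, the device can be organized roughly as follows; the class layout and field types are invented for the sketch and are not defined by the patent.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class PredictionModule:                     # cf. prediction module 1503
    first_determining_unit: Any             # first projection position (15031)
    second_determining_unit: Any            # scanning position (15032)
    prediction_unit: Any                    # second projection position (15033)
    third_determining_unit: Any             # positional relationship (15034)

@dataclass
class AircraftControlDevice:                # cf. FIGS. 15 and 16
    first_determining_module: Any           # v_horizontal and v_vertical (1501)
    acquiring_module: Any                   # objects within L (1502)
    prediction_module: PredictionModule     # positional relationship after flight L (1503)
    control_module: Any                     # preset control measures (1504)
    second_determining_module: Any          # determines that the aircraft is falling (1505)
```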
- In a specific implementation, the above functional modules can be implemented using existing functional components.
- For example, the processing module can use existing data processing components; the positioning server used in existing positioning technology already has components implementing this function, and any device with a signal transmission function can implement the receiving module.
- The calculation of the A and n parameters, the strength adjustment, and the like performed by the processing module are all existing technical means, and those skilled in the art can carry out the corresponding design and development.
- embodiments of the present invention can be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or a combination of software and hardware. Moreover, the invention can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.
- These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, and the instruction device implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
- These computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Claims (16)
- A method for controlling an aircraft, wherein the method comprises: determining a horizontal speed v_horizontal and a vertical speed v_vertical of the aircraft; acquiring, in the falling direction of the aircraft, an object whose distance from the aircraft is not greater than a preset distance L; predicting, according to the v_horizontal, the v_vertical, and the L, a positional relationship between the aircraft and the object after flying the distance L; and, if the positional relationship satisfies a preset relationship, taking preset control measures to control the aircraft.
- The method according to claim 1, wherein before the determining of the v_horizontal and the v_vertical of the aircraft, the method further comprises: determining that the aircraft is falling.
- The method according to claim 1 or 2, wherein the predicting, according to the v_horizontal, the v_vertical, and the L, of the positional relationship between the aircraft and the object after the flight L comprises: determining a first projection position of the aircraft in a detection plane, and determining a scanning position of the object in the detection plane, wherein the distance between the detection plane and the drone is L, and the detection plane is perpendicular to the direction of motion of the drone; predicting, according to the first projection position, the v_horizontal, the v_vertical, and the L, a second projection position of the aircraft in the detection plane after the flight L; and determining the positional relationship between the second projection position and the scanning position as the positional relationship between the aircraft and the object after the flight L.
- The method according to claim 3, wherein the aircraft is equipped with a depth-of-field sensor, and the detection direction of the depth-of-field sensor is consistent with the direction of motion of the aircraft; and the acquiring, in the direction of motion of the aircraft, of an object whose distance from the aircraft is not greater than the preset distance L comprises: acquiring an object detected by the depth-of-field sensor with L as the depth of field.
- The method according to claim 4, wherein the determining of the first projection position of the aircraft in the detection plane comprises: acquiring a three-dimensional size of the aircraft; determining an angle between the depth-of-field sensor and an initial direction of the aircraft; projecting the aircraft into the detection plane according to the three-dimensional size and the angle; and determining the projection position of the aircraft in the detection plane as the first projection position.
- The method according to claim 4, wherein the predicting, according to the first projection position, the v_horizontal, the v_vertical, and the L, of the second projection position of the aircraft in the detection plane after the flight L comprises: predicting, according to the v_horizontal, the v_vertical, and the L, a distance s that the aircraft moves longitudinally in the detection plane after the flight L; and determining the position obtained by longitudinally moving the first projection position by the distance s as the second projection position.
- The method according to claim 1 or 2, wherein the preset control measures are: ejecting an airbag, or disintegrating the aircraft.
- A control device for an aircraft, wherein the device comprises: a first determining module, configured to determine a horizontal speed v_horizontal and a vertical speed v_vertical of the aircraft; an acquiring module, configured to acquire, in the falling direction of the aircraft, an object whose distance from the aircraft is not greater than a preset distance L; a prediction module, configured to predict, according to the v_horizontal determined by the first determining module, the v_vertical determined by the first determining module, and the L, a positional relationship between the aircraft and the object acquired by the acquiring module after the flight L; and a control module, configured to take preset control measures to control the aircraft when the prediction module predicts that the positional relationship satisfies a preset relationship.
- The device according to claim 9, wherein the device further comprises: a second determining module, configured to determine that the aircraft is falling.
- The device according to claim 9 or 10, wherein the prediction module comprises: a first determining unit, configured to determine a first projection position of the aircraft in a detection plane, wherein the distance between the detection plane and the drone is L, and the detection plane is perpendicular to the direction of motion of the drone; a second determining unit, configured to determine a scanning position of the object in the detection plane; a prediction unit, configured to predict, according to the first projection position determined by the first determining unit, the v_horizontal, the v_vertical, and the L, a second projection position of the aircraft in the detection plane after the flight L; and a third determining unit, configured to determine the positional relationship between the second projection position predicted by the prediction unit and the scanning position determined by the second determining unit as the positional relationship between the aircraft and the object after the flight L.
- The device according to claim 11, wherein the aircraft is equipped with a depth-of-field sensor, and the detection direction of the depth-of-field sensor is consistent with the direction of motion of the aircraft; and the acquiring module is configured to acquire an object detected by the depth-of-field sensor with L as the depth of field.
- The device according to claim 12, wherein the first determining unit comprises: an acquiring subunit, configured to acquire a three-dimensional size of the aircraft; a first determining subunit, configured to determine an angle between the depth-of-field sensor and an initial direction of the aircraft; a projection subunit, configured to project the aircraft into the detection plane according to the three-dimensional size acquired by the acquiring subunit and the angle determined by the first determining subunit; and a second determining subunit, configured to determine the projection position of the aircraft in the detection plane obtained by the projection subunit as the first projection position.
- The device according to claim 12, wherein the prediction unit comprises: a prediction subunit, configured to predict, according to the v_horizontal, the v_vertical, and the L, a distance s that the aircraft moves longitudinally in the detection plane after the flight L; and a determining subunit, configured to determine the position obtained by longitudinally moving the first projection position by the distance s obtained by the prediction subunit as the second projection position.
- The device according to claim 9 or 10, wherein the preset control measures are: ejecting an airbag, or disintegrating the aircraft.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2016/077351 WO2017161563A1 (zh) | 2016-03-25 | 2016-03-25 | 飞行器的控制方法和装置 |
CN201680002718.1A CN107087429B (zh) | 2016-03-25 | 2016-03-25 | 飞行器的控制方法和装置 |
JP2017544011A JP6419986B2 (ja) | 2016-03-25 | 2016-03-25 | 航空機の制御方法及び装置 |
TW106110044A TWI686686B (zh) | 2016-03-25 | 2017-03-24 | 飛行器的控制方法和裝置 |
US15/658,772 US9994329B2 (en) | 2016-03-25 | 2017-07-25 | Method and apparatus for controlling aircraft |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2016/077351 WO2017161563A1 (zh) | 2016-03-25 | 2016-03-25 | 飞行器的控制方法和装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/658,772 Continuation US9994329B2 (en) | 2016-03-25 | 2017-07-25 | Method and apparatus for controlling aircraft |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017161563A1 true WO2017161563A1 (zh) | 2017-09-28 |
Family
ID=59614375
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/077351 WO2017161563A1 (zh) | 2016-03-25 | 2016-03-25 | 飞行器的控制方法和装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9994329B2 (zh) |
JP (1) | JP6419986B2 (zh) |
CN (1) | CN107087429B (zh) |
TW (1) | TWI686686B (zh) |
WO (1) | WO2017161563A1 (zh) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019078094A1 (ja) * | 2017-10-16 | 2019-04-25 | 日本化薬株式会社 | 墜落検知装置、飛行体の墜落を検知する方法、パラシュートまたはパラグライダーの展開装置、およびエアバッグ装置 |
JP2019177748A (ja) * | 2018-03-30 | 2019-10-17 | 株式会社Liberaware | 飛行体 |
CN111742276A (zh) * | 2019-05-29 | 2020-10-02 | 深圳市大疆创新科技有限公司 | 无人机返航方法、设备、无人机和存储介质 |
JP2022040417A (ja) * | 2019-11-12 | 2022-03-10 | 株式会社Liberaware | 飛行体 |
US12030628B2 (en) | 2017-10-16 | 2024-07-09 | Nippon Kayaku Kabushiki Kaisha | Crash detection device, flying body crash detection method, parachute or paraglider deployment device, and airbag device |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ITUA20164597A1 (it) * | 2016-06-22 | 2017-12-22 | Iveco Magirus | Sistema di posizionamento e metodo per la determinazione di una posizione operativa di un dispositivo aereo |
US11256253B2 (en) * | 2016-07-11 | 2022-02-22 | Kitty Hawk Corporation | Automated aircraft recovery system |
US11459113B2 (en) | 2016-07-11 | 2022-10-04 | Kitty Hawk Corporation | Multimodal aircraft recovery system |
CN107632617B (zh) * | 2017-09-28 | 2020-02-14 | 深圳市道通智能航空技术有限公司 | 一种无人飞行器的控制方法和装置 |
CN109934521B (zh) * | 2017-12-18 | 2021-07-13 | 北京京东尚科信息技术有限公司 | 货物保护方法、装置、系统和计算机可读存储介质 |
CN109154830A (zh) * | 2017-12-18 | 2019-01-04 | 深圳市大疆创新科技有限公司 | 无人机控制方法及无人机 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1161097A (zh) * | 1995-08-07 | 1997-10-01 | 迈脱有限公司 | 抑制虚假分辨报警的水平避免相撞距离过滤系统 |
CN101835651A (zh) * | 2007-10-22 | 2010-09-15 | 贝尔直升机泰克斯特龙公司 | 飞行器坠落衰减系统 |
CN102481980A (zh) * | 2009-07-27 | 2012-05-30 | 贝尔直升机泰克斯特龙公司 | 飞机乘员保护系统 |
JP2013127694A (ja) * | 2011-12-19 | 2013-06-27 | Mitsubishi Heavy Ind Ltd | 制御装置及び方法並びにプログラム |
CN103377537A (zh) * | 2012-04-28 | 2013-10-30 | 鸿富锦精密工业(深圳)有限公司 | 上空坠物预警系统及方法 |
CN104272364A (zh) * | 2012-05-02 | 2015-01-07 | 萨甘安全防护公司 | 飞行器避让方法以及提供有用于实现所述方法的系统的无人机 |
CN105353765A (zh) * | 2015-11-10 | 2016-02-24 | 浙江大华技术股份有限公司 | 一种控制无人机降落的方法及装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003127994A (ja) * | 2001-10-24 | 2003-05-08 | Kansai Electric Power Co Inc:The | 無人飛行物体の制御システム |
CN1600642A (zh) * | 2003-09-24 | 2005-03-30 | 高忠民 | 空难中飞行器和全体乘员的自救方法 |
US20080062011A1 (en) * | 2004-09-07 | 2008-03-13 | Butler William M | Collision Avoidance Warning And Taxi Guidance Device |
US8588996B2 (en) | 2005-11-09 | 2013-11-19 | Textron Innovations Inc. | Aircraft occupant protection system |
ES2360471T3 (es) * | 2006-02-23 | 2011-06-06 | Commonwealth Scientific And Industrial Research Organisation | Sistema y método para la identificación de maniobras para un vehículo en situaciones de conflicto. |
PT2388760E (pt) * | 2010-05-21 | 2013-04-09 | Agustawestland Spa | Aeronave capaz de pairar, método de assistência a manobra de aeronaves, e interface |
US8712679B1 (en) * | 2010-10-29 | 2014-04-29 | Stc.Unm | System and methods for obstacle mapping and navigation |
US9156540B2 (en) * | 2013-07-30 | 2015-10-13 | Sikorsky Aircraft Corporation | Hard landing detection and orientation control |
US9613539B1 (en) * | 2014-08-19 | 2017-04-04 | Amazon Technologies, Inc. | Damage avoidance system for unmanned aerial vehicle |
CN104309808A (zh) * | 2014-09-25 | 2015-01-28 | 安徽科耀智能科技有限公司 | 一种无人机安全碰撞装置 |
CN105882945A (zh) * | 2015-01-07 | 2016-08-24 | 宁波大学 | 一种高空坠物的平衡与缓冲装置 |
EP3076379A1 (en) * | 2015-04-01 | 2016-10-05 | Airbus Defence and Space GmbH | Method and device for an aircraft for handling potential collisions in air traffic |
-
2016
- 2016-03-25 JP JP2017544011A patent/JP6419986B2/ja active Active
- 2016-03-25 CN CN201680002718.1A patent/CN107087429B/zh active Active
- 2016-03-25 WO PCT/CN2016/077351 patent/WO2017161563A1/zh active Application Filing
-
2017
- 2017-03-24 TW TW106110044A patent/TWI686686B/zh active
- 2017-07-25 US US15/658,772 patent/US9994329B2/en active Active
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019078094A1 (ja) * | 2017-10-16 | 2019-04-25 | 日本化薬株式会社 | 墜落検知装置、飛行体の墜落を検知する方法、パラシュートまたはパラグライダーの展開装置、およびエアバッグ装置 |
JP2019073149A (ja) * | 2017-10-16 | 2019-05-16 | 日本化薬株式会社 | 墜落検知装置、飛行体の墜落を検知する方法、パラシュートまたはパラグライダーの展開装置、およびエアバッグ装置 |
CN111164014A (zh) * | 2017-10-16 | 2020-05-15 | 日本化药株式会社 | 坠落检测装置、检测飞行器坠落的方法、降落伞或滑翔伞的展开装置及安全气囊装置 |
EP3699089A4 (en) * | 2017-10-16 | 2021-07-14 | Nippon Kayaku Kabushiki Kaisha | IMPACT DETECTION DEVICE, METHOD OF DETECTING THE IMPACT OF A BODY, RELEASE DEVICE FOR PARACHUTE OR PARAGLIDER AND AIRBAG DEVICE |
JP2021151874A (ja) * | 2017-10-16 | 2021-09-30 | 日本化薬株式会社 | 墜落検知装置、飛行体の墜落を検知する方法、パラシュートまたはパラグライダーの展開装置、およびエアバッグ装置 |
JP7089623B2 (ja) | 2017-10-16 | 2022-06-22 | 日本化薬株式会社 | 墜落検知装置、飛行体の墜落を検知する方法、パラシュートまたはパラグライダーの展開装置、およびエアバッグ装置 |
CN111164014B (zh) * | 2017-10-16 | 2023-10-31 | 日本化药株式会社 | 坠落检测装置、检测飞行器坠落的方法、降落伞或滑翔伞的展开装置及安全气囊装置 |
US12030628B2 (en) | 2017-10-16 | 2024-07-09 | Nippon Kayaku Kabushiki Kaisha | Crash detection device, flying body crash detection method, parachute or paraglider deployment device, and airbag device |
JP2019177748A (ja) * | 2018-03-30 | 2019-10-17 | 株式会社Liberaware | 飛行体 |
CN111742276A (zh) * | 2019-05-29 | 2020-10-02 | 深圳市大疆创新科技有限公司 | 无人机返航方法、设备、无人机和存储介质 |
JP2022040417A (ja) * | 2019-11-12 | 2022-03-10 | 株式会社Liberaware | 飛行体 |
JP7296153B2 (ja) | 2019-11-12 | 2023-06-22 | 株式会社Liberaware | 飛行体 |
Also Published As
Publication number | Publication date |
---|---|
TWI686686B (zh) | 2020-03-01 |
US20170334568A1 (en) | 2017-11-23 |
US9994329B2 (en) | 2018-06-12 |
CN107087429B (zh) | 2020-04-07 |
TW201734687A (zh) | 2017-10-01 |
CN107087429A (zh) | 2017-08-22 |
JP6419986B2 (ja) | 2018-11-07 |
JP2018519203A (ja) | 2018-07-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017161563A1 (zh) | 飞行器的控制方法和装置 | |
US11726498B2 (en) | Aerial vehicle touchdown detection | |
US11604479B2 (en) | Methods and system for vision-based landing | |
US9783320B2 (en) | Airplane collision avoidance | |
US10152059B2 (en) | Systems and methods for landing a drone on a moving base | |
JP6250228B2 (ja) | 構造物の形状測定用の画像撮影システム、遠隔制御装置、機上制御装置、プログラムおよび記録媒体 | |
JP6609833B2 (ja) | 無人航空機の飛行を制御する方法及びシステム | |
WO2018218516A1 (zh) | 无人机返航路径规划方法及装置 | |
US11467179B2 (en) | Wind estimation system, wind estimation method, and program | |
JP6029446B2 (ja) | 自律飛行ロボット | |
EP3750140A1 (en) | Aerial vehicle smart landing | |
JP6195450B2 (ja) | 自律飛行ロボット | |
WO2018023556A1 (en) | Methods and systems for obstacle identification and avoidance | |
WO2019127029A1 (zh) | 一种闪避障碍物的方法、装置及飞行器 | |
WO2018112848A1 (zh) | 飞行控制方法和装置 | |
EP2523062B1 (en) | Time phased imagery for an artificial point of view | |
US20220291679A1 (en) | Information processing device, information processing method, information processing program, and control device | |
CN110366711A (zh) | 信息处理装置、飞行控制指示方法及记录介质 | |
Scholz et al. | Concept for Sensor and Processing Equipment for Optical Navigation of VTOL during Approach and Landing | |
Krause | Multi-purpose environment awareness approach for single line laser scanner in a small rotorcraft UA | |
JP2021103410A (ja) | 移動体及び撮像システム | |
JP7317684B2 (ja) | 移動体、情報処理装置、及び撮像システム | |
JP2007240435A (ja) | 目標対象物の位置検出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2017544011 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16894917 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16894917 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 29/03/2019) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16894917 Country of ref document: EP Kind code of ref document: A1 |