WO2019051638A1 - 可移动设备以及其作业方法 (Movable device and operation method thereof) - Google Patents

可移动设备以及其作业方法 (Movable device and operation method thereof) Download PDF

Info

Publication number
WO2019051638A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensing
mobile device
thrown
movable device
movable
Prior art date
Application number
PCT/CN2017/101388
Other languages
English (en)
French (fr)
Inventor
孙旭斌
霍达君
杨豪
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2017/101388 priority Critical patent/WO2019051638A1/zh
Priority to CN201780004507.6A priority patent/CN108475063A/zh
Publication of WO2019051638A1 publication Critical patent/WO2019051638A1/zh
Priority to US16/812,965 priority patent/US11435743B2/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/04Control of altitude or depth
    • G05D1/042Control of altitude or depth specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • B64U10/14Flying platforms with four distinct rotor axes, e.g. quadcopters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00Launching, take-off or landing arrangements
    • B64U70/10Launching, take-off or landing arrangements for releasing or capturing UAVs by hand
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/80UAVs characterised by their small size, e.g. micro air vehicles [MAV]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00Constructional aspects of UAVs
    • B64U20/80Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87Mounting of imaging devices, e.g. mounting of gimbals
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00Launching, take-off or landing arrangements

Definitions

  • The present disclosure relates to the field of remote control, and more particularly to a movable device and to a method of operating such a device.
  • A movable device, such as a selfie drone carrying an image capture device, typically uses a battery as its energy source.
  • A selfie drone usually needs to satisfy two requirements: portability (for example, small size and light weight) and long endurance (for example, one charge should support taking enough photos). These two requirements conflict with each other.
  • The battery accounts for most of a drone's weight: making the drone smaller and lighter means reducing battery weight, while extending endurance means increasing it.
  • At present, a selfie drone is powered entirely by its battery while in the air and can only take off from and land on flat ground.
  • To take a picture, the aircraft must be placed on the ground, commanded to take off with a remote controller, and then commanded to land again after shooting. In this situation, taking a photo itself needs only a few seconds, but the preparation (controlling take-off, adjusting position and attitude, and landing) takes tens of seconds, so quick snapshots are impossible. Take-off and landing also consume considerable power, so one battery charge supports only a few photos. Moreover, because take-off is possible only on flat ground, shooting is difficult where the terrain is uneven.
  • An object of the present disclosure is therefore to provide an operating method for a movable device, or a movable device, that solves at least one of the technical problems described above.
  • According to one aspect of the present disclosure, a method of operating a movable device is provided, including:
  • the movable device senses whether it has been thrown by a throwing body;
  • after sensing that it has been thrown, the movable device controls itself to hover; and, after controlling itself to hover, it performs aerial work.
  • According to another aspect of the present disclosure, a movable device is provided, comprising:
  • a first sensing device configured to sense whether the movable device has been thrown by a throwing body;
  • a controller configured to generate a control signal upon receiving, from the first sensing device, a signal indicating that the movable device has been thrown; and a power output component configured to hover the movable device according to the control signal after the device has been thrown.
  • In the operating method of the present disclosure, the device is launched by a throw rather than taking off from the ground, which saves energy and removes the dependence on flat ground; after controlling itself to hover, the movable device can perform subsequent aerial work such as photographing. Because energy is saved before the hover is reached, more energy remains for the aerial work itself (for example, shooting time or the number of shots can be increased). In addition, after hovering the image capture device is automatically aimed at the throwing body, which reduces the complexity of operating the movable device and increases the efficiency of its aerial work.
  • The movable device of the present disclosure, by providing the first sensing device, the controller and the power output component, can be launched by a throw from the throwing body, avoiding the problems caused by taking off from uneven ground. The movable device may also carry an image capture device; after hovering, the image capture device automatically aims at the throwing body, either through a recognition unit or according to the motion trajectory after the throw, and performs aerial work (for example, photographing the throwing body), which likewise reduces operating complexity and increases the efficiency of aerial work.
  • FIG. 1 illustrates a flow chart of a method of operation of a mobile device in accordance with an embodiment of the present disclosure.
  • Fig. 2 shows a schematic diagram of the throwing of step S101 in Fig. 1.
  • FIG. 3 illustrates a flow chart of a method of operation of a mobile device in accordance with another embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of the attitude sensing and adjustment of step S304 in FIG. 3.
  • FIG. 5 is a schematic diagram showing hovering and aerial operations in a working method of a mobile device according to an embodiment of the present disclosure.
  • FIG. 6 shows a block diagram of a movable device of an embodiment of the present disclosure.
  • FIG. 7 shows a block diagram of a movable device of another embodiment of the present disclosure.
  • The techniques described in the present disclosure can be applied to movable devices, such as airborne movable devices (for example, fixed-wing aircraft such as airplanes and gliders, rotorcraft such as helicopters, or other aircraft such as blimps and balloons); poles such as fishing rods or other types of movable supports or frames; or movable devices in space (e.g., satellites, space stations, spacecraft).
  • The movable device can move freely in an environment (such as air or space), or move along a preset path or track, or move in an agreed manner.
  • the mobile device can be moved in one, two or three dimensions.
  • the mobile device can automatically move in response to the signal without manual movement.
  • the mobile device can be a vehicle, such as an aerial vehicle or a vehicle in space, or a combination thereof.
  • the vehicle can move freely in one or more specified environments or along a fixed path.
  • the vehicle can include a power system.
  • the power system utilizes an electric motor, an engine, electronic components, magnetic mechanisms, gravity, wind, combustion, and/or other power mechanisms.
  • a manual power system, a human power system, or a power system by means of other organisms may also be applied to a mobile device, such as a vehicle.
  • the moveable device may be a rotorcraft that may be driven or controlled by the rotation of one or more blades.
  • the moveable device can be driven or repositioned by means of one or more rotating blades, propellers, wheels, magnets, tracks, or other mechanisms.
  • the mobile device is an unmanned vehicle, such as an Unmanned Aerial Vehicle (UAV), which may also be referred to as a drone.
  • Unmanned aerial vehicles can hover, adjust direction, and/or follow.
  • the mobile device can be automatically controlled or remotely controlled without requiring someone to be inside or on the mobile device.
  • the mobile device can be remotely manipulated through the terminal. Alternatively, a person may be within the mobile device or over the mobile device to assist in controlling the mobile device.
  • the mobile device can be used to carry loads.
  • the load carried by the mobile device can include a load and/or carrier that can cause the load to move relative to the mobile device.
  • The image capture device of the present disclosure refers to an electronic device capable of recording images, video and/or sound, including but not limited to a video camera, a mobile phone, a tablet computer, a camera module, a still camera, or an X-ray/infrared imager; by function it may be a planar, stereoscopic or panoramic image capture device.
  • By number of lenses, a panoramic image capture device may be a single-lens image capture device or a multi-lens image capture device.
  • The throwing body in the present disclosure refers to an agent capable of propelling the movable device into a throw; it may be a person, an animal, a robot, or a throwing apparatus capable of launching or ejecting the device.
  • In one possible arrangement, a person acts as the throwing body, holds the movable device in the hand, and performs the throw by lifting the device straight up or obliquely upward.
  • In the present disclosure, "hovering" means that the movable device stays in mid-air.
  • The hover may be brief (for example, about 1 s) or maintained for a long time (for example, 10 minutes or more), depending on the subsequent operation: if the subsequent operation can be performed in place, the hover can be kept longer; if it requires adjusting the position (for example, changing the shooting angle or distance), the hover may last only a short time (even less than 1 s).
  • Ordinal terms such as "first" and "second" used in the present disclosure merely distinguish different sensing devices and the like; they do not imply any precedence of the corresponding sensing device, nor any order between one sensing device and another.
  • These ordinals are used only so that a sensing device with a given name can be clearly distinguished from another sensing device with the same name.
  • In some embodiments, sensing devices with different ordinals may be identical or may even be the same physical component.
  • The term "operating method" means that the movable device accomplishes a task by performing corresponding actions. It mainly includes aerial work steps (including but not limited to releasing a first item in the air (jetting or spraying a liquid, a solid, or a liquid-solid mixture), capturing video and/or photographs in the air, and grasping a corresponding second item in the air), and may also include auxiliary steps, including but not limited to moving the device to the position where the aerial work is performed and, after the aerial work is completed, landing on the ground or returning to the throwing body.
  • Note that although some of the embodiments below are described with a drone as the movable device, a person as the throwing body, and shooting as the aerial work, the present disclosure is not limited to these; the movable device can take any of the forms described above.
  • Likewise, the throwing body can be any of the agents described above that can perform the throwing action on the movable device.
  • As mentioned above, a single charge of the movable device's internal battery supports only a limited amount of aerial work: one charge supports only a small number of photos, quick snapshots are impossible, and take-off and landing must happen on flat ground.
  • To at least partly solve or mitigate these problems, the present disclosure provides a method in which the movable device is thrown with a certain force toward the aerial work position (for example, a shooting position). The device uses the work done on it during the throw as part of the power for flight: after being thrown it starts its rotors to maintain flight attitude, flies along the throw direction, and hovers to perform the aerial work (for example, taking pictures) once the kinetic energy of the throw has been consumed.
  • FIG. 1 illustrates a flow chart of a method of operation of a mobile device in accordance with an embodiment of the present disclosure.
  • A working method of a movable device according to an embodiment of the present disclosure includes:
  • S101: the movable device senses whether it has been thrown by a throwing body; S102: after sensing the throw, the movable device controls itself to hover; and S103: after controlling itself to hover, the movable device performs aerial work.
  • The sensing in step S101 of whether the device has been thrown can be implemented in several ways.
  • In one implementation, the movable device senses whether it is moving away from the throwing body. In this embodiment a first sensing device that measures the distance between the device and the throwing body is mounted on the movable device; the first sensing device may be an ultrasonic sensor, an infrared sensor and/or an optical-flow sensor. When the sensed distance to the throwing body changes and the change exceeds a set distance threshold, the device is judged to have been thrown.
  • The ultrasonic, infrared and/or optical-flow sensor can be located inside the movable device or on its outer housing to facilitate distance sensing.
  • A similar implementation also senses a distance value, the only difference being that it senses the change in distance between the movable device and the ground; the corresponding first sensing device can likewise be an ultrasonic sensor, an infrared sensor and/or an optical-flow sensor.
  • Another implementation determines whether the movable device senses an interaction force with the throwing body.
  • This can be implemented with a pressure sensor arranged on the outer surface of the movable device.
  • Before the throw, the throwing body exerts a supporting or gripping force on the movable device, which the pressure sensor can sense; when the device is thrown, this interaction force disappears, and the pressure sensor detects the change promptly.
  • Preferably, the pressure sensor is arranged on the lower surface of the movable device, so that the disappearance of the supporting force can be detected sensitively when the throwing body throws the device.
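The following minimal Python sketch illustrates how the two throw cues described above might be combined; the sensor objects, their read methods and the numeric thresholds are illustrative assumptions, not values given in this disclosure.

```python
import time

SUPPORT_FORCE_MIN_N = 0.5          # assumed: below this, the supporting hand is gone
DISTANCE_CHANGE_THRESHOLD_M = 0.3  # assumed: distance increase that counts as a throw

def is_thrown(pressure_sensor, range_sensor, baseline_distance_m):
    """Return True once either throw cue fires: the support force vanishes,
    or the distance to the throwing body grows past the threshold."""
    if pressure_sensor.read_force_n() < SUPPORT_FORCE_MIN_N:
        return True
    if range_sensor.read_distance_m() - baseline_distance_m > DISTANCE_CHANGE_THRESHOLD_M:
        return True
    return False

def wait_for_throw(pressure_sensor, range_sensor, poll_hz=100):
    """Poll both sensors until the throw is detected."""
    baseline = range_sensor.read_distance_m()
    while not is_thrown(pressure_sensor, range_sensor, baseline):
        time.sleep(1.0 / poll_hz)
```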
  • Fig. 2 shows a schematic diagram of the throwing of step S101 in Fig. 1.
  • In this figure the throwing body is a person.
  • Before the throw, the movable device 201 rests on the person's palm 202; the person throws the movable device 201 in the direction indicated by the arrow in the figure, which may be straight up or obliquely upward.
  • The throw direction is preferably the direction of the subsequent operation (e.g., photographing) of the movable device 201.
  • In some embodiments, the throwing body may be an animal, a robot, or a mechanical throwing apparatus.
  • When the throwing body is a robot, the robot can use some of its internal sensing devices to calculate the throwing force and throwing angle rather precisely, so that the movable device accurately reaches the aerial work point.
  • In step S102, after the throw is sensed, the movable device controls itself to hover.
  • As for the timing, the hover can be initiated once the movable device satisfies a corresponding condition.
  • In one embodiment, the movable device hovers after sensing that its own height is no longer rising.
  • This condition may be implemented by a second sensing device, which may be an ultrasonic sensor, an infrared sensor or an optical-flow sensor mounted on the movable device and used to sense whether the height of the movable device is no longer rising.
  • The sensor samples the height at a set frequency; it can sense the height above the ground or the height relative to the throwing body. At that sampling frequency, if the next height reading is less than or equal to the previous one, the hover condition is judged to be satisfied. Because the throwing body may move or be otherwise uncertain, the ground can be chosen as the reference for height sensing and measurement.
  • In another embodiment, the movable device hovers after sensing that the component of its own velocity in the height direction is zero. This condition can also be implemented by the second sensing device.
  • In this case the second sensing device may be a speed sensor together with an acceleration sensor, used to sense whether the component of the device's velocity in the height direction is zero.
  • The speed sensor may measure linear or angular velocity, and the acceleration sensor measures linear acceleration.
  • In another embodiment, the movable device senses whether it has received a hover control signal sent by an external control terminal.
  • This condition can be implemented by a second sensing device.
  • In this case the second sensing device can be an external-signal receiving device.
  • The external-signal receiving device is electrically coupled to the controller; after receiving the hover signal sent by the external control terminal, it passes the signal to the controller, and the controller drives the corresponding power output component to perform the hover.
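A minimal sketch of the three hover triggers just described, written as a single check sampled at the sensing frequency; the argument names and the epsilon tolerance are illustrative assumptions.

```python
def should_hover(prev_height_m, height_m, vertical_velocity_mps,
                 hover_command_received=False, eps=1e-3):
    """Return True when any of the hover conditions described above is met."""
    if hover_command_received:            # hover signal from an external control terminal
        return True
    if height_m <= prev_height_m:         # sensed height is no longer rising
        return True
    if abs(vertical_velocity_mps) < eps:  # vertical velocity component is (nearly) zero
        return True
    return False
```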
  • As noted above, the hover can be brief (for example, around 1 s) or maintained for a long time, depending on the force and angle with which the throwing body throws the movable device: if the highest point reached after the throw is already the aerial work position, the device can hover there until the work is finished; if the throw is too weak or too strong, the movable device can further adjust its position with the power output component, and the hover is correspondingly short.
  • In some embodiments, the hover condition is that the movable device, relying only on the power of the throw, controls itself to hover when it senses that its height is no longer rising. Under this condition the device's own power output component does not need to generate extra power to bring it to the height required for aerial work, which reduces power consumption and extends usable time (in particular, a small unmanned aerial vehicle carrying a limited energy supply can perform aerial work, including but not limited to shooting, for a longer time).
  • The method in this embodiment of the present disclosure further includes step S103: while or after controlling itself to hover, the movable device performs the aerial work.
  • Figure 5 shows a schematic diagram of hovering and aerial operations.
  • A movable device such as a drone generally moves into the air to perform a function, including but not limited to releasing a first item in the air (jetting or spraying a liquid, a solid, or a liquid-solid mixture), capturing video and/or photographs in the air, and grasping a second item in the air; all of these can be called aerial work. Taking controlled shooting as an example, the overhead viewpoint provides a wider and more unusual field of view that meets corresponding user needs.
  • FIG. 3 illustrates a flow chart of a method of operation of a mobile device in accordance with another embodiment of the present disclosure.
  • The working method of a movable device according to this embodiment of the present disclosure likewise mainly includes:
  • S301: the movable device senses whether it has been thrown by a throwing body; S305: after sensing the throw, the movable device controls itself to hover; and S306: after controlling itself to hover, the movable device performs aerial work.
  • For details of steps S301 and S305, reference may be made to steps S101 and S102 described above; they are not repeated here.
  • In some embodiments, step S301 is followed by step S302: after the throw is sensed, the movable device senses its initial velocity at the moment it was thrown.
  • This initial velocity provides input for the subsequent hovering steps S304 and S306: the sensed initial velocity direction and magnitude, together with the direction relative to the throwing body, determine the subsequent hover and the return toward the throwing body, which depend on the throw speed and direction (that is, on the direction relative to the throwing body).
  • In some embodiments, the initial velocity is sensed by a third sensing device, which may be a speed sensor, an acceleration sensor or an inertial sensor; it senses the device's initial velocity direction and magnitude and its throw angle relative to the throwing body.
  • The speed sensor may measure linear or angular velocity, and the acceleration sensor measures linear acceleration.
  • It should be noted that in this case the third sensing device uses the same components as the second sensing device, only at a different sensing time, so the two can be shared in the movable device, i.e., one speed sensor and one acceleration sensor suffice.
  • In some embodiments, the method further includes step S303: an image capture device is mounted on the movable device, and after the throw the image capture device is aimed at the throwing body (for example, the lens of the image capture device is aimed at part of the throwing body or at the throwing body as a whole).
  • In some embodiments, aiming the image capture device mounted on the movable device at the throwing body may specifically include: determining the initial velocity direction and magnitude of the thrown movable device; determining the motion trajectory of the movable device from that initial velocity direction and magnitude (for example, a parabolic trajectory when the power output component is not outputting power), taking wind resistance into account when determining the trajectory; and adjusting the image capture device to aim at the starting point of the motion trajectory, that is, at the throwing body. In addition, a recognition unit, such as a face recognition system, may further assist the image capture device in aiming at and recognizing the throwing body (i.e., aiming at the starting point of the trajectory gives a coarse alignment, and the recognition unit achieves precise alignment).
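As a rough illustration of this coarse alignment step, the sketch below propagates the sensed initial velocity along a drag-free parabolic trajectory and computes the gimbal yaw/pitch that points the camera back at the trajectory's starting point (the throwing body); the coordinate convention and function names are assumptions for illustration, and wind resistance is ignored here.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def position_at(t, v0, launch=(0.0, 0.0, 0.0)):
    """Drag-free ballistic position at time t for initial velocity v0 = (vx, vy, vz),
    with z pointing up and the launch point as origin by default."""
    x0, y0, z0 = launch
    vx, vy, vz = v0
    return (x0 + vx * t, y0 + vy * t, z0 + vz * t - 0.5 * G * t * t)

def yaw_pitch_to_launch(current_pos, launch=(0.0, 0.0, 0.0)):
    """Gimbal yaw and pitch (radians) that aim the camera at the launch point."""
    dx = launch[0] - current_pos[0]
    dy = launch[1] - current_pos[1]
    dz = launch[2] - current_pos[2]
    yaw = math.atan2(dy, dx)
    pitch = math.atan2(dz, math.hypot(dx, dy))
    return yaw, pitch
```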
  • In some embodiments, before the throw, a recognition unit of the movable device pre-stores partial information (for example, the face) of the throwing body, or the image capture device of the movable device captures a partial image of the throwing body, which a recognition unit then receives and pre-stores as the partial information of the throwing body. Further, after the movable device hovers, it uses the stored partial information to control a gimbal on the device (the gimbal carrying the image capture device) or the orientation of the device itself, adjusting the image capture device to find and aim at the throwing body for photographing, following, or other aerial work.
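The fine alignment by the recognition unit can be pictured as a small correction applied on top of the coarse aim; in the sketch below the recognition unit is assumed to report the face's pixel offset from the image centre, and the offset is converted into gimbal yaw/pitch nudges (the linear small-angle mapping and all names are illustrative assumptions).

```python
def gimbal_correction(face_offset_px, image_size_px, fov_rad):
    """Map a face offset (dx, dy) in pixels to approximate yaw/pitch corrections
    in radians, assuming a pinhole-like small-angle relation."""
    dx, dy = face_offset_px
    width, height = image_size_px
    fov_x, fov_y = fov_rad
    return (dx / width * fov_x, dy / height * fov_y)
```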
  • In some embodiments, the method further includes step S304: the movable device senses and adjusts its own attitude.
  • Specifically, the movable device may sense its own angular velocity and adjust its spatial attitude accordingly.
  • This step can be implemented by a gyroscope (for example, a three-axis gyroscope) together with the power output component.
  • The gyroscope, a rigid body spinning about a support point, is mainly used to determine the flight attitude; it can be a piezoelectric, MEMS or laser gyroscope and is used to measure angular velocity.
  • The power output component can be a rotor, used to correct the deflection the device experiences.
  • FIG. 4 is a schematic diagram of the attitude sensing and adjustment of step S304 in FIG. 3.
  • For the movable device 401 shown in FIG. 4, its gyroscope senses a non-zero angular velocity, indicating that a deflection exists; the three-axis gyroscope transmits the sensed three-axis angular velocity values to the controller, the controller computes the correction corresponding to the deflection, and the power output component adjusts its angle and output accordingly, changing the deflection so as to stabilize the attitude.
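A minimal sketch of this stabilization step: the three-axis angular rates reported by the gyroscope are turned into opposing rotor commands by a proportional (rate-damping) law; the gain and the command convention are illustrative assumptions, not part of the disclosure.

```python
KP = 0.8  # proportional rate-damping gain (illustrative)

def rate_damping_command(gyro_rates_rad_s):
    """Return (roll_cmd, pitch_cmd, yaw_cmd) that oppose the sensed body rates
    (p, q, r), driving the residual rotation toward zero."""
    p, q, r = gyro_rates_rad_s
    return (-KP * p, -KP * q, -KP * r)
```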
  • In some embodiments, the method further includes step S306: while or after controlling itself to hover, the movable device performs the aerial work.
  • FIG. 5 is a schematic diagram showing hovering and aerial operations in a working method of a mobile device according to an embodiment of the present disclosure.
  • A movable device such as a drone generally moves into the air to perform a function, including but not limited to releasing a first item in the air (jetting or spraying a liquid, a solid, or a liquid-solid mixture), capturing video and/or photographs in the air, and grasping a second item in the air; all of these can be called aerial work. Taking controlled shooting as an example, first of all the overhead viewpoint provides a wider and more unusual field of view that meets corresponding user needs.
  • In some embodiments, the movable device 501 can include an image capture device 5011 that captures images after the movable device 501 recognizes a first set portion of the throwing body 502. In this case the position where the movable device 501 hovers is already suitable for shooting, and after a simple focusing step, for example recognizing the first set portion (for example, the face) of the throwing body 502, video or photo capture can be performed.
  • In some embodiments, after the movable device 501 has moved into the range from which the throwing body 502 can be captured, the mounted image capture device recognizes the first set portion of the throwing body and then captures images. This is the case where, because the throwing force or angle of the throwing body 502 was insufficient, the movable device 501 did not reach a suitable shooting position; the power output component adjusts the position of the movable device 501 and moves it into capture range of the throwing body 502 before photographing or filming. A locating operation such as face recognition is also needed here.
  • In some embodiments, after the movable device 501 has moved into capture range of the throwing body 502, the mounted image capture device recognizes the first set portion of the throwing body, the angle of the image capture device is adjusted, and images are then captured. This is the case where, in addition to the insufficient throwing force or angle, the camera angle is not accurately aligned; the power output component adjusts the position of the movable device 501 and moves it into capture range of the throwing body 502 before photographing or filming. A locating operation such as face recognition is again needed, and the image capture device is carried on a gimbal mounted on the movable device, whose rotation about its axes changes the capture angle.
  • In some embodiments, the method further includes step S307: after performing the aerial work, the movable device moves back toward and close to the throwing body.
  • For the approach, the device can use the previously measured distance and relative position of the throwing body and drive the power output component to move toward it.
  • In some embodiments, a fourth sensing device senses whether the distance between the movable device and a second set portion of the throwing body has reached a set distance, and the power output component stops the power output once the set distance is reached.
  • The fourth sensing device may be a ranging sensor, specifically an ultrasonic sensor, an infrared sensor and/or an optical-flow sensor, mounted on the movable device to sense the distance to the throwing body; when the distance changes and the change falls within the set distance threshold, the controller stops the power output.
  • In some cases the fourth sensing device, the second sensing device and the first sensing device use the same components, only at different sensing times, so they can be shared within the movable device, i.e., a single ultrasonic, infrared and/or optical-flow sensor suffices.
  • For example, when a drone is used for taking pictures, after the shot the drone uses its battery to power a return flight along the original direction; when it has flown back to the vicinity of the throwing body, the throwing body can place a hand under the drone.
  • The drone's fourth sensing device (such as an ultrasonic, infrared and/or optical-flow sensor) senses the presence of the second set portion (for example, a human hand), e.g. senses that the distance between the hand and the drone is within the set distance threshold, then reduces flight lift and lands on the hand. Taking off from and landing on the hand in this way makes take-off and landing possible even when the ground is not flat.
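The return-and-land behaviour can be sketched as follows, assuming a hypothetical flight-control object and ranging sensor (their method names and the hand-distance threshold are illustrative, not taken from the disclosure): fly back along the reversed throw heading, and once the hand is sensed within the set distance, reduce lift until touchdown.

```python
import time

HAND_DISTANCE_THRESHOLD_M = 0.15  # assumed set distance for detecting the waiting hand

def return_and_land(flight, range_sensor, reverse_heading_rad):
    """Fly back toward the thrower and settle onto the detected hand."""
    flight.fly_heading(reverse_heading_rad)       # head back along the original direction
    while range_sensor.read_distance_m() > HAND_DISTANCE_THRESHOLD_M:
        time.sleep(0.01)                          # keep approaching until the hand is close
    while not flight.touchdown_detected():
        flight.reduce_lift()                      # gradually reduce lift to land on the hand
    flight.stop_motors()
```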
  • The movable device 600 of this embodiment includes: a first sensing device 601 configured to sense whether the movable device has been thrown by a throwing body; a controller 602 configured to receive the thrown signal transmitted by the first sensing device 601 and generate a control signal; and a power output component 603 configured to hover the movable device according to the control signal after the device has been thrown.
  • In some embodiments, the first sensing device 601 is mounted on the movable device and senses the distance to the throwing body. It may be an ultrasonic sensor, an infrared sensor and/or an optical-flow sensor; when the sensed distance to the throwing body changes and the change exceeds a set distance threshold, the movable device is judged to have been thrown.
  • The ultrasonic, infrared and/or optical-flow sensor can be arranged on the outside of the movable device to facilitate distance sensing.
  • Similarly, the first sensing device 601 can be configured to sense a distance value in another way, the only difference being that the sensed quantity is the change in distance between the movable device and the ground; correspondingly, the first sensing device can likewise be an ultrasonic sensor, an infrared sensor and/or an optical-flow sensor.
  • In yet another implementation, the first sensing device 601 determines whether the movable device senses an interaction force with the throwing body. This can be implemented with a pressure sensor arranged on the outer surface of the movable device: before the throw, the throwing body exerts a supporting or gripping force on the device, which the pressure sensor senses; when the device is thrown, this interaction force disappears and is promptly detected by the pressure sensor.
  • Preferably, the pressure sensor is arranged on the lower surface of the movable device, so that the disappearance of the supporting force can be detected sensitively when the throwing body throws the device.
  • The controller 602, as the central control component of the movable device, is connected to each sensing device and to the power output component 603; it receives the signals transmitted by the sensing devices and generates the corresponding control signals.
  • The operating method provided by the embodiments of the present disclosure is designed around the throwing body, for example a person, and the corresponding operation is relatively simple.
  • Users can master the operation of the movable device (for example, an unmanned aerial vehicle) with little or no training: they only need a basic throwing technique to get the device to a certain height, after which the device performs the aerial work on its own without further manipulation by the user. Furthermore, after the work is finished, the user only needs to provide a place for the device to land; it lands there automatically, without further operation, which avoids mistakes or errors caused by manual operation.
  • A specific composition of a movable device (for example, a drone) according to another embodiment of the present disclosure is shown in FIG. 7. In addition to the first sensing device 601, the controller 602 and the power output component 603, it includes an image capture device mounted on the movable device and used to aim at the throwing body after the movable device has been thrown; for the details, refer to the description above, which is not repeated here. During aerial work such as filming and photographing, the corresponding operations are performed by the image capture device; for the aiming manner after the throw, refer to the description of FIGS. 1-5 above.
  • In some embodiments, the movable device 600 further includes a third sensing device 604 mounted on the device and used, after the movable device 600 has been thrown, to sense the device's initial velocity at the moment of the throw.
  • The third sensing device 604 can be a speed sensor together with an acceleration sensor, sensing the initial velocity direction and magnitude and the throw angle relative to the throwing body.
  • The speed sensor may measure linear or angular velocity, and the acceleration sensor measures linear acceleration.
  • In some embodiments, the movable device 600 further includes: a dispensing device for releasing a first item in the air (jetting or spraying a liquid, a solid and/or a liquid-solid mixture); an image capture device for filming and/or photographing in the air; and a grasping device for grasping a corresponding second item in the air.
  • In some embodiments, to carry out the shooting in a specific task, the movable device further includes a recognition unit for recognizing the first set portion of the throwing body; the recognition unit may be any face recognition system of the prior art.
  • The recognition system may include four parts: face image acquisition and detection, face image preprocessing, face image feature extraction, and matching and recognition; the corresponding first set portion may be a human face. A gimbal mounted on the movable device carries the image capture device and adjusts its angle; the image capture device is further used to capture images of the throwing body after the recognition unit has recognized the first set portion.
  • the mobile device 600 further includes a second sensing device 605 for sensing whether the mobile device meets the hover condition.
  • the second sensing device 605 can be an ultrasonic sensor, an infrared sensor, or an optical flow sensor, mounted on the movable device for sensing whether the height of the movable device itself is no longer rising.
  • The sensor samples the height at a set frequency; it can sense the height above the ground or the height relative to the throwing body. At that sampling frequency, if the next height reading is less than or equal to the previous one, the hover condition is judged to be satisfied. Because the throwing body may move or be otherwise uncertain, the ground can be chosen as the reference for height sensing and measurement.
  • In another embodiment, the second sensing device 605 can be used to sense that the component of the movable device's velocity in the height direction is zero, after which the device hovers.
  • In this case the second sensing device 605 may be a speed sensor and an acceleration sensor for sensing whether the component of the device's velocity in the height direction is zero.
  • The speed sensor may measure linear or angular velocity, and the acceleration sensor measures linear acceleration.
  • the second sensing device 605 can be used to sense whether the mobile device receives a hovering control signal from an external control terminal.
  • In this case the second sensing device 605 can be an external-signal receiving device electrically coupled to the controller; after receiving the hover signal sent by the external control terminal, it passes the signal to the controller, and the controller drives the corresponding power output component to perform the hover.
  • In some embodiments, the movable device 600 further includes a fourth sensing device 606 for sensing whether the distance between the movable device and a second set portion of the throwing body (e.g., the hand of the throwing body) has reached a set distance; the power output component 603 is further used to stop the power output once the set distance has been reached.
  • The fourth sensing device 606 can be a ranging sensor, specifically an ultrasonic sensor, an infrared sensor and/or an optical-flow sensor, mounted on the movable device to sense the distance to the throwing body; when the distance changes and the change falls within the set distance threshold, the controller stops the power output.
  • In some cases the fourth sensing device 606, the third sensing device 604 and the first sensing device 601 use the same components, only at different sensing times, so they can be shared within the movable device, i.e., a single ultrasonic, infrared and/or optical-flow sensor suffices.
  • The power output component 603 can include an electric motor and a rotor; the motor is mechanically coupled to the rotor and drives it to rotate so as to output power for the movable device 600.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Toys (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A movable device and an operating method thereof are disclosed. The operating method of the movable device includes: sensing whether the device has been thrown by a throwing body; after sensing the throw, controlling itself to hover; and, after controlling itself to hover, performing aerial work. Because the device is launched by a throw, it does not need to take off from the ground, let alone depend on flat ground, which saves energy; and once the movable device has controlled itself to hover, it can carry out subsequent aerial work such as photographing.

Description

可移动设备以及其作业方法
版权申明
本专利文件披露的内容包含受版权保护的材料。该版权为版权所有人所有。版权所有人不反对任何人复制专利与商标局的官方记录和档案中所存在的该专利文件或者专利披露。
技术领域
本公开涉及远程控制领域,更具体地涉及可移动设备,还涉及可移动设备的作业方法。
背景技术
可移动设备,例如搭载影像摄取装置的自拍无人机,通常使用电池作为能源。作为自拍无人机,通常需要达到两个特点:一是便携(如体积小、重量轻);另一个是长时间续航(例如充一次电能拍摄足够多照片)。这是两个相互矛盾的特点,通常电池占了无人机的大部分重量,要做到更小更轻,则减轻电池重量,要更长时间续航,则要增加电池重量。
目前自拍无人机在空中完全靠电池提供能源,只能在平地上起飞降落。在想要拍摄时,需要把飞机放到地上,通过遥控器控制起飞,拍完后再通过遥控器控制降落到地上。在这种情况下。通常拍一张照片只需几秒,但拍照的准备(控制飞机起飞并调整好位置姿态,以及降落)却需要几十秒,无法做到快速拍照。而且起飞降落过程需要花费较多电量,导致电池充一次只能支持拍摄较少的照片。同时由于只能在平地上起飞,当地面不平时,实现拍照难度较大。
发明内容
有鉴于此,本公开的目的在于提供一种可移动设备的作业方法或者可移动设备,以解决以上所述的至少一项技术问题。
根据本公开的一方面,提供一种可移动设备的作业方法,包括:
可移动设备感测自身是否被抛出主体抛出;
当感测被抛出之后,可移动设备控制自身悬停;以及
控制自身悬停后,进行空中作业。
根据本公开的另一方面,提供一种可移动设备,包括:
第一感测装置,用于感测可移动设备是否被抛出主体抛出;
控制器,用于接收第一感测装置传输的可移动设备被抛出信号后产生控制信号;
动力输出部件,用于根据所述控制信号在可移动设备被抛出之后悬停可移动设备。
本公开的可移动设备的作业方法,通过采用抛出的方式不需要从地面起飞更不用依赖平的地面,节约了能量,而且可移动设备控制自身悬停后即可进行后续的拍照等空中作业,由于在达到悬停之前节省了能量,因此可以为后续空中作业提供更多能量(例如可以增加拍摄时间或者拍摄次数);另外,通过悬停后控制影像摄取装置自动对准抛出主体,降低了可移动设备操作者操作的复杂度并且提高了可移动设备空中作业的效率。
本公开的可移动设备,通过设置第一感测装置、控制器和动力输出部件,能够实现被抛出主体抛飞,避免了在不平整的地面起飞带来的相应问题;可移动设备还可设置有影像摄取装置,进行悬停后,影像摄取装置通过一识别单元或者依据抛出后的运动轨迹自动对准抛出主体,以进行空中作业(例如对抛出主体进行拍照),降低了可移动设备操作者操作的复杂度并且提高了可移动设备空中作业的效率。
附图说明
为了更完整地理解本公开实施例及其优势,现在将参考结合附图的以下描述,其中:
图1示出了根据本公开实施例的可移动设备的作业方法流程图。
图2示出了图1中步骤S101的抛出示意图。
图3示出了根据本公开另一实施例的可移动设备的作业方法流程图。
图4示出了图3中步骤S304的感测并调整姿态示意图。
图5示出了本公开实施例可移动设备的作业方法中悬停和空中作业示意图。
图6示出了本公开实施例的可移动设备的方框图。
图7示出了本公开另一实施例的可移动设备的方框图。
此外,各附图并不一定按比例来绘制,而是仅以不影响读者理解的示意性方式示出。
具体实施例
根据结合附图对本公开示例性实施例的以下详细描述,本公开的其它方面、优势和突出特征对于本领域技术人员将变得显而易见。
在本公开中,下述用于描述本公开原理的各种实施例只是说明,不应该以任何方式解释为限制公开的范围。参照附图的下述描述用于帮助全面理解由权利要求及其等同物限定的本公开的示例性实施例。下述描述包括多种具体细节来帮助理解,但这些细节应认为仅仅是示例性的。因此,本领域普通技术人员应认识到,在不脱离本公开的范围和精神的情况下,可以对本文中描述的实施例进行多种改变和修改。此外,为了清楚和简洁起见,省略了公知功能和结构的描述。此外,贯穿附图,相同附图标记用于相同或相似的功能和操作。此外,尽管可能在不同实施例中描述了具有不同特征的方案,但是本领域技术人员应当意识到:在不引起冲突的前提下,可以将不同实施例的全部或部分特征相结合,以形成不脱离本公开的精神和范围的新的实施例。
在本公开中,术语“包括”和“含有”及其派生词意为包括而非限制。
本公开所描述的技术可以应用在可移动设备上,如空中的可移动设备(例如,固定翼飞行器,例如飞机、滑行机,旋翼飞行器,如直升飞机,或者其它的飞行器,如软式飞艇、气球);或杆子,例如鱼竿或其它类型的可移动的支持物或者框架;太空中的可移动设备(如,卫星、空间站、宇宙飞船)。可移动设备可以在一个环境中(如空中或太空)自由移动,或者沿着预设的路径、轨道、或者以约定的方式移动。该可移动设备可以一维、二维或者三维的移动。可移动设备可以响应信号自动移动,而不用手动移动。某些实施例中,所述可移动设备可以是运输工具,如空中的运输工具或者太空中的运输工具,或者他们的结合。运输工具可以在一个或者多个指定的环境中自由移动,或者沿着固定的路径移动。运输工具可包括动力系统。所述动力系统利用电动机、引擎、电子元件、磁性机构、重力、风力、燃烧、和/或者其它的动力机构。在某些实施例中,手动动力系统、人力动力系统、或者借助于其它生物体的动力系统也可以应用在可移动设备上,例如,运输工具。在某些实施例中, 所述可移动设备可以是旋翼飞行器,该旋翼飞行器可以通过一个或者多个叶片的旋转驱动或者控制。所述可移动设备可借助一个或者多个旋转叶片、螺旋桨、轮子、磁铁、轨道、或者其它机构,被驱动或者重新定位。在某些实施例中,所述可移动设备是无人运输工具,如无人飞行器(Unmanned Aerial Vehicle,UAV),也可称为无人机。无人飞行器能够悬停、调整方向、和/或跟随。
可移动设备可以被自动控制或者远程控制而不需要有人在可移动设备内或者可移动设备之上。可移动设备可通过终端被远程操纵。可替换地,也可以有人在可移动设备内或者可移动设备之上,以协助控制该可移动设备。该可移动设备可用于承载载荷。在某些实施例中,由所述可移动设备承载的载荷可包括负载和/或载体,该载体能够使得负载相对可移动设备移动。
本公开的影像摄取装置是指能够摄录图像、影像和/或声音的电子设备,包括但不限于摄像机、手机、平板电脑、摄像头、相机或者X光/红外光成像仪;按照功能分可以是平面影像摄取装置、立体影像摄取装置和全景影像摄取装置;按照镜头数划分,全景影像摄取装置可以是单镜头影像摄取装置或者多镜头影像摄取装置。
本公开中的抛出主体是指能够推动可移动设备抛出的施动者,抛出主体可以是人、动物、机器人或者能进行投弹射或者抛射的抛出设备。一种可能的方式是,人作为抛出主体,手托可移动设备,通过向正上或斜上方托起可移动设备的方式实现相应的抛出动作。
本公开中,用语“悬停”是指可移动设备在半空中停留。悬停的时间可以是短暂的(例如1s左右),或者是维持较长时间(例如10min以上),其可以跟后续的操作有关,如果后续的操作不需要移动在原地进行,则可以保持较长时间;如果后续的操作需要调整位置(例如调整影像摄录的角度和距离远近),则可能维持较短时间(甚至该悬停时间维持1s以下)。
本公开所使用的序数例如“第一”、“第二”等用词,以用以修饰不同类型的感测装置等,其本身并不包含及代表相应感测装置等有任何之前的序数,也不代表某一感测装置与另一感测装置等前后的顺序,这些序数的使用仅用来使具有某命名的一感测装置等得以和另一具有相同命名的感测装置等能作出清楚区分。当然,在一些实施例中含不同序数的感测装置也可以相同或者为同一个具体元件。
本公开中,用语“作业方法”是指可移动设备通过实施相应的动作实现相应任务,主要包括空中作业步骤(包括但不限于空中抛出第一物品(喷射、喷洒液体、固体和或液体固体混合物也包含在其中),空中进行摄像和/或摄影,以及空中抓取相应的第二物品),另外还可包括空中作业辅助步骤,包括但不限于可移动设备运动至进行空中作业的位置,以空中作业完成后降落至地面或者抛出主体。
请注意:尽管以下部分实施例以无人机作为可移动设备、人作为抛出主体来进行详细描述,拍摄作为空中作业,然而本公开不限于此。事实上,可移动设备可以是以上所述的任意形式。此外,抛出主体也可以上述各种实施抛出可移动设备动作的施动者。
如前所述,可移动设备的内部电池充电一次所进行空中作业的时间有限,例如电池充一次电支持拍摄的次数少,不能做到快速拍照,以及必须在平整的地面起飞降落的问题。
为了至少部分解决或减轻上述问题,提出了根据本公开实施例的可移动设备的作业方法。本公开提供一种通过抛出方式,例如在起飞时需要手将可移动设备向空中作业(例如拍摄位置)用一定的力量抛射,可移动设备借助抛出时对其做的功(惯性)作为一部分飞行的动力,抛出后在空中启动旋翼保持飞行姿态,向沿抛射的方向飞行,在飞行到抛射的动能完全消耗完后悬停进行空中作业(例如拍照)。
接下来,将结合图1~2以及图5来详细描述根据本公开实施例的可移动设备(例如无人机)的作业方法具体过程。
图1示出了根据本公开实施例的可移动设备的作业方法流程图。如图1所示,本公开实施例的可移动设备的作业方法,包括:
S101:可移动设备感测自身是否被抛出主体抛出;
S102:当感测被抛出之后,可移动设备控制自身悬停;以及
S103:控制自身悬停后,进行空中作业。
在一些实施例中,步骤S101中的感测自身是否被抛出主体抛出可以有多种实现方式,一种实现方式是可移动设备感测是否远离抛出主体;该种实施方式中可以在可移动设备上安装有感测与抛出主体之间距离的第一感测装置,该第一感测装置可以是超声传感器、红外传感器和/或光流传感器,第一感测装置可以安装于可移动设备上,通过感测与抛出主体之间的距离,当距离发生变化,且变化值达到某设定距 离阈值范围以上时,则判断被可移动设备被抛出。超声传感器、红外传感器和/或光流传感器可以位于可移动设备内或设置于可移动设备的外壳体上,以利于感测距离。
类似的,另一种实现方式也是感测距离值,不同之处仅在于是感测的可移动设备与地面之间的距离变化,相应的也可以设置第一感测装置,第一感测装置同样可以是超声传感器、红外传感器和/或光流传感器。
还有一种实现方式,是判断可移动设备感测是否与抛出主体之间有相互作用力,该方式可以借助一压力传感器予以实现,可以将压力传感器设置于可移动设备的外表面上,抛出之前,抛出主体会给予可移动设备一支持力或者抓握力,压力传感器能够感测到相应的力,当抛出以后,两者之间的相互作用力也消失,所以能够及时被压力传感器所感测到。
作为优选的,压力传感器可以设置于可移动设备的下表面上,当抛出主体抛出可移动设备后,可以灵敏检测支持力消失。
图2示出了图1中步骤S101的抛出示意图。该示意图中,抛出主体为人,未抛出时,可移动设备201位于人手掌202上,人可以按照图中箭头所示方向抛出可移动设备201后,抛出方向可以是正上方或者斜上方,抛出的方向优选的为可移动设备201后续作业(例如拍照)的方位。
在一些实施例中,抛出主体可以为动物、机器人或者机械抛出装置,当为机器人时,机器人可以根据自身内部的一些感测装置,较精确的计算相应的抛出力和抛出角度,以使可移动设备准确到达空中作业点。
请继续参照图1,对于步骤S102,当感测被抛出之后,可移动设备控制自身悬停。对于悬停的时机,在可移动设备满足相应的条件下可以进行悬停。
一实施例中,可移动设备感测自身高度不再上升后悬停。该条件可以通过一第二感测装置实现,第二感测装置可以是超声传感器、红外传感器或光流传感器,安装于可移动设备上,用于感测可移动设备自身高度是否不再上升。设定传感器按照一定的频率进行高度感测,传感器可以感测与地面的高度或者与抛出主体相对的高度,在上述频率下,如果下一感测值的高度小于等于上一感测值的高度,则判断满足悬停条件。由于抛出主体的不确定性或者可移动性,可以选取底面为参照物进行高度的感测和测量。
另一实施例中,可移动设备感测自身的速度在高度方向的分量为0后悬停。该条件下,也可以通过第二感测装置实现,该种情况下第二感测装置可以为速度传感器和加速度传感器,用于感测可移动设备速度在高度方向的分量是否为零。速度传感器可以为线速度或者角度值,加速度传感器用于测量线性加速度。
另一实施例中,可移动设备感测是否接收到一外部控制端发出的悬停控制信号。该条件可以通过一第二感测装置实现,第二感测装置此时可以是外部信号接收装置,该外部信号接收装置与控制器电性耦接,如接收到外部控制端发出的悬停信号后传送至控制端,控制端控制相应的动力输出部件进行悬停操作。
以上已经介绍,悬停的时间可以是短暂的(例如1s左右),或者是维持较长时间,这取决于抛出主体抛出可移动装置的力量和角度,如果抛出后达到最高点的位置即为空中作业位置,则可以一直悬停直至作业结束,如果抛出的力度不够或者抛出的力度过大,可移动设备还可以通过动力输出部件调整其位置,这样则悬停时间较短。
在一些实施例中,悬停的条件是:可移动设备仅依据抛出的动力,感测自身高度不再上升时控制自身悬停。该条件所起作用是无需可移动设备自身的动力输出部件额外产生动力即可将其带至进行空中作业所需高度,减少动力损耗,达到更长的使用时间(尤其是小型的无人飞行器,可以在搭载有限的能量供给器件时,能够进行更长时间的空中作业,包括但不限于拍摄)。
请继续参照附图1,本公开实施例中的可移动设备空中作业还包括步骤S103:可移动设备控制自身悬停时或之后还包括:可移动设备进行空中作业。图5示出了悬停和空中作业示意图。可移动设备,例如无机人移动至空中一般为实现相应功能,很多的包括但不限于空中抛出第一物品(喷射、喷洒液体、固体和或液体固体混合物也包含在其中),空中进行摄像和/或摄影,以及空中抓取相应的第二物品,上述都可称为空中作业。以控制拍摄为例,通过俯视的角度可以获得更广更奇特的视野,满足用户的相应需求。
接下来,将结合图3~5来详细描述根据本公开另一实施例的可移动设备(例如无人机)的作业方法具体过程。
图3示出了根据本公开另一实施例的可移动设备的作业方法流程图。如图3所示,本公开实施例的可移动设备的作业方法,也主要包括:
S301:可移动设备感测自身是否被抛出主体抛出;
S305:当感测被抛出之后,可移动设备控制自身悬停;以及
S306:控制自身悬停后,进行空中作业。
对步骤S301和S305的具体阐述,可分别参照以上所述的步骤S101和S102,此处不予赘述。
在一些实施例中,在步骤S301之后,还包括步骤S302:当感测被抛出之后包括:可移动设备感测自身被抛出时的初始速度。该初始速度可以用于为后续的悬停步骤S304和S306提供条件,该步骤中如果感测出相应的初始速度方向和大小,以及相对于抛出主体的方向,后续的悬停实际以及返回抛出主体也取决于抛出的速度和方向(即相对于抛出主体的方向)。
在一些实施例中,该初始速度可以通过第三感测装置实现,第三感测装置可以为速度传感器、加速度传感器或惯性传感器,感测自身的初始速度方向和大小,感测自身相对于抛出主体的抛出角度。速度传感器可以为线速度或者角度值,加速度传感器用于测量线性加速度。应该注意的是,这种情况下,第三感测装置和第二感测装置元件一样,而且感测时机不同,在可移动设备中可以共用,即设置一个速度传感器和加速度传感器即可。
在一些实施例中,还包括步骤S303,即在可移动设备上安装影像摄取装置,在抛出后该影像摄取装置对准抛出主体(例如影像摄取装置的镜头对准排出主体的局部或者抛出主体整体)。
在一些实施例中,可移动设备上安装的影像摄取装置对准抛出主体具体可包括:确定被抛出的可移动设备被抛出时的初始速度方向和大小,根据初始速度方向和大小确定可移动设备的运动轨迹(例如无动力输出部件输出动力时为抛物线轨迹),确定运动轨迹时考虑风阻,调整影像摄录装置对准运动轨迹的起始点(也即对准抛出主体),另外,进一步的还可以通过一识别单元,例如人脸识别系统,进一步辅助所述影像摄录装置对准识别抛出主体(即调整影像摄录装置对准运动轨迹的起始点初步对准,通过识别单元实现精确对准)。
在一些实施例中,在被抛出之前,可移动设备的一识别单元预存储抛出主体的局部(例如脸部)信息,或者可移动设备的影像摄取装置摄取抛出主体的局部影像,再通过一识别单元接收该局部影像并预存储为抛出主体的局部信息;进一步的,在 可移动设备悬停后,依据所存储的局部信息,控制可移动设备上一云台(该云台搭载有影响摄取装置)或者可移动设备自身朝向,调整影像摄取装置寻找并对准抛出主体,以进行拍照或者跟随等空中作业。
在一些实施例中,还包括步骤S304,即可移动设备感测并调整自身姿态。具体的,可以是可移动设备感测自身的角速度,并调整自身的空间姿态。该步骤可以通过陀螺仪(例如是三轴陀螺仪)以及动力输出部件实现。陀螺仪为绕支点高度转动的刚体,主要用于确定飞行的姿态,可以为压电陀螺仪、微机陀螺仪或者激光陀螺仪,用于测量角速度。动力输出部件可以是悬翼,用于调整自身产生的偏转。
图4示出了图3中步骤S304的感测并调整姿态示意图。图4所示的可移动设备401,其自身的陀螺仪感测到相应角速度,说明存在相应的偏转,三轴陀螺仪将感测的三轴角速度值传输至控制端,控制端计算偏转对应的动力输出部件调整角度等,从而改变相应的偏转,以达到稳定自身姿态的目的。
在一些实施例中,还包括步骤S306:可移动设备控制自身悬停时或之后还包括:可移动设备进行空中作业。图5示出了本公开实施例可移动设备的作业方法中悬停和空中作业示意图。可移动设备,例如无机人移动至空中一般为实现相应功能,很多的包括但不限于空中抛出第一物品(喷射、喷洒液体、固体和或液体固体混合物也包含在其中),空中进行摄像和/或摄影,以及空中抓取相应的第二物品,上述都可称为空中作业。以控制拍摄为例,首先通过俯视的角度可以获得更广更奇特的视野,满足用户的相应需求。
一些实施例中,可移动设备501可以包括影像摄取装置5011,影像摄取装置5011在可移动设备501识别抛出主体502的第一设定部位后,进行影像摄取。该种情况是可移动设备501悬停的位置正好可以进行拍摄,在经过简单的对焦,例如识别抛出主体502的第一设定部位(例如人脸)后,即可以进行摄像或者摄影操作。
一些实施例中,可移动设备501移动至抛出主体502可摄取位置范围内后,安装的影像摄取装置识别抛出主体的第一设定部位后进行影像摄取。该种情况是,由于抛出主体502的抛出力度或者角度有欠缺,可移动设备501未到达合适的拍摄位置,通过动力输出部件对可移动设备501自身的位置进行调整,移动至抛出主体502的可摄取位置范围内后在进行拍照或者摄像。此时也需要进行例如面部识别的定位操作。
一些实施例中,可移动设备501移动至抛出主体502可摄取位置范围内后,安装的影像摄取装置识别抛出主体的第一设定部位并且调整影像摄取装置的角度后进行影像摄取。该种情况是,由于抛出主体502的抛出力度或者角度有欠缺以及摄像头角度未准确对准,可移动设备501未到达合适的拍摄位置,通过动力输出部件对可移动设备501自身的位置进行调整,移动至抛出主体502的可摄取位置范围内后在进行拍照或者摄像。此时也需要进行例如面部识别的定位操作。还需要的是通过云台等设备搭载影像摄取装置(其中云台安装于可移动设备上),通过云台的绕轴转动实现影像摄取装置的角度变动。
在一些实施例中,还包括步骤S307:即可移动设备进行空中作业之后,可移动设备移动靠近至所述抛出主体。对于靠近的位置,可以根据事先测定的距离抛出主体的相对位置,通过驱动动力输出部件向抛出主体移动靠近。
在一些实施例中,还包括第四感测装置,其用于感测可移动设备与抛出主体的第二设定部位之间的距离是否到达设定距离,所述动力输出部件还用于达到设定距离之内后,停止动力输出。该第四感测装置可以是一测距传感器,具体可以是超声传感器、红外传感器和/或光流传感器,第四感测装置可以安装于可移动设备上,通过感测与抛出主体之间的距离,当距离发生变化,且变化值达到某设定距离阈值范围以内时,则通过控制器停止动力输出。一些情况下,第四感测装置、第二感测装置和第一感测装置元件一样,而且感测时机不同,在可移动设备中可以共用,即设置一个超声传感器、红外传感器和/或光流传感器即可。
例如采用无人机进行拍照,在无人机拍完照后,无人机用电池提供可移动设备按照原方向返回的动力,当飞回到抛出主体附近时,抛出主体可以把手放到无人机下方,无人机的第四感测装置(例如超声传感器、红外传感器和/或光流传感器)感应到第二设定部位(例如人手)的存在(例如感应到手与无人机的距离达到设定距离阈值范围以内),减少飞行升力,降落到人手上。这样在手上起飞降落,实现在地面不平的情况下也能起飞降落。
接下来,将结合图6来详细描述根据本公开一实施例的可移动设备(例如无人机)具体组成。如图6所示,本实施例的可移动设备600,包括:第一感测装置601,用于感测可移动设备是否被抛出主体抛出;控制器602,用于接收第一感测装置601 传输的可移动设备被抛出信号并产生控制信号并产生控制信号;动力输出部件603,用于在可移动设备被抛出之后根据控制信号悬停可移动设备。
在一些实施例中,第一感测装置601可以安装在可移动设备上,感测与抛出主体之间距离的,该第一感测装置601可以是超声传感器、红外传感器和/或光流传感器,第一感测装置601可以安装于可移动设备上,通过感测与抛出主体之间的距离,当距离发生变化,且变化值达到某设定距离阈值范围以上时,则判断被可移动设备被抛出。超声传感器、红外传感器和/或光流传感器可以位于可移动设备的外部,以利于感测距离。
类似的,可以设置第一感测装置601通过另一种实现方式也是感测距离值,不同之处仅在于是感测的可移动设备与地面之间的距离变化,相应的也,第一感测装置同样可以是超声传感器、红外传感器和/或光流传感器。
第一感测装置601还有另一种实现方式,即判断可移动设备感测是否与抛出主体之间有相互作用力,该方式可以借助一压力传感器予以实现,可以将压力传感器设置于可移动设备的外表面上,抛出之前,抛出主体会给予可移动设备一支持力或者抓握力,压力传感器能够感测到相应的力,当抛出以后,两者之间的相互作用力也消失,所以能够及时被压力传感器所感测到。
作为优选的,压力传感器可以设置于可移动设备的下表面上,当抛出主体抛出可移动设备后,可以灵敏检测支持力消失。
对于控制器602,其作为可移动设备中的中控部件,与各感测装置和动力输出部件603连接,用于接收各感测装置传入的信号,从而产生对应的控制信号。
本公开实施例提供的作业方法对应于抛出主体对象,例如人,相应操作也相对简便。用户可以通过减少训练或者不经过训练即能掌握可移动设备(例如无人飞行器)的操作方法,仅需要掌握一定抛出技巧,使可移动设备到达一定高度,可移动设备即可自行进行空中作业,无需用户再进一步操控。进一步的,在控制作业结束后,只需提供可移动设备降落的地点,其可以自动降落至该地点,无需再操作以避免人工操作引起的误操作或者误差。
接下来,将结合图7来详细描述根据本公开另一实施例的可移动设备(例如无人机)具体组成。如图7所示,除第一感测装置601、控制器602和动力输出部件603外还包括:影像摄取装置,其安装于所述可移动设备上,用于在可移动设备被 抛出之后,对准抛出主体,具体请参见上文,在此不予赘述。在进行例如摄影和拍照的空中作业时,可通过影像摄取装置进行相应操作。上述抛出后的对准方式参照上述对于图1-5的描述,在此不予赘述。
在一些实施例中,可移动设备600还包括第三感测装置604,安装于可移动设备上,用于在可移动设备600被抛出之后,感测自身被抛出时的初始速度。第三感测装置604可以为速度传感器和加速度传感器,感测自身的初始速度方向和大小,感测自身相对于抛出主体的抛出角度。速度传感器可以为线速度或者角度值,加速度传感器用于测量线性加速度。
在一些实施例中,可移动设备600还包括:投放装置,用于空中抛出第一物品(喷射、喷洒液体、固体和/或液体固体混合物也包含在其中),影像摄取装置,用于空中进行摄像和/或摄影,以及抓取装置,用空中抓取相应的第二物品。以影像摄取装置进行拍摄为例,首先通过俯视的角度可以获得更广更奇特的视野,满足用户的相应需求。
在一些实施例中,为实现具体的控制作业中的拍摄,可移动设备还包括识别单元,用于识别抛出主体的第一设定部位,该识别单元可以是现有技术的各种人脸识别系统,可以包括四个组成部分:人脸图像采集与检测部分、人脸图像预处理部分、人脸图像特征提取部分以及匹配与识别部分,相应的第一设定部位可以为人脸;云台,安装于可移动设备上,并搭载所述影像摄取装置,用于调整影像摄取装置的角度;其中,所述影像摄取装置还用于设备单元所述第一设定部位后对抛出主体进行影像摄取。
在一些实施例中,可移动设备600还包括第二感测装置605,用于感测可移动设备是否满足悬停条件。
一实施例中,第二感测装置605可以是超声传感器、红外传感器或光流传感器,安装于可移动设备上,用于感测可移动设备自身高度是否不再上升。设定传感器按照一定的频率进行高度感测,传感器可以感测与地面的高度或者与抛出主体相对的高度,在上述频率下,如果下一感测值的高度小于等于上一感测值的高度,则判断满足悬停条件。由于抛出主体的不确定性或者可移动性,可以选取地面为参照物进行高度的感测和测量。
另一实施例中,可以通过第二感测装置605实现感测可移动设备自身的速度在高度方向的分量为0后悬停。该种情况下第二感测装置605可以为速度传感器和加速度传感器,用于感测可移动设备速度在高度方向的分量是否为零。速度传感器可以为线速度或者角度值,加速度传感器用于测量线性加速度。
另外还有一种形式是,可以通过第二感测装置605实现感测可移动设备是否接收到一外部控制端发出的悬停控制信号。该条件下,第二感测装置605可以是外部信号接收装置,该外部信号接收装置与控制器电性耦接,如接收到外部控制端发出的悬停信号后传送至控制端,控制端控制相应的动力输出部件进行悬停操作。
在一些实施例中,可移动设备600还包括第四感测装置606,用于感测可移动设备与抛出主体的第二设定部位(例如抛主体的手部)之间的距离是否到达设定距离,所述动力输出部件603还用于达到设定距离之内后,停止动力输出。该第四感测装置606可以是一测距传感器,具体可以是超声传感器、红外传感器和/或光流传感器,第四感测装置606可以安装于可移动设备上,通过感测与抛出主体之间的距离,当距离发生变化,且变化值达到某设定距离阈值范围以内时,则通过控制器停止动力输出。一些情况下,第四感测装置606、第三感测装置604和第一感测装置601的元件一样,而且感测时机不同,在可移动设备中可以共用,即设置一个超声传感器、红外传感器和/或光流传感器即可。
在一些实施例中,动力输出部件603可以包括电动机和悬翼,电动机与悬翼机械连接,电动机带动悬翼转动以为可移动设备600输出动力。
尽管已经参照本公开的特定示例性实施例示出并描述了本公开,但是本领域技术人员应该理解,在不背离所附权利要求及其等同物限定的本公开的精神和范围的情况下,可以对本公开进行形式和细节上的多种改变。因此,本公开的范围不应该限于上述实施例,而是应该不仅由所附权利要求来进行确定,还由所附权利要求的等同物来进行限定。

Claims (27)

  1. 一种可移动设备的作业方法,包括:
    可移动设备感测自身是否被抛出主体抛出;
    当感测被抛出之后,可移动设备控制自身悬停;以及
    控制自身悬停后,进行空中作业。
  2. 根据权利要求1所述的方法,其特征在于,感测被抛出之后包括:可移动设备上安装的影像摄取装置对准抛出主体。
  3. 根据权利要求1所述的方法,其特征在于,当感测被抛出之后包括:可移动设备感测自身被抛出时的初始速度。
  4. 根据权利要求3所述的方法,其特征在于,所述感测自身被抛出时的初始速度具体包括:感测自身的初始速度方向和大小,感测自身相对于抛出主体的抛出角度。
  5. 根据权利要求2所述的方法,其特征在于,可移动设备上安装的影像摄取装置对准抛出主体包括:
    确定被抛出的可移动设备被抛出时的初始速度方向和大小;
    根据初始速度方向和大小确定可移动设备的运动轨迹;
    调整影像摄录装置对准运动轨迹的起始点;
    和/或通过一识别单元辅助所述影像摄录装置对准识别抛出主体。
  6. 根据权利要求2所述的方法,其特征在于,
    在被抛出之前,可移动设备的一识别单元预存储抛出主体的局部信息,或者可移动设备的影像摄取装置摄取抛出主体的局部影像,一识别单元接收该局部影像并预存储为抛出主体的局部信息;以及
    在悬停后,依据所述局部信息,通过控制一搭载影像摄取装置的云台或者控制可移动设备自身朝向,调整影像摄取装置以寻找和对准抛出主体。
  7. 根据权利要求1所述的方法,其特征在于,感测被抛出之后还包括:可移动设备感测并调整自身姿态。
  8. 根据权利要求7所述的方法,其特征在于,所述可移动设备感测并调整自身姿态具体包括:可移动设备感测自身的角速度,并根据所述角速度调整自身的空间姿态。
  9. 根据权利要求1所述的方法,其特征在于,可移动设备感测自身是否被抛出的方式包括以下至少一项:
    可移动设备感测是否远离抛出主体;
    可移动设备感测是否与抛出主体之间有相互作用力;以及
    可移动设备感测是否远离地面。
  10. 根据权利要求1所述的方法,其特征在于,当感测被抛出之后,可移动设备控制自身悬停包括:
    仅依据抛出的动力,感测自身高度不再上升时控制自身悬停。
  11. 根据权利要求1所述的方法,其特征在于,所述空中作业包括以下至少一项:投放第一物品、拍摄以及抓取第二物品。
  12. 根据权利要求11所述的方法,其特征在于,所述拍摄具体包括以下任意一项:
    可移动设备安装的影像摄取装置识别抛出主体的第一设定部位后进行影像摄取;
    可移动设备移动至抛出主体可摄取位置范围内后,安装的影像摄取装置识别抛出主体的第一设定部位后进行影像摄取;以及
    可移动设备移动抛出主体可摄取位置范围内后,安装的影像摄取装置识别抛出主体的第一设定部位并且调整影像摄取装置的角度后进行影像摄取。
  13. 根据权利要求1所述的方法,其特征在于,可移动设备控制自身悬停条件满足以下至少一项:
    可移动设备感测自身高度不再上升;
    可移动设备感测自身速度在高度方向的分量为零;或者
    可移动设备接收一外部控制端发出的悬停控制信号。
  14. 根据权利要求10所述的方法,其特征在于,可移动设备进行空中作业之后包括:可移动设备移动靠近所述抛出主体。
  15. 根据权利要求14所述的方法,其特征在于,所述移动靠近抛出主体具体包括:可移动设备感测与抛出主体的第二设定部位之间到达设定距离之内后,停止自身动力输出,降落至所述第二设定部位。
  16. 一种可移动设备,包括:
    第一感测装置,用于感测可移动设备是否被抛出主体抛出;
    控制器,用于接收第一感测装置传输的可移动设备被抛出信号后产生控制信号;
    动力输出部件,用于根据所述控制信号在可移动设备被抛出之后悬停所述可移动设备。
  17. 根据权利要求16所述的设备,其特征在于,还包括:
    影像摄取装置,安装于所述可移动设备上,用于在可移动设备被抛出之后,对准抛出主体。
  18. 根据权利要求16所述的设备,其特征在于,所述第一感测装置包括以下至少一项:
    超声传感器、红外传感器或光流传感器,安装于可移动设备上,用于感测可移动设备是否远离抛出主体,或用于感测可移动设备是否远离地面;以及
    压力传感器,安装于可移动设备底部或侧面,感测可移动设备是否与抛出主体之间有相互作用力。
  19. 根据权利要求16所述的设备,其特征在于,还包括:
    第三感测装置,安装于可移动设备上,用于在可移动设备被抛出之后,感测自身被抛出时的初始速度。
  20. 根据权利要求19所述的设备,其特征在于,所述第三感测装置包括:
    速度传感器和加速度传感器,用于感测可移动设备的被抛出时初始速度方向和大小。
  21. 根据权利要求16所述的设备,其特征在于,还包括陀螺仪,用于感测可移动设备角速度;
    所述控制器还用于接收陀螺仪感测的角速度并控制所述动力输出部件调整可移动设备的空间姿态。
  22. 根据权利要求16所述的设备,其特征在于,还包括以下至少一项:
    投放装置,用于投放第一物品;
    影像摄取装置,用于拍摄;以及
    抓取装置,用于抓取第二物品。
  23. 根据权利要求22所述的设备,其特征在于,还包括:
    识别单元,用于识别抛出主体的第一设定部位;
    云台,安装于可移动设备上,并搭载所述影像摄取装置,用于调整调整影像摄 取装置的角度;
    其中,所述影像摄取装置还用于对准所述第一设定部位,还用于对抛出主体进行影像摄取。
  24. 根据权利要求16所述的设备,其特征在于,还包括第二感测装置,用于感测可移动设备是否满足悬停条件。
  25. 根据权利要求24所述的设备,其特征在于,所述第二感测装置包括以下至少一项:
    超声传感器、红外传感器或光流传感器,安装于可移动设备上,用于感测可移动设备自身高度是否不再上升;
    速度传感器和加速度传感器,用于感测可移动设备速度在高度方向的分量是否为零;以及
    外部信号接收装置,用于接收一外部控制端发出的悬停控制信号。
  26. 根据权利要求16所述的设备,其特征在于,所述动力输出部件还用于驱动可移动设备移动靠近所述抛出主体。
  27. 根据权利要求16所述的设备,其特征在于,还包括第四感测装置,用于感测可移动设备与抛出主体的第二设定部位之间的距离是否到达设定距离,所述动力输出部件还用于达到设定距离之内后,停止动力输出。
PCT/CN2017/101388 2017-09-12 2017-09-12 可移动设备以及其作业方法 WO2019051638A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2017/101388 WO2019051638A1 (zh) 2017-09-12 2017-09-12 可移动设备以及其作业方法
CN201780004507.6A CN108475063A (zh) 2017-09-12 2017-09-12 可移动设备以及其作业方法
US16/812,965 US11435743B2 (en) 2017-09-12 2020-03-09 Throwable unmanned aerial vehicle and method of operation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/101388 WO2019051638A1 (zh) 2017-09-12 2017-09-12 可移动设备以及其作业方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/812,965 Continuation US11435743B2 (en) 2017-09-12 2020-03-09 Throwable unmanned aerial vehicle and method of operation

Publications (1)

Publication Number Publication Date
WO2019051638A1 true WO2019051638A1 (zh) 2019-03-21

Family

ID=63266535

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/101388 WO2019051638A1 (zh) 2017-09-12 2017-09-12 可移动设备以及其作业方法

Country Status (3)

Country Link
US (1) US11435743B2 (zh)
CN (1) CN108475063A (zh)
WO (1) WO2019051638A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110435894B (zh) * 2019-07-03 2022-12-27 江汉大学 一种用于太阳能无人机的空中起飞系统
CN110955258B (zh) * 2019-11-28 2023-04-28 深圳蚁石科技有限公司 四轴飞行器的控制方法、装置、控制器和存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110147515A1 (en) * 2009-12-17 2011-06-23 Gerald Miller Hand launchable unmanned aerial vehicle
CN104685436A (zh) * 2013-12-13 2015-06-03 深圳市大疆创新科技有限公司 无人飞行器起飞及降落方法
CN104808680A (zh) * 2015-03-02 2015-07-29 杨珊珊 一种多旋翼飞行拍摄设备
CN105159321A (zh) * 2015-08-18 2015-12-16 北京奇虎科技有限公司 一种基于无人飞行器的拍照方法和无人飞行器
CN105527972A (zh) * 2016-01-13 2016-04-27 深圳一电航空技术有限公司 无人机飞行控制方法及装置
CN105930047A (zh) * 2016-04-01 2016-09-07 腾讯科技(深圳)有限公司 飞行器的控制方法和装置
CN106647805A (zh) * 2016-12-27 2017-05-10 深圳市道通智能航空技术有限公司 无人机自主飞行的方法、装置以及无人机

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015179797A1 (en) * 2014-05-23 2015-11-26 Lily Robotics, Inc. Unmanned aerial copter for photography and/or videography
US9599992B2 (en) * 2014-06-23 2017-03-21 Nixie Labs, Inc. Launch-controlled unmanned aerial vehicles, and associated systems and methods
CN115158661A (zh) * 2015-01-18 2022-10-11 基础制造有限公司 用于无人机的装置、系统和方法
US10696414B2 (en) * 2015-04-21 2020-06-30 Gopro, Inc. Aerial capture platform
US9557738B2 (en) * 2015-04-21 2017-01-31 Gopro, Inc. Return path configuration for remote controlled aerial vehicle
US20170197731A1 (en) * 2016-01-08 2017-07-13 ZEROTECH (Shenzhen) Intelligence Robot Co., Ltd. Method and apparatus for hand launching unmanned aerial vehicle
CN105843241A (zh) * 2016-04-11 2016-08-10 零度智控(北京)智能科技有限公司 无人机、无人机起飞控制方法及装置
CN106227234B (zh) * 2016-09-05 2019-09-17 天津远度科技有限公司 无人机、无人机起飞控制方法及装置
JP6910785B2 (ja) * 2016-11-09 2021-07-28 キヤノン株式会社 移動撮像装置およびその制御方法、ならびに撮像装置およびその制御方法、無人機、プログラム、記憶媒体
US10996683B2 (en) * 2018-02-09 2021-05-04 Skydio, Inc. Aerial vehicle touchdown detection
WO2021016875A1 (zh) * 2019-07-30 2021-02-04 深圳市大疆创新科技有限公司 飞行器的降落方法、无人飞行器及计算机可读存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110147515A1 (en) * 2009-12-17 2011-06-23 Gerald Miller Hand launchable unmanned aerial vehicle
CN104685436A (zh) * 2013-12-13 2015-06-03 深圳市大疆创新科技有限公司 无人飞行器起飞及降落方法
CN104808680A (zh) * 2015-03-02 2015-07-29 杨珊珊 一种多旋翼飞行拍摄设备
CN105159321A (zh) * 2015-08-18 2015-12-16 北京奇虎科技有限公司 一种基于无人飞行器的拍照方法和无人飞行器
CN105527972A (zh) * 2016-01-13 2016-04-27 深圳一电航空技术有限公司 无人机飞行控制方法及装置
CN105930047A (zh) * 2016-04-01 2016-09-07 腾讯科技(深圳)有限公司 飞行器的控制方法和装置
CN106647805A (zh) * 2016-12-27 2017-05-10 深圳市道通智能航空技术有限公司 无人机自主飞行的方法、装置以及无人机

Also Published As

Publication number Publication date
US11435743B2 (en) 2022-09-06
US20200218265A1 (en) 2020-07-09
CN108475063A (zh) 2018-08-31

Similar Documents

Publication Publication Date Title
US10850838B2 (en) UAV battery form factor and insertion/ejection methodologies
JP6755966B2 (ja) 複数の無人航空ビークルを使用する画像化
CN107438805B (zh) 无人机控制方法及装置
US11858633B2 (en) Methods and systems for door-enabled loading and release of payloads in an unmanned aerial vehicle (UAV)
CN111596649B (zh) 用于空中系统的单手远程控制设备
US20180150073A1 (en) Unmanned aerial vehicle and method for controlling flight of the same
US20200148352A1 (en) Portable integrated uav
CN110733624A (zh) 无人驾驶飞行系统和用于无人驾驶飞行系统的控制系统
KR20180111065A (ko) 무인 항공기 및 이를 제어하는 방법
CN110615095B (zh) 手持遥控装置和飞行系统套件
CN105573343A (zh) 一种无人机抓捕系统
CN102923305A (zh) 一种用于航拍的固定翼飞行器及其起飞降落方法
US11435743B2 (en) Throwable unmanned aerial vehicle and method of operation
CN204287973U (zh) 飞行相机
CN106542105B (zh) 飞行器移动降落方法和系统
CN203567947U (zh) 无人机自稳定云台
JP6910785B2 (ja) 移動撮像装置およびその制御方法、ならびに撮像装置およびその制御方法、無人機、プログラム、記憶媒体
CN105807783A (zh) 飞行相机
KR102365931B1 (ko) 배터리 교체형 드론의 공중 배터리 교체 방법 및 이를 위한 장치
US20170369165A1 (en) Moving device, method of controlling moving device and storage medium
CN112776994A (zh) 包括顺应性臂的航空载具
CN207809802U (zh) 一种飞行器倾斜飞行自稳定云台
KR102322098B1 (ko) 정밀 투하 드론
CN205837219U (zh) 一种多功能航拍飞行器
CN205405272U (zh) 一种无人机抓捕系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17924956

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17924956

Country of ref document: EP

Kind code of ref document: A1