CN108319295B - Obstacle avoidance control method, equipment and computer readable storage medium - Google Patents


Info

Publication number
CN108319295B
CN108319295B
Authority
CN
China
Prior art keywords
obstacle
mobile platform
information
obstacle information
sensor
Prior art date
Legal status
Expired - Fee Related
Application number
CN201810060925.3A
Other languages
Chinese (zh)
Other versions
CN108319295A (en
Inventor
苏冠华
张立天
邬梁爽
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Priority to CN201810060925.3A
Publication of CN108319295A
Application granted
Publication of CN108319295B

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

An obstacle avoidance control method, a device, and a computer-readable storage medium are provided. The method includes: acquiring obstacle information of a first obstacle that a sensor faces, and storing the obstacle information; when the mobile platform moves toward the first obstacle, if the sensor does not face the first obstacle, performing obstacle avoidance using the stored obstacle information. By applying the embodiments of the invention, a better obstacle avoidance effect can be achieved and the user experience improved.

Description

Obstacle avoidance control method, equipment and computer readable storage medium
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to an obstacle avoidance control method, an apparatus, and a computer-readable storage medium.
Background
Unmanned vehicles, such as unmanned aerial vehicles (UAVs), have been developed for use in various fields, including consumer and industrial applications. For example, drones may be operated for entertainment, photography/videography, surveillance, delivery, or other applications, and have expanded many aspects of personal life.
As UAVs become more and more common, their functionality also continues to grow. Obstacle avoidance is a typical UAV capability: it enables the UAV to avoid obstacles and reach its destination.
However, current obstacle avoidance methods perform poorly, resulting in a poor user experience.
Disclosure of Invention
The embodiments of the present invention provide an obstacle avoidance control method, a device, and a computer-readable storage medium, which can achieve a better obstacle avoidance effect and improve the user experience.
In a first aspect of the embodiments of the present invention, an obstacle avoidance control method is provided. The method may be applied to a mobile platform that includes a sensor capable of acquiring obstacle information, and includes: acquiring obstacle information of a first obstacle that the sensor faces, and storing the obstacle information; when the mobile platform moves toward the first obstacle, if the sensor does not face the first obstacle, performing obstacle avoidance using the stored obstacle information.
In a second aspect of the embodiments of the present invention, an obstacle avoidance control method is provided. The method may be applied to a control device and includes: obtaining obstacle information corresponding to a mobile platform, and obtaining the moving direction of the mobile platform; selecting, from the obstacle information, the obstacle information corresponding to the moving direction; and displaying a user interface according to the selected obstacle information.
In a third aspect of the embodiments of the present invention, a mobile platform is provided, including a memory, a processor, and a sensor. The memory is used to store program code; the sensor is used to acquire obstacle information; and the processor is configured to invoke the program code and, when it is executed, to: acquire obstacle information of a first obstacle that the sensor faces, and store the obstacle information; when the mobile platform moves toward the first obstacle, if the sensor does not face the first obstacle, perform obstacle avoidance using the stored obstacle information.
In a fourth aspect of the embodiments of the present invention, a control device is provided, including a memory and a processor. The memory is used to store program code, and the processor is configured to invoke the program code and, when it is executed, to: obtain obstacle information corresponding to a mobile platform, and obtain the moving direction of the mobile platform; select, from the obstacle information, the obstacle information corresponding to the moving direction; and display a user interface according to the selected obstacle information.
In a fifth aspect of the embodiments of the present invention, a computer-readable storage medium is provided, where computer instructions are stored on the computer-readable storage medium, and when the computer instructions are executed, the obstacle avoidance control method proposed in the first aspect or the obstacle avoidance control method proposed in the second aspect of the embodiments of the present invention is implemented.
Based on the above technical solutions, the embodiments of the present invention can achieve a better obstacle avoidance effect and improve the user experience. Moreover, even when the number of sensors is limited, obstacle information in multiple directions can be acquired, achieving a global obstacle avoidance effect. In addition, only the obstacle information in the moving direction of the mobile platform may be displayed, presenting the user with the most intuitive obstacle information.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in their description are briefly introduced below. It is obvious that the drawings described below are only some embodiments of the present invention; those skilled in the art may derive other drawings from them.
Fig. 1 is a schematic structural view of an unmanned aerial vehicle;
FIG. 2 is a schematic diagram of an embodiment of an obstacle avoidance control method;
Figs. 3A-3H are schematic diagrams of obstacle avoidance for an obstacle;
fig. 4 is a schematic diagram of another embodiment of an obstacle avoidance control method;
FIG. 5 is a block diagram of one embodiment of a mobile platform;
FIG. 6 is a block diagram of one embodiment of a control device.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. In addition, the features in the embodiments and the examples described below may be combined with each other without conflict.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein and in the claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be understood that the term "and/or" as used herein is meant to encompass any and all possible combinations of one or more of the associated listed items.
Although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms, which are used only to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information as first information, without departing from the scope of the present invention. Moreover, depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
An embodiment of the present invention provides an obstacle avoidance control method that can be applied to a mobile platform. The mobile platform may include, but is not limited to, at least one of the following: a robot, an unmanned aerial vehicle (UAV), or an unmanned vehicle. For convenience of description, this embodiment takes the mobile platform being a UAV as an example: a gimbal is mounted on the UAV, and a photographing device (such as a camera or video camera) is fixed on the gimbal. The UAV can acquire captured images through the photographing device and perform obstacle avoidance control according to those images. When the mobile platform is a robot or an unmanned vehicle, the processing is similar to that of the UAV and is not repeated here.
Referring to Fig. 1, a schematic structural diagram of the UAV is shown: 10 denotes the nose of the UAV, 11 the propellers, 12 the fuselage, 13 the landing gear, 14 the gimbal on the UAV, 15 the photographing device carried by the gimbal 14 (connected to the fuselage 12 through the gimbal 14), 16 the lens of the photographing device, and 17 the target object.
The gimbal 14 may be a three-axis gimbal, that is, it can rotate about its Roll, Pitch, and Yaw axes. As shown in Fig. 1, 1 denotes the Roll axis of the gimbal, 2 the Pitch axis, and 3 the Yaw axis. When the gimbal rotates about the Roll axis, its roll angle changes; when it rotates about the Pitch axis, its pitch angle changes; and when it rotates about the Yaw axis, its yaw angle changes. Further, when the gimbal rotates about one or more of the Roll, Pitch, and Yaw axes, the photographing device 15 rotates with the gimbal 14, so that it can photograph the target object 17 from different shooting directions and angles.
Similar to the gimbal 14, the fuselage 12 of the UAV can also rotate about its own Roll, Pitch, and Yaw axes. When the fuselage rotates about the Roll axis, its roll angle changes; when it rotates about the Pitch axis, its pitch angle changes; and when it rotates about the Yaw axis, its yaw angle changes.
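As a toy illustration of how a yaw angle determines which way the fuselage (and a body-fixed sensor) faces, the snippet below maps a yaw reading to the nearest of four surrounding directions. The function name, direction labels, and 90-degree quantization are assumptions for illustration only; the patent does not define them.

```python
def facing_direction(yaw_deg):
    """Map a body/gimbal yaw angle (in degrees) to the nearest of four
    surrounding direction labels. Hypothetical helper: the labels and
    the 90-degree spacing are illustrative, not from the patent."""
    directions = ["front", "right", "back", "left"]
    idx = round((yaw_deg % 360) / 90) % 4
    return directions[idx]
```

A yaw of 85 degrees, for instance, quantizes to the "right" direction, and negative angles wrap around correctly via the modulo.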
The above briefly introduces the structure of the UAV. To detect obstacles around it, the UAV may further include a sensor capable of acquiring obstacle information; this sensor may be an obstacle avoidance sensor. In practice, the sensor may include, but is not limited to, at least one of the following: a TOF (Time of Flight) camera, a binocular camera, or the main camera. For example, the photographing device 15 may serve as such a sensor: a captured image of the UAV's surroundings is acquired through it, and obstacle information is derived from the image. For example, the target object 17 may be an obstacle in the UAV's surroundings.
Based on the above application scenario, Fig. 2 shows a flowchart of an obstacle avoidance control method provided by an embodiment of the present invention. The method may be applied to a mobile platform (such as a UAV) and may include the following steps:
Step 201, acquiring obstacle information of a first obstacle that the sensor faces, and storing the obstacle information. For example, the obstacle information may be stored in a local storage medium of the UAV; this is not limited.
Acquiring the obstacle information of the first obstacle that the sensor faces may include: when the mobile platform moves toward the first obstacle, if the sensor faces the first obstacle, acquiring the obstacle information corresponding to it; or, when the mobile platform does not move toward the first obstacle, if the sensor faces the first obstacle, acquiring the obstacle information corresponding to it.
Here, the mobile platform not moving toward the first obstacle may include, but is not limited to: the mobile platform not moving at all; or the mobile platform moving in a direction other than that of the first obstacle.
Furthermore, whether or not the mobile platform is moving, the sensor can be turned toward multiple directions; if the sensor is fixed on the mobile platform, the platform itself can be rotated toward multiple directions, so that obstacle information in multiple directions is acquired. Since the direction of the first obstacle may be among these directions, the obstacle information corresponding to the first obstacle can be acquired.
Taking a sensor fixed to the mobile platform as an example: when the platform moves in a direction other than that of the first obstacle, if rotating the sensor from that direction by a preset angle makes it face the first obstacle, the obstacle information corresponding to the first obstacle can be acquired. The preset angle may include, but is not limited to, 90, 180, or 270 degrees; in practice it may be any other angle, such as 45, 60, 135, 150, 195, or 210 degrees. When the sensor is rotatably mounted on the mobile platform, rotating the sensor achieves the same effect as rotating the platform.
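The rotate-and-observe procedure described above can be sketched as follows. Here `sense` is a hypothetical callback standing in for the TOF/stereo/main-camera reading at a given heading; the heading set and return convention are assumptions, not part of the patent.

```python
def scan_surroundings(sense, headings=(0, 90, 180, 270)):
    """Rotate the platform (or sensor) through preset headings, e.g. in
    90-degree steps, and cache the obstacle info observed at each one.
    `sense(heading)` returns obstacle info, or None if nothing is seen."""
    cache = {}
    for heading in headings:
        info = sense(heading)       # one sensor reading per direction
        if info is not None:
            cache[heading] = info   # store for later obstacle avoidance
    return cache
```

The resulting cache is what step 202 later falls back on when the sensor no longer faces the direction of motion.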
The above process is described below with a specific application scenario. Referring to Fig. 3A, A, B, C, and D are four obstacles. For convenience of description, obstacle A is taken as the first obstacle (obstacle B, C, or D could equally serve as the first obstacle), and the sensor is taken to be the main camera. In Fig. 3A, the fuselage 31, the nose 32, and the sensor 33 together make up the UAV; 34 is the field of view of the sensor 33, and the sensor 33 can acquire obstacle information for obstacles within the field of view 34.
In one example, the orientation of the nose 32 and the orientation of the sensor 33 may be the same; for example, when the nose 32 faces obstacle A, the sensor 33 also faces obstacle A. In another example, the two orientations may differ; for example, when the nose 32 faces obstacle A, the sensor 33 may face obstacle B. In the present application scenario, referring to Fig. 3A, for convenience of description the nose 32 and the sensor 33 face the same direction.
In one example, the orientation of the nose 32 and that of the sensor 33 may be coupled (not decoupled): if both face obstacle A and the nose 32 is rotated toward obstacle B, the sensor 33 follows and also faces obstacle B. In another example, the two orientations may be decoupled: if both face obstacle A and the nose 32 is rotated toward obstacle B, the sensor 33 does not follow and still faces obstacle A. In this application scenario, for convenience of description, the orientations of the nose 32 and the sensor 33 are not decoupled.
In the above application scenario, in step 201, the obstacle information of the first obstacle (taking obstacle A as an example) that the sensor faces may be obtained in the following cases, among others:
In the first case, as shown in Fig. 3B, when the UAV moves toward obstacle A, the sensor 33 faces obstacle A, so the obstacle information of obstacle A can be acquired.
In the second case, when the UAV is not moving, it may be rotated toward multiple directions, for example the four directions shown in Figs. 3A and 3C to 3E. When rotated to the direction of Fig. 3A, the obstacle information of obstacle A, which the sensor 33 faces, can be acquired; in the direction of Fig. 3C, that of obstacle C; in the direction of Fig. 3D, that of obstacle B; and in the direction of Fig. 3E, that of obstacle D.
After taking off, the UAV may hover in the air without moving and rotate through a full circle, acquiring obstacle information in the four directions of Figs. 3A and 3C to 3E; alternatively, the UAV may rotate through a full circle during flight to acquire the obstacle information.
Each turn may be by a preset angle, such as 90 degrees: from the direction of Fig. 3A to that of Fig. 3C, from Fig. 3C to Fig. 3D, and from Fig. 3D to Fig. 3E.
In the third case, the UAV moves in a direction other than that of obstacle A; as shown in Fig. 3F, assume it moves toward obstacle D. If the sensor is rotated from the direction of obstacle D to the direction of obstacle A while the UAV continues moving toward obstacle D, then, as shown in Fig. 3G, the sensor 33 faces obstacle A, so the obstacle information of obstacle A can be acquired. The turn from the direction of Fig. 3F to that of Fig. 3G may be by a preset angle, such as 90 degrees. Referring to Fig. 3G, although both the nose 32 and the sensor 33 face obstacle A, the UAV may still move toward obstacle D; the manner of movement is not limited.
Through the above processing, the obstacle information of obstacle A can be obtained in cases one, two, and three and stored in the local storage medium of the UAV. In case two, the obstacle information of obstacles B, C, and D can also be obtained and stored in the local storage medium of the UAV.
Step 202, when the mobile platform moves toward the first obstacle, if the sensor does not face the first obstacle, performing obstacle avoidance using the stored obstacle information; if the sensor does face the first obstacle, acquiring the obstacle information of the first obstacle and performing obstacle avoidance using the currently acquired information.
Further, when the sensor faces the first obstacle, it is also possible to acquire obstacle information of the first obstacle and update the already stored obstacle information with the newly acquired obstacle information.
Referring to Fig. 3B, when the UAV moves toward obstacle A, the sensor 33 faces obstacle A, so the obstacle information of obstacle A can be acquired, and obstacle avoidance is performed using this currently acquired information rather than the previously stored information. Further, the stored obstacle information of obstacle A may be updated with the currently acquired information, so that the latest obstacle information is kept.
Referring to Fig. 3H, when the UAV moves toward obstacle A, the sensor 33 faces obstacle D, i.e., it does not face obstacle A, so the obstacle information of obstacle A cannot be acquired directly. In that case, the stored obstacle information of obstacle A can be read from the UAV's local storage medium and used for obstacle avoidance.
When the UAV moves toward obstacle A, the sensor 33 may face not obstacle A but some other direction, such as obstacle D or obstacle B; this is not limited.
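Step 202's fresh-versus-stored decision, together with the cache update and the second-obstacle case described in this section, might be sketched like this. The dict-based cache, the heading convention, and all names are illustrative assumptions:

```python
def obstacle_info_for_avoidance(move_heading, sensor_heading, live_info, cache):
    """Sketch of step 202: prefer freshly sensed obstacle info when the
    sensor faces the direction of motion; otherwise fall back on the
    previously stored info for that heading. Whatever the sensor sees
    now is cached for later use (the 'second obstacle' case)."""
    if live_info is not None:
        cache[sensor_heading] = live_info   # remember what is seen now
    if sensor_heading == move_heading:
        return live_info                    # avoid with the current reading
    return cache.get(move_heading)          # may be None if never observed
```

For example, while moving toward obstacle A with the sensor turned toward obstacle D, the function returns A's stored information for avoidance and simultaneously records D's fresh information for later.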
In one example, when the mobile platform moves toward the first obstacle, if the sensor faces a second obstacle rather than the first, the obstacle information of the second obstacle can be acquired; subsequently, when the platform moves toward the second obstacle, obstacle avoidance can be performed using that information. For example, after the obstacle information of the second obstacle is acquired, it may be stored; when the mobile platform later moves toward the second obstacle, obstacle avoidance is performed using the stored information.
Referring to fig. 3H, when the drone moves to the obstacle a, since the sensor 33 faces the obstacle D (i.e., the second obstacle), it is also possible to acquire obstacle information of the obstacle D that the sensor 33 faces, and store the obstacle information of the obstacle D in the local storage medium. Then, when the unmanned aerial vehicle moves towards the obstacle D, obstacle avoidance can be performed by using the obstacle information of the obstacle D which is already stored.
In the above embodiment, after obstacle information of the second obstacle (or, likewise, of the first obstacle) is acquired, it may be sent to the control device, so that the control device displays a user interface according to that information. In other words, each piece of acquired obstacle information may be sent to the control device for display; the display process is not limited.
In an example, when the orientation of the main camera is inconsistent with the moving direction of the mobile platform, the mobile platform may further send information that the orientation of the main camera is inconsistent with the moving direction of the mobile platform to the control device, so that the control device displays the user interface according to the information, and the display process is not limited.
In the conventional approach, if there is only one sensor, obstacle avoidance is possible only in the direction the sensor 33 faces, not in other directions, so the obstacle avoidance effect and the user experience are poor. For example, as shown in Fig. 3G, since the sensor 33 faces obstacle A, only the obstacle information of obstacle A can be acquired; if the UAV moves toward obstacle A it can avoid it, but if it moves toward obstacle D it cannot, resulting in a poor obstacle avoidance effect and a poor user experience.
In the embodiment of the present invention, multiple directions can be covered and obstacle avoidance performed in all of them: even with a limited number of sensors, the surrounding obstacle information can be captured and stored for multiple directions, achieving an omnidirectional obstacle avoidance effect, a better avoidance result, and an improved user experience. For example, as shown in Fig. 3G, although the sensor 33 currently faces obstacle A and can only acquire obstacle A's information at the moment, the obstacle information of obstacle D was previously acquired and stored (the acquisition manner is not repeated here), so when the UAV moves toward obstacle D it can still avoid it using the stored information of obstacle D.
The above embodiment is described with a single sensor as an example; in practical applications there may be two or more sensors, and the number of sensors is not limited.
In the above embodiment, the obstacle information may include, but is not limited to, the position coordinates of the obstacle, the distance between the obstacle and the UAV, and the size and shape of the obstacle.
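The fields just listed could be grouped into a record like the following; all field names, types, and units are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class ObstacleInfo:
    """Illustrative container for the fields the description names:
    position coordinates, distance to the vehicle, size, and shape."""
    position: tuple    # (x, y, z) position coordinates of the obstacle
    distance_m: float  # distance between the obstacle and the vehicle
    size_m: float      # characteristic size of the obstacle
    shape: str         # e.g. "box" or "cylinder"
```

A record of this shape is what the platform would store per direction and send to the control device for display.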
An embodiment of the present invention provides an obstacle avoidance control method that can be applied to a control device. The control device may include, without limitation: a remote control, a smartphone/cell phone, a tablet, a personal digital assistant (PDA), a laptop computer, a desktop computer, a media content player, a video game station/system, a virtual reality system, an augmented reality system, a wearable device (e.g., a watch, glasses, gloves, headwear such as a hat, a helmet, a virtual reality headset, an augmented reality headset, a head-mounted device (HMD), or a headband, a pendant, an armband, a leg loop, a shoe, or a vest), a gesture recognition device, a microphone, or any electronic device capable of providing or rendering image data.
Based on the above application scenario, Fig. 4 shows a flowchart of an obstacle avoidance control method provided by an embodiment of the present invention. The method may be applied to a control device and may include the following steps:
step 401, obtaining obstacle information corresponding to the mobile platform, and obtaining a moving direction of the mobile platform.
The mobile platform may include, but is not limited to, at least one of the following: a robot, an unmanned aerial vehicle (UAV), or an unmanned vehicle. For convenience of description, the embodiment of the present invention takes the mobile platform being a UAV as an example: a gimbal is mounted on the UAV, a photographing device (such as a camera or video camera) is fixed on the gimbal, and the UAV can acquire captured images through the photographing device and perform obstacle avoidance control according to those images.
The obstacle information corresponding to the mobile platform may include, but is not limited to, obstacle information for multiple directions acquired by a sensor on the mobile platform and reported by the platform to the control device. The mobile platform may acquire the obstacle information using the flow shown in Fig. 2 and notify the control device of the acquired information (obstacle information in multiple directions). The sensors on the mobile platform may include, but are not limited to, at least one of the following: a TOF camera, a binocular camera, or the main camera.
For example, the obstacle information corresponding to the mobile platform may include the obstacle information of obstacles A, B, C, and D that the platform has notified to the control device. Of course, there may also be other obstacle information; this is not limited.
Since the control device can also control the moving direction of the mobile platform, it can obtain that moving direction; the manner of obtaining it is not limited. For example, the mobile platform moves in the direction of obstacle A, or in the direction of obstacle D.
Step 402, selecting obstacle information corresponding to the moving direction from the obstacle information.
Selecting the obstacle information corresponding to the moving direction from the obstacle information may include: if the moving direction of the mobile platform is a first direction, selecting the obstacle information of the first direction from the obstacle information; if the moving direction of the mobile platform is between a first direction and a second direction, selecting the obstacle information of the first direction and the obstacle information of the second direction from the obstacle information.
For example, assuming that the mobile platform moves toward obstacle A (i.e., the first direction), the control device may select the obstacle information of obstacle A in the first direction from among the obstacle information of obstacle A, obstacle B, obstacle C, and obstacle D.
For another example, if the mobile platform moves between obstacle A (i.e., the first direction) and obstacle D (i.e., the second direction), the control device may select the obstacle information of obstacle A in the first direction and the obstacle information of obstacle D in the second direction from among the obstacle information of obstacle A, obstacle B, obstacle C, and obstacle D.
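As an illustration only (not part of the claimed method), the selection logic of step 402 can be sketched as follows; the function name, the direction labels, and the dictionary layout are hypothetical:

```python
def select_obstacle_info(info_by_direction, moving_direction):
    """Pick the obstacle information matching the moving direction.

    info_by_direction: dict mapping a direction label (e.g. "A".."D")
    to that direction's obstacle information.
    moving_direction: one label, or a (first, second) tuple when the
    platform moves between two directions.
    """
    if isinstance(moving_direction, tuple):
        # Between a first and a second direction: select both.
        first, second = moving_direction
        return [info_by_direction[first], info_by_direction[second]]
    # Exactly one direction: select only that direction's information.
    return [info_by_direction[moving_direction]]
```

With obstacles A through D, moving toward A selects only A's information, while moving between A and D selects the information of both, matching the two examples above.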
Step 403, displaying a user interface according to the selected obstacle information.
Displaying the user interface according to the selected obstacle information may include: displaying the obstacle corresponding to the selected obstacle information on the user interface. Further, displaying the obstacle corresponding to the selected obstacle information on the user interface includes: determining the position of the obstacle and the distance between the obstacle and the mobile platform according to the selected obstacle information; and displaying the obstacle in the user interface based on the position and the distance.
If the obstacle information of obstacle A is selected, obstacle A can be displayed on the user interface; although the obstacle information of obstacle B, obstacle C, and obstacle D also exists, the control device does not display obstacle B, obstacle C, or obstacle D on the user interface. For example, the obstacle information may include the position coordinates of the obstacle, the distance between the obstacle and the mobile platform, and the size and shape of the obstacle. Therefore, the obstacle information of obstacle A may include the position coordinates of obstacle A and the distance between obstacle A and the mobile platform; the control device acquires these, and displays obstacle A on the user interface according to the position and the distance.
Assuming that the obstacle information of obstacle A and the obstacle information of obstacle D are selected, the control device displays obstacle A and obstacle D on the user interface, but does not display obstacle B or obstacle C. For example, the control device displays obstacle A on the user interface based on the position coordinates of obstacle A and the distance between obstacle A and the mobile platform, and displays obstacle D on the user interface based on the position coordinates of obstacle D and the distance between obstacle D and the mobile platform.
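As a purely illustrative sketch of step 403 (the field names `position` and `distance` are assumptions, not terms from the patent), only the obstacles selected in step 402 are passed to the interface for rendering:

```python
def build_ui_obstacles(all_infos, selected_ids):
    """Return only the obstacles selected in step 402, with the two
    fields the interface needs: position coordinates and distance.
    Unselected obstacles (e.g. B and C) are simply not rendered."""
    rendered = []
    for oid in selected_ids:
        info = all_infos[oid]
        rendered.append({"id": oid,
                         "position": info["position"],
                         "distance": info["distance"]})
    return rendered
```

For instance, with obstacle information for A and B available but only A selected, the returned list contains a single entry for A.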
In an example, after obtaining the obstacle information corresponding to the mobile platform (e.g., obstacle information in multiple directions corresponding to the mobile platform), the control device may further construct a three-dimensional map or a depth map using the obstacle information corresponding to the mobile platform; the construction process is not limited.
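Since the construction process is not limited by the patent, the following is only a minimal sketch of one possible depth-map construction; the grid layout, field names, and use of infinity for empty cells are all assumptions for illustration:

```python
def build_depth_map(obstacle_infos, width, height):
    """Write each obstacle's distance into a width x height grid at its
    (x, y) position; cells with no obstacle stay at infinity."""
    depth = [[float("inf")] * width for _ in range(height)]
    for info in obstacle_infos:
        x, y = info["position"]
        # Keep the nearest reading if two obstacles share a cell.
        depth[y][x] = min(depth[y][x], info["distance"])
    return depth
```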
In the above mode, the control device displays only the obstacles in the moving direction of the mobile platform, rather than the obstacles in all directions. If obstacles in multiple directions were displayed, the obstacles could not be shown clearly and intuitively, and the user experience would deteriorate. In the embodiment of the invention, since it is the obstacle in the moving direction of the mobile platform that affects the flight of the mobile platform, only the obstacle in the moving direction may be displayed. In this way, the obstacle can be shown clearly and intuitively, the most intuitive obstacle information is presented to the user, the user experience is improved, and the proportion of effective information in the picture is increased.
In one example, if the orientation of the main camera is inconsistent with the moving direction of the mobile platform and an obstacle exists in the moving direction of the mobile platform, the control device may further display, on the user interface, information indicating that the orientation of the main camera is inconsistent with the moving direction of the mobile platform. Specifically, the user interface may include a prompt box, and prompt boxes of different shapes are displayed on the user interface to indicate whether the orientation of the main camera is consistent with the moving direction of the mobile platform. For example, if the orientation of the main camera is consistent with the moving direction of the mobile platform, the prompt box may have a first shape; if the orientation of the main camera is inconsistent with the moving direction of the mobile platform, the prompt box may have a second shape. For example, when the prompt box is rectangular, it indicates that the orientation of the main camera is consistent with the moving direction of the mobile platform, and when the prompt box is not rectangular, it indicates that the orientation is inconsistent. For another example, when the prompt box has a preset width, it indicates that the orientation of the main camera is consistent with the moving direction of the mobile platform, and when the prompt box does not have the preset width, it indicates that the orientation is inconsistent. The first shape and the second shape are not limited.
For example, when the orientation of the main camera is consistent with the moving direction of the mobile platform, the width of the prompt box is 1 cm (i.e., the preset width); when the orientation of the main camera is inconsistent with the moving direction of the mobile platform, the width of the prompt box may be greater than 1 cm (i.e., not the preset width), for example, 1.5 cm. Further, the larger the inconsistency between the orientation of the main camera and the moving direction of the mobile platform, the larger the width of the prompt box.
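The width rule can be sketched as a simple monotonic mapping. Note that the 0.5 cm per 90 degrees scale factor below is an assumed value chosen only so that a 90-degree mismatch yields the 1.5 cm of the example above; the patent does not specify a formula:

```python
PRESET_WIDTH_CM = 1.0  # prompt-box width when orientations agree

def prompt_box_width(angle_diff_deg):
    """Map the angular mismatch between the main camera orientation and
    the moving direction to a prompt-box width: the preset 1 cm when
    consistent, and wider as the mismatch grows (assumed 0.5 cm per
    90 degrees, for illustration only)."""
    if angle_diff_deg == 0:
        return PRESET_WIDTH_CM
    return PRESET_WIDTH_CM + 0.5 * (angle_diff_deg / 90.0)
```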
Based on the same inventive concept as the above method, referring to fig. 5, an embodiment of the present invention further provides a mobile platform 50, which includes a memory 51, a processor 52 (e.g., one or more processors), and a sensor 53. The memory is configured to store program code; the sensor is configured to acquire obstacle information; the processor is configured to invoke the program code and, when the program code is executed, to: acquire obstacle information of a first obstacle toward which the sensor is oriented, and store the obstacle information; and when the mobile platform moves toward the first obstacle, if the sensor does not face the first obstacle, perform obstacle avoidance using the stored obstacle information.
The processor acquires the obstacle information through the sensor, which is not described in detail herein.
Preferably, the processor, when acquiring the obstacle information of the first obstacle to which the sensor is directed, is specifically configured to: when the mobile platform moves towards the first obstacle, if the sensor faces towards the first obstacle, acquiring obstacle information corresponding to the first obstacle facing towards the sensor.
Preferably, the processor, when acquiring the obstacle information of the first obstacle to which the sensor is directed, is specifically configured to: when the mobile platform does not move towards the first obstacle, if the sensor faces towards the first obstacle, acquiring obstacle information corresponding to the first obstacle facing towards the sensor.
Preferably, the processor is further configured to, after storing the obstacle information: when the sensor faces the first obstacle, acquire obstacle information of the first obstacle, and update the stored obstacle information using the newly acquired obstacle information.
Preferably, the processor is further configured to:
when the sensor does not face the first obstacle, if the sensor faces a second obstacle, acquiring obstacle information of the second obstacle, which the sensor faces; and when the mobile platform moves towards the second obstacle, obstacle avoidance is carried out by utilizing the obstacle information of the second obstacle.
Preferably, the processor is further configured to, after acquiring the obstacle information of the second obstacle to which the sensor is directed: and sending the obstacle information of the second obstacle to control equipment so that the control equipment displays a user interface according to the obstacle information of the second obstacle.
Preferably, the processor is further configured to:
when the mobile platform moves towards the first obstacle, if the sensor faces towards the first obstacle, obstacle information of the first obstacle is acquired, and obstacle avoidance is performed by using the acquired obstacle information.
Preferably, the processor is further configured to, after acquiring the obstacle information of the first obstacle to which the sensor is directed: and sending the obstacle information of the first obstacle to control equipment so that the control equipment displays a user interface according to the obstacle information of the first obstacle.
Preferably, the processor is further configured to: and if the orientation of the main camera is inconsistent with the moving direction of the mobile platform, sending information that the orientation of the main camera is inconsistent with the moving direction of the mobile platform to the control equipment.
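The platform-side behaviour described above — store obstacle information while the sensor faces an obstacle, and fall back to the stored copy when later moving toward that obstacle with the sensor facing elsewhere — can be sketched as follows. All names here are illustrative assumptions, not from the patent:

```python
class ObstacleCache:
    """Cache obstacle information keyed by obstacle, refreshed whenever
    the sensor faces that obstacle, and used for obstacle avoidance
    when the sensor no longer faces it."""

    def __init__(self):
        self.stored = {}  # obstacle id -> latest obstacle information

    def on_sensor_reading(self, obstacle_id, info):
        # Sensor currently faces this obstacle: store/refresh its info.
        self.stored[obstacle_id] = info

    def info_for_avoidance(self, obstacle_id, sensor_facing_it, live_info=None):
        # Moving toward this obstacle: prefer live sensor data when the
        # sensor faces it (and refresh the cache); otherwise fall back
        # to the stored information.
        if sensor_facing_it and live_info is not None:
            self.stored[obstacle_id] = live_info
            return live_info
        return self.stored.get(obstacle_id)
```

For example, after the sensor records a first obstacle and the platform then turns so the sensor faces elsewhere, `info_for_avoidance` still returns the stored reading for that obstacle.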
Based on the same inventive concept as the above method, referring to fig. 6, an embodiment of the present invention further provides a control device 60, which includes a memory 61 and a processor 62 (e.g., one or more processors).
The memory is configured to store program code; the processor is configured to invoke the program code and, when the program code is executed, to: obtain obstacle information corresponding to a mobile platform, and obtain a moving direction of the mobile platform; select obstacle information corresponding to the moving direction from the obstacle information; and display the user interface according to the selected obstacle information.
Preferably, the processor is specifically configured to, when selecting the obstacle information corresponding to the moving direction from the obstacle information: and if the moving direction of the mobile platform is a first direction, selecting obstacle information of the first direction from the obstacle information.
Preferably, the processor is specifically configured to, when selecting the obstacle information corresponding to the moving direction from the obstacle information: and if the moving direction of the mobile platform is between a first direction and a second direction, selecting the obstacle information of the first direction and the obstacle information of the second direction from the obstacle information.
Preferably, the processor is specifically configured to, when displaying the user interface according to the selected obstacle information: display the obstacle corresponding to the selected obstacle information on a user interface.
Preferably, when the user interface displays the obstacle corresponding to the selected obstacle information, the processor is specifically configured to: determine the position of the obstacle and the distance between the obstacle and the mobile platform according to the selected obstacle information; and display the obstacle on the user interface according to the position and the distance.
Preferably, the processor is further configured to:
and if the orientation of the main camera is inconsistent with the moving direction of the mobile platform and the moving direction of the mobile platform has an obstacle, displaying information that the orientation of the main camera is inconsistent with the moving direction of the mobile platform on the user interface.
Based on the same concept as the method, an embodiment of the present invention further provides a computer-readable storage medium, where computer instructions are stored on the computer-readable storage medium, and when the computer instructions are executed, the obstacle avoidance control method shown in fig. 2 is implemented.
Based on the same concept as the method, an embodiment of the present invention further provides a computer-readable storage medium, where computer instructions are stored on the computer-readable storage medium, and when the computer instructions are executed, the obstacle avoidance control method shown in fig. 4 is implemented.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by an article of manufacture with certain functionality. A typical implementation device is a computer, which may take the form of a personal computer, laptop computer, cellular telephone, camera phone, smart phone, personal digital assistant, media player, navigation device, email messaging device, game console, tablet computer, wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the units may be implemented in the same software and/or hardware or in a plurality of software and/or hardware when implementing the invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Furthermore, these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only an example of the present invention, and is not intended to limit the present invention. Various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (42)

1. An obstacle avoidance control method is applied to a mobile platform, the mobile platform comprises a sensor capable of acquiring obstacle information, and the method comprises the following steps:
acquiring obstacle information of a first obstacle oriented by the sensor, and storing the obstacle information;
when the mobile platform moves towards the first obstacle, if the sensor does not face the first obstacle, obstacle avoidance is carried out by using the stored obstacle information.
2. The method of claim 1,
the acquiring obstacle information of a first obstacle to which the sensor is directed, includes:
when the mobile platform moves towards the first obstacle, if the sensor faces towards the first obstacle, acquiring obstacle information corresponding to the first obstacle facing towards the sensor.
3. The method of claim 1,
the acquiring obstacle information of a first obstacle to which the sensor is directed, includes:
when the mobile platform does not move towards the first obstacle, if the sensor faces towards the first obstacle, acquiring obstacle information corresponding to the first obstacle facing towards the sensor.
4. The method of claim 3,
the mobile platform not moving toward the first obstacle, comprising:
the mobile platform does not move; or,
the mobile platform moves in a direction other than the direction of the first obstacle.
5. The method of claim 4, wherein the obstacle information is obtained in multiple directions by rotating the mobile platform in multiple directions while the mobile platform is not moving.
6. The method of claim 4,
when the mobile platform moves towards other directions except the direction of the first obstacle, the sensor rotates from the other directions by a preset angle and faces the first obstacle.
7. The method of claim 6,
the preset angle is specifically as follows: 90 degrees; alternatively, 180 degrees; alternatively, 270 degrees.
8. The method of claim 1,
after the storing the obstacle information, the method further comprises:
when the sensor faces the first obstacle, acquiring obstacle information of the first obstacle, and updating the stored obstacle information using the newly acquired obstacle information.
9. The method of claim 1, further comprising:
when the sensor does not face the first obstacle, if the sensor faces a second obstacle, acquiring obstacle information of the second obstacle, which the sensor faces; and when the mobile platform moves towards the second obstacle, obstacle avoidance is carried out by utilizing the obstacle information of the second obstacle.
10. The method of claim 9,
after the acquiring obstacle information of the second obstacle to which the sensor is directed, the method further includes:
and sending the obstacle information of the second obstacle to control equipment so that the control equipment displays a user interface according to the obstacle information of the second obstacle.
11. The method of claim 1, further comprising:
when the mobile platform moves towards the first obstacle, if the sensor faces towards the first obstacle, obstacle information of the first obstacle is acquired, and obstacle avoidance is performed by using the acquired obstacle information.
12. The method of claim 1,
after the obstacle information of the first obstacle towards which the sensor faces is acquired, the method further includes:
and sending the obstacle information of the first obstacle to control equipment so that the control equipment displays a user interface according to the obstacle information of the first obstacle.
13. The method of claim 1, further comprising:
and if the orientation of the main camera is inconsistent with the moving direction of the mobile platform, sending information that the orientation of the main camera is inconsistent with the moving direction of the mobile platform to the control equipment.
14. The method of claim 1,
the mobile platform includes: robot, unmanned aerial vehicle, unmanned car.
15. The method of claim 1,
the sensor includes: TOF camera, binocular camera, main camera.
16. An obstacle avoidance control method is applied to control equipment, and the method comprises the following steps:
obtaining obstacle information corresponding to a mobile platform, and obtaining a moving direction of the mobile platform;
selecting obstacle information corresponding to the moving direction from the obstacle information;
displaying a user interface according to the selected obstacle information;
and if the orientation of the main camera is inconsistent with the moving direction of the mobile platform and the moving direction of the mobile platform has an obstacle, displaying information that the orientation of the main camera is inconsistent with the moving direction of the mobile platform on the user interface.
17. The method of claim 16, wherein the obstacle information corresponding to the mobile platform comprises: and the sensor on the mobile platform acquires obstacle information in multiple directions.
18. The method of claim 17,
the sensors on the mobile platform include: TOF camera, binocular camera, main camera.
19. The method of claim 16,
selecting obstacle information corresponding to the moving direction from the obstacle information, including:
and if the moving direction of the mobile platform is a first direction, selecting obstacle information of the first direction from the obstacle information.
20. The method of claim 16,
selecting obstacle information corresponding to the moving direction from the obstacle information, including:
and if the moving direction of the mobile platform is between a first direction and a second direction, selecting the obstacle information of the first direction and the obstacle information of the second direction from the obstacle information.
21. The method of claim 16,
the displaying of the user interface according to the selected obstacle information includes:
and displaying the obstacle corresponding to the selected obstacle information on a user interface.
22. The method of claim 21,
the displaying the obstacle corresponding to the selected obstacle information on the user interface includes:
determining the position of the obstacle and the distance between the obstacle and the mobile platform according to the selected obstacle information; displaying an obstacle on the user interface according to the position and the distance.
23. The method of claim 16,
the user interface comprises prompt boxes, and the prompt boxes in different shapes are displayed on the user interface to represent information that the orientation of the main camera is inconsistent with the moving direction of the mobile platform.
24. The method of claim 23,
if the orientation of the main camera is consistent with the moving direction of the mobile platform, the prompt box is in a first shape;
and if the orientation of the main camera is inconsistent with the moving direction of the mobile platform, the prompt box is in a second shape.
25. The method of claim 16,
after obtaining the obstacle information corresponding to the mobile platform, the method further includes:
and constructing a three-dimensional map by using the obstacle information corresponding to the mobile platform.
26. The method of claim 16,
the mobile platform includes: robot, unmanned aerial vehicle, unmanned car.
27. A mobile platform, comprising: a memory, a processor, a sensor;
the memory for storing program code;
the sensor is used for acquiring obstacle information;
the processor, configured to invoke the program code, and when executed, configured to: acquiring obstacle information of a first obstacle oriented by the sensor, and storing the obstacle information; when the mobile platform moves towards the first obstacle, if the sensor does not face the first obstacle, obstacle avoidance is carried out by using the stored obstacle information.
28. The mobile platform of claim 27,
the processor is specifically configured to, when obtaining obstacle information of a first obstacle towards which the sensor is directed: when the mobile platform moves towards the first obstacle, if the sensor faces towards the first obstacle, acquiring obstacle information corresponding to the first obstacle facing towards the sensor.
29. The mobile platform of claim 27,
the processor is specifically configured to, when obtaining obstacle information of a first obstacle towards which the sensor is directed: when the mobile platform does not move towards the first obstacle, if the sensor faces towards the first obstacle, acquiring obstacle information corresponding to the first obstacle facing towards the sensor.
30. The mobile platform of claim 27,
the processor is further configured to, after storing the obstacle information: when the sensor faces the first obstacle, acquire obstacle information of the first obstacle, and update the stored obstacle information using the newly acquired obstacle information.
31. The mobile platform of claim 27, wherein the processor is further configured to:
when the sensor does not face the first obstacle, if the sensor faces a second obstacle, acquiring obstacle information of the second obstacle, which the sensor faces; and when the mobile platform moves towards the second obstacle, obstacle avoidance is carried out by utilizing the obstacle information of the second obstacle.
32. The mobile platform of claim 31,
the processor is further configured to, after acquiring obstacle information of a second obstacle towards which the sensor is directed: and sending the obstacle information of the second obstacle to control equipment so that the control equipment displays a user interface according to the obstacle information of the second obstacle.
33. The mobile platform of claim 27, wherein the processor is further configured to:
when the mobile platform moves towards the first obstacle, if the sensor faces towards the first obstacle, obstacle information of the first obstacle is acquired, and obstacle avoidance is performed by using the acquired obstacle information.
34. The mobile platform of claim 27,
the processor is further configured to, after acquiring the obstacle information of the first obstacle towards which the sensor is directed: and sending the obstacle information of the first obstacle to control equipment so that the control equipment displays a user interface according to the obstacle information of the first obstacle.
35. The mobile platform of claim 27, wherein the processor is further configured to:
and if the orientation of the main camera is inconsistent with the moving direction of the mobile platform, sending information that the orientation of the main camera is inconsistent with the moving direction of the mobile platform to the control equipment.
36. A control apparatus, characterized by comprising: a memory, a processor;
the memory for storing program code;
the processor, configured to invoke the program code, and when the program code is executed, configured to: obtain obstacle information corresponding to a mobile platform, and obtain a moving direction of the mobile platform; select obstacle information corresponding to the moving direction from the obstacle information; display a user interface according to the selected obstacle information;
and if the orientation of the main camera is inconsistent with the moving direction of the mobile platform and the moving direction of the mobile platform has an obstacle, displaying information that the orientation of the main camera is inconsistent with the moving direction of the mobile platform on the user interface.
37. The control apparatus according to claim 36,
the processor is specifically configured to, when selecting the obstacle information corresponding to the moving direction from the obstacle information: and if the moving direction of the mobile platform is a first direction, selecting obstacle information of the first direction from the obstacle information.
38. The control device according to claim 36, wherein the processor is configured to, when selecting the obstacle information corresponding to the moving direction from the obstacle information, specifically: and if the moving direction of the mobile platform is between a first direction and a second direction, selecting the obstacle information of the first direction and the obstacle information of the second direction from the obstacle information.
39. The control apparatus according to claim 36,
the processor is specifically configured to, when displaying the user interface according to the selected obstacle information: display the obstacle corresponding to the selected obstacle information on a user interface.
40. The control apparatus according to claim 39,
the processor is specifically configured to, when the user interface displays the obstacle corresponding to the selected obstacle information: determine the position of the obstacle and the distance between the obstacle and the mobile platform according to the selected obstacle information; and display the obstacle on the user interface according to the position and the distance.
41. A computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions, and when the computer instructions are executed, the method for obstacle avoidance control according to any one of claims 1 to 15 is implemented.
42. A computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions, and when the computer instructions are executed, the method for obstacle avoidance control according to any one of claims 16 to 26 is implemented.
CN201810060925.3A 2018-01-22 2018-01-22 Obstacle avoidance control method, equipment and computer readable storage medium Expired - Fee Related CN108319295B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810060925.3A CN108319295B (en) 2018-01-22 2018-01-22 Obstacle avoidance control method, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108319295A CN108319295A (en) 2018-07-24
CN108319295B true CN108319295B (en) 2021-05-28

Family

ID=62887595

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810060925.3A Expired - Fee Related CN108319295B (en) 2018-01-22 2018-01-22 Obstacle avoidance control method, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108319295B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022126477A1 (en) * 2020-12-17 2022-06-23 深圳市大疆创新科技有限公司 Control method and device for movable platform, and movable platform

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105389543A (en) * 2015-10-19 2016-03-09 广东工业大学 Mobile robot obstacle avoidance device based on all-dimensional binocular vision depth information fusion
CN105912018A (en) * 2016-04-27 2016-08-31 深圳电航空技术有限公司 Aircraft and obstacle avoiding method for the aircraft
CN106292799A (en) * 2016-08-25 2017-01-04 北京奇虎科技有限公司 Unmanned plane, remote control unit and control method thereof
CN106275470A (en) * 2015-06-29 2017-01-04 优利科技有限公司 Aircraft and barrier-avoiding method thereof and system
CN107077145A (en) * 2016-09-09 2017-08-18 深圳市大疆创新科技有限公司 Show the method and system of the obstacle detection of unmanned vehicle
CN110673620A (en) * 2019-10-22 2020-01-10 西北工业大学 Four-rotor unmanned aerial vehicle air line following control method based on deep reinforcement learning

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101999959B1 (en) * 2017-06-26 2019-07-15 엘지전자 주식회사 Robot cleaner



Similar Documents

Publication Publication Date Title
US11151773B2 (en) Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium
US10551834B2 (en) Method and electronic device for controlling unmanned aerial vehicle
US11703993B2 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
US11189037B2 (en) Repositioning method and apparatus in camera pose tracking process, device, and storage medium
CN107074348B (en) Control method, device and equipment and unmanned aerial vehicle
US11691079B2 (en) Virtual vehicle control method in virtual scene, computer device, and storage medium
CN108769531B (en) Method for controlling shooting angle of shooting device, control device and remote controller
AU2020256776B2 (en) Method and device for observing virtual article in virtual environment, and readable storage medium
US11210810B2 (en) Camera localization method and apparatus, terminal, and storage medium
KR20180073327A (en) Display control method, storage medium and electronic device for displaying image
EP3926441A1 (en) Output of virtual content
WO2019126958A1 (en) Yaw attitude control method, unmanned aerial vehicle, and computer readable storage medium
US20180262789A1 (en) System for georeferenced, geo-oriented realtime video streams
EP3832605B1 (en) Method and device for determining potentially visible set, apparatus, and storage medium
US20210112194A1 (en) Method and device for taking group photo
JP2017163265A (en) Controlling support system, information processing device, and program
US20210004005A1 (en) Image capture method and device, and machine-readable storage medium
US11845007B2 (en) Perspective rotation method and apparatus, device, and storage medium
US11790607B2 (en) Method and apparatus for displaying heat map, computer device, and readable storage medium
KR20210036392A (en) Virtual environment observation method, device and storage medium
CN111602104A (en) Method and apparatus for presenting synthetic reality content in association with identified objects
CN108319295B (en) Obstacle avoidance control method, equipment and computer readable storage medium
US20220184506A1 (en) Method and apparatus for driving vehicle in virtual environment, terminal, and storage medium
US20210063152A1 (en) Mapping methods, movable platforms, and computer-readable storage media
JP2018163427A (en) Information processing method, information processing program, information processing system, and information processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210528