CN117576582A - Safety detection method and device of flight equipment, storage medium and flight equipment - Google Patents

Safety detection method and device of flight equipment, storage medium and flight equipment

Info

Publication number
CN117576582A
CN117576582A (application CN202210946226.5A)
Authority
CN
China
Prior art keywords
target
target object
distance
outputting
prompt information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210946226.5A
Other languages
Chinese (zh)
Inventor
张亚伟
蔡剑成
刘新民
刘洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd filed Critical Beijing Sankuai Online Technology Co Ltd
Priority to CN202210946226.5A priority Critical patent/CN117576582A/en
Publication of CN117576582A publication Critical patent/CN117576582A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features

Abstract

The disclosure relates to a safety detection method and apparatus for a flight device, a storage medium and a flight device. The safety detection method of the flight device includes the following steps: collecting at least one environment image before the flight device takes off, and identifying the at least one environment image to obtain a target identification result; if it is determined, based on the target identification result, that a target object exists in the at least one environment image, acquiring a target distance between the flight device and the target object; and determining the category to which the target object belongs, and outputting target prompt information based on that category and the target distance. By outputting the target prompt information according to both the category of the target object and the target distance, the present disclosure can improve the flexibility of the safety detection of the flight device.

Description

Safety detection method and device of flight equipment, storage medium and flight equipment
Technical Field
The disclosure relates to the technical field of flight devices, and in particular to a safety detection method and apparatus for a flight device, a storage medium and a flight device.
Background
At present, with the rapid development of flight technology, the range of applications of flight devices keeps widening, covering cargo delivery, photography, power-line inspection, agricultural plant protection and the like, and unmanned aerial vehicles (UAVs) are among the most common flight devices. To improve safety during flight, existing flight devices are provided with flight safety management schemes. However, when a flight device takes off, a target object in its surroundings may be injured accidentally, which in turn affects the takeoff of the flight device. How to perform safety detection when the flight device takes off is therefore a technical problem that needs to be solved.
Disclosure of Invention
An object of the present disclosure is to provide a safety detection method and apparatus for a flight device, a storage medium and a flight device.
In a first aspect, a method for detecting safety of a flying device is provided, the method comprising:
collecting at least one environment image before the flying equipment takes off, and identifying the at least one environment image to obtain a target identification result;
if it is determined, based on the target identification result, that a target object exists in the at least one environment image, acquiring a target distance between the flying equipment and the target object;
and determining the category to which the target object belongs, and outputting target prompt information based on the category to which the target object belongs and the target distance.
Optionally, determining a category to which the target object belongs, and outputting the target prompt information based on the category to which the target object belongs and the target distance, including:
determining whether the target distance is smaller than a first preset distance under the condition that the target object belongs to a first type, wherein the target object of the first type comprises a person;
if the target distance is smaller than the first preset distance, outputting first prompt information to give an alarm to the person.
Optionally, after outputting the first prompt information, the method includes:
acquiring a first time length, wherein the first time length is the time length after outputting first prompt information;
and under the condition that the first time length exceeds the first preset time length, if the distance between the target object and the flight equipment is still smaller than the first preset distance, sending evacuation prompt information to the control terminal.
Optionally, the method further comprises:
and if the target distance is greater than or equal to the second preset distance, outputting second prompt information to give a safety prompt to the person.
Optionally, after outputting the second prompt information, the method includes:
acquiring a second time length, wherein the second time length is the time length after outputting the second prompt information;
and under the condition that the second time length exceeds the second preset time length, if the distance between the target object and the flight equipment is still larger than or equal to the second preset distance, outputting the safety detection passing information.
Optionally, determining a category to which the target object belongs, and outputting the target prompt information based on the category to which the target object belongs and the target distance, including:
determining whether the target distance is smaller than a second preset distance under the condition that the target object belongs to a second type, wherein the target object of the second type comprises animals;
if the target distance is smaller than the second preset distance, outputting third prompt information to alarm the animal.
Optionally, after outputting the third prompt information, the method includes:
acquiring a third time length which is the time length after outputting the third prompt information;
and under the condition that the third time length exceeds the third preset time length, if the distance between the target object and the flight equipment is still smaller than the second preset distance, sending barrier removal prompt information to the control terminal.
Optionally, the method further comprises:
and if the target object is not present in at least one environment image based on the target identification result, outputting the safety detection passing information.
In a second aspect, there is provided a safety detection apparatus for a flying device, the apparatus comprising:
the acquisition module is used for acquiring at least one environment image before the flying equipment takes off and identifying the at least one environment image to obtain a target identification result;
the acquisition module is used for acquiring the target distance between the flight equipment and the target object if the target object exists in at least one environment image based on the target identification result;
and the output module is used for determining the category to which the target object belongs and outputting target prompt information based on the category to which the target object belongs and the target distance.
In a third aspect, there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the method of the first aspect of the present disclosure.
In a fourth aspect, a flying device is provided, comprising a memory having a computer program stored thereon; a processor for executing the computer program in the memory to implement the steps of the method of the first aspect of the disclosure.
Optionally, the flying device comprises an unmanned aerial vehicle.
According to the above technical solution, at least one environment image is acquired before the flying equipment takes off and is identified to obtain a target identification result; if it is determined, based on the target identification result, that a target object exists in the at least one environment image, the target distance between the flying equipment and the target object is acquired, the category to which the target object belongs is then determined, and target prompt information is output based on that category and the target distance. By combining the category of the target object with the target distance, the present disclosure can improve the flexibility of the safety detection of the flying equipment and thus further ensure the safety of its takeoff.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification, illustrate the disclosure and together with the description serve to explain, but do not limit the disclosure. In the drawings:
Fig. 1 is a flow chart illustrating a safety detection method of a flying device according to an exemplary embodiment.
Fig. 2 is a diagram showing an exemplary structure of a multitasking network in a safety detection method of a flight device according to an exemplary embodiment.
Fig. 3 is an exemplary diagram illustrating target distance acquisition in a safety detection method of a flying device according to an exemplary embodiment.
Fig. 4 is a flow chart illustrating a safety detection method of a flying device according to another exemplary embodiment.
Fig. 5 is a diagram showing an example of the processing corresponding to each security check state in a safety detection method of a flight device according to another exemplary embodiment.
Fig. 6 is an exemplary diagram illustrating security check state transitions in a safety detection method of a flying device according to another exemplary embodiment.
Fig. 7 is a block diagram illustrating a safety detection apparatus of a flying device according to an exemplary embodiment.
Fig. 8 is a block diagram of a flying device according to an exemplary embodiment.
Detailed Description
Specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating and illustrating the disclosure, are not intended to limit the disclosure.
At present, with the rapid development of flight technology, the range of applications of flight devices keeps widening, covering cargo delivery, photography, power-line inspection, agricultural plant protection and the like, and unmanned aerial vehicles (UAVs) are among the most common flight devices. To improve safety during flight, existing flight devices are provided with flight safety management schemes. However, when a flight device takes off, a target object in its surroundings may be injured accidentally, which in turn affects the takeoff of the flight device.
Existing flight devices usually rely on manual safety checks before takeoff. Manual checking, however, not only consumes manpower but is also slow and unreliable, and it cannot respond in time to emergencies such as a person rapidly approaching the aircraft. Moreover, in the prior art a single-line lidar is often used to acquire the depth information (distance information) between a target object and the unmanned aerial vehicle, which makes safety detection costly.
To solve the above problems, the present disclosure provides a safety detection method and apparatus for a flight device, a storage medium and a flight device. At least one environment image is collected before the flight device takes off and is identified to obtain a target identification result; on this basis, if it is determined from the target identification result that a target object exists in the at least one environment image, the target distance between the flight device and the target object is acquired, the category to which the target object belongs is then determined, and target prompt information is output based on that category and the target distance. By combining the category of the target object with the target distance, the present disclosure can improve the flexibility of the safety detection of the flight device and thus further ensure the safety of its takeoff.
The following detailed description of specific embodiments of the present disclosure refers to the accompanying drawings.
Fig. 1 is a flow chart illustrating a safety detection method of a flying device according to an exemplary embodiment. As shown in fig. 1, the method includes the following steps:
In step S11, at least one environmental image is acquired before the flying device takes off, and the at least one environmental image is identified, so as to obtain a target identification result.
As an alternative, the embodiments of the present application may acquire at least one environment image before the flying device takes off. The at least one environment image may be an image of the environment in which the flying device is located, and the flying device may be an unmanned aerial vehicle. In addition, the unmanned aerial vehicle may be provided with a plurality of cameras, and the at least one environment image may be images of different angles acquired by the plurality of cameras at the same time. Alternatively, the drone may be used to deliver cargo.
As an example, six cameras are arranged on the unmanned aerial vehicle: one camera on each of the left and right sides of the front of the drone, one camera on each of the left and right sides of the rear of the drone, and one camera on each of the left and right sides of the drone itself. In other words, four cameras are installed at the front and rear of the drone and two cameras on its left and right sides, for six cameras in total. Before the drone takes off, the present application can use the six cameras to acquire environment images simultaneously, obtaining six environment images taken from different angles at the same moment. For example, the first environment image may be a front-left environment image captured by the front-left camera, and the third environment image may be a rear-right environment image captured by the rear-right camera.
As an alternative, the present application may acquire the at least one environment image when a take-off instruction is received. In other words, upon receiving the take-off instruction, the flying device may first perform the safety detection and perform the take-off operation only after the safety detection passes. When the flying device performs the safety detection, the at least one environment image can be acquired by the cameras arranged on the flying device. It should be noted that the at least one environment image may be a surround-view (look-around) image.
In some embodiments, after the at least one environment image is acquired, the present application may identify the at least one environment image to obtain a target recognition result. Specifically, the flight device may input the at least one environment image into a multi-task network, detect and segment the at least one environment image using the multi-task network, and obtain the final target recognition result from the detection result and the segmentation result.
In the embodiment of the present application, the multi-task network can be obtained by training a multi-task algorithm model on a surround-view data set, where the surround-view data set may include persons, animals, other target objects and the like. As a specific embodiment, the surround-view data set may be a takeoff safety-check data set containing scenes in which targets such as pedestrians or pets approach the drone.
As an alternative, the multi-task network may consist of three parts: feature extraction, feature fusion and result prediction. Feature extraction obtains multi-level image features by downsampling layer by layer; feature fusion improves the representation capability of each feature level by fusing the multi-scale features; and the final prediction result is then obtained from the detection result and the segmentation result, respectively.
For a better understanding of the architecture of the multi-task network, an example diagram is given in fig. 2. As can be seen from fig. 2, upon receiving at least one environment image, the multi-task network may first perform feature extraction on the at least one environment image to obtain multi-level image features, such as C2, C3, C4 and C5. On this basis, the present application can perform downsampling and fusion on the multi-level image features to obtain multi-level fusion features, which may be P3, P4 and P5 shown in fig. 2. For example, the first-level feature C2 is downsampled, and the downsampled feature is fused with C3 to obtain the fusion feature P3.
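As an illustrative sketch of this downsample-and-fuse step, the PyTorch snippet below shows one possible implementation. The channel counts, the use of stride-2 convolutions for downsampling, and the bottom-up propagation from P3 to P4 and P5 are assumptions for illustration; the description only specifies that C2 is downsampled and fused with C3 to produce P3.

```python
import torch
import torch.nn as nn

class DownsampleFusion(nn.Module):
    """Minimal sketch of the downsample-and-fuse step (assumed operators):
    the shallower feature map is downsampled with a stride-2 convolution
    and added to the next deeper level to form P3, P4 and P5."""

    def __init__(self, channels=(64, 128, 256, 512)):  # C2..C5 channels (assumed)
        super().__init__()
        c2, c3, c4, c5 = channels
        self.down2 = nn.Conv2d(c2, c3, kernel_size=3, stride=2, padding=1)
        self.down3 = nn.Conv2d(c3, c4, kernel_size=3, stride=2, padding=1)
        self.down4 = nn.Conv2d(c4, c5, kernel_size=3, stride=2, padding=1)

    def forward(self, c2, c3, c4, c5):
        p3 = c3 + self.down2(c2)   # downsample C2 and fuse it with C3 -> P3
        p4 = c4 + self.down3(p3)   # propagate the fused features downward (assumed)
        p5 = c5 + self.down4(p4)
        return p3, p4, p5


# Example with feature maps from a 256x256 input (spatial sizes 64/32/16/8).
c2 = torch.randn(1, 64, 64, 64)
c3 = torch.randn(1, 128, 32, 32)
c4 = torch.randn(1, 256, 16, 16)
c5 = torch.randn(1, 512, 8, 8)
p3, p4, p5 = DownsampleFusion()(c2, c3, c4, c5)
print(p3.shape, p4.shape, p5.shape)
```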
As an alternative, the present application may perform detection and segmentation on each level of fusion features and aggregate all the fusion features to obtain the target detection result and the target segmentation result shown in fig. 2. On this basis, the target recognition result can be obtained by combining the target detection result and the target segmentation result shown in fig. 2.
As an alternative, when the multi-task network obtains the target recognition result, combining a plurality of environment images can make the finally obtained target recognition result more accurate. As an example, image 1 collected by the front-left camera of the flying device contains a target object, and image 2 collected by the front-right camera also contains the same target object; when images 1 and 2 are input into the multi-task network, the multi-task network can identify both images and weight the recognition results, so that the final recognition result for the target object is more accurate.
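As a minimal sketch of this weighting idea, the snippet below merges two detections of the same target from overlapping cameras; the confidence-combination rule and the dictionary layout are assumptions, since the description only states that the recognition results can be weighted.

```python
def fuse_detections(det_a, det_b):
    """Fuse two detections of the same target object from different cameras.

    Each detection is a dict like {"category": "person", "confidence": 0.8}.
    Confidence-weighted voting is an illustrative assumption.
    """
    if det_a["category"] == det_b["category"]:
        # Same class from both views: keep the class, combine confidences.
        conf = 1.0 - (1.0 - det_a["confidence"]) * (1.0 - det_b["confidence"])
        return {"category": det_a["category"], "confidence": conf}
    # Disagreement: keep the more confident prediction.
    return det_a if det_a["confidence"] >= det_b["confidence"] else det_b


# Example: image 1 (front-left camera) and image 2 (front-right camera)
# both contain the same person.
print(fuse_detections({"category": "person", "confidence": 0.82},
                      {"category": "person", "confidence": 0.74}))
```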
In the embodiment of the present application, the target recognition result may include a plurality of sub-results. Specifically, a plurality of target objects may be included in one environment image. For example, the environment image 1 includes a human body 1 and a human body 2. In addition, the plurality of environment images may include a plurality of target objects. For example, the human body 1 is included in the environment image 1, the animal 1 is included in the environment image 2, the human body 2 is included in the environment image 3, and the like.
As an alternative, after the target recognition result is obtained, the present application may display a plurality of sub-results of the target recognition result on a display screen of the flight device. The target recognition result may include the category of the target object and the confidence of the target object, and may also include the position of the target object in the image, and the like. In addition, the target detection result and the target segmentation result can also be displayed directly on the display screen of the flight device; how the recognition result is displayed is not explicitly limited here and can be selected according to the actual situation.
It should be noted that, if the target recognition result includes a plurality of sub-results, the present application may also display the target object according to the position of each target object in the environment relative to the flying device. For example, if the human body 1 is in front of the flying device, the sub-result corresponding to the human body 1 may be displayed in front when displayed. For another example, animal 1 is on the right side of the flying device, and then the corresponding sub-result of animal 1 may be displayed on the right side. Therefore, the flying equipment can conveniently know the positions of all surrounding target objects, and further the safety detection of the flying equipment can be realized more flexibly and effectively.
In some embodiments, after obtaining the target recognition result, the present application may determine whether a target object exists in at least one environmental image based on the target recognition result, and if it is determined that the target object exists in at least one environmental image, obtain a target distance between the flight device and the target object, that is, go to step S12. In addition, if it is determined that the target object does not exist in at least one environmental image based on the target recognition result, the application can output safety detection passing information, wherein the safety detection passing information is used for indicating that the safety detection of the flight equipment passes, and at the moment, the flight equipment can execute the take-off operation, so that the take-off safety can be ensured.
In step S12, if it is determined that the target object exists in the at least one environmental image based on the target recognition result, a target distance between the flight device and the target object is acquired.
In this embodiment of the present application, the target object may be preset and may include a person, an animal, or another preset object, etc. The other preset objects may be objects that could obstruct the flying device. For example, the other preset objects may include vehicles, cargo and the like; their specific content is not explicitly limited here and can be selected according to the actual situation.
As an alternative, if it is determined that a target object exists in the at least one environment image, the present application may obtain the target distance between the flying device and the target object. As described above, a plurality of cameras can be installed on the flying device, and the images collected by these cameras can be monocular images or binocular images. When the target distance between the flying device and the target object is obtained, a monocular/binocular depth estimation algorithm can be used to obtain the target distance, which reduces the cost of acquiring depth information.
Specifically, if the environment image belongs to the left or right monocular images, the overall trend of the depth change can be predicted first, and local tuning is then performed on this overall trend. In addition, if the environment images belong to the front or rear binocular images, the present application can output, from the left view and the right view respectively, a disparity map taking the left image as the reference view and a disparity map taking the right image as the reference view. On this basis, the depth information of the target object in the at least one environment image is calculated, i.e. the distance between the target object and the flying device is calculated.
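For the binocular case, depth follows from the standard stereo relation Z = f·B/d (focal length times baseline divided by disparity). The OpenCV sketch below illustrates one possible implementation; the matcher settings, focal length and baseline are placeholder assumptions rather than values taken from the patent.

```python
import cv2
import numpy as np

def estimate_depth_from_stereo(left_gray, right_gray,
                               focal_px=800.0, baseline_m=0.12):
    """Minimal sketch: disparity map from a stereo pair, then depth in metres.

    focal_px and baseline_m are placeholder calibration values (assumptions);
    in practice they come from the drone's camera calibration.
    """
    matcher = cv2.StereoSGBM_create(minDisparity=0,
                                    numDisparities=64,
                                    blockSize=5)
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # mark invalid matches
    depth_m = focal_px * baseline_m / disparity  # Z = f * B / d
    return depth_m
```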
For a better understanding of the process of obtaining the target distance, an exemplary diagram as shown in fig. 3 is given in the embodiment of the present application. The left-view image 101 in fig. 3 may include a front left-view image and a rear left-view image, wherein the front left-view image may be an image captured by a camera mounted on the front left of the flying device; the rear left view image may be an image captured by a camera mounted to the rear left of the flying device. The right-view image 102 in fig. 3 may include a front right-view image and a rear right-view image, wherein the front right-view image may be an image captured by a camera mounted on the front right of the flying device; the rear right view image may be an image captured by a camera mounted on the rear right of the flying device. Monocular image 103 may include a left-view image and a right-view image, wherein the left-view image may be an image captured by a camera mounted on the left side of the flying device; the right-view image may be an image captured by a camera mounted on the right side of the flying device.
As is clear from fig. 3, the present application can perform front and rear binocular depth estimation by combining the left-view image 101 and the right-view image 102, and can perform left and right monocular depth estimation using the monocular image 103. On this basis, the front/rear binocular depth estimation results and the left/right monocular depth estimation results are fused, so that a surround-view depth estimation result of the unmanned aerial vehicle can be obtained and the target distance between each target object and the flight device is obtained.
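The patent does not spell out how the per-view depth estimates are merged into a single target distance; one conservative, safety-first choice is sketched below, taking the smallest valid distance reported for an object across all views.

```python
import math

def fuse_target_distance(per_view_distances):
    """Combine distance estimates of one target from several views.

    per_view_distances: iterable of floats in metres (NaN = no valid estimate).
    Taking the minimum valid distance is an illustrative, safety-first choice.
    """
    valid = [d for d in per_view_distances if not math.isnan(d)]
    return min(valid) if valid else None


# e.g. front binocular estimate 1.9 m, left monocular estimate 2.1 m
print(fuse_target_distance([1.9, 2.1, float("nan")]))  # -> 1.9
```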
In step S13, a category to which the target object belongs is determined, and target prompt information is output based on the category to which the target object belongs and the target distance.
As described above, the target object can be of various kinds and may include a human body, an animal, an object, or the like. In order to better detect the safety conditions around the flying device before it takes off, after recognizing the at least one environment image to obtain the target recognition result, the embodiment of the present application can determine the category to which the target object belongs based on the target recognition result. On this basis, the target prompt information is determined by combining the category to which the target object belongs with the target distance, and the target prompt information is output.
In the embodiment of the present disclosure, different categories of target objects, and different target distances, correspond to different target prompt information, so that the safety detection of the flying device can be realized more flexibly and accurately.
In summary, at least one environment image is collected before the flying device takes off and is identified to obtain a target identification result; if it is determined, based on the target identification result, that a target object exists in the at least one environment image, the target distance between the flying device and the target object is acquired, the category to which the target object belongs is then determined, and target prompt information is output based on that category and the target distance. By combining the category of the target object with the target distance, the present disclosure can improve the flexibility of the safety detection of the flying device and thus further ensure the safety of its takeoff.
Fig. 4 is a flow chart illustrating a safety detection method of a flying device according to another exemplary embodiment. As shown in fig. 4, the method includes the following steps:
In step S21, at least one environmental image is acquired before the flying device takes off, and the at least one environmental image is identified, so as to obtain a target identification result.
In step S22, if it is determined that the target object exists in the at least one environmental image based on the target recognition result, a target distance between the flight device and the target object is acquired.
Steps S21 and S22 have been described in detail in the above embodiments and are not repeated here.
In step S23, in the case where the target object belongs to the first type, it is determined whether the target distance is smaller than a first preset distance, the target object of the first type including a person.
As an alternative, if the target object belongs to the first type, the present application may determine whether the distance between the target object and the flying device is smaller than a first preset distance, where the target object of the first type may include a person, for example a pedestrian. In this case, if it is determined that the distance between the target object and the flight device is smaller than the first preset distance, the present application may output first prompt information, i.e. proceed to step S24.
In step S24, if the target distance is determined to be smaller than the first preset distance, a first prompt message is output to alert the person.
Through the above description, it is known that, in the case that it is determined that the target object belongs to the first type, if it is determined that the target distance between the target object and the flight device is smaller than the first preset distance, the first prompt message may be output. The first prompt information is used for carrying out alarm prompt on the target object. In addition, the first preset distance may be preset according to an empirical value.
As an example, it is determined by detection that person 1 exists in the at least one environment image, the distance between person 1 and the flying device is calculated to be 1.8 m, and the first preset distance is 2 m. By comparison, the distance of 1.8 m between person 1 and the flying device is smaller than the first preset distance, so the flying device can output the first prompt information to warn person 1 to keep clear.
As an alternative, before the flight device outputs the prompt information, the present application may first determine the manner in which the prompt information is output and then output the prompt information in that manner. The output manner of the prompt information can be determined based on the type of the target object. When the target object belongs to the first type, the present application can output the prompt information by voice; when the target object belongs to the second type, the prompt information can be output in a buzzing manner. For example, when the target object is detected to be a person, a voice prompt such as "the drone is about to take off, please keep clear" is output.
As an alternative, the flight device may acquire a first time length after outputting the first prompt information. The first time length may be the length of time elapsed since the first prompt information was output; that is, timing may start when it is determined that the first prompt information has been output, and the timed duration is the first time length. On this basis, whether the first time length exceeds a first preset time length can be detected. If it is determined that the first time length exceeds the first preset time length, the present application may again determine whether the distance between the target object (person) and the flying device is still smaller than the first preset distance. If the distance between the target object (person) and the flight device is still smaller than the first preset distance, evacuation prompt information can be sent to the control terminal.
The control terminal may be a background client, or may be a terminal device held by a user, for example, may be a mobile phone held by the user. When the control terminal receives the evacuation prompt information, the prompt information can be output in voice, text or acousto-optic mode, so that background operation and maintenance personnel or users holding the terminal equipment can know the current actual situation of the flight equipment as soon as possible.
Optionally, while the first time length does not exceed the first preset time length, the flight device may continuously output the first prompt information, so that the target object moves away from the flight device as soon as possible, thereby ensuring the safety of takeoff, that is, reducing the possibility of accidental injury.
In addition, if it is determined that the target object (person) still exists in the newly acquired environment image and the first time length exceeds the first preset time length, but the distance between the target object and the flight device is greater than or equal to the first preset distance, the embodiment of the present application may output second prompt information. The second prompt information is used to inform the target object (person) that it is relatively close to the flying device and should move away from it.
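The prompt-then-re-check flow described above can be summarised in a short sketch. The callables for measuring distance, playing prompts and reaching the control terminal are assumptions, and the 2 m / 60 s thresholds simply reuse the example values given in this description.

```python
import time

FIRST_PRESET_DISTANCE_M = 2.0    # example value from the description
FIRST_PRESET_DURATION_S = 60.0   # example value from the description

def handle_person_too_close(get_distance, output_prompt, notify_control_terminal):
    """Sketch of the escalation flow once a person is within the first preset distance.

    get_distance(), output_prompt(kind) and notify_control_terminal(msg) are
    assumed hooks into the drone's sensing, audio and communication stacks.
    """
    start = time.monotonic()
    while time.monotonic() - start < FIRST_PRESET_DURATION_S:
        output_prompt("first")            # keep alerting while the timer runs
        time.sleep(1.0)
    # First time length exceeded: re-check the person's distance.
    if get_distance() < FIRST_PRESET_DISTANCE_M:
        notify_control_terminal("evacuation prompt")   # escalate to the terminal
        return "evacuation_requested"
    output_prompt("second")               # person still present but far enough away
    return "second_prompt"
```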
As another alternative, in the case that the target object is determined to belong to the first type, if it is determined that the target distance is greater than or equal to the second preset distance, the embodiment of the present application may output second prompt information to give a safety prompt to the target object (person). The second prompt information may include a paddle-lifting (propeller start-up) prompt, so that the target object can quickly and effectively learn about the takeoff status of the flight device.
Optionally, after outputting the second prompt information, the present application may acquire a second duration, where the second duration may be a time length after outputting the second prompt information. Under the condition that the second time length exceeds the second preset time length, if the distance between the target object and the flight equipment is still larger than or equal to the second preset distance, the flight equipment can directly output the safety detection passing information. At this point, the flying device may directly perform the takeoff operation.
In this embodiment of the present application, the second preset time length may be shorter than the first preset time length. As an example, the first preset time length may be 60 s and the second preset time length may be 15 s. By setting the alarm level for persons to the highest level, the present application can guarantee the safety of takeoff to the greatest extent, that is, reduce the impact of takeoff on the safety of surrounding pedestrians.
As another alternative, in the case where it is determined that the target object belongs to the second type, the present application may determine whether the target distance is less than a second preset distance, wherein the target object of the second type includes an animal. If the target distance to the animal is smaller than the second preset distance, a third prompt message is output to warn the animal away. In this embodiment of the present application, the second preset distance may be smaller than the first preset distance.
As an example, the flying device determines, by recognizing the at least one environment image, that animal 1 is present around it, and then calculates that the distance between animal 1 and the flying device is 1.4 m; since the second preset distance is 1.5 m, it is determined by comparison that the distance between animal 1 and the flying device is smaller than the second preset distance. Therefore, the embodiment of the present application can output the third prompt information.
In addition, through the above description, the manner in which the flight device outputs the prompt information is different if the types of the target objects are different. When the target object belongs to the second type, the embodiment of the application can output the third prompt information in a buzzing mode, namely, the application can drive the target object (animal) to be far away from the flying equipment in the buzzing mode.
Optionally, after outputting the third prompt information, the embodiment of the present application may obtain a third duration, where the third duration is a time length after outputting the third prompt information. On the basis, whether the third duration exceeds a third preset duration is detected. If it is determined that the third duration exceeds the third preset duration, continuing to determine whether the distance between the target object (animal) and the flying device is still less than the second preset distance. If the distance between the target object and the flight equipment is still smaller than the second preset distance, obstacle removing prompt information is sent to the control terminal. As an example, the third preset time period may be 60s.
As another alternative, the target object of the second type may further include other objects, and when it is determined that the target object belongs to the second type and is an animal, if it is determined that the target distance is smaller than the second preset distance, the application may dispel the animal by means of buzzing. When the target object is determined to belong to the second type and is other objects, if the target distance is determined to be smaller than the second preset distance, obstacle removing prompt information can be sent to the control terminal, and the other objects can be objects which cannot move autonomously.
As an alternative, when obtaining the target prompt information based on the category to which the target object belongs and the target distance, the present application may first obtain a security check state based on that category and the target distance, and then determine the processing corresponding to the security check state.
As an example, when the target object belongs to the first type and the target distance is smaller than the first preset distance, the corresponding security check state may be 11; when the target object belongs to the first type and the target distance is greater than or equal to the first preset distance, the corresponding security check state may be 10; when the target object belongs to the second type and the target distance is smaller than the second preset distance, the corresponding security check state may be 01; and when no target object exists in the at least one environment image, the corresponding security check state may be 00. On this basis, the processing strategy corresponding to the security check state is obtained, and the processing strategy includes the determination of the target prompt information. The relationship between the security check states and the processing strategies may be as shown in fig. 5.
As shown in fig. 5, when the security check state is 11, the embodiment of the present application may output an alarm to warn the pedestrian to keep clear, and if, 60 s after the alarm, the distance between the pedestrian and the flight device is still smaller than the first preset distance, an operator may be called; when the security check state is 10, the embodiment of the present application may give a paddle-lifting prompt and send a safety-check-pass instruction 15 s after the paddle-lifting prompt; when the security check state is 01, the embodiment of the present application may output an alarm (buzzing) to warn the second-type target object away, and if, 60 s after the alarm, the distance between the second-type target object and the flight device is still smaller than the second preset distance, an operator may be called to clear the obstacle; and when the security check state is 00, the embodiment of the present application may directly send the safety-check-pass instruction.
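Read as a lookup table, the state-to-strategy mapping of fig. 5 could be encoded as in the sketch below. The threshold values repeat the 2 m / 1.5 m examples from this description, while the category labels and the handling of a far-away second-type target are assumptions.

```python
# Processing strategy per security check state, following the example of fig. 5.
STATE_ACTIONS = {
    "11": "voice alarm; call an operator if still within 2 m after 60 s",
    "10": "paddle-lifting prompt; send safety-check-pass instruction after 15 s",
    "01": "buzzer alarm; call an operator to clear the obstacle if still within 1.5 m after 60 s",
    "00": "send safety-check-pass instruction immediately",
}

def security_check_state(category, distance_m,
                         first_preset_m=2.0, second_preset_m=1.5):
    """Derive the security check state from the target category and distance.

    category: "first" (person), "second" (animal/other object) or None (no target);
    this encoding, and mapping a far-away second-type target to state 00,
    are illustrative assumptions.
    """
    if category is None:
        return "00"
    if category == "first":
        return "11" if distance_m < first_preset_m else "10"
    if category == "second" and distance_m < second_preset_m:
        return "01"
    return "00"


print(security_check_state("first", 1.8))                 # -> "11"
print(STATE_ACTIONS[security_check_state("second", 1.4)])  # buzzer alarm ...
```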
It should be noted that, in the embodiment of the present application, the security check state may change dynamically; that is, after the target prompt information is output, the target object may move away from the flight device, move closer to it, or keep roughly the original distance (waiting). As shown in fig. 6, after the security check state is determined to be 11, if the target object (person) moves away from the flying apparatus, the corresponding security check state may be switched from 11 to 10. The "waiting" in figs. 5 and 6 refers to a state in which, for a certain period of time after the flight device outputs the target prompt information, the distance between the flight device and the target object does not substantially change.
In addition, if the prompt information determined at the same moment includes at least two of the first prompt information, the second prompt information and the third prompt information, the embodiment of the present application can obtain the priority corresponding to each piece of prompt information and output the prompt information with the highest priority. Specifically, the first prompt information may have the highest priority. For example, if the first prompt information and the second prompt information are both determined by analysis, only the first prompt information may be output. The priorities of the second prompt information and the third prompt information may be the same or different; how the priorities are set is not explicitly limited here and can be selected according to the actual situation. For example, where the flying device takes off in an area with relatively many animals, the third prompt information may have a higher priority than the second prompt information.
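Selecting among simultaneously triggered prompts can then be a simple priority lookup, as in the sketch below; only the rule that the first prompt ranks highest comes from the description, and the numeric values and the relative order of the second and third prompts are assumptions.

```python
# Higher value = higher priority. Only "first prompt ranks highest" is stated
# in the description; the rest of the ordering is an assumption.
PROMPT_PRIORITY = {"first": 3, "third": 2, "second": 1}

def select_prompt(triggered):
    """Return the highest-priority prompt among those currently triggered."""
    return max(triggered, key=lambda p: PROMPT_PRIORITY.get(p, 0)) if triggered else None


print(select_prompt(["second", "first"]))  # -> "first"
```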
In summary, at least one environment image is collected before the flying device takes off and is identified to obtain a target identification result; if it is determined, based on the target identification result, that a target object exists in the at least one environment image, the target distance between the flying device and the target object is acquired, the category to which the target object belongs is then determined, and target prompt information is output based on that category and the target distance. By combining the category of the target object with the target distance, the present disclosure can improve the flexibility of the safety detection of the flying device and thus further ensure the safety of its takeoff.
Fig. 7 is a block diagram of a safety detection apparatus 300 of a flying device according to an exemplary embodiment. As shown in fig. 7, the apparatus 300 includes an acquisition module 301, an obtaining module 302, and an output module 303.
The acquisition module 301 is configured to acquire at least one environmental image before the flying device takes off, and identify the at least one environmental image to obtain a target identification result;
the obtaining module 302 is configured to obtain a target distance between the flying device and the target object if it is determined that the target object exists in the at least one environmental image based on the target recognition result;
the output module 303 is configured to determine a category to which the target object belongs, and output a target prompt message based on the category to which the target object belongs and the target distance.
In some implementations, the output module 303 can include:
a first determining submodule, configured to determine, if the target object belongs to a first type, whether the target distance is smaller than a first preset distance, where the target object of the first type includes a person;
and the first information output sub-module is used for outputting first prompt information to carry out alarming prompt on the person if the target distance is smaller than the first preset distance.
In some embodiments, the safety detection apparatus 300 of the flying device may further include:
the first time length acquisition module is used for acquiring a first time length, wherein the first time length is the time length after the first prompt information is output;
and the evacuation information sending module is used for sending evacuation prompt information to the control terminal if the distance between the target object and the flight equipment is determined to be still smaller than the first preset distance under the condition that the first time length exceeds the first preset time length.
In some implementations, the output module 303 can further include:
and the second information output sub-module is used for outputting second prompt information to carry out safety prompt on the person if the target distance is determined to be greater than or equal to a second preset distance.
In some embodiments, the safety detection apparatus 300 of the flying device may further include:
the second time length acquisition module is used for acquiring a second time length, wherein the second time length is the time length after the second prompt information is output;
the detection passing output module is configured to output safety detection passing information if it is determined that the distance between the target object and the flight device is still greater than or equal to the second preset distance when the second duration exceeds a second preset duration.
In some embodiments, the detection passing output module is further configured to output security detection passing information if it is determined that the target object is not present in the at least one environmental image based on the target recognition result.
In some implementations, the output module 303 can further include:
a second determining submodule, configured to determine whether the target distance is smaller than a second preset distance if the target object belongs to a second type, where the target object of the second type includes an animal;
and the third information output sub-module is used for outputting third prompt information to alarm the animal if the target distance is determined to be smaller than the second preset distance.
In some embodiments, the safety detection apparatus 300 of the flying device may further include:
a third duration obtaining module, configured to obtain a third duration, where the third duration is a time duration after outputting the third prompt information;
and the obstacle clearance information sending module is used for sending obstacle clearance prompt information to the control terminal if the distance between the target object and the flight equipment is determined to be smaller than the second preset distance under the condition that the third time length exceeds the third preset time length.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be repeated here.
Fig. 8 is a block diagram illustrating a flying apparatus 700 according to an exemplary embodiment. As shown in fig. 8, the flying apparatus 700 may include: a processor 701, a memory 702. The flying device 700 may also include one or more of a multimedia component 703, an input/output (I/O) interface 704, and a communication component 705, and the flying device 700 may be an unmanned aerial vehicle.
The processor 701 is configured to control the overall operation of the flying device 700 so as to perform all or part of the steps of the above safety detection method of a flying device. The memory 702 is used to store various types of data to support operation on the flying device 700; such data may include, for example, instructions for any application or method operating on the flying device 700, as well as application-related data such as contact data, sent and received messages, pictures, audio, video, and the like. The memory 702 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The multimedia component 703 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signals may be further stored in the memory 702 or transmitted through the communication component 705. The audio component further comprises at least one speaker for outputting audio signals. The I/O interface 704 provides an interface between the processor 701 and other interface modules, which may be a keyboard, a mouse, buttons, etc. These buttons may be virtual buttons or physical buttons. The communication component 705 is used for wired or wireless communication between the flying device 700 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, near field communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or the like, or a combination of one or more of them, which is not limited herein. Accordingly, the communication component 705 may comprise a Wi-Fi module, a Bluetooth module, an NFC module, and the like.
In an exemplary embodiment, the flying device 700 may be implemented by one or more application specific integrated circuits (Application Specific Integrated Circuit, abbreviated ASIC), digital signal processor (Digital Signal Processor, abbreviated DSP), digital signal processing device (Digital Signal Processing Device, abbreviated DSPD), programmable logic device (Programmable Logic Device, abbreviated PLD), field programmable gate array (Field Programmable Gate Array, abbreviated FPGA), controller, microcontroller, microprocessor, or other electronic components for performing the above-described method of flying device security detection.
In another exemplary embodiment, a computer readable storage medium is also provided, comprising program instructions which, when executed by a processor, implement the steps of the above-described method of safety detection of a flying device. For example, the computer readable storage medium may be the memory 702 including program instructions described above that are executable by the processor 701 of the flying device 700 to perform the method of safety detection of a flying device described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-described method of safety detection of a flying device when executed by the programmable apparatus.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solutions of the present disclosure within the scope of the technical concept of the present disclosure, and all the simple modifications belong to the protection scope of the present disclosure.
In addition, the specific features described in the foregoing embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, the present disclosure does not further describe various possible combinations.
Moreover, any combination between the various embodiments of the present disclosure is possible as long as it does not depart from the spirit of the present disclosure, which should also be construed as the disclosure of the present disclosure.

Claims (12)

1. A method of safety detection of a flying device, the method comprising:
collecting at least one environment image before the flying equipment takes off, and identifying the at least one environment image to obtain a target identification result;
if the target object exists in the at least one environment image based on the target identification result, acquiring a target distance between the flying equipment and the target object;
and determining the category to which the target object belongs, and outputting target prompt information based on the category to which the target object belongs and the target distance.
2. The method of claim 1, wherein determining the category to which the target object belongs and outputting target prompt information based on the category to which the target object belongs and the target distance comprises:
determining whether the target distance is smaller than a first preset distance or not under the condition that the target object belongs to a first type, wherein the target object of the first type comprises a person;
and if the target distance is smaller than the first preset distance, outputting first prompt information to give an alarm prompt to the person.
3. The method according to claim 2, wherein after outputting the first prompt information, the method includes:
acquiring a first time length, wherein the first time length is the time length after the first prompt information is output;
and under the condition that the first time length exceeds a first preset time length, if the distance between the target object and the flight equipment is still smaller than the first preset distance, sending evacuation prompt information to a control terminal.
4. The method according to claim 2, wherein the method further comprises:
and if the target distance is greater than or equal to a second preset distance, outputting second prompt information to give a safety prompt to the person.
5. The method of claim 4, wherein after outputting the second prompt information, the method comprises:
acquiring a second time length, wherein the second time length is the time length after the second prompt information is output;
and under the condition that the second time length exceeds a second preset time length, if the distance between the target object and the flight equipment is still larger than or equal to the second preset distance, outputting safety detection passing information.
6. The method of claim 1, wherein determining the category to which the target object belongs and outputting target prompt information based on the category to which the target object belongs and the target distance comprises:
determining whether the target distance is smaller than a second preset distance under the condition that the target object belongs to a second type, wherein the target object of the second type comprises animals;
and if the target distance is smaller than the second preset distance, outputting third prompt information to alarm the animal.
7. The method of claim 6, wherein after outputting the third prompt information, the method comprises:
acquiring a third time length, wherein the third time length is the time length after outputting the third prompt information;
and in the case that the third duration exceeds a third preset duration, if the distance between the target object and the flight equipment is still smaller than the second preset distance, sending obstacle clearance prompt information to the control terminal.
8. The method according to claim 1, wherein the method further comprises:
and if it is determined, based on the target identification result, that the target object does not exist in the at least one environment image, outputting safety detection passing information.
9. A safety detection device for a flying apparatus, the device comprising:
the acquisition module is used for acquiring at least one environment image before the flying equipment takes off and identifying the at least one environment image to obtain a target identification result;
the acquisition module is used for acquiring a target distance between the flying equipment and the target object if the target object exists in the at least one environment image based on the target identification result;
and the output module is used for determining the category to which the target object belongs and outputting target prompt information based on the category to which the target object belongs and the target distance.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the steps of the method according to any one of claims 1-8.
11. A flying apparatus, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any one of claims 1-8.
12. The flying apparatus of claim 11, wherein the flying apparatus comprises an unmanned aerial vehicle.
CN202210946226.5A 2022-08-08 2022-08-08 Safety detection method and device of flight equipment, storage medium and flight equipment Pending CN117576582A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210946226.5A CN117576582A (en) 2022-08-08 2022-08-08 Safety detection method and device of flight equipment, storage medium and flight equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210946226.5A CN117576582A (en) 2022-08-08 2022-08-08 Safety detection method and device of flight equipment, storage medium and flight equipment

Publications (1)

Publication Number Publication Date
CN117576582A true CN117576582A (en) 2024-02-20

Family

ID=89883148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210946226.5A Pending CN117576582A (en) 2022-08-08 2022-08-08 Safety detection method and device of flight equipment, storage medium and flight equipment

Country Status (1)

Country Link
CN (1) CN117576582A (en)

Similar Documents

Publication Publication Date Title
US11308809B2 (en) Collision control method and apparatus, and storage medium
US20200317190A1 (en) Collision Control Method, Electronic Device and Storage Medium
CN112141119B (en) Intelligent driving control method and device, vehicle, electronic equipment and storage medium
US9451062B2 (en) Mobile device edge view display insert
CN109455180B (en) Method and device for controlling unmanned vehicle
US10643472B2 (en) Monitor apparatus and monitor system
CN111666821B (en) Method, device and equipment for detecting personnel aggregation
CN104335244A (en) Object recognition device
WO2021213241A1 (en) Target detection method and apparatus, and electronic device, storage medium and program
US11250279B2 (en) Generative adversarial network models for small roadway object detection
EP3754449A1 (en) Vehicle control method, related device, and computer storage medium
CN114419572B (en) Multi-radar target detection method and device, electronic equipment and storage medium
CN115641518A (en) View sensing network model for unmanned aerial vehicle and target detection method
CN114267041B (en) Method and device for identifying object in scene
CN112100445A (en) Image information processing method and device, electronic equipment and storage medium
US20180330460A1 (en) Handheld photo enforcement systems and methods
CN117576582A (en) Safety detection method and device of flight equipment, storage medium and flight equipment
KR20210055746A (en) Driver's working condition detection method, device, device and computer storage medium
CN113344900B (en) Airport runway intrusion detection method, airport runway intrusion detection device, storage medium and electronic device
CN111832338A (en) Object detection method and device, electronic equipment and storage medium
CN116206363A (en) Behavior recognition method, apparatus, device, storage medium, and program product
CN113721621B (en) Vehicle control method, device, electronic equipment and storage medium
JP7343645B2 (en) Vehicle-based interaction methods, devices, equipment, media and vehicles
KR20190040658A (en) automatic Illegal driving shooting, analysis, reporting and transmission system
KR101925794B1 (en) Apparatus for tagging vedio of camera for means of transportation and method for the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination