CN109895780B - Method and device for realizing autonomous escaping of unmanned equipment - Google Patents


Info

Publication number
CN109895780B
CN109895780B (application CN201711283397.XA)
Authority
CN
China
Prior art keywords: obstacle, information, obstacle information, unmanned equipment, characteristic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711283397.XA
Other languages
Chinese (zh)
Other versions
CN109895780A
Inventor
王建伟
李雨倩
吴迪
王玉猛
刘丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Qianshi Technology Co Ltd filed Critical Beijing Jingdong Qianshi Technology Co Ltd
Priority to CN201711283397.XA
Publication of CN109895780A
Application granted
Publication of CN109895780B
Legal status: Active

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Alarm Systems (AREA)

Abstract

The invention discloses a method and a device for enabling an unmanned device to escape autonomously after becoming trapped, and relates to the field of computer technology. One embodiment of the method comprises: determining that the unmanned device is trapped, and collecting image information of the device's surroundings to obtain obstacle information; determining the type of obstacle according to the obstacle information to obtain a corresponding processing mode; and executing the processing mode so that the unmanned device escapes. This embodiment solves the problem that, in the prior art, a trapped device can only be freed manually on site, which is inefficient.

Description

Method and device for realizing autonomous escaping of unmanned equipment
Technical Field
The present invention relates to the field of computer technology, and in particular to a method and a device for enabling an unmanned device to escape autonomously after becoming trapped.
Background
Unmanned technology is developing rapidly, unmanned devices are an inevitable trend of future development, and their application in fields such as logistics distribution and intelligent transportation has become a research focus for technicians in the field.
In the course of implementing the invention, the inventors found at least the following problems in the prior art: unmanned devices are still at a development stage, and a device that becomes trapped during operation is at present mainly freed manually on site, which brings many inconveniences. Moreover, when the number of unmanned devices is large, substantial manpower is needed for support, so the operating cost is too high; and since workers need time to reach the site, the trapped state cannot be resolved promptly. The efficiency of this approach is therefore too low.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method and an apparatus for enabling an unmanned device to escape autonomously, which can solve the problem that the prior art relies on manual intervention on site and is therefore inefficient.
To achieve the above object, according to one aspect of an embodiment of the present invention, there is provided a method for enabling an unmanned device to escape autonomously, comprising: determining that the unmanned device is trapped, and collecting image information around the unmanned device to obtain obstacle information; determining the type of obstacle according to the obstacle information to obtain a corresponding processing mode; and executing the processing mode so that the unmanned device escapes.
Optionally, determining the type of obstacle according to the obstacle information to obtain a corresponding processing mode comprises: judging whether the obstacle information is characteristic obstacle information; if it is, warning the characteristic obstacle through a characteristic-obstacle alarm prompting mode; otherwise, freeing the unmanned device through remote control.
Optionally, the characteristic-obstacle alarm prompting modes comprise a first alarm prompting mode and a second alarm prompting mode.
When the obstacle information is determined to be characteristic obstacle information, the method further comprises: judging whether the characteristic obstacle information is person information; if so, prompting through the first alarm prompting mode; otherwise, prompting through the second alarm prompting mode.
Optionally, the method further comprises recording the number of prompts.
Executing the processing mode so that the unmanned device escapes then comprises: when it is judged that the unmanned device has not escaped, obtaining the number of prompts; if the number of prompts is greater than or equal to a preset threshold, freeing the unmanned device through remote control; otherwise, continuing to collect image information around the unmanned device.
Optionally, determining the type of obstacle according to the obstacle information comprises: identifying the obstacle information through an image deep-learning method to determine the obstacle type, where the features used in the image deep-learning method are obtained by pre-training a convolutional neural network.
In addition, according to one aspect of the embodiments of the present invention, there is provided an apparatus for enabling an unmanned device to escape autonomously, comprising: a trigger module, configured to determine that the unmanned device is trapped and collect image information around the unmanned device to obtain obstacle information; a judging module, configured to determine the type of obstacle according to the obstacle information to obtain a corresponding processing mode; and an execution module, configured to execute the processing mode so that the unmanned device escapes.
Optionally, the judging module determines the type of obstacle according to the obstacle information to obtain a corresponding processing mode by: judging whether the obstacle information is characteristic obstacle information; if it is, warning the characteristic obstacle through a characteristic-obstacle alarm prompting mode; otherwise, freeing the unmanned device through remote control.
Optionally, the characteristic-obstacle alarm prompting modes comprise a first alarm prompting mode and a second alarm prompting mode. When the judging module determines that the obstacle information is characteristic obstacle information, it further judges whether the characteristic obstacle information is person information; if so, it prompts through the first alarm prompting mode; otherwise, it prompts through the second alarm prompting mode.
Optionally, the execution module is further configured to record the number of prompts.
The execution module executes the processing mode so that the unmanned device escapes by: when it is judged that the unmanned device has not escaped, obtaining the number of prompts; if the number of prompts is greater than or equal to a preset threshold, freeing the unmanned device through remote control; otherwise, continuing to collect image information around the unmanned device.
Optionally, the judging module determines the type of obstacle according to the obstacle information by identifying the obstacle information through an image deep-learning method, where the features used in the image deep-learning method are obtained by pre-training a convolutional neural network.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including:
one or more processors;
a storage device for storing one or more programs,
which, when executed by the one or more processors, cause the one or more processors to implement the method of any of the above embodiments for enabling an unmanned device to escape.
According to another aspect of an embodiment of the present invention, there is also provided a computer-readable medium on which a computer program is stored, the program, when executed by a processor, implementing the method of any of the above embodiments for enabling an unmanned device to escape.
One embodiment of the above invention has the following advantages or beneficial effects: because the method determines that the unmanned device is trapped, collects image information around the device to obtain obstacle information, determines the type of obstacle to obtain a corresponding processing mode, and executes that processing mode so that the device escapes, a trapped unmanned device can be freed quickly through artificial intelligence. This saves a large amount of labor cost, reduces the time needed to resolve the trapped state, and improves the operating efficiency of the unmanned device.
Further effects of the above optional embodiments are described below in connection with the specific embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
fig. 1 is a schematic diagram of the main flow of a method for enabling an unmanned device to escape autonomously according to an embodiment of the present invention;
fig. 2 is a schematic diagram of the main flow of a method for enabling an unmanned device to escape autonomously according to a reference embodiment of the present invention;
FIG. 3 is a schematic diagram of the main modules of an apparatus for enabling an unmanned device to escape autonomously according to an embodiment of the present invention;
FIG. 4 is an exemplary system architecture diagram in which embodiments of the present invention may be employed;
fig. 5 is a schematic block diagram of a computer system suitable for use in implementing a terminal device or server of an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, in which various details of embodiments of the invention are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 shows a method for enabling an unmanned device to escape autonomously according to an embodiment of the present invention. As shown in fig. 1, the method includes:
and S101, determining that the unmanned equipment is trapped, and acquiring image information around the unmanned equipment to acquire obstacle information.
Here, being trapped means that the unmanned device is blocked by an obstacle and cannot proceed to its destination along the planned path; escaping means that the device returns to a state in which it can continue on its way.
Preferably, the image information can be analyzed by an image deep-learning method, where the features used in the method are obtained by pre-training a convolutional neural network.
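The convolution at the heart of such feature extraction can be illustrated with a minimal pure-Python sketch. The patent does not specify its network; this only shows the sliding-window operation a convolutional layer performs:

```python
def conv2d(image, kernel):
    """Valid cross-correlation of a 2-D image with a 2-D kernel: the basic
    operation a convolutional layer applies to extract image features."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):          # slide the kernel over every
        row = []                          # position where it fits entirely
        for c in range(iw - kw + 1):
            acc = 0.0
            for i in range(kh):
                for j in range(kw):
                    acc += image[r + i][c + j] * kernel[i][j]
            row.append(acc)
        out.append(row)
    return out
```

A horizontal edge kernel such as `[[1, -1]]` responds where pixel intensity changes; a pretrained network composes many such low-level responses into obstacle detections.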
Step S102, determining the type of obstacle according to the obstacle information to obtain a corresponding processing mode.
The obstacle information may include characteristic obstacle information and non-characteristic obstacle information, where characteristic obstacle information refers to an obstacle with vital signs (a living being) and non-characteristic obstacle information refers to an obstacle without vital signs.
Preferably, whether the obstacle information is characteristic obstacle information can be judged from the obstacle information; if it is, the characteristic obstacle is warned through a characteristic-obstacle alarm prompting mode; otherwise, the unmanned device is freed through remote control. Preferably, remote control is performed by observing a display and operating a controller so that the unmanned device escapes.
Further, the characteristic obstacle information may be divided into person information and other-living-being information (for example, images of dogs, cats, and the like). When the obstacle information is determined to be characteristic obstacle information, the alarm prompting mode may then be chosen according to whether the characteristic obstacle information is person information. The characteristic-obstacle alarm prompting modes comprise a first alarm prompting mode and a second alarm prompting mode: the first may play a human voice with weak light, while the second may use loud, fast-paced sound and strong light. The specific process is: judging whether the characteristic obstacle information is person information; if so, prompting through the first alarm prompting mode; otherwise, prompting through the second alarm prompting mode.
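The classification and prompt-mode selection described above can be sketched as follows. The label names ("person", "dog", ...) and the mode parameters are illustrative assumptions, not values taken from the patent:

```python
# Labels a pretrained classifier might emit for living obstacles (assumed set).
LIVING_LABELS = {"person", "dog", "cat", "bird"}

def categorize(label):
    """Characteristic obstacle = living being; non-characteristic = anything else."""
    return "characteristic" if label in LIVING_LABELS else "non-characteristic"

def choose_alarm_mode(label):
    """First alarm mode for a person (calm voice, weak light); second mode for
    other living beings (loud fast-paced sound, strong light).
    None means there is no living obstacle, so fall back to remote control."""
    if categorize(label) != "characteristic":
        return None
    if label == "person":
        return {"sound": "calm human voice", "light": "weak"}
    return {"sound": "loud fast-paced", "light": "strong"}
```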
Step S103, executing the processing mode so that the unmanned device escapes.
In one embodiment, the prompt count may be incremented each time the processing mode is executed. To improve escape efficiency, when it is judged that the unmanned device has not escaped, the recorded prompt count is obtained and compared with a preset threshold: if the count is greater than or equal to the threshold, the unmanned device is freed through remote control; otherwise, image information around the unmanned device continues to be collected and the process returns to step S101.
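The prompt-counting logic of step S103 can be sketched as a small control loop. Function names and the threshold value are hypothetical; the patent only requires that repeated unsuccessful prompts escalate to remote control:

```python
def attempt_escape(still_trapped, sound_alarm, remote_rescue, threshold=3):
    """Alarm repeatedly; after `threshold` unsuccessful prompts,
    fall back to remote control by a human operator."""
    prompts = 0
    while still_trapped():
        if prompts >= threshold:
            remote_rescue()
            return "remote"     # freed by the remote operator
        sound_alarm()
        prompts += 1
    return "self"               # obstacle left: device freed itself
```

With a cooperative obstacle the device frees itself; with an unresponsive one it escalates after the threshold is reached.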
It should further be noted that when the unmanned device is freed through remote control, that is, by observing the display and operating the controller, the situation around the unmanned device can be viewed on the display, and the buttons or joystick on the controller can be used for remote intercom, remote whistling, and remote control of the device's movement (forward, backward, turning, and the like) so as to free the unmanned device.
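The remote-control operations listed here can be sketched as a thin command interface. The method names and the command log are illustrative; the patent does not define a control protocol:

```python
class RemoteController:
    """Sketch of the operator-side controls: intercom, whistle, and movement."""

    def __init__(self):
        self.sent = []  # log of commands sent to the unmanned device

    def intercom(self, message):
        self.sent.append(("intercom", message))

    def whistle(self):
        self.sent.append(("whistle",))

    def move(self, direction):
        if direction not in ("forward", "backward", "left", "right"):
            raise ValueError("unknown direction: " + direction)
        self.sent.append(("move", direction))
```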
According to the above embodiments, the method for enabling an unmanned device to escape autonomously preferentially uses artificial intelligence to escape automatically, saving labor cost and reducing the time needed to resolve the trapped state. If automatic escape fails, the method switches automatically to human-supervised remote escape, which can resolve most remaining cases. Ongoing machine learning further increases the probability of successful automatic escape.
Fig. 2 is a schematic diagram of the main flow of a method for enabling an unmanned device to escape autonomously according to a reference embodiment of the present invention. The method may include:
step S201, determining that the unmanned equipment is trapped, and acquiring image information around the unmanned equipment by a sensor.
Step S202, analyzing the image information to obtain an analysis result.
Preferably, the image information can be analyzed by a deep-learning method, where the features used in deep learning are obtained by training a convolutional neural network. Further, a convolutional neural network may be used in advance to classify the obstacle information in the features into characteristic obstacle information and non-characteristic obstacle information, where characteristic obstacle information refers to an obstacle with vital signs and non-characteristic obstacle information refers to an obstacle without vital signs. When the convolutional neural network is used to divide the obstacle information in advance, it can be trained on a large amount of obstacle information so that the classes are distinguished automatically.
Step S203, obtaining obstacle information according to the analysis result.
Step S204, judging whether the obstacle information is characteristic obstacle information; if so, performing step S205; otherwise, proceeding directly to step S209.
Step S205, judging whether the characteristic obstacle information is person information; if so, performing step S206; otherwise, performing step S207.
In this embodiment, if the deep-learning analysis of the image matches person information, the obstacle is considered to be a person; the person information is extracted using a convolutional neural network.
Preferably, the characteristic obstacle information may be divided into person information and other-living-being information (such as images of dogs, cats, and the like), and the convolutional neural network may be trained on a large amount of person information and other-living-being information so as to distinguish the two.
Step S206, warning the characteristic obstacle through a person alarm prompting mode, and performing step S208.
In one embodiment, the person alarm prompting mode may use an alarm sound, alarm lighting, and the like; for example, it may play a human voice together with weak light. Preferably, the prompt count is incremented by one for each alarm prompt.
Step S207, warning the characteristic obstacle through an other-living-being alarm prompting mode, and performing step S208.
In one embodiment, the other-living-being alarm prompting mode may also use alarm sound and alarm lighting, for example loud, fast-paced sound and strong light. Preferably, the prompt count is incremented by one for each alarm prompt.
Step S208, judging whether the unmanned device has escaped; if so, exiting the process; otherwise, performing step S209.
In one embodiment, image information around the unmanned device may be collected and analyzed to judge whether the characteristic obstacle is still present, and thereby whether the unmanned device has escaped.
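The escape check of step S208 amounts to re-running detection on fresh images and testing whether any characteristic obstacle remains. In this sketch `detections` stands in for the labels produced by the image analysis, and the label set is an assumption:

```python
LIVING_LABELS = {"person", "dog", "cat", "bird"}  # assumed living-obstacle labels

def has_escaped(detections):
    """Per step S208: the device counts as escaped once no characteristic
    (living) obstacle is seen in the freshly collected surrounding images."""
    return not any(label in LIVING_LABELS for label in detections)
```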
In a further embodiment, when it is judged that the unmanned device has not escaped, the prompt count may be obtained; if the count is greater than or equal to a preset threshold, step S209 is performed; otherwise, the process returns to step S201. When the unmanned device is judged to have escaped, the process may exit directly.
Step S209, freeing the unmanned device by observing the display and operating the controller.
In a preferred embodiment, the situation around the unmanned device can be viewed on the display, and buttons or a joystick on the controller can be used for remote intercom, remote whistling, and remote control of the device's movement (forward, backward, turning, and the like).
In addition, for the specific implementation details of this reference embodiment, reference may be made to the method for enabling an unmanned device to escape autonomously described above; repeated content is not described again here.
It should be noted that in the above reference embodiment the obstacle information is either characteristic or non-characteristic, but the obstacle information may also contain both kinds at the same time. In that case, the obstacles of the characteristic kind may be handled by the method of steps S205 to S207, while the obstacles of the non-characteristic kind are handled by the method of step S209.
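The mixed case noted here can be handled by partitioning the detections and dispatching each group to its handler. Label names are assumptions, and the action tuples merely name the steps of the reference embodiment:

```python
LIVING_LABELS = {"person", "dog", "cat", "bird"}  # assumed living-obstacle labels

def dispatch(detections):
    """Characteristic obstacles get the alarm treatment (steps S205-S207);
    non-characteristic obstacles go to remote control (step S209)."""
    to_alarm = [d for d in detections if d in LIVING_LABELS]
    to_remote = [d for d in detections if d not in LIVING_LABELS]
    actions = []
    if to_alarm:
        actions.append(("alarm", to_alarm))
    if to_remote:
        actions.append(("remote_control", to_remote))
    return actions
```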
Fig. 3 shows an apparatus for enabling an unmanned device to escape autonomously according to an embodiment of the present invention. As shown in fig. 3, the apparatus 300 includes a triggering module 301, a judging module 302, and an execution module 303. The triggering module 301 determines that the unmanned device is trapped and collects image information around the device to obtain obstacle information. The judging module 302 then determines the type of obstacle according to the obstacle information to obtain a corresponding processing mode. Finally, the execution module 303 executes the processing mode so that the unmanned device escapes.
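The three modules can be wired together as in this sketch, which mirrors the trigger → judge → execute pipeline of apparatus 300 (class and method names are illustrative):

```python
class EscapeDevice:
    """Trigger -> judge -> execute pipeline of apparatus 300 (sketch)."""

    def __init__(self, trigger, judge, executor):
        self.trigger = trigger    # module 301: collects surrounding images
        self.judge = judge        # module 302: obstacle info -> processing mode
        self.executor = executor  # module 303: runs the processing mode

    def on_trapped(self):
        obstacle_info = self.trigger()
        mode = self.judge(obstacle_info)
        return self.executor(mode)
```

Injecting the three stages as callables keeps each module independently testable, in the spirit of the modular description.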
In a preferred embodiment, the obstacle information may include characteristic obstacle information and non-characteristic obstacle information, where characteristic obstacle information refers to an obstacle with vital signs and non-characteristic obstacle information refers to an obstacle without vital signs. The judging module 302 may judge whether the obstacle information is characteristic obstacle information; if so, it warns the characteristic obstacle through a characteristic-obstacle alarm prompting mode; otherwise, the unmanned device is freed through remote control, preferably by observing the display and operating the controller.
Further, the characteristic obstacle information may be divided into person information and other-living-being information (such as images of dogs, cats, and the like). When the judging module 302 determines that the obstacle information is characteristic obstacle information, it may choose the alarm prompting mode according to whether the characteristic obstacle information is person information. The characteristic-obstacle alarm prompting modes comprise a first alarm prompting mode and a second alarm prompting mode: the first may play a human voice with weak light, while the second may use loud, fast-paced sound and strong light. The specific process is: judging whether the characteristic obstacle information is person information; if so, prompting through the first alarm prompting mode; otherwise, prompting through the second alarm prompting mode.
In another embodiment, the execution module 303 may increment the prompt count each time the processing mode is executed. To improve escape efficiency, when it is judged that the unmanned device has not escaped, the recorded prompt count is obtained and compared with a preset threshold: if the count is greater than or equal to the threshold, the unmanned device is freed through remote control; otherwise, image information around the unmanned device continues to be collected.
It should further be noted that the judging module 302 may analyze the image information by an image deep-learning method so as to identify the obstacle information and determine the obstacle type, where the features used in the image deep-learning method are obtained by pre-training a convolutional neural network.
It should be noted that the specific implementation of the apparatus for enabling an unmanned device to escape autonomously has already been described in detail in the above method, and the repeated content is therefore not described again here.
Fig. 4 illustrates an exemplary system architecture 400 to which the method or apparatus for enabling an unmanned device to escape autonomously according to embodiments of the present invention may be applied.
As shown in fig. 4, the system architecture 400 may include terminal devices 401, 402, 403, a network 404, and a server 405. The network 404 serves as a medium for providing communication links between the terminal devices 401, 402, 403 and the server 405. Network 404 may include various types of connections, such as wire, wireless communication links, or fiber optic cables, to name a few.
A user may use terminal devices 401, 402, 403 to interact with a server 405 over a network 404 to receive or send messages or the like. The terminal devices 401, 402, 403 may have installed thereon various communication client applications, such as shopping-like applications, web browser applications, search-like applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only).
The terminal devices 401, 402, 403 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 405 may be a server providing various services, such as a background management server (for example only) providing support for shopping websites browsed by users using the terminal devices 401, 402, 403. The backend management server may analyze and perform other processing on the received data such as the product information query request, and feed back a processing result (for example, target push information, product information — just an example) to the terminal device.
It should be noted that the method for enabling an unmanned device to escape autonomously provided by the embodiments of the present invention is generally executed by the server 405; accordingly, the apparatus for enabling an unmanned device to escape autonomously is generally disposed in the server 405.
It should be understood that the number of terminal devices, networks, and servers in fig. 4 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 5, shown is a block diagram of a computer system 500 suitable for use with a terminal device implementing an embodiment of the present invention. The terminal device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 5, the computer system 500 includes a Central Processing Unit (CPU) 501 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the system 500 are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input portion 506 including a keyboard, a mouse, and the like; an output portion 507 including a display such as a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD), and a speaker; a storage portion 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card or a modem. The communication section 509 performs communication processing via a network such as the internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on the drive 510 as necessary, so that a computer program read out therefrom is installed into the storage section 508 as needed.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. The computer program performs the above-described functions defined in the system of the present invention when executed by the Central Processing Unit (CPU) 501.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented in software or hardware. The described modules may also be provided in a processor, which may be described as: a processor including a triggering module 301, a determining module 302, and an executing module 303. In some cases, the names of these modules do not constitute a limitation on the modules themselves.
As another aspect, the present invention also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being incorporated into the apparatus. The computer-readable medium carries one or more programs which, when executed by a device, cause the device to: determine that the unmanned device is trapped, and acquire image information around the unmanned device to obtain obstacle information; determine the type of the obstacle according to the obstacle information to obtain a corresponding processing mode; and execute the processing mode so that the unmanned device frees itself.
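The three steps carried by the program above can be sketched as follows. This is a minimal illustration only; all names (`Strategy`, `ObstacleInfo`, `choose_strategy`, the label set) are hypothetical and not taken from the patent, which does not specify concrete classes or interfaces.

```python
# Hypothetical sketch of the three-step escape method: (1) detect that the
# device is trapped and gather obstacle information, (2) classify the obstacle
# to choose a handling strategy, (3) execute the strategy.
from dataclasses import dataclass
from enum import Enum, auto

class Strategy(Enum):
    ALARM = auto()           # warn a movable ("characteristic") obstacle away
    REMOTE_CONTROL = auto()  # ask a remote operator to drive the device free

@dataclass
class ObstacleInfo:
    label: str     # classifier output, e.g. "person", "dog", "rock"
    movable: bool  # True for obstacles that can move away on their own

def choose_strategy(obstacle: ObstacleInfo) -> Strategy:
    # A "characteristic" obstacle (person, animal) may respond to a warning;
    # anything else requires remote-control intervention.
    return Strategy.ALARM if obstacle.movable else Strategy.REMOTE_CONTROL

# Example: a pedestrian blocking the device triggers an alarm prompt,
# while a fallen branch triggers remote control.
assert choose_strategy(ObstacleInfo("person", True)) is Strategy.ALARM
assert choose_strategy(ObstacleInfo("branch", False)) is Strategy.REMOTE_CONTROL
```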
According to the technical scheme of the embodiments of the invention, when the unmanned device is trapped, it can quickly free itself autonomously by means of artificial intelligence. This saves substantial labor cost, shortens the time needed to resolve trapped situations, and improves the operating efficiency of the unmanned device.
The above-described embodiments should not be construed as limiting the scope of the invention. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (8)

1. A method for realizing autonomous escape of an unmanned device, characterized by comprising the following steps:
determining that the unmanned device is trapped, and acquiring image information around the unmanned device to obtain obstacle information;
determining the type of the obstacle according to the obstacle information to obtain a corresponding processing mode, which comprises: judging, according to the obstacle information, whether the obstacle information is characteristic obstacle information; if it is characteristic obstacle information, warning the characteristic obstacle in a characteristic-obstacle alarm prompting mode; otherwise, freeing the unmanned device through remote control;
executing the processing mode so that the unmanned device frees itself;
wherein the method further comprises: recording the number of alarm prompts;
and wherein executing the processing mode so that the unmanned device frees itself comprises:
when it is determined that the unmanned device has not escaped, obtaining the number of alarm prompts; if the number of alarm prompts is greater than or equal to a preset count threshold, freeing the unmanned device through remote control; otherwise, continuing to acquire the image information around the unmanned device.
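The alarm-count escalation in claim 1 can be sketched as a loop: keep prompting while the device remains trapped, and fall back to remote control once the number of prompts reaches a preset threshold. All names (`attempt_escape`, the callback parameters, the default threshold of 3) are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of claim 1's escalation logic: repeat the alarm prompt
# while the device is still trapped; once the number of prompts reaches the
# preset count threshold, escalate to remote control.
def attempt_escape(is_trapped, sound_alarm, request_remote_control,
                   max_alarm_count: int = 3) -> str:
    alarm_count = 0
    while is_trapped():                 # re-check after each prompt
        if alarm_count >= max_alarm_count:
            request_remote_control()    # threshold reached: escalate
            return "remote_control"
        sound_alarm()
        alarm_count += 1                # record the number of alarm prompts
    return "escaped"

# Example: an obstacle that never moves forces escalation after 3 prompts.
alarms = []
result = attempt_escape(is_trapped=lambda: True,
                        sound_alarm=lambda: alarms.append("beep"),
                        request_remote_control=lambda: None)
assert result == "remote_control" and len(alarms) == 3
```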
2. The method of claim 1, wherein the characteristic-obstacle alarm prompting mode comprises a first alarm prompting mode and a second alarm prompting mode;
when determining that the obstacle information is the characteristic obstacle information, the method further comprises:
judging whether the characteristic obstacle information is human information; if so, prompting in the first alarm prompting mode; otherwise, prompting in the second alarm prompting mode.
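Claim 2's selection between the two alarm modes reduces to a single branch on whether the characteristic obstacle is human. The label set and mode names below are hypothetical; the patent does not define concrete values.

```python
# Sketch of claim 2: a first alarm prompt for humans, a second for other
# characteristic obstacles (e.g. animals). Labels are illustrative only.
def select_alarm_mode(obstacle_label: str) -> str:
    return "first_alarm" if obstacle_label == "person" else "second_alarm"

assert select_alarm_mode("person") == "first_alarm"
assert select_alarm_mode("dog") == "second_alarm"
```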
3. The method of claim 1, wherein determining the type of the obstacle according to the obstacle information comprises:
identifying the obstacle information through an image deep-learning method to determine the type of the obstacle, wherein the features used in the image deep-learning method are obtained by pre-training a convolutional neural network.
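The convolutional features mentioned in claim 3 are built from 2D convolutions over the image. The hand-rolled 3x3 convolution below only illustrates that core operation; a real system would use a pre-trained deep network (the patent names a CNN but no architecture), and the kernel and input here are toy assumptions.

```python
# Minimal illustration of convolutional feature extraction, the building
# block of the CNN-based classifier in claim 3.
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2D cross-correlation, as computed by CNN layers."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel responds strongly at the boundary of an obstacle.
edge_kernel = np.array([[1., 0., -1.],
                        [1., 0., -1.],
                        [1., 0., -1.]])
image = np.zeros((5, 5))
image[:, 3:] = 1.0                # right half of the frame is the "obstacle"
feature_map = conv2d(image, edge_kernel)
assert feature_map.shape == (3, 3)
assert feature_map.min() == -3.0  # strong response at the obstacle boundary
```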
4. An apparatus for realizing autonomous escape of an unmanned device, comprising:
a triggering module, configured to determine that the unmanned device is trapped and acquire image information around the unmanned device to obtain obstacle information;
a determining module, configured to determine the type of the obstacle according to the obstacle information to obtain a corresponding processing mode, which comprises: judging, according to the obstacle information, whether the obstacle information is characteristic obstacle information; if it is characteristic obstacle information, warning the characteristic obstacle in a characteristic-obstacle alarm prompting mode; otherwise, freeing the unmanned device through remote control; and
an executing module, configured to execute the processing mode so that the unmanned device frees itself, and to record the number of alarm prompts; wherein executing the processing mode so that the unmanned device frees itself comprises: when it is determined that the unmanned device has not escaped, obtaining the number of alarm prompts; if the number of alarm prompts is greater than or equal to a preset count threshold, freeing the unmanned device through remote control; otherwise, continuing to acquire the image information around the unmanned device.
5. The apparatus of claim 4, wherein the characteristic-obstacle alarm prompting mode comprises a first alarm prompting mode and a second alarm prompting mode;
when the determining module determines that the obstacle information is the characteristic obstacle information, the determining module further:
judges whether the characteristic obstacle information is human information; if so, prompts in the first alarm prompting mode; otherwise, prompts in the second alarm prompting mode.
6. The apparatus of claim 4, wherein the determining module determining the type of the obstacle according to the obstacle information comprises:
identifying the obstacle information through an image deep-learning method to determine the type of the obstacle, wherein the features used in the image deep-learning method are obtained by pre-training a convolutional neural network.
7. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-3.
8. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-3.
CN201711283397.XA 2017-12-07 2017-12-07 Method and device for realizing autonomous escaping of unmanned equipment Active CN109895780B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711283397.XA CN109895780B (en) 2017-12-07 2017-12-07 Method and device for realizing autonomous escaping of unmanned equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711283397.XA CN109895780B (en) 2017-12-07 2017-12-07 Method and device for realizing autonomous escaping of unmanned equipment

Publications (2)

Publication Number Publication Date
CN109895780A CN109895780A (en) 2019-06-18
CN109895780B true CN109895780B (en) 2021-03-30

Family

ID=66938936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711283397.XA Active CN109895780B (en) 2017-12-07 2017-12-07 Method and device for realizing autonomous escaping of unmanned equipment

Country Status (1)

Country Link
CN (1) CN109895780B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110488846A (en) * 2019-09-19 2019-11-22 广州文远知行科技有限公司 Unmanned remote assistance method, device, equipment and storage medium
CN112526984B (en) * 2020-09-30 2024-06-21 深圳银星智能集团股份有限公司 Robot obstacle avoidance method and device and robot
CN113715843B (en) * 2021-09-03 2022-06-21 北京易航远智科技有限公司 Method and system for on-site help seeking and getting rid of poverty of unmanned equipment

Citations (4)

Publication number Priority date Publication date Assignee Title
CN105404857A (en) * 2015-11-04 2016-03-16 北京联合大学 Infrared-based night intelligent vehicle front pedestrian detection method
CN105867387A (en) * 2016-03-06 2016-08-17 董岩岩 Logistics remote monitoring and fault assistance processing system
CN106093948A (en) * 2016-06-03 2016-11-09 南阳中衡智能科技有限公司 A kind of stranded detection method of sweeping robot
CN107092252A (en) * 2017-04-11 2017-08-25 杭州光珀智能科技有限公司 A kind of robot automatic obstacle avoidance method and its device based on machine vision

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP6304272B2 (en) * 2016-02-04 2018-04-04 トヨタ自動車株式会社 Vehicle alert device

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN105404857A (en) * 2015-11-04 2016-03-16 北京联合大学 Infrared-based night intelligent vehicle front pedestrian detection method
CN105867387A (en) * 2016-03-06 2016-08-17 董岩岩 Logistics remote monitoring and fault assistance processing system
CN106093948A (en) * 2016-06-03 2016-11-09 南阳中衡智能科技有限公司 A kind of stranded detection method of sweeping robot
CN107092252A (en) * 2017-04-11 2017-08-25 杭州光珀智能科技有限公司 A kind of robot automatic obstacle avoidance method and its device based on machine vision

Also Published As

Publication number Publication date
CN109895780A (en) 2019-06-18

Similar Documents

Publication Publication Date Title
CN110061909B (en) Method and apparatus for processing information
CN110288049B (en) Method and apparatus for generating image recognition model
CN109895780B (en) Method and device for realizing autonomous escaping of unmanned equipment
CN111523640B (en) Training method and device for neural network model
US9904714B2 (en) Crowd sourcing of device sensor data for real time response
US20180217593A1 (en) Personality sharing among drone swarm
US11328518B2 (en) Method and apparatus for outputting information
US11082532B2 (en) Method and apparatus for sending information
CN110598504A (en) Image recognition method and device, electronic equipment and storage medium
CN111467074B (en) Method and device for detecting livestock status
US11392139B2 (en) Method, apparatus and control system for controlling mobile robot
US20220281117A1 (en) Remote physiological data sensing robot
CN111600772A (en) Network distribution content detection processing device, method, system and electronic equipment
CN111078855A (en) Information processing method, information processing device, electronic equipment and storage medium
CN111191066A (en) Image recognition-based pet identity recognition method and device
CN113391627A (en) Unmanned vehicle driving mode switching method and device, vehicle and cloud server
CN109471437B (en) Method, device and control system for controlling mobile robot
CN109634554B (en) Method and device for outputting information
CN111414829A (en) Method and device for sending alarm information
US20220227374A1 (en) Data processing
CN107283429B (en) Control method, device and system based on artificial intelligence and terminal
CN115512447A (en) Living body detection method and device
CN112732591B (en) Edge computing framework for cache deep learning
CN111310858B (en) Method and device for generating information
CN110046229B (en) Method and device for acquiring information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210305

Address after: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant after: Beijing Jingbangda Trading Co.,Ltd.

Address before: 100195 Beijing Haidian Xingshikou Road 65 West Cedar Creative Garden 4 District 11 Building East 1-4 Floor West 1-4 Floor

Applicant before: BEIJING JINGDONG SHANGKE INFORMATION TECHNOLOGY Co.,Ltd.

Applicant before: BEIJING JINGDONG CENTURY TRADING Co.,Ltd.

Effective date of registration: 20210305

Address after: Room a1905, 19 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Beijing Jingdong Qianshi Technology Co.,Ltd.

Address before: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant before: Beijing Jingbangda Trading Co.,Ltd.

GR01 Patent grant