CN109895780A - Method and apparatus for enabling an unmanned vehicle to autonomously free itself when stuck - Google Patents


Info

Publication number
CN109895780A
CN109895780A (application CN201711283397.XA)
Authority
CN
China
Prior art keywords
unmanned vehicle
stuck
obstacle information
escape
obstacle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711283397.XA
Other languages
Chinese (zh)
Other versions
CN109895780B (en)
Inventor
王建伟
李雨倩
吴迪
王玉猛
刘丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Jingdong Shangke Information Technology Co Ltd
Priority to CN201711283397.XA
Publication of CN109895780A
Application granted
Publication of CN109895780B
Legal status: Active (current)
Anticipated expiration


Abstract

The invention discloses a method and apparatus for enabling an unmanned vehicle to free itself autonomously when it becomes stuck, and relates to the field of computer technology. One specific embodiment of the method comprises: determining that the unmanned vehicle is stuck, and collecting image information around the vehicle to obtain obstacle information; determining the obstacle type according to the obstacle information, so as to obtain a corresponding handling mode; and executing the handling mode so that the unmanned vehicle frees itself. This embodiment solves the prior-art problem that such situations can only be resolved by sending personnel to the scene, which is inefficient.

Description

Method and apparatus for enabling an unmanned vehicle to free itself autonomously
Technical field
The present invention relates to the field of computer technology, and in particular to a method and apparatus for enabling an unmanned vehicle to free itself autonomously.
Background art
Unmanned driving technology is currently developing rapidly, and unmanned vehicles are an inevitable trend of future development; applying unmanned vehicles to fields such as logistics distribution and intelligent transportation has also become a research hotspot for those skilled in the art.
In the process of realizing the present invention, the inventors found at least the following problems in the prior art: unmanned vehicles are still at the development stage, and when a vehicle becomes stuck during operation the situation is currently handled mainly by sending personnel to the scene, which has many inconveniences. Moreover, when the number of vehicles is large, substantial manpower is required and operating costs are too high; reaching the scene also takes a certain amount of time, so the stuck situation cannot be resolved promptly. Such an approach is too inefficient.
Summary of the invention
In view of this, embodiments of the present invention provide a method and apparatus for enabling an unmanned vehicle to free itself autonomously, which can solve the prior-art problem that a stuck vehicle can only be rescued by sending personnel to the scene, resulting in low efficiency.
To achieve the above object, according to one aspect of the embodiments of the present invention, there is provided a method for enabling an unmanned vehicle to free itself autonomously, comprising: determining that the unmanned vehicle is stuck, and collecting image information around the vehicle to obtain obstacle information; determining the obstacle type according to the obstacle information so as to obtain a corresponding handling mode; and executing the handling mode so that the vehicle frees itself.
Optionally, determining the obstacle type according to the obstacle information so as to obtain a corresponding handling mode comprises: judging, from the obstacle information, whether it is living-obstacle information (information indicating an obstacle with vital signs); if so, warning the living obstacle via a living-obstacle alert mode; otherwise freeing the vehicle by remote control.
Optionally, the living-obstacle alert modes include a first alert mode and a second alert mode;
when the obstacle information is determined to be living-obstacle information, the method further includes: judging whether the living-obstacle information indicates a person; if so, prompting via the first alert mode; otherwise prompting via the second alert mode.
Optionally, the method further includes recording the number of prompts;
executing the handling mode so that the unmanned vehicle frees itself comprises: when it is judged that the vehicle has not freed itself, obtaining the prompt count; if the prompt count is greater than or equal to a preset threshold, freeing the vehicle by remote control; otherwise continuing to collect image information around the vehicle.
Optionally, determining the obstacle type according to the obstacle information comprises: identifying the obstacle information by an image deep-learning method so as to determine the obstacle type, wherein the features used in the image deep-learning method are obtained by training a convolutional neural network in advance.
In addition, according to another aspect of the embodiments of the present invention, there is provided an apparatus for enabling an unmanned vehicle to free itself autonomously, comprising: a trigger module for determining that the unmanned vehicle is stuck and collecting image information around the vehicle to obtain obstacle information; a judgment module for determining the obstacle type according to the obstacle information so as to obtain a corresponding handling mode; and an execution module for executing the handling mode so that the vehicle frees itself.
Optionally, the judgment module determining the obstacle type according to the obstacle information to obtain a corresponding handling mode comprises: judging, from the obstacle information, whether it is living-obstacle information; if so, warning the living obstacle via a living-obstacle alert mode; otherwise freeing the vehicle by remote control.
Optionally, the living-obstacle alert modes include a first alert mode and a second alert mode; when the judgment module determines that the obstacle information is living-obstacle information, it further judges whether the living-obstacle information indicates a person; if so, prompting is done via the first alert mode; otherwise via the second alert mode.
Optionally, the execution module is further configured to record the prompt count;
the execution module executing the handling mode so that the vehicle frees itself comprises: when it is judged that the vehicle has not freed itself, obtaining the prompt count; if the prompt count is greater than or equal to a preset threshold, freeing the vehicle by remote control; otherwise continuing to collect image information around the vehicle.
Optionally, the judgment module determining the obstacle type according to the obstacle information comprises: identifying the obstacle information by an image deep-learning method so as to determine the obstacle type, wherein the features used in the image deep-learning method are obtained by training a convolutional neural network in advance.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, comprising:
one or more processors; and
a storage device for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method of any of the embodiments described above for enabling an unmanned vehicle to free itself.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any of the embodiments described above for enabling an unmanned vehicle to free itself.
An embodiment of the above invention has the following advantage or beneficial effect: by adopting the technical means of determining that the unmanned vehicle is stuck and collecting image information around it to obtain obstacle information, determining the obstacle type according to the obstacle information to obtain a corresponding handling mode, and executing that handling mode so that the vehicle frees itself, the vehicle can be freed quickly by artificial-intelligence means when it becomes stuck. This saves substantial labor cost, shortens the time needed to resolve the stuck situation, and improves the operating efficiency of the unmanned vehicle.
Further effects of the above optional implementations will be explained below in conjunction with specific embodiments.
Brief description of the drawings
The accompanying drawings are provided for a better understanding of the present invention and do not constitute an undue limitation thereof. In the drawings:
Fig. 1 is a schematic diagram of the main flow of a method for enabling an unmanned vehicle to free itself autonomously according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the main flow of a method for enabling an unmanned vehicle to free itself autonomously according to a referable embodiment of the present invention;
Fig. 3 is a schematic diagram of the main modules of an apparatus for enabling an unmanned vehicle to free itself autonomously according to an embodiment of the present invention;
Fig. 4 is an exemplary system architecture diagram to which an embodiment of the present invention can be applied;
Fig. 5 is a schematic structural diagram of a computer system suitable for implementing a terminal device or a server of an embodiment of the present invention.
Detailed description of the embodiments
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, including various details of the embodiments to aid understanding; these details should be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present invention. Likewise, for clarity and conciseness, descriptions of well-known functions and structures are omitted from the following description.
Fig. 1 shows a method for enabling an unmanned vehicle to free itself autonomously according to an embodiment of the present invention. As shown in Fig. 1, the method comprises:
Step S101: determine that the unmanned vehicle is stuck, and collect image information around the vehicle to obtain obstacle information.
Here, the unmanned vehicle being "stuck" means that it is blocked by obstacles and cannot reach its destination along the planned path; the vehicle "freeing itself", as used in the present invention, means that it escapes from the stuck state.
Preferably, the image information can be analyzed and processed by an image deep-learning method, in which the features are obtained by training a convolutional neural network in advance.
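The classifier output described above can be sketched as a small data structure. This is an illustrative sketch only: the class names (`ObstacleInfo`), the label set, and the confidence field are assumptions, since the patent specifies only that a pre-trained convolutional neural network supplies the image features.

```python
from dataclasses import dataclass

# Hypothetical label set: the embodiment distinguishes obstacles with vital
# signs (a person, or another life form such as a dog or cat) from
# obstacles without vital signs.
LIVING_LABELS = {"person", "dog", "cat"}

@dataclass(frozen=True)
class ObstacleInfo:
    """Result of running the pre-trained CNN on one surrounding image."""
    label: str         # predicted class, e.g. "person", "dog", "rock"
    confidence: float  # classifier confidence in [0, 1]

    @property
    def is_living(self) -> bool:
        # "Living-obstacle information": an obstacle with vital signs.
        return self.label in LIVING_LABELS

    @property
    def is_person(self) -> bool:
        return self.label == "person"
```

The two properties mirror the two decisions the method makes later: living versus non-living, and person versus other life form.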
Step S102 determines obstacle species according to the obstacle information, to obtain corresponding processing mode.
Wherein, obstacle information may include feature obstacle information and non-feature obstacle information, wherein the spy Sign obstacle information refers to the biology with vital signs, and the non-feature obstacle information then refers to without life spy The biology of sign.
Preferably, can judge whether to be characterized obstacle information according to obstacle information, if being characterized obstacle information Then by feature barrier warning note mode, the feature barrier is alerted;Otherwise made by remote controlled manner Unmanned equipment is got rid of poverty.It is reached preferably, carrying out remote controlled manner and can be in the observation of display and the operation of controller To making unmanned equipment get rid of poverty.
Further, feature obstacle information can be divided into people information and other life informations (such as dog, cat etc. Image).So when determining that obstacle information is characterized obstacle information, then can according to the feature obstacle information whether Warning note mode is determined for people information.Wherein, the feature barrier warning note mode includes: the first warning note Mode and the second warning note mode.And the first warning note mode can be to play human voice, weak light mode, second reports Alert prompting mode can be big volume, allegro voice and accent light mode.Specific implementation process includes: described in judgement Whether feature obstacle information is people information, if then being prompted by the first warning note mode;Otherwise pass through second Warning note mode is prompted.
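The branching just described, remote control for non-living obstacles, the first alert mode for a person, and the second alert mode for other life forms, can be sketched as a selection function. The function name, return values, and label set are assumptions for illustration, not names from the patent.

```python
def choose_handling_mode(label: str) -> str:
    """Map a classified obstacle label to a handling mode (step S102 sketch).

    - non-living obstacle -> free the vehicle by remote control
    - person              -> first alert mode (human voice, soft light)
    - other life form     -> second alert mode (loud, fast-paced voice,
                             strong light)
    """
    living_labels = {"person", "dog", "cat"}  # assumed label set
    if label not in living_labels:
        return "remote_control"
    if label == "person":
        return "first_alert"
    return "second_alert"
```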
Step S103: execute the handling mode, so that the unmanned vehicle frees itself.
As an embodiment, each time the handling mode has been executed, the prompt count can be increased by one. To improve the efficiency of freeing the vehicle, when it is judged that the vehicle has not freed itself, the recorded prompt count is obtained; if the prompt count is greater than or equal to a preset threshold, the vehicle is freed by remote control; otherwise, image information around the vehicle continues to be collected, i.e. the process returns to step S101.
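The retry logic of step S103, alert, re-check, count prompts, and fall back to remote control once a preset threshold is reached, could look roughly like the loop below. The callback names and the default threshold are illustrative assumptions.

```python
from typing import Callable

def self_rescue_loop(still_stuck: Callable[[], bool],
                     run_alert: Callable[[], None],
                     remote_control: Callable[[], None],
                     prompt_threshold: int = 3) -> str:
    """Alert repeatedly; escalate to remote control after too many prompts.

    Returns "freed" if the obstacle leaves after an alert, or "remote" if
    the prompt count reached the preset threshold and a human operator had
    to take over.
    """
    prompt_count = 0
    while still_stuck():  # re-collect surrounding images and re-check
        if prompt_count >= prompt_threshold:
            remote_control()
            return "remote"
        run_alert()       # execute the chosen alert mode
        prompt_count += 1
    return "freed"
```

A closure standing in for the perception step makes the escalation easy to exercise: a vehicle that stays stuck triggers exactly `prompt_threshold` alerts before control is handed to the operator.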
It should also be noted that freeing the vehicle by remote control means that an operator observes a display and operates a controller: the situation around the vehicle is obtained through the display, and buttons or joysticks on the controller are used for remote intercom, remote horn, and remote control of the vehicle's movement (forward, backward, turning and so on) to achieve the goal of freeing the vehicle.
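The operator actions listed above (intercom, horn, movement control) could be modeled as a small command set. The command names and the logging stand-in for actuation are illustrative assumptions, not part of the patent.

```python
from enum import Enum

class RemoteCommand(Enum):
    """Commands an operator can issue via the controller's buttons/joystick."""
    INTERCOM = "intercom"    # remote two-way voice
    HORN = "horn"            # remote horn/whistle
    FORWARD = "forward"
    BACKWARD = "backward"
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"

def dispatch(command: RemoteCommand, log: list) -> None:
    # A real vehicle would translate each command into actuator input;
    # here we only record the issued command for illustration.
    log.append(command.value)
```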
From the above embodiments it can be seen that the method for enabling an unmanned vehicle to free itself autonomously first attempts to free the vehicle automatically using artificial-intelligence techniques, which saves labor cost and shortens the time needed to resolve the problem. If automatic self-rescue fails, the process automatically switches to remotely monitored manual rescue, which can resolve the vast majority of stuck situations. As the artificial intelligence keeps learning over time, the probability of freeing the vehicle automatically will become larger and larger.
Fig. 2 is a schematic diagram of the main flow of a method for enabling an unmanned vehicle to free itself autonomously according to a referable embodiment of the present invention. The method may include:
Step S201: determine that the unmanned vehicle is stuck; sensors collect image information around the vehicle.
Step S202: analyze the image information to obtain an analysis result.
Preferably, the image information can be analyzed and processed by a deep-learning-based method in which the features are obtained by training a convolutional neural network. Further, a convolutional neural network can be used in advance to classify the obstacle information by feature; the obstacle information may include living-obstacle information and non-living-obstacle information, where living-obstacle information refers to obstacles that are living beings with vital signs and non-living-obstacle information refers to obstacles without vital signs. When a convolutional neural network is used in advance to classify the obstacle information by feature, classification can be performed automatically on the basis of training on a large amount of obstacle information, so as to distinguish the types of obstacle information.
Step S203: obtain obstacle information according to the analysis result.
Step S204: judge, from the obstacle information, whether it is living-obstacle information; if so, go to step S205; otherwise, go directly to step S209.
Step S205: judge whether the living-obstacle information indicates a person; if so, go to step S206; otherwise, go to step S207.
In this embodiment, the obstacle is considered to be a person if the deep-learning analysis of the image matches person information; a convolutional neural network is used to extract the person information.
Preferably, the living-obstacle information can be divided into person information and other life-form information (for example, images of dogs, cats, and the like), and the convolutional neural network can be trained on a large amount of person information and other life-form information so as to distinguish between the two.
Step S206: warn the living obstacle via the person alert mode, then go to step S208.
In this embodiment, the person alert mode may be an alert such as an alarm sound or alarm light; further, the prompt may be given by playing a human voice, using soft light, and similar means. Preferably, each alert increases the prompt count by one.
Step S207: warn the living obstacle via the other-life-form alert mode, then go to step S208.
In this embodiment, the other-life-form alert mode may be an alert such as an alarm sound or alarm light; further, the prompt may be given by a loud, fast-paced voice with strong light and similar means. Preferably, each alert increases the prompt count by one.
Step S208: judge whether the unmanned vehicle has freed itself; if so, exit the process; otherwise, go to step S209.
As an embodiment, image information around the vehicle can be collected and analyzed to judge whether the living-obstacle information is still present, and thereby whether the vehicle has freed itself.
As a further example, when it is judged that the vehicle has not freed itself, the prompt count can be obtained; if the prompt count is greater than or equal to a preset threshold, go to step S209; otherwise, return to step S201. In addition, when it is judged that the vehicle has freed itself, the process can be exited directly.
Step S209: free the unmanned vehicle through an operator's observation of a display and operation of a controller.
In a preferred embodiment, the situation around the vehicle can be obtained through the display, and buttons or joysticks on the controller can be used for remote intercom, remote horn, and remote control of the vehicle's movement (forward, backward, turning and so on) so that the vehicle is freed.
In addition, the specific implementation of the method for enabling an unmanned vehicle to free itself autonomously in this referable embodiment has already been described in detail in the method described above, so the duplicated content is not explained again here.
It is also worth noting that in the above referable embodiment the obstacle information is either living-obstacle information or non-living-obstacle information, but it is also possible for the obstacle information to include both living-obstacle information and non-living-obstacle information. In that case, the obstacles corresponding to the living-obstacle information can be handled by the method of steps S205 to S207, while the obstacles corresponding to the non-living-obstacle information are handled by the method of step S209.
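The mixed case noted above, living and non-living obstacles present at the same time, amounts to partitioning the detections and applying both strategies. A sketch under assumed names (the function, the action strings, and the label set are all illustrative):

```python
def plan_actions(labels: list) -> list:
    """Return the ordered handling actions for a set of detected obstacles.

    Living obstacles each get an alert (person -> first mode, other life
    form -> second mode, as in steps S205-S207); if any non-living obstacle
    is present, a single remote-control action is appended (step S209).
    """
    living = {"person", "dog", "cat"}  # assumed label set
    actions = []
    for label in labels:
        if label in living:
            actions.append("first_alert" if label == "person" else "second_alert")
    if any(label not in living for label in labels):
        actions.append("remote_control")
    return actions
```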
Fig. 3 shows an apparatus for enabling an unmanned vehicle to free itself autonomously according to an embodiment of the present invention. As shown in Fig. 3, the apparatus 300 comprises a trigger module 301, a judgment module 302 and an execution module 303. The trigger module 301 determines that the unmanned vehicle is stuck and collects image information around the vehicle to obtain obstacle information. The judgment module 302 then determines the obstacle type according to the obstacle information so as to obtain a corresponding handling mode. Finally, the execution module 303 executes the handling mode so that the vehicle frees itself.
As a preferred embodiment, the obstacle information may include living-obstacle information and non-living-obstacle information: living-obstacle information refers to obstacles that are living beings with vital signs, while non-living-obstacle information refers to obstacles without vital signs. The judgment module 302 can judge from the obstacle information whether it is living-obstacle information; if so, the living obstacle is warned via a living-obstacle alert mode; otherwise the vehicle is freed by remote control. Preferably, remote control is carried out by an operator observing a display and operating a controller so that the vehicle is freed.
Further, the living-obstacle information can be divided into person information and other life-form information (for example, images of dogs, cats, and the like). When the obstacle information is determined to be living-obstacle information, the judgment module 302 can choose the alert mode according to whether the information indicates a person. The living-obstacle alert modes include a first alert mode and a second alert mode: the first alert mode may be playing a human voice with soft light, while the second alert mode may be a loud, fast-paced voice with strong light. The specific process is: judge whether the living-obstacle information indicates a person; if so, prompt via the first alert mode; otherwise prompt via the second alert mode.
In another embodiment, each time the execution module 303 has executed the handling mode, the prompt count can be increased by one. To improve the efficiency of freeing the vehicle, when it is judged that the vehicle has not freed itself, the recorded prompt count is obtained; if the prompt count is greater than or equal to a preset threshold, the vehicle is freed by remote control; otherwise image information around the vehicle continues to be collected.
It should also be noted that the judgment module 302 can analyze and process the image information by an image deep-learning method so as to identify the obstacle information and determine the obstacle type, wherein the features used in the image deep-learning method are obtained by training a convolutional neural network in advance.
It should be noted that the specific implementation of the apparatus for enabling an unmanned vehicle to free itself autonomously has been described in detail in the method described above, so the duplicated content is not explained again here.
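The three-module decomposition of the apparatus (trigger, judgment, execution) can be sketched as plain classes wired together. Everything below, the class names, the stubbed image capture, and the injected classifier, is assumed for illustration of the module boundaries, not the patent's implementation.

```python
class TriggerModule:
    """Detects the stuck state and collects surrounding image information."""
    def __init__(self, capture_image):
        self.capture_image = capture_image  # stands in for the sensors

    def collect(self):
        return self.capture_image()

class JudgmentModule:
    """Determines the obstacle type and the corresponding handling mode."""
    LIVING = {"person", "dog", "cat"}  # assumed label set

    def decide(self, label: str) -> str:
        if label not in self.LIVING:
            return "remote_control"
        return "first_alert" if label == "person" else "second_alert"

class ExecutionModule:
    """Executes the chosen handling mode and records the prompt count."""
    def __init__(self):
        self.prompt_count = 0

    def execute(self, mode: str) -> str:
        if mode in ("first_alert", "second_alert"):
            self.prompt_count += 1  # each alert increases the prompt count
        return mode

class SelfRescueDevice:
    """Wires the three modules into one self-rescue step."""
    def __init__(self, capture_image):
        self.trigger = TriggerModule(capture_image)
        self.judgment = JudgmentModule()
        self.execution = ExecutionModule()

    def step(self, classify) -> str:
        label = classify(self.trigger.collect())
        return self.execution.execute(self.judgment.decide(label))
```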
Fig. 4 shows an exemplary system architecture 400 to which the method or apparatus for enabling an unmanned vehicle to free itself autonomously of an embodiment of the present invention can be applied.
As shown in Fig. 4, the system architecture 400 may include terminal devices 401, 402 and 403, a network 404 and a server 405. The network 404 is the medium providing communication links between the terminal devices 401, 402, 403 and the server 405, and may include various connection types, such as wired or wireless communication links, fiber-optic cables, and so on.
A user may use the terminal devices 401, 402, 403 to interact with the server 405 through the network 404, so as to receive or send messages and the like. Various communication client applications may be installed on the terminal devices 401, 402, 403, such as shopping applications, web-browser applications, search applications, instant-messaging tools, e-mail clients and social-platform software (by way of example only).
The terminal devices 401, 402, 403 may be various electronic devices that have a display screen and support web browsing, including but not limited to smartphones, tablet computers, laptop computers, desktop computers and so on.
The server 405 may be a server providing various services, for example a back-office management server (by way of example only) that supports shopping websites browsed by users with the terminal devices 401, 402, 403. The back-office management server may analyze and otherwise process received data such as information query requests, and feed the processing results (such as target push information or product information, by way of example only) back to the terminal devices.
It should be noted that the method for enabling an unmanned vehicle to free itself provided by the embodiments of the present invention is generally executed by the server 405; correspondingly, the apparatus for enabling an unmanned vehicle to free itself autonomously is generally arranged in the server 405.
It should be understood that the numbers of terminal devices, networks and servers in Fig. 4 are merely illustrative; there may be any number of terminal devices, networks and servers according to implementation needs.
Referring now to Fig. 5, it shows a schematic structural diagram of a computer system 500 of a terminal device suitable for implementing an embodiment of the present invention. The terminal device shown in Fig. 5 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in Fig. 5, the computer system 500 includes a central processing unit (CPU) 501, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage section 508 into a random-access memory (RAM) 503. Various programs and data required for the operation of the system 500 are also stored in the RAM 503. The CPU 501, the ROM 502 and the RAM 503 are connected to one another through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse and the like; an output section 507 including a cathode-ray tube (CRT), a liquid-crystal display (LCD) and the like, as well as a speaker; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card or a modem. The communication section 509 performs communication processing via a network such as the Internet. A driver 510 is also connected to the I/O interface 505 as needed. A removable medium 511, such as a magnetic disk, an optical disc, a magneto-optical disc or a semiconductor memory, is mounted on the driver 510 as needed, so that a computer program read therefrom can be installed into the storage section 508 as needed.
In particular, according to the disclosed embodiments of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment disclosed by the present invention includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511. When the computer program is executed by the central processing unit (CPU) 501, the above-described functions defined in the system of the present invention are executed.
It should be noted that the computer-readable medium shown in the present invention may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact-disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present invention, a computer-readable storage medium may be any tangible medium that contains or stores a program, where the program can be used by or in combination with an instruction-execution system, apparatus or device. In the present invention, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate or transmit a program for use by or in combination with an instruction-execution system, apparatus or device. The program code contained on a computer-readable medium may be transmitted by any suitable medium, including but not limited to: wireless means, electric wires, optical cables, RF, etc., or any suitable combination of the above.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each box in a flowchart or block diagram may represent a module, a program segment, or a part of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that in some alternative implementations, the functions noted in the boxes may occur in an order different from that shown in the drawings. For example, two boxes shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each box in the block diagrams or flowcharts, and any combination of such boxes, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented in software or in hardware. The described modules may also be provided in a processor; for example, a processor may be described as comprising a trigger module 301, a judgment module 302, and an execution module 303, where the names of these modules do not, under certain circumstances, limit the modules themselves.
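Purely as an illustrative sketch, and not part of the patent text, the trigger/judgment/execution module split described above could look as follows in Python; all class names, method signatures, and obstacle categories here are assumptions:

```python
class TriggerModule:
    """Module 301: detect that the device is stuck and capture surrounding images."""

    def run(self, device):
        # Returns raw image data when the device reports being stuck, else None.
        return device.capture_images() if device.is_stuck() else None


class JudgmentModule:
    """Module 302: classify the obstacle and select a processing mode."""

    def __init__(self, classifier):
        self.classifier = classifier  # e.g. a pretrained CNN, injected here

    def run(self, obstacle_info):
        kind = self.classifier(obstacle_info)
        # "Feature" obstacles (a person or a recognizable object) get an alert;
        # anything unrecognized escalates to remote control by an operator.
        mode = "alert" if kind in ("person", "object") else "remote_control"
        return mode, kind


class ExecutionModule:
    """Module 303: carry out the selected processing mode."""

    def run(self, mode):
        return f"executing {mode}"
```

Injecting the classifier into `JudgmentModule` keeps the image-recognition model swappable, which matches the text's point that the module names do not limit the modules themselves.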
As another aspect, the present invention also provides a computer-readable medium, which may be included in the device described in the above embodiments, or may exist separately without being assembled into that device. The computer-readable medium carries one or more programs which, when executed by the device, cause the device to: determine that the unmanned device is stuck, and acquire image information around the unmanned device to obtain obstacle information; determine an obstacle type according to the obstacle information, so as to obtain a corresponding processing mode; and execute the processing mode, so that the unmanned device escapes.
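To make the flow concrete, here is a minimal, hypothetical Python sketch of the loop restated above: alert a recognized obstacle, count the alerts, and fall back to remote control once a preset count threshold is reached. The function names and the threshold value of 3 are illustrative assumptions, not taken from the patent:

```python
ALERT_THRESHOLD = 3  # preset alert-count threshold (value assumed for illustration)


def attempt_escape(classify, capture, alert, has_escaped):
    """Run the stuck-handling loop until the device escapes or escalates.

    classify(image) -> obstacle kind; capture() -> image;
    alert(mode) issues the first/second alert; has_escaped() -> bool.
    """
    alert_count = 0
    while True:
        kind = classify(capture())
        if kind == "person":
            alert("first")            # first alert mode, used for people
        elif kind == "object":
            alert("second")           # second alert mode, for other known obstacles
        else:
            return "remote_control"   # unrecognized obstacle: ask an operator
        alert_count += 1
        if has_escaped():
            return "escaped"
        if alert_count >= ALERT_THRESHOLD:
            return "remote_control"   # alerts exhausted: escalate to an operator
```

Passing the sensing and alerting operations in as callables keeps the sketch self-contained and testable without real hardware.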
The technical solution of the embodiments of the present invention enables an unmanned device, when stuck, to free itself quickly and autonomously by means of artificial intelligence, thereby saving substantial labor cost, reducing the time needed to resolve the stuck state, and improving the operating efficiency of the unmanned device.
The above specific embodiments do not limit the protection scope of the present invention. Those skilled in the art should understand that, depending on design requirements and other factors, various modifications, combinations, sub-combinations, and substitutions may occur. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (12)

1. A method for realizing autonomous escaping of an unmanned device, characterized by comprising:
determining that the unmanned device is stuck, and acquiring image information around the unmanned device to obtain obstacle information;
determining an obstacle type according to the obstacle information, so as to obtain a corresponding processing mode;
executing the processing mode, so that the unmanned device escapes.
2. The method according to claim 1, characterized in that determining the obstacle type according to the obstacle information to obtain the corresponding processing mode comprises:
judging, according to the obstacle information, whether the obstacle information is feature obstacle information;
according to the judgment result, if it is feature obstacle information, alerting the feature obstacle by a feature-obstacle alert mode; otherwise, making the unmanned device escape by remote control.
3. The method according to claim 2, characterized in that the feature-obstacle alert mode comprises a first alert mode and a second alert mode;
when the obstacle information is determined to be feature obstacle information, the method further comprises:
judging whether the feature obstacle information is person information; if so, prompting by the first alert mode; otherwise, prompting by the second alert mode.
4. The method according to claim 2 or 3, characterized by further comprising: recording the number of alerts;
wherein executing the processing mode so that the unmanned device escapes comprises:
when it is judged that the unmanned device has not escaped, obtaining the number of alerts; if the number of alerts is greater than or equal to a preset threshold, making the unmanned device escape by remote control; otherwise, continuing to acquire image information around the unmanned device.
5. The method according to claim 1, characterized in that determining the obstacle type according to the obstacle information comprises:
identifying the obstacle information by an image deep learning method to determine the obstacle type; wherein the features used in the image deep learning method are obtained by training a convolutional neural network in advance.
6. An apparatus for realizing autonomous escaping of an unmanned device, characterized by comprising:
a trigger module, configured to determine that the unmanned device is stuck, and acquire image information around the unmanned device to obtain obstacle information;
a judgment module, configured to determine an obstacle type according to the obstacle information, so as to obtain a corresponding processing mode;
an execution module, configured to execute the processing mode, so that the unmanned device escapes.
7. The apparatus according to claim 6, characterized in that the judgment module determining the obstacle type according to the obstacle information to obtain the corresponding processing mode comprises:
judging, according to the obstacle information, whether the obstacle information is feature obstacle information;
according to the judgment result, if it is feature obstacle information, alerting the feature obstacle by a feature-obstacle alert mode; otherwise, making the unmanned device escape by remote control.
8. The apparatus according to claim 7, characterized in that the feature-obstacle alert mode comprises a first alert mode and a second alert mode;
when the judgment module determines that the obstacle information is feature obstacle information, the apparatus is further configured for:
judging whether the feature obstacle information is person information; if so, prompting by the first alert mode; otherwise, prompting by the second alert mode.
9. The apparatus according to claim 7 or 8, characterized in that the execution module is further configured to record the number of alerts;
wherein the execution module executing the processing mode so that the unmanned device escapes comprises:
when it is judged that the unmanned device has not escaped, obtaining the number of alerts; if the number of alerts is greater than or equal to a preset threshold, making the unmanned device escape by remote control; otherwise, continuing to acquire image information around the unmanned device.
10. The apparatus according to claim 6, characterized in that the judgment module determining the obstacle type according to the obstacle information comprises:
identifying the obstacle information by an image deep learning method to determine the obstacle type; wherein the features used in the image deep learning method are obtained by training a convolutional neural network in advance.
11. An electronic device, characterized by comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1 to 5.
12. A computer-readable medium having a computer program stored thereon, characterized in that the program, when executed by a processor, implements the method according to any one of claims 1 to 5.
CN201711283397.XA 2017-12-07 2017-12-07 Method and device for realizing autonomous escaping of unmanned equipment Active CN109895780B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711283397.XA CN109895780B (en) 2017-12-07 2017-12-07 Method and device for realizing autonomous escaping of unmanned equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711283397.XA CN109895780B (en) 2017-12-07 2017-12-07 Method and device for realizing autonomous escaping of unmanned equipment

Publications (2)

Publication Number Publication Date
CN109895780A true CN109895780A (en) 2019-06-18
CN109895780B CN109895780B (en) 2021-03-30

Family

ID=66938936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711283397.XA Active CN109895780B (en) 2017-12-07 2017-12-07 Method and device for realizing autonomous escaping of unmanned equipment

Country Status (1)

Country Link
CN (1) CN109895780B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110488846A (en) * 2019-09-19 2019-11-22 广州文远知行科技有限公司 Unmanned remote assistance method, device, equipment and storage medium
CN112526984A (en) * 2020-09-30 2021-03-19 深圳市银星智能科技股份有限公司 Robot obstacle avoidance method and device and robot
CN113715843A (en) * 2021-09-03 2021-11-30 北京易航远智科技有限公司 Method and system for on-site help seeking and getting rid of poverty of unmanned equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105404857A (en) * 2015-11-04 2016-03-16 北京联合大学 Infrared-based night intelligent vehicle front pedestrian detection method
CN105867387A (en) * 2016-03-06 2016-08-17 董岩岩 Logistics remote monitoring and fault assistance processing system
CN106093948A (en) * 2016-06-03 2016-11-09 南阳中衡智能科技有限公司 A kind of stranded detection method of sweeping robot
US20170225617A1 (en) * 2016-02-04 2017-08-10 Toyota Jidosha Kabushiki Kaisha In-vehicle alert device
CN107092252A (en) * 2017-04-11 2017-08-25 杭州光珀智能科技有限公司 A kind of robot automatic obstacle avoidance method and its device based on machine vision

Also Published As

Publication number Publication date
CN109895780B (en) 2021-03-30

Similar Documents

Publication Publication Date Title
US10691928B2 (en) Method and apparatus for facial recognition
CN108629823A Method and device for generating multi-view images
JP2020537262A (en) Methods and equipment for automated monitoring systems
JP2021108094A (en) Method and device for generating interactive models
CN109895780A Method and device for realizing autonomous escaping of unmanned equipment
CN109684624B (en) Method and device for automatically identifying order address road area
CN109815365A (en) Method and apparatus for handling video
CN108763532 Method and apparatus for pushing information and displaying information
CN109344752A (en) Method and apparatus for handling mouth image
CN108960110A (en) Method and apparatus for generating information
CN113822460A (en) Traffic flow prediction method and device, electronic equipment and storage medium
CN113704058B (en) Service model monitoring method and device and electronic equipment
CN109961328A (en) The method and apparatus for determining order cooling off period
WO2021121295A1 (en) Evolutionary tree-based simulated biology teaching method and device
CN109787829A (en) For generating the method and device of information
CN107563467 Method and apparatus for searching for articles
CN110682297A (en) Intelligent interaction system and method for indoor guiding robot
CN111610850A (en) Method for man-machine interaction based on unmanned aerial vehicle
CN109977011A (en) Automatic generation method, device, storage medium and the electronic equipment of test script
CN113253608B (en) Unmanned crane equipment track generation method and device based on artificial intelligence
CN115205677A (en) Method, device, electronic equipment and computer readable medium for identifying image information
CN108228904A (en) For the method and apparatus of output information
CN110046229B (en) Method and device for acquiring information
CN114393583B (en) Method and device for controlling equipment through robot
CN109975795 Sound-source following method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210305

Address after: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant after: Beijing Jingbangda Trading Co.,Ltd.

Address before: 100195 Beijing Haidian Xingshikou Road 65 West Cedar Creative Garden 4 District 11 Building East 1-4 Floor West 1-4 Floor

Applicant before: BEIJING JINGDONG SHANGKE INFORMATION TECHNOLOGY Co.,Ltd.

Applicant before: BEIJING JINGDONG CENTURY TRADING Co.,Ltd.

Effective date of registration: 20210305

Address after: Room a1905, 19 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Beijing Jingdong Qianshi Technology Co.,Ltd.

Address before: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant before: Beijing Jingbangda Trading Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant