Summary of the invention
In view of this, embodiments of the present invention provide a method and an apparatus for enabling an unmanned device to autonomously free itself when stuck, which can solve the problem in the prior art that a stuck device can only be freed by personnel arriving on site, resulting in low efficiency.
To achieve the above object, according to one aspect of the embodiments of the present invention, a method for enabling an unmanned device to autonomously free itself is provided, including: determining that the unmanned device is stuck, and acquiring image information around the unmanned device to obtain obstacle information; determining an obstacle type according to the obstacle information to obtain a corresponding handling mode; and executing the handling mode so that the unmanned device is freed.
Optionally, determining the obstacle type according to the obstacle information to obtain the corresponding handling mode includes: judging, according to the obstacle information, whether it is living-obstacle information; and, according to the judgment result, warning the living obstacle by means of a living-obstacle warning prompt if it is living-obstacle information, or otherwise freeing the unmanned device by remote control.
Optionally, the living-obstacle warning prompt includes a first warning prompt mode and a second warning prompt mode. When the obstacle information is determined to be living-obstacle information, the method further includes: judging whether the living-obstacle information is person information; prompting by the first warning prompt mode if it is, and otherwise prompting by the second warning prompt mode.
Optionally, the method further includes recording the number of prompts. Executing the handling mode so that the unmanned device is freed includes: when it is judged that the unmanned device has not been freed, obtaining the prompt count; if the prompt count is greater than or equal to a preset count threshold, freeing the unmanned device by remote control; otherwise, continuing to acquire image information around the unmanned device.
Optionally, determining the obstacle type according to the obstacle information includes: identifying the obstacle information by an image deep-learning method to determine the obstacle type, wherein the features used in the image deep-learning method are obtained by pre-training a convolutional neural network.
In addition, according to another aspect of the embodiments of the present invention, an apparatus for enabling an unmanned device to autonomously free itself is provided, including: a trigger module, configured to determine that the unmanned device is stuck and acquire image information around the unmanned device to obtain obstacle information; a judgment module, configured to determine an obstacle type according to the obstacle information to obtain a corresponding handling mode; and an execution module, configured to execute the handling mode so that the unmanned device is freed.
Optionally, the judgment module determining the obstacle type according to the obstacle information to obtain the corresponding handling mode includes: judging, according to the obstacle information, whether it is living-obstacle information; and, according to the judgment result, warning the living obstacle by means of a living-obstacle warning prompt if it is, or otherwise freeing the unmanned device by remote control.
Optionally, the living-obstacle warning prompt includes a first warning prompt mode and a second warning prompt mode. When the judgment module determines that the obstacle information is living-obstacle information, it further judges whether the living-obstacle information is person information, prompting by the first warning prompt mode if it is, and otherwise prompting by the second warning prompt mode.
Optionally, the execution module is further configured to record the prompt count. The execution module executing the handling mode so that the unmanned device is freed includes: when it is judged that the unmanned device has not been freed, obtaining the prompt count; if the prompt count is greater than or equal to the preset count threshold, freeing the unmanned device by remote control; otherwise, continuing to acquire image information around the unmanned device.
Optionally, the judgment module determining the obstacle type according to the obstacle information includes: identifying the obstacle information by an image deep-learning method to determine the obstacle type, wherein the features used in the image deep-learning method are obtained by pre-training a convolutional neural network.
According to another aspect of the embodiments of the present invention, an electronic device is further provided, including:
one or more processors; and
a storage apparatus for storing one or more programs,
which, when executed by the one or more processors, cause the one or more processors to implement the method for freeing an unmanned device described in any of the above embodiments.
According to still another aspect of the embodiments of the present invention, a computer-readable medium is further provided, on which a computer program is stored; when the program is executed by a processor, the method for freeing an unmanned device described in any of the above embodiments is implemented.
One of the above embodiments of the present invention has the following advantage or beneficial effect: because the technical means of determining that the unmanned device is stuck, acquiring image information around the unmanned device to obtain obstacle information, determining the obstacle type according to the obstacle information to obtain a corresponding handling mode, and executing the handling mode so that the unmanned device is freed is adopted, the unmanned device can be freed quickly by means of artificial intelligence when it is stuck. This saves substantial labor cost, reduces the time needed to free the device, and improves the operating efficiency of the unmanned device.
Further effects of the above non-conventional optional manners will be described below in conjunction with specific embodiments.
Detailed description of the embodiments
Exemplary embodiments of the present invention, including various details of the embodiments that aid understanding, are described below with reference to the accompanying drawings and should be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present invention. Likewise, for clarity and conciseness, descriptions of well-known functions and structures are omitted from the following description.
Fig. 1 shows a method for enabling an unmanned device to autonomously free itself according to an embodiment of the present invention. As shown in Fig. 1, the method includes:
Step S101: determining that the unmanned device is stuck, and acquiring image information around the unmanned device to obtain obstacle information.
Here, the unmanned device being stuck means that the unmanned device is hemmed in by obstacles and cannot reach its destination along the planned path; freeing the unmanned device, in the present invention, means that the unmanned device escapes from this stuck state.
Preferably, the image information can be analyzed and processed by an image deep-learning method, wherein the features used in the image deep-learning method are obtained by pre-training a convolutional neural network.
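As a rough sketch of how a trained classifier could slot into this step — the disclosure assumes a convolutional neural network pre-trained offline; here a trivial pure-Python stand-in (one hand-written convolution plus a threshold, all names hypothetical) keeps the pipeline runnable:

```python
def conv2d_valid(image, kernel):
    """Minimal 2-D 'valid' convolution in pure Python, standing in for one CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

def classify_patch(patch):
    """Toy classifier: an edge-detecting kernel plus a threshold decides
    'living' vs 'non-living'. A real system would use pre-trained CNN weights."""
    edge_kernel = [[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]]
    response = conv2d_valid(patch, edge_kernel)
    energy = sum(abs(v) for row in response for v in row)
    return "living" if energy > 10 else "non-living"

flat = [[0, 0, 0, 0]] * 4                                   # featureless patch
textured = [[0, 9, 0, 9], [9, 0, 9, 0], [0, 9, 0, 9], [9, 0, 9, 0]]
print(classify_patch(flat), classify_patch(textured))       # non-living living
```

The threshold and kernel here are arbitrary illustration; in the described system the decision boundary comes from weights learned on a large set of obstacle images.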
Step S102: determining the obstacle type according to the obstacle information to obtain a corresponding handling mode.
Here, the obstacle information may include living-obstacle information and non-living-obstacle information, where the living-obstacle information refers to obstacles with vital signs (living beings), and the non-living-obstacle information refers to objects without vital signs.
Preferably, whether the obstacle is a living obstacle can be judged according to the obstacle information; if it is living-obstacle information, the living obstacle is warned by a living-obstacle warning prompt; otherwise, the unmanned device is freed by remote control. Preferably, remote control can be achieved through observation on a display and operation of a controller so that the unmanned device is freed.
Further, living-obstacle information can be divided into person information and other life information (such as images of dogs, cats, and the like). Therefore, when the obstacle information is determined to be living-obstacle information, the warning prompt mode can be chosen according to whether the living-obstacle information is person information. Here, the living-obstacle warning prompt includes a first warning prompt mode and a second warning prompt mode: the first warning prompt mode may be playing a human voice with dim light, and the second warning prompt mode may be a loud, fast-paced sound with bright light. The specific implementation includes: judging whether the living-obstacle information is person information; prompting by the first warning prompt mode if it is, and otherwise prompting by the second warning prompt mode.
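The selection logic just described can be sketched as a small decision function; this is a minimal illustrative outline, not the disclosed implementation, and the dictionary keys and mode strings are assumptions:

```python
def select_warning_mode(obstacle):
    """Pick the handling mode per the scheme above.

    obstacle: dict with keys 'living' (bool) and 'is_person' (bool).
    Returns a short description of the chosen mode.
    """
    if not obstacle["living"]:
        return "remote-control"  # non-living obstacle: hand over to the operator
    if obstacle["is_person"]:
        # first warning prompt mode: gentle voice playback, dim light
        return "first-mode: human voice + dim light"
    # second warning prompt mode: loud, fast-paced sound, bright light
    return "second-mode: loud sound + bright light"

print(select_warning_mode({"living": True, "is_person": True}))
# first-mode: human voice + dim light
```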
Step S103: executing the handling mode so that the unmanned device is freed.
In one embodiment, the prompt count can be incremented by one each time the handling mode has been executed. To improve the efficiency of freeing the unmanned device, when it is judged that the unmanned device has not been freed, the recorded prompt count is obtained and compared with a preset count threshold: if the prompt count is greater than or equal to the preset count threshold, the unmanned device is freed by remote control; otherwise, the acquisition of image information around the unmanned device continues, i.e., the process returns to step S101.
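The retry-then-fall-back behavior can be sketched as follows; all callables are stubs and the names are hypothetical, so this is only an outline of the control flow under the stated threshold rule:

```python
def escape_loop(sense, prompt, is_free, threshold=3, max_steps=20):
    """Prompt the obstacle up to `threshold` times; if the device is still
    stuck, fall back to remote control, as in step S103."""
    prompts = 0
    for _ in range(max_steps):
        if is_free():
            return ("freed", prompts)
        if prompts >= threshold:
            return ("remote-control", prompts)  # hand over to the operator
        sense()       # re-acquire surrounding image information (step S101)
        prompt()      # issue a warning prompt
        prompts += 1  # record the prompt count
    return ("timeout", prompts)

# stub: the obstacle never leaves, so remote control kicks in after 3 prompts
state = escape_loop(sense=lambda: None, prompt=lambda: None,
                    is_free=lambda: False, threshold=3)
print(state)  # ('remote-control', 3)
```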
It should also be noted that freeing the unmanned device by remote control, i.e., through observation on a display and operation of a controller, means obtaining a view of the surroundings of the unmanned device through the display and, via buttons or joysticks on the controller, carrying out remote intercom, remote honking, and remote control of the movement of the unmanned device (forward, backward, steering, and so on) to achieve the purpose of freeing it.
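Putting steps S101 to S103 together, the overall flow can be outlined as a minimal sketch; every function and dictionary key here is a hypothetical placeholder, not part of the disclosure:

```python
def free_stuck_device(capture_image, classify_obstacle, handlers):
    """Illustrative flow: sense -> classify -> execute handling mode.

    capture_image:     callable returning image data around the device
    classify_obstacle: callable mapping an image to an obstacle-type string
    handlers:          dict mapping obstacle type -> handling callable
    """
    image = capture_image()                   # S101: acquire surrounding image
    obstacle_type = classify_obstacle(image)  # S102: determine the obstacle type
    handler = handlers.get(obstacle_type, handlers["default"])
    return handler()                          # S103: execute the handling mode

# toy usage with stubbed components
result = free_stuck_device(
    capture_image=lambda: "image-bytes",
    classify_obstacle=lambda img: "person",
    handlers={"person": lambda: "voice-prompt", "default": lambda: "remote-control"},
)
print(result)  # voice-prompt
```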
As can be seen from the above embodiments, the method for enabling an unmanned device to autonomously free itself first attempts to free the device automatically using artificial intelligence, which saves labor cost and reduces the time needed to solve the problem. If automatic freeing by artificial intelligence fails, the process automatically switches to remote human monitoring to free the device, which can resolve the vast majority of stuck situations. As the artificial intelligence keeps learning, the probability of automatic freeing will keep increasing.
Fig. 2 is a schematic diagram of the main flow of a method for enabling an unmanned device to autonomously free itself according to a referable embodiment of the present invention. The method may include:
Step S201: determining that the unmanned device is stuck; a sensor acquires image information around the unmanned device.
Step S202: analyzing the image information to obtain an analysis result.
Preferably, the image information can be analyzed and processed by a deep-learning method, wherein the features in the deep-learning method are obtained by training a convolutional neural network. Further, a convolutional neural network can be used to distinguish obstacle information by features in advance, and the obstacle information may include living-obstacle information and non-living-obstacle information, where the living-obstacle information refers to obstacles with vital signs and the non-living-obstacle information refers to objects without vital signs. Further, when a convolutional neural network is used to distinguish obstacle information by features in advance, the network can classify automatically according to the way it was pre-trained on a large amount of obstacle information, thereby differentiating the obstacle information.
Step S203: obtaining obstacle information according to the analysis result.
Step S204: judging, according to the obstacle information, whether it is living-obstacle information; if so, proceeding to step S205, and otherwise directly executing step S209.
Step S205: judging whether the living-obstacle information is person information; if so, proceeding to step S206, and otherwise proceeding to step S207.
In this embodiment, the obstacle is considered to be a person if the deep-learning analysis of the image matches person information; a convolutional neural network is used to extract the person information.
Preferably, the living-obstacle information can be divided into person information and other life information (such as images of dogs, cats, and the like), and the convolutional neural network can be trained on a large amount of person information and other life information so as to distinguish person information from other life information.
Step S206: warning the living obstacle by a person warning prompt mode, then proceeding to step S208.
In this embodiment, the person warning prompt mode can be an alarm prompt such as an alarm sound or light, and can further prompt by playing a human voice with dim light and the like. Preferably, the prompt count is incremented by one for each warning prompt.
Step S207: warning the living obstacle by an other-organism warning prompt mode, then proceeding to step S208.
In this embodiment, the other-organism warning prompt mode can be an alarm prompt such as an alarm sound or light, and can further prompt by a loud, fast-paced sound with bright light and the like. Preferably, the prompt count is incremented by one for each warning prompt.
Step S208: judging whether the unmanned device has been freed; if so, exiting the process, and otherwise proceeding to step S209.
In one embodiment, image information around the unmanned device can be acquired and analyzed to judge whether the information of the living obstacle still exists, and thereby to judge whether the unmanned device has been freed.
In a further example, when it is judged that the unmanned device has not been freed, the prompt count can be obtained; if the prompt count is greater than or equal to the preset count threshold, step S209 is performed, and otherwise the process returns to step S201. In addition, when it is judged that the unmanned device has been freed, the process can be exited directly.
Step S209: freeing the unmanned device through observation on a display and operation of a controller.
In a preferable embodiment, the surroundings of the unmanned device can be observed through the display, and remote intercom, remote honking, and remote control of the movement of the unmanned device (forward, backward, steering, and so on) can be carried out via buttons or joysticks on the controller to achieve the purpose of freeing it.
In addition, for the specific implementation of the method for enabling an unmanned device to autonomously free itself according to the referable embodiment of the present invention, reference can be made to the detailed description of the method above, so the repeated content will not be described again here.
It is also worth noting that in the above referable embodiment, the obstacle information is either living-obstacle information or non-living-obstacle information; however, there is also a case where the obstacle information includes both living-obstacle information and non-living-obstacle information. In that case, the method of steps S205 to S207 can be applied to the obstacles of the living-obstacle information, while the method of step S209 is applied to the obstacles of the non-living-obstacle information.
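The mixed case can be handled by routing each detected obstacle to its own path; a minimal sketch with hypothetical names follows:

```python
def handle_obstacles(obstacles):
    """Split mixed obstacle information: living obstacles take the warning
    path (steps S205-S207), non-living ones the remote-control path (S209).

    obstacles: list of dicts with 'living' and 'is_person' flags.
    Returns the list of chosen actions, one per obstacle.
    """
    actions = []
    for ob in obstacles:
        if ob["living"]:
            mode = "person-warning" if ob["is_person"] else "organism-warning"
        else:
            mode = "remote-control"
        actions.append(mode)
    return actions

mixed = [{"living": True, "is_person": True},
         {"living": True, "is_person": False},
         {"living": False, "is_person": False}]
print(handle_obstacles(mixed))
# ['person-warning', 'organism-warning', 'remote-control']
```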
Fig. 3 shows an apparatus for enabling an unmanned device to autonomously free itself according to an embodiment of the present invention. As shown in Fig. 3, the apparatus 300 includes a trigger module 301, a judgment module 302, and an execution module 303. The trigger module 301 determines that the unmanned device is stuck and acquires image information around the unmanned device to obtain obstacle information. The judgment module 302 then determines the obstacle type according to the obstacle information to obtain a corresponding handling mode. Finally, the execution module 303 executes the handling mode so that the unmanned device is freed.
As a preferable embodiment, the obstacle information may include living-obstacle information and non-living-obstacle information, where the living-obstacle information refers to obstacles with vital signs and the non-living-obstacle information refers to objects without vital signs. The judgment module 302 can judge, according to the obstacle information, whether it is living-obstacle information; if so, the living obstacle is warned by a living-obstacle warning prompt, and otherwise the unmanned device is freed by remote control. Preferably, remote control can be achieved through observation on a display and operation of a controller so that the unmanned device is freed.
Further, living-obstacle information can be divided into person information and other life information (such as images of dogs, cats, and the like). Therefore, when the obstacle information is determined to be living-obstacle information, the judgment module 302 can choose the warning prompt mode according to whether the living-obstacle information is person information. Here, the living-obstacle warning prompt includes a first warning prompt mode and a second warning prompt mode: the first warning prompt mode may be playing a human voice with dim light, and the second warning prompt mode may be a loud, fast-paced sound with bright light. The specific implementation includes: judging whether the living-obstacle information is person information; prompting by the first warning prompt mode if it is, and otherwise prompting by the second warning prompt mode.
In another embodiment, the prompt count can be incremented by one each time the execution module 303 has executed the handling mode. To improve the efficiency of freeing the unmanned device, when it is judged that the unmanned device has not been freed, the recorded prompt count is obtained and compared with the preset count threshold: if the prompt count is greater than or equal to the preset count threshold, the unmanned device is freed by remote control; otherwise, the acquisition of image information around the unmanned device continues.
It should also be noted that the judgment module 302 can analyze and process the image information by an image deep-learning method so as to identify the obstacle information and determine the obstacle type, wherein the features in the image deep-learning method are obtained by pre-training a convolutional neural network.
It should be noted that for the specific implementation of the apparatus for enabling an unmanned device to autonomously free itself according to the present invention, reference can be made to the detailed description of the method above, so the repeated content will not be described again here.
Fig. 4 shows an exemplary system architecture 400 to which the method for enabling an unmanned device to autonomously free itself, or the apparatus for enabling an unmanned device to autonomously free itself, according to the embodiments of the present invention can be applied.
As shown in Fig. 4, the system architecture 400 may include terminal devices 401, 402 and 403, a network 404, and a server 405. The network 404 serves as a medium for providing communication links between the terminal devices 401, 402, 403 and the server 405, and may include various connection types, such as wired or wireless communication links, fiber-optic cables, and the like.
A user may use the terminal devices 401, 402 and 403 to interact with the server 405 through the network 404 to receive or send messages and the like. Various communication client applications, such as shopping applications, web browser applications, search applications, instant messaging tools, e-mail clients, and social platform software (merely illustrative), may be installed on the terminal devices 401, 402 and 403.
The terminal devices 401, 402 and 403 may be various electronic devices that have a display screen and support web browsing, including but not limited to smartphones, tablet computers, laptop computers, desktop computers, and the like.
The server 405 may be a server providing various services, for example a back-end management server (merely illustrative) that provides support for shopping websites browsed by users with the terminal devices 401, 402 and 403. The back-end management server may analyze and otherwise process received data such as information query requests, and feed the processing results (for example, target push information or product information — merely illustrative) back to the terminal devices.
It should be noted that the method for enabling an unmanned device to autonomously free itself provided by the embodiments of the present invention is generally executed by the server 405; correspondingly, the apparatus for enabling an unmanned device to autonomously free itself is generally arranged in the server 405.
It should be understood that the numbers of terminal devices, networks, and servers in Fig. 4 are merely schematic; there may be any number of terminal devices, networks, and servers according to implementation needs.
Referring now to Fig. 5, a schematic structural diagram of a computer system 500 of a terminal device suitable for implementing an embodiment of the present invention is shown. The terminal device shown in Fig. 5 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present invention.
As shown in Fig. 5, the computer system 500 includes a central processing unit (CPU) 501, which can execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage section 508 into a random access memory (RAM) 503. Various programs and data required for the operation of the system 500 are also stored in the RAM 503. The CPU 501, the ROM 502, and the RAM 503 are connected to one another through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
The I/O interface 505 is connected to the following components: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a cathode-ray tube (CRT), a liquid crystal display (LCD), a loudspeaker, and the like; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card or a modem. The communication section 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as needed. A removable medium 511, such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory, is mounted on the drive 510 as needed, so that a computer program read therefrom can be installed into the storage section 508 as needed.
In particular, according to the embodiments disclosed by the present invention, the process described above with reference to the flowchart may be implemented as a computer software program. For example, an embodiment disclosed by the present invention includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program can be downloaded and installed from the network through the communication section 509 and/or installed from the removable medium 511. When the computer program is executed by the central processing unit (CPU) 501, the above-mentioned functions defined in the system of the present invention are executed.
It should be noted that the computer-readable medium shown in the present invention may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example — but not limited to — an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination of the above. In the present invention, the computer-readable storage medium may be any tangible medium containing or storing a program that can be used by or in connection with an instruction execution system, apparatus, or device. In the present invention, the computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any appropriate combination of the above. The computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium, which can send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wireless, wire, optical cable, RF, or any appropriate combination of the above.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of the systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each box in a flowchart or block diagram can represent a module, a program segment, or part of code, and the module, program segment, or part of code contains one or more executable instructions for implementing the specified logical function. It should also be noted that in some alternative implementations, the functions marked in the boxes can also occur in an order different from that marked in the drawings. For example, two successively represented boxes can actually be executed substantially in parallel, and they can sometimes be executed in the opposite order, depending on the functions involved. It should also be noted that each box in the block diagrams or flowcharts, and combinations of the boxes in the block diagrams or flowcharts, can be implemented by a dedicated hardware-based system that executes the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The modules described in the embodiments of the present invention can be implemented by software or by hardware. The described modules can also be arranged in a processor; for example, it can be described as: a processor includes a trigger module 301, a judgment module 302, and an execution module 303, where the names of these modules do not constitute a limitation on the modules themselves under certain circumstances.
As another aspect, the present invention also provides a computer-readable medium, which may be included in the device described in the above embodiments, or may exist alone without being assembled into the device. The computer-readable medium carries one or more programs; when the one or more programs are executed by the device, the device is caused to: determine that the unmanned device is stuck, and acquire image information around the unmanned device to obtain obstacle information; determine an obstacle type according to the obstacle information to obtain a corresponding handling mode; and execute the handling mode so that the unmanned device is freed.
According to the technical solution of the embodiments of the present invention, when the unmanned device is stuck, it can quickly free itself autonomously by means of artificial intelligence, thereby saving substantial labor cost, reducing the time needed to free the device, and improving the operating efficiency of the unmanned device.
The above specific embodiments do not constitute a limitation on the protection scope of the present invention. Those skilled in the art should understand that, depending on design requirements and other factors, various modifications, combinations, sub-combinations, and substitutions can occur. Any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.